Difference: GoogleCSR (1 vs. 16)

Revision 16 (2021-04-22) - LabTech

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Line: 225 to 225
 **By 4/21 each team will submit a traditional poster, a set of slides, and a recorded presentation 6-7 minutes long. Workshop 4/23 -- Poster/Video from each team showing results**
Added:
>
>
 Suggested slide sequence
  • Title slide: title, names of team members, schools, pictures?
  • Objective - what problem are you solving - make sure to use images and visuals as well as words

Revision 15 (2021-04-21) - LabTech

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Line: 225 to 225
 **By 4/21 each team will submit a traditional poster, a set of slides, and a recorded presentation 6-7 minutes long. Workshop 4/23 -- Poster/Video from each team showing results**
Added:
>
>
Suggested slide sequence
  • Title slide: title, names of team members, schools, pictures?
  • Objective - what problem are you solving - make sure to use images and visuals as well as words
  • What did you need to learn to address this: ROS, the Gazebo simulator, the ROS gmapping stack, the Turtlebot3 robot
  • Why does wander not do a good job? Show examples
  • What strategy do you propose to address this, and what data structure will you need (e.g., a 2D pose histogram or wall histogram, etc.)
  • Describe how you get the data structures and what they look like
  • Present some results of running these and comment on effectiveness and how you measure it

Have a short video sequence of your best strategy run

The video is 6-7 minutes; each team member should talk to 2 or 3 of the slides. It's not really a video: just add narration to your slides, e.g., in PowerPoint or whatever, and save the result as an MP4.

 WORKSHOP April 23rd!!

Revision 14 (2021-04-21) - LabTech

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Line: 222 to 222
  * W8 4/19: *
Changed:
<
<
By 4/21 each team will submit a traditional poster, a set of slides, and a recorded presentation 6-7 minutes long. Workshop 4/23 -- Poster/Video from each team showing results
>
>
**By 4/21 each team will submit a traditional poster, a set of slides, and a recorded presentation 6-7 minutes long. Workshop 4/23 -- Poster/Video from each team showing results**
  WORKSHOP April 23rd!!

Revision 13 (2021-04-15) - LabTech

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Line: 175 to 175
  Next meeting: 4/7. We need to have the code running by then.
Changed:
<
<
W6 4/5:
>
>
*W6 4/5: *
  Remember that when calculating the transformation from the sensor values (angle, range) for the laser, you need to 1) add the angle of the robot to the angle of the sensor (and make sure you have everything in radians) and 2) add the position of the robot to the x and y values you get from your polar-to-Cartesian mapping.
Line: 204 to 204
 Next meeting 4/14
Changed:
<
<
W7 4/12:
>
>
W7 4/12:
 
Changed:
<
<
W8 4/19:
>
>
The rubber hits the road this week! Develop and test your algorithms for improved mapping. Remember! You have to have evidence that it works, and you can only get this by conducting a principled parameter study of your code running. You need a number of gMapping maps generated just by wander, to use as your 'base' case. Then you need to generate the same number for each solution you have. Ideally you would show not just that your algorithm is better, but also how its behavior changes as you modify parameters. E.g., if you are directing the robot to regions of empty space to improve mapping, then detail how well it works for different target sizes of empty space (numbers of 'pixels' / value of 'pixels' in your internal histogram data structure).
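As one illustrative way to measure effectiveness (an assumption for this example, not a metric prescribed by the project): compare how much of each saved map is no longer 'unknown'. map_saver's PGM output typically encodes unknown cells as gray value 205, so a sketch like the following (with a hypothetical file name) gives a comparable coverage score per run:

    from PIL import Image
    import numpy as np

    def coverage(pgm_path, unknown=205):
        # Fraction of map cells that gmapping has decided are free or occupied.
        # 205 is map_saver's usual gray value for unexplored space; check your files.
        img = np.asarray(Image.open(pgm_path))
        return (img != unknown).mean()

    print(coverage('wander_run1.pgm'))  # hypothetical map saved with map_saver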

Since we are almost done, you need to start writing up your results now as well: make some slides on how your method works and what your objective was. Collect your gMapping map outputs and put these on slides to present.

You can send me copies of these if you want my feedback, but do that by Sunday at the latest!

Discuss and agree on who will be saying what in your short video presentation.

Next meeting is 4/21 and that is the LAST meeting - so we'll review all your materials then, which you will then submit.

* W8 4/19: *

  By 4/21 each team will submit a traditional poster, a set of slides, and a recorded presentation 6-7 minutes long. Workshop 4/23 -- Poster/Video from each team showing results
Added:
>
>
WORKSHOP April 23rd!!
 

Permissions

Revision 12 (2021-04-08) - LabTech

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Line: 177 to 177
  W6 4/5:
Added:
>
>
Remember that when calculating the transformation from the sensor values (angle, range) for the laser, you need to 1) add the angle of the robot to the angle of the sensor (and make sure you have everything in radians) and 2) add the position of the robot to the x and y values you get from your polar-to-Cartesian mapping.
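As a minimal sketch of those two steps (the helper name is illustrative, not from the project code; gLoc = [x, y, yaw] is the global maintained by the pose callback from goto.py, i indexes msg.ranges from the laser callback, and r is the range at that index):

    import math

    def ray_to_world(gLoc, i, r):           # illustrative helper, not from the project code
        alpha = gLoc[2] + math.radians(i)   # 1) robot yaw + sensor angle, all in radians
        wx = gLoc[0] + r * math.cos(alpha)  # 2) polar-to-Cartesian, plus robot x
        wy = gLoc[1] + r * math.sin(alpha)  #    ... plus robot y
        return wx, wy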

You want the map to be big enough to represent the entire house area at the resolution you want. A 1 meter resolution is probably too coarse - maybe between 10 and 50 centimeters would be a good range to experiment in.

You can use your 2D pose histogram data very easily with wander -- make the velocity in the else clause not a random number, but directly proportional to the value in the histogram at the robot's location. Thus: faster where it has been before and slower otherwise. Make sure to 'cap' the value, however, or you may get overly fast speeds.
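A minimal sketch of that else-clause change, assuming the poseHist array and scale/offset transform from the W5 notes below, and assuming wanderT3 publishes a geometry_msgs/Twist named twist; the gain and cap are example values to tune:

    # inside wanderT3's else branch (sketch; poseHist, scale, offset, gLoc as in the W5 notes)
    ix = int(scale * gLoc[0] + offset)       # robot's current cell in the pose histogram
    iy = int(scale * gLoc[1] + offset)
    K = 0.05                                 # example gain: m/s per visit count
    speed = min(K * poseHist[iy, ix], 0.22)  # cap it (~0.22 m/s is the Turtlebot3 Burger max)
    twist.linear.x = max(speed, 0.05)        # keep a floor so it still moves in unvisited cells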

You can use the wall histogram array, which is pretty close to a 2D floor map, in several ways: e.g., 1) find 'breaks' in the wall area enclosing the current position of the robot and move towards the center of the break, or 2) find the closest wall, move towards it, and follow it around.

These are just some ideas - you can also invent new ones.

However, you have to be objective about your evaluation of a strategy. Let's say that our strategy is to run the program for 5 minutes and then ask the map server to store the map. You should do this 3 or 4 times now for the basic wander program, so you have some 'base' examples to compare your ideas to. For consistency, always move the robot to the same start location and the same angle before you run a test.
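One illustrative way to keep the 5-minute runs uniform (the run name and timing constant are just an example, not part of the project code): a tiny helper you run alongside wander that stores the map at the same mark every time, using map_server's map_saver:

    import os
    import time

    RUN_SECS = 5 * 60          # the agreed 5-minute test window
    run_name = 'wander_run1'   # hypothetical name; change it for each run
    time.sleep(RUN_SECS)       # let wander (or your variant) run undisturbed
    os.system('rosrun map_server map_saver -f ' + run_name)  # store the gmapping map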

Implement the simplest of the ideas for improving coverage, e.g., the pose histogram one. Debug thoroughly of course to make sure it is working before you test it. Then run it 3 or 4 times and ask map_server to store the maps. Visually compare each of the wander maps with each of the new maps: are they all more complete, are some better, are they different in some other way?

Perhaps break up the task, so you can test a few simple ideas like this and report back next week. Let's discuss how the ideas worked, and how you did your map comparisons.

For your presentation you will need to show some videos and some images. Even though you are still working on your main result, you should spend some time (perhaps delegating team members) to 1) make 10s to 20s recordings of wander in the Gazebo house model (and other models if you want), 2) make same-length screen recordings of RViz showing the laser output, and 3) make same-length recordings of gmapping while the robot is wandering.

I suggest starting now, since this is a standalone task that someone could do to help the team while becoming proficient at using the screen recording function to generate short, visually attractive video clips. It's tougher to do, but you can also use the Gazebo interface to perform panning or zooming during the video to get a more 'live TV' effect.

Next meeting 4/14

 

W7 4/12:

W8 4/19:

Changed:
<
<
Workshop 4/23 -- Poster/Video from each team showing rsults
>
>
By 4/21 each team will submit a traditional poster, a set of slides, and a recorded presentation 6-7 minutes long. Workshop 4/23 -- Poster/Video from each team showing results
 

Permissions

Persons/group who can view/change the page:

Revision 11 (2021-04-07) - LabTech

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Revision 10 (2021-04-01) - LabTech

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Line: 88 to 88
  Using "roslaunch turtlebot3_gazebo turtlebot3_house.launch" start the simulated house world. In a separate window, start the wanderT3 python program with "python wanderT3.py". Observe the behavior of the robot - have you any ideas for improvement so that it explores the house better and quicker?
Deleted:
<
<
  The demonstration of SLAM did not go as expected during our meeting. I had forgotten to type "source ~/catkin_ws/devel/setup.bash" before I started it, and that's why I got some errors when it started!

You need to have started the house simulation first, and then start a new terminal window.

Added:
>
>
[image]
 When you type "roslaunch turtlebot3_slam turtlebot3_slam.launch slam_method:=gmapping" what you should see is this
Changed:
<
<
>
>
[image]
  This shows the 'progress' of the robot in using the 360 degrees of laser range finding data to make a map. If you start another new terminal shell and let wanderT3.py run, you will see the map grow. You can also use the teleoperation command from last week to 'drive' the robot around, and that is a much quicker way to make the map!
Line: 149 to 150
 
Changed:
<
<
W5 3/29 Begin Phase 2
>
>
W5 3/29 Begin Phase 2

  Design & testing of exploration algorithms
Added:
>
>
We discussed adding two items of state information to the Wander robot program. The first was to use a numpy array as a 'map' of the room. The robot pose information - as generated in a pose callback (see goto.py) - can be used to generate the x,y location of the robot on a grid. This needs to be transformed to array indices.

You can use a linear transformation: index = scale * coordinate + offset. In this, the coordinate is either the x or the y coordinate of the position. The scale magnifies the coordinates from meters to centimeters or millimeters - whatever resolution you want. The offset displaces the coordinates to the middle of the array, so that negative values will appear on the left or top and positive values on the right or bottom. And 0,0 will be right in the middle. Remember you must make the index an integer as the last step, in order to use it to access the numpy array. The attached code (plotting_maps.txt) shows an example of setting up the numpy array.

The easiest way to use this map is to just add 1 to the element at the index for every location that the robot occupies. That way you get a histogram in 2D of the robot's location - maybe it can avoid places it has been too frequently, etc.
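A sketch of that setup and update (array size and resolution are example values to experiment with; the attached plotting_maps.txt shows the project's own array setup):

    import numpy as np

    N = 200                      # example: 200 x 200 cells
    res = 0.10                   # example: 10 cm per cell
    scale = 1.0 / res            # meters -> cells
    offset = N // 2              # world (0,0) lands at the center of the array
    poseHist = np.zeros((N, N))  # 2D histogram of robot positions
    gLoc = [0.0, 0.0, 0.0]       # x, y, yaw - kept current by the pose callback (see goto.py)

    # at the end of the pose callback: index = scale * coordinate + offset, as an integer
    ix = int(scale * gLoc[0] + offset)
    iy = int(scale * gLoc[1] + offset)
    poseHist[iy, ix] += 1        # one more visit to this cell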

The second idea we discussed was to use another numpy array to store the positions of the walls, etc., as detected by the laser range sensor. The laser range sensor produces a list of 360 values, the distances to the walls around the robot, starting at angle 0 facing front, then angle 1 degree to the left, 2 degrees, etc. If the robot is at angle theta (that's gLoc[2] in the pose callback that gives the angle of the robot), then the angle of the laser range ray is theta + the index of the laser range list (msg.ranges in the laser callback) converted to radians. gLoc[2] is already in radians.

If we look at one specific laser range ray, say the one with index i, then the angle is alpha = theta + math.radians(i). The coordinates of the end point of the ray (where the wall is) can be obtained by adding the coordinates of the position of the robot (x,y) to (r cos alpha, r sin alpha), where r = msg.ranges[i], the range value at index i.

Once you get the coordinates of the end point of the ray, you can use the same linear transformation as used for the robot position to map it into the numpy array you make for walls. You can also use the histogram approach - adding one to the location every time you get a laser reflection from that location. Thus, places with high values are more likely walls.
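Putting those three paragraphs together, a sketch of the laser-callback body (reusing N, scale, offset, and gLoc from the pose-histogram sketch above; the callback name is illustrative):

    import math
    import numpy as np

    wallHist = np.zeros((N, N))                 # wall histogram, same transform as poseHist

    def laser_callback(msg):                    # register this with your /scan Subscriber
        theta = gLoc[2]                         # robot yaw, already in radians
        for i, r in enumerate(msg.ranges):      # 360 readings, one per degree
            if math.isinf(r) or math.isnan(r):  # skip beams with no return
                continue
            alpha = theta + math.radians(i)     # world angle of this ray
            wx = gLoc[0] + r * math.cos(alpha)  # end point of the ray (where the wall is)
            wy = gLoc[1] + r * math.sin(alpha)
            ix = int(scale * wx + offset)       # same linear transform as the robot position
            iy = int(scale * wy + offset)
            if 0 <= ix < N and 0 <= iy < N:     # ignore end points that fall off the array
                wallHist[iy, ix] += 1           # high counts are likely walls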

You can use this information for example to determine if you are finished mapping a room, or if you can see doorways in the room and want to go into them. Note that the map generated by this method is very much inferior to the one generated by gMapping!

For next week: Modify Wander so that it includes the pose callback and pose subscriber, and make your two numpy arrays by adding the code to build the position histogram at the end of the pose callback and the wall 'histogram' at the end of the laser callback. Use the plotting code to show the two maps. Put the plotting code in the shutdown callback, so that it will display the maps when you stop the program.
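A sketch of that shutdown hook (the attached plotting_maps.txt has the plotting code the project provides; this illustrative version simply uses matplotlib and assumes the poseHist and wallHist arrays from the sketches above):

    import matplotlib.pyplot as plt
    import rospy

    def show_maps():                    # runs when the node is stopped (e.g., control-C)
        fig, (ax1, ax2) = plt.subplots(1, 2)
        ax1.imshow(poseHist)            # where the robot has been
        ax1.set_title('pose histogram')
        ax2.imshow(wallHist)            # where laser reflections landed
        ax2.set_title('wall histogram')
        plt.show()

    rospy.on_shutdown(show_maps)        # register the shutdown callback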

Next meeting: 4/7. We need to have the code running by then.

 W6 4/5:

W7 4/12:

Revision 9 (2021-03-31) - LabTech

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Line: 173 to 173
 
Added:
>
>
 
META FILEATTACHMENT attachment="moveSquareT3.py.txt" attr="" comment="Simple ROS Node" date="1615411134" name="moveSquareT3.py.txt" path="moveSquareT3.py.txt" size="1753" user="DamianLyons" version="1"
META FILEATTACHMENT attachment="WanderT3.py.txt" attr="" comment="WanderT3.py" date="1616030867" name="WanderT3.py.txt" path="WanderT3.py.txt" size="3264" user="DamianLyons" version="1"
META FILEATTACHMENT attachment="goto.py.txt" attr="" comment="Goto node" date="1616680185" name="goto.py.txt" path="goto.py.txt" size="3028" user="DamianLyons" version="1"
Added:
>
>
META FILEATTACHMENT attachment="plotting_maps.txt" attr="" comment="Instructions for platting maps" date="1617229558" name="plotting_maps.txt" path="plotting_maps.txt" size="779" user="LabTech" version="1"

Revision 8 (2021-03-25) - DamianLyons

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Line: 43 to 43
  feizza01ATgmail.com, dengconnie0723ATgmail.com, silviadb48ATgmail.com
Added:
>
>

 

Schedule

W1 3/1: Begin Phase 1 UBUNTU in place, ROS installed, Start tutorials

Line: 84 to 86
  Review this program and discuss it in your group meeting. If your group has questions, please email me, but copy both teams so everybody can benefit from the answer.
Changed:
<
<
Using "roslaunch turtlebot3_gazebo turtlebot3_house.launch" start the simulated house world. In a separate window, start the wanderT3 python program with "python3 wantderT3.py". Observe the behavior of the robot - have you any ideas for improvement so that it explors the house better and quicker?
>
>
Using "roslaunch turtlebot3_gazebo turtlebot3_house.launch" start the simulated house world. In a separate window, start the wanderT3 python program with "python wanderT3.py". Observe the behavior of the robot - have you any ideas for improvement so that it explores the house better and quicker?
 
Line: 102 to 104
  W4 3/22: Demo gMapping. wander script. exploration mapping
Added:
>
>

NOTE for everybody: add the following line at the end of your .bashrc file in your home folder:
source ~/catkin_ws/devel/setup.bash

To start the SLAM mapping node in ROS do the following:
1. In a new terminal window type
roslaunch turtlebot3_slam turtlebot3_slam.launch slam_method:=gmapping
2. To move the map display in the RViz GUI that comes up when you type this:
left button drag will rotate the view
shift left button drag will move the view up/down and left/right
zoom in/out as you would for Gazebo

To SAVE your map, in a new terminal window, type
rosrun map_server map_saver -f nameoffile

Here 'nameoffile' is any name you pick for the saved map. The map will be saved as a PGM image file. To view it, type "eog filename". The command "eog" ("eye of gnome") is a Linux command from the GNOME window system that you can use to view many kinds of images.

To start a teleoperation ROS node to move the robot around:
1. start a new terminal window and type
roslaunch turtlebot3_teleop
and press the TAB key to allow it to auto-complete. Then press enter.
2. Try to have the Gazebo and/or RViz view oriented so that moving the robot forward is up on the screen
3. w and x increment/decrement forward velocity, a and d increment and decrement rotational velocity
pressing the space bar stops the robot
4. I recommend you navigate by rotating the robot until it faces in the right direction, and then go forward. When you want to change direction, stop, rotate to the right direction, and only then go forward.

You can start the wander program anytime you want by making a new terminal window and typing "python wanderT3.py". Stop any of these nodes that you do not want by typing control-C (^C).

NOW you need to think about how to change wanderT3 so that it allows gmapping to make a better quality map. Think: how can you avoid it going over the same spot repeatedly? How can you encourage it to explore new areas, go into doorways and corridors and so forth?

It may be useful to have access to the robot's estimation of its own location in space. The location is just a coordinate on a 2D grid (x,y) in meters along with an angle that gives the direction the robot is facing (the yaw angle, or angle around the Z axis) in radians (remember 2pi radians = 360 degrees).

The program goto.py attached here shows you how to 1) declare that your node SUBSCRIBES to the position topic which is called /odom and 2) make a callback function for the topic that will place the x,y, and yaw angle in a global variable called gLoc, where gLoc=[x,y,angle].
If you include these two steps in YOUR program, then you will always have access to the current position of the robot as the global variable gLoc.
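A sketch of those two steps (the attached goto.py has the actual code; this illustrative version uses tf's helper to turn the /odom quaternion into a yaw angle):

    import rospy
    from nav_msgs.msg import Odometry
    from tf.transformations import euler_from_quaternion

    gLoc = [0.0, 0.0, 0.0]   # x, y, yaw - always the latest pose once the node is spinning

    def odom_callback(msg):
        p = msg.pose.pose.position
        q = msg.pose.pose.orientation
        yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])[2]  # rotation about Z, in radians
        gLoc[0], gLoc[1], gLoc[2] = p.x, p.y, yaw

    rospy.init_node('odom_listener')    # illustrative node name
    rospy.Subscriber('/odom', Odometry, odom_callback)
    rospy.spin()                        # keep the callback running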

The program goto.py is executed in a new terminal window by typing "python goto.py" with two numbers on the same line separated by a space, e.g.
python goto.py 1 2
The robot will try to drive to this location on the grid (1,2). It does not avoid obstacles.

Next meeting is Wed 3/29 at 5:00pm


 W5 3/29 Begin Phase 2

Design & testing of exploration algorithms

Line: 124 to 171
 
Added:
>
>
 
META FILEATTACHMENT attachment="moveSquareT3.py.txt" attr="" comment="Simple ROS Node" date="1615411134" name="moveSquareT3.py.txt" path="moveSquareT3.py.txt" size="1753" user="DamianLyons" version="1"
META FILEATTACHMENT attachment="WanderT3.py.txt" attr="" comment="WanderT3.py" date="1616030867" name="WanderT3.py.txt" path="WanderT3.py.txt" size="3264" user="DamianLyons" version="1"
Added:
>
>
META FILEATTACHMENT attachment="goto.py.txt" attr="" comment="Goto node" date="1616680185" name="goto.py.txt" path="goto.py.txt" size="3028" user="DamianLyons" version="1"

Revision 7 (2021-03-24) - DamianLyons

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Line: 56 to 56
  Plan to meet with your team -- anyone on the team can organize this, but if you don't hear anything by Saturday then YOU organize it.
Share progress and help each other.

Read the document "A Gentle Introduction to ROS" - use this as a reference document for the first tutorials.

Added:
>
>
 
Deleted:
<
<
 

Next meeting is 5pm on Wed Mar 10.

Revision 6 (2021-03-18) - DamianLyons

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Line: 45 to 45
 

Schedule

Changed:
<
<
W1 3/1: Begin Phase 1 UBUNTU in place, ROS installed, Start tutorials
>
>
W1 3/1: Begin Phase 1 UBUNTU in place, ROS installed, Start tutorials
  Install up to 3.1.5 in the robotis 'quickstart' guide. Examine your ~/.bashrc file as in 3.1.6 part 3 and make sure you have the two lines ABOVE the red box.
Type them in if you do not. Ignore the two lines in the red box.
Line: 57 to 57
  Read the document "A Gentle Introduction to ROS" - use this as a reference document for the first tutorials.
Changed:
<
<
>
>
  Next meeting is 5pm on Wed Mar 10.
Changed:
<
<
W2 3/8: >= 50% ROS tutorials. One team member to specialize in gazebo knowledge
>
>
W2 3/8: >= 50% ROS tutorials. One team member to specialize in gazebo knowledge
 
Line: 73 to 75
  Next meeting is Wed Mar 17th 5pm and we will cover SLAM and gMapping.
Changed:
<
<
W3 3/15: ROS tutorials done. Motion and sensing scripts. One team member to specialize in gMapping
>
>
W3 3/15: ROS tutorials done. Motion and sensing scripts. One team member to specialize in gMapping

The file WanderT3.py is attached (you need to remove the ".txt" from the name WanderT3.py.txt when you download this, to execute it)

Review this program and discuss it in your group meeting. If your group has questions, please email me, but copy both teams so everybody can benefit from the answer.

Using "roslaunch turtlebot3_gazebo turtlebot3_house.launch" start the simulated house world. In a separate window, start the wanderT3 python program with "python3 wantderT3.py". Observe the behavior of the robot - have you any ideas for improvement so that it explors the house better and quicker?

 
Changed:
<
<
W4 3/22: Demo gMapping. wander script. exploration mapping
>
>
The demonstration of SLAM did not go as expected during our meeting. I had forgotten to type "source ~/catkin_ws/devel/setup.bash" before I started it, and that's why I got some errors when it started!

You need to have started the house simulation first, and then start a new terminal window.

When you type "roslaunch turtlebot3_slam turtlebot3_slam.launch slam_method:=gmapping" what you should see is this

This shows the 'progress' of the robot in using the 360 degrees of laser range finding data to make a map. If you start another new terminal shell and let wanderT3.py run, you will see the map grow. You can also use the teleoperation command from last week to 'drive' the robot around, and that is a much quicker way to make the map!

Next meeting is Wed Mar 22, 5:00pm.

W4 3/22: Demo gMapping. wander script. exploration mapping

  W5 3/29 Begin Phase 2
Line: 96 to 120
 -- (c) Fordham University Robotics and Computer Vision
Added:
>
>
 
META FILEATTACHMENT attachment="moveSquareT3.py.txt" attr="" comment="Simple ROS Node" date="1615411134" name="moveSquareT3.py.txt" path="moveSquareT3.py.txt" size="1753" user="DamianLyons" version="1"
Added:
>
>
META FILEATTACHMENT attachment="WanderT3.py.txt" attr="" comment="WanderT3.py" date="1616030867" name="WanderT3.py.txt" path="WanderT3.py.txt" size="3264" user="DamianLyons" version="1"

Revision 5 (2021-03-11) - DamianLyons

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Line: 66 to 63
  W2 3/8: >= 50% ROS tutorials. One team member to specialize in gazebo knowledge
Added:
>
>

Install the simulation package as per class if you have not. Kill that shell and start a new one, or you won't get the updated definitions from your install.

Launch the turtlebot3_house Gazebo simulation and explore ROS commands such as rostopic list and rostopic echo. The /odom topic is the odometry (position) information available to the robot. The /scan topic is the list of 360 range readings from the laser ranger, one per degree around the robot, anticlockwise.
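If you want to watch those topics programmatically rather than with rostopic echo, a minimal subscriber sketch (node and callback names are illustrative):

    import rospy
    from sensor_msgs.msg import LaserScan

    def scan_callback(msg):
        # msg.ranges[0] is the beam straight ahead; indices go anticlockwise, one per degree
        rospy.loginfo('front range: %.2f m', msg.ranges[0])

    rospy.init_node('scan_echo')        # illustrative node name
    rospy.Subscriber('/scan', LaserScan, scan_callback)
    rospy.spin()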

Review the attached code moveSquareT3.py in terms of the subscriber/publisher tutorial, and be prepared to discuss it next week.

Next meeting is Wed Mar 17th 5pm and we will cover SLAM and gMapping.

 W3 3/15: ROS tutorials done. Motion and sensing scripts. One team member to specialize in gMapping

W4 3/22: Demo gMapping. wander script. exploration mapping

Revision 4 (2021-03-10) - DamianLyons

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Line: 29 to 29
  Gentle introduction to ROS: https://www.cse.sc.edu/~jokane/agitr/agitr-letter.pdf
Added:
>
>
Gazebo Tutorials (only for the GUI!): http://gazebosim.org/tutorials
 Overview of SLAM: https://www.dfki.de/fileadmin/user_upload/import/5034_freseki10.pdf

SLAM by GMapping: https://people.eecs.berkeley.edu/~pabbeel/cs287-fa11/slides/gmapping.pdf

Added:
>
>
Setting up Ubuntu on a VM (only if you need to): Here
 

Teams

Team1: Zarrin, Jessica, Duvall

Line: 85 to 87
 

-- (c) Fordham University Robotics and Computer Vision

Added:
>
>

META FILEATTACHMENT attachment="moveSquareT3.py.txt" attr="" comment="Simple ROS Node" date="1615411134" name="moveSquareT3.py.txt" path="moveSquareT3.py.txt" size="1753" user="DamianLyons" version="1"

Revision 3 (2021-03-04) - DamianLyons

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Revision 2 (2021-03-03) - DamianLyons

Line: 1 to 1
 
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Line: 20 to 23
 _______________________________________________________________________________________

Important Links

Changed:
<
<
ROS Wiki: <a data-saferedirecturl="https://www.google.com/url?q=http://wiki.ros.org/ROS/Tutorials&source=gmail&ust=1614877639036000&usg=AFQjCNEr1snbNzZ-wGmAIbyBoynbyxbWfw" href="http://wiki.ros.org/ROS/Tutorials" target="_blank">http://wiki.ros.org/ROS/Tutorials</a>
>
>
ROS Wiki: http://wiki.ros.org/ROS/Tutorials
 
Changed:
<
<
Turtlebot3: <a data-saferedirecturl="https://www.google.com/url?q=https://emanual.robotis.com/docs/en/platform/turtlebot3/overview/&source=gmail&ust=1614877639036000&usg=AFQjCNFG3TqbJofsQDrda144mCNvFyN48Q" href="https://emanual.robotis.com/docs/en/platform/turtlebot3/overview/" target="_blank">https://emanual.robotis.com/docs/en/platform/turtlebot3/overview/</a>
>
>
Turtlebot3: https://emanual.robotis.com/docs/en/platform/turtlebot3/overview/
  Gentle introduction to ROS: https://www.cse.sc.edu/~jokane/agitr/agitr-letter.pdf
Line: 42 to 45
 

Schedule

Changed:
<
<
W1 3/1: Begin Phase 1
>
>
W1 3/1: Begin Phase 1 UBUNTU in place, ROS installed, Start tutorials

Install up to 3.1.5 in the robotis 'quickstart' guide. Examine your ~/.bashrc file as in 3.1.6 part 3 and make sure you have the two lines ABOVE the red box.
Type them in if you do not. Ignore the two lines in the red box.


Start the ROS twiki tutorials. You can use "rosrun turtlesim turtlesim_node" as a simulator if you want - that's what's in the tutorials,
or use "roslaunch turtlebot3_gazebo turtlebot3_house.launch". You need to get as far as #13 (don't do #11 or any C++ material);
see if you can get to #6 before the next meeting.

If you find that you do not have the turtlebot3_gazebo package then you need to install from the robotis web page, item 6 Simulation.

Plan to meet with your team -- anyone on the team can organize this, but if you don't hear anything by Saturday then YOU organize it.
Share progress and help each other.

Read the document "A Gentle Introduction to ROS" - use this as a reference document for the first tutorials.

 
Changed:
<
<
UBUNTU in place, ROS installed, Start tutorials
>
>
Next meeting is 5pm on Wed Mar 10.
  W2 3/8: >= 50% ROS tutorials. One team member to specialize in gazebo knowledge
Line: 65 to 80
 Workshop 4/23 -- Poster/Video from each team showing rsults

Permissions

Changed:
<
<
Persons/group who can view/change the page:
>
>
Persons/group who can view/change the page:
 

-- (c) Fordham University Robotics and Computer Vision

Revision 1 (2021-03-03) - DamianLyons

Line: 1 to 1
Added:
>
>
META TOPICPARENT name="FordhamRoboticsAndComputerVisionLaboratory"

GoogleCSR Autonomous Robotics Project Spring 2021

Autonomous Robot Exploration

Faculty Mentor: Damian Lyons (Fordham)

Description

Autonomous vehicles and robots are increasingly visible in magazines and news stories. It seems inevitable that they will soon be a major commercial technology. The objective of this project is to use the open-source ROS mapping and navigation stack and devise a way for a mobile robot to quickly and automatically learn the physical map for a house so that it is prepared to carry out domestic activities.

Participant Background

This project is appropriate for participants who have a background in computer science, especially Linux and Python programming. ROS is an open-source framework for building robot programs that runs under Linux, and Python is one of the easier languages in which to program ROS nodes. Experience cloning and ‘make’ing software from github would be a big help.

Objectives and Learning Goals

The participants in this project will achieve the following:

  • Gain a basic understanding of robot programming.
  • Learn about ROS – one of the principal tools for programming robots.
  • Exposure to widely used techniques and algorithms for mapping and for robot navigation.
  • Experience writing ROS nodes and evaluating robot behavior.
_______________________________________________________________________________________

Important Links

ROS Wiki: <a data-saferedirecturl="https://www.google.com/url?q=http://wiki.ros.org/ROS/Tutorials&source=gmail&ust=1614877639036000&usg=AFQjCNEr1snbNzZ-wGmAIbyBoynbyxbWfw" href="http://wiki.ros.org/ROS/Tutorials" target="_blank">http://wiki.ros.org/ROS/Tutorials</a>

Turtlebot3: <a data-saferedirecturl="https://www.google.com/url?q=https://emanual.robotis.com/docs/en/platform/turtlebot3/overview/&source=gmail&ust=1614877639036000&usg=AFQjCNFG3TqbJofsQDrda144mCNvFyN48Q" href="https://emanual.robotis.com/docs/en/platform/turtlebot3/overview/" target="_blank">https://emanual.robotis.com/docs/en/platform/turtlebot3/overview/</a>

Gentle introduction to ROS: https://www.cse.sc.edu/~jokane/agitr/agitr-letter.pdf

Overview of SLAM: https://www.dfki.de/fileadmin/user_upload/import/5034_freseki10.pdf

SLAM by GMapping: https://people.eecs.berkeley.edu/~pabbeel/cs287-fa11/slides/gmapping.pdf

Teams

Team1: Zarrin, Jessica, Duvall

zali11ATfordham.edu,Jessica.DeMotaMunoz86ATmyhunter.cuny.edu,duvall.pinkneyATgmail.com,

Team2: Feizza,Connie, Sylvia

feizza01ATgmail.com, dengconnie0723ATgmail.com, silviadb48ATgmail.com

Schedule

W1 3/1: Begin Phase 1

UBUNTU in place, ROS installed, Start tutorials

W2 3/8: >= 50% ROS tutorials. One team member to specialize in gazebo knowledge

W3 3/15: ROS tutorials done. Motion and sensing scripts. One team member to specialize in gMapping

W4 3/22: Demo gMapping. wander script. exploration mapping

W5 3/29 Begin Phase 2

Design & testing of exploration algorithms

W6 4/5:

W7 4/12:

W8 4/19:

Workshop 4/23 -- Poster/Video from each team showing rsults

Permissions

Persons/group who can view/change the page:

-- (c) Fordham University Robotics and Computer Vision

 