ND2 - Project 2 - localization - ruby
Abstract - Give a high-level overview of the work. (5-10 sentences)
In daily life, we need to go to different places to complete tasks. For example, I take a bus to my office every working day. To complete this simple task, from a human perspective, there are several things I already know: where the bus stop is, where to get off, and which building is my office. If I did not know where I was, or where the target locations were, it would be impossible to reach them.
The same is true for a mobile robot. Imagine that a robot is in the living room and a user asks it to go to the kitchen to fetch a sandwich. The robot needs to know how to move from the living room to the kitchen and back. Otherwise, it will take so long that the user decides to go to a restaurant instead.
The localization problem is for the robot to know where it is, and to know where the goal location is when the user asks it to go somewhere.
In short: given a map, receive sensor measurements (odometry and a Hokuyo laser scanner) and work out where the robot is.
On my first trip to Paris, I could not speak French, and all I had was a map. I was looking for a famous restaurant but had no idea where it was, because I did not know where I was. The second time I was in Paris, I knew how to reach it: just take the metro to Hotel de Ville and walk for 10 minutes.
Measurements are affected by noise from the environment, wind, wheel slip, and so on.
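Formally (this is the standard recursive Bayes-filter statement of localization, not something specific to this project), the robot maintains a belief over its pose $x_t$ given the map $m$, odometry/controls $u_{1:t}$, and laser measurements $z_{1:t}$:

$$
\mathrm{bel}(x_t) = \eta \, p(z_t \mid x_t, m) \int p(x_t \mid u_t, x_{t-1}) \, \mathrm{bel}(x_{t-1}) \, dx_{t-1}
$$

The motion model $p(x_t \mid u_t, x_{t-1})$ absorbs odometry noise (slip, drift), and the measurement model $p(z_t \mid x_t, m)$ absorbs laser noise.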
Here are the tasks we need to solve:
[1] Kalman filter & particle filter
[2] Build the ROS world / launch files / URDF / RViz / Gazebo
[3] Tune the AMCL parameters
In the beginning, the green arrows (the particle PoseArray in RViz) point in every direction.
In the end, they converge and point in the same direction.
Arrows (particles) with higher probabilities are kept, and the robot ends up well localized.
Measurement errors come from wheel slip and sensor noise.
Particle filter = represent the pose belief as many weighted guesses (particles) and keep/resample the ones that best explain the measurements.
Kalman filter = sensor fusion: combine noisy measurements from different sensors into a single state estimate.
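In MCL terms (standard particle-filter notation, not taken from the project code), each green arrow is a particle $x_t^{[i]}$: it is moved with the noisy odometry motion model, weighted by how well the laser scan fits the map, and then resampled in proportion to its weight, which is why only the high-probability arrows survive:

$$
x_t^{[i]} \sim p(x_t \mid u_t, x_{t-1}^{[i]}), \qquad
w_t^{[i]} \propto p(z_t \mid x_t^{[i]}, m), \qquad
\Pr\!\left(\text{resample } x_t^{[i]}\right) \propto w_t^{[i]}
$$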
Model Configuration - Justify your choice of parameters
Show an understanding of the impact of these parameters (for example, how do more/fewer particles impact the results?).
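More particles approximate the pose belief more faithfully and converge more reliably, at the cost of CPU time per update; fewer particles run faster but risk converging to a wrong pose. As a sketch of where these knobs live (the parameter names are the standard amcl ones; the values below are only illustrative, not the exact tuning used in this project):

```yaml
# amcl_params.yaml -- illustrative values, not this project's actual tuning
min_particles: 50       # floor once the filter has converged (cheap to maintain)
max_particles: 2000     # ceiling while the pose is still uncertain (costly but robust)
update_min_d: 0.1       # translate this far (m) before running a filter update
update_min_a: 0.2       # rotate this far (rad) before running a filter update
odom_alpha1: 0.02       # rotation noise expected from rotation
odom_alpha2: 0.02       # rotation noise expected from translation
odom_alpha3: 0.02       # translation noise expected from translation
odom_alpha4: 0.02       # translation noise expected from rotation
laser_z_hit: 0.95       # weight of the "good reading" part of the laser model
laser_z_rand: 0.05      # weight of the random-noise part of the laser model
```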
Results - Show the robots' performance with charts, graphs, and tables.
Include an image of RViz with the robot at the goal position and the PoseArray displayed.
Submit the results for both the Classroom robot and the robot developed by the student.
Discussion
[1] Justify the stance with facts.
[2] Discuss whether AMCL would work well for the kidnapped robot problem and what kinds of scenarios would need to be accounted for.
=> Not well in the general case: if the robot is kidnapped into an environment with no map, AMCL cannot localize at all; building a map requires SLAM. Even within a known map, once the particles have converged, a teleported robot needs the filter to recover, e.g., by re-initializing it or injecting random particles.
[3] Brief discussion of where MCL/AMCL could be used in an industry domain.
=> Warehouse robots that pick and fetch: Amazon has big warehouses, and robots need to collect items and return them to the packing/dispatch station.
BMW needs components delivered to build cars.
The layout of the warehouse or factory is known, and the locations of the items and stations are known.
=> Self-driving cars: they already have map data (e.g., Google Maps).
=> Airplanes / drones.
Future Work - What types of enhancements could be made to the model to increase accuracy and/or decrease processing time?
[Optional] deploy this project on actual hardware
amcl_params.yaml
Increase laser_max_beams (e.g., to more than 500) so that more laser readings are used in each filter update.
base_local_planner_params.yaml
Update the parameters to [1] increase velocity and [2] turn better at corners: smooth turns, without multiple attempts or oscillation. A hedged sketch of both tweaks is given below.
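A sketch of what those two optional tweaks might look like (standard amcl and base_local_planner parameter names; the values are only illustrative, not tested settings):

```yaml
# amcl_params.yaml (illustrative)
laser_max_beams: 500          # evaluate more Hokuyo beams per update (the ROS default is 30)
```

```yaml
# base_local_planner_params.yaml (illustrative)
TrajectoryPlannerROS:
  max_vel_x: 0.5              # [1] raise the forward velocity
  max_vel_theta: 1.0          # [2] allow faster rotation so corners take a single smooth turn
  min_in_place_vel_theta: 0.4 # keep in-place rotation fast enough to avoid stalling
  acc_lim_theta: 2.0          # rotational acceleration limit
  yaw_goal_tolerance: 0.1     # looser heading tolerance reduces repeated spinning at the goal
  xy_goal_tolerance: 0.15     # looser position tolerance reduces oscillation near the goal
```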