Accuracy Improvement for physical robot

Project Summary

 
The goal of the project is to improve the accuracy of a working robotic system (Motion Planning for Physical Robots Among Dynamic Obstacles) by means of software and hardware modifications. 
 

[Image: the robot]

The Robotic System

 
The system, given a map of a maze and source and target locations, finds a path and navigates the robot along that path.
The robot is an iRobot Create. It is controlled by driver software that sends it commands and receives data from the robot's internal sensors (most importantly the wheel encoders). With that information the software can track the robot's actual location and react to dynamic obstacles.
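
To illustrate the kind of tracking involved, the sketch below shows standard differential-drive dead reckoning from per-wheel travel distances. It is a minimal illustration, not the actual driver code: the names are ours and the wheel-base constant is an assumed value that must match the robot's specification.

    // Illustrative differential-drive dead reckoning from wheel-encoder data.
    // Not the project's driver code; WHEEL_BASE_MM is an assumed value.
    #include <cmath>

    struct Pose { double x = 0, y = 0, theta = 0; };   // x, y in mm; heading in radians

    const double WHEEL_BASE_MM = 260.0;                // assumption: check the Create's spec

    // leftMM / rightMM: distance each wheel travelled since the previous update,
    // derived from the encoder counts.
    void updatePose(Pose &p, double leftMM, double rightMM) {
        double forward = (leftMM + rightMM) / 2.0;            // travel of the robot's centre
        double dTheta  = (rightMM - leftMM) / WHEEL_BASE_MM;  // change in heading
        p.theta += dTheta;
        p.x += forward * std::cos(p.theta);
        p.y += forward * std::sin(p.theta);
    }

Because each update is quantized to the encoder resolution and then accumulated, the estimate drifts with distance, which is the root of the drawbacks listed below.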
 
The wheel encoders are the only source of positioning data on the robot, but they have certain drawbacks which cause inaccurate navigation and a significant deviation from the target location.
These drawbacks are:
  • Finite encoder resolution results in rounding errors.
  • Errors accumulate over distance: the greater the distance, the larger the error.
  • Wheel slippage is counted as movement.
 
 

The Hardware

 

We mounted two new sensors on the robot so that the driver software receives positioning data from sources other than the wheel encoders and, using sensor fusion, can pinpoint the robot's position more accurately.
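
As a hedged illustration of the kind of fusion involved, a simple complementary filter can blend the gyro-derived heading (accurate over short intervals) with the encoder-derived heading. The function name and the weighting constant below are our own example, not the project's actual filter:

    // Illustrative complementary-filter step for the heading estimate (example only).
    // alpha near 1 favours the gyro over short time scales; the remaining weight lets
    // the encoder-based heading pull the estimate back and limit long-term drift.
    double fuseHeading(double gyroHeadingDeg, double encoderHeadingDeg, double alpha = 0.98) {
        return alpha * gyroHeadingDeg + (1.0 - alpha) * encoderHeadingDeg;
    }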

The robot can operate no more than one simple analog sensor, and that capacity is already used up by the MaxSonar-EZ0. Therefore it was clear we would need a microcontroller. We chose the Arduino Mega for its user-friendliness, open-source libraries, and extensive developer community.

[Image: the Arduino Mega]

The criteria for choosing the sensors were:

  • Open-source drivers available.
  • Low price.
  • Integration with the Arduino microcontroller.

The sensors we chose are:

  • ITG3200 Gyroscope: provides the robot's angular velocity, from which we calculate the robot's orientation.
  • ADXL345 Accelerometer: provides acceleration, from which we calculate the robot's position (a sketch of both calculations follows this list).
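
Both calculations are plain numerical integration of the raw samples. The sketch below is a simplified illustration with our own names; axis selection, calibration, and bias removal are omitted:

    // Illustrative integration of the raw sensor samples (not the project's code).
    struct NavState {
        double headingDeg = 0;   // ITG3200: integrate angular velocity once
        double velocityMM = 0;   // ADXL345: integrate acceleration once ...
        double distanceMM = 0;   // ... and integrate velocity again to get distance
    };

    // gyroZDegPerSec: yaw-rate sample; accelMMPerSec2: acceleration along the heading;
    // dtSec: time elapsed since the previous sample.
    void integrateSample(NavState &s, double gyroZDegPerSec, double accelMMPerSec2, double dtSec) {
        s.headingDeg += gyroZDegPerSec * dtSec;
        s.velocityMM += accelMMPerSec2 * dtSec;
        s.distanceMM += s.velocityMM   * dtSec;
    }

Double integration of acceleration is very sensitive to bias and noise, which is why these readings are fused with the other sources rather than used on their own.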
 
[Image: the sensors]

The Software

 
Once a path is calculated, the driver software sends three types of commands to navigate the robot along the path: drive forward, rotate, and stop. Rotation works in the following flow: the driver calculates which orientation the robot should take, sends a rotate command in the desired direction to the robot, and receives a stream of sensor data that describes the robot's progress; once the driver thinks the robot is in the desired orientation, it sends the stop command.
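
A sketch of that flow is shown below; the Robot interface is hypothetical and stands in for the real driver's command and sensor-packet layer:

    // Sketch of the original rotation flow (hypothetical interface, example only).
    #include <cmath>

    struct Robot {
        virtual void sendRotate(int direction) = 0;  // +1 counter-clockwise, -1 clockwise
        virtual double readAngleDelta() = 0;         // degrees turned since the last sensor packet
        virtual void sendStop() = 0;
        virtual ~Robot() = default;
    };

    void rotateOldFlow(Robot &robot, double currentHeadingDeg, double targetHeadingDeg) {
        const double toleranceDeg = 1.0;             // illustrative stopping tolerance
        double remaining = targetHeadingDeg - currentHeadingDeg;
        robot.sendRotate(remaining >= 0 ? +1 : -1);
        while (std::fabs(remaining) > toleranceDeg) {
            // Each packet's angle value is already rounded, so every iteration
            // folds another small rounding error into the driver's estimate.
            remaining -= robot.readAngleDelta();
        }
        robot.sendStop();
    }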
 
The problem with this flow is that the driver receives many sensor data updates, and therefore its notion of the robot's rotation accumulates many rounding errors. The driver's notion of the robot's orientation is critical for accuracy, since even small orientation deviations may result in significant position deviations.
 
To solve this problem we developed a new rotation flow which utilizes a special iRobot Create command called Wait Angle.
In the new flow the driver sends a rotate command that contains the desired orientation; when the robot has finished the rotation it stops and sends a single update from its sensors, thus minimizing rounding errors.
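
A minimal sketch of the new flow, assuming the Open Interface opcodes as we read them (137 = Drive, 157 = Wait Angle, with +1/-1 as the spin-in-place radius); verify these against the OI manual. The byte-writer callback stands in for the real serial link:

    // Sketch of the new rotation flow using the Create's Wait Angle command.
    // Opcodes are assumptions to be checked against the Open Interface manual.
    #include <cstdint>
    #include <functional>

    using WriteByte = std::function<void(uint8_t)>;   // stands in for the serial write

    static void send16(const WriteByte &writeByte, int16_t v) {
        uint16_t u = static_cast<uint16_t>(v);        // two's-complement bytes, high byte first
        writeByte(static_cast<uint8_t>(u >> 8));
        writeByte(static_cast<uint8_t>(u & 0xFF));
    }

    void rotateWithWaitAngle(const WriteByte &writeByte, int16_t angleDeg, int16_t speedMMs = 100) {
        int16_t radius = (angleDeg >= 0) ? 1 : -1;    // spin in place toward the target
        writeByte(137); send16(writeByte, speedMMs); send16(writeByte, radius);  // Drive: start turning
        writeByte(157); send16(writeByte, angleDeg);                             // Wait Angle: block until turned
        writeByte(137); send16(writeByte, 0); send16(writeByte, 0);              // Drive with zero velocity: stop
    }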
 

Results

 
To measure the accuracy improvement we devised an accuracy test in which the robot has to drive over 5 meters in a zig-zag pattern, and we measured its deviation from the target location.
We performed baseline measurements and compared the accuracy of the robot with our improvements against those baselines.
 
 
[Figure: accuracy test results]

Accuracy Demo

 
 
 
 
 

The Team

 
Dorin Ben-Zaken, Gal Lerman