ANDIBOT ROSELYNE BARRETO SPRING 2005 Final Presentation Imaging, Robotics, and Intelligent Systems Laboratory Department of Electrical and Computer Engineering, The University of Tennessee, Knoxville, TN.
OUTLINE • Achievements in Wireless ANDROS • ECE 400 • Literature review • Implementing ANDIBOT • Conclusions • Future Work
WIRELESS ANDROS • Original Control System The original control system consists of mechanical toggle switches, potentiometers, a four-position joystick, and a full analog two-axis joystick. The panel accepts user input, processes the signals, and transmits the data to the robot over a serial RS-232 cable.
WIRELESS ANDROS • Original Control System Main weaknesses: • Large and heavy • Mechanical • Connected to the robot through a 150’ cable that can get caught in the robot’s tracks The robot’s size and weight, added to the fact that the user has to constantly keep track of the cable, make it hard to maneuver. The mechanical design makes the system hard to upgrade, and this last limitation in particular makes it impossible to implement any autonomous functions on the robot.
WIRELESS ANDROS • Original Control System Solutions: • Software-based system: such a system eliminates the dimensional problems, since the program can be loaded onto a small computer (tablet PC). Most importantly, it is easier to upgrade and allows us to bring the robot to a higher level of technology and operation (semi-autonomy, homing, following, ...) • Wireless capabilities make the robot easier to maneuver by eliminating the cable that can get caught in the robot’s tracks
WIRELESS ANDROS • New control system The new control system includes an MFC program that sends RS-232 commands wirelessly to the robot. As of last semester it had two main limitations: • The last character of the 21-character command strings • The functionality of the program itself when sending strings
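Before detailing those limitations, the sketch below illustrates how a command string of this kind could be written to the serial link, assuming the wireless modem appears to the PC as a COM port. The port name, baud rate, and line settings are illustrative assumptions; the slides do not specify them.

```cpp
// Minimal Win32 sketch: write one command string to an RS-232 port.
// "COM1" and 9600 baud are assumptions, not values from the presentation.
// A real program would keep the port open instead of reopening it per command.
#include <windows.h>
#include <string>

bool SendCommandString(const std::string& cmd)
{
    HANDLE port = CreateFileA("COM1", GENERIC_WRITE, 0, nullptr,
                              OPEN_EXISTING, 0, nullptr);        // assumed port name
    if (port == INVALID_HANDLE_VALUE)
        return false;

    DCB dcb = { sizeof(DCB) };                                   // configure the serial line
    GetCommState(port, &dcb);
    dcb.BaudRate = CBR_9600;                                     // assumed baud rate
    dcb.ByteSize = 8;
    dcb.Parity   = NOPARITY;
    dcb.StopBits = ONESTOPBIT;
    SetCommState(port, &dcb);

    DWORD written = 0;
    const BOOL ok = WriteFile(port, cmd.data(),
                              static_cast<DWORD>(cmd.size()), &written, nullptr);
    CloseHandle(port);
    return ok && written == cmd.size();
}
```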
WIRELESS ANDROS • New control system • The last character of the command strings: this problem limited control over the robot by making the settings hard to adjust. Although the characters that change with each body function had been identified, without the correct checksum the string is invalid. At that point the only way to access the settings (light, arm and vehicle speed, laser, and weapons) was to create a database of command strings covering all possible settings.
WIRELESS ANDROS • New control system • The last character of the command strings: Example command string: 0A000C2000908D80C0Æññ — the annotated characters correspond to the checksum, fire weapon, laser, arm speed, vehicle speed, and light settings. Without the correct checksum the robot does not respond.
WIRELESS ANDROS • New control system • The last character of the command strings: this character is the sum of the ASCII codes of all preceding characters. With this problem fixed, the IRIS Lab can now claim complete wireless control over the Remotec ANDROS with no restrictions • The new code is very flexible: any character can be changed independently, as long as the correct checksum is generated before the string is sent, so there is no need for a database
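As an illustration of that rule, here is a minimal sketch of the checksum step, assuming the final character is the low byte of the sum of the ASCII codes of the preceding characters (the exact truncation rule is not spelled out on the slides):

```cpp
// Sketch only: append the checksum character to a command body before sending it.
// Assumption: the checksum is the low byte of the sum of the ASCII codes of all
// preceding characters.
#include <string>

std::string AppendChecksum(const std::string& body)
{
    unsigned int sum = 0;
    for (unsigned char c : body)                   // sum the ASCII codes of every character
        sum += c;

    std::string cmd = body;
    cmd.push_back(static_cast<char>(sum & 0xFF));  // keep only the low byte as the last character
    return cmd;                                    // full command string, ready to transmit
}
```

Because the checksum is recomputed each time, any setting character can be edited in place and the resulting string remains valid, which is what removes the need for the command database.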
WIRELESS ANDROS • New control system • Functionality problems: • First version: The user sends a string with each button click. This technique is problematic because it is discontinuous; to send a continuous command the user has to keep clicking the buttons very fast, which is neither intuitive nor practical • Second version: The user starts a timer that continuously sends a command to the robot, but has to be careful to click another button to stop the timer before any damage is done to the robot or its surroundings: not intuitive and dangerous • Solution: The latest version of the GUI is safe and much more intuitive. The program sends the string for as long as the user holds a specific button down and stops sending as soon as the user releases the button
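A minimal MFC-style sketch of this press-and-hold behaviour is shown below. The class name CHoldButton, the timer ID, and the user-defined message handled by the parent dialog are hypothetical placeholders, not names from the actual GUI code:

```cpp
// Sketch only: repeatedly send the command string while a GUI button is held down,
// and stop the moment it is released.
#include <afxwin.h>

static const UINT WM_APP_SEND_COMMAND = WM_APP + 1;   // parent dialog builds and sends the string

class CHoldButton : public CButton
{
protected:
    afx_msg void OnLButtonDown(UINT nFlags, CPoint point)
    {
        CButton::OnLButtonDown(nFlags, point);
        SetTimer(1, 100, nullptr);                     // re-send roughly every 100 ms while pressed
    }
    afx_msg void OnLButtonUp(UINT nFlags, CPoint point)
    {
        KillTimer(1);                                  // stop sending as soon as the button is released
        CButton::OnLButtonUp(nFlags, point);
    }
    afx_msg void OnTimer(UINT_PTR nIDEvent)
    {
        if (nIDEvent == 1)
            GetParent()->PostMessage(WM_APP_SEND_COMMAND, GetDlgCtrlID());
        CButton::OnTimer(nIDEvent);
    }
    DECLARE_MESSAGE_MAP()
};

BEGIN_MESSAGE_MAP(CHoldButton, CButton)
    ON_WM_LBUTTONDOWN()
    ON_WM_LBUTTONUP()
    ON_WM_TIMER()
END_MESSAGE_MAP()
```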
WIRELESS ANDROS • New control system - Achievement : functional & flexible GUI Main window
WIRELESS ANDROS • New control system - Achievement : functional & flexible GUI Vehicle Drive
WIRELESS ANDROS • New control system - Achievement: functional & flexible GUI Settings Weapons
ECE 400 • New control system - Problem: There are no problems with the body functions, driving options, or settings. However, one problem remains: I have tested the weapons on KERMIT and obtained conclusive results, but when I test the weapons on the robot itself I do not get a voltage.
ECE 400 • Senior Design Project – Encoding carriage for sensor bricks Objective: This semester, senior design team 14 will build a device that will carry and track the motion of a sensor brick during a scanning process. The main idea is to design a brick carriage that is: • Adjustable to different brick sizes if necessary • Equipped with an encoding system that keeps track of the brick during scans • Maneuverable by either of the Remotec ANDROS robots
ECE 400 • Senior Design Project – Encoding carriage for sensor bricks Conflict: The team had a conflict over using a servo motor or an encoder to keep track of the carriage, which gave the project a new orientation • For accuracy purposes the team opted for a servo motor • This decision removed the ANDROS robot from the picture, since the carriage became its own robot • The final objective became to design a better SAFEBOT with a special new feature. The SAFEBOT has one major problem: battery life under certain loads (range sensor brick) • The new design will improve the battery amperage • The new feature will be a center wheel for a better turning mechanism, plus the servo motor for encoding, which will go along with a display of the robot's location and the distance it has covered
Literature Review • New control system – Work in progress • Review on Autonomy – Objective • Extend and improve last semester's review on autonomy • Focus on finding one method to implement an autonomous ANDROS (ANDIBOT: Autonomous Navigation and Directed Imaging Robot)
WIRELESS ANDIBOT • New control system – Work in progress • 3 options for autonomous ANDROS (ANDIBOT) • Robot gets to a car or other target autonomously and an operator performs the inspection • Operator gets the robot to the car and the robot does the inspection autonomously • Robot gets to the car and performs the inspection autonomously and operator only intervenes if something goes wrong (ideal)
WIRELESS ANDIBOT • New control system – Work in progress • Robot gets to the car autonomously (focus on navigation) • Use of a range sensor + odometry for obstacle avoidance and localization • Use of a visual sensor (visual homing) + odometry: wide panoramic fields of view and landmarks, combined with odometry, but no obstacle avoidance here Note: You may ask: if the operator has to take over when the robot gets to the car, why can't he/she just drive the robot to the target? One answer may be to capture images and bring them back to the operator's station.
WIRELESS ANDIBOT • New control system – Work in progress • Robot performs the inspection autonomously • Use of a range sensor and odometry for obstacle avoidance and localization • Focus on the arm: the robot may be “safe”, but the arm could still collide with its surroundings. Even on the current system, Remotec insists that the operator keep a safe zone around the robot when he/she cannot see it, because someone may get hurt or the robot may damage objects (the car to be inspected)
WIRELESS ANDIBOT • New control system – Work in progress • Robot gets to the car and performs the inspection autonomously • Obviously this option combines the two previous cases and presents the most challenges
WIRELESS ANDIBOT • Facts about implementing ANDIBOT • Autonomous robots usually use odometry to move a certain distance in a certain direction, then sensory information constantly verifies their actual position • Newer approaches use visual sensors for homing => suitable for navigation, less so for task-oriented missions • String commands contain information regarding direction and speed and could be used to calibrate ANDIBOT's motion instead of an encoder; I would then just need a sensor to verify the robot's position. (Note: ANDROS is already built and has no encoders) • Problem: battery
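To make this idea concrete, here is a minimal dead-reckoning sketch that integrates the commanded speed and turn values over the time each command string is active, in place of encoder feedback. The scale factors and the interpretation of the speed and direction fields are assumptions that would have to be calibrated on the real robot:

```cpp
// Sketch only: estimate the robot's pose from the command stream rather than encoders.
// speedScale and turnScale are hypothetical calibration constants.
#include <cmath>

struct Pose { double x = 0.0, y = 0.0, heading = 0.0; };   // metres, metres, radians

// Update the estimated pose after a command has been active for dt seconds.
// commandedSpeed / commandedTurnRate: values extracted from the command string (arbitrary units).
void UpdatePose(Pose& pose, double commandedSpeed, double commandedTurnRate, double dt)
{
    const double speedScale = 0.05;   // m/s per speed unit  -- needs calibration
    const double turnScale  = 0.02;   // rad/s per turn unit -- needs calibration

    pose.heading += commandedTurnRate * turnScale * dt;
    pose.x += commandedSpeed * speedScale * dt * std::cos(pose.heading);
    pose.y += commandedSpeed * speedScale * dt * std::sin(pose.heading);
}
```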
WIRELESS ANDIBOT • Implementing ANDIBOT • The IRIS Lab focuses on modular robotics: We have a thermal, a visual and a range sensor brick. The general goal for the robots is to become mobility bricks • One purpose for ANDIBOT is to be able to carry sensors for scans of suspicious objects or environments • The preference then is: • Not to add sophisticated sensors to the robot • Focus on navigation first and possibly some manipulation
WIRELESS ANDIBOT • Implementing ANDIBOT: Visual Homing • Definition: • Homing: Inspired by insect physiology, it is the ability of insects to return to their nest after traveling a long distance along a certain path. In robotics, it is the ability of the robot to return to its initial position after accomplishing a certain task • Visual Homing: the ability to base navigation on the memory of special landmarks and the use of large fields of view (no 3-D information required) • Basic idea: Tracking special features such as corners between two panoramic images • Some methods use visual homing in addition to other localization methods; recent methods use visual homing alone
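As a concrete illustration of turning landmark bearings into a motion command, the sketch below uses the classical average-landmark-vector rule, one simple visual-homing scheme from the literature rather than the method of any particular reviewed paper, and assumes both panoramic views are compass-aligned:

```cpp
// Sketch only: average-landmark-vector homing. Given landmark bearings seen in the
// current panorama and in a snapshot taken at the goal, steer along the difference
// of the two average unit vectors.
#include <cmath>
#include <vector>

// Bearings (radians, compass-aligned) of landmarks in the current view and the goal snapshot.
double HomeDirection(const std::vector<double>& currentBearings,
                     const std::vector<double>& goalBearings)
{
    if (currentBearings.empty() || goalBearings.empty())
        return 0.0;                                   // no landmarks: no homing information

    double cx = 0, cy = 0, gx = 0, gy = 0;
    for (double b : currentBearings) { cx += std::cos(b); cy += std::sin(b); }
    for (double b : goalBearings)    { gx += std::cos(b); gy += std::sin(b); }
    cx /= currentBearings.size();  cy /= currentBearings.size();
    gx /= goalBearings.size();     gy /= goalBearings.size();

    // Home vector = current average landmark vector minus the goal's; its direction
    // is the heading to drive along to approach the snapshot position.
    return std::atan2(cy - gy, cx - gx);
}
```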
WIRELESS ANDIBOT • Implementing ANDIBOT: Visual Homing • Focus: Navigation • Problems: • This method certainly involves a good visual sensor mounted on the robot • Visual homing is usually not concerned with obstacle avoidance • Solutions • Add a camera to the ANDROS Mark 5A • Assume no obstacle between the robot and its destination
WIRELESS ANDIBOT • Implementing ANDIBOT: Visual Servoing • Definition: position control of a mechanical device using real-time visual feedback • Basic idea: Controlling the robot using a camera • Mimics the human sense of vision • Look-then-move control loop • Widely used in assembly tasks and manipulation • Accuracy depends on the accuracy of the visual sensor
WIRELESS ANDIBOT • Implementing ANDIBOT: Visual Servoing • Objective • This option involves having one sensor at the control station • The system takes a picture and recognizes the robot in the image • The user indicates the destination on that same image • The robot starts heading that way • Problems • No 3-D information in the image • Obstacle avoidance? • Accuracy?
WIRELESS ANDIBOT • Implementing ANDIBOT: Visual Servoing • Solutions • The main assumption will be that the path between the robot and the object to be inspected is obstacle-free • Another assumption is that I am able to get some consistency out of the strings as an encoding system, to limit additional hardware • The user may still make mistakes, and there will be some errors in the calibration (strings or encoder), so the idea is to add an inexpensive sonar sensor that will alert the user if the robot is about to collide with an unexpected obstacle
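Putting these assumptions together, the sketch below shows the shape of the look-then-move loop with the proposed sonar safety check. Every interface function in it (LocateRobotInImage, ReadSonarRangeMeters, SendDriveCommand, and so on) is a hypothetical placeholder, since the slides do not define an API; only the control structure is the point:

```cpp
// Sketch only: one "look-then-move" iteration toward a destination the user picked
// on the control-station image, with a sonar check before each step.
#include <cmath>

struct Point2D { double x = 0, y = 0; };

// --- placeholder interface (assumed, not from the presentation) ----------------
Point2D LocateRobotInImage();                         // "look": find the robot in the station camera image
double  ReadSonarRangeMeters();                       // range to the nearest obstacle ahead of the robot
void    SendDriveCommand(double heading, double speed);   // builds and sends the command string
void    SendStopCommand();
void    AlertOperator();                              // GUI alert: unexpected obstacle
// --------------------------------------------------------------------------------

// Returns true when the robot should stop (goal reached or obstacle detected).
bool StepTowardTarget(const Point2D& target)
{
    const Point2D robot = LocateRobotInImage();
    const double dx = target.x - robot.x, dy = target.y - robot.y;
    const double distance = std::sqrt(dx * dx + dy * dy);

    if (distance < 0.10) {                            // within 10 cm of the destination: done
        SendStopCommand();
        return true;
    }
    if (ReadSonarRangeMeters() < 0.50) {              // unexpected obstacle closer than 50 cm
        SendStopCommand();
        AlertOperator();
        return true;
    }
    SendDriveCommand(std::atan2(dy, dx), /*speed=*/0.2);  // "move" a short step, then look again
    return false;
}
```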
WIRELESS ANDIBOT • Implementing ANDIBOT: Visual Servoing • Additional features on the GUI • A window for the main view of the robot and its destination • An additional inexpensive camera could still send a surveillance stream of images back to the control station, showing what the robot “sees” • An alert system to signal obstacles • A manual option to circle or draw a rectangle around a suspicious object on the captured image; that option would correspond to the robot going around that particular object
WIRELESS ANDIBOT • Implementing ANDIBOT: Homing vs. Servoing • Despite their different definitions, both methods use visual data to control a robot • Homing focuses on navigation; servoing is often more task-oriented • Homing implies adding a sophisticated visual sensor on board or at the control station, which in a way defeats the purpose of carrying a visual sensor brick, despite the difference between inspecting and navigating • Because the purpose of ANDIBOT is to keep people safe in hazardous situations, we have to keep the manual teleoperation option on ANDIBOT, which in a way defeats the purpose of homing
WIRELESS ANDIBOT • Implementing ANDIBOT: Homing vs. Servoing • Homing can be used with or without additional sensor information • Servoing does not use additional sensor information and uses a basic off-the-shelf camera, which is an advantage in this modular robotics project • The lab has other projects focusing on localization and path planning using range sensors or GPS sensors, so homing would be a new method to implement • On the other hand, the lab does not yet have a project on manipulation, which favors the servoing option
WIRELESS ANDIBOT • Implementing ANDIBOT: Homing vs. Servoing In conclusion, considering the general work in the IRIS Lab, the modular aspect of this project, and the fact that we have to keep a teleoperation option for ANDIBOT, the best option seems to be visual servoing rather than homing. Visual homing and following remain useful additional features to add to the current GUI.
Conclusion • In research on autonomous mobile platforms, a serious concern is the control of the motors and actuators on the robot • Here we have a robust system that is controlled simply by wirelessly sending strings from a computer to the robot • We have bypassed one of the first difficulties in implementing an autonomous system • Our work should now focus on integrating sensor information to add intelligence to this robotic system
Conclusion • The IRIS Lab can now claim complete wireless control over the Remotec ANDROS Mark V - All the essential functions can be implemented and all the settings can be accessed • ANDIBOT is a mobility brick capable of carrying sensors for scans and inspections while operating autonomously • ANDIBOT will keep the teleoperation option for delicate situations in which human intervention is still required • The work on ANDIBOT 1 will focus on autonomous navigation using visual homing or autonomous manipulation using visual servoing
Future Work • No more classes! • Possibly narrow the research to autonomy using visual servoing control • Acquire an inexpensive camera and sonar sensors and learn how to program them • Start implementing ANDIBOT
THANK YOU !!! Questions?