Remotely Operated Vehicle for Exploration and Reconnaissance 2 (ROVER2)

Guest blogger Andrew Boggeri from UCLA Robotics Club discusses his team’s project, the Remotely Operated Vehicle for Exploration and Reconnaissance 2 (ROVER2). The ROVER2 competed in this year’s University Rover Challenge (URC) hosted by the Mars Society. Logic Supply helped sponsor the project by donating Mini-ITX hardware.

I want to start by thanking Logic Supply again for their support of our team, the Robotics Club at UCLA. Due to Logic Supply’s sponsorship, our team has seen much growth and continued success.

The Robotics Club at UCLA is focused on enhancing its members’ education through hands-on experience. We have a team of over 30 students working on two projects. Our flagship project is the University Rover Challenge (URC), a competition hosted by the Mars Society and scheduled for May 28-30, 2009 at its Mars Desert Research Station (MDRS) in Hanksville, Utah.

The URC tasks teams with designing, building, and operating a Mars Rover-like system in a series of simulated Mars mission tasks. These tasks are described below:

  • Site Survey Task: This “mapping” task requires teams to drive to a vantage point and identify the locations of white PVC markers on a target range.
  • Construction Task: The construction task requires teams to tighten hex head bolts in a 3D assembly.
  • Extremophile Task: This task requires investigating sites of biological interest. There is a list of target sites which teams must investigate and then present a report on which sites merit further investigation and why.
  • Emergency Navigation Task: The emergency navigation task challenges teams to deliver an emergency payload to an astronaut in distress.

For a complete list of the URC Rules, Guidelines, and Requirements, please click here. (2/15/2012 Note: Link has been removed)

System Layout

Our system for this year’s URC features a VIA EPIA EN15000G Mini-ITX motherboard with 2 GB of RAM running a customized version of Ubuntu Linux that handles the high level communication and control tasks. The EN15000G was chosen for its integrated serial and IEEE 1394 ports, as well as its low power draw and minimal heat generation. Overheating is a major concern in the desert, where surface temperatures can reach 100°F and processor loads can be intense due to high frame-rate cameras.

When the Rover is powered, the system boots, logs in, and then initiates the RoboServer control program and waits for a client to connect. Upon connection, the client computer is presented with a GUI showing the attached sensors and their readings, a camera view, the arm control sliders, and the motor control wheel. The client is then able to remotely operate ROVER2. Depending on battery load-out and driving conditions, ROVER2 is able to operate continuously for 1 to 2 hours.
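
The connect-and-control handshake above can be sketched as a minimal TCP server loop. This is a hypothetical stand-in: the post does not detail RoboServer’s actual wire protocol, so the newline-terminated text commands and the `ACK` replies here are assumptions for illustration.

```python
import socket
import threading

def robo_server(host="127.0.0.1", port=9000, ready=None):
    """Listen for a single client and acknowledge its commands (sketch only).

    The real RoboServer exposes sensors, cameras, and motor control; this
    stub just echoes back newline-terminated text commands with an ACK.
    """
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(1)
    if ready is not None:
        ready.set()                      # signal that we are accepting
    conn, _ = srv.accept()               # block until a client connects
    with conn:
        buf = b""
        while True:
            data = conn.recv(1024)
            if not data:                 # client disconnected
                break
            buf += data
            while b"\n" in buf:          # process complete lines only
                line, buf = buf.split(b"\n", 1)
                conn.sendall(b"ACK " + line + b"\n")
    srv.close()
```

A client would then connect with `socket.create_connection((host, port))` and send commands such as `DRIVE 30 0.5` terminated by a newline.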

The server and client communicate over two 900 MHz Ethernet data radios capable of up to roughly 1 Mbit/s data transfer. The AX3500 motor and arm controller communicates with RoboServer over the serial port, while our ATMEGA microcontrollers gather sensor data and send it over USB. The Point Grey cameras use the FireWire bus available on the EN15000G, making integration extremely easy. All devices are hot-pluggable save for the AX3500.
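
The AX3500 is driven over its serial port with short ASCII commands. A helper that formats a signed power level into a command of that general shape might look like the following; the exact syntax below (uppercase channel for forward, lowercase for reverse, two-digit hex magnitude) is an assumption for illustration, so consult the Roboteq AX3500 manual before using it.

```python
def ax3500_command(channel: str, power: float) -> bytes:
    """Format a motor power command in AX3500-style ASCII (assumed syntax).

    channel: "A" or "B"; power: -1.0 (full reverse) .. +1.0 (full forward).
    Forward uses an uppercase channel letter, reverse a lowercase one,
    followed by a two-hex-digit magnitude (00-7F) and a carriage return.
    """
    if channel not in ("A", "B"):
        raise ValueError("channel must be 'A' or 'B'")
    power = max(-1.0, min(1.0, power))        # clamp to valid range
    magnitude = round(abs(power) * 0x7F)      # scale to 0..127
    letter = channel if power >= 0 else channel.lower()
    return b"!" + letter.encode() + format(magnitude, "02X").encode() + b"\r"
```

The resulting bytes would be written to the serial port (for example with pyserial’s `Serial.write`) at whatever baud rate the controller is configured for.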

ROVER2 Arm

The arm was built using standard hobby servos and off-the-shelf linear actuators to reduce cost and complexity. On the client side it is controlled either by a slider panel or by an optional full-size model of the arm that uses potentiometer feedback for the joint rotations.

Rover Chassis and Arm

The chassis is made of aircraft aluminum, with a drivetrain featuring rubber tank treads and hardened steel drive shafts. The drive motors are rated at 3 hp each, but due to weight and power limitations they will likely develop no more than 0.5 hp combined.

Application Screenshot

The device in the upper left of the “Application Screenshot” is our driving “compass.” You drag a vector and click on a location; based on the angle from the center line and the length of your vector on the compass, the software tells the motor controller how much power to send to each motor. The top 180° are for forward motion, the black 45° wedges are for rotation in place, and the rear 90° are for reverse motion.

The data readouts to the immediate right show temperature, time of activity, and data link activity (the comm protocol is self-adjusting based on available bandwidth), as well as GPS location and compass heading. The extra info is not visible because our sensor packages are not attached (they’re hot-pluggable).

The rectangle is a representation of the ROVER, front facing up, with each perimeter rectangle representing a sonar sensor and a range in inches. This allows us to disable the video and “drive blind” in low-bandwidth situations (loss of line of sight, extreme range). The center rectangle represents the ROVER’s position in 3D space with a vector and angle (taken from a 3-axis accelerometer).

The bottom screen is our video, allowing for size changes, exposure variation, pan-tilt (the sliders), and swapping between the various cameras.
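
The compass mapping described above is essentially arcade-style skid-steer mixing. A simplified sketch is below; the zone boundaries follow the description in the text (top 180° forward, 45° wedges for rotation, rear 90° reverse), but the turn scaling is an assumption rather than ROVER2’s actual tuning.

```python
def compass_to_motor_power(angle_deg: float, length: float) -> tuple:
    """Map a dragged compass vector to (left, right) motor power.

    angle_deg: 0 = straight ahead, positive = clockwise (to the right).
    length: 0..1, fraction of full power (the vector's length).
    """
    length = max(0.0, min(1.0, length))
    a = ((angle_deg + 180.0) % 360.0) - 180.0   # normalize to [-180, 180)
    if abs(a) <= 90.0:                  # top 180 deg: forward motion
        turn = a / 90.0                 # -1 (hard left) .. +1 (hard right)
        left, right = length, length
        if turn > 0:
            right *= 1.0 - turn         # slow the inside track to turn right
        else:
            left *= 1.0 + turn          # slow the inside track to turn left
    elif abs(a) >= 135.0:               # rear 90 deg: reverse motion
        left = right = -length
    else:                               # 45 deg wedges: rotate in place
        left = length if a > 0 else -length
        right = -left
    return (round(left, 3), round(right, 3))
```

For example, a full-length vector straight up commands both tracks forward at full power, while a full-length vector at 110° spins the rover in place clockwise.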

Stay tuned for more details and pictures of the actual URC.
