When the current semester began, we had the opportunity to choose a few of our courses ourselves. One of those was Self-Localization, held by Prof. Dr. Jörg Roth. Reading the description got me pretty excited about the topic, not least because we would be using LEGO MINDSTORMS to build a simple robot and apply our newly gained knowledge about finding out where an object is located. Location here means an indoor location, which implies that GPS or Galileo are not that suitable. So I was pretty happy to be chosen as one of about 14 students to attend this course.
Once the course started we had to make a decision again, this time between the available projects. We took the one where we had to build a robot that would try to find a moving infrared ball in a room, follow it, navigate towards it, and stop right in front of it. There were two other small projects like ours and a bigger one with a team of five. They had to build a Mars-Rover-like robot consisting of two MINDSTORMS packages. The huge difference: their robot would be controlled from a computer in a different room, which sent movement commands to the bricks and received the sensor data to decide where to drive next. All the other bots had to run completely autonomously.
All the projects were implemented using LeJOS, an alternative firmware for the LEGO bricks that enables them to run Java code.
Back to our robot. We decided to build a small robot with two motor-controlled wheels and a third wheel in the back to stabilize it. For this setup the DifferentialPilot comes in handy, as it abstracts the motor controls and provides functions to drive forward or backward, rotate the bot on its own footprint, travel along an arc, and so on. For the sensors, we decided to use the basic ultrasonic and touch sensors to detect obstacles in front of the bot, and the infrared seeker to “see” the infrared ball.
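To give a feeling for what the DifferentialPilot abstracts away, here is a minimal sketch of the underlying differential-drive kinematics. This is plain Java, not the LeJOS API, and the wheel geometry values are made up for illustration:

```java
// Minimal differential-drive kinematics sketch (plain Java, not the
// LeJOS DifferentialPilot API; trackWidth is an illustrative value).
public class DiffDriveKinematics {
    final double trackWidth; // distance between the two wheels, in cm

    DiffDriveKinematics(double trackWidth) {
        this.trackWidth = trackWidth;
    }

    // Wheel speeds (cm/s) so the robot's center follows an arc of
    // radius r (cm) at linear speed v (cm/s). A positive r turns
    // left, so the left wheel is the inner, slower one.
    double[] arcSpeeds(double v, double r) {
        double left = v * (r - trackWidth / 2) / r;
        double right = v * (r + trackWidth / 2) / r;
        return new double[] { left, right };
    }
}
```

With a 12 cm track width and a 30 cm arc at 10 cm/s, the inner wheel runs at 8 cm/s and the outer at 12 cm/s; rotating on the robot's own footprint is just the limit where the wheels spin at equal speeds in opposite directions.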
We used the LEGO Digital Designer to plan and prototype our bot. Unfortunately, it did not provide all the LEGO bricks we had in our packages, so the big wheels are missing and the infrared sensor got replaced by a color sensor, which comes in a case similar to the infrared one’s. But maybe it was just me, not finding all the bricks I needed.
For me, the most interesting part of the development process was the Behavior API. The documentation introduces the topic somewhat along those lines:
“Many people think developing a robot and its software, which provides its behaviors, is quite a lot of if-then-elses.”
It isn’t. The Behavior API is a great example. All it needs is one class, the arbitrator, and an interface for the behavior itself. The arbitrator takes an array with all the behaviors the robot knows and calls the right one for each situation. Every behavior provides one single action like “drive forward”, “obstacle seen” or “hit wall”, a method to ask the behavior whether it is appropriate for the current situation, and a method to interrupt it. Using the order of the behaviors in the array, the arbitrator prioritizes the behaviors and schedules them appropriately.
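The pattern can be sketched in a few lines of plain Java. This is a simplified re-implementation, not the actual LeJOS classes (the real Arbitrator runs an endless loop and suppresses a running behavior when a higher-priority one takes over; this one-shot version only shows the selection logic):

```java
// Simplified sketch of the behavior/arbitrator pattern, not the real
// LeJOS classes. One scheduling step picks the highest-priority
// behavior that currently wants control and runs its action.
interface Behavior {
    boolean takeControl(); // "is this behavior appropriate right now?"
    void action();         // the single action, e.g. "drive forward"
    void suppress();       // interrupt the running action
}

class Arbitrator {
    private final Behavior[] behaviors; // higher index = higher priority

    Arbitrator(Behavior[] behaviors) {
        this.behaviors = behaviors;
    }

    // Returns the index of the behavior that ran, or -1 if none
    // wanted control.
    int step() {
        for (int i = behaviors.length - 1; i >= 0; i--) {
            if (behaviors[i].takeControl()) {
                behaviors[i].action();
                return i;
            }
        }
        return -1;
    }
}
```

With, say, a “drive forward” behavior at index 0 and an “obstacle seen” behavior at a higher index, obstacle handling automatically preempts driving whenever both want control.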
The other big chunk of work was the sensor data fusion and the interpolation. My colleague Patrick did a pretty good job on those. The infrared sensor, for example, provides an angle from -120 to 120 degrees to the seen infrared source, unfortunately only in 30-degree steps. Interpolating the signal strengths of the different angle steps let us bring the precision up to 15-degree steps.
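The basic idea behind that interpolation can be sketched as follows. This is not Patrick’s actual code, and the data layout (one strength value per 30-degree zone) is an assumption: if the neighbouring zone reads almost as strong as the best one, the ball probably sits between the two, which halves the step size to 15 degrees.

```java
// Sketch of the interpolation idea (hypothetical names and data
// layout, not the project's actual fusion code). Zone i covers the
// angle -120 + 30*i degrees, for i = 0..8.
public class SeekerInterpolation {
    // strengths[i] is the signal strength measured for zone i (0..8)
    static int estimateAngle(int[] strengths) {
        int best = 0;
        for (int i = 1; i < strengths.length; i++) {
            if (strengths[i] > strengths[best]) best = i;
        }
        int angle = -120 + 30 * best; // center of the strongest zone
        int left = best > 0 ? strengths[best - 1] : 0;
        int right = best < strengths.length - 1 ? strengths[best + 1] : 0;
        // If a neighbour is at least ~80% as strong as the best zone,
        // shift the estimate halfway towards it (15 degrees).
        if (left > right && left * 10 >= strengths[best] * 8) {
            angle -= 15;
        } else if (right > left && right * 10 >= strengths[best] * 8) {
            angle += 15;
        }
        return angle;
    }
}
```

So a lone peak in zone 3 yields -30 degrees, while a nearly-as-strong reading in zone 4 next to it shifts the estimate to -15 degrees.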
Last but not least, we had to provide a position once the bot had found the ball. After crunching the numbers and applying some basic math, we were able to provide the traveled distance, the ball’s linear distance to the starting point, and its x/y coordinates.
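The post doesn’t show that math, but the usual way to get those three values is dead reckoning over the drive commands. A minimal sketch, assuming the bot only ever travels straight segments and rotates in place:

```java
// Minimal dead-reckoning sketch (illustrative, not the project's
// actual code): accumulate travelled distance and x/y position from
// the travel/rotate commands, then derive the straight-line distance
// back to the starting point.
public class DeadReckoning {
    double x, y;        // position relative to the start, in cm
    double heading;     // in degrees, 0 = initial facing direction
    double travelled;   // total path length driven, in cm

    void rotate(double degrees) {
        heading += degrees;
    }

    void travel(double distance) {
        double rad = Math.toRadians(heading);
        x += distance * Math.cos(rad);
        y += distance * Math.sin(rad);
        travelled += Math.abs(distance);
    }

    // straight-line ("linear") distance from the starting point
    double linearDistance() {
        return Math.hypot(x, y);
    }
}
```

Driving 100 cm, turning 90 degrees, and driving another 100 cm gives a travelled distance of 200 cm but a linear distance of only about 141 cm, which is exactly the difference between the two numbers the bot reports.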
Keep in mind, if you want to build your own robot for whatever purpose, that the LEGO sensors are not too precise. They do a pretty good job when you just want to play around or get started with robotics, but from what I’ve seen over the last few weeks, there should be kits that do way better.
I can’t wait to finish this semester, and maybe my bachelor thesis, to have some more time to work through the remaining bullet points of leftover ideas and optimizations.