The Dreaded Merge Test: The DARPA Urban Challenge (Part 2)

More on the DARPA robot car contest, currently underway, from Scott Ettinger. Read his previous post for more background.

Today Junior (Stanford’s robotic vehicle) faced the dreaded merge test at track A. As I described earlier, this test involves dense two-way traffic of human-driven vehicles on a loop, into and out of which Junior must continuously merge by making left turns. Some of the runs by other robotic vehicles have been very exciting to watch: a few misjudged the traffic and caused mayhem by pulling out with cars directly in front of them. One even managed to collide with one of the traffic cars, despite their drivers being professionals who are very good at getting out of the way (I have watched them avoid many very close calls).

Luckily, Junior’s test was not exciting at all. Junior did not even get honked at by the human drivers. Everything went as planned, except that at one intersection Junior waited longer than expected to merge. For reasons not yet fully understood, Junior interpreted something in the sensor data in that area as potentially dangerous and erred on the side of caution. The team will find the cause by examining the logs: Junior recorded gigabytes of data during the test, including sensor readings and the messages passed between software modules. Visualization tools built for exactly this purpose can play those logs back to show precisely what Junior was doing at any moment. Despite the cautious behavior, this was a very successful test for Junior.


[Photo 1: Junior waiting to make a left turn in traffic during the merge test]

Now, a few words about driving and computers. It is a sobering fact that nearly 40,000 people die each year as a result of traffic accidents in the U.S. alone, and far more are injured. These numbers do not have to be this high, and technology can help. Computer-controlled vehicles have the potential to dramatically improve transportation. Computer planning and control can make better use of the roadways and reduce the congestion so many of us spend large parts of our lives sitting in, and former drivers would be free to do other things while in transit. People who have had too much to drink could safely take their car home without endangering anyone. What is really going on here in Victorville is a step toward a vision of the future in which computers drive our cars with better safety and efficiency than we can ourselves. They don’t get tired, they don’t talk on cell phones, and they don’t look down to change the radio station. In my mind there is no question that this will happen; it is only a matter of when.

That said, here are Junior’s computing technical specs:

Junior has two rack-mounted computers in the trunk. Each is powered by a single-socket Intel Q6600 quad-core (2.4 GHz) processor on an Intel D975XBX2 motherboard with 2 GB of DRAM. The configuration was driven mainly by power: everything runs off the vehicle’s alternator, so the electrical budget is tight. Junior’s software is naturally parallel, thanks to the many different tasks the vehicle must perform, so it makes good use of the multiple cores. Although higher clock speeds are available in the Core 2 Quad family, the 2.4 GHz parts were chosen to reduce power draw and heat in the confined space at the back of the vehicle. Most communication, including sensor data and the control commands sent to the vehicle, runs over Gigabit Ethernet. For storage, the machines use solid-state flash drives, eliminating the possibility of hard-drive failures caused by the vehicle’s motion. One machine is dedicated to the wealth of sensor data streaming in, while the other handles planning and control.
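To make the two-machine split concrete, here is a toy sketch (my own illustration, not Junior’s actual software, whose message names and formats I don’t know): a “perception” side publishes an object estimate as a message, and a “planning” side consumes it, the way Junior’s machines exchange data over Gigabit Ethernet. UDP on localhost stands in for the real link.

```python
import json
import socket
import threading

def run_planner(sock, inbox):
    # Block until one message arrives, then record it.
    data, _ = sock.recvfrom(4096)
    inbox.append(json.loads(data.decode()))

def publish(port, message):
    # The "perception" side: serialize and send one object estimate.
    out = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    out.sendto(json.dumps(message).encode(), ("127.0.0.1", port))
    out.close()

# Bind the planner's socket before sending so no message is lost.
planner_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
planner_sock.bind(("127.0.0.1", 0))
port = planner_sock.getsockname()[1]

inbox = []
t = threading.Thread(target=run_planner, args=(planner_sock, inbox))
t.start()
# Hypothetical message fields, purely for illustration.
publish(port, {"track_id": 7, "range_m": 18.2, "bearing_deg": -3.0})
t.join()
planner_sock.close()
print(inbox[0]["range_m"])  # 18.2
```

The design point is decoupling: each module runs at its own rate and on its own core (or machine), with the network as the only interface between them.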


[Photo 2: Junior’s Electronics – Computers are bottom right]

I sat down with team leader Sebastian Thrun to talk about autonomous driving and computers. I asked him what he would do if he had 1000 times more computing available. His first response was that he could immediately use it. A key issue in robotics is dealing with uncertainty in your model of the world around you. To do this, researchers use a set of methods known as probabilistic computing. To estimate a quantity in the model (such as the position of a nearby car) given the available data, these methods test a large number of hypotheses about the actual value and weight each by how probable it is. The more hypotheses you test, the more robust your answer, so the accuracy of the solution scales directly with computing power. Robustness under uncertainty is the key to making robotic vehicles on city streets a reality. The future looks good for these techniques, as they parallelize very well. With Moore’s law continuing at its pace, vehicles of the near future will be able to take advantage of an enormous amount of computing horsepower.

It is starting to happen here in Victorville.

There is a wide range of computing on board the vehicles here at the event, from the largest I have seen (MIT’s vehicle carries 10 dual-socket, dual-core machines) down to teams that run everything on a single laptop. Carnegie Mellon also has an impressive array of computing in CompactPCI form factor.


[Photo 3: MIT’s robotic vehicle]


[Photo 4: A view of Carnegie Mellon’s computing system]

[For additional photo coverage of the event visit Jan Becker’s page]

One Response to The Dreaded Merge Test: The DARPA Urban Challenge (Part 2)

  1. John says:

    The Golem Group is the team that makes do with a single laptop. Lots of computing power is an asset, but using it efficiently is true innovation for potential applications.