Self-Driving Cars: How Sensors & Analytics Work

Matthew Walter was on the MIT team that finished fourth in the DARPA Urban Challenge while he was at MIT. He explained in detail all the kit on their car: lasers, cameras, onboard analytics, and more.

  1. Official Event Info and Pre-Tweets

  2. Live Tweets & Notes by @csrollyson

  3. Matthew Walter kicking off
  4. Notes by @csrollyson

    His group: 30 University of Chicago grad students, recently sponsored by Toyota. The goal is to enable robots to interact with their environment and act deliberately. Many use cases, e.g. the kitchen; also self-driving cars.

    He is an MIT alum and competed in the DARPA Challenge. It was a huge team: 20-30 people worked to develop the cars.

    The DARPA Grand Challenge was driven by the war in Iraq: a mandate that one third of ground combat vehicles be unmanned by 2015, to eliminate soldier deaths from driving supplies around. This was the driving force behind the Challenge.

    DGC I was in 2004 and DGC II in 2005. In DGC I the farthest any team got was 7.4 miles, and no one finished. In DGC II five teams finished; Stanford's car, Stanley, won. Sebastian Thrun started Google X after his team won DGC II. Hit the links to see what failures caused teams to drop out; they give more context to Matt's remarks.

    The Urban Challenge came in 2007: dynamic obstacles, moving cars, and a requirement to obey traffic laws. GPS was blocked in places. Each team got two files: a topology file describing the roads, and a mission file listing the places to visit. 89 teams started.

    Three questions: Where is the road? What are the static obstacles? Where are the other vehicles? The hard part is dealing with uncertainty: about the environment, about other vehicles' locations, and about other vehicles' intents.
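
The two input files can be thought of as a road graph plus an ordered list of checkpoints; planning a mission then chains shortest routes between checkpoints. A minimal sketch with a toy graph (hypothetical data and function names; the real DARPA RNDF/MDF formats are far richer):

```python
from collections import deque

# Hypothetical road topology: waypoint -> reachable neighbor waypoints.
ROADS = {
    "A": ["B"], "B": ["A", "C", "D"],
    "C": ["B", "E"], "D": ["B", "E"], "E": ["C", "D"],
}

def shortest_route(start, goal):
    """Breadth-first search for a fewest-hop route between two waypoints."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in ROADS[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

def plan_mission(start, checkpoints):
    """Visit each mission checkpoint in order, chaining shortest routes."""
    route = [start]
    for cp in checkpoints:
        route += shortest_route(route[-1], cp)[1:]
    return route

print(plan_mission("A", ["E", "A"]))  # ['A', 'B', 'C', 'E', 'C', 'B', 'A']
```

The actual vehicles also had to replan when roads turned out to be blocked, but the same graph-search idea underlies it.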
  5. The DARPA Urban Challenge
  6. Advanced navigation; sparse waypoints.

    The course was a suburb, not really a city: paved roads, though not all had intact lane markings, with one- and two-way streets.

    Scope: teams had to pass vehicles, merge into traffic, park in lots, and handle potholes. No pedestrians, speeds under 30 mph, obey traffic laws, no really difficult terrain.

    Team MIT was very multidisciplinary.

    The car: a Land Rover LR3, converted to drive-by-wire, with sensors installed: 5 cameras, 16 radars, a Velodyne 3D laser, 12 SICK laser scanners, plus GPS/IMU. It needed a big compute cluster in the trunk (40 GB of RAM, linked by Ethernet) to crunch the data locally, a 6 kW generator, and a 2 kW air conditioner on the roof to cool the tech. The philosophy: as many sensors as possible.

    The cameras are FireWire; 4 face forward and 1 rearward, used for lane detection.

    LIDAR: the 12 SICKs look at the ground, a setup common in robots. They detect obstacles, curbs, and other vehicles.
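
One common way downward-looking scanners pick out curbs and obstacles is to flag sudden height jumps between neighboring beams along a scan line. A toy illustration (the threshold and data are invented, not the team's actual pipeline):

```python
def find_obstacles(heights, jump_threshold=0.10):
    """Flag indices where measured ground height jumps more than
    jump_threshold (meters) between neighboring beams: a crude
    curb/obstacle detector for a downward-looking scan line."""
    hits = []
    for i in range(1, len(heights)):
        if abs(heights[i] - heights[i - 1]) > jump_threshold:
            hits.append(i)
    return hits

# A flat road with a ~15 cm curb between beams 3 and 4:
scan = [0.00, 0.01, 0.00, 0.01, 0.16, 0.15, 0.16]
print(find_obstacles(scan))  # [4]
```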

    The Velodyne is for obstacle detection; in the 3D view, colors encode height: blue low, red high. There is a dead spot around the vehicle, roughly a 3 m radius. All sensors have their own reference frames and need to be integrated through calibration, relative to the car's reference frame, not GPS.
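
Integrating the sensors means transforming each measurement from a sensor's own frame into the car's body frame using a calibrated rotation and translation. A minimal 2D sketch (the mounting pose numbers are invented for illustration):

```python
import math

def sensor_to_body(point, mount_xy, mount_yaw):
    """Rotate a 2D point from the sensor frame by the sensor's calibrated
    yaw, then translate by its mounting offset, yielding body-frame
    coordinates relative to the car, independent of GPS."""
    c, s = math.cos(mount_yaw), math.sin(mount_yaw)
    x, y = point
    return (mount_xy[0] + c * x - s * y,
            mount_xy[1] + s * x + c * y)

# Hypothetical scanner mounted 1.5 m ahead of the body origin,
# rotated 90 degrees to the left; a return 2 m ahead of the sensor:
p = sensor_to_body((2.0, 0.0), mount_xy=(1.5, 0.0), mount_yaw=math.pi / 2)
print(round(p[0], 3), round(p[1], 3))  # 1.5 2.0
```

In 3D the same idea uses a 6-degree-of-freedom pose per sensor; getting those calibrations right is what lets 18+ laser scanners agree on where an obstacle is.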

    ACC (adaptive cruise control) radars: 16 of them, with narrow beam widths; moving objects get individual IDs. Most of the data is not useful.
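
Giving each moving object a persistent ID across radar frames is a data-association problem. A minimal greedy nearest-neighbor sketch (the gating distance and structure are invented; production trackers use motion models and smarter assignment):

```python
def associate(tracks, detections, gate=2.0):
    """Greedy nearest-neighbor association: match each detection to the
    closest existing track within `gate` meters, else start a new track.
    `tracks` maps track id -> last known (x, y) position."""
    next_id = max(tracks, default=0) + 1
    for det in detections:
        best_id, best_d = None, gate
        for tid, pos in tracks.items():
            d = ((det[0] - pos[0]) ** 2 + (det[1] - pos[1]) ** 2) ** 0.5
            if d < best_d:
                best_id, best_d = tid, d
        if best_id is None:
            best_id, next_id = next_id, next_id + 1
        tracks[best_id] = det
    return tracks

tracks = {1: (10.0, 0.0)}
associate(tracks, [(10.5, 0.2), (40.0, 5.0)])
print(sorted(tracks))  # [1, 2]: old target kept ID 1, new one got ID 2
```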
  7. Getting into the details of how autonomous cars navigate
  8. The MIT Team's car, all kitted up