Retrofitting a Classic Mini with an autonomy stack

Self Racing Cars (SRC) is a proving ground for autonomous vehicles from companies big and small. SRC lets these companies, as well as hobbyists, students, and researchers, test the capabilities of their autonomous vehicles. Vehicles are grouped into classes such as fully autonomous, tele-operated, and human-driven.

Mechanical mounting

In a Classic Mini, you are the bumper. Modern car designs offer many suitable options for sensor mounting; a Classic Mini has none of them. With the exception of the door skins and the side window glass, every surface is curved. So we turned to a solution that many other autonomous vehicle developers have chosen for prototyping.

Transmitting data and power

Both the Velodyne and RealSense units require an AC power source, which meant installing a DC-to-AC inverter in the Mini. With power solved, the last challenge was routing data from the sensors to the compute platform. The Intel RealSense draws both power and data through a single USB-C port, so it connects through a powered USB hub.
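With the inverter and hub in place, the quickest way to confirm the data path is to watch for sensor traffic. Below is a rough sanity-check sketch (not part of the Tangram Vision SDK), assuming the Velodyne is still on its factory-default network configuration, where data packets arrive over UDP on port 2368:

```python
import socket

# Velodyne LiDARs stream data as UDP packets; 2368 is the factory-default data
# port (assumption: the unit is on its default network configuration).
VELODYNE_DATA_PORT = 2368

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", VELODYNE_DATA_PORT))  # listen on all interfaces
sock.settimeout(5.0)

try:
    # A VLP-16/HDL-32 data packet carries a 1206-byte UDP payload.
    data, addr = sock.recvfrom(2048)
    print(f"Got a {len(data)}-byte packet from {addr[0]} -- LiDAR data path is live.")
except socket.timeout:
    print("No LiDAR packets in 5 s -- check power, cabling, and network settings.")
finally:
    sock.close()
```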

Tangram Vision Classic Mini

Software testing for the Tangram Vision SDK

We tested three aspects of the Tangram Vision SDK at SRC: sensor runtime, multi-modal sensor synchronization, and LiDAR streaming. We'd be remiss if we did not mention the harsh environment in which our systems, and those of the other teams, were tested: Thunderhill's two-mile West track, opened in 2014, has a challenging layout.
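Of those three, multi-modal synchronization is the easiest to illustrate in isolation. The snippet below is a minimal, hypothetical sketch (not Tangram Vision SDK code) of the basic idea: pair each camera frame with the nearest LiDAR sweep on a shared clock and drop pairs that drift too far apart. The 20 ms tolerance and the sample rates are arbitrary choices for the example.

```python
from bisect import bisect_left

def pair_by_timestamp(camera_stamps, lidar_stamps, max_skew=0.02):
    """Pair each camera frame with the nearest-in-time LiDAR sweep.

    Timestamps are in seconds on a shared clock; pairs further apart than
    `max_skew` (20 ms here, an arbitrary choice) are discarded.
    """
    lidar_stamps = sorted(lidar_stamps)
    pairs = []
    for cam_t in camera_stamps:
        i = bisect_left(lidar_stamps, cam_t)
        # Candidates: the sweep just before and just after the camera frame.
        candidates = lidar_stamps[max(i - 1, 0):i + 1]
        if not candidates:
            continue
        best = min(candidates, key=lambda t: abs(t - cam_t))
        if abs(best - cam_t) <= max_skew:
            pairs.append((cam_t, best))
    return pairs

# Example: a 30 Hz camera against a 10 Hz LiDAR on a shared clock.
camera = [i / 30.0 for i in range(90)]
lidar = [i / 10.0 + 0.004 for i in range(30)]
print(len(pair_by_timestamp(camera, lidar)), "synchronized pairs")
```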

NVIDIA Self Racing Cars

Other teams at SRC

SRC attracts a diverse set of teams that bring different kinds of vehicles with different levels of autonomy. Along with Tangram Vision, the teams that participated in SRC this year included:

  • PointOne Navigation: The company provides spatial localization for autonomous and ADAS-enabled cars. PointOne Navigation not only completed this year’s fastest fully autonomous lap with its self-driving Lexus, but it also completed a fully autonomous lap … in reverse. Over the two-mile course, its forward lap time was a quick 2:49; its reverse lap, at 4:37, was arguably even more impressive. One more fun fact: PointOne’s autonomy stack was originally developed by CEO Aaron Nathan for the 2007 DARPA Urban Challenge. This 14-year-old code was written in C# and runs on Windows 7, yet it still excels every year when Aaron brings it to SRC.
  • Qibus: vehicle tele-operation on demand
  • Faction: lightweight, driverless vehicle fleets for delivery and transportation. Faction uses Arcimoto three-wheeled EVs and completed multiple autonomous laps at the event.
  • AEye: high-performance, adaptive LiDAR sensors
  • NVIDIA: the R&D team tested a Ford Fusion outfitted with multiple LiDARs, cameras, radars, and other sensors.
  • Monarch Tractor: The company develops compact, autonomous, electric tractors. Unfortunately, Monarch’s tractor was too heavy to be allowed on track, but it was able to navigate autonomously around the track paddock.
  • Boltu Robotics: The company builds autonomous delivery robots. Boltu brought an autonomous Prius to this year’s event.

Self Racing Cars will return to Thunderhill Raceway Park in 2022 for another weekend of autonomous excitement. Tangram Vision will be back with our Classic Mini Cooper and an evolved sensor package. That said, we’re still trying to figure out how to automate a manual gear shift.

About the author

Adam Rodnitzky is a serial entrepreneur in perception and sensors. He co-founded ReTel Technologies, one of the first companies to apply human-in-the-loop analysis to video analytics. After ReTel, he joined Occipital as GM, where he launched the Structure Sensor and SDK, the most ubiquitous mobile depth sensor and SDK platform for iOS.
