Motional open-sources VR environments for autonomous vehicle research

Motional today open-sourced nuReality, its set of custom virtual reality (VR) environments designed to study interactions between autonomous vehicles and pedestrians. The 10 VR scenarios are modeled after an urban four-way intersection with no clear pedestrian crossing zone, stop signs, stoplights, or other traffic-control elements. The scenarios include:

  • A human driver stopping at an intersection
  • An autonomous vehicle stopping at an intersection
  • A human driver not stopping at an intersection
  • An autonomous vehicle not stopping at an intersection
  • An autonomous vehicle using expressive behavior, such as a light bar or sounds, to signal its intentions
Motional uses two vehicle models in the VR environments: a conventional, human-driven vehicle and an autonomous vehicle without a human operator. The human-driven model includes a male driver who looks straight ahead and remains motionless during the interaction. In the clip below, the approaching Motional robotaxi uses an LED strip in the front windshield to indicate that the vehicle is stopping.

In this next clip, Motional said, the approaching robotaxi’s nose dips to signal that the vehicle is stopping. According to Motional, ‘The nose dip is a signal to the driver that the car is coming to a stop.’ The car is expected to be on the road by the end of the year.

Motional, the $4 billion autonomous vehicle joint venture between Hyundai and Aptiv, recently said it plans to launch a fully driverless robotaxi service in Las Vegas in 2023. In November 2020, Motional received the go-ahead from the state of Nevada to test fully driverless vehicles on public roads. Motional’s autonomous vehicles use a sensor suite of advanced LiDAR, cameras, and radar.
