Helm.ai releases new architectural framework for autonomous vehicles
Developers in the autonomous driving industry typically build massive black-box, end-to-end models for autonomy that require petabytes of data to learn driving physics from scratch. Helm.ai today unveiled its Factored Embodied AI architectural framework, which it says takes a different approach. Alongside the framework, the company released a benchmark demonstration of its vision-only AI Driver navigating the streets of Torrance, CA, zero-shot, without ever having seen those specific streets before. The demonstration included lane keeping, lane changes, and turns at urban intersections. Helm.ai said it achieved this autonomous steering capability by training the AI on simulation plus only 1,000 hours of real-world driving data.
https://www.therobotreport.com/helm-ai-releases-new-architectural-framework-for-autonomous-vehicles/

