AI Inference for Automotive Sensing

Faster, higher resolution sensors in different modalities

More processing with less power consumption ahead

The automotive industry has been working on advanced driver assistance systems (ADAS), where AI inference plays a big role. Next, self-driving vehicles will usher in faster, higher resolution sensors, likely in three modalities, requiring more AI operations. Even as more sensor processing is needed, sensor and processing power consumption must drop if vehicles are to meet their range goals.

Lidar works by rapidly scanning a laser beam through 3D space. Its output is a 3D point cloud indicating distances to the objects the beam strikes. Point clouds can be highly detailed but difficult to interpret, especially as a vehicle and the objects around it move.
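To make the data concrete, here is a minimal sketch (not EdgeCortix code) that treats one lidar frame as an N x 4 array of x, y, z, and intensity values, then computes the range to each return; the sizes and value ranges are made up for illustration.

```python
import numpy as np

# Hypothetical example: a lidar frame as an (N, 4) array of x, y, z, intensity.
# Real sensors emit hundreds of thousands of such points per scan.
rng = np.random.default_rng(0)
points = rng.uniform(low=[-50, -50, -2, 0], high=[50, 50, 4, 1], size=(100_000, 4))

xyz = points[:, :3]
ranges = np.linalg.norm(xyz, axis=1)     # distance from the sensor to each return
in_front = points[xyz[:, 0] > 0]         # keep only points ahead of the vehicle

print(f"{len(points)} points, median range {np.median(ranges):.1f} m, "
      f"{len(in_front)} points in the forward hemisphere")
```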

Advanced modeling can find scan-to-scan differences, such as determining when another vehicle has pulled alongside, entered a blind spot, or strayed from its lane. Lidar has one other advantage: it can sense height, including clearances to bridges and parking structures, which is essential for high-profile vehicles.
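Height sensing can be illustrated with simple geometry: keep the points inside an assumed corridor ahead of the vehicle and take the lowest overhead return as the available clearance. The function below is a hypothetical sketch, with made-up corridor dimensions and vehicle height.

```python
import numpy as np

def overhead_clearance(points_xyz, corridor_width=3.0, look_ahead=40.0, min_height=2.0):
    """Estimate overhead clearance (m) from lidar points in the lane ahead.

    points_xyz: (N, 3) array in vehicle coordinates (x forward, y left, z up,
    origin at road level). Returns inf when nothing hangs over the corridor.
    """
    x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
    in_corridor = (x > 0) & (x < look_ahead) & (np.abs(y) < corridor_width / 2)
    overhead = points_xyz[in_corridor & (z > min_height)]
    return overhead[:, 2].min() if len(overhead) else np.inf

# Hypothetical check for a 4.1 m tall truck approaching a bridge:
clearance = overhead_clearance(np.array([[25.0, 0.2, 3.9], [30.0, -1.0, 4.6]]))
print("clearance %.1f m -> %s" % (clearance, "STOP" if clearance < 4.1 else "ok"))
```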

Much of the research into AI inference is focused on analyzing these point clouds, since many types of information can be harvested from them. AI inference techniques such as open-set semantic segmentation may help with unstructured lidar data.
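One common way to approximate open-set behavior is to treat low-confidence predictions as "unknown". The sketch below assumes per-point class scores from some segmentation model and applies a softmax confidence threshold; it illustrates the idea rather than any specific published method.

```python
import numpy as np

def open_set_labels(logits, known_classes, threshold=0.7, unknown_label=-1):
    """Assign each point its most likely known class, or `unknown_label`
    when the softmax confidence falls below `threshold`."""
    e = np.exp(logits - logits.max(axis=1, keepdims=True))   # stable softmax
    probs = e / e.sum(axis=1, keepdims=True)
    best = probs.argmax(axis=1)
    confident = probs.max(axis=1) >= threshold
    return np.where(confident, np.array(known_classes)[best], unknown_label)

# Hypothetical per-point scores for classes road, car, pedestrian (ids 0, 1, 2)
logits = np.random.default_rng(1).normal(size=(5, 3)) * 3
print(open_set_labels(logits, known_classes=[0, 1, 2]))
```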

Cameras are good at replicating many features of human vision. Increasing frame rates and pixel resolution produces higher quality images, but these increases introduce a challenge: more frames and pixels mean more processing.
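A back-of-the-envelope calculation shows how quickly the compute budget grows; the resolutions, frame rates, and operations-per-pixel figure below are assumptions chosen only to illustrate the scaling.

```python
# Rough compute scaling for a single camera stream (all figures are assumptions).
ops_per_pixel = 1_000          # nominal network cost per input pixel, in operations

for label, (width, height, fps) in {
    "1080p @ 30 fps": (1920, 1080, 30),
    "4K    @ 60 fps": (3840, 2160, 60),
}.items():
    tops = width * height * fps * ops_per_pixel / 1e12
    print(f"{label}: ~{tops:.2f} TOPS per camera")
```

In this example the 4K, 60 fps stream needs eight times the compute of the 1080p, 30 fps stream, before any gain from a more capable model.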

Increased detail means a distant object may be detected sooner, which can buy critical time, especially for a heavy truck at highway speed that needs more distance to maneuver and stop. Accurate detection may require semantic segmentation, assigning a class label to every pixel in the scene.
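As a rough illustration of per-pixel labeling, the sketch below runs a stock torchvision DeepLabV3 network (a stand-in, not the EdgeCortix pipeline) on a random frame and takes the argmax over classes to get one label per pixel.

```python
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

# Stand-in segmentation network; weights=None avoids a download here,
# a real system would load trained weights.
model = deeplabv3_resnet50(weights=None, num_classes=21).eval()

frame = torch.rand(1, 3, 512, 512)      # one normalized RGB frame (assumed size)
with torch.no_grad():
    logits = model(frame)["out"]        # shape: (1, 21, 512, 512)
labels = logits.argmax(dim=1)           # one class id per pixel
print(labels.shape, labels.unique()[:5])
```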

AI can also give cameras more capability, such as real-time depth estimation, which is crucial if camera-only self-driving systems are to perceive their surroundings correctly. Attention-based AI models may further improve detection of objects in camera images.
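The core operation behind attention-based models is scaled dot-product attention, in which a set of queries is matched against image-feature tokens. A minimal sketch, with made-up shapes loosely modeled on DETR-style detectors:

```python
import torch
import torch.nn.functional as F

def attention(q, k, v):
    """Scaled dot-product attention: each query attends over all keys."""
    scores = q @ k.transpose(-2, -1) / (q.shape[-1] ** 0.5)
    return F.softmax(scores, dim=-1) @ v

# Hypothetical shapes: 100 object queries attending over 900 image-feature tokens.
queries  = torch.rand(1, 100, 256)
features = torch.rand(1, 900, 256)
out = attention(queries, features, features)
print(out.shape)   # torch.Size([1, 100, 256])
```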

Making vehicles smarter with AI inference is only part of the change needed to make roads safer and more efficient. Another initiative is intelligent transportation: adding infrastructure to roads that gathers real-time data and assists drivers.

Vehicle counting has existed for years, but now much more information can be extracted from cameras and other sensors. Vehicle types can be identified, as well as pedestrians and bicycles. Vehicle spacing can be monitored and lane changes can be observed. Emergency vehicles can be notified and guided to incidents. 
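A sketch of how such statistics might be tallied: assuming a list of (class label, confidence) detections from any object-detection model, a simple threshold-and-count gives per-class totals for a frame.

```python
from collections import Counter

# Hypothetical detector output for one roadside-camera frame:
# (class_label, confidence) pairs from any object-detection model.
detections = [
    ("car", 0.97), ("car", 0.91), ("truck", 0.88),
    ("bicycle", 0.74), ("pedestrian", 0.66), ("car", 0.42),
]

confident = [label for label, score in detections if score >= 0.5]
counts = Counter(confident)
print(counts)   # Counter({'car': 2, 'truck': 1, 'bicycle': 1, 'pedestrian': 1})
```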

Vehicle-to-everything (V2X) systems also come into play. V2X can help enforce vehicle speeds and spacing. Intelligent transportation may also eventually coordinate traffic on roads with other modalities such as light rail and air taxis.

EdgeCortix SAKURA PCIe Dev Card expands a host system with a powerful yet efficient edge AI chip delivering 40 TOPS

Run-time reconfigurable, high-utilization, low-latency IP

  • MERA Software Framework for AI takes models from standard machine learning platforms including PyTorch, TensorFlow, and ONNX; a minimal model-export sketch follows this list.
  • DNA IP empowers auto makers and equipment OEMs to create custom SoCs for sensing systems with scalable, flexible, power-efficient AI inference.
  • The combination of MERA and DNA IP offers run-time reconfigurability for many AI models, including ones not yet devised, with 80%+ hardware utilization and extremely low inference latency.
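MERA's own API is not reproduced here; the sketch below shows only the standard framework-side step of exporting a trained PyTorch model to ONNX, one of the interchange formats listed above, using a stock ResNet-18 and an assumed input size as placeholders.

```python
import torch
import torchvision

# Export a trained PyTorch model to ONNX, one of the interchange formats a
# downstream compiler such as MERA accepts (the MERA-specific step is not shown).
model = torchvision.models.resnet18(weights=None).eval()
dummy_input = torch.rand(1, 3, 224, 224)      # assumed camera-crop input size

torch.onnx.export(
    model,
    dummy_input,
    "resnet18_sensing.onnx",
    input_names=["image"],
    output_names=["logits"],
    opset_version=17,
)
```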

Featured Resources

  • Connecting Edge AI Software (blog): Edge AI software workflows can start with PyTorch, TensorFlow, or ONNX models, and MERA automatically converts them to code for EdgeCortix DNA IP. Data scientists can get their applications into edge devices with MERA and DNA IP.
  • Software, the Elephant in the Room for Edge-AI Hardware Acceleration (AI Hardware Summit presentation).
  • SAKURA-I Debuts for Edge AI (Linley Research report): A detailed overview of the novel SAKURA-I SoC for low-latency edge inference.
  • SAKURA-I Edge AI Co-Processor: ASIC for fast, efficient AI inference acceleration.
  • MERA Software Framework for AI: Modeling and co-processor core management.
  • DNA IP Inference Processing Core: Run-time reconfigurable neural network IP.

Ready to do more with AI inference for automotive sensing?