AI Inference for Robotics and Drones
Efficient yet powerful AI inference increases flexibility while reducing payload size
Providing “eyes” around any scene
Embedded vision in robotics and drones extends perception to more places, often ones difficult for humans to reach easily. Smaller vehicles can leverage powerful AI inference processing that takes less size, weight, and power, translating into improved range or more time on scene. Advanced inference techniques also help robots reduce time-to-productivity with less programming burden.
Structural integrity is necessary for mission-critical infrastructure to remain safely in service. Routine inspections can be time-consuming and hazardous for the people tasked with them. Emergency inspections are even more urgent and often more dangerous.
Uncrewed drones can quickly perform inspections at high sensor resolution with little risk for human operators. Drones can often reach inside and around structures, providing new perspectives such as views of pipelines or storage tank interiors.
More AI processing in a drone can automate inspection patterns and spot faults in infrastructure without a persistent cloud connection or complex programming. It also allows enhanced perception using higher-resolution sensors and additional sensing modalities. Reduced power consumption means drones go farther and dwell longer.
When monitoring comes to mind, people often think of security, detecting intruders in and around a facility. While surveillance is an important application for drones, it’s not the only one. Activity monitoring can have substantial productivity impacts.
Drones flying repeatedly over a facility can see changes occurring between missions. For example, drones can verify that items are delivered to the correct locations for use within a construction site, or a supplier can use a drone in their yard to view and verify inventory without sending a person out to look. Large-scale operations such as farms, oil fields, and mines can use drones to see where work needs to happen and where workers are.
Drone-based AI can help detect unsafe conditions such as hard hat non-compliance or highlight changes such as movements in vehicles or materials. It can also count how many people or vehicles are in an area, confirming work is in progress.
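As a concrete illustration, the sketch below counts people and vehicles in a single camera frame using an off-the-shelf detector. The torchvision Faster R-CNN model, the COCO class IDs, and the 0.5 score threshold are illustrative assumptions, not a description of any specific drone payload or EdgeCortix software stack.

```python
# Sketch: counting people and vehicles in one drone frame with a generic
# off-the-shelf detector. Model choice and thresholds are assumptions.
import torch
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn,
    FasterRCNN_ResNet50_FPN_Weights,
)

COCO_PERSON, COCO_CAR, COCO_TRUCK = 1, 3, 8  # COCO category IDs

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
preprocess = weights.transforms()

def count_people_and_vehicles(frame, score_threshold=0.5):
    """Return (people, vehicles) counts for one RGB frame (H x W x 3, uint8)."""
    batch = [preprocess(torch.from_numpy(frame).permute(2, 0, 1))]
    with torch.no_grad():
        detections = model(batch)[0]
    keep = detections["scores"] >= score_threshold
    labels = detections["labels"][keep].tolist()
    people = sum(1 for label in labels if label == COCO_PERSON)
    vehicles = sum(1 for label in labels if label in (COCO_CAR, COCO_TRUCK))
    return people, vehicles
```

The same counting logic applies regardless of which detection model the on-board accelerator actually runs; only the model and its class mapping change.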
In an emergency such as a natural disaster, structure fire or collapse, or a major hazardous materials spill, time is critical. Conditions may be unsafe for any victims, responders, or nearby populations. Rapid assessment can save lives.
Whether airborne, ground-based, or seaborne, robotic vehicles can venture into situations without placing first responders at greater risk. Depending on the sensor payload, they may be able to see through obscurants that would thwart human perception. They might even be able to get close enough to establish remote communication with a trapped person. Precise locations and visual triage can help prioritize and route assistance.
With AI, robots gain more capability without difficult programming. Image segmentation, object recognition, and tracking all improve. Enhanced simultaneous localization and mapping (SLAM) helps chart movements accurately.
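To make the recognition-and-tracking point concrete, here is a minimal tracking-by-detection sketch that greedily matches each frame's detections to existing tracks by intersection-over-union. The Track structure and the 0.3 IoU gate are illustrative assumptions, not part of any particular robotics framework or product API.

```python
# Sketch: lightweight tracking-by-detection via greedy IoU matching.
from dataclasses import dataclass, field
from itertools import count

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

_ids = count(1)

@dataclass
class Track:
    box: tuple                                   # last known (x1, y1, x2, y2)
    track_id: int = field(default_factory=lambda: next(_ids))

def update_tracks(tracks, detections, iou_gate=0.3):
    """Greedily match new detections to existing tracks; spawn tracks for the rest."""
    unmatched = list(detections)
    for track in tracks:
        if not unmatched:
            break
        best = max(unmatched, key=lambda d: iou(track.box, d))
        if iou(track.box, best) >= iou_gate:
            track.box = best
            unmatched.remove(best)
    tracks.extend(Track(box=d) for d in unmatched)
    return tracks
```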

An edge AI chip minimizing SWaP
Using the EdgeCortix SAKURA-I with EdgeCortix Dynamic Neural Accelerator IP (DNA IP) inside, or a custom edge AI chip designed with DNA IP, robotics and drone designers get these advantages:
- Efficiency: SAKURA-I provides up to 16x the inferences/sec/W of GPU-based architectures without higher power supply and cooling requirements (illustrated in the sketch after this list).
- Run-time reconfigurability: DNA IP is not optimized for a specific AI inference model but is designed to balance flexibility with performance.
- High utilization: DNA IP achieves its performance partly through excellent hardware utilization, sustaining 80% or more even when running concurrent neural network models.
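As a rough back-of-envelope sketch, the snippet below shows what the up-to-16x inferences/sec/W figure could mean at a small-drone power budget. The 10 W budget and the 50 inferences/sec/W GPU baseline are assumed numbers chosen only for illustration, not published benchmarks.

```python
# Back-of-envelope sketch of the efficiency claim at a fixed power budget.
GPU_EFFICIENCY = 50.0                    # inferences/sec/W, assumed GPU baseline
EDGE_EFFICIENCY = 16 * GPU_EFFICIENCY    # per the up-to-16x figure in the text
POWER_BUDGET_W = 10.0                    # assumed power available for inference on a small drone

print(f"GPU-class throughput at {POWER_BUDGET_W:.0f} W: "
      f"{GPU_EFFICIENCY * POWER_BUDGET_W:.0f} inferences/sec")
print(f"Edge-accelerator throughput at {POWER_BUDGET_W:.0f} W: "
      f"{EDGE_EFFICIENCY * POWER_BUDGET_W:.0f} inferences/sec")
```

At the same power budget, the higher efficiency can be spent on more frames per second, higher-resolution sensors, or additional concurrent models rather than on bigger batteries and cooling.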