AI Inference for Smart Manufacturing

Creating breakthroughs in efficiency, accuracy, and quality

Sensing and AI inference replacing tedious programming

AI has many applications in sorting through the volumes of big data coming from manufacturing floors today. At the edge, higher-speed, higher-resolution sensors, including video, reveal more about processes and product quality than ever. Instead of creating and programming complex models, manufacturers are turning to edge AI inference to quickly judge outcomes, predict movements, and meet goals.

Historically, human inspectors performed meticulous, hands-on inspection of finished goods, gauging conformance to desired quality. Not only are these techniques time-intensive, they can be mind-numbing with hour-after-hour, day-after-day repetition.

Increasing throughput requires automation of visual inspection techniques. Ideally, goods can be inspected as they flow through the production line, even at each step. Rejecting flawed goods sooner in the process reduces costs, simplifying rework and preventing accumulation of errors. High-resolution video can spot minute anomalies or flaws.

Maximizing productivity means running AI-based visual inspection at the pace of production, making a decision in the brief window as each item appears. Automation never tires and makes equally good decisions at the beginning and end of a shift.
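To make that concrete, here is a minimal sketch of a line-rate inspection loop using OpenCV and ONNX Runtime; the camera index, model file, class index, and rejection threshold are illustrative assumptions, not a specific EdgeCortix workflow.

```python
# Minimal sketch of a line-rate visual inspection loop.
# The model file, camera index, class index, and threshold are hypothetical;
# a real deployment would use the vendor's runtime and calibrated logic.
import cv2
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("defect_classifier.onnx")  # hypothetical model file
input_name = session.get_inputs()[0].name

cap = cv2.VideoCapture(0)   # camera watching the production line
DEFECT_THRESHOLD = 0.8      # assumed probability cutoff for rejecting a part

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Resize and normalize the frame to the model's expected NCHW input.
    blob = cv2.resize(frame, (224, 224)).astype(np.float32) / 255.0
    blob = np.transpose(blob, (2, 0, 1))[np.newaxis, ...]
    (scores,) = session.run(None, {input_name: blob})  # assumes a single output
    defect_prob = float(scores[0][1])                  # assumes index 1 = "defective"
    if defect_prob > DEFECT_THRESHOLD:
        print("Reject part: defect probability", defect_prob)
```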

Robotics programming is an art form. Conventional industrial automation robots with six degrees of freedom have their own programming languages and syntax. There are also many rules, particularly around safety of movement, to prevent damage or harm.

Programming involves a lot of trial and error. Successful configuration requires understanding both the process steps and the manipulation capabilities of the robot. Robotic programming is only as good as the programmer's ability to foresee and code for unexpected situations.

One technique gaining momentum is human-guided learning, where a human demonstrates the motions for the robot. Also in play are unsupervised learning techniques such as generative adversarial networks (GANs). These techniques focus on reducing the time and effort needed to bring a robot up to speed.

In a modern manufacturing environment, robots generally do not work alone. Multiple robots engage in a sequence of activities, and sometimes several robots work in coordination on a single step of the process. Robots may also be involved in fetching raw materials and shuttling them to the production line, as in an automated warehouse.

How robots choose their path of motion can make a huge difference in throughput and efficiency. It’s not unlike the requirements for self-driving cars, albeit on a closed circuit under much more controlled conditions. Robots must understand a map from start to destination, and navigate successfully while avoiding collisions.

AI inference enables the perception, prediction, and planning phases needed to sense objects and make motion decisions, and it can help robots learn the most efficient paths.
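As a simplified illustration of the planning phase, the sketch below runs A* search over a hypothetical occupancy grid of the factory floor; real systems pair a learned perception and prediction front end with far more sophisticated planners.

```python
# Minimal sketch of the "planning" phase: A* search over a 2D occupancy grid
# of the factory floor. The grid, start, and goal cells are hypothetical.
import heapq

def astar(grid, start, goal):
    """Return a collision-free path of grid cells from start to goal, or None."""
    def h(cell):
        # Manhattan distance heuristic to the goal.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    frontier = [(h(start), 0, start, [start])]  # (f, g, cell, path)
    visited = set()
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in visited:
            continue
        visited.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                step = (nr, nc)
                heapq.heappush(
                    frontier, (cost + 1 + h(step), cost + 1, step, path + [step])
                )
    return None

# 0 = free floor space, 1 = obstacle (e.g. another machine or robot cell)
floor = [
    [0, 0, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]
print(astar(floor, (0, 0), (2, 3)))
```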

EdgeCortix SAKURA PCIe Dev Card expands a host system with a powerful yet efficient edge AI chip delivering 40 TOPS

Powerful AI inference IP enables more sophisticated models

  • SAKURA-I ASIC delivers 40 TOPS in 10W of power, and up to five chips can be tied together via PCIe for a combined 200 TOPS.
  • The same DNA IP in the SAKURA-I ASIC can be configured into fast FPGA AI accelerator cards, adding reconfigurability to high performance.
  • MERA Software Framework for AI takes models from standard machine learning platforms including PyTorch, TensorFlow, and ONNX (see the export sketch after this list).
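To show the hand-off point, here is a minimal sketch assuming a standard PyTorch-to-ONNX export as the input format MERA consumes; the model choice and file name are placeholders, and the MERA compilation step itself is not shown.

```python
# Minimal sketch of preparing a model for a compiler flow such as MERA:
# export a trained PyTorch network to ONNX, one of the supported input formats.
# (Model choice and file name are illustrative; the MERA compile step itself
# is vendor-specific and not shown here.)
import torch
import torchvision

model = torchvision.models.resnet18(weights=None)  # placeholder for a trained model
model.eval()

dummy_input = torch.randn(1, 3, 224, 224)  # example NCHW input tensor
torch.onnx.export(
    model,
    dummy_input,
    "resnet18_for_mera.onnx",  # hypothetical file consumed by the downstream flow
    input_names=["input"],
    output_names=["logits"],
    opset_version=13,
)
```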

Featured Resources

Edge AI software workflows can start with PyTorch, TensorFlow, or ONNX models, and MERA automatically converts them to code for EdgeCortix DNA IP
Connecting Edge AI Software
Data scientists can get their applications into edge devices with MERA and DNA IP
Read the Blog
AI Hardware Summit
Software, the Elephant in the Room for Edge-AI Hardware Acceleration
Watch Now
SAKURA-I Debuts for Edge AI
A detailed overview of the novel SAKURA-I SoC for low-latency edge inference
Get the Linley Report
SAKURA
SAKURA-I Edge AI Co-Processor
ASIC for fast, efficient AI inference acceleration
Learn More
MERA
MERA Software Framework for AI
Modeling and co-processor core management
Learn More
DNA IP
DNA IP Inference Processing Core
Run-time reconfigurable neural network IP
Learn More

Ready to do more with AI inference for smart manufacturing?