Breaking the limits in AI processors and edge AI inference acceleration

Where AI inference acceleration needs it all – more TOPS, lower latency, better area and power efficiency, and scalability – EdgeCortix AI processor cores make it happen.

Embedded World Exhibition & Conference

Join EdgeCortix at Embedded World 2024

April 9th - 11th in Nuremberg, Germany

A software-first approach to edge AI processing

General-purpose processing cores - CPUs and GPUs - give developers flexibility for most applications, but these general-purpose cores are a poor match for the workloads found in deep neural networks. EdgeCortix began with a mission in mind: redefining edge AI processing from the ground up.

EdgeCortix technology spans a full-stack AI inference software development environment, run-time reconfigurable edge AI inference IP, and edge AI chips for boards and systems, letting designers deploy near cloud-level AI performance at the edge. Think about what that can do for these and other applications.

  • Defense: Finding threats, raising situational awareness, and making vehicles smarter
  • Robotics & Drones: Improving flexibility, reducing time-to-productivity, and simplifying programming
  • Smart Manufacturing: Creating breakthroughs in configurability, efficiency, accuracy, and quality
  • Smart Cities: Keeping people and vehicles flowing, saving energy, enhancing safety and security
  • Automotive Sensing: Helping drivers see their surroundings, avoid hazards, and ease into self-driving vehicles

It’s time for better edge AI hardware, IP, and software technology


EdgeCortix MERA: Software framework for modeling and hardware compilation

For full-stack AI inference applications, the MERA compiler and software framework translates AI models into code for an edge AI co-processor and a host CPU; a minimal model-preparation sketch follows the list below.

  • Native support for PyTorch, TensorFlow, TensorFlow Lite, and ONNX
  • INT8 quantization of user-defined and community AI inference models
  • Pre-trained models for segmentation, detection, point-cloud processing, and more
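
The page does not show MERA's own API, so the sketch below only illustrates the kind of model hand-off the flow implies: exporting a PyTorch model to ONNX (one of the natively supported formats) and applying standard INT8 post-training quantization. The model choice, file name, and quantization scope are placeholders of mine, not MERA requirements, and the MERA-specific compile step itself is intentionally omitted.

    # Minimal sketch of preparing a model for an INT8 edge-compiler flow.
    # Uses only standard PyTorch / torchvision / ONNX-export APIs.
    import torch
    import torchvision

    # Placeholder network; any supported detection/segmentation/classification
    # model would follow the same path.
    model = torchvision.models.resnet50(weights="IMAGENET1K_V2").eval()

    # Export to ONNX, one of the natively supported input formats.
    dummy = torch.randn(1, 3, 224, 224)
    torch.onnx.export(model, dummy, "resnet50.onnx", opset_version=13,
                      input_names=["input"], output_names=["logits"])

    # Illustrative INT8 post-training quantization (dynamic, linear layers only);
    # an actual deployment would use the quantization path the toolchain provides.
    quantized = torch.ao.quantization.quantize_dynamic(
        model, {torch.nn.Linear}, dtype=torch.qint8)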

Dynamic Neural Accelerator: Run-time reconfigurable neural network IP for AI processors

Modular and fully run-time reconfigurable, the Dynamic Neural Accelerator is an AI processor core for edge inference acceleration; a back-of-envelope throughput sketch follows the list below.

  • 16x the inferences per second per watt of conventional GPU-based hardware
  • Scales from 1,024 to 32,768 MACs across three types of math units
  • Dynamic grouping of compute units sustains over 80% utilization across workloads
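
To put the MAC-count range in context, here is a rough throughput estimate. It uses the common 2-ops-per-MAC convention and a 1 GHz clock purely as a placeholder; neither the clock nor the resulting TOPS values are published DNA figures, and the 80% factor is simply the utilization bullet above applied arithmetically.

    # Peak vs. effective throughput for a MAC array (rule-of-thumb only).
    # 1 MAC = 2 ops (multiply + accumulate); the 1 GHz clock is a placeholder.
    def peak_tops(macs: int, clock_hz: float) -> float:
        return 2 * macs * clock_hz / 1e12

    for macs in (1024, 32768):                # configuration range from the bullets
        peak = peak_tops(macs, 1.0e9)
        effective = 0.80 * peak               # >80% utilization claimed above
        print(f"{macs:>6} MACs: peak {peak:5.1f} TOPS, effective > {effective:5.1f} TOPS")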

EdgeCortix SAKURA-I: ASIC for fast, efficient AI inference acceleration in boards and systems

The SAKURA-I edge AI co-processor is a high-performance AI inference engine that connects easily to a host system; a quick scaling sketch follows the list below.

  • Built in TSMC 12nm FinFET; 40 TOPS (dense) at 800 MHz with a 10 W TDP
  • Extended life-cycle availability required by defense and industrial programs
  • PCIe Gen 3 interface; up to five devices can be combined for 200 TOPS
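
The multi-device figure in the last bullet is linear scaling of the per-device numbers, as the short sketch below spells out. The linear power estimate is my own assumption and ignores host, PCIe, and memory overhead.

    # Linear scaling of the quoted per-device figures: 40 dense TOPS and a
    # 10 W TDP per SAKURA-I card, up to five cards over PCIe Gen 3.
    PER_DEVICE_TOPS = 40
    PER_DEVICE_TDP_W = 10

    for n in range(1, 6):
        print(f"{n} device(s): {n * PER_DEVICE_TOPS:3d} dense TOPS, "
              f"~{n * PER_DEVICE_TDP_W:2d} W accelerator TDP")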

SAKURA-I PCIe Low Profile Development Card

EdgeCortix SAKURA-I is available on a PCIe Low Profile development card, ready to drop into a host for software development and AI model inference tasks.


"Given the tectonic shift in information processing at the edge, companies are now seeking near cloud-level performance where data curation and AI-driven decision making can happen together. Due to this shift, the market opportunity for the EdgeCortix solution set is massive, driven by the practical business need across multiple sectors for intelligent solutions that are both low power and cost-efficient. Given the exponential global growth in both data and devices, I am eager to support EdgeCortix in their endeavor to transform the edge AI market with an industry-leading IP portfolio that can deliver performance with orders of magnitude better energy efficiency and a lower total cost of ownership than existing solutions."

Akira Takata
Former CEO of MegaChips Corporation

"Improving the performance and the energy efficiency of our network infrastructure is a major challenge for the future. Our expectation of EdgeCortix is to be a partner who can provide both the IP and the expertise needed to tackle these challenges simultaneously."

Ryuji Wakikawa
Head, Research Institute of Advanced Technology at SoftBank Corp

"With the unprecedented growth of AI/machine learning workloads across industries, the solution we're delivering with leading IP provider EdgeCortix complements BittWare's Intel Agilex FPGA-based product portfolio. Our customers have been searching for this level of AI inferencing solution to increase performance while lowering risk and cost across a multitude of business needs both today and in the future."

Craig Petrie
VP, Sales and Marketing at BittWare

"EdgeCortix is in a truly unique market position. Beyond simply taking advantage of the massive need and growth opportunity in leveraging AI across many key business sectors, it's the business strategy behind how they develop their solutions for their go-to-market that will be the great differentiator. In my experience, most technology companies focus very myopically on delivering great code or perhaps semiconductor design. EdgeCortix's secret sauce is in how they've co-developed their IP, applying equal importance to both the software IP and the chip design and creating a symbiotic software-centric hardware ecosystem; this sets EdgeCortix apart in the marketplace."

Daniel Fujii
President & CEO of Trust Capital Co., Ltd., member of the Executive Committee of Silicon Valley Japan Platform

"We recognized immediately the value of adding the MERA compiler and associated tool set to the RZ/V MPU series, as we expect many of our customers to implement application software including AI technology. As we drive innovation to meet our customers' needs, we are collaborating with EdgeCortix to rapidly provide our customers with robust, high-performance and flexible AI-inference solutions. The EdgeCortix team has been terrific, and we are excited by the future opportunities and possibilities for this ongoing relationship."

Shigeki Kato
Vice President, Enterprise Infrastructure Business Division, Renesas Electronics Corporation