The edge AI inference products and technology developers have been waiting for

When developers need AI at the edge, they turn to EdgeCortix for AI inference products that fit their workflow and achieve challenging design goals.

Supported Frameworks & Applications

Bringing everyone on the design team into the workflow

EdgeCortix products and technology work together for maximum productivity.

Application Software Developers
Full-stack from AI frameworks to deployment.
Learn More
FPGA or Custom SoC Developers
Reconfigurable IP, reference ports, dev cards.
Learn More
Board and System Designers
Off-the-shelf ASIC with PCIe x16 interface.
Learn More

EdgeCortix® MERA

EdgeCortix® MERA is a compiler and software toolkit providing all the tools, APIs, code generator, and runtime needed to deploy a pre-trained deep neural network. MERA offers software developers and data scientists familiar workflows, with native support for PyTorch, TensorFlow, TensorFlow Lite, and ONNX. MERA is the companion to the Dynamic Neural Accelerator™ IP (DNA IP) and provides the entire software stack for developing edge AI inference applications, from modeling to deployment.

Dynamic Neural Accelerator™ IP

EdgeCortix Dynamic Neural Accelerator™ (DNA) is a flexible, modular neural accelerator IP core with run-time reconfigurable interconnects between compute units. Using a patented approach, EdgeCortix reconfigures the data paths between DNA IP execution units in real time, achieving exceptional parallelism through dynamic grouping of compute units while reducing on-chip memory bandwidth requirements, allowing faster, more efficient hardware execution. DNA IP works in conjunction with the MERA software stack to optimize computation order and resource allocation when scheduling neural network tasks.

Dynamic Neural Accelerator IP: Efficient, modular, scalable, and fully configurable hardware IP for FPGAs or SoCs.
EdgeCortix SAKURA: Deep learning accelerator ASIC for AI inference co-processing in boards and systems.

EdgeCortix SAKURA™-I

EdgeCortix SAKURA™-I is an advanced edge AI co-processor (ASSP) whose high-performance AI inference engine implements DNA IP to achieve up to 40 TOPS (dense). The SAKURA-I core is run-time reconfigurable through the MERA compiler and software framework, enabling multiple deep neural network models to run concurrently while maintaining critical performance characteristics. SAKURA-I is designed for applications requiring fast, real-time (batch size = 1) AI inference on streaming data, and delivers excellent AI inference performance in a small-footprint, low-power silicon device.

SAKURA-I PCIe Low Profile Development Card

EdgeCortix SAKURA-I is available on a PCIe Low Profile development card, ready to drop into a host for software development and AI model inference tasks.

The BittWare IA-420 with an Intel Agilex FPGA hosts a reference port of EdgeCortix DNA IP for edge AI developers

Inference Pack

BittWare, an EdgeCortix partner, offers kits implementing the EdgeCortix Dynamic Neural Accelerator (DNA) on Intel® Agilex™ FPGAs, using the BittWare IA-840F and IA-420F FPGA card platforms. DNA bitstreams for Agilex deliver significantly lower inference latency on streaming data, a 2X to 6X performance advantage over competing FPGAs, and better overall power efficiency than general-purpose processors.


Business Overview:

Delivering Energy-Efficient, Edge-Based Artificial Intelligence (AI) Acceleration


"Given the tectonic shift in information processing at the edge, companies are now seeking near cloud level performance where data curation and AI driven decision making can happen together. Due to this shift, the market opportunity for the EdgeCortix solutions set is massive, driven by the practical business need across multiple sectors which require both low power and cost-efficient intelligent solutions. Given the exponential global growth in both data and devices, I am eager to support EdgeCortix in their endeavor to transform the edge AI market with an industry-leading IP portfolio that can deliver performance with orders of magnitude better energy efficiency and a lower total cost of ownership than existing solutions."

Akira Takata
Former CEO of MegaChips Corporation

"Improving the performance and the energy efficiency of our network infrastructure is a major challenge for the future. Our expectation of EdgeCortix is to be a partner who can provide both the IP and expertise that is needed to tackle these challenges simultaneously."

Ryuji Wakikawa
Head, Research Institute of Advanced Technology at SoftBank Corp

"With the unprecedented growth of AI/Machine learning workloads across industries, the solution we're delivering with leading IP provider EdgeCortix complements BittWare's Intel Agilex FPGA-based product portfolio. Our customers have been searching for this level of AI inferencing solution to increase performance while lowering risk and cost across a multitude of business needs both today and in the future."

Craig Petrie
VP, Sales and Marketing at BittWare

"EdgeCortix is in a truly unique market position. Beyond simply taking advantage of the massive need and growth opportunity in leveraging AI across many key business sectors, it's their go-to-market strategy for developing solutions that will be the great differentiator. In my experience, most technology companies focus very myopically on delivering great code or perhaps semiconductor design. EdgeCortix's secret sauce is in how they've co-developed their IP, applying equal importance to both the software IP and the chip design, creating a symbiotic software-centric hardware ecosystem. This sets EdgeCortix apart in the marketplace."

President & CEO of Trust Capital Co., Ltd., member of the Executive Committee of Silicon Valley Japan Platform

"We recognized immediately the value of adding the MERA compiler and associated tool set to the RZ/V MPU series, as we expect many of our customers to implement application software including AI technology. As we drive innovation to meet our customers' needs, we are collaborating with EdgeCortix to rapidly provide our customers with robust, high-performance and flexible AI-inference solutions. The EdgeCortix team has been terrific, and we are excited by the future opportunities and possibilities for this ongoing relationship."

Shigeki Kato
Vice President, Enterprise Infrastructure Business Division

Learn more about EdgeCortix solutions.