AI Inference for Defense and Security

Edge AI processing in ASIC, FPGA, or custom SoC form factors

Improved object recognition for situational awareness

High-resolution real-time imagery is transforming defense and aerospace capability. The next step is automating how objects in those images are recognized and tracked in real time, an excellent fit for AI inference. Localized decision-making, without a cloud connection, speeds up threat recognition and response. Scalable edge AI processing performance that fits size, weight, and power (SWaP) constraints is essential.

SAKURA-I PCIe Low Profile Development Card

EdgeCortix SAKURA-I is available on a PCIe Low Profile development card, ready to drop into a host for software development and AI model inference tasks.


Many jobs in the air, on the ground, underwater, or on the surface call for smart unmanned vehicles, giving warfighters ways to project capability without risk to human lives. These vehicles are also taking on more complex missions requiring more intelligence.

Sensor resolution continues to grow. More pixels in a scene mean more pixels must be processed to find objects. However, the available window of time to find those objects is usually short. Sending images over a wireless network for processing increases delays and steals network bandwidth needed for other communication.
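
To make that time pressure concrete, here is a rough back-of-envelope calculation. The sensor numbers are illustrative assumptions, not a specification:

```python
# Toy back-of-envelope: per-frame processing budget at sensor resolution.
# Assumed numbers (illustrative only): a 4K sensor at 30 frames per second.
WIDTH, HEIGHT, FPS = 3840, 2160, 30

pixels_per_frame = WIDTH * HEIGHT           # 8,294,400 pixels per frame
pixels_per_second = pixels_per_frame * FPS  # ~249 million pixels per second
frame_budget_ms = 1000 / FPS                # ~33 ms to find objects in each frame

print(pixels_per_frame, pixels_per_second, round(frame_budget_ms, 1))
```

Every object in every frame must be found inside that budget; shipping frames off-platform for processing eats into it before inference even starts.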

Improved edge AI processing helps vehicles identify threats faster and make basic decisions about next steps in their mission. Efficient processing in an ASIC lowers power consumption, improving range and loiter time.

AI inference can also augment human operators gathering and processing intelligence in shipboard, airborne, and ground shelter-based applications. Using AI to sift through volumes of incoming data, highlighting areas that need further attention, decreases operator stress and increases productivity.

AI can efficiently recognize patterns in one-dimensional (such as SIGINT) or two-dimensional (such as video reconnaissance imagery) data streams. As sample or frame rates increase, determinism and latency become critical parameters in analysis.
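
As a deliberately simple illustration of the one-dimensional case, the sketch below matches a known signature against a sample stream using a sliding window and normalized cross-correlation. A fielded system would use a trained neural network rather than a fixed template; this only shows the shape of the problem:

```python
# Toy 1-D pattern matching: slide a known signature over a sample stream
# and score each offset by normalized cross-correlation. Illustrative only;
# a real SIGINT pipeline would use a trained network, not a template.
import math

def correlate(stream, template):
    """Return the best-matching offset and its correlation score."""
    n = len(template)
    t_mean = sum(template) / n
    t_centered = [t - t_mean for t in template]
    t_norm = math.sqrt(sum(t * t for t in t_centered))
    best_offset, best_score = -1, float("-inf")
    for i in range(len(stream) - n + 1):
        window = stream[i:i + n]
        w_mean = sum(window) / n
        w_centered = [w - w_mean for w in window]
        w_norm = math.sqrt(sum(w * w for w in w_centered))
        if w_norm == 0:
            continue  # flat window carries no signature
        score = sum(a * b for a, b in zip(w_centered, t_centered)) / (w_norm * t_norm)
        if score > best_score:
            best_offset, best_score = i, score
    return best_offset, best_score

# A burst signature buried in an otherwise flat stream:
signature = [0.0, 1.0, 0.0, -1.0, 0.0]
stream = [0.1] * 10 + [0.1, 1.1, 0.1, -0.9, 0.1] + [0.1] * 10
offset, score = correlate(stream, signature)
print(offset, round(score, 3))
```

Note that the work per sample grows with both the stream rate and the signature length, which is exactly why determinism and latency dominate as sample rates climb.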

With threats constantly evolving, reconfigurable AI inference platforms are a necessity. Hardware-optimized solutions running specific neural network models lose flexibility over a program life cycle. FPGA-based solutions offer high-performance, reconfigurable inference without the SWaP and programming burden of GPU-based solutions. 

Much of modern warfare now occurs in cyberspace. Systems are updated often, and threats are keeping pace, morphing to stay ahead of defenses. The higher value information has, the more intensely attackers try to expose or corrupt it.

Cybersecurity teams are using AI inference to spot probing patterns before full-scale attacks develop. Once an attack begins, AI can guide system and network reconfiguration, mitigating damage faster than human personnel can respond. AI also helps sort through volumes of data for post-incident analysis, highlighting areas for experts to scrutinize.
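
As a toy illustration of probe spotting (a statistical stand-in, not the neural-network approach a production system would use), the sketch below flags an interval whose connection count jumps far above its recent baseline:

```python
# Toy network-probe detector: flag an interval whose connection-attempt
# count deviates sharply from the recent baseline (z-score test).
# Illustrative only; production systems learn over many features.
from statistics import mean, stdev

def flag_anomalies(counts, window=8, threshold=3.0):
    """Return indices whose count is more than `threshold` standard
    deviations above the mean of the preceding `window` intervals."""
    flagged = []
    for i in range(window, len(counts)):
        baseline = counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (counts[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Steady background traffic, then a sudden scan burst at interval 12:
traffic = [20, 22, 19, 21, 20, 23, 18, 21, 20, 22, 19, 21, 95, 22, 20]
print(flag_anomalies(traffic))
```

A single-feature threshold like this is easy to evade; the point of inference-based detection is to catch subtler, multi-dimensional patterns the same way, at line rate.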

Ultimately, no system is inherently secure, no matter how carefully designed or maintained. Security comes down to faster threat detection and response. AI inference will be part of every modern cybersecurity system because it offers capability to detect even obscure patterns that would otherwise escape real-time human observation.


DSEI Japan 2023: Edge AI Reshaping the Battlefield

Dr. Tomoyuki Furutani of Keio University and Stan Crow of EdgeCortix discuss today’s geopolitical environment and how edge AI is reshaping the battlefield.

EdgeCortix SAKURA Low Profile Dev Card expands a host system with a powerful yet efficient edge AI chip delivering 40 TOPS

One scalable IP base for adding AI inference to applications

  • ASIC: SAKURA-I provides up to 16x the inference/sec/W of GPU-based architectures, without their higher power-supply and cooling requirements.
  • FPGA: The same DNA IP in the SAKURA-I ASIC can be configured into fast FPGA AI accelerator cards, adding reconfigurability to high performance.
  • Custom SoC: DNA IP can also be deployed in a purpose-built SoC, which could include features like extended temperature and radiation hardening.

Ready to do more with AI inference for defense and security?

SAKURA
SAKURA-I Edge AI Co-Processor
ASIC for fast, efficient AI inference acceleration
Learn More
MERA
MERA Software Framework for AI
Modeling and co-processor core management
Learn More
DNA IP
DNA IP Inference Processing Core
Run-time reconfigurable neural network IP
Learn More

Featured Resources

Edge AI software workflows can start with PyTorch, TensorFlow, or ONNX models, and MERA automatically converts them to code for EdgeCortix DNA IP
Connecting Edge AI Software
Data scientists can get their applications into edge devices with MERA and DNA IP
Read the Blog
AI Hardware Summit
Software, the Elephant in the Room for Edge-AI Hardware Acceleration
Watch Now
SAKURA-I Debuts for Edge AI
A detailed overview of the novel SAKURA-I SoC for low-latency edge inference
Get the Report
MegaChips
"Given the tectonic shift in information processing at the edge, companies are now seeking near cloud level performance where data curation and AI driven decision making can happen together. Due to this shift, the market opportunity for the EdgeCortix solutions set is massive, driven by the practical business need across multiple sectors which require both low power and cost-efficient intelligent solutions. Given the exponential global growth in both data and devices, I am eager to support EdgeCortix in their endeavor to transform the edge AI market with an industry-leading IP portfolio that can deliver performance with orders of magnitude better energy efficiency and a lower total cost of ownership than existing solutions."
Akira Takata
Former CEO of MegaChips Corporation
SoftBank
"Improving the performance and the energy efficiency of our network infrastructure is a major challenge for the future. Our expectation of EdgeCortix is to be a partner who can provide both the IP and expertise that is needed to tackle these challenges simultaneously."
Ryuji Wakikawa
Head, Research Institute of Advanced Technology at SoftBank Corp
BittWare
"With the unprecedented growth of AI/Machine learning workloads across industries, the solution we're delivering with leading IP provider EdgeCortix complements BittWare's Intel Agilex FPGA-based product portfolio. Our customers have been searching for this level of AI inferencing solution to increase performance while lowering risk and cost across a multitude of business needs both today and in the future."
Craig Petrie
VP, Sales and Marketing at BittWare
Trust Capital Co., Ltd.
“EdgeCortix is in a truly unique market position. Beyond simply taking advantage of the massive need and growth opportunity in leveraging AI across many key business sectors, it’s the business strategy with respect to how they develop their solutions for their go-to-market that will be the great differentiator. In my experience, most technology companies focus very myopically on delivering great code or perhaps semiconductor design. EdgeCortix’s secret sauce is in how they’ve co-developed their IP, applying equal importance to both the software IP and the chip design, creating a symbiotic software-centric hardware ecosystem. This sets EdgeCortix apart in the marketplace.”
Daniel Fujii
President & CEO of Trust Capital Co., Ltd., member of the Executive Committee of Silicon Valley Japan Platform