Syntiant Introduces Production-Ready Edge AI Software Solutions for Image Detection, Tracking and Classification

IRVINE, Calif., Jan. 05, 2023 (GLOBE NEWSWIRE) -- Syntiant Corp., a leader in delivering end-to-end edge AI solutions for always-on applications, today announced the availability of a full suite of pre-trained and customizable deep learning models for computer vision applications.

Capable of running on most hardware platforms, including CPUs, GPUs, DSPs, FPGAs and ASICs, Syntiant’s edge AI algorithms are being deployed in security and IP cameras, 360/VR cameras, video doorbells, video conferencing systems and other use cases.

“We are building upon our leadership position in voice and audio by offering scores of off-the-shelf machine learning models for edge-based image and vision applications,” said Kurt Busch, CEO of Syntiant. “These hardware-agnostic models can be easily customized and work on a wide range of SoCs, including our own NDP200, which brings together the best of both worlds into a powerful, compact, highly efficient turnkey solution.”

Syntiant’s turnkey solution provides the data, tools and training for quick and easy edge deployments across the following industries:

  • Smart home (intrusion detection, doorbells, face recognition)
  • Personal devices (face recognition, gesture recognition, noise suppression)
  • Automotive (vehicle identification, theft detection, driver awareness)
  • Government (national security applications for air, sea, land and space)
  • Industrial (object identification, condition-based monitoring, analytics)

“Software service revenue for edge AI is expected to grow to over $5 billion in 2027, with most of that derived from computer vision,” said Lian Jye Su, analyst at ABI Research. “New models and use cases are emerging every month. Furthermore, end users need to integrate computer vision into end devices with various form factors, processing power and battery consumption. Having developer-friendly, production-grade edge AI software allows quick onboarding and development of edge AI-based computer vision models. As a result, enterprises can overcome the lack of edge AI expertise and focus on operations.”

CES 2023

Syntiant will be demonstrating its end-to-end deep learning solutions for always-on vision, audio and sensing applications at CES 2023, including the newly introduced NDP115 Neural Decision Processor, at the Venetian Hotel (Room 108, 36th floor). Visit www.syntiant.com/ces to schedule a demo of the company’s technology being deployed in smart homes, battery management systems, teleconferencing solutions and event detection devices, among other use cases.

About Syntiant  
Founded in 2017 and headquartered in Irvine, Calif., Syntiant Corp. is a leader in delivering end-to-end deep learning solutions for edge deployment. The company’s purpose-built silicon and hardware-agnostic models are being deployed globally to power edge AI speech, audio, sensor and vision applications across a wide range of consumer and industrial use cases, from earbuds to automobiles. Syntiant’s advanced chip solutions merge deep learning with semiconductor design to produce ultra-low-power, high-performance deep neural network processors. Syntiant also provides compute-efficient software solutions, with proprietary model architectures and hardware-specific optimizations, that enable world-leading inference speed and a minimized memory footprint across a broad range of processors. The company is backed by several of the world’s leading strategic and financial investors, including Intel Capital, Microsoft’s M12, Applied Ventures, Robert Bosch Venture Capital, the Amazon Alexa Fund and Atlantic Bridge Capital. More information on the company can be found by visiting www.syntiant.com or by following Syntiant on Twitter @Syntiantcorp or LinkedIn.

Media Contact:

George Medici
PondelWilkinson
gmedici@pondel.com
310.279.5968

