Kinara and NXP Collaborate to Provide Customers with Scalable AI Solutions Optimized for Deep Learning at the Edge
[September 13, 2022]

Kinara, a developer of AI processors for edge computing applications, today announced its collaboration with NXP Semiconductors, the world leader in secure connectivity solutions for embedded applications. Through this collaboration, customers of NXP Semiconductors' AI-enabled product portfolio will have the option to further scale their AI acceleration by using the Kinara Ara-1 Edge AI processor for high-performance inferencing with deep learning models. Working together, the two companies have tightly integrated the computer vision capabilities of the NXP i.MX applications processors with the performance- and power-optimized inferencing of the Kinara Ara-1 AI processor to deliver computer vision analytics for a range of applications, including smart retail, smart city, and industrial.

This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20220913005255/en/

Kinara and NXP Collaborate to Provide Customers with Scalable AI Solutions Optimized for Deep Learning at the Edge (Photo: Business Wire)

Kinara's patented Edge AI processor, named Ara-1, delivers a ground-breaking combination of performance, power, and price for integrated cameras and edge servers. Kinara complements its processing technology with a comprehensive and robust set of development tools that allow its customers to easily convert their neural network models into highly optimized computation flows ready to be deployed on the Ara-1 chip.



"Intelligent vision processing is an exploding market that is a natural fit for machine learning. But vision systems are getting increasingly complex, with more and larger sensors, and model sizes are growing. To keep pace with these trends requires dedicated AI accelerators that can handle the processing load efficiently - both in power and silicon area," said Kevin Krewell, principal analyst at TIRIAS Research. "The best modular approach to vision systems is a combination of an established embedded processor and a power-efficient AI accelerator, like the combination of NXP's i.MX family of embedded applications processors and the Kinara AI accelerator."

NXP's AI processing solutions encompass its microcontrollers (MCUs), i.MX RT series of crossover MCUs, and i.MX applications processor families, which represent a variety of multicore solutions for multimedia and display applications. NXP's portfolio covers a very large portion of AI processing needs natively. For use cases that require even higher AI performance, driven by increases in frame rates, image resolution, and the number of sensors, NXP processors can be integrated with Kinara's Ara-1 to deliver a scalable, system-level solution: customers can scale up and partition the AI workload between the NXP device and the Ara-1 while keeping common application software running on the NXP processors.


"Our processing solutions and AI software stacks enable a very wide range of AI performance requirements - this is a necessity given our extremely broad customer base," said Joe Yu, Vice President and General Manager, IoT Edge Processing, NXP Semiconductors. "By working with Kinara to help satisfy our customer's requirements at the highest end of edge AI processing, we will bring high performance AI to smart retail, smart city, and industrial markets."

"We see two general trends with our Edge AI customers. One trend is a shift towards a Kinara solution that significantly reduces the cost and energy of their current platforms that use a traditional GPU for AI acceleration. The other trend calls for replacing Edge AI accelerators from well-known brands with Kinara's Ara-1 allowing the customer to achieve at least a 4x performance improvement at the same or better price," said Ravi Annavajjhala, CEO, Kinara. "Our collaboration with NXP will allow us to offer very compelling system-level solutions that include commercial-grade Linux and driver support that complements the end-to-end inference pipeline."

Access a new White Paper outlining how the Kinara and NXP collaboration can help boost the AI performance of embedded platforms here.

About Kinara

Kinara is deeply committed to designing and building the world's most power- and price-efficient edge AI inference platform, supported by comprehensive AI software development tools. Designed to enable smart applications across retail, medical, Industry 4.0, automotive, smart cities, and much more, Kinara's AI processors, modules, and software can be found at the heart of the AI industry's most exciting and influential innovations. Led by Silicon Valley veterans and a world-class development team in India, Kinara envisions a world of exceptional customer experiences, better manufacturing efficiency, and greater safety for all. Kinara is a member of the NXP Partner Program. Learn more at www.kinara.ai.

All registered trademarks and other trademarks belong to their respective owners.
