▲2024 Start On-Device AI Conference presentation venue
2024 Start On-Device AI Conference Concludes
Development cases shared by ST, Gamba Labs, and Nota
“It is important to develop products that leverage the advantages of on-device AI”
On-device AI-based products are rapidly emerging across a wide range of fields. Against this backdrop, a venue for sharing insights was held to help hardware-centric companies implement AI functions quickly and easily, shorten product development timelines, and enter the market ahead of competitors, drawing a strong response from attendees.
On the 15th, e4ds news held the 2024 Start On-Device AI Conference at the main auditorium of the Korea Conference Center near Gangnam Station in Seoul.
Sponsored by STMicroelectronics, the conference opened with a keynote speech by Professor Kim Kyung-gi of Daegu University, followed by sessions on AI model compression and optimization solutions from Gamba Labs and Nota Co., Ltd., while STMicroelectronics and Codezoo presented MCU and embedded solutions for implementing on-device AI.
The essence of on-device AI is running simple AI functions on low-power, low-spec boards, and the market is focused on its applicability to low-power applications and the responsiveness gained from reduced latency.
Gamba Labs CEO Park Se-jin assessed that the TinyML market ecosystem is still in its very early stages and expected ample market opportunities ahead. He cited cost, performance, power consumption, size, and interoperability as the key considerations for TinyML.
On-device AI products in particular require close collaboration across hardware, embedded systems, software, and AI. They also involve optimization steps such as compressing AI models to fit on low-power, low-spec boards and embedding models tuned for heterogeneous boards, and these steps can delay time-to-market.
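As a minimal illustration of the model-compression step described above, the sketch below shows symmetric int8 post-training quantization in plain Python. The function names and weight values are purely illustrative assumptions, not part of any tool mentioned in this article; real toolchains perform this per layer with calibration data.

```python
def quantize_int8(weights):
    """Map float weights to int8 with a single symmetric scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [v * scale for v in q]

# Illustrative weights: after quantization they take 1 byte each
# instead of 4, at the cost of a small rounding error.
weights = [0.12, -0.5, 0.33, 0.9, -0.07]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
```

Shrinking weights from 32-bit floats to 8-bit integers cuts model size roughly fourfold and enables integer-only inference on MCUs, which is one reason this kind of lightweighting is central to fitting models onto low-spec boards.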
▲2024 Start On-Device AI Conference presentation in progress
To address these challenges, Gamba Labs presented the 'VIOLA Framework', an automated AI model development framework it is building, while Nota's Netspresso was introduced as an integrated developer tool for AI model optimization and compression.
Gamba Labs and Nota each shared their own development and success stories, including examples of implementing ultra-small AI models on devices. Gamba Labs has reportedly applied its ultra-lightweight voice recognition, speaker recognition, and anomaly detection solutions in projects spanning kiosks, car trunks, hospital operating rooms, factory equipment, and robots.
Nota also presented a case study on compressing and optimizing image-processing models for STMicroelectronics MCUs. An ST partner, Nota additionally provides AI model optimization solutions that integrate with development tools from Arm, Renesas, and others.
In his keynote speech, Professor Kim Kyung-gi emphasized that “it is important to develop applications and use cases that maximize the advantages of on-device AI,” adding that “the trend toward lightweight models will continue, and AutoML is likely to become a major topic” and that “finding success stories will be a key challenge.”