
The Rise of On-Device AI: Faster, More Secure, and Cost-Effective Solutions Needed

AI Hardware & Edge AI Summit
9-12 September, 2024
Signia by Hilton, San Jose, CA

The era of On-Device AI has officially begun, marking a significant shift in the artificial intelligence landscape. This new phase is characterized by the arrival of products equipped with On-Device AI, such as Samsung's Galaxy S24 series, Apple Intelligence, and Microsoft's Copilot+ PCs. These products aim to address the limitations of traditional cloud-based AI systems, which have been the backbone of popular services such as OpenAI's ChatGPT, Google's Gemini, and Microsoft's Copilot.

On June 25, Samjong KPMG published a report titled "A New Stage for Generative AI, On-Device AI," analyzing the utilization strategies of On-Device AI technology. The report highlights the growing importance of On-Device AI as a new platform that processes user requests directly on the device, eliminating the need for external data transfer. This approach offers several advantages, including faster analysis speeds, lower system operating costs, and enhanced data security, making it highly suitable for handling sensitive information.


The rise of On-Device AI is driven by the increasing size and complexity of AI models, which consume significant amounts of data and computing resources such as power and semiconductors. Traditional cloud-based AI systems face challenges related to high data transfer costs, latency issues, and substantial power consumption. On-Device AI addresses these challenges by utilizing semiconductor-based data processing systems embedded within the device, thereby reducing the burden on external networks and cloud infrastructure.
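To make the idea of "processing user requests directly on the device" concrete, the sketch below shows one way a small model bundled with an application can be run locally, so that no input data is sent to a cloud service. This is an illustrative assumption only: the model file name, input shape, and choice of ONNX Runtime as the inference engine are not taken from the KPMG report, which does not prescribe any particular toolchain.

import numpy as np
import onnxruntime as ort

# Illustrative sketch: the model file and input shape are placeholders.
# The model ships with the application and runs entirely on the local CPU,
# so the user's data never leaves the device.
session = ort.InferenceSession("on_device_model.onnx", providers=["CPUExecutionProvider"])

input_name = session.get_inputs()[0].name
user_input = np.random.rand(1, 3, 224, 224).astype(np.float32)  # stand-in for on-device data

# Inference happens locally; no network call, no external data transfer.
outputs = session.run(None, {input_name: user_input})
print("On-device result shape:", outputs[0].shape)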

Samjong KPMG's report proposes the 'SCALE' strategy (Semiconductor, Cloud, Ambient Computing, Language Model, Explainable AI) to drive business innovation and scale-up for enterprises adopting On-Device AI. The strategy emphasizes low power consumption during AI service operation, which is crucial for the success of On-Device AI. Consequently, low-power semiconductor technologies such as FPGAs (Field-Programmable Gate Arrays) and ASICs (Application-Specific Integrated Circuits) are expected to grow at an average annual rate of more than 40% through 2028, while high-performance semiconductors are anticipated to expand primarily in areas with fewer power constraints, such as cloud and data centers.

The report also highlights the potential of the ambient computing market, in which the IT devices around users autonomously learn user patterns and provide the services they need. The strong security characteristics of On-Device AI are particularly valuable for building an ambient computing environment that safeguards and analyzes personal data. The market for small AI models is also expected to expand significantly, as these models are developed to improve the quality of results delivered by individual On-Device AI devices such as smartphones and home appliances.

Source: Business Korea. Read the full article here.
