As billions of IoT devices permeate our daily lives, their resource limitations and connectivity constraints confine them to only a narrow subset of machine learning capabilities within the broader scope of AI. Edge processing stands as a pivotal solution, optimising large-scale data mining and aggregation by relocating the data-processing segment of an application to more resourceful edge devices within the local network. The integration of GPU-accelerated ML inference on edge devices opens new avenues for harnessing the full potential of AI, fostering a future where intelligent decision-making in IoT is no longer accessible only via the cloud.
Mo will commence the talk by outlining the current landscape of IoT networks, emphasizing the increasing demand for intelligent decision-making at the edge. Traditional challenges associated with centralized cloud-based ML models will be highlighted, setting the stage for the exploration of decentralized solutions.
We will delve into the technical considerations for integrating GPUs, including model optimization, compatibility with popular ML frameworks, and the advantages of parallel processing. Real-world examples will be presented to showcase the transformative impact of GPU-accelerated ML inference on edge devices, enabling the deployment of pre-trained models on the latest hardware without compromising performance. Practical considerations, such as power efficiency and scalability, will also be addressed to provide a comprehensive understanding of the benefits and challenges associated with this approach.
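To give a flavour of the kind of deployment discussed, the minimal sketch below is an illustrative assumption rather than part of the session materials: it uses PyTorch and TorchVision to load a pre-trained MobileNetV3 model onto a CUDA-capable edge GPU (falling back to the CPU when none is present) and run a single inference pass; the classify helper and the choice of model are purely hypothetical.

    # Minimal sketch: GPU-accelerated inference with a pre-trained model on an edge device.
    # Assumes PyTorch, TorchVision and Pillow are installed; a Jetson-class board or any
    # CUDA-capable GPU will be used if available, otherwise inference runs on the CPU.
    import torch
    from torchvision import models, transforms
    from PIL import Image

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # Load a pre-trained model and move it to the target device in evaluation mode.
    model = models.mobilenet_v3_small(weights=models.MobileNet_V3_Small_Weights.DEFAULT)
    model.eval().to(device)

    # Standard ImageNet preprocessing for the pre-trained weights.
    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])

    def classify(image_path: str) -> int:
        """Run a single inference pass and return the predicted class index."""
        image = Image.open(image_path).convert("RGB")
        batch = preprocess(image).unsqueeze(0).to(device)
        with torch.no_grad():
            logits = model(batch)
        return int(logits.argmax(dim=1).item())

A compact model such as MobileNetV3 is chosen here only to reflect the power- and memory-constrained nature of edge hardware; the same pattern applies to other pre-trained models and frameworks covered in the talk.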
Mo Haghighi
Dr Mo Haghighi is a director of engineering/distinguished engineer at Discover Financial Services. His current focus is hybrid and multi-cloud strategy, application modernisation and automating application/workload migration across public and private clouds. Previously, he held various leadership positions as a program director at IBM, where he led Developer Ecosystem and Cloud Engineering teams in 27 countries across Europe, the Middle East and Africa. Prior to IBM, he was a research scientist at Intel and an open source advocate at Sun Microsystems/Oracle.
Mo obtained a PhD in computer science, and his primary areas of expertise are distributed and edge computing, cloud native, IoT and AI, with several publications and patents in those areas.
Mo is a regular keynote/speaker at major developer conferences including Devoxx, DevOpsCon, Java/Code One, Codemotion, DevRelCon, O’Reilly, The Next Web, DevNexus, IEEE/ACM, ODSC, AiWorld, CloudConf and Pycon.