Qualcomm Partners with Meta to Bring AI Processing On-Device by 2024

By: Vishak

Chipmaker Qualcomm has announced a partnership with Meta to optimize Meta’s next-generation Llama 2 large language model to run locally on smartphones and PCs powered by Qualcomm’s Snapdragon processors. Llama 2 itself was unveiled through a separate partnership between Meta and Microsoft.

The collaboration aims to bring the latest advances in generative AI directly onto users’ devices, without a connection to the cloud. Compared with current cloud-based AI services such as ChatGPT, this on-device approach offers better reliability, stronger privacy, and more personalization.

Llama 2 is the successor to Meta’s first Llama model for natural language processing. Meta says Llama 2 was trained on 40% more data than its predecessor, improving its conversational abilities. Through the partnership, Qualcomm will optimize Llama 2 so that tasks such as text generation run quickly on Snapdragon mobile chips.
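To give a feel for what on-device text generation looks like in practice, here is a minimal sketch using the open-source llama-cpp-python bindings and a hypothetical quantized Llama 2 checkpoint. Qualcomm has not published details of its Snapdragon tooling, so this is only a generic illustration of running a Llama-family model locally, not the stack the partnership will ship.

```python
# Illustrative only: running a quantized Llama 2 checkpoint locally with the
# open-source llama-cpp-python bindings. This is NOT Qualcomm's Snapdragon
# tooling; the model file name below is hypothetical.
from llama_cpp import Llama

# Load a 4-bit quantized Llama 2 7B chat model from local storage.
# Quantization shrinks the model so it can fit in the memory of a phone or laptop.
llm = Llama(model_path="./llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)

# Generate text entirely on the local device; no request leaves the machine.
result = llm(
    "Q: What are the benefits of on-device AI? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(result["choices"][0]["text"].strip())
```

The key design point is quantization: shrinking the model’s weights is what makes a multi-billion-parameter model practical within the memory and power budget of a mobile device.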

Qualcomm stated that on-device AI processing keeps user data on the device, sidestepping the privacy concerns associated with cloud services. The company expects Snapdragon devices to support Llama 2-based AI applications starting in 2024, though it has not said whether the latest Snapdragon chips will be required.

Vishak
Vishak is Editor-in-chief at Code and Hack, covering AI and coding. He writes on topics including machine learning, natural language processing, and software development, and tracks the latest news and breakthroughs in these fields to keep readers informed.
