Apple's New AI Models Suggest a Push to Bring AI Directly to iPhones
Apple's latest AI releases point to a clear focus on embedding artificial intelligence directly into its devices. The company recently unveiled OpenELM, a family of compact language models published on the Hugging Face Hub, a step toward running stronger AI capabilities on mobile hardware. OpenELM, short for "Open-source Efficient Language Models," is suited to everyday text tasks such as composing emails, and the models are freely available for developers to use.
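Because the checkpoints are public, trying one takes only a few lines. Below is a minimal sketch of loading an OpenELM checkpoint with the Hugging Face transformers library; the OpenELM model code ships with the checkpoint, so trust_remote_code=True is required, and the tokenizer choice follows Apple's published example (the Llama 2 tokenizer, which is gated and needs access approval). The exact prompt and generation settings here are illustrative, not Apple's.

```python
# Minimal sketch: run the smallest instruction-tuned OpenELM checkpoint.
# Requires: pip install transformers torch, plus access to the Llama 2 tokenizer.
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "apple/OpenELM-270M-Instruct",   # smallest instruction-tuned variant
    trust_remote_code=True,          # OpenELM's model code lives in the repo
)
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

prompt = "Draft a short email thanking a colleague for their help:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```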
The models come in four sizes, with 270 million, 450 million, 1.1 billion, and 3 billion parameters. A model's parameter count is the number of weights it learns during training, a rough proxy for its capability, so the range covers diverse application needs. The smaller models are cheaper to run and are specifically tailored to perform well on personal devices such as smartphones and laptops.
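A quick back-of-the-envelope calculation shows why the smaller sizes suit phones and laptops. The sketch below estimates weight memory alone, assuming 16-bit (2-byte) weights; actual runtime memory is higher once activations and the KV cache are included.

```python
# Rough weight-memory estimate per OpenELM size at fp16 (2 bytes/parameter).
SIZES = {"270M": 270e6, "450M": 450e6, "1.1B": 1.1e9, "3B": 3e9}

for name, params in SIZES.items():
    gb = params * 2 / 1024**3
    print(f"OpenELM-{name}: ~{gb:.2f} GB of weights")
# The 270M model needs about half a gigabyte for weights, comfortably
# within a smartphone's budget; the 3B model needs closer to 6 GB.
```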
Additionally, in December Apple rolled out MLX, a machine learning framework built to run AI workloads efficiently on Apple Silicon, further underscoring its commitment to on-device AI. Apple has since published MGIE, an image editing model that applies photo corrections from simple natural-language prompts, and Ferret-UI, a multimodal model aimed at understanding and navigating smartphone interfaces. Together, these releases signal Apple's ongoing effort to build more sophisticated AI into its consumer devices.
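For a sense of what MLX looks like in practice, here is a minimal sketch using its Python API (pip install mlx, Apple Silicon only). Two design choices stand out: arrays live in unified memory shared by the CPU and GPU, and computation is lazy, so nothing executes until a result is forced.

```python
# Minimal MLX sketch: unified-memory arrays and lazy evaluation.
import mlx.core as mx

a = mx.random.normal((1024, 1024))
b = mx.random.normal((1024, 1024))
c = (a @ b).sum()   # builds a lazy computation graph; nothing runs yet
mx.eval(c)          # forces evaluation of the graph
print(c.item())
```

The lazy model lets MLX fuse and schedule operations before running them, which is part of how it squeezes performance out of Apple Silicon without explicit device management by the programmer.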