“Apple-GPT”: What Apple’s own AI will look like

Like all Big Tech companies, Apple is currently working on generative AI in the form of a large language model (LLM). Tech journalists have jokingly dubbed it "Apple GPT". The AI system is expected to soon power Siri and many other system functions on Apple devices and help users with everyday tasks. The key difference: unlike Microsoft, for example, with its now deeply integrated OpenAI solutions, Apple's AI is expected to run locally on Apple devices.

And probably as early as September 2024, when iOS 18 is expected to be released. If the rumors are to be believed, the new iPhone system version will, according to Mark Gurman of Bloomberg, become the "biggest iOS update ever". It is expected to ship with plenty of AI functions.

Tim Cook hinted at this development in early February, in typically vague Apple fashion, during the earnings call on the quarterly figures: Apple is investing in the future. "This includes artificial intelligence, on which we continue to devote significant time and effort. We look forward to sharing details of our ongoing work in this area later this year." This can only mean WWDC 2024, the annual developer conference at which Apple traditionally presents new iOS and macOS versions.

Ferret: Apple’s all-round AI

But what comes next? In October 2023, Apple released a multimodal large language model called Ferret under a non-commercial license on GitHub. At the same time, a preprint paper by researchers at Apple and Columbia University appeared (PDF), which describes the model's functions in more detail. According to the authors, Ferret has "spatial awareness" and can analyze image content very precisely. This should make it possible, for example, to retrieve recipes based on a photo of the finished dish.
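Ferret's spatial awareness rests on letting a user refer to specific image regions directly in the prompt. The article does not describe the actual prompt syntax, so the following Python sketch is purely illustrative: the `<region …>` token format and the helper names are assumptions, not Ferret's real API. It shows the basic idea of normalizing a pixel bounding box and embedding it in a question to a region-aware model:

```python
def region_token(box, img_w, img_h):
    """Normalize a pixel bounding box (x1, y1, x2, y2) to [0, 1] coordinates
    and render it as an inline region reference for the prompt.
    The token format here is a made-up illustration, not Ferret's syntax."""
    x1, y1, x2, y2 = box
    norm = [round(x1 / img_w, 2), round(y1 / img_h, 2),
            round(x2 / img_w, 2), round(y2 / img_h, 2)]
    return f"<region {norm}>"

def build_prompt(question, box, img_w, img_h):
    """Insert the region reference where the question says '{region}'."""
    return question.format(region=region_token(box, img_w, img_h))

# Example: ask about the dish shown in the upper-left quarter of a 1000x800 photo.
prompt = build_prompt("What recipe produces the dish in {region}?",
                      box=(0, 0, 500, 400), img_w=1000, img_h=800)
print(prompt)
# -> What recipe produces the dish in <region [0.0, 0.0, 0.5, 0.5]>?
```

The point of such region tokens is that the model grounds its answer in one part of the image instead of the whole frame, which is what distinguishes Ferret-style models from plain image captioners.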

Apple’s rather unusual step of revealing an important product in advance via an open-source release is probably due to the fact that Apple currently lacks the capacity to develop complex AI systems entirely in-house. The group has been looking for AI engineers for quite some time. It also lacks its own hyperscaler infrastructure of the kind competitors Amazon, Google, and Microsoft have with AWS, Azure, and Google Cloud Platform. So Apple is tapping the open-source community instead.

Machine learning projects on Apple’s AI playlist

But Ferret is not the only item on Apple’s AI playlist: with HUGS (Human Gaussian Splats), the group is researching, together with the Max Planck Institute for Intelligent Systems, animatable human 3D avatars that can be generated from video footage and then reused elsewhere. It seems obvious that Apple’s augmented reality efforts and the Apple Vision Pro are the main targets here. The relevant source code has, however, already disappeared from GitHub.
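Gaussian splatting, the technique HUGS builds on, represents a scene, or here a human body, as a large set of 3D Gaussians with position, covariance, and color. As a rough illustration of the underlying math, here is a minimal Python sketch that evaluates the unnormalized density of a single anisotropic 3D Gaussian at a query point; this is the generic textbook formula, not code from the HUGS project:

```python
import math

def gaussian_density(point, mean, inv_cov):
    """Unnormalized density exp(-0.5 * d^T * Sigma^-1 * d) of a 3D Gaussian,
    where d = point - mean and inv_cov is the 3x3 inverse covariance matrix.
    In Gaussian splatting, this weight controls how strongly a splat
    contributes to a rendered sample."""
    d = [p - m for p, m in zip(point, mean)]
    # Mahalanobis term d^T * inv_cov * d
    maha = sum(d[i] * inv_cov[i][j] * d[j] for i in range(3) for j in range(3))
    return math.exp(-0.5 * maha)

# Identity inverse covariance: an isotropic unit Gaussian.
eye = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(gaussian_density((0.0, 0.0, 0.0), (0.0, 0.0, 0.0), eye))  # 1.0 at the mean
```

A real splatting renderer sums thousands of such weighted, colored Gaussians per pixel; making them animatable (rigged to a human skeleton) is the research contribution of HUGS.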
