
iOS 18's Project Greymatter will use artificial intelligence to summarize notifications, articles, and more

AI will improve several core apps with summarization and transcription features

The next generation of Apple's operating systems will see Project Greymatter, which will bring many improvements related to artificial intelligence. We have new details about the AI features planned for Siri, Notes, and Messages.

Following widespread announcements and reports of AI-related improvements in iOS 18, AppleInsider has obtained additional information about Apple's AI plans.

People familiar with the matter said the company is internally testing a slew of new AI-related features ahead of its annual WWDC. The company's advances in artificial intelligence, codenamed Project Greymatter, will focus on practical benefits for the end user.

In preview versions of Apple's operating systems, the company is working on a notification summary feature known as “Greymatter Catch Up.” The feature is tied to Siri, meaning users will be able to request and receive an overview of their latest notifications through the virtual assistant.

Siri is expected to get a significantly upgraded response generation capability thanks to the new Smart Reply system, as well as Apple's on-device LLM. Siri will be able to consider things like people and companies, calendar events, places, dates, and more when creating responses and summaries.


In our previous reports on Safari 18, the Ajax LLM, and the revamped Voice Memos app, AppleInsider reported that Apple plans to bring AI-powered text summarization and transcription to built-in apps. We later learned that the company intends to bring these features to Siri as well.

Ultimately, this means Siri will be able to respond to queries on the device, create summaries of longer articles, or transcribe audio, as in the updated Notes or Voice Memos apps. All of this could be done using Ajax LLM, or cloud processing for more complex tasks.

We're also told that Apple is testing improved and “more natural” voices, as well as text-to-speech improvements, which should ultimately lead to a much better user experience.

Apple is also working on media and TV controls for Siri across devices. This feature would allow someone to use Siri on their Apple Watch to play music on another device, for example, although that feature may not arrive until later.

The company has decided to bring artificial intelligence to several core system applications, tailored to different use cases and tasks. One notable area of improvement is photo editing.

Apple has developed generative AI software to improve image editing

iOS 18 and macOS 15 are expected to introduce AI-powered photo editing capabilities in apps like Photos. Under the hood, Apple has developed a new Cleanup feature that will allow users to remove objects from images using generative AI software.

The Cleanup tool will replace Apple's current Touch Up tool

Also associated with Project Greymatter, the company has created an application for internal use known as the Generative Playground. People familiar with the app told AppleInsider exclusively that it can use Apple's generative software to create and edit images, and that it supports iMessage integration in the form of a dedicated app extension.

In Apple's test environments, you can create an image using artificial intelligence and then send it via iMessage. There are indications that the company is planning to bring a similar feature to end users of its operating systems.

This information is consistent with another report claiming that users will be able to use AI to create unique emoji, and additional image generation features are also believed to be in the works.

Preview versions of the Apple Notes app also include references to a generation tool, according to people familiar with the matter, although it's unclear whether that tool will generate text or images, as is the case with the Generative Playground app.

Notes will receive AI-powered transcription and summarization along with math notes

Apple has major improvements in store for its built-in Notes app, debuting in iOS 18 and macOS 15. The updated Notes app will gain support for in-app audio recording, along with LLM-based audio transcription and summarization.

The Notes app for iOS 18 will support in-app audio recording, transcription, and summaries.

Audio recordings, transcriptions, and text summaries will all be available in one note, along with any other content users want to add. This means that one note could, for example, contain a recording of an entire lecture or meeting, complete with whiteboard images and text.

These features will transform Notes into a truly powerful tool, making it a popular application for both students and business professionals. Adding audio transcription and summarization features will also allow the Apple Notes app to better position itself against competing offerings such as Microsoft OneNote or Otter.

While in-app audio recording, AI-powered transcription, and audio summarization will greatly improve the Notes app, they're not the only things Apple is working on.

Math Notes — create graphs and solve equations using artificial intelligence

The Notes app is getting a brand-new addition in the form of Math Notes, which will provide support for proper mathematical notation and integration with Apple's new GreyParrot calculator application. We now have more information about what Math Notes will entail.

The Notes app for iOS 18 will include support for AI-powered audio transcription as well as Math Notes

People familiar with the new feature said that Math Notes will allow the application to recognize text in the form of mathematical equations and suggest solutions to them. Support for graphing expressions is also in the works, meaning we could see something similar to the Grapher app on macOS, but in Notes.

Apple is also working on math-related input improvements in the form of a feature known as “Keyboard Math Predictions.” AppleInsider reported that this feature will allow mathematical expressions to be completed whenever they are recognized as part of text input.

This means that in Notes, users will have the ability to automatically complete their math equations, similar to how Apple currently offers predictive text and inline completions in iOS. Inline text predictions are also making their way to visionOS later this year.

Apple's visionOS will also see improved integration with Apple's Transformer LM, a predictive text input model that offers suggestions as you type. The operating system is also expected to receive a redesigned voice command user interface, which serves as an indicator of how much Apple values input-related improvements.

The company is also aiming to improve the user experience through the use of so-called “smart replies”, which will be available in Messages, Mail and Siri. This will allow users to respond to messages or emails with basic text responses instantly generated by Apple's on-device Ajax LLM.

Apple AI vs Google Gemini and other third-party products

AI has made its way into almost every application and device, and AI-focused products such as OpenAI's ChatGPT and Google Gemini have seen a significant surge in overall popularity.

Google Gemini is a popular artificial intelligence tool

Although Apple has developed its own artificial intelligence software to better position itself against competitors, the company's AI is far from as impressive as something like Google Gemini Advanced, AppleInsider has learned.

During its annual Google I/O developer conference on May 14, Google demonstrated an interesting use case for artificial intelligence: users can ask a question in video form and receive an answer or suggestion generated by the AI.

As part of the event, Google's artificial intelligence was shown a video of a broken turntable and asked why it wasn't working. The software identified the turntable model and suggested that it might be out of balance, causing it to not work properly.

The company also announced Google Veo, software capable of generating video using artificial intelligence. OpenAI also has its own video generation model known as Sora.

Apple's Project Greymatter and Ajax LLM can't generate or process video, meaning the company's software can't answer complex video questions about consumer products. This is likely why Apple has been keen to partner with companies like Google and OpenAI to secure licensing agreements and make more features available to its user base.

Apple will compete with products like Rabbit R1 by offering vertically integrated AI software on existing hardware.

Compared to dedicated AI hardware such as the Humane AI Pin or Rabbit R1, Apple's AI projects have the significant advantage of running on devices that users already own. This means users will not have to purchase a dedicated AI device to reap the benefits of AI.

Humane's AI Pin and Rabbit R1 are also generally considered unfinished or partially functional products, with the latter even turning out to be nothing more than a dedicated Android app.

Apple's artificial intelligence projects are expected to debut at the company's annual WWDC conference on June 10 as part of iOS 18 and macOS 15. The Calendar, Freeform, and System Settings apps are also expected to be updated.

