iOS 18.2 beta 2 was rolled out to developers and testers on the developer beta channel on Monday, as Apple prepares the next version of its smartphone operating system, which is expected to arrive in early December with more Apple Intelligence features. The latest beta releases also include support for a new application programming interface (API) that will allow developers to give the system access to on-screen content, letting Siri and Apple Intelligence send information to third-party services for processing.
Apple Introduces API for Siri's Onscreen Awareness Feature
On the Apple Developer website, the company has published documentation (via MacRumors) for the new API, titled 'Making onscreen content available to Siri and Apple Intelligence', which is designed to give Siri and Apple Intelligence access to an app's onscreen content so they can understand what the user is viewing.
If a developer adds support for the onscreen content API, their application will provide the contents of the screen to Siri and Apple Intelligence only when a user explicitly requests it, according to the company. The information on a user's screen can then be shared with a third-party service (such as OpenAI's ChatGPT).
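Based on the documentation's description, adoption centres on Apple's App Intents framework: the app models what is currently on screen as an entity and makes its content exportable, so the system can hand it off only when the user asks. The Swift sketch below is illustrative rather than code from Apple's documentation; the ReaderDocument type, its fields, and the query are hypothetical placeholders.

```swift
import AppIntents
import CoreTransferable
import Foundation

// Hypothetical model of a document shown on screen in a reader app.
struct ReaderDocument: AppEntity, Transferable {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Document")
    static var defaultQuery = ReaderDocumentQuery()

    var id: UUID
    var title: String
    var fullText: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }

    // Exports the document's text so the system can pass it along (for example,
    // to ChatGPT) when the user explicitly asks Siri about the screen.
    static var transferRepresentation: some TransferRepresentation {
        ProxyRepresentation(exporting: \.fullText)
    }
}

// Minimal query so the system can resolve documents by identifier.
struct ReaderDocumentQuery: EntityQuery {
    func entities(for identifiers: [ReaderDocument.ID]) async throws -> [ReaderDocument] {
        // Look up the requested documents in the app's own store; stubbed here.
        []
    }
}
```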
Apple has also provided an example of Siri accessing onscreen content. While browsing the web, a user can say or type "Hey Siri, what's this document about?" to ask Siri to provide a summary of the document.
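To support a request like that, the app also has to tell the system which entity is currently on screen. One plausible way to wire this up, assuming the entity is associated through NSUserActivity via the appEntityIdentifier property referenced in Apple's documentation, is sketched below; the view controller and activity type string are hypothetical.

```swift
import AppIntents
import UIKit

final class DocumentViewController: UIViewController {
    var document: ReaderDocument?

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        guard let document else { return }

        // The activity type string is illustrative and would be declared in Info.plist.
        let activity = NSUserActivity(activityType: "com.example.reader.viewingDocument")
        activity.title = document.title

        // Assumption: linking the onscreen entity to the activity via
        // appEntityIdentifier is how the new API exposes screen content
        // to Siri and Apple Intelligence on explicit request.
        activity.appEntityIdentifier = EntityIdentifier(for: document)

        // UIKit keeps this activity current while the screen is visible.
        userActivity = activity
    }
}
```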
Developers can also add support for onscreen awareness in browser, document reader, file management, mail, photo, presentation, spreadsheet, and word processing apps. Apple says this list isn't exhaustive, so more apps should be able to take advantage of the API in the future.
It is worth noting that iOS 18.2 won't bring support for the new Siri, which is expected to offer greatly improved functionality. That is expected to arrive with iOS 18.4 alongside support for in-app actions, which Apple will reportedly release in April 2025, giving developers ample time to integrate support for the API into their apps.