
As part of its 2025 Worldwide Developers Conference, Apple held its Platforms State of the Union on June 9. This follow-up keynote, aimed at developers, goes into greater depth on several of the day's announcements.
See also: Apple WWDC 2025 Keynote Roundup.
Bringing "Liquid Glass" to app design
One of the biggest reveals was "Liquid Glass," the new software design language coming to the upcoming operating system updates. Developers can now use SwiftUI, UIKit, or AppKit to bring the glassy, translucent new look to their apps, including to their app icons.
At WWDC, Billy Sorrentino, senior director of Human Interface at Apple, said the new design aims to improve hierarchy, harmony, and consistency across devices.
Key user interface (UI) elements now float above content, clarifying the hierarchy and focusing attention. Harmony comes from interfaces that align more closely with device geometry and natural touch patterns. And because the new design is shared across all Apple platforms, apps feel consistent and familiar wherever they run.
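A minimal SwiftUI sketch of what adopting the new material could look like. The `glassEffect(_:in:)` modifier is part of the updated SwiftUI API; the surrounding view here is illustrative, not from Apple's session.

```swift
import SwiftUI

// Illustrative sketch: a small label adopting the Liquid Glass material.
// glassEffect(_:in:) is SwiftUI's modifier for the new design; the
// badge content and styling choices below are hypothetical.
struct BadgeView: View {
    var body: some View {
        Label("Favorites", systemImage: "star.fill")
            .padding()
            // Apply the glass material, clipped to a capsule shape.
            .glassEffect(.regular, in: .capsule)
    }
}
```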
Building AI features with Apple Intelligence
The presentation also shared news about Apple Intelligence, including the announcement that developers can now access Apple's on-device foundation models and incorporate them into their apps.
An education app, for instance, could use them to generate a personalized quiz from a student's notes. Because there are no cloud API or hosting costs involved, this is far cheaper than paying for a third-party service. It also works when the user is offline, and privacy is preserved.
To make this integration even more seamless, Apple introduced the Foundation Models framework, which is deeply embedded in the Swift programming language. With just a few lines of code, developers can access generative capabilities such as text summarization and tool calling. The framework is designed to be developer-friendly.
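A sketch of what "a few lines of code" might look like with the Foundation Models framework's `LanguageModelSession` API; the prompt and the summarization use case are illustrative.

```swift
import FoundationModels

// Sketch: ask the on-device foundation model to summarize a
// student's notes. Assumes the LanguageModelSession API from the
// Foundation Models framework; the prompt text is hypothetical.
func summarize(_ notes: String) async throws -> String {
    // A session manages one conversation with the on-device model.
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Summarize the following study notes:\n\(notes)"
    )
    return response.content
}
```

Because the model runs on-device, a call like this works offline and incurs no per-request cost.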
In addition, Apple has expanded its App Intents framework. Previously a tool for integrating app content with features like Siri, Spotlight, or widgets, it now supports Visual Intelligence. This lets developers add visual search capabilities to their apps, so users can identify and interact with information directly from images, such as recognizing an object in a photo or scanning handwritten notes.
ChatGPT comes to Xcode 26.
In its upcoming release, Apple's development environment, Xcode, will support ChatGPT more fully.
Developers building applications in Xcode 26 can call on the AI assistant for bug fixes, documentation, and other coding support. It responds to natural-language instructions and suggests actions based on the task at hand. Developers can also use API keys to connect other third-party AI models, and can undo any AI-generated changes that don't meet their needs.
Apple’s senior director of developer relations, Matthew Firlik, also revealed that the upcoming release of macOS Tahoe will be the last to support Intel Macs. He urged developers to build with Xcode 26 so their software can take full advantage of the performance of the M-series chips in more recent Macs.
All of these developer features are now available for testing, with a public beta scheduled for July.
visionOS 26 supports building spatial apps
The Vision Pro headset's upcoming operating system, visionOS 26, will include a number of features to help developers build spatial apps, including:
- New volumetric APIs: these make it possible, for example, to place widgets in 3D and to build fully three-dimensional UI layouts in SwiftUI.
- Nearby Window Sharing: developers can create shared spatial experiences for users wearing two different Vision Pro headsets in the same room.
- RealityKit's ImagePresentationComponent, which can turn 2D images into engaging 3D scenes.
- Support for wide-field-of-view, 180°, and 360° video formats via the Apple Projected Media Profile.
Swift 6.2 adds WebAssembly support and more.
Swift 6.2 is now available. The updated version includes:
- New APIs for working with different kinds of memory, such as Span and InlineArray.
- WebAssembly support.
- Improved interoperability with C++ and Java, among other improvements.
- A containerization framework that makes it possible to run Linux container images on the Mac.
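As a brief illustration of the new memory APIs, here is a sketch of Swift 6.2's fixed-size InlineArray, which stores its elements inline rather than behind a heap allocation; the values are illustrative.

```swift
// Sketch: InlineArray is a fixed-size array type new in Swift 6.2.
// Its element count is part of the type, so storage lives inline.
var samples: InlineArray<4, Int> = [1, 2, 3, 4]
samples[0] = 10

// Sum the elements by index (illustrative usage).
var total = 0
for i in 0..<samples.count {
    total += samples[i]
}
// total is now 19 (10 + 2 + 3 + 4)
```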
By bridging AI, design, and performance, Swift now gives developers the tools to build unified cross-platform apps.