Apple today announced AI additions to its Xcode development environment, aiming to increase the productivity of programmers building apps across Apple's product line.
For those of you who aren't programmers, let's take a moment to discuss just what a development environment does. To do this, a good analogy is a chef's kitchen.
A baker's kitchen, for example, will be different from one focused on low-carb cooking. A baker's kitchen may revolve around a stand mixer and have an assortment of racks for cooling and preparation, with ample storage for flour, sugar, and other baking staples.
A kitchen focused on low-carb cooking would have equipment like spiralizers, chopping gadgets, sous-vide machines, and an air fryer or two. The storage challenge would be finding fridge and counter space for fresh fruits and vegetables, as well as lean proteins.
Each of these work environments is tailored to the individual needs and working style of the person doing the work, customized with certain commonly used tools, and even optimized for reducing steps.
A programmer's development environment, whether it be Xcode for Apple development, Visual Studio for Microsoft applications, or PhpStorm (my primary coding environment) for building web applications, is also an environment that can be tailored to its user's needs.
We coders work on screen, and we define our "floor space" by the arrangement of windows and panes on that screen. We also have "major appliances," except instead of a stove and refrigerator, we have an editor and a debugger. Many of us carefully arrange our windows and panes to save steps, and we often save different layouts of tools depending on what stage of coding we're in at the time.
Let's belabor our kitchen analogy a bit more. How many of us, growing up, helped out mom and dad by preparing food, chopping vegetables, or doing the dishes? When we were helping out, we were not the "chef," but we were very valued helpers (even if we did sneak a morsel here or there when a parent didn't seem to be watching).
In the case of a coding environment, the AI additions are like those kitchen helpers. AI is nowhere near ready to go out and build a major application on its own, but it can take on numerous small and often tedious tasks that are part of the coding process. In the past year, I've used AI numerous times to help with my coding, and I'm convinced I saved a month or more by delegating the creation and analysis of small subroutines to the AI.
Apple's version of AI is called Apple Intelligence. At the very end of the keynote, Craig Federighi announced a number of key Apple Intelligence-powered features for Xcode, Apple's development environment.
First, he discussed how Apple Intelligence has been built into developer SDKs. An SDK (software development kit) is effectively a way for developers to incorporate pre-existing OS technology into their apps.
Continuing our kitchen analogy, think of SDKs as roughly analogous to meal kits. Federighi talked about incorporating Image Playground (Apple's text-to-image AI feature) into developer apps with just a few lines of code.
That's kind of like how a cook might make a meal by opening up the meal kit and just incorporating all the ingredients to come up with a delicious dinner. In the case of the meal kit, the kit developers did all the work in figuring out the ingredients, selecting and providing them, and creating the recipes and instructions. In the case of an SDK, the SDK developer did all the work in figuring out the technology (like text to image), and providing that to the app developers.
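To make the "few lines of code" claim concrete, here's a minimal sketch of wiring Image Playground into a SwiftUI view. The imagePlaygroundSheet modifier and its parameters are my assumptions based on the keynote demo and the ImagePlayground framework Apple later documented, not code Apple showed; check the shipping SDK for the exact API.

```swift
// A minimal sketch (not Apple's demo code) of presenting Image Playground
// from an app. The modifier name and parameters are assumptions.
import SwiftUI
import ImagePlayground

struct AvatarMaker: View {
    @State private var showPlayground = false   // controls the Image Playground sheet
    @State private var avatarURL: URL?          // file URL of the image the user accepts

    var body: some View {
        Button("Create an avatar") {
            showPlayground = true
        }
        .imagePlaygroundSheet(
            isPresented: $showPlayground,
            concept: "a friendly robot sous-chef"   // text prompt for the generated image
        ) { url in
            avatarURL = url
        }
    }
}
```

The point of the meal-kit analogy holds here: the app supplies a prompt and handles the result, while the system does all the generative heavy lifting.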
Any apps that use the standard editable text view for creating text gain full access to Apple Intelligence writing tools (summaries, etc.).
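As a minimal illustration, here's the kind of view that qualifies. Nothing in it is Writing Tools-specific, which is the point; the view and its names are mine, not Apple's.

```swift
// A minimal sketch: because this view uses the standard system text editor,
// it picks up the Apple Intelligence writing tools (proofread, rewrite,
// summarize) automatically on supported devices, per the keynote.
// There is no Writing Tools-specific code here; the system supplies the UI.
import SwiftUI

struct DraftEditor: View {
    @State private var draft = "Dear team, ..."

    var body: some View {
        TextEditor(text: $draft)   // standard editable text view
            .padding()
    }
}
```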
Siri has also been upgraded with Apple Intelligence. Developers who use SiriKit (the SDK for Siri) will gain Siri-based enhancements for features like lists, notes, media, messaging, payments, restaurant reservations, VoIP calling, and workouts.
Likewise, Apple is adding to its App Intents functionality. These are predefined actions or tasks that apps can perform, allowing them to integrate seamlessly with Siri and other system features to enhance user interactions and automation. Federighi stated that Apple is enhancing intents with Apple Intelligence capabilities in the following categories: books, browsers, cameras, document readers, file management, journals, mail, photos, presentations, spreadsheets, whiteboards, and word processors.
These let developers add new AI functionality with little additional work, and certainly without the full investment in AI that was needed to create those features in the first place.
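To ground that, here's a minimal sketch of what an App Intent looks like in the existing AppIntents framework, which is the mechanism these new Apple Intelligence categories build on. The journal-entry intent and its parameter are hypothetical examples of mine, not something Apple announced; only the framework types are real.

```swift
// A minimal sketch of an App Intent: a predefined action an app exposes
// so Siri and the system can invoke it. The intent itself is hypothetical.
import AppIntents

struct OpenJournalEntry: AppIntent {
    static var title: LocalizedStringResource = "Open Journal Entry"

    @Parameter(title: "Entry Title")
    var entryTitle: String

    func perform() async throws -> some IntentResult {
        // App-specific logic to find and display the entry would go here.
        return .result()
    }
}
```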
In terms of the coding process itself, Apple announced that it's adding generative intelligence to Xcode. Specifically, it will provide on-device code completion (writing small chunks of code for a developer) for the Swift language. It's interesting that Federighi used the term "code completion" rather than code writing, because code completion implies a much more controlled process, one that simply extends and fills in code the developer is already producing. Full code writing would involve telling the AI to write a module to a given spec, and from this announcement, it's not clear Xcode will do that.
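To illustrate the distinction with a hypothetical example of my own (not one from the keynote): with code completion, the developer supplies the types and the function signature, and the model proposes the body.

```swift
// A hypothetical illustration of code completion's scope: the developer
// writes the struct and the signature; the completion engine suggests
// a body like the one shown, rather than generating a whole module.
struct Landmark {
    let name: String
    let rating: Double
}

func topRatedLandmarks(_ landmarks: [Landmark], limit: Int) -> [Landmark] {
    // A plausible suggested completion:
    Array(landmarks
        .sorted { $0.rating > $1.rating }
        .prefix(limit))
}
```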
Xcode will also be able to answer questions for Swift developers, which can save a ton of time. Developers can ask how to code specific SDK calls (for example, "How can I add Image Playground here?"). Presumably (again, this wasn't specified in the keynote), developers could also ask the AI "What does this code do?" and get a detailed explanation.
Generative AI in development environments is fairly new, and the companies that build those environments, as well as individual developers, are still learning where generative AI is a helpful new power tool and where it just gets in the way.
We've come a long way in the last year, and my bet is that by WWDC 2025, this feature set will seem rudimentary, because by then we will all have learned a lot more about how AI can help with coding.