
8 AI features coming to iOS 26 that actually excite me (and how to try them now)

Jun 18, 2025 | Hi-network.com
Apple iOS 26 with Liquid Glass on iPhone 16 Pro (from the Developer Beta).

Jason Hiner

Apple's Worldwide Developers Conference (WWDC) was expected to bring little AI news, but Apple proved everyone wrong. 

Apple has not yet launched the highly anticipated Siri upgrade (the company said we will hear more about it in the coming year), but at last week's event it unveiled a slew of AI features across its devices and operating systems, including iOS, MacOS, WatchOS, and iPadOS. 

Also: WWDC 2025 recap with Sabrina Ortiz and Jason Hiner

While the features weren't the flashiest, many of them address issues that Apple users have long had with their devices or in their everyday workflows, while others are downright fun.  

I gathered the AI features announced and ranked them according to what I am most excited to use and what people on the web have been buzzing about.

1. Visual Intelligence

Apple introduced Visual Intelligence last year with the launch of the iPhone 16. At the time, it allowed users to take photos of objects around them and then use the iPhone's AI capability to search for them and find more information. Last week, Apple upgraded the experience by adding Visual Intelligence to the iPhone screen. 

To use it, you just take a screenshot. Visual Intelligence then uses Apple Intelligence to pull details from the screenshot and suggest actions, such as adding an event to your calendar. You can also use the "Ask" button to send the image to ChatGPT for help, which is useful for tasks like solving a puzzle, or tap "Search" to look it up on the web. 

Also: Your entire iPhone screen is now searchable with new Visual Intelligence features

Although Google already offers a similar capability with Circle to Search, this is a big win for Apple users because it is functional and well executed. Apple leverages ChatGPT's already capable models rather than trying to build an inferior model of its own. 

2. Real-Time Translation 

Since generative AI exploded in popularity, a useful application that has emerged is real-time translation. Because LLMs have a deep understanding of language and how people speak, they are able to translate speech not just literally but also accurately using additional context. Apple will roll out this powerful capability across its own devices with a new real-time translation feature. 

Also: Apple Intelligence is getting more languages - and AI-powered translation

The feature can translate text in Messages and audio on FaceTime and phone calls. For verbal conversations, you just tap a button during the call, which alerts the other person that live translation is about to begin. After a speaker says something, there is a brief pause, and then you hear a spoken version of what was said in your language of choice, along with a transcript you can follow. 

This feature is valuable because it helps people who don't share a language communicate, and it is easy to access because it is baked into the communication apps people already rely on every day. 

3. AutoMix in Apple Music

The AutoMix feature uses AI to add seamless transitions from one song to another, mimicking what a DJ would do with time stretching and beat matching. In the settings app on the beta, Apple says, "Songs transition at the perfect moment, based on analysis of the key and tempo of the music." 

Also: Your Apple CarPlay is getting a big update: 3 useful features coming with iOS 26

It works in tandem with the Autoplay feature: when a song is playing, it can, like a DJ, queue up another track that matches the vibe and transition into it seamlessly. Many users are already trying it on their devices in the developer beta and posting the pretty impressive results on social media. 

4. Apple Intelligence on-device is now available to developers

Apple made its on-device model available to developers for the first time, and although that may seem like it would only benefit developers, it's a major win for all users. Apple has a robust community of developers who build applications for Apple's operating systems. 

Tapping into that talent by letting developers build on Apple Intelligence all but guarantees that more innovative and useful applications will emerge, which benefits every user who gets to take advantage of them. 
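For developers curious what that access could look like in practice, here is a minimal sketch of calling the on-device model through the Foundation Models framework Apple announced at WWDC. The exact names (LanguageModelSession, respond(to:)) are based on Apple's WWDC materials and may shift during the beta, so treat this as an illustration rather than final API.

```swift
// Minimal sketch: asking Apple's on-device model to summarize text.
// Assumes the Foundation Models framework as shown at WWDC25; names such as
// LanguageModelSession and respond(to:) may change before release.
import FoundationModels

func summarize(_ text: String) async throws -> String {
    // A session keeps context across prompts and runs entirely on-device.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```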

5. Shortcuts using Apple Intelligence

The Shortcuts update was easy to miss during the keynote, but it is one of the best use cases for AI. If you are like me, you typically avoid programming Shortcuts because they seem too complicated to create. This is where Apple Intelligence can help. 

Also: My favorite iPhone productivity feature just got a major upgrade with iOS 26 (and it's not Siri)

With the new intelligent actions feature, you can tap into Apple Intelligence models, either on-device or in Private Cloud Compute, from within your Shortcut, unlocking a much more advanced set of capabilities. For example, you could set up a Shortcut that takes all the files you add to your desktop and then sorts them into folders for you using Apple Intelligence. There is also a gallery where you can try prebuilt Shortcuts and get inspiration for building your own. 
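The Shortcuts integration itself is no-code, but for a sense of how an app could expose an action that such a Shortcut calls, here is a hedged sketch combining the App Intents framework with the on-device model. The App Intents types are shipping API; the FoundationModels names are the same assumptions as in the earlier sketch.

```swift
// Hedged sketch: an App Intent that Shortcuts could invoke, which asks the
// on-device model to suggest a folder name for a given file name.
// AppIntents is an existing framework; the FoundationModels names are
// assumptions based on Apple's WWDC25 announcement.
import AppIntents
import FoundationModels

struct SuggestFolderIntent: AppIntent {
    static var title: LocalizedStringResource = "Suggest Folder for File"

    @Parameter(title: "File name")
    var fileName: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        let session = LanguageModelSession(
            instructions: "Reply with a single short folder name for the given file name."
        )
        let response = try await session.respond(to: fileName)
        // The returned string appears as the action's output in Shortcuts.
        return .result(value: response.content)
    }
}
```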

6. Hold Assist

The Hold Assist feature is a prime example of a feature that isn't flashy but has the potential to save you a lot of time in everyday life. The way it works is simple: if you're placed on hold and your phone detects hold music, it will ask if you want it to hold your spot in line, then notify you when it's your turn to speak with someone and alert the person on the other end that you'll be right there.

Also: What Apple's controversial research paper really tells us about LLMs

Imagine how much time you will get back from calls with customer service. If the feature seems familiar, it's because Google has a similar "Hold for Me" feature, which waits on hold for you and notifies you when someone is back on the line.

7. Spatial photos 

The Apple Vision Pro introduced the idea of enjoying your precious memories in an immersive experience that places you in the scene. However, to take advantage of this feature, you had to take spatial photos and record spatial videos. Now, a similar feature is coming to iOS, allowing users to transform any picture they have into a 3D-like image that separates the foreground and background for a spatial effect. 

Also: iOS 26 will bring any photo on your iPhone to life with 3D spatial effects

The best part is that you can add these photos to your lock screen, and as you move your phone, the 3D element looks like it moves with it. It may seem like there is no AI involved, but according to Craig Federighi, Apple's SVP of software engineering, it can transform your 2D photo into a 3D effect by using "advanced computer vision techniques running on the Neural Engine." 

8. Workout Buddy

Using AI for workout insights isn't new; most fitness wearable companies, including Whoop and Oura, have implemented features of this sort before. However, Workout Buddy is a unique application of AI that I haven't seen before. Essentially, it uses your fitness data, such as workout history, paces, Activity Rings, and Training Load, to give you personalized feedback as you are working out. 

Also: Your Apple Watch is getting a major upgrade. Here are the best features in WatchOS 26

Even though this feature is part of the WatchOS upgrade, and I don't own an Apple Watch, it does seem like a fun and original way to use AI. As someone who lacks all desire to work out, I can see how a motivational nudge could help me stick with a workout longer. 

Honorable mentions

The list above is already pretty extensive, and yet, Apple unveiled a lot more AI features: 

  • Order tracking in Apple Wallet: Apple Intelligence can analyze your emails for tracking details and then populate that information in Apple Wallet, even if you paid with something other than Apple Pay. 
  • Polls for Messages: Apple Intelligence can detect when a poll could be useful in a conversation and suggest one. Users can then use Image Playground to create different backgrounds for the polls. 
  • Better routes in Maps: Using on-device intelligence, Maps can learn a user's daily routes and then offer more tailored suggestions. 
  • Genmoji and Image Playground: The Genmoji update is simple: you can now mix two emojis to create something new. Image Playground was upgraded to create images in specific styles by leveraging ChatGPT's image-generation model. 

How to get early access

Apple released an iOS 26 beta geared toward developers. The beta gives users access to all of the latest features available in iOS 26, but the downside is that since the features are not fully developed, many are buggy. 

So, if you want to try the features, keep in mind that you won't get the best experience; use a secondary phone or make sure your data is properly backed up in case something goes wrong. 

To get started, you need an iPhone 11 or newer running iOS 16.5 or later and an Apple ID enrolled in the Apple Developer Program. A guide with step-by-step instructions on downloading the beta is available.


