Steve Thomas - IT Consultant

I am totally unsurprised that demand for triple-A games on iPhones and iPads reportedly isn't high, and that these ports could be considered commercial failures, according to data from Appfigures via MobileGamer.biz.

Despite Apple having worked to get Death Stranding, Assassin's Creed Mirage, and Resident Evil 7 ported over from last- and current-generation consoles onto the A17 Pro-powered iPhone 15 Pro and iPhone 15 Pro Max, as well as the iPads sporting M-series chips, these games have hardly been smash hits. Appfigures' data shows that, by downloads and revenue, these triple-A titles pale in comparison to simpler native mobile games built around free-to-play models.

Now, I’m a technology journalist, not a developer or business strategist, but I could have predicted this from a country mile away. 

I think it’s genuinely impressive that a slim iPhone can run modern console games – albeit not at maximum graphics settings – that would normally require a dedicated box of gaming hardware. It shows that Apple can produce impressive in-house chips that set a high bar for performance and graphics processing, one that even the likes of Qualcomm and its flagship Snapdragon 8 Gen 3 chip have to chase.

A time and a place

However, even as someone who enjoys games – I have a PS5, Xbox Series X, and a stupidly powerful PC – I remain unconvinced by console-quality gaming on smartphones or tablets.

The first issue is that these triple-A games tend to need a controller or controller accessory to play, and that's not only an extra cost – whether it's a clip to mount your phone on a controller, an Xbox Wireless Controller, or something like the Razer Kishi – it's also an extra level of faff. At home, I can simply turn to one of my dedicated game consoles. When commuting, the last thing I want to do is try to connect a controller to my phone while being jostled by other people.

Furthermore, if I want a proper handheld gaming experience, I’m more likely to reach for the Nintendo Switch or Valve’s Steam Deck – the latter keeps impressing me with what it can run.

And the second point is that smartphones are already bastions of casual, simple gaming. In general, I use my iPhone or iPad to play simple games that are easily controlled via a touchscreen, often one-handed; Plants vs. Zombies, FTL: Faster Than Light, and Bleak Sword come to mind.

Apple Arcade

(Image credit: Future)

If I game while traveling or commuting, I tend to stick to games that are easy to control and immerse me in a setting or task, rather than demanding finely tuned actions or the ability to move and look around at the same time. That way, if someone needs me to move out of my seat so they can use the facilities on an airplane, my gaming flow isn't hugely impacted.

Just looking around at people on my commutes from East London into the city center, I’ll see people playing games on their phones, only these won’t be console-quality titles but the likes of Candy Crush Saga. 

Equally, games like Call of Duty Mobile offer frenetic action and the need to use a virtual controller, but they have been built for native phone use and optimized for mobile platforms rather than being ports. They also offer quick bouts of fun rather than a lengthy campaign one needs to commit to.

Furthermore, Apple may be tackling gaming from two disparate sides. While the latest Pro iPhones can run ports of select triple-A games, Apple Arcade offers bespoke games predominantly designed for mobile use that can’t be found on other platforms. If I were going to do any dedicated gaming on my iPhone or iPad, I’d choose Arcade titles over ports or even native mobile games that cross both iOS and Android.

Playing the longer game

Doesn't this all mean I think Apple shouldn’t bother with triple-A game ports for iOS and iPadOS? No, not at all.

Apple has the money to build out a mobile gaming ecosystem that offers both high-end and less-demanding games in terms of graphics, complexity, and composition. And while the A17 Pro is limited to just two iPhone 15 models right now, the chip's power will surely bleed into future iPhones, namely the iPhone 16.

That should broaden the number of people with access to phones that can run these console ports, thus opening up the scope for triple-A gaming on smartphones.

I doubt it would transform mobile gaming as a whole. However, it could grow a niche of users who might not need to rely on the best gaming phones for a high-quality mobile gaming experience.

Ultimately, I feel triple-A gaming on the iPhone 15 Pro and 15 Pro Max was more of a tech demo for Apple to show off the power of the A17 Pro. There’s debate on whether such power is needed in a phone, but at least it somewhat futureproofs the iPhones. 

And I find it intriguing what Apple could do next with gaming; all that power could produce some truly smart Apple Arcade games, for example, ones that look stunning but are built natively for smartphones.

You might also like

Apple has long been rumored to be working on a foldable iPhone, but a new patent suggests there might be screens that stretch as well as screens that bend in development for numerous upcoming Apple products.

As spotted by AppleInsider, the patent is simply titled "Stretchable Display", and goes on to describe a screen that can wrap around different surfaces and shapes, while still operating like a standard OLED display.

While the patent doesn't specifically mention any devices in the Apple portfolio, it could potentially be used for a number of them. Imagine a HomePod with a display that curves around the top of the speaker, for example.

The patent is notable for the number of inventors attached: 55 rather than the usual 10 or so. However, as ever with patents, this is only an indication of what a company is spending time researching – there's no guarantee we'll ever see this tech in an actual product.

How this could be used

Apple stretchable display patent

The patent includes several stretchable display diagrams (Image credit: Apple/USPTO)

We've already mentioned how a future HomePod could have a display that stretches and wraps around the device, and such a display could also find a use in a foldable iPhone – enabling the phone to have a wraparound screen on the outside, as has been rumored.

Then there's the Apple Watch. The Apple Watch 10 is supposedly bringing some design tweaks with it, but a future model could have a screen that wraps all around the wrist (Apple has actually filed patents hinting at this in the past).

The patent mentions phones and watches, but also laptops, smart glasses, virtual reality headsets, and kiosk and car displays. This is clearly a technology that could find a place in many Apple products, if development on it continues.

That development will most likely take several years, so we're not going to see it appear in the iPhone 16. However, don't be surprised if future Apple devices, like an all-screen MacBook, arrive with displays that can stretch as well as fold.


When Apple announced the iPhone 15 Pro and iPhone 15 Pro Max last year, I raised an eyebrow at the so-called Action button. While I like that Cupertino’s clever folks baked in more… errrr action into a somewhat redundant mute slider, I was less sold on the limitations of the button; after all, you could already use the volume buttons on previous iPhones to activate the camera app’s shutter. 

But as I used the iPhone 15 Pro Max more, extolling the virtues of titanium and the overall near-perfect iPhone experience, I started to really love the Action button.

I have it set to trigger the 'torch' option in iOS – aka the camera flash – and it's surprisingly handy, especially on a Max-sized phone. Before, I'd have to swipe down from the top of the phone to get the drop-down menu and then tap the torch icon; if I happened to have wet hands (no, not like that; I live in rainy London) that wasn't always easy.

So, the Action button became a real boon in my life; that could be a sad reflection on my existence, but never mind.

However, as a tech journalist – or any journalist, really – I often find myself musing at quiet moments or during a commute. And today's musing, mixed with machinations over the Action button, triggered an unexpected thought.

I miss the squeezable sides of past Google Pixel phones

First introduced with the Pixel 2 phones, the feature let you trigger Google Assistant via a short, sharp squeeze of the phone's sides. I'm not exactly sure how it worked – there was something about strain gauges – but it was an effective way of waking up Google's smarter take on Siri without barking the occasionally clunky "OK Google" activation phrase.

What might have felt like a superfluous feature became oddly useful and second nature. It also introduced a level of freshness and minor innovation into the Pixel phones, something I felt many phones were lacking at the time. 

Sadly, the squeezable sides of the Pixel phones only lasted a few generations, going the way of the dodo when the Google Pixel 5 arrived and the search giant took a different, less-is-more approach to phone design, before settling on the Pixel aesthetics and AI focus it debuted with the Pixel 6.

Craving touch

an image of the Google Pixel 2 XL

(Image credit: Future)

Do I need squeezable sides to make a comeback in the smartphone arena? Not really. 

But while the best phones lean into AI integration and smart features that range from gimmicks to genuinely useful tools, this is all happening on the software side, AI-centric chipsets aside. Yet I feel there's still scope to be innovative and creative with phone hardware beyond just making the screen flex, as we see in the best foldable phones.

I’m expecting the rumored iPhone 16 line to be very much an evolution of the current Apple phones. But I’d love it if Apple took some inspiration from some of the quirkier phones of the past and introduced some new physical features or made the Action button even more functional, at the very least.

With all the AI tech, I’d love phones to make better use of haptics, accelerometers and other touchpoints to let me do more with a smartphone without necessarily looking at and tapping on a specific app or function.

Going by past phones, I feel Google is the type of company to introduce new hardware quirks, while Apple is the one that polishes them to a fine point.

The early tease of the Google Pixel 9 Pro doesn't suggest a big design change is coming, but I hope the search giant has put something special under the hood to excite and delight me, and to inject some creativity into the best Android phones; we'll hopefully see rather soon.


Now that we've heard everything about the iOS 18 upgrade coming later this year, we can turn our attention back to the hardware – and the design of the iPhone 16 flagship due in September has leaked again, this time through what seems to be a case mold.

A short video of the mold was posted to social media by @UniverseIce, who is one of the more reliable tipsters out there. Nothing is official until Apple confirms it, but this does match up with previous leaks we've seen.

It appears we're looking at a case template for either the iPhone 16 or the iPhone 16 Plus, as it's quite tricky to judge the size from this quick clip. It's most probably the larger model, but it might be the standard one.

There's still some debate over the screen size the iPhone 16 Plus is going to come with, and from what we've heard so far, every iPhone model could be changing its screen size this year – perhaps going up to 6.9 inches with the iPhone 16 Pro Max.

Cameras and buttons


Whether this case mold is for the iPhone 16 or the iPhone 16 Plus, both phones are expected to sport the same design – the only difference between the two handsets should be the screen size and configuration, the dimensions, and the battery capacity.

You can see here the vertical camera alignment that's been previously rumored: it'll be a change from the diagonal arrangement on the iPhone 15 and the iPhone 15 Plus, and it's apparently going to enable the new versions to record spatial photos and videos.

There also seems to be room here for the new Capture button that's supposedly coming to all the iPhone 16 models this year. If the rumors are true, the new button will give users more control over how photos and videos are taken.

No doubt the rumors and leaks are going to continue between now and September, when another four iPhones are expected to launch – and they should come with iOS 18 and Apple Intelligence on board.


iPhones are already pretty impressive video cameras, but Apple just made it easier to fine-tune your clips with its new Final Cut Camera app for iPhones and iPads, which you can now download for free.

Apple announced the standalone app in May, but it's just become available to download from the App Store. Final Cut Camera doesn't do a lot that you can't already do with third-party apps like Blackmagic Camera or Kino, but it is a lot more advanced than your iPhone's built-in Camera app – and it is also free.

The main benefit of using Final Cut Camera to shoot your videos is the manual control it offers. As in the stock Camera app, you can tweak your exposure by tapping the arrow in the bottom-right corner to reveal a sub-menu, then tapping the second option from the left.

However, unlike the iPhone's built-in Camera app, you can manually change your video's ISO or shutter speed. To do this, tap the 'auto' button within the exposure menu to get those extra options. A good rule of thumb for capturing natural motion blur is to make sure your shutter speed is double your frame rate (for example, at 24fps, set the shutter to 1/48s). You can also manually set the white balance by tapping the icon to the left of the exposure button.  
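To make that rule of thumb concrete, here's a quick sketch of the arithmetic in Python (the function name is my own, purely for illustration):

```python
def natural_shutter_speed(frame_rate: float) -> float:
    """Shutter speed (in seconds) suggested by the 180-degree rule:
    shutter duration = 1 / (2 * frame rate)."""
    return 1 / (2 * frame_rate)

# 24fps -> 1/48s, 30fps -> 1/60s, 60fps -> 1/120s
for fps in (24, 30, 60):
    print(f"{fps}fps -> 1/{2 * fps}s")
```

So at the cinematic standard of 24fps, dialing the shutter to 1/48s (or the nearest available value, 1/50s) gives motion blur that looks natural to the eye.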

The Final Cut Camera app is particularly powerful if you have an iPhone 15 Pro or iPhone 15 Pro Max because it combines these manual controls with the option of shooting in the Apple ProRes format. Unfortunately, there's no option to shoot in the ProRes LT format for smaller file sizes, but ProRes – which is used by pro video editors in apps like Final Cut Pro and Premiere Pro – is still a nice option if you're looking for maximum dynamic range and editing flexibility.

Another nice trick in Final Cut Camera is the ability to 'focus pull' (or slowly change the focus to highlight a focal point in your scene). To do this, open up that same sub menu and tap the icon on the right, then tap the AF/MF button (third from the left). Choose 'Manual' focus and drag the dial to change the focus.

Three iPhones on a pink and blue background showing Apple's Final Cut Camera app

The Final Cut Camera app lets you adjust the white balance (left), turn on focus peaking (middle) and manually tweak your video's ISO and shutter speed (right) (Image credit: Apple / Future)

To help you do this precisely, you can also turn on 'focus peaking' (above), which will highlight the in-focus areas of your scene in green. You can find this by tapping the settings gear icon in the top-right, then Tools, then toggling the 'Focus Peaking' option. This option needs an iPhone or iPad with at least an A13 Bionic chip, which debuted on the iPhone 11 series back in 2019.

Final Cut Camera also offers superior zoom control to the iPhone's stock Camera app. To zoom in or out of a scene, tap the magnifying glass to the left of the three focal length options, then use the slider. This dial feels smoother than the one on the Camera app and is also restricted to a single focal length, so you don't get the jerk of switching between lenses.

All of these controls help you fine-tune your iPhone videos or create a particular mood (for example, by intentionally underexposing the scene or tweaking the white balance). But Apple has also baked in some more advanced multi-cam features for pro shooters that, naturally, push you towards its Final Cut Pro editing app...

Going multicam

For the past few years, Apple has been increasingly touting the iPhone as a professional video camera – and a few features in the Final Cut Camera app certainly help it to earn that moniker.

If you have multiple iPhones or iPads – and a Final Cut Pro for iPad subscription ($4.99 / £4.99 / AU$7.99 a month) – you can shoot a Live Multicam session using the Final Cut Camera app. This effectively gives you four different angles on the same scene, which all feed into the Final Cut Pro app for speedy capture and editing.

To start a Live Multicam session, set up a Live Multicam project in Final Cut Pro for iPad (go to New Project > Record with Live Multicam), then tap the camera icon in the top-left corner of the Final Cut Camera app. Once you've followed the instructions to set up your camera angles, you can start recording on all the devices by tapping the record button in either app.

That's a potentially handy tool for YouTubers, and if you have an iPhone 15 Pro or iPhone 15 Pro Max, you can also record your videos directly to an external SSD to avoid filling up your phone's storage. You'll need a USB cable that is rated as USB 3.2 Gen 1x1 or better (with a minimum speed of 5Gbit/s), but once you're hooked up you'll see the name of your device appear at the bottom of the Final Cut Camera screen.

While the Final Cut Camera app is a handy bonus for the average iPhone owner, these extra tricks make it particularly potent for pro video shooters and owners of the iPhone 15 Pro series. Naturally, Apple will hope it lures in a few more Final Cut Pro for iPad subscribers, too, but you don't need that to benefit from some of its features.


Apple Intelligence is an exciting new upgrade for iPhone, iPad, and Mac fans, but slightly less appealing is the fact that it's going to be exclusive to recent devices. That led many to suggest that Apple's take on AI is a crafty way to force us into upgrading our tech, but Apple has now shot down those theories.

In an interview with Daring Fireball's John Gruber at WWDC 2024 (spotted by MacRumors), John Giannandrea (Apple's SVP of Machine Learning and AI) said in reference to Apple Intelligence that "you could, in theory, run these models on a very old device, but it would be so slow that it would not be useful".

When Gruber asked if Apple Intelligence was simply a "scheme to sell new iPhones", Greg Joswiak (Apple's SVP of Worldwide Marketing) said, "No, not at all. Otherwise, we would have been smart enough just to do our most recent iPads and Macs, too, wouldn't we?"

That last point refers to the fact that Apple Intelligence works on iPads and Macs that have an M1 chip or later, which means it'll be available on a much wider range of devices than on iPhones. We've rounded up the full list of devices that support Apple Intelligence, which includes 16 models of Mac and five versions of the iPad.

The most likely and credible explanation for Apple drawing the line on some devices and not others for Apple Intelligence support is memory. As noted by analyst Ming-Chi Kuo, the iPhone 15 comes with an A16 Bionic chip that has 6GB of memory, while the iPhone 15 Pro's A17 Pro chip comes with 8GB of memory.

The analyst also noted that Apple Intelligence uses a 3-billion-parameter large language model that, when compressed, needs around 0.7-1.5GB of spare memory. Considering that the M1 chip actually has less raw processing power than the A16 Bionic, yet still supports Apple Intelligence, the memory-requirement theory for the feature's device exclusivity certainly has some credence.
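As a rough back-of-the-envelope check on those figures (my own arithmetic, not Apple's or Kuo's): a 3-billion-parameter model quantized to somewhere between roughly 2 and 4 bits per parameter lands almost exactly in that 0.7-1.5GB window:

```python
def compressed_model_size_gb(num_params: float, bits_per_param: float) -> float:
    """Approximate in-memory footprint of a quantized model, in gigabytes."""
    return num_params * bits_per_param / 8 / 1e9

PARAMS = 3e9  # the reported 3-billion-parameter on-device model

# ~2-bit quantization gives ~0.75GB; ~4-bit gives ~1.5GB
print(compressed_model_size_gb(PARAMS, 2))  # 0.75
print(compressed_model_size_gb(PARAMS, 4))  # 1.5
```

On a 6GB iPhone 15 already running iOS and foreground apps, carving out that much spare headroom is a much tighter squeeze than on the 8GB iPhone 15 Pro.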

But it's also fair to suggest that Apple has had one eye on upgrade cycles when developing its shiny new AI feature.

Analysis: A very convenient truth

There are certainly two sides to this debate about Apple Intelligence's exclusivity. Apple's explanation that on-device AI is computationally demanding and not possible on older devices is certainly true.

John Giannandrea explained further in the interview: "So these models, when you run them at run times, it's called inference, and the inference of large language models is incredibly computationally expensive. And so it's a combination of bandwidth in the device, it's the size of the Apple Neural Engine, it's the oomph in the device to actually do these models fast enough to be useful".

The question is more about where Apple decided to draw the line on which models can run Apple Intelligence in a "useful" way. Considering the standard iPhone 15 came out less than a year ago, it's disappointing that Apple didn't have the foresight to give it enough RAM to support its AI features. Cynics might say that was by design.

With the best phone cameras now so fully evolved that it's hard to add major new upgrades, there's also no doubt that AI is the next big poster feature to help spark a much-needed upgrade cycle. A few years ago, some studies suggested that the average iPhone upgrade time had increased from three years to four.

As the technological gap narrows between iPhones, and software support lengthens, Apple needs a new feature to convince us to trade in our old phones for new ones – and AI tricks conveniently do need a hardware boost when they're run on-device.

Whether Apple Intelligence is exciting enough for us to upgrade is another matter, but that's something we'll find out when it lands with iOS 18 and the iPhone 16 later this year.


If you've got an iPhone 15 Pro or iPhone 15 Pro Max, iOS 18 will enable any app on your handset to make use of the best camera trick that's exclusive to these devices: the ability to record ever-so-slightly 3D images and clips known as spatial photos and videos.

As reported by MacRumors, Apple announced a new API (application programming interface) at WWDC 2024 earlier this month, which means third-party app developers will be able to leverage the spatial features in the same way as the rest of the iPhone camera.

The technical name for the trick is stereoscopy, and it works by taking the same photo (or video) from two slightly different angles. You end up with something that has a little bit of depth to it, rather than being 2D and flat.

As the iPhone 15 Pro and iPhone 15 Pro Max have two cameras vertically aligned, they can capture spatial photos and videos – the iPhone 15 and iPhone 15 Plus can't, as the cameras on those phones are diagonally aligned (that might change with the iPhone 16).

Viewing photos and videos

Apple Vision Pro spatial video

You need a Vision Pro to look at spatial photos and videos (Image credit: Apple)

You can't actually view spatial photos and videos on an iPhone, what with it having a two-dimensional screen: you need an Apple Vision Pro so that you can move your head around, and we've been impressed by the experience. These files can also be viewed on other VR headsets, but not natively – you need to do some converting. Spatial videos can also be captured on a Vision Pro headset, as well as viewed.

With sluggish sales reported for Apple's mixed reality headset, developers might not be falling over themselves to get support for spatial photos and spatial videos built into their apps, but it is another selling point in a crowded field.

So far we haven't seen any photo or video apps implementing the feature, though iOS 18 is only at the developer beta stage right now. Given that the new API is closely integrated with the current iPhone camera APIs, it shouldn't be too hard to add support for it.

When September rolls around and we get more iPhones that can record in these formats, as well as a full release of iOS 18 for compatible devices, spatial photos and spatial videos might start to become more mainstream.


Now that the dust has settled on WWDC 2024's many iOS 18 announcements, we've been keen to find out which iPhone feature Apple fans are most excited about – so we set out to answer that very question in a new WhatsApp poll.

We asked the many followers in our WhatsApp channel (368,000 subscribers and counting) the simple question "which iOS 18 feature are you most excited about for your iPhone?". The answers give us a revealing glimpse of the features that you'll use the most when iOS 18 lands in September – and which could get ignored.

The poll also gives us a snapshot of how much of the TechRadar audience is on Android. In fact, the top answer in our poll (with 1,664 votes) was actually "I don't care (I'm on Android)".

But Android fans aside, a significant number of iPhone owners chose their favorite features from our list of iOS 18's biggest headlines. There wasn't much love for the upgrades to Apple Mail, the redesigned Photos app or, surprisingly, support for RCS messaging – which should make texting Android-owning friends a much better experience.

Still, here are the top three iOS 18 features that TechRadar's Apple fans are most excited about:

3. Texting via satellite (130 votes)

An iPhone on a blue green background showing satellite messaging in iOS 18

(Image credit: Apple)

The popularity of iOS 18's satellite texting feature was a slight surprise, given it'll be limited to those with the iPhone 14 and iPhone 15 series. But there's no denying how useful this extension on Emergency SOS could be for those who are caught in cellular dead zones.

If you've been disconnected from both cellular and Wi-Fi connections for a while, iOS 18 will give you an alert to ask if you want to use satellite connectivity. From there, you'll be able to use the Messages app to send or receive both SMS texts and iMessages. 

For now, the feature will be free for two years from the time you bought your iPhone 14 or iPhone 15, but Apple has said it plans to charge for the feature in future.

2. Customizable home screen (279 votes)

An iPhone on a blue-green background showing iOS 18

(Image credit: Apple)

We thought that iOS 18's advanced home screen customization (which has more than a hint of Android about it) would be the top choice in our poll. But it still came in at a respectable second place, with 9% of the total vote.

If you missed WWDC 2024, iOS 18 will let you do many things that Apple has steadfastly refused to allow on iPhone before, including letting you position apps at the bottom or side of the screen so you can see your lovely wallpaper behind. 

We'll also be able to hide app names (like the above), for a super-minimalist look. And there'll be the option of triggering new dark mode icons and adding color tints, too. Thanks Apple, it's about time.

1. The new Siri assistant (495 votes)

An iPhone on a blue-green background showing Siri

(Image credit: Apple)

Apple's big Siri upgrade was definitely one of the biggest stories of WWDC 2024, and you emphatically agreed. In total, 16% of the respondents to our WhatsApp poll said the new Siri assistant is the most exciting feature of iOS 18.

Unfortunately, the rebooted Siri – which will also be able to connect to ChatGPT, thanks to a deal with OpenAI – will be limited to the iPhone 15 Pro series, plus iPads and Macs with an M1 chip or later. It'll also no doubt be compatible with the iPhone 16 series in September.

But if you're lucky enough to have a recent iPhone, you'll get Siri's visual makeover, more natural conversation skills and the option of typing questions to Siri later this year. From next year, Siri will also get the ability to use "personal context" and on-screen awareness to become even more useful. 

Considering we've been waiting 13 years for Apple to deliver on Siri's original promise, we couldn't be happier – and you sound pretty excited about it, too. 


The Clicks keyboard case has arrived, and it’s delightful, if not entirely practical for everyday use — at least, not without weeks of practice. 


At WWDC 2024 earlier this week, Apple announced a whole host of AI features coming to its various software platforms, including an Image Playground AI art generator in iOS 18 – and now we've got some more information about how the app is going to work.

As reported by 9to5Mac, Apple VP of software engineering Craig Federighi said in an interview during WWDC that iOS 18 is going to "mark up metadata" so that image files come with details stating whether or not they're made by AI.

While the current developer beta of iOS 18 doesn't include any of the upcoming AI features – so no Image Playground app – some digging into the beta code by the 9to5Mac team found references to image "forensics" that would identify AI-generated files.

Put that together and it's clear that Apple has been thinking about how to encourage the responsible use of AI – which Apple is calling Apple Intelligence on its own devices – and how to stop faked pictures being passed off as real.

Apple and AI art

Image Playground app

Image Playground will be able to create AI art on demand (Image credit: Apple)

From what we know about Image Playground already, the app will pop up in various places – like Messages or Notes – whenever you need to create a new AI picture. It might be a custom emoji, for example, or an image of a friend in your choice of setting.

Pictures created by Image Playground will take the style of cartoons and illustrations rather than photos, so that's one way Apple is protecting against these tools being misused. If you're not sure what to create, prompt suggestions are included too.

These AI models have been trained on "licensed data" and "publicly available data" pulled from the web, Apple says. In other words, if your art is online, Apple Intelligence was probably trained on it – unless the site it was published on opted out of the process.

The Image Playground capabilities are going to be made available in a future beta, according to Apple, but out of the currently available iPhones only the iPhone 15 Pro and iPhone 15 Pro Max will have the necessary power to run it. These AI art generation features will also be available on Apple Silicon Macs.


Apple announced a lot of new features coming to iOS 18 at its WWDC 2024 event, but there wasn't time to cover everything – and additional upgrades continue to be revealed, including live video support for emergency SOS calls.

As described in Apple's press release for iOS 18, the new functionality is going to enable you to "share context through streaming video and recorded media". It means emergency dispatchers will have a better idea of what's happening and can provide help accordingly.

You can place an emergency call on an iPhone by pressing and holding the power (or side) button and either of the volume buttons together: a countdown will show on screen, and if you keep holding, the call will be placed.

So far we haven't seen the new video feature demoed – iOS 18 is currently only available as a developer beta – but presumably there will be options on screen during the call to start livestreaming or to send recorded video.

A major upgrade

iOS 18 overview

There's a lot to look forward to in iOS 18 (Image credit: Apple)

We can add live video support for emergency SOS calls to the long, long list of new features arriving in iOS 18. Some of the ones we're looking forward to the most include the ability to lock away apps and customize the look of the home screens.

The iOS 18 upgrade might also mean you can do away with your password manager and your current audio transcription service. Several features currently covered by third-party apps will soon be integrated into Apple's own software.

Then of course we have all the Apple Intelligence tools offered by iOS 18, including a major upgrade for Siri that should make it much smarter – though bear in mind that Apple Intelligence will only be available on two iPhone models to begin with.

Next month iOS 18 is going to enter the public beta phase, so more of us will be able to try it out, and then it should start rolling out to everyone in September – more or less at the same time as the iPhone 16 handsets are unveiled.

WWDC 2024 was all about software this year, with the big announcements being iOS 18 overhauling the iPhone’s operating system, macOS getting smart features, Siri gaining AI smarts, and the introduction of Apple Intelligence. But in terms of hardware, Apple had very little to say.

However, with Apple’s smartest iOS 18 features limited to the iPhone 15 Pro and iPhone 15 Pro Max, and Apple Intelligence offering consumer-focused AI tools, we can take a decent stab at what the iPhone 16, iPhone 16 Pro, and iPhone 16 Pro Max could bring to the table – likely at a September Apple event.

An intelligent smartphone 

Apple Intelligence features on stage at WWDC 2024

(Image credit: Future)

First, the next-generation Pro iPhones are surely going to lean on AI features via Apple Intelligence; we’re talking smarter ways of capturing and editing content, AI-assisted organization features such as setting agendas by crawling emails, and smarter web browsing with intelligent search features and web page summaries.

If these sound somewhat familiar, it's because the Google Pixel 8 and Samsung Galaxy S24 have similar generative AI features; for more, check out our guide on the best AI phones to buy right now.

However, as US Phones Senior Editor Philip Berne discussed in his Samsung Galaxy S24 Ultra review, some of the AI features in the best Android phones aren’t the most intuitive to access and calibrate. Going by how Apple tends to adopt existing technology, though, I’d bet that its integration of AI in the iPhone 16 range and iOS 18 – at least as it evolves out of its beta stage – will be deep, and done in a way that makes tools like smart photo editors feel natural to use rather than bolted on.

A Siri showcase 

iOS 18 Siri

(Image credit: Future)

And I expect Apple will make a big deal out of Siri and its new smart features at the iPhone 16’s launch; while the new Siri is coming to other phones, I’d not be surprised if the next-gen iPhones get some exclusive Siri-centric features.

The reason for that thinking is that the iPhone 16 Pro and iPhone 16 Pro Max will surely come with a more powerful A-series chip, likely calibrated for AI workloads, and thus be ready to power new smart features.

But I’ll go one step further and predict that the whole iPhone 16 range could get new chips to power on-board large language models for AI tools, all with the goal of spreading and democratizing the use of generative AI. 

Subtle tweaks

iPhone 16 dummy units leak

(Image credit: Weibo)

Now, I’m not expecting the design of the next-generation iPhones to be much different from what we have at the moment; Apple has arguably nailed its design language and made good use of titanium. But new chips that could be tasked with running demanding AI workloads for a good bit of time will likely need improved cooling to stop them from overheating in the compact iPhone chassis. That could mean internal design tweaks for the iPhone 16 and maybe even changes in the phones’ size and thickness. 

And with more powerful silicon comes the potential for more power drain, so I’d not be surprised if the iPhone 16 models come with larger batteries than their predecessors and thus have a bit of extra bulk; expect the titanium chassis of the current Pro iPhones to trickle down to the standard and Plus next-gen models in order to mitigate any potential increase in weight. 

Ultimately, WWDC didn’t shed much light on what to expect from the iPhone 16 family. But I’d bet a decent sum of money – if I had some – on AI features being front and center at Apple’s next iPhone keynote. I could see that also working in tandem with the rumored Apple Watch 10.

We’ll have to play the wait-and-see game for a few months, but with Apple Intelligence grabbing headlines, Cupertino’s smartphones are only going to get smarter. 
