iOS 17 features that require at least an iPhone 12

Several new operating systems were unveiled by Apple during last week's WWDC23 keynote, including iOS 17 for the iPhone. However, not all of the new features and settings presented are intended for all compatible models, though. Although all models from 2018 onward, including the iPhone XR and iPhone XS, can be equipped with iOS 17, some features will only be usable from the iPhone 12, or its Pro version from 2020, onward. Here you will find a corresponding overview.

As of iOS 17, AR effects can be triggered with gestures, e.g. in FaceTime. However, at least an iPhone 12 is required for this. You can find out here which other features of the new operating system are also not available for all compatible devices.

iOS 17 features not working with all iPhones

iOS 17 brings a number of innovations and improved functions to the iPhone. For example, you can use the Apple smartphone horizontally at a charging station as an information and control dock for the smart home. There will be a journal or diary app; Contact Posters display pictures or Memojis of contacts; Check In ensures more safety on the way home; and more. But when it comes to the camera, there are a few model restrictions:

  • The gesture reactions, in which AR effects such as fireworks or confetti are triggered with hand signals, require at least the iPhone 12. On the Mac, this requires Apple silicon, or an iPhone 12 or newer used as a webcam.
  • The suggestions for word and sentence completions require at least an iPhone 12 and are only available in English for the time being.
  • The Point and Speak accessibility feature, which lets you aim the iPhone camera at an object with buttons, control panels or other content, point at an element and then have it read out or described, requires a Pro model from 2020 onward, i.e. iPhone 12 Pro (Max), iPhone 13 Pro (Max) or iPhone 14 Pro (Max). This is because the LiDAR scanner is required for it.

Possible reason: the Neural Engine only delivers enough performance from the A14 onward

In addition to the LiDAR scanner, which was first used in the iPhone 12 Pro (Max) and which is required for the accessibility feature mentioned above, the SoC is probably a reason for the limitations listed. The algorithms and processes for augmented reality, which power the gesture reactions, are calculated on the Neural Engine. It was first introduced with the A11 Bionic chip in 2017, but apparently only the A14 Bionic chip from 2020 provides enough computing power for gesture effects and sentence completion. The Neural Engine of the A14 is said to be 80% more powerful than that of the A13.

My tips & tricks about technology & Apple

Did you like the article, and did the instructions on the blog help you? Then I would be happy if you supported the blog via a Steady membership.


