Don't Fret – Intel Macs Will Still Get This M1-Exclusive macOS 12 Feature – iDrop News

When Apple unveiled macOS 12 Monterey last month, it drew a line in the sand between its new leading-edge M1-powered Macs and those models still using Intel chips. Now, however, it appears that this line has been blurred, at least a bit.

There are at least six big new features in macOS Monterey that were expected to require an M1 Mac, but now the latest macOS 12 beta is bringing at least one of them to Intel-flavoured Macs too: Live Text in Photos.

The change comes in the fourth macOS 12 beta, released to developers earlier this week, which was quickly followed by a third public beta of the same build. There wasn't much else of interest in the latest macOS beta: Apple briefly suggested that Universal Control had finally been enabled, but sadly, that turned out to be premature.

However, buried in the release notes was an oblique confirmation that Live Text was quietly being added to Intel Macs as well:

Live Text now works across all Mac computers that support macOS Monterey.

Apple didn't mention the word "Intel" in the release notes at all, but "all Mac computers" naturally includes the many Intel variants that Apple has sold in the past and still sells today.

As Rene Ritchie theorizes, Apple seems to have changed course and added Live Text based on demand, but it also helps that Live Text on the Mac doesn't need to run in real time the way it does on the iPhone and iPad.

Even though most of Apple's Macs feature a FaceTime camera, even the M1-powered models won't let you view Live Text through the camera. This seems reasonable considering that the MacBook and iMac cameras are designed for things like video conferencing, and most users aren't likely to point them at signs and receipts the way they would with an iPhone.

Of course, you'll still be able to extract Live Text from pictures captured with the Mac's FaceTime camera; you just won't be able to do it while looking at a live preview of the image. Instead, you'll have to load the shot up in Photos or Preview.

As Ritchie explains, this means that Live Text on the Mac doesn't really need the M1's Neural Engine. While M1-powered models will almost certainly still leverage it for even faster processing, Intel versions can simply handle the work opportunistically. This means it might run a bit slower on older Intel Macs, but at least it will be possible.

The real benefit of the new Live Text feature will be found in iOS 15 on the iPhone, where users will be able to extract text right through the iPhone's Camera app.

It's a magical feature, and it doesn't just work in the Camera and Photos apps, either. It's a system-level feature that should work for any photo you're viewing, anywhere on your device. There are already a few exceptions in apps that don't use the standard iOS photo APIs, like Facebook and Instagram, but that may simply be a matter of waiting for those apps to be updated.

Most significantly, Apple is doing all the processing directly on the A-series or M-series chip in your iPhone, iPad, or Mac. Your photo data never leaves your device for this purpose, and Apple's servers don't analyze your photos for text at all. This is similar to the on-device photo processing for things like faces and objects that Apple has been doing since iOS 10 in 2016.

Of course, if you sync your photos with iCloud Photo Library, then your photos will be stored on Apple's servers, but the key is that Apple doesn't perform any computational analysis on the photos stored there. Every other service that does this kind of analysis relies on powerful cloud servers to do the heavy lifting, meaning that all the text in your photos ends up in a database on that company's servers. Apple's approach is a massive win for privacy.

However, this requirement for on-device processing is precisely why Live Text wasn't originally going to be supported on Intel Macs. Apple presumably felt that the Intel chips weren't up to the task, and more importantly, the code to handle this was built for Apple's own Neural Engine, which is found only in the 2017 A11 Bionic and later A-series chips, and of course the new M1 chip that powers Apple's latest Macs. This is the same reason you'll need at least an iPhone XS/XR to use Live Text in iOS 15.

In the case of Intel Macs, however, Apple evidently realized that it could batch-process photos for Live Text rather than needing to do it in real time. It's not yet clear exactly how this approach differs under the hood (it's likely Apple is analyzing photos in the background and pre-storing the text for later), but the result is the same: there will be one more feature for Intel Mac users to enjoy when macOS Monterey launches later this year.
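The "analyze in the background, store the text for later" idea described above can be sketched in a few lines. This is purely illustrative: the `recognize_text` stub and the `LiveTextIndex` class are hypothetical names standing in for Apple's actual on-device text-recognition pipeline, which is not public.

```python
from dataclasses import dataclass, field

def recognize_text(photo_bytes: bytes) -> str:
    # Hypothetical stand-in for the real OCR step; Apple's actual
    # implementation runs a text-recognition model on-device.
    return photo_bytes.decode("utf-8", errors="ignore")

@dataclass
class LiveTextIndex:
    """Sketch of opportunistic batch indexing: scan each photo once,
    cache the recognized text, and serve lookups from the cache."""
    _cache: dict = field(default_factory=dict)

    def scan_pending(self, library: dict) -> None:
        # Batch step: run the slow OCR pass only on photos
        # that haven't been indexed yet.
        for photo_id, data in library.items():
            if photo_id not in self._cache:
                self._cache[photo_id] = recognize_text(data)

    def text_for(self, photo_id: str) -> str:
        # Lookup is instant once the batch pass has run,
        # so no real-time recognition is needed.
        return self._cache.get(photo_id, "")

library = {"receipt.jpg": b"Total: $42.00", "sign.jpg": b"Main St"}
index = LiveTextIndex()
index.scan_pending(library)
print(index.text_for("receipt.jpg"))  # -> Total: $42.00
```

The point of the split is that the expensive recognition step doesn't have to happen the instant a user selects text; an Intel CPU can chew through the library whenever it's idle, while a Neural Engine could do the same work much faster or on demand.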

