The iPhone 16 series has arrived, featuring all-new cameras, Apple Intelligence, larger batteries and displays, and even a new Camera Control system. According to Apple, this system is “a result of thoughtful hardware and software integration.” Out of all the updates and announcements at the event, Camera Control stood out as the most intriguing to me, as it fundamentally changes how you take pictures on your iPhone, right down to launching the camera. Let’s take a closer look at how the new system works and whether it lives up to the hype.
iPhone 16 Series Debuts ‘Camera Control’: What It Does
Simply put, Camera Control is Apple’s attempt to bridge the gap between professional DSLR and mirrorless cameras and smartphones, and that shows in how the control operates. Want to zoom in on a subject? Slide your finger across the capacitive touch surface. Want to adjust the depth of field? That’s possible too.
The Camera Control button comprises several elements. Apple says it features a tactile switch that “powers the click experience,” a force sensor that enables light presses, and a capacitive touch surface that facilitates interactions like zooming. By sliding your finger along the control, Apple claims, you can adjust creative parameters such as zoom, exposure, or depth of field.
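For developers, the iOS 18 SDK exposes Camera Control to third-party camera apps through new AVFoundation capture controls. The sketch below is a minimal illustration, assuming the AVCaptureSystemZoomSlider and AVCaptureSlider APIs from Apple’s documentation, of how an app might attach the system zoom gesture plus a custom, hypothetical “Warmth” parameter to the control; treat it as a rough outline rather than a drop-in implementation.

```swift
import AVFoundation

// A minimal sketch of wiring Camera Control into a capture session (iOS 18 SDK).
// Assumes a configured AVCaptureSession and camera device; error handling trimmed.
final class CameraController: NSObject, AVCaptureSessionControlsDelegate {
    let session = AVCaptureSession()

    func configureControls(for device: AVCaptureDevice) {
        // Bail out on hardware without Camera Control.
        guard session.supportsControls else { return }

        session.beginConfiguration()

        // System zoom slider: sliding on Camera Control drives the device's zoom factor.
        let zoom = AVCaptureSystemZoomSlider(device: device)
        if session.canAddControl(zoom) { session.addControl(zoom) }

        // A custom slider for an app-defined parameter (hypothetical "Warmth" control).
        let warmth = AVCaptureSlider("Warmth", symbolName: "thermometer", in: 0...1)
        warmth.setActionQueue(.main) { value in
            // Apply the value to the app's rendering pipeline here.
            print("Warmth set to \(value)")
        }
        if session.canAddControl(warmth) { session.addControl(warmth) }

        session.setControlsDelegate(self, queue: .main)
        session.commitConfiguration()
    }

    // Delegate callbacks fire as the Camera Control overlay appears and disappears.
    func sessionControlsDidBecomeActive(_ session: AVCaptureSession) {}
    func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) {}
    func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) {}
    func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) {}
}
```

Notably, the system draws the control overlay itself; the app only registers which controls it supports and reacts to the values that come back.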
If you’re patient enough to wait until autumn, Apple will also introduce a two-stage shutter function: light press the button to lock focus and exposure, something that has been part of the rumour mill for quite some time. This could be a game changer for someone like me, as it emulates the half-press shutter button of a “real” camera. Based on what I heard in the presentation, the light press will be complemented by haptic feedback, and I can only imagine how satisfying that will feel in practice.
Camera Control Also Enables Visual Intelligence
Camera Control isn’t limited to the camera app itself; you can point it at real-world objects and instantly get information about them, much like Google’s Gemini integration on Pixel devices. A different take on Circle to Search, perhaps? Apple calls this Visual Intelligence, and it appears to be part of the broader Apple Intelligence feature set. “Users can click and hold Camera Control to pull up the hours or ratings for a restaurant they pass, add an event from a flyer to their calendar, quickly identify a dog by breed, and more,” Apple said when announcing the feature.
If you’re into the Action button and Siri Shortcuts scene, you might already be familiar with custom shortcuts that accomplish some of this at the press of a button. Still, it’s nice to see an official implementation that leverages Apple’s own AI rather than relying on Google Lens.