
Apple is reportedly preparing a major AI upgrade for iOS 27 that could give Siri a new camera-based mode for real-time visual understanding. The feature would let users point their iPhone at objects, text, or places and get instant responses directly from Siri, Bloomberg reports. It builds on Apple’s earlier visual intelligence efforts but aims to make them faster, live, and more deeply integrated into the system. The update is expected to be announced at WWDC 2026 and released later that year.
The proposed ‘Siri Camera Mode’ is designed to go beyond basic recognition by turning Siri into a real-time visual assistant that can interact with external AI services. Users may be able to point their camera at an object and directly query tools like ChatGPT for deeper explanations, while also having the option to run a Google Reverse Image Search to gather broader information from the web. Such a layered approach suggests the tech titan is not limiting itself to a single AI model, but instead building a flexible system that can pull insights from multiple sources depending on the task.
The feature builds on Apple’s existing Visual Intelligence tools, which already help iPhones understand things in the real world. For example, users can scan a concert poster and quickly turn it into a calendar event, identify plants and animals with the camera, or look up details about a business such as its phone number, opening hours, and menu. At present, however, these features feel separate and limited. With iOS 27, the Tim Cook-led firm is expected to bring them together into a smoother, faster experience in which Siri can continuously understand what the camera sees and respond immediately.
Apple is also redesigning how this feature looks and operates within the camera app. One notable change is the replacement of the traditional white shutter button – used in the current Visual Intelligence interface – with a new capture control styled around the Apple Intelligence branding. Hardware integration will play a key role as well. The feature will continue to be accessible through the Camera Control button introduced on the side of the iPhone 16. Instead of launching a separate visual lookup interface, pressing and holding this button is expected to open the new Siri-powered camera mode directly within the camera app.
Beyond recognition and search, the company is expanding the practical use cases of this system. One addition includes the ability to scan nutrition labels on packaged food and automatically log dietary information, potentially integrating with health and fitness apps. Another capability involves capturing contact details from physical objects and instantly saving them to the user’s contacts.
The visual intelligence upgrade is also closely tied to improvements in Apple’s imaging and photo-editing capabilities. Reports suggest iOS 27 will introduce advanced AI tools that let users extend images beyond their original frame, adjust composition after capture, and automatically enhance lighting, colour, and detail. Object removal is also expected to become more precise, competing with similar tools already available on rival platforms. At the same time, Siri itself is expected to undergo a major redesign. Instead of appearing as a separate, full-screen interface, the assistant may be embedded more deeply in the system UI, including the Dynamic Island and other on-screen elements.