iOS 26
Published: 09/15/2025
Apple is set to release iOS 26 today, September 15, 2025, marking the most significant iPhone software redesign in over a decade with its new "Liquid Glass" design language that adds translucent effects throughout the interface. The update brings expanded Apple Intelligence features, enhanced communication tools, and improvements to core apps, available for iPhone 11 and newer models starting around 10 a.m. Pacific Time.
Liquid Glass Design Language

The Liquid Glass design language represents Apple's most ambitious visual overhaul since iOS 7, introducing translucent materials that dynamically reflect and refract their surroundings while adapting to keep the focus on content within controls, elements, icons, and widgets. The design system draws inspiration from the Vision Pro headset, bringing real-time rendering with specular highlights that make the interface feel alive and responsive across all Apple devices. (reddit.com, aol.com, dev.to)
Key visual elements include frosted glass backgrounds for the Dock, translucent widgets such as Calendar and Weather, and redesigned tab bars that minimize while scrolling to keep content visible. The effect combines multiple layers of transparency using backdrop filters, subtle blur effects, and carefully positioned highlights that simulate light flowing across glass surfaces. While the implementation varies by device, appearing more subtle on the Mac's larger displays than on iPhone, the design creates a cohesive, premium aesthetic that maintains usability while adding depth and sophistication to everyday interactions. (aol.com, theverge.com, chameleon.io, reddit.com)
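Liquid Glass ships as part of the system rather than as a drop-in API, but developers can approximate the layered look with SwiftUI's long-standing Material styles. The sketch below is illustrative only, assuming a hypothetical GlassCard view and arbitrary styling values; it mimics the frosted-backdrop-plus-specular-highlight recipe described above rather than invoking Apple's actual Liquid Glass renderer.

```swift
import SwiftUI

// A minimal sketch of a frosted, translucent card in the spirit of
// Liquid Glass, built from SwiftUI's public Material API (iOS 15+).
// GlassCard and its styling values are illustrative assumptions.
struct GlassCard: View {
    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            Text("Weather").font(.headline)
            Text("72° Clear").font(.largeTitle.weight(.semibold))
        }
        .padding(20)
        // Frosted-glass backdrop: blurs and tints whatever sits behind it.
        .background(.ultraThinMaterial, in: RoundedRectangle(cornerRadius: 24))
        // Faint edge stroke standing in for a specular highlight on glass.
        .overlay(
            RoundedRectangle(cornerRadius: 24)
                .strokeBorder(.white.opacity(0.25), lineWidth: 1)
        )
        // Soft shadow to lift the pane off the content beneath it.
        .shadow(color: .black.opacity(0.15), radius: 12, y: 6)
    }
}
```

Stacking the material fill, the highlight stroke, and the soft shadow is what produces the sense of depth the redesign leans on; the real system material goes further with live refraction of the content behind it.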
Sources: reddit.com, aol.com, dev.to, theverge.com, chameleon.io, computerhardwareinc.com, macrumors.com, tapptitude.com, appleinsider.com, martiancraft.com
Live Translation in Messages

Messages introduces automatic real-time translation that converts text as you type and instantly translates incoming replies, enabling seamless conversations across language barriers without switching apps. The feature runs entirely on-device using Apple Intelligence models, keeping private conversations secure, and supports nine languages: English, Chinese (Simplified), French, German, Italian, Japanese, Korean, Portuguese (Brazil), and Spanish. (techcrunch.com, idownloadblog.com, youtube.com)
Setting up live translation requires downloading offline language packs through Settings > Apps > Translate, then enabling "Automatically Translate" within individual conversations in Messages. Users can choose to display both original and translated text or view only translations, and the system shows a small "Translating Language" pill when the feature is active. While the system handles SMS, MMS, and iMessage seamlessly, it currently supports translation between only two languages at a time; multilingual conversations require manually switching language pairs as needed. (idownloadblog.com)
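Messages' translation layer has no third-party hook, but the same on-device pipeline is reachable through Apple's public Translation framework on iOS 18 and later. The sketch below is a hedged illustration: IncomingMessageView and the hard-coded Spanish-to-English pair are assumptions for the example, not how Messages itself is wired.

```swift
import SwiftUI
import Translation  // Apple's on-device translation framework (iOS 18+)

// A minimal sketch of on-device translation for an incoming message.
// IncomingMessageView and the fixed language pair are assumptions;
// Messages' own integration is not exposed to third-party apps.
struct IncomingMessageView: View {
    let original: String                 // e.g. an incoming message body
    @State private var translated: String?

    var body: some View {
        VStack(alignment: .leading, spacing: 4) {
            Text(original)
            if let translated {
                Text(translated).foregroundStyle(.secondary)
            }
        }
        // Kicks off when the view appears; the system prompts to download
        // the offline language pack if it is not installed yet.
        .translationTask(
            source: Locale.Language(identifier: "es"),
            target: Locale.Language(identifier: "en")
        ) { session in
            translated = try? await session.translate(original).targetText
        }
    }
}
```

Because the model and language packs live on the device, the text never leaves the phone, which matches the privacy framing of the Messages feature.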
Sources: techcrunch.com, idownloadblog.com, youtube.com, macworld.com, 9to5mac.com, apple.com, reddit.com
Visual Intelligence Features

Visual Intelligence expands beyond camera-based functionality to include comprehensive screenshot analysis, bringing AI-powered object recognition and search capabilities directly to captured screen content. Users can take a screenshot and access Visual Intelligence through the screenshot interface, where the system identifies text, objects, dates, and other elements for immediate action, from adding events to calendars to conducting Google image searches for specific items within the capture. (macrumors.com)
The feature introduces "Highlight to Search," allowing users to draw around specific objects in screenshots for targeted searches, similar to Android's Circle to Search functionality. Beyond basic identification, Visual Intelligence now recognizes art, books, landmarks, natural landmarks, and sculptures in addition to its existing plant and animal recognition. When an object is successfully identified, a glowing icon appears that provides detailed information without requiring external services, as this recognition happens entirely on-device. For more complex queries, users can tap "Ask" to send questions about screenshot content to ChatGPT, or use "Search" for Google-powered results. The functionality requires an iPhone 15 Pro or newer, leveraging the A17 Pro chip's processing power for real-time analysis. (smartsight.in, macrumors.com)
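Visual Intelligence itself has no public API, but the class of on-device recognition it builds on is available to any app through the Vision framework. The sketch below, with a hypothetical analyzeScreenshot helper and an arbitrary confidence cutoff, shows text recognition and coarse content classification on a captured image; it illustrates the technique, not Apple's actual Visual Intelligence pipeline.

```swift
import UIKit
import Vision

// A minimal sketch of on-device screenshot analysis with the Vision
// framework. The analyzeScreenshot helper and the 0.3 confidence
// threshold are illustrative assumptions.
func analyzeScreenshot(_ image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    // Pull out any text in the capture (dates, names, addresses, ...).
    let textRequest = VNRecognizeTextRequest { request, _ in
        let lines = (request.results as? [VNRecognizedTextObservation])?
            .compactMap { $0.topCandidates(1).first?.string } ?? []
        print("Text found:", lines)
    }

    // Classify the overall image content (animals, plants, objects, ...).
    let classifyRequest = VNClassifyImageRequest { request, _ in
        let labels = (request.results as? [VNClassificationObservation])?
            .filter { $0.confidence > 0.3 }
            .map(\.identifier) ?? []
        print("Likely contents:", labels)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage)
    try? handler.perform([textRequest, classifyRequest])
}
```

Both requests run entirely on-device, mirroring the point above that identification happens without calling out to external services.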
Sources: macrumors.com, smartsight.in, reddit.com, 9to5mac.com, apple.com, youtube.com, mashable.com