Apple kicked off its annual developer conference, WWDC 2025, today (June 9). At the event, the company unveiled Live Translation, a feature that enables real-time language translation across the Messages, FaceTime, and Phone apps. The capability is powered by Apple-built models that run entirely on-device as part of Apple Intelligence, ensuring private, seamless communication without relying on external servers.
WWDC 2025: How live translation works
- In the Messages app, text will be automatically translated as users type, delivering messages directly in the recipient's preferred language. Responses will also be instantly translated back, allowing for fluid, back-and-forth communication.
- For FaceTime calls, the feature introduces live caption translation. This enables users to follow along with translated text that appears on screen, while still hearing the original speaker's voice. This dual approach aims to preserve the authenticity of the conversation while providing immediate comprehension.
- Live Translation extends to traditional Phone calls, offering spoken translation throughout the conversation. This means users can engage in real-time verbal exchanges with individuals speaking different languages, with the AI facilitating the translation on the fly.
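The Messages flow described above — an outgoing message translated as it is sent, and the reply translated back — amounts to a round trip between two languages. The sketch below illustrates that flow with a toy phrase table standing in for an on-device model; the `ToyTranslator` type, its method names, and the phrase pairs are all illustrative assumptions, not Apple's actual Translation API.

```swift
// Toy round-trip translation, mimicking the Messages flow:
// outgoing text goes into the recipient's language, replies come back.
struct ToyTranslator {
    // Tiny hard-coded English-to-Spanish phrase table standing in
    // for an on-device model (illustrative assumption only).
    let enToEs: [String: String] = [
        "hello": "hola",
        "see you soon": "hasta pronto"
    ]

    // Translate an outgoing English message into Spanish.
    func translateOutgoing(_ text: String) -> String {
        enToEs[text.lowercased()] ?? text
    }

    // Translate an incoming Spanish reply back into English
    // by inverting the phrase table.
    func translateIncoming(_ text: String) -> String {
        let esToEn = Dictionary(uniqueKeysWithValues: enToEs.map { ($1, $0) })
        return esToEn[text.lowercased()] ?? text
    }
}

let translator = ToyTranslator()
print(translator.translateOutgoing("Hello"))        // hola
print(translator.translateIncoming("hasta pronto")) // see you soon
```

Because the lookup happens in local code rather than over a network, nothing leaves the device — the same privacy property Apple claims for its production models.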
Apple emphasised that Live Translation runs entirely on-device, ensuring that personal conversations remain private. The feature is designed to break down language barriers and make global communication more accessible. Live Translation will roll out with iOS 26, expected later this year, and will support multiple languages at launch.