iOS 18 is about to launch with a host of powerful AI features, as Apple pushes for a breakthrough in the artificial intelligence race.
Apple is said to be racing to complete revolutionary features for iOS 18, the next version of the iPhone's operating system. So when will the platform reach users? iOS 18 is expected to launch officially in September 2024, after being introduced at Apple's Worldwide Developers Conference (WWDC) on June 10.
iOS 18 is expected to launch in September this year.
One of the highlights of iOS 18 is its deep integration of AI. According to multiple sources, Apple is collaborating with OpenAI, the company behind ChatGPT, to bring advanced AI experiences to iPhone users.
Integrating AI into iOS 18 is expected to help Apple catch up with rival Samsung, which introduced Galaxy AI on the Galaxy S24 earlier this year. Apple CEO Tim Cook has also expressed confidence that Apple holds a competitive advantage in AI and that its work will "resonate" in the near future.
One reason Apple announced the M4 chip earlier than expected is that it carries a more powerful neural engine designed for AI workloads. The M4 debuts in the new iPad Pro line, available from May 15.
However, partnering with OpenAI brings its own challenges, especially around data privacy. Large language models (LLMs) like the one behind ChatGPT demand so much compute that running everything on the device is impractical, so Apple will likely run some of its upcoming AI features in data centers equipped with its own processors.
Apple has also just revealed a set of valuable accessibility improvements coming to users in iOS 18.
In a recent press release, Apple revealed a series of impressive accessibility features for iPhone and iPad, expected to arrive in the iOS 18 and iPadOS 18 updates later this year.
The most notable of the newly introduced features is "Eye Tracking", which lets users control the device with eye movements alone. It is supported on both iPhone and iPad, is powered by artificial intelligence (AI), and requires no additional hardware or accessories.
The feature is set up in a few seconds using the front camera, and all collected data is processed directly on the device through machine learning, so it never leaves the iPhone or iPad.
Users will soon be able to control iPhone/iPad through eye gestures.
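Apple has not said how the system-level Eye Tracking feature is built, but the building blocks for on-device gaze estimation already exist in the public ARKit face-tracking API. The sketch below is purely illustrative and assumes a device with a TrueDepth front camera; the `GazeEstimator` class and `onGaze` callback are hypothetical names, and mapping the gaze vector to an actual screen location is left out.

```swift
import ARKit

// Conceptual sketch only: derive a gaze estimate locally from the front camera
// using ARKit face tracking. This is NOT Apple's Eye Tracking implementation.
final class GazeEstimator: NSObject, ARSessionDelegate {
    private let session = ARSession()

    /// Called with the estimated gaze target in the face anchor's coordinate space (metres).
    var onGaze: ((simd_float3) -> Void)?

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // lookAtPoint estimates where the two eyes converge; projecting this onto
        // the display plane to drive UI selection is omitted from this sketch.
        onGaze?(face.lookAtPoint)
    }
}
```

Because ARKit's face tracking runs entirely on the device, an approach along these lines keeps the raw camera data local, consistent with Apple's privacy claim.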
Besides Eye Tracking, the company also introduced "Music Haptics" for iPhone, which uses the Taptic Engine to play taps, textures and vibrations that follow the rhythm and melody of the song being played. The feature is built into Apple Music and will be opened to other apps through an API.
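Apple has not yet documented the Music Haptics API it plans to offer developers, but the underlying idea, beat-aligned vibration on the Taptic Engine, can be sketched with the existing Core Haptics framework. The `BeatHaptics` class below is a hypothetical illustration, not the Music Haptics API itself.

```swift
import CoreHaptics

// Minimal sketch of rhythm-driven haptics using the public Core Haptics API.
final class BeatHaptics {
    private var engine: CHHapticEngine?

    init() throws {
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try CHHapticEngine()
        try engine?.start()
    }

    /// Plays one transient tap per beat time (in seconds), e.g. derived from a song's tempo.
    func play(beatTimes: [TimeInterval], intensity: Float = 1.0) throws {
        let events = beatTimes.map { time in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: intensity),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
                ],
                relativeTime: time
            )
        }
        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    }
}

// Hypothetical usage: a tap on every beat of a 120 BPM track for the first 4 bars.
// let haptics = try BeatHaptics()
// try haptics.play(beatTimes: stride(from: 0.0, to: 8.0, by: 0.5).map { $0 })
```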
Apple is also preparing a "Vehicle Motion Cues" feature to help reduce motion sickness when using an iPhone or iPad in a moving vehicle. It displays animated dots along the edges of the screen that follow the vehicle's movement, easing the sensory conflict between what the eyes see and what the body feels.
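Apple has not described how Vehicle Motion Cues is implemented, but conceptually it couples motion-sensor input to on-screen visuals. The sketch below uses the public CoreMotion API to sample device acceleration and forward it to a hypothetical `MotionCueRenderer`; the protocol and the gain value are assumptions made purely for illustration.

```swift
import CoreMotion

// Hypothetical renderer: something that can shift edge dots by an (x, y) offset in points.
protocol MotionCueRenderer {
    func updateDots(offsetX: Double, offsetY: Double)
}

// Conceptual sketch: map device acceleration to dot movement, roughly the idea
// behind motion cues. This is not Apple's implementation.
final class VehicleMotionCues {
    private let motionManager = CMMotionManager()

    func start(renderer: MotionCueRenderer) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0  // sample at 60 Hz

        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let accel = motion?.userAcceleration else { return }
            let gain = 40.0  // assumed scale factor from g-units to screen points
            renderer.updateDots(offsetX: accel.x * gain, offsetY: accel.y * gain)
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```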
CarPlay is also gaining new accessibility features, including Voice Control, Sound Recognition to alert deaf or hard-of-hearing users to car horns and sirens, and Color Filters for people with color blindness.