
Why didn’t Apple unveil AI at WWDC 2023?

Apple unveiled a lot at WWDC 2023, but new AI features were not among the announcements.

Updated: Jun 6, 2023 9:22 am


Apple’s WWDC 2023 keynote was a huge event packed with surprises. That said, it left anyone with an itch for new AI technology a little unsatisfied. So why didn’t Apple unveil AI at WWDC 2023?


Why didn’t Apple unveil AI at WWDC 2023?

It’s possible that Apple isn’t currently working on any new AI projects. With everything they have going on, such as the changes in iOS 17, the Vision Pro, and the new MacBooks, it could be that they’re focusing on those areas for now.

It’s also possible that any AI developments they’re working on are not ready to be announced publicly yet. Aside from the ongoing improvements to Siri, we don’t have any information about Apple’s current AI endeavors.

Developing AI is a complex, time-consuming process that can take years, even for a company as large as Apple. Apple was a pioneer in the AI voice assistant space with Siri, but that doesn’t necessarily mean it has something new in the works right now. Then again, there could be projects underway that we simply aren’t aware of.

What is clear is that Apple continues to make device- and software-level improvements to AI and machine learning with every update.


Apple’s machine-learning technology

When the iOS segment of the WWDC keynote started, there was a lot of talk about Apple’s on-device machine learning. Machine learning is a kind of AI, just not presented in the chatbot-style form we’re used to seeing today.

Apple mentioned the term “transformer” in relation to artificial intelligence (AI). Specifically, they discussed a “transformer language model,” which means their AI model uses the transformer architecture. This architecture has been instrumental in advancing generative AI technologies like the DALL-E image generator and the ChatGPT chatbot.

A transformer model, introduced in 2017, is a type of neural network used in natural language processing (NLP). It employs a self-attention mechanism, which lets it weigh the importance of different words or elements in a sequence relative to one another. Because attention over a whole sequence can be computed in parallel rather than word by word, as in earlier recurrent models, transformers are far more efficient to train and run, which has facilitated breakthroughs in NLP tasks such as translation, summarization, and question-answering.
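
To make the idea a little more concrete, here is a toy sketch of scaled dot-product self-attention in Swift. This is purely illustrative; it is not Apple’s implementation, and real models use optimized tensor libraries rather than plain arrays, but it shows how each word’s output becomes a weighted mix of information from every other word in the sequence.

```swift
import Foundation

// Naive matrix multiply over plain nested arrays, just for illustration.
func matmul(_ a: [[Double]], _ b: [[Double]]) -> [[Double]] {
    let rows = a.count, inner = b.count, cols = b[0].count
    var out = Array(repeating: Array(repeating: 0.0, count: cols), count: rows)
    for i in 0..<rows {
        for k in 0..<inner {
            for j in 0..<cols {
                out[i][j] += a[i][k] * b[k][j]
            }
        }
    }
    return out
}

// Softmax turns raw scores into weights that sum to 1.
func softmax(_ row: [Double]) -> [Double] {
    let maxVal = row.max() ?? 0
    let exps = row.map { exp($0 - maxVal) }
    let sum = exps.reduce(0, +)
    return exps.map { $0 / sum }
}

// queries, keys, values: one row per token in the sequence.
func selfAttention(queries: [[Double]], keys: [[Double]], values: [[Double]]) -> [[Double]] {
    let dK = Double(keys[0].count)
    // Transpose the keys so we can score every query against every key.
    let keysT = (0..<keys[0].count).map { j in keys.map { $0[j] } }
    // Attention weights: how strongly each token attends to every other token.
    let weights = matmul(queries, keysT).map { row in
        softmax(row.map { $0 / sqrt(dK) })
    }
    // Each output row is a weighted mix of the value vectors.
    return matmul(weights, values)
}
```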

Apple’s new transformer model in iOS 17 brings sentence-level autocorrections. When you press the space bar, it can complete a word or an entire sentence. Moreover, it learns from your writing style, providing personalized suggestions.

Apple’s on-device AI processing is made possible by a specialized part of their Apple Silicon chips, known as the Neural Engine. This component, present in Apple chips since the A11 in 2017, is designed to accelerate machine learning applications. Additionally, Apple mentioned that dictation now employs a new transformer-based speech recognition model, which utilizes the Neural Engine to enhance accuracy.
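
For developers, this is mostly invisible: Core ML decides where a model runs. As a rough illustration (the `SentenceCompletionModel` class name below is a placeholder for an Xcode-generated model class, not an Apple API), an app can ask Core ML to use all available compute units, and eligible layers will be scheduled onto the Neural Engine on supported devices:

```swift
import CoreML

// `SentenceCompletionModel` is a placeholder for a class Xcode generates
// from your own .mlmodel file; it is not an Apple-provided API.
let configuration = MLModelConfiguration()

// .all lets Core ML choose between CPU, GPU, and the Neural Engine;
// .cpuAndNeuralEngine (iOS 16+) restricts it to CPU and Neural Engine only.
configuration.computeUnits = .all

do {
    let model = try SentenceCompletionModel(configuration: configuration)
    // Run predictions with `model` as usual; Core ML handles device placement.
} catch {
    print("Failed to load model: \(error)")
}
```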

So Apple is still pushing its AI work forward, just not in the headline-grabbing way many of us were expecting, at least for the moment.


Jack is a tech and news writer with extensive knowledge of CPUs, motherboards, and computer technology.
