Apple has reportedly grown its computing budget for building artificial intelligence to millions of dollars a day. The Information:

One of its goals is to develop features such as one that allows iPhone customers to use simple voice commands to automate tasks involving multiple steps, according to people familiar with the effort. The technology, for instance, could allow someone to tell the Siri voice assistant on their phone to create a GIF using the last five photos they’ve taken and text it to a friend. Today, an iPhone user has to manually program the individual actions.

The moves come four years after Apple’s head of AI, John Giannandrea, authorized the formation of a team to develop conversational AI, known as large language models, before the technology became a focus of the software industry, according to people with knowledge of the team. That move now seems prescient following the launch last fall of ChatGPT, a chatbot that catalyzed a boom in language models.

Although Giannandrea has repeatedly expressed skepticism to colleagues about the potential usefulness of chatbots powered by AI language models, the fact that Apple wasn’t completely unprepared for the language model boom could be considered an accomplishment — and is the result of changes he made to the company’s software research culture, several colleagues said.
