> Apple will deliver some of its upcoming artificial intelligence features this year through data centers powered by its own in-house chips
Emphasis mine. I don't think there's any chance that means ChatGPT will run on Apple Silicon. It seems more likely that they'll use it to power their internal CoreML/MLX-based research; anything more would kinda be a waste of money and electricity.
They don't need that much server capacity if they can process a significant chunk of the queries on-device. Llama3-level inference works fine on M2-level chips today, and the M4 is already in a mobile device.
[0] https://www.reuters.com/technology/apple-power-ai-servers-wi...