That's true! I was referring to their wider lineup, especially the iPad, where users will expect the same performance as on the Macs (they paid for an Mx chip). And they sold me an iPad Air this year that comes with a really fast M3 and still only 8 GB of RAM (btw, you only get 16 GB on the iPad Pro if you go with at least 1 TB of storage on the M4 model).
You probably wouldn't with a Pro, but you might between an iPad Pro and a MacBook Air.
With the Foundation Models API they basically said there will be one model size for the entire platform, so a MacBook Pro can only run it faster, not run a smarter one.
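To make the one-size point concrete, here's a rough sketch of what using that API looks like. I'm recalling the names (`SystemLanguageModel`, `LanguageModelSession`) from Apple's WWDC 25 material, so treat this as illustrative rather than exact:

```swift
// Sketch: every supported device exposes the same on-device model
// through the FoundationModels framework (iOS 26 / macOS 26+).
// You can check availability, but you can't request a larger model
// on beefier hardware -- faster inference is all you get.
import FoundationModels

let model = SystemLanguageModel.default

switch model.availability {
case .available:
    let session = LanguageModelSession()
    // Hypothetical prompt, just to show the call shape.
    let response = try await session.respond(to: "Summarize my last note")
    print(response.content)
case .unavailable(let reason):
    // e.g. device not eligible, Apple Intelligence disabled
    print("Model unavailable: \(reason)")
}
```

Note there's no parameter anywhere for model size or capability tier, which is what makes the "smarter on a MacBook Pro" scenario a non-starter under this design.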
Isn't Private Cloud Compute already enabling the more powerful models to be run on the server? That way the on-device models don't have as much pressure to be The One.