The first one does; then prompt caching kicks in, since it turns out many people ask similar questions. People who frequently ask complicated questions might have to pay extra, and we can already see this playing out.
Also, most ChatGPT users have a “personalization” prefix in their system prompt (which contains things like the date/time), which would break caching of the actual user query.
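Roughly why that hurts: prefix caching keys on the exact leading tokens, so anything that varies per user or per minute early in the prompt means even identical questions can't share cached work. A minimal sketch of the matching rule (hypothetical names; real serving stacks cache KV states per token prefix rather than hashing raw strings, but the effect is the same):

```python
import hashlib

kv_cache: dict[str, str] = {}

def lookup_or_fill(full_prompt: str) -> bool:
    """Return True if this exact prompt prefix was already cached."""
    key = hashlib.sha256(full_prompt.encode()).hexdigest()
    hit = key in kv_cache
    kv_cache[key] = "<precomputed KV states>"
    return hit

query = "Explain prompt caching."

# Two users ask the identical question, but each gets a personalized
# system prompt containing the current date/time ahead of it:
alice = "System: user=alice, now=2024-05-01 09:13\n" + query
bob   = "System: user=bob, now=2024-05-01 09:14\n" + query

print(lookup_or_fill(alice))  # False: cold cache
print(lookup_or_fill(bob))    # False: same question, but the preceding prefix differs
print(lookup_or_fill(alice))  # True, only because the prefix is byte-identical this time
```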
The prompt has to be precisely the same for that to work (otherwise you'd need something like an embedding-keyed hashmap, which is its own somewhat advanced problem). I doubt they do that, especially given the things I've heard from API users.
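For reference, this is roughly what an embedding-keyed cache would have to do, versus the exact-match case above. Purely a hypothetical sketch, not what OpenAI does; embed() here is a placeholder for whatever embedding model you'd use, and the hard parts (threshold tuning, index maintenance) are glossed over:

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder: a real system would call an embedding model here.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(8)
    return v / np.linalg.norm(v)

class SemanticCache:
    def __init__(self, threshold: float = 0.95):
        self.threshold = threshold
        self.keys: list[np.ndarray] = []
        self.values: list[str] = []

    def get(self, prompt: str) -> str | None:
        # Nearest-neighbour lookup by cosine similarity, not exact equality.
        q = embed(prompt)
        for k, v in zip(self.keys, self.values):
            if float(q @ k) >= self.threshold:
                return v
        return None

    def put(self, prompt: str, response: str) -> None:
        self.keys.append(embed(prompt))
        self.values.append(response)

cache = SemanticCache()
if cache.get("What's the weather in Paris?") is None:
    cache.put("What's the weather in Paris?", "<model response>")
```

And even with that in place, you still have to decide when “similar enough” is safe to answer from cache, which is a correctness problem on top of the engineering one.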