
Each ChatGPT query costs orders of magnitude more than a Google search. I can’t say for sure how many orders, but I suspect more than a few.


The first one does; then prompt caching kicks in. It turns out many people ask similar questions. People who frequently ask complicated questions might have to pay extra; we can already see this playing out.


That’s not what prompt caching is.

Also, most ChatGPT users have a “personalization” prefix in the system prompt (which contains things like date/time), which would break caching of the actual user query.
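To make the point concrete, here's a toy sketch of how prefix-based prompt caching behaves (not OpenAI's actual implementation; the class and names are hypothetical): the cache keys on an exact leading chunk of the prompt, so a per-user header at the very start means nothing downstream of it can be shared across users.

```python
class PrefixCache:
    """Toy model of prefix-based prompt caching: keyed on an exact prefix."""

    def __init__(self):
        self.cached_prefixes = set()

    def process(self, prompt, prefix_len=32):
        """Return True if the prompt's leading chunk was already cached."""
        prefix = prompt[:prefix_len]
        hit = prefix in self.cached_prefixes
        self.cached_prefixes.add(prefix)
        return hit

cache = PrefixCache()
shared_system = "You are a helpful assistant. Answer concisely. "

# Two requests sharing the same system prompt share the cached prefix:
cache.process(shared_system + "What is Rust?")
print(cache.process(shared_system + "Explain borrowing."))  # True: prefix hit

# A per-user personalization header (date/time, preferences) changes the
# very start of the prompt, so the shared system prompt after it no longer
# lines up with any cached prefix:
print(cache.process("[2024-05-01 09:13] " + shared_system + "What is Rust?"))  # False
```

This is why a personalized prefix in the system prompt undermines cross-user caching even when the rest of the prompt is identical.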


The prompt has to be precisely the same for that to work (and of course then you'd need an embedding-based lookup, which is its own somewhat advanced problem). I doubt they do that, especially given the things I've heard from API users.
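For illustration, here's a minimal sketch of the kind of embedding-based ("semantic") cache being alluded to, with a toy character-bigram embedding standing in for a real embedding model (all names hypothetical, and the threshold is arbitrary):

```python
import math

def toy_embed(text):
    # Toy stand-in for a real embedding model: character-bigram counts.
    vec = {}
    t = text.lower()
    for a, b in zip(t, t[1:]):
        vec[a + b] = vec.get(a + b, 0) + 1
    return vec

def cosine(u, v):
    dot = sum(u.get(k, 0) * v.get(k, 0) for k in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

class SemanticCache:
    """Return a cached answer when a new query is 'close enough' to an old one."""

    def __init__(self, threshold=0.8):
        self.threshold = threshold
        self.entries = []  # list of (embedding, answer) pairs

    def get(self, query):
        q = toy_embed(query)
        for emb, answer in self.entries:
            if cosine(q, emb) >= self.threshold:
                return answer
        return None  # cache miss

    def put(self, query, answer):
        self.entries.append((toy_embed(query), answer))

cache = SemanticCache()
cache.put("what is the capital of France?", "Paris")
print(cache.get("What is the capital of France"))  # near-duplicate: 'Paris'
print(cache.get("how do transformers work?"))      # unrelated: None
```

Even in this toy form, the hard parts show through: choosing a threshold that avoids wrong-answer hits, and making the nearest-neighbor lookup fast at scale, which is exactly the "somewhat advanced problem" above.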


You realise Google is now plugging Gemini into all their queries and giving you summaries and such.

So maybe not so much anymore? It would be true if it were pure search on Google's part, but it isn't anymore.


> Google is plugging Gemini into all their queries

Definitely not all of them. I haven't figured out what the differentiator is, but many queries are excluded.


Funnily enough, I read somewhere that adding -nsfw- or similar words to the query made it go away reliably.


The delta might not be that large these days, with the AI suggestions that Google is placing on search result pages.


Because they have their own hardware.



