> The level of resources to deliver a feature like this is staggering.
Are they, though? AzureML will straight up let you spin up a chatbot in a few clicks, and you could just have the ops team set up auto scaling so you only consume resources you need (because really, how many people are actually going to use this?).
It could be expensive if you fine-tune, but a free POC would just be to prompt an LLM to act like a celebrity, maybe with an example of text they wrote.
So at worst, you're paying for compute that is a drop in the bucket to someone like Meta.
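For what it's worth, that "prompt an LLM to act like a celebrity" POC really is just a few lines: a system prompt with the persona plus a writing sample. Here's a minimal sketch of the message payload (the persona name and sample text are placeholders I made up, and the actual chat-completion call depends on whichever endpoint you deploy):

```python
# Sketch of a zero-shot "celebrity persona" prompt, the free-tier POC
# described above. Persona and sample_text are hypothetical placeholders.
persona = "Ada Lovelace"
sample_text = "The Analytical Engine weaves algebraic patterns."

messages = [
    {
        "role": "system",
        "content": (
            f"You are {persona}. Stay in character and answer as they would. "
            f"Example of their writing style: {sample_text!r}"
        ),
    },
    {"role": "user", "content": "What do you think of modern computers?"},
]

# With a real deployment (e.g. an AzureML-hosted model), you'd pass
# `messages` to its chat-completion API; here we just print the payload.
for m in messages:
    print(m["role"], "->", m["content"])
```

The fine-tuning cost only shows up if the canned persona isn't convincing enough, which is exactly why you'd start with prompting.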