One of the things I think is hugely important about GPT-3 is that it's paid for by the token. So for small users, it's pretty cheap, and it's fully workable for testing and experimentation.
That's accurate for testing and experimentation. The $18 starting credit from OpenAI is a nice bonus, too.
I created a small Anki plugin that tags cards based on their content using the Text-Davinci-003 model. However, the cost quickly becomes prohibitive when a large number of tokens (Anki cards) need to be processed. To be fair, that plugin isn't working great, since tags really should be viewed, merged, and reused at a global scope.
At 2 cents per 1000 tokens (~750 words), unless you're doing it a lot, that should be quite manageable? That's ~$13 for 1000 cards of 500 words (prompt + card + tags). The smaller models are cheaper still and might perform well enough.
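The back-of-envelope math above can be sketched in a few lines of Python. The numbers are the ones from this thread (Davinci-class pricing of $0.02 per 1000 tokens, and the rough rule of thumb that 1000 tokens ≈ 750 words); both are assumptions about pricing at the time, not current figures.

```python
# Rough cost estimate for tagging Anki cards with a per-token-priced model.
# Assumed figures from the thread: ~$0.02 per 1000 tokens (Davinci-class),
# and ~750 words per 1000 tokens.

WORDS_PER_1K_TOKENS = 750
PRICE_PER_1K_TOKENS = 0.02  # USD; assumed Davinci pricing at the time

def tagging_cost(num_cards: int, words_per_request: int,
                 price_per_1k_tokens: float = PRICE_PER_1K_TOKENS) -> float:
    """Estimate USD cost: words -> tokens -> dollars."""
    tokens_per_request = words_per_request * 1000 / WORDS_PER_1K_TOKENS
    total_tokens = num_cards * tokens_per_request
    return total_tokens / 1000 * price_per_1k_tokens

print(round(tagging_cost(1000, 500), 2))  # ~13.33, matching the "~$13" above
```

So 1000 cards at ~500 words each works out to roughly $13, consistent with the estimate in the comment.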
I guess you are generally right. My prompts are lengthy, since I let GPT-3 handle the full HTML syntax of both sides, MathJax, and so forth. I found the Ada model insufficient, but I haven't tried the other two models; I will try those!
There is room for improvement on my side.
From a budget-conscious student's perspective, this amounts to roughly $20 per semester in my case, just for the tags. Despite this, I am still fascinated by the automation possibilities offered by GPT-3.
That's totally fair. I should be more conscious of different positions (and not get stuck in "as someone with a stable income..." thinking), and you're right that these costs can add up. Curie should cost 1/10th of Davinci. The recent models are all better too, I think, so if you've not tried them since the GPT-3.5 release, it might be worth trying the smaller ones again. If you could get that to $2/semester, it's a lot more palatable, as long as the quality isn't too bad. Maybe there's a flow where you can identify low-performing cards/subsets, so you can do 90% in Curie and 10% in Davinci.
I think my expectation for this kind of service (on-demand access to a huge network) is fixed monthly fees, "call us for pricing", etc. It's cool that it's approachable as a student. It'll be very interesting as costs for these things drop and/or performance improves at the same price points.