Hacker News
ariedro | 9 days ago | on: This is not the future
Well, the LLMs were trained on data that required human effort to write; it's not just random noise. So the results they can give are, indirectly and probabilistically, regurgitated human effort.