
Well, it is a speaker (writer) after all. It has to use some way to refer to itself.


I don't think that's true. It's more a function of how these models are trained (remember the older, pre-ChatGPT clients?)

Most of the software I use doesn't need to refer to itself in the first person. Pretending that we're speaking with an agent is a UX/marketing decision rather than a technical/logical constraint.


I'm not sure about that. What happens if you "turn down the weight" (cf. https://www.anthropic.com/news/golden-gate-claude) for self-concept, expressed in the use not of first-person pronouns but "the first person" as a thing that exists? Do "I" and "me" get replaced with "this one" like someone doing depersonalization kink, or does it become like Wittgenstein's lion in that we can no longer confidently parse even its valid utterances? Does it lose coherence entirely, or does something stranger happen?

It isn't an experiment I have the resources or the knowledge to run, but I hope someone does and reports the results.


So is a command prompt.


Command prompts don't speak English.

Command prompts don't get asked questions like "What do you think about [topic]?" and have to generate a response based on their study of human-written texts.


There is again no need for first person pronouns there.

E.g. 'File not found' vs 'Sorry I could not find the file you were looking for.' Same stuff, but one just adds an artificial and unnecessary anthropomorphization.


A procedure is agnostic past the procedural rules (a calculator only follows a determined flow, the way anyone could without relevant differences); a stochastic process is inherently personal (non-deterministic processes have internal biases).

In your example:

-- "iteration over filenames table reaches end → file not found";

-- "non-deterministic choice over lookup strategy does not return a positive → sorry I could not find the item"


LLMs are 100% deterministic. The facade of randomness is injected solely by a superfluous RNG factor.
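A minimal sketch of the point being made here: the forward pass produces a fixed vector of logits for a given input, and randomness enters only at the separate sampling step, where an RNG draws from the softmax distribution. With temperature 0 (greedy decoding) that step is pure argmax and the whole pipeline is deterministic. The function name and logit values below are illustrative, not from any particular model.

```python
import math
import random

def sample_next_token(logits, temperature=1.0, seed=None):
    """Pick a token index from raw logits.

    temperature == 0 means greedy decoding: a plain argmax,
    with no RNG involved anywhere. Otherwise the only source
    of randomness is the seeded RNG created below.
    """
    if temperature == 0:
        # Deterministic: always the highest-logit token.
        return max(range(len(logits)), key=lambda i: logits[i])

    # Softmax with temperature scaling (higher T flattens the distribution).
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # This is the "injected" randomness: remove it and the model
    # collapses back to a deterministic function of its input.
    rng = random.Random(seed)
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i
    return len(probs) - 1

logits = [2.0, 1.0, 0.1]
print(sample_next_token(logits, temperature=0))  # always index 0
```

With a fixed seed the sampled choice is reproducible too, which is why "non-determinism" in deployed LLMs is a configuration choice layered on top of a deterministic forward pass.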


Even if there were no "temperature" tilting in the forward pass, the NN training would still be performed through different processes on different implementations, making the outputs "personal".


Sure but I don't control the LLM's first person pronouns.

It anthropomorphizes itself.


Agnew, if you converse with your command prompt we are glad you came here for a break ;)



