The idea that language influences one's world view isn't new; it was speculated about long before artificial intelligence was a thing. But that hypothesis is explicitly about an influence on the world view of humans. It doesn't postulate that language itself creates a worldview in whatever system processes text. Otherwise books would have a worldview.
It's a category error to apply it to an LLM. Language works on humans because we share a common experience as humans. It's not just a logical description of thoughts; it's also an arrangement of symbols that stand for experiences a human can have. That's why humans are able to empathically experience a story: it triggers much more than just rational thought inside their brains.
Again: LLMs DO NOT THINK. If you quote me, then at least do it correctly. I never said "processing text" is equal to human thinking; my entire point is the opposite. The "magic" still happens in OUR brains, no matter whether we read a fixed text (a book) or a text predicted by an LLM. Both are illusions created by ourselves.