
> What I don't think this technology will do is replace game engines. I just don't see how you could get the very precise and predictable editing you have in a regular game engine from anything like the current model. The real advantage of game engines is how they allow teams of game developers to work together, making small and localized changes to a game project.

I've been thinking about this a while and it's obvious to me:

Put Minecraft (or something similar) under the hood. You just need data structures to encode the world and to enable mutation, location, and persistence.

If the model is given additional parameters such as a "world mesh", then it can easily persist where things are, what color or texture they should be, etc.

That data structure or server can be running independently on CPU-bound processes. Genie or whatever "world model" you have is just your renderer.
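A minimal sketch of that split, assuming a hypothetical `World` class (all names here are invented for illustration): the authoritative state is plain CPU-side data, mutations go through deterministic methods, and the "world model" only ever reads serialized snapshots.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: the authoritative world state lives in plain data
# structures on CPU-bound processes; the neural renderer only reads snapshots.

@dataclass(frozen=True)
class Block:
    x: int
    y: int
    z: int
    material: str  # e.g. "stone"; how it *looks* is the renderer's job

@dataclass
class World:
    blocks: dict = field(default_factory=dict)  # (x, y, z) -> Block

    def set_block(self, x: int, y: int, z: int, material: str) -> None:
        """Mutation and persistence happen here, deterministically."""
        self.blocks[(x, y, z)] = Block(x, y, z, material)

    def snapshot(self) -> list:
        """Serializable view handed to whichever renderer is plugged in."""
        return [(b.x, b.y, b.z, b.material) for b in self.blocks.values()]

world = World()
world.set_block(0, 0, 0, "grass")
world.set_block(0, 1, 0, "stone")
print(world.snapshot())  # same snapshot could feed any renderer
```

Because the snapshot is just data, two players could point different renderers at the same server and still agree on where everything is.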

It probably won't happen like this due to monopolistic forces, but a nice future might be one where you could hot swap renderers between providers yet still be playing the same game as your friends - just with different looks and feels, everyone experiencing the world differently at the same time. (It'll probably be winner take all, sadly, or several independent vertical silos.)

If I were Tim Sweeney at Epic Games, I'd immediately drop all work on Unreal Engine and start looking into this tech. Because this is going to shore them up on both the gaming and film fronts.



As a renderer, given a POV, lighting conditions, and a world mesh, it might be a very, very good system. Sort of a tight MCP connection to the world-state.
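Concretely, the per-frame conditioning could be as simple as one payload bundling those three inputs. This is a hypothetical sketch - the field names and `render_request` helper are invented, not any real world-model API:

```python
import json

# Hypothetical conditioning payload for a world-model renderer.
# The model never owns this state; it only renders it, so two different
# renderers given the same payload depict the same game moment.

def render_request(world_snapshot: list, pov: dict, lighting: dict) -> str:
    """Bundle world state + camera + lighting into one serialized blob."""
    return json.dumps({
        "pov": pov,              # camera position and orientation
        "lighting": lighting,    # e.g. sun angle, ambient level
        "world_mesh": world_snapshot,
    })

payload = render_request(
    world_snapshot=[(0, 0, 0, "grass")],
    pov={"position": [0, 2, -5], "yaw": 0.0, "pitch": -10.0},
    lighting={"sun_angle": 45.0, "ambient": 0.3},
)
print(json.loads(payload)["pov"])
```

The point of the design is that everything subjective (style, texture, vibe) stays on the renderer side of this boundary, while everything shared stays on the server side.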

I think in this context, it could be amazing for game creation.

I’d imagine you would provide item descriptions to vibe-code objects and behavior scripts, set up some initial world state (maps) populated with objects made of objects - hierarchically vibe-modeled - then make a few renderings to give inspirational world-feel and textures, and vibe-tune the world until you had the look and feel you want. Then once the textures and models and world were finalised, they would be used as the rendering context.

I think this is a place where there are enough feedback loops and enough supervision that, with decent tools along these lines, you could 100x the efficiency of game development.

It would blow up the game industry, but also spawn a million independent one or two person studios producing some really imaginative niche experiences that could be much, much more expansive (like a AAA title) than the typical indie-studio product.


> you could 100x the efficiency of game development.

> It would blow up the game industry, but also spawn a million independent one or two person studios producing some really imaginative niche experiences that could be much, much more expansive (like a AAA title) than the typical indie-studio product.

All video games become Minecraft / Roblox / VRChat. You don't need AAA studios. People can make and share their own games with friends.

Scary realization: YouTube becomes YouGame and Google wins the Internet forever.


You’ve just described what Roblox is already doing.


I haven’t checked on Roblox recently, but afaik it doesn’t really allow complete creative freedom or the ability to have a picture and say “make the world look like this, and make the character textures match the vibe” and have it happen. Don’t they still have a unified world experience or can you really customize things that deeply now?

Can you make a basically indistinguishable copy of other games in Roblox? If so, that’s pretty cool, even without AI integration.


Roblox can't beat Google in AI. Roblox has network effects with users, but on an old school tech platform where users can't magic things into existence.

I've seen Roblox's creative tools, even their GenAI tools, but they're bolted on. It's the steam-powered horse problem.



