Hacker News | Impossible's comments

Pico-8 isn't super accessible as a game making platform, although it is a very accessible programming introduction platform.

HN tends to overuse "boilerplate", and your post is no exception: Pico-8 has zero boilerplate and is super minimal. Core gameplay code is not boilerplate, and Pico-8 is not a game engine in that sense.

Playdate has Pulp, which does include built-in collision, dialog systems, etc., and is inspired by Bitsy. Pulp has its own (optional) scripting language and is specifically designed as an accessible game-making tool. Even the Playdate Lua and C SDKs have more gameplay functionality than Pico-8, though, including a simple game object model, collisions, animation, etc.


I think he meant that Pico-8 is so stripped down you have to spend a lot of time writing lower-level game engine stuff before you get to the fun, moving-things-around part. Not boilerplate in the sense that you are repeating code within your Pico-8 game, but that you have to rewrite basic functions that come built into typical game frameworks.


Exactly. And it's not that PICO-8 is bad for it; in fact, PICO-8 is marvelously flexible since it imposes so little on you in terms of structure, so the games are a lot more varied than, say, something from RPG Maker, where the results can end up feeling a bit samey (unless you have something really exceptional on the art and writing side, like Rakuen).

But yeah, as far as introductory programming platforms go, it's much better to think of PICO-8 as an upgrade from QBasic than an upgrade from ZZT.


I would always start with Scratch

https://scratch.mit.edu/

It has built-in collision detection etc., but to me turtle graphics is the way to properly understand programming. Most other environments help with getting people interested in tech (and perhaps game development), but visually configuring event handlers is not programming. When I taught programming I often got people from other courses that used a visual environment. They knew how to build forms and save data, but they could not program, which (to me) means: think about a problem, break it into smaller problems until you can write code for them. Many were happy when we switched to turtle graphics and said "We should have started with that".

Scratch helps with the transition from event-handler configuration (which is fine if you want to write games; many environments don't make you write code) to coding (writing code in a programming language) via turtle graphics.
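The "break the problem into smaller problems" idea maps directly onto turtle graphics. Here's a sketch in plain Python rather than a real graphics environment — the `Turtle` class below is a made-up headless stand-in that only tracks position and heading, but the commands mirror the usual turtle vocabulary:

```python
import math

class Turtle:
    """Minimal headless turtle: tracks position and heading only."""
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 0.0  # degrees, 0 = pointing east

    def forward(self, dist):
        rad = math.radians(self.heading)
        self.x += dist * math.cos(rad)
        self.y += dist * math.sin(rad)

    def left(self, angle):
        self.heading = (self.heading + angle) % 360

def draw_square(t, side):
    # "draw a square" decomposes into "draw one side, then turn", four times
    for _ in range(4):
        t.forward(side)
        t.left(90)

t = Turtle()
draw_square(t, 100)
# after four sides and four 90-degree turns, the turtle is back where it started
```

Each command is a small, visible change in state, which is exactly what makes the "smaller problems" decomposition easy to reason about.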


My daughter uses a couple of block-programming browser environments. One is Frozen-themed, where you direct the characters to draw by skating as directed. Modern turtle graphics.



I always have a desktop computer and mostly can only do real work on desktop (graphics/VR/game programmer). Laptops are nice for portability so I always try to have one or two available. Because of my role my jobs have always provided one in the office, but I keep a personal one for gaming and side projects/learning.


All of these are used in the wild at various companies. Maybe these companies are entirely staffed by subpar engineers, but FAANG-style whiteboarding is extremely gameable, especially if you already know someone on the inside, but even if you don't, grinding LeetCode is enough.


I've had a lot of managers and higher-level ICs explain calibration to me, and I rage quit (although it was long overdue) after an unfairly low review. There were extreme politics happening at the time, which the then-current ED laid out to me in an exit conversation, and most of it came into play. I should have made way more money off of my contributions than I did, but I didn't play politics, and honestly, given my role and position, could not. I was a resource and not a player :)


The platform is still the web, even if it's not using HTML and CSS. The success of Figma has led to a lot of startups targeting the web as their main platform, but using WASM (C++ or Rust) and WebGL (soon WebGPU). This makes sense for creative tools, I think. We're also starting to see tools that run on the server and stream video to the browser.


Christ, don't get me started on that last point. One of the most egregious examples I've ever heard of is Mighty, an app that streams Chrome from a server to your laptop, because apparently fixing Chrome's performance is such a hassle that we now need a video of our browser sent back to us.

And the worst part is, they might win. Lots of apps are web-based with WASM and WebGL, like you say, which are purely client-side, and lots of people who need to do work might not have good enough computers to run such apps, if we assume (and I believe this to be true) that applications of ever-greater complexity will be pushed onto the web browser as the universal app interface. If a server can render those apps better and send them back, the client doesn't have to do any work.

It seems we continuously invent and reinvent the terminal/mainframe architecture.


Basically what was done in X Windows and Citrix/RDP is now the browser's role, and the wheel keeps on turning.


> And the worst part is, they might win

Might? They will win. Is it not obvious at this point that every single app in the near future will be sandboxed in some way? The Unix greybeards had a couple of decades to prove they could write secure software, and they failed.


Or to write a UIKit that works across Windows, Linux, and macOS with no hassle, making cross-platform development easier. But no, we already have 3 SDKs on a single platform, and apparently it's confusing.

Oh wait, they did create a true cross-platform platform: the browser.


It is so cross-platform that some think it's too much work to support various browsers, so they bundle Chrome with the application.

Could be worse, I guess; there are those that ship a whole OS with the application.


Having worked at Facebook and seen where they are headed, I think the HN numbers are way off and this poll is mostly "which company does HN hate the most", not which is most likely to decline. Hating FB as a company is fine, and FB as a product probably is likely to decline, but they are, as OP mentioned, significantly better at shipping new products than any other company listed except Apple. Facebook's biggest challenges are not owning a real platform (platform holders could hurt them) and having lots of serious competitors. These are real problems, but I think their ability to fast-follow can put them at #2 in a lot of the categories they care about.

Netflix has a lot of room to grow in new markets, gaming, and possibly other types of media. I think they can continue to produce good content, and I think some smaller players that are locking content to their own services might return to Netflix in the future (like Microsoft, Sony, and EA are now all back on Steam).

I don't think any of these companies is under serious threat of decline, but of all of them Google seems the most stagnant.


Nanite doesn't do any rendering on the CPU, and it moves a lot of work that is traditionally done on the CPU (culling) to the GPU. Nanite does use a software rasterizer for most triangles, but it runs on the GPU.


Oh, I think you're right. The nested FOR loops of the "micropoly software rasterizer" are running on the GPU.[1] I think. I thought that was CPU code at first. They're iterating over one or two pixels, not a big chunk of screen. For larger triangles, they use the GPU fill hardware. The key idea is to have triangles be about pixel-sized: if triangles are bigger than 1-2 pixels and there's more detail available, use smaller triangles.
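For anyone curious what those nested loops are doing, here's a rough CPU-side sketch in Python of the classic bounding-box plus edge-function rasterizer. To be clear, this is illustrative, not Nanite's actual code — their version runs in a GPU compute shader and is far more involved — but it shows why the loops are so short when triangles are pixel-sized:

```python
def edge(ax, ay, bx, by, px, py):
    """Signed area test: >= 0 when point p is on the left of edge a->b."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(tri):
    """Return the pixel coordinates covered by a counter-clockwise triangle.

    For pixel-sized triangles the bounding box is only 1-2 pixels wide,
    so the nested loops below run just a handful of iterations.
    """
    (x0, y0), (x1, y1), (x2, y2) = tri
    xmin, xmax = int(min(x0, x1, x2)), int(max(x0, x1, x2))
    ymin, ymax = int(min(y0, y1, y2)), int(max(y0, y1, y2))
    covered = []
    for y in range(ymin, ymax + 1):
        for x in range(xmin, xmax + 1):
            px, py = x + 0.5, y + 0.5  # sample at the pixel center
            if (edge(x0, y0, x1, y1, px, py) >= 0 and
                    edge(x1, y1, x2, y2, px, py) >= 0 and
                    edge(x2, y2, x0, y0, px, py) >= 0):
                covered.append((x, y))
    return covered
```

A triangle spanning a single pixel, e.g. `rasterize(((1.0, 1.0), (2.0, 1.0), (1.0, 2.0)))`, touches exactly one pixel center, which is the regime where software rasterization beats the fixed-function hardware.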

[1] https://youtu.be/eviSykqSUUw?t=2134


This is one of the main advantages of using a lower-level API like Vulkan or DX12. Higher-level APIs like OpenGL and DX11 do a lot of work for you, but that work can result in extra allocations you can't control, driver threads being spawned, extra resource transitions, etc.


I agree about the similar level of immersion. I actually think a lot of parser IF would translate to VR well (at a much greater cost). I thought that Andrew Plotkin's Dual Transform might make for a good VR game, for example: https://eblong.com/zarf/if.html


When I worked on Call of Duty (~7 years ago), almost all of our source images were TGAs. The runtime formats that almost all games use are various block-compressed formats (BC7, ASTC, etc.), and usually these are grouped in some kind of package/archive format (although engines are moving away from that in order to stream at the asset level) and LZ4-compressed on disk.
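For a sense of the sizes involved, here's a back-of-the-envelope calculation. It's a sketch that assumes 16-byte 4x4 blocks, which holds for BC7 (BC1 uses 8-byte blocks; ASTC blocks are also 16 bytes, but their texel dimensions are configurable, so 4x4 is just one profile):

```python
def bc_texture_size(width, height, bytes_per_block=16, mips=True):
    """Size in bytes of a block-compressed texture, optionally with a
    full mip chain. Dimensions round up to whole 4x4 blocks."""
    total = 0
    while True:
        blocks_x = (width + 3) // 4
        blocks_y = (height + 3) // 4
        total += blocks_x * blocks_y * bytes_per_block
        if not mips or (width == 1 and height == 1):
            break
        width, height = max(1, width // 2), max(1, height // 2)
    return total
```

At 16 bytes per 4x4 block, BC7 works out to exactly 1 byte per texel, so a 1024x1024 texture without mips is 1 MiB before the on-disk LZ4 pass; an uncompressed 32-bit TGA of the same image would be 4 MiB.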

