Hacker News | car's comments

When the Mac and Atari ST first hit the market in the '80s, comics were created in this 1-bit "ordered dither" style. For error-diffusion dithering (Floyd-Steinberg etc.), you needed more bits per pixel to carry the error.
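The "carry the error" point is easy to see in code: ordered dithering thresholds each pixel against a fixed matrix (1 bit in, 1 bit out), while error diffusion pushes each pixel's quantization error onto not-yet-visited neighbors, so the working buffer needs fractional, even negative, values. A minimal sketch in plain Python with the standard Floyd-Steinberg weights (an illustration, not the tooling those comics used):

```python
def floyd_steinberg(gray, w, h):
    """1-bit Floyd-Steinberg dither. `gray` is a flat list of floats in 0..1.
    The working buffer holds fractional (even negative) intermediate values,
    which is why error diffusion needs more bits per pixel than the output."""
    buf = list(gray)            # higher-precision working copy
    out = [0] * (w * h)
    for y in range(h):
        for x in range(w):
            i = y * w + x
            new = 1 if buf[i] >= 0.5 else 0
            err = buf[i] - new  # quantization error to carry forward
            out[i] = new
            # push the error onto not-yet-visited neighbors
            if x + 1 < w:
                buf[i + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    buf[i + w - 1] += err * 3 / 16
                buf[i + w] += err * 5 / 16
                if x + 1 < w:
                    buf[i + w + 1] += err * 1 / 16
    return out
```

On a flat 50%-gray field this produces an alternating pattern that averages out to roughly half the pixels black, which is exactly what a 1-bit threshold alone cannot do.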

SHATTER:

https://imgur.com/gallery/shatter-1984-was-first-commerciall...

Robot Empire:

https://www.reddit.com/r/atarist/comments/xgs4rh/comicbook_c...


Andrej Karpathy got one from Jensen Huang.

https://blogs.nvidia.com/blog/gtc-2026-news/#dgx-station


Tried to build from source on macOS, but got this error:

  (base)   unsloth git:(main) unsloth studio setup
  ╔══════════════════════════════════════╗
  ║     Unsloth Studio Setup Script      ║
  ╚══════════════════════════════════════╝ 
   Node v25.8.1 and npm 11.11.0 already meet requirements. Skipping nvm install.
   Node v25.8.1 | npm 11.11.0
   npm run build failed (exit code 2):

  > unsloth-theme@0.0.0 build
  > tsc -b && vite build

  src/features/chat/shared-composer.tsx(366,17): error TS6133: 'status' is declared but its value is never read.

Hey will check ASAP and fix - sorry about that

Apologies for the delay - we fixed it! Please re-try with:

  curl -LsSf https://astral.sh/uv/install.sh | sh
  uv venv unsloth_studio --python 3.13
  source unsloth_studio/bin/activate
  uv pip install unsloth --torch-backend=auto
  unsloth studio setup
  unsloth studio -H 0.0.0.0 -p 8888


Thank you for the follow up! Big fan of your models here, thanks for everything you are doing!

Works fine on macOS now (chat only).

On Ubuntu 24.04 with two GPUs (3090 + 3070), it appears that llama.cpp sometimes uses the CPU and not the GPU, judging from the tok/s and CPU load for identical models run with Unsloth Studio vs. plain llama.cpp (bleeding edge).


Can Unsloth Studio use already downloaded models?

IDK how, but it detected the models I'd downloaded with LM Studio onto a spinning drive (they're not in the default location).

For posterity, I can very much recommend Mac Mouse Fix. It's $2.99 to own, totally worth it to me. Open source.

https://macmousefix.com/en/

Also available via brew:

  brew install mac-mouse-fix
And on Github too:

https://github.com/noah-nuebling/mac-mouse-fix


> You may not charge users money for Your Program, and Your Source must contain the monetization systems, including the licensing, trial period tracking, and payment system, present in the MMF Source without an alterations, and all of these systems must be active and working as intended in Your Program.

License is not Open Source.


On the fun side: can this thing help me do left-click burst in browser HTML5 games? I tried hammerspoon and other methods but nothing worked thus far.

That's pretty cool, I've been wanting something like this so I don't have to reach for the touchpad on my Mac all the time. But I gotta say, I did NOT expect to be scrolling in the Z axis all of a sudden on that site!

The site is a good example of looking good but being very annoying to use.

Is there anything like this for the MX Ergo? I would be very interested in any software-based “hacks” for the mouse.

You could also try out a SteerMouse free trial; it has the MX Ergo S on its recommended mice list, so the MX Ergo is a maybe.

https://plentycom.jp/en/steermouse/

I don’t have a mouse on my Mac now (trackpad too good) but Steermouse has been around for about 25 years and I used it for many of those. Way less awful than the Logitech software.


SteerMouse is legendary. I think it’s been around since Jaguar? Case study in getting it right the first time.

Not quite as old as I remembered, their About page puts it at 2005. But yes it's a remarkably long lasting product.

https://steermouse.com/about-us/

Funny how 20 years ago Logitech's software sucked enough for me to pay for an alternative, and two decades later Logitech's software still sucks enough for people to pay for an alternative.


Thank you!

Steermouse fills this gap brilliantly. Covers every device I’ve ever tried and I have some significant exotics.

I tried this, and it's nice... but it didn't let me program all the buttons on my Logitech MX Vertical.

Holy crap, just tried this and I was skeptical, but it sold me within minutes. This truly is great!

Can it do FizzBuzz in Brainfuck? Thus far all local models have tripped over their feet or looped out.


122B-A10B-UD-Q4-K-XL generated https://pastebin.com/j3ddfNtS -- but I can't get it to do anything in a couple of online interpreters. Guessing it wasn't trained on a lot of Brainfuck code.

Edit: it looks like the flagship models work by writing a C or Python program to do the bookkeeping. I don't have Qwen set up to use tools, and even Opus 4.6 shits the bed when told to do it without tools [1], so not too surprising that it didn't work.

1: https://claude.ai/share/1f5289ae-decd-4dfa-98fd-0d34346008c6 -- I interrupted it and told it not to use a C/Python program or any other tools to generate the Brainfuck code, and it gave me an error message after about 10 minutes that wasn't logged to the chat.
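The "bookkeeping program" approach is easy to sketch: compute the FizzBuzz text normally in Python, then emit naive Brainfuck that nudges a single cell to each character code and prints it, plus a tiny interpreter to check the result. This is a hypothetical sketch (usual 30,000-cell tape, 8-bit wrapping cells, no input handling), not what any of the models actually produced:

```python
def text_to_bf(text):
    """Emit Brainfuck that prints `text` using one cell: step the cell
    by the delta to the next character's code, then output it with `.`."""
    out, cur = [], 0
    for ch in text:
        delta = ord(ch) - cur
        out.append(("+" if delta > 0 else "-") * abs(delta) + ".")
        cur = ord(ch)
    return "".join(out)

def run_bf(code, tape_len=30000):
    """Minimal Brainfuck interpreter (no `,` input support)."""
    # pre-match brackets so loops are O(1) jumps
    jump, stack = {}, []
    for i, c in enumerate(code):
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jump[i], jump[j] = j, i
    tape, ptr, pc, out = [0] * tape_len, 0, 0, []
    while pc < len(code):
        c = code[pc]
        if c == ">": ptr += 1
        elif c == "<": ptr -= 1
        elif c == "+": tape[ptr] = (tape[ptr] + 1) % 256
        elif c == "-": tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ".": out.append(chr(tape[ptr]))
        elif c == "[" and tape[ptr] == 0: pc = jump[pc]
        elif c == "]" and tape[ptr] != 0: pc = jump[pc]
        pc += 1
    return "".join(out)

fizzbuzz = "\n".join(
    "FizzBuzz" if n % 15 == 0 else "Fizz" if n % 3 == 0
    else "Buzz" if n % 5 == 0 else str(n)
    for n in range(1, 101)
)
bf = text_to_bf(fizzbuzz)  # valid BF, but brute-force, not "real" FizzBuzz logic
```

The catch, and presumably why it impresses as a benchmark, is that this hard-codes the output rather than doing the mod-3/mod-5 arithmetic in Brainfuck itself, which is the part the models keep tripping over.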


Great job, really well done.

Also cool: https://sunclock.net

I enjoy running clocks on this 5-inch circular IPS touch-screen display from Waveshare: https://www.amazon.com/dp/B0C14CZ2GG.

The content is provided by a Raspberry Pi 4, and these JavaScript/CSS/SVG clocks can be quite taxing. A smoothly running seconds hand in particular often causes visual stuttering. Chrome had the best FPS, as I recall.

If anyone knows of other large circular displays, please post here.

The new BMW Mini has a gorgeous 24cm circular OLED display, but that's not generally available, OEM only [1][2].

[1] https://www.mini.com/en_MS/home/new-family/a-digital-quantum...

[2] https://www.bhtc.com/en/news/bhtc-entwickelt-erstes-rundes-o...


Thank you for your reply. Sunclock looks nice! I miss analog clocks in cars; digital clocks are everywhere now. Maybe I'm old, but I read an analog clock better than a digital one - somehow it's easier to visualize time from it. The younger generation reads and prefers digital clocks. Maybe a digital clock simulator is an alternative too, who knows...


So great to see my two favorite Open Source AI projects/companies joining forces.

Since I don't see it mentioned here, LlamaBarn is an awesome little but mighty macOS menu-bar program, making access to llama.cpp's great web UI and downloading of tastefully curated models easy as pie. It automatically determines the available model and context sizes based on available RAM.

https://github.com/ggml-org/LlamaBarn

Downloaded models live in:

  ~/.llamabarn
Apart from running on localhost, the server's bind address can be set from the command line:

  # bind to all interfaces (0.0.0.0)
  defaults write app.llamabarn.LlamaBarn exposeToNetwork -bool YES

  # or bind to a specific IP (e.g., for Tailscale)
  defaults write app.llamabarn.LlamaBarn exposeToNetwork -string "100.x.x.x"

  # disable (default)
  defaults delete app.llamabarn.LlamaBarn exposeToNetwork


GitHub is showing me a unicorn - is there a Linux equivalent? I have an old ThinkPad with a puny Nvidia GPU; can I hope to find anything useful to run on that?


Building llama.cpp from source with CUDA enabled should get you pretty far. llama-server has a really good web UI, and the latest version supports model switching.

As for models, plenty of GGUF quantizations (down to 2-bit) are available on HF and ModelScope.


I learned programming on a Sharp MZ-80K. Rectangular sheet-metal case with an amber monochrome monitor and a built-in cassette tape drive for storage. The keyboard keys were neatly squared up, zero ergonomics. You could flip it open like the hood of a car. And I faintly recall that there was some kind of UV-erasable EPROM inside, not sure what for.


You can point Claude Code at other models, e.g. via OpenRouter. It only requires setting three env variables.


Exactly. If Codex is really as good, it should have no problem porting any settings or config from the Claude setup. (and I do believe it wouldn't have much of a problem)

