
Just installed, this is very cool. Local AI is the future I want (and what I'm working on too). A few notes using it...

Pros:

- seems pretty self contained

- built in model installer works really well and helps you download anything from CivitAI (I installed https://civitai.com/models/183354/sdxl-ms-paint-portraits)

- image generation is high quality and stable

- shows intermediate steps during generation

Cons:

- downloads a 6.94GB SDXL model file somewhere without asking or showing the location/size. I just figured out you can find/modify the location in the settings.

- very slow on first generation as it loads the model; no record of how long generations take, but I'd guess a couple of minutes (M1 Max MacBook, 64GB)

- multiple user feedback modules (bottom left is very intrusive chat thing I'll never use + top right call for beta feedback)

- not open source like competitors

- runs 7 processes, idling at ~1GB RAM usage

- non-native UX on macOS, missing hotkeys you'd expect and a help menu. Electron app?

Overall 4/5 stars, would open again :)



You should check out Draw Things on macOS. It works well enough for SDXL on 8GiB macOS devices.


Thanks. Yeah I played with your app early on and just fired it up again to see the progress. Frankly I find the interface pretty intimidating but it is cool that you can easily stitch generations together.

Unsolicited UX recs:

- strongly recommend a default model. The list you give is crazy long. It kind of recommends SD 1.5 in the UI text below the picker but has the last one selected by default. Many of them are called the same thing (ironically the name is "Generic" lol).

- have the panel on the left closed by default or show a simplified view that I can expand to an "advanced" view. Consider sorting the left panel controls by how often I would want to edit them (personally I'm not going to touch the model but it is the first thing).

You are doing great work but I wouldn't underestimate the value of simplifying the interface for a first-time user. It seems to have a ton of features but I don't know what I should actually be paying attention to / adjusting.

Is there a business model attached to this or do you have a hypothesis for what one might look like?


Agreed on the UX feedback. It accumulated a lot of cruft moving from the old technologies to the new. This just echoes my early feedback that co-iterating the UI and the technology is difficult; you'd better pick the side you want to be on, and there is only one correct side (and unfortunately, the current app is trying hard to be on both sides).


Are you the developer by any chance? If so, it would be helpful to state it.


I am. I thought this was obvious. My statement is objective. I would go as far as: it is the only app that works on 8GiB macOS devices with SDXL-family models.


"You should check out this thing" has a very different implied context than "You should check out this thing I made". The first sounds like a recommendation from an enthusiastic user, not from the author. Because of this, discovering that you are the author makes your recommendation feel deceptive.


I am sorry if you feel that way. I joined HN when it was a small tight-knit community without much of a marketing presence. The "obvious" comment is more of a "people know other people" kind of thing. I didn't try to deceive anyone into using the app (and why should I?).

If you feel this is unannounced self-promotion, yes, it is, and can be done better.

---

Also, the "objective" comment was meant to say "the original comment is still objective", not that you can be objective only by being the developer. Being a developer can obviously bias your opinion.


I think it was obvious. That said, thank you so much for your labor of love! The app is amazing! Any plans for SDXL Turbo support?


Should be in a few days. I asked Stability to clarify whether I can deliver the weights through my Cloudflare bucket, and whether qualifying as non-commercial depends on who runs the model, not who delivers it.


2008, when we both joined, was 15 years ago. In the interim, the userbase has grown. Most people aren't recognizable as the author of an app under discussion, so a simple "Developer here" is appreciated as it was not obvious to me.


Whoa, well let me just say thanks for the awesome app!! It's pretty entertaining to spin this up in situations where I don't have internet (airplane, subway, etc.)

I was also surprised at how well it ran on my iPhone 11 before I replaced it with a 15 Pro.

(Let me know if you're looking for some Product Design help/advice, totally happy to contribute pro bono. No worries if not of course!)


How would that be obvious to anyone?


Nice app - but for future reference it is very much not obvious to any native English speaker. "You should check out X" sounds like a random recommendation.


What do you mean it was obvious? Only the developer could make that objective statement?


I've been generating stuff non-stop in Draw Things for a few days, it's very good. Agree with the comments elsewhere about the rather overwhelming UI, and I have only one feature request: let us input the number of images we want to generate - the 100 limit means I keep having to check if it's finished to restart it.


Any plans for SD Turbo? Both base and XL models would be a great fit for a mobile device.


Draw Things is amazing. Great work and thanks for developing it!


If you are interested in the tech-stack:

https://noiselith.notion.site/License-61290d5ed7ab4c918402fd...

So yes, it is an Electron app with Svelte, Headless UI, Tailwind CSS, etc.


+1 for asking download location.


Another con is it only works on Silicon Macs.


Apple Silicon* I presume?

This could honestly be the excuse I need (want) to order an absolute beast of a MacBook Pro to replace my 2013 model.


If you want an absolute beast, especially for this stuff, you probably want Intel + Nvidia. Apple Silicon is a beast in power efficiency but a top of the line M3 does not come close to the top of the line Intel + Nvidia combo.


Well this would just be the excuse. I'm typing this on a Ryzen 5950X w/32 GB of RAM and a 4090. So I guess I already have the beast?


Get an M1 MacBook, throw your Ryzen with the 4090 in a closet, and use it as a remote API endpoint for ComfyUI lol
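In case it's useful, here's a minimal sketch of that setup over ComfyUI's HTTP API (default port 8188), assuming ComfyUI is started with `--listen` on the remote box so it's reachable over the LAN. The host name is hypothetical, and the workflow dict would be a real graph exported from the ComfyUI UI via "Save (API Format)":

```python
# Minimal sketch: queue a generation on a remote ComfyUI instance.
# Assumptions: ComfyUI started with `--listen` on the remote box (default
# port 8188); `workflow` is a graph exported via "Save (API Format)";
# the host name in the usage comment is hypothetical.
import json
import urllib.request

COMFY_PORT = 8188

def build_prompt_payload(workflow: dict, client_id: str = "hn-demo") -> bytes:
    """Wrap a workflow graph in the JSON body that POST /prompt expects."""
    return json.dumps({"prompt": workflow, "client_id": client_id}).encode("utf-8")

def queue_prompt(host: str, workflow: dict) -> dict:
    """Queue a generation on the remote machine and return ComfyUI's
    response, which includes a prompt_id you can poll via /history."""
    req = urllib.request.Request(
        f"http://{host}:{COMFY_PORT}/prompt",
        data=build_prompt_payload(workflow),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (hypothetical host; workflow exported from the ComfyUI UI):
#   result = queue_prompt("ryzen-box.local", workflow)
```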


I guess EPYC and a few H100s is the next big step, at a much higher price point...


If it's just for hobby/interest work, then just a heads-up that even the 1st generation Apple Silicon will turn over about one image a second with SDXL Turbo. The M3s of course are quite a bit faster.

The performance gains in recent models and PyTorch are currently outpacing hardware advances by a significant margin, and there are still large amounts of low-hanging fruit in this regard.


Is that 1GB idle per process or total for all 7 processes?


> not open source like competitors

Who are the competitors?


DiffusionBee: AGPL-3.0 license (Native app)

InvokeAI: Apache license 2.0 (web-browser UI)

automatic1111: AGPL-3.0 license (web-browser UI)

ComfyUI: GPL-3.0 license (web-browser UI)

There's more, but I don't pay enough attention to it


Thanks! https://lmstudio.ai/ too. For the more technically inclined perhaps.


I don't think LM Studio competes with Stable Diffusion frontends, even for the technically inclined.


for people with Intel video cards (all 10 of us!) there's also SD.Next (automatic1111 fork): https://github.com/vladmandic/automatic


I like ComfyUI the most now, but it's probably not the most beginner-friendly. It has great features, is extensible, and you can build workflows that work for you and save them so you don't have to click a million times like in Auto1111.


I'd also recommend InvokeAI, an open source offering which has a very nice editable canvas and is very performant with diffusers.

https://github.com/invoke-ai/InvokeAI


I just installed InvokeAI and wish I hadn't. It installs -so much- outside of its target directory. A1111 and ComfyUI are fairly self contained where you put them.


It's all isolated in a single directory, though, right? I set it up ages ago, but my recollection is that it installs itself in ~/invoke on Linux and stays contained there.



