Hacker News: sixtyj's comments

LLMs are really eager to start coding (as interns are eager to start working), so the sentence “don’t implement yet” has to be used very often at the beginning of any project.

Most LLM apps have a 'plan' or 'ask' mode for that.

I find that even then I often need to be explicit that I'm just asking a question and don't want them running off to solve the larger problem.

Plug it into the skull bone. Neuralink + a slot for a model that you can buy in a grocery store instead of a prepaid Netflix card.

We better solve the energy usage and cooling first otherwise that will be a very spicy body mod.

Anything bigger in context? Unfortunately - maybe I have bad luck…

But I don’t get how they code at Anthropic when they say that almost all their new code is written by LLMs.

Do they have some much smarter internal model that they keep secret and don’t sell to customers? :)


>> when they say that almost all their new code is written by LLM.

Keeping in mind that they are trying hard to sell their coding assistant, what else can they say?

The goal is simple: just lie your way forward to the next VC funding round.


Loom is a good metaphor.

https://nekonomicon.irixnet.org/gallery/IRIX-Screenshots/Sil...

I had forgotten how awesome this interface is. Thanks for the resurrection.


You're welcome :)

It is a really crazy speed: 15k tokens/second.

I have tried it again. This is the future of chat UI, imho.

Generated in 0,074s • 15 754 tok/s
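(The quoted figure is just tokens divided by wall-clock time; a minimal sketch of that arithmetic, with made-up token counts since the screenshot doesn't show them:)

```python
def tokens_per_second(token_count: int, elapsed_s: float) -> float:
    """Throughput as chat UIs report it: generated tokens / wall-clock seconds."""
    if elapsed_s <= 0:
        raise ValueError("elapsed time must be positive")
    return token_count / elapsed_s

# Hypothetical numbers: 1500 tokens generated in 0.1 s -> 15000 tok/s
print(round(tokens_per_second(1500, 0.1)))
```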


Windy.com - both the website and the app. It covers the whole world, and they seem to have a very large number of models available.

Also the yr.no app - the Norwegian weather service. It covers the whole world and uses a decent selection of models. I go between this and Windy.

And be nice and careful, please. :)

Claw to user: Give me your card credentials and bank account. I will be very careful because I have read my skills.md

Mac Minis should be offered with a warning, like on a pack of cigarettes :)

Not everybody installs some claw that runs in a sandbox/container.


Isn't the Mac mini the container?

It is... but then many people hook it up to their personal iCloud account and give it access to their email, at which point the container isn't really helping!

For CDN, you can try CDN77, they have servers all around the world. No affil, just they are based in Europe (Prague) :)

Right now, I would only switch from Bunny if they allowed IPv6-only origin servers and routed IPv4 traffic to them.

Also, there's no pricing and only a "Talk to sales" link, which usually means super expensive or B2B only. I pay like 10 cents a month on Bunny or something.


Ah yeah I see CDN77 has no signup button and just says "Talk to sales" instead. That's not helpful to small self hosters.

Even for a big corp.

When I see "talk to sales" I just move on. I don't have time to waste on that.


There was a discussion yesterday about whether LLM-generated “Show HN” posts should be moved to another thread :)

Nevertheless, it looks nice, but I can’t be sure the texts are correct. Did the OP check everything because they know the topic deeply?

Citations lend credibility to the text, but can we trust them if they are automatically generated? Live links to arXiv or ResearchGate would be much better.
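(Live links would also make citations machine-checkable; a minimal sketch of pulling arXiv IDs out of citation text, assuming the modern `YYMM.NNNNN` identifier scheme - the pattern and function name are my own, not anything from the post:)

```python
import re

# Modern arXiv identifiers look like 2401.12345, optionally with a
# version suffix (e.g. v2); match them in free text or arxiv.org URLs.
ARXIV_ID = re.compile(r"\b(\d{4}\.\d{4,5})(v\d+)?\b")

def extract_arxiv_ids(text: str) -> list[str]:
    """Return the bare arXiv IDs found in a blob of citation text."""
    return [m.group(1) for m in ARXIV_ID.finditer(text)]

print(extract_arxiv_ids(
    "See https://arxiv.org/abs/2001.08361 and arXiv:1706.03762v5."
))
# -> ['2001.08361', '1706.03762']
```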

Graphics and visualisations look great, well done.


I verified the Fourier one and the LLM one. The scaling-law one is likely okay too, as I read the book long ago.

