Hacker News | ledauphin's comments

have you gotten a terminal interface on your phone to be acceptably usable? I haven't, at least not without a real keyboard attached; too many parts of the UX are designed for a true keyboard.

I’ve had decent luck with Termius because it gives you a row of keyboard shortcuts above the usual keyboard. Still cramped, but it works.

Tmux is annoying with a mobile keyboard, so I vibe coded a little mobile-friendly wrapper https://github.com/zakandrewking/pocketbot

Someone is going to solve this with a non-buggy app, but it really needs to have all the features of Claude Code. Everyone is a power user in this segment.


I used ConnectBot on Android and it worked fine for me; the new 'Terminal' app with Debian also worked well

take a look at opencode; it doesn't even have to be a terminal anymore to command your terminal from whatever device you are using

how is it different or better than maintaining an index page for your docs? Or a folder full of docs and giving Claude an instruction to `ls` the folder on startup?



It's hard to tell unless they give some hard data comparing the approaches systematically. This feels like a grift, or more charitably an attempt to build a presence/market around nothing. But who knows anymore; apparently saying "tell the agent to write its own docs for reference and context continuity" is considered a revelation.


this is likely in reference to the fact that dicts have maintained insertion order since Python 3.6 (an implementation detail in CPython 3.6, guaranteed by the language since 3.7). Mathematically there's no defined order to a set, and a dict is really just a set in disguise, but it's very convenient for determinism to "add" this invariant to the language.
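A quick illustration (Python 3.7+, where the ordering is guaranteed): dict keys come back in insertion order, while set iteration order is an implementation detail you shouldn't rely on.

```python
# Dicts preserve insertion order (guaranteed since Python 3.7).
d = {"banana": 1, "apple": 2, "cherry": 3}
print(list(d))  # ['banana', 'apple', 'cherry'] -- exactly the insertion order

# Sets make no such promise: iteration order depends on hash values and
# insertion history, and can even differ across runs for str elements
# because of hash randomization. Sort explicitly if you need stability.
s = {"banana", "apple", "cherry"}
print(sorted(s))  # ['apple', 'banana', 'cherry']
```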


Sets use a different implementation intentionally (i.e. they are not "a dict without values") exactly because it's expected that they have different use cases (e.g. union/intersection operations).
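For example, the set-algebra operations that motivate the separate implementation:

```python
a = {1, 2, 3}
b = {2, 3, 4}

# First-class set algebra, the primary use case for sets:
print(a | b)  # union: {1, 2, 3, 4}
print(a & b)  # intersection: {2, 3}
print(a - b)  # difference: {1}
```

(Dict key views support some of these operators too, but sets are the intended home for this kind of work.)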


Debugging is a completely different and better animal when collections have a predictable ordering. Otherwise, every dict needs ordering before printing, studying, or comparing. Needlessly onerous, even if philosophically justifiable.
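One common workaround when two dicts were built in different orders (or when ordering isn't guaranteed at all) is to serialize with sorted keys before printing or diffing, e.g.:

```python
import json

d1 = {"b": 2, "a": 1}
d2 = {"a": 1, "b": 2}

# Dict equality ignores insertion order...
assert d1 == d2
# ...but the printed forms differ, which makes textual diffs noisy:
print(d1)  # {'b': 2, 'a': 1}
print(d2)  # {'a': 1, 'b': 2}

# Sorting keys gives a canonical form for logging and comparison:
assert json.dumps(d1, sort_keys=True) == json.dumps(d2, sort_keys=True)
```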


then this will get filed by every corporation against every lawsuit


Sure. But they're probably not going to have significantly different content than their normal motion for summary dismissal, so it adds, say, 25% more work for the courts compared to the normal process of motions for summary dismissal, while greatly benefiting individual defendants.


Is that not an absolute win?


this is often a sign that the design/solution you've chosen is an unfruitful path.

know when to cut your losses and try something different.


as an aside on Nimony aka Nim 3:

can somebody provide a reference explaining/demonstrating the ergonomics of ORC/ARC and in particular .cyclic? This is with a view toward imagining how developers who have never written anything in a non-garbage-collected language would adapt to Nimony.


ORC/ARC is a reference-counting garbage collector. There's a bit of a terminological clash out there as to whether "garbage collection" includes reference counting (it's common for it not to, despite reference counting... being a runtime system that collects garbage). Regardless: what makes ORC/ARC interesting is that it optimizes away some/most counts statically, by looking for linear usage and eliding counts accordingly. This is the same approach taken by the Perseus system in use in some Microsoft languages like Koka and Lean, but ORC/ARC came a little earlier and doesn't do the whole "memory reuse" thing Perseus does.

So for ergonomics: reference counting is not a complete system. It's memory safe, but it can't handle reference cycles well -- if two objects retain a reference to each other, there will always be a reference to both of them and they'll never be freed, even if nothing else depends on them. The usual way to handle this is to ship a "cycle breaker" -- a mini tracing collector -- alongside your reference counting system, which, while a little nondeterministic, works reasonably well.
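CPython is a familiar example of this exact architecture (refcounting plus a tracing cycle breaker), so the failure mode is easy to demonstrate there; this is only to show why the cycle breaker exists, not how Nim implements its own:

```python
import gc

class Node:
    def __init__(self):
        self.other = None

gc.collect()  # start from a clean slate

# Build a two-node reference cycle, then drop our only handles to it.
a, b = Node(), Node()
a.other, b.other = b, a
del a, b

# Refcounting alone cannot free the pair: each node still holds a
# reference to the other. The tracing "cycle breaker" steps in:
freed = gc.collect()  # returns the number of unreachable objects found
assert freed >= 2     # both nodes (plus their __dict__s) were collected
```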

But it's a little nondeterministic. Garbage collectors that trace references, and especially tracing systems with the fast-heap ("nursery" or "minor heap") / slow-heap ("major heap") generational distinction, are really good. There's a reason tracing collectors are used by most languages -- ORC/ARC and similar systems have put reference counting back in close competition with tracing, but it's still somewhat slower. Reference counting offers one advantage, though: the performance is deterministic. You have particular points in the code where destructors are injected, sometimes without a reference check (if the ORC/ARC optimization is good) and sometimes with one, but you know your program will deallocate only at those points. This isn't the case for tracing GCs, where the garbage collector is more along the lines of a totally separate program that barges in and performs collections whenever it so desires. Reference counting has an edge here. (Also in interop.)

So, while you do need a cycle breaker to not potentially leak memory, Nim tries to get it to do as little as possible. One of the tools they provide to the user is the .acyclic pragma. If you have a data structure that looks like it could be cyclic but you know is not cyclic -- for example, a tree -- you can annotate it with the .acyclic pragma to tell the compiler not to worry about it. The compiler has its own (straightforward) heuristics, too, and so if you don't have any cyclic data in your program and let the compiler know that... it just won't include the cycle collector at all, leaving you with a program with predictable memory patterns and behavior.

What these .cyclic annotations will do in Nim 3.0, going by the design documentation, is replace the .acyclic annotations. The compiler will assume all data is acyclic, and only include the cycle breaker if the user tells it to by annotating some cyclic data structure as such. This means if the user messes up they'll get memory leaks, but in the usual case they'll get access to this predictable performance. Seems like a good tradeoff for the target audience of Nim, and a reasonable worst case -- memory leaks sure aren't the same thing as memory unsafety, and I'm interested to see design decisions that strike a balance between burden on the programmer vs. burden on performance, w/o being terribly unsafe in the C or C++ fashion.


The short answer is you'd write your code the same, then add .cyclic annotations on cyclic data structures.

("The same" being a bit relative, here. Nim's sum types are quite a bit worse than those of an ML. Better than Go's, at least.)


notably, the paper uses the capitalization XAI.


average, yes, but living to 70 was reasonably common if you made it past childhood.


I was under the impression living to 70 would have been very rare in, say, 1100 CE


Figure 2 in https://gurven.anth.ucsb.edu/sites/secure.lsit.ucsb.edu.anth... suggests that about 15% of hunter-gatherers would reach age 70.


Not deeply knowledgeable here but imagine this depended quite a bit on where you were living in 1100 CE.

I think it was fairly rare in Europe, but IDK how well those numbers capture what was common for the majority of the human population living elsewhere.


It was pretty rare even among medieval kings to live to be 70.

The first English king to be definitely alive on his 70th birthday (though no longer "in office") was Philip of Spain (jure uxoris) in 1597, so not a medieval king; that is the Early Modern Age.

Elizabeth I didn't make it, though only barely, and so the next to reach 70 was George II in November 1753! Only since the second half of the 18th century has it been common for British monarchs to reach their seventies.

Richard Cromwell lived to be 85, but he was never a king, only Lord Protector.

Edgar Aetheling lived to be 73, but he was never king either, due to a certain William arriving in force from Normandy.


Was this meant for someone else?

I did not dispute that this was likely rare in medieval Europe (for the same reason you cite).


Yeah, it was midnight back here, I possibly chose the wrong thread comment. Sorry.


Medieval kings were warriors and very often victims of assassination, so they had a way lower life expectancy than a typical peasant of their times.


Another commenter raised the ransom point for kings. One of the reasons the higher nobility and the king's household were so visible on the battlefield was so that they wouldn't get killed by mistake.

As for the common folk: if you look at actual medieval cemeteries that have been excavated and studied, the peasants didn't live long either. Age at death can be assessed from the bones, and the above-50 cohort is already somewhat thin, while the above-60 cohort is infrequent.

You underestimate the effects of hunger on mortality. Prior to the introduction of potatoes (~18th century in much of Europe), failed crops were a common occurrence, happening perhaps 5-6 times during the life of a typical rural person. If two of them happened back to back, the resulting mortality was already serious, and older people were often the victims. It made sense to use whatever food was left for the younger, stronger generation, which was still able to work.

Famine was basically never a concern for the royalty. We have a record of an English king going dinner-less once, but that is not a threat to your life.

BTW, if you really want to find a relatively long-lived sector of society, it would be the high clergy, which had all the upsides of noble life (food, warmth in winter) and almost none of the downsides (most wouldn't fight, and murder was less common). This is the only "job" which saw some 70-year-olds still alive and active, mostly as cardinals.


Medieval European nobility tended not to die in battle. They were captured and ransomed. Richard III died in battle, but nobody was gonna ransom him.

Assassination definitely brings down the average. But a fair number of English monarchs managed to die in bed. (I was gonna write British, but no: the Scottish kings practically never died in bed. Unless they were stabbed in their sleep.)


Fair enough on battle, but that's not the only risk of waging war. A fair few kings caught diseases on their campaigns, which they might not have caught at home. From some light Wikipedia browsing, it seemed to me that around 1/3 of medieval English kings died from assassination or battlefield wounds and diseases. Note that Richard III is basically the last medieval king; I'm talking about earlier periods.


you can remap individual LEDs - it's awkward but there's a community firmware that you can use without ever leaving the web editor. the layout language is also pretty raw but it's doable.


I have had literally this exact same experience with the Glove80.

I like everything about it except for the thumb cluster. it is, amazingly, worse than the Ergodox EZ.

I wish the author would spring for a wireless Dygma Defy and tell us how that thumb cluster compares :)


Not the author, but I did that transition myself, and so far I'm pretty satisfied with the Defy.


thanks! I'm not convinced I really care about the concavity - I got the Glove80 because it was cheaper for wireless and ZMK seemed like a safer bet than the Dygma custom layout engine. You're happy with that side of things?


I only have the wireless version. The keyboard management software Dygma makes is spectacular; I was quite pleased. It's quite a step up from the janky web editor Moergo provides, which requires downloading a file and then doing a keyboard re-flashing dance.

