Hacker News | new | past | comments | ask | show | jobs | submit | Underqualified's comments

My company recently decided to move away from Power Automate after having trained lots of people on this 'no code' platform and then having to hire expensive consultants to support or flesh out the apps those users had actually written.

We were talking about TVs recently in the office and pretty much everyone agreed that even 4K is overkill for most TVs at a reasonable viewing distance.

I got a 65-inch TV recently, and up close HD looks pretty bad, but at about 3m away it's fine.


One just needs to get a big house so that the 8K 77-inch TV can be put 6m+ away (/s). See https://www.rtings.com/tv/reviews/by-size/size-to-distance-r... for more accurate distance calculations.

For normal-size apartments, 55-65 inch is indeed the best size considering the distances. It's a pity that manufacturers don't make 8K 55-inch TVs.
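The rule of thumb behind those distance calculators can be sketched quickly. A minimal Python sketch, assuming the common ~1 arcminute visual-acuity criterion and a 16:9 panel (the helper name is made up for illustration):

```python
import math

def min_distance_m(diagonal_in, horiz_px, aspect=(16, 9)):
    """Distance (in metres) beyond which individual pixels are no
    longer resolvable, using the ~1 arcminute acuity rule of thumb."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # panel width from diagonal
    pixel_pitch_in = width_in / horiz_px            # size of one pixel
    one_arcmin = math.radians(1 / 60)               # acuity limit, in radians
    return pixel_pitch_in / math.tan(one_arcmin) * 0.0254

print(round(min_distance_m(65, 1920), 2))  # 1080p on a 65" panel: ~2.58 m
print(round(min_distance_m(65, 3840), 2))  # 4K on a 65" panel: ~1.29 m
```

Which roughly matches the experience above: on a 65" screen, HD stops looking bad somewhere around 2.5-3 m, and 4K only pays off if you sit closer than about 1.3 m.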


These calculators never made much sense to me. The whole point of getting a bigger screen is to cover a larger portion of your field of view. Sitting close to a giant TV is like being in your own cinema.

The point they use is that our vision angles are fixed, and if you sit too close you won't see the whole screen at once. What's the point of a bigger screen in that case? Also, the viewing angles won't let you see that juicy colorful picture at the sides even if you turn your head.

As for the cinema, there is a limited number of seats where you can get the best immersion, and they are in the sweet spot of screen size/distance.


Those calculations are way too conservative for the screen to cover your field of view. Unless you mean your fovea but then that's much smaller. Also depending on the display tech you can see colors at angles just fine.

> if you sit too close you won't see the whole screen at once

Human FOV is very nearly 180 degrees.


This resonates with me, but I quit programming about a decade ago, when we were moving from low-level coding to frameworks. It was no longer about figuring out the actual problem but about figuring out how to get the framework to solve it, and that just didn't work for me.

I do miss hard thinking, I haven't really found a good alternative in the meantime. I notice I get joy out of helping my kids with their, rather basic, math homework, so the part of me that likes to think and solve problems creatively is still there. But it's hard to nourish in today's world I guess, at least when you're also a 'builder' and care about efficiency and effectiveness.


Many years ago I read the classic 'How to win friends and influence people' and I was just hit with, according to that book, how little people actually care about other people and how fundamentally lonely our existence is.

I don't think that was the message the book was trying to give, but that's what I got out of it.

So yes, people will wonder, subconsciously or not, what's in it for them. If you can give status or if you are naturally entertaining, this might all seem a little less obvious.


Do people think they taste the same? I can normally tell them apart by smell alone.


Sure it can, that's why healthcare is so profitable.


Manufacturing, close to the floor, where you often need to react fast and start in the morning with no idea what you will be doing that day, yet will still have more work than hours.


I've seen Usenet die, so I'm not so sure. What Reddit does (and Usenet before it) is really niche. There aren't as many people on Reddit as there are on Facebook, and Reddit isn't entangled with people's 'real' life as much as social media is.


You are right about your last sentence, but Reddit is anything but niche. It attracts even politicians and other celebrities, albeit typically only for Q&As. Arnold Schwarzenegger and William Shatner post there for fun under their real names; who knows how many post under pseudonyms. The beauty of Reddit is that it attracts even non-technical people who share insight into their domain (usually anonymously, so to be taken with a grain of salt - but it's usually plausible precisely because Reddit is so popular).


Reddit has grown significantly the last few years, but it still has only about 50-60 million daily users, compared to Facebook's nearly 2 billion daily users.

Don't get me wrong, I like Reddit. Before it I used Usenet, which died and after Usenet I was active on forums (most of those also seem to be gone now). I really prefer those topic based social media to the other user-based social media. But I've seen enough of them die before.


I loved Usenet before the mixture of web forums and spam killed it... we need something like this again - decentralised platforms without a decentralised community.


The GUI apps have the benefit of being easier for onboarding. We've redesigned the workplace to deal with constant employee turnover.

I guess they also make more sense to management since it looks like something they could do themselves, or at least understand.


You can have both. GUIs were a breakthrough because they enabled much better discoverability, allowed images in the UI and so on. But they were also designed to be fully keyboardable and low latency.

Web tech broke all that:

- UI was/still is very high latency. Keystrokes input whilst the browser is waiting do not buffer, unlike in classical mainframe/terminal designs. They're just lost or worse might randomly interrupt your current transaction.

- HTML has no concept of keyboard shortcuts, accelerator keys, menus, context menus, command lines and other power user features that allow regular users to go fast.

We adopted web tech even for productivity/crud apps, because browsers solved distribution at a time when Microsoft was badly dropping the ball on it. That solved problems for developers and allowed more rapid iteration, but ended up yielding lower productivity than older generations of apps for people who became highly skilled.


Well, browsers solved multiple other issues too: cross-platform apps, updating all clients in a single place, sharing data between devices, and, most importantly for many developers, switching software from an ownership to a rental model, killing piracy, and giving easy access to user metrics and data.

All of these (except logging on to the same data from all my devices, which is nice) benefit the developer at the expense of the user.


> All of these (except logging on to the same data from all my devices, which is nice) benefit the developer at the expense of the user.

Glad you pointed that out. And, in the most prevalent application of Conway's law[0], those changes enabled and are entrenched by the "agile" practices in software development. Incremental work, continuous deployment, endless bugfixing and webapps fit each other like a glove (the latex kind that's used for deep examination of users' behavior).

It also enables data siloes and prevents any app from becoming a commodity - making software one of the strongest supplier-driven markets out there, which is why the frequent dismissal of legitimate complaints, "vote with your feet/wallet", does not work.

----

[0] - https://en.wikipedia.org/wiki/Conway%27s_law


Yes, "updating all clients in one place" is what I meant by distribution. Windows distribution suffered for many years from problems like:

- Very high latency

- No support for online updates

- Impossible to easily administer

Cross platform was much less of a big deal when web apps started to get big. Windows just dominated in that time. Not many people cared about macOS Classic back then and desktop UNIX didn't matter at all. Browsers were nonetheless way easier to deal with than Windows itself.

Agree that killing piracy was a really big part of it. Of course, you can implement core logic and shared databases with non-web apps too, and the web has a semi-equivalent problem in the form of ad blockers.


You missed privacy. The user lost privacy with webapps.


I figured that came under "easy access to user metrics and data". I did consider some kind of rhyme linking piracy to privacy, but it was a little early in the day to commit that sin. It's probably worth mentioning twice anyway.


> HTML has no concept of keyboard shortcuts, accelerator keys, menus, context menus, command lines and other power user features that allow regular users to go fast.

HTML has had a limited concept of accelerator keys for years, but it's not pretty:

https://developer.mozilla.org/en-US/docs/Web/HTML/Global_att...
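For illustration, that limited mechanism is the global `accesskey` attribute. A minimal sketch (the actual modifier varies by browser and OS, e.g. Alt on Windows/Linux, Ctrl+Alt on macOS, which is part of why it's "not pretty"):

```html
<!-- Modifier+S focuses the search box; Modifier+N activates the link.
     The browser decides what "Modifier" is, and keys can collide with
     the browser's or OS's own shortcuts. -->
<input type="search" accesskey="s" placeholder="Search">
<a href="/new" accesskey="n">New</a>
```

In practice most web apps skip `accesskey` and reimplement shortcuts with JavaScript keydown handlers instead.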


This is a good observation. Constant employee turnover also reduces worker productivity, as it means most current employees are juniors in their role (regardless of what their title says).


Problem is the GUI could have shortcuts for everything, but usually won’t.

It doesn’t help that the evaluators for a new system will also approach from the perspective of a new user, even though none of them will be a new user in some months.

I’ve so wanted to create AutoHotkey shortcuts for many tasks, but I end up having to use (x,y) clicks, where I get boned by every design touch-up (deliberate or a side effect of another change).


They are now using a 15-minute delay to avoid outside communication. Jamming signals is also entirely possible.


The method in this article uses no outside communication or signalling at all.


But the Moon Ribas example requires outside communication; I'm just saying that it seems possible to prevent that from being used effectively.

As for electronics, I assume most can be detected? Maybe implants might pose a problem?

