Yes, immich is incredible! I have my entire family's photos (not all technical users, mind you) synced and backed up thanks to the immich server and its phone app. The web UI is easy and fast, and they've even started adding basic editing capabilities recently.
I see a decent amount of skepticism and negativity in this thread, but I'm glad to see that someone is funding projects like this.
Even the previous-gen Pis would easily throttle under load without a heat sink (although usually installing a decent heat sink was enough to resolve the problem, assuming decent airflow). As the article notes, the heat was mainly from the SoC in the past. The Pi 4 is on a whole different level indeed.
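(For anyone curious whether their Pi is actually throttling: the firmware exposes a status bitmask via `vcgencmd get_throttled`. Below is a minimal sketch that decodes that bitmask, with bit meanings taken from the Raspberry Pi firmware docs; `vcgencmd` only exists on the Pi itself, so the sample value here is just illustrative.)

```python
# Decode the bitmask returned by `vcgencmd get_throttled` on a Raspberry Pi.
# Bit meanings per the Raspberry Pi firmware documentation: the low bits
# describe the current state, bits 16-19 record whether each condition
# has occurred since boot.

FLAGS = {
    0: "under-voltage now",
    1: "ARM frequency capped now",
    2: "throttled now",
    3: "soft temperature limit now",
    16: "under-voltage has occurred",
    17: "ARM frequency capping has occurred",
    18: "throttling has occurred",
    19: "soft temperature limit has occurred",
}

def decode_throttled(value: int) -> list[str]:
    """Return human-readable names for every flag set in `value`."""
    return [name for bit, name in FLAGS.items() if value & (1 << bit)]

# Example: 0x50005 means under-voltage and throttling, both currently
# and at some point since boot.
print(decode_throttled(0x50005))
```

On a real Pi you'd feed it the hex value from `vcgencmd get_throttled`, e.g. `decode_throttled(int(output.split("=")[1], 16))`.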
I imagine the logic here is the same as with many other Pi accessories: if you need it, you'll buy it or get it as part of a bundle. In some cases throttling is not a huge issue. But we're a far cry from the simple plug-and-play Pis of the past... Another issue is power -- you can't simply run the newer Pis reliably off any old cell phone charger you have laying around.
Yeah, I'm kinda wondering just what the Pi is supposed to be at this point. Is it still an inexpensive low-power device for teaching basic computer science? Because it seems like it's more for making Kodi boxes or the world's most powerful LED blinkers.
I'd love to see some kind of reasonable data on the breakdown of how many Pis actually end up in education environments vs RetroPie/Kodi/PiHole/whatever else "service on an SD card image" turnkey solution type thing people use them for. At this point, my guess/gut feel is that education is an increasingly small part of the market for the Pi, yet appears to still be the main focus of the Pi Foundation.
I've always felt the educational focus was much more about the rose-tinted memories of parents who grew up in the homebrew computing generation in the UK. In that regard, it feels much more like a product for me than for my kids. Eben Upton et al. are of this generation as well. British computer people of a certain age love to remember their ZX Spectrums etc.
Why does it need to be either/or? The way I look at it, every single hobbyist, retropie, dust collector, etc is just enabling the scale that allows them to deliver the hardware at the price point.
If 100% of Pi went into schools, the foundation would likely have dried up 4 years ago. We might not be the target market, but we enable it. We enable the development, the third-party/after-market, we provide the community. The foundation provide the "noble goal", and we provide the cashflow.
Because the two goals are somewhat incompatible. The more powerful they make the hardware to please one crowd, the less they can meet the "low power" goal, and to a lesser extent the price goal (you need a fan now, etc.).
I'm not sure low power was ever a goal. They've chosen USB as the power source, for whatever reason, and fairly consistently managed to just scrape (or just miss) the power budget. I mean, this is a device with no power management (other than temperature throttling), no sleep states, etc. If low power was the goal, they've managed to put out a successful product in spite of themselves.
It feels to me like they've only ever had two solid targets. One is the price point, and they've shed everything that stood in the way of it; the other has been a rather solid attachment to backwards compatibility. (It often feels like the model A only exists because their original claim was a $25 computer, and the A means they technically stuck to it - despite it being one of their less popular models.)
I think another commenter hit the nail on the head though - the whole thing feels like an emotional attachment to the way computing was learnt in bedrooms in the late 80s / early 90s - especially the success of the BBC Micros (which I believe the model A & B are named after). Hooking up turtle bots and sticking wires straight into parallel ports makes mail merges feel like a hollow shell of computing. I think that's what the Pi is trying to bring back (in a manner that makes the computer itself cheap enough to be disposable, rather than some expensive relic that you're afraid to mess with).
(Side ramble: I learnt computing (at school) in the UK in the era directly after this. As strange as this will likely sound to anyone who isn't British, it was the era when various supermarkets kindly volunteered to replace all our beebs with nice new Windows PCs. We went from wiring weird and wonderful things into parallel ports, to seeing computers as these expensive things that "we" had worked hard for. They went from being tools to being appliances. They weren't to be messed with, modified and tortured - they were to run Claris and Publisher, and later Netscape. My work now owes more to replacing burnt-out serial controllers in Amigas than to anything I learnt at "high school" level computing. We spent a decade or two insulating students from any nuts-and-bolts understanding, and modern environments are getting worse, not better. So I see Arduino and Raspberry Pi as the antidote to being taught "computing" on an iPad.)
I can't agree with that conclusion. Where has the "low power" goal come from (as if 15W wasn't low power anyway)? And as has been said already, you 100% do not need to actively cool a RPi 4 to teach kids to code with it!
That said, there is a huge number of applications for overpowered LED blinkers, and no lack of people to use them. And there is a great deal of educational value in having those boards available and easy to get.
Besides, it's becoming hard to find people without access to a machine where they can learn basic computer science. So this one niche is closing down, while the Pi is still unbeatable on hardware hacking.
Could you elaborate on why you think Pi 4 is "overpowered" for Kodi?
From what I've seen, there are still some kinks playing 4K HEVC videos. So hopefully, when software/firmware/etc. catches up, it should be "just powerful enough" for a Kodi box, all because of hardware decode.
Barely handling 4K does not scream "overpowered for a media center" to me.
That’s just it, though. The official Pi power supply is adequate and good for powering the Pi in any condition, but the case is a far cry from adequate. At best, it just toasts everything inside more evenly, at worst it contributes to throttling because now the heat from every other part of the Pi forms a hot thermal blanket over the CPU.
I’m amazed they didn’t at least add a few passive ventilation holes or slots, somewhere. It gets crazy hot inside the case.
Yeah, snagging the official case was a big letdown imo. Having to modify it just so the board will operate for a generic use case was a sad moment for me. Glad to see your solution though - time to bust out my old hole saw kit!
I'm sad about the case too; it needs some extensive Dremel work to allow prolonged database queries, movie transcoding, or basically anything I'd like to use it for.
Exactly. It will be interesting to see how something like this sells.
I have a feeling that it will fill a niche, but not be a hugely popular option with most MBP users. Folks can whine about the lack of ports, but in a few years USB-C devices will be quite common and reliable.
For something like ethernet, I'd honestly rather "haul" a small dongle around.
That thing does look like it suffers for having an Ethernet port, because it has to be taller to accommodate it. I would similarly rather have an Ethernet dongle than a much thicker laptop.
>> Folks can whine about the lack of ports, but in a few years USB-C devices will be quite common and reliable.
For most people, dongles are probably a non-issue - you'll get used to it.
But the thing with dongles is that a) you can forget them, b) they can stop working -- and that usually happens at the most inopportune time (in front of a client during an important meeting).
I've killed plenty of the MiniDisplayPort to [VGA/HDMI/DVI] dongles just based on the repetitive flexing where the cable meets the DP plug.
Awesome! This is exactly what I've been hoping for. The state of terminal emulators on macOS is particularly bad, at least when it comes to speed. Both the built-in term and iTerm have a lot of features, but really start to lag on big screens with a lot of text. I used to run urxvt under XQuartz for this reason, but there are scaling problems with retina screens these days.
Nice work. Hopefully this can fill a particular void for folks that want no-frills fast terminal emulation.
I can't remember the last time I thought that my terminal was too slow, except maybe when I tried that Electron-based program a while back (Hyper, I think it was). My priorities are more like the following:
1. Stability. I crashed Alacritty thirty seconds after opening it; possibly related to issue #12.
2. Emulation correctness. In Terminal.app, the cursor often gets out of sync when I "turn the corner" (i.e., backspace across a line boundary).
3. Font rendering. Text in some terminals just looks ugly.
4. Features. Alacritty doesn't seem to show the number of rows and columns when I resize. Scrollback!
This looks like a really interesting project, but it seems really strange to make performance such a high priority. I tried the find /usr test, and it seemed equally fast in Terminal.app and Alacritty.
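(One way to make a throughput difference visible, if it exists: time a raw burst of output from inside each terminal. With stdout attached to a tty, the writing process is paced by how fast the emulator consumes and renders the text, so the elapsed time roughly reflects the terminal's cost. A rough sketch; the line count and width are arbitrary:)

```python
import sys
import time

def spew(lines=100_000, width=80):
    """Print a burst of fixed-width lines and return the wall-clock time.

    Run this inside each terminal being compared; when stdout is a tty,
    the elapsed time is dominated by how fast the terminal consumes the
    output. (When redirected to a file or pipe, it mostly measures I/O.)
    """
    payload = "x" * width
    start = time.monotonic()
    for _ in range(lines):
        sys.stdout.write(payload + "\n")
    sys.stdout.flush()
    return time.monotonic() - start

if __name__ == "__main__":
    elapsed = spew()
    # Report on stderr so the measurement isn't mixed into the burst itself.
    print(f"{elapsed:.3f}s", file=sys.stderr)
```

This is cruder than a real benchmark (it ignores escape-sequence handling, scrollback, wrapping, etc.), but it's less noisy than `find /usr`, whose runtime is partly disk-bound.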
Just curious: how many folks can accurately touch-type the Fn-keys? Personally they're too far away from the home row for me to hit accurately, but maybe that's because I use too many different keyboards and they're all slightly different with spacing. The esc key is easy with it being located in the corner, though.
So in my case, I would already look down to hit an F-key, and I imagine functionality such as "compile" would now be represented by a touch button. So no problem there. And point #2 is just ridiculous -- today's MacBook CPUs and Memory are faster and consume less power than they did 6 years ago.
Touch typing or not, I think you are missing an important point here with regard to programming - tactile feedback. Doubly so with anyone that uses mechanical keyboards like me.
For example, you may not know where the "f8" key is on your keyboard, but once you look down and locate it, you won't have trouble repeatedly pressing it, accurately, with velocity. A good example of where and why this pattern happens is debugging. A lot of IDEs and editors are set up by default to use f-keys for debug or other ancillary functions like build, specific menus, etc. I believe IntelliJ and Visual Studio in at least a few default setups and versions did this.
More specifically, I can't imagine pressing a touch button possibly dozens of times, sometimes rapidly, to advance through a bunch of breakpoints, set new breakpoints, eval things, etc. It is true you could just map these to other keys, but that becomes an issue with anything that is using default keymaps for functions as I describe. Additionally, touch buttons push people toward using more and more keyboard chords as they shuffle things around in their keymaps, which a lot of people dislike. I'm an Emacs and IntelliJ user primarily, so the former and the arthritis in my hands are well acquainted with keyboard chords and complex mapping sequences.
I have not used this keyboard, obviously, but it seems to me from my experience with similar tech that this is only good for much-less-used macro-like or launch actions. Useful still, yes, but I don't think it is a 1:1 replacement for function keys. I said it in another article and I'll say it here: I don't think this keyboard setup is aimed at programmers, but for most consumers it is likely just fine, as much as I hate it. As for value added, that's another discussion.
I do know that if I ever buy or am forced to use one of these, it will always be with my own keyboard. I don't mind the chiclets as much as some people, but at home or the office I use external monitors and mechanical keyboards whenever I can.
Honestly, I've always felt that Fn keys were an outdated and faulty design. They're too far away to reach gracefully, usually offset from the standard zig-zag of other keys, and awkwardly multi-roled with screen/sound options on most keyboards.
If I'm in an IDE that uses a function key for something like 'compile', I usually rebind it for my own sanity. I have mixed feelings about this update (contextual buttons are often terrible, and I don't mind escape), but losing Fn won't really matter.
I run several Java applications on a FreeBSD server without problems. Is there a particular subset of Java features that don't work other than on Linux? I also haven't had any issues with Java on macOS.
When I put on my pointy corporate IT hat, and I go to the Oracle web site to download Java, I see that Java runs on Linux, macOS, Solaris, and Windows. Then when I go to download WebSphere, I see that it runs on a similar set plus AIX. Java support for Linux comes from the vendor, while Java support for FreeBSD comes from the "community." Corporations will generally not depend on infrastructure that does not have reliable vendors who will sell them support contracts. Linux has this ecosystem in place. FreeBSD does not.