
> Every time a manufacturer says vague descriptions like "security" or "performance" fixes, be wary - they're probably removing perfectly working functionality for "reasons"

I have a pair of WF-1000XM3s and this is painfully true. ANC was brilliant on these until I naively updated, and whoosh - instantly and grossly degraded ANC. Before the update I could barely hear people talking at a distance, keyboard chatter, city traffic, etc.; now I hear all of it, no matter the app settings.

I wanted to upgrade to the in-ear XM4s, but after this? NEVER again, Sony. At least for portable audio. Instead I got a pair of cheap QCY HT07s (then $28, now ~$20) and was quite surprised by their ANC performance: it easily beats the crap out of the XM3s on the latest firmware, and gets close to them in audio quality. Which says a lot about Sony "updates".


Not to defend Google, but they end up saying much the same:

> The next challenge for the field is to demonstrate a first "useful, beyond-classical" computation on today's quantum chips that is relevant to a real-world application. We’re optimistic that the Willow generation of chips can help us achieve this goal. So far, there have been two separate types of experiments. On the one hand, we’ve run the RCS benchmark, which measures performance against classical computers but has no known real-world applications. On the other hand, we’ve done scientifically interesting simulations of quantum systems, which have led to new scientific discoveries but are still within the reach of classical computers. Our goal is to do both at the same time — to step into the realm of algorithms that are beyond the reach of classical computers and that are useful for real-world, commercially relevant problems.


Up to that point, most Windows iterations didn't require big (for the time) hardware upgrades to run fine. Many PCs designed for a given version could run the next, maybe with a little elbow grease, but the bottom line is: the out-of-the-box experience on new AND upgraded PCs was mostly okay.

Then came Vista: an OS designed for at least 1 GB of RAM, preinstalled on machines that vendors stubbornly refused to ship with more than 512 MB (even 384 MB, the horror!) for a looong time. I remember that, at least where I live, RAM prices skyrocketed just months after Vista came out, because almost everyone desperately needed the upgrade.

It also didn't help that vendors were happy to fill new systems with their auto-installing crapware. While this wasn't Microsoft's fault, it certainly helped cement Vista's reputation as a very heavyweight OS.

Having said that, I concede the point that Vista was pretty alright, provided your PC had the grunt to run it.


Oh wow. Count me in the "I didn't know I had that!" camp.

When younger I struggled horribly with ALL things math, and to this day I still do. OTOH I've always had a knack for DIY involving measurements: lengths, rhythms, quantities, sizes, you name it. I just invoke my own "dynamic mind ruler" for the task at hand and usually get it right on the first try. Cooking something new? I intuitively know the proper amount of ingredients and spices. Doing work on a friend's car? That nut looks like a 3/4 and that one an 11/16, and who the heck put an 11mm in place of a 7/16??

Incidentally, the whole concept of time always flows from right to left for me. 1000BC is waaay to the right, and 2030AD is just a stone's throw away to the left. Now I wonder whether it's something only I perceive that way, or everyone does.


Are you left-handed?


My course of action was to run Memmaker, and let it take care of (almost) everything. It usually worked fine. Sometimes I had to fine-tune config.sys/autoexec.bat to make a bit more room or disable EMS, but those were edge cases.

Now that I think about it, it'd be fun to know how Memmaker works internally, but I can't seem to find info on that. Maybe no one has done such analysis... yet.


> Now that I think about it, it'd be fun to know how Memmaker works internally, but I can't seem to find info on that. Maybe no one has done such analysis... yet.

It has been a long time...doesn't it just go through and LOADHIGH all the obvious possibilities? I feel like it maybe had some library of candidates it would check.

I never had this program; I think I did it all manually after analyzing config.sys and autoexec.bat on systems with more free RAM than I had. It makes me realize how much knowledge I just mined from school computers.
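
If memory serves, Memmaker's trick was to reboot a couple of times with a sizing stub loaded, measure how much memory each driver/TSR actually needed, and then rewrite CONFIG.SYS and AUTOEXEC.BAT with DEVICEHIGH/LH placement hints. The end result typically looked something like this (a from-memory sketch; the exact drivers, regions and sizes varied per machine):

    REM CONFIG.SYS after a typical Memmaker pass
    DEVICE=C:\DOS\HIMEM.SYS
    DEVICE=C:\DOS\EMM386.EXE NOEMS
    DOS=HIGH,UMB
    DEVICEHIGH /L:1,12048 =C:\DOS\SETVER.EXE
    DEVICEHIGH /L:2,42384 =C:\DRIVERS\CDROM.SYS /D:MSCD001

    REM AUTOEXEC.BAT
    LH /L:1,9072 C:\DOS\SMARTDRV.EXE
    LH /L:2,27856 C:\DOS\MSCDEX.EXE /D:MSCD001
    LH /L:0;1,43392 C:\MOUSE\MOUSE.COM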


Out of curiosity, did you need some particular feature in such an app? I've found that Google Lens is pretty darn good at identifying plants, insects, fungi and whatnot (assuming your camera has a decent macro mode).


The only problem with Lens is that it is "magic" and doesn't have a failure state beyond giving junk results should it fail. I don't think I would ever trust it for a "can I eat this" indicator on a mushroom with how many visual lookalikes there are out there. What if the contrast isn't good enough to catch colorations and the gills are not in sight?

Merlin Bird ID is so good in comparison, probably the best in the "ID this thing" category of apps I have ever tried. Photos do a lot, but if you don't get a good ID it will ask some questions about the bird's behavior and your circumstances to narrow down your search.


Even if you identified the mushroom species, sometimes that is not enough to know whether it is safe to eat. The same species can be edible (perhaps after some soaking) or dangerously poisonous depending on the geographic area where it grew.


Mushrooms are literally one of the categories of things that are "DO NOT EAT UNLESS YOU'RE 100% CONFIDENT AND KNOW THROUGH YOUR OWN KNOWLEDGE" types. (Caps used because it is a yelling thing)

Seriously, just don't eat any mushroom that you can't personally identify with 100% confidence from your own knowledge and references, with some app saying it's XYZ not counting as a reference.

Even experienced mycologists have sometimes made mistaken identifications, so anyone else should be cautious enough to presume poison in all cases except the most certain.

(Obviously store bought are an exception)


You cannot market nature, but you CAN market everything around it: tourism, clothing/footwear, camping gear, even technology (cameras, GPS, etc.). All of it to give you the "upper hand" over your peers - and of course, you can buy that from us!


Same here, and it's not region-targeted (I'm not from US/EU). This is borderline FUD.


I'm still looking for an open ECU focused only on old, carbureted cars. Projects like Speeduino, rusEFI, MegaSquirt, etc. are wonderful, but I have no use for them because they're more geared towards fuel-injected engines (though some do have ways to run injector-less).

Case in point: I have a Renault R5 GTL with a Renix pseudo-ECU module [1] taken from a Renault R11 TX, along with its camshaft, head, double-barrel carb and other bits. It has inputs only for manifold vacuum & crankshaft sensor (some support a knock sensor, mine does not). A crude ROM map derives the advance curve from just those two parameters. It works really well, but that's it: no way to change or fine-tune anything.
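
(For anyone curious, the guts of a module like that boil down to a small 2D table lookup. Below is a rough C sketch of the idea; the axis breakpoints and advance figures are made up for illustration, not real Renix calibration data.)

    /* Sketch of a Renix-style spark-advance lookup: a small ROM table
       indexed by engine speed and manifold vacuum, with bilinear
       interpolation between cells. All numbers are illustrative. */
    #include <stdio.h>

    #define N_RPM 4
    #define N_VAC 4

    static const int rpm_axis[N_RPM] = { 800, 2000, 3500, 5000 }; /* rpm */
    static const int vac_axis[N_VAC] = { 100, 300, 500, 700 };    /* mbar of vacuum */

    /* Advance in degrees BTDC; rows = vacuum (load), cols = rpm. */
    static const float advance[N_VAC][N_RPM] = {
        {  8.0f, 14.0f, 22.0f, 28.0f },  /* heavy load (low vacuum)  */
        { 10.0f, 18.0f, 26.0f, 32.0f },
        { 12.0f, 22.0f, 30.0f, 36.0f },
        { 14.0f, 26.0f, 34.0f, 38.0f },  /* light load (high vacuum) */
    };

    /* Clamp x to the axis range; return the lower cell index and a 0..1 fraction. */
    static void locate(const int *axis, int n, float x, int *i, float *f)
    {
        if (x <= axis[0])     { *i = 0;     *f = 0.0f; return; }
        if (x >= axis[n - 1]) { *i = n - 2; *f = 1.0f; return; }
        for (*i = 0; x > axis[*i + 1]; (*i)++)
            ;
        *f = (x - axis[*i]) / (float)(axis[*i + 1] - axis[*i]);
    }

    float spark_advance(float rpm, float vac)
    {
        int ir, iv;
        float fr, fv;
        locate(rpm_axis, N_RPM, rpm, &ir, &fr);
        locate(vac_axis, N_VAC, vac, &iv, &fv);
        /* Bilinear interpolation between the four surrounding cells. */
        float lo = advance[iv][ir]     + fr * (advance[iv][ir + 1]     - advance[iv][ir]);
        float hi = advance[iv + 1][ir] + fr * (advance[iv + 1][ir + 1] - advance[iv + 1][ir]);
        return lo + fv * (hi - lo);
    }

    int main(void)
    {
        printf("advance at 2800 rpm, 400 mbar: %.1f deg BTDC\n",
               spark_advance(2800.0f, 400.0f));
        return 0;
    }

A tunable version of that (the table in flash or EEPROM instead of masked ROM) is basically all I'm asking for.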

> while testing showed it produced more power and better emissions

A customizable "ECU" similar to the Renix approach that does EXACTLY this would be a godsend to me (and also to my local old Fiat/Lada/you-name-it heads). At these small displacements, we just don't care how the engine sounds: every HP gained is a win.

[1] http://boursinp.free.fr/pdgdiag2.htm

PS. Yes, I know a 123ignition dizzy is what I need, but they're way too expensive - and proprietary. Where's the fun in that?


Electromotive sells a waste-spark system they call "XDi" that seems to be the go-to replacement ignition system for the earlier carbureted Ferrari folks. Perhaps something to look at if you haven't already?

> and also my local old Fiat/Lada/you-name-it heads

If you've got any Fiat friends who have a car with a Digiplex ignition system that needs a replacement, my ignition units will work with many of them as well, as the Fiat units are basically identical to the Ferrari units but with different curves in them. I've never actually sold a unit to a Fiat customer, but I do have curve information on several Fiat and Lancia cars. The exterior boxes are 100% identical, and I've used some of the Fiat boxes as cores before (I use the old boxes to make the new units, as sandcasting new replacement aluminum boxes would be way too cost prohibitive)

> 123ignition dizzy

I've heard good things about those distributors, but I've never played with one myself. I've used the Pertronix solid-state triggers to replace points in a distributor plenty of times, but several of my mechanic friends now use the 123ignition units instead.


Don't even need Pertronix. You can use a TFI module from 1990's Ford vehicles and trigger it directly from the points distributor, or build a simple circuit with 1 transistor to invert the signal and trigger an HEI ignition module.


This is true, but the Pertronix units are cheap, easy to install, easy to remove and put points back in, and readily available. The downside I've found from them is that their reliability isn't perfect, but they're definitely better than points!


An OEM TFI or HEI is even cheaper and more reliable. If you splice everything with bullet connectors and keep the points coil with you, it's the same thing to go back to points, with no tools.


There's at least one Speeduino-based model aimed at carbureted cars:

https://wtmtronics.com/product/carbumate/


Check out this guy. He built what he calls a "carb cheater". It's a Chevy Idle Air Controller (IAC) from a TBI setup, and is essentially a very controlled vacuum leak. Using it, he's been able to pull off some pretty impressive feats. It doesn't do timing, but it might be the level of hackery and the kind of tinkering you're looking for, I think.

https://www.youtube.com/watch?v=FwhMz2kR4pw


You can use a single injector at the intake (Throttle Body Injection, or TBI) to replace the carburetor's metering with a fuel injector. A lot of domestic manufacturers did this for a time when they were struggling to convert to EFI.

For the MegaSquirt (at least), the default metering is speed-density, which sounds like what your Renix uses, and that's the minimum needed to run the MS, unless you go for Alpha-N fuel mapping instead.

https://www.holley.com/blog/post/fuel_injection_fundamentals...
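
To make "speed-density" concrete: the controller estimates the air mass drawn in per intake event from manifold pressure, intake air temperature and a volumetric-efficiency table (via the ideal gas law), then sizes the injector pulse for the target air/fuel ratio. Here's a minimal C sketch of that calculation; it is not MegaSquirt's actual code, and the constants (cylinder volume, injector flow, dead time, fixed VE) are placeholders, not a real tune.

    /* Minimal speed-density fuel calculation sketch. Illustrative only. */
    #include <stdio.h>

    #define R_AIR        287.0f     /* specific gas constant of air, J/(kg*K)   */
    #define CYL_VOL_M3   0.000325f  /* one cylinder of a ~1.3 L four, in m^3    */
    #define TARGET_AFR   14.7f      /* stoichiometric AFR for gasoline          */
    #define INJ_RATE_GS  2.5f       /* injector flow, grams of fuel per second  */
    #define INJ_DEAD_MS  1.0f       /* injector opening dead time, ms           */

    float pulse_width_ms(float map_kpa, float iat_c, float ve)
    {
        /* Ideal-gas estimate of the air mass trapped per intake stroke. */
        float air_kg = (map_kpa * 1000.0f * CYL_VOL_M3 * ve)
                       / (R_AIR * (iat_c + 273.15f));
        float fuel_g = air_kg * 1000.0f / TARGET_AFR;
        /* Convert fuel mass to injector on-time and add the dead time. */
        return fuel_g / INJ_RATE_GS * 1000.0f + INJ_DEAD_MS;
    }

    int main(void)
    {
        /* Example: 60 kPa manifold pressure, 30 C intake air, 80% VE. */
        printf("pulse width: %.2f ms\n", pulse_width_ms(60.0f, 30.0f, 0.80f));
        return 0;
    }

Alpha-N replaces the MAP input with throttle position plus RPM, which is handy when a lumpy cam makes the vacuum signal too noisy to use.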


(bit of a rant here, you've been warned)

As someone who could've developed an IT/programming career, but didn't because I felt things were already bloating back in the '00s, I agree with the majority: "harvesting your own food" can be rewarding, but it's also a tedious and thankless job. It's certainly not for everyone, but if it works for some people then it is (let's put efficiency aside for a moment) perfectly valid. In fact, being more of a H/W guy, I find myself gravitating towards this approach more often than not. Leanness and reproducibility are key for my workflow (I went down the RF-world path); I can't afford different end results when a dependency changes/breaks something.

IMHO, keeping up with the modern paradigms of S/W development looks like a never-ending nightmare. Yes, it's the modern way; yeah, it's the state of the art. Still, I didn't feel it was a wise investment of my time to learn all those "modern dev" ropes, and I still feel that way 20 years later. I'm nowhere near antiquated and I'm on top of all things tech (wouldn't read HN otherwise), it's just...

I see former friends/classmates who went this way, and they're in a constant cat-and-mouse game where 50% of the time they're learning/setting up something dev-chain related, the other 50% doing actual work, and 98% of it feeling way too stressed. I see modern Android devices with their multi-MB apps, bloated to hell and beyond for a simple UI that takes ages to open on multi-core, multi-GHz SoCs. I see people advocating that unused RAM is wasted RAM, never satisfied until every byte is put to good use, reluctant to admit that said good use is just leaving the machine there "ready" to do something, but not actually doing anything _productive_.

And yet.

Without that bloat, without the convenience of pre-made libraries and assist tools for almost every function one could desire, we wouldn't be where we are now. Imagine for a moment doing AI work, 3D movie rendering, data science, etc. with a DB-less approach on single-core machines with every resource micro-managed to eke out the most performance. It's simply not feasible; we would still be in the '90s... just a bit more hipster.

This article resonates so well with me. And at the same time, it feels so distant.


I was trying to square the same tension in my mind when I made the OP. And the compromise I arrived at was, "try to find people with complementary interests to organize with." That's really what "software with thousands of users" boils down to. If programmers take the lead while software is still small and approachable, and non-programmers coalesce around their forks rather than around upstream, we might slowly evolve towards the hazy societal organization I'm vaguely pointing in the direction of.

But an essential component of this plan is for non-programmers to articulate early and often their desire to migrate away from the current monopoly they are forced to use.


Of course we non-programmers want to move away from big-corp environments. Here's my 50 cents on what would be ideal for me... modular software, easy to assemble, no code. And if a module is not available, I'd be happy to pay a (reasonable) amount to get it done. All of this open source.


Why does open source matter to you? Is it to preserve the option to pay someone to make changes to it?


Versioning dependencies and managing dependencies are surely among the hardest problems in software engineering. At least those are the two I find the most aggravating. It seems like almost nobody can get them right, even though component-based software engineering, SoA, etc. are, I think, generally extremely good ideas. The execution is pretty crummy pretty much everywhere.

With all that said, my sense is that hardware engineering has its own heap of Sisyphean problems and complexities. I definitely would not go back to working on hardware engineering problems like I did super early in my career (a mix of embedded firmware, device drivers, PCB design, and web development). I shudder at the thought of ever working with anything Verilog/VHDL, Xilinx, or SPICE ever again, or debugging PCB designs on the bench top in the lab with an oscilloscope and a logic probe. At least in school I ran more than a few bodge wires to patch a mistake in a PCB design iteration. Maybe in some sense, it's a blessing that those linear systems theory abstractions fall apart utterly in RF engineering problems, and one has to contend with the fact that all circuits radiate. At least circuits that still contain the magic smoke.


I believe nix is the logical solution to dependency management and thus is the future of it.


> Imagine for a moment doing AI work

In many ways it's still the same. Transformers use matrix multiplication as their main operation, and the underlying matrix multiplication libraries have mostly seen incremental performance improvements over the last two decades or so. Most other ops in e.g. core PyTorch are implemented using C++ templates and would be mostly familiar to a 2008 C++ programmer. Most of my work is largely C++/Python/Cython, as it has been for the last 1-2 decades. Sure, the machine learning models have changed, but those are relatively easy to pick up.


But most software is not 3D movie rendering or AI. Look at the App Store: 99% of those apps could have been written in 1970s Pascal. 1996's ICQ had 90% of the functionality of modern messengers. People just love new things.

