
Cute. Except the ones suffering the most from the HFT machinations were most likely pension and mutual funds, the ones most folks' parents rely on. Not rich fucks or their catered hedge funds.


Vanguard says exactly the opposite is true. Which makes sense: passively managed funds and funds with simple value-based investment parameters don't trade aggressively. On the infrequent occasions when they do trade, they benefit from the reduced cost of trading.

If you have your retirement fund parked in an actively-managed fund where you pay an annual fee to have someone who makes $500,000/yr base continuously push "buy" and "sell" buttons on their computer screen all day, you have bigger problems than HFT.


That's why ETFs and index funds, which simply match a market or index passively, are the real deal.

Going for more than market growth is a strategy only brokers, fund managers and banks profit from.
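
A rough back-of-the-envelope sketch of why fund fees matter so much more than any plausible HFT skim (all numbers here are illustrative assumptions, not real fund data):

    # Fee drag on a retirement account over 30 years; illustrative numbers only.
    def final_balance(principal, annual_return, annual_fee, years):
        balance = principal
        for _ in range(years):
            balance *= 1 + annual_return - annual_fee
        return balance

    start = 100_000
    passive = final_balance(start, 0.07, 0.0005, 30)  # cheap index fund
    active = final_balance(start, 0.07, 0.01, 30)     # typical active fund
    print(f"passive: ${passive:,.0f}  active: ${active:,.0f}")
    # ~$751k vs ~$574k - the fee gap is the bigger problem.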


Whatever the realities of the market, people have an expectation of "fairness", and are generally against the idea of a party or parties being able to insert themselves into a situation where they essentially get a speed jump over the rest of the market by buying privileged access (microwave links, colocation in the exchange, etc.)


A very good point. We should have a metric that establishes how much extra carbon is released into the environment by CPU-cycle-wasting crap like Atom and ever more bloated versions of Word. Clippy could pop up: "I can see you would like to release an extra ton of carbon into the atmosphere. Why not upgrade Word now!"


Yeah, and then we could see how small the differences would actually be.

A 15" Macbook Pro has a 99.5Wh battery and lasts about 8h; that's 12.5W. A program that decreases battery life by 20% would only mean an increase of 2-3W. Even if the machine and program are running 24/7 and all the energy comes from coal, that means an increase of less than 0.03T of CO2 per year. For comparison, the average carbon footprint of an US citizen is 20 Tons per year.


Apple has started doing something about energy efficiency, probably because most of their computers are laptops, where energy use really matters. OS X tracks energy use per application (CPU + GPU, if I remember correctly), but I don't think they do anything with that info at the moment.

I guess it would be a nice incentive for developers if OS X could notify you that a background app is using a lot of energy, maybe even with a quit button when on battery power. At least I wouldn't want my app to end up in that kind of popup.
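
Something like that could be crudely approximated today by polling per-process CPU time as a proxy for energy use (ignoring GPU, timer wakeups, etc.). A sketch using the psutil library; the 25% threshold and 5-second window are arbitrary choices:

    # Crude "energy hog" detector: flag processes with sustained high CPU use.
    import time
    import psutil

    THRESHOLD = 25.0  # percent of one core; arbitrary cutoff

    procs = list(psutil.process_iter(['name']))
    for p in procs:
        try:
            p.cpu_percent(None)         # prime the per-process counters
        except psutil.Error:
            pass
    time.sleep(5)                       # sample over a 5-second window
    for p in procs:
        try:
            usage = p.cpu_percent(None)
            if usage > THRESHOLD:
                print(f"{p.info['name']} is using {usage:.0f}% CPU")
        except psutil.Error:
            pass                        # process exited or access denied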


The real problem is precious snowflake developers who don't care about bloated code, who think it becomes the devops problem to go and get bigger AWS instances or newer Macs. A shit attitude to your work will always show. If engineers, plumbers, electricians, doctors, chefs - basically any other field of employment - took the same cavalier approach to the end product, we would rightly castigate them. But somehow, we give devs a free pass. Why?


I could edit a doc with a word processor that fit on a floppy, and it worked well. Now, why does the same task require a multi-GB bloatfest - one that oftentimes does the same job more slowly? Imagine for a second the power of today's hardware running optimised, lean code. It could be so good, but we accept bloated, crappy, bug-ridden shitfests of OS and application software. We seem to care more about gloss and social features than performance. It is a real problem.


If I want to start Microsoft Word right now, here's what I have to do: I press the Windows key, type "word", press enter, and then wait for it to start. That entire process takes three seconds; I just timed it. Now that Word's been run once already, I can start it again in less than one second. That compares pretty favorably with the process of finding a floppy disk, inserting it, and then finally running a command on it (it's even worse if I don't remember the cryptic command name, because then I have to wait for the floppy's directory to be read). To say nothing of how much easier this "bloatfest" will be to deal with once I get it loaded, compared to your leaner alternative.


That's probably because you have an SSD, but anyway: I could do the exact same process 10 years ago (with "winword" instead of "word") and get similar response times, and yet the software was an order of magnitude smaller and more responsive. And it's not like Word has gained many actually useful features since then.


The Word I used 10 years ago was nowhere near as responsive as the one I use today.

Maybe yours was faster, but I think a lot of people have rose-tinted memories of the speed of applications in the old days.


>> And it's not like Word gained many actually useful features during that time.

Then use the 10 year old version! Unless, of course, one of those few "actually useful features" is something you can't live without.

A lot of people use Word for a lot of use cases. The value of an added feature that someone needs trumps the performance cost of adding it - right up until the performance becomes so bad that it's the reason other people stop using the product.


People who complain about bloat are usually complaining about features they don't use. Of course watch what happens when people start doing metrics and optimizing their UI for "common uses" - turns out you're optimizing for non-existent users (see: the Ribbon).


The ribbon gets knocked a lot, but the median word-processor user has zero experience. The ribbon gives them a fighting chance of finding what they need, at the expense of "expert users" with their idiomatic expectations.


Which is a problem, because tools should be just that - tools. You don't make welders or soldering irons easy for people with zero experience. You make them effective and efficient tools, and then train people to use them. Heck, show me a single musical instrument that gives zero-experience users "a fighting chance to find what they need".

It's weird how new trends in UX design try to make a first-time user become a genius immediately after double-clicking on the program icon. The only way you can do that is by dumbing down the software to the point it can actually be comprehended this way - which makes it much less usable and effective as a tool.


That's the expert talking. The tool is made for the most common case: a new hire meeting it for the first time. They're not gonna do a good job, but anything that can improve their performance pays. Follow the money.


Once it's installed from the floppy, why would I need to keep running it from the floppy? You're also describing the caching behaviour of the OS, rather than the huge size of modern-day Word.
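
The caching effect is easy to demonstrate yourself: read a large file twice and time both passes; the second is served from the OS page cache rather than the disk. A sketch (the path is a placeholder - point it at any big local file):

    # Cold vs. warm reads: the second pass usually comes from the page cache,
    # which is also why a second launch of a big application feels instant.
    import time

    PATH = "/path/to/some/large/file"   # placeholder path

    def timed_read(path):
        start = time.perf_counter()
        with open(path, "rb") as f:
            while f.read(1 << 20):      # read in 1 MiB chunks
                pass
        return time.perf_counter() - start

    print(f"cold read: {timed_read(PATH):.2f}s")
    print(f"warm read: {timed_read(PATH):.2f}s")  # typically far faster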


You have a hard drive? Rich people these days...


How often do you need to edit a doc using a word processor that's running on a floppy? I'm guessing that it's not very often.

If you really need a lightweight word processor, there's plenty out there. They just probably won't have a lot of the features that the big hitters have now. And if you really want a lightweight OS, there's a fair few Linux distros aimed at that market.

What do you want your code to be optimised for? Fast startup times? Functionality? Extensibility? Stability? Compatibility? Security?

None of those things come magically for free, and every day that a developer spends on one is a day that they aren't spending on another. And if they wait until any, or all, of those are optimised for every possible scenario, you'll be waiting a long time for any software at all.

As Blaise Pascal (or possibly one of many other candidates) once said: "If I had more time, I would have written a shorter letter." The challenge of optimisation vs. delivery is not a new one.


"I have made this one longer only because I have not had the leisure to make it shorter." - Blaise Pascal, Lettres Provinciales, 1657


Because it is _not_ the same program. For one, the spell checker is much better; for another, it has added a grammar checker; for a third, it now has change tracking and comments and lots of other goodies.

Don't complain that you don't need those features: it is on you to choose the right tool for the right job. If all you need is a light spell check and text editing, notepad++, vim or Emacs will do nicely. Plenty of people need the more advanced features.


The problem is that you can't write that optimised, lean assembly code in a lifetime anymore. Software today does a lot more under the hood than it used to - something most people forget about.


I am not sure we need to go as lean as hand-rolled asm, but we sure went down the wrong path when we just accepted that Word will get bigger and bigger with each release. No one asks why.


I think so too. It's not about ASM snippets; it's about people not knowing what their software actually does. Software bloat isn't magic, nor is it static; your CPU is doing something. If an application gets bigger and slower over time, and yet no extra features appear to explain it, it means your CPU is running code that shouldn't run, wasting its cycles on pointless and irrelevant computations. That's laziness and/or stupidity.
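
And finding out where those cycles actually go isn't hard, at least at a coarse level. A minimal sketch using Python's built-in cProfile; busy_work is a hypothetical stand-in for whatever pointless computation is hiding in an app:

    # Minimal profiling sketch: cProfile reports where the time is spent.
    import cProfile

    def busy_work():
        # hypothetical stand-in for real (or pointless) work
        return sum(i * i for i in range(10_000_000))

    cProfile.run('busy_work()', sort='cumulative')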


It's not laziness or stupidity (well, it might be sometimes, but certainly not always).

It's using your time to do something else, such as improving security or adding a new feature. Some people seem to be talking as if it automatically takes the same amount of time to create efficient code as inefficient code, and the idiot developer has simply forgotten to add the -run_faster option to the compiler.

I'm involved in a project at the moment where a team of developers and engineers have spent much of the last month trying to get response times in some very specific circumstances in an application down to an acceptable level. That's not because they are either stupid or lazy. It's because these things are hard - there's an awful lot of moving parts, and an awful lot of things that can go wrong in each of them.

When they get it down to an acceptable level, they'll ship it to the users. It won't be blazingly fast, but it'll be good enough. They could definitely spend longer tuning it further and getting it going even faster. But every day they do that will be a day longer until the features that the users want are actually available to them.


Perhaps too many moving parts. Engineers are criticized for "reinventing the wheel" and writing code when open source is available. But use too much open source and you have too many moving parts: you don't know how they work, and things like locks and latency become uncontrollable. A familiar story!


Apart from the OS (which isn't where the problem lies), there's no open source in it.

The reason there's lots of moving parts is because modern computers, OSs, languages, application servers, libraries, databases etc are built to do a lot of very complicated things.

We could have written the entire application from scratch, not using any 3rd party elements so that we had full control over every element. But that would have taken several orders of magnitude longer to deliver actual useful working software to the users.


Or, to be non-Aristotelian about it: some parts could have been dispensed with and appropriate code written for the purpose, while other, more robust parts could be kept.


Until you discover the issues, you don't know which bits aren't robust enough.

And rewriting them, along with modifying the rest of the application to use them, takes time and can lead to further unexpected behaviour.


The opposite of that. Using somebody else's code, written for a different app in a different environment, is where most unexpected behavior comes from.

Here's the bits that aren't robust enough: any open source that isn't either used by thousands, or used in exactly the way you will be using it.


I would still be mostly using a 3rd party application framework if I rewrote some bit of it that I found problems with. I wouldn't be rewriting the whole lot - that would take forever. So I'd be left with 95% of the framework written by the 3rd party and 5% written by my team.

That 95% would still need to interact with my new 5%, and as it was never designed to do that, it could easily introduce new unexpected behaviour. Imagine trying to rip (for example) the indexing code out of a database because it's performing badly, and then rewriting it.


Imagine having control over the behavior of your code! That's the advantage of writing it. And I thought we were talking about replacing open source packages, not doing open-heart surgery on a faulty package?


Why would you think that? I explicitly said "Apart from the OS (which isn't where the problem lies), there's no open source in it."

It's a 3rd party commercial platform for developing your own applications on. To rewrite individual bits that aren't working efficiently would be a nightmare.

As for imagining having full control over the behaviour of my code - I had that a quarter of a century ago or more, when I started coding and spent weeks or months building windowing systems and low-level networking code in assembler to do what I could achieve in 30 seconds with any off-the-shelf language these days.

Yes, my code was extremely efficient (it had to be, trying to do graphical comms software on 8086 machines didn't give you much of an option to do otherwise), but the use of my time wasn't.

The third party product that we're using is the result of many years of work by a large team of dedicated developers. Rewriting it from scratch in order to make sure we've got full knowledge of what's going on is an utterly ludicrous idea.


>That's laziness and/or stupidity

Or the failure of a large team to properly manage software complexity.


Or higher priority things to work on. Optimisation doesn't come for free.

