Hacker News | rleigh's comments

When they used to have to fit the whole thing onto six floppy discs, it had to be constrained in size and scope. Today there are no constraints, and it really shows. I think having hard constraints, be it storage, memory, CPU, update distribution, or product requirements, drives quality and forces hard engineering decisions to fit within them. I think a lot of what is wrong with the Microsoft of today is a complete loss of engineering discipline and focus; you only have to look at the incoherence of their GUI development strategy to see how badly they are doing there.

I think it would have been useful for them to have really made a proper effort at modularising Windows along the lines of how Linux distributions and the BSDs do things. I can't see any way of recovery from the bloated mess they have created; they can't keep cramming any more in, it's an unstable, unusable and untestable mess.


Likely. Whenever I see that, it usually means it created the test failures itself but won't admit to it!

It's not just Amazon. I bought a copy of an ARM assembly book from a proper bookseller (Blackwells) which was a proper hardback for a high price--something like £80, and I received a print-on-demand mess with a hardcover. The print was there but barely legible, a dotty mess which gave me a headache. I returned it.

I can see print-on-demand working very well, but not until the quality issues are sorted out. Being charged top dollar for something which is substantially inferior is unacceptable.


Yes, this.

Even hardcover books from "real publishers" have arrived with low print quality. The most common book-printing problems I run into today are:

  1. text that is gray (not black) and
  2. text that is dotted (not solid)
I have 20-, 40-, and 100+-year-old books with phenomenal solid black text, and they are an absolute pleasure to feast the eyes on. But more importantly, while reading them the presentation never distracts from the quick and enjoyable consumption of the content itself!

If you ask me, the following checkboxes should be standard ratings on all books sold:

  [ ] "solid, black text"
  [ ] "acid-free paper"
  [ ] <we could add a few here>
Everything else comes after knowing these aspects, in my opinion. I guess these would require numeric, measured scores too, with the binary checkboxes indicating that some minimum threshold is surpassed. There are other important factors, of course, but getting basic text colour and character solidity right is easily number one.

Related, I used to buy 3rd party black laser printer toner that was guaranteed and warrantied to be made to OEM spec. It never, ever was, no matter how many returns/replacements/retries/print-settings-adjustments/other-part-replacements. Always gray text, always. Buying actual OEM black toner reliably results in (close enough to) jet black text. It costs more, but it's the only way to be sure for self-printed materials AFAIAA.


Just for a bit of balance: another book I bought was the ZYNQ book and companion materials, made by a university in collaboration with Xilinx. They don't hide the fact that, because it's niche and low-volume, they used a print-on-demand service. I even went and looked the printer up, and it's a small UK printer with pretty reasonable pricing for self-publishing small runs of books. The quality was great, no problems with it at all. So it /can/ be done.

Similar experience, have some good quality print-on-demand niche academic books in my collection.

Acid-free is debatable; non-archival books frequently last decades or even centuries.

Interestingly, I bought a book on Z80 assembly last year, thinking I was buying a used book printed in the 80s or so. Instead it was shiny, glossy, and obviously printed on demand.

Terrible quality, and it really did make me stop using Amazon for "vintage" books.

I prefer to buy used books locally, but given I don't speak the local languages I'm often forced to buy from abroad to get English editions.


If it loads at all. The last two days, the start menu refuses to launch it when you click on it.

The lack of quality in Windows is simply astonishing. And the new start menu and taskbar are terrible. Quite how a company can transform a product into such a mess in just a few years is incredible.


I love when it takes 3 minutes to open "Add or Remove Programs" because the Start menu search decides that typing a, ad, add, r, re, remove, unin, install, etc. definitely means "let me Bing that for you" instead of opening the one thing I clearly want.

It obviously knows what I'm trying to do (Bing search recommendation is for "Add or Remove Programs"), yet refuses to surface the actual shortcut to that settings page (or "app", or whatever Microsoft calls it this week). Even better: some days it pops up immediately after typing "Add" and other days I'm wrestling with it like I'm training a stubborn animal, clicking the result in the hope that the OS will "learn" that yes, this is what I want when I type "Add".

Most of the time I just give up and dig through the Settings menu like it's 1999.


I started on the cheapest £15/mo "Pro" plan, and it was great for home use when I'd only do a bit of coding in the evenings, but it wasn't really usable with Opus--you can burn through your session allowance in a few minutes--though it was fine with Sonnet. I used the PAYG option to add more, but it cost me £200 in December, so I opted for the £90/mo "Max" plan, which is great. I've used Opus 4.5 continuously and it's done great work.

I think when you look at it from the perspective of how much you get out of it compared with paying a human to do the same (including yourself), it is still very good value for money, whether you use it for work or for your own projects. I do both. And for my own projects, including open-source stuff: I'm very time-limited, and some of the things I want to do would take multiple years. Some of these tools can take that down to weeks, so I can do more with less, and from that perspective the cost is worth it.


I've found it to be terrible when you allow it to be creative. Constrain it, and it does much better.

Have you tried the planning mode? Ask it to review the codebase and identify defects, but don't let it make any changes until you've discussed each one (or each category) and planned out what to do to correct them. I've had it refactor code perfectly, but only when given examples of exactly what I wanted it to do, or clear direction on what to do (or not to do).


It's the other way around. It's the GPL which is incompatible with the CDDL (and many other licences).

The CDDL is actually very permissive. You can combine it with anything, including proprietary licences.


Yes, this is a prime example of completely gratuitous breakage.

The change adds zero value. It's a deliberate API break, and it could have been made non-breaking for the sake of a single one-line macro or inline function.

This isn't unintentional. It's a deliberate choice they have made. And not just this one, it's happened repeatedly over the years.

The thing that really gets me, as an end-user/developer, is that it forces incompatible changes not only in my codebases, but in every other application developer's codebase worldwide. A small change in GTK+ imposes hundreds of thousands of man-hours of maintenance work upon every application developer. This burdensome work not only takes time, effort and money; it doesn't improve our applications one iota, and on top of that it breaks backward compatibility, so our code will no longer build with older GTK+ versions. Most of us won't be chasing the latest development release; applications might need to target a wide range of distributions with a wide range of GTK+ versions. So it's a logistical nightmare as well.

The lack of concern for the needs of actual application developers is why I eventually had to give up on it entirely. At some point it doesn't make sense either commercially or for free software development; it's just masochism.


I completely agree. This comment should be carved in stone for future generations to see. API breaking changes should never be made just to chase illusory butterflies.


Recently it's got really bad though. The taskbar is badly broken.

* If you use auto-hide, it won't show when some applications are open. Edge in particular is bad.

* Some applications simply don't show on the taskbar at all. Teams is one. It's in the alt-tab list.

* Sometimes it stops working entirely.

The testing and QA of this stuff appears to be largely absent.


And also in QtXmlPatterns (now also retired).

Just for the record, Xalan-C is even less maintained than libxslt. It had no releases for over a decade, and I made a final 1.12 release in 2020 adding CMake support, since the existing builds had bitrotted significantly, along with a number of outstanding bugfixes.

I initiated its retirement to the Apache Attic in 2022 in https://marc.info/?l=xalan-c-users&m=165593638018553&w=2 and the vote to do this was in https://marc.info/?t=166514497300001&r=1&w=2&n=20. It has now gone nearly four years without any commits.

It's a great shame we are now in a situation where there is only a single proprietary implementation of the very latest version of the standard, and even the open-source 1.x implementations are fading fast. These technologies have fallen out of favour, and the size and complexity of the standards is such that it's a non-trivial undertaking to keep them maintained or to create a modern reimplementation.

