Hacker News | bschne's comments

> Was this physically difficult to write? If it flowed out effortlessly in one go, it's usually fluff.

Probably my best and most insightful stuff has been produced more or less effortlessly, since I spent enough time/effort _beforehand_ getting to know the domain and issue I was interested in from different angles.

When I try writing fluff or being impressive without putting in the work first, I usually bump up against all the stuff I don't have a clear picture of yet, and it becomes a neverending slog. YMMV.


My most successful blog post was written about something I felt strongly about, backed by knowledge and a lot of prior thought. It was written with passion.

People asked for permission to repost it, it got shared on social media, and it ended up ranking higher in Google than a Time magazine (I think) interview of Bill Gates with the same title.


Could you post a link?


It seemed a bit self-promotional to do so, as it's not relevant to the article, but since you ask, it's https://pietersz.co.uk/2009/11/fix-capitalism


Right? I think some of my best work flowed out effortlessly, it's amazing when you get into the flow state and just churn out line after line.



My take on this book is that 1) it contains a lot of foundational knowledge/wisdom about design, interpreted broadly, that is very useful across contexts, and 2) it is itself, ironically, an example of poor design. Not in the visual sense, but in that its structure and writing do a pretty bad job of actually conveying that knowledge to the reader and being navigable.

I tried reading it and hated it, then I came back knowing bits and pieces of its contents from elsewhere and was like "yup, this is the only place I've seen all of this together".


unlike rich countries, which only lack the will and care to secure their ICS and SCADA /s


Or the US in particular, which pours a lot of resources into willfully keeping every ICS and SCADA out there insecure, including their own.


"Now, one day back at Data General, his weariness focused on the logic analyzer and the small catastrophes that come from trying to build a machine that operates in billionths of a second. On this occasion, he went away from the basement and left this note on his terminal:

I'm going to a commune in Vermont and will deal with no unit of time shorter than a season."

— from "The Soul of a New Machine" by Tracy Kidder


fun anecdote about Ethiopia doing this to prevent cheating in national exams --- https://x.com/benkuhn/status/1339016975494811649?s=20


you're telling me the results of this paper were likely bs? --- https://www.sciencedirect.com/science/article/abs/pii/S10538...


The point of the salmon paper is to demonstrate to people “if you do your stats wrong, you’re going to think noise is real” and not “fmri is bs”


As the first author on the salmon paper, yes, that was exactly our point. Researchers were capitalizing on chance in many cases as they failed to do effective corrections to the multiple comparisons problem. We argued with the dead fish that they should.


Nothing to add to this conversation in particular, but just wanted to say - truly amazing paper. Well done!


Many thanks! It was a ton of fun. Hard to believe that we are coming up on 20 years since the data for the salmon was first collected...


> We argued with the dead fish that they should.

Arguing with a dead fish may be a sign you're working too hard :)


Yeah, it did prove to be a rather one-sided conversation... ;)


Did you try tuning it? https://youtu.be/F2y92obnsc0


Curious what you find to be "bs" about the results of this paper? That statistical corrections are necessary when analysing fMRI scans to prevent spurious "activations" that are only there by chance?


They were being sarcastic.


Oh man you stole my thunder. I hoped to be the first to bring up the dead salmon.


The information density of online maps is, in general, quite low compared to old paper maps: https://x.com/patrickc/status/1738646361128792402

I guess there are various reasons, ranging from "it's hard to make auto-layout algorithms produce stuff as dense as painstakingly handcrafted maps" to "let's make it harder to scrape/copy data".


Back then, maps were created by dedicated map makers. Now it's mainly programmers. So it's not surprising that quality tanks when you go from disciplinary expert staff to IT day laborers.


I've been occasionally using futureme.org for ~15 years now, in case you're a believer in the Lindy effect. FWIW I don't think I've ever used it for anything more than ~1 year ahead; that always seemed fun/interesting enough. Of course there are other considerations entering the picture if you plan ten years ahead, but then again this seems like the kind of fun/light-hearted thing where it doesn't really bother me that I might not end up reading it again --- life happens...


The same thing happened to the German magazine Der Spiegel recently; see the correction note at the end of this article:

https://www.spiegel.de/wirtschaft/unternehmen/deutsche-bahn-...



I still think someone should have done this as a pun and gotten your paper trending everywhere.


Fair play to them for owning up to their mistake, and not just pretending like it didn't happen!


They do not deserve a shred of credit. This is just damage control; pretending it didn't happen was never an option. Instead they tried to claim it was just a one-off mistake. What it really shows is that nobody even bothers to read their articles before hitting publish, and that AI is widely used internally.


You're absolutely right! But they can shove this euphemism. Just say that ChatGPT wrote the article and no one read it before publishing; no need for all the fluff.


>> Just say that chatgpt wrote the article and no one read it before publishing

This is so interesting. I wonder if no human prompted for the article to be written either. I could see some kind of algorithm figuring out what to "write" about and prompting AI to create the articles automatically. Those are the jobs that are actually being replaced by AI - writing fluff crap to build an attention trap for ad revenue.


Very likely this already happens on slop websites (...which I can't name because I don't go there), which for example just republish press releases (those could be considered aggregation sites, I guess), or which automatically scrape Reddit threads and turn them into listicles on the fly.


Maybe, although I'm a bit doubtful that they were 100% honest.

> Entgegen unseren Standards ["Contrary to our standards"]


As programmers I think we can extend some professional empathy and understanding: copy-and-pasting all day is a lot harder than you’d think.


Compared to writing it yourself?? Absolutely not.


It was sarcastic.


Fair play to them for owning up to their mistake, and not just pretending like it didn't happen!

That's what the legitimate media has done for the last couple of hundred years. Every issue of the New York Times has a Corrections section. I think the Washington Post's is called Corrections and Amplifications.

Bloggers just change the article and hope it didn't get cached in the Wayback Machine.


"We regret to admit that our editors don't actually take the time to read these articles before hitting the PUBLISH button..."


The editors were laid off and replaced by an LLM. Or more likely, the editorial staff was cut in half and the ones who were kept were told to use LLMs to handle the increased workload.


This is the real issue; I'm sure journalists already use loads of shortcuts to do their jobs efficiently, but ultimate responsibility lies with the editor(s).

