
Because they suffer more. Suffering is good. Without it, we tend to fall apart. With it, we tend to get better.


Ease of use.

Python is hard to learn to use for math. It requires a strong programming and math background.

Mathematica's syntax is more natural and it doesn't require learning numerous different packages and figuring out how to stitch them together - it just works. Graphing is more convenient, too.
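For what it's worth, here's a minimal sketch of the "stitching" I mean - assuming numpy and matplotlib, the usual (but not the only) choices. Plotting a single function in Python takes two separate libraries, where Mathematica has one built-in:

    # Plotting sin(x) in Python: two packages have to be combined.
    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(0, 2 * np.pi, 500)  # sample points come from numpy
    plt.plot(x, np.sin(x))              # rendering comes from matplotlib
    plt.xlabel("x")
    plt.ylabel("sin(x)")
    plt.show()

    # Rough Mathematica equivalent, a single built-in call:
    #   Plot[Sin[x], {x, 0, 2 Pi}]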


Laptops in university are mostly about status signalling. Paper is much more efficient and practical for note taking.


Not really new. I've seen many companies creating similar homes - it's been popular for a few years at least. The more interesting ones (in my opinion) are the ones optimized to be highly efficient for living off-the-grid and self-sustainably.

It's scary at the same time. Humans are getting smaller and smaller and more irrelevant and powerless and less individual. In the future we'll probably all live in little pods under some kind of gigantic government that controls our lives. Maybe we'll be happy, but we'll be shadows of the diverse and complex beings we once were.


    Maybe we'll be happy, but we'll be shadows of the
    diverse and complex beings we once were
These "diverse and complex beings" you refer to. When did we fit the model you're thinking of, exactly?


At all times up until the present - constantly fluctuating, but generally trending upwards with the ascent of the evolution of life and of human beings. And perhaps it will continue for a long time yet. However, the majority of humans long to end it or at least to set it back.


Climate Change = Fear-mongering. Remember 15 years ago the way people were predicting foot to meter per year seawater rise? It's a miracle the Empire State Building isn't underwater already as some predicted. There's been no global warming for over a decade. All of those models used to predict disasters that never happened turned out inaccurate and wrong. But that doesn't stop corrupt organizations who have perverted "science" from continuing to use these "theories" for their political ends.

Ocean levels have been rising for thousands of years and will continue to do so until the next ice age (barring human intervention to prevent it from occurring - might be a good thing?), so NASA would be wise to use technologies such as levees to prevent flooding of their facilities. It might've been even smarter to build them at a higher elevation to begin with.


> Remember 15 years ago the way people were predicting foot to meter per year seawater rise?

What you suggest was far from a generally accepted prediction. The IPCC's 2001 report was for <1 meter rise over the entire century. See http://www.grida.no/publications/other/ipcc_tar/?src=/climat... . It emphatically was NOT a "foot to meter per year seawater rise."

Do you have evidence that it wasn't just kooks saying what you suggested?

> "as some predicted"

Who predicted that? Some guest on Coast to Coast AM who also thinks alien UFOs create chemtrails?

> use technologies such as levees to prevent flooding

As brought up several times in this thread, levees won't work at KSC. Water would simply go through the porous rock under any such levee.

If these statements of yours are false, or at best straw man arguments, why should anyone believe the rest of what you write?


There's a lot of motte/bailey'ing[1] going on in the climate change debate.

There are tons of histrionics and scaremongering out there. I particularly remember, during the COP15 meeting in Copenhagen, art installations illustrating how the city would be under water if sea level rose 7 meters, which is the estimated rise if all of the ice on Greenland melts. Then there's The Day After Tomorrow, not to mention An Inconvenient Truth.

While there certainly are and were voices critical of these things and their dubious relationship with the science (whether settled or consensus or not), their usefulness in advancing the agenda appears to be appreciated more than their role in misrepresenting the science. This is an actual quote from a scientist listed in the "criticism" section of the "The Day After Tomorrow" Wikipedia page[2]: "I'm heartened that there's a movie addressing real climate issues. But as for the science of the movie, I'd give it a D minus or an F. And I'd be concerned if the movie was made to advance a political agenda.". This is an actual scientist on the record believing that TDAT is "addressing real climate issues". That is like applauding Swordfish[3] for addressing cyber security issues. That is the Bailey.

Then, years after, when these things perhaps look a bit more absurd (and some very specific predictions turned out not to come through, at least not as unambiguously as expected), we get "The IPCC's 2001 report was for <1 meter rise over the entire century ... Do you have evidence that it wasn't just kooks saying what you suggested?"

That's the Motte.

You can't have your cake and eat it, too. If it was only kooks saying it, certainly none of the non-kooks bothered thoroughly rebutting them at the time.

1: http://blog.practicalethics.ox.ac.uk/2014/09/motte-and-baile...

2: https://en.wikipedia.org/wiki/The_Day_After_Tomorrow#Critici...

3: https://en.wikipedia.org/wiki/Swordfish_(film)


Where are the people who said that the sea level would rise 1 foot to 1 meter per year?

You didn't answer that question. Everything you wrote is a distraction, which ends up increasing the confusion you describe and complain about.

You pointed to some artists and movies. So what? I can point to movies that have depicted meteors crashing into modern-day Earth, volcanoes erupting in Los Angeles, and even the core of the Earth stopping its spin. Artists have different goals. They are not required to only depict what science predicts +/- 1 sigma.

You point to a scientist who liked that "The Day After Tomorrow" used climate change as a plot device, even if the science was awful. So what? I'm sure some Spacewatch people like "Armageddon" and that some geologists like "The Core", even though the science is awful in both. Are you equally disdainful towards those scientists? Or do only climate scientists draw your ire? (BTW, "Waterworld" used global warming as a plot device, came out almost 9 years before TDAT, and portrayed an equally scientifically absurd future.)

You mentioned "An Inconvenient Truth". Does Gore make a prediction that there will be 30cm or higher water rise per year? Does he even predict that the Atlantic will have reached the base of the Empire State Building by now?

How is the IPCC 2001 report not an attempt at a thorough rebuttal? What level of rebuttal would you require before you say it was sufficient? Do we all need to be like Neil deGrasse Tyson and pinpoint every single scientific flaw in a movie? Or go even further and reject every movie with a flaw?

In other words, I'm not going to get into a goalpost argument with you before you even say what the goalposts are.

Your argument about the structure of the argument is useless. It's so easily inverted. Watch: You have set up your own Motte and Bailey about what's going on in this HN thread, so nothing I say can dissuade you. You have inserted irrelevant commentary from your bailey to defend your views. Now you can ignore me because you think I'm using irrational arguments, because you have placed me in a spot where you think I can be ignored. Now you can enjoy your motte.

I argued that the OP presented a straw man argument that misrepresents the predictions that the large majority of climatologists and policymakers like Gore were making 15 years ago. Do you agree or disagree with me?

I presented the IPCC report to support my argument. Gore's "An Inconvenient Truth" also supports my argument. If you disagree, do you have actual examples of people who predicted 30+cm/year sea rise, or even that much of Manhattan would be underwater by now?


I am guilty of reading "predicting foot to meter per year seawater rise" as a satirising exaggeration for "predicting immediate, catastrophic (plausibly sea-level related) consequences" and I did not state that clearly. With that amendment, I think my comment still has merit, but it may no longer have responded to anything you wrote. Sorry about that.

That said, An Inconvenient Truth did suggest a 20-foot rise as something imminent. Did it "predict" it? No, not explicitly, it suggests and imagines and calls to action. It also doesn't specify the timescale, but there is great urgency. With that, your GP, while very much on the high end, is way closer to AIT than AIT is to IPCC.

A short note on movies and art: the difference between "The Core" and TDAT (and "regular" art and the "climate" ditto) is that the latter didn't just use science as a plot device; it explicitly injected itself into a political debate dominated by a scientific discourse.


How do you figure it is a satirising exaggeration and not a straw man designed for ridicule?

I found a transcript of "An Inconvenient Truth" at http://www.admc.hct.ac.ae/hd1/courses/blog/gw/An%20Inconvien.... The section on 20 foot sea level rise is a description of what would happen "If Greenland broke up and melted, or if half of Greenland and half of West Antarctica broke up and melted" and ends with "Is it possible that we should prepare against other threats besides terrorists? Maybe we should be concerned about other problems as well."

I agree that it's a call to action. I disagree that it's a prediction of something imminent, as in, an outcome that will happen within a couple of decades of when the movie was made. I read it as a need for imminent action, to prevent one possible long-term outcome.

BTW, Gore does make some short term predictions, which have proved to be incorrect. Gore said that within a decade there would be no snows of Kilimanjaro. That decade has passed. Instead, http://www.the-cryosphere.net/7/419/2013/tc-7-419-2013.pdf says that most of the ice cover will be gone by 2040, and the rest by 2060. http://lindseynicholson.org/wp-content/uploads/2011/07/Moelg... further points out that there will be snow even if there are no glaciers.

(While incorrect, I do not think it seriously affects the underlying meaning. The choice of "snow" over "glacier ice" should be read as an homage to Hemingway, with some poetic license allowed.)

Gore predicts Glacier National Park will have no glaciers within 15 years, or 2021. NPS estimates no later than 2030, https://www.nps.gov/glac/learn/nature/glaciers.htm , so Gore may be a couple years off there as well.

Gore was willing to make testable predictions in the 10-15 year span, which is a clue that the 20 foot rise in sea level that he described was not meant as a prediction of something imminent.


I can't understand what you're getting at with The Day After Tomorrow. The scientist is basically saying "WTF this sucks, what were they thinking?" but more nicely.

Have you ever seen someone express a position that you agree with, but do it so badly that you wish they hadn't, because they make you uncomfortable and just give ammunition to your opponents, while convincing nobody of the merits of your position? And then you try to formulate some sort of response to say, well, you're right, but oh how I wish you hadn't made your case so badly? That's basically this.


I don't think the scientific consensus was ever for that much sea level rise during the last few decades. The fact is that greenhouse gases are undeniably increasing the earth's temperature and this is destabilizing the major ice sheets. Exactly how fast these ice sheets will melt is uncertain, but there is no doubt that the rate of their melting is accelerating.


> Remember 15 years ago the way people were predicting foot to meter per year seawater rise?

No, I don't remember that. Can you provide some links for those predictions done 15 years ago?


He knows the importance of image and says all the right things to tug at your heart strings... they're fighting global warming, they're helping single moms, and they're running a startup-like environment unlike the other big evil corporations.


It blew my mind that his example of helping a single mom of EIGHT was her getting trained as a long haul truck driver. Who's taking care of that many kids while she's gone? Truck driving is not that well-paid, and the length of absences is right there in the job title.


His example is a subtle jab at the working class. He thinks people are such trash that truck driver is aspirational for them. The education benefits for blue-collar workers at Amazon take years to kick in and don't cover a single year at most community colleges. Workers are frequently fired before 90 days to prevent benefits from starting. Workers would do far better to take out a small education loan and start training early.


>He thinks people are such trash that truck driver is aspirational for them

What planet are you on? Truck driving is an excellent working-class job. It's skilled labour (yes, really): you need to be trustworthy and able to operate independently, work long hours, and take responsibility for a very expensive asset. As a result it pays a significant premium compared to a lot of manual labour and service jobs.


At best it's an ok working class job; for Amazon, it's more of a stop-gap until delivery vehicles with auto-pilot are more widely available.


Yeah, obvious window-dressing is obvious.

Bezos does not give a flying fuck about the wellbeing of his workers, both in the warehouse and in engineering.


We are the mindless, soulless, mass of consumers and we are the future! New is good, old is bad. Forget the past, forget history - we are forging the future without any of that! An ideal future! Quality is measured in Megapixels, truth is what our prophets tell us, and individuality be damned. Now off to the future we head for the sake of it! Follow us over this cliff - redemption lies at the bottom. Anyone who claims otherwise is a racist troll!

Don't get left behind, old man. You will surely be a miserable person without the latest apps on the latest iGadget. Besides, there is no place in our ideal world for people like you.


Everything that's new is old already.


The obvious reason for this is government intervention, not that they're great technologies.


This is true.

They are great technologies, but since they address the market externalities around carbon and other air pollution, the non-free market created by this situation won't achieve the aims that a theoretical free market would.

It's a bit embarrassing to free market boosters to see total global catastrophe be an unintended consequence of their policies, but they've often decided to dig in and deny obvious facts, which makes government intervention ever more necessary and welcomed by the population. Bit of an own goal really.


Trump bashing is lame. Trump has used words in his speeches that you probably couldn't define. I heard him casually use PROGNOSTICATION a couple weeks ago. He's not dumb.

It's actually a sign of intelligence to use the simplest language necessary when expressing yourself to a broad audience (many with low education). People who use unnecessarily large words are usually doing it to try to sound smarter than they are and to build up their own egos.


My intent is not to bash Trump. I actually think that he's one of the more talented people running for POTUS. This is not to say that I support him or his views (I am voting for Gary Johnson [1]), but I respect his ability to maneuver the media to suit his aims. It is my contention that his strong aptitude with using appropriate/effective language is one of the main reasons he can wield the media around so effortlessly.

[1] https://garyjohnson2016.com/


So... you're throwing your vote away? The two party system is deeply flawed, but when has a third party candidate done anything but take votes away from reasonable candidates?


If any third party candidate is going to do this in 2016, it is going to be Donald Trump after he and his supporters walk out of the GOP convention.

I'm an independent that lives in NYC. My vote is always a throw away vote.


Ah yes, prognostication. Such a rare word that it's only uttered multiple times in one of the most popular Bill Murray movies ever made, which is guaranteed to be on TV at least once a year. Seriously, that's not even nearly one of the best words.


Maybe you should start the Thesaurus Party and run in the next election. I'm sure voters would swiftly capitulate to the erudition of your insuperable sesquipedalian fulminations. None of those pathetic made-for-TV words that Trump has bedizened his 8th grader lexicon with.


At what point does a word's existence become invalid? Searching for 'sesquipedalian' brings up pages upon pages of dictionary links on Google. Clearly this word exists only in dictionaries and is so rarely used as to seem almost imaginary to those who encounter it. To put it another way, if it wasn't in the dictionary, would it exist at all?


Yes, because an English dictionary is a historical record of how English speakers and writers use words and phrases, making no judgement as to their 'correctness' or whether an audience of current English speakers would understand you were you to use them.

The purpose of a dictionary is to aid you in understanding the meaning any English speaker or writer is trying to convey independent of the time period or location, not to dictate valid and invalid words. Some words might be dropped from abridged dictionaries when they fall out of popularity but in 'The English Dictionary' (which doesn't really exist but is approximated by dictionaries like Oxford's) words are added but never removed.

This is different from, say, the mission of L'Académie française, which is to act as a language regulator and maintain the official vocabulary and grammar of French.


That's a good point. The makers of dictionaries have always sought to catalogue language rather than create it. That's what Samuel Johnson set out to do. Without dictionaries, we certainly would have lost a lot of words from our collective knowledge along the way.

Sesquipedalian seems to have been popping up since the 1600s.

http://www.oed.com/view/Entry/176752?redirectedFrom=sesquipe...


I agree that programming is a dead end job. It's popular and relatively easy. It doesn't require any special skills. Anyone with an average IQ can learn it. There's low cost and barrier of entry to getting started. It's fun and addicting when you're a kid. Then you get trapped and it becomes monotonous.

There's only so good you can get at programming. Beyond that, most of your time is spent on debugging trivial issues or trying to keep your knowledge of the overwhelming amount of tools and platforms up-to-date. If you're a really great programmer you might be able to get 3x as much done as an average programmer, but you won't be able to get 10x as much done to warrant 10x higher pay.

Programming is just one step above working on an assembly line. There's plenty of competition for your position, including from cheap foreign workers from the 3rd world, and, since your job can be done remotely, you're even more expendable.

The fact that technology is a luxury makes your position nonessential. You will be worthless if Western civilization ever fails and drops out of the Technological Age.

When you stare into computers all day you aren't developing social skills or really any skills that would be applicable to most other jobs. If you don't want to be a programmer forever, then likely the sooner you stop wasting your time staring into computers, the better.

If you want to be highly valuable, you need to have skills that are rare and desirable, that make oodles of money or change the world. Such skills are usually of a social, political, or otherwise creative nature - things that can't be learned from a textbook by anyone capable of logical thinking.

Programming is simply grunt work. It's the grunt work of dealing with the disarray of present day technology. As we advance and become more organized and cohesive, the need for programmers will be reduced.


As everything is slowly being digitized, we truly are moving to an information economy.

Programming is the ability to manipulate that information for a purpose. You can be a programmer and make money if you combine that ability with specialized domain knowledge.

But the ability to program alone makes you interchangeable in the business. Just look at the hundreds of gigantic (5000+ employees) consulting firms selling expertise. The only people who make big bucks are the ones with domain knowledge.

I do think that programmers are at the forefront of the future of employment. Every other sector is afraid of automation, concerned about their algorithmic skills and routine tasks being done by machine and uncertain about how they will fit in a future where their skills are useless.

Programmers feel like that every few years (or months if you're a JS programmer ;-)) or on every new major project in a different domain.

But we get distracted by the fact that we spend so much time keeping up with the new tech that we confuse it with valuable knowledge. The new framework/language/API/etc. is only a tool to extend our ability, and without domain knowledge in a sector (finance, healthcare, retail, etc.), we have as little value as a paralegal or a doctor who can diagnose and treat but cannot invent a cure.

Unlike ours, those professions are not 100% digitized. Programmers are forced to accept the limits of their intelligence and their capacity to learn new domains very early on. Nothing is preventing you from writing something truly amazing on your computer.

This community often gets it twisted. Let's face it, if you could innovate instead of repackage, you wouldn't need marketing or sales to communicate the value of your work. However, since exponential improvements and radical innovation are difficult, everyone settles for incremental improvements and has to spend their time marketing to persuade people that their product is better (most likely better by only a small margin over the competition, a margin most people wouldn't give a fuck about if they really understood the product).


Great comment. Echoes a lot of the sentiment I've been feeling since graduating college. Started out at a web startup where everyone was focused on learning the next new framework and applying clever language quirks and idioms, and heralding that knowledge as if it made them more valuable. And then applying it to a company whose product was stale and completely reproducible and non-innovative. I switched into embedded software after less than a year there and I've made a point to focus less on languages, or programming for the sake of programming, and more on learning hardware and Linux kernel internals because it's a domain toward which to apply the programming knowledge that, as you and GP pointed out, is relatively non-novel on its own.

I believe Carmack has a quote about how programming is just a mundane manner in which to solve more interesting underlying problems. I would imagine the world, or more specifically the economy, will eventually see it that way as well.


You've described exactly how I feel. I've only been at this game for 4 years and I'm already looking for a way out. Programming is fun in the way that doing drugs is fun; it's addicting and provides a quick high, but at the cost of staring into a computer screen for weeks on end with little to no interaction with other people aside from meetings (which only exist so that your superiors can tell you which widget to crank out next).

I consider myself to be very introverted yet programming is still too extreme on the asocial end. A career in programming will gradually cause your social skills to atrophy. It's pretty obvious that if you spend 40+ hours a week talking to a computer that eventually you'll start to feel and act more robotic than your peers who regularly interact with real people in the real world.

Working as a programmer doesn't provide real human experiences. Everything that you do and learn as a programmer is extremely abstract (and usually convoluted). You won't have interesting stories and wise aphorisms to share with your children and grandchildren, because everything you did was inside a virtual world.

Everything you build as a developer is extremely temporary. You will work extremely hard to build something over a period of months or even years, only for that software to be immediately discarded when the business pivots and decides to pursue a totally different path. If you work for a startup, your hard work will be absolutely worthless if the business goes belly up. If you decide to contribute to open source software and build the next generation of frameworks, tools, and languages, prepare for that work to go out of fashion within just a couple of years. A construction worker can go back to a building he built 50 years ago and it will still be standing. A doctor or lawyer with 20+ years of experience will be well respected, but as a developer your experience will be disregarded unless you're an expert in whatever framework/language is the flavor of the month.

Programming pays well and is a cushy office job. It doesn't provide much else.

EDIT: I fully expect people to reply with the usual "well that's just your opinion, man!". Yes. I know. This is a personal rant and not a master's thesis. If you love this career more power to you. I'm just sharing my opinion because I know other developers feel like I do, and feel trapped in a well paying but otherwise unfulfilling career.


The 'everything you build as a developer is extremely temporary' part was something I had to learn over the years. I used to make video games, and I grew up being able to play my old games no problem, but now nothing is physical; everything is sandboxed in an app store (or intrinsically glued into a website) and disappears the instant the company disappears or moves on to a new project, unless you work on a major title.

And even if that weren't the problem, there's such a firehose of games getting released constantly that people no longer value them, and move on to the next game after barely spending any time with the existing one, unless you intentionally manipulate their psychology with some free to play garbage.

That's part of the reason why I now feel so drawn to board games, as they at least still have physical boxes that can last for decades.


It sounds like we're in similar positions. Programming is indeed a very cushy job. I almost got sucked in and became lazy and dependent on it (and maybe will yet, but hopefully not). It's a dangerous game.

I don't want to end up like my older professional peers in 10, 20, or 30 years. Most of them are very smart people, but in a way pathetic at the same time.

On the positive side, we should be able to leverage the cushiness for our own benefit. I'm planning to travel and work part time this summer while continuing to work my regular job remotely. I've enjoyed the little farm and construction work I've done in the past and would like to do more. Not only do I feel healthier when I spend more time away from technology, but it's much more enriching.


I find software can actually be pretty social as you ask your peers for code reviews, go have lunch with each other, ask each other about their opinions about things, have planning meetings, just chat randomly about stuff at the water cooler, make jokes on company chat and so on.

Sometimes it goes overboard and you don't have a contiguous 4 hours to just focus on coding.


I do agree quite a lot with your argument. However, as a former researcher, I would say that you can be quite creative if you have the margin for it.

But if you only work for enterprises, I agree it is mostly grunt work. I should know, as I've just quit after working 8 months for a startup doing backend web dev, where I got bored and depressed as hell. Now I'm back at my own projects and doing freelance work to keep the money rolling in until one of my projects hits the jackpot, if ever.

My opinion, maybe similar to yours, is that professional developers should see programming not as an end in itself but just as a means. Focus on other areas (engineering, medicine, sports, whatever) and try to apply your technical skills to fix a problem. Basically, that is what we do when we work for some company, but we should be conscious of it and maybe try to focus on IT + another area instead of just knowing how to program a computer.

TL;DR: Domain knowledge is a must!


I'm sure that you are not getting a lot of upvotes for that comment, but you hit the nail on the head regarding programming being the translation layer between the entropy of the real world and the automated efficiency of computers. That is a very valuable service, but there are implications for how this profession evolves long-term.

My most fulfilling and interesting career stories were about listening to people and reading their needs between the lines, un-kinking silly workflow knots, finding non-tech solutions to seemingly technical problems. Not constructing pristine castles of code (that ended up not even being used, ultimately).

This is also why I have tried very hard to define my identity around my general values and mission as a human being rather than a specific career choice I serendipitously rolled into when I was 15. It has been very gratifying and liberating as a change of worldview.


> It's popular and relatively easy. It doesn't require any special skills. Anyone with an average IQ can learn it. There's low cost and barrier of entry to getting started.

If all of those things were true, then the field would be saturated and it wouldn't pay nearly as well as it does. I would claim that while you don't need an exceptionally high IQ, you do need a systematic way of thinking that is relatively uncommon.

> You will be worthless if Western civilization ever fails and drops out of the Technological Age.

Sure, as would most white-collar workers.

> As we advance and become more organized and cohesive, the need for programmers will be reduced.

The need for all types of workers will be reduced. The demand for programmers will remain higher than for many other professions; somebody actually has to implement the automation.


Found the manager!


So programming is not creative in your opinion?


It's barely creative. You have a small amount of room to work within someone's requirements.

You can be the guy creating the requirements, but then you're better off specializing in that and letting someone else do the grunt work.


I think this statement describes a very extreme case. I've never really seen a programming job without the freedom to shape requirements, at least on the technical front.


It's not creative in many cases because the technical solution already exists, invented and optimized decades ago. Just because developers are ignorant of known CS, prefer their ego (NIH), or choose to use the brand-new shiny thing with no ecosystem to solve the business problems in front of them doesn't mean they're engaging in effective creativity. How many times has the wheel been re-invented, do you think?


>You will be worthless if Western civilization ever fails and drops out of the Technological Age.

Who cares?


If this is true, why did the salary spreadsheet on HN last month show that some programmers are earning 3x, 4x, or even 5x others?

Non-managers are pulling in over $400k at companies purely as engineers. Do you think this is untrue?


You should try something new. Try a startup, or perhaps just something smaller where you have to wear a lot of hats and be included in decision-making and such. I can't relate to your experience at all.

