In the presence of each and every person living in an industrialized society are multiple devices running embedded software that is difficult to modify. Those devices are programmed using more flexible systems which to this very second are programmable any way you like. Where is the outrage over the hacker unfriendliness of my refrigerator? The iPad, iPhone, Android, and comparable devices are a hybrid of PC-like and embedded systems. The tradeoffs were made for reasons outside of anyone's programming/hacking needs. Should you want to jailbreak a locked-down device, much of the work has already been done for you.
A hacker's "freedom" to poke around in the system has been traded for the freedom of an average person to use the damn thing without worry. This argument about post-PC devices and whether or not they're good for hackers is tired. Be thankful that powerful, accessible devices are being put in the hands of millions of enthusiastic people that you have a chance to influence and affect through software and services.
Finally, let's address the specter of censorship. This is brilliantly simple (in the United States at least). Address the entity censoring you. If it's a private entity then accept that they have no legal, ethical, or moral obligation to give you any access to their customers (much less complete access). Roll up your sleeves and compete. Should you be dealing with government censorship then pursue justice in whatever way your heart guides you.
Personally I'd like it if I could hack my fridge. And my thermostat. And my dishwasher.
But those actually aren't as important as a computer because I don't use them to make things.
I don't dismiss the power of giving access to information consumption devices to the masses. But I would prefer if we could put the means of production into their hands as well. So I agree to some degree with the author.
As for censorship, everything you say is true. It doesn't make fears of censorship unfounded. The iPad has close to 100% market share for tablets. And Apple actively censors its sole means of app distribution. That's a lot of power for a private company to hold. Not illegal or immoral, but scary.
I think a future in which everyone chooses to live in Disneyland is kind of scary, even if I can continue to choose not to live there. And the way to fight it is to convince people that there are alternative visions of the future. And that's how I read this post.
But please stop whining. <- This sounds more like whining to me than the post did.
I would go further. Illegality and immorality are very different. I think people have a moral responsibility to use their power well, or at least not abuse it, regardless of what the law requires of them. Using the law to enforce this kind of morality is dangerous, but it still stands.
Can you explain how open vs closed systems relate to user interface or experience issues? Those are orthogonal to each other; Apple made two choices here, not one.
I understand the reasons for the move towards closed systems. A combination of media provider demands, funding coming from advertising and user tracking, and the standard walled-garden land rush when a new market opens up.
> This argument ... is tired ... please stop whining
Oh, fuck you. Seriously. Your high school level libertarian argument about why you don't want a public discourse about the direction of technological openness and the cultural and market forces around it is remarkably silly considering you are posting this on Hacker News on the god damn World Wide Web.
> Can you explain how open vs closed systems relate to user interface or experience issues? Those are orthogonal to each other
You can have a better UI and experience in a closed system. Multiple reasons:
1) A closed system is more tightly controlled. The user cannot fuck every aspect of it, and so he won't. You cannot assume the average user knows what he is doing or what he is installing on his device. Historically he has been shown unable to do so. This is why even stuff like Nigerian scams (something anybody on HN would laugh off) are successful against the general population.
2) "Design by committee" has not historically produced a good end user experience, and this is not by accident. Good UI design needs people to understand UI design and be able to say "no". And an OSS project is design by committee --it's very difficult to fence the UI from clueless people wanting to add the "kitchen sink", say. It's also difficult to keep it stable enough (hackers need to hack, they get bored easily).
> This is why even stuff like Nigerian scams (something anybody on HN would laugh off) are successful against the general population
Nigerian scams are not a 'tech only' issue. Prior to email, these were sent through faxes or snail mail. The basic idea is that people can be stupid, especially when you put the possibility of a lot of money in front of them. This has nothing to do with an open vs. a closed system.
You're conflating open vs closed, customizable vs non-customizable and (strangely) open source development.
Open/closed and customizability are at least commonly related in practice, so I can see the train of thought there. Mentioning design by committee is just weird though; it's only related in that Apple has famously avoided it, as well as been closed and non-customizable.
I can only assume that you own a car you have modified with aftermarket parts all to hell and "fuck[ed] every aspect of" because it was possible and really wish someone had made it tinker proof to save you from yourself. Otherwise you sound like some arrogant technically literate person who feels that he or she knows what less technically competent people should be allowed to do with their own possessions because it's for their own good.
>You're conflating open vs closed, customizable vs non-customizable and (strangely) open source development.
It's not me that's conflating those, it's what the history of FOSS development has done in practice so far. Theoretically, yes, those things could be separate. But only if in said theory you abstract away human nature, group dynamics and such. For example, if something is developed as FOSS the people involved tend to make it customizable to the extreme.
>I can only assume that you own a car you have modified with aftermarket parts all to hell and "fuck[ed] every aspect of" because it was possible and really wish someone had made it tinker proof to save you from yourself.
When you assume, you make an ass out of you and me.
> Otherwise you sound like some arrogant technically literate person who feels that he or she knows what less technically competent people should be allowed to do with their own possessions because it's for their own good.
Yeah, the kind of arrogant technically literate person who has seen how less technically competent people use their products and what they ask of them. Ever noticed how the only people who give a flying duck about openness and modifiability in regards to iOS are a tiny minority of tinkerers, hobbyists, foss zealots and such? Oh, and press pundits of the calibre of Dvorak, Cringely etc looking for a headline.
How did FOSS come into this? You're all over the map here.
To remind you, this was a conversation about products that allow kids to learn through tinkering vs those that are completely locked down.
I don't particularly care to debate you on random semi-related topics, especially when you aren't really debating with shit like this:
> When you assume, you make an ass out of you and me.
Ok, since my little rhetorical device gave you a smart ass way to evade the question let me put it another way:
You say "A closed system is more tightly controlled. The user cannot fuck every aspect of it, and so he wont. You cannot assume the average users knows what he is doing or what he is installing on his device. Historically he has been shown unable to do so."
Why does this problem not appear in any other industry? Normal people seem to do just fine with their cars, clothes, power tools, etc. All things that it is possible to tinker with and break doing so. People who take the risk and have some expertise do, and those that don't just enjoy the purchase and leave that stuff to the experts. Historically they have been shown to be able to do so.
I think we agree that forcing users to tweak all sorts of configurations either during setup or during normal use isn't the right way to design software. That's not related to whether it's possible to modify things if you want to.
Do you honestly not understand that this is a narrower subject than the Apple way (closed, novice centric simple UI, proprietary) vs. the Linux Way (open, expert centric UI, free)?
My points were only ever about open vs. closed, as was the article we were commenting on.
>A hacker's "freedom" to poke around in the system has been traded for the freedom of an average person to use the damn thing without worry.
The two are not mutually exclusive. Some Macs have been made in the past that were easy to open up, should you choose to. Others required special screwdrivers be used, removing choice. What is wrong with giving grown ups who have purchased hardware the ability to choose what to do with it? Support can simply be limited, for those who choose to customize, to system wipes.
Exactly. Apple's iOS policies are not required to preserve Apple's user experience or protect cellular networks, no matter how many times they claim they are. They are about Apple making sure they are in control of the iOS ecosystem, no matter what. That motivation, of course, is pretty scary by itself...
There's a surprising amount of belief in historical determinism in the discussion here, which is utterly unjustified. It is easy to believe that the future will fix itself on its own. But such beliefs are mostly the result of not being able to imagine alternative histories in which things don't get to be so rosy.
The truth is that so many things in the history of computing were improbable, and made possible by a select few very strong-willed individuals. Personal computing was improbable. The dominance of open source was improbable, etc.
It's up to us to ensure that the post-PC era won't be the era of walled gardens, which is definitely where it's headed. And that would be perfectly fine, if our walled gardens weren't so darn suboptimal.
True, our kids won't grow up hacking the same systems we did, just like we didn't grow up hacking the same systems as our parents. But there is something basic to the human spirit that guarantees as long as this branch of the gene pool is still around, we'll still tinker and build and create. It's just that the blocks are constantly changing. Which keeps things interesting, and is a good thing.
Yep, the more things change, the more they stay the same...
When I was a kid, you could buy germanium diodes and ferrite-core coils, and make yourself a crystal radio - it was _magic!_
But...
There was no way I could have gotten hold of something like an Arduino - probably not for _any_ sort of money, never mind the sort of budget kids' birthday or Xmas presents run to...
The same with a lot of other stuff, I had to wait till my first paper round job to be able to save up enough money to buy the ~$200 (in 1982 dollars) radio control gear before I could build an rc plane. These days that's $24 or so delivered from China...
I remember playing with my father's grandfather's predecessor to Meccano, stone building blocks with pressed steel girders. I suspect children today probably look on historical toys like lego with similar charm/disdain...
Exactly. The end of the PC era doesn't give rise to the 'iPad era,' it gives rise to Ubiquitous computing, and Apple knows this.
The trajectory goes:
mainframe: one computer, many people
personal computer: one computer, one person
ubiquitous computing: many computers to many people
If anything, people are going to be programming the shit out of everything in the future, not pulling back.
Someone, somewhere will have to program all these wonderful gadgets.
While it is conceivable that all development will occur only inside big mega-corps, I somehow doubt it. People have a voracious appetite for new content.
Smart money is on the tools to create new stuff becoming more widespread, not less.
Yes. The fact that all these useful sensors (touch, display, camera, location, attitude, wifi/3g) are integrated in one small, portable, and not-that-expensive device opens interesting new possibilities.
Think of the hobbyists who send their phone into space on a balloon.
That's true, I have no worry that "the future" is going to be worse.
But that's long term, it doesn't mean that we don't want to avoid repeating the eras where this kind of tinkering was repressed or legislated against. Those eras slow down innovation and take wasted effort to recover from.
Just look at what happened on the web when all those tinkerers stopped having to break into the tools they wanted to use and could spend all their energy doing new things.
My first computer booted into a BASIC interpreter. That was pretty awesome, and gave me an early window into programming. On the other hand, it didn't have a lot of things. It didn't have a text editor. It didn't have anything other than a BASIC interpreter. I knew that the games I played on that machine weren't written in BASIC, but I didn't know where to go to learn how they were made.
My current computer, made by Apple, comes with interpreters for several languages, all more powerful than BASIC. It comes with text editors. And, best of all, it comes connected to an Internet from which I can not only download interpreters and compilers for all sorts of languages, but find extensive documentation on how to use them.
A child today has access to so much more in the way of programming tools than I did. We live in a glorious golden age of hobbyist programming.
I think the author's point is that the post-PC era implies that the golden age you're describing is dying, replaced with a clean future, wherein people don't have computers with text editors and that can run interpreters and compilers etc.
The children of the post-PC future will only have access to iPads. Or Kindle Fires, as pure a consumption device as I can imagine.
Sorry, that part was hyperbolic. But it's true that what the author is describing is the end of precisely the golden age you describe. To whatever degree consumption devices replace production devices, there will be less exposure to them.
It's not an insult. But consumption devices are lamentable as far as they replace production devices. I love books, and I don't sneer at children who read them. But I would like all children to have access and exposure to writing tools so they realize that they too make books.
Books are a horrible example of a consumption device, the printing press profoundly opened up the ability to write and distribute books and ideas.
Sure, the modern pace of life makes writing and printing a book seem difficult, but at the time modern books were a fantastic democratization of the ability to produce literature and distribute ideas compared to the status quo of being illiterate and going to church to be read to by those with access to the books.
Nobody's gonna buy devices that let them create in those ways anymore? Surely there will be demand for things to be created in the forms that those devices aren't good at creating for.
But I think the author's point is that part of his penchant for making is due to exposure as a child. In a post-PC era, even if professionals still have PCs, many fewer people will than do today, because they won't need them. That means fewer people, fewer children will be exposed to them as the author was. I think it's a valid concern.
I suppose I (personally) think it's generally inevitable (and even mostly good!) that machines continue to evolve the way they are doing now - "good" primarily because the newer types of systems provide so much more creative power to non-engineering types, which make up the majority of people.
But, as someone who is deeply interested in computers today primarily because of the power that was available to me as a young child on an early PC, I think that one of the most noble goods that can be done for the future is to provide tools to allow future generations to have access to similar capabilities for exploration and design like we had. The most obvious way to do so seems to me to be "open Web" tools. Web stuff that lets kids play the way we did in BASIC (hopefully even better!) seems like a really good thing to me.
There may be fewer of these sitting around living rooms. But a greater proportion will be available to dual-boot as Dad won't be so worried about losing his TurboTax data.
The devices supplementing or supplanting them will give kids access to the entire library of human writing prior to 1900, plus an extraordinary ability to review art and musical history. They'll be one copyright violation away from K&R. They will essentially be dropped into the Library of Congress, with the Louvre and the Hermitage bolted on at one annex. In the basement will be the nightclubs of Harlem and the salons of Vienna. This is before anyone uses these things for anything _new_. And these contraptions are somehow _limiting_?
Nor will they remain closed. Proprietary tablet systems will suffer the same relentless competition that has grown $600 of desktop from pipe-dream to humdrum commodity. People and companies will code these things precisely to match the capabilities, and revenues, now locked away.
I get the philosophic distaste for closed systems. But haven't the past ten years demonstrated that, in a free market, they can't stay closed? Or that the revenues available to proprietary innovators can motivate accelerations in innovation? Worries about "closed" were one thing in 1995 when people thought they were doomed to Windows. Maybe it's time to adjust the anxieties for what we've seen of the long-term competitiveness of "open".
Plenty of people reference things like Apple's dominance of the tablet market and Apple's profitability compared to other device manufacturers and claim that "this time it's different". That may be famously wrong with respect to the stock market, but we have relatively few examples of the evolution and adoption of computing platforms so I'd say it is a much harder call here.
This guy needs a better metaphor. Imagine a world where Legos didn't exist? But Legos do exist. It has probably never been easier to be a Lego hobbyist.
What this guy should do is come up with an actual example of a hobby that has died out because the parts are no longer available. Used Model-T Fords. Urban horse infrastructure. Manual typewriters. Dial telephones.
What one realizes is that actual extinct things are only extinct because the demand is gone. They are still making vacuum tubes somewhere, for god's sake - audiophiles can have passionate debates about how much better the tubes were back in 1967, but apart from such quibbling 2011 is still a pretty good time to build a tube amplifier. There are lots of plans on the Internet!
So I wouldn't count the PC out yet. Aren't they just about to release a tiny PC for $25? It has never been cheaper or easier to be a PC electronics hobbyist.
The point is that the "post-PC era" will make it so that being someone who thinks that you can modify or change the computing experience is relegated to "oddball hobby" status, like it would be if someone today wanted to start playing with vacuum tubes.
The current PC ecosystem lets you explore and create and play and learn on the same tools that you use to consume. You can build a lego kit with the instructions, or you can use the pieces and make something that's uniquely you. You can run the apps on Windows that you download, or you can write your own without asking for anybody's permission.
In the magical "post-PC" world, where everyone is inexplicably using an iPad, you can run what Apple says, period, and if you want to develop something they don't approve of, welp, too bad.
This loss of flexibility, exploration, and sense of wonder is bad for future computing developments and bad for future computer users.
> The point is that the "post-PC era" will make it so that being someone who thinks that you can modify or change the computing experience is relegated to "oddball hobby" status
It was like that during the pre post-PC era, anyway.
You can always ssh to a server and do development work there. That's what I do at my job anyways. It's advantageous because compiling programs on the server doesn't affect the computer you are using.
Could not agree more. I can only imagine how much closer my thirteen-year-old self would have come to building my iron man armor had I been armed with a broadband connection and a digikey.com account back in the 80s.
Let me preface this by stating that much of what I'm about to post, I've learned from my 18-month-old boy.
Aside from the sentimentality of a parent watching his kid develop physical and mental abilities, it's a very interesting phenomenon watching the way the development occurs.
Kids basically learn one of two ways: imitation and exploration. Imitation is quite simple, so I'll move right on to exploration. Now, there's recently been a lot of discussion about whether kids in America are being hampered by overprotective parents. I believe that notion is quite correct.
My son has legos, but he also has access to my touchpad and android phone. To him, they're not so different. Legos teach him that he can piece them together to make new things or break them apart for the components (creativity). The electronics teach him that there is a logical action and reaction (logic).
It was absolutely amazing the first time I saw his eyes light up because he figured out how to wake the device from sleep or how to unlock it. It's a physical indication that a neural connection has been made in his head. To me, that indicates these types of tools will indeed facilitate a much earlier exposure to logic than we have seen in previous generations.
So to come full circle, I don't think the death of legos (I don't really foresee it happening) will hamper our kids because there will be other mediums. As long as we don't hamper exploration, kids will develop both creativity and logic.
This. When my kid is big enough I'll buy him toys he can break, like Legos; later I'll buy him gadgets he can break, like PCs and open-garden tablets. If it results in him having to fiddle with the OS in order to bypass the anti-porn barrier I'll set up, it will be all for the best.
There are multiple BASIC and Scheme interpreters in the App Store, as well as apps for writing HTML documents. If you want to write proper apps, an iOS developer certificate costs money, but so did Visual Basic back in the day, and on Android or a jailbroken iPhone it's free.
Post-PC devices are inherently bad for programming because they don't have proper keyboards, but by the same token you'll likely have access to a keyboarded computer to do homework on, at least until someone invents a new input method that's better for both. (Nuance dictation for code, anyone?)
> Post-PC devices are inherently bad for programming because they don't have proper keyboards
I went on a road trip this summer and continued to do some programming while on it. I found the iPad to be more comfortable than my laptop when sitting in the passenger seat.
A keyboard is pretty hard to beat when you've got a desk in front of you, but when you are left in weird positions, the iPad actually isn't that bad for editing code.
The author does sound like a retro-grouch. As I often say to hipsters on fixies, “For you it’s retro, for me it’s nostalgia.” That being said, the post-PC era does not mean that nobody has a PC, it means that people don’t have to buy PCs to do non-PC things. Imagine if you needed a PC to watch television. It’s the same thing with email, FB, and web browsing. Why do I want to know how to format a hard drive to read Hacker News?
Steve Jobs described PCs as being like pickup trucks, and he described post-PC devices as being like all the other vehicles people use, from bicycles to SUVs. None of those made the pickup truck go away, and for that matter there is a sizeable market of people who take pride in driving a pickup truck even though they never haul anything bigger or dirtier than a chest of drawers in it.
PCs will be the same way. Available and cheap if you need one, and also available for those who just like the status symbol of being a tough guy who fdisks and bash scripts and thinks curl beats Firefox.
> the post-PC era does not mean that nobody has a PC, it means that people don’t have to buy PCs to do non-PC things.
Fewer people needing to buy PCs to do non-PC things means fewer people buying PCs. Fewer people buying PCs means fewer people exposed to the PC things early on. Fewer people who are familiar with the things that you use to make things.
Well, suppose all you have is an iPad, and you want to write some code. Sad panda, yes?
Well, not necessarily.
Let's say you go plop down 50 bucks or whatever on a keyboard -- or maybe you don't because you have a bigger tolerance for trying to type code on an onscreen keyboard than I personally do.
Next, you have to locate yourself a web browser. And there's one right there on your iPad already, so you're done.
Next you have to find somewhere on the internet where you can do coding. Here's the first one I found from a quick Google search: http://www.coderun.com/ . It's a browser-based IDE for php or asp.net that allows you build, test, and deploy a website right from the comfort of your own browser. I'm sure there are other options out there, but I didn't look very hard.
So it doesn't seem like you're going to have to work very hard to find a way to learn to write code as long as the web browser doesn't go anywhere. Even on an iPad.
This is like telling me that it's lamentable that I can't fix a car. Had I been forced to spend many waking hours tuning and maintaining my car, I would have learned something.
Well, I spent those hours programming, and the people who love cars are using the time saved by their iPad to tinker with internal combustion engines and electric vehicles.
Who's forcing whom to do anything? Why is having the option to do something such a bad thing? Windows gives its users plenty of opportunity to tinker. Most of them don't bother. With the iPad, however, the option itself is taken away.
Meh. Unfounded fears. There will always be computers/desktops/laptops. Just because tablets and smartphones are ubiquitous doesn't mean that they won't exist.
People will always want huge monitors, and there will always be programmers. Granted, in 20 years it might not be someone sitting in front of 3 LCD screens, but it will be something.
Sure, but desktops and laptops are starting to become more and more like tablets and smartphones. Look at the changes that started in Lion, especially the Sandboxes. Look at what is in store for us with Windows 8. It seems to me that if things keep heading in the same direction, laptops will end up just as locked-down as tablets.
If you think that the market will never let it happen, realize that at least 80% of consumers only really use their computers for the web and/or for games. What do you use your smartphone for?
In 1951 you could have whined, "With the rise of programmable computers, the Engineer can simply turn his brain off and let the computer do all the work. The era of craftsmanship has come to a close. No, no need to think dear friend, we have ourselves a Computer. Aughh!"
Then, in 1961 you could have sighed, "The knowledge of how to maintain a computer will be gone forever with this increase in reliability. How will someone ever know truly how computers work unless they have to fix them piece by piece."
Then, in 1971 you could have pined, "With the rise of these time-share based operating environments, the future programmer has all the hard things taken care of for them. All that you need now is a data-bank administrator and record input clerk. There is no future in computing!"
Then, in 1981 you could have lamented, "Baugh! The rise of these pre-built micros means that the future generations won't know how to work a logic analyzer or an oscilloscope. They will never use a soldering gun or know the joy of assembling a memory board because they will just drop it in the slot. Ug!"
Then, in 1991 you could have scoffed, "well with all these new fancy compilers, nobody will appreciate the joys of directly manipulating registers and stacks. Instead, they will spend their career in higher order abstractions without ever truly knowing the soul of the machine."
Then, in 2001 you could have cried, "This era of the world wide web is hastening the decline of single system software and entertainment consumption is simply supplanting productivity for the largest use case of computing. Programming has become nothing more than playing Oz in the Emerald City; pulling pre-built levers as the scarecrows and tin-men of the world marvel on the sidelines"
And if you did, you'd be missing the point. The explosion of computing in the general public over the past 20 years has seen the explosion of innovation - "the internet", as an industry, is really only 15 years old. Not even that. And yet it's innovated and transformed forever our modern societies.
That innovation has been driven by the masses being able to create things on their computers. Something that potentially restricts the ability to innovate across the masses shouldn't just be handwaved away like yet another whine.
Software represents only a fraction of "everything that's on the Internet", creators of software only a fraction of creators in general, and creators in general only a fraction of "the masses".
That there has been so much software written, and content created with that software, and support structures to supply that content to you (like networking or ecommerce) indicates that folks have to be able to innovate - the greater the mass of people that can tinker, the greater the mass of creators you get.
There was a much higher ratio of creators to users in the computer world in the 60s. But there wasn't the sheer volume of innovation there has been since the masses got involved - even though there's a lower ratio of creators, the absolute number of creators is far higher - hell, we're even on a site that is designed to do nothing but cater for this influx of creators, and it's getting one application a minute.
The fewer people that have exposure to tinkering, the lower the number of innovators overall.
Yes, and? Pretty much all the 'non-software' stuff you create with some form of electronic device requires software to make it. If there are fewer software writers, there are fewer kinds of software, and less non-software-creation stuff you can do with it.
We already have this era right now. It's called the "Our computer classes are typing and learning to use MS Office" era.
Nobody really learned how to do anything complex on the computer. CLI's are scary. Programming is hard.
As a result, most computer users, many of whom are very intelligent people who would easily be capable of basic programming and understanding unix pipes and similar, are never exposed to it, and thus it's a foreign language.
Don't blame the iPad. Blame GUI software and the illusion of ease hiding complexity for where we're at.
I'm not. Just port a BASIC interpreter to the iPad, or write Logo for the iPad. No one laments the fact that computers come preassembled now and no one needs to learn how to solder to have a computer. Imagine a BASIC interpreter that DOESN'T lose everything you typed in when you turn off your computer.
The opportunities the iPad enables far outweigh the cost of having to learn to program to use your computer. The iPad will get more people using computers in more ways, which, while reducing the percentage of computer users who can program, will vastly increase the overall number of programmers.
I agree that the device offers many opportunities. Nevertheless it's not just about porting some interpreter, you also have to deal with apple policy which is not that friendly to such ideas. See for example how Scratch was rejected last year: http://www.wired.com/gadgetlab/2010/04/apple-scratch-app/.
Apple doesn't control the Internet, and while HTML5 has its own limitations, you could still probably port Logo to it, especially since Logo isn't all that complicated.
The writer makes the wrong assumption that the current mobile devices (e.g. iPad) will never have the creation tools that the previous generations of computer have had. IMHO, it's only a matter of time. And in the same vein it's never been more accessible to build something better than what's out there and get it out to the public.
I think that may be a fairly safe assumption. The very form factor of the iPad and tablets in their current incarnation makes them poorly suited for most types of content creation.
However, among content producers tablets supplement rather than replace other form factors of computers for the time being.
Also, docking stations are likely to improve and become more common. A future tablet which would be more powerful than current generations with a solid docking station that would support a full keyboard, mouse and second monitor is a standard computer with the ability to undock it temporarily when not doing real content creation...
"The very form factor of the Ipad and tables in their current incarnation makes them poorly suited for most types of content creation."
Excuse me?
The iPad and tablets are excellent for content creation.
I created a program to analyze the spectrum of voice in OpenGL, and multitouch is way more natural for manipulating thousands of things at the same time (realtime zoom, panning and selection). It feels like a highway for your mind, not constrained by language (which is so slow).
If you program using blocks it is way faster and easier than using keyboard.
...look at the keys and then look at a computer keyboard.
Teletypes used typewriter mechanisms because people were used to them, and then people made programming for teletypes as that was the only possible input in those days.
It does not mean that it is the best input for a computer, far from it; you just have to redesign everything in the programming world too (as Apple did with the iPhone).
Of course, this takes work and good programmers are lazy.
I grew up on Windows PCs, the barrier to entry for tinkering with apps on Windows was way higher than building things with iOS SDK or HTML. I am not worried at all about the children of tomorrow.
I think it's more meant to be "bulk of computers people use in general won't be traditional desktops/laptops" and less "desktops/laptops are going to be eradicated"
Your iPad has a browser. That browser runs HTML5 and JavaScript. That's the quick way to get to something programmable. If you want to get down and dirty, drop thirty bucks on an Arduino.
Personally, I think the post-PC era is going to be even more awesome.
I could program an Arduino with a pair of tweezers, son.
...
Seriously, I have no idea. I suppose Apple's too cool for USB ports, but maybe Wifi? That ups the cost of your Arduino setup, it's true, but still - there has to be something you can do to communicate between your iPad and your Arduino that doesn't involve a general-purpose machine in between.
There's now a blessed serial cable with a dock connector, and some form of an arduino programming environment on the iPad. The only drawback is that the cable is more expensive than the arduino.
The "post-PC era" is greatly exaggerated. It may eventually happen but only for those kinds of consumers who would have never bothered with LEGO anyways.
For hackers, the PC (or Mac) isn't going anywhere.
In other words, if everyone uses the iPad, who creates iPads and all the apps?
Legos, I should remind you, are an existing technology which didn't exist two centuries ago. And I am sure stones and sticks were precursors to the invention of Legos, and now Legos will be replaced with computers as a tinkering device.
Lots of innovation has come from post-PC devices, including all of the development for these app stores and jailbreaking. Playing with my iPhone 4S, especially Siri, I know this is the future.
Laptops will still be around for a long time. Microsoft's vision is admittedly confusing, but I think not everyone needs a full computer. Those that do can have both or continue to use their computers.
I think having a lower barrier to entry is always good, and I think we have to let the way we build things evolve; otherwise it goes against this whole ideology.
I think I will be like those old men who still rave about their vinyl records and wouldn't part with them. As long as PC devices as we know them, such as desktops and laptops, are produced and supported by websites and apps, I will use them. I believe there will always be a use for a more physical computing experience.
What is ironic is that this post-PC era is driven largely by open source software: Unix, the gcc toolchain, WebKit, etc. (iOS); Linux, Java, WebKit (Android); with "the cloud" built on top of Linux, JavaScript, Ruby, PHP, Python, Java, Apache, Nginx, git, svn, etc.
So many devices might be "closed systems" but they are almost all built on the shoulders of open standards, open tools, and as a creator you have so many more opportunities to do interesting things in both the hardware and software space for next to no investment that it's incredible.
Post-PC might change how end-users consume our software, but it's not going to make the tools to create it less available.
Stop whining about the theoretical future and build something.
Why scared?
There is nothing to stop someone from hacking a device - but then do we want all devices to be hacked? If Siri is somehow integrated into my car in the future, do I want my kid hacking into it and messing around with it, potentially resulting in who knows what.
The ability to play around and change things is always gonna be there, and in fact with locked devices in some ways it's even better - you'll have to be more enthusiastic and perhaps more talented to hack it, and that breeds an entirely new generation of hackers.
And remember - even Apple ended up hiring the person who created his own version of an iPhone notification system on his jailbroken phone.
Obviously the author doesn't have kids.
I don't think anything can prevent them from breaking and changing things. Maybe they won't do it with iPad (although I doubt it), but they'll still do it. Part of being a human child and all.
Is it bad that people no longer learn how an MBR works, or how useful command lines are, or what real memory versus protected memory is, because we have GUI operating systems?
Are there fewer people going to hack their own chips, connecting wires together and learning how to write microcode, because we have devices that don't require it?
Technology always works like that. Today's kids start with an iPad and iPhone. A few years ago kids started with a GUI OS like Windows or MacOS. A few years prior they started with DOS. Before that they started with a Commodore or Amiga. It's just evolution, and every time you add another layer of abstraction.
It is just a PC with multitouch support, this does not mean that all the tools that create things are going to die, just because today they are designed for the mouse.
When Apple started with the desktop paradigm, there were no tools for it. They had to convince people to use the mouse and clicks first; they even had to remove the cursor keys so developers were forced to adapt their software.
The iPad has done a fantastic job of convincing people that, yes, people love the multitouch interface, so if you want to make money you need to adapt your software. Adobe and others are listening to the signal.
And if everything fails, we will always have Linux.
What the writer is trying to get at is the idea that people will just accept that computers work by magic. That's by far the dominant way most people view automobiles today, for example.
And while this is distressing in some ways, I don't see it as necessarily a show stopper. People who are curious will find a way to peek under the veneer of magic and fiddle with the internals. Maybe kids will tend not to learn programming via command line apps written in BASIC, Java, Python, or C, but that won't stop them from learning JavaScript. And it won't stop them from learning C eventually either.
So essentially the promise and dream of the post-PC era comes at the cost of killing the current golden age of Computers? I would say we are being hyperbolic here.
For what it is worth, I (and I know of so many other people) have an iPad but end up spending my entire time (around 10-12 hours every day) in front of a Mac. This is not going to change anytime in the near future because quite frankly the iPad is just not capable of doing all that I want to do on a "computer".
So yes, there is a post-PC era, but think of it as a new species that co-exists rather than an evolution that simply eliminates the last one.
This doesn't make any sense to me. The presence of post-PC devices doesn't void the existence of the PC.
Today you can go on ebay and for a hundred bucks get a several year old laptop and install a totally free operating system. You'll have access to dozens of free programming languages, complete with free compilers/interpreters, tools, and documentation. You can go on the internet and find tutorials, books, ask questions, and have conversations with actual programmers. Things have never been better for those with curiosity, kids or otherwise.
Not sure if the Lego example is a good one. Lego has been available for over 60 years, and still going strong. According to his website, the OP is 18 years old. If his fears had any basis, then Lego would have disappeared way before he was born.
Another important point is that our experiences shape us in specific ways. Our kids' experiences will shape them differently. Having iPads does not make them any less inventive or curious. It's just that we can not imagine, today, what they will think of in the future.
"If you can't break it, try harder. If that doesn't work, whack the shit out of it."
You are 100% correct in your observation that tinkering leads to innovation. I think that just means that we're going to see different kinds of tinkering in the future.
When I was growing up, that meant overclocking chips because you didn't like the speed of your machine. Today that means jailbreaking an iPhone because you don't like Apple's rules.
I would have much rather been breaking iPhones than XTs when I was growing up :)
Those that are driven to make things will be inspired by what the post-PC landscape has to offer, not dissuaded.
I don't want to completely discount this short post, but the FUD here seems misplaced. Many more people are being attracted to post-PC devices vs. PCs. The fraction that go on to become engineers will make for a larger pool of hackers, not a smaller one.
Things don't have to be hard to inspire one to hack. They just need to be sufficiently interesting.
People are afraid of technology because of the crap tech from the past 20 years. What do you think happens when those people stop being afraid? Suddenly software development feels accessible to a bunch more people.
I'd argue that people won't stop wanting to tinker, that in fact more people will start to because they no longer think it's impossible.
My high school (1997-2001) reinforced this. They suspended kids from school for tinkering. And by tinkering, I mean "bringing up a DOS prompt instead of using the pre-canned IBM ICLAS menu system".
My family didn't own a computer until after I had read my first programming book (I was in 2nd grade, the computer in my classroom fascinated me, and the public library had a book about BASIC.) Curiosity always finds a way.
A hacker's "freedom" to poke around in the system has been traded for the freedom of an average person to use the damn thing without worry. This argument about post-PC devices and whether or not they're good for hackers is tired. Be thankful that powerful, accessible devices are being put in the hands of millions of enthusiastic people that you have a chance to influence and affect through software and services.
Finally, let's address the specter of censorship. This is brilliantly simple (in the United States at least). Address the entity censoring you. If it's a private entity then accept that they have no legal, ethical, or moral obligation to give you any access to their customers (much less complete access). Roll up your sleeves and compete. Should you be dealing with government censorship then pursue justice in whatever way your heart guides you.
But please stop whining.