Thinking of "programming" as a completely separate field, to be "done" by specialists only, is highly unfortunate.
All "programming" really is, is formulating precisely what you want done; a compiler can take it from there. Where a specialist in "computers" comes in is that he can optimize your algorithms to run more efficiently. But more often than not, what "programmers" are tasked with is to "just make it work", with little more specificity than that. And then the garbage inputter gets upset when he gets garbage back out.
SQL was originally, back when clarity of thought was still considered fashionable in executive suites, intended to be used by decision makers. "Specialist" IBM "programmers" provided the code that accessed and processed the relevant records efficiently, but the person who wanted an answer, often a C-level executive, was the one tasked with specifying what he wanted done, with enough precision that his question made sense.
I agree with you that programming shouldn't be done by specialists only. I also think that's the way computers are supposed to be used (as Steve Jobs put it, a bicycle for the brain).
However, I have also experienced the problems this can cause. I worked at a software company where the software was started by non-specialists who then took over the development department. It was a mess.
And what if two commands get stored in quick succession, such that the first results in a state that renders the second impossible? Particularly if some of the balance updates in the first command are contingent on others (for example, the credit line backing a checking account is updated if the deposit balance < 0)?
Financial transactions are pretty much the poster child for atomic, multi-update transactions and pessimistic locking.
You can always save the fact that a transaction was started, read the account's state (including the most recent transactions as an ordered list), calculate the validity of the item, and update the success/failure accordingly.
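A minimal sketch of that record-then-validate flow, using SQLite in Python purely for illustration. Everything here (the table names, the single "checking" account, the no-overdraft rule) is an assumption for the example, not anything specified in the thread; the point is only that `BEGIN IMMEDIATE` takes the write lock up front, so two commands arriving in quick succession are serialized and the second is validated against the state the first left behind:

```python
import sqlite3

# isolation_level=None puts sqlite3 in autocommit mode, so we control
# transactions explicitly with BEGIN IMMEDIATE / COMMIT / ROLLBACK.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.executescript("""
    CREATE TABLE accounts (id TEXT PRIMARY KEY, balance INTEGER);
    CREATE TABLE tx_log (
        id INTEGER PRIMARY KEY,
        account TEXT, amount INTEGER, status TEXT
    );
    INSERT INTO accounts VALUES ('checking', 100);
""")

def apply_tx(account, amount):
    # BEGIN IMMEDIATE acquires the write lock immediately (pessimistic
    # locking), so concurrent commands cannot interleave their reads
    # and writes.
    conn.execute("BEGIN IMMEDIATE")
    try:
        # 1. Record the fact that a transaction was started.
        cur = conn.execute(
            "INSERT INTO tx_log (account, amount, status)"
            " VALUES (?, ?, 'pending')", (account, amount))
        tx_id = cur.lastrowid
        # 2. Read the account's current state.
        (balance,) = conn.execute(
            "SELECT balance FROM accounts WHERE id = ?",
            (account,)).fetchone()
        # 3. Validate: reject any command that would overdraw the account.
        if balance + amount < 0:
            conn.execute(
                "UPDATE tx_log SET status = 'failed' WHERE id = ?", (tx_id,))
            conn.execute("COMMIT")
            return False
        # 4. Apply the update and mark success, all in one atomic commit.
        conn.execute(
            "UPDATE accounts SET balance = balance + ? WHERE id = ?",
            (amount, account))
        conn.execute("UPDATE tx_log SET status = 'ok' WHERE id = ?", (tx_id,))
        conn.execute("COMMIT")
        return True
    except Exception:
        conn.execute("ROLLBACK")
        raise

print(apply_tx("checking", -60))  # True: 100 - 60 leaves 40
print(apply_tx("checking", -60))  # False: the second command is now invalid
```

The second withdrawal fails precisely because the first already committed, which is the "first renders the second impossible" case handled correctly: the failure is detected and logged, rather than producing garbage state.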
It is not the transaction itself that is hard, it is the network partition. E.g. what happens if two network partitions each approve transactions that wouldn't have been accepted if there were no partition?
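A toy illustration of that partition problem (the account, balances, and approval rule are all made up for the example): two replicas that cannot see each other's writes can each approve a withdrawal that a single, unpartitioned ledger would have rejected.

```python
# Two replicas of an account, each starting from the same snapshot.
# During a partition, neither sees the other's approved transactions.
snapshot = {"balance": 100}
replica_a = dict(snapshot)
replica_b = dict(snapshot)

def approve(replica, amount):
    """Approve a withdrawal against this replica's local view only."""
    if replica["balance"] - amount >= 0:
        replica["balance"] -= amount
        return True
    return False

# Each side of the partition approves a withdrawal of 80.
print(approve(replica_a, 80))  # True: 100 - 80 is fine locally
print(approve(replica_b, 80))  # True: the other replica also still sees 100

# After the partition heals, the merged history withdrew 160 from 100.
merged = snapshot["balance"] - 80 - 80
print(merged)  # -60: overdrawn, even though each replica validated correctly
```

Each replica ran the validation logic correctly; the inconsistency comes entirely from the partition, which is why this is the hard part rather than the transaction itself.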
I've probably gotten off base here by wanting to perform arbitrary actions against services I may or may not control in the course of satisfying a command, and worrying too much about made up corner cases.
If this DB is the sole record of The Money, and I can move some quantity from X to Y in a transaction, then that's fine by me.
Even by the "standards" of Newsweek, this article is supremely stupid.
Current high level languages make so few concessions to the underlying execution engine that all they really are is convenient notation for clear and logical thought. The only reason they seem needlessly hard to your average Newsweek hack is his/her own shortcomings in the clear-and-logical arena.
English, as well as other "human" languages, is (perhaps) good for making highly concrete grunts to one's hunting party, and for making jokes to impress chicks. It was never well suited to describing complex processes and procedures. And complex processes and procedures are what define an advanced economy. Every area of advanced economic endeavor suffers from having too much of its workings half-assedly and imprecisely "described" in a "human" language. And honestly, for no other reason than that too many "stakeholders" are too bloody dense and lazy to learn anything better suited.
But I guess there are those who think mathematics would be better off without all that weird, difficult notation as well, with mathematicians instead hand-waving and counting on fingers in front of calculators "smart" enough to learn from humans...
It's not "our own" government, for any "our own" that purports to include me. It is a junta with enough guns to have their way with people across a continent. Pretending otherwise may sound "non extremist" and "responsible", but it's still ridiculously naive and bound to lead to nothing but disappointment. In ever increasing doses.
Practically, a good start is to recognize that not all valuable communication mechanisms benefit all that much from minimizing the latency of packet delivery. IOW, not all, and in fact not even most, means of communication really need the kind of "apparently real time" performance that telephony requires.
Moving services that don't to a protocol whose focus is on making mixing and anonymizing simple, reliable, and robust, rather than simply maximizing throughput and minimizing latency, would make end-user security and anonymity guarantees much easier to make. And for many types of channels, this can be done with hardly any negative side effects, given how fast the underlying switching infrastructure has gotten.
Current protocols were necessary for any kind of usability when hardware was slow and expensive. And they were good enough privacy- and security-wise when even the NSA didn't really have the means to do much wide-net spying at the network level. But neither of those realities of the original internet holds anymore. Instead, sorry for the pompousness, the new environment is so different as to require, or at least recommend, something almost akin to a "new internet", built with the "new" threats to communication in mind.
I'm not working anywhere, at a startup nor anywhere else, that could conceivably "profit" from any of the above ramble. If what I'm saying makes no sense, it's because I'm a moron (or at least misinformed), not because I'm a scumbag.
> It's not "our own" government, for any "our own" that purports to include me. It is a junta with enough guns to have their way with people across a continent.
Virtually all governments spy on their own and other countries' citizens these days, not just the US. We Germans spy, the Brits and the rest of the Five Eyes spy, the Russians spy, the Chinese spy, the Iranians spy, and I bet that even North Korea has some quite good hackers.
And as for the rest of your comments: indeed, a "new internet" would be required. But as you can see from the adoption rate of IPv6, we're stuck with this mess unless quantum computing forces us to switch.
IPv6 doesn't fundamentally offer end users anything far beyond the current standards.
I'm imagining a protocol that less tightly coupled endpoints could be written to, while the "switches" merely translate traffic to route it over current infrastructure. A more application-agnostic version of Mixmaster or Tor, so to speak. The important part is really to get enough of a variety of end-user apps written to it to prevent anyone from knowing much about the traffic simply from the protocol spoken. Then, over time, to optimize away more and more crud until we've got dedicated hardware. It may still be a bit utopian, but the current mess isn't really serving people all that well anymore, either.
Which is a pretty good indication that all meaningful solutions to the spying problem need to work at a level more fundamental than government. Routing around them, or rendering them impotent, by design, if you wish.
The relevant question isn't whether it is "breaking things" in the abstract, but whether it is breaking things in a way meaningfully detrimental to end users. Or, if you really want to stretch it, perhaps also include externalities imposed on the "internet as a whole."
In the era before the omnibox, I personally couldn't see much in the way of "breaking things" on account of OpenDNS. That doesn't mean you and others can't disagree, but it's not as cut and dried as your statement seems to imply.
All "language" is not the same. To be reliably and consistently machine executable, a language needs to facilitate manipulations that are logically rigorous, while in a language for general human communication, the emphasis needs to be on manipulations that are more fluid and aesthetic in nature. Facility with one does not necessarily translate well to the other. Particularly in the English -> programming direction, which makes it doubly dangerous to assume it does when hiring for STEM-type jobs. Extreme aspies and savants aside, most good programmers can communicate well in English, in the sense that they can formulate their intent precisely, while the converse is much, much rarer.
One threat that is getting bigger in the Bay Area is the ever-increasing concentration of, and dependence on, startups. Most of them are not living off anything remotely resembling cash flow.
The tech ecosystem itself is one thing, and it is largely populated by people who can afford a bit of a hit. But nowadays, everything is priced and organized around ever more millionaires being minted in tech every day, and the spending they generate. Take even some of that away, say by making money less artificially cheap, and there is potential for plenty of pain and dislocation across the area.
Much like finance in NYC, the tech boom/bubble is driving costs so high in the Bay Area that the only remaining jobs are either in tech, or in providing highly paid, localized and personalized services to tech workers able to afford paying thousands to have their toenails clipped, lips kissed and walls decorated with Van Goghs. That leaves those providing the latter highly vulnerable to any slowdown in the increasingly monocultural spigot from which almost all funds here now flow.
Yeah, in a way it's not all that different from Midwestern towns like Gary, Indiana and Youngstown, Ohio - booming economies that depended on steel. Once the steel went away...
The kind of aggressive court interference discussed in the article will (or at least should) be the catalyst to make that happen.
Anonymous comments really ought to be hosted anonymously and in a distributed fashion, preventing them from being taken down or "used against you." The supposed "fire in a crowded theater" exception is simply not relevant to internet comments.
Like any other scheme to make commenting more expensive (resource-wise), it implicitly assumes that the value of what one has to say somehow correlates with how many resources one has at one's disposal to be heard.
In the web era, the most important job a publisher has to ensure his site is a good one is to write and edit content in such a way that he attracts an audience of interesting readers, who will in turn offer interesting comments. It is not, as may have been the case earlier, to hire the "best" journalists to write their version of the truth. Sites that still use the latter approach rarely, if ever, reach the level of truly interesting, simply because no single writer, or small group of writers, will ever cover all the bases the way a whole community of interested and interesting commenters will.
I pointed this out in another comment, but I don't see why content and community have to coexist on the same site. Sites like Reddit and HN leverage the power of hyperlinking to merge the two models so each can do what they are best at: creating content for news sites and moderating and curating for social sites. I don't want to see what a bunch of wackos write in the comment section of my local newspaper. They added no value and the very few comments that did were not worth the overhead. HN has a community of interesting commentators, but it still relies on journalists to provide the basis for that community.