> Why shouldn’t the web work with dialup or a dialup-like connection?
Because we have the capability to work beyond that capacity now in most cases. That's like asking "why shouldn't we allow horses on our highways?"
> Pretty much everything I consume online is plain text, even if it happens to be styled with images and fancy javascript.
No doubt, pretty much everyone who works on web apps for long enough understands that it's total madness. The cost, however, of supporting people so far behind that you can only serve them text is quite frankly unmanageable. The web has grown dramatically over the past 20 years, both in physical scale and in supported media types.
The web is becoming a platform delivery service for complex applications. Some people like to think of the web as just hypertext, where everything on it should be human-parseable. For me, as someone who came late to the game, it has never seemed that way. The web is where I go to do things: work, learn, consume, watch, play. It's a tool that allows me to access the interfaces I use in my daily life. I think there's a ton of value in this, perhaps more than in a platform for simply reading news and blogs.
I look forward to WebAssembly and other advancements that allow us to treat the web as we once treated desktop environments, at the expense of human readability. It doesn't mean we need to abandon older + simpler protocols, because they too serve a purpose. But to stop technological advancement in order to appease the lowest common denominator seems silly to me.
> Because we have the capability to work beyond that capacity now in most cases. That's like asking "why shouldn't we allow horses on our highways?"
Horses on highways would cause accidents. I have yet to see a fast-moving web page crash into a slow-moving one and shut down the router. Analogies work better when there is connective tissue between the concepts in play.
More generally, the vast bulk of the problem is not human readability or interactivity over http, but more a matter of insane amounts of unnecessary gunk being included in web pages because of faulty assumptions about the width of pipes.
Meanwhile, I find myself moving in the opposite direction. I find that many SaaS services' interests don't align with mine, so I'm going back to local applications. I don't trust others with most of my data, so the only service that sees much of it only sees encrypted blobs (for offsite backup). I've always run my own mail, and have slowly been expanding the services I host as I bring more of this stuff in-house. And so on. But I realize I'm in a minority.
But the nice thing is that it gives me an intranet and an "other" grouping that is very straightforward, so the browser instances that touch untrusted (not-mine) services can run in a "bastion" VM, locked down nicely and reset to a pristine state at will, not to mention allowing some stupid networking tricks that are sometimes useful.
> More generally, the vast bulk of the problem is not human readability or interactivity over http, but more a matter of insane amounts of unnecessary gunk being included in web pages because of faulty assumptions about the width of pipes.
Doesn't affect the vast majority of users.
> But I realize I'm in a minority.
Yes, your statements are pretty anecdotal and don't really relate to the vast majority of internet users.
I'm sure your setup works great for you, but it sounds like a ton of overhead, none of which is required if you have fast internet and don't give a shit about what's going on (like nearly everyone who uses the internet.)
> don't really relate to the vast majority of internet users.
It's not that they don't. It's that you don't care.
Because why should you care about something that doesn't meaningfully increase ad revenue or sales? Why should you care that the 2 extra seconds of page load on a fat pipe, and the fraction of a cent of extra electricity burned, multiplied across a million of your US users, add up to over 500 man-hours and a few kilograms of coal wasted? Not to mention the site being unreliable or unusable on trains, in rural areas, and in larger buildings where a user doesn't have Wi-Fi access.
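To put rough numbers on that (a back-of-the-envelope check, assuming a single page load per user):

    2 s/user × 1,000,000 users = 2,000,000 s ≈ 556 hours ≈ 23 man-days of cumulative waiting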
And the problem wouldn't be as big if it were just you. The problem is, everyone else thinks the same way, so all the waste mentioned above adds up. All because people can't be bothered to leave out useless gunk - even though it often takes more work to add it to a site than to refrain from using it in the first place.
And the funny thing is, it's even been shown that increasing speed increases usage and revenue/sales, so there isn't even that excuse. Slow pages break flow, which causes people to realize that they've already wasted too much time on your site and were supposed to have done xyz 15 minutes ago.
I don't think your claim that it doesn't affect the vast majority of users is correct. There are well over a billion people in India alone. You might argue that you were only making claims about US users, but a lot of sites have no reason not to be global.
If you've ever traveled internationally, you'll know that a lot of English-language sites become unbearably slow to use over mediocre hotel wifi, let alone cellular. Lightweight sites like HN become relatively MUCH more pleasant to use.
>That's like asking "why shouldn't we allow horses on our highways?" //
We do allow horses on our "highways" in the UK - not motorways, but other highways. It's a terrible analogy though, as you can't progressively enhance a road for a vehicle's capabilities the way you can a website.
>But to stop technological advancement in order to appease the lowest common denominator seems silly to me. //
For websites where simple text is appropriate, it seems perverse to me not to serve it, offering enhanced capabilities only when the web client can make use of them.
You don't have to stop advancing the technology - having radio broadcasts doesn't hold back VR/AR - but if everything you have to convey can be passed on in an audio stream, then purposefully designing a site to be hostile to clients that can only consume an audio stream is, to me, wrong. Sure, add an immersive environment where one can play on a VR beach whilst listening to your "24/7 wave noises", but don't make simple audio access impossible when the primary content requires nothing more.
In other words, don't require WebGL just so I can see your store's opening times.
Text articles are probably the most widespread type of content on the web. Most web sites are not web apps. But many developers want to re-construct web sites into web app architectures even when there's no benefit to the end user.
I posted the links below in a previous discussion about AMP. They are two examples of basic, JavaScript-free web pages with text content. There are 2,500+ words on these test pages, but the page weight is still much smaller than, for example, a Medium article with one tenth the word count (250 words).
Try loading them on your mobile on a 3G (or slower) connection. Do they load fast or slow?
Version B could probably be optimized here by not loading two very similar fonts.
You can also try loading the font locally first, to avoid the download if it's installed on the user's system.
Finally, unicode-range lets you avoid the download completely if that character isn't included on the page. Not a likely outcome on an English page, but a good practice regardless.
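For illustration, a minimal sketch combining both ideas; the family name and file path here are placeholders:

    @font-face {
      font-family: "Body";
      /* Try any locally installed copy first; download only as a fallback. */
      src: local("Source Sans Pro"), local("SourceSansPro-Regular"),
           url("/fonts/body.woff2") format("woff2");
      /* Download only if the page actually uses characters in this
         range (basic Latin plus Latin-1 supplement). */
      unicode-range: U+0000-00FF;
    }

If the visitor already has the face installed, or the page contains no characters in the declared range, the browser skips the download entirely.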
Webfonts are tough to optimize, but not impossible. Right now there are workarounds that use JavaScript to load the font in the background so it's non-blocking (e.g. loadCSS[1]), but that's not ideal when trying to keep overhead down. The situation should improve once font-display[2] becomes standardized.
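Once font-display lands, the non-blocking behaviour should reduce to a single declaration; a minimal sketch (family name and file path are placeholders, and the property isn't standardized yet at the time of writing):

    @font-face {
      font-family: "Body";
      src: url("/fonts/body.woff2") format("woff2");
      /* Render text in a fallback font immediately, then swap in the
         webfont when it finishes downloading, instead of blocking. */
      font-display: swap;
    }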
For what it's worth though, I find Version B looks much nicer.
Version B has two different font weights from the same family: Regular and Medium/Semi-bold. Version A relies on the fonts already installed on the user's computer.
Dropping the semi-bold font weight would save approx 23k, but having a regular and bold font weight felt like the minimal styles needed to support the page.
Dropping the header image would save 40k. (Note: the header image hasn't been optimised with something like the HTML srcset attribute, which can load different picture sizes for different devices.)
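For illustration, a sketch of what that could look like; the file names and widths are made up:

    <img src="header-800.jpg"
         srcset="header-400.jpg 400w,
                 header-800.jpg 800w,
                 header-1600.jpg 1600w"
         sizes="100vw"
         alt="Article header">

The browser picks the smallest file that still looks sharp for the device's viewport and pixel density, so a phone on 3G never downloads the 1600-pixel-wide version.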
> Because we have the capability to work beyond that capacity now in most cases
Except if you are in a rural area of a developed country, or in practically any undeveloped country. Most of Africa, Asia and South America have abysmal internet connections, and rural parts of Australia (even a few hours drive from a major city) are only able to get satellite internet.
Tech people often fail to realise how divided and limited access to the internet actually is once you leave 'tech hubs'.
Absolutely, but those are secondary markets for most American businesses, because there is an abundance of issues - political, cultural, financial, etc. - with delivering any application to those areas. The priority for most businesses and website owners is to serve the people you know you can serve first, then work on supporting those other areas.
Twitter, like most companies, exists to make money. If you're busy shaving every byte you can off your requests, you're spending a lot of money. You're also losing money, because I'd expect you wouldn't be serving ads and the like either.
Most blogs that exist to make money aren't targeting those without good internet either, so I don't really see the problem.
Rural areas are secondary markets for most American businesses?
This isn't just about Twitter. It's about making sure that if you are helping build a local Mom & Pop shop's online presence - which is where all business is nowadays - then you build it in a way their customers might care about.
There is a world outside of cities, and a lot of people live there. The tech world needs to wake up to that, because the people in places where Ubers don't go and GrubHub doesn't deliver exist (and vote) too.
They vote, yes, but do they buy? Do they significantly contribute to a company's economy (even potentially)?
If they do, companies will be happy to spend time and resources in serving lighter versions of their content. But if they don't, there's no reason, from the POV of a company, to employ resources in something that doesn't generate revenue.
> The web is becoming a platform delivery service for complex applications.
No, it's not. Yes, there are MMORPGs that run in the browser using WebGL.[1] But very, very few pages use all that capability. Most web pages today would work just fine in HTML 3.2.
And what is this thing with running over ten trackers on one page?
As soon as WebAssembly is stable and available, I'm predicting we'll see a dramatic shift away from HTML + JS as the target for most web apps. I don't mean blogs; I mean people who are trying to build websites that work like apps.
Most web pages today that would work fine in HTML 3.2 aren't built as massive JS web apps. Where they are, it may serve a purpose (better UI/UX for most of their users being a major one).
We see a gazillion web apps (and pages that are web apps but shouldn't be) because of those who are unwilling to learn anything but JS. Good luck dragging them to C/C++.
I don't think the point is that everything should be human parse-able. But most things on the web are not complex applications. Dan Luu isn't trying to use Google Maps over HSCSD, he's trying to read hypertext blogs and Twitter. Do you seriously think Twitter qualifies as a "complex application"?
For a long time, you had to load 2+ MB of data to see a 140-character tweet. Twitter actually fixed this recently: the tweet text is available in the title now, so it shows up very early in page load.
Not only does this make Twitter usable on dialup again (it was effectively unusable ever since they switched from a simple HTML page to a massive "application" that you have to re-download every time they deploy), but it lets you search through tweets you've read in your browser history.
The 140-character limit was INITIALLY because of SMS, but now it's just the standard format for the medium. The whole point of Twitter is the 140-char limit.
The concept behind Twitter? No. Twitter? Maybe..? They have a user experience they want to deliver to 9X% of their users, so they optimize for that. For most of Twitter's users, their connection speeds aren't the limiting factor in their experience. If you optimize for the lowest common denominator, that 9X% almost certainly gets a worse experience.
* There is no proof in your statement that currently 9X% of their users have a fast internet connection.
* Even so, it would simply be an artefact of them never bothering to optimize for people with slower connections. If they did optimize, a lot of their traffic would come from the so-called "lowest common denominator", just as the blog post says it did for Google.
* Finally, I find the term "lowest common denominator" misrepresentative, because it implies that optimization has to cater to the slowest connection on earth, which is clearly not the case. If the average speed is higher than that of 90% of internet connections (as per the Akamai report cited in the blog post), then the distribution of internet speeds is clearly skewed, and there's value in catering to even the median, let alone the minimum, speed.
It's not always a tradeoff. Several sites stopped serving CSS during the Super Bowl (https://twitter.com/jensimmons/status/828415747625992192). At a smaller scale, the same thing happens to some sites that reach the HN front page.
It depends on exactly what the costs are. But in that example, those sites are spending millions on Super Bowl ads, and it's probably their highest-traffic day of the year, so it's not "almost no benefit".
> Because we have the capability to work beyond that capacity now in most cases. That's like asking "why shouldn't we allow horses on our highways?"
Adding to your analogy, the JS bloat mentioned in the article is like driving a semi-truck carrying only one carton of oranges. It's a lot of extra waste for a very slight benefit.
While I agree with others here that dial-up-friendly sites don't "collide" with complex web-apps (thus rendering this argument somewhat void), I sadly think the underlying problem is much more mundane:
Why are word processors not orders of magnitude faster than two decades ago? Same reason. No one wants to pay the additional cost of achieving that.
Practically speaking, this means that either Random Local Newspaper Inc. knows its potential online reader base exactly and has deemed the additional effort not worth it, or it has no idea of the potential customers who would flock to its site if it did put in the effort, and thus doesn't miss them. Add on top of that the fact that much of the internet is based on (perhaps only assumed) prestige (or the loss of it if your page doesn't have the most modern features A through Z, or looks like it's from the '90s), or is extremely short-lived (a newspaper doesn't care about yesterday's news), and this theory pretty much explains it all.
Cynical, I know, but hey... :-/
Edit: Also, we can have nice things. No one with a 56k connection will honestly try to watch Netflix. A much more interesting question would be: how could we create incentives for big corporations to optimize their pages for low-bandwidth connections? After all, much of the web is also built on ad deals - which almost certainly don't target those people either.
> Why are word processors not orders of magnitude faster than two decades ago?
Because even in the 1990s, word processors weren't orders of magnitude slower than the person sitting at the keyboard, which is the ultimate limiting factor on word processing speed.
Which is, essentially, the same explanation. I would very much like a fast, Vim-style modern word processor (the new cursor behavior in Word drives me insane btw). But I'm not the main target. I alone wouldn't pay the bills.
The same goes for almost any website: people on 56k don't consume many digital goods, and returns from ads are most likely almost non-existent (especially if the ads themselves are huge). That easily explains a huge part of the web.
Thus, either connectivity around the globe has to be improved (that's the route Facebook seems to have chosen, albeit with debatable conditions for its new "customers"), or we have to create other incentives for anyone hosting something on the web to attract people with low-bandwidth connections.
> The cost however, in supporting people so far behind as to only be able to serve them text is quite frankly unmanageable.
No, no it's not, it's really not. You're already writing your SPAs with a REST backend, right? Well, guess what: static HTML & REST go together like burgers & beer! All you need to do is add an HTML renderer to your REST backend, and you have _something_ people can use to interact with your service.
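A minimal sketch of that idea, as an Express-style Node handler; the route and db.getArticle are hypothetical stand-ins for whatever your backend already does:

    // Same data source the SPA uses; just one extra representation.
    app.get('/articles/:id', async (req, res) => {
      const article = await db.getArticle(req.params.id);
      if (req.accepts('html')) {
        // Plain, dependency-free HTML for any client, down to text browsers.
        res.send(`<article><h1>${article.title}</h1><p>${article.body}</p></article>`);
      } else {
        res.json(article); // The JSON your SPA already consumes.
      }
    });

Content negotiation on the Accept header means the same URL serves both audiences, and text-only clients get usable HTML without loading a single script.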
> The web is becoming a platform delivery service for complex applications. Some people like to think of the web as just hyper text, and everything on it should be human parse-able. For me, as someone who has come late to the game, it has never seemed that way. The web is where I go to do things: work, learn, consume, watch, play. It's a tool that allows me to access the interfaces I use in my daily life.
True fact: learning, consuming & watching are all well-supported by static HTML.
> But to stop technological advancement in order to appease the lowest common denominator seems silly to me.
Part of the problem is that we're not advancing technologically: we're getting bogged down in the La Brean morass which is modern web development. HTML is a terrible but acceptable markup language; CSS is an ugly but somewhat acceptable styling language; the DOM is as hideous as a Gorgon; JavaScript is approximately the worst language ever; the combination of all the above is grotesque, and an ongoing indictment of our entire industry.
A cross-platform application-distribution standard sounds pretty awesome, but that's not what web pages are supposed to be, and it's not what web browsers should deliver. The web is a web of hyperlinked documents. It's right there in its name. And anyone who demands that his readers enable a somewhat cross-platform, highly insecure, privacy-devouring application-distribution tool in order to read his text is welcome to take a long walk off of a short pier.
HTML is certainly not a 'terrible but acceptable markup language'. Used properly, it's a _fine_ markup language. CSS is brilliant, at least in its original incarnations: it's bloated beyond repair now.
The web used to be a web of hyperlinked documents. This has not been the case for a long time now. Webapps have evolved to enable widespread communication, collaboration, gaming, social media, and so much more.
It's the single largest open platform, available from nearly any device in the world. It's a little more than a document viewer.
'Web apps' are still irrelevant to the mainstream web user. Unless you think that Facebook counts as a 'web app'. Facebook is a great example of a website that very much feels like a website: it's full of hyperlinks.
>Because we have the capability to work beyond that capacity now in most cases. That's like asking "why shouldn't we allow horses on our highways?"
More like 'why shouldn't we allow people on our streets?' Which is a good question. It's a question we answered. With a yes.
>I look forward to WebAssembly and other advancements that allow us to treat the web as we once treated desktop environments, at the expense of human readability.
If you want to make a desktop programme, make a desktop programme. Don't ruin the web.
At my job, I work on a web app. It runs only on Android devices through a web view in a native app. It could have been a native app, and would have felt more natural on an Android device, been faster, and probably more secure too.
It's not just about plain text vs. rich content. It's about doing things at least reasonably efficiently. I commented elsewhere about Gmail taking 500MB in Chrome; I just checked Mail.app and it started at 200MB. I wonder what the energy budget of all these bloated websites and their parsing is across millions of computers. Server-side computation allows economies of scale and might have saved CPU cycles for the actual work, but I'm sure many times more than that has been wasted by a bloated web.
I don't disagree that the web and web applications provide a lot of convenience, and I don't want to lose that. But I don't think for a moment that things can't be significantly more efficient.
> Why shouldn’t the web work with dialup or a dialup-like connection?
> Because we have the capability to work beyond that capacity now in most cases. That's like asking "why shouldn't we allow horses on our highways?"
As a counterpoint, I have a 1G FTTH connection and 8GB of RAM, but only a dual-core 1.4 GHz Haswell (Celeron 2955U) and an iffy SSD, so I get a terrible web experience if I have more than one tab open.
I work at a company whose revenue primarily comes from advertisements; it feels unethical to run an adblocker while ads are paying my salary. (It would also blind me to the experience most users have, even setting aside my good connectivity and RAM.)
I have to applaud you for the second point. Given many of the pages I see nowadays just googling for things like Vim keybindings (sadly, I'm not allowed to install an ad blocker on some systems), I've started to seriously doubt that anyone maintaining those sites has seen them through a user's eyes.