
I don't think the point is that everything should be human parse-able. But most things on the web are not complex applications. Dan Luu isn't trying to use Google Maps over HSCSD, he's trying to read hypertext blogs and Twitter. Do you seriously think Twitter qualifies as a "complex application"?


For a long time, you had to load 2+ MB of data to see a 140-character tweet. Twitter actually fixed this recently; the tweet text is now included in the page title, so it's available very early in the page load.

Not only does this make Twitter usable on dialup again (it was effectively unusable ever since they switched from a simple HTML page to a massive "application" that you have to re-download every time they deploy), but it also lets you search through tweets you've read via your browser history.

Getting this stuff right is not rocket surgery.
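To make that concrete: once the tweet text is in the <title> element, a client can get it from the first few kilobytes of HTML without running any JavaScript. A minimal sketch in Python (the URL and the regex-based parsing are placeholders for illustration, not Twitter's actual markup or API):

    # Sketch: read only the <title> from the first chunk of a page.
    # Assumes the tweet text is embedded in the title, as described above;
    # the URL below is a placeholder, not a real tweet.
    import re
    import urllib.request

    def title_from_page(url, max_bytes=16384):
        # Fetch just the beginning of the document; the <title> sits in
        # <head>, so we never need the multi-megabyte application bundle.
        with urllib.request.urlopen(url) as resp:
            head = resp.read(max_bytes).decode("utf-8", errors="replace")
        match = re.search(r"<title>(.*?)</title>", head, re.S | re.I)
        return match.group(1).strip() if match else None

    print(title_from_page("https://example.com/some-tweet"))

That same title string is what the browser stores in history, which is what makes the tweets searchable there.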


I love that they are keen on maintaining compatibility with archaic SMS, but not so much with slower connections.


1. SMS isn't archaic.

2. 140-char limit isn't about SMS.


1. 1992 is archaic. 2. Officially it was; now it's just stupid and unnecessary.


SMS isn't archaic any more than writing is archaic. They're both not brand new, and they're both widely used today.

VHS is archaic.


The 140-character limit was INITIALLY because of SMS, but now it's just the standard format for the medium. The whole point of Twitter is the 140-char limit.


The concept behind Twitter? No. Twitter? Maybe..? They have a user experience they want to deliver to 9X% of their users, so they optimize for that. For most of Twitter's users, their connection speeds aren't the limiting factor in their experience. If you optimize for the lowest common denominator, that 9X% almost certainly gets a worse experience.


* There is no evidence in your statement that 9X% of their users currently have fast internet connections.

* Even so, it would simply be an artefact of them never bothering to optimize for people with slower connections. If they did optimize, a lot of their traffic would come from the so-called "lowest common denominator", just as the blog post says it did for Google.

* Finally, I find the term "lowest common denominator" misleading because it implies that optimization has to cater to the slowest connection on earth, which is clearly not the case. If the average speed is higher than 90% of internet connections (per the Akamai report cited in the blog post), then the distribution of connection speeds is heavily skewed, and there's value in catering to at least the median speed, if not the minimum, as the toy example below shows.
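To see why a mean above the 90th percentile implies skew, here's a toy illustration (the numbers are made up, not from the Akamai report): if 90% of connections are 2 Mbps and 10% are 100 Mbps, the mean is 11.8 Mbps, faster than 90% of connections, even though the typical (median) user sees 2 Mbps.

    # Toy illustration of a skewed speed distribution (made-up numbers,
    # not Akamai data): the mean can sit above 90% of connections.
    import statistics

    speeds = [2] * 90 + [100] * 10  # Mbps for 100 hypothetical connections

    mean = statistics.mean(speeds)        # 11.8 Mbps
    median = statistics.median(speeds)    # 2 Mbps
    share_below_mean = sum(s < mean for s in speeds) / len(speeds)

    print(f"mean={mean} Mbps, median={median} Mbps, "
          f"{share_below_mean:.0%} of connections are below the mean")

Optimizing for "average" users in a distribution like that still leaves most real users on the slow side of it.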


It's not always a tradeoff. Several sites stopped serving CSS during the Super Bowl (https://twitter.com/jensimmons/status/828415747625992192). At a smaller scale, the same thing happens to some sites that reach the HN front page.


So a couple sites go down on a single day of the year because they were unprepared and suddenly it's a bad idea to depend on CSS?

Seems like costly optimization with almost no benefit to me.


It depends on exactly what the costs are. But in that example, those sites are spending millions on Super Bowl ads, and it's probably their highest-traffic day of the year, so it's not "almost no benefit".


So what's up with that character limit?



