Hacker News | britta's comments

For anyone at GitHub looking at this thread: please update your documentation page about how to report abuse (https://docs.github.com/en/communities/maintaining-your-safe...). I tried to follow the instructions, but I ran into a bunch of dead ends that slowed me down - I couldn't find the report abuse buttons for issues, comments, or repositories, only for the user profile page. I'm on Chrome on a Mac laptop, logged into GitHub.

Also, on the report abuse page that I got to from the user profile page, the green submit button is nearly hidden by the grey footer, even when I scroll the page around and complete the captcha.


The team is a mix of employees and contractors. They also offer customers (government agencies who use their service) the option to use Bing results or their in-house Elasticsearch results: https://search.gov/admin-center/content/content-overview.htm...

They do good work, and it’s an important service. I believe it saves a ton of money for the federal government by reducing reinvention of the wheel. As a former federal employee and current federal contractor, it’s been very helpful to be able to use their no-cost-to-customer search services on multiple projects. On my current project we eventually shifted to doing our own search (using Postgres full text search) so we could customize the indexing and ranking, but Search.gov was a useful interim solution.


It's good to see. I wish gov would make more in-house solutions instead of contracting it out.

At my department that's all we do: farm it out to 365 or Beltway defense companies.


It's not too late! Grab a ticket and come on in: https://www.eventbrite.com/e/roguelike-celebration-2021-tick...


The Forest Service worked with 18F on this Open Forest project to enable people to buy Christmas tree permits online, which uses United States Web Design System components: https://openforest.fs.usda.gov/christmas-trees/forests

You can learn more about that project here, including the open source code: https://18f.gsa.gov/what-we-deliver/forest-service/ - and a few blog posts about it: https://18f.gsa.gov/tags/forest-service/

Both that Forest Service project and Federalist run on cloud.gov, which is a Platform as a Service operated by 18F.

(I work for 18F but don't officially represent it here.)


https://openforest.fs.usda.gov/christmas-trees/forests has a pretty heavy dependency on Javascript (read: there's no fallback to anything non-javascript, so I get a white page). It feels like that goes against some of the things that 18F is about.


Pages don't need a fallback to non-Javascript in 2019. Accessibility technologies now fully support a JS enabled web, with accessibility standards following suit.

That leaves only those who have voluntarily disabled JavaScript (<1% of users), but fortunately those users are typically aware of how to resolve the issue of their own creation.

I've worked on public facing government websites (not 18F). We simply don't support this edge case, and our legal department supports our legal right to do so (in particular in relation to ADA requirements).


I wholeheartedly disagree. This is a public-facing government website. It ought to degrade gracefully in order to reach the broadest possible audience. Or, write a decent site to start with, and you wouldn't have to worry about degrading. The entire page is nothing but a shell for a web app written in JavaScript that doesn't need to be a web app written entirely in JavaScript.


> It ought to degrade gracefully in order to reach the broadest possible audience.

You mean under 1% of users that intentionally broke their browser?

Seems unreasonable to dedicate resources to that when they could be better spent on the other 99% of our users. Why should the 1% get special treatment? And what other parts of their browser can they disable that we'd need to support - perhaps no CSS? Maybe IE5? Maybe they only render XHTML? Etc.


> You mean under 1% of users that intentionally broke their browser?

I'd say: configured their browser to work like a browser instead of like a platform to run arbitrary code from the Internet.

Ideally we should be able to trust most of the web sites we visit. The last few years have shown us this is a bad idea. Here are my top two reasons:

- security: while I'm personally less concerned about reasonable ads, there are a number of problems with ad technology, like infectious ads and creepy tracking.

- a bigger problem for now IMO: poorly written web apps that make the machine noticeably slower.


There is still one important scenario that benefits from supporting a fallback to "classic" HTML for web sites (and apps): bandwidth-constrained environments. Gmail's plain HTML version is an existing example (a link to it shows up if the JS version takes too long to load).
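That escape-hatch pattern is simple to implement. Here's a minimal sketch (all names here - `BOOT_DEADLINE_MS`, `window.appBooted`, `basic-version-link` - are hypothetical, not Gmail's actual internals):

```javascript
// If the JS app hasn't finished booting within a deadline,
// reveal a link to a plain-HTML version of the page.

const BOOT_DEADLINE_MS = 5000;

// Pure decision function, kept separate from the DOM so it's easy to test:
// offer the plain-HTML link only if the app still hasn't booted by the deadline.
function shouldOfferBasicVersion(appBooted, elapsedMs, deadlineMs = BOOT_DEADLINE_MS) {
  return !appBooted && elapsedMs >= deadlineMs;
}

// Browser wiring (commented out so the sketch runs anywhere; assumes the
// app sets window.appBooted = true when ready, and the page contains a
// hidden <a id="basic-version-link"> pointing at the plain-HTML version):
//
// setTimeout(() => {
//   if (shouldOfferBasicVersion(window.appBooted === true, BOOT_DEADLINE_MS)) {
//     document.getElementById('basic-version-link').hidden = false;
//   }
// }, BOOT_DEADLINE_MS);
```

The nice property is that the fallback costs nothing for users whose JS loads fine, while slow or constrained connections still get a way in.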

Anyway, for public facing sites and apps, you may already be doing most of the necessary work for SEO purposes. Letting humans access the version that you're showing to search engine spiders shouldn't be a huge burden.


Ideally, yes. Realistically, it’s not worth the cost.


> You mean under 1% of users that intentionally broke their browser?

No, for the people who can't afford to upgrade to the latest technology.



> Pages don't need a fallback to non-Javascript in 2019.

Strongly disagree. Having your page completely break instead of degrading gracefully puts up a barrier to those who cannot run JavaScript (for example, users with older, weaker computers).


People on "older, weaker computers" are likely running a browser we also don't support (<IE10) on an Operating System we don't support, and they'll likely see a TLS error before even hitting the load balancer (we don't support SSL or TLS 1.0).

It is unlikely that there exists a subset of users with a modern enough computer to even hit our web servers that is under-powered to the point of not handling JavaScript. Our analytics definitely don't show this.


> It is unlikely that there exists a subset of users with a modern enough computer to even hit our web servers that is under-powered to the point of not handling JavaScript. Our analytics definitely don't show this.

Do your analytics correctly register hits from clients that don't support JavaScript? (I'm thinking about survivorship bias.)


Thank you for the response. I wasn't trying to suggest that it was a requirement, simply that it seemed odd that there would be no fallback whatsoever. I guess I'm behind the times.


No, you're not. The times are just circling back around to form over function again.

Since we keep revisiting this, you might say that you're ahead of the times.


> Pages don't need a fallback to non-Javascript in 2019. Accessibility technologies now fully support a JS enabled web, with accessibility standards following suit.

I think we should broaden the definition of accessibility to ensure websites aren't unnecessarily annoying or invasive for normal users either ;-)

(And yes, making web applications is part of my job.)


> Pages don't need a fallback to non-Javascript in 2019. Accessibility technologies now fully support a JS enabled web, with accessibility standards following suit.

This is wrong, of course, because it requires that people buy expensive hardware to use the latest accessibility technology, which is not generally available. It's like demanding that people buy electric-powered wheelchairs instead of making your building accessible to normal wheelchairs.

> We simply don't support this edge case, and our legal department supports our legal right to do so (in particular in relation to ADA requirements).

And I'm sure that ignoring poor people enables you to sleep very well at night.


As one of "those who have voluntarily disabled JavaScript" this comment is pretty disappointing.


If you disable a tool which you know is used to build many websites nowadays then you really should expect a deprecated experience. You're in the minority and honestly I don't know why you should expect web developers to cater for you.


Because things like Blogger and Google Groups shouldn't require JS just to view text. There is a large, large, enormously large body of work out there that doesn't need JavaScript to achieve its goal, like showing text or images, but is more-or-less broken without JavaScript. That's the tragedy. I don't expect a web app, such as Google Maps or Google Docs, to work without JS, but I should be able to read blogs and newsgroups.


Why don't you want to run JS, on a site you trust, that has no ads?


That situation does not normally come up, because I stop trusting that site. If a page that just displays documents requires javascript to do so, then clearly the developers were not worthy of my trust.

There are legitimate uses of javascript, but I'm not going to enable it just to look at a document.


Which site to trust? MDN once hung their own browser (granted, it can display itself without JavaScript).


Because it's slow?


First, as a person who makes web sites and apps, I believe it's unprofessional to fail to account for users who can't or won't run JS. JS abuse is rampant. I choose not to let sites run JS by default because I don't trust most sites not to have some sort of compromise or malware.

Second, it's fine with me if some site wants to forgo my patronage, or provide "a deprecated experience", but not the government.

I certainly do expect my federal, state and local government to follow best practices and provide working websites that I can use without running JS.

I can't think of a single function of government that requires JS, thank God.


> I certainly do expect my federal, state and local government to follow best practices and provide working websites that I can use without running JS.

Best practices is a moving target. What made sense in 1999 doesn't make sense in 2019. The web simply requires JS, CSS, and HTML today. If you disable any one you aren't compatible.

There's no actual argument for why websites should spend significantly to support a tiny subset of users that intentionally break compatibility for ideological reasons. It is unfair to our other >99% of users who we'd have more time for.

The old arguments such as accessibility aren't correct any longer: accessibility devices specifically support JavaScript (text contrast, HTML organization/order, aria tags, video subtitles, etc remain highly important).

If you really insist on a JavaScript free world you are of course welcome to visit a government office in person, pick up, and mail back a paper form. The website is merely a convenience we offer to you.

Otherwise you'll need an IE 10 or newer browser, on an Operating System that supports TLS 1.1 (Windows Vista or newer), JavaScript, CSS, and HTML.


> Best practices is a moving target. What made sense in 1999 doesn't make sense in 2019.

Sure, but it still makes sense to use JS sparingly. Running untrusted remote code in your browser is a huge nest of attack vectors. I don't think that, in 2019, running JS from the open internet willy-nilly can be described as "best practices", despite the prevalence of it. We're not there yet. If three hundred million people jump off a bridge I'm still not going to do it too.

> The web simply requires JS, CSS, and HTML today. If you disable any one you aren't compatible.

That's, like, your opinion, man.

You're trying to insist that your concept of the Internet is the concept of the Internet. It's a self-fulfilling prophecy. But it's not quite true yet, eh?

> There's no actual argument for why websites should spend significantly to support [non-JS users]

Right, they shouldn't spend more because the tech they use should provide for non-JS users out-of-the box without additional overhead. If devs have chosen NOT to use tech like that then they are at fault, not the user, eh?

> for ideological reasons.

What about for security reasons?

> It is unfair to our other >99% of users who we'd have more time for.

But the reason you have to "spend significantly" to do the right thing is that you chose to use and deploy crappy JS frameworks, not that some people refuse to run your crappy frameworks. This is classic "blame the user".

Now, this is your prerogative if you're doing your own site/app, but the government doesn't get to exclude some people from service just because they don't run JS. Speaking as a techno-elitist, that's techno-elitist BS.

> If you really insist on a JavaScript free world you are of course welcome to visit a government office in person, pick up, and mail back a paper form.

AH-whaaaa? Rather than fallback to plain HTML+CSS you're content to let the user fallback to hard copies and physically transporting their meat-puppet? To save costs? On web development? Where's the sense in that?

> The website is merely a convenience we offer to you.

Well, no. It's an INconvenience you offer me. If you're offering convenience to most people but deliberately excluding some that seems to me to go against the egalitarian spirit of our American government, no? "Unfair"?

> Otherwise you'll need an IE 10 or newer browser, on an Operating System that supports TLS 1.1 (Windows Vista or newer), JavaScript, CSS, and HTML.

I run Dillo. A government website that doesn't look decent and work right when accessed with the Dillo browser is just broken and sad in 2019.


>First, as a person who makes web sites and apps, I believe it's unprofessional to fail to account for users who can't or won't run JS. JS abuse is rampant.

If you do web for a living, you should also know that fallbacks and graceful degradation and server-side rendering all come with a cost. Both monetarily and in terms of complexity.


I would bet that the vast majority of government websites are nothing more than simple forms or informational pages.

Yes, I'll grant you that if you've designed an interactive, multipage form then it is costly to rebuild it to degrade. However, I'd argue that the form was probably unnecessarily technically complex, and that starting with a degraded option in the first place isn't additionally costly (and better protects the agency from ADA lawsuits).

In the very rare cases that we're talking about a true web app (i.e. Google Docs, or I'll even grant you GIS (mapping) visualizations, even if it's possible to degrade those), then yes, decisions need to be made on what minimal technical requirements are required. E.g. a government agency offering an application that only works in Chrome would be a non-starter.


While I agree with you that most if not all sites should offer some sort of no-JavaScript fallback, the development resources required to offer such a thing are just generally impractical when it's such a small minority of users. Government or not, they still have to choose between spending their limited departmental resources on an extremely small minority, or the greater userbase as a whole.


> the development resources required to offer such a thing are just generally impractical

Yeah, iSnow made the same point, but it doesn't scan. "Boss, I can make it work w/o JS but it will cost more..." Huh? Doesn't that sound just like what an unprofessional developer would say?

The fact that so many popular JS frameworks don't do the right thing is part of the JS abuse in my opinion (same goes for accessibility.) Lazy developers wrote half-assed frameworks and other lazy developers chose to use them and then people start to believe that adding JS somehow makes it hard or expensive to do without JS when really they are just doing it wrong in the first place.

> when it's such a small minority of users.

The population of the US is just under 330M, so if, say, 0.5% can't or won't run JS to interact with taxpayer-funded government services, that's about 1.65M people. Those folks (of whom I am one) should not be disenfranchised, so to speak, because the gov hired unprofessional developers. The government shouldn't do that, and they certainly shouldn't try to tell me that I'm some out-of-date digital neanderthal for caring enough about web insecurity to disable JS, eh? Lousy devs (as demonstrated by the fact that they can't provide a non-JS web experience/fallbacks at an affordable rate) are precisely the ones whose JS code I have no wish to run, and I certainly don't want my tax dollars going to pay them to screw me out of access to the service also paid for by my tax dollars.

> choose between spending their limited departmental resources on an extremely small minority, or the greater userbase as a whole.

Or they could use tech that works for everybody automatically for the same cost, eh?


By that same argument, we don't need to worry about accessibility.


Disabled users don't get a choice, that's their lives. People who voluntarily disable a core browser component do. Apples and oranges. You should (and legally need to) worry about accessibility.


When I can't access the page content I read the page source. Disabled users can do the same. There are other workarounds like graphic captcha solvers. There's nothing unsolvable about accessibility, it's just annoying.


It's kind of absurd that just using the web the way it was designed is now considered "catering to you".


Browsers don’t have JavaScript disabled by default. You’re going out of your way to disable it. The modern web expects JavaScript to be enabled.


Some do, and you're also going out of your way to require javascript from your site that just displays documents.

The "modern web" is dramatically worse (slower, more compute intensive, less consistent, less secure) than the web used to be. This is mostly because of the tendency to force javascript into places where it was never necessary.


As another user disabling js: Not entirely unexpected.


I disagree.

At this point, there is no reason to support <IE11-style JS, but JavaScript-less browsers are still A Thing and always will be. Search engines, Opera Mini, various accessibility tools, people with extremely low bandwidth, and weirdos who choose to disable JS are all factors.

Yes, if you're making a webapp, there's no good way to do it without JS and you can just be upfront about that. Also, the number of people who can't use ES6+ is going to zero over time, so it's fine to just write ES6. But you should also have a no-JS fallback version of an information page (anything that's not a webapp) because that's an important use case that won't drop to zero over time.
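The usual way to get that fallback cheaply is progressive enhancement: ship a plain HTML form that works with a normal submit, and let JS only *upgrade* it. A minimal sketch (the `/search` endpoint, element ids, and `renderResults` are hypothetical):

```javascript
// Shared by both paths: build the query string the server expects.
// The no-JS path gets the same string from the browser's native form submit.
function buildSearchQuery(params) {
  return new URLSearchParams(params).toString();
}

// Enhancement layer (commented out so the sketch runs anywhere). If this
// script never loads, the <form id="search-form" action="/search" method="get">
// still works as ordinary HTML:
//
// document.getElementById('search-form').addEventListener('submit', (e) => {
//   e.preventDefault();
//   const qs = buildSearchQuery({ q: e.target.elements.q.value });
//   fetch('/search?' + qs, { headers: { Accept: 'application/json' } })
//     .then((r) => r.json())
//     .then(renderResults); // hypothetical client-side renderer
// });
```

Because the server already handles the plain GET for search engines and the form's native submit, the no-JS case is not an extra feature to build - it's the baseline the JS layer sits on top of.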


Working without JS is still a core requirement of GDS sites; cf. https://www.gov.uk/service-manual/technology/using-progressi...

and for more as to why: https://gds.blog.gov.uk/2013/10/21/how-many-people-are-missi...


In 2013.

They're now building sites to meet WCAG 2.1 which does support the usage of JavaScript for accessible users.


> https://openforest.fs.usda.gov/christmas-trees/forests has a pretty heavy dependency on Javascript

And it doesn't even work with Javascript.


You need to move to DC to work for USDS (https://www.usds.gov/join#relocation), but you can work for 18F from anywhere in the US (https://18f.gsa.gov/join/). The login.gov team includes both USDS and 18F team members (as the blog post says).

Also the salary cap is $161,900, as shown in this pay grade table for SF: https://www.opm.gov/policy-data-oversight/pay-leave/salaries... Not a top-of-the-line private sector salary, but not some kind of hardship.


160k is a run-of-the-mill senior engineer salary... The best engineers in the world can earn 300k or more in Silicon Valley. It takes an altruistic engineer to give up half his salary to work for the public.


I work at 18F but am speaking about my experience here and experiences heard from coworkers, not officially on behalf of 18F.

Applying to 18F didn't require a drug test for me. The application process asks this question in the background check part (details at "security clearance" at the bottom of https://pages.18f.gov/joining-18f/how-to-apply/ for anyone curious), but answering yes to the "have you ever done federally-illegal drugs" question is not an automatic disqualification. It can be a disqualification depending on the details of the drug use.


Yes, if a company receiving this letter isn't quite sure what level of detail the NHTSA needs, they could make a polite call to the explicitly-listed "call us if you have questions" person and ask about the expected scope, to make sure they're on the right track before they submit the final documents. Federal regulatory agencies are made of people.


Aw, thanks for posting this and being excited about it! This is a labor of love organized by my friend Noah with help from a few other friends and me, just for fun since we love playing roguelikes.

We did sell out of tickets; our very graciously donated venue (thanks to Eventbrite) has an attendee limit since it's basically an office rather than a large venue. I hope you all will watch the streams! The talks will also be recorded so you can watch them later.


Thank you for providing streams! This just happens to be the weekend I leave the Bay Area so otherwise I'd be pretty bummed about missing it.


That's just the schedule page - sorry this is confusing! The intro page (https://roguelike.club/) has the date - tomorrow (Saturday), September 17.


You might find this chart interesting for thinking about the uses of discussion outside of courts: https://twitter.com/scriptjunkie1/status/739545814150746113


Except the sentencing as "rapist, sociopath, plagiarist" seems to already have taken place.

I'm not saying there shouldn't be any discussion, but this wasn't just discussion. It was more the presenting of a sentence, and then using the weight of the accusations to stifle discussion rather than engage with it. Even to the point of saying it won't go to court, because of what a vindictive person he is, or because the legal system is so bad that you really should believe the victims, and that's that. And of course, it's also his fault that there is no substantiation of the plagiarism claims, they're just a bonus anyway, something you're supposed to take on board with the main course. He's done, that's the "takeaway", and any discussion beyond that was hardly invited, much less responded to in a reasonable manner from what I saw.

It may be "more likely than not" that Appelbaum is guilty of some or all of these things, but it's pretty much before all our eyes what has been done with it, and what shoddy arguments have been used to excuse it. Personally, I wouldn't want anything to do with anyone who doesn't mind such a virtual lynch mob and the sophistry that went along with it. How's that for warning your social circles? Everybody throws a pebble, nobody feels like they stoned someone to death. After all, it was "more likely than not". I find that equally repulsive as what Appelbaum is accused of.

Once you're fine with holding that weapon, and your friends wielding it as well, once you are fine with a mob because you're in it and think it'll win and nobody will ask questions, that weapon will end up deciding for you who gets shot. And it takes your own humanity, too, bit by bit. It's not a new thing, it's not the first time I saw it, and though it didn't quite catch on as much as the people who tried so very hard probably wanted to, I never saw so many smart people go along with something so obviously wrong as this.

And all that in context of accusing him of that, of destroying people by using others? It would be funny if it wasn't so serious. Even if he does do that, that doesn't make it okay. Abusing an abuser while saying you're anti-abuse and pro-safe-space is a farce.

This is what is at stake, that is, what this has already broken to shit, and what many now are too proud or too immature to acknowledge, much less make any moves to fix:

> So stop it. You are creating a world where no one will ever dare to move out of their front door and take up a life of activism, because they can not know if Jake is a sociopath, or a person who is being framed.

( https://news.ycombinator.com/item?id=11851888 )

edit: and even if he is a sociopath, even if you have to warn your friends and circles and so on, that doesn't excuse whatever this was and is.

