They do: https://gitlab.com/gitlab-org/gitlab-ce/. They maintain a GitHub mirror to allow for easier contribution. GitHub is more than just version control software; it's a community.
Off topic, but there are not always "two sides". There is a tendency to try to frame and reduce everything to binary questions... but the world is just not binary.
To put it another way, there may be two sides (or N sides), but some sides' perspectives are often more worthy than others. Not to Godwin the thread, but, yeah. Both sides' perspectives are not always equally valid. (I dutifully acknowledge that the offenses are not of the same magnitude, and all the caveats that go with referencing The Big G.)
Julie made two sets of allegations against GitHub. The first has to do with her personal dealings with the founder, his wife, and the HR department; the second with general harassment of women at GitHub. GitHub so colossally screwed up the former that they may have to concede the latter by default.
But the very core of XP was that it did have to be followed exactly: there were 12 specific practices you had to do. All of them were existing practices that people already used. It was precisely the "doing all 12 of them exactly as we say" that was XP. And it was total BS.
As a former Sun guy, I can say it's because extracting value wasn't something we were very good at, or really gave much weight to. From Grid to Java to Solaris 10 Zones, ZFS, Jini, and RFID, we mostly just made cool stuff and then... went and made other cool stuff.
To be honest, I think it's a timing thing: virtualization wasn't popular initially, but VMware did a great marketing job. Then any hypervisor became acceptable. Now VPS-style containers are becoming acceptable, i.e., Docker.
Being too early can kill you. If you think your idea is awesome but too early, my advice is to keep trying for as long as it takes. Docker was not my first attempt at solving this particular problem :) [1] [2] [3]
Within 15 minutes of setting up an HTTPS CI environment, complete with robots.txt, Googlebot was hitting the DNS name, which wasn't public, previously used, or easily guessed.
Google gets a lot of leeway from people. If you have done SEO, you will have learned that Googlebot doesn't always respect robots.txt. Requesting to de-index a page may take weeks or even months; the quickest way is to file a DMCA complaint against the link to your own site.
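For reference, the bluntest possible robots.txt, the one that asks every crawler to stay away from the whole site, is just two lines:

    User-agent: *
    Disallow: /

Even with that in place, Google treats robots.txt as a crawl directive rather than a removal mechanism: a disallowed URL that is linked from somewhere else can still show up in the index, which is part of why de-indexing requests drag on.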
Recently, they started tracking all downloads made in Chrome (for malware detection), including the filename, URL, IP, and timestamp. Sucks hard, since I love Chrome and the only way to disable it is to disable the website malware checker (which only uses partial hashes anyway).
Another possibility is that the hostnames leaked via the SSL certificate. I've seen evidence of spiders, including Google's, using this for discovery. Your best protection in that case is to use a wildcard certificate, if you want it to validate.
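To make the leak concrete, here's a rough sketch (Python, stdlib ssl plus the third-party cryptography package; the IP address is a placeholder) of how a scanner can pull hostnames straight out of whatever certificate a server presents:

    import ssl
    from cryptography import x509
    from cryptography.x509.oid import ExtensionOID

    # Grab the PEM certificate the server presents. No validation is
    # performed, so this works even against a bare IP found by scanning.
    pem = ssl.get_server_certificate(("203.0.113.10", 443))
    cert = x509.load_pem_x509_certificate(pem.encode())

    # The Subject Alternative Name extension lists every DNS name the
    # cert covers, including "private" hostnames nobody ever published.
    san = cert.extensions.get_extension_for_oid(
        ExtensionOID.SUBJECT_ALTERNATIVE_NAME)
    print(san.value.get_values_for_type(x509.DNSName))

With a wildcard certificate the SAN only exposes *.example.com, so the specific hostname never appears, which is why it's the better choice for names you want to keep quiet.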
Buy (or dig out of your closet) an old WiFi router and install Tomato on it. Its web interface lets you edit its hosts file, which then applies to every device connected to it.
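The entries themselves are standard hosts-file syntax; on Tomato they end up feeding the router's dnsmasq, so a line like this (hypothetical address and name) resolves on every device on the LAN:

    192.168.1.50    myproject.example.com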
Except many of us don't have Tomato/DD-WRT-capable devices. Not to mention I can't be bothered to add a third router to my current two, flash and secure it, and then switch my devices over to it just to test something, when this service does essentially the same thing.
Because it's a PITA to walk other people through adding entries to their hosts file when you want someone else to look at your project. I'm not telling you anything you don't already know, but this is why I would use this service over just making entries in my own hosts file.