You have it reversed. If it is already true that being more symmetrical, imposing and pleasing is advantageous for negotiations, then the future in which we have the tools to counter that bias is getting less sad.
You can make a similar argument that the wide availability of soaps and deodorants reinforces biases about how people should smell. While that's true, the fact that anyone can easily match those biases reduces the opportunity to discriminate and makes those biases less of an issue.
You can make that argument, yes. I can deconstruct it by arguing that not everyone can easily match those biases - I bet there are still plenty of places on Earth where soap is surprisingly difficult to come by.
That side issue does not have much to do with the video issues under discussion, though.
I shudder at the thought of a world in which always-on lies about your appearance become standard practice.
Have physical reality and truth really become irrelevant?
They have been for quite some time: cosmetic surgery.
I remember watching a TV show years ago about a woman who had had lots of plastic surgery, found the man of her dreams, and was now pregnant. She was afraid the baby would come out looking totally different from her, because she had physically changed herself so much, and that her husband would leave her over the deceit.
At the same time, with cosmetic surgery, you are changing physical reality. Someone has sliced up your body and reconfigured it to be more what you want.
So it's not necessarily a "lie", as such.
Like I said, though, I see what you're getting at, and it's a valid point.
Since this has gotten to the point where we need to be precise with the language: they are not tools for eliminating the existence of bias. They are tools for countering the effect of bias in particular instances.
I think you missed your parent's point. Let's examine a related case: fashion models. The industry has been manipulating images for years to make models skinnier and whiter, remove blemishes, etc. In your language, these are tools for countering the effect of bias for the models. The impact that has had on our society is well studied: biases have been disastrously reinforced.
Ha! People who wear power suits, makeup, expensive haircuts and shoes are already doing all this. Technology just means you don't have to spend time in the stylist's chair to accomplish it any more!
And what percentage of the population do you believe that is true for? Remember, we're talking about people interviewing in general here, not the small percentage of software devs who work 100% remotely on contract.
Not to mention you'll just come off as a huge weirdo as soon as you're found out.
It's not a straw man when your case applies to a tiny fraction of the human population and mine applies to the rest. Context matters; I was responding to this:
>Ha! People who wear power suits, makeup, expensive haircuts and shoes are already doing all this. Technology just means you don't have to spend time in the stylist's chair to accomplish it any more!
I wondered the same thing, then I did a quick Google search, and it turns out they're relatively well known in Mexico.
They also have a web page: www.biyubi.com
Maybe "underground" was not the best description, but it is the case that much of their software remains unpublished, and so people still debate whether it really exists or not. I think it probably does, because 40 years of a family building extremely optimized software could produce a talent like Óscar Toledo G., while 40 years of running a con job would not.
"The coming war" definitely paints their view of the future as cynical, but as someone who works in robotics for military use it's hard to argue with it...
I'm using Fennec F-Droid and can confirm this bug also affected me, and I really don't know if it can be solved without an update (which does not seem to be available yet)
I once toyed with the idea of a machine-oriented spoken conlang similar to Lisp in "syntax" by reserving certain syllables for punctuation (brackets, etc.)
One of the main problems of existing solutions is that they lack flexibility due to the poor state of natural language understanding, which this approach solves by pretty much removing natural language from the equation. However, it would be unusable for most people, or at least the learning curve would be too steep for it to be practical.
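For concreteness, here is a minimal sketch of the idea in Python. The reserved syllables ("pa" opens a bracket, "po" closes one) and the example utterance are entirely made up:

```python
# Hypothetical reserved syllables: "pa" opens a list, "po" closes one.
# Every other syllable is treated as an atom (a symbol in the conlang).

def parse_spoken(syllables):
    """Parse a stream of spoken syllables into a nested list (an s-expression)."""
    stack = [[]]
    for s in syllables:
        if s == "pa":                 # spoken "open bracket"
            stack.append([])
        elif s == "po":               # spoken "close bracket"
            finished = stack.pop()
            stack[-1].append(finished)
        else:                         # any other syllable is an atom
            stack[-1].append(s)
    return stack[0]

# "pa sum pa mul two three po four po" ~ (sum (mul two three) four)
print(parse_spoken("pa sum pa mul two three po four po".split()))
# -> [['sum', ['mul', 'two', 'three'], 'four']]
```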
I don't really care about Fitbit collecting all that data. If you don't want yours to be used, just buy another tracker from the tons out there. Privacy is all about choice, so it's not as bad as, e.g., Google using your search data, where the room for choice is smaller (at least for me, having tried out DuckDuckGo and others and switched back). Also, I'd rather have my data harvested for science and medicine than for selling advertisements or whatnot
> If you don't want yours to be used, just buy another tracker from the tons out there.
I've already done this. But that doesn't mean I can't call out potential privacy violations in other products.
> Also I'd rather have my data harvested for science and medicine than for selling advertisements or whatnot
Can you prove this? Even if Fitbit isn't doing this now, what if they're acquired tomorrow by a company with more relaxed morals on what is permissible to do with health data?
There are almost no trackers that don't force customers to create an online account and then upload the data. I looked at every product I could find last year, and the only exception was Apple.
And all the "science and medicine" done by Fitbit according to this article has resulted only in confirming things that were already known. I was actually curious what insights they had discovered, and there are none.
"Fitbit’s data confirms a lot of what cardiologists already know. But because the Fitbit data set is ridiculously huge, it unearthed some surprises, too."
From there the article goes on to list a number of new insights and potential research areas to pursue.
I did; none of the "surprising things" seem like they could result in any medically actionable info. They're more like medical curiosities, while the truly actionable info was already known.
iCloud sync can be turned off completely on any iPhone. Anyway, I believe the health data is encrypted with some device-specific secret, but I'm not 100% sure.
Privacy is also about "herd immunity" - that geotagged photo posted from a party at my place affects my privacy, as does your Fitbit tracking data if purchased cell tower data can be correlated to the point of assuming we went hiking together...
Not trying to argue, just pointing out that it's not quite so simple.
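To make that correlation point concrete, here is a toy sketch in Python. The traces, thresholds, and the photo-vs-tracker scenario are all invented for illustration; real de-anonymization attacks are far more sophisticated:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def colocated(trace_a, trace_b, max_minutes=10, max_km=0.5):
    """Yield point pairs from two (minute, lat, lon) traces that fall
    within the given time window and distance of each other."""
    for ta, *pa in trace_a:
        for tb, *pb in trace_b:
            if abs(ta - tb) <= max_minutes and haversine_km(pa, pb) <= max_km:
                yield (ta, tuple(pa)), (tb, tuple(pb))

# Hypothetical data: the geotag of my party photo vs. your tracker's GPS log.
photos = [(600, 46.558, 8.561)]                        # (minute of day, lat, lon)
workouts = [(595, 46.557, 8.560), (900, 47.100, 9.200)]
print(list(colocated(photos, workouts)))  # one match: we were probably together
```

Neither dataset is sensitive on its own, but joining them places two specific people in the same spot at the same time.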