> The price in USDT and USDT-only exchanges could indeed rocket, but the price in real actual USD would be expected to fall.
Wait, would it?
If you were an arb wouldn't you:
1) Buy bitcoin on a different exchange (with a temporarily lower price)
2) Transfer that bitcoin to, say, bitfinex (or another USDT exchange)
3) Sell that bitcoin for, say, ETH
4) Transfer that ETH back to the first exchange
5) Profit
Though I guess that would also flood the market and tank the ETH-BTC price (which, I guess, yet another arb could then take advantage of).
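Purely as a sketch, here's the loop above with made-up prices and a flat per-trade fee (every number here is an assumption for illustration, not real market data):

```python
def round_trip_profit(usd_per_btc_a, btc_per_eth_usdt_exch, usd_per_eth_a,
                      start_usd=15_000.0, fee=0.002):
    """Walk steps 1-5 above: USD -> BTC -> ETH -> USD, losing `fee` per trade."""
    btc = start_usd / usd_per_btc_a * (1 - fee)      # 1) buy BTC on exchange A
    eth = btc / btc_per_eth_usdt_exch * (1 - fee)    # 3) sell BTC for ETH on the USDT exchange
    usd = eth * usd_per_eth_a * (1 - fee)            # 4-5) sell the ETH back on exchange A
    return usd - start_usd

# If ETH is relatively cheap in BTC terms on the USDT exchange, the loop profits;
# at a "fair" cross rate, the fees eat the trade.
print(round_trip_profit(15_000.0, 0.05, 800.0))
print(round_trip_profit(15_000.0, 800 / 15_000, 800.0))
```

The point being: the loop only pays while the cross rates on the two venues disagree by more than the round-trip fees, which is exactly the discrepancy that gets competed away.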
Or are you talking about exchanges that only do USDT-BTC?
> One of the root causes of New York's government dysfunction (though not the only one) is that New Yorkers reliably vote in the democrat candidate regardless of pretty much any other consideration.
Er, what? That's not really true. Bloomberg was an independent and his predecessor was Giuliani.
Over the last 40 years, the mayor has been a Democrat less than half the time (18 years across Koch, Dinkins, and now de Blasio).
We're talking about New York State, not New York City.
GP isn't quite right that New York is a single-party state, but he is right that most districts aren't at all competitive. There are certain districts that are deemed Democratic territory and others that are Republican territory, and the two parties have gentleman's agreements with each other not to compete in each other's turf.
> We're talking about New York State, not New York City.
That's pretty hard to infer given the context, since the MTA is controlled by both the city and the state.
> but he is right that most districts aren't at all competitive
This is also true of say, Georgia. You can see for example that no one even ran against Buddy Carter in 2016 [1]. Most districts, period, aren't really that competitive.
> That's pretty hard to infer given the context, since the MTA is controlled by both the city and the state.
Not really - the MTA is a private entity which receives the bulk of its funding from the state, and the state government is (ostensibly) responsible for holding it accountable.
> This is also true of say, Georgia. You can see for example that no one even ran against Buddy Carter in 2016 [1]. Most districts, period, aren't really that competitive.
New York is a special case in the degree to which both parties collude to keep districts non-competitive for general elections, how they ensure that even primary elections are non-competitive[0], and the number of state laws that they have passed in order to shield this power from being checked by voters at any step.
I don't really want to get too into the details here, because it's tangential to the original topic, but it's been discussed on HN before.
> the MTA is a private entity which receives the bulk of its funding from the state
That's not really true. The biggest source of funding for the MTA is fares, followed closely by dedicated taxes (i.e. there would be no reason for the taxes to exist were it not specifically for the MTA). A majority of those taxes are levied specifically on people who live in the MTA's core area (basically counties surrounding and including the city) [1].
The state actually dips into these dedicated taxes to cover other budget shortfalls.
You make it sound as though the state covers for the MTA out of the general tax pool, which is such a minor part of the MTA's budget (largely subsidies) as to not be meaningful.
> the state government is (ostensibly) responsible for holding it accountable.
No, the board is. The governor appoints 6 board members, the city 4, and the other 7 are delegates from various counties throughout the state.
> New York is a special case
Special case according to whom? Georgia rated 3rd-to-last on Ballotpedia's "competitiveness index" for the 2016 election cycle [2]. If your argument is that:
1) New York overtly colludes to remain uncompetitive
2) Georgia doesn't
3) Georgia still manages to hold less competitive elections than New York
then I think a logical conclusion is one of:
1) Georgia legislators collude but less overtly
2) There's less transparency about the collusion in Georgia
3) Collusion clearly isn't as large a factor in uncompetitive elections as other factors
Having lived in both states here's my perspective:
NY has long been a NYC/everything-else split. As home prices rose and blue-collar workers got priced out of LI, it too is lumping in with NYC as a Dem bloc. Between the two, they far outvote the rest of the state in statewide elections.
Upstate is mainly Repub with a few blue areas.
GA was Dem for years until the mid-90's. Even the Dems were fairly conservative (remember Zell Miller's "My party left me" comments), and since taking control of statewide offices and everything other than ATL and Macon proper, the Republicans have gerrymandered districts to the point that there's no way the Democrats could take back either legislative body.
TL;DR: NY has been like this a long time. Albany has played the game for centuries. GA is just learning.
That's not how state government works though. Representatives are not businesses that can steal customers. They each have interests they wish to serve. If people are electing the people they want and those people are doing what they are elected to do then democracy is working. How have we outsourced accountability any more than we intended to in the first place with representative democracy?
> If people are electing the people they want and those people are doing what they are elected to do then democracy is working.
New York election law has been engineered to the point where it's actually impossible for voters to have any influence on the outcome. (That's why New York perennially has the lowest election turnout in the country).
Even our primary elections are essentially coronations for the candidates that the parties themselves hand-pick[0], with no real way for voters to override that choice. That's just one law, and if it were only that one, it might not be such a problem, but it's part of a carefully-constructed system that leaves New York residents with truly no control over our government.
The cost of media services isn't mostly infrastructure; it's licensing. Every time a song is played, the service has to pay a certain amount to the rights holders. So they need to figure out the average listening habits across all their listeners, then price their subscriptions so that they can cover the licensing fees and still make a profit.
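A back-of-the-envelope version of that pricing exercise. The per-stream royalty, listening volume, and margin are all illustrative assumptions, not real licensing terms:

```python
royalty_per_stream = 0.007       # USD to rights holders per play (assumed)
avg_streams_per_month = 1_000    # average plays per subscriber (assumed)
other_costs_per_user = 1.50      # infrastructure, support, etc. (assumed)
target_margin = 0.10             # desired margin on revenue (assumed)

licensing_cost = royalty_per_stream * avg_streams_per_month   # $7.00/user
break_even = licensing_cost + other_costs_per_user            # $8.50/user
price = break_even / (1 - target_margin)                      # ~ $9.44/month
print(f"monthly subscription price: ${price:.2f}")
```

With numbers like these, licensing alone is ~75% of the subscription price, which is the point: the marginal cost scales with listening, not with servers.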
I think you can retroactively pay for COBRA, yeah?
You can enroll after an event and it'll be as though you had insurance the whole time. You'll have to pay for the retroactive coverage, but it's a pretty smooth way (assuming nothing happens) to avoid paying for COBRA while still technically "having coverage".
Robinhood seems to be pulling in a lot of different directions and there's all these "early access" features that never seem to actually, well, launch.
I've now signed up for:
* Web access (otherwise it's mobile-only)
* Options trading
* And now, cryptocurrency trading (because why not)
and I haven't gotten access to any of them yet, even though Robinhood for web has been in "early access" for quite a while now.
So I signed up for this, but I have very little faith that I'll get access any time soon (even with the vague promise of early February).
I remember being in a startup that promised features before they could deliver, and how it was a symptom of shoddy business practices in general. So when I consider the fact that I have a decent amount of money tied up in Robinhood, and I see them promising features without delivering, I get uneasy.
Otherwise it's a great product. I'm really only writing this in the hope that a Robinhood person catches this comment and takes it to heart.
I'm really confused about why web access hasn't launched yet. You'd think that would be the easiest of them all and entirely within their control. Options and crypto seem like much more complicated endeavors, but a website, ffs.
People will not spend more money trading on a website; not having one is just an inconvenience. Options and crypto are more places for people to put their money.
I blame Apple. No, really. When Robinhood first launched it was intentionally targeted at iOS users, hence the almost automatic focus on building an app instead of a mobile-friendly progressive web app. Then came the push to target Android users - but since "we do apps" is already the corporate mindset, the next thing is not a web page, but yet another app. Eventually they get around to the web.
A company I left a few years ago went down the same path.
> If you being late to your job causes others to start doing the same
You're missing the point entirely: Who cares if they're also late?
Short of them being late to something important like a meeting, it matters far less than any of the thousand other things you should focus on as a manager.
Unless you work in something with external time pressures (e.g. I used to work in equities and US market hours dictated our need to be available) there's no reason it should matter whether someone shows up at 9 or 9:45. If you're really worried about people missing each other, set core hours (11-3 say) where everyone's expected to be available.
Disclaimer: I know relatively little about differences between processor architectures, so this might be totally wrong because Intel might just have them over a barrel on this.
I think the major difference between this and, say, the Equifax blowup, is that Intel's institutional clients are affected by this.
I'm not sure what they're thinking internally, but it stands to reason that they're probably a bit upset at least: Their CapEx just went up to maintain the same level of computing power. I'd be surprised if internally Google is buying the "AMD is just as affected" line that Intel's been throwing out.
So, I wouldn't be surprised if they're at least evaluating AMD.
Or, again, maybe Intel just totally has them over a barrel and transitioning isn't feasible at all. It certainly doesn't paint a great picture of Intel's future if AMD does catch up, though.
I think the big institutional clients are always evaluating AMD, and even more exotic options. But I don't think this bug has materially affected the equation here. Savvy buyers understand that there will occasionally be bugs, and this one has already been mitigated. How Intel verbally responds to the issue matters much less than their actions, because the words do not affect the costs of using their products, and the actions do.
I am deeply skeptical of the commentary on Intel's attitude and press releases. I really doubt that matters much to most buyers.
First, ARM is doing to Intel what Intel did to the Unix workstation vendors in the 80’s.
Second, given that they’re being cornered into the server business, they need to have products that are rock solid there until they can regroup. This is one of a long parade of recent screwups with their big bets in this space:
(1) A while back, all their server Atom chips (tons of crypto and I/O with piles of ECC DRAM and cores for < $1000 and < 20W) had a bug where they stopped booting after 18 months of uptime. These compete exactly in the space where server ARM has a chance, so many affected vendors were already dual-sourcing.
(2) NVIDIA crushes them for AI, and Intel is a distant third for graphics in general
(3) Samsung SSDs generally trounce Intel ones.
(4) They’re rapidly losing client device share. Their big recent innovation there is AMT, which is increasingly considered an anti-feature.
That leaves conventional IT compute (web services, DBMS, etc.) for their core business, but even on-prem stuff is moving to private cloud, which needs multi-tenancy, and they're looking pretty risky for that use case too (vs AMD?)
They’ll certainly be around for a long time, but it’s not clear how long they’ll keep their “no one gets fired for buying IBM”-level of dominance.
This is true only for the consumer SSD market, where Intel outsources large portions of the product development. It's also a market that Intel may abandon completely in the next few years as Intel and Micron start to pursue separate flash memory development. If Intel doesn't score a solid win with a consumer SSD in the next two generations, it would be reasonable for them to pull out and focus solely on enterprise SSDs, where they have no trouble winning.
Correct me if I'm wrong: It's been mitigated by applying a patch that has fairly severe performance implications, no? How does this not affect the institutional clients' bottom line in that case?
There's the 15-20% hit for database workloads. Most servers out there do nothing but CRUD to some DB somewhere, because most businesses don't do HPC. It's definitely an issue for them.
Then there are companies like Epic Games who reported horrific numbers. It seems if you do lots of simple communications (eg websockets or UDP), you can expect a huge slowdown.
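To put the CapEx point in numbers, a quick sketch. The fleet size and slowdown figure are illustrative, not measured:

```python
import math

def extra_servers_needed(current_servers, slowdown):
    """Servers needed to restore prior throughput after a fractional slowdown.

    If each box now does (1 - slowdown) of its old work, restoring total
    throughput takes current / (1 - slowdown) boxes.
    """
    total = math.ceil(current_servers / (1 - slowdown))
    return total - current_servers

# A 20% hit on a 1000-server fleet means 250 additional machines
# just to stand still.
print(extra_servers_needed(1_000, 0.20))
```

Note the nonlinearity: a 20% slowdown needs 25% more hardware, and a 50% slowdown would need 100% more, which is why the workload-dependent numbers matter so much here.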
>Or, again, maybe Intel just totally has them over a barrel and transitioning isn't feasible at all.
That was my hypothesis for why their stock didn't drop much. The problem is very bad but intel's quasi-monopoly and the very high switching costs involved will let them weather it.
Not only will they probably weather it (short term), it will probably result in a reasonable sales bump as affected datacenters replace capacity lost to the various patches and mitigations and also perhaps accelerate replacement schedules when hardware that has designed-in mitigations becomes available.
>I'm not sure what they're thinking internally, but it stands to reason that they're probably a bit upset at least: Their CapEx just went up to maintain the same level of computing power. I'd be surprised if internally Google is buying the "AMD is just as affected" line that Intel's been throwing out.
They don't have to buy it. Whether affected or not, AMD is a non-starter at this moment for those things.
Semi-related question: Why is that? I've been looking for resources on why AMD is a non-starter, but most searches just turn up comparisons for low-level consumers.
AFAIK, it’s the ecosystem and availability. Let’s say you’re a Dell shop and you’re using mostly PowerEdge R730s (one of their middle-of-the-road 2-socket options). Your choice right now is Xeon E5-2600 v4 family, because that’s what that platform supports. These aren’t motherboards that you just pick up at Fry’s, they’re designed as an entire system, which takes a while to get going. Dell is not going to invest the R&D money unless they have demand, but there won’t be much demand without stories of success in the field, so we’re at a bit of a chicken and egg problem right now.
> shares I had to opt in to, at the cost of some salary.
I think this is a fair way to approach the situation.
> People on this thread are making comparisons to 300k/yr Google salaries. ... But for employers, it's a pretty silly comparison.
Why though? You're competing with them, whether you like it or not. AmaGooFaceAppleSoft hired 45% (including those going on to grad school, etc) of my graduating class, and I didn't go to Stanford. No joke, it was really 45%. These companies are vacuuming up everyone, and some of them are shockingly easy to get an offer from.
Sure, they didn't get paid 300k/yr to start. But with how much the equity has appreciated since then, it's probably not that far off.
So that leaves you with the remaining 25% who didn't get hired/didn't want to work at one of those companies/didn't go to grad school, and you're competing with every other startup for them - including the larger, more established ones who can reasonably say they won't disappear tomorrow (e.g. AirBnB).
No, you're not competing with them. AmaGooBookSoft hires, to a first approximation, only pedigreed developers with their AKC papers in hand. They're competing for a limited supply of those developers.
But, in fact, a mutt developer you rescue from the dev pound catches a ball just as well, and may in fact be more healthy than the purebreds that Google is hiring.
You know this has to be true, because outside of AmaGooBookSoft (actually: pretty much just GooBook), nobody makes those wages anyways. Even accounting for the (wildly optimistic) company projections for equity value.
> AmaGooBookSoft hires, to a first approximation, only pedigreed developers with their AKC papers in hand.
I think that is a very outdated view of those companies. IME at least, you've described Google's hiring process pretty accurately but are fairly incorrect about the rest.
In particular, Amazon is hiring like crazy [1]. Even back when I graduated, Amazon was known for being a place that was easy to get into if you were willing to stomach the sometimes-insane workload (which still pales in comparison to some startup sweatshop horror stories).
Again, I didn't graduate from a program with much name recognition and AmaGooAppleFaceSoft still snatched up 45% of my graduating class.
> (actually: pretty much just GooBook)
I'm not sure why you're only including those two.
Amazon share prices have quintupled over the last 5 years. If you were compensated with RSUs you've done very well for yourself.
I know several MS software engineers who were (very recently) paid roughly $150K + $100K in equity each of their first four years (admittedly after attaining their Master's, though I'm not sure how much it helped).
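Rough math on what yearly grants like that do when the stock compounds the way Amazon's did. The grant size and 5x-in-5-years growth come from the comments above; vesting schedules and taxes are ignored for simplicity:

```python
# Assumed: four $100K/year grants, stock appreciating 5x over 5 years,
# i.e. 5 ** (1/5), about 1.38x per year compounded.
annual_growth = 5 ** (1 / 5)
grants = [100_000] * 4

# Each grant rides the stock for the years remaining after it's issued.
total = sum(g * annual_growth ** (4 - i) for i, g in enumerate(grants))
print(f"value of $400K in grants after 4 years: ${total:,.0f}")
```

Under those assumptions, $400K of grant-date equity is worth well north of $900K by year four, which is why the "silly comparison" framing doesn't hold up for candidates doing the math.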
Sorry, but I have first hand experience on multiple occasions in the bay area that say otherwise. I'll take my anecdata and disprove your anecdata.
As a long-time HN'er myself, I really appreciate the comments you add to discussions, but telling people "you are not X" will make it difficult to get your point across. Not to mention, you don't even live/work here. How are you so comfortable postulating on something you don't have direct experience with?
I'm being imprecise. Obviously, a lot of SFBA tech startups recruiting developers to implement solutions on extremely well-trodden technical ground are, in fact, trying to compete with Google for the same pool of developers. But that's irrational, and they should stop.
Oh, come on. This kind of silly overgeneralization is more than a little bit insulting to me, my coworkers, and a heck of a lot of other people.
I could have gone to the "big four" a long time ago, and I turned them down. I've never regretted my decision. I've done incredibly fulfilling work here and couldn't imagine working with a more talented group.
Hire internationally. A good mobile developer from a 2nd-world country (say, Czech Republic, Ukraine, or Russia) goes for $20-30/hour on Upwork. You do pay in communication costs, but their development chops are usually as good or better than devs coming out of top American universities. And the supply is certainly much deeper than "top American college graduates living in the Bay Area" and probably a fair bit deeper than "all American programmers". I know multiple startups - some completely bootstrapped, some venture-funded - where the majority of the development team is abroad.
AppAmaGooFaceSoft hires internationally too, of course, but they tend to go for graduates of top Chinese, Indian, and Taiwanese universities, and less for the self-taught kid who has a computer, an analytical mindset, and a lot of free time.
Hell, why go that far? I work in Pennsylvania and we're not typically making much more (if any more) than $20-30/hr, and we'd love a number larger than that.
Ehhh. A few interesting thoughts, though not necessarily original; the "your equity is worth 0" mantra has been repeated enough that it's not ground-breaking.
It maybe makes sense for huge, publicly traded companies like Apple (who could easily afford to just pay their employees enough to offset the equity loss and then some), but of course there's the whole idea that equity compensation aligns incentives for employees and the business.
> “You people in tech are crazy. I pay my employees handsomely in cash and I keep all of the equity for myself.”
This would be catastrophic for the startup industry:
* Why would I ever work for a company that has huge downsides (chance of failure, lack of resources, etc.) when I don't get to enjoy any of the potential upside?
* How many startups can afford to pay their employees "handsomely" (relative to what they could be earning elsewhere)?
The only way I imagine a 0-equity world working is one where VCs cough up a ton more money to compensate startup employees handsomely. And to be fair to Fred, maybe that's what he's suggesting (spending more money now to retain more equity later). But I didn't see that stated anywhere.
I don't understand one point you raised. What downside risk is there to the employee if a company fails? The only risk is the cost of finding a new job which is offset by latitude in experience gained.
Why would you work for money instead of equity at a high risk venture? Because paying in equity pushes the risk onto the employee. Paying in cash takes the risk out. You are paid in full up front for your work. You'd take the job because it is paying you.
Startups pay in equity because they don't have cash.
Since then, the lottery-ticket aspect has taken grip of the labor market. However, I find that the people who view options as lottery tickets are subpar. Trend followers, mostly. Those chasing Klondike gold.
> What downside risk is there to the employee if a company fails?
Sorry to be snippy, but: Come on man, do I really need to explain this one? Sudden loss of employment is incredibly disruptive at best, and for many it's a significant financial hardship.
> You'd take the job because it is paying you.
So is Amazon. And they're offering Amazon equity, which is killing it recently. So, again, why would I take a job at a high risk venture if there's no potential lottery ticket?
> Startups pay in equity because they don't have cash.
Yeah, exactly. Startups can't afford to compete with Amazon on salary.
> Those chasing Klondike gold.
Come on, as opposed to most startup founders? Anyone who has taken even a seed round is chasing Klondike gold as well.
>Sudden loss of employment is incredibly disruptive at best, and for many it's a significant financial hardship.
It's a standard meme here that if you quit/are fired/the company shuts down, you send a few emails and walk into a new job the following Monday. Good for you if that's your experience, but it's just not the norm for most people in most roles.
> Sorry to be snippy, but: Come on man, do I really need to explain this one? Sudden loss of employment is incredibly disruptive at best, and for many it's a significant financial hardship.
If you're getting paid handsomely in cash, you should be able to weather that interruption.
> So is Amazon. And they're offering Amazon equity, which is killing it recently. So, again, why would I take a job at a high risk venture if there's no potential lottery ticket?
Amazon equity is publicly traded; you can buy as much of it as you want with any salary.
> If you're getting paid handsomely in cash, you should be able to weather that interruption.
Two glaring issues:
1) What people should do and what they actually do are two different things. People should eat sensible portions of healthful food. And yet, we have an obesity epidemic.
Saying people (even among those with high salaries) should be able to weather an interruption isn't very helpful when in practice many can't [1].
2) Which startups can afford to pay people handsomely in cash? (early) Startups offer equity to employees in large part because they can't afford to compete on salary.
> Amazon equity is publicly traded; you can buy as much of it as you want with any salary.
It's on top of Amazon's fairly generous salaries. The comparison was between being given Amazon equity, which is obviously worth something, vs. a private company's equity which is almost always worthless.
Which non-unicorn startup is offering $250K+ a year in salary and bonus for a plain old software engineer?