It seems like a combo: Zelda BotW is part of it, but the hardware is also fun. The build quality and the kit are fantastic; it reminds me of an iPad designed by a gamer, for gaming.
In my case it's also nostalgia. I've been working on my own apps and games, not to mention a day job, so this is the first console I have owned since the N64. Never got into console titles that seemed to be just as good on the PC.
That's 3 reasons right there... I ended up buying the Special Edition ($100 gets you the game, a music CD, a hard traveling case and a collector's coin) in March before I found a Switch in April.
I prefer the Pro Controller and using it as a standard console, but the mobility is great if needed. Sticker shock hasn't been a factor, since I remember buying 3 extra controllers for the N64.
The hunt for the Switch was as much fun as the hunt for the Classic was. Of course, now the Classic is going for 4-5 times as much... more than the Switch!
(Note: skip the Classic and skip RetroPie. You need a Mac or PC anyway to load up the SD card... better to spend $60 on a nice XBone controller and play games in a better emulator on your computer. Just my 2 cents.)
It seems like a different set of tradeoffs. RetroPie (if you're running on a Raspberry Pi, and not on a PC) will have fewer game systems that it can support, of course. But it's also cheap enough to just buy one as a dedicated game system that you leave connected to a TV, so the kids aren't grabbing your laptop to play retro games, or something. For a little more money, you could set up an NUC or something, of course.
> You need a mac or PC anyway to load up the SD card
A lot of kits come with NOOBS loaded on them. You can load RetroPie directly into Raspbian by using a setup script that they provide. You could also boot into Raspbian via NOOBS and use a USB adapter to flash a microSD with another OS.
You could even take a blank SD, extract NOOBS onto it with your phone, and boot the Pi off that (NOOBS and Berryboot just need to be unzipped onto the SD; they don't require images to be flashed).
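If you go the setup-script route, it looks roughly like the following. This is a sketch based on the RetroPie project's manual-install instructions: it assumes a Pi already running Raspbian with network access, and the repo URL/script name are the upstream defaults at the time of writing.

```shell
# On the Pi itself, already booted into Raspbian:
sudo apt-get update
sudo apt-get install -y git

# Fetch the official RetroPie setup script and run it.
git clone --depth=1 https://github.com/RetroPie/RetroPie-Setup.git
cd RetroPie-Setup
sudo ./retropie_setup.sh   # launches an interactive installer menu
```

The script presents a menu where a basic install builds the emulators on the Pi itself, so no separate Mac/PC is needed for that part.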
I bought a pretty cheap HDMI extension cable that snakes from one end of my living room (work desk) to the other (TV + consoles), and it would argue otherwise!
I have had severe chronic pain for many years as well. I use opioids, but only for very short durations; it is a huge pain to get off them once you've taken them for a while. What works best for me is a max dosage of medical marijuana (CBD and indica, not sativa) plus synthetic opioids like tramadol, not used daily. The current political climate is a no-no for all opiates, which is why they're so hard to get now. That's awful for people with lots of pain, but doctors really did prescribe them too much.
I've never understood this tactic. Specifying a range seems like a good way to underprice yourself. If you give a range, the recruiter will simply assume that your current salary is very close to the bottom of the range.
Now, if you mean that you give a range of previous salaries, the recruiter will know your best previous salary anyway. And they'll probably knock a fair bit off the top assuming you're padding.
> I prefer to give a range than specify individually.
FWIW, this is likely almost as bad for you in the negotiation as giving the number. You will simply always get offers towards the bottom of your range. It is better to refuse to answer, and just say you're looking for a market rate salary for this role.
Giving the number doesn't have to be bad for you. (But it can be.)
But yeah, never give any range. Not when talking about previous compensation, and not when talking about what you want. Just give them straight up numbers, if you give anything at all.
And you should definitely give them a number for how much you want to be paid. You want to set the first reference point. And yes, that means you will have to do your own due diligence about how much the market will currently bear for your (interviewing) skills. But you need to know these numbers anyway.
But it's not a dilemma. There's nothing wrong with saying "I don't disclose my past salary. It's personal. Let's focus on this job, and if you make me an offer, we can discuss that."
Some ethics test. The employer is already failing at ethics by asking this in the first place, so I have no problem lying to them. This is no different than them asking if you have kids, if you're married, etc.
Exactly. But the issue here is that you are dealing with two different mindsets. On one hand, you have the paragraph above that 'stands alone'.
But that has no bearing on the truth. It omits key facts while remaining technically true, and creates a social situation like 'reacting to a non-existent threat'.
So to me and you, ridiculous paragraphs like those above are In Error, because they are Unfinished and Missing Info. But others point to the paragraph above as some kind of 'sole entity' that exists on its own.
No one should care about a paragraph that omits the critical facts needed to portray the reality of the situation. But apparently, on the Internet, some do.
Chuck that incomplete paragraph in the trash and try again, that's my response to their nonsensical verbal antics.
I imagine the new GPU will support the same Metal API. Since this is probably something Apple has been working on for a while, I'd assume Metal was designed with this new GPU architecture in mind. How much of that tooling would need to be updated for a new GPU, assuming the current API was specifically written for it?
But even if the API is the same, it doesn't mean it will have the same performance? Something that was not a problem with previous GPUs could become a bottleneck and vice-versa? And since you need to support both GPUs or else you drop older iPhone support, it might get annoying.
I don't really know if that would be the case though, I don't know anything about GPUs.
>But even if the API is the same, it doesn't mean it will have the same performance? Something that was not a problem with previous GPUs could become a bottleneck and vice-versa? And since you need to support both GPUs or else you drop older iPhone support, it might get annoying.
If "in certain cases, the performance profile can be annoyingly different" is the outcome of a tectonic shift in the underlying hardware, it would be a significant and praiseworthy achievement.
But this is Apple, one of the rare "experienced" companies that can pull this off well (not perfectly, but well enough). Apple has done two (or three?) complete CPU architecture changes, and developers mostly just had to recompile their apps. Remember fat binaries from the PPC→Intel switchover?
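For anyone who missed that era: a fat (universal) binary simply packages compiled slices for multiple architectures in one file, and the loader picks the right one at launch. On a Mac you can still poke at this with Apple's `lipo` tool; the commands below are illustrative, and the input filenames are hypothetical.

```shell
# Show which architecture slices a binary contains (macOS only).
lipo -info /Applications/Safari.app/Contents/MacOS/Safari

# Glue two single-architecture builds into one universal binary.
# 'myapp_x86' and 'myapp_ppc' are hypothetical example files.
lipo -create myapp_x86 myapp_ppc -output myapp_universal
```

That packaging trick is why "just recompile" was enough for most developers: one build per architecture, one shipped artifact.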
I suspect as long as the graphics tools use Metal, it will be a quick transition.