Hacker News | new | past | comments | ask | show | jobs | submit | Ultramax's comments

It's not a joke, some people are like that.


Is this the autistic spectrum of the internet expressing itself? Or just kids messing around?


[flagged]


That is a pretty unkind thing to attribute to autism.


When it comes to autism in memes, it's pretty obvious. But I understand why honest perception bothers people.

Try spending some time in alt-right meme communities, on 4chan, etc, and form an opinion for yourself.


I think of a blender. Eventually something is going to be found that breaks up pesky combos.

Long molecule chains in plastics seem like they would be easy. I guess it takes the right blender to chop it up!

I also imagine a small caterpillar-sized blender involved. Now I need a smoothie.


As someone who has had to deal with Assembly in the 80s for development projects, Java sounds and works great!

I guess it depends on your experience, attitude and point of view.


Better than assembly in the 80s is not a high bar.


It seems like a combo: Zelda BotW is part of it, but the hardware is also fun. The build quality and kit are fantastic; it reminds me of an iPad designed by a gamer, for gaming.

In my case it's also nostalgia. I've been working on my own apps and games, not to mention a day job, so this is the first console I have owned since the N64. Never got into console titles that seemed to be just as good on the PC.

That's 3 reasons right there... I ended up buying the Special Edition ($100 gets you the game, a music CD, a hard traveling case and a collector's coin) in March before I found a Switch in April.

I prefer the Pro Controller and using it as a standard console, but the mobility is great if needed. Sticker shock hasn't been a factor, since I remember buying 3 extra controllers for the N64.

The hunt for the Switch is just as hard as the hunt for the Classic was. Of course, now the Classic is going for 4-5 times as much... more than the Switch!

(Note: skip the Classic and skip RetroPie. You need a mac or PC anyway to load up the SD card... better to spend $60 on a nice XBone controller and play games in a better emulator on your computer. Just my 2 cents.)


> Note: skip the Classic and skip RetroPie

It seems like a different set of tradeoffs. RetroPie (if you're running on a Raspberry Pi, and not on a PC) will have fewer game systems that it can support, of course. But it's also cheap enough to just buy one as a dedicated game system that you leave connected to a TV, so the kids aren't grabbing your laptop to play retro games, or something. For a little more money, you could set up an NUC or something, of course.

> You need a mac or PC anyway to load up the SD card

A lot of kits come with NOOBS loaded on them. You can load RetroPie directly into Raspbian by using a setup script that they provide. You could also boot into Raspbian via NOOBS and use a USB adapter to flash a microSD with another OS.

You could even take a blank SD, extract NOOBS onto it with your phone, and boot the Pi off that (NOOBS and Berryboot just need to be unzipped onto the SD; they don't require images to be flashed).


Why skip the RetroPie? It seems like a great nostalgia machine for <$100 with 2 good-quality Buffalo controllers.


The sticker shock on the non-Pro controllers is much worse because, without the charging grip, charging 4 controllers is a massive PITA.

That's really my biggest complaint with the unit, the difficulty of charging the joycons. Taking off the wrist straps each time is really bad UX.


Not to mention accidentally inserting the wrist straps the wrong way!


You're ignoring the difference between playing in the living room (using a RetroPie) vs. on a PC. Two very different experiences.


HDMI out is a thing.


HDMI out is a thing that most people can't realistically use to connect their PC to their TV because they're in different places.


I bought a pretty cheap HDMI extension cable that snakes from one end of my living room (work desk) to the other (tv+consoles) that would argue otherwise!


A laptop will work fine. Even an Android or Win10 tablet with HDMI out will emulate N64/PS1 decently.


As someone who deals with chronic joint pain, getting opioid pain relief right now is a fiasco.

It is funny: prescribe pain relief, then act like I'm an addict? Give me a break. Who is selling the opioid epidemic story ad nauseam?


I have had severe chronic pain for many years as well. I use opioids, but only for very short durations; it is a huge pain to get off them once you have taken them for a while. What works best for me is the max dosage of medical marijuana (CBD and indica, not sativa) plus synthetic opioids like tramadol, not used daily. The current political climate treats all opiates as a no-no, which is why they are so hard to get now, and it is awful for those with lots of pain. But doctors really did prescribe them too much.


Seems like an ethics test. Will you tell the truth about your previous low salary or lie to make sure you don't get lowballed?

Then for the employer, will you decrease the salary to match previous low rates or wages? Or will you pay him/her the market rate regardless?

Personally, I have always been asked how much I made at previous places. I prefer to give a range rather than specify individually.


> I prefer to give a range

I've never understood this tactic. Specifying a range seems like a good way to underprice yourself. If you give a range, the recruiter will simply assume that your current salary is very close to the bottom of the range.

Now, if you mean that you give a range of previous salaries, the recruiter will know your best previous salary anyway. And they'll probably knock a fair bit off the top assuming you're padding.


> I prefer to give a range rather than specify individually.

FWIW, this is likely almost as bad for you in the negotiation as giving the number. You will simply always get offers towards the bottom of your range. It is better to refuse to answer, and just say you're looking for a market rate salary for this role.


Giving the number doesn't have to be bad for you. (But it can be.)

But yeah, never give any range. Not when talking about previous compensation, and not when talking about what you want. Just give them straight up numbers, if you give anything at all.

And you should definitely give them a number about how much you want to be paid. You want to set the first reference point. And yes, that means you will have to do your own due diligence about how much the market can currently bear for your (interviewing) skills. But you need to know these numbers anyway.


But it's not a dilemma. There's nothing wrong with saying "I don't disclose my past salary. It's personal. Let's focus on this job, and if you make me an offer, we can discuss that."


Some ethics test. The employer is already failing at ethics by asking this in the first place, so I have no problem lying to them. This is no different than them asking if you have kids, if you're married, etc.


I wish they acted like Apple in this regard.

People can understand and deal with having products sent in for repair, but only if you treat the issue of customer service professionally.


Exactly. But the issue here is that you are dealing with two different mindsets. On one hand, you have the paragraph above that 'stands alone'.

But that has no bearing on the truth. It omits key facts needed for it to be true, creating a social situation like 'reacting to a non-existent threat'.

So to me and you, ridiculous paragraphs like those above are In Error, because they are Unfinished and Missing Info. But to others, they point to the paragraph above as some kind of 'sole entity' that exists on its own.

No one should care about a paragraph that omits facts critical to portraying the reality of the situation. But apparently, on the Internet, some do.

Chuck that incomplete paragraph in the trash and try again, that's my response to their nonsensical verbal antics.


Keep in mind this is not just hardware but software related.

Right now I use graphics tools to optimize 3D models for mobile devices (for example).

For us developers, it means optimizing to a different standard, while worrying about backward 'optimization compatibility'.

Plus we'll need new software tools for development on the new GPU.


I imagine the new GPU will support the same Metal API. As this is probably something Apple has been working on for a while now, I'd assume that Metal was designed with this new GPU architecture in mind. How many of those tools would need to be updated for a new GPU, assuming the current API was written specifically for it?


But even if the API is the same, it doesn't mean it will have the same performance? Something that was not a problem with previous GPUs could become a bottleneck and vice-versa? And since you need to support both GPUs or else you drop older iPhone support, it might get annoying.

I don't really know if that would be the case though, I don't know anything about GPUs.


>But even if the API is the same, it doesn't mean it will have the same performance? Something that was not a problem with previous GPUs could become a bottleneck and vice-versa? And since you need to support both GPUs or else you drop older iPhone support, it might get annoying.

If "in certain cases, the performance profile can be annoyingly different" is the outcome of a tectonic shift in the underlying hardware, it would be a significant and praise-worthy achievement.


But it is Apple, one of the rare "experienced" companies that can pull this off well (not perfectly, but well enough). Apple has done two (or three?) complete CPU architecture changes, and developers just had to recompile their apps. Remember fat binaries from the PPC>Intel switchover?

I suspect as long as the graphics tools use Metal, it will be a quick transition.


