Hacker News | new | past | comments | ask | show | jobs | submit | willberman's comments

+1 on vim with fzf and rg being enough for almost any task. I use that setup for most things and then when I need something with a little more syntactic awareness, I pop open my IDE.
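For anyone curious, a minimal sketch of that setup in a `.vimrc`, assuming the fzf and fzf.vim plugins are installed and `rg` is on the PATH (the specific mappings are just illustrative choices, not part of the original comment):

```vim
" Use rg as fzf's file source so gitignored files are skipped
let $FZF_DEFAULT_COMMAND = 'rg --files --hidden --glob "!.git/*"'

" fzf.vim's fuzzy file picker and project-wide live grep
nnoremap <C-p> :Files<CR>
nnoremap <leader>g :Rg<CR>
```

`:Files` and `:Rg` come from fzf.vim; anything needing real syntactic awareness still goes to the IDE, as the comment says.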


I think the problem is that your argument comes from a fixed prior point of view and doesn't assume good faith on the part of people working on smart contract problems (it declares them, as a whole, fundamentally unethical). A better way to pose your point of view would be to ask, "Are there ways to mitigate the automatic triggering of actions in smart contracts?" Framing it as a question like this would be more likely to foster a healthy discussion, because there certainly are ways to mitigate the extent of automatic actions (multisigs, etc.). And who knows, everyone might learn something as a result of the discussion!

Disclosure: I work in the cryptocurrency space and generally think Hacker News doesn't know how to have productive discussions about cryptocurrency.


I would prefer it if schools gave applicants feedback that is as concrete as possible about why they were or were not admitted.

e.g. Congrats, you got accepted. Your GPA ranked in the top $X percentile of our admitted class, and your engagement in robotics was outstanding.

or

You were not admitted because, while your GPA was excellent, we did not see sufficient community engagement in your extracurriculars.


There's no point in feedback, since you can't go back in time to fix those issues. You should already be able to guess from looking at the profiles of students who got in and those who didn't.


I don’t necessarily agree with either of those.

Maybe a college tells me that a particular type of extracurricular wasn’t valued as highly as I thought, and I elect to pursue something different in undergrad (I wouldn’t recommend this, but to each their own). The feedback would also provide relevant data points to future applicants if it were shared.

There are plenty of examples of students being accepted to or rejected from schools you wouldn’t expect.


Do you mind being specific about how there are no longer hard engineering problems to be solved?


Well, beginning around 2008, we saw websites and applications that served the entire world, literally billions of users. That simply did not exist before then; the internet as a whole was a tiny part of people's lives. All the stuff that went into solving that involved genuinely new ideas in servers, databases, ops, data centers, etc. Eventual consistency went from being a joke to the only game in town. Distributed computing on large data sets went from Google's papers in the mid-2000s to a widely used idea in Hadoop. Smartphones went from zero, to the iPhone, to literally more than half the world owning one. Devops became standard. Google went from managing just one or two datacenters to running AI to optimize the power efficiency of each one.

When you already have architecture that is capable of serving the entire world, when everyone has a smartphone, all that's left in that space is efficiency/cost optimization. Whatever shiny new programming language or framework comes out, the bedrock of computing is solid (and the shiny new language is really mashing together decades-old ideas, which is cool if you're new to them but boring if you're not). Every company is a mess on the inside, of course, but they do their core functions well enough. AWS did not exist 10 years ago; now it's big internet drama when it goes down for a few hours twice a year. A detailed postmortem comes out (not standard before), revealing the problem was a configuration change (because everything has long since been automated, so it's driven purely by configuration).

I know not everyone is interested in those same problems, but I thought they were interesting, and what's left now in the guts of software is an adoption phase, with people rediscovering lessons that others have long since figured out. Perhaps people who work on higher layers of the stack think differently; I'm only interested in the deep tech below, and it's pretty solid at this point.


There's still lots of relevant work to do deeper in the stack. It might just not be at large internet-scale companies (I doubt this, but let's assume it's true). Almost every university has a systems research lab doing interesting things. I think I paged through five different professors' websites in CMU's systems department the other month and found 9-10 interesting projects.


For any point of view on AI, I can probably find a decent number of non-technical people who hold that POV. Technical illiteracy is a significant problem, imo.


Workflow on a midsized personal project, with an editor I'm comfortable with and have customized over the majority of the time I've been programming: it works.

Look, obviously VS Code has emacs beat in the vast majority of usability features. Emacs works for a lot of what I work on; for the things it doesn't work for, I use the tool that works better.

I have lots of personal qualms with emacs, but declaring it "maximally unsuitable" is overkill.


Different things work for different people, but I'll give you my two cents. I got through college entirely through self-study.

What works for me is reading a textbook and rewriting every concept in my own words until I understand it. Afterwards, I do practice problems or a project to ensure I'm not lying to myself about my understanding.

If I can, I stick to one topic until I'm sure I've solidified my knowledge to a sufficient degree that I won't forget it when I shift my main focus to something else.

Lastly, and most importantly, be OK with failure. When you fail to learn something, take a break and then come back, approaching it from an entirely different angle.


I’d give the same advice I’d give to understand anything remotely complicated.

Pick a real goal you want to accomplish that the thing you want to understand is a good fit for.

Maintain a curious mindset while trying to accomplish the goal.

To frame this in the context of emacs: just start using it for programming. When you come across something you want it to do, google "how do I make emacs <...>".

I know it sounds overly simplistic, but just trust me and try it.


Look at Chamath Palihapitiya's description of social capital. This is what he states he's attempting to address.

The counterargument is addressed in a debate between Reid Hoffman and Peter Thiel, where Hoffman's point of view is that change occurs incrementally, and what may not be considered revolutionary today (Twitter, etc.) would have been considered revolutionary 100 or so years ago.


I use Ubuntu as my main OS for programming. I run into a few kinks every now and then, and for those I keep around a Windows box and a macOS box. I'd say Ubuntu is a sane desktop to get work done in.

