> Last Tuesday, we shipped a new feature at 10 AM, A/B tested it by noon, and killed it by 3 PM because the data said no.
I've worked with Amazon Weblab, and even Amazon doesn't have the kind of traffic you'd need to draw statistically significant conclusions on that time frame (with a few very obvious exceptions, like the whole site being down/unusable/etc). People need to calm down with their "hustle porn", unless they don't mind their credibility being in the toilet.
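For a rough sense of scale, here's a minimal sketch of the standard two-proportion sample-size calculation. The baseline rate, lift, and test parameters below are illustrative assumptions on my part, not figures from any real experiment:

```python
from statistics import NormalDist

def samples_per_arm(p_base, p_test, alpha=0.05, power=0.80):
    """Approximate visitors needed per arm to detect a shift from
    p_base to p_test with a two-sided z-test (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for alpha
    z_b = NormalDist().inv_cdf(power)           # critical value for power
    p_bar = (p_base + p_test) / 2
    num = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
           + z_b * (p_base * (1 - p_base) + p_test * (1 - p_test)) ** 0.5) ** 2
    return num / (p_base - p_test) ** 2

# Detecting a 2% relative lift on an assumed 5% baseline conversion rate:
n = samples_per_arm(0.05, 0.051)
print(f"~{n:,.0f} visitors per arm")  # on the order of 750k per arm
```

At roughly 1.5M visitors across both arms for even that modest effect, a few hours of traffic can only rule out the most dramatic regressions.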
With LLMs (and colleagues) it might be a legitimate problem since they would load that eval into context and maybe decide it’s an acceptable paradigm in your codebase.
At least two of YC's early (mid-aughts) "huge" successes come down to PG unilaterally (or with some help from JL) making some kind of "weird" call. Airbnb and Reddit come to mind. Even Stripe can be traced to him, since he basically created the Auctomatic team (Patrick Collison's previous YC entry).
In other words, PG had the "knack" for sometimes encouraging the right weird thing. I'm not sure it's been the same since he handed off the reins, like any other formerly-founder-led company. Nowadays it really gives off the vibe of bean-counting and hype-chasing.
I don't think it's gotten quite as bad as this [0] article suggests, though.
Unintelligent people can also be right, or lucky, etc, and someone judging on those criteria can end up getting swept up in making some very bad decisions based on dubious advice.
One of the most important lessons I ever learned in my career was not to mindlessly disregard a known bullshitter. He'll be right often enough that you'll look foolish, even if he hasn't earned his reputation.
This likely takes 500 MB+ of RAM. TFA probably didn't account for the tauri://localhost process in their calculation, which by itself takes 200 MB+. Then your app process will take 100 MB+, and there will be a couple of other processes besides.
Tauri is no better than Electron in terms of RAM, just like people calling it "lightweight" are no better than flat earthers. Let's hope they come around.
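If you want to check this yourself, the per-process numbers have to be summed, since a Tauri (or Electron) app runs as several OS processes. A Linux-only sketch that parses VmRSS out of /proc (the helper name is mine, not from any Tauri tooling; gathering the pids belonging to your app is left to you):

```python
import os

def total_rss_mb(pids):
    """Sum resident set size (VmRSS) in MB for the given process IDs.
    Linux-only: reads /proc/<pid>/status."""
    total_kb = 0
    for pid in pids:
        try:
            with open(f"/proc/{pid}/status") as f:
                for line in f:
                    if line.startswith("VmRSS:"):
                        total_kb += int(line.split()[1])  # value is in kB
                        break
        except FileNotFoundError:
            pass  # process exited between listing and reading
    return total_kb / 1024

# e.g. pass every pid whose /proc/<pid>/comm matches your app's name
print(f"{total_rss_mb([os.getpid()]):.1f} MB for this process alone")
```

Measuring only the main app process is exactly the mistake that makes these frameworks look lighter than they are.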
This wasn't even just greenfield work, it included the exact type of work where AI arguably excels: extracting working code from an extant codebase (SQLite) as a reusable library. (It also included the type of work AI is really bad at: designing APIs sensibly.)
Going 100x faster is the problem. As they say, "slow is smooth, and smooth is fast," and when you're going for product-market fit this is very important (and understated, in my opinion). It doesn't help that your thread is whipping past at a hundred yards a second when what you're trying to do is thread the needle.
I wonder if "LLMs make coding 10x faster" is going to turn out to matter as little across the total SDLC as EVs' "straight-line acceleration faster than a supercar for $50k" turned out to matter in everyday driving.

That is, lacking a supercar was not the thing preventing you from getting to work faster.
Neither. "Leaving YC" or "being removed from Y Combinator" really just means you (more precisely, your YC/HN account) lose access to internal resources like Bookface. This does have the knock-on effect of essentially isolating you from the community. It's not entirely a punishment; it can be as simple as your no longer working on a YC company, for example.
This has zero bearing on equity, which would be a different conversation. In this case, I think the YC SAFE is likely to remain as-is, unless the founders choose to return the money, or YC chooses to levy a heavier allegation of fraud (which they don't seem to have done here).