> BSc and an understanding of how to make things that can scale
That certainly is not the only way to learn how to write scalable applications (e.g., by reading about techniques and/or by building them).
> We’re all familiar with the trope of the medium-sized business exec who scoffs at the invoice of an SWE consultant arguing “my 12-yo nephew could have built that!” - and they’re not superficially wrong - but their 12yo nephew couldn’t build something that scales to millions of users and petabytes of data.
Scaling to millions of users and petabytes of data is not needed in many cases, and a simpler solution can get the product shipped faster and with fewer bugs. Part of being a good SWE consultant is taking the time to gather the real-world requirements and the potential future uses, and clearly explaining the expected operational envelope to the client.
> I’ve found a quick-and-dirty way of identifying the unwarranted self-confident types is to ask them how they’d quickly put together a CSV parser - if they give an answer involving `String.Split` you can tell they don’t have a degree.
I'd rather have someone who bothers to ask what the parser will be used to process (and what will be consuming its output) and gathers the real-world requirements. Someone who confidently assumes they know what the CSV "spec" is and starts coding a parser for the RFC 4180 general case can waste lots of time and money building the wrong thing. There are certainly cases where splitting a string on line breaks and commas is a good fit for the problem. There are also cases where that will parse data incorrectly and explode on files over a certain size, as sketched below.
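To make that failure mode concrete, here is a minimal sketch (in C#, since the quote mentions `String.Split`; the sample record is made up) of where a naive split diverges from RFC 4180 handling of quoted fields:

```csharp
using System;

class CsvSplitPitfall
{
    static void Main()
    {
        // One RFC 4180 record: the quoted field contains a comma,
        // so a compliant parser should see three fields, not four.
        string record = "1,\"Smith, Jane\",accounting";

        // Naive approach: split on every comma.
        string[] fields = record.Split(',');

        // Prints 4 -- the comma inside the quoted field is
        // incorrectly treated as a delimiter.
        Console.WriteLine(fields.Length);
        foreach (var field in fields)
            Console.WriteLine(field);
    }
}
```

Whether that matters depends entirely on the data the client actually has, which is the point: ask first, then pick the simplest parser that fits.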