> none of those organizations ever trained me at anything at all
They trained us repeatedly not to bribe foreign government officials, even though I had zero access to anybody like that. There was also some mandatory training against harassing coworkers. I.e. "protect the company from lawsuits" training, not "here are some ideas for how to do your job more effectively" training. They were megacorps, too.
Yeah, that's proven by the fact that they take degree-educated engineers and force-feed them videos designed for people in entry-level positions. It's a crying shame, because there are actually a lot of interesting discussions around nuance that get sidelined by these videos in favor of basic bitch absurdities:
> During the lunch break, Jim has dipped his penis into Samantha's yoghurt
> is this:
> a) entirely acceptable, it's just his culture
> b) a borderline issue
> c) something that someone should report to HR
Instead of:
Jane is developing feelings for someone that reports to her, they meet up outside of work and have a one-night stand. What should Jane do next?
I'd be interested in knowing what the CO2 emissions were from these. You still need to feed the yeast, so you'll have the CO2 emissions involved in growing a crop associated with this. And if you look at the chart in the OP, you'll see that grain production is about half the CO2 emissions of milk. That's likely part of the milk CO2 production accounting.
In addition, you'll need more cleaning/sterilization/mixing. I'd guess that it's lower, but I wonder how much lower.
And then there's the other products that generally get thrown into the mix to make up for things like missing fats. For example, a vegan cheese based on bacteria will often include coconut oil, probably to get the same fat profile.
Whey is an interesting product in general because it's a waste product of cheese making.
Feed efficiency is critical when doing these calculations, as cows inherently need energy to survive, not just to produce milk. As such, even if you use the same crop, two different sources of protein can have wildly different levels of CO2 emissions embedded in their creation. https://en.wikipedia.org/wiki/Feed_conversion_ratio
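A back-of-the-envelope sketch of how feed conversion ratio feeds into embedded emissions (every number below is a made-up placeholder, just to show the shape of the math, not a measured figure):

```python
# Illustrative only: how feed conversion ratio (FCR) changes the
# feed-related CO2 embedded in a kg of protein. All numbers are
# placeholders, not real data for cows or yeast.

def feed_co2_per_kg_protein(fcr, protein_fraction, co2_per_kg_feed):
    """kg CO2 from feed per kg of protein produced.

    fcr: kg of feed consumed per kg of product (milk, yeast biomass, ...)
    protein_fraction: kg of protein per kg of product
    co2_per_kg_feed: kg CO2 emitted growing 1 kg of feed crop
    """
    feed_per_kg_protein = fcr / protein_fraction
    return feed_per_kg_protein * co2_per_kg_feed

# Same hypothetical feed crop (0.4 kg CO2 per kg of feed), two converters
# with placeholder FCRs and protein contents:
cow   = feed_co2_per_kg_protein(fcr=1.1, protein_fraction=0.034, co2_per_kg_feed=0.4)
yeast = feed_co2_per_kg_protein(fcr=3.0, protein_fraction=0.5,   co2_per_kg_feed=0.4)
```

Even with identical feed, the two hypothetical converters end up with very different CO2 per kg of protein, which is the point about maintenance energy.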
I think it is likely more efficient. That said, cows do have the advantage that the food they consume needs little to no processing in order to produce milk. The yeast needs pretty precise processing of the incoming mash, both to make sure a wild yeast strain doesn't make its way in and to make sure the yeast ultimately produces the right proteins.
You can't just throw grass clippings into a vat and get whey. You can throw grass clippings into a cow to get milk (though, TBF, I dislike grassy milk).
I agree it’s likely to be more labor intensive per lb of feedstock, but only 21% of calories in milk are protein and overall milk has ~10% of the initial energy. So you’re looking at ~2% of the energy from these crops ending up as milk protein.
That’s a lot of room for improvement which then means far less labor on growing crops.
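The ~2% figure is just the product of those two fractions; a quick sketch of the arithmetic using the numbers above:

```python
# Rough arithmetic from the comment above: if milk retains ~10% of the
# feed crop's initial energy, and ~21% of milk's calories are protein,
# then the share of crop energy ending up as milk protein is:
energy_retained_as_milk = 0.10   # ~10% of feed energy ends up in milk
protein_share_of_milk   = 0.21   # ~21% of milk calories are protein

crop_energy_to_milk_protein = energy_retained_as_milk * protein_share_of_milk
print(f"{crop_energy_to_milk_protein:.1%}")  # → 2.1%
```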
Cows are pretty terrible because of methane from their burps (not farts; burps). People are working on that, but it's still real. A 50% drop would be very significant.
I am very much sympathetic to nature conservation, decarbonization, degrowth, etc., but really, there are more important considerations at this very moment than shaving a few kg of CO2 by ditching milk.
And, as much as some powers try to convince us, not everything can be reduced to carbon footprint.
Is this going to result in net less greenhouse gas emissions?
Maybe, but probably not zero. From the parent article: "The use of such treated fertilizers will be most relevant for reducing the carbon footprint of milk in countries such as the United Kingdom, Ireland, and the Netherlands, where N fertilizer is a major contributor to the footprint."
In case you are unaware, much of the nitrogen in plant matter (food for yeast or cows) comes from fertilizer. And that is produced using the Haber process (see: https://en.wikipedia.org/wiki/Haber_process). This runs on natural gas, because it's effectively a waste product of other hydrocarbons being extracted.
I remember when people still knew what milk was ... and what was not milk.
That was before multi-billion-dollar companies came up with marketing strategies that manipulated people into not understanding what milk was, instead making them believe that milk is whatever they tell people.
Usually, the reaction to this is "Well, language and the meaning of words change." ... Sure, but that argument completely ignores the fact that it only happened because people with too much money and power can manipulate millions into believing whatever those millions of people are supposed to believe.
Thus now anything can be milk, as long as some profit-oriented company decides that people shall call it milk.
This practice has become the norm to such a degree that people will not only generally accept it, but also generally defend it. Pure madness.
Language changes. In this case just the spelling though.
"Almaund mylke" is all over medieval cookery manuscripts, among other options.
We’ve been using milk for non-animal products for longer than we’ve spelt milk with an i, and for longer than we’ve had companies, let alone multi-billion-dollar ones.
To be clear, Perfect Day doesn't make "milk" like plant-based milks (think almond "milk", oat "milk", etc). They bioengineered some yeast to grow whey protein directly. The milk they make (made?) probably wouldn't be considered "milk" in the strict sense (they had to get the fat and sugars from plants), but there's really not a good reason to distinguish between "whey protein from cows" and "whey protein from yeast" when it's the same stuff.
You understand that the product I'm talking about is the same proteins as milk, and is essentially whey, right?
I'm not talking about grinding up nuts or grains and calling it milk, I'm talking about engineering yeasts to literally produce the proteins that milk has to create a product that isn't just milk-like, but is literally identical proteins.
You are correct. Thankfully I never said anything of the sort.
I said proteins, plural. The s is a key indicator I was talking about more than one aspect.
Milk is about 5% lactose, ~3.5% protein (80% casein and 20% whey proteins, the whey being almost entirely two specific sub-proteins), and 3-4% fat, plus a negligible amount of minerals. Drop the fat and you have nonfat milk, which I think we'd both recognize as milk (unpalatable as it may be). Lactose-free milk is milk, I think. One assumes lactose-free nonfat milk is milk.
So if one is producing the three primary proteins, you've gone nearly the whole way there. There are some trace proteins you're missing, but if you're 99.99+% of the way to milk, you've got milk. Are you sure the milk you get from the store hasn't denatured those trace proteins?
The large dairy producers are forcing things that everyone understands — “Oat milk” and “Almond milk” — to be called “Oat drink” and “Almond drink”. New terms for things that have existed for decades.
Really, we should be calling the OG milk “cow milk” and let the good times roll.
Big milk has been pushing questionable health research and narratives around cow milk for quite some time.
All this coming from someone (me) who drinks 0.5 L of cow milk every day.
Yes, yeast milk is milk too. Just like coconut milk.
I thought the reason things are called "Oat Drink" versus "Oat Milk" is because non-dairy "milks" have to be fortified with vitamin D and calcium and the stuff that's labeled a "drink" is not fortified.
I don't know about the rest of the world, but in EU you can't write "milk" on the packaging unless it comes from cows/goats/sheep/etc. Also, you can't name it "honey" or picture bees or honeycombs on the packaging unless it's genuine honey from bees.
The solution is to write better laws and stop opposing regulations.
The first documented use of the term "coconut milk" in English dates from 1698 (Philosophical Transactions of the Royal Society, volume 20, page 333), and the use of "almond milk" goes quite a bit further back, to at least 1390 (The Forme of Cury).
Where are these emissions coming from? For instance, if this is counting the emissions involved in logistics, none of that inherently or necessarily requires greenhouse emissions—you can electrify trains, tanker trucks, and refrigerators.
If this is counting the methane emissions of the cow itself, that’s not a fair or complete accounting. The cow produces methane in her digestive system after eating grass, and the grass grows by, among other things, extracting CO2 from the air. Then the cow burps methane, the methane combines with atmospheric oxygen and breaks down to CO2 and water, and you have a closed loop; the cow cannot belch more carbon than she eats, and that carbon came from the air in the first place.
That doesn’t change the fact that you’re selectively counting only one side of a closed-loop process. Methane may be a more potent greenhouse gas than CO2, but if that effect dominated, we would expect to see the global warming trend start with the evolution of ruminants, not the Industrial Revolution (a time when the North American ruminant population actually declined a significant amount!).
But to put this in context, the average American family’s carbon footprint is roughly 50,000 kg per year, and one flight is usually on the order of >1,000 kg, or roughly 300 kg (~700 lb) of milk, assuming the high-end figure of 3 kg CO2 per kg of milk. So if you like milk, there are probably other places you can cut first.
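A quick sketch of that comparison, using the rough figures quoted above (all of them order-of-magnitude estimates, not precise data):

```python
# Rough comparison using the figures from the comment above.
family_footprint_kg = 50_000   # ~average American family's annual CO2, kg
flight_kg           = 1_000    # order-of-magnitude CO2 for one flight, kg
co2_per_kg_milk     = 3.0      # high-end estimate, kg CO2 per kg of milk

milk_equivalent_kg = flight_kg / co2_per_kg_milk   # kg of milk with the same footprint
milk_equivalent_lb = milk_equivalent_kg * 2.20462
flight_share = flight_kg / family_footprint_kg

print(f"one flight ≈ {milk_equivalent_kg:.0f} kg ({milk_equivalent_lb:.0f} lb) of milk")
print(f"that flight is {flight_share:.0%} of the family's annual footprint")
```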
It does seem like a lot of carbon for a kg of plastic, though; how does that compare to normal plastic’s carbon footprint?
I'm not above asking the barber to leave an inch on the top, but then I'm not going to ask him to leave 15mm on the sides. At least keep the system consistent within a sentence :)
Yeah, that sounds very familiar. A deadbeat dad who was overly controlling when he was present. A mom who was trying to juggle her career, housework, and two kids. This meant lots of time spent alone. Can't learn how to depend on people when there is nobody around. Bullying at school does not help, either. Hyper-independence is a predictable outcome.
It's very familiar. My father was abusive towards me, and so I tried to disturb him as little as possible. He was extremely supportive of my younger brother, however, and we both turned out very differently. He turns to my parents constantly for help (and receives it); the few times I have, I've only been brushed off, and so I've always done everything myself, even if it meant spending long periods living in poverty with no footholds upward.
Years later I'm nearly 50 and have transitioned to female. My now-divorced mother decided she likes having a daughter, and so suddenly all of this support has materialised. With presumably less time left to live the life I missed out on, I'm making a conscious choice to start being a little dependent on it and to shortcut some things that would have taken longer otherwise.
It's hard to describe but it feels like I'm living a more normal, less marginalised life now. I definitely have more friends, where previously I chose to avoid having any because I felt like having friends meant sharing burdens that were mine alone to carry.
I also see a lot of very similar behaviour in the wider trans community where most people's axis of behaviour revolves around some facet of not having support. We don't all handle it in a healthy way, but I now do my best to help my community find stability and adjust their expectations to having better outcomes.
It’s a blessing in some ways: I’m much more capable of doing difficult things without much help. But it’s also debilitating in significant ways: I constantly feel like I need to do incredible things to get love.
Apparently that's kinda where the name comes from. It's named after a Serbian musician who was known as Bora Čorba, who played for a band called Riblja Čorba (fish stew).
And also in Turkey. It (the word, if not the stew itself) arrived to the Balkans by way of the Ottomans. (And having just now clicked through to the link, it seems to have arrived to Turkey by way of the Persians).
The Wikipedia article traces it to Persian, which formed it as a compound of words from different East Iranian languages. So you are on the money with Middle Eastern. From there it spread to the Balkans via Ottoman Turkish, and also from Persian to dialectal Arabic, which would explain the occurrences in Northern Africa, and maybe even Spain.
If you have a problem with my comments, there is a link in my profile to a convenient way to hide comments from people you don't like. You are welcome!
Not all oatmeal has the same effect on blood sugar. Steel cut oats are absorbed more slowly than instant oats. Toppings on your oatmeal also affect blood glucose in non-linear ways, the same as any other combination of foods.
> you don't actually want to force the decision of what is in the cache back to compile time, when the relevant information is better available at runtime
That is very context-dependent. In high-performance code having explicit control over caches can be very beneficial. CUDA and similar give you that ability and it is used extensively.
Now, for general "I wrote some code and want the hardware to run it fast with little effort from my side", I agree that transparent caches are the way.
That solves the pollution problem, but it doesn't pin cache lines. It also doesn't cover the case that PPC does, where you want to assert a line is valid without actually fetching it.
Amateurs asserting their opinions as facts isn't great, but epistemologically it's no worse (and systemically, likely less harmful) than when the experts do it.
Saying that experts are less likely to do X doesn't say anything about the relative harm of their doing so. If some rando on the street is shouting their opinion about what causes Alzheimer's and asserting it's God's Own Truth, it's going to cause less overall harm than a carefully worded (but equally wrong) statement from an expert. (And the fact that we tend to hold experts in higher regard is the reason we should be more concerned about them stating their opinions as facts than about amateurs doing the same.)
Right, but as a programmer you rarely have control over that. And even if you do, you often can't handle out of memory errors gracefully.
Thus, for a typical situation it is reasonable to log the error and bail out, rather than adding extra custom error handling around every single memory allocation, which ends up being code that is never tested.
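That "log and bail out" pattern can be sketched as follows (the helper name is made up for illustration; this is a sketch of the idea, not a prescribed API):

```python
import sys

def checked_alloc(nbytes):
    """Hypothetical helper: allocate a buffer, or log the failure and bail.

    Centralizing the failure path in one place avoids sprinkling
    never-tested error-handling code around every allocation site.
    """
    try:
        return bytearray(nbytes)
    except MemoryError:
        print(f"fatal: failed to allocate {nbytes} bytes", file=sys.stderr)
        sys.exit(1)

# Normal case: the allocation succeeds and callers never see a None.
buf = checked_alloc(1024)
```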
Throughout this article you have been voicing a desire for affordable and high-throughput fp64 processors, blaming vendors for not building the product you desire at a price you are willing to pay.
We hear you: your needs are not being met. Your use case is not profitable enough to justify paying the sky-high prices they now demand. In particular, because you don't need to run the workload 24/7.
What alternatives have you looked into? For example, Blackwell nodes are available from the likes of AWS.
I think that you might have confused me with the author of the article.
American companies have a pronounced preference for business-to-business products, where they can sell large quantities in bulk at very large profit margins that would not be accepted by small businesses or individual users, who spend their own money instead of spending the money of an anonymous employer.
If that is the only way for them to be profitable, good for them. However such policies do not deserve respect. They demonstrate the inefficiencies in the management of these companies, which prevent them from competing efficiently in markets for low-margin commodity products.
From my experience, I am pretty certain that a smaller-die version of the AMD "datacenter" GPUs could be made and could be profitable, like such GPUs were a decade ago, when AMD was still making them. However, today they no longer have any incentive to do such things, as they are content with selling a smaller number of units at much higher margins, and they do not feel any pressure to tighten their costs.
Fortunately, at least in CPUs there has been steady progress, and AMD Zen 5 has been a great leap in floating-point throughput, exceeding the performance of older GPUs.
I am not blaming vendors for not building the product that I desire, but I am disappointed that years ago they fooled me into wasting time porting applications to their products, which I bought instead of spending the money on something else, only for them to discontinue those products with no upgrade path.
Because I am old enough to remember what happened 15 to 20 years ago, I am annoyed by the hypocrisy of some of the NVIDIA CEO's speeches, repeated for several years after CUDA was introduced, which amounted to promises that NVIDIA's goal was to put a "supercomputer" on everyone's desk, only for him to pivot completely away from these claims and remove FP64 from "consumer" GPUs in order to sell "enterprise" GPUs at inflated prices. That soon prompted AMD to imitate the same strategy.
I have an MSc in CS. While I spent half of my career writing device drivers, the other half was doing computer architecture. You could say I had one foot on the low-level software side and the other on the high-level hardware side. I found them to be two sides of the same coin. Understanding how hardware folks see the world took a few years, but it was very doable.
My biggest gripe with the semiconductor industry as a career, compared to software, is twofold.
First, it is very concentrated. If you want to make good money there are only a handful of potential employers, and thus only a handful of cities/neighborhoods where you will have to live; remote work is theoretically possible, but not all employers make it effective. I found this the most frustrating. The upside is that people know this and thus tend to stay at the same employer for a long time, so you get to learn from people with a deep understanding of the product, and people are mindful to keep a pleasant work environment.
Second, the pay isn't as good at the top end. If you have FAANG-level skills, you will typically do much better financially there than in the semiconductor industry, with the notable exception of NVIDIA for the past decade or so.