> Reduce energy usage to reduce CO2. That means reduce internet calls, and reduce CPU utilization to do the thing.
This is an interesting way of thinking! On one hand, all of my homelab servers now run on old AMD Athlon 200GE CPUs that some would consider e-waste, but they have a 35 W TDP and are enough for all of my CI/backup/development needs, in addition to being a cheap x86 option.
To me, it feels like drawing 100 W from the wall might be a reasonably energy-efficient option, since I don't care as much about uptime (the servers can sleep every night when I do), redundancy or failover (apart from additional HDDs for data and backups), cooling (they're all passively cooled with heatsinks and slightly warm the house in the winter), or other things you'd expect in a data center.
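For a rough sense of scale, here's a back-of-envelope calculation (a minimal sketch; the draw, awake hours and grid carbon intensity are all assumptions, not measurements of my actual setup):

    # Back-of-envelope: a ~100 W homelab that sleeps at night vs. the same box running 24/7.
    # All inputs are assumptions; grid carbon intensity varies a lot by country.
    DRAW_W = 100          # average draw from the wall, watts
    HOURS_AWAKE = 12      # hours per day the servers are actually on
    CO2_KG_PER_KWH = 0.4  # rough grid carbon intensity

    def yearly(hours_per_day):
        kwh = DRAW_W * hours_per_day * 365 / 1000
        return kwh, kwh * CO2_KG_PER_KWH

    for label, hours in [("sleeps at night", HOURS_AWAKE), ("always on", 24)]:
        kwh, co2 = yearly(hours)
        print(f"{label}: ~{kwh:.0f} kWh/year, ~{co2:.0f} kg CO2/year")
    # sleeps at night: ~438 kWh/year, ~175 kg CO2/year
    # always on: ~876 kWh/year, ~350 kg CO2/year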
That said, this line of reasoning also implies that many languages would go out the window, if we're talking about the back end: https://www.researchgate.net/publication/320436353_Energy_ef... (or an article: https://medium.com/codex/what-are-the-greenest-programming-l...)
Python would go out the window. Ruby would go out the window. Same for something like Lua and PHP, maybe even TypeScript on the back end.
You'd essentially have to use C, C++ or Rust, or maybe something like Java, C#, Go or JavaScript for your back end (or some of the other more energy-efficient languages; I've just listed some of the more popular options here).
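If you ever want to sanity-check those language rankings against your own workload, a crude option is to read the CPU's RAPL energy counter before and after a run. A minimal sketch, assuming a Linux machine that exposes RAPL through the powercap sysfs interface (the path is often root-only, it measures the whole package rather than just your process, and older chips like the Athlon 200GE may not expose it at all):

    import time

    # Cumulative package energy in microjoules; wraps around at max_energy_range_uj.
    RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"

    def read_uj():
        with open(RAPL) as f:
            return int(f.read())

    def measure(workload):
        start_uj, start_t = read_uj(), time.time()
        workload()
        joules = (read_uj() - start_uj) / 1e6   # ignores counter wraparound
        seconds = time.time() - start_t
        print(f"~{joules:.1f} J over {seconds:.1f} s (~{joules / seconds:.1f} W)")

    # Run the same task in two implementations and compare the numbers.
    measure(lambda: sum(i * i for i in range(10_000_000)))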
I think that makes sense at larger scales (hundreds or thousands of instances, or more) rather than for your small CRUD app, where you just want to iterate quickly and write simple code. Then again, Wirth's law shouldn't be forgotten either, and it's hard for me to reason about the CO2 impact of big tech versus the collective impact of small projects.
And then there's the comparison of how much energy crypto uses versus data centers:
Bitcoin: approx. 127 TWh/year (source: https://www.forbes.com/advisor/investing/cryptocurrency/bitc...)
EU data centers: approx. 77 TWh/year (source: https://digital-strategy.ec.europa.eu/en/library/energy-effi...)
If nothing else, it's curious to see the numbers.
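For a bit more intuition, converting those yearly figures into average continuous power draw (just arithmetic on the numbers above):

    # Convert yearly energy (TWh/year) into average continuous power (GW).
    HOURS_PER_YEAR = 365.25 * 24   # ~8766 h

    for label, twh in [("Bitcoin", 127), ("EU data centers", 77)]:
        gw = twh * 1000 / HOURS_PER_YEAR   # TWh -> GWh, then divide by hours
        print(f"{label}: {twh} TWh/year is ~{gw:.1f} GW of continuous draw")
    # Bitcoin: 127 TWh/year is ~14.5 GW of continuous draw
    # EU data centers: 77 TWh/year is ~8.8 GW of continuous draw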
>> Reduce energy usage to reduce CO2. That means reduce internet calls, and reduce CPU utilization to do the thing.
> This is an interesting way of thinking! On one hand, all of my homelab servers now run on old AMD Athlon 200GE CPUs that would otherwise be considered e-waste in the eyes of some, but now run at 35W TDP and are enough for all of my CI/backup/development needs, in addition to being a cheap x86 option.
Precisely! You're using them, and following the four R's: reduce, reuse, repair, recycle. Not sending stuff to the scrapyard or landfill is always laudable.
I'm also wary of falling into the bad environmentalist trope of blaming individuals for small things when there are massive things that could be greatly scaled down. It's like in California, where people get heckled for taking showers that are too long, all while farmers there try to grow wetland crops and trees in the desert! It's literally dozens of gallons versus acre-feet of water.
I recently read that Xcode 14 is bloating all iPhone packages compiled with it. It's stuff like that I'm looking at first, since it contributes to manufactured obsolescence through bloated storage requirements and wasteful, unneeded code. It's those things, at massive scale, that I'm focused on.