Computing gadgets don't last long enough. I wanted to write a lot about this, but since I don't have good answers, I figured I might as well just pose some questions.
All the time, we're throwing out something that still works, because there's something new that works better.
Working better is great! Faster chips and better screens and cameras make our lives easier and help us make cool new stuff.
I used to work near people designing processors, so I know it's plenty hard just to keep that part of the system progressing. And I know that newer stuff will use less energy, which is great. I know that people designing data centers think hard about their efficiency, and upgrade ruthlessly to improve it.
But how carefully are we considering the impact of getting rid of all that old equipment? How do we compare whether it's really, globally, ethically, better to improve speed & efficiency vs. just keep using old stuff? How do we factor in things like resources mined in conflict zones?
I recently remodeled a house, and among all the advice I read, I remember this best: "The greenest building material is the one that's already there." (It was about floors. We put in new bamboo anyway. That old parquet was awful, but I hope the new stuff stays there for a hundred years or more.)
I know that designing, say, a great modern smartphone is extremely difficult, even 'only' considering functionality, space, and power constraints. Designing something that small and integrated that could still be upgraded partially, keeping around the parts that still work, sounds near impossible, but wouldn't it be great? It sounds like a fun challenge.