The whole climate impact of AI is overstated a million times over. You can run a perfectly capable, GPT-4o-destroying LLM on your own GPU right now. What is the difference between jonkling to vidya gaems and running an LLM to do something productive for a fucking change?
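The "run it on your own GPU" claim checks out on paper for quantized mid-size models. A rough sketch (the 8B parameter count, 4-bit quantization, and 20% overhead are illustrative assumptions, not any specific model's specs):

```python
# Back-of-envelope VRAM estimate for running a quantized LLM locally.
# All figures are rough illustrations, not measurements.

def vram_gb(params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Approximate VRAM needed: weight storage plus ~20% for KV cache and activations."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# An 8B-parameter model quantized to 4 bits per weight:
need = vram_gb(8, 4)
print(f"~{need:.1f} GB VRAM")  # ~4.8 GB, comfortably inside a 12 GB consumer card
```

By this estimate an 8B model at 4-bit fits on an ordinary gaming card, which is why local inference is unremarkable power-wise.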
The servers running the network you posted this shit over, the servers used to develop your favorite gacha coom game, the servers delivering your shit-ass Xitter feed or your Netflix garbage, all use more power than AI, and LLMs are actively more productive than that shit.
You can run a fully fledged LLM at home, but you can't train the model. The latter is the huge contributor to power consumption; running it is peanuts in comparison.
That is a one-time cost. Do you add the cost of every single watt of electricity ever generated, the cost of building factories, extracting materials and so on, to the “climate impact” of electric rail service or EVs?
This is just hysteria, I’m going to be absolutely deadass with you
Also, you theoretically could make an LLM entirely on your own PC, it’d just take a bit of time
Yes, I do in fact. We need to lower the environmental impact of production too; consumption is just a drop in the bucket. To put it in perspective: I can run my PC from a second-hand generator, and most low-end generators might even be able to run tens of my PCs. A datacenter training the high-end LLMs that I could be running needs a nuclear power plant's worth of energy. We are talking multiple orders of magnitude of difference.
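The "multiple orders of magnitude" claim is easy to put rough numbers on. All wattages below are assumptions for illustration (a desktop under load at ~500 W, a low-end generator at ~4 kW, a frontier training cluster drawing on the order of tens of megawatts):

```python
import math

# Rough power draws in watts; illustrative assumptions, not measurements.
GAMING_PC_W = 500            # one desktop under load
SMALL_GENERATOR_W = 4_000    # a low-end portable generator
TRAINING_CLUSTER_W = 30e6    # a frontier-scale training cluster, order tens of MW

print(f"PCs per generator: ~{SMALL_GENERATOR_W / GAMING_PC_W:.0f}")
gap = math.log10(TRAINING_CLUSTER_W / GAMING_PC_W)
print(f"training cluster vs one PC: ~{gap:.1f} orders of magnitude")
```

Under these assumptions the gap between one PC and a training cluster lands around five orders of magnitude, which is the shape of the argument being made here.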
I suppose you don’t consider the coal-powered electricity that charges your EV when reflecting on your impact, either?
If you could be arsed, it’s entirely possible to train an LLM with 10 PCs using strictly open-source, ethically sourced datasets available on Hugging Face (though not something that is going to beat SOTA models).
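Whether ten PCs could realistically train anything can be sanity-checked with the common 6·N·D FLOP rule of thumb for transformer training cost. The parameter count, token count, and per-GPU throughput below are all assumed figures, not a recipe:

```python
# Chinchilla-style cost estimate: training FLOPs ~= 6 * params * tokens.
# All concrete numbers are illustrative assumptions.

params = 1e9          # a small 1B-parameter model
tokens = 20e9         # ~20 tokens per parameter, a common ballpark
flops_needed = 6 * params * tokens   # 1.2e20 FLOPs

gpus = 10
per_gpu_flops = 3e13  # ~30 TFLOP/s sustained per consumer GPU (optimistic)
seconds = flops_needed / (gpus * per_gpu_flops)
print(f"~{seconds / 86400:.1f} days on {gpus} PCs")
```

Under these optimistic assumptions a small 1B model is a matter of days on ten consumer GPUs, which supports "possible, but nowhere near SOTA."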
Weird, then, that OpenAI is losing money on every query: they lose money on every paid subscription and are on track to lose billions this year.
This is … Incredibly dumb.
OK you walk the walk I respect that