

You’re not even arguing at this point, you’re just repeating false claims I already debunked as if saying things louder might make you more correct.
I never said I cared about labor; I only care about outcomes. You’re the inconsistent one.
bro just doesn’t get it, sorry you’re slow mate
Ah, so this kind of tool is allowable, but not another?
Yes.
Pretty hypocritical thinking there.
Not even.
Different tools with different costs and different outcomes.
Grammarly predates commercial generative AI, as I attempted to explain to you before. It’s over a decade old. You clearly don’t understand the core mechanisms of any of these things.
Ok, but why do you think it’s okay to use a wrecking ball for a task that requires a chisel? You’re creating low-quality, high-cost work just because it’s fast and easy.
LMAO wtf? I included all of gaming as opposed to all generative AI. My estimate also included the cost of production, if you check the source.
You’re the one who wanted to compare AI power costs to gaming costs, and now you’ve shifted the goalposts to total power costs for everything?
It’s a waste. AI is a massive fucking waste. It’s going to actually, literally kill us all with climate change alone; at the current rate it’s going to multiply our power consumption many times over in only a couple of decades, even after you account for efficiency gains. It’s beyond worthless, it’s an almost pure negative.
Apparently you can only read 2 of 3 lines. That estimate was a global projection of gaming cost IF the globe followed similar trends to the USA (because that’s the only available data), so the real global cost estimate for gaming might be far, far lower.
The USA alone used 27 TWh on gaming, not 285.
What a fucking curveball joke of a question. You take a nearly impossible-to-quantify comparison and ask if it’s equivalent?
Gaming:
A high-scenario electricity consumption figure of around 27 TWh, and a low-scenario figure of 14.7 TWh.
The North American gaming market is about 7% of the global total,
so that gives us a very, very rough figure of about 210-285 TWh per annum of global electricity used by gamers.
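(For anyone who wants to check that scaling, here’s the back-of-the-envelope in Python. The 14.7-27 TWh US figures and the ~7% share are from the quote above; note that the high end only lands on the quoted 285 if you assume a share nearer 9.5%, which is my guess, not the source’s.)

```python
# Back-of-the-envelope: scale US gaming electricity to a global estimate.
# US figures (14.7-27 TWh) and the ~7% market share are from the quote above.
us_low_twh, us_high_twh = 14.7, 27.0
na_share_of_global = 0.07  # North American share of the global gaming market

global_low = us_low_twh / na_share_of_global    # ~210 TWh
global_high = us_high_twh / na_share_of_global  # ~386 TWh; the thread quotes 285,
                                                # which implies a larger share (~9.5%)

print(f"Global gaming estimate: {global_low:.0f}-{global_high:.0f} TWh/yr")
```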
AI:
The rapid growth of AI and the investments into the underlying AI infrastructure have significantly intensified the power demands of data centers. Globally, data centers consumed an estimated 240–340 TWh of electricity in 2022—approximately 1% to 1.3% of global electricity use, according to the International Energy Agency (IEA). In the early 2010s, data center energy footprints grew at a relatively moderate pace, thanks to efficiency gains and the shift toward hyperscale facilities, which are more efficient than smaller server rooms.
That stable growth pattern has given way to explosive demand. The IEA projects that global data center electricity consumption could double between 2022 and 2026. Similarly, IDC forecasts that surging AI workloads will drive a massive increase in data center capacity and power usage, with global electricity consumption from data centers projected to double to 857 TWh between 2023 and 2028. Purpose-built AI infrastructure is at the core of this growth, with IDC estimating that AI data center capacity will expand at a 40.5% CAGR through 2027.
Let’s just say we’re at the halfway point and it’s 600 TWh per annum, compared to 285 for gamers.
So more than fucking double, yeah.
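(Same thing as straight arithmetic, using only the figures quoted above; the 600 TWh “halfway point” is the assumption stated a couple of lines up, and it covers all data centers, not AI workloads alone.)

```python
# Compare the quoted data-center and gaming electricity figures.
datacenter_2022_twh = (240, 340)   # IEA estimate for all data centers, 2022
datacenter_2028_twh = 857          # IDC projection for 2028
datacenter_now_twh = 600           # the thread's "halfway point" assumption

gaming_global_twh = (210, 285)     # rough global gaming estimate from above

ratio_low = datacenter_now_twh / gaming_global_twh[1]   # vs the high gaming figure
ratio_high = datacenter_now_twh / gaming_global_twh[0]  # vs the low gaming figure
print(f"Data centers use roughly {ratio_low:.1f}x-{ratio_high:.1f}x gaming's electricity")
# prints ~2.1x-2.9x, i.e. "more than double"
```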
And to reiterate, people generate thousands of frames in a session of gaming, vs a handful of images or maybe some emails in a session of AI.
Nice, I swear it never works when I try searching it.
And the official solution to this problem in the documentation is a library that was deprecated four versions of the framework and/or programming language ago.
None of your examples are even close to a comparison with AI, which steals from people to generate approximate nonsense while costing massive amounts of electricity.
Firstly, a calculator doesn’t have a double-digit percent chance of bullshitting you with made-up information.
If you’ve ever taken a calculus course, you likely were not allowed to use a calculator that could solve your problems for you, and you likely had to show all of your math on paper. So yes, that statement is correct.
Right, but detecting close-enough spellings and word orders against a curated index or catalogue of accepted examples is one thing.
Training layers of algorithms on layers of machines over massive datasets to arrive at the same “close enoughs” would get you there at many times the cost.
You would be a moron to use LLMs for spellchecking.
To clarify: not all programs are equal. It’s not all different methods to do the same thing at the same cost.
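(For contrast, here’s roughly what the cheap curated-catalogue approach looks like, a minimal sketch using Python’s standard-library difflib; the tiny word list here is a stand-in for a real dictionary.)

```python
import difflib

# A curated catalogue of accepted spellings (stand-in for a real dictionary).
DICTIONARY = ["electricity", "generative", "consumption", "hypocritical", "catalogue"]

def suggest(word: str, n: int = 3) -> list[str]:
    """Return up to n close-enough spellings from the curated list."""
    return difflib.get_close_matches(word.lower(), DICTIONARY, n=n, cutoff=0.6)

print(suggest("electrisity"))  # ['electricity']
print(suggest("genrative"))    # ['generative']
```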
Good shit. A carefully thought-out, handcrafted experience will always be better than interactive slop.
If you learned to code with AI, then you didn’t learn to code.
LLMs shouldn’t be used for spellcheck; that would just be a massive waste of power.
I think the sort of generative AI being referred to is something that trains on data to approximate results, which consumes vastly more power.
Not AI, but certainly a semirandom function. Then they go through and manually clean it up by hand.
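(A hypothetical sketch of that kind of “semirandom plus manual cleanup” pipeline, not the actual tool being discussed: a seeded generator gives reproducible output that a designer can then hand-edit.)

```python
import random

# Hypothetical sketch of "semirandom" content generation: a fixed seed makes
# the output reproducible, so designers can regenerate and hand-tune it.
def scatter_props(seed: int, count: int, size: int = 100) -> list[tuple[int, int]]:
    rng = random.Random(seed)  # seeded -> same layout every run
    return [(rng.randrange(size), rng.randrange(size)) for _ in range(count)]

layout = scatter_props(seed=42, count=5)
print(layout)  # a designer then manually cleans up anything that looks wrong
```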
Reread what I said, smarter this time.