Well, even so, it does a pretty poor job of it. And I'm being generous; it's really more like smoke and mirrors. [MOD EDIT]
First, they use total worldwide energy production for their basis, rather than electrical energy production. Try mining Bitcoin by burning natural gas, kerosene, coal, wood, or dung without first turning it into electricity. Not gonna happen.
So let's substitute worldwide annual electrical energy production, which was reported as 25,606 TWh (tera is 10^12) in 2017, in place of their 170,000 TWh of energy per year. So they minimize the Bitcoin electrical energy % by a factor of 6.64x by using the wrong metric. How "convenient".
https://en.wikipedia.org/wiki/World_energy_supply_and_consumption
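For anyone who wants to check my math, here's the denominator comparison as a quick back-of-the-envelope calculation (figures are the ones quoted above; rounding is mine):

```python
# How much does picking the wrong denominator shrink the percentage?
total_energy_twh = 170_000   # article's basis: total world energy production, TWh/yr
electricity_twh = 25_606     # world electrical energy production, TWh/yr (2017)

factor = total_energy_twh / electricity_twh
print(f"Understatement factor: {factor:.2f}x")   # -> 6.64x
```

Any Bitcoin percentage computed against total energy is 6.64x smaller than the same number computed against electricity, which is the only form of energy that actually runs the miners.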
So they estimate current Bitcoin energy consumption at 140 TWh, ~0.08% of energy production, and estimate that "it should reach several tenths of one percent of global energy usage." So at 0.2% (a generous reading of 'several tenths'), that would be 340 TWh annually, which makes it 1.33% of annual electrical energy production.
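Again, feel free to check the arithmetic. Using the article's own numbers and my 0.2% reading of 'several tenths':

```python
total_energy_twh = 170_000    # article's basis, TWh/yr
electricity_twh = 25_606      # world electricity production, TWh/yr (2017)

# 0.2% of total energy = the article's projected Bitcoin consumption
bitcoin_twh = 0.002 * total_energy_twh
print(f"Projected Bitcoin use: {bitcoin_twh:.0f} TWh/yr")          # -> 340 TWh/yr

pct_of_electricity = 100 * bitcoin_twh / electricity_twh
print(f"As % of electricity: {pct_of_electricity:.2f}%")           # -> 1.33%
```

Same quantity of energy, but 0.2% becomes 1.33% the moment you measure it against the right baseline.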
Is it OK to use so much energy just because it is 'only 1.33%' of the total? That's not a very helpful way to think about energy efficiency. These are big numbers, so let's add some context. What can we do with 340 TWh?
Assume an EV uses 330 Wh per mile and the average driver drives 12,000 miles/year. That is 3.96 MWh per year per EV (3.96 × 10^6 watt-hours, to keep the decimal points straight).
That could supply all the power needed for 85.86 million EVs (85,858,586, to keep those decimal points straight)!
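Here's that EV comparison worked through, decimal points and all (330 Wh/mile and 12,000 miles/year are my assumptions, stated above):

```python
wh_per_mile = 330            # assumed EV consumption
miles_per_year = 12_000      # assumed average annual mileage

# Per-EV annual use: 330 * 12,000 = 3,960,000 Wh = 3.96 MWh
ev_mwh_per_year = wh_per_mile * miles_per_year / 1e6
print(f"Per EV: {ev_mwh_per_year} MWh/yr")                 # -> 3.96 MWh/yr

# 340 TWh = 340,000,000 MWh; divide by per-EV use
evs_supported = 340 * 1e6 / ev_mwh_per_year
print(f"EVs supported: {evs_supported:,.0f}")              # -> 85,858,586
```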
That's a lot of electrical energy, any way you slice it.
So yes, that article 'addresses it', using bad metrics and a hand-wave at a large number. If you break it down, the article actually demonstrates just what a huge amount of energy Bitcoin requires. But they try to dismiss it by (falsely) claiming it is a low % overall. [MOD EDIT]
-ERD50