
The Cost of Artificial Intelligence

Aside from the obvious (i.e., destruction of the planet, along with every creature on it including the human race, by a Skynet-wielding lunatic), how much does AI cost us? What do we pay, lose or donate?

Well, as it turns out, a lot. Several academics and businesses have published work on what these units of computation cost, both for your wallet and for the planet. By Google's own estimates (from 2009), each AI-powered search consumes 0.0003 kWh of energy [1], equivalent to powering a 60W light bulb for 17 lovely seconds, and ejects roughly 0.2g of delicious CO2 into the atmosphere. At roughly 40,000 searches per second, that's 8kg of CO2 every second: 8kg, 16kg, 24kg, and so on. By the time you finish reading this article, probably a tonne or two. The green plants of the world must be thankful.

Now, in 2021, each search perhaps uses less energy, owing to more efficient hardware and algorithms. Then again, given the increasing complexity of Google's main business, which ties together disparate data streams from your browsing history and from online advertisers, it may well be higher. The fact is, AI consumes an enormous amount of computing power to build and to run. Think about that next time Netflix uses AI to recommend you the latest anti-fossil-fuel documentary, which is extra ironic given that most of Netflix's compute is purchased from Amazon, which powers around 50% of its data centres with coal and gas.
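The arithmetic above is easy to reproduce. A minimal sketch, using Google's own 2009 per-search figures [1] (historical estimates, not current measurements):

```python
# Back-of-envelope figures from Google's 2009 estimate [1].
KWH_PER_SEARCH = 0.0003       # energy per search, in kWh
CO2_G_PER_SEARCH = 0.2        # grams of CO2 per search
SEARCHES_PER_SECOND = 40_000  # approximate global search rate
BULB_WATTS = 60               # a classic incandescent bulb

# 0.0003 kWh is 0.3 Wh; how long does that run a 60W bulb?
bulb_seconds = KWH_PER_SEARCH * 1000 * 3600 / BULB_WATTS

# Aggregate emission rate across all searches, in kg per second.
co2_kg_per_second = CO2_G_PER_SEARCH * SEARCHES_PER_SECOND / 1000

# Seconds until a full tonne (1,000 kg) of CO2 has been ejected.
seconds_to_a_tonne = 1000 / co2_kg_per_second
```

This gives 18 seconds of bulb time (the "17 seconds" in the text rounds the same estimate), 8 kg of CO2 per second, and a tonne of CO2 roughly every two minutes, consistent with "a tonne or two" over the length of this article.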

AI models are becoming massive. We're talking billions of individual parameters conjoined into multi-stranded networks of neurons, transformers and matrices. Training a common Natural Language Processing (NLP) model has been found to produce CO2 emissions equivalent to those of the construction and lifetime running, fuel included, of five cars [2], or those of a flight between New York and San Francisco. The monetary cost of this training is beyond the means of most people on the planet, estimated at anywhere between $80k and $1.6m for models at the scale of BERT [3], a widely used language model created by Google for "understanding" user searches. Still, for most governments and large corporations, whilst not a trivial expense, it's a reasonable one for anyone with a large IT budget.
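The car and flight comparisons can also be checked against the CO2-equivalent figures reported by Strubell et al. [2] (in pounds, as given in their paper; the "five cars" figure applies to their most expensive case, a transformer trained with neural architecture search):

```python
# CO2-equivalent emissions in lbs, as reported in Strubell et al. [2].
CAR_LIFETIME_LBS = 126_000   # one average car: manufacture plus lifetime fuel
NAS_TRAINING_LBS = 626_155   # transformer trained with neural architecture search
FLIGHT_NY_SF_LBS = 1_984     # one passenger flying New York to San Francisco
BERT_TRAINING_LBS = 1_438    # a single training run of BERT-base

cars_equivalent = NAS_TRAINING_LBS / CAR_LIFETIME_LBS       # about 5 cars
flights_equivalent = BERT_TRAINING_LBS / FLIGHT_NY_SF_LBS   # about 0.7 flights
```

That is, the largest training run in the study emits roughly five car-lifetimes of CO2, while a single BERT training run sits in the same ballpark as one transcontinental flight.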

Ultimately, the budget for AI comes from the consumer: an upwards flow of cash, from pay cheques spent on your favourite brand of mayonnaise to that drone you bought because of a well-timed advertisement seen whilst scrolling Instagram. Which leads conveniently on to the cost of privacy. Consumer data is used so widely across the marketing and analytics arms of businesses that it's possible to purchase the political preferences of pretty much anyone, along with whether they have pets, children or fancy furniture (what does your credit card history say about you?). Combining knowledge of a holiday destination with a single large grocery bill is probably enough to place a family accurately on the income spectrum. Each and every online advertisement you see has been placed by an AI model trained to maximise the probability that you will see it, notice it and eventually click on it; if not this time, then the next time, or the time after that. In fact, by not clicking on it, the model can learn something about you.

The data fed into AI machines is valuable, probably more so than the resulting models, and the machines are hungry. It's likely that in the not-too-distant future, every data-gathering project will list an AI model among its expected outputs. Experian plc sells, at considerable cost, 6,000 data points on each of the 14 million individuals in its UK database. When combined with real-time preferences (Google searches, Facebook likes, Zoom conversations and Amazon purchases), the outputs can be phenomenally powerful. Perhaps Google's former CEO put it best:

“With your permission, you give us more information about you, about your friends, and we can improve the quality of our searches. We don’t need you to type at all. We know where you are. We know where you’ve been. We can more or less know what you’re thinking about.”
— Eric Schmidt, 2010.

[1] https://googleblog.blogspot.com/2009/01/powering-google-search.html
[2] Strubell, E., Ganesh, A. and McCallum, A. (2019). Energy and Policy Considerations for Deep Learning in NLP. [online] Available at: https://arxiv.org/pdf/1906.02243.pdf.
[3] Sharir, O., Peleg, B. and Shoham, Y. (2020). The Cost of Training NLP Models. [online] Available at: https://arxiv.org/pdf/2004.08900.pdf.
