
The Power Paradox: A Review of the Challenges and Solutions to the Energy Efficiency of AI and Cloud Computing
Ammar Jiruwala

Ammar Jiruwala, Navrachana International School Vadodara, Gujarat, India.    

Manuscript received on 15 September 2024 | Revised Manuscript received on 29 September 2024 | Manuscript Accepted on 15 December 2024 | Manuscript published on 30 December 2024 | PP: 11-18 | Volume-14 Issue-2, December 2024 | Retrieval Number: 100.1/ijeat.B455414021224 | DOI: 10.35940/ijeat.B4554.14021224

© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: Artificial intelligence (AI) and cloud computing have raised significant concerns about their environmental impact, particularly in terms of energy consumption and carbon emissions. This review paper provides a comprehensive analysis of energy consumption trends in AI, with a particular focus on inference costs in both cloud and edge computing scenarios. By consolidating data from recent research, this paper presents a nuanced view of energy consumption trends, distinguishing between cutting-edge models and those in general use. The findings reveal that while state-of-the-art AI models show exponential growth in energy consumption, average models demonstrate more stable or even decreasing energy use patterns, largely due to improvements in hardware efficiency and algorithmic innovations. The review also explores potential solutions to mitigate AI's environmental impact, including advanced hardware designs, energy-efficient algorithms, and novel data acquisition techniques.

Keywords: Artificial Intelligence, Energy Efficiency, Cloud Computing, Edge Computing, Environmental Impact.
Scope of the Article: Artificial Intelligence and Methods