New Delhi, Aug 6, 2024: An effective way to minimise the environmental footprint of Artificial Intelligence (AI) is to move IT workloads from on-premises infrastructure to AWS cloud data centres in India and around the globe, according to a new study.
The study, commissioned by Amazon Web Services (AWS) and conducted by Accenture, estimates that AWS’s global infrastructure is up to 4.1 times more efficient than on-premises infrastructure. For Indian organisations, the total potential carbon reduction for AI workloads optimised on AWS is up to 99% compared with on-premises data centres.
The research states that simply running compute-heavy workloads, such as AI, in AWS data centres in India yields a 98% reduction in carbon emissions compared with on-premises data centres. This is attributed to AWS’s use of more efficient hardware (32%), improvements in power and cooling efficiency (35%), and additional carbon-free energy procurement (31%). Further optimising on AWS by leveraging purpose-built silicon can increase the total carbon reduction potential of AI workloads to up to 99% for Indian organisations that migrate to and optimise on AWS.
“Considering 85% of global IT spend by organisations remains on-premises, a carbon reduction of up to 99% for AI workloads optimised on AWS in India is a meaningful sustainability opportunity for Indian organisations,” said Jenna Leiner, Head of Environmental, Social and Governance (ESG) and External Engagement, AWS Global.
“As India accelerates towards its US$1 trillion digital opportunity and encourages investments into digital infrastructure, sustainability innovations and minimising IT-related carbon emissions will be critical in also helping India meet its goal of net-zero emissions by 2070. This is particularly important given the rising adoption of AI. AWS is constantly innovating for sustainability across our data centres, optimising our data centre design, investing in purpose-built chips, and innovating with new cooling technologies, so that we continuously increase energy efficiency to serve customer compute demands.”
“This research shows that AWS's focus on hardware and cooling efficiency, carbon-free energy, purpose-built silicon, and optimized storage can help organizations reduce the carbon footprint of AI and machine learning workloads,” said Sanjay Podder, global lead for Technology Sustainability Innovation at Accenture. “As the demand for AI continues to grow, sustainability through technology can play a crucial role in helping businesses meet environmental goals while driving innovation,” Podder added.
As per the study, running generative AI applications in a more sustainable way requires innovation at the silicon level with energy-efficient hardware. To optimise performance and energy consumption, AWS developed purpose-built silicon, such as the AWS Trainium and AWS Inferentia chips, to achieve significantly higher throughput than comparable accelerated compute instances. AWS Trainium cuts the time taken to train generative AI models, in some cases from months to hours. This means building new models requires less money and power, with energy-consumption reductions of up to 29%.
According to the study, AWS’s additional carbon-free energy procurement in India contributes a 31% reduction in carbon emissions for compute-heavy workloads and 44% for storage-heavy workloads. Aligning with Amazon's commitment to achieving net-zero carbon emissions across all operations by 2040, AWS is rapidly transitioning its global infrastructure to match electricity use with 100% carbon-free energy. Amazon met its 100% renewable energy goal seven years ahead of schedule. In India, 100% of the electricity consumed by AWS data centres in 2022 and 2023 was matched with renewable energy sources procured in-country. This is due to Amazon’s investment in 50 renewable energy projects in India with an estimated 1.1 gigawatts of capacity, enough to power more than 1.1 million homes in New Delhi each year.