Green AI: Making Machine Learning Environmentally Sustainable
AI's rapid growth comes with a significant and largely hidden environmental cost: Microsoft and Google have both reported carbon emissions surging by 30-50% in recent years, driven primarily by AI infrastructure.
In this talk, Charles explores the environmental impact of large language models and offers practical strategies for reducing it at each stage of the AI lifecycle. Topics include demand shifting and shaping, model compression techniques (pruning, distillation, and quantisation), federated learning, speculative decoding, and how choosing where and when to run workloads can reduce software carbon emissions by up to 99%.
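The "choosing where and when to run workloads" idea can be illustrated with a minimal carbon-aware scheduling sketch: before launching a job, pick the region whose grid carbon intensity is currently lowest. The region names and intensity figures below are illustrative placeholders, not real measurements, and `pick_greenest_region` is a hypothetical helper, not part of any library the talk references.

```python
def pick_greenest_region(intensities: dict[str, float]) -> str:
    """Return the region whose grid carbon intensity (gCO2eq/kWh) is lowest.

    A real scheduler would fetch live intensity data (e.g. from a grid
    data API) and also weigh latency, cost, and data-residency rules.
    """
    return min(intensities, key=intensities.get)


if __name__ == "__main__":
    # Illustrative snapshot of grid carbon intensities (not real data).
    sample = {
        "us-east": 450.0,   # fossil-heavy grid
        "eu-north": 30.0,   # hydro-heavy grid
        "ap-south": 600.0,  # coal-heavy grid
    }
    print(pick_greenest_region(sample))  # prints "eu-north"
```

Demand *shifting* moves the job to a greener region or a greener hour; demand *shaping* additionally scales the work itself (for instance, serving a smaller distilled model) when only dirty power is available.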