As AI models increasingly consume more energy, especially in data centers, it’s vital to adopt energy-efficient tools to mitigate their impact. The Lincoln Laboratory Supercomputing Center is developing methods to help data centers reduce energy use, from simple measures like power-capping hardware to advanced techniques that can halt AI training early. By demonstrating that these strategies have minimal impact on model performance and could significantly cut energy consumption, it is hoped to spur widespread adoption.