As AI models consume ever more energy, especially in data centers, adopting energy-efficient tools is vital to mitigating their impact. The Lincoln Laboratory Supercomputing Center is developing methods to help data centers reduce energy use, from simple measures like power-capping hardware to more advanced techniques such as halting AI training early once a model has effectively reached its goal. By demonstrating that these strategies have minimal impact on model performance while significantly cutting energy consumption, the center hopes to spur widespread adoption.
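To make these two ideas concrete: on the hardware side, capping an NVIDIA GPU's power draw can be as simple as running a command like `nvidia-smi -pl 250`, which sets a 250-watt limit. The early-stopping idea can be sketched in a few lines of Python; the version below is a generic illustration, not the center's actual method, and the `train_one_epoch` and `validate` functions are hypothetical placeholders for a real training pipeline.

```python
# Minimal early-stopping sketch: halt training once validation loss has not
# improved for `patience` consecutive epochs, saving the energy that further
# epochs would spend on marginal gains. `train_one_epoch` and `validate` are
# hypothetical callables supplied by the surrounding training code.

def train_with_early_stopping(model, train_one_epoch, validate,
                              max_epochs=100, patience=5, min_delta=1e-4):
    best_loss = float("inf")
    epochs_without_improvement = 0

    for epoch in range(max_epochs):
        train_one_epoch(model)
        val_loss = validate(model)

        if val_loss < best_loss - min_delta:
            # Meaningful improvement: record it and keep training.
            best_loss = val_loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1

        if epochs_without_improvement >= patience:
            # Further epochs would likely burn energy for little gain.
            print(f"Stopping at epoch {epoch}: "
                  f"no improvement in {patience} epochs")
            break

    return model
```

The `patience` and `min_delta` parameters control how aggressively training is cut off; tightening them trades a small amount of potential accuracy for larger energy savings.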