This article optimizes lithium-ion battery management in a datacenter to (i) maximize the dollar savings attainable through peak shaving and (ii) minimize battery degradation. To the best of the authors’ knowledge, such multi-objective optimal datacenter battery management remains relatively unexplored. We solve this optimization problem using a second-order model of battery charge dynamics, coupled with a physics-based model of battery aging via solid electrolyte interphase (SEI) growth. Our optimization study focuses on a classical feedforward-feedback energy management policy, where feedforward control is used for peak shaving and feedback is used for tracking a desired battery state of charge (SOC). Three feedforward-feedback architectures are examined: a proportional (P) control architecture, a proportional-integral (PI) architecture, and a PI architecture with a deadband in its feedforward path. We optimize each architecture’s parameters using differential evolution, applied to real datacenter power demand histories. Our results show a significant Pareto tradeoff between dollar savings and battery longevity for all architectures. The introduction of a deadband furnishes a more attractive Pareto front by allowing the feedforward controller to focus on shaving larger peaks. Moreover, the use of integral control improves the robustness of the feedback policy to demand uncertainties and battery pack sizing.
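To make the policy structure concrete, the following is a minimal sketch of a feedforward-feedback battery dispatch rule of the kind described above: the feedforward path discharges the battery only when demand exceeds a peak-shaving threshold by more than a deadband, while a PI feedback loop steers the SOC back toward a target. All function names, parameter values, and the simplified integrator SOC model are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a PI feedforward-feedback peak-shaving policy
# with a feedforward deadband. Parameter values and the first-order SOC
# integrator are illustrative only, not taken from the article.

def battery_dispatch(demand, threshold, deadband, soc0=0.5,
                     capacity_kwh=100.0, kp=0.2, ki=0.01, dt=1.0):
    """Return (battery power in kW, +ve = discharge; SOC trajectory)."""
    soc, integral = soc0, 0.0
    target_soc = soc0
    powers, socs = [], []
    for d in demand:
        # Feedforward: shave only peaks exceeding threshold + deadband,
        # so small excursions are ignored and effort goes to large peaks.
        excess = d - threshold
        ff = excess if excess > deadband else 0.0
        # PI feedback: recharge toward the target SOC (negative = charge).
        err = target_soc - soc
        integral += err * dt
        fb = -(kp * err + ki * integral) * capacity_kwh
        p = ff + fb
        # Simplified charge dynamics: pure integrator with SOC limits.
        soc = min(max(soc - p * dt / capacity_kwh, 0.0), 1.0)
        powers.append(p)
        socs.append(soc)
    return powers, socs
```

In this sketch, widening the deadband suppresses feedforward action on minor demand excursions, which is the mechanism the abstract credits for the improved Pareto front; the integral term removes steady-state SOC offset under persistent demand bias.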