Automatic differentiation (AD) has had a tremendous impact on deep learning by automatically generating efficient derivative computations. Unlike deep learning, where derivatives are typically dense, graphics applications often require second-order methods to converge, and their Hessians are sparse, which makes them a poor fit for deep-learning AD libraries. Because the dense Hessian computation achievable with traditional AD is asymptotically slow (e.g., cubic rather than linear time), entire papers are dedicated to manually deriving and implementing the sparse Hessian of a particular energy. We propose a simple algorithm for sparse automatic differentiation based on the insight that, applied carefully, dead code elimination can remove unnecessary derivative computations. Our algorithm automatically computes Hessians, is straightforward to implement in existing, general-purpose AD libraries, and enables rapid prototyping and scaling of graphics applications by enriching popular AD libraries with the ability to compute sparse Hessians. We call the realization of our algorithm dead index elimination, a variant of dead code elimination, and implement it in two state-of-the-art AD libraries: JAX and Enzyme. In benchmarks, it achieves state-of-the-art performance, matching hand-coded sparse Hessian implementations. We showcase our results on graphics applications such as UV mapping, discrete shells, and simulation of hyperelastic materials.
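To make the sparsity argument concrete, the following is a minimal JAX sketch (an illustration of the motivation, not the paper's algorithm) of a hypothetical chain energy whose terms each couple only two adjacent variables; the dense Hessian returned by `jax.hessian` is therefore almost entirely zeros, which is exactly the wasted work a sparse approach avoids.

```python
# Minimal sketch (assumed example, not the paper's method): a chain of local
# energy terms has a tridiagonal Hessian, yet dense AD still builds the full
# n-by-n matrix.
import jax
import jax.numpy as jnp

def chain_energy(x):
    # Sum of per-edge energies; each term touches only two adjacent entries of x.
    return jnp.sum((x[1:] - x[:-1] - 1.0) ** 2)

x = jnp.arange(8, dtype=jnp.float32)
H = jax.hessian(chain_energy)(x)  # dense 8x8 matrix, O(n^2) storage
print(jnp.count_nonzero(H), "nonzeros out of", H.size)  # tridiagonal: 22 of 64
```

For a mesh energy with n vertices, the same pattern holds: the number of nonzeros grows linearly with n while the dense Hessian grows quadratically, which is why hand-coded sparse Hessians have traditionally been necessary at scale.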