Optimal balance is a numerical method for computing a point on the slow manifold of a two-scale dynamical system whose fast time scale is characterized by rapid oscillations. We describe how the method is based on the concept of adiabatic invariance of the fast degrees of freedom, show that an implementation by backward-forward nudging is quasi-convergent (convergent up to exponentially small residuals), and give examples of the use of optimal balance in geophysical fluid dynamics.
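To illustrate the backward-forward nudging idea, the following is a minimal sketch on a hypothetical toy fast-slow system (not one of the paper's geophysical examples): a slow real variable x coupled to a fast complex oscillator z obeying dx/dt = rho * Re(z), dz/dt = (i/eps) z + rho * x, where rho ramps the coupling from 0 (linear regime, where balance means z = 0) to 1 (the full system). The scheme alternates backward and forward integrations over the ramp, imposing the linear-end balance condition z = 0 at one end and restoring the prescribed slow variable at the other; all model choices (eps, T, the ramp shape, the coupling terms) are illustrative assumptions.

```python
import math

EPS = 0.1   # time-scale separation parameter (toy model assumption)
T = 5.0     # ramp duration; longer ramps give smaller residuals
DT = 0.005  # step size, well below the fast period 2*pi*EPS

def rho(theta):
    # Smooth ramp with rho(0) = 0, rho(1) = 1 and flat ends;
    # exponential-type ramps are what yields exponentially small residuals.
    theta = min(max(theta, 0.0), 1.0)
    if theta == 0.0:
        return 0.0
    if theta == 1.0:
        return 1.0
    a = math.exp(-1.0 / theta)
    b = math.exp(-1.0 / (1.0 - theta))
    return a / (a + b)

def rhs(t, x, z):
    # Toy fast-slow system: for frozen rho, the exact slow manifold
    # is z = i * EPS * rho * x, so the balanced end states are known.
    r = rho(t / T)
    return r * z.real, (1j / EPS) * z + r * x

def rk4_step(t, x, z, dt):
    k1x, k1z = rhs(t, x, z)
    k2x, k2z = rhs(t + dt / 2, x + dt / 2 * k1x, z + dt / 2 * k1z)
    k3x, k3z = rhs(t + dt / 2, x + dt / 2 * k2x, z + dt / 2 * k2z)
    k4x, k4z = rhs(t + dt, x + dt * k3x, z + dt * k3z)
    return (x + dt / 6 * (k1x + 2 * k2x + 2 * k3x + k4x),
            z + dt / 6 * (k1z + 2 * k2z + 2 * k3z + k4z))

def integrate(x, z, t0, t1, n):
    # Fixed-step RK4 from t0 to t1 (dt is negative for the backward pass).
    dt = (t1 - t0) / n
    t = t0
    for _ in range(n):
        x, z = rk4_step(t, x, z, dt)
        t += dt
    return x, z

def optimal_balance(x_target, n_iter=12, tol=1e-9):
    # Backward-forward nudging: iterate backward/forward sweeps over the
    # ramp, enforcing z = 0 at the linear end and x = x_target at the
    # nonlinear end, until the fast component z at t = T stops changing.
    z_T = 0.0 + 0.0j
    n = int(T / DT)
    for _ in range(n_iter):
        x0, z0 = integrate(x_target, z_T, T, 0.0, n)  # backward sweep
        z0 = 0.0 + 0.0j                               # linear-end balance
        xT, zT_new = integrate(x0, z0, 0.0, T, n)     # forward sweep
        res = abs(zT_new - z_T)
        z_T = zT_new                                  # slow variable is reset
        if res < tol:                                 # to x_target next sweep
            break
    return x_target, z_T, res

x_b, z_b, res_b = optimal_balance(1.0)
print("balanced point:", x_b, z_b, "residual:", res_b)
```

For this toy model the exact slow manifold is z = i * eps * x, so the computed fast component should come out close to 0.1j for x_target = 1, with the remaining error set by the ramp time and the discretization. Note the sketch nudges only the boundary conditions of each sweep; in practice one must also choose the ramp time T to balance adiabatic error (too short) against drift along the slow manifold (too long).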