Efficiently solving saddle point systems such as Karush–Kuhn–Tucker (KKT) systems is crucial for many algorithms in constrained nonlinear continuous optimization. Such systems can be very ill-conditioned, in particular when the (1,1) block has a few very small eigenvalues (see Rusten and Winther [SIAM J. Matrix Anal. Appl., 13 (1992), pp. 887–904]). However, it is commonly observed that, despite these small eigenvalues, an interaction occurs between the (1,1) block and the (1,2) block that may strongly influence the convergence of Krylov subspace methods such as Minres. In this paper, we highlight some aspects of this interaction. In particular, we illustrate with examples how, and in which circumstances, the convergence of Minres may be affected by these few very small eigenvalues in the (1,1) block. We further derive a theoretically tighter lower bound on the positive eigenvalues of saddle point matrices in KKT form.
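The interaction described above can be observed numerically. The following sketch (an illustrative toy example, not taken from the paper; the dimensions, random constraint block, and tolerances are assumptions) builds a symmetric indefinite saddle point matrix K = [[A, B], [B^T, 0]] whose symmetric positive definite (1,1) block A has a few eigenvalues of order 1e-6, then checks the smallest positive eigenvalue of K and runs SciPy's Minres on it:

```python
import numpy as np
from scipy.sparse import bmat, diags
from scipy.sparse.linalg import minres

# Toy KKT-type saddle point system (sizes chosen for illustration only).
rng = np.random.default_rng(0)
n, m = 100, 20

# SPD (1,1) block A with a few very small eigenvalues (here 1e-6).
eigs = np.ones(n)
eigs[:3] = 1e-6
A = diags(eigs)

# Generic full-rank (1,2) block B.
B = rng.standard_normal((n, m))

# K = [[A, B], [B^T, 0]] is symmetric indefinite.
K = bmat([[A, B], [B.T, None]]).tocsr()

# Despite lambda_min(A) = 1e-6, the smallest *positive* eigenvalue of K
# is typically much larger, because the small eigenvectors of A are not
# aligned with the nullspace of B^T: the blocks interact.
evals = np.linalg.eigvalsh(K.toarray())
pos_min = evals[evals > 0].min()
print("lambda_min(A) =", eigs.min(), " smallest positive eig of K =", pos_min)

# Minres is applicable since K is symmetric (indefinite).
b = rng.standard_normal(n + m)
x, info = minres(K, b, maxiter=5000)
rel_res = np.linalg.norm(b - K @ x) / np.linalg.norm(b)
print("minres info =", info, " relative residual =", rel_res)
```

Varying how strongly the small eigenvectors of A overlap with the nullspace of B^T changes the smallest positive eigenvalue of K, and with it the Minres convergence behavior, which is the phenomenon the paper studies.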