Contemporary numerical methods for nonlinear optimization possess strong global and fast local convergence guarantees for feasible problems under common assumptions. They also often provide guarantees for (eventually) detecting whether a problem is infeasible, though in such cases there are typically no guarantees of fast local convergence. This is a critical deficiency: in the optimization of complex systems, one often finds that nonlinear optimization methods fail or stall due to minor constraint incompatibilities. Such behavior may suggest that the problem is infeasible, but without an infeasibility certificate, no useful result is provided to the user. We present a sequential quadratic optimization (SQO) method that possesses strong global and fast local convergence guarantees for both feasible and infeasible problem instances. Theoretical results are presented along with numerical results indicating the practical advantages of our approach.
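
For context, the following is a minimal sketch of a standard setting in which such guarantees can be stated; the precise formulation, constraint partition, and violation measure are assumptions for illustration and are not specified in this abstract. One considers the nonlinear optimization problem
\[
  \min_{x \in \mathbb{R}^n} \ f(x)
  \quad \text{subject to} \quad
  c_{\mathcal{E}}(x) = 0, \quad c_{\mathcal{I}}(x) \leq 0,
\]
and, when this problem is infeasible, regards as a useful certificate a (local) minimizer of a constraint violation measure such as
\[
  v(x) := \|c_{\mathcal{E}}(x)\|_1 + \|\max\{c_{\mathcal{I}}(x),\, 0\}\|_1 ,
\]
with the maximum taken componentwise. In this setting, rapid infeasibility detection means fast local convergence to a minimizer of $v$ with $v(x) > 0$, rather than mere eventual detection.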