Many important research hypotheses concern conditional relations in which the effect of one predictor varies with the value of another. Such relations are commonly evaluated as multiplicative interactions and can be tested in both fixed- and random-effects regression. Often, these interactive effects must be further probed to fully explicate the nature of the conditional relation. The most common method for probing interactions is to test simple slopes at specific levels of the predictors. A more general method is the Johnson-Neyman (J-N) technique. This technique is not widely used, however, because it is currently limited to categorical-by-continuous interactions in fixed-effects regression and has yet to be extended to the broader class of random-effects regression models. The goal of our article is to generalize the J-N technique to allow for tests of a variety of interactions that arise in both fixed- and random-effects regression. We review existing methods for probing interactions, explicate the analytic expressions needed to expand these tests to a wider set of conditions, and demonstrate the advantages of the J-N technique relative to simple slopes with three empirical examples.
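The contrast between simple slopes and the J-N approach can be sketched numerically. The example below is purely illustrative and is not taken from the article: it simulates data with a continuous-by-continuous interaction, fits y = b0 + b1*x + b2*z + b3*x*z by ordinary least squares, and then (a) tests the simple slope of x at a single chosen value of z and (b) solves for the J-N region of significance, i.e., the range of z over which the simple slope b1 + b3*z is not statistically distinguishable from zero. For simplicity it uses a large-sample critical value of 1.96 in place of the exact t quantile; all variable names and the simulated parameter values are assumptions of this sketch.

```python
import numpy as np

# Simulate data with a known interaction: y = 0.2 + 0.3x + 0.1z + 0.4xz + e
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
z = rng.normal(size=n)
y = 0.2 + 0.3 * x + 0.1 * z + 0.4 * x * z + rng.normal(size=n)

# OLS fit with design matrix [1, x, z, xz]
X = np.column_stack([np.ones(n), x, z, x * z])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b
s2 = resid @ resid / (n - X.shape[1])          # residual variance
cov = s2 * np.linalg.inv(X.T @ X)              # covariance of coefficients

b1, b3 = b[1], b[3]
v11, v13, v33 = cov[1, 1], cov[1, 3], cov[3, 3]
crit = 1.96  # large-sample approximation to the t critical value

# (a) Simple-slopes test: slope of x at one specific level of z (here z = 1)
z0 = 1.0
slope = b1 + b3 * z0
se = np.sqrt(v11 + 2 * z0 * v13 + z0**2 * v33)
print(f"simple slope at z={z0}: {slope:.3f} (|t| = {abs(slope / se):.2f})")

# (b) J-N boundaries: values of z where (b1 + b3 z)^2 = crit^2 * Var(b1 + b3 z),
# a quadratic in z with coefficients:
a = b3**2 - crit**2 * v33
bq = 2 * (b1 * b3 - crit**2 * v13)
c = b1**2 - crit**2 * v11
disc = bq**2 - 4 * a * c
if disc >= 0:
    roots = sorted([(-bq - np.sqrt(disc)) / (2 * a),
                    (-bq + np.sqrt(disc)) / (2 * a)])
    print(f"simple slope of x nonsignificant for z in "
          f"[{roots[0]:.2f}, {roots[1]:.2f}]")
```

Whereas the simple-slopes test in (a) requires choosing particular values of z in advance, the quadratic solved in (b) returns the entire region of z over which the conditional effect of x is (non)significant, which is the generality the J-N technique offers.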