We present proof-of-concept modeling results that demonstrate a novel way to evaluate theories of the structure of the phonological grammar: we generate predictions of how the behavior of constraint-based grammars and analogical models is likely to change under processing load. Previous results (Moore-Cantwell and Kush, 2019) show systematic differences in participant behavior when novel forms are generated under working memory load versus without load. Under load, participants adhered to fewer of the language's generalizations, produced less intra-speaker variation, and differed more from participant to participant. We modeled memory load in the constraint-based grammar (MaxEnt; Goldwater and Johnson, 2003) by probabilistically ignoring constraints, and in the analogical model (TiMBL; Daelemans et al., 2018) by restricting the search space for the analogical set on the basis of word frequency. Our constraint-based model predicted a more systematic breakdown, more in line with the previous experimental findings, while our analogical model predicted more word-to-word variation.
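The Python sketch below illustrates the two load manipulations in minimal form, under stated assumptions: the constraint names, weights, violation profiles, lexical items, frequencies, and similarity metric are all invented for illustration, and the analogical half is a plain nearest-neighbour stand-in rather than TiMBL's actual algorithm or API; the paper's real grammars and parameters are not reproduced here.

```python
import random
import math
from collections import Counter

# --- Constraint-based side: MaxEnt with probabilistic constraint ignoring ---
# Hypothetical constraint weights; the paper's actual constraint set and
# fitted weights are not shown here.
WEIGHTS = {"*COMPLEX": 2.0, "DEP": 1.5, "MAX": 3.0}

def maxent_probs(candidates, weights, p_ignore=0.0, rng=random):
    """Return MaxEnt candidate probabilities, with each constraint
    independently ignored (weight set to 0) with probability p_ignore,
    as one way of simulating working-memory load."""
    active = {c: (0.0 if rng.random() < p_ignore else w)
              for c, w in weights.items()}
    harmonies = {cand: sum(active[c] * v for c, v in viols.items())
                 for cand, viols in candidates.items()}
    z = sum(math.exp(-h) for h in harmonies.values())
    return {cand: math.exp(-h) / z for cand, h in harmonies.items()}

# Toy tableau: candidate -> constraint violation counts (made up for illustration).
tableau = {
    "blik":  {"*COMPLEX": 1, "DEP": 0, "MAX": 0},
    "bilik": {"*COMPLEX": 0, "DEP": 1, "MAX": 0},
    "bik":   {"*COMPLEX": 0, "DEP": 0, "MAX": 1},
}

print(maxent_probs(tableau, WEIGHTS, p_ignore=0.0))   # no load
print(maxent_probs(tableau, WEIGHTS, p_ignore=0.5))   # under load

# --- Analogical side: frequency-restricted search space ---
# Stand-in for the TiMBL-based model: a simple nearest-neighbour lookup whose
# example base is cut down to the most frequent words under load.
LEXICON = [  # (form, token frequency) -- invented items
    ("blik", 120), ("brik", 80), ("bilik", 15), ("bik", 5),
]

def overlap(a, b):
    """Crude similarity: number of shared characters (illustrative only)."""
    return sum((Counter(a) & Counter(b)).values())

def analogical_set(target, lexicon, top_k=None):
    """Return the nearest neighbours of target; under load, first restrict
    the searchable lexicon to the top_k most frequent words."""
    base = sorted(lexicon, key=lambda fw: -fw[1])[:top_k] if top_k else lexicon
    best = max(overlap(w, target) for w, _ in base)
    return [w for w, _ in base if overlap(w, target) == best]

print(analogical_set("blit", LEXICON))            # full search space
print(analogical_set("blit", LEXICON, top_k=2))   # restricted under load
```

The contrast the sketch is meant to surface: dropping constraints perturbs every word's output distribution in the same way, whereas shrinking the analogical base changes which neighbours are available on a word-by-word basis, which is consistent with the more systematic breakdown predicted by the constraint-based model and the greater word-to-word variation predicted by the analogical one.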