I recently finished my first set of classes toward a BS in data analytics. It’s not very useful advice now, I hope, but I wouldn’t necessarily recommend attempting this in the academic quarter containing the most critical election of your lifetime. That notwithstanding, this means I’ve spent the past several weeks immersed in discussions of deriving meaning from numbers.
As I was gearing up to start my term, a “debate” broke out on Twitter. I put the term in scare quotes, because what actually happened is that one group offered reasoning and explanations and another group pointed at the first and had vapors about the end of Western civilization. The question at hand? “Does 2 + 2 = 4?” The answer some people found objectionable? “Sometimes. Not always.”
Surrounded as I’ve been this term by issues of data quality, making assumptions explicit, the limits of the most common statistical tests, and error terms, this “debate” has never been far from my mind. I saw an echo of it again today, and lo and behold, I finally have some time to write about it.
The boys who cry “Postmodernism!” without much understanding of the history of philosophy are all but background noise these days, so I mostly noted their existence once again and moved on. Funnily enough, though, this actually is a postmodernism question. This is all about deconstructing the meaning of the equation. Are we talking about some ideal of “2” and “4”, or are we communicating about something else, where “2” and “4” are abstractions of reality that may be more or less reflective of that reality?
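To make that “sometimes” concrete in the way the title hints at: if the 2s in question are rounded-off measurements rather than Platonic integers, the arithmetic can legitimately come out to 5. A minimal sketch in Python, with made-up measurement values purely for illustration:

```python
# If "2" is really a rounded measurement, the sum of the rounded parts
# can disagree with the rounded sum of the underlying quantities.
a_measured = 2.4   # reported as 2
b_measured = 2.4   # reported as 2

print(round(a_measured) + round(b_measured))   # 2 + 2 = 4
print(round(a_measured + b_measured))          # round(4.8) = 5
```

Nothing mystical happened there; the abstraction “2” just lost information about the quantity it was standing in for.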
Also, does anyone else laugh when someone claims that questioning the perfect, inherent two-ness of “2” will be the end of civilization as we know it?