Facts and True Facts: More on Divorce

My initial guest post
noted that recent
divorce statistics were widely misinterpreted,
both by the media
and by the academics interviewed by the press. The question is: what went wrong with the latest data?

First, some necessary background. This report
was published by the Census Bureau, counting the proportion of those
who had wed in each year who subsequently celebrated various
anniversaries. Here’s a quick test: Look
at the data, and decide for yourself what is happening to marital
stability. Or if you are lazier, let me
help with an example: the Census reported that 76.4% of men whose first wedding
occurred in 1985-89 had celebrated a tenth anniversary; this declined rather
dramatically to 70.0% among those marrying in 1990-94. By jingo, it looks like recent marriages have
become much less stable!

Not so fast. The
marital history data were collected from July-September 2004, and hence those
who had married in, say, October 1994, simply
could not have reached their tenth anniversary
by the survey date. Because this affected around one in ten of
those wed from 1990-94, this statistical factor alone explains what looked like
a decline in marital stability.
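To see how far this censoring alone can go toward explaining the apparent drop, here is a rough back-of-envelope calculation (my own sketch, not the Census Bureau's method), using the reported rates and the footnote's figure that roughly one in ten of the 1990-94 cohort had not yet had ten years of marriage by the survey:

```python
# Back-of-envelope check of the censoring bias described above.
# Figures taken from the Census table and its footnote:
reported_1985_89 = 76.4   # % of men married 1985-89 reaching a 10th anniversary
reported_1990_94 = 70.0   # % of men married 1990-94 reaching one by the survey date
censored_share = 0.10     # footnote: ~10% of the 1990-94 cohort hadn't had 10 years yet

# If the censored tenth of the cohort is effectively counted as "no
# anniversary," the reported rate understates true marital survival.
# Assuming (heroically) that the censored marriages would survive at
# the same rate as the uncensored ones:
adjusted_1990_94 = reported_1990_94 / (1 - censored_share)

print(f"Reported 1990-94 rate: {reported_1990_94:.1f}%")   # 70.0%
print(f"Adjusted 1990-94 rate: {adjusted_1990_94:.1f}%")   # ~77.8%
```

Under that crude adjustment the 1990-94 rate comes out around 77.8%, right in line with the 76.4% of the 1985-89 cohort rather than below it.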

How do we interpret what happened?

  1. The
    Census Bureau reported true and useful facts:
    The data are interesting, and
    the table includes a small footnote, noting “Approximately 10 percent of the
    cohort has not reached the stated age by the end of the latest specified time
    period. Because of this, estimates for this group for the highest anniversary
    are low.”  With this qualification, one
    should not conclude that divorce is rising. (But what should one conclude? No
    guidance is given.)
  2. The
    Census Bureau reported true, but useless facts:
    The table measures exactly
    what it says it measures. The Census
    Bureau is like Fox News:
    We report, you decide. And we report,
    even if the number we report is meaningless.
  3. The Census
    Bureau reported misleading facts:
    It is obvious that a qualifying footnote will
    be ignored by most. Indeed, the New York
    Times printed
    the table
    but omitted the footnote. But
    let’s not be too harsh on the NY Times: I talked about these data with several excellent
    economists, and none even noticed the
    footnote. Headline numbers deserve
    headline qualifications.
  4. The
    survey was flawed:
    If the Census is interested in measuring the survival of
    a set of marriages to their tenth anniversary, then failing to wait ten years after
    a wedding to measure this is a surveying glitch.

So what is the mission of a statistical agency? If the Census’ job is to just report back
what we (the surveyed population) tell them, then they performed that task
adequately. If their job is to report
parameters – useful facts – then they failed miserably, as the data they
reported are hopelessly biased indicators of marital stability. Alternatively, the question is: Does the
Census provide facts, or interpretation? I’m happy if they present only facts and leave the interpretation to experts. But is there an obligation to report only interpretable facts?

Stephen Colbert’s term “truthiness”, the reigning word of the year, refers to what you want the facts to be, as opposed to what the facts are. I’m wondering what the right word is for something that is a fact but isn’t true. Untruthiness, anyone?
