
Human-Centered Child Support – The Role of Data Science & Behavioral Design
By James Guszcza, US Chief Data Scientist, Deloitte Consulting
The data science (AI and “playing Moneyball”) and behavioral insights (“Nudge”) revolutions are two of the signature issues of our time. But too often they are discussed in tactical, siloed terms rather than strategic, holistic terms. At first blush, the two topics might seem to have little in common. After all, aren’t AI and data science about algorithms and digital technology, while behavioral insights is about psychology? Well, kind of. But viewed properly, data science and behavioral science can function as complementary parts of a greater – more effective – whole. And together, they provide us with a modern toolkit that can be used to make the child support system more satisfying for struggling parents and child support professionals alike.
The connecting theme is human-centered design: people are more likely to do the right thing if you give them the right tools and information, and make it easy for them. Put another way, policies, programs, and tools are more effective if they are designed to go with, not against, the grain of human psychology.
Playing Moneyball
A major finding from psychology in the past 40 years is that unaided human intuition, while great at many things, is terrible at statistics. This is a major theme of Daniel Kahneman’s masterful book Thinking, Fast and Slow. Predictive algorithms can weigh together 500 risk factors more accurately than even trained experts can weigh together 5 – particularly when said expert is suffering from low blood sugar! My own experience as a data scientist is consistent with decades of psychological research and industrial case studies: predictive algorithms run circles around unaided professional judgment in such disparate realms as identifying non-custodial parents at risk of falling behind on child support payments, estimating the riskiness of insurance contracts, making effective hiring and promotion decisions, making medical diagnoses, and matching case workers with clients.
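For readers who want a feel for what such a risk-scoring algorithm looks like in practice, here is a minimal, purely illustrative sketch in Python. The features, data, and “fell behind on payments” outcome are all synthetic stand-ins, not any agency’s actual case data or production model; the point is simply that a statistical model can weigh many factors simultaneously and rank cases for attention.

```python
# Illustrative sketch only: a toy risk-scoring model that "weighs together" many
# factors at once. All features and outcomes below are synthetic and hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_cases, n_factors = 5_000, 50            # many weak risk factors, not just a handful

X = rng.normal(size=(n_cases, n_factors))
true_weights = rng.normal(scale=0.3, size=n_factors)
p_delinquent = 1 / (1 + np.exp(-(X @ true_weights - 0.5)))
y = rng.binomial(1, p_delinquent)         # 1 = fell behind on payments (synthetic)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]        # risk score per case
print("Out-of-sample AUC:", round(roc_auc_score(y_test, scores), 3))

# Rank cases so that limited case-worker attention goes to the highest-risk ones first.
priority_order = np.argsort(-scores)
```

The specific algorithm matters less than the workflow: score cases on many factors at once, validate the scores on held-out data, and use them to prioritize – not replace – professional judgment.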
Does this imply that artificial intelligence algorithms can replace child support case workers, just as IBM Watson beat Ken Jennings and Brad Rutter on Jeopardy? No, and an analogy helps show why: just as we prescribe eyeglasses to help people overcome nearsightedness, algorithms can serve as “cognitive eyeglasses”: they are tools that help us overcome the cognitive biases associated with what Kahneman calls “Thinking Fast” (aka “System 1”). They can also automate simple tasks and “spadework”. This simultaneously provides economic benefits and helps busy professionals raise their game by freeing up energy and mental cycles to direct their precious professional judgment and human empathy to the cases that need them most.
Steve Jobs memorably called computers “bicycles for the mind”, and this is also the best way to think about predictive models and other Artificial Intelligence algorithms. They don’t replace human minds; they extend human minds. It’s no accident that Jobs is remembered as a leading design thinker, not just a technologist.
Improved engagement by design
But the story doesn’t end here. It’s a truism that no predictive algorithm will provide value unless it is appropriately acted upon. Algorithms can help prioritize risks and match the right cases to the right case workers. But by themselves, algorithms cannot induce the needed behavior change on the part of struggling parents. I call this the “last mile problem” of predictive analytics. This is where the new science of choice architecture – Nudge – comes in.
Choice architecture is about harnessing behavioral insights to make small design tweaks to choice environments in ways that can have surprisingly large impacts on behavior. For example, if someone (say, an NCP without a job or work history) pre-commits to a certain goal using specific language (“I will spend Wednesday morning at the library from 8-11 AM doing x, y, and z to find a new job”), they are more likely to follow through than if they vaguely express their good intentions. Another example: people do “mental accounting” – they think in terms of mental cookie jars. An unexpected gift certificate “feels” like mad money, even though it could be used to help pay the rent. Setting up a dedicated bank account, or simply using smart data visualization, might be a way of harnessing mental accounting to help parents achieve their goals. A third, even simpler example: printing letters on colored paper and using clear language rather than bureaucratese can help ward off the “ostrich effect” and boost engagement.
Some of the most effective choice architecture tools involve social factors. For example, when the UK tax authority added a line to a letter reminding taxpayers that 90% of their neighbors paid their taxes on time, it collected millions of pounds of additional revenue. This was a classic example of harnessing “social proof”: people are more likely to do the right thing when reminded that this is how similar people typically act. Social proof is routinely used to cut down on college campus binge drinking, prompt people to use less electricity, and reduce towel usage in hotel rooms. We have used such nudge tactics – precision-targeted using machine learning algorithms – to help US states cut down on improper unemployment insurance (UI) payments. This combined use of predictive analytics and nudge tactics reduced improper payments by over 30%.
Child support agencies are beginning to adopt such nudge tactics in their written communications. For example:
- The Florida Department of Revenue used a predictive algorithm to identify cases selected for contempt, and field-tested attorney letters that were carefully crafted to nudge parents to contact the agency and make payments. This low-cost intervention resulted in a double-digit ROI.
- The Sacramento and San Joaquin Department of Child Support Enforcement is currently field-testing the effectiveness of a user-friendly “explainer sheet”, complemented with follow-up visits from case workers trained in behavioral economics concepts.
- Similarly, the New York City Office of Child Support Enforcement has tested the effectiveness of clear communications, making progressively more user-friendly edits to an advertising flyer for an arrears forgiveness program. At first, the flyer had a stark appearance and was loaded with jargon. The organization rolled out more user-friendly variants of the flyer over time, making it more readable and visually appealing; incorporating such behavioral insights as social proof, present bias, loss aversion, and time scarcity; and reinforcing the flyer with both magazine ads and personalized follow-up letters.
Note that using a randomized controlled trial (RCT) methodology (the randomly selected control group does not get the treatment) enables us to quantify the effectiveness and the social and economic benefits of such interventions. This “test and learn” approach enables us to scientifically determine what works, and it is at the heart of the behavioral insights approach. While this kind of work takes time and effort, the good news is that (a) much of it is common sense, (b) it can be scientifically and economically justified through RCTs, and (c) it works!
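For the analytically inclined, the basic RCT readout is straightforward to compute. The sketch below compares the response rate of a treatment group that received a behaviorally informed letter with that of a randomly assigned control group that received the standard letter. The counts are invented for illustration, not results from any of the programs described above.

```python
# Illustrative "test and learn" readout: compare response rates between a randomly
# assigned treatment arm and a control arm. All counts below are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

n_treat, n_control = 4_000, 4_000          # cases randomly assigned to each arm
paid_treat, paid_control = 720, 600        # hypothetical counts who responded/paid

rate_treat = paid_treat / n_treat
rate_control = paid_control / n_control
lift = rate_treat - rate_control

# Two-sample test of whether the difference in response rates is statistically real.
z_stat, p_value = proportions_ztest([paid_treat, paid_control], [n_treat, n_control])

print(f"Treatment response rate: {rate_treat:.1%}")
print(f"Control response rate:   {rate_control:.1%}")
print(f"Estimated lift:          {lift:.1%} (p = {p_value:.4f})")
```

The same lift estimate, multiplied by the dollars at stake per case, is what turns a pilot into a business case for scaling the intervention.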
3D thinking
Deploying predictive algorithms to help child support case workers and adopting choice architecture to make life easier for struggling parents are each forms of human-centered design: they help improve different aspects of human decision-making. And they can work together synergistically. Algorithms can proactively identify potential problem cases and the families most vulnerable to economic downturns so that the case worker can focus on prevention, not just enforcement. Such behavioral design tools as clear, user-centered communications, pre-commitment, mental accounting, salience, and social proof (and the list goes on) can be harnessed and field-tested to go “the last mile” from algorithm output to improved outcome.
And ultimately, data science can be used to match the right intervention to the right person to yield the best outcomes for families. For the toughest cases, well-selected case workers can give clients the coaching they need, drawing on empathy, positive psychology, and the latest findings from the science of positive habit formation. For other cases, a well-crafted text or mobile message and a simple commitment device might be all that is necessary. Data science helps us evolve beyond population-level, “one size fits all” interventions toward the right intervention for the right case at the right time.
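As a rough illustration of what “the right intervention for the right case” can look like computationally, the sketch below fits a separate response model for each intervention arm of a (hypothetical) randomized trial and then assigns each new case the arm with the highest predicted response. This is a simplified stand-in for more sophisticated uplift-modeling approaches; the arms, features, and data are all invented for illustration.

```python
# Illustrative sketch of intervention matching: fit one response model per
# intervention arm (using randomized-trial data), then assign each new case the
# arm with the highest predicted response. Arms, features, and data are hypothetical.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(1)
arms = ["letter", "text_reminder", "caseworker_coaching"]

# Synthetic trial data: case features, randomly assigned arm, and observed response.
X = rng.normal(size=(3_000, 10))
assigned = rng.integers(len(arms), size=3_000)
response = rng.binomial(1, 0.2 + 0.1 * (X[:, 0] > 0) * (assigned == 2))

models = {
    arm: GradientBoostingClassifier().fit(X[assigned == i], response[assigned == i])
    for i, arm in enumerate(arms)
}

# For each new case, pick the arm with the highest predicted response probability.
X_new = rng.normal(size=(5, 10))
predicted = np.column_stack([models[a].predict_proba(X_new)[:, 1] for a in arms])
best_arm = [arms[i] for i in predicted.argmax(axis=1)]
print(best_arm)
```

In practice, such a recommendation would be one input among many, with case workers free to override it; the aim is to inform judgment, not replace it.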
Not all of this needs to be tackled at the same time. It’s best to start with relatively easy wins, and make steady progress using an empirical, test-and-learn approach. But even early in the process, it’s best to have a clear set of goals in mind.
To that end, I encourage what might be called a “3D” mindset: data science and digital technology enable us to create wonderful new tools; but when people are involved, human-centered design thinking is necessary for those tools to be effective.
Jim Guszcza is the US Chief Data Scientist at Deloitte Consulting LLP.
Acknowledgement
Thanks to Margot Bean, John White, and Rachel Frey for helpful suggestions.
Further reading
“Next Gen Child Support: Improving Outcomes for Families” by John White, Margot Bean, Tiffany Fishman, John O’Leary
“The Last Mile Problem: How Data Science & Behavioral Science Can Work Together” by Jim Guszcza
“The Importance of Misbehaving: A Conversation with Richard Thaler” by Jim Guszcza

