Wednesday, September 21, 2016

Supervisors Are Right to Push L.A. County on Replacing SDM

On Sept. 17, Daniel Heimpel reported in this publication on a motion by two members of the Los Angeles County Board of Supervisors that would require the county's child protection agency, the Department of Children and Family Services (DCFS), to re-evaluate how it addresses risk. This comes in the wake of the death of eleven-year-old Yonatan Aguilar, whose family had been the subject of six prior CPS reports, according to records reviewed by the Los Angeles Times.
Four times, Yonatan was found to be at high risk of maltreatment, but the county never even opened a case.
The Supervisors are correct to question the effectiveness of the Structured Decision Making (SDM) protocol that Los Angeles and many other jurisdictions use to guide decision-making in cases of child maltreatment. SDM, described as an "actuarial model" of risk assessment, is a series of questionnaires used at different phases of a case to determine the risk level for a child in a given situation. The social worker checks off the appropriate boxes, and the instrument spits out a risk level that is supposed to inform the decision about how to proceed.
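To make the mechanics concrete, here is a minimal sketch, in Python, of how an actuarial checklist of this kind works. The items, weights and cutoffs below are invented for illustration; they are not the actual SDM instrument.

    # A minimal sketch of an actuarial-style checklist. The items, weights
    # and cutoffs are hypothetical, not the actual SDM instrument.
    SAMPLE_ITEMS = {
        "prior_referrals": 2,          # weight added if the box is checked
        "child_under_five": 1,
        "caregiver_substance_abuse": 2,
        "prior_injury_to_child": 3,
    }

    def risk_level(checked_boxes):
        """Sum the weights of checked items and map the total to a risk band."""
        score = sum(SAMPLE_ITEMS[item] for item in checked_boxes)
        if score >= 5:
            return "high"
        if score >= 3:
            return "moderate"
        return "low"

    # Example: a worker checks two boxes and the tool returns a risk band.
    print(risk_level({"prior_referrals", "prior_injury_to_child"}))  # high

Note that the tool only re-arranges what the worker types in; it consults nothing the worker does not already know.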
SDM has been criticized for various reasons, including the fact that it is easy for social workers to manipulate in order to generate the recommendations they want. As child welfare researcher Emily Putnam-Hornstein pointed out to the Commission to Eliminate Child Abuse and Neglect Fatalities, many workers manipulate the tools because they think their clinical judgment is superior.
And they are probably right. SDM tools simplify complex clinical data into multiple-choice questions and do not add any data to what the social worker puts in.
A new generation of risk assessment tools, usually referred to as "predictive analytics," is on the brink of replacing outdated actuarial assessments like SDM. Los Angeles has been at the forefront of developing the new tools. It contracted with software behemoth SAS to develop a predictive analytics algorithm, called AURA, that attempted to identify which children referred to CPS would be the victims of severe maltreatment.
In a column published on Feb. 3, I wrote about the spectacular success of the AURA demonstration. Among families with at least one CPS referral prior to the current one, flagging the 10 percent of referrals with the highest AURA scores would have predicted 171 critical incidents, or 76 percent of the deaths and severe injuries to children, according to the Project AURA Final Report.
One of the sources of AURA’s power is that it draws from other sources of data that CPS investigators often cannot access or don’t have time to search fully, such as the mental health, public health and criminal justice systems. Without this data, workers must rely on a parent’s answers to questions about their mental health, substance abuse or criminal history.
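That advantage is essentially one of record linkage. Below is a rough sketch, with invented record layouts and field names, of how a tool like AURA might merge a referral with data held by other systems; it illustrates the idea, not AURA's actual design.

    # Hypothetical sketch of cross-system record linkage. The record
    # layouts and field names are invented, not AURA's actual design.
    referral = {"family_id": 1017, "allegation": "neglect", "prior_referrals": 4}

    # Records a CPS investigator often cannot see, or has no time to pull.
    mental_health = {1017: {"open_case": True}}
    criminal_justice = {1017: {"domestic_violence_arrests": 2}}

    def build_features(ref):
        """Merge the referral with whatever the other systems hold on the family."""
        fid = ref["family_id"]
        return {
            **ref,
            "mh_open_case": mental_health.get(fid, {}).get("open_case", False),
            "dv_arrests": criminal_justice.get(fid, {}).get("domestic_violence_arrests", 0),
        }

    print(build_features(referral))

An algorithm sees these fields directly; an investigator, by contrast, may hear only what a parent chooses to disclose.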
But progress on implementing AURA seems to have stalled since a contentious public meeting in July of 2015. DCFS staff quoted in Heimpel’s article do not seem to understand the revolutionary nature of the AURA tool. Acknowledging that AURA predicted more than two-thirds of the county’s critical incidents, DCFS Public Affairs Director Armand Montiel stressed that it also identified 3,829 “false positives,” or cases in which there was no critical incident.
But Montiel missed the point. Children who are abused or neglected, but don’t die or “only” nearly die, are not “false positives.” We don’t know how these children have fared since they were identified by AURA, but I certainly hope that Montiel did not intend to say that we don’t need to protect children from any maltreatment that does not result in severe injury or death.
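The Final Report's own figures make the trade-off easy to state precisely. A short sketch of the arithmetic, using only the numbers reported above:

    # Figures reported above, from the Project AURA Final Report.
    true_positives = 171      # critical incidents among flagged referrals
    false_positives = 3829    # flagged referrals with no critical incident
    recall = 0.76             # share of all critical incidents that were flagged

    flagged = true_positives + false_positives        # 4,000 flagged referrals
    precision = true_positives / flagged              # about 0.043
    total_incidents = round(true_positives / recall)  # roughly 225 incidents

    print(f"Flagged {flagged} referrals; {precision:.1%} had a critical incident,")
    print(f"but those flags captured {recall:.0%} of roughly {total_incidents} incidents.")

A hit rate of about 4 percent sounds low until one remembers that those same flags captured three-quarters of the deaths and near-deaths. A screening tool's job is to be sensitive; the investigation that follows is what supplies the precision.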
Nevertheless, Montiel is correct that AURA cannot tell for sure whether a child will be a victim of severe maltreatment and that it cannot replace a good investigation. Clinical judgment should always trump the result of any algorithm. But as Putnam-Hornstein suggests, when a predictive algorithm flags a case as high-risk, an extra layer of review can be put in place, or two workers can be sent out, to make sure the case gets the scrutiny it deserves.
There is some irony in the case that prompted the motion. The SDM tool did correctly classify Yonatan as being at high risk, but workers disregarded the finding. Having a better tool is not enough to protect children. The tool needs to have teeth, not to replace clinical judgment but at least to require a higher level of review when a child is identified as high-risk.

This column was published in the Chronicle of Social Change on September 20, 2016.
