Evaluating Programs in Complex Systems: New Approaches

Posted September 25, 2017
By the Annie E. Casey Foundation

Increasingly, human service programs that serve children and families are embedded in complex systems. Creating opportunity for parents and children together means that child-care centers might, for example, be in housing projects whose residents are connected to college programs. Community organizations increasingly provide parenting skills education and workforce development programs and have mental health counselors on site. The goal: program participants experience seamless delivery of multiple services under the umbrella of a single program. But integrating approaches that traditionally have been separate requires a new approach to evaluation as well.

Traditionally, program evaluation relies on isolating intervention effects and determining core components and procedures that can be standardized to a significant degree. But complex systems that provide individualized protocols to meet shifting participant needs challenge these methodologies at a time when interest in identifying evidence-based practices, programs and policies continues to grow — and with it, demand for rigorous program evaluation. As it invests in building the evidence base for two-generation approaches, the Casey Foundation is examining how evaluation methodology can be adapted.

“What happens when traditional evaluation methods are not adequate for providing evidence?” asks T’Pring Westbrook, a senior associate with the Foundation’s Research and Evaluation team. “The emerging theory is that it is not the isolation of effects, but the synergy of multiple effects, that leads to significant change. The development of new evaluation methodology is in the very early stages, but some important themes are emerging.”

According to Westbrook, those themes include the following:

  1. Despite the customization and complexity inherent in a service delivery approach, it’s still critical to define and explicitly articulate the intervention.
  2. The flexibility of the intervention makes it even more important to develop a logic model and/or a comprehensive theory of change.
  3. Multi-method evaluation designs that draw from implementation science, appropriate qualitative protocols, quantitative approaches and sophisticated statistical analyses are increasingly yielding rich information.
  4. New approaches to evaluation require an investment in complementary new approaches to measurement.

Casey grantees and others will come together to discuss these important issues during a panel discussion at 1:45 p.m. Friday, Nov. 10, at the American Evaluation Fall 2017 Research Conference: From Learning to Action. The panel, “New Approaches to Evaluating Complex Social Service Systems,” will highlight evaluation efforts across many areas of social service — integrated education, financial and health services for parents and children and place-based initiatives — and the challenges of evaluating programs that consist of a mosaic of interventions, partners and activities.

Panelists include consultant Allison Holmes; Kathleen Dwyer, senior social science research analyst at the Administration for Children and Families, U.S. Department of Health and Human Services; Margaret Sullivan, senior researcher at Mathematica Policy Research; and Susan J. Popkin, director, Neighborhoods and Youth Development at the Urban Institute.

Learn more about the panel discussion on complex system evaluation.