South Africa has one of the highest rates of sexual and gender-based violence (SGBV) in the world. To assist SGBV survivors, the Government of South Africa has established Thuthuzela Care Centers (TCCs), which provide emergency medical care, psychosocial care, case management, and legal support. Social Impact designed a Randomized Controlled Trial (RCT) to evaluate the effectiveness of two USAID-funded activities aimed at increasing utilization of the TCCs and decreasing barriers to reporting SGBV.
SI interviewed Jordan Robinson, SI Deputy Director for Impact Evaluation, about this project.
What were some of the challenges to conducting the evaluation?
To measure what worked (or did not work) in achieving the project goal, we had to isolate and test one specific intervention. Although the project included a full suite of activities, evaluating them all together would not have generated results useful for learning or replication. We therefore opted to test the effectiveness of a single intervention within that suite.
Coordinating with multiple stakeholders was another important challenge. We devoted substantial effort to getting on the same page with the implementing partners and South African government stakeholders before the impact evaluation (IE) and the intervention could commence. Thereafter, close coordination throughout implementation was paramount to the ultimate success of the IE. We established a coordinated rollout schedule that worked for both the evaluation and the implementation, and both parties had to adhere strictly to that schedule for the effort to work. We also relied on the South African government to promote buy-in and cooperation from the crisis centers throughout the evaluation period, which lasted over a year.
What did this IE tell us about the intervention’s theory of change?
One of the key aspects of conducting a true impact evaluation is that it requires us to think very carefully about the program’s theory of change. In this case, the intervention was intended to increase utilization of rape crisis centers among victims of SGBV. The theory of change focused on a perceived lack of awareness of the centers among service providers and the public. Our IE confirmed the lack of awareness among the public and, to a lesser extent, among service providers, but it also identified other important constraints to utilization. However, the IE showed that the community dialogues that were supposed to be key drivers of center use were not designed in a way that could affect awareness at the community level.
What have we learned about how to help sexual assault survivors in South Africa? How can development assistance in South Africa be more effective?
This IE yielded several important lessons. First, we learned to be realistic about what interventions can achieve. Second, we learned to consider, in our theories of change, other binding constraints that affect the outcomes of interest. We also learned about community knowledge and behavior in ways that can inform the design of future interventions. For instance, knowledge of TCCs differs substantially by geographic location, and women who know a survivor of SGBV are nearly five times as likely to have heard of the crisis centers.
Overall, the IE found that these interventions on their own did not meaningfully affect SGBV reporting or community knowledge and attitudes. This should not be interpreted to mean the interventions are useless; rather, on their own they were not strong enough to meaningfully shift rape crisis center utilization or community-level knowledge and attitudes.
How are impact evaluations an important part of comprehensive monitoring and evaluation support and CLA?
The most important part of any evaluation is helping partners see how they can use the findings to improve their programs. RCTs that measure single interventions have real limits and constraints, so impact evaluations should be one component of a comprehensive suite of monitoring, evaluation, and collaborating, learning, and adapting (CLA) approaches. In this case, we recommended that the implementing partners adopt a train-the-trainer approach to the community dialogues to allow broader reach. We also recommended formalizing the SGBV training for a wider set of service providers beyond those already motivated by SGBV issues. Partners who take recommendations and adapt their programs mid-course often see stronger results by the end of their project.
The full report is available online: Impact Evaluation of the “Increasing Services for Survivors of Sexual Assault in South Africa” Program