At Conectas, impact evaluation has always been an important institutional priority. We have come to understand evaluation not so much as a results assessment but rather, as Emma Naughton and Kevin Kelpin have noted, as a process that contributes to improving our institutional learning. This approach to evaluation also allows us to discover impacts that we might not have expected. If organizations use evaluation only to assess whether the desired result(s) were achieved, and not to learn more about the intervention overall, they might miss many other changes that have occurred.
Our approach and process allow us to evaluate not only our own operations, but also how the context might be changing, evolving and affecting our plans. Despite the many complexities, it is essential to understand if and how change is happening—including unanticipated change—and how constituencies and organizations may have contributed to it.
Only by understanding whether advocacy strategies have been effective, and why (or why not), can we decide whether it would make sense to replicate them. Last year, for the first time, Conectas and partner organizations from the Criminal Justice Network launched a large media campaign against the practice of invasive strip-searches of family members who visit their relatives in prison. The impact of the campaign was twofold: in the state of São Paulo, where the campaign was launched, a law was passed to ban the practice, which in and of itself was a great victory. In addition, while not endorsing “human rights” directly, a new audience began to empathize with the situation of these women—grandmothers, mothers and daughters—who must undergo this humiliating treatment in order to visit their relatives in prison. By focusing on the barbaric situation endured by prisoners’ relatives, as opposed to prisoners themselves, the campaign gathered unprecedented support. This impact was unexpected, and learning to identify it has helped us think about other human rights campaigns that could rally an even larger audience to our causes.
In fact, our evaluation processes often yield unexpected lessons. These lessons are always relevant, and they have informed our strategies and planning processes in ways both profound and constructive.
For example, another evaluation process helped us understand that the effectiveness of Conectas’ use of international mechanisms is largely reinforced by international press coverage. Resolutions and recommendations do influence official interlocutors, but when those recommendations are featured in international dailies, the reaction of government officials can be much more rapid.
More specifically, in 2011, when Conectas tried to shed light on the appalling prison conditions in the state of Espírito Santo, domestic efforts to engage policy and decision-makers proved fruitless. But when Conectas brought the case to the UN, the engagement was covered widely in the international press. These actions led directly to the abolition of the metal containers in which pre-trial detainees were held, and to the creation of a “torturômetro” to monitor acts of torture within the state.
Evaluating our work helps us negotiate the difficult balance between the battles we want to fight from a moral standpoint and those we have the ability to fight and potentially win. It also provides us with the evidence and data that help us recognize and celebrate our victories, a critical aspect for human rights defenders who dedicate their lives to this cause. Last but certainly not least, our funders want and need to understand where we make a difference, just as any investor needs to know the return on their investment. Even if we did not achieve the expected result but can show some other result, that is still an impact, and new interventions and funding might come out of that evaluation and reflection.
Over time, we have raised our team’s awareness of the need to evaluate their work. Conectas now carries out rigorous planning processes: based on our five-year strategic plan, and our three-year tactical plan, our programs and areas develop annual operational plans that are reviewed twice a year during formal evaluations. The teams themselves conduct these evaluations because, as Naughton and Kelpin have also noted, they are the best suited to understand the subtleties and complexities of a particular situation, and to identify changes or unplanned impacts that others might not see.
During these evaluations, we try to consider not only the quality of the implementation of any given action—although that is also a critical part of the process—but, more importantly, the feedback of key stakeholders. Participants in our bi-annual Colloquium are asked to answer an opinion survey at the end of the event, as well as six months later, in order to measure the impact of the event on their lives and work. Readers and contributors to the Sur Journal are also regularly asked how relevant and useful they find the articles for their work.
These survey results have at times been surprising, such as the finding that despite our many efforts to disseminate the print edition of the Sur Journal, the online version has a much larger following. As a result, we decided to transform it into a primarily online journal. The Colloquium surveys have also revealed important elements about the program and format of the meeting. The methodologies that are used today, and the time and resources dedicated to strengthening human and organizational relations during the encounter, are the direct result of these surveys.
Finally, we readily accept that these evaluations are biased, subjective and incomplete. We see the world through our own framework and from our own perspective. We look carefully at the work we do and the advances that are made, for example, within the criminal justice system or within the realm of Brazilian foreign policy, from the perspective of increasing transparency and civil society participation. We cannot assess the advance of human rights globally—we can merely try to understand our role in advancing our causes within each field. But by acknowledging when we don’t hit our goals, and by remaining open to unexpected results, we hope to always evolve and adapt to what is around us. And we can only hope that every organization will do the same, in order to build a more complete view of the field and create more effective interventions.