This post was written with input from Maliha Khan, Independent Consultant; Emily Tomkys, Oxfam GB; Siobhan Green, Sonjara and Zara Rahman, The Engine Room.

A friend reminded me earlier this month at the MERL Tech Conference that a few years ago when we brought up the need for greater attention to privacy, security and ethics when using ICTs and digital data in humanitarian and development contexts, people pointed us to Tor, encryption and specialized apps. “No, no, that’s not what we mean!” we kept saying. “This is bigger. It needs to be holistic. It’s not just more tools and tech.”

So, even if as a sector we are still struggling to understand and address all the different elements of what’s now referred to as “Responsible Data” (thanks to the great work of the Engine Room and key partners), at least we’ve come a long way towards framing and defining the areas we need to tackle. We understand the increasing urgency of the issue: the volume of data in the world is growing exponentially, and the data in our sector is becoming more and more digital.

This year’s MERL Tech included several sessions on Responsible Data, including Responsible Data Policies, the Human Element of the Data Cycle, The Changing Nature of Informed Consent, Remote Monitoring in Fragile Environments and plenary talks that mentioned ethics, privacy and consent as integral pieces of any MERL Tech effort.

The session on Responsible Data Policies was a space to share with participants why, how, and what policies some organizations have put in place in an attempt to be more responsible. The presenters spoke about the different elements and processes their organizations have followed, and the reasoning behind the creation of these policies. They spoke about early results from the policies, though it is still early days when it comes to implementing them.

What do we mean by Responsible Data?

Responsible data is about more than just privacy or encryption. It’s a wider concept that includes attention to the data cycle at every step, and puts the rights of people reflected in the data first:

  • Clear planning and purposeful collection and use of data with the aim of improving humanitarian and development approaches and results for those we work with and for
  • Responsible treatment of the data and respectful and ethical engagement with people we collect data from, including privacy and security of data and careful attention to consent processes and/or duty of care
  • Clarity on data sharing – what data, from whom and with whom and under what circumstances and conditions
  • Attention to transparency and accountability efforts in all directions (upwards, downwards and horizontally)
  • Responsible maintenance, retention or destruction of data.
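
To make the last point concrete, a retention rule can be expressed as a simple check. This is a hypothetical sketch in Python; the field names and the three-year period are invented for illustration, not drawn from any organization's actual policy:

```python
from datetime import datetime, timedelta

# Hypothetical retention rule: personal records older than the project's
# retention period are flagged for destruction or anonymization.
# The three-year period and the field names are assumptions.
RETENTION = timedelta(days=3 * 365)

def records_past_retention(records, today=None):
    """Return the records whose collection date exceeds the retention period."""
    today = today or datetime.now()
    return [r for r in records if today - r["collected_on"] > RETENTION]

def anonymize(record):
    """Strip directly identifying fields, keeping aggregate-level data."""
    cleaned = dict(record)
    for field in ("name", "phone", "gps"):
        cleaned.pop(field, None)
    return cleaned
```

In practice the harder part is organizational, not technical: agreeing on who owns the rule, what the retention period should be, and whether destruction or anonymization is the right response for each data type.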

Existing documentation and areas to explore

There is a huge bucket of concepts, frameworks, laws and policies that already exist in various other sectors and that can be used, adapted and built on to develop responsible approaches to data in development and humanitarian work. Some of these are in conflict with one another, however, and those conflicts need to be worked out or at least recognized if we are to move forward as a sector and/or in our own organizations.

Some areas to explore when developing a Responsible Data policy include:

  • An organization’s existing policies and practices (IT and equipment; downloading; storing of official information; confidentiality; monitoring, evaluation and research; data collection and storage for program administration, finance and audit purposes; consent and storage for digital images and communications; social media policies).
  • Local and global laws that relate to collection, storage, use and destruction of data, such as: Freedom of information acts (FOIA); consumer protection laws; data storage and transfer regulations; laws related to data collection from minors; privacy regulations such as the latest from the EU.
  • Donor grant requirements related to data privacy and open data, such as USAID’s ADS Chapter 579 or International Aid Transparency Initiative (IATI) stipulations.

Experiences with Responsible Data Policies

At the MERL Tech Responsible Data Policy session, organizers and participants shared their experiences. The first step for everyone developing a policy was establishing wide agreement and buy-in for why their organizations should care about Responsible Data. This was done by developing Values and Principles that form the foundation for policies and guidance.

Oxfam’s Responsible Data policy has a focus on rights, since Oxfam is a rights-based organization. The organization’s existing values made it clear that ethical use and treatment of data was something the organization must consider to hold true to its ethos. It took around six months to get all of the global affiliates to agree on the Responsible Program Data policy, a quick turnaround compared to other globally agreed documents because all the global executive directors recognized that this policy was critical. A core point for Oxfam was the belief that digital identities and access will become increasingly important for inclusion in the future, and so the organization did not want to stand in the way of people being counted and heard. However, it wanted to be sure that this was done in a way that took privacy and security into consideration.

The policy is a short document that is now being operationalized in all the countries where Oxfam works. Because many of Oxfam’s affiliate headquarters reside in the European Union, the organization needs to consider the new EU regulations on data, which are extremely strict, for example, requiring that everyone be given an option for withdrawing consent. This poses a challenge for development agencies, which normally do not have the kind of detailed databases on ‘beneficiaries’ that they do on private donors. Shifting thinking about ‘beneficiaries’ and treating them more as clients may be in order as one result of these new regulations. As Oxfam moves into implementation, challenges continue to arise. For example, data protection in Yemen is different from data protection in Haiti. Knowing all the national-level laws and frameworks and mapping these out alongside donor requirements and internal policies is extremely complicated, and providing guidance to country staff is difficult given that each country has different laws.
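
As a hypothetical illustration of what honoring a consent withdrawal might involve at the data level, the sketch below removes one person's records across several data stores and logs what was removed, for accountability. The store and field names are invented for illustration; this is not Oxfam's actual system:

```python
# Hypothetical sketch of honoring a consent withdrawal, as the EU
# regulations described above require. Store and field names are
# illustrative assumptions, not any organization's real schema.

def withdraw_consent(person_id, data_stores):
    """Remove a person's records from every store that holds them.

    data_stores: dict mapping store name -> list of record dicts.
    Returns a log of (store, records removed) for accountability.
    """
    log = []
    for store_name, records in data_stores.items():
        remaining = [r for r in records if r.get("person_id") != person_id]
        removed = len(records) - len(remaining)
        records[:] = remaining  # mutate in place so callers see the change
        if removed:
            log.append((store_name, removed))
    return log
```

In practice a withdrawal would also need to propagate to backups and to any partners the data was shared with, which is one reason the clarity on data sharing discussed above matters so much.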

Girl Effect’s policy has a focus on privacy, security and safety of adolescent girls, who are the core constituency of the organization. The policy became clearly necessary because although the organization had a strong girl safeguarding policy and practice, the effect of digital data had not previously been considered, and the number of programs that involve digital tools and data is increasing. The Girl Effect policy currently has four core chapters: privacy and security during design of a tool, service or platform; content considerations; partner vetting; and MEAL considerations. Girl Effect looks at not only the privacy and security elements, but also aims to spur thinking about potential risks and unintended consequences for girls who access and use digital tools, platforms and content. One core goal is to stimulate implementers to think through a series of questions that help them to identify risks. Another is to establish accountability for decisions around digital data.

The policy has been in process of implementation with one team for a year and will be updated and adapted as the organization learns. It has proven to have good uptake so far from team members and partners, and has become core to how the teams and the wider organization think about digital programming. Cost and time for implementation increase with the incorporation of stricter policies, however, and it is challenging to find a good balance between privacy and security, the ability to safely collect and use data to adapt and improve tools and platforms, and user friendliness/ease of use.

Catholic Relief Services has an existing set of eight organizational principles: Sacredness and Dignity of the human person; Rights and responsibilities; Social Nature of Humanity; The Common Good; Subsidiarity; Solidarity; Option for the Poor; Stewardship. It was a natural fit to see how these values that are already embedded in the organization could extend to the idea of Responsible Data. Data is an extension of the human person and should therefore be afforded the same respect as the individual. The principle of ‘common good’ easily extends to responsible data sharing. The notion of subsidiarity says that decision-making should happen as close as possible to the place where the impact of the decision will be the strongest, and this is nicely linked with the idea of sharing data back with communities where CRS works and engaging them in decision-making. The option for the poor urges CRS to place a preferential value on privacy, security and safety of the data of the poor over the data demands of other entities.

The organization is at the initial phase of creating its Responsible Data Policy. The process includes the development of the values and principles, two country learning visits to understand the practices of country programs and their concerns about data, development of the policy, and a set of guidelines to support staff in following the policy.

USAID recently embarked on its process of developing practical Responsible Data guidance to pair with its efforts in the area of open data. (See ADS 579). More information will be available soon on this initiative.

Where are we now?

Though several organizations are moving towards the development of policies and guidelines, it was clear from the session that uncertainties are the order of the day, as Responsible Data is an ethical question, often relying on tradeoffs and decisions that are not hard and fast. Policies and guidelines generally aim to help implementers ask the right questions, sort through a range of possibilities and weigh potential risks and benefits.

Another critical aspect that was raised at the MERL Tech session was the financial and staff resources that can be required to be responsible about data. On the other hand, for those organizations receiving funds from the European Union or residing in the EU or the UK (where despite Brexit, organizations will likely need to comply with EU Privacy Regulations), the new regulations mean that NOT being responsible about data may result in hefty fines and potential legal action.

Going from policy to implementation is a challenge that involves both capacity strengthening in this new area as well as behavior change and a better understanding of emerging concepts and multiple legal frameworks. The nuances by country, organization and donor make the process difficult to get a handle on.

Because staff and management are already overburdened, the trick to developing and implementing Responsible Data Policies and Practice will be finding ways to strengthen staff capacity and to provide guidance in ways that do not feel overwhelmingly complex. Though each situation will be different, finding ongoing ways to share resources and experiences so that we can advance as a sector will be one key step for moving forward.


At Catholic Relief Services’ annual ICT4D meeting in March 2013, I worked with Jill Hannon from Rockefeller Foundation’s Evaluation Office to organize 3 sessions on the use of ICT for Monitoring and Evaluation (ICTME). The sessions covered the benefits (known and perceived) of using ICTs for M&E, the challenges and barriers organizations face when doing so, and some lessons and advice on how to integrate ICTs into the M&E process.

Our lead discussants in the three sessions included: Stella Luk (Dimagi), Guy Sharrack (CRS), Mike Matarasso (CRS), David McAfee (HNI/Datawinners), Mark Boots (Votomobile), and Teressa Trusty (USAID’s IDEA/Mobile Solutions). In addition, we drew from the experiences and expertise of some 60 people who attended our two round table sessions.

Benefits of integrating ICTs into the M&E process

Some of the potential benefits of integrating ICTs mentioned by the various discussants and participants in the sessions included:

  • More rigorous, higher quality data collection and more complete data
  • Reduction in required resources (time, human, money) to collect, aggregate and analyze data
  • Reduced complexity where data systems are simplified, and thus increased productivity and efficiency
  • Combined information sources and types and integration of free form, qualitative data with quantitative data
  • Broader general feedback from a wider public via ICT tools like SMS; inclusion of new voices in the feedback process, elimination of the middleman to empower communities
  • Better cross-sections of information, information comparisons; better coordination and cross-comparing if standard, open formats are used
  • Trend-spotting with visualization tools
  • Greater data transparency and data visibility, easier data auditing
  • Real-time or near real-time feedback “up the chain” that enables quicker decision-making, adaptive management, improved allocation of limited resources based on real-time data, quicker communication of decisions/changes back to field-level staff, faster response to donors and better learning
  • Real-time feedback “down the ladder” that allows for direct citizen/beneficiary feedback, and complementing of formal M&E with other social monitoring approaches
  • Scale, greater data security and archiving, and less environmental impact
  • Better user experience for staff as well as skill enhancement and job marketability and competitiveness of staff who use the system

Barriers and challenges of integrating ICTs into M&E processes

A number of challenges and barriers were also identified, including:

  • A lack of organizational capacity to decide when to use ICTs in M&E, for what, and why, and deciding on the right ICT (if any) for the situation. Organizations may find it difficult to get beyond collecting the data to better use of data for decision-making and coordination. There is often low staff capacity, low uptake of ICT tools and resistance to change.
  • A tendency to focus on surveys and less attention to other types of M&E input, such as qualitative input. Scaling analysis of large-scale qualitative feedback is also a challenge: “How do you scale qualitative feedback to 10,000 people or more? People can give their feedback in a number of languages by voice. How do you mine that data?”
  • The temptation to offload excessive data collection to frontline staff without carefully selecting what data is actually going to be used and useful for them or for other decision-makers.
  • M&E is often tacked on at the end of a proposal design. The same is true for ICT. Both ICT and M&E need to be considered and “baked in” to a process from the very beginning.
  • ICT-based M&E systems have dropped the ball on sharing data back. “Clinics in Ghana collect a lot of information that gets aggregated and moved up the chain. What doesn’t happen is sharing that information back with the clinic staff so that they can see what is happening in their own clinic and why. We need to do a better job of giving information back to people and closing the loop.” This step is also important for accountability back to communities. On the whole, we need to be less extractive.
  • Available tools are not always exactly right, and no tool seems to provide everything an organization needs, making it difficult to choose the right tool. There are too many solutions, many of which are duplicative, and often the feature sets and the usability of these tools are both poor. There are issues with sustainability and ongoing maintenance and development of M&E platforms.
  • Common definitions for data types and standards for data formatting are needed. The lack of interoperability among ICT solutions also causes challenges. As a field, we don’t do enough linking of systems together to see a bigger picture of which programs are doing what, where and who they are impacting and how.
  • Security and privacy are not adequately addressed. Many organizations or technology providers are unaware of the ethical implications of collecting data via new tools and channels. Many organizations are unclear about the ethical standards for research versus information that is offered up by different constituents or “beneficiaries” (e.g., information provided by people participating in programs that use SMS or collect information through SMS-based surveys) versus monitoring and evaluation information. It is also unclear what the rules are for information collected by private companies, who this information can be shared with and what privacy laws mean for ICT-enabled M&E and other types of data collection. If there are too many barriers to collecting information, however, the amount of information collected will be reduced. A balance needs to be found. The information that telecommunications companies hold is something to think about when considering privacy and consent issues, especially in situations of higher vulnerability and risk. (UNOCHA has recently released a report that may be useful.)
  • Not enough is understood about motivation and incentive for staff or community members to participate or share data. “Where does my information go? Do I see the results? Why should I participate? Is anyone responding to my input?” In addition, the common issues of cost, access, capacity, language, literacy, cultural barriers are very much present in attempts to collect information directly from community members. Another question is that of inclusion: Does ICT-enabled data collection or surveying leave certain groups out? (See this study on intrinsic vs extrinsic motivation for feedback.)
  • Donors often push or dictate the use of ICT when it’s perhaps not the most useful for the situation. In addition, there is normally not enough time during the proposal process for organizations to work on buy-in and good design of an ICT-enabled M&E system. There is often a demand from the top for excessive data collection without an understanding of the effort required to collect it, and time/resource trade-offs for excessive data collection when it leads to less time spent on program implementation. “People making decisions in the capital want to add all these new questions and information and that can be a challenge… What data are valuable to collect? Who will respond to them? Who will use them as the project goes forward?”
  • There seems to be a focus on top-down, externally created solutions rather than building on local systems and strengths or supporting local organizations or small businesses to strengthen their ICTME capacities. “Can strengthening local capacity be an objective in its own right? Are donors encouraging agencies to develop vertical ICTME solutions without strengthening local systems and partners?”
  • A results-based, data-driven focus can bias attention toward what is countable and leave out complex development processes whose impacts are more difficult to count or measure.

Lessons and good practice for integrating ICTs into M&E processes

ICT is not a silver bullet – it presents its own set of challenges. But a number of good practices surfaced:

  • The use of ICTs for M&E is not just a technology issue, it’s a people and processes issue too, and it is important to manage the change carefully. It’s also important to keep an open mind that ICT4D to support M&E might not always be the best use of scarce resources – there may be more pressing priorities for a project. Getting influential people on your side to support the cause and help leverage funding and support is critical. It’s also important to communicate goals and objectives clearly, and provide incentives to make sure ICTs are successfully adopted. The trick is keeping up with technology advances to improve the system, but also keeping your eye on the ball.
  • When designing an ICTME effort, clarity of purpose and a holistic picture of the project M&E system are needed in order to review options for where ICT4D can best fit. Don’t start with the technology. Start with the M&E purpose and goals and focus on the business need, not the gadgets. Have a detailed understanding of M&E data requirements and data flows as a first step. Follow those with iterative discussions with ICT staff to specify the ICT4D solution requirements.
  • Select an important but modest project to start with and pilot in one location – get it right and work out the glitches before expanding to a second tier of pilots or expanding widely. Have a fully functional model to share for broad buy-in and collect some hard data during the pilot to convince people of adoption. The first ICT initiative will be the most important.  If it is successful, use of ICTs will likely spread throughout an organization.  If the first initiative fails, it can significantly push back the adoption of ICTs in general. For this reason, it’s important to use your best people for the first effort. Teamwork and/or new skill sets may be required to improve ICT-enabled M&E. The “ICT4D 2.0 Manifesto” talks about a tribrid set of skills needed for ICT-enabled programs.
  • Don’t underestimate the need for staff training and ongoing technical assistance to ensure a positive user experience, particularly when starting out. Agencies need to find the right balance between being able to provide support for a limited number of ICT solutions versus the need to support ongoing local innovation.  It’s also important to ask for help when needed.  The most successful M&E projects are led by competent managers who seek out resources both inside and outside their organizations.
  • Good ICT-enabled M&E comes from a partnership between program, M&E and ICT staff, technical support internal and external to the organization. Having a solid training curriculum and a good help desk are important. In addition, in-built capacity for original architecture design and to maintain and adjust the system is a good idea. A lead business owner and manager for the system need to be in place as well as global and local level pioneers and strong leadership (with budget!) to do testing and piloting. At the local level, it is important to have an energetic and savvy local M&E pioneer who has a high level of patience and understands technology.
  • At the community level, a key piece is understanding who you need to hear from for effective M&E and ensuring that ICT tools are accessible to all. It’s also critical to understand who you are ignoring or not reaching with any tool or process. Are women and children left out? What about income level? Those who are not literate?
  • Organizations should also take care that they are not replacing or obliterating existing human responsibilities for evaluation. For example, at community level in Ghana, Assembly Members have the current responsibility for representing citizen concerns. An ICT-enabled feedback loop might undermine this responsibility if it seeks direct-from-citizen evaluation input.  The issue of trust and the human-human link also need consideration. ICT cannot and should not be a replacement for everything. New ICT tools can increase the number of people and factors evaluated; not just increase efficiency of existing evaluations.
  • Along the same lines, it’s important not to duplicate existing information systems, create parallel systems or fragment the government’s own systems. Organizations should be strengthening local government systems and working with government to use the information to inform policy and help with decision-making and implementation of programs.
  • Implementers need to think about the direction of information flow. “Is it valuable to share results ‘upward’ and ‘downward’? It is possible to integrate local decision-making into a system.” Systems can be created that allow for immediate local-level decision-making based on survey input. Key survey questions can be linked to indicators that allow for immediate discussion and solutions to improve service provision.
  • Also, the potential political and social implications of greater openness in information flows needs to be considered. Will local, regional and national government embrace the openness and transparency that ICTs offer? Are donors and NGOs potentially putting people at risk?
  • For best results, pick a feasible and limited number of quality indicators and think through how frontline workers will be motivated to collect the data. Excessive data collection will interfere with or impede service delivery. Make sure managers are capable of handling and analyzing data that comes in and reacting to it, or there is no point in collecting it. It’s important to not only think about what data you want, but how this data will be used. Real-time data collected needs to be actionable. Be sure that those submitting data understand what data they have submitted and can verify its accuracy. Mobile data collection needs to be integrated into real processes and feedback loops. People will only submit information or reports if they see that someone cares about those reports and does something about them.
  • Collecting data through mobile technology may change the behavior being monitored or tracked. One participant commented that when his organization implemented an ICT-based system to track staff performance, people started doing unnecessary activities so that they could tick off the system boxes rather than doing what they knew should be done for better program impact.
  • At the practical level, tips include having robust options for connectivity and power solutions, testing the technology in the field with a real situation, securing reduced costs with vendors for bulk purchasing and master agreements, and using standard vendor tools instead of custom building. It’s good to keep the system as simple, efficient and effective as possible and to avoid redundancy or the addition of features that don’t truly offer more functionality.
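
The idea above of linking key survey questions to indicators that trigger immediate local follow-up can be sketched in a few lines. This is a hypothetical illustration; the indicator names and thresholds are invented, not taken from any of the projects discussed:

```python
# Hypothetical sketch: link survey responses to indicator thresholds so
# that a local manager gets an immediate, actionable flag rather than a
# pile of raw data. Indicator names and thresholds are invented.

THRESHOLDS = {
    "stockouts_reported": 3,   # flag if 3 or more respondents report stockouts
    "wait_time_minutes": 60,   # flag if the average wait exceeds an hour
}

def flag_indicators(responses):
    """Return the indicators that need local follow-up, given survey responses."""
    flags = []
    stockouts = sum(1 for r in responses if r.get("stockout"))
    if stockouts >= THRESHOLDS["stockouts_reported"]:
        flags.append("stockouts_reported")
    waits = [r["wait_minutes"] for r in responses if "wait_minutes" in r]
    if waits and sum(waits) / len(waits) > THRESHOLDS["wait_time_minutes"]:
        flags.append("wait_time_minutes")
    return flags
```

The point of a structure like this is that the feedback loop closes locally: the person collecting the data sees something actionable come back, which, as participants noted, is what keeps people submitting reports at all.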

Thanks to all our participants and lead discussants at the sessions!

Useful information and guides on ICTME:

Mobile-based technology for monitoring and evaluation: A reference guide for project managers, M&E specialists, researchers, donors

3 Reports on mobile data collection

Other posts on ICTs for M&E:

12 tips on using ICTs for social monitoring and accountability

11 points on strengthening local capacity to use new ICTs for M&E

10 tips on using new ICTs for qualitative M&E

Using participatory video for M&E

ICTs and M&E at the South Asia Evaluators’ Conclave
