
Archive for April, 2013

This is a cross-post from Tessie San Martin, CEO of Plan International USA. Tessie’s original post is published on the Plan International USA blog. For more on the status of the International Aid Transparency Initiative (IATI) in the US, and for information on where donors sit on the Transparency Index, visit Publish What You Fund.

Over 40 governments, along with UN organizations and the World Bank, have committed to a common standard and time schedule for publishing aid information under the International Aid Transparency Initiative (IATI).  There are high expectations for this initiative. The ultimate objective is to increase the effectiveness of donor assistance, making aid work for those whom we are trying to help and contributing to accelerated development outcomes on the ground. IATI is good news for increased accountability, can help improve coordination, and provides a space for engaging donors, communities, governments and the general public in a broader development dialogue.

Secretary of State Clinton signed on behalf of the US Government in November 2011. While US engagement has been very welcome, US Government performance in actually executing IATI has left much to be desired.  Publish What You Fund, an organization that helps ensure governments are held to their initial aid transparency commitments, ranked only one of six US agencies (MCC) in the ‘fair’ category for execution. Recently, organizations like Oxfam and ONE have rightly questioned the US Government’s commitment and progress, and exhorted the Obama administration to make full compliance with the IATI standard a priority.

But with all the attention focused on how the USG is performing, what are INGOs doing about IATI?  After all, governments can only open access to the data they have. Official development assistance represents an increasingly small proportion of total aid flows, so having INGOs — particularly, and at the very least, the largest global INGOs — also committed to this process is vital to the success of the Initiative.

What are INGOs doing about IATI? The answer is: not much.

Very few INGOs have committed to publishing their information to the IATI standard.  INGOs that have complied are doing so primarily because a donor is requiring it.  For example, DfID, the UK foreign aid agency, has such a requirement and, as a result, the UK has the largest number of INGOs in compliance.  The US Government has not imposed this requirement on US-based INGOs and it is not likely to do so in the future.  It is therefore not surprising that US-based INGOs have not shown much interest in IATI.

This is a lost opportunity for everyone.  Accountability and transparency are as relevant to the private and the non-profit side of development assistance as they are to the public side.

At Plan International, an INGO with offices in almost 70 countries, it is not surprising that the part of our organization making the fastest strides in this area is our office in the United Kingdom.  As an important recipient of DfID money, they were instructed to do so.  In the US, though Plan International USA is not a major recipient of USG funding, we believe that making the investment to comply with the IATI reporting format and timelines is good development practice; we are thus committed to publishing to IATI in the next year.  How can we effectively preach transparency and increased accountability to our recipient communities and to the governments with which we are working, yet not commit to something as eminently commonsensical as uniform formats, comparable data sets and systematic reporting frequencies?
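In practice, “uniform formats” means publishing project data to the IATI activity XML standard. As a rough illustration only (this is not Plan’s actual pipeline; the organisation identifier, project details and version shown are invented or simplified), a single project record can be serialized to an IATI-style activity file with a few lines of Python:

```python
# Hypothetical sketch: serializing one project record to an
# IATI-style activity XML document using only the standard library.
# Element names follow the public IATI activity standard; the sample
# organisation ref and project data are invented for illustration.
import xml.etree.ElementTree as ET

def build_activity(org_ref: str, project_id: str, title: str, status_code: str) -> ET.Element:
    """Build a minimal <iati-activity> element for one project."""
    activity = ET.Element("iati-activity")
    ET.SubElement(activity, "iati-identifier").text = f"{org_ref}-{project_id}"
    reporting = ET.SubElement(activity, "reporting-org", ref=org_ref)
    reporting.text = "Example INGO"  # invented organisation name
    ET.SubElement(activity, "title").text = title
    ET.SubElement(activity, "activity-status", code=status_code)
    return activity

root = ET.Element("iati-activities", version="1.03")
root.append(build_activity("XM-EX-1", "0001", "Community water project", "2"))
xml_bytes = ET.tostring(root, encoding="utf-8")
print(xml_bytes.decode("utf-8"))
```

Because every publisher uses the same element names and identifiers, files like this one can be aggregated and compared across organizations, which is exactly the point of the standard.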

We are not Pollyannaish about the task.  Like all INGOs pondering whether and how to comply with IATI, we have many concerns, including the costs of complying and what it will do to our overhead (and therefore to something like our Charity Navigator rating).   We have established an internal project code so we can better capture, track and understand the costs involved in this initiative.  And we are evaluating where we draw the line in terms of the size of the projects on which we should be reporting, balancing costs with the desire to maximize disclosure (it is also worth remembering that rating agencies themselves are placing increasing emphasis on transparent reporting, so rating concerns may ultimately support a move towards greater IATI compliance).

As we have moved forward, we have had many issues to address, including privacy concerns, since a fair bit of Plan’s internal documentation was not written with the idea that it would one day be shared with the public.  Publishing some information may pose security risks for minority or political groups being supported.  These situations have been contemplated by IATI already, however, and there are valid exemptions for sensitive data.  We have also learned that there are many resources to help INGOs navigate the IATI compliance waters.  These resources are not well known to US INGOs, and need to be better publicized. Plan in the US, of course, is also benefiting from the research and hard work our UK office has done to comply with DfID’s mandate, allowing us to start off on a strong foundation of organizational experience.

I am convinced that IATI is not just good development practice but also makes good business sense. At the same time, it is worth remembering that IATI is not the entire solution.  IATI is designed to improve upward accountability to donors and taxpayers.  It is not designed explicitly to improve accountability to the children and communities with which we are partnering and whom we serve. And, as the ultimate goal is improved aid effectiveness, data must be accompanied by better information about goals, methodologies and approaches.  We also need to get better at sharing not just successes but failures within our federation and across all development organizations.

Despite all the shortcomings, IATI is a good start.  And as we push the US Government to do better, INGOs need to be pushing themselves to do better as well.


At Catholic Relief Services’ annual ICT4D meeting in March 2013, I worked with Jill Hannon from Rockefeller Foundation’s Evaluation Office to organize 3 sessions on the use of ICT for Monitoring and Evaluation (ICTME). The sessions covered the benefits (known and perceived) of using ICTs for M&E, the challenges and barriers organizations face when doing so, and some lessons and advice on how to integrate ICTs into the M&E process.

Our lead discussants in the three sessions included: Stella Luk (Dimagi), Guy Sharrack (CRS), Mike Matarasso (CRS), David McAfee (HNI/Datawinners), Mark Boots (Votomobile), and Teressa Trusty (USAID’s IDEA/Mobile Solutions). In addition, we drew from the experiences and expertise of some 60 people who attended our two round table sessions.

Benefits of integrating ICTs into the M&E process

Some of the potential benefits of integrating ICTs mentioned by the various discussants and participants in the sessions included:

  • More rigorous, higher quality data collection and more complete data
  • Reduction in required resources (time, human, money) to collect, aggregate and analyze data
  • Reduced complexity when data systems are simplified, and thus increased productivity and efficiency
  • Combined information sources and types and integration of free form, qualitative data with quantitative data
  • Broader general feedback from a wider public via ICT tools like SMS; inclusion of new voices in the feedback process, elimination of the middleman to empower communities
  • Better cross-sections of information, information comparisons; better coordination and cross-comparing if standard, open formats are used
  • Trend-spotting with visualization tools
  • Greater data transparency and data visibility, easier data auditing
  • Real-time or near real-time feedback “up the chain” that enables quicker decision-making, adaptive management, improved allocation of limited resources based on real-time data, quicker communication of decisions/changes back to field-level staff, faster response to donors and better learning
  • Real-time feedback “down the ladder” that allows for direct citizen/beneficiary feedback, and complementing of formal M&E with other social monitoring approaches
  • Scale, greater data security and archiving, and less environmental impact
  • Better user experience for staff as well as skill enhancement and job marketability and competitiveness of staff who use the system

Barriers and challenges of integrating ICTs into M&E processes

A number of challenges and barriers were also identified, including:

  • A lack of organizational capacity to decide when to use ICTs in M&E, for what, and why, and deciding on the right ICT (if any) for the situation. Organizations may find it difficult to get beyond collecting the data to better use of data for decision-making and coordination. There is often low staff capacity, low uptake of ICT tools and resistance to change.
  • A tendency to focus on surveys, with less attention to other types of M&E input, such as qualitative input. Scaling analysis of large-scale qualitative feedback is also a challenge: “How do you scale qualitative feedback to 10,000 people or more? People can give their feedback in a number of languages by voice. How do you mine that data?”
  • The temptation to offload excessive data collection to frontline staff without carefully selecting what data is actually going to be used and useful for them or for other decision-makers.
  • M&E is often tacked on at the end of a proposal design. The same is true for ICT. Both ICT and M&E need to be considered and “baked in” to a process from the very beginning.
  • ICT-based M&E systems have dropped the ball on sharing data back. “Clinics in Ghana collect a lot of information that gets aggregated and moved up the chain. What doesn’t happen is sharing that information back with the clinic staff so that they can see what is happening in their own clinic and why. We need to do a better job of giving information back to people and closing the loop.” This step is also important for accountability back to communities. On the whole, we need to be less extractive.
  • Available tools are not always exactly right, and no tool seems to provide everything an organization needs, making it difficult to choose the right tool. There are too many solutions, many of which are duplicative, and often the feature sets and the usability of these tools are both poor. There are issues with sustainability and ongoing maintenance and development of M&E platforms.
  • Common definitions for data types and standards for data formatting are needed. The lack of interoperability among ICT solutions also causes challenges. As a field, we don’t do enough linking of systems together to see a bigger picture of which programs are doing what, where and who they are impacting and how.
  • Security and privacy are not adequately addressed. Many organizations or technology providers are unaware of the ethical implications of collecting data via new tools and channels. Many organizations are unclear about the ethical standards for research versus information that is offered up by different constituents or “beneficiaries” (e.g., information provided by people participating in programs that use SMS or collect information through SMS-based surveys) versus monitoring and evaluation information. It is also unclear what the rules are for information collected by private companies, who this information can be shared with, and what privacy laws mean for ICT-enabled M&E and other types of data collection. If there are too many barriers to collecting information, however, the amount of information collected will be reduced. A balance needs to be found. The information that telecommunications companies hold is something to think about when considering privacy and consent issues, especially in situations of higher vulnerability and risk. (UNOCHA has recently released a report that may be useful.)
  • Not enough is understood about motivation and incentive for staff or community members to participate or share data. “Where does my information go? Do I see the results? Why should I participate? Is anyone responding to my input?” In addition, the common issues of cost, access, capacity, language, literacy, and culture are very much present in attempts to collect information directly from community members. Another question is that of inclusion: Does ICT-enabled data collection or surveying leave certain groups out? (See this study on intrinsic vs extrinsic motivation for feedback.)
  • Donors often push or dictate the use of ICT when it’s perhaps not the most useful for the situation. In addition, there is normally not enough time during the proposal process for organizations to work on buy-in and good design of an ICT-enabled M&E system. There is often a demand from the top for excessive data collection without an understanding of the effort required to collect it, and of the time/resource trade-offs when excessive data collection leads to less time spent on program implementation. “People making decisions in the capital want to add all these new questions and information and that can be a challenge… What data are valuable to collect? Who will respond to them? Who will use them as the project goes forward?”
  • There seems to be a focus on top-down, externally created solutions rather than building on local systems and strengths or supporting local organizations or small businesses to strengthen their ICTME capacities. “Can strengthening local capacity be an objective in its own right? Are donors encouraging agencies to develop vertical ICTME solutions without strengthening local systems and partners?”
  • A results-based, data-driven focus can bias efforts toward what is countable and leave out complex development processes whose impacts are harder to count or measure.
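The interoperability gap raised above (no common data definitions or formats across tools) can be narrowed at the organizational level by mapping each tool’s export onto one shared schema before analysis. The sketch below is purely illustrative; the field names for both survey tools are invented:

```python
# Hypothetical sketch: mapping records exported from two different
# survey tools onto one shared schema so results can be compared.
# All field names here are invented for illustration.

COMMON_FIELDS = ("respondent_id", "district", "question", "answer")

def normalize(record: dict, field_map: dict) -> dict:
    """Rename tool-specific keys to the shared schema."""
    return {common: record[source] for common, source in field_map.items()}

# Two tools exporting the same kind of data under different keys.
tool_a = {"id": "r1", "loc": "North", "q": "water_access", "resp": "yes"}
tool_b = {"uid": "r2", "district_name": "South", "question_code": "water_access", "value": "no"}

map_a = {"respondent_id": "id", "district": "loc", "question": "q", "answer": "resp"}
map_b = {"respondent_id": "uid", "district": "district_name", "question": "question_code", "answer": "value"}

merged = [normalize(tool_a, map_a), normalize(tool_b, map_b)]
# Every record now carries the same keys, so cross-tool comparison works.
assert all(tuple(r) == COMMON_FIELDS for r in merged)
```

This is the same idea that open standards like IATI apply sector-wide: agree on the field names once, and comparison across systems becomes cheap.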

Lessons and good practice for integrating ICTs into M&E processes

ICT is not a silver bullet – it presents its own set of challenges. But a number of good practices surfaced:

  • The use of ICTs for M&E is not just a technology issue, it’s a people and processes issue too, and it is important to manage the change carefully. It’s also important to keep an open mind that ICT4D to support M&E might not always be the best use of scarce resources – there may be more pressing priorities for a project. Getting influential people on your side to support the cause and help leverage funding and support is critical. It’s also important to communicate goals and objectives clearly, and provide incentives to make sure ICTs are successfully adopted. The trick is keeping up with technology advances to improve the system, but also keeping your eye on the ball.
  • When designing an ICTME effort, clarity of purpose and a holistic picture of the project M&E system are needed in order to review options for where ICT4D can best fit. Don’t start with the technology. Start with the M&E purpose and goals and focus on the business need, not the gadgets. Have a detailed understanding of M&E data requirements and data flows as a first step. Follow those with iterative discussions with ICT staff to specify the ICT4D solution requirements.
  • Select an important but modest project to start with and pilot in one location – get it right and work out the glitches before expanding to a second tier of pilots or expanding widely. Have a fully functional model to share for broad buy-in and collect some hard data during the pilot to convince people of adoption. The first ICT initiative will be the most important.  If it is successful, use of ICTs will likely spread throughout an organization.  If the first initiative fails, it can significantly push back the adoption of ICTs in general. For this reason, it’s important to use your best people for the first effort. Teamwork and/or new skill sets may be required to improve ICT-enabled M&E. The “ICT4D 2.0 Manifesto” talks about a tribrid set of skills needed for ICT-enabled programs.
  • Don’t underestimate the need for staff training and ongoing technical assistance to ensure a positive user experience, particularly when starting out. Agencies need to find the right balance between being able to provide support for a limited number of ICT solutions versus the need to support ongoing local innovation.  It’s also important to ask for help when needed.  The most successful M&E projects are led by competent managers who seek out resources both inside and outside their organizations.
  • Good ICT-enabled M&E comes from a partnership between program, M&E and ICT staff, technical support internal and external to the organization. Having a solid training curriculum and a good help desk are important. In addition, in-built capacity for original architecture design and to maintain and adjust the system is a good idea. A lead business owner and manager for the system need to be in place as well as global and local level pioneers and strong leadership (with budget!) to do testing and piloting. At the local level, it is important to have an energetic and savvy local M&E pioneer who has a high level of patience and understands technology.
  • At the community level, a key piece is understanding who you need to hear from for effective M&E and ensuring that ICT tools are accessible to all. It’s also critical to understand who you are ignoring or not reaching with any tool or process. Are women and children left out? What about income level? Those who are not literate?
  • Organizations should also take care that they are not replacing or obliterating existing human responsibilities for evaluation. For example, at the community level in Ghana, Assembly Members have the current responsibility for representing citizen concerns. An ICT-enabled feedback loop might undermine this responsibility if it seeks direct-from-citizen evaluation input.  The issues of trust and the human-human link also need consideration. ICT cannot and should not be a replacement for everything. New ICT tools can increase the number of people and factors evaluated, not just increase the efficiency of existing evaluations.
  • Along the same lines, it’s important not to duplicate existing information systems, create parallel systems or fragment the government’s own systems. Organizations should be strengthening local government systems and working with government to use the information to inform policy and help with decision-making and implementation of programs.
  • Implementers need to think about the direction of information flow: “Is it valuable to share results ‘upward’ and ‘downward’? It is possible to integrate local decision-making into a system.” Systems can be created that allow for immediate local-level decision-making based on survey input. Key survey questions can be linked to indicators that allow for immediate discussion and solutions to improve service provision.
  • Also, the potential political and social implications of greater openness in information flows need to be considered. Will local, regional and national government embrace the openness and transparency that ICTs offer? Are donors and NGOs potentially putting people at risk?
  • For best results, pick a feasible and limited number of quality indicators and think through how frontline workers will be motivated to collect the data. Excessive data collection will interfere with or impede service delivery. Make sure managers are capable of handling and analyzing data that comes in and reacting to it, or there is no point in collecting it. It’s important to not only think about what data you want, but how this data will be used. Real-time data collected needs to be actionable. Be sure that those submitting data understand what data they have submitted and can verify its accuracy. Mobile data collection needs to be integrated into real processes and feedback loops. People will only submit information or reports if they see that someone cares about those reports and does something about them.
  • Collecting data through mobile technology may change the behavior being monitored or tracked. One participant commented that when his organization implemented an ICT-based system to track staff performance, people started doing unnecessary activities so that they could tick off the system boxes rather than doing what they knew should be done for better program impact.
  • At the practical level, tips include having robust options for connectivity and power solutions, testing the technology in the field with a real situation, securing reduced costs with vendors through bulk purchasing and master agreements, and using standard vendor tools instead of custom building. It’s good to keep the system as simple, efficient and effective as possible and to avoid redundancy or the addition of features that don’t truly offer more functionality.
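The point above that real-time data must be actionable can be made concrete with a small sketch. Assuming a hypothetical stream of mobile reports from clinics (the site names, report fields and threshold are all invented), an incoming-data indicator can trigger a response the moment a site crosses a threshold, rather than waiting for an end-of-quarter report:

```python
# Hypothetical sketch: turning incoming mobile survey reports into an
# actionable indicator. Site names, report fields and the threshold
# are invented; the point is that collection triggers a response.
from collections import Counter

STOCKOUT_THRESHOLD = 3  # alert after this many stockout reports per site

def check_alerts(reports: list, threshold: int = STOCKOUT_THRESHOLD) -> list:
    """Return the sites whose stockout reports have reached the threshold."""
    counts = Counter(r["site"] for r in reports if r["issue"] == "stockout")
    return sorted(site for site, n in counts.items() if n >= threshold)

incoming = [
    {"site": "Clinic A", "issue": "stockout"},
    {"site": "Clinic A", "issue": "stockout"},
    {"site": "Clinic B", "issue": "staffing"},
    {"site": "Clinic A", "issue": "stockout"},
]
print(check_alerts(incoming))  # Clinic A has reached the threshold
```

The alert itself is only half the loop: someone with authority and budget has to act on it, and the result should flow back to the clinic staff who submitted the reports.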

Thanks to all our participants and lead discussants at the sessions!

Useful information and guides on ICTME:

Mobile-based technology for monitoring and evaluation: A reference guide for project managers, M&E specialists, researchers, donors

3 Reports on mobile data collection

Other posts on ICTs for M&E:

12 tips on using ICTs for social monitoring and accountability

11 points on strengthening local capacity to use new ICTs for M&E

10 tips on using new ICTs for qualitative M&E

Using participatory video for M&E

ICTs and M&E at the South Asia Evaluators’ Conclave


CISCO is looking for stories about how people and organizations have used network services (mobile video and gaming, video conferencing, SMS, and social networking) to achieve different social or personal goals. The idea is to find stories not only about how technology developers are inventing things, but also about how different tools are enabling a range of things to happen and helping people to resolve challenges and achieve progress in their communities.

The application process is quite simple – just answer 5 questions through a form on the website to formulate the story. You can also download a form, fill it out, and then cut/paste into the form online if you are on slow internet. Contest and story submission information is here.

According to CISCO, by posting a story on how you’ve developed solutions or used these services, you will be an inspiration to others, and CISCO will help you promote your work through their social media channels, thereby increasing your visibility. They hope that the stories help generate more conversation around the impact of Internet services.

They also hope the stories can help answer some questions around the benefits of increased connectivity. For example:

  • How will growth in Internet access impact global economies?
  • How can more communities and businesses take advantage of the significant growth in mobile services?
  • What new opportunities can be created for those least connected?
  • Why are there regional variations in adoption rates of certain networking services?

I know a lot of smaller local organizations – especially some great youth-led organizations – are doing some fantastic things, and a prize like this could help these organizations get to the next level. More information here.

The more academic folks out there may not want to submit stories, but you might be interested in the Visual Networking Index Forecast.
