
Archive for the ‘accountability’ Category

So, here’s a good post called “Dear White Protesters” from Tam, who writes on Tumblr as Young, Gifted and Black. It’s aimed at white folks protesting the grand jury decisions in the Michael Brown and Eric Garner cases and, more generally, at white people who want to be allies in the struggle against structural violence, discrimination against black people, and racist policing.

Tam specifically talks about the protests that happened in Berkeley on December 6th, writing:

[Screenshot: excerpt from Tam’s “Dear White Protesters” post]

I was so happy to read Tam’s post, because I was in Berkeley last week, too, and the protesters were assembled in front of the police station down the street from where I was staying. I went over there around 6.30 because I wanted to join in, and I was missing the protests in New York because of travel. At that point in the evening, the situation was peaceful. The cops were lined up in front of the police station in riot gear, and people were calmly standing around or sitting on the ground singing, “Which side are you on? Which side are you on?” Later, I hear, the protests got crazy and there were rubber bullets, tear gas, windows smashed with skateboards, and tasers.

When I arrived at the police station, however, people were milling around, getting ready for a ‘die-in.’ They started lying down in the street, and I was not sure what to do. I wanted to support the movement and guessed that I should also lie down. But the protest seemed a bit ‘off.’ I hardly saw a black person there. The sign saying “Fuck the Police” covering the body of a hipster white girl lying in the street felt about as real as when middle-class white people rap along with the 1988 N.W.A. song of the same name. (OK, confession: I do that. But not in public, and not to make a statement.)

Anyway, the whole thing made me feel confused about what I and others were doing there, so I left, feeling that maybe I was just getting old. I felt like I was not doing enough, but I also felt unable to participate in something that seemed somehow false. As I walked over to the BART station to catch a train, I couldn’t help but notice the group of older black homeless men at the park half a block away from the police station. I couldn’t help but think of the black man with a shopping cart whom I witnessed police harassing earlier that week on a suburban side street in Berkeley. None of them were engaged with this student protest. And I couldn’t help but feel awkward for the protesters who, in their zeal to protest, somehow seemed oblivious to their surroundings and their privilege.

It’s possible that later on the protest became different and more diverse, and in that case I will retract these words and feel better, I guess. But I was glad to read Tam’s post. I was having a hard time unpacking my own reactions to the Berkeley protest, and Tam’s analysis illuminated what was wrong. It’s important to have allies in all struggles, but allies need to learn to take a back seat, understand their role, and take the lead from those whose struggle it is.

Tam gives advice on how:

[Screenshot: excerpt from Tam’s post with advice for white allies]

As Franchesca Ramsey also says: “An ally’s job is to support.” Watch her video (below) on how to do that, and read Tam’s full post for some good insight.

Ramsey’s 5 Tips for Being a Good Ally include:

1. Understand your privilege.
2. Listen, do your homework.
3. Speak up, not over.
4. Apologize when you make mistakes and learn from them.
5. Saying you’re an ally is not enough.

Lastly, a few months ago I read this post about Imani Henry and Equality for Flatbush, who organizes people (of all colors) in the community where I live around issues of gentrification, racial tension, and discrimination against black and brown people by law enforcement. Henry says many of the same things (read that whole article too – it’s really insightful).

[Screenshot: excerpt from the article on Imani Henry and Equality for Flatbush]

There are lots of places for white people to listen and learn how to be better allies, and opportunities to put that learning into practice. Understanding our own privilege is a critical task, and it’s hard. These are all lifelong learning pathways, and as Ramsey says, we’ll make mistakes. It’s part of the process of changing and shifting the balance of power to a more just one. It won’t happen overnight, but we shouldn’t give up just because we feel awkward and uncertain.

So go to protests, get involved, know and exercise your rights to dissent and assemble, show solidarity. This movement needs everyone to get on board. Like Fannie Lou Hamer said: ‘Nobody’s free until everybody’s free’. But as white people, we need to think through our participation, join as allies, and avoid making it about us.


Read Full Post »

Today as we jump into the M&E Tech conference in DC (we’ll also have a Deep Dive on the same topic in NYC next week), I’m excited to share a report I’ve been working on for the past year or so with Michael Bamberger: Emerging Opportunities in a Tech-Enabled World.

The past few years have seen dramatic advances in the use of hand-held devices (phones and tablets) for program monitoring and for survey data collection. Progress has been slower with respect to the application of ICT-enabled devices for program evaluation, but this is clearly the next frontier.

In the paper, we review how ICT-enabled technologies are already being applied in program monitoring and in survey research. We also review areas where ICTs are starting to be used in program evaluation and identify new areas where they could potentially be applied. The technologies discussed include hand-held devices for quantitative and qualitative data collection and analysis, data quality control, GPS and mapping devices, environmental monitoring, satellite imaging and big data.

While the technological advances and the rapidly falling costs of data collection and analysis are opening up exciting new opportunities for monitoring and evaluation, the paper also cautions that more attention should be paid to basic quality control questions that evaluators normally ask about representativity of data and selection bias, data quality and construct validity. The ability to use techniques such as crowdsourcing to generate information and feedback from tens of thousands of respondents has so fascinated researchers that concerns about the representativity or quality of the responses have received less attention than is the case with conventional instruments for data collection and analysis.

Some of the challenges include: the potential for selectivity bias in sample design; M&E processes being driven by the requirements of the technology; over-reliance on simple quantitative data; low institutional capacity to introduce ICTs and resistance to change; and issues of privacy.
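To make the representativity concern concrete, here is a minimal sketch (not from the paper) of the kind of basic check an evaluator might run on a crowdsourced respondent pool: comparing the observed share of each group against a known population benchmark, such as census figures. The group labels, benchmark shares, and 10% threshold below are hypothetical placeholders.

```python
# Minimal sketch: flag groups that look under- or over-represented in a crowdsourced sample.
# The benchmark shares and threshold are illustrative assumptions, not real census figures.
from collections import Counter

population_share = {"urban": 0.45, "rural": 0.55}  # assumed benchmark, e.g. from a census

responses = ["urban", "urban", "rural", "urban", "urban", "rural", "urban"]  # toy response data

counts = Counter(responses)
total = sum(counts.values())

for group, expected in population_share.items():
    observed = counts.get(group, 0) / total
    gap = observed - expected
    status = "check for selection bias" if abs(gap) > 0.10 else "ok"
    print(f"{group}: observed {observed:.0%} vs expected {expected:.0%} -> {status}")
```

A simple comparison like this will not correct for bias, but it makes visible, early on, the gap between who responded and who was supposed to be represented.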

None of this is intended to discourage the introduction of these technologies; we fully recognize their huge potential. One of the most exciting areas concerns the promotion of a more equitable society through simple and cost-effective monitoring and evaluation systems that give voice to previously excluded sectors of the target populations and that offer opportunities for promoting gender equality in access to information. The application of these technologies, however, needs to be on a sound methodological footing.

The last section of the paper offers some tips and ideas on how to integrate ICTs into M&E practice and potential pitfalls to avoid. Many of these were drawn from Salons and discussions with practitioners, given that there is little solid documentation or evidence related to the use of ICTs for M&E.

Download the full paper here! 

Read Full Post »

Debate and thinking around data, ethics, and ICTs have been growing and expanding a lot lately, which makes me very happy!

Coming up on May 22 in NYC, the engine room, Hivos, the Berkman Center for Internet and Society, and Kurante (my newish gig) are organizing the latest in a series of events as part of the Responsible Data Forum.

The event will be hosted at ThoughtWorks and it is in-person only. Space is limited, so if you’d like to join us, let us know soon by filling in this form. 

What’s it all about?

This particular Responsible Data Forum event is an effort to map the ethical, legal, privacy and security challenges surrounding the increased use and sharing of data in development programming. The Forum will aim to explore the ways in which these challenges are experienced in project design and implementation, as well as when project data is shared or published in an effort to strengthen accountability. The event will be a collaborative effort to begin developing concrete tools and strategies to address these challenges, which can be further tested and refined with end users at events in Amsterdam and Budapest.

We will explore the responsible data challenges faced by development practitioners in program design and implementation.

Some of the use cases we’ll consider include:

  • projects collecting data from marginalized populations, aspiring to respect a ‘do no harm’ principle while also identifying opportunities for informational empowerment
  • project design staff seeking to understand and manage the lifespan of project data, from collection through maintenance, utilization, and sharing or destruction
  • project staff considering data sharing or joint data collection with government agencies or corporate actors
  • project staff who want to better understand how ICT4D will impact communities
  • projects exploring the potential of popular ICT-related mechanisms, such as hackathons, incubation labs or innovation hubs
  • projects wishing to use development data for research purposes, and crafting responsible ways to use personally identifiable data for academic purposes
  • projects working with children under the age of 18, struggling to balance the need for data to improve programming approaches with demands for higher levels of protection for children

By gathering a significant number of development practitioners grappling with these issues, the Forum aims to pose practical and critical questions about the use of data and ICTs in development programming. Through collaborative sessions and group work, the Forum will identify common pressing issues for which there might be practical and feasible solutions. The Forum will focus on prototyping specific tools and strategies to respond to these challenges.

What will be accomplished?

Some outputs from the event may include:

  • Tools and checklists for managing responsible data challenges for specific project modalities, such as SMS surveys, constructing national databases, or social media scraping and engagement
  • Best practices and ethical controls for data sharing agreements with governments, corporate actors, academia or civil society
  • Strategies for responsible program development
  • Guidelines for data-driven projects dealing with communities with limited representation or access to information
  • Heuristics and frameworks for understanding anonymity and re-identification of large development data sets (a minimal sketch of one such heuristic follows this list)
  • Potential policy interventions to create greater awareness and possibly consider minimum standards
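On the anonymity and re-identification point above, one common heuristic is k-anonymity: before releasing a dataset, check that every combination of quasi-identifiers (fields like district, age band and gender that can be cross-referenced with other sources) appears at least k times. The sketch below is a minimal, hypothetical Python illustration of that check, not an output of the Forum.

```python
# Minimal k-anonymity check: count records per combination of quasi-identifiers.
# Field names, records, and the value of k are hypothetical examples.
from collections import Counter

records = [
    {"district": "A", "age_band": "20-29", "gender": "F"},
    {"district": "A", "age_band": "20-29", "gender": "F"},
    {"district": "B", "age_band": "30-39", "gender": "M"},
]

quasi_identifiers = ("district", "age_band", "gender")
k = 2  # minimum group size required before release

groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
risky = {combo: n for combo, n in groups.items() if n < k}

if risky:
    print(f"{len(risky)} combination(s) fall below k={k}; generalize or suppress before sharing:")
    for combo, n in risky.items():
        print("  ", dict(zip(quasi_identifiers, combo)), "->", n, "record(s)")
else:
    print("Dataset satisfies k-anonymity for the chosen quasi-identifiers.")
```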

Hope to see some of you on the 22nd! Sign up here if you’re interested in attending, and read more about the Responsible Data Forum here.


Read Full Post »

Last week’s Technology Salon New York City touched on ethics in technology for democracy initiatives. We heard from lead discussants Malavika Jayaram, Berkman Center for Internet and Society; Ivan Sigal, Global Voices; and Amilcar Priestley, Afrolatin@ Project. Though the topic was catalyzed by the Associated Press’ article on ‘Zunzuneo’ (a.k.a. ‘Cuban Twitter’) and subsequent discussions in the press and elsewhere, we aimed to cover some of the wider ethical issues encountered by people and organizations who implement technology for democracy programs.

Salons are off-the-record spaces, so no attribution is made in this post, but I’ve summarized the discussion points here:

First up: Zunzuneo

The media misinterpreted much of the Zunzuneo story. Zunzuneo was not a secret mission, according to one Salon participant, as it’s not in the remit of USAID to carry out covert operations. The AP article conflated a number of ideas regarding how USAID works and the contracting mechanisms that were involved in this case, he said. USAID and the Office of Transition Initiatives (OTI) frequently disguise members, organizations, and contractors that work for it on the ground for security reasons. (See USAID’s side of the story here). This may still be an ethical question, but it is not technically “spying.” The project was known within the OTI and development community, but on a ‘need to know’ basis. It was not a ‘fly by night’ operation; it was more a ‘quietly and not very effectively run project.’

There were likely ethics breaches in Zunzuneo, from a legal standpoint. It’s not clear whether the data and phone numbers collected from the Cuban public for the project were obtained in a legal or ethical way. Some reports say they were obtained through a mid-level employee (a “Cuban engineer who had gotten the phone list” according to the AP article). (Note: I spoke separately to someone close to the project who told me that user opt-in/opt-out and other standard privacy protocols were in place). It’s also not entirely clear whether, as the AP states, the user information collected was being categorized into segments who were loyal or disloyal to the Cuban government, information which could put users at risk if found out.

Zunzuneo took place in a broader historical and geo-political context. As one person put it, the project followed Secretary Clinton’s speeches on Internet Freedom. There was a rush to bring technology into the geopolitical space, and ‘the articulation of why technology was important collided with a bureaucratic process in USAID and the State Department (the ‘F process’) that absorbed USAID into the State Department and made development part of the State Department’s broader political agenda.’ This agenda had been in the works for quite some time, and was part of a wider strategy of quietly moving into development spaces and combining development, diplomacy, intelligence and military (defense), the so-called 3 D’s.

Implementers failed to think through good design, ethics and community aspects of the work. In a number of projects of this type, the idea was that if you give people technology, they will somehow create bottom up pressure for political social change. As one person noted, ‘in the Middle East, as a counter example, the tech was there to enable and assist people who had spent 8-10 years building networks. The idea that we can drop tech into a space and an uprising will just happen and it will coincidentally push the US geopolitical agenda is a fantasy.’ Often these kinds of programs start with a strategic communications goal that serves a political end of the US Government. They are designed with the idea that a particular input equals some kind of a specific result down the chain. The problem comes when the people doing the seeding of the ideas and inputs are not familiar with the context they will be operating in. They are injecting inputs into a space that they don’t understand. The bigger ethical question is: Why does this thought process prevail in development? Much of that answer is found in US domestic politics and the ways that initiatives get funded.

Zunzuneo was not a big surprise for Afrolatino organizations. According to one discussant, Afrolatino organizations were not surprised when the Zunzuneo article came out, given the geopolitical history and the ongoing presence of the US in Latin America. Zunzuneo was seen as a 21st Century version of what has been happening for decades. Though it was criticized, it was not seen as particularly detrimental. Furthermore, the Afrolatino community (within the wider Latino community) has had a variety of relationships with the US over time – for example, some Afrolatino groups supported the Contras. Many Afrolatino groups have felt that they were not benefiting overall from the mestizo governments who have held power. In addition, much of Latin America’s younger generation is less tainted by the Cold War mentality, and does not see US involvement in the region as necessarily bad. Programs like Zunzuneo come with a lot of money attached, so often wider concerns about their implications are not in the forefront because organizations need to access funding. Central American and Caribbean countries are only just entering into a phase of deeper analysis of digital citizenship, and views and perceptions on privacy are still being developed.

Perceptions of privacy

There are differences in perception when it comes to privacy and these perceptions are contextual. They vary within and across countries and communities based on age, race, gender, economic levels, comfort with digital devices, political perspective and past history. Some older people, for example, are worried about the privacy violation of having their voice or image recorded, because the voice, image and gaze hold spiritual value and power. These angles of privacy need to be considered as we think through what privacy means in different contexts and adapt our discourse accordingly.

Privacy is hard to explain, as one discussant said: ‘There are not enough dead bodies yet, so it’s hard to get people interested. People get mad when the media gets mad, and until an issue hits the media, it may go unnoticed. It’s very hard to conceptualize the potential harm from lack of privacy. There may be a chilling effect but it’s hard to measure. The digital divide comes in as well, and those with less exposure may have trouble understanding devices and technology. They will then have even greater trouble understanding beyond the device to data doubles, disembodied information and de-anonymization, which are about 7 levels removed from what people can immediately see. Caring a lot about privacy can get you labeled as paranoid or a crazy person in many places.’

Fatalism about privacy can also hamper efforts. In the developing world, many feel that everything is corrupt and inept, and that there is no point in worrying about privacy and security. ‘Nothing ever works anyway, so even if the government wanted to spy on us, they’d screw it up,’ is the feeling. This is often the attitude of human rights workers and others who could be at greatest risk from privacy breaches or data collection, such as that which was reportedly happening within Zunzuneo. Especially among populations and practitioners who have less experience with new technologies and data, this can create large-scale risk.

Intent, action, context and consequences

Good intentions with little attention to privacy vs data collection with a hidden political agenda. Where are the lines when data that are collected for a ‘good cause’ (for example, to improve humanitarian response) might be used for a different purpose that puts vulnerable people at risk? What about data that are collected with less altruistic intentions? What about when the two scenarios overlap? Data might be freely given or collected in an emergency that would be considered a privacy violation in a ‘development’ setting, or the data collection may lead to a privacy violation post-emergency. Often, slapping the ‘obviously good and unarguably positive’ label of ‘Internet freedom’ on something implies that it’s unquestionably positive when it may in fact be part of a political agenda with a misleading label. There is a long history of those with power collecting data that helps them understand and/or control those with less power, as one Salon participant noted, and we need to be cognizant of that when we think about data and privacy.

US Government approaches to political development often take an input/output approach, when, in fact, political development is not the same as health development. ‘In political work, there is no clear and clean epidemiological goal we are trying to reach,’ noted a Salon participant. Political development is often contentious and the targets and approaches are very different than those of health. When a health model and rhetoric is used to work on other development issues, it is misleading. The wholesale adoption of these kinds of disease model approaches leaves people and communities out of the decision making process about their own development. Similarly, the rhetoric of strategic communications and its inclusion into the development agenda came about after the War on Terror, and it is also a poor fit for political development. The rhetoric of ‘opening’ and ‘liberating’ data is similar. These arguments may work well for one kind of issue, but they are not transferable to a political agenda. One Salon participant pointed out the rhetoric of the privatization model also, and explained that a profound yet not often considered implication of the privatization of services is that once a service passes over to the private sector, the Freedom of Information Act (FOIA) does not apply, and citizens and human rights organizations lose FOIA as a tool. Examples included the US prison system and the Blackwater case of several years ago.

It can be confusing for implementers to know what to do, what tools to use, what funding to accept and when it is OK to bring in an outside agenda. Salon participants provided a number of examples where they had to make choices and felt ethics could have been compromised. Is it OK to sign people up on Facebook or Gmail during an ICT and education project, given these companies’ marketing and privacy policies? What about working on aid transparency initiatives in places where human rights work or crime reporting can get people killed or individual philanthropists/donors might be kidnapped or extorted? What about a hackathon where the data and solutions are later given to a government’s civilian-military affairs office? What about telling LGBT youth about a social media site that encourages LGBT youth to connect openly with one another (in light of recent harsh legal penalties against homosexuality)? What about employing a user-centered design approach for a project that will eventually be overlaid on top of a larger platform, system or service that does not pass the privacy litmus test? Is it better to contribute to improving healthcare while knowing that your software system might compromise privacy and autonomy because it sits on top of a biometric system, for example? Participants at the Salon face these ethical dilemmas every day, and as one person noted, ‘I wonder if I am just window dressing something that will look and feel holistic and human-centered, but that will be used to justify decisions down the road that are politically negative or go against my values.’ Participants said they normally rely on their own moral compass, but clearly many Salon participants are wrestling with the potential ethical implications of their actions.

What can we do? Recommendations from Salon participants

Work closely with and listen to local partners, who should be driving the process and decisions. There may be a role for an outside perspective, but the outside perspective should not trump the local one. Inculcate and support local communities to build their own tools, narratives, and projects. Let people set their own agendas. Find ways to facilitate long-term development processes around communities rather than being subject to agendas from the outside.

Consider this to be ICT for Discrimination and think in every instance and every design decision about how to dial down discrimination. Data lead to sorting, and data get lumped into clusters. Find ways during the design process to reduce the discrimination that will come from that sorting and clustering process. The ‘Do no harm’ approach is key. Practitioners and designers should also be wary of the automation of development and the potential for automated decisions to be discriminatory.

Call out hypocrisy. Those of us who sit at Salons or attend global meetings hold tremendous privilege and power as compared to most of the rest of the world. ‘It’s not landless farmers or disenfranchised young black youth in Brazil who get to attend global meetings,’ said one Salon attendee. ‘It’s people like us. We need to be cognizant of the advantage we have as holders of power.’ Here in the US, the participant added, we need to be more aware of what private sector US technology companies are doing to take advantage of and maintain their stronghold in the global market and how the US government is working to allow US corporations to benefit disproportionately from the current Internet governance structure.

Use a rights-based approach to data and privacy to help to frame these issues and situations. Disclosure and consent are sometimes considered extraneous, especially in emergency situations. People think ‘this might be the only time I can get into this disaster or conflict zone, so I’m going to Hoover up as much data as possible without worrying about privacy.’ On the other hand, sometimes organizations are paternalistic and make choices for people about their own privacy. Consent and disclosure are not new issues; they are merely manifested in new ways as new technology changes the game and we cannot guarantee anonymity or privacy any more for research subjects. There is also a difference between information a person actively volunteers and information that is passively collected and used without a person’s knowledge. Framing privacy in a human rights context can help place importance on both processes and outcomes that support people’s rights to control their own data and that increase empowerment.

Create a minimum standard for privacy. Though we may not be able to determine a ceiling for privacy, one Salon participant said we should at least consider a floor or a minimum standard. Actors on the ground will always feel that privacy standards are a luxury because they have little know-how and little funding, so creating and working within an ethical standard should be a mandate from donors. The standard could be established as an M&E criterion.

Establish an ethics checklist to decide on funding sources and create policies and processes that help organizations to better understand how a donor or sub-donor would access and/or use data collected as part of a project or program they are funding. This is not always an easy solution, however, especially for cash-strapped local organizations. In India, for example, organizations are legally restricted from receiving certain types of funding based on government concerns that external agencies are trying to bring in Western democracy and Western values. Local organizations have a hard time getting funding for anti-censorship or free speech efforts. As one person at the Salon said, ‘agencies working on the ground are in a bind because they can’t take money from Google because it’s tainted, they can’t take money from the State Department because it’s imperialism and they can’t take money from local donors because there are none.’

Use encryption and other technology solutions. Given the low levels of understanding and awareness of these tools, more needs to be done so that more organizations learn how to use them, and they need to be made simpler, more accessible and user-friendly. ‘Crypto Parties’ can help get organizations familiar with encryption and privacy, but better outreach is needed so that organizations understand the relevance of encryption and feel welcome in tech-heavy environments.
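To illustrate how low the technical barrier for basic protection of data at rest has become, here is a minimal sketch using the Fernet interface of the widely used Python ‘cryptography’ package to encrypt and decrypt a small file of collected responses. The file name and contents are placeholders, and real deployments still need careful key management, which is usually the harder problem.

```python
# Minimal sketch: encrypt survey responses at rest with the 'cryptography' package
# (pip install cryptography). File names and data are hypothetical examples.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, store the key securely and never alongside the data
fernet = Fernet(key)

plaintext = b"respondent_id,village,response\n001,example-village,yes\n"

with open("responses.enc", "wb") as f:
    f.write(fernet.encrypt(plaintext))

# Later, anyone holding the same key can recover the data:
with open("responses.enc", "rb") as f:
    recovered = fernet.decrypt(f.read())

assert recovered == plaintext
```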

Thanks to participants and lead discussants for the great discussions and to ThoughtWorks for hosting us at their offices!

 If you’d like to attend future Salons, sign up here!

Read Full Post »


As Tom over at Humanosphere wrote a few days ago, there’s a cool initiative happening at Sheffield University that seeks to develop a global research agenda related to the post-2015 sustainable development goals process. (Disclaimer: I’m part of the steering committee.)

ID100: The Hundred Most Important Questions in International Development asks individuals and organizations from across policy, practice and academia to submit questions that address the world’s biggest environmental, political and socioeconomic problems. These will then be shortlisted down to a final set of 100 questions following a debate and voting process with representatives from development organizations and academia who will meet in July 2014. 

The final list of questions will be published as a policy report and in a leading academic journal. More on the consultation methodology here. Similar crowdsourced priority-setting exercises have worked in biodiversity conservation, food security and other areas, and they have been instrumental in framing global research priorities for policy development and implementation.

Anyone can submit up to five questions related to key issues in international development that require more exploration. You are encouraged to involve colleagues in the formulation of these questions.

Please submit your questions by March 25th – and check the submission guidelines before formulating questions. More information on the project can be accessed on the ID100 website. Hashtag: #ID100.

Read Full Post »

This is a cross post from Heather Leson, Community Engagement Director at the Open Knowledge Foundation. The original post appeared here on the School of Data site.

by Heather Leson

What is the currency of change? What can coders (consumers) do with IATI data? How can suppliers deliver the data sets? Last week I had the honour of participating in the Open Data for Development Codeathon and the International Aid Transparency Initiative Technical Advisory Group meetings. IATI’s goal is to make information about aid spending easier to access, use, and understand. It was great that these events were back-to-back to push a big picture view.
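As a small aside on what ‘using IATI data’ can look like in practice: publishers release activity data as XML following the IATI standard, which standard tooling in most languages can read. The sketch below, in Python, pulls a few fields out of a locally saved activity file; the file name is a placeholder, and the element names follow the IATI 2.x activity standard (iati-activity, iati-identifier, title/narrative, recipient-country) as an approximation, not anything produced at the Codeathon.

```python
# Minimal sketch: read basic fields from a locally saved IATI activity XML file.
# 'activities.xml' is a placeholder path; element names approximate the IATI 2.x standard.
import xml.etree.ElementTree as ET

root = ET.parse("activities.xml").getroot()

for activity in root.findall("iati-activity"):
    identifier = activity.findtext("iati-identifier", default="(no identifier)").strip()
    title = activity.findtext("title/narrative", default="(no title)").strip()
    countries = [c.get("code") for c in activity.findall("recipient-country") if c.get("code")]
    print(identifier, "|", title, "| recipient countries:", ", ".join(countries) or "n/a")
```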

My big takeaways included similar themes that I have learned on my open source journey:

You can talk about open data [insert tech or OS project] all you want, but if you don’t have an interactive community (including mentorship programmes), an education strategy, engagement/feedback loops, a translation/localization plan, and a process for people to learn how to contribute, then you build a double-edged barrier: a barrier to entry and a barrier to impact/contributor outputs.

Currency

About the Open Data in Development Codeathon

At the Codeathon close, Mark Surman, Executive Director of Mozilla Foundation, gave us a call to action to make the web. Well, in order to create a world of data makers, I think we should run aid and development processes through this mindset. What is the currency of change? I hear many people talking about theory of change and impact, but I’d like to add ‘currency’. This is not only about money; it is about using the best brainpower and best energy sources to solve real-world problems in smart ways. I think if we heed Mark’s call to action with a “yes, and”, then we can rethink how we approach complex change. Every single industry is suffering from the same issue: how to deal with the influx of supply and demand in information. We need to change how we approach the problem. Combined events like these give a window into tackling problems in a new format. It is not about the next greatest app, but more about asking: how can we learn from the Webmakers and build with each other in our respective fields and networks?

Ease of Delivery

The IATI community/network is very passionate about moving the ball forward on releasing data. During the sessions, it was clear that the attendees see some gaps and are already working to fill them. The new IATI website is set up to grow with a Community component. The feedback from each of the sessions was distilled by the IATI TAG and Civil Society Guidance groups to share with the IATI Secretariat.

In the Open Data in Development, Impact of Open Data in Developing Countries, and CSO Guidance sessions, we discussed some key items about sharing, learning, and using IATI data. Farai Matsika, with the International HIV/AIDS Alliance, was particularly poignant in reminding us of IATI’s CSO purpose – we need to share data with those we serve.

[Image: Country edits IATI]

One of the biggest themes was data ethics. As we rush to ask NGOs and CSOs to release data, what are some of the data pitfalls? Anahi Ayala Iaccuci of Internews and Linda Raftree of Plan International USA both reminded participants that data needs to be anonymized to protect those at risk. Ms. Iaccuci asked that we consider the complex nature of sharing both sides of the open data story – successes and failures. As well, she advised: don’t create trust, but think about who people are trusting. Turning this model around is key to rethinking assumptions. I would add to her point: trust and sharing are currency and will add to the success measures of IATI. If people don’t trust the IATI data, they won’t share and use it.
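As a concrete illustration of that anonymization point, the sketch below drops direct identifiers and replaces them with salted hashes before a dataset leaves the organization. It is a minimal, hypothetical Python example; genuine anonymization also has to weigh quasi-identifiers and re-identification risk, not just names and phone numbers.

```python
# Minimal sketch: pseudonymize records before sharing by hashing direct identifiers.
# Field names and records are hypothetical; the salt must stay secret and never be published.
import hashlib

SALT = b"replace-with-a-long-random-secret"

def pseudonym(value: str) -> str:
    """Return a stable, non-reversible token standing in for a direct identifier."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:12]

raw_records = [
    {"name": "Example Person", "phone": "+000000000", "district": "North", "service_used": "clinic"},
]

shared_records = [
    {
        "respondent": pseudonym(r["name"] + r["phone"]),  # replaces name and phone number
        "district": r["district"],                        # quasi-identifier: still needs review
        "service_used": r["service_used"],
    }
    for r in raw_records
]

print(shared_records)
```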

Anne Crowe of Privacy International frequently asked attendees to consider the ramifications of opening data. It is clear that the IATI TAG does not curate the data that NGOs and CSOs share. Thus it falls on each of these organizations to learn how to be data makers in order to contribute data to IATI. Perhaps organizations need a lead educator and curator to ensure the future success of the IATI process, including quality data.

I think that School of Data and the Partnership for Open Data have a huge part to play with IATI. My colleague Zara Rahman is collecting user feedback for the Open Development Toolkit, and Katelyn Rogers is leading the Open Development mailing list. We collectively want to help people become data makers and consumers to effectively achieve their development goals using open data. This also means tackling the ongoing questions about data quality and data ethics.


Here are some additional resources shared during the IATI meetings.

Read Full Post »

Santa announces IATI commitment

Santa Claus has become the first major private philanthropist to publish to the IATI Registry according to a press release from Bond*.

London, 18th December, 2013

As part of his IATI commitment, Claus is planning to digitize his records over the course of 2014-2016.

As preparations for Christmas reach their high-point, Bond is today announcing that following a long period of engagement, Santa Claus has committed to publish information on his philanthropic activities to the International Aid Transparency Initiative.

Santa Claus – with his global reach, substantial gift-giving programme and enviable brand awareness – has long been a controversial figure in the aid community. His approach to the provision of gifts-in-kind has been the subject of direct criticism by the OECD Development Assistance Committee, who have suggested that Santa’s policy of only delivering presents manufactured in his own grotto in the North Pole constitutes a form of tied aid, and that he could achieve much greater efficiency by providing cash to recipients, or by sourcing his presents within developing countries. Santa’s commitment to publish his activities to the IATI Registry will enable better comparative data to be generated to test these claims.

Santa has been a pioneer in the use of technology, and his logistics capacity is the envy of actors ranging from Coca-Cola to MSF. However, there have been rumours of the use of GM technology in the development of his reindeer-based delivery mechanism. Those looking for insights into Santa’s magic reindeer may be disappointed, however, as this is likely to be excluded under a commercial sensitivity clause in his new Open Information Policy.

Santa’s commitment to publish comes after an organizational Health Check carried out with Bond’s support identified transparency as an area of weakness for Santa. Santa notably scored highly on participation, with his letter-based consultation method being seen as a sector-leading beneficiary feedback mechanism that others could learn from. Santa also scored full marks on “inspiring leadership”, but his lack of a board of trustees creates concerns about governance in his organization.

Other priority areas for improvement include monitoring and evaluation, as Santa’s policy is not to carry out formal reviews of the impact of his gift-giving activities on child wellbeing indicators, which hinders informed decision-making on improving his effectiveness and value-for-money, and limits opportunities for wider learning across the sector.

It is anticipated that publishing data on where Santa’s aid goes will also shed some light on his controversial targeting mechanism. Santa’s approach to distinguishing naughty children from nice children has been considered by many to be too subjective, potentially in breach of principles of equity and non-discrimination and failing to deliver aid where it is needed most.

END

*I received this clever press release from Bond’s Michael O’Donnell, Senior Manager Effectiveness Services, who gave me permission to post. Contact Michael (@modonnell151) at Bond for more information. 

Read Full Post »
