
Archive for the ‘open development’ Category


I spent last week in Berlin at the Open Knowledge Festival – a great place to talk ‘open’ everything and catch up on what is happening in this burgeoning area that crosses through the fields of data, science, education, art, transparency and accountability, governance, development, technology and more.

One session was on Power, politics, inclusion and voice, and it encouraged participants to dig deeper into those four aspects of open data and open knowledge. The organizers kicked things off by asking us to get into small groups and talk about power. Our group was assigned the topic of “feeling powerless,” and we shared personal experiences of when we had felt powerless. There were several women in my group, many of whom, unsurprisingly, recounted experiences that felt gendered.

The concept of ‘mansplaining’ came up. Mansplaining (according to Wikipedia) is a term that describes when a man speaks to a woman with the assumption that she knows less than he does about the topic being discussed because she is female. ‘Mansplaining is different from other forms of condescension because mansplaining is rooted in the assumption that, in general, a man is likely to be more knowledgeable than a woman.’

From there, we got into the tokenism we’d seen in development programs that say they want ‘participation’ but don’t really care to include the viewpoints of the participants. One member of our group talked about the feelings of powerlessness that development workers create when they are dismissive of indigenous knowledge and assume that they generally know more than the poor themselves. “Like when they go out and explain climate change to people who have been farming their entire lives,” she said.

A lightbulb went off. It’s the same attitude as ‘mansplaining,’ but seen in development workers. It’s #devsplaining.

So I made a hashtag (of course) and tried to come up with a definition.

Devsplaining – when a development worker, academic, or someone who generally has more power within the ‘development industry’ speaks condescendingly to someone with less power. The devsplainer assumes that he/she knows more and has more right to an opinion because of his/her position and power within the industry. Devsplaining is rooted in the assumption that, in general, development workers are likely to be more knowledgeable about the lives and situations of the people who participate in their programs/research than the people themselves are.

What do people think? Any good examples?

This is a cross-post from Heather Leson, Community Engagement Director at the Open Knowledge Foundation. The original post appeared here on the School of Data site.

by Heather Leson

What is the currency of change? What can coders (consumers) do with IATI data? How can suppliers deliver the data sets? Last week I had the honour of participating in the Open Data for Development Codeathon and the International Aid Transparency Initiative (IATI) Technical Advisory Group meetings. IATI’s goal is to make information about aid spending easier to access, use, and understand. Holding these events back-to-back helped push a big-picture view.
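
To make the coder question concrete: IATI data is published as XML that follows the IATI activities schema. Here is a minimal sketch, in Python, of what ‘consuming’ that data can look like. The file name is a placeholder and the handful of elements pulled out barely scratch the surface of the standard, so treat it as an illustration rather than a recipe.

    import xml.etree.ElementTree as ET

    def activity_title(activity):
        # IATI 2.x wraps text in <narrative>; 1.x puts it directly in <title>.
        title = activity.find("title")
        if title is None:
            return "(no title)"
        narrative = title.find("narrative")
        if narrative is not None and narrative.text:
            return narrative.text.strip()
        return (title.text or "(no title)").strip()

    # "activities.xml" stands in for any publisher's IATI activity file.
    root = ET.parse("activities.xml").getroot()  # <iati-activities>

    for activity in root.findall("iati-activity"):
        identifier = activity.findtext("iati-identifier", default="?").strip()
        countries = [c.get("code") for c in activity.findall("recipient-country")]
        print(identifier, "|", activity_title(activity), "|", countries)

Even a loop this small shows the supplier side of the question too: when publishers deviate from the schema or bury meaning in free text, every consumer ends up writing defensive fallbacks like the one above.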

My big takeaways echoed themes that I have learned on my open source journey:

You can talk about open data [insert tech or OS project] all you want, but if you don’t have an interactive community (including mentorship programmes), an education strategy, an engagement/feedback plan, a translation/localization plan, and a process for people to learn how to contribute, then you build a double-edged barrier: a barrier to entry and a barrier to impact and contributor output.

Currency

About the Open Data for Development Codeathon

At the Codeathon close, Mark Surman, Executive Director of the Mozilla Foundation, gave us a call to action to make the web. Well, in order to create a world of data makers, I think we should run aid and development processes through this mindset. What is the currency of change? I hear many people talking about theory of change and impact, but I’d like to add ‘currency’. This is not only about money; it is about using the best brainpower and best energy sources to solve real-world problems in smart ways. I think if we heed Mark’s call to action with a “yes, and”, then we can rethink how we approach complex change. Every single industry is wrestling with the same issue: how to deal with the flood of information supply and demand. We need to change how we approach the problem. Combined events like these give a window into tackling problems in a new format. It is not about the next greatest app, but about asking: how can we learn from the Webmakers and build with each other in our respective fields and networks?

Ease of Delivery

The IATI community/network is very passionate about moving the ball forward on releasing data. During the sessions, it was clear that the attendees see some gaps and are already working to fill them. The new IATI website is set up to grow with a Community component. The feedback from each of the sessions was distilled by the IATI TAG and Civil Society Guidance groups to share with the IATI Secretariat.

In the Open Data in Development, Impact of Open Data in Developing Countries, and CSO Guidance sessions, we discussed some key items about sharing, learning, and using IATI data. Farai Matsika, with the International HIV/AIDS Alliance, was particularly poignant in reminding us of IATI’s CSO purpose: we need to share data with those we serve.


One of the biggest themes was data ethics. As we rush to ask NGOs and CSOs to release data, what are some of the data pitfalls? Anahi Ayala Iacucci of Internews and Linda Raftree of Plan International USA both reminded participants that data needs to be anonymized to protect those at risk. Ms. Iacucci asked that we consider the complex nature of sharing both sides of the open data story – successes and failures. She also advised: don’t create trust, but think about whom people are already trusting. Turning this model around is key to rethinking assumptions. I would add to her point: trust and sharing are currency and will add to the success measures of IATI. If people don’t trust the IATI data, they won’t share or use it.
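
To make the anonymization point a bit more tangible, here is a minimal sketch of pseudonymizing records before sharing – my illustration, not a method presented in the sessions. Direct identifiers are replaced with salted one-way hashes, and free-text fields are dropped. Note that this is pseudonymization, not full anonymization: combinations of quasi-identifiers such as village and age can still re-identify people, which is exactly why the ethics conversation matters.

    import hashlib
    import secrets

    SALT = secrets.token_hex(16)  # keep this secret; never publish it

    def pseudonym(value):
        # One-way token that stands in for a direct identifier.
        return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:12]

    def prepare_for_sharing(record):
        shared = dict(record)
        shared["id"] = pseudonym(record["name"])
        for field in ("name", "phone", "notes"):  # drop direct identifiers
            shared.pop(field, None)
        return shared

    record = {"name": "Jane Doe", "phone": "555-0100",
              "village": "Example Village", "notes": "sensitive free text"}
    print(prepare_for_sharing(record))  # keeps 'village' plus a hashed 'id'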

Anna Crowe of Privacy International frequently asked attendees to consider the ramifications of opening data. It is clear that the IATI TAG does not curate the data that NGOs and CSOs share. Thus it falls on each of these organizations to learn how to be data makers in order to contribute data to IATI. Perhaps organizations need a lead educator and curator to ensure the future success of the IATI process, including quality data.

I think that School of Data and the Partnership for Open Data have a huge part to play with IATI. My colleague Zara Rahman is collecting user feedback for the Open Development Toolkit, and Katelyn Rogers is leading the Open Development mailing list. We collectively want to help people become data makers and consumers who can effectively achieve their development goals using open data. This also means tackling the ongoing questions about data quality and data ethics.


Here are some additional resources shared during the IATI meetings.


Sometimes a work trip accidentally has a theme, and my recent trip to Cape Town was one of those. I arrived on Thursday, December 5th, to the news that Mandela had passed away. My cab driver was on the phone, telling someone that Friday would be a holiday. He glanced back at me and asked, “Do you know who Nelson Mandela is? He’s passed.” I turned on the television when I got to my hotel and watched for a few hours, but it was already after midnight and so there was not a lot of new content.

The following day I went with a small group to an ecumenical ceremony in the square, but it didn’t yet feel like the news had really hit. I had no idea how to interpret the crowd, the messages, the speakers, the politics. As the news traveled and people began writing about Mandela and his life, I dipped in here and there. The typical conversations happened. Were Mandela and his life going to be sanitized by the mainstream media for political purposes? It was good to see people attempting to show the full man, with all his complexities. It was striking to remember that such a short time ago apartheid was alive and well, and to really think about that, I mean really, really think about it, and to be reminded yet again that social change is not easy, clean, or straightforward. It’s most certainly not a technical problem waiting to be solved with a new device or invention, though clearly international and national political pressure play a huge role.

Mandela and his life became an underlying theme of the conference, as I’m sure was true for much of what was happening around the world. Whether he was directly mentioned or not, his life’s work was present. I participated in sessions on ICTs and open development, ICTs and children, and ICTs and raising critical consciousness. In all of them, the issues of equity and power came up. How can development processes be more open, and is there a role for ICTs there? What world do we want to see in the future? How do we get there? How do we include children and youth so that they are not marginalized? How can we take a critical approach to ourselves and our agendas in development and in ICT4D? Can ICTs play a role in helping people to change existing power structures, achieve more equity and equality, and transform our societies? All these sessions were planned before anyone knew of Mandela’s passing, but talking about the issues in light of the recent news and the renewed presence of his life and work made them feel more real.

Fast forward to the flights home. My first flight was the long one, from Cape Town to Amsterdam. My seatmates were two inexperienced flyers in their late 30s or so. They didn’t know where to put their bags, or that they could not get up to go to the bathroom while the seatbelt sign was on and the flight was taking off. They were tattooed and looked a little rough around the edges. One of them carried a small stuffed cheetah and wore hot pink pumps. I fell fast asleep the minute we took off and woke up an hour before we landed. The woman with the pink pumps started a conversation. Almost immediately she told me that she and her friend were returning from two months in rehab. They were both struggling with addictions to alcohol and sex, she told me. She was originally from Croatia and had lived in Amsterdam for years. She had recently relapsed, and that’s why she went into treatment. She was returning to a safe house now, and it was her daughter’s 10th birthday. She was feeling positive about her life, yet sad that she would spend her daughter’s birthday in a safe house. She had recently revealed her addiction to her boss and received a negative and disempowering response. She was trying to be strong and accept that she was a recovering addict, learning not to feel ashamed, and working on being proud of the fact that she was moving forward. I was struck by her vulnerability and sweetness, and left wondering how she would fare in a world where addiction and mental illness are so buried and stigmatized.

I got on my last flight and checked my Facebook while waiting to take off. My friend Subir had posted that two Supreme Court judges had overruled the Delhi High Court’s decision and upheld the constitutionality of Section 377 – essentially ruling that homosexuality is a crime and throwing India back into the dark ages.


My seatmate on this flight started up a conversation, and I mentioned the India decision. I also told him about some of the different work that I do and the various hats I wear, including my involvement as a board member of ICAAD, the International Center for Advocates Against Discrimination. ICAAD’s work is fascinating because it looks at discrimination that is embedded in law, and at the link between structural and legal discrimination and racial, gender, religious and social discrimination, violence, and hate crimes, including those against religious minorities, immigrants, women, the LGBT community, and people of color.

As we talked, I learned that my seatmate’s mother was a Holocaust survivor and that he was traveling to the US to attend an event in her honor. Her father survived a concentration camp, and she had been hidden and sheltered by different families for many years until the two were finally reunited and moved to the US. She spent years dealing with the psychological impacts of the experience, but now works to help children and youth understand and deal with bigotry and hate, to identify it around them even when it’s not directly aimed at them, and to find ways to stop it. She highlights that it can manifest itself in seemingly small ways, like bullying at school.

This accidental theme of discrimination, violence and hate, whether based on race, poverty, addiction, religious beliefs or sexual orientation, was so alive for me this week. I met and learned more about brave individuals and organizations that stand up in the face of injustice to take action at both the personal and the institutional level, raising critical consciousness to push for the changes that the world needs.

Despite our ‘advanced’ societies, our awareness of history, our facts, our data, our evidence, our literary genius, our ICTs, our innovations, we have very far to go, as I was reminded multiple times. But strong and caring individuals, organized communities, and political will can make a dent in structural discrimination and contribute to a more humane society. More of us, self included, need to re-focus and work harder toward this end.


This is a guest post from Anna Crowe, Research Officer on the Privacy in the Developing World Project, and Carly Nyst, Head of International Advocacy at Privacy International, a London-based NGO working on issues related to technology and human rights, with a focus on privacy and data protection. Privacy International’s new report, Aiding Surveillance, which covers this topic in greater depth, was released this week.

by Anna Crowe and Carly Nyst


New technologies hold great potential for the developing world, and countless development scholars and practitioners have sung the praises of technology in accelerating development, reducing poverty, spurring innovation and improving accountability and transparency.

Worryingly, however, privacy is presented as a luxury that creates barriers to development, rather than as a key aspect of sustainable development. This perspective needs to change.

Privacy is not a luxury, but a fundamental human right

New technologies are being incorporated into development initiatives and programmes relating to everything from education to health and elections, and in humanitarian initiatives, including crisis response, food delivery and refugee management. But many of the same technologies being deployed in the developing world with lofty claims and high price tags have been extremely controversial in the developed world. Expansive registration systems, identity schemes and databases that collect biometric information including fingerprints, facial scans, iris information and even DNA, have been proposed, resisted, and sometimes rejected in various countries.

The deployment of surveillance technologies by development actors, foreign aid donors and humanitarian organisations, however, is often conducted in the complete absence of the type of public debate or deliberation that has occurred in developed countries. Development actors rarely consider target populations’ opinions when approving aid programmes. Important strategy documents such as the UN Office for the Coordination of Humanitarian Affairs’ Humanitarianism in the Network Age and the UN High-Level Panel on the Post-2015 Development Agenda’s A New Global Partnership: Eradicate Poverty and Transform Economies through Sustainable Development give little space to the possible impact that adopting new technologies or data analysis techniques could have on individuals’ privacy.

Some of this trend can be attributed to development actors’ systematic failure to recognise the risks to privacy that development initiatives present. However, it also reflects an often unspoken view that the right to privacy must necessarily be sacrificed at the altar of development – that privacy and development are conflicting, mutually exclusive goals.

The assumptions underpinning this view are as follows:

  • that privacy is not important to people in developing countries;
  • that the privacy implications of new technologies are not significant enough to warrant special attention;
  • and that respecting privacy comes at a high cost, endangering the success of development initiatives and creating unnecessary work for development actors.

These assumptions are deeply flawed. While it should go without saying, privacy is a universal right, enshrined in numerous international human rights treaties, and matters to all individuals, including those living in the developing world. The vast majority of developing countries have explicit constitutional requirements to ensure that their policies and practices do not unnecessarily interfere with privacy. The right to privacy guarantees individuals a personal sphere, free from state interference, and the ability to determine who has information about them and how it is used. Privacy is also an “essential requirement for the realization of the right to freedom of expression”. It is not an “optional” right that only those living in the developed world deserve to see protected. To presume otherwise ignores the humanity of individuals living in various parts of the world.

Technologies undoubtedly have the potential to dramatically improve the provision of development and humanitarian aid and to empower populations. However, the privacy implications of many new technologies are significant and are not well understood by many development actors. The expectations that are placed on technologies to solve problems need to be significantly circumscribed, and the potential negative implications of technologies must be assessed before their deployment. Biometric identification systems, for example, may assist in aid disbursement, but if they also wrongly exclude whole categories of people, then the objectives of the original development intervention have not been achieved. Similarly, border surveillance and communications surveillance systems may help a government improve national security, but may also enable the surveillance of human rights defenders, political activists, immigrants and other groups.

Asking humanitarian actors to protect and respect privacy rights must not be distorted into a demand for inflexible and impossibly high standards that would derail development initiatives if put into practice. Privacy is not an absolute right and may be limited, but only where limitation is necessary, proportionate and in accordance with law. The crucial aspect is to actually undertake an analysis of the technology and its privacy implications, and to do so in a thoughtful and considered manner. For example, if an intervention requires collecting personal data from those receiving aid, the first step should be to ask what information it is actually necessary to collect, rather than just applying a standard approach to each programme. In some cases, this may mean additional work. But this work should be considered in light of the contribution that upholding human rights and the rule of law makes to development and to producing sustainable outcomes. And in some cases, respecting privacy can also mean saving lives, as information falling into the wrong hands could spell tragedy.

A new framing

While there is an increasing recognition among development actors that more attention needs to be paid to privacy, it is not enough to merely ensure that a programme or initiative does not actively harm the right to privacy; instead, development actors should aim to promote rights, including the right to privacy, as an integral part of achieving sustainable development outcomes. Development is not just, or even mostly, about accelerating economic growth. The core of development is building capacity and infrastructure, advancing equality, and supporting democratic societies that protect, respect and fulfill human rights.

The benefits of development and humanitarian assistance can be delivered without unnecessary and disproportionate limitations on the right to privacy. The challenge is to improve access to and understanding of technologies, ensure that policymakers and the laws they adopt respond to the challenges and possibilities of technology, and generate greater public debate to ensure that rights and freedoms are negotiated at a societal level.

Technologies can be built to satisfy both development and privacy.

Download the Aiding Surveillance report.


This post was originally published on the Open Knowledge Foundation blog

A core theme that the Open Development track covered at September’s Open Knowledge Conference was Ethics and Risk in Open Development. There were more questions than answers in the discussions, summarized below, and the Open Development working group plans to further examine these issues over the coming year.

Informed consent and opting in or out

Ethics around ‘opt in’ and ‘opt out’ when working with people in communities with fewer resources, lower connectivity, and/or less understanding of privacy and data are tricky. Yet project implementers have a responsibility to work to the best of their ability to ensure that participants understand what will happen with their data in general, and what might happen if it is shared openly.

There are some concerns around how these decisions are currently being made and by whom. Can an NGO make the decision to share or open data from/about program participants? Is it OK for an NGO to share ‘beneficiary’ data with the private sector in return for funding to help make a program ‘sustainable’? What liabilities might donors or program implementers face in the future as these issues develop?
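
As a small, concrete illustration of what ‘opt in’ can mean in practice – my sketch, not something proposed in the discussions – consent can be recorded as data that travels with each record, so that any export for open publication filters on it by default:

    from dataclasses import dataclass

    @dataclass
    class Record:
        village: str
        crop_yield: float
        consented_to_open_publication: bool  # captured at collection time

    def openly_publishable(records):
        # Only records whose subjects opted in are eligible for release;
        # a real project would also strip or aggregate quasi-identifiers.
        return [r for r in records if r.consented_to_open_publication]

    records = [
        Record("Example Village A", 1.2, True),
        Record("Example Village B", 0.8, False),  # opted out
    ]
    print(openly_publishable(records))

A flag like this does not settle who is entitled to grant consent or how informed it really is, but it does force the question to be asked at collection time rather than at publication time.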

Issues related to private vs. public good need further discussion, and there is no one right answer because concepts and definitions of ‘private’ and ‘public’ data change according to context and geography.

Informed participation, informed risk-taking

The ‘do no harm’ principle is applicable in emergency and conflict situations, but is it realistic to apply it to activism? There is concern that organizations implementing programs that rely on newer ICTs and open data are not ensuring that activists have enough information to make an informed choice about their involvement. At the same time, assuming that activists don’t know enough to decide for themselves can come across as paternalistic.

As one participant at OK Con commented, “human rights and accountability work are about changing power relations. Those threatened by power shifts are likely to respond with violence and intimidation. If you are trying to avoid all harm, you will probably not have any impact.” There is also the concept of transformative change: “things get worse before they get better. How do you include that in your prediction of what risks may be involved? There also may be a perception gap in terms of what different people consider harm to be. Whose opinion counts and are we listening? Are the right people involved in the conversations about this?”

A key point is that whoever assumes the risk needs to be involved in assessing that potential risk and deciding what the actions should be — but people also need to be fully informed. With new tools coming into play all the time, can people be truly ‘informed’ and are outsiders who come in with new technologies doing a good enough job of facilitating discussions about possible implications and risk with those who will face the consequences? Are community members and activists themselves included in risk analysis, assumption testing, threat modeling and risk mitigation work? Is there a way to predict the likelihood of harm? For example, can we determine whether releasing ‘x’ data will likely lead to ‘y’ harm happening? How can participants, practitioners and program designers get better at identifying and mitigating risks?

When things get scary…

Even when risk analysis is conducted, it is impossible to predict or foresee every possible way that a program can go wrong during implementation. Then the question becomes what to do when you are in the middle of something that is putting people at risk or leading to extremely negative unintended consequences. Who can you call for help? What do you do when there is no mitigation possible and you need to pull the plug on an effort? Who decides that you’ve reached that point? This is not an issue that exclusively affects programs that use open data, but open data may create new risks with which practitioners, participants and activists have less experience, thus the need to examine it more closely.

Participants felt that there is not enough honest discussion on this aspect. There is a pop culture of ‘admitting failure’ but admitting harm is different because there is a higher sense of liability and distress. “When I’m really scared shitless about what is happening in a project, what do I do?” asked one participant at the OK Con discussion sessions. “When I realize that opening data up has generated a huge potential risk to people who are already vulnerable, where do I go for help?” We tend to share our “cute” failures, not our really dismal ones.

Academia has done some work around research ethics, informed consent, human subject research and the use of Institutional Review Boards (IRBs). What aspects of this can or should be applied to mobile data gathering, crowdsourcing, open data work and the like? What about when citizens are their own source of information and they voluntarily share data without a clear understanding of what happens to the data, or what the possible implications are?

Do we need to think about updating and modernizing the concept of IRBs? A major issue is that many people who are conducting these kinds of data collection and sharing activities using new ICTs are unaware of research ethics and IRBs and don’t consider what they are doing to be ‘research’. How can we broaden this discussion and engage those who may not be aware of the need to integrate informed consent, risk analysis and privacy awareness into their approaches?

The elephant in the room

Despite our good intentions to do better planning and risk management, one big problem is donors, according to some of the OK Con participants. Do donors require enough risk assessment and mitigation planning in their program proposal designs? Do they allow organizations enough time to develop a well-thought-out and participatory Theory of Change along with a rigorous risk assessment together with program participants? Are funding recipients required to report back on risks and how they played out? As one person put it, “talk about failure is currently more like a ‘cult of failure’ and there is no real learning from it. Systematically we have to report up the chain on money and results and all the good things happening, and no one up at the top really wants to know about the bad things. The most interesting learning doesn’t get back to the donors or permeate across practitioners. We never talk about all the work-arounds and backdoor negotiations that make development work happen. This is a serious systemic issue.”

Greater transparency can actually be a deterrent to talking about some of these complexities, because “the last thing donors want is more complexity as it raises difficult questions.”

Reporting upwards to government representatives in Parliament or Congress leads to continued aversion to any failures or ‘bad news’. Though funding recipients are urged to be innovative, they still need to hit numeric targets so that the international aid budget can be defended in government spaces. Thus, the message is mixed: “Make sure you are learning and recognizing failure, but please don’t put anything too serious in the final report.” There is awareness that rigid program planning doesn’t work and that we need to be adaptive, yet we are asked to “put it all into a log frame and make sure the government aid person can defend it to their superiors.”

Where to from here?

It was suggested that monitoring and evaluation (M&E) could be used as a tool for examining some of these issues, but M&E needs to be seen as a learning component, not only an accountability one. M&E needs to feed into the choices people are making along the way, and linking it in well during program design may be one way to build in a more adaptive and iterative approach. M&E should force practitioners to ask themselves the right questions as they design programs and as they assess them throughout implementation. Theory of Change might help, and an ethics-based approach could be introduced as well to raise these questions about risk and privacy and to ensure that they are addressed from the start of an initiative.

Practitioners have also expressed the need for additional resources to help them predict and manage possible risk: case studies, a safe space for sharing concerns during implementation, people who can help when things go pear-shaped, a menu of methodologies, a set of principles or questions to ask during program design, or even an ICT4D Implementation Hotline or a forum for questions and discussion.

These ethical issues around privacy and risk are not exclusive to Open Development. Similar issues were raised last week at the Open Government Partnership Summit sessions on whistleblowing, privacy, and safeguarding civic space, especially in light of the Snowden case. They were also raised at last year’s Technology Salon on Participatory Mapping.

A number of groups are looking more deeply into this area, including the Capture the Ocean Project, The Engine Room, IDRC’s research network, the Open Technology Institute, Privacy International, GSMA, those working on “Big Data,” those in the Internet of Things space, and others.

I’m looking forward to further discussion with the Open Development working group on all of this in the coming months, and will also be putting a little time into mapping out existing initiatives and identifying gaps when it comes to these cross-cutting ethics, power, privacy and risk issues in open development and other ICT-enabled data-heavy initiatives.

Please do share information, projects, research, opinion pieces and more if you have them!


This is a cross-post by Duncan Edwards from the Institute of Development Studies. Duncan and I collaborated on some sessions for the Open Development stream at September’s Open Knowledge Conference, and we are working on a few posts to sum up what we discussed there and highlight some lingering thoughts on open development and open data. This post was originally published on the Open Knowledge Foundation blog on October 21, 2013.

by Duncan Edwards

I’ve had a lingering feeling of unease that things were not quite right in the world of open development and ICT4D (Information and communication technology for development), so at September’s Open Knowledge Conference in Geneva I took advantage of the presence of some of the world’s top practitioners in these two areas to explore the question: How does “openness” really effect change within development?

Inspiration for the session came from a number of conversations I’ve had over the last few years. My co-conspirator/co-organiser of the OKCon side event “Reality check: Ethics and Risk in Open Development,” Linda Raftree, had also been feeling uncomfortable with the framing of many open development projects, assumptions being made about how “openness + ICTs = development outcomes,” and a concern that risks and privacy were not being adequately considered. We had been wondering whether the claims made by Open Development enthusiasts were substantiated by any demonstrable impact. For some reason, as soon as you introduce the words “open data” and “ICT,” good practice in development gets thrown out the window in the excitement to reach “the solution”.

A common narrative in many “open” development projects goes along the lines of “provide access to data/information –> some magic occurs –> we see positive change.” In essence, because of the newness of this field, we only know what we THINK happens, we don’t know what REALLY happens because there is a paucity of documentation and evidence.

It’s problematic that we often use the terms data, information, and knowledge interchangeably, because:

  • Data is NOT knowledge.
  • Data is NOT information.
  • Information is NOT knowledge.

Knowledge IS what you know. It’s the result of information you’ve consumed, your education, your culture, beliefs, religion, experience – it’s intertwined with the society within which you live.

Data cake metaphor developed by Mark Johnstone.

Understanding and thinking through how we get from the ‘openness’ of data to how this affects how and what people think, and consequently how they MIGHT act, is critical to whether ‘open’ actually has any additional impact.

At Wednesday’s session, panellist Matthew Smith from the International Development Research Centre (IDRC) talked about the commonalities across various open initiatives. Matthew argued that a larger Theory of Change (ToC) around how ‘open’ leads to change on a number of levels could allow practitioners to draw out common points. The basic theory we see in open initiatives is “put information out, get a feedback loop going, see change happen.” But open development can be sliced in many ways, and we tend to work in silos when talking about openness. We have open educational resources, open data, open government, open science, etc. We apply ideas and theories of openness in a number of domains but we are not learning across these domains.

We explored the theories of change underpinning two active programmes that incorporate a certain amount of “openness” in their logic. Simon Colmer from the Knowledge Services department at the Institute of Development Studies outlined his department’s theory of change of how research evidence can help support decision-making in development policy-making and practice. Erik Nijland from HIVOS presented elements of the theory of change that underpins the Making All Voices Count programme, which looks to increase the links between citizens and governments to improve public services and deepen democracy. Both of these ToCs assume that because data/information is accessible, people will use it within their decision-making processes.

They also both assume that intermediaries play a critical role in analysis, translation, interpretation, and contextualisation of data and information to ensure that decision makers (whether citizens, policy actors, or development practitioners) are able to make use of it. Although access is theoretically open, in practice even mediated access is not equal – so how might this play out in respect to marginalised communities and individuals?

What neither ToC really does is unpack who these intermediaries are. What are their politics? What are their drivers for mediating data and information? What is the effect of this? A common assumption is that intermediaries are somehow neutral and unbiased – does this assumption really hold true?

What many open data initiatives do not consider is what happens after people are able to access and internalise open data and information. How do people act once they know something? As Vanessa Herringshaw from the Transparency and Accountability Initiative said in the “Raising the Bar for ambition and quality in OGP” session, “We know what transparency should look like but things are a lot less clear on the accountability end of things”.

There are a lot of unanswered questions. Do citizens have the agency to take action? Who holds power? What kind of action is appropriate or desirable? Who is listening? And if they are listening, do they care?

Linda finished up the panel by raising some questions around the assumptions that people make decisions based on information rather than on emotion, and that there is a homogeneous “public” or “community” that is waiting for data/information upon which to base their opinions and actions.

So as a final thought, here’s my (perhaps clumsy) 2013 update on Gil Scott-Heron’s 1970 song “The Revolution Will Not Be Televised”:

“The revolution will NOT be in Open data,
It will NOT be in hackathons, data dives, and mobile apps,
It will NOT be broadcast on Facebook, Twitter, and YouTube,
It will NOT be live-streamed, podcast, and available on catch-up
The revolution will not be televised”

Scott-Heron’s point, which holds true today, was that “the revolution,” or change, starts in the head. We need to think carefully about how we get far beyond access to data.

Look out for a second post coming soon on Theories of Change in Open, and a third post on ethics and risk in open data and open development.

And if you’re interested in joining the conversation, sign up to our Open Development mailing list.


Bruce Lee explains why many open data and technology-led initiatives go wrong.

(See minute 1:14.)


