
Posts Tagged ‘privacy’

This is a cross post from Heather Leson, Community Engagement Director at the Open Knowledge Foundation. The original post appeared on the School of Data site.

by Heather Leson

What is the currency of change? What can coders (consumers) do with IATI data? How can suppliers deliver the data sets? Last week I had the honour of participating in the Open Data for Development Codeathon and the International Aid Transparency Initiative Technical Advisory Group meetings. IATI’s goal is to make information about aid spending easier to access, use, and understand. It was great that these events ran back-to-back, encouraging a big-picture view.

My big takeaways included similar themes that I have learned on my open source journey:

You can talk about open data [insert tech or OS project] all you want, but if you don’t have an interactive community (including mentorship programmes), an education strategy, an engagement/feedback plan, a translation/localization plan and a process for people to learn how to contribute, then you build a double-edged barrier: a barrier to entry and a barrier to impact and contributor output.

Currency

About the Open Data in Development Codeathon

At the Codeathon close, Mark Surman, Executive Director of the Mozilla Foundation, gave us a call to action to make the web. Well, in order to create a world of data makers, I think we should run aid and development processes through this mindset. What is the currency of change? I hear many people talking about theory of change and impact, but I’d like to add ‘currency’. This is not only about money; it is about using the best brainpower and best energy sources to solve real-world problems in smart ways. I think if we heed Mark’s call to action with a “yes, and”, then we can rethink how we approach complex change. Every single industry is suffering from the same issue: how to deal with the influx of supply and demand in information. We need to change how we approach the problem. Combined events like these give a window into tackling problems in a new format. It is not about the next greatest app, but more about asking: how can we learn from the Webmakers and build with each other in our respective fields and networks?

Ease of Delivery

The IATI community/network is very passionate about moving the ball forward on releasing data. During the sessions, it was clear that the attendees see some gaps and are already working to fill them. The new IATI website is set up to grow with a Community component. The feedback from each of the sessions was distilled by the IATI TAG and Civil Society Guidance groups to share with the IATI Secretariat.

In the Open Data in Development, Impact of Open Data in Developing Countries, and CSO Guidance sessions, we discussed some key items about sharing, learning, and using IATI data. Farai Matsika, of the International HIV/AIDS Alliance, was particularly poignant in reminding us of IATI’s CSO purpose – we need to share data with those we serve.

Country edits IATI

One of the biggest themes was data ethics. As we rush to ask NGOs and CSOs to release data, what are some of the data pitfalls? Anahi Ayala Iacucci of Internews and Linda Raftree of Plan International USA both reminded participants that data needs to be anonymized to protect those at risk. Ms. Iacucci asked that we consider the complex nature of sharing both sides of the open data story – successes and failures. She also advised: don’t create trust, but think about whom people are already trusting. Turning this model around is key to rethinking assumptions. I would add to her point: trust and sharing are currency and will add to the success measures of IATI. If people don’t trust the IATI data, they won’t share and use it.
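
As a purely illustrative aside (my own sketch, not something presented at the meetings), the kind of anonymization the speakers called for often starts with simple pseudonymization: stripping or hashing direct identifiers and coarsening locations before a dataset is shared. The field names and sample record below are hypothetical, and salted hashing is only a first step, since people can still be re-identified from the remaining fields, but it shows the basic idea.

# Illustrative sketch only: pseudonymizing a hypothetical beneficiary record
# before it is shared. Not a complete anonymization solution.
import hashlib

SALT = "replace-with-a-secret-value"  # kept private, never published with the data

def pseudonymize(record):
    """Return a copy of the record with direct identifiers removed or hashed
    and precise locations coarsened."""
    shared = dict(record)
    # Replace the name with a stable but non-reversible identifier so records
    # can still be linked across datasets without exposing who they describe.
    name_hash = hashlib.sha256((SALT + record["name"]).encode("utf-8")).hexdigest()
    shared["person_id"] = name_hash[:12]
    del shared["name"]
    # Drop fields that identify a person directly.
    shared.pop("phone", None)
    # Coarsen the location: keep the district, drop the village.
    shared.pop("village", None)
    return shared

example = {"name": "A. Person", "phone": "+000 0000 000", "village": "Example Village",
           "district": "Example District", "assistance": "food parcel"}
print(pseudonymize(example))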

Anna Crowe of Privacy International frequently asked attendees to consider the ramifications of opening data. It is clear that the IATI TAG does not curate the data that NGOs and CSOs share. Thus it falls on each of these organizations to learn how to be data makers in order to contribute data to IATI. Perhaps organizations need a lead educator and curator to ensure the future success of the IATI process, including quality data.

I think that School of Data and the Partnership for Open Data have a huge part to play with IATI. My colleague Zara Rahman is collecting user feedback for the Open Development Toolkit, and Katelyn Rogers is leading the Open Development mailing list. We collectively want to help people become data makers and consumers to effectively achieve their development goals using open data. This also means tackling the ongoing questions about data quality and data ethics.


Here are some additional resources shared during the IATI meetings.


This is a guest post from Anna Crowe, Research Officer on the Privacy in the Developing World Project, and Carly Nyst, Head of International Advocacy at Privacy International, a London-based NGO working on issues related to technology and human rights, with a focus on privacy and data protection. Privacy International’s new report, Aiding Surveillance, which covers this topic in greater depth, was released this week.

by Anna Crowe and Carly Nyst


New technologies hold great potential for the developing world, and countless development scholars and practitioners have sung the praises of technology in accelerating development, reducing poverty, spurring innovation and improving accountability and transparency.

Worryingly, however, privacy is presented as a luxury that creates barriers to development, rather than as a key aspect of sustainable development. This perspective needs to change.

Privacy is not a luxury, but a fundamental human right

New technologies are being incorporated into development initiatives and programmes relating to everything from education to health and elections, and in humanitarian initiatives, including crisis response, food delivery and refugee management. But many of the same technologies being deployed in the developing world with lofty claims and high price tags have been extremely controversial in the developed world. Expansive registration systems, identity schemes and databases that collect biometric information, including fingerprints, facial scans, iris information and even DNA, have been proposed, resisted, and sometimes rejected in various countries.

The deployment of surveillance technologies by development actors, foreign aid donors and humanitarian organisations, however, is often conducted in the complete absence of the type of public debate or deliberation that has occurred in developed countries. Development actors rarely consider target populations’ opinions when approving aid programmes. Important strategy documents such as the UN Office for the Coordination of Humanitarian Affairs’ Humanitarianism in the Network Age and the UN High-Level Panel on the Post-2015 Development Agenda’s A New Global Partnership: Eradicate Poverty and Transform Economies through Sustainable Development give little space to the possible impact that adopting new technologies or data analysis techniques could have on individuals’ privacy.

Some of this trend can be attributed to development actors’ systematic failure to recognise the risks to privacy that development initiatives present. However, it also reflects an often unspoken view that the right to privacy must necessarily be sacrificed at the altar of development – that privacy and development are conflicting, mutually exclusive goals.

The assumptions underpinning this view are as follows:

  • that privacy is not important to people in developing countries;
  • that the privacy implications of new technologies are not significant enough to warrant special attention;
  • and that respecting privacy comes at a high cost, endangering the success of development initiatives and creating unnecessary work for development actors.

These assumptions are deeply flawed. While it should go without saying, privacy is a universal right, enshrined in numerous international human rights treaties, and matters to all individuals, including those living in the developing world. The vast majority of developing countries have explicit constitutional requirements to ensure that their policies and practices do not unnecessarily interfere with privacy. The right to privacy guarantees individuals a personal sphere, free from state interference, and the ability to determine who has information about them and how it is used. Privacy is also an “essential requirement for the realization of the right to freedom of expression”. It is not an “optional” right that only those living in the developed world deserve to see protected. To presume otherwise ignores the humanity of individuals living in various parts of the world.

Technologies undoubtedly have the potential to dramatically improve the provision of development and humanitarian aid and to empower populations. However, the privacy implications of many new technologies are significant and are not well understood by many development actors. The expectations that are placed on technologies to solve problems need to be significantly circumscribed, and the potential negative implications of technologies must be assessed before their deployment. Biometric identification systems, for example, may assist in aid disbursement, but if they also wrongly exclude whole categories of people, then the objectives of the original development intervention have not been achieved. Similarly, border surveillance and communications surveillance systems may help a government improve national security, but may also enable the surveillance of human rights defenders, political activists, immigrants and other groups.

Asking humanitarian actors to protect and respect privacy rights must not be distorted as requiring inflexible and impossibly high standards that would derail development initiatives if put into practice. Privacy is not an absolute right and may be limited, but only where limitation is necessary, proportionate and in accordance with law. The crucial aspect is to actually undertake an analysis of the technology and its privacy implications and to do so in a thoughtful and considered manner. For example, if an intervention requires collecting personal data from those receiving aid, the first step should be to ask what information is necessary to collect, rather than just applying a standard approach to each programme. In some cases, this may mean additional work. But this work should be considered in light of the contribution that upholding human rights and the rule of law makes to development and to producing sustainable outcomes. And in some cases, respecting privacy can also mean saving lives, as information falling into the wrong hands could spell tragedy.

A new framing

While there is an increasing recognition among development actors that more attention needs to be paid to privacy, it is not enough to merely ensure that a programme or initiative does not actively harm the right to privacy; instead, development actors should aim to promote rights, including the right to privacy, as an integral part of achieving sustainable development outcomes. Development is not just, or even mostly, about accelerating economic growth. The core of development is building capacity and infrastructure, advancing equality, and supporting democratic societies that protect, respect and fulfill human rights.

The benefits of development and humanitarian assistance can be delivered without unnecessary and disproportionate limitations on the right to privacy. The challenge is to improve access to and understanding of technologies, ensure that policymakers and the laws they adopt respond to the challenges and possibilities of technology, and generate greater public debate to ensure that rights and freedoms are negotiated at a societal level.

Technologies can be built to satisfy both development and privacy.

Download the Aiding Surveillance report.


This post was originally published on the Open Knowledge Foundation blog.

A core theme that the Open Development track covered at September’s Open Knowledge Conference was Ethics and Risk in Open Development. There were more questions than answers in the discussions, summarized below, and the Open Development working group plans to further examine these issues over the coming year.

Informed consent and opting in or out

The ethics of ‘opting in’ and ‘opting out’ are tricky when working with people in communities with fewer resources, lower connectivity, and/or less understanding of privacy and data. Yet project implementers have a responsibility to work to the best of their ability to ensure that participants understand what will happen with their data in general, and what might happen if it is shared openly.

There are some concerns around how these decisions are currently being made and by whom. Can an NGO make the decision to share or open data from/about program participants? Is it OK for an NGO to share ‘beneficiary’ data with the private sector in return for funding to help make a program ‘sustainable’? What liabilities might donors or program implementers face in the future as these issues develop?

Issues related to private vs. public good need further discussion, and there is no one right answer because concepts and definitions of ‘private’ and ‘public’ data change according to context and geography.

Informed participation, informed risk-taking

The ‘do no harm’ principle is applicable in emergency and conflict situations, but is it realistic to apply it to activism? There is concern that organizations implementing programs that rely on newer ICTs and open data are not ensuring that activists have enough information to make an informed choice about their involvement. At the same time, assuming that activists don’t know enough to decide for themselves can come across as paternalistic.

As one participant at OK Con commented, “human rights and accountability work are about changing power relations. Those threatened by power shifts are likely to respond with violence and intimidation. If you are trying to avoid all harm, you will probably not have any impact.” There is also the concept of transformative change: “things get worse before they get better. How do you include that in your prediction of what risks may be involved? There also may be a perception gap in terms of what different people consider harm to be. Whose opinion counts and are we listening? Are the right people involved in the conversations about this?”

A key point is that whoever assumes the risk needs to be involved in assessing that potential risk and deciding what the actions should be — but people also need to be fully informed. With new tools coming into play all the time, can people be truly ‘informed’ and are outsiders who come in with new technologies doing a good enough job of facilitating discussions about possible implications and risk with those who will face the consequences? Are community members and activists themselves included in risk analysis, assumption testing, threat modeling and risk mitigation work? Is there a way to predict the likelihood of harm? For example, can we determine whether releasing ‘x’ data will likely lead to ‘y’ harm happening? How can participants, practitioners and program designers get better at identifying and mitigating risks?

When things get scary…

Even when risk analysis is conducted, it is impossible to predict or foresee every possible way that a program can go wrong during implementation. Then the question becomes what to do when you are in the middle of something that is putting people at risk or leading to extremely negative unintended consequences. Who can you call for help? What do you do when there is no mitigation possible and you need to pull the plug on an effort? Who decides that you’ve reached that point? This is not an issue that exclusively affects programs that use open data, but open data may create new risks with which practitioners, participants and activists have less experience, thus the need to examine it more closely.

Participants felt that there is not enough honest discussion on this aspect. There is a pop culture of ‘admitting failure’ but admitting harm is different because there is a higher sense of liability and distress. “When I’m really scared shitless about what is happening in a project, what do I do?” asked one participant at the OK Con discussion sessions. “When I realize that opening data up has generated a huge potential risk to people who are already vulnerable, where do I go for help?” We tend to share our “cute” failures, not our really dismal ones.

Academia has done some work around research ethics, informed consent, human subjects research and the use of Institutional Review Boards (IRBs). What aspects of this can or should be applied to mobile data gathering, crowdsourcing, open data work and the like? What about when citizens are their own source of information and they voluntarily share data without a clear understanding of what happens to the data, or what the possible implications are?

Do we need to think about updating and modernizing the concept of IRBs? A major issue is that many people who are conducting these kinds of data collection and sharing activities using new ICTs are unaware of research ethics and IRBs and don’t consider what they are doing to be ‘research’. How can we broaden this discussion and engage those who may not be aware of the need to integrate informed consent, risk analysis and privacy awareness into their approaches?

The elephant in the room

Despite our good intentions to do better planning and risk management, one big problem is donors, according to some of the OK Con participants. Do donors require enough risk assessment and mitigation planning in their program proposal designs? Do they allow organizations enough time to develop a well-thought-out and participatory Theory of Change along with a rigorous risk assessment together with program participants? Are funding recipients required to report back on risks and how they played out? As one person put it, “talk about failure is currently more like a ‘cult of failure’ and there is no real learning from it. Systematically we have to report up the chain on money and results and all the good things happening, and no one up at the top really wants to know about the bad things. The most interesting learning doesn’t get back to the donors or permeate across practitioners. We never talk about all the work-arounds and backdoor negotiations that make development work happen. This is a serious systemic issue.”

Greater transparency can actually be a deterrent to talking about some of these complexities, because “the last thing donors want is more complexity as it raises difficult questions.”

Reporting upwards to government representatives in Parliament or Congress leads to a continued aversion to any failures or ‘bad news’. Though funding recipients are urged to be innovative, they still need to hit numeric targets so that the international aid budget can be defended in government spaces. Thus, the message is mixed: “Make sure you are learning and recognizing failure, but please don’t put anything too serious in the final report.” There is awareness that rigid program planning doesn’t work and that we need to be adaptive, yet we are asked to “put it all into a log frame and make sure the government aid person can defend it to their superiors.”

Where to from here?

It was suggested that monitoring and evaluation (M&E) could be used as a tool for examining some of these issues, but M&E needs to be seen as a learning component, not only an accountability one. M&E needs to feed into the choices people are making along the way, and linking it in well during program design may be one way to support a more adaptive and iterative approach. M&E should force practitioners to ask themselves the right questions as they design programs and as they assess them throughout implementation. Theory of Change might help, and an ethics-based approach could be introduced as well to raise these questions about risk and privacy and ensure that they are addressed from the start of an initiative.

Practitioners have also expressed the need for additional resources to help them predict and manage possible risk: case studies, a safe space for sharing concerns during implementation, people who can help when things go pear-shaped, a menu of methodologies, a set of principles or questions to ask during program design, or even an ICT4D Implementation Hotline or a forum for questions and discussion.

These ethical issues around privacy and risk are not exclusive to Open Development. Similar issues were raised last week at the Open Government Partnership Summit sessions on whistle blowing, privacy, and safeguarding civic space, especially in light of the Snowden case. They were also raised at last year’s Technology Salon on Participatory Mapping.

A number of groups are looking more deeply into this area, including the Capture the Ocean Project, The Engine Room, IDRC’s research network, the Open Technology Institute, Privacy International, GSMA, those working on “Big Data,” those in the Internet of Things space, and others.

I’m looking forward to further discussion with the Open Development working group on all of this in the coming months, and will also be putting a little time into mapping out existing initiatives and identifying gaps when it comes to these cross-cutting ethics, power, privacy and risk issues in open development and other ICT-enabled data-heavy initiatives.

Please do share information, projects, research, opinion pieces and more if you have them!


The February 5 Technology Salon in New York City asked “What are the ethics in participatory digital mapping?” Judging by the packed Salon and long waiting list, many of us are struggling with these questions in our work.

Some of the key ethical points raised at the Salon related to the benefits of open data vs privacy and the desire to do no harm. Others were about whether digital maps are an effective tool in participatory community development or if they are mostly an innovation showcase for donors or a backdrop for individual egos to assert their ‘personal coolness’. The absence of research and ethics protocols for some of these new kinds of data gathering and sharing was also an issue of concern for participants.

During the Salon we were only able to scratch the surface, and we hope to get together soon for a more in-depth session (or maybe 2 or 3 sessions – stay tuned!) to further unpack the ethical issues around participatory digital community mapping.

The points raised by discussants and participants included:

1) Showcasing innovation

Is digital mapping really about communities, or are we really just using communities as a backdrop to showcase our own innovation and coolness or that of our donors?

2) Can you do justice to both process and product?

Maps should be less an “in-out tool” and more part of a broader program. External agents should be supporting communities to articulate and to be full partners in saying, doing, and knowing what they want to do with maps. Digital mapping may not be better than hand drawn maps, if we consider that the process of mapping is just as or more important than the final product. Hand drawn maps can allow for important discussions to happen while people draw. This seems to happen much less with the digital mapping process, which is more technical, and it happens even less when outside agents are doing the mapping. A hand drawn map can be imbued with meaning in terms of the size, color or placement of objects or borders. Important meaning may be missed when hand drawn maps are replaced with digital ones.

Digital maps, however, can be printed and further enhanced with comments and drawings and discussed in the community, as some noted. And digital maps can lend a sense of professionalism to community members and help them to make a stronger case to authorities and decision makers. Some participants raised concerns about power relations during mapping processes, and worried that using digital tools could emphasize those.

3) The ethics of wasting people’s time

Community mapping is difficult. The goal of external agents should be to train local people so that they can be owners of the process and sustain it in the long term. This takes time. Often, however, mapping experts are flown in for a week or two to train community members. They leave people with some knowledge, but not enough to fully manage the mapping process and tools. If people end up only half-trained and without local options to continue training, their time has essentially been wasted. In addition, if young people see the training as a pathway to a highly demanded skill set yet are left partially trained and without access to tools and equipment, they will also feel they have wasted their time.

4) Data extraction

When agencies, academics and mappers come in with their clipboards or their GPS units and conduct the same surveys and studies over and over with the same populations, people’s time is also wasted. Open digital community mapping starts from the viewpoint that an open map and open data are one way to make sure that data taken from or created by communities is made available to those communities for their own use, and can be accessed by others so that the same data is not collected repeatedly. Though there are privacy concerns around opening data, there is a counterbalancing ethical dilemma related to how much time gets wasted by keeping data closed.

5) The (missing) link between data and action

Related to the issue of time wasting is the common issue of a missing link between data collected and/or mapped, action and results. Making a map identifying issues is certainly no guarantee that the government will come and take care of those issues. Maps are a means to an end, but often the end is not clear. What do we really hope the data leads to? What does the community hope for? Mapping can be a flashy technology that brings people to the table, but that is no guarantee that something will happen to resolve the issues the map is aimed at solving.

6) Intermediaries are important

One way to ensure that there is a link between data and action is to identify stakeholders that have the ability to use, understand and re-interpret the data. One case was mentioned where health workers collected data and then wanted to know “What do we do now? How does this affect the work that we do? How do we present this information to community health workers in a way that it is useful to our work?” It’s important to tone the data down and make them understandable to the base population, and to also show them in a way that is useful to people working at local institutions. Each audience will need the data to be visualized or shared in a different, contextually appropriate way if they are going to use the data for decision-making. It’s possible to provide the same data in different ways across different platforms from paper to high tech. The challenge of keeping all the data and the different sharing platforms updated, however, is one that can’t be overlooked.

7) What does informed consent actually mean in today’s world?

There is a viewpoint that data must be open and that locking up data is unethical. On the other hand, there are questions about research ethics and protocols when doing mapping projects and sharing or opening data. Are those who do mapping getting informed consent from people to use or open their data? This is the cornerstone of ethics when doing research with human beings. One must be able to explain and be clear about the risks of this data collection, or it is impossible to get truly informed consent. What consent do community mappers need from other community members if they are opening data or information? What about when people are volunteering their information and self-reporting? What does informed consent mean in those cases? And what needs to be done to ensure that consent is truly informed? How can open data and mapping be explained to those who have not used the Internet before? How can we have informed consent if we cannot promise anyone that their data are really secure? Do we have ethics review boards for these new technological ways of gathering data?

8) Not having community data also has ethical implications

It may seem like time wasting, and there may be privacy and protection questions, but there are also ethical implications of not having community data. When tools like satellite remote sensing are used to do slum mapping, for example, data are very dehumanized and can lead to sterile decision-making. The data that come from a community itself can make these maps more human and these decisions more humane. But there is a balance between the human/humanizing side and the need to protect. Standards are needed for bringing in community and/or human data in an anonymized way, because there are ethical implications on both ends.

9) The problem with donors….

Big donors are not asking the tough questions, according to some participants. There is a lack of understanding around the meaning, use and value of the data being collected and the utility of maps. “If the data is crap, you’ll have crap GIS and a crap map. If you are just doing a map to do a map, there’s an issue.” There is great incentive from the donor side to show maps and to demonstrate value, because maps are a great photo op, a great visual. But how to go a level down to make a map really useful? Are the M&E folks raising the bar and asking these hard questions? Often from the funder’s perspective, mapping is seen as something that can be done quickly. “Get the map up and the project is done. Voila! And if you can do it in 3 weeks, even better!”

Some participants felt the need for greater donor awareness of these ethical questions because many of them are directly related to funding issues. As one participant noted, whether you coordinate, whether it’s participatory, whether you communicate and share back the information, whether you can do the right thing with the privacy issue — these all depend on what you can convince a donor to fund. Often it’s faster to reinvent the wheel because doing it the right way – coordinating, learning from past efforts, involving the community — takes more time and money. That’s often the hard constraint on these questions of ethics.

Check this link for some resources on the topic, and add yours to the list.

Many thanks to our lead discussants, Robert Banick from the American Red Cross and Erica Hagen from Ground Truth, and to Population Council for hosting us for this month’s Salon!

The next Technology Salon NYC will be coming up in March. Stay tuned for more information, and if you’d like to receive notifications about future salons, sign up for the mailing list!


policy forum

This past Monday I had the opportunity to join Engineers without Borders (EWB) in Calgary, Canada, at their Annual Policy Forum on Global Development to discuss “How can open government contribute to community and economic development?”

Morning panels covered some examples of open government initiatives from Finland, Ghana and Canada. In the afternoon we heard about some of the challenges with open data, open government and the International Aid Transparency Initiative. Table discussions followed both of the panels. The group was a mix of Canadian and African government representatives, people from organizations and groups working in different countries on open government and open data initiatives, and young people who are connected with EWB. The session was under Chatham House Rule in order to encourage frank conversation.

Drawing from such documents as the Open Government Partnership’s Open Government Declaration, Harlan Yu and David G. Robinson’s “The New Ambiguity of ‘Open Government’,” Beth Noveck’s “What’s in a Name? Open Gov and Good Gov,” and Nathaniel Heller’s “A Working Definition of ‘Open Government’,” the following definition of Open Government was used to frame the discussions.

EWB Definition of Open Government

Below (in a very-much-longer-than-you-are-supposed-to-write-in-a-blogpost summary) are the highlights and points I found interesting and useful as related to Open Development, Open Data, Open Government and the International Aid Transparency Initiative (IATI).

1.  Participation thresholds need to be as low as possible for people to participate and engage in open government or open data initiatives. You need to understand well what engagement tools are most useful or comfortable for different groups. In some places, to engage the public you can use tools such as Etherpad, wiki platforms, Google Docs, open tools and online collaboration spaces. In other places and with other populations, regardless of what country, you may be more successful with face-to-face methods or with traditional media like television and radio, but these need to be enhanced with different types of feedback methods like phone calls or surveys or going house to house so that your information is not only traveling one way. Community organizing skills are key to this work, regardless of whether the tools are digital or not.

2.  Literacy remains a huge challenge hindering access to information and citizen engagement in holding government accountable in many countries. This is why face-to-face engagement is important, as well as radio and more popular or broad-based communication channels. One participant asked “how can you make open government a rural, rather than an urban only, phenomenon?” This question resonated for participants from all countries.

3.  Language is still a critical issue. Language poses a big challenge for these kinds of initiatives, from the grassroots level to the global level, within and among countries, for citizens, governments, and anyone trying to share or collect data or information. It was noted that all the countries that have published data to IATI are publishing in English. All the IATI Standards are in English, as is the entire support system for IATI. As one participant noted, this raises the question of who the information in IATI is actually designed for and serving, and who its expected users are. Open data initiatives should consider the implications of the language they publish in, both politically and practically.

4.  Open data can serve to empower the already empowered. As one speaker noted, “the idea that everyone has the potential to make use of open data is simply not true.” Access to digital infrastructure and educational resources may be missing, meaning that many do not have the ability to access, interpret or use data for their own purposes. Governments can also manipulate data and selectively release data that serves their own interests. Some questioned government motives, citing the example of a government that released “data” saying its unemployment rate was 10% when “everyone knew this to be false, and people grumbled but we did not feel empowered to challenge that statement.” Concern was expressed over the lack of an independent body or commission in some countries to oversee open data and open government processes. Some did not trust the government bodies that were currently in charge of collecting and opening information, saying that due to politics, they would never release any information that made their party or their government look bad.

5.  Privacy rights can be violated if data is opened without data protection laws in place and without efforts to build capacity around how to make sensitive data anonymous. Citizens may also not be aware of which of their rights are being violated, so this should be addressed as well.

6.  Too much open data discussion takes place without a power analysis, as one participant commented, making some of the ideas around open data and open government somewhat naïve. “Those who have the greatest stake will be the most determined to push their point of view and to make sure it prevails.”

7.  Open data needs to become open data 2.0. According to one participant, open data is still mostly one-way information delivery. In some cases there isn’t even any delivery – information is opened on a portal but no one knows it’s there or what it refers to or why it would be useful. When will open data, open government and open aid become more of a dialogue? When will data be released that answers questions that citizens have rather than the government deciding what it will release? The importance of working with community groups to strengthen their capacity to ask questions and build critical consciousness to question the data was emphasized. A counter point was that government is not necessarily there to start collecting information or creating data sets according to public demand. Governments collect certain data to help them function.

8.  Intermediaries working on open government should be careful of real or perceived bias. Non-profits have their own agendas, and ‘open data’ and ‘open information’ is not immune to being interpreted in non-objective ways. Those working on civic engagement initiatives need to be careful that they are not biased in their support for citizen initiatives. One presenter who works on a platform that encourages citizens to be involved in petitioning new laws for contemplation in Parliament said “Our software is open source so that anyone can set up a similar process to compete with us if they feel we are biased towards one or another type of agenda.”

9.  Technology-based engagement tools change who is participating. Whether in Finland, Canada, Ghana or Malawi, it’s critical to think about reaching those who are not active already online, those who are not the typical early adopters. To reach a broader public, one speaker noted “We are going to remote places, doing events in smaller towns and cities to see how people want to influence and take part in this. Making sure the website is accessible and understandable.”

10. Technological platforms are modifying how political parties and democratic processes operate. This may or may not be a good thing. Normally priorities arise and are discussed within political parties. Will people now bypass the party process and use ‘direct democracy’ channels if they are passionate about an issue but do not want to enter into negotiation around it? Will this weaken political processes or longer-standing democratic processes? One speaker considered this change to be positive. People are not happy with being able to vote only every 4 years; they want opportunities to participate between election cycles and a direct voice in how priorities are decided. Others questioned whether bypassing official processes can lead to less participation and more apathy overall on national issues. Some questioned whether, within fairly long-standing democracies, open data will have any real impact, considering existing levels of apathy and the lack of political participation.

11. Strong information, statistical, monitoring and evaluation systems are critical for open data and open government processes and to ensure more effective management of development results. This is still a challenge for some countries that need to review their mechanisms and improve their tools and processes for data collection and dissemination. If there is no data, or no current data, there is not much point in opening it. In addition, there are capacity and technical competency challenges within institutions in some countries. One participant mentioned a lack of current government geological information about gold and oil deposits that weakens government capacity to negotiate with the private sector extraction industry and ensure partnerships and earnings will contribute to national development. In addition more evidence is needed on the impact, use, and outcomes of open data. At the moment it’s quite difficult to say with any real authority what the outcomes and impact of open data and open government have been.

12. IATI (International Aid Transparency Initiative) needs more partners. Government representatives noted that they are opening their data, but they can only open the data they possess. In order for data on aid to be useful, more data is needed, especially that of NGOs who are implementing programs. Not many NGOs have published their information to the IATI standard at this point. “The really interesting thing will be when we can start mashing up and mapping out the different kinds of information,” as one speaker noted, “for example, this is the goal of the Open Aid Partnership. It will involve combining information from the donor, development indicators from the World Bank, and country information, and this will open up amazing possibilities once this is all geo-coded.” There are reporting challenges related to IATI and open government data, however, because at times countries and NGOs do not see the benefits of reporting – it feels like just one more top-down administrative burden. There are also issues with donor governments reporting their committed intentions and amounts, recipient governments reporting back, and communications with citizens on both sides (donor and recipient countries). One example that was reported to be enjoying some success was the multi-donor budget support initiative in Ghana, where development partners and government work together to establish development indicators and commitments. If the government delivers on the indicators, the development partners will then provide them with the funding. Development partners can also earmark funding to particular areas if there is government agreement.

13. We need more accountability towards ‘beneficiaries’. Currently, many of these initiatives are perceived as being focused on donors and donor publics. As one participant noted, “the interesting thing is less about government and more about getting regular people involved in these processes. When you engage the public you’ll engage government leaders in thinking they will need to change to respond to what citizens are asking for.” Another noted that the essential issue is the link between transparency/accountability and citizens and their own governments. In addition, as one participant asked, “How can you strengthen capacity among citizens to ask the right questions about the data that’s being opened?” For example, citizens may ask about the number of schools being built, but not ask about the quality of education being provided. Public education was a strong focus of discussions around citizen engagement during the policy forum.

14. Should citizens be consulted on everything? This was one big question. The public at large may not understand the ramifications of its own deep misunderstandings on particular issues and may be weighing in from a viewpoint that lacks scientific evidence or fact. “It’s one thing to have an opinion about whether your child should be able to drink energy drinks before age 16, it’s another to input about technical programs like the best policy for green energy,” commented one group.

15. Can citizens really have greater participation if government is still in control of the data? This was another big question. An example was given of an open consultative process that became unwieldy for a local government, which then shut down the consultation process and changed the nature of the documents to ‘administrative’ and therefore no longer open. Others asked why governments pat themselves on the back over being part of the Open Government Partnership yet do not have Freedom of Information Acts (FOIAs), or why they prosecute those who open data in alternative ways, such as Bradley Manning and Aaron Swartz.

16. If citizens don’t get a response from government (or if they don’t like the response, or feel it’s biased or manipulated), apathy and cynicism will increase. It’s important to make sure that ‘open government’ is not just a box that gets ticked off, but rather a long-term change in mentality of those in power and deeper expectations and efforts by citizens for openness and participation in conversations of national importance.

The conclusion was that Open Government is somewhat of a paradox, rooted in aims that are not necessarily new. Open Government strives to enable leaders to create change and to transform their own lives and those of people in their communities. It is a complex process that involves many actors and multiple conflicting goals and interests. It’s also something new that we are all learning about and experimenting with, but we are very impatient to know what works and what the impact is. In the room, the feeling was one of ‘radical pragmatism,’ as one participant put it. Open Government is a big idea that represents a big change. It’s something that can transform communities at the global level and there is a great deal of hope and excitement around it. At the same time, we need to acknowledge the challenges associated with it in order to address them and move things forward.

I’ll do a follow-up post with the points I made during the panel, as this post is clearly way too long already. Kudos if you are still reading, and a huge thanks to the organizers and participants in the EWB policy forum.

