
Archive for the ‘politics’ Category

Last week’s Technology Salon New York City touched on ethics in technology for democracy initiatives. We heard from lead discussants Malavika Jayaram, Berkman Center for Internet and Society; Ivan Sigal, Global Voices; and Amilcar Priestley, Afrolatin@ Project. Though the topic was catalyzed by the Associated Press’ article on ‘Zunzuneo’ (a.k.a. ‘Cuban Twitter’) and subsequent discussions in the press and elsewhere, we aimed to cover some of the wider ethical issues encountered by people and organizations who implement technology for democracy programs.

Salons are off-the-record spaces, so no attribution is made in this post, but I’ve summarized the discussion points here:

First up: Zunzuneo

The media misinterpreted much of the Zunzuneo story. Zunzuneo was not a secret mission, according to one Salon participant, as it’s not in USAID’s remit to carry out covert operations. The AP article conflated a number of ideas regarding how USAID works and the contracting mechanisms that were involved in this case, he said. USAID and the Office of Transition Initiatives (OTI) frequently disguise the members, organizations, and contractors that work for them on the ground, for security reasons. (See USAID’s side of the story here.) This may still raise ethical questions, but it is not technically “spying.” The project was known within the OTI and development community, but on a ‘need to know’ basis. It was not a ‘fly by night’ operation; it was more a ‘quietly and not very effectively run project.’

There were likely ethics breaches in Zunzuneo, and possibly legal ones. It’s not clear whether the data and phone numbers collected from the Cuban public for the project were obtained in a legal or ethical way. Some reports say they were obtained through a mid-level employee (a “Cuban engineer who had gotten the phone list,” according to the AP article). (Note: I spoke separately to someone close to the project who told me that user opt-in/opt-out and other standard privacy protocols were in place.) It’s also not entirely clear whether, as the AP states, the user information collected was being categorized into segments loyal or disloyal to the Cuban government, information that could put users at risk if discovered.
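For readers unfamiliar with what ‘standard privacy protocols’ mean for an SMS service, keyword-based opt-in/opt-out handling is the core of it. Below is a minimal, hypothetical sketch; the keywords, replies and storage are my own illustrative assumptions, not a description of how Zunzuneo actually worked.

```python
# Hypothetical keyword-based opt-in/opt-out for an SMS list.
subscribers: set[str] = set()

def handle_inbound_sms(sender: str, body: str) -> str:
    """Return the reply to send for one inbound message."""
    text = body.strip().upper()
    if text == "JOIN":
        subscribers.add(sender)
        return "You are subscribed. Reply STOP at any time to leave."
    if text == "STOP":
        subscribers.discard(sender)
        return "You are unsubscribed and your number will be deleted."
    if sender not in subscribers:
        # Never message or retain numbers that have not opted in.
        return "Reply JOIN to subscribe."
    return "Thanks for your message."

print(handle_inbound_sms("+5355512345", "join"))  # -> subscription confirmed
```

The ethical question in the AP story sits one layer below this: whether the numbers were acquired legitimately in the first place, before any opt-in was ever offered.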

Zunzuneo took place in a broader historical and geopolitical context. As one person put it, the project followed Secretary Clinton’s speeches on Internet Freedom. There was a rush to bring technology into the geopolitical space, and ‘the articulation of why technology was important collided with a bureaucratic process in USAID and the State Department (the ‘F process’) that absorbed USAID into the State Department and made development part of the State Department’s broader political agenda.’ This agenda had been in the works for quite some time, and was part of a wider strategy of quietly moving into development spaces and combining development, diplomacy, intelligence and military (defense), the so-called 3Ds.

Implementers failed to think through good design, ethics and the community aspects of the work. In a number of projects of this type, the idea was that if you give people technology, they will somehow create bottom-up pressure for political and social change. As one person noted, ‘in the Middle East, as a counter-example, the tech was there to enable and assist people who had spent 8-10 years building networks. The idea that we can drop tech into a space and an uprising will just happen, and that it will coincidentally push the US geopolitical agenda, is a fantasy.’ Often these kinds of programs start with a strategic communications goal that serves a political end of the US Government. They are designed with the idea that a particular input equals some kind of specific result down the chain. The problem comes when the people seeding the ideas and inputs are not familiar with the context they will be operating in. They are injecting inputs into a space that they don’t understand. The bigger ethical question is: why does this thought process prevail in development? Much of the answer is found in US domestic politics and the ways that initiatives get funded.

Zunzuneo was not a big surprise to Afrolatino organizations. According to one discussant, Afrolatino organizations were not surprised when the Zunzuneo article came out, given the geopolitical history and the ongoing presence of the US in Latin America. Zunzuneo was seen as a 21st-century version of what has been happening for decades. Though it was criticized, it was not seen as particularly detrimental. Furthermore, the Afrolatino community (within the wider Latino community) has had a variety of relationships with the US over time – for example, some Afrolatino groups supported the Contras. Many Afrolatino groups have felt that they were not benefiting overall from the mestizo governments who have held power. In addition, much of Latin America’s younger generation is less tainted by the Cold War mentality, and does not see US involvement in the region as necessarily bad. Programs like Zunzuneo come with a lot of money attached, so wider concerns about their implications are often not at the forefront, because organizations need to access funding. Central American and Caribbean countries are only just entering a phase of deeper analysis of digital citizenship, and views and perceptions on privacy are still being developed.

Perceptions of privacy

There are differences in perception when it comes to privacy and these perceptions are contextual. They vary within and across countries and communities based on age, race, gender, economic levels, comfort with digital devices, political perspective and past history. Some older people, for example, are worried about the privacy violation of having their voice or image recorded, because the voice, image and gaze hold spiritual value and power. These angles of privacy need to be considered as we think through what privacy means in different contexts and adapt our discourse accordingly.

Privacy is hard to explain, as one discussant said: ‘There are not enough dead bodies yet, so it’s hard to get people interested. People get mad when the media gets mad, and until an issue hits the media, it may go unnoticed. It’s very hard to conceptualize the potential harm from lack of privacy. There may be a chilling effect but it’s hard to measure. The digital divide comes in as well, and those with less exposure may have trouble understanding devices and technology. They will then have even greater trouble understanding beyond the device to data doubles, disembodied information and de-anonymization, which are about 7 levels removed from what people can immediately see. Caring a lot about privacy can get you labeled as paranoid or a crazy person in many places.’

Fatalism about privacy can also hamper efforts. In the developing world, many feel that everything is corrupt and inept, and that there is no point in worrying about privacy and security. ‘Nothing ever works anyway, so even if the government wanted to spy on us, they’d screw it up,’ is the feeling. This is often the attitude of human rights workers and others who could be at greatest risk from privacy breaches or data collection, such as that which was reportedly happening within Zunzuneo. Especially among populations and practitioners who have less experience with new technologies and data, this can create large-scale risk.

Intent, action, context and consequences

Good intentions with little attention to privacy vs. data collection with a hidden political agenda. Where are the lines when data that are collected for a ‘good cause’ (for example, to improve humanitarian response) might be used for a different purpose that puts vulnerable people at risk? What about data that are collected with less altruistic intentions? What about when the two scenarios overlap? Data might be freely given or collected in an emergency that would be considered a privacy violation in a ‘development’ setting, or the data collection may lead to a privacy violation post-emergency. Often, slapping the ‘obviously good’ label of ‘Internet freedom’ on something implies that it’s unquestionably positive, when it may in fact be part of a political agenda wearing a misleading label. There is a long history of those with power collecting data that helps them understand and/or control those with less power, as one Salon participant noted, and we need to be cognizant of that when we think about data and privacy.

US Government approaches to political development often take an input/output approach when, in fact, political development is not the same as health development. ‘In political work, there is no clear and clean epidemiological goal we are trying to reach,’ noted a Salon participant. Political development is often contentious, and its targets and approaches are very different from those of health. When a health model and its rhetoric are used to work on other development issues, it is misleading. The wholesale adoption of these kinds of disease-model approaches leaves people and communities out of the decision-making process about their own development. Similarly, the rhetoric of strategic communications and its inclusion in the development agenda came about after the War on Terror, and it is also a poor fit for political development. The rhetoric of ‘opening’ and ‘liberating’ data is similar. These arguments may work well for one kind of issue, but they are not transferable to a political agenda. One Salon participant also pointed to the rhetoric of the privatization model, explaining that a profound yet seldom considered implication of the privatization of services is that once a service passes over to the private sector, the Freedom of Information Act (FOIA) no longer applies, and citizens and human rights organizations lose FOIA as a tool. Examples included the US prison system and the Blackwater case of several years ago.

It can be confusing for implementers to know what to do, what tools to use, what funding to accept and when it is OK to bring in an outside agenda. Salon participants provided a number of examples where they had to make choices and felt ethics could have been compromised. Is it OK to sign people up on Facebook or Gmail during an ICT and education project, given these companies’ marketing and privacy policies? What about working on aid transparency initiatives in places where human rights work or crime reporting can get people killed, or where individual philanthropists/donors might be kidnapped or extorted? What about a hackathon where the data and solutions are later given to a government’s civilian-military affairs office? What about telling LGBT youth about a social media site that encourages them to connect openly with one another, in light of recent harsh legal penalties against homosexuality? What about employing a user-centered design approach for a project that will eventually be overlaid on top of a larger platform, system or service that does not pass the privacy litmus test? Is it better to contribute to improving healthcare while knowing that your software system might compromise privacy and autonomy because it sits on top of a biometric system, for example? Participants at the Salon face these ethical dilemmas every day, and as one person noted, ‘I wonder if I am just window dressing something that will look and feel holistic and human-centered, but that will be used to justify decisions down the road that are politically negative or go against my values.’ Participants said they normally rely on their own moral compass, but clearly many are wrestling with the potential ethical implications of their actions.

What can we do? Recommendations from Salon participants

Work closely with and listen to local partners, who should be driving the process and decisions. There may be a role for an outside perspective, but the outside perspective should not trump the local one. Encourage and support local communities in building their own tools, narratives, and projects. Let people set their own agendas. Find ways to facilitate long-term development processes built around communities, rather than subjecting communities to agendas from the outside.

Consider this to be ICT for Discrimination and think in every instance and every design decision about how to dial down discrimination. Data lead to sorting, and data get lumped into clusters. Find ways during the design process to reduce the discrimination that will come from that sorting and clustering process. The ‘Do no harm’ approach is key. Practitioners and designers should also be wary of the automation of development and the potential for automated decisions to be discriminatory.
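To make this concrete, the sketch below, with invented column names and data, shows the kind of audit a design team could run: cluster on apparently neutral features, then check whether a sensitive attribute ends up concentrated in particular clusters anyway.

```python
# A minimal sketch of a cluster-composition audit (pandas + scikit-learn).
# All column names and values are invented for illustration.
import pandas as pd
from sklearn.cluster import KMeans

df = pd.DataFrame({
    "age":    [16, 17, 19, 22, 24, 30, 31, 40],
    "income": [0, 0, 120, 300, 310, 800, 820, 900],
    "gender": ["f", "f", "m", "f", "m", "m", "f", "m"],
})

# Cluster only on the non-sensitive features...
features = df[["age", "income"]]
df["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

# ...then audit: if one cluster is dominated by one gender, any decision keyed
# to clusters will discriminate anyway, via features correlated with gender.
print(pd.crosstab(df["cluster"], df["gender"], normalize="index"))
```

Leaving the sensitive column out of the model is not enough on its own; the audit step is what reveals this kind of proxy discrimination.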

Call out hypocrisy. Those of us who sit at Salons or attend global meetings hold tremendous privilege and power as compared to most of the rest of the world. ‘It’s not landless farmers or disenfranchised black youth in Brazil who get to attend global meetings,’ said one Salon attendee. ‘It’s people like us. We need to be cognizant of the advantage we have as holders of power.’ Here in the US, the participant added, we need to be more aware of what private sector US technology companies are doing to take advantage of and maintain their stronghold in the global market, and how the US government is working to allow US corporations to benefit disproportionately from the current Internet governance structure.

Use a rights-based approach to data and privacy to help to frame these issues and situations. Disclosure and consent are sometimes considered extraneous, especially in emergency situations. People think ‘this might be the only time I can get into this disaster or conflict zone, so I’m going to Hoover up as much data as possible without worrying about privacy.’ On the other hand, sometimes organizations are paternalistic and make choices for people about their own privacy. Consent and disclosure are not new issues; they are merely manifested in new ways as new technology changes the game and we cannot guarantee anonymity or privacy any more for research subjects. There is also a difference between information a person actively volunteers and information that is passively collected and used without a person’s knowledge. Framing privacy in a human rights context can help place importance on both processes and outcomes that support people’s rights to control their own data and that increase empowerment.

Create a minimum standard for privacy. Though we may not be able to determine a ceiling for privacy, one Salon participant said we should at least consider a floor or a minimum standard. Actors on the ground will always feel that privacy standards are a luxury because they have little know-how and little funding, so creating and working within an ethical standard should be a mandate from donors. The standard could be established as an M&E criterion.

Establish an ethics checklist to decide on funding sources and create policies and processes that help organizations to better understand how a donor or sub-donor would access and/or use data collected as part of a project or program they are funding. This is not always an easy solution, however, especially for cash-strapped local organizations. In India, for example, organizations are legally restricted from receiving certain types of funding based on government concerns that external agencies are trying to bring in Western democracy and Western values. Local organizations have a hard time getting funding for anti-censorship or free speech efforts. As one person at the Salon said, ‘agencies working on the ground are in a bind because they can’t take money from Google because it’s tainted, they can’t take money from the State Department because it’s imperialism and they can’t take money from local donors because there are none.’
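One way to make such a checklist operational is to encode it as data, so that every funding decision leaves an auditable record. A hypothetical sketch follows; the questions and the donor name are my own illustrations, not an established standard.

```python
# Hypothetical ethics checklist for vetting a funding source.
FUNDING_CHECKLIST = [
    "Does the donor require access to raw, identifiable participant data?",
    "Can the donor or a sub-donor reuse project data for other purposes?",
    "Is there a written data retention and destruction policy?",
    "Were local partners consulted about this funding source?",
]

def review_funding(donor: str, answers: dict) -> list:
    """Return (and print) the checklist items still unresolved for a donor."""
    unresolved = [q for q in FUNDING_CHECKLIST if not answers.get(q, False)]
    for q in unresolved:
        print(f"Unresolved for {donor}: {q}")
    return unresolved

review_funding("Example Donor", {FUNDING_CHECKLIST[3]: True})
```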

Use encryption and other technology solutions. Given the low levels of understanding and awareness of these tools, more needs to be done so that more organizations learn how to use them, and they need to be made simpler, more accessible and user-friendly. ‘Crypto Parties’ can help get organizations familiar with encryption and privacy, but better outreach is needed so that organizations understand the relevance of encryption and feel welcome in tech-heavy environments.
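For organizations that have never seen the basic building block, here is a minimal sketch of encrypting a record at rest, using the widely used Python cryptography package (pip install cryptography). The record content is invented, and real deployments hinge on key management (where the key lives and who can read it), which is deliberately not shown.

```python
# A minimal sketch of symmetric encryption at rest with the "cryptography"
# package. Key management is the hard part and is not shown here.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # store this separately from the data it protects
f = Fernet(key)

record = "name=Jane Doe; district=12; status=at-risk".encode("utf-8")
token = f.encrypt(record)     # the ciphertext is safe to store or sync
print(f.decrypt(token).decode("utf-8"))  # readable only with the key
```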

Thanks to participants and lead discussants for the great discussions and to ThoughtWorks for hosting us at their offices!

 If you’d like to attend future Salons, sign up here!


This is a cross post from Heather Leson, Community Engagement Director at the Open Knowledge Foundation. The original post appeared here on the School of Data site.

by Heather Leson

What is the currency of change? What can coders (consumers) do with IATI data? How can suppliers deliver the data sets? Last week I had the honour of participating in the Open Data for Development Codeathon and the International Aid Transparency Initiative (IATI) Technical Advisory Group meetings. IATI’s goal is to make information about aid spending easier to access, use, and understand. It was great that these events were back-to-back, pushing a big-picture view.

My big takeaways included similar themes that I have learned on my open source journey:

You can talk about open data [insert tech or OS project] all you want, but if you don’t have an interactive community (including mentorship programmes), an education strategy, an engagement/feedback plan, a translation/localization plan and a process for people to learn how to contribute, then you build a double-edged barrier: a barrier to entry and a barrier to impact/contributor outputs.

Currency

About the Open Data in Development Codeathon

At the Codeathon close, Mark Surman, Executive Director of the Mozilla Foundation, gave us a call to action to make the web. Well, in order to create a world of data makers, I think we should run aid and development processes through this mindset. What is the currency of change? I hear many people talking about theory of change and impact, but I’d like to add ‘currency’. This is not only about money; it is about using the best brainpower and best energy sources to solve real-world problems in smart ways. I think if we heed Mark’s call to action with a “yes, and”, then we can rethink how we approach complex change. Every single industry is suffering from the same issue: how to deal with the influx of supply and demand in information. We need to change how we approach the problem. Combined events like these give a window into tackling problems in a new format. It is not about the next greatest app, but more about asking: how can we learn from the Webmakers and build with each other in our respective fields and networks?

Ease of Delivery

The IATI community/network is very passionate about moving the ball forward on releasing data. During the sessions, it was clear that the attendees see some gaps and are already working to fill them. The new IATI website is set up to grow with a Community component. The feedback from each of the sessions was distilled by the IATI TAG and Civil Society Guidance groups to share with the IATI Secretariat.

In the Open Data in Development, Impact of Open Data in Developing Countries, and CSO Guidance sessions, we discussed some key items about sharing, learning, and using IATI data. Farai Matsika, of the International HIV/AIDS Alliance, made a particularly poignant point, reminding us of IATI’s CSO purpose: we need to share data with those we serve.

Country edits IATI

One of the biggest themes was data ethics. As we rush to ask NGOs and CSOs to release data, what are some of the data pitfalls? Anahi Ayala Iaccuci of Internews and Linda Raftree of Plan International USA both reminded participants that data needs to be anonymized to protect those at risk. Ms. Iaccuci asked that we consider the complex nature of sharing both sides of the open data story – successes and failures. As well, she advised: don’t create trust, but think about who people are trusting. Turning this model around is key to rethinking assumptions. I would add to her point: trust and sharing are currency and will add to the success measures of IATI. If people don’t trust the IATI data, they won’t share and use it.
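To make the anonymization point concrete, here is a minimal sketch of a pre-publication pass over a beneficiary-style table; the column names and rows are invented. Note the caveat in the comments: hashing identifiers is pseudonymization, not true anonymization, because combinations of the remaining fields can still re-identify people.

```python
# A minimal pre-publication pass: drop direct identifiers, replace names with
# salted hashes. All column names and rows are invented for illustration.
import hashlib
import pandas as pd

SALT = b"project-specific-secret"  # never publish alongside the data

def pseudonym(value: str) -> str:
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:12]

df = pd.DataFrame({
    "name":     ["A. Banda", "C. Phiri"],
    "phone":    ["+260971234567", "+260977654321"],
    "village":  ["Kafue", "Chongwe"],
    "received": [True, False],
})

public = df.drop(columns=["phone"])             # remove direct identifiers
public["name"] = public["name"].map(pseudonym)  # hashes stand in for names
print(public)  # still ask: can 'village' + 'received' single anyone out?
```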

Anne Crowe of Privacy International frequently asked attendees to consider the ramifications of opening data. It is clear that the IATI TAG does not curate the data that NGOs and CSOs share. Thus it falls on each of these organizations to learn how to be data makers in order to contribute data to IATI. Perhaps organizations need a lead educator and curator to ensure the future success of the IATI process, including quality data.

I think that School of Data and the Partnership for Open Data have a huge part to play with IATI. My colleague Zara Rahman is collecting user feedback for the Open Development Toolkit, and Katelyn Rogers is leading the Open Development mailing list. We collectively want to help people become data makers and consumers who can effectively achieve their development goals using open data. This also means tackling the ongoing questions about data quality and data ethics.


Here are some additional resources shared during the IATI meetings.


Sometimes a work trip accidentally has a theme, and my recent trip to Cape Town was one of those. I arrived on Thursday, December 5th to the news that Mandela had passed away. My cab driver was on the phone, telling someone that Friday would be a holiday. He glanced back at me and asked “Do you know who Nelson Mandela is? He’s passed.” I turned on the television when I got to my hotel and watched for a few hours, but it was already after midnight and so there was not a lot of new content.

The following day I went with a small group to an ecumenical ceremony in the square, but it didn’t feel yet like the news had really hit. I had no idea how to interpret the crowd, the messages, the speakers, the politics. As the news traveled and people began writing about Mandela and his life, I dipped in here and there. The typical conversations happened. Were Mandela and his life going to be sanitized by the mainstream media for political purposes? It was good to see people attempting to show the full man, with all his complexities. It was striking to remember that such a short time ago apartheid was alive and well, and to really think about that, I mean really, really think about it, and to be reminded yet again of the fact that social change is not easy, clean, or straightforward. It’s most certainly not a technical problem waiting to be solved with a new device or invention, though clearly international and national political pressure play a huge role.

Mandela and his life became an underlying base for the conference, as I’m sure was true for much of what was happening around the world. Whether he was directly mentioned or not, his life’s work was present. I participated in sessions on ICTs and open development, ICTs and children, ICTs and raising critical consciousness. In all of them, the issues of equity and power came up. How can development processes be more open and is there a role for ICTs there? What world do we want to see in the future? How do we get there? How do we include children and youth so that they are not marginalized? How can we take a critical approach to ourselves and our agendas in development and in ICT4D? Can ICTs play a role in helping people to change existing power structures, achieve more equity and equality, and transform our societies? All these sessions were planned before anyone knew of Mandela’s passing, but talking about issues in light of the recent news and the renewed presence of him and his life made them feel more real.

Fast forward to the flights home. My first flight was the long one, from Cape Town to Amsterdam. My seat mates were two inexperienced flyers in their late 30s or so. They didn’t know where to put their bags or that they could not get up to go to the bathroom while the seatbelt sign was on and the flight was taking off. They were tattooed and looked a little rough around the edges. One of them carried a small, stuffed cheetah and wore hot pink pumps. I fell fast asleep the minute we took off and woke up an hour before we landed. The woman with the pink pumps started a conversation. Almost immediately she told me that she and her friend were returning from 2 months in rehab. They were both struggling with addictions to alcohol and sex, she told me. She was originally from Croatia and had lived in Amsterdam for years. She had recently relapsed and that’s why she went into treatment. She was returning to a safe house now, and it was her daughter’s 10th birthday. She was feeling positive about her life, yet sad that she would spend her daughter’s birthday in a safe house. She had recently revealed her addiction to her boss and received a negative and disempowering response. She was trying to be strong and accept that she was a recovering addict, learning to not feel ashamed, and working on being proud of the fact that she was moving forward. I was struck by her vulnerability and sweetness and left wondering how she would fare in a world where addiction and mental illness are so buried and stigmatized.

I got on my last flight and checked my Facebook while waiting to take off. My friend Subir had posted that two Supreme Court judges had overruled the Delhi High Court’s decision and upheld the constitutionality of Section 377 – essentially ruling that homosexuality is a crime and throwing India back into the dark ages.


My seatmate on this flight started up a conversation and I mentioned the India decision. I also told him about some of the different work that I do and the various hats I wear, including my involvement as a board member with ICAAD, the International Center for Advocates Against Discrimination. ICAAD’s work is fascinating because it looks at discrimination that is embedded in law, and at the link between structural and legal discrimination and racial, gender, religious and social discrimination, violence, and hate crimes, including those against religious minorities, immigrants, women, the LGBT community, and people of color.

As we talked, I learned that my seat mate’s mother had been a Holocaust survivor and that he was traveling to the US to attend an event in his mother’s honor. Her father survived a concentration camp, and she had been hidden and sheltered by different families for many years until the two were finally reunited and moved to the US.  She spent years dealing with the psychological impacts of the experience, but now works to help children and youth understand and deal with bigotry and hate, to identify it around them even when it’s not directly aimed at them, and to find ways to stop it. She highlights that it can manifest itself in seemingly small ways, like bullying at school.

This accidental theme of discrimination, violence and hate, whether based on race, poverty, addiction, religious beliefs or sexual orientation, was so alive for me this week. I met and learned more about brave individuals and the work of organizations that stand up in the face of injustice to take action at both the personal and the institutional level, raising critical consciousness to push for the changes that the world needs.

Despite our ‘advanced’ societies, our awareness of history, our facts, our data, our evidence, our literary genius, our ICTs, our innovations, we have very far to go, as I was reminded multiple times. But strong and caring individuals, organized communities, and political will can make a dent in structural discrimination and contribute to a more human society. More of us, self included, need to re-focus and work harder toward this end.


At the November 8th Technology Salon in New York City, we looked at the role of ICTs in communication for development (C4D) initiatives with marginalized adolescent girls. Lead discussants Kerida McDonald and Katarzyna Pawelczyk discussed recent UNICEF reports related to the topic, and John Zoltner spoke about FHI360’s C4D work in practice.

To begin, it was pointed out that C4D is not donor communications or marketing. It is the use of communication approaches and methodologies to achieve influence at various levels —  e.g., family, institutional and policy —  to change behavior and social norms. C4D is one approach that is being used to address the root causes of gender inequality and exclusion.

As the UNICEF report on ICTs and C4D* notes, girls may face a number of situations that contribute to and/or are caused by their marginalization: early pregnancy, female genital cutting, early marriage, high rates of HIV/AIDS, low levels of education, and lack of control over resources. ICTs alone cannot resolve these, because there is a deep and broad set of root causes. However, ICTs can be integrated systematically into the set of C4D tools and approaches that contribute to positive change.

Issues like bandwidth, censorship and electricity need to be considered when integrating ICTs into C4D work, and approaches that fit the context need to be developed. Practitioners should use tools that are in the hands of girls and their communities now, yet be aware of advances in access and new technologies, as these change rapidly.

Key points:

Interactivity is more empowering than one-way messaging: Many of the ICT solutions being promoted today focus on sending messages out via mobile phones. However, C4D approaches aim for interactivity and multi-channel, multi-directional communication, which has proven more empowering.

Content: Traditional media normally goes through a rigorous editorial process and it is possible to infuse it with a gender balance. Social media does not have the same type of filters, and it can easily be used to reinforce stereotypes about girls. This is something to watch and be aware of.

Purpose: It’s common with ICT-related approaches to start with the technology rather than starting with the goals. As one Salon participant asked “What are the results we want to see for ourselves? What are the results that girls want to see? What are the root causes of discrimination and how are we trying to address them? What does success look like for girls? For organizations? Is there a role for ICTs in helping achieve success? If so, what is it?” These questions need to be the starting point, rather than the technology.

Participation: One Salon participant mentioned a 2-year project that is working together with girls to define their needs and their vision of success. The process is one of co-design, and it is aimed at understanding what girls want. Many girls expressed a feeling of isolation and a desire for connection, and so the project is looking at how ICTs can help them connect. As the process developed, the diversity of needs became very clear, and plans have changed dramatically based on input from a range of girls from different contexts. Implementers need to be prepared to change, adapt and respond to what girls say they want and to local realities.

****

A second study commissioned by UNICEF explores how young people use social media. The researchers encountered some challenges in terms of a strong gender approach for the study. Though a gender lens was used for analysis, there is little available data disaggregated by sex. The study does not focus on the most marginalized, because it looks at the use of social media, which normally requires a data connection or Internet access that the most marginalized youth usually do not have.

The authors of the report found that youth most commonly used the Internet and social media for socializing and communicating with friends. Youth connected less often for schoolwork. One reason for this may be that in the countries/contexts where the research took place, there is no real integration of ICTs into the school system. It was emphasized that the  findings in the report are not comparable or nationally representative, and blanket statements such as “this means x for the whole developing world” should be avoided.

Key points:

Self-reporting biases. Boys tend to have higher levels of confidence and self-report greater ICT proficiency than girls do. This may skew results and make it seem that boys have higher skill levels.

Do girls really have less access? We often hear that girls have less access than boys. The evidence gathered for this particular report found that “yes and no.” In some places, when researchers asked “Do you have access to a mobile,” there was not a huge difference between urban and rural or between boys and girls. When they dug deeper, however, it became more complex. In the case of Zambia, access and ownership were similar for boys and girls, but fewer girls were connecting at all to the Internet as compared to boys. Understanding connectivity and use was quite complicated.
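The Zambia example shows why access and use have to be measured separately. Below is a hedged sketch of that kind of disaggregation; the numbers are invented placeholders, not the study’s data.

```python
# Sex-disaggregated access vs. use. The pattern to look for: similar access
# but lower use signals barriers beyond phone ownership. Invented data.
import pandas as pd

survey = pd.DataFrame({
    "sex":           ["girl", "girl", "girl", "boy", "boy", "boy"],
    "setting":       ["urban", "rural", "rural", "urban", "rural", "rural"],
    "has_access":    [1, 1, 1, 1, 1, 0],
    "uses_internet": [1, 0, 0, 1, 1, 0],
})

print(survey.groupby(["sex", "setting"])[["has_access", "uses_internet"]].mean())
```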

What are girls vs. boys doing online? This is an important factor when thinking about what solutions are applicable to which situation(s). Differences came up here in the study. In Argentina, girls were doing certain activities more frequently, such as chatting and looking for information, but they were not gaming. In Zambia, girls were doing some things less often than boys; for example, fewer girls than boys were looking for health information, although the number was still significant. A notable finding was that both girls and boys were accessing general health information more often than they were accessing sensitive information, such as sexual health or mental health.

What are the risks in the online world? A qualitative portion of the study in Kenya used focus groups with girls and boys, and asked about their uses of social media and experience of risk. Many out-of-school girls aged 15-17 reported that they used social media as a way to meet a potential partner to help them out of their financial situation. They reported riskier behavior, contact with older men, and relationships more often than girls who were in school. Girls in general were more likely than boys to report unpleasant online encounters, for example, requests for self-exposure photos.

Hiding social media use. Most of the young people that researchers spoke with in Kenya were hiding social media use from their parents, who disapproved of it. This is an important point to note in C4D efforts that plan on using social media, and program designers will want to take parental attitudes about different media and communication channels into consideration as they design C4D programs.

****

When implementing programs, it is noteworthy how boys and girls tend to use ICT and media tools. Gender issues often manifest themselves right away. “The boys grab the cameras, the boys sit down first at the computers.” If practitioners don’t create special rules and a safe space for girls to participate, girls may be marginalized. In practical ICT and media work, it’s common for boys and girls to take on certain roles. “Some girls like to go on camera, but more often they tend to facilitate what is being done rather than star in it.” The gender gap in ICT access and use, where it exists, is a reflection of the power gaps of society in general.

In the most rural areas, even when people have access, they usually don’t have the resources and skills to use ICTs. Very simple challenges can affect girls’ ability to participate in projects; for example, oftentimes a project will hold training at times when it’s difficult for girls to attend. Unless someone systematically goes through and applies a gender lens to a program, organizations often don’t notice the challenges girls may face in participating. It’s not enough to do gender training or measure gender once a year; gendered approaches need to be built into program design.

Long-term interventions are needed if the goal is to emancipate girls, help them learn better, graduate, postpone pregnancy, and get a job. This cannot be done in a year with a simple project that has only one focus, because girls are dealing with education, healthcare, and a whole series of very entrenched social issues. What’s needed is to follow a cohort of girls and to provide information and support across all these sectors over the long term.

Key points:

Engaging boys and men: Negative reactions from men are a concern if and when girls and women start to feel more empowered or to access resources. For example, some mobile money and cash transfer programs direct funds to girls and women, and some studies have found that violence against women increases when women start to have more money and more freedom. Another study, however, of a small-scale effort that provides unconditional cash transfers to girls ages 18-19 in rural Kenya, is demonstrating just the opposite: girls have been able to say where money is spent and the gender dynamics have improved. This raises the question of whether program methodologies need to be oriented towards engaging boys and men and involving them in changing gender dynamics, and whether engaging boys and men can help avoid an increase in violence. Working with boys to become “girl champions” was cited as a way to help to bring boys into the process as advocates and role models.

Girls as producers, not just consumers. ICTs are not only tools for sending content to girls. Some programs are working to help girls produce content and create digital stories in their own languages. Sometimes these stories are used to advocate to decision makers for change in favor of girls and their agendas. Digital stories are being used as part of research processes and to support monitoring, evaluation and accountability work through ‘real-time’ data.

ICTs and social accountability. Digital tools are helping young people address accountability issues and inform local and national development processes. In some cases, youth are able to use simple, narrow bandwidth tools to keep up to date on actions of government officials or to respond to surveys to voice their priorities. Online tools can also lead to offline, face-to-face engagement. One issue, however, is that in some countries, youth are able to establish communication with national government ministers (because there is national-level capacity and infrastructure) but at local level there is very little chance or capability for engagement with elected officials, who are unprepared to respond and engage with youth or via social media. Youth therefore tend to bypass local government and communicate with national government. There is a need for capacity building at local level and decentralized policies and practices so that response capacity is strengthened.

Do ICTs marginalize girls? Some Salon participants worried that as conversations and information increasingly move to a digital environment, ICTs are magnifying the information and communication divide and further marginalizing some girls. Others felt that the fact that we are able to reach the majority of the world’s population now is very significant, and the inability to reach absolutely everyone doesn’t mean we should stop using ICTs. For this very reason – because sharing of information is increasingly digital – we should continue working to get more girls online and strengthen their confidence and abilities to use ICTs.

Many thanks to UNICEF for hosting the Salon!

(Salons operate under Chatham House Rule, thus no attribution has been given in the above summary. Sign up here if you’d like to attend Salons in the future!)

*Disclosure: I co-authored this report with Keshet Bachan.


Back in May I participated in a discussion on whether and how International Civil Society Organizations (ICSOs) are adapting to changes around them. Here’s my summary of the conversations: Can International Civil Society Organizations be nimble?

A final report from the meeting is ready (download Riding the Wave: A proposal for Boards and CEOs on how to prepare their organizations for disruptive change) and here’s a video summary:

I’m curious what other folks think about this topic and the analysis and recommendations in the report.


This is a guest post from Anna Crowe, Research Officer on the Privacy in the Developing World Project, and Carly Nyst, Head of International Advocacy at Privacy International, a London-based NGO working on issues related to technology and human rights, with a focus on privacy and data protection. Privacy International’s new report, Aiding Surveillance, which covers this topic in greater depth, was released this week.

by Anna Crowe and Carly Nyst


New technologies hold great potential for the developing world, and countless development scholars and practitioners have sung the praises of technology in accelerating development, reducing poverty, spurring innovation and improving accountability and transparency.

Worryingly, however, privacy is often presented as a luxury that creates barriers to development, rather than as a key aspect of sustainable development. This perspective needs to change.

Privacy is not a luxury, but a fundamental human right

New technologies are being incorporated into development initiatives and programmes relating to everything from education to health and elections, and in humanitarian initiatives, including crisis response, food delivery and refugee management. But many of the same technologies being deployed in the developing world with lofty claims and high price tags have been extremely controversial in the developed world. Expansive registration systems, identity schemes and databases that collect biometric information including fingerprints, facial scans, iris information and even DNA, have been proposed, resisted, and sometimes rejected in various countries.

The deployment of surveillance technologies by development actors, foreign aid donors and humanitarian organisations, however, is often conducted in the complete absence of the type of public debate or deliberation that has occurred in developed countries. Development actors rarely consider target populations’ opinions when approving aid programmes. Important strategy documents such as the UN Office for the Coordination of Humanitarian Affairs’ Humanitarianism in a Networked Age and the UN High-Level Panel on the Post-2015 Development Agenda’s A New Global Partnership: Eradicate Poverty and Transform Economies through Sustainable Development give little space to the possible impact that adopting new technologies or data analysis techniques could have on individuals’ privacy.

Some of this trend can be attributed to development actors’ systematic failure to recognise the risks to privacy that development initiatives present. However, it also reflects an often unspoken view that the right to privacy must necessarily be sacrificed at the altar of development – that privacy and development are conflicting, mutually exclusive goals.

The assumptions underpinning this view are as follows:

  • that privacy is not important to people in developing countries;
  • that the privacy implications of new technologies are not significant enough to warrant special attention;
  • and that respecting privacy comes at a high cost, endangering the success of development initiatives and creating unnecessary work for development actors.

These assumptions are deeply flawed. While it should go without saying, privacy is a universal right, enshrined in numerous international human rights treaties, and matters to all individuals, including those living in the developing world. The vast majority of developing countries have explicit constitutional requirements to ensure that their policies and practices do not unnecessarily interfere with privacy. The right to privacy guarantees individuals a personal sphere, free from state interference, and the ability to determine who has information about them and how it is used. Privacy is also an “essential requirement for the realization of the right to freedom of expression”. It is not an “optional” right that only those living in the developed world deserve to see protected. To presume otherwise ignores the humanity of individuals living in various parts of the world.

Technologies undoubtedly have the potential to dramatically improve the provision of development and humanitarian aid and to empower populations. However, the privacy implications of many new technologies are significant and are not well understood by many development actors. The expectations that are placed on technologies to solve problems need to be significantly circumscribed, and the potential negative implications of technologies must be assessed before their deployment. Biometric identification systems, for example, may assist in aid disbursement, but if they also wrongly exclude whole categories of people, then the objectives of the original development intervention have not been achieved. Similarly, border surveillance and communications surveillance systems may help a government improve national security, but may also enable the surveillance of human rights defenders, political activists, immigrants and other groups.

Asking humanitarian actors to protect and respect privacy rights must not be distorted as requiring inflexible and impossibly high standards that would derail development initiatives if put into practice. Privacy is not an absolute right and may be limited, but only where limitation is necessary, proportionate and in accordance with law. The crucial thing is actually to undertake an analysis of the technology and its privacy implications, and to do so in a thoughtful and considered manner. For example, if an intervention requires collecting personal data from those receiving aid, the first step should be to ask what information is necessary to collect, rather than just applying a standard approach to each programme. In some cases, this may mean additional work. But this work should be considered in light of the contribution that upholding human rights and the rule of law makes to development and to producing sustainable outcomes. And in some cases, respecting privacy can also mean saving lives, as information falling into the wrong hands could spell tragedy.

A new framing

While there is an increasing recognition among development actors that more attention needs to be paid to privacy, it is not enough to merely ensure that a programme or initiative does not actively harm the right to privacy; instead, development actors should aim to promote rights, including the right to privacy, as an integral part of achieving sustainable development outcomes. Development is not just, or even mostly, about accelerating economic growth. The core of development is building capacity and infrastructure, advancing equality, and supporting democratic societies that protect, respect and fulfill human rights.

The benefits of development and humanitarian assistance can be delivered without unnecessary and disproportionate limitations on the right to privacy. The challenge is to improve access to and understanding of technologies, ensure that policymakers and the laws they adopt respond to the challenges and possibilities of technology, and generate greater public debate to ensure that rights and freedoms are negotiated at a societal level.

Technologies can be built to satisfy both development and privacy.

Download the Aiding Surveillance report.


This post was originally published on the Open Knowledge Foundation blog

A core theme that the Open Development track covered at September’s Open Knowledge Conference was Ethics and Risk in Open Development. There were more questions than answers in the discussions, summarized below, and the Open Development working group plans to further examine these issues over the coming year.

Informed consent and opting in or out

Ethics around ‘opt in’ and ‘opt out’ when working with people in communities with fewer resources, lower connectivity, and/or less of an understanding about privacy and data are tricky. Yet project implementers have a responsibility to work to the best of their ability to ensure that participants understand what will happen with their data in general, and what might happen if it is shared openly.

There are some concerns around how these decisions are currently being made and by whom. Can an NGO make the decision to share or open data from/about program participants? Is it OK for an NGO to share ‘beneficiary’ data with the private sector in return for funding to help make a program ‘sustainable’? What liabilities might donors or program implementers face in the future as these issues develop?

Issues related to private vs. public good need further discussion, and there is no one right answer because concepts and definitions of ‘private’ and ‘public’ data change according to context and geography.

Informed participation, informed risk-taking

The ‘do no harm’ principle is applicable in emergency and conflict situations, but is it realistic to apply it to activism? There is concern that organizations implementing programs that rely on newer ICTs and open data are not ensuring that activists have enough information to make an informed choice about their involvement. At the same time, assuming that activists don’t know enough to decide for themselves can come across as paternalistic.

As one participant at OK Con commented, “human rights and accountability work are about changing power relations. Those threatened by power shifts are likely to respond with violence and intimidation. If you are trying to avoid all harm, you will probably not have any impact.” There is also the concept of transformative change: “things get worse before they get better. How do you include that in your prediction of what risks may be involved? There also may be a perception gap in terms of what different people consider harm to be. Whose opinion counts and are we listening? Are the right people involved in the conversations about this?”

A key point is that whoever assumes the risk needs to be involved in assessing that potential risk and deciding what the actions should be — but people also need to be fully informed. With new tools coming into play all the time, can people be truly ‘informed’, and are outsiders who come in with new technologies doing a good enough job of facilitating discussions about possible implications and risk with those who will face the consequences? Are community members and activists themselves included in risk analysis, assumption testing, threat modeling and risk mitigation work? Is there a way to predict the likelihood of harm? For example, can we determine whether releasing ‘x’ data will likely lead to ‘y’ harm happening? How can participants, practitioners and program designers get better at identifying and mitigating risks?
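The simplest formal tool for these questions is a likelihood-times-impact risk matrix. A toy sketch is below; the risks and 1-5 scores are illustrative assumptions, and per the point above, the people who bear the risk should be the ones assigning them.

```python
# Toy likelihood-x-impact risk matrix with invented risks and scores.
RISKS = {
    "re-identification of activists from a released dataset": (3, 5),
    "government blocks or monitors the platform":             (4, 2),
    "a third party reuses the data for targeting":            (2, 5),
}

# Rank by likelihood x impact so mitigation effort goes to the top items.
for risk, (likelihood, impact) in sorted(
        RISKS.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True):
    print(f"score {likelihood * impact:2d}: {risk}")
```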

When things get scary…

Even when risk analysis is conducted, it is impossible to predict or foresee every possible way that a program can go wrong during implementation. Then the question becomes what to do when you are in the middle of something that is putting people at risk or leading to extremely negative unintended consequences. Who can you call for help? What do you do when there is no mitigation possible and you need to pull the plug on an effort? Who decides that you’ve reached that point? This is not an issue that exclusively affects programs that use open data, but open data may create new risks with which practitioners, participants and activists have less experience, thus the need to examine it more closely.

Participants felt that there is not enough honest discussion on this aspect. There is a pop culture of ‘admitting failure’ but admitting harm is different because there is a higher sense of liability and distress. “When I’m really scared shitless about what is happening in a project, what do I do?” asked one participant at the OK Con discussion sessions. “When I realize that opening data up has generated a huge potential risk to people who are already vulnerable, where do I go for help?” We tend to share our “cute” failures, not our really dismal ones.

Academia has done some work around research ethics, informed consent, human subject research and the use of Institutional Review Boards (IRBs). What aspects of this can or should be applied to mobile data gathering, crowdsourcing, open data work and the like? What about when citizens are their own source of information and they voluntarily share data without a clear understanding of what happens to the data, or what the possible implications are?

Do we need to think about updating and modernizing the concept of IRBs? A major issue is that many people who are conducting these kinds of data collection and sharing activities using new ICTs are unaware of research ethics and IRBs and don’t consider what they are doing to be ‘research’. How can we broaden this discussion and engage those who may not be aware of the need to integrate informed consent, risk analysis and privacy awareness into their approaches?

The elephant in the room

Despite our good intentions to do better planning and risk management, one big problem is donors, according to some of the OK Con participants. Do donors require enough risk assessment and mitigation planning in their program proposal designs? Do they allow organizations enough time to develop a well-thought-out and participatory Theory of Change along with a rigorous risk assessment together with program participants? Are funding recipients required to report back on risks and how they played out? As one person put it, “talk about failure is currently more like a ‘cult of failure’ and there is no real learning from it. Systematically we have to report up the chain on money and results and all the good things happening, and no one up at the top really wants to know about the bad things. The most interesting learning doesn’t get back to the donors or permeate across practitioners. We never talk about all the work-arounds and backdoor negotiations that make development work happen. This is a serious systemic issue.”

Greater transparency can actually be a deterrent to talking about some of these complexities, because “the last thing donors want is more complexity as it raises difficult questions.”

Reporting upwards into government representatives in Parliament or Congress leads to continued aversion to any failures or ‘bad news’. Though funding recipients are urged to be innovative, they still need to hit numeric targets so that the international aid budget can be defended in government spaces. Thus, the message is mixed: “Make sure you are learning and recognizing failure, but please don’t put anything too serious in the final report.” There is awareness that rigid program planning doesn’t work and that we need to be adaptive, yet we are asked to “put it all into a log frame and make sure the government aid person can defend it to their superiors.”

Where to from here?

It was suggested that monitoring and evaluation (M&E) could be used as a tool for examining some of these issues, but M&E needs to be seen as a learning component, not only an accountability one. M&E needs to feed into the choices people are making along the way and linking it in well during program design may be one way to include a more adaptive and iterative approach. M&E should force practitioners to ask themselves the right questions as they design programs and as they assess them throughout implementation. Theory of Change might help, and an ethics-based approach could be introduced as well to raise these questions about risk and privacy and ensure that they are addressed from the start of an initiative.

Practitioners have also expressed the need for additional resources to help them predict and manage possible risk: case studies, a safe space for sharing concerns during implementation, people who can help when things go pear-shaped, a menu of methodologies, a set of principles or questions to ask during program design, or even an ICT4D Implementation Hotline or a forum for questions and discussion.

These ethical issues around privacy and risk are not exclusive to Open Development. Similar issues were raised last week at the Open Government Partnership Summit sessions on whistle blowing, privacy, and safeguarding civic space, especially in light of the Snowden case. They were also raised at last year’s Technology Salon on Participatory Mapping.

A number of groups are looking more deeply into this area, including the Capture the Ocean Project, The Engine Room, IDRC’s research network, the Open Technology Institute, Privacy International, GSMA, those working on “Big Data,” those in the Internet of Things space, and others.

I’m looking forward to further discussion with the Open Development working group on all of this in the coming months, and will also be putting a little time into mapping out existing initiatives and identifying gaps when it comes to these cross-cutting ethics, power, privacy and risk issues in open development and other ICT-enabled data-heavy initiatives.

Please do share information, projects, research, opinion pieces and more if you have them!

Read Full Post »

Bruce Lee explains why many open data and technology-led initiatives go wrong.

(See minute 1:14).

Read Full Post »

from chaos to haute couture…

I took my 16-year-old (a huge music fan) to the Met on Saturday to see the Punk: Chaos to Couture exhibit. I was a little skeptical, having been pretty into punk rock as a teenager.

It was really bizarre to be at The Met, looking at how punk rock influenced high fashion. In fact, it was kind of ridiculous. (Hello, $100 designer “punk rock” shirts in the Museum Shop.) Everyone kept getting in trouble for photographing things (which of course only made me want to photograph them more). My daughter kept admonishing me to stop.

I would have preferred an exhibit about punk rock over one on high fashion, but it was cool that I kept overhearing mothers around my age explaining to their daughters “yes, back when I was a teenager….” or “I used to wear this kind of thing back when…” Of course they were not talking about high fashion, but about the real live stuff we used to get from the thrift shops and modify into something new. In my home town we were normally about 10 years behind the times, and we didn’t discover punk rock till the 80s, but oh well. Back then we didn’t have cable or the Internet, and punk rock was not available at the mall. We had to find inspiration wherever we could.

The background information about the Chaos to Couture exhibit explained that the difference between British punk and American punk was that Brit punk was a sociopolitical reaction of the working class and US punk was more an intellectual-artistic movement. I have to say, though, that in Indiana in the 1980s, we were rebelling against the boredom and status quo of life in general. We weren’t politically engaged, intellectual, or particularly artistic; we just felt like misfits and knew something was not quite right. We didn’t want to be like the people around us.

The Met exhibit credits the punk DIY attitude for spawning some of the current approaches in tech and social media today. (DIY is wonderfully chronicled, by the way, in the excellent book “Our Band Could Be Your Life” which covers the 80s DIY music movement in the US, including chapters on some of my favorites: The Butthole Surfers, Black Flag, The Minutemen, and Sonic Youth).

A good friend of mine and I semi-jokingly talk about wanting to do PunkICT4D. We’ve mused about starting the #ICT4Anarchy hashtag based on the definition below from Noam Chomsky. (We’ll be exploring some of this in December at ICTD2013 during a panel with a high brow, academic, not very punk sounding title: “Appropriating ICTs for Developing Critical Consciousness and Structural Social Change.”)

[Noam says] Well, anarchism is, in my view, basically a kind of tendency in human thought which shows up in different forms in different circumstances, and has some leading characteristics.  Primarily it is a tendency that is suspicious and skeptical of domination, authority, and hierarchy.  It seeks structures of hierarchy and domination in human life over the whole range, extending from, say, patriarchal families to, say, imperial systems, and it asks whether those systems are justified.  It assumes that the burden of proof for anyone in a position of power and authority lies on them.  Their authority is not self-justifying.  They have to give a reason for it, a justification.  And if they can’t justify that authority and power and control, which is the usual case, then the authority ought to be dismantled and replaced by something more free and just.  And, as I understand it, anarchy is just that tendency.  It takes different forms at different times….

In doing a little reading on the exhibit at the Met, I found this rather blind statement (my bold).

In the 1970s, Punk was so much more than a fashion statement. In Britain, it was a reaction to sky-high unemployment, to the Thatcher administration’s closing of the mines, and to the pervasive feeling of hopelessness. Andrew Bolton, the British curator of the show, talked about how that political tinderbox could never be recreated today, although the fashions could be referenced and reworked.

On the one hand, duh. We all know that punk became another watered down, mass-produced, corporate-sponsored commodity. On the other hand, no. Punk’s not dead. The best part of the exhibit was this, scrawled on one of the walls:

Courtesy of @umairh’s Twitter feed because the museum guards were watching, and I couldn’t get any more photos.

Maybe I’m a throwback, but it seems punk’s still hugely relevant.
P.S. Chomsky on Celebrity and Punk Rock, via Christopher Neu.

Read Full Post »

The February 5 Technology Salon in New York City asked “What are the ethics in participatory digital mapping?” Judging by the packed Salon and long waiting list, many of us are struggling with these questions in our work.

Some of the key ethical points raised at the Salon related to the benefits of open data vs. privacy and the desire to do no harm. Others were about whether digital maps are an effective tool in participatory community development or whether they are mostly an innovation showcase for donors or a backdrop for individual egos to assert their ‘personal coolness’. The absence of research and ethics protocols for some of these new kinds of data gathering and sharing also concerned participants.

During the Salon we were only able to scratch the surface, and we hope to get together soon for a more in-depth session (or maybe 2 or 3 sessions – stay tuned!) to further unpack the ethical issues around participatory digital community mapping.

The points raised by discussants and participants included:

1) Showcasing innovation

Is digital mapping really about communities, or are we really just using communities as a backdrop to showcase our own innovation and coolness or that of our donors?

2) Can you do justice to both process and product?

Maps should be less an “in-out tool” and more part of a broader program. External agents should be supporting communities to articulate and to be full partners in saying, doing, and knowing what they want to do with maps. Digital mapping may not be better than hand-drawn maps, if we consider that the process of mapping is just as or more important than the final product. Hand-drawn maps can allow for important discussions to happen while people draw. This seems to happen much less with the digital mapping process, which is more technical, and it happens even less when outside agents do the mapping. A hand-drawn map can be imbued with meaning in terms of the size, color or placement of objects or borders. Important meaning may be missed when hand-drawn maps are replaced with digital ones.

Digital maps, however, can be printed, further enhanced with comments and drawings, and discussed in the community, as some noted. And digital maps can lend a sense of professionalism to community members and help them make a stronger case to authorities and decision makers. Some participants raised concerns about power relations during mapping processes and worried that using digital tools could reinforce them.

3) The ethics of wasting people’s time

Community mapping is difficult. The goal of external agents should be to train local people so that they can own the process and sustain it over the long term. This takes time. Often, however, mapping experts are flown in for a week or two to train community members. They leave people with some knowledge, but not enough to fully manage the mapping process and tools. If people end up only half-trained and without local options to continue training, their time has essentially been wasted. And if young people see the training as a pathway to an in-demand skill set yet are left partially trained and without access to tools and equipment, they too will feel their time has been wasted.

4) Data extraction

When agencies, academics and mappers come in with their clipboards or their GPS units and conduct the same surveys and studies over and over with the same populations, people’s time is also wasted. Open digital community mapping starts from the viewpoint that an open map and open data are one way to make sure that data taken from or created by communities are made available to those communities for their own use, and can be accessed by others so that the same data are not collected repeatedly. Though there are privacy concerns around opening data, there is a counterbalancing ethical dilemma related to how much time gets wasted by keeping data closed.

5) The (missing) link between data and action

Related to the issue of time wasting is the common issue of a missing link between data collected and/or mapped, action and results. Making a map identifying issues is certainly no guarantee that the government will come and take care of those issues. Maps are a means to an end, but often the end is not clear. What do we really hope the data leads to? What does the community hope for? Mapping can be a flashy technology that brings people to the table, but that is no guarantee that something will happen to resolve the issues the map is aimed at solving.

6) Intermediaries are important

One way to ensure that there is a link between data and action is to identify stakeholders who have the ability to use, understand and re-interpret the data. One case was mentioned where health workers collected data and then wanted to know “What do we do now? How does this affect the work that we do? How do we present this information to community health workers in a way that it is useful to our work?” It’s important to break the data down and make them understandable to the base population, and to show them in a way that is useful to people working at local institutions. Each audience will need the data to be visualized or shared in a different, contextually appropriate way if they are going to use the data for decision-making. It’s possible to provide the same data in different ways across different platforms, from paper to high tech. The challenge of keeping all the data and the different sharing platforms updated, however, is one that can’t be overlooked.
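To make the ‘same data, different platforms’ point concrete, here is a minimal sketch (mine, not something presented at the Salon) that writes a single set of community-collected records out in two forms: GeoJSON for a web map, and a flat CSV that can be printed and discussed on paper. The dataset, file names and field names are all invented for illustration.

```python
import csv
import json

# Hypothetical community-collected records (invented for illustration)
records = [
    {"name": "Water point A", "status": "working", "lat": -1.2921, "lon": 36.8219},
    {"name": "Water point B", "status": "broken", "lat": -1.3005, "lon": 36.8100},
]

# GeoJSON for a web-based map platform
features = [
    {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [r["lon"], r["lat"]]},
        "properties": {"name": r["name"], "status": r["status"]},
    }
    for r in records
]
with open("water_points.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": features}, f, indent=2)

# The same records as a flat CSV for a printed handout
with open("water_points.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "status", "lat", "lon"])
    writer.writeheader()
    writer.writerows(records)
```

The point of the sketch is simply that the ‘source of truth’ can live in one place while each audience gets a rendering suited to how they will actually use it.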

7) What does informed consent actually mean in today’s world?

There is a viewpoint that data must be open and that locking up data is unethical. On the other hand, there are questions about research ethics and protocols when doing mapping projects and sharing or opening data. Are those who do mapping getting informed consent from people to use or open their data? This is the cornerstone of ethics when doing research with human beings. One must be able to explain and be clear about the risks of this data collection, or it is impossible to get truly informed consent. What consent do community mappers need from other community members if they are opening data or information? What about when people are volunteering their information and self-reporting? What does informed consent mean in those cases? And what needs to be done to ensure that consent is truly informed? How can open data and mapping be explained to those who have not used the Internet before? How can we have informed consent if we cannot promise anyone that their data are really secure? Do we have ethics review boards for these new technological ways of gathering data?
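One practical (and entirely hypothetical) way to keep these consent questions visible is to store the consent terms alongside the data themselves, so that anyone re-using or opening a record can see what was actually agreed to. A minimal sketch, with a record structure invented for this example:

```python
from datetime import date

# Hypothetical: each record carries the consent terms under which it was given,
# so downstream users can check whether opening or re-sharing it is allowed.
record = {
    "issue": "broken water point",
    "lat": -1.29,
    "lon": 36.82,
    "consent": {
        "date": date(2013, 2, 1).isoformat(),
        "explained_in_person": True,       # was the intended use explained?
        "allow_open_publication": False,   # did the person agree to open data?
        "allow_sharing_with_government": True,
    },
}

def can_publish_openly(rec):
    """Refuse to open a record unless explicit consent was recorded."""
    return rec["consent"].get("allow_open_publication", False)

assert not can_publish_openly(record)
```

This doesn’t answer what informed consent means in these new contexts, but it does force the question to be asked record by record rather than assumed away at the project level.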

8) Not having community data also has ethical implications

It may seem like time wasting, and there may be privacy and protection questions, but there are also ethical implications of not having community data. When tools like satellite remote sensing are used to do slum mapping, for example, data are very dehumanized and can lead to sterile decision-making. The data that come from a community itself can make these maps more human and these decisions more humane. But there is a balance between the human/humanizing side and the need to protect. Standards are needed for bringing in community and/or human data in an anonymized way, because there are ethical implications on both ends.
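As one small, hypothetical illustration of what ‘bringing in community data in an anonymized way’ could look like, here is a minimal sketch that strips direct identifiers and coarsens coordinates to two decimal places (roughly a kilometer) before points are opened. The field names and records are invented, and real anonymization requires far more than this: aggregation thresholds, re-identification checks, and ideally ethics review.

```python
# A minimal sketch, assuming point data were collected with consent.
# All field names and records here are invented for illustration.
raw_points = [
    {"household": "Fatima N.", "issue": "flooding", "lat": -1.29213, "lon": 36.82195},
    {"household": "Joseph K.", "issue": "flooding", "lat": -1.29240, "lon": 36.82170},
]

def anonymize(point, precision=2):
    """Drop identifying fields and round coordinates before publishing.

    Two decimal places is roughly 1 km at the equator; the right
    precision depends entirely on context and risk.
    """
    return {
        "issue": point["issue"],
        "lat": round(point["lat"], precision),
        "lon": round(point["lon"], precision),
    }

open_points = [anonymize(p) for p in raw_points]
# Both points now collapse to the same coarse location:
# [{'issue': 'flooding', 'lat': -1.29, 'lon': 36.82},
#  {'issue': 'flooding', 'lat': -1.29, 'lon': 36.82}]
```

The design choice worth noticing is that the anonymization step is an explicit, inspectable function, so a community or a review board can see exactly what is removed or coarsened before anything is published.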

9) The problem with donors…

Big donors are not asking the tough questions, according to some participants. There is a lack of understanding around the meaning, use and value of the data being collected and the utility of maps. “If the data is crap, you’ll have crap GIS and a crap map. If you are just doing a map to do a map, there’s an issue.” There is great incentive from the donor side to show maps and to demonstrate value, because maps are a great photo op, a great visual. But how to go a level down to make a map really useful? Are the M&E folks raising the bar and asking these hard questions? Often from the funder’s perspective, mapping is seen as something that can be done quickly. “Get the map up and the project is done. Voila! And if you can do it in 3 weeks, even better!”

Some participants felt the need for greater donor awareness of these ethical questions because many of them are directly related to funding issues. As one participant noted, whether you coordinate, whether it’s participatory, whether you communicate and share back the information, whether you can do the right thing with the privacy issue — these all depend on what you can convince a donor to fund. Often it’s faster to reinvent the wheel because doing it the right way – coordinating, learning from past efforts, involving the community — takes more time and money. That’s often the hard constraint on these questions of ethics.

Check this link for some resources on the topic, and add yours to the list.

Many thanks to our lead discussants, Robert Banick from the American Red Cross and Erica Hagen from Ground Truth, and to Population Council for hosting us for this month’s Salon!

The next Technology Salon NYC will be coming up in March. Stay tuned for more information, and if you’d like to receive notifications about future salons, sign up for the mailing list!

Read Full Post »
