
By Mala Kumar and Linda Raftree

Our April 21st NYC Technology Salon focused on issues related to the LGBT ICT4D community, including how LGBTQI issues are addressed in the context of stakeholders and ICT4D staff. We examined specific concerns that ICT4D practitioners who identify as LGBTQI have, as well as how LGBTQI stakeholders are (or are not) incorporated into ICT4D projects, programs and policies. Among the many issues covered in the Salon, the role of the Internet and mobile devices for both community building and surveillance/security concerns played a central part in much of the discussion.

To frame the discussion, participants were asked to think about how LGBTQI issues within ICT4D (and more broadly, development) are akin to gender. Mainstreaming gender in development starts with how organizations treat their own staff. Implementing programs, projects and policies with a focus on gender cannot happen if the implementers do not first understand how to treat staff, colleagues and those closest to them (i.e. family, friends). Likewise, without a proper understanding of LGBTQI colleagues and staff, programs that address LGBTQI stakeholders will be ineffective.

The lead discussants of the Salon were Mala Kumar, writer and former UN ICT4D staff, Tania Lee, current IRC ICT4D Program Officer, and Robert Valadéz, current UN ICT4D staff. Linda Raftree moderated the discussion.

Unpacking LGBTQI

The first discussant pointed out how we as ICT4D/development practitioners think of the acronym LGBTQI, particularly the T and I – transgender and intersex. Development work often focuses on the sexual orientation portion of the acronym (the LGBQ) and not on what is considered in Western countries as transgenderism.

As one participant said, the very label of “transgender” is hard to convey in many countries where concepts of “third gender” and “two-spirit gender” exist. In Bangladesh and Nepal, for example, these disagreements over terminology have created conflict and divisions of interest within LGBTQI communities. In other countries, such as Thailand and parts of the Middle East, “transgenderism” can be considered more “normal” or societally acceptable than homosexuality. Across Africa, Latin America, North America and Europe, homosexuality is a better understood – albeit sometimes severely criminalized and socially rejected – concept than transgenderism.

One participant, citing her previous first-hand work on services for lesbian, gay and bisexual people, noted that in North America transgender communities are often given lower priority in LGBTQI services. In many cases she saw in San Francisco, homeless youth would identify as anything in order to gain access to needed services. Only after the services were provided did beneficiaries realize the consequences of self-reporting, or of self-reporting incorrectly.

Security concerns within Unpacking LGBTQI

For many people, the very notion of self-identifying as LGBTQI poses severe security risks. From a data collection standpoint, this makes it very difficult to represent these populations accurately, and it raises privacy concerns. As one discussant mentioned, development and ICT4D teams often do not have the technical capacity (i.e., statisticians, software engineers) to properly anonymize data and/or keep data on servers safe from hackers. On the other hand, the biggest threat to security may simply be “your dad finding your phone and reading a text message,” as one person noted.
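To make the anonymization point concrete, here is a minimal Python sketch of one common building block – replacing direct identifiers with keyed (salted) hashes so that a leaked dataset cannot be trivially joined back to named individuals. The field names and salt handling are illustrative assumptions only; a real protocol would also need to address quasi-identifiers, secure key storage and re-identification risk.

    import hashlib
    import hmac
    import os

    # Secret salt/key, stored separately from the data (e.g. an environment
    # variable or key vault). Without it, the tokens below cannot be reversed
    # by hashing guesses of known names or phone numbers.
    SALT = os.environ.get("PSEUDONYM_SALT", "change-me").encode()

    def pseudonymize(value: str) -> str:
        """Replace a direct identifier with a keyed, one-way token."""
        return hmac.new(SALT, value.encode(), hashlib.sha256).hexdigest()[:16]

    record = {"name": "A. Respondent", "phone": "+254700000000", "district": "X"}

    safe_record = {
        # Direct identifiers are dropped and replaced with a stable token.
        "respondent_id": pseudonymize(record["name"] + record["phone"]),
        # Quasi-identifiers like location still need separate review.
        "district": record["district"],
    }
    print(safe_record)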

Being an LGBTQI staff member in ICT4D

Our second lead discussant spoke about being (and being perceived as) an LGBTQI staff member in ICT4D. She noted that many ICT4D hubs, labs and centers are in countries that are notoriously homophobic – Uganda (Kampala), Kenya (Nairobi), Nigeria (Abuja, Lagos), Kosovo and Ethiopia (Addis Ababa), for example. This puts queer people who are interested in technology for development at a distinct disadvantage.

Some of the challenges she highlighted: ICT4D attracts colleagues from around the world who are the most adept at computers and Internet use, and who are therefore more likely to seek out and find information about other staff and colleagues online. If those searching are homophobic, “evidence” against colleagues can be both easy to find and easy to disseminate. At the same time, ICT4D practitioners are encouraged (and sometimes required) to blog, use social media, and keep an online presence – indeed, many people in ICT4D find posts and contracts this way. Yet keeping online professional and personal presences completely separate is incredibly challenging, and because ICT4D practitioners work with the colleagues most likely to actually find them online, queer practitioners face a unique dilemma.

ICT4D practitioners are arguably the people within development best equipped to use technology and programmatic knowledge to self-advocate as LGBT staff and for LGBT stakeholder inclusion. However, how are queer ICT4D staff supposed to balance safety concerns and limits on professional advancement when dealing with homophobic staff? This issue is further compounded (especially in the UN, as one participant noted) by the commonly used project-based contracts, which give staff little to no job security, bargaining power or general protection when working overseas.

Security concerns for LGBTQI staff in ICT4D

A participant who works in North America for a Kenya-based company said that none of her colleagues has ever mentioned her orientation, even though they must have found her publicly viewable blog on gender, and even though her orientation is not easy to disguise. She talked about always finding and connecting with the local queer community wherever she goes, often through the Internet, and about trying to support local organizations working on LGBT issues. Still, she and several other participants and discussants emphasized the need to segment their online personal and professional lives to remain safe.

Another participant mentioned his time working in Ethiopia. The staff from the center he worked with made openly hostile remarks about gays, which reinforced his need to stay closeted. He noticed that the ICT staff of the organization made a concerted effort to research people online, and that Facebook made it difficult, if not impossible, to keep personal and private lives separate.

Another person reiterated this point, saying that as a gay Latino man, and the first person in his family to go to university, grad school and a professional job, he is a role model to many people in his community. He wants to offer guidance and support, and used to do so with a public online presence. However, at his current internationally-focused job he feels the need to self-censor and has effectively curtailed his public online presence, because he often interacts with high-level officials who are hostile towards the LGBTQI community.

One discussant also echoed this idea, saying that she is becoming a voice for the queer South Asian community, which is important because much of LGBT media is very white. The tradeoff for becoming this voice is compromising her career in the field, because she cannot accept many posts that do not offer adequate support and security.

Intersectionality

Several participants and discussants offered their own experiences of the various levels of hostility and danger involved with even being suspected of being gay. One (female) participant began a relationship with a woman while working in a very conservative country, and recalled being terrified of being killed over the relationship. Local colleagues began to suspect, and eventually physically intervened by showing up at her house. This participant cited her “light-skinned privilege” as one reason she did not suffer serious consequences.

Another participant recounted his time with the US Peace Corps. After a year, he started coming out and dating people in his host country. When one relationship went awry and he was reported to the police for being gay, nothing came of the charges. Meanwhile, he saw local gay men being thrown into – and sometimes dying in – jail on the same charges. He and some other participants noted their relative privilege in these situations because they are white; as a white male, he said, he felt a sense of invincibility.

In contrast, a participant from an African country described his experience growing up and using ICTs as an escape, because any physical indication that he was gay would have landed him in jail, or worse. He had to learn to change his mannerisms to be more masculine, to disengage from social situations in real life, and to live in the shadows.

One of the discussants echoed these concerns, saying that as a queer woman of color, everything is compounded. She was recruited for a position at a UN Agency in Kenya, but turned the post down because of the hostility towards gays and lesbians there. However, she noted that some queer people she has met – all white men from the States or Europe – have had overall positive experiences being gay with the UN.

Perceived as predators

One person brought up the “predator” stereotype often associated with gay men. He and his partner have had to turn down media opportunities where they could have served as role models for the gay community – especially for poor, queer men of color, one of the most difficult socioeconomic groups to reach – out of fear that this stereotype might affect their being hired by organizations that serve children.

Monitoring and baiting by the government

One participant who grew up in Cameroon mentioned that queer communities in his country use the Internet cautiously, even though it’s the best resource to find other queer people. The reason for the caution is that government officials have been known to pose as queer people to bait real users for illegal gay activity.

Several other participants cited this same phenomenon in different forms. A recent article described Egypt’s use of new online surveillance tactics to find LGBTQI people, and some believe this type of surveillance will also appear in Nigeria – a country notoriously hostile towards LGBTQI persons – and in other places.

There was also discussion about which technology is safest for LGBTQI people. While Internet use can be monitored and traced back to a specific user, being able to connect from multiple access points and with varying levels of security creates a sense of anonymity that phones cannot provide. A person also generally carries a phone, so if the government intercepts a message on either the originating or receiving device, the implications are immediate – unless the user can convince the government that the device was stolen or used by someone else. On the other hand, phones are more easily disposable, and in several countries do not require registration (or a registered SIM card) tied to a specific person.

In Ethiopia, the government controls the phone networks and can in theory monitor messages for LGBTQI activity. This poses a particular threat since there is already legal precedent for convictions based on text messages. In some countries, major telecom carriers are owned by a national government; in others, they are national subsidiaries of an international company.

Another major concern raised relates back to privacy. Many major international development organizations do not have the capacity or ability to retain the software engineers, ICT architects, system operators, statisticians and other technology staff needed to properly guard against hacks and surveillance. In some cases, this work is illegal under national government policy, and thus also requires legal advocacy. The mere collection of data and information can therefore pose a security threat to staff and stakeholders – LGBTQI and allies alike.

The “queer divide”

One discussant asked the group for data or anecdotal information related to the “queer divide.” Divides are a commonly understood problem in ICT4D work – between genders, urban and rural, rich and poor, the socially accepted and the socially marginalized. Studies have also clearly demonstrated that people who are naturally extroverted and not shy benefit more from any given program or project. Is there any data, he wondered, to support a “queer divide” between those who are LGBTQI and those who are not? As the sections above demonstrate, many queer people are forced to disengage socially and retreat from “normal” society to stay safe.

Success stories, key organizations and resources

Participants mentioned organizations and examples of more progressive policies for LGBTQI staff and stakeholders (this list is not comprehensive, nor does it suggest these organizations’ policies are foolproof), including:

We also compiled a much more extensive list of resources on the topic here as background reading, including organizations, articles and research. (Feel free to add to it!)

What can we do moving forward?

  • Engage relevant organizations, such as Out in Tech and Lesbians Who Tech, with specific solutions, such as coding privacy protocols for online communities and helping grassroots organizations target ads to relevant stakeholders.
  • Lobby smartphone manufacturers to increase privacy protections on mobile devices.
  • Lobby the US and other national governments to introduce “right to be forgotten” laws, which allow Internet users to wipe records of themselves and their personal activity.
  • Support organizations and services that offer legal counsel to those in need.
  • Demand better and more comprehensive protection for LGBTQI staff, consultants and interns in international organizations.

Key questions to work on…

  • In some countries, the government owns the telecom companies; in others, telecom companies are national subsidiaries of international corporations. In countries where the government is actively surveilling networks for LGBTQI activity, or planning to, how does the type of telecom company factor in?
  • What datasets do we need on LGBTQI people for better programming?
  • How do we properly anonymize data collected? What are the standards of best practices?
  • What policies need to be in place to better protect LGBTQI staff, consultants and interns? What kind of sensitizing activities, trainings and programming need to be done for local staff and less LGBTQI sensitive international staff in ICT4D organizations?
  • How much capacity have ICT4D/international organizations lost as a result of their policies for LGBTQI staff and stakeholders?
  • What are the roles and obligations of ICT4D/international organizations to their LGBTQI staff, now and in the future?
  • What are the ICT4D and international development programmatic links with LGBT stakeholders and staff? How do LGBT stakeholders intersect with water? Public health? Nutrition? Food security? Governance and transparency? Human rights? Humanitarian crises? How do LGBT staff intersect with capacity? Trainings? Programming?
  • How do we safely and responsibly increase the visibility of LGBTQI people around the world?
  • How do we engage tech companies that are pro-LGBTQI, including Google, to do more for those who cannot or do not engage with their services?
  • What are the economic costs of homophobia, and does this provide a compelling enough case for countries to stop systemic LGBTQI-phobic behavior?
  • How do we mainstream LGBTQI issues in bigger development conferences and discussions?

Thanks to the great folks at ThoughtWorks for hosting and providing a lovely breakfast to us! Technology Salons are carried out under Chatham House Rule, so no attribution has been made. If you’d like to join us for Technology Salons in future, sign up here!


It’s been two weeks since we closed out the M&E Tech Conference in DC and the Deep Dive in NYC. For those of you who missed it or who want to see a quick summary of what happened, here are some of the best tweets from the sessions.

We’re compiling blog posts and related documentation and will be sharing more detailed summaries soon. In the meantime, enjoy a snapshot!


Today as we jump into the M&E Tech conference in DC (we’ll also have a Deep Dive on the same topic in NYC next week), I’m excited to share a report I’ve been working on for the past year or so with Michael Bamberger: Emerging Opportunities in a Tech-Enabled World.

The past few years have seen dramatic advances in the use of hand-held devices (phones and tablets) for program monitoring and for survey data collection. Progress has been slower with respect to the application of ICT-enabled devices for program evaluation, but this is clearly the next frontier.

In the paper, we review how ICT-enabled technologies are already being applied in program monitoring and in survey research. We also review areas where ICTs are starting to be applied in program evaluation and identify new areas in which new technologies can potentially be applied. The technologies discussed include hand-held devices for quantitative and qualitative data collection and analysis, data quality control, GPS and mapping devices, environmental monitoring, satellite imaging and big data.

While the technological advances and the rapidly falling costs of data collection and analysis are opening up exciting new opportunities for monitoring and evaluation, the paper also cautions that more attention should be paid to basic quality control questions that evaluators normally ask about representativity of data and selection bias, data quality and construct validity. The ability to use techniques such as crowd sourcing to generate information and feedback from tens of thousands of respondents has so fascinated researchers that concerns about the representativity or quality of the responses have received less attention than is the case with conventional instruments for data collection and analysis.
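To illustrate the representativity concern, here is a minimal Python sketch (all numbers are invented for illustration) that compares the make-up of a crowdsourced sample against reference population shares and derives simple post-stratification weights. The kind of skew shown here – young urban men heavily over-represented – is exactly what unweighted crowdsourced data can hide.

    from collections import Counter

    # Census reference shares for the strata of interest (illustrative numbers).
    population_share = {"urban_male_18_35": 0.12, "urban_female_18_35": 0.12,
                        "rural_male_18_35": 0.18, "rural_female_18_35": 0.18,
                        "other": 0.40}

    # Strata of respondents to a hypothetical crowdsourced SMS survey.
    sample = (["urban_male_18_35"] * 610 + ["urban_female_18_35"] * 180 +
              ["rural_male_18_35"] * 90 + ["rural_female_18_35"] * 40 +
              ["other"] * 80)

    counts = Counter(sample)
    n = len(sample)
    for stratum, pop_p in population_share.items():
        sample_p = counts[stratum] / n
        # Post-stratification weight: how much to up/down-weight each response.
        weight = pop_p / sample_p if sample_p else float("inf")
        print(f"{stratum:24s} sample {sample_p:5.1%} vs population {pop_p:5.1%} "
              f"-> weight {weight:4.2f}")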

Some of the challenges include: the potential for selectivity bias in sample design; M&E processes driven by the requirements of the technology; over-reliance on simple quantitative data; low institutional capacity to introduce ICTs; resistance to change; and issues of privacy.

None of this is intended to discourage the introduction of these technologies, as the authors fully recognize their huge potential. One of the most exciting areas concerns the promotion of a more equitable society through simple and cost-effective monitoring and evaluation systems that give voice to previously excluded sectors of the target populations; and that offer opportunities for promoting gender equality in access to information. The application of these technologies however needs to be on a sound methodological footing.

The last section of the paper offers some tips and ideas on how to integrate ICTs into M&E practice and potential pitfalls to avoid. Many of these were drawn from Salons and discussions with practitioners, given that there is little solid documentation or evidence related to the use of ICTs for M&E.

Download the full paper here! 


Debate and thinking around data, ethics and ICTs have been growing and expanding a lot lately, which makes me very happy!

Coming up on May 22 in NYC, the engine room, Hivos, the Berkman Center for Internet and Society, and Kurante (my newish gig) are organizing the latest in a series of events as part of the Responsible Data Forum.

The event will be hosted at ThoughtWorks and it is in-person only. Space is limited, so if you’d like to join us, let us know soon by filling in this form. 

What’s it all about?

This particular Responsible Data Forum event is an effort to map the ethical, legal, privacy and security challenges surrounding the increased use and sharing of data in development programming. The Forum will aim to explore the ways in which these challenges are experienced in project design and implementation, as well as when project data is shared or published in an effort to strengthen accountability. The event will be a collaborative effort to begin developing concrete tools and strategies to address these challenges, which can be further tested and refined with end users at events in Amsterdam and Budapest.

We will explore the responsible data challenges faced by development practitioners in program design and implementation.

Some of the use cases we’ll consider include:

  • projects collecting data from marginalized populations, aspiring to respect a do no harm principle, but also to identify opportunities for informational empowerment
  • project design staff seeking to understand and manage the lifespan of project data, from collection through maintenance, utilization, and sharing or destruction
  • project staff that are considering data sharing or joint data collection with government agencies or corporate actors
  • project staff who want to better understand how ICT4D will impact communities
  • projects exploring the potential of popular ICT-related mechanisms, such as hackathons, incubation labs or innovation hubs
  • projects wishing to use development data for research purposes, and crafting responsible ways to use personally identifiable data for academic purposes
  • projects working with children under the age of 18, struggling to balance the need for data to improve programming approaches, and demand higher levels of protection for children

By gathering a significant number of development practitioners grappling with these issues, the Forum aims to pose practical and critical questions to the use of data and ICTs in development programming. Through collaborative sessions and group work, the Forum will identify common pressing issues for which there might be practical and feasible solutions. The Forum will focus on prototyping specific tools and strategies to respond to these challenges.

What will be accomplished?

Some outputs from the event may include:

  • Tools and checklists for managing responsible data challenges for specific project modalities, such as SMS surveys, constructing national databases, or social media scraping and engagement.
  • Best practices and ethical controls for data sharing agreements with governments, corporate actors, academia or civil society
  • Strategies for responsible program development
  • Guidelines for data-driven projects dealing with communities with limited representation or access to information
  • Heuristics and frameworks for understanding anonymity and re-identification of large development data sets (a minimal illustration follows this list)
  • Potential policy interventions to create greater awareness and possibly consider minimum standards
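On the anonymity and re-identification point above, one widely used heuristic is k-anonymity: every combination of quasi-identifier values in a released dataset should be shared by at least k records. A minimal Python sketch of the check might look like this (the column names and k threshold are illustrative assumptions):

    from collections import Counter

    def k_anonymity(records, quasi_identifiers):
        """Return the size of the smallest group sharing the same
        quasi-identifier values; the release is k-anonymous for any k
        at or below this number."""
        groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
        return min(groups.values())

    rows = [
        {"age_band": "18-25", "district": "A", "sex": "F", "income": 120},
        {"age_band": "18-25", "district": "A", "sex": "F", "income": 135},
        {"age_band": "26-35", "district": "B", "sex": "M", "income": 90},
    ]

    k = k_anonymity(rows, ["age_band", "district", "sex"])
    print(f"smallest group size: {k}")  # here 1 -> the third row is unique
    if k < 5:  # k=5 is a common rule of thumb, not a universal standard
        print("Dataset fails the k-anonymity floor; generalize or suppress.")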

Hope to see some of you on the 22nd! Sign up here if you’re interested in attending, and read more about the Responsible Data Forum here.

 


Last week’s Technology Salon New York City touched on ethics in technology for democracy initiatives. We heard from lead discussants Malavika Jayaram, Berkman Center for Internet and Society; Ivan Sigal, Global Voices; and Amilcar Priestley, Afrolatin@ Project. Though the topic was catalyzed by the Associated Press’ article on ‘Zunzuneo’ (a.k.a. ‘Cuban Twitter’) and subsequent discussions in the press and elsewhere, we aimed to cover some of the wider ethical issues encountered by people and organizations who implement technology for democracy programs.

Salons are off the record spaces, so no attribution is made in this post, but I’ve summarized the discussion points here:

First up: Zunzuneo

The media misinterpreted much of the Zunzuneo story. Zunzuneo was not a secret mission, according to one Salon participant, as it’s not in the remit of USAID to carry out covert operations. The AP article conflated a number of ideas regarding how USAID works and the contracting mechanisms that were involved in this case, he said. USAID and the Office of Transition Initiatives (OTI) frequently disguise members, organizations, and contractors that work for it on the ground for security reasons. (See USAID’s side of the story here). This may still be an ethical question, but it is not technically “spying.” The project was known within the OTI and development community, but on a ‘need to know’ basis. It was not a ‘fly by night’ operation; it was more a ‘quietly and not very effectively run project.’

There were likely ethics breaches in Zunzuneo, from a legal standpoint. It’s not clear whether the data and phone numbers collected from the Cuban public for the project were obtained in a legal or ethical way. Some reports say they were obtained through a mid-level employee (a “Cuban engineer who had gotten the phone list” according to the AP article). (Note: I spoke separately to someone close to the project who told me that user opt-in/opt-out and other standard privacy protocols were in place). It’s also not entirely clear whether, as the AP states, the user information collected was being categorized into segments who were loyal or disloyal to the Cuban government, information which could put users at risk if found out.

Zunzuneo took place in a broader historical and geo-political context. As one person put it, the project followed Secretary Clinton’s speeches on Internet Freedom. There was a rush to bring technology into the geopolitical space, and ‘the articulation of why technology was important collided with a bureaucratic process in USAID and the State Department (the ‘F process’) that absorbed USAID into the State Department and made development part of the State Department’s broader political agenda.’ This agenda had been in the works for quite some time, and was part of a wider strategy of quietly moving into development spaces and combining development, diplomacy, intelligence and military (defense), the so-called 3 D’s.

Implementers failed to think through good design, ethics and community aspects of the work. In a number of projects of this type, the idea was that if you give people technology, they will somehow create bottom up pressure for political social change. As one person noted, ‘in the Middle East, as a counter example, the tech was there to enable and assist people who had spent 8-10 years building networks. The idea that we can drop tech into a space and an uprising will just happen and it will coincidentally push the US geopolitical agenda is a fantasy.’ Often these kinds of programs start with a strategic communications goal that serves a political end of the US Government. They are designed with the idea that a particular input equals some kind of a specific result down the chain. The problem comes when the people doing the seeding of the ideas and inputs are not familiar with the context they will be operating in. They are injecting inputs into a space that they don’t understand. The bigger ethical question is: Why does this thought process prevail in development? Much of that answer is found in US domestic politics and the ways that initiatives get funded.

Zunzuneo was not a big surprise for Afrolatino organizations. According to one discussant, Afrolatino organizations were not surprised when the Zunzuneo article came out, given the geopolitical history and the ongoing presence of the US in Latin America. Zunzuneo was seen as a 21st Century version of what has been happening for decades. Though it was criticized, it was not seen as particularly detrimental. Furthermore, the Afrolatino community (within the wider Latino community) has had a variety of relationships with the US over time – for example, some Afrolatino groups supported the Contras. Many Afrolatino groups have felt that they were not benefiting overall from the mestizo governments who have held power. In addition, much of Latin America’s younger generation is less tainted by the Cold War mentality, and does not see US involvement in the region as necessarily bad. Programs like Zunzuneo come with a lot of money attached, so often wider concerns about their implications are not in the forefront because organizations need to access funding. Central American and Caribbean countries are only just entering into a phase of deeper analysis of digital citizenship, and views and perceptions on privacy are still being developed.

Perceptions of privacy

There are differences in perception when it comes to privacy and these perceptions are contextual. They vary within and across countries and communities based on age, race, gender, economic levels, comfort with digital devices, political perspective and past history. Some older people, for example, are worried about the privacy violation of having their voice or image recorded, because the voice, image and gaze hold spiritual value and power. These angles of privacy need to be considered as we think through what privacy means in different contexts and adapt our discourse accordingly.

Privacy is hard to explain, as one discussant said: ‘There are not enough dead bodies yet, so it’s hard to get people interested. People get mad when the media gets mad, and until an issue hits the media, it may go unnoticed. It’s very hard to conceptualize the potential harm from lack of privacy. There may be a chilling effect but it’s hard to measure. The digital divide comes in as well, and those with less exposure may have trouble understanding devices and technology. They will then have even greater trouble understanding beyond the device to data doubles, disembodied information and de-anonymization, which are about 7 levels removed from what people can immediately see. Caring a lot about privacy can get you labeled as paranoid or a crazy person in many places.’

Fatalism about privacy can also hamper efforts. In the developing world, many feel that everything is corrupt and inept, and that there is no point in worrying about privacy and security. ‘Nothing ever works anyway, so even if the government wanted to spy on us, they’d screw it up,’ is the feeling. This is often the attitude of human rights workers and others who could be at greatest risk from privacy breaches or data collection, such as that which was reportedly happening within Zunzuneo. Especially among populations and practitioners who have less experience with new technologies and data, this can create large-scale risk.

Intent, action, context and consequences

Good intentions with little attention to privacy vs data collection with a hidden political agenda. Where are the lines when data that are collected for a ‘good cause’ (for example, to improve humanitarian response) might be used for a different purpose that puts vulnerable people at risk? What about data that are collected with less altruistic intentions? What about when the two scenarios overlap? Data might be freely given or collected in an emergency that would be considered a privacy violation in a ‘development’ setting, or the data collection may lead to a privacy violation post-emergency. Often, slapping the ‘obviously good and unarguably positive’ label of ‘Internet freedom’ on something implies that it’s unquestionably positive when it may in fact be part of a political agenda with a misleading label. There is a long history of those with power collecting data that helps them understand and/or control those with less power, as one Salon participant noted, and we need to be cognizant of that when we think about data and privacy.

US Government approaches to political development often take an input/output approach, when, in fact, political development is not the same as health development. ‘In political work, there is no clear and clean epidemiological goal we are trying to reach,’ noted a Salon participant. Political development is often contentious and the targets and approaches are very different than those of health. When a health model and rhetoric is used to work on other development issues, it is misleading. The wholesale adoption of these kinds of disease model approaches leaves people and communities out of the decision making process about their own development. Similarly, the rhetoric of strategic communications and its inclusion into the development agenda came about after the War on Terror, and it is also a poor fit for political development. The rhetoric of ‘opening’ and ‘liberating’ data is similar. These arguments may work well for one kind of issue, but they are not transferable to a political agenda. One Salon participant pointed out the rhetoric of the privatization model also, and explained that a profound yet not often considered implication of the privatization of services is that once a service passes over to the private sector, the Freedom of Information Act (FOIA) does not apply, and citizens and human rights organizations lose FOIA as a tool. Examples included the US prison system and the Blackwater case of several years ago.

It can be confusing for implementers to know what to do, what tools to use, what funding to accept and when it is OK to bring in an outside agenda. Salon participants provided a number of examples where they had to make choices and felt ethics could have been compromised. Is it OK to sign people up on Facebook or Gmail during an ICT and education project, given these companies’ marketing and privacy policies? What about working on aid transparency initiatives in places where human rights work or crime reporting can get people killed or individual philanthropists/donors might be kidnapped or extorted? What about a hackathon where the data and solutions are later given to a government’s civilian-military affairs office? What about telling LGBT youth about a social media site that encourages LGBT youth to connect openly with one another (in light of recent harsh legal penalties against homosexuality)? What about employing a user-centered design approach for a project that will eventually be overlaid on top of a larger platform, system or service that does not pass the privacy litmus test? Is it better to contribute to improving healthcare while knowing that your software system might compromise privacy and autonomy because it sits on top of a biometric system, for example? Participants at the Salon face these ethical dilemmas every day, and as one person noted, ‘I wonder if I am just window dressing something that will look and feel holistic and human-centered, but that will be used to justify decisions down the road that are politically negative or go against my values.’ Participants said they normally rely on their own moral compass, but clearly many Salon participants are wrestling with the potential ethical implications of their actions.

What we can do? Recommendations from Salon participants

Work closely with and listen to local partners, who should be driving the process and decisions. There may be a role for an outside perspective, but the outside perspective should not trump the local one. Inculcate and support local communities to build their own tools, narratives, and projects. Let people set their own agendas. Find ways to facilitate long-term development processes around communities rather than being subject to agendas from the outside.

Consider this to be ICT for Discrimination and think in every instance and every design decision about how to dial down discrimination. Data lead to sorting, and data get lumped into clusters. Find ways during the design process to reduce the discrimination that will come from that sorting and clustering process. The ‘Do no harm’ approach is key. Practitioners and designers should also be wary of the automation of development and the potential for automated decisions to be discriminatory.

Call out hypocrisy. Those of us who sit at Salons or attend global meetings hold tremendous privilege and power as compared to most of the rest of the world. ‘It’s not landless farmers or disenfranchised young black youth in Brazil who get to attend global meetings,’ said one Salon attendee. ‘It’s people like us. We need to be cognizant of the advantage we have as holders of power.’ Here in the US, the participant added, we need to be more aware of what private sector US technology companies are doing to take advantage of and maintain their stronghold in the global market and how the US government is working to allow US corporations to benefit disproportionately from the current Internet governance structure.

Use a rights-based approach to data and privacy to help to frame these issues and situations. Disclosure and consent are sometimes considered extraneous, especially in emergency situations. People think ‘this might be the only time I can get into this disaster or conflict zone, so I’m going to Hoover up as much data as possible without worrying about privacy.’ On the other hand, sometimes organizations are paternalistic and make choices for people about their own privacy. Consent and disclosure are not new issues; they are merely manifested in new ways as new technology changes the game and we cannot guarantee anonymity or privacy any more for research subjects. There is also a difference between information a person actively volunteers and information that is passively collected and used without a person’s knowledge. Framing privacy in a human rights context can help place importance on both processes and outcomes that support people’s rights to control their own data and that increase empowerment.

Create a minimum standard for privacy. Though we may not be able to determine a ceiling for privacy, one Salon participant said we should at least consider a floor or a minimum standard. Actors on the ground will always feel that privacy standards are a luxury because they have little know-how and little funding, so creating and working within an ethical standard should be a mandate from donors. The standard could be established as an M&E criterion.

Establish an ethics checklist to decide on funding sources and create policies and processes that help organizations to better understand how a donor or sub-donor would access and/or use data collected as part of a project or program they are funding. This is not always an easy solution, however, especially for cash-strapped local organizations. In India, for example, organizations are legally restricted from receiving certain types of funding based on government concerns that external agencies are trying to bring in Western democracy and Western values. Local organizations have a hard time getting funding for anti-censorship or free speech efforts. As one person at the Salon said, ‘agencies working on the ground are in a bind because they can’t take money from Google because it’s tainted, they can’t take money from the State Department because it’s imperialism and they can’t take money from local donors because there are none.’
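As a thought experiment only, such a checklist could even be encoded so that it is applied consistently at the proposal stage. The questions below are a hypothetical starting point distilled from this discussion, not an agreed standard:

    FUNDING_ETHICS_CHECKLIST = [
        "Does the donor require access to raw, identifiable data?",
        "Can the donor or a sub-donor reuse project data for other purposes?",
        "Are there legal restrictions on receiving funds from this source?",
        "Could accepting this funding endanger staff or participants locally?",
        "Is there a data-sharing agreement covering retention and destruction?",
    ]

    def screen_funding(answers: dict) -> list:
        """Return the checklist items still flagged as unresolved or risky."""
        return [q for q in FUNDING_ETHICS_CHECKLIST if answers.get(q) != "ok"]

    # Example: only the first question has been resolved so far.
    answers = {FUNDING_ETHICS_CHECKLIST[0]: "ok"}
    for flag in screen_funding(answers):
        print("FLAG:", flag)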

Use encryption and other technology solutions. Given the low levels of understanding and awareness of these tools, more needs to be done so that more organizations learn how to use them, and they need to be made simpler, more accessible and user-friendly. ‘Crypto Parties’ can help get organizations familiar with encryption and privacy, but better outreach is needed so that organizations understand the relevance of encryption and feel welcome in tech-heavy environments.
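For organizations starting from zero, encrypting sensitive data at rest is one of the more accessible first steps. Here is a minimal sketch using the widely used Python cryptography library (key management is deliberately simplified; in practice the key must live separately from the data, e.g. in a key vault):

    from cryptography.fernet import Fernet

    # Generate once and store securely (NOT alongside the encrypted file).
    key = Fernet.generate_key()
    f = Fernet(key)

    # Encrypt a sensitive record before it touches disk or a shared server.
    plaintext = b"respondent_42,survey_response,..."
    token = f.encrypt(plaintext)

    # Only holders of the key can recover the original record.
    assert f.decrypt(token) == plaintext
    print("encrypted record:", token[:32], b"...")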

Thanks to participants and lead discussants for the great discussions and to ThoughtWorks for hosting us at their offices!

 If you’d like to attend future Salons, sign up here!


This is a cross post from Heather Leson, Community Engagement Director at the Open Knowledge Foundation. The original post appeared here on the School of Data site.

by Heather Leson

What is the currency of change? What can coders (consumers) do with IATI data? How can suppliers deliver the data sets? Last week I had the honour of participating in the Open Data for Development Codeathon and the International Aid Transparency Initiative Technical Advisory Group meetings. IATI’s goal is to make information about aid spending easier to access, use, and understand. It was great that these events were back-to-back to push a big picture view.
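As one small answer to “what can coders do with IATI data”: the sketch below, in Python, reads a locally downloaded IATI activities XML file and lists activity identifiers and titles. The file name is a placeholder and the element paths follow the IATI 2.x activity standard as I understand it – treat this as a starting point, not a reference implementation.

    import xml.etree.ElementTree as ET

    # Placeholder path: an IATI activities file downloaded from any publisher.
    tree = ET.parse("activities.xml")

    for activity in tree.getroot().iter("iati-activity"):
        identifier = activity.findtext("iati-identifier", default="(no id)")
        # In IATI 2.x, titles are wrapped in <narrative> elements.
        title = activity.findtext("title/narrative", default="(untitled)")
        print(identifier, "-", title)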

My big takeaways included similar themes that I have learned on my open source journey:

You can talk about open data [insert tech or OS project] all you want, but if you don’t have an interactive community (including mentorship programmes), an education strategy, engagement/feedback loops plan, translation/localization plan and a process for people to learn how to contribute, then you build a double-edged barrier: barrier to entry and barrier for impact/contributor outputs.

Currency

About the Open Data in Development Codeathon

At the Codeathon close, Mark Surman, Executive Director of the Mozilla Foundation, gave us a call to action to make the web. Well, in order to create a world of data makers, I think we should run aid and development processes through this mindset. What is the currency of change? I hear many people talking about theory of change and impact, but I’d like to add ‘currency’. This is not only about money; it is about using the best brainpower and best energy sources to solve real-world problems in smart ways. I think if we heed Mark’s call to action with a “yes, and”, then we can rethink how we approach complex change. Every single industry is suffering from the same issue: how to deal with the influx of supply and demand in information. We need to change how we approach the problem. Combined events like these give a window into tackling problems in a new format. It is not about the next greatest app, but more about asking: how can we learn from the Webmakers and build with each other in our respective fields and networks?

Ease of Delivery

The IATI community / network is very passionate about moving the ball forward on releasing data. During the sessions, it was clear that the attendees see some gaps and are already working to fill them. The new IATI website is set up to grow with a Community component. The feedback from each of the sessions was distilled by the IATI – TAG and Civil Society Guidance groups to share with the IATI Secretariat.

In the Open Data in Development, Impact of Open Data in Developing Countries, and CSO Guidance sessions, we discussed some key items about sharing, learning, and using IATI data. Farai Matsika, of the International HIV/AIDS Alliance, was particularly poignant in reminding us of IATI’s CSO purpose – we need to share data with those we serve.


One of the biggest themes was data ethics. As we rush to ask NGOs and CSOs to release data, what are some of the data pitfalls? Anahi Ayala Iaccuci of Internews and Linda Raftree of Plan International USA both reminded participants that data needs to be anonymized to protect those at risk. Ms. Iaccuci asked that we consider the complex nature of sharing both sides of the open data story – successes and failures. As well, she advised: don’t create trust, but think about who people are trusting. Turning this model around is key to rethinking assumptions. I would add to her point: trust and sharing are currency and will add to the success measures of IATI. If people don’t trust the IATI data, they won’t share and use it.

Anne Crowe of Privacy International frequently asked attendees to consider the ramifications of opening data. It is clear that the IATI TAG does not curate the data that NGOs and CSOs share. Thus it falls on each of these organizations to learn how to be data makers in order to contribute data to IATI. Perhaps organizations need a lead educator and curator to ensure the future success of the IATI process, including quality data.

I think that School of Data and the Partnership for Open Data have a huge part to play with IATI. My colleague Zara Rahman is collecting user feedback for the Open Development Toolkit, and Katelyn Rogers is leading the Open Development mailing list. We collectively want to help people become data makers and consumers who can effectively achieve their development goals using open data. This also means tackling the ongoing questions about data quality and data ethics.


Here are some additional resources shared during the IATI meetings.



Migration has been a part of the human experience since the dawn of time, and populations have always moved in search of resources and better conditions. Today, unaccompanied children and youth are an integral part of national and global migration patterns, often leaving their place of origin due to violence, conflict, abuse, or other rights violations, or simply to seek better opportunities for themselves.

It is estimated that 33 million (or some 16 percent) of the total migrant population today is younger than age 20. Child and adolescent migrants make up a significant proportion of the total population of migrants in Africa (28 percent), Asia (21 percent), Oceania (11 percent), Europe (11 percent), and the Americas (10 percent).

The issue of migration is central to the current political debate as well as to the development discussion, especially in conversations about the “post 2015” agenda. Though many organizations are working to improve children’s well-being in their home communities, prevention work with children and youth is not likely to end migration. Civil society organizations, together with children and youth, government, community members, and other stakeholders can help make migration safer and more productive for those young people who do end up on the move.

As the debate around migration rages, access to and use of ICTs is expanding exponentially around the globe. For this reason Plan International USA and the Oak Foundation felt it was an opportune time to take stock of the ways that ICTs are being used in the child and youth migration process.

Our new report, “Modern Mobility: the role of ICTs in child and youth migration” takes a look at:

  • how children and youth are using ICTs to prepare for migration; to guide and facilitate their journey; to keep in touch with families; to connect with opportunities for support and work; and to cope with integration, forced repatriation or continued movement; and
  • how civil society organizations are using ICTs to facilitate and manage their work; to support children and youth on the move; and to communicate and advocate for the rights of child and youth migrants.

In the Modern Mobility paper, we identify and provide examples of three core ways that child and youth migrants are using new ICTs during the different phases of the migration process:

  1. for communicating and connecting with families and friends
  2. for accessing information
  3. for accessing services

We then outline seven areas where we found CSOs are using ICTs in their work with child and youth migrants, and we offer some examples:

Ways that CSOs are using ICTs in their work with child and youth migrants.

Though we were able to identify some major trends in how children and youth themselves use ICTs and how organizations are experimenting with ICTs in programming, we found little information on the impact that ICTs and ICT-enabled programs and services have on migrating children and youth, whether positive or negative. Most CSO practitioners that we talked with said that they had very little awareness of how other organizations or initiatives similar to their own were using ICTs. Most also said they did not know where to find orientation or guidance on good practice in the use of ICTs in child-centered programming, ICTs in protection work (aside from protecting children from online risks), or use of ICTs in work with children and young people at various stages of migration. Most CSO practitioners we spoke with were interested in learning more, sharing experiences, and improving their capacities to use ICTs in their work.

Based on Plan Finland’s “ICT-Enabled Development Guide” (authored by Hannah Beardon), the Modern Mobility report provides CSOs with a checklist to support thinking around the strategic use of ICTs in general.

ICT-enabled development checklist developed by Hannah Beardon for Plan International.

We also offer a list of key considerations for practitioners who wish to incorporate new technologies into their work, including core questions to ask about access, age, capacity, conflict, connectivity, cost, disability, economic status, electricity, existing information ecosystems, gender, information literacy, language, literacy, power, protection, privacy, sustainability, and user-involvement.

Our recommendation for taking this area forward is to develop greater awareness and capacity among CSOs regarding the potential uses and risks of ICTs in work with children and youth on the move by:

  1. Establishing an active community of practice on ICTs and children and youth on the move.
  2. Mapping and sharing existing projects and programs.
  3. Creating a guide or toolbox on good practice for ICTs in work with children and youth on the move.
  4. Further providing guidance on how ICTs can help “normal” programs to reach out to and include children and youth on the move.
  5. Further documentation and development of an evidence base.
  6. Sharing and distributing this report for discussion and action.

Download the Modern Mobility report here.

We’d love comments and feedback, and information about examples or documentation/evidence that we did not come across while writing the report!

