
Posts Tagged ‘safety’

As the world became more digital in the wake of COVID-19, the number of mobile applications and online services grew rapidly. Many of these apps offer important support to people who live and move in contexts where they are at risk. Digital apps for sensitive services (such as mental health, reproductive health, shelter and support for gender-based violence, and safe spaces for LGBTQI+ communities) can expose people to harm at the family, peer, and wider societal level if not designed carefully. This harm can be severe – for example, detention or death. Though people who habitually face risk have their own coping mechanisms, those designing digital apps and services also have a responsibility to mitigate harm.

At our March 8 Technology Salon NYC (hosted at Thoughtworks), we discussed how to create safe, private digital solutions for sensitive services. Joining were Gerda Binder, UNICEF’s Oky Period Tracker App for Girls; Jonathan McKay, SameSame Collective; Stephanie Mikkelson, United Nations Population Fund; Tania Lee, Trestle; Jane Piercy, Reproductive Equity Now Foundation; and 25 others, making for a rich discussion on this critical topic!

Key Takeaways from the conversation

1. Do constant threat modeling. Threat modeling needs to include a wide range of potential challenges, including mis- and disinformation, hostile family and community members, shifting legal landscapes, and law enforcement tactics. The latter are especially important if you are working in environments where people are being persecuted by their governments. Roughly 70 countries, for example, criminalize consensual same-sex activities and some forms of gender expression, most of them in Sub-Saharan Africa. The US is placing ever greater legal restrictions on gender expression and identity and on reproductive rights, and laws differ from state to state, making the legal landscape highly complex. Hate groups are organizing online to perpetrate violence against women, girls, and LGBTQI+ people in many other parts of the world as well. In Egypt, police have used the dating app Grindr to entrap, arrest, and prosecute gay men. Similar tactics were used in the US to identify and ‘out’ gay priests. Since political and social contexts and the tactics of those who want to do harm change rapidly, ongoing threat modeling is critical. Your threat models will look different in each context and for each digital app.

2. Involve communities and other stakeholders and experts. Co-creation processes are vital for identifying what to design as well as how to design for safety and privacy. By working together with communities, you will have a much better idea of what they need and want, the various challenges they face in accessing and using a digital tool, and the kinds of risks and harms that need to be reduced through design and during implementation. For example, a lot of apps have emergency buttons designed to protect women, one Salon participant explained. These often alert the police; however, that might be exactly the wrong choice. “Women will tell you about their experiences with police as perpetrators of gender-based violence” (GBV). It’s important to hire tech designers who identify with the groups you are designing for/with. Subject matter experts are key stakeholders, too. There are decades of experience working with groups who are at risk, so don’t reinvent the wheel. Standards exist for how to work on themes like GBV, data protection, and other aspects of safe design of apps and digital services – use them!

3. Collect as little data as possible. Despite the value of data in measuring impact and use and helping to adapt interventions to meet the needs of the target population, collection of personal and sensitive data is extremely dangerous for people using these apps and for organizations providing the services. Data collected from individuals who explicitly or implicitly admit to same-sex activities or gender non-conforming behavior could, in theory, be used by their family and community as evidence in their persecution. Similarly, sexual activity and fertility data tracked in a period tracker could be used to ‘prove’ that a girl or woman is/was fertile or infertile, had sex, miscarried, or aborted — all of which can be a risk depending on the family, social, or legal context. Communication on sensitive topics increases the risk of prosecution because email, web searches, social media posts, text messages, voice messages, call logs, and anything that can be found on a phone or computer can be used as evidence. If a digital app or service can function without collecting data, then it should! For example, it’s not necessary to collect a person’s data to provide them with legal advice or to allow them to track their period.
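To make the data-minimization point concrete, here is a minimal sketch (not drawn from any app discussed at the Salon) of a period tracker that keeps entries only in a local SQLite file on the device, with no account, no analytics, and no network calls. The class and file names are hypothetical.

```python
# A minimal sketch, assuming a hypothetical local-only period tracker:
# entries live in a SQLite file on the device and are never transmitted.
import sqlite3

class LocalCycleStore:
    def __init__(self, db_path: str = "entries.db"):  # hypothetical file name
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS entries (day TEXT PRIMARY KEY, note TEXT)"
        )

    def add_entry(self, day: str, note: str) -> None:
        # Stored locally only; there is deliberately no network call anywhere here.
        self.conn.execute("INSERT OR REPLACE INTO entries VALUES (?, ?)", (day, note))
        self.conn.commit()

    def delete_everything(self) -> None:
        # Supports a one-tap 'quick delete' option (see takeaway 6 below).
        self.conn.execute("DELETE FROM entries")
        self.conn.commit()
```

Because nothing leaves the device, there is no server-side dataset to hand over or subpoena, though (as takeaway 9 notes) locally stored data can still be coerced or lost along with the device.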

4. Be thoughtful about where data is stored. When using third party apps to help manage a digital solution, it’s important to know exactly what data is stored, whether the data can be deleted, and whether it can be subpoenaed. Also consider that if an app or third-party data processor is sold to another company, the data they store will likely be sold along with the app, and the policies related to data might change.

While sometimes it is safer to store data on an individual’s device, in other cases it might be safer for data to live in the cloud and/or in a different country. This will depend on the threat landscape and actors. You’ll also want to review data privacy regulations for the countries where you are based, where the data is stored, and where your target end users live. All of these regulations may need to be complied with depending on where data is collected, processed, and stored. Some countries have “data sovereignty laws” that dictate that data must reside in the country where it was collected. Some governments have even drafted laws that require that the government have access to this data. Others have so-called “hostage” laws that require digital platforms to maintain at least one employee in the country. These employees have been harassed by governments that push them to comply with certain types of censorship or to surrender data from their digital platforms. If government is your main threat actor, you might need to decide whether non-compliance with data laws is a risk you are willing to take.

5. Improve consent processes and transparency. Consent cannot be conceived of as a one-time, one-off process, because circumstances change and so does consent. Generally, digital platforms do a terrible job of telling people what happens to their data and informing them of the possible risks to their privacy and safety. It’s complicated to explain where data goes and what happens to it, but we all need to do better with consent and transparency. Engaging the people who will use your app in designing a good process is one way to develop easy-to-understand language and explanations.

6. Help people protect themselves. Add content to your website, app, or bot that helps people learn how to adjust their privacy settings, understand the risks of using your service, and protect themselves while doing so. Features mentioned by Salon participants include letting people disguise the apps they are using; quickly delete their data and/or the app itself; and mask or ‘forget’ phone numbers so that the number won’t appear in the contact list and text message content won’t repopulate if the number is used again to send a text. Organizations can also use different phone numbers for their website and for outreach so that the numbers are harder to trace back to the organization or a service.
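As a rough illustration of the ‘forget the number’ idea (not how any specific service implements it), a helpline could store only a keyed, one-way hash of each phone number, so repeat contacts can be recognized without the raw number ever being written to disk. The key and the number below are placeholders.

```python
# Rough sketch: keep a keyed digest of a phone number instead of the number itself.
import hashlib
import hmac

SECRET_KEY = b"held-only-by-the-organization"  # placeholder; manage this key securely

def forget_number(phone_number: str) -> str:
    """Return a one-way, keyed digest; the raw number is never stored."""
    normalized = phone_number.replace(" ", "").replace("-", "")
    return hmac.new(SECRET_KEY, normalized.encode(), hashlib.sha256).hexdigest()

seen_before = set()
digest = forget_number("+254 700 000 000")  # placeholder number
is_repeat_contact = digest in seen_before   # recognize repeats without keeping the number
seen_before.add(digest)
```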

7. Plan for the end of your project and/or funding. It’s important to plan for how you will safely delete all your data and any data held by third parties at the end of your funding cycle if the app or service is discontinued. In addition, you’ll need to think about what happens to the people who relied on your service. Will you leave them high and dry? Some organizations think of this as an “off ramp” and recommend that you plan for the end of the effort from the very beginning.

8. Take care of your staff. Ensure that you have enough staff capacity to respond to any incoming requests or needs of the people your service targets. Additionally, keep staff safe from harm. Countries like Hungary, Russia, and Indonesia have laws that make the provision of educational material related to LGBTQI+ identities challenging, especially to minors. Similarly, some countries and some US states prohibit any type of counseling related to abortion or gender-affirming care. This poses a risk to organizations that establish legal entities and employ people in these countries and states, and to their staff. It’s critical to ensure that you have enough resources to keep staff safe. You will also want to provide support for them to avoid high levels of burnout and to deal with any vicarious trauma. Keeping staff safe and healthy is not only good for them, but also for your service, because better morale will mean higher quality support services.

9. Accept that there will be trade-offs. Password-protected apps are more secure, but they can pose higher barriers to use because they introduce friction. If your app doesn’t collect personal data it will be safer, but it will be more difficult to offer password reset or recovery options, which is a usability challenge, especially in places where people have lower literacy and less experience using apps and passwords. When data is stored locally, it’s less susceptible to large-scale data mining, but it might be more at risk of a family member or law enforcement forcing it to be shared, and if a device is lost or broken, the data will be lost.

Large platforms may be more prone to commercial privacy risks, yet in some ways they provide greater data security. As one person said, “We decided to just go with WhatsApp because we could never develop a platform as secure as theirs – we simply don’t have the engineering power that they do.” Another person mentioned that they offer a Signal option (which is encrypted) for private messaging but that many people do not use Signal and prefer to communicate through platforms they already use. These more popular platforms are less secure, so the organization had to find other ways to set protective parameters for people who use them. Some organizations have decided that, despite the legal challenges it might bring, they simply will not hand over data to law enforcement. To prevent this situation from happening, they have only set up legal entities in countries where human rights protections for the populations they serve are strong. You’ll want to carefully discuss all these different privacy and usability choices, including with potential end users, to come to the best decision for each app or service.

Technology Salons run under Chatham House Rule, so no attribution has been made in this post. If you’d like to join us for a Salon, sign up here. If you’d like to suggest a topic or provide funding support to Salons in NYC please get in touch!

Crowdsourcing our Responsible Data questions, challenges and lessons. (Photo courtesy of Amy O’Donnell).

At Catholic Relief Services’ ICT4D Conference in May 2016, I worked with Amy O’Donnell (Oxfam GB) and Paul Perrin (CRS) to facilitate a participatory session that explored notions of Digital Privacy, Security and Safety. We had a full room, with a widely varied set of experiences and expertise.

The session kicked off with stories of privacy and security breaches. One person told of having personal data stolen when a federal government clearance database was compromised. We also shared how a researcher in Denmark scraped very personal data from the OkCupid online dating site and opened it up to the public.

A comparison was made between the OkCupid data situation and the work that we do as development professionals. When we collect very personal information from program participants, they may not expect that their household-level income, health data, or personal habits will be ‘opened’ at some point.

Our first task was to explore and compare the meaning of the terms: Privacy, Security and Safety as they relate to “digital” and “development.”

What do we mean by privacy?

The “privacy” group talked quite a bit about contextuality of data ownership. They noted that there are aspects of privacy that cut across different groups of people in different societies, and that some aspects of privacy may be culturally specific. Privacy is concerned with ownership of data and protection of one’s information, they said. It’s about who owns data and who collects and protects it and notions of to whom it belongs. Private information is that which may be known by some but not by all. Privacy is a temporal notion — private information should be protected indefinitely over time. In addition, privacy is constantly changing. Because we are using data on our mobile phones, said one person, “Safaricom knows we are all in this same space, but we don’t know that they know.”

Another said that in today’s world, “You assume others can’t know something about you, but things are actually known about you that you don’t even know that others can know. There are some facts about you that you don’t think anyone should know or be able to know, but they do.” The group mentioned website terms and conditions, corporate ownership of personal data and a lack of control of privacy now. Some felt that we are unable to maintain our privacy today, whereas others felt that one could opt out of social media and other technologies to remain in control of one’s own privacy. The group noted that “privacy is about the appropriate use of data for its intended purpose. If that purpose shifts and I haven’t consented, then it’s a violation of privacy.”

What do we mean by security?

The Security group considered security to relate to an individual’s information. “It’s your information, and security of it means that what you’re doing is protected, confidential, and access is only for authorized users.” Security was also related to the location of where a person’s information is hosted and the legal parameters. Other aspects were related to “a barrier – an anti-virus program or some kind of encryption software, something that protects you from harm…. It’s about setting roles and permissions on software and installing firewalls, role-based permissions for accessing data, and cloud security of individuals’ data.” A broader aspect of security was linked to the effects of hacking that lead to offline vulnerability, to a lack of emotional security or feeling intimidated in an online space. Lastly, the group noted that “we, not the systems, are the weakest link in security – what we click on, what we view, what we’ve done. We are our own worst enemies in terms of keeping ourselves and our data secure.”

What do we mean by safety?

The Safety group noted that it’s difficult to know the difference between safety and security. “Safety evokes something highly personal. Like privacy… it’s related to being free from harm personally, physically and emotionally.” The group raised examples of protecting children from harmful online content or from people seeking to harm vulnerable users of online tools. The aspect of keeping your online financial information safe, and feeling confident that a service was ‘safe’ to use was also raised. Safety was considered to be linked to the concept of risk. “Safety engenders a level of trust, which is at the heart of safety online,” said one person.

In the context of data collection for the communities we work with, safety was connected to data minimization concepts and linked with vulnerability – a vulnerability that is compounded when it comes to online risk and safety. “If one person’s data is not safely maintained it puts others at risk,” noted the group. “And pieces of information that are innocuous on their own may become harmful when combined.” Lastly, the notion of safety as related to offline risk, or risk to an individual due to a specific online behavior or data breach, was raised.

It was noted that in all of these terms: privacy, security and safety, there is an element of power, and that in this type of work, a power relations analysis is critical.

The Digital Data Life Cycle

After unpacking the above terms, Amy took the group through an analysis of the data life cycle (courtesy of the Engine Room’s Responsible Data website) in order to highlight the different moments where the three concepts (privacy, security and safety) come into play.


  • Plan/Design
  • Collect/Find/Acquire
  • Store
  • Transmit
  • Access
  • Share
  • Analyze/use
  • Retention
  • Disposal
  • Afterlife

Participants added further stages of the data life cycle that they pass through in their work (coordinate, monitor the process, monitor compliance with data privacy and security policies). We placed the points of the data life cycle on the wall and invited participants to:

  • Place a pink sticky note under the stage in the data life cycle that resonates or interests them most and think about why.
  • Place a green sticky note under the stage that is the most challenging or troublesome for them or their organizations and think about why.
  • Place a blue sticky note under the stage where they have the most experience, and to share a particular experience or tip that might help others to better manage their data life cycle in a private, secure and safe way.

Challenges, concerns and lessons

Design as well as policy are important!

  • Design drives everything else. We often start from the point of collection, when really it’s at the design stage that we should think about the burden of data collection and define the minimum we can ask of people. How we design – even how we get consent – can inform how the whole process happens.
  • When we get part-way through the data life cycle, we often wish we’d have thought of the whole cycle at the beginning, during the design phase.
  • In addition to good design, coordination of data collection needs to be thought about early in the process so that duplication can be reduced. This can also reduce fatigue for people who are asked over and over for their data.
  • Informed consent is a critical issue that needs to be linked with the design of the entire data life cycle. How do you explain to people that you will be giving their data away, anonymizing it, separating it out, encrypting it? There are often flow-down clauses in contracts that shift responsibilities for data protection and security, and it’s not always clear who is responsible for those data processes. How can you be sure that they are doing it properly and in a painstaking way?
  • Anonymization is also an issue. It’s hard to know to what level to anonymize things like call data records – to the individual? Township? District level? And for how long will anonymization actually hold up? (A rough aggregation sketch follows this list.)
  • The lack of good design and policy contributes to overlapping efforts and poor coordination of data collection efforts across agencies. We often collect too much data in poorly designed databases.
  • Policy is not enough – we need to do a much better job of monitoring compliance with policy.
  • Institutional Review Boards (IRBs) and compliance aspects need to be updated to the new digital data reality. At the same time, sometimes IRBs are not the right instrument for what we are aiming to achieve.
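As a rough illustration of the aggregation question above (the field names, districts, and threshold are made up for this example), call records can be shared as district-level counts with small cells suppressed, rather than as individual-level rows:

```python
# Rough sketch: aggregate call records to district level and suppress small cells,
# instead of releasing caller-level rows. All values below are illustrative.
from collections import Counter

call_records = [
    {"caller_id": "a91", "district": "North", "duration_s": 120},
    {"caller_id": "b22", "district": "North", "duration_s": 45},
    {"caller_id": "c37", "district": "South", "duration_s": 300},
]

calls_per_district = Counter(record["district"] for record in call_records)

# Withhold districts with fewer than K callers, since tiny counts can still
# point to individuals. K here is an arbitrary illustrative threshold.
K = 2
releasable = {district: n for district, n in calls_per_district.items() if n >= K}
print(releasable)  # {'North': 2}
```

Even aggregated data can sometimes be re-identified when combined with other datasets, which is why the group’s question about how long anonymization holds up matters.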

Data collection needs more attention.

  • Data collection is the easy part – where institutions struggle is with analyzing and doing something with the data we collect.
  • Organizations often don’t have a well-structured or systematic process for data collection.
  • We need to be clearer about what type of information we are collecting and why.
  • We need to update our data protection policy.

Reasons for data sharing are not always clear.

  • How can we share data securely and efficiently without building duplicative systems? We should be thinking more during the design and collection phases about whether the data is going to be interoperable and who needs to access it.
  • How can we get the right balance in terms of data sharing? Some donors really push for information that can put people in real danger – like details of people who have participated in particular programs that would put them at risk with their home governments. Organizations really need to push back against this. It’s an education thing with donors. Middle management and intermediaries are often the ones who push for this type of data because they don’t really have a handle on the risk it represents. They are the weak points because of the demands they are putting on people. This is a challenge for open data policies – leaving decisions open to individuals means the laziest possible job may be done of thinking through the potential risks of that data.
  • There are legal aspects of sharing too – such as the USAID open data policy where those collecting data have to share with the government. But we don’t have a clear understanding of what the international laws are about data sharing.
  • There are so many pressures to share data but they are not all fully thought through!

Data analysis and use of data are key weak spots for organizations.

  • We are just beginning to think through capturing lots of data.
  • Data is collected but not always used. Too often it’s extractive data collection. We don’t have the feedback loops in place, and when there are feedback loops we often don’t use the feedback to make changes.
  • We often forget to go back to the people who have provided us with data to share back with them. It’s not often that we hold a consultation with the community to really involve them in how the data can be used.

Secure storage is a challenge.

  • We have hundreds of databases across the agency in various formats, hard drives and states of security, privacy and safety. Are we able to keep these secure?
  • We need to think more carefully about where we hold our data and who has access to it. Sometimes our data is held by external consultants. How should we be addressing that?

Disposing of data properly in a global context is hard!

  • It’s difficult to dispose of data when there are multiple versions of it and a data footprint.
  • Disposal is an issue. We’re doing a lot of server upgrades, and many of these are in remote locations. How do we ensure that the right disposal process is happening globally, short of physically seeing that hard drives are smashed up?
  • We need to do a better job of disposal on personal laptops. I’ve done a lot of data collection on my personal laptop – no one has ever followed up to see if I’ve deleted it. How are we handling data handover? How do you really dispose of data?
  • Our organization hasn’t even thought about this yet!

Tips and recommendations from participants

  • Organizations should be using different tools. They should be using Pretty Good Privacy techniques rather than relying on free or commercial tools like Google or Skype. (A minimal encryption sketch follows this list.)
  • People can be your weakest link if they are not aware or they don’t care about privacy and security. We send an email out to all staff on a weekly basis that talks about taking adequate measures. We share tips and stories. That helps to keep privacy and security front and center.
  • Even if you have a policy, the hard part is enforcement, accountability, and policy reform. If our organizations are not setting policy around best practices in this area, then it’s on us to be sure we understand what best practice is, and to advocate for it. Let’s do what we can before the policy catches up.
  • The Responsible Data Forum and Tactical Tech have a great set of resources.
  • Oxfam has a Responsible Data Policy, and Girl Effect has developed a Girls’ Digital Privacy, Security and Safety Toolkit that can also offer some guidance.
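For the Pretty Good Privacy suggestion above, here is a minimal sketch of encrypting a file to a recipient’s public key by calling GnuPG from Python. It assumes gpg is installed and the recipient’s key has already been imported; the file name and recipient address are placeholders.

```python
# Minimal sketch: encrypt a file to a recipient's PGP public key via GnuPG.
# Assumes `gpg` is on the PATH and the recipient's public key is already imported.
import subprocess

def encrypt_for_recipient(path: str, recipient: str) -> str:
    encrypted_path = path + ".gpg"
    subprocess.run(
        ["gpg", "--batch", "--yes", "--output", encrypted_path,
         "--encrypt", "--recipient", recipient, path],
        check=True,
    )
    return encrypted_path

# Example (placeholders): encrypt_for_recipient("field_report.csv", "data-team@example.org")
```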

In conclusion, participants agreed that development agencies and NGOs need to take privacy, security and safety seriously. They can no longer afford to implement security at a lower level than corporations. “Times are changing and hackers are no longer just interested in financial information. People’s data is very valuable. We need to change and take security as seriously as corporates do!” as one person said.
