
Archive for the ‘data’ Category

It’s been two weeks since we closed out the M&E Tech Conference in DC and the Deep Dive in NYC. For those of you who missed it or who want to see a quick summary of what happened, here are some of the best tweets from the sessions.

We’re compiling blog posts and related documentation and will be sharing more detailed summaries soon. In the meantime, enjoy a snapshot!

https://twitter.com/neuguy/status/515134807672909826

https://twitter.com/dalgoso/status/515136050793291776

https://twitter.com/neuguy/status/515166952378343425

https://twitter.com/neuguy/status/515184242595487744

https://twitter.com/schmutzie/status/515215243388014592

https://twitter.com/prefontaine/status/515222154670252032

https://twitter.com/richmanmax/status/515576201084411904

https://twitter.com/sandhya_c_rao/status/516343304448131072

https://twitter.com/dalgoso/status/519879358370955264



I spent last week in Berlin at the Open Knowledge Festival – a great place to talk ‘open’ everything and catch up on what is happening in this burgeoning area that crosses through the fields of data, science, education, art, transparency and accountability, governance, development, technology and more.

One session, on power, politics, inclusion and voice, encouraged participants to dig deeper into those four aspects of open data and open knowledge. The organizers kicked things off by asking us to get into small groups and talk about power. Our group was assigned the topic of “feeling powerless” and we shared personal experiences of when we had felt powerless. There were several women in my group, many of whom, unsurprisingly, recounted experiences that felt gendered.

The concept of ‘mansplaining’ came up. Mansplaining (according to Wikipedia) is a term that describes when a man speaks to a woman with the assumption that she knows less than he does about the topic being discussed because she is female. ‘Mansplaining is different from other forms of condescension because mansplaining is rooted in the assumption that, in general, a man is likely to be more knowledgeable than a woman.’

From there, we got into the tokenism we’d seen in development programs that say they want ‘participation’ but really don’t care to include the viewpoints of the participants. One member of our group talked about the feelings of powerlessness development workers create when they are dismissive of indigenous knowledge and assume they know more than the poor in general. “Like when they go out and explain climate change to people who have been farming their entire lives,” she said.

A lightbulb went off. It’s the same attitude as ‘mansplaining,’ but seen in development workers. It’s #devsplaining.

So I made a hashtag (of course) and tried to come up with a definition.

Devsplaining – when a development worker, academic, or someone who generally has more power within the ‘development industry’ speaks condescendingly to someone with less power. The devsplainer assumes that he/she knows more and has more right to an opinion because of his/her position and power within the industry. Devsplaining is rooted in the assumption that, in general, development workers are likely to be more knowledgeable about the lives and situations of the people who participate in their programs/research than the people themselves are.

What do people think? Any good examples?

Debate and thinking around data, ethics, and ICT have been growing and expanding a lot lately, which makes me very happy!

Coming up on May 22 in NYC, the engine room, Hivos, the Berkman Center for Internet and Society, and Kurante (my newish gig) are organizing the latest in a series of events as part of the Responsible Data Forum.

The event will be hosted at ThoughtWorks and it is in-person only. Space is limited, so if you’d like to join us, let us know soon by filling in this form. 

What’s it all about?

This particular Responsible Data Forum event is an effort to map the ethical, legal, privacy and security challenges surrounding the increased use and sharing of data in development programming. The Forum will aim to explore the ways in which these challenges are experienced in project design and implementation, as well as when project data is shared or published in an effort to strengthen accountability. The event will be a collaborative effort to begin developing concrete tools and strategies to address these challenges, which can be further tested and refined with end users at events in Amsterdam and Budapest.

We will explore the responsible data challenges faced by development practitioners in program design and implementation.

Some of the use cases we’ll consider include:

  • projects collecting data from marginalized populations, aspiring to respect a do-no-harm principle, but also to identify opportunities for informational empowerment
  • project design staff seeking to understand and manage the lifespan of project data, from collection through maintenance, utilization, and sharing or destruction
  • project staff considering data sharing or joint data collection with government agencies or corporate actors
  • project staff who want to better understand how ICT4D will impact communities
  • projects exploring the potential of popular ICT-related mechanisms, such as hackathons, incubation labs or innovation hubs
  • projects wishing to use development data for research purposes, and crafting responsible ways to use personally identifiable data for academic purposes
  • projects working with children under the age of 18, struggling to balance the need for data to improve programming approaches with demands for higher levels of protection for children

By gathering a significant number of development practitioners grappling with these issues, the Forum aims to pose practical and critical questions about the use of data and ICTs in development programming. Through collaborative sessions and group work, the Forum will identify common pressing issues for which there might be practical and feasible solutions. The Forum will focus on prototyping specific tools and strategies to respond to these challenges.

What will be accomplished?

Some outputs from the event may include:

  • Tools and checklists for managing responsible data challenges for specific project modalities, such as SMS surveys, constructing national databases, or social media scraping and engagement
  • Best practices and ethical controls for data sharing agreements with governments, corporate actors, academia or civil society
  • Strategies for responsible program development
  • Guidelines for data-driven projects dealing with communities with limited representation or access to information
  • Heuristics and frameworks for understanding anonymity and re-identification of large development data sets (a minimal code sketch of one such heuristic follows this list)
  • Potential policy interventions to create greater awareness and possibly consider minimum standards
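
Since re-identification heuristics are on the wish list above, here is a minimal sketch, my own illustration rather than an output of the Forum, of one of the simplest such checks: k-anonymity, i.e., how small the smallest group of records becomes once a few quasi-identifier columns are combined. All column names and records below are hypothetical.

```python
# A minimal sketch of a k-anonymity check: the smallest group size that
# remains after grouping records by quasi-identifier columns. k = 1 means
# at least one person is unique on those columns alone.
# All column names and records here are hypothetical, for illustration only.
import pandas as pd

def k_anonymity(df, quasi_identifiers):
    """Return the minimum group size over the quasi-identifier combination."""
    return int(df.groupby(quasi_identifiers).size().min())

# Made-up survey records
records = pd.DataFrame({
    "district": ["A", "A", "B", "B", "B"],
    "age_band": ["18-25", "18-25", "26-35", "26-35", "36-45"],
    "gender":   ["f", "f", "m", "m", "f"],
})

k = k_anonymity(records, ["district", "age_band", "gender"])
print(f"k-anonymity = {k}")  # prints 1: the last record is unique on these columns
```

If a combination of seemingly innocuous columns yields k = 1, at least one person in the data set is unique on those columns alone and is at risk of re-identification if the data are shared.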

Hope to see some of you on the 22nd! Sign up here if you’re interested in attending, and read more about the Responsible Data Forum here.

Last week’s Technology Salon New York City touched on ethics in technology for democracy initiatives. We heard from lead discussants Malavika Jayaram, Berkman Center for Internet and Society; Ivan Sigal, Global Voices; and Amilcar Priestley, Afrolatin@ Project. Though the topic was catalyzed by the Associated Press’ article on ‘Zunzuneo’ (a.k.a. ‘Cuban Twitter’) and subsequent discussions in the press and elsewhere, we aimed to cover some of the wider ethical issues encountered by people and organizations who implement technology for democracy programs.

Salons are off-the-record spaces, so no attribution is made in this post, but I’ve summarized the discussion points here:

First up: Zunzuneo

The media misinterpreted much of the Zunzuneo story. Zunzuneo was not a secret mission, according to one Salon participant, as it’s not in the remit of USAID to carry out covert operations. The AP article conflated a number of ideas regarding how USAID works and the contracting mechanisms that were involved in this case, he said. USAID and the Office of Transition Initiatives (OTI) frequently disguise members, organizations, and contractors that work for them on the ground for security reasons. (See USAID’s side of the story here.) This may still be an ethical question, but it is not technically “spying.” The project was known within the OTI and development community, but on a ‘need to know’ basis. It was not a ‘fly by night’ operation; it was more a ‘quietly and not very effectively run project.’

From a legal standpoint, there were likely ethics breaches in Zunzuneo. It’s not clear whether the data and phone numbers collected from the Cuban public for the project were obtained in a legal or ethical way. Some reports say they were obtained through a mid-level employee (a “Cuban engineer who had gotten the phone list” according to the AP article). (Note: I spoke separately to someone close to the project who told me that user opt-in/opt-out and other standard privacy protocols were in place). It’s also not entirely clear whether, as the AP states, the user information collected was being categorized into segments of users who were loyal or disloyal to the Cuban government, information which could put users at risk if found out.

Zunzuneo took place in a broader historical and geo-political context. As one person put it, the project followed Secretary Clinton’s speeches on Internet Freedom. There was a rush to bring technology into the geopolitical space, and ‘the articulation of why technology was important collided with a bureaucratic process in USAID and the State Department (the ‘F process’) that absorbed USAID into the State Department and made development part of the State Department’s broader political agenda.’ This agenda had been in the works for quite some time, and was part of a wider strategy of quietly moving into development spaces and combining development, diplomacy, intelligence and military (defense), the so-called 3 D’s.

Implementers failed to think through good design, ethics and community aspects of the work. In a number of projects of this type, the idea was that if you give people technology, they will somehow create bottom up pressure for political social change. As one person noted, ‘in the Middle East, as a counter example, the tech was there to enable and assist people who had spent 8-10 years building networks. The idea that we can drop tech into a space and an uprising will just happen and it will coincidentally push the US geopolitical agenda is a fantasy.’ Often these kinds of programs start with a strategic communications goal that serves a political end of the US Government. They are designed with the idea that a particular input equals some kind of a specific result down the chain. The problem comes when the people doing the seeding of the ideas and inputs are not familiar with the context they will be operating in. They are injecting inputs into a space that they don’t understand. The bigger ethical question is: Why does this thought process prevail in development? Much of that answer is found in US domestic politics and the ways that initiatives get funded.

Zunzuneo was not a big surprise for Afrolatino organizations. According to one discussant, Afrolatino organizations were not surprised when the Zunzuneo article came out, given the geopolitical history and the ongoing presence of the US in Latin America. Zunzuneo was seen as a 21st Century version of what has been happening for decades. Though it was criticized, it was not seen as particularly detrimental. Furthermore, the Afrolatino community (within the wider Latino community) has had a variety of relationships with the US over time – for example, some Afrolatino groups supported the Contras. Many Afrolatino groups have felt that they were not benefiting overall from the mestizo governments who have held power. In addition, much of Latin America’s younger generation is less tainted by the Cold War mentality, and does not see US involvement in the region as necessarily bad. Programs like Zunzuneo come with a lot of money attached, so often wider concerns about their implications are not in the forefront because organizations need to access funding. Central American and Caribbean countries are only just entering into a phase of deeper analysis of digital citizenship, and views and perceptions on privacy are still being developed.

Perceptions of privacy

There are differences in perception when it comes to privacy and these perceptions are contextual. They vary within and across countries and communities based on age, race, gender, economic levels, comfort with digital devices, political perspective and past history. Some older people, for example, are worried about the privacy violation of having their voice or image recorded, because the voice, image and gaze hold spiritual value and power. These angles of privacy need to be considered as we think through what privacy means in different contexts and adapt our discourse accordingly.

Privacy is hard to explain, as one discussant said: ‘There are not enough dead bodies yet, so it’s hard to get people interested. People get mad when the media gets mad, and until an issue hits the media, it may go unnoticed. It’s very hard to conceptualize the potential harm from lack of privacy. There may be a chilling effect but it’s hard to measure. The digital divide comes in as well, and those with less exposure may have trouble understanding devices and technology. They will then have even greater trouble understanding beyond the device to data doubles, disembodied information and de-anonymization, which are about 7 levels removed from what people can immediately see. Caring a lot about privacy can get you labeled as paranoid or a crazy person in many places.’

Fatalism about privacy can also hamper efforts. In the developing world, many feel that everything is corrupt and inept, and that there is no point in worrying about privacy and security. ‘Nothing ever works anyway, so even if the government wanted to spy on us, they’d screw it up,’ is the feeling. This is often the attitude of human rights workers and others who could be at greatest risk from privacy breaches or data collection, such as that which was reportedly happening within Zunzuneo. Especially among populations and practitioners who have less experience with new technologies and data, this can create large-scale risk.

Intent, action, context and consequences

Good intentions with little attention to privacy vs data collection with a hidden political agenda. Where are the lines when data that are collected for a ‘good cause’ (for example, to improve humanitarian response) might be used for a different purpose that puts vulnerable people at risk? What about data that are collected with less altruistic intentions? What about when the two scenarios overlap? Data might be freely given or collected in an emergency that would be considered a privacy violation in a ‘development’ setting, or the data collection may lead to a privacy violation post-emergency. Often, slapping the ‘obviously good and unarguably positive’ label of ‘Internet freedom’ on something implies that it’s unquestionably positive when it may in fact be part of a political agenda with a misleading label. There is a long history of those with power collecting data that helps them understand and/or control those with less power, as one Salon participant noted, and we need to be cognizant of that when we think about data and privacy.

US Government approaches to political development often take an input/output approach, when, in fact, political development is not the same as health development. ‘In political work, there is no clear and clean epidemiological goal we are trying to reach,’ noted a Salon participant. Political development is often contentious and the targets and approaches are very different than those of health. When a health model and rhetoric is used to work on other development issues, it is misleading. The wholesale adoption of these kinds of disease model approaches leaves people and communities out of the decision making process about their own development. Similarly, the rhetoric of strategic communications and its inclusion into the development agenda came about after the War on Terror, and it is also a poor fit for political development. The rhetoric of ‘opening’ and ‘liberating’ data is similar. These arguments may work well for one kind of issue, but they are not transferable to a political agenda. One Salon participant pointed out the rhetoric of the privatization model also, and explained that a profound yet not often considered implication of the privatization of services is that once a service passes over to the private sector, the Freedom of Information Act (FOIA) does not apply, and citizens and human rights organizations lose FOIA as a tool. Examples included the US prison system and the Blackwater case of several years ago.

It can be confusing for implementers to know what to do, what tools to use, what funding to accept and when it is OK to bring in an outside agenda. Salon participants provided a number of examples where they had to make choices and felt ethics could have been compromised. Is it OK to sign people up on Facebook or Gmail during an ICT and education project, given these companies’ marketing and privacy policies? What about working on aid transparency initiatives in places where human rights work or crime reporting can get people killed or individual philanthropists/donors might be kidnapped or extorted? What about a hackathon where the data and solutions are later given to a government’s civilian-military affairs office? What about telling LGBT youth about a social media site that encourages LGBT youth to connect openly with one another (in light of recent harsh legal penalties against homosexuality)? What about employing a user-centered design approach for a project that will eventually be overlaid on top of a larger platform, system or service that does not pass the privacy litmus test? Is it better to contribute to improving healthcare while knowing that your software system might compromise privacy and autonomy because it sits on top of a biometric system, for example? Participants at the Salon face these ethical dilemmas every day, and as one person noted, ‘I wonder if I am just window dressing something that will look and feel holistic and human-centered, but that will be used to justify decisions down the road that are politically negative or go against my values.’ Participants said they normally rely on their own moral compass, but clearly many Salon participants are wrestling with the potential ethical implications of their actions.

What can we do? Recommendations from Salon participants

Work closely with and listen to local partners, who should be driving the process and decisions. There may be a role for an outside perspective, but the outside perspective should not trump the local one. Inculcate and support local communities to build their own tools, narratives, and projects. Let people set their own agendas. Find ways to facilitate long-term development processes around communities rather than being subject to agendas from the outside.

Consider this to be ICT for Discrimination and think in every instance and every design decision about how to dial down discrimination. Data lead to sorting, and data get lumped into clusters. Find ways during the design process to reduce the discrimination that will come from that sorting and clustering process. The ‘Do no harm’ approach is key. Practitioners and designers should also be wary of the automation of development and the potential for automated decisions to be discriminatory.

Call out hypocrisy. Those of us who sit at Salons or attend global meetings hold tremendous privilege and power as compared to most of the rest of the world. ‘It’s not landless farmers or disenfranchised young black youth in Brazil who get to attend global meetings,’ said one Salon attendee. ‘It’s people like us. We need to be cognizant of the advantage we have as holders of power.’ Here in the US, the participant added, we need to be more aware of what private sector US technology companies are doing to take advantage of and maintain their stronghold in the global market and how the US government is working to allow US corporations to benefit disproportionately from the current Internet governance structure.

Use a rights-based approach to data and privacy to help to frame these issues and situations. Disclosure and consent are sometimes considered extraneous, especially in emergency situations. People think ‘this might be the only time I can get into this disaster or conflict zone, so I’m going to hoover up as much data as possible without worrying about privacy.’ On the other hand, sometimes organizations are paternalistic and make choices for people about their own privacy. Consent and disclosure are not new issues; they are merely manifested in new ways as new technology changes the game and we can no longer guarantee anonymity or privacy for research subjects. There is also a difference between information a person actively volunteers and information that is passively collected and used without a person’s knowledge. Framing privacy in a human rights context can help place importance on both processes and outcomes that support people’s rights to control their own data and that increase empowerment.

Create a minimum standard for privacy. Though we may not be able to determine a ceiling for privacy, one Salon participant said we should at least consider a floor or a minimum standard. Actors on the ground will always feel that privacy standards are a luxury because they have little know-how and little funding, so creating and working within an ethical standard should be a mandate from donors. The standard could be established as an M&E criterion.

Establish an ethics checklist to decide on funding sources and create policies and processes that help organizations to better understand how a donor or sub-donor would access and/or use data collected as part of a project or program they are funding. This is not always an easy solution, however, especially for cash-strapped local organizations. In India, for example, organizations are legally restricted from receiving certain types of funding based on government concerns that external agencies are trying to bring in Western democracy and Western values. Local organizations have a hard time getting funding for anti-censorship or free speech efforts. As one person at the Salon said, ‘agencies working on the ground are in a bind because they can’t take money from Google because it’s tainted, they can’t take money from the State Department because it’s imperialism and they can’t take money from local donors because there are none.’

Use encryption and other technology solutions. Given the low levels of understanding and awareness of these tools, more needs to be done so that more organizations learn how to use them, and they need to be made simpler, more accessible and user-friendly. ‘Crypto Parties’ can help get organizations familiar with encryption and privacy, but better outreach is needed so that organizations understand the relevance of encryption and feel welcome in tech-heavy environments.
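
As one concrete and accessible starting point, here is a minimal sketch of encrypting collected data at rest with the widely used Python ‘cryptography’ library; the file name and payload are hypothetical, and key management is deliberately oversimplified.

```python
# A minimal sketch of encrypting collected data at rest, using the Fernet
# recipe from the Python 'cryptography' library. File name and payload are
# hypothetical; key management is deliberately oversimplified here.
from cryptography.fernet import Fernet

# Generate a key once and store it somewhere safer than the data itself
key = Fernet.generate_key()
fernet = Fernet(key)

survey_responses = b'{"respondent": "anon-042", "answer": "..."}'
token = fernet.encrypt(survey_responses)  # ciphertext, safe to store or send

with open("responses.enc", "wb") as f:
    f.write(token)

# Later, anyone holding the key can recover the data
assert fernet.decrypt(token) == survey_responses
```

In practice the hard part is not the encryption call but deciding who holds the key and where it lives, which is exactly the kind of question a Crypto Party can help an organization think through.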

Thanks to participants and lead discussants for the great discussions and to ThoughtWorks for hosting us at their offices!

 If you’d like to attend future Salons, sign up here!


The NYC Technology Salon on February 28th examined the connection between bigger, better data and resilience. We held morning and afternoon Salons due to the high response rate for the topic. Jake Porway, DataKind; Emmanuel Letouzé, Harvard Humanitarian Initiative; and Elizabeth Eagen, Open Society Foundations, were our lead discussants for the morning. Max Shron, Data Strategy, joined Emmanuel and Elizabeth for the afternoon session.

This post summarizes key discussions from both Salons.

What the heck do we mean by ‘big data’?

The first question at the morning salon was: What precisely do we mean by the term ‘big data’? Participants and lead discussants had varying definitions. One way of thinking about big data is that it is comprised of small bits of unintentionally produced ‘data exhaust’ (website cookies, cellphone data records, etc.) that add up to a dataset. In this case, the term big data refers to the quality and nature of the data, and we think of non-sampled data that are messy, noisy and unstructured. The mindset that goes with big data is one of ‘turning mess into meaning.’

Some Salon participants understood big data as datasets that are too large to be stored, managed and analyzed via conventional database technologies or managed on normal computers. One person suggested dropping the adjective ‘big,’ forgetting about the size, and instead considering the impact of the contribution of the data to understanding. For example, if there were absolutely no data on something and 1000 data points were contributed, this might have a greater impact than adding another 10,000 data points to an existing set of 10 million.

The point here was that when the emphasis is on big (understood as size and/or volume), someone with a small data set (for example, one that fits into an Excel sheet) might feel inadequate, yet their data contribution may actually be ‘bigger’ than a physically larger data set (aha! it’s not the size of the paintbrush…). There was a suggestion that instead of talking about big data we should talk about smart data.

How can big data support development?

Two frameworks were shared for thinking about big data in development. One from UN Global Pulse considers that big data can improve a) real-time awareness, b) early warning and c) real-time monitoring. Another looks at big data being used for three kinds of analysis: a) descriptive (providing a summary of something that has already happened), b) predictive (likelihood and probability of something occurring in the future), and c) diagnostic (causal inference and understanding of the world).
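
To make the second framework more concrete, here is a toy sketch, my own illustration rather than something drawn from either framework, of the three analysis types applied to a made-up monthly series of clinic visits.

```python
# A toy sketch of the three analysis types on a made-up monthly series of
# clinic visits. All numbers are hypothetical, for illustration only.
import numpy as np

visits = np.array([120, 130, 125, 140, 155, 150, 160, 170])

# Descriptive: summarize what has already happened
print("mean visits:", visits.mean())

# Predictive: naive linear extrapolation of the next month
t = np.arange(len(visits))
slope, intercept = np.polyfit(t, visits, 1)
print("forecast for next month:", slope * len(visits) + intercept)

# Diagnostic: does a covariate (here, made-up rainfall) move with visits?
rainfall = np.array([80, 90, 85, 100, 120, 115, 125, 140])
print("visits/rainfall correlation:", np.corrcoef(visits, rainfall)[0, 1])
# Correlation alone is not causal inference; real diagnostic work needs more.
```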

What’s the link between big data and resilience?

‘Resilience’ as a concept is contested, difficult to measure and complex. In its most simple definition, resilience can be thought of as the ability to bounce back or bounce forward. (For an interesting discussion on whether we should be talking about sustainability or resilience, see this piece). One discussant noted that global processes and structures are not working well for the poor, as evidenced by continuing cycles of poverty and glaring wealth inequalities. In this view, people are poor as a result of being more exposed and vulnerable to shocks; at the same time, their poverty increases their vulnerability, making it difficult to escape a cycle in which small and large shocks deplete assets over time. An assets-based model of resilience would help individuals, families and communities who are hit by a shock in one sphere — financial, human, capital, social, legal and/or political — to draw on the assets within another sphere to bounce back or forward.

Big data could help this type of assets-based model of resilience by predicting, or helping poor and vulnerable people predict, when a shock might happen so they can prepare for it. Big data analytics, if accessible to the poor, could help them to increase their chances of making better decisions now and for the future. Big data, then, should be made accessible and available to communities so that they can self-organize and decrease their own exposure to shocks and hazards and increase their ability to bounce back and bounce forward. Big data could also help various actors to develop a better understanding of the human ecosystem and contribute to increasing resilience.

Can ivory tower big data approaches contribute to resilience?

The application of big data approaches to efforts that aim to increase resilience and better understand human ecosystems often comes at things from the wrong angle, according to one discussant. We are increasingly seeing situations where a decision is made at the top by people who know how to crunch data yet have no way of really understanding the meaning of the data in the local context. In these cases, the impact of data on resilience will be low, because resilience can only truly be created and supported at the local level. Instead of large organizations thinking about how they can use data from afar to ‘rescue’ or ‘help’ the poor, organizations should be working together with communities in crisis (or supporting local or nationally based intermediaries to facilitate this process) so that communities can discuss and pull meaning from the data, contextualize it and use it to help themselves. They can also become more informed about what data exist about them and more aware of how these data might be used.

For the Human Rights community, for example, the story is about how people successfully use data to advocate for their own rights, and there is less emphasis on large data sets. Rather, the goal is to get data to citizens and communities. It’s to support groups to define and use data locally and to think about what the data can tell them about the advocacy path they could take to achieve a particular goal.

Can data really empower people?

To better understand the opportunities and challenges of big data, we need to unpack questions related to empowerment. Who has the knowledge? The access? Who can use the data? Salon participants emphasized that change doesn’t come by merely having data. Rather it’s about using big data as an advocacy tool to tell the world to change processes and to put things normally left unsaid on the table for discussion and action. It is also about decisions and getting ‘big data’ to the ‘small world,’ e.g., the local level. According to some, this should be the priority of ‘big data for development’ actors over the next 5 years.

Some participants at the Salon felt that data on their own do not empower individuals; others noted that knowing your credit score or tracking how much you are eating or exercising can indeed be empowering. In addition, the process of gathering data can help communities understand their own realities better, build their self-esteem and analytical capacities, and contribute to achieving a more level playing field when they are advocating for their rights or for a budget or service. As one Salon participant said, most communities have information but are not perceived to have data unless they collect it using ‘Western’ methods. Having data to support and back information, opinions and demands can serve communities in negotiations with entities that wield more power. (See the book “Who Counts? The Power of Participatory Statistics” on how to work with communities to create ‘data’ from participatory approaches).

On the other hand, data are not enough if there is no political will to make change to respond to the data and to the requests or demands being made based on the data. As one Salon participant said: “giving someone a data set doesn’t change politics.”

Should we all jump on the data bandwagon?

Both discussants and participants made a plea to ‘practice safe statistics!’ Human rights organizations wander in and out of statistics and don’t really understand how it works, said one person. ‘You wouldn’t go to court without a lawyer, so don’t try to use big data unless you can ensure it’s valid and you know how to manage it.’ If organizations plan to work with data, they should have statisticians and/or data scientists on staff or on call as partners and collaborators. Lack of basic statistical literacy is a huge issue amongst the general population and within many organizations, thought leaders, and journalists, and this can be dangerous.

As big data becomes more trendy, the risk of misinterpretation is growing, and we need to place more attention on the responsible use of statistics and data or we may end up harming people by bad decisions. ‘Everyone thinks they are experts who can handle statistics – bias, collection, correlation’ these days. And ‘as a general rule, no matter how many times you say the data show possible correlation not causality, the public will understand that there is causality,’ commented one discussant. And generally, he noted, ‘when people look at data, they believe them as truth because they include numbers, statistics, science.’ Greater statistical literacy could help people to not just read or access data and information but to use them wisely, to understand and question how data are interpreted, and to detect political or other biases. What’s more, organizations today are asking questions about big data that have been on statisticians’ minds for a very long time, so reaching out to those who understand these issues can be useful to avoid repeating mistakes and re-learning lessons that have already been well-documented.
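
As a small illustration of why this caution matters, two completely independent random walks, with no causal connection at all, will frequently show a strong correlation. The sketch below, my own example rather than one from the Salon, estimates how often.

```python
# A minimal illustration of spurious correlation: two independent random
# walks, with no causal link at all, frequently show a strong correlation.
import numpy as np

rng = np.random.default_rng(42)
trials, length = 1000, 200
strong = 0
for _ in range(trials):
    a = np.cumsum(rng.normal(size=length))  # e.g., a trending indicator
    b = np.cumsum(rng.normal(size=length))  # an unrelated trending indicator
    if abs(np.corrcoef(a, b)[0, 1]) > 0.7:
        strong += 1

print(f"{strong / trials:.0%} of independent pairs show |r| > 0.7")
# Far more pure-noise pairs 'correlate' than intuition suggests, which is why
# trends over time need careful treatment before any causal story is told.
```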

This poor statistical literacy becomes a serious ethical issue when data are used to determine funding or actions that impact on people’s lives, or when they are shared openly, accidentally or in ways that are unethical. In addition, privacy and protection are critical elements in using and working with data about people, especially when the data involve vulnerable populations. Organizations can face legal action and liability suits if their data put people at harm, as one Salon participant noted. ‘An organization could even be accused of manslaughter… and I’m speaking from experience,’ she added.

What can we do to move forward?

Some potential actions for moving forward included:

  • Emphasizing to donors that having big data does not mean that community-level processes related to data collection, interpretation, analysis, and ownership should be eliminated to cut costs;
  • Evaluations and literature/documentation on the effectiveness of different tools and methods, and on when and in which contexts they might be applicable, including cost-benefit analyses of using big data and evaluations of its impact on development and on communities when combined with community-level processes versus used alone, without community involvement. Practitioners’ gut feeling is that big data without community involvement is irresponsible and ineffective in terms of resilience, and it would be good to have evidence to help validate or disprove this;
  • More and better tools and resources to support data collection, visualization and use and to help organizations with risk analysis, privacy impact assessments, strategies and planning around use of big data; case studies and a place to share and engage with peers; creation of a ‘cookbook’ to help organizations understand the ingredients, tools and processes of using data/big data in their work;
  • ‘Normative conventions’ on how big data should be used to avoid falling into tech-driven dystopia;
  • Greater capacity for ‘safe statistics’ among organizations;
  • A community space where frank and open conversations around data/big data can occur in an ongoing way with the right range of people and cross-section of experiences and expertise from business, data, organizations, etc.

In conclusion?

We touched upon all types of data and various levels of data usage for a huge range of purposes at the two Salons. One closing thought was around the importance of having a solid idea of what questions we are trying to answer before moving on to collecting data, and then understanding which data collection methods are adequate for our purpose, which ICT tools are right for which data collection and interpretation methods, what will be done with the data and why we are collecting them, how we’ll interpret them, and how data will be shared, with whom, and in what format.

See this growing list of resources related to Data and Resilience here and add yours!

Thanks to participants and lead discussants for the fantastic exchange, and a big thank you to ThoughtWorks for hosting us at their offices for this Salon. Thanks also to Hunter Goldman, Elizabeth Eagen and Emmanuel Letouzé for their support developing this Salon topic, and to Somto Fab-Ukozor for support with notes and the summary. Salons are held under Chatham House Rule, therefore no attribution has been made in this post. If you’d like to attend future Salons, sign up here!

