
Posts Tagged ‘tips’

Last year I wrote a report for UNHCR that explores the potential of digital mental health and psycho-social support (MHPSS) for displaced and stateless adolescents. The key question was whether digital could help to safely expand MHPSS services to a population that is often at high risk due to life circumstances and contexts, yet remains largely under-served.

While it is possible that digital could provide some support (and in fact many young people already go online to find mental health support, especially from their peers), there is also a debate raging around whether social media and the online environment are key contributors to the adolescent mental health crisis. As usual, when you dig into a complex area like this one, nuance is important.

To unpack the topic, we started with the World Health Organization’s traditional MHPSS pyramid, which is used by most humanitarian organizations to frame their MHPSS work. We adapted the pyramid to consider how digital interventions might be safely and feasibly incorporated at the different layers. This presupposes that adolescents can safely access the digital environment so that they can take advantage of digital MHPSS services.

Figure 1. A revised mental health and psycho-social support (MHPSS) pyramid showing ways that digital interventions might enhance adolescent MHPSS. An underlying MHPSS approach that supports safe internet access and safe digital environments for adolescents is needed to enable these interventions.

The report summarizes existing evidence and insights from UNHCR staff working in several country operations to lay out the case and the caveats for digital MHPSS for forcibly displaced and stateless adolescents. We offer ideas on if, when, where, and how digital MHPSS might be explored as an option for reaching these adolescents. We also look at the risks of digital interventions, and explore contextual challenges with digital interventions for this population. This leads to a set of core insights into the key benefits of digital MHPSS at the different levels of the MHPSS Pyramid alongside the barriers, limitations, and risks.

We highlight good practices for designing and implementing digital MHPSS programming with forcibly displaced and stateless adolescents and make recommendations for further action by UNHCR at strategic, advocacy, policy, monitoring, evaluation, research, operational, and guidance levels. Rounding off the report is a checklist for practitioners to follow when designing and implementing digital MHPSS approaches and interventions.

Read the full report here or take a glance at the executive summary and let us know what you think!

*****

Over the past 4 years I’ve had the opportunity to look more closely at the role of ICTs in Monitoring and Evaluation practice (and the privilege of working with Michael Bamberger and Nancy MacPherson in this area). When we started out, we wanted to better understand how evaluators were using ICTs in general, how organizations were using ICTs internally for monitoring, and what was happening overall in the space. A few years into that work we published the Emerging Opportunities paper that aimed to be somewhat of a landscape document or base report upon which to build additional explorations.

As a result of this work, in late April I had the pleasure of talking with the OECD-DAC Evaluation Network about the use of ICTs in Evaluation. I drew from a new paper on The Role of New ICTs in Equity-Focused Evaluation: Opportunities and Challenges that Michael, Veronica Olazabal and I developed for the Evaluation Journal. The core points of the talk are below.

*****

In the past two decades there have been 3 main explosions that impact on M&E: a device explosion (mobiles, tablets, laptops, sensors, dashboards, satellite maps, Internet of Things, etc.); a social media explosion (digital photos, online ratings, blogs, Twitter, Facebook, discussion forums, WhatsApp groups, co-creation and collaboration platforms, and more); and a data explosion (big data, real-time data, data science and analytics moving into the field of development, capacity to process huge data sets, etc.). This new ecosystem is something that M&E practitioners should be tapping into and understanding.

In addition to these ‘explosions,’ there’s been a growing emphasis on documenting the use of ICTs in Evaluation, alongside a greater thirst for understanding how, when, where and why to use ICTs for M&E. We’ve held and attended large gatherings on ICTs and Monitoring, Evaluation, Research and Learning (MERL Tech). And in the past year or two, it seems the development and humanitarian fields can’t stop talking about the potential of “data” – small data, big data, inclusive data, real-time data for the SDGs, etc. – and the possible roles for ICT in collecting, analyzing, visualizing, and sharing that data.

The field has advanced in many ways. But as the tools and approaches develop and shift, so do our understandings of the challenges. Concerns around more data and “open data” and the inherent privacy risks have caught up with the enthusiasm about the possibilities of new technologies in this space. Likewise, there is more in-depth discussion about methodological challenges, bias and unintended consequences when new ICT tools are used in Evaluation.

Why should evaluators care about ICT?

There are 2 core reasons that evaluators should care about ICTs. Reason number one is practical. ICTs help address real-world challenges in M&E: insufficient time, insufficient resources and poor-quality data. And let’s be honest – ICTs are not going away, and evaluators need to accept that reality at a practical level as well.

Reason number two is both professional and personal. If evaluators want to stay abreast of their field, they need to be aware of ICTs. If they want to improve evaluation practice and influence better development, they need to know if, where, how and why ICTs may (or may not) be of use. Evaluation commissioners need to have the skills and capacities to know which new ICT-enabled approaches are appropriate for the type of evaluation they are soliciting and whether the methods being proposed are going to lead to quality evaluations and useful learnings. One trick to using ICTs in M&E is understanding who has access to what tools, devices and platforms already, and what kind of information or data is needed to answer what kinds of questions or to communicate which kinds of information. There is quite a science to this and one size does not fit all. Evaluators, because of their critical thinking skills and social science backgrounds, are very well placed to take a more critical view of the role of ICTs in Evaluation and in the worlds of aid and development overall and help temper expectations with reality.

Though ICTs are being used throughout the program cycle (research/diagnosis and consultation, design and planning, implementation and monitoring, evaluation, reporting/sharing/learning), there is plenty of hype in this space.

There is certainly a place for ICTs in M&E, if introduced with caution and clear analysis about where, when and why they are appropriate and useful, and evaluators are well-placed to take a lead in identifying and trialing what ICTs can offer to evaluation. If they don’t, others are going to do it for them!

Promising areas

There are four key areas (I’ll save the nuance for another time…) where I see a lot of promise for ICTs in Evaluation:

1. Data collection. Here I’d divide it into 3 kinds of data collection, noting that the latter two normally also provide ‘real-time’ data:

  • Structured data gathering – where enumerators or evaluators go out with mobile devices to collect specific types of data (whether quantitative or qualitative).
  • Decentralized data gathering – where the focus is on self-reporting or ‘feedback’ from program participants or research subjects.
  • Data ‘harvesting’ – where data is gathered from existing online sources like social media sites, WhatsApp groups, etc.
  • A note on real-time data – this aims to provide data in a much shorter time frame, normally for monitoring, but these data sets may be useful for evaluators as well.

2. New and mixed methods. These are areas that Michael Bamberger has been looking at quite closely. New ICT tools and data sources can contribute to more traditional methods. But triangulation still matters.

  • Improving construct validity – enabling a greater number of data sources at various levels that can contribute to better understanding of multi-dimensional indicators (for example, looking at changes in the volume of withdrawals from ATMs, records of electronic purchases of agricultural inputs, satellite images showing lorries traveling to and from markets, and the frequency of Tweets that contain the words hunger or sickness).
  • Evaluating complex development programs – tracking complex and non-linear causal paths and implementation processes by combining multiple data sources and types (for example, participant feedback plus structured qualitative and quantitative data, big data sets/records, census data, social media trends and input from remote sensors).
  • Mixed methods approaches and triangulation – using traditional and new data sources (for example, using real-time data visualization to provide clues on where additional focus group discussions might need to be done to better understand the situation or improve data interpretation).
  • Capturing wide-scale behavior change – using social media data harvesting and sentiment analysis to better understand wide-spread changes in perceptions, attitudes and stated behaviors, and how these shift over time (a minimal harvesting sketch follows this list).
  • Combining big data and real-time data – these emerging approaches may become valuable for identifying potential problems and emergencies that need further exploration using traditional M&E approaches.
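To make the ‘data harvesting’ idea concrete, here is a minimal sketch in Python of the keyword-frequency approach mentioned above (counting posts that mention ‘hunger’ or ‘sickness’). It assumes you already have an export of public social media posts in a hypothetical CSV file with ‘date’ and ‘text’ columns; gathering that export, and the consent and terms-of-service questions around it, are a separate matter.

```python
import pandas as pd

# Hypothetical export of public posts with 'date' and 'text' columns.
posts = pd.read_csv("posts_export.csv", parse_dates=["date"])

KEYWORDS = ["hunger", "sickness"]  # terms from the construct-validity example
pattern = "|".join(KEYWORDS)

# Flag posts mentioning any keyword (case-insensitive).
posts["mentions"] = posts["text"].str.contains(pattern, case=False, na=False)

# Weekly counts of keyword mentions: a crude proxy indicator to triangulate
# against survey or administrative data, never a measure on its own.
weekly = posts.set_index("date").resample("W")["mentions"].sum()
print(weekly.tail())
```

A trend line like this is only a clue; it still needs the triangulation against other data sources described above.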

3. Data Analysis and Visualization. This area is less advanced than data collection – often it seems we’re collecting more and more data but still not really using it! Some interesting things here include:

  • Big data and data science approaches – there’s a growing body of work exploring how to use predictive analytics to help define what programs might work best in which contexts and with which kinds of people. (How this connects to evaluation is still being worked out, and there are lots of ethical aspects to think about here too – most of us don’t like the idea of predictive policing, and in some ways you could end up somewhere that is not quite what was aimed at.) With big data, you’ll often have a hypothesis and go looking for patterns in huge data sets, whereas with evaluation you normally have particular questions and you design a methodology to answer them – it’s interesting to think about how these two approaches are going to combine.
  • Data Dashboards – these are becoming very popular as people try to work out how to do a better job of using the data that is coming into their organizations for decision making. There are some efforts at pulling data from community level all the way up to UN representatives, for example, the global level consultations that were done for the SDGs or using “near real-time data” to share with board members. Other efforts are more focused on providing frontline managers with tools to better tweak their programs during implementation.
  • Meta-evaluation – some organizations are working on ways to better draw conclusions from what we are learning from evaluation around the world and to better visualize these conclusions to inform investments and decision-making.

4. Equity-focused Evaluation. As digital devices and tools become more widespread, there is hope that they can enable greater inclusion and broader voice and participation in the development process. There are still huge gaps, however – in some parts of the world, 23% fewer women than men have access to mobile phones – and when you talk about Internet access the gap is much, much bigger. But there are cases where greater participation in evaluation processes is being sought through mobile. When this is balanced with other methods to ensure that we’re not excluding the very poorest or those without access to a mobile phone, it can help to broaden the pool of voices we are hearing from. Some examples are:

  • Equity-focused evaluation / participatory evaluation methods – some evaluators are seeking to incorporate more real-time (or near real-time) feedback loops where participants provide direct feedback via SMS or voice recordings.
  • Using mobile to directly access participants through mobile-based surveys (a minimal SMS sketch follows this list).
  • Enhancing data visualization for returning results back to the community and supporting community participation in data interpretation and decision-making.
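As one illustration of the mobile-survey idea, here is a minimal sketch that sends a single survey question by SMS. It uses the Twilio API purely as an example – any SMS gateway could play the same role – and all credentials and phone numbers are hypothetical placeholders. Handling the replies (via a webhook) and, crucially, obtaining consent are left out for brevity.

```python
from twilio.rest import Client

# Hypothetical credentials; in practice these come from your Twilio account.
ACCOUNT_SID = "ACxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
AUTH_TOKEN = "your_auth_token"

client = Client(ACCOUNT_SID, AUTH_TOKEN)

# Placeholder numbers for participants who have consented to SMS follow-up.
participants = ["+233200000001", "+233200000002"]

for number in participants:
    client.messages.create(
        body="Did this week's training meet your needs? Reply 1 for yes, 2 for no.",
        from_="+15550001234",  # your program's registered sending number
        to=number,
    )
```

Even a one-question loop like this needs the balancing methods mentioned above, so that people without phones are not silently dropped from the evaluation.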

Challenges

Alongside all the potential, of course there are also challenges. I’d divide these into 3 main areas:

1. Operational/institutional

Some of the biggest challenges to improving the use of ICTs in evaluation are institutional or related to institutional change processes. In focus groups I’ve done with different evaluators in different regions, this was emphasized as a huge issue. Specifically:

  • Potentially heavy up-front investment costs, training efforts, and/or maintenance costs if adopting/designing a new system at wide scale.
  • Tech- or tool-driven M&E processes – often these are also donor-driven. This happens because tech is perceived as cheaper, easier, more scalable and more objective. It also happens because people and management are under a lot of pressure to “be innovative.” Sometimes this ends up leading to an over-reliance on digital data and remote data collection, and to time spent developing tools and looking at data sets on a laptop rather than spending time ‘on the ground’ to observe and engage with local organizations and populations.
  • Little attention to institutional change processes, organizational readiness, and the capacity needed to incorporate new ICT tools, platforms, systems and processes.
  • Bureaucracy may mean that decisions happen far from the ground and that there is little capacity to make quick decisions, even when real-time data is available. Data and analysis may be provided frequently to decision-makers sitting at headquarters, or to local staff who do not have decision-making power in their own hands and must wait on orders from on high to adapt or change their program approaches and methods.
  • Swinging too far towards digital due to a lack of awareness that digital most often needs to be combined with human approaches. Digital technology works better when combined with human interventions (such as visits to prepare folks for using the technology, or making sure that gatekeepers, e.g., a husband or mother-in-law in the case of women, are on board). A main message from the World Bank’s 2016 World Development Report “Digital Dividends” is that digital technology must always be combined with what the Bank calls “analog” (a.k.a. “human”) approaches.

2. Methodological

Some of the areas that Michael and I have been looking at relate to how the introduction of ICTs could address issues of bias, rigor, and validity — yet how, at the same time, ICT-heavy methods may actually just change the nature of those issues or create new issues, as noted below:

  • Selection and sample bias – you may be reaching more people, but you’re still going to be leaving some people out. Who is left out of mobile phone or ICT access/use? Typical respondents are male, educated and urban. How representative are these respondents of all ICT users and of the total target population? (The weighting sketch after this list shows one partial mitigation.)
  • Data quality and rigor – you may have an over-reliance on self-reporting via mobile surveys; lack of quality control ‘on the ground’ because it’s all being done remotely; enumerators may game the system if there is no personal supervision; there may be errors and bias in algorithms and logic in big data sets or analysis because of non-representative data or hidden assumptions.
  • Validity challenges – if there is a push to use a specific ICT-enabled evaluation method or tool when it is not the right one for the job, the evaluation design may not hold up to validity scrutiny.
  • Fallacy of large numbers (in cases of national level self-reporting/surveying) — you may think that because a lot of people said something that it’s more valid, but you might just be reinforcing the viewpoints of a particular group. This has been shown clearly in research by the World Bank on public participation processes that use ICTs.
  • ICTs often favor extractive processes that do not involve local people and local organizations or provide benefit to participants/local agencies – data is gathered and sent ‘up the chain’ rather than shared or analyzed in a participatory way with local people or organizations. Not only is this disempowering, it may also harm data quality, since people see little point in providing data that brings them no benefit.
  • There’s often a failure to identify unintended consequences or biases arising from use of ICTs in evaluation — What happens when you introduce tablets for data collection? What happens when you collect GPS information on your beneficiaries? What risks might you be introducing or how might people react to you when you are carrying around some kind of device?
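On the selection-bias point above, one standard partial mitigation is post-stratification weighting: if a census or other reliable source tells you the true population shares of key groups, you can down-weight the groups your mobile survey over-represents. A minimal sketch, with entirely made-up numbers, is below. Note what it cannot fix: groups with no phone access contribute zero respondents and simply drop out of the estimate.

```python
import pandas as pd

# Hypothetical mobile-survey export: one row per respondent.
survey = pd.DataFrame({
    "group": ["urban_male", "urban_male", "urban_female", "rural_male"],
    "answer": [1, 0, 1, 1],  # e.g., 1 = reports improved access to services
})

# Known population shares for the same groups (e.g., from census data).
population_share = {
    "urban_male": 0.20, "urban_female": 0.20,
    "rural_male": 0.30, "rural_female": 0.30,
}

sample_share = survey["group"].value_counts(normalize=True)

# Post-stratification weight = population share / sample share.
survey["weight"] = survey["group"].map(
    lambda g: population_share[g] / sample_share[g]
)

naive = survey["answer"].mean()
weighted = (survey["answer"] * survey["weight"]).sum() / survey["weight"].sum()
print(f"naive estimate: {naive:.2f}, weighted estimate: {weighted:.2f}")
# rural_female never appears in the sample, so no weight can rescue that
# group: this is exactly the residual bias discussed above.
```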

3. Ethical and Legal

This is an area that I’m very interested in – especially as some donors have started asking for the raw data sets from any research, studies or evaluations that they are funding, and when these kinds of data sets are ‘opened’ there are all sorts of ramifications. There is quite a lot of heated discussion happening here. I was happy to see that DFID has just conducted a review of ethics in evaluation. Some of the core issues include:

  • Changing nature of privacy risks – issues here include privacy and protection of data; changing informed consent needs for digital data/open data; new risks of data leaks; and lack of institutional policies with regard to digital data.
  • Data rights and ownership – there are issues with proprietary data sets; data ownership when there are public-private partnerships; the idea of ‘data philanthropy’ when it’s not clear whose data is being donated; personal data ‘for the public good’; open data/open evaluation/transparency; poor care taken when vulnerable people provide personally identifiable information; household data sets ending up in the hands of those who might abuse them; and the increasing impossibility of data anonymization, given that crossing data sets often means that re-identification is easier than imagined (the sketch after this list illustrates this).
  • Moving decisions and interpretation of data away from ‘the ground’ and upwards to the head office/the donor.
  • Little funding for trialing/testing the validity of new approaches that use ICTs and documenting what is working/not working/where/why/how to develop good practice for new ICTs in evaluation approaches.
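To make the re-identification point concrete, here is a minimal sketch (with invented data) of a k-anonymity check: for every combination of ‘quasi-identifiers’ left in a supposedly anonymized file, count how many records share it. Any combination matched by exactly one record can be re-identified by anyone holding a second data set with those same fields.

```python
import pandas as pd

# Hypothetical 'anonymized' household data: direct identifiers removed,
# but quasi-identifiers (district, age, household size) remain.
records = pd.DataFrame({
    "district": ["A", "A", "B", "B", "B"],
    "age": [34, 34, 52, 52, 52],
    "household_size": [5, 5, 7, 3, 7],
    "monthly_income": [120, 90, 60, 200, 75],
})

QUASI_IDENTIFIERS = ["district", "age", "household_size"]

# k-anonymity: every quasi-identifier combination should match >= k records.
group_sizes = records.groupby(QUASI_IDENTIFIERS).size()
k = group_sizes.min()
unique_rows = int((group_sizes == 1).sum())
print(f"k = {k}; {unique_rows} combination(s) match exactly one household")
# Anyone who knows a household's district, age and size (say, from a voter
# roll) can cross the files and recover that 'anonymous' income record.
```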

Recommendations: 12 tips for better use of ICTs in M&E

Despite the rapid changes in the field in the 2 years since we wrote our initial paper on ICTs in M&E, most of our tips for doing it better still hold true.

  1. Start with a high-quality M&E plan (not with the tech).
    • But also learn about the new tech-related possibilities that are out there so that you’re not missing out on something useful!
  2. Ensure design validity.
  3. Determine whether and how new ICTs can add value to your M&E plan.
    • It can be useful to bring in a trusted tech expert in this early phase so that you can find out if what you’re thinking is possible and affordable – but don’t let them talk you into something that’s not right for the evaluation purpose and design.
  4. Select or assemble the right combination of ICT and M&E tools.
    • You may find one off the shelf, or you may need to adapt or build one. This is a really tough decision, which can take a very long time if you’re not careful!
  5. Adapt and test the process with different audiences and stakeholders.
  6. Be aware of different levels of access and inclusion.
  7. Understand motivation to participate, and incentivize in careful ways.
    • This includes motivation for both program participants and for organizations where a new tech-enabled tool/process might be resisted.
  8. Review/ensure privacy and protection measures, risk analysis.
  9. Try to identify unintended consequences of using ICTs in the evaluation.
  10. Build in ways for the ICT-enabled evaluation process to strengthen local capacity.
  11. Measure what matters – not what a cool ICT tool allows you to measure.
  12. Use and share the evaluation learnings effectively, including through social media.


*****

Crowdsourcing our Responsible Data questions, challenges and lessons. (Photo by Amy O’Donnell).

At Catholic Relief Services’ ICT4D Conference in May 2016, I worked with Amy O’Donnell (Oxfam GB) and Paul Perrin (CRS) to facilitate a participatory session that explored notions of Digital Privacy, Security and Safety. We had a full room, with a widely varied set of experiences and expertise.

The session kicked off with stories of privacy and security breaches. One person told of having personal data stolen when a federal government clearance database was compromised. We also shared how a researcher in Denmark scraped very personal data from the OkCupid online dating site and opened it up to the public.

A comparison was made between the OkCupid data situation and the work that we do as development professionals. When we collect very personal information from program participants, they may not expect that their household-level income, health data or personal habits will be ‘opened’ at some point.

Our first task was to explore and compare the meaning of the terms: Privacy, Security and Safety as they relate to “digital” and “development.”

What do we mean by privacy?

The “privacy” group talked quite a bit about contextuality of data ownership. They noted that there are aspects of privacy that cut across different groups of people in different societies, and that some aspects of privacy may be culturally specific. Privacy is concerned with ownership of data and protection of one’s information, they said. It’s about who owns data and who collects and protects it and notions of to whom it belongs. Private information is that which may be known by some but not by all. Privacy is a temporal notion — private information should be protected indefinitely over time. In addition, privacy is constantly changing. Because we are using data on our mobile phones, said one person, “Safaricom knows we are all in this same space, but we don’t know that they know.”

Another said that in today’s world, “You assume others can’t know something about you, but things are actually known about you that you don’t even know that others can know. There are some facts about you that you don’t think anyone should know or be able to know, but they do.” The group mentioned website terms and conditions, corporate ownership of personal data and a lack of control of privacy now. Some felt that we are unable to maintain our privacy today, whereas others felt that one could opt out of social media and other technologies to remain in control of one’s own privacy. The group noted that “privacy is about the appropriate use of data for its intended purpose. If that purpose shifts and I haven’t consented, then it’s a violation of privacy.”

What do we mean by security?

The Security group considered security to relate to an individual’s information. “It’s your information, and security of it means that what you’re doing is protected, confidential, and access is only for authorized users.” Security was also related to the location of where a person’s information is hosted and the legal parameters. Other aspects were related to “a barrier – an anti-virus program or some kind of encryption software, something that protects you from harm…. It’s about setting roles and permissions on software and installing firewalls, role-based permissions for accessing data, and cloud security of individuals’ data.” A broader aspect of security was linked to the effects of hacking that lead to offline vulnerability, to a lack of emotional security or feeling intimidated in an online space. Lastly, the group noted that “we, not the systems, are the weakest link in security – what we click on, what we view, what we’ve done. We are our own worst enemies in terms of keeping ourselves and our data secure.”

What do we mean by safety?

The Safety group noted that it’s difficult to know the difference between safety and security. “Safety evokes something highly personal. Like privacy… it’s related to being free from harm personally, physically and emotionally.” The group raised examples of protecting children from harmful online content or from people seeking to harm vulnerable users of online tools. The aspect of keeping your online financial information safe, and feeling confident that a service was ‘safe’ to use was also raised. Safety was considered to be linked to the concept of risk. “Safety engenders a level of trust, which is at the heart of safety online,” said one person.

In the context of data collection with the communities we work with, safety was connected to data minimization concepts and linked with vulnerability – a compounded vulnerability when it comes to online risk and safety. “If one person’s data is not safely maintained it puts others at risk,” noted the group. “And pieces of information that are innocuous on their own may become harmful when combined.” Lastly, the notion of safety as related to offline risk to an individual due to a specific online behavior or data breach was raised.

It was noted that in all of these terms: privacy, security and safety, there is an element of power, and that in this type of work, a power relations analysis is critical.

The Digital Data Life Cycle

After unpacking the above terms, Amy took the group through an analysis of the data life cycle (courtesy of the Engine Room’s Responsible Data website) in order to highlight the different moments where the three concepts (privacy, security and safety) come into play.

  • Plan/Design
  • Collect/Find/Acquire
  • Store
  • Transmit
  • Access
  • Share
  • Analyze/use
  • Retention
  • Disposal
  • Afterlife

Participants added additional stages in the data life cycle that they passed through in their work (coordinate, monitor the process, monitor compliance with data privacy and security policies). We placed the points of the data life cycle on the wall, and invited participants to:

  • Place a pink sticky note under the stage in the data life cycle that resonates or interests them most and think about why.
  • Place a green sticky note under the stage that is the most challenging or troublesome for them or their organizations and think about why.
  • Place a blue sticky note under the stage where they have the most experience, and to share a particular experience or tip that might help others to better manage their data life cycle in a private, secure and safe way.

Challenges, concerns and lessons

Design, as well as policy, is important!

  • Design drives everything else. We often start from the point of collection, when really it’s at the design stage that we should think about the burden of data collection and define the minimum we can ask of people. How we design – even how we get consent – can inform how the whole process happens.
  • When we get part-way through the data life cycle, we often wish we’d have thought of the whole cycle at the beginning, during the design phase.
  • In addition to good design, coordination of data collection needs to be thought about early in the process so that duplication can be reduced. This can also reduce fatigue for people who are asked over and over for their data.
  • Informed consent is a critical issue that needs to be linked with the entire process of design for the whole data life cycle. How do you explain to people that you will be giving their data away, anonymizing it, separating it out, encrypting it? There are often flow-down clauses in contracts that shift responsibilities for data protection and security, and it’s not always clear who is responsible for those data processes. How can you be sure that whoever holds those responsibilities is carrying them out properly and in a painstaking way?
  • Anonymization is also an issue. It’s hard to know at what level to anonymize things like call data records – the individual? The township? The district? And for how long will anonymization actually hold up?
  • The lack of good design and policy contributes to overlapping efforts and poor coordination of data collection efforts across agencies. We often collect too much data in poorly designed databases.
  • Policy is not enough – we need to do a much better job of monitoring compliance with policy.
  • Institutional Review Boards (IRBs) and compliance aspects need to be updated to the new digital data reality. At the same time, sometimes IRBs are not the right instrument for what we are aiming to achieve.

Data collection needs more attention.

  • Data collection is the easy part – where institutions struggle is with analyzing and doing something with the data we collect.
  • Organizations often don’t have a well-structured or systematic process for data collection.
  • We need to be clearer about what type of information we are collecting and why.
  • We need to update our data protection policy.

Reasons for data sharing are not always clear.

  • How can we share data securely and efficiently without building duplicative systems? We should be thinking more during the design and collection phase about whether the data is going to be interoperable and who needs to access it.
  • How can we get the right balance in terms of data sharing? Some donors really push for information that can put people in real danger – like details of people who have participated in particular programs that would put them at risk with their home governments. Organizations really need to push back against this. It’s an education thing with donors. Middle management and intermediaries are often the ones who push for this type of data because they don’t really have a handle on the risk it represents. They are the weak points because of the demands they are putting on people. This is a challenge for open data policies too – leaving the decision up to individuals invites the laziest possible job of thinking through the potential risks of that data.
  • There are legal aspects of sharing too – such as the USAID open data policy where those collecting data have to share with the government. But we don’t have a clear understanding of what the international laws are about data sharing.
  • There are so many pressures to share data but they are not all fully thought through!

Data analysis and use of data are key weak spots for organizations.

  • We are just beginning to think through capturing lots of data.
  • Data is collected but not always used. Too often it’s extractive data collection. We don’t have the feedback loops in place, and when there are feedback loops we often don’t use the feedback to make changes.
  • We often forget to go back to the people who have provided us with data to share it back with them. It’s not often that we hold a consultation with the community to really involve them in how the data can be used.

Secure storage is a challenge.

  • We have hundreds of databases across the agency in various formats, hard drives and states of security, privacy and safety. Are we able to keep these secure?
  • We need to think more carefully about where we hold our data and who has access to it. Sometimes our data is held by external consultants. How should we be addressing that?

Disposing of data properly in a global context is hard!

  • It’s difficult to dispose of data when there are multiple versions of it and a data footprint.
  • Disposal is an issue. We’re doing a lot of server upgrades, and many of these are in remote locations. How do we ensure that the right disposal process is happening globally, short of physically seeing that hard drives are smashed up?
  • We need to do a better job of disposal on personal laptops. I’ve done a lot of data collection on my personal laptop – no one has ever followed up to see if I’ve deleted it. How are we handling data handover? How do you really dispose of data?
  • Our organization hasn’t even thought about this yet!

Tips and recommendations from participants

  • Organizations should be using different tools. They should be using Pretty Good Privacy (PGP) encryption rather than relying on free or commercial tools like Google or Skype (a minimal encryption sketch follows this list).
  • People can be your weakest link if they are not aware or they don’t care about privacy and security. We send an email out to all staff on a weekly basis that talks about taking adequate measures. We share tips and stories. That helps to keep privacy and security front and center.
  • Even if you have a policy, the hard part is enforcement, accountability and policy reform. If our organizations are not setting policy around best practices in this area, then it’s on us to be sure we understand what best practice is, and to advocate for it. Let’s do what we can before the policy catches up.
  • The Responsible Data Forum and Tactical Tech have a great set of resources.
  • Oxfam has a Responsible Data Policy and Girl Effect have developed a Girls’ Digital Privacy, Security and Safety Toolkit that can also offer some guidance.
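On the PGP suggestion above, here is a minimal sketch of encrypting a data file using the python-gnupg wrapper. It assumes GnuPG is installed and that the recipient’s public key has already been imported; the file name and email address are hypothetical.

```python
import gnupg  # the python-gnupg package, a wrapper around an installed GnuPG

gpg = gnupg.GPG()  # uses the default GnuPG home directory

# Encrypt a file so only the named recipient's private key can decrypt it.
with open("household_survey.csv", "rb") as f:
    result = gpg.encrypt_file(
        f,
        recipients=["data-manager@example.org"],  # hypothetical recipient
        output="household_survey.csv.gpg",
    )

assert result.ok, result.status  # fails if, e.g., the key was never imported
```

The point is less the specific library than the principle: the data is unreadable in transit and at rest except to the person holding the matching private key.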

In conclusion, participants agreed that development agencies and NGOs need to take privacy, security and safety seriously. They can no longer afford to implement security at a lower level than corporations. “Times are changing and hackers are no longer just interested in financial information. People’s data is very valuable. We need to change and take security as seriously as corporates do!” as one person said.


*****

Plan just released a new report called ICT Enabled Development: Using ICT strategically to support Plan’s work. The report is part of an on-going process by Plan Finland (kudos to Mika Valitalo for leading the process) in collaboration with Plan USA to support Plan’s country offices in Africa to use ICTs strategically and effectively in their development work. It was written by Hannah Beardon and builds on the Mobiles for Development Guide that Plan Finland produced (also written by Hannah) in 2009.

The idea for the report came out of our work with staff and communities, and the sense that we needed to better understand and document the ICT4D context in the different countries where we are working. Country offices wanted to strengthen their capacities to strategically incorporate ICTs into their work and to ensure that any fund-raising efforts for ICTs were stemming from real needs and interest from the ground. Plan offices were also in the process of updating their long-term strategic plans and wanted to think through how and where they could incorporate ICTs in their work internally and with communities.

The process for creating the report included 2-day workshops with staff in 5 countries, using a methodology that Mika, Hannah and I put together. We created a set of ICT training materials and discussion questions and used a ‘distance-learning’ process, working with a point person in each office who planned and carried out the workshop. Mika and I supported via Skype and email.

Hannah researched existing reports and initiatives by participating offices to find evidence and examples of ICT use. She also held phone or Skype conversations with key staff at the country and regional levels around their ICT use, needs and challenges, and pulled together information on the national ICT context for each country.

The first section of the report explains the concept of ‘ICT enabled development’ and why it is important for Plan and other development organizations to take on board. “With so many ICT tools and applications now available, the job of a development organization is no longer to compensate for lack of access but to find innovative and effective ways of putting the tools to development ends. This means not only developing separate projects to install ICTs in under-served communities, but looking at key development challenges and needs with an ICT eye, asking ‘how could ICTs help to overcome this problem?’”

Drawing on the research, conversations, workshop input and feedback from staff, and documented experience using ICTs in Plan’s work, Hannah created a checklist with 10 key areas to think about when planning ICT-enabled development efforts.

  1. Context Analysis: what is happening with ICT (for development) in the country or region?
  2. Defining the need: what problems can ICT help overcome? what opportunities can it create?
  3. Choosing a strategy: what kind of ICT4D is needed? direct? internal? strategic?
  4. Undertaking a participatory communications assessment: who will benefit from this use of ICT and how?
  5. Choosing the technology: what ICTs/applications are available to meet this need or goal?
  6. Adjusting the content: can people understand and use the information provided for and by the ICTs?
  7. Building and using capacity: what kind of support will people need to use and benefit from the ICT, and to innovate around it?
  8. Monitoring progress: how do you know if the ICT is helping meet the development goal or need?
  9. Keeping it going: how can you manage risks and keep up with changes?
  10. Learning from each other: what has been done before, and what have you learned that others could use?

The checklist helps to ensure that ICT use is linked to real development needs and priorities and appropriate for those who are participating in an initiative or a project. The report elaborates on the 10 key areas with detailed observations, learning and examples to illustrate them and to help orient others who are working on similar initiatives. It places the checklist into a 4-stage process for ICT integration.

  1. Understanding the context for ICT work: includes external context and internal experience and capacity
  2. Finding a match between priorities and possibilities: rooting the system in local needs and priorities and finding good uses for tools and applications
  3. Planning and implementing concrete initiatives: carrying out participatory assessments, linking to other development processes and addressing technical issues and concerns
  4. Building a culture of systematic, sustained and strategic use of ICTs: linking ICTs with program work, transforming the role of ‘the ICT guy’, and building expertise on the cultural and social aspects of ICT use

Additional material and case studies, ICT country briefings, and an overview of Plan’s current work with ICT4D in Africa are offered at the end of the report.

The report includes input from Plan staff in Ghana, Mali, Mozambique, Senegal and Uganda who participated in the ICT4D workshops. It also draws heavily on some of the work that Mika has been doing in Finland and Kenya, and work that I’ve been involved in and have written about in Mali, Cameroon, Mozambique, Ghana, Benin and Kenya involving staff, community members and community youth. You can contact Mika to get the workshop methodology in French or English or to comment on the report (ict4d [at] plan [dot] fi).

There’s so much rich material in the report that I almost want to summarize the whole thing here on my blog, section by section, so that people will take the time to read it…  I think this is a really important and useful piece of work and we’re very excited that it’s now available! Download it here.

Related posts on Wait… What?

ICT4D in Uganda: ICT does not equal computers

Demystifying Internet (Ghana)

It’s all part of the ICT jigsaw: Plan Mozambique ICT4D workshops

A positively brilliant ICT4D workshop in Kwale, Kenya

7 or more questions to ask before adding ICTs (Benin)

A catalyst for positive change (Cameroon)

Salim’s ICT advice part 1: consider both process and passion (Kenya)

Salim’s ICT advice part 2: innovate but keep it real (Kenya)

Meeting in the middle

I and C, then T (US)

*****

the Ghana team: row 1: Steven, Joyce, Yaw, Samuel; row 2: Bismark, Maakusi, James, Chris, Dan

I was in a workshop in the Upper West Region of Ghana this past week. The goal was two-fold: 1) to train a small group of staff, ICT teachers and local partners on social media and new technologies for communications; and 2) to help them prepare for a project that will support 60 students to use arts and citizen media in youth-led advocacy around issues that youth identify.

I was planning to talk about how social media is different from traditional media, focusing on how it offers an opportunity to democratize information, and how we can support youth to use social media to reduce stereotypes about them and to bring their voices and priorities into global discussions. But all those theories about social media being the great equalizer and the Internet allowing everyone’s voices to flourish, yada yada, don’t mean a lot unless barriers related to language, electricity, gender and financial resources are lowered and people can actually access the Internet regularly.

Mobile internet access is extremely good in this part of Ghana, but when we did a quick exercise to see what the experience levels of the group were, only half had used email or the Internet before.  So I started there, rather than with my fluffy theories about democratization, voice, networks and many-to-many communications.

We got really good feedback from the participants on the workshop.  Here’s how we did it:

What is Internet?

I asked the ICT teachers to explain what the Internet is, and to then try to put it into words that the youth or someone in a community who hadn’t used a computer before would be able to understand.  We discussed ways in which radios, mobile phones, televisions are the same or different from the Internet.

How can you access Internet here?

We listed common ways to access Internet in the area: through a computer at an internet café or at home or work, through a mobile phone (“smart phone”), or via a mobile phone or flash-type modem connected to a computer (such as the ones that we were using at the workshop).  We went through how to connect a modem to a computer to access internet via the mobile network.

Exploring Internet and using search functions

Riffing off Google search

We jumped into Internet training by Googling the community’s name to see what popped up, then we followed the paths to where they led us. We found an article where the secondary school headmaster (who was participating in the workshop) had been interviewed about the needs of the school.

Everyone found it hilarious, as they didn’t know the headmaster was featured in an online article. This led to a good discussion on consent and permission, and on the fact that information that goes global doesn’t just stay global: more and more people can access that same information locally through the Internet, so you need to think carefully about what you say.

The article about the school had a comments stream. The first comment was directly related to the article, and said that the school deserved to get some help.  But the comments quickly turned to politics, including accusations that a local politician was stealing tractors.  Again this generated a big discussion, and again the local-global point hit home.  The internet is not ‘over there’ but potentially ‘right here’.  People really need to be aware of this when publishing something online or when being interviewed, photographed or filmed by someone who will publish something.

Other times when we’ve done this exercise, we haven’t found any information online about the community. In those cases, the lack of an online presence was a good catalyst to discuss why, and to motivate the community to get the skills and training to put up their own information. That is actually one of the goals of the project we are working on.

We used a projector, but small groups would have also been fine if there was no projector and a few computers were available. We generally use what we can pull together through our local offices, the small amount of equipment purchased with the project funds, and what the local school and partners have, and organize it however makes the most sense so that people can practice. 4-5 people per computer is fine for the workshop because people tend to teach each other and take turns. There will be some people who have more experience and who can show others how to do things, so that the facilitator can step out of the picture as soon as possible and just be available for questions or troubleshooting.

Social networks and privacy

When we Googled the name of the community, we also found a Facebook page for alums from the secondary school.  That was a nice segue into social networks.  I showed my Facebook page and a few others were familiar with Facebook. One colleague talked about how she had just signed up and was finding old school friends there who she hadn’t seen in years. People had a few questions such as ‘Is it free?  How do you do it? Can you make it yourself?  Who exactly can see it?’  So we had to enter the thorny world of privacy, hoping no one would be scared off from using Internet because of privacy issues.

One of the ICT teachers, for example, was concerned that someone could find his personal emails by Googling.  I used to feel confident when I said ‘no they can’t’ but now it seems you can never be certain who can see what (thank you Facebook).  I tried to explain privacy settings and that it’s important to understand how they work, suggesting they could try different things with low sensitivity information until they felt comfortable, and test by Googling their own name to see if anything came up.

Online truth and safety

Another question that surfaced was ‘Is the internet true?’ This provoked a great discussion about how information comes from all sides and anyone can put information online. And anyone else can discuss it. It’s a mix of truth and opinion: you can’t believe everything you read, it’s not regulated, and you need to find a few sources and make some judgment calls.

A participant brought up that children and youth could use Internet to find ‘bad’ things, that adults can prey on children and youth using the Internet.  We discussed that teachers and parents really need to have some understanding of how Internet works. Children and youth need to know how to protect themselves on the Internet; for example, not posting personal information or information that can identify their exact location.  We discussed online predators and how children and youth can stay secure, and how teachers and communities should learn more about Internet to support children and youth to stay safe.

We discussed the Internet as a place of both opportunities and risks, going back to our earlier discussions on Child Protection in this project and expanding on them.  I also shared an idea I’d seen on ICT Works about how to set up the computers in a way that the teachers/instructor can see all the screens and know what kids are doing on them – this is more effective than putting filters and controls on the machines.

Speaking of controls: virus protection and flash drives

The negative impact of viruses on productivity in African countries has been covered by the media, and I enthusiastically concur. I’ve wasted many hours because someone has come in with a flash drive that infected all the computers we are using at a workshop. Our general rule is no flash drives allowed during the workshop period. I have no illusions, however, that the computers will remain flash-drive free forever. One good way to reduce the risk of these autorun viruses is to disable autorun on the computers (a sketch of the equivalent registry tweak is below). This takes about 2 minutes. After you do that, you just have to manually access flash drives by opening My Computer from the start menu. A second trick is to create a protected autorun.inf file of your own, so the virus can’t plant its own and propagate on your machine. Avast is free software that seems to catch most autorun viruses. Trend Micro doesn’t seem to do very well in West Africa.
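For those who want to script the autorun fix instead of clicking through the registry, here is a minimal sketch using Python’s standard winreg module (Windows-only, and it must be run as administrator). Setting NoDriveTypeAutoRun to 0xFF disables AutoRun for all drive types, which is the same fix described above.

```python
import winreg  # Windows-only standard library module; run as administrator

# 0xFF disables AutoRun on every drive type, including USB flash drives.
key = winreg.CreateKey(
    winreg.HKEY_LOCAL_MACHINE,
    r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer",
)
winreg.SetValueEx(key, "NoDriveTypeAutoRun", 0, winreg.REG_DWORD, 0xFF)
winreg.CloseKey(key)
print("AutoRun disabled; flash drives must now be opened manually.")
```

After a reboot, flash drives no longer launch anything automatically; you open them yourself from My Computer, as described above.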

Hands on, hands on, hands on

I cannot stress enough the importance of hands on. We try to make sure that there is a lot of free time at this kind of workshop for people to play around online.  This usually means keeping the workshop space open for a couple hours after the official workshop day has ended and opening up early in the morning. People will skip lunch, come early, and stay late for an opportunity to get on-line. Those with more experience can use that time to help others. People often use this time to help each other open personal email accounts and share their favorite sites.

No getting too technical

People don’t want to listen to a bunch of theory or mechanical explanations on how things work. They don’t need to see the inside of a CPU, for example. They need to know how to make things work for them.  And the only way they will figure it out is practice, trial and error, playing around.  If a few people in the workshop are really curious to know the mechanics of something, they will start asking (if the facilitator is approachable and non-threatening), but most people for starters just want to know how to use the tools.

No showing off

I’ll always remember my Kenyan colleague Mativo saying that in this kind of work, a facilitator’s main role is demystifying ICTs.  So that means being patient and never making anyone feel stupid for asking a question, or showing any frustration with them.  If someone makes a mistake or goes down a path and doesn’t know how to get back and the facilitator has to step in to do some ‘magic’ fixing, it’s good to talk people through some of the ‘fix’ steps in a clear way as they are being done.

My friend DK over at Media Snackers said that he noticed something when working with youth vs. adults on Internet training: youth will click on everything to see what happens, while adults will ask what happens and ask for permission to click. [Update: Media Snackers calls this the ‘button theory’.] Paying close attention to each individual’s learning style and tendencies when facilitating (related to experience, rural or urban background, age, gender, literacy, other abilities and personality) and adjusting methodologies accordingly helps everyone learn better.

Have fun!

Lightening up the environment and making it hands on lowers people’s inhibitions and helps them have the confidence to learn by doing.

**Check back soon for a second post about photography, filming, uploading and setting up a YouTube account….

Related posts on Wait… What?

Child protection, the media and youth media programs

On girls and ICTs

Revisiting the topic of girls and ICTs

Putting Cumbana on the map: with ethics

