Posts Tagged ‘how to’

At the 2016 American Evaluation Association conference, I chaired a session on the benefits and challenges of ICTs in Equity-Focused Evaluation. The session frame came from a 2016 paper on the same topic. Panelists Kecia Bertermann from Girl Effect and Herschel Sanders from RTI added fascinating insights on the methodological challenges to consider when using ICTs for evaluation purposes, and discussant Michael Bamberger closed out with critical points based on his 50+ years of evaluation experience.

ICTs include a host of technology-based tools, applications, services, and platforms that are overtaking the world. We can think of them in three key areas: technological devices, social media/internet platforms and digital data.

An equity-focused evaluation implies ensuring space for the voices of excluded groups and avoiding a traditional top-down approach. It requires:

  • Identifying vulnerable groups
  • Opening up space for them to make their voices heard through channels that are culturally responsive, accessible and safe
  • Ensuring their views are communicated to decision makers

It is believed that ICTs, especially mobile phones, can help with inclusion in the implementation of development and humanitarian programming. Mobile phones are also held up as devices that can allow evaluators to reach isolated or marginalized groups and individuals who are not usually engaged in research and evaluation. Often, however, mobiles only overcome geographic exclusion. Evaluators need to think harder when it comes to other types of exclusion – such as that related to disability, gender, age, political status or views, ethnicity, literacy, or economic status – and we need to consider how these various types of exclusion can combine to exacerbate marginalization (e.g., “intersectionality”).

We are seeing increasing use of ICTs in evaluation of programs aimed at improving equity. Yet these tools also create new challenges. The way we design evaluations and how we apply ICT tools can make all the difference between including new voices and feedback loops or reinforcing existing exclusions or even creating new gaps and exclusions.

Some of the concerns with the use of ICTs in equity-focused evaluation include:

Methodological aspects:

  • Are we falling victim to ‘elite capture’ — only hearing from higher educated, comparatively wealthy men, for example? How does that bias our information? How can we offset that bias or triangulate with other data and multi-methods rather than depending only on one tool-based method?
  • Are we relying too heavily on things that we can count or multiple-choice responses because that’s what most of these new ICT tools allow?
  • Are we spending all of our time on a device rather than in communities engaging with people and seeking to understand what’s happening there in person?
  • Is reliance on mobile devices or self-reporting through mobile surveys causing us to miss contextual clues that might help us better interpret the data?
  • Are we falling into the trap of fallacy in numbers – in other words, imagining that because lots of people are saying something, that it’s true for everyone, everywhere?

Organizational aspects:

  • Do digital tools require a costly, up-front investment that some organizations are not able to make?
  • How do fear and resistance to using digital tools impact on data gathering?
  • What kinds of organizational change processes are needed amongst staff or community members to address this?
  • What new skills and capacities are needed?

Ethical aspects:

  • How are researchers and evaluators managing informed consent given the new challenges to privacy that come with digital data? (Also see: Rethinking Consent in the Digital Age)
  • Are evaluators and non-profit organizations equipped to keep data safe?
  • Is it possible to anonymize data in the era of big data given the capacity to cross data sets and re-identify people?
  • What new risks might we be creating for community members? For local enumerators? For ourselves as evaluators? (See: Developing and Operationalizing Responsible Data Policies)

Evaluation of Girl Effect’s online platform for girls

Kecia walked us through how Girl Effect has designed an evaluation of an online platform and applications for girls. The online platform itself brings constraints because it only works on feature phones and smartphones. For this reason, Girl Effect decided to work with 14-16 year old urban girls in megacities who have access to these types of devices yet still experience multiple vulnerabilities, such as gender-based violence and sexual violence, early pregnancy, low levels of school completion, poor health services and lack of reliable health information, and/or low self-esteem and self-confidence.

The big questions for this program include:

  • Is the content reaching the girls that Girl Effect set out to reach?
  • Is the content on the platform contributing to change?

Because the girl users are on the platform, Girl Effect can use features such as polls and surveys for self-reported change. However, because the girls are under 18, there are privacy and security concerns that sometimes limit the extent to which the organization feels comfortable tracking user behavior. In addition, the type of phones that the girls are using and the fact that they may be borrowing others’ phones to access the site adds another level of challenges. This means that Girl Effect must think very carefully about the kind of data that can be gleaned from the site itself, and how valid it is.

The organization is using a knowledge, attitudes and practices (KAP) framework and exploring ways that KAP can be measured through some of the exciting data capture options that come with an online platform. However, it’s hard to know whether offline behavior is actually shifting, making it important to also gather information that helps interpret the self-reported behavior data.

Girl Effect is complementing traditional KAP indicators with web analytics (unique users, repeat visitors, dwell times, bounce rates, ways that users arrive at the site), push surveys that go out to users, and polls that appear after an article (“Was this information helpful? Was it new to you? Did it change your perceptions? Are you planning to do something different based on this information?”). Proxy indicators are also being developed to help interpret the data. For example, does an increase in a particular user’s commenting frequency on the site correspond to greater self-esteem or self-efficacy?
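To illustrate how such a proxy indicator might be computed, here is a minimal Python sketch. The comment log, user IDs, and the "non-decreasing monthly frequency" rule are all invented for illustration; they are not Girl Effect's actual analytics or thresholds.

```python
from collections import Counter
from datetime import date

# Hypothetical comment log: (user_id, date) pairs pulled from site analytics.
comments = [
    ("girl_01", date(2016, 9, 1)), ("girl_01", date(2016, 9, 20)),
    ("girl_01", date(2016, 10, 5)), ("girl_01", date(2016, 10, 6)),
    ("girl_02", date(2016, 9, 3)),
]

def monthly_comment_counts(log, user):
    """Count one user's comments per (year, month)."""
    months = Counter((d.year, d.month) for u, d in log if u == user)
    return dict(sorted(months.items()))

def is_engagement_rising(log, user):
    """Hypothetical proxy signal: month-over-month commenting never drops."""
    counts = list(monthly_comment_counts(log, user).values())
    return len(counts) >= 2 and all(a <= b for a, b in zip(counts, counts[1:]))
```

A real evaluation would still need offline triangulation, as the post describes: a rising comment count is only a candidate signal, not evidence of self-esteem on its own.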

However, there is only so much that can be gleaned from an online platform when it comes to behavior change, so the organization is complementing the online information with traditional, in-person, qualitative data gathering. The site is helpful there, however, for recruiting users for focus groups and in-depth interviews. Girl Effect wants to explore KAP and online platforms, yet also wants to be careful about making assumptions and using proxy indicators, so the traditional methods are incorporated into the evaluation as a way of triangulating the data. The evaluation approach is a careful balance of security considerations, attention to proxy indicators, digital data and traditional offline methods.

Using SMS surveys for evaluation: Who do they reach?

Herschel took us through a study conducted by RTI (Sanders, Lau, Lombaard, Baker, Eyerman, Thalji) in partnership with TNS about the use of SMS surveys for evaluation. She noted that the rapid growth of mobile phones, particularly in African countries, opens up new possibilities for data collection, and there has been an explosion in the use of SMS for national, population-based surveys.

Like most ICT-enabled MERL methods, use of SMS for general population surveys brings both promise:

  • High mobile penetration in many African countries means we can theoretically reach a large segment of the population.
  • These surveys are much faster and less expensive than traditional face-to-face surveys.
  • SMS surveys work on virtually any GSM phone.
  • SMS offers the promise of reach. We can reach a large and geographically dispersed population, including some areas that are excluded from FTF surveys because of security concerns.

And challenges:

  • Coverage: We cannot include illiterate people or those without access to a mobile phone. Also, some sample frames may not include the entire population with mobile phones.
  • Non-response: Response rates are expected to be low for a variety of reasons, including limited network connectivity or electricity; if two or more people share a phone, we may not reach everyone associated with that phone; and people may lack confidence with the technology. These factors might affect certain sub-groups differently, so we might underrepresent the poor, rural residents, or women.
  • Quality of measurement: We only have 160 characters for both the question and the response options. Further, no interviewer is present to clarify questions.
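The 160-character constraint can be checked mechanically before a survey goes out. The sketch below is a rough pre-flight check under assumed formatting conventions (numbered reply options, plain character count); it is not part of RTI's toolkit, and real GSM 7-bit encoding counts some characters as two, so a production check would need to account for the encoding.

```python
SMS_LIMIT = 160  # single GSM message: question plus response options together

def fits_in_one_sms(question, options):
    """Render a question with numbered reply options and check its length.

    Counts plain characters only; GSM 7-bit encoding treats some characters
    (e.g. brackets, the euro sign) as two, so treat this as a rough check.
    """
    rendered = question + " Reply " + " ".join(
        f"{i}={opt}" for i, opt in enumerate(options, start=1)
    )
    return len(rendered) <= SMS_LIMIT, rendered

ok, text = fits_in_one_sms(
    "Did anyone in your household attend school last week?",
    ["Yes", "No", "Don't know"],
)
```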

RTI’s research aimed to answer the question: How representative are general population SMS surveys and are there ways to improve representativeness?

Three core questions were explored via SMS invitations sent in Kenya, Ghana, Nigeria and Uganda:

  • Does the sample frame match the target population?
  • Does non-response have an impact on representativeness?
  • Can we improve quality of data by optimizing SMS designs?

One striking finding was the extent to which response rates varied by country, Herschel said. In some cases this was affected by the agreements in place in each country; some required a stronger opt-in process. In Kenya and Uganda, where a higher percentage of users had already gone through an opt-in process and had already participated in SMS-based surveys, response rates were higher.

[Chart: SMS survey response rates by country]

These response rates, especially in Ghana and Nigeria, are noticeably low, and the impact of the low response rates in Nigeria and Ghana is evident in the data. In Nigeria, where researchers compared the SMS survey results against the face-to-face data, there was a clear skew away from older females and towards those with higher levels of education and full-time employment.

Additionally, 14% of the face-to-face sample, filtered on mobile users, had a post-secondary education, whereas in the SMS data this figure is 60%.
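To make that skew concrete, one can compute a simple representation ratio from the two figures above. This calculation is my own illustration, not part of RTI's analysis.

```python
def representation_ratio(share_in_sms, share_in_ftf):
    """Ratio > 1 means the subgroup is overrepresented in the SMS sample
    relative to the face-to-face benchmark; < 1 means underrepresented."""
    return share_in_sms / share_in_ftf

# Post-secondary education among mobile users: 14% face-to-face vs 60% SMS.
ratio = representation_ratio(0.60, 0.14)  # roughly 4.3x overrepresented
```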

Compared to the face-to-face data, SMS respondents were:

  • More likely to have more than 1 SIM card
  • Less likely to share a SIM card
  • More likely to be aware of and use the Internet.

This sketches a portrait of a more technologically savvy respondent in the SMS surveys, said Herschel.

[Chart: SMS vs. face-to-face respondent profiles]

The team also explored incentives and found that a higher incentive had no meaningful impact, but adding reminders to the design of the SMS survey process helped achieve a wider slice of the sample and a more diverse profile.

Response order effects were explored, along with issues related to questionnaire designers trying to pack as much as possible onto the screen rather than asking yes/no questions. Herschel highlighted that when multiple-choice options were given, 76% of SMS survey respondents gave only 1 response, compared to 12% for the face-to-face data.

Lastly, the research found no meaningful difference in response rates between a survey with 8 questions and one with 16 questions, she said. This may go against the common convention that “the shorter, the better” for an SMS survey. There was no observable break-off rate based on survey length, giving confidence that longer surveys may be possible via SMS than initially thought.

Herschel noted that some conclusions can be drawn:

  • SMS excels for rapid response (e.g., Ebola)
  • SMS surveys have substantial non-response errors
  • SMS surveys overrepresent younger, more educated, employed, and more technologically savvy respondents

These errors mean SMS cannot replace face-to-face surveys … yet. However, we can optimize SMS survey design now by:

  • Using reminders during data collection
  • Randomizing substantive response options to avoid response order bias
  • Avoiding “select all that apply” questions
  • Remembering that it’s OK to have longer surveys

However, she also noted that the landscape is rapidly changing and so future research may shed light on changing reactions as familiarity with SMS and greater access grow.

Summarizing the opportunities and challenges with ICTs in Equity-Focused Evaluation

Finally we heard some considerations from Michael, who said that people often get so excited about possibilities for ICT in monitoring, evaluation, research and learning that they neglect to address the challenges. He applauded Girl Effect and RTI for their careful thinking about the strengths and weaknesses in the methods they are using. “It’s very unusual to see the type of rigor shown in these two examples,” he said.

Michael commented that a clear message from both presenters and from other literature and experiences is the need for mixed methods. Some things can be done on a phone, but not all things. “When the data collection is remote, you can’t observe the context. For example, if it’s a teenage girl answering the voice or SMS survey, is the mother-in-law sitting there listening or watching? What are the contextual clues you are missing out on? In a face-to-face context an evaluator can see if someone is telling the girl how to respond.”

Additionally, “no survey framework will cover everyone,” he said. “There may be children who are not registered on the school attendance list that is being used to identify survey respondents. What about immigrants who are hiding from sight out of fear and not registered by the government?” He cautioned evaluators to not forget about folks in the community who are totally missed out and skipped over, and how the use of new technology could make that problem even greater.

Another point Michael raised is that communicating through technology channels creates a different behavior dynamic. One is not better than the other, but evaluators need to be aware that they are different. “Everyone with teenagers knows that the kind of things we communicate online are very different than what we communicate in a face-to-face situation,” he said. “There is a style of how we communicate. You might be more frank and honest on an online platform. Or you may see other differences in just your own behavior dynamics on how you communicate via different kinds of tools,” he said.

He noted that a range of issues has been raised in connection to ICTs in evaluation, but that it’s been rare to see priority given to evaluation rigor. The study Herschel presented was one example of a focus on rigor and issues of bias, but people often get so excited that they forget to think about this. “Who has access? Are people sharing phones? What are the gender dynamics? Is a husband restricting what a woman is doing on the phone? There’s a range of selection bias issues that are ignored,” he said.

Quantitative bias and mono-methods are another issue in ICT-focused evaluation. The tool choice will determine what an evaluator can ask, and that in turn affects the quality of responses. This leads to issues with construct validity. If you are trying to measure complex ideas like girls’ empowerment and you reduce this to a proxy, there can often be a large jump in interpretation. This doesn’t happen only when using mobile phones for evaluation data collection purposes, but certain problems may be exacerbated when the phone is the tool. So evaluators need to better understand behavior dynamics and how they relate to the technical constraints of a particular digital or mobile platform.

The aspect of information dissemination is another one worth raising, said Michael. “What are the dynamics? When we incorporate new tools, we tend to assume there is just one step between the information sharer and receiver, yet there is plenty of literature that shows this is normally at least 2 steps. Often people don’t get information directly, but rather they share and talk with someone else who helps them verify and interpret the information they get on a mobile phone. There are gatekeepers who control or interpret, and evaluators need to better understand those dynamics. Social network analysis can help with that sometimes – looking at who communicates with whom. Who is part of the main influencer hub? Who is marginalized? This could be exciting to explore more.”
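A first pass at the kind of social network analysis Michael describes can be as simple as counting communication ties per person. The names and edges below are invented for illustration; a real study would map ties from interview or communication data.

```python
from collections import Counter

# Hypothetical "who talks to whom" pairs gathered during a community exercise.
edges = [
    ("amina", "fatou"), ("amina", "grace"), ("amina", "joy"),
    ("fatou", "grace"), ("joy", "kemi"),
]

# Degree = number of communication ties each person has.
degree = Counter()
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

hub = max(degree, key=degree.get)                    # candidate influencer hub
marginal = [p for p, d in degree.items() if d == 1]  # people at the network edge
```

Dedicated libraries such as NetworkX offer richer centrality measures (betweenness, eigenvector) for identifying gatekeepers, but a plain degree count is often enough to flag who sits at the center and who sits at the margins.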

Lastly, Michael reiterated the importance of mixed methods and needing to combine online information and communications with face-to-face methods and to be very aware of invisible groups. “Before you do an SMS survey, you may need to go out to the community to explain that this survey will be coming,” he said. “This might be necessary to encourage people to even receive the survey, to pay attention or to answer it.” The case studies in the paper “The Role of New ICTs in Equity-Focused Evaluation: Opportunities and Challenges” explore some of these aspects in good detail.


This is a summary of the January 20th Technology Salon. It’s cross-posted from the Technology Salon blog.

Using GPS in Kwale, Kenya

At the Technology Salon on “How to Incorporate ICT into Proposals”, we discussed some of the challenges and solutions for proposal writers when they try to incorporate information and communication technologies into future program design. (Sign up to get info on the next Salon)

Problems with Incorporating ICT into Proposals

Essentially, short time frames for preparing proposals don’t allow for participation and end-user involvement and feedback during development of technology solutions. Donors often want details about a technological solution within a proposal; however, in order to define details, more knowledge of local context, a participatory local communications assessment, end-user testing and more need to be done.

This sometimes causes proposal writers to put unrealistic goals in proposals, either to secure funding or because they lack information about the local context and the actual feasibility of a proposed technology solution; implementers may find later that they are unable to deliver.

The issue is compounded by a real lack of organizational buy-in for testing and iterating, for trial and error, and even for failure, within organizations and projects, all of which are needed to learn what works and what can be scaled up through proposals and donor funding.

Solutions to Including ICT in Proposals

Overall, organizations that want to integrate ICTs in their work need to plan ahead, strengthen their staff capacity on the ground, and have a clear understanding of the steps to follow when integrating ICTs into proposals. And rather than detailing an exact tech solution into a proposal, the proposal writer could offer a few options, say that “a solution could be” or “might look something like this”, or be clear when negotiating with the donor that an idea will be tested but may change along the way when participatory work is conducted with end users, and as it is tested and adapted to the local context.

This can help remind donors that digital technology is only one way to innovate, and technology needs to be seen as one tool in the information and communication toolbox. For example, SMS might be just one communication channel among many options that are laid out in a project or program, and the most appropriate channels (which might also include face-to-face, paper, community bulletin board, phone calls, etc.) need to be chosen based on a local situation analysis and end-user input.

For staff, integrating ICTs as smaller aspects of programs can offer opportunities for small-scale trial and error and learning; e.g., using SMS as one channel of communication in an education or health program and comparing results with a program that didn’t use SMS could allow an organization to test small ICT efforts and slowly learn, modify and integrate those that work. (See How Plan Kwale has been using ICT in their programs since 2003)
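The compare-two-arms idea above can be sketched in a few lines. The outcome data here is invented, and a real comparison would need matched programs and a proper significance test before attributing any difference to SMS.

```python
def completion_rate(outcomes):
    """Share of participants (1 = completed, 0 = did not) who finished."""
    return sum(outcomes) / len(outcomes)

# Hypothetical completion outcomes from two otherwise similar programs.
with_sms    = [1, 1, 0, 1, 1, 0, 1, 1]  # arm that added SMS as a channel
without_sms = [1, 0, 0, 1, 0, 1, 0, 1]  # comparison arm without SMS

lift = completion_rate(with_sms) - completion_rate(without_sms)  # 0.75 - 0.50
```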

When staff experiment with and experience ICTs in one program, they may be more likely to innovate with technology in another program. As ICTs become more commonly used in communities and by local development practitioners, space for innovations grows because innovation can happen right there, closer to the ground rather than being designed in an office in DC and parachuted into communities in other places.

Experimentation with youth programs is a good place to start with tech innovations because youth tend to be more literate (if they are school going youth) and they pick up technology skills easily in many cases. Adults need not be left out, but the learning methodology may need to be different. Engaging the community in detailing potential protection and privacy risks in data collection is key to finding ways to minimize risks. (See 8 Elements for a Positively Brilliant ICT4D Workshop)

The Plan Example

Plan Finland with support from Plan USA commissioned the ICT Enabled Development guide (PDF) to better understand and document the ICT4D context in several of the countries where Plan is working in Africa. Country offices wanted to strengthen their capacities to strategically incorporate ICTs into their work and to ensure that any fund-raising efforts for ICTs were stemming from real needs and interest from the ground. Plan offices were also in the process of updating their long-term strategic plans and wanted to think through how and where they could incorporate ICTs in their work internally and with communities.

The report process included 2-day workshops with staff in 5 countries, based on a set of ICT distance learning materials and discussion questions. The idea was to combine background and situational research, learning about ICT4D, and further input from Country Office colleagues into this process to come up with a realistic guide on how and where Plan could begin integrating ICTs into its work directly, strategically and indirectly (See 3 ways to integrate ICTs into development work).

The report team worked by Skype and email with a point person in each office who planned and carried out the workshop. The report team also developed the multi-media training pack with materials that the point persons used to support the workshop, and compiled the ICT-Enabled Development guide based on the experience.

From the report and this experience, Plan produced a 10-step process for integrating ICTs into development initiatives:

  1. Context Analysis: what is happening with ICT (for development) in the country or region?
  2. Defining the need: what problems can ICT help overcome? what opportunities can it create?
  3. Choosing a strategy: what kind of ICT4D is needed? direct? internal? strategic?
  4. Undertaking a participatory communications assessment: who will benefit from this use of ICT and how?
  5. Choosing the technology: what ICTs/applications are available to meet this need or goal?
  6. Adjusting the content: can people understand and use the information provided for and by the ICTs?
  7. Building and using capacity: what kind of support will people need to use and benefit from the ICT, and to innovate around it?
  8. Monitoring progress: how do you know if the ICT is helping meet the development goal or need?
  9. Keeping it going: how can you manage risks and keep up with changes?
  10. Learning from each other: what has been done before, and what have you learned that others could use?

See the 3-page “ICT-Enabled Development Checklist” for more detail on how to go about integrating ICTs into a development proposal and be sure to download the ICT Enabled Development guide (PDF).


Bamboo Shoots training manual

I like to share good training guides when I come across them, so here is a quick summary and a link to Bamboo Shoots. It was originally created by Plan in Cambodia.

Bamboo Shoots is a training manual on child rights, child centered community development and child-led community actions for facilitators working with children and youth groups. You can download it here.

Bamboo Shoots was developed to: Increase children’s understanding of their rights as defined by the United Nations Convention on the Rights of the Child (UNCRC); raise children’s awareness of their rights and build their capacities to claim them; create opportunities for children to recognize, identify and prioritize issues and problems or gaps in relation to child rights violations; and provide opportunities for children to influence agendas and action regarding identified and prioritized child rights violations.

Bamboo Shoots takes complicated concepts and breaks them down into easy language and engaging, interactive sessions. It also offers good resources and background material for facilitators so that they can manage the sessions well.

Part One:

I like this manual because it starts off right in the first chapter with the importance of setting the tone and the context for good child and youth participation. It provides ideas on selecting participants and facilitators, and gives a description of a good facilitator. It provides recommendations on the setting and practical considerations for managing a workshop with children, as well as a good paragraph to help think through when and when not to include other adults in the training.

The guide goes through the six principles for making child participation a reality:

  1. Non-discrimination and inclusiveness
  2. Democracy and equality of opportunity
  3. Physical, emotional and psychological safety of participants
  4. Adult responsibility
  5. Voluntarism, informed consent and transparency
  6. Participation as an enjoyable and stimulating experience for children

It shares Plan’s code of ethics on child participation and important steps to follow in working with children, as well as tips on how to establish a good working relationship with children, how to help children learn and develop their potential, how to help children build self-confidence and self-esteem, and how to encourage children to develop a responsible attitude towards others and a sense of community. There is also a section on how to keep children safe and an explanation of a facilitator’s ‘duty of care’.

A last section of part one lists common facilitation techniques and tools, such as: role-play, working in pairs and groups, idea storming, whole group discussion, questioning, projects, buzz sessions, drawing, photographs, video, word association, recreating information and more; and gives ideas on when they are most useful.

Part Two:

Section 9 on community mapping

The next section has very complete sessions on:

  • the concept of rights
  • the history of human rights, and international treaties on rights
  • children’s rights as human rights
  • duties and responsibilities in relation to child rights
  • making sure children are involved
  • child rights and daily realities and making a child rights map
  • gaps in fulfilling child rights
  • setting priority problems and violations of child rights
  • creating an action agenda and proposed solutions to the gaps identified

Each session comes complete with a pre-training assessment, reading material for facilitators and handouts for participants.

Part Three:

The last section of the manual helps facilitators take children through the steps to child-led community action, including children’s participation in all the program and project cycles: assessment, planning, implementation, monitoring and evaluation.

Needs-based vs. Rights-based

It also explains Plan’s rights-based child-centered community development approach, the foundations of that approach, and the difference between needs-based approaches and rights-based approaches. It goes on to cover planning and supporting child-led community action.

The last section of the guide offers a list of resources and references.

For anyone working with children, or even anyone looking for an excellent comprehensive community training package on rights and community-led action, I really recommend checking out Bamboo Shoots. Whether you are working through media and ICTs or using more traditional means for engaging children, this is a great guide on how to do it well from start to finish. I’ll be referring to it often.

Additional Resources:

Minimum standards for child participation in national and regional consultation events

Protocols and documents to help ensure good quality child participation

United Nations Convention on the Rights of the Child

Insight Share’s rights-based approach to participatory video toolkit

Related posts on Wait… What?

Child participation at events: getting it right

Community based child protection

Child protection, the media and youth media programs


Plan just released a new report called ICT Enabled Development: Using ICT strategically to support Plan’s work. The report is part of an on-going process by Plan Finland (kudos to Mika Valitalo for leading the process) in collaboration with Plan USA to support Plan’s country offices in Africa to use ICTs strategically and effectively in their development work. It was written by Hannah Beardon and builds on the Mobiles for Development Guide that Plan Finland produced (also written by Hannah) in 2009.

The idea for the report came out of our work with staff and communities, and the sense that we needed to better understand and document the ICT4D context in the different countries where we are working. Country offices wanted to strengthen their capacities to strategically incorporate ICTs into their work and to ensure that any fund-raising efforts for ICTs were stemming from real needs and interest from the ground. Plan offices were also in the process of updating their long-term strategic plans and wanted to think through how and where they could incorporate ICTs in their work internally and with communities.

The process for creating the report included 2-day workshops with staff in 5 countries, using a methodology that Mika, Hannah and I put together. We created a set of ICT training materials and discussion questions and used a ‘distance-learning’ process, working with a point person in each office who planned and carried out the workshop. Mika and I supported via Skype and email.

Hannah researched existing reports and initiatives by participating offices to find evidence and examples of ICT use. She also held phone or Skype conversations with key staff at the country and regional levels about their ICT use, needs and challenges, and pulled together information on the national ICT context for each country.

The first section of the report explains the concept of ‘ICT enabled development’ and why it is important for Plan and other development organizations to take on board. “With so many ICT tools and applications now available, the job of a development organization is no longer to compensate for lack of access but to find innovative and effective ways of putting the tools to development ends. This means not only developing separate projects to install ICTs in under-served communities, but looking at key development challenges and needs with an ICT eye, asking ‘how could ICTs help to overcome this problem?’”

Drawing on the research, conversations, workshop input and feedback from staff, and documented experience using ICTs in Plan’s work, Hannah created a checklist with 10 key areas to think about when planning ICT-enabled development efforts.

  1. Context Analysis: what is happening with ICT (for development) in the country or region?
  2. Defining the need: what problems can ICT help overcome? what opportunities can it create?
  3. Choosing a strategy: what kind of ICT4D is needed? direct? internal? strategic?
  4. Undertaking a participatory communications assessment: who will benefit from this use of ICT and how?
  5. Choosing the technology: what ICTs/applications are available to meet this need or goal?
  6. Adjusting the content: can people understand and use the information provided for and by the ICTs?
  7. Building and using capacity: what kind of support will people need to use and benefit from the ICT, and to innovate around it?
  8. Monitoring progress: how do you know if the ICT is helping meet the development goal or need?
  9. Keeping it going: how can you manage risks and keep up with changes?
  10. Learning from each other: what has been done before, and what have you learned that others could use?

The checklist helps to ensure that ICT use is linked to real development needs and priorities and appropriate for those who are participating in an initiative or a project. The report elaborates on the 10 key areas with detailed observations, learning and examples to illustrate them and to help orient others who are working on similar initiatives. It places the checklist into a 4-stage process for ICT integration.

  1. Understanding the context for ICT work: includes external context and internal experience and capacity
  2. Finding a match between priorities and possibilities: rooting the system in local needs and priorities and finding good uses for tools and applications
  3. Planning and implementing concrete initiatives: carrying out participatory assessments, linking to other development processes and addressing technical issues and concerns
  4. Building a culture of systematic, sustained and strategic use of ICTs: linking ICTs with program work, transforming the role of ‘the ICT guy’, and building expertise on the cultural and social aspects of ICT use

Additional material and case studies, ICT country briefings, and an overview of Plan’s current work with ICT4D in Africa are offered at the end of the report.

The report includes input from Plan staff in Ghana, Mali, Mozambique, Senegal and Uganda who participated in the ICT4D workshops. It also draws heavily on some of the work that Mika has been doing in Finland and Kenya, and work that I’ve been involved in and have written about in Mali, Cameroon, Mozambique, Ghana, Benin and Kenya involving staff, community members and community youth. You can contact Mika to get the workshop methodology in French or English or to comment on the report (ict4d [at] plan [dot] fi).

There’s so much rich material in the report that I almost want to summarize the whole thing here on my blog, section by section, so that people will take the time to read it…  I think this is a really important and useful piece of work and we’re very excited that it’s now available! Download it here.

Related posts on Wait… What?

ICT4D in Uganda: ICT does not equal computers

Demystifying Internet (Ghana)

It’s all part of the ICT jigsaw: Plan Mozambique ICT4D workshops

A positively brilliant ICT4D workshop in Kwale, Kenya

7 or more questions to ask before adding ICTs (Benin)

A catalyst for positive change (Cameroon)

Salim’s ICT advice part 1: consider both process and passion (Kenya)

Salim’s ICT advice part 2: innovate but keep it real (Kenya)

Meeting in the middle

I and C, then T (US)
