UCB i190 Spring 2014 ICTD in Practice_Lect22_21Apr14


i190 Spring 2014: Information and Communications Technology for Development (ICTD) in Practice

University of California, Berkeley, School of Information

LECTURE 22: 21 Apr 2014

Instructor: San Ng (www.sanng.com)

Class Website: i190spring2014.sanng.com

i190 Conceptual Framework

Week 1: Introduction to Course
W2: What is Development?
W3: What is ICTD?
W4: Who Does What in Practice? Mapping the ICTD Landscape

i190 ICTD in Practice: Core Skills

Technical (eApplications)

W5: Overarching Issues of eApplications
W6: Infrastructure, Telecenters, Agriculture
W7: Revisiting Agriculture
W8: e-Health, Education
W9: eGovernance, Microfinance

Management

W10: Break
W11: Intro to Project Management: Planning and Assessment
W12: Initiating: Design, Scheduling, Budgeting, HR
W13: Implementation
W14: Monitoring and Evaluation / Next Cycle
W15: Final Projects & Wrap-Up

Introduction to Project Management

Planning

Initiation

Implementation

Monitoring & Evaluation

Next Phase? Transformation?

Implementation - Best Practices
Case: ITC e-Choupal
What are the needs/problems that this ICTD project is trying to address?

Implementation - Best Practices
Case: ITC e-Choupal
What made implementation successful?
• Trust: choice
• Meets Needs: clear value
• Appropriate Tech: simplicity of technology, new and old tech
• Local structures and systems
• Incremental roll-out
• Mission-based

Implementation- Complex Environments

Case: Competing for Development (A)

•If you were Ghazialam, would you go ahead with the $65,000 investment?

•What are the key tradeoffs? What would an ‘ideal’ outcome look like?

Implementation- Complex Environments

Case: Competing for Development (B1-6)

6 groups

Role play exercise: Make the case within your role for the $65,000 investment

Introduction to Project Management

Planning

Initiation

Implementation

Monitoring & Evaluation

Next Phase? Transformation?

Different Types of Evaluation and Performance Measurement

Program Level | Organization Level | Community and Societal Level

Wikipedia Case

Evaluation Purpose

Measuring program effectiveness

Determining if a program meets its objectives

TYPES

Formative
* baselines
* ongoing
* feedback
* changing the program

Summative
* look at final outcomes
* impacts
* cut or keep

OTHER PURPOSES
* compliance
* legitimacy
* certification
* lessons learned
* check for unintended consequences
* benchmarking
* more money
* white wash and eye wash
* kill a project

* political attack

* new opportunities

* protection and self interest

* melt down indicators

EVALUATION CHECK LIST

WHEN IS EVALUATION WORTH DOING?

* Who Wants This and What Decision Do They Want to Make? (lessons learned)

* Are the Impediments Manageable? (resources, objectives, agreement, special issues)

* Is There Political Support? (general support)

THE REGULAR COMPONENTS OF THE EVALUATION PROCESS
* Purpose and Objectives

* Indicators

* Design

* Data and Utilization

* Problems

EVALUATION ACTOR MAPPING

BOARD OR LEGISLATORS

OTHER STAFF OR DEPTS.

TOP MGMT

PROGRAM PARTICIPANTS

PROGRAM STAFF

PRESS OR COMMUNITY

DONORS

MIDDLE MGMT.

OUTSIDERS FOR TEACHING PURPOSES

RANGE OF INDICATORS

* FEELINGS: do you trust these?
* INPUTS: people / $
* PROCESS: involvement / coordination
* OUTCOMES: (intermediate) increased income; (final) X level of contamination
* EFFICIENCY AND PRODUCTIVITY: lbs. of fish / $; $ / lbs. of fish (see the sketch after this list)
* SPAN AND SCOPE OF COVERAGE: % of target population served
* SATISFACTION: customer satisfaction / commercial
* IMPACTS (program-caused outcome): sustainable, measurable change; broader; longer term
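A minimal sketch of how a couple of these indicator types reduce to simple arithmetic. All figures are hypothetical and only illustrate the efficiency and coverage indicators named above:

```python
# Hypothetical fishery-program figures, for illustration only.
budget_usd = 50_000          # program spend (input: $)
fish_landed_lbs = 120_000    # output attributed to the program
target_population = 8_000    # households the program aims to reach
households_served = 5_200    # households actually reached

# Efficiency / productivity: lbs. of fish per $ and $ per lb. of fish
lbs_per_dollar = fish_landed_lbs / budget_usd
dollars_per_lb = budget_usd / fish_landed_lbs

# Span and scope of coverage: % of target population served
coverage_pct = 100 * households_served / target_population

print(f"{lbs_per_dollar:.2f} lbs/$, {dollars_per_lb:.3f} $/lb, coverage {coverage_pct:.1f}%")
```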

* HOW WOULD YOU ASSURE THAT YOUR RESULTS WERE VALID AND RELIABLE?

RELIABILITY -- DO YOU GET THE SAME RESULT TIME AFTER TIME?

VALIDITY -- UNBIASED COMPARED TO A STANDARD
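A small sketch of the distinction, using made-up survey scores: reliability asks whether repeated measurement gives the same answer; validity asks whether the measurement is unbiased against an accepted standard. The numbers below are invented for illustration:

```python
# Two rounds of the same instrument on the same respondents (reliability),
# plus a reference "gold standard" measurement (validity). Hypothetical data.
round_1 = [12.0, 15.5, 9.0, 20.0, 14.5]
round_2 = [12.5, 15.0, 9.5, 19.5, 14.0]
gold_standard = [14.0, 17.5, 11.0, 22.0, 16.5]

def pearson(x, y):
    """Pearson correlation: high values mean the two rounds order and scale cases the same way."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Reliability: do you get (nearly) the same result time after time?
print("test-retest reliability r =", round(pearson(round_1, round_2), 3))

# Validity: is the instrument unbiased compared to the standard?
bias = sum(a - b for a, b in zip(round_1, gold_standard)) / len(round_1)
print("mean bias vs. standard =", round(bias, 2))  # consistently about -2: reliable but not valid
```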

What is the purpose of a research design?

* tailored to each problem

* to answer very specific questions

* weigh benefits and costs and resources

* can allow cross comparison

* was it the program that made the difference?

Using Evaluation Results-- Style Differences

Academic Style
* Slow

* Scientific Method

* Clear Objectives

* Careful Study

* Written Communication

* Precision

* Academic Reference Group

Managerial Style
* Pressure to Decide

* Many Simultaneous + Fragmented Tasks

* Competing Objectives

* Action

* Verbal Communication

* Incomplete Data

* Managerial Reference Group

PROBLEMS

* TIME
* FORMAT
* VERY RIGOROUS
* NOT RIGOROUS
* IRRELEVANT
* COMMUNICATION

DESIGN TYPES

PRE-EXPERIMENTAL (PE)

* Goals versus Performance

* Before and After:  O1  X  O2

QUASI-EXPERIMENTAL (QE)

* Time Series:  O1  O2  X  O3  O4

* Non-Equivalent Control Group:
  O1  X  O2
  O3      O4

* Multiple Time Series:
  O1  O2  X  O3  O4
  O5  O6      O7  O8

EXPERIMENTAL

* Two-Group Pre- and Post-Test:
  R  O1  X  O2
  R  O3      O4

* Post-Test Only:
  R  X  O1
  R      O2

* Solomon Four Group:
  R  O1  X  O2
  R  O3      O4
  R      X  O5
  R          O6

(O = observation, X = program/treatment, R = random assignment)
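A minimal sketch, with invented outcome numbers, contrasting how the effect estimate changes between a pre-experimental before-and-after design and a quasi-experimental non-equivalent control group (a difference-in-differences comparison):

```python
# Hypothetical average household income (O = observation); illustration only.

# Before-and-after (O1 X O2): attribute the whole change to the program X.
treated_before, treated_after = 100.0, 130.0   # O1, O2 for the program group
naive_effect = treated_after - treated_before
print("before-and-after estimate:", naive_effect)          # 30.0

# Non-equivalent control group:
#   O1 X O2   (program group)
#   O3   O4   (comparison group, no program)
control_before, control_after = 100.0, 115.0   # O3, O4
secular_trend = control_after - control_before
did_effect = naive_effect - secular_trend
print("difference-in-differences estimate:", did_effect)   # 15.0
# The comparison group absorbs the change that would have happened anyway,
# which is the "was it the program that made the difference?" question.
```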

Some of the More Common Methods

* Balanced Score Cards and Other Overall General Assessments

* Goals vs. Performance and also Cost and Efficiency

* Outcome Assessment

* Benchmarking

* Best Practice

* Rapid Assessment Tools (quicker and dirtier rather than deeper)

* More based on sampling than 100% study

DATA COLLECTION METHODS
* General Statistical Analysis

* Cost Benefit / Rates of Return (see the sketch after this list)

* Simulations

* Content Analysis

* Record Reviews

* Unobtrusive Measures

* Group Observation

* Surveys and Testing

* Personal Interviews

* Participation Observation

* Case Studies

LESS INTRUSIVE to MORE INTRUSIVE
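As referenced in the Cost Benefit / Rates of Return item above, a minimal sketch of the arithmetic behind a net present value and a benefit-cost ratio. The up-front cost echoes the $65,000 investment in the Competing for Development case, but the annual benefits and discount rate are assumptions chosen only for illustration:

```python
# Illustrative only: a project costing 65,000 up front with assumed annual benefits.
discount_rate = 0.10
upfront_cost = 65_000.0
annual_benefits = [20_000.0, 25_000.0, 30_000.0, 30_000.0]  # years 1-4, hypothetical

# Discount each year's benefit back to the present.
pv_benefits = sum(b / (1 + discount_rate) ** (t + 1) for t, b in enumerate(annual_benefits))

npv = pv_benefits - upfront_cost          # net present value
bcr = pv_benefits / upfront_cost          # benefit-cost ratio
print(f"PV of benefits: {pv_benefits:,.0f}  NPV: {npv:,.0f}  benefit-cost ratio: {bcr:.2f}")
```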

THE ETERNAL TRIANGLE

Precision

Cost

Complexity

Sample Logical Framework template:

                         | Measurable Indicators | Types of Data Needed | Data Collection Methods/Frequency for M&E
Overriding Goal          |                       |                      |
Objectives (at least 4)  |                       |                      |

Instructions: Everyone was pleasantly shocked by these successful results. However, the founder Jimbo Wales intuitively knows that the number of articles per se does not measure Wikipedia's success completely, especially since Wikipedia began with a completely different set of goals/activities and became 'successful' only organically. He wants to hire you to determine a sound methodology for evaluating Wikipedia's success. He wants you to design a Logical Framework for Wikipedia (based on what we have already learned in class), with indicators he can measure to determine Wikipedia's progress and success. He has given us a sample template (shown above) that we will discuss and brainstorm in class.
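One way to see the template's structure is as one record per objective under the overriding goal. A minimal sketch follows; the goal wording, objective, and indicator entries are placeholders standing in for whatever the class brainstorm produces, not a proposed answer:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LogframeRow:
    objective: str                    # one of the (at least 4) objectives under the goal
    measurable_indicators: List[str]  # what will be measured
    data_needed: List[str]            # types of data required for each indicator
    collection_methods: List[str]     # data collection method and frequency for M&E

# Placeholder content only; the real entries come out of the in-class exercise.
overriding_goal = "Free access to the sum of human knowledge"  # placeholder wording
rows = [
    LogframeRow(
        objective="Content quality, not just article count",
        measurable_indicators=["share of sampled articles meeting a quality rubric"],
        data_needed=["quality ratings for a random sample of articles"],
        collection_methods=["community/expert review of the sample, quarterly"],
    ),
]

for row in rows:
    print(overriding_goal, "->", row.objective, "->", row.measurable_indicators)
```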