Disclaimer:
Please be aware that the content herein has not been peer reviewed. It consists of personal reflections, insights, and learnings of the contributor(s). It may not be exhaustive, nor does it aim to be authoritative knowledge.
Overview
Prepared by (Name of the experimenter)
Javier Brolo
On date (Day/Month/Year)
November 5th, 2020
Current status of experimental activity
Completed
What portfolio does this activity correspond to? If any
N/A
What frontier challenge does this activity respond to?
[Implemented before frontier challenge was defined]
What learning question (from your action learning plan) is this activity related to?
How ideas about intervention choices affect collaboration
Please categorize the type that best identifies this experimental activity:
Fully Randomised (RCTs, etc.)
Which sector are you partnering with for this activity? Please select all that apply
United Nations agency
Please list the names of partners mentioned in the previous question:
Country office programmatic areas
Design
What is the specific learning intent of the activity?
We wanted to learn whether exposure to ideas about why a policy is proposed and how development interventions are chosen affects the way people understand how to collaborate.
What is your hypothesis? IF... THEN....
If people are exposed to ideas about how policies are chosen, then people will be more open to collaboration.
Does the activity use a control group for comparison?
Yes, a different group entirely
How is the intervention assigned to different groups in your experiment?
Random assignment
Describe which actions will you take to test your hypothesis:
We created two versions of a survey and randomly assigned which link each person would be invited to use. The treatment group received information about how policy choices depend on three aspects: what we try to maximize, how we understand the world to work, and what resources we have available. It also explained that interventions can address different levels of agency: lack of knowledge, lack of resources to use existing knowledge, lack of collaboration from others, and lack of incentives. The control group received the same message, but only after recording their responses to questions on how to collaborate across the country office.
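The random-assignment step described above can be sketched as follows. This is a minimal illustration, not the tool actually used: the e-mail addresses and survey URLs are hypothetical placeholders.

```python
import random

def assign_links(emails, control_url, treatment_url, seed=42):
    """Randomly split a list of e-mail addresses into control and
    treatment arms, pairing each address with its survey link."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    shuffled = emails[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    control = {e: control_url for e in shuffled[:half]}
    treatment = {e: treatment_url for e in shuffled[half:]}
    return control, treatment

# Hypothetical office list and survey links
emails = ["a@undp.org", "b@undp.org", "c@undp.org", "d@undp.org"]
control, treatment = assign_links(
    emails,
    "https://forms.example/control",
    "https://forms.example/treatment",
)
```

Fixing the random seed keeps the assignment auditable; in practice the two dictionaries would drive a mail merge inviting each person to their assigned link.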
What is the unit of analysis of this experimental activity?
Individual responses to the survey
Please describe the data collection technique proposed
Online form with different link for control and treatment group
What is the timeline of the experimental activity? (Months/Days)
One week
What is the estimated sample size?
10-49
What is the total estimated monetary resources needed for this experiment?
Less than 1,000 USD
Quality Check
This activity is relevant to a CPD outcome; the hypothesis is clearly stated; this activity offers strong collaboration opportunities; this activity offers a high potential for scaling; this activity has a low risk.
Please upload any supporting images or visuals for this experiment.
Please upload any supporting links
What are the estimated non-monetary resources required for this experiment? (time allocation from team, external resources, etc.) If any.
We need a list of people in the country office and their e-mail addresses to randomly assign a link. Other than that, it can be implemented using existing office resources such as a computer, internet access, and online forms.
Results
Was the original hypothesis (If.. then) proven or disproven?
Partially proven
Do you have observations about the methodology chosen for the experiment? What would you change?
Certainly. The sample size is too small to evaluate whether there is a statistically significant effect. Also, the measurements of the dependent variable are qualitative, and additional validation would be needed to assess the validity of the classification used. The administration of the survey was rudimentary. People could choose not to answer, so those included in the study may have systematically different attributes from people who did not respond. We need to think about how best to ask for consent in small exercises such as this. Given the sample size, removing individual answers has a large effect on the results.
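To make the sample-size concern concrete, a standard normal-approximation power calculation shows how many respondents per arm would be needed to detect a difference between two proportions. This is a sketch: the 40% vs. 70% effect sizes below are illustrative assumptions, not measured values from this experiment.

```python
import math
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.8):
    """Required sample size per arm for a two-sided two-proportion
    z-test (normal approximation)."""
    z = NormalDist()
    za = z.inv_cdf(1 - alpha / 2)   # critical value, two-sided test
    zb = z.inv_cdf(power)           # quantile for the desired power
    var = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((za + zb) ** 2 * var / (p1 - p2) ** 2)

# Hypothetical effect: 40% vs. 70% of respondents favouring collaboration
print(n_per_group(0.4, 0.7))  # → 40 per group, i.e. 80 respondents total
```

Even a large assumed effect like this one calls for more respondents than the 10-49 collected here, which is why the hypothesis could only be partially assessed.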
From design to results, how long did this activity take? (Time in months)
One week
What were the actual monetary resources invested in this activity? (Amount in USD)
US$0.00
Does this activity have a follow up or a next stage? Please explain
We presented the results back to the country office with the intention of making them aware of the principles of experimental activities and their use. The results of the survey also pointed out possible paths to collaboration between the Lab and the country office.
Is this experiment planned to scale? How? With whom?
We plan to scale the use of experimental surveys within projects in the country office.
Please include any supporting images that could be used to showcase this activity
Please add any supporting links that describe the planning, implementation, results of learning of this activity? For example a tweet, a blog, or a report.
Considering the outcomes of this experimental activity, which of the following best describe what happened after? (Please select all that apply)
This experiment did not scale yet
Learning
What do you know now about the action plan learning question that you did not know before? What were your main learnings during this experiment?
We don't have enough data to determine whether the ideas people were exposed to affected their answers. However, some patterns were identified. People without exposure to the evaluated framework focused their interest on the Lab as a facilitator of information about new ideas and a manager of knowledge, while those who were exposed saw more opportunity in the Lab linking them to new actors and increasing the impact of projects. People exposed to the framework also tended to see the value of the Lab in more abstract terms, as a provider of learnings about strategy and methodologies rather than just a documenter of success stories.
What were the main obstacles and challenges you encountered during this activity?
Data collection is not trivial. People are saturated with work and e-mails and do not have enough time to fill out surveys. We could try to increase responses by reducing the effort required to answer. Also, the survey used open-ended questions, as we did not have clear ideas about the language respondents would use to express valuable learnings.
Who at UNDP might benefit from the results of this experimental activity? Why?
Colleagues at UNDP can learn the basic design and use of experimentation in a simple way. The results also gave clear indications of what the value of the Lab is.
Who outside UNDP might benefit from the results of this experiment? and why?
Counterparts from projects at the country office can benefit from the availability of experimental designs.
Did this experiment require iterations? If so, how many and what did you change/adjust along the way? and why?
No.
What advice would you give someone wanting to replicate this experimental activity?
Be patient with responses, or try to distribute individual links to each respondent. Complement open-ended questions with single- or multiple-choice ones.
Can this experiment be replicated in another thematic area or other SDGs? If yes, what would need to be considered, if no, why not?
Absolutely; these ideas can be relevant for defining policy options in any SDG.
How much did the "sense" and "explore" phases of the learning cycle influence/shape this experiment? In hindsight, what would you have done differently with your fellow Solutions Mapper and Explorer?
This was an isolated experiment, done at the beginning of the Lab. It was done to showcase the experimental method, in parallel to activities to showcase co-creation and exploration.
What surprised you?
It surprised me that frameworks about policy choices do not seem to be easily recognizable in the design of projects. For example, the progression of individual agency used to determine how appropriate an intervention is seemed off to respondents. This may be a good thing, though, because it means there is an opportunity to add value by introducing these frameworks.