Disclaimer:
Please be aware that the content herein has not been peer reviewed. It consists of personal reflections, insights, and learnings of the contributor(s). It may not be exhaustive, nor does it aim to be authoritative knowledge.
Overview
Prepared by (Name of the experimenter)
Javier Brolo and Juan Pablo Rustrián
On date (Day/Month/Year)
September 2022
Current status of experimental activity
Completed
What portfolio does this activity correspond to? If any
Co-creation of waste management solutions.
What frontier challenge does this activity respond to?
How to improve the collaboration between society and public institutions to increase resilience to climate change.
What learning question (from your action learning plan) is this activity related to?
How to improve waste management.
Please categorize the type that best identifies this experimental activity:
Pre-experimental (trial and error, prototype, A/B testing), Fully Randomised (RCTs, etc.)
Which sector are you partnering with for this activity? Please select all that apply
United Nations agency, Public Sector, Private Sector, Civil Society/ NGOs, Academia
Please list the names of partners mentioned in the previous question:
UNDP office, local governments, municipalities, ministries and NGOs related to environmental care and waste management, universities, or any other sector that wants to collect online data.
Design
What is the specific learning intent of the activity?
We wanted to learn if an open-source digital tool can increase our capacity to collect data from experimental designs.
What is your hypothesis? IF... THEN....
If we use formr, then we’ll be able to easily run online survey experiments on a broad range of topics.
Does the activity use a control group for comparison?
Yes, a different group entirely
How is the intervention assigned to different groups in your experiment?
Random assignment
Describe which actions you will take to test your hypothesis:
First, to test the tool, we read the formr documentation to learn what would be required to run online experiments. We needed to understand the different types of survey items, how to customize them, how to assign respondents to random groups, and how to distribute the survey to respondents. Then we created a minimum viable experimental survey design and distributed it among peers to get feedback on its functionality. The minimum viable survey included two versions of a single question that compared the preferred material for packaging beverages after respondents received one of two prompts: (1) a message pointing out that glass and aluminum are easier to recycle; or (2) a message pointing out that glass and aluminum are more expensive to buy.
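To make the random assignment concrete, below is a minimal Python sketch of the logic as a standalone simulation; it is not formr's actual configuration syntax, and the group labels and prompt texts are illustrative assumptions taken from the description above.

```python
import random

# Hypothetical illustration of the randomization logic, written as a
# standalone simulation rather than formr's own configuration syntax.
# The group labels and prompt texts below are assumptions for this sketch.
PROMPTS = {
    "recyclability": "Glass and aluminum are easier to recycle.",
    "cost": "Glass and aluminum are more expensive to buy.",
}


def assign_prompt(rng: random.Random) -> str:
    """Assign a respondent to one of the two prompt groups at random."""
    return rng.choice(list(PROMPTS))


if __name__ == "__main__":
    rng = random.Random(42)  # fixed seed so this toy run is reproducible
    # Simulate 40 respondents, roughly the 10-49 sample size estimated below.
    assignments = [assign_prompt(rng) for _ in range(40)]
    for group in PROMPTS:
        print(group, assignments.count(group))
```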
What is the unit of analysis of this experimental activity?
The respondents.
Please describe the data collection technique proposed
Online surveys with two randomly assigned prompts.
What is the timeline of the experimental activity? (Months/Days)
3 Days
What is the estimated sample size?
10-49
What is the total estimated monetary resources needed for this experiment?
Less than 1,000 USD
Quality Check
This activity is relevant to a CPD outcome, The hypothesis is clearly stated, This activity offers strong collaboration opportunities, This activity offers a high potential for scaling, This activity has a low risk
Please upload any supporting images or visuals for this experiment.
Results
Was the original hypothesis (If.. then) proven or disproven?
Proven. The test successfully showed that it’s possible to run online survey experiments at no additional cost using formr.
Do you have observations about the methodology chosen for the experiment? What would you change?
We would like to design and share additional online surveys to keep testing the formr tool, collect other types of data, and reach a larger number of respondents.
From design to results, how long did this activity take? (Time in months)
About 3 days.
What were the actual monetary resources invested in this activity? (Amount in USD)
US$0.00.
Does this activity have a follow up or a next stage? Please explain
The idea of using formr is to learn whether it makes collecting online survey data easier. The follow-up is therefore to improve our skills with the tool; it would be useful to try a more complex survey.
Is this experiment planned to scale? How? With whom?
No.
Please include any supporting images that could be used to showcase this activity
Considering the outcomes of this experimental activity, which of the following best describe what happened after? (Please select all that apply)
Solutions tested in this experiment were scaled in numbers, This experiment led to partnerships, This experiment led to adoption of new ways of working by our partners
Learning
What do you know now about the action plan learning question that you did not know before? What were your main learnings during this experiment?
It was important to research whether an open-source digital tool can increase our capacity to collect data from experimental designs, because the lack of easily accessible data collection tools has been a barrier to promoting the adoption of experimentation. Commercial data collection tools such as Qualtrics or SurveyToGo have the required functionality but are not easy to procure for pilot projects. While not necessarily a point-and-click tool, formr offered the possibility to run and easily customize survey experiments online, and we are now able to use this practical tool to adopt experimental designs that test the assumptions and impact of development projects.
What were the main obstacles and challenges you encountered during this activity?
The main obstacle was acquiring the skills needed to understand how the tool works.
Who at UNDP might benefit from the results of this experimental activity? Why?
All UNDP projects could benefit from incorporating open-source software in their repertoire of methods to collect and analyze survey data.
Who outside UNDP might benefit from the results of this experiment? and why?
Institutions or individuals who need tools to collect data easily and efficiently.
Did this experiment require iterations? If so, how many and what did you change/adjust along the way? and why?
We needed iterations to create the final survey and then share it with peers.
What advice would you give someone wanting to replicate this experimental activity?
We would recommend having the basic skills to create a survey and to analyze the data collected.
Can this experiment be replicated in another thematic area or other SDGs? If yes, what would need to be considered, if no, why not?
Absolutely. We could use the open-source tool to run surveys on any topic; the idea relies on the use of formr.
How much the "sense" and "explore" phases of the learning cycle influenced/shaped this experiment? In hindsight, what would you have done differently with your fellow Solution Mapper and Explorer?
The sense and explore phases defined the topics of interest. The results will be of special interest for both solutions mapping and exploration activities that gather data from surveys.
What surprised you?
One of the things that surprised us was the range of feedback we received from our peers. We shared the survey with 35 people, and it was interesting to read their feedback about the question wording, the survey’s purpose, and the privacy information. It’s worth pointing out that the purpose of the experiment wasn’t to test the effect of the prompts on the preferred material for packaging beverages, although it would have been nice to find a result. Respondents were asked to provide feedback on the functionality of the data collection tool, so the truthfulness of their answers wasn’t important. The valuable finding is that it is possible to collect the data needed to run these tests.
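While this pilot did not analyze the prompt effect, the sketch below illustrates how answers from the two prompt groups could be compared once a larger survey collects real data. The counts are made-up placeholders, and the two-proportion z-test is just one possible analysis choice.

```python
import math

# Hypothetical illustration of comparing the two prompt groups; the counts
# below are made-up placeholders, not data from this experiment.
recycle_prompt = {"glass_or_aluminum": 14, "other": 6}   # assumed counts
cost_prompt = {"glass_or_aluminum": 9, "other": 11}      # assumed counts


def two_proportion_z(success_a: int, n_a: int, success_b: int, n_b: int) -> float:
    """Return the z statistic for the difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se


n_a = sum(recycle_prompt.values())
n_b = sum(cost_prompt.values())
z = two_proportion_z(recycle_prompt["glass_or_aluminum"], n_a,
                     cost_prompt["glass_or_aluminum"], n_b)
print(f"share (recyclability prompt): {recycle_prompt['glass_or_aluminum'] / n_a:.2f}")
print(f"share (cost prompt):          {cost_prompt['glass_or_aluminum'] / n_b:.2f}")
print(f"two-proportion z statistic:   {z:.2f}")
```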