As some of you may know, the MetaMouse project has been ongoing at Berkeley for a number of years now. The short version is that it automatically converts any single-user Windows XP program into a multi-user, cooperative application. MetaMouse is primarily targeted at educational games in areas with limited computer hardware. We successfully conducted a short-term research experiment in the Summer of 2008, showing meaningful benefits in terms of collaboration, discussion, and learning using Azim Premji Foundation (APF) games in Bangalore. These results were published in ICTD 2010 and ACM DEV 2010.
Following this earlier study, we attempted a longer-term evaluation last Summer, again with the APF in Bangalore. Our hope was to evaluate long-term learning benefits in Mathematics, using existing APF content and lesson plans. We selected four trial schools: two control and two test. The evaluation was designed to run for six months, hopefully long enough to demonstrate learning. Unfortunately, that didn't work out.
The study ultimately failed: we were unable to demonstrate any results, positive or negative. Some of the problems were the common ones you see with computer-based education in the developing world: powering the machines, hardware failures, and teachers not using the equipment (in either the test or the control schools!). However, I wanted to note a problem I hadn't anticipated, one that ultimately sank the deployment: shifting NGO priorities.
In our first deployment, APF was very excited about the technology; it was essentially designed to support their Flash educational games, and it did a good job of that. In the subsequent year, APF started a number of new initiatives, moving away from primarily providing computers and software toward improving teacher education and training. Personally, we think this is fantastic. Educated, motivated teachers are always going to improve learning, and there's little argument about that. As technologists, however, this meant that APF had less time to devote to our research and our attempts to improve computer-based education. Less effort went into the maintenance our project required: batteries were failing, and teachers were deviating from lesson plans. The study failed. This was essentially our fault; we knew their focus had changed, but we expected the same field experience we had the year before. It's worth noting that things change rapidly, and you have to keep your ear to the ground. It's not just about finding partners whose incentives line up with yours, but partners whose incentives continue to.
On a somewhat related note, we're looking for partners for a long-term deployment evaluation of MetaMouse! Contact me for details.