Ontario’s standardized testing results came out last week. For parents, the EQAO results are a chance to compare their kids’ schools against others. The results are also a story my newspaper covers closely every year.
For some reason, however, the people at the Education Quality and Accountability Office wouldn’t give me a copy of the school-by-school results so that my newspaper could run a list of how the local schools did. Apparently, they don’t like those kinds of comparisons.
Instead, the EQAO pointed me to a search function that brings up results for individual schools.
No problem, I thought. I’ll just write a little Python script to “webscrape” the results — automatically download the result for every school in our board into our own database. Then we can assemble our own list to compare local schools.
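For readers curious what that script would have looked like, here is a minimal sketch of the approach. The URL pattern, school IDs and page layout are all hypothetical, invented for illustration; as the next paragraph explains, the real pages never exposed the numbers as text.

```python
# A sketch of the scraper I had in mind. BASE_URL and the school IDs
# are hypothetical -- the real EQAO site encodes results in graphics,
# which is exactly what defeated this plan.
import re

BASE_URL = "https://example.org/eqao/report?school={sid}"  # hypothetical

def school_url(sid):
    """Build the results URL for one school ID."""
    return BASE_URL.format(sid=sid)

def extract_scores(html):
    """Pull percentage scores out of a page, if they appeared as text."""
    return [int(m) for m in re.findall(r"(\d{1,3})%", html)]

# With a machine-readable page, the loop would be simple: fetch each
# school's page, extract its scores, and load them into our database.
sample = "<td>Grade 3 Reading: 61%</td><td>Grade 6 Reading: 72%</td>"
print(school_url("123456"))    # hypothetical school ID
print(extract_scores(sample))  # -> [61, 72]
```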
But the EQAO people had obviously anticipated this possibility. Neither the PDFs with results for each school nor the HTML pages contain machine-readable results. The results were encoded into graphics with little bar charts that looked like this…
The challenge was how to get the numbers in these charts into a database. Several members of a computer-assisted reporting listserv I belong to suggested an optical-character recognition approach. But OCR is buggy and prone to errors I couldn’t afford with this data.
Then Srinivas Rao, an online engagement intern at the non-profit investigative news organization ProPublica, emailed me with an offer. He’d help run our task through an Amazon service called Mechanical Turk that bills itself as “Artificial Artificial Intelligence.”
(The name derives from an 18th Century chess-playing machine that moved the pieces with a mechanical arm and reportedly beat good players. The Turk, alas, was a hoax. A real chess master hid inside the wooden cabinet and operated the arm.)
The EQAO provides six of these charts for each elementary school, one each for reading, writing and mathematics in Grade 3 and Grade 6. That gave us a list of more than 1,400 graphics, each with a percentage score for each of the past five years.
It would have taken our copydesk weeks to keypunch this and would likely have introduced too many errors. But Srinivas posted the list of URLs for the graphics on Mechanical Turk and offered to pay users 1 cent for each chart they keypunched.
He set up the job so that four different users would keypunch each graph, to ensure the results were accurate.
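To show why four transcriptions per graph catch errors, here is a sketch of how the redundant answers can be reconciled. This is my own illustration of the majority-vote idea, not ProPublica’s actual pipeline; the function names and thresholds are assumptions.

```python
# A sketch of reconciling four independent transcriptions of one chart:
# accept a value only when a majority of workers agree, otherwise
# flag the chart for a human to re-check. (Illustrative only.)
from collections import Counter

def reconcile(values):
    """Return (value, status): the majority answer, or a review flag."""
    counts = Counter(values)
    value, n = counts.most_common(1)[0]
    if n > len(values) / 2:
        return value, "accepted"
    return None, "needs review"

print(reconcile([61, 61, 61, 16]))  # -> (61, 'accepted')
print(reconcile([61, 16, 72, 58]))  # -> (None, 'needs review')
```

With four workers per graph, a single mistyped digit gets outvoted; only genuine disagreement costs staff time.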
Mechanical Turk calls these jobs Human Intelligence Tasks, or HITs. Anyone can log on, start performing HITs and get paid for it. The low pay — a penny apiece — means that many of the users come from the developing world. About half of those who transcribed our EQAO data were based in India.
By the next morning, we had our entire set of 1,400 graphics transcribed into machine-readable data and dumped into an Excel spreadsheet. Total cost: about $70.
ProPublica is using Mechanical Turk for other web-scraping projects and will be posting about it on their nerd blog shortly.
In short, Mechanical Turk is an exciting tool for journalists who are stymied by uncooperative government agencies or hard-to-scrape websites.