Hotspots of Hokkaido
It is assumed that the value of crowdsensing rises as the number of data sources rises. However, the value of the aggregated data depends on the accuracy of the data being aggregated. This is vividly demonstrated by a website produced by Southampton University in the UK, which aggregates all the available sources of data on radiation contamination in Japan to produce visualisations of the data. I was made aware of it by a friend involved in Safecast, who had read an article in a Japanese English-language magazine and wanted to know whether I was aware of the project: I wasn’t, but I am now.
The Japan Nuclear Crowd Map (JNCM) was announced in a press release from Southampton University last month (on 16th May 2013). The press release from the Department of Electronics and Computer Science says:
“People living in Japan after the Fukushima nuclear disaster can find out the radiation level in their area thanks a new tool designed by a team of researchers from the ECS research group Agents, Interaction and Complexity Group (AIC). The Japan Nuclear Crowd Map (JNCM) intelligently combines crowdsourced nuclear radioactivity data that has been collated since the 2011 emergency when a magnitude nine Tsunami hit the North-East coast of Japan severely damaging the nuclear power plant of Fukushima-Daiichi.”
The project does combine crowdsourced nuclear radioactivity data, but it is debatable whether it does it “intelligently”.
The project website, helpfully in English and Japanese, advertises a “Live Radioactivity Map”. [UPDATE: this has now been taken down for “maintenance”]
The map is pretty, and its aesthetic attractiveness is accentuated by the balancing of the deep red splodge of contamination near Fukushima with an equally incarnadine dab of colour just north of Sapporo on the northern island of Hokkaido, over 500 miles from Fukushima. This Hokkaido blob looks nice, but its existence is news to the people of Japan, and unwelcome news to those who live under it. So how have a group of researchers at Southampton University managed to identify a radiation hotspot that has previously escaped the attention of the Japanese government and local monitoring groups like Safecast?
If we zoom in on the Southampton map we can see the location of the sensors giving them this hot spot reading.
We see that there was a “last reading” of 3.39 µSv/hour at a sensor feeding its data into Xively.com (previously Cosm, and even more previously, Pachube), a platform allowing anyone to share their data. 3.39 µSv/hour is the equivalent of an annual exposure of nearly 30 millisieverts, while the UK “safe” limit for workers in industries exposing employees to radiation is only 20 millisieverts per year. A couple of minutes on the Xively website found the data feed that was giving this reading. [Apologies to XML sticklers for switching pointed brackets to curved brackets, but life is too short to re-remember how to CDATA a block of XML into HTML.] So here is the feed. [Spoiler alert] For those whose idea of fun is reading XML sequentially, look out for the dates.
(environment updated=”2011-07-16T04:47:37.382161Z” created=”2011-07-04T23:56:48.078040Z” id=”29324″ creator=”https://xively.com/users/uofaeng”)
(title)Radiation @ Futomi(/title)
(description)Using a Blackcatsystems’ GM-10 with a LND-712, GM tube beside a window of my living room
Not feeded to pachube yet…
The data of GM-10 is obtained as jpeg and text file in my PC. So, what I have to know is to feed the text file to this feed ID. I’m struggling with cURL.(/description)
(location domain=”physical” exposure=”indoor” disposition=”fixed”)
(name)2094-124 Futomi, Tobetsu-town, Ishikari-county, Hokkaido-prefecture(/name)
(unit type=”” symbol=”cpm”)counts/minute(/unit)
(unit type=”” symbol=”µSv/h”)microsieverts/hour(/unit)
So, the basis for the area of high radiation contamination appearing across a large swathe of Hokkaido on the Southampton map is a single reading, submitted on 16th July 2011 by someone who was still trying to get the connection between their sensor and Cosm/Xively working. Allowing this single data point to contaminate the Southampton visualisation is wrong on two levels. First, if you are going to produce a “live” data map, you must take into account when each reading in a feed was actually taken. Second, there is always the possibility that readings are anomalous (a sensor may sit close to a medical radiation source, for example), so the data need to be reviewed before they are published.
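The first of those two checks is trivial to implement. As a minimal sketch (the record layout, the 24-hour cut-off, and the Futomi example values are illustrative assumptions, not Xively’s actual API):

```python
from datetime import datetime, timedelta, timezone

# Illustrative readings: (dose rate in µSv/h, ISO-8601 timestamp).
# The second entry mimics the stale Futomi "last reading" discussed above.
READINGS = [
    (0.12, "2013-06-10T09:00:00Z"),
    (3.39, "2011-07-16T04:47:37Z"),
]

MAX_AGE = timedelta(hours=24)  # a "live" map should ignore anything older


def fresh_readings(readings, now, max_age=MAX_AGE):
    """Keep only readings recent enough to belong on a live map."""
    kept = []
    for value, stamp in readings:
        taken = datetime.fromisoformat(stamp.replace("Z", "+00:00"))
        if now - taken <= max_age:
            kept.append((value, stamp))
    return kept


now = datetime(2013, 6, 11, tzinfo=timezone.utc)
print(fresh_readings(READINGS, now))  # the two-year-old 3.39 µSv/h reading is dropped
```

A staleness filter like this would not have caught a genuinely anomalous but recent reading, which is why the second check, human or statistical review of outliers before publication, is still needed.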
This looks suspiciously like the work of a group of academics under pressure to demonstrate “impact” for the forthcoming REF, the exercise that rates the research of academic departments across the UK. For the first time, in the 2014 exercise, 20% of the rating will be based on case studies of “impact” beyond academia. One unintended consequence of this is that academics are placed under pressure to get themselves in the news and to do things that seem to be of value to the wider community. I am left with a suspicion that the Southampton website was seen as a way of claiming that their research on data analysis achieved “impact”, without too much time being spent thinking about whether the data were correct or whether they were spreading alarm in Japan. It is only a guess, but I cannot believe they would be quite so laid-back about the impacts of the research if a prestigious Japanese university had set up a website claiming that the whole of Hampshire was near uninhabitable, based on a single reading made by someone two years ago.