Tuesday, March 22, 2011

Location Enabled Sensors for iOS

Yesterday evening I gave a talk on "Location Enabled Sensors" to the Bristol branch of the BCS; the slides from the talk are embedded below.

Updated Visualisations for Japan

Download: Fukushima-Data.zip (1.1MB), updated at 2011-03-22T11:44Z

With thanks to Gemma Hobson and her ongoing efforts at data entry, we've now extended the visualisation to cover a full four days, from 17:00 on the 16th of March until 16:00 on the 20th of March.

Environmental Radioactivity Measurement, 17:00 16th March - 17:00 20th March -- For data from the Fukushima site itself see the "Readings at Monitoring Post out of 20 Km Zone of Fukushima" data sets online. Data collection from Miyagi (Sendai) ceases at 17:00 on the 17th of March and does not resume. Several other shorter data dropouts also occur during the monitored period.

It can be seen that radiation levels near Fukushima are somewhat higher than in the original visualisations: whereas the previous data peaked at around 0.15 µSv/h, measurements are now peaking at 0.25 µSv/h. The typical minimum and maximum values across Japan are plotted below on the same scale for comparison.

Additionally, the map embedded below again shows the environmental radioactivity measurements with respect to the typical maximum values for that locale.

Environmental Radioactivity Measurement,
Ratio with respect to typical Maximum Values

This shows that the enhancement around Fukushima is now spiking at around four times the typical maximum; the previous visualisations showed enhancements of only around twice the typical maximum.

Finally, earlier today Sarah Novotny pointed me in the direction of an excellent visualisation by Paul Nicholls showing the series of magnitude 4 to 6 aftershocks that are still rocking Japan: more than 670 to date since the original magnitude 9 earthquake.

Japan Quake Maps by Paul Nicholls

Update: I've just been pointed to this interesting visualisation of the original magnitude 9 earthquake in Japan. It was created using GPS readings from the Japanese GEONET network, which should not be confused with the similarly named GeoNet network in New Zealand.

The video shows the horizontal (left) and vertical (right) displacements recorded when the earthquake struck. The resulting ground movement ripple propagating through the country is very evident.

Saturday, March 19, 2011

Radioactivity Measurement in Japan

This article was updated to reflect the latest data at 16:29 20/Mar/2011.
With thanks to Pete Warden and Gemma Hobson.

Download: Fukushima-Data.zip (1.1MB)

Over the weekend I came across some data on levels of radiation in Japan collected by the Japanese government, and helpfully translated into English by volunteers.

Unfortunately the data was also somewhat unhelpfully stuck in PDF format. However, between us Gemma Hobson, Pete Warden and I transcribed, mostly by hand, some of the more helpfully formatted files into CSV format (16KB), making them suitable for Pete's OpenHeatMap service. The map embedded below shows our first results.
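For anyone wanting to repeat the exercise, the transcription step boils down to turning hand-copied readings into a simple CSV file with one location, time and value per row. The sketch below shows the general idea; the column names and the readings themselves are placeholders, not the actual survey data or the exact format OpenHeatMap expects.

```python
import csv

# Hand-transcribed readings: (location, timestamp, dose rate in µSv/h).
# These values are illustrative placeholders only.
readings = [
    ("Fukushima", "2011-03-16T17:00Z", 0.15),
    ("Miyagi",    "2011-03-16T17:00Z", 0.05),
    ("Tokyo",     "2011-03-16T17:00Z", 0.05),
]

with open("radiation.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["location", "time", "usv_per_hour"])
    writer.writerows(readings)
```

Once the data is in this shape it can be uploaded to a mapping service, re-plotted, or diffed against later transcriptions with ordinary tools.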

Environmental Radioactivity Measurement,
17:00 16th March - 17:00 18th March
For data from the Fukushima site itself see the "Readings at Monitoring Post out of 20 Km Zone of Fukushima" data sets online. Data collection from Miyagi (Sendai) ceases at 17:00 on the 17th of March and does not resume. Several other shorter data dropouts also occur during the monitored period.

As you can see from the visualisation, environmental radiation levels change fairly minimally over the course of the day. Most measurements are steady, and within the historic ranges, except around the troubled Fukushima plant where readings are about double normal levels.

Things become more interesting, however, when we look at the historic baseline data. The two maps below show the typical range of background environmental radiation in Japan: the first shows typical minimum values, while the second shows typical maximum values. Put together they illustrate the observed range for environmental radiation across Japan.

Environmental Radioactivity Measurement,
Typical Minimum

Environmental Radioactivity Measurement,
Typical Maximum

Finally, the map embedded below shows the environmental radioactivity measurements with respect to the typical maximum values for that locale. From this visualisation it is evident that the measured values throughout Japan are normal except in the immediate area surrounding the Fukushima reactors, where levels are about double normal maximum levels.

Environmental Radioactivity Measurement,
Ratio with respect to typical Maximum Values

However, when analysing this data you should bear in mind that the normal environmental range in that area is fairly low compared to other areas of Japan. Currently the levels of radiation at the plant boundary are actually still lower than the typical background levels in some other parts of the country. Levels also seem fairly static over time, and do not appear to be increasing as the situation progresses.
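The ratio map is just each measurement divided by the typical maximum background value for its locale, so anywhere with a ratio above 1 is outside its historic range. A minimal sketch of that calculation, with illustrative numbers rather than the actual survey figures:

```python
# Typical maximum background and current measured values, in µSv/h.
# These numbers are made up for illustration, not taken from the data set.
typical_max = {"Fukushima": 0.06, "Tokyo": 0.08, "Osaka": 0.08}
measured    = {"Fukushima": 0.12, "Tokyo": 0.05, "Osaka": 0.04}

# Ratio with respect to the typical maximum for each locale.
ratios = {place: measured[place] / typical_max[place] for place in measured}

for place, ratio in sorted(ratios.items(), key=lambda kv: -kv[1]):
    flag = "above typical maximum" if ratio > 1 else "within range"
    print(f"{place}: {ratio:.1f}x ({flag})")
```

With these example numbers only Fukushima comes out above its typical maximum, at twice the normal ceiling, which is the kind of pattern the real visualisation shows.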

Unless the situation significantly worsens, which admittedly is always possible, human habitation in close proximity to the plant will not be affected in the medium term. From talking to people on the ground in Japan, and by looking at the actual measurements across the country, a very different picture seems to be emerging than that reported by the Western media, which seems highly skewed, and heavily politicised, by comparison.

I think everyone should take a deep breath, step back, and look at the evidence, which suggests that this is not another Chernobyl in the making. It may not even be another Three Mile Island. If the remaining functioning reactor units are decommissioned following this incident it may well have more to do with politics than with science.

Update: A Radiation Dose Chart from the people that brought you xkcd.com. The extra dosage you would pick up in a day while in a town near the Fukushima plant is around 3.5 µSv.

Radiation Dose Chart

For comparison a typical daily background dose is around 10 µSv, whilst having a dental X-ray would expose you to an additional 5 µSv. The exposure from a single trans-continental flight from New York to L.A. is of the order of 40 µSv.
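Putting those quoted figures side by side makes the scale obvious: the extra daily dose near the plant is smaller than a single dental X-ray, and one transcontinental flight is worth roughly eleven days of it. The arithmetic, using only the numbers quoted above:

```python
# Doses in microsieverts (µSv), as quoted in the chart and the text above.
extra_per_day_near_plant = 3.5    # extra daily dose in a town near Fukushima
typical_daily_background = 10.0   # typical daily background dose
dental_xray              = 5.0    # one dental X-ray
ny_to_la_flight          = 40.0   # one New York to L.A. flight

# The extra daily dose is less than a dental X-ray and a fraction of background.
print(extra_per_day_near_plant < dental_xray)                      # True
print(extra_per_day_near_plant / typical_daily_background)         # 0.35
print(round(ny_to_la_flight / extra_per_day_near_plant, 1))        # 11.4
```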

Update: This visualization compares the energy mix and the number of deaths related to each of the main sources of energy worldwide: coal, oil, natural gas, nuclear, hydro and biomass.

Thursday, March 03, 2011

The abandonment of technology

This article was originally posted on the O'Reilly Radar

Right now the Space Shuttle Discovery is in orbit for the last time, and docked with the International Space Station (ISS). On its return to Earth the orbiter will be decommissioned and displayed in the Smithsonian's National Air and Space Museum. Just two more shuttle flights, Endeavour in mid-April and Atlantis in late June, are scheduled before the shuttle program is brought to an end.

Tracy Caldwell Dyson in the Cupola module of the International Space Station
Tracy Caldwell Dyson in the Cupola module of the International Space Station observing the Earth below during Expedition 24 in 2010.
(Credit: Expedition 24, NASA)

Toward the end of last year I came across an interesting post about the abandonment of technology by Cameron Locke. A couple of months later I read an article by Kyle Munkittrick, who argues that the future is behind us, or at least that our current visions of the future are outdated compared to current technology:

The year is 2010. America has been at war for the first decade of the 21st century and is recovering from the largest recession since the Great Depression. Air travel security uses full-body X-rays to detect weapons and bombs. The president, who is African-American, uses a wireless phone, which he keeps in his pocket, to communicate with his aides and cabinet members from anywhere in the world ... Video games can be controlled with nothing but gestures, voice commands and body movement. In the news, a rogue Australian cyberterrorist is wanted by world's largest governments and corporations for leaking secret information over the world wide web; spaceflight has been privatized by two major companies, Virgin Galactic and SpaceX.

I've been thinking about these two articles ever since, and Discovery's last flight brought these thoughts to the front of my mind. On the face of things the two posts espouse very different viewpoints; however, the underlying line of argument in both is very similar.

The future is already here and we may be standing at a crucial decision point in our history. Forces are pulling us in both directions. On one hand the rate of technological progress is clearly accelerating, on the other, the resources we have on hand to push that progress are diminishing at an ever-increasing rate. In a world of declining resources, and increasingly unreliable energy supply, you have to wonder whether our current deep economic recession is a sign of things to come. Will the next few decades be a time of economic contraction and an overall lower standard of living?

Big problems, unaddressed

At the 2008 Web 2.0 Expo, Tim O'Reilly argued that there are still big problems to solve and he asked people to go after the big, hard, problems.

And what are the best and the brightest working on? You have to ask yourself, are we working on the right things?

I think Tim was right, and I don't think much has changed in the last couple of years. I'm worried that we're chasing the wrong goals, and that we're not yet going after the big, hard problems that Tim was talking about. Solving them might make all the difference.

While not all big problems are related to the space program by any means, the successful first launch of SpaceX's Falcon 9 rocket last December has given me some hope that some of our best and brightest aren't just throwing sheep at one another or selling plush toys.

Despite this, I see signs of a growth in pseudo-science, and an inability of even the educated middle classes to be able to tell the difference between it and more trustworthy scientific undertakings. There is also a worrying smugness, almost pride, among many people that they "just don't understand computers." While some of us are pushing the boundaries, it appears we may be leaving others behind.

We abandoned the moon, then faster flight. What's next?

The fastest transatlantic crossing by a commercial airliner was on the 7th of February 1996, when a BA Concorde flew from New York JFK to London Heathrow in just under 3 hours. Depending on conditions, the same crossing typically takes between 7 and 8 hours on a conventional airliner. As a regular on the LHR to SFO and LAX routes, I spend a lot of time unemployed over Greenland. I do sometimes wish it was otherwise.

A few weeks ago I watched an Airbus A380 taxi toward take-off at Heathrow, and I felt a deep sense of shame that as a species we'd traded a thing of aeronautical beauty for this lumbering giant. Despite the obvious technical achievement, it feels like a step backwards.

I'm young enough, if only just, that I don't remember the Moon landings. However when I was a child my father told me about how, as a younger man, he had avidly watched the broadcast of the Moon landings, and I have a friend whose father was in Mission Control with a much closer view. In the same way I can tell my son that we once were able to cross the Atlantic in just 3 hours, and that once it was possible to arrive in New York before you left London. I do wonder if things go the wrong way — and we enter an age of declining possibilities and narrowing horizons — whether he'll believe me.

Tuesday, March 01, 2011

The return of the Personal Area Network

This article was originally published on the O'Reilly Radar.

The recent uprisings in Egypt and across the Middle East have caused some interesting echoes amongst the great and the good back in Silicon Valley. Despite the overly dramatic language, I find Shervin Pishevar's ideas surrounding ad-hoc wireless mesh networks, embodied in the crowd-sourced OpenMesh Project, intriguing.

Back in the mid-years of the last decade I spent a lot of time attempting to predict the future. At the time I was convinced that personal area networks (PAN) were going to become fairly important. I got a few things right: that things were going to be all about ubiquity and geo-location, and that mobile data was the killer application for the then fairly new 3G cell networks. But I was off the mark in thinking that the convergence device, as we were calling it back then, was a dead end. Technology moved on and the convergence devices got better, thinner, lighter. Batteries lasted longer. Today we have iOS- and Android-based mobile handsets, and both platforms pretty much do everything my theorised PAN was supposed to do, and in a smaller box than the handset I was using in the mid-noughties.

It's debatable whether the OpenMesh Project is exactly the right solution to create secondary wireless networks to provide communications for protestors under repressive regimes. The off-the-shelf wireless networks Pishevar talks about extending operate on a known number of easily jammable frequencies. Worse yet, there has been little discussion, at least so far, about anonymity, identity, encryption and other security issues surrounding the plan. There is a very real threat that, by the very nature of an ad-hoc network, the same repressive regimes the protestors are trying to avoid could be privy to their communications.


For these reasons I'm not absolutely convinced that mesh networking is the right technical approach to this very political, and human, problem. Perhaps he should be looking at previous work on delay-tolerant networking rather than a real-time mesh protocol.
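The core idea behind delay-tolerant networking is store-and-forward: a node holds onto a message and hands copies to whoever it happens to meet, until one of those carriers eventually meets the destination. The toy sketch below illustrates that pattern; it is my own invention for illustration, not the Serval or OpenMesh protocol, and all the names are made up.

```python
import uuid

class Node:
    """Toy delay-tolerant node: store messages, forward copies on contact."""
    def __init__(self, name):
        self.name = name
        self.store = {}      # message id -> (destination name, payload)
        self.delivered = {}  # messages that reached this node as destination

    def send(self, destination, payload):
        self.store[str(uuid.uuid4())] = (destination, payload)

    def contact(self, other):
        """Simulate two nodes meeting: deliver or exchange stored messages."""
        for node, peer in ((self, other), (other, self)):
            for msg_id, (dest, payload) in list(node.store.items()):
                if dest == peer.name:
                    peer.delivered[msg_id] = payload
                    del node.store[msg_id]
                else:
                    peer.store.setdefault(msg_id, (dest, payload))

# A message from Alice to Carol is carried by Bob, who meets both in turn.
alice, bob, carol = Node("alice"), Node("bob"), Node("carol")
alice.send("carol", "meet at the square at noon")
alice.contact(bob)    # Alice and Bob meet; Bob takes a copy
bob.contact(carol)    # Bob later meets Carol; the message is delivered
print(list(carol.delivered.values()))  # ['meet at the square at noon']
```

Note that no end-to-end path ever exists at a single moment, which is exactly why this style of protocol tolerates the intermittent, jammed or disaster-damaged connectivity that a real-time mesh struggles with.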

However, Pishevar's suggested architecture does seem robust enough to cope with disaster response scenarios, where security concerns are less important, or if widely deployed to provide an alternative to more traditional data carriers. Pishevar isn't of course alone in thinking about these issues: the Serval Project uses cell phone swarms to enable mobile phones to make and receive calls without the conventional cell towers or satellites.

The Bluetooth standard proved too complicated, and at times too unreliable, a base upon which to build wireless PAN architectures. The new generation of ZigBee mesh devices is more transparent, at least at the user level, possibly offering a more robust mesh backbone for a heterogeneous network of Wi-Fi, cell and other data connections.

With the web of things and the physical web now starting to emerge, and with wearables now starting to become less intrusive and slightly more mainstream, the idea of the personal area network might yet re-emerge, at least in a slightly different form. The resulting data clouds could very easily form the basis for a more ubiquitous mesh network.

I'd argue that I was only a little early with my prediction. The technology wasn't quite there yet, but that doesn't mean it isn't coming. The 2000s proved, despite my confident predictions at the time, not to be the decade of the wireless PAN. Perhaps its time is yet to come?