
Thursday, September 06, 2012

With conflicting stories, all we can believe is the data


This post was originally published on the O'Reilly Radar
under the title "Digging into the UDID Data"
Over the weekend the hacker group Antisec released one million UDID records that they claim to have obtained from an FBI laptop using a Java vulnerability. In reply the FBI stated:
The FBI is aware of published reports alleging that an FBI laptop was compromised and private data regarding Apple UDIDs was exposed. At this time there is no evidence indicating that an FBI laptop was compromised or that the FBI either sought or obtained this data.
Of course that statement leaves a lot of leeway. It could be the agent's personal laptop, and the data may well have been "property" of another agency. The wording doesn't even explicitly rule out the possibility that this was an agency laptop; they just say that right now they don't have any evidence to suggest that it was.
This limited data release doesn't have much impact, but the possible release of the full dataset, which is claimed to include names, addresses, phone numbers and other identifying information, is far more worrying.
While some are almost dismissing the issue out of hand, the real issues here are: Where did the data originate? Which devices did it come from and what kind of users does this data represent? Is this data from a cross-section of the population, or a specifically targeted demographic? Does it originate within the law enforcement community, or from an external developer? What was the purpose of the data, and why was it collected?
With conflicting stories from all sides, the only thing we can believe is the data itself. The 40-character strings in the release at least look like UDID numbers, and anecdotally at least we have a third-party confirmation that this really is valid UDID data. We therefore have to proceed at this point as if this is real data. While there is a possibility that some, most, or all of the data is falsified, that's looking unlikely from where we're standing at the moment.

With that as the backdrop, the first action I took was to check the released data for my own devices and those of family members. Of the nine iPhones, iPads and iPod Touch devices kicking around my house, none of the UDIDs are in the leaked database. Of course there isn't anything to say that they aren't amongst the other 11 million UDIDs that haven't been released.
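That check is trivial to script. Here's a minimal sketch, assuming the release is a plain text file with the 40-character UDID as the first comma-separated field of each record; the filename and the UDIDs below are placeholders, not real data:

```python
# Check whether any of your devices' UDIDs appear in the leaked list.
# The UDIDs below are hypothetical placeholders.
my_udids = {
    "0123456789abcdef0123456789abcdef01234567",  # hypothetical iPhone
    "89abcdef0123456789abcdef0123456789abcdef",  # hypothetical iPad
}

def leaked_udids(path):
    """Yield the UDID field from each line of the released file."""
    with open(path) as f:
        for line in f:
            udid = line.split(",")[0].strip().lower()
            if len(udid) == 40:  # skip malformed records
                yield udid

# found = my_udids & set(leaked_udids("udid_release.txt"))
```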
With that done, I broke down the distribution of leaked UDID numbers by device type. Interestingly, considering the number of iPhones in circulation compared to the number of iPads, the bulk of the UDIDs were self-identified as originating on an iPad.
Distribution of UDID by device type

What does that mean? Here's one theory: If the leak originated from a developer rather than directly from Apple, and assuming that this subset of data is a good cross-section on the total population, and assuming that the leaked data originated with a single application ... then the app that harvested the data is likely a Universal application (one that runs on both the iPhone and the iPad) that is mostly used on the iPad rather than on the iPhone.
The very low numbers of iPod Touch users might suggest either demographic information, or that the application is not widely used by younger users who are the target demographic for the iPod Touch, or alternatively perhaps that the application is most useful when a cellular data connection is present.
The next thing to look at, as the only field with unconstrained text, was the Device Name data. That particular field contains a lot of first names, e.g. "Aaron's iPhone," so roughly speaking the distribution of first letters in this field should give a decent clue as to the geographical region of origin of the leaked list of UDIDs. This distribution is of course going to be different depending on the predominant language in the region.
Distribution of UDID by the first letter of the "Device Name" field

The immediate standout in this distribution is the predominance of device name strings starting with the letter "i." This can be ascribed to people who don't have their own name prepended to the Device Name string, and have named their device "iPhone," "iPad" or "iPod Touch."
The obvious next step was to compare this distribution with the relative frequency of first letters in words in the English language.
Comparing the distribution of UDID by first letter of the "Device Name" field against the relative frequencies of the first letters of a word in the English language

The spike for the letter "i" dominated the data, so the next step was to do some rough and ready data cleaning.
I dropped all the Device Name strings that started with the string "iP." That cleaned out all those devices named "iPhone," "iPad" and "iPod Touch." Doing that brought the number of device names starting with an "i" down from 159,925 to just 13,337. That's a bit more reasonable.
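A rough sketch of that cleaning-and-counting step, using made-up device names for illustration:

```python
from collections import Counter

def first_letter_counts(device_names):
    """Count first letters of device names.

    Names beginning "iP" ("iPhone", "iPad", "iPod Touch") carry no owner
    information, so they are dropped before counting.
    """
    counts = Counter()
    for name in device_names:
        name = name.strip()
        if not name or name.startswith("iP"):
            continue
        counts[name[0].lower()] += 1
    return counts

# Hypothetical sample, for illustration only:
sample = ["Aaron's iPhone", "iPad", "Beth's iPad", "iPhone", "ana iPod"]
print(first_letter_counts(sample))  # Counter({'a': 2, 'b': 1})
```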
Comparing the distribution of UDID by first letter of the "Device Name" field, ignoring all names that start with the string "iP", against the relative frequencies of the first letters of a word in the English language

I had a slight over-abundance of "j," although that might not be statistically significant. The standout, however, was a serious under-abundance of strings starting with the letter "t," which is interesting. Additionally, with my earlier data cleaning I also had a slight under-abundance of "i," which suggested I may have been too enthusiastic about cleaning the data.
Looking at the relative frequency of letters in languages other than English it's notable that amongst them Spanish has a much lower frequency of the use of "t."
As the de facto second language of the United States, Spanish is the obvious next choice to investigate. If the devices are predominantly Spanish in origin, then this could solve the problem introduced by our data cleaning: in Spanish you would say "iPhone de Mark" rather than "Mark's iPhone."
Comparing the distribution of UDID by first letter of the "Device Name" field, ignoring all names that start with the string "iP", against the relative frequencies of the first letters of a word in the Spanish language

However, that distribution didn't really fit either. While "t" was much better, I now had an under-abundance of words with an "e." Although it should be noted that, unlike our English language relative frequencies, the data I was using for Spanish is for letters in the entire word, rather than letters that begin the word. That's certainly going to introduce biases, perhaps fatal ones.
Not that I can really make the assumption that there is only one language present in the data, or even that one language predominates, unless that language is English.
At this stage it's obvious that the data is, at least more or less, of the right order of magnitude. The data probably shows devices coming from a Western country. However, we're a long way from the point where I'd come out and say something like " ... the device names were predominantly in English." That's not a conclusion I can make.
I'd be interested in tracking down the relative frequency of letters used in Arabic when the language is transcribed into the Roman alphabet. While I haven't been able to find that data, I'm sure it exists somewhere. (Please drop a note in the comments if you have a lead.)
The next step for the analysis is to look at the names themselves. While I'm still in the process of mashing up something that will access U.S. census data and try to reverse geo-locate a name to a "most likely" geographical origin, such services do already exist. And I haven't really pushed the boundaries here, or even started a serious statistical analysis of the subset of data released by Antisec.
This brings us to Pete Warden's point that you can't really anonymize your data. The anonymization process for large datasets such as this is simply an illusion. As Pete wrote:
Precisely because there are now so many different public datasets to cross-reference, any set of records with a non-trivial amount of information on someone’s actions has a good chance of matching identifiable public records.
While this release in itself is fairly harmless, a number of "harmless" releases taken together — or cleverly cross-referenced with other public sources such as Twitter, Google+, Facebook and other social media — might well be more damaging. And that's ignoring the possibility that Antisec really might have names, addresses and telephone numbers to go side-by-side with these UDID records.
The question has to be asked then, where did this data originate? While 12 million records might seem a lot, compared to the number of devices sold it's not actually that big a number. There are any number of iPhone applications with a 12-million-user installation base, and this sort of backend database could easily have been built up by an independent developer with a successful application who downloaded the device owner's contact details before Apple started putting limitations on that.
Ignoring conspiracy theories, this dataset might be the result of a single developer. Although how it got into the FBI's possession and the why of that, if it was ever there in the first place, is another matter entirely.
I'm going to go on hacking away at this data to see if there are any more interesting correlations, and I do wonder whether Antisec would consider a controlled release of the data to some trusted third party?
Much like the reaction to #locationgate, where some people were happy to volunteer their data, if enough users are willing to self-identify, then perhaps we can get to the bottom of where this data originated and why it was collected in the first place.
Thanks to Hilary Mason, Julie Steele, Irene Ros, Gemma Hobson and Marcos Villacampa for ideas, pointers to comparative data sources, and advice on visualisation of the data.


Update: In response to a post about this article on Google+, Josh Hendrix made the suggestion that I should look at word as well as letter frequency. It was a good idea, so I went ahead and wrote a quick script to do just that...
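The script itself was a throwaway, but the idea can be sketched like this (the sample names are made up; counting is deliberately case-sensitive, so "iPad" and its misspellings stay distinct):

```python
import re
from collections import Counter

def word_counts(device_names):
    """Count whole words across all the Device Name strings."""
    counts = Counter()
    for name in device_names:
        # Treat apostrophes as part of a word, so "Aaron's" is one token.
        counts.update(re.findall(r"[A-Za-z][A-Za-z']*", name))
    return counts

# Hypothetical sample, for illustration only:
sample = ["Aaron's iPhone", "iPad de Marta", "David's iPad"]
print(word_counts(sample).most_common(1))  # [('iPad', 2)]
```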
The top two words in the list are "iPad," which occurs 445,111 times, and "iPhone," which occurs 252,106 times. The next most frequent word is "iPod," but that occurs only 36,367 times. This result backs up my earlier result looking at distribution by device type.
Then there are various misspellings and mis-capitalisations of "iPhone," "iPad," and "iPod."
The first real word that isn't an Apple trademark is "Administrator," which occurs 10,910 times. Next are "David" (5,822), "John" (5,447), and "Michael" (5,034). This is followed by "Chris" (3,744), "Mike" (3,744), "Mark" (3,66) and "Paul" (3,096).
Looking down the list of real names, as opposed to partial strings and tokens, the first female name doesn't occur until we're 30 places down the list — it's "Lisa" (1,732) with the next most popular female name being "Sarah" (1,499), in 38th place.
The top 100 names occurring in the UDID data

The word "Dad" occurs 1,074 times, with "Daddy" occurring 383 times. For comparison the word "Mum" occurs just 58 times, and "Mummy" just 33. "Mom" came in with 150 occurrences, and "mommy" with 30. The number of occurrences for "mum," "mummy," "mom," and "mommy" combined is 271, which is still very small compared to the combined total of 1,457 for "dad" and "daddy."

[Updated: Greg Yardly pointed out on Twitter that I was being a bit British-centric in only looking for the words "mum" and "mummy," which is why I expanded the scope to include "mom" and "mommy."]
There is a definite gender bias here, and I can think of at least a few explanations. The most likely is fairly simplistic: The application where the UDID numbers originated either appeals to, or is used more, by men.
Alternatively, women may be less likely to include their name in the name of their device, perhaps because amongst other things this name is used to advertise the device on wireless networks?
Either way I think this definitively pins it down as a list of devices originating in an Anglo-centric geographic region.
Sometimes the simplest things work better. Instead of being fancy perhaps I should have done this in the first place. However this, combined with my previous results, suggests that we're looking at an English-speaking, mostly male, demographic.
Correlating the top 20 or so names with the list of most popular baby names (by year) all the way from the mid-'60s up until the mid-'90s (so looking at the most popular names for people between the ages of, say, 16 and 50) might give a further clue as to the exact demographic involved.
Both Gemma Hobson and Julie Steele directed me toward the U.S. Social Security Administration's Popular Baby Names By Decade list. A quick and dirty analysis suggests that the UDID data is dominated by names that were most popular in the '70s and '80s. This maps well to my previous suggestion that the lack of iPod Touch usage might suggest that the demographic was older.
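That quick-and-dirty comparison can be sketched as scoring each decade by how many of the leaked list's top names appear in that decade's most-popular list. The name sets below are tiny made-up stand-ins; the real analysis would load the full SSA data:

```python
from collections import Counter

# Made-up stand-ins for the SSA "Popular Baby Names By Decade" lists.
ssa_top_names = {
    "1970s": {"Michael", "David", "John", "Lisa"},
    "1980s": {"Michael", "Christopher", "David", "Sarah"},
    "1990s": {"Michael", "Joshua", "Tyler", "Emily"},
}

# Top names from the UDID Device Name analysis above.
udid_top_names = ["David", "John", "Michael", "Chris", "Mike", "Mark", "Paul"]

def decade_scores(names, decades):
    """Score each decade by how many of the given names appear in its list."""
    return Counter({decade: sum(name in top for name in names)
                    for decade, top in decades.items()})

print(decade_scores(udid_top_names, ssa_top_names).most_common())
# [('1970s', 3), ('1980s', 2), ('1990s', 1)]
```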
I'm going to do a year-by-year breakdown and some proper statistics later on, but we're looking at an application that's probably used by: English speaking males with an Anglo-American background in their 30s or 40s. It's most used on the iPad, and although it also works on the iPhone, it's used far less on that platform.
Thanks to Josh Hendrix, and again to Gemma Hobson and Julie Steele, for ideas and pointers to sources for this part of the analysis.

Update: A really nice analysis from David Schultz, using the frequency of UDID duplicates and the names of those devices to track down the source of the leak. I really should have thought of that...
Interestingly, however, it does support my own analysis. BlueToad makes apps for magazine publishers, hence the predominance of the iPad over the iPhone in my results, as those apps are more normally used on the iPad.
Also they seem to mostly market into the U.S., which supports my ethnicity findings, and looking at the list of titles they curate, it does look like my demographics are more-or-less spot on as well. Those look like magazines marketed to men in their 30s and 40s to me...
I'd actually been really confused about what type of app could possibly have that narrow a demographic, and this sort of clears up my confusion. Nice!

Thursday, August 30, 2012

Hardware Hacking for iOS Programmers

This post was originally published on Josetteorama.

The arrival of the iPhone changed the whole direction of software development for mobile platforms, and has had a profound impact on the hardware design of the smart phones that have followed it.

Not only do these devices know where they are, they can tell you how they're being held, they are sufficiently powerful to overlay data layers on the camera view, and record and interpret audio data, and they can do all this in real time. These are not just smart phones, these are computers that just happen to be able to make phone calls.

Alasdair Allan demonstrating an Augmented Reality application
The arrival of the External Accessory Framework was seen, initially at least, as having the potential to open the iOS platform up to a host of external accessories and additional sensors. Sadly, little of the innovation people were expecting actually occurred, and while there are finally starting to be some interesting products arriving on the market, for the most part the External Accessory Framework is being used to support a fairly predictable range of audio and video accessories from big-name manufacturers.

The reason for this lack of innovation is usually laid at the feet of Apple's Made for iPod (MFi) licensing program. To develop hardware accessories that connect to the iPod, iPhone, or iPad, you must be an MFi licensee.
Unfortunately, becoming a member of the MFi program is not as simple as signing up as an Apple Developer, and it is a fairly lengthy process. From personal experience I can confirm that the process of becoming an MFi licensee is not for the faint-hearted. And once you’re a member of the program, getting your hardware out of prototype stage and approved by Apple for distribution and sale is not necessarily a simple process.


However, all that started to change with the arrival of Redpark's serial cable. As it's MFi approved for the hobbyist market, it allows you to connect your iPhone to external hardware very simply. It also allows you to easily prototype new external accessories, bypassing a lot of the hurt you'd otherwise experience trying to do that wholly within the confines of the MFi program.


Another important part of that change was the Arduino. The Arduino, and the open hardware movement that has grown up with it and to a certain extent around it, is enabling a generation of high-tech tinkerers to prototype new ideas with fairly minimal hardware knowledge.

Every so often a piece of technology can become a lever that lets people move the world, just a little bit. The Arduino is one of those levers. While it started off as a project to give artists access to embedded microprocessors for interactive design projects, I think it’s going to end up in a museum as one of the building blocks of the modern world. It allows rapid, cheap prototyping for embedded systems. It turns what used to be fairly tough hardware problems into simpler software problems.

Turning hardware problems into software problems makes things more scalable; it drastically reduces development timescales and up-front investment, and, as the whole dot-com revolution has shown, it leads to innovation. Every interesting hardware prototype to come along seems to boast that it is Arduino-compatible, or just plain built on top of an Arduino.

Controlling an Arduino directly from the iPad
I think the next round of innovation is going to take Silicon Valley, and the rest of us, back to its roots, and that's hardware. If you're a software person the things that are open and the things that are closed are changing. The skills needed to work with the technology are changing as well.

Alasdair demonstrating an Augmented Reality application
At the start of October I'll be running a workshop on iOS Sensors and External Hardware. It's going to be hardware hacking for iOS programmers, and an opportunity for people to get their hands dirty with both the internal sensors in the phone and with external hardware.

The workshop is intended to guide you through the start of that change, and get you hands-on with the hardware in your iPhone you've probably been ignoring until now: how to make use of the on-board sensors and combine them to build sophisticated location-aware applications, and also how to extend the reach of these sensors by connecting your iOS device to external hardware.

Blinking the heartbeat LED of a BeagleBone from the iPhone
We'll look at three platforms, the Arduino, the BeagleBone and the Raspberry Pi, and get our hands dirty building simple applications to control the boards and gather measurements from sensors connected to them, directly from the iPhone. The course should give you the background to build your own applications independently, using the hottest location-aware technology yet for any mobile platform.

The workshop will be on Monday the 8th of October at the Hoxton Hotel in London, at the heart of Tech City and right next to Silicon Roundabout. I'm extending a discount to readers: 10% off the ticket price with discount code OREILLY10. That makes the early bird ticket price just £449.10 (was £499), or if you miss the early bird deadline (the 1st of September) a full-priced ticket still only £629.10 (£699).

Register
Monday 8th October 2012
Hoxton Hotel, London
Early Bird Price: £499 (until 1st Sept.)
Normal Price: £699
Save 10% with code OREILLY10

Sunday, August 26, 2012

Now with added BeagleBone

After the last couple of days, my workshop in London on the 8th of October, at the Hoxton Hotel, now has added BeagleBone and Raspberry Pi.

Blinking the BeagleBone's heartbeat LED using the iPhone

We're going to go hands-on in a small class setting, diving deep into the iOS internal sensors and how to connect your iPhone or iPad to external hardware. Everyone will get their hands dirty, and everyone will come away knowing more about both the iPhone hardware and how to work with external accessories. So come along and get your hands dirty playing with iPhone, Arduino and now the BeagleBone and Raspberry Pi, and get 10% off the Early Bird ticket price today only with code BEAGLE10.


Register
Monday 8th October 2012
Hoxton Hotel, London
Early Bird Price: £499 (until 1st Sept.)
Normal Price: £699
Save 10% with code BEAGLE10

Saturday, August 25, 2012

Blinking the BeagleBone's heartbeat LED from the iPhone

Following up on the work I was doing last night connecting the iPhone to the BeagleBone using PeerTalk, I've now reached the blinking LED stage, which is more-or-less the "Hello World" stage of any hardware hack.

Blinking the BeagleBone's heartbeat LED using the iPhone

I've been having a great back-and-forth on Twitter with David House while hacking away on this project; as I type, he is working to get the same thing running on the Raspberry Pi. It's been a lot of fun.

If you want to replicate this on the BeagleBone you should first download and build the PeerTalk library, then build and deploy the iOS and OS X example applications and get those up and running.

Then connect up and boot your BeagleBone. You'll need to power the board from a mains adapter: when you're compiling things it's possible you'll draw enough current that your computer will turn off the USB port to protect itself, and as a result power down your BeagleBone. I had this happen to me a couple of times before I finally dug a mains adapter out of my office drawer. However, since you're powering the board from the mains, you'll also have to connect an Ethernet cable so that you can ssh root@beaglebone.local and log into the board over the network.

1. Go ahead and login to your BeagleBone as root.

2. Download, build and install libusb. Version 1.0.9 builds, links and installs okay.

3. Download, build and install cmake, which you'll need to build usbmuxd later. You'll need to grab the latest Git nightly checkout as older release versions don't build, having problems with the stock libbz2 compression on the BeagleBone.

4. We also need libplist, however this is available as part of the package management system on Ångström Linux, so all you need to do to install this is type opkg install libplist-dev at the prompt.


5. Download, build and install usbmuxd. Version 1.0.8 builds, links and installs okay, although you may need to use ccmake and configure by hand, rather than using cmake, as it can't seem to find the libusb include files that were installed into /usr/local.


6. Create a usbmux user

   groupadd -r usbmux -g 114
   useradd -r -g usbmux -d / -s /sbin/nologin -c "usbmux user" -u 114 usbmux


7. As the BeagleBone doesn't have syslog turned on by default, and you'll need it for debugging, turn on syslogd from the relevant script in /etc/init.d.


8. Run up the usbmuxd daemon by typing usbmuxd -v -v at the prompt.


9. Plug your iPhone into the (host side) USB port on your BeagleBone; you should see some debug output scrolling by in /var/log/messages.


10. Download David House's peertalk-python and its dependencies.


11. On your iPhone start the PeerTalk client for iOS.


12. Start the python client on the BeagleBone by typing python ./peertalk.py at the prompt.


Type in a message at the prompt, and you should see something like this...


Bi-directional communication between the iPhone and the BeagleBone via USB
From there it's pretty trivial to replicate my "Hello World" example, just by hacking around with David's code and toggling the heartbeat LED when the BeagleBone receives any messages.

    # Inside David's peertalk-python listener class; the file needs
    # "import os, struct" at the top. Each received PeerTalk frame has a
    # 16-byte header, the fourth field of which is the payload size.
    def run(self):
        framestructure = struct.Struct("! I I I I")
        ledOn = 'echo 1 > /sys/class/leds/beaglebone::usr0/brightness'
        ledOff = 'echo 0 > /sys/class/leds/beaglebone::usr0/brightness'
        i = 0
        while self._running:
            try:
                msg = self._psock.recv(16)
                if len(msg) > 0:
                    frame = framestructure.unpack(msg)
                    size = frame[3]
                    msgdata = self._psock.recv(size)
                    print "Received: %s" % msgdata
                    # Toggle the heartbeat LED on each message received
                    if i == 0:
                        os.system(ledOn)
                        i = 1
                    else:
                        os.system(ledOff)
                        i = 0
            except:
                pass

Which gets you to this point...

Toggling the BeagleBone heartbeat LED with my iPhone over USB.
Which is pretty much where I've reached right now. The next step is a proper application on the iOS end of things, with more generic control of the BeagleBone's header pins, and a more flexible Python backend on the BeagleBone itself...
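For reference, the 16-byte header that the Python side unpacks is four big-endian unsigned 32-bit integers. The field names below (version, type, tag, payload size) are inferred from the PeerTalk source and should be treated as an assumption; here's a quick round-trip sketch:

```python
import struct

# PeerTalk-style frame header: four big-endian unsigned 32-bit integers.
HEADER = struct.Struct("! I I I I")

def make_frame(version, frametype, tag, payload):
    """Prefix a payload with a PeerTalk-style header."""
    return HEADER.pack(version, frametype, tag, len(payload)) + payload

def read_frame(data):
    """Split a frame back into (version, frametype, tag) and its payload."""
    version, frametype, tag, size = HEADER.unpack(data[:HEADER.size])
    return (version, frametype, tag), data[HEADER.size:HEADER.size + size]

frame = make_frame(1, 101, 0, b"hello")
print(read_frame(frame))  # ((1, 101, 0), b'hello')
```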

Update: David House has managed to get everything up and working on the Raspberry Pi. The only changes from the above are that you should grab libplist using apt-get rather than opkg, and, since you won't be logged in as root, you should remember to sudo usbmuxd -v -v when you start the USB daemon. Apart from that, you should be good to go...

David House (@davidahouse)
25/08/2012 20:22
Video of iPhone controlling LED on Raspberry Pi.


Controlling a LED connected to a GPIO pin on the Raspberry Pi with an iPhone

Update: Come along to my workshop in London on the 8th of October and get your hands dirty playing with iPhone, Arduino and now the BeagleBone and Raspberry Pi. Get 10% off the Early Bird ticket price today only with code BEAGLE10.

Register
Monday 8th October 2012
Hoxton Hotel, London
Early Bird Price: £499 (until 1st Sept.)
Normal Price: £699
Save 10% with code BEAGLE10

Update: David House has just updated his Github repository with a better description of what he did to get the iPhone to control the Raspberry Pi's GPIO pins.
David House (@davidahouse)
26/08/2012 13:40
@aallan I just updated my github repo with a better description with attributions. Had a blast working with you...


Controlling a LED connected to a GPIO pin on the Raspberry Pi with an iPhone

PeerTalk and the BeagleBone

Earlier today I came across an excellent bit of wizardry by Rasmus Andersson called PeerTalk. It's an Objective-C library allowing you to communicate between your iPhone and your Mac over the USB dock cable using TCP sockets.

PeerTalk Demo

My immediate thought was that if this really only depended on having USB host mode capability at the far end, the same mechanism should be usable to talk to something like the BeagleBone, or the Raspberry Pi, not just your Mac. This would allow you to connect your phone directly to the board and drive hardware directly, a lot like the Redpark cable but bypassing Apple's External Accessory framework.

Yup, this is going to be useful...
So I started digging around inside the source code to see if it depended on anything specific to OS X. It quickly became apparent that PeerTalk was mostly some really nice socket code sitting on top of the USB Multiplex Daemon (usbmuxd). This bit of software is in charge of talking to your iPhone over USB and coordinating access to its services by other applications; effectively, it's what iTunes and Xcode use to talk to your phone when you plug it into your Mac's USB port.

So any device that wants to talk to the iPhone using this method needs usbmuxd. Fortunately for me, a number of people have worked out how to talk to the iPhone from Linux, and there is a working usbmuxd for Linux.

The BeagleBone
As well as a few other dependencies which aren't present in the stock Ångström Linux distribution on my BeagleBone, or even packaged via opkg, building usbmuxd on my BeagleBone requires libusb and cmake. So before building usbmuxd, I had to build cmake, which meant resolving some problems with the stock compression libraries that shipped with Ångström.

However, several hours later, after enough waiting around for software to build to convince me that before doing any serious development on the BeagleBone I really should build an ARMv7 toolchain on my Mac and cross-compile things instead of building them directly on the board...

The iPhone talking directly to my BeagleBone using PeerTalk
...I managed to get a simple "hello" from my iPhone to the BeagleBone, and then via screen to my Mac using port forwarding and that old standby, telnet.

While I was hacking away on getting this working, I wasn't alone. David House was looking down some of the same back alleyways to get PeerTalk talking to his Raspberry Pi, and we batted the problem back and forth on Twitter while waiting for code to compile well into the night...

The next step is to put together a client on the BeagleBone, sitting on top of usbmuxd, that'll talk natively to PeerTalk on iOS. Since I've got the source code of both ends, this isn't going to be too hard. I'll probably put something together in Python.
More soon...

Update: Following on from this I pushed forward till I managed to blink the BeagleBone's heartbeat LED from the iPhone which is, more-or-less, the "Hello World" stage of any hardware hack...

Sunday, August 19, 2012

A drawer full of phones...

I had a huge clear out of my home office this weekend, including the drawer full of old mobile phones. A free copy of the second edition of Learning iOS Programming to the first person who can successfully identify them all...
A collection of mobile phones, but the iPhone changed everything..?
Answers accepted on the relevant thread on Google+ only....

Thursday, August 16, 2012

iOS Sensors and External Hardware Masterclass

I'm going to be down in London for O'Reilly's Strata conference in October, keynoting on the Tuesday morning, talking about the hidden data that follows us around and how I've leveraged that for my own advantage. I'll also be talking, along with my colleague Zena Wood from Exeter, about People Watching with Machine Learning and using modern smart phones, like the iPhone, to do interesting sociology. It should be good.

However, I'll be kicking around town for most of the following week, talking to various people. But I had a gap, a big gap, at the beginning of that week. So I've decided to try an interesting experiment...

I've often argued that the increasingly rich sensor suite, and the ability to easily connect today's smart phones to external hardware and sensors, make them an amazing lever on the world. It's something I've focused on a lot over the last year or so.

A conversation with Dale Dougherty and Alasdair Allan

While I've run a lot of conferences and workshops over the years, it's always been on someone else's dime. Time to put my money where my mouth is: I'm going to run a workshop.

In fact I'm going to run a master-class on iOS Sensors and External Hardware. This is going to be hardware hacking for iOS programmers. It's going to be hands-on: bring your Mac, bring your iPhone, and make sure you've got Xcode set up so that you can deploy apps onto your device. It'll be a small group, no more than twenty, and I'll be doing a bunch of live coding.

We'll start the day talking about the internal sensors: the accelerometer, the magnetometer and the gyroscope, and how to combine these into sophisticated applications. Here you'll really get the benefit of my physics background, because I can take you under the user-friendly skin Apple have put on top of these sensors as part of the iOS SDK and hopefully give you a decent idea of what their limitations are and how they work.

Then we'll move on to talk about how to extend the reach of the on-board sensors by connecting your iPhone to external hardware. We'll look at how to connect the Arduino micro-controller platform to your iOS device, and build simple applications to control the board and gather measurements from sensors connected to it, directly from iOS. This course will give you the background to build your own applications independently, using the hottest location-aware technology yet for any mobile platform.

iPad controlling an Arduino board via the Redpark cable
You'll take away with you an Arduino Uno board, a Redpark TTL Serial Cable for iOS, and everything you need to connect your iPhone to your new micro-controller. You'll also receive a copy of my books Basic Sensors in iOS and iOS Sensor Apps with Arduino.

The workshop will be on Monday the 8th of October at the Hoxton Hotel right next to London's Silicon Roundabout. I've opened registration and I'm offering 30% off the ticket price until the 1st of September. Sign up early, and sign up often.

There's more about exactly what I'll be covering on the workshop's own website. I've done similar things on a smaller scale in the past, but this should be a lot of fun. Hope to see at least some of you there...

Register
Monday 8th October 2012
Hoxton Hotel, London
Early Bird Price: £499 (until 1st Sept.)
Normal Price: £699

Saturday, June 02, 2012

Play testing the Lords of Midnight for iOS

In 1984 a game called Lords of Midnight, written by Mike Singleton, was released for the ZX Spectrum; conversions to the Commodore 64 and Amstrad CPC soon followed. It came to dominate my game playing during my mid-teens: games came and went, but I always returned to the War of Solstice and the Lords of Midnight.


The Lord of Blood stands in the Keep of Blood looking north towards the invading armies of Doom Guard as they pour through the Gap of Valethor and onto the Plains of Blood. Things are not going well for the Free.

The game had a well-written backstory and, for the time, an amazing amount of depth to its gameplay: a unique blend of war game, strategy, and landscape that was ground-breaking...


The Lands of Midnight

...and thanks to Chris Wild's bout of nostalgia in the early nineties, and his port to MS-DOS, I went on playing the game. There was even a multi-player version, called Midnight/MU, that let you play online through your browser. It seems I wasn't alone in my nostalgia.

But things changed with the arrival of the iPhone, and even more so with the arrival of the iPad. I thought the iPad was the perfect platform to revive the game. While it was epic in scope, its turn-based play meant that, unlike some other strategy games, it was well suited to the dip-in, dip-out style of gaming on the platform. More than that, I wanted to play my favourite game on my new hardware. I stopped playing Chris' port and started to think idly about porting his code, or more likely his Midnight Engine, to iOS. I poked around in the source code, but eventually decided against it. Instead I waited. Someone else was going to do it; it was just a matter of time.

My patience was seemingly rewarded: there was going to be an iOS port, and Chris Wild and Mike Singleton were going to work on it together...

...but time passed, actually quite a lot of time, more than a year, and it started to look like vapourware. Then, just a couple of months ago, Chris posted some video footage of the game to his blog. It existed, if only in the roughest sense, and it was playable.


The pre-alpha demo of Lords of Midnight for iOS

Content to wait at that point, I sat back. Not only was there going to be an iOS version, but because of the way Chris had ported his Midnight Engine, using the cross-platform Marmalade SDK, there were also going to be ports to Android, Mac OS X, and MS Windows. This wasn't just a simple iOS port; this was a cross-platform remake of the original game. There was even discussion of finally making the almost legendary missing sequel, The Eye of the Moon.

I waited, I'd gotten good at it...


The Lords of Midnight for iOS

...and then Chris put out a call for play testers. I didn't spot it, but amazingly my editor at O'Reilly, Brian Jepson, did. I managed to make it into the play test, which is another one I owe Brian.


Play testing the iOS port

The graphics are still the original imagery from the eighties, the interface is still a bit shaky, and there are a few bugs in the artificial intelligence, but I'm enjoying having early access to the game. I'm enjoying wallowing in my eighties nostalgia.

But beyond that I think that, with the bugs and interface problems properly addressed, and the graphics updated to something that looks at home in the twenty-first century, this is still, above all, a solid game. In fact it's a game that feels as if it was always intended to be on the iPad, as if it was always meant to run on touch hardware. The new platform suits it, like a new suit of clothes.

This isn't an ageing rock star coming out of retirement for one more nostalgia tour, this is something bigger. Just like it did the first time around, I think the Lords of Midnight for iOS could change how gaming is done on the platform.

Not bad for a game that's now approaching thirty years old.