Thursday, August 30, 2012

Hardware Hacking for iOS Programmers

This post was originally published on Josetteorama.

The arrival of the iPhone changed the whole direction of software development for mobile platforms, and has had a profound impact on the hardware design of the smart phones that have followed it.

Not only do these devices know where they are, they can tell you how they're being held, they are sufficiently powerful to overlay data layers on the camera view, and record and interpret audio data, and they can do all this in real time. These are not just smart phones, these are computers that just happen to be able to make phone calls.

Alasdair Allan demonstrating an Augmented Reality application
The arrival of the External Accessory Framework was seen, initially at least, as having the potential to open the iOS platform up to a host of external accessories and additional sensors. Sadly, little of the innovation people were expecting actually occurred, and while some interesting products are finally starting to arrive on the market, for the most part the External Accessory Framework is being used to support a fairly predictable range of audio and video accessories from big-name manufacturers.

The reason for this lack of innovation is usually laid at the feet of Apple's Made for iPod (MFi) licensing program. To develop hardware accessories that connect to the iPod, iPhone, or iPad, you must be an MFi licensee.
Unfortunately, becoming a member of the MFi program is not as simple as signing up as an Apple Developer, and it is a fairly lengthy process. From personal experience I can confirm that the process of becoming an MFi licensee is not for the faint-hearted. And once you’re a member of the program, getting your hardware out of prototype stage and approved by Apple for distribution and sale is not necessarily a simple process.

However, all that started to change with the arrival of Redpark's serial cable. As it's MFi approved for the hobbyist market, it lets you connect your iPhone to external hardware very simply. It also allows you to prototype new external accessories easily, bypassing a lot of the hurt you'd otherwise experience trying to do that wholly within the confines of the MFi program.

Another important part of that change was the Arduino. The Arduino, and the open hardware movement that has grown up with it and to a certain extent around it, is enabling a generation of high-tech tinkerers to prototype new ideas with fairly minimal hardware knowledge.

Every so often a piece of technology can become a lever that lets people move the world, just a little bit. The Arduino is one of those levers. While it started off as a project to give artists access to embedded microprocessors for interactive design projects, I think it’s going to end up in a museum as one of the building blocks of the modern world. It allows rapid, cheap prototyping for embedded systems. It turns what used to be fairly tough hardware problems into simpler software problems.

Turning things into software problems makes them more scalable; it drastically reduces development timescales and up-front investment, and, as the whole dot-com revolution has shown, it leads to innovation. Every interesting hardware prototype to come along seems to boast that it is Arduino-compatible, or just plain built on top of an Arduino.

Controlling an Arduino directly from the iPad
I think the next round of innovation is going to take Silicon Valley, and the rest of us, back to its roots, and that's hardware. If you're a software person the things that are open and the things that are closed are changing. The skills needed to work with the technology are changing as well.

Alasdair demonstrating an Augmented Reality application
At the start of October I'll be running a workshop on iOS Sensors and External Hardware. It's going to be hardware hacking for iOS programmers, and an opportunity for people to get their hands dirty both with the internal sensors in the phone and with external hardware.

The workshop is intended to guide you through the start of that change, and get you hands-on with the hardware in your iPhone you've probably been ignoring until now: how to make use of the on-board sensors and combine them to build sophisticated location-aware applications, but also how to extend the reach of these sensors by connecting your iOS device to external hardware.

Blinking the heartbeat LED of a BeagleBone from the iPhone
We'll look at three platforms, the Arduino, the BeagleBone and the Raspberry Pi, and get our hands dirty building simple applications to control the boards and gather measurements from sensors connected to them, directly from the iPhone. The course should give you the background to build your own applications independently, using the hottest location-aware technology yet for any mobile platform.

The workshop will be on Monday the 8th of October at the Hoxton Hotel in London, at the heart of Tech City and right next to Silicon Roundabout. I'm extending a discount to readers: 10% off the ticket price with discount code OREILLY10. That makes the early bird ticket price just £449.10 (was £499), or if you miss the early bird deadline (the 1st of September) a full-priced ticket still only £629.10 (£699).

Monday 8th October 2012
Hoxton Hotel, London
Early Bird Price: £499 (until 1st Sept.)
Normal Price: £699
Save 10% with code OREILLY10

Sunday, August 26, 2012

Now with added BeagleBone

After the last couple of days' work, my workshop in London on the 8th of October, at the Hoxton Hotel, now has added BeagleBone and Raspberry Pi.

Blinking the BeagleBone's heartbeat LED using the iPhone

We're going to go hands-on in a small class setting, diving deep into the iOS internal sensors and how to connect your iPhone or iPad to external hardware. Everyone will get their hands dirty, and everyone will come away knowing more about both the iPhone hardware and how to work with external accessories. So come along and get your hands dirty playing with the iPhone, Arduino, and now the BeagleBone and Raspberry Pi, and get 10% off the Early Bird ticket price today only with code BEAGLE10.

Monday 8th October 2012
Hoxton Hotel, London
Early Bird Price: £499 (until 1st Sept.)
Normal Price: £699
Save 10% with code BEAGLE10

Saturday, August 25, 2012

Blinking the BeagleBone's heartbeat LED from the iPhone

Following up on the work I was doing last night connecting the iPhone to the BeagleBone using PeerTalk, I've now reached the blinking LED stage, which is more-or-less the "Hello World" stage of any bit of hardware hacking.

Blinking the BeagleBone's heartbeat LED using the iPhone

I've been having a great back-and-forth on Twitter with David House while hacking away at this project; he's working away as I type to get this running on the Raspberry Pi. It's been a lot of fun.

If you want to replicate this on the BeagleBone you should first download and build the PeerTalk library, and then build and deploy the iOS and OS X example applications and get those up and running.

Then connect up and boot your BeagleBone. You'll need to power the board using a mains adapter: when you're compiling things it's possible you'll be drawing enough current that your computer will turn off the USB port to protect itself, and as a result power down your BeagleBone. I had this happen to me a couple of times before I finally dug a mains adapter out of my office drawer. However, since you're powering the board from the mains, you'll also have to connect an Ethernet cable so that you can ssh root@beaglebone.local and log into the board over the network.

1. Go ahead and login to your BeagleBone as root.

2. Download, build and install libusb. Version 1.0.9 builds, links and installs okay.

3. Download, build and install cmake, which you'll need to build usbmuxd later. You'll need to grab the latest nightly Git checkout, as older release versions don't build due to problems with the stock libbz2 compression library on the BeagleBone.

4. We also need libplist; however, this is available as part of the package management system on Ångström Linux, so all you need to do to install it is type opkg install libplist-dev at the prompt.

5. Download, build and install usbmuxd. Version 1.0.8 builds, links and installs okay, although you may need to use ccmake and configure by hand, rather than using cmake, as it can't seem to find the libusb include files that got installed into /usr/local.

6. Create a usbmux user

   groupadd -r usbmux -g 114
   useradd -r -g usbmux -d / -s /sbin/nologin -c "usbmux user" -u 114 usbmux

7. As the BeagleBone doesn't have syslog turned on by default, and you'll need it for debugging, turn on syslogd using the relevant script in /etc/init.d.

8. Run up the usbmuxd daemon by typing usbmuxd -v -v at the prompt.

9. Plug your iPhone into the (host side) USB port on your BeagleBone; you should see some debug output scrolling by in /var/log/messages.

10. Download David House's peertalk-python and its dependencies.

11. On your iPhone start the PeerTalk client for iOS.

12. Start the python client on the BeagleBone by typing python ./ at the prompt.

Type in a message at the prompt, and you should see something like this...

Bi-directional communication between the iPhone and the BeagleBone via USB
From there it's pretty trivial to replicate my "Hello World" example, just by hacking around with David's code and toggling the heartbeat LED when the BeagleBone receives any messages.

    def run(self):
        # Each PeerTalk frame starts with a 16-byte header of four
        # big-endian unsigned 32-bit ints; the fourth is the payload size.
        # (This method assumes "import os, struct" at the top of the file.)
        framestructure = struct.Struct("! I I I I")
        ledOn = 'echo 1 > /sys/class/leds/beaglebone::usr0/brightness'
        ledOff = 'echo 0 > /sys/class/leds/beaglebone::usr0/brightness'
        i = 0
        while self._running:
            msg = self._psock.recv(16)
            if len(msg) > 0:
                frame = framestructure.unpack(msg)
                size = frame[3]
                msgdata = self._psock.recv(size)
                print "Received: %s" % msgdata
                # Toggle the heartbeat LED on each message received
                if i == 0:
                    os.system(ledOn)
                    i = 1
                else:
                    os.system(ledOff)
                    i = 0
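For reference, the 16-byte header that the run() loop unpacks can be sketched as a standalone round-trip. The field names here (version, type, tag, payload size) are my reading of the PeerTalk source, so treat them as assumptions rather than the canonical layout:

```python
import struct

# PeerTalk-style frame header: four big-endian unsigned 32-bit ints.
# Field names are assumptions from reading the PeerTalk source.
HEADER = struct.Struct("! I I I I")

def pack_frame(version, ftype, tag, payload):
    """Prefix a payload with a PeerTalk-style 16-byte header."""
    return HEADER.pack(version, ftype, tag, len(payload)) + payload

def unpack_header(data):
    """Return (version, ftype, tag, payload_size) from a frame."""
    return HEADER.unpack(data[:HEADER.size])
```

Unpacking the first 16 bytes and then reading payload-size more bytes is exactly what the loop above is doing with its two recv() calls.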

Which gets you to this point...

Toggling the BeagleBone heartbeat LED with my iPhone over USB.
Which is pretty much where I've reached right now. The next step is a proper application on the iOS end of things with more generic control of the BeagleBone's header pins, and a more flexible Python backend on the BeagleBone itself...
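As a first step towards that more flexible backend, the echo-to-sysfs shell commands can be wrapped in a small Python helper. This is a minimal sketch: the LED path assumes the stock Ångström image, and on a real board you may also need to set the LED's trigger attribute to none so the default heartbeat trigger doesn't overwrite your writes.

```python
import os

def write_sysfs(path, value):
    """Write a value to a sysfs attribute file (e.g. LED brightness)."""
    with open(path, "w") as f:
        f.write(str(value))

def toggle_led(state, led="/sys/class/leds/beaglebone::usr0"):
    """Flip an LED via its sysfs brightness file; returns the new state."""
    new_state = 0 if state else 1
    write_sysfs(os.path.join(led, "brightness"), new_state)
    return new_state
```

The same helper generalises to GPIO header pins, which live under /sys/class/gpio on the same kernel interface.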

Update: David House has managed to get everything up and working on the Raspberry Pi. The only changes from the above are that you should grab libplist using apt-get rather than opkg, and, since you won't be logged in as root, you should remember to sudo usbmuxd -v -v when you start the USB daemon. Apart from that, you should be good to go...

David House (@davidahouse)
25/08/2012 20:22
Video of iPhone controlling LED on Raspberry Pi.

Controlling a LED connected to a GPIO pin on the Raspberry Pi with an iPhone

Update: Come along to my workshop in London on the 8th of October and get your hands dirty playing with iPhone, Arduino and now the BeagleBone and Raspberry Pi. Get 10% off the Early Bird ticket price today only with code BEAGLE10.

Monday 8th October 2012
Hoxton Hotel, London
Early Bird Price: £499 (until 1st Sept.)
Normal Price: £699
Save 10% with code BEAGLE10

Update: David House has just updated his Github repository with a better description of what he did to get the iPhone to control the Raspberry Pi's GPIO pins.
David House (@davidahouse)
26/08/2012 13:40
@aallan I just updated my github repo with a better description with attributions. Had a blast working with you...

Controlling a LED connected to a GPIO pin on the Raspberry Pi with an iPhone

PeerTalk and the BeagleBone

Earlier today I came across an excellent bit of wizardry by Rasmus Andersson called PeerTalk. It's an Objective-C library that lets you communicate between your iPhone and your Mac over the USB dock cable using TCP sockets.

PeerTalk Demo

My immediate thought was that if this really only depended on having USB host mode capability at the far end, the same mechanism should be able to be used to talk to something like the BeagleBone, or the Raspberry Pi, not just your Mac. This would allow you to connect your phone directly to the board and drive hardware directly, a lot like the Redpark cable but bypassing Apple's External Accessory framework.

Yup, this is going to be useful...
So I started digging around inside the source code to see if it depended on anything that was going to be specific to OS X. It became apparent that PeerTalk was mostly some really nice socket code sitting on top of the USB Multiplex Daemon (usbmuxd). This bit of software is in charge of talking to your iPhone over USB and coordinating access to its services by other applications; effectively, it's what iTunes and Xcode use to talk to your phone when you plug it into your Mac's USB port.

So any device that wants to talk to the iPhone using this method needs usbmuxd. Fortunately for me, a number of people have worked out how to talk to the iPhone from Linux, and there is a working usbmuxd for Linux.

The BeagleBone
As well as a few other dependencies which aren't present in the stock Ångström Linux distribution on my BeagleBone, or even packaged via opkg, building usbmuxd on my BeagleBone requires libusb and cmake. So before building usbmuxd I had to build cmake, which meant resolving some problems with the stock compression libraries that shipped with Ångström.

However, several hours later, after enough waiting around for software to build to convince me that, before doing any serious development on the BeagleBone, I really had to build an ARMv7 toolchain on my Mac and cross-compile things instead of building them directly on the board...

The iPhone talking directly to my BeagleBone using PeerTalk
...I managed to get a simple "hello" from my iPhone to the BeagleBone, and then via screen to my Mac using port forwarding and that old standby, telnet.

While I was hacking away on getting this working, I wasn't alone. David House was looking down some of the same back alleyways to get PeerTalk talking to his Raspberry Pi, and we batted the problem back and forth on Twitter while waiting for code to compile well into the night...

The next step is to put together a client on the BeagleBone, sitting on top of usbmuxd, that'll talk natively to PeerTalk on iOS. Since I've got the source code for both ends, this isn't going to be too hard. I'll probably put something together in Python.
More soon...

Update: Following on from this I pushed forward till I managed to blink the BeagleBone's heartbeat LED from the iPhone which is, more-or-less, the "Hello World" stage of any hardware hack...

Sunday, August 19, 2012

A drawer full of phones...

I had a huge clear out of my home office this weekend, including the drawer full of old mobile phones. A free copy of the second edition of Learning iOS Programming to the first person that can successfully identify them all...
A collection of mobile phones, but the iPhone changed everything...?
Answers accepted on the relevant thread on Google+ only....

Thursday, August 16, 2012

iOS Sensors and External Hardware Masterclass

I'm going to be down in London for O'Reilly's Strata conference in October, keynoting on the Tuesday morning, talking about the hidden data that follows us around and how I've leveraged that for my own advantage. I'll also be talking, along with my colleague Zena Wood from Exeter, about People Watching with Machine Learning and using modern smart phones, like the iPhone, to do interesting sociology. It should be good.

However, I'll be kicking around town for most of the following week, talking to various people. But I had a gap, a big gap, at the beginning of that week. So I've decided to try an interesting experiment...

I've often argued that both the increasingly rich sensor suite and the ability to easily connect today's smart phones to external hardware, and sensors, make them an amazing lever on the world. It's something I've focused on a lot over the last year or so.

A conversation with Dale Dougherty and Alasdair Allan

While I've run a lot of conferences and workshops over the years, it's always been on someone else's dime. Time to put my money where my mouth is: I'm going to run a workshop.

In fact I'm going to run a master-class on iOS Sensors and External Hardware. This is going to be hardware hacking for iOS programmers. It's going to be hands-on: bring your Mac, bring your iPhone, and make sure you've got Xcode set up so that you can deploy apps onto your device. It'll be a small group, no more than twenty, and I'll be doing a bunch of live coding.

We'll start the day talking about the internal sensors, the accelerometer, magnetometer and gyroscope, and how to combine these into sophisticated applications. Here you'll really get the benefit of my physics background, because I can take you under the user-friendly skin Apple have put on top of these sensors as part of the iOS SDK, and hopefully give you a decent idea of how they work and what their limitations are.

Then we'll move on to talk about how to extend the reach of the on-board sensors by connecting your iPhone to external hardware. We'll look at how to connect the Arduino micro-controller platform to your iOS device, and build simple applications to control the board and gather measurements from sensors connected to it, directly from iOS. This course will give you the background to build your own applications independently, using the hottest location-aware technology yet for any mobile platform.

iPad controlling an Arduino board via the Redpark cable
You'll take away with you an Arduino Uno board, a Redpark TTL Serial Cable for iOS, and everything you need to connect your iPhone to your new micro-controller. You'll also receive a copy of my books Basic Sensors in iOS and iOS Sensor Apps with Arduino.

The workshop will be on Monday the 8th of October at the Hoxton Hotel right next to London's Silicon Roundabout. I've opened registration and I'm offering 30% off the ticket price until the 1st of September. Sign up early, and sign up often.

There's more about exactly what I'll be covering on the workshop's own website. I've done similar things on smaller scales in the past, but this should be a lot of fun. Hope to see at least some of you there...

Monday 8th October 2012
Hoxton Hotel, London
Early Bird Price: £499 (until 1st Sept.)
Normal Price: £699

Wednesday, August 15, 2012

Mining the astronomical literature

This post was originally published on the O'Reilly Radar.

There is a huge debate right now about making academic literature freely accessible and moving toward open access. But what would be possible if people stopped talking about it and just dug in and got on with it?

NASA's Astrophysics Data System (ADS), hosted by the Smithsonian Astrophysical Observatory (SAO), has quietly been working away since the mid-'90s. Without much, if any, fanfare amongst the other disciplines, it has moved astronomers into a world where access to the literature is just a given. It's something they don't have to think about all that much.

The ADS service provides access to abstracts for virtually all of the astronomical literature. But it also provides access to the full text of more than half a million papers, going right back to the start of peer-reviewed journals in the 1800s. The service has links to online data archives, along with reference and citation information for each of the papers, and it's all searchable and downloadable.
Number of papers published in the three main astronomy journals each year. CREDIT: Robert Simpson
The existence of the ADS, along with the arXiv pre-print server, has meant that most astronomers haven't seen the inside of a brick-built library since the late 1990s.

It also makes astronomy almost uniquely well placed for interesting data mining experiments, experiments that hint at what the rest of academia could do if they followed astronomy's lead. The fact that the discipline's literature has been scanned, archived, indexed and catalogued, and placed behind a RESTful API makes it a treasure trove, both for hypothesis generation and sociological research.
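To give a flavour of what working against such an API looks like, here's a hypothetical sketch of building a search query for papers whose abstracts mention a term. The endpoint and parameter names here are illustrative assumptions, not the documented ADS interface:

```python
try:
    from urllib.parse import urlencode   # Python 3
except ImportError:
    from urllib import urlencode         # Python 2

# Hypothetical endpoint and parameter names, for illustration only.
SEARCH_URL = "https://adsabs.example.org/search"

def build_query(term, rows=20):
    """Build a search URL for papers whose abstracts mention `term`."""
    params = {"q": 'abs:"%s"' % term, "fields": "year,title", "rows": rows}
    return SEARCH_URL + "?" + urlencode(sorted(params.items()))
```

Fetching that URL, paging through the results, and caching them locally is all it takes to build the sort of corpus the hack-day team worked from.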

For example, the .Astronomy series of conferences is a small workshop that brings together the best and the brightest of the technical community: researchers, developers, educators and communicators. Billed as "20% time for astronomers," it gives these people space to think about how new technologies affect both how research is done and how it is communicated, to their peers and to the public.

It should perhaps come as little surprise that one of the more interesting projects to come out of a hack day held as part of this year's .Astronomy meeting in Heidelberg was work by Robert Simpson, Karen Masters and Sarah Kendrew that focused on data mining the astronomical literature.

 The team grabbed and processed the titles and abstracts of all the papers from the Astrophysical Journal (ApJ), Astronomy & Astrophysics (A&A), and the Monthly Notices of the Royal Astronomical Society (MNRAS) since each of those journals started publication — and that's 1827 in the case of MNRAS.

 By the end of the day, they'd found some interesting results showing how various terms have trended over time. The results were similar to what's found in Google Books' Ngram Viewer.
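The counting behind those trend plots is simple enough to sketch. Assuming the corpus has been reduced to (year, abstract) pairs, the fraction of each year's papers mentioning a term is just:

```python
from collections import defaultdict

def term_trend(papers, term):
    """papers: iterable of (year, abstract_text) pairs.
    Returns {year: fraction of that year's papers mentioning term}."""
    totals = defaultdict(int)
    hits = defaultdict(int)
    needle = term.lower()
    for year, abstract in papers:
        totals[year] += 1
        if needle in abstract.lower():
            hits[year] += 1
    return {year: hits[year] / float(totals[year]) for year in totals}
```

Normalising by the yearly paper count, rather than plotting raw mentions, is what makes terms comparable across a corpus whose volume grows every year.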
The relative popularity of the names of telescopes in the literature. Hubble, Chandra and Spitzer seem to have taken turns in hogging the limelight, much as COBE, WMAP and Planck have each contributed to our knowledge of the cosmic microwave background in successive decades. References to Planck are still on the rise. CREDIT: Robert Simpson.
After the meeting, however, Robert took his initial results further, exploring his newly gathered corpus of data on the astronomical literature. He's tried various visualisations of the data, including word matrices for related terms and for various astro-chemistry terms.
Correlation between terms related to Active Galactic Nuclei (AGN). The opacity of each square represents the strength of the correlation between the terms. CREDIT: Robert Simpson.
He's also taken a look at authorship in astronomy and is starting to find some interesting trends.
Fraction of astronomical papers published with one, two, three, four or more authors. CREDIT: Robert Simpson
You can see that single-author papers dominated for most of the 20th century. Around 1960, we see the decline begin, as two- and three-author papers begin to become a significant chunk of the whole. In 1978, multi-author papers became more prevalent than single-author papers.
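Reproducing the breakdown shown in that figure from the raw data is straightforward; a sketch, assuming each paper has already been reduced to a count of its authors:

```python
from collections import Counter

def author_count_fractions(author_counts):
    """Given per-paper author counts, return the fraction of papers
    with exactly 1, 2, or 3 authors, and with 4 or more (binned)."""
    binned = Counter(min(n, 4) for n in author_counts)
    total = float(len(author_counts))
    return {k: binned[k] / total for k in (1, 2, 3, 4)}
```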
The number of "active" research astronomers compared to the number of papers published each year (across all the major journals). CREDIT: Robert Simpson.
Here we see that people begin to outpace papers in the 1960s. This may reflect the fact that as we get more technical as a field, and more specialised, it takes more people to write the same number of papers, which is a sort of interesting result all by itself.

Behind the project and what lies ahead

I recently talked with Rob about the work he, Karen Masters, and Sarah Kendrew did at the meeting, and the work he's been doing since with the newly gathered data.

What made you think about data mining the ADS?

Robert Simpson: At the .Astronomy 4 Hack Day in July, Sarah Kendrew had the idea to try to do an astronomy version of BrainSCANr, a project that generates new hypotheses in the neuroscience literature. I've had a go at mining ADS and arXiv before, so it seemed like a great excuse to dive back in.

Do you think there might be actual science that could be done here?

Robert Simpson: Yes, in the form of finding questions that were unexpected. With such large volumes of peer-reviewed papers being produced daily in astronomy, there is a lot being said. Most researchers can only try to keep up with it all — my daily RSS feed from arXiv is next to useless, it's so bloated. In amongst all that text, there must be connections and relationships that are being missed by the community at large, hidden in the chatter. Maybe we can develop simple techniques to highlight potential missed links, i.e. generate new hypotheses from the mass of words and data.

Are the results coming out of the work useful for auditing academics?

Robert Simpson: Well, perhaps, but that would be tricky territory in my opinion. I've only just begun to explore the data around authorship in astronomy. One thing that is clear is that we can see a big trend toward collaborative work. In 2012, only 6% of papers were single-author efforts, compared with 70+% in the 1950s.
The average number of authors per paper since 1827. CREDIT: Robert Simpson.
We can measure how large groups are becoming, and who is part of which groups. In that sense, we can audit research groups, and maybe individual people. The big issue is keeping track of people through variations in their names and affiliations. Identifying authors is probably a solved problem if we look at ORCID.
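A crude first pass at that name-variation problem is to normalise everything to a "lastname, first-initial" key, which collapses many common variations. This is purely an illustrative sketch, and, as Robert notes, it still conflates genuinely distinct people with similar names:

```python
def normalise_author(name):
    """Collapse 'Robert Simpson', 'Simpson, Robert' and 'Simpson, R.'
    to the same 'simpson, r' key. Crude: ignores suffixes, accents,
    and genuinely ambiguous names."""
    name = name.strip()
    if "," in name:
        last, first = [p.strip() for p in name.split(",", 1)]
    else:
        parts = name.split()
        last, first = parts[-1], " ".join(parts[:-1])
    initial = first[:1].lower() if first else ""
    return "%s, %s" % (last.lower(), initial)
```

An identifier scheme like ORCID, mentioned above, sidesteps this entirely by giving each author a unique ID.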

What about citations? Can you draw any comparisons with h-index data?

Robert Simpson: I haven't looked at h-index stuff specifically, at least not yet, but citations are fun. I looked at the trends surrounding the term "dark matter" and saw something interesting. Mentions of dark matter rise steadily after it first appears in the late '70s.
Comparing the term "dark matter" with a few other related terms: "cosmology," "big bang," "dark energy," and "wmap." You can see cosmology has been getting more popular since the 1990s, and dark energy is a recent addition. CREDIT: Robert Simpson.
In the data, astronomy becomes more and more obsessed with dark matter — the term appears in 1% of all papers by the end of the '80s and 6% today. Looking at citations changes the picture. The community is writing papers about dark matter more and more each year, but they are getting fewer citations than they used to (the peak for this was in the late '90s). These trends are normalised, so the only recency effect I can think of is that dark matter papers take more than 10 years to become citable. Either that or dark matter studies are currently in a trough for impact.

Can you see where work is dropped by parts of the community and picked up again?

Robert Simpson: Not yet, but I see what you mean. I need to build a better picture of the community and its components.

Can you build a social graph of astronomers out of this data? What about (academic) family trees?

Robert Simpson: Identifying unique authors is my next step, followed by creating fingerprints of individuals at a given point in time. When do people create their first-author papers, when do they have the most impact in their careers, stuff like that.

What tools did you use? In hindsight, would you do it differently?

Robert Simpson: I'm using Ruby and Perl to grab the data, MySQL to store and query it, JavaScript to display it (Google Charts and D3.js). I may still move the database part to MongoDB because it was designed to store documents. Similarly, I may switch from ADS to arXiv as the data source. Using arXiv would allow me to grab the full text in many cases, even if it does introduce a peer-review issue.

What's next?

Robert Simpson: My aim is still to attempt real hypothesis generation. I've begun the process by investigating correlations between terms in the literature, but I think the power will be in being able to compare all terms with all terms and looking for the unexpected. Terms may correlate indirectly (via a third term, for example), so the entire corpus needs to be processed and optimised to make it work comprehensively.
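One way to make that all-terms-against-all-terms comparison concrete is to correlate binary occurrence vectors: for each term, a 0/1 flag per paper, then a Pearson correlation between the flags. A minimal sketch (nothing here is Robert's actual code):

```python
import math

def occurrence_vector(papers, term):
    """0/1 flag for each paper: does its text mention the term?"""
    needle = term.lower()
    return [1 if needle in text.lower() else 0 for text in papers]

def pearson(xs, ys):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    syy = sum(y * y for y in ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    denom = math.sqrt((n * sxx - sx * sx) * (n * syy - sy * sy))
    return (n * sxy - sx * sy) / denom if denom else 0.0

def term_correlation(papers, term_a, term_b):
    return pearson(occurrence_vector(papers, term_a),
                   occurrence_vector(papers, term_b))
```

Running this over every pair of terms is what makes the corpus-wide optimisation necessary: the pair count grows quadratically with the vocabulary.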

Science between the cracks

I'm really looking forward to seeing more results coming out of Robert's work. This sort of analysis hasn't really been possible before. It's showing a lot of promise both from a sociological angle, with the ability to do research into how science is done and how that has changed, but also ultimately as a hypothesis engine — something that can generate new science in and of itself. This is just a hack day experiment. Imagine what could be done if the literature were more open and this sort of analysis could be done across fields?

Right now, a lot of the most interesting science is being done in the cracks between disciplines, but the hardest part of that sort of work is often trying to understand the literature of the discipline that isn't your own. Robert's project offers a lot of hope that this may soon become easier.

Friday, August 03, 2012

They promised us flying cars

This article was originally published on the O'Reilly Radar.

We may be living in the future, but it hasn't entirely worked out how we were promised. I remember the predictions clearly. The twenty-first century was supposed to be full of self-driving cars, personal communicators, replicators and private spaceships.

Google has received Nevada's first autonomous-designated license plate. Credit: Nevada Department of Motor Vehicles

Except of course all that has come true. Google just got the first license to drive its cars entirely autonomously on public highways, Apple came along with the iPhone and changed everything, three-dimensional printers have come out of the laboratories and into the home, and in a few short years, from a standing start, Elon Musk and SpaceX have achieved what might otherwise have been thought impossible. Late last year they launched a spacecraft and returned it to Earth safely. Then they launched another, successfully docked with the International Space Station, and again returned it to Earth.

The SpaceX Dragon capsule is grappled and berthed to the Earth-facing port of the International Space Station's Harmony module at 12:02 p.m. EDT, May 25, 2012. Credit: NASA/SpaceX

Right now there is a generation of high-tech tinkerers breaking the seals on proprietary technology and prototyping new ideas, leading to a rapid growth in innovation. This generation, who are building open hardware instead of writing open software, seem to have come out of nowhere. Except of course they haven't. Promised a future they couldn't have, they've started to build it. The only difference between them and Elon Musk, Jeff Bezos, Sergey Brin, Larry Page and Steve Jobs is that those guys got to build bigger toys than the rest of us.

The dotcom billionaires are regular geeks just like us. They might be the best of us, or sometimes just the luckiest, but they grew up with the same dreams, and they've finally given up waiting for the government to build the future they were promised when they were kids. They're going to build it for themselves. The thing driving the Maker movement, and that generation of high-tech tinkerers, is the same thing that's driving the next space race.

Unlike the old space race, driven by national pride, and the hope that we could run fast enough in place so that we didn't have to start a nuclear war, this space race is being driven in part by personal pride and ambition, but also by childhood dreams.

A lot of big business seems confused by the open hardware movement; they don't understand it, and don't think it's worth their while to make exceptions and cater for it. Even the so-called smart money doesn't seem to get it: I've heard moderately successful venture capitalists from the Valley say that they "...don't do hardware." Those guys are about to lose their shirts. Makers are geeks like you and me who have decided to go ahead and build our own future, because the big corporations and the major governments have so singularly failed to do it for us. Is it any surprise that dotcom billionaires are doing the same? Is it any surprise that the future we build is going to look a lot like the future we were promised, and not so much like the future we'd been heading towards?

What do you do when you've changed the world? You do it again...

Thursday, August 02, 2012

The new space race

In a few short years, and from a standing start, Elon Musk and SpaceX have achieved what might otherwise have been thought impossible. Late last year they launched a spacecraft and returned it to Earth safely. Then they launched a second, which successfully docked with the International Space Station (ISS) and again returned safely to Earth.

"The impact of this adventure on the minds of men everywhere, who are attempting to make a determination of which road they should take..." — John F. Kennedy

Working relatively independently of NASA and the other government agencies, and building their technology stack from the ground up, SpaceX has in under a decade demonstrated Apollo-era capability. However, their Dragon capsule is no Apollo: it's a flexible space transport system built with modern technology, whose full capabilities have yet to be demonstrated.

The SpaceX Dragon spacecraft on the end of the Canadarm2.

SpaceX is the first commercial company to send a spacecraft into orbit and recover it successfully, something that only three governments - the United States, Russia and China - have ever done, and with the retirement of the US Space Shuttle it has a capability that only two governments - Russia and China - now possess. The European ATV and Japanese HTV have no return capability and burn up in the Earth's atmosphere, and the US currently has no manned space capability.

Dragon Spacecraft in Ocean After Splashdown from SpaceX

The SpaceX Dragon spacecraft floats in the Pacific after returning to Earth from the International Space Station (ISS). Credit: Mike Altenhofen/SpaceX

With the retirement of the Space Shuttle, the Dragon is the only spacecraft in the world capable of returning significant cargo from the station as the Russian Soyuz has only minimal cargo capacity.

Space Stations

The first generation of space stations - the Soviet Salyut and Almaz stations, along with the American Skylab - were all monolithic designs. It wasn't really until Mir flew with a modular design that we entered the modern era. Mir was the only second generation station to fly, with the US sitting on the sidelines resting on its Lunar laurels, and the Europeans seemingly uninterested in manned spaceflight.

The ISS and the docked Space Shuttle Endeavour, taken by Expedition 27 crew member Paolo Nespoli from the Soyuz TMA-20 following its undocking on May 23, 2011. It was the first-ever image of a space shuttle docked to the International Space Station. Endeavour at left. European ATV cargo carrier at right. Credit: NASA/Paolo Nespoli

Today's ISS is a bastard child of the follow-on station projects from the various countries involved in the space race: the Soviet/Russian Mir-2, the American Freedom project, which included the Japanese Kibō Laboratory, and the European Columbus space station. None of these came to fruition separately, mostly due to budgetary constraints but also due to politics.

The first component of the ISS was launched in 1998, and construction began while the Russian Mir station was still in orbit. The last manned mission to Mir was a privately funded Soyuz flight by MirCorp in April 2000, which carried out repair work in the hope of proving that the station could be made safe. There was no return to Mir, however, and it was deorbited the following year, after the permanent occupation of the ISS began in November 2000.

OPSEK and Gateway

Despite the station being declared "complete", two more modules destined for the ISS are due for launch over the next couple of years, both from Russia. The Nauka module will serve as Russia's primary research module on the ISS, replacing the current Pirs module. When that happens Pirs could become the first permanent ISS module to be decommissioned, and would be destroyed during atmospheric re-entry. The Node Module is intended as the primary core of the Russian OPSEK station. Initially attached to the ISS, it will be detached along with Nauka and some of the other Russian modules before the ISS is decommissioned and deorbited, and used as the basis of a new station.

Recent proposals by Boeing call for some of the "left over" parts of the ISS program that are still on the ground, notably the unlaunched Node 4, to be used to build an Exploration Gateway Platform located at one of the Earth-Moon Lagrange points. It would serve as a launch platform for deep space exploration, a robotic relay station for Moon rovers, a telescope servicing facility, and a deep space practice platform outside the Earth's protective radiation belts. The new platform would be assembled at the ISS before being boosted towards the Lagrange point. If this came about it could drastically cut the cost of future manned Lunar, Mars or NEO missions, and would represent the first manned presence beyond low Earth orbit since the Apollo program ended in the 1970s.

Commercial operations

With the launch of the first Cygnus spacecraft scheduled for October or November, the number of commercial companies with access to low Earth orbit will grow to two, although SpaceX will remain the only company with return capability. There is no immediate expectation that the US government, and NASA, will regain any sort of direct access, so commercial operators will remain the United States' only route to space for the foreseeable future.

The Bigelow Genesis-I Space Station in orbit

The decommissioning of the ISS, now scheduled for 2020, would leave no US government presence in space. Of course, by then Bigelow Aerospace, already with two pathfinder launches under its belt (Genesis I and Genesis II), plans to have a human-habitable commercial station online. Tentative launch dates for the first modules are around 2014 and 2015, and Bigelow has reserved a 2014 launch slot on SpaceX's Falcon 9, although they have not yet announced the payload.

Unlike the ISS and previous stations, Bigelow's station technology is potentially game changing. Based on the TransHab technology and patents, which Bigelow bought from NASA after the US Congress directed the agency to discontinue work on the module, Bigelow's inflatable modules will provide large usable volumes for a much smaller launch weight than traditional hard-shell modules.
By the time Bigelow is ready to launch its first station, SpaceX should have a fully man-rated Dragon capsule, and possibly a crewed launch to the ISS under its belt. Earlier this month Bigelow and SpaceX teamed up to jointly market to international customers crew transport on the SpaceX Falcon 9/Dragon up to the Bigelow BA330 space facility.

Bigelow has agreements with seven sovereign nations to utilize on-orbit facilities of the commercial space station: United Kingdom, Netherlands, Australia, Singapore, Japan, Sweden and the United Arab Emirate of Dubai.

Of course, who knows what Jeff Bezos and his skunk-works company Blue Origin are doing behind their "cone of silence". Beyond their initial test flight back in 2006 we heard very little out of the company until the beginning of September last year, when they reported the loss of their second test vehicle during a developmental test at Mach 1.2 at an altitude of 45,000 ft. I'm not sure most people were aware they were testing at those altitudes, at least not at the time.

Then there is Excalibur Almaz who are now planning to take customers to the Moon, with a ticket price of $100 million a seat.

History of the Soviet Almaz military program on which the Excalibur Almaz technology is based.

The company relies on the use of decommissioned Salyut-class spacecraft which Excalibur Almaz purchased from Russia. They currently own four reusable reentry vehicles and two station modules, similar to components of the Mir station and the currently flying Zarya module attached to the ISS.

Close to home

Somewhat overshadowed by SpaceX and Dragon, Virgin Galactic has announced that the FAA has granted an experimental launch permit for its sub-orbital SpaceShipTwo and air carrier WhiteKnightTwo.

SpaceShipTwo flying with crew for the first time, during a dress rehearsal flight for its first free glide flight in 2010. Credit: Virgin Galactic/Scaled Composites

With this permit in hand, Scaled Composites and Virgin Galactic are able to press ahead with the testing program and carry out rocket-powered test flights of the new craft.

The long duration future

The new commercial space companies have ambitious plans. SpaceX's Red Dragon mission, which may launch as early as 2018, would use a modified Dragon capsule to carry heavy instrumentation for a soft landing on the Martian surface, as a precursor to a manned mission.

Artist's rendition of a Dragon spacecraft using its SuperDraco thrusters to land on Mars. Credit: SpaceX.

Almost complementary to the push towards manned spaceflight from the commercial sector is the arrival, earlier in the year, of Planetary Resources, with a goal of developing a sustainable (and profitable) robotic asteroid mining industry.

The new space race

The US and Russian governments aren't planning any novel endeavours in space, and it seems the Chinese are determined to tread the path that the US and Russia have taken before them. Their own station seems to be a mix-and-match copy of the historical Russian programme, although the capability of their Shenzhou spacecraft to leave its orbital module behind means that their station might grow incrementally and much more rapidly than the ISS.

The really interesting development work happening in the space industry right now seems to be going on in the private sector. The new space race has begun, and it's between SpaceX, Blue Origin, Orbital Sciences and the other commercial companies. The goal isn't national pride; it's part personal pride and ambition, as most of these companies were founded by individuals, and part profit motive.