Saturday, February 23, 2013

Distributed Network Data

My latest book, my first not about iOS and writing code for the iPhone and iPad, has just gone to press. It's called Distributed Network Data, and it's hardware hacking for data scientists. It's the book of the Data Sensing Lab, and it arrives just in time for this year's O'Reilly Strata in Santa Clara, which starts tomorrow.

This book is intended for data scientists who want to learn how to work with external hardware. It assumes some basic computing and programming knowledge, but no real expert knowledge. From there the book walks you through building your own distributed sensor network to collect, analyse, and visualise real-time data about our environment.

If you're a data scientist, or a visualisation person, interested in getting started with hardware and collecting your own data, this is the book for you. You can use the code AUTHD to get 40% off print books and 50% off ebooks and videos when you buy the book directly from O'Reilly.

Thursday, February 21, 2013

Making money faster than you can type

The 3Doodler is a 3D printer, but it's a pen. This takes 3D printing and turns it on its head...

In fact the 3Doodler rejects quite a lot of what most people would consider necessary for it to be called a 3D printer. There is no three-axis control, and in fact no software; you can't download a design and print an object. It strips 3D printing back to basics.
What there is, what it allows you to do, is make things. This is the history of printing going in reverse, it's as if Gutenberg's press was invented first, and then somebody came along afterwards and invented the fountain pen.

While it looks simple, they've obviously overcome some serious technological difficulties to get it working. One of the things that's hard to do on 3D printers, at least hard to do well, is unsupported structures.
As anyone who owns a 3D printer will tell you, the cooling time of the plastic as it leaves the print head is crucial if you want to print unsupported structures. Too hot and the structure sags and runs; too cold and it just plain doesn't work at all. From their videos they seem to have cracked the problem; building a free-standing structure seems to be easy and well within the capabilities of the pen.
It also takes 3mm ABS and PLA as its “ink,” the same stuff used by most hobbyist 3D printers. I've got spools of this stuff hanging around my house which I use in my own printer. But unlike my printer, which cost just under a thousand dollars, the 3Doodler costs just $75.
It doesn't have the same capabilities, but that's the difference between a printing press and a pen. It has different capabilities, ones a "normal" 3D printer doesn't have. It's not a cheap alternative, it's a different thing entirely.
I'm currently watching the 3Doodler climb towards its first million dollars on Kickstarter, and when I say its first million I mean that: with over 30 days still to go, the campaign has today gone viral and made them the best part of that million. This is the next Pebble. The next Kickstarter success story.
They've tapped into a previously untapped market: people who wanted a 3D printer but couldn't afford one, and people who see the obvious potential of a fountain pen over a printing press, for both art and engineering.
The guys behind the 3Doodler made $60,000 while I wrote this post, and my hat is off to them. Because it's not often someone comes up with an idea this good.
I'm going to be writing a series of posts on hardware startups for the Radar over the course of the next few months, and rest assured I'll come back to the 3Doodler. But not until I can type faster than they can make money.

Wednesday, February 13, 2013

The black rectangle won't last as long as the beige box

It looks like putting #Linux on #Microsoft's new #Surface is going to be an uphill struggle. I was actually expecting that...

The era of the commodity beige box is coming to an end, and the days of the general purpose computer are almost over. Most people never needed or wanted a general purpose computer, and they're going to be happy with more limited devices optimised for a single, or a few, purposes. So long as those devices just work.

As a scientist I've benefited from being able to take mass-produced PCs and put them on desks very cheaply. The amount of compute power we've had access to as a result has meant that money which would otherwise have been spent on expensive high-end workstations could be spent elsewhere.

Those of us who need general purpose computing, the designers, developers, and scientists, are going to have to go out and buy increasingly expensive niche machines, effectively old-fashioned workstations: high-end computing platforms that the general population just doesn't need on the desk or in the pocket.

The fact you can't install #Linux on the new #Surface is just the start of what is going to be an increasingly obvious trend. It's just a symptom. The things that are open and the things that are closed are changing. Time to wake up and realise that. Being able to install #Linux on your PC isn't important any more.

I think a lot of the web and mobile people are making the same mistake today that Nokia made five years ago: Nokia was all about the hardware and wasn't watching the software closely enough...

Today people are all about the software and aren't watching the hardware closely enough. Today's mobile phone, the black rectangle with at most a single button, is a transitional device. Don't get too comfortable with it, and don't stop thinking about innovation. Because the black rectangle won't last as long as the beige box.

Sunday, February 03, 2013

Predicting a Singularity

I think a lot about the future, and because of that I've gained something of a reputation for making good predictions. This is a characteristic I share with prophets, messiahs and other ne'er-do-wells. I'm not entirely sure what to think about that.

However one of the problems with making predictions about the future, the main problem, is that it's actually not that hard to predict what'll happen next year. Although for some reason this doesn't seem to help many of the major analysts whose job it is to make such forecasts. Conversely it's also not that hard to make a prediction for the far future, as "…any sufficiently advanced technology is indistinguishable from magic."1 Why is this a problem? Well, if those predictions are easy, then making predictions in the sweet spot, far enough ahead to get you ahead of your competition, yet close enough that you'll still be around to do something about them, is actually almost impossible.

But how far in the future is the sweet spot? The strange thing is that this actually changes: a century ago it was twenty years, or even thirty, but twenty or thirty years ago it was just ten years. Today it's probably five, or less. The rate of technological progress is accelerating, and with it the amount of change we'll experience during our lives is also growing. The time it takes for new technologies to emerge, become mainstream, become dated and then obsolete is falling almost exponentially.

At this point, having angered both business analysts and science fiction writers, I'm going to make a small admission: both professions have the right of it, because "...the future is already here — it's just not very evenly distributed."2

While the pace of change has accelerated, it's hard to see how that can be sustained as the installed base of legacy technology grows. So far we seem to have sustained it through a kind of trickle-down: the older technology spreads out, and newer technology is dropped in at the top, eagerly seized by early adopters like myself, willing to pay the premium and experience the inevitable problems that come with all new technology, at least until the bugs are shaken out.

Despite that, I'm forced to point out that if you extrapolate the current rate of technological progress, the view that some sort of technological singularity is almost inevitable becomes hard to argue against. Unless of course there is some sort of major catastrophe, something to set us back.

Major catastrophes that could knock us back aren't hard to spot: a global pandemic, climate change, ecological disaster, super volcanoes, mega-tsunami, overpopulation, asteroid impact, a nearby supernova and of course worldwide thermonuclear war are all favourites. The threat of some of these seems to be fading, but others seem more likely today than ever. There are more, many more, too numerous and depressing to list here.

They're also wildcards, because sometimes the things that should set us back push us forward. It's certainly arguable that two major world wars, so close together, were a major causal factor in the acceleration of the pace of change that is part of our lives today.

Depending on the current news cycle I can swing violently between a horribly over-optimistic view of the future, with the inevitable rise of trans-humanism, and a bleakly pessimistic one, with the inevitable abandonment of technology and a slow slide towards narrowing horizons and the eventual extinction of the human race. Doomed as a species that turned its back on space, its world view limited to just that, a single world, with all the disasters and catastrophes that can result.

Despite this, I'll continue to try and make predictions in the sweet spot; it's fun to be proved right, and sometimes even more fun to be proved wrong.

1 Arthur C. Clarke
2 William Gibson

Saturday, February 02, 2013

You don't have to be awesome all the time...

In her post talking about the public-ness of mourning after the death of Aaron Swartz, Danah Boyd writes "...we’ve created communities connected around ideas and actions, relishing individualistic productivity for collective good. But we haven’t created openings for people to be weak and voice their struggles and demons."

Geek culture is, at least in theory, a meritocracy, and you are measured by your accomplishments. But that means the best of us, those whose work is held up as shining examples, suffer from Impostor Syndrome. Sometimes cripplingly so, even when they are accomplishing awesome things. Because awesome things sometimes look a lot less awesome from the inside, when you know the limitations, flaws and problems with what you've built and shared with the community.

But worse than that, it means that when you have your moment of weakness (and we all do), and for a while cannot contribute, cannot accomplish the day-to-day awesomeness that qualifies you as a member in good standing of the community, things can look very bleak. Because we expect the most from those of us who deliver the most, and even the great and the good can fall sometimes, and need support.

We've built a culture where it's hard to acknowledge that you don't know something, because knowing things is inextricably linked with the doing of awesome things, which in turn is linked to our stature among our peers.

For someone like me, whose career more or less relies on being on, and being seen to be on, the bleeding edge, this is painfully evident. If I'm asked "Have you heard about…?" and I have to answer "No?" you'll generally see a look of pain cross my face, something sort of like constipation. Don't worry, it's just my career flashing before my eyes…

I have no solutions to offer, only the sure and certain knowledge, which I give freely to other geeks, that you are not alone. That the great and the good amongst us suffer as well. That it's okay to be weak and not know the answer to a question. That it's okay to rest and take from others for a while. We'll still be here when you get back, and we'll still remember how awesome you are. You don't have to live your life on Internet time.