Would you have bought a personal computer in 1975?

A personal computer in 1975 cost about $2,500 in today's dollars if you bought it assembled. It didn't do much, as there was no third-party software and it only had 256 bytes of memory. (However, for an extra $1,350 in today's dollars you could upgrade it with another 1,024 bytes of memory). A few thousand units sold that year, which was a shocking success.

The personal computer was a laughably downsized version of what were then giant industrial machines, and it was hard to see at the time what it would eventually be useful for. It would be another four years before VisiCalc--arguably the first really useful thing you could do on a personal computer.

So would you have bought a personal computer in 1975?

In retrospect, many people would probably say Yes, but that's only because we now know how the technology evolved. It's much harder to be in 1975 and see how this expensive toy (which is all an Altair 8800 would ever be) would change the world.

It's this line of thinking which made me decide to buy a home 3D printer. I want to learn about this new technology not because it's useful today, but because it has such interesting potential over the next 10-20 years.

Commercial-grade 3D printers are powerful pieces of industrial equipment, and have no place or purpose in the home. Their smaller cousins have only been on the market a couple of years and are pretty much expensive toys with limited practical value.

On the other hand, a small 3D printer isn't much more mechanically complicated than an inkjet printer (in some ways it is actually simpler). There's no reason that, with enough manufacturing volume, someone couldn't sell a 3D printer for only a few hundred dollars and put one in every home.

The only reason nobody's selling millions of cheap 3D printers is that nobody knows what the average household would do with one.

I don't know if we will discover the Visicalc of 3D printers, the killer app which transforms this from expensive toy into useful tool. I don't think a lot of people in 1975 knew what the future held, either.

There was another interesting piece of industrial technology scaled down for home use which came around only a few years after the Altair 8800. Unlike the personal computer, however, not too many people today have a personal robot.

Optimistic about Carbon and Renewables

Three and a half years ago (in early 2008) I observed that the price of solar power modules had been dropping at a remarkably consistent 6% per year for 25 years, and that sometime before 2025 they would be cheaper than grid power in most places.

The exact year of grid parity depends a lot on where you live: sunny places with expensive electricity (think Hawaii or southern California) get there a lot sooner than cloudy places with cheap power (Seattle). For Minnesota, I estimated that sometime around 2015 a solar power system would pay for itself within the system's lifetime.
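The payback estimate is just compound decline. Here's a minimal sketch of the arithmetic--the 6% annual decline is the figure from the post, but the starting price ratio is a made-up illustrative number, not actual 2008 data:

```python
def years_until_parity(price_ratio, annual_decline=0.06):
    """Years until solar module prices fall below grid power,
    given the current price as a multiple of grid power and a
    steady annual percentage decline (6% per year, per the post)."""
    years = 0
    while price_ratio > 1.0:
        price_ratio *= 1.0 - annual_decline
        years += 1
    return years

# If modules cost, say, 2.5x grid power in 2008 (hypothetical figure),
# a steady 6% annual decline reaches parity 15 years later:
print(2008 + years_until_parity(2.5))  # → 2023, i.e. before 2025
```

At 6% per year, prices halve roughly every 11 years, which is why the exact parity year is so sensitive to local sunshine and electricity rates.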

That estimate is looking pretty good, at least on the price of the solar modules (this Scientific American blog has an updated version of the graph I made in 2008). If anything, the decline in photovoltaic prices may be accelerating a little--though that could just be a short-term blip.

I'm optimistic that over the next decade solar power will become economically viable in more and more places. On a cost basis alone you will start seeing a substantial increase in solar power installations. That, in turn, makes me optimistic that we will manage to transition away from greenhouse-gas-emitting sources of energy in a reasonably graceful fashion.

I may be using "optimistic" in an unusual sense. There's no doubt that the earth's climate is changing, and much of the evidence now points to a faster climate change than most scientists had predicted. There's already a lot of climate change "baked in" to the atmosphere, as carbon dioxide levels have increased over 20% just in the past 50 years. What's more, moving a large fraction of energy production to solar and other renewable sources will take decades, as it's very capital intensive to build an entirely new energy infrastructure.

But I am optimistic that the long-term trends are in place to create a more sustainable energy system and eventually reduce or eliminate net emission of greenhouse gases. It will take decades. Future historians may see the 21st century's energy revolution as just as important as the industrial revolution in the 19th century or the information revolution in the 20th.

In the meanwhile, global climate change will continue. Sea levels are likely to rise (maybe a lot), storms will get more intense, and a lot of people will have to adjust. Some cities may have to be abandoned or be put behind massive dikes like in the Netherlands (I'm looking at you, New Orleans and Miami).

But it will not be the end of civilization. We will--eventually--muddle through.

Rethinking Nuclear Power Post-Fukushima

For many years, my opinion of nuclear power has been one of an uneasy truce: I've not been 100% comfortable with it, but accepted it because of the potential to generate a lot of power relatively pollution-free.

In the wake of the accident at the Fukushima power plant, I'm rediscovering some things I kind of knew before but hadn't fully appreciated:

Underappreciated Fact 1: Nuclear Power is Inherently Dangerous

Historically speaking, far more people have been killed by fossil fuel power than nuclear power. This is a fact.

But that's not because nuclear power is inherently safer. On the contrary: nuclear power has a good safety record (so far) because it is so extremely dangerous that we entomb reactors with insanely large containment structures to keep the stuff away from us even in an unthinkable disaster. Were we to build similar containment and waste-handling systems for coal-fired power plants, pollution and global warming would be non-issues.

We don't do that with coal and oil because we don't have to.

And if a containment structure is ever catastrophically breached (an event which hasn't happened yet--the Chernobyl reactor had no containment), it would likely render hundreds of square miles uninhabitable for centuries. Nothing else made by humans has that capacity.

Underappreciated Fact 2: Spent Fuel Remains a Problem

Even after decades of nuclear power, we still haven't figured out what to do with the spent fuel. Fukushima shows that in an accident the spent fuel can be almost as dangerous as the reactor itself, in its capacity to contaminate the surroundings and prevent emergency workers from fixing problems.

Here in the U.S., spent nuclear fuel is basically stockpiled at the power plant waiting for the (hypothetical) day when there's some way to recycle or dispose of it. At the Prairie Island plant here in Minnesota, they've actually run out of storage space and have had to build new storage casks. It's safe to assume that these spent fuel casks are considerably more vulnerable than the primary containment around the reactor.

Underappreciated Fact 3: In a Disaster, You May Have Other Problems

The nuclear accidents at Chernobyl and Three Mile Island happened because of internal problems, not because of a natural disaster. Fukushima, on the other hand, was caused by a combination of a magnitude-9 earthquake and a massive tsunami--an event the power plant was not designed to survive.

Nuclear reactors are engineered to withstand the most catastrophic natural disaster expected at their site. What that means in practice is that a natural disaster big enough to damage a nuclear power plant will be bigger than anything anyone expects. Normally simple things like transportation may be difficult or nearly impossible, local emergency services may be wiped out, and it could take days to get even the most basic resources to fix the problem.

If bringing a nuclear power plant under control requires something (supplies, people, expertise) which doesn't exist at the site itself, you might not be able to get it at all.

Underappreciated Fact 4: Newer Plants May be Safer, but Old Plants Rarely Die

One argument by nuclear advocates post-Fukushima has been that the Fukushima reactor and containment were an older design with known deficiencies. New plants, they argue, would never be as vulnerable.

Unfortunately, older reactors continue to be used, even decades beyond their original design lifetime. Given the cost of decommissioning an old reactor and building a new one, power plant owners have an enormous incentive to keep the old reactors running as long as possible.

It's hard to know if the margin of safety in older nuclear plants has eroded (it may take another disaster to know for sure), but it is clear that they are not being replaced by newer designs nearly as quickly as the original designers had intended.

Apple, is this a good idea?

Apple refreshed its laptop line today, and the big new feature is the "Thunderbolt" port, aka The Mordor Plug ("...one plug to rule them all...").

Lots of people are really excited about this, but I noticed an odd design choice. Take a look at the symbol Apple is using for the Thunderbolt interface, the lightning bolt with an arrow.

Now take a look at this Google Image search. Striking resemblance, don't you think?

I don't know how eager I am to plug an expensive peripheral into a port marked with a prominent "DANGER HIGH VOLTAGE" symbol.

It seems that Apple is trying to rebrand a universally understood symbol meaning "Danger! Don't touch this or plug anything into it unless you really know what you're doing" to mean "You can plug anything into me and it will be really fast!"

What could possibly go wrong?

What happened to the attic?

Based on a purely random set of observations over my lifetime, I've noticed that houses more than about 100 years old (built before 1910 or so) usually have an attic which is fairly accessible for storage. Houses less than 60 years old (built after 1950 or so) usually have attics which are difficult to get into, or even completely sealed from the living spaces.

My own home, built in 1984, has at least three distinct attic spaces over different parts of the house, and only one of the three has any way to get in at all (without cutting through a wall or ceiling). Getting into the one accessible space requires carrying a large stepladder up to a closet on the top floor, lifting a drywall panel out of the way, and shimmying through a small hole--not at all practical for storage.

I find this a little mysterious. Attics are terribly useful things: they don't take up any living space but can provide an enormous amount of storage (think of all the billions spent on mini-storage); an accessible attic makes it much easier to inspect the condition of the insulation and look for roof leaks (and every roof, given enough time, will eventually leak); and attics are almost as handy as drop ceilings when trying to pull network cables.

So why doesn't the modern American house make it easy to get into the attic, the way our grandparents' houses did?  I have some theories:

  1. Beginning with the post-WWII housing boom, builders felt the need to put up lots of houses cheap, and eliminating the ladder to the attic was an easy way to save money.  Eventually people stopped expecting this amenity.
  2. Modern architectural styles, with their shallow roofs, don't give any usable attic space anyway, so builders stopped providing access. The trend carried over even in places (like Minnesota) where most houses are still built with a steep roof because of the snow and ice.
  3. Building codes stopped allowing a steep ladder into the attic, and people didn't want to take the floor space needed for a proper staircase.
  4. When people started heavily insulating their attics, it became more difficult to provide a hard floor suitable for walking and storage.
  5. Attic storage is, and always has been, an expensive amenity reserved for the fanciest houses. Cheaply built houses from 100 years ago have mostly been torn down, so it just seems like builders used to provide more attic access.
  6. Attic storage is just as common in new houses as in older ones, and my observation is just wrong.

My guess is that the answer is a combination of 1 and 2, with maybe a little of 3 and 4 thrown in. I really don't know, though, and my attempts to use Google-fu to find the reason came up blank.

So for now this is just a mystery.  But if I ever build my own home, I will insist that it come with a proper staircase to an attic where I can keep all my stuff.

How Frozen is the Frozen North?

I finally got around to putting the current weather back in the blog.  This was the last major piece of cleanup left over from when I switched to Drupal almost two years ago.

It isn't especially elegant: at home I have a Mac Mini running Lightsoft Weather Center; this downloads the current weather from my Davis Vantage Pro weather station.  Every 15 minutes, it updates an HTML template and FTPs it as a static file to my web hosting provider.  This static HTML page is included on every page of the site through an iframe.
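The fill-template-and-upload step is simple enough to sketch in a few lines. Here's a minimal, hypothetical version--the hostname, credentials, and template fields are invented for illustration, and in my actual setup Lightsoft Weather Center does all of this internally:

```python
from ftplib import FTP
from io import BytesIO

def fill_template(template, readings):
    """Substitute the current weather readings into an HTML template."""
    return template.format(**readings)

def upload(html, host, user, password, remote_name="weather.html"):
    """Push the rendered page to the web host as a static file."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.storbinary("STOR " + remote_name, BytesIO(html.encode("utf-8")))

if __name__ == "__main__":
    template = "<p>Temp: {temp_f} F, Wind: {wind_mph} mph</p>"
    html = fill_template(template, {"temp_f": -12.4, "wind_mph": 8})
    # A scheduler (cron, launchd) would run this every 15 minutes:
    # upload(html, "ftp.example.com", "user", "secret")
    print(html)
```

Because the result is a plain static file, the web server never has to talk to the weather station; the iframe on each page just pulls whatever was uploaded last.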

There are also some history graphs which can be accessed by clicking the "Weather in the Frozen North" link; those are also updated every 15 minutes.

I was forced to abandon the Davis WeatherLink software because the Mac version was simply pathetic--it had not been well maintained, and ran as a Java application which seemed to be very brittle.  Fortunately, there are now several superior alternatives for weather station software on the Mac which have more features, are easier to set up, and produce nicer-looking output.

Facebook, it's not you, it's me

Facebook, it's not you, it's me.

I tried to make our relationship work, I really did. At first I resisted--I'm not the kind of nerd who falls for every pretty Web 2.0 app--but when it seemed like everyone I knew was talking about you (even my own mother in law), I gave in.

We had fun together, at first. Reconnecting with old friends, seeing who else was hanging out. After a while, though, our relationship began to change. You didn't communicate with me the way you used to: instead of fun little updates, it seemed like all I got was messages about how many sheep someone raised in FarmVille.

I understand that you have needs too, but our relationship can't be a one-way street. If you want to monetize me, that's fine, but our relationship needs to be about more than just that.

For a while I sort of drifted away, but then I started suspecting that you had a darker side when I learned how many of our shared secrets you didn't really keep secret. Whatever trust and respect I had was gone when I learned that those games my friends were playing demanded a price: not just my friends' privacy, but mine too. Suddenly the sheep seemed more than annoying, almost sinister.

So I tried to leave you. For months I didn't log on, but eventually, and against my better judgment, I decided to give you one more try.

This time, I vowed, I would be careful and give you a fair shake. I would block all the useless applications, to protect both my time and my privacy. I would check everyone's updates regularly and comment where appropriate.

It didn't work.

The harsh reality is that I've been spending as much time blocking applications (you don't make it as easy as it should be) as communicating with people I care about. Of all my "friends," only a handful actually post updates, and those who do update post way too often (I care about these people, but not that much).

So in the end, Facebook, this is goodbye. I've invested too much energy in our relationship and gotten too little in return, and I've finally realized that to you I was never more than one more consumer profile to market to. I deleted my account today--though I have my doubts that you'll respect that. Something tells me you don't really believe our relationship is over.

On the Nature of Steve Jobs' Reality Distortion Field

Apple did not invent the graphical user interface. Nor did it invent the digital music player, smartphone, or tablet computer. Apple did take each of these products, do it better than anyone else, and (for a time at least) own the market.

In each case, a large part of Apple's contribution was not a killer feature or innovation, but taking existing elements and finding a combination uniquely appealing to customers.

In each case, this meant omitting some features which every other product included, and which most observers believed to be must-haves.

In each case, competitors and industry pundits mocked Apple's products and predicted failure.

In each case, when customers actually tried the products, the missing features turned out to be less important than the overall experience.

Normally when a company launches a new product into a new market, it makes an effort to include all the key features. Without hitting the "checklist features" it can be difficult to get prospective customers to even try the product, and the product is often doomed before it even gets a chance.

Apple's unique talent, and the true nature of Steve Jobs' Reality Distortion Field, is in getting prospective customers to give a new product a try, even when it seems to be missing key features.

How Special is the iPhone?

Apple's products, at least since the return of Steve Jobs, have been an oasis of quality hardware and software in the sea of cheap, ugly, and crash-prone products that is the computing industry. As much as I like Apple and my iPhone, however, I would like the option to buy my next phone from a different company and not feel like I'm settling for second-best.

So far things are not looking good. The Palm Pre had outstanding software, but it never had enough backing from major carriers. As a result, both the Pre and Palm itself are, for all intents and purposes, no more.

Android was also promising at first, and it still is promising in the abstract. Unfortunately, nearly all the actual Android phones on the market come with some combination of a lack of software upgradability, crippled features, obnoxious bloatware, and unhelpful user interface overlays. The main exception seems to be Google's own Nexus One, which lacked the marketing support of major carriers and is for all intents and purposes no more. Though it is technically possible to hack your Android phone or buy a new Nexus One, it's not reasonable to expect a typical consumer to go through the effort involved.

And while Android phones in aggregate are outselling iPhones, those sales are spread across hundreds of different devices from dozens of manufacturers. Given the wide variety of hardware, OS versions, and customized software commercial Android phones are sold with, it's fair to ask whether Android is even a single platform.

Why is it that, more than three years after the introduction of the original iPhone, no other manufacturer has been able to match Apple's combination of commercial success and high-quality design?

Or, as an acquaintance recently said as he was showing off his beautiful new Droid phone, "It's great but let's face it: we all just want iPhones."

A Three Ring Circus

In order to successfully bring a mobile phone to market, three different elements must come together: the hardware, the software, and the service. That means that up to three different companies are involved in creating the customer experience, though often the hardware and software are from the same company.

Of these, the service is the hardest to differentiate, since consumers generally notice the service only when it fails: when calls drop, when the bill is wrong, etc. When everything is working properly the mobile phone service is like oxygen in the air, invisibly supporting the customer's daily activities.

However, the service provider also owns the customer relationship, since the carrier sells the customer the phone (in most cases), provides customer support, and sends the customer the monthly bill. In most cases that monthly bill is not only paying for the actual cost of delivering mobile phone service, but also most of the cost of the phone itself.

So the mobile phone companies--Verizon, AT&T, Sprint, T-Mobile, and others--use their customer control to force handset makers to make handsets the carriers want, which might or might not be the handsets which customers want.

At its most benign, this results in the carriers' logo being featured more prominently than the manufacturer's logo on most mobile phones. More importantly, phones are often shipped with important features (like data tethering) crippled or disabled to help the carriers sell more expensive services, and useless applications and overlays added which the carrier uses to "differentiate" its handsets (for example, Sprint's infamous Nascar App). It's also hard for a handset maker to innovate in ways which require the carrier's cooperation, since the handset company has very little power in the relationship and phone companies (as a rule) don't like changing their networks if they don't have to.

The iPhone, on the other hand, seems to exist entirely outside this world. Every iPhone ships with the same interface and user software (no carrier-specific apps or overlays), carriers have updated their networks specifically to support the iPhone's Visual Voicemail feature (Apple did not invent the graphical interface for voicemail, but only Apple convinced a carrier to support it), and Apple doesn't even include the carrier's logo anywhere on the phone. An iPhone from anywhere in the world is essentially the same product, with the same branding, features, applications, and interface. 

[The one exception is data tethering, which is enabled in most markets but costs extra under AT&T. In My Humble Opinion this is obnoxious but at least understandable, given that the usage profiles of a smartphone and a wireless modem--which is what a tethered phone is--are very different.]

Uniquely, in the power relationship between carriers and handset makers, somehow Apple has come out on top where every other mobile device manufacturer has had to kowtow to the phone companies.

A Unique Confluence of Circumstances

I'm starting to believe that the iPhone's success is due to a set of circumstances which makes it unlikely any other company will be able to repeat Apple's feat.

At the time Apple was developing the iPhone and looking for a carrier partner, AT&T was still working through the aftereffects of a series of mergers and rebrandings which had, in the course of only a couple of years, confusingly merged Cingular and AT&T Wireless, killed the AT&T Wireless name, then returned the AT&T name and eliminated the Cingular brand.  Network and customer service integration was also rocky, and the company needed something unique to offer customers.

Apple is notoriously finicky about its products, and other carriers (notably Verizon) wouldn't give Apple the degree of control Apple wanted. But for AT&T, this was exactly what it needed: Apple was (thanks to the iPod) a powerful brand associated with hip, cutting-edge gadgets, and could be counted on to produce something special. AT&T would give up control of the handset and the customer relationship, but in return would get a phone no other carrier (in the U.S.) could offer.

Only Apple could make this deal, since only Apple had the Apple brand. Had Palm, RIM, Motorola, Nokia, HTC, or any of the other handset companies built a similar product, it never would have gotten the carrier support required to succeed without the branding, crapware, crippling, and overlays which plague those same companies' products today. The fact that Apple is still the only handset maker to succeed without compromising its product to get access to the customer just proves the point.

It took the unique combination of a powerful brand, a groundbreaking product, and a desperate phone company to break through the carriers' reflexive need to be front-and-center with the customer. These circumstances aren't likely to happen again in the near future, and as a result, Apple's position is likely to remain safe for some time to come.

iPad 3G Hands On

My iPad 3G arrived on Friday, so after watching other people's shiny new toys I finally got to use my own.

The main use we're planning at my company (or excuse, if you prefer) is as a demonstration device when we exhibit at trade shows. Right now we ship a large iMac to set up in our booth in order to show off our web-based reporting tools. The iMac is a good platform for this, since it gives us an attractive, large display which helps draw people into the booth but doesn't detract from what we're trying to show off.

The iMac has its downsides, though: it costs over $100 to ship and insure (even ground) each way; the padded shipping crate is heavy and unwieldy; and since we only have one, we can only do one demo at a time, meaning that at busy times we have a lot of people crowded around the one screen.

The iPad seems like the perfect alternative. It costs nothing to ship since several will fit into a briefcase to carry onto a flight, it's easy to carry on and off a trade show floor, and we can have several in the booth so we can give multiple one-on-one demonstrations at a time.

After receiving the iPad Friday, I loaded it with a variety of productivity software and tools (mainly Apple's iWork suite, Omnigraffle, and OmniGraphSketcher) on the theory that we may want to actually use the iPads when we're not at a trade show to do real work.

First Impressions

Thursday afternoon, before my 3G model arrived, I was at a technology committee meeting for a local school. Since the non-faculty members of the committee are all technophiles like myself, I wasn't surprised when the pre-meeting discussion was about the iPad. Both of the other non-faculty members had brought a few (non-3G) iPads into their organizations to evaluate, and both had essentially the same conclusion: the iPad is more useful as a business and technology tool than they had expected.

After playing with mine for the weekend, I have to agree. The productivity applications on the iPad are necessarily more limited, but for basic tasks they are significantly faster and more natural to use than the desktop equivalent. The small form-factor, touch interface, and instant-on-always-available quality of the iPad allow the machine to get out of the way of whatever task may be at hand. The fact is that 98% of business tasks do not require advanced features, so it becomes natural to just grab for the pad rather than open the laptop.

A case in point is the OmniGraphSketcher application. I evaluated this on the desktop a while ago as a charting package and came away unimpressed. The concept is that rather than start with numerical data in a table like every other graphing program, you draw the graph you want freehand and the program makes it look pretty. This didn't work (for me) since I've always found drawing freehand with a mouse to be unnatural, clumsy and imprecise, and I couldn't see why you would want to get away from numerical input data.

On the iPad, however, OmniGraphSketcher becomes an entirely different experience. Sketching a chart with a touch screen is about the most natural thing you can do, and the program takes what you draw freehand and makes it look pretty and professional. The experience is like using a magic whiteboard which takes your rough ideas and turns them into something which looks like a professional graphic artist created it.

Drawbacks

There are some surprising glitches and limitations (which I fully expect will be addressed in a future software update). For me at least, the lack of Flash and multitasking are not problems at all.

However, long popup menus render in a way which doesn't look scrollable, meaning that items at the top or bottom of a list can get lost. For such a polished user interface, this usability mistake is surprising.

The lack of printing capability and the clumsy mechanisms for sharing and synchronizing files limit the iPad as a serious workhorse. Right now about the only way to move files on and off the iPad for most programs is through e-mail, which, while functional for small files, is really not acceptable for big documents.

I also discovered that while most websites work very well on the iPad, some advanced AJAXy things don't work at all. The touch interface has no way to hover the mouse pointer over things, and there is no way (yet) to do drag-and-drop operations on the touch screen (the drag motion of the fingers is interpreted as a scrolling action and doesn't get passed to JavaScript).

I'm typing this article entirely on the on-screen keyboard (it works surprisingly well), but the fancy WYSIWYG text editor I use on this blog does not work on the iPad at all (so after typing this, I will be cleaning up the formatting from my laptop).

Stray taps also seem to be a problem, and when I type too fast I seem to get a little sloppy and occasionally hit the screen outside the keyboard, moving the insertion point in my text and wreaking havoc on what I'm trying to type.

On the whole, these are minor complaints, and I fully expect they will be fixed in months, not years. I can definitely see a day when the iPad or its successor becomes my primary computing device, with the laptop or desktop only hauled out for particularly demanding tasks.