Recent Articles

First 3D prints


I got my 3D printer late last week and have been having fun making a variety of models to see how it performs. I bought the Up Plus instead of one of the many kits like the Makerbot or the Reprap. The Up is more expensive, but everything I read suggested that the kit-based models require a lot more fussing to get working properly (even if you buy them preassembled), and the software is fairly painful to work with.

The Up, on the other hand, comes fully assembled and tested, and has relatively user-friendly software which works out of the box. The software is a big deal, because it will automatically add support structures (for printing overhangs) and take care of other routine chores without too much tinkering from the end user.

Most hobbyist/home 3D printers work by extruding a thin filament of melted plastic--imagine a hot glue gun mated to an old-fashioned pen plotter. There are probably a dozen different basic technologies for 3D printing, but this one seems best suited for the hobby market: it is relatively inexpensive, the materials are cheap and readily available, and the safety issues are minimal.

On the downside, this method is slow and limited in the materials you can use. The core of the unit (and its most expensive component) is the print head, so hobby printers generally have a single head. That means each model must be made from a single material: only one color can be used, and the support material has to be the same stuff as the model itself.

Professional units often have two or four print heads--that lets you use a different material for the support (making it easier to remove all the support scaffolding) and include several colors of plastic in the same model.

I've posted some pictures of models I've built on Thingiverse. So far I've found that the printer can produce really amazing output, though sometimes the software needs some tweaking to get the best results. Support material is a pain to remove, so it's best to use the least amount of support that will still give good results.

I've also found that kids (of all ages) find the 3D printer endlessly fascinating--it's a great way to inspire interest in designing and building stuff, and my kids have already started making models in Sketchup to print.

Christmas Day Ride


The weather this "winter" has been shockingly dry and mild for Minnesota. As of today we have gotten a grand total of only a couple inches of snow, and no below-zero temperatures.

Yesterday, Christmas Day, it was in the 40s. I went for an 18+ mile ride to my parents' and back (the kids rode in the car with She Who Puts Up With Me). In anything close to a normal winter, that ride would be out of the question for me because of icy/wet roads and cold. I managed a similar ride on Thanksgiving Day.

The extended forecast is showing only slight chances of snow and nothing in the way of January-like cold for the next week. At the rate we're going this could be the year without a winter. January and February are normally the coldest months of the year, but we are now gaining sunlight every day and the lack of snow means that the ground absorbs a lot more solar energy.

Would you have bought a personal computer in 1975?


A personal computer in 1975 cost about $2,500 in today's dollars if you bought it assembled. It didn't do much, as there was no third-party software and it only had 256 bytes of memory. (However, for an extra $1,350 in today's dollars you could upgrade it with another 1,024 bytes of memory). A few thousand units sold that year, which was a shocking success.

The personal computer was a laughably downsized version of what were then giant industrial machines, and it was hard to see at the time what it would eventually be useful for. It would be another four years before Visicalc arrived--arguably the first really useful thing you could do on a personal computer.

So would you have bought a personal computer in 1975?

In retrospect, many people would probably say Yes, but that's only because we now know how the technology evolved. It's much harder to be in 1975 and see how this expensive toy (which is all an Altair 8800 would ever be) would change the world.

It's this line of thinking which made me decide to buy a home 3D printer. I want to learn about this new technology not because it's useful today, but because it has such interesting potential over the next 10-20 years.

Commercial-grade 3D printers are powerful pieces of industrial equipment, and have no place or purpose in the home. Their smaller cousins have only been on the market a couple of years and are pretty much expensive toys with limited practical value.

On the other hand, a small 3D printer isn't much more mechanically complicated than an inkjet printer (in some ways it is actually simpler). There's no reason that, with enough manufacturing volume, someone couldn't sell a 3D printer for only a few hundred dollars and put one in every home.

The only reason nobody's selling millions of cheap 3D printers is that nobody knows what your average household would do with one.

I don't know if we will discover the Visicalc of 3D printers, the killer app which transforms this from expensive toy into useful tool. I don't think a lot of people in 1975 knew what the future held, either.

There was another interesting piece of industrial technology scaled down for home use which came around only a few years after the Altair 8800. Unlike the personal computer, however, not too many people today have a personal robot.

Optimistic about Carbon and Renewables


Three and a half years ago (in early 2008) I observed that the price of solar power modules had been dropping at a remarkably consistent 6% per year for 25 years, and that sometime before 2025 they would be cheaper than grid power in most places.

The exact year of grid parity depends a lot on where you live: sunny places with expensive electricity (think Hawaii or southern California) get there a lot sooner than cloudy places with cheap power (Seattle). For Minnesota, I estimated that sometime around 2015 a solar power system would pay for itself within the system's lifetime.
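(As a rough illustration of how that kind of steady percentage decline plays out, here is a back-of-the-envelope sketch. The starting price, parity threshold, and function name are purely illustrative placeholders, not the actual figures behind my 2008 estimate.)

```python
# Back-of-the-envelope sketch: how long until a price falling a fixed
# percentage per year drops below some target. All numbers here are
# illustrative placeholders, not real module or grid prices.

def years_to_parity(current_price, target_price, annual_decline=0.06):
    """Count whole years until current_price * (1 - annual_decline)**n <= target_price."""
    years = 0
    price = current_price
    while price > target_price:
        price *= (1 - annual_decline)
        years += 1
    return years

# Example: a hypothetical $4.00/W module price falling to an assumed
# $1.50/W parity threshold at 6% per year.
print(years_to_parity(4.00, 1.50))  # -> 16 years
```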

That estimate is looking pretty good, at least on the price of the solar modules (this Scientific American blog has an updated version of the graph I made in 2008). If anything, the decline in photovoltaic prices may be accelerating a little--though that could just be a short-term blip.

I'm optimistic that over the next decade solar power will become economically viable in more and more places. On a purely cost basis alone you will start seeing a substantial increase in solar power installations. That, in turn, makes me optimistic that we will manage to transition away from greenhouse-gas-emitting sources of energy in a reasonably graceful fashion.

I may be using "optimistic" in an unusual sense. There's no doubt that the earth's climate is changing, and much of the evidence now points to faster climate change than most scientists had predicted. There's already a lot of climate change "baked in" to the atmosphere, as carbon dioxide levels have increased over 20% in just the past 50 years. What's more, moving a large fraction of energy production to solar and other renewable sources will take decades, as it's very capital intensive to build an entirely new energy infrastructure.

But I am optimistic that the long-term trends are in place to create a more sustainable energy system and eventually reduce or eliminate net emissions of greenhouse gases. It will take decades. Future historians may see the 21st century's energy revolution as just as important as the industrial revolution of the 19th century or the information revolution of the 20th.

In the meanwhile, global climate change will continue. Sea levels are likely to rise (maybe a lot), storms will get more intense, and a lot of people will have to adjust. Some cities may have to be abandoned or be put behind massive dikes like in the Netherlands (I'm looking at you, New Orleans and Miami).

But it will not be the end of civilization. We will--eventually--muddle through.

Rethinking Nuclear Power Post-Fukushima


For many years, my opinion of nuclear power has been one of an uneasy truce: I've not been 100% comfortable with it, but accepted it because of the potential to generate a lot of power relatively pollution-free.

In the wake of the accident at the Fukushima power plant, I'm rediscovering some things I kind of knew before but hadn't fully appreciated:

Underappreciated Fact 1: Nuclear Power is Inherently Dangerous

Historically speaking, far more people have been killed by fossil fuel power than nuclear power. This is a fact.

But that's not because nuclear power is inherently safer. On the contrary: nuclear power has a good safety record (so far) because it is so extremely dangerous that we entomb reactors with insanely large containment structures to keep the stuff away from us even in an unthinkable disaster. Were we to build similar containment and waste-handling systems for coal-fired power plants, pollution and global warming would be non-issues.

We don't do that with coal and oil because we don't have to.

And if a containment structure is ever catastrophically breached (an event which hasn't happened yet--the Chernobyl reactor had no containment), it would likely render hundreds of square miles uninhabitable for centuries. Nothing else made by humans has that capacity.

Underappreciated Fact 2: Spent Fuel Remains a Problem

Even after decades of nuclear power, we still haven't figured out what to do with the spent fuel. Fukushima shows that in an accident the spent fuel can be almost as dangerous as the reactor itself, in its capacity to contaminate the surroundings and prevent emergency workers from fixing problems.

Here in the U.S., spent nuclear fuel is basically stockpiled at the power plant waiting for the (hypothetical) day when there's some way to recycle or dispose of it. At the Prairie Island plant here in Minnesota, they've actually run out of storage space and have had to build new storage casks. It's safe to assume that these spent fuel casks are considerably more vulnerable than the primary containment around the reactor.

Underappreciated Fact 3: In a Disaster, You May Have Other Problems

The nuclear accidents at Chernobyl and Three Mile Island happened because of internal problems, not because of a natural disaster. Fukushima, on the other hand, was caused by a combination of a magnitude-9 earthquake and a massive tsunami--an event the power plant was not designed to survive.

Nuclear reactors are engineered to withstand the most catastrophic natural disaster expected at their site. What that means in practice is that a natural disaster big enough to damage a nuclear power plant will be bigger than anything anyone expects. Normally simple things like transportation may be difficult or nearly impossible, local emergency services may be wiped out, and it could take days to get even the most basic resources to fix the problem.

If bringing a nuclear power plant under control requires something (supplies, people, expertise) which doesn't exist at the site itself, you might not be able to get it at all.

Underappreciated Fact 4: Newer Plants May be Safer, but Old Plants Rarely Die

One argument by nuclear advocates post-Fukushima has been that the Fukushima reactor and its containment were an older design with known deficiencies. New plants, they argue, would never be as vulnerable.

Unfortunately, older reactors continue to be used, even decades beyond their original design lifetime. Given the cost of decommissioning an old reactor and building a new one, power plant owners have an enormous incentive to keep the old reactors running as long as possible.

It's hard to know if the margin of safety in older nuclear plants has eroded (it may take another disaster to know for sure), but it is clear that they are not being replaced by newer designs nearly as quickly as the original designers had intended.

Apple, is this a good idea?

Apple refreshed its laptop line today, and the big new feature is the "Thunderbolt" port, aka The Mordor Plug ("one plug to rule them all...").

Lots of people are really excited about this, but I noticed an odd design choice. Take a look at the symbol Apple is using for the Thunderbolt interface, the lightning bolt with an arrow.

Now take a look at this Google Image search. Striking resemblance, don't you think?

I don't know how eager I am to plug an expensive peripheral into a port marked with a prominent "DANGER HIGH VOLTAGE" symbol.

It seems that Apple is trying to rebrand a universally understood symbol meaning "Danger! Don't touch this or plug anything into it unless you really know what you're doing" to mean "You can plug anything into me and it will be really fast!"

What could possibly go wrong?

What happened to the attic?

Based on a purely random set of observations over my lifetime, I've noticed that houses more than about 100 years old (built before 1910 or so) usually have an attic which is fairly accessible for storage. Houses less than 60 years old (built after 1950 or so) usually have attics which are difficult to get into, or even completely sealed from the living spaces.

My own home, built in 1984, has at least three distinct attic spaces over different parts of the house, and only one of the three has any way to get in at all (without cutting through a wall or ceiling). Getting into the one accessible space requires carrying a large stepladder up to a closet on the top floor, lifting a drywall panel out of the way, and shimmying through a small hole--not at all practical for storage.

I find this a little mysterious. Attics are terribly useful things: they don't take up any living space but can provide an enormous amount of storage (think of all the billions spent on mini-storage); an accessible attic makes it much easier to inspect the condition of the insulation and look for roof leaks (and every roof, given enough time, will eventually leak); and attics are almost as handy as drop ceilings when trying to pull network cables.

So why doesn't the modern American house make it easy to get into the attic, the way our grandparents' houses did?  I have some theories:

  1. Beginning with the post-WWII housing boom, builders felt the need to put up lots of houses cheap, and eliminating the ladder to the attic was an easy way to save money.  Eventually people stopped expecting this amenity.
  2. Modern architectural styles, with their shallow roofs, don't give any usable attic space anyway, so builders stopped providing access. The trend carried over even in places (like Minnesota) where most houses are still built with a steep roof because of the snow and ice.
  3. Building codes stopped allowing a steep ladder into the attic, and people didn't want to take the floor space needed for a proper staircase.
  4. When people started heavily insulating their attics, it became more difficult to provide a hard floor suitable for walking and storage.
  5. Attic storage is, and always has been, an expensive amenity reserved for the fanciest houses. Cheaply built houses from 100 years ago have mostly been torn down, so it just seems like builders used to provide more attic access.
  6. Attic storage is just as common in new houses as in older ones, and my observation is just wrong.

My guess is that the answer is a combination of 1 and 2, with maybe a little of 3 and 4 thrown in. I really don't know, though, and my attempts to use Google-fu to find the reason came up blank.

So for now this is just a mystery.  But if I ever build my own home, I will insist that it come with a proper staircase to an attic where I can keep all my stuff.

How Frozen is the Frozen North?


I finally got around to putting the current weather back in the blog.  This was the one major piece of cleanup remaining from when I switched to Drupal almost two years ago.

It isn't especially elegant: at home I have a Mac Mini running Lightsoft Weather Center; this downloads the current weather from my Davis Vantage Pro weather station.  Every 15 minutes, it updates an HTML template and FTPs it as a static file to my web hosting provider.  This static HTML page is included on every page of the site through an iframe.
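(If you wanted to roll your own version of this pipeline rather than let the weather software handle it, the "fill in a template and FTP it to the host" step is only a few lines. Below is a minimal sketch; the hostnames, paths, and reading names are made-up placeholders, not my actual setup, and Lightsoft Weather Center does all of this internally.)

```python
# Minimal sketch of the "render an HTML template and FTP it to the web host"
# step. Hostnames, credentials, paths, and field names are hypothetical.
import ftplib
from string import Template

def publish_weather(readings, template_path="weather_template.html",
                    host="ftp.example.com", user="me", password="secret",
                    remote_path="current_weather.html"):
    # Render the static HTML fragment from a simple $-style template.
    with open(template_path) as f:
        html = Template(f.read()).substitute(readings)

    local_path = "current_weather.html"
    with open(local_path, "w") as f:
        f.write(html)

    # Push the rendered file to the hosting provider; the resulting static
    # page is then pulled into every page of the site via an <iframe>.
    with ftplib.FTP(host, user, password) as ftp:
        with open(local_path, "rb") as f:
            ftp.storbinary(f"STOR {remote_path}", f)

# A cron job (or the weather software's own scheduler) would call this every
# 15 minutes with the latest station readings, for example:
# publish_weather({"temp_f": 12.4, "wind_mph": 8, "updated": "10:45"})
```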

There are also some history graphs which can be accessed by clicking the "Weather in the Frozen North" link; those are also updated every 15 minutes.

I was forced to abandon the Davis WeatherLink software because the Mac version was simply pathetic--it had not been well maintained, and ran as a Java application which seemed to be very brittle.  Fortunately, there are now several superior alternatives for weather station software on the Mac which have more features, are easier to set up, and produce nicer-looking output.

Facebook, it's not you, it's me


Facebook, it's not you, it's me.

I tried to make our relationship work, I really did. At first I resisted--I'm not the kind of nerd who falls for every pretty Web 2.0 app--but when it seemed like everyone I knew was talking about you (even my own mother-in-law), I gave in.

We had fun together, at first. Reconnecting with old friends, seeing who else was hanging out. After a while, though, our relationship began to change. You didn't communicate with me the way you used to: instead of fun little updates, it seemed like all I got was messages about how many sheep someone raised in FarmVille.

I understand that you have needs too, but our relationship can't be a one-way street. If you want to monetize me, that's fine, but our relationship needs to be about more than just that.

For a while I sort of drifted away, but then I started suspecting that you had a darker side when I learned how many of our shared secrets you didn't really keep secret. Whatever trust and respect I had was gone when I learned that those games my friends were playing demanded a price: not just my friends' privacy, but mine too. Suddenly the sheep seemed more than annoying, almost sinister.

So I tried to leave you. For months I didn't log on, but eventually, and against my better judgment, I decided to give you one more try.

This time, I vowed, I would be careful and give you a fair shake. I would block all the useless applications, to protect both my time and my privacy. I would check everyone's updates regularly and comment where appropriate.

It didn't work.

The harsh reality is that I've been spending as much time blocking applications (you don't make it as easy as you should) as communicating with people I care about. Of all my "friends," only a handful actually post updates, and those who do post way too often (I care about these people, but not that much).

So in the end, Facebook, this is goodbye. I've invested too much energy in our relationship and gotten too little in return, and I've finally realized that to you I was never more than one more consumer profile to market to. I deleted my account today--though I have my doubts that you'll respect that. Something tells me you don't really believe our relationship is over.

On the Nature of Steve Jobs' Reality Distortion Field


Apple did not invent the graphical user interface. Nor did it invent the digital music player, the smartphone, or the tablet computer. Apple did take each of these products, do it better than anyone else, and (for a time at least) own the market.

In each case, a large part of Apple's contribution was not a killer feature or innovation, but taking existing elements and finding a combination uniquely appealing to customers.

In each case, this meant omitting some features which every other product included, and which most observers believed to be must-haves.

In each case, competitors and industry pundits mocked Apple's products and predicted failure.

In each case, when customers actually tried the products, the missing features turned out to be less important than the overall experience.

Normally when a company launches a new product into a new market, it makes an effort to include all the key features. Without hitting the "checklist features" it can be difficult to get prospective customers to even try the product, and the product is often doomed before it even gets a chance.

Apple's unique talent, and the true nature of Steve Jobs' Reality Distortion Field, is in getting prospective customers to give a new product a try, even when it seems to be missing key features.