Tag Archives: Rants

What does Apple have against drag and drop?

In Seriously, Apple — hire some of your engineers from the 80s, I pointed out how Apple ignores or rejects perfectly obvious use cases, especially when it comes to drag and drop. I ran into another one on Friday, so here goes the hot sauce:

I have a set of people I email regularly in OS X Mail. I want to create a group, but there doesn’t seem to be any function built into Mail for that. I’d be curious to know what Apple’s research says about how many people use their address book for anything other than email, but setting that aside, the task at hand seems simple once you understand it must be done in the address book…

Except for the fact that the people in question aren’t in my address book. They’re coming from an Exchange server. So the use case here is: I have a bunch of qualified addresses in an email and I want to get them into the Address Book application. I started by creating a group in the address book and having it open, ready to receive the new addresses.

Failure #1: select all the addresses, drag them into the group. No. They just streak back to Mail. Whenever this happens I picture Wayne Knight in Jurassic Park saying “Ah ah ah, you didn’t say the magic word.”

Failure #2: drag a single address. This was a long shot, and predictably it failed as well.

Failure #3: drag the addresses to the desktop. This works, and creates a text file with the addresses in it. But then dragging the file into the Address Book application fails: it only accepts vCard format. Gee, it’s a shame that’s not the format Mail creates when you drag to the desktop. Interestingly, you can drag a vCard into an email’s address box and it works; you just can’t go the other way.
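Since the desktop drag produces plain text and Address Book only accepts vCards, the missing conversion step can be scripted. Here is a minimal Python sketch; the input line format (one `Jane Doe <jane@example.com>` or bare address per line) is an assumption about what Mail writes, so adjust the regex if your file differs.

```python
import re

# Assumed input: one address per line, "Jane Doe <jane@example.com>" or a
# bare "jane@example.com". Adjust the regex if Mail's text file differs.
ADDRESS = re.compile(
    r'\s*"?(?P<name>[^"<]*?)"?\s*<?(?P<email>[^<>\s]+@[^<>\s]+)>?\s*$'
)

def text_to_vcards(text):
    """Turn a text file of addresses into vCard 3.0 entries."""
    cards = []
    for line in text.splitlines():
        m = ADDRESS.match(line)
        if not m:
            continue
        # Fall back to the address itself when there is no display name.
        name = m.group("name").strip() or m.group("email")
        cards.append(
            "BEGIN:VCARD\nVERSION:3.0\n"
            f"FN:{name}\n"
            f"EMAIL;TYPE=INTERNET:{m.group('email')}\n"
            "END:VCARD"
        )
    return "\n".join(cards)
```

Save the output with a `.vcf` extension and that file should drag into Address Book the way the raw text file wouldn’t.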

Failure #4: I notice there is a menu item available for an address: Add to Address Book. Select all the addresses, select the menu choice, and only the email address I clicked on is added to the address book even though all the addresses are still selected.

Success? One by one I use the menu to add each address to the address book. Then I add them all to the group. I wonder what will happen if any of the people’s email addresses change in Exchange. I’m assuming they won’t update in Address Book, but it’s a low probability that any of them will change.

Drag and drop is meant to be universal: from anywhere to anywhere. The source application is supposed to offer its data in as many formats as it can, to increase the likelihood that the receiving application finds one it accepts, and receiving applications are supposed to be as tolerant as possible. The lowest common denominator is text: an application should include a text representation of its data whenever possible, and the receiving application should accept text if it can. I’ve just checked, and yes, if I create a contact in the Address Book and drag an email address to the email field for that contact, it works. So Address Book understands drag and drop of text; it just refuses to do the reasonable thing.
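The contract described above fits in a few lines of Python. This is only a sketch of the idea; the format names are UTI-style illustrations, not a real pasteboard API.

```python
# Sketch of the drag-and-drop contract: the source offers every format it
# can, and the receiver walks its preference list, falling back to plain
# text. Format identifiers are illustrative, not a real API.
def make_drag_payload(contacts):
    """Source side: offer both a rich format and a plain-text fallback."""
    return {
        "public.vcard": "\n".join(c["vcard"] for c in contacts),
        "public.plain-text": ", ".join(c["email"] for c in contacts),
    }

def accept_drop(payload, preferred):
    """Receiver side: take the best offered format, tolerating fallbacks."""
    for fmt in list(preferred) + ["public.plain-text"]:
        if fmt in payload:
            return fmt, payload[fmt]
    return None, None
```

A receiver written this way would have accepted Mail’s text drag; Address Book evidently stops after checking for vCards.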

I’ve used Macs for twenty-five years now, and things like this are just disappointing.

Cargo Cult Design

In the Pacific Ocean after World War II, islanders built imitation landing strips, aircraft, and radio equipment, and mimicked the behavior of the departed military personnel, not understanding that going through the motions wouldn’t bring back the huge amounts of goods that arrived when the real military was there during the war. A similar situation is developing with tablet devices.

Tablet 101

For nearly ten years people have been proclaiming that tablets are the way of the future (2001), that the tablet PC still has a strong future (2005), that the tablet PC is going to make paper obsolete (2007). Now there is the iPad, which as some point out is just a big iPod Touch. That’s not the insult PC World seems to think it is: the one breakout tablet success of the last decade — or two, if you like — is the iPhone/iPod Touch, with roughly 75 million sold, which as far as I can tell is more than the total number of tablet PCs sold in the last ten years.

The industry has learned from the iPhone experience, and Apple won’t have a grace period to establish the iPad as dominant in the re-made field of tablet computers; there are dozens of competitors looking to jump into the field immediately. But do any of them really understand what they’re doing?

Cars with tillers

Henry Ford sits at the tiller of the first car he made, in 1896 when he was 33.

In the early days of the automobile, cars looked very much like the wagons that preceded them:

That’s understandable: no one knew what a car should look like, and a wagon was the closest thing to a car anyone had seen. As car manufacturers experimented with different designs, they came up with the many things we take for granted today. Some advances, like seat belts, arrived far later than you might expect, but once the steering wheel was introduced the tiller quickly faded away.

Still, understandable design flaws didn’t make the early cars any more practical. A car with a tiller was impractical and was doomed to be replaced by cars with a steering mechanism not mired in the past.

Cargo cult design is everywhere

The Space Shuttle -- over 100 flights

Buran -- 1 unmanned test flight

Honda CVCC -- 37MPG City, 47MPG Highway

AMC Pacer -- 16MPG City, 26MPG Highway

It’s easier to take an existing design and copy it. Come up with a few variations on the theme and you’re done. But if you don’t understand which attributes are important, you’ll copy the wrong things, modify the crucial parts, and end up with a dud.

Defining a (successful) tablet

There are many attributes necessary for a tablet computer: accurate, finger-based multi-touch; a display large enough to work with the web and web applications, and suitable for general use including video; wireless networking. There are more, but one in particular that Apple’s eager competitors (and the tablet PC makers of the past) seem to have overlooked is a user interface designed for use on a tablet. It’s not enough to put a coat of paint on a desktop-keyboard-mouse-oriented user interface. Henry Ford could have pointed to the wheels on his first car, shown above, and said, “Look how the wheels are specifically designed for a car, not a wagon!” That’s great, but it doesn’t change the fact that the user interface, the tiller, is wrong.

This lack of UI support has doomed every tablet PC that came out with a customized version of XP, Vista, and now Windows 7. There are basic differences between a finger and a mouse/pointer that have to be accounted for. Imagine a Chevy Corvette with a leather-covered tiller and you have something like this:

Lots of little fiddly bits

The application icons are a reasonable size only because they’re heavily framed (doesn’t the OS support higher-resolution icons?), and the other controls are just too small and too close together. After ten years you’d think the UI designers for tablet OSes (mostly from Microsoft) would have realized the things they need to give up in order to deliver a viable user experience.

The Cargo Cult designers of the HP Slate, the Dell Streak, the Lenovo U1, and others are aping Apple’s hardware design, using large touch screens in stylish cases, and trumpeting how they support Flash or function as a laptop replacement, but they will all fail miserably as long as they try to dress up a tiller to make it work like a steering wheel.

Edit: here’s another example of someone who thinks it’s about the hardware.

Postscript: I first ran into the term “cargo cult” because of Richard Feynman’s use of the term “cargo cult science.”

Anything less than instant is unacceptable

Application responsiveness is like the weather

I used to live in Los Angeles, and I was always amazed at those “Best Places to Live” articles that failed to properly weight (in my opinion) the weather. Living in Southern California it’s easy to take for granted that the weather will be nice most of the time. If I even thought about the weather I’d simply look out the window and smile at yet another perfect day. Now I live in St. Louis, and I check the weather forecast online every day before I head out. Don’t get me wrong, I love St. Louis, but the fact that I could freeze to death if I dress wrong is a bit of a negative.

When it comes to usability, responsiveness is like the weather: everyone agrees that a slow user interface is a bad thing, yet the moment some feature comes along that looks cute but runs like sap in winter, the slowness gets a pass, and we end up with interfaces that suck because they’re slow. It’s important to remember that the CPUs in our computers today are roughly one hundred times as powerful as the CPUs of just fifteen years ago, and those CPUs managed a GUI that wasn’t that different from the one we use today.

So why are GUIs slow today?

Pervasive Multi-Tasking. Today’s computer is doing many more things than the computer of 1995. Open the Activity Monitor or the Task Manager and take a look. The Mac of 1995 also famously didn’t have pre-emptive multi-tasking. In a way that was a good thing: it meant that the program in front of you had the power to use as much of your 25MHz CPU as it needed to try to keep you happy.

Web Apps. They suck (with rare exceptions) because JavaScript is interpreted and it’s too easy to add features to web apps. At my last job I used the Zimbra email client. That thing ate my soul one second at a time. I found that I didn’t respond to email as well or as promptly because of the minor inconvenience of repeatedly waiting on Zimbra’s interface.

Eye Candy. It’s nice, and it can even be a usability boon, but the moment it slows you down it should go. It’s important here to distinguish between reality and appearance. Studies have shown that people are happy to wait longer for something that appears to be doing something than they are for something that gives no feedback. So if the eye candy is the zoom out-zoom in of switching applications on the iPhone, even if it makes the actual transition take a bit longer that’s a net win.

Inefficiency. I don’t know this for a fact, but it would seem there’s not enough eye candy in recent OSes to justify the hardware requirements they demand.

An example of instant done right

Back in the 90s there was a company called Be that put out the Be Operating System. Initially it ran on custom hardware with two 66MHz CPUs in it (still ten to thirty times slower than what you likely use now). The Be reps did a demo where they would start a dozen or so applications and show that both CPUs were pegged at full utilization. Then they would grab a window and drag it around the display. Everything else slowed to a crawl, but that window would continue happily doing whatever it was doing, because the BeOS knew how to prioritize: it was an end-user system, not a server, so whatever you focused on got the first shot at the CPUs. The reps would then repeat the demo with one of the CPUs disabled. Even with a single CPU running the tasks that had maxed out two, the dragged window remained immediately responsive while everything else nearly froze.

That’s the way it was fifteen years ago, and there’s no reason why it shouldn’t be that way today. Everyone I’ve read who’s had their hands on the iPad says that it gives instant feedback in a way that other devices don’t. If it does, I’m looking forward to it.

Addendum: some things that should be instant but aren’t

  • In the Finder, right clicking a file and selecting Open With. The submenu should be pre-calculated based on file type. For the few hundred most common file types that would take what, a few kb?
  • Clicking Update in the WordPress editor for this post.
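To put a number on the first item: a pre-calculated Open With table is just a small dictionary lookup. A Python sketch, with invented file types and application names standing in for whatever the real cache would hold:

```python
import os

# Hypothetical pre-calculated cache: extension -> candidate applications.
# Entries are invented; a real cache would be built ahead of time from the
# system's application registry.
OPEN_WITH = {
    ".txt": ["TextEdit", "BBEdit", "Xcode"],
    ".jpg": ["Preview", "Safari"],
    ".html": ["Safari", "Firefox", "TextEdit"],
}

def open_with_menu(filename):
    """Return the submenu entries for a file instantly, with no disk scan."""
    ext = os.path.splitext(filename)[1].lower()
    return OPEN_WITH.get(ext, [])

# Rough size check: a few hundred entries like these really would total
# only a few kilobytes of strings.
size_estimate = sum(len(ext) + sum(len(app) for app in apps)
                    for ext, apps in OPEN_WITH.items())
```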

Cosmo used to mean something — other than sex

Wow. I only know Cosmopolitan magazine from the salacious covers and headlines. Currently on their home page:

  • Bedroom Blog’s Shocking Twist — K. discovers the startling truth about Zach…
  • 77 Sex Positions in 77 Days — One couple takes this crazy Cosmo Challenge…!
  • 4 Make or Break Dating Moments — Can your love survive these tests?
  • What Beauty Editors Know That You Don’t — Cosmo’s experts share the latest trends…and much more

Oh, and:

  • Workout Tips from Olympians — You’ll love this exclusive advice from elite athletes

In case you thought it was all about sex.

What I didn’t know was Cosmo’s intellectual history. Back in the day they published fiction by Sinclair Lewis, George Bernard Shaw, Upton Sinclair, and H. G. Wells. In the 1940s it was known as The Four-Book Magazine, and it contained novels. I’m all for empowering women, and I understand that the era of fiction-based magazines is long past, but the contrast between Cosmo today and The Cosmopolitan of yesterday is as if McDonald’s had started out as a four-star restaurant.

That’s not how evolution works

Update: the BBC reports that North American bird species are getting smaller, likely in response to increased temperatures.

In a Mother Jones article, Julia Whitty says, “Birds are rapidly evolving different shapes to cope with clear-cut forests.” Now, she may be accurately quoting the source paper by André Desrochers, and she’s certainly not the only one to describe evolution this way, but that doesn’t make it any less misleading and stupid.

How Evolution Works (with only a trace of self-importance)

It’s not that hard to get it right. Evolution is the “…change in the genetic material of a population of organisms through successive generations.” [wikipedia] This happens because of two opposing processes:

  • Over time, the genetic diversity of a population tends to grow. Taking humans as an example, we have tens of thousands of genes. Each individual is a random mix of the genes of her parents, and is potentially a unique/novel combination. Each individual may also have mutations, making her collection of genes not just an imperfect selection of her parents’ genes but, again, potentially unique/novel. Finally, gene transfer can also increase genetic diversity, although I don’t know of any proven instance of this happening in humans.
  • Opposing the increase in genetic diversity is natural selection. If a particular combination of genes is unsuccessful — if the person with those genes has bad eyesight and can’t hunt, or has a bad complexion and can’t get a mate — then those genes don’t get passed on, and assuming some other genes do, the species has evolved. (If no other genes get passed on either, the species goes extinct.)

You can think of evolution as something like a bush in a topiary garden: left to its own devices it will simply grow larger — that’s genetic diversity at work; but if the gardener (in the form of natural selection) comes along and trims here and there, you end up with an elephant.

This is where the stupid comes in

If you have a four-foot-high bush, no amount of trimming is going to turn it into a ten-foot-tall image of an elephant. The article in Mother Jones starts with “A new study shows how North American birds have changed the shape of their wings in the past century as the landscapes around them have been fragmented by clear-cutting.” Bzzt, wrong. It’s not like at the annual convention the bird-leader said, “Guys, the trees are getting farther apart. We have to fly so far, and it’s tiring! I say let’s switch to those new streamlined wings. All in favor?” Again, think of the bush. The change observed in the birds’ wings took place over the last century; it’s not the bush growing, it’s the gardener trimming.

As another example of this aspect of evolution, consider Thoroughbred racing. The Kentucky Derby has been run at one and a quarter miles for over a hundred years. There is tremendous prestige, not to mention money, involved in producing the fastest horses, and breeders have worked very hard over the last century to improve their mounts’ times. Yet the average time of the last ten winners of the Derby is less than a 6% improvement on the time of the first ten winners. Even that overstates the situation, since training methods and riding techniques have presumably improved significantly over that time span as well. The reason for this modest improvement is simple: you can’t trim the bush taller. If you could, horses would be running the Derby at sixty miles per hour by now.

So what’s almost certainly happening with the birds is that previously there was some amount of genetic diversity in the shape of the birds’ wings. This may or may not have manifested in actual variations in the birds’ wing shapes; it’s possible for significant genetic variation to hide. As a gross simplification, one bird might have gene variations A1, B1, and C1. A1 contributes to more pointed wings, but only when accompanied by B3 and C2. Another bird might have A2, B3, and C3. If the two birds mate, none of their offspring will have more pointed wings because they all have either C1 or C3. Even if one of their offspring has A1, B3, and C3, it might then go on to mate with another bird that has A2, B1, and C1, and produce offspring that have A1, B1, and C1, putting us back at the start.

But some subset of the birds have A1, B3, and C2, and thus have more pointed wings. It doesn’t matter much how many of them there are, or even if any of them exist in a particular generation of birds; we have a century to work with here, and about a hundred generations of birds. As long as somewhere along the way A1, B3, and C2 show up and prove beneficial, we’re set. As we trimmed the forests, the bird population declined, and those with A1, B3, and C2 didn’t decline as much. They were better suited to the new environment, where before they were just average among the population. As birds without A1, B3, and C2 failed to compete as well and died off, the percentage of birds with more pointed wings went up: evolution happened.
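The A1/B3/C2 story is easy to simulate. In this toy Python model (the alleles, fitness values, and population size are all invented for illustration), no new mutations are ever introduced; existing alleles are merely reshuffled each generation and trimmed by selection, yet the pointed-wing combination comes to dominate:

```python
import random

random.seed(1)

# Three loci; "pointed wings" only with the exact A1/B3/C2 combination.
ALLELES = {"A": ["A1", "A2"], "B": ["B1", "B3"], "C": ["C1", "C2", "C3"]}

def pointed(bird):
    return bird == ("A1", "B3", "C2")

def simulate(generations=100, pop_size=500, advantage=1.5):
    """Return the final frequency of pointed-wing birds.

    Each generation, pointed-wing birds are 'advantage' times as likely
    to reproduce (the gardener trimming), and each offspring draws each
    locus from an independently chosen parent (free recombination).
    """
    pop = [tuple(random.choice(ALLELES[locus]) for locus in "ABC")
           for _ in range(pop_size)]
    for _ in range(generations):
        weights = [advantage if pointed(b) else 1.0 for b in pop]
        parents = random.choices(pop, weights, k=3 * pop_size)
        pop = [(parents[3 * i][0], parents[3 * i + 1][1], parents[3 * i + 2][2])
               for i in range(pop_size)]
    return sum(map(pointed, pop)) / pop_size
```

The combination starts at a frequency of roughly one in twelve (1/2 × 1/2 × 1/3) and, over a hundred generations, rises to dominate the population. The bush never grew; the gardener only trimmed.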

The harsher the environment is to the bird status quo, the faster this die-off and re-jiggering of the genetic population happens. The birds are “rapidly evolving,” but only in the sense of the herd being rapidly thinned. As another example, consider if we lined up every human being on the planet and killed off all of them who couldn’t run a mile in six minutes. We’d be left with a much smaller, much fitter population. But if we killed off everyone who couldn’t run a mile in three minutes, we’d be extinct. Doing it in stages wouldn’t help much. If we started with six minutes, and every ten years lowered the limit by ten seconds, we might find in 150 years that there were a number of people who could run sub-3:30 miles, but the human race would be only a few years away from extinction as the required time continued to drop. Again, you can’t trim the bush taller.

“Evolving” is a transitive verb

While it’s not technically wrong to say the birds are rapidly evolving, it’s more accurate and less prone to misunderstanding to say that we are rapidly evolving the birds. We have a history of doing this, and not just recently: we evolved dogs from wolves, we evolved cattle from aurochs (which we then eradicated), and the list goes on and on.

So don’t say that the birds are rapidly evolving. They aren’t hurrying ahead of us, morphing their genes to stay viable as we remake their world. We are changing their environment, and those least suited to the new situation are dying off. To paraphrase Darwin, the fittest birds are surviving, and they are surviving us.


A note about the use of the term “gardener”

In most instances where it appears in this entry, the “gardener” is us — mankind. In those instances where it isn’t, it is a metaphor for natural selection. Do not make the mistake of thinking this article supports intelligent design, or that I am in any way swayed by the temper tantrums put forth as arguments by its supporters. If you misquote me to support your falsehoods I will hunt you down and lecture you mercilessly.

The iPad Revolution: It’s 1984 All Over Again

Edit: Caleb Elston gets it.

A lot of people have called the iPad revolutionary. Some say it will change media consumption. Some say it’s a Kindle killer; others say it isn’t. Others say that “The iPad itself was something of a yawn, but the implications of [the A4 CPU] are not.” Still others say it’s a laptop replacement. They’re all missing the point: the iPad is the first fundamental change in human/computer interaction since Apple introduced the mouse/pointer/GUI back in 1984.

Media Consumption

People are hailing the iPad (or reviling it) as a media consumption device destined to save the publishing industry — in other words, not a full-blown computer. Although nothing can save the publishing industry, I admit this is what I thought the iPad would be. Before the announcement I envisioned replacing my aging laptop with the new Apple tablet, but keeping a Mac mini tucked away for when I wanted to do “real” computing. But I was wrong.

Apple made that clear by demoing iWork on the iPad. This device is not just for sitting on the couch and surfing while you watch TV. It’s for getting real work done. When Scott Forstall said there would be a new gold rush for application developers he wasn’t kidding, and he wasn’t hyping; he was putting developers on notice: every software niche is now up for grabs. Just as the migration from DOS to Windows and from the Classic Mac OS to OS X changed the software development landscape, so too will the expansion of the App Store, and developers with apps on the iPhone have a head start.

Kindle Killer

The iPad isn’t a Kindle killer; the notion is silly on the face of it: the Kindle is a single-purpose device and the iPad is a general-purpose computer. It’s like saying the iPhone is a Motorola Razr killer. The Razr has a very limited set of functions, whereas the iPhone can accurately be described as a computer that makes phone calls.

Amazon doesn’t release sales figures, but estimates are that it sold roughly half a million Kindles in 2009, and perhaps “millions” in total since the introduction in 2007. Compare that to estimates for the iPad of four million in the first year, and it’s obvious that Steve Jobs isn’t targeting the Kindle with the iPad.

Which isn’t to say that the iPad won’t have an impact on the Kindle. The trend over time is obviously toward a single device that does everything, and the Kindle is no exception. The iPad will marginalize the Kindle, but the dedicated e-reader will likely hold on until display technologies converge, possibly with the Mirasol display.

The A4 CPU

Certainly it’s amazing. Consider that the iPad has a 25 watt-hour battery and is rated for ten hours of continuous use. That means that in practice the iPad on average uses only 2.5 watts of power for everything: CPU, storage, and display. That’s an amazing achievement, but it’s not going to change the world. Good hardware lives in service to good software.

Laptop Replacement

Close, but misses the point. The iPad isn’t a laptop replacement, it’s a computer replacement. Every computer designed for human interaction is in the iPad’s sights. But it’s not just a question of hardware, as so many want to make it. Just as the mouse demanded a new interface to make it useful, so does a touchscreen. That’s why tablets have failed again and again over the last ten years: bolting a touchscreen onto the standard Windows (or OS X) interface makes about as much sense as adding a mouse to MS-DOS.

At the bottom of every Apple press release is the statement: “Apple…reinvented the personal computer in the 1980s with the Macintosh.” In a very real sense that’s true: nearly every computer in use today has a user experience that a Macintosh user from 1984 would understand immediately. Menus, a desktop metaphor, windows, all of these things have been in place for over twenty-five years. Apple hasn’t said it out loud, but the iPad is intended to be the next 1984: it will replace every computer that isn’t a server.

Don’t Believe the Infographics

During the presentation, Steve Jobs showed a graphic that asked: is there room for something between a laptop and a smartphone? The question implies that each of the three devices must be better than the other two in some way. Of course that’s true; otherwise why have a separate category?

For the iPhone it’s obvious: first, it’s a phone; second, it’s pocketable.

For the iPad, compared to the laptop it’s a combination of portability, affordability, the App Store, and the touch interface; compared to the iPhone/iPod Touch, it’s the fact that it will be a “real” computer.

But what is it for the laptop? At least initially there will be a need for the laptop (or a desktop): the iPad syncs to iTunes on another computer, for example. But does it have to be that way? Of course not. There is no reason the iPad needs to depend on its aging brethren. As the iPad progresses, the dependency will shrink, both because Apple wants it to and because users will demand it. Many people won’t own both an iPad and another computer, so any way in which those people are at a disadvantage initially will be a huge incentive for Apple to make the iPad independent.

Initially there will be whole categories of software not represented in the app store. But as the iPad gains traction, the software gap will shrink as developers leap to satisfy a market that within a few years will number in the tens of millions.

But, But…

There can be objections to this idea:

Every computer needs a physical keyboard. No, they don’t, and anyway, the iPad has one if you want it.

Every computer needs USB. Maybe, but there was a time when every computer needed a floppy drive. There was a significant outcry when the first iMac shipped without one, but it worked out. In addition, it’s important to remember that the iPad won’t replace regular computers overnight.

It doesn’t multitask. Well, it does, but only in limited ways. And the point is that, apart from playing music, how often are the apps on your computer actually doing something in the background other than waiting for you to bring them back to the foreground? Unless you’re applying complex transformations in Photoshop, or compiling code, or processing log files, or <fill in your special task here> you don’t need multi-tasking. Okay, maybe you do, but you’re special, and as the iPad matures there will likely be ways to meet your multi-tasking needs.

Any real computer needs a way for us savvy types to dig into the tech. So do you perform your own tune-ups on your car? Do you drive a stick shift? Bringing it back to computers, do you program? In assembly? If so, good for you. Likely the iPad will adapt to meet your needs; if the iPad really is successful at replacing the current user experience, then sooner or later people will need to be able to create iPad applications using the iPad. Remember that when the Mac was first released, you couldn’t program on it.

The app store is evil. Maybe, maybe not, but that won’t stop the iPad from being successful. And as its market share grows, so will the pressure on Apple to give up some control.

So if Apple is Playing the Part of Apple in this Re-enactment of 1984, Who’s Playing the Part of Microsoft?

It’s arguable whether it’s a good or bad thing that Microsoft ate Apple’s lunch through the 80s, 90s, and 00s. But they’re not likely to do it again; Microsoft has shown no talent for producing a compelling portable touch interface. Google/Android is the obvious candidate. There’s plenty of time for iPad competitors to arrive, although there are none at present. It remains to be seen whether Apple can avoid the mistakes that led to them not owning the desktop market in the 90s.

How Long Will It Take?

The mouse/desktop interface took somewhere around ten years to fully assert its dominance over the command-line. With replacement cycles being what they are, and the current lack of a full software catalog, it might take almost that long for the multi-touch interface to replace the mouse and desktop. It certainly isn’t going to happen overnight. There will be people for whom the iPad is their first computer. There will be others for whom it’s a replacement device; that will take several years to move through the marketplace. Still others will buy it as a supplemental machine, and it’s anyone’s guess how long it will take those people to give up their mouse.

But make no mistake: the mouse is an endangered tech species.

There is a special place in hell for web designers who put white text on a black background

Bad designer! No biscuit!

It makes my eyes hurt just thinking about it. I get so frustrated I want to grab a knife…and spread cream cheese on bagels and then taunt them with it: “No bagels with cream cheese for you, bad web designers!”

Fortunately there is a quick and easy solution, on my Mac at least. Command-Control-Option-8 reverses the screen colors, turning white on black into black on white. All other colors reverse as well, but it’s a small price to pay.
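The inversion is simple arithmetic on each pixel, which is why it rescues white-on-black but also mangles every other color: each channel just flips across its midpoint. A sketch:

```python
def invert(rgb):
    """Flip each 8-bit channel across the midpoint, the way a full-screen
    color inversion does."""
    r, g, b = rgb
    return (255 - r, 255 - g, 255 - b)

# White text on black becomes black text on white:
#   invert((255, 255, 255)) -> (0, 0, 0)
# ...but every other color flips too; pure red becomes cyan:
#   invert((255, 0, 0)) -> (0, 255, 255)
```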

And seriously, if you’re a web designer and you get the urge to put white text on a black background, don’t. Just don’t. If this keeps up I’ll have to sneak into your house and wipe Vaseline on your eyeglasses every night for a month. And neither of us wants that.