Spot any errors? Let me know, but unleash your pedant politely, please.

Tuesday, 9 March 2010

Bosses buy bad UX

User experience matters to users. In the enterprise, users usually have no say in the software they use. Those decisions are made by bosses, bean counters or 'IT'. UX may not even be on their checklist.

Most bosses won't be using the software. Not even email in many cases. They have a secretary/PA for that. This is not a criticism. It's an observation. It's just the nature of their jobs.

Accountants should look at the bigger picture, the total cost of ownership, but they'll have a hard time seeing beyond tangible, measurable costs.

In organisations where IT calls the shots, I don't think they have the best interests of the user at heart. Or rather, the UI that matters to them is the admin end: how easy it is to configure the system and maintain its users. End-user UX may not be of great concern. If it's really bad, they'll get swamped with support calls, but most likely they'll get no calls at all from disgruntled users who can work the software but hate it and bitch about it to their colleagues.

I can think of only one case where universal hatred of a product caused change in a large organisation. Said organisation moved from an ancient but functional email system, Teamworks, to Lotus Notes. Initially most people were quite excited to get a modern client. There were some initial problems. There was some initial dislike. This is normal. People don't like change. They'll grumble. They'll pine for the old, unless you give them a chance to try the old again: then they'll run from it screaming.

What's not normal is universal, long-term hatred of a new product. It took about two or three years of mass user complaint, but eventually, thankfully, it was dropped in favour of Outlook. The product became known as 'FLN' ("F*cking Lotus Notes"). The only person I've ever heard defend FLN worked in basement-level IT, made his own Roman clothing for re-enactments and knew his way around a twelve-sided die. He also derided users for not being able to use the unusable FLN. (He's a really nice bloke, but he's more Moss than Roy.)

Outlook is far from perfect, of course, but the UX improvement, compared to FLN, was astonishing.

I've worked on products with appalling UX: old-fashioned, like some Windows 3.1 app. I worked for one company, for a short time, that took pride in its applications looking shit. They drew attention to it in the brochure. They didn't call it 'shit' directly; they called it something like 'focused on utility, not on eye candy'. Anyone trying to make it look better, and thereby work better, would probably have got a formal warning.

The terrible thing? They were right. Looking shit was a genuine advantage for them. Some kind of negative reinforcement. The message was 'Hah! Look at the eye candy in [rival product]. We didn't waste time on that nonsense; we made our product work.' This marketing was successful because the people buying the product were amenable to the message, and they were not the ones who would have to use it day in, day out, for hours and hours.

When function and form are treated as mutually exclusive, it's often the user that suffers, not the bloke signing the cheques.

Friday, 19 February 2010

The iPad isn't enough simple*

The iPad will be a revolution in computing. It'll be your Nan's computer. It'll be the exercise book and textbook and exam paper in classrooms. It's the future of computing, blah, blah, etc, as others have said.

I already use my iPod touch for 90% of my leisure computing needs. I'd still need a laptop, but that could change, particularly if I admit to myself that no, I'm not actually ever going to get around to doing any proper development. I may never need a new laptop. For my father, who got his first laptop last year at age 69, an iPad would not only be perfectly adequate, it'd be simpler. It would be better. I'd considered getting a little Linux netbook for him, but decided supporting a MacBook would be a lot easier on both of us.

The iPad isn't a standalone device. It still needs iTunes running on a Mac or a PC. This is a considerable expense and complication. Just as a hi-fi's sound is only as good as its weakest component, the iPad will only be as simple as its most complex component. People/n00bs may not be scared of the iPad, but they'll continue to be scared of their computers.

If you've ever seen the episode of Gavin & Stacey in which Bryn explains the web to Gavin, I'm like Bryn, except that with my father in the role of Gavin, the patronising assumptions of ignorance are correct. That's how I have to explain computing tasks to my father, an intelligent, curious man. Remote support, on even the simplest of tasks, can be frustrating. I can't say 'just drag it to the Preview icon in the Dock'; I have to email him step-by-step instructions, with clearly labelled screen grabs (thank you, www.skitch.com, for making this easy).

The iPad comes close to fixing this, but the iTunes-on-a-complex-computer problem remains. It needs to be made simpler, and ideally cheaper too. Simpler is in everybody's interest. Cheaper is too, even for Apple. As Pogue said a few years ago: "Simple sells". I'd have had a hard time getting my father to accept buying a Mac mini and an iPad rather than just a MacBook. The MacBook was already twice the cost of the Windows laptops he'd seen advertised. A standalone iPad, or an iPad + iTunes magic box that was as cheap as a MacBook, would have been another story.

There's room here for Apple to make a bunch more money by reducing mandatory complexity.

Time Capsule runs an embedded version of OS X (AFAIK). I think Apple could take iTunes off the computer, make it client-server and have the server run on a Time Capsule. A Time Capsule is technically a computer, but it's abstracted; nobody thinks of a Time Capsule as a computer. Adding an iTunes server to it won't change that. Apple TV boxes could use this server too. Hell, Apple TV boxes could be the server.

Backup and sync via Wi-Fi would be nice. It would be a new thing for iPhones/iPods/iPads, but Time Machine backs up wirelessly from a Mac to a Time Capsule, and Apple TV syncs wirelessly with iTunes too, so it's certainly possible. I don't think wireless backup/sync is necessary right now, just that a hypothetical Time Capsule/Apple TV + iTunes server, with iPads no longer requiring Macs or PCs, will seem clunky without it.

Perhaps I'm thinking too small here. Accustomed to keeping data local. Perhaps Apple's facility in North Carolina is gearing up to be a massive cloud store for our entire media libraries. Who needs more than 16GB on an iPad when all your media is in the cloud?

* 'Simple' is a noun here. Apologies if that is irritating.

Thursday, 28 January 2010

The iPad

Well, everyone else is talking about it, I may as well add to the noise.

Q: Do I need one?
A: No.

Q: Do I want one?
A: I'm not sure.

Q: Would I use one?
A: Yes.

Q: Will I get one?
A: Probably.

A lot of people are puzzled by the iPad. They don't know what it's for. Although it sits in between an iPod/iPhone and a laptop, it also invades the iPod/laptop space. It's portable, more convenient than a laptop, but not pocketable. I see it living on the coffee table. The iWork apps aside, it's really a device for consuming rather than creating. I'm sure we'll see creative apps in time, but it'll be best for the kind that

Wednesday, 25 November 2009

Disconnection doesn't necessarily mean Broken Web App

I was wrong about something yesterday…

Peter-Paul Koch unambiguously sets this straight

Let’s debunk one argument. It is perfectly possible to write an offline Web app for Safari iPhone. The browser supports appcache and local storage for storing the application and its data, respectively.

Both Web apps and native apps can work offline. Thus this argument has no value.

I still think web apps suck. They suck on a desktop. Run them on a phone, with a fraction of the clock speed, a fraction of the cores and a fraction of the RAM, and they're going to suck an order of magnitude more.

Tuesday, 24 November 2009

iPod Touch, iPhone, Native vs Web Apps and pinch of ChromeOS

I'm prompted to write this by John Gruber's iPhone Web Apps Alternative post.

I pretty much agree with Gruber on this, but he neglects to consider the iPod Touch. Most apps, unless they use the camera, compass or GPS, are iPod Touch apps too. Apple screwed up by not having a common name for the two devices (No, I can't think of a good one), and I don't think they expected or planned for the iPod Touch to be such a success.

Many apps require connectivity to work because their data is in the cloud. Most of the apps that rely on data in the cloud could be written as web apps. They would probably be slow and suck, and suck more life out of the 3G network, but they'd do the job. It's the difference between www.twitter.com and Tweetie or Twitterrific. Twitter sucks in a desktop browser. It sucks really badly in the mobile-optimised version on the iPod/iPhone. (Thinks: "iApp" as a common name?)

An app that doesn't need data from the cloud should be written as a native app. It'd be faster, suck less, and not require any Wi-Fi/3G bandwidth at all. Most importantly, it would work on an iPod Touch where there was no accessible Wi-Fi (and on an iPhone where there was no 3G or EDGE).

An app that lives in the cloud and uses data from the cloud is sometimes necessary: a webmail client, for example. These still suck compared to native clients, but needs must when the devil vomits into your kettle.

I have no problem not using Mail, Twitter, Safari, NNW, Facebook, LinkedIn etc. when I'm on a bus or a train. Not being able to play Bpop or boxed-in or Bones or Reversi because of no connectivity makes no sense whatsoever.

Apps and data in the cloud break when connectivity breaks.

It's why ChromeOS makes little sense to me.

Friday, 30 October 2009

Defying Gravity Series 1 Episode 2 'Natural Selection'

I'm really rather enjoying this series so far. I'm kind of waiting for it to jump the shark, which it seems in danger of doing quite early on if they're not careful. That'd be a shame. There are unscientific/supernatural things in the series, and that's fine, it's TV, they're necessary and entertaining devices. If it doesn't need to be wrong though, it really needs to be right.

In episode 2 ('Natural Selection'), one of the astronauts performs an experiment that she describes as 'natural selection'. Anyone with a passing knowledge of evolution knows immediately that it's bullshit TV science rearing its ugly head yet again.

An external actor selecting randomly is about as far from natural selection as you can get. The survivors won't be the fittest or most appropriate; they'll simply be lucky.

Two things required for evolution are (random) mutation and natural selection. What the stupid experiment in Defying Gravity does is 'random selection'.
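The difference is easy to demonstrate with a toy simulation. This is a sketch of my own devising, nothing from the show: a population of numeric 'fitness' values where, each generation, half the population survives and reproduces with a little random mutation. The only thing that changes between the two runs is whether the survivors are the fittest half or a randomly chosen half. All the names and parameters here are made up for illustration.

```python
import random

def evolve(population_size=200, generations=40, selection="natural", seed=42):
    """Evolve a population of numeric 'fitness' genomes.

    Each individual is a single number; higher means better suited to
    the environment. Every generation, half the population survives
    and each survivor leaves two offspring with a small mutation.

    selection="natural" -> survivors are the fittest half
    selection="random"  -> survivors are a random half (the show's version)
    """
    rng = random.Random(seed)
    population = [rng.uniform(0, 1) for _ in range(population_size)]
    for _ in range(generations):
        if selection == "natural":
            # Nature culls the least fit: keep the top half.
            survivors = sorted(population, reverse=True)[: population_size // 2]
        else:
            # The show's 'experiment': cull at random, fitness irrelevant.
            survivors = rng.sample(population, population_size // 2)
        # Reproduction with random mutation (small Gaussian nudge).
        population = [parent + rng.gauss(0, 0.02)
                      for parent in survivors for _ in range(2)]
    return sum(population) / len(population)

natural = evolve(selection="natural")
lucky = evolve(selection="random")
print(f"mean fitness after natural selection: {natural:.2f}")
print(f"mean fitness after random selection:  {lucky:.2f}")
```

With natural selection the mean fitness climbs steadily, generation after generation. With random selection it just drifts aimlessly around its starting value: no adaptation, only luck.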

Darwin's finches are the result of evolution by natural selection. The ones that remain are best suited to their environment.

The pigeons he studied are the result of selective breeding: taking two birds with particular characteristics and letting them get it on (repeat to freak). Same with domestic dogs. Their suitability is not governed by nature but by man. This unnatural selection yields significant changes quickly.

I have a very basic understanding of this stuff, gleaned from TV and from getting halfway through The Blind Watchmaker ten years ago. If I were writing a scene with this stuff in, I'd run it by a biologist. I'd check. I'd really not want to look this foolish on TV. And that's just me. One person. Nobody. Did not one single person notice, while this was being written and rehearsed and recorded, that it was bullshit? What astounds me about this Defying Gravity nonsense is the carelessness.

If you work on this show and told your boss that this scene was bullshit, show them this blog. Tell them I said they were idiots and that they should listen to you next time. People notice this shit. Mmmkay?

Saturday, 15 August 2009

Filmmakers and musicians having a “credit score”

I often think about writers and filmmakers and musicians having a “credit score” with my bureau. The currency is the amount of faith I have that a certain project will be great or even good. Some creators rate so low that if the currency were actual money, I won’t loan them $50 if they left behind a $100 bill as collateral. Whereas the Coen Brothers could tell me “Our next movie is going to be a static, 90-minute shot of a bowl of Cheerios getting soggier and soggier over time” and I’d still mark Opening Day on my calendar.


So says Andy Ihnatko.

He's right. This is how I feel about Tarantino. I can't imagine him ever making a bad movie. I sometimes worry that I'm blindly accepting of anything that he does, but really, is that such a bad thing?