Economist Brian Easton muses on the user-unfriendliness of many computer interfaces

This is a re-post of an article originally published on pundit.co.nz. It is here with permission.


Windows 95 is famous for requiring you to shut down the system by clicking ‘Start’, like stopping your car by turning the ignition key on. Why are so many interfaces so user-unfriendly?

The Covid app for registering your entry to premises can be so clumsy. Sometimes I have signed in, sat down and ordered drinks by the time my companion has got the bloody thing to work. Its coverage is scattered. Many people do not carry a phone. Some do not use apps, or the apps do not work on their phones (even if their users think they do).

I could now go on to discuss the difficulties the government has had with managing the Covid crisis. Many are keen to criticise destructively. My view is almost the opposite. The initial lockdown was a dream, but it misled us into thinking the task was easy. We cannot expect governments (or businesses) to implement complex regimes without making mistakes and overlooking the obvious – or the ‘obvious’ with hindsight.

Public criticism may actually be helpful by pressuring the bureaucracy to do better; its whining tone may not be. The fact is we have done very well. I get no pleasure in observing that each day the state of Victoria, with roughly the same population as ours, has more deaths than we have community cases. I pray that our defences continue to hold.

However, the lesson this column wants to draw attention to is that the poor design of this app is not an isolated case. It reflects poor quality arising from geeks who may be technically very sophisticated but do not think about the circumstances of much less technically savvy users, who also differ from them in a variety of social characteristics.

The problem is not confined to the public sector. I offer the example of Microsoft Office, having recently had to upgrade my version (when I ran out of room). I won’t bore you with my troubles (and you in turn do not need to chip in with yours). To simplify: compared to the previous version, the new one is much more complicated. I have to click twice as often to get anything done, and it often does things for no explicable reason. While there are some better features, including the additional capacity, it could be made much more user-friendly.

Sometimes it is so confusing I google for advice. It is usually there but written on the assumption that I know as much as the geek who wrote it; often the vocabulary is opaque and the constructions contorted. (A friend was expected to download a 40-page manual to understand her new phone.)

When I was thinking about this dysfunctional design, I was struck that both examples were monopolies – one public, the other private – and there was a sense that they were not accountable to their users. But then I thought of the huge number of poorly designed – typically very fussy – websites whose owners are hungrily competing for attention. Not only are many not user-friendly, but the difficulty of using them means one looks elsewhere – customer lost.

Presumably the senior executives never go near their websites and they never ask their grandmothers to use them either. They would be told ‘I am sorry dear, I cannot make any sense of it, but it does look nice’. In some instances I have discussed my problems with someone who works for the agency, to be told that the website baffles them too.

The problem of failed digital design was explained to me by my son as follows:

‘To make a good product in the technology industry you need good understanding of people, good understanding of visual design and a good understanding of technology. These attributes rarely exist in one person, so a team must be formed whose skills & experiences collectively cover the three.

‘Too often, internal or external pressures can result in one of these three pillars being elevated over the others – resulting in products that look beautiful, but are unusable, or are easy to use but technically unstable or insecure.

‘A common issue as the project progresses is that the focus shifts from the end users to the end project (i.e. just getting the bloody thing over the line). Organisations can create their own internal groupthink, frameworks and languages, which are applied to their own external product without thought for the audience's understanding. Think of coming across an acronym in an organisational report which only makes sense to people inside the organisation.

‘Even internal personalities can affect the outcome, based on who is in the driver's seat and what they see as important. One US company managed to severely damage their market position when a single stubborn engineer successfully badgered the entire team to drop their market-leading technology platform for one the engineer had designed (which turned out to be disastrous in the real world).’

(Tama elaborates these issues here, here, here, here and here.)

To be reflective, a major challenge for any columnist is how to engage with the person reading the column. It is particularly difficult for one like me, who is often dealing with very technical and challenging material. It is simpler to choose the easy path of sloppy analysis and platitudes. Readers (or audiences) love it. But all that happens is that their prejudices are confirmed and the national conversation stagnates. How to keep the reader comfortable and stimulated is quite a challenge, but a far more valuable one.

In writing, a key player is the subeditor. (Incidentally I am amazed at those who can correct copy almost as fast as I can read it, and then discuss the contents.) My impression is that geek designers do not have the equivalent and that those who commission them rarely look at the final product. They certainly do not get it adequately tested by potential users or their ‘grandmothers’.

I am not totally pessimistic. There are highly technical design teams, like the group which Tama works for, which produce user-friendly interfaces.

Why the competent do not clean out the market is a puzzle. Perhaps where there is competition, inertia means it takes time. Perhaps design teams are limited in size and so quality businesses cannot scale up.

Monopolies are another matter. I despair of Microsoft and other businesses which dominate their markets. I rail against equivalent public failures. I wish my political representatives gave us more support. With luck we will have a user-friendly Covid-monitoring identification system by the time we get to the next outbreak.


Brian Easton, an independent scholar, is an economist, social statistician, public policy analyst and historian. He was the Listener economic columnist from 1978 to 2014.


29 Comments

Speaking from the position of having an IT degree and a postgraduate diploma in quality systems: your son has pretty much nailed it. Really smart people often lack humility and consequently think they know better than others (the engineer example used). This can also take the form of tribalism when expanded to a group, and geeks are really, really good at this.

As you have an IT degree, Murray, you may be able to answer a question for me.
Could a bluetooth "handshake type" track/trace app be downloaded to a device (eg smartphone) by an external agency without the device owner's input or permission? Just curious
I've been told it's "technically impossible" but I doubt they have a background in IT.

Depends. With our current understanding of the OS, protections on the phone are the most important barrier to stopping it, like a high-grade antivirus app such as Norton. But there are probably many who do not have these, or do not even activate all their phone's protections. The vector in is the vulnerability area, like a virus. I doubt just a blanket download could happen, but a software update to an OS could create the method to do that without anyone knowing. Do you know all the capabilities of your phone or computer operating systems? I doubt many do. Do you update your phone's OS when told there is an update available? Most do. The capability may already be there.

'Technically impossible' is a big statement. After all if your phone is turned on, it can be tracked, why couldn't it be doing something else too?

Interesting. Thanks for the reply. Very useful to get a qualified answer.

The point is telcos in NZ could not do this without help from Google, Apple, and likely also the cell phone manufacturers. They won't help NZ telcos do this, hence it is technically impossible.

I also have an IT degree btw.

That's a fair comment Lanth, but your answer is also more about politics than technical capability. From a strict programming perspective, with a good understanding of the OS, I don't see much preventing it from happening.

Sure, someone (or some company) could achieve the stated request, given enough resources and willpower (let's just ignore all the problems like "what do you do when the device doesn't have any spare space left" and "what do you do when the installation goes wrong for some obscure reason (maybe the device is rooted and running custom apps that interfere?) and you've now made someone's device unusable" – the sorts of reasons it often takes manufacturers 12+ months to customise an Android upgrade for their own hardware).

Those people and companies don't exist in New Zealand and we don't have the resources / leverage to incentivise them to do it for us. Thus technically impossible for NZ to achieve the stated request.

Has anyone really looked at whether the computer revolution has been as good as it said on the tin? Over the last fifty years Western productivity growth rates have been in decline. Did complexity just expand to fill the space available?

We went from an age of supersonic commercial flight for those that could afford it, to slower and much cheaper flights for all. We also went from being thinner to being fatter. The Boeing 747 and McDonalds conquered the world.

Did computers mainly enable monopolistic (or oligarchic) institutions, both public and private?

What happened to the successor to Concorde?

Interesting question Roger, essentially are we better off knowing more, or is what we think we know being manipulated, and therefore are we worse off? Complex question - how do we determine what is right/correct v what is not?

Come forth the conspiracy theorists!

The more we learn, the less we know.

The Concorde question sort of popped up when I read a breathless article about a hypersonic (oooh) commercial airliner that might get built, one day. My mate's father was working on that when I was at school. Why don't we already have them? My mechanic says new cars are not built as well as 1990s ones. What is going on? It suggests resources have been poorly allocated. Death by financialisation, computerisation and politicisation of decision making?

Clay Christensen's YouTube lectures on financialisation's negative impacts on companies and society are a very interesting watch in this regard.

The more we learn, the less we know we know. I started my work in banking in the early sixties. No computers. No calculators. Just Facit and Burroughs mechanical adding machines etc. So every day, all day: longhand ledgers and balancing. Slow and laborious, but boy did we learn about accuracy and teamwork. If that was not done satisfactorily, you would be on the late bus home. These days, if the question is not on the screen, the answer will never be known.

Roger,
Re the decline in productivity rates, I am looking at Jerome Powell's recent speech in which he said this: "More troubling has been the decline in productivity growth, which is the primary driver of improving living standards over time". In the speech notes, he writes, "Between 1995-2003, business-sector output per hour increased at an annual rate of 3.40%, and has risen only 1.40% since then. Fernald (2015) suggests 2003 as a break point for the beginning of the productivity slowdown".
Unfortunately, there was no analysis of these figures.

Y'day I listened to Microsoft's top execs (eloquent Indian chaps) wax lyrical about open source, data privacy, customer experience and seamless connectivity in a lengthy segment on MSNBC. Honestly, it sounded like a string of buzzwords I could read in a LinkedIn sales pitch (if it weren't so convincing).

Microsoft and open source in the same sentence - lol - trust your gut on that one.

sometimes it's the things you think you know that just are not so.
https://www.zerohedge.com/markets/your-money-gone-all-gone-how-softbanks...

Let's have a 'system' where the retailers have to print out a QR code. I don't know anyone managing a retail space, but I wonder what guidance beyond this is given.
Despite all the talk of social distancing, the QR code creates a pinch point at the door. And a really good way to slow down the queue of people trying to scan and get in is to laminate the code behind something shiny – for added effect, behind a glass door, with plenty of sunlight behind the shoulders of the people trying to scan it.

The QR codes should be at entrances. But nothing is stopping companies putting them elsewhere. I've seen them at checkouts at supermarkets and at registers at restaurants. Nothing stopping restaurants having them on every table.
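For the curious, the sign-in step this thread is discussing is mechanically simple: the poster on the door is just text inside a QR code. A minimal Python sketch of the widely reported NZ COVID Tracer payload format – a 'NZCOVIDTRACER:' prefix over base64-encoded JSON describing the premises – follows. The field names used here ('opn' for the premises name, 'adr' for its address) are illustrative assumptions, not an official specification.

```python
import base64
import json

def decode_tracer_qr(payload: str) -> dict:
    """Decode an NZ COVID Tracer-style QR payload.

    Assumes the commonly described format: a 'NZCOVIDTRACER:' prefix
    followed by base64-encoded JSON. Field names are illustrative.
    """
    prefix = "NZCOVIDTRACER:"
    if not payload.startswith(prefix):
        raise ValueError("not a tracer QR payload")
    raw = base64.b64decode(payload[len(prefix):])
    return json.loads(raw)

# Build an example payload (hypothetical premises) and round-trip it.
premises = {"opn": "Example Cafe", "adr": "1 Example St"}
encoded = "NZCOVIDTRACER:" + base64.b64encode(
    json.dumps(premises).encode()).decode()
print(decode_tracer_qr(encoded)["opn"])  # prints Example Cafe
```

The app's job, once the camera has read that string, is just the decode step above plus a timestamped diary entry – which is why the friction people complain about is almost entirely in the scanning and UI, not the data.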

To be honest, it's a fairly clumsy interface, but it has to be for utility – an NFC scanner or bluetooth device is not practical. I don't know why they don't just leverage the GPS mapping – oh, that's right, civil liberties. Yet it's an opt-out for Apple and Google, and yet... Pokemon Go. Go figure.

Why the competent do not clean out the market is a puzzle. Perhaps where there is competition, inertia means it takes time. Perhaps design teams are limited in size and so quality businesses cannot scale up.

Or fundamentally, because computers and software are sophisticated and can do many many things, this complexity needs to be hidden from the user so it stays out of their way, but also accessible when they need it.

Compare Microsoft Office to a typewriter in terms of what functionalities each offers.

Coming up with ways to hide the complexity but make it available is a difficult job, and it's always going to require the user to learn how to use the tool if they want to do more than the basics. That's just the nature of using complex tools.

Until computers can read your mind - and even then you're likely to have to think in specific ways or terms the computer can understand - you're just going to have to get used to putting in effort to use these very powerful tools at your disposal.

oh, it appears I managed to come up with the same comment as you, just below.
Spot on, by the way.

I'm pretty tired of people bleating "make it easy for me!!!". So the question to ask is "what is 'it' exactly?" and then "how do you propose I make 'it' easy for you, without inconveniencing other people?".

We don't yet have systems that can read your mind, so you have to convey your intent to the computer system. This requires you to do at least some of the work of converting your intent to actions to drive the computer to get the output you want.

If you think using software is hard, try writing it.

It is like English speaking people who go to a foreign country and are aghast that they do not speak English. You need to learn the language if you want to live there :).

I agree with your son, but I think when this knowledge is wrapped up inside another article and discussion you tend to miss a huge point.

That is, technology has enabled possibilities and complexity in our lives unlike anything that ever existed before. Of course trying to use this technology is going to be hard. Life, and learning, are hard.

Compare tech to navigating a library, store, bank. Or more accurately, when you visit 50 different libraries in one day. In your day to day you might interact with 50 computer or technology systems that all share some ideas but they have to invent many new approaches to solve the interesting (trivial) problems they have found.

Unless, that is, they take away the possibilities and leave you with the bare minimum, à la Mac OS, the iPhone, your 1990 TV remote. An Office 2021 edition with 25 years of features removed would be a dream to use.

(Maybe one thing AI will actually solve is knowing what to take away and what complexity to leave, in the world of user interfaces)

The reason that there are few changes in, and ever-increasing complexity in, UI and the squillions of libraries sitting underneath them to make 'em work, is well-known to those in the trade.

Incumbent Inertia.

So middling-to-poor functionality is just too hard to change - the cement has long since been poured around the base concepts involved - and when sizable market share is considered, there are no good commercial reasons to move those concepts.

The mouse has been around since 1964, when Doug Engelbart invented it; the Xerox Alto of 1973 pioneered the windowing UI; and the now-ubiquitous SQL database was developed at IBM in the early 1970s, building on Edgar Codd's relational model.

These roots go deep.....