I hate modern algorithms in my life. The only time I notice them (and I’m sure there are a lot of good ones I never notice) is when they’re limiting my choices in the name of personalisation. When I visit YouTube, for example, I’m flooded with junk I don’t want to watch. How can they regard this ability to hide interesting videos as an asset? I see the same on Netflix or Amazon. Amazon shows me the exact same thing I’ve just bought, or others like it. How many boxes of marzipan should I give my sister as a Christmas present? 1 or 20? Netflix tries to offer similar things in categories, but the categories are clogged up with videos I have already seen: it thinks that these are more relevant to my watching habits than something entirely novel. I’m trapped in a bubble: my last video looked like the one before it, so I’m offered the chance to watch the one I watched before, and then re-suggested the second one, ad infinitum.
So I do. I re-watch good things and bad. I watch one marble run video on YouTube, because I really wanted one when I was a child, and then my homepage is flooded with dozens of marble races (strangely entrancing though they are). The default setting is too tiresome for me to change. I wouldn’t even know how to search for something interesting.
Defaults affect us the world over. Someone offers you a free trial, but then wants your bank details. I remember that the first time this happened to me I was shocked. “Why do you need bank details if it’s free?”, I asked. So that they can set up a subscription and it’ll run unless I cancel. I ran away from that; I didn’t want a subscription. I wanted a free trial. My youthful naivety is long gone now: we’re all accustomed to such sales tactics. It’s just the way the world works. But is it the best way?
Imagine all the people having money and time leeched from them because they’ve lost track of all these defaults that they need to change… changes that are made deliberately difficult. I’ve complained to numerous companies over the years: van rentals, cancelled flights, etc. No complaints process was easy. I can send the complaint, but then I get a robotic reply that doesn’t address the problem… we have to correspond, time goes by, nothing happens, I have to keep pursuing it, I forget… and then my complaint has gone away. No refund, bad service justified.
This is just another way that people try to control our decisions. It’s not advice or nudging: it’s outright manipulation.
So what would be better? Well, if we return to Amazon or YouTube, it’s easy to imagine what better would look like. Amazon should know when I’ve ordered something and remove it and similar items from display. Maybe it works this way to make money: lots of gullible idiots buy another copy, just in case, or because they forgot they’d already bought it. Or, in some rare cases, because they genuinely could use a similar item at the same time. And then, just when Amazon does forget that I ordered something, that is when it should be remembering: when I’ve had a chance to use up, break or lose the last order, or to decide it wasn’t good enough. That would be helpful. And if I don’t want something, it’d be lovely to tell Amazon to forget that I ever even looked at anything like it: to remove the high ranking it has for me.
I need something similar on YouTube. The flood of videos about Xbox games was a waste of space: it arrived because I once made the mistake of watching a video I thought was a general comment about computer games. I don’t own an Xbox. But I watched another one from that channel, so I must be interested! Well, of course I watched another one. I was looking for some mild diversion while chopping vegetables, and that was all YouTube was showing me. I need a way of deactivating the ranking for those videos: of down-rating them, to say ‘that was just noise, don’t treat it as a hugely meaningful signal’. But without saying ‘this is something I would never watch: never show me this again’.
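If I were to sketch that in code, it might look something like this, in Python (every name here is mine, invented purely for illustration; it bears no relation to how YouTube’s system actually works):

# Three kinds of feedback about a watch event, instead of the current
# all-or-nothing signal: full interest, mere noise, or a hard block.
from dataclasses import dataclass

FEEDBACK_WEIGHTS = {
    "meaningful": 1.0,   # "yes, this reflects a real interest"
    "noise": 0.1,        # "I watched it, but don't read much into it"
    "never": 0.0,        # "don't recommend anything like this again"
}

@dataclass
class WatchEvent:
    topic: str
    feedback: str = "meaningful"  # the default, as now: every view counts fully

def interest_score(history, topic):
    """Sum the weighted evidence that this viewer cares about a topic."""
    return sum(FEEDBACK_WEIGHTS[event.feedback]
               for event in history if event.topic == topic)

history = [
    WatchEvent("marble runs"),
    WatchEvent("xbox games", feedback="noise"),  # the chopping-vegetables viewing
    WatchEvent("xbox games", feedback="noise"),
]
print(interest_score(history, "marble runs"))  # 1.0
print(interest_score(history, "xbox games"))   # 0.2, not 2.0

Two noisy views add up to a fifth of one genuine one, instead of twice as much: still visible, no longer dominant.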
What I need, in fact, is a tree structure. Broad categories, all available, which open into subcategories, like a folder system on an old computer. Only, because we’re modern, we can use metadata and have some things show up in multiple places, perhaps with different rankings. An item might be 100% comedy, 50% news. So when I go to news, I might find pure news sources first, but see the option of a satirical report.
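The weighted-membership half of that idea (leaving aside the nesting) is only a few lines. A toy version in Python, with a catalogue, titles and weights all made up for the sake of the example:

# One item can sit under several categories, each with its own weight,
# so satire surfaces under "news" but below the pure news sources.
catalogue = [
    {"title": "Evening Bulletin",   "tags": {"news": 1.0}},
    {"title": "The Satirical Week", "tags": {"comedy": 1.0, "news": 0.5}},
    {"title": "Stand-up Special",   "tags": {"comedy": 1.0}},
]

def browse(category):
    """Everything tagged with a category, strongest membership first."""
    matches = [(item["tags"][category], item["title"])
               for item in catalogue if category in item["tags"]]
    return [title for weight, title in sorted(matches, reverse=True)]

print(browse("news"))    # ['Evening Bulletin', 'The Satirical Week']
print(browse("comedy"))  # ['The Satirical Week', 'Stand-up Special']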
I don’t want a helpful machine to learn that when I look for news, I really mean comedy. Maybe I really mean news but nothing was interesting enough, or maybe I’m trying to be more informed but the comedy seemed diverting enough this time. In essence, what I say is what my thinking, conscious brain has decided, and what I do is that decision corrupted by temptation, emotions and the limited options available. If algorithms learn to bypass my thinking free will and feed the beast of my emotional, whimsical self, then they are not helping me: they are destroying what makes me ‘me’.
Netflix does the same. I can’t simply say ‘reduce the ranking of things I’ve watched by 90%’, or ‘nothing I’ve watched before should be visible before scrolling’. Instead, 90% of my screen is full of things I’ve seen, helpfully displayed as most similar to the things I’ve watched. I assume that this is to hide the lack of content, and because they have found that when faced with the choice of re-watching something or spending 30 minutes searching for something new, people who have decided to watch Netflix re-watch something. So Netflix gets viewers, and can tell itself that it is adding value to their lives.
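Those two rules are the sort of thing I could write myself in ten minutes, if only anything would accept them. A sketch in Python, sitting on top of whatever scores the platform has already computed (the function names and the whole interface are invented; Netflix offers nothing of the sort):

def discount_watched(recommendations, watched, discount=0.9):
    """Rule 1: reduce the ranking of anything I've already watched by 90%."""
    adjusted = [(title, score * (1 - discount)) if title in watched else (title, score)
                for title, score in recommendations]
    return sorted(adjusted, key=lambda pair: pair[1], reverse=True)

def fresh_before_the_fold(recommendations, watched):
    """Rule 2: nothing I've watched before is visible before scrolling."""
    ranked = sorted(recommendations, key=lambda pair: pair[1], reverse=True)
    fresh = [pair for pair in ranked if pair[0] not in watched]
    seen = [pair for pair in ranked if pair[0] in watched]
    return fresh + seen   # watched titles keep their order, but only after every new one

recommendations = [("Rewatchable Hit", 0.95), ("Something New", 0.60)]
print(discount_watched(recommendations, watched={"Rewatchable Hit"}))
# 'Something New' now outranks the heavily discounted 'Rewatchable Hit'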
But there is no option to customize for myself: to write my own algorithm, to tailor it to reduce temptation, to help me live the way I want. I must submit to someone else’s choices; to the tyranny of how they choose to interpret my history and desires, all based on the assumption that their calculations can understand me better than I understand myself, and that I will never change nor want to change. The only path is deeper into the rabbit hole.
That’s how defaults are set up at the moment: traps for the unwary. We must devote cognitive attention to escaping them, and if we do escape, it’s usually extreme: complete removal or deletion of the option.
Humans don’t work like that. There is noise in our decisions, and gradual shifts in our choices. I might watch one thing because it seems like an appropriate diversion right now; or because a friend is visiting and chose it; or because it was talked about at work, or any of a range of reasons. I might want to watch one or two on a subject, but not hundreds; and I might also want to watch things on other subjects.
Some things are only just acceptable; others are brilliant. But the binary information of whether I do something or not doesn’t reflect that. I’d do brilliant things more, but sadly they are rare. So I drop down to mediocre things. If that’s a sale for whichever salesman or company I get it from, that’s fine for them. More of that, please: get me onto something that’s good enough, and save the time and effort that might have been wasted on getting me something great. Soon I’ll come to accept that good enough is all I can get.
I can’t let this discussion go without Facebook, which thinks its algorithm is so hugely valuable. Sure, it might help serve advertisements, but it turned Facebook from a brilliant way to see what every friend I had had done recently into a mediocre way to see what the friends I spend most time talking to had done recently… as if I needed that help.
It still passes my threshold for use: nowhere else has any of my friends’ activities and thoughts displayed for me to engage with and maintain some social contact. But even when I turn this much-vaunted algorithm off, it looks very much like 480 of my 500 or so friends never do anything, until I look directly at their pages. That’s not how I want to arrange my life: limited to as few contacts as possible. I want to pick my interests myself.
Maybe if I repeatedly click through a sequence of folders, I might want a shortcut to develop, as happens at work: Windows gives me five recently used folders, but only includes ones that I use at least a few times. They’ve realized that there’s noise in my activity. My neurons do the same thing: connections are strengthened or weakened over time, not created or severed.
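That gradual strengthening and fading is simple enough to write down. A toy version in Python (mine, not Microsoft’s: I have no idea how the real recent-folders list is computed, and the half-life and threshold are plucked from the air):

import time

# Each folder keeps a score that decays over time and gets a bump on use,
# so connections strengthen with repetition and fade without it,
# rather than being created or severed outright.
HALF_LIFE_DAYS = 14   # how quickly an unused folder fades (assumed)
MIN_SCORE = 2.0       # roughly "used at least a few times recently" (assumed)
SHORTCUTS = 5

scores = {}           # folder path -> (score, timestamp of last use)

def _decayed(score, last_use, now):
    age_days = (now - last_use) / 86400
    return score * 0.5 ** (age_days / HALF_LIFE_DAYS)

def record_use(folder, now=None):
    now = now if now is not None else time.time()
    old_score, last_use = scores.get(folder, (0.0, now))
    scores[folder] = (_decayed(old_score, last_use, now) + 1.0, now)

def shortcut_list(now=None):
    now = now if now is not None else time.time()
    current = {folder: _decayed(score, last_use, now)
               for folder, (score, last_use) in scores.items()}
    frequent = [folder for folder, score in current.items() if score >= MIN_SCORE]
    return sorted(frequent, key=current.get, reverse=True)[:SHORTCUTS]

A single visit leaves a folder below the threshold; a few visits in a fortnight push it over; a fortnight of neglect lets it sink back, with nothing ever deleted outright.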
But even if algorithms learn to adapt properly: to deal with my interests as a network, with hub subjects and spoke subjects, all of which I want to be available, rather than the hub drowning out the spokes I already had, let alone potential new ones; even then, they can’t replace free choice. Humans thrive on a wonderful combination of self-direction and spontaneity. Shops are wonderful because we see things we never thought about. That’s why people love Tiger or Pod, or department stores: for the novel knick-knacks that ruin the environment but stimulate the mind. Algorithms, as they deal with me, would take my passing glance at a novelty pencil and teleport me to a pencil shop, smugly proclaiming that they were being helpful.
The same is true in creativity. We need those infamous water-cooler moments, where two ideas that do not usually meet collide. How can an algorithm, based on history, ever hope to create such truly stimulating and productive moments? The actions of chance created evolution and the remarkable panoply of species that we’re killing off today. Chance moments spark people’s minds, bringing meaning and progress. Our choice systems are instead designed to create ruts and dig us deeper into them. It would be better to help us never dig a rut. Maybe when I’ve watched things, or bought things, it would support me better to be deliberately shown something new!
The people who help us most are civil servants. Because they are rightly careful not to be illiberal, every nudge comes with requirements to give information and the right to opt out. We can opt out of pension schemes, but we don’t. Evaluating them and then filling in the paperwork is too much of a burden. Most of us can’t spare the time (and we all know that the administration of life has been gradually outsourced to the individual, while working hours and the demands on our time have increased).
In summary, algorithms are designed to dig us deeper into whatever hole we start digging for ourselves, because it looks more profitable to keep us hooked on whatever just about passed our threshold for interest in the first place than to help us out of our holes to search for new places to explore by ourselves. They contract our worldview to whatever is easiest to give us that we will still accept. They do not expand our horizons, helping us to see further and better and to be more discerning. They make us small-minded, like the idiot programmers (or PR teams) who call them progress. We should be creating tools to make us better, as Steve Jobs said he did.