Algo-calyptic Armageddon

There seems to be hysteria these days around AI and algorithm-based consumption. The algorithmic economy threatens to bring into the realm of the tangible that totem of economic dogma called the "invisible hand." Computers are finally linking --through big data-- individuals' behavior with algorithms that sort them, allowing machines to predict our reactions: the "invisible hand" materializes(!). Nothing like predicting human conduct to make money. That damned watch I dared to search for three weeks ago, in a moment of stupid leisure, has haunted me everywhere I go on the internet. Please stop it, Google. I'm not buying it.

The idea that we are just soft machines is a fascinating one. It feeds the collective dread of an algorithmically dominated Armageddon, making people fret about a time in the not-so-distant future when we will decide nothing for ourselves, when everything will be dictated by an algorithm that surreptitiously creates the illusion of freedom. Free will will disappear. Wacky futurists scare us with their visions of bio/AI-powered "people" and such.

But will it? Will algorithms eventually be able to decide for us, directing Capitalism 2.0 corporations in this apocalyptic world to produce "X" or "Y" to please our insatiable consumerist selves?

To accept this as truth, you need to rely on a very heavy assumption: that we are "modelable"; that our behavior can be hard-coded into a series of mathematical formulas that mimic and --most importantly-- presciently announce our reactions to specific stimuli. In other words, that we are rational. (HA!)

The unending litany of financial bubbles that populates human history should alone be the crushing data point that exposes this assumption as ridiculous. If anything, algorithms only exacerbate human irrationality; they don't take control. In the end, it's always a not-so-smart human who decides to act on his innate impulses and brings everything crashing down. After all, algorithms --being the distilled interpretation of human behavior as devised by a human-- can only filter and dissect behavior as perceived by those who created them, who happen to suffer from the same biases as the subjects they arrogantly attempt to catalog. The same heuristics that rule the brains of consumers dominate those of programmers. So, when Myron Scholes and Robert Merton devised their formula to price options, they relied on (incorrect) assumptions that fit their limited perception of human rationality (namely, that big movements in markets are very, very rare), ultimately leading a bunch of blindsided traders into a financial crash that, 15 years later, is still the subject of postmortem analysis.
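To make that "big moves are very, very rare" assumption concrete, here is a quick back-of-the-envelope sketch (plain Gaussian arithmetic, not the actual option-pricing formula, and assuming the usual convention of 252 trading days a year) of how often a market with normally distributed daily returns should be expected to produce an extreme day:

```python
from scipy.stats import norm

TRADING_DAYS_PER_YEAR = 252  # rough convention for equity markets

for k in (3, 5, 7, 10):
    # Two-sided probability that a daily return lands more than k standard
    # deviations from its mean, *if* returns were normally distributed.
    p = 2 * norm.sf(k)
    # Average wait, in years, before seeing one such day under that model.
    wait_years = 1 / (p * TRADING_DAYS_PER_YEAR)
    print(f"{k}-sigma day: probability {p:.2e}, expected once every {wait_years:.3g} years")
```

Under that lens, a 5-sigma day should show up once in several thousand years and a 10-sigma day roughly once in 10^20 years, far longer than the age of the universe; yet by most estimates, crashes like October 1987 registered as many-sigma outliers under exactly this kind of model. Whatever the precise mechanics of that particular blowup, it shows how fragile the "we are modelable" assumption really is.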

So, don't fear or blame the algos. Behind every outrageous, over-the-top Facebook update you read in your newsfeed, there is always someone who ultimately clicked "send." Yes, an algorithm may have fed him/her the crap he/she's sharing, but are we really going to blame a formula for that?
