What will our near future look like as computing and fast internet access become ubiquitous and ever more digital data becomes available in easy-to-use formats? Well, it seems our world is being transformed by algorithms, and at the LIFT11 conference in Geneva, Switzerland, Kevin Slavin presented some fascinating insights into this disruptive change.
Below I try to summarize his talk, adding some musings of my own, such as the notes on social capital rankings and the Singularity.
Kevin Slavin is the co-founder of Starling, a co-viewing platform for broadcast TV, specializing in real-time engagement with live television. He also works at Area/Code, now Zynga New York, taking advantage "of today's environment of pervasive technologies and overlapping media to create new kinds of gameplay." He teaches Urban Computing at NYU's Interactive Telecommunications Program, together with Adam Greenfield (author of Everyware: The dawning age of ubiquitous computing).
Slavin loves Lower Manhattan, the Financial District. It's a place built on information. Big cities had to learn to listen: during World War II, for instance, London used a new technology called radar to detect incoming enemy bombers. Radar in turn led to Stealth aircraft, the so-called invisible, untraceable planes – though even a Stealth plane can be located and shot down, as happened over Serbia in 1999.
Slavin is a master at explaining technologically complex things. The idea behind Stealth, for instance, is to break up the big thing – the bomber – into a lot of small things that look like birds. But what if you don't look for birds, but for big electrical signals? If you can "see" such a signal while nothing appears on your radar, well, chances are you're looking at an American bomber.
(Which reminds me: in this day and age, forget about privacy. If you want to hide, the only strategy is to send out lots of conflicting and possibly fake signals – I think futurist Michael Liebhold said that somewhere. His vision of the Geospatial Web: "Imagine as you walk through the world that you can see layers of information draped across the physical reality, or that you see the annotations that people have left at a place describing the attributes of that place!"
Just as with Stealth, it only takes math, pattern recognition and so on to find out who or what hides behind all the bits of information one leaves behind.)
The same reasoning applies to other stealthy movements, like those on financial markets. Suppose you want to push a huge financial deal through the market without waking up the other players. The stealth logic is obvious: split it up into many small parts and make them appear to move randomly.
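To make that stealth logic concrete, here is a toy sketch of my own (not an actual trading system) of how a large parent order could be chopped into small, randomly sized child orders so that no single trade betrays its size:

```python
import random

def split_order(total_shares, min_lot=100, max_lot=500):
    """Split one large parent order into small, randomly sized child orders.

    A real execution algorithm would also randomize the timing of each
    child order; here we only randomize sizes and sequence.
    """
    children = []
    remaining = total_shares
    while remaining > 0:
        # Each child order gets a random size, capped by what is left.
        lot = min(remaining, random.randint(min_lot, max_lot))
        children.append(lot)
        remaining -= lot
    random.shuffle(children)  # randomize the sequence as well as the sizes
    return children

orders = split_order(100_000)
```

The child orders always sum back to the parent order; only the pattern of sizes is obscured.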
But then again, it's only math, and math can be broken by other math. It's a war of algorithms. As Wikipedia explains:
Starting from an initial state and initial input (perhaps null), the instructions describe a computation that, when executed, will proceed through a finite number of well-defined successive states, eventually producing "output" and terminating at a final ending state.
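A minimal concrete example of that definition (my illustration, not from the talk) is Euclid's algorithm for the greatest common divisor: it starts from an initial state, steps through a finite number of well-defined successive states, and terminates with an output:

```python
def gcd(a, b):
    """Euclid's algorithm: start from the initial state (a, b) and step
    through a finite number of well-defined successive states."""
    while b != 0:
        a, b = b, a % b  # one state transition per iteration
    return a             # terminating final state: the output

print(gcd(1071, 462))  # prints 21
```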
Slavin says that 70 percent of all trades on Wall Street are either an algorithm trying to be invisible or an algorithm trying to find out about such algorithms. That's what high-frequency trading is about: finding those things moving through the financial skies.
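As a toy illustration of the hunting side (my sketch, nothing like real market surveillance), one crude way to spot a hidden parent order is to flag stretches of trading where one side dominates far beyond what chance would explain:

```python
def detect_persistent_flow(trades, window=50, threshold=0.7):
    """Flag windows where one side (buy/sell) dominates beyond chance.

    trades: a list of +1 (buy) and -1 (sell) events.
    Returns the starting indices of suspicious windows.
    """
    alerts = []
    for i in range(len(trades) - window + 1):
        buy_ratio = trades[i:i + window].count(1) / window
        # Random, balanced flow hovers around 0.5; a persistent
        # one-sided stream suggests a large hidden order being worked.
        if buy_ratio > threshold or buy_ratio < 1 - threshold:
            alerts.append(i)
    return alerts
```

A balanced stream of buys and sells triggers nothing, while a long run of buys, however small each trade is, lights up immediately: that is the cat-and-mouse game in miniature.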
Who will be the winner? It's not only about the best algorithm or the best computer, but also about the best network – we're talking milliseconds here. If you're sitting on top of a carrier hotel, where all the internet pipes of a big city surface, you have such an advantage. The internet is not a perfectly distributed thing floating around out there; it has physical properties, which for instance determine the price of real estate in cities.
Slavin explains how it is the needs of the algorithms that can determine real estate prices and urban architecture in New York, London, Tokyo or Frankfurt. Real estate 20 blocks away from the Financial District suddenly becomes more expensive than offices that appear better connected in human terms. Referring to Neal Stephenson, our professor said that cities are being optimized as motherboards.
(Read Mother Earth Mother Board by Neal Stephenson on Wired and, also on Wired, Netscapes: Tracing the Journey of a Single Bit by Andrew Blum. Which brings us back to Adam Greenfield, who gave a great talk at the Web and Beyond conference in Amsterdam, showing how web design principles and discussions are becoming highly relevant in urbanism – the city as a motherboard or a website, to be organized as such, and where the same concepts and algorithms can be applied. Just think about the application of access and permissioning regimes in a world where the overwhelming majority of citizens are perfectly traceable via their cell phones and smartphones. Which means that design becomes a very political matter.)
Algorithms determine what we hear on the radio and which movies we see – and also what we won't hear or see. They claim to predict what we want to read or watch; they organize traffic, investment decisions and research decisions; and they determine which conversations or searches on the web point to terrorist plots, and who should be monitored and/or arrested by the security services.
Sixty percent of all movies rented on Netflix are rented because the company recommended them to the individual customer. The algorithms Netflix uses even take into account the unreliability of the human brain (we are rather bad at rating things consistently). Epagogix helps studios determine the box office potential of a script – and in that way influences what will actually be produced.
There is an opacity at work here. Slavin showed a slide depicting the trajectory of the Roomba cleaning robot, which made it obvious that the logic applied there does not match a typical human way of cleaning a floor.
Crashing black boxes
One may think that an algorithm is just a formalization of human expert knowledge. After all, a content producer knows what has the biggest chance of succeeding in terms of box office revenue, clicks, comments and publicity. Isn't an algorithm just the automated application of that same knowledge? Not really. In practice, competing algorithms are tweaked to produce better results, or they tweak themselves. The algorithm often is a black box.
Genetic algorithms mimic the process of natural evolution, using mutation, selection and inheritance. Tell the algorithm that a certain weight has to travel from A to B, provide some elements such as wheels, and the algorithm will reinvent the car for you – but the way it works is beyond our human comprehension (it does not even realize from the start that the wheels go on the bottom; it only determines that later on in its iterations): "they don't relate back to how we humans think."
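For readers who want to see the mechanics, here is a minimal genetic algorithm sketch of my own (evolving bitstrings toward all-ones, a standard toy problem, rather than reinventing cars): mutation, selection and inheritance are all there, yet no step contains human-readable reasoning about the goal:

```python
import random

def evolve(length=20, pop_size=30, generations=200, mutation_rate=0.05):
    """Toy genetic algorithm: evolve bitstrings toward all-ones."""
    fitness = lambda ind: sum(ind)  # more ones = fitter
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)       # selection: rank by fitness
        survivors = pop[:pop_size // 2]           # the fittest half survives
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(length)        # inheritance: crossover
            child = a[:cut] + b[cut:]
            # mutation: each bit flips with a small probability
            child = [bit ^ (random.random() < mutation_rate) for bit in child]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

The loop reliably produces near-perfect solutions, but nothing in it "knows" why a given bitstring is good – the answer emerges from blind variation and selection, which is exactly the opacity Slavin describes.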
Which is important, because think about it: algorithms determine which movies will be produced, and algorithms will provide a rating saying whether a movie is recommended for you. Where is the user in all this? Slavin: “maybe it’s not you.”
Maybe these algorithms smooth things out until everything regresses toward the mean, or maybe they cause panic, when financial algorithms suddenly encounter something they weren't supposed to encounter and start trading stocks at insane prices. This happened on May 6, 2010. Wikipedia on this Flash Crash:
On May 6, US stock markets opened down and trended down most of the day on worries about the debt crisis in Greece. At 2:42 pm, with the Dow Jones down more than 300 points for the day, the equity market began to fall rapidly, dropping more than 600 points in 5 minutes for an almost 1000 point loss on the day by 2:47 pm. Twenty minutes later, by 3:07 pm, the market had regained most of the 600 point drop.
Humans make errors, but those are human errors. Algorithms are far more difficult to "read": they do their job well – most of the time – but it's often impossible to make sense of what they do in a human, storytelling way.
There is no astronomy column in the newspaper; there is an astrology column. Humans like to distort facts and figures and tell stories. That's what they do in astrology, but also on Wall Street – because we want to make sense of things to ourselves, even if it means we have to distort the facts.
Now what does a flash crash look like in the entertainment industry? In criminal investigations? In the rating of influence on social networks? Maybe it happened already.
Some other presentations at LIFT are also relevant in this context. Algorithms are increasingly being used to determine your personal 'value' – for instance your value as an 'influencer' on social media. Klout is a company whose algorithm measures the size of a person's network, the content they create, and how other people interact with that content. PeerIndex likewise works with social network data to determine your 'social capital'.
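To illustrate the flavor of such metrics, here is a hypothetical influence score of my own invention (not Klout's or PeerIndex's actual formula) that combines audience size, activity and engagement into a single 0–100 number:

```python
import math

def influence_score(followers, posts, interactions):
    """Hypothetical influence metric (illustration only, not a real
    Klout/PeerIndex formula): audience size, activity and engagement
    folded into a 0-100 score."""
    reach = math.log10(1 + followers)          # diminishing returns on audience
    activity = math.log10(1 + posts)           # rewarding content creation
    engagement = interactions / max(posts, 1)  # average responses per post
    raw = reach * 10 + activity * 5 + min(engagement, 10) * 3
    return min(round(raw), 100)
```

Even this toy version shows why such scores invite gaming: every input is something the person being measured can inflate.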
This is not just a weird vanity thing. Some hotels give people with a high Klout ranking VIP treatment, hoping for favorable comments on the networks. Social influence and capital can be used as an element in the financial rating of a person or a company.
This in turn will incite companies, but also individuals, to manage their online networks. At the LIFT11 conference, Azeem Azhar, founder of PeerIndex, gave a great presentation about online communities and reputation management, while social media expert Brian Solis talked about social currencies. Of course, people will try to game social ranking algorithms, just as they try to game search algorithms on the web.
Rapidly increasing computer and network power, an avalanche of digital data, self-learning networks and ambient intelligence could lead to what some call the Singularity: "a hypothetical event occurring when technological progress becomes so rapid and the growth of artificial intelligence is so great that the future after the singularity becomes qualitatively different and harder to predict" (Wikipedia).
Many scientists dispute the spectacular claims of Singularity thinkers such as Ray Kurzweil. There is also controversy about whether the Singularity, if it were to take place, would be good or bad for humanity. Slavin points out the opacity of the algorithms: they can be efficient, but they don't tell stories, and we cannot tell a good story about the inner workings of black boxes. Already, algorithms are capable of taking into account our weird human imperfections and inconsistencies, while humans respond by trying to game the algorithms. In that sense we're witnessing not one spectacular moment of transition to the Singularity, but a gradual shift in which algorithms become a crucial part of our endeavours and societies.