Finding reality while looking through code

Our newspaper site www.tijd.be has existed for 15 years now. In May 1996 someone with a 128 kbit connection was a rather fortunate citizen, while nowadays we consider 100 megabit normal (in Belgium anyway). In May 2026 speed will no longer be an issue. Access to networks, information streams and databases will be ubiquitous and instantaneous. The internet will be as self-evident as the air, and people will deal with news in very new ways.

Access to news will be ubiquitous. Today smartphones and tablets enable us to be “always on”, but in 2026 those devices will be as archaic as the Remington typewriter is today.

Companies such as Apple are licensing wearable electronics – computing power which you will carry with you, embedded in your clothes, maybe even in your body.

As is often the case in technology, the new developments emerge in a military context: pilots of fighter jets have to analyze lots of data almost instantaneously while keeping their hands free, so they use head-up displays (HUDs). The next step is integrating this technology into luxury cars, and finally it’ll go mainstream.

Keyboards will be replaced by voice commands, touch and gestures. Screens will become projections which you can manipulate as you wish, in 2D or 3D. Information will increasingly become a layer projected onto physical reality. Or it will transform that reality into a virtual realm where mixed-reality games are played.

However, the future is not just about new gadgets. The nature of that ubiquitous news will change, and there’ll be some crucial discussions about how to organize our news streams.

Filters

Many apps, especially those produced by mainstream media, offer news selected and produced by their editorial staff. Apps such as Flipboard – not produced by mainstream media – change this. They transform the articles, videos and pictures selected by your online contacts into a glossy online magazine.

Some articles will come from The New York Times, others from The Wall Street Journal or TechCrunch. Algorithms will also track which articles you read and how long you spend on them. The news selection becomes personalized.
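None of these services publish their ranking code, but the basic mechanism can be sketched. The following is a minimal, hypothetical Python illustration – the function names, data structures and weights are my own invention, not Flipboard’s or anyone else’s: reading time feeds an interest profile, and that profile reorders the next batch of articles.

```python
from collections import defaultdict

def update_interest_profile(profile, article_topics, dwell_seconds):
    """Add weight to each topic of a read article, proportional to reading time."""
    for topic in article_topics:
        profile[topic] += dwell_seconds
    return profile

def rank_articles(profile, candidates):
    """Order candidate articles by overlap with the reader's interest profile."""
    def score(article):
        return sum(profile.get(topic, 0) for topic in article["topics"])
    return sorted(candidates, key=score, reverse=True)

# Example: a reader who lingers on tech stories gets tech pushed to the top.
profile = defaultdict(float)
update_interest_profile(profile, ["technology", "startups"], dwell_seconds=180)
update_interest_profile(profile, ["politics"], dwell_seconds=15)

candidates = [
    {"title": "Election preview", "topics": ["politics"]},
    {"title": "New tablet announced", "topics": ["technology"]},
]
print([a["title"] for a in rank_articles(profile, candidates)])
```

Even a toy version like this shows the dynamic: every minute you spend reading nudges the selection further toward what you already consume.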

Facebook, for instance, shows you status updates from those people who are most important to you – or at least, that’s what its algorithm tries to detect.

Eli Pariser explains in his book The Filter Bubble how Google yields search results based not just on what you’re looking for, but also taking into account which computer you use, which browser, where you are and dozens of other criteria. This means that your friends, searching for exactly the same topic on Google, will get different results. More generally, it will become very difficult to find anything on the web which is not personalized or customized in some way.
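To make that concrete, here is a deliberately simplified, hypothetical sketch of personalized ranking. Real search engines weigh far more signals in far more sophisticated ways; the fields, weights and example data below are invented purely to show why two people typing the same query can see different orderings.

```python
def personalised_rank(results, user):
    """Re-order query results using user-specific signals on top of base relevance."""
    def score(result):
        s = result["relevance"]                           # query-document match
        if result.get("region") == user.get("region"):
            s += 0.3                                      # boost local content
        if result.get("domain") in user.get("frequently_visited", set()):
            s += 0.5                                      # boost sites this user already reads
        return s
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Global news coverage", "relevance": 1.0, "region": "world", "domain": "nytimes.com"},
    {"title": "Local travel deals", "relevance": 0.9, "region": "BE", "domain": "travelsite.example"},
]

alice = {"region": "BE", "frequently_visited": {"travelsite.example"}}
bob = {"region": "US", "frequently_visited": {"nytimes.com"}}

print([r["title"] for r in personalised_rank(results, alice)])  # travel deals first
print([r["title"] for r in personalised_rank(results, bob)])    # global coverage first
```

The same two documents, the same query – and two different front pages, decided by code the users never see.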

This seems to be an advance compared to traditional mass media, which paternalistically served the same news to everyone because a priesthood of journalists decided what was essential, what was just ‘nice to know’ and what was unnecessary. But, as Pariser explains in this TED talk, the danger is that we lock ourselves inside information bubbles offering an environment of news we ‘like’ or consider interesting, but which is not necessarily the news we should know.

Transparency

So we have human gatekeepers and algorithmic ones. We know even less about those algorithms than we know about the human editors. We can form an idea of the news selection at The New York Times, but many people are not even aware that Google shows them different results depending on supposedly personal criteria, or of the selection Facebook makes among status updates.

The code used by those major corporations to filter what we see is politically important. If we want to keep an internet which confronts us with a diversity of viewpoints and with facts and stories which surprise and enlighten us, we need to be aware of those discussions about algorithms and filters. If we don’t pay attention to that code, we’ll be programmed behind our backs.

Beneath all those human, network and algorithmic filters we find an ever-increasing stream of information. Tweets, status updates and blog posts by experts, witnesses and actors are flooding us, second by second.

I’m sure that in 2026 there will still be something we could call journalism: people who have a passion for certain subjects, making selections, verifying and commenting, providing context. The BBC already has a specialized desk analyzing images and texts distributed via social media: they check whether a specific picture could have been taken where and when people claim it was taken, to give but one example. Almost every day there are new curating tools for journalists and bloggers, facilitating the use of social media.
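I don’t know the BBC’s internal tooling, but one elementary step in that kind of verification can be sketched: comparing a photo’s embedded timestamp and GPS coordinates (when they are present at all) with the claim that accompanies it. The function below is a hypothetical Python illustration with invented thresholds; a passing check proves nothing, since metadata can be stripped or forged, but a failing one is a reason to ask more questions.

```python
import math
from datetime import datetime, timedelta

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def plausibility_check(exif_time, exif_lat, exif_lon,
                       claimed_time, claimed_lat, claimed_lon,
                       max_km=25.0, max_hours=6.0):
    """Flag obvious contradictions between a photo's metadata and the uploader's claim."""
    distance = haversine_km(exif_lat, exif_lon, claimed_lat, claimed_lon)
    time_gap = abs(exif_time - claimed_time)
    return distance <= max_km and time_gap <= timedelta(hours=max_hours)

# Example: metadata places the photo in one city at noon, the uploader claims
# another city roughly 180 km away the same afternoon -> the check fails.
print(plausibility_check(
    exif_time=datetime(2011, 2, 1, 12, 0), exif_lat=30.05, exif_lon=31.23,
    claimed_time=datetime(2011, 2, 1, 15, 0), claimed_lat=31.20, claimed_lon=29.92,
))  # False
```

The human work – calling witnesses, comparing landmarks, weighing sources – remains the hard part; code like this only narrows down what is worth a journalist’s time.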

This ‘curating’ of the news is an activity with high added value. Whether those curators call themselves ‘journalists’, ‘bloggers’, ‘newspaper editors’ or ‘internet editors’ is not important: what matters is the quality of the curation and the never-ending discussion about these practices.

Everyone who has the energy and time to have a look at the raw information streams will be able to see what the curation added, omitted or changed. Not only will we be able to check this, many curation projects will also invite us to suggest improvements or to participate directly (e.g. Quora).

Bloggers and journalists who clearly state their position regarding the issues they cover, while also promising to represent other viewpoints faithfully, will be considered more credible. Those who are open about their curation practices will gain an advantage. As Jeff Jarvis says: “transparency is the new objectivity.”

In May 2026 the editorial news of my newspaper will reach our community in many different ways. I very much doubt that the print newspaper will be as relevant as it is today, and people will smile when they look at screenshots of today’s site. But there will always be news and discussion, and people trying to cover what is essential in the information flood and to find reality through the algorithmic code.

In preparing this post I learned a lot from discussions on Twitter, Facebook, LinkedIn, The Well, Quora… In order to be transparent, I posted about these preparations. There you’ll find links to the original articles and videos, and to material I ultimately did not use for this post but which could be interesting for other explorations.

Roland Legrand
