W.I.R.E.

Thinking our way through the fog. By Stephan Sigrist

“Knowledge is power.” With this shrewd insight, English philosopher Sir Francis Bacon anticipated as early as the 17th century what would make today’s world go round. In the light of the triumphant success of the Internet, bringing with it the knowledge society and its concomitant prosperity, no one can seriously doubt that today’s world belongs to those who know the things that really matter – and how these are controlled. Over the last decade we have all been witness to an unprecedented transformation of almost every part of our lives as the digital revolution has changed some things marginally and others radically. The examples are common knowledge and require no further explanation. They range from the redefining of the music industry with new digital sales models to the ubiquitous spread of social networks and the democratisation of knowledge via the online encyclopaedia Wikipedia, which has given people around the globe access to knowledge irrespective of their educational or ethnic background.

The key lesson from this brief period in the history of the world, in which society, politics, science and the economy have been transformed more rapidly than ever before, can be summed up in a nutshell with the phrase “the more, the better”. The more data there is – this period appears to teach us – the greater the transparency, the more autonomous the individual and the more competitive the economy will become.

 

The age of control

All of this gives us more control over the world around us, as we have a better understanding of it and are able to use this knowledge to our own advantage. For businesses it offers a wealth of new opportunities from targeted marketing activities to individualised products for their customers. Politicians are now able to analyse much more precisely what the public are thinking and address issues that concern voters directly. And we all have access to sophisticated radar systems which show us the quickest route to our destinations and help us avoid nasty surprises. At the same time there is an increasing belief that it should be possible to obtain more objective descriptions of complex systems such as the financial markets, the human organism or even the world itself – all on the basis of advanced statistics.

In the final analysis, the hope for the next generation of powerful algorithms is that they will herald a new stage of social development in which it will be possible to pass on the responsibility for decision-making to an intelligent technological environment. The division of labour and the service society have enabled us to leave it to the experts to do more and more things that we are not good at or which we find onerous – such as manufacturing clothes, cooking or preparing the necessary documentation for the decision-making process in companies. Now, based on statistical analyses, machines are on the brink of becoming powerful enough to be able to take more and more decisions off our hands – from buying a house to choosing a restaurant and even medical issues.

 

Beyond the white noise

This vision of an intelligent world which is constantly optimising our lives and making things easier for us is – if one believes recent media coverage of the topic – a likely scenario for the near future. So we are on the road to an “age of transparency” with even more knowledge and more data on which to base balanced decisions. However, if we look critically at the possible consequences of the radical digitisation of our lives, there is also evidence to the contrary. It is quite possible to counter the hypothesis of the predicted radical transparency with its very opposite: a continuing lack of transparency in which the white noise of the data does not ultimately lead to the much sought-after perfect order and increased objectivity but leaves us – against all expectations – in the dark. There are a number of reasons for this.

Firstly, there are technical limitations to this process. For while data storage capacity doubles every year, the performance of the CPUs which process this data only doubles every eighteen months. This means that the amount of unprocessed data – in absolute terms – is growing faster than our acquired knowledge. This is exacerbated by the fact that even today we are generating more data than we can store. The risk of losing one’s orientation and ending up drowning in the sea of data is growing.
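The arithmetic behind this widening gap can be sketched in a few lines. The doubling periods below are the illustrative rates quoted in the text, not measured figures; the point is simply that exponential growth at different rates makes the ratio of stored to processable data itself grow exponentially.

```python
# Illustrative sketch: storage capacity doubling every 12 months vs.
# processing power doubling every 18 months (rates taken from the text).
def capacity(years, doubling_period):
    """Relative capacity after `years`, starting from 1.0."""
    return 2 ** (years / doubling_period)

for year in (0, 3, 6, 9):
    storage = capacity(year, 1.0)   # doubles every year
    compute = capacity(year, 1.5)   # doubles every eighteen months
    # Ratio of data stored to data that can be processed:
    print(year, storage / compute)
```

Under these assumed rates the ratio doubles every three years: however fast processing grows, the backlog of unprocessed data grows faster still.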

Secondly, considerable doubt exists as to whether mathematical models will ever have the capacity to grasp fully such complex systems as society, the financial markets or the human brain.

Added to this is the fact that the ever-growing volume of data is increasingly becoming too much not just for machines but for us humans as well. For at the end of the day data points are not information. This is why we are more and more frequently unable to cope with the data overload we are confronted with daily in our professional and private lives. This results in a deterioration in our ability to concentrate, a decrease in efficiency and creativity, combined with stress, which in turn often leads to illness.

Thirdly, as data becomes more diversified, it is more and more difficult to believe that there is such a thing as objectivity. For it is becoming increasingly easy to find facts to back up any possible theory – if you look long enough, you will find confirmation somewhere among the mountains of data. For instance, there are numerous studies indicating that drinking a glass of red wine every day reduces the risk of contracting cardiovascular diseases or suffering from depression. However, there are just as many studies that come to the opposite conclusion. So statistics alone are not enough for us to be able to make decisions. Facts are losing their status as facts, and, far from suddenly evaporating and allowing us to see clearly, the lack of transparency is becoming even more conspicuous in people’s consciousness than in the past. And, as a result, human intuition is reasserting its importance in the decision-making process.

A fourth factor is that it is unclear whether people, as they grow increasingly aware of how each and every one of their movements is being tracked and stored, will continue to act naturally, which is the prerequisite for inferences to be made about their actual needs. By interfering with the system, people change their behaviour and skew the results.

The fifth point is that so-called “overfitting” casts doubt on the fundamental tenet of the information society: that more data leads to more transparency and better decisions. For the quality of analyses and predictions does not necessarily improve with the number of factors taken into account; in fact, quite the opposite is sometimes the case. This is because including more and more parameters in the analysis that are only indirectly connected to the problem at hand skews the results instead of making them more accurate. Alternatively, when certain variables are counted more than once because of the algorithms’ complex, interdependent calculation steps, their importance is overestimated.
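Overfitting can be demonstrated with a toy example (the data below is invented for illustration): a flexible model with many parameters matches five noisy observations perfectly, yet a plain straight-line fit predicts a new point far better.

```python
# Hypothetical toy data: the true relationship is y = 2x, plus fixed "noise".
# A degree-4 polynomial through all five points has zero training error
# but overfits the noise; a least-squares line generalises much better.

def lagrange(xs, ys, x):
    """Polynomial passing through every point (xs, ys), evaluated at x."""
    total = 0.0
    for i in range(len(xs)):
        term = ys[i]
        for j in range(len(xs)):
            if j != i:
                term *= (x - xs[j]) / (xs[i] - xs[j])
        total += term
    return total

def line_fit(xs, ys):
    """Least-squares slope and intercept for a straight line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

xs = [0, 1, 2, 3, 4]
ys = [2 * x + e for x, e in zip(xs, [0.5, -0.4, 0.3, -0.5, 0.6])]

slope, icpt = line_fit(xs, ys)
print("line at x=5:      ", slope * 5 + icpt)    # close to the true value 10
print("polynomial at x=5:", lagrange(xs, ys, 5))  # far off, despite fitting
                                                  # every training point exactly
```

The polynomial uses every available degree of freedom to chase the noise, which is precisely the mechanism the paragraph describes: more parameters, worse predictions.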

The sixth and final factor is that with blind trust in computer-based prediction and surveillance systems there will always be the risk of system failure. It is, of course, possible to use automated systems to influence more and more areas of life – from education to dietary recommendations or asset management. However, since data and algorithms will always remain nothing more than approximations and will never be able to depict and simulate reality itself, they have their limitations. One example of how algorithms reach their limit is stock market crashes triggered by automated trading programs executing pre-programmed instructions in a matter of seconds. However, what the widespread use of automated software programs undermines more than anything is the need to generate diversity – for the simple reason that algorithms are unable to create the unexpected, the new and unpredictable.

It follows, therefore, that if we increasingly attempt to replicate the world around us with a degree of detail that approximates to reality, we will hardly be able to create a better basis for decision-making. The data society as it were reduces itself to absurdity.

 

The power of opacity

The prospect of a scenario in which the belief in objectivity is gradually disappearing and a lack of transparency prevails is admittedly hardly desirable. However, a life beyond objectivity and transparency would also open up opportunities that might even bring us closer to the ideal of the Enlightenment – what Kant called “man’s emergence from his self-incurred immaturity” – than all the data processing formulas in the world.

First of all the assumption that the complex reality of the world in its entirety cannot be described by algorithms provides us with the certainty that control will be limited to sub-systems. This will reduce the threat of the rise of totalitarian regimes, and individual freedom will be preserved in the future – in spite of the data society.

However, the loss of what we believed to be objectivity will mean that we must continue to think for ourselves, perhaps even more than ever. It will only be possible to pass on specific decisions to our semi-intelligent digital world. We can continue to rely on those smart parking guidance systems and digital recommendations for medicines. But when it comes to understanding the big picture, there will be no substitute for our own minds, even in the future. Not least because our brains are capable of using experience and “intuition” to find connections between things in ways that a computer could never come up with – because these links are simply not accessible by means of logical programming or pattern recognition algorithms.

Another effect of the limitations of algorithms is that they will help maintain diversity and thus guarantee one of the important prerequisites for innovative thinking. Prediction models based solely on logic and rationality inevitably lead to a homogenisation of perspectives. Unexpected developments are factored out of the equation. Yet in the real world of product development – just as in evolution itself – it is often mistakes that lead to progress. The less our lives are determined by algorithms, the greater the likelihood that errors will occur. These may not be desirable in bookkeeping, but they are definitely vital for innovative thinking.

 

Back to the analogue world?

In spite of all of this, the rise of the data society is inexorable. Data will affect our future lives more than ever before – perhaps not on the larger scale but definitely in many small ways. In response it is imperative that society, businesses and, most of all, we as individuals develop strategies that help in the battle against data overload. Withdrawing completely from the digital world would appear to be neither realistic nor judicious. On the contrary, what we really need to do is reflect critically on the chances and risks that the digital society entails, and ask ourselves where automated analyses of large amounts of data can help us make better decisions and where human brainpower and common sense will remain unchallenged.

Businesses as well as politicians will have to get used to the idea that their credibility can no longer simply be proved by hard facts, since these can be chosen at will. Consequently, this will see them attaching more importance to values and ethics in their positioning as well as their public utterances.

Moreover, the whole issue of the value of data must be revisited, and ideas need to be developed that on the one hand offer better protection for personal data and on the other provide people with appropriate financial remuneration when they allow others to have access to their data. The Swiss Health Bank project, for instance, plans to provide secure storage for people’s medical data, while those who are willing to allow access to their data for research purposes are to be financially recompensed.

We ourselves can decide to turn our backs on the digital world and set out on the path back to analogue reality. A simple step in that direction would be to commit “digital suicide”. By using a website called Seppukoo – a play on the Japanese word “seppuku” used for the ritual suicide by samurai warriors – it is possible to delete completely everything you have ever posted on social networks. What would be more practical for everyday use, however, would be to develop our own skills to help us find our way in a world increasingly dominated by data, and thus retain control of our own decisions. This would encompass, for example, consciously selecting what we choose to perceive and what we prefer to ignore. It would also entail taking breaks in order to obtain the distance necessary to decide what is truly important and what is not. It is equally vital to resist the allure of statistics and to promote thought beyond hard facts. This means consciously making your own decisions and resisting the temptation to outsource these to machines. This is the only way we will be able to retain our ability to recognise patterns and to make connections between things from the most disparate of areas. All things considered, it is important to counterbalance the motto of “the more, the better” with the old adage that “less is more”. Our innovative capacity should not be targeted at generating more and more data come what may, but should place human beings, or better still their minds – the workings of which are fortunately not always guided by logic – at the centre of the data society of the future.

 

Stephan Sigrist is the founder and head of W.I.R.E. and has focused for many years on developments in the life sciences and long-term trends in industry and society. He has also authored many books and publications, advises companies and political institutions on strategic issues and is a regular speaker at international conferences. 

© 2024 W.I.R.E. - Web for Interdisciplinary Research and Expertise