01 Jul 2016
The frontiers of technological advancement are no longer just for the experts. Probably because of the ubiquity of advanced consumer technologies such as the smartphone, awe-inspiring innovations make headlines almost daily, grabbing the attention of the masses. Self-driving cars are steadily becoming a familiar sight on the streets of certain countries. Ambient home devices control the lights and the music, or do the shopping, through natural-language voice interaction. Some people are now purchasing robot home companions. Advanced prosthetics, DNA editing and the 3D printing of human organs are transforming our concept of medical treatment. After the 2007-2009 financial crisis, businesses in advanced economies resumed growth, productivity and profitability by exploiting their installed base of information technology, without having to rehire the employees who had been laid off during the crisis. From factories and warehouses to law firms and hospitals, robotics and other advanced technologies are taking over jobs that until recently were considered the exclusive domain of specialized human workers. Technological advancement promises a bounty of benefits while challenging long-established norms. Where are we headed?
The exponential pace of technology exceeds our ability to grasp the magnitude of change. Fiction is sometimes better at exploring future possibilities. Like Jules Verne in the first machine age, a growing science-fiction filmography explores what the near future might look like if current trends continue apace. Films like Her, Ex Machina, WALL-E and Minority Report remind us how close today’s technology is to radically alien worlds. They show by-now-familiar scenarios such as micro-drones, personalized advertisements and holographic user interfaces, as well as more bizarre ideas: computers displaying empathy and human emotion, people lost in simulated worlds, computers augmenting human skill acquisition, genetically engineered human species, a humanity on permanent vacation, not to mention space colonization. As radical as they may seem, these scenarios are little more than a logical extrapolation of current developments. At the same time, leading scientists express concern over the existential threat posed by future advances in artificial intelligence[iii]. Such developments force us to question the unquestionable: if we can create machines to be better than humans and bio-engineer humans to be as good as machines, if we can abolish work and conquer disease, then what does it mean to be human?
This question has been the realm of theologians, philosophers and, more recently, cosmologists, neuroscientists and others. Instead of attempting advanced theorizing, we might as well observe everyday human experience. In an everyday life saturated with digital media, often unable to distinguish the simulated from the real, is there anything we miss out on? For example, we are so eager to share the fleeting moment of a sunset on Instagram that we forego the timeless immersion of our senses in the colors and smells, the eerie silence, as nature shifts from daylight into night. Notice how popular musicians perform so-called “live” concerts in front of a sea of glowing smartphones, the audience present-absent as they attend via the five-inch screens right in front of the stage. It seems that if you don’t share the concert or the sunset on social media, and if your “friends” do not “engage” with your posts, it is as if you were never there. Presence is validated through its digital representation, so much so that the latter ends up being all that matters. The lived human experience is reduced to digital data, only to disappear into the all-devouring “timeline”. Long before the arrival of the sci-fi robots, we are already reducing ourselves to machines, unavailable to experience what it means to be alive[iv]. This is not a recent development; it is the culmination of a mindset that has dominated all of modernity. But it is in our time that the unprecedented acceleration of technological change forces everybody to come face to face with taken-for-granted issues such as the nature of being human or a world without work[v].
What is to be done about it? Should we reject technology altogether and attempt some kind of historical regress? That would be more than naïve; it would be absurd. An answer has yet to emerge. This is the opportunity of our times: for an answer to emerge, we must first become focally aware that technology is not just a tool. It is a value system, the overarching worldview that guides all aspects of modern society. Only then can we start conceiving alternatives to the dystopian futures of science fiction.
Even though most of the theory and practice of management epitomizes the technologizing mindset over human affairs – the logic that treats people, nature and their inter-relations as fungible resources to be optimized, a problem to be solved – the kind of adjustment we need to contemplate must come from within the realm of management. Business is de facto the most powerful and influential institution in the world today. It spans boundaries and operates at speeds that political processes cannot match. It follows that the responsibility resting on the shoulders of business leaders is enormous: whether they realize and accept it or not, the choices they make have a profound impact on the future viability and wellbeing of humanity.
This is why education is of such paramount importance. Most of the public discourse on education centers on the need to invest in the engineering skills required for further technological development in the service of problem solving and optimization. This approach is indeed vital if we are to prepare employable young people as the demand for work changes, and to sustain prosperity for an ageing population. Yet if we are to re-prioritize humanity over technology, we need to look further than that. We need to place renewed value on forms of human knowing that lie beyond the privileged domain of computers, algorithmic processing and scientific experimentation. Without foregoing our great technological accomplishments, we must rise above them and heed Martin Heidegger when he wrote that “what is essential in the discovery of reality happened and happens not through science, but through primordial philosophy, as well as through great poetry and its projections”[vi]. Throughout history, people have articulated the ineffable through great art. All uniquely human experience that cannot be reduced to ones and zeroes becomes music, theater, painting or sculpture, where expression is but a gateway to meanings greater than words.
It is for this reason that top business schools make a point of enriching their MBA and leadership development programs with education on the arts. As paradoxical as it may seem to follow corporate finance with a course on theater, the existential challenges of the second machine age will be answered in the boardroom as much as in the academy. And to rise to this challenge, business leaders must be prepared to look at the bottom line and much further beyond.
[i] Erik Brynjolfsson and Andrew McAfee, “The Second Machine Age”, W.W. Norton & Co., 2016.
[ii] Klaus Schwab, “The Fourth Industrial Revolution”, World Economic Forum, 2016.
[iii] Nick Bostrom, “Superintelligence: Paths, Dangers, Strategies”, Oxford University Press, 2016.
[iv] Jaron Lanier, “You Are Not a Gadget: A Manifesto”, Penguin, 2011.
[v] “A World Without Work?”, World Economic Forum Annual Meeting 2016, https://www.weforum.org/events/world-economic-forum-annual-meeting-2016/sessions/a-world-without-work/
[vi] Martin Heidegger, “The Essence of Truth” (1930), translated by Ted Sadler, Bloomsbury, 2013.