The collapse of context and the elimination of friction…

A retreat into Plato’s Cave

Richard Schutte
4 min read · Jul 9, 2019


As our physical and digital worlds increasingly converge [1], the process by which we give meaning to our collective experiences (i.e. sensemaking [2] and collective sensemaking [3]) is undergoing profound shifts.

More and more of our time is spent in a parallel digital universe where physical distance, time, culture, identity, and context collapse.

Digital sight, sound, and language become the primary ways we apply our five human senses to navigate and make sense of this virtual world.

This digital environment is shaped by transnational organisations, algorithms [4] and our behaviours, which interact through complex feedback loops to present a personalised version of reality for “people like us”.
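To make the idea of these feedback loops concrete, here is a minimal, purely illustrative sketch in Python. It is a toy model, not any platform's actual system; the topics, weights and update rule are invented assumptions. What we engage with nudges a preference profile, and that profile in turn biases what we are shown next, gradually narrowing the "reality" presented to "people like us".

```python
# A toy sketch of a personalisation feedback loop (illustrative only; the
# topics, weights and update rule below are invented assumptions, not any
# real platform's algorithm).
import random

random.seed(42)

TOPICS = ["politics", "sport", "science", "celebrity"]


def recommend(weights, k=3):
    """Pick k items, biased by the current preference weights."""
    return random.choices(TOPICS, weights=[weights[t] for t in TOPICS], k=k)


def update(weights, clicked_topic, boost=0.3, decay=0.05):
    """Reinforce whatever was engaged with; everything else slowly fades."""
    for t in TOPICS:
        weights[t] *= (1 - decay)
    weights[clicked_topic] += boost
    return weights


# Start with no strong preferences, then simulate a user who simply clicks
# the first item offered; the loop steadily narrows what gets recommended.
weights = {t: 1.0 for t in TOPICS}
for _ in range(20):
    feed = recommend(weights)
    clicked = feed[0]  # crude stand-in for user behaviour
    weights = update(weights, clicked)

print({t: round(w, 2) for t, w in weights.items()})
```

Even in this toy version, a small initial bias compounds over a handful of iterations, which is the essence of the personalised, self-reinforcing view of reality described above.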

Information and knowledge no longer roam freely.

Our attention has become the economy; we are the product, data is the new oil, and new forms of hyper- and surveillance capitalism have emerged that benefit from network effects, algorithms, artificial intelligence (AI), automation and prediction.

The promise of this digital utopia has been anchored in making our lives easier and simpler by removing the friction from everything we do.

Our mobile phones have become the remote controls for managing our lives.

Emerging technologies such as bots, AI, apps, voice search, and the blockchain are seen as ways of extracting any remaining friction from the customer experience, further streamlining, automating and optimising our interactions.

What if the universal embrace of this narrative poses new risks for our society?

What if “context” really mattered?

What if “friction” and “humanity” are central to trust?

Gillian Tett, the US editor of The Financial Times and a former anthropologist, recently raised the issue of digital context collapse [5] and the importance of ancient rituals and shared physical experiences in providing personal and collective meaning.

She notes that whilst these digital technologies sometimes feel wildly liberating, they can also be terrifying and disorienting.

A quote:

“In the cyber world, there is a constant problem of what anthropologists and psychologists call “context collapse”: it is hard to discern the contours of social interactions because the factors that used to frame our cultural lives are only half-present.”

In addition, Oxford lecturer and author Rachel Botsman has spoken extensively about the nature and importance of trust to a well-functioning modern society and how digital environments alter this.

She has eloquently and succinctly defined Trust as:

“a confident relationship with the unknown” [6]…

At its core are two elements: relationship, a concept anchored in humanity, and the unknown, a reflection of the uncertain nature of reality.

She believes friction is essential to demonstrate we are trustworthy.

Why?

To become trustworthy requires four human traits to be present:

Competence — (How?)

Reliability — Responsiveness, consistency and time (How?)

Benevolence — Do you care? (Why?)

Integrity (Why?)

Our human relationships, behaviours and time are essential for the How? and the Why?

As we accelerate into digital platforms, networks, AI and even digital Trust (blockchain), some profound issues arise.

How can we demonstrate that we are trustworthy without friction (which takes time), without our behaviours, or without a human relationship?

Does the algorithm care?

Does AI have our best interests at heart?

Are the machines right?

Geoff Mulgan, CEO of the UK innovation foundation Nesta, has raised similar emerging issues around the adoption of AI.

In thinking about ethics [7] in the context of AI, our initial response seems anchored in a desire to simplify complexity and prescriptively codify a series of algorithmic principles or rules for areas such as bias, privacy, safety, transparency, explicability and truth.

What if reality were far more complex and “context”-dependent, and human sensemaking and reflexivity, together with rules or code and embodied actions, were required?

What if all our perspectives (both human and machine) cannot grasp the complexity of reality solely through the Primacy of Human Consciousness (Conscious Self, Ego, the Phenomena of Will), a nominalist prism of reality?

What if it were impossible to “codify” all aspects of reality, given its emergent and non-ergodic qualities?

What if the central question we should ask is whether the technology is right for the future we want to create?

In closing, Rice University (US) computer science professor Moshe Vardi recently asked an essential question about these colliding worlds:

“Technology is driving the future, but who is doing the steering?” [8]…

A question that becomes more pressing with each passing day.

Footnotes:

[1] Kevin Kelly about the “Emerging Virtual-Cloud” — http://realityblenders.com/2018/05/29/kevin_kelly_tnw_conference_2018/

[2] Sensemaking, the core skill for the 21st Century… — https://richardschutte.medium.com/sensemaking-the-core-skill-for-the-21st-century-ebc8c679cfe8

[3] Collective Sensemaking… — https://richardschutte.medium.com/collective-sensemaking-90826d1cb007

[4] The Yoda of Silicon Valley — Donald Knuth, master of algorithms, reflects on 50 years of his opus-in-progress, “The Art of Computer Programming.” — https://www.nytimes.com/2018/12/17/science/donald-knuth-computers-algorithms-programming.html

[5] How ancient rituals help us adapt to the digital age — https://www.ft.com/content/c3a55ae0-9797-11e9-9573-ee5cbb98ed36

[6] What does it really mean to trust? — https://medium.com/@rachelbotsman/trust-thinkers-72ec78ec3b59

[7] AI ethics and the limits of code(s) — ai-ethics-and-limits-codes

[8] Technology is driving the future, but who is doing the steering? — technology-driving-future-who-doing-steering
