Photo by JJ Ying on Unsplash

How the blessing of dimensionality enables us to untangle and compress Complexity

“There’s an interdependence between flowers and bees. Where there are no flowers there are no bees, and where there are no bees, there are no flowers. They are really one organism. And so in the same way, everything in nature depends on everything else”…

- Alan Watts[1]

“Umuntu ngumuntu ngabantu”…

“A person is a person through other persons”…

- an Ubuntu Philosophy phrase [1b]

How do we make sense of our World? All the complexity of our Mental World[2] and Material World[2].

It’s a question asked by the UK Philosopher, Poet, Novelist, Cultural Critic, Medical Physician and Clinical Neuroscientist — Raymond Tallis[3] — in his 2018 book — Logos: The mystery of how we make sense of the world[4].

Tallis explores the challenges we face in trying to make sense of our Sensemaking[5].

Our remarkable innate capacity to bring a coherent rich perspective to a high dimensional complex[6] Reality.

Through Sensemaking[5], Intelligence, Collective Sensemaking[7] and Collective Intelligence[8] we have an innate capacity to bring order and meaning to an emergent dynamic World.

Philosophers and thinkers throughout human history have been fascinated and perplexed by this capacity.

The embrace of reason — Deduction, Induction & Abduction[10] — in our search for truth.

Our search for truth being a search for meaning — Semantics[9].

Our capacity to develop both Physical & Abstract Tools for Human Sensemaking, including Linguistics (Languages), Mathematics and now Computational Languages[11] (Software) & Devices (Hardware).

A World where our human condition is shaped as much by our survival instinct[12], emotions, desire for belonging & meaning[13], as it is by our bounded rationality[14].

In the book Tallis outlines how two forms of reductionism have emerged in our attempt to make sense of Reality.

Materialists that are anchored in a Material World perspective.

The evolutionary physical material structures of the brain shape our consciousness, values[15], experiences, cognition and reason.

Idealists that are anchored in a Mental World perspective.

Our ideas shape our perceptions of the Material World.

Tallis suggests these classical binary approaches to making sense of Reality are insufficient.

Materialism ignores the human condition in its attempt to make sense of the physicality of our Material World.

Idealism ignores the objectivity of our Material World by focusing on our experience of it — our Mental World.

“We are accustomed to the idea that the truth of things may be neither pleasant nor comforting; we are less accustomed to the idea that the truth may be unfruitful”…

- Raymond Tallis[16]

Tallis believes that it’s the relationship between the Mental World and Material World that matters.

Human knowledge is dependent on an evolving dance between these two Worlds.

Our Mental World & Material World are interdependent.

Human Sensemaking[5] enables us to overcome the Mind-Body Problem[2] and move beyond Cartesian Dualism[18].

Idealism vs Materialism

Mental World vs Material World

Abstraction vs Experience

Reason vs Emotions

The How? and What? vs The Why?

From an Age of Reason to an Age of Entanglement[20].

The disruption of Internet Search, emergence of Semantic Graphs and the Semantic Web

On 16 January 2020 the market capitalisation of the US Internet Company — Alphabet — the holding company of Google — exceeded US$1 trillion[21].

It was the fourth US Internet Technology firm to exceed this figure and stood alongside Apple, Amazon and Microsoft as the leading US Companies of the Internet Era.

It highlighted a shift from an Industrial Age to a Knowledge Age where Digital Bits (1’s and 0’s) had replaced Energy as the primary engine of the US Economy.

In 1980 the Energy Sector represented ~30% of the US Stock Market — By October 2020 it represented ~1.9% of the Vanguard Total U.S. Stock Market Index Fund[22].

At the core of Google’s success was its Internet Search Engine, and at the core of this technology was its PageRank[23] algorithm.

The algorithm was named after one of the founders of Google — Larry Page — and was used to rank web pages in their search engine.

According to Google:

“PageRank works by counting the number and quality of links to a page to determine a rough estimate of how important the website is. The underlying assumption is that more important websites are likely to receive more links from other websites”…[24]
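The mechanism Google describes above can be sketched in a few lines of code: each page’s score is the sum of the scores of the pages linking to it, split across each linker’s outgoing links, with the standard damping factor. This is a minimal illustrative sketch, not Google’s production implementation; the tiny link graph is hypothetical.

```python
# Minimal sketch of the PageRank idea: a page's importance is the damped
# sum of the importance flowing in from pages that link to it.

DAMPING = 0.85  # damping factor from the original PageRank paper

def pagerank(links, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - DAMPING) / n for p in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)  # split rank across out-links
            for target in outgoing:
                new_rank[target] += DAMPING * share
        rank = new_rank
    return rank

# Hypothetical graph: every page links to "hub", so it ranks highest --
# the "more links from other websites" assumption in action.
graph = {
    "hub": ["a"],
    "a": ["hub", "b"],
    "b": ["hub"],
}
ranks = pagerank(graph)
```

Because every page here has at least one outgoing link, the ranks remain a probability distribution (they sum to 1), and the most linked-to page receives the highest score.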

It was the first algorithm used by the Company and was at the centre of the innovation that drove its success.

In 1996 Sergey Brin and Larry Page developed PageRank at Stanford University as part of a research project for a new type of search engine.

At the time the dominant public search engines such as Yahoo were increasingly experiencing scaling, computation, memory and latency challenges as the amount of information on the internet continued to grow exponentially.

The software architecture of these existing search engines literally catalogued every web page — akin to a book in a library.

As the number of websites and pages grew, so did the size of the library catalogue in the attempt to organise the world’s information[25].

Reflective of this approach, Yahoo had a Chief Ontologist — an “Ontological Yahoo”[26].

Tim Berners-Lee — the inventor of the World Wide Web in his book — Weaving the Web — sums up the Semantic shift in making sense of Complexity and our emergent World — the emergence of the Semantic Web.

“I was excited about escaping from the straightjacket of hierarchical documentation systems…. By being able to reference everything with equal ease, the web could also represent associations between things that might seem unrelated but for some reason did actually share a relationship. This is something the brain can do easily, spontaneously. … The research community has used links between paper documents for ages: Tables of content, indexes, bibliographies and reference sections… On the Web… scientists could escape from the sequential organization of each paper and bibliography, to pick and choose a path of references that served their own interest”[27] …

It reflected his March 1989 proposal (recirculated in May 1990) — Information Management — concerning the management of general information about accelerators and experiments at The European Organization for Nuclear Research (CERN).

The proposal discussed the problems of the loss of information in complex evolving systems and proposed a solution based on a distributed hypertext system[28].

It was a radical shift in how we make sense of Information and the World.

A shift from pre-defined classification systems, ontologies and taxonomies to emergent Semantics.

A transition from Books & Libraries to Semantic Networks & Graphs.

Interconnections and Interdependencies.

A new way to connect our Mental World & Material World.

A Semantic & Semiotic World that the US Pragmatist Philosopher Charles Sanders Peirce[29] had envisaged almost a century earlier.

Is the almost universal embracement of Algorithmic Management[30] by our organisations akin to a Chief Ontologist in a Pre-Google World?

Has Sensemaking[5] become the most important skill for an Age of Complexity[6]?

Are we shifting from an Algorithmic to Semantic[9] way of making sense of Complexity?

As we outlined in — Sensemaking, the core skill of the 21st Century[5] — Israeli History Professor Yuval Harari made some observations in UK Wired Magazine[31] on the role of teaching and learning in this emergent environment. The following quote illustrates the shift:

“In such a world, the last thing a teacher needs to give her pupils is more information. They already have too much of it. Instead, people need the ability to make sense of information, to tell the difference between what is important and what is unimportant, and above all to combine many bits of information into a broad picture of the world”…

The limits of Algorithms in Economics & Finance and the increasing Complexity of the Global Financial System

The first known use of the term Econometrics[32] was by the Polish economist — Paweł Ciompa — in 1910.

Econometrics is the application of statistical methods to economic data to provide causal links between economic relationships and empirical content — information received from the human senses.

By the 1930s Economics had become increasingly mathematically based — Algebraic Mathematical Relationships to develop Economic Theories & Heuristic Algorithms — and — Econometrics to refine those Theories (Mental World) against the Real Economy (Material World).

Whilst statistics had been central to Finance Theory since the 16th Century, when statisticians in Europe applied mathematical methods to maritime shipping to quantify Risk, the embrace of Algorithms in Finance exploded in the second half of the twentieth century.

It was the combination of — new mathematical theories in Finance — Capital Asset Pricing Model, Sharpe Ratio, Z-Scores, Black-Scholes Option Pricing etc… — the adoption of computation in Finance — more & more financial and economic data — the democratisation of Finance — and — emergence of various Financial Asset classes that drove this explosion.

Algorithmic Management, Algorithmic Portfolio Construction, Algorithmic Investing (e.g. Passive ETFs and High Frequency Trading), Algorithmic Balance Sheet Construction and Algorithmic Risk Management emerged.

Yet despite this scientific mathematical (Algorithmic) precision, the World experienced an acceleration of Financial & Economic Crises after the end of Bretton Woods in August 1971.

It culminated in the US Tech Crash in 2001 and the Global Financial Crisis in 2008.

The Global Financial System had become increasingly interconnected and interdependent.

A global ecosystem of Financial Counterparties including Institutional Investors, Retail Investors, Commercial Banks, Traders, Financial Market Intermediaries, Hedge Funds, Private Equity Firms, Non Bank Financial Institutions, Investment Banks, Sovereigns, Venture Funds, Governments, Corporations and Central Banks had emerged.

All connected by a digital fibre optic network that had the capacity to transmit information and “value” at the speed of light.

By November 2015 the European Union had realised that, given the growing Complexity & Interdependencies, the traditional prism of Financial Regulation & Monitoring of the System required a fundamentally different approach.

A recognition of the limits of Econometrics, Financial Models & Algorithms.

It authorised a data collection process — Regulation (EU) 2015/2365 —[33] — to improve the transparency and monitoring of Securities Financing Transactions.

A shift from an Algorithmic to Semantic[9] approach to understand the emergent complex nature of the Global Financial System.

Its Interdependencies and Interrelationships.

Better understanding Securities Collateral and Securities Lending that was central to the 2008 Global Financial Crisis.

However, despite these first steps, the absence of urgency remains an issue: by the last quarter of 2020 little progress had been made in shifting from Algorithmic to Semantic methodologies for making sense of the Complexity of the Global Financial System.

Have we created a Fragile Algorithmic Global Financial System in a World that requires Anti-Fragility and Resilience?

Increasing Convexity[34] & Systemic Risks[35]?

Models of the World moving further away from the Material World[36].

The emergence of computational Deep Learning including new Semantic Theories such as the Manifold Hypothesis and the Information Bottleneck to make sense of Reality — our Mental World and Material World

In — Alchemy[37] — we outlined the explosion in breakthroughs in computational Deep Learning as Semantic Tools over the first two decades of the 21st Century.

However, despite their effectiveness in Prediction & Perception, Ali Rahimi — a former Google and now Amazon AI Researcher — asked a key question at the NIPS 2017 Conference:

Was Machine Learning the new Alchemy?

A black box?

Tools without explanation.

A phenomenon that had been repeated a Century earlier with the emergence of Quantum Mechanics Algorithms in Physics — powerful abstract tools for prediction in a complex Material World.

Prediction that has enabled advances in electronics, optics, information science, semiconductors and chemical bonding.

Predictions without reductionist deterministic theories for a complex Material World.

So how many dimensions does Reality have?[38]

The widely accepted view across our Society is that our Material World has at least 3 dimensions — height — depth — and — width.

But if we overlay the Ontology, Axiology and Epistemology of our Mental World, we have Semantic Abstraction of at least 6 dimensions.

Our innate capacity to make sense of an emergent high dimensional Semantic Reality remains a profound central question for Humanity.

The curse of Dimensionality

As the dimensionality (i.e. the number of variables or attributes in a data set) increases, the volume of the space grows so fast that the available data becomes sparse.

This sparsity becomes problematic for statistical significance given the amount of data required often grows exponentially with dimensionality.

A second issue arises when classifying and sorting the data: the more dimensions there are, the further apart the data points may be.[39]
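The sparsity effect described above can be demonstrated in a few lines, using only the Python standard library: as the number of dimensions grows, the gap between a query point’s nearest and farthest neighbours shrinks relative to the distances themselves, so “near” and “far” lose their meaning. The point counts and dimensions below are illustrative choices.

```python
# Demonstration of distance concentration: in high dimensions, all random
# points end up at roughly the same distance from a query point.
import math
import random

random.seed(0)

def distance_contrast(dim, n_points=200):
    """Relative gap (farthest - nearest) / nearest for random points in [0,1]^dim."""
    points = [[random.random() for _ in range(dim)] for _ in range(n_points)]
    query = [random.random() for _ in range(dim)]
    dists = [math.dist(query, p) for p in points]
    return (max(dists) - min(dists)) / min(dists)

low = distance_contrast(2)      # 2 dimensions: huge contrast between neighbours
high = distance_contrast(1000)  # 1000 dimensions: contrast nearly vanishes
```

In 2 dimensions the nearest neighbour is many times closer than the farthest; in 1000 dimensions every point sits at almost the same distance from the query, which is exactly why naive nearest-neighbour classification degrades as dimensionality rises.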

The blessing of Dimensionality

When many attributes are measured on the same observations, there is an inherent structure to those observations.

By taking advantage of this structure, increasing dimensionality provides better estimates of the structure in high dimensional Semantic Abstraction.

A quote from a 2015 Article — Blessing of dimensionality often observed in high-dimensional data sets:

“As an example, suppose that we make measurements on 10 people. We start out by making one measurement (blood pressure), then another (height), then another (hair color) and we keep going and going until we have one million measurements on those same 10 people. The blessing occurs because the measurements on those 10 people will all be related to each other. If 5 of the people are women and 5 are men, then any measurement that has a relationship with sex will be highly correlated with any other measurement that has a relationship with sex. So by knowing one small bit of information, you can learn a lot about many of the different measurements”[40]…

In other words, the high dimensional nature of the data anchored to an observation enables us to expose the strength of the Semantic relationships of the attributes.
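The quoted example can be simulated directly: a hidden binary trait drives many noisy measurements on the same 10 people, and while any single measurement is a weak signal, averaging across many of them exposes the shared structure almost perfectly. This is a hypothetical sketch of the idea, not the article’s own experiment; the trait values and noise level are assumptions.

```python
# Sketch of the blessing of dimensionality: many weak, correlated
# measurements on the same observations jointly reveal a hidden trait.
import random

random.seed(1)

people = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]  # hidden trait: two groups of 5
n_measurements = 1000

# Each measurement = hidden trait + heavy noise, so any single column
# barely separates the two groups on its own.
data = [[trait + random.gauss(0, 3) for _ in range(n_measurements)]
        for trait in people]

# Averaging a person's measurements shrinks the noise by sqrt(n),
# exposing the structure the columns share.
scores = [sum(row) / n_measurements for row in data]
recovered = [1 if s > 0.5 else 0 for s in scores]
```

With 1000 measurements the noise on each person’s average falls to roughly 3/√1000 ≈ 0.1, so the two groups separate cleanly even though each individual measurement is dominated by noise — more dimensions, anchored to the same observations, mean more signal.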

In Rich Sutton’s March 2019 blog post — The Bitter Lesson — he outlines his perspective on ~70 years of AI Research.

A world where more & more computation, memory and data, coupled with declining processing costs, had become the primary engine of the AI Revolution.

A computational singularity.

Is the blessing of dimensionality at the core of the effectiveness of these Deep Learning Computational Systems?

The Power of the Patterns.

Our Semantic interconnections & interdependencies?

A new type of Logic.

The Paradox of Dimensionality enables us to expose Semantic Interdependencies in high dimensional abstraction.

A quote from a 2014 Article — Systems generating systems — architectural design theory by Christopher Alexander[41] — on the nature of Linguistics — a Semantic Language.

“Chomsky’s work on generative grammar will soon be considered very limited… It does not deal with the interesting structure of language because the real structure of language lies in the relationships between words — the semantic connections. The semantic network — which connects the word “fire” with “burn,” red,” and “passion” — is the real stuff of language. Chomsky makes no attempt to deal with that and therefore, in a few years, his work will be considered primitive”[41]…
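A semantic network of the kind Alexander describes is naturally modelled as a graph of words linked by association. The toy network below uses the associations named in the quote (“fire” with “burn,” “red,” and “passion”) plus a hypothetical “heat” link for illustration.

```python
# Toy semantic network: words as nodes, associative links as edges.
# The "fire" associations come from the quote; "heat" is illustrative.
semantic_net = {
    "fire": {"burn", "red", "passion"},
    "burn": {"fire", "heat"},
    "red": {"fire", "passion"},
    "passion": {"fire", "red"},
    "heat": {"burn"},
}

def associations(word, depth=1):
    """Collect words reachable within `depth` associative hops."""
    frontier, seen = {word}, {word}
    for _ in range(depth):
        frontier = {n for w in frontier
                    for n in semantic_net.get(w, set())} - seen
        seen |= frontier
    return seen - {word}
```

One hop from “fire” yields its direct associations; two hops also pulls in “heat” via “burn” — the “real stuff of language,” in Alexander’s phrase, is exactly this web of relationships rather than any grammatical hierarchy.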

The Dimensionality Paradox and nature of the Semantic Mind[9] raises some profound questions about our Mental World and Material World.

Is Deep Learning computation a new lens and prism from which we can untangle the complexity of Reality[42]?

An example of Simplicity Theory[43]?

Exposing and untangling manifolds, geometry and patterns in higher dimensional Semantic Abstraction?

Uncovering degrees of freedom for various types of ontology, epistemology and axiology ?

A way to separate the random from the regular?

A Markov Blanket[44]?

The Manifold Hypothesis[45] and Bottleneck Theory[46].

Complexity Compression[47].

Simplicity Theory[43].

A Complexity Void[6].

“Quantum entanglement is a physical phenomenon that occurs when pairs or groups of particles are generated, interact, or share spatial proximity in ways such that the quantum state of each particle cannot be described independently from the state of the other” — a metaphor for Human Life & the interdependencies of the Complex Systems we inhabit.

– The Age of Entanglement

[1] — Alan Watts —

[1b] – Descartes was wrong: ‘a person is a person through other persons’ -

[2] — Mind Body Problem —

[3] — Raymond Tallis —

[4] — Logos: The mystery of how we make sense of the world —

[5] — Sensemaking, the core skill for the 21st Century… —

[6] — The Complexity Void —

[7] — Collective Sensemaking… —

[8] — The emergence of Collective Intelligence… —

[9] — The Semantic Mind —

[10] — Why Philosophy matters more than ever in the Age of Entanglement?… —

[11] — Some foundational questions for Humanity for the 21st century… —

[12] — The Survival Instinct… — — and — The Survival Instinct — Part 2… —

[13] — In search of wisdom in the information and knowledge age… —

[14] — Humility is truth and the sea of ignorance… —

[15] — In the age of disruption what is your North Star?… —

[16] — The Explicit Animal: A Defence of Human Consciousness —

[17] — Reflexivity —

[18] — Cartesian Dualism & Mind-Body Problem Explained | Rupert Sheldrake Interview —

[19] — When two worlds collide…—

[20] — The Age of Entanglement… —

[21] — Google owner Alphabet becomes trillion-dollar company —

[22] — Does The Price of Oil Even Matter Anymore? —

[23] — PageRank —

[24] — Google Wiki —

[25] — Organise the worlds information —

[26] — Once The Most Powerful Person In Search, Srinija Srinivasan Leaves Yahoo —

[27] —

[28] — Information Management: A Proposal —

[29] — The American Aristotle — Charles Sanders Peirce was a brilliant philosopher, mathematician and scientist. His polymathic work should be better known —

[30] — Re-Imagining Organisations for the 21st Century… —

[31] — Yuval Noah Harari on what the year 2050 has in store for humankind —

[32] — Econometrics —

[33] — REGULATION (EU) 2015/2365 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL of 25 November 2015 on transparency of securities financing transactions and of reuse and amending Regulation (EU) No 648/2012


[35] — Systemic Risk in the Broad Economy — Interfirm Networks and Shocks in the U.S. Economy —

[36] — The Reality Gap —

[37] — Alchemy —

[38] — Radical dimensions — Relativity says we live in four dimensions. String theory says it’s 10. What are ‘dimensions’ and how do they affect reality? —

[39] — The Curse of Dimensionality —

[40] — A blessing of dimensionality often observed in high-dimensional data sets —

[41] — Systems generating systems — architectural design theory by Christopher Alexander —

[42] — Some foundational questions for Humanity for the 21st century… —

[43] — Simplicity Theory —

[44] — The Markov Blanket —

[45] — The Manifold Hypothesis —

[46] — New Theory Cracks Open the Black Box of Deep Learning —

[47] — Compress Data And Win Hutter Prize Worth Half A Million Euros — — and — Occam’s razor —

Innovation, Intrapreneurship, Entrepreneurship, Complexity, Leadership & Community Twitter: @complexityvoid