Spinoza, Descartes and suspension of disbelief in the ivory tower of economics

Here’s something I want to run by you about behavioural economics and the way economic issues are being debated in the blogosphere.

We are witnessing an implosion of long-held belief structures that go to the core of how we believed our economic system functioned. You heard Alan Greenspan admit this in his October 2008 testimony before Congress, after the financial system collapsed:

“Yes, I found a flaw,” Greenspan said in response to grilling from the House Committee on Oversight and Government Reform. “That is precisely the reason I was shocked because I’d been going for 40 years or more with very considerable evidence that it was working exceptionally well.”

Greenspan said he was “partially” wrong in opposing regulation of derivatives and acknowledged that financial institutions didn’t protect shareholders and investments as well as he expected.

Greenspan Concedes to ‘Flaw’ in His Market Ideology, Bloomberg, 23 Oct 2008

The implosion of this neo-classical, laissez-faire belief system of economics is a death we are now grieving.

Disbelief

However, the grief is getting in the way of rational conversation. It comes down to suspension of disbelief, cherished values, strongly held beliefs and fear. I believe these are major factors in how accepting we are of new ideas, and consequently in why this particular financial crisis is so devastating.

I have run into this problem on two specific occasions recently.

Despite my Austrian economics sympathies, I recently posted some articles inspired by Modern Monetary Theory (MMT).

Now, I don’t buy into some of what Modern Monetary Theory says about the source of money’s value and the role of the state in monetary affairs. But I do very much appreciate MMT’s understanding of the mechanics of the fiat currency monetary system (a system I don’t fully support, by the way).

So I have presented some MMT-based ideas from a neutral frame in order to demonstrate their applicability to the present financial crisis. Invariably, I run into a lot of spurious arguments by people who sound like they don’t understand the accounting.

Or maybe they just feel threatened on some strange existential level – as if what I am writing threatens their core belief system. I think that is a lot of what is going on. So I am writing this post to explain how the human brain processes information. And then I will make a few remarks about how this applies to the present day situation.

Suspension of disbelief

The core of my argument will come from James Montier, now at the fund manager GMO. As a strategist at Dresdner Kleinwort Wasserstein in 2005, he wrote a timeless piece on the rival accounts of belief offered by two 17th-century philosophers, René Descartes of France and Baruch de Spinoza of the Netherlands. Descartes held that people assess information for accuracy before filing it away in memory. Spinoza made the opposite claim: that people must suspend disbelief in order to process information at all. The two competing ideas have since been put to the test, and it appears that Spinoza was right about the need for naïve belief, something that has grave implications for investing, the subject of Montier’s essay.

Here is a long excerpt from what Montier wrote. The article is available online via John Mauldin (the link is at the bottom). It is a fantastic look into how people process information.

Some time ago a client asked us to compile a list of myths that the markets seemed to hold dear. We came up with twelve potential myths, ranging from ‘stocks for the long run’ to ‘dividends don’t matter’, via such topics as ‘commodities for the future’ and ‘bond supply matters’. However, this exercise also made me wonder why it was that supposedly smart people ended up believing such strange things.

This pondering sent me (as is usually the case) to the annals of psychology. To some extent these errant beliefs seem to stem from bounded awareness/inattentional blindness and framing. We have explored such elements before. However, there may well be another factor at work: we seem to be hard-wired to ‘believe’.

Daniel Gilbert, a professor of psychology at Harvard, has explored how we go about believing and understanding information. In a series of truly insightful papers Gilbert and co-authors have explored the belief process using two alternative philosophical viewpoints.

Cartesian systems
The first view is associated with the work of René Descartes. When it came to belief, Descartes suggested the mind performs two separate mental acts. First, it understands the idea. Second, it assesses the validity of the idea that has been presented. This two-stage process seems intuitively correct. After all, we can all imagine being presented with some novel idea, holding it in our minds and then pondering its truth or otherwise. The Cartesian approach fits well with folk psychology.

Descartes was educated by Jesuits and, like many 17th-century philosophers, generally deployed psychology and philosophy in aid of theology. Like anyone of any sense, Descartes was well aware that people were capable of believing things that weren’t true. In order to protect the Church, Descartes argued that God had given man the power to assess ideas, so it clearly wasn’t God’s fault when people believed things that weren’t true.

As Gilbert (1993, op cit) notes, Descartes’ approach consisted of two axioms: first, the mental separation and sequencing of understanding and believing; second, that people have no control over how or what they understand, but are totally free to believe or disbelieve ideas as they please.

Spinozan systems
Spinoza’s background and thinking could not have been much more different from Descartes’. Born a Jew, Baruch de Espinoza (later to become Benedict Spinoza) outraged his community and synagogue. The tensions finally resulted in Spinoza being excommunicated, accused of abominable heresies and monstrous deeds. The order of excommunication prohibited other members of the synagogue from having any contact with Spinoza.

Freed of the need to conform to his past, Spinoza was able to explore anything he chose. One of the areas to which he turned his considerable mental prowess was the faults contained in the Cartesian approach. Spinoza argued that all ideas are first represented as true and only later (with effort) evaluated for veracity. Effectively, Spinoza denied the parsing that Descartes put at the heart of his two-step approach. Spinoza argued that comprehension and belief are a single step; that is to say, in order for somebody to understand something, belief is a necessary precondition. Effectively, all information and ideas are first accepted as true, and only sometimes evaluated as to their truth. Once this process is complete, a ‘corrected belief’ is constructed if necessary.

Libraries
Gilbert et al (1990, op cit) use the example of a library to draw out the differences between these two approaches. Imagine a library with several million volumes, of which only a few are works of fiction. The Cartesian approach to filing books would be to put a red tag on each volume of fiction and a blue tag on each volume of non-fiction. Any new book that appeared in the library would be read, and then tagged as either fiction or non-fiction. Any unread book simply sits in the library until it is read.

In contrast, a Spinozan library would work in a very different fashion. Under this approach a tag would be added to each volume of fiction, but the non-fiction would be left unmarked. The ease of this system should be clear; it requires a lot less effort to run than the Cartesian approach. However, the risk is that a new, as-yet-unread book will be taken for non-fiction.

Gilbert et al note that under ideal conditions both systems produce the same outcome if allowed to run to conclusion. So if you picked up a copy of Darwin’s ‘The Expression of the Emotions in Man and Animals’ and asked the Cartesian librarian what he knew about the book, he would glance at the tag and say non-fiction. The Spinozan librarian would do pretty much the same thing, concluding the book was non-fiction because of the absence of a tag.

However, imagine sneaking a new book into the library, say the latest Patricia Cornwell thriller. If you took the book to the librarian and asked them what they knew about the book, their response would reveal a lot about the underlying process governing the library’s approach to filing. For instance, the Cartesian librarian would say “I don’t know what sort of book that is. Come back later when it has been read and tagged appropriately”. The Spinozan librarian would glance up and see the absence of a tag and say “it doesn’t have a tag so it must be non-fiction” – an obviously incorrect assessment.
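
To make the analogy concrete, here is a minimal sketch of the two filing systems in Python. It is purely illustrative: the class and method names are my own invention, not anything from Gilbert et al, but the default behaviours match the two librarians described above.

```python
# Illustrative sketch of the two librarians' default behaviours.
# Names are my own invention, not from Gilbert et al (1990).

class CartesianLibrarian:
    """Every book is read and explicitly tagged before it can be classified."""
    def __init__(self):
        self.tags = {}  # title -> "fiction" or "non-fiction"

    def catalogue(self, title, is_fiction):
        self.tags[title] = "fiction" if is_fiction else "non-fiction"

    def classify(self, title):
        # An unread book has no tag, so judgment is withheld.
        return self.tags.get(title, "unknown - come back when it has been read")


class SpinozanLibrarian:
    """Only fiction is tagged; the absence of a tag reads as non-fiction."""
    def __init__(self):
        self.fiction_tags = set()

    def catalogue(self, title, is_fiction):
        if is_fiction:
            self.fiction_tags.add(title)  # evaluation happened and left a mark

    def classify(self, title):
        # "No tag" is indistinguishable from "evaluated and found non-fiction".
        return "fiction" if title in self.fiction_tags else "non-fiction"


# Sneak in a new thriller that nobody has read (i.e. evaluated) yet.
print(CartesianLibrarian().classify("new Patricia Cornwell thriller"))  # withholds judgment
print(SpinozanLibrarian().classify("new Patricia Cornwell thriller"))   # "non-fiction" - wrong
```

The Spinozan filing system is cheaper to run; the price is that anything unevaluated defaults to ‘non-fiction’, i.e. accepted as true.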

A testing structure
The picture below, taken from Gilbert (1993), shows the essential differences between the two approaches, and also suggests a clever way of testing which of the two has more empirical support.

Say an idea is presented to the brain, and the person considering the idea is then interrupted in some fashion. Under a Cartesian system, the person is left merely with an understanding of a false idea, but no belief in it. However, if people are better described by the Spinozan approach, then interrupting the process should leave them believing the false idea. So giving people ideas or propositions and then interrupting them with another task should help reveal whether people run Cartesian or Spinozan systems when it comes to belief.
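
Before the evidence, a toy simulation may help fix the logic of the test. This is only a sketch of the Spinozan prediction: the probabilities are numbers I have invented for illustration, and only the direction of the effect matters.

```python
# Toy Monte Carlo of the interruption test under a Spinozan process.
# The probabilities are invented; only the direction of the effect matters.
import random

def spinozan_trial(idea_is_true, interrupted):
    belief = True  # comprehension stores every idea as true by default
    # The effortful correction step runs less often when interrupted.
    p_correction = 0.2 if interrupted else 0.7
    if random.random() < p_correction:
        belief = idea_is_true  # 'unbelieving' overwrites the default
    return belief

random.seed(1)
trials = 10_000
for interrupted in (False, True):
    believed = sum(spinozan_trial(False, interrupted) for _ in range(trials))
    print(f"interrupted={interrupted}: {believed / trials:.0%} "
          "of false ideas end up believed")
# A Cartesian system predicts no such rise: interruption would leave
# the false idea understood but unbelieved.
```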

The empirical evidence
It has long been known that distracting people can impact the belief they attach to arguments. For instance, in their 1994 review Petty et al report an experiment from 1976 which clearly demonstrated the impact of distraction techniques.

To test the impact of distraction, students were exposed to a message arguing that tuition at their university should be cut in half. The students listened to the arguments, presented over headphones; some heard strong arguments, others relatively weak ones. At the same time, the students were subjected to a distraction task that consisted of tracking the positions of Xs flashed on a screen in front of them. In the high-distraction version of the task the Xs flashed up at a fast pace; in the low-distraction version the rate was heavily reduced.

The results Petty et al found are shown in the chart below. When the message was weak, highly distracted people showed much more agreement with it than did people who suffered only mild distraction. When the message was strong and distraction was high, the students showed less agreement than when the message was strong and distraction was low. Distraction did exactly what it was meant to do: it prevented people from concentrating on the important issue.

Petty et al conclude: “Distraction, then, is an especially useful technique when a person’s arguments are poor because even though people might be aware that some arguments were presented, they might be unaware that the arguments were not very compelling.” Something to bear in mind at your next meeting with brokers, perhaps? The next time an analyst comes around and starts showing you pictures of the next generation of mobile phones, just stop and think about the quality of their investment arguments.

Is there more direct evidence of our minds housing a Spinozan system when it comes to belief? Gilbert et al (1990, op cit) decided to investigate. They asked people to help them with an experiment ostensibly concerning language acquisition in a natural environment. Participants were shown supposedly Hopi words with an explanation (such as ‘a monishna is a bat’). They had to wait until the experimenter told them whether the statement they had been given was actually correct in Hopi or whether it was false.

Subjects also had to listen for a specific tone which, if they heard it, required them to press a button. The tone sounded very shortly after the participant had been told whether the statement was true or false. This was aimed at interrupting the natural processing of the information. Once they had responded to the tone, the next Hopi word appeared, preventing them from going back and reconsidering the previous item.

If subjects worked in a Spinozan way, then when later asked about their beliefs they should recall false propositions as true more often after an interruption than the rest of the time. As the chart below shows, this is exactly what Gilbert et al uncovered.

Interruption had no effect on the correct identification of true propositions (55% when uninterrupted vs. 58% when interrupted). However, interruption did significantly reduce the correct identification of false propositions (55% when uninterrupted vs. 35% when interrupted). Similarly, one can look at the number of true–false reversals (the right side of the chart above). When false propositions were uninterrupted, they were misidentified as true 21% of the time, roughly the same rate at which true propositions were misidentified as false. When interrupted, however, the situation changed: false propositions were identified as true some 33% of the time, significantly higher than the rate at which true propositions were identified as false (17%).
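
For those who prefer the numbers laid out, here are the same rates with the interruption effect computed in percentage points. The figures are simply transcribed from the paragraph above (the 21% in the last row is the “roughly the same rate” the text mentions); no new data is added.

```python
# Rates quoted above from Gilbert et al (1990), transcribed from the
# excerpt; the only computation is the difference in percentage points.
rates = {
    # outcome: (uninterrupted, interrupted)
    "true identified as true":   (0.55, 0.58),
    "false identified as false": (0.55, 0.35),
    "false recalled as true":    (0.21, 0.33),
    "true recalled as false":    (0.21, 0.17),  # 0.21 = "roughly the same rate"
}
for outcome, (base, intr) in rates.items():
    print(f"{outcome}: {base:.0%} -> {intr:.0%} "
          f"({(intr - base) * 100:+.0f} pp when interrupted)")
# Only the false-proposition rows move materially: the asymmetry a
# believe-first, correct-later (Spinozan) system predicts.
```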

In another test, Gilbert et al (1993, op cit) showed that this habit of needing to believe in order to understand can have some disturbing consequences. They set up a study in which participants read crime reports with the goal of sentencing the perpetrators to prison. The subjects were told that some of the statements they would read would be false and would appear on screen as red text, while the true statements would be in black text.

By design, the false statements in one case happened to exacerbate the crime in question; in the other case they attenuated it. The statements were shown crawling across the screen, much like the tickers and prices on bubble vision. Below the text was a second row of crawling numbers. Some of the subjects were asked to scan the second row for the number 5 and to press a button when they saw it.

At the end of the experiment, subjects were asked to state what they thought represented a fair sentence for the crimes they had read about. The chart below shows that, just as in the previous example, interruption significantly reduced the correct recognition of false statements (69% vs. 34%) and increased the misidentification of false statements as true (23% vs. 44%).

The chart below shows the average recommended sentence depending on the degree of interruption. When the false statements were attenuating and processing was interrupted, there wasn’t a huge difference in the recommended jail term; the interrupted sentences were around 4% lower than the uninterrupted ones. However, when the false statements were exacerbating and interruption occurred, the recommended jail term was on average nearly 60% higher than in the uninterrupted case!
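
To see what that asymmetry means in years rather than percentages, here is the arithmetic applied to a purely hypothetical baseline term. The baseline is my invention; only the 4% and 60% figures come from the excerpt, and in the study each condition had its own baseline, not reported here.

```python
# Hypothetical baseline sentence; only the percentage changes come from
# the excerpt. In the actual study each condition had its own baseline.
baseline_years = 6.0  # invented for illustration

attenuating = baseline_years * (1 - 0.04)   # ~4% lower when interrupted
exacerbating = baseline_years * (1 + 0.60)  # ~60% higher when interrupted
print(f"attenuating false statements, interrupted:  {attenuating:.1f} years")
print(f"exacerbating false statements, interrupted: {exacerbating:.1f} years")
```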

The reptilian response 

Edward here. What Gilbert, Petty and Montier have demonstrated is that human beings have to suspend disbelief in order to process information and make judgments based on it. Unfortunately, distractions (think bread and circuses) can lead people to believe something is true when in fact it is not, with grave implications for investing.

However, that’s not what happens with strongly held beliefs at all. I remember talking to my mother about the Montier piece and asking her about her own strongly held views on religion. Her answers were interesting because they demonstrated to me an unwillingness even to process information that ran counter to her most cherished and strongly held beliefs. She admitted this interpretation was correct when we discussed it afterward. Remember what Montier said: “in order for somebody to understand something, belief is a necessary precondition.” The point is that she didn’t even process the information, so great was the existential threat it posed to her.

Human beings have a very clear view of self, and this is strongly intertwined with a belief system that generates what we describe as core values. So if you attack those core values, you are likely to get an irrational, reptilian response. There is no processing of information of the kind I described in “Through a glass darkly: the economy and confirmation bias in the econoblogosphere”; the cognitive dissonance is too great. Instead, what you get is fear and an irrational defence. This is what my mother described.

The resolution of cognitive dissonance

So the world view widely held in Anglo-Saxon economies, that markets are self-regulating and self-equilibrating, is under threat because of the dislocations of the last two years. However, this view is deeply entrenched, having been built up over nearly three decades. It is now adhered to with almost religious fervour (see my thoughts on this in “The year in review at Credit Writedowns – Kleptocracy”). People are not going to relinquish the self-equilibrating/self-regulating view overnight, and not without overwhelming evidence to the contrary; the cognitive dissonance would be too great.

What this effectively means, to me, is that financial calamity and economic collapse are really the only things that will dislodge this thinking. Maybe I’m wrong, and the markets really are self-regulating and self-equilibrating. Recent events suggest otherwise, as does the frequency of what were once viewed as similarly improbable market disturbances. And maybe I’m wrong about suspension of disbelief. Perhaps humans are resilient and can process information despite the existential threat it poses to their sense of self. For the sake of the economy, I sure hope I am wrong.


Source

Scepticism is rare, or, Descartes vs. Spinoza – Investor Insight
