“The fool doth think he is wise, but the wise man knows himself to be a fool”
– William Shakespeare, As You Like It, Act 5, Scene 1
As we ponder the possibility of a recovery from the recent economic slowdown, amid other political and economic uncertainties, it is easy to think that this century and millennium, whose birth was celebrated so enthusiastically thirteen years ago, have not got off to the most auspicious start. It seems that leaders worldwide have rarely, if ever, had such a full agenda. To address it effectively they need the trust of their followers. However, public confidence in institutional leadership appears to be on the decline (Harris Poll data, cited in Barrett, 2012). Although it is difficult to identify a single reason for such a decline, we might speculate that the perception that individuals in leadership roles have of their own performance (judging, for example, by the growth in the financial rewards and bonuses they expect) does not match their followers’ perception of that performance. At the same time, the apparent sincerity of leaders’ reactions to this mismatch suggests that self-deception might be involved. This article examines the phenomenon of self-deception with examples from people in prominent corporate or public leadership roles. We also aim to explore the nature of self-deception in leadership and the potential consequences of this phenomenon from a developmental perspective.
What Constitutes Self-Deception?
Of all the major political decisions made so far this century, the decision to invade Iraq and institute ‘regime change’ has been the most controversial. Justified principally on the basis of Iraq’s supposed possession of weapons of mass destruction (WMDs), the invasion was launched in 2003 by US President George W. Bush, supported by the UK’s Prime Minister Tony Blair. Despite the subsequent failure to find any trace of the infamous WMDs, both leaders have since sought to justify their decisions. In the UK, at the start of 2010, Tony Blair was called to account before the Chilcot Inquiry into the Iraq war. The inquiry has yet to publish its findings; however, Tony Blair’s testimony was full of conviction that he had made the right decision. When asked if he thought Saddam Hussein had weapons of mass destruction, he said: “I did believe it. And I did believe it beyond doubt.” Blair’s conviction was apparently so strong that what started for him as a question of probability, based on a supposedly balanced assessment of the evidence, became a matter of incontrovertible knowledge. On the one hand, he knew there was evidence both for and against the presence of such WMDs – he had, after all, heard the request of Hans Blix, the UN’s chief weapons inspector, for more time to find them. On the other hand, he decided that the matter was beyond doubt. Is this a case of self-deception? We will come back to this question later.
In the business domain, Jamie Dimon, CEO of JP Morgan, is on record testifying before the US Senate Banking Committee in June last year that he was “dead wrong” about statements he had made the previous month comparing an incipient $2bn trading loss to a “tempest in a teapot”. How did Jamie Dimon fail to see that a trading unit that had once been extremely profitable was so risky that massive losses could also accrue? Why did he fail to get his Chief Investment Officer, Ina Drew, to explain the extent of the bank’s exposure? Harvard Business School Professor Max Bazerman (2012) refers to this phenomenon as “bounded awareness” – the human tendency to fail to perceive and process important information that is readily available. Jamie Dimon somehow convinced himself that something he knew might be a massive problem was actually relatively minor.
The phenomenon of self-deception has long been of interest to dramatists, psychologists and philosophers, amongst others. Its existence is commonly accepted; however, analysis of the concept reveals a variety of difficulties in understanding what exactly constitutes self-deception. How, for example, is it possible for a person to believe something is true and, at the same time, believe it is not true? Similarly, if deception is by definition intentional, how can people deceive themselves knowingly without undermining the deception? What is the nature of the ‘self’ that is being deceived? And what assumptions does the claim of deception entail about our conception of the nature of ‘truth’? Despite the attention of many researchers, there remains no dominant theory of self-deception.
One way of illuminating the paradox of self-deception is through the discrepancy between the first and the third person perspectives. Self-deception is sometimes called “wilful blindness”, implying that a person deliberately closes their eyes to what is obvious to others and should also be obvious to them. Jeffrey Skilling and Kenneth Lay, ex-CEO and Chairman, respectively, of the Enron Corporation, were convicted of fraud in 2006 on the basis that they knew the true state of the business before it went bankrupt. Although they both pleaded not guilty, their claim not to know the financial condition of their business was seen as either an outright lie or a case of self-deception. Note that the ‘wilful blindness’ concept assumes that a person has control over whether they pay attention to particular data in their environment. While Skilling and Lay might have protested that they were not aware of the facts surrounding their business, the predominant argument was that they could and should have been aware. Although this position is justified in legal terms, psychologically speaking it is only the self-deceiver who can know the reality that is presented to him or her. This is why postmodernists have argued that traditional perspectives on self-deception assume a third person account of the ‘truth’ and so are biased against the first person’s account.
An exception to such third person accounts is Leon Festinger’s (1957) theory of cognitive dissonance. This describes the discomfort that people feel when they hold conflicting cognitions simultaneously, and how such uncomfortable feelings are resolved. The classic illustration is that of the smoker who, knowing that cigarettes can cause lung cancer, feels uncomfortable with the tension between the idea of himself as a sensible person wanting to live a long and healthy life, and the idea that by smoking he is actively taking action to shorten his life. Typically this tension is reduced by the smoker dismissing the idea that cigarettes are harmful or that they are likely to be the cause of an early death, captured by the phrase “I never thought that it would happen to me”. Cognitive dissonance theory suggests that, when faced with contradictory thoughts, people are motivated to reduce the contradiction by downplaying or ignoring one or other side, and are thereby deceiving themselves. This theory stands without the need for objective verification of the facts; it is concerned only with the first person perspective – the individual’s attempts to reconcile an internal conflict.
Unconscious bias is also sometimes described as self-deception. For example, the implicit association test (IAT) attempts to measure unconscious bias by timing the speed of responses to pairs of stimuli (Greenwald et al, 1998). Typically in such a test people are asked to categorise words or pictures into one of two groups. The relative speed with which they pair pictures of people of either Caucasian or Asian race with words having a positive or negative connotation is suggested to reveal a racial bias of which people are unaware and which they may deny if confronted. Such a bias might show up, for example, in the recruitment selection decisions made by a manager, despite the manager’s intention to maintain impartiality and apply consistent procedures in evaluating applications. Is this self-deception, though? If there is no way the bias could be perceived by the manager, which ‘self’ is being deceived? This example clearly implies only a third person perspective in evaluating bias as self-deception.
It appears that the term self-deception is applied to many different phenomena, which makes defining it a difficult task. While in a wider sense the phenomena described above could all be seen as self-deception, a stricter definition implies that one deceives oneself only if one acts in ways that prevent one from becoming informed about unwanted information (Bandura, 2011; Fingarette, 2000). It seems fair to add that, to be called self-deception, the phenomenon should also make sense from both first and third person perspectives. This, of course, opens up a new set of problems about what is meant by ‘self’ (Bachkirova, 2011).
However, defining self-deception is only part of the problem. It is also difficult to explain how a person can be both the perpetrator and the victim of a deception. One way to resolve this paradox is to conceive of the mind as partitioned (von Hippel & Trivers, 2011; Kurzban, 2011). In this conception, different parts of the mind generate different representations of reality according to the current goals of each particular part. This position contradicts common sense, as it suggests that there is no unified self, but it has recently been supported by findings from neuroscience (e.g. Gazzaniga, 1992). According to this position, our everyday mental functioning is the result of the operation of different mental modules that aim to fulfil particular tasks. When one module is dominant, the mind selects from the environment only the information that is relevant to the fulfilment of that module’s particular task. Information that might be important for another module is simply ignored as irrelevant. So, if we return to the example of Tony Blair’s belief in the existence of WMDs, it could be argued that he had a highly activated mental module that dismissed information that would have been relevant to other modules – for example, those concerned with efficiency and co-operation in decision-making. Although partitionists see self-deception only as a case of selectivity in the perception of information, people reflecting on their actions may come to realise that their thinking, being determined by one of their dominant modules, was self-deceptive. When they are able to see what they gained by this selectivity, the self-deception becomes apparent to them and so is acknowledged from the first person perspective.
The Benefits and Costs of Self-Deception
From the perspective of evolutionary psychology, for self-deception to persist and to be so common a phenomenon, there must be some gain in deceiving oneself. On a personal scale the gain is usually to be found in avoiding distress and/or acquiring benefits such as an enhanced self-image. However, some authors argue that the benefits of self-deception should be visible from both a first and a third person perspective to be motivational (e.g. Pinker, 2011). For example, one particular theory suggests that self-deception evolved in order to make it easier for people to deceive others (von Hippel and Trivers, 2011). The assumption underpinning this theory is that those who can conceal deception from others, e.g. when trying to promote their abilities, have an evolutionary advantage. In situations where people might be at risk of having their deception detected, efforts to deceive others about one’s abilities are likely to be more successful if accompanied by self-deception. When a person is able to make a claim with the utmost sincerity, it is less likely that others will uncover the deception. Self-deception may also carry some advantages in reducing cognitive load and anxiety, and so improve performance.
It is also interesting to note that according to the evolutionary approach, individuals in leadership roles may have a need to be particularly skilful at self-deception. The role of a leader is to envisage the future and to inspire their followers, e.g. according to the concept of “Transformational Leadership” (Bass & Avolio, 1994). How might self-deception be involved in these functions? The ability to imagine future scenarios has obvious adaptive benefits, allowing people to prepare in case of problems and to secure future rewards. As it is in the job description of leaders to do this, they may also be more prone than most of us to systematic errors in anticipation. Equally, in order to instil confidence in their followers, leaders need to appear confident even when they may be in doubt. Most people tend to exaggerate the positive or negative emotional consequences of future events, but in leaders this tendency may be on a larger scale and so potentially more costly.
In terms of the costs of self-deception, also more pronounced in leadership, the obvious disadvantage is that, if self-deception helps to make deception more effective, more people become victims of deception, with implications for the scale of retribution once the deception is uncovered. Also, as self-sanctions are disengaged by self-deception, it is possible that people with a greater individual tendency for self-deception may have more capacity to harm others without perceiving their actions as harmful (Frost et al, 2001). The scale of this harm may at times extend to the loss of life during conflict and the loss of economic stability of businesses, or even whole countries. Because of the potential magnitude of such harm, leaders need to be aware of their capacity to deceive themselves, and therefore be capable of offsetting it. It is with considerable irony that we discover that Jamie Dimon is quoted in a speech from 2009 saying, “You have to fight self-deception. Human beings are experts at it. I do it, too.”
As there is good reason for keeping self-deception at a reasonably low level, it is surprising how little is known about individual differences in the tendency for self-deception. Even less has been written about ways to influence it. However, by applying developmental theory it is possible to offer an interesting perspective with which to understand self-deception, one that might also suggest a unique way of working with leaders to minimise it.
A Developmental Perspective on Self-Deception
Developmental theories emphasise changes in the way adults make sense and meaning over the course of their lifespan. Such development is considered to be analogous to an expansion of ‘consciousness’ or a growing awareness of the nature of the self and the reality of the universe. For example, Kegan (1982, 1994) identifies five stages through which people develop, characterised by qualitative changes in the way people see and understand themselves and their world. In such theories, individuals’ understanding and experience of reality is viewed as a construction of their minds − a construction that becomes progressively more complex and sophisticated over time as the mind develops, but which is nevertheless idiosyncratic and unique, taking the form of a narrative that typifies the way the person makes sense and meaning from their experience.
By implication, a narrative is limited in content. It contains what is most salient and meaningful for its creator, but it necessarily leaves out elements of experience that passed unnoticed. Thus, the narrative of individuals about themselves and their engagements in the world is an indication of the stage they are at and of what they are and are not able to perceive. If something is missed out of the narrative, can this be seen as self-deception? We would argue – not necessarily. If the person is simply not yet capable of creating a wider or richer narrative, and so does not notice some elements of themselves and their environment, this does not constitute self-deception. The ‘filters’ that prevent such perception are just an indication of the stage they are at – they are innate mechanisms typical of a particular stage of development. However, if a person is already capable of perceiving certain aspects of themselves or their environment according to their stage of development, but nevertheless does not do so because filters that were present earlier in life become reactivated, then this would indicate self-deception. For example, when a person at Kegan’s stage 3 or 4 of development, already capable of critical and abstract thinking, begins to see a situation only in ‘black and white’ terms, this might constitute self-deception, particularly when he benefits from this ‘black and white’ picture. By the same token, a person could not be said to be deceiving himself if he had not yet developed these higher capacities (Bachkirova, 2013).
For example, if we assume, as might be reasonable for the leader of a country, that Tony Blair’s developmental level was at, or close to, stage 5, he should have been able to consider the WMD situation from different perspectives and should not have invested himself totally in a single perspective, however ‘noble’ he considered that position to be. By contrast, it would appear that he actively avoided information that ran counter to his beliefs. Hence we could take his case as one of clear self-deception. However, if Blair was only at stage 4 developmentally, we might say that this was not a case of self-deception, because the views that he took, and his unshaken belief that his was the right view, are not inconsistent with the way people make sense at this stage and how they see their ‘self’.
Therefore, self-deception from a developmental perspective implies consideration of the developmental stage of the self-deceiver and the capacities available to him at each stage of development. Self-deception can be seen as a temporary step back from the stage achieved. This is why it appears to both the perpetrator and the observer as something unusual, something of a ‘betrayal’ of the self. Such developmental regressions are not, in fact, unusual in principle. Many developmental theorists suggest that development is not necessarily a matter of continual forward progress, but is more accurately described as movement around a developmental centre of gravity, which involves some regression when needs from a previous stage become activated (Bachkirova, 2013).
It is also to be expected that the nature of the self-deception to which a person is vulnerable, as outlined above, will change with time (Vaillant, 1992; Cook-Greuter, 2005). At later developmental stages people are typically more aware of the ways in which their thinking could be biased or distorted, and are more curious to explore and find ways to prevent this. They become more emotionally secure in themselves, their psychological defences become looser and their vulnerabilities are less threatening to them (Vaillant, 1992). As King and Kitchener’s (1994) model of the stages of reflective judgement suggests, as people develop, their assumptions regarding what constitutes “truth” change and become progressively more sophisticated. They also become more open to alternative perspectives. The following examples show what self-deception might look like in leaders at three different stages of development.
Self-Deception at Stage 3 – The Socialized Mind
The person at Stage 3 has grown in terms of being able to hold the perspectives of others. They are now aware that others may have a valid perspective that is different to their own, and that this must somehow be taken into account if they are to maintain a relationship with that person. The person seeks out communities that are felt to be attractive and which share similar aspirations and ideals, such as a professional grouping. The norms, values and expectations of that group come to determine the person’s behaviour. In effect, he or she becomes socialized. Such socialization comes at a price, which is that the coherence of the person’s thinking about themselves and others is fragile. They are dependent on other people in their chosen community to validate their views and self-image, and such validation is not always forthcoming (Kegan, 1982).
An example of a leader’s self-deception at this stage of development might be that of a team leader who feels threatened by, and is competitive towards, the leader of another team within her organisation. Although she is usually capable of taking other people’s needs into consideration and accurately reading their moods and intentions, in this case she misinterprets the efforts of the other leader to establish a productive relationship. She avoids direct communication with this leader as often as possible because she feels hostility and negativity expressed towards herself by the other person. At the same time she remains adamant that she is the one and only party to the relationship who continually exemplifies good communication and who puts team spirit first.
Self-Deception at Stage 4 – The Self-Authoring Mind
Stage 4 is characterised by the establishment of an autonomous self, guided by the person’s own values, standards and goals. Here, the individual is no longer subject to the views of actual or internalised others in defining himself. He marches to his own drummer and can take full responsibility for the consequences. The self-authoring person knows who he is, but having “found himself” he is reluctant to give up his self-concept. While he can respect alternative perspectives and viewpoints, he is likely to believe that these too could be as subjective as his own ideas, and therefore he prefers to stick with his own point of view (Kegan, 1982).
An example of self-deception in such a leader might be an executive who disagrees with a decision advocated by someone of higher status on the executive committee but finds himself unable to express his concern. In order to preserve his self-image as someone who is always prepared to speak up for his position, he re-interprets the situation as one where there was no possibility of debate, and reinforces this interpretation by describing the situation as such to his subordinates. Although the executive genuinely believes his version of events, he avoids providing any details of the meeting and unconsciously changes the topic when asked for more detail.
Another example of such self-deception would be that of self-authoring leaders participating in ‘groupthink’ (Janis, 1972). The self-deception here lies in assuming that one’s decision-making process is adequate for the situation, when the main motivation for this complacency is a desire to belong, or not to ‘rock the boat’. The ill-fated takeover of ABN Amro by the Royal Bank of Scotland (RBS) in 2007 appears to be a classic example of such groupthink within a board of company directors. One of the 17 members of the RBS board subsequently confessed to the Financial Services Authority (FSA) that no-one ever said he or she was worried by the ABN deal. The FSA report states: “It is very difficult to reconcile this approach with the degree of rigorous testing, questioning and challenge that would be expected in an effective board process dealing with such a large and strategic proposition” (FSA, 2011).
Self-Deception at Stage 5 – The Self-Transforming Mind
People at stage 5 are aware of the limitations and idiosyncratic nature of their history, character and frame of reference. They see that the self that has served them well up to stage 4 is, in fact, a carefully crafted fiction, and a limitation to their future growth. As a consequence, they are willing to seek out others whose differences of perspective can provide fertiliser for their further development. They become progressively less invested in any single way of thinking about the world and themselves, apply fewer dichotomies and polarities and are more open to alternative, even opposite, possibilities. For them, lifelong learning becomes a reality (Kegan, 1982).
Their self-deception in a position of leadership might be manifested in subtle ways, and they would be curious to explore any sign of such an occurrence, and not defensive in doing so. For example, such a leader might discover that her strong efforts to convince her colleagues about a particular course of action were not solely in the interests of clarity and of achieving the best decision, as she believed at the time, but also stemmed from a competitive desire on her part to be ‘right’ and to be seen to show strong leadership.
What Can Leaders Do to Counter Self-Deception?
People who attain significant leadership roles in life frequently show an interest in leading at an early age, for example at school in sport or in other activities. Many learn about their human fallibilities through experience, and are thus able to find ways to offset these later in their careers. However, history is riddled with stories of leaders who wielded immense power whilst displaying tendencies to narcissism, paranoia or sadism (Coolidge & Segal, 2009). No doubt such leaders were able to convince themselves that their actions were justified, but experience itself is not sufficient to overcome the forces of self-deception.
The developmental perspective that we have outlined suggests that self-deception in a variety of forms is likely to be present in leaders at all levels, even the very highest. It occurs when a leader has a motive for self-deception that reflects a stance from a developmental stage he has already surpassed. In a way it is a sign of regression, but a temporary one, because the leader can potentially become aware of the issues involved. In this respect it is very interesting that Jamie Dimon was able to own up to self-deception in a way that Bob Diamond, the ex-CEO of Barclays, appeared unable to when giving evidence to the Treasury Select Committee on the rigging of key LIBOR rates.
Although self-deception seems to be a pervasive feature of human nature, it is not inevitable. In fact, there are various means by which leaders, and indeed all those concerned with the quality of their engagement with the world, can regain or enhance their awareness of the potential for self-deception. Here are some of these means:
Accepting that self-deception is the human condition
Jim Collins, in the book Good to Great (2001), refers to the compelling modesty of the top-level leaders of a selection of great companies. Such leaders lead, not through the cult of their own personalities, but by assembling teams of competent people around them. Those who think they are above self-deception are those most likely to suffer from it. Leaders that can accept that they are human and are vulnerable to the same faults and weaknesses as anyone else are then able to create the conditions for others to identify and correct their flawed thinking.
Becoming an active participant in one’s own development
Self-awareness is the first half of the battle in combating self-deception: necessary, but not sufficient on its own. By gaining an understanding of how adults develop cognitively and socio-emotionally over their lifespan, and by studying their own development, leaders can position themselves optimally to counter the forms of self-deception to which they are most vulnerable. Developmental practices that appeal to individuals in leadership roles are often ones that help to improve the way organisational issues are addressed and problems solved. Examples can be found in the work of theorists such as Chris Argyris (Action Science, 1995), Reg Revans (Action Learning, 1980) and Bill Torbert (Action Inquiry, 2004). Such practices typically involve the identification and testing of underlying assumptions, as well as structured and iterative processes to test whether proposed solutions are effective. Many people also find meditation and mindfulness practices beneficial as ways of calming the mind, creating insight, and managing stress and anxiety.
Learning about cognitive biases and improving your thinking skills
The great investor Warren Buffett, whose company Berkshire Hathaway is one of the largest companies in the world, has made a practice of reviewing the mistakes he has made in his annual letters to shareholders since 1979; he is now 82. Buffett also makes a habit of testing his thinking against that of his partner and vice-chairman, Charlie Munger, who, according to Buffett, has always emphasized the study of mistakes rather than successes. Munger (2005) says that he tries not to speculate on a subject unless he can state the arguments against his position better than the people who support it. Neither man pretends that he does not make mistakes, and both seek to offset their own fallibility by identifying the sources of their errors and by finding ways to counteract them. There is a considerable literature on cognitive biases, and research is under way to develop activities that can help people learn to recognize and correct their own biases.
Using support from a mentor, counsellor, therapist or coach
Almost by definition, the perspective of another person can counter one’s own blind spots and tendency for self-deception. Consequently, whatever a person’s stage of development, the benevolent guidance and support of another – ideally someone further advanced developmentally – is essential. In some cases a clinically trained psychotherapist may be the person best qualified for such a role. But for most, it is simply a question of having a positive relationship with someone who cares enough to provide honest feedback and practical advice.
In this article we have discussed the phenomenon of self-deception within the context of leadership. At the conceptual level, self-deception remains a puzzle, yet the examples we have provided show that this phenomenon can be a costly one for all of us. By introducing a developmental perspective on self-deception we may be adding another layer of complexity to what is already a difficult and even controversial concept. We hope, however, that despite this complexity a greater understanding of the nature of self-deception by those in leadership roles may lead to stronger efforts to counteract it, and to more benign forms when it does occur. As Lao Tzu said, “He who controls others may be powerful, but he who has mastered himself is mightier still.”
Argyris, C. (1995). Action science and organizational learning. Journal of Managerial Psychology, 10(6), 20–26.
Bachkirova, T. (2013, in submission). Self-deception from a pragmatic perspective: towards a developmental theory, Psychological Bulletin.
Bachkirova, T. (2011). Developmental coaching: Working with the self. Maidenhead: Open University Press.
Bandura, A. (2011). Self-deception: A paradox revisited. Behavioral and Brain Sciences, 34, 16.
Barrett, P. (2012). The public perception of institutional leadership as a function of $$ spend on executive training and development. Retrieved from http://www.pbarrett.net/stratpapers
Bazerman, M. (2012). How could I miss that? Jamie Dimon in the hot seat. Retrieved from http://blogs.hbr.org/cs/2012/06/how_did_i_miss_that_jamie_dimo.html
Bass, B.M. & Avolio, B.J. (Eds.). (1994). Improving organizational effectiveness through transformational leadership. Thousand Oaks, CA: Sage Publications.
Collins, J (2001). Good to Great. London: Random House.
Cook-Greuter, S. (2005). Ego development: Nine levels of increasing embrace. Retrieved from http://www.stillpointintegral.com
Coolidge, F. L., & Segal, D. L. (2009). Is Kim Jong-il like Saddam Hussein and Adolf Hitler? A personality disorder evaluation. Behavioral Sciences of Terrorism and Political Aggression, 1, 195–202.
Dimon, J. (2009) Address to Harvard Business School MBA Class of 2009. Retrieved from: http://www.youtube.com/watch?v=9T9Kp4NE5l4
Festinger, L. (1957). A theory of cognitive dissonance. Stanford, CA: Stanford University Press.
Fingarette, H. (2000) Self-Deception. London: University of California Press.
Frost, C., Arfken, M., & Brock, W. (2001). The psychology of self-deception as illustrated in literary characters. Janus Head, 4(2), 331–357. Retrieved from http://www.janushead.org
Financial Services Authority (2011) The failure of the Royal Bank of Scotland. P229. Retrieved from: http://www.fsa.gov.uk/static/pubs/other/rbs.pdf
Gazzaniga, M. (1992). Nature’s mind. London: Basic Books.
Goleman, D., Boyatzis, R., & McKee, A. (2002). Primal leadership: Learning to lead with emotional intelligence. Boston, MA: Harvard Business School Press.
Greenwald, A. G., McGhee, D. E., & Schwartz, J. L. K. (1998). Measuring individual differences in implicit cognition: The Implicit Association Test. Journal of Personality and Social Psychology, 74, 1464–1480.
Janis, I. L. (1972). Victims of groupthink: A psychological study of foreign-policy decisions and fiascoes. Boston, MA: Houghton Mifflin.
Kegan, R. (1982). The evolving self. Cambridge, MA: Harvard University Press.
Kegan, R. (1994). In over our heads: The mental demands of modern life. Cambridge, MA: Harvard University Press.
Kegan, R. (2009). Immunity to change. Boston, MA: Harvard Business Press.
King, P. M., & Kitchener, K. S. (1994). Developing reflective judgment: Understanding and promoting intellectual growth and critical thinking in adolescents and adults. San Francisco, CA: Jossey-Bass.
Kurzban, R. (2011). Two problems with “self-deception”: No “self” and no “deception”. Behavioral and Brain Sciences, 34, 32–33.
Munger, C. (2005). Poor Charlie’s Almanack: The wit and wisdom of Charles T. Munger. Virginia Beach, VA: Donning.
Pinker, S. (2011). The better angels of our nature. London: Penguin.
Revans, R. (1980). Action learning: New techniques for management. London: Blond & Briggs.
Torbert, W. (2004). Action inquiry. San Francisco, CA: Berrett-Koehler.
Vaillant, G. (1992). Ego mechanisms of defense: A guide for clinicians and researchers. Washington, DC: American Psychiatric Press.
von Hippel, W., & Trivers, R. (2011). The evolution and psychology of self-deception. Behavioral and Brain Sciences, 34, 1–15.
About the Authors
Tatiana Bachkirova, MEd, MSc, PhD, C Psychol, AFBPsS
Tatiana is an academic, coach and coaching supervisor. At Oxford Brookes University, UK, she is a Reader in Coaching Psychology, teaching and supervising coaches on the MA and Doctoral programmes in Coaching and Mentoring. She is also responsible for an MA in Coaching and Mentoring Practice run in collaboration with the University of Hong Kong. Tatiana is a Chartered Occupational Psychologist with specific professional expertise in coaching psychology and coaching supervision. She is an active researcher and the author of many articles, book chapters and three books, including Developmental Coaching: Working with the self. In 2011 she received an Achievement Award in recognition of her distinguished contribution to coaching psychology from the British Psychological Society’s Special Group in Coaching Psychology.
Nick Shannon MA, MBA, MSc, C Psychol, AFBPsS
Nick Shannon is the founder and principal of Management Psychology Limited, a UK-based practice specializing in organizational and management development. Nick is a Chartered Psychologist, a member of the British Psychological Society and a founding member of the Association of Business Psychologists. After studying Psychology and Philosophy at Oxford University, he worked variously as a commodity and derivatives trader, a director of a foreign exchange business, and a restaurateur. Now working as a consultant to organisations in the private and public sectors, Nick believes passionately in the value of combining philosophical inquiry with rigorous scientific methodology. He has a particular interest in adult developmental psychology and the cognitive and emotional transitions that managers must make in order to work effectively at successively higher levels of complexity and hierarchy. His work with clients is focussed on helping them develop great management teams, in order to improve performance and provide a better environment for all employees.