
10/13 – Transdisciplinary Reflections

Alfonso Montuori

History is not an outdated Operating System. Or, why those who ignore history are doomed to reinvent the wheel (and make me cranky).


A colleague and I were recently discussing texts for an upcoming course. Robert and Michele Root-Bernstein’s Sparks of Genius: The Thirteen Thinking Tools of the World’s Most Creative People came up, an excellent choice. My colleague replied glumly that she liked the book a lot, but students would inevitably get all riled up about having to read a book that’s more than 10 years old. This does offer my (reluctant) colleague an opportunity for a “teachable moment” about why at least some things that are over 5 years old are not past their due date and therefore completely useless, but the issue is quite deep and interesting. Where does this temporal bias come from? What if this book is still the best for the course, one of the best introductions to the creative process applied in a variety of contexts, easy to read, informative, and well-researched? Does that count for anything? Apparently not. We need to take a look at some of the factors that play into this pernicious presentism.

I have recently been reading a lot of the popular neuroscience books. Wherever I go, there they are: neuro-economics, neuro-politics, neuro-this and neuro-that. Granted, I have now reached the age where I can revel in my crankiness, refer to it dismissively as neuro-babble even when I don’t really mean it, and also point out that I have seen trends come and go. When I was in graduate school, admittedly only a little after the Pre-Cambrian, it was the social construction of X, Y, and Z. Discussing “hardwired differences,” particularly between men and women, was not a good idea. Now we have Louann Brizendine’s popular books on the male brain and the female brain, and it’s perfectly OK and even hip to explore these differences. Sadly, fewer people seem to have read Cordelia Fine’s brilliant Delusions of Gender on neurosexism, in which she demolishes Brizendine’s book and many of the excessive claims of neuro-research. She also reminds us of the journal Nature’s devastating review of Brizendine’s book on the female brain. The review included such gems as “The text is rife with facts that do not exist in the supporting references.” In other words, the references, which should be providing the support for her claims, often do not address the claims at all. (I had to repeat that, if only because it’s shocking to me.) Nature also writes that misrepresentations of scientific details are legion. Not very impressive. But apparently the book’s truthiness was not so problematic that it stopped it from becoming a hit, with the requisite appearances on Oprah, etc.

Creativity research has also been neuro-colonized, of course. And the results have been very impressive. Apparently the creative process, and particularly improvisation, involves a deactivation of the lateral prefrontal cortex. In other words, we (temporarily) shut down our critical faculties. fMRI research with improvising (and presumably rather uncomfortable) musicians and rappers actually shows this process in action. I realize that it’s very interesting to see parts of the brain light up (or not). It gives us a real insight into what’s happening in the brain during the creative process. But at the same time, I can’t help but think that the fact that creativity requires a temporary suspension of critical faculties is not exactly news.

Research published in 2012 suggested that creativity involves the ability to engage in contradictory modes of thought, both cognitive and affective, deliberate and spontaneous. Discussing this research in his recent (and highly recommended) Ungifted, Scott Barry Kaufman subtly and gently adds a footnote reminding the reader that this is a theme that has already received some attention in creativity research, pointing for instance to Csikszentmihalyi’s work. The seemingly paradoxical characteristics of the creative process and the creative person were discussed even earlier.

In the sequence of related acts which taken together as a process result in the creation of something new, there occur consistently a rhythmic alternation and occasionally a genuine resolution or synthesis of certain common antinomies. By this I mean that certain apparently contradictory principles of action, thought, and feeling, which usually must be sacrificed one to another, are instead expressed fully in the same sequence, the dialectic leading at special moments to an unusual integration. (Barron, 1964, p. 81)

The creative person is both more primitive and more cultured, more destructive and more constructive, crazier and saner, than the average person. (Barron, 1958, p. 164)

We also learn about the ongoing research on the connection between creativity and madness. Schizotypy is now a big player: not quite schizophrenia, but part of that continuum. Schizotypy is broken down into four factors: a disposition towards unusual experiences (i.e., altered states of consciousness), cognitive disorganization, introverted anhedonia (getting little or no stimulation from social interaction), and impulsive nonconformity. Schizotypy is associated with magical thinking, thin mental boundaries, and a tendency to see patterns, arguably where there are none.

In the 1958 article on The Creative Imagination cited above, published in Scientific American, and in his later books Creativity and Psychological Health (1963) and Creative Person, Creative Process (1969), Frank Barron covered much of this material, and arguably with more psychological sophistication, because the discussion revolved around human experience, not areas of the brain that light up.

This is not just a question of scholarship, or of giving credit where it’s due. It’s also about recognizing that research conducted many years ago, particularly in less overtly “scientific” ways, can offer rich insights, and, in this case, detail about human experience. Lighting up the brain is all well and good, but as fascinating as it is to see a part of the brain light up during our creativity research, or to show the light show that happens when we fall in love, surely the pretty lights simply cannot compare with the enormous complexity and overwhelming intensity of the actual experience of falling in love. To the extent that this qualitative dimension, so well articulated in research conducted 50 or more years ago, is being left out of the picture, the neuro-mania is actually creating a much “thinner” understanding of the processes in question. Barron’s research, and the research of many of the early creativity researchers, provided rich qualitative dimensions that are much less likely to be found in today’s neuro-research.

Reading a book from 1960 is not like reading the instruction manual for an operating system from 2005, which is now completely useless. But sadly this is how it seems to strike many people today. If we decontextualize the wheel, we’ll keep reinventing it. We’ll keep going over the same material, the same issues, without the benefit of those who have gone before us. Granted, Frank Barron’s creativity research did not have the benefit of the fMRI. It did, on the other hand, involve research with many truly eminent creatives, who came to Berkeley for three days to be interviewed, tested, talked with, and, as Frank liked to remind me, to eat and drink with the researchers. Of course, many of the most interesting insights emerged in that more relaxed environment. And I suspect it would be tricky to stick Norman Mailer in an fMRI.

Now, let me step back from my crankiness for a moment. First of all, let me make clear that I have no issue with neuro-science at all. I find some of the work fascinating. Being able to see in “real time” how the brain responds to certain specific situations is quite illuminating, even though there is mounting criticism about the use of fMRI, and so on.

My problem is seeing this research, and particularly attempts to popularize it, stay so narrowly limited in time and space that in many cases it’s just reinventing the wheel or providing a flashy yet anemic perspective on the issue. By limited in time and space I mean that very often researchers, as well as popularizers, seem to reference very little if any research older than 10 years, and typically stay narrowly inside the realm of neuroscience. Even when dealing with creativity, their knowledge seems to be limited to neuroscience research conducted in the last few years, as if the topic had no history of inquiry, or as if history meant nothing. There can be a not-so-subtle arrogance here, because the argument is that now we’re doing “real” science, as opposed to just talking to people. We’re dealing with objective brains as opposed to talking to subjective people. (And we should note that for many of today’s qualitative researchers the research conducted by Barron and colleagues at the Institute for Personality Assessment and Research is considered quantitative because of its use of questionnaires, as well as the in-depth interview.)

But are the researchers and popularizers entirely to blame for this not-so-latent positivism? There’s no question that academia (and social science in particular) puts a big emphasis on recent research. In some journals, one of the criteria for acceptance is the use of recent references. I can buy that. You don’t necessarily want an article with references that quit in 1980. And in neuro-science this is particularly likely to happen, because in the sciences we do believe that the most recent article is in fact likely to give us the state-of-the-art research.

The problems with the latest neuro-science rage reflect a larger social devaluing of history and qualitative research. Several years ago, I foolishly agreed to co-author a textbook on creativity in business for a major textbook publisher. We included a two-page overview of creativity in history, covering the Pre-Modern, Modern, and Post-Modern eras, to provide context and point to how what we think of as “creativity” has in fact changed over time. Below are some of the comments from the in-house reviewers, faculty members at universities that might be interested in using the text for their classes. Remember, a two-page historical overview.

(This) was of little interest to me and would be of less interest to my undergraduate students. Understanding the Shift to a PostModern World – Here we go back to the Agrarian era. Egad! What does this have to do with my students unlocking their creativity so they will be successful in the 21st century organization?

We are in the 21st century, and trust me, my students have little concern for the theoretical background underlying the Pre-modern, Modern, and Post-modern historical eras in the West. My goodness, my students can’t relate to the 1980s much less 500 years ago.

What my students want to know is how, specifically, they can be more creative and innovative so they can get and keep a job.

I teach in an MBA program. I find MBA students are very impatient with philosophical treatises and elaborations of historical context. Unless you want to market a theoretical text, I would suggest that this text delete all historical analysis.

I particularly liked “My goodness, my students can’t relate to the 1980s much less 500 years ago.” And God forbid that they should learn anything about them! With instructors like these, who needs recalcitrant students?

I completely understand the emphasis on practicality and getting a job. But I also find it terribly sad that the historical dimension is not viewed as something that could lead to a new understanding of the present and the future, that it could be the source of new insights. This painfully limited, instrumental view is a sad excuse for education. And interestingly, it’s increasingly clear from creativity research that an openness to diverse perspectives, stepping outside one’s immediate field of concern, and drawing on ideas and approaches from fields other than one’s own are all aspects of the creative process.

Framed this way, history really is just one damned thing after another, to be memorized for a test but ultimately utterly useless. How very sad. But once again, we have to ask, how is history understood, how is it taught? I can’t say I was exactly a history buff as a kid, or even as a graduate student. Part of the problem was that history was all too often taught as just that…one damned date after another. History lived in its own exclusive little silo, and no connection was made between it and anything else on God’s green earth. The aridity and sterility of this fragmentation have meant that there is no effort to connect. History is the history of wars, of leaders. At no point did I ever hear any of the music that was being played and written during the periods I studied. No novels, no idea what daily life was like. No wonder history elicits an almost visceral response. It doesn’t have to be this way. A transdisciplinary approach seeks to illuminate and articulate the complexity of a phenomenon, situating it in time and space, providing us with a richer picture that makes the interconnections and interdependencies of open systems come alive.

I’ve spent quite a bit of time over the years outlining how the dominant understanding of creativity was the result of specific historical events and trends. A better understanding of these events and trends gives us a view into the way creativity itself was constructed, into the assumptions that were at the basis of a particular view of creativity. The historical perspective can be enormously illuminating. It can show us, for instance, why, in 2012, there was a sudden fascination with the fact that creativity, and more generally excellence, require hard work. A minimum number of hours, even. Looking back we can see that the notion of “genius without learning” became part of the romantic mythology of creativity, and created an image of the creative person as somebody who essentially didn’t require schooling or hard work, but simply somehow transmitted or “channeled” brilliance. Don’t let them see you sweat. It helps with marketing too, of course. Whose novel would we rather read? The writer who sweated over it for years, worked and reworked it, and threw out thousands of pages? Or the one who claims it came to him “in a dream” and he just wrote the outline down in the morning and finished it in a couple of weeks of feverish, frenzied writing? Surely not the one who has to “work at it.”

Despite a modicum of interest, and a dramatic shift in the understanding of creativity over the last 10 years or so, the main focus is still, as the animated reviewer above suggests, how can I become more creative without having to delve into theory or history, or develop a better understanding of the process and the larger phenomenon itself? “Just tell me what to do. Spare me the theory.” A little ironic, perhaps. Let’s just say that if you don’t own the theory, the theory owns you.

I’ve discussed this anti-intellectualism and anti-theoretical stance in a previous column (Transdisciplinarity as Play and Transformation, October 2012). One of my main concerns is that we now have an increasing number of people who dismiss theory, but who in the process also cannot make sense of it, not least because it’s not clear to them what theory is. And yet many theoretical works have spawned tremendously practical applications, not to mention revolutions and much more of a rather “practical nature.” Kurt Lewin of course made the classic statement that there’s nothing as useful as a good theory. But increasingly I’m finding that many students, consultants, and leaders across the country don’t know what to do with theory. They need somebody to translate it into something practical, preferably in 5 or 7 or 9 steps.

While it might be argued that people just don’t have the time to read theoretical works, we need to be cognizant of the fact that this also means they don’t understand the deeper matrix from which their practices and tools arise. They are condemning themselves to a superficial world where, not surprisingly, everybody is waiting for the next flavor of the month. With this anti-theoretical attitude, theories are either rejected out of hand or, interestingly, embraced uncritically. They’re never “metabolized,” never really seriously explored. Arguably they’re never given the chance to be thoroughly applied, and with a lack of theoretical understanding they may well not be applied in a relevant way. And so they’re discarded until the next Big Idea comes along.

There’s no question that the realm of theory can become an infinite universe of nitpicking, grandiose pronouncements, and wild abstraction. But from all that, and usually the less disembodied versions, emerge the foundations for the practices and tools most people use. And even more importantly, perhaps, we all have implicit theories about how the world works, how human beings relate, how systems change (or not), the nature of the self, the problems and potentials of diversity, what constitutes “good” or “bad” leadership and so on. In other words, there’s no escaping theory.

The reasons for this anti-intellectualism in the US are historical, and Richard Hofstadter’s Anti-intellectualism in American Life does a bang-up job reviewing its origins and implications. The only problem is, this Pulitzer Prize-winning book was first published in 1963. Even the Amazon reviews contain a debate about whether it’s “dated” or not. Of course, the book’s examples make no reference to William Bennett, Apple, the internet, or Miley Cyrus. But they’re still fascinating (if a little quaint), and the book as a whole has very little if anything that’s dated. The book’s focus is historical, after all, meaning not all of the examples have to be from 20 minutes ago, and readers can use their own intelligence to apply the concepts to examples in their own lives.

History shapes us. It makes us who we are. I’m not advocating for historical determinism here, but for an awareness that if we think and act and feel in certain ways rather than others, it’s because of our history. If large organizations have certain problems, certain default ways of doing things and ideas about how things should be, what is considered a “good” organization, and so on, it pays to understand the history. The same can be said for individuals.

Why do most bicycle seats look the way they do? And why do we take them for granted and pay little attention to them, even though they’re not exactly a paragon of comfort? Well, because that’s just the way they are. And what they are is essentially shrunken saddles: the saddle was the model used for the first bicycles, because that was the closest reference point in those days. This might be an interesting factoid for a particularly dreary party, but it also gives us an insight into how what we take as given has historical roots, meaning decisions made by somebody, somewhere, in some kind of context. And that, consequently, things can be different.
