Sunday, December 29, 2013
The British artist Simon Beck has created some memorable art in the snow at a ski resort in France, some of which can be seen here. The pictures are attractive to look at, but they must be even more exciting to see for real.
With every new snowfall, the creations will gradually change and then disappear altogether. The art is, of necessity, transient and thus has transience as an added element. I wonder whether these creations, and their transience, will not be even more appealing to those of a Japanese culture, which emphasizes transience (wabi) as a feature of beauty.
Perhaps some contemporary art gallery should buy good quality photographs of these creations, to exhibit permanently what is only transient.
Just a thought!
Tuesday, December 24, 2013
At a recent event to launch the exhibition on the Large Hadron Collider at the Science Museum in London, the great Stephen Hawking made what must seem to many an unusual declaration. He said, “Physics would be far more interesting if [the Higgs boson] had not been found”. Physicists would then have to re-think many of their fundamental ideas about particles and the forces that bind or repel them.
By saying so, Hawking was displaying both the qualities and perhaps the failings of scientists. Scientists, or at least the great ones like him, love the process of solving great and difficult problems. The solution may be quite marvelous and exciting to think about; it may even be very moving. But, once solved, it ceases to be a problem, which the enquiring mind needs.
So, what Hawking was saying, it seems to me, is that if the Higgs boson had not been found, the problem would have persisted and exercised and concentrated minds, which is what scientists like so much.
This, of course, is very far from the attitude of those who wish that a problem should never be solved, because they fear the results. Some have written of their fear of work on the neurobiology of love, because it will “de-mystify” it; others have written, of neuroesthetics, that they would find it unwelcome to learn what happens in their brains when they view a work of art or listen to music. Hawking wants to learn; they don’t. If Hawking would have preferred that the Higgs boson had not been found, it is simply that he relishes the process of discovery. He is not fearful of the results; they are.
Why, then, should this also be a failing? I think it is because lesser scientists (and let us not underestimate the degree to which scientific progress also depends upon lesser scientists) can easily be distracted from trying to solve great problems into solving relatively minor ones, precisely because they love the process of solving problems! I have seen it happen many times.
But there are of course many problems that remain in physics and astronomy. And Hawking is hoping that physicists will move on to solving even grander problems about the nature of our universe.
Hawking is not afraid of de-mystification. Not at all. The mark of a real intellect.
Monday, December 23, 2013
The quite wonderful exhibition, Shunga: sex and pleasure in Japanese art at the British Museum in London, carries with it a surprising paradox or contradiction, which no one has so far been able to explain to me adequately.
Japanese culture in general emphasizes the unstated and the understated, leaving much to the imagination. Yet Shunga art, which is basically erotic art, is the exact opposite. Here, almost nothing to do with the genitals is left to the imagination; instead they are given prominence, the size of the organs more often than not exaggerated beyond reasonable dimensions.
Yet, in spite of this prominence, most of the rest of the body is covered up in many, if not most, depictions of sexual encounters; in many it is the genitalia alone that are exposed. There is, of course, also something of the artificial in these works: couples make love with their clothes on; the hair is usually immaculately coiffed; in some, a lady is having her hair combed during intercourse, while in others spectators, including children, witness the scene.
Why would a culture that has traditionally emphasized the understated produce work that is anything but understated? Some Japanese friends have told me that Shunga is nothing but pornography. I do not believe it. In spite of the fact that they may have been used as stimulants or as props for sexual pleasure, these are works of art as well. It is the brilliant depiction of interiors, the wonderful colour combinations, and the immaculate detail with which clothes are represented that turn them into visually pleasurable works. Indeed, it may be said that the genitalia are in fact often a distraction from the rest of the work, especially from the depiction of the graceful women in the Shunga work of Kitagawa Utamaro. If “art is fantasy”, as a quote at the exhibition proclaims, then it is those graceful figures that invite the viewer into a world of fantasy, not the prominently exposed genitalia. A critic once wrote that the sexual figure in Botticelli’s Birth of Venus is not the naked lady but her richly dressed companion to the right, presumably meaning that it is the latter who draws the viewer into a world of fantasy. Maybe the great masters of Shunga art were trying to balance the explicitness of their images with depictions that allow a world of fantasy and imagination to come into play, all in one.
Shunga was apparently illegal in Japan for long periods, though tolerated throughout and popular at all levels of society. It is, I gather, still frowned upon in Japan. Indeed, I am told that, in modern-day Japan, adult movies in hotels often blur the genitalia – in striking contradiction to the Shunga art of earlier times. And there is the contradiction: explicit pornographic films that blur the genitalia on the one hand (perhaps in keeping with the understated in Japanese culture), and great art that is implicit in everything but the genitalia (quite unlike the understated character of Japanese culture).
Saturday, November 16, 2013
Francis Bacon claimed that he wanted to give “a visual shock”, and his paintings over the decades never seem to have departed from that aim. One of his first exhibitions, in New York, was described as a “chamber of horrors” and Margaret Thatcher, perhaps echoing the views of many outside the art world, once described him as “that man who paints those horrible pictures”. As I understand it, most people (even those who admire his painterly style) would prefer not to have his paintings hanging in their living rooms.
Last week, his three-panelled painting, entitled Three Studies of Lucian Freud, produced another shock – a financial one. It fetched a record price in New York, being sold for $142.4 million. What is it that attracted buyers to spend so much (the bidding started at $80 million)?
I believe that Bacon subverted the brain’s normal representation of faces and bodies, which is what turned his pictures into shocking displays. The brain, it seems, cannot easily adapt to departures from what constitutes a normal face; it cannot adapt easily to the disfigured faces and bodies that Bacon specialized in, as a means of making images of the violent reality which, according to him, was so prevalent in the world. Hence the enduring shock effect that he produced.
Most of the discussions I heard and articles I read on this sale revolved around the topic of money. It is not that buyers were only speculating. Rather, “deep-pocketed” buyers were also ready, it seems, to splash out considerable sums to buy paintings for their national museums or their homes. I am inclined to the view that when it comes to spending such vast sums, the long-term value is naturally important but cannot be the only or even the dominant factor. So what, beyond the prestige of Bacon, drove prices so high? How could paintings reviled through the use of phrases like “horrors” or “mutilated corpses” or “extremely repellent”, which so many (including one speaker on the radio last week) declared they would rather not see hanging in their living rooms, be so much sought after?
Perhaps we have a very deep-seated fascination with horror, especially when it is so evocatively depicted. Perhaps those who yearn to view such paintings are an infinitely more sophisticated and refined, indeed artistic, version of those who jam the roads on their way to see a crashed plane. There are, of course, huge artistic qualities to Bacon’s work: they are formally masterful works, with a quite spectacular, and often unusual, combination of colours. But the fact remains that they also depict mutilated and savaged faces and bodies, the viewing of which almost certainly stimulates strongly subcortical centres such as the amygdala and the insula, which seemingly respond to fear and horror. And let us not forget that Bacon said he was not appealing to the intellect: “I make paintings that the intellect cannot make”, he once said, which also implies that he was appealing to something more primitive in his work. In his quite wonderful book on Francis Bacon, Michael Peppiatt says that Bacon’s aim was to deliver a visual shock before things got spelled out in the brain (or words to that effect). Perhaps combining the aesthetically pleasing colours with the mutilation that he so consistently depicted makes the latter more palatable, and even pleasing. The more so if one knows that such a combination is a good place to park one’s money.
Sunday, October 6, 2013
One is always somewhat surprised when academics who, in the words of HL Mencken, are generally as “harmless as so many convicts in the death house”, turn to violence. In general, academics dislike violence and prefer to pursue their trade peacefully, although there are many examples of verbal violence. I know of an English university department at which speakers are reluctant to speak because of the extreme verbal violence of one of its members.
Yet it is surprising when this violence escalates to the level of arms. The BBC reports one such incident in which an argument about the German philosopher Immanuel Kant escalated to such levels that it ended with one protagonist firing rubber bullets at the other. What the bone of contention was is not recorded. It could have been the “synthetic a priori” or the “categorical imperative” or perhaps the “transcendental aesthetic”. At any rate, one of the protagonists was charged with causing grievous bodily harm.
Kant himself would probably have been very surprised. His book, the Critique of Pure Reason, apparently sold only five copies when first published, of which two were purchased by Kant himself (I cannot vouch for the accuracy of this story, which I read somewhere years ago). He was in general a very peaceful man, whose habits were so punctual that housewives apparently set their watches by when he went to work and when he returned. The French critic Rémy de Gourmont marveled that a man like Kant, who had neither wife nor mistress and who died a virgin (as Gourmont believed), could have written a book on the metaphysics of morals!
Yet, violence in academic circles has been recorded before (I mean real violence, not the verbal one, which is very common). There is, for example, the story of Pierre Marie, an eminent French neurologist, who accused another eminent French neurologist, Déjerine, of doing science as some play roulette. But, upon being challenged to a duel, Marie wisely chose to retract his accusation.
On one occasion, I was told not to mention 40Hz when giving a seminar if a certain gentleman was in the audience, for fear that he may suffer a heart attack. I wisely obeyed. But I am told that he later died of a heart attack anyway.
Perhaps it is only fear that keeps academics from resorting to real violence. I know of stories of one German physiologist saying of another, “Now that I have shown that he cannot use a slide rule, I intend to take no further notice of his work”, while another accused a colleague of “auto-plagiarizing”. I can well imagine such incidents boiling over and resulting in, well, the firing of rubber bullets, at least.
It all goes to show that the dispassionate academics, searching for truth in their ivory towers, may not be impervious to such human instincts, just like the rest of us.
Monday, July 29, 2013
This has been quite a Wagner week at the Proms in London, the first time that the complete Ring cycle was performed there, to celebrate the bicentenary of Wagner’s birth.
I am sure that many much more qualified than me will write about this historic occasion. All I need to say is that I enjoyed it tremendously, in spite of the oppressive heat in the hall. As one critic wrote somewhere, “it can’t get much better than this”.
My purpose here is really only to record one extraordinary moment where nothing happened…at the end of Götterdämmerung. Maestro Daniel Barenboim held his baton up for a good 18 seconds after the last note, and everyone held their breath, leaving the gigantic hall, filled to capacity, completely hushed. There was a great deal left to the imagination in those few moments, much longer imaginatively than the real time of 18 seconds suggests.
You can listen to the silence here at 1:21:13 onwards (for the next six days only).
I say nothing happened, but of course a great deal must have gone through the minds of the thousands attending the performance and the millions listening at home.
This was a perfect ending, for there was nothing left to say but much to think about silently in those few moments.
Indeed, one of the more remarkable features about this Ring cycle was the complete silence from the audience at those silent or subdued moments during the performance, a fact that Barenboim commented on, and thanked the audience for, in his speech at the end of the performance of Götterdämmerung.
I have written many times here about the value of the unstated in art and the silence in music. Yesterday, Daniel Barenboim demonstrated it to powerful effect.
Saturday, April 6, 2013
I describe briefly a new and hitherto undocumented phobia, which I shall name neurophobia; those who display it I shall call neurophobes. It is a somewhat new phobia, perhaps no more than 15 years old, but it shares characteristics with other phobias. It is to be distinguished from the neurophobia that medical students apparently suffer from when studying neurology.
Neurophobia can be defined as a profound dislike, with various degrees of severity, for cognitive neurobiology and especially for neuroesthetics and for what these disciplines promise to show us.
Neurophobes are a motley crowd and, as with so many other phobias, they include people from different backgrounds and walks of life – philosophers of different degrees of eminence, humanists, religionists and even (surprisingly) some neurobiologists. This is not to say that all philosophers and humanists are neurophobes, far from it; many are interested and excited by the discoveries that neurobiology and neuroesthetics have to offer, but neurophobes are more vocal. Nor are all religionists neurophobes: I have had some very interesting discussions with some religionists, who have shown themselves to be hospitable to new ideas. Interestingly, I have not encountered neurophobia among artists (yet), which again does not mean that there aren’t neurophobes among them. Hence, neurophobia, like other phobias, cannot be associated with any particular grouping, either socio-economic, cultural or otherwise.
Among the characteristics of neurophobia, one may list the following:
1. An irrational fear: neurophobes invest neuroesthetics in particular with imaginary powers, including weapons of mass destruction (WMD); for how else is one to interpret the statement that neuroesthetics “… will flatten all the complexity of culture, and the beauty of it as well”, and other similar statements?
2. A desire to find a place for the mind outside the brain, not perhaps realizing that cognitive neurobiology and neuroesthetics study neural mechanisms and hence the brain, and that their conclusions are to be seen in that context.
3. The use of emotionally charged and pejorative terms to dismiss neuroesthetics, terms such as “trash”, “buncombe”, “rubbish” and others like them, which have no place, or should have no place, in scholarly (and especially scientific) discourse. Hence neurophobia shares a similarity with other phobias in that it is not easy to rationalize it cognitively, an appeal to emotional and pejorative language being the only way out.
4. The pursuit of ignorance: As with so many other phobias, this amounts to the wish not to know. Hence, neurophobes don’t want any scientific ‘de-mystification’, which they would regard as a “desecration” (note again the emotive language) and prefer to live in ignorance. This is of course similar to other prejudices, where ignorance is the preferred course.
5. Arrogance of ignorance: neurophobes always assume that they know better, and hence lecture us on what they suppose we are not aware of. They never cease to tell us that art and beauty are not the same, as if we are not aware of that and have not written about it. They never cease to emphasize the importance of culture and learning in aesthetic appreciation, as if this is a new insight that we are not aware of.
6. Protection against the facts: this arrogance displays itself in their protecting themselves against the facts. As I have said before, once they relegate our discipline to the status of “trash”, they need not bother with it. And there is, in their writings, good evidence that they have not read what we have written.
7. Attack the methods: when all else fails, there is always recourse to attacking our methodology – principally the imaging techniques. They fault these for their spatial and temporal resolution (sometimes using emotive language) as if we are not aware of these shortcomings and do not take account of them in our interpretations. (I will have more to say about this in a future post.) I imagine most are scared of new technologies that will have greater powers of resolution.
These characteristics are very descriptive of neurophobia, and they are interlinked. Hence, if one detects one of them in an individual, one must suspect him/her of being a neurophobe who will display the other characteristics on gentle probing. Here I would advise caution; it is best to probe a little further before classifying someone as a neurophobe.
Of course, many of them preface their pejorative remarks with faint praise, such as “Neurology has made important advances” (rather like “some of my best friends are neurologists”).
And finally…what one neurophobe says or writes is remarkably similar to what another one says or writes, reminding me of the famous line of President Reagan, “There you go again”. Indeed, so similar are their articles that it becomes reminiscent of another one of Reagan’s famous lines (about redwood trees): “Once you’ve seen one, you’ve seen them all”.
Tuesday, April 2, 2013
In 1995, a Japanese team was awarded the Ig Nobel Prize in Psychology for their work describing how pigeons can be trained to discriminate between the paintings of Picasso and those of Monet. Previous work had shown that pigeons could distinguish between the music of Bach and Stravinsky.
Receiving the Ig Nobel Prize must be a mixed blessing, as its very title implies. Often the implication is that there is something trivial in the research reported and sometimes it is awarded for what many would regard as work that is not scientifically worthy, for example a report to the US Congress that nicotine is not addictive (awarded the Ig Nobel Prize in Medicine in 1996).
Others are frankly funny, such as the Ig Nobel Prize for Peace (2000), awarded to the British Royal Navy for a Monty Python-like command that its sailors should not use live cannon shells but instead shout “Bang”, or the one awarded in Biology (2004) for showing that herrings communicate by passing wind (farting).
In fact, many of these Ig Nobel prizes go to worthy and scientifically interesting work. The one about herrings communicating by farting turned out, apparently, to be strategically and financially important because the Swedish Navy, suspecting that Swedish waters were being infiltrated by Soviet submarines, instigated a widespread but futile hunt for those submarines. After many inconclusive years, it turned out that the noises were probably coming from farting herrings. Had this been known, it is claimed, the Swedes would have saved hundreds of millions of Swedish kronor.
Science is, or should be, fun. And even apparently simple science can be fun BECAUSE it leads to new and interesting clues. The work for which the Japanese scientists got the Ig Nobel prize in 1995 really showed that pigeons, which have a well-developed visual apparatus, could distinguish between the paintings of Picasso and those of Monet because they formed a concept of these paintings. They did not apparently distinguish them because of the presence of sharp edges in the cubist paintings or colour in those of Monet. Hence, in addition to a well-developed visual apparatus, they have brains that are sophisticated enough (if that is the right word) to develop visual concepts about visual stimuli unrelated to their daily lives.
Concept formation, critical for the acquisition of knowledge, is a fascinating subject, but how the brain forms concepts is not known in any detail. That pigeons should be able to form concepts around works designed by humans for consumption by humans, works which have little to do with their world, perhaps has the germs of an insight into how more complex brains form concepts. It would, in fact, be just as interesting to learn how humans form concepts around different schools of paintings.
If the Ig Nobel prize brings such interesting science to wider attention, then it is pursuing a worthy cause.
Saturday, March 23, 2013
There are some who fear neuroesthetics because they fear that it may ‘de-mystify’ what they prefer to remain mysterious. Knowledge about brain mechanisms that may be involved in the experience of beauty or of love and desire would deprive them, so they believe, of the full enjoyment of those experiences. I gather that a prominent professor has said that he regards it as ‘unwelcome’ to learn what happens in his brain when he is experiencing beauty. Presumably, if he were sitting on some research council, he would use his influence to suspend research in these areas. So, it is a relief that those who hate neuroesthetics and fear it are not in a position to halt research in the subject, at least not at present. There was a time when they could have and, in some areas of research, came close to doing so. Galileo was investigated by the Inquisition and ordered to stay silent, which he did, sort of, for a while. In the Soviet Union, a law was passed forbidding dissent from Lysenko’s anti-Mendelian views, which resulted in many losing their jobs and even being imprisoned. The law was rescinded in the 1960s.
I have no complaints against those who do not want, through knowledge, to de-mystify things which they hope will remain mysterious. That is their view, and I respect it, sort of. But it has to be noted that these are not people who are avid to learn more. It is not that they are simply uninterested in certain things; rather, they are vocal in trying to discourage the rest of us from trying to learn more about important subjects – for I take it that the experience of love, beauty and desire are important and interesting subjects. In this sense, then, their intellect is somewhat limited. Though perfectly entitled to their views, these are not the sort of people whom I would like to have sitting on research councils.
In other ways, their attitude seems strange. Science has been de-mystifying things for millennia but I am not at all sure that the world has been rendered any less marvelous because of it. One could say that landing humans on the moon and bringing them back safely to earth was a step in de-mystifying the heavenly bodies, but it has not rendered the moon any less glorious; one could say that compressing all the secrets of life into two strands of DNA de-mystifies life, but it has made it all the more wondrous to me; one could also say that the role of neurotransmitters in regulating sexual behaviour (and hence determining, at least in the world of rodents, the extent of promiscuity) de-mystifies morality or immorality, at least in the world of rodents, but to me it raises a host of interesting questions about how behaviour is regulated, even when it threatens to invade the world of morality.
Perhaps much the more interesting question is a neurobiological one: why do some people (and there are many of them) prefer mystery to knowledge? What advantage does it bring them and what does it satisfy in them? If one of the functions of the brain is to acquire knowledge, what mechanism is it that suppresses the desire to acquire knowledge in such interesting spheres, when the knowledge does not harm anyone? What disadvantage would such knowledge bring them?
The answers to such questions, too, might de-mystify things and those hostile to learning more might want to discourage research councils from funding research in these areas as well. But they remain, nevertheless, interesting questions and so I hope that those who want to dictate what kind of knowledge should be pursued and what avoided are never given a seat in the councils that make decisions about funding research.
Sunday, March 10, 2013
The Light Show at the Hayward Gallery, London, is a delight and, quite rightly, oversubscribed. The number entering at any one time is strictly controlled, allowing viewers the space to appreciate the exhibits – quite unlike the disgraceful “cram them in” policy at the Leonardo exhibition at the National Gallery last year. Some of the exhibits, like the Chromosaturation of Carlos Cruz-Diez, the Model for a Timeless Garden of Olafur Eliasson or Conrad Shawcross’s Slow Arc Inside a Cube IV (a bit of an unnecessary mouthful, this one), are ones to enjoy sensorially and to reflect on as much as one would any work of art.
The weakness of the Hayward exhibition is that it pretends to combine science with art, or rather to give a scientific explanation of the artistic exhibits, when it should really be seen as an art show and a delight to the senses; if explanations are to be appended to the exhibits, they should at least be scientifically valid. As it is, the show was somewhat spoiled for me by the explanations appended. At the entrance, the viewer reads that “Vision is the least reliable of the senses”. What is the basis for this? Many, probably most, neurobiologists would argue exactly the opposite: vision is the most reliable of the senses, a fact perhaps reflected in how much of our brain is devoted to it.
We are then told that “What we see, or think we see, is not always how things are”. This is a profound misunderstanding of the workings of the brain – for what we see and experience is dictated by the organization of our brains, and is precisely how things are in perceptual reality, however that reality may depart from the “objective” reality. That is why, at my own exhibition at the Pecci Museum of Contemporary Art in Milan (Bianco su bianco: oltre Malevich), the visitor was welcomed with the following statement: “The only reality we experience is brain reality”.
When one looks at the Hering Illusion, the two straight lines, which are parallel, appear perceptually to be somewhat curved. The perceptual reality dominates even when one knows that the two lines are straight and strictly parallel. Or consider the rapid motion in the rings in Isia Leviant’s Enigma; to those who see the movement, there is no doubting its reality, even if there is no actual movement in the rings.
It never ceases to surprise me that we downgrade our true perceptual reality in favour of the “objective reality”; the former is dismissed as mere appearance, while the latter is taken to be always true. This gives the reality we experience a subservient place when in fact the only truths that we are able to experience are brain truths.
I am not saying anything particularly new here. Immanuel Kant said it long ago: our knowledge of this world is a compound of the objective reality and the operations of the mind; we can therefore never know the thing as it is (das Ding an sich) because our only knowledge of the world is through the operations of the mind (brain). In discussing the philosophical importance of colour vision, Arthur Schopenhauer wrote of its importance for understanding the “Kantian doctrine of the likewise subjective, intellectual forms of all knowledge” – in other words, that all knowledge is mediated through the operations of the brain.
This exhibition pretends to explain the visual sensory process through art. Thus, the exciting Chromosaturation of Carlos Cruz-Diez has appended to it the following: “since the retina perceives a wide range of colours simultaneously, experiencing these monochromatic situations causes visual disturbances”.
Almost everything in that statement is incorrect. There are no monochromatic lights in the exhibit (all the lights are broadband, although in some there may be a dominance of one waveband over the others), the retina does not “perceive” colours, and there is no “visual disturbance” but only visual sensory excitement, leaving one wondering where the supposedly induced “misty” environment comes from. The exhibit would have been better without these incorrect explanations. Why not call it an unusual visual experience instead?
Perhaps artists do not read about advances in science – why should they, after all? Perhaps we do not explain our findings properly. Whatever the real reasons, here is a good example of artists and curators trying to explain perceptual processes through artistic achievements and doing so very badly and, worse, inaccurately. It is exactly the reverse of what neuroesthetics has been falsely accused of doing, namely explaining works of art through neuroscience, even though that is not its aim (see this post and this post).
Hence, my advice is – go to this delightful exhibition and enjoy the exhibits as creative works of art. Many might want to do more than that; they might wonder what these exhibits tell us about the brain’s perceptual mechanism. But, please ignore the explanations appended to the exhibits – they say nothing about the visual process, or about the sensory brain or about perception, which is not to say that viewing these works does not raise questions about sensory processes.
Here, then, is an exhibition which inspires thinking about the operations of the brain. It is not what it pretends to be, namely an explanation of overall sensory processes. It is a good illustration of how works of art can inspire neuroesthetic studies.
Sunday, March 3, 2013
A jury was unable to reach a verdict at a recent high-profile trial in which the wife of a disgraced ex-politician was accused of the obstruction of justice. The jury came in for much ridicule for the questions they asked of the judge while deliberating. The lawyer for the prosecution questioned whether the case could continue:
“I don’t ever recollect getting to this stage in any trial – even for more complicated trials than this – and after two days of retirement a list of questions of this very basic kind illustrating at least some jurors don’t appear to have grasped it,” he said.
I myself do not share the view that these were all silly or irrelevant questions, although one was somewhat funny and got a funny answer in return:
[Jury]: Can you define what is reasonable doubt?
[Judge]: A reasonable doubt is a doubt which is reasonable. These are ordinary English words that the law doesn’t allow me to help you with beyond reasonable written directions.
But my main interest is the jurors’ question that captured the headline in at least one daily newspaper:
[Jury]: Can a juror come to a verdict based on a reason that was not presented in court and has no facts or evidence to support it either from the prosecution or the defence?
[Judge]: The answer to that question is firmly no. That is because it would be completely contrary to the directions I have given.
But in assessing a situation we often rely on evidence that is not “factual” in the literal sense but may be factual in that it speaks to our faculties of judgment. It is absurd to believe that we do not frequently come to doubt whether someone is telling the truth simply by studying their body language, or the hesitation in their voice, or because, when we gaze into their eyes, there is something that jars with the ‘factual’ story being told.
I myself have been a juror on two occasions and can testify that such signals, though they are not facts presented in court and do not constitute evidence presented by the prosecution or the defence, nevertheless play an important role in reaching a decision. I believe that, in addition to the evidence presented in court, my co-jurors used the same or similar signals in reaching our common verdict.
I have also asked judges whether, when presiding over a case, they use visual cues which are not facts presented in court to reach a judgment as to whether the defendant is innocent or guilty. They have always answered that such cues play an important role. Of course, in most such cases the judge can leave the final verdict to the jury, but there have been cases where the judge has disagreed with the jury. Somerset Maugham even wrote a very interesting short story about the consequences of such a divergence of views, in which the judge in a case meets the (acquitted) accused years later at a dinner party (I read it years ago and cannot now recall its name).
The final verdict must depend significantly upon whether a witness or the accused is telling the truth or lying and, in judging that, many factors besides the evidence presented in court come into play – in the form of signals that the brain receives and interprets but which do not constitute part of the body of evidence presented in court.
My point is that the brain is very good at picking up signals that do not constitute ‘evidence’ in the legal sense but are nevertheless vital in reaching judgment.
Nor am I saying something new. Everyone knows that we make judgments based on objectively non-factual but subjectively critical evidence, nearly every day.
Hence, to ridicule the jurors’ question given above is silly. It is indeed as silly as the statement attributed to Picasso, which I alluded to in a previous post, that “when we love a woman we don’t start by measuring her limbs.” The truth is of course otherwise; we actually start by making very detailed, often nearly instantaneous but perhaps partially unconscious, measurements of a great deal before we fall in love.
Posted by S.Z. at 12:35 PM
Thursday, January 10, 2013
Labeling something often suggests a haste to catalogue it and be done with it. It also implies some level of understanding of that which is labeled. But labels, especially pejorative ones, also commonly help to insulate one from the need to enquire further. Why would anyone who has labeled something as “trash”, for example, be bothered to read or learn anything further about it?
Every now and then, someone who is seemingly exasperated by the profusion of neurobiological facts describing a localization of some function or other in the brain, labels the whole enterprise as nothing more than the manifestation of the “new phrenology”. Nor does such labeling come only from those outside the field; sometimes the same dismissive label is used by neurobiologists themselves.
Essentially, (the old) phrenology supposed that mental faculties are localized in the brain and that an especially well developed mental faculty would result in a corresponding bump in the skull. By measurement of the skull and its bumps one would therefore be able to infer something about character, moral qualities and personality. Its originator was Franz Josef Gall, who took refuge in France after his ideas met with disapproval in Austria, and who was shocked when the Institut de France, at the instigation of Napoleon, did not elect him to membership.
There were some good reasons for dismissing phrenology, and especially the use that was later made of it to promote racist ideas. But there is nothing wrong with its implicit assumption that the brain is the seat of the mind.
Those who label the tendency of modern neurobiological research to find that special cortical areas are associated with distinct functions as nothing more than a manifestation of the “new phrenology” do both the subject and themselves a disservice. That distinct cortical areas are associated with distinct functions does not mean that they can act in isolation; indeed, all cortical areas have multiple inputs and outputs, both to other cortical zones and to sub-cortical stations, and the healthy activity of an entire system is critical for a specialized area to execute its functions. It is trite to suppose, as some (non-scientists) have, that an area that is specialized for a special function, for example colour vision, can be isolated from the rest of the cortex or the brain and still mediate the experience of colour. No biologist has ever made such a claim, and those outside biology who make it know nothing of biology or the brain.
It is equally untrue that the whole of the brain is involved in all its functions, as was believed in the 19th century. No one could possibly deny that there is an area of the brain that is specialized for vision or for some attributes of vision, such as visual motion; nor can anyone deny that there are areas of the brain that are specialized for audition. Nor would any reasonable person want to deny that lesions in these different zones of the cerebral cortex have different consequences.
More recently, with the advent of brain imaging studies, neurobiologists have shown that even the experience of subjective mental states does not mobilize the entire brain with equal intensity. Rather the results of such studies commonly show that a set of areas is especially involved in some subjective state or another. But activity in the areas comprising that set does not necessarily correlate only with one subjective state. An area of that set may do “double” or “multiple” duty and be active during the experience of several subjective states, even contradictory ones. But one nevertheless commonly finds that the set of areas especially active during some experiences is different from the set of areas active in another, or in other, subjective experiences, even if they share common areas.
This, of course, is a far cry from the claims of those who, usually anxious to stigmatize the findings of neurobiology, write of neurobiologists as having discovered the “love spot” or the “beauty spot” in the brain, or who dismiss them as nothing more than “modern phrenologists”.
The “unity” of mind
A fertile terrain for questioning the localizationist claim – that cortical areas with characteristic histologies and specific sets of inputs and outputs can be associated with special functions – lies in the so-called “unity of mind” which makes us act holistically.
But let those who ridicule the efforts of neurobiologists consider what has been the greatest success of cortical studies on the one hand and what has been its greatest failure on the other.
The greatest success – which almost links the history of cortical neurobiology in one unbroken thread – is the association of special functions with distinct cortical areas. This theme has run through cortical studies since the day in April 1862 when Broca announced that the third left frontal convolution is critical for the production of articulate language.
The greatest failure has been its inability to account for how these specialized areas “interact to provide the integration evident in thought and behavior” as the American neuropsychologist Karl Lashley put it in the 1930s. He also added, however, that just because the mind is a unit, it does not follow that the brain is a unit.
Those who dismiss all these “localizationist” studies as nothing more than a “modern phrenology” may want to ask why neurobiology has failed so miserably just when it might have been expected to succeed spectacularly in light of its findings.
Perhaps a good first step in this enquiry would be to stand back – even if momentarily – and ask whether the mind is an integrated unit after all. The answer may come as a surprise.
Thursday, January 3, 2013
In his last speech to the House of Lords as Archbishop of Canterbury, Rowan Williams lamented society’s attitude towards older people. He said: "It is assumptions about the basically passive character of the older population that foster attitudes of contempt and exasperation, and ultimately create a climate in which abuse occurs" and referred to estimates that a quarter of the older population is abused one way or another.
This comes against a background of ghastly stories of the mis-treatment of older people by their nurses in old people’s homes, often verging on outright cruelty – stories that are repeated annually throughout the country, and probably mirrored by similar stories in many other countries as well.
I believe that the Archbishop showed wisdom and compassion in choosing the theme for his last speech and in speaking up for older people, but he did not go far enough in his analysis.
I have long wondered whether we are not biologically programmed to dislike and even hate older people for being older, just as we seem to be biologically programmed to love vulnerable and defenseless young children just because they are young. The latter merit our attention and care, while the former invite our avoidance and, where occasion permits, our cruelty and mis-treatment.
I have no scientific evidence for this belief, though there might be such evidence somewhere. But if my analysis is correct, or turns out to be correct, then it is not that we have “assumptions about the basically passive character” of older people that leads to their mis-treatment, as the Archbishop believes, but something biological and therefore much more difficult to control.
Of course, the hatred is probably more easily directed against those older people who are not members of the family, or at least the immediate family. But even in that context, older people are not immune. In the Prologue to his autobiography, Bertrand Russell wrote that one of the things that had made him suffer was the sight of “helpless old people a hated burden to their sons”.
If we are biologically programmed to dislike older people at best and hate them at worst, especially when they are not members of our family, then it is right, as the Archbishop suggested, that they should be given some kind of state protection, for example by appointing a national Older People’s Commissioner.
Society does, after all, police other biological urges that are difficult to control. It is perhaps time to introduce severe punishment for those who heap so much misery on the helpless in our society.
But that of course leaves another aspect which society simply cannot control. The dislike of old people, and their avoidance, are no doubt the source of much misery and alienation for them, and I just don’t know how society can combat that. We cannot, after all, legislate against dislike, though we should be able to legislate against its consequences.