Friday, July 18, 2014

"Nature" and the retracted STAP cell paper



After their publication in January this year, to much fanfare and international acclaim, the two STAP cell papers have been retracted because, it seems, they contained serious flaws.

In an editorial, Nature has absolved itself of all responsibility for the flawed papers, claiming that neither its referees nor its editorial team could have spotted the apparently serious flaws in them, flaws which led to the papers’ rapid demise.

Nature is in fact quite correct. It is not the function of editors or journals to look for manipulated images or plagiarism. I have no doubt that the very great majority of referees would notify the editors at once if they detected such flaws. There is, or ought to be, a certain element of trust between authors, journals and their editors. Moreover, as I understand it, Nature and its referees did not give these papers an easy ride. It took several months before the papers were published, implying that the referees had asked for substantial modifications to the manuscript.

Thus, Nature could be said to come out of it smelling like roses.

Yes, but not quite so fast.

Nature should take a leaf from one of its sister publications, Frontiers in Human Neuroscience, in which the Nature Publishing Group in fact holds a stake.

After a paper is accepted in Frontiers (but not before, and not if it is rejected), the names of the referees are published on the front page of the article. Publication in Frontiers is also not an easy ride, but at least the authors are allowed to enter into dialogue with the referees to put right or respond to criticisms, something that few journals allow, to the disadvantage of authors. The referees remain anonymous throughout this process, and only if a paper is accepted for publication are their names published.

Hence, if a paper is of extraordinary significance, some of the glory is reflected onto the referees and of course onto the journal. Just imagine: if the Crick-Watson DNA paper had carried the names of its referees, they would no doubt have wanted to share in the glory, at least to some minor extent. Indeed, Nature itself periodically reminds its readers that the DNA article was published in its pages, thus basking in the reflected glory.

Since all reasonable people understand that referees and editors cannot be held accountable for things like manipulated images or plagiarism in a paper, publication of their names in an accepted paper would do no harm, if the published paper turns out to have serious flaws.

If, on the other hand, the paper turns out to be some extraordinary contribution, then they can at least feel pride in helping to bring it to fruition and bask in its glory.

It is a classic case of “heads I win, tails you lose”.

Why not try it?

Monday, June 23, 2014

Why impact factors are here to stay


I had intended to follow up my earlier post on impact factors (in defense of Nature and Science) with this one some weeks ago but could not get to it until now. This is fortunate for, in the meantime, my colleague David Colquhoun has written an excellent post about impact factors and the like, with which I agree completely. In fact, David uses facts and figures to give teeth to what we all know – that impact factors are deeply flawed and that no self-respecting academic institution or individual should take the slightest notice of them when judging an individual for a position or for a grant application. Above all, papers should be judged by their content – which of course implies reading them, not always an easy thing to do, it seems. A particularly important point that David makes is that the importance and influence of a paper may emerge years after its publication. This is why I regard the reason given by the open access journal eLife for declining my paper – that “we only want to publish the most influential papers” – as among the silliest, and indeed the most stupid, I have ever received from a journal; a journal, moreover, which is viewed by some as a rival to the so-called “glamour” journals. I suspect that the editors of Nature and Science have had much too much experience to write anything quite so silly.

It is now generally acknowledged in the academic community that there is something unsavoury about impact and allied factors. Indeed, there is a San Francisco declaration (DORA) which is a sort of code of honourable conduct for the academic community. I have signed the DORA declaration – even though I detected a few cynical signatures there – and applaud its aims. Given the large number of academics and those from allied trades who have also done so, it is obvious that we all know (or most of us do) that impact factors and the allied statistics regularly collected to classify scientists and their standing are dangerous because so flawed. Yet in spite of this knowledge, impact factors continue to be used and abused on an unprecedented scale. Therefore the more interesting question, it seems to me, is why this should be so. That, I think, is the question that the academic community must address.

Here to stay because they serve deep-seated human needs
I believe that, whatever their flaws, impact factors are here to stay because they serve at least two needs, as I stated in my last post with respect to editors. They boost editors’ self-esteem and allow them, in the endless and often silent competition that goes on between journals, to declare to others, “we are better than you”. Hence Nature sent out emails some time ago stating that “To celebrate our impact factor…” it was doing such and such. The editors were the producers and consumers of their own propaganda, declaring in the same breath to themselves and to others that they are better than the competition.

What applies to editors applies equally to individuals. When a scientist declares “I have published 20 papers in Nature and 15 in Science”, s/he is also saying “I am a better scientist [than those of you who cannot publish in these high impact journals]”, without actually having to spell it out in words, perhaps even without thinking about it. In this way, they give notice to their colleagues, both superior and inferior, that they are better. They too are producers and consumers of their own propaganda, since it elevates their self-esteem on the one hand and the esteem accorded them on the other. In other words, impact factors serve the same purposes for both editors and contributors. The desire for esteem is deeply ingrained, and any means of obtaining it – of which impact factors are one – is bound to enjoy great success. Impact factors, in brief, are not going to be easily brushed aside by pointing out their flaws, when they cater to much deeper needs.

Two examples
This was dramatically illustrated for me during a lecture I attended fairly recently, given not by an aspirant to a job but by a senior professor. Slide after slide had emblazoned on it Nature or Science in conspicuous letters at least four times the size of the title of the article displayed, the face of the professor growing ever more smug with each additional slide. The irony is that I remember nothing of his lecture, save this and the title of the lecture. And that, of course, was part of the intended effect. For I now remember that the eminent professor publishes in “glamour” journals. He must therefore be good, which is what I assume he was trying to tell us.

Another example: Some four years ago, an advertisement for a research fellow from a very distinguished institution stated that “the successful candidate will have a proven ability to publish in high impact journals”. Nothing about attracting research grants or, better still, attacking important problems. Presumably, if someone publishes in glamour journals, s/he is worthy of consideration because they must be good.

When I made my disapproval of the wording known to the Director of the institution (whom I knew vaguely), he looked at me as if I had come from another planet, or perhaps as if I were someone who had seen better days and not kept up with the world. What he said in defense of his preposterous ad confirmed this. I was shocked.

Actually, I should not have been, because he was right and I was wrong.

Centuries of impact factors
There is in essence nothing new in impact factors, although the impact factor itself is a relatively recent invention. In one guise or another, such markers of status have actually been with us for centuries. Does such “in your face” advertising differ radically from putting some grand title after one’s name? Fellows of the Royal Society have for centuries put FRS after their names, to indicate to all their somewhat elevated status. In France, those who hold a Légion d’Honneur do it more conspicuously, “in your face”, by permanently wearing a badge on their lapel to announce their higher status. In Britain, at official ceremonies, guests are often asked to wear their decorations – to signify their status to others. Impact factors, as used by scientists, are just another means of declaring status. The difference between these displays of superiority and impact factors is simply a difference in the means of delivery, nothing else.

It all really boils down to the same thing. So, if impact factors were to be banished by some edict from the scientific nomenklatura, their place would be taken by some other factor. And what replaces them may in fact be worse.

Impact factors as short-cuts
Impact factors appeal to another feature of the human mind, namely the tendency to acquire knowledge through short-cuts. Where we do not understand easily, or cannot be bothered, or are sitting on a committee that is judging dozens of applicants for a top position, impact factors become a welcome aid.

How does saying that “s/he has published in high impact factor journals” differ from a common phraseology used by many (not only trade book authors), along the lines of “Dr. X, the Nobel Prize-winning physicist, has affirmed that the war in Mont Blanc is totally unjustified from a demographic point of view”, when there is not the slightest guarantee that a Nobel Prize in physics qualifies its recipient to give informed views on such wars? Once again, the difference between this and impact factors is simply a difference in the means of delivery. Like impact factors, this short-cuts the thinking process and, since similar phraseologies are used so often, we must assume that most of us, when handling issues outside our competence, welcome the comfort of a label that apparently guarantees superior knowledge or ability and hence soothes our ignorance.

There was a hilarious correspondence years ago (so my memory of it is somewhat blurred) in, I think, The Times, about the nuclear deterrent. Someone, anxious to convince readers of the wisdom of maintaining our nuclear deterrent, wrote that he knew of two Nobel laureates who had come to the conclusion that we should keep our nuclear deterrent. In response, Peter Medawar (the Nobel laureate, if I may say so) wrote back: “If we are going to conduct this dialogue by the beating of gongs, let me say that I know of 5 Nobel laureates who think that we should not keep our nuclear deterrent”.

A recent paper states that “… publication in journals with high impact factors can be associated with improved job opportunities, grant success, peer recognition, and honorific rewards, despite widespread acknowledgment that impact factor is a flawed measure of scientific quality and importance”. It is obvious why this is so; the habit of short-cutting the thinking process, common to most, is wonderfully aided by impact factors. I know of only one outstanding scientist (you guessed it, he was a Nobel laureate) for whom impact factors and all other short-cuts made not the slightest difference in assessing the suitability of a person; he searched for the evidence by reading the papers; that was his sole guide and authority. He was, of course, a loner; almost all of us are vulnerable to being impressed.

It is totally fatuous to assume that, sitting on a committee that is trying to appoint a new research fellow, I would not – like countless others, and absent the process of assessing all the evidence – be impressed by someone who has published 10 papers in Nature and 20 in Science, and would not prefer him over someone who has published entirely in those “specialized” journals to which the editors of glamour journals so cynically direct those whose papers they reject without review.

Hence, for editors, administrators, ordinary and extraordinary scientists alike, impact factors serve deep-seated human needs – to classify people easily and painlessly on the one hand, to declare to others one’s superiority, and to soothe oneself with the belief that one is actually better. There is no way in which all the flaws attached to impact factors could overcome these more basic needs.

One can point to other similarities, where what is considered to be flawed nevertheless is durable and highly successful. Take, for example, posh addresses, say in Belgrave Square in London or Avenue Foch in Paris, or Park Avenue in New York. There is absolutely no guarantee that those inhabiting such addresses are in any way better than those who do not, although they may be (and often are) far richer. This is not unlike those who advertise incessantly that they have published in the glamour journals. There is no evidence whatever that those who publish in them are better scientists. But giving an address such as Science on a paper is like giving an address in Belgravia. It works like magic. Once again, the difference lies only in the means of delivery.

So, let us sink back and get used to it…impact factors are here to stay, and for a very long time. No good blaming editors of journals for it, no good blaming scientists, no good blaming administrators.  It all has to do with the workings and habits of the mind, as well as with its needs.

The only way to deal with the crushing and highly undesirable influence of impact factors in science is to prohibit their use in judging candidates, research grants and indeed the merit of individuals. That would be a great help, and some research organizations are actually implementing such procedures, with what success I cannot guess. But I fear that even that will not be enough. To get rid of impact factors completely, one must change human nature. And that is not currently in our gift.

Sunday, April 27, 2014

Impact factors...in defence of "Nature" and "Science"


More often than not, when the corrupting influence of impact factors on science is discussed, fingers are pointed at Nature and Science, as if these two scientific journals invented impact factors and as if they are the main culprits in debasing science. This is not even remotely true. Rather, the finger should be pointed at the academic community, and at no one else. In fact, there are even limits to how much the academic community can be blamed, as I argue in my next post.

Nature and Science, and especially the former, are the best known and most sought after of what has come to be known as the “glamour” journals in science. There are, of course, other “glamour journals”, as well as ones that aspire to that status, but none has reached quite the same status as these two. It is therefore perhaps not surprising that they should bear the brunt of the blame.

But it would be hard for even the enemies of these two journals not to acknowledge that both have done a remarkably good job in selecting outstanding scientific papers over so many years. Journals do not gain prestige by impact factors alone; if they did, their prestige would fall, and their impact factors along with it. I myself have little doubt that the editors of both journals are hard-working, conscientious people, striving to do the best by themselves, by their journal and by science. One way of measuring their success is through impact factors, which are a guide to how often papers in a journal are cited. Impact factors are blind to the quality, readability or importance of a paper. They are simply one measure – among others – of how well a journal is doing and how wide an audience it is reaching. One could equally use other measures, for example advertising revenue or some kind of Altmetric rating. Impact factors just happen to be one of those measures. And let us face it, no editor of any journal would be satisfied with low impact factors for their journal; if s/he says otherwise, s/he lies. The unending search for better impact factors really boils down to human behaviour – the desire to convince oneself and others that one is doing a good job, to esteem oneself and gain the esteem of others. Editors of journals are no different. Like the rest of us, they aspire to succeed and to be seen – by their employers, their colleagues and the world at large – to have succeeded. Is it any wonder that they celebrate their impact factors?

To the editors of these journals – and to the rest of us - impact factors are therefore a guide to how successful they have been. I see nothing wrong with that, and find it hard to blame them for competing against each other to attain the best impact factor status. In other words, there is nothing really wrong with impact factors, except the uses to which they are put, and they are put to undesirable uses by the academic community, not by the journals.

In spite of the sterling service both have done to science by publishing so many good papers, it is also true that they have published some pretty bad ones. In fact, of the ten or so worst papers I have read in my subject, one (in my judgment the worst) was published in Nature Neuroscience, one in Nature, and one in Science. I have, as well, read many mediocre papers in these journals, as in others aspiring to the same status, such as the Proceedings of the National Academy of Sciences and Current Biology. This is not surprising; the choice of papers is a matter of judgment, and the judgment made by these journals is actually made by humans; they are bound to get it wrong sometimes, and apparently do so often. By Nature’s own admission in an editorial some time ago, there are also “gems” in its pages which do not get much notice. Hence, one finds in these journals not only some bad or mediocre papers but unnoticed good ones as well. Retraction rates in both journals are not much better or worse than in other journals, although retraction rates do apparently correlate with impact factors: the higher the impact factor, the more frequent the retractions. But it would of course be entirely wrong to blame the journals themselves, or their referees, if they publish papers which subsequently have to be retracted. The blame for that must lie with the authors.

“Send it to a specialized journal” (euphemism for “Your paper won’t help our impact factor”)
I recently had an interesting experience of how these journals can also be wrong in their judgment, at least their judgment of the general interest in a scientific work (of course, the greater the general interest, the higher their impact factor is likely to be). We sent our paper on “The experience of mathematical beauty and its neural correlates” first to Nature, which rejected it without review, stating that “These editorial judgements are based on such considerations as the degree of advance provided, the breadth of potential interest to researchers and timeliness” (somewhere in that sentence, probably at “breadth of potential interest”, they are implicitly saying that our paper does not have the breadth of potential interest – in other words, that it will not do much to improve their impact factor). We then sent it to Science, which again returned it without sending it out for review, saying that “we feel that the scope and focus of your paper make it more appropriate for a more specialized journal” (impact factors playing a role again here, at least implicitly, because specialized articles will appeal to a minority and will not enhance the impact factor of a journal, since they are likely to be cited less often, and then only by a minority).

Finally, going several steps down a very steep ladder, we sent it to Current Biology, which also returned it without sending it out to referees for in-depth review, writing that “…our feeling is that the work you describe would be better suited to a rather more specialized journal than Current Biology” (my translation- it will do nothing for our impact factor since only a limited number of workers are likely to read and cite it).

The paper was finally published in Frontiers in Human Neuroscience (after a very rigorous review). Given that this paper has, as of this writing, been viewed over 71,000 times in just over 2.5 months, and that it has been viewed even in war-torn countries (Syria, Libya, Ethiopia, Iraq, Kashmir, Crimea, Ukraine), it would seem that our article was of very substantial interest to a very wide range of people all over the world; very few papers in neuroscience, and I daresay in science generally, achieve the status of being viewed so many times over such a brief period. On this count, then, I cannot say that the judgment that the paper should be sent to a specialized journal, or that its breadth of interest was potentially limited, inspires much confidence.


We only want to publish the most influential papers
It is of course a bit rich for these journals to pretend that they are not specialized. I doubt that any biologist reading the biological papers in Nature or Science would comprehend more than one paper in any issue, and that is being generous. In fact, very often what makes their papers comprehensible are the news and views sections in the same issue, a practice that some other journals are taking up, though somewhat more timidly. By any standard, Nature and Science, and all the other journals that pretend otherwise, are in fact highly specialized journals.

Be that as it may, they are only pursuing a policy that many other journals also pursue. Consider this letter from eLife, a recently instituted open access journal, which I have seen written about as if it were a welcome answer to Nature.

Well, they returned a (different) paper I sent within 24 hours, after an internal review, saying that “The consensus of the editors is that your submission should not be considered for in-depth peer review”, adding prissily “This is not meant as a criticism of the quality of the data or the rigor of the science, but merely reflects our desire to publish only the most influential research”, apparently without realizing that research can be judged influential only retrospectively, sometimes years after it has been published. But what does “influential” research amount to? Research that is cited many times, thereby boosting – you guessed it – the impact factor of the journal. Indeed, eLife (which has also published some interesting articles) even has a section in its regular email alerts intended for the media – which of course helps publicize a paper and boost – you guessed correctly again – the impact factor!

So why single out Nature and Science, when so many journals are also pursuing impact factors with such zeal? It is just that Nature and Science are better at it. And their higher impact factors really mean that the papers they select for publication are being cited more often than those selected by other journals with aspirations to scientific glamour.

So, instead of pointing fingers at them, let us direct the enquiry at ourselves, while acknowledging that both journals, in spite of all their blemishes and warts, have done a fairly good job for science in general. 

In my next post, I will discuss why impact factors - however repellent the uses to which they are put by us - are here to stay.

Monday, January 27, 2014

Art and science meet up, sort of...

Some time ago, I wrote about an empty canvas by Bob Law, entitled Nothing to be Afraid Of, which was to be auctioned for an estimated £60,000. Law was described by the head of the contemporary art department at the auction house as the "most underestimated and overlooked minimalist artist in Britain...who didn't get the recognition that he deserved". In his painting he had apparently "... applied the seductive idea of nothing to a canvas, and asks the viewer to reflect”.

A somewhat puzzled David Hockney was reported as saying "It seems to me that if you make pictures there should be something on the canvas".

In the end, the empty canvas was never sold, at least not at that auction.

Now, I have just read in Real Clear Science about the shortest paper ever published.

It is entitled "The Unsuccessful Self-Treatment of a Case of Writer's Block" by one Dennis Upper.
The paper is an empty page. The referee's comments are reproduced below the empty page and read as follows:

"I have studied this manuscript very carefully with lemon juice and X-rays and have not detected a single flaw in either design or writing style. I suggest it be published without revision. Clearly it is the most concise manuscript I have ever seen-yet it contains sufficient detail to allow other investigators to replicate Dr. Upper's failure. In comparison with the other manuscripts I get from you containing all that complicated detail, this one was a pleasure to examine. Surely we can find a place for this paper in the Journal-perhaps on the edge of a blank page."

There is nothing on the page -- and yet "it contains sufficient detail to allow other investigators to replicate..."

Bob Law asked the viewer to reflect by applying "the seductive idea of nothing to a canvas"

Both scientists and artists can now, in the absence of all detail, create their own details.

So science and art do meet, sort of, don't they? After all, who can deny the similarity here?

Maybe someone should ask the auction house to sell a copy of the paper (preferably signed by Dennis Upper) alongside Bob Law's empty canvas.

That will be a true meeting of art and science - united under money.

The question is: which one will fetch the higher price?

Sunday, December 29, 2013

Wonderful, transient, art in the snow

 

The British artist Simon Beck has created some memorable art in the snow at a ski resort in France, some of which can be seen here. The pictures are attractive to look at, but they must be even more exciting to see in person.

With every new snowfall, the creations will gradually change and then disappear altogether. The art is therefore, of necessity, transient, and so has transience as an added element. I wonder whether these creations, and their transience, will not be even more appealing to those of a Japanese culture, which emphasizes transience (wabi) as a feature of beauty.

Perhaps some contemporary art gallery should buy good quality photographs of these creations, to exhibit permanently what is only transient.

Just a thought!

Tuesday, December 24, 2013

The great Stephen Hawking

At a recent event to launch the exhibition on the Large Hadron Collider at the Science Museum in London, the great Stephen Hawking made what must seem to many an unusual declaration. He said, “Physics would be far more interesting if [the Higgs boson] had not been found”. Physicists would then have to re-think many of their fundamental ideas about particles and the forces that bind or repel them.

By saying so, Hawking was displaying both the qualities and perhaps the failings of scientists. Scientists, or at least the great ones like him, love the process of solving great and difficult problems. The solution may be quite marvelous and exciting to think about; it may even be very moving. But, once solved, it ceases to be a problem, which the enquiring mind needs.

So, what Hawking was saying, it seems to me, is that if the Higgs boson had not been found, the problem would have persisted and exercised and concentrated minds, which is what scientists like so much.

This of course is very distant from those who wish that a problem should never be solved, because they fear the results. Some have written of their fear of work on the neurobiology of love, because it will “de-mystify” it; others have written, of neuroesthetics, that they would find it unwelcome to learn what happens in their brains when they view a work of art or listen to music. Hawking wants to learn; they don’t. If Hawking prefers that the Higgs boson had not been found, it is simply that he relishes the process of discovery. He is not fearful of the results; they are.

Why, then, should this also be a failing? I think it is because lesser scientists (and let us not under-estimate the degree to which scientific progress also depends upon lesser scientists) can easily be distracted from trying to solve great problems into solving relatively minor ones, precisely because they love the process of solving problems! I have seen it happen many times.

But there are of course many problems that remain in physics and astronomy. And Hawking is hoping that physicists will move on to solving even grander problems about the nature of our universe.

Hawking is not afraid of de-mystification.  Not at all. The mark of a real intellect.

Monday, December 23, 2013

The paradox of Shunga


The quite wonderful exhibition, Shunga: sex and pleasure in Japanese art  at the British Museum in London, carries with it a surprising paradox or contradiction, which no one has so far been able to explain to me adequately.

Japanese culture in general emphasizes the unstated and the understated, leaving much to the imagination. Yet Shunga art, which is basically erotic art, is the exact opposite. Here, almost nothing to do with the genitals is left to the imagination; instead they are given prominence, the size of the organs more often than not exaggerated beyond reasonable dimensions. 

Yet, in spite of this prominence, most of the rest of the body is covered up in many, if not most, depictions of sexual encounters; in many it is the genitalia alone that are exposed. There is, of course, also something of the artificial in these works: couples make love with their clothes on; the hair is usually immaculately coiffed; in some a lady is having her hair combed while having intercourse, while in others there are spectators, including children, witnessing the scene.

Why would a culture that has traditionally emphasized the understated produce work that is anything but understated? Some Japanese friends have told me that Shunga is nothing but pornography. I do not believe it. In spite of the fact that they may have been used as stimulants or as props for sexual pleasure, these are works of art as well. It is the brilliant depiction of interiors, the wonderful colour combinations, and the immaculate detail with which clothes are represented that turn them into visually pleasurable works. Indeed, it may be said that the genitalia are in fact often a distraction from the rest of the work, especially from the depiction of the graceful women in the Shunga work of Kitagawa Utamaro. If “art is fantasy”, as a quote at the exhibition proclaims, then it is those graceful figures that invite the viewer into a world of fantasy, not the prominently exposed genitalia. A critic once wrote that the sexual figure in Botticelli’s Birth of Venus is not the naked lady but her richly dressed companion to the right, presumably meaning that it is the latter who draws the viewer into a world of fantasy. Maybe the great masters of Shunga art were trying to balance the explicitness of their images with depictions that allow a world of fantasy and imagination to come into play, all in one.

Shunga was apparently illegal in Japan for long periods, though tolerated throughout and popular at all levels of society. It is, I gather, still frowned upon in Japan. Indeed, I am told that, in modern-day Japan, adult movies in hotels often blur the genitalia – in striking contradiction to the Shunga art of earlier times. And there is the contradiction: explicit pornographic films that blur the genitalia on the one hand (perhaps in keeping with the understated in Japanese culture), and great art that is explicit in nothing but the genitalia on the other (quite unlike the understated characteristic of Japanese culture).

Saturday, November 16, 2013

The shocks of Francis Bacon

 
Francis Bacon claimed that he wanted to give “a visual shock”, and his paintings over the decades never seem to have departed from that aim. One of his first exhibitions, in New York, was described as a “chamber of horrors” and Margaret Thatcher, perhaps echoing the views of many outside the art world, once described him as “that man who paints those horrible pictures”. As I understand it, most people (even those who admire his painterly style) would prefer not to have his paintings hanging in their living rooms.

Last week, his three-panelled painting, entitled Three Studies of Lucian Freud, produced another shock – a financial one. It fetched a record price in New York, being sold for the sum of $142.4 million. What is it that attracted buyers to spend so much (the bidding started at $80 million)?

I believe that Bacon subverted the brain’s normal representation of faces and bodies, which is what turned his pictures into shocking displays. The brain, it seems, cannot easily adapt to departures from what constitutes a normal face; it cannot adapt easily to the disfigured faces and bodies that Bacon specialized in, as a means of making images of the violent reality which, according to him, was so prevalent in the world. Hence the enduring shock effect that he produced.

Most of the discussions I heard and articles I read on this sale revolved around the topic of money. It is not that buyers were only speculating. Rather, "deep-pocketed" buyers were also ready, it seems, to splash out considerable sums to buy paintings for their national museums or their homes. I am inclined to the view that, when it comes to spending such vast sums, long-term value is naturally important but cannot be the only or even the dominant factor. So what, beyond the prestige of Bacon, drove prices so high? How could paintings reviled through phrases like "horrors" or "mutilated corpses" or "extremely repellent" – paintings which so many (including one speaker on the radio last week) declared they would rather not see hanging in their living rooms – be so much sought after?

Perhaps we have a very deep-seated fascination with horror, especially when it is so evocatively depicted. Perhaps those who yearn to view such paintings are an infinitely more sophisticated and refined, indeed artistic, version of those who jam the roads on their way to see a crashed plane. There are, of course, huge artistic qualities to Bacon's work – they are formally masterful works, with a quite spectacular, and often unusual, combination of colours. But the fact remains that they also depict mutilated and savaged faces and bodies – the viewing of which almost certainly stimulates strongly such sub-cortical centres as the amygdala and the insula, which seemingly respond to fear and horror. And let us not forget that Bacon said he was not appealing to the intellect: "I make paintings that the intellect cannot make", which implies that he was appealing to something more primitive in his work. In his quite wonderful book on Francis Bacon, Michael Peppiatt says that Bacon's aim was to deliver a visual shock before things got spelled out in the brain (or words to that effect). Perhaps combining the aesthetically pleasing colours with the mutilation that he so consistently depicted makes the latter more palatable – and even pleasing. The more so if one knows that such a combination is a good place to park one's money.

Sunday, October 6, 2013

Academic violence

 
One is always somewhat surprised when academics who, in the words of HL Mencken, are generally as "harmless as so many convicts in the death house", turn to violence. In general, academics dislike violence and prefer to pursue their trade peacefully, although there are many examples of verbal violence. I know of an English university department at which speakers are reluctant to speak because of the extreme verbal violence of one of its members.

Yet it is surprising when this violence escalates to the level of arms. The BBC reports one such incident in which an argument about the German philosopher Immanuel Kant escalated to such levels that it ended with one protagonist firing rubber bullets at the other. What the bone of contention was is not recorded. It could have been the "a priori synthetic" or the "categorical imperative" or perhaps the "transcendental aesthetic". At any rate, one of the protagonists was charged with causing grievous bodily harm.

Kant himself would probably have been very surprised. His book, the Critique of Pure Reason, apparently sold only five copies when first published, of which two were purchased by himself (I cannot vouch for the accuracy of this story, which I read somewhere years ago). He was in general a very peaceful man whose habits were so punctual that housewives apparently set their watches by when he went to work and when he returned. The French critic Rémy de Gourmont marveled that a man like Kant, who had neither wife nor mistress and who (as Gourmont believed) died a virgin, could have written a book on the metaphysics of morals!

Yet, violence in academic circles has been recorded before (I mean real violence, not the verbal one, which is very common). There is, for example, the story of Pierre Marie, an eminent French neurologist, who accused another eminent French neurologist, Déjerine, of doing science as some play roulette. But, upon being challenged to a duel, Marie wisely chose to retract his accusation.

On one occasion, I was told not to mention 40Hz when giving a seminar if a certain gentleman was in the audience, for fear that he may suffer a heart attack. I wisely obeyed. But I am told that he later died of a heart attack anyway.

Perhaps it is only fear that keeps academics from resorting to real violence. I know of stories of one German physiologist saying of another, "Now that I have shown that he cannot use a slide rule, I intend to take no further notice of his work", while another accused a colleague of "auto-plagiarizing". I can well imagine such incidents boiling over and resulting in – well, the firing of rubber bullets, at least.

It all goes to show that dispassionate academics, searching for truth in their ivory towers, may not be impervious to these human instincts, just like the rest of us.

Monday, July 29, 2013

Silence at Götterdämmerung

This has been quite a Wagner week at the Proms in London, the first time that the complete Ring cycle has been performed there, to celebrate the bicentenary of Wagner’s birth.

I am sure that many much more qualified than me will write about this historic occasion. All I need to say is that I enjoyed it tremendously, in spite of the oppressive heat in the hall. As one critic wrote somewhere, “it can’t get much better than this”.

My purpose here is really only to record one extraordinary moment where nothing happened…at the end of Götterdämmerung. Maestro Daniel Barenboim held his baton up for a good 18 seconds after the last note, and everyone held their breath, leaving the gigantic hall, filled to capacity, completely hushed. There was a great deal left to the imagination in those few moments, much longer imaginatively than the real time of 18 seconds suggests. 

You can listen to the silence here at 1:21:13 onwards (for the next six days only).

I say nothing happened, but of course a great deal must have gone through the minds of the thousands attending the performance and the millions listening at home.

This was a perfect ending, for there was nothing left to say but much to think about silently in those few moments.

Indeed, one of the more remarkable features about this Ring cycle was the complete silence from the audience at those silent or subdued moments during the performance, a fact that Barenboim commented on, and thanked the audience for, in his speech at the end of the performance of Götterdämmerung.

I have written many times here about the value of the unstated in art and the silence in music. Yesterday, Daniel Barenboim demonstrated it to powerful effect.

Saturday, April 6, 2013

Neurophobes and their neurophobia

 
I describe briefly a new and hitherto undocumented phobia, which I shall name neurophobia; those who display it are neurophobes. It is a recent phobia, perhaps no more than 15 years old, but it shares characteristics with other phobias. It is to be distinguished from the neurophobia that medical students apparently suffer from when studying neurology.

Neurophobia can be defined as a profound dislike, with various degrees of severity, for cognitive neurobiology and especially for neuroesthetics and for what these disciplines promise to show us.

Neurophobes are a motley crowd and, as with so many other phobias, they include people from different backgrounds and walks of life – philosophers of different degrees of eminence, humanists, religionists and even (surprisingly) some neurobiologists. This is not to say that all philosophers and humanists are neurophobes, far from it; many are interested and excited by the discoveries that neurobiology and neuroesthetics have to offer, but neurophobes are more vocal. Nor are all religionists neurophobes: I have had some very interesting discussions with some religionists, who have shown themselves to be hospitable to new ideas. Interestingly, I have not encountered neurophobia among artists (yet), which again does not mean that there aren’t neurophobes among them. Hence, neurophobia, like other phobias, cannot be associated with any particular grouping, either socio-economic, cultural or otherwise.

Among the characteristics of neurophobia, one may list the following:

1. An irrational fear: they invest neuroesthetics in particular with imaginary powers, treating it almost as a weapon of mass destruction (WMD) – for how else is one to interpret the statement that neuroesthetics “… will flatten all the complexity of culture, and the beauty of it as well”, and other similar statements?

2. A desire to find a place for the mind outside the brain, not perhaps realizing that cognitive neurobiology and neuroesthetics study neural mechanisms and hence the brain, and that their conclusions are to be seen in that context.

3. The use of emotionally charged and pejorative terms to dismiss neuroesthetics, terms such as “trash”, “buncombe”, “rubbish” and others like them, which have no place, or should have no place, in scholarly (and especially scientific) discourse. Hence neurophobia shares a similarity with other phobias in that it is not easy to rationalize it cognitively, an appeal to emotional and pejorative language being the only way out.

4. The pursuit of ignorance: As with so many other phobias, this amounts to the wish not to know. Hence, neurophobes don’t want any scientific ‘de-mystification’, which they would regard as a “desecration” (note again the emotive language) and prefer to live in ignorance. This is of course similar to other prejudices, where ignorance is the preferred course.

5. Protecting themselves against the facts: as I have said before, once they relegate our discipline to the status of “trash”, they need not bother with it. And there is, in their writings, good evidence that they have not read what we have written.

6. Arrogance of ignorance: neurophobes always assume that they know better, and hence lecture us on what they suppose we are not aware of. They never cease to tell us that art and beauty are not the same, as if we are not aware of that and have not written about it. They never cease to emphasize the importance of culture and learning in aesthetic appreciation, as if this is a new insight that we are not aware of.

7. Attack the methods: where all else fails, there is always recourse to attacking our methodology – principally the imaging techniques. They fault these for their spatial and temporal resolution (sometimes using emotive language) as if we are not aware of these shortcomings and do not take account of them in our interpretations. (I will have more to say about this in a future post.) I imagine most are scared of new technologies that will have greater powers of resolution.

These characteristics are very descriptive of neurophobia, and they are interlinked. Hence, if one detects one of them in an individual, one must suspect him/her of being a neurophobe who will display the other characteristics on gentle probing. Here I would advise caution; it is best to probe a little further before classifying someone as a neurophobe.

Of course, many of them preface their pejorative remarks with faint praise, such as “neurology has made important advances” (rather like “some of my best friends are neurologists”).

And finally…what one neurophobe says or writes is remarkably similar to what another one says or writes, reminding me of the famous line of President Reagan, “There you go again”. Indeed, so similar are their articles that it becomes reminiscent of another one of Reagan’s famous lines (about redwood trees): “Once you’ve seen one, you’ve seen them all”.



Tuesday, April 2, 2013

Pigeons and Picassos


In 1995, a Japanese team was awarded the Ig Nobel Prize in Psychology for their work describing how pigeons can be trained to discriminate between the paintings of Picasso and those of Monet. Previous work had shown that pigeons could distinguish between the music of Bach and Stravinsky.

Receiving the Ig Nobel Prize must be a mixed blessing, as its very title implies. Often the implication is that there is something trivial in the research reported and sometimes it is awarded for what many would regard as work that is not scientifically worthy, for example a report to the US Congress that nicotine is not addictive (awarded the Ig Nobel Prize in Medicine in 1996).

Others are frankly funny, such as the Ig Nobel Prize for Peace (2000), awarded to the British Royal Navy for a Monty Python-like command that its sailors should not fire live cannon shells but instead shout “Bang”, or the one awarded in Biology (2004) for showing that herrings communicate by passing wind (farting).

In fact, many of these Ig Nobel prizes go to worthy and scientifically interesting work. The one about herrings communicating by farting turned out, apparently, to be strategically and financially important because the Swedish Navy, suspecting that Swedish waters were being infiltrated by Soviet submarines, had instigated a widespread but futile hunt for those submarines. After many inconclusive years, it turned out that the noises were probably coming from farting herrings. Had this been known, it is claimed, the Swedes would have saved hundreds of millions of Swedish kronor.

Science is, or should be, fun. And even apparently simple science can be fun BECAUSE it leads to new and interesting clues. The work for which the Japanese scientists got the Ig Nobel prize in 1995 really showed that pigeons, which have a well-developed visual apparatus, could distinguish between the paintings of Picasso and those of Monet because they formed a concept of these paintings. They did not apparently distinguish them because of the presence of sharp edges in the cubist paintings or colour in those of Monet. Hence, in addition to a well-developed visual apparatus, they have brains that are sophisticated enough (if that is the right word) to develop visual concepts about visual stimuli unrelated to their daily lives.

Concept formation, critical for the acquisition of knowledge, is a fascinating subject, but how the brain forms concepts is not known in any detail. That pigeons should be able to form concepts around works designed by humans for consumption by humans, works which have little to do with their world, perhaps has the germs of an insight into how more complex brains form concepts. It would, in fact, be just as interesting to learn how humans form concepts around different schools of paintings.

If the Ig Nobel prize brings such interesting science to wider attention, then it is pursuing a worthy cause.

Saturday, March 23, 2013

The Fear of Neuroesthetics IV

 
There are some who fear neuroesthetics because they fear that it may ‘de-mystify’ what they prefer to remain mysterious. Knowledge about brain mechanisms that may be involved in the experience of beauty or of love and desire would deprive them, so they believe, of the full enjoyment of those experiences. I gather that a prominent professor has said that he regards it as ‘unwelcome’ to learn what happens in his brain when he is experiencing beauty. Presumably, if he were sitting on some research council, he would use his influence to suspend research in these areas. So, it is a relief that those who hate neuroesthetics and fear it are not in a position to halt research in the subject, at least not at present. There was a time when they could have and, in some areas of research, came close to doing so. Galileo was investigated by the Inquisition and ordered to stay silent, which he did, sort of, for a while. In the Soviet Union, a law was passed forbidding dissent from Lysenko’s anti-Mendelian views, which resulted in many losing their jobs and even being imprisoned. The law was rescinded in the 1960s.

I have no complaints against those who do not want, through knowledge, to de-mystify things which they hope will remain mysterious. That is their view, and I respect it, sort of. But it has to be noted that these are not people who are avid to learn more. It is not that they are simply uninterested in certain things but that they are vocal in trying to discourage the rest of us from trying to learn more about important subjects – for I take it that the experience of love, beauty and desire are important and interesting subjects. In this sense, then, their intellect is somewhat limited. Though perfectly entitled to their views, these are not the sort of people whom I would like to have sitting on research councils.

In other ways, their attitude seems strange. Science has been de-mystifying things for millennia but I am not at all sure that the world has been rendered any less marvelous because of it. One could say that landing humans on the moon and bringing them back safely to earth was a step in de-mystifying the heavenly bodies, but it has not rendered the moon any less glorious; one could say that compressing all the secrets of life into two strands of DNA de-mystifies life, but it has made it all the more wondrous to me; one could also say that the role of neurotransmitters in regulating sexual behaviour (and hence determining, at least in the world of rodents, the extent of promiscuity) de-mystifies morality or immorality, at least in the world of rodents, but to me it raises a host of interesting questions about how behaviour is regulated, even when it threatens to invade the world of morality.

Perhaps much the more interesting question is a neurobiological one: why do some people (and there are many of them) prefer mystery to knowledge? What advantage does it bring them and what does it satisfy in them? If one of the functions of the brain is to acquire knowledge, what mechanism is it that suppresses the desire to acquire knowledge in such interesting spheres, when that knowledge does not harm anyone? What disadvantage would such knowledge bring them?

The answers to such questions, too, might de-mystify things and those hostile to learning more might want to discourage research councils from funding research in these areas as well. But they remain, nevertheless, interesting questions and so I hope that those who want to dictate what kind of knowledge should be pursued and what avoided are never given a seat in the councils that make decisions about funding research.

Sunday, March 10, 2013

Explaining art through perception

 
The Light Show at the Hayward Gallery, London, is a delight and, quite rightly, oversubscribed. The number entering at any one time is strictly controlled, allowing viewers the space to appreciate the exhibits – quite unlike the disgraceful “cram them in” policy at the Leonardo exhibition at the National Gallery last year. Some of the exhibits, like the Chromosaturation of Carlos Cruz-Diez, the Model for a Timeless Garden of Olafur Eliasson or Conrad Shawcross’s Slow Arc Inside a Cube IV (a bit of an unnecessary mouthful, this one), are ones to enjoy sensorially and to reflect about as much as one would about any work of art.

The weakness of the Hayward exhibition is that it pretends to combine science with art – or rather, to give a scientific explanation of the artistic exhibits – when it should really be seen as an art show and a delight to the senses, or else should have appended to the exhibits explanations that are scientifically valid. As it is, the show was spoiled somewhat for me by the explanations appended. At the entrance, the viewer reads that “Vision is the least reliable of the senses”. What is the basis for this? Many, probably most, neurobiologists would argue exactly the opposite: it is the most reliable of the senses, perhaps reflected in the fact that so much of our brain is devoted to vision.

We are then told that “What we see, or think we see, is not always how things are”. This is a profound misunderstanding of the workings of the brain – for what we see and experience is dictated by the organization of our brains, and is precisely how things are in perceptual reality, however much that reality may depart from “objective” reality. That is why, at my own exhibition at the Pecci Museum of Contemporary Art in Milan (Bianco su bianco: oltre Malevich), the visitor was welcomed with the following statement: “The only reality we experience is brain reality”.

When one looks at the Hering Illusion, the two straight lines, which are parallel, appear perceptually to be somewhat curved. The perceptual reality dominates even when one knows that the two lines are straight and strictly parallel. Or consider the rapid motion in the rings in Isia Leviant’s Enigma; to those who see the movement, there is no doubting its reality, even if there is no actual movement in the rings.


It never ceases to surprise me that we downgrade our true perceptual reality in favour of the “objective reality”: the former is always dismissed as not being what it seems, while the latter is always taken to be true. This gives the reality we experience a subservient place when in fact the only truths that we are able to experience are brain truths.

I am not saying anything particularly new here. Immanuel Kant said it long ago – that our knowledge of this world is a compound of the objective reality and the operations of the mind; we can therefore never know the thing as it is (das Ding an sich) because our only knowledge of the world is through the operations of the mind (brain). In discussing the philosophical importance of colour vision, Arthur Schopenhauer wrote of its importance for understanding the “Kantian doctrine of the likewise subjective, intellectual forms of all knowledge” – in other words, that all knowledge is mediated through the operations of the brain.

This exhibition pretends to explain the visual sensory process through art. Thus, the exciting Chromosaturation of Carlos Cruz-Diez has appended to it the following: “since the retina perceives a wide range of colours simultaneously, experiencing these monochromatic situations causes visual disturbances”.

Almost everything in that statement is incorrect. There are no monochromatic lights in the exhibit (all the lights are broadband, although in some one waveband may dominate the others), the retina does not “perceive” colours, and there is no “visual disturbance” but only visual sensory excitement – leaving one wondering where the induced “misty” environment comes from. The exhibit would have been better without these incorrect explanations. Why not call it an unusual visual experience instead?

Perhaps artists do not read about advances in science – why should they, after all? Perhaps we do not explain our findings properly. Whatever the real reasons, here is a good example of artists and curators trying to explain perceptual processes through artistic achievements and doing so very badly and, worse, inaccurately. It is exactly the reverse of what neuroesthetics has been falsely accused of doing, namely explaining works of art through neuroscience, even though that is not its aim (see this post and this post).

Hence, my advice is – go to this delightful exhibition and enjoy the exhibits as creative works of art. Many might want to do more than that; they might wonder what these exhibits tell us about the brain’s perceptual mechanism. But, please ignore the explanations appended to the exhibits – they say nothing about the visual process, or about the sensory brain or about perception, which is not to say that viewing these works does not raise questions about sensory processes.

Here, then, is an exhibition which inspires thinking about the operations of the brain. It is not what it pretends to be, namely an explanation of overall sensory processes. It is, rather, a good illustration of how works of art can inspire neuroesthetic studies.

Sunday, March 3, 2013

Judgment without "facts"


A jury was unable to reach a verdict at a recent high-profile trial, for obstruction of justice, of the wife of a disgraced ex-politician. The jury came in for much ridicule for the questions they asked of the judge while deliberating. The lawyer for the prosecution

“questioned whether the case could continue. ‘I don’t ever recollect getting to this stage in any trial – even for more complicated trials than this – and after two days of retirement a list of questions of this very basic kind illustrating at least some jurors don’t appear to have grasped it,’ he said.”

I myself do not share the view that these were all silly or irrelevant questions, although one was somewhat funny and got a funny answer in return:

[Jury]: Can you define what is reasonable doubt?
[Judge]: A reasonable doubt is a doubt which is reasonable. These are ordinary English words that the law doesn’t allow me to help you with beyond reasonable written directions.

But my main interest is the jurors’ question that captured the headline in at least one daily newspaper:

[Jury]: Can a juror come to a verdict based on a reason that was not presented in court and has no facts or evidence to support it either from the prosecution or the defence?
[Judge]: The answer to that question is firmly no. That is because it would be completely contrary to the directions I have given.

But in assessing a situation we often rely on evidence that is not “factual” in the literal sense but may be factual in that it speaks to our faculties of judgment. It is absurd to believe that we do not frequently come to doubt whether someone is telling the truth simply by studying their body language, or the hesitation in their voice, or because, when we gaze into their eyes, there is something that jars with the ‘factual’ story being told.

I myself have been a juror on two occasions and can testify that such signals, though not facts presented in court and not part of the evidence presented by the prosecution or the defence, nevertheless play an important role in reaching a decision. I believe that, in addition to the evidence presented in court, my co-jurors used the same or similar signals in reaching our common verdict.

I have also asked judges whether, when presiding over a case, they use visual cues that are not facts presented in court in reaching a judgment as to whether the defendant is innocent or guilty. They have always answered that such cues play an important role. Of course, in most such cases the judge can leave the final verdict to the jury, but there have been cases where the judge has disagreed with the jury. Somerset Maugham even wrote a very interesting short story about the consequences of such a divergence of views, in which the judge in a case meets the (acquitted) accused years later at a dinner party (I read it years ago and cannot now recall its name).

The final verdict must depend significantly upon whether a witness or the accused is telling the truth or lying and, in judging that, many factors besides the evidence presented in court come into play – in the form of signals that the brain receives and interprets but which do not constitute part of the body of evidence presented in court.

My point is that the brain is very good at picking up signals that do not constitute ‘evidence’ in the legal sense but are nevertheless vital in reaching judgment.

Nor am I saying something new. Everyone knows that we make judgments based on objectively non-factual but subjectively critical evidence, nearly every day.

Hence, to ridicule the jurors’ question given above is silly. It is indeed as silly as the statement attributed to Picasso, which I alluded to in a previous post, that “when we love a woman we don’t start by measuring her limbs”. The truth is of course otherwise; we actually start by making very detailed, often nearly instantaneous but perhaps partially unconscious, measurements of a great deal before we fall in love.