The chrysalis that isn’t there

I wrote the following post while listening to this track. Perhaps you will enjoy reading it to the same sounds. Otherwise, please consider it a whimsical recommendation. 🙂

I should really start keeping a log of the stories in the news that all point to the little-acknowledged but all-too-evident fact that science – like so many things, including people – does not embody lofty ideals as much as the aspirations to those ideals. Nature News reported on January 31 that “a language analysis of titles and abstracts in more than 100,000 scientific articles,” published in the British Medical Journal (BMJ), had “found that papers with first and last authors who were both women were about 12% less likely than male-authored papers to include sensationalistic terms such as ‘unprecedented’, ‘novel’, ‘excellent’ or ‘remarkable’;” further, “The articles in each comparison were presumably of similar quality, but those with positive words in the title or abstract garnered 9% more citations overall.” The scientific literature, people!
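For a sense of what a “language analysis” like this boils down to mechanically, here is a minimal sketch. The four terms are the ones quoted above; everything else – the toy records, the grouping labels – is mine, not the BMJ study’s, and the real analysis controlled for field, journal and year in ways this obviously doesn’t:

```python
# A minimal sketch: flag abstracts that use self-promotional terms and compare
# rates across author groups. The records below are invented for illustration.
import re

POSITIVE_TERMS = {"unprecedented", "novel", "excellent", "remarkable"}

def uses_positive_term(text: str) -> bool:
    words = set(re.findall(r"[a-z]+", text.lower()))
    return bool(words & POSITIVE_TERMS)

# Toy records: (first-and-last-author gender pair, abstract text)
records = [
    ("both_women", "We report a modest improvement in diagnostic accuracy."),
    ("other", "A novel and unprecedented approach to diagnosis."),
    ("other", "Results were comparable to existing methods."),
    ("both_women", "An excellent framework for early detection."),
]

counts = {}
for group, abstract in records:
    hits, total = counts.get(group, (0, 0))
    counts[group] = (hits + uses_positive_term(abstract), total + 1)

for group, (hits, total) in counts.items():
    print(f"{group}: {hits}/{total} abstracts use a positive term")
```

The counting itself is trivial; the finding’s weight comes entirely from the scale (100,000+ papers) and from the comparison being made across presumably similar-quality articles.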

Science is only as good as its exponents, and there is neither meaning nor advantage in assuming that there is such a thing as a science beyond, outside of and without these people. Doing so inflates science’s importance beyond what it deserves, and suppresses its shortcomings and prevents them from being addressed. For example, the BMJ study prima facie points to gender discrimination, but it also describes a scientific literature that you will never find out is skewed, and therefore unrepresentative of reality, unless you acknowledge that it is constituted by papers authored by people of two genders, on a planet where one gender has maintained a social hegemony for millennia – much like you will never know Earth has an axis of rotation unless you are able to see its continents or make sense of its weather.

The scientific method describes a popular way to design experiments that scientists can perform to elucidate refined, and refinable, answers to increasingly complex questions. However, the method is an external object (of human construction) that only, and arguably asymptotically, mediates the relationship between the question and the answer. Everything that comes before the question and after the answer is mediated by a human consciousness undeniably shaped by social, cultural, economic and mental forces.

Even the industry that we associate with modern science – composed of people who trained to be scientists over at least 15 years of education, then went on to instruct and/or study in research institutes, universities and laboratories, being required to teach a fixed number of classes, publish a minimum number of papers and accrue citations, and/or produce X graduate students, while drafting proposals and applying for grants, participating in workshops and conferences, editing journals, possibly administering scientific work and consulting on policy – is steeped in human needs and aspirations, and is even designed to make room for them. Yet many of us non-scientists are frequently and successfully tempted to treat the act of becoming a scientist as an act of transformation: characterised by an instant in time when a person changes into something else, a higher creature of sorts, the way a larva enters a magical chrysalis and exits a butterfly.

But for a man to become a scientist has never meant the shedding of his identity or social stature; ultimately, to become a scientist is to terminate at some quasi-arbitrary moment the slow inculcation of well-founded knowledge crafted to serve a profitable industry. There is a science we know simply as the moment of discovery: it is the less problematic of the two kinds. The other, in the 21st century, is also funding, networking, negotiating, lobbying, travelling, fighting, communicating, introspecting and, inescapably, some suffering. Otherwise, scientific knowledge – one of the ultimate products of the modern scientific enterprise – wouldn’t be as well-organised, accessible and uplifting as it is today.

But it would be silly to think that in the process of constructing this world-machine of sorts, we baked in the best of us, locked out the worst of us, and threw the key away. Instead, like all human endeavour, science evolves with us. While it may from time to time present opportunities to realise one or two ideals, it remains for the most part a deep and truthful reflection of ourselves. This assertion isn’t morally polarised, however; as they say, it is what it is – and this is precisely why we must acknowledge failures in the practice of science instead of sweeping them under the rug.

One male scientist choosing more uninhibitedly than a female scientist to call his observation “unprecedented” might have been encouraged, among other things, by the peculiarities of a gendered scientific labour force and scientific enterprise, but many male scientists indulging just as freely in their evaluatory fantasies, such as they are, indicates a systemic corruption that transcends (but does not escape) science. The same goes, as in another recent example, for the view that science is self-correcting. It is not, because people are not, and they need to be pushed to be. In March 2019, for example, researchers uncovered at least 58 papers published in a six-week period whose authors had switched their desired outcomes between the start and end of their respective experiments to report positive, and to avoid reporting negative, results. When the researchers wrote to the authors as well as the editors of the journals that had published the problem papers, most of them denied there was an issue and refused to accept modifications.

Again, the scientific literature, people!

A science for the non-1%

David Michaels, an epidemiologist and a former US assistant secretary of labour for occupational safety and health under Barack Obama, writes in the Boston Review:

[Product defence] operations have on their payrolls—or can bring in on a moment’s notice—toxicologists, epidemiologists, biostatisticians, risk assessors, and any other professionally trained, media-savvy experts deemed necessary (economists too, especially for inflating the costs and deflating the benefits of proposed regulation, as well as for antitrust issues). Much of their work involves production of scientific materials that purport to show that a product a corporation makes or uses or even discharges as air or water pollution is just not very dangerous. These useful “experts” produce impressive-looking reports and publish the results of their studies in peer-reviewed scientific journals (reviewed, of course, by peers of the hired guns writing the articles). Simply put, the product defence machine cooks the books, and if the first recipe doesn’t pan out with the desired results, they commission a new effort and try again.

Members of the corporate class have played an instrumental role in undermining trust in science in the last century, and Michaels’s exposition provides an insightful glimpse of how they work, and why what they do works. However, the narrative Michaels employs, as illustrated above, treats scientists like minions – a group of people that will follow your instructions but will not endeavour to question how their research is going to be used as long as, presumably, their own goals are met – and also excuses them for it. This is silly: the corporate class couldn’t have done what it did without help from a sliver of the scientific class that sold its expertise to the highest bidder.

Even if such actions may have been more the result of incompetence than of malice, for too long scientists have claimed vincible ignorance in their quasi-traditional tendency to prize unattached scientific progress over scientific progress in step with societal aspirations. They need to step up, step out and participate in political programmes that deploy scientific knowledge to solve messy real-world problems – programmes that frequently fail and just as frequently serve misguided ends (such as – but sure as hell not limited to – laundering the soiled reputation of a pedophile and convicted sex offender).

But even so, even as the scientists’ conduct typifies the problem, the buck stops with the framework of incentives that guides them.

Despite its connections with technologies that powered colonialism and war, science has somehow accrued a reputation of being clean. To want to be a scientist today is to want to make sense of the natural universe – an aspiration both simple and respectable – and to make a break from the piddling problems of here and now to the more spiritually refined omnipresent and eternal. However, this image can’t afford to maintain itself by taking the deeply human world it is embedded in for granted.

Science has become a reason of state simply because the state is busy keeping science and politics separate. No academic programme in the world today considers scientific research to be on par with public engagement and political participation[a], when exactly this is necessary to establish science as an exercise through which, fundamentally, people construct knowledge about the world and then ensure it is used responsibly (as well as to demote it from the lofty pedestal where it currently lords over the social sciences and humanities). Instead, we have a system that encourages only the production of knowledge, tying it up with metrics of professional success, career advancement and, most importantly, a culture of higher education[b] and research that won’t brook dissent and tolerates activist-scientists as lesser creatures.

[a] And it is to the government’s credit that political participation has become synonymous with electoral politics and the public expression of allegiance to political ideologies.

[b] Indeed, the problem most commonly manifests as a jaundiced impression of the purpose of teaching.

The perpetuators of this structure are responsible for the formation and subsequent profitability of “the strategy of manufacturing doubt”, which Michaels writes “has worked wonders … as a public relations tool in the current debate over the use of scientific evidence in public policy. … [The] main motivation all along has been only to sow confusion and buy time, sometimes lots of time, allowing entire industries to thrive or individual companies to maintain market share while developing a new product.”

To fight the vision of these perpetuators, to at least rescue the fruits of the methods of science from inadvertent ignominy, we need publicly active scientists to be the rule, not the exception. We need structural incentives to change because, if they don’t, this group of people will remain limited to members of the upper class and/or upper castes. We need a stronger, closer marriage of science, the social sciences, business administration and policymaking.

To be sure, I’m not saying the mere presence of scientists in public debates will lead to swifter solutions, nor that the absence of science alone in policymaking is responsible for so many of the crises of our times – only that their absence has left cracks so big, it’s quite difficult to see how they can be sealed any other way[c]. And yes, the world will slow down, the rich will become less rich and economic growth will become more halting, but these are all also excuses to maintain a status quo that has only exploited the non-1% for two centuries straight.

[c] Michaels concludes his piece with a list of techniques the product-defence faction has used to sow doubt and, in the resulting moments of vulnerability, ‘sell science’ – i.e. techniques that represent the absence of guiding voices.

Of course, there’s only so much one can do if the political class isn’t receptive to one’s ideas – but we must begin somewhere, and what better place to begin than at the knowledgeable place?

The cycle

Is it just me or does everyone see a self-fulfilling prophecy here?

For a long time, and assisted ably by the ‘publish or perish’ paradigm, researchers sought to have their papers published in high-impact-factor journals – a.k.a. prestige journals – like Nature.

Such journals in turn, assisted ably by parasitic strategies, made these papers highly visible to other researchers around the world and, by virtue of being high-IF journals, tainted the results in the papers with a measure of prestige, ergo importance.

Evaluation and awards committees in turn took greater notice of these papers than of others and picked their authors for rewards, further amplifying their work, increasing the opportunity cost incurred by the researchers who lost out, and increasing the prestige attached to the high-IF journals.

Run this cycle a few million times and you end up with the impression that there’s something journals like Nature get right – when in fact it’s mostly just a bunch of business practices designed to ensure they remain profitable.
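To see why running the cycle enough times is sufficient to manufacture prestige, here is a toy simulation. Every number in it is invented, and – precisely the point – paper quality never appears anywhere in the loop:

```python
# A toy rich-get-richer loop: authors submit to journals in proportion to
# current prestige, committees reward what they see in prestigious journals,
# and the reward feeds straight back into prestige.
import random

def simulate(initial_prestige, rounds=10_000, seed=0):
    rng = random.Random(seed)
    prestige = list(initial_prestige)
    for _ in range(rounds):
        # An author picks a journal with probability proportional to prestige.
        pick = rng.choices(range(len(prestige)), weights=prestige)[0]
        # A committee rewards the paper; the credit accrues to the journal.
        prestige[pick] += 1
    return prestige

# Five journals, one with a 10% head start in perceived prestige.
final = simulate([1.1, 1.0, 1.0, 1.0, 1.0])
total = sum(final)
print([round(p / total, 2) for p in final])
# Large prestige gaps open up even though nothing about the quality of any
# journal's papers was modelled; rerun with different seeds and the "winner"
# changes, because it is mostly an accident of the early rounds.
```

This is the mildest, linear version of the feedback; if rewards scale faster than prestige, the concentration is even more extreme, with one journal eventually capturing nearly all the new credit.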

Performing with and without an audience

My feeling is that as far as creativity is concerned, isolation is required. … The presence of others can only inhibit this process, since creation is embarrassing.

– Isaac Asimov (source)

Far be it from me to fall for a behavioural studies paper that hasn’t yet been replicated, and farther still to do so based on a university press release, but this one caught my attention because it suggests something completely opposite to my experience: “when there’s an audience, people’s performance improves”. Sure enough, four full paras into the piece there’s a qualification:

Vikram Chib, an assistant professor of biomedical engineering at Johns Hopkins … who has studied what happens in the brain when people choke under pressure, originally launched this project to investigate how performance suffers under social observation. But it quickly became clear that in certain situations, having an audience spurred people to do better, the same way it would if money was on the line. (emphasis added)

The situation in question involved 20 participants playing a videogame in front of an audience of two and, in a different ‘act’, in front of no audience at all. If a participant played the game better, he/she received a higher reward. Brain activity was monitored at all times using an fMRI machine.

You realise now that the press release’s headline is almost criminally wrong, considering it’s likely been vetted by some scientists, if not by those who conducted the study themselves. It suggests that people’s performance improves in all circumstances; however, a videogame is nothing like writing, for example. In fact, you’d be hard-pressed to find someone who can write while they’re being watched. This is because writing isn’t a performance art, whereas a videogame could be. And when executing a performance, having an audience helps.

According to Chib and the press release, this is the mechanism of action:

When participants knew an audience was watching, a part of the prefrontal cortex associated with social cognition, particularly the thoughts and intentions of others, activated along with another part of the cortex associated with reward. Together these signals triggered activity in the ventral striatum, an area of the brain that motivates action and motor skills.

While this is interesting, 20 people isn’t much of a sample, the task is too simple and hardly generalisable, and the audience is too small. Playing a videogame in front of two (presumably) strangers is nothing like playing a videogame in a room chock-full of people, or when the stakes are higher. In fact, in real life, you’re almost certainly being judged if there’s an audience watching you as you conduct a task, and your stress levels are going to be far higher than when you’re playing something on your Xbox in front of two people.

A final quibble is more of a wondering about the takeaway. The study seems to have focused on a very narrowly defined task, while one of its authors – Chib – freely acknowledges its various shortcomings. Why weren’t these known issues addressed in the same paper instead of angling for a follow-up? I suspect future studies will also have to perform the same experiment multiple times with different kinds of tasks.

But if the audience was a lot bigger, and the stakes higher, the results could have gone the other way. “Here people with social anxiety tended to perform better,” Chib said, “but at some point, the size of the audience could increase the size of one’s anxiety but we still need to figure that out.”

Perhaps this is a case of someone trying to jack up their publication count.

Featured image credit: Skitterphoto/pixabay.

R&D in China and India

“A great deal of the debate over globalization of knowledge economies has focused on China and India. One reason has been their rapid, sustained economic growth. The Chinese economy has averaged a growth rate of 9-10 percent for nearly two decades, and now ranks among the world’s largest economies. India, too, has grown steadily. After years of plodding along at an average annual increase in its gross domestic product (GDP) of 3.5 percent, India has expanded by 6 percent per annum since 1980, and more than 7 percent since 1994 (Wilson and Purushothaman, 2003). Both countries are expected to maintain their dynamism, at least for the near future.”

– Gereffi et al, ‘Getting the Numbers Right: International Engineering Education in the United States, China and India’, Journal of Engineering Education, January 2008

A June 16 paper in Proceedings of the National Academy of Sciences, titled ‘China’s Rise as a Major Contributor to Science and Technology’, analyses the academic and research environment in China over the last decade or so, and discusses the factors involved in the country’s increasing fecundity in recent years. It concludes that four factors have played an important role in this process:

  1. Large human capital base
  2. A labor market favoring academic meritocracy
  3. A large diaspora of Chinese-origin scientists
  4. A centralized government willing to invest in science

A simple metric they cite to make their point is the trend in publications by country. Between 2000 and 2010, for example, the number of science and engineering papers published by China increased by 470%. The next highest climb was India’s, at 234%.
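Converted into average annual growth, those decade-long jumps look like this. The conversion below is my back-of-the-envelope reading, not the paper’s: it assumes “increased by 470%” means the 2010 output was 5.7 times the 2000 output (and 3.34 times for India); if the figures instead mean 4.7x and 2.34x multiples, the annual rates come out somewhat lower.

```python
# Implied average annual growth rate from a multiple over a number of years.
def implied_annual_growth(multiple: float, years: int = 10) -> float:
    return multiple ** (1 / years) - 1

for country, multiple in [("China", 5.7), ("India", 3.34)]:
    print(f"{country}: ~{implied_annual_growth(multiple):.1%} per year")
# China: ~19.0% per year; India: ~12.8% per year
```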

Click on the image for an interactive chart.

“The cheaters don’t have to worry they will someday be caught and punished.”

This is a quantitative result. A common criticism of the rising volume of Chinese scientific literature over the last three decades concerns the quality of the research in it. Dramatic increases in research output are often accompanied by a publish-or-perish mindset that fosters a desperation among scientists to get published, leading to padded CVs, falsified data and plagiarism. Moreover, it’s plausible that since R&D funding in China is still controlled by a highly centralized government, the flow of money is restricted and access to it is highly competitive. And when it is government officials who evaluate science, quantitative results are favored over qualitative ones, reliance on misleading performance metrics increases, and funds are often awarded to areas of research that favor political agendas.

For this, the PNAS paper cites the work of Shi-min Fang, a science writer who won the inaugural John Maddox prize in 2012 for exposing scientific fraud in Chinese research circles. In an interview with NewScientist in November of that year, he explains the source of the widespread misconduct:

It is the result of interactions between totalitarianism, the lack of freedom of speech, press and academic research, extreme capitalism that tries to commercialise everything including science and education, traditional culture, the lack of scientific spirit, the culture of saving face and so on. It’s also because there is not a credible official channel to report, investigate and punish academic misconduct. The cheaters don’t have to worry they will someday be caught and punished.

At this point, it’s tempting to draw parallels with India. While China has seen increased funding for R&D…

Click on the chart for an interactive view.

… India has been less fortunate.

Click on the chart for an interactive view.

The issue of funding is slightly different in India, in fact. While Chinese science is obstinately centralized and publicly funded, Indian science is centralized in some parts and decentralized in others, public funding is not high enough – presumably because we lack the meritocratic academic environment – and private funding is not as high as it needs to be.

Click on the image for an interactive chart.

Even though the PNAS paper’s authors say their breakdown of what has driven scientific output from China could inspire changes in other countries, India faces different issues, as the charts above have shown. Indeed, the very first chart shows how, despite the number of published papers having doubled in the last decade, we have only jumped from one small number to another small number.

“Scientific research in India has become the handmaiden of defense technology.”

There is also a definite lack of visibility: little scientific output of any kind is accessible to 1) the common man and 2) the world outside. Apart from minimal media coverage, there is a paucity of scientific journals – or they exist but are not well known, not accessible, or both. This Jamia Millia collection lists a paltry 226 journals – including those in regional languages – but it’s likelier that there are hundreds more, both credible and dubious. A journal serves as an aggregation of reliable scientific knowledge not just for scientists but also for journalists and other reliant decision-makers. It is one place to find the latest developments.

In this context, Current Science appears to be the most favored journal in the country, not to mention the loneliest. Then again, a couple of fingers can be pointed at years of reliance on quantitative performance metrics, which drives many Indian researchers to publish in journals with very high impact factors, such as Nature or Science, which are often based outside the country.

In the absence of lists of Indian and Chinese journals, let’s turn to a table in the PNAS paper showing the average number of citations per article as a percentage of the US figure. It shows both India and China close to 40% in 2010-2011.

The poor showing may not be a direct consequence of low quality. For example, a paper may detail research conducted to resolve a niche issue in Indian defense technology. In such a case, the quality of the article may be high but the citability of the research itself will be low. Don’t be surprised if this is common in India, given our devotion to the space and nuclear sciences. And perhaps this is what a friend of mine was referring to when he said, “Scientific research in India has become the handmaiden of defense technology”.

To sum up: although India and China both lag the USA and the EU in the productivity and value of their research (albeit as measured by quantitative metrics), China is facing problems associated with the maturity of a voluminous scientific workforce, whereas India is quite far from that maturity. The PNAS paper is available here. If you’re interested in an analysis of engineering education in the two countries, see this paper (from which the opening lines of this post were borrowed).

