Redeeming art v. redeeming science

Recently, someone shared the cover of a soon-to-be-released book entitled The Physics of Climate Change, authored by Lawrence M. Krauss, and expressed excitement about its impending publication and the prospect of reading it. I instinctively responded that I would be actively boycotting the book, given the sexual harassment allegations against Krauss and his ties to Jeffrey Epstein. I didn’t, and don’t, wish to consume his scholarship.

Now, I don’t think that facts alone can be redemptive – that if a book’s contents are right, as ascertained through dispassionate tests of verification, we get to ignore questions about whether the contents are good. Many examples littering the history of science tell the story of how a fixation on facts (and, more recently, data), and their allegedly virtuous apoliticality, has led us astray.

Consider the story of Geoffrey Marcy. It does not matter, or matters less, that humankind as a whole has made great astronomical discoveries. Instead, it should matter – or matter more – how we go about making them. And Marcy was contemptible because his discoveries were fuelled not just by his appreciation of the facts, so to speak, but also by his pushing women out of astronomy and astrophysics and traumatising them. As a result, consuming the scholarship of Marcy, Krauss and so many others feels to me like fuelling their transgressions.

Many of these scholars assumed prominence because they drew in grants worth millions to their universities. Their scholarship dealt in facts, sure, but in the capitalist university system, scholarship also translates to grants and an arbitrarily defined ‘prestige’ that allow universities to excuse the scholars’ behaviour and sideline victims’ accusations. Some universities even participate in a system derisively called ‘passing the trash’; as BuzzFeed reported in the case of Erik Shapiro in 2017, “the ‘trash’ … refers to high-profile professors who bring status and money to universities that either ignore or are unaware of past scandals.”

So supporting scholars for the virtues of their scholarship alone seems quite disingenuous to me. It is somewhat like supporting the use of electric vehicles while ignoring the fact that most of the electricity that powers them is produced in coal-fired power plants. In both cases, the official policy is ultimately geared in favour of maximising profits. As such, the enemy here is the capitalist system and our universities’ collective decision to function on its principles, ergo singling scholarship out for praise seems misguided.

This is also why, though I’ve heard multiple arguments to the contrary, I really don’t know how to separate art from artist, or scholarship from scholar. An acquaintance offered the example of Georges Lemaître, the Belgian Catholic priest and cosmologist who – in the acquaintance’s telling – attempted to understand the world as it was without letting his background as a priest get in the way. I was not convinced, saying the case of Lemaître sounded like a privileged example for its clean distinction between one’s beliefs as a person and one’s beliefs as a scientist. I even expressed suspicion that there might be a reason Lemaître turned to a more mechanistic subject like cosmology and not a more negotiated one like social anthropology.

In fact, Krauss also discovered the world as it is in many ways, and those findings do not become wrong because of the person he was, or was later found to be. But we must not restrict ourselves to the right-wrong axis; we must navigate the good-bad axis as well.

Around this time, I also became curious about scientists who are not white men (though including trans men) who may have written on the same topic – the physics of climate change. So I went googling, and found quite a few results. My go-to response in such situations, concerning the fruits of a poisoned tree, has been to diversify sources – to look for other fruits – because then we also discover new scholarship and art, and empower conventionally disprivileged scholars and artists.

In this regard, the publishers of Krauss’s book also share blame (with Krauss’s universities, which empowered him by failing to create a safe space for students). If publishers stick with Krauss instead of, say, commissioning a professor from the Indian Institute of Tropical Meteorology, they are only entrenching preexisting prejudices. They reinforce the notion that they’d much rather redeem an unrepentant white man who has sinned than discover a new writer who deserves the opportunity more. So the publishers are only worsening the problem: they are effectively signalling to all remorseless perpetrators that publishers will help salvage what universities let sink.

At this point, another acquaintance offered a reconciliatory message: that while it’s unwise to dismiss misconduct, it’s also unwise to erase it, so it might be better to let it be and take from it only the good stuff. Sage words, but therein lay another rub, because of a vital difference between the power of fiction and (what I perceive to be) the innate amorality of scientific scholarship.

Fiction inspires better aspirations and is significantly more redeemable as a result, but I don’t suppose we can take the same position on, say, the second law of thermodynamics or Newton’s third law of motion. Or can we? If you know, please tell me. But until I’m disabused of the notion, I expect it will continue to be hard for me to find a way to rescue the scholarship of a ‘tainted’ scholar from the taint itself, especially when the scholarship has little potential – beyond the implicit fact of its existence, and therefore the ‘freedom of research’ it stands for – to improve the human condition as directly as fiction can.

[Six hours later] I realise I’ve written earlier about remembering Richard Feynman a certain way, as well as Enrico Fermi – the former for misogyny and the latter for a troublingly apolitical engagement with America’s nuclear programme – and that those prescriptions, to remember the bad with the good and the good with the bad, are now at odds with my response to Krauss. And then it struck me where the issue lay: I believe what works for Feynman should work for Krauss as well, except in the case of Krauss’s new book.

Feynman was relatively more prolific than Fermi or Krauss, since he was also more of a communicator and teacher. But while it would be impossible for me to escape the use of Feynman diagrams or Fermi-Dirac statistics if I were a theoretical particle physicist, I still have a choice to buy or boycott the book Surely You’re Joking, Mr Feynman! (1985) with zero consequences for my professional career. If at this point you rebut that “every book teaches us something”, so we can still read books without endorsing their authors, I would disagree on the simple ground that if you wish to learn, you could seek out other authors, especially those who deserve the opportunity of your readership more.

For the reasons, and with the uncertainty, described earlier, I expect the same can go for Krauss and The Physics of Climate Change: remember that Krauss was a good physicist and a bad man, and that he was a bad man who produced good physics. But even as other scientists stand on the shoulders of his contributions to quantum physics, I can and will skip The Physics of Climate Change.

Axiomatically, the more we insist that good science communication – an instance of which I believe the book is – is important to inculcate a better public appreciation of scientific research, and in the long run to improve funding prospects, increase public interest in science-backed solutions to societal problems, draw more students into STEM fields and hold the scientific enterprise accountable in more meaningful as well as more efficacious ways, the more science communication itself becomes a stakeholder in the mechanisms that produce the scientific work universities capitalise on – the currency of this whole enterprise.

Clarity and soundness

“I feel a lot of non-science editors just switch off when they read science stuff.”

A friend told me this earlier today, during yet another conversation about how many of the editorial issues that assail science and health journalism have become more pronounced during the pandemic (by dint of the pandemic being a science and health ‘event’). Even earlier, editors would switch off whenever they’d read science news, but then the news would usually be about a new study discussing something coffee could or couldn’t do to the heart.

While that’s worrying, the news was seldom immediately harmful, and lethal even more rarely. In a pandemic, on the other hand, bullshit that makes it to print hurts in two distinct ways: by making things harder for good health journalists to get through to readers with the right information and emphases, and of course by encouraging readers to do things that might harm them.

But does this mean editors need to know the ins and outs of the subject on which they’re publishing articles? This might seem like a silly question to ask, but it’s often the reality in small newsrooms in India, where one editor is typically in charge of three or four beats at a time. And setting aside the argument that this arrangement is a product more of complacency, and of not taking science news seriously, than of resource constraints, it’s not necessarily a bad thing either.

For example, a political editor may not be able to publish incisive articles on, say, developments in the art world, but they could still help by identifying reliable news sources and tapping their network to commission the right reporters. And if the organisation spends a lot more time covering political news, and with more depth, this arrangement is arguably preferable from a business standpoint.

Of course, such a setup is bound to be error-prone, but my contention is that it doesn’t deserve to be written off either, especially this year – when more than a few news publishers suddenly found themselves in the middle of a pandemic even as they couldn’t hire a health editor because their revenues were on the decline.

For their part, then, publishers can help minimise errors by being clear about what editors are expected to do. For example, a newsroom can’t possibly do a great job of covering science developments in the country without a science editor; axiomatically, non-science editors can only be expected to do a superficial job of standing in for a science editor.

This said, the question still stands: what are editors to do, specifically – especially those suddenly faced with the need to cover a topic they’re only superficially familiar with? The answer is important not just to help editors but also to maintain accountability. For example, though I’ve seldom covered health stories in the past, I don’t get to throw my hands up as The Wire’s science, health and environment editor when I publish a faulty story about, say, COVID-19. It is a bit of a ‘damned if you do, damned if you don’t’ situation, but it’s not entirely unfair either: it’s the pandemic, and The Wire can’t not cover it!

In these circumstances, I’ve found one particular way to mitigate the risk of damnation, so to speak, quite effective. I recently edited an article in which the language of a paragraph seemed off to me because it wasn’t clear what the author was trying to say, and I kept pushing him to clarify. Finally, after 14 emails, we realised he had made a mistake in the calculations, and we dropped that part of the article. More broadly, I’ve found that nine times out of ten, pushback on editorial grounds also helps identify and resolve technical issues. If I think the underlying argument hasn’t been explained clearly enough, I send a submission back even if it is scientifically accurate or whatever.

Now, I’m not sure how robust this relationship is in the larger scheme of things. For one, this ‘mechanism’ will obviously fail when clarity of articulation and soundness of argument are unrelated, such as with authors for whom English is a second language. For another, the omnipresent – and omnipotent – confounding factor known as unknown unknowns could keep me from understanding an argument even when it is well-made, putting me at risk of turning down good articles simply because I’m too dense or ignorant.

But to be honest, these risks are quite affordable when the choice is between damnation for an article I can explain and damnation for an article I can’t. I can (and do) improve the filter’s specificity/sensitivity 😄 by reading widely myself, to become less ignorant, and by asking authors to include a brief of 100-150 words in their emails clarifying, among other things, their article’s intended effect on the reader. And fortuitously, when authors are pushed to be clearer about the point they’re making, it seems they also tend to reflect on the parts of their reasoning that lie beyond the language itself.

Where is the coolest lab in the universe?

The Large Hadron Collider (LHC) performs an impressive feat every time it accelerates billions of protons to nearly the speed of light – and not in terms of the energy alone. For example, you release more energy when you clap your palms together once than the LHC imparts to a single proton. The impressiveness arises from the fact that the energy of your clap is distributed among billions of atoms, whereas the proton’s energy resides entirely in one particle. It’s impressive because of the energy density.
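A quick back-of-the-envelope check, with rough numbers of my own (a clap releasing ~1 J shared among ~10^23 atoms, and an LHC proton at the Run-2 beam energy of 6.5 TeV; both are assumptions, not figures from any source):

```python
# Clap-versus-proton energy density, using assumed ballpark values.
EV_TO_J = 1.602e-19                 # joules per electronvolt

clap_energy = 1.0                   # J released by one clap (rough guess)
clap_atoms = 1e23                   # atoms sharing that energy (rough guess)
proton_energy = 6.5e12 * EV_TO_J    # J carried by a single 6.5-TeV proton

print(f"clap: {clap_energy:.0f} J total, {clap_energy / clap_atoms:.1e} J per atom")
print(f"proton: {proton_energy:.1e} J in one particle")
print(f"per-particle ratio: {proton_energy / (clap_energy / clap_atoms):.0e}x")
```

Even though the clap releases about a million times more energy in total, each atom involved carries some 17 orders of magnitude less energy than the lone proton does.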

A proton like this has a very high kinetic energy. When lots of protons with such amounts of energy come together to form a macroscopic object, the object will have a high temperature. This is the relationship between subatomic particles and the temperature of the object they make up. The outermost layer of a star is so hot because its constituent particles have very high kinetic energies. Blue hypergiant stars, among the hottest stars known, like Eta Carinae have surface temperatures of up to 36,000 K and a surface 57,600-times larger than the Sun’s. This is impressive not on the temperature scale alone but also on the energy-density scale: Eta Carinae ‘maintains’ a higher temperature over a much larger area.
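As a consistency check (mine, using only the figure above): a sphere’s surface area scales as the square of its radius, so a surface 57,600-times the Sun’s corresponds to a radius about 240-times the Sun’s, since

$$\frac{A_\star}{A_\odot} = \left(\frac{R_\star}{R_\odot}\right)^{2} = 240^2 = 57{,}600.$$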

Now, a certain headline and variations thereof have been doing the rounds of late, and they piqued me because I’m quite reluctant to believe they’re true.

The headline in question – about the ‘coolest lab in the universe’ – is from Nature News. To be sure, I’m not doubting the veracity of any of the claims. Instead, my dispute is with the ‘coolest lab’ claim, and on entirely qualitative grounds.

The feat mentioned in the headline involves physicists using lasers to cool a tightly controlled group of atoms to near absolute zero, causing quantum-mechanical effects to become visible on the macroscopic scale – the feature Bose-Einstein condensates are celebrated for. Most, if not all, atomic cooling techniques endeavour, in different ways, to extract as much of an atom’s kinetic energy as possible. The more energy they remove, the lower the resulting temperature.

The reason the headline piqued me was that it trumpets a place in the universe as the “universe’s coolest lab”. Be that as it may (though it may not technically be so; the physicist Wolfgang Ketterle has achieved lower temperatures before), lowering the temperature of an object to a remarkable sliver of a kelvin above absolute zero is one thing, but lowering the temperature over a very large area or volume must be quite another. For example, an extremely cold object inside a tight container the size of a shoebox (I presume) must be missing much less energy than a not-so-extremely-cold volume the size of, say, a star.

This is the source of my reluctance to acknowledge that the International Space Station could be the “coolest lab in the universe”.

While we regularly equate heat with temperature without much consequence to our judgment, the latter can be described by a single number pertaining to a single object, whereas the former – heat – is energy flowing from a hotter to a colder region of space (or the other way, with the help of a heat pump). In essence, the amount of heat is a function of two differing temperatures. In turn, when looking for the “coolest” place, it could matter that we look not just for low temperatures but for lower temperatures within warmer surroundings. This is because it’s harder to maintain a lower temperature in such settings – for the same reason we use thermos flasks to keep liquids hot: if the liquid is exposed to the ambient atmosphere, heat will flow from the liquid to the air until the two achieve thermal equilibrium.
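One simple, standard way to make the ‘function of two temperatures’ point concrete is Newton’s law of cooling (my addition for illustration; the argument doesn’t depend on it): the rate at which heat flows out of a body depends on the temperature difference with its surroundings, not on either temperature alone:

$$\dot{Q} = hA\,\left(T_{\text{body}} - T_{\text{surroundings}}\right),$$

where h is a heat-transfer coefficient and A the exposed surface area.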

An object is said to be cold if its temperature is lower than that of its surroundings. Vladivostok in Russia is cold relative to most of the world’s other cities, but if Vladivostok were the sole human settlement, and no one had ever ventured beyond it, the human idea of cold would have to be recalibrated from, say, 10º C to -20º C. The temperature required to achieve a Bose-Einstein condensate is that at which non-quantum-mechanical effects are so stilled that they stop interfering with the much weaker quantum-mechanical effects; this temperature is given by a formula but is typically lower than 1 K.
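For reference, the formula in question – in its textbook form for a uniform, ideal Bose gas; real trapped-atom experiments use a modified version – sets the transition temperature according to the gas’s number density n and the atoms’ mass m:

$$T_c = \frac{2\pi\hbar^2}{m k_B}\left(\frac{n}{\zeta(3/2)}\right)^{2/3}, \qquad \zeta(3/2) \approx 2.612.$$

At the densities of dilute atomic gases, this works out to fractions of a microkelvin.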

The deep nothingness of space itself has a temperature of 2.7 K (-270.45º C); when all the stars in the universe die and there are no more sources of energy, all hot objects – like neutron stars, colliding gas clouds or molten rain over an exoplanet – will eventually have to cool to 2.7 K to achieve equilibrium (notwithstanding other eschatological events).

This brings us, figuratively, to the Boomerang Nebula – in my opinion the real coolest lab in the universe, because it maintains a very low temperature across a very large volume; its ‘coolness density’, so to speak, is significantly higher. It is a protoplanetary nebula, a phase in the lives of stars within a certain mass range, in which the star sheds some of its mass that expands outwards in the form of a gas cloud lit by the star’s light. The gas in the Boomerang Nebula, shed by a dying red giant star turning into a white dwarf at the centre, has been expanding outward at a little over 160 km/s (576,000 km/hr) for the last 1,500 years or so. This rapid expansion leaves the nebula with a temperature of 1 K. Astronomers discovered this cold mass in late 1995.

(“When gas expands, the decrease in pressure causes the molecules to slow down. This makes the gas cold”: source.)

The experiment to create a Bose-Einstein condensate in space – or for that matter anywhere on Earth – transpired in a well-insulated container that, apart from the atoms to be cooled, held a vacuum. As such, to the atoms, the container was their universe, their Vladivostok. They were not at risk of the container’s coldness inviting heat from its surroundings and destroying the condensate. The Boomerang Nebula doesn’t have this luxury: as a nebula, it’s exposed to the vast emptiness, and the 2.7 K, of space at all times. So even though the temperature difference between the nebula and space is only 1.7 K, the nebula has to constantly contend with the equilibrating ‘pressure’ space imposes.

Further, according to Raghavendra Sahai (as quoted by NASA), one of the discoverers of the nebula’s cold spot, it’s “even colder than most other expanding nebulae because it is losing its mass about 100-times faster than other similar dying stars and 100-billion-times faster than Earth’s Sun.” This implies there is a great mass of gas – and so of atoms – at a temperature of around 1 K.

Altogether, the fact that the nebula has maintained a temperature of 1 K for around 1,500 years (plus a 5,000-year offset, to account for the light-travel time from the nebula) and across more than 3.14 trillion km makes it a far cooler “coolest” place, lab, whatever.

Go easy on the dexamethasone hype

  1. The people involved with the RECOVERY clinical trial have announced via statements to the press that they have found very encouraging results about the use of dexamethasone in people with severe COVID-19 who had to receive ventilator support. However, the study’s data isn’t available for independent verification yet. So irrespective of how pumped the trial’s researchers are, wait. Studies in more advanced stages of the publishing process have been sunk before.
  2. Dexamethasone is relatively cheap and widely available. But that doesn’t mean it will continue to remain that way in future. The UK government has already announced it has stockpiled 200,000 doses of the drug, and other countries with access to supply may follow suit. Companies that manufacture the drug may also decide to hike prices, foreseeing rising demand, leading to further issues of availability.
  3. Researchers found in their clinical trial that the drug reduced mortality by around 33% among COVID-19 patients who needed ventilator support, and by about 20% among those who needed oxygen. This describes a very specific use-case, and governments must ensure that if the drug is repurposed for COVID-19, its use is limited to people who fulfil the specific criteria that benefit from it. As the preliminary report notes, “It is important to recognise that we found no evidence of benefit for patients who did not require oxygen and we did not study patients outside the hospital setting.” In addition, dexamethasone is a steroid, and indiscriminate use is quite likely to lead to adverse side effects with zero benefits.
  4. The novel coronavirus pandemic is not a tragedy in the number of deaths alone. An important long-term effect will be disability, considering the virus has been known to affect multiple parts of the body – the heart, brain and kidneys, apart from the lungs themselves – even among patients who have survived. Additionally, dexamethasone cuts mortality only in patients at a later stage of COVID-19. So go easy on words like ‘game-changer’. Dexamethasone isn’t exactly one, because game-changers need to allow people to contract the virus but not fear disability or for their lives…
  5. … or in fact not fear contracting the virus at all – like a vaccine or an efficacious prophylactic. This is very important, for example, because of what we have already seen in Italy and New York. Many patients who don’t need ventilator support or oxygen care still need hospital care, and the unavailability of hospital beds and skilled personnel can lead to more deaths than may be due to COVID-19. This ‘effect’, so to speak, is more pronounced in developing nations, many of which have panicked and formulated policies that pay way more or way less attention to COVID-19 than is due. In India, for example, nearly 900 people have died due to the lockdown itself.

Journalistic entropy

Say you need to store a square image, 1,000 pixels to a side, with the smallest filesize (setting aside compression techniques). The image begins with the colour #009900 on the left edge and, moving right, gradually blends into #1e1e1e at the rightmost edge. Two simple storage methods come to mind: you could either encode the colour information of every pixel in a file and store that file, or you could determine a mathematical function that, given the inputs #009900 and #1e1e1e, generates the image in question.

The latter method seems more appealing, especially for larger canvases of patterns composed by a single underlying function. In such cases, it is obviously more advantageous to store the image as the output of a function to achieve the smallest filesize.
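A toy sketch of the two methods, assuming a simple linear blend between the two endpoint colours (my choice of ƒ; the blending function isn’t specified above):

```python
# Compare storing every pixel explicitly vs storing the generating function.
WIDTH = HEIGHT = 1000
LEFT, RIGHT = (0x00, 0x99, 0x00), (0x1E, 0x1E, 0x1E)   # #009900 -> #1e1e1e

def colour_at(x):
    """The generating function: interpolate each RGB channel by position."""
    t = x / (WIDTH - 1)
    return tuple(round(a + t * (b - a)) for a, b in zip(LEFT, RIGHT))

explicit_bytes = WIDTH * HEIGHT * 3     # method 1: 3 bytes (R, G, B) per pixel
functional_bytes = 2 * 3 + 2 * 4        # method 2: two colours + two dimensions

print(explicit_bytes, "bytes as pixels vs", functional_bytes, "bytes as a function")
# -> 3000000 bytes as pixels vs 14 bytes as a function
```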

Now, in information theory (as in thermodynamics), there is an entity called entropy: it describes the amount of information you don’t have about a system. In our example, imagine that the colour #009900 blends into #1e1e1e from left to right save for a strip along the right edge, say 50 pixels wide, in which each pixel can assume a random colour. To store this image, you’d have to save it as the sum of two functions: ƒ(x, y), where x = #009900 and y = #1e1e1e, plus one function to colour the pixels lying in the 50-px strip on the right side. Obviously this will increase the filesize of the stored function.

Even more, imagine you were told that 200,000 of the 1,000,000 pixels in the image would assume random colours. The underlying function becomes even clumsier: a sum of ƒ(x, y) and a function R that randomly selects 200,000 pixels and then randomly colours them. The outputs of this function R stand for the information about the image that you can’t have beforehand; the more such information you lack, the more entropy the image is said to have.
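The incompressibility of that random information can be roughed out too, assuming each random pixel’s colour is uniform over 24-bit RGB and the 200,000 positions are chosen uniformly (both my assumptions):

```python
import math

n, k = 1_000_000, 200_000
p = k / n

# Bits to record *which* pixels are random: log2 C(n, k) ~ n * H(k/n),
# where H is the binary entropy function.
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
position_bits = n * H

# Bits to record each random pixel's colour.
colour_bits = k * 24

print(f"~{(position_bits + colour_bits) / 8 / 1e6:.2f} MB of irreducible information")
# -> ~0.69 MB that no clever choice of function can compress away
```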

The example of the image was simple but sufficiently illustrative. In thermodynamics, entropy is similar to randomness vis-à-vis information: it corresponds to the amount of thermal energy a system contains that can’t be used to perform work. From the point of view of work, it’s useless thermal energy (including heat) – energy that can’t contribute to moving a turbine blade, powering a motor or motivating a system of pulleys to lift weights. Instead, it is thermal energy motivated by and directed at other impetuses.
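The Carnot bound puts a number on the ‘useless’ fraction (I’m adding this for concreteness; the analogy above doesn’t depend on it): of an amount of heat Q drawn from a source at temperature T_h, with surroundings at T_c, at most

$$W_{\max} = Q\left(1 - \frac{T_c}{T_h}\right)$$

can become work; the remainder must be dumped into the surroundings.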

As it happens, this picture could help clarify, or at least make more sense of, a contemporary situation in science journalism. Earlier this week, health journalist Priyanka Pulla discovered that the Indian Council of Medical Research (ICMR) had published a press release last month, about the serological testing kit the government had developed, with the wrong specificity and sensitivity data. Two individuals she spoke to – one from ICMR and another from the National Institute of Virology, Pune, which actually developed the kit – admitted the mistake when she contacted them. Until then, neither organisation had issued a clarification, even though both individuals are likely to have known of the mistake at the time the release was published.

Assuming for a moment that this mistake was an accident (my current epistemic state is ‘don’t know’), it would indicate ICMR has been inefficient in the performance of its duties, forcing journalists to respond to it in some way instead of focusing on other, more important matters.

The reason I’m tending to think of such work as entropy, and not work per se, is that such instances – whereby journalists are forced to respond to an event or action characterised by the existence of trivial resolutions – seem to be becoming more common.

It’s of course easier to argue that what I consider trivial may be nontrivial to someone else, and that these events and actions matter to a greater extent than I’m willing to acknowledge. However, I’m personally unable to see beyond the fact that an organisation with the resources and, currently, the importance of ICMR shouldn’t have had a hard time proof-reading a press release that was going to land in the inboxes of hundreds of journalists. The consequences of the mistake are nontrivial but the solution is quite trivial.

(There is another feature in some cases: of the absence of official backing or endorsement of any kind.)

So as such, it required work on the part of journalists that could easily have been spared – effort that could instead have been directed at more meaningful, more productive endeavours. Here are four more examples of such events/actions, wherein the non-triviality is significantly and characteristically lower than that attached to formal announcements, policies, reports, etc.:

  1. Withholding data in papers – In the most recent example, ICMR researchers published the results of a seroprevalence survey of 26,000 people in 65 districts around India, concluding that the prevalence of the novel coronavirus in this population was 0.73%. However, in their paper, the researchers included neither a district-wise breakdown of the data nor the confidence intervals for each data-point, even though they had this information (it’s impossible to compute the results they report without these details). As a result, it’s hard for journalists to determine how reliable the results are, and whether they really support the official policies on epidemic-control interventions that will soon follow.
  2. Publishing faff – On June 2, two senior members of the Directorate General of Health Services, within India’s Union health ministry, published a paper (in a journal they edited) that, by all counts, made nonsensical claims about India’s COVID-19 epidemic becoming “extinguished” sometime in September 2020. Either the pair wasn’t aware of its collective irresponsibility, or it intended (putting it benevolently) to refocus various people’s attention on their work and away from whatever the duo deemed embarrassing. Either way, the claims in the paper wound their way into two news syndication services, PTI and IANS, and eventually onto the pages of a dozen widely read news publications in the country. In effect, there were two levels of irresponsibility at play: one embodied by the paper, and the other by the syndication services’ and final publishers’ lack of due diligence.
  3. Making BS announcements – This one is fairly common: a minister or senior party official will say something silly, such as that ancient Indians invented the internet, and ride the waves of polarising debate, rapidly devolving into acrimonious flamewars on Twitter, that follow. I recently read (in The Washington Post I think, but I can’t find the link now) that it might be worthwhile for journalists to try and spend less time on fact-checking a claim than it took someone to come up with that claim. Obviously there’s no easy way to measure the time some claims took to mature into their present forms, but even so, I’m sure most journalists would agree that fact-checking often takes much longer than bullshitting (and then broadcasting). But what makes this enterprise even more grating is that it is orders of magnitude easier to not spew bullshit in the first place.
  4. Conspiracy theories – This is the most frustrating example of the lot because, today, many of the originators of conspiracy theories are television journalists, especially those backed by government support or vice versa. While fully acknowledging the deep-seated issues underlying both media independence and the politics-business-media nexus, numerous pronouncements by so many news anchors have only been akin to shooting ourselves in the foot. Exhibit A: shortly after Prime Minister Narendra Modi announced the start of demonetisation, a beaming news anchor told her viewers that the new 2,000-rupee notes would be embedded with chips to transmit the notes’ location real-time, via satellite, to operators in Delhi.

Perhaps this entropy – i.e. the amount of journalistic work not available to deal with more important stories – is not only the result of mischievous actors attempting to keep journalists, and the people who read those journalists, distracted; it is also a manifestation of a whole industry’s inability to cope with the mechanisms of a new political order.

Science journalism itself has already experienced a symptom of this change when pseudoscientific ideas became more mainstream, even entering the discourse of conservative political groups, including that of the BJP. In a previous era, if a minister said something, a reporter was to drum up a short piece whose entire purpose was to record “this happened”. And such reports were the norm and in fact one of the purported roots of many journalistic establishments’ claims to objectivity, an attribute they found not just desirable but entirely virtuous: those who couldn’t be objective were derided as sub-par.

However, if a reporter were simply to report today that a minister said something, she places herself at risk of amplifying bullshit to a large audience if what the minister said was “bullshit bullshit bullshit”. So just as politicians’ willingness to indulge in populism and majoritarianism to the detriment of society and its people has changed, so also must science journalism change – as it already has at many publications, especially in the West – to ensure each news report fact-checks the claims it contains, especially the pseudoscientific ones.

In the same vein, it’s not hard to imagine that journalists are often forced to scatter by the compulsions of an older way of doing journalism, and that they should regroup on the foundations of a new agreement that lets them ignore some events so that they can better dedicate themselves to the coverage of others.

Featured image credit: Татьяна Чернышова/Pexels.

The number of deaths averted

What are epidemiological models for? You can use models to inform policy and other decision-making. But you can’t use them to manufacture a number to advertise in order to draw praise – yet that appears to be what the government has done vis-à-vis the number of deaths averted by India’s nationwide lockdown.

When the government says 37,000 deaths were averted, how can we know if this figure is right or wrong? A bunch of scientists complained that the model wasn’t transparent, so its output had to be taken with a cupful of salt. But as an article published in The Wire yesterday noted, these scientists were asking the wrong questions: the number of deaths averted is only a decoy.

So say the model had been completely transparent. I don’t see why we should still care about the number of deaths averted. First, such a model is trying to determine the consequences of an action that was not performed, i.e. the number of people who might have died had the lockdown not been imposed.

This scenario is reminiscent of a trope in many time-travel stories. If you went back in time and caused someone to do Y instead of X, would your reality change or stay the same considering it’s in the consequent future of Y instead of X? Or as Ronald Bailey wrote in Reason, “If people change their behaviour in response to new information unrelated to … anti-contagion policies, this could reduce infection growth rates as well, thus causing the researchers to overstate the effectiveness of anti-contagion policies.”

Second, a model to estimate the number of deaths averted by the lockdown will in effect attempt to isolate a vanishingly narrow strip of the lockdown’s consequences to cheer about. This would be nothing but extreme cherry-picking.

A lockdown has many effects, including movement restrictions, stay-at-home orders, disrupted supply of essential goods, closing of businesses, etc. Most, if not all, of them are bound to exact a toll on one’s health. So the number of deaths the lockdown averted should be ‘adjusted’ against, say, the number of people who couldn’t get life-saving surgeries, the number of migrant labourers who died of heat exhaustion, the number of TB patients who developed MDR-TB because they couldn’t get their medicines on time, even the number of daily-wage earners’ children who died of hunger because their parents had no income.

So the only models that can hope to estimate a meaningful number of deaths averted by the lockdown will also have simplified the context so much that the mathematical form of the lockdown will be shorn of all practical applicability or relevance – a quantitative catch-22.

Third, the virtue of the number of deaths averted is a foregone conclusion. That is, whatever its value is, it can only be a good thing. So as an indisputable – and therefore unfalsifiable – entity, there is nothing to be gained or lost by interrogating it, except perhaps to elicit a clearer view of the model’s innards (if possible, and only relative to the outputs of other models).

Finally, the lockdown will by design avert some deaths – i.e. D > 0 – but D being greater than zero wouldn’t mean the lockdown was a success so much as it would be a self-fulfilling prophecy, whatever D’s value. And since no one knows what the value of D is or ought to be, much less what it could have been, a model can at best come up with a way to estimate D – not claim a victory of any kind.

So it would seem the ‘number of deaths averted’ metric is a ploy disguised as a legitimate mathematical problem whose real purpose is to lure the ‘quants’ towards something they think challenges their abilities without realising they’re also being lured away from the more important question they should be asking: why solve this problem at all?

The costs of correction

I was slightly disappointed to read a report in the New York Times this morning. Entitled ‘Two Huge COVID-19 Studies Are Retracted After Scientists Sound Alarms’, it discussed the implications of two large COVID-19 studies recently being retracted by the leading medical journals that had published them, the New England Journal of Medicine and The Lancet. My sentiment stemmed from a paragraph in the report, and some after, suggesting that these retractions raise troubling questions about the state of scientific research and peer review.

I don’t know that just these two retractions “raise troubling questions”, as if these questions weren’t already being asked well before these incidents. The suggestion that the lack of peer review – or, for that matter, peer review in its current form (opaque, unpaid) – could be to blame is more frustrating, as is the article’s focus on the quality of the databases used in the two studies instead of the overarching issue. Perhaps this is yet another manifestation of the NYT’s crisis under Trump? 😀

One of the benefits of the preprint publishing system is that peer-review is substituted with ‘open review’. And one of the purposes of preprints is that the authors of a study can collect feedback and suggestions before publishing in a peer-reviewed journal instead of accruing a significant correction cost post-publication, in the form of corrections or retractions, both of which continue to carry a considerable amount of stigma. So as such, the preprints mode ensures a more complete, a more thoroughly reviewed manuscript enters the peer-review system instead of vesting the entire burden of fact-checking and reviewing a paper on a small group of experts whose names and suggestions most journals don’t reveal, and who are generally unpaid for their time and efforts.

In turn, the state of scientific research is fine. It would simply be even better if we reduced the costs associated with correcting the scientific record instead of heaping more penalties on that one moment, as the conventional system of publishing does. Conventional journals – ‘conventional’ in this sphere seems to be another word for ‘closed-off’ – also have an incentive to refuse to publish corrections or perform retractions, because they’ve built themselves up on claims of being discerning, thorough and reliable. Retractions are a black mark on their record. Elisabeth Bik has often noted how long journals take to even acknowledge entirely legitimate complaints about papers they’ve published, presumably for this reason.

There really shouldn’t be any debate on which system is better – but sadly there is.

Why we need *some* borders between us

Borders are often a bad thing because they create separation that is unconducive to what are generally considered socially desirable outcomes. And they’re often instituted to maximise political outcomes, especially of the electoral variety. However, as electoral politics – and the decisions politicians make leading up to elections – become increasingly divisive, the people’s perception of politics, especially among the middle classes, simultaneously becomes more cynical. At some point, those engaged in less political activities could even begin to see politics as a meaningless enterprise engaged solely in furthering the interests of the powerful.

This is a wholly justified conclusion given the circumstances but it’s also saddening since this cynicism is almost always paid for by writing off all political endeavours, and all the borders they maintain – and it is even more saddening now, in this time of protests, riots, apathy and deaths among the poor of hunger, of all things. This particular point is worth highlighting more now because space, especially human spaceflight, is in the news. Elon Musk’s SpaceX recently launched two astronauts to the International Space Station in history’s first crewed mission by a non-governmental company (that still subsists mostly on government funds).

For many decades, creators, engineers and officials alike have billed space as an escape, particularly in two ways. First, as a material volume of the universe that humanity is yet to occupy in any meaningful way, space is a frontier – a place other than Earth where there are some opportunities to survive but more importantly which could present a fresh start, a new way to do things that apparently benefits from millennia of civilisation on Earth that has only left us with great inequality and prejudice. Second, as a vast emptiness composed of literally nothing for billions of kilometres at a time, space imposes a ‘loneliness tax’ on Earth that – as many spaceflight entrepreneurs are fond of saying – should prompt us to remember that “we’re all in this together”.

However, the problem with both perspectives is that they gloss over borders, and when some borders disappear, our awareness of inequality disappears while inequality itself doesn’t. A common refrain aspiring spacefarers like to pitch is of the view of Earth from the Moon, accompanied by a gruff but nonetheless well-intentioned reminder that borders are of our own making, and that if we got rid of them and worked in humanity’s best-interests as a whole, we’d be able to achieve great things.

I call bullshit because without borders to constantly remind ourselves that invisible lines exist in the ground as well as in our minds that a Dalit or a black person can’t cross, no Dalit or black person – or even many women for that matter – can enter the spaceflight programme, leave alone get to the Moon.

More broadly, what many of those engaged in less-political work see as “unnecessary borders” are really discomfiting borders, a fact that became immutably apparent during India’s #MeToo uprising on Twitter in October-November 2018. Then, the mass of allegations and complaints pouring in every day indicated, among other things, that when inequality and discrimination have become ubiquitous, affording men and women equal opportunities by way of redressal can’t make the inequality and discrimination go away. Instead, women, and indeed all underprivileged groups, need affirmative action: to give more women, more Dalits, more black people, more transgender people, etc. access to more opportunities for a time until both the previously privileged groups and the newly privileged groups are on equal footing. It’s only then that they can really become equals.

A popular argument against this course of action has been that it will only create a new asymmetry instead of eradicating the old one. No; it’s important to recognise that we don’t need to eradicate privileges by eradicating opportunities, but to render privileges meaningless by ensuring all people have equal access to every new opportunity that we develop.

Another contention, though it doesn’t dress like a contention, is that we should also discuss why it’s important to have people of diverse identities around the table. But to me, this view is awfully close to the expectation of people from underprivileged groups to justify themselves, often more than those from privileged groups ever have for the same or equal positions. Instead, to quote Tarun Menon, of the National Institute for Advanced Studies, Bengaluru: “Deliberative democracy” – “a form of democracy in which deliberation is central to decision-making” (source) – “is key to any well-ordered democratic society, both because it helps ensure that a variety of concerns are taken into account in democratic decision-making, and because it grants legitimacy to decision-making by making it participatory.”

This is why borders are important – to define groups that need to be elevated, so to speak; without them, our economic and political structures will continue to benefit who they always have. And this is also why borders not used to achieve socially desirable outcomes are nothing but divides.

More importantly from the spaceflight bros’ point of view, when the borders we do need are erased, space will mostly be filled with white men, and a proportionately fewer number of people of other racial, ethnic, gender and caste identities – if at all.

Featured image: Daria Shevtsova/Pexels.

Ocean-safe consumption

Just spotted an ad on the website of The Better India – a journalism website that focuses on “positive stories” – for a combination of ‘ocean-safe’ surface cleaners.

India’s nationwide lockdown has many important lessons – including that it wasn’t useful in slowing the spread of the novel coronavirus through the Indian population; and though there’s no way yet to tell if it was useless instead, state opacity, data manipulation, false advertising, medical research devoid of science, struggling hospitals and apathy towards the poor all make it so. This said, two lessons in particular have been decidedly positive: the air becoming cleaner, at least to see through, and the Ganga river becoming cleaner, reportedly even to drink from.

K.A.S. Mani, a hydrologist, observed shortly after the latter was reported that rivers could clean themselves in a matter of weeks only if we took a break from stuffing them with pollutants – and axiomatically that when governments spend crores of rupees on fancy technological solutions and set themselves deadlines that are years away to achieve the same goals, they’re probably not doing it right. The final message for the people is simple, and what it’s always been: if you want to protect the rivers – or for that matter the oceans – consume less.

This is also what makes any attempt to combine consumerism with eco-friendliness absurd, including The Better India’s advertisement for a combination of different surface cleaners.

I admit their business model is worth considering: if you subscribe to their ‘service’, they’ll ship refill pouches to your place every month, whose contents you can store in the bottle you purchased the first time round.

(However, I’m skeptical of the claims about the cleaning substances, per the FAQ: “Our cleaners for laundry and dishwashing contain enzymes in addition to plant-based surfactants. These enzymes are lab-processed. Floor and toilet cleaners contain active microbes that create enzymes while performing the cleaning action.” Quite vague. I’m also very skeptical of the “non-toxic” bit: toxicity is highly context-specific, and the claim can’t possibly mean the cleaners are safe to drink!)

Most of all, none of this is “ocean-safe” – or even ocean-friendly – by any stretch of imagination. Bottles, refill pouches and cleaning agents still need to be made and shipped to households – all processes that will generate trash. It doesn’t make sense to claim simply that the contents of the bottles are unlikely to harm the ocean when spilled into the water (and even then I’d like to see some test results). What it is is very marginally less offensive to the world’s water bodies, where our waste eventually ends up.

And if anyone asks if I have a better idea: I don’t, but that doesn’t mean I get to pretend that what I’m doing is “safe” or “friendly” when it’s not.

Note: This post was updated on June 2, 2020, at 3.30 pm to clarify the lockdown’s usefulness in more detail.

Eight years

On June 1 last year, I wrote:

Today, I complete seven years of trying to piece together a picture of what journalism is and where I fit in.

Today, I begin my ninth year as a journalist. I’m happy to report I’m not so confused this time round, if only because in the intervening time, two things have taken shape that have allowed me to channel my efforts and aspirations better, leaving less room for at least some types of uncertainty.

The first is The Wire Science, which was born as an idea around August 2019 and launched as a separate website in February 2020. From The Wire’s point of view, the vision backing the product is “to build a constituency for science journalism – of contributors as well as readers – and drive a science journalism ecosystem.”

For me, this is in addition an opportunity to publish high-quality science writing that breaks away from the instrumental narratives that dominate most journalistic science pieces in India today.

The second thing that took shape was our readers’ and supporters’ appreciation for The Wire’s work in general. I like to think we’re slowly breaking even on this front, indicating that we’re doing something right.

On these notes of focus, progress and hope – even though the last 12 months have been terrible in many ways – I must say I do look forward to the next 12 months. I’m sure lots of things are going to go wrong, just as they’ve been going wrong, but for once it also feels like there are going to be meaningful opportunities to do something about them.

The virus and the government

In December 2014, public health researchers and activists gathered at a public forum in Cambridge, Massachusetts, to discuss how our perception of diseases and their causative pathogens influences our ideas of what we can and can’t do to fight them. According to a report published in The Harvard Gazette:

The forum prompted serious reflection about structural inequalities and how public perceptions get shaped, which often leads to how resources are directed. “The cost of believing that something is so lethal and fatal is significant,” [Paul] Farmer said.

[Evelynn] Hammonds drew attention to how perceptions of risk about Ebola had been shaped mostly through the media, while noting that epidemics “pull the covers off” the ways that the poor, vulnerable, and sick are perceived.

These statements highlight the importance, during a pandemic, of a free press with a spine – instead of one that bends to the state’s will and fails to respect the demands of good health journalism while purporting to practise it.

We’ve been seeing how pliant journalists, especially on news channels like India Today and Republic and in the newsrooms of digital outlets like Swarajya and OpIndia, try so hard so often to defend the government’s claims about doing a good job of controlling the COVID-19 epidemic in India. As a result, they’ve frequently participated – willingly or otherwise – in creating the impression that a) the virus is deadly, and b) all Muslims are deadly.

Neither of course is true. But while political journalists, who in India have generally been quite influential, have helped disabuse people of the latter notion, the former has attracted fewer rebuttals principally because the few good health journalists and the vocal scientists operating in the country are already overworked thanks to the government’s decoy acts on other fronts.

As things stand, beware anyone who says the novel coronavirus is deadly, if only because a) all signs indicate that it’s far less damaging to human society than tuberculosis is every year, and b) it’s an awfully powerful excuse that allows the government to give up and simply blame the virus for a devastation that – oddly enough – seems to affect the poor, the disabled and the marginalised far more than the law of large numbers can account for.