Lander Foquet has a worrying Twitter thread about how a poster abstract – not even a whole poster, just an abstract – is making the rounds in medical disinformation circles.
Foquet says “a poster is never peer-reviewed.” This might be true for the meeting in question (the American Heart Association Scientific Sessions 2021), but some conferences do peer-review their submissions. It is rare, though.
Regardless, stories occasionally crop up from some academic conference or another where someone gives a poster or talk that is at odds with the views of the field. Once, the well-known racist J. Philippe Rushton gave a poster at the Neuroscience meeting (sometime in the 1990s, can’t recall the year). I’ve seen other cases where people brought some sort of presentation that most society members said, “That shouldn’t have been presented at this meeting.”
You can read the thread for a detailed debunking of the particulars of the abstract, but the issue of vetting conference posters has been on my mind for a while. In an interview, I was asked about poster competitions. Part of the question was that posters aren’t peer reviewed, so is there not a risk that we might be rewarding dodgy work?
Because anything online (like conference abstracts) can be quickly weaponized, maybe it is time for conference organizers to implement some light peer review. A sanity check, if you will.
- A physics conference should not accept a poster about a claimed perpetual motion machine.
- A chemistry conference should not accept a poster claiming the elements are earth, air, fire, and water.
- A biology conference should not accept a poster about how natural selection isn’t real.
- A geology conference should not accept a poster about how the Earth is flat.
- An anthropology conference should not accept a poster about how Africans are inferior human beings.
And so on. In fact, now that I think of it, none of those topics should be permissible at any professional academic conference. These should not be controversial calls.
I know what the immediate objections from conference organizers will be.
It takes a lot of time and effort. This is undoubtedly true. Each one would have to be read by a human being with some level of professional expertise. (No, I don’t think machine learning is ready to tackle this yet.)
But how much is too much? Let’s look at the meeting that started Foquet on his thread.
Judging from the American Heart Association program planner, it looks like there are between three hundred and four hundred posters spread out over four days. Let’s say a team of four professionals reviewed abstracts. Maybe one person takes five minutes maximum to read an abstract. That’s twelve an hour. One person could read 96 abstracts in an eight-hour day; let’s round to 100 abstracts per person per day.
A team of four could have reviewed all the abstracts for the meeting in one day.
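The arithmetic above can be sketched as a quick back-of-envelope calculation. (The numbers are the assumptions from this post: five minutes per abstract, an eight-hour day, a team of four, and roughly 400 posters at the meeting.)

```python
import math

# Assumptions from the post, not official AHA figures.
MINUTES_PER_ABSTRACT = 5
HOURS_PER_DAY = 8
REVIEWERS = 4
ABSTRACTS = 400  # upper end of the estimated poster count

per_reviewer_per_hour = 60 // MINUTES_PER_ABSTRACT             # 12 abstracts/hour
per_reviewer_per_day = per_reviewer_per_hour * HOURS_PER_DAY   # 96 abstracts/day
per_reviewer_rounded = 100                                     # the post's rounding
team_per_day = per_reviewer_rounded * REVIEWERS                # 400 abstracts/day

days_needed = math.ceil(ABSTRACTS / team_per_day)
print(per_reviewer_per_hour, per_reviewer_per_day, days_needed)  # 12 96 1
```

So under these assumptions, a four-person team clears the whole meeting’s abstracts in a single working day.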
Obviously, this might not scale to the biggest conferences with tens of thousands of posters. But many conferences have hundreds of presentations, which seems potentially manageable.
The “time and effort” argument could be made for journals, too. Peer review takes time and effort for editors, and yet journals do it. Peer review is considered a mark of a high-quality product for journals, so why could the same not be true for conferences?
The second objection is that peer review stifles free discussion. Maybe, and there was a time when I think that argument would have weighed much more heavily for me. But not now.
The last few years have shown that “more free speech is always the answer” doesn’t take into account the power asymmetries that exist in the modern communication ecosystem. A well-organized misinformation campaign can spread bad content much faster than reasoned professionals can correct it. Professional organizations have often been anywhere from reluctant to perilously slow in combating misinformation.
Science still has credibility. Appearing at a scientific conference, even if not peer reviewed, gives a patina of authority and respectability. People with bad motives and worse science will attempt to squeeze into these venues if they have the chance.
Deplatforming works.
Update: Other potential examples of cases where peer review might have prevented some problems.
At the 2017 meeting of the Ecological Society of America (ESA), someone gave a presentation on human abortion.
Another year (unknown), at ESA again, someone gave a poster about insects in space. “He was having some obvious mental health issues.”
Update, 22 December 2021: The Mexican Heart Association has taken action on the abstract that prompted this post. Whether this will include stronger review of conference abstracts remains to be seen.
External links
Lander Foquet on Twitter