30 November 2021

Interview on the PolicyViz podcast

I recently spoke to Jon Schwabish (author of Better Data Visualizations, reviewed here) on the PolicyViz podcast, and I’m happy that the episode is now available wherever you listen to podcasts! 

Jon is a great person to talk to, and his questions got me thinking about some new topics that I hadn’t considered before.

This season, Jon has been experimenting with a video version of the podcast. I already knew of my bad speaking habits as an interviewee on audio (I go on tangents way too easily, I start sentences without knowing where they’ll land), but now I get to see entirely new bad habits (looking away from the camera, shifting my weight).

The show notes also contain a complete transcript in case you read faster than I can talk.

Related posts

Review: Better Data Visualizations

External links

PolicyViz podcast Episode #206: Zen Faulkes show notes 

25 November 2021

Why conference posters should be peer reviewed

Lander Foquet has a worrying Twitter thread about how a poster abstract – not even a whole poster, just an abstract – is making the rounds in medical disinformation circles. 

Foquet says “a poster is never peer-reviewed.” This might be true for the meeting in question (the American Heart Association Scientific Sessions 2021), but there are some conferences that do peer review their submissions. It is rare, though.

Regardless, there are occasionally stories that crop up from some academic conference or another where someone gives a poster or talk that is at odds with the views of the field. Once, the well-known racist Philippe Rushton gave a poster at the Neuroscience meeting (sometime in the 1990s, can’t recall the year). I’ve seen other cases where people brought some sort of presentation that most society members said, “That shouldn’t have been presented at this meeting.”

You can read the thread for a detailed debunking of the particulars of the abstract, but the issue of vetting conference posters has been on my mind for a while. In an interview, I was asked about poster competitions. Part of the question about competitions was that posters aren’t peer reviewed, so is there not a risk that we might be rewarding dodgy work?

Because anything online (like conference abstracts) can be quickly weaponized, maybe it is time for conference organizers to implement some light peer review. A sanity check, if you will.

  • A physics conference should not accept a poster about a claimed perpetual motion machine. 
  • A chemistry conference should not accept a poster claiming the elements are earth, air, fire, and water. 
  • A biology conference should not accept a poster about how natural selection isn’t real. 
  • A geology conference should not accept a poster about how the Earth is flat.
  • An anthropology conference should not accept a paper about how Africans are inferior human beings.

And so on. In fact, now that I think of it, none of those topics should be permissible at any professional academic conference. These should not be controversial calls.

I know what the immediate objections from conference organizers will be.

It takes a lot of time and effort. This is undoubtedly true. Each one would have to be read by a human being with some level of professional expertise. (No, I don’t think machine learning is ready to tackle this yet.)

But how much is too much? Let’s look at the meeting that started Foquet on his thread.

Judging from the American Heart Association program planner, it looks like there are between three hundred and four hundred posters spread out over four days. Let’s say a team of four professionals reviewed abstracts. Maybe one person takes five minutes maximum to read an abstract. That’s twelve an hour. One person could read 96 abstracts in an eight-hour day; let’s round to 100 abstracts per person per day. 

A team of four could have reviewed all the abstracts for the meeting in one day.
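The back-of-envelope math above can be laid out explicitly. All the numbers here are the assumptions from the text (five minutes per abstract, an eight-hour day, a four-person team, roughly 400 abstracts), not figures from any actual conference:

```python
# Sketch of the reviewer-workload estimate. Every input is an assumption
# taken from the text, not data from the American Heart Association meeting.

minutes_per_abstract = 5    # generous maximum time to read one abstract
hours_per_day = 8           # one working day per reviewer
reviewers = 4               # size of the hypothetical review team
abstracts = 400             # upper end of the estimated poster count

abstracts_per_hour = 60 // minutes_per_abstract           # 12 per reviewer
per_reviewer_per_day = abstracts_per_hour * hours_per_day  # 96; round to ~100
per_reviewer_per_day = 100                                 # rounded, as in the text

team_capacity_per_day = per_reviewer_per_day * reviewers   # 400 abstracts/day

# Ceiling division: how many days the team needs for the full program
days_needed = -(-abstracts // team_capacity_per_day)

print(f"One reviewer: ~{per_reviewer_per_day} abstracts/day")
print(f"Team of {reviewers}: {team_capacity_per_day} abstracts/day")
print(f"Days needed for {abstracts} abstracts: {days_needed}")
```

Even doubling the reading time to ten minutes per abstract only pushes the estimate to about two days of work for the team.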

Obviously, this might not scale to the biggest conferences with tens of thousands of posters. But many conferences have hundreds of presentations, which seems potentially manageable.

The “time and effort” argument could be made for journals, too. Peer review takes time and effort for editors, and yet journals do it. Peer review is considered a mark of a high quality product, so why could the same not be true for conferences?

The second objection is that peer review stifles free discussion. Maybe, and there was a time when I think that argument would have weighed much more heavily for me. But not now. 

The last few years have shown that “more free speech is always the answer” doesn’t take into account the power asymmetries that exist in the modern communication ecosystem. A well-organized misinformation campaign can spread bad content much faster than reasoned professionals can correct it. Professional organizations have often been anywhere from reluctant to perilously slow in combating misinformation.

Science still has credibility. Appearing in a scientific conference, even if not peer reviewed, gives a patina of authority and respectability. People with bad motives and worse science will attempt to squeeze into these venues if they have the chance.

Deplatforming works.

Update: Other potential examples of cases where peer review might have prevented some problems.

At the 2017 meeting of the Ecological Society of America (ESA), someone gave a presentation on human abortion.

Another year (unknown), at ESA again, someone gave a poster about insects in space. “He was having some obvious mental health issues.”

External links

Lander Foquet on Twitter

08 November 2021

A report from Neuroscience 2021, one of the biggest poster sessions in the world

Monday, 8 November 2021: From the online meeting of the Society for Neuroscience, 2021 meeting:

“Something went wrong” error message.

Elizabeth West tweeted, “You misspelled ‘everything.’”

Neuroscience this year was originally announced as a hybrid of in-person and virtual, then moved entirely to virtual (which prompted anger). Things seemed fine earlier today. But now it seems that almost everything in this poster session today (Monday, 8 November 2021) is unreachable.

How interesting. 

This is a meeting I have gone to before, but was not attending this year. But now I will be watching the hashtag for the next few days with interest to see if things improve.

Photo by Rolf Skyberg

• • • • •

Tuesday, 9 November 2021: Um. Day 2 of poster sessions does not seem to be going better yet, based on Ian Stewart’s report.

So presenters at #SfN2021 can’t share their screens in the new back-up plan zoom rooms - we’re having a poster session without posters

Maxime Beau wrote

Great #SfN2021 has transitioned to Zoom for a less chaotic day of poster presentations - apart from sessions where no host is there to start the meeting 🤷‍♂️

Cana Quave wrote:

Have you ever had the negative experience of presenting a poster in a busy session where nearby presentations interfere with each other? Well, @SfNtweets has perfected this negative experience at #sfn2021 by placing all poster presenters in a session into a single Zoom meeting

Maria Reva noted that there had been no announcement about rescheduling the lost Monday poster sessions. That announcement came very late in the day: Monday posters would be rescheduled for the same time Thursday.

Scott Knudstrup delivered one of the best one-liners:

Curious decision by SfN to hire the guys that rolled out the Obamacare website

• • • • • 

Wednesday, 10 November 2021: The woes of the SfN meeting are covered in Spectrum News

• • • • •

This tweet pinpoints Cadmium as the provider for the SfN online conference. 

Apparently the Society for Neuroscience is offering a discounted membership for next year to conference participants. 50% off is nothing to sneeze at, but it would cost the Society nothing to say, “We’re sorry.”

Although it’s fair to point out that this is a minority view, at least one person said it was a “good” conference.