Channel: Publishing – Mike the Mad Biologist

Schlong Rat and the Scientific Reviewing (and Editing) Crisis


By the way, who could have possibly predicted a tool trained on images from the internet generates ridiculously ginormous penises?

Anyway, a while ago, some asshole with a blog noted this about reviewing and editing failures in the scientific literature (boldface added):

Every so often, there’s a spate of articles about scientific fraud. I don’t mean to downplay the issue–it is serious. In my daily work though, I encounter far more issues with poor reviewing and editing: that is, errors*. One of the things that keeps me off the streets and out of trouble is bioinformatic curation, which might (?) sound glamorous but is really nothing more than reading scientific literature closely and then extracting useful information to be added to databases and used in various software programs. For example, a paper might describe a mutation in a gene that confers resistance to an antibiotic, so I need to figure out what the mutation is and if it’s actually worth incorporating: I might not agree with the assessment of the authors (e.g., what they’re calling ‘resistance’ probably shouldn’t be considered resistance), or there might be some other data issue.

…there are several important things to note. First, it’s a pain in the ass to extract these data, and some people just might give up, which is a failure of communication (I stick with it because I am paid to do so–it’s not the fun part of my job). Second, which is far more important than the Mad Biologist’s suffering, how do we trust the other parts of the paper? These examples are not about some minute part of a supplemental table, but are the critical findings of the paper. So how does a reader handle this? Ignore everything else? Pick only the good parts? Third, all of these issues should have been caught. Reviewers should check the key findings of the manuscript in review (and editors should make sure they’ve done so).

I don’t think these are cases of fraud at all (I’ve seen very few cases that even make me suspect fraud), but they are errors, ones that could have and should have been caught in the review process. So yes, fraud is an obvious and documented problem, but poor reviewing and editing is also a serious one.

With that as prelude, we give you Schlong Rat (boldface mine):

Appall and scorn ripped through scientists’ social media networks Thursday as several egregiously bad AI-generated figures circulated from a peer-reviewed article recently published in a reputable journal. Those figures—which the authors acknowledge in the article’s text were made by Midjourney—are all uninterpretable. They contain gibberish text and, most strikingly, one includes an image of a rat with grotesquely large and bizarre genitals, as well as a text label of “dck.”

Behold Schlong Rat:

[Image: ratschlong]

You can’t unsee that.

From a coherence point of view, this nonsensical diagram is even worse–what the hell is “tramioncatiion”? (though it’s not as schlongy):

[Image: ridiculouscellimage]

While this is being treated as a problem with ‘AI’–who could have predicted a tool trained on images from the internet generates ridiculously ginormous penises?–it’s really a problem of review and editing. Neither Schlong Rat nor the second image should ever have made it past peer review, yet they did. That’s a reviewing problem, not an AI problem. Though it does highlight the importance of human oversight (how one overlooks a mutant schlong, however, is difficult to understand…).

