THE Ohio State University is THE Leader!

… In Producing Unreliable Research

James Heathers
Sep 16, 2020

I found this one in the archives.

I think I never bothered to publish it because it felt a bit … mean-spirited. I know people who work at OSU. I trust them. I’d read their work out of reflex. I don’t like the damned-by-association aspect of this at all.

But the desperate grasping vapid stupidity of university branding exercises has always made me think of my OWN ‘brand associations’ with different universities… which are usually ‘who did what dodgy business and when’.

And, of course, the gap between the glossy smiling pandering brochures, usually featuring students judiciously chosen for their ‘visible diversity’, and the day-to-day reality, where the university couldn’t give a toss about any such thing.

So, with the usual simmering discontent in mind, what tipped the whole thing over and got this written was the attempt to trademark the definite article.

It’s easier to maintain the energy necessary to ridicule something if it patently represents itself as ridiculous.

Found out yesterday that Ohio State University just failed to trademark THE.

As in THE Ohio State University.

No joke, they tried this.

This was probably less silly than it sounded. If the article is to be believed, they had a previous dust-up with Oklahoma State University, and they’re trying to get the trademark with some kind of remit specific to their name / branding.

Straightforward enough, I suppose. Maybe not — obviously, they can’t have the word ‘the’ out of context. I think THE Rock and THE Big Show might chokeslam them.

Soundtrack by THE Damned, of course.

Anyway, the trademark hit THE wall.

But I do have a point.

Here’s what they said about THE trademark:

“Like other institutions, Ohio State works to vigorously protect the university’s brand and trademarks,” university spokesman Chris Davey told The Columbus Dispatch in a statement.

“These assets hold significant value, which benefits our students and faculty and the broader community by supporting our core academic mission of teaching and research.”

In the middle of this faintly ridiculous slappy-fight about trying to own the definite article, they put this brandflexing silliness up as an ‘asset which benefits our mission’.

And that mission of course is their strong commitment to their ‘core academic mission of … research.’

That got my goat a bit.

Let’s talk about that.

Retraction Statistics Are Never On The Brochure

This is a series of names.

If it was a basketball team, it would be a short roster. The coach would have to sub in as a point guard.

What the names represent, though, is a long roster.

David Padgett.

Carlo Croce.

Shiladitya Sen.

Samson Jacob.

Ching-Shih Chen.

Steven Devor.

Terry Elton.

Jodi Whitaker.

Brad Bushman.

To most people, these are just names.

But to me, they’re cases, and back-stories. They’re all researchers who’ve had — in some combination — high-profile work retracted, been investigated, been removed/pushed out/“retired”, had funding withdrawn, or had serious questions raised over the accuracy of their research.

They all worked (or work) at THE Ohio State University.

A while ago, I pulled the numbers from THE RetractionWatch Database for the top 25 public universities in the country, and like many things I download, I left them in a pile after the curiosity faded.

THE numbers for all THE problematic research in the database, meaning (retractions + corrections + expressions of concern), looked like this when I downloaded them:

THE big red bar is OSU.

(Note: you got a higher number if you didn’t specify “Columbus” — 77 rather than 55, which is off the graph and into the paragraph above it — but I have no idea if that’s accurate. Always Steel Man, always guess low. We’ll use the lower number.)

Either way, they are the clear RetractionWatch leader, over such research powerhouses as UCLA, the University of Florida, UCSD and UWash. These are the light-hatched bars above.

As these are older numbers, and I’m not sure how old they are because my record keeping is occasionally questionable, I re-did the big lookups for the above just now:

University of California San Diego: 41 entries (UCSD: 3 items)

University of Florida: 42 entries

University of Washington: 46 entries

University of California Los Angeles: 48 entries (UCLA: 24 items)

Ohio State University: 90 entries

(If you add ‘Columbus’, 71 entries)

(If you add ‘THE’, 59 entries)
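For the record, re-running this kind of tally is a five-minute job once you have the data in front of you. Here is a minimal sketch in Python, assuming a hypothetical CSV export of the database (retraction_watch.csv) with an Institution column; the file name and column name are my guesses for illustration, not the database’s actual schema.

```python
# Rough tally of "problematic research" entries per institution, from a
# hypothetical CSV export of the RetractionWatch database. The file name and
# the "Institution" column are assumptions for illustration, not the real schema.
import csv
from collections import Counter

SEARCH_TERMS = [
    "University of California San Diego",
    "University of Florida",
    "University of Washington",
    "University of California Los Angeles",
    "Ohio State University",
]

counts = Counter()
with open("retraction_watch.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        affiliation = row.get("Institution", "")
        for term in SEARCH_TERMS:
            # Substring match, like typing the name into the search box, so
            # "Ohio State University" also scoops up "The Ohio State
            # University, Columbus" and its variants.
            if term.lower() in affiliation.lower():
                counts[term] += 1

for term, n in counts.most_common():
    print(f"{term}: {n} entries")
```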

Basically, THE OSU leads THE top public universities in producing research which needs to be corrected or removed from THE research record.

You might not like it, but those are THE facts.

Unlikely you’ll see that on the brochures: “Come to THE Ohio State University! See our new $15 million aquatic center! It’s next to the building which spawned an impressive chunk of THE RetractionWatch database!”

Now, there is no question that the genesis of this number is multi-factorial, and that the exact count is hard to pin down (lots of cross-over between institutional names, given how randomly people report their affiliations, of course).

And — of course — not all corrections and retractions are created equal.

For instance: is THE OSU better or more proactive at the process of investigating researchers whose work is unreliable?

I’ve seen that argument made about Japan and I believe it — they don’t believe in sparing the rod in substantial research misconduct cases, which is HUGELY to their credit. Internationally, their ability to provide a full exegesis of academic naughtiness is right up there with the Dutch.

Or maybe this is simply a matter of luck or random assortment. Do the researchers above work in more or less controversial/difficult/carefully reviewed areas? Does THE OSU take more risks with hires?

Or, perhaps most simply of all, is this number just proportional to the number of research staff present, i.e. more people predicts, on aggregate, more mistakes and/or assorted naughtiness? Or maybe not the number of staff, but the total number of research items produced? Perhaps I need to adjust per staff member or per research paper (a rough sketch of that arithmetic follows below).

Overall, we have no idea, and it’d be flat-out mean to speculate. It’d be a reasonable question to resolve numerically if you had a strong hypothesis, a big fat regression model and a few spare months. However, I have no hypothesis, no time, and the only regression I have is “childhood”.
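If you did want the crude per-output adjustment rather than the big fat regression, the arithmetic is nothing special. A sketch, using the entry counts from the lookups above and placeholder denominators standing in for each university’s total research output, which I have not looked up, so treat the rates as illustrative only:

```python
# The "adjust per research paper" idea: problematic entries per 1,000 papers.
# Entry counts are the fresh lookups quoted above; the paper totals are
# PLACEHOLDERS for each university's research output, not real figures.
entries = {
    "UC San Diego": 41,
    "Florida": 42,
    "Washington": 46,
    "UCLA": 48,
    "Ohio State": 90,
}

total_papers = {  # placeholder denominators, for illustration only
    "UC San Diego": 100_000,
    "Florida": 100_000,
    "Washington": 100_000,
    "UCLA": 100_000,
    "Ohio State": 100_000,
}

for uni, n in sorted(entries.items(), key=lambda kv: kv[1], reverse=True):
    rate = 1000 * n / total_papers[uni]
    print(f"{uni}: {rate:.2f} problematic entries per 1,000 papers")
```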

Dealing with THE problem

What we do know, though, is that this number and all its concordant publicity has not escaped the management of the university. We know because they said so; they don’t spend all their time on corporate brand fiddle-faddle.

Probably only I remember this, of course, but in 2018 we found out they were going to have a National Conference on Research Integrity. It made sense. Some of the cases from the rogues’ gallery above were current at the time.

Which happened, as promised, in September.

Here’s the website: http://research.osu.edu/risummit/

I don’t know how effective it was, or who it reached, or what happened subsequently, because I’m probably the only person who watched it.

What doesn’t help: the video is an 8-hour, un-annotated, non-downloadable, 5.7GB video file. Releasing it is a good commitment to openness; releasing it in a way where it can’t be watched is, uh, not so much.

There’s a pretty substantial roster of worthies on the speakers’ list. Deans, consultants, VCs, VPs, directors, editors, COPE and ORI members, et al.

The really small font is to fit all the really fancy people in.

This is fine, I guess. These high-level policy meetings take place all the time. Much is said, fine words are exchanged. If they resulted in a great deal of change, one of the four thousand previous conferences might have left a scar by now. Nothing looks as stale as last year’s Steering Committee recommending that next year’s committee establish new recommendations.

Now, I don’t expect such junkets to stop, or for the people who run them to pay the slightest attention to my opinion. They’ve never heard of me.

But, then again, I’ve never heard of them.

As might be expected, I have a fairly well-developed spidey sense for recognising names in the research integrity space, and I recognise one name from that lot (Yucel), because she was one of the addressees on an open letter sent to OSU to complain about their treatment of people detecting errors in work published there.

However.

An element of this conference was so unbelievably short-sighted I’m imagining it wearing coke-bottle glasses then walking into lampposts. It has a ludicrous tinge to it.

Unfortunately, it was also the tagline of the entire event.

Rather than rewrite this a few times for my own personal amusement (“A view from all senior management perspectives”), here is my question:

Where are the early career researchers?

A room full of senior compliance officers and omni-deans is NOT. ALL. PERSPECTIVES. The actual business of science is overwhelmingly performed by young people, and managed by old people. This means whistleblowers (at least, the internal kind) are overwhelmingly junior people connected to problematic projects.

If I were in an uncharitable mood, I’d say a summit on research integrity which doesn’t involve anyone who actually does THE science or points out THE errors has THE most acute potential to devolve into pointless hand-wringing.

And with that, we return to a portion of that initial statement about brand fiddle-faddle:

“These assets hold significant value, which benefits our students and faculty and the broader community by supporting our core academic mission of teaching and research.”

Well, I hope you can get your trademark and sell a whole lot of silly baseball hats, because you might need that money for a bit more research oversight.

THE End.
