
Ethics

Movers And Shakers

Data integrity specialist worked to keep published images honest at ASBMB

Kaoru Sakabe describes how her former group combs through images for signs of manipulation before publication

by Louisa Dalton, special to C&EN
September 15, 2020 | A version of this story appeared in Volume 98, Issue 36

Photo: Kaoru Sakabe. Credit: Emily Huff/ASBMB

Kaoru Sakabe was academic publishing’s version of an in-house detective. In 2017, she and editors at the Journal of Biological Chemistry (JBC) conducted a pilot study looking for image manipulation in accepted papers. When 10% of papers came back with a possible issue, the team was shocked.

The study’s results led Sakabe to help set up an image-screening process that she oversaw until recently. As data integrity manager for the American Society for Biochemistry and Molecular Biology (ASBMB), Sakabe preemptively flagged apparent image manipulation or shoddy data representation in accepted papers at JBC and the Journal of Lipid Research. She and three analysts hunted for hard-to-detect anomalies in images, particularly those of blots, gels, and micrographs.

Sakabe also handled readers’ complaints about published images and fielded all policy and manuscript questions from editors, reviewers, and society staff. Louisa Dalton spoke with Sakabe about her role as a data integrity specialist. Since their conversation earlier this year, Sakabe has changed jobs and now works at AstraZeneca. The following interview was edited for length and clarity.

Vitals

Hometown: Born in Kyoto, Japan; grew up in North Carolina

Studies: BS, chemistry with a focus in biochemistry, University of North Carolina at Chapel Hill; PhD, biochemistry, Johns Hopkins University; postdoc, University of Dundee

Blogs she follows: Retraction Watch, For Better Science

Papers that can take all day to check: Papers heavy with Western blots

Papers that are easy to check: Crystallography papers with protein structures

Favorite molecule: O-linked β-N-acetylglucosamine, because “my thesis focused on this posttranslational modification of proteins”

Hobbies: Cooking, baking, knitting, and some sewing. “I like making things with my hands. And, like everybody else during the COVID-19 lockdown, I’m working on getting a thriving sourdough starter.”

Once-favorite games: Crosswords, sudoku, and Photo Hunt, a spot-the-difference game. “Now that I spend most of my day on the computer, I don’t do these types of games.”

What spurred you to take on a job as a data integrity manager?

There was something about the integrity part. I’m a real stickler for rules, and my partner and I actually get in little tiffs about it because we’re the complete opposite. If it says, “No parking here if you’re not a resident,” then I will make sure that I don’t park there.

Also, I fixate on a lot of minutiae. I look for tiny things that are wrong. I think that’s what makes me good for this job. As a researcher, I had a harder time looking at the big picture, which you really need to do as a research scientist if you want to publish that paper or write a research grant. The way my mind works is better for image analysis than for doing bench work.

Describe your image analysis process.

We download all the figure files that the authors have uploaded into our system, and then we put them in Photoshop. We apply a bunch of gradient maps from the Office of Research Integrity (ORI) of the US Department of Health and Human Services, which are provided on the ORI website as a service to the research community, to add false colors to figures. The false colors help bring out small differences that are harder to see when the images are in gray scale. They help us visualize where the authors may have manipulated an image. Using these, we can tell if perhaps a brush tool was applied or the blur tool used. The gradient maps work best on blots, gels, and micrographs. We look for discontinuities, duplications, manipulations, compression artifacts from saving images in different file formats, splices, or other subtle alterations that change the nature of the original image. Our process also includes screening spectra, graphs, schematics, and structures.

We also look for author mistakes. For example, they may have an extra arrow in their figure that’s not pointing anywhere, or the labels on the graph may be covered up. We make sure that the figures comply with our editorial guidelines.
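
For readers who want a feel for the false-color idea Sakabe describes, here is a minimal Python sketch. It is not the ORI Photoshop gradient-map workflow; it only approximates the general technique of remapping gray levels to high-contrast colors so that near-identical intensities become easier to tell apart. The file name is hypothetical.

```python
# Minimal sketch: remap a grayscale blot image to false color so that small
# intensity differences (e.g., from a brush or blur tool) stand out.
# This approximates the idea behind gradient maps; it is not the actual
# ORI Photoshop workflow. "blot.tif" is a hypothetical file name.
import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

gray = np.asarray(Image.open("blot.tif").convert("L"), dtype=float) / 255.0

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
axes[0].imshow(gray, cmap="gray")
axes[0].set_title("Original (grayscale)")
# A high-contrast colormap exaggerates nearly identical gray levels.
axes[1].imshow(gray, cmap="nipy_spectral")
axes[1].set_title("False color")
for ax in axes:
    ax.axis("off")
plt.tight_layout()
plt.show()
```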

What tools in Photoshop are OK for researchers to use on an image? Are others off limits?

I really recommend only using the crop tool. It may be necessary to adjust the brightness, contrast, levels, or gamma settings, but I think these should be done carefully. The final image should always look like the original one.

How do authors react?

I run into a few irate authors who blame me for the problems that we’ve uncovered. But I think that just comes with the job. Some authors are completely unaware that something happened. The image modification is often done by one person in the group. I think in most instances, there’s an inherent level of trust that coauthors have with each other, which is why, oftentimes, these issues go by unnoticed.

Do you think this image checking will become more automated?

There are some companies working on this actively. I don’t know if they have the throughput yet to look for duplication of figures, and I’m not so sure that they will be really good at finding some of the manipulations we see. One manipulation we see involves Photoshop’s clone stamp tool, which allows you to copy a little part of an image and stamp it anywhere else on the image to hide blemishes or bands. We can tell the clone stamp tool was used when we see the same really, really tiny pattern stamped around the blot several times. I don’t know how good artificial intelligence would be at looking for that.

I hope it gets better because it would certainly make it a lot easier to screen figures, and it would also allow high-throughput image screening in papers that have already been published, kind of like plagiarism software but with images.
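
The clone-stamp signature Sakabe mentions, the same tiny patch repeated in several places, is what image-forensics researchers call copy-move forgery. The sketch below illustrates the simplest version of automated detection: hash small tiles of the image and flag pixel-identical tiles that occur more than once. Real forensics tools are far more robust (they tolerate rotation, noise, and JPEG recompression); this is only a toy illustration, and the file name is hypothetical.

```python
# Minimal sketch of copy-move (clone-stamp) detection by exact block matching:
# record every small tile of a grayscale image and flag tiles whose pixel
# content repeats elsewhere. Flat background tiles are skipped because they
# repeat naturally. "blot.tif" is a hypothetical file name.
from collections import defaultdict
import numpy as np
from PIL import Image

def find_duplicate_blocks(path, block=8, stride=4, min_std=5.0):
    img = np.asarray(Image.open(path).convert("L"))
    seen = defaultdict(list)  # tile bytes -> list of (row, col) positions
    h, w = img.shape
    for r in range(0, h - block + 1, stride):
        for c in range(0, w - block + 1, stride):
            tile = img[r:r + block, c:c + block].tobytes()
            seen[tile].append((r, c))
    # Keep tiles that appear more than once and have some texture.
    dupes = {}
    for tile, positions in seen.items():
        arr = np.frombuffer(tile, dtype=np.uint8)
        if len(positions) > 1 and arr.std() > min_std:
            dupes[tile] = positions
    return dupes

if __name__ == "__main__":
    matches = find_duplicate_blocks("blot.tif")
    for positions in list(matches.values())[:10]:
        print("repeated block at:", positions)
```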

Do you think data integrity is an increasing problem, or is it just easier to find now?

I think it’s easier to find, for sure. I also think it used to be a lot harder to make the prettiest Western blot you could. You just had to deal with whatever blot you developed. Now you have Photoshop, you have PowerPoint, you have all these programs that make it really easy for someone to make the “perfect” blot. And I feel like our society is kind of pushing to have the most perfect image presented.

I don’t think we give our graduate students enough education up front about what’s acceptable and what’s not acceptable. When I was in graduate school, this is not something that was taught to us. We were just expected to know.

What advice do you have for scientists?

Always save original data. Don’t save images as JPEGs or in PowerPoint. I recommend a column in the ASBMB’s member magazine that describes common image problems. The ASBMB also has a video about preparing figures for publication on our YouTube channel.


Louisa Dalton is a freelance writer. A version of this story first appeared in ACS Central Science: cenm.ag/sakabe.
