2018 Webby Nominee

CSHL’s Base Pairs podcast has been selected as a 2018 Webby Award nominee.

Base Pairs was “singled out as one of the five best in the world” in its category.

Hailed as the “Internet’s highest honor” by The New York Times, The Webby Awards, presented by the International Academy of Digital Arts and Sciences (IADAS), is the leading international awards organization honoring excellence on the Internet.

Cold Spring Harbor Laboratory’s Base Pairs podcast tells stories that convey the power of genetic information—past and present.

Listening to a podcast is easy

Let us show you how in our online tutorial: How to listen to a Base Pairs podcast.

For educators

To make our episodes easier to repurpose for educational uses, including other podcasts, we provide “no music” versions of every episode under a Creative Commons license.

Base Pairs Season 3

Latest episode

Episode 14.5: Medicine and mad scientists

It’s important to know that a drug works, but knowing how it works can be just as crucial. CSHL Fellow Jason Sheltzer discovered that the hypothesis explaining the action of a new cancer drug was incorrect, indicating that its beneficial effects had to be due to other factors. Hear more from him following up on his discussion in Base Pairs episode 14, “The cancer answer that wasn’t.” Also, in a new pop culture segment, we talk about movie “mad scientists” and how they contribute to misconceptions about the way real science is done.

Brian: Hey everybody. My name is Brian.

Andrea: And I’m Andrea.

Brian: And this is a Base Pairs chat episode.

So for those of you who don’t know, we follow up every full episode, kind of our storytelling episodes, with what we call a chat episode. So this is the content that we leave on the cutting room floor, or interviews that we had wanted to discuss but weren’t able to include in the podcast, and then Andrea and I kind of just talk it out.

Andrea: But today we have another person joining us, someone else from our team at Cold Spring Harbor Laboratory, who is our kind of resident pop culture aficionado. Her name is Sara Roncero-Menendez and she’ll be joining us a little later in the show, so look forward to that.

Brian: It’s going to be fun. But first let’s start into what we normally do.

So Andrea, I know in our last episode, which we called The Cancer Answer That Wasn’t, you talked to Jason Sheltzer.

Andrea: Yes. Jason is a CSHL fellow who studies cancer and he and his team kind of stumbled upon this really surprising result, which was that this cancer gene and supposed cancer drug target called MELK, that’s M-E-L-K-

Brian: Right. Not milk, the beverage. Want to make that pretty clear starting out here.

Andrea: No. MELK the cancer gene, or so they thought, because it wasn’t actually a cancer drug target at all. And that was very surprising to them because there was a cancer drug in clinical trials that they thought was targeting MELK. And so that kind of led us to talking about, how common is this? When researchers know that a drug works, how much do they really know about how it works? And so I’m going to play a little clip about that.

Jason: It’s killing cancer cells, we know that, but the reason that people thought it was killing cancer cells must be totally wrong. And so we think that this drug, which is in clinical trials, it’s effective at killing cancer cells, we can see that very well in our own hands, it just has to have some different mechanism, which we and, to our knowledge no one else, have discovered yet.

Andrea: Right. It’s definitely important to make that point because a lot of people would see drug target invalidated and think, “Oh my gosh, you’re giving this to cancer patients and wasting their time.” But that is not exactly the conclusion to draw from this work.

Jason: There are a lot of cancer drugs out there that have been studied for 20, 30, 40 years and we still have a very incomplete understanding of how they work in the cell. We know that they kill cancer cells and that they’re effective in patients and so there are a lot of drugs that are effective that we have an incomplete understanding of.

Andrea: Right. And that’s not only true of cancer that’s true of other drugs.

Jason: Sure. Psychiatric drugs times a million.

Andrea: Oh yes.

Brian: Times a million. I’m really glad he brought that up because that reminded me immediately of one of our previous episodes. It’s actually one of my favorite episodes, which was episode seven. It was the season finale of our first season, in which we talked about psychiatric drug discovery. And in that episode we talked about kind of the craziest surprising fact that a lot of the drugs that we use today we’ve been using for 20, 40 years and we still don’t fully understand why they work. We just know that they do.

Andrea: Yeah, and I mean, how would we, when scientists are so at the beginning of understanding how the brain works, just in general? When you think about it, it’s just totally unrealistic that scientists would not only have cured a disease with a drug but also know exactly how it works.

Brian: So it’s not like I come up with an idea, it’s a solution to a problem, and I fully understand every little bit of how I reached that solution and why it works.

Sara: That reminds me of something.

Brian: Welcome Sara. As we mentioned at the top of the episode, this is Sara Roncero-Menendez, a member of our little digital den down at Cold Spring Harbor Laboratory.

Sara: The discussion you guys were having about MELK and not having everything figured out reminds me of a story.

Andrea: Okay, what’s your story?

Brian: Okay, shoot.

Sara: So have you guys ever heard of the ancient Greek mathematician Archimedes?

Brian: It’s ringing a bell, a very tiny bell.

Andrea: Refresh our memory.

Sara: Well, there’s lots of reasons to remember the name but the story I want to tell you guys is about Archimedes and the word Eureka. Now, once upon a time, Archimedes was charged by King Hiero II to figure out a way to detect a fraudulent crown, or in some versions it’s something about a boat not sinking with all the silver on it. The legend varies. And you know how you always get your best ideas in the shower? Well, the ancient Greeks got their best ideas at the public bath. So Archimedes goes to get a good steam, he sits down in the bathtub, realizes that his volume actually creates water displacement and, so excited, he shouts …

Andrea: “Eureka!”

Sara: Exactly. And he’s so jazzed about this idea that he runs out of the public bath naked.

But ever since then, we’ve associated the word Eureka with scientific discovery that happens in an instant. It’s an idea we carry over even to other scientists.

Andrea: Oh yeah, definitely. I mean, the whole Ben Franklin with his key on a kite and figuring out electricity all in one nice neat story.

Brian: Right, or another one where bodily harm triggers genius: the apple falling from the tree and knocking Isaac Newton on the head.

Sara: Right. And even Mendeleev, the guy who created the periodic table, was said to have thought of it in a dream. But it’s not even just about Eureka in these science legends, but in science fictions too.

Brian: So what do you mean, science fictions?

Sara: So even in movies that we all come to know and love, this Eureka myth persists and is perpetuated over and over again. This has been around since the early days of cinema. I want to introduce you guys to a beloved classic, the 1931 Universal Pictures Frankenstein, starring Boris Karloff.

Frankenstein: Look, it’s moving. It’s alive. It’s alive! It’s alive, it’s moving. It’s alive, it’s alive, it’s alive, it’s alive!

Andrea: Very spooky and dramatic, for sure.

Sara: Right. But we can definitely see that there are some problems here with Victor Frankenstein’s method.

Andrea: Oh yeah. I mean, what did he even really just do?

Sara: Well, for those of you who haven’t seen the movie, he just put a body on top of a slab, pumped it full of thousands of volts of electricity, and then watched its hand twitch and declared that it was alive.

Brian: That’s a heck of a conclusion to jump to.

Sara: Right. So it’s not like we see Victor Frankenstein running any tests or running a slew of monster models, but rather he becomes horrified by it and lets Frankenstein’s monster destroy a village.

Andrea: I’m very glad that that is not how science is done.

Brian: But that is a very classic mad scientist, right? I’m sure modern Hollywood kind of takes it a little bit easier on scientists.

Sara: Oh, Brian. Well, unfortunately, I am here to ruin some sci-fi classics for you.

Brian: Oh no.

Sara: I’m sure you guys have seen Back to the Future?

Andrea: I wouldn’t be so sure about that, but-

Brian: What?

Andrea: -but this is why we have Sara on the show, to tell me about pop culture.

Sara: Well Andrea, let me get you up to speed.

So Back to the Future is about this total loser named Marty McFly, who’s best friends with a mad scientist named Doc Brown. Now Doc Brown has a dream and he wants to build himself a time machine, which he does, out of a DeLorean.

Doc Brown: What did I tell you? 88 mph! The temporal displacement occurred at exactly 1:20 AM and zero seconds.

Marty McFly: Jesus Christ. Jesus Christ Doc, you disintegrated Einstein.

Doc Brown: Calm down Marty, I didn’t disintegrate anything. The molecular structure of both Einstein and the car are completely intact.

Marty McFly: Then where the hell are they?

Doc Brown: The appropriate question is, when the hell are they?

Brian: So for those of you who can’t see the clip, we’ve got this 30-year-old car that you’ll never see driving around today, directed right at this little boy and this old crazy man and there’s a dog driving it. Am I getting this right Sara?

Sara: That’s actually a pretty accurate summary. So as you can see, there are definitely some problems with Doc Brown’s method, the first being that he put a dog in the car on the very first test run of this time machine.

Andrea: How is the dog going to report back on what happened even?

Sara: And that’s if the dog comes back at all because Doc Brown doesn’t know that 88 mph is the magic number he needs to achieve time travel.

Brian: Right. Marty here thinks that Einstein, the dog, got disintegrated. And Doc Brown’s just assuming that’s not the case?

Sara: Essentially. He’s so confident that he basically even knows when to tell Marty to move for when the DeLorean comes rushing back onto the scene.

Andrea: Oh my goodness. You really can’t be that confident about your first experiment when you’re doing real science. I mean, first of all, you have to be open to being surprised, like when Jason realized that this cancer drug target was not what it was thought to be. You ought to be open to that, and if our mad scientist here had been open to that, he wouldn’t have been putting himself in mortal danger.

Brian: Okay, but right now, Sara, we still have two mad scientists. What about Hollywood’s portrayal of a real scientist, somebody who is-

Andrea: Legit.

Brian: Legit. All right.

Sara: Well, have you guys ever heard of a little movie called Jurassic Park?

Andrea: I have heard of it. Maybe not seen it.

Brian: You’re killing me Andrea. I’ve seen it.

Sara: Well, for those of you who haven’t, just in case, basically the film is about these scientists who find a preserved mosquito that has dinosaur DNA and they use that to make more dinosaurs.

Brian: So far, so good.

Sara: Right. And they even have a fail safe. They make all the dinosaurs female so they can’t reproduce.

Henry Wu: This is really not that difficult. All vertebrate embryos are inherently female anyway, they just require an extra hormone given at the right developmental stage to make them male. We simply deny them that.

Ellie Sattler: Deny them that?

Ian Malcolm: John, the kind of control you’re attempting, it’s not possible. Listen, if there’s one thing the history of evolution has taught us, it’s that life will not be contained. Life breaks free. It expands to new territories and crashes through barriers painfully, maybe even dangerously, but … well, there it is.

John Hammond: There it is.

Henry Wu: You’re implying that a group composed entirely of females will breed?

Ian Malcolm: No, I’m simply saying that life finds a way.

Andrea: I definitely like the sentiment of life finds a way. I’m not as confident as the scientist is that it’s going to go the way he planned.

Sara: Right. They don’t wait a couple of life cycles to see how these dinosaurs are going to work and interact. They don’t check to see if they are able to reproduce due to the amphibian DNA that they used to fix the dinosaurs. They sort of just hope that this project is ready to go public in a year or less.

Andrea: That’s not even enough time to get a drug ready for FDA approval, let alone to unleash dinosaurs on the entire planet.

Brian: But of course, this movie almost seems like a good thing in that it’s portraying a lesson for scientists, where it says, “Hey, if you want to do good science, you have to rigorously check what you’re doing. Otherwise, you get eaten by dinosaurs.”

Andrea: Right. You might think that you know how it all works but you really need to test every little aspect, especially when you might be putting people in danger.

Sara: That’s probably not how most audiences saw it, but maybe they should have.

So the long and short of it ends up being that narratives really love this Eureka moment, and they often overlook the months and years of hard work, testing, and laboratory effort that are necessary to really come up with rigorous results, not just quick answers.

Brian: So thanks Sara, for coming in and talking to us about this.

For everybody else out there, we talk to Sara during the production of every podcast episode. She’s kind of always there in the background, giving suggestions and always tying everything into pop culture, so I’m really glad we were able to have her on the show now and share that with you guys. We’re going to be doing this every chat episode. Sara will be here to drop her pop culture knowledge bombs, so look forward to it. Please stay tuned.

Andrea: And we’ll be back in May with another full episode for you all, so stay tuned for that too.

Brian: Thanks a lot guys.

Andrea: We’re coming to you from Cold Spring Harbor Laboratory, a private not-for-profit institution at the forefront of molecular biology and genetics.

If you’d like to support the research that goes on here, you can find out how to do that at and while you’re there, you can check out our newsstand, which showcases our videos, photos, interactive stories and more.

Brian: And if that’s still not enough, you can always pay us a visit. Between our undergraduate research program, high school partnerships, graduate school, meetings and courses, and public events, there really is something for everyone.

Andrea: I’m Andrea …

Brian: And I’m Brian …

Sara: And I’m Sara.
Andrea: And this is Base Pairs. More science stories soon.

Episode 14: The cancer answer that wasn’t

Science is a process, something we learn in elementary school as we plan our papier-mâché volcanoes. First, a hypothesis is put forward. It is rigorously tested through observation and experimentation, and then the scientists put forth their results.

But one step that your fifth-grade science fair likely overlooked is absolutely crucial: the experiment should be reproducible by others using your methods and materials.

AA: Hey everyone! Andrea Alfano here.

BS: With me, Brian Stallard.

AA: And we’re really thrilled to be starting this new season of Base Pairs! But first, I wanted to make a short-but-exciting announcement: Base Pairs and CSHL’s blog, LabDish, have officially moved!

BS: Cue the Music!

[m: Tada!/parade music]

AA: Oh! I uh, wasn’t expecting… [clears throat] well anyway, has just undergone a huge upgrade [- BS: it’s bigger and better than ever! – ] and with it, you can find every LabDish post and the whole episode list—all two complete seasons—of our Base Pairs podcast.

BS: Right! And as always, we can still be found on SoundCloud, Stitcher, iTunes, and wherever else you get your podcasts.

[parade music fades]

BS: But let’s get straight into today’s episode! And for it, Andrea and I have decided to dive into a subject that many scientists and science enthusiasts…

AA: …which I’d guess is most of you, dear listeners…

BS: …yup, it’s something that you guys may be familiar with already… and might even be a little worried about. [p] That’s because today we’re going to talk about what many are calling science’s “reproducibility crisis.”


IO: It’s a little bit like, if you provide enough information, like grandma and her recipe for meatballs, then, the meatballs should more or less come out the same.

AA: That is Doctor Ivan Oransky. He’s a Distinguished Writer In Residence at New York University’s Arthur L. Carter Journalism Institute and the co-founder of the website known as Retraction Watch.

BS: That’s him! I reached out to Ivan because he has written a lot about the so-called “reproducibility crisis,” and I was hoping he could share that knowledge with us. [p] So, of course, the first thing we talked about was meatballs.

IO: Now, in terms of grandma’s meatballs, I want a little variation, a little variability, otherwise, life becomes very boring. Biology has that natural tendency—biology has natural variation, natural variability, and so that’s to be expected. It’s not that you would expect to get the exact same results every single time.

BS: But you would still expect to get meatballs… Now, this is a metaphor, obviously, but it really gets at the heart of what we mean when we say “reproducibility” in this episode.

AA: Ok, then let’s say that I, a chef, want to make the next great meatball. I’m reading my cookbook literature and I stumble upon a meatball recipe that I just HAVE to try, and then, maybe build upon. So, I set up my kitchen and get to work.

BS: Now in this metaphor—now follow me here—chef is to scientist, recipe is to paper, cookbook is to journal, kitchen is to lab, etcetera, etcetera, and so on.

AA: Right and at the end of it all, after following the recipe as closely as I can, I have made…

BS: An apple pie.

AA: [laughs] A what?!

BS: An apple pie! Or the most delicious chicken cordon bleu ever, orrrr maybe just a charred square of what was once chop meat. Whatever your result, it’s clear to you and me that that’s not meatballs. Even accounting for the natural variability of biology, like Ivan said, clearly, there was something wrong with the recipe you used.

AA: In other words, the paper’s result—if we step away from the metaphor—was not reproducible. [p] But then what? Say I find out that something is wrong with this paper. What happens then?

[MT: explainer]

BS: Well, one of the celebrated parts of science is that it undergoes peer review and, in turn, is self-correcting. If enough folks realize there is something wrong with a recipe, they stop using it. Maybe an edit is made. Or maybe the recipe itself is removed from the cookbook entirely.

AA: That last part is called a retraction—when a paper’s author or the journal where it’s published actually take it down. And being part of Retraction Watch makes Ivan and his colleagues particularly aware of this kind of thing.

IO: So, the rate of retractions has definitely been on the rise. It’s actually a pretty dramatic increase from the year 2000, when there were about 35 retractions in the literature out of probably about a million papers published. In the year 2016, when we had sort of the most up-to-date information so far, there were more than 1,300 retractions. There were about two million papers published, so, obviously, the denominator increased, but, overall, that still represents a pretty significant increase in the number of retractions, and the rate of retractions, more importantly.
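To make Ivan’s point concrete, here is a quick back-of-the-envelope sketch in Python of the arithmetic behind those figures. The 35, 1,300, one-million, and two-million values are the approximate numbers quoted above, not exact counts:

```python
# Back-of-the-envelope: did retractions outpace the growth in papers published?
# Figures are the approximate ones quoted in the episode, not exact counts.
retractions = {2000: 35, 2016: 1300}         # retracted papers per year (approx.)
papers = {2000: 1_000_000, 2016: 2_000_000}  # papers published per year (approx.)

# Normalize by the "denominator" Ivan mentions: retractions per million papers.
rate = {year: retractions[year] / papers[year] * 1_000_000 for year in retractions}

print(f"2000: ~{rate[2000]:.0f} retractions per million papers")
print(f"2016: ~{rate[2016]:.0f} retractions per million papers")
print(f"increase in rate: ~{rate[2016] / rate[2000]:.0f}x")
```

Even after accounting for twice as many papers being published, the rate goes from roughly 35 to roughly 650 retractions per million papers, an increase of nearly twenty-fold, which is the “pretty significant increase” Ivan is describing.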

BS: Now, Ivan was careful to tell me that knowing the rate of retractions lets you know one thing for certain: The rate of retractions. However, he added that if he had to guess, he’d say the rising rate is –

IO: due to at least two factors. One of them is pretty clear, which is that we’re all better at finding problems in the literature. There are more people looking at papers. It’s also, certainly, at least possible that there’s more misconduct happening.

AA: Oh my. Misconduct. Ivan’s talking about the possibility of fraud. That can happen in highly competitive environments and science, of course, is not immune. However, in the case of our discussion today, Brian, we’re actually going to focus on that other part, right? The fact that we’re getting better at finding problems.

BS: That’s right. This increased scrutiny of scientific literature has led to the discovery of all these papers that, despite being driven by hard work and genuine science, STILL can’t be reproduced. In fact, a stunning analysis in 2015 from the non-profit Global Biological Standards Institute in Washington DC attracted a lot of attention. They estimated that billions of dollars each year are spent on biomedical research that cannot be reproduced successfully. They went as far as to say we might have a “reproducibility crisis” on our hands. But… that might not be the best name for it.

RH: I don’t think this is a crisis, because I think this has actually been a problem in science for a long time.

BS: And that is Richard Harris.

RH: I’m Richard Harris. I have been a science correspondent at NPR for 32 years. I wrote and published a book last year called “Rigor Mortis,” about rigor and reproducibility in biomedical research.

BS: The Financial Times called “Rigor Mortis” “a rewarding read for anyone who wants to know the unvarnished truth about how science really gets done.”

AA: Oh! I’ve heard of this book. It describes a lot of the reasons why research may not be reproducible and the problems that this can cause in academia and industry alike, so I was happy to hear Richard had some good news too.

RH: People are now aware about the scope and the seriousness of this issue, and I think that’s good news because I think that means people are thinking about how to make it better.

BS: However, Richard was quick to add that in the case of irreproducibility, it may be that we first want to see even more corrections and retractions.

RH: I think there’s … a little bit of trepidation about admitting errors. If it’s a serious mistake, it’s to say I would like to retract my paper and take it out of the literature because there’s something fundamentally wrong with it. The problem is that that’s very often perceived as a black mark for a scientist. Even if a scientist is really doing the right thing, saying, “Oops, I screwed up a little bit here. I want to tell the community and I want to take this out of the literature,” that’s often seen as a potential sign of fraud or misbehavior or something like that. So, scientists are very reluctant to do that unfortunately and that means a lot of papers in the literature that are problematic aren’t removed.

AA: This is a powerful reminder that scientists—when all is said and done—are people, like you or me! So, it really shouldn’t come as a surprise that mistakes happen and sometimes go undetected, ignored, or unreported.

BS: And to solve this problem, Richard explains that we need to first get rid of the stigma surrounding experimental mistakes. After all, without mistakes to learn from, how else can scientists improve?

RH: I think we have to recognize that error is part and parcel of the scientific process. We can’t pretend or we shouldn’t imagine that everything will be 100% perfect. In fact, I think if scientists strive for that, then they won’t be trying hard enough to push the frontiers … The question is can we shorten the cycle between understanding there’s an error and recognizing that—and getting the word out that actually we have a deeper understanding and that turned out not to be correct and so on.

AA: That’s a wonderful point he’s making, and it reminds me of a recent conversation I had with a biologist right here at CSHL. He told me a story that shows how learning from those kinds of “errors”—the ones that arise from the unknown unknowns at the frontier of discovery—can help drive science forward. Reproducibility, after all, isn’t as black and white as your conversations about retractions may make it seem.

[MT] (chat setup from last year’s interview?)

AA: That scientist’s name is Jason Sheltzer. He’s a CSHL Fellow. And he ended up in the middle of this whole reproducibility issue when he accidentally discovered that the target for a cancer drug that’s in clinical trials… well, that drug target is actually not involved in tumor growth at all.

BS: Uh-oh. And it’s in clinical trials, so that means actual cancer patients are receiving this drug.

AA: Yes.

BS: What went wrong?

AA: Well, ideally scientists would have figured this out earlier of course, so you could say something went wrong in that sense—and we’ll get to that. But when I talked to Jason, this is what he said about the role of contradictory results like these in science.

JS: I think that finding contradictory results, and then understanding why you found a contradictory result, is a very important scientific endeavor.

BS: Oh Ok. So, we’re talking about contradictory results here. Like when you made apple pie instead of meatballs at the top of the episode. That was quite contradictory.

AA: Right, but I picked this story in particular because it shows how complicated this reproducibility thing actually gets. In fact, up until Jason made his accidental discovery, it was as if everyone thought apple pie WAS meatballs…. But I’m getting ahead of myself.


AA: Jason and his team just published their second paper about this, in February, but they first reported results that invalidate the cancer drug target, called MELK (that’s M-E-L-K), about a year ago.

BS: And MELK is a gene?

AA: Yes, MELK is a gene that has the instructions for building the MELK protein. The protein is actually the part that the drug was supposed to be targeting. And when Jason’s team started those experiments, they weren’t even trying to learn about MELK, because they thought what the other scientists thought: that cancer cells are addicted to MELK and therefore getting rid of MELK makes it impossible for them to thrive.

BS: Or in other words, that our apple pie recipe makes meatballs.

AA: That’s a bit of a simplification, but yes. It’s a lot like that.

JS: There are a number of different genes that cancer cells express, which they depend on, which they are addicted to in order to grow and divide and metastasize, and do all the terrible things that they do. Sometimes when you can mutate or block the function of these cancer addictions, you can kill the cancer cells.

BS: And I’m guessing that’s what researchers thought this cancer drug did. They thought it killed cancer cells by blocking MELK.

AA: They thought so. Actually, Jason and his team were so confident that MELK was an addiction for cancer and therefore a good cancer drug target that they used it to kind of standardize their experiment, as a point of comparison.

BS: A control.

AA: Exactly. They were setting up this big screen where they would delete various genes in cancer cells—get rid of those genes entirely—and then see which genes cancer cells could live without and which ones they were totally addicted to. And when you’re designing an experiment like that…

JS: …you want, as controls, to be able to target something that is a known addiction and one of the controls that we chose for our work was this gene called MELK, which had been published to be an addiction of breast cancer. However — it didn’t behave like a cancer cell addiction, and we could mutate this gene in breast cancer cells and they didn’t seem to care at all.

BS: That must have been confusing. Hadn’t the earlier MELK results been reproduced before? They must have if the supposed MELK-targeting drug was already in clinical trials.

AA: Many different groups had independently reproduced the MELK results. Since 2005, more than 30 papers have reported results that implicate MELK as a cancer drug target. Like I mentioned before, there’s more to this reproducibility issue than simply repeating experiments and getting the same results.

JS: In biology people often talk about technical reproducibility and conceptual reproducibility. And technical reproducibility, I think means — doing everything step by step in an exact same manner and then coming out with the same results — and that’s of course very, very important for the biological literature. But one step beyond that is conceptual reproducibility, which is taking a concept or a conclusion demonstrated by an experiment and then showing that you can come to the same conclusion using a different approach.

AA: And getting to conceptual reproducibility by using different approaches to answer the same question is important, because repeating the same experiment over and over again can only get you so far.

JS: With technical reproducibility, if there is some flaw in the technique, if you use a chemical that’s not specific or if there’s some error in the protocol, well if you do the same protocol ten times the exact same way with the exact same error, you’re gonna get the same result each time but that doesn’t mean your conclusion is correct.

AA: In fact, the scientists who did this MELK research did test its effectiveness as a drug target with two different methods, so they even achieved a level of conceptual reproducibility.

JS: But sometimes in science you can answer a question using two different techniques and get the same answer but you pull a third in and the third gives you a different result, and science has to be internally consistent, and in this case it wasn’t.

BS: Ok, then what did Jason’s team do differently from the scientists who had done all of that earlier research showing that MELK is a promising drug target?


BS: Ah, that is new—relatively at least. We’ve talked about this tool called CRISPR in a couple of our previous episodes because it has had an enormous impact on biological research in the few years since it’s become widely available. CRISPR—that’s C-R-I-S-P-R—is a gene editing tool that enables scientists to make changes to the genome more precisely than ever before.

AA: Which is great! But that also means the best technology that scientists had at their disposal before CRISPR was not as precise. That doesn’t mean that the older technology was useless—far from it. Jason told me about the pre-CRISPR technology that scientists used in the earlier MELK research.

JS: As a cancer researcher, we try to investigate cancer genomes in different ways and some of the previous ways that have been very popular and in many cases very, very effective have involved a technique called RNA interference.

[MT- explainer]

AA: The whole idea of RNA interference was a really big deal when scientists discovered it in the late 1990s. Earlier on, the thinking was that RNA was little more than a messenger for DNA, the molecule that carries the entire genome. But RNA interference showed that a cell’s RNA often helps control which genes actually get used. It can “interfere” with the process of making proteins based on particular genes, and it does that by binding to the messenger RNA copies of those genes.

BS: It basically turns the volume on a gene down, and that’s a really useful way to learn about what a gene does. It’s a way of learning by subtraction, you might say. When one element and only one is altered, is there any difference in the organism or cell? Scientists—including Professor Greg Hannon, who was then at CSHL and is now at the Cancer Research UK Cambridge Institute—figured out a way to tap into the cell’s RNA interference system and target specific genes they were interested in. That way, they can see what cells do without that gene.

AA: Super useful. Learned a lot with it. But—

JS: Unfortunately, it also has off-target effects in some cases. And you can try to block the expression of one gene, and you end up blocking the expression of another.

BS: Off-target effects are exactly what they sound like, and they can really throw off an experiment. It can be very hard to draw the right conclusion when you change more than one thing at the same time, especially when you don’t even realize it’s happening.

AA: CRISPR produced such a different result because you can target a gene much more precisely.

JS: With CRISPR, one thing that we were able to do is we were able to generate cancer cells that totally lacked MELK expression. They had a deletion in part of the genome where MELK is encoded, so they have no MELK left whatsoever. So if you have a drug that targets MELK, and then you take a cell line that has no MELK, you would expect that cell to be resistant to that drug. We found exactly the opposite. The cells, which were MELK knockout, which totally lacked MELK expression, still remained totally sensitive to the MELK inhibitor that’s being given to cancer patients.

BS: Oh, that’s a relief! The drug still killed the cancer cells, just not the way that scientists thought it does.

AA: Right. Cancer patients may still benefit from the drug, even if no one knows exactly why. All we know now is that whatever it DOES, it doesn’t do it by targeting MELK! In any case, Jason has reached out to the physicians involved in that clinical trial about his team’s MELK findings, and has been in touch with some of them via email.

BS: Well, cells that don’t have MELK at all definitely shouldn’t respond to a drug that targets MELK. That sounds like compelling evidence. But, if Jason and his team already invalidated MELK as a cancer drug target, what is this new paper about?

AA: Even though the first paper was pretty strong evidence that MELK was not a cancer drug target, they were still skeptical.

JS: There were a number of caveats and limitations to the work that we did.

AA: They had still only looked at cancer cells in a dish, not an actual organism.

BS: Experiments done on cells in a dish are really useful, but sometimes cells behave very differently when they’re part of a full, living body.

AA: That was the logical next step to see if their conclusion held up.

JS: We did a number of additional — screens — including what’s called in vivo work, doing experiments in mice instead of just in a Petri dish, where we continued to look at MELK. And our additional experiments largely recapitulated our initial observations, which are that we can delete MELK — and the cancer cells unfortunately, continued to divide.

BS: It seems like a moment that might have at least been bittersweet, not just unfortunate. After all, their results suggested that they were right about MELK! But they didn’t really want to be right about this.

AA: Yeah, Jason was not excited to be right about the conclusions from earlier experiments being wrong because…

JS: …well, because the more drug targets you have in breast cancer, I think the better it is for breast cancer patients.

AA: But being a scientist means you have to go with what the evidence tells you. That’s what the scientific process is all about, and the scientific process is really what science is. Scientists like Jason want to find ways to stop cancer, but they have to make decisions based on evidence, not what they want to happen.

JN: Showing people how evidence-based thinking works with real experiences and real stories I think is important.

BS: That sounds like Jackie Novatt! And… coins clinking? Where was she?

AA: I caught up with her over tea recently here at Blackford Bar on campus, and I had the recorder on while we talked—that’s why you heard coins in the register in the background. She was a researcher here at CSHL until a little over a year ago, and now she’s pursuing teaching at Long Island University’s Pharmacy School. As I’ve been learning about this MELK research story, I keep thinking back to this one part of my conversation with Jackie. She was telling me about her experiences leading tours of the CSHL campus and telling people about the work that scientists do.

JN: I found it important to tell people about the failed experiments too, because that’s not something that you hear a lot about. And I’m sure—I don’t know if you’ve had this taxi driver, but there was one at Rockefeller, there was one at my grad school, and there was one here, where you get the taxi driver that hears you’re going to the Lab and then berates you for sitting on the cure for cancer and then hiding it because we all want money and we want to control the world.

AA [in recording]: I knew this story before you even told it, because I’ve had the same experience.

JN: We’ve all had that experience. And the thing is, people truly believe that because we’ve been fighting the war on cancer for a long time and a lot of money has gone into it, and why the heck don’t we have a cure yet? And the reason is, it’s really hard and it’s really complicated and a lot of experiments fail. And if we only communicate that A leads to B leads to C leads to this beautiful conclusion, then why the heck haven’t we cured cancer yet? So, I think it’s really important to communicate the failures as well so that people see science as a process, not as an endpoint.

BS: It’s heartbreaking to hear this kind of misconception about the power of science.

AA: It really is, because the root of it is the belief that science is powerful, which is true. But if you are a busy person who is just catching the headlines, you could get misled about what the power of science is—where it really comes from.

JS: Lots of scientific discoveries get boiled down to, oh this is a cure for Alzheimer’s, oh this is a cure for cancer, oh this is a cure for heart disease. But in many instances what’s actually been discovered in the lab is insight into a biological process, is the discovery of a gene that might be important in a particular disease, the finding that a drug in a cell line model or in a mouse model has a moderately beneficial effect. But oftentimes in the translation from what was actually discovered in the laboratory to how it’s reported in, say, the newspaper or on a website, you can lose a lot of the detail and you can lose a lot of the subtlety.

BS: Those headlines can make it sound like the science is done, or we’ve reached the “endpoint,” as Jackie put it. But in reality, science reveals answers bit by bit. We always need more because that’s the only way that science can self-correct, like it did with the research on MELK.

AA: Exactly. Now, scientists know that the secret behind that drug’s ability to kill cancer cells is not MELK, but something else. Understanding what is really allowing the drug to kill cancer cells is really valuable knowledge, because it helps researchers design related drugs or fine-tune existing ones. This story shows why scientists have to remain skeptical. Even when science brings us exciting things, like new potential treatments for cancer, there is always more to learn.

IO: when science works, it is absolutely, there’s no question, it’s the best way to understand the world …

AA: That’s Ivan Oransky of Retraction Watch, from the top of the show.

IO: but I will also challenge those aspects of the scientific endeavor—the human endeavor which is science—I will challenge that to be as good as I know that everyone wants it to be.

BS: And we wouldn’t have it any other way! … But what does Richard Harris think about all this? After all, his book “Rigor Mortis” dives into many other causes of error and irreproducibility that we didn’t get to explore in this episode.

RH: Science is a matter of trial and error. We learn a little bit and we make an observation. We do our best to interpret those observations but then when we get more information or deeper insights or better tools, we realize, you know, we didn’t quite understand everything as thoroughly as we thought and so we improve our knowledge and our understanding of science.

BS: That’s all, folks! Thanks, Rich and Ivan.

AA: And thanks to Jason and Jackie. Musicians in this episode include Broke For Free, Podington Bear, Lee Rosevere, Ketsa, The United States Army Old Guard Fife and Drum Corps, and—as always—the Blue Dot Sessions.

BS: We’ll be back next month with another new episode, but in the meantime, we’d love it if you’d review us on iTunes and tell us what you think of the show!

AA: We’re coming to you from Cold Spring Harbor Laboratory: a private, not-for-profit institution at the forefront of molecular biology and genetics. If you’d like to support the research that goes on here, you can find out how to do that on our website. And while you’re there, you can check out our Newsstand, which showcases our videos, photos, interactive stories, and more.

BS: And if that’s not enough, you can always pay us a visit! Between our Undergraduate Research Program, high school partnerships, graduate school, meetings & courses, and public events, there really is something for everyone.

AA: I’m Andrea.

BS: And I’m Brian.

AA: And this is Base Pairs. More science stories soon!

Explore more