Author’s Notes: I’ve updated the original post to list the news agencies that reported on this as if their audiences should accept it as fact. I only selected news agencies with a national reach or an ostensibly scientific mission – those that have the resources to know better and be more critical in reporting “emerging medical research.” I have also added a list of news agencies that got it right – they critically and skeptically appraised the claim in the larger context of markers of addiction, study design, etc.
I have also edited my comment on blinding to indicate that it’s unclear whether they used it, and the fact that it’s not mentioned is a red flag.
I also added a comment on the misuse of reasoning in drawing the conclusion about cocaine addiction.
I know that some people object to my title – however, it’s intentionally provocative to indicate that failing to consider rival causes can lead you down a potentially wrong path when drawing conclusions from data. Seriously – what if the rats just found the rice cakes disgusting and derived an inevitable and strong reward from a food source that wasn’t disgusting?
I saw this headline all over the place today – here is one representative example:
Oreos May Be As Addictive As Cocaine; That stuff is addictive [1]
Here is another one:
Addicted to Oreos? You truly might be. [2]
Wow! Is that really what this study found? Nope. The only thing these researchers demonstrated is that, given a choice between Oreo cookies and rice cakes, rats choose Oreo cookies. Is that really a surprise? Let’s take a closer look.
The Study
The study was conducted by a group of Connecticut College students supervised by a professor of psychology [3]. The two students are undergrads – they are not trained or credentialed scientists, but they are certainly in training to be scientists. This is excellent – that students are learning to construct experiments, gather data, and analyze results – but one should be cautious in just accepting the conclusions of their work. Their work has not been peer reviewed, nor has it yet been presented at a conference and subjected to feedback from the community.
They trained the rats to run a maze. On one side of the maze they offered the rats rice cakes, and on the other they offered them Oreos. They compared the amount of time the rats preferred to spend on the different sides of the maze. They then compared those data to data from a separate trial that measured how much rats prefer a cocaine injection to a saline injection. You can already see the design flaws here. Why rice cakes? Why not something rats normally like to eat? Or, for that matter, if you’re going to compare to cocaine . . . why not cocaine? You can begin to pick apart the premise of this experiment in a heartbeat.
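For concreteness, here is a minimal sketch of what the behavioral comparison described in the press coverage probably amounts to: a per-rat preference score (extra time spent on the rewarded side), compared across the food pairing and the drug pairing. Every number and variable name below is invented for illustration – the actual data are not public – and the point is that such a score measures relative preference, nothing more.

```python
import numpy as np
from scipy import stats

# Hypothetical per-rat times (seconds out of a 600 s test session) spent on
# the side of the maze paired with each reward. These numbers are invented
# for illustration; the study's actual data are not public.
oreo_side      = np.array([410, 455, 390, 470, 430])   # Oreo-paired side
rice_cake_side = 600 - oreo_side                       # remainder of the session

cocaine_side   = np.array([420, 400, 465, 445, 410])   # cocaine-paired side
saline_side    = 600 - cocaine_side

# A conditioned-place-preference-style score: extra time spent on the
# "rewarding" side relative to the alternative.
oreo_pref    = oreo_side - rice_cake_side
cocaine_pref = cocaine_side - saline_side

# Comparing the two scores only says the pull of Oreos-vs-rice-cakes looks
# like the pull of cocaine-vs-saline in this assay; it measures relative
# preference, not addiction.
t_stat, p_val = stats.ttest_ind(oreo_pref, cocaine_pref)
print(f"Oreo preference: {oreo_pref.mean():.0f} s, "
      f"cocaine preference: {cocaine_pref.mean():.0f} s, p = {p_val:.2f}")
```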
So the data gathering is fatally flawed. What makes it worse is the conclusion drawn by the faculty member supervising this work:
“Our research supports the theory that high-fat/high-sugar foods stimulate the brain in the same way that drugs do,” Schroeder said. “It may explain why some people can’t resist these foods despite the fact that they know they are bad for them.”
Professor Schroeder fails to consider rival causes – the research just as readily supports the conclusion that Oreos are more attractive to the rat palate than rice cakes are. So . . . how did they draw the cocaine-related conclusion? The students compared a marker indicating neural activity in response to stimuli and checked it against the presence of that marker in a cocaine-addicted rat brain.
[The students] used immunohistochemistry to measure the expression of a protein called c-Fos, a marker of neuronal activation, in the nucleus accumbens, or the brain’s “pleasure center.” – “It basically tells us how many cells were turned on in a specific region of the brain in response to the drugs or Oreos,” said Schroeder. They found that the Oreos activated significantly more neurons than cocaine or morphine.
What’s unclear is: were the studies of the chemical markers done blind (i.e., did the students KNOW that they were analyzing brain chemistry from Oreo-conditioned rats)? Did the students bother to see if ANY conditioning – using Oreos, or cocaine, or any other pair of choices – led to the same brain chemistry? This whole thing seems like a really bad oranges-to-apples comparison.
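For what it’s worth, blinding an analysis like this is cheap. Here is a toy sketch of one way the counting step could be run blind – every rat ID, group label, and count below is hypothetical, and the press release gives no indication of whether anything like this was actually done.

```python
import random

# A toy blinding scheme for the c-Fos counting step. Every rat ID, group
# label, and count below is hypothetical.
random.seed(0)

samples = [(f"rat{i:02d}", group)
           for i, group in enumerate(["oreo"] * 4 + ["rice_cake"] * 4 + ["cocaine"] * 4)]

# A third party assigns opaque codes and holds the key until counting is done.
codes = random.sample(range(100, 999), len(samples))
key = dict(zip(codes, samples))

# The analyst receives only the codes, never the conditioning group.
blinded_worklist = sorted(codes)

# The analyst records c-Fos-positive cell counts per code (simulated here).
counts_by_code = {code: random.randint(50, 150) for code in blinded_worklist}

# Only after every count is locked in is the key merged back for comparison.
for group in ("oreo", "rice_cake", "cocaine"):
    vals = [n for code, n in counts_by_code.items() if key[code][1] == group]
    print(group, sum(vals) / len(vals))
```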
Is this reliable science?
Nope. The science reporting on this has been terrible, to boot. I applaud students getting into research, but I find it deplorable that their school issued a press release about unreviewed research and treated it like gospel.
This, as presented, is not reliable science because:
- It’s unclear that the study actually compared the things that it then claimed to compare in the conclusion – Oreos and cocaine. It merely trained rats to prefer one of two choices and then looked at their brain chemistry for markers of addiction. That’s nice, but it’s not at all supportive of their conclusions.
- Regarding the conclusion, the reasoning applied here rests on faulty logic. One can frame the independent observation – that addiction stimulates the reward centers of the brain – as a conditional: “If a rat is addicted, then the pleasure center of its brain will be stimulated.” One can then ask whether the study’s conclusion affirms the antecedent or the consequent. The professor in charge of the study correctly concludes, “Based on the data, the rat brain pleasure center was stimulated.” The next step in the reasoning, however, is wrong. The professor then concludes, “Since the rat brain pleasure center was stimulated, the rat was addicted to Oreos.” That is an invalid form of reasoning called “affirming the consequent.” For instance, consider the conditional “If it is a car, it has wheels.” Now affirm the consequent: “I observe something with wheels.” One cannot then conclude, “This thing I observe is a car.” It might be a skateboard, or a cart, or something else with wheels. Just because it has wheels doesn’t make it a car, and just because the rat brain pleasure center is stimulated doesn’t mean the only possible cause is addiction (see the short logic sketch after this list).
- Addiction is more than just preferring A over B – it also involves withdrawal symptoms and a host of other associated risks and problems. Did the students then withdraw Oreos and observe whether or not the rats exhibited the same withdrawal behaviors as cocaine-addicted rats?
- There was no reported attempt to blind the study – that raises a red flag. At the very least, the authors should have indicated key features of their study in the press release, since there is no supporting scientific documentation. Students analyzing brain chemistry should NOT know which rat brains – Oreo-conditioned or non-conditioned – they are measuring. They should have been asked to see if there were any populations present in their data; if they identified two, they should then have determined whether those populations corresponded to the Oreo-conditioned and normal rats, and whether the Oreo-conditioned rats were then comparable to cocaine-addicted rats (a toy version of such a blind analysis is sketched after this list). It will be interesting to see whether or not they conducted the study blind, to remove bias.
- The statistics are unclear. How many rats? What are the errors on their chemical measurements? Did they even bother to assess uncertainty?
- It’s unreviewed. It’s unverified.
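To make the “affirming the consequent” point above concrete, here is a tiny, self-contained truth-table check. It exhausts all truth assignments and shows that the valid form (affirming the antecedent, i.e. modus ponens) has no counterexample, while the form used to reach the addiction conclusion does.

```python
from itertools import product

# Exhaustive truth-table check of the two argument forms contrasted above.
# An argument form is valid only if NO truth assignment makes every premise
# true while the conclusion is false.
def implies(p, q):
    return (not p) or q

def counterexamples(premises, conclusion):
    return [(p, q) for p, q in product([True, False], repeat=2)
            if all(f(p, q) for f in premises) and not conclusion(p, q)]

# Affirming the antecedent (modus ponens): "P -> Q" and "P", therefore "Q".
print(counterexamples([implies, lambda p, q: p], lambda p, q: q))  # [] -> valid

# Affirming the consequent: "P -> Q" and "Q", therefore "P".
# The counterexample P=False, Q=True is the "it has wheels, but it's a
# skateboard, not a car" case.
print(counterexamples([implies, lambda p, q: q], lambda p, q: p))  # [(False, True)]
```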
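And here is a toy version of the blind population analysis suggested above: cluster the coded c-Fos counts without labels, unblind afterward, and report each group’s mean with an uncertainty. All counts, group sizes, and the choice of clustering tool are my own invention for illustration; nothing in the press release says how (or whether) the students did any of this.

```python
import numpy as np
from scipy import stats
from sklearn.cluster import KMeans

# Toy version of the blind population check: cluster coded c-Fos counts with
# no knowledge of the groups, then unblind and see whether the clusters line
# up with conditioning. All counts and group sizes are invented.
rng = np.random.default_rng(0)
oreo_counts    = rng.normal(120, 15, size=8)   # hypothetical c-Fos+ cells, Oreo rats
control_counts = rng.normal(80, 15, size=8)    # hypothetical non-conditioned rats

blinded     = np.concatenate([oreo_counts, control_counts]).reshape(-1, 1)
true_labels = np.array([1] * 8 + [0] * 8)      # withheld until after clustering

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(blinded)

# Unblind: do the data-driven clusters correspond to the real groups?
# (Cluster numbering is arbitrary, so check both label assignments.)
agreement = max(np.mean(clusters == true_labels), np.mean(clusters != true_labels))
print(f"cluster/group agreement: {agreement:.0%}")

# Report each group's mean with an uncertainty - the kind of number the
# press release never gives.
for name, vals in [("Oreo", oreo_counts), ("control", control_counts)]:
    print(f"{name}: {vals.mean():.0f} ± {stats.sem(vals):.0f} c-Fos+ cells (SEM, n={len(vals)})")
```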
At best, this study concludes that rats, like humans, prefer Oreos over rice cakes. Surprise. Rice cakes taste like shit.
News Sources That Should Have Been More Critical and Skeptical But Were Not:
Forbes: http://www.forbes.com/sites/alicegwalton/2013/10/16/why-your-brain-treats-oreos-like-a-drug/
Christian Science Monitor: http://www.csmonitor.com/Science/2013/1016/Oreos-addictive-Rats-treat-Oreos-like-cocaine-study-suggests
Discovery News: http://news.discovery.com/human/health/oreo-cookies-as-addictive-as-cocaine-131016.htm
ABC News: http://abcnews.go.com/Health/oreos-addictive-cocaine/story?id=20590182
Hartford Courant: http://articles.courant.com/2013-10-15/news/hc-oreo-addictive-rats-1016-20131015_1_rice-cakes-one-experiment-humans
LA Times: http://www.latimes.com/food/dailydish/la-dd-oreo-cookies-addictive-cocaine-20131016,0,3166408.story
TIME: http://newsfeed.time.com/2013/10/16/oreos-may-be-as-addictive-as-cocaine/
CBS News: http://www.cbsnews.com/8301-204_162-57607785/oreos-may-be-as-addictive-as-cocaine-morphine/
Chicago Tribune: http://www.chicagotribune.com/business/technology/chi-nsc-why-your-brain-treats-oreos-like-a-drug-20131016,0,1494058.story (regurgitated the Forbes story – for shame)
News Venues That Were More Critical and Skeptical:
Live Science: http://www.livescience.com/40488-oreos-addictive-cocaine.html
[1] http://newsfeed.time.com/2013/10/16/oreos-may-be-as-addictive-as-cocaine/
[2] http://www.today.com/health/addicted-oreos-you-truly-might-be-8C11399682
5 thoughts on “New study finds that mice agree with humans – rice cakes taste like shit”
Well, that study has now been reviewed. By you, Dr. Sekula. Hopefully the experimenters will learn from this review, and improve the methodology of their next experiment. That’s how science is supposed to work, isn’t it?
Of course, publishing results before a peer review is not the right thing to do. But thinking of the entire sequence of badly constructed experiment, erroneous conclusions, and inaccurate publicity as a teaching moment can certainly improve the students’ education as scientists.
–Bob, who happens to like rice cakes just fine (but not as much as Oreo cookies).
A blog is not scientific peer review; I am not a research psychologist, and the only fair scientific assessment of their work at the peer level would be by published, trained, and practicing research psychologists, biochemists, etc. However, as a teacher and practitioner of the scientific method, I can certainly assess the reported methods and whether or not the conclusions suit the data gathered. From a purely pedagogical point of view, the conclusions drawn from the experiment overreach the data. As a person who does some work at the interface of science and policy, and who works with communicators to facilitate the communication of science, I can definitely state that the public affairs folks at Conn College vastly overstepped the bounds of good journalism when reporting the results of this study. And I can definitely state that the science reporting in the media at large is deficient in its ability to determine what is good science and what is nonsense, as revealed by the small media circus yesterday. If anything, the publicity of this study says more about the sorry state of U.S. science reporting than it does about the work these students actually did.
While I agree that the behavioral part of the experiment and the discussion of addiction are very poor, I feel like you’ve jumped to negative conclusions about the blinding of the experiment. It doesn’t make much sense to criticize them for not blinding the experiment on the basis that a bunch of pop-news sources didn’t feel the need to mention it. Drawing conclusions from ignorance is just a way to make yourself angry.
What the pop-news coverage misses – and what is really damning to the experimenters’ claims – is that addiction is still a wide-open field in a lot of ways (the largest recent review on addiction noted that it’s still a struggle to find an agreeable definition of addiction under which we can’t say “MLK was addicted to marching” or reach an equally unacceptable conclusion). There are cognitive and even econometric models of addiction that make very little sense to apply to rats, for example, which makes it hard to say that any rat experiment could show human-like addiction. Biomarkers have an appeal, but honestly I don’t see them as a catch-all solution to this kind of thing, since the premise that addiction is some kind of discrete neurological event is fairly absurd.
I drew the conclusion about blinding/not blinding from the Conn College press release, not from popular news sources. They may have blinded. They may not. The fact that it’s not mentioned is a red flag. It’s always a red flag when process is not mentioned, since science is a process – not a set of conclusions – and the only thing that matters in science to the integrity of the result is the integrity of the process.
It always makes sense to criticize “emerging medical science” reporting for failing to indicate the quality and key methodologies of the study. Since this is unpublished, there is no way at all of assessing the study itself – one can only assess the statements made by the researchers through their own institution’s communications office. That is worth criticizing. Publicizing and popularizing a study without supporting documentation to allow the scientific community to assess the claim is the weakest form of science reporting, and it was irresponsible on the part of the faculty member who was interviewed for the press release. I look forward to this actually being made public so I can comment on the methods; here, I can only raise the red flags that the science press should have raised before blindly reporting on the results.
I should also note, by the way, that I actually like rice cakes. However, in a taste test, I’d take anything else over rice cakes . . . except maybe natto, rocky mountain oysters, geoduck, . . . ok, like, a whole bunch of other things that are way worse to my own psychological conditioning than rice cakes. 🙂