
Friday, October 01, 2010

Cymbalta and Effexor: Hype Over Science

Remember the hype around the serotonin-norepinephrine reuptake inhibitors (SNRIs)? Effexor and Cymbalta impact both serotonin and norepinephrine, so they should be more effective than SSRIs in treating depression, right? Mind you, that’s not a high bar to clear - it’s not like SSRIs are much better than placebo. So get the hell outta the way, Prozac and Paxil, because Cymbalta and Effexor will unleash their incredible efficacy onto the world of psychiatry. Doubt me? Read this 2009 article regarding the wonders of Pristiq (son of Effexor) and learn about how “The emergence of the selective serotonin reuptake inhibitor (SSRI) and serotonin norepinephrine reuptake inhibitors (SNRI) antidepressants has improved the treatment of MDD.” Or this press release from Wyeth. Or Dr. Danny Carlat’s experience selling Effexor to his peers. I don’t think anyone who has followed drug marketing would deny that both Wyeth and Lilly tried to pimp Effexor and Cymbalta as working better because of their SNRI properties.

But is that actually true? A team of German researchers examined the data and concluded that neither Effexor nor Cymbalta really work better than SSRIs. They actually found a small advantage for Effexor over SSRIs for treatment response (but not depression remission), but they also found that the manufacturer was hiding studies from them (and the rest of the world). I haven’t said this for a while, but enter Charles Nemeroff. To understand the research by the Germans, we first need to recall that a 2008 study (lead author: Nemeroff) found

...the pooled effect size across all comparisons of venlafaxine versus SSRIs reflected an average difference in remission rates of 5.9%, which reflected a NNT of 17 (1/.059), that is, one would expect to treat approximately 17 patients with venlafaxine to see one more success than if all had been treated with another SSRI. Although this difference was reliable and would be important if applied to populations of depressed patients, it is also true that it is modest and might not be noticed by busy clinicians in everyday practice. Nonetheless, an NNT of 17 may be of public health relevance given the large number of patients treated for depression and the significant burden of illness associated with this disorder. [my emphasis]


As I wrote then, the claim of public health benefit is ridiculous. To understand why it is so laughable, please check out my prior post on the topic. This meta-analysis included a bunch of data from Wyeth that was previously unpublished...
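To make the NNT arithmetic in the quoted passage concrete, here is a minimal sketch in Python. The 5.9% difference in remission rates is the figure quoted above; the NNT is simply its reciprocal:

```python
def nnt_from_risk_difference(absolute_risk_difference: float) -> float:
    """Number needed to treat = 1 / absolute risk difference."""
    if absolute_risk_difference <= 0:
        raise ValueError("No advantage for the drug; NNT is undefined.")
    return 1.0 / absolute_risk_difference

# The 5.9% pooled difference in remission rates quoted above:
print(round(nnt_from_risk_difference(0.059)))  # -> 17: treat ~17 patients for one extra remission
```

Put another way, roughly 16 of every 17 patients switched to venlafaxine would have done just as well (or as poorly) on an SSRI, which is why the "public health relevance" framing is such a stretch.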

Which leads to the freshly published meta-analysis on how Effexor compares to SSRIs. The German researchers requested unpublished data from Wyeth and only got some of it - you’d think that just maybe Wyeth sent them the “good news” data and held back some of the “bad news” data. So when an ever-so-small benefit emerged for Effexor (a 5% higher treatment response rate), well, call me crazy, but I ignored it. We’re not playing with a full dataset because the manufacturer wants to keep some of it hidden, so shame on Wyeth and let’s look at Effexor with a little bit of suspicion. So Effexor vs. SSRIs - no difference. Except that more people drop out of clinical trials on Effexor due to side effects compared to SSRIs (about 3% more). So even if you believe that Wyeth’s hidden data really don’t impact these findings, we’re left with a very small advantage for Effexor that is probably negated by its slightly higher dropout rate.
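Running the same arithmetic as the sketch above on the new meta-analysis's two small percentages (a back-of-the-envelope reading of the numbers above, not the authors' own calculation) makes the trade-off plain:

```python
# Same arithmetic as the NNT sketch above, applied to the German meta-analysis's figures.
response_advantage = 0.05  # ~5% higher treatment response rate on Effexor vs. SSRIs
extra_dropouts = 0.03      # ~3% more dropouts due to adverse events on Effexor

print(round(1 / response_advantage))  # NNT ~ 20: treat 20 patients to gain one extra responder
print(round(1 / extra_dropouts))      # NNH ~ 33: treat 33 patients to cause one extra dropout
```

In other words, for every hundred patients switched to Effexor you might pick up about five extra responders while losing about three more to side effects - which is exactly why the tiny advantage looks like a wash.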

Cymbalta. It had a 3% higher dropout rate due to adverse events and the same efficacy as SSRIs. So nothing to write home about, except that it costs a boatload more than generic SSRIs and is harder to tolerate. But Cymbalta has been marketed to the gills and is clearing $3 billion a year in sales. Hey, this is the company that marketed Zyprexa for dementia (oops), and for, well, lots of other stuff (1, 2). So it’s not surprising at all that they can take a mediocre antidepressant like Cymbalta and turn it into a big moneymaker - the wonders of a good marketing department. But Depression Hurts and Cymbalta is a painkiller. Well, that’s fine and dandy until you actually look at the data, which show Cymbalta doesn’t do much for pain in depression.

It’s time to get over the hype surrounding SNRIs. The next “advance” in antidepressants, well, who knows what it will be - but let’s hope it’s something a little more substantial than SNRIs. But I’m not hopeful. And no, I don’t want to hear anything more about agomelatine.

I know it’s been a long time between posts. So pardon me if my writing is more awful than usual. And it doesn’t mean I will be posting regularly. Thanks to the multiple readers who sent me a copy of this article.

Citation to new meta-analysis of Effexor and Cymbalta:
Schueler, Y., Koesters, M., Wieseler, B., Grouven, U., Kromp, M., Kerekes, M., Kreis, J., Kaiser, T., Becker, T., & Weinmann, S. (2010). A systematic review of duloxetine and venlafaxine in major depression, including unpublished data. Acta Psychiatrica Scandinavica. DOI: 10.1111/j.1600-0447.2010.01599.x

Thursday, September 25, 2008

The Cymbalta Schatz-Storm: Duplicate Publication and Lying by Omission


This post details the duplicate publication of data on the antidepressant duloxetine (Cymbalta). Marketing and "science" collide to produce hideous offspring: an experimercial that pimps Lilly's bogus "Depression Hurts" marketing for Cymbalta using the exact same (weak) data twice. Data were published in the Journal of Clinical Psychiatry (JCP), and then the same data were published a second time in the Journal of Psychiatric Research (JPR), a blatant violation of JPR policy. Oh, and Alan Schatzberg, president-elect of the American Psychiatric Association, is involved in the story.

The study: Lilly conducted a rather uninteresting study of Cymbalta, in which patients who had not shown a treatment response to an SSRI were then assigned to either a) Direct switch: switch to Cymbalta and immediately discontinue the SSRI medication or b) Start-taper-switch: taper the SSRI over a 2-week period while also starting Cymbalta. Note that there was no control group of any sort, an issue that the authors dance around (i.e., essentially ignore) in the papers based on this study's data.

Publication #1 -- Journal of Clinical Psychiatry: Data from this study were published in the January 2008 issue of the Journal of Clinical Psychiatry. The findings were that, in essence, there were no notable differences between patients who were directly switched to Cymbalta as opposed to those who did the start-taper-switch method. But what do the authors conclude?

Despite the lack of a control group, the authors get the message out that not only was depression improved, so were "painful physical symptoms." As anyone who has a television has probably noticed, Lilly has been pushing hard for quite some time to convince patients and physicians that Cymbalta will relieve depression and pain in depressed patients. So if the marketing points can be pushed in one journal, why not pimp the same idea using the same data in another journal?

Publication #2 -- Journal of Psychiatric Research: Data from the same study were published online (to appear in print soon) in the Journal of Psychiatric Research (JPR). And I mean the exact same data appear again in this paper. This is a huge scientific no-no. Findings are supposed to be published once, not over and over again. Journals are struggling to find space for new and interesting findings, so there is no need to waste space on duplicate data. In fact, to quote from JPR's website:
Submission of a paper to the Journal of Psychiatric Research is understood to imply that it is an original paper which has not previously been published, and is not being considered for publication elsewhere. Prior publication in abstract form should be indicated. Furthermore, authors should upload copies of any related manuscript that has been recently published, is in press or under consideration elsewhere. The following circumstances indicate that a paper is related to the manuscript submitted to the Journal: a) any overlap in the results presented; b) any overlap in the subjects, patients or materials the results are based on.
So it's pretty clear -- don't submit data that has already been published. Yet the figure from the Journal of Clinical Psychiatry (JCP) article mentioned above reappears, point for point, in JPR. And that's just the beginning. The data tables tell the same story: the JCP tables (including the right-side half of one of them) show up with the exact same numbers in JPR.
To be fair to these "researchers" in JPR, they reported data from subscales of two measures not reported in JCP. But the vast majority of the data is just reprinted from the article in JCP, which completely flouts journal policy and, more importantly, conveys Lilly's marketing messages to the audiences of two different journals. Unfortunately, they apparently did not consider that some people might actually read both journals and notice that essentially the same article had appeared twice. Or, Lilly considered this prospect and said, "Who cares." I'll leave it to my readers to decide if they care.

Authors: The JCP paper was authored by David Perahia, Deborah Quail, Derisala Desaiah, Emmanuele Corruble, and Maurizio Fava. The JPR paper was "authored" by Perahia, Quail, Desaiah, Angel Montejo, and Alan Schatzberg. So to re-publish the same data, it was out with Corruble and Fava -- in with Montejo and Schatzberg. Why Schatzberg? We're almost there...

JPR describes the contributions of each author. The two authors who were not credited on the JCP paper (Schatzberg and Montejo) were both described as "involved in data review and interpretation, including the development of this manuscript." How could they have been meaningfully involved in data review and interpretation when the vast majority of the data had already been analyzed, interpreted, and written up by other researchers in the JCP paper? Did they write the paper? Apparently not, since the JPR article mentioned that "Dr. Desaiah worked with Dr. Perahia to draft the manuscript..." So Montejo and Schatzberg could not conceivably have played any significant role in data analysis, interpretation, or writing the paper. If Desaiah and Perahia "drafted" the manuscript, then the most Montejo and Schatzberg could have done is review it.

So why is Schatzberg on the paper? Well, it just so happens, I'm sure by sheer coincidence, that Schatzberg is the co-editor in chief of JPR. So he'd be in a good position to shepherd into publication, against his own journal's policy, a paper that essentially republishes data from JCP with only minor additions.

Nice work, Schatzberg. That's pimpin' it hard. That, my friend, is worthy of nomination for a coveted Golden Goblet Award. Congratulations. It is not the first time Schatzberg's "scientific" behavior has been noted. He has been stumping (in the face of much contradictory data) in favor of his pet drug RU-486/Corlux in the treatment of psychotic depression for some time. Between the bad science surrounding Corlux and Schatzberg's myriad conflicts of interest, much has been written (1, 2, 3, 4, 5) -- add another chapter to the chronicles of the storied American Psychiatric Association leader. This reminds me of an earlier incident involving Charles Nemeroff.

Discussion: As I've noted previously, the discussion section of a journal article often contains key marketing points, science being relegated to secondary status at best. The JPR article provides a few good examples of Cymbalta's talking points:
The current paper focuses on pain-related outcomes, demonstrating that a switch of SSRI non- or partial-responders to duloxetine was associated with a significant improvement in all pain measures including six VAS pain scales, the SQ-SS and its pain subscale, and the SF-36 bodily pain domain.

Switch of SSRI non- and partial-responders to duloxetine resulted in mean improvements on all pain measures regardless of the switch method used.

Duloxetine, an SNRI, has previously been shown to be effective in the treatment of PPS associated with depression, and it is also effective in the treatment of chronic pain such as diabetic peripheral neuropathic pain (DPNP) for which it is approved in the US, Europe and elsewhere, so duloxetine’s effects on pain in our sample of SSRI non- or partial-responders was not unexpected.

Patients with MDD present with a broad range of symptoms including those related to alteration of mood and PPS, all of which may contribute to global functional impairment. Effective treatment of both mood symptoms and PPS associated with depression may therefore optimize the chances of functional improvement. Recent findings that residual PPS in depressed patients may be associated with impaired quality of life (Wise et al., 2005, 2007), decreased productivity and lower rates of help seeking (Demyttenaere et al., 2006) and a lower likelihood of attaining remission (Fava et al., 2004), further demonstrate the importance of effective treatment of PPS in patients with MDD, so duloxetine’s effects on PPS are reassuring.

Improvements in pain are consistent with previously reported studies demonstrating duloxetine’s efficacy for pain, either as part of depression, or as part of a chronic pain condition such as DPNP.
Where do I start? How about by mentioning that JPR states:

7. Discussion: The results of your study should be placed in the appropriate context of knowledge, with discussion of its limitations and implications for future work.
So maybe if there were research that questions Lilly's talking points about Cymbalta relieving pain in depression, such research should be discussed. Well, it just so happens that there is such research, which analyzed Lilly's own clinical trials and found that Cymbalta was no better than a placebo or Paxil in treating pain in depression. This meta-analysis of Cymbalta trials was published in January 2008, yet the JPR article, which was originally received by JPR on March 26, 2008, did not mention the negative data. Hmmm, that doesn't exactly sound like placing the findings "in the appropriate context of knowledge," does it? All this talk about Cymbalta's fantastic analgesic effects despite Lilly's own data showing that Cymbalta is at best close to useless in treating pain among depressed patients. Another study that claimed to show Cymbalta was a helluva painkiller was also smacked in a letter to the editor a few months ago -- and the authors of the Lilly-sponsored trial conceded defeat by refusing to reply to the critiques of their study.

Better Than "Weak" SSRIs (Not Really): In the JPR study, it was mentioned that the evidence for SSRIs in treating pain is "weak." No disagreement on my end. But see, once SSRI patients switched to Cymbalta, their pain magically went away because Cymbalta, unlike SSRIs, relieves pain. Never mind the lack of a control group, which was allotted a grand total of 15 words in the discussion as a potential limitation of the study. The authors also failed to note that prior research showed Cymbalta was no better than Paxil in treating pain in depressed patients. And Perahia, the lead author of the JCP and JPR "studies," is certainly aware of that research, since he was the lead author on one such study! So he accurately describes research indicating that SSRIs are "weak" pain treatments, yet neglects to mention that Cymbalta has never been shown superior to Paxil in treating pain in depression. This is called lying by omission.

I may pass along my concerns to the Journal of Psychiatric Research. My prior experience in passing along such concerns to journals via my blog identity is that they either a) ignore my concerns entirely or b) instruct me to write a letter to the editor which would be considered for publication, with the stipulation that I use my real identity. Sorry, but a published letter to the editor is not worth blowing my cover.

Call for Action: Rather than my running into point b) from the last paragraph, how about one or more scientifically inclined readers submit their concerns to the journal, under the following condition: read the original papers first to judge whether my concerns are valid. Then, if you feel similarly, why not send a letter to the editor? This is bad science that does nothing to advance patient care -- it seeks only to advance sales of Cymbalta by pimping it as a painkiller in depression while ignoring all contradictory data. So let's try a little research of our own -- see if JPR is willing to address these issues or if they will be swept under the rug.

Reference to JPR article:

Perahia, D., Quail, D., Desaiah, D., Montejo, A., & Schatzberg, A. (2008). Switching to duloxetine in selective serotonin reuptake inhibitor non- and partial-responders: Effects on painful physical symptoms of depression. Journal of Psychiatric Research. DOI: 10.1016/j.jpsychires.2008.07.001

Update: Also see an excellent follow-up post on the topic at Bad Science.

Friday, July 25, 2008

Cymbalta Smacked Via Excellent Letter to Editor


Eli Lilly/Boehringer Ingelheim published a study claiming that duloxetine (Cymbalta) was an effective treatment for pain in depressed patients. Nothing new – they’ve run several such studies. The results were published in the Journal of Clinical Psychiatry in November 2007. Three wise readers (Jay Griffith, Joseph Hasley, and Daniel Severn) noted serious issues with the study and submitted a letter to the editor, which was then published in the June 2008 issue. They noted that the patients were unclearly described: What kind of pain were they experiencing? The study also stated that patients hadn’t been taking pain-relieving medications for six months prior to the start of the study, at least not on a “regular basis,” a term that was not defined in the paper. Don't most patients who experience serious pain take analgesic medication at least somewhat regularly? Finally, and most importantly, while the difference between Cymbalta and placebo was “statistically significant,” the difference itself was small - a point the study authors ignored. On an 11-point rating scale, the average difference in pain ratings favored Cymbalta by less than a point. Griffith and colleagues note, accurately, that this small advantage “is not robust from the standpoint of clinical practice.” I’m always glad to see that there are a few readers of medical journals who are willing to take the time to pen a good letter to the editor in which the massive inadequacies of a study are noted. If we’re going to have evidence based medicine, we might want to make sure the evidence is of somewhat palatable quality, eh?
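The letter's key distinction between "statistically significant" and clinically meaningful is easy to illustrate. The numbers in this sketch are entirely hypothetical (an assumed sub-one-point mean difference, an assumed standard deviation, and an assumed sample size - none taken from the actual trial), but they show how a difference too small to notice at the bedside can still clear the p < .05 bar once the sample is large enough:

```python
from scipy.stats import ttest_ind_from_stats

# Hypothetical numbers for illustration only -- NOT taken from the published trial.
mean_diff = 0.8  # assumed mean difference on an 11-point (0-10) pain scale
sd = 2.5         # assumed standard deviation in each treatment arm
n = 250          # assumed number of patients per arm

t_stat, p_value = ttest_ind_from_stats(mean1=mean_diff, std1=sd, nobs1=n,
                                       mean2=0.0, std2=sd, nobs2=n)
print(f"p = {p_value:.4f}")                 # ~0.0004: comfortably "statistically significant"
print(f"Cohen's d = {mean_diff / sd:.2f}")  # 0.32: a small effect by conventional benchmarks
```

Same point the letter writers made: with enough patients, a fraction of a point on a pain scale will reach statistical significance without being anything a clinician or patient would notice.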

The kicker is that the lead author of the Cymbalta study, Stephan Brecht of Boehringer Ingelheim, opted not to offer a response to Griffith et al.’s letter. So I suppose Brecht is conceding that the patient population was poorly defined and that Cymbalta’s advantage over placebo was meager. That kinda runs counter to the conclusions of the study, which claimed in part that duloxetine is an effective painkiller. Not a big surprise, given that a prior analysis also cried foul about Cymbalta’s claim to successfully treat pain in depression.

Yet Cymbalta continues to fly off pharmacy shelves. Are physicians really this poorly trained at understanding scientific literature? “Gee, the ads say that Depression Hurts and the rep handed me these journal reprints that prove it's a painkiller, so now I’m writing Cymbalta scripts like there’s no tomorrow!”

For failing to note that Cymbalta's effects over placebo were small, and for refusing to reply to the concerns about their study, I hereby nominate the authors of the November 2007 Cymbalta study (especially the lead author) for a Golden Goblet Award. Your dedication to obfuscation is notable -- keep up the bad work.

Monday, June 30, 2008

Cymbalta: Good For Whatever Ails You

I don't have time to write much on the topic, but suffice it to say that John Russell of the Indianapolis Star raises some good questions about Cymbalta, Eli Lilly's antidepressant/antianxiety/analgesic/good-for-whatever-ails-you pill. He calls it a Swiss Army Knife, which is ironic given that Lilly gave out Swiss Army Knives as part of its Viva Zyprexa campaign, likely as a reminder that Zyprexa (much like Cymbalta) was a broad-spectrum psychotropic that could be used to treat, um, a lot of things. Despite Cymbalta being touted as a cure for both depression and all sorts of different types of physical pain, once again it appears that the science has failed to live up to the marketing, at least for treating pain in depressed patients. Russell's article asks whether it is reasonable to expect that one drug could really work for so many different conditions. It's well worth a read.

Hat Tip: Furious Seasons and an anonymous reader.

Monday, May 05, 2008

In the Name of Science and Charity

Philip Dawdy at Furious Seasons has noted that Eli Lilly released a short report in which they describe the funding they provided to a variety of organizations. All in the name of science and charity, of course. Beneficiaries of Lilly's largess include:
These were just some of the big recipients. The report itself is well worth checking out. One will note that Lilly is kindly funding a lot of "education" about fibromyalgia just as they try to move Cymbalta for all things pain-related. The amount of "education" regarding bipolar disorder is also instructive. Um, Viva Zyprexa?

Read some of the details at Furious Seasons and read Lilly's report as well. To Lilly's credit, at least they are making an attempt at disclosure; their industry colleagues are more than welcome to follow suit. Remember that the figures from Lilly's report are from the first quarter of 2008 only.

Monday, January 07, 2008

Inaccurate Advertising Hurts

I'm late to the game on this post, and this material has been covered well on other sites. In case you've missed it, a recent meta-analysis indicated that the effect of Cymbalta on pain in depression relative to placebo was somewhere between nothing and minimal. This was noted on Furious Seasons, the WSJ Health Blog, and Pharmalot. According to the Pharmalot post, it also appears that Lilly has not fully disclosed all relevant data in Cymbalta's clinical trials, which contradicts Lilly's pledge to share all data openly.

This is apparently another example of how we cannot trust that pharmaceutical advertising is any more accurate than advertising for quick weight-loss programs, exercise equipment, or get-rich-quick schemes. Caveat emptor.

Props to John Mack for noting many months ago that the Depression Hurts campaign reeked of off-label marketing.

Friday, June 08, 2007

Astroturf: Welcome to the Machine


I welcomed y’all to the PR Machine yesterday, where I discussed the Drug Wonks blog and the general topic of how pro-industry speech is magnified while dissenting voices are generally muffled.

I’m not the only blogger who has noticed this trend. Philip Dawdy at Furious Seasons noted recently that Lilly, maker of Doggie Prozac (aka Reconcile), is now supporting a patient support group (Support Partners) that touts the benefits of dog ownership for people with depression. Wonderful. I wonder if this new support group will ever discuss Reconcile? Nah, too obvious, you think? We’ll see.

On the page that discusses treatment options for depression, it is stated…

Some of your questions may include the different medications used to treat depression. If you want to learn more about a medication for the treatment of depression from Eli Lilly and Company, click here

As you probably guessed, it links to the lovely Depression Hurts website. The page also states:

Therapy typically means that you spend about an hour a week talking with a mental health professional. Treatment can continue for several weeks or up to one to two years. Every person's situation is different.

What does it say about antidepressants, besides linking to Cymbalta?

Taking medication to treat depression doesn't change your personality; you'll simply start to feel better. You may begin to feel improvement in your symptoms in the first couple of weeks of taking an antidepressant. Typically, within four to six weeks, you should notice a significant improvement.

So with medication, “you’ll simply start to feel better,” usually within four to six weeks, whereas with psychotherapy, you might spend several weeks or up to two years and who knows if you’ll feel better. Who cares that the evidence on treating depression does not support Lilly’s marketing?

Oh well. At least the website for Support Partners is obviously sponsored by Lilly, with the Lilly logo on the bottom of the page.

Sneaky Sponsorship: Some groups are not nearly as blatantly sponsored as Support Partners. That is why I am so pleased that Seroxat Secrets has been keeping an eye on patient support groups that, by sheer coincidence, happen to recite marketing talking points from industry. The posts on the Diabetes Monitoring Forum (1, 2) are well worth a read.

A patient support group known as Depression Alliance has also been dissected at Seroxat Secrets. Rather than copy his words, I’ll just refer interested readers to the posts on the link between the patient advocacy group Depression Alliance and public relations firms that helped with the UK launch of Cymbalta (1, 2).

Other astroturfing posts include:

Once again, Welcome to the Machine



Monday, April 16, 2007

Lilly Posts 1Q Earnings

And they're pretty good. Of psych note, Zyprexa sales were up 10% compared to the 1st quarter of '06, Cymbalta sales were up 89% from 1Q '06, and Strattera sales were down 8% from 1Q '06. Overall, sales were up 14%. Source.

According to Bloomberg, Lilly is planning to ramp up marketing of Cymbalta and Zyprexa. Can you say Viva Zyprexa Strikes Back? How about Anxiety Hurts (1, 2)? I wonder to whom Lilly will market Zyprexa? Seriously -- they've tried primary care, psychiatrists, and likely the geriatric market, so who's next? How about pediatricians? Tantrum = Zyprexa?

Thursday, March 01, 2007

Cymbalta for GAD: Pimp That Thang

John Mack at the Pharma Marketing Blog has laid down the smack on Cymbalta. First, he points out that the Depression Hurts campaign sure looks like off-label promotion, since Cymbalta is FDA-approved for depression, not for pain associated with depression. Note that pain is not an official symptom of depression, so it seems quite strange to market a drug for the pain that allegedly occurs in depression, eh? Also note that the data on Cymbalta for pain in depression are not impressive at all.

Now that Cymbalta has been FDA-approved for generalized anxiety disorder, Mack wonders if some disease mongering is on the way, and he notes that the diagnostic criteria for GAD are fairly vague, which makes it much easier to foist the idea that untold millions of Americans are suffering from GAD. I'd love to see Lilly's sales scripts for the new GAD indication -- if they're anything like the sales tactics for Zyprexa, look for the rate of prescriptions for GAD to skyrocket.

Monday, February 26, 2007

Cymbalta for GAD: Repeating the Pattern

Lilly's drug Cymbalta (duloxetine, aka Yentreve) was approved by the FDA as a treatment for generalized anxiety disorder today.

Is this good for people with GAD? Is this good for Lilly? Are there any academic cheerleaders in the house?

People with GAD: Sometimes a medication really does a good job in treating symptoms and sometimes there are red flags that the science does not match the marketing. In this case, it took very little digging to find a red flag – just read the press release!

In clinical trials, on average, patients treated with Cymbalta for generalized anxiety disorder experienced a 46 percent improvement in anxiety symptoms compared to 32 percent for those who took placebo, as measured by the Hamilton Anxiety Scale [HAM-A].

I have not seen the full text of what appears to be the only published study of Cymbalta for GAD. I cannot find the other two duloxetine studies on Medline (the press release claims there have been three studies), which seems odd. Perhaps I should dig a little deeper… In any case, let’s move on to the efficacy of duloxetine in the study referenced in the press release.

My guess (based on a similar study examining Zoloft for GAD) is that participants averaged about 25 on the HAM-A at the start of treatment. Using the press release’s numbers, the Cymbalta group comes down to an average of 13.5 while the placebo group comes down to an average of 17. Is a difference of 3.5 points on the HAM-A noticeable? Could a clinician tell the difference between someone with a 17 and someone with a 13.5 on this measure? I think that in most cases, yes. But it is probably a small difference.
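For anyone who wants to check my back-of-the-envelope arithmetic, here it is spelled out. The ~25-point baseline is my guess borrowed from the Zoloft GAD study, not a figure from the press release; the percentages are the press release's:

```python
baseline = 25.0              # assumed baseline HAM-A score (my guess, not from the press release)
cymbalta_improvement = 0.46  # 46% improvement, per the press release
placebo_improvement = 0.32   # 32% improvement, per the press release

cymbalta_endpoint = baseline * (1 - cymbalta_improvement)  # 25 * 0.54 = 13.5
placebo_endpoint = baseline * (1 - placebo_improvement)    # 25 * 0.68 = 17.0
print(cymbalta_endpoint, placebo_endpoint, placebo_endpoint - cymbalta_endpoint)
# -> 13.5 17.0 3.5 -- about 3.5 HAM-A points separating drug and placebo at endpoint
```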

Adverse events reporting in press releases (and often in journal reports) is a joke, so I am not sure about the safety of the compound. The infamous Traci Johnson case suggests that there may be a risk of suicidality with Cymbalta, but I would prefer to see more data before making a firm association. Of course, given the general link between SSRIs (and newer antidepressants) and increased suicidality, my initial instinct is to be leery of Cymbalta in this regard.

Good for Lilly: This is great news for Lilly. As the Zyprexa debacle continues to unfold, Lilly’s search for good news appears to have arrived in Cymbalta. According to the Indianapolis Star, Cymbalta sales hit $1.3 billion in 2006. With the GAD indication and its rapid expansion into the depression market, I’m guessing we’re looking at at least a $2 billion drug for 2007. Not bad for a drug that has no evidence of working better than any of the older, now mostly off-patent SSRIs.

Academics in the House: Dr. Susan Kornstein of Virginia Commonwealth University, whom I have mentioned in the past in relation to her highly positive statements about Lexapro, is back. In the press release, she states:

With this approval, physicians and patients will be happy to know that there is another medication now available to treat this debilitating condition.

Yeah, they’ll be dancing in the streets. With all SSRIs likely showing similar efficacy to Cymbalta and with psychotherapy showing pretty good (but not great) outcomes for GAD (likely better in the long-term than with meds), we needed a more expensive medication because… WHY?

It is amazing how the pattern repeats itself. Show a medication is at least a bit better than a placebo in the short-term, have an “independent” academic give it the stamp of approval, then talk about how it is an important new addition that will prevent untold amounts of grief and suffering. The safety data are often glossed over and the benefits compared to existing treatments are not discussed at all. This helps people (outside of the sponsoring company)… HOW?