Thanks for stopping by to read this week's Biotech Stock Mailbag. Brief discussions about the upcoming FDA advisory panels for Sarepta Therapeutics (SRPT) and Acadia Pharmaceuticals (ACAD) are below. First, some tips on how to spot red flags in clinical trial results.
PM16 writes, "Thank you for doing the call with Dr. [Mark] Ratain, although I had to sign off early. Can you write an article recapping the clinical trial tips you and Dr. Ratain talked about? Any help you can offer about how to analyze clinical trial results would be great and very much appreciated."
Apologies again for the technical glitch that delayed the start of my conference call with Ratain on Tuesday. I know many registered listeners had trouble hearing the call or couldn't access it at all. Slingshot Insights, the research outfit sponsoring the call, ran into some unfortunate and unforeseen problems caused by too many people trying to dial in.
That's called a "good" bad problem, but still not ideal. Joe McCann, Slingshot's co-founder, tells me his engineers are working on a plan to prevent this from happening again. As I said in last week's Mailbag, I like Slingshot's business model -- providing affordable access to industry experts for smaller institutional and retail investors -- so I hope to work with McCann again on future research calls.
Here, then, is a summary of my call with Ratain, including a checklist of things to think about when a biotech or drug company announces clinical trial results:
Generally speaking, single-arm clinical trials are difficult to interpret because they lack a comparator arm. A single-arm study can produce credible evidence of drug activity in certain circumstances -- a study measuring response rate in deeply refractory cancer patients, for example. Be very skeptical, however, of single-arm studies with progression-free survival or overall survival endpoints. Likewise, the use of a historical control is suspect.
Ratain strongly prefers randomized, controlled studies (preferably blinded as well) for the converse of the reasons stated above: a comparator arm makes the results far easier to interpret.
When the company's press release hits the tape, do these things:
Read the press release. The whole thing. If the headline states results are "positive," "outstanding" or uses some other superlative, assume there are problems hidden somewhere in the closing paragraphs until proven otherwise. Then read the press release again.
The design of the clinical trial -- including the number of patients enrolled, the types of patients enrolled and the endpoints (primary, secondary) -- should be explained clearly. There will be a paragraph discussing the study results. Look for these reassuring words and terms: intent-to-treat analysis, statistical significance, achieved the primary endpoint. More details and more specific data are better than less.
Be wary of companies that describe "positive" clinical trial results using wishy-washy adjectives. A recent favorite was CymaBay (CBAY), which described responses to its cholesterol-lowering drug as "broad" and having "potential utility." It shouldn't surprise you to learn that CymaBay's drug actually performed quite badly.