Sunday, March 25, 2007

13 Rules for Finding the Perfect Seat at a Medical Conference: A Journalist's Guide

These rules are more or less in order of importance.

1. This may seem almost too obvious even to mention, but you can't sit in a seat that someone else is occupying. The key to getting a good seat is to get to the room as early as possible. As you'll see, there are so many constraints defining the perfect seat that in many conference rooms there are only a handful of acceptable choices. (Actually, since some of the constraints are quite stringent, in many rooms there's not a single seat that satisfies all of them, and the only option is to choose the least bad one.)

2. You want to be sitting near the front, especially if you hope to grab the speakers for interviews or photos immediately after their talks.

3. You want to be sitting at a conference table, if possible, not in the rows of chairs set up behind the tables in many conference rooms. A table allows you to spread out your material: your notepad, the conference program, your recorder, your camera, your glass of water, and your coffee. If the room's not completely full, I often try to spread my material over two adjacent spaces at a conference table.

4. Don't sit by the pitchers of ice water often placed at intervals on the conference tables. Not only does a pitcher reduce the real estate available for spreading out, but there is a danger of spills or drips as people lean over you to pour themselves water. Murphy demonstrated conclusively that those spills and drips will certainly land directly on your sensitive electronic equipment.

5. If there are no conference tables, it's critical to appropriate two adjacent chairs. The alternative is to balance the notepad, program, recorder, and camera on your knees, putting the coffee and water underneath the seat. That way lies madness.

6. Aisle seats are critical. Otherwise you'll be annoying people every time you need to run up for a photo or for a word with the speaker.

7. You want to sit as close to the front as possible, but notice where the loudspeakers are. Often they're located behind the first few rows and pointed toward the back of the auditorium. If you're closer to the front than the loudspeakers, your recording will sound muffled. If the sound is bad in one part of the room, you may have to move to another during the first talk.

8. You want to sit where there's a clean line of sight to the screen, especially if you plan to photograph the speakers' PowerPoint presentations. It's really annoying when the bald head of the guy sitting in front of you obscures the one piece of data you need to complete your article. It's often best to sit a bit farther from the screen if that allows you to put the center aisle between the screen and your camera. That aisle makes it less likely that you'll have an obstructed view.

9. Don't sit too close to the video projector. Not only is the warm air blown by the exhaust fans annoying, but your recorder is likely to pick up a lot of background noise.

10. Some meetings have rules against taking photographs or making audio recordings. I believe such rules don't--or at least shouldn't--apply to reporters, in much the same way that TV camera crews at a major news event treat designated no-parking areas as "the minicam zone." But if you're anticipating an argument with one of the medical society employees, discretion being the better part of valor, you may want to sit in the middle of the auditorium and away from an aisle.

11. If you're at a big meeting, and you're covering parallel sessions in different rooms, you're going to want to sit fairly close to an exit door.

12. If you take notes on your laptop, try to find a seat at the edge of the auditorium, near a source of electrical power.

13. If you're going to be sitting through some boring talks before you get to the good ones, sit where there's good WiFi reception. That way you can catch up on your e-mail and even write blog posts to while away the time.

Thursday, March 15, 2007

Ghost Writers on the Sly

I noticed something strange about a couple of posters I covered at a recent conference. Both reported clinical trials on the same drug, and both acknowledged that the studies were supported by GlaxoSmithKline, the drug's manufacturer.

That wasn't strange. Pharmaceutical companies sponsor clinical trials all the time, and they frequently report the results of the trials (at least the favorable ones) at medical conferences.

What was strange was the authorship of those studies. In one study the first author was a physician at an obscure hospital in an obscure town in Ohio, and the other five authors were GlaxoSmithKline employees. In the other study four of the six authors were physicians at a private specialty practice in Southern California, and the other two authors were GlaxoSmithKline employees.

I suspect that the physician in Ohio and the physicians in Southern California were not truly the investigators who carried out the study or analyzed the results, and they almost certainly didn't write the poster presentations. They were "beards," enticed by GlaxoSmithKline to add the appearance of independence to studies that were designed and analyzed by company scientists, with the results written up in-house by a ghostwriter.

This is an unfortunately common practice in the pharmaceutical industry, and GlaxoSmithKline is certainly not the only offender. One of the most highly publicized cases of ghostwriting by pharmaceutical companies involved Procter & Gamble, its osteoporosis drug Actonel, and Aubrey Blumsohn, a pathologist, bone specialist, and former professor at Sheffield Teaching Hospitals in Sheffield, UK.

Dr. Blumsohn relates this complex--and still evolving--story on his blog, and it's been covered fairly extensively in the press, including in a detailed article that appeared in Slate at the end of 2005. Briefly, Procter & Gamble gave Sheffield Teaching Hospitals a $252,000 research contract for a study on Actonel, then had its own employees analyze the data and a ghostwriter prepare abstracts for several medical conferences, listing the Sheffield scientists as senior authors. Dr. Blumsohn suspected that the analyses might not have been kosher, and he requested access to the randomization codes that would allow him to conduct independent analyses. Incredibly, Procter & Gamble repeatedly denied the senior author of the study full access to the data that he himself had generated.

You'll have to read Dr. Blumsohn's blog for all the ins and outs of this astounding story. He ended up losing his job for daring to discuss it with journalists. He finally did get access to some, but not all, of the raw data. He has reanalyzed those data and has begun publishing the reanalyses, which, to no one's surprise, are somewhat less favorable to Actonel than the original analyses.

In his latest blog post, Dr. Blumsohn describes how Procter & Gamble continues to act very strangely. He submitted an abstract of his reanalysis for presentation at an upcoming meeting of the International Bone and Mineral Society. As required by the society's disclosure rules, he acknowledged that the original research was supported by Procter & Gamble. Somehow, a mysterious Procter & Gamble scientist named Dr. Purple got hold of this abstract in advance of publication and demanded that the society remove the funding disclosure. The society agreed at first, but after Dr. Blumsohn complained that he had not authorized the removal, it restored the disclosure. Dr. Blumsohn posts documentary evidence of his allegations on his blog, including the original abstract as well as email correspondence between the society and Dr. Purple (who may well have stepped out of a game of Clue, holding a candlestick in the conservatory).

I'm a writer myself, and I know honorable writers who ghostwrite scientific articles for pharmaceutical companies. I don't believe that this, in itself, is unethical, as long as all the study's authors are aware that it is being done, as long as they have all had full access to the data, and as long as they endorse all of the study's results and interpretations. Some medical journals have begun insisting that all co-authors of a paper describe their roles in the study and formally state that they endorse its results and conclusions. Perhaps the organizers of medical meetings should take similar steps.

Sunday, March 04, 2007

A Poster with Bite

There's a big difference between science writing and scientific writing. That difference nearly bit me on the ass at the last meeting I attended.

Scientific writing is highly formalized. Following the abstract, articles start with an Introduction that describes the background of the study. In the Methods section, which comes next, the investigators describe how the experiment was conducted. Then come the Results, the actual data generated by the experiment. At the end comes a section called the Discussion or the Conclusion that explains what the results mean.

In journalistic science writing the order is completely different. Journalists typically lead with the most interesting conclusion, go on to describe some of the results, and only then give the important background. (This is true for news articles; it tends to be less true for feature articles or personal essays.)

When I cover poster sessions, therefore, I look for promising titles, and when I find one I tend to go straight to the lower right-hand corner of the poster to read the conclusions. If the conclusions seem newsworthy enough, I'll go back and read the Introduction, the Methods, and the Results.

I'm not normally under heavy deadline pressure when I'm covering a medical conference. Since I usually work with publications that have monthly deadlines, I have the luxury of spending most of my time at meetings collecting stories that I'll write when I return to my office.

But last week I was working for a wire service that needed a fresh, newsworthy story from the meeting completely written by 11:30 a.m. No problem, I thought. There was a poster session starting at 9 a.m., and sitting in my hotel room the night before, I had circled about a dozen promising titles in the program. I showed up at the poster session as soon as the doors opened, figuring I'd have no trouble choosing one of the posters by about 9:30, which would leave me two full hours to write.

Trouble is, none of those dozen posters was exciting enough or newsworthy enough to make a good wire story. A handful of them were decent, and I planned to write about them in the coming weeks, but none had that certain je ne sais quoi.

Abandoning the list of posters I had identified by their titles, I started walking up and down the aisles at a gradually increasing pace and with a growing sense of alarm. Nothing was grabbing me, and time was a-wasting. Finally I found a poster describing an epidemiological study of a disease that was difficult to diagnose. Moving straight to the conclusions, I found a sentence about "the high frequency of death from asphyxia in undiagnosed patients" in the 15-year interval between when symptoms appeared in the average patient and when that patient received a proper diagnosis.

Bingo! There's my lead, I thought. Short of time, I snapped a few shots of the poster with my digital camera and ran up to the press room while composing a sexy lead in my head. "Patients with Roueche's syndrome [not its real name] wait an average of 15 years between the appearance of their first symptoms and the proper diagnosis, and during that time x% of them will suffocate to death, according to a poster presentation at the annual meeting of the blah, blah, blah."

By this time it was 10 a.m. I still have 90 minutes to write, I thought, no problem. So I sat down, uploaded the photos of the poster to my laptop, and started reading the Introduction, the Methods, and the Results. The poster contained a relatively complex presentation of the results, with some data described in prose, some in tables, and some in graphs. I stared at it for another 15 minutes, searching all the while for the numbers to back up the "high frequency of death from asphyxia" conclusion, but as far as I could tell there wasn't a single sentence, table, or graph that dealt with it.

I ran down to the poster session hoping to find the study's author. Although this was the time period set aside for authors to stand by their posters, she was nowhere to be found. I stood next to that poster so long waiting for her to return that passersby assumed I was the author and started asking me questions.

After another ten minutes I realized that the first author apparently wasn't coming back, but I also noticed that the presenter of the poster immediately to the left was the second author (out of ten) of my poster. Shifting anxiously from foot to foot, I waited until he was free of the person he was talking to, then introduced myself and asked whether he could answer one question about the poster I was interested in. He agreed, so I asked him what data supported the "high frequency of death from asphyxia" sentence in the conclusions.

"Oh, that's not correct," he said. He went on to say that only about one person dies from asphyxia due to undiagnosed Roueche syndrome every couple of years.

"So how did this conclusion get into the poster?" I asked.

"I don't know."

"But you're the second author."

"Yes, but I wasn't involved in preparing the poster. I’ll have to have a word with Dr. Smith [the first author]."

With less than an hour before my deadline, my life started flashing before my eyes, and my own death from asphyxia began to seem preferable to the conversation I'd soon be having with my editor. "How the hell am I going to salvage this situation?" I wondered.

It was too late to find another poster to write about, so I was stuck with this one. I ended up keeping the 15-year gap in diagnosis as the lead while leaving out the dramatic bit about the "high frequency of death from asphyxia." I wrote like the wind, and I uploaded the story at about 11:29:45.

The moral of the story? Don't assume that the conclusions of a scientific study are supported by its data.
