Exercise Science Reviews
Bicycling Medicine & Science, 2002
"The true method of knowledge is experiment." (William Blake)
I’ve written similar articles for the past eight years.
Again, I’ve culled over 2,000 abstracts, reports and papers during the last
year. Some other-than-bicycling sports science may be relevant to bicycling and
is included in this review.
Here’s my synopsis and occasional spin on some of the published information on
bicycle-related medicine and science that came out during the past year.
What’s the latest medical and scientific info about bicycling?
Do you read the ad copy in the magazines to figure out what might be worth
trying? Do you look to the pro athletes, who are sponsored, and figure that if
they do it or use it, it must be great? Do you rely on coaches, some of whom
receive kickbacks if you buy on their recommendation? Do you ask your friends?
Or do you just spend your time, effort, or money and try everything yourself?
For most of us, it’s a combination of all of the above, plus a little hope. And,
unfortunately, that little hope is what lots of companies cash in on when they
manage, for example, to sell us plain old water at a couple of bucks a gallon.
There’s another way: the scientific way, looking at what studies or experiments
really show. It’s the best way to evaluate what works and what
doesn’t. The scientific method is better than opinion or guessing, but it’s not
foolproof. Good sport science studies are hard to come by. Worse, unfortunately,
there is sometimes bad science.
A complete review of what makes good science isn’t possible in this article, but
here are a few examples of “science” problems.
First, only studies showing an effect tend to be published: few publications
are interested in reporting, for example, that Vitamin X doesn’t cure
cancer. Once something has been accepted as working, then it is fair game for
challenge. So it’s common for some substance or training method to burst on the
scene for a few years, and then have its bubble burst—by being shown not to work
or having undesirable side effects. Androstenediol, androstenedione, bee pollen,
chromium, medium chain triglycerides, nasal dilators, and royal jelly are now
out of favor.
Some studies are paid for by an interested party. Peanuts were reported to help
ballet dancers’ performance (presumably by increasing deficient caloric intake)
in a study paid for by a consulting company, one I’d guess was representing a
peanut industry client. Peanuts may well help calorically deficient ballet
dancers, but so might Häagen-Dazs ice cream.
Worse, imagine a company that pays for ten studies from ten different sets of
researchers and advertises only the findings, perhaps obtained by chance, that
promote the company’s products.
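The arithmetic behind that trick is worth spelling out. Even if a product does
nothing at all, each honest study still has a 5% chance of a statistically
significant false positive at the usual p < 0.05 threshold. Here's a quick
back-of-the-envelope calculation (the numbers are illustrative, not from any
particular study):

```python
# If a product has no real effect, each independent study still has a 5%
# chance of producing a "significant" result purely by chance (p < 0.05).
alpha = 0.05       # conventional significance threshold
n_studies = 10     # studies commissioned by our hypothetical company

# Probability that at least one of the ten studies comes up positive by chance
p_at_least_one = 1 - (1 - alpha) ** n_studies
print(round(p_at_least_one, 2))  # -> 0.4
```

Roughly a 40% chance of getting at least one advertisable "positive" study for
a worthless product, just by running ten of them and reporting selectively.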
Some studies appear to provide important or new information but the wrong
question is being asked or answered. Recently the recovery drink R4 was shown to
provide better recovery than Gatorade when 24 ounces of either was consumed
between taxing exercise bouts. Sounds promising, doesn’t it? But the R4 provided
almost four times as many calories. Would a couple of donuts with the Gatorade
have been as good?
A problem with sport science, unlike general
medicine, is that studies tend to use small groups—fewer than 20 subjects. Small
groups require relatively large differences to find statistical significance.
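To see just how large those differences must be, here's a rough sketch using
the standard normal approximation for a two-sample comparison (the function
name and the 80%-power convention are my choices, not from any cited study):

```python
from math import sqrt
from statistics import NormalDist

def minimal_detectable_effect(n_per_group, alpha=0.05, power=0.80):
    """Approximate smallest standardized effect (Cohen's d) a two-group
    comparison can reliably detect with n subjects per group, using the
    normal approximation to the two-sample t-test."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided significance threshold
    z_beta = z.inv_cdf(power)           # desired statistical power
    return (z_alpha + z_beta) * sqrt(2 / n_per_group)

# A typical sport-science study vs. a large clinical trial:
print(round(minimal_detectable_effect(10), 2))   # 10 per group -> ~1.25
print(round(minimal_detectable_effect(100), 2))  # 100 per group -> ~0.40
```

With only 10 riders per group, a real but moderate benefit can easily vanish
into "no significant difference"; it would take an enormous effect, on the
order of 1.25 standard deviations, to show up reliably.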
Studies often initially appear as abstracts. These present preliminary data, are
often incomplete, are less subject to peer or other review, may be withdrawn,
and are often cited in promotions by sponsoring commercial companies.
Keep in mind that it’s common for studies to show apparently conflicting
results. For example, over the years bicarbonate loading and caffeine have been
accepted as improving human performance. Newer studies have questioned that
conclusion.
Each study often adds just a little piece to the puzzle. It’s important not to
put too much faith in any one study.