Are Food Intolerance/Allergy Tests Ethical?

Ethics in science can be tricky. Bad practice is often a case of “I know it when I see it” rather than clear “don’t do that” rules. We have all heard of the Tuskegee syphilis study conducted on black men from the 1930s to the 1970s, of the tests the military ran on thousands of animals to see the effects of radiation, and of what the Nazis did during the Second World War. Looking back, we can all say clearly that these studies were unethical. Yet at the time they were performed, they were seen as a necessary evil. Why is that?

On April 18, 1979, the Belmont Report was released. It summarizes ethical principles and guidelines for research involving human subjects, built around respect for persons, beneficence, and justice. As I read it, I realized that what we see today when we look at those experiments is not what the designers of the experiments saw when they did them. For example, when the report discusses beneficence it asks, “Who ought to receive the benefits of research and bear its burdens?” In Alabama during the 1930s, no practicing white medical doctor would have believed that a white person of any stripe should ‘bear the burden’ of syphilis. The researchers also regarded poor, uneducated black men as not really human, and so afforded them no rights when conducting their experiments. It was seen as an acceptable “risk-to-benefit ratio”: the benefits to white people were seen as outweighing the harm to black people.

This same dissonance can be seen in all human experimentation that is later regarded as inhumane or unethical.

I think it is interesting that many treatments we could use today to improve people’s health are not considered “scientifically valid” because they have not been tested in double-blind or placebo-controlled trials. It seems that in order to ‘prove’ that these treatments are good and helpful, researchers have to withhold helpful treatment, provide subpar treatment, or knowingly give bad treatment to patients/subjects, all of which can be unethical.

An example I saw recently is fecal transplant as a treatment for C. difficile. C. difficile is a bacterium commonly found in the human gut, where it is usually kept in check by normal, healthy bacterial growth in the intestines. It only becomes a problem in vulnerable populations, such as people who are sick, injured, or on antibiotics. Unfortunately, when C. difficile does become a problem it is life threatening. “In 2008, a total of 93% of deaths from C. difficile occurred in persons [over] 65 years of age, and C. difficile was reported as the 18th leading cause of death in this age group.” (according to this)

There is a known treatment for C. difficile overgrowth that is very effective with few side effects, but it is not common practice: fecal transplant. Because the procedure is socially unappealing, doctors prefer to use dangerous antibiotics instead.

In 2013 a study was finally released demonstrating the efficacy of fecal transplants. It reported that “overall, donor feces cured 15 of 16 patients (94%),” with side effects of diarrhea, cramping, and belching that resolved within 3 hours.

This was compared with the traditional antibiotic treatments: “Resolution of infection occurred in 4 of 13 patients (31%) in the vancomycin-alone group and in 3 of 13 patients (23%) in the group receiving vancomycin with bowel lavage.”

The problem with C. difficile is that it has a tendency to come back after treatment. But only 6% of people who received the fecal transplant had a recurrence, whereas 62% of the vancomycin-alone group had a recurrence and 54% of the group receiving vancomycin with bowel lavage became reinfected.
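As a quick arithmetic check, the small Python sketch below recomputes the cure rates from the counts quoted above (the recurrence percentages are taken as quoted, since the underlying counts are not given here). It is only an illustration of how the percentages follow from the raw numbers, not part of the study itself.

```python
# Recompute the cure rates quoted above from the raw counts in the study.
groups = {
    "donor feces (fecal transplant)": (15, 16),
    "vancomycin alone": (4, 13),
    "vancomycin with bowel lavage": (3, 13),
}

for name, (cured, total) in groups.items():
    rate = 100 * cured / total
    print(f"{name}: {cured} of {total} cured = {rate:.0f}%")

# Output (rounded as reported in the study):
# donor feces (fecal transplant): 15 of 16 cured = 94%
# vancomycin alone: 4 of 13 cured = 31%
# vancomycin with bowel lavage: 3 of 13 cured = 23%
```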

The main symptoms of C. difficile are watery diarrhea, fever, nausea, abdominal pain/tenderness, and loss of appetite, but serious conditions can also result, such as inflammation of the colon, perforation of the colon, and sepsis. Death is not the only possible outcome of sepsis or colon inflammation. Even if a person’s life is saved, they can lose brain, liver, kidney, or other bodily functions, and nerve and tissue damage that never heals can cause permanent pain and difficulty.

What I took note of is that one person in the study died from antibiotic complications, and the trial had to be ended prematurely so that the people in the traditional antibiotic arms who had relapsed with C. difficile could be treated with donor feces infusions; 85% of them were cured. In order to do this study, patients were put at risk and someone died, even though a safe and effective treatment was available.

If we look at the cost-to-benefit ratio of this study now, today, with our society’s hang-ups about poop, many doctors and patients might feel that the sacrifices these study subjects made were worth it to further human understanding. But if we as a society were less afraid of poop, would this study still be considered ethical? After all, participants were put at risk by being denied access to a potentially beneficial treatment, in order to prove that the treatments currently in use are inadequate. Are there other ways to prove the benefits of a treatment to doubting doctors without putting patients’ lives at risk?

Currently, in the case of food intolerance, researchers use the terms ‘unreliable’ and ‘non-compliant’ for subjects in their studies who drop out after finding that their health improved on a diet that removed problem foods. Researchers complain that when participants drop out before the double-blind reintroduction of problem foods, any data they obtained becomes invalid. Without a demonstrated return of symptoms, there is no ‘proof’ that the person’s illness was caused by the offending food. But I think it is heinous to expect someone to recover from symptoms such as mine and then reintroduce foods to deliberately induce those symptoms again. Unfortunately, that is exactly what scientists expect subjects to undergo. Unlike in the case of C. difficile, scientists have no effective treatment to offer food intolerance or allergy sufferers; the only treatment is to avoid the offending foods.

(My symptoms are typical: nausea, fever, vomiting, gas, cramps, belly pain sometimes so gut-wrenching that I feel bruised the next day, back pain, teeth grinding, jaw clenching, inability to relax muscles, tremors, obsessive thoughts, crying jags, feelings of self-loathing, drugged feelings (not in any way pleasant), inability to make decisions, trembling, anxiety, headaches (sometimes migraines), sensitivity to light, noise, and temperature changes, rashes, hives, swelling of previous injuries, aches, joint pain, diarrhea, constipation, irritability, mental difficulties ranging from memory loss and brain fog to inability to make decisions and reason through things, loss of coordination, loss of energy and drive, depression, anxiety both low-grade floating and acute, panic attacks, vision problems including spots, floaters, and flashes as well as difficulty transitioning between light and dark, blood sugar issues, hormonal issues, sleep issues, inability to heal and low immune function, styes, canker sores, low-grade infections like sinus infections, yeast infections, and fungal infections, aggravated allergies, breathing problems, dry skin, bad breath, body odor, neck pain, wrist problems, shin splints, and charley horses.)

The biggest problem I see with the tests for celiac or food sensitivities is that, in order for the test to show anything, they require a person with the sensitivity to keep exposing themselves to the offending food so that their immune response stays heightened. A chronically heightened immune response has been shown to cause serious problems in the human body, and the consequences are generally long term. Celiacs have been shown to suffer from autoimmune issues like lupus, as well as malnutrition, cancer, thyroid issues, infertility, bone problems, gout, eye issues, hearing loss, mental degradation, and Alzheimer’s. This damage can be caused even by trace amounts of gluten in the diet. Doctors and researchers feel that the damage is minimal in the short term, but they have no way to back that up.

In order to diagnose celiac, doctors first run a test called tTG-IgA, which looks for antibodies to tissue transglutaminase that the body produces in response to gluten. If the test is negative, the doctor will conclude that the patient doesn’t have celiac, but wheat contains many proteins besides gluten, and a body may still be reacting to any one of them.

If the tTG-IgA test is positive, a doctor cannot diagnose celiac until a biopsy of the small intestine is done. (Although some doctors will have the patient start a gluten-free diet at that point, their peers will not agree that the patient has celiac.) The biopsy takes a small piece of the duodenum (the first part of the small intestine), which is checked for villous atrophy. If the villi in the sample are flattened, the doctor will put the patient on a gluten-free diet, though even this does not confirm the diagnosis; only if the diet works will the patient be confirmed as celiac. Unfortunately, villous atrophy can be caused by things other than celiac (some medications are now known to cause it), and by the time the villi have been compromised there is likely to be quite a bit of internal damage already done to the body. During this whole process the patient has to continue eating gluten, and almost every doctor I have read agrees that a person should never go on a gluten-free diet until celiac has been ruled in or out.
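To make that pathway easier to follow, here is a minimal sketch of the decision flow as I have described it, written in Python purely for illustration. The function and result labels are mine, not anything doctors actually use, and every branch assumes the patient keeps eating gluten throughout.

```python
# Hypothetical sketch of the celiac diagnostic pathway described above.
# Names and structure are illustrative only; they do not come from any
# medical guideline, and the patient stays on gluten for the whole workup.

def celiac_pathway(ttg_iga_positive: bool,
                   villous_atrophy: bool,
                   improves_on_gluten_free_diet: bool) -> str:
    if not ttg_iga_positive:
        # A negative blood test ends the workup, even though wheat has
        # many proteins besides gluten that the body could be reacting to.
        return "no celiac diagnosis; keep eating wheat"

    if not villous_atrophy:
        # Positive blood test but a normal-looking biopsy: still no diagnosis.
        return "no celiac diagnosis; keep eating gluten"

    # Flattened villi: start a gluten-free diet, but the diagnosis is only
    # confirmed if the diet actually resolves the symptoms.
    if improves_on_gluten_free_diet:
        return "celiac confirmed (after prolonged gluten exposure)"
    return "villous atrophy from some other cause; diagnosis unclear"


# Example: positive blood test, flattened villi, improvement on the diet.
print(celiac_pathway(True, True, True))
```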

There are some tests for allergies and food sensitivities. However, there is little agreement in the medical and scientific community about which antibodies to test for which foods, companies change their testing parameters regularly, and there are a significant number of false positives and false negatives. Still, doctors are adamant that patients should never start an exclusionary diet without having the tests run.

The only treatment option doctors have to offer is an elimination diet. By ‘treatment’ I mean the diet that eliminates the suspected food for good, not the diagnostic elimination diet that removes foods only to reintroduce them, provoke symptoms, and confirm the intolerance. Anyone who has ever done a diagnostic elimination diet knows that it is usually unnecessary to deliberately reintroduce suspected trigger foods; that will generally happen all on its own, by accident.

According to the Belmont Report, “Two general rules have been formulated as complementary expressions of beneficent actions […] (1) do no harm, and (2) maximize possible benefits and minimize possible harms.”

So my question is: if the tests don’t change the treatment, and the tests don’t actually reveal that much information to doctors or researchers, but the requirement to keep consuming the suspected food for testing is potentially damaging to the food-sensitive individual, are food intolerance/allergy tests ethical? Are there significant benefits to be had from injuring patients’ health as a side effect of testing?
