People have asked how medicine could fail to understand how powerful circadian biology and the sun are for humans. How did we swing and miss on thiamine, deuterium, and mitochondrial biology?
Might it be how we collect our evidence in medicine?
I think so.
Why? When I went through my own awakenings in 2004 and 2005, a key paper appeared in the literature.
In 2005, Dr. John Ioannidis, a well-known meta-researcher, published an article in PLoS Medicine called Why Most Published Research Findings Are False. This article caused a splash and has been making waves in the medical research community ever since.
When I first read his article, I wasn’t the least bit shocked as a clinician, because my patients were not getting better with any Evidence Based Medicine recipe published up to that point. This raises the question: if our evidence is bad, what could possibly go right in healthcare for patients? The article left me with one pressing clinical question: what should I do now for my patients?
The answer was to go back and look at how the evidence was collected, and then try to link it back to nature’s laws.
I think we must increase value by reducing waste in research. Too many studies are funded in areas that do nothing to move the needle for the public good. This happens because Big Pharma is paying the bill for RCTs.
Nature doesn’t make mistakes – but people sure do with poor choices around light. None of this is reflected in our evidence. THIS IS A PROBLEM.
Modern evidence based medicine (EBM) delivered via algorithms is a flawed process. The data sent to clinicians by researchers are nothing more than fruit of a poisoned tree. How good is evidence if you are asking the wrong question and using a flawed methodology in your data collection and testing? The process medicine now uses to harvest research data and turn it into guidelines is scientifically illegitimate, and therefore it is clinically inappropriate.
Why EBM does not work is illustrated by Michel Accad (mis)quoting the definition of EBM as “the conscientious, explicit, and judicious use of best evidence in making decisions about the care of individual patients” (see the first citation below).
The correct quote should be:
“Evidence based medicine is the conscientious, explicit, and judicious use of CURRENT best evidence in making decisions about the care of individual patients”.
Most of today’s EBM is based on the drug cartel’s ideas of what we should study, so they can gain more customers while those customers remain sick and untreated. Vested interests, such as drug and medical device companies, often fund medical research. This means quality marks or guidelines based on this research may not represent the best clinical practice, but rather the treatment option that benefits these companies.
The laudable goal of making clinical decisions based on evidence can be impaired by the restricted quality and scope of what is collected as “best available evidence.” The “authoritative aura” given to that collection alone may lead to major abuses that produce inappropriate guidelines or doctrinaire dogmas for clinical practice. Today, medicine’s algorithms run on these horrendous ideas. This is why the public gets so little benefit from healthcare today.
Here are the reasons why EBM is broken by those who use it:
1. Evidence varies in quality and is subject to selective reporting, whether through publication bias or through post-hoc subgroup analysis performed to squeeze out statistically and clinically significant results. The ISIS-2 researchers’ famous warning against frivolous subgroup analysis, in their case by astrological sign, is commonly cited against this blind pursuit of the holy grail of statistical significance (a small simulation after this list shows how easily such fishing produces false positives).
2. Misquotation, or taking a conclusion out of context, is another common way of forming the wrong basis for a change in practice. The classic example is the conclusion drawn from NASCIS 2 on the use of corticosteroids in spinal cord injury. This has been a huge issue in neurosurgery and spine care my entire career.
3. Garbage in, garbage out. This is my favorite reason why EBM is today’s best example of HOT GARBAGE. Most research today is not worth the paper it is written on. Researchers ask the wrong questions and never serve the public good by helping lower disease burdens. This is why human epidemics are running wild. There are ever more observational studies in which causal links are suggested when the data can only support an association. Much of this is retrospective research on prospectively collected data, which is often flawed in some respect and cannot be used for anything beyond what it was originally intended for.
4. Slow and out of date. Guidelines are often EBM in concept but can be biased by institutional support and by the financial or material conflicts of interest of committee members, and they quickly fall out of date after publication. It is not unusual for a national clinical network to take 5+ years to form a consensus that is soon overtaken by new findings and technology. The new AHA ACLS training guidelines come to mind: over 330 new recommendations, and only three backed by class one evidence? WHY ARE WE CHANGING A THING IN THIS CASE?
5. Bias in researchers and opinion makers, both conscious and unconscious. Except for triple-blind studies, most results can be influenced (to varying extents) by the conduct of the study, which depends on the researchers. Unconscious bias rooted in individual outlook, professional training, and past experience can also operate when a group of experts decides which studies to include or exclude when forming their recommendations.
6. The temple of meta-analysis and randomized controlled trials (RCTs) and their worshippers. Many EBM converts enthusiastically proclaim that without an RCT or a meta-analysis behind it, every treatment warrants review. THIS IS LUDICROUS. Workers’ compensation guidelines use this to block PBM/LLLT treatments for people with TBI. However, many questions cannot be answered with an RCT because the condition is rare or because of ethical constraints. Some questions (for example, benchmarking diagnostic tests) do not need an RCT at all. Well-conducted RCTs are expensive, labor-intensive, and slow to reach their conclusions, and are sometimes overtaken by technological and social change before they do. TBI is one such area.
7. No evidence of effect is not the same as evidence of no effect. Many confuse having no studies that show an effect with having studies that show no effect. Some suggest that a treatment should be stopped when there are no high quality studies showing its effectiveness; that may be a valid assertion, but as the definition of EBM suggests, we make do with whatever CURRENT evidence is available until something better comes along, while remaining vigilant for new knowledge. However, when there is high quality evidence of no effect, it is unethical to persevere with a treatment proven to make no difference (the small trial simulation further below makes this distinction concrete).
8. What matters to you does not necessarily matter to me. The recent move towards Patient Reported Outcome Measures (PROMs) and Patient Reported Experience Measures (PREMs) when designing new studies may still not be relevant to patients. Various studies have looked at outcomes the treatment was never intended to address. A recent example that found paracetamol ineffective for long-term back pain merely underlines the common-sense point that short-acting, symptom-relieving paracetamol was never meant to be a disease-modifying drug. Evidence-based guidelines also map poorly to complex cases where the patient has multiple co-morbidities, and most specialist care falls here.
9. Ask the right questions and do the right maths in statistics. Today this discipline is vastly ignored. It is perplexing to see large studies whose researchers appear careless about the most important parts of the work: asking a clinically relevant question, choosing the right outcome indicators to measure, and harnessing the skills of a clinical statistician to determine what needs to be done. Two meta-analyses published within 12 months of each other can reach opposite conclusions; the difference lies in what question is really (and not reportedly) being asked and which studies are chosen for analysis. Statistically significant benefits of one treatment over another may be marginal in clinical practice, but this information may not be included in clinical decision support or other tools (the second sketch after this list shows how a trivial benefit can still clear p < 0.05 in a big enough trial).
10. The academic and institutional integrity of centers and hospitals must be questioned because of how they use guidelines to police clinicians. Inflexible treatment recommendations from evidence-based medicine tools may produce care that is management-driven rather than patient-centered. Clinicians can be punished by hospitals for refusing to do dirty, profitable work that never suits the patient.
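To make point 1 above concrete, here is a minimal sketch, not drawn from ISIS-2 or any real dataset, of how post-hoc subgroup fishing manufactures “significant” findings out of pure noise. Every number in it (a 10% event rate, 2,400 patients per arm, 12 zodiac-style subgroups, 1,000 simulated trials) is an assumption chosen only for illustration, and the two_proportion_p helper is just a plain pooled z-test written for this sketch.

```python
import math
import random

def two_proportion_p(events_a, n_a, events_b, n_b):
    """Two-sided p-value from a simple pooled two-proportion z-test."""
    p_pool = (events_a + events_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (events_a / n_a - events_b / n_b) / se
    return math.erfc(abs(z) / math.sqrt(2))  # equals 2 * (1 - Phi(|z|))

random.seed(1)
TRIALS, SUBGROUPS, N_PER_ARM, EVENT_RATE = 1000, 12, 2400, 0.10

trials_with_false_alarm = 0
for _ in range(TRIALS):
    # Drug and placebo arms share the SAME event rate: the true effect is zero.
    found_significant_subgroup = False
    n = N_PER_ARM // SUBGROUPS              # 200 patients per arm per subgroup
    for _ in range(SUBGROUPS):              # e.g. the 12 astrological signs
        drug = sum(random.random() < EVENT_RATE for _ in range(n))
        placebo = sum(random.random() < EVENT_RATE for _ in range(n))
        if two_proportion_p(drug, n, placebo, n) < 0.05:
            found_significant_subgroup = True
    if found_significant_subgroup:
        trials_with_false_alarm += 1

print(f"Null trials with at least one 'significant' subgroup: "
      f"{trials_with_false_alarm / TRIALS:.0%}")  # roughly 1 - 0.95**12, ~45%
```

Test enough subgroups and a “win” is nearly guaranteed even when the drug does nothing; that is the entire multiplicity problem in one loop.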
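Point 9 can be illustrated the same way. The sketch below uses invented numbers (a hypothetical mega-trial of 50,000 patients per arm with event rates of 2.0% versus 1.8%) to show how a trivial benefit clears the p < 0.05 bar while the number needed to treat stays enormous.

```python
import math

def two_proportion_p(events_a, n_a, events_b, n_b):
    """Two-sided p-value from a simple pooled two-proportion z-test."""
    p_pool = (events_a + events_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (events_a / n_a - events_b / n_b) / se
    return math.erfc(abs(z) / math.sqrt(2))

n = 50_000                              # patients per arm (hypothetical mega-trial)
control_rate, drug_rate = 0.020, 0.018  # 2.0% vs 1.8% event rate: a tiny difference
p = two_proportion_p(int(drug_rate * n), n, int(control_rate * n), n)
arr = control_rate - drug_rate          # absolute risk reduction of 0.2 percentage points

print(f"p-value: {p:.3f}")                                     # about 0.02, 'statistically significant'
print(f"Absolute risk reduction: {arr:.1%}")                   # 0.2%
print(f"Number needed to treat: about {1 / arr:.0f} patients") # ~500
```

A p-value alone says nothing about whether an effect is worth having; the absolute risk reduction and the number needed to treat do.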
Moreover, two apparently similar studies often reach different conclusions despite similar settings and controls; chance alone can provide conflicting answers, as the sketch below illustrates. On the other hand, there are times when deliberate academic misconduct occurs, and it can take years to identify the culprits. Websites like https://retractionwatch.com keep people up to date, but every researcher should be approached with some initial skepticism; even the work of a scientific icon like Mendel has been considered by some as ‘prescient’. No author should be immune to the rigors of scientific curiosity and testing.
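A toy simulation, again built entirely on assumed numbers (a real 30% versus 20% event rate and 150 patients per arm), shows how easily chance yields conflicting answers between identically run trials, and why a “negative” underpowered trial is absence of evidence rather than evidence of absence (point 7 above).

```python
import math
import random

def two_proportion_p(events_a, n_a, events_b, n_b):
    """Two-sided p-value from a simple pooled two-proportion z-test."""
    p_pool = (events_a + events_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (events_a / n_a - events_b / n_b) / se
    return math.erfc(abs(z) / math.sqrt(2))

def run_small_trial(rng, n_per_arm=150, control_rate=0.30, drug_rate=0.20):
    """One underpowered trial of a treatment with a REAL 10-point benefit."""
    drug = sum(rng.random() < drug_rate for _ in range(n_per_arm))
    control = sum(rng.random() < control_rate for _ in range(n_per_arm))
    return two_proportion_p(drug, n_per_arm, control, n_per_arm)

rng = random.Random(7)
runs = 2000
positive = sum(run_small_trial(rng) < 0.05 for _ in range(runs))
print(f"Underpowered trials reaching p < 0.05: {positive / runs:.0%}")
# Roughly half 'succeed' and half 'fail' even though the benefit is real, so two
# identical trials will often disagree, and a single 'negative' small trial is
# absence of evidence, not evidence of absence.
```

In this toy setup, with roughly 50% power, a pair of identically designed trials will disagree about half the time even though nothing about the treatment has changed.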
Does evidence based medicine adversely affect clinical judgment?
Yes, but only because clinicians allow this to occur. In an age of information overload and excess, it is important for clinicians to be professional in their approach to evidence, be it a single landmark study or a national guideline.
If the evidence is important enough to change your practice, then make sure the quality of research is high, the analysis is correct, the conclusions are reasonable and the relevance is current and applicable. If clinicians want to ignore the study conclusion or guideline recommendation, the onus is still on them to prove without bias why this should be.
The obligation rests with the clinicians who are in a direct therapeutic relationship with patients; hence they carry the ultimate responsibility as learned sentinels advising the patient.
11. The current volume of evidence, especially clinical guidelines, has become almost unmanageable for ANY CLINICIAN.
Oculi tanquam speculatores altissimum locum obtinent = The eyes, like sentinels, occupy the highest place in the body.
Consistent with hypoxia/ischemia, thiamine deficiency stabilizes and activates Hypoxia Inducible Factor-1α (HIF-1α) even under physiological oxygen levels. Oxygen levels and thiamine control the intake and output systems of the mitochondrial matrix. How did medicine miss these basics?
This series will continue to abolish more of your beliefs.
The reality is this: HIF-1α-mediated transcriptional up-regulation of pro-apoptotic and inflammatory signaling contributes to astrocyte cell death in the CNS and PNS during thiamine deficiency. This destruction begins in the mitochondria and destroys neurons everywhere in your body.
Maybe now we can see why the evidence and the treatment guidelines built on it have not yet recognized that our modern environment is causing neurodegeneration.
CITES:
Sackett DL, Rosenberg WMC, Gray JA, Haynes RB, Richardson WS. "Evidence Based Medicine: What It Is and What It Isn't." BMJ 312 (1996): 71-72.
Freedman DH. "Lies, Damned Lies, and Medical Science." The Atlantic, November 2010.
Ebrahim S, Sohani ZN, Montoya L, et al. "Reanalyses of Randomized Clinical Trial Data." JAMA 312, no. 10 (September 10, 2014): 1024-32. doi:10.1001/jama.2014.9646.
Ioannidis JPA. "Why Most Published Research Findings Are False." PLoS Med 2, no. 8 (August 30, 2005): e124. doi:10.1371/journal.pmed.0020124.
Ioannidis JPA, Khoury MJ. "Assessing Value in Biomedical Research: The PQRST of Appraisal and Reward." JAMA 312, no. 5 (August 6, 2014): 483-84. doi:10.1001/jama.2014.6932.
Young NS, Ioannidis JPA, Al-Ubaydli O. "Why Current Publication Practices May Distort Science." PLoS Med 5, no. 10 (October 7, 2008): e201. doi:10.1371/journal.pmed.0050201.