PUTTING EVIDENCE TO WORK FOR YOUR PROGRAM

 

BY JAMES HOFFMAN, Juggernaut

If you have ever studied exercise science, read a personal training manual, or have been involved in coaching or strength and conditioning in any way, you have probably heard of the concept of “evidence based practice.” Unfortunately, in recent years this concept has been butchered and misrepresented in order to promote intellectual pissing contests about training modalities and training theory.

Traditionally, methods and theories are considered evidence-based if they have gone through a peer-review process in which colleagues with similar interests and specializations agree that the findings and conclusions are valid, based on sound methodology and the current consensus understanding of the topic.

But is that all there is to it? Do we take things at face value? Is science limited to peer-reviewed literature? I freely and gladly admit I am neither a pure scientist nor a pure practitioner. I consider myself a scientist, teacher, and coach, and I’d like to shed some light on modern evidence-based practice for athletics. Come with me if you want to science…

In our current state, there are very few fundamental training concepts which have not been explored to the extent of MULTIPLE literature reviews or even textbooks. Are there some concepts or theories for which only a few studies exist? Absolutely, but these are few and far between and generally involve a very specific concept or variable. What does this mean? For the most part, we know what training methods will achieve a given physiological or performance-based outcome. Do we fully understand the mechanisms, or why they happen, in every detail? No, and it’s likely there are technological limitations to answering some of those questions. But in theory, we understand what generally makes you stronger, faster, or better at sport, and what increases the physical abilities critical for success.

We may not know every detail about what it takes to improve performance, but the basics are well-supported and almost universally agreed upon.

Unfortunately, what has become more commonplace is for practitioners with newer, perhaps more novel training ideas to cling to one or two studies that seem to support the idea. They claim this as evidence-based, even though the new idea seems to defy conventional logic and the gross body of literature on the topic. Don’t get me wrong, I love new ideas! My dissertation topic on sled work for rugby players was one of the first studies in its area. Let’s not forget the once-novel idea of the Sun actually being at the center of our solar system. That being said, when addressing something new in the sport sciences we should be critical and skeptical, and also consider the following:

What are the fundamental, basic scientific principles already established in regard to this topic? What do we know to be almost absolutely true? (Given the current understanding)

How does this new evidence fit in with our theoretical understanding of the topic? Is it logical based on similar findings?

What are the practical implications of this idea or finding? Are the implications comparable to the established/traditional methods?

Once we have assimilated these new findings into the greater body of literature, we can either try to integrate them into practice, if they accord with our basic understanding, or wait for more investigation if they seem at odds.

On the other end of the spectrum from our lack of literature support, we have practitioners and scientists who sit in the proverbial ivory tower and shame everyone for their training methods while hiding behind this great evidence-based wall of science. They claim theirs is the only way because the literature dictates it, as if God has spoken. But is this really evidence-based? In the traditional sense, yes: the methods used are well established and have (virtually) clearly defined outcomes. But I challenge you to approach the concept of evidence-based practice in a different light.

When we look at literature-based evidence, we are trying to find out if the study has internal validity (is the experimental design able to support the conclusions it claims?), external validity (can the findings be generalized to other people and situations?), and ecological validity (how well do the study conditions represent the real-world people and situations in question?). Many of our training concepts can be highly generalized: strength is good, specificity is good, periodization is good, over-training is bad. The trouble is generally finding strong ecological support for the specific group of athletes you want to apply evidence-based practice to; for example, a population of 18-21 year old East Tennessee small-college rugby 7s players. Good luck! In fact, unless you read my dissertation, I’m willing to bet you will never find support for that population! BUT the good news is that such data exists, or could exist. In fact, the strongest evidence to support your population or situation is in a place you might not have looked – your own athletes.

Just because you may not have IRB approval, a strong research question, or intent to publish does not mean you cannot collect data and do research on your team. Sport science is often exploratory in nature; what we like to call hypothesis-generating. This means we collect data in hopes of finding relationships, trends, differences, and associations between variables that will warrant future investigation. Very often all it comes down to is tracking what you did and the resultant outcomes, then comparing those things with similar approaches in the past or future. Although in the field you may not have the strict controls of a pure scientist, the athlete testing and monitoring data you collect throughout the year can and should be used to describe, and potentially modify, your training program.

I had the great pleasure of observing my colleagues Dr. Jake Reed and Doctoral Candidate Chris Sole perform what I consider to be real evidence-based practice with their Division I Volleyball girls. By collecting data using previously established methods, they were able to address questions like:

What are the preceding signs and symptoms that generally occur before illness or injury?

What are the minimum weekly volume loads required to maintain lean body mass (LBM)?

What training intensities would generally cause nagging injuries to resurface?

How long performance might be impaired from bouts of high volume training?

Who was likely to have a ‘good’ performance based on physical and psychological states?

Did they have all the answers? No, but they were able to say, “With our girls we did approach X, and found result Y, indicating the methods seem to be working/not working.”

The best evidence-based practice comes not only from the supporting literature, but from your own trial and error as well. Even simple things count: the effect of exercise selection or exercise order on the ability to generate high rates of force development; the total volume load of weight training during a mesocycle and the resultant magnitude of strength gain; improvements in sprint speed after introducing a resisted-sprint protocol at a specific load. You are probably already doing all of these things with your athletes, and with a little bookkeeping and exploration, you can use this self-generated evidence to make meaningful changes to your program.
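The resisted-sprint example above reduces to a simple paired before/after comparison. A short sketch (athlete labels and times are hypothetical, purely to show the bookkeeping):

```python
# Hypothetical pre/post comparison: 20 m sprint times (seconds) before
# and after a resisted-sprint block. A negative mean change = faster.
pre  = {"A": 3.12, "B": 3.05, "C": 3.20, "D": 3.15}
post = {"A": 3.08, "B": 3.01, "C": 3.18, "D": 3.09}

diffs = [post[a] - pre[a] for a in pre]
mean_change = sum(diffs) / len(diffs)

print(f"per-athlete change (s): {[round(d, 3) for d in diffs]}")
print(f"mean change: {mean_change:+.3f} s")
```

With a handful of athletes and no control group this is descriptive, not confirmatory, but it is still your own evidence, and repeated across seasons it tells you whether the protocol is worth keeping.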

Therefore, I suggest that the evidence-based practitioner consider the following when addressing training theory:

What are the fundamental, basic scientific principles surrounding this topic? What do we know to be absolutely true? (Given the current understanding)

Do the methods and outcomes have strong literature support in the form of literature reviews and/or meta-analyses, or are they limited to only a few studies?

Is there a strong theoretical framework of both specific and generalized knowledge on the topic?

What does my own data, published or unpublished, seem to indicate?

If we can keep these things in mind, we can stop arguing and focus on what’s important to us. For some it’s the pursuit of the truth, for others it’s the success of their team, and for the vast majority of us it is simply to use the best information available to live out happy and successful careers.

Source: http://jtsstrength.com/articles/2014…-work-program/

 
