Concussion risk in young athletes has prompted researchers to use a variety of head impact sensors to measure the frequency and severity of impacts during sports. A new study from Children's Hospital of Philadelphia (CHOP) shows these head sensors can record a large number of false positive impacts during game play. The CHOP team's study emphasizes that video confirmation of the sensor data is essential for research, and that video confirmation also allows the data to be used in injury prevention strategies for player safety.
Video footage of the 1,893 sensor-recorded events involving players who were on the field and within the frame of the camera was reviewed, and the events were categorized into three types: impact events (69.5%), trivial events such as adjusting a headband (20.9%), and nonevents such as a player remaining stationary (9.6%). The most common impact event was head-to-ball contact, which represented 78.4% of all impact events. Other impact events included player contact (10.9%), falls (9.8%), and ball-to-head contact (0.8%). Additionally, this study looked at both male and female athletes. Female athletes had a lower proportion of impact events (48.7% vs 78.4%) and a higher proportion of trivial events (36.6% vs 14.2%), which may be due to more frequent adjustments of the headband. However, among the actual impact events, the breakdown of impact types was similar between the genders.
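As a rough illustration of what the reported breakdown implies in absolute terms, the category shares above can be converted back into approximate event counts. This is a minimal sketch: the total and percentages come from the study as reported here, while the variable names are purely illustrative.

```python
# Approximate event counts implied by the reported percentages.
# The total (1,893) and the category shares are from the article;
# everything else is illustrative.

reviewed_events = 1893

category_shares = {
    "impact": 0.695,    # e.g. head-to-ball contact, player contact, falls
    "trivial": 0.209,   # e.g. a player adjusting a headband
    "nonevent": 0.096,  # e.g. a player remaining stationary
}

# The three reported shares account for all reviewed events
assert abs(sum(category_shares.values()) - 1.0) < 1e-6

# Convert shares into approximate counts (rounding introduces
# a discrepancy of about one event across categories)
approx_counts = {name: round(share * reviewed_events)
                 for name, share in category_shares.items()}
print(approx_counts)
```

Because each percentage is rounded to one decimal place, the derived counts are approximations, not figures reported by the study itself.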
Approximately 1 in 5 high school athletes who play a contact sport suffers a concussion each year. To understand the frequency, magnitude, and direction of head impacts that athletes sustain, a wide variety of sensors have been developed to collect head impact biomechanics data, including instrumented helmets, skull caps, headbands, mouthguards, and skin patches. However, when data are collected during game play rather than in a controlled laboratory environment, there is potential for false positives and false negatives. In this study, CHOP researchers used data collected from headband-based head impact sensors worn by male and female soccer players to determine the proportion of false positives within the data and whether video confirmation improved data quality.
"Head impact sensors are a readily accessible tool for studying the mechanics of head impacts," said Declan Patton, PhD, lead author of the study and a research associate at the Center for Injury Research and Prevention at CHOP. "However, in order for researchers to have reliable data to analyze, they first need to verify whether sensor-recorded events are actual head impacts using either video- or observer-confirmation."
In this study, researchers fitted 72 high school varsity and junior varsity soccer players (23 female and 49 male) with headband-mounted impact sensors during 41 games over 2 seasons to capture sensor-recorded events during competition. All the games were video recorded. The research team analyzed the video to quantify the percentage of events recorded by the sensors that corresponded to an observable head impact event. In addition, video-verified sensor-recorded events were compared against the manufacturer's filtering algorithm, which was developed to eliminate false positives.
The sensors recorded 9,503 events during scheduled game times; this number was reduced to 6,796 once verified game time was identified on video. Of the 6,796 events during verified game time, 4,021, or approximately 60%, were removed because they were associated with players who were not on the field at the time of recording and therefore could not represent actual head impacts. This indicates that prior studies, which used head impact sensor data without these important methodological steps, probably included a high proportion of non-head-impact events in their datasets.
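The winnowing described above is straightforward arithmetic, and can be sketched as follows. The counts are taken from the study as reported here; the variable names are illustrative, not part of the study's methodology.

```python
# Sketch of the study's event-filtering arithmetic.
# Counts come from the article; names are illustrative.

scheduled_events = 9503       # events recorded during scheduled game times
verified_game_events = 6796   # events remaining within video-verified game time
off_field_events = 4021       # events from players not on the field

# Share of verified-game-time events attributable to off-field players
off_field_fraction = off_field_events / verified_game_events
print(f"{off_field_fraction:.1%}")   # prints "59.2%", i.e. roughly 60% removed

# Events left as candidate head impacts for video review
candidate_head_impacts = verified_game_events - off_field_events
print(candidate_head_impacts)        # prints 2775
```

The ~60% figure in the text corresponds to 4,021 of 6,796 events (59.2%), which is why removing off-field recordings has such a large effect on the dataset.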