Table 3.
Checklist item | Explanation | Example(s) | Reported on Page No |
---|---|---|---|
1. Study Design and Case Selection | |||
(1a) Study purpose | The purpose of the study shall be explained clearly (e.g., collect observational data, compare injured players to controls, evaluate interventions) | PMID: 28632058, “The goal of this study is to assess the accuracy of the MBIM method relative to reflective marker-based motion analysis data for estimating six degree of freedom head displacements and velocities in a staged pedestrian impact scenario at 40 km/h.” | ________ |
(1b) Video source material | The settings and locations of the events and how the video footage was obtained (e.g., recorded for research purposes, game film) | PMID: 30398897, “For each injury play, NFL Films provided all available video footage from 4 distinct video sources in the highest-quality formats available from each source: (1) “all 22” footage shot by the hosting team, which included the wide-angle end zone and sideline views typically used by coaches and teams for game review purposes; (2) network broadcast footage with replays, which included all footage aired during the televised network broadcast of the game; (3) NFL Films footage, which is untelevised footage frequently shot for the purposes of NFL documentaries; and (4) network melt reel footage, which is footage shot by the network but not included in the televised video feed of the game.” | ________ |
(1c) Eligibility criteria for included events | Eligibility criteria for included events (e.g., sport, sex/gender, age, level of play), type of impact, and definition of object being tracked (e.g., player, helmet) | PMID: 30398897, “All reported concussions sustained in an NFL preseason, regular season, or postseason game played during the 2 target seasons were included in this study.” | ________ |
2. Physical Camera Specifications | |||
(2a) Number and type of cameras | Number and type of cameras used for the recreation of the event | PMID: 34274559, “Eleven action cameras (HERO6; GoPro, Inc., San Mateo, CA, USA) with 41° field of view (FOV) lenses recorded video at 2.7 K resolution and 120 frames per second (fps) with a shutter speed of 1/1920s…” | ________ |
(2b) Relative locations of the cameras | The location of the cameras relative to the event being recorded and how far apart, in linear and angular dimensions, the cameras are from each other | PMID: 34274559, “…four cameras along each sideline at 15-yard intervals and three cameras across the back of the end zone.” PMID: 30274537, “The distance between the camera and impact area ranged from 20 to 100 m.” PMID: 30274537, “Angle of separation between camera views ranged from 66° to 96°…” | ________ |
(2c) Camera calibration or alignment | The methods for calibrating cameras and/or aligning camera views | PMID: 32095268, “Common points were selected in the video clips and laser scan data to align the camera views.” | ________ |
(2d) Camera field of view | Field of view of the cameras | PMID: 34274559, “Eleven action cameras (HERO6; GoPro, Inc., San Mateo, CA, USA) with 41° field of view (FOV) lenses recorded video at 2.7 K resolution and 120 frames per second (fps) with a shutter speed of 1/1920s…” | ________ |
(2e) Camera height | Height of the camera relative to the ground and/or location of the event | PMID: 30274537, Ranged from − 1.5 to − 48.5 m (Table 1) | ________ |
(2f) Camera angle of incidence | The angle of the camera relative to the ground, which results from the camera’s height above the ground | PMID: 14519212, “Cameras 3 and 4 were 31° and 57° from the sideline.” | ________ |
(2g) Landmarks | Landmarks used to recreate the 3D field | 10.1007/s12283-018-0263-4, “A calibration grid was placed in Kinovea to cover the entirety of the exercise test area where the speed was being measured… The grid was then assigned the corner-to-corner distances that matched the measurements of those markings on the ice, establishing a calibrated grid.” | ________ |
(2h) Physical obstructions and/or environmental conditions | How were physical obstructions (e.g., structures or people) and/or glare, light levels, reflections managed | 10.4271/2018-01-0516, “Other factors like the number of photographs, the specific vantage of the photographs, and occlusion of recognizable features within these photographs can also limit the number of common features available for use in camera matching. In these instances, camera matching solutions can be improved by extending the 3D environment to include objects visible in the distance such as mountains, valleys, and other notable landmarks that are typically outside of the scope of 3D scene mapping.” | ________ |
3. Camera Recording Specifications | |||
(3a) Frame rate | The rate at which consecutive images or frames are captured or displayed (e.g., in frames per second); must identify if the original video had interlaced frames and the analysis used de-interlaced frames | PMID: 34274559, “Eleven action cameras (HERO6; GoPro, Inc., San Mateo, CA, USA) with 41° field of view (FOV) lenses recorded video at 2.7 K resolution and 120 frames per second (fps) with a shutter speed of 1/1920s…” | ________ |
(3b) Frame rate variability | If there was variable frame rate (e.g., Yes/No), the methods used to account for frame rate variability in analyses | PMID: 29175825, “All seven videos had an interlaced scan with frame rates of 25 and 30 Hz, making it possible to double the effective frame rates to 50 and 60 Hz. Videos were edited and deinterlaced using Adobe premiere Pro CS6 (Adobe Systems, San Jose, CA). The video display resolution was 1024 × 576 for cases 1 and 7; 704 × 480 for case 2; and 788 × 576 for cases 3–6.” | ________ |
(3c) Shutter speed | The shutter speed of the camera used in recording | PMID: 34274559, “Eleven action cameras (HERO6; GoPro, Inc., San Mateo, CA, USA) with 41° field of view (FOV) lenses recorded video at 2.7 K resolution and 120 frames per second (fps) with a shutter speed of 1/1920s…” | ________ |
(3d) Resolution | The image or video resolution (e.g., the number of pixels that make up the image along the horizontal and vertical axes) | PMID: 32095268, “…an additional camera with a 120° FOV GoPro Hero6 lens with 4 K resolution…” PMID: 33025319, “Resolution was assessed in terms of pixels per helmet…the resolution of the images ranged from 64 to 126,7000 pixels per helmet.” | ________ |
(3e) Display or pixel aspect ratio | Ratio of width to height (in pixels) of an image | PMID: 32130020, “…16:9 aspect ratio…” | ________ |
(3f) Data compression type | Method for image or video compression to reduce file size (e.g., lossy or lossless) | | ________ |
(3g) File format | The file format, which dictates the compression algorithm (e.g., JPEG, MPEG-4); must report if file formats were changed (i.e., file conversion) | PMID: 34274559, “Extract 3-s video clips (.mp4) and image sequences (.bmp) from a minimum of two camera views that captured the impact case. Extract a calibration image (.bmp) that corresponds with each of the aforementioned camera views.” | ________ |
4. Corrections Related to Video Quality | |||
(4a) Lens distortion correction | Methods/software used for correction for lens distortion | PMID: 30274537, “The effects of lens distortion were incorporated into the virtual view based on the lens profile of each camera.” | ________ |
(4b) Motion blur | If motion blur occurs, the methods taken to make sure the point on the object being tracked is consistent between frames | 10.1080/14763141.2018.1513059, “Edge definition was subjectively assessed based on qualities such as brightness, contrast, and presence of blur. Based on these factors, it was determined that the cameras at locations 2 and 4 provided the best footage for analysis.” | ________ |
(4c) Unstable footage | The amount of video removed due to unstable footage (e.g., unpredictable panning, zooming, and movement) and how the resulting distortion was managed | | ________ |
5. Data Processing | |||
(5a) Analysis software | Software used for analyses (e.g., ProAnalyst 3D, SynthEyes, PFTrack, Houdini, Nuke) | PMID: 32095268, “Three-second video clips from the two primary views were extracted and uploaded into head tracking software (PFTrack, The Pixel Farm, UK)…” | ________ |
(5b) Data reduction/down-sampling, if applicable | The initial and re-sampled frame rates, as well as how down-sampling was managed | | ________ |
(5c) Video stabilization | Methods/software used to correct for camera motion, such as panning and zooming | PMID: 30274537, “All camera views were ‘stabilised’ to remove the effects of camera movement, panning, tilting and zooming so that the background shown in each image remained stationary (Nuke X 10.0v1, Foundry, London, UK).” | ________ |
(5d) Filtering | Methods for filtering kinematic data obtained by video tracking | PMID: 33025319, “Position time histories were filtered using a 30 Hz, 4-pole Butterworth filter.” | ________ |
(5e) Start and end times | Method for determining the beginning and end of an impact | PMID: 33025319, “Start and End times of the impact were manually selected based on visual inspection of the slope of the translational velocity time histories alongside the game footage.” | ________ |
6. Data Reporting | |||
(6a) Accuracy | Validation procedures and accuracy of the measurement of interest | PMID: 32095268, “In a laboratory validation of our model-based image matching implementation, we determined the mean absolute errors in the estimated change in resultant translational velocity and rotational velocity (ΔVR and ΔωR, respectively) during simulated H2G and H2H impacts to be ± 0.24 m/s (± 10.7%) and ± 3.4 rad/s (± 21.8%), respectively (ref).” | ________ |
(6b) Outcome measures | Primary outcome measures (e.g., object’s orientation, velocity, acceleration, size, shape) and the methods used to calculate them | PMID: 32095268, “Model-based image matching was used to reconstruct head translational velocity (V) and rotational velocity (ω) over time, similar to previously published methods (ref).” PMID: 33025319, “Velocities were calculated using the central difference method…” | ________ |
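Two of the processing steps in the checklist, low-pass filtering of position time histories (item 5d) and velocity estimation by the central difference method (item 6b), can be sketched in a few lines. This is an illustrative example only, not code from any cited study: the function name, sampling rate, and default cutoff are assumptions, and the sketch assumes NumPy and SciPy are available.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def filter_and_differentiate(position, fs, cutoff=30.0, order=4):
    """Low-pass filter a 1-D position time history (m) sampled at fs (Hz),
    then estimate velocity (m/s) by central differences.

    Returns (filtered position, velocity). Cutoff and order mirror the
    kind of 30 Hz, 4-pole Butterworth filter reported in the checklist
    examples, but are illustrative defaults, not a recommendation.
    """
    # Zero-phase Butterworth low-pass (filtfilt avoids phase lag).
    b, a = butter(order, cutoff / (fs / 2.0), btype="low")
    pos_filtered = filtfilt(b, a, np.asarray(position, dtype=float))
    # np.gradient uses central differences, v[i] = (x[i+1] - x[i-1]) / (2*dt),
    # on interior samples and one-sided differences at the ends.
    velocity = np.gradient(pos_filtered, 1.0 / fs)
    return pos_filtered, velocity
```

For a position trace that moves at a constant speed, the central-difference estimate recovers that speed on the interior samples, which is a quick sanity check when validating a tracking pipeline.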