A Late Permian Tectonothermal Event in Grenville Crust of the Southern Maya Terrane: U-Pb Zircon Ages from the Chiapas Massif, Southeastern Mexico
In: International Geology Review, Vol. 47, No. 5, pp. 509-529
OBJECTIVES: There is no consensus on the optimal method of stabilization (arthroscopic or open) for revision anterior shoulder stabilization. The purpose of this study was to determine the success of revision arthroscopic stabilization at preventing further recurrence in active duty military patients. METHODS: Fifty-three revision arthroscopic stabilizations were performed at our institution between 2005 and 2016 for recurrent anterior shoulder instability after an index arthroscopic Bankart procedure. Shoulders with glenoid bone loss >20% were excluded from the study. The primary outcome of interest was the ability to return to activity/duty without subsequent instability. Patients were followed for time to a subsequent instability event and for repeat revision arthroscopic stabilization following return to duty/activity. RESULTS: Patient age at revision surgery averaged 22.9 ± 4.3 years. Mean follow-up was 6.1 years (range, 0.4-12.9). Thirty-four of 53 patients (64%) returned to duty without recurrent instability following revision arthroscopic anterior stabilization; 19 patients (36%) experienced recurrent instability following return to duty. Glenoid bone loss averaged 7.8% ± 8.0% in the successful group and 6.5% ± 6.5% in the failure group (p=0.573). Durability of the index surgery was significantly longer in the successful group (38.1 ± 31.3 months vs. 20.5 ± 17.8 months, p=0.029). There was no difference between groups in patient age or in the number of anchors used in the index or revision stabilization procedures. CONCLUSION: Revision arthroscopic stabilization of a failed primary arthroscopic Bankart repair has a failure rate of 36% in a young active duty military population, which is substantially higher than that of primary arthroscopic Bankart repair in this population and higher than that of revision arthroscopic Bankart repair in other patient populations. 
The similar amounts of bone loss between groups indicate that bone loss is not the primary determinant of failure in revision arthroscopic ...
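The durability comparison above reports group means, SDs, and sizes but not which test produced p = 0.029. A minimal sketch of Welch's two-sample t statistic computed from those summary statistics (an assumption — the authors may have used a different test, so the resulting p value need not match exactly):

```python
import math

def welch_t(m1, sd1, n1, m2, sd2, n2):
    """Welch's t statistic and approximate df from summary statistics."""
    v1, v2 = sd1 ** 2 / n1, sd2 ** 2 / n2
    t = (m1 - m2) / math.sqrt(v1 + v2)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# Durability of the index surgery (months): 38.1 +/- 31.3 in the 34 successes
# vs. 20.5 +/- 17.8 in the 19 failures
t, df = welch_t(38.1, 31.3, 34, 20.5, 17.8, 19)
print(round(t, 2), round(df, 1))  # t is about 2.61
```

A t of this size on roughly 51 degrees of freedom is significant at the 0.05 level, consistent with the reported comparison.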
BACKGROUND: Injury incidence for physically active populations with a high volume of physical load can exceed 79%. There is little existing research focused on the timing of injury and how that timing differs based on certain risk factors. PURPOSE/HYPOTHESIS: The purpose of this study was to report both the incidence and timing of lower extremity injuries during cadet basic training. We hypothesized that women, those with a history of injury, and those in the underweight and obese body mass index (BMI) categories would sustain lower extremity musculoskeletal injury earlier in the training period than men, those without injury history, and those in the normal-weight BMI category. STUDY DESIGN: Cohort study; Level of evidence, 2. METHODS: Cadets from the class of 2022, arriving in 2018, served as the study population. Baseline information on sex and injury history was collected via questionnaire, and BMI was calculated from height and weight taken during week 1 at the United States Military Academy. Categories were underweight (BMI <20), middleweight (20-29.99), and obese (≥30). Injury surveillance was performed over the first 60 days of training via electronic medical record review and monitoring. Kaplan-Meier survival curves were used to estimate group differences in time to first musculoskeletal injury. Cox proportional hazards regression was used to estimate hazard ratios (HRs). RESULTS: A total of 595 cadets participated. The cohort was 76.8% male, with 29.9% reporting a previous injury history and 93.3% having a BMI between 20 and 30. Overall, 16.3% of cadets (12.3% of male cadets and 29.7% of female cadets) experienced an injury during the follow-up period. Women experienced a significantly greater incidence of injury than men (P < .001). Separation of the survival curves comparing the sexes and injury history occurred at weeks 3 and 4, respectively. 
Hazards for first musculoskeletal injury were significantly greater for women versus men (HR, 2.63; 95% CI, 1.76-3.94) and for those who reported a history of injury ...
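The survival curves and hazard ratios above come from standard time-to-event analysis. A minimal product-limit (Kaplan-Meier) sketch on hypothetical follow-up data — the study's individual-level data are not shown, so the six cadets below are invented for illustration:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times  -- follow-up time for each subject (e.g., weeks to injury)
    events -- 1 if the injury was observed at that time, 0 if censored
    Returns (time, survival probability) pairs at each event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    curve, s, i = [], 1.0, 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)        # injuries at time t
        removed = sum(1 for tt, _ in data if tt == t)  # injuries + censored
        if d:
            s *= 1 - d / n_at_risk
            curve.append((t, s))
        n_at_risk -= removed
        i += removed
    return curve

# Hypothetical follow-up of six cadets (weeks; event 0 = censored uninjured)
print(kaplan_meier([2, 3, 3, 5, 8, 8], [1, 1, 0, 1, 0, 0]))
```

Each factor of (1 - d/n) drops the curve at an observed injury time; censored cadets leave the risk set without a drop, which is how differing follow-up is handled.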
BACKGROUND: Meniscal allograft transplantation (MAT) is considered a viable surgical treatment option in the symptomatic, postmeniscectomy knee and as a concomitant procedure with ACL revision and articular cartilage repair. Although promising outcomes have recently been reported in active and athletic populations, MAT has not been well studied in the high-demand military population. QUESTIONS/PURPOSES: (1) What proportion of active-duty military patients who underwent MAT returned to full, unrestricted duty? (2) What demographic and surgical variables, if any, correlated with return to full, unrestricted duty? METHODS: Between 2005 and 2015, three fellowship-trained sports surgeons (TMD, SJS, BDO) performed 110 MAT procedures in active-duty military patients, of whom 95% (104 patients) were available for follow-up at a minimum of 2 years (mean 2.8 ± SD 1.1 years). During the study period, indications for MAT generally included unicompartmental pain and swelling in a postmeniscectomized knee and, as a concomitant procedure, a meniscal-deficient compartment associated with either an ACL revision reconstruction or cartilage repair. Demographic and surgical variables were collected and analyzed. The primary endpoints were the decision for permanent profile activity restrictions and military duty termination by a medical board; the term "medical board" implies termination of military service for medical reasons. We elected to set statistical significance at p < 0.001 to reduce the potential for spurious statistical findings in the setting of a relatively small sample size. RESULTS: Forty-six percent (48 of 104) of eligible patients had permanent profile activity restrictions and 50% (52 of 104) eventually had their military duty terminated by a medical board. 
Only 20% (21 of 104) had neither permanent profile activity restrictions nor medical-board termination and were subsequently able to return to full duty, and only 13% (13 of 104) continued unrestricted military service beyond 2 years after ...
INTRODUCTION: There is a high incidence of shoulder instability among active young athletes and military personnel. Shoulder stabilization surgery is the intervention commonly employed to treat individuals with instability. Following surgery, a substantial proportion of individuals experience acute post-operative pain, which is usually managed with opioid pain medications. Unfortunately, the extended use of opioid medications can have adverse effects that impair function and reduce military operational readiness, and there are currently few alternatives. However, battlefield acupuncture (BFA) is a minimally invasive therapy demonstrating promise as a non-pharmaceutical intervention for managing acute post-operative pain. METHODS: This is a parallel, two-arm, single-blind randomized clinical trial. The two independent variables are intervention (2 levels: standard physical therapy and standard physical therapy plus battlefield acupuncture) and time (5 levels: 24 h, 48 h, 72 h, 1 week, and 4 weeks post shoulder stabilization surgery). The primary dependent variables are worst and average pain as measured on the visual analog scale. Secondary outcomes include medication usage, the Profile of Mood States, and the Global Rating of Change. DISCUSSION: The magnitude of the effect of BFA is uncertain; current studies report confidence intervals of between-group differences that include minimal clinically important differences between intervention and control groups. The results of this study may help determine whether BFA is an effective adjunct to physical therapy in reducing pain and opioid usage in acute pain conditions. TRIAL REGISTRATION: ClinicalTrials.gov NCT04094246. Registered on 16 September 2019.
CONTEXT: Military service members commonly sustain lower extremity stress fractures (SFx). How SFx risk factors influence bone metabolism is unknown; understanding this relationship may help to optimize risk-mitigation strategies. OBJECTIVE: To determine how SFx risk factors influence bone metabolism. DESIGN: Cross-sectional study. SETTING: Military service academy. PATIENTS OR OTHER PARTICIPANTS: Forty-five men (pre-CBT age = 18.56 ± 1.39 years, height = 176.95 ± 7.29 cm, mass = 77.20 ± 9.40 kg, body mass index = 24.68 ± 2.87 kg/m²) who completed Cadet Basic Training (CBT). Individuals with neurologic or metabolic disorders were excluded. INTERVENTION(S): We assessed SFx risk factors (independent variables) with (1) the Landing Error Scoring System (LESS), (2) self-reported injury and physical activity questionnaires, and (3) physical fitness tests. We assessed bone biomarkers (dependent variables; procollagen type I amino-terminal propeptide [PINP] and cross-linked collagen telopeptide [CTx-1]) via serum. MAIN OUTCOME MEASURE(S): A markerless motion-capture system was used to analyze trunk and lower extremity biomechanics via the LESS. Serum samples were collected post-CBT; enzyme-linked immunosorbent assays determined PINP and CTx-1 concentrations, and PINP:CTx-1 ratios were calculated. Linear regression models were used to examine associations between SFx risk factors and PINP and CTx-1 concentrations and the PINP:CTx-1 ratio. Biomarker concentration mean differences with 95% confidence intervals were calculated. Significance was set a priori at α ≤ .10 for simple and α ≤ .05 for multiple regression analyses. RESULTS: The multiple regression models incorporating LESS and SFx risk factor data predicted the PINP concentration (R² = 0.47, P = .02) and the PINP:CTx-1 ratio (R² = 0.66, P = .01). The PINP concentration was increased by foot internal rotation, trunk flexion, CBT injury, sit-up score, and pre- to post-CBT mass changes. 
The CTx-1 concentration was increased by ...
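The R² values above summarize how much biomarker variance the regression models explain. As a sketch of the underlying computation, here is single-predictor least squares with R² on invented data (the study fit multiple regressions, so this illustrates only the mechanics; the LESS/PINP pairing below is hypothetical):

```python
def simple_ols(x, y):
    """Least-squares slope, intercept, and R^2 for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    # R^2 = 1 - residual sum of squares / total sum of squares
    ss_res = sum((yi - intercept - slope * xi) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical: LESS score vs. PINP concentration in five athletes
slope, intercept, r2 = simple_ols([3, 5, 6, 8, 10], [40, 46, 47, 55, 60])
print(round(slope, 2), round(r2, 2))
```

An R² of 0.47, as reported for the PINP model, would mean the predictors jointly account for about half of the between-subject variance in PINP.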
BACKGROUND: In-season return to play after anterior glenohumeral instability is associated with high rates of recurrent instability and the need for surgical stabilization. We are not aware of previous studies that have investigated in-season return to play after posterior glenohumeral instability; furthermore, because posterior shoulder instability occurs frequently in collision athletes, characterizing the expected course of this injury may help physicians counsel in-season athletes on their prognosis and their ability to return to sport. QUESTIONS/PURPOSES: (1) What proportion of collegiate football players returned to play during the season after a posterior instability event? (2) How much time did athletes lose to injury, what proportion of athletes opted to undergo surgery, and what proportion experienced recurrent instability after a posterior instability episode during a collegiate football season? METHODS: A multicenter, prospective, observational study of National Collegiate Athletic Association (NCAA) Division 1 Football Bowl Subdivision athletes was performed at three US Military Service Academies. Ten athletes who sustained a posterior instability event during the regular football season and who pursued a course of nonoperative treatment were identified and prospectively observed through the subsequent season. All athletes in the observed cohort attempted an initial course of nonoperative treatment during the season. All sustained subluxation events initially identified through history and physical examination at the time of injury; none sustained a dislocation event requiring manual reduction. Posterior labral pathology was subsequently identified in all subjects via MRI arthrogram. Return to play was the primary outcome of interest. 
Time lost to injury, surgical intervention, and subsequent instability were secondary outcomes. RESULTS: Of the 10 ...
CONTEXT: Approximately half of individuals who sustain a concussion do not immediately report their injuries. Motivators for not reporting include thinking the suspected concussion was not a serious injury and wanting to continue participating in activity. Additionally, military personnel have concerns about how concussions may affect their careers. However, delayed reporting can prolong neurobehavioral recovery. Understanding the frequency of delayed reporting and its contributing factors will aid in identifying individuals who may be more likely to delay reporting. OBJECTIVE: To describe the frequency of delayed concussion reporting by service academy cadets and determine whether sex, injury setting, sport level, or medical history predicts delayed reporting. DESIGN: Cohort study. SETTING: Service academies. PATIENTS OR OTHER PARTICIPANTS: A total of 316 patients with concussions were observed from January 2014 to August 2016. MAIN OUTCOME MEASURE(S): All cadets completed an annual concussion baseline collection of demographic, medical history, and sports participation information. Delayed concussion reporting served as the outcome variable. Predictor variables were sex, injury setting, and sport level, as well as concussion, headache, and learning disorder history. Frequencies were calculated to describe the proportion of participants who delayed reporting. Univariable and multivariable logistic regression models were used to assess whether the predictor variables were associated with delayed concussion reporting. Odds ratios (ORs) and 95% confidence intervals were calculated for all variables included in the final model. RESULTS: Of the patients with concussion, 51% were classified as delayed reporting. In univariable models, females (OR = 1.70) and National Collegiate Athletic Association cadet-athletes (OR = 1.98) were more likely to delay reporting than males and intramural cadet-athletes, respectively. The multivariable model yielded similar findings. 
CONCLUSIONS: Roughly half of the cadets who ...
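The odds ratios above (e.g., OR = 1.70 for females) come from logistic regression; for a single binary predictor they reduce to the familiar 2×2-table odds ratio. A sketch with a Wald confidence interval on hypothetical counts (the study's cell counts are not given in the abstract):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table.

    a = delayed reporters among females,  b = prompt reporters among females
    c = delayed reporters among males,    d = prompt reporters among males
    (labels illustrative; the counts used below are hypothetical)
    """
    or_ = (a * d) / (b * c)
    # standard error of log(OR) is sqrt of summed reciprocal cell counts
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

print(odds_ratio_ci(30, 20, 15, 30))
```

The interval is computed on the log scale and exponentiated, which is why published ORs carry asymmetric confidence limits.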
Despite the significant impact that concussion has on military service members, substantial gaps remain in our understanding of the optimal diagnostic, management, and return-to-activity/duty criteria needed to mitigate the consequences of concussion. In response to these knowledge gaps, the US Department of Defense (DoD) and the National Collegiate Athletic Association (NCAA) partnered to form the NCAA-DoD Grand Alliance in 2014. The NCAA-DoD CARE Consortium was established with the aim of creating a national multisite research network to study the clinical and neurobiological natural history of concussion in NCAA athletes and military Service Academy cadets and midshipmen. In addition to the data collected for the larger CARE Consortium effort, the service academies have pursued military-specific lines of research relevant to operational and medical readiness associated with concussion. The purpose of this article is to describe the structure of the NCAA-DoD Grand Alliance efforts at the service academies, discuss military-specific research objectives, and provide an overview of progress to date. A secondary objective is to discuss the challenges associated with conducting large-scale studies in the Service Academy environment and highlight future directions for concussion research endeavors across the CARE Service Academy sites.
INTRODUCTION: The prevalence and possible long-term consequences of concussion remain an increasing concern to the U.S. military, particularly as they pertain to maintaining a medically ready force. Baseline testing is used in both the civilian and military domains to assess concussion injury and recovery. Accurate interpretation of these baseline assessments requires consideration of other influencing factors not related to concussion. To date, there is limited understanding, especially within the military, of what factors influence normative test performance. Given the significant physical and mental demands placed on service academy members (SAMs), and their relatively high risk for concussion, it is important to describe the demographics and normative profile of SAMs. Furthermore, the absence of available baseline normative data on female and non-varsity SAMs makes interpretation of post-injury assessments challenging. Understanding how individuals perform at baseline, given their unique individual characteristics (e.g., concussion history, sex, competition level), will inform post-concussion assessment and management. Thus, the primary aim of this manuscript is to characterize the SAM population and determine normative values on a concussion baseline testing battery. MATERIALS AND METHODS: All data were collected as part of the Concussion Assessment, Research and Education (CARE) Consortium. The baseline test battery included a post-concussion symptom checklist (Sport Concussion Assessment Tool [SCAT]), a psychological health screening inventory (Brief Symptom Inventory [BSI-18]), a neurocognitive evaluation (ImPACT), the Balance Error Scoring System (BESS), and the Standardized Assessment of Concussion (SAC). Linear regression models were used to examine differences across sexes, competition levels, and varsity contact levels while controlling for academy, freshman status, race, and previous concussion. 
Zero-inflated negative binomial models estimated symptom scores because of the high frequency of zero scores. RESULTS: Significant but small sex effects were observed on the ImPACT visual memory task. Although females performed worse than males (p < 0.0001, ηp² = 0.01), these differences were small and not larger than the effects of the covariates. A similar pattern was observed for competition level on the SAC: SAMs participating in varsity athletics performed significantly worse on the SAC than SAMs participating in club or intramural athletics (all p < 0.001, η² = 0.01), although the difference was small. When examining symptom reporting, males were more than two times as likely to report zero symptoms on the SCAT or BSI-18. Intramural SAMs had the highest symptom number and severity compared with varsity SAMs (p < 0.0001, Cohen's d < 0.2). Contact level was not associated with SCAT or BSI-18 symptoms among varsity SAMs. Notably, the significant differences across competition level on the SCAT and BSI-18 were subclinical and had small effect sizes. CONCLUSION: The current analyses provide the first baseline concussion battery normative data among SAMs. Although statistically significant differences may be observed on baseline tests, the effect sizes for competition and contact levels are very small, indicating that differences are likely not clinically meaningful at baseline. Identifying baseline differences and significant covariates is important for future concussion-related analyses to inform concussion evaluations for all athlete levels.
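Zero-inflated negative binomial models handle an excess of zero symptom scores by mixing a point mass at zero with an ordinary negative binomial count distribution. A sketch of the resulting probability mass function under one common parameterization (an assumption — the exact specification used in the analysis is not given in the abstract):

```python
import math

def nb_pmf(k, r, p):
    """Negative binomial pmf: P(k failures before r successes), success prob p.
    r may be non-integer, so the binomial coefficient is built from lgamma."""
    log_coef = math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
    return math.exp(log_coef + r * math.log(p) + k * math.log(1 - p))

def zinb_pmf(k, pi, r, p):
    """Zero-inflated NB: mixes a structural-zero mass pi with nb_pmf."""
    base = (1 - pi) * nb_pmf(k, r, p)
    return pi + base if k == 0 else base

# With 30% structural zeros, the zero probability rises from 0.25 to 0.475
print(nb_pmf(0, 2, 0.5), zinb_pmf(0, 0.3, 2, 0.5))
```

The extra mass at zero captures respondents who report no symptoms at all, which a plain count model would systematically underpredict.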
BACKGROUND: Concussion, or mild traumatic brain injury, is a major public health concern affecting 42 million individuals globally each year. However, little is known regarding concussion risk factors across all concussion settings, as most concussion research has focused only on sport-related or military-related concussive injuries. METHODS: The current study is part of the Concussion Assessment, Research and Education (CARE) Consortium, a multi-site investigation of the natural history of concussion. Cadets at three participating service academies completed annual baseline assessments, which included demographics, medical history, and concussion history, along with the Sport Concussion Assessment Tool (SCAT) symptom checklist and the Brief Symptom Inventory (BSI-18). Clinical and research staff recorded the date and injury setting at the time of concussion. Generalized mixed models estimated concussion risk with service academy as a random effect. Because concussion was a rare event, odds ratios were assumed to approximate relative risk. RESULTS: Beginning in 2014, 10,604 cadets (n = 2421, 22.83% female) enrolled over 3 years. A total of 738 (6.96%) cadets experienced a concussion; 301 of the concussed cadets (2.84% of the cohort) were female. Female sex and previous concussion were the most consistent estimators of concussion risk across all concussion settings. Compared with males, females had 2.02 (95% CI: 1.70-2.40) times the risk of a concussion regardless of injury setting, and greater relative risk when the concussion occurred during sport (odds ratio [OR]: 1.38; 95% CI: 1.07-1.78). Previous concussion was associated with 1.98 (95% CI: 1.65-2.37) times increased risk for any incident concussion, and the magnitude was relatively stable across all concussion settings (OR: 1.73 to 2.01). Freshman status was also associated with increased overall concussion risk, but this was driven by increased risk for academy training-related concussions (OR: 8.17; 95% CI: 5.87-11.37). 
Medical history of headaches in the past 3 months, diagnosed ADD/ADHD, and BSI-18 Somatization symptoms increased overall concussion risk. CONCLUSIONS: Various demographic and medical history factors are associated with increased concussion risk. While certain factors (e.g. sex and previous concussion) are consistently associated with increased concussion risk, regardless of concussion injury setting, other factors significantly influence concussion risk within specific injury settings. Further research is required to determine whether these risk factors may aid in concussion risk reduction or prevention.
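The methods note that, because concussion was rare, odds ratios were assumed to approximate relative risk. A quick numeric illustration with hypothetical per-group risks (chosen near the cohort's roughly 7% incidence) shows how the two measures converge as the outcome gets rarer:

```python
def risk_ratio(p1, p0):
    """Relative risk: ratio of outcome probabilities in two groups."""
    return p1 / p0

def odds_ratio(p1, p0):
    """Odds ratio: ratio of odds p/(1-p) in two groups."""
    return (p1 / (1 - p1)) / (p0 / (1 - p0))

# Hypothetical risk pairs, each with a true relative risk of 2.0
for p1, p0 in [(0.10, 0.05), (0.04, 0.02), (0.02, 0.01)]:
    print(risk_ratio(p1, p0), round(odds_ratio(p1, p0), 3))
```

At 10% vs. 5% the OR already overstates the RR only slightly (about 2.11 vs. 2.0), and the gap shrinks further at lower risks, which is what justifies the approximation in this cohort.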
CONTEXT: Assessments of the duration of concussion recovery have primarily been limited to sport-related concussions and male contact sports. Furthermore, whereas durations of symptoms and return-to-activity (RTA) protocols encompass total recovery, the trajectory of each duration has not been examined separately. OBJECTIVE: To identify individual (eg, demographics, medical history), initial concussion injury (eg, symptoms), and external (eg, site) factors associated with symptom duration and RTA-protocol duration after concussion. DESIGN: Cohort study. SETTING: Three US military service academies. PATIENTS OR OTHER PARTICIPANTS: A total of 10,604 cadets at participating US military service academies enrolled in the study and completed a baseline evaluation and up to 5 postinjury evaluations. A total of 726 cadets (451 men, 275 women) sustained concussions during the study period. MAIN OUTCOME MEASURE(S): Number of days from injury (1) until the participant became asymptomatic and (2) to complete the RTA protocol. RESULTS: Varsity athlete cadets took less time than nonvarsity cadets to become asymptomatic (hazard ratio [HR] = 1.75, 95% confidence interval = 1.38, 2.23). Cadets who reported less symptom severity on the Sport Concussion Assessment Tool, third edition (SCAT3), within 48 hours of concussion had 1.45 to 3.77 times shorter symptom-recovery durations than those with more symptom severity. Similar to symptom duration, varsity status was associated with a shorter RTA-protocol duration (HR = 1.74, 95% confidence interval = 1.34, 2.25), and less symptom severity on the SCAT3 was associated with a shorter RTA-protocol duration (HR range = 1.31 to 1.47). The academy that the cadet attended was associated with the RTA-protocol duration (P < .05). CONCLUSIONS: The initial total number of symptoms reported and varsity athlete status were strongly associated with symptom and RTA-protocol durations. 
These findings suggested that external (varsity status and academy) and injury (symptom burden) factors ...