Neurology
2026-Mar-24
Contrast-associated acute kidney injury (CA-AKI) is a potentially preventable complication after exposure to iodinated contrast media. In patients undergoing endovascular thrombectomy (EVT) for acute ischemic stroke (AIS), the incidence and clinical impact are poorly characterized, and no validated prediction tool is currently available. The aim of this study was to assess the incidence and prognostic significance of CA-AKI in EVT-treated patients with AIS and to develop and validate a predictive score.
A retrospective, multicenter cohort study was conducted involving EVT-treated patients across 73 centers in 16 countries (January-December 2023). Inclusion criteria were age ≥18 years, absence of dialysis, availability of preprocedural and 48-hour postprocedural creatinine levels, and available 90-day follow-up (modified Rankin Scale [mRS] score). The primary outcome was CA-AKI, defined by KDIGO (Kidney Disease: Improving Global Outcomes) criteria (creatinine increase ≥0.3 mg/dL or ≥1.5 times baseline within 48 hours). Secondary outcomes were (1) in-hospital mortality, (2) 90-day mRS score, and (3) 90-day severe disability or death (mRS score >3). Logistic models assessing associations with outcomes accounted for within-center clustering by applying robust standard errors. CA-AKI prediction models were developed across imputed data sets using univariable selection (p < 0.20), backward elimination (p < 0.05), and coefficient-based scoring after categorization of continuous predictors, with internal validation by bootstrap to obtain optimism-adjusted estimates.
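The KDIGO criterion stated above is a simple rule that can be expressed directly. A minimal sketch (function name and example values are illustrative, not taken from the study):

```python
def meets_kdigo_ca_aki(baseline_cr, cr_48h):
    """CA-AKI per the KDIGO criterion described above:
    creatinine rise >=0.3 mg/dL OR >=1.5x baseline within 48 hours.
    Both values in mg/dL. Illustrative helper, not study code."""
    absolute_rise = cr_48h - baseline_cr >= 0.3
    relative_rise = cr_48h >= 1.5 * baseline_cr
    return absolute_rise or relative_rise

# Rise of 0.25 mg/dL and ratio 1.25x: neither criterion met -> False
print(meets_kdigo_ca_aki(1.0, 1.25))
# Rise of 0.35 mg/dL meets the absolute criterion -> True
print(meets_kdigo_ca_aki(1.0, 1.35))
# Ratio 1.56x meets the relative criterion despite a small absolute rise -> True
print(meets_kdigo_ca_aki(0.5, 0.78))
```

Note that the disjunction matters: at low baseline creatinine the relative criterion can trigger well before the 0.3 mg/dL absolute threshold, as the last example shows.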
Among 6,638 patients (median age 74 years; 48.7% male), CA-AKI occurred in 326 (4.9%) and was independently associated with in-hospital mortality (adjusted odds ratio [aOR] 2.269; 95% CI 1.615-3.190), higher 90-day mRS scores (adjusted common odds ratio 1.584; 95% CI 1.110-2.258), and 90-day severe disability or death (aOR 1.530; 95% CI 1.057-2.216). A preprocedural risk model including 12 routine clinical variables (sex, ethnicity, arterial hypertension, dyslipidemia, chronic kidney disease, antiplatelet therapy, NIH Stroke Scale score at admission, serum glucose, estimated glomerular filtration rate, hemoglobin, mean arterial pressure, and IV thrombolysis) demonstrated acceptable discrimination (area under the receiver operating characteristic curve 0.710 [95% CI 0.682-0.738]; precision-recall area under the curve 0.13 [95% CI 0.10-0.16]), good calibration (slope 0.870 [95% CI 0.759-0.928]), and good overall performance (Brier score 0.045 [95% CI 0.042-0.049]). A second model that included EVT-related variables (e.g., contrast volume) showed similar performance.
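Coefficient-based scoring, as named in the methods, commonly converts the logistic coefficients of the categorized predictors into integer points by dividing each coefficient by the smallest one and rounding. The coefficients below are entirely hypothetical placeholders; the published model's weights are not reproduced here:

```python
# HYPOTHETICAL coefficients for illustration only -- not the published score.
hypothetical_coefs = {
    "chronic_kidney_disease": 0.90,
    "elevated_glucose": 0.55,
    "low_egfr": 1.20,
    "low_hemoglobin": 0.30,
}

def to_points(coefs):
    """Assign integer points by scaling each logistic coefficient
    against the smallest absolute coefficient (= 1 point)."""
    ref = min(abs(b) for b in coefs.values())
    return {name: round(b / ref) for name, b in coefs.items()}

print(to_points(hypothetical_coefs))
# e.g., low_egfr (1.20) scores 4 points against the 0.30 reference
```

A patient's total score is then the sum of points for the categories that apply, which is what makes such a score usable at the bedside before the procedure.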
In this large, international cohort, CA-AKI occurred in approximately 1 in 20 EVT-treated patients with AIS and was independently associated with poor outcomes. A simple preprocedural risk score enables early identification of high-risk individuals and may support preventive strategies.
Obstetrics and gynecology
2026-Mar-01
To investigate the treatment effect of adjuvant chemotherapy for stage I ovarian clear cell carcinoma.
We searched Cochrane, PubMed, International Standard Randomised Controlled Trial Number registry, ClinicalTrials.gov, the World Health Organization International Clinical Trials Registry Platform, and Ichushi-Web to January 22, 2025.
We included randomized controlled trials (RCTs) and non-RCTs that included more than 50 patients with stage I ovarian clear cell carcinoma. The primary and secondary outcomes were disease-free survival and overall survival, respectively. We performed a meta-analysis of the stage-adjusted hazard ratios (HRs) of adjuvant chemotherapy compared with placebo or no intervention. The substage-related heterogeneity of effects was also assessed. A meta-analysis of proportions was also conducted to assess 5-year disease-free survival and 5-year overall survival. Risk of bias was assessed with the Risk of Bias in Non-randomized Studies of Interventions tool.
Because no RCTs reported HRs for the ovarian clear cell carcinoma subgroup, data from nine non-RCTs were analyzed. The pooled substage-adjusted HR for disease-free survival associated with use of chemotherapy was 0.47 (95% CI, 0.29-0.74) and that for overall survival was 0.66 (95% CI, 0.43-1.00). Heterogeneity in the effect by substage was not evident for either disease-free survival (P for subgroup difference=.91) or overall survival (P=.60). The pooled 5-year disease-free survival was 0.80 (95% CI, 0.65-0.89) for stage I overall, 0.95 (95% CI, 0.47-1.0) for stage IA, and 0.61 (95% CI, 0.47-0.74) for stage IC. The estimated number needed to treat was 10.2 (95% CI, 5.8-18.6) for stage I overall, 40.8 (95% CI, 3.9-infinity) for stage IA, and 5.2 (95% CI, 3.9-7.8) for stage IC.
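A standard way to derive a number needed to treat from a pooled hazard ratio and a baseline survival proportion assumes proportional hazards: treated survival = S0^HR, so NNT = 1/(S0^HR − S0). Applying this to the reported 5-year disease-free survival figures approximately reproduces the published estimates, although we are assuming this is how they were derived:

```python
def nnt_from_hr(s0, hr):
    """NNT from baseline 5-year disease-free survival s0 and hazard
    ratio hr, assuming proportional hazards (treated survival = s0**hr).
    The absolute risk reduction is the survival gained by treatment."""
    arr = s0 ** hr - s0
    return 1.0 / arr

# Stage I overall: DFS 0.80, HR 0.47 -> ~10 (published estimate: 10.2)
print(round(nnt_from_hr(0.80, 0.47), 1))
# Stage IC: DFS 0.61 -> ~5.5 (published estimate: 5.2)
print(round(nnt_from_hr(0.61, 0.47), 1))
# Stage IA: DFS 0.95 -> ~38 (published estimate: 40.8)
print(round(nnt_from_hr(0.95, 0.47), 1))
```

The pattern in the results follows directly from this arithmetic: the higher the baseline survival (stage IA), the smaller the absolute risk reduction and the larger the NNT, which is why the abstract stresses weighing baseline recurrence risk against absolute benefit.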
Adjuvant chemotherapy improves disease-free survival and may prolong overall survival in patients with stage I ovarian clear cell carcinoma. Available evidence suggests that recurrence is reduced by approximately 50%. Treatment decisions should consider the baseline recurrence risks and absolute benefits.
PROSPERO, CRD42024562486.
Proceedings of the National Academy of Sciences of the United States of America
2026-Feb-24
Straw return reshapes the biogeochemical processes in paddy soils by driving microbial transformation of key elements. Despite growing awareness of these individual processes, their integration under millimeter-scale spatiotemporal heterogeneity remains unclear. Combining high-resolution geochemical profiling with multiomics, we revealed that straw addition altered the depth-dependent dynamics of arsenic, carbon, and nitrogen, establishing a sophisticated three-layer microbial stratification. We identified 1) an 18 mm organic matter (OM)-rich layer extending from the straw layer, which serves as a methanogenic epicenter co-occurring with active nitrogen fixation microbes; 2) an overlying layer dominated by aerobic methane oxidation and denitrification microbes; and 3) a deeper substraw layer dominated by anaerobic arsenite oxidation and denitrification microbes. Significantly positively correlated abundances of transcribed mcrA with nifH genes and pmoA or aioA/arxA with denitrification genes were identified. Corroboratively, intensified co-occurrence patterns of mcrA with nifH, pmoA with denitrification, and aioA/arxA with denitrification genes were observed in the OM-rich, upper, and lower layers, respectively. Moreover, the co-occurring mcrA-nifH and aioA-nirS/arxA-narG genes in different metagenome-assembled genomes presented 80.6- to 260.8-fold and 1.55- to 6.85-fold greater transcriptional activity in the OM-rich and lower layers than in the other layers, respectively. Our results demonstrated that straw incorporation established a dynamic soil redox zone, restructuring millimeter-scale microbial networks and promoting potentially coupled denitrification with arsenite or methane oxidation, as well as methanogenesis with nitrogen fixation. These findings provide a mechanistic basis for optimizing subsurface straw placement and nitrate application to enhance nutrient cycling and mitigate environmental risks.
JMIR formative research
2026-Feb-19
Anemia is a widespread global health issue. Hemoglobin (Hb) concentration measurement remains the most common method for anemia screening and diagnosis. In recent years, there has been growing interest in the development of noninvasive point-of-care technologies that eliminate the need for blood sampling.
This pilot study explores the feasibility of using a noncontact photoplethysmography-based mobile app for Hb monitoring.
Adult volunteers aged 18 years and older, of both sexes, were consecutively recruited. Participants were seated and allowed a 2-minute rest before measurements. During testing, they faced a smartphone running comestai.app, which used the front-facing camera to capture facial videos. Simultaneous readings were collected for Hb over approximately 90 seconds using the app. Ambient lighting was standardized for all remote photoplethysmography recordings. No medical decisions were made based on the app-generated data. A complete blood count, including Hb levels, was used as a reference for comparison with the data collected using comestai.app.
A total of 555 (female: n=313, 56.4%; male: n=242, 43.6%) individuals participated in the study. The app achieved a mean absolute error of 1.46, a mean absolute percentage error of 11.26, a mean error of -0.67, and a root mean square error of 1.88. The Bland-Altman plot evaluated the agreement between the app-based and laboratory-based Hb measurements, with the mean difference between the 2 methods being -0.70 g/dL. The method demonstrated an overall accuracy of 75%. The area under the curve was 0.701 (95% CI 0.655-0.745).
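The reported agreement statistics are all computable from the paired app and laboratory readings. A self-contained sketch with toy data (not study data) showing how each metric, and the Bland-Altman limits of agreement, is obtained:

```python
import math

def agreement_metrics(app, lab):
    """Error and Bland-Altman summaries for paired measurements,
    matching the quantities reported above. Inputs are toy values."""
    diffs = [a - b for a, b in zip(app, lab)]
    n = len(diffs)
    mae = sum(abs(d) for d in diffs) / n                      # mean absolute error
    mape = 100 * sum(abs(d) / b for d, b in zip(diffs, lab)) / n  # % error vs lab
    me = sum(diffs) / n                                       # mean (signed) error
    rmse = math.sqrt(sum(d * d for d in diffs) / n)           # root mean square error
    sd = math.sqrt(sum((d - me) ** 2 for d in diffs) / (n - 1))
    loa = (me - 1.96 * sd, me + 1.96 * sd)                    # Bland-Altman 95% limits
    return {"MAE": mae, "MAPE": mape, "ME": me, "RMSE": rmse, "LoA": loa}

# Toy paired Hb readings in g/dL (illustrative only)
app = [12.1, 13.4, 10.9, 14.2]
lab = [12.8, 13.0, 11.5, 15.0]
print(agreement_metrics(app, lab))
```

The mean error (here the app's −0.67 g/dL) captures systematic bias, while the limits of agreement capture how far an individual app reading may plausibly sit from the laboratory value.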
Comestai.app offers an innovative approach to wellness monitoring by providing noninvasive Hb estimation using the smartphone's front-facing camera. Continued development, including algorithmic refinement and larger-scale validation in diverse populations, will be key to enhancing accuracy and broadening its utility. By leveraging the ubiquity of smartphones, comestai.app contributes meaningfully to the democratization of health monitoring and the promotion of proactive self-care.
Journal of medical Internet research
2026-Feb-19
The exponential growth of medical data and advancements in artificial intelligence (AI) have accelerated the development of data-driven health care. However, the secure and efficient sharing of sensitive medical data across institutions remains a major challenge due to privacy concerns, data silos, and regulatory restrictions. Traditional centralized systems are prone to data breaches and single points of failure, while existing privacy-preserving techniques face high computational and communication costs.
This study aims to provide a comprehensive review of the recent advances in blockchain-based federated learning (BCFL) within the medical field. By exploring the synergistic integration of federated learning and blockchain, this review evaluates how BCFL enhances data security, supports privacy-preserving cross-institutional collaboration, and facilitates practical applications in health care, including medical data sharing, Internet of Medical Things, public health surveillance, and telemedicine.
We conducted a systematic literature review using databases such as PubMed, IEEE Xplore, Web of Science, and Google Scholar. Boolean logic and domain-specific keywords were used to retrieve studies from 2018 to 2025. After automated deduplication and multistage manual screening, over 100 high-quality papers were included. These works cover BCFL's theoretical foundations, system architectures, application domains, limitations, and future directions.
BCFL frameworks combine the decentralized trust and auditability of blockchain with the privacy-preserving collaborative learning capabilities of federated learning. This integration mitigates risks such as model tampering, data leakage, and a lack of incentives in federated systems. Applications span across cross-institutional medical data sharing, Internet of Medical Things, epidemic forecasting, and telemedicine. Architectures including fully coupled, flexibly coupled, and loosely coupled models offer varying trade-offs between efficiency, scalability, and security.
BCFL represents a transformative paradigm for secure, collaborative, and privacy-preserving medical AI. By combining decentralized trust, incentive-driven participation, and privacy-enhancing machine learning, BCFL paves the way for next-generation smart health care systems. Despite current technical and practical challenges, BCFL demonstrates strong potential to support precision medicine, global health data collaboration, and large-scale AI deployment in health care.
Journal of medical Internet research
2026-Feb-19
Alerts, a key feature of electronic health record systems, are intended to improve patient safety by providing timely information at the point of care. However, many electronic health record systems generate excessive alerts that are not immediately clinically relevant and that contribute to alert fatigue. Despite growing recognition of alert fatigue as a safety concern, clinicians' experiences of alert fatigue and the broader system-level factors that contribute to it are not well understood.
This study aims to use a human factors approach to (1) comprehensively explore how alert fatigue is experienced by junior doctors; (2) identify factors that contribute to experiences of alert fatigue; (3) identify perceived impacts of alert fatigue on employees, organizations, and patients; and (4) identify strategies to reduce alert fatigue in practice.
Semistructured interviews were conducted with junior doctors working in hospitals across Australia. Data were thematically analyzed using a hybrid inductive and deductive approach, informed by the Systems Engineering Initiative for Patient Safety and an information processing model.
A total of 20 junior doctors were interviewed. Alert fatigue was described as occurring at different stages of information processing, including when alerts were not detected, were superficially processed using mental shortcuts, or required excessive cognitive effort to interpret. When alerts were not detected or thoroughly processed, participants more often perceived impacts on patient safety and care quality due to the potential to miss important information. Further, when alerts required excessive cognitive effort, participants frequently reported interruptions, frustration, and time and effort loss as impacts. Factors influencing experiences of alert fatigue were identified in all Systems Engineering Initiative for Patient Safety work system domains, including those related to people, tasks, the environment, tools and technologies, and the organization. Key contributors included the design and clinical relevance of alerts, institutional norms and expectations, and information overload from system alerts as well as other alerts and tasks. Alert fatigue was also described as being experienced differently depending on provider characteristics, such as experiences with and knowledge of alerts, mood, and personality, and organizational factors, including culture, shift type, and time of day.
Alert fatigue is not a binary concept but is instead experienced on a continuum and influenced by interacting individual, technical, and contextual factors. Future research should incorporate clinician self-reports to evaluate experiences of alert fatigue in addition to objective measures. Addressing alert fatigue requires tailored interventions that target its different causes and outcomes. These could include technical and design improvements, changes to organizational practices, and individual customization to reduce experiences of fatigue and accommodate differences in clinicians' needs.
JMIR human factors
2026-Feb-19
Patients with insomnia have difficulty in both falling asleep and maintaining sleep. Individuals with long-term sleep deprivation are prone to poor concentration and impaired memory; however, these problems can be alleviated following brief behavioral treatment for insomnia (BBT-I). This study involved the design of an app called "Sleep Well" that enables individuals with insomnia to easily record their sleep behavior. The app guides users to recall and record sleep-related information, acquire sleep hygiene knowledge, and communicate with therapists online.
This study examined how specific sleep diary interface design features in a brief behavioral treatment for insomnia (BBT-I) app influence users' attention and short-term memory. Using a combination of objective eye-tracking measures and subjective attention assessments, the study compared 3 interface designs to determine how visual layout, input modality, and interaction style interact with insomnia symptoms to affect attentional performance, memory accuracy, and user preference.
Three sleep diary interfaces were designed, varying background mode (day vs night), color scheme (blue vs green), box shape (circular, rounded rectangular, or rectangular), and input method (slide-in, tap, or type-in). A total of 33 participants completed standardized diary-entry tasks while eye movements were recorded using an eye tracker to capture gaze trajectories and visual attention patterns during app interaction. User experience, subjective attention, and interface preferences were assessed using structured questionnaires. Data were analyzed using descriptive statistics, nonparametric tests, Pearson correlation analysis, cross-tabulation analysis, and exploratory factor analysis to examine associations among interface design, attentional performance, memory accuracy, and user characteristics.
A total of 33 participants (n=13, 39.4% male and n=20, 60.6% female) aged 20 to 64 years completed this study. Based on the Insomnia Severity Index, 6 of 33 (18.2%) participants had clinical insomnia and 13 of 33 (39.4%) reported insomnia symptoms. Most participants reported staying up late (22/33, 66.7%), and more than half of participants reported drinking tea (17/33, 51.5%). Interface design significantly influenced objective attentional performance, as measured by eye-tracking indicators of task efficiency and visual allocation. Sleep quality and insomnia symptoms were consistently associated with attentional and short-term memory outcomes, with memory accuracy varying across interfaces and showing particular sensitivity to sleep maintenance difficulties. Subjective attentional control was strongly associated with both eye-tracking metrics and memory performance, and interface preferences differed by insomnia status.
Interface design significantly modulates attention and short-term memory performance in users with insomnia. Eye-tracking revealed that insomnia symptoms and sleep quality influence visual attention and task efficiency, whereas subjective attentional control showed stronger and more consistent associations with memory accuracy than physiological eye-movement indicators. These findings suggest that cognitive processing during sleep diary completion relies more on internal attentional states than on observable gaze behavior. Designing low-load, attention-supportive interfaces may therefore improve usability and data accuracy in digital BBT-I interventions.
JMIR research protocols
2026-Feb-19
Urinary calculi (UC), affecting 1%-13% of the global population, pose a significant health burden due to high recurrence rates (up to 50% within 10 years) and substantial health care costs. Adequate fluid intake is a cornerstone of prevention, yet adherence remains poor due to forgetfulness, lifestyle barriers, and limited patient education. Existing mobile health interventions for UC prevention often lack medical oversight and clinical validation. A WeChat-based digital therapeutic intervention may have a positive effect on fluid adherence in this patient group.
Our objective is to develop a WeChat applet to improve hydration behavior and reduce stone recurrence among postoperative patients with UC.
This is an open-label, 2-arm, parallel-group randomized controlled trial. We will recruit 148 participants from a tertiary hospital in China and randomly allocate them in a 1:1 ratio to the intervention or control group. The intervention group will receive standard postoperative care supplemented by the WeChat-Based Applet Fluid Intake Reminder (WAFIR), which delivers personalized fluid intake reminders, urine color monitoring, 24-hour fluid intake and urine output tracking, and evidence-based educational content on hydration and urolithiasis management. The control group will receive standard care consisting of general discharge instructions from nurses. The primary outcome is fluid adherence, measured by 24-hour fluid intake and urine volume; secondary outcome measures are the Wisconsin Stone Quality of Life Questionnaire, Patient Health Questionnaire-9, Electronic Health Literacy Scale, physical activity (International Physical Activity Questionnaire-Short Form), and recurrence rate of UC. Outcomes are measured before intervention (T0) and after a 1-month (T1) and 3-month (T2) follow-up period. Intention-to-treat analysis, 2-tailed t tests, and repeated measures ANOVA will be used to compare outcomes; statistical significance is set at P<.05. The study was approved by the ethics review board in December 2024.
The development of WAFIR, conducted in collaboration with stakeholders, was finalized in February 2025. Recruitment commenced on March 1, 2025; data collection was completed in September 2025, and data analysis was completed in December 2025. Dissemination of findings is planned through conferences and publications in 2026.
This research evaluates the effectiveness of a nurse-led, evidence-based digital therapeutic intervention, WAFIR, in overcoming fluid adherence barriers among postoperative patients following urolithiasis surgery. The aims are to increase daily fluid intake and urine output, reduce recurrence rates, enhance quality of life, and generate empirical evidence for its application in urology care, thereby optimizing postoperative management within clinical settings.
JMIR pediatrics and parenting
2026-Feb-19
Parents, as the most proximal influence on young children, play an important role in shaping toddler behaviors. Yet, evidence on how parents shape toddler screen use is limited. Little is also known about the relationship between toddler screen use and BMI. Given existing disparities in screen use and early childhood obesity, a focus on Mexican American families with toddlers is warranted.
This study aimed to evaluate the independent contributions of both maternal screen use and screen-related parenting practices with toddler screen use duration, for both TV viewing and mobile device use, and examine the relationship between toddler screen use duration and BMI.
This cross-sectional study enrolled 384 Mexican American mother-toddler dyads recruited from safety net clinics. Enrolled mothers completed 7-day screen use diaries and surveys on screen-related parenting practices, and toddler anthropometrics were obtained. Negative binomial regression models estimated the relationships between screen-related parenting practices and maternal screen use (predictors) with child duration of daily TV use and mobile device use (outcomes). Spearman correlations were calculated to estimate the relationship between toddler screen use duration and age- and sex-specific BMI z scores.
Maternal duration of daily TV and mobile device use were associated with toddler duration of daily TV (adjusted rate ratios [aRRs] 1.27-1.28; all P<.001) and mobile device use (aRRs 1.17-1.18; all P<.001), respectively, even after adjusting for maternal screen-related parenting practices. Specific parenting practices, including restriction of TV time (aRR=0.86; P=.01), restriction of mobile device time (aRR=0.80; P=.02), use of TV (aRR=1.27; P=.003) and mobile devices (aRR=1.78; P<.001) for child behavior regulation, and coviewing of mobile devices (aRR=1.51; P<.001), were associated with toddler duration of daily screen use, adjusted for maternal duration of daily screen use. Neither toddler duration of daily TV viewing nor daily mobile device use was correlated with toddler BMI z scores.
Both the duration of maternal screen use and screen-related parenting practices, for both TV and mobile devices, should be considered when promoting healthy screen use in toddlers in Mexican American families. Interventionists should consider the family ecology when designing interventions promoting healthy screen use in early childhood.
JMIR formative research
2026-Feb-19
Individuals with tic disorders (TDs) have access to a small but growing number of digital tools (such as apps and websites) for tic management and support. While prior work has shown promise for these tools, they have traditionally been designed by researchers first and evaluated by members of the TD community after tool development is complete. A human-centered design process targeting this domain has the potential to reveal new insights relevant to the development of future tools. We seek to establish a preliminary understanding of how the TD community uses and perceives current resources for tic management and support as well as their overall concerns and needs in this area.
This study aimed to explore the design potential of future digital tools for helping adults manage their tics by gathering an initial set of needs and requirements from adult members of the TD community in the United States.
An online survey was distributed via TD community groups and also via TD clinicians and researchers in the United States. The survey contained a combination of dichotomous, multiple-choice, and open-ended questions, with opportunities for participants to specify how they currently receive support, rank their preferred features and requirements, and express their needs and concerns relevant to future work. Qualitative responses were analyzed with inductive thematic analysis.
Most respondents typically sought answers from digital platforms first (124/158, 78.5%) when confronting a question about their tics. Even so, only 18.4% (29/158) reported having previously used a digital tool to help with their tics or any other aspect of their health. Simultaneously, 88.9% (136/153) indicated that they would be very (81/153, 52.9%) or somewhat (55/153, 35.9%) likely to use a digital tool designed for adults with tics. Of those listing concerns (42/158, 26.6%), common reported concerns included the tool being too time-consuming, difficult to use, or generally not meeting accessibility standards. When asked to select the one feature of a digital tool that they believed to be most important, tic monitoring (66/154, 42.9%) and trigger monitoring (54/154, 35.1%) were among the most popular requested features as opposed to other options, such as information gathering, reminders to practice a therapeutic skill or take medicine, social support, or opportunities to share their story. While screen navigation was most preferred, results indicated that a multimodal design overall would support the most users.
Our study participants reported a lack of useful technology for tic management and indicated a need for accessible tools to assist in tic and trigger monitoring in particular. Other concerns included that new tools would be difficult to use or learn due to tics. Findings suggest a cautious excitement for future digital tools in this area.
Journal of medical Internet research
2026-Feb-19
Evidence-based interventions effectively treat sexual dysfunctions, which affect up to 13.5% of women with gynecological conditions; yet access to therapy is limited. Self-guided digital interventions may offer scalable, accessible first-line support.
This randomized controlled mixed methods pilot trial evaluated adherence, acceptance, and safety of the Odeya app and changes in sexual and health outcomes among women with sexual dysfunctions and endometriosis.
Following online and flyer-based recruitment, participants completed an online screening and were randomized to either an intervention group (IG) receiving 8 self-guided app modules targeting biopsychosocial aspects of sexuality or to a control group (CG) receiving routine care. Self-administered online questionnaires were completed at baseline (T0), midintervention (T1), postintervention (T2), and 6-month follow-up (T3). Standardized instruments assessed acceptance (Client Satisfaction Questionnaire-Internet [CSQ-I] and German mHealth App Usability Questionnaire [G-MAUQ]), safety (Inventory for the balanced assessment of Negative Effects of Psychotherapy-Online Intervention), sexual health (Female Sexual Distress Scale-Desire/Arousal/Orgasm [FSDS-DAO], Female Sexual Function Index-German version [FSFI-d], and Partnership Questionnaire), and overall health (Patient-Reported Outcome Measurement Information System-29-Item Profile, Beck Depression Inventory-II, and Generalized Anxiety Disorder-7). Adherence indicators included module completion, dropout rates, and symptom tracker use. Group differences were examined descriptively and using Cohen d. Qualitative data were collected through free-list questionnaires from dropouts (n=11) and interviews with completers (IG: n=3; CG: n=2).
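Group differences in this pilot were examined descriptively and with Cohen d, which standardizes a mean difference by the pooled standard deviation. A minimal sketch with illustrative (non-trial) change scores:

```python
import math

def cohens_d(group1, group2):
    """Cohen d with pooled SD for two independent groups.
    Data passed in below are illustrative, not trial data."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)  # sample variances
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Toy FSDS-DAO change scores (more negative = greater distress reduction)
ig_change = [-12.0, -15.0, -9.0, -14.0]
cg_change = [-5.0, -8.0, -6.0, -7.0]
print(round(cohens_d(ig_change, cg_change), 2))
```

With distress-reduction scores coded negatively, a negative d favors the intervention group, which matches the sign convention of the between-group effects reported below (d=-0.66 at T1, d=-0.79 at T3).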
A total of 60 women (mean age 31.12, SD 6.67 years) with confirmed or suspected endometriosis and sexual distress (FSDS-DAO score >18) were randomized to the IG (n=29) or CG (n=31). IG participants completed on average 61.2% (4.9/8) of modules; the dropout rate was 65.5% (19/29). Emotional strain, time demands, and technical issues were key barriers causing dropout, while persona-based stories facilitated engagement. Participants wished for more professional interaction. IG completers (n=10, 34.5%) showed lower baseline depression and anxiety but higher sexual distress. Satisfaction was high (CSQ-I=26.60; G-MAUQ=5.38). Although some adverse health changes were reported, findings indicate safety. FSDS-DAO scores decreased in both groups, with mean reductions from baseline of -10.39, -12.61, and -14.98 in the IG and -3.68, -14.83, and -6.92 in the CG from T1 to T3, respectively. Moderate to large between-group effects favoring the IG were observed at T1 (d=-0.66) and T3 (d=-0.79). Sexual function (FSFI-d) improved only in the IG (T1-T3: d=0.16-1.00). Qualitative findings highlighted rediscovery of positive sexual experiences, improved communication, and increased openness. Both groups reported improvements in anxiety, depression, and physical functioning, with additional gains in emotion regulation, distress reduction, and body awareness reported in the IG. Women emphasized symptom complexity and a need for more professional guidance.
The self-guided intervention was well accepted and showed preliminary improvements among completers. Adherence and sustained engagement seemed shaped by baseline psychosocial health, pointing to a need for tailored adaptations and larger confirmatory trials.
German Clinical Trials Register DRKS00034351; https://drks.de/search/en/trial/DRKS00034351.
JMIR research protocols
2026-Feb-19
The population of people living with HIV or AIDS in China is aging rapidly. Frailty is a common condition among older adults living with HIV or AIDS and represents a significant cause of poor prognosis, including falls, decreased quality of life, increased mortality, and potentially prolonged hospital stays. Consequently, early frailty screening in this population holds important clinical significance.
This study aims to describe the theoretical basis, research objectives, and implementation plan of a prospective observational study. It will focus on investigating the current status of frailty syndrome in hospitalized older adults living with HIV or AIDS, while simultaneously exploring the development of a clinically applicable risk prediction model.
This study is an ongoing single-center prospective observational study, with a plan to recruit at least 556 hospitalized older adults living with HIV or AIDS (n=445 for development and n=111 for validation). According to the theory of unpleasant symptoms, candidate predictors are categorized into physiological factors (including sociodemographic factors, disease-related influencing factors, sleep, nutrition, and neurocognitive function), psychological factors (including anxiety and depression status), and environmental factors (including social support status). Potential predictors are screened using univariate analysis and least absolute shrinkage and selection operator regression to identify variables for final model inclusion. Model construction and validation employ 3 standard machine learning algorithms: logistic regression, random forest, and support vector machine. Model performance will be evaluated by reporting accuracy, precision, sensitivity, specificity, and the area under the curve.
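Among the performance metrics listed, the area under the ROC curve has a simple rank-based interpretation via the Mann-Whitney U statistic: the probability that a randomly chosen frail patient receives a higher predicted risk than a randomly chosen non-frail patient. A sketch of that equivalence (the protocol does not specify an implementation; this is illustrative only):

```python
def auc(labels, scores):
    """AUC via the Mann-Whitney U statistic: the fraction of
    (positive, negative) pairs where the positive case scores
    higher; ties count as half a win. Illustrative only."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy predicted frailty risks: 1 = frail, 0 = non-frail
y = [1, 0, 1, 0, 1]
p = [0.9, 0.3, 0.6, 0.6, 0.8]
print(auc(y, p))  # 5.5 concordant pair-weights out of 6 pairs
```

This pairwise view is why AUC is insensitive to the choice of classification threshold, unlike accuracy, precision, sensitivity, and specificity, which all depend on where the cutoff is placed.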
This study is conducted at a designated infectious disease hospital in Changsha, Hunan Province, China. Participant recruitment commenced on December 22, 2024, and as of December 5, 2025, a total of 603 patients have been enrolled. The primary study findings are anticipated to be published in August 2026.
The findings of this study are expected to provide clinicians in the department of infectious diseases with a convenient tool for frailty risk prediction, thereby enabling early intervention and ultimately improving the long-term health status and quality of life of people living with HIV.
JMIR research protocols
2026-Feb-19
Diabetic dyslipidemia (DD), characterized by a classical triad of abnormal lipid profiles among the diabetic population, presents a major public health concern in South Africa, particularly among Black South Africans. The increasing prevalence of DD significantly contributes to the development of atherosclerotic cardiovascular disease. With the incidence of diabetes rising from 4.5% in 2010 to 12.7% in 2021, urgent preventive measures and effective treatments are crucial to tackle the risk of premature mortality.
This systematic review and meta-analysis protocol aims to examine the existing literature on DD, providing an understanding of its prevalence and associated predictors among the diabetic population in South Africa, with the intention of informing more effective clinical and public health interventions.
The protocol is registered in PROSPERO (International Prospective Register of Systematic Reviews) and will adhere to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. The available literature on DD will be systematically searched in common scholarly databases and reviewed accordingly. All published and unpublished studies conducted in South Africa prior to 2024 and written in English will be included. Two members (MN and FA) of the review team will independently screen the studies identified through the database search and assess risk of bias using the revised JBI critical appraisal tools. The review will integrate both quantitative and qualitative data synthesis. Results from both qualitative and quantitative data synthesis will be presented through forest plots, subgroup forest plots, and summary tables, which will present findings on pooled prevalence, odds ratios for predictors, heterogeneity statistics, and sensitivity analyses.
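The pooled-prevalence synthesis this protocol plans can be illustrated with a minimal DerSimonian-Laird random-effects calculation; the study counts below are made up for demonstration and do not come from the review.

```python
# Hedged sketch of random-effects pooled prevalence (DerSimonian-Laird)
# using hypothetical (cases, sample size) pairs.
import numpy as np

studies = [(45, 150), (80, 200), (30, 120), (95, 250)]   # fabricated example data
p = np.array([c / n for c, n in studies])
var = np.array([pi * (1 - pi) / n for pi, (_, n) in zip(p, studies)])

w = 1 / var                                   # fixed-effect (inverse-variance) weights
p_fixed = np.sum(w * p) / np.sum(w)
Q = np.sum(w * (p - p_fixed) ** 2)            # Cochran's Q heterogeneity statistic
k = len(studies)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1 / (var + tau2)                       # random-effects weights
p_pooled = np.sum(w_re * p) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
ci = (p_pooled - 1.96 * se, p_pooled + 1.96 * se)
print(round(p_pooled, 3), tuple(round(x, 3) for x in ci))
```

In practice the review would use a dedicated meta-analysis package (and likely a logit or double-arcsine transformation for proportions); this shows only the core arithmetic.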
The protocol was finalized in January 2025. The literature search was conducted between October 2024 and March 2025. Title and abstract screening began in April 2025, and full-text review was completed by July 2025, with data extraction scheduled for completion by September 2025. The completion of statistical analyses is expected by October 2025. We anticipate submission of the completed systematic review and meta-analysis for publication in December 2025.
The findings of the study protocol will inform the design of targeted interventions and policies aimed at advancing the management of DD and subsequently reducing the increased risk of atherosclerotic cardiovascular disease among the diabetic population.
JCO oncology practice
2026-Feb-19
Next-generation sequencing (NGS) is recommended for patients with metastatic prostate cancer (PC). Nationwide, testing rates are low. Whether PC disease characteristics and courses differ between those with and without NGS testing is unknown. We identified predictors of testing, explored likely reasons for lack of testing, and compared survival between those with and without testing.
We retrospectively reviewed patients with metastatic PC initially seen between 2020 and 2022 at Johns Hopkins. Clinical data and reasons for nontesting were abstracted from the electronic medical record. We conducted a logistic regression assessing predictors of NGS testing, adjusting for age, Gleason grade, marital status, and metastatic diagnosis year. We used Cox regression to compare overall survival, defined from the time patients had both a metastatic diagnosis and a visit at our institution until death/last follow-up, between those tested and not tested. We adjusted for age, Gleason grade, initial metastasis (M) stage, comorbidities, and time from metastatic diagnosis to first visit.
Of the 435 patients, 257 (59%) had NGS testing. Older patients were less likely to have testing (adjusted odds ratio [aOR], 0.96 [95% CI, 0.94 to 0.98]). Unmarried patients were less likely to have testing (aOR, 0.62 [95% CI, 0.38 to 1.01]). Patients with Gleason Grade Group 5 were more likely to undergo testing than patients with Groups 1-3 (aOR, 1.86 [95% CI, 1.14 to 3.04]). Among those without testing, 139 (78%) had at least one potential reason for lack of testing in the medical record. The most common reason for nontesting was patient/disease factors (37%).
Older and unmarried men with metastatic PC were less likely to obtain NGS testing, whereas those with high Gleason grade were more likely. Interventions are needed to improve testing rates.
Interactive journal of medical research
2026-Feb-19
Self-rated health (SRH) is a robust predictor of morbidity, functional decline, and mortality in later life. As internet use becomes increasingly embedded in older adults' daily routines, clarifying its association with SRH and the pathways through which it may operate is important for research, practice, and policy.
This scoping review aimed to map and characterize the international evidence on the association between internet use and SRH among older adults, synthesize how potential mediators and moderators have been examined, and identify key methodological, theoretical, and population gaps in the literature.
Guided by the Joanna Briggs Institute methodology and PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews) reporting standards, we conducted a scoping review and searched 5 databases: PubMed, CINAHL, AgeLine, PsycINFO, and Web of Science. The final search was performed on February 5, 2024. Reference lists were screened, and Google Scholar searches were conducted as supplementary search methods.
Database searches identified 4294 records; after removing 615 duplicates, 3679 records were screened, and 77 full texts were assessed, resulting in 27 included studies. All included studies were quantitative, and the evidence base was predominantly cross-sectional (25/27). Explicit theoretical frameworks were used in 6 out of 27 studies. Most studies were published between 2019 and 2024 (22/27) and were conducted most frequently in China (11/27) and the United States (7/27). All studies were conducted in high-income countries. SRH was typically assessed using a single-item measure, while internet use was operationalized as access/use (yes/no), frequency, and/or purpose/domain-specific measures. Most studies reported a statistically significant positive association between internet use and better SRH (24/27), with socially oriented uses (eg, communication and social participation) showing the most consistent associations. Mediating pathways were examined in 6 out of 27 studies, and most often suggested social mechanisms such as greater social support, higher social engagement, and lower loneliness. Subgroup heterogeneity was reported in 10 out of 27 studies, including differences by age, gender, residence, and marital status.
Overall, internet use, particularly socially oriented use, was most consistently associated with better SRH among older adults. Policy efforts should support digital inclusion by improving access, skills, and ongoing assistance that enable meaningful use for social connection and service access. At the same time, nondigital options are essential to avoid excluding older adults who do not use the internet. In addition, evidence gaps, including limited use of theoretical frameworks and scarce data from low- and middle-income countries, underscore the need for theory-informed longitudinal and intervention studies to strengthen causal inference, expand knowledge on mediating and moderating factors, and assess generalizability across diverse contexts.
American journal of health promotion : AJHP
2026-Feb-19
Background: Nutrition Incentive (NI) programs increase fruit and vegetable (FV) purchasing and consumption among Supplemental Nutrition Assistance Program (SNAP) participants by providing financial incentives at the point of sale. Through sustained Farm Bill investment and bipartisan backing, NI programs operate in diverse retail settings. Discussion: Evidence indicates that NI programs generate benefits at multiple levels. At the individual and household level, they are associated with improved diet quality and enhanced food security. At the community and systems level, NI programs contribute to local economic activity by increasing FV sales in grocery and farm-direct settings, supporting farmers, and reinforcing retailer participation in healthy food initiatives. This dynamic "triple-win" benefits consumers, retailers, and producers, and positions NI programs as a strategic mechanism for aligning public health and economic development goals. Conclusions: As a proven and scalable intervention, NI programs represent a cross-sector solution that advances public health, strengthens local food systems, and promotes community resilience. Continued policy support and investment are critical to sustaining and expanding their impact nationwide.
Expert review of anti-infective therapy
2026-Feb-19
Human T-cell lymphotropic virus type 1 (HTLV-1) is an oncogenic retrovirus responsible for adult T-cell leukemia/lymphoma and severe inflammatory diseases. Although at least several million individuals are infected worldwide, surveillance remains limited and the infection is frequently overlooked. Global migration has altered the epidemiology of HTLV-1, increasing the number of carriers in previously low-prevalence regions and creating hidden high-risk clusters. As universal population screening is not cost-effective, targeted testing of groups with elevated risk has become a critical public health priority.
This review summarizes the changing global distribution of HTLV-1 and examines evidence supporting targeted screening strategies. A literature review was conducted focusing on epidemiological data, transmission dynamics, and policy initiatives in both endemic and non-endemic settings. Populations evaluated include migrants from endemic regions, family members of carriers, individuals with sexually transmitted infections, pregnant women, blood and organ donors, and patients with clinical conditions strongly associated with HTLV-1.
Expanding targeted screening offers a practical approach to reduce transmission and enable earlier clinical intervention. Integration of HTLV-1 testing into antenatal care, sexual health services, and specialty clinics is feasible. Broader recognition of HTLV-1 will be essential to reducing the global burden of this neglected infection.
What is HTLV-1 and why does it matter? Human T-cell lymphotropic virus type 1 (HTLV-1) is a virus that can cause serious illnesses, including a type of blood cancer and long-term nerve damage. Although millions of people around the world are infected, HTLV-1 receives far less attention than other viruses such as HIV. Many people who carry the virus do not know they are infected, which means it can be passed to others without being noticed.
Why is screening important? HTLV-1 spreads through breastfeeding, sexual contact, blood transfusion, and organ transplantation. Because most countries do not screen the general population, many infections remain hidden. Recent global migration has brought HTLV-1 from traditionally high-risk regions (such as Japan, the Caribbean, and parts of South America and Africa) into large cities in Europe and North America. As a result, clusters of infection are now found even in places where the virus was once considered rare.
What did this review examine? This article explains how the distribution of HTLV-1 is changing and identifies who is most likely to benefit from testing. These groups include migrants from endemic areas and their families, people with sexually transmitted infections, pregnant women, blood and organ donors, and patients with conditions known to be associated with HTLV-1.
Why does this matter for public health? Targeted screening can help detect infections earlier, prevent mother-to-child and sexual transmission, and provide timely care to reduce the risk of severe complications. Increasing awareness among healthcare providers and integrating HTLV-1 testing into existing health programs could greatly improve prevention and patient outcomes worldwide.
PLoS biology
2026-Feb-19
Bacteria can encode dozens of different immune systems that protect them from infection by mobile genetic elements (MGEs). MGEs themselves may also carry immune systems, such as CRISPR-Cas, to target competitor MGEs. It is unclear when this is favored by natural selection, and whether toxin-antitoxin (TA) systems-common competitive mechanisms carried by plasmids-can alter their efficacy. Here, we develop and test novel theory to analyze the outcome of competition between plasmids when one carries a CRISPR-Cas system that targets the other plasmid. Our mathematical model and experiments using Escherichia coli and competing IncP plasmids reveal that plasmid-borne CRISPR-Cas is beneficial to the plasmid carrying it when the plasmid has not recently transferred to a new host. However, CRISPR-Cas is selected against when the plasmid carrying it transfers horizontally, if a resident competitor plasmid encodes a TA system that elicits post-segregational killing. Consistent with a TA barrier to plasmid-borne CRISPR-Cas, a bioinformatic analysis reveals that naturally occurring CRISPR-Cas-bearing plasmids avoid targeting other plasmids with TA systems across bacterial genera. Our work shows how the benefit of plasmid-borne CRISPR-Cas is severely reduced against TA-encoding competitor plasmids, but only when plasmid-borne CRISPR-Cas is horizontally transferred. These findings have key implications for the distribution of prokaryotic defenses and our understanding of their role in competition between MGEs, and the utility of CRISPR-Cas as a tool to remove plasmids from pathogenic bacteria.
PLoS pathogens
2026-Feb-19
New SARS-CoV-2 variants pose an ongoing threat due to persistent escape from natural and vaccine-induced immunity. The emergence of BA.1 (Omicron) produced a large antigenic shift in the spike protein, rendering many antibodies ineffective with concomitant loss of Emergency Use Authorization (EUA) status. While strains have evolved far from BA.1, the re-emergence of variants from branches closer to BA.1 is of recent concern. Here, we engineered a self-assembling nanoparticle displaying RBD 4mut g5.1, an immunogen developed using structure-guided design to focus antibody responses on the receptor binding site (RBS) epitope and promote cross-reactivity by inclusion of four rationally selected BA.1 mutations in the RBS. Unlike multi-component RBD approaches, we demonstrate that a single, rationally designed component is sufficient for generating broad immunity. We demonstrate that in both naïve and antigen-experienced mice, the RBD 4mut g5.1 nanoparticle induced cross-reactive and durable antibody responses capable of potent neutralization of ancestral SARS-CoV-2 and many Omicron variants. RBD 4mut g5.1 provided heterologous protection at a memory timepoint. By showcasing how subtle changes in an epitope can trigger a diversified antibody response, this study offers a promising new avenue for developing vaccines that can more effectively tackle the ever-evolving threat of immune escape, not only against SARS-CoV-2 but potentially against a range of variable pathogens.
JMIR human factors
2026-Feb-19
Digital health technologies offer new opportunities for cognitive screening and monitoring among older adults. In Thailand, where dementia prevalence is rising, accessible web-based cognitive tools remain limited despite their potential to facilitate early detection and community-based assessment. Understanding usability and validity is critical to ensure successful implementation in real-world contexts.
This study aimed to develop and validate a web-based application, Healthy Brain Test, for cognitive and functional assessments in dementia screening among older Thai adults. Specific objectives were to (1) design user-centered cognitive modules covering key cognitive domains and (2) evaluate correlations between the web-based assessments and conventional clinical tools to determine diagnostic cutoffs for cognitive impairment.
We designed Healthy Brain Test as a self-administered web application suitable for older users and their caregivers. The platform includes digital versions of the Thai Mental State Examination (e-TMSE), a clock drawing test, and a category verbal fluency test, along with electronic versions of the short form of the Informant Questionnaire on Cognitive Decline in the Elderly (IQCODE-16) and cognitive instrumental activities of daily living (IADLs). Participants completed both web-based and paper-based assessments. Correlations between modalities were analyzed, and receiver operating characteristic (ROC) curves were generated to determine sensitivity and specificity. Data were analyzed using SPSS for Windows, version 30.0 (IBM Corp) and MedCalc Statistical Software (MedCalc Software Ltd).
A total of 198 older adults participated (women: 137/198, 69.2%; median age 69.4 years), with 57.1% (113/198) having more than 6 years of education. Of the 198 participants, 44 were diagnosed with major neurocognitive disorder, 58 were diagnosed with mild neurocognitive disorder, and 96 were cognitively normal. The e-TMSE showed strong agreement with the traditional TMSE (r=0.837; P<.001). Category verbal fluency, IQCODE-16, and IADL modules also demonstrated significant correlations (P<.001). The e-TMSE achieved an area under the ROC curve of 0.84 (bootstrapped 95% CI 0.78-0.89); a cutoff ≤23 provided 88.6% sensitivity and 70.1% specificity for identifying major neurocognitive disorder. Participants reported high ease of use and engagement during pilot testing.
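As an illustration of how a screening cutoff such as the reported e-TMSE ≤23 can be derived from an ROC curve, the sketch below applies the Youden index to simulated scores; the score distributions are invented and only the group sizes loosely mirror the cohort, so this does not reproduce the study's results.

```python
# Deriving an ROC-based cutoff (Youden index) from simulated test scores.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(1)
# 44 impaired participants (lower scores) vs 154 unimpaired, as in the cohort
impaired = rng.normal(20, 4, 44)
unimpaired = rng.normal(27, 3, 154)
scores = np.concatenate([impaired, unimpaired])
labels = np.r_[np.ones(44), np.zeros(154)]

# Lower scores indicate impairment, so negate scores for roc_curve
fpr, tpr, thresholds = roc_curve(labels, -scores)
auc = roc_auc_score(labels, -scores)
best = np.argmax(tpr - fpr)            # Youden's J = sensitivity + specificity - 1
cutoff = -thresholds[best]             # classify "impaired" when score <= cutoff
sensitivity, specificity = tpr[best], 1 - fpr[best]
print(round(auc, 2), round(cutoff, 1),
      round(sensitivity, 2), round(specificity, 2))
```

A screening tool may instead pick a cutoff that favors sensitivity over specificity, as the reported 88.6%/70.1% trade-off suggests.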
Healthy Brain Test demonstrated strong validity and usability as a web-based cognitive and functional assessment platform for dementia screening. Its integration of established cognitive measures into a digital interface enables remote, accessible, and user-friendly evaluation for older adults and caregivers. Future research should assess long-term feasibility, user adherence, and integration with clinical workflows to support large-scale screening initiatives.
Work (Reading, Mass.)
2026-Feb-19
Background: The number of shipyard accidents could be reduced by examining how demographic and workplace factors affect accident severity. Objective: The study examines shipyard accidents and the occupational, behavioral, and environmental factors affecting them: to identify minor accidents (or near-misses) that later turned out to be major, to examine the effects of these factors on the possible consequences of accidents, to compare predicted results with actual results, and to investigate possible hidden reasons for the occurrence of accidents. Methods: The study uses an accident causality model and conducts experiments with a multi-factor approach to accident causality in the shipbuilding industry through logistic regression and machine learning, and performs an association rules analysis to further enhance the causality model. Results: Machine learning outputs differed significantly from the apparent descriptive distribution of causes of major accidents. Lack of control and auditing stands out as the most important factor in the occurrence of major accidents; design errors and lack of training are two further important administrative factors. Of major occupational accidents in shipyards, 38.2% could have been prevented or limited to minor injury. In 87% of preventable major accidents, the employee had been involved in one or two previous minor incidents. Conclusion: Administrative deficiencies are prominent in major accidents. The main employer's workers and managers are at higher risk of major accident exposure. The effectiveness of safety training should be increased in line with changing working environments and technological conditions.
Psychology, health & medicine
2026-Feb-19
To explore the efficacy of an online self-help procrastination intervention program (OPSI) in alleviating adult procrastination behaviors, a randomized controlled trial (RCT) design was adopted. Participants were randomly assigned to the intervention group (n = 33) or the waitlist control group (n = 33) and assessed at three time points: baseline (t0), 6 weeks following the intervention (t1), and 12 weeks following the intervention (t2). Intervention adherence, adverse effects, and feasibility were assessed, and procrastination, perceived stress, depression, and anxiety were measured to evaluate intervention effects. Sixty-six Chinese adults with procrastination were recruited, aged 20.25 to 47.06 years (M = 29.72). Participants in the experimental group received a 12-week online intervention. Compared with the control group, procrastination in the intervention group decreased markedly. There was a significant group-by-time interaction for IPS scores (t(194) = 3.42, p < .001), and the intervention group showed a meaningful within-group difference between the post-intervention and baseline assessments (d = -1.43, 95% CI: -1.91; -0.95). There was also a pronounced group-by-time interaction for PSS scores (t(194) = 2.82, p < .01); perceived stress in the intervention group was notably reduced between post-intervention and baseline (d = -1.06, 95% CI: -4.46; -2.33). The results provide empirical support for the use of scalable online self-help interventions as an accessible approach to procrastination management and adult mental health promotion. Trial Registration: ClinicalTrials.gov. Identifier: ChiCTR2200065752 (registered June 2022).
General thoracic and cardiovascular surgery
2026-Feb-19
Primary spontaneous pneumothorax (PSP) is a condition that primarily affects young patients and has a high recurrence rate. While surgery is the treatment option associated with the lowest recurrence rate for PSP, some patients experience long-term chest drain placement due to prolonged air leak. Our study aimed to elucidate the relationship between coagulation abnormalities and prolonged postoperative air leak in PSP.
Patients who underwent surgery for PSP were retrospectively reviewed. Patients were divided into the exploratory and the validation cohorts. From the exploratory cohort, patients with prolonged chest drain placement were identified as the air-leak prolonged (AL-P) group, and the Control group matched at a 1:4 ratio was selected using propensity score matching.
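The 1:4 propensity score matching described above can be sketched as follows: estimate each patient's propensity score by logistic regression, then greedily take the 4 nearest-score unmatched controls for each case. The covariates, case-status model, and sample sizes below are simulated assumptions, not the study's data, and real analyses typically also apply a caliper.

```python
# Illustrative 1:4 propensity score matching on simulated data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 300
X = rng.normal(size=(n, 3))                 # hypothetical covariates (e.g., age, BMI)
# Rare "case" status (AL-P analog) depending on the first covariate
treated = rng.random(n) < 1 / (1 + np.exp(-(X[:, 0] - 2)))

ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
cases = np.flatnonzero(treated)
controls = list(np.flatnonzero(~treated))

matches = {}
for c in cases:
    if len(controls) < 4:
        break                               # ran out of unmatched controls
    # 4 controls with the closest propensity scores, without replacement
    controls.sort(key=lambda j: abs(ps[j] - ps[c]))
    matches[c], controls = controls[:4], controls[4:]

print(len(matches), "cases matched 1:4")
```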
In the exploratory cohort, 15 patients were assigned to the AL-P group and 60 to the control group. Among the coagulation markers examined, including prothrombin time, activated partial thromboplastin time (APTT), and platelet count, univariate analysis revealed a significantly prolonged APTT in the AL-P group (median 33 vs. 31 s, odds ratio 1.26, p = 0.006). Multivariate analysis identified prolonged APTT as an independent risk factor for prolonged chest drain placement. Receiver operating characteristic curve analysis of APTT values for predicting prolonged chest drain placement identified a cutoff of 31.5 s. In the validation cohort, patients with an APTT ≥ 31.5 s showed significantly longer chest drain placement (p = 0.03).
Our study suggests a potential association between prolonged APTT and prolonged postoperative chest drain placement in patients with PSP.
Environmental geochemistry and health
2026-Feb-19
Manganese (Mn) is a common water contaminant in mining areas and one of the most challenging metals to treat. This study assessed Mn removal efficiencies of pilot-scale slag reactors using steelmaking slag mixed with limestone as the reactive material. Three different reactor configurations of baffle-type, weir-type, and vertical flow-type were installed and tested over a 316-day operational period with initial, middle, and final phases. Geochemical modeling indicated that most effluents were saturated with calcite (CaCO₃), rhodochrosite (MnCO₃), and manganite (MnOOH), suggesting that both carbonate and hydroxide precipitation contributed to Mn removal. As effluent pH increased, alkalinity decreased due to the consumption of carbonate ions (CO₃2⁻) during calcite precipitation. The potential formation of Mn carbonates may have contributed to the Mn removal efficiencies. X-ray photoelectron spectroscopy (XPS) analyses of the accumulated precipitates indicated that Mn(III, IV) oxides were the dominant phase. Although temperature, which influences the Mn removal rate, decreased from the middle to the final phase, Mn removal efficiencies increased and effluent Mn concentrations decreased, possibly due to autocatalytic oxidation by the accumulated Mn(III, IV) oxides. These findings highlight the potential of slag reactors as a cost-effective and sustainable solution for treating Mn-contaminated mine drainage, groundwater, and industrial wastewater. Moreover, this approach contributes to reducing CO₂ emissions from lime production processes (e.g., calcination) while promoting the utilization of waste materials.
Journal of bone and mineral metabolism
2026-Feb-19
Bone is a multifunctional organ that provides structural support and hosts the bone marrow, a key site for hematopoiesis and systemic homeostasis. These dual features have long attracted the attention of both bone biologists and hematologists. Each field has pursued the identification of stem-like cells responsible for hard tissue formation and the regulatory microenvironment/niche that supports hematopoietic stem cells (HSCs), which give rise to all blood cell lineages. Converging advances in bone and hematopoietic biology have led to the identification of skeletal stem/progenitor cells (SSPCs), a multifunctional population that gives rise to osteolineage cells and serves as a principal component of the HSC niche. This landmark discovery was largely enabled by Cre/loxP-based genetic mouse models. Among them, the leptin receptor (LepR)-Cre system has become one of the most widely used tools in skeletal stem cell research worldwide.
In this review, we summarize the historical background and recent advances in SSPC research, focusing on LepR+ SSPCs and highlighting their function and lineage plasticity during development, adolescence, aging, and fracture healing. Advanced genetic labeling-based studies and single-cell transcriptomics have unveiled the fate, dynamics, and indispensable roles of LepR⁺ SSPCs under both homeostatic and pathological conditions.
American journal of public health
2026-Feb-19
Objectives. To describe health care‒related educational divides in 2 dimensions-outpatient care utilization and medically preventable deaths-over the past 25 years. Methods. We examined education-based disparities in ambulatory care utilization by analyzing data on 476 277 respondents aged 25 years or older to the 1996-2022 US Medical Expenditure Panel Survey, and in deaths potentially preventable by medical care (defined by International Classification of Diseases, 10th Revision, code) from 26 092 720 death certificates of individuals aged 25 to 74 years in the United States from 2001 to 2023. Results. In 1996, the share of adults with zero provider visits was higher among those without (26.4%; 95% confidence interval [CI] = 25.3, 27.5) than with (20.2%; 95% CI = 18.5, 22.0) a bachelor's degree, a gap that widened to a nearly 2-fold difference by 2022; the gap in the proportion with no doctor visit also widened. Disparities in health care use were larger after adjustment for health factors. Separately, we observed large and growing education-based gaps in age-adjusted health care‒amenable mortality. Conclusions. Education-based disparities in ambulatory health care utilization have grown since 1996, as have medically preventable deaths. Public Health Implications. Improved health care access for less-educated Americans might help address widening disparities in ambulatory health care use and, potentially, health outcomes. (Am J Public Health. Published online ahead of print February 19, 2026:e1-e10. https://doi.org/10.2105/AJPH.2025.308373).
Journal of participatory medicine
2026-Feb-19
The United States faces a persistent maternal mortality crisis, with rates far higher than those in other high-income nations. The mortality rate among Black women is more than 3 times that among White women. Traditional data visualizations, such as bar and line charts, often emphasize aggregate outcomes, masking inequities and failing to reflect patient-level experiences.
This study aimed to address the gaps by taking a systems view and developing a Visualized Combined Experience (VCE) diagram, which is an innovative tool that integrates persona-based storytelling with data visualization to provide a more comprehensive understanding of maternal health outcomes. Specifically, the following research questions were explored: (1) How can the VCE diagram approach be applied to illustrate maternal mortality disparities in the United States? (2) To what extent does this integrated visualization technique reveal connections between individual patient experiences and population-level health outcomes that traditional visualization methods do not? (3) How can the VCE diagram inform a learning health system (LHS)?
This mixed methods study used publicly available quantitative data from the US Centers for Disease Control and Prevention and adapted qualitative data from the ProPublica award-winning investigative series "Lost Mothers" to construct the VCE diagram through a seven-step process involving the following elements: (1) composite persona derived from publicly available narratives, (2) journey map illustrating patient experiences and health system touchpoints, (3) emotive elements of the patient, (4) Sankey diagram of population-level maternal mortality outcomes, (5) "closer look" inset to unmask disparities obscured in aggregate data, (6) evaluation, and (7) data integration.
The VCE diagram revealed critical connections between individual experiences and population-level disparities. When examining mortality rates per 1000 births, Black women had a high rate of 51.2, compared with 16.8 for White women, 14.3 for Hispanic women, and 10.2 for Asian women. The relationship between diagnostic delay and population-level mortality was revealed, with the "closer look" inset demonstrating how disparities can be obscured in aggregate data. The VCE diagram supported a more efficient and empathetic understanding of maternal health outcomes.
The VCE diagram bridges micro-level patient experiences with macro-level population data, holding promise to enhance service evaluation, delivery, and design, and improve health care outcomes. The VCE diagram provides a replicable framework for data visualization that highlights systemic disparities often hidden in aggregate data. Moreover, the availability of structured human experience and service outcome data can provide robust context-specific and situational data to foster a culture of organizational learning and continuous improvement via an LHS. The LHS's knowledge translation loops provide a conduit to improve patient experiences and reduce morbidity and mortality across populations and health systems. Future work will include usability testing across diverse audiences to assess interpretability and refine applications in LHSs.
Science (New York, N.Y.)
2026-Feb-19
Nontoxic approach bolsters plant's own defenses.