Infrequent HDV Testing Raises Concern for Worse Liver Outcomes

Only 1 in 6 US veterans with chronic hepatitis B (CHB) is tested for hepatitis D virus (HDV)—a coinfection associated with significantly higher risks of cirrhosis and hepatic decompensation—according to new findings.

The low testing rate suggests limited awareness of HDV-associated risks in patients with CHB and underscores the need for earlier testing and diagnosis, lead author Robert J. Wong, MD, of Stanford University School of Medicine, Stanford, California, and colleagues reported.

Dr. Robert J. Wong



“Data among US populations are lacking to describe the epidemiology and long-term outcomes of patients with CHB and concurrent HDV infection,” the investigators wrote in Gastro Hep Advances (2025 Oct. doi: 10.1016/j.gastha.2024.10.015).

Prior studies have found that only 6% to 19% of patients with CHB get tested for HDV, and among those tested, the prevalence is relatively low—between 2% and 4.6%. Although relatively uncommon, HDV carries a substantial clinical and economic burden, Dr. Wong and colleagues noted, highlighting the importance of clinical awareness and accurate epidemiologic data.

The present study analyzed data from the Veterans Affairs (VA) Corporate Data Warehouse between 2010 and 2023. Adults with CHB were identified based on laboratory-confirmed markers and ICD-9/10 codes. HDV testing (anti-HDV antibody and HDV RNA) was assessed, and predictors of testing were evaluated using multivariable logistic regression.

To examine liver-related outcomes, patients who tested positive for HDV were propensity score–matched 1:2 with CHB patients who tested negative. Matching accounted for age, sex, race/ethnicity, HBeAg status, antiviral treatment, HCV and HIV coinfection, diabetes, and alcohol use. Patients with cirrhosis or hepatocellular carcinoma (HCC) at baseline were excluded. Incidence of cirrhosis, hepatic decompensation, and HCC was estimated using Nelson-Aalen methods that account for competing risks.
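
For readers unfamiliar with competing-risks estimation, the sketch below shows one way a cumulative incidence curve of this kind can be computed. It is a minimal illustration, not the authors' code; the toy follow-up times, event codes, and function name are hypothetical.

```python
# Illustrative sketch only (not the study's code): cumulative incidence of an event of
# interest (e.g., cirrhosis) while treating another event (e.g., death) as a competing
# risk, in the Nelson-Aalen/Aalen-Johansen style. All data and names are hypothetical.
import pandas as pd

def cumulative_incidence(times, events):
    """Cumulative incidence of event code 1, with code 2 as a competing risk
    and code 0 as censoring."""
    df = pd.DataFrame({"time": times, "event": events}).sort_values("time")
    n_at_risk = len(df)
    surv = 1.0   # all-cause survival just before the current time point
    cif = 0.0    # cumulative incidence of the event of interest
    rows = []
    for t, grp in df.groupby("time"):
        d_interest = int((grp["event"] == 1).sum())  # events of interest at time t
        d_any = int((grp["event"] != 0).sum())       # any event (interest or competing) at t
        cif += surv * d_interest / n_at_risk         # increment uses survival just before t
        surv *= 1.0 - d_any / n_at_risk              # update all-cause survival
        n_at_risk -= len(grp)                        # everyone observed at t leaves the risk set
        rows.append((t, cif))
    return pd.DataFrame(rows, columns=["time", "cumulative_incidence"])

# Toy example: follow-up in years; 0 = censored, 1 = cirrhosis, 2 = death without cirrhosis
print(cumulative_incidence([1.2, 2.5, 3.0, 4.1, 5.0, 6.3], [0, 1, 2, 1, 0, 1]))
```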

Among 27,548 veterans with CHB, only 16.1% underwent HDV testing. Of those tested, 3.25% were HDV positive. Testing rates were higher among patients who were HBeAg positive, on antiviral therapy, or identified as Asian or Pacific Islander.

Conversely, testing was significantly less common among patients with high-risk alcohol use, past or current drug use, cirrhosis at diagnosis, or HCV coinfection. In contrast, HIV coinfection was associated with increased odds of being tested.

Among those tested, HDV positivity was more likely in patients with HCV coinfection, cirrhosis, or a history of drug use. On multivariable analysis, these factors were independent predictors of HDV positivity.

In the matched cohort of 71 HDV-positive patients and 140 HDV-negative controls, the incidence of cirrhosis was more than 3-fold higher in HDV-positive patients (4.39 vs 1.30 per 100,000 person-years; P < .01), and hepatic decompensation was over 5 times more common (2.18 vs 0.41 per 100,000 person-years; P = .01). There was also a nonsignificant trend toward increased HCC risk in the HDV group.

“These findings align with existing studies and confirm that among a predominantly non-Asian US cohort of CHB patients, presence of concurrent HDV is associated with more severe liver disease progression,” the investigators wrote. “These observations, taken together with the low rates of HDV testing overall and particularly among high-risk individuals, emphasizes the need for greater awareness and novel strategies on how to improve HDV testing and diagnosis, particularly given that novel HDV therapies are on the near horizon.”

The study was supported by Gilead. The investigators disclosed additional relationships with Exact Sciences, GSK, Novo Nordisk, and others.

FROM GASTRO HEP ADVANCES

Timely Testing Using Reflex Tools

Hepatitis D virus (HDV) is an RNA “sub-virus” that infects patients with co-existing hepatitis B virus (HBV) infections. HDV infection currently affects approximately 15-20 million people worldwide but is an orphan disease in the United States with fewer than 100,000 individuals infected today.

Dr. Robert G. Gish

Those with HDV have a 70% lifetime risk of hepatocellular carcinoma (HCC), cirrhosis, liver failure, death, or liver transplant. Yet there are currently no Food and Drug Administration (FDA)-approved treatments for HDV in the United States, and only one therapy holds full approval from the European Medicines Agency in the European Union.

Despite the severity of HDV and the limited treatment options, screening remains severely inadequate and is often limited to sequential testing of individuals deemed high risk. HDV screening would benefit from a revamped, reflex-based approach: when a person is diagnosed with HBV and tests positive for hepatitis B surface antigen (HBsAg+), testing automatically reflexes to total anti-HDV antibody and, if that is positive, reflexes again to HDV-RNA quantitation by polymerase chain reaction (PCR). This is especially true in Veterans Administration (VA) hospitals and clinics, where Wong and colleagues found very low rates of HDV testing among a national cohort of US veterans with chronic HBV.
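
As a concrete illustration of the double-reflex cascade described above, the short sketch below encodes the HBsAg, anti-HDV antibody, and HDV RNA sequence. The function name, the order_test callable, and the result strings are hypothetical placeholders, not part of any real laboratory or EHR system.

```python
# Hypothetical sketch of the double-reflex HDV testing cascade described above;
# order_test and its result strings are placeholders, not a real LIS/EHR API.
def hdv_reflex_cascade(hbsag_positive: bool, order_test) -> str:
    """HBsAg+ reflexes to total anti-HDV antibody, which reflexes to HDV RNA PCR."""
    if not hbsag_positive:
        return "HBsAg negative: no HDV testing indicated"
    anti_hdv = order_test("anti-HDV antibody, total")    # first reflex
    if anti_hdv != "positive":
        return "anti-HDV antibody negative: stop"
    hdv_rna = order_test("HDV RNA quantitative PCR")     # second reflex
    return f"HDV RNA: {hdv_rna}"

# Toy usage with a stubbed lab interface
results = {"anti-HDV antibody, total": "positive", "HDV RNA quantitative PCR": "120,000 IU/mL"}
print(hdv_reflex_cascade(True, results.get))  # -> HDV RNA: 120,000 IU/mL
```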

This study highlights the importance of timely HDV testing using reflex tools to improve diagnosis and HDV treatment, reducing long-term risks of liver-related morbidity and mortality.

Robert G. Gish, MD, AGAF, is principal at Robert G Gish Consultants LLC, clinical professor of medicine at Loma Linda University, Loma Linda, Calif., and medical director of the Hepatitis B Foundation. His complete list of disclosures can be found at www.robertgish.com/about.


Don’t Overlook Processed Meat as Colorectal Cancer Risk Factor


Even though older adults are more likely to be diagnosed with colorectal cancer (CRC), there is a concerning rise in diagnoses among younger adults, making it essential for healthcare providers to educate adult patients of all ages about the lifestyle-related risk factors associated with the disease.

Many are familiar with the modifiable risk factors of obesity, smoking, and alcohol consumption, but the impact of processed meat — a common element of the Western diet — often remains underappreciated.

But the data are clear: Processed meat, defined as meat that has been altered through methods such as salting, curing, fermentation, or smoking to enhance flavor or preservation, has been linked to an increased risk for CRC.

The International Agency for Research on Cancer, part of the World Health Organization, analyzed over 800 global studies and classified processed meats as carcinogenic to humans, whereas red meat was deemed “probably” carcinogenic. Their findings were later published in The Lancet Oncology, confirming that the strongest epidemiological evidence linked processed meat consumption to CRC.

“While I routinely counsel my patients about lifestyle and dietary risk factors for CRC, including processed meat, I’m not sure how often this is specifically mentioned by physicians in practice,” Peter S. Liang, MD, MPH, an assistant professor and researcher focused on CRC prevention at NYU Langone Health in New York City, and an AGA spokesperson, told GI & Hepatology News.

Dr. Peter S. Liang



David A. Johnson, MD, chief of gastroenterology at Eastern Virginia Medical School and Old Dominion University, both in Norfolk, Virginia, concurred.

Many healthcare providers may not recognize the CRC risks posed by processed meat well enough to counsel their patients, Johnson said. “In my experience, there is not a widespread awareness.”

 

Understanding the Carcinogenic Risks 

The excess risk for CRC per gram of intake is higher for processed meat than for red meat. However, the threshold for harmful consumption varies among studies, and many group red and processed meat together in their analyses.

For example, a 2020 prospective analysis of UK Biobank data reported that a 70 g/d higher intake of red and processed meat was associated with a 32% and 40% greater risk for CRC and colon cancer, respectively.

More recently, a 2025 prospective study examined the associations between CRC and 97 dietary factors in 542,778 women. Investigators found that, aside from alcohol, red and processed meat were the only other dietary factors positively associated with CRC, with a 30 g/d intake increasing the risk for CRC by 8%.

Although the World Cancer Research Fund (WCRF) and the American Institute for Cancer Research (AICR) recommend limiting red meat consumption to no more than three portions a week, their guidance on processed meat is simpler and more restrictive: Consume very little, if any.

The risk for CRC associated with processed meats is likely due to a naturally occurring element in the meat and carcinogenic compounds that are added or created during its preparation, Johnson said.

Large bodies of evidence support the association between certain compounds in processed meat and cancer, added Ulrike Peters, PhD, MPH, professor and associate director of the Public Health Sciences Division at the Fred Hutchinson Cancer Center in Seattle.

These compounds include:

  • Heterocyclic amines: Prevalent in charred and well-done meat, these chemicals are created from the reaction at high temperatures between creatine/creatinine, amino acids, and sugars.
  • Nitrates/nitrites: Widely used in the curing of meat (eg, sausages, ham, bacon) to give products their pink coloring and savory flavor, these inorganic compounds bind with amines to produce N-nitrosamines, among the most potent genotoxic carcinogens.
  • Polycyclic aromatic hydrocarbons: Generated during high-temperature cooking and smoking, these compounds can induce DNA damage in the colon.
  • Heme iron: This type of iron, abundant in red and processed meats, promotes formation of carcinogenic N-nitroso compounds and oxidative damage to intestinal tissue.

Peters said that the compounds may work synergistically to increase the risk for CRC through various mechanisms, including DNA damage, inflammation, and altered gut microbiota.

While it would be useful to study whether the different meat-processing methods — for example, smoking vs salting — affect CRC risk differently, “practically, this is difficult because there’s so much overlap,” Liang noted.

 

Risk Mitigation

Lifestyle factors likely play a crucial role in the risk for CRC. For example, a study of European migrants to Australia found that those from countries with lower CRC incidence tended to develop a higher risk for CRC the longer they resided in Australia, likely reflecting the change in diet.

Understanding how to mitigate these risk factors is becoming increasingly important with the rates of early-onset CRC projected to double by 2030 in the United States, a trend that is also being observed globally.

“With early-onset CRC, it’s becoming quite clear that there’s no single risk factor that’s driving this increase,” Liang said. “We need to look at the risk factors that we know cause CRC in older adults and see which have become more common over time.”

The consumption of processed meats is one such factor that’s been implicated, particularly for early-onset CRC. The average global consumption of all types of meat per capita has increased significantly over the last 50 years. A 2022 report estimated that global mean processed meat consumption was 17 g/d, with significantly higher rates in high-income regions. This number is expected to rise, with the global processed meat market projected to grow from $318 billion in 2023 to $429 billion by 2029. These trends further underscore the importance of counseling patients to reduce their processed meat intake.

Another strategy for mitigating the risks around processed meat is specifically identifying those patients who may be most vulnerable.

In 2024, Peters and colleagues published findings from their genome-wide gene-environment interaction analysis comparing a large population with CRC and healthy control individuals. The research identified two novel biomarkers that support an association between red and processed meat and increased CRC risk and may explain the higher risk in certain population subgroups. They are working on genetic risk prediction models that will incorporate these genetic markers but must first ensure robust validation through larger studies.

“This approach aligns with precision medicine principles, allowing for more personalized prevention strategies, though we’re not quite there yet in terms of clinical application,” Peters said.

Another knowledge gap that future research efforts could address is how dietary factors influence survival outcomes after a diagnosis of CRC.

“The existing guidelines primarily focus on cancer prevention, with strong evidence linking processed meat consumption to increased CRC risk. However, the impact of dietary choices on survival after CRC diagnosis remains poorly understood,” Peters said. “This distinction between prevention and survival is crucial, as biological mechanisms and optimal dietary interventions may differ significantly between these two contexts.”

Well-designed studies investigating the relationship between dietary patterns and CRC survival outcomes would enable the development of evidence-based nutritional recommendations specifically tailored for CRC survivors, Peters said. In addition, she called for well-designed studies that compare levels of processed meat consumption between cohorts of patients with early-onset CRC and healthy counterparts.

“This would help establish whether there’s a true causal relationship rather than just correlation,” Peters said.

 

Simple Strategies for Dietary Changes

With a 2024 study finding that greater adherence to WCRF/AICR Cancer Prevention Recommendations, including reducing processed meat consumption, was linked to a 14% reduction in CRC risk, physicians should emphasize the benefits of adopting dietary and lifestyle recommendations to patients.

Johnson advised simple strategies to encourage any needed dietary changes.

“Pay attention to what you eat, proportions, and variation of meal menus. Those are good starter points,” he told GI & Hepatology News. “None of these recommendations related to meats should be absolute, but reduction can be the target.”

Liang stressed the importance of repeated, nonjudgmental discussions.

“Research shows that physician recommendation is one of the strongest motivators in preventive health, so even if it doesn’t work the first few times, we have to continue delivering the message that can improve our patients’ health.”

A version of this article appeared on Medscape.com.


Could Statins Prevent Hepatocellular Carcinoma?


Long-term use of statins may delay or deflect the development of hepatocellular carcinoma in adults with chronic liver disease, as well as in the general population, according to emerging research, including several large cohort studies.

The most recent study, published in JAMA Internal Medicine, showed a lower incidence of hepatic decompensation among statin users in a registry for adults aged 40 years or older with baseline chronic liver disease.

“Our findings support the idea that statins may offer benefits beyond lipid-lowering in patients with [chronic liver disease], and clinicians may be more confident in prescribing statins when indicated,” even in these patients, said corresponding coauthor Raymond T. Chung, MD, gastroenterology investigator at Mass General Research Institute, Boston, in an interview.

Dr. Raymond T. Chung



“While prior studies have suggested an association between statin use and reduced hepatocellular carcinoma risk, our study aimed to build on that evidence by using a large, real-world, hospital-based cohort inclusive of all etiologies of chronic liver disease,” Chung told GI & Hepatology News.

Chung, along with Jonggi Choi, MD, of the University of Ulsan College of Medicine, Seoul, South Korea, and colleagues, reviewed data from the Research Patient Data Registry from 2000 to 2023 for 16,501 participants aged 40 years or older with baseline chronic liver disease and baseline Fibrosis-4 (FIB-4) scores ≥ 1.3.
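
For context, the FIB-4 index used for inclusion (≥ 1.3) is calculated from age and routine laboratory values. The helper below is a minimal sketch of the widely used formula, not code from the study, and the example values are invented.

```python
# Minimal sketch of the FIB-4 calculation (not from the study); example values are made up.
import math

def fib4(age_years: float, ast_u_l: float, alt_u_l: float, platelets_10e9_l: float) -> float:
    """FIB-4 = (age x AST) / (platelet count x sqrt(ALT))."""
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

# Example: a 60-year-old with AST 40 U/L, ALT 35 U/L, and platelets 180 x 10^9/L
score = fib4(60, 40, 35, 180)
print(f"FIB-4 = {score:.2f}")  # about 2.25, above the study's 1.3 inclusion threshold
```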

The study population had a mean age of 59.7 years, and 40.9% were women. The researchers divided the population into statin users (n = 3610) and nonusers (n = 12,891). Statin use was defined as a cumulative defined daily dose of 30 or more.

The primary outcome was the cumulative incidence of hepatocellular carcinoma and hepatic decompensation.

At 10 years of follow-up, statin users showed a significantly reduced incidence of hepatocellular carcinoma vs nonusers (3.8% vs 8.0%; P < .001) as well as a significantly reduced incidence of hepatic decompensation (10.6% vs 19.5%; P < .001).

Incorporating FIB-4 scores, a surrogate marker for liver fibrosis, also showed that statin users were less likely to experience fibrosis progression, offering a potential mechanism of action for the observed reduction in adverse liver outcomes, Chung told GI & Hepatology News.

“Similar trends have been observed in prior observational studies, but our findings now support a real effect of statin use on fibrosis progression,” he said. “However, what strengthened our study was that the association remained consistent across multiple subgroups and sensitivity analyses.”

Another study published in Clinical Gastroenterology and Hepatology showed a reduced risk of developing severe liver disease in a Swedish cohort of noncirrhotic adults with chronic liver disease who used statins (n = 3862) compared with control patients with chronic liver disease (matched 1:1) and who did not use statins (hazard ratio [HR], 0.60). 

In that study, Rajani Sharma, MD, and colleagues found a protective association in both prefibrosis and fibrosis stages at diagnosis, and statin use was associated with reduced rates of progression to both cirrhosis and hepatocellular carcinoma (HR, 0.62 and 0.44, respectively).

 

Exciting and Necessary Research

The research by Choi and colleagues is “exciting,” said Bubu Banini, MD, PhD, an assistant professor in digestive diseases at Yale School of Medicine, New Haven, Connecticut, in an interview.

Dr. Bubu Banini

Liver cancer prevalence has risen over the past few decades in the United States and worldwide, and the 5-year overall survival rate of liver cancer is less than 20%, Banini told GI & Hepatology News.

Clinicians often withhold statins out of fear of liver injury in persons with chronic liver disease; however, a takeaway from this study is that for persons with chronic liver disease who have indications for statin use, the medication should not be withheld, she said.

Of course, prospective studies are needed to replicate the results, Banini added.

The study findings were limited by several factors, including the inability to adjust for all potential confounding variables, lack of data on post-index treatments, and the use of wide cumulative defined daily dose categories to ensure statistical power, the researchers noted.

“Moving forward, randomized controlled trials are essential to establish a causal relationship and clarify the molecular and clinical pathways through which statins exert hepatoprotective effects,” Chung added.

Randomized controlled trials are also needed to determine whether statins can actually reduce the risk for hepatocellular carcinoma and hepatic decompensation in patients with chronic liver disease, and cost-effectiveness analyses may be essential for translating this evidence into clinical guidelines, he added.

 

Statins and HCC Risk in the General Population

A large cohort study, published in JAMA Network Open by Mara Sophie Vell, PhD, and colleagues, showed an association between reduced risk for hepatocellular carcinoma and statin use in the general population and in those at increased risk for liver disease.

The study, which included data for individuals aged 37-73 years from the UK Biobank, found a 15% reduced risk for new-onset liver disease and a 28% reduced risk for liver-related death among regular statin users compared with nonusers (HR, 0.85 and 0.72, respectively).

In addition, regular statin users showed a 74% reduced risk (P = .003) of developing hepatocellular carcinoma compared with those not using statins. The researchers identified a particular impact on liver disease risk reduction among men, individuals with diabetes, and patients with high levels of liver scarring at baseline based on the FIB-4 index.

A meta-analysis of 24 studies, previously published in the journal Cancers, showed a significant reduction of 46% in hepatocellular carcinoma risk among statin users compared with nonusers.

The researchers found this risk reduction was significant in subgroups of patients with diabetes, liver cirrhosis, and those on antiviral therapy, and they suggested that the antiangiogenic, immunomodulatory, antiproliferative, and antifibrotic properties of statins may contribute to their potential to reduce tumor growth or hepatocellular carcinoma development.

The meta-analysis authors noted that although most studies have reported a low risk for statin-induced hepatotoxicity, clinicians should proceed with caution in some patients with existing cirrhosis.

“If the patients are diagnosed with decompensated cirrhosis, then statins should be prescribed with caution at low doses,” they wrote.

Advocating statin use solely for chemoprevention may be premature based on observational data, Chung told GI & Hepatology News.

“However, in patients with [chronic liver disease] who already meet indications for statin therapy, the potential added benefit of reducing liver-related complications strengthens the rationale for their use,” he said. Future randomized clinical trials will be key to defining the risk-benefit profile in this context.

The study by Choi and colleagues was supported by the National Institutes of Health.

The study by Sharma and colleagues was supported by the Karolinska Institutet, Stockholm, Sweden, and the Columbia University Irving Medical Center, New York City; researchers were supported by grants from the Swedish Research Council, Center for Innovative Medicine, the Swedish Cancer Society, and the National Institutes of Health.

The study by Vell and colleagues had no outside funding.

The study by Mohaimenul Islam and colleagues was supported by the Ministry of Education and Ministry of Science and Technology, Taiwan.

Chung and Banini had no financial conflicts to disclose.

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

Long-term use of statins may delay or deflect the development of hepatocellular carcinoma in adults with chronic liver disease, as well as in the general population, emerging research, including several large cohort studies, suggested.

The most recent study, published in JAMA Internal Medicine, showed a lower incidence of hepatic decompensation among statin users in a registry for adults aged 40 years or older with baseline chronic liver disease.

“Our findings support the idea that statins may offer benefits beyond lipid-lowering in patients with [chronic liver disease], and clinicians may be more confident in prescribing statins when indicated,” even in these patients, said corresponding Co-author Raymond T. Chung, MD, gastroenterology investigator at Mass General Research Institute, Boston, in an interview. 

Dr. Raymond T. Chung



“While prior studies have suggested an association between statin use and reduced hepatocellular carcinoma risk, our study aimed to build on that evidence by using a large, real-world, hospital-based cohort inclusive of all etiologies of chronic liver disease,” Chung told GI & Hepatology News.

Chung, along with Jonggi Choi, MD, of the University of Ulsan College of Medicine, Seoul, South Korea, and colleagues, reviewed data from the Research Patient Data Registry from 2000 to 2023 for 16,501 participantsaged 40 years or older with baseline chronic liver disease and baseline Fibrosis-4 (FIB-4) scores ≥ 1.3.

The study population had a mean age of 59.7 years, and 40.9% were women. The researchers divided the population into statin users (n = 3610) and nonusers (n = 12,891). Statin use was defined as a cumulative defined daily dose ≥ 30 mg.

The primary outcome was the cumulative incidence of hepatocellular carcinoma and hepatic decompensation.

At 10 years follow-up, statin users showed a significantly reduced incidence of hepatocellular carcinoma vs nonusers (3.8% vs 8.0%; P < .001) as well as a significantly reduced incidence of hepatic decompensation (10.6% vs 19.5%; P < .001).

Incorporating FIB-4 scores, a surrogate marker for liver fibrosis, also showed that statin users were less likely to experience fibrosis progression, offering a potential mechanism of action for the observed reduction in adverse liver outcomes, Chung told GI & Hepatology News.

“Similar trends have been observed in prior observational studies, but our findings now support a real effect of statin use on fibrosis progression,” he said. “However, what strengthened our study was that the association remained consistent across multiple subgroups and sensitivity analyses.”

Another study published in Clinical Gastroenterology and Hepatology showed a reduced risk of developing severe liver disease in a Swedish cohort of noncirrhotic adults with chronic liver disease who used statins (n = 3862) compared with control patients with chronic liver disease (matched 1:1) and who did not use statins (hazard ratio [HR], 0.60). 

In that study, Rajani Sharma, MD, and colleagues found a protective association in both prefibrosis and fibrosis stages at diagnosis, and statin use was associated with reduced rates of progression to both cirrhosis and hepatocellular carcinoma (HR, 0.62 and 0.44, respectively).

 

Exciting and Necessary Research

The research by Choi and colleagues is “exciting,” said Bubu Banini, MD, PhD, an assistant professor in digestive diseases at Yale School of Medicine, New Haven, Connecticut, in an interview.

Dr. Bubu Banini

Liver cancer prevalence has risen over the past few decades in the United States and worldwide, and the 5-year overall survival rate of liver cancer is less than 20%, Banini told GI & Hepatology News.

Clinicians often withhold statins out of fear of liver injury in persons with chronic liver disease; however, a takeaway from this study is that for persons with chronic liver disease who have indications for statin use, the medication should not be withheld, she said.

Of course, prospective studies are needed to replicate the results, Banini added.

The study findings were limited by several factors, including the inability to adjust for all potential confounding variables, lack of data on post-index treatments, and the use of wide cumulative defined daily dose categories to ensure statistical power, the researchers noted.

“Moving forward, randomized controlled trials are essential to establish a causal relationship and clarify the molecular and clinical pathways through which statins exert hepatoprotective effects,” Chung added.

Randomized controlled trials are also needed to determine whether statins can actually reduce the risk for hepatocellular carcinoma and hepatic decompensation in patients with chronic liver disease, and cost-effectiveness analyses may be essential for translating this evidence into clinical guidelines, he added.

 

Statins and HCC Risk in the General Population

A large cohort study, published in JAMA Network Open by Mara Sophie Vell, PhD, and colleagues, showed an association between statin use and reduced risk for hepatocellular carcinoma in the general population and in those at increased risk for liver disease.

The study, which included data for individuals aged 37-73 years from the UK Biobank, found a 15% reduced risk for new-onset liver disease and a 28% reduced risk for liver-related death among regular statin users compared with nonusers (HR, 0.85 and 0.72, respectively).
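
The percentage reductions quoted here follow directly from the reported hazard ratios; a minimal arithmetic sketch of that conversion, using the two HRs above:

```python
def percent_risk_reduction(hazard_ratio):
    """Relative reduction implied by a hazard ratio, as a percentage: (1 - HR) x 100."""
    return (1 - hazard_ratio) * 100

for outcome, hr in [("new-onset liver disease", 0.85), ("liver-related death", 0.72)]:
    print(f"{outcome}: HR {hr} -> {percent_risk_reduction(hr):.0f}% reduced risk")
# new-onset liver disease: HR 0.85 -> 15% reduced risk
# liver-related death: HR 0.72 -> 28% reduced risk
```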

In addition, regular statin users showed a 74% reduced risk (P = .003) of developing hepatocellular carcinoma compared with those not using statins. The researchers identified a particular impact on liver disease risk reduction among men, individuals with diabetes, and patients with high levels of liver scarring at baseline based on the FIB-4 index.

A meta-analysis of 24 studies, previously published in the journal Cancers, showed a significant 46% reduction in hepatocellular carcinoma risk among statin users compared with nonusers.

The researchers found this risk reduction was significant in subgroups of patients with diabetes or liver cirrhosis and in those on antiviral therapy, and they suggested that the antiangiogenic, immunomodulatory, antiproliferative, and antifibrotic properties of statins may contribute to their potential to reduce tumor growth or hepatocellular carcinoma development.

The meta-analysis authors noted that although most studies have reported a low risk for statin-induced hepatotoxicity, clinicians should proceed with caution in some patients with existing cirrhosis.

“If the patients are diagnosed with decompensated cirrhosis, then statins should be prescribed with caution at low doses,” they wrote.

Advocating statin use solely for chemoprevention may be premature based on observational data, Chung told GI & Hepatology News.

“However, in patients with [chronic liver disease] who already meet indications for statin therapy, the potential added benefit of reducing liver-related complications strengthens the rationale for their use,” he said. Future randomized clinical trials will be key to defining the risk-benefit profile in this context.

The study by Choi and colleagues was supported by the National Institutes of Health.

The study by Sharma and colleagues was supported by the Karolinska Institutet, Stockholm, Sweden, and the Columbia University Irving Medical Center, New York City; researchers were supported by grants from the Swedish Research Council, Center for Innovative Medicine, the Swedish Cancer Society, and the National Institutes of Health.

The study by Vell and colleagues had no outside funding.

The study by Mohaimenul Islam and colleagues was supported by the Ministry of Education and Ministry of Science and Technology, Taiwan.

Chung and Banini had no financial conflicts to disclose.

A version of this article appeared on Medscape.com.



Simple Score Predicts Advanced Colorectal Neoplasia in Young Adults

Article Type
Changed
Wed, 04/02/2025 - 12:13

Researchers have developed and internally validated a simple score using clinical factors that can help estimate the likelihood of advanced colorectal neoplasia in adults younger than age 45 years.

While colorectal cancer (CRC) incidence has declined overall due to screening, early-onset CRC is on the rise, particularly in individuals younger than 45 years — an age group not currently recommended for CRC screening.

Studies have shown that the risk for early-onset advanced neoplasia varies based on several factors, including sex, race, family history of CRC, smoking, alcohol consumption, diabetes, hyperlipidemia, obesity, and diet.

A score that incorporates some of these factors to identify which younger adults are at higher risk for advanced neoplasia, a precursor to CRC, could support earlier, more targeted screening interventions.

The simple clinical score can be easily calculated by primary care providers in the office, Carole Macaron, MD, lead author of the study and a gastroenterologist at Cleveland Clinic, told GI & Hepatology News. “Patients with a high-risk score would be referred for colorectal cancer screening.”

The study was published in Digestive Diseases and Sciences.

To develop and validate their risk score, Macaron and colleagues did a retrospective cross-sectional analysis of 9446 individuals aged 18-44 years (mean age, 36.8 years; 61% women) who underwent colonoscopy at their center.

Advanced neoplasia was defined as a tubular adenoma ≥ 10 mm or any adenoma with villous features or high-grade dysplasia, sessile serrated polyp ≥ 10 mm, sessile serrated polyp with dysplasia, traditional serrated adenoma, or invasive adenocarcinoma.

The 346 (3.7%) individuals found to have advanced neoplasia served as the case group, and the remainder with normal colonoscopy or non-advanced neoplasia served as controls.

A multivariate logistic regression model identified three independent risk factors significantly associated with advanced neoplasia: higher body mass index (BMI; P = .0157), former and current tobacco use (P = .0009 and P = .0015, respectively), and having a first-degree relative diagnosed with CRC before age 60 years (P < .0001) or other family history of CRC (P = .0117).

The researchers used these risk factors to develop a risk prediction score to estimate the likelihood of detecting advanced neoplasia, which ranged from a risk of 1.8% for patients with a score of 1 to 22.2% for those with a score of 12. Individuals with a score of ≥ 9 had a 14% or higher risk for advanced neoplasia.

Based on the risk model, the likelihood of detecting advanced neoplasia in an asymptomatic 32-year-old overweight individual with a history of previous tobacco use and a first-degree relative diagnosed with CRC before age 60 would be 20.3%, Macaron and colleagues noted.
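
To illustrate how a points-based score of this kind maps onto a predicted probability, the sketch below sums hypothetical point values and passes the total through a logistic curve. The point assignments, intercept, and slope are invented for illustration, chosen only so the curve's endpoints roughly match the reported 1.8% at a score of 1 and 22.2% at a score of 12; they are not the published model.

```python
import math

# Hypothetical point values -- NOT the published weights
HYPOTHETICAL_POINTS = {
    "overweight_or_obese": 2,
    "former_tobacco_use": 2,
    "current_tobacco_use": 3,
    "first_degree_relative_crc_before_60": 5,
    "other_family_history_crc": 2,
}

def predicted_risk(total_points, intercept=-4.25, slope=0.25):
    """Map a point total to a probability via a logistic curve (illustrative parameters)."""
    return 1 / (1 + math.exp(-(intercept + slope * total_points)))

# Roughly mirrors the worked example: overweight, former smoker, first-degree relative < 60
total = (HYPOTHETICAL_POINTS["overweight_or_obese"]
         + HYPOTHETICAL_POINTS["former_tobacco_use"]
         + HYPOTHETICAL_POINTS["first_degree_relative_crc_before_60"])
print(f"score {total}: predicted risk {predicted_risk(total):.1%}")  # ~11.9% with these toy parameters
```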

The model demonstrated “moderate” discriminatory power in the validation set (C-statistic: 0.645), indicating that it can effectively differentiate between individuals at a higher and lower risk for advanced neoplasia.

Additionally, the authors are exploring ways to improve the discriminatory power of the score, possibly by including additional risk factors.

Given the score is calculated using easily obtainable risk factors for individuals younger than 45 who are at risk for early-onset colorectal neoplasia, it could help guide individualized screening decisions for those in whom screening is not currently offered, Macaron said. It could also serve as a tool for risk communication and shared decision-making.

Integration into electronic health records or online calculators may enhance its accessibility and clinical utility.

The authors noted that this retrospective study was conducted at a single center caring mainly for White non-Hispanic adults, limiting generalizability to the general population and to other races and ethnicities.

 

Validation in Real-World Setting Needed

Dr. Steven H. Itzkowitz

“There are no currently accepted advanced colorectal neoplasia risk scores that are used in general practice,” said Steven H. Itzkowitz, MD, AGAF, professor of medicine, oncological sciences, and medical education, Icahn School of Medicine at Mount Sinai in New York City. “If these lesions can be predicted, it would enable these young individuals to undergo screening colonoscopy, which could detect and remove these lesions, thereby preventing CRC.”

“Many of the known risk factors (family history, high BMI, or smoking) for CRC development at any age are incorporated within this tool, so it should be feasible to collect these data,” said Itzkowitz, who was not involved with the study.

But he cautioned that accurate and adequate family histories are not always taken. Clinicians also may not have considered combining these factors into an actionable risk score.

“If this score can be externally validated in a real-world setting, it could be a useful addition in our efforts to lower CRC rates among young individuals,” Itzkowitz said.

The study did not receive any funding. Macaron and Itzkowitz reported no competing interests.

A version of this article first appeared on Medscape.com.


WATS-3D Biopsy Increases Detection of Barrett’s Esophagus in GERD

Article Type
Changed
Wed, 03/26/2025 - 16:50

In patients with gastroesophageal reflux (GERD) symptoms undergoing screening upper endoscopy, adjunctive use of wide-area transepithelial sampling with 3D computer-assisted analysis (WATS-3D) increases detection of Barrett’s esophagus (BE) and dysplasia, new research showed. 

Dr. Nicholas Shaheen

Compared with forceps biopsies (FB) alone, the addition of WATS-3D led to confirmation of BE in an additional one fifth of patients, roughly doubled dysplasia diagnoses, and influenced clinical management in the majority of patients. 

“The big take-home point here is that the use of WATS-3D brushing along with conventional biopsies increases the likelihood that intestinal metaplasia will be identified,” first author Nicholas Shaheen, MD, MPH, AGAF, with the Center for Esophageal Diseases and Swallowing, University of North Carolina School of Medicine at Chapel Hill, North Carolina, told GI & Hepatology News.

“Almost 20% of patients who harbor BE were only identified by WATS-3D and might have otherwise gone undiagnosed had only forceps biopsies been performed,” Shaheen said. 

The study was published in The American Journal of Gastroenterology.

 

Beyond Traditional Biopsies

BE develops as a complication of chronic GERD and is the chief precursor to esophageal adenocarcinoma. Early detection of BE and dysplasia is crucial to enable timely intervention. 

The current gold standard for BE screening involves upper endoscopy with FB following the Seattle protocol, which consists of four-quadrant biopsies from every 1-2 cm of areas of columnar-lined epithelium (CLE) to confirm the presence of intestinal metaplasia. However, this protocol is prone to sampling errors and high false-negative rates, leading to repeat endoscopy, the study team pointed out. 

WATS-3D (CDx Diagnostics) is a complementary technique designed to improve diagnostic yield by using brush biopsy to sample more tissue than routine biopsies.

WATS-3D has been shown to increase detection of dysplasia in patients with BE undergoing surveillance for BE, but less is known about the value of WATS-3D for BE screening in a community-based cohort of patients with GERD. 

To investigate, Shaheen and colleagues studied 23,933 consecutive patients enrolled in a prospective observational registry assessing the utility of WATS-3D in the screening of symptomatic GERD patients for BE. 

Patients had both WATS-3D and FB in the same endoscopic session. No patient had a history of BE, intestinal metaplasia or dysplasia of the esophageal mucosa, or prior esophageal surgery, endoscopic ablation, or endoscopic mucosal resection before enrollment.

Overall, 6829 patients (29%) met endoscopic criteria for suspected BE (≥ 1 cm of esophageal CLE); BE was confirmed when accompanying biopsies showed intestinal metaplasia.

Of these, 2878 (42%) had intestinal metaplasia identified by either FB or WATS-3D, but 19.3% had their BE diagnosis confirmed solely on the basis of WATS-3D findings. 

Among patients who fulfilled the endoscopic criteria for BE, the adjunctive yield of WATS-3D was 76.5% and the absolute yield was 18.1%.

Of the 240 (1.0%) patients with dysplasia, 107 (45%) were found solely by WATS-3D.

 

‘Clinically Valuable Adjunct’

Among patients with positive WATS-3D but negative FB results, clinical management changed in 90.7% of cases, mostly involving initiation or modification of surveillance and proton pump inhibitor therapy. 

These results suggest that WATS-3D is a “clinically valuable adjunct” to FB for the diagnosis of BE when used as a screening tool in symptomatic GERD patients and particularly in patients with endoscopic evidence of > 1 cm esophageal columnar-lined epithelium, the study team wrote. 

Adjunctive use of WATS-3D when BE is suspected “may save endoscopies and lead to quicker, more accurate diagnoses,” they added. 

The investigators said a limitation of the study is the lack of central pathology review, potentially leading to diagnostic variability. They also noted that over half of the detected dysplasia cases were crypt dysplasia or indefinite for dysplasia, raising concerns about clinical significance. 

Dr. Philip O. Katz



Reached for comment, Philip O. Katz, MD, AGAF, professor of medicine and director of the GI Function Laboratories, Weill Cornell Medicine in New York, said he’s been using WATS for more than a decade as an adjunct to standard biopsy in patients undergoing screening and surveillance for BE and finds it clinically helpful in managing his patients.

This new study provides “further information that WATS added to biopsy that has been traditionally done with the Seattle protocol increases the yield of intestinal metaplasia and likely dysplasia in patients being screened for Barrett’s,” Katz, who wasn’t involved in the study, told GI & Hepatology News.

Funding for the study was provided by CDx Diagnostics. Shaheen and several coauthors disclosed relationships with the company. Katz disclosed relationships (consultant/advisor) for Phathom Pharmaceuticals and Sebella.

A version of this article appeared on Medscape.com.


Intensive Nutrition Therapy Improves Outcomes in Alcohol-Related ACLF

Article Type
Changed
Wed, 03/26/2025 - 10:08

A recent study supports the importance of intensive nutrition therapy in managing patients with alcohol-related acute-on-chronic liver failure (ACLF).

In a randomized controlled trial, compared with standard care, dietitian-supported, intensive nutritional therapy improved survival, reduced frailty, and lowered hospitalization rates in men with alcohol-related ACLF.

The study, performed by a team from the Postgraduate Institute of Medical Education and Research, Chandigarh, India, was published in Clinical Gastroenterology and Hepatology.

ACLF related to alcohol use is associated with poor outcomes due to poor nutritional intake and frailty. Frail patients with ACLF face higher morbidity, mortality, and hospitalization rates than their nonfrail counterparts. However, research on the role of structured nutritional interventions in improving these outcomes is limited.

Patal Giri, MBBS, MD, and colleagues enrolled 70 men with alcohol-related ACLF and frailty (liver frailty index [LFI] > 4.5) in a single-center, open-label study. Half were randomly allocated to an intervention group receiving outpatient intensive nutrition therapy (OINT) plus standard medical treatment (SMT) and half to a control group receiving SMT alone for 3 months.

The intervention group received a monitored high-calorie, high-protein, and salt-restricted diet as prescribed by a dedicated senior liver dietitian. The control group received regular nutritional recommendations and were managed for the ACLF-associated complications, without intervention or guidance by the study team.

After 3 months of follow-up, overall survival (the primary outcome) was significantly improved in the OINT group compared with the control group (91.4% vs 57.1%), “suggesting that the improvement in nutrition status is associated with better survival,” the study team noted. Three patients died in the OINT group vs 15 in the SMT group.

OINT also led to a significant improvement in frailty, with LFI scores decreasing by an average of 0.93 in the intervention group vs 0.33 in the control group; 97% of patients improved from frail to prefrail status in the OINT group, whereas only 20% of patients improved in the SMT group.
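
A short sketch of how LFI values translate into the frailty categories discussed here. The study enrolled patients with LFI > 4.5; the 3.2 boundary between robust and prefrail is a commonly cited cutoff assumed here for illustration, and the baseline value is hypothetical.

```python
def lfi_category(lfi):
    """Classify a Liver Frailty Index value: frail at roughly 4.5 and above (the study
    enrolled patients with LFI > 4.5), prefrail from 3.2 to just under 4.5, robust below
    3.2. The 3.2 cutoff is a commonly cited value assumed here, not stated in the study."""
    if lfi >= 4.5:
        return "frail"
    if lfi >= 3.2:
        return "prefrail"
    return "robust"

# Illustrative trajectory mirroring the reported mean LFI change of -0.93 with nutrition therapy
baseline = 4.9  # hypothetical enrollment value above the > 4.5 threshold
print(lfi_category(baseline), "->", lfi_category(baseline - 0.93))  # frail -> prefrail
```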

The mean change in LFI of 0.93 with OINT is “well above the substantially clinically important difference” (change of 0.8) established in a previous study, the authors noted.

Significant improvements in weight and body mass index were also observed in the OINT group relative to the control group.

Liver disease severity, including model for end-stage liver disease (MELD) scores, showed greater improvement in the OINT group than in the control group (−8.7 vs −6.3 points from baseline to 3 months).

During the follow-up period, fewer patients in the intervention group than in the control group required a hospital stay (17% vs 45.7%).

Limitations of the study include the single-center design and the short follow-up period of 3 months, which limits long-term outcome assessment. Further, the study included only patients meeting Asian Pacific Association for the Study of the Liver criteria for ACLF, which do not capture patients with organ failure as defined by the European Association for the Study of the Liver–Chronic Liver Failure Consortium criteria. Patients with more severe ACLF (MELD score > 30 or AARC score > 10) were also excluded.

Despite these limitations, the authors said their study showed that “dietician-monitored goal-directed nutrition therapy is very important in the management of patients with alcohol-related ACLF along with SMT.”

 

Confirmatory Data 

Reached for comment, Katherine Patton, MEd, RD, a registered dietitian with the Center for Human Nutrition at Cleveland Clinic, Cleveland, Ohio, said it’s well known that the ACLF patient population has a “very high rate of morbidity and mortality and their quality of life tends to be poor due to their frailty. It is also fairly well-known that proper nutrition therapy can improve outcomes, however barriers to adequate nutrition include decreased appetite, nausea, pain, altered taste, and early satiety from ascites.”

“Hepatologists are likely stressing the importance of adequate protein energy intake and doctors may refer patients to an outpatient dietitian, but it is up to the patient to make that appointment and act on the recommendations,” Patton told GI & Hepatology News.

“If a dietitian works in the same clinic as the hepatologist and patients can be referred and seen the same day, this is ideal. During a hospital admission, protein/calorie intake can be more closely monitored and encouraged by a multi-disciplinary team,” Patton explained.

She cautioned that “the average patient is not familiar with how to apply general calorie and protein goals to their everyday eating habits. This study amplifies the role of a dietitian and what consistent education and resources can do to improve a patient’s quality of life and survival.”

This study had no specific funding. The authors have declared no relevant conflicts of interest. Patton had no relevant disclosures.

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

A recent study supports the importance of intensive nutrition therapy in managing patients with alcohol-related acute-on-chronic liver failure (ACLF).

In a randomized controlled trial, compared with standard care, dietitian-supported, intensive nutritional therapy improved survival, reduced frailty, and lowered hospitalization rates in men with alcohol-related ACLF.

The study, performed by a team from the Postgraduate Institute of Medical Education and Research, Chandigarh, India, was published in Clinical Gastroenterology and Hepatology.

ACLF related to alcohol use is associated with poor outcomes due to poor nutritional intake and frailty. Frail patients with ACLF face higher morbidity, mortality, and hospitalization rates than their nonfrail counterparts. However, research on the role of structured nutritional interventions in improving these outcomes is limited.

Patal Giri, MBBS, MD, and colleagues enrolled 70 men with alcohol-related ACLF and frailty (liver frailty index [LFI] > 4.5) in a single-center, open-label study. Half were randomly allocated to an intervention group receiving outpatient intensive nutrition therapy (OINT) plus standard medical treatment (SMT) and half to a control group receiving SMT alone for 3 months.

The intervention group received a monitored high-calorie, high-protein, and salt-restricted diet as prescribed by a dedicated senior liver dietitian. The control group received regular nutritional recommendations and were managed for the ACLF-associated complications, without intervention or guidance by the study team.

After 3 months follow-up, overall survival (the primary outcome) was significantly improved in the OINT group compared with the control group (91.4% vs 57.1%), “suggesting that the improvement in nutrition status is associated with better survival,” the study team noted. Three patients died in the OINT group vs 15 in the SMT group.

OINT also led to a significant improvement in frailty, with LFI scores decreasing by an average of 0.93 in the intervention group vs 0.33 in the control group; 97% of patients improved from frail to prefrail status in the OINT group, whereas only 20% of patients improved in the SMT group.

The mean change in LFI of 0.93 with OINT is “well above the substantially clinically important difference” (change of 0.8) established in a previous study, the authors noted.

Significant improvements in weight and body mass index were also observed in the OINT group relative to the control group.

Liver disease severity, including model for end-stage liver disease (MELD) scores, showed greater improvement in the OINT group than in the control group (−8.7 vs −6.3 points from baseline to 3 months).

During the follow-up period, fewer patients in the intervention group than in the control group required a hospital stay (17% vs 45.7%).

Limitations of the study include the single-center design and the short follow-up period of 3 months, which limits long-term outcome assessment. Further, the study only included patients meeting Asia Pacific Association for Study of Liver criteria for ACLF, which does not include the patients with organ failure as defined by European Association for the Study of the Liver-Chronic Liver Failure Consortium criteria. Patients with ACLF who had more severe disease (MELD score > 30 or AARC > 10) were also not included.

Despite these limitations, the authors said their study showed that “dietician-monitored goal-directed nutrition therapy is very important in the management of patients with alcohol-related ACLF along with SMT.”

 

Confirmatory Data 

Reached for comment, Katherine Patton, MEd, RD, a registered dietitian with the Center for Human Nutrition at Cleveland Clinic, Cleveland, Ohio, said it’s well known that the ACLF patient population has a “very high rate of morbidity and mortality and their quality of life tends to be poor due to their frailty. It is also fairly well-known that proper nutrition therapy can improve outcomes, however barriers to adequate nutrition include decreased appetite, nausea, pain, altered taste, and early satiety from ascites.”

“Hepatologists are likely stressing the importance of adequate protein energy intake and doctors may refer patients to an outpatient dietitian, but it is up to the patient to make that appointment and act on the recommendations,” Patton told GI & Hepatology News.

“If a dietitian works in the same clinic as the hepatologist and patients can be referred and seen the same day, this is ideal. During a hospital admission, protein/calorie intake can be more closely monitored and encouraged by a multi-disciplinary team,” Patton explained.

She cautioned that “the average patient is not familiar with how to apply general calorie and protein goals to their everyday eating habits. This study amplifies the role of a dietitian and what consistent education and resources can do to improve a patient’s quality of life and survival.”

This study had no specific funding. The authors have declared no relevant conflicts of interest. Patton had no relevant disclosures.

A version of this article appeared on Medscape.com.

ACG Issues First Guidance on Diagnosis and Management of Gastric Premalignant Conditions

Article Type
Changed
Mon, 03/24/2025 - 11:21

The American College of Gastroenterology (ACG) has issued its first clinical practice guideline on the diagnosis and management of gastric premalignant conditions (GPMCs) including atrophic gastritis, gastric intestinal metaplasia, dysplasia, and certain gastric epithelial polyps, all of which have an increased risk of progressing to gastric cancer.

The guideline was published online in The American Journal of Gastroenterology.

GPMCs are “common in gastroenterology practices, but in the US, at least, we’ve not had concrete guidance,” first author Douglas Morgan, MD, MPH, AGAF, Division of Gastroenterology, The University of Alabama at Birmingham, noted in an interview.

With these guidelines, Morgan said, he hopes there “will be a paradigm shift to finally establish surveillance in the stomach, much like we’ve been doing for decades in the colon and the esophagus.”

Gastric cancer is common in the United States, with disproportionately higher incidence rates among immigrants from high-incidence countries and in certain non-White populations.

In addition, the 5-year survival rate for gastric cancer in the United States is 36%, which falls short of global standards, largely because only a small percentage of these cancers are diagnosed at an early, curable stage.

These guidelines will help address this marked disparity and the burden on minority and marginalized populations, the guideline authors wrote. “The overarching goals are to reduce [gastric cancer] incidence in the United States, increase the detection of early-stage disease (early gastric cancer), and to significantly increase the 5-year survival rates in the near term.”

 

Key Recommendations

The guideline includes recommendations on endoscopic surveillance for high-risk patients, the performance of high-quality endoscopy and image-enhanced endoscopy (IEE) for diagnosis and surveillance, GPMC histology criteria and reporting, endoscopic treatment of dysplasia, the role of Helicobacter pylori eradication, general risk reduction measures, and the management of autoimmune gastritis and gastric epithelial polyps.

In terms of screening, the guidelines recommend against routine upper endoscopy screening for gastric cancer and GPMC in the general US population (low quality of evidence; conditional recommendation).

They also noted that there is “insufficient” direct US evidence to make a recommendation on opportunistic endoscopy screening for gastric cancer and GPMC in patients deemed at high risk based on race/ethnicity and family history. In addition, they recommend against the use of noninvasive biomarkers for screening or surveillance of GPMC or gastric cancer.

In terms of endoscopic and histologic diagnosis of GPMC, high-quality endoscopic evaluation is crucial to detect premalignant conditions or gastric cancer, the authors said. This includes adequate mucosal cleansing and insufflation, and photodocumentation of anatomic landmarks, as well as use of high-definition white light endoscopy (HDWLE) and IEE.

Systematic gastric biopsies should follow the updated Sydney protocol, with at least two separate containers for antrum/incisura and corpus biopsies. Histology should document the subtype of gastric intestinal metaplasia — incomplete, complete, or mixed — and severity and extent of atrophic gastritis and metaplasia.

Morgan emphasized the importance of coordination between the gastroenterologist and pathologist. “Several of these measures are not routinely reported, so we need to be in conversations with our collaborating pathologists,” he told this news organization.

In terms of GPMC surveillance, the authors suggest surveillance endoscopy every 3 years in high-risk patients with gastric intestinal metaplasia who meet at least one of the following criteria: high-risk histology (incomplete subtype or metaplasia extending into the corpus); a family history of gastric cancer in a first-degree relative; birth in a country with a high incidence of gastric cancer; or high-risk race/ethnicity (East Asian, Latino/a, Black, American Indian, or Alaska Native).

Individuals with multiple risk factors for gastric cancer may be considered for shorter than 3-year intervals.

Low-risk gastric intestinal metaplasia (limited to antrum, mild atrophy, and complete gastric intestinal metaplasia subtype) does not require routine endoscopic surveillance.
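For readers who want to see the surveillance logic above in concrete form, the minimal Python sketch below encodes the intervals described in this summary. The function name, its inputs, and the simple counting rule are illustrative assumptions made for this article; they are not code from, or endorsed by, the ACG guideline, and they are no substitute for clinical judgment.

# Illustrative sketch only: it encodes the surveillance intervals summarized above,
# not an official ACG decision tool, and the inputs are simplified assumptions.

def suggest_gim_surveillance(incomplete_or_corpus_gim: bool,
                             family_history_first_degree: bool,
                             born_in_high_incidence_country: bool,
                             high_risk_race_ethnicity: bool) -> str:
    """Return a rough surveillance suggestion for gastric intestinal metaplasia (GIM)."""
    risk_factors = sum([incomplete_or_corpus_gim,
                        family_history_first_degree,
                        born_in_high_incidence_country,
                        high_risk_race_ethnicity])
    if risk_factors == 0:
        # Low-risk GIM (antrum-limited, mild atrophy, complete subtype)
        return "No routine endoscopic surveillance"
    if risk_factors == 1:
        return "Surveillance endoscopy every 3 years"
    # Multiple risk factors: a shorter interval may be considered
    return "Surveillance endoscopy; consider an interval shorter than 3 years"

print(suggest_gim_surveillance(True, False, False, False))   # every 3 years
print(suggest_gim_surveillance(True, True, False, False))    # consider shorter interval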

In terms of endoscopic management of dysplastic GPMC, endoscopic resection is suggested for dysplasia with visible margins. If dysplasia is not visible, repeat endoscopy with HDWLE and IEE by an experienced endoscopist is advised.

In patients appropriate for endoscopic resection of dysplasia, particularly endoscopic submucosal dissection, referral to a high-volume center with appropriate expertise in the diagnosis and therapeutic resection of gastric neoplasia is strongly recommended.

In patients with confirmed complete resection of dysplasia, endoscopic surveillance is also strongly recommended. Surveillance examinations should be performed by an experienced endoscopist using HDWLE and IEE, with biopsies according to the systematic biopsy protocol in addition to targeted biopsies.

In terms of nonendoscopic GPMC management, testing for H pylori (and eradication treatment if possible) is strongly recommended for patients with GPMC and those with a history of resected early gastric cancer.

Aspirin, nonsteroidal anti-inflammatory drugs, cyclooxygenase 2 inhibitors, or antioxidants are not recommended for patients with GPMC for the purpose of preventing gastric cancer.

In patients with autoimmune gastritis, testing for H pylori with a nonserological test, eradication treatment if positive, and posttreatment testing to confirm eradication is advised.

There is not enough evidence to make a formal recommendation on endoscopic surveillance in those with autoimmune gastritis; surveillance should be individualized, considering the risk for neuroendocrine tumors and possibly gastric cancer.

In terms of gastric epithelial polyps, endoscopic resection of all gastric adenomas is recommended, regardless of size, to exclude or prevent dysplasia and early gastric cancer. Adenomas that are not amenable to endoscopic resection should be referred for surgical resection, if clinically appropriate.

Morgan noted that the ACG GPMC guideline aligns with the updated ACG/American Society for Gastrointestinal Endoscopy upper endoscopy quality indicators released earlier this year.

Implementation of the ACG GPMC guideline and “change in clinical practice will require concrete targets and include training and quality initiatives,” the authors said.

This research received no commercial support. Morgan disclosed research support from Panbela Therapeutics, Thorne, and American Molecular Laboratories.

A version of this article first appeared on Medscape.com.

AI-Enhanced Echocardiography: A Game-Changer for Opportunistic Liver Disease Detection?

Article Type
Changed
Mon, 03/24/2025 - 11:17

New research highlights the promise of artificial intelligence (AI) for opportunistic screening of chronic liver disease (CLD) through routine echocardiography. 

An AI algorithm called EchoNet-Liver demonstrated strong performance for detecting cirrhosis and steatotic liver disease (SLD) from routinely acquired transthoracic echocardiography studies containing subcostal images of the liver, the developers reported in NEJM AI.

“We hope that this algorithm enables physicians to opportunistically screen for chronic liver disease to identify asymptomatic and undiagnosed patients, thus enabling us to treat comorbidities relevant to the patient’s cardiovascular and noncardiovascular health,” Alan C. Kwan, MD, assistant professor, Department of Cardiology, Smidt Heart Institute at Cedars-Sinai, Los Angeles, California, told this news organization.

 

Harnessing Echo to Reveal Liver Trouble 

CLD affects over 1.5 billion people globally, with many cases remaining undiagnosed due to the asymptomatic nature of early disease and a lack of routine screening. Traditional diagnostic methods such as liver function tests, ultrasonography, and MRI are often limited by cost, availability, and patient access.

Echocardiography is a commonly performed imaging study that incidentally captures images of the liver, but these images are not routinely used for liver disease assessment.

EchoNet-Liver is an AI algorithm that can identify high-quality subcostal images from full echocardiography studies and detect the presence of cirrhosis and SLD. 

Kwan and colleagues trained it using nearly 1.6 million echocardiogram videos from 66,922 studies and 24,276 adult patients at Cedars-Sinai Medical Center (CSMC). The model predictions were compared with diagnoses from clinical evaluations of paired abdominal ultrasound or MRI studies. External validation studies were conducted using similar data from Stanford Health Care. 

In the “held-out” CSMC ultrasound dataset, EchoNet-Liver detected cirrhosis with an area under the receiver operating characteristic curve (AUROC) of 0.837 (95% CI, 0.828-0.848) and SLD with an AUROC of 0.799 (95% CI, 0.788-0.811).

The algorithm showed a sensitivity of 69.6% and a specificity of 84.6% for detecting cirrhosis, and a sensitivity of 74.1% and a specificity of 72.0% for detecting SLD. 

In the Stanford Health Care external-validation test ultrasound cohort, the model detected cirrhosis with an AUROC of 0.830 (95% CI, 0.799-0.859) and SLD with an AUROC of 0.769 (95% CI, 0.733-0.813), with sensitivity and specificity of 80.0% and 70.9%, respectively, for cirrhosis and 66.7% and 78.0%, respectively, for SLD. 

In the CSMC MRI-paired cohort, EchoNet-Liver detected cirrhosis with an AUROC of 0.704 (95% CI, 0.699-0.708) and SLD with an AUROC of 0.725 (95% CI, 0.707-0.762).
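For context on the metrics reported above, the short Python sketch below shows how AUROC, sensitivity, and specificity are typically computed from a model’s predicted probabilities and reference labels using scikit-learn. The data are synthetic and the 0.5 operating threshold is an arbitrary assumption; this is not EchoNet-Liver code.

# Minimal sketch of how the reported discrimination metrics are computed from a
# model's predicted probabilities and reference labels; synthetic data, not
# EchoNet-Liver code, and the 0.5 threshold is an arbitrary assumption.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=1000)                    # 1 = cirrhosis on paired imaging
y_score = np.clip(0.4 * y_true + rng.normal(0.3, 0.2, 1000), 0.0, 1.0)  # mock probabilities

auroc = roc_auc_score(y_true, y_score)                    # threshold-independent discrimination
y_pred = (y_score >= 0.5).astype(int)                     # one illustrative operating point
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"AUROC={auroc:.3f}, sensitivity={sensitivity:.3f}, specificity={specificity:.3f}")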

 

Identifying Subclinical Liver Disease to Improve Outcomes

“Across diverse populations and disease definitions, deep-learning-enhanced echocardiography enabled high-throughput, automated detection of CLD, which could enable opportunistic screening for asymptomatic liver disease,” the authors wrote. 

“By improving the diagnosis of subclinical CLD, we may be able to limit or reverse disease progression and improve care by triaging patients toward appropriate clinical and diagnostic management,” they said. 

By way of limitations, the researchers noted that the tool was developed using a cohort of patients who had both abdominal ultrasound and echocardiography within 30 days, and thus probably had a higher prevalence of liver disease compared with the general population receiving echocardiography. The true clinical utility of EchoNet-Liver will depend on whether its application to a general echocardiography population can efficiently detect undiagnosed CLD, they cautioned. 

“While we developed this algorithm based on clinical data, the application within the clinic would typically require FDA approval, which we have not yet applied for,” Kwan told this news organization.

“We plan to prospectively validate this algorithm at multiple sites to ensure that application of this algorithm improves patient care without causing excess diagnostic testing, thus providing value to patients and the healthcare system as a whole,” Kwan said.

Funding was provided in part by KAKENHI (Japan Society for the Promotion of Science). Kwan reported receiving consulting fees from InVision.

A version of this article first appeared on Medscape.com.

Stool Test Detects Sensitivity to Food Additives

Article Type
Changed
Mon, 03/24/2025 - 11:13

Diets in wealthier countries often include processed foods that contain additives, particularly emulsifiers. These additives are increasingly associated with the development of various diseases, including inflammatory bowel disease (IBD).

A research team led by Benoit Chassaing, PhD, research director at the French National Institute of Health and Medical Research (Inserm), focused on one such emulsifier — carboxymethylcellulose (CMC) — which is commonly found in processed baked goods, such as brioche and sandwich bread, and ice cream.

The study, published in the journal Gut, describes how the team developed a new method that uses a simple stool sample to predict an individual’s sensitivity to CMC.

 

Sensitivity Detection

In a previous clinical trial conducted on healthy volunteers, Chassaing and colleagues found that CMC consumption altered the gut microbiota and fecal metabolome in some healthy individuals. In mice, transplanting fecal microbiota from CMC-sensitive animals made other animals susceptible. This has led researchers to investigate the characteristics of sensitive microbiota.

To explore this, the researchers developed an in vitro microbiota model capable of replicating multiple healthy human microbiotas. CMC sensitivity was tested using this model, and the findings were validated in vivo by transplanting microbiota classified as sensitive or resistant into mice. Only mice that received microbiota predicted to be CMC-sensitive developed severe colitis after consuming CMC.

 

Predictive Signature

Next, the team analyzed the stool metagenomes of individuals whose microbiotas had been classified as sensitive or resistant to CMC. They identified a specific microbial signature that could predict whether a given microbiota would react negatively to emulsifiers; using molecular analyses, researchers can apply this signature to determine whether an individual’s microbiota is susceptible or resistant to CMC exposure.
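As an illustration of how such a signature could be operationalized, the brief sketch below trains a simple classifier to separate sensitive from resistant microbiotas using a mock table of taxon abundances. The data, the choice of an L1-penalized logistic regression, and the cross-validation setup are assumptions made purely for demonstration; this is not the published analysis pipeline.

# Purely illustrative: one way a stool metagenomic signature could be turned into a
# sensitive-vs-resistant classifier. The abundance table and labels are mock data,
# so the cross-validated AUROC will hover around 0.5; this is not the published pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_samples, n_taxa = 60, 200
X = rng.gamma(shape=1.0, scale=1.0, size=(n_samples, n_taxa))  # mock taxon abundances
X = X / X.sum(axis=1, keepdims=True)                           # convert to relative abundances
y = np.repeat([0, 1], n_samples // 2)                          # 1 = CMC-sensitive (mock label)

clf = LogisticRegression(penalty="l1", solver="liblinear", max_iter=1000)  # sparse "signature"
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"Mean cross-validated AUROC: {scores.mean():.2f}")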

For the research team, these findings open the door to personalized dietary recommendations: by determining whether an individual is sensitive to a particular emulsifier, clinicians could tailor diets to reduce inflammation in patients with chronic IBD and potentially prevent disease onset in those at risk.

To further validate these insights, the team is launching a cohort study in patients with Crohn’s disease to explore why some individuals are more susceptible to food additives than others.

This story was translated from Univadis France using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.

A version of this article first appeared on Medscape.com.

Better Prep, Better Scope: Task Force Updates Colonoscopy Bowel Prep Advice

Article Type
Changed
Fri, 03/21/2025 - 15:07

The US Multi-Society Task Force on Colorectal Cancer (CRC) has updated its 2014 guidance for optimizing the adequacy of bowel preparation for colonoscopy.

The latest consensus recommendations emphasize the importance of verbal and written patient education, refine diet restrictions, update optimal purgative regimens, and advise tracking bowel prep adequacy rates at both the individual endoscopist and unit levels.

“Colorectal cancer remains the second most common cause of cancer death in the United States, and colonoscopy is considered the gold standard for evaluating the colon, including assessing causes of colon-related signs or symptoms and the detection of precancerous lesions. It is well recognized that the adequacy of bowel preparation is essential for optimal colonoscopy performance,” the task force wrote.

 

Choice of Prep, Dosing and Timing, and Dietary Restrictions 

When choosing bowel preparation regimens, the task force recommends considering the individual’s medical history, medications, and, when available, the adequacy of bowel preparation reported from prior colonoscopies. Other considerations include patient preference, associated additional costs to the patient, and ease in obtaining and consuming any purgatives or adjuncts.

Dr. Brian Jacobson

In terms of timing and dose, the task force now “suggests that lower-volume bowel preparation regimens, such as those that rely on only 2 liters of fluid compared to the traditional 4L, are acceptable options for individuals considered unlikely to have an inadequate bowel preparation. This assumes that the purgative is taken in a split-dose fashion (half the evening prior to colonoscopy and half the morning of the colonoscopy),” co–lead author Brian C. Jacobson, MD, MPH, AGAF, with Massachusetts General Hospital and Harvard Medical School, both in Boston, said in an interview.

The task force also states that a same-day bowel preparation regimen for afternoon, but not morning, colonoscopy is a “reasonable alternative to the now-common split-dose regimen,” Jacobson said.

The group did not find one bowel preparation purgative to be better than others, although Table 7 in the document details the characteristics of commonly used prep regimens, including their side effects and contraindications.

Recommendations regarding dietary modifications depend upon the patient’s risk for inadequate bowel prep. For patients at low risk for inadequate bowel prep, the task force recommends limiting dietary restrictions to the day before a colonoscopy, relying on either clear liquids or low-fiber/low-residue diets for the early and midday meals. Table 5 in the document provides a list of low-residue foods and sample meals.

The task force also suggests the adjunctive use of oral simethicone (≥ 320 mg) to bowel prep as a way to potentially improve visualization, although they acknowledge that further research is needed.

How might these updated consensus recommendations change current clinical practice? 

Jacobson said: “Some physicians may try to identify individuals who will do just as well with a more patient-friendly, easily tolerated bowel preparation regimen, including less stringent dietary restrictions leading up to colonoscopy.” 

He noted that the task force prefers the term “guidance” to “guidelines.”

 

New Quality Benchmark 

The task force recommends documenting bowel prep quality in the endoscopy report after all washing and suctioning have been completed, using reliably understood descriptors that communicate the adequacy of the preparation.

They recommend the term “adequate bowel preparation” be used to indicate that standard screening or surveillance intervals can be assigned based on the findings of the colonoscopy.

Additionally, the task force recommends that endoscopy units and individual endoscopists track and aim for ≥ 90% adequacy rates in bowel preparation — up from the 85% benchmark contained in the prior recommendations.
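As a practical illustration of what such tracking could look like, the brief Python sketch below computes adequacy rates per endoscopist and for the unit as a whole, and flags anyone below the 90% benchmark. The column names and sample records are assumptions for this example and do not reflect any prescribed reporting format.

# Illustrative sketch of tracking bowel prep adequacy; the column names and sample
# records are assumptions for this example, not a prescribed reporting format.
import pandas as pd

records = pd.DataFrame({
    "endoscopist":   ["A", "A", "A", "B", "B", "C", "C", "C", "C"],
    "prep_adequate": [1,   1,   0,   1,   1,   1,   0,   1,   1],  # 1 = adequate prep documented
})

by_doc = records.groupby("endoscopist")["prep_adequate"].mean().mul(100).round(1)
unit_rate = records["prep_adequate"].mean() * 100

print(by_doc)                                              # per-endoscopist adequacy rate (%)
print(f"Unit-level adequacy rate: {unit_rate:.1f}%")
print("Below the 90% benchmark:", list(by_doc[by_doc < 90].index))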

Jacobson told this news organization it’s “currently unknown” how many individual endoscopists and endoscopy units track and meet the 90% benchmark at present.

David Johnson, MD, professor of medicine and chief of gastroenterology at Eastern Virginia Medical School, Norfolk, who wasn’t on the task force, said endoscopy units and providers “need to be accountable and should be tracking this quality metric.”

Johnson noted that bowel prep inadequacy has “intrinsic costs,” impacting lesion detection, CRC incidence, and patient outcomes. Inadequate prep leads to “increased risk for morbidity, mortality, longer appointment and wait times for rescheduling, and negative connotations that may deter patients from returning.”

 

Dr. Brian Sullivan

Brian Sullivan, MD, MHS, assistant professor of medicine, division of gastroenterology, Duke University School of Medicine, Durham, North Carolina, who wasn’t on the task force, said the recommendation to target a 90% or higher bowel preparation adequacy rate is “appreciated.”

“This benchmark encourages practices to standardize measurement, tracking, and reporting of preparation quality at both the individual and unit levels. Specifically, it should motivate providers to critically evaluate their interpretation of preparation quality and ensure adequate cleansing before making determinations,” Sullivan said in an interview.

“At the unit level, this metric can identify whether there are opportunities for quality improvement, such as by implementing evidence-based initiatives (provided in the guidance) to enhance outpatient preparation processes,” Sullivan noted.

The task force emphasized that the majority of consensus recommendations focus on individuals at average risk for inadequate bowel prep. Patients at high risk for inadequate bowel prep (eg, diabetes, constipation, opioid use) should receive tailored instructions, including a more extended dietary prep and high-volume purgatives.

 

‘Timely and Important’ Updates

Sullivan said the updated consensus recommendations on optimizing bowel preparation quality for colonoscopy are both “timely and important.” 

“Clear guidance facilitates dissemination and adoption, promoting flexible yet evidence-based approaches that enhance patient and provider satisfaction while potentially improving CRC prevention outcomes. For instance, surveys reveal that some practices still do not utilize split-dose bowel preparation, which is proven to improve preparation quality, particularly for the right-side of the colon. This gap underscores the need for standardized guidance to ensure high-quality colonoscopy and effective CRC screening,” Sullivan said.

He also noted that the inclusion of lower-volume bowel prep regimens and less intensive dietary modifications for selected patients is a “welcome update.”

“These options can improve patient adherence and satisfaction, which are critical not only for the quality of the index exam but also for ensuring patients return for future screenings, thereby supporting long-term CRC prevention efforts,” Sullivan said.

The task force includes representatives from the American Gastroenterological Association, the American College of Gastroenterology, and the American Society for Gastrointestinal Endoscopy.

The consensus document was published online in the three societies’ respective scientific journals — Gastroenterology, the American Journal of Gastroenterology, and Gastrointestinal Endoscopy.

This research had no financial support. Jacobson is a consultant for Curis and Guardant Health. Sullivan had no disclosures. Johnson is an adviser to ISOThrive and a past president of the American College of Gastroenterology.

A version of this article first appeared on Medscape.com.



