Monday, December 12, 2011

Half of patients relapsed after infliximab cessation

December 8, 2011

BY DENISE NAPOLI Roughly half of Crohn’s disease patients who are in remission while being treated with infliximab will relapse after discontinuing the drug, Dr. Edouard Louis and colleagues reported online in the journal Gastroenterology.
However, following retreatment with infliximab, “almost all” patients were once again in remission 1 month later, “and none experienced a significant acute or delayed infusion reaction, despite a drug holiday longer than 6 months for half of them.”
Dr. Louis of the Centre Hospitalier Universitaire de Liège, Belgium, and his associates studied 115 adult patients with active luminal Crohn’s disease (CD) who had received at least 1 year of therapy with infliximab and an antimetabolite (azathioprine, 6-mercaptopurine, or methotrexate).
All patients had received at least two infliximab infusions during the 6 months prior to study inclusion, and patients’ concurrent antimetabolite doses had been stable for at least 3 months, with corticosteroid-free remission for the last 6 months before inclusion.
“Clinical response was defined by a decrease in the Crohn’s Disease Activity Index (CDAI) of at least 70 points, and 25% from CDAI at relapse,” whereas remission was defined as a CDAI below 150.
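Expressed as code, the study’s two definitions reduce to simple threshold checks. The sketch below is purely illustrative; only the CDAI cutoffs come from the article, and the function names are ours.

```python
# Illustrative sketch of the study's endpoint definitions.
# Only the CDAI thresholds are from the article; names are hypothetical.

def in_remission(cdai: float) -> bool:
    """Remission: CDAI below 150."""
    return cdai < 150

def clinical_response(cdai_at_relapse: float, cdai_now: float) -> bool:
    """Response: CDAI drop of at least 70 points AND at least 25%
    of the CDAI measured at relapse."""
    drop = cdai_at_relapse - cdai_now
    return drop >= 70 and drop >= 0.25 * cdai_at_relapse

# A patient relapsing at CDAI 280 who falls to 140 after retreatment
# meets both the response criteria and the remission cutoff.
print(clinical_response(280, 140))  # True
print(in_remission(140))            # True
```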
Overall, there were 62 reported relapses at the study centers. Five of these relapses were invalidated by the authors, and another five were not retreated with infliximab because of patient or physician decision, leaving 52 relapses which were retreated per study protocol (Gastroenterology 2011 [doi:10.1053/j.gastro.2011.09.034]).
Factors significantly associated with relapse included male gender, with a hazard ratio of 3.7 (95% confidence interval, 1.9-7.4; P less than .001), and a hemoglobin level less than 145 g/L at study inclusion (that is, at the time of infliximab cessation), with a hazard ratio of 6.0 (95% CI, 2.2-16.5; P less than .001).
A leukocyte count greater than 6 × 10⁹/L (hazard ratio, 2.4; 95% CI, 1.2-4.7; P = .01) and a high-sensitivity C-reactive protein level greater than or equal to 5 mg/L (HR, 3.2; 95% CI, 1.6-6.4; P less than .001) were also associated with relapse, as were a lack of previous surgical resection and a fecal calprotectin level greater than or equal to 300 mcg/g.
By 30 days after restarting infliximab, 37 of the 40 patients with complete data available (93%) were in remission once again, and 39 of 40 (98%) had clinical response.
Of the remaining 12 patients, 1 withdrew consent and 11 were assessed at alternate time points: 9 were in remission with clinical response, 1 had clinical response only, and 1 had neither remission nor clinical response.
“No infusion reaction or significant delayed reaction was reported in the retreated patients up to third retreatment, despite a median drug holiday of 6.6 months,” wrote Dr. Louis, adding that no other serious adverse event was reported during the study.
“In clinical practice, stopping infliximab may still be considered for various reasons including cost, fear of long-term side effects, and concerns about pregnancy,” concluded the authors.
“However, simple parameters may be used to identify a subgroup of patients with a low risk of relapse and in whom infliximab withdrawal may be considered,” they wrote, pointing out that “when patients relapsed, retreatment with infliximab was effective and well tolerated in the vast majority.”
Dr. Louis reported receiving consultancy fees as well as research and educational grants from several pharmaceutical companies, as did several other authors. The research was sponsored by the Association François Aupetit and the Société Nationale Française de Gastroentérologie.

HCV infection may predict coronary artery disease

December 9, 2011

BY HEIDI SPLETE NATIONAL HARBOR, MD. (EGMN) – Coronary artery disease was significantly more prevalent in patients with hepatitis C virus infection, compared with control subjects, based on a retrospective review. The findings were presented at the annual meeting of the American College of Gastroenterology.
“An association of coronary artery disease [CAD] with hepatitis C has been suggested, but definitive data are still lacking,” said Dr. Sanjaya Satapathy, who conducted the study while at Long Island Jewish Medical Center in New Hyde Park, N.Y.
To estimate the prevalence of CAD in hepatitis C patients, Dr. Satapathy and his colleagues reviewed data from 934 individuals with hepatitis C infection who were seen at a single center between May 2002 and December 2008. Of these patients, 63 had undergone coronary angiography. The investigators compared their data with data from 63 matched controls without hepatitis C.
Overall severity of CAD according to the combined Reardon severity score was significantly greater in the hepatitis C virus (HCV) group than in the controls (6.3 vs. 2.6, respectively), suggesting that being HCV-positive increases the severity of, or risk for, CAD, Dr. Satapathy said.
The researchers defined CAD in two different ways for their analysis. CAD defined as stenosis greater than 50% was found in 44 of the HCV cases (70%) compared with 30 controls (48%). CAD defined as stenosis greater than 75% was found in 42 patients with hepatitis C (67%) compared with 29 controls (46%).
In addition, the prevalence of multivessel coronary artery disease was significantly higher in the HCV patients compared with the controls (57% vs. 16%, respectively). The prevalence of single-vessel involvement was greater in the control group.
“HCV seropositive status is a strong predictor for CAD,” Dr. Satapathy said. However, “HCV patients are more likely to remain undertreated with antiplatelet and lipid-lowering agents,” he noted.
The study was limited by the retrospective design and small sample size, said Dr. Satapathy. However, the findings suggest that CAD is significantly more common and severe in HCV-positive patients, and this should be considered by clinicians treating these patients, he said.
Dr. Satapathy said he had no financial conflicts to disclose.

Outcomes of transplant for acute liver failure improve steadily

December 12, 2011

BY SUSAN LONDON SAN FRANCISCO (EGMN) – Outcomes after liver transplantation for acute liver failure steadily improved during a recent period spanning about 2 decades, researchers reported at the annual meeting of the American Association for the Study of Liver Diseases.
An analysis of data from nearly 5,000 European patients found that rates of both graft survival and patient survival increased over time, even as the mean age of donors and recipients was rising.
However, rates of graft loss and death remained high during the first year after surgery, driven mainly by infection, rejection, and primary delayed function or nonfunction of the graft.
“A progressive and constant improvement in the survival rate after liver transplantation for acute liver failure has been achieved in the last 20 years despite an increase in donor and recipient age,” commented lead investigator Dr. Giacomo Germani. “High mortality and graft loss still persist, especially within the first year posttransplant.”
Importantly, deaths and graft losses due to social causes – mainly nonadherence to therapy and suicide – were much more common among patients who underwent transplantation because of paracetamol (known as acetaminophen in the United States) toxicity than among other patients. And more than half of these events occurred in the first year, suggesting that early psychiatric and related intervention for this group of recipients could be beneficial, according to Dr. Germani.
For the study, the investigators analyzed data from the European Liver Transplant Registry, which captures data on liver transplants done in 23 countries. The authors identified 4,903 patients older than 16 years who underwent isolated liver transplantation for fulminant or subfulminant acute liver failure between 1988 and 2009.
Temporal trends showed that the annual number of transplantations for acute liver failure increased until approximately 1994 and then remained essentially stable thereafter, reported Dr. Germani, who is a research fellow with the Royal Free Hospital and University College London, and a physician with the Padova (Italy) University Hospital.
On average, recipients were 39 years old and donors were 41 years old, but the age of both groups increased during the study period.
“The most important change [over time] was the increase in the donor age,” Dr. Germani maintained. For example, donors older than 60 years made up merely 2% of all donors in 1988-1993 but 21% in 2004-2009.
There was a shift in the etiology of the acute liver failure leading to transplantation during the study period that most likely reflected more effective diagnosis, he said. The proportion of cases recorded as having an unknown etiology fell from 60% in 1988-1993 to just 33% in 2004-2009. Meanwhile, there were increases in the proportions attributed to paracetamol toxicity, other drug toxicity, and other known causes (for example, traumatic or operative liver injury).
For the entire study period, the cumulative 10-year rates of graft and patient survival were 50% and 63%, respectively, “almost comparable with [those for] transplantation for other etiology,” Dr. Germani observed.
Both outcomes improved significantly during the study period. For example, the cumulative 5-year rate of graft survival improved steadily from about 50% in 1988-1993 to about 70% in 2004-2009 (P less than .001).
Social causes accounted for just 1% of all deaths and graft losses for the entire cohort. But they accounted for nearly 8% of those among patients who underwent transplantation because of paracetamol toxicity.
Temporal trends in the causes of death or graft loss showed increases in the proportions that were due to cardiovascular causes (likely related to increasing recipient age) and to primary delayed function or nonfunction of the graft (likely related to increasing use of livers from donors having less favorable characteristics), according to Dr. Germani.
After determining that mortality and graft loss were most common in the first year after transplantation and particularly in the first 3 months, the investigators identified independent risk factors for these outcomes and incorporated them into predictive models.
There was generally good correspondence between observed and model-predicted mortality and graft loss, according to Dr. Germani.
When applied to older adults, the models identified those at especially high risk for poor outcome. For example, among the cohort of patients older than 50 years of age, the risk of graft loss or death in the first year after transplantation was 44% to 61% if the patient was older than 60 years, was male, or had an incompatible ABO match with the donor, depending on the factor. With all three factors combined, the risk jumped to 80%.
“It is therefore important to reevaluate the selection of older patients with acute liver failure as potential recipients for liver transplantation in order to avoid futile transplants,” Dr. Germani recommended.

Thursday, December 01, 2011

More Nonalcoholic Steatohepatitis Requiring Transplant

Neil Canavan
November 28, 2011 (San Francisco, California) — Nonalcoholic steatohepatitis (NASH) as an indication for liver transplantation rose 5-fold from 2002 to 2009. Although metabolic changes related to NASH risk have increased in the general population as a whole, the criteria for establishing risk for NASH-related liver failure remain unclear, according to data presented here at The Liver Meeting 2011: American Association for the Study of Liver Diseases 62nd Annual Meeting.
"NASH is increasingly an indication for liver transplant," said Danielle Brandman, MD, from the University of California at San Francisco. "Factors for this include the addition of NASH as a diagnosis in the UNOS [United Network for Organ Sharing] database, and increased awareness of NASH as a cause of end-stage liver disease." Up to half of all cases of cryptogenic cirrhosis are likely a result of unrecognized NASH, although Dr. Brandman noted that there are no uniform diagnostic criteria to define cryptogenic cirrhosis caused by NASH.
To identify the NASH-related risk factors driving this increase, the researchers conducted a comparison of pre- and post-MELD score measures.
The findings suggest that steep increases in the incidence of obesity and insulin resistance are the culprits, as opposed to the recorded rates of hypertension and dyslipidemia, which have remained essentially stable since 2002.
In addition to these changes occurring over time in the general population, "we must think about how patients with NASH undergoing liver transplant may be changing over time," said Dr. Brandman. This study is an investigation of changes in the characteristics of liver transplant recipients secondary to NASH over time, as well as patient survival after transplantation for NASH.
The data for this retrospective investigation were drawn from the UNOS database. The inclusion criteria included being 18 years or older and undergoing liver transplantation from 2002 to 2009. Exclusion criteria included retransplantation, HIV positivity, fulminant hepatic failure, and rare liver diseases.
Cases of NASH and "probably NASH" were combined for the analysis. NASH was determined using primary diagnostic code at liver transplantation, and probably NASH was defined as preliver transplant diabetes mellitus, preliver transplant hypertension, and/or a body mass index (BMI) of 40 kg/m² or higher.
After reviewing 30,182 charts, Dr. Brandman's team identified 1355 cases of NASH and 1537 cases of probably NASH. In the probably NASH group, 70% had diabetes, 32% were hypertensive, and 9% had a BMI of 40 kg/m² or higher. Many patients had more than 1 condition, and half of the remaining liver transplant recipients were positive for hepatitis C virus infection.
There were more females in the NASH/probably NASH group than in the no NASH group (43% vs 29%), more patients with a BMI of 40 kg/m² or higher (31.7% vs 27.5%), more white patients, more preliver transplant diabetes (67% vs 19%), and more hypertension (43% vs 16%). Patients in the NASH/probably NASH group had a low prevalence of hepatocellular carcinoma but a high requirement for renal replacement therapy just before transplantation.
Five-year survival rates after liver transplantation in the 2 groups were the same (81.1%).
Matching temporal trends of these measures to risk and outcome has been problematic. "Since 2002, NASH is an increasing indication for liver transplant; it was responsible for just over 4% of transplants in 2002 and more than 12% in 2009," said Dr. Brandman. "At the same time, those identified as having NASH/probably NASH exhibited less preliver transplant diabetes and pretransplant hypertension over time, despite increases in these conditions in the general population."
Dr. Brandman surmised that selection criteria for liver transplantation are likely being applied to these patients. "Additional studies are needed to determine what these criteria are, and which are the strongest predictors of outcome."
There's Something Happening Here
"NASH can definitely kill an individual," said Arun Sanyal, MD, chair of gastroenterology, hepatology, and nutrition at Virginia Commonwealth University in Richmond. Patients with NASH have a 15% to 20% risk of progressing to cirrhosis and end-stage liver disease, and there is increasing evidence that NASH may be connected to the development of hepatocellular carcinoma, even in the absence of cirrhosis. "That has huge public health implications because this cancer has one of the fastest rising incidences in the country."
Dr. Sanyal concurs with Dr. Brandman that the factors driving the increase in NASH are not clear.
"The increasing incidence of obesity and insulin resistance are 2 factors certainly." Other suggested contributors are the consumption of high-fructose corn syrup and environmental exposure to pollution. "There are studies that have linked exposure to various hydrocarbons to the development of fat in the liver — one of the defining characteristics of NASH."
Genetics also play a role. "We know that African Americans have a high incidence of hypertension and diabetes, but seem to be protected from fatty liver disease. In contrast, Hispanics have a high rate of metabolic syndrome and fatty liver disease," Dr. Sanyal said.
What is the clinician to do for the obese or hypertensive patient regarding NASH? "This is an emerging trend, so we're not quite there yet with a general clinical recommendation." There are no set diagnostic criteria for the disease, and other than lifestyle interventions, there is no approved treatment, although vitamin E supplementation has shown benefit. "We published a study last year showing that vitamin E at 800 units/day reverses NASH in roughly 40% of patients [N Engl J Med. 2010;362:1675-1685]," Dr. Sanyal noted.
Dr. Brandman and Dr. Sanyal have disclosed no relevant financial relationships.
The Liver Meeting 2011: American Association for the Study of Liver Diseases (AASLD) 62nd Annual Meeting. Abstract 12. Presented November 8, 2011.

Hepatitis screening offered with routine colonoscopy accepted by 75%

November 23, 2011

BY HEIDI SPLETE NATIONAL HARBOR, MD. (EGMN) – A screening colonoscopy can provide a convenient opportunity to simultaneously test older adults for hepatitis, based on a study of 500 patients, 75% of whom agreed to blood tests for hepatitis A, B, and C.
Adults aged 50-65 years (the “baby boomers”) represent a high-risk population for hepatitis, and hepatitis C in particular, because of possible exposure to high-risk activities in their teens and twenties, said Dr. Dawn Sears of Scott & White Hospital in Temple, Tex. The findings were presented at the annual meeting of the American College of Gastroenterology.
Men make up 70% of chronic hepatitis cases, and they are less likely to see a doctor regularly than women, she noted. “Colorectal cancer screenings are often the only physician encounter for men aged 50 to 60 years,” she said.
To increase hepatitis screening in older adults, Dr. Sears and her colleagues tested whether combining hepatitis testing with routine colonoscopy appointments would be effective.
Patients were mailed information about hepatitis along with their instructions for colonoscopy preparation. On the day of their colonoscopies, patients met with a research nurse, signed a consent form, and completed a patient risk form. Blood was drawn for hepatitis screening when the IV was placed prior to the colonoscopy.
A total of 376 of 500 patients (75%) undergoing colonoscopies agreed to hepatitis testing. The study population was 42% male and 58% female. Risk factors in the patients’ histories included high-risk sexual activity, getting a tattoo prior to the year 2000, injecting or snorting drugs, having a blood transfusion before 1992, having a sexual partner with known hepatitis, being a health care worker who had been stuck with a needle, and spending at least 2 days in jail.
None of the patients had hepatitis B surface antigen, and 77% did not have antibodies against hepatitis A and B. Four patients had results suggesting previously undiagnosed hepatitis C, and all four complied with the recommended follow-up polymerase chain reaction (PCR) testing. One patient had a positive PCR result, and that patient is beginning triple therapy, Dr. Sears said. All patients who were found to have hepatitis C antibodies had risk factors for hepatitis C infection, she noted.
“We should ask about risk factors and consider screening for hepatitis B and C,” Dr. Sears said. “Gastroenterologists see most baby boomers at least once. We understand the [test] results, and this provides the highest quality, most efficient health care for our patients.”

Meta-analysis supports lower colorectal cancer risk with high fiber intake

November 26, 2011

ST. LOUIS (MD Consult) – A high intake of dietary fiber, especially cereal and whole-grain fiber, is associated with a lower risk of colorectal cancer, according to a study published in the November 26, 2011, issue of the British Medical Journal.

A systematic review was performed to identify cohort and case-control studies of the relationship between fiber and whole-grain intake and the incidence of colorectal cancer. Data from 25 prospective studies were pooled for meta-analysis. The lead author was Dagfinn Aune of Imperial College London.

Based on data from 16 studies, the summary relative risk of colorectal cancer was 0.90 for each 10 g/day of total dietary fiber intake. There was no significant reduction in colorectal cancer risk for the same incremental intake of fruit or vegetable fiber, based on 9 studies each, or of legume fiber, based on 4 studies.

However, higher intake of cereal fiber was associated with a significant reduction in colorectal cancer risk (summary relative risk, 0.90 per 10 g/day), based on 8 studies. For an increase of 3 servings per day of whole grains, the summary relative risk of colorectal cancer was 0.83, based on 6 studies.

It has been suggested that increased intake of dietary fiber may reduce the risk of colorectal cancer, although studies of this question have yielded conflicting results. With the addition of recent studies, the available evidence base is now large enough to clarify this association, including the dose-response relationship.

The results show a significant reduction in colorectal cancer incidence for individuals with a high intake of dietary fiber. The protective effect appears particularly strong for intake of cereal fiber and whole grains. The investigators conclude, "Our results indicate a 10% reduction in risk of colorectal cancer for each 10 g/day intake of total dietary fibre and cereal fibre and about a 20% reduction for each three servings (90 g/day) of whole grain daily, and further reductions with higher intake."
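Read literally, the quoted conclusion implies a log-linear dose-response, in which per-increment relative risks compound multiplicatively at higher intakes. A minimal sketch of that arithmetic, under the assumption of multiplicative scaling (the paper’s own dose-response modeling may be more nuanced):

```python
# Extrapolating per-increment summary relative risks, assuming a
# log-linear (multiplicative) dose-response. This scaling is an
# illustrative assumption, not the paper's analysis.

def rr_at_increments(rr_per_unit: float, units: float) -> float:
    """Relative risk after `units` increments of intake."""
    return rr_per_unit ** units

# Total/cereal fibre: RR 0.90 per 10 g/day increment.
print(round(rr_at_increments(0.90, 1), 2))  # 0.9  (~10% lower risk at +10 g/day)
print(round(rr_at_increments(0.90, 2), 2))  # 0.81 (~19% lower at +20 g/day)

# Whole grains: RR 0.83 per 3 servings (90 g) per day.
print(round(rr_at_increments(0.83, 2), 2))  # 0.69 (~31% lower at +6 servings/day)
```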

Endoscopist training program boosts polyp detection rate

November 29, 2011

BY HEIDI SPLETE NATIONAL HARBOR, MD. (EGMN) – In a randomized, controlled trial of 15 endoscopists and 2,400 procedures, polyp detection rates were significantly higher among endoscopists who completed a quality-improvement training program than among those who did not. The findings were presented at the annual meeting of the American College of Gastroenterology.
Adenoma detection rate is a key quality indicator for colonoscopy, and previous studies have shown associations between physicians’ behavior (such as looking behind folds, and the time spent inspecting the colon) and rates of adenoma detection, said Dr. Susan Coe of the Mayo Clinic in Jacksonville, Fla.
However, attempts at improving polyp detection rates, including discussions with low-performing physicians, required withdrawal times, and financial penalties, have proven unsuccessful, she said.
Dr. Coe and her colleagues, including senior investigator Dr. Michael B. Wallace, designed a prospective, randomized educational intervention to determine whether targeted endoscopist training would increase polyp detection rates.
“This is the first study to our knowledge to prospectively show that adenoma detection rate can be significantly improved through an intensive, structured endoscopist training program,” Dr. Coe said.
In the first phase of the study, the endoscopists performed 1,200 colonoscopies to determine their baseline detection rates. The average baseline rate was 36% in both the training and non-training groups.
In the second phase of the study, the endoscopists performed another 1,200 colonoscopies after half of them had completed the training program. Among endoscopists in the training group, the average adenoma detection rate increased significantly to 47%, compared with 35% in the non-training group.
The Endoscopic Quality Improvement Program consisted of two 1-hour small group sessions. The first session included literature, photo, and video examples of polyps, explanations of techniques from high-detecting endoscopists, and information about subtle lesions such as flat and serrated polyps.
The second session consisted of a validated surface pattern recognition exercise. The participants in the training program received monthly feedback on their adenoma detection rates, withdrawal times, and group averages after completing the program.
The baseline characteristics of the endoscopists who underwent training and those who did not were similar overall. Median age in the trained and untrained groups was 45 years and 50 years, respectively.
Dr. Coe noted that the findings were limited by the small number of endoscopists and the single setting, but the study is ongoing to see whether the improvements associated with training persist. Larger studies are also planned, she said.

Excessive vitamin D intake may elevate AF risk

November 30, 2011

BY MITCHEL L. ZOLER ORLANDO (EGMN) – People with an excessive blood level of vitamin D from overdosing with supplements had a 2.5-fold increased incidence of atrial fibrillation, based on a study of 132,000 residents of Utah and southeastern Idaho.
The finding “suggests the need for caution with vitamin D supplementation and the need for careful assessment of serum levels if high doses [of vitamin D] are used,” Megan B. Smith said at the annual scientific sessions of the American Heart Association.
The finding also suggests that patients identified with new-onset atrial fibrillation should be evaluated for a possible extremely high vitamin D level, Ms. Smith said. In the results she reported, however, the blood level of vitamin D linked with a significantly elevated incidence of atrial fibrillation, greater than 100 ng/mL, was extremely unusual, occurring in just 291 of the 132,000 people (0.2%) included in the study.
Although the mechanism linking such an extremely elevated blood level of vitamin D to a markedly increased rate of new-onset atrial fibrillation remains unclear, a likely explanation is the hypercalcemia that vitamin D toxicity can cause. Hypercalcemia can, in turn, reduce cardiac conduction velocity and shorten cardiac refractory time, said Ms. Smith, a dietician at Utah State University in Logan.
“Utah [residents have] tremendous use of supplements. From what we’ve seen in the charts we have, excessive use of vitamin D supplements is the primary driver” of the high levels seen, said Dr. T. Jared Bunch, director of electrophysiology research at the Intermountain Medical Group in Murray, Utah, and lead investigator for the study. “The few patients [with very high vitamin D levels] who I have seen got vitamin D in their milk, from a multivitamin, and from vitamin D pills. They get it from multiple sources,” he said, adding that the low prevalence of levels above 100 ng/mL also showed that this is a difficult level for a person to reach.
“Utah has an enormous problem with vitamin D deficiency, so we had this large group of people” who were members of Intermountain Healthcare and had their vitamin D level measured once as part of their routine care. A survey by Dr. Bunch and his associates showed that, unless asked, people don’t usually tell their physician that they take a vitamin D supplement, and that physicians at Intermountain Healthcare do not usually ask patients about their vitamin D intake.
The measurement numbers documented the extent of the vitamin D deficiency problem, with 38,000 of the 132,000 people measured (29%) having a blood level below 20 ng/mL. This group with vitamin D deficiency showed significantly elevated prevalence rates of diabetes, hypertension, coronary artery disease, heart failure, and depression, compared with people in the designated “normal” vitamin D range of 41-60 ng/mL. Notably, however, the incidence of atrial fibrillation in the deficiency group was not significantly different from the rate in the reference group with a normal vitamin D level at baseline.
“There is something unique” about the excess, toxic level, for atrial fibrillation incidence, Dr. Bunch said in an interview.
To better examine the potential role of vitamin D in elevating atrial fibrillation risk, Dr. Bunch and his associates are now regularly measuring blood vitamin D levels in Intermountain Healthcare members and prospectively tracking their atrial fibrillation incidence.
The results reported by Ms. Smith came from a retrospective analysis of the one-time vitamin D measurement by an immunoassay, with atrial fibrillation incidence tallied over an average 584 days of follow-up based on ECG testing and ICD-9 codes in each person’s medical record. The most common vitamin D level measured was 21-40 ng/mL, found in 73,547 people (56%). Another 17,234 people (13%) had a level of 41-60 ng/mL, which the researchers considered normal and used as the reference group.
During follow-up, the incidence of new-onset atrial fibrillation was about 1.5% in all subgroups based on baseline vitamin D level, except for those with a level above 100 ng/mL, who had an incidence of about 4%. A multivariate analysis that controlled for baseline differences in demographics identified a significantly elevated atrial fibrillation rate only in people with a baseline vitamin D level greater than 100 ng/mL.
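As a rough cross-check of these figures, the crude (unadjusted) rates alone put the excess in the same range as the 2.5-fold adjusted estimate. The sketch below uses the approximate incidences quoted above and is not the study’s multivariate model:

```python
# Back-of-the-envelope crude incidence ratio from the approximate rates
# reported above; the study's 2.5-fold figure is model-adjusted.

af_rate_reference = 0.015  # ~1.5% new-onset AF at normal vitamin D levels
af_rate_toxic = 0.04       # ~4% new-onset AF at levels above 100 ng/mL

crude_ratio = af_rate_toxic / af_rate_reference
print(round(crude_ratio, 1))  # 2.7
```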
Ms. Smith and Dr. Bunch said that they had no disclosures.