April 24, 2011 | Comments Off on 2011 National Kidney Foundation Spring Clinical Meetings
Blog from the 2011 National Kidney Foundation Spring Clinical Meetings
Table of Contents (TOC)
- Hyponatremia: Etiology and Management in the High-risk Patient
- Tunneled Hemodialysis Catheters: What do Nephrologists Need to Know?
- Why not Peritoneal Dialysis?
- Integrating the treatment of secondary hyperparathyroidism in ESRD
- Renal Palliative Care
- Practical Issues in Home Hemodialysis
- Targeting Hemoglobin A1C in ESRD patients
- Low Sodium Acute Kidney Injury
Presenters: Robert Schrier, MD (University of Colorado) and Mitchell Rosner, MD (University of Virginia)
Blogger: Tejas Desai, MD (East Carolina University)
Sponsor: Otsuka Pharmaceuticals
Schrier starts off the session and will discuss the etiology/pathophysiology of hyponatremia in high-risk patients. He defines high-risk patients (in this presentation) as those with heart failure and advanced liver disease. He starts with a basic diagnostic algorithm for hyponatremia, focusing on the changes in total body sodium and total body water to arrive at the following classifications: hypovolemic, euvolemic, and hypervolemic hyponatremia. (For a detailed look at hyponatremia and how to interpret this diagnostic algorithm, click here).
Schrier mentions a quote from E.H. Starling, in which Starling states that the kidney is intelligently retaining sodium and water in heart failure patients (The Fluids of the Body, The Herter Lectures, 1909). Why would patients with low cardiac output (heart failure) retain sodium and water when patients with high-output cardiac failure (thyrotoxicosis, beriberi) also retain sodium and water? The common notion that a low cardiac output drives salt and water retention is an easy but not universal explanation for the development of hyponatremia.
He shows a slide of the distribution of body fluid in various compartments. The key point is that 85% of all body water is on the venous side of the circulation; 15% is on the arterial side. Schrier proposes that the kidney is responding to the fluid balance in the arterial circulation, not the total body fluid. Arterial underfilling depends on cardiac output and systemic vascular tone more so than on total body fluid. In thyrotoxicosis (and beriberi), there is a decrease in systemic vascular tone. Sodium and water retention, then, is a compensatory mechanism to offset the deleterious effects of low cardiac output or decreased vascular tone (or both). This compensatory salt and water retention is mediated by the renin-angiotensin-aldosterone axis, sympathetic nervous system activity, and the non-osmotic release of ADH. Known as hemodynamic congestion, these compensatory mechanisms attempt to normalize cardiac output or vascular tone, but at the expense of pulmonary congestion, cerebral edema, and permanent cardiac remodeling (NEJM 1999, p. 577). Hemodynamic congestion is an ominous sign, and explains why 50% of hypervolemic hyponatremic patients can present with a normal ejection fraction.
Hyponatremia is common in the heart failure population: about 50% of heart failure patients have serum sodium < 135 meq/L (JASN 2005, p. 16). There is a strong positive correlation (r 0.7) between activity of the renin-angiotensin-aldosterone axis and severity of hyponatremia (Circulation 1986, p. 257). Survival in heart failure is directly related to degree of hyponatremia: 20% survival at 20 months in the presence of low sodium (Circulation 1986, p. 257).
Of course, ADH is elevated in patients with heart failure (NEJM 1981, p. 263). Schrier indicates that measures to improve arterial filling will mitigate salt and water retention. This is the pathophysiologic explanation for why ACE-inhibitors can improve cardiac index (KI 1986, p. 1188). Inhibition of ADH activity can also improve arterial filling (KI 1986, p. 1188; KI 1990, p. 818; Am J Physiol Heart Circ Physiol 1994, p. H1713).
ADH has 2 functions: 1) it increases synthesis of AQP-2 channels, and 2) it traffics those channels to the apical membrane of the principal cells in the collecting ducts. Inhibition of ADH decreases both synthesis and transport of the channel (JASN 1990, p. 2165).
Schrier turns his attention to hyponatremia in cirrhosis. Understanding hyponatremia and hemodynamic congestion in heart failure patients is crucial to understanding hyponatremia in cirrhotics (the mechanisms are very similar). In cirrhotics, there are 2 main hypotheses: 1) underfilling hypothesis: portal hypertension leads to ascites, which causes a decrease in plasma volume and secondary renal sodium and water retention, and 2) overflow hypothesis: a primary renal sodium and water retentive state leads to increased plasma volume, which combined with portal hypertension results in ascites. Schrier believes in the underfilling hypothesis (Annals of Internal Medicine 1990, p. 155).
SALT-1 and SALT-2: looked at heart failure, SIADH, and cirrhosis (all three cause hyponatremia) — after 30 days of tolvaptan use, hyponatremia was corrected, but when discontinued, serum sodium levels dropped to pre-treatment levels (NEJM 2006, p. 2099). ADH-receptor inhibitors work equally well in all three conditions.
Schrier reminds us that the main concern around hyponatremia is cerebral edema. The skull allows for only 8% of edema before herniation occurs. Chronic hyponatremia is often thought of as a condition with minimal (or no) symptoms. Schrier disagrees with this notion. Despite the adaptation by the brain to chronic hyponatremia, careful examination would reveal significant neurologic deficits. There may be no such thing as asymptomatic hyponatremia. He shows a nice graph that illustrates the improvement of gait stability as serum sodium rises in patients labeled as chronic, asymptomatic hyponatremia (Am J Med 2006, Vol 119, p. e1-71). Chronic hyponatremic patients have a 67-fold greater odds of falling than normonatremic controls.
The talk now turns to Rosner who will discuss therapy of hyponatremia in cirrhotics and heart failure patients. Hyponatremia is a strong predictor of mortality in cirrhotic patients (it is included in both the MELD and modified-MELD scores). Pre-transplant patients have a 4-6% likelihood of developing ODS in the post-transplant setting, which is very concerning given the scarcity of allografts. Rosner will discuss each of the following treatments: normal saline, hypertonic saline, fluid restriction, demeclocycline, lasix + NaCl, CRRT, and vasopressin receptor antagonism.
Normal saline is ineffective if volume depletion is not present or if there is any increase in ADH. Hypertonic saline is effective for patients who are acutely symptomatic, but runs the risk of ODS (for which cirrhotic patients are at increased risk already). Fluid restriction is cheap, but very slow to work and hard to comply with. Demeclocycline isn’t approved for hyponatremia treatment and has a degree of nephrotoxicity. Lasix + NaCl is difficult to titrate. CRRT works, and is used in some centers just prior to a liver transplantation, but is expensive and not widely available.
Rosner shows a study that compares fluid restriction (< 1 L/day) versus lixivaptan therapy (Gastroenterology 2003, p. 933). The study showed no change in serum sodium with fluid restriction but increasing sodium concentrations with increasing doses of the vaptan.
Loop diuretics + normal saline produce, in effect, a net water loss of about 50% of the urine volume. However, this approach is labor intensive and, as he points out, not always ideal. Rather, Rosner believes ADH plays the primary role in water retention and should be a target of therapy.
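The "50% of urine volume" figure follows from the standard electrolyte-free water clearance formula. A minimal sketch; the formula is standard nephrology, but the example values below are illustrative assumptions, not numbers from the talk:

```python
def electrolyte_free_water_clearance(urine_vol_l, urine_na, urine_k, plasma_na):
    """Electrolyte-free water clearance (liters): the portion of urine volume
    that behaves as 'pure water' with respect to plasma tonicity.
    EFWC = V * (1 - (UNa + UK) / PNa)"""
    return urine_vol_l * (1 - (urine_na + urine_k) / plasma_na)

# Loop-diuretic urine is roughly half-isotonic: when UNa + UK is about half
# the plasma sodium, half the urine volume is effectively free water.
loss = electrolyte_free_water_clearance(
    urine_vol_l=2.0,  # 2 L urine output (illustrative)
    urine_na=45,      # meq/L (illustrative)
    urine_k=20,       # meq/L (illustrative)
    plasma_na=130,    # meq/L, a hyponatremic patient (illustrative)
)
print(loss)  # 1.0 L of net free water loss, i.e., ~50% of urine volume
```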
A study of satavaptan showed a rise in serum sodium concentration (by about 5 meq/L) within 24-48 hours, with stability maintained as long as the drug is continued (Hepatology 2008, p. 204).
Vaptan therapy, in conjunction with spironolactone (100 mg), can decrease the number of large volume paracenteses (LVP) needed, regardless of the degree of hyponatremia (J Hepatology 2010, p. 283). Among 151 patients randomized to spironolactone versus spironolactone + vaptan therapy, there was a significant decrease in the number of LVPs needed to manage recurrent ascites.
Rosner turns his attention to treatment of hyponatremia in heart failure. He emphasizes the neurohormonal abnormalities in heart failure patients and that therapy must manage all the neurohormonal imbalances. He lists the 4 common treatments: fluid restriction, diuretic therapy, ultrafiltration, and ADH-receptor antagonism.
Fluid restriction hasn’t been studied very well in heart failure patients with hyponatremia. Even with significant restriction (< 1 L per day), there is only 20% patient compliance and no difference in number of diuretics or symptom improvement (J Card Fail 2007, p. 128).
Diuretic therapy can ease congestive symptoms and can improve arterial underfilling if combined with afterload reduction. However, diuretic use is associated with increased mortality and worsens the neurohormonal imbalance (Eur Heart J 2006, p. 1431). Ultrafiltration is safe and effective, but it is expensive and not routinely available (Curr Opin Crit Care 2008, p. 524).
As a result, Rosner believes a strong focus on vaptan therapy is necessary for treatment of hyponatremia in heart failure. Vaptans will increase serum sodium in heart failure patients (Am J Nephrol 2007, p. 447). He shows additional slides from other studies showing improvements in serum sodium with vaptan therapy (Am J Physiol Heart Circ Physiol 1998, p. H176; Am J Physiol Renal Physiol 2006, p. F273; Cleve Clin J Med 2006, p. S24). Rosner admits that mortality in heart failure patients using vaptan therapy does not decrease at 10 months. There is also no difference in heart failure hospitalizations (JAMA 2007, p. 1319). The BALANCE study will look at vaptan effects on all-cause mortality and hospitalization rates (study has concluded, results not released; JACC 2008, p. 1540).
Tunneled Hemodialysis Catheters: What do Nephrologists Need to Know?
Presenters: Tushar Vaccharajani, MD (Wake Forest University); Micah Chan (University of Wisconsin-Madison); Arif Asif, MD (University of Miami); Loay Salman, MD (University of Miami)
Blogger: Tejas Desai, MD (East Carolina University)
Salman starts the session with strategies to prevent catheter-related bloodstream infections (CRBSI). CRBSI occurs in 35% of patients at 3 months and 50% at 6 months. Overall mortality at 12 weeks is 34% in catheterized patients. Staph. aureus contributes the greatest mortality of all isolates. Keep in mind that non-tunneled catheters have about a 2x greater likelihood of infection compared to tunneled catheters (about 29% versus 12% in one study). In that same study, however, the likelihood of bacteremia from an AVF was about 0.2%. Staph. aureus is a terribly virulent isolate with features that allow it to adhere to many different surfaces.
The primary risk factor for CRBSI is the duration of catheter placement. Other risk factors include location (neck better than groin), side (right-sided catheters are less likely to be infected than left), and albumin level (< 3.5 g/dl confers greater infectious risk). One of the key protective factors is skin sterilization. Chlorhexidine confers less risk of bacteremia than iodophors (e.g., povidone-iodine). Alcohol can also be used but is not common in the US because of its flammability and its ability to erode certain catheter materials.
Salman then discusses Dr. Gerard Bethard’s outpatient dialysis checklist which has significantly decreased the rate of CRBSI (from 6.97 infections/1000 catheter-days to 1.28 infections/1000 catheter-days; and has been maintained at this level). The main focus of the checklist is to use sterile procedures when handling the hubs of the catheter.
Salman turns his attention to antibiotic locks, impregnated catheters, and topical antibiotic use. First, heparin-coated catheters: there was no difference in the primary endpoint, which was improved patency, but a small subgroup analysis showed that heparin-coated catheters decreased CRBSI. Subsequently, another study was performed with CRBSI as the primary outcome: heparin conferred no benefit. Silver-impregnated catheters show no statistical difference. Salman continues with other impregnated catheters and the results are the same: neither bismuth-coated, chlorhexidine-coated, sulfadiazine-coated, nor minocycline-rifampin-coated catheters confer benefit against CRBSI.
Topical antibiotics (either applied at the exit-site or administered intraluminally) can confer benefits. Salman shows a meta-analysis indicating that all of the antibiotic combinations were favorable. However, the development of resistant organisms is the #1 reason why such a strategy is not common practice. Routine use of topical antibiotics will increase resistant strains; even elimination of nasal carriage of Staph. aureus has been shown to breed resistant organisms. Antimicrobial lock solutions confer benefit, but are limited by systemic complications. For example, citrate and/or gentamicin locks have caused systemic complications (symptomatic hypocalcemia, ototoxicity).
Vaccharajani will talk about the impact of CRBSI, clinical scenarios associated with CRBSI, and treatment options. He starts with exit-site infections, which are becoming increasingly uncommon given the use of sterile techniques (see Bethard’s checklist above). Exit-sites should be inspected at each dialysis session, chlorhexidine should be applied, and simple non-occlusive dressings used (rather than transparent polyurethane dressings, which are more costly). Empiric antibiotics covering Staph. aureus should be administered at the slightest sign of infection. Ultimately, the catheter must be removed to prevent a bloodstream infection, with a total of 2 weeks of antibiotic therapy.
CRBSI is the principal reason for catheter removal. Infections related to catheters cost anywhere between $1000-3000 per patient per episode (compare that to AVF infections, which cost $500-800 per patient per episode). Once a CRBSI occurs, the ultimate goal is to treat the infection and salvage the catheter. Catheters can be salvaged with systemic antibiotics, antibiotic locks, and exchanges over a guidewire. Systemic antibiotic therapy alone leads to a recurrence rate of 75% and thus is not an ideal salvage strategy. In a prospective trial of 103 patients with 200 CRBSI episodes, the investigators used vancomycin and meropenem for 6 weeks (without antibiotic locks) and salvaged 2/3 of catheters. Recurrence occurred in 1/3 of patients, requiring complete removal of the catheter, as far out as 6 months post-treatment. Antibiotic lock solutions have shown promise in in-vitro studies. Theoretically, locks should help salvage catheters, but studies have shown a wide range of success rates. For example, a study using vancomycin + ceftazidime locks, in conjunction with systemic antibiotics, showed an overall salvage rate of 70%, with more catheters salvaged in gram-negative infections than gram-positive ones. At Vaccharajani’s center, high-risk patients (diabetics, last catheter, immunocompromised) who were given antibiotic locks saw infections drop from 9 to 1 episode per 1000 catheter-days. Vaccharajani believes that lock solutions have a role in high-risk patients, but one must keep in mind the systemic toxicity and antibiotic resistance that will develop. Overall, the salvage rate with lock solutions is only 2/3, mostly in gram-negative infections (not the more common gram-positive ones).
Catheter exchanges performed within 48-72 hours versus a catheter-free trial (with replacement 3-10 days later) showed no difference in infection recurrence rate. The advantages of early catheter exchange are that the access site is preserved and fewer dialysis sessions are missed (including hospitalized days). Should a catheter be placed only after negative cultures? K/DOQI guidelines do not recommend waiting for negative blood cultures. Catheters can be placed once the patient has clinically improved, regardless of the repeat culture results.
Chan now focuses on non-infectious complications of catheters. He spends considerable time repeating the studies that Salman and Vaccharajani discussed. The only new information he presents concerns the development of clots and fibrin sheaths. Early catheter dysfunction (within 24-48 hours after insertion) is generally due to mechanical issues, such as kinking or clot formation. Late catheter dysfunction is generally due to large clots or fibrin sheaths. Fibrin sheaths must be removed and the catheter replaced; tPA will not disrupt the sheath enough to allow adequate catheter function. Sheaths form because of Virchow’s triad of clot formation. There are no good data showing that oral anticoagulation or heparin or citrate catheter locks decrease the risk of clot formation. Thrombolytic locks have shown benefit, and a recent randomized trial confirmed that thrombolytic locks decrease clot formation.
Why not Peritoneal Dialysis?
Presenters: Ramesh Khanna, MD (University of Missouri – Columbia); Beth Piraino, MD (University of Pittsburgh); Thomas Golper, MD (Vanderbilt University)
Blogger: Tejas Desai, MD (East Carolina University)
Khanna starts the session with reasons for initiating peritoneal dialysis (PD) first. He cites a survey published in 2008 in NDT that asked healthcare providers (85-90% were physicians or nurses) the best initial dialysis modality for a patient with a planned dialysis start. Of 6595 respondents (11% from the US, mostly physicians), 49% chose peritoneal dialysis; 60% of European respondents chose PD, versus 44% of US respondents. Overall, PD was the most popular choice.
Khanna then focuses on healthcare provider perceptions of PD. Most providers believe PD offers longer preservation of residual renal function (RRF). This is confirmed by USRDS data: survival on PD is better than hemodialysis (HD) for the first 2-3 years of treatment, which he attributes to the preservation of RRF. Providers also believe that post-transplant patients survive longer if they were on PD rather than HD prior to transplantation, though this is based on small investigations.
Incident dialysis counts are about 100K per year, with PD contributing only 6.3%. Prevalent HD patients number about 350K, versus only about 25K on PD. In the mid-1990s, dialysis chains trended away from PD, which has altered incident and prevalent PD counts. Khanna does not explain why this trend occurred or the exact influence of dialysis chains on initiating PD. He then explains the lower number of hospitalizations and other complications in PD patients, especially in the first 2-3 years of dialysis.
Five-year survival (between 1994-8) between PD and HD patients is similar, and PD patients show an early survival advantage. Despite this convincing early advantage, PD is infrequently chosen as initial therapy. Khanna offers explanations: reimbursement practices that discourage starting patients on PD, ignorance among the general nephrology and internal medicine community about the effectiveness and feasibility of initiating PD (despite the survey data he mentioned at the start of the session), and patients' lack of understanding of, exposure to, and familiarity with PD. He does not provide much data to support any of these factors as validated reasons that negatively impact PD initiation.
Khanna summarizes his recent CJASN review advocating PD as initial therapy. The first argument in favor of PD is early survival: 240 deaths/100,000 patient-years on HD in the first year versus 120-130 deaths/100,000 patient-years on PD. From 1993-2007, the incidence of non-dialysis-associated infections (e.g., pneumonia) has been greater in HD than PD patients. Dialysis-associated infection rates (CRBSI versus peritonitis) are lower in the PD than the HD population (though both are decreasing). Another argument in favor of PD is the number of hours that PD patients can spend at home. Khanna touches on the preservation of RRF, with its favorable cardiovascular, mineral metabolism, and atherogenic effects, as another appealing feature of PD. Finally, he speaks about dialysis costs: $73K versus $153K per patient per year for PD versus HD, respectively.
Piraino will discuss infectious complications in PD, mainly peritonitis. She begins with the mathematical formula the ISPD uses to calculate (and standardize) rates of peritonitis, which should be expressed as episodes/years-at-risk. Say you have 33 PD patients, 15 new patients are trained, and 22 patients leave PD. In that year, the total number of patients on PD is 48 (33 + 15). Divide the number of peritonitis episodes by the number of PD patients in that one year to obtain episodes/years-at-risk. She also mentions that this calculation should be performed on an isolate-by-isolate basis.
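Piraino's rate arithmetic can be sketched in a few lines. Note that the talk's example simplifies by counting each of the 48 patients as one year at risk; the episode count below is an illustrative assumption, since the talk gives patient counts but not a number of episodes:

```python
def peritonitis_rate(episodes, years_at_risk):
    """ISPD convention: peritonitis episodes per patient-year at risk."""
    return episodes / years_at_risk

def patient_months_per_episode(rate):
    """The same rate expressed as 'one episode per N patient-months'."""
    return 12 / rate

# The talk's example: 33 existing patients + 15 newly trained = 48 patients
# contributing to the year (each treated as one year at risk).
years_at_risk = 33 + 15
rate = peritonitis_rate(episodes=24, years_at_risk=years_at_risk)  # 24 episodes is illustrative
print(rate)                             # 0.5 episodes/year-at-risk
print(patient_months_per_episode(rate)) # one episode per 24 patient-months
```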
The leading cause of peritonitis is contamination. Exit-site or tunnel infections are usually #2. Bacteremia from any cause or urinary/gynecologic sources are infrequent causes of peritonitis. Contamination can be prevented by adequately training the PD nurse (who then trains the patient). Piraino emphasizes that nurses who know how to perform PD aren’t necessarily equipped to teach proper PD technique. She mentions that the best patient education model comes from the principles of adult learning.
A strong independent predictor of peritonitis risk is late arrival to PD training. Baseline albumin is also a predictor of subsequent peritonitis.
Piraino turns her attention to retraining patients. A study in Italy showed a lower rate of peritonitis with repeated PD training via home visits. She suggests that the optimal frequency of retraining is debatable, and does not offer a specific interval.
GI sources of peritonitis occur, mainly, through transmural migration (which is not clearly understood). A study from Hong Kong showed that hypokalemia increases peritonitis rates, perhaps by exacerbating intestinal stasis and transmural migration. Another study from Hong Kong showed higher rates of peritonitis in patients who have diverticulosis.
Exit-site and tunnel infections are next on her agenda. She believes that the double-cuffed catheter prevents tunnel infections, though this is not currently supported/recommended by the ISPD. Mupirocin and/or gentamicin ointments applied to the exit-site can prevent peritonitis. If peritonitis sets in, management must be quick, as cultures do not grow quickly (unless the organism is a fungus). Cover both gram-positive and gram-negative organisms, even if the gram stain is negative. Catheter removal should be considered if there is a recurrent episode of peritonitis. Mycobacterial infections may or may not require catheter removal. Catheters should be removed for all fungal peritonitis episodes.
Golper closes the session, noting that his presentation will be cut short for lack of time. He focuses on non-infectious, mechanical complications of PD. Intra-abdominal pressure depends primarily on patient age, dwell volume, and patient position. Less influential factors include body mass index and degree of constipation.
Abdominal wall leaks generally present with localized, asymmetric swelling (and weight gain) that is not associated with lower extremity edema. Effluent return, though lower in such leaks, is a late finding and should not be relied upon. Evaluation can be performed with intraperitoneally-injected contrast on a CT scan (about 100 ml of contrast per 1 liter of PD fluid). Alternatively, a non-contrast MRI can be performed. Treatment is first directed at lowering intra-abdominal pressure while continuing PD: use lower volumes with most time in the supine or standing position, convert CAPD to APD or CCPD, and rely on RRF to make up the difference in Kt/V. One in 5 patients will require surgical intervention, but most leaks heal spontaneously.
Hernia development is closely associated with abdominal wall leaks. The key point about hernia development is that they are most common at the catheter site and prior surgical incisions, and that smaller hernias have a higher risk of incarceration/strangulation than larger hernias. Most large hernias can be treated with trusses rather than surgical repair.
Integrating the treatment of secondary hyperparathyroidism in ESRD
Presenters: Hatmut Malluche, MD (University of Kentucky); Csaba Kovesdy, MD (Salem VA Medical Center); Darryl Quarles, MD (University of Tennessee)
Blogger: Azzour Hazzan, MD (North Shore – LIJ/Hofstra Medical School)
Here are the key points about the different PTH assays:
Different PTH assays have different specificities and sensitivities, but the trend in values is telling and should be trusted.
Using the ratio of CAP to CIP (the cyclase-activating versus cyclase-inactive portions of the PTH peptide) may help further.
PTH alone can only screen for high bone turnover, not necessarily diagnose it.
With the Scantibodies assay, a PTH of 420 or above gives roughly 80% diagnostic accuracy for high turnover; a CAP/CIP ratio of 1 supports that diagnosis.
Different cutoffs for PTH and the CAP/CIP ratio exist for Black patients.
Bone volume is also important.
DEXA is unreliable at the vertebral spine (it gives falsely high readings); hip readings are reliable.
Higher bone volume correlates with less calcification in the coronary arteries, and vice versa; this is true at any age.
Low turnover with low bone volume is more common in white patients; high turnover with osteoporosis is more common in Black patients.
Both low and high turnover correlate strongly with calcification; slightly elevated turnover fares somewhat better than low turnover.
Renal Palliative Care
Presenters: Sara Davison (University of Alberta); Alvin Moss MD (West Virginia University School of Medicine)
Blogger: Tejas Desai, MD (East Carolina University – Brody School of Medicine)
Moss starts the session with his objective: to discuss the guidelines for withholding and withdrawing renal replacement therapy. He mentions the 2nd edition of the Clinical Practice Guidelines on Shared Decision-Making in the Appropriate Initiation of and Withdrawal from Dialysis offered by the Renal Physicians Association (which he and Dr. Davison were formative in crafting) for $35. (You can obtain these guidelines for free by clicking here). These guidelines have 10 recommendations for the care of adult patients, along with explanations and strategies on how to implement the recommendations in various clinical scenarios. The guidelines identify 3 types of treatment goals: 1) rehabilitative renal replacement therapy: patients who want renal replacement therapy along with other invasive procedures; 2) dialysis with limitations: patients with a poor prognosis who choose dialysis but with limits on other treatments, like mechanical ventilation or CPR; and 3) active medical management: patients who decline dialysis and other therapies and wish to be comfortable, but want their medical conditions managed optimally through medications.
Recommendation 9 of the guidelines states that all patients with AKI, CKD, or ESRD who are suffering from the burden of other diseases should be offered palliative care. Moss, Germaine, and others have developed an integrative prognostic model to calculate 6-month and 12-month survival estimates. This prognostic tool can assist providers in determining which patients should be offered dialysis and which should be referred to palliative care.
The guideline booklet has a number of recommendations on how nephrologists can broach these subjects. The majority of the first half of Moss’ presentation focused on the benefits of using the new set of guidelines.
Davison turns her attention to pain management at the end-of-life in advanced CKD/ESRD. The common notion that renal disease is a painless way to die is absolutely false. Osler is credited with this notion, but recent data disproves his idea. In the last month of life, 84% of patients had pruritus and 73% had substantial pain. More importantly, these patients are living with these symptoms as they approach death, which is why nephrology providers must take an active role in pain management early on, and not at the last days/moments of life. Within any 24-hour period, over 80% will suffer at least one episode of moderate-to-severe pain. The cause of the pain, such as ischemic or neuropathic pain, is not related to the severity of the pain. Pain increases the risk of depression and insomnia by about 2.5-fold. Mitigating pain can improve quality of life by 35-50%, which, in terms of magnitude, is similar to the change in quality of life after renal transplantation.
The most common form of narcotic abuse in the care of the chronically dying is the under-treatment of pain. Screening is the first step at identifying patients who will require aggressive pain management. Any screening tool can be used, and in North America, most nephrology providers can adopt screening tools first created for cancer patients (modified for renal patients). After a positive screening test, Davison advocates the use of the WHO Analgesic Ladder. Although this algorithm has been validated in cancer patients, she feels it can be extrapolated to renal patients.
The WHO Analgesic Ladder has 3 stages. Stage 1 recommends a non-opioid analgesic (Tylenol, or an NSAID for only a short duration to avoid catastrophic GI toxicity) + adjuvant therapy (gabapentin or pregabalin for neuropathic pain, tricyclic antidepressants as second-line agents). Stage 2 recommends a weak opioid (oxycodone, tramadol) + adjuvant. Stage 3 recommends a strong opioid (fentanyl, hydromorphone, methadone) + adjuvant. Hydromorphone is an excellent opioid for renal patients who are on dialysis; non-dialysis patients will accumulate a toxic metabolite (thus, for CKD patients, careful monitoring is required). Fentanyl should be used as a second agent after hydromorphone has failed. Methadone requires frequent ECG monitoring.
The following medications are recommended only in the acute setting (and not for chronic pain): NSAIDs, codeine, morphine, meperidine, and propoxyphene. For the last 4, toxic metabolites can have devastating results. Codeine, in particular, has a narrow therapeutic window that fluctuates in a manner that cannot be predicted by the dose or duration of use. NSAIDs are efficacious but can lead to major GI toxicities.
Practical Issues in Home Hemodialysis
Presenters: Robert Lockridge, MD (Lynchburg Nephrology Physicians, PLLC); Christopher Chan, MD (The Toronto General Hospital, University Health Systems); Joel Glickman, MD (University of Pennsylvania)
Blogger: Tejas Desai, MD (East Carolina University – Brody School of Medicine)
Glickman starts the session with basic definitions: Short daily hemodialysis (SDHD): 5-6 sessions/week for about 2-3 hours/session; Nocturnal hemodialysis (NHD): 5 sessions/week for 6-8 hours/session.
Lockridge begins his portion of the session by discussing who should and can do home hemodialysis. In the US, as of 12/31/10, 350K people are performing in-center HD, while only 5700 are performing home hemodialysis (28K performing PD). The default modality has become in-center HD. Any patient can qualify to perform home hemodialysis: age, education level, and type of vascular access do not exclude one from home hemodialysis. Literacy is a necessity for successful home hemodialysis, however. Patients do not need partners to assist in the dialysis, but NxStage requires that patients have someone living in the dwelling in which dialysis is performed.
Home modifications are not necessary with the NxStage home hemo machine. Both municipal and spring/well water supplies can be used, but municipal water supplies can have varying levels of chloramines. Carbon filters must be checked frequently if the water supply is from a municipality.
Finally, Lockridge intimates that survival in home hemodialysis patients (short daily) is similar to that of cadaveric transplant patients at 4 years (NDT 2009, p. 2915). He states that studies such as these have been criticized for their flaws, but he does not clearly articulate what those criticisms are.
Chan’s portion of the session focuses on the optimal vascular access for home hemodialysis. The common theme of his portion is that there are very limited data on the optimal access. Most patients prefer tunneled catheters because they view self-cannulation as a significant fear and barrier. However, Chan believes the optimal access will be shown to be AV accesses, as in conventional HD patients. Catheters have longer life spans/patency rates in home hemodialysis because of the lower blood flow needed (about 200 ml/min).
Chan turns his attention to cannulation of the AVF — the buttonhole technique. The intent of this technique was to create a track so that cannulation could be performed easily with limited pain, a high rate of cannulation success, and fewer aneurysmal dilatations (a cosmetic benefit). However, the largest downfall of this technique is infection. Patients may not maintain aseptic technique or may cannulate at a different angle; both increase the risk of infection (including metastatic infection). In Japan, home hemodialysis patients no longer routinely perform the buttonhole technique — in some centers, it has been banned.
Glickman begins his presentation about the intricacies of writing a home hemodialysis prescription using NxStage equipment. He shows the relationship between dialysate flow rate (Qd) and urea clearance. The relationship is linear until Qd reaches about 600-800 ml/min, after which urea clearance plateaus. Focusing on the initial part of the curve, urea clearance is directly proportional to Qd; this range of Qd provides efficient dialysis. Flow fraction (FF) is the ratio of Qd to blood flow rate (Qb): FF = Qd/Qb. At lower FFs, the Qd is low, which puts us on the most efficient part of the Qd vs. urea clearance curve. An FF of 30-35% equates to 92% saturation of the dialysate.
Calculating kT/V in home hemodialysis patients is analogous to the calculation in PD patients: the D/Purea term of the PD kT/V formula corresponds to the percent saturation of the dialysate, which is set by the FF of the home hemodialysis prescription. kT/V(urea, per treatment) = FF * drain volume/Vd(urea), where drain volume = total dialysate used (i.e., Qd * time) + UF.
Aim for an FF of 30-35% to achieve 92% saturation. Why is FF used in home hemodialysis and not in in-center HD? The reason is that home hemodialysis patients must use dialysate efficiently: space and cost are the largest reasons why home hemodialysis patients cannot use an “infinite” amount of dialysate. A typical home hemo patient uses approximately 20 L of dialysate/session, whereas an in-center patient uses approximately 120 L/session. Secondly, we want to limit the time per session, and to achieve this the FF must be kept low.
For in-center HD, the physician stipulates the time. Time is independent because the amount of dialysate used is not considered. In home hemo patients, time is a dependent variable: it depends on Qd and total drain volume. FF is programmed and fixed unless the physician specifies a change. Because FF is fixed by NxStage and the physician sets the Qb (usually 200 ml/min), Qd is set for us. In addition, the total amount of dialysate to be used per session is also fixed by the physician. As a result, the more dialysate volume used, or the greater the UF per session, the more time the patient will be on dialysis per session.
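The dependency described above can be sketched numerically. This is a minimal illustration: the Qb and FF come from the session notes, but the per-session dialysate volume and UF values here are assumed for the example, not taken from an actual prescription.

```python
# Sketch: session time falls out of a fixed-FF home hemo prescription.
# Qb and FF per the session notes; dialysate volume and UF are assumed.

FF = 0.30          # flow fraction (fixed on the machine)
Qb = 200           # blood flow rate, ml/min
dialysate_l = 20.0 # total dialysate prescribed per session, liters (assumed)
uf_l = 2.0         # ultrafiltration per session, liters (assumed)

Qd = FF * Qb                              # dialysate flow rate, ml/min
drain_volume_ml = (dialysate_l + uf_l) * 1000
time_min = drain_volume_ml / Qd           # time is the dependent variable

print(f"Qd = {Qd:.0f} ml/min")
print(f"Session time = {time_min:.0f} min (~{time_min / 60:.1f} hours)")
```

With these assumed inputs, Qd works out to 60 ml/min and the session runs roughly 6 hours, i.e., a nocturnal-length treatment; more dialysate or more UF lengthens the session, since Qd is pinned by the fixed FF and Qb.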
Fluid options in home hemodialysis come in two forms: 1) 5-liter bags, or 2) a sack that holds anywhere between 30-60 liters at one time. [K+] is fixed at 1 meq/l, and [Mg2+], [Ca2+], and lactate are fixed as well. Note: the reason patients don’t become hypokalemic with a [K+] of 1 meq/l is that the total dialysate volume used (20 L) is low. Keep in mind that the [K+] in peritoneal dialysate is 0 meq/l.
There are no outcome studies in home hemo patients regarding optimal kT/V. Opinion-based recommendations are to achieve a weekly spKt/V of 2, or about 0.42/session.
Why can’t we use the concept of FF in in-center HD patients? Let’s do a simple set of mathematical calculations to understand why this would be unfeasible in the in-center environment.
First, let’s work with the notion that time is a dependent variable, just like it is in home hemo patients (recall that for traditional in-center patients, time is an independent variable, specified by the physician).
We want to achieve a kT/V of 1.4 (based on the HEMO trial results). Assume a 70 kg man (thus, Vd is 0.6 * 70, or 42 L). Let’s set the FF at 30%. Based on:
kT/V = FF(drain volume)/Vd
we can solve for drain volume. In this scenario, it is 196 liters.
The standard Qb of an in-center HD patient is about 400 ml/min. Let’s use that blood flow rate.
Qd, therefore, will be determined by the following:
FF = Qd/Qb
Qd is about 120 ml/min. To use 196 L of dialysate (in order to achieve a kT/V of 1.4 while keeping the FF at 30% — in other words, to use dialysate most efficiently), the per-session time = 196,000/120 = 1633 min = 27.2 hours!
Not feasible. As a result, in in-center HD patients, the physician specifies the time at the expense of the amount of dialysate volume used. If time is decreased from 27.2 hours to 4 hours, then a lot more dialysate will be required to achieve the kT/V of 1.4.
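The arithmetic above can be verified with a short script; every input here (target kT/V, patient weight, FF, Qb) and the kT/V = FF * drain volume/Vd relationship are taken directly from the session as presented.

```python
# Reproduce the in-center feasibility arithmetic from the session,
# using the kT/V = FF * drain_volume / Vd relationship as presented.

target_ktv = 1.4        # per the HEMO trial
weight_kg = 70
Vd = 0.6 * weight_kg    # urea volume of distribution, liters (42 L)
FF = 0.30               # flow fraction
Qb = 400                # standard in-center blood flow, ml/min

drain_volume_l = target_ktv * Vd / FF    # dialysate needed, liters
Qd = FF * Qb                             # dialysate flow rate, ml/min
time_min = drain_volume_l * 1000 / Qd    # session time, minutes

print(f"Drain volume: {drain_volume_l:.0f} L")
print(f"Session time: {time_min:.0f} min = {time_min / 60:.1f} hours")
```

The script lands on the same 196 L and 27.2-hour session, confirming why fixed-FF dosing is abandoned in-center: time is specified by the physician instead, at the cost of using far more dialysate.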
Targeting Hemoglobin A1C in ESRD patients
Presenters: Marcello Tonelli, MD (Alberta Kidney Disease Network); Kamyar Kalantar-Zadeh, MD PhD (UCLA – David Geffen School of Medicine); Mark Williams, MD (Joslin Diabetes Center, Beth Israel Deaconess)
Blogger: Tejas Desai, MD (East Carolina University)
Tonelli begins the session by discussing where we stand with hemoglobin A1C targets in ESRD patients. Diabetes is a worldwide problem that is only worsening. In the US, incidence rates for ESRD patients with diabetes have leveled off. This trend is not seen in young African-Americans (ages 30-39), in whom the number of incident diabetic ESRD patients continues to rise linearly. Diabetes is no longer a disease of affluence. Projected trends from 2000-2030 show the greatest increases in diabetes in India (180%) and Australia (130%).
The 1993 DCCT study convincingly showed that tighter glucose control in type 1 diabetics decreased microvascular and macrovascular complications. The 1998 UKPDS study showed the same in type 2 diabetics. These studies set the stage for the lower-is-better notion of hemoglobin A1C targets. Current K/DOQI guidelines suggest a target A1C of < 7%, yet 20% of in-center patients have an A1C > 8%. Improvements in A1C in the general diabetic population have been much greater than in the diabetic CKD/ESRD population. Even in clinical trials such as the DCCT, improvement in A1C levels has been modest (A1C fell from 9% to 6% after 6.5 years in the DCCT, only to rise back to 8% by 4 years post-study completion). The same trend was seen in the UKPDS data.
Instituting therapies to decrease A1C can lead to symptomatic hypoglycemia. Diabetic patients are at risk of developing hypoglycemia after treatment, and in those who also have CKD the rate of hypoglycemia is twice as great as in diabetics without CKD; CKD itself increases the risk of hypoglycemia. Theoretical reasons for this trend include decreased hepatic efficiency of insulin metabolism in ESRD, altered diet leading to decreased glucose stores, and the increased half-life of insulin in CKD/ESRD patients.
Tonelli also mentions the controversy surrounding the use of hemoglobin A1C as an accurate measure of diabetic control in ESRD patients. Factors such as altered erythrocyte life span, use of ESAs, uremia, and lower baseline hemoglobin levels have all been cited as reasons the hemoglobin A1C is an inaccurate marker of diabetic control in the dialysis population. Hemodialysis itself can change the ambient glucose concentration (glucose levels are lower on dialysis days than on non-dialysis days).
There are no RCTs to answer whether lower A1Cs improve outcomes in CKD/ESRD patients; thus, only observational studies have been used to address this question. In Tonelli’s unpublished data on 25,000 CKD-4 patients, there is a statistically significant increase in CVA, CHF, MI, and progression to dialysis with higher A1Cs. There is a J-shaped curve whose inflection point is around 7%, but this finding also comes from observational studies. CKD-3 patients have a higher risk of mortality and other cardiovascular complications with higher A1Cs than CKD-4 patients do; Tonelli does not have an explanation for this observation.
Kalantar-Zadeh begins his portion of the presentation, in which he discusses why an A1C target of < 7% is appropriate in CKD/ESRD patients. A number of societies have recommended an A1C target of < 6.5-7% in all diabetics (not necessarily CKD/ESRD diabetics), including the ADA, AHA, and ACC. Based on such recommendations, in 2007 the K/DOQI guidelines recommended a target A1C < 7%, regardless of CKD stage or the presence/absence of ESRD. Kalantar shows 4 different observational studies from Japan, each showing that lower A1C levels resulted in better outcomes. In contrast, a study of Fresenius Medical Care patients (25K) showed no relationship between A1C and outcomes (KI 2006, p. 1903). Kalantar then showed data from his group at DaVita on 26K HD patients followed over 3 years. Although the crude data showed no relation between A1C and clinical outcomes, analysis accounting for poor nutrition and inflammation showed a positive correlation with a J-shaped curve (inflection point at approximately 7%). Moreover, a study in CJASN 2011 (pending publication) showed a similar J-shaped relationship to outcomes in 3000 PD patients (i.e., patients with A1C > 8% had higher mortality rates).
Williams concludes the session with a presentation on why the A1C target should be individualized rather than uniformly applied to all diabetic ESRD patients. In the general diabetes population, Williams states, uniform targets are no longer the current recommendation. Moreover, hemoglobin A1C is a flawed metric for diabetes management.
The 2011 ADA recommendations suggest that patients with limited life expectancy, extensive comorbid conditions, or extensive vascular complications should have less stringent A1C targets. The ACP and ACC agree with individualized targets, especially for patients vulnerable to hypoglycemic episodes. A JAMA 2009 paper suggests that younger patients who can sense hypoglycemia and have had diabetes for a short duration should be targeted to an A1C of < 7%. Williams reviews a number of papers, including some he has authored, to show that not all CKD/ESRD diabetics will steadily improve with simple lowering of hemoglobin A1C. At best, he states, there is a weak association between A1C levels and clinical outcomes, certainly not as strong an association as Kalantar-Zadeh described in his portion of the session.
Low Sodium Acute Kidney Injury
Presenters: Thomas Gonwa, MD (Mayo Clinic, Jacksonville, FL); Lakhmir Chawla, MD (George Washington University); Luis Juncos, MD (University of Mississippi)
Blogger: Tejas Desai, MD (East Carolina University)
Gonwa starts the session with a focus on AKI in cirrhotics. AKI occurs in 20-35% of cirrhotic patients. Gonwa also subscribes to the arterial underfilling theory supported by Schrier earlier in the NKF session (see Blog #1). AKI occurs on a spectrum that begins with pre-renal azotemia and culminates in the hepato-renal syndrome (renal vasoconstriction is the hallmark of the disease). Ultimately, renal cortical ischemia develops from prolonged renal vasoconstriction (or renal hypoperfusion) — this impacts the likelihood of renal recovery in patients who receive a liver transplant.
Renal blood flow (RBF) is diminished in cirrhotic patients (even those without defined hepato-renal syndrome). RBF decreases early in cirrhosis and worsens as the cirrhosis progresses. In hospitalized cirrhotic patients, 65% of those who develop AKI do so from ischemic ATN or pre-renal azotemia. Hepato-renal syndrome is a diagnosis of exclusion. In a study of 463 patients, survival depended on the cause of AKI: hypovolemic AKI had a 3-month survival of 46%, true hepato-renal syndrome only 15%, and renal parenchymal disease 76%. Thus, it is critical to determine whether parenchymal disease is causing the AKI in cirrhotic patients.
The definition of hepato-renal syndrome is: Cr > 1.5 mg/dl; no improvement of serum Cr with plasma volume expansion using albumin (no longer crystalloids); no nephrotoxic drugs or vasodilators; absence of septic shock; and absence of renal parenchymal disease on urine microscopy or ultrasound. Creatinine clearance and urinary sodium or fractional excretion of Na are no longer diagnostic criteria. Type I hepato-renal syndrome is acute, causes rapidly deteriorating renal function, and is more common in episodes of acute hepatic failure. Type II develops more slowly, over weeks to months.
Medical management of hepato-renal syndrome includes midodrine + octreotide. This drug combination increases GFR, increases UOP, and improves electrolytes. Gonwa states that such a treatment strategy, especially in type II hepato-renal syndrome, works about 30-40% of the time. Norepinephrine is used primarily in type I hepato-renal syndrome; it is thought to raise the peripheral blood pressure. A newer drug, terlipressin, can also be used (though it is not approved in the US). Terlipressin is an ADH analogue used as a peripheral vasoconstrictor. A trial of terlipressin showed some benefit in type I hepato-renal syndrome, but the renal dysfunction recurs once the drug is discontinued. Terlipressin is the standard of care in Europe. The same study showed no improvement in kidney function if the entry creatinine was 5.6 mg/dl or greater. In a study comparing terlipressin with norepinephrine, there was no significant difference in the rate of creatinine decline; Gonwa believes this might be a main reason why terlipressin is not yet approved in the US.
TIPS: transjugular intrahepatic porto-systemic shunt. There are many side effects with this procedure, including volume overload (especially with poor cardiac function) and hepatic encephalopathy. Improvement in kidney function is slow. Thirty-day survival is 71% (versus the usual 30-day survival of 15-30% in type I hepato-renal syndrome patients).
Five-year survival post-liver transplant is 80% if the creatinine is < 1 mg/dl; this drops to 60% if the patient required dialysis prior to transplant. Thus, kidney function around the time of transplantation affects survival.
Chawla turns his attention to the abdominal compartment syndrome (ACS), an increasingly common and often missed etiology of AKI. Historically, ACS was generally in the purview of trauma patients and trauma physicians. ACS develops when intra-abdominal pressure (IAP) increases; strictly speaking, ACS is defined as IAP > 25 mm Hg plus organ failure. Any intra-abdominal process can lead to tissue edema, which is worsened by iatrogenic fluid administration and leads to abdominal ischemia. This ischemia causes inflammation and further raises the intra-abdominal pressure. Chawla indicates that the problem in diagnosis is that physicians interpret the patient as volume overloaded when, in fact, these patients require additional fluids rather than diuresis.
There is no single absolute IAP that defines ACS. Generally, IAPs of 20 mm Hg or greater increase the predisposition to ACS, but equally important is the speed at which the elevated IAP develops. As IAP increases, CVP and IVC pressure rise. The elevated CVP fools physicians into thinking the patient requires diuresis, when in fact the patient needs additional volume because the fluid is trapped below the compressed IVC. Therapy is volume, which must be administered through catheters that enter the SVC, since the SVC is untouched by the intra-abdominal process.
Renal dysfunction occurs through direct renal vein compression and direct cortical compression — both of which are fully reversible if the pressure is relieved promptly. Sequelae develop when elevated IAP persists and causes renal ischemia and infarction.
The overall prevalence of ACS in the SICU is similar to that in the MICU; thus, one must keep ACS in the differential for AKI. One must not be reassured by a soft abdomen — a tense, distended abdomen is not a pathognomonic (or even necessary) finding.
The gold standard for diagnosis is transduction of the IAP through a Foley catheter. Oliguria is more likely to be evident than a rise in creatinine (in part because the creatinine is diluted in these hypervolemic patients). Management of ACS is, first, to check serial IAP, urine output, and ventilator peak airway pressure measurements (about q6-12 hours). Despite appearing hypervolemic, these patients require additional volume, which should be titrated to cardiac output and administered through supra-diaphragmatic catheters. Paracentesis can be significantly helpful — removal of small volumes can lead to significant increases in urine output. The only downside is that this measure is not sustainable and more definitive therapy is required: the decompressive laparotomy, which can be performed by a surgeon at the bedside.