The success of medical drug therapies depends on dosage. Defining a dose for medical devices, however, is more complex, because their contribution must be seen as part of an overall treatment system. Hemodialysis represents an ideal model for dosing such medical systems, because uremic patients are repeatedly exposed to this treatment over long periods of time. Treatment outcomes can therefore be attributed to system parameters.
Medicinal drugs are administered according to medical needs. A practising physician defines the dose of a painkiller, an ACE inhibitor or an antibiotic so as to achieve a therapeutic outcome. The approval of medicinal drugs also requires clinical investigations in which, for instance, Phase II trials serve to establish the optimal drug dosage and its efficacy. To date, 'individualised therapies' are the focus of drug research. They are based on gene analyses of individual patients and allow for a highly specific dosage and application. In contrast to this pharmaceutical approach, medical devices are typically developed for large cohorts of patients, so that this approach can hardly be called 'personalised medicine'.
Medical devices are very heterogeneous and cannot be categorised easily, and when looking at them from the medicinal drug perspective, several questions arise.
In this article, I'd like to discuss the specific situation of a therapy that continuously uses medical devices: the treatment of patients with kidney failure. Hemodialysis with the artificial kidney represents a success story of biomaterials and medical technology, because worldwide more than two million uremic patients currently owe their lives to this treatment. Patients suffering from kidney failure undergo this therapy three times a week, mostly with single-use devices such as syringes, tubing and filters. Recent clinical investigations have shown that patients survive better once a specific therapy dose is applied (see below). A short description of the treatment might therefore help to clarify the debate around the optimal dialysis dose.
In hemodialysis, blood from a forearm vein of the patient is guided through a set of tubing and the artificial kidney (dialyser) under the control of a dialysis machine. Such a dialyser contains more than 10,000 tiny capillaries with an internal diameter of about 200 µm and a membrane wall of about 40 µm. As blood passes along these capillary membranes, uremic retention solutes can cross the membrane wall and leave the blood stream, and blood purification occurs. Under standard conditions, this treatment lasts about four hours. The patient is then disconnected from the dialysis system and returns for the next treatment two days later.
The dose concept for dialysis is borrowed from drug treatment, where physicians manipulate the levels of a single compound: increase the dose and the levels in the body will rise. If the treatment removes a substance, as in hemodialysis, the direction is reversed: increase the dose and levels in the body should fall. This notion has led to many controversies over recent decades. The search for representative molecules that could serve as an optimal surrogate marker of therapy efficiency, combined with investigations into device and treatment efficiency, is still ongoing.
In the early 1970s, Babb and colleagues found that so-called middle molecules play an important role in uremic toxicity and that uremic neuropathy could be attributed to them. Prevention of neuropathy depended on an adequate number of dialysis hours per week rather than on maintaining a certain pre-dialysis level of serum urea and creatinine. The removal rate of small molecules such as urea and creatinine falls towards the end of dialysis, because the concentration gradient responsible for their diffusive removal declines; prolonging dialysis time therefore contributes little to their removal. The removal of slowly diffusing middle molecules, however, would benefit from extending dialysis time (taken as a dose). The consequence of these observations was the introduction of a dialysis dose known as the 'square meter-hour hypothesis': dialysis efficiency was defined as dependent on the surface area of the dialysis filter and the hours of application. Although convincing, this hypothesis did not include patient-specific properties.
This disadvantage was overcome about a decade later, when Gotch and Sargent introduced the concept of 'Kt/Vurea' as a dialysis dose. These authors based their analyses on the findings of a nationwide study on dialysis in the USA in the late 1970s. Here, the dialysis dose was defined as the clearance of urea multiplied by the dialysis treatment time and normalised to the distribution volume 'V' of urea in the patient. 'V' can be calculated, in a first approximation, as the amount of water present in the patient's body, i.e. about 60 per cent of his or her body weight. This approach was supported by observational studies that demonstrated an association between Kt/Vurea and patient survival. It showed for the first time that dialysis dose should be assessed as a systems approach, i.e. by the combined use of the factors that control the therapy: the patient (Vurea in a single pool), the engineer who designed the device (K) and the physician responsible for the treatment time (t).
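The Kt/Vurea calculation described above can be sketched in a few lines. The clearance, treatment time and body weight used here are illustrative assumptions, not values from the text:

```python
# Worked example of the Kt/V_urea dialysis dose: clearance K times
# treatment time t, normalised to the urea distribution volume V.
# K = 250 mL/min, t = 240 min and a 70 kg patient are assumed values.

def urea_distribution_volume(body_weight_kg: float, water_fraction: float = 0.6) -> float:
    """First approximation of V in litres: body water as ~60% of body weight."""
    return body_weight_kg * water_fraction

def kt_over_v(clearance_ml_min: float, time_min: float, volume_l: float) -> float:
    """Dialysis dose Kt/V (dimensionless)."""
    cleared_litres = clearance_ml_min * time_min / 1000.0  # K*t in litres
    return cleared_litres / volume_l

v = urea_distribution_volume(70.0)   # 42 L for a 70 kg patient
dose = kt_over_v(250.0, 240.0, v)    # ~1.43 for a four-hour session
```

With these assumed numbers, a standard four-hour session delivers a dose at the upper end of the 1.2–1.4 target range discussed below.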
A value for Kt/V between 1.2 and 1.4 was considered the target for an optimal therapy. It is still the basis for the reimbursement of dialysis therapy in many countries to date. However, this rather simple equation has led to the assumption that treatment time 't' can be shortened to 2.5 to 3.5 hours once 'K' is adequately increased. In contrast, superior patient outcomes have been attributed to longer treatment times, e.g. in Tassin, France (24 h/week), subsequently with long nocturnal home hemodialysis in Toronto, Canada, and also with more frequent daily dialysis in Minneapolis, USA. The question arises whether 'treatment time' has become the most prominent parameter of the dialysis dose. A physiologist would welcome such an approach and explain it by the kinetics of diffusion-driven processes, which represent a slow mobilisation of uremic retention solutes from body compartments other than blood, such as tissue.
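The trade-off between clearance 'K' and treatment time 't' implied by a fixed Kt/V target can be made explicit by solving the dose equation for t. The clearances below are hypothetical values chosen for illustration:

```python
# For a fixed target dose and distribution volume V, the required
# treatment time follows from Kt/V = target as t = target * V / K.
# V = 42 L, a target of 1.3 and the two clearances are assumed values.

def required_time_h(target_dose: float, volume_l: float, clearance_ml_min: float) -> float:
    """Treatment time in hours needed to reach a given Kt/V target."""
    clearance_l_h = clearance_ml_min * 60.0 / 1000.0  # mL/min -> L/h
    return target_dose * volume_l / clearance_l_h

t_standard = required_time_h(1.3, 42.0, 250.0)  # ~3.6 h at K = 250 mL/min
t_high_k   = required_time_h(1.3, 42.0, 350.0)  # ~2.6 h at K = 350 mL/min
```

This is exactly the arithmetic behind the assumption criticised in the text: raising K makes the equation permit shorter sessions, even though slowly equilibrating compartments may not follow the same kinetics.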
Recently, further high-tech research has added to the complexity of uremic retention solutes and their efficient removal. Proteomic analyses of the ultrafiltrate in high-flux dialysis have shown that it contains more than 1,400 different polypeptides of higher molecular weight. Without knowing both the precise toxic potential of these polypeptides and their possible synergistic effects in causing uremia-related pathology, it is difficult to define an appropriate parameter for the dose of dialysis. Moreover, one has to assume that it is not sufficient to remove only a single retention solute during dialysis, such as urea or phosphate. The focus should be on the removal of groups or families of molecules with common features, such as a comparable conformation, a similar charge or an identical molecular weight, to name only a few. As a consequence, convective rather than diffusive transport mechanisms look more promising for the removal of such groups and thus should be applied. Why is that so?
It has long been known that convective transport of solutes across membranes decreases less rapidly with increasing solute size than diffusive transport does: the sieving coefficients of membranes are less dependent on molecular size than are diffusion coefficients. A large ultrafiltration volume, defined as a dose, therefore allows for the convective removal of large molecular weight solutes. Such a volume can be provided by a technique called hemodiafiltration (HDF). The EUDIAL group defines hemodiafiltration as follows: “HDF is a blood purification therapy combining diffusive and convective solute transport using a high-flux membrane characterised by an ultrafiltration coefficient greater than 20 mL/h/mmHg/m² and a sieving coefficient (S) for β2-microglobulin of greater than 0.6. Convective transport is achieved by an effective convection volume of at least 20 per cent of the total blood volume processed. Appropriate fluid balance is maintained by external infusion of a sterile, non-pyrogenic solution into the patient’s blood”. A series of clinical trials performed in recent years tried to confirm the advantage of HDF, mostly without showing a clear-cut benefit, or with advantages only after subgroup analyses. Obviously, and in light of a study from Cataluña, Spain published in 2013, the convective dose in terms of ultrafiltration was simply too small.
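The 20-per-cent criterion in the EUDIAL definition can be checked with simple arithmetic. The blood flow rate and session length below are illustrative assumptions, not figures from the definition:

```python
# Check of the EUDIAL convection criterion: the effective convection
# volume should be at least 20% of the total blood volume processed.
# A blood flow of 350 mL/min over a 240 min session is assumed here.

def blood_volume_processed_l(blood_flow_ml_min: float, time_min: float) -> float:
    """Total blood volume processed in one session, in litres."""
    return blood_flow_ml_min * time_min / 1000.0

def meets_eudial_convection(convection_volume_l: float, processed_l: float) -> bool:
    """True if the convection volume reaches 20% of the processed volume."""
    return convection_volume_l >= 0.20 * processed_l

processed = blood_volume_processed_l(350.0, 240.0)  # 84 L per session
ok = meets_eudial_convection(20.0, processed)       # 20 L / 84 L ~ 24% -> True
```

Under these assumptions a 20 L convection volume comfortably clears the threshold, which is why the trials discussed next focused on volumes above 20 L.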
This study showed that the survival of dialysis patients improved significantly when HDF was performed with an effective convection volume exceeding 20 L and a substitution volume provided online from the dialysate circuit. The data of this study are impressive: more than 900 patients were followed for three years in a two-arm scheme comparing high-flux dialysis (n=450) with high-efficiency hemodiafiltration (n=456). All-cause mortality was reduced by 30 per cent when high-efficiency HDF with an ultrafiltration volume of more than 20 L was administered. The authors concluded from the estimated number needed to treat that switching eight patients from high-flux hemodialysis (low-volume HDF) to high-efficiency HDF (high-volume HDF) may prevent one death per year.
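The number-needed-to-treat reasoning in that conclusion follows a standard formula: NNT is the reciprocal of the absolute risk reduction. The annual mortality rates below are illustrative assumptions chosen to reproduce an NNT of eight, not the trial's published figures:

```python
# Number needed to treat: NNT = 1 / absolute risk reduction, rounded up.
# The annual mortality risks used here are hypothetical examples.
import math

def number_needed_to_treat(risk_control: float, risk_treated: float) -> int:
    """Patients who must switch therapies to prevent one event per year."""
    arr = risk_control - risk_treated  # absolute risk reduction
    return math.ceil(1.0 / arr)

nnt = number_needed_to_treat(0.400, 0.275)  # ARR = 0.125 -> NNT = 8
```

The point of the sketch is that an NNT of eight implies a large absolute risk reduction; a relative reduction of 30 per cent translates into such an NNT only when the baseline annual mortality is itself high.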
To conclude, the definition of the dialysis dose for an optimal therapy of uremic patients has changed. Parameters such as the duration or frequency of dialysis therapy, as well as the physical performance of the filtering device (clearance), have taken second place behind high ultrafiltration volumes. At first sight, this looks surprisingly simple. However, the concept of high exchange volumes in hemodiafiltration rests on two insights: an efficient transport capacity for molecules of higher molecular weight, and the recognition that no single culprit molecule responsible for the pathology of kidney failure has yet been identified. As a consequence, an unspecific blood purification therapy has turned out to be the most successful.
The search for similar parameters in other applications of medical device technology will certainly go on.
References are available at www.asianhhm.com
Joerg Vienken is a Chemical Engineer with a doctoral degree in Biophysics & Engineering. He currently works as Vice President ‘BioSciences’ at Fresenius Medical Care in Bad Homburg, Germany with special focus on biomaterials, medical device technology, and artificial organs.