Overview of Therapeutic Drug Monitoring
TDM is the practice of measuring the concentration of a drug or its metabolite in order to optimize dosing for an individual patient and/or to assess patient compliance with a dosing schedule. The goal of TDM is to increase the likelihood of a therapeutic effect while avoiding or minimizing adverse effects. Table 6–1 lists some commonly monitored drugs. Patients do not require monitoring for most drugs. However, for a limited group of agents, or for patients with certain conditions (for instance, limited renal function, pregnancy, or newborn or geriatric age groups), TDM plays an essential role in establishing the appropriate therapeutic dosing regimen.
The goal of therapeutic drug monitoring is to increase the likelihood of a therapeutic effect and avoid or minimize adverse effects. Patients do not require monitoring for most drugs.
Table 6–1 Commonly Monitored Drugs
Prior to the 1960s, drug dosing was entirely empirical. For certain agents, this trial-and-error approach produced wide variations in patient response and a significant incidence of toxicity. Since then, physicians have learned to optimize drug dosages and delivery while avoiding many of a drug's adverse effects. This has been achieved through the development of sensitive and rapid laboratory assays and the establishment of therapeutic ranges for common medications.
Indications for Therapeutic Drug Monitoring
TDM is performed to optimize the dose of a drug to an individual patient. Drugs with a narrow therapeutic index or margin of safety (the difference between the effective dose and the toxic dose) are potential candidates for therapeutic monitoring (Table 6–2). TDM is useful for drugs that display significant pharmacokinetic variability that may be caused by drug interactions, genetic variation in drug metabolism, nonlinear kinetics, physiologic conditions such as pregnancy and aging, and underlying diseases that alter the effective amount of drug delivered to or metabolized by the body. When patient compliance is in question, drug monitoring may be used to demonstrate the presence or absence of the prescribed agent. TDM requires a suitable laboratory assay and the establishment of a therapeutic reference range that correlates with efficacy and/or toxicity.
Table 6–2 Indications for Therapeutic Drug Monitoring

- The prescribed drug has a low margin of safety; that is, toxic blood drug concentrations or dosages are only slightly greater than therapeutic ones (a narrow therapeutic index)
- Patient compliance with the prescribed drug regimen is uncertain
- The drug does not act via irreversible inhibition (“hit and run” effect)
- Symptoms of underlying disease are difficult to distinguish from drug toxicity
- The treatment goal is not an objectively measured end point (such as blood pressure)
- The prescribed drug has significant pharmacokinetic variability as a result of:
  - Interindividual metabolic capacity
  - Nonlinear (zero-order) drug kinetics
  - Frequent drug–drug interactions
  - Physiologic conditions (eg, aging, pregnancy)
  - Underlying disease state (eg, liver or renal impairment)
TDM is performed by measuring the concentration of a drug and/or its metabolite(s). Blood or serum/plasma is the usual sample for TDM, but in some cases, urine or oral fluid samples are used to evaluate patient compliance. The most common examples of urine and oral fluid sampling are monitoring of buprenorphine, methadone, and oxycodone for compliance. When blood levels are used to guide drug therapy, a proportional relationship is assumed between the plasma/serum concentration, the concentration of drug at the organ cellular level, and the pharmacologic effect. For practical reasons, only blood levels of the drug are measured, because tissue concentrations cannot be easily sampled or analyzed. This pharmacokinetic principle of homogeneity defines the timing of sampling for TDM, since the concentration of drug in blood at the moment of sample collection must reflect a proportional and constant (steady-state) concentration at the end organ and be reflective of drug effects at the cellular level. Most TDM samples are collected as trough concentrations, the lowest level just prior to the next dose, or as peak concentrations, 30 to 60 minutes after the dose, when blood levels are most reflective of the tissue concentration and drug efficacy or toxicity.
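As a minimal sketch of the trough and peak sampling rules above (the 45-minute peak delay and 12-hour dosing interval below are arbitrary examples; actual timing is drug-specific):

```python
from datetime import datetime, timedelta

def tdm_draw_times(dose_time, interval_hours, peak_delay_min=45):
    """Suggest sampling times around one dosing interval: a peak draw
    30-60 min after the dose (45 min used here as a midpoint) and a
    trough draw just prior to the next scheduled dose."""
    peak = dose_time + timedelta(minutes=peak_delay_min)
    trough = dose_time + timedelta(hours=interval_hours) - timedelta(minutes=5)
    return peak, trough

# A dose given at 08:00 on an every-12-hours schedule:
peak, trough = tdm_draw_times(datetime(2024, 1, 1, 8, 0), 12)
# peak draw 08:45; trough draw 19:55, just before the 20:00 dose
```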
Pharmacokinetics is the study of how the body handles a drug: its absorption, distribution, metabolism, and elimination. Drug behavior in the body can be described by the LADME mnemonic. The “L” stands for liberation, the release of the drug from its dosage form. The “A” is absorption, the movement of drug from the administration site into the circulation. Distribution is the “D” and describes the reversible movement of drug through the circulatory system and body tissues. Metabolism, or “M,” is the chemical conversion of drug to active and inactive compounds. Finally, the “E” stands for elimination, the removal of drug and its metabolites from the body.
Drugs behave in the body based on their chemical characteristics at the molecular level. Drugs can be acidic, basic, neutral, or polar (Table 6–3). The charge and dissociation constant (pK) of the drug influences its absorption, distribution, and elimination characteristics. The dissociation constant of the drug also affects how the drug can be extracted from patient samples and analyzed in the laboratory.
Liberation and Routes of Drug Administration
Drugs can be delivered to the body in a variety of ways. Patients may take a drug orally (PO) by pill (eg, aspirin) or dissolve a powder in a liquid drink (eg, laxatives). Drugs can be delivered intravenously (IV) through a needle directly into the circulation (antibiotics, like vancomycin). Some medications are delivered under the tongue, sublingually (SL), like nitroglycerin for cardiac pain or angina. Others may be injected under the skin, subcutaneously (SC), like the compounds in a tuberculosis test, or intramuscularly (IM), such as vaccinations. Rectal (suppositories) and transdermal (eg, fentanyl pain patches) are other methods of delivering a drug. The route of administration will affect the absorption and bioavailability of the drug.
The route of drug administration and the formulation of the drug affect the rate and extent of drug absorption. For example, oral drug absorption is affected by many factors including drug solubility in enteral fluid, the acid–base characteristics of the drug, the lipid solubility of the drug, interferences with absorption by food, destruction of the drug by gastrointestinal flora, coadministration of other drugs—especially antacids, cholestyramine, and other resin-binding agents—blood flow to the gastrointestinal tract, and gastrointestinal transit time. Some orally delivered drugs are also subject to a significant “first-pass effect,” whereby they are largely metabolized by the liver to inactive compounds before reaching the systemic circulation. IV administration delivers drug directly into circulation bypassing “first-pass metabolism,” and the amount of drug delivered IV is often compared with other delivery options for determining the extent or amount of drug that is absorbed from a specific formulation. Significant variability in drug absorption is thus a common indication for TDM.
The chemical characteristics of a drug also affect the rate and extent of drug absorption. Acidic drugs carry a carboxyl group, R−COOH (Table 6–3). This acidic group is unionized or uncharged at pH levels below the drug's dissociation constant and is ionized or charged (R−COO−) at pH levels above the drug's dissociation constant. Drugs that act as strong acids, with pK <5, include salicylate, penicillin, and some analgesics. Strongly acidic drugs are unionized and carry no charge at the acidic pH of the stomach, while they carry a charge at the more basic pH of the intestines. So, drugs like salicylate are passively absorbed in the stomach, but require active transport for absorption across the intestines. Weak acids (barbiturates, sulfonamides, and thiazide diuretics) have dissociation constants in the range 5 to 11, and are preferentially absorbed in the intestines compared with the stomach. Strongly acidic drugs tend to be fully dissociated and charged at the blood pH of 7.4, while weak acids may have significant amounts of the unionized form present in the blood. Due to the acidic pH of urine, weak acids that are ionized at blood pH may become unionized in urine and prone to greater reabsorption. Basic drugs contain an amine group (R−NH2). Basic groups are ionized (R−NH3+) below and unionized (uncharged) above their dissociation constant. Basic drugs can act as weak bases (eg, anesthetics, opiates, and antidepressants) with pK <10 or strong bases (eg, amphetamines and bronchodilators). Basic drugs tend to be significantly ionized (charged) at blood pH. Drugs can also be neutral, carrying no charge across the range of physiologic pH. Neutral drugs can be lipophilic and act like fats (eg, corticosteroids), or polar and hydrophilic, attracting water molecules (eg, digoxin).
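The ionized fraction at any pH follows from the Henderson-Hasselbalch relationship. A small sketch (the pK value below is an illustrative, salicylate-like number, not a reference value):

```python
def fraction_ionized(pk, ph, acidic=True):
    """Henderson-Hasselbalch ionized fraction at a given pH.

    For an acidic group (R-COOH), the ionized form dominates above
    the pK; for a basic group, below the pK."""
    if acidic:
        return 1 / (1 + 10 ** (pk - ph))
    return 1 / (1 + 10 ** (ph - pk))

# Salicylate-like acid, pK ~3 (illustrative):
stomach = fraction_ionized(3.0, 1.5)  # mostly unionized -> passively absorbed
blood = fraction_ionized(3.0, 7.4)    # essentially fully ionized at blood pH
```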
After absorption, drugs distribute throughout the body through the circulatory system, lymphatic system, and tissue fluids. The amount of free drug available to act at organ receptors is affected by both protein and tissue binding. Protein binding is another consideration in TDM. Binding to plasma proteins occurs to some extent for most drugs, with bound and free (unbound) drugs existing in equilibrium. Although it is only the free drug fraction that is biologically active, most laboratory assays measure the total drug concentration, that is, the sum of the bound and the unbound drugs. Several factors can cause changes in plasma proteins and, consequently, affect free drug levels. Acid/neutral drugs tend to bind albumin while basic/neutral drugs bind α1-acid glycoprotein. Some drugs have specific binding proteins such as cortisol and corticosteroid-binding globulin, also known as transcortin. These proteins serve as transport proteins for drugs from the site of absorption to the tissue where drug can act, and as delivery mechanisms to the liver for metabolism or the kidney for elimination. Disease alterations in protein concentration can affect the concentration of free drug. For example, hypoalbuminemia, which occurs in the elderly and in patients with cirrhosis, may cause an increased free drug fraction in the setting of normal total drug levels. α1-Acid glycoprotein is an acute-phase reactant and levels of this protein increase in acute and chronic diseases. Increases in α1-acid glycoprotein create more binding sites for drug, so less free drug will be available in light of the same total drug concentration in a sample. The presence of uremia in disease results in compounds binding to albumin, displacing drug from the protein and elevating the free drug fraction. TDM can determine the proportion of free and total drugs in disease and individualize the dosage to the patient's condition.
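The effect of reduced protein binding on the active fraction can be shown with a small calculation; the 90% and 80% binding figures below are hypothetical, chosen only to mimic a highly bound drug in normal and hypoalbuminemic states:

```python
def free_level(total, percent_bound):
    """Free (pharmacologically active) drug implied by a total level,
    given the percent of drug bound to plasma protein."""
    return total * (1 - percent_bound / 100)

# Hypothetical highly bound drug, same total level of 15 ug/mL:
normal = free_level(15, 90)   # 1.5 ug/mL free at 90% binding
hypoalb = free_level(15, 80)  # 3.0 ug/mL free if binding falls to 80%
# The total level is unchanged, but the active fraction has doubled.
```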
Drug metabolism typically renders nonpolar, lipophilic drugs into more polar, water-soluble compounds for elimination. The liver is the primary site for drug metabolism. Genetic variants, age, cirrhosis, and other hepatic conditions may adversely affect drug metabolism, and thus predispose a patient to toxicity. Many drugs are hepatic enzyme inducers or inhibitors and thus can influence the rate of their own metabolism, as well as the metabolism of many other drugs. Pharmacogenetics is the study of drug action and metabolism based on genetics. Genetic variants can alter drug metabolism and produce an individualized response to a drug. Fast metabolizers will convert parent drug to a metabolite at a greater rate than slow metabolizers, depending on the genes expressing the metabolizing enzymes in the patient. For example, patients with a slow-metabolizer gene will acetylate procainamide (a cardiac drug) to the N-acetylprocainamide metabolite at a slower rate than fast metabolizers, and may be more prone to toxicity while on the same dose of drug.
Pharmacokinetics is the study of how the body absorbs, distributes, metabolizes, and eliminates a drug. Pharmacogenetics is the study of drug action and metabolism based on genetics.
Elimination is the removal of drug and metabolites from the body. Drug can be eliminated through the kidneys, the liver, the lungs, the skin, the feces, and by other means. Elimination of many polar, nonlipophilic drugs is achieved primarily through renal excretion, which is dependent on adequate kidney function and renal blood flow. Other parameters relevant to elimination through the kidneys include urine pH and the properties of the drug itself, such as the dissociation constant, pK, and molecular size. Drug clearance is the theoretical volume of serum/plasma that is completely cleared of a drug per unit time. Importantly, clearance is the sum of all elimination mechanisms—hepatic, renal, lung, and any other—for a particular drug. Patients with impaired drug clearance may need more frequent monitoring.
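Because clearance sums across all elimination routes and is defined as the plasma volume cleared of drug per unit time, the arithmetic can be sketched directly (the organ clearance values below are illustrative, not reference values):

```python
def total_clearance(*route_clearances_ml_min):
    """Total clearance is the sum of all elimination routes
    (renal, hepatic, pulmonary, and any others), in mL/min."""
    return sum(route_clearances_ml_min)

def elimination_rate(clearance_ml_min, conc_ug_ml):
    """Drug removed per minute at plasma concentration C.
    From the definition of clearance: rate = CL x C."""
    return clearance_ml_min * conc_ug_ml

cl = total_clearance(100.0, 20.0)  # eg, renal 100 + hepatic 20 mL/min
rate = elimination_rate(cl, 2.0)   # 240 ug/min at 2 ug/mL
```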
In TDM, drug levels are most often determined only after steady state has been achieved. Steady state is the condition that occurs when the amount of drug entering the system equals the amount being eliminated. Steady-state concentration is compared in relation to a target range to determine changes in dosing. The target range is established from experimental dosing studies to determine the optimum drug concentration where a drug is most effective while causing the least undesirable side effects and toxicity. The target range is a generalized range that fits most patients, but that range may need to be adjusted or altered in certain disease states and physiologic conditions. TDM allows physicians to optimize drug dosage to a patient's individual situation.
Most but not all drugs are eliminated by first-order (or linear) kinetics. This means that a constant fraction of total drug is eliminated per unit time. For drugs that follow first-order elimination kinetics, the biological half-life (the time for the blood level to fall by half) is constant, and changes in dose will generally cause predictable changes in blood levels. Increases in drug concentration lead to increases in the rate of drug elimination. Some drugs, however, are eliminated by zero-order (or nonlinear) kinetics, such that a constant amount of drug is eliminated per unit time. Typically, metabolism by zero-order kinetics occurs when the elimination pathways for that drug have been saturated. Under these circumstances, the biological half-life is not constant but depends on drug concentration. As a result, small increments in dose may cause disproportionately large increments in blood levels. Due to their lack of a predictable dose–response relationship, drugs that follow zero-order kinetics often require monitoring.
Most but not all drugs are eliminated by first-order (or linear) kinetics. This means that a constant fraction of drug is eliminated per unit time. Other drugs are eliminated by zero-order (or nonlinear) kinetics, such that a constant amount of drug is eliminated per unit time.
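The contrast between the two regimes can be sketched numerically with steady-state formulas: linear for first-order elimination, Michaelis-Menten for saturable elimination. The Vmax and Km values below are illustrative (phenytoin-like), not taken from this chapter:

```python
def css_first_order(dose_rate_mg_day, clearance_l_day):
    """First-order kinetics: steady state scales linearly with dose."""
    return dose_rate_mg_day / clearance_l_day

def css_saturable(dose_rate_mg_day, vmax_mg_day, km_mg_l):
    """Saturable kinetics (Michaelis-Menten steady state): the level
    rises disproportionately as the dose rate approaches Vmax."""
    if dose_rate_mg_day >= vmax_mg_day:
        raise ValueError("dosing at or above Vmax never reaches steady state")
    return km_mg_l * dose_rate_mg_day / (vmax_mg_day - dose_rate_mg_day)

# Illustrative parameters: Vmax 500 mg/day, Km 4 mg/L.
low = css_saturable(300, 500, 4)   # 6 mg/L
high = css_saturable(400, 500, 4)  # 16 mg/L: a 33% dose increase
                                   # nearly triples the level
```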
Assuming first-order kinetics, 5 half-lives are required after initiation of drug therapy to reach nearly complete (97%) steady state (5 half-life rule). Five half-lives are also required for nearly complete clearance of a drug after the termination of therapy, and for attaining a new steady state whenever a dosing regimen has been changed.
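The 5 half-life rule follows directly from first-order accumulation: after n half-lives of constant dosing, the fraction of steady state reached is 1 - 0.5^n.

```python
def fraction_of_steady_state(n_half_lives):
    """Fraction of the final steady-state level reached after n
    half-lives of constant dosing (first-order kinetics). The same
    expression gives the fraction cleared after stopping therapy."""
    return 1 - 0.5 ** n_half_lives

f = fraction_of_steady_state(5)  # 0.96875, i.e. ~97% of steady state
```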
Drug Interactions and Dose Adjustments
Many patients may take more than 1 medication, and those drugs can interact in the patient's body. Drug interactions may cause displacement of bound drug from proteins. The clinical significance of the interaction is likely to be increased when both drugs are highly protein bound (80% or more), when 1 of the drugs has a higher binding affinity, or when 1 of the drugs is present in higher concentration than the other. Dosing adjustments may be required in these instances. Displacement of bound drug does not inevitably lead to an increased free drug level, because free drug is subject to increased metabolism and elimination. Increases in plasma proteins and drug binding may also occur as an acute-phase response or during pregnancy, and, consequently, higher dosing may be necessary. Caution must be used when interpreting total drug levels in patients with possible protein disturbances or drug interactions, and free drug levels may be more useful in these situations.
Currently, most clinical laboratories utilize immunoassays for the rapid and quantitative measurement of therapeutic drugs. In an immunoassay, drug in the patient's sample competes with a drug conjugate (drug attached to an enzyme or fluorescein molecule) for the binding of specific antibodies. Antibody binding results in blocking enzyme activity or in enhancing fluorescence polarization. By measuring enzyme activity or fluorescence polarization, the amount of drug in the patient sample is quantitated. Chemiluminescent immunoassays offering superior sensitivity are also available for drug analysis. Other immunoassay methods such as ELISA and radioimmunoassay are less commonly used. More complex laboratory techniques, such as chromatography with ultraviolet or mass spectral detection, are also commonly utilized for drug measurements. (See Chapter 2.) Immunoassays offer advantages over chromatographic methods because they can be automated and can analyze a greater number of samples more rapidly with less labor and cost. Only total drug concentrations are routinely measured. Free drug levels require a more time-consuming and expensive ultracentrifugation or equilibrium dialysis step to separate the protein-bound drug from the free drug. Free drug concentrations are typically lower than total drug concentrations by 2- to 20-fold, so more sensitive assays are required.
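In a competitive immunoassay, the measured signal falls as the patient's drug concentration rises, and the unknown is read off a calibration curve. A toy readout might look like the following; the calibrator values are invented for illustration, and real analyzers fit nonlinear curves rather than interpolating linearly:

```python
def conc_from_signal(signal, calibrators):
    """Read an unknown concentration off a competitive-immunoassay
    calibration curve by linear interpolation between bracketing
    calibrators. calibrators: (concentration, signal) pairs, with
    signal decreasing as concentration increases."""
    pts = sorted(calibrators)
    for (c0, s0), (c1, s1) in zip(pts, pts[1:]):
        if min(s0, s1) <= signal <= max(s0, s1):
            return c0 + (signal - s0) * (c1 - c0) / (s1 - s0)
    raise ValueError("signal outside calibrated range")

# Hypothetical calibrators (conc in ug/mL, arbitrary signal units):
cal = [(0, 100.0), (5, 60.0), (10, 40.0), (20, 25.0)]
conc = conc_from_signal(50.0, cal)  # falls between the 5 and 10 ug/mL points
```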
The appropriate specimen for therapeutic drug measurements is usually serum or plasma. Most laboratories do not accept gel separator tubes as the gel can bind drugs and interfere with drug recovery. Immunosuppressant levels are measured using whole blood due to the distribution and concentration of drug in RBCs, which are removed in the preparation of serum/plasma. EDTA-anticoagulated whole blood is the appropriate sample for these immunosuppressant drug measurements. Urine samples are frequently used to evaluate patient compliance in cases of therapeutic administration of buprenorphine, methadone, and several opiates (including oxycodone). Saliva or oral fluid may be appropriate for monitoring some medications, such as theophylline, in pediatric patients or in those for whom phlebotomy is difficult. Oral fluid is also not subject to adulteration or substitution, which can be an issue with monitoring for pain management compliance in patients prone to abuse. In general, trough levels are drawn just prior to the next dose and are used to evaluate the likelihood of a therapeutic effect. Peak levels are drawn at varying times, depending on the particular drug, and are used typically to assess toxicity risk.
In general, trough levels are drawn just prior to the next dose and are used to evaluate the likelihood of a therapeutic effect. Peak levels are drawn at varying times, depending on the particular drug, and are used typically to assess toxicity risk.
Selected Commonly Monitored Drugs
Selected individual drugs and considerations for TDM are presented in Table 6–4. The required specimen volume and preservative will vary by analytical methodology, so the described collection instructions are only a guide. The reader should refer to specific instructions from the laboratory. The general monitoring recommendations will depend on the motivation for monitoring the drug, possible drug interactions, and whether the patient is stable or showing signs of toxicity. Therapeutic ranges are only suggestions and will vary by patient, condition, and the presence of other medications.
Table 6–4 Therapeutic Drug Monitoring for Commonly Monitored Drugs

|Drug ||Monitoring Recommendations ||Specimen Collection Tube and Instructions ||Suggested Therapeutic Range ||Special Considerations |
|Methotrexate ||24, 48, 72 h after bolus; then daily until below cytotoxic levels ||5 mL red top; wrap in foil to protect from light; indicate time past bolus ||<10 μmol/L at 24 h; <1 μmol/L at 48 h; <0.4 μmol/L at 72 h ||Monitoring guidelines are for high-dose therapy (>20 mg/kg) only |
|Tacrolimus (FK-506) ||Trough levels, 12 h post dose ||3 mL purple top ||5-20 ng/mL ||Cross-reactivity with its metabolites in immunoassays |
|Cyclosporine ||Trough levels, 12 or 24 h post dose ||3 mL purple top; avoid drawing from line of administration ||Liver transplant: 400-800 ng/mL; heart transplant: 150-300 ng/mL ||Ranges depend on organ transplanted and time since transplant |
|Aminoglycosides ||Peak: IV, 30-60 min post dose; IM, 60-90 min post dose; trough: 30 min prior to next dose ||5 mL red top ||Gentamicin peak 5-10 μg/mL, trough <2.0 μg/mL; tobramycin peak 4-8 μg/mL, trough <2.0 μg/mL ||Guidelines for conventional dosing only (not low-dose therapy or pulse therapy) |
|Vancomycin ||Either peak or trough, once per day ||5 mL red top ||Peak: 30-40 μg/mL; trough: 5-10 μg/mL ||Frequency of monitoring dependent on clinical situation |
|Phenytoin ||Peak for toxicity is 4-5 h after dose; trough for monitoring ||5 mL red top ||10-20 μg/mL ||Pertains to assay that measures total drug (free + bound) |
|Phenobarbital ||Trough ||5 mL red top ||15-50 μg/mL ||Steady state attained in 2-3 weeks |
|Carbamazepine ||Peak for toxicity is 2-4 h after dose; trough for monitoring ||5 mL red top ||4-12 μg/mL ||Not helpful for idiosyncratic toxicities |
|Clonazepam ||Peak for toxicity is 4 h after dose; trough for monitoring ||1 mL red or green top ||20-60 ng/mL || |
|Lamotrigine ||Peak for toxicity is 2-4 h after dose; trough for monitoring ||1 mL red or green top ||3-14 μg/mL || |
|Levetiracetam ||Peak for toxicity is 1 h after dose; trough for monitoring ||1 mL red or green top ||5-30 μg/mL || |
|Oxcarbazepine ||Peak MHD for toxicity is 4-6 h after dose; trough for monitoring ||1 mL red or green top ||15-35 μg/mL MHD || |
|Valproic acid ||Trough is not well defined ||5 mL red top ||50-100 μg/mL ||Upper limit of therapeutic range |
|Tricyclic antidepressants ||Steady state occurs in about 5 days; draw 10-14 h after once-daily dosing, 4-6 h after twice-daily dosing ||5 mL red top || ||Measure sum of parent and active metabolite for drugs noted with “a” |
|Lithium ||10-14 h after dose; then biweekly or weekly until steady state; then every 1-3 months ||5 mL red top; avoid green-top (lithium heparin) tubes ||0.5-1.5 mmol/L ||Toxicity may occur at <1.5 mmol/L, especially in patients who show chronic toxicity |
|Digoxin ||8 h after PO dose; 12 h after IV dose; and at steady state (1 week after initiation) ||5 mL red top ||0.9-2.0 ng/mL ||Specimen collection time is crucial to avoid falsely high levels; STAT levels occasionally necessary |
Methotrexate is a folate antagonist used in the treatment of a wide variety of neoplasms. Dose-related toxicity is common with high-dose methotrexate therapy (defined as >1 g/m2 or 20 mg/kg). Adverse effects include immunosuppression, and diverse organ damage including renal failure, myelosuppression, hepatic toxicity, neurotoxicity, gastrointestinal toxicity, and death. Toxicity correlates with serum methotrexate concentration and duration of exposure. Patients with poor hydration, renal insufficiency, pleural effusion, ascites, or gastrointestinal obstruction are at increased risk for toxicity. Adverse effects of methotrexate are ameliorated by administration of leucovorin, a reduced folate. Serial methotrexate levels are used to guide the appropriate dosing and duration of leucovorin rescue following high-dose methotrexate administration.
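The serial-level logic described above, using the time-matched thresholds from Table 6–4 for high-dose therapy, can be sketched as a simple lookup; this is a simplification for illustration, not a clinical protocol (real rescue protocols also weigh trend and renal function):

```python
# Decision points from Table 6-4 (high-dose methotrexate): serial
# levels are expected to fall below these values at each time point.
MTX_LIMITS_UMOL_L = {24: 10.0, 48: 1.0, 72: 0.4}

def below_cytotoxic(hours_post_bolus, level_umol_l):
    """True once the serial level has cleared the time-matched limit,
    ie, fallen below the cytotoxic range for that time point."""
    return level_umol_l < MTX_LIMITS_UMOL_L[hours_post_bolus]

below_cytotoxic(48, 0.5)   # True: below the 1 umol/L limit at 48 h
below_cytotoxic(24, 12.0)  # False: still above 10 umol/L at 24 h
```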
The immunosuppressants tacrolimus (FK-506), cyclosporine, and sirolimus (rapamycin) are used to prevent rejection in organ transplantation. Cyclosporine is also utilized to treat psoriasis, chronic autoimmune urticaria, and rheumatoid arthritis. These drugs were originally discovered in soil-sample bacteria (tacrolimus and sirolimus) and fungi (cyclosporine). Monitoring is indicated because these drugs have a narrow therapeutic index and highly variable pharmacokinetics. Adverse effects include nephrotoxicity, hepatotoxicity, pulmonary toxicity, neurotoxicity (light sensitivity, tingling in the palms, and tinnitus), tremor, and hypertension.
Whole blood is the preferred specimen for TDM, as the immunosuppressant drugs concentrate into erythrocytes more than the plasma/serum portion of blood. Low trough concentrations may indicate subtherapeutic immunosuppression and can be associated with increased risk of rejection. High trough concentrations cause increased toxicity including nephrotoxicity that can be particularly challenging to diagnose in renal transplant patients. Drug levels must be interpreted in conjunction with other laboratory test results and clinical findings to discriminate between toxicity and rejection. For renal transplant patients on cyclosporine therapy, the only definitive method for differentiating graft rejection from drug-induced nephrotoxicity is renal biopsy. These drugs are sometimes used in combination, and with mycophenolic acid, to enhance the immunosuppressant effects and decrease the dose and side effects.
The immunosuppressants are extensively metabolized by the liver to a number of metabolites, some of which have immunosuppressant activity. Some metabolites can cross-react in laboratory immunoassays, thus overestimating parent drug concentrations in situations where elimination is impaired and when metabolites accumulate, as in cholestasis. Patients who have received mouse monoclonal antibody therapies may also have inaccurate immunoassay results. HPLC with tandem mass spectrometry is increasingly being used for laboratory analysis to circumvent cross-reactivity with the immunoassays.
Gentamicin, tobramycin, and amikacin are aminoglycoside antibiotics. Ototoxicity and nephrotoxicity from aminoglycosides are related to dose and duration of exposure. Numerous factors, such as renal and cardiac function, age, liver disease, and obesity, affect the pharmacokinetic properties of aminoglycosides. Because of the many patient factors, as well as the low margin of safety and high incidence of dose-related toxicity, aminoglycoside levels are usually indicated in conjunction with renal function monitoring to minimize toxicity. In patients with normal renal function and without underlying disease, the indication for drug monitoring is less well defined.
Vancomycin is a tricyclic glycopeptide antibiotic with significant dose-related nephrotoxicity and ototoxicity. The practice of measuring vancomycin levels emerged from the guidelines for aminoglycoside monitoring. However, the necessity for vancomycin monitoring is controversial, because a good correlation between serum vancomycin levels and efficacy or toxicity has yet to be definitively demonstrated. Adult patients with normal renal function may not require routine monitoring. Indications for monitoring include impaired or changing renal function, concomitant use of nephrotoxic drugs, altered volume of distribution (as in a burn injury victim), prolonged vancomycin use, higher than usual doses, and use in neonates, children, pregnant women, and patients with malignancy.
Antiepileptics are frequently monitored to establish the dose necessary to reduce the frequency and magnitude of seizures. Trough levels are used to establish minimum effective dose. When toxicity is suspected, peak or random levels may be obtained. Too low a level will lead to breakthrough seizures, while too high a dose can induce seizures. A therapeutic level maintains seizure control and avoids side effects. The concentration of drug in the blood also may be used to evaluate patient compliance, and explain seizures that are refractory to drug treatment.
Antiepileptics are frequently monitored to establish the dose necessary to reduce the frequency and magnitude of seizures.
Phenytoin (or its prodrug fosphenytoin) is a widely used anticonvulsant with nonlinear kinetics and wide interindividual variability in dose requirement. Phenytoin toxicity includes ataxia, tremor, lethargy, seizure exacerbation, and neuropsychiatric changes. Phenytoin use in certain populations requires special consideration. Neonates and the elderly have decreased clearance. On the other hand, children metabolize phenytoin more rapidly than adults, and, therefore, dose adjustment is necessary at various ages. Careful monitoring in pregnancy is required due to metabolic and volume changes that occur during pregnancy. Phenytoin is highly protein bound, and conditions such as renal failure, liver disease, burn injury, and age will affect the amount of free drug by altering the amount of plasma protein.
Extensive protein binding also predisposes phenytoin to significant interactions with other protein-bound drugs, such as valproic acid. Coadministration of valproic acid and phenytoin is common and may cause a decrease in total phenytoin. Valproic acid displaces phenytoin from albumin, which causes a transient increase in free phenytoin, but this free phenytoin is readily metabolized and cleared. The overall effect is usually a decrease in total phenytoin with an unchanged level of free phenytoin. Monitoring of total phenytoin levels is sufficient for patient management, and free phenytoin levels are not usually necessary except in renal or hepatic disease, conditions that would affect total protein or body clearance.
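When hypoalbuminemia is suspected, clinicians sometimes normalize a measured total phenytoin level with the Sheiner-Tozer correction. The equation below is a widely cited rule of thumb that is not part of this chapter; it assumes normal renal function, and the coefficients shown are the commonly quoted values:

```python
def adjusted_total_phenytoin(measured_ug_ml, albumin_g_dl):
    """Sheiner-Tozer estimate (rule of thumb, not from this chapter):
    the total phenytoin level that would be observed at normal albumin,
    given a measured total level and the patient's albumin in g/dL.
    Assumes normal renal function."""
    return measured_ug_ml / (0.2 * albumin_g_dl + 0.1)

est = adjusted_total_phenytoin(8.0, 2.0)
# 16.0: a "therapeutic-looking" total of 8 ug/mL at albumin 2 g/dL
# corresponds to roughly 16 ug/mL at normal albumin
```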
Phenobarbital and primidone are used to treat all types of seizures except absence (petit mal) seizures. The major active metabolite of primidone is phenobarbital. Clearance of both primidone and phenobarbital is prolonged in neonates, the elderly, and patients with hepatic and renal dysfunction. Phenobarbital is a potent hepatic enzyme inducer, and may affect the metabolism and levels of many other drugs metabolized by the same enzymes. Concurrent valproic acid use significantly decreases phenobarbital clearance.
The anticonvulsant carbamazepine is used not only for seizures but also for treatment of trigeminal neuralgia and bipolar disorder. Monitoring of carbamazepine levels is useful due to its slow and unpredictable absorption. Age and hepatic function affect drug clearance. Dose-related toxic effects include blurred vision, paresthesias, ataxia, nystagmus, and drowsiness. Carbamazepine is metabolized to the active metabolite, carbamazepine 10,11-epoxide. Children are known to accumulate the epoxide metabolite and, as a result, may present with toxicity in the setting of nonelevated carbamazepine levels. With chronic therapy, carbamazepine induces its own metabolism, and dosing adjustment becomes necessary.
Valproic Acid (Depakene®, Depakote®)
Valproic acid is used to treat all types of seizures. It is also used in the treatment of migraines and bipolar disorder. Valproic acid has a narrow therapeutic index. Dose-related adverse effects involve primarily central nervous system (CNS) depression. The average half-life of valproic acid is about 12 to 16 hours, but there is significant interindividual variability, and use of a sustained-release formulation is popular. The half-life of valproic acid is prolonged in neonates and in patients with liver dysfunction. Extensive protein binding accounts for the increased valproic acid toxicity observed in patients with uremia and cirrhosis.
The second-generation antiepileptics encompass a range of drugs with different chemical structures and pharmacokinetics. Some are protein bound (lamotrigine is 55% bound to albumin) while others are not (levetiracetam is <10% protein bound). Common adverse effects include dizziness, ataxia, nausea, and vomiting. Decreased hematocrit and neutropenia can also be seen with lamotrigine. In general, the second-generation antiepileptics have a wider therapeutic index and fewer side effects than the first-generation drugs. HPLC and immunoassay methods are available, but therapeutic and toxic ranges have not been established for all of these drugs. Monitoring is therefore generally conducted to establish, for future reference, the individual concentration at which a patient achieves seizure control with the fewest side effects, to assess compliance, and to document the concentration at which side effects appear in that patient.
Tricyclic antidepressants are monitored for multiple reasons. There is significant interindividual variation in metabolism and elimination, such that standard dosing results in therapeutic levels in less than half of patients. Genetic variation accounts for some of this variability. The fraction of “poor metabolizers” is about 17% of Caucasians and 5% of other ethnic groups. Other indications for monitoring include a narrow therapeutic index, multiple drug interactions, and patient compliance.
Tricyclic antidepressants have a low margin of safety and cause anticholinergic toxicity, seizures, and arrhythmias in overdose. Although the correlation between toxicity and blood level is poor, there are general guidelines. Levels in excess of 500 μg/L may be associated with anticholinergic toxicity (flushing, tachycardia, fever, dilated pupils, dry mucous membranes, urinary retention, and absent bowel sounds). Cardiotoxicity is more likely to occur at levels greater than 1000 μg/L in acute overdose.
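The guideline thresholds above can be expressed as a simple banding function. This is an illustration of how the quoted cutoffs map to the toxicity descriptions; as the text notes, the correlation between level and toxicity is poor, so such bands are guidelines, not diagnoses.

```python
# Sketch of the general guideline thresholds quoted above for tricyclic
# antidepressant (TCA) levels. Correlation of level with toxicity is poor;
# this is an illustration of the cutoffs, not a clinical tool.

def tca_level_flag(level_ug_per_l: float) -> str:
    """Map a serum TCA level (ug/L) to the guideline band from the text."""
    if level_ug_per_l > 1000:
        return "cardiotoxicity more likely (acute overdose)"
    if level_ug_per_l > 500:
        return "possible anticholinergic toxicity"
    return "below toxicity guideline thresholds"

print(tca_level_flag(300))   # below toxicity guideline thresholds
print(tca_level_flag(650))   # possible anticholinergic toxicity
print(tca_level_flag(1200))  # cardiotoxicity more likely (acute overdose)
```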
Lithium is a univalent cation most commonly used to treat bipolar disorder. Lithium monitoring is useful due to its narrow therapeutic index and the wide interindividual variation in dose requirement.
Excretion of lithium is primarily renal. Children have increased clearance, while the elderly have decreased clearance. Lithium excretion parallels sodium excretion. Therefore, patients on stable doses of lithium may become toxic in states of sodium conservation such as fever, excessive sweating, lack of fluid intake, and diarrhea.
Toxicity is usually associated with levels in excess of 1.5 mmol/L, although toxicity may occur at lower levels, especially in cases of chronic toxicity. Lithium overdose is characterized by lethargy, weakness, slurred speech, ataxia, tremor, and myoclonic jerks. Severe toxicity may result in seizure, hyperthermia, and coma. Management of patients who have ingested sustained-release lithium preparations is difficult, and serum measurements play a crucial role in the decision to initiate hemodialysis or whole bowel irrigation. Analytical methods include ion-selective electrodes and spectrophotometric or colorimetric assays.
Fluoxetine was the first selective serotonin-reuptake inhibitor used to treat depression. Fluoxetine monitoring is useful when patient compliance is in question. Further monitoring is not likely to be beneficial since fluoxetine has a wide therapeutic index, and there is a poor correlation between blood levels and clinical response. Fluoxetine is metabolized by the liver to the active metabolite norfluoxetine.
Other serotonin-reuptake inhibitors/later-generation antidepressants—such as sertraline (Zoloft®), paroxetine (Paxil®), fluvoxamine (Luvox®), citalopram (Celexa®), quetiapine (Seroquel®), trazodone (Desyrel®), and venlafaxine (Effexor®)—do not require routine monitoring due to their wide therapeutic indices.
Digoxin is a commonly used drug in the treatment of heart failure and arrhythmias, and it has a low therapeutic index. There is significant interindividual variation in digoxin absorption and distribution along with prolonged clearance in patients with impaired renal function. Digoxin overdose is characterized by gastrointestinal distress, confusion, visual changes, hyperkalemia, and life-threatening cardiac toxicity. Overdoses may be treated with an antidigoxin antibody antidote. Such treatment typically renders subsequent blood digoxin concentrations unreliable. Blood digoxin immunoassays are generally less reliable than immunoassays for other therapeutic agents. Interferences with digoxin immunoassays are frequently reported. These interferences are referred to as digoxin-like immunoreactive substances (“DLIS”).
Acetaminophen is a therapeutic drug used as an analgesic and an antipyretic. When it is used in the recommended doses, it is not necessary to measure acetaminophen levels. However, excess intake of acetaminophen can be associated with severe liver injury. Acetaminophen is thus representative of many compounds with a wide therapeutic window that do not require therapeutic monitoring when used in recommended doses. Because toxicity can occur if the upper limit of the window is exceeded, however, monitoring acetaminophen levels in patients with excess intake is critical, particularly since an antidote to the major toxic effect can be administered. Table 6–5 presents an overview of the laboratory evaluation for acetaminophen toxicity. Immunoassays are available for the rapid determination of serum/plasma levels.
Table 6–5 Laboratory Evaluation for Acetaminophen Toxicity

| Laboratory Tests | Results/Comments |
| --- | --- |
| Laboratory monitoring of acetaminophen concentration | The importance of laboratory monitoring is related to the use of N-acetylcysteine as a treatment for acetaminophen overdose; it is important that the neutralizing effect of N-acetylcysteine be provided before acetaminophen metabolites produce liver injury. To determine whether the ingestion is likely to cause liver toxicity, a 4-h postingestion serum concentration should be obtained; this concentration determines whether the patient is likely to experience liver injury and, if so, should be treated with N-acetylcysteine. If the first acetaminophen level is obtained more than 4 h post ingestion, a nomogram (available in many textbooks) can be used to determine whether the level at that time post ingestion is likely or not likely to be associated with liver injury |
| Liver function tests | Hepatic necrosis becomes evident 24-48 h after the ingestion of the excess amount of acetaminophen if the patient is not treated; at that time, standard liver function tests such as AST (SGOT), ALT (SGPT), and bilirubin, as well as the prothrombin time, can be used to assess the extent of liver injury |
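The textbook nomogram referenced above is commonly the Rumack-Matthew nomogram. A minimal sketch of how such a nomogram is applied, assuming its widely quoted treatment line of 150 μg/mL at 4 h post-ingestion, halving every 4 h thereafter (this specific line is an assumption for illustration, not a clinical tool):

```python
# Sketch: applying a Rumack-Matthew-style treatment line for acetaminophen.
# Assumed line (illustration only): 150 ug/mL at 4 h post-ingestion,
# declining by half every 4 h, valid from 4 to 24 h post-ingestion.

def treatment_line_ug_per_ml(hours_post_ingestion: float) -> float:
    """Approximate treatment-line concentration for 4-24 h post-ingestion."""
    if not 4.0 <= hours_post_ingestion <= 24.0:
        raise ValueError("nomogram applies only 4-24 h post-ingestion")
    return 150.0 * 0.5 ** ((hours_post_ingestion - 4.0) / 4.0)

def needs_n_acetylcysteine(level_ug_per_ml: float, hours: float) -> bool:
    """True if the measured level falls on or above the treatment line."""
    return level_ug_per_ml >= treatment_line_ug_per_ml(hours)

print(needs_n_acetylcysteine(160.0, 4.0))  # True: above 150 ug/mL at 4 h
print(needs_n_acetylcysteine(60.0, 8.0))   # False: line is 75 ug/mL at 8 h
```

Note the hard lower bound at 4 h: earlier levels are uninterpretable because absorption and distribution are still ongoing, which is why a 4-h postingestion specimen is specified in Table 6–5.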
Acetaminophen is rapidly absorbed from the gastrointestinal tract. The plasma concentration reaches its highest level 30 to 60 minutes after a dose. One of the compounds resulting from acetaminophen metabolism is an oxidation product that is hepatotoxic. Normally this metabolite is detoxified by binding to glutathione in the liver. With excess intake of acetaminophen, the production of the toxic metabolite exceeds the amount of hepatic glutathione, and this permits the toxic metabolite to produce liver injury. Renal damage also may occur as a result of injury by the same compound.
The recommended daily dose of acetaminophen is no more than 4 g per day. A single dose of 10 to 15 g may produce liver injury. Fatal disease is usually associated with ingestion of ≥25 g of acetaminophen. Acetaminophen at slightly more than the recommended 4 g per day can produce hepatotoxicity when the patient has also ingested ethanol, and this response can be exacerbated if the patient had been fasting prior to ingestion of acetaminophen and ethanol, or takes another enzyme-inducing drug such as phenytoin. The ingestion of acetaminophen at greater than recommended doses produces corresponding elevations of acetaminophen in the blood, and the level of the drug in the blood correlates with the severity of hepatic injury.
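The dose thresholds quoted above can be summarized as a banding function. These bands are for illustration only; as noted, co-ingested ethanol, fasting, or enzyme-inducing drugs can produce toxicity at substantially lower doses.

```python
# Sketch mapping a single ingested acetaminophen dose to the risk bands
# quoted above (4 g/day recommended maximum; 10-15 g may injure the liver;
# >=25 g usually associated with fatal disease). Illustrative only:
# co-ingestants such as ethanol can lower these thresholds substantially.

def single_dose_risk(grams: float) -> str:
    """Classify a single ingested dose against the quoted thresholds."""
    if grams >= 25:
        return "usually associated with fatal disease"
    if grams >= 10:
        return "may produce liver injury"
    if grams > 4:
        return "exceeds recommended daily maximum"
    return "within recommended daily maximum"

print(single_dose_risk(3))   # within recommended daily maximum
print(single_dose_risk(12))  # may produce liver injury
print(single_dose_risk(30))  # usually associated with fatal disease
```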
Acute manifestations of excess acetaminophen intake typically occur 2 to 3 hours after ingestion, most often nausea, vomiting, and abdominal pain. Cyanosis of the skin and fingernails may be observed as a result of methemoglobin generation from the overdose. The full extent of liver damage usually becomes apparent 2 to 4 days after drug ingestion; at that time, liver function test results, including the prothrombin time, become abnormal. A variety of associated abnormalities, including electrolyte disturbances, can occur if there is significant liver damage. Acute renal failure also may occur, even if liver failure is not observed.
Aspirin (acetylsalicylic acid) is a therapeutic drug that has been in use for more than a century as an analgesic, antipyretic, anti-inflammatory, and antithrombotic agent. It is readily absorbed and rapidly metabolized by hydrolysis to salicylic acid. Peak concentrations occur within 1 to 2 hours of a therapeutic dose. Between 50% and 90% is bound to albumin in a dose-dependent manner. Further metabolism produces salicyluric and gentisic acids and glucuronide conjugates, which are renally excreted. Aspirin is contained in many preparations, including those with other analgesics.

When aspirin is used in therapeutic doses, it is not necessary to measure levels. However, chronic salicylate poisoning (salicylism) carries a high morbidity (30%) and mortality (25%) and is difficult to diagnose without monitoring levels, since the patient may be too confused to give a reliable history. Table 6–6 presents an overview of the laboratory evaluation for salicylate toxicity. Immunoassays are available for the rapid determination of serum/plasma levels.

A normal dose is about 15 mg/kg; an acute dose of about 500 mg/kg is potentially lethal. When aspirin is taken in therapeutic doses, the half-life is 2 to 5 hours, but metabolism becomes saturated once the dose exceeds about 30 mg/kg, delaying drug elimination. An early feature of toxicity is respiratory alkalosis through direct stimulation of the respiratory center, followed by vomiting. Later toxicity results from uncoupling of oxidative phosphorylation, leading to ketosis, metabolic acidosis, and pyrexia, with further dehydration and electrolyte imbalance. Hematologic consequences arise that manifest as an increased prothrombin time, GI bleeding, and occasionally disseminated intravascular coagulation (DIC).
Table 6–6 Laboratory Evaluation for Aspirin Toxicity

| Laboratory Tests | Results/Comments |
| --- | --- |
| Detection of aspirin metabolites in urine by color test (Trinder reagent); monitoring of serum salicylate concentration by enzymatic assay or immunoassay | The importance of these tests is to establish the diagnosis of poisoning. Since a number of preparations containing sustained-release aspirin are available, it is recommended that serial blood samples be drawn at 3-h intervals to determine whether the drug concentration is still rising |

The Done nomogram is used to interpret serum salicylate concentrations taken 6 h after acute ingestion. Its use is unreliable when:

- There has been a previous ingestion within 24 h
- Poisoning is chronic (concentrations >30 mg/dL indicate serious toxicity)
- Enteric-coated or sustained-release preparations have been ingested
- Renal failure is present

Treatment for aspirin overdose is symptomatic and supportive: repeat doses of oral activated charcoal may be given in an attempt to prevent further absorption and increase fecal elimination. Bicarbonate is used to counteract the metabolic acidosis, and calcium and electrolytes are administered to prevent seizures and cardiac failure. Hemodialysis may be indicated at concentrations above 100 mg/dL (>40 mg/dL in chronic salicylism) and to support renal function and electrolyte balance.

Regular monitoring of renal function, blood gases and lactate, and coagulation status is important for patient care.
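The hemodialysis cutoffs quoted above differ between acute and chronic poisoning, which is easy to miss at the bedside. A minimal sketch of that comparison, for illustration only (the actual decision also weighs the clinical picture, renal function, and acid-base status):

```python
# Sketch of the hemodialysis decision thresholds quoted above for
# salicylate poisoning: >100 mg/dL in acute overdose, >40 mg/dL in
# chronic salicylism. Illustration only, not a clinical decision tool.

def hemodialysis_threshold_mg_per_dl(chronic: bool) -> float:
    """Return the quoted serum salicylate cutoff for the poisoning pattern."""
    return 40.0 if chronic else 100.0

def consider_hemodialysis(level_mg_per_dl: float, chronic: bool) -> bool:
    """True if the measured level exceeds the applicable cutoff."""
    return level_mg_per_dl > hemodialysis_threshold_mg_per_dl(chronic)

# The same level can fall on opposite sides of the cutoff depending on
# whether the poisoning is acute or chronic:
print(consider_hemodialysis(55.0, chronic=False))  # False: below 100 mg/dL
print(consider_hemodialysis(55.0, chronic=True))   # True: above 40 mg/dL
```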
Other Pain Management Drugs
Buprenorphine and methadone are analgesics commonly utilized for opiate withdrawal that have found recent medical application in the management of chronic pain. Fentanyl and oxycodone are other drugs utilized in pain management that may be monitored. Serum/plasma levels of these drugs correlate poorly with clinical effect because of tolerance. At the doses utilized for pain management, these drugs do not typically require safety monitoring, and dosing can be adjusted based on pain relief. However, these drugs have high abuse potential, so urine tests are sometimes used to monitor compliance and to ensure that the patient is not diverting the drug for sale or other purposes. Immunoassays are available for analysis of these drugs in urine samples.