Abstract
Objective Societies strive to allocate absolutely scarce medical resources in a manner that reflects their values. A deep understanding of the relationship between a scarce drug's dose and clinical response is necessary to distribute a supply-constrained drug along these lines.
Summary of key data The vast majority of drug development and repurposing during the COVID-19 pandemic, an event that has laid bare the ever-present scarcity in healthcare systems, has proceeded without regard for scarcity or for dose optimisation's ability to help address it.
Conclusions Future pandemic clinical trial systems should obtain dose optimisation data, as these appear necessary to enable appropriate allocation of scarce resources according to societal values.
- CLINICAL PHARMACOLOGY
- Clinical trials
- MEDICAL ETHICS
- COVID-19
- Public health
- Rationing
This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.
Key messages
Resource scarcity reflects a gross mismatch between supply and demand; societies strive to allocate scarce resources in a manner that reflects their values.
With regard to drugs and vaccines, a deep understanding of the relationship between a scarce drug’s dose and clinical response is necessary to appropriately distribute a supply-constrained drug.
Future pandemic clinical trials should obtain dose optimisation data in order to enable appropriate distribution.
Background
Absolute scarcity—when demand for a resource vastly exceeds a limited supply—has been a recurring theme of the COVID-19 pandemic.1 Under absolute scarcity, resource allocation is inevitably zero sum. Increasing supply of an absolutely scarce resource through increased production is one approach, but for certain resources, this may not be feasible (scientifically or politically) over the timescale required to nimbly and effectively respond to scarcity.2 Societies must then confront the tragic dilemmas scarcity imposes by allocating absolutely scarce resources in a manner that reflects their values, whatever those values may be.2 3 In a strained health environment, many countries will seek to allocate scarce medical interventions to balance benefit maximisation with inequity minimisation. Doing so requires first knowing how much of the absolutely scarce resource is available, and second, knowing how much benefit recipients of the resource can expect to gain from each additional unit they receive. This second, key piece of information requires an understanding of the relationship between the scarce resource’s dose and clinical efficacy.4
A resource’s ability to benefit individuals and populations depends on its abundance or scarcity. For resources that are not abundant, tradeoffs between the relative welfares of individuals and populations become inevitable.2 These tradeoffs become especially acute when considering the absolute scarcity of medical resources during a pandemic.
The conventional drug development paradigm is oriented toward the individual. It assumes that the drug is abundant with respect to the potential recipient, with neither limits on access nor tradeoffs between different recipients. Critically, for the individual who is eligible to receive a given candidate therapy in the context of a clinical trial, that therapy is itself abundant because the dosing regimen does not impact other potential recipients’ access. Drug development thus occurs under a critical assumption: maximising the benefit that each individual recipient gleans from a drug will necessarily maximise total population benefits. Dose-finding and efficacy studies therefore strive to maximise the net benefit for each individual recipient of the candidate therapy. In the context of an early phase clinical trial of an abundant drug, administering more than the minimum amount needed to achieve maximum benefit presents no issue, so long as the additional amount does not increase harms.
The abundance assumption implied by conventional drug development falters under scarcity. Drug resources, including those that appear abundant before a disaster strikes,5 6 can quickly become absolutely scarce. When absolute scarcity and its attendant tradeoffs emerge, administration of excessive amounts of drug to some individuals, by excluding other potential recipients from any benefit, reduces the total population benefits that could have been gleaned. Further, this form of overdosing exacerbates inequities between those with access and those without.7 In truth, an infinite number of dose levels are theoretically possible for a given drug. For physicians and policymakers navigating scarcity, dosing of an absolutely scarce resource is an optimisable choice—but it requires reapproaching dosing with consideration of both scarcity and public health.
Main text
Selected dose levels in COVID-19
On the basis of demonstrated clinical benefit, 13 drugs (including vaccines) have been approved or authorised by high-income governments for the treatment or prevention of COVID-19 as of 1 October 2021, with many more in development or subsequently approved.8 Dosing uncertainty persists for each of these drugs (table 1), in turn reducing health system capacity, efficiency of supply usage, and population health. To better understand the specific dose level that we ought to target in pandemic drug development, however, it is useful to review the dose levels that have been chosen thus far.
Table 1 Dose optimisation of repurposed and new molecular entities for SARS-CoV-2 and COVID-19 as of summer 2021
Labelled dose for prior indication
The dosages of drugs repurposed for treatment of COVID-19 have almost uniformly duplicated those in the drug’s original, pre-COVID-19 indication (table 1). Prior clinical information provides investigators with the pharmacokinetic and safety information needed to rapidly incorporate a candidate drug into a clinical trial evaluating efficacy, bypassing traditional early-phase dose-finding research.9 If the dose used for the original indication exceeds the minimum dose needed to demonstrate efficacy in COVID-19, then investigators are able to determine whether the drug is in fact an efficacious therapy for COVID-19. The speed conferred by bypassing front-end dose-finding benefits patients through earlier identification of efficacious drugs. But this strategy, by not attempting to identify an optimal dose specific to COVID-19, introduces two major risks: first, that because of relative underdosing a clinical trial may reach a false-negative conclusion about the drug’s efficacy; and second, that the dose appropriate for the original indication is in truth excessive for COVID-19, exacerbating scarcity and potentially resulting in excess harms (neither risk is knowable at trial initiation). While there are clear efficiencies to be gained by building on prior knowledge of the drug’s performance for its original indication, the absence of both individual-level and population-level dose optimisation may exhaust drug supplies or harm individual recipients at rates far higher than necessary.
Model-informed drug repurposing
Model-informed drug repurposing (MIDR) uses in vitro estimates of a repurposed drug’s activity against a novel pathogen, such as SARS-CoV-2, to guide initial anti-infective dosing. As a pathogen becomes better characterised during the course of a pandemic, investigators can connect pharmacokinetics to epidemiological modelling, enabling clinical trials to administer the (presumed) right dose of therapy at the right time and enhancing the odds of demonstrating efficacy if it truly exists.10 Importantly, however, the small molecule antiviral drugs for which MIDR has been employed have not been shown to reduce COVID-19-related mortality.11 Instead, in the COVID-19 pandemic, the pathogen-directed approaches demonstrated to reduce mortality have uniformly been new chemical entities.12 13 Coupled with MIDR’s requirement that an in vitro assay be developed and validated to guide dosing decisions, the benefits of host-targeted therapies,14 and the speed with which novel antibody-based antiviral and mRNA vaccine development can occur, MIDR may be ill-suited to both rapid and accurate action in future viral pandemics.
Dosing informed by randomised, dose-ranging trials
Randomised, placebo-controlled dose-ranging studies directly compare multiple dose levels (including placebo) to one another to identify signs of clinical activity and the extent to which a dose–response relationship exists. These trials are often conducted in phase 2, before confirmatory efficacy trials that use gold-standard clinical endpoints. Timing is critical: randomised, placebo-controlled dose-ranging studies evaluating key surrogate clinical endpoints can enable dosing that responds to emergency conditions. For example, the BLAZE-1 trial evaluated whether any of three different doses (700 mg, 2800 mg, 7000 mg) of bamlanivimab reduced SARS-CoV-2 viral load.15 Although only the 2800 mg arm reduced SARS-CoV-2 viral load, the 700 mg dose was granted Emergency Use Authorization (EUA) by the US Food and Drug Administration due to its comparable reduction in emergency care usage and hospitalisation—instantaneously providing a many-fold increase in the number of patients who could benefit from what would be a very limited initial supply of drug. BLAZE-1 was not powered to detect small differences in these important outcomes between dose levels. Despite this limitation, a decision that prioritised population health over individual health was made.
More disappointing for global health advocates, however, has been an absence of urgency in pursuing SARS-CoV-2 vaccine dose optimisation. For example, half-dose (50 µg) mRNA-1273 generates SARS-CoV-2-neutralising antibodies and seroconversion at rates nearly equivalent to full dose, with all half-dose recipients in the trial seroconverting by 6 weeks after the first dose.16 Similar findings have been suggested for the BNT162b2 mRNA vaccine.17 Similarly, quarter-dose (25 µg) mRNA-1273 is sufficient to generate durable cellular immunity against SARS-CoV-2.18 As it relates to boosters, dose-finding studies might even allow for a massive reduction in vaccine supply usage while minimising risk of adverse events.19 Yet this concept has not been pursued in follow-up trials, despite the potential to instantaneously increase access at least twofold to fourfold while simultaneously reducing the likelihood of adverse events, especially in the adolescent and young adult subpopulations at risk for vaccine-related myocarditis.
Administering drug at a lower dose, known as fractional dosing, rations divisible scarce resources to increase the number of recipients who have the potential to benefit from a relatively fixed supply of a scarce resource. The approach acknowledges that individuals who receive a lower dose could be ‘worse off’ than if they had access to and received the full dose. However, the social value gained by multiplying the number of recipients by twofold (or more) more than compensates for the potentially reduced efficacy each recipient experiences. In the COVID-19 pandemic, for example, meaningful reductions in both infections and deaths would likely have occurred had a half-dose mRNA vaccine achieved an efficacy 70% that of the full-dose mRNA vaccine.20 In light of the evidence cited above showing the near-equivalence of low-dose mRNA vaccines,16–18 as well as the possibility that the emergence of SARS-CoV-2 variants would have been reduced by a fractional dosing strategy focused on vaccinating as many individuals as possible, as quickly as possible,21 it is plausible that the failure to optimise doses and enact fractional dosing led not only to unnecessary morbidity and mortality but also to a lengthening of the COVID-19 pandemic.
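The population-level arithmetic behind fractional dosing can be made concrete with a short back-of-the-envelope sketch in Python. The 70% relative-efficacy figure mirrors the modelled scenario cited above; the supply size and full-dose efficacy are purely illustrative assumptions.

```python
# Back-of-the-envelope comparison of full-dose versus half-dose (fractional)
# vaccination under a fixed supply. All figures are illustrative assumptions,
# apart from the 70% relative-efficacy scenario discussed in the text.

supply_full_dose_courses = 1_000_000   # assumed supply, expressed in full-dose courses
full_dose_efficacy = 0.90              # assumed efficacy of the full dose
relative_efficacy_half_dose = 0.70     # half dose retains 70% of full-dose efficacy

# Full-dose strategy: fewer recipients, each maximally protected.
full_protected = supply_full_dose_courses * full_dose_efficacy

# Half-dose strategy: twice the recipients, each somewhat less protected.
half_protected = (2 * supply_full_dose_courses) * full_dose_efficacy * relative_efficacy_half_dose

print(f"Expected protected, full-dose strategy: {full_protected:,.0f}")
print(f"Expected protected, half-dose strategy: {half_protected:,.0f}")
print(f"Population-level gain from fractional dosing: {half_protected / full_protected:.1f}x")
```

Under these assumed numbers the half-dose strategy protects roughly 1.4 times as many people in expectation, even though each individual recipient is less well protected.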
Potential dose levels for future pandemic drug development
The extent to which a pandemic drug’s dose is ideal depends on the lens through which one views the allocation problem. Optimising dose to population health instead of individual health often leads to very different dosing decisions. Two approaches to dosing are described below.
Minimum dose with satisfactory efficacy
The minimum dose with satisfactory efficacy (MDSE) approach predefines a satisfactory level of efficacy and then identifies the minimum amount of drug needed to achieve that level, based on the dose–response information provided by dose-ranging studies.22 The MDSE approach is best suited for a clinical context in which a drug is known to have efficacy when administered at a given dose, frequency, route of administration and duration. Ideally, the MDSE would be identified in clinical trials; the presence of at least one cohort of patients that receives a dose level below the candidate MDSE provides a measure of confidence that the MDSE is, in fact, the minimum. Downsides to the MDSE-focused approach include the time needed to optimise the drug’s dose after confirming its efficacy at the higher, non-MDSE dose and the suboptimal efficacy that would be experienced by any study arms receiving a dose lower than the MDSE in a clinical trial. Low-dose tocilizumab23 and preclinical attempts toward mRNA vaccine dose minimisation are examples of attempted MDSE identification during the COVID-19 pandemic.16 18 MDSE-based drug development operates from a socially minded yet individual-dominant perspective, wherein the MDSE arrived at depends on a society’s definition of ‘satisfactory’.
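As a minimal illustration of the MDSE procedure, the sketch below assumes a hypothetical Hill-type dose–response curve (every parameter is invented for illustration, not taken from any real drug) and a societally defined efficacy threshold, then reads off the smallest dose that meets it. In practice the curve would be estimated from randomised dose-ranging data rather than assumed.

```python
import numpy as np

# Hypothetical Hill-type dose-response curve; parameters are illustrative assumptions.
EMAX, ED50, HILL = 0.95, 15.0, 2.0   # maximal efficacy, half-maximal dose (µg), Hill slope

def efficacy(dose_ug):
    return EMAX * dose_ug**HILL / (ED50**HILL + dose_ug**HILL)

# Society predefines what counts as 'satisfactory' efficacy (assumed here: 80% absolute efficacy).
satisfactory = 0.80

# MDSE: the smallest dose on a candidate grid whose modelled efficacy meets the threshold.
doses = np.linspace(1, 200, 2000)
meets = efficacy(doses) >= satisfactory
if meets.any():
    mdse = doses[meets][0]
    print(f"MDSE ~ {mdse:.0f} µg (efficacy {efficacy(mdse):.2f}), "
          f"versus a 100 µg full dose (efficacy {efficacy(100):.2f})")
```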
Socially optimal dose
The socially optimal dose (SOD) is still more socially minded. SOD is the theoretical dose at which individual efficacy-per-unit is maximised (figure 1). Usage of the SOD sacrifices maximal efficacy for each recipient in order to increase population-level benefit by increasing the number of recipients.4 The SOD, derived from estimates of the population-level dose–response relationship, attempts to maximise the quantity of benefit the drug produces without regard to distribution, and may facilitate allocation strategies that focus solely on maximising population-level efficacy. One real-world example of (more) socially optimal dosing—outside the bounds of a clinical trial—was the use of extended-interval dosing of mRNA vaccines, in which the potential risk of worsened individual-level outcomes due to the extended dose interval was accepted in exchange for the potential benefit of protecting more people, acknowledging recipients may receive suboptimal protection by straying from the evidence base.20 24 In retrospect, although this was a successful calculated risk, testing the strategy prospectively through clinical trials would have been preferred.25–27
Figure 1 Distinctions between individually and socially optimal dosing approaches for a hypothetical vaccine. (A) A randomised dose-finding study reveals the dose–response curve shown, where a vaccine is found to have maximal efficacy at a 100 µg dose and approximately 75% relative efficacy at quarter-dose. (B) By evaluating the drug’s efficacy relative to the amount of drug administered, we derive the socially optimal dose, maximising the efficacy gained per microgram administered. MDSE, minimum dose with satisfactory efficacy; RCT, randomised controlled trial.
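To show how the socially optimal dose in figure 1 might be derived, the sketch below reuses the same hypothetical Hill-type curve as the MDSE sketch above, with parameters chosen only to loosely match the figure's two anchor points (near-maximal efficacy at 100 µg and roughly 75% relative efficacy at the 25 µg quarter dose); none of the numbers come from real trial data.

```python
import numpy as np

# Same hypothetical Hill curve as in the MDSE sketch; loosely matches figure 1
# (near-maximal efficacy at 100 µg, ~75% relative efficacy at 25 µg).
EMAX, ED50, HILL = 0.95, 15.0, 2.0

def efficacy(dose_ug):
    return EMAX * dose_ug**HILL / (ED50**HILL + dose_ug**HILL)

doses = np.linspace(1, 150, 1500)

# The socially optimal dose (SOD) maximises efficacy gained per microgram administered,
# ie, the benefit extracted from each unit of a scarce supply.
efficacy_per_ug = efficacy(doses) / doses
sod = doses[np.argmax(efficacy_per_ug)]

print(f"Efficacy at 100 µg: {efficacy(100):.2f}; at 25 µg: {efficacy(25):.2f} "
      f"({efficacy(25) / efficacy(100):.0%} relative efficacy)")
print(f"SOD ~ {sod:.0f} µg, where each microgram of supply buys the most efficacy")
```

Whether a society then doses at the SOD, at the MDSE or somewhere in between remains a value judgement; the calculation only makes the tradeoff explicit.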
Mathematical models’ guidance
Epidemiological models of viral pandemics, in combination with economic models describing implementation of mitigation and vaccination, enable side-by-side comparison of resource allocation strategies against a range of counterfactuals and under a range of assumptions.20 21 28 29 Conditioned on a given drug or vaccine having been shown to be efficacious, the most relevant question for public health becomes how to maximise, in an efficient and accurate manner, the population benefits that can be derived from a scarce supply of that resource. Determining what ‘socially optimal’ may be in the rapidly evolving evidence space of a global pandemic is inherently challenging and depends on multiple, imperfectly known factors, including vaccine effectiveness, the relative effectiveness of lower doses, the maximum rate at which vaccination efforts can proceed and characteristics of the pandemic itself (eg, incubation period, rates of spread and replication, and mortality). Prior outbreaks, including influenza,30 cholera31 and yellow fever, demonstrate the value of taking a rational approach to scarce vaccine allocation guided by mathematical models. Such efforts came about, however, only once scarcity had arrived, rather than pre-emptively, during drug and vaccine development.
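To illustrate the kind of side-by-side comparison such models enable, the minimal discrete-time SIR sketch below contrasts spending the same fixed vaccine supply on full doses for fewer people versus half doses (at reduced efficacy) for twice as many. Every parameter is an illustrative assumption, and policy-grade models are far richer (age structure, waning, variants, rollout speed).

```python
# Minimal discrete-time SIR sketch comparing two ways of spending one vaccine supply:
# full doses for fewer people versus half doses (at reduced efficacy) for twice as many.
# All parameters are illustrative assumptions.
N = 1_000_000              # population size
beta, gamma = 0.25, 0.10   # daily transmission and recovery rates (basic R0 = 2.5)
days = 365
supply = 300_000           # full-dose courses available before the epidemic

def total_infections(vaccinated, vaccine_efficacy):
    # All-or-nothing vaccine: a fraction `vaccine_efficacy` of vaccinees is fully immune.
    immune = vaccinated * vaccine_efficacy
    S, I = N - immune - 10, 10.0
    infections = 0.0
    for _ in range(days):
        new_inf = beta * S * I / N
        S, I = S - new_inf, I + new_inf - gamma * I
        infections += new_inf
    return infections

full = total_infections(vaccinated=supply, vaccine_efficacy=0.90)
fractional = total_infections(vaccinated=2 * supply, vaccine_efficacy=0.90 * 0.70)

print(f"Cumulative infections, full-dose strategy:       {full:,.0f}")
print(f"Cumulative infections, fractional-dose strategy: {fractional:,.0f}")
```

Under these assumptions the fractional strategy leaves fewer people susceptible at the outset and therefore yields a smaller epidemic; with different assumptions (eg, a much lower relative efficacy) the ordering can reverse, which is precisely why the dose–response relationship must be measured rather than guessed.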
Strategic pandemic clinical trial systems for the future
Under the relative abundance present outside a pandemic, once an efficacious therapy is identified, clinical trialists most commonly turn their attention to the next candidate drug—for example, by asking ‘Can addition of a new therapy improve a given individual’s outcome?’ However, from a population health perspective, the key question often is ‘How do we, as a health system, maximise the benefits generated by this therapy’s finite supply?’
During a pandemic, conscientious policymakers will often adopt strategies that allocate scarce resources according to societal values, including toward maximisation of benefit and reduction of inequities.3 Simulations of vaccine allocation in which lives saved is the primary outcome,17 as well as real-world evidence,25 26 demonstrate the necessity of knowing the dose–efficacy relationship in real time and support an approach that allocates scarce resources based on the SOD. Either MDSE-based or SOD-based allocation schemas would be well suited to navigating pandemic-fuelled drug scarcity in order to improve population health, but their derivation may require dose-ranging clinical trials. How, then, to incorporate dose-optimisation studies and enable welfare maximisation more efficiently in the future?
Incorporation of dose-optimisation studies into platform trials is the next step toward maximising population health, and the answer likely resides in blended trials that combine efficacy assessment with dose-optimisation. Employed at scale, platform trials have allowed for rapid identification of efficacious repurposed therapies through simultaneous evaluation of multiple candidate drugs.9 Therefore, a two-step approach in which efficacy is first determined in a platform trial, followed immediately by randomised, dose-ranging studies aimed at dose optimisation (perhaps guided by surrogate endpoints derived from the earlier, larger definitive randomised controlled trials (RCTs), such as correlates of protection32 33) can provide physicians and policymakers with the information needed to best allocate scarce resources toward population health aims. Seamless clinical trial designs proceeding from dose-finding to efficacy assessment have previously been conducted.34 35 Given the need to rapidly begin generating benefits, a future pandemic research paradigm should begin first with an assessment of a drug’s potential to generate benefits for an individual patient, followed by a thorough exploration of the dose–efficacy relationship to inform optimal allocation.
Optimisable components of dose
Once a given drug’s efficacy has been established, optimisation research is needed to understand the patient-related and drug-related factors most responsible for the drug’s success. A drug’s efficacy depends on the extent to which its target is exposed to the drug (ie, the exposure). Exposure itself is a function of dose, which can be framed as the discrete quantum of drug administered (whether it be a flat dose for all subjects or a personalised, weight-based dose), the frequency with which that quantum is administered (eg, once vs recurring), the time duration over which a patient is exposed to the drug and the route of administration. Lowering the quantum administered, reducing the frequency and duration of administration, and altering the route of administration can all help reduce the total amount of drug used while, simultaneously, achieving sufficient exposure.
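A short sketch makes this supply arithmetic concrete: varying the quantum, frequency or duration of a hypothetical regimen changes the total amount of drug consumed per patient and, under a fixed stockpile, the number of patients who can be treated. All regimens and figures below are invented for illustration.

```python
# Illustrative arithmetic only: how the components of a dosing regimen translate into
# total drug consumed per patient and, under a fixed supply, patients treatable.

def drug_per_patient_mg(quantum_mg, doses_per_day, duration_days):
    return quantum_mg * doses_per_day * duration_days

supply_mg = 10_000_000  # assumed stockpile

regimens = {
    "standard: 400 mg twice daily for 10 days":          (400, 2, 10),
    "reduced quantum: 200 mg twice daily for 10 days":   (200, 2, 10),
    "shorter course: 400 mg twice daily for 5 days":     (400, 2, 5),
    "reduced frequency: 400 mg once daily for 10 days":  (400, 1, 10),
}

for label, (quantum, freq, duration) in regimens.items():
    per_patient = drug_per_patient_mg(quantum, freq, duration)
    print(f"{label:50s} {per_patient:6,d} mg/patient -> "
          f"{supply_mg // per_patient:,} patients treatable")
```

Whether any of the reduced regimens achieves sufficient exposure at the drug's target is, of course, the empirical question that dose-optimisation trials must answer.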
As discussed in preceding sections, this question is orthogonal to, but no less important than, the original efficacy question more commonly examined in conventional studies. Clinical trial methods that efficiently accomplish the task of dose optimisation while (1) adhering to core ethical constraints (namely the absence of a placebo arm once a given drug’s efficacy has been established) and (2) recruiting efficiently are still being elucidated. Dose optimisation after confirmation of a drug’s efficacy is, essentially, a one-way sensitivity analysis: that is, it would evaluate the impact of lower quanta, less frequent administration, shorter duration and/or different routes on efficacy. These designs may come to resemble DURATIONS designs previously developed to efficiently examine antibiotic regimen durations.36 37 Clever use of Bayesian prospective clinical trials in sequence may unlock new efficiencies.
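As one hedged illustration of how sequential Bayesian analysis might support such post-efficacy dose optimisation, the sketch below uses a conjugate beta-binomial model to estimate the probability that a lower dose gives up no more than a predefined margin of response. The response counts, prior and margin are assumptions for illustration, not data from any actual trial.

```python
import numpy as np

# Beta-binomial sketch: probability that a lower dose is within an acceptable
# efficacy margin of the established dose. All inputs are illustrative assumptions.
rng = np.random.default_rng(0)

established = {"responders": 85, "n": 100}  # hypothetical arm at the proven dose
lower = {"responders": 80, "n": 100}        # hypothetical arm at the lower dose
margin = 0.10                               # acceptable absolute loss in response rate

def posterior_samples(arm, size=100_000):
    # Beta(1, 1) prior with a binomial likelihood gives a Beta posterior (conjugacy).
    return rng.beta(1 + arm["responders"], 1 + arm["n"] - arm["responders"], size)

p_established = posterior_samples(established)
p_lower = posterior_samples(lower)

prob_acceptable = np.mean(p_lower > p_established - margin)
print(f"Pr(lower dose within {margin:.0%} of the established dose): {prob_acceptable:.2f}")
```

In a sequential design, interim posteriors like this one could trigger dropping clearly inferior dose levels or expanding promising ones, concentrating scarce drug supply on the arms most likely to inform allocation.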
Considerations for trialists, health systems and policymakers
Incorporation of dose optimisation of scarce medical resources is not without its challenges or criticisms. First, clinicians, researchers and participants must be convinced that the risk to an individual involved in a dose-optimisation trial is reasonable, when viewed in relation to the social value that the trial provides. Individuals enrolled in dose-optimisation studies ultimately bear the risks of these trials. Chief among these is the risk that the lower dose has lower efficacy than the previously tested dose, a risk that can be mitigated by allowing for crossover. Counterintuitive benefits may emerge: lower doses may have improved safety profiles and individuals enrolling in dose-optimisation trials may receive access to therapies earlier than those who do not. Individuals enrolled in dose-optimisation trials may therefore, on balance, benefit from the lower dose—while also contributing to potentially substantial population benefit. While the populations to which MDSE and SOD will apply are more heterogeneous than the population in which a dose-ranging clinical trial is conducted, a dose-ranging study would be a potentially high-reward incremental step.
Second, critics may worry that dose optimisation will lengthen development, repurposing, and authorisation or approval timelines, or unnecessarily increase the rate at which stockpiles of absolutely scarce drugs are used. Dose optimisation indeed requires time, but simulation and real-world studies suggest it can enable implementation strategies that save more lives.4 17 25–27 Two-staged result reporting—efficacy trial result first, followed by dose optimisation—may allow patients to benefit from a therapy with demonstrated efficacy while dose optimisation efforts are ongoing. Unpredictable supply chains and increased demand leave clinicians and patients to grapple with a fundamental question: ‘Is it worth the risk to this patient, who has access to the scarce medicine, to potentially provision a lower dose?’ Within the individual doctor–patient relationship, public health arguments lack an advocate, resulting in an answer of ‘No’. Health authorities and regulators must make efforts to ensure adequate drug supply for dose optimisation clinical trials, and hospital authorities should cordon off supplies for dose optimisation trials that may serve the greater good.
Third, the successes of corticosteroids for hospitalised patients with COVID-19 who require supplemental oxygen might suggest avoiding the need for dose optimisation by limiting our efforts to abundant therapies. This line of thinking represents supply-consciousness but has strategic flaws: it limits the universe of potential therapies that can be tested, no matter how well-reasoned mechanistically; fails to anticipate that a therapy being studied could become scarce in the future; and neglects geographic heterogeneity in abundance/scarcity. Extending the supplies of relatively abundant drugs in the event of higher than anticipated demand is consistent with a benefit-maximising, inequity-minimising strategy. Moreover, failing to reconsider the dose of abundant drugs risks under-dosing and failure to capture population benefits that otherwise could have been achieved.
Some may contend that lower doses should not be used until clinical efficacy is established in a confirmatory clinical trial comparing low-dose to standard-dose, implying that development timelines will be lengthened. While of course a reasonable concern, risk calculus is contextual. Whether to adopt a lower dose of an absolutely scarce therapy based on suboptimal evidence must be decided in awareness of the dangers resulting from ongoing absolute scarcity. Expanding population access to a therapy by lowering the dose—even if doing so sacrifices some degree of individual-level efficacy—may be socially optimal, ethical, and in line with the goals of a given society.4 38 Generation of this information alone does not compel policymakers to action.
Finally, some may worry that dosing strategies that aim to promote population health treat individual recipients inequitably, by providing a dose that is less beneficial than the dose that would be provided under abundance. But, under scarcity, maintaining the dosing strategies used under abundance may exclude many who can benefit. Maximising benefits for a few recipients while leaving many unprotected for want of access is likely to neither maximise population welfare nor serve a society’s equity goals. Indeed, a dose-optimised approach to scarce medicines may facilitate fulfilment of rational pharmacotherapy’s goal of ensuring therapeutically sound and cost-effective use of medicines in a post-COVID-19 world.
Applying scarcity-oriented development to the next pandemic
Many of these same themes and debates have re-emerged in the past 6 months. Since this manuscript was initially submitted, monkeypox has been declared a public health emergency of international concern.39 Authorities in the USA appear keen to avoid the accessibility issues that plagued the COVID-19 vaccine rollout. In addition to adopting a ‘first doses first’ strategy in some localities,40 federal-level policymakers will allow federal drug regulators to provide EUA for fractional dosing of monkeypox vaccine (modified vaccinia Ankara (MVA)),41 drawing on a previously conducted RCT comparing full-dose subcutaneous injection of two different forms of MVA with one-fifth dose delivered intradermally.42 Notably, recipients in that trial received two doses of MVA.42 Owing to this extrapolation, and to concern that intradermal injection will prove suboptimal when implemented in the real world, the adoption of fractional dosing has been criticised.43
Recognising the evidentiary ambiguity, clinical trialists are starting to ask the right, public health-oriented questions. The US-based National Institute of Allergy and Infectious Diseases is sponsoring a prospective, randomised, controlled phase 2 trial evaluating two doses of one-fifth dose MVA (intradermal) and two doses of one-tenth dose MVA (intradermal) against two doses of full-dose standard of care (delivered subcutaneously).44 45 There is, however, cause for great concern in the execution of this research vision. The proposed trial,45 which is not yet recruiting, employs a non-inferiority hypothesis structure in evaluating peak antibody responses in patients aged 18–50 years. Of note, the initial trial plan does not include a clinical efficacy endpoint. Together, these factors raise the worrying possibility that this much-needed dose optimisation trial could, despite not having defined the minimal antibody response needed to confer protection, reject the fractional dosing strategy on the basis of inferior peak antibody responses alone and, simultaneously, fail to identify clinical near-equivalence between the three tested dose levels if it does exist. The BLAZE-1 trial may be informative: even underpowered clinical outcomes can influence decision-making. Moreover, the COVID-19 experience suggests that even a positive trial demonstrating the non-inferiority of fractional dosing’s antibody response would fail to convince sceptical public health authorities or regulators, owing to the failure to provide either a clinical endpoint or meaningful data for patients over 50 years of age. The extent to which the science of fractional dosing will be furthered is limited by the trial’s inability to independently correlate threshold antibody titres with clinical protection, as well as its likely inability to identify either the MDSE or the SOD. Finally, by virtue of the study’s three-arm design, the potential vaccine supply expansion is inherently capped at a 10-fold increase, which may not be sufficient to protect all individuals in some locales.
Conclusions
Clinical care provided under conditions of scarcity and clinical care provided under conditions of abundance are profoundly different. During the COVID-19 pandemic, society’s ability to maximise the benefits that could be generated from its scarce drug supplies was hampered by a lack of information and an inability or unwillingness to ask key questions. Research questions, and the clinical trials used to answer them, must acknowledge the stark differences between care under scarcity and care under abundance in order to serve population health goals. Policymakers must make strategic decisions to navigate scarcity, and medicine’s role is to provide policymakers with the information that guides decision-making. Dosing remains a major inefficiency to be improved on in pandemic preparedness. Indeed, in a post-COVID-19 future, these issues are likely to persist for high-cost cancer and rheumatology drugs in low-income and middle-income countries, where the drugs may be available but the means to acquire them are scarce.38 46 Clinical trial systems, supported by policymakers, must acknowledge scarcity and take appropriate steps to optimise dosing. Though methodological questions remain, a two-step model of innovation that recognises and then balances the inherent tension between population health and individual health under scarcity is one potential approach moving forward.
Ethics approval
Not applicable.
References
Footnotes
Twitter @srinmurthy99
Contributors GS designed and conceptualised the work, interpreted findings, and drafted and revised the manuscript for intellectual content. GP, WFP and SM interpreted findings, revised the manuscript and provided intellectual content. The corresponding author attests that all listed authors meet authorship criteria and that no others meeting the criteria have been omitted. GS is the guarantor.
Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.
Competing interests GS is an employee of the US Government; the views expressed are his own and do not necessarily represent those of the Department of Veterans Affairs or US Government. GS is a listed coinventor of a filed patent held by the University of Chicago covering the use of low-dose tocilizumab in viral infections. GS and SM have served in WHO Clinical Therapeutics Working Groups; the views expressed are theirs and do not reflect the WHO or other members of Working Groups.
Provenance and peer review Not commissioned; externally peer reviewed.