Electronically reprinted from Pharmaceutical Technology, February 2015

Cover Story: Biomarker Assays

Developing and Validating Assays for Small-Molecule Biomarkers

Working with biological matrices and understanding the intended use are crucial.

Cynthia A. Challener, PhD

Approaches to drug development are changing, as pharmaceutical companies use advanced technologies to achieve targeted goals. Today, rational drug development strategies based on modeling, simulation, and biomarkers complement such traditional drug discovery methods as structure-activity relationship (SAR) studies. Biomarkers are also being used to evaluate drug performance during preclinical testing and clinical trials. Assay development and validation, however, can be more challenging for biomarkers than for APIs. This article examines some of these issues.

Many of the challenges of working with biomarkers stem from the fact that, unlike APIs in neat solutions, they are endogenous compounds in biological fluids. For example, isobaric analytes such as bile acids in matrices can lead to higher background noise and interference in biomarker assays. In addition, biomarkers are typically present in trace amounts, unlike APIs, according to Stephanie Pasas-Farmer, senior director of bioanalytical labs with Frontage Laboratories. Furthermore, analytical method development for APIs tends to be more straightforward because reference materials with known concentrations are available and can be spiked into the matrix. With biomarker assays, internal standards are often not available and, generally, an endogenous level of the biomarker is already present in the matrix, which makes quantitation a challenge, according to April Brys, director of biomarker services with Battelle.

Regulatory requirements for analytical methods also differ for APIs and biomarkers. API analytical method development and validation must meet US Pharmacopeial Convention requirements or guidelines set by the International Conference on Harmonization (ICH). Biomarker assay development and validation, meanwhile, depend on the assay's intended use. If the assay is being used as a diagnostic tool, it must meet the requirements of the Clinical Laboratory Improvement Amendments (CLIA). If it is being used to establish a pharmacokinetic/pharmacodynamic (PK/PD) endpoint, it must meet good laboratory practice (GLP) standards.

Many of the same analytical techniques, however, are used for API methods and biomarker assays. "Small-molecule biomarker assays are typically developed using a liquid chromatography/mass spectrometry (LC/MS) platform, which provides a great balance of sensitivity, selectivity, robustness, precision, accuracy, and ease of operation," states Patrick Bennett, executive director of biomarker operations at PPD. The LC/MS platform is also beneficial for biomarker analysis because it can be used to perform multi-component analyses simply, including metabolomics-based analyses of hundreds of molecules per sample.

Dealing with matrix issues

For biomarker assays, regulatory agencies prefer that calibration standards and quality control samples be prepared in the same biological matrix (whether blood, urine, plasma, or other biological fluid) found in the samples to be analyzed. These matrices, however, should be free of the biomarker, because endogenous levels can influence the accuracy of readings. Unfortunately, it is difficult to obtain analyte-free matrices.
One option, according to Pasas-Farmer, is to use a surrogate matrix that is biomarker free. A second option is to remove the endogenous analyte/biomarker from the matrix through chemical treatment, charcoal stripping, selective extraction, or some other method. In both of these cases, however, the chosen matrix will be different from the study sample matrix. "With surrogate, stripped, and depleted matrices, there may be issues with solubility, stability, non-specific binding, or ionization matrix effects due to the differences in the modified matrix and the authentic matrix," says Bennett. Consequently, it will be necessary to design validation experiments to confirm that the ability to quantify analytes in a modified matrix is similar to that for analytes in the authentic matrix. If an authentic matrix is combined with standard addition, Bennett adds, experiments and processes will be necessary to determine the baseline and final concentrations and to generate bridging data from newer lots.

These issues may be avoided by using a standard-addition approach with the true biological matrix once the basal level of the endogenous biomarker has been calculated for the specific lot of control matrix.
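As a rough illustration of the standard-addition idea described above, the sketch below fits a line to responses measured after spiking known amounts of biomarker into the authentic control matrix and extrapolates to estimate the basal (endogenous) concentration of that matrix lot. The spike levels and responses are invented for illustration and are not figures from the article.

```python
# Minimal sketch of a standard-addition estimate of the basal biomarker level
# in a specific control-matrix lot. Spike levels and responses are illustrative.
import numpy as np

spike_conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])   # ng/mL added to the matrix lot
response   = np.array([2.1, 4.0, 5.9, 9.8, 17.5])     # instrument response (e.g., peak area ratio)

# Ordinary least-squares fit: response = slope * spike + intercept
slope, intercept = np.polyfit(spike_conc, response, 1)

# The basal concentration corresponds to the magnitude of the x-intercept,
# i.e., the unspiked matrix already produces the "intercept" response.
basal_conc = intercept / slope
print(f"Estimated basal concentration: {basal_conc:.2f} ng/mL")

# Once the basal level of this lot is known, calibrators prepared in the same
# lot can be assigned corrected concentrations (nominal spike + basal level).
corrected = spike_conc + basal_conc
print("Corrected calibrator concentrations:", np.round(corrected, 2))
```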
Stable, labeled reference material can be used as a surrogate standard, and labeled internal standards can be effective. These approaches, however, can also be challenging because reference standards for biomarkers are often unavailable or difficult to synthesize. Even when they are available, whether labeled or unlabeled, they can be expensive. In some cases, depending on the detection method and the type of matrix, samples may have to be purified before a biomarker assay can be completed. This, in turn, can create various problems, including recovery and matrix-interference issues, notes Brys.

Intended use and validation

Before method validation, it is crucial to identify the intended use of the biomarker assay. "The investigator must understand the ultimate use of the data generated," says Pasas-Farmer. If biomarker data are to be used for early screening and evaluation of the mechanism of action (MOA) of a drug candidate, then a qualified assay that is semi-quantitative or relatively quantitative can be used, she says. If, however, the biomarker data are used to support claims for the safety/efficacy of dosing in a pivotal preclinical or clinical study, and if they will be used in a regulatory filing in support of a new drug application (NDA), then a fully validated assay should be implemented. Finally, if the assay will be used as a diagnostic tool to determine the disease state of an individual, then CLIA validation requirements must be met, she explains.

Discussions are still ongoing, however, about whether biomarker assays should be fully validated, or whether qualification of an assay is sufficient when the new chemical entity is in the early stages of development. Bennett notes, for example, that even if no compliance is deemed necessary, extensive method performance qualification may still be warranted to ensure that the biomarker assay can provide adequate confidence in the data required.

Validation of an API analytical method typically focuses on specific assay performance parameters such as precision, robustness, linearity, and the lower limit of quantification (LLOQ)/lower limit of detection (LLOD). In contrast, validation of biomarker assays involves determination of "fitness for purpose" and must demonstrate that the assay meets its predetermined purpose, according to Brys, who defines the stages of biomarker validation as follows:
• Defining the purpose and selection of the assay
• Identifying the critical reagents and standards
• Writing the procedure and validation plan
• Performing the actual study validation, which typically includes assessment of fitness for purpose, robustness of the assay in the clinical setting, standardization of patient sampling, and the collection, storage, and stability of the samples
• Evaluating the assay during routine use
• Quality control monitoring
• Proficiency testing
• Identification of batch-to-batch issues.

If stable labeled material is used as a surrogate reference standard, Bennett says, additional experiments must be performed immediately before each analytical run to demonstrate similar responses for equimolar amounts of labeled and unlabeled analyte. Because biomarkers are endogenous and the goal is to measure the change in biomarker concentration in response to some other change (e.g., use in diseased vs. healthy patients or in comparing different therapies), he says, experiments must also be run to determine the ability to detect a statistically meaningful change.
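One simple way to frame Bennett's point about detecting a statistically meaningful change is to compare an observed change against the variability of the assay itself. The sketch below computes a least significant change from an assumed analytical coefficient of variation; the CV, the 95% confidence factor, and the concentrations are illustrative assumptions rather than figures from the article, and in practice within-subject biological variability would also be folded into the calculation.

```python
# Minimal sketch: is an observed change in biomarker concentration larger than
# what assay variability alone could explain? Values are illustrative assumptions.
import math

analytical_cv = 0.12   # assumed assay precision (12% CV) from validation data
z_95 = 1.96            # two-sided 95% confidence factor

# Least significant change, as a fraction of the baseline value.
# sqrt(2) accounts for variability in both the baseline and follow-up results.
# A fuller "reference change value" would also include biological variation.
lsc = z_95 * math.sqrt(2) * analytical_cv

baseline, follow_up = 8.4, 11.0                     # ng/mL, illustrative values
observed_change = abs(follow_up - baseline) / baseline

print(f"Least significant change: {lsc:.1%}")
print(f"Observed change: {observed_change:.1%}")
print("Meaningful at 95% confidence" if observed_change > lsc
      else "Within assay variability")
```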
Successful strategies

While it might seem obvious, Bennett stresses that, even before assay development starts, it is crucial to ensure that the correct biomarker(s) is being evaluated for the right endpoint(s). Brys adds that selected biomarkers should have a strong biological rationale and a correlation with efficacy in preclinical studies; focusing on a few such biomarkers is therefore more effective. "A basic starting point for the development of any biomarker assay is an understanding of relevant precedents for the assay," agrees Donna Eash, director of client services with Frontage.

As important is ensuring that the extent of qualification or validation is aligned with the objective of the biomarker data that are obtained, according to Bennett. "If the data are used for dose selection, formulation comparisons, or research, then compliance may not be as important as ensuring that the method meets appropriate acceptance criteria. However, if the data are to be used for patient enrollment, safety, or efficacy, meeting required regulatory compliance standards also should be included in the strategy," he observes.

Developers must also outline an effective development plan, considering that endogenous biomarkers are often identified in the discovery stages. "A good strategy must be in place for initial discovery into preclinical, followed by preclinical into clinical, as each stage will have its own level of rigor, and follow the fit-for-purpose assay qualification and validation steps," Eash notes.

Given the low levels of biomarkers present in most samples, Brys adds, it is important to identify an analytical method with sufficient sensitivity. Access to actual study or patient samples can be important during the pre-qualification/validation stage to ensure that the right concentration range(s) are being evaluated and to determine differences in the concentrations of the biomarker(s) in treated and untreated samples, according to Bennett. He also reiterates that validation experiments should be performed to quantify any matrix effects and non-specific binding that may occur when using modified or surrogate matrices.

The following are also important to the successful development and validation of biomarker assays, Brys says:
• Evaluating biomarker stability, particularly during storage
• Proper determination and evaluation of sample collection and storage methods.

For biomarker assays intended for use in diseased patients, the validation process must also include screening patients to evaluate how the biomarker is affected by the disease state and how the disease state will affect the analysis, according to Pasas-Farmer.

Multiplexing and other advances

New advances in analytical technology, such as multiplexing systems that allow 10–100 biomarkers to be analyzed in a single sample, offer a number of benefits, including lower costs and fewer invasive procedures, says Pasas-Farmer.

High-resolution/accurate-mass (HR/AM) instruments are now commercially available with triple quadrupole-like sensitivity and robustness. Bennett says that they offer significantly higher resolving power with a much greater signal-to-noise ratio, even when using single-ion monitoring (SIM). SIM is advantageous because it does not require the fragmentation, and the subsequent loss of signal, that occurs in LC-MS/MS experiments. For many analytes, particularly those with poor or excessive fragmentation or with high background noise levels even when MS/MS is used, this allows for greater selectivity and sensitivity, he says. Examples include bile acids, which fragment poorly, and steroids, which fragment excessively.

Easier and more robust nanospray technologies have also allowed researchers to achieve significantly better LLOQs than was possible with high-flow liquid chromatography and traditional electrospray ionization techniques, Bennett says. Pairing this technology with either HR/AM or triple quadrupole instruments can provide sensitivity benefits that are further magnified when multidimensional chromatography and an immunoaffinity cleanup step are used, he adds.

Evolving regulatory environment

Regulatory expectations for biomarker methods are changing, and Bennett expects them to continue to evolve over the next three to five years. FDA's recent guidance suggests that the agency will take a more rigorous and less semi-quantitative approach to overseeing assay qualification and validation. In addition, incurred sample reanalysis (ISR), which is used to measure the quality of data and the robustness of an assay and has been required for small- and large-molecule drug substances, is now being requested for biomarker development, Pasas-Farmer points out.
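As a rough illustration of how incurred sample reanalysis results are typically assessed, the sketch below compares original and repeat results and checks what fraction agree within a set limit. The ±20% limit and two-thirds pass rate reflect commonly cited expectations for chromatographic assays and are assumptions here, as are the invented concentration values.

```python
# Minimal sketch of an incurred sample reanalysis (ISR) check. Percent difference
# is taken against the mean of the original and repeat results. The +/-20% limit
# and two-thirds pass criterion are assumed, commonly cited expectations.
original = [12.1, 45.3, 8.7, 101.2, 66.0, 23.4]   # ng/mL, illustrative study samples
repeat   = [13.0, 41.8, 9.9, 95.6, 70.1, 30.2]    # ng/mL, reanalysis of the same samples

limit_pct = 20.0
within = 0
for first, second in zip(original, repeat):
    mean = (first + second) / 2.0
    diff_pct = abs(second - first) / mean * 100.0
    if diff_pct <= limit_pct:
        within += 1

pass_rate = within / len(original)
print(f"{within}/{len(original)} repeats within +/-{limit_pct:.0f}% of the mean")
print("ISR acceptable" if pass_rate >= 2 / 3 else "ISR failed")
```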
Posted with permission from the February 2015 issue of Pharmaceutical Technology, www.pharmtech.com. Copyright 2015, Advanstar Communications, Inc. All rights reserved. For more information on the use of this content, contact Wright's Media at 877-652-5295.