"Advances in precision medicine, outcomes management and genetic classification require high quality biospecimens, yet standard procedures for collecting, storing and handling such samples are lacking," says Katheryn Shea, vice president of Bioservices at Precision Bioservices in Frederick, MD, and immediate past president of the International Society for Biological and Environmental Repositories (ISBER), an SLAS strategic alliance partner. "Inconsistency in these processes can lead to inconsistent, irreproducible results," she warns.
When working with biological samples—cells, tissue, plasma, serum—"the bottom line is that as soon as you begin to collect a sample, you're changing the profile of that sample," says Shea, who co-presented the SLAS2014 Sample Management Short Course. "Even something as simple as putting a tourniquet on a person's arm elicits a response from the body that will affect the sample from that person. In addition, the sample is removed from a warm 98.6°F body and exposed to the ambient environment, which can stress the biological pathways. And of course, the way we handle and process a sample also can impact those pathways."
To minimize the impact, "we need to do these tasks in a standardized way, so we know that donor A's sample, donor B's sample and donor C's sample all were handled and processed the same way," Shea says. "Then when we analyze those samples to determine what is actually different from one donor to the next, we know we're measuring the variability among the donors and not the variability that we introduce during handling."
The U.S. Food and Drug Administration requires researchers to go through a series of tests on new diagnostic devices to determine the limit of detection, linearity, reproducibility, interfering substances and other parameters. "You should go through the same kind of testing using the pre-analytical conditions planned for the biological samples themselves," Shea asserts.
Time and temperature are key variables when it comes to handling biospecimens, Shea says. Depending on what parameters will be measured, some serum samples must be processed within 30 minutes of draw, for example, whereas others may be processed up to five days after draw. What's being measured determines how much processing delay can occur before the sample starts to degrade.
"People need to provide evidence-based protocols and test those protocols in advance to ensure they are suitable for their measurements," Shea urges. "They also should look at the impact of deviations. If, for example, the standard protocol for a serum sample is to separate it within one hour of draw, what would happen if it were processed faster, at 30 minutes? Or later than an hour? How would those time deviations affect what you're trying to measure?"
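Shea's point about tracking deviations lends itself to a simple sketch. The one-hour window, field layout and donor IDs below are hypothetical illustrations, not values from any published protocol—each lab would substitute the window its own validation testing supports:

```python
from datetime import datetime, timedelta

# Hypothetical validated window for serum separation after draw.
# In practice this value must come from evidence-based protocol testing.
MAX_DELAY = timedelta(hours=1)

def flag_deviations(samples):
    """Return IDs of samples whose draw-to-separation delay exceeds the window.

    `samples` is a list of (sample_id, draw_time, separation_time) tuples.
    """
    flagged = []
    for sample_id, drawn, separated in samples:
        if separated - drawn > MAX_DELAY:
            flagged.append(sample_id)
    return flagged

samples = [
    ("donor_A", datetime(2014, 3, 1, 9, 0), datetime(2014, 3, 1, 9, 40)),
    ("donor_B", datetime(2014, 3, 1, 9, 5), datetime(2014, 3, 1, 11, 0)),
]
print(flag_deviations(samples))  # only donor_B exceeds the one-hour window
```

Recording the deviation, rather than silently analyzing the sample anyway, is what lets a later analysis separate donor variability from handling variability.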
ISBER's Biospecimen Science Working Group recently published a report, Identification of Evidence-Based Biospecimen Quality-Control Tools, which identifies the effect that variations in time and other sample-handling variables have on analytes used as diagnostic QC tools. Pre- and post-centrifugation delays affect, among others, concentrations of transferrin receptor, ascorbic acid and potassium; the cytokines GM-CSF, IL-1alpha and G-CSF; and the hormone ACTH. Taking potassium as an example, one study referenced in the report shows that a pre-centrifugation delay of whole blood at 4°C induces a "dramatic increase" in potassium concentrations: about 200% after one day and up to 500% after seven days of processing delay.
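Those two reported reference points (roughly +200% at one day and up to +500% at seven days at 4°C) could feed a rough QC estimate. The linear interpolation below is purely illustrative—the report cites only the two endpoints, not a validated model:

```python
def potassium_inflation(delay_days):
    """Estimate the % increase in measured potassium for a given
    pre-centrifugation delay at 4 °C, by linear interpolation between
    the two values cited in the ISBER report (illustrative only)."""
    # Reported reference points: ~200% increase at day 1, ~500% at day 7.
    if delay_days <= 0:
        return 0.0
    if delay_days <= 1:
        return 200.0 * delay_days          # ramp up to the day-1 value
    if delay_days >= 7:
        return 500.0                       # no data cited beyond seven days
    return 200.0 + (500.0 - 200.0) * (delay_days - 1) / 6

print(potassium_inflation(1))  # 200.0
print(potassium_inflation(4))  # 350.0
```

Even a crude check like this makes the key point: a potassium value from a sample that sat for days before centrifugation is not interpretable at face value.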
The report also identifies a number of analytes that are affected by temperature, storage conditions and freeze/thaw. For example, suboptimal storage conditions and freeze/thaw together affect the stability of the signal protein VEGF in serum; storage conditions alone can lead to degradation of vitamin E in EDTA plasma and the enzyme MMP-9 in citrated plasma; and improper freeze/thaw alone can lead to stability issues in MMP-7 and the cytokines IL-15, IL-17 and IFN-gamma.
"Some biological specimens can't go through any kind of freeze/thaw," Shea notes. "If you put cryopreserved cells through an unplanned freeze/thaw cycle, it will simply kill the cells."
The impact of freeze/thaw on various types of specimens convinced Derek Donohue to establish Box Scientific, based in Newton, NC, to help reduce the degradation and variability that can be introduced when samples are "haphazardly transitioned" from deep freeze to working temperatures, he says. "Thawing has been overlooked as a source of processing errors for a long time. When working with samples in microplates or small tubes, for example, the thaw time can be anywhere from half an hour to two and a half hours, depending on the sample size. If you leave the samples on a bench, they may look solid for all intents and purposes, but what do you really have?"
Leaving samples on a bench can cause "edge effects" in microplates, because the temperature in the wells on the edge of the plates may be slightly different from that of the wells toward the center, Donohue explains. This can lead to statistically different results from the samples being measured; values obtained from the edge wells could be higher or lower than those toward the center. "By looking at the wells, you can confirm there is no ice, but it's not standard process to take a thermometer and check that every sample is at the exact same temperature," he says. "Yet so many processes such as PCR have to be done at a precise temperature; others involve cycling samples through different temperature ranges. It's imperative to know what you're starting with."
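A quick way to screen for the edge effect Donohue describes is to compare edge-well and interior-well readings on a plate. The sketch below uses a toy 96-well plate with a fabricated 8% edge offset, just to show the bookkeeping:

```python
from statistics import mean

ROWS, COLS = 8, 12  # standard 96-well plate

def split_edge_interior(plate):
    """Split a ROWS x COLS grid of well readings into edge and interior lists."""
    edge, interior = [], []
    for r in range(ROWS):
        for c in range(COLS):
            if r in (0, ROWS - 1) or c in (0, COLS - 1):
                edge.append(plate[r][c])
            else:
                interior.append(plate[r][c])
    return edge, interior

# Toy plate: interior wells read 1.00, edge wells read 1.08 (a fabricated
# 8% edge effect standing in for the temperature-driven bias described above).
plate = [[1.08 if r in (0, 7) or c in (0, 11) else 1.00
          for c in range(COLS)] for r in range(ROWS)]

edge, interior = split_edge_interior(plate)
print(len(edge), len(interior))               # 36 edge wells, 60 interior wells
print(round(mean(edge) - mean(interior), 2))  # 0.08
```

A consistent nonzero gap between the two group means on a plate of replicates is a red flag that thawing or incubation, not biology, is driving part of the signal.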
The same is true for samples used on a larger scale, for diagnostic purposes, according to SLAS member Charles D. Hawker, Ph.D., M.B.A., F.A.C.B., scientific director, Automation and Special Projects at ARUP Laboratories, a U.S. national reference laboratory in Salt Lake City, UT. ARUP developed an automated system for rapidly thawing large volumes of frozen samples—mainly blood and urine used for diagnostic purposes—while helping to ensure stability and uniformity. The automated system essentially circulates ambient air past the samples, maintaining a steady temperature gradient. "This certainly beats the old way we used to thaw samples by setting a group of specimens out on a laboratory bench and getting an electric fan to blow on them," Hawker quips.
The ARUP system also mixes the samples when thawing is completed, Hawker notes. "Specimens have to be well mixed before you can do whatever the next step is. In our case, it's performing a diagnostic test; but even for bench scientists, if they were to take an aliquot out of a tube, for example, it would have to be well mixed or it wouldn't be a correct specimen."
Hawker recalls instances in which a hospital lab would ask ARUP to repeat a test because the results weren't what they were expecting. "We would repeat the test, and we'd get the results they were looking for," he says. "Sure, we could say it was laboratory error, but the likely explanation was that the specimen may not have been completely thawed at the time of the test, and therefore it wasn't well mixed. The incorrect result was obtained because the proteins and other substances we were trying to measure were much more concentrated than if the ice had completely melted. Uniform thawing and mixing of specimens is critically important, regardless of what the application is, whether it's clinical diagnostics or bench research. When manual procedures are used, you risk getting inadequate thawing and inaccurate results."
Time, temperature and storage also are considerations when working with small molecules, according to Sue Holland-Crimmin, Ph.D., global head of Sample Management Technologies at GlaxoSmithKline (GSK) in Philadelphia, PA, and co-presenter with Shea of the SLAS Sample Management Short Course. Pharmaceutical companies, biotechs and other laboratories typically hold their small molecule collections in two formats: powder, which has to be dissolved at some point; or in a solvent such as DMSO. "Proper storage is key to maintaining the quality and stability of the solution collections," Crimmin says. "Studies have shown it's important to keep those collections as cold as possible in a dry atmosphere to avoid moisture uptake into the DMSO. Typically, samples are stored at -20°C in sealed tubes or plates."
Not paying attention to appropriate storage of small-molecule solutions has consequences, Crimmin cautions. "Poor sample quality can cause many problems in biological assays. Impurities and decomposition products may result in false positives. Furthermore, a lower than expected concentration of target compounds could cause difficulty with accurate IC50 and Kd determinations. So avoiding water uptake and temperature variations is critical."
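Crimmin's IC50 point can be made concrete. If moisture uptake has diluted a DMSO stock so that true concentrations are a fraction f of nominal, every well is actually dosed at f·c, and an IC50 fitted in nominal units comes out inflated by 1/f. The Hill model and the numbers below are assumptions chosen for illustration:

```python
def response(conc, ic50, hill=1.0):
    """Fractional inhibition from a simple Hill model (bottom = 0,
    top = 1) -- assumed here purely for illustration."""
    return conc**hill / (ic50**hill + conc**hill)

true_ic50 = 1.0   # uM, hypothetical
dilution = 0.5    # water uptake has halved the true concentration

# We dose at nominal concentration c, but the compound is really at
# dilution * c, so the nominal dose that gives 50% inhibition is
# true_ic50 / dilution -- the fitted IC50 is inflated by 1/dilution.
apparent_ic50 = true_ic50 / dilution
assert abs(response(dilution * apparent_ic50, true_ic50) - 0.5) < 1e-9
print(apparent_ic50)  # 2.0 -- the compound looks half as potent as it is
```

The same arithmetic applies to Kd estimates: any systematic concentration error propagates directly into the fitted affinity.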
Donohue agrees. "Most samples have some aqueous component, even if they're not stored in solution. If you don't maintain a steady heat gradient when you phase-transition them, you can hit a point where the sample goes back and forth between ice and water. This can cause crystals to form and wreak havoc on the samples. Cells are most susceptible, but even DNA and RNA can be fragmented by that process."
The desire to thaw substances more quickly can drive researchers to try ad hoc strategies. "When I worked with nuclear pharmaceuticals, I would see people take heavy water and other substances with radioactive signatures and try to heat them up fast by putting the samples under their armpits," Donohue recalls. "I said, 'you're putting radioactive water right on your thyroid gland! Stop the madness!'" Several companies offer freeze/thaw systems using various technologies, from circulating ambient air, to circulating fluid, to polymers that keep samples at specific temperatures during use.
"They all come back to the same core goal: consistency and reproducibility," Donohue says. "We're no longer doing simple blood work. We're digging into DNA, into individual genes, into specific proteins within DNA fragments. The industry is at a point where the need for depth and precision is so high that we really need to raise the bar on how we treat our samples and maintain their integrity as we investigate deeper and deeper."
Although the need for appropriate sample management might seem self-evident, awareness of that need is, in fact, quite low, according to Shea. "When researchers are designing a new protocol, they're very focused on what they're going to measure. They focus on the assays and on the indicators that could best answer whatever question they're investigating. The biospecimens they're using often are an afterthought, even though they're essential to experiments. So we need to move that awareness to the forefront. ISBER is focused on this initiative and the SLAS Sample Management Special Interest Group could be a forum for discussion of these challenges."
The need for evidence-based protocols is also well accepted; however, these are not widely available. "It would be ideal if people tested their protocols, looked at the impact of deviations such as time or processing delays and published their results," Shea says. "But the reality is that mainstream scientific publications are not going to put together an article on the best way to process your serum sample. They'll publish papers on proteomics, which are impacted by time variables, but they simply say 'samples were processed within these parameters.' There's limited to no data on what happens if there's a delay or deviation." One move in the right direction is the guidelines from the U.S. National Cancer Institute's Biospecimen Reporting for Improved Study Quality (BRISQ). A recent paper in Nature includes a checklist derived from BRISQ for the use of omics-based predictors in clinical trials, including specimen and assay issues, among others.
Another overlooked strategy for safeguarding specimens is to read the package inserts for the assays, Shea advises. "The inserts will give you information on some of the important preanalytic conditions, like whether the fact that a sample is hemolyzed or lipemic will interfere with the assay. It will also tell you about any freeze/thaw impact."
GSK's Crimmin points to a different type of challenge. "Those of us working in large pharmaceutical companies are dealing more and more with partnerships," she says. "That's great because bringing together teams from academia, government, nonprofits and other parts of the industry is a really powerful way of trying to address difficult problems such as neglected and rare diseases. But as we share our compounds with our partners, we also need to share how to handle them, so that the collections we pull together specifically for tuberculosis research, for example, maintain their integrity. Only then can we be sure that the results we're seeing are related to potential efficacy, not to sample quality control issues."
March 31, 2014