J Exp Second Sci. 2014 Apr;3(2). pii: http://jes2s.com/may2014/pdfs/hts.pdf.

Minimizing Systematic Errors in Quantitative High Throughput Screening Data Using Standardization, Background Subtraction, and Non-Parametric Regression.

Author information

1. William G. Enloe High School, 128 Clarendon Crescent, Raleigh, NC 27610.
2. National Institute of Environmental Health Sciences (NIEHS/NIH), 111 TW Alexander Dr., Research Triangle Park, NC 27709.

Abstract

Quantitative high throughput screening (qHTS) has the potential to transform traditional toxicological testing by greatly increasing throughput and lowering costs on a per-chemical basis. However, before qHTS data can be utilized for toxicity assessment, systematic errors such as row, column, cluster, and edge effects in raw data readouts need to be removed. Normalization seeks to minimize the effects of such systematic errors. Linear (LN) normalization, such as standardization and background removal, minimizes row and column effects. Alternatively, locally weighted scatterplot smoothing (LOESS or LO) minimizes cluster effects. Both approaches have been used to normalize large-scale data sets in other contexts. This paper proposes a new method that combines the two approaches (LNLO) to account for systematic errors within and between experiments. Heat maps illustrate that the LNLO method is more effective at removing systematic error than either the LN or the LO approach alone. All analyses were performed on an estrogen receptor agonist assay data set generated as part of the Tox21 collaboration.
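
The general shape of the LNLO workflow described in the abstract, linear normalization followed by a non-parametric LOESS correction, can be sketched as follows. This is a minimal illustration assuming a single plate stored as a 2-D NumPy array; the function names, the additive row/column background subtraction, and the separable row-wise and column-wise LOESS trends are simplifying assumptions rather than the authors' exact procedure.

```python
# Minimal sketch of an LNLO-style normalization for one qHTS plate.
# Assumptions (not from the paper): the plate is a 2-D NumPy array of raw
# readouts, row/column backgrounds are removed additively, and the spatial
# LOESS trend is approximated separably along row and column indices.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

def ln_normalize(plate):
    """Linear (LN) step: standardize, then subtract row and column backgrounds."""
    z = (plate - plate.mean()) / plate.std()        # plate-wise standardization
    z = z - z.mean(axis=1, keepdims=True)           # remove row effects
    z = z - z.mean(axis=0, keepdims=True)           # remove column effects
    return z

def lo_normalize(plate, frac=0.3):
    """Non-parametric (LO) step: LOESS over well position to remove cluster effects."""
    rows, cols = np.indices(plate.shape)
    flat = plate.ravel()
    # Fit smooth trends along the row and column coordinates and subtract them,
    # a simple additive surrogate for a full 2-D LOESS surface.
    trend_r = lowess(flat, rows.ravel(), frac=frac, return_sorted=False)
    trend_c = lowess(flat, cols.ravel(), frac=frac, return_sorted=False)
    return (flat - trend_r - trend_c + flat.mean()).reshape(plate.shape)

def lnlo_normalize(plate):
    """Combined LNLO: linear normalization followed by the LOESS correction."""
    return lo_normalize(ln_normalize(plate))

# Example: a synthetic 16x24 (384-well) plate with an artificial edge effect.
rng = np.random.default_rng(0)
plate = rng.normal(size=(16, 24))
plate[0, :] += 2.0                                  # simulated edge-row artifact
corrected = lnlo_normalize(plate)
```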

PMID: 27840777
PMCID: PMC5102623
