
Pacquette & Thompson: Journal of AOAC International Vol. 98, No. 6, 2015

Collaborative Study Results and Discussion

The key precision performance metrics of the multilaboratory study per sample matrix are summarized in Table 6. The RSD_r derived from analysis of the blinded duplicates was roughly the same as for the known duplicates (raw data from the participating laboratories not shown, but the RSD_r obtained is consistent with the intermediate precision data from the SLV shown in Table 3). For instance, none of the seven matrixes produced an RSD_r higher than the method's 10% RSD duplicate criterion for Cr, 7% for Se, or 5% for Mo. Note, however, that the repeatability of Se for three of the seven matrixes was between 5 and 7%, further justification for the change of this QC criterion to 7%. In terms of the repeatability SMPR, two matrixes had RSD_r of >5% for Cr, as did three matrixes for Se. The highest RSD_r observed for the blinded duplicates was 7.0%, and in this case, as well as the other four cases, the corresponding reproducibility was only slightly higher.
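As an illustrative sketch only (not part of Method 2011.19), the repeatability RSD discussed above can be estimated from blind duplicate pairs with the standard duplicate formula, s_r = sqrt(sum of squared pair differences / 2n); the pair values below are hypothetical, since the study data live in Tables 7–9.

```python
import math

# Hypothetical blind-duplicate pairs for one matrix (e.g., ug/100 g);
# the actual study results are in Tables 7-9 and not reproduced here.
pairs = [(2.10, 2.25), (1.95, 2.05), (2.30, 2.18)]

def rsd_r_from_duplicates(pairs):
    """Repeatability RSD (%) from blind duplicate pairs:
    s_r = sqrt(sum((a - b)^2) / (2n)); RSD_r = 100 * s_r / grand mean."""
    n = len(pairs)
    grand_mean = sum(a + b for a, b in pairs) / (2 * n)
    s_r = math.sqrt(sum((a - b) ** 2 for a, b in pairs) / (2 * n))
    return 100 * s_r / grand_mean

print(round(rsd_r_from_duplicates(pairs), 2))  # -> 4.13
```

A value computed this way would then be compared against the per-element duplicate criteria (10% for Cr, 7% for Se, 5% for Mo) noted above.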

The RSD_R of Method 2011.19 for each matrix was, on average, about half of the SMPR of 15%. HorRat values were similarly low, averaging 0.46 for Cr, 0.27 for Mo, and 0.32 for Se. In the authors' opinion, the RSD_R expected from this study is a function of how far above the instrument quantification limit the determination is made, not of the absolute level of the analyte. Methods with good sensitivity, good linearity over the calibration range, and adequate system suitability should be able to produce comparable reproducibility at the low-ppb level, and this appears to be supported by other SPIFAN MLT studies (publications in progress).
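For readers unfamiliar with the HorRat values quoted above, a minimal sketch of the calculation follows, using the standard Horwitz prediction PRSD_R = 2^(1 − 0.5·log10(C)) with C expressed as a mass fraction; the example concentration and observed RSD_R are hypothetical, not taken from Table 6.

```python
import math

def horrat(observed_rsd_R, mass_fraction):
    """HorRat = observed RSD_R (%) / Horwitz-predicted RSD_R (%),
    where PRSD_R = 2^(1 - 0.5 * log10(C)) and C is a mass fraction."""
    predicted = 2 ** (1 - 0.5 * math.log10(mass_fraction))
    return observed_rsd_R / predicted

# Hypothetical example: an analyte at 20 ug/kg (C = 2e-8) with an
# observed RSD_R of 8% yields a HorRat of about 0.28.
print(round(horrat(8.0, 2e-8), 2))  # -> 0.28
```

HorRat values well below 1, like the study averages of 0.27–0.46, indicate between-laboratory precision considerably better than the Horwitz prediction for the concentration level.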

The individual sample results submitted by each laboratory are given in Tables 7–9. Each value given is the mean of known duplicates, prepared per the method. The blinded duplicate results are then shown for each participating laboratory, for each matrix. The footnotes indicate which samples were rejected, either by the method's QC criteria or by the AOAC-supplied statistical package (5). The laboratories could have analyzed new samples and obtained data to replace the rejected results, but there was not enough time to do so, or perhaps they did not realize this was an option. Although there were five cases in which both blind duplicate samples were rejected (thus Table 6 records the number of laboratories as seven for that matrix), the footnotes indicate that retaining the data would have kept the RSD_R under 15% in all but one case. The data in Tables 7–9 also indicate why Laboratory 1's data were excluded entirely from the study: except for a few Mo results, its data were significantly lower than those of any other laboratory across the board. Laboratory 1 also stood out as having the most problems with sensitivity and linearity (Table 1), and contamination may have been an issue, given the number of (known) duplicate failures for Cr (Table 2). It may be a coincidence, but that laboratory was using the oldest ICP/MS instrument, a PerkinElmer ELAN DRC-e, which may not have had the capability to perform the required collision/reaction chemistry to eliminate low-mass interferences.

Comments about the performance of the method were requested. One laboratory pointed out that the 10% powder reconstitution in the method differed from the 11.1% reconstitution recommended by SPIFAN and proceeded to use the latter (it made no discernible difference). Another comment was that the DRC-e ICP/MS instrument model could not use ammonia gas for Se determination, which may be the reason for Laboratory 1's exclusion from this study.

Conclusions

AOAC Method 2011.19 was successfully studied collaboratively by eight laboratories using multiple ICP/MS instrument models and testing a variety of infant, pediatric, and adult nutritional matrixes. The method demonstrated acceptable repeatability and reproducibility and met the SPIFAN SMPRs for reproducibility for all seven matrixes analyzed.

Recommendation

The multilaboratory collaborative study data were summarized and presented to the AOAC ERP in September 2014. After reviewing the data, the AOAC ERP voted to move AOAC Method 2011.19 to Final Action status, and the method was approved by the AOAC Official Methods Board as a Final Action method (6).

Acknowledgments

The authors would like to thank the following collaborators and their associates:

Yue Fenpeng, Chinese Academy of Inspection and Quarantine (CAIQ), Beijing, China
Fan Xiang, Entry-Exit Inspection and Quarantine, Shanghai, China
Yue Zhang and Shuqi Zhang, Zhejiang Test Academy, Hangzhou City, China
Sudhakar Yadlapalli, First Source Laboratory Solutions, Hyderabad, India
Isabelle Malaviole, Laboratory Aquanal, Pessac, France
Ashutosh Mittal, Syngene International Ltd, Bangalore, India
Michael Gray, Mead Johnson, Evansville, IN
Marissa Feller, Covance Laboratories, Madison, WI
Diana Mould and Michael Farrow, U.S. Food and Drug Administration, Atlanta, GA

References

(1) Cubadda, F., Raggi, A., Testoni, A., & Zanasi, F. (2002) J. AOAC Int. 85, 113–121
(2) Sharpless, K.E., Thomas, J.B., Christopher, S.J., Greenberg, R.R., Sander, L.C., Shantz, M.M., Welch, M.J., & Wise, S.A. (2007) Anal. Bioanal. Chem. 389, 171–178. http://dx.doi.org/10.1007/s00216-007-1315-y
(3) Pacquette, L., Szabo, A., & Thompson, J. (2011) J. AOAC Int. 94, 1240–1252
(4) AOAC SMPR 2011.009 (2012) J. AOAC Int. 95, 297. http://dx.doi.org/10.5740/jaoac.int.11-0441
(5) AOAC Interlaboratory Study Workbook for Blind (Unpaired) Replicates (2013) Version 2.1, AOAC INTERNATIONAL, Rockville, MD
(6) Official Methods of Analysis (2012) 19th Ed., AOAC INTERNATIONAL, Rockville, MD, www.eoma.aoac.org, Method 2011.19
