Publication
How reliable is Ki-67 immunohistochemistry in grade 2 breast carcinomas? A QA study of the Swiss Working Group of Breast- and Gynecopathologists
Journal Paper/Review - May 25, 2012
Varga Zsuzsanna, Viale Giuseppe, Mastropasqua Mauro G, Wagner Urs, Tapia Coya, Singer Gad, Schreiber-Facklam Heide, Schobinger-Clement Sylviane, Sancho Oliver Sara, Rakozy Christiane, Padberg Barbara, Öhlschlegel Christian, Obermann Ellen, Noske Aurelia, Kaup Daniela, Frick Harald, Dommann-Scherrer Corina, Diebold Joachim, Lehr Hans-Anton
Brief description/objective
BACKGROUND
Adjuvant chemotherapy decisions in breast cancer are increasingly based on the pathologist's assessment of tumor proliferation. The Swiss Working Group of Gyneco- and Breast Pathologists therefore surveyed inter- and intra-observer consistency of the Ki-67-based proliferative fraction in breast carcinomas.
METHODS
Five pathologists evaluated the MIB-1 labeling index (LI) in ten breast carcinomas (G1, G2, G3) both by counting and by eyeballing. Fifteen pathologists from across Switzerland then assessed, in the same way, the MIB-1-LI of three G2 carcinomas in self-selected or pre-defined tumor areas, comparing centrally immunostained slides with slides immunostained in the different laboratories. To study intra-observer variability, the pathologists re-examined the same tumors 4 months later.
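For orientation, the MIB-1 (Ki-67) labeling index assessed here is the percentage of positively stained tumor nuclei among all tumor nuclei evaluated in the chosen area. The following minimal Python sketch, with hypothetical counts not taken from the study, illustrates that calculation.

```python
def mib1_labeling_index(positive_nuclei: int, total_nuclei: int) -> float:
    """MIB-1 (Ki-67) labeling index: percentage of positively stained tumor
    nuclei among all tumor nuclei counted in the evaluated area."""
    if total_nuclei == 0:
        raise ValueError("at least one nucleus must be counted")
    return 100.0 * positive_nuclei / total_nuclei

# Hypothetical counts from one evaluated tumor area.
print(mib1_labeling_index(positive_nuclei=184, total_nuclei=1000))  # 18.4 (% of nuclei)
```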
RESULTS
The Kappa values for the first series of ten carcinomas of varying degrees of differentiation showed good to very good agreement for the MIB-1-LI (Kappa 0.56-0.72). However, we found very high inter-observer variability (Kappa 0.04-0.14) in the read-outs of the G2 carcinomas. The inconsistencies could not be explained by any of the following factors alone: (i) pathologists' divergent definitions of what counts as a positive nucleus, (ii) the mode of assessment (counting vs. eyeballing), (iii) the immunostaining technique, and (iv) the selection of the tumor area in which to count. Even though all participating pathologists were confronted intensively with the problem, inter-observer agreement did not improve when the same slides were re-examined 4 months later (Kappa 0.01-0.04), and intra-observer agreement was likewise poor (Kappa 0.00-0.35).
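The Kappa values quoted above are chance-corrected agreement statistics. As a minimal illustration, the Python sketch below computes an unweighted Cohen's kappa for two raters from made-up categorical ratings (not data from the study); the multi-rater statistics reported in the paper generalize the same idea of comparing observed agreement with agreement expected by chance.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters assigning categorical labels
    (e.g., MIB-1-LI bins) to the same cases. Degenerate case p_e == 1 is ignored."""
    assert len(rater_a) == len(rater_b) and len(rater_a) > 0
    n = len(rater_a)
    # Observed agreement: fraction of cases on which both raters agree.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in set(rater_a) | set(rater_b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two pathologists binning ten tumors by Ki-67 labeling index.
pathologist_1 = ["low", "low", "mid", "mid", "high", "high", "mid", "low", "high", "mid"]
pathologist_2 = ["low", "mid", "mid", "low", "high", "high", "low", "low", "high", "high"]
print(round(cohen_kappa(pathologist_1, pathologist_2), 2))  # 0.41
```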
CONCLUSION
Assessment of mid-range Ki-67-LI suffers from high inter- and intra-observer variability. Oncologists should be aware of this caveat when using Ki-67-LI as a basis for treatment decisions in moderately differentiated breast carcinomas.