FINAL REPORT: Investigation of the Performance of Assays for Lyme Disease in Australia
NRL is: Certified by BSI for Quality Management AS/NZS ISO 9001:2008; World Health Organization (WHO) Collaborating Centre for Diagnostics and Laboratory Support for HIV and AIDS and Other Blood-borne Infections
Report prepared by Susan Best, 24 May 2017
EXECUTIVE SUMMARY

In June 2015, the Australian Government Department of Health commissioned the National
Serology Reference Laboratory Australia (NRL) to undertake a comparison of in vitro diagnostic
devices (IVD), in other words “tests” used for testing individuals for Lyme Disease (the Project).
NRL was selected for this activity because of its international reputation and accreditations, its
expertise in serology (including evaluating serological IVDs for regulatory purposes) and its
independence. The objectives of the Project were a) to evaluate the IVDs used to test Australian
individuals for Lyme disease both in Australian and overseas laboratories to the extent possible
within the resources available and b) to show whether Lyme disease testing performed by
Australian laboratories was of high quality. This project was designed to determine the ability of
IVDs to detect Borrelia burgdorferi sensu lato and not other Borrelia species.
Methods
Eight institutions were able to provide serum specimens of sufficient volume to the Project, four in
Australia and four overseas. In Australia, participating laboratories were Sullivan Nicolaides
Pathology (SNP), Pacific Laboratory Medicine Services at Royal North Shore Hospital (PaLMS),
Pathology (SNP), Pacific Laboratory Medicine Services at Royal North Shore Hospital (PaLMS),
Australian Biologics and the Australian Red Cross Blood Service (ARCBS). From overseas, the
laboratories that participated were the Rare and Imported Pathogen Laboratory (RIPL), Public
Health England (PHE), InfectoLab, Germany, Armin Labs, Germany and IGeneX Inc. USA.
In total 958 specimens were provided; 11 were of insufficient volume, leaving 947 for testing. Each
collaborator apart from ARCBS provided a range of specimen types and, along with the specimens,
included the results of testing in their own laboratory. According to the results assigned
by the collaborators, we received 249 positive, 308 clinical negative, 308 blood donor negative and
82 equivocal specimens. Clinical presentation and country of origin were not provided consistently.
All the specimens from ARCBS were considered negative as they were collected from individuals
without symptoms who had never travelled outside Australia.
All the specimens other than those from ARCBS were tested in all 10 IVDs. Of the 308 ARCBS
specimens, all were tested in the five immunoassays and 132 were also tested in the five
immunoblots.
The IVDs included in the Project were all commercial IVDs used by any collaborator to test
Australian specimens in the past several years: five immunoassays (usually used for first line
testing) and five immunoblots (classically used for confirmatory testing). Of the five immunoassays
and five immunoblots, three and one respectively were included on the Australian Register of
Therapeutic Goods (ARTG). InfectoLab, Armin Labs and Australian Biologics used only
immunoblots for serology testing. IGeneX Inc. used in-house IVDs; the Project specimens were not
tested in these IVDs, except for the specimens provided by IGeneX itself. All the laboratories used
the IVDs according to the manufacturers’ instructions except for Australian Biologics, which
modified the procedure for the IVD it uses. A selection of the Project specimens was tested using
the modified procedure.
Multiple approaches were used to analyse the data.
Sensitivity
The 100 specimens provided by PHE were considered the known positive specimen panel in the
Project. The results at PHE had been generated from a two-tier algorithm and clinical history was
provided that was consistent with Lyme disease.
In addition, a group of 95 specimens was selected as presumed antibody status (PAS) positive,
based on the specimens being positive in seven of the 10 IVDs in the Project.
Specificity
The 308 specimens provided by Australian blood donors formed the known negative specimen
panel in the Project. In addition, a group of 405 specimens was selected as PAS negative based on
the specimens being negative in seven of the 10 IVDs in the Project.
Results

Sensitivity
The sensitivities of the immunoassays ranged from 77% [95% confidence interval (CI) 68–85] to
95% (95%CI 88–98) in the known positive specimen panel and from 73% to 100% in the PAS
positive specimen panel. There was no statistically significant difference between the sensitivities
in the immunoassays. Nevertheless, the positive delta (δ) statistic calculated for these assays does
demonstrate that some of the immunoassays are more likely to give false negative results than
others. (The δ statistic measures how far a population's results lie from the threshold above which a
result is considered positive in an immunoassay. It can be calculated for both positive and negative
specimen populations; a “good” δ statistic is greater than 3.)
The sensitivities of the immunoblots ranged from 33% to 99% in the known positive specimen
panel and from 66% to 100% in the PAS positive specimen panel. Statistically significant poorer
sensitivity was shown by the Trinity Biotech immunoblot in the specimens used in the Project.
Removing the Trinity Biotech immunoblot, the next lowest sensitivities were 77% and 87% in the
known positive and the PAS positive panels respectively. The sensitivities of the other
immunoblots were not statistically different from each other.
Specificity
The specificities of all the IVDs were generally better than the sensitivities. In the immunoassays,
specificities in the known negative and PAS negative specimen panels ranged from 87.7% (95% CI
83–91) to 99.7% (95% CI 98–100) and from 79.5% (95% CI 75–83) to 98.0% (95% CI 96–99)
respectively. There were no statistically significant differences between the specificity estimates.
However, similar to the results from the positive specimen panels, the negative δ statistic showed
that two of the immunoassays were more likely to give false positive results, the most prone to
false positivity being the Immunetics C6 ELISA.
In the immunoblots, the specificities were all greater than 90%.
Discussion

All but two of the evaluated IVDs used predominantly recombinant proteins or peptides to bind
the antibodies contained in the specimens. Using recombinant proteins and peptides generally
leads to improved sensitivity and specificity of IVDs. The two IVDs that used native proteins
showed the lowest sensitivity in the Project specimens.
The immunoassays examined in the Project showed various propensities for false positive and
false negative results. Compared with immunoassays used for screening of organ, tissue and
blood, with which NRL has extensive experience, those for Lyme disease appear under-developed.
This is borne out by their generally low δ statistics which predict the immunoassays’ propensity for
false results.
Immunoblots separate bacterial or viral proteins, which allows reactivity in a specimen to each
protein to be visualised. The Borrelia immunoblots required assessment of reactivity to up to 16
proteins for each specimen. Of the five immunoblots, two were read by scanning instruments and
three were read by eye. Reading by eye is subjective, so in accredited laboratories, the reading is
undertaken by two individuals independently, following which their scores are collated. When
scores differ, the readers consult each other and agree on the scores; sometimes a third reader is
consulted when agreement cannot be reached. In the Project, of 2313 immunoblots that were read
by eye, approximately 10% required consultation of a third reader. Such checking is not
uncommon when the result is not clear.
Manufacturers of immunoblots specify the criteria that laboratories must use to interpret reactivity.
With the Borrelia immunoblots, negative, equivocal/borderline and positive interpretations differ
only by the presence of one band. For example, reactivity to no or one protein is considered
negative, reactivity to two proteins, equivocal/borderline and reactivity to three or more proteins
positive. This, coupled with the subjectivity of the reading could easily lead to specimens being
misinterpreted, e.g., a negative result being reported equivocal or an equivocal result being
reported positive or the reverse, i.e., positive results being reported as equivocal and equivocal
being reported negative.
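As a minimal sketch, the band-counting rule above can be expressed in code. The thresholds (0–1 bands negative, 2 equivocal, 3 or more positive) are the report's worked example only; each manufacturer specifies its own interpretive criteria.

```python
# Sketch of the band-counting interpretation described above.
# The thresholds are the report's illustrative example, not any
# particular manufacturer's criteria.

def interpret_immunoblot(reactive_bands: int) -> str:
    """Classify an IgG immunoblot by its number of reactive protein bands."""
    if reactive_bands <= 1:
        return "negative"
    if reactive_bands == 2:
        return "equivocal"
    return "positive"
```

Note that one mis-read band shifts the call by a whole category, which is precisely the fragility described above when reading is subjective.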
Reading and interpretation of immunoblots requires significant expertise and experience. This is
the reason that in classical serology, immunoblots are usually used only in specimens that have
shown reactivity in an immunoassay.
Of the five immunoassays used in the Project, three were available through distributors in Australia
who included them on the Australian Register of Therapeutic Goods (ARTG). Of the five
immunoblots only two were available through distributors; two of the other immunoblots are used
by Australian laboratories. In these cases, each of two laboratories has become the Sponsor of an
immunoblot and included it on the ARTG. This incurs an initial application fee and an annual
fee to maintain the IVD on the ARTG. From July 1 2017, should a laboratory modify a
manufacturer’s instructions for use (IFU) for an IVD, the laboratory is required to include the
modification to the IFU on the ARTG as an in-house test and must be NATA accredited.
One laboratory in Australia modifies the manufacturer’s IFU for the IVD used. In the Project we
tested 165 specimens using the modified protocol used by this particular laboratory (87 Australian
blood donor specimens, 54 collaborator negative specimens and 24 collaborator positive
specimens that had given negative or borderline results using the instructions contained in the
manufacturer’s IFU). Using the modified protocol we found that 21 (15%) of the 141 negative
specimens became either borderline (20) or positive (1). Similarly, as might be expected, 15 of the
24 collaborator positive specimens became positive using the modified protocol.
Recommendations

IVDs using native proteins should be avoided, or used with recognition that their sensitivity may
not be optimal.
To allow access to a wider range of immunoblots in Australia, without requiring individual
laboratories to be the Sponsors, the establishment of a national reference laboratory could be
considered. This laboratory could be responsible for evaluating Borrelia IVDs, whether or not
they are included on the ARTG. The laboratory could also be responsible nationally for
confirmatory testing. Such a laboratory could be an established medical testing laboratory with
experience in Lyme disease testing or a laboratory with experience in providing a quality
assurance program service.
Confirmatory immunoblots should be read using scanning software to limit the inconsistency
caused by subjective reading by eye.
Conclusions

Proven positive samples (from classical Lyme disease patients in endemic countries) tested
positive on the screening immunoassay more than 78% of the time, and proven negative control
samples (from healthy non-exposed blood donors) tested negative over 88% of the time.
There was also reasonable test to test correlation (a true positive on one test was generally
positive on another test).
Whilst the tests are relatively under-developed, results reported by NATA accredited
laboratories in Australia were consistent with those of other laboratories and tests
internationally and there is confidence that active infections with Borrelia burgdorferi are
appropriately detected or, alternatively, excluded using these tests in Australia more than 80%
of the time.
Greater diagnostic accuracy can be achieved by the addition of immunoblot assays but there is
significant subjectivity in their interpretation. This could be ameliorated by implementation of
national requirements that immunoblots only be used as the second test in a two-tier algorithm.
Including the use of blot reading equipment would also reduce variability.
From the Project’s results, there is nothing to suggest that testing performed by NATA/RCPA
accredited medical testing laboratories in Australia is not of good quality.
Acknowledgements

The NRL would like to thank all the collaborators for their willingness to provide specimens and
their open collaboration.
This project was funded by the Australian Government Department of Health as an outcome of the
Chief Medical Officer’s Clinical Advisory Committee.
1 INTRODUCTION
In June 2015, the National Serology Reference Laboratory, Australia (NRL) was awarded a
contract by the Australian Government Department of Health for the project entitled “Investigation
of the Performance of Assays for Lyme Disease in Australia” (Project). The project was planned to
involve collaboration by multiple laboratories both in Australia and internationally, and for up to 700
specimens to be collected, and tested in each of the different serology IVDs used by the
collaborators. This document is the final report of the Project.
NRL’s credentials relevant to the Project are that it is a globally recognised expert in the pre-
market evaluation and quality assurance of serology and molecular tests and testing for infectious
diseases. During the period in Australia when HIV and hepatitis C tests underwent laboratory
performance evaluation before the test could be sold in Australia, NRL was the laboratory that
conducted those performance evaluations on behalf of the Australian Government. Today, NRL
conducts similar performance evaluations on behalf of the World Health Organization, of which it is
a Collaborating Centre. NRL’s main experience with serology and molecular testing is with those
viral infectious diseases that are screened for in organ, tissue and blood donors. The tests for
these viruses are highly developed and the instructions for their use must be adhered to exactly.
We brought the same level of stringency to the testing for this project as we observe in our routine
testing of specimens from tissue and blood donors, which is conducted using the same type of
technology as the tests used for the Project.
NRL also provides proficiency testing and quality control programmes to laboratories globally that
test for infectious diseases. These programmes involve providing the participating laboratories with
standardised specimens that are designed to assist the laboratories in assuring that the test results
they report are accurate and reproducible.
NRL is:

- Accredited by NATA as a Medical Testing Laboratory compliant with ISO 15189:2012 and as a Proficiency Testing Scheme Provider compliant with ISO 17043:2010;
- Certified by TGA as compliant with the Australian Code of Good Manufacturing Practice for human blood and blood components, human tissues and human cellular therapy products: 2013; and
- Certified by BSI as compliant with ISO 9001:2008.
NRL’s independence was another key consideration in its selection as the laboratory to undertake
this project. It is recognised as an independent laboratory in Australia. Further, NRL does not
perform testing for Lyme disease routinely and therefore is not conflicted in the conduct or the
outcome of the Project.
2 COLLABORATORS
Ten laboratories agreed to collaborate in the Project, six in Australia, one in the UK, two in
Germany and one in the United States. The laboratories are shown in Table 1.
Table 1: Laboratories collaborating in the Project
Australia:
1. Sullivan Nicolaides Pathology (SNP)
2. Pacific Laboratory Medicine Services, Royal North Shore Hospital (PaLMS)
3. Australian Biologics
4. Institute of Clinical Pathology and Medical Research (ICPMR)
5. Australian Rickettsial Reference Laboratory (ARRL)
6. Australian Red Cross Blood Service (ARCBS)

UK:
7. Rare and Imported Pathogens Laboratory, Public Health England (PHE)

Germany:
8. Infectolab
9. ArminLabs

US:
10. IGeneX
3 Specimens
All the laboratories shown in Table 1, except for ICPMR and ARRL contributed specimens to the
Project. The Project had strict specifications for the specimens to be included, and neither ICPMR
nor ARRL could provide specimens with the serum volume required. Reference laboratories
usually receive minimal volume for testing: it is usual practice in Australia not to collect more
blood from an individual than is needed, so if a specimen then has to be referred for reference
testing, the volume remaining is often minimal, particularly following testing on modern
instruments, which require significant volume. ICPMR and ARRL
provided comment and advice as necessary, especially on the protocol as it was being developed.
The number and type of specimens provided by each collaborator are shown in Table 2. Nine
hundred and fifty-eight specimens were received.
Table 2: Number and type of specimens provided by collaborators
Laboratory | POS | CLINICAL NEG | DONOR NEG | EQU/IND
Sullivan Nicolaides Pathology | 4 | 53 | 0 | 0
Pacific Laboratory Medicine Services, Royal North Shore Hospital | 2 | 7 | 0 | 9
Australian Biologics | 5 | 58 | 0 | 14
Institute of Clinical Pathology and Medical Research (ICPMR) | 0 | 0 | 0 | 0
Australian Rickettsial Reference Laboratory (ARRL) | 0 | 0 | 0 | 0
Rare and Imported Pathogens Laboratory, Public Health England | 100 | 100 | 0 | 50
Infectolab | 56 | 33 | 0 | 0
ArminLabs | 57 | 31 | 0 | 0
IGeneX | 25 | 26 | 0 | 9
Australian Red Cross Blood Service (ARCBS) | 0 | 0 | 308 | 0
Apart from the ARCBS specimens, all the specimens from participant laboratories provided for the
Project were retrieved from the collaborators’ banks of archived specimens. All specimens had
been stored at –20 °C or lower. Collaborators were asked to provide a minimum of 800 microlitres
of serum so that a complete range of results could be obtained for all specimens. This requirement
was met for all but a very small number of specimens, which were removed from the specimen
panel.
Ethics approval was obtained for the prospective collection of specimens from the Australian blood
donors. The specimens were collected from donors in Tasmania who had never travelled outside
Australia.
3.1 Assigning overall result status to specimens
A range of results was received on the specimens from different collaborators. Two collaborators
provided results on a single IgG test only; one provided sometimes two results (immunoassay and
immunoblot), sometimes one (immunoblot); and the remainder provided results in two or more tests. For
the purposes of Table 2, “POS” was assigned to collaborators’ specimens when their single test or
all their IgG tests were positive; “NEG” when their single test or results interpreted using the 2-tier
algorithm were negative; and “EQU/IND” when this was the status assigned by the collaborator or
when this was the interpreted result of a single immunoassay or immunoblot.
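The assignment rule above can be sketched as follows; the data shapes and result codes are illustrative assumptions, since the report does not specify a data format.

```python
from typing import Optional

def assign_status(igg_results: list[str], two_tier: Optional[str] = None) -> str:
    """Assign an overall status from a collaborator's reported results.

    "POS"     when the single test, or all IgG tests, were positive;
    "NEG"     when the single test or the 2-tier interpretation was negative;
    "EQU/IND" otherwise (e.g. an equivocal single-test result).
    """
    if igg_results and all(r == "positive" for r in igg_results):
        return "POS"
    if two_tier == "negative" or igg_results == ["negative"]:
        return "NEG"
    return "EQU/IND"
```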
The purpose of assigning the overall result status to specimens was to:

- Infer the result a referring clinician would have received. This was a valid exercise because three of the four laboratories using two or more tests had provided their algorithms for reporting, and the fourth laboratory provided its overall interpretation. In laboratories that only used one test, the result of that test was assigned as the status.
- Compare the reactivity in all the IVDs against the collaborator's interpretation.
One collaborator used in-house tests.
The country of origin of the collaborator specimens was provided in only approximately 25% of
cases.
The specimens collected from donors at the ARCBS had not been tested for Lyme disease
previously but, for the purposes of assigning a presumed result status they were classified as
“NEG”.
4 IVDs included in the Project
One objective of the Project was to test all the specimens from all the collaborators (except
ARCBS) on all of the IVDs that may have been used to test Australian specimens. This is because
some specimens from Australian individuals are known to be sent to laboratories overseas for
Lyme disease testing; therefore the Project included IVDs used by the overseas collaborators.
Further, the Project sought to determine the sensitivity and specificity of the IVDs used to test
Australian specimens. This was examined by including a significant number of specimens from
Lyme endemic areas that had been reported positive, that had consistent medical history in many
cases and that had given positive results on all the serological tests used in the collaborating
laboratory. The specificity was examined using 308 specimens collected from Tasmanian blood
donors who had never travelled outside Australia.
The IVDs used in the Project are shown in Table 3
Table 3: IVDs used in the Project and their usage by collaborator
IVD NAME | CURRENT USAGE BY COLLABORATORS | PREVIOUS USAGE BY COLLABORATORS
NovaTec NovaLisa Borrelia burgdorferi IgG-ELISA (Novatec Novalisa) | ICPMR Westmead; Australian Rickettsial Reference Laboratory (ARRL) | Previously used by PaLMS
DiaSorin LIAISON Borrelia IgG Chemiluminescent immunoassay (DiaSorin Liaison CLIA) | SNP; PaLMS |
Trinity Biotech B. burgdorferi ELISA (IgG) Test System (Trinity Biotech ELISA) | IGeneX Inc.(a) | Previously used by SNP and ICPMR Westmead
EUROIMMUN Anti-Borrelia Select ELISA (IgG) (Quantitative) (Euroimmun ELISA) | Infectolab |
Immunetics C6 Lyme ELISA (Immunetics C6 ELISA) | Public Health England (PHE); IGeneX Inc.(a) |
Viramed Borrelia ViraStripe IgG Immunoblot (Viramed ViraStripe IB) | PHE; ARRL |
EUROIMMUN Anti-Borrelia EUROLINE-RN-AT (IgG) Immunoblot (Euroimmun Euroline IB) | SNP; PaLMS; Infectolab | Previously used by Armin Labs
Trinity Biotech EU Lyme + VIsE IgG Western Blot (Trinity Biotech IB) | | Previously used by PaLMS
Mikrogen recomLine Borrelia IgG Immunoblot (Mikrogen recomLine IB) | Australian Biologics Testing Services |
Seramun SeraSpot Anti-Borrelia-10 IgG(b) (Seramun SeraSpot) | Armin Labs |
IGeneX IFA (in house) | IGeneX Inc. |
IGeneX IgG Western blot (in house) | IGeneX Inc. |
B. burgdorferi IgG in house Western blot | | Previously used by ICPMR
B. afzelli IgG in house Western blot | | Previously used by ICPMR
(a) Although IGeneX indicated by questionnaire that they used the Trinity Biotech ELISA and the Immunetics C6 Lyme ELISA, results from these tests were not provided with the specimens provided for the Project.
(b) Even though this IVD is presented like an EIA in a 96-well plate, it gives similar information to an immunoblot, each well containing 10 Borrelia antigen dots plus control dots. For this Project this IVD is considered an immunoblot.
The information in Table 3 was obtained at the beginning of the Project and, for those laboratories
that did not contribute specimens, the IVDs may have changed in the intervening time. The
information provided for laboratories that provided specimens is current at the time of writing.
Some IVDs that have since ceased to be used in Australia were included in the Project. As the
Project seeks to show whether Australian testing for Lyme disease is of good quality, it was
considered important that results for tests used in the past several years were able to be
evaluated.
5 Testing
All testing was performed according to the test kit manufacturers’ instructions for use (IFU).
Specimens that gave grey zone/equivocal/borderline results in either the DiaSorin Liaison CLIA or
the Immunetics C6 Lyme ELISA were retested once as instructed by the IFU. The second result
was the one recorded. The other immunoassays did not recommend retesting on the same
specimen.
Of the five immunoblots, two used a dedicated scanner and software for reading and interpreting
the results. The dedicated scanner for the Seramun SeraSpot Anti-Borrelia-10 IgG was rented for
the Project period. The remaining three immunoblots were read by eye. Consistent with NRL’s
routine quality management procedures, these immunoblots were read independently by two
individuals; any differences were examined and a decision reached, consulting a third reader on
some occasions.
5.1 Result interpretation
The Trinity Biotech IB gave two options for result interpretation: “Interpretive Criteria for Europe
excluding FDR Germany” and “Interpretive Criteria for FDR Germany”. The interpretive criteria for
Germany were less stringent.
5.2 Testing of non-ARCBS specimens
Of the three Australian laboratories that provided clinical archived specimens for the Project, two
test and report results for Lyme disease using a two-tier algorithm. This consists of a screening
immunoassay followed by an immunoblot in specimens that are reactive. However, this algorithm
was not followed for the Project testing; rather, all the non-ARCBS specimens were tested on all
the IVDs. The reason for this was two-fold. First, at least one Australian laboratory and some
collaborator laboratories use an immunoblot as the first, and in some cases the only test. Second,
the IFUs for the immunoblots do not all preclude using the test as the first test. Therefore, it was
considered beneficial to discover how the immunoblots would perform if they were used as a first
and/or only test.
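The two-tier algorithm described above can be sketched as follows; the result codes are illustrative assumptions, not the laboratories' own reporting vocabulary.

```python
from typing import Optional

# Minimal sketch of a two-tier serology algorithm: a screening
# immunoassay, followed by an immunoblot only when the screen reacts.

def two_tier(immunoassay: str, immunoblot: Optional[str] = None) -> str:
    """Report the two-tier result for one specimen."""
    if immunoassay == "negative":
        return "negative"  # non-reactive screen: no immunoblot performed
    if immunoblot is None:
        return "pending immunoblot"  # reactive screen awaiting confirmation
    return immunoblot  # the final interpretation follows the immunoblot
```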
Australian Biologics uses only the Mikrogen recomLine Borrelia IgG immunoblot for serology
testing. Australian Biologics modifies the manufacturer’s IFU by extending the incubation period of
immunoblot strips and patient specimen from one hour to overnight. Australian Biologics indicated
that they had validated the extended incubation time. For the Project, the manufacturer’s IFU were
used to perform testing with the Mikrogen recomLine immunoblot. In addition, 165 specimens were
selected for testing using the overnight incubation protocol of the Mikrogen recomLine, which was
provided by Australian Biologics. The specimens selected included:
- 87 collected from Australian blood donors (low risk negative population) that were reactive at least once on one or more IVDs;
- 24 from the known positive population that were found negative or “borderline” on the Mikrogen recomLine using the manufacturer's IFU; and
- 54 from the presumed antibody status negative population (“high risk” negative population: European individuals who had been tested for Lyme disease).
5.3 Testing of ARCBS specimens
All 308 ARCBS specimens were tested on all the EIAs/immunoassays. In addition, 132 were also
tested in all the immunoblots. The 132 consisted of:
- 82 specimens that had given reactivity at least once in any of the EIAs/immunoassays; and
- an additional 50 specimens randomly selected from the remainder.
6 DATA ANALYSIS
The results were separated into two working datasets, one that included the collaborators’
specimens other than those from ARCBS (non-ARCBS) and another that included the ARCBS
specimens (ARCBS). When calculating sensitivity or specificity, equivocal results were considered
negative when estimating sensitivity and positive when estimating specificity. Using this method,
conservative estimates of sensitivity and specificity were calculated. The data were analysed using
several different approaches. For the purpose of analysis the Interpretive Criteria for Europe
excluding FDR Germany were used for the Trinity Biotech IB.
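The conservative counting rule above can be sketched as follows, assuming results are coded as "positive", "negative" or "equivocal".

```python
# Equivocal results count against the assay in both directions: as a
# missed positive when estimating sensitivity, and as a false positive
# when estimating specificity.

def sensitivity(known_positive_results: list[str]) -> float:
    """Fraction of known-positive specimens called positive."""
    return sum(r == "positive" for r in known_positive_results) / len(known_positive_results)

def specificity(known_negative_results: list[str]) -> float:
    """Fraction of known-negative specimens called negative."""
    return sum(r == "negative" for r in known_negative_results) / len(known_negative_results)
```

Each equivocal result therefore lowers whichever estimate it appears in, which is what makes these estimates conservative.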
6.1 Sensitivity in known positive specimens
The specimens’ results used for this analysis were the 100 specimens with Positive status
provided by Public Health England. These were chosen because:
- The specimens were collected from individuals with symptoms from Lyme prevalent areas;
- Credible clinical indication of Lyme was provided for each, many including tick bite and/or erythema migrans; and
- The Positive status was the result of a 2-tier algorithm.
PHE also provided IgM data on these specimens which, if positive, would most likely indicate that
the infection was early. As a separate analysis, the Project data were grouped according to
whether the IgM result was positive or negative to determine whether the sensitivity in the Project
IVDs was any different in the IgM positive specimens.
6.2 Sensitivity in presumed antibody status positive specimens
The results from non-ARCBS specimens were used for this analysis. The specimens from PHE
were removed, given that they formed the basis of the analysis of known positive specimens.
Within this panel, a specimen was allocated presumed antibody status (PAS) positive if seven of
ten IVDs in the Project were positive. Equivocal results were considered negative for allocation of
PAS positive status.
6.3 Specificity in known negative specimens
The ARCBS specimens’ results were used for this analysis. All specimens were considered
negative given the history of no travel outside Australia and the presumption that the donors were
symptom free given that they were donating blood.
6.4 Specificity in presumed antibody status negative specimens
The intention of this analysis was to estimate the specificity of the Project IVDs in specimens from
individuals who had been tested for Lyme disease but who were found to be negative. This
specimen population differs from the ARCBS specimens because the individuals had symptoms. It
could be considered that the ARCBS specimens were a low risk negative population while the non-
ARCBS specimens could be considered high risk negative.
The results from non-ARCBS specimens were used for this analysis. Within this panel, a specimen
was allocated PAS negative if seven of the 10 IVDs in the Project were negative; equivocal results
were considered positive for allocation of PAS negative status.
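Sections 6.2 and 6.4 together define a symmetric allocation rule, sketched below. The "seven of the 10 IVDs" criterion is read here as at least seven, and the result coding is an assumption.

```python
def pas_status(results: list[str], threshold: int = 7) -> str:
    """Allocate presumed antibody status (PAS) across the 10 Project IVDs.

    PAS positive: >= threshold positive results (equivocal counts as
    negative for this allocation, so only outright positives are tallied).
    PAS negative: >= threshold negative results (equivocal counts as
    positive for this allocation, so only outright negatives are tallied).
    """
    if sum(r == "positive" for r in results) >= threshold:
        return "PAS positive"
    if sum(r == "negative" for r in results) >= threshold:
        return "PAS negative"
    return "unallocated"
```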
6.5 Delta values
Immunoassays are based on a series of steps, the last of which is a colour reaction. The intensity
of the colour is read by a machine; specimens that are reactive give a coloured result and
specimens that are negative are colourless. A number of controls are provided by the test
manufacturer, which must be included every time the IVD is used. At the end of the assay, the colour
for each specimen is compared with the colour in the cut-off control. Each specimen's result is
expressed as a sample to cut-off ratio (S/CO). The delta (δ) of an immunoassay is a measure of
the distance of the mean S/CO ratio of a positive or negative specimen population from the cut-off
of the assay, measured in standard deviations. In a normally distributed population, approximately
68% of the results are expected to fall within one SD of the mean, 95% within 2 SD and 99.7%
within 3 SD. Therefore, ideally, an assay's δ value will be ≥3 to ensure that all the results in a
population are sufficiently distant from the cut-off of the assay. Conversely, a δ value <3 can show
the propensity of an IVD to give false results.
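Since every result is expressed as an S/CO ratio, the cut-off itself sits at S/CO = 1, and the δ value described above can be sketched as:

```python
from statistics import mean, stdev

def delta_value(s_co_ratios: list[float]) -> float:
    """Distance of a population's mean S/CO ratio from the cut-off
    (S/CO = 1), measured in standard deviations of that population."""
    return abs(mean(s_co_ratios) - 1.0) / stdev(s_co_ratios)

# A strongly reactive population well clear of the cut-off:
# delta_value([4.0, 5.0, 6.0]) -> 4.0 (mean 5.0, SD 1.0)
```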
6.6 IVDs’ results compared with status assigned by collaborator
All the non-ARCBS specimens were accompanied by an interpretation of the results from the
relevant collaborator. For each collaborator, specimens with results interpreted by them as positive
or negative were grouped with the results on the other IVDs.
7 RESULTS
There was sufficient volume in 947 specimens for testing in this Project. Three hundred and eight
were contributed by Tasmanian blood donors, 152 in total from Australian clinical laboratories and
487 in total from the four laboratories outside Australia.
7.1 IVD instructions for use
Of the five immunoassays included in the Project:

- all had criteria for grey zone/equivocal/borderline result interpretation;
- two suggested repeat testing on the same specimen, while the remainder suggested repeat testing in a number of weeks; and
- only one recommended that an equivocal or reactive result be confirmed by immunoblot.

Of the five immunoblots included in the Project:

- only three stated that the IVD should be used only on specimens with reactive immunoassay results;
- four of five had criteria for indeterminate/equivocal/borderline result interpretation; and
- one had different interpretation criteria for “Germany” and “Europe excluding Germany”; those for Germany were less stringent.
7.2 Sensitivity in positive specimens
7.2.1 Known positive specimen panel
There were 100 specimens in the known positive specimen panel. Of these 56 were reported by
PHE as IgM immunoblot negative and 44 IgM immunoblot positive.
[Figure 1 chart: "Reactivity in known positive specimens compared by IgM status"; x-axis: number of IgG assays in which specimens were positive (3-10); y-axis: number of known positive samples (0-18); series: IgM+, IgM-]
Figure 1: The number of Project assays in which IgM positive and negative specimens from the known positive panel were positive
Figure 1 shows the numbers of IVDs that were reactive in specimens that were IgM negative and
positive. Five of the IgM positive specimens were only positive in 3 or 4 IVDs. Otherwise, there was
no difference in the number of IVDs positive, irrespective of IgM status.
Also examined were the S/CO results in the Immunetics C6 ELISA in IgM positive and negative
specimens and with history of tick bite. Scatter diagrams of the results are shown in Figures 2 and
3. These diagrams suggest that neither IgM status nor history of tick bite correlate with the strength
of reactivity in the immunoassay.
[Figure 2 chart: S/CO (0-12) by specimen ID; series: IgM POS, IgM NEG]
Figure 2: S/CO ratios in the Immunetics C6 ELISA in 44 IgM positive and 56 IgM negative specimens from the known positive specimen panel
[Figure 3 chart: "C6 ELISA results by history of tick bite"; C6 S/CO (0-12) by specimen ID]
Figure 3: S/CO results in the Immunetics C6 ELISA in specimens from the known positive specimen panel with history of tick bite
Table 4 shows the sensitivity in both known positive and PAS positive specimens. The selection of
the specimens that were analysed for each of these subsets is described in Methods. Also
included in Table 4 are the δ values for those IVDs for which it can be calculated, the 95%
confidence intervals around the sensitivity estimates and the percentage equivocal results obtained
using each IVD.
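The report does not state which confidence interval method was used. As one plausible reconstruction, the Wilson score interval reproduces the Table 4 intervals; the example below uses an assay detecting 80 of 100 known positive specimens.

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion,
    e.g. sensitivity = true positives / known positives."""
    p = successes / n
    centre = p + z * z / (2 * n)
    margin = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    denom = 1 + z * z / n
    return (centre - margin) / denom, (centre + margin) / denom

lo, hi = wilson_ci(80, 100)  # sensitivity of 80% in 100 specimens
print(f"{lo:.0%}-{hi:.0%}")
```

This yields 71%-87%, matching the 80%-sensitivity row in Table 4, although the exact method used in the Project is not documented.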
The estimate of sensitivity in the immunoassays ranged from 78% to 95% (Table 4). (This excludes
the Immunetics C6 ELISA which was used to select the specimens). The δ values were 0.97
(Euroimmun ELISA), 1.2 (Trinity Biotech ELISA), 1.9 (Novatec Novalisa) and 4.03 (Immunetics C6
ELISA). It was not possible to calculate the δ value for the DiaSorin Liaison CLIA because of the
way the results are reported from the instrument. The Trinity Biotech ELISA and the Euroimmun
ELISA gave seven and eight equivocal results respectively.
The estimates of sensitivity in the immunoblots ranged from 33% to 99%. The Viramed ViraStripe
immunoblot, which PHE had used in selecting the specimens, was only 89% sensitive in our
hands, when the equivocal results were considered negative. However, if the equivocal results
were considered positive, the sensitivity increased to 98%. The PHE and NRL results in the 9
specimens that gave equivocal results were compared and in every case, PHE had considered
one additional band as having reactivity equal to or greater than the cut-off control band and NRL
had considered the reactivity less than the cut-off control band.
The Trinity Biotech IB showed very poor sensitivity of 33% when equivocal results were considered
negative and 39% when considered positive. The results were interpreted using the criteria for
Europe excluding Germany. If the results were interpreted using the criteria for Germany, the
sensitivity increased to 40% interpreting equivocal results as negative and to 72% if equivocal
results were considered positive. Irrespective of the interpretation criteria, the sensitivity of the
assay remained poor and the number of equivocal results excessive.
7.2.2 PAS positive specimen panel
Of the starting dataset of 539 non-ARCBS specimens, following removal of the PHE known
positive specimens, there were 95 specimens in which seven of the 10 Project IVDs were positive.
These formed the PAS positive specimen panel. The specimens were contributed by Armin Labs
(43), Australian Biologics (1), Infectolab (34), IGeneX (14), PaLMS (2) and SNP (1).
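The PAS allocation rule as described (at least seven of the 10 Project IVDs agreeing) can be sketched as below. How equivocal results were counted is an assumption here, as the report does not specify.

```python
def pas_status(results, threshold=7):
    """Assign presumed antibody status from a dict of per-IVD results
    ('pos', 'neg' or 'eqv'); equivocals count toward neither tally
    (an assumption -- the report does not specify this detail)."""
    pos = sum(1 for r in results.values() if r == "pos")
    neg = sum(1 for r in results.values() if r == "neg")
    if pos >= threshold:
        return "PAS positive"
    if neg >= threshold:
        return "PAS negative"
    return "unassigned"

# Hypothetical specimen: positive in 7 of 10 IVDs
example = {f"ivd{i}": "pos" for i in range(7)} | {f"ivd{i}": "neg" for i in range(7, 10)}
print(pas_status(example))
```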
In this panel the estimates of sensitivity in most of the Project IVDs improved over that seen with
the known positive specimen panel. In the immunoassays the sensitivities ranged from 73% to
100%. Two of the five immunoassays gave sensitivities of 100% and four of the five gave
sensitivities ≥ 94% (Table 4).
Similarly, better performance in immunoblots was implied in the PAS positive specimen panel for
three of the five IVDs. In the Trinity Biotech IB the sensitivity in the PAS positive specimen panel
was 66% compared with 33% in the known positive panel; in the Mikrogen recomLine 92%
compared with 76% and in the Seramun SeraSpot, 100% compared with 87%. Sensitivities in the
other two immunoblots were comparable irrespective of specimen panel.
Significant numbers of equivocal/borderline results were recorded for all the immunoblots that were
read by eye. If these results were considered positive for analysis, sensitivity estimates in these
IVDs improved accordingly.
Table 4: The sensitivity, 95% confidence intervals (CI) and delta values of the Project IVDs in a known positive and a presumed antibody status (PAS) positive specimen panel. No. of specimens: 100 (known positive panel), 95 (PAS positive panel). Sensitivities treat equivocal results as negative.

| IVD | Known positive: sensitivity % | 95% CI | δ value | PAS positive: sensitivity % | 95% CI | Equivocal % (known positive) | Equivocal % (PAS positive) |
|---|---|---|---|---|---|---|---|
| Novatec Novalisa | 94 | 87–98 | 1.91 | 100 | 95–100 | 1 | 0 |
| DiaSorin Liaison CLIA | 95 | 88–98 | N/A | 99 | 93–100 | 2 | 1 |
| Trinity Biotech ELISA | 80 | 71–87 | 1.2 | 73 | 62–81 | 7 | 4 |
| Euroimmun ELISA | 78 | 68–85 | 0.97 | 94 | 86–97 | 8 | 2 |
| Immunetics C6 IgG ELISA | 100* | 95–100 | 4.03 | 100 | 95–100 | 0 | 0 |
| Viramed IgG IB | 89* | 81–94 | N/A | 87 | 79–93 | 9 | 5 |
| Euroimmun Euroline IB | 99 | 94–100 | N/A | 100 | 95–100 | N/A | N/A |
| Trinity Biotech IB # | 33 | 24–43 | N/A | 66 | 56–75 | 6 | 15 |
| Mikrogen IgG IB | 77 | 67–85 | N/A | 93 | 85–97 | 14 | 4 |
| Seramun SeraSpot | 87 | 78–93 | N/A | 100 | 95–100 | 5 | 0 |

* The specimens were selected by PHE based on screening reactivity in the Immunetics C6 ELISA followed by a positive Viramed IgG immunoblot.
# Interpretive criteria for Europe excluding Germany.
7.3 Specificity in known negative specimens
7.3.1 Known negative specimen panel
Of the 308 Australian blood donor specimens, 87 showed initial reactivity in one or more IVDs.
The IFU for the DiaSorin Liaison CLIA and Immunetics C6 IVDs recommended that equivocal
results be retested on the same specimen:
- of 14 specimens originally equivocal on the Immunetics C6 ELISA, 11 were negative on retesting
- of four specimens originally equivocal on the DiaSorin Liaison CLIA, two remained equivocal and two became positive
After these adjustments, 78 specimens showed reactivity in one or more IVDs (Table 5):
- 63/78 were reactive or equivocal in only one of 10 IVDs (18 equivocal; 45 reactive)
  - of these, 60 were reactive in either the Immunetics C6 ELISA (38) or the Trinity Biotech ELISA (22)
- 14/78 were reactive or equivocal in two of 10 IVDs
  - of these, two were reactive in each of the Immunetics C6 ELISA and the Trinity Biotech ELISA
- 1/78 was reactive in one IVD and equivocal in two more of the 10 IVDs

Of the 78, the rate of reactivity was highest in the Immunetics C6 ELISA, with 38 of the 78 being
reactive. The Trinity Biotech ELISA showed reactivity in 26 of the 78 specimens.
The estimates of specificity in the immunoassays in the known negative population ranged from
87.7%–99.7%. The δ values of the Novatec Novalisa and the Euroimmun ELISA were both greater
than 2.9 while the δ values for the remaining two assays for which the statistic could be calculated
were ≤1.3. (Table 5)
Table 5: The specificity, 95% confidence intervals (CI) and delta values of the Project IVDs in a known negative and a presumed antibody status (PAS) negative specimen panel. No. of specimens: immunoassays, 308 (known negative) and 403 (PAS negative); immunoblots, 132 (known negative). Specificities treat equivocal results as positive.

| IVD | Known negative: specificity % | 95% CI | δ value | PAS negative: specificity % | 95% CI | Equivocal % (known negative) | Equivocal % (PAS negative) |
|---|---|---|---|---|---|---|---|
| Novatec Novalisa | 99.7 | 98–100 | –2.91 | 95.1 | 92–97 | 0 | 1.5 |
| DiaSorin Liaison CLIA | 96.4 | 94–98 | N/A | 91.9 | 89–94 | 0.6 | 2.5 |
| Trinity Biotech ELISA | 91.6 | 88–94 | –1.31 | 89.1 | 86–92 | 4.5 | 5.2 |
| Euroimmun ELISA | 99.7 | 98–100 | –2.99 | 98.0 | 96–99 | 0.3 | 1.0 |
| Immunetics C6 IgG ELISA | 87.7 | 83–91 | –1.06 | 79.5 | 75–83 | 1.6 | 1.2 |
| Viramed IgG IB | 99.2 | 95–100 | N/A | 95.6 | 93–97 | 0.8 | 3.0 |
| Euroimmun Euroline IB * | 94.8 | 89–98 | N/A | 91.3 | 88–94 | N/A | N/A |
| Trinity Biotech IB | 100 | 96–100 | N/A | 100 | 99–100 | 0 | 2.5 |
| Mikrogen IgG IB | 98.5 | 94–100 | N/A | 98.0 | 96–99 | 1.5 | 1.9 |
| Seramun SeraSpot | 94.7 | 89–98 | N/A | 96.3 | 94–98 | 0.8 | 0.7 |

* The Euroimmun IB does not have the option of a borderline/equivocal result.
Although immunoblots are not typically used for testing negative specimens, the specificity
estimates for the immunoblots in the Project in the known negative population were all greater than
94.5%.
7.3.2 PAS negative specimen panel
Of the starting dataset of 539 non-ARCBS, non-PHE positive specimens, there were 405
specimens in which seven of the 10 Project IVDs were negative. These formed the PAS negative
specimen panel.
The specificities of all the IVDs except the Immunetics C6 ELISA were >89% in the PAS negative
specimen panel. The C6 ELISA gave specificities of 87.7% and 79.5% in the known negative and
PAS negative specimen panels respectively. The corresponding δ values were –1.06 and –0.74,
indicating that a substantial number of false positive results can be expected with this assay.
7.4 Comparison of overnight vs one hour specimen incubation with Mikrogen immunoblot
Table 6 shows the results of testing 165 specimens using a one hour specimen incubation as
instructed by the manufacturer's IFU and an overnight specimen incubation as is performed
routinely by Australian Biologics.
Table 6: Results in known negative, known positive and PAS negative specimens using one hour and overnight specimen incubations in the Mikrogen immunoblot

| Panel | # tested | 1 hour: Negative | 1 hour: Borderline | 1 hour: Positive | Overnight: Negative | Overnight: Borderline | Overnight: Positive |
|---|---|---|---|---|---|---|---|
| Known Negative | 87 | 85 | 2 | 0 | 74 | 12 | 1 |
| Known Positive | 24 | 5 | 19 | 0 | 0 | 9 | 15 |
| PAS Negative | 54 | 54 | 0 | 0 | 46 | 8 | 0 |
Of the 87 specimens from the known negative panel that were reactive at least once in one or
more of the 10 Project IVDs, 85 were negative and two were borderline using the manufacturer's
instructions for the Mikrogen immunoblot, while 74 were negative, 12 were borderline and one was
positive using the overnight incubation protocol.
Similarly in the PAS negative population, of 54 that were negative on the Mikrogen immunoblot
using manufacturer’s instructions, eight became borderline and 46 remained negative using the
overnight incubation protocol.
As might be expected, the loss of specificity using the overnight incubation protocol was offset by
increased sensitivity in known positive specimens. Twenty-four known positive specimens gave
five negative and 19 borderline results using the manufacturer’s instructions for the Mikrogen
immunoblot. However, using the overnight incubation protocol, no specimens were negative, nine
were borderline and 15 were positive.
7.5 IVDs’ results compared with status assigned by collaborator
On the following pages, the results for specimens that the collaborators assigned as positive
(Figures 4-10) and those assigned as negative (Figures 11-17) are shown as screenshots from an
Excel spreadsheet. Positive, negative and borderline results from the Project IVDs have been
coloured red, green and yellow respectively. Presenting the information in this way is intended to
give a picture of the degree of concordance between the IVDs and the status assigned by the
collaborators.
8 DISCUSSION
There were a large number of equivocal/borderline results in the Project, predominantly in PAS
positive specimens. These were seen both in immunoassays and in immunoblots. To take a
conservative view, we have considered the equivocal results to be negative in PAS positive
specimens and positive in PAS negative specimens when discussing sensitivity estimates.
Immunoassays
All the immunoassays except the Trinity Biotech ELISA used predominantly recombinant or
peptide antigens on the solid phase; the Trinity immunoassay used native antigens. Properly
constructed and targeted recombinant and peptide antigens generally lead to IVDs that are more
sensitive and specific, because the antigens can be well purified or synthetically manufactured,
allowing increased reagent input without increased nonspecific reactions.
In the known positive population, the Novatec Novalisa, DiaSorin Liaison CLIA and Immunetics C6
ELISA gave sensitivities of 94%, 95% and 100% respectively. (It must be noted that the
Immunetics C6 ELISA was the first line IVD used to select these specimens so it was expected to
be 100% sensitive). Further, these IVDs showed few equivocal results in these specimens. On the
other hand the Trinity Biotech ELISA and the Euroimmun ELISA gave poorer sensitivities of 80%
and 78% respectively and gave more equivocal results. These differences in performance are
borne out by the δ values in these last two immunoassays which were 1.2 and 0.97 respectively.
This means that the mean results in this positive population in these assays are only one standard
deviation from the cut-off of the assay and therefore the higher proportion of negative and
equivocal results is not surprising.
In the PAS positive population, sensitivities were better for all IVDs except the Trinity Biotech
ELISA. The improvement was not significant except in the case of the Euroimmun ELISA.
Overall, the immunoassay sensitivities were not statistically different from each other (p>0.05).
The fact that the sensitivities appear to have improved may be explained by the way the PAS
positive status has been allocated. Specimens with up to three IVDs negative have not been
included in this panel, which in turn shifts this specimen panel towards a greater likelihood of true
positive status. All of the immunoassays' IFUs indicated that they should be used to test people with
symptoms of Lyme disease. Therefore, the predictive value of any positive results in the known
negative (blood donor) population used in this Project would be very low. Nevertheless, specimens
collected from individuals who had not travelled outside Australia were considered a valuable
group to include.
Specificities of the IVDs were generally >90% and similar irrespective of negative panel. The
exception was the Immunetics C6 ELISA, giving specificities of 87.7% and 79.5% in the known
negative and PAS negative panels respectively. As with the low positive δ values for the Trinity
Biotech ELISA and the Euroimmun ELISA, the negative δ values for the C6 ELISA in known
negative and PAS negative specimens were –1.06 and –0.74 respectively. Once again, these low δ
values predict the high false positive rate seen with the Immunetics C6 ELISA.
ARCBS specimens
Blood Service specimens that were reactive in one IVD only were deemed negative, especially as
60 of the 63 such results occurred in the Immunetics C6 ELISA or the Trinity Biotech ELISA.
It may be anticipated that the S/CO results for these reactive blood donor specimens would be
closer to the assay cut off than S/CO results of positive specimens. Figure 11 shows the S/CO
results of the Immunetics C6 ELISA in reactive blood donor specimens along with the S/CO results
in the same IVD in true positive specimens. Twenty-seven of the blood donor specimens gave
S/CO results of ≤ 2 while only two of 100 of the known positive specimens gave S/CO results of
<2.
[Figure 11 chart: "Immunetics C6 EIA Reactive Results"; S/CO (0-12) for ARCBS reactive, ARCBS equivocal and known positive cohort specimens, i.e. Immunetics C6 ELISA reactive and equivocal ARCBS results shown alongside C6 ELISA results from the known positive population]
Figure 11: S/CO results for the known positive specimen panel and blood donors reactive in the Immunetics C6 ELISA
Specimens from four blood donors gave results that fulfil the criteria for a positive result in a two-
tier algorithm, even though those were the only two IVDs in which the specimens were positive.
(Criteria for positive in a two tier algorithm means reactive in an immunoassay and positive in an
immunoblot). The positive result in all four immunoblots was conferred by reactivity to two or at
most three antigens. Because these donors had no symptoms of Lyme disease, and the positive
predictive value of the assays in well individuals is low, these results are most likely to be falsely
positive.
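The two-tier criterion referred to in the parenthetical above can be sketched as follows. This is a simplified illustration; real algorithms also handle equivocal screening results and IgM/IgG distinctions.

```python
def two_tier(immunoassay, immunoblot=None):
    """Two-tier interpretation: the immunoblot is only consulted when the
    first-line immunoassay is reactive; an unconfirmed reactive screen
    is reported as negative."""
    if immunoassay != "reactive":
        return "negative"
    return "positive" if immunoblot == "positive" else "negative"

# Pattern seen in the four blood donors: reactive screen plus positive blot
print(two_tier("reactive", "positive"))
print(two_tier("reactive", "negative"))
```

As the text notes, a formally "positive" two-tier result in an asymptomatic, low-prevalence population is still most likely a false positive.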
Immunoblots
There were five immunoblots included in the Project. Although presented in a microwell format, the
Seramun SeraSpot was categorised as an immunoblot because the Borrelia proteins were
individually presented as spots and read, consistent with an immunoblot format. Three of the
immunoblots were read by eye (Mikrogen recomLine, Trinity Biotech IB and Viramed ViraStripe);
the remaining two (Seramun SeraSpot; Euroimmun Euroline IB) used dedicated scanners with
algorithms programmed into their software to interpret the results.
Of the five immunoblots, all except the Euroimmun Euroline IB gave the option of an
indeterminate/borderline/equivocal result.
Whether read by eye or digitally, in a microwell or on a strip, all the immunoblots used the principle
of comparing the intensity of a band (spot) with that of a cut-off control band (spot) to assign
reactivity i.e. if the intensity of the test band (spot) was equal to or greater than the cut-off control
band (spot), the test band (spot) was considered reactive. In four of the five immunoblots, the cut-
off control band (spot) was incorporated into each strip or microwell. On the Trinity Biotech IB, the
cut-off control band was on a separate strip and reactivity of all the bands on all the strips in a run
were compared back to the band on the cut-off control strip.
When reporting the overall interpretation of immunoblot results, three of the five assays reported
VlsE reactivity alone as either indeterminate (borderline) or positive. Further, reactivity greater than
the cut-off control on one additional band changed the overall interpretation. Table 7 shows the
number of bands (spots) required to be reactive to assign indeterminate or positive overall
interpretation. The information in the table is meant to provide an overview of the different
immunoblots’ interpretation criteria rather than the specific details of each. The details are difficult
to summarise because the different immunoblots use different scoring systems (one assigns a
point value to each of the bands) and may assign increased significance to the presence of some
bands (e.g. VlsE). The intention of the table is to show that a small change in reactivity, the
assessment of which is subjective in itself, can significantly change the interpretation delivered to
the clinician.
Table 7: Overview of interpretation criteria for each of the immunoblots included in the Project. Values are the number of bands (spots) required to show reactivity ≥ the cut-off control band (spot).

| IVD | Negative | Borderline (Indeterminate) | Positive |
|---|---|---|---|
| Trinity Biotech IB (Europe excluding Germany) | <2 | 2 | ≥3 |
| Trinity Biotech IB (Germany) | 0 | 1 | ≥2 |
| Viramed ViraStripe | ≤1 | 1 (VlsE) | ≥2 |
| Mikrogen recomLine | ≤1 | 2 (incl p41) | ≥2 |
| Euroimmun Euroline IB | ≤1 | N/A | 1 (VlsE) or ≥2 |
| Seramun SeraSpot | ≤1 (except VlsE) | 1 (VlsE) | ≥2 |
NRL’s experience in reading immunoblots is extensive. Its methods are well documented and
include steps to maximise consistency in reading including having two scientists read each
immunoblot, independently of each other. NRL testing scientists are all trained in the same way to
read immunoblots. By this we seek to reduce variation in an inherently subjective method. When
IVD manufacturers’ instructions for use say to compare the intensity of a band (spot) with a
control’s intensity, NRL testing scientists scrutinise these intensities closely, especially when the
intensities are similar, to determine how they compare. Occasionally the two readers cannot agree
and in these cases a third reader is consulted. During the Project, of the 2313 immunoblots that
were read by eye, approximately 10% required consultation of a third reader. This degree of
scrutiny may not happen in other laboratories, although it is believed to be usual practice in NATA
accredited laboratories in Australia that at least two independent readings of immunoblots are
undertaken and reconciled. This is not raised to suggest that one method is right and the other
wrong, but rather to point out that differences in interpretation are easily introduced. In the case of
the Lyme disease immunoblots, the impact of the subjective nature of the tests is exacerbated by
the significant change in interpretation brought about by the presence of a single additional band
(spot). Reactivity to two proteins on most of the immunoblots is sufficient to confer a positive result,
according to the manufacturers' IFUs. The Centers for Disease Control and Prevention (CDC) in the
US has issued case definitions for Lyme disease for many years, the most recent being 2017.
In this case definition an IgG immunoblot is not considered definitively positive for surveillance and
diagnosis unless reactivity is observed to five B. burgdorferi proteins. If CDC criteria had been
used in this evaluation, the result profile would have demonstrated fewer positive results and would
have reduced the number of blood donor specimens that were positive on immunoblot.
The subjective nature of immunoblots is a significant reason why they are typically not used for first
line testing.
Combining the known positive and the PAS positive populations, the sensitivity of the Trinity
Biotech IB gave statistically poorer sensitivity than the other immunoblots in the Project.
Comparison of overnight and one hour specimen incubation in the Mikrogen immunoblot
Australian Biologics increases the specimen incubation time when using the Mikrogen immunoblot
because they believe the sensitivity of the test using the manufacturer’s instructions is not
adequate. In our hands we found that the sensitivity of the IVD did improve with the overnight
incubation: of 24 known positive specimens that had given five negative and 19 borderline results
using the one hour incubation, 15 were positive, nine were borderline and none were negative with
overnight specimen incubation. However, of 141 negative specimens, 20 (14%) became borderline
and one became positive with overnight specimen incubation.
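The 14% figure follows directly from the Table 6 counts:

```python
# Negative specimens re-incubated overnight: 87 known negative + 54 PAS negative
negatives_tested = 87 + 54     # 141 specimens
new_borderline = 12 + 8        # borderline under overnight incubation
print(f"{new_borderline / negatives_tested:.0%}")  # -> 14%
```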
The immunoblot results in the Project suggest that NRL testing scientists may be conservative in
their reading of immunoblots. This conservative approach likely results from the stringent
requirements for reporting a positive result for a blood-borne virus (HIV, HCV etc.), which require
NRL scientists to scrutinise band intensities closely. Under these circumstances it is possible that
Australian Biologics' scientists read immunoblots less conservatively than NRL and therefore
deliver a higher rate of false reactivity through the overnight specimen incubation than reported
here.
Even though improvement in true positive specimens was seen with the overnight incubation,
because of the increase in indeterminate results from negative specimens, on balance we consider
that following the manufacturer’s instructions is more likely to give a higher proportion of correct
results.
Availability of IVDs in Australia
Of the five immunoassays included in this Project, three are included on the Australian Register of
Therapeutic Goods (ARTG) and have distributors in Australia. We know of at least one other
immunoassay available in Australia that was not included in the Project.
Only two of the immunoblots used in the Project are available in Australia through distributors:
Trinity Biotech IB and Euroimmun Euroline IB. Despite its excellent specificity, the former showed
very poor sensitivity in our hands. On the other hand the specificity of the Euroimmun Euroline IB is
questionable, especially when confirming reactivity from the DiaSorin Liaison CLIA. In the PAS
negative specimen panel (n=405), 21 specimens were falsely positive on the DiaSorin Liaison
CLIA. Of these five were also positive on the Euroimmun Euroline IB. Hence there is the possibility
of reporting false positive results. Of the immunoblots, only the Euroimmun Euroline IB is
registered on the ARTG.
In the absence of a distributor, if a laboratory wants access to another IVD, it must be named as
the Sponsor of the IVD on the ARTG, and pay the application fee and the fee to maintain the IVD
on the Register. Alternatively it can gain access to the IVD through the TGA's Authorised
Prescriber Scheme. Neither of these mechanisms is ideal. Understandably, laboratories choose
from what is available on the ARTG. Testing quality may be more reliable if a single reference
laboratory in Australia undertook all the B. burgdorferi reference testing using a validated
immunoblot with a dedicated scanning system that may not be available on the ARTG.
9 RECOMMENDATIONS

- IVDs using native proteins should be avoided, or used with recognition that their sensitivity may not be optimal.
- To allow access to a wider range of immunoblots in Australia, without requiring individual laboratories to be the Sponsors, the establishment of a national reference laboratory could be considered. This laboratory could be responsible for evaluating Borrelia IVDs, whether or not they are included on the ARTG. The laboratory could also be responsible nationally for confirmatory testing. Such a laboratory could be an established medical testing laboratory with experience in Lyme disease testing or a laboratory with experience in providing a quality assurance program service.
- Immunoblots should be read using scanning software to limit the inconsistency caused by subjective reading by eye.
10 CONCLUSION
Approximately 950 specimens have been tested in 10 IVDs that detect IgG antibodies against B.
burgdorferi. Positive and negative specimens were contributed by collaborating laboratories in both
endemic and non-endemic areas. The main objectives of the testing were to examine the
performance of the 10 IVDs and to determine whether Australian laboratories’ results were of good
quality.
Generally the sensitivities of the IVDs were variable, ranging between 73% and 100% across the
known positive and PAS positive specimen panels; however, the Trinity Biotech IB showed very
poor sensitivity in our hands. The widths of the confidence intervals around the sensitivity
estimates were acceptable. Apart from the Immunetics C6 ELISA, none of the remaining three
immunoassays for which the statistic could be calculated demonstrated a positive δ value >3,
meaning that they can all be expected to give a proportion of false negative results. The
specificities of the IVDs were less variable, with most being >90% in the known negative and PAS
negative specimen panels. The negative δ value of the Trinity Biotech ELISA was only –1.31,
consistent with its higher proportion of false positive results compared with the other IVDs.
However, the specificity and δ value of the Immunetics C6 ELISA were 79.5% and –0.74
respectively in the PAS negative panel, showing that it will have a high proportion of false positive
results. Given its high sensitivity, it could be used provided that reactive specimens were tested
further in a high specificity immunoblot.
The immunoblots, which require a visual interpretation, are subjective, and the interpretation
criteria for a positive result lack stringency. It is recommended that immunoblots be used with
scanning software to minimise the subjectivity.
Improvements in ELISA technology are measured in “generations”. Four of the five Project
immunoassays were second generation microplate based ELISA assays using traditional colour
detection systems. These second generation assays do not demonstrate the increased sensitivity
and specificity offered by third and fourth generation immunoassays, and those with the more
sensitive chemiluminescent detection systems. (The fifth Project immunoassay was an instrument-
based chemiluminescent assay.) Hence, in our opinion the majority of the immunoassays based on
traditional ELISA technology, especially those using native proteins, lacked the refinement seen in
third and fourth generation immunoassays.
The Project collected specimens, IVDs used and results reported from Australian and collaborator
laboratories. The Project then tested the specimens on all the IVDs included. The objectives of this
testing were to examine:
- Did the Project obtain the same result as the collaborator laboratories in the IVD(s) the collaborator used?
  The outcome has shown that discrepancies between Project and collaborator results by IVD occurred less than 1% of the time. In other words, results obtained by Australian accredited medical testing laboratories in the IVDs they are using are as expected from those IVDs.
- Did the IVDs in the Project obtain the same results as each other?
  The outcome has shown that the IVDs agree on positive results approximately 80% of the time and on negative results approximately 90% of the time.
These outcomes suggest that variability in results reported is more likely due to IVD variation than
poor laboratory performance.
The modification to the manufacturer's IFU for the Mikrogen recomLine IB undertaken by Australian
Biologics did marginally increase sensitivity as claimed. However, it also reduced specificity by
15%, and this must be considered before borderline or positive results based on the modified
protocol are issued.
11 ACKNOWLEDGEMENTS

The NRL would like to thank all the collaborators for their willingness to provide specimens and
their open collaboration.
This project was funded by the Australian Government Department of Health.
End of report
24 May 2017