Advances in Immunology, Volume 48: Immune Privilege and Immune Regulation in the Eye


    Immune Privilege and Immune Regulation in the Eye


    Department of Ophthalmology, University of Texas Southwestern Medical Center,

    Dallas, Texas 75235

    I. Introduction

    The eye has been likened to an immunological microcosm in which virtually all forms of immunological events can take place and, in doing so, often produce unique results (1). Indeed, when the topic of immunological privilege is raised, the anterior chamber of the eye (Fig. 1) is offered as the classic example of a privileged site in which histoincompatible allografts escape immunological recognition and enjoy prolonged, and sometimes permanent, residence (2). Likewise, discussions of other immunological phenomena, such as tolerance, often turn to the eye for examples to illustrate basic immunological principles. For example, one pathway for maintaining immunological tolerance of host tissue epitopes is the sequestration of self-antigens behind anatomical barriers. In this regard, the crystallin antigens of the lens are offered as examples of tissue-specific immunogens that are sequestered early in ontogeny but, although potentially capable of arousing an autoimmune response, do not do so since they are incarcerated within an impervious capsule. Thus, tolerance of host antigens (e.g., lens crystallins) can be maintained, at least in part, by anatomical sequestration.

    The eye possesses other unique features that warrant the immunologist's attention. Perhaps the oldest and most successful form of organ transplantation takes place in the eye. Corneal transplantation has been performed on animal subjects for over 150 years and on human patients for over 80 years (3). In the United States alone, 30,000 corneal transplants are performed each year, with a success rate well over 90%. The extraordinary success of corneal transplantation is often attributed to the mysterious and undefined immunological privilege of the cornea and the eye. What is the basis for this immunological privilege, and is it restricted to the eye? In this review I examine those features of the cornea and the eye which conspire to promote successful corneal transplantation.

    The putative immunological privilege of the cornea, corneal graft bed, and anterior chamber would seemingly create an environment free of

    Copyright © 1990 by Academic Press, Inc. All rights of reproduction in any form reserved.


    FIG. 1. Schematic representation of ocular anatomy demonstrating regions of immunological privilege and areas especially vulnerable to autoimmune activity. [Labeled structures include the anterior chamber, pupil, iris, choroid, and retina.]

    immune-mediated diseases. However, this is not the case, as the eye is vulnerable to an interesting array of autoimmune diseases uniquely suited to precise immunological analysis.

    Thus, the eye offers interesting opportunities for analyzing a panorama of immunological phenomena, including organ transplantation, immunological privilege, tolerance, and autoimmunity. Moreover, each of these immunological processes can have a profound bearing on the normal functioning of one of the most precious of the five senses: our vision.

    II. Corneal Allografts: Immunogenically Privileged Grafts on Immunologically Privileged Graft Beds

    The prospect of replacing an opaque diseased cornea with a healthy one was suggested as early as 1796 by Erasmus Darwin, who proposed that ". . . a slight and not painful operation might be facilitated by cutting the cornea with a kind of trephine, about the size of a thick bristle or a small crow-quill, an experiment I wish strongly to recommend to some ingenious surgeon or oculist" (4). In 1837 an Irish surgeon, Samuel Bigger, described how he had successfully placed a corneal allograft into a pet gazelle's eye while he was a prisoner of the Egyptians (5). The first recorded attempt at therapeutic corneal transplantation on a human


    subject was performed by Richard Kissam, who sutured the cornea of a 6-month-old pig onto a blind Irishman (6). This xenograft ultimately failed, as did subsequent attempts at corneal grafting. However, succeeding decades brought the introduction of general anesthesia, antisepsis in surgery, improvements in ophthalmic surgical instruments, and, at the turn of this century, the first successful human corneal transplant (7). In the 75 years following the first successful human corneal allograft, literally thousands of corneal transplants have been performed each year to correct blindness produced by corneal edema, trauma, inflammation, or congenital abnormalities.

    The success of corneal transplantation is unrivaled by all other forms of organ transplantation. Corneas transplanted onto diseased, but otherwise avascular, graft beds (e.g., keratoconus) remain clear and healthy in greater than 90% of the recipients (8, 9). This extraordinary success rate is even more impressive when one considers the conditions surrounding corneal transplantation. Human leukocyte antigen matching of donor and recipient is normally not performed, except in high-risk patients. Immunosuppressive drugs are restricted to topical corticosteroid eyedrops, which are gradually tapered to maintenance levels following suture removal. Such conditions would certainly lead to graft failure with any other form of organ transplantation, yet corneal allografts thrive in spite of such immunological handicaps.

    A. ESCAPING IMMUNOLOGICAL RECOGNITION

    The apparent ease with which corneal grafts avoid immunological recognition suggests that the cornea is endowed with unique immunological characteristics and thus possesses an immunological privilege not shared by other organ grafts (10). Historically, three basic hypotheses have been offered to account for the privileged existence of corneal allografts (10). The simplest explanation suggests that the cornea, like certain neuronal tissues, is devoid of conventional major histocompatibility complex (MHC) antigens. Accordingly, potentially alloreactive T cells would be blind to alien corneal cells. The putative absence of MHC antigens would not only prevent the arousal of an alloimmune response, but even if a response were initiated, the grafts would be immunologically invisible. This hypothesis, although appealing in its simplicity, has been unequivocally disproved. Several findings indicate not only that the cornea is vulnerable to immunological attack, but that it is also capable of eliciting an alloimmune response that results in graft rejection. Studies in rats, rabbits, and mice indicate that heterotopic transplantation of corneas to subdermal graft beds leads to rapid sensitization and swift rejection of the allografts (11-14).


    Moreover, MHC antigens have been detected on all three cell layers of the cornea (15-18) (Fig. 2). In fact, the grafting procedure itself has been shown to elicit increased expression of MHC class II antigens on the corneal epithelium (19, 20). Although MHC antigens can be found on cells of the corneal epithelium, stroma, and endothelium, the density of antigen expression differs markedly among these three cell layers. In the rat, corneal endothelial cells express only meager amounts of MHC class I antigens and no detectable MHC class II antigens (21).

    Despite the paucity of MHC antigens, corneal endothelial grafts stimulate robust cytotoxic T lymphocyte (CTL) responses following heterotopic grafting in the rat (22). Although MHC class I antigen expression is greater in the epithelium than in the endothelium, heterotopic transplantation of isolated allogeneic corneal epithelium fails to induce detectable anti-MHC class I CTL responses in the rat (22). Thus, there are significant antigenic and immunogenic gradients within the corneal allograft. The mere expression of histocompatibility antigens on corneal cells does not ensure the induction of an alloimmune response. Nonetheless, the corneal allograft has the full potential to be both immunogenic and antigenic.

    FIG. 2. The cornea is composed of three distinct layers: epithelium, stroma, and endothelium. MHC class I antigens are expressed on cells of each layer. [Labeled structures include the basement membrane, Bowman's membrane, and Descemet's membrane.] [Reprinted from Niederkorn and Peeler (153) by permission of S. Karger Publishing, Inc.]


    The second hypothesis offered to explain the survival of corneal allografts suggests that the donor cells were rapidly replaced by host cellular components in the graft bed. According to this hypothesis, the cellular elements of the graft were replaced before the host's immune machinery could be aroused. Like the previous hypothesis, this explanation has been refuted. Animal studies using sex chromatin markers to distinguish donor cells from recipient cells have demonstrated the long-term survival of donor cells in corneal grafts (23, 24). Other investigators came to similar conclusions by radiolabeling donor corneas with [3H]thymidine (25, 26). Clinical findings also support this conclusion, since immunological rejection can occur over a decade after corneal transplantation (10).

    The third and most widely accepted hypothesis to account for the high acceptance rate of corneal allografts relates to the nature of the avascular corneal graft bed. It is a well-recognized clinical observation that vascularization of the corneal graft bed is a harbinger of graft failure (27, 28). The absence of blood and lymph vessels at the interface of the graft and the graft bed is thought to prevent the escape of alloantigens to the regional lymphoid tissues, thereby resulting in an afferent blockade of the immunological reflex arc.

    Although the corneal allograft is potentially immunogenic and antigenic, anatomical sequestration in this privileged location promotes graft survival. Thus, the avascular graft and the graft bed conspire to produce a state of immunological ignorance that permits allograft survival. Heterotopic and orthotopic corneal allografts in rats and mice have been used to examine this hypothesis. Corneal allografts can be transplanted heterotopically (i.e., to an abnormal anatomical site) onto subdermal graft beds richly endowed with blood vessels and lymphatics, which would favor the induction and execution of alloimmunity.

    Such grafts can be compared to similar grafts placed orthotopically (i.e., to the normal anatomical site) onto avascular graft beds in the eye. If the avascular graft bed contributes to the survival of the allograft, one would predict that heterotopic corneal grafts would suffer significantly higher rejection rates than their orthotopic counterparts. This is indeed the case, as 100% of the fully allogeneic heterotopic corneal allografts are rejected in a mouse model of corneal transplantation (13, 14, 29), while only 55-57% of fully allogeneic grafts fail when grafted orthotopically (30, 31).

    The pioneering studies of Maumenee (32) provided strong support for the afferent blockade theory. In these studies rabbits bearing long-term orthotopic corneal allografts rejected skin grafts from the same donors which provided their corneal grafts. Skin graft rejection occurred at a tempo indicative of a first-set rejection, thereby supporting the notion


    that the initial corneal allograft failed to stimulate alloimmunity and that the graft bed produced an afferent blockade of the immune response. However, skin graft rejection led to the rejection of 90% of the previously clear corneal grafts. Thus, the corneal grafts initially displayed immunogenic privilege, but were antigenically vulnerable to an ongoing systemic immune response: afferent blockade was present, but efferent blockade was not. Callanan and co-workers (30) came to similar conclusions, using a rat orthotopic corneal allograft model.

    In these studies the appearance of antigen-specific CTL activity coincided with graft rejection, while the absence of CTL responses was a consistent feature of hosts bearing long-term corneal allografts. It is interesting that in both of these studies a small but significant number of corneal grafts were initially clear and avascular, yet subsequently underwent immunological rejection. Thus, the presence of an avascular graft bed does not necessarily ensure permanent graft survival or the maintenance of an afferent blockade.

    Recent studies from our laboratory lend further support to the afferent blockade theory of corneal graft survival. Ross et al. (33) have shown that orthotopic corneal grafts differing from their hosts only at MHC class II loci do not undergo rejection unless the host is systemically immunized with skin grafts from the same donor strain.

    The importance of an efferent blockade in promoting corneal graft survival is not clear. The previously mentioned studies by Maumenee (32) and by Ross et al. (33) argue against an effective efferent blockade. However, Khodadoust and Silverstein (34) confirmed earlier findings by Billingham and Boswell (11), which indicated that the corneal graft bed served as an effective barrier that shielded the graft from sensitized effector elements of the host. Results from the Khodadoust and Silverstein study indicated that 95% of the lamellar corneal grafts and 75% of the penetrating grafts remained healthy, even though the hosts had rejected large skin grafts from the same donors of the corneal grafts.


    Numerous studies have demonstrated the presence of MHC class I antigens on cells in all three layers of the cornea (15-18); however, MHC class II antigens are not normally expressed in detectable amounts on corneal cells. MHC class II antigen-bearing Langerhans cells are abundant in the peripheral outermost regions of the cornea that interface with the conjunctiva (i.e., limbus), but are conspicuously absent from the central corneal epithelium in adult humans, mice, rabbits, and guinea pigs (35-38).


    The absence of Ia+ Langerhans cells in the central cornea is of more than casual interest, since these cells represent an important immunogenic component of an allograft. Indeed, it has been suggested that Ia+ "passenger cells" are the major barrier to successful organ transplantation (39). Over three decades ago Snell (40) suggested that donor leukocytes present in transplanted tissues were a major source of tissue immunogenicity. Interest in passenger cells, however, remained dormant until Lafferty et al. (39) reconsidered the role of Ia+ cells in thyroid allografts and pancreatic islet grafts. Subsequently, numerous studies have confirmed that Ia+ passenger cells are indeed major obstacles to successful organ transplantation and tha...