With improvements in technology, we can expect the number of corneal grafts to decrease. Modern lens implant design, viscoelastic materials, and improved antimicrobial agents have all made an impact in this respect. But the need for keratoplasty remains, and with it the vexing problem of how to deal with the “high risk” cornea.
The cornea has long been recognised as having “immunological privilege”, but in vascularised corneas, and possibly in corneas that have previously rejected a graft, this privilege breaks down and the cornea becomes as susceptible to rejection as any other vascularised tissue in the body. These corneas have been referred to as “high risk”, and special precautions need to be taken to prevent or minimise rejection. When attempting to establish the optimal treatment for these corneas, the first hurdle remains the definition of a “high risk” cornea. This question has been explored before and is touched upon in a paper by Rumelt and colleagues published in this issue of the BJO (p 000). In the Collaborative Corneal Transplantation Studies1 (CCTS) “high risk” was defined as a cornea with two or more quadrants of vascularisation, or one in which a graft had previously been rejected. Under the first part of this definition, a cornea with two blood vessels at least 6 clock hours apart would qualify as high risk, yet it is intuitively less at risk than one in which all four quadrants are heavily vascularised. It has been shown that the incidence of rejection increases with both the number of quadrants vascularised and the total number of vessels crossing the proposed graft–host junction.2 Likewise, graft rejection in a previously grafted eye has been shown to relate more to the number of blood vessels in the cornea than to the number of previous grafts.3 Studies using univariate and multivariate survival analysis suggest that recipient corneas can be divided into low, medium, and high risk according to the number of quadrants of vascularisation (avascular, 1–2 quadrants, and 3+ quadrants respectively).4
Considering the high failure rate from rejection in these high risk corneas, it is essential that we institute measures to improve graft survival, especially in those patients who are bilaterally blind. In other solid organ transplantation, systemic immunosuppression using agents such as cyclosporin A (CSA) is routine, often with more than one agent being used. Our studies with systemic CSA in high risk keratoplasty reported that both short and long term survival improved. There was also evidence that some degree of immunological privilege was re-established, indicating that CSA can eventually be safely withdrawn.5 Other authors have found no statistical benefit from the use of systemic CSA, possibly because their criteria for high risk differed and included other factors such as chronically inflamed eyes.6 These authors correctly emphasised the need for intensive topical corticosteroid therapy to decrease inflammation, thus lowering the local population of immunologically active cells and reducing the expression of class II antigens in the recipient cornea. In the article by Rumelt and colleagues, the definition of high risk was vascularisation in four quadrants and a previously failed graft. Six (21.4%) of 28 grafts on CSA rejected while on treatment and another three (10.7%) failed after treatment was stopped, compared with five (42%) of 12 control grafts that did not receive CSA. This would indicate some benefit from CSA therapy, but analysis is difficult because of the high graft failure rate from other causes.
Corneal grafting in high risk corneas remains a challenge. Clearly we need to identify those factors that increase risk and devise appropriate treatment regimens to counteract them. Intensive topical corticosteroid therapy is beneficial and usually poses low risk; the addition of systemic corticosteroids has not been shown to add any benefit. Certain risk factors may warrant the addition of systemic CSA, and with less toxic alternatives becoming available, systemic immunosuppression may become more widely accepted. With longer storage of donor corneas it is now feasible to plan surgery, and the introduction of treatment regimens a few days before surgery should be considered to optimise the local corneal environment before grafting. Although controversial, the use of tissue typing should still be considered in high risk vascularised corneas. No single therapy is necessarily appropriate for all these grafts, and we need to devise therapeutic regimens, sometimes with multiple medications, to improve graft survival, especially in those unfortunate patients who are bilaterally blind.
REFERENCES
1. The Collaborative Corneal Transplantation Studies Research Group. The Collaborative Corneal Transplantation Studies (CCTS). Effectiveness of histocompatibility matching in high-risk corneal transplantation. Arch Ophthalmol 1992;110:1392–403.
2. Khodadoust AA. The allograft rejection reaction: the leading cause of late graft failure of clinical corneal grafts. In: Porter R, Knight J, eds. Corneal graft failure. Ciba Foundation Symposium 15. Amsterdam: Elsevier, 1973:151–64.
3. D'Amaro J, Völker-Dieben HJ, Kruit PJ, et al. Influence of pretransplant sensitization on the survival of corneal allografts. Transplant Proc 1991;23:368–72.
4. Hill JC. Immunosuppression in corneal transplantation. Eye 1995;9:247–53.
5. Hill JC. Systemic cyclosporine in high-risk keratoplasty: short versus long term therapy. Ophthalmology 1994;101:128–33.
6. Poon AC, Forbes JE, Dart JKG, et al. Systemic cyclosporine A in high risk penetrating keratoplasties: a case-control study. Br J Ophthalmol 2001;85:1464–9.