Proceedings of the National Academy of Sciences of the United States of America
2024 Feb 20;121(9):e2321318121. doi: 10.1073/pnas.2321318121

C.R. Rao: Paramount statistical scientist (1920 to 2023)

Anirban DasGupta a,1
PMCID: PMC10907269  PMID: 38377193

Calyampudi Radhakrishna Rao, affectionately known to the world of statistics as “Dr. Rao,” or simply “CR,” passed away on August 22, 2023, in Buffalo, New York, at the age of 102. His passing ended a notable voyage into statistics, mathematics, and numerous applied sciences as a steward of the visionary ideal of marrying theory, models, data, and methods. Rao was a prominent explorer across a very wide range of statistics.

Among his greatest theoretical discoveries and inventions of methodologies and applications to raw data are the Cramér–Rao inequality, the Rao–Blackwell theorem, the score test, orthogonal arrays, growth curve analyses, the ecological metric, higher-order efficiency, applying generalized inverses in singular linear models, characterization problems, the Rao–Zyskind model of consumer demand, indices of genetic variation and diversity, and analysis of the Bengal anthropometric survey data. Of note, 75 years after their publication, the Cramér–Rao inequality, the Rao–Blackwell theorem, and Rao’s score test are still taught to every first-year graduate student in statistics.

Folklore has it that when he was lecturing to a class at the Indian Statistical Institute (ISI) in Calcutta, V.M. Dandekar, a student in the class, asked how low the variance of an unbiased estimate could be for a given sample size. Rao went home and solved the problem overnight. Harald Cramér was working independently on the same problem, and the result became known as the Cramér–Rao inequality. The inequality can be interpreted as a gold standard for the estimation of a parameter, for any given sample size and any value of the parameter, provided suitable regularity conditions hold. Interestingly, for the special case of location parameter problems, the conclusion reached by the Cramér–Rao inequality coincides with the conclusion reached by the Heisenberg–Weyl uncertainty principle of physics.
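As a minimal numerical sketch of what the bound asserts (an illustration, with arbitrary choices of sample size and parameter): for N(θ, 1) data the Fisher information per observation is 1, so no unbiased estimator of θ can have variance below 1/n, and the sample mean attains that bound.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps, theta = 50, 20000, 2.0

# For N(theta, 1) data, the Fisher information per observation is 1, so the
# Cramér–Rao lower bound on the variance of any unbiased estimator is 1/n.
crb = 1.0 / n

# The sample mean is unbiased; estimate its variance by Monte Carlo.
means = rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)
print(f"Monte Carlo variance of the mean: {means.var():.5f}  CRB: {crb:.5f}")
```

The two numbers agree closely, consistent with the sample mean being an efficient estimator of a normal location parameter.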

The Cramér–Rao inequality has been used in countless other problems, notably in the work on decision theory by Joseph Hodges and Erich Lehmann, and, in some sense, the inequality led to the development of the area of thresholding and superefficient estimators. The Cramér–Rao inequality has had its impact on nonparametric and semiparametric statistics, too, via the derivation of analogous information bounds, as in Ritov and Bickel (1), Groeneboom and Wellner (2), and Bickel (3), to cite a few.

The Rao–Blackwell theorem gives an explicit construction of an estimator, depending only on a sufficient statistic, that is at least as good as any given estimator. It gives an intuitive, concrete, and practical use of the notion of sufficiency. This was another instance of two scientists working on the same problem independently at the same time—in this case, Rao and David Blackwell. The result generalizes to risk functions more general than mean squared error.
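A small simulation sketch of the construction (illustrative parameter choices of mine): for Bernoulli(p) data, a crude unbiased estimator of p is the first observation alone; conditioning it on the sufficient statistic T = ΣXᵢ gives E[X₁ | T] = T/n, the sample mean, which is unbiased with far smaller variance.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps, p = 30, 20000, 0.3

x = rng.binomial(1, p, size=(reps, n))

# A crude unbiased estimator of p: use only the first observation.
crude = x[:, 0].astype(float)

# Rao–Blackwellize by conditioning on the sufficient statistic T = sum(X_i):
# for Bernoulli data, E[X_1 | T] = T/n, i.e., the sample mean.
rb = x.mean(axis=1)

print(f"crude variance: {crude.var():.4f}  Rao–Blackwellized: {rb.var():.4f}")
```

Both estimators are unbiased, but the Rao–Blackwellized one reduces the variance by roughly a factor of n, exactly the concrete payoff of sufficiency the theorem promises.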

The score test is an ingenious alternative to the Wald and likelihood ratio tests. It is now well known that CIs derived from the score test can beat Wald and likelihood ratio CIs in moderate sample sizes.
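A concrete instance of this (the function names below are my own): for a binomial proportion, the score test of H0: p = p0 standardizes by the variance evaluated at p0 rather than at the estimate (as Wald does), and inverting it yields the Wilson interval, which is known to outperform the Wald interval in moderate samples.

```python
import math

def score_stat(x, n, p0):
    """Score test statistic for H0: p = p0 given x successes in n trials.

    The variance in the denominator is evaluated at the null value p0,
    which is what distinguishes the score test from the Wald test.
    """
    phat = x / n
    return (phat - p0) / math.sqrt(p0 * (1 - p0) / n)

def wilson_interval(x, n, z=1.96):
    """Wilson (score) confidence interval for p, obtained by inverting the score test."""
    phat = x / n
    denom = 1 + z**2 / n
    center = (phat + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(phat * (1 - phat) / n + z**2 / (4 * n**2))
    return center - half, center + half

lo, hi = wilson_interval(7, 20)
print(f"95% score (Wilson) interval for 7/20: ({lo:.3f}, {hi:.3f})")
```

Unlike the Wald interval, the score interval never extends outside [0, 1] and behaves sensibly even when the observed proportion is near 0 or 1.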

Of Rao’s other impactful work in diverse areas of general science, a mention must be made of the 1946 introduction of orthogonal arrays (4); the 1958 article on comparison of growth curves (5), a pioneering publication; and the geodesic metric based on Fisher information, initiated in Rao (6). This last metric has been applied in image processing and computer vision, quantum mechanics, and, later, in ecology. Orthogonal arrays have unified orthogonal Latin squares and Latin hypercube sampling and have found multifarious applications in cryptology, software reliability, construction of fractional factorial sampling designs and, notably, the Taguchi methods for process optimization.
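The defining property of an orthogonal array of strength 2 can be checked in a few lines. Below is a classic OA(4, 3, 2, 2) — 4 runs, 3 two-level factors — together with a small verifier (my own sketch): in every pair of columns, each of the 4 level combinations appears exactly once.

```python
from itertools import combinations, product

# A classic strength-2 orthogonal array OA(4, 3, 2, 2): 4 runs, 3 factors,
# 2 levels each; any two columns jointly contain every level pair once.
oa = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]

def is_strength_2(array, levels=2):
    """Check that every pair of columns contains each level pair exactly once
    (i.e., the array has strength 2 with index lambda = 1)."""
    k = len(array[0])
    for i, j in combinations(range(k), 2):
        pairs = [(row[i], row[j]) for row in array]
        if sorted(pairs) != sorted(product(range(levels), repeat=2)):
            return False
    return True

print(is_strength_2(oa))  # True
```

This balance property is precisely what lets such arrays serve as fractional factorial designs: main effects of all three factors can be estimated from only 4 of the 8 possible runs.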

Rao was born on September 10, 1920, into a Telugu family of six boys and four girls in the Madras Presidency of British-ruled India. C.D. Naidu, his father, was an inspector of police in British India; his mother, the late A. Laxmikantamma, was about 20 when she married Rao’s father. Rao’s interest in mathematics developed at an early age, with a lot of support from his parents. They regularly supplied him with mathematical puzzles, and he spent hours solving them. Rao was educated in what is now the Indian state of Andhra Pradesh, after which he obtained an M.Sc. in mathematics with the rank of first class from Andhra University.

At age 22, Rao made a 500-mile train trip from Vizag, Andhra Pradesh, to Calcutta, with the intention of joining the military. He was denied a military job, ultimately to the benefit of the fields of statistics and general science. Out of desperation, he sent a letter to P.C. Mahalanobis, the founder of the ISI, asking for admission to a statistics training program at the ISI. He was granted this request but found himself disappointed by the quality of teaching at the ISI at that time.

The University of Calcutta had just started a master’s degree in statistics, and, on the advice of Mahalanobis, Rao joined the program. Within a few months of his introduction to statistics, he was already writing joint papers with K.R. Nair and others and reading research at the international level, in addition to working on his master’s thesis. Rao had R.C. Bose as his teacher at the University of Calcutta, and this was the driving force behind Rao’s lasting interest in the design of experiments. Rao’s initial research was on practical statistics, and four years after entering the field, he obtained two of the greatest theoretical results in statistics history.

Mahalanobis had a University of Cambridge background, and he sent Rao to Cambridge to analyze some anthropological data available there. While at Cambridge, Rao requested R.A. Fisher to be his Ph.D. thesis advisor. As a precondition, Fisher had Rao spend substantial time, essentially daily, in his genetics laboratory. Only in the evening hours did he spend some “theoretical time” at King’s College, Cambridge. Not long after getting his Ph.D., he returned to the ISI and was made a full professor at age 29. He was also appointed as the head of the proverbial RTS (Research and Training School) of the ISI. Later, he was selected as Director of the UNESCO-approved International Statistical Education Center (ISEC) for training government officials in Africa and Southeast Asia. It was Rao who introduced the B.Stat and M.Stat programs at the ISI. After Mahalanobis’ death in 1972, Rao was appointed as the Director and Secretary of the ISI. He held these positions until his move to the United States in 1980 and mentored S.R.S. Varadhan, V.S. Varadarajan, D. Basu, T. Parthasarathy, Ranajit Chakraborty, T. Krishnan, and numerous others who were attracted to ISI after India gained independence.

Rao had very much wanted the ISI to be a world leader in work on computer-intensive methods in statistics. He took the time to learn programming, but he could not introduce the use of computers there because political opposition to computers at the ISI was too steep a barrier at that time. Rao deeply regretted this missed opportunity.

He permanently moved to the United States in 1980. He first came to the University of Pittsburgh, as multivariate analysis was a lifelong love. He then moved to the Pennsylvania State University (Penn State) and opened a Center for Multivariate Analysis there. Rao held the Eberly Family Professorship at Penn State until his retirement in 2001. He had a joint appointment at the University at Buffalo following his retirement from Penn State. Penn State has awarded the prestigious C.R. and Bhargavi Rao Prize in Statistics since 2003 for influential contributions and innovations in statistics. The first five winners, in alphabetical order, were Jim Berger, Peter Bickel, Larry Brown, Brad Efron, and Jayaram Sethuraman.

Rao leaves a significant legacy for the present and coming generations. If he had done nothing but write the 1973 Wiley text Linear Statistical Inference and Its Applications (7), that, by itself, would be a memorable legacy. Even half a century after it was written, a graduate student can get from that one book a self-contained, rigorous, and virtually error-free education in measure theory, matrix and abstract linear algebra, distribution theory, fixed sample statistical inference, linear models, ANOVA and MANOVA, multivariate analysis, and sequential and nonparametric statistics. It has already received more than 19,000 citations. His earlier book, Advanced Statistical Methods in Biometric Research (8), is also very well known and has received more than 6,800 citations. Rao wrote a total of 15 books and edited countless special topics volumes in probability and statistics. His latest such publication was the book Statistical Learning Using Neural Networks (9), a testimony to his lifelong motto of staying current. He has approximately 500 publications, many of which were groundbreaking. The Institute of Mathematical Statistics commemorated his 100th birthday in 2020, and, at that time, numerous leaders in statistics made short statements about Rao and his legacy: “Dr. Rao, your ideas have helped shape the entire field of statistics” (Larry Wasserman). “We are all collecting fruits of your insight from our statistical garden, a harvest due to your planting the precious seeds” (Sara van de Geer) (10).

To name a few of his awards and recognitions, Rao was given the second-highest civilian honor in India, the Padma Vibhushan, in 2001; the International Prize in Statistics in 2023; the US President’s National Medal of Science in 2002 (Fig. 1); the Guy Medal in Gold and Silver in 1965 and 2011, respectively, from the Royal Statistical Society; the Ramanujan Medal in 2003; and the Wilks Award of the American Statistical Association in 1989. He was the Wald Lecturer of the Institute of Mathematical Statistics in 1975. He was elected a member of the National Academy of Sciences, a Fellow of the Royal Society, and given honorary doctorate degrees by 38 universities in 19 different countries.

Fig. 1.

C.R. Rao receiving the National Medal of Science from President Bush, 2002. Image credit: Wikimedia Commons/NSF.

For all his honors, Rao was an unassuming man, most comfortable in an untucked short-sleeved shirt. He was a master of impromptu, subtle, brief humor. Rao abhorred people giving him respect for his power when he was Director of the ISI. He learned to cook simple homestyle meals at the dorm in King’s College, and cooking remained a passion. He was skilled at photography and gardening and an ardent admirer of Indian classical dance. His daughter Tejaswini is a renowned classical dancer. His son Veerendra is an engineer and, like his father, a skilled cook.

Philosophically, Rao was never an ideologue. He was open-minded about anything that worked or looked sensible. Among his best expositions on statistics and its role and responsibilities is his book Statistics and Truth, published in 1997 (11). Those of us who knew him personally, and the many others familiar with his work, celebrate his scholarly as well as institutional contributions. With heavy hearts, we say “Goodbye, Dr. Rao. Thank you for your inspiration and guidance. We will remember you.”

Acknowledgments

Author contributions

A.D. wrote the paper.

Competing interests

The author declares no competing interest.

References

1. Ritov Y., Bickel P. J., Achieving information bounds in non and semiparametric models. Ann. Statist. 18, 925–938 (1990).
2. Groeneboom P., Wellner J. A., Information Bounds and Nonparametric Maximum Likelihood Estimation (Springer Science & Business Media, 1992), vol. 19.
3. Bickel P. J., Discussion of “Information Bounds and Nonparametric Maximum Likelihood Estimation” (Piet Groeneboom and Jon A. Wellner). SIAM Rev. 36, 503–504 (1994).
4. Rao C. R., Factorial experiments derivable from combinatorial arrangements of arrays. J. R. Statist. Soc. 9, 128–139 (1947).
5. Rao C. R., Some statistical methods for comparison of growth curves. Biometrics 14, 1–17 (1958).
6. Rao C. R., Information and the accuracy attainable in the estimation of statistical parameters. Bull. Calcutta Math. Soc. 37, 81–91 (1945).
7. Rao C. R., Linear Statistical Inference and Its Applications (Wiley, 1973).
8. Rao C. R., Advanced Statistical Methods in Biometric Research (Wiley, 1952).
9. de Braganca Pereira B., Rao C. R., de Oliveira F. B., Statistical Learning Using Neural Networks: A Guide for Statisticians and Data Scientists with Python (CRC Press, 2020).
10. IMS Bulletin, 49, 4–6 (2020).
11. Rao C. R., Statistics and Truth: Putting Chance to Work (World Scientific Press, 1997).
