Linear Statistical Inference and its Applications
WILEY SERIES IN PROBABILITY AND STATISTICS
Established by WALTER A. SHEWHART and SAMUEL S. WILKS
Editors: Peter Bloomfield, Noel A. C. Cressie, Nicholas I. Fisher, Iain M. Johnstone, J. B. Kadane, Louise M. Ryan, David W. Scott, Bernard W. Silverman, Adrian F. M. Smith, Jozef L. Teugels
Editors Emeriti: Vic Barnett, Ralph A. Bradley, J. Stuart Hunter, David G. Kendall
A complete list of the titles in this series appears at the end of this volume.
Linear Statistical Inference and its Applications
Second Edition
C. RADHAKRISHNA RAO
Pennsylvania State University
A Wiley-Interscience Publication
JOHN WILEY & SONS, INC.
This text is printed on acid-free paper.
Copyright © 1965, 1973 by John Wiley & Sons, Inc. All rights reserved.
Paperback edition published 2002.
Published simultaneously in Canada.
No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4744. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 605 Third Avenue, New York, NY 10158-0012, (212) 850-6011, fax (212) 850-6008, E-Mail: PERMREQ@WILEY.COM.
For ordering and customer service, call 1-800-CALL-WILEY.
Library of Congress Cataloging in Publication Data is available.
ISBN 0-471-21875-8
Printed in the United States of America
10 9 8 7 6 5 4 3
Preface

The purpose of this book is to present up-to-date theory and techniques of statistical inference in a logically integrated and practical form. Essentially, it incorporates the important developments in the subject that have taken place in the last three decades. It is written for readers with a background knowledge of mathematics and statistics at the undergraduate level.

Quantitative inference, if it were to retain its scientific character, could not be divested of its logical, mathematical, and probabilistic aspects. The main approach to statistical inference is inductive reasoning, by which we arrive at “statements of uncertainty.” The rigorous expression that degrees of uncertainty require is furnished by the mathematical methods and probability concepts which form the foundations of modern statistical theory. It was my awareness that advanced mathematical methods and probability theory are indispensable accompaniments in a self-contained treatment of statistical inference that prompted me to devote the first chapter of this book to a detailed discussion of vector spaces and matrix methods and the second chapter to a measure-theoretic exposition of probability and development of probability tools and techniques.

Statistical inference techniques, if not applied to the real world, will lose their import and appear to be deductive exercises. Furthermore, it is my belief that in a statistical course emphasis should be given both to the mathematical theory of statistics and to the application of the theory to practical problems. A detailed discussion on the application of a statistical technique facilitates better understanding of the theory behind the technique. To this end, in this book, live examples have been interwoven with mathematical results.
In addition, a large number of problems are given at the end of each chapter. Some are intended to complement main results derived in the body of the chapter, whereas others are meant to serve as exercises for the reader to test his understanding of theory.

The selection and presentation of material to cover the wide field of
statistical inference have not been easy. I have been guided by my own experience in teaching undergraduate and graduate students, and in conducting and guiding research in statistics during the last twenty years. I have selected and presented the essential tools of statistics and discussed in detail their theoretical bases to enable the readers to equip themselves for consultation work or for pursuing specialized studies and research in statistics.

Why Chapter 1 provides a rather lengthy treatment of the algebra of vectors and matrices needs some explanation. First, the mathematical treatment of statistical techniques in subsequent chapters depends heavily on vector spaces and matrix methods; and second, vector and matrix algebra constitute a branch of mathematics widely used in modern treatises on natural, biological, and social sciences. The subject matter of the chapter is given a logical and rigorous treatment and is developed gradually to an advanced level. All the important theorems and derived results are presented in a form readily adaptable for use by research workers in different branches of science.

Chapter 2 contains a systematic development of the probability tools and techniques needed for dealing with statistical inference. Starting with the axioms of probability, the chapter proceeds to formulate the concepts of a random variable, distribution function, and conditional expectation and distributions. These are followed by a study of characteristic functions, probability distributions in infinite-dimensional product spaces, and all the important limit theorems. Chapter 2 also provides numerous propositions, which find frequent use in some of the other chapters and also serve as good equipment for those who want to specialize in advanced probability theory.

Chapter 3 deals with continuous probability models and the sampling distributions needed for statistical inference.
Some of the important distributions frequently used in practice, such as the normal, Gamma, Cauchy, and other distributions, are introduced through appropriate probability models on physical mechanisms generating the observations. A special feature of this chapter is a discussion of problems in statistical mechanics relating to the equilibrium distribution of particles.

Chapter 4 is devoted to inference through the technique of analysis of variance. The Gauss-Markoff linear model and the associated problems of estimation and testing are treated in their wide generality. The problem of variance components is considered as a special case of the more general problem of estimating intraclass correlation coefficients. A unified treatment is provided of multiclassified data under different sampling schemes for classes within categories.

The different theories and methods of estimation form the subject matter of Chapter 5. Some of the controversies on the topic of estimation are examined; and to remove some of the existing inconsistencies, certain modifications are introduced in the criteria of estimation in large samples.

Problems of specification, and associated tests of homogeneity of parallel
samples and estimates, are dealt with in Chapter 6. The choice of a mathematical model from which the observations could be deemed to have arisen is of fundamental importance because subsequent statistical computations will be made on the framework of the chosen model. Appropriate tests have been developed to check the adequacy of proposed models on the basis of available facts.

Chapter 7 provides the theoretical background for the different aspects of statistical inference, such as testing of hypotheses, interval estimation, experimentation, the problem of identification, nonparametric inference, and so on.

Chapter 8, the last chapter, is concerned with inference from multivariate data. A special feature of this chapter is a study of the multivariate normal distribution through a simple characterization, instead of through the density function. The characterization simplifies the multivariate theory and enables suitable generalizations to be made from the univariate theory without further analysis. It also provides the necessary background for studying multivariate normal distributions in more general situations, such as distributions on Hilbert space.

Certain notations have been used throughout the book to indicate sections and other references. The following examples will help in their interpretation. A subsection such as 4f.3 means subsection 3 in section f of Chapter 4. Equation (4f.3.6) is the equation numbered 6 in subsection 4f.3, and Table 4f.3β is the table numbered second in subsection 4f.3. The main propositions (or theorems) in each subsection are numbered (i), (ii), etc. A back reference such as [(iii), 5d.2] indicates proposition (iii) in subsection 5d.2.

A substantial part of the book was written while I was a visiting professor at the Johns Hopkins University, Baltimore, in 1963-1964, under a Senior Scientist Fellowship scheme of the National Science Foundation, U.S.A. At the Johns Hopkins University, I had the constant advice of G. S.
Watson, Professor of Statistics, who read the manuscript at the various stages of its preparation. Comments by Herman Chernoff on Chapters 7 and 8, by Rupert Miller and S. W. Dharmadhikari on Chapter 2, and by Ralph Bradley on Chapters 1 and 3 have been extremely helpful in the preparation of the final manuscript. I wish to express my thanks to all of them. The preparation and revision of the manuscript would not have been an easy task without the help of G. M. Das, who undertook the heavy burden of typing and organizing the manuscript for the press with great care and diligence.

Finally, I wish to express my gratitude to the late Sir Ronald A. Fisher and to Professor P. C. Mahalanobis, under whose influence I have come to appreciate statistics as the new technology of the present century.

Calcutta, India
June, 1965
C. R. Rao
Preface to the Second Edition

As in the first edition, the aim has been to provide in a single volume a full discussion of the wide range of statistical methods useful for consulting statisticians and, at the same time, to present in a rigorous manner the mathematical and logical tools employed in deriving statistical procedures, with which a research worker should be familiar.

A good deal of new material is added, and the book is brought up to date in several respects, both in theory and applications. Some of the important additions are different types of generalized inverses, concepts of statistics and subfields, MINQUE theory of variance components, the law of iterated logarithms and sequential tests with power one, analysis of dispersion with structural parameters, discrimination between composite hypotheses, growth models, theorems on characteristic functions, etc.

Special mention may be made of the new material on estimation of parameters in a linear model when the observations have a possibly singular covariance matrix. The existing theories and methods due to Gauss (1809) and Aitken (1935) are applicable only when the covariance matrix is known to be nonsingular. The new unified approaches discussed in the book (Section 4i) are valid for all situations, whether the covariance matrix is singular or not.

A large number of new exercises and complements have been added.

I wish to thank Dr. M. S. Avadhani, Dr. J. K. Ghosh, Dr. A. Maitra, Dr. P. E. Nüesch, Dr. Y. R. Sarma, Dr. H. Toutenberg, and Dr. E. J. Williams for their suggestions while preparing the second edition.

New Delhi
C. R. Rao
Contents

CHAPTER 1
Algebra of Vectors and Matrices    1

1a. Vector Spaces    2
    1a.1 Definition of Vector Spaces and Subspaces, 1a.2 Basis of a Vector Space, 1a.3 Linear Equations, 1a.4 Vector Spaces with an Inner Product
    Complements and Problems    11
1b. Theory of Matrices and Determinants    14
    1b.1 Matrix Operations, 1b.2 Elementary Matrices and Diagonal Reduction of a Matrix, 1b.3 Determinants, 1b.4 Transformations, 1b.5 Generalized Inverse of a Matrix, 1b.6 Matrix Representation of Vector Spaces, Bases, etc., 1b.7 Idempotent Matrices, 1b.8 Special Products of Matrices
    Complements and Problems    30
1c. Eigenvalues and Reduction of Matrices    34
    1c.1 Classification and Transformation of Quadratic Forms, 1c.2 Roots of Determinantal Equations, 1c.3 Canonical Reduction of Matrices, 1c.4 Projection Operator, 1c.5 Further Results on g-Inverse, 1c.6 Restricted Eigenvalue Problem
1d. Convex Sets in Vector Spaces    51
    1d.1 Definitions, 1d.2 Separation Theorems for Convex Sets
1e. Inequalities    53
    1e.1 Cauchy-Schwarz (C-S) Inequality, 1e.2 Hölder's Inequality, 1e.3 Hadamard's Inequality, 1e.4 Inequalities Involving Moments, 1e.5 Convex Functions and Jensen's Inequality, 1e.6 Inequalities in Information Theory, 1e.7 Stirling's Approximation
1f. Extrema of Quadratic Forms    60
    1f.1 General Results, 1f.2 Results Involving Eigenvalues and Vectors, 1f.3 Minimum Trace Problems
    Complements and Problems    67

CHAPTER 2
Probability Theory, Tools and Techniques    79

2a. Calculus of Probability    80
    2a.1 The Space of Elementary Events, 2a.2 The Class of Subsets (Events), 2a.3 Probability as a Set Function, 2a.4 Borel Field (σ-field) and Extension of Probability Measure, 2a.5 Notion of a Random Variable and Distribution Function, 2a.6 Multidimensional Random Variable, 2a.7 Conditional Probability and Statistical Independence, 2a.8 Conditional Distribution of a Random Variable
2b. Mathematical Expectation and Moments of Random Variables    92
    2b.1 Properties of Mathematical Expectation, 2b.2 Moments, 2b.3 Conditional Expectation, 2b.4 Characteristic Function (c.f.), 2b.5 Inversion Theorems, 2b.6 Multivariate Moments
2c. Limit Theorems    108
    2c.1 Kolmogorov Consistency Theorem, 2c.2 Convergence of a Sequence of Random Variables, 2c.3 Law of Large Numbers, 2c.4 Convergence of a Sequence of Distribution Functions, 2c.5 Central Limit Theorems, 2c.6 Sums of Independent Random Variables
2d. Family of Probability Measures and Problems of Statistics    130
    2d.1 Family of Probability Measures, 2d.2 The Concept of a Sufficient Statistic, 2d.3 Characterization of Sufficiency
Appendix 2A Stieltjes and Lebesgue Integrals    132
Appendix 2B Some Important Theorems in Measure Theory