
Information Theory and an Extension of the Maximum Likelihood Principle

Abstract

In this paper it is shown that the classical maximum likelihood principle can be considered to be a method of asymptotic realization of an optimum estimate with respect to a very general information theoretic criterion. This observation shows an extension of the principle to provide answers to many practical problems of statistical model fitting.
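The information-theoretic criterion sketched in this abstract is what later became known as the Akaike Information Criterion (AIC). As a minimal illustration — not taken from the paper itself — the Gaussian least-squares form of the criterion, AIC = n log(RSS/n) + 2k up to an additive constant, can be used to compare a constant model against a straight-line model on synthetic data; the variable names and the deterministic "noise" term are assumptions made for this sketch.

```python
import math

def aic_gaussian(rss, n, k):
    """Gaussian least-squares form of the criterion: n*log(RSS/n) + 2k,
    dropping the additive constant, which is shared by all candidate models."""
    return n * math.log(rss / n) + 2 * k

# Synthetic data: a straight line plus a small deterministic perturbation,
# so the example is reproducible without a random number generator.
x = [i * 0.1 for i in range(100)]
y = [2.0 + 3.0 * xi + 0.3 * math.sin(7.0 * xi) for xi in x]
n = len(x)

# Model A: constant (k = 1), fitted by the sample mean.
mean_y = sum(y) / n
rss_a = sum((yi - mean_y) ** 2 for yi in y)

# Model B: straight line (k = 2), fitted by ordinary least squares.
mean_x = sum(x) / n
sxx = sum((xi - mean_x) ** 2 for xi in x)
sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
slope = sxy / sxx
intercept = mean_y - slope * mean_x
rss_b = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))

aic_a = aic_gaussian(rss_a, n, 1)
aic_b = aic_gaussian(rss_b, n, 2)
best = "line" if aic_b < aic_a else "constant"
```

The model with the smaller criterion value is preferred; the 2k term penalizes the extra parameter, so a more complex model wins only when it reduces the residual sum of squares enough to pay for its additional parameters.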


References
  1. Akaike, H., Fitting autoregressive models for prediction. Ann. Inst. Statist. Math. 21 (1969) 243–247.

  2. Akaike, H., Statistical predictor identification. Ann. Inst. Statist. Math. 22 (1970) 203–217.

  3. Akaike, H., On a semi-automatic power spectrum estimation procedure. Proc. 3rd Hawaii International Conference on System Sciences, 1970, 974–977.

  4. Akaike, H., On a decision procedure for system identification. Preprints, IFAC Kyoto Symposium on System Engineering Approach to Computer Control, 1970, 486–490.

  5. Akaike, H., Autoregressive model fitting for control. Ann. Inst. Statist. Math. 23 (1971) 163–180.

  6. Akaike, H., Determination of the number of factors by an extended maximum likelihood principle. Research Memo. 44, Inst. Statist. Math., March 1971.

  7. Bartlett, M.S., The statistical approach to the analysis of time-series. Symposium on Information Theory (mimeographed Proceedings), Ministry of Supply, London, 1950, 81–101.

  8. Billingsley, P., Statistical Inference for Markov Processes. Univ. Chicago Press, Chicago, 1961.

  9. Blackwell, D., Equivalent comparisons of experiments. Ann. Math. Statist. 24 (1953) 265–272.

  10. Campbell, L.L., Equivalence of Gauss's principle and minimum discrimination information estimation of probabilities. Ann. Math. Statist. 41 (1970) 1011–1015.

  11. Fisher, R.A., Theory of statistical estimation. Proc. Camb. Phil. Soc. 22 (1925) 700–725. Reprinted in Contributions to Mathematical Statistics, John Wiley & Sons, New York, 1950, paper 11.

  12. Good, I.J., Maximum entropy for hypothesis formulation, especially for multidimensional contingency tables. Ann. Math. Statist. 34 (1963) 911–934.

  13. Gorman, J.W. and Toman, R.J., Selection of variables for fitting equations to data. Technometrics 8 (1966) 27–51.

  14. Jenkins, G.M. and Watts, D.G., Spectral Analysis and Its Applications. Holden-Day, San Francisco, 1968.

  15. Kullback, S. and Leibler, R.A., On information and sufficiency. Ann. Math. Statist. 22 (1951) 79–86.

  16. Kullback, S., Information Theory and Statistics. John Wiley & Sons, New York, 1959.

  17. Le Cam, L., On some asymptotic properties of maximum likelihood estimates and related Bayes estimates. Univ. Calif. Publ. in Stat. 1 (1953) 277–330.

  18. Lehmann, E.L., Testing Statistical Hypotheses. John Wiley & Sons, New York, 1969.

  19. Otomo, T., Nakagawa, T. and Akaike, H., Statistical approach to computer control of cement rotary kilns. Automatica 8 (1972) 35–48.

  20. Rényi, A., Statistics and information theory. Studia Sci. Math. Hung. 2 (1967) 249–256.

  21. Savage, L.J., The Foundations of Statistics. John Wiley & Sons, New York, 1954.

  22. Shannon, C.E. and Weaver, W., The Mathematical Theory of Communication. Univ. of Illinois Press, Urbana, 1949.

  23. Wald, A., Tests of statistical hypotheses concerning several parameters when the number of observations is large. Trans. Am. Math. Soc. 54 (1943) 426–482.

  24. Wald, A., Note on the consistency of the maximum likelihood estimate. Ann. Math. Statist. 20 (1949) 595–601.

  25. Wald, A., Statistical Decision Functions. John Wiley & Sons, New York, 1950.

  26. Whittle, P., The statistical analysis of a seiche record. J. Marine Res. 13 (1954) 76–100.

  27. Whittle, P., Prediction and Regulation. English Univ. Press, London, 1963.

  28. Wiener, N., Cybernetics. John Wiley & Sons, New York, 1948.


Author information
  1. Institute of Statistical Mathematics, Japan

    Hirotugu Akaike

Editor information
  1. Department of Statistics, Texas A&M University, College Station, TX 77843, USA

    Emanuel Parzen

  2. The Institute of Statistical Mathematics, 4-6-7 Minami-Azabu, Minato-ku, Tokyo 106, Japan

    Kunio Tanabe & Genshiro Kitagawa

Copyright information

© 1998 Springer Science+Business Media New York

Cite this chapter

Akaike, H. (1998). Information Theory and an Extension of the Maximum Likelihood Principle. In: Parzen, E., Tanabe, K., Kitagawa, G. (eds) Selected Papers of Hirotugu Akaike. Springer Series in Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4612-1694-0_15

