
Probability and statistics for economists / Bruce E. Hansen.

By: Hansen, Bruce E.
Material type: Text
Publisher: Princeton : Princeton University Press, [2022]
Copyright date: 2022
Description: 1 volume : illustrations ; 26 cm
Content type:
  • text
Media type:
  • unmediated
Carrier type:
  • volume
ISBN:
  • 0691235945
  • 9780691235943
Additional physical formats: Online version: Probability and statistics for economists
DDC classification:
  • 330.015195 23
LOC classification:
  • HB139 .H3638 2022
Contents:
1. Basic Probability Theory -- 1.1. Introduction -- 1.2. Outcomes and Events -- 1.3. Probability Function -- 1.4. Properties of the Probability Function -- 1.5. Equally Likely Outcomes -- 1.6. Joint Events -- 1.7. Conditional Probability -- 1.8. Independence -- 1.9. Law of Total Probability -- 1.10. Bayes Rule -- 1.11. Permutations and Combinations -- 1.12. Sampling with and without Replacement -- 1.13. Poker Hands -- 1.14. Sigma Fields* -- 1.15. Technical Proofs* -- 1.16. Exercises
2. Random Variables -- 2.1. Introduction -- 2.2. Random Variables -- 2.3. Discrete Random Variables -- 2.4. Transformations -- 2.5. Expectation -- 2.6. Finiteness of Expectations -- 2.7. Distribution Function -- 2.8. Continuous Random Variables -- 2.9. Quantiles -- 2.10. Density Functions -- 2.11. Transformations of Continuous Random Variables -- 2.12. Non-Monotonic Transformations -- 2.13. Expectation of Continuous Random Variables -- 2.14. Finiteness of Expectations -- 2.15. Unifying Notation -- 2.16. Mean and Variance -- 2.17. Moments -- 2.18. Jensen's Inequality -- 2.19. Applications of Jensen's Inequality* -- 2.20. Symmetric Distributions -- 2.21. Truncated Distributions -- 2.22. Censored Distributions -- 2.23. Moment Generating Function -- 2.24. Cumulants -- 2.25. Characteristic Function -- 2.26. Expectation: Mathematical Details* -- 2.27. Exercises
3. Parametric Distributions -- 3.1. Introduction -- 3.2. Bernoulli Distribution -- 3.3. Rademacher Distribution -- 3.4. Binomial Distribution -- 3.5. Multinomial Distribution -- 3.6. Poisson Distribution -- 3.7. Negative Binomial Distribution -- 3.8. Uniform Distribution -- 3.9. Exponential Distribution -- 3.10. Double Exponential Distribution -- 3.11. Generalized Exponential Distribution -- 3.12. Normal Distribution -- 3.13. Cauchy Distribution -- 3.14. Student t Distribution -- 3.15. Logistic Distribution -- 3.16. Chi-Square Distribution -- 3.17. Gamma Distribution -- 3.18. F Distribution -- 3.19. Non-Central Chi-Square -- 3.20. Beta Distribution -- 3.21. Pareto Distribution -- 3.22. Lognormal Distribution -- 3.23. Weibull Distribution -- 3.24. Extreme Value Distribution -- 3.25. Mixtures of Normals -- 3.26. Technical Proofs* -- 3.27. Exercises
4. Multivariate Distributions -- 4.1. Introduction -- 4.2. Bivariate Random Variables -- 4.3. Bivariate Distribution Functions -- 4.4. Probability Mass Function -- 4.5. Probability Density Function -- 4.6. Marginal Distribution -- 4.7. Bivariate Expectation -- 4.8. Conditional Distribution for Discrete X -- 4.9. Conditional Distribution for Continuous X -- 4.10. Visualizing Conditional Densities -- 4.11. Independence -- 4.12. Covariance and Correlation -- 4.13. Cauchy-Schwarz Inequality -- 4.14. Conditional Expectation -- 4.15. Law of Iterated Expectations -- 4.16. Conditional Variance -- 4.17. Hölder's and Minkowski's Inequalities* -- 4.18. Vector Notation -- 4.19. Triangle Inequalities* -- 4.20. Multivariate Random Vectors -- 4.21. Pairs of Multivariate Vectors -- 4.22. Multivariate Transformations -- 4.23. Convolutions -- 4.24. Hierarchical Distributions -- 4.25. Existence and Uniqueness of the Conditional Expectation* -- 4.26. Identification -- 4.27. Exercises
5. Normal and Related Distributions -- 5.1. Introduction -- 5.2. Univariate Normal -- 5.3. Moments of the Normal Distribution -- 5.4. Normal Cumulants -- 5.5. Normal Quantiles -- 5.6. Truncated and Censored Normal Distributions -- 5.7. Multivariate Normal -- 5.8. Properties of the Multivariate Normal -- 5.9. Chi-Square, t, F, and Cauchy Distributions -- 5.10. Hermite Polynomials* -- 5.11. Technical Proofs* -- 5.12. Exercises
6. Sampling -- 6.1. Introduction -- 6.2. Samples -- 6.3. Empirical Illustration -- 6.4. Statistics, Parameters, and Estimators -- 6.5. Sample Mean -- 6.6. Expected Value of Transformations -- 6.7. Functions of Parameters -- 6.8. Sampling Distribution -- 6.9. Estimation Bias -- 6.10. Estimation Variance -- 6.11. Mean Squared Error -- 6.12. Best Unbiased Estimator -- 6.13. Estimation of Variance -- 6.14. Standard Error -- 6.15. Multivariate Means -- 6.16. Order Statistics* -- 6.17. Higher Moments of Sample Mean* -- 6.18. Normal Sampling Model -- 6.19. Normal Residuals -- 6.20. Normal Variance Estimation -- 6.21. Studentized Ratio -- 6.22. Multivariate Normal Sampling -- 6.23. Exercises
7. Law of Large Numbers -- 7.1. Introduction -- 7.2. Asymptotic Limits -- 7.3. Convergence in Probability -- 7.4. Chebyshev's Inequality -- 7.5. Weak Law of Large Numbers -- 7.6. Counterexamples -- 7.7. Examples -- 7.8. Illustrating Chebyshev's Inequality -- 7.9. Vector-Valued Moments -- 7.10. Continuous Mapping Theorem -- 7.11. Examples -- 7.12. Uniformity Over Distributions* -- 7.13. Almost Sure Convergence and the Strong Law* -- 7.14. Technical Proofs* -- 7.15. Exercises
8. Central Limit Theory -- 8.1. Introduction -- 8.2. Convergence in Distribution -- 8.3. Sample Mean -- 8.4. A Moment Investigation -- 8.5. Convergence of the Moment Generating Function -- 8.6. Central Limit Theorem -- 8.7. Applying the Central Limit Theorem -- 8.8. Multivariate Central Limit Theorem -- 8.9. Delta Method -- 8.10. Examples -- 8.11. Asymptotic Distribution for Plug-In Estimator -- 8.12. Covariance Matrix Estimation -- 8.13. t-Ratios -- 8.14. Stochastic Order Symbols -- 8.15. Technical Proofs* -- 8.16. Exercises
9. Advanced Asymptotic Theory* -- 9.1. Introduction -- 9.2. Heterogeneous Central Limit Theory -- 9.3. Multivariate Heterogeneous Central Limit Theory -- 9.4. Uniform Central Limit Theory -- 9.5. Uniform Integrability -- 9.6. Uniform Stochastic Bounds -- 9.7. Convergence of Moments -- 9.8. Edgeworth Expansion for the Sample Mean -- 9.9. Edgeworth Expansion for Smooth Function Model -- 9.10. Cornish-Fisher Expansions -- 9.11. Technical Proofs*
10. Maximum Likelihood Estimation -- 10.1. Introduction -- 10.2. Parametric Model -- 10.3. Likelihood -- 10.4. Likelihood Analog Principle -- 10.5. Invariance Property -- 10.6. Examples -- 10.7. Score, Hessian, and Information -- 10.8. Examples -- 10.9. Cramér-Rao Lower Bound -- 10.10. Examples -- 10.11. Cramér-Rao Bound for Functions of Parameters -- 10.12. Consistent Estimation -- 10.13. Asymptotic Normality -- 10.14. Asymptotic Cramér-Rao Efficiency -- 10.15. Variance Estimation -- 10.16. Kullback-Leibler Divergence -- 10.17. Approximating Models -- 10.18. Distribution of the MLE under Misspecification -- 10.19. Variance Estimation under Misspecification -- 10.20. Technical Proofs* -- 10.21. Exercises
11. Method of Moments -- 11.1. Introduction -- 11.2. Multivariate Means -- 11.3. Moments -- 11.4. Smooth Functions -- 11.5. Central Moments -- 11.6. Best Unbiased Estimation -- 11.7. Parametric Models -- 11.8. Examples of Parametric Models -- 11.9. Moment Equations -- 11.10. Asymptotic Distribution for Moment Equations -- 11.11. Example: Euler Equation -- 11.12. Empirical Distribution Function -- 11.13. Sample Quantiles -- 11.14. Robust Variance Estimation -- 11.15. Technical Proofs* -- 11.16. Exercises
12. Numerical Optimization -- 12.1. Introduction -- 12.2. Numerical Function Evaluation and Differentiation -- 12.3. Root Finding -- 12.4. Minimization in One Dimension -- 12.5. Failures of Minimization -- 12.6. Minimization in Multiple Dimensions -- 12.7. Constrained Optimization -- 12.8. Nested Minimization -- 12.9. Tips and Tricks -- 12.10. Exercises
13. Hypothesis Testing -- 13.1. Introduction -- 13.2. Hypotheses -- 13.3. Acceptance and Rejection -- 13.4. Type I and Type II Errors -- 13.5. One-Sided Tests -- 13.6. Two-Sided Tests -- 13.7. What Does "Accept ℍ0" Mean about ℍ0? -- 13.8. t Test with Normal Sampling -- 13.9. Asymptotic t Test -- 13.10. Likelihood Ratio Test for Simple Hypotheses -- 13.11. Neyman-Pearson Lemma -- 13.12. Likelihood Ratio Test against Composite Alternatives -- 13.13. Likelihood Ratio and t Tests -- 13.14. Statistical Significance -- 13.15. p-Value -- 13.16. Composite Null Hypothesis -- 13.17. Asymptotic Uniformity -- 13.18. Summary -- 13.19. Exercises
14. Confidence Intervals -- 14.1. Introduction -- 14.2. Definitions -- 14.3. Simple Confidence Intervals -- 14.4. Confidence Intervals for the Sample Mean under Normal Sampling -- 14.5. Confidence Intervals for the Sample Mean under Non-Normal Sampling -- 14.6. Confidence Intervals for Estimated Parameters -- 14.7. Confidence Interval for the Variance -- 14.8. Confidence Intervals by Test Inversion -- 14.9. Use of Confidence Intervals -- 14.10. Uniform Confidence Intervals -- 14.11. Exercises
15. Shrinkage Estimation -- 15.1. Introduction -- 15.2. Mean Squared Error -- 15.3. Shrinkage -- 15.4. James-Stein Shrinkage Estimator -- 15.5. Numerical Calculation -- 15.6. Interpretation of the Stein Effect -- 15.7. Positive-Part Estimator -- 15.8. Summary -- 15.9. Technical Proofs* -- 15.10. Exercises
16. Bayesian Methods -- 16.1. Introduction -- 16.2. Bayesian Probability Model -- 16.3. Posterior Density -- 16.4. Bayesian Estimation -- 16.5. Parametric Priors -- 16.6. Normal-Gamma Distribution -- 16.7. Conjugate Prior -- 16.8. Bernoulli Sampling -- 16.9. Normal Sampling -- 16.10. Credible Sets -- 16.11. Bayesian Hypothesis Testing -- 16.12. Sampling Properties in the Normal Model -- 16.13. Asymptotic Distribution -- 16.14. Technical Proofs* -- 16.15. Exercises
17. Nonparametric Density Estimation -- 17.1. Introduction -- 17.2. Histogram Density Estimation -- 17.3. Kernel Density Estimator -- 17.4. Bias of Density Estimator -- 17.5. Variance of Density Estimator -- 17.6. Variance Estimation and Standard Errors -- 17.7. Integrated Mean Squared Error of Density Estimator -- 17.8. Optimal Kernel -- 17.9. Reference Bandwidth -- 17.10. Sheather-Jones Bandwidth* -- 17.11. Recommendations for Bandwidth Selection -- 17.12. Practical Issues in Density Estimation -- 17.13. Computation -- 17.14. Asymptotic Distribution -- 17.15. Undersmoothing -- 17.16. Technical Proofs* -- 17.17. Exercises
18. Empirical Process Theory -- 18.1. Introduction -- 18.2. Framework -- 18.3. Glivenko-Cantelli Theorem -- 18.4. Packing, Covering, and Bracketing Numbers -- 18.5. Uniform Law of Large Numbers -- 18.6. Functional Central Limit Theory -- 18.7. Conditions for Asymptotic Equicontinuity -- 18.8. Donsker's Theorem -- 18.9. Technical Proofs* -- 18.10. Exercises
Appendix 1. Limits -- Appendix 2. Series -- Appendix 3. Factorials -- Appendix 4. Exponentials -- Appendix 5. Logarithms -- Appendix 6. Differentiation -- Appendix 7. Mean Value Theorem -- Appendix 8. Integration -- Appendix 9. Gaussian Integral -- Appendix 10. Gamma Function -- Appendix 11. Matrix Algebra.
Summary: "A comprehensive introduction to probability and statistics through the perspective of applying it to economic analysis"-- Provided by publisher.
Holdings
  • Item type: Book; Current library: City Campus, Main Collection; Call number: 330.015195 HAN; Status: Issued; Date due: 29/09/2024; Barcode: A537433B
  • Item type: Book; Current library: City Campus, Main Collection; Call number: 330.015195 HAN; Status: Available; Barcode: A537429B

Includes bibliographical references and index.

