Probability, Statistics, and Random Processes (Fourth Edition) (English Edition)

Probability, Statistics, and Random Processes (Fourth Edition) (English Edition) is a book published in 2012 by Publishing House of Electronics Industry; its authors are Henry Stark and John Woods.

Basic Information

  • Title: Probability, Statistics, and Random Processes (Fourth Edition) (English Edition)
  • Authors: Henry Stark, John Woods
  • Publisher: Publishing House of Electronics Industry
  • Publication date: August 1, 2012
  • ISBN: 9787121176685

About the Book

This book presents the fundamental theory and applications of probability, statistics, and random processes from an engineering perspective. Across its 11 chapters, it opens with a brief introduction to probability theory, then devotes successive chapters to random variables, functions of random variables, expectation and moments, random vectors, statistics (including parameter estimation and hypothesis testing), random sequences, and both the fundamentals and a deeper treatment of random processes, before closing with applications to statistical signal processing. The text includes a large number of examples drawn from electronic and information systems, and each chapter offers a rich set of exercises.
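As a small taste of the statistics material the book covers (for instance, Section 6.3's procedure for a δ-confidence interval on the mean of a normal random variable when σX is known), here is a minimal Python sketch of that standard technique. It is not code from the book; the function name and parameters are illustrative.

```python
import math
import random

def mean_confidence_interval(samples, sigma, z=1.96):
    """Confidence interval for the mean of normally distributed data
    when the standard deviation sigma is known (z = 1.96 gives a
    95% interval, i.e. delta = 0.95)."""
    n = len(samples)
    xbar = sum(samples) / n          # sample mean
    half = z * sigma / math.sqrt(n)  # half-width of the interval
    return xbar - half, xbar + half

# Simulate 400 draws from N(5, 2^2) and form the interval.
random.seed(0)
data = [random.gauss(5.0, 2.0) for _ in range(400)]
lo, hi = mean_confidence_interval(data, sigma=2.0)
print(f"95% confidence interval for the mean: ({lo:.3f}, {hi:.3f})")
```

With n = 400 and σ = 2, the half-width is 1.96 · 2/√400 ≈ 0.196; roughly 95% of intervals constructed this way cover the true mean.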

Table of Contents

Preface 11
Chapter 1 Introduction to Probability 13
1.1 Introduction: Why Study Probability? 13
1.2 The Different Kinds of Probability 14
Probability as Intuition 14
Probability as the Ratio of Favorable to Total Outcomes (Classical Theory) 15
Probability as a Measure of Frequency of Occurrence 16
Probability Based on an Axiomatic Theory 17
1.3 Misuses, Miscalculations, and Paradoxes in Probability 19
1.4 Sets, Fields, and Events 20
Examples of Sample Spaces 20
1.5 Axiomatic Definition of Probability 27
1.6 Joint, Conditional, and Total Probabilities; Independence 32
Compound Experiments 35
1.7 Bayes’ Theorem and Applications 47
1.8 Combinatorics 50
Occupancy Problems 54
Extensions and Applications 58
1.9 Bernoulli Trials—Binomial and Multinomial Probability Laws 60
Multinomial Probability Law 66
1.10 Asymptotic Behavior of the Binomial Law: The Poisson Law 69
1.11 Normal Approximation to the Binomial Law 75
Summary 77
Problems 78
References 89
Chapter 2 Random Variables 91
2.1 Introduction 91
2.2 Definition of a Random Variable 92
2.3 Cumulative Distribution Function 95
Properties of FX(x) 96
Computation of FX(x) 97
2.4 Probability Density Function (pdf) 100
Four Other Common Density Functions 107
More Advanced Density Functions 109
2.5 Continuous, Discrete, and Mixed Random Variables 112
Some Common Discrete Random Variables 114
2.6 Conditional and Joint Distributions and Densities 119
Properties of Joint CDF FXY (x, y) 130
2.7 Failure Rates 149
Summary 153
Problems 153
References 161
Additional Reading 161
Chapter 3 Functions of Random Variables 163
3.1 Introduction 163
Functions of a Random Variable (FRV): Several Views 166
3.2 Solving Problems of the Type Y = g(X) 167
General Formula of Determining the pdf of Y = g(X) 178
3.3 Solving Problems of the Type Z = g(X, Y ) 183
3.4 Solving Problems of the Type V = g(X, Y ), W = h(X, Y ) 205
Fundamental Problem 205
Obtaining fVW Directly from fXY 208
3.5 Additional Examples 212
Summary 217
Problems 218
References 226
Additional Reading 226
Chapter 4 Expectation and Moments 227
4.1 Expected Value of a Random Variable 227
On the Validity of Equation 4.1-8 230
4.2 Conditional Expectations 244
Conditional Expectation as a Random Variable 251
4.3 Moments of Random Variables 254
Joint Moments 258
Properties of Uncorrelated Random Variables 260
Jointly Gaussian Random Variables 263
4.4 Chebyshev and Schwarz Inequalities 267
Markov Inequality 269
The Schwarz Inequality 270
4.5 Moment-Generating Functions 273
4.6 Chernoff Bound 276
4.7 Characteristic Functions 278
Joint Characteristic Functions 285
The Central Limit Theorem 288
4.8 Additional Examples 293
Summary 295
Problems 296
References 305
Additional Reading 306
Chapter 5 Random Vectors 307
5.1 Joint Distribution and Densities 307
5.2 Multiple Transformation of Random Variables 311
5.3 Ordered Random Variables 314
5.4 Expectation Vectors and Covariance Matrices 323
5.5 Properties of Covariance Matrices 326
Whitening Transformation 330
5.6 The Multidimensional Gaussian (Normal) Law 331
5.7 Characteristic Functions of Random Vectors 340
Properties of CF of Random Vectors 342
The Characteristic Function of the Gaussian (Normal) Law 343
Summary 344
Problems 345
References 351
Additional Reading 351
Chapter 6 Statistics: Part 1 Parameter Estimation 352
6.1 Introduction 352
Independent, Identically Distributed Observations 353
Estimation of Probabilities 355
6.2 Estimators 358
6.3 Estimation of the Mean 360
Properties of the Mean-Estimator Function (MEF) 361
Procedure for Getting a δ-Confidence Interval on the Mean of a Normal Random Variable When σX Is Known 364
Confidence Interval for the Mean of a Normal Distribution When σX Is Not Known 364
Procedure for Getting a δ-Confidence Interval Based on n Observations on the Mean of a Normal Random Variable When σX Is Not Known 367
Interpretation of the Confidence Interval 367
6.4 Estimation of the Variance and Covariance 367
Confidence Interval for the Variance of a Normal Random Variable 369
Estimating the Standard Deviation Directly 371
Estimating the Covariance 372
6.5 Simultaneous Estimation of Mean and Variance 373
6.6 Estimation of Non-Gaussian Parameters from Large Samples 375
6.7 Maximum Likelihood Estimators 377
6.8 Ordering, More on Percentiles, Parametric Versus Nonparametric Statistics 381
The Median of a Population Versus Its Mean 383
Parametric versus Nonparametric Statistics 384
Confidence Interval on the Percentile 385
Confidence Interval for the Median When n Is Large 387
6.9 Estimation of Vector Means and Covariance Matrices 388
Estimation of μ 389
Estimation of the Covariance K 390
6.10 Linear Estimation of Vector Parameters 392
Summary 396
Problems 396
References 400
Additional Reading 401
Chapter 7 Statistics: Part 2 Hypothesis Testing 402
7.1 Bayesian Decision Theory 403
7.2 Likelihood Ratio Test 408
7.3 Composite Hypotheses 414
Generalized Likelihood Ratio Test (GLRT) 415
How Do We Test for the Equality of Means of Two Populations? 420
Testing for the Equality of Variances for Normal Populations: The F-test 424
Testing Whether the Variance of a Normal Population Has a Predetermined Value 428
7.4 Goodness of Fit 429
7.5 Ordering, Percentiles, and Rank 435
How Ordering is Useful in Estimating Percentiles and the Median 437
Confidence Interval for the Median When n Is Large 440
Distribution-free Hypothesis Testing: Testing If Two Populations Are the Same Using Runs 441
Ranking Test for Sameness of Two Populations 444
Summary 445
Problems 445
References 451
Chapter 8 Random Sequences 453
8.1 Basic Concepts 454
Infinite-length Bernoulli Trials 459
Continuity of Probability Measure 464
Statistical Specification of a Random Sequence 466
8.2 Basic Principles of Discrete-Time Linear Systems 483
8.3 Random Sequences and Linear Systems 489
8.4 WSS Random Sequences 498
Power Spectral Density 501
Interpretation of the psd 502
Synthesis of Random Sequences and Discrete-Time Simulation 505
Decimation 508
Interpolation 509
8.5 Markov Random Sequences 512
ARMA Models 515
Markov Chains 516
8.6 Vector Random Sequences and State Equations 523
8.7 Convergence of Random Sequences 525
8.8 Laws of Large Numbers 533
Summary 538
Problems 538
References 553
Chapter 9 Random Processes 555
9.1 Basic Definitions 556
9.2 Some Important Random Processes 560
Asynchronous Binary Signaling 560
Poisson Counting Process 562
Alternative Derivation of Poisson Process 567
Random Telegraph Signal 569
Digital Modulation Using Phase-Shift Keying 570
Wiener Process or Brownian Motion 572
Markov Random Processes 575
Birth–Death Markov Chains 579
Chapman–Kolmogorov Equations 583
Random Process Generated from Random Sequences 584
9.3 Continuous-Time Linear Systems with Random Inputs 584
White Noise 589
9.4 Some Useful Classifications of Random Processes 590
Stationarity 591
9.5 Wide-Sense Stationary Processes and LSI Systems 593
Wide-Sense Stationary Case 594
Power Spectral Density 596
An Interpretation of the psd 598
More on White Noise 602
Stationary Processes and Differential Equations 608
9.6 Periodic and Cyclostationary Processes 612
9.7 Vector Processes and State Equations 618
State Equations 620
Summary 623
Problems 623
References 645
Chapters 10 and 11 are available as Web chapters on the companion Web site at www.pearsoninternationaleditions.com/stark.
Chapter 10 Advanced Topics in Random Processes 647
10.1 Mean-Square (m.s.) Calculus 647
Stochastic Continuity and Derivatives [10-1] 647
Further Results on m.s. Convergence [10-1] 657
10.2 Mean-Square Stochastic Integrals 662
10.3 Mean-Square Stochastic Differential Equations 665
10.4 Ergodicity [10-3] 670
10.5 Karhunen–Loève Expansion [10-5] 677
10.6 Representation of Bandlimited and Periodic Processes 683
Bandlimited Processes 683
Bandpass Random Processes 686
WSS Periodic Processes 689
Fourier Series for WSS Processes 692
Summary 694
Appendix: Integral Equations 694
Existence Theorem 695
Problems 698
References 711
Chapter 11 Applications to Statistical Signal Processing 712
11.1 Estimation of Random Variables and Vectors 712
More on the Conditional Mean 718
Orthogonality and Linear Estimation 720
Some Properties of the Operator Ê 728
11.2 Innovation Sequences and Kalman Filtering 730
Predicting Gaussian Random Sequences 734
Kalman Predictor and Filter 736
Error-Covariance Equations 741
11.3 Wiener Filters for Random Sequences 745
Unrealizable Case (Smoothing) 746
Causal Wiener Filter 748
11.4 Expectation-Maximization Algorithm 750
Log-likelihood for the Linear Transformation 752
Summary of the E-M algorithm 754
E-M Algorithm for Exponential Probability Functions 755
Application to Emission Tomography 756
Log-likelihood Function of Complete Data 758
E-step 759
M-step 760
11.5 Hidden Markov Models (HMM) 761
Specification of an HMM 763
Application to Speech Processing 765
Efficient Computation of P[E|M] with a Recursive Algorithm 766
Viterbi Algorithm and the Most Likely State Sequence for the Observations 768
11.6 Spectral Estimation 771
The Periodogram 772
Bartlett’s Procedure: Averaging Periodograms 774
Parametric Spectral Estimate 779
Maximum Entropy Spectral Density 781
11.7 Simulated Annealing 784
Gibbs Sampler 785
Noncausal Gauss–Markov Models 786
Compound Markov Models 790
Gibbs Line Sequence 791
Summary 795
Problems 795
References 800
Appendix A Review of Relevant Mathematics A-1
A.1 Basic Mathematics A-1
Sequences A-1
Convergence A-2
Summations A-3
Z-Transform A-3
A.2 Continuous Mathematics A-4
Definite and Indefinite Integrals A-5
Differentiation of Integrals A-6
Integration by Parts A-7
Completing the Square A-7
Double Integration A-8
Functions A-8
A.3 Residue Method for Inverse Fourier Transformation A-10
Fact A-11
Inverse Fourier Transform for psd of Random Sequence A-13
A.4 Mathematical Induction A-17
References A-17
Appendix B Gamma and Delta Functions B-1
B.1 Gamma Function B-1
B.2 Incomplete Gamma Function B-2
B.3 Dirac Delta Function B-2
References B-5
Appendix C Functional Transformations and Jacobians C-1
C.1 Introduction C-1
C.2 Jacobians for n = 2 C-2
C.3 Jacobian for General n C-4
Appendix D Measure and Probability D-1
D.1 Introduction and Basic Ideas D-1
Measurable Mappings and Functions D-3
D.2 Application of Measure Theory to Probability D-3
Distribution Measure D-4
Appendix E Sampled Analog Waveforms and Discrete-time Signals E-1
Appendix F Independence of Sample Mean and Variance for Normal Random Variables F-1
Appendix G Tables of Cumulative Distribution Functions: the Normal, Student t, Chi-square, and F G-1
Index I-1
