
PROBABILITY AND STOCHASTIC PROCESSES

A Friendly Introduction for Electrical and Computer Engineers

ACQUISITIONS EDITOR   Bill Zobrist
MARKETING MANAGER     Katherine Hepburn
PRODUCTION EDITOR     Ken Santor
DESIGNER              Laura Boucher

This book was set in Times Roman by the authors and printed and bound by Quebecor-Fairfield, Inc. The cover was printed by Phoenix Color Corporation.

About the cover: The cover art was developed by Ken Harris, a student in the ECE Department at Rutgers University, and shows a bivariate Gaussian probability density function.

This book is printed on acid-free paper. ∞

The paper in this book was manufactured by a mill whose forest management programs include sustained yield harvesting of its timberlands. Sustained yield harvesting principles ensure that the number of trees cut each year does not exceed the amount of new growth.

Copyright © 1999 John Wiley & Sons, Inc. All rights reserved.

No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning or otherwise, except as permitted under Sections 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, 222 Rosewood Drive, Danvers, MA 01923, (508) 750-8400, fax (508) 750-4470. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 605 Third Avenue, New York, NY 10158-0012, (212) 850-6011, fax (212) 850-6008, E-Mail: PERMREQ@WILEY.COM.

Library of Congress Cataloging-in-Publication Data

Yates, Roy D.
  Probability and stochastic processes : a friendly introduction for electrical & computer engineers / Roy D. Yates, David J. Goodman.
    p. cm.
  Includes index.
  ISBN (cloth : alk. paper)
  1. Probabilities. 2. Stochastic processes. I. Goodman, David J. II. Title.
  CIP

Printed in the United States of America

10 9 8 7 6 5 4 3 2 1

To Theresa and Liz

PREFACE

When we started teaching the course Probability and Stochastic Processes to Rutgers undergraduates in 1991, we never dreamed we would write a textbook on the subject. Our bookshelves contain more than a dozen probability texts, many of them directed at electrical engineering students. We respect most of them. However, we have yet to find one that works well for Rutgers students. We discovered to our surprise that the majority of our students have a hard time learning the subject. Beyond meeting degree requirements, the main motivation of most of our students is to learn how to solve practical problems. For the majority, the mathematical logic of probability theory is, in itself, of minor interest. What the students want most is an intuitive grasp of the basic concepts and lots of practice working on applications. The students told us that the textbooks we assigned, for all their mathematical elegance, didn’t meet their needs. To help them, we distributed copies of our lecture notes, which gradually grew into this book.

We also responded to student feedback by administering a half-hour quiz every week. A quiz contains ten questions designed to test a student’s grasp of the concepts presented that week. The quizzes provide rapid feedback to students on whether they are catching on to the new material. This is especially important in probability theory because much of the math appears deceptively simple to engineering students. Reading a text and attending lectures, they feel they understand everything presented to them. However, when confronted with problems to solve, they discover that it takes a lot of careful thought and practice to use the mathematics correctly. Although the equations and formulas are simple, knowing which one to use is difficult. This is a reversal from some mathematics courses, where the equations are given and the solutions are hard to obtain.

To meet the needs of our students, this book has several distinctive characteristics:

• The entire text adheres to a single model that begins with an experiment consisting of a procedure and observations.
• The mathematical logic is apparent to readers. Every fact is identified clearly as a definition, an axiom, or a theorem. There is an explanation, in simple English, of the intuition behind every concept when it first appears in the text.
• The mathematics of discrete random variables is introduced separately from the mathematics of continuous random variables.
• Stochastic processes fit comfortably within the unifying model of the text. They are introduced in Chapter 6, immediately after the presentations of discrete and continuous random variables. Subsequent material, including central limit theorem approximations, laws of large numbers, and statistical inference, then uses examples that reinforce stochastic process concepts.
• The text concludes with introductions to random signal processing and Markov chains.



• There is an abundance of exercises that put the theory to use. Many worked-out example problems are embedded in the text. Each section concludes with a simple quiz to help students gauge their grasp of that section. An appendix includes a complete solution for each quiz. At the end of each chapter, there are problems that span a range of difficulty.

We estimate that the material in this book represents about 125% of a one-semester undergraduate course. We suppose that every introduction to probability theory will spend about two thirds of a semester covering the material in the first five chapters. The remainder of a course will be devoted to about half of the material in the final six chapters, with the selection depending on the preferences of the instructor and the needs of the students.

Rutgers electrical and computer engineering students take this course in the first semester of junior year. The following semester they use much of the material in Principles of Communications. We have also used this book in an entry-level graduate course. That course covers the entire book in one semester, placing more emphasis on mathematical derivations and proofs than the undergraduate course. Although most of the early material in the book is familiar in advance to many graduate students, the course as a whole brings our diverse graduate student population up to a shared level of competence.

ORGANIZATION OF THE BOOK

The first five chapters carry the core material that is common to practically all introductory engineering courses in probability theory. Chapter 1 examines probability models defined on abstract sets. It introduces the set theory notation used throughout the book and states the three axioms of probability and several theorems that follow directly from the axioms. It defines conditional probability, the Law of Total Probability, Bayes’ theorem, and independence. The chapter concludes by presenting combinatorial principles and formulas that are used later in the book.

The second and third chapters apply this material to models of discrete random variables, introducing expected values, functions of random variables, variance, covariance, and conditional probability mass functions. Chapter 2 examines the properties of a single discrete random variable and Chapter 3 covers multiple random variables with the emphasis on pairs of discrete random variables. Chapters 4 and 5 present the same material for continuous random variables and mixed random variables. In studying Chapters 1–5, students encounter many of the same ideas three times in the contexts of abstract events, discrete random variables, and continuous random variables. We have found this repetition to be very helpful pedagogically.

The road map for the text indicates that there are three sets of subjects that follow from the core material in the first five chapters. Chapter 6 introduces the basic principles of stochastic processes. Chapters 10 and 11 build on this introduction to cover random signal processing and Markov chains, respectively. Chapters 7 and 8 cover sums of random variables, moment generating functions, the Central Limit Theorem, and laws of large numbers. There is a dotted line connecting Chapters 6 and 7 because some of the terminology introduced in Chapter 6 appears in Chapters 7 and 8. However, it is also possible to skip Chapter 6 and go directly from Chapter 5 to Chapter 7.
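For reference, the Law of Total Probability and Bayes’ theorem named above take the following standard form, stated here for a finite partition; the notation and numbering used in Chapter 1 may differ slightly. If $A_1, \dots, A_n$ are mutually exclusive events whose union is the sample space, with $P[A_i] > 0$ for each $i$, then for any event $B$,

\[
P[B] = \sum_{i=1}^{n} P[B \mid A_i]\,P[A_i],
\qquad
P[A_i \mid B] = \frac{P[B \mid A_i]\,P[A_i]}{\sum_{j=1}^{n} P[B \mid A_j]\,P[A_j]}.
\]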


we have borrowed the following marks:

Easier.  More Difficult.  Most Difficult.  Experts Only.
We have tried to assign difficulty marks based on the perception of a typical undergraduate engineering student. We expect that a course instructor is not likely to find a  problem a great challenge. Every ski area emphasizes that these designations are relative to the trails at that area. Similarly, the difficulty of our problems is relative to the other problems in this text.

HINTS ON STUDYING PROBABILITY

A lot of students find it hard to do well in this course. We think there are a few reasons for this difficulty. One reason is that some people find the concepts hard to use and understand. Many of them are successful students in other courses for whom the ideas of probability are so “weird” (or different from others) that it is very hard for them to catch on. Usually these students recognize that learning probability theory is a struggle, and most of them work hard enough to do well. However, they find themselves putting in more effort than in other courses to achieve similar results.

Other people have the opposite problem. The work looks easy to them, and they understand everything they hear in class and read in the book. There are good reasons for assuming this is easy material. There are very few basic concepts to absorb. The terminology (like the word probability), in most cases, contains familiar words. With a few exceptions, the mathematical manipulations are not complex. You can go a long way solving problems with a four-function calculator.

For many people, this apparent simplicity is dangerously misleading. The problem is that it is very tricky to apply the math to specific problems. A few of you will see things clearly enough to do everything right the first time. However, most people who do well in probability need to practice with a lot of examples to get comfortable with the work and to really understand what the subject is about. Students in this course end up like elementary school children who do well with multiplication tables and long division but bomb out on “word problems.” The hard part is figuring out what to do with the numbers, not actually doing it. Most of the work in this course is that way, and the only way to do well is to practice a lot. We have the short quizzes to show you how well you are doing. Taking the midterm and final is similar to running in a five-mile race. Most people can do it in a respectable time, provided they train for it. Some people look at the runners who do it and say, “I’m as strong as they are. I’ll just go out there and join in.” Without the training, most of them are exhausted and walking after a mile or two.

So, our advice to students is if this looks really weird to you, keep working at it. You will probably catch on. If it looks really simple, don’t get too complacent. It may be harder than you think. Get into the habit of doing the homework, and if you don’t answer all the quiz questions correctly, go over them until you understand each one.


I (DJG) will add one more personal remark. For many years, I have been paid to solve probability problems. This has been a major part of my research at Bell Labs and at Rutgers. Applying probability to engineering questions has been extremely helpful to me in my career, and it has led me to a few practical inventions. I hope you will find the material intrinsically interesting. I hope you will learn to work with it. I think you will have many occasions to use it in future courses and throughout your career.

We have worked hard to produce a text that will be useful to a large population of students and instructors. We welcome comments, criticism, and suggestions. Feel free to send us email at ryates@winlab.rutgers.edu or dgoodman@winlab.rutgers.edu. In addition, a companion website, http://www.winlab.rutgers.edu/probability, provides a variety of supplemental materials, including the MATLAB code used to produce many of the text examples.

FURTHER READING

University libraries have hundreds of books on probability. Of course, each book is written for a particular audience and has its own emphasis on certain topics. We encourage you to spend an afternoon in the library examining a wide selection. For students using this text, a short reference list on page 408 gives a sampling of the many books that may be helpful. Texts on a similar mathematical level to this text include [LG92, Dra67, Ros96]. For an emphasis on data analysis and statistics, [MR94] is very readable. For those wishing to follow up on the random signal processing material introduced in Chapter 10, we can recommend [Pap91, SW94, Vin98]. The material in Chapter 11 can be found in greatly expanded form in [Gal96, Ros96] and in a very accessible introduction in [Dra67]. The two volumes by Feller, [Fel68] and [Fel66], are classic texts in probability theory. For research engineers, [Pap91] is a valuable reference for stochastic processes.

ACKNOWLEDGMENTS

In contrast to the goddess Athena, who emerged fully grown from her father’s head, this book has had a long gestation and infancy in which it was nurtured by our students and teaching assistants. Over the three years in which we have used drafts of the book as the principal text for our courses, we have changed the book considerably in response to student comments (and complaints). We thank the young men and women who took the courses “Probability and Stochastic Processes” and “Stochastic Signals and Systems” at Rutgers in the past three years and provided us with criticism and suggestions on the successive drafts of the text.

Two undergraduates merit special thanks. Ken Harris created MATLAB demonstrations of many of the concepts presented in the text. (They are available at the companion website http://www.winlab.rutgers.edu/probability.) He also produced the diagram on the cover. Nisha Batra devised solutions for many of the homework problems in early chapters.

CONTENTS

  • CHAPTER 1 EXPERIMENTS, MODELS, AND PROBABILITIES - Getting Started with Probability
    • 1.1 Set Theory
    • 1.2 Applying Set Theory to Probability
    • 1.3 Probability Axioms
    • 1.4 Some Consequences of the Axioms
    • 1.5 Conditional Probability
    • 1.6 Independence
    • 1.7 Sequential Experiments and Tree Diagrams
    • 1.8 Counting Methods
    • 1.9 Independent Trials
      • Chapter Summary
      • Problems
  • CHAPTER 2 DISCRETE RANDOM VARIABLES
    • 2.1 Definitions
    • 2.2 Probability Mass Function
    • 2.3 Some Useful Discrete Random Variables
    • 2.4 Cumulative Distribution Function (CDF)
    • 2.5 Averages
    • 2.6 Functions of a Random Variable
    • 2.7 Expected Value of a Derived Random Variable
    • 2.8 Variance and Standard Deviation
    • 2.9 Conditional Probability Mass Function
      • Chapter Summary
      • Problems
  • CHAPTER 3 MULTIPLE DISCRETE RANDOM VARIABLES
    • 3.1 Joint Probability Mass Function
    • 3.2 Marginal PMF
    • 3.3 Functions of Two Random Variables
    • 3.4 Expectations
    • 3.5 Conditioning a Joint PMF by an Event
    • 3.6 Conditional PMF
    • 3.7 Independent Random Variables
    • 3.8 More Than Two Discrete Random Variables
      • Chapter Summary
      • Problems
  • CHAPTER 4 CONTINUOUS RANDOM VARIABLES - Continuous Sample Space
    • 4.1 The Cumulative Distribution Function
    • 4.2 Probability Density Function
    • 4.3 Expected Values
    • 4.4 Some Useful Continuous Random Variables
    • 4.5 Gaussian Random Variables
    • 4.6 Delta Functions, Mixed Random Variables
    • 4.7 Probability Models of Derived Random Variables
    • 4.8 Conditioning a Continuous Random Variable
      • Chapter Summary
      • Problems
  • CHAPTER 5 MULTIPLE CONTINUOUS RANDOM VARIABLES
    • 5.1 Joint Cumulative Distribution Function
    • 5.2 Joint Probability Density Function
    • 5.3 Marginal PDF
    • 5.4 Functions of Two Random Variables
    • 5.5 Expected Values
    • 5.6 Conditioning a Joint PDF by an Event
    • 5.7 Conditional PDF
    • 5.8 Independent Random Variables
    • 5.9 Jointly Gaussian Random Variables
    • 5.10 More Than Two Continuous Random Variables
      • Chapter Summary
      • Problems
  • CHAPTER 6 STOCHASTIC PROCESSES - Definitions
    • 6.1 Stochastic Process Examples
    • 6.2 Types of Stochastic Processes
    • 6.3 Random Variables from Random Processes
    • 6.4 Independent, Identically Distributed Random Sequences
    • 6.5 The Poisson Process
    • 6.6 The Brownian Motion Process
    • 6.7 Expected Value and Correlation
    • 6.8 Stationary Processes
    • 6.9 Wide Sense Stationary Random Processes
      • Chapter Summary
      • Problems
  • CHAPTER 7 SUMS OF RANDOM VARIABLES
    • 7.1 Expectations of Sums
    • 7.2 PDF of the Sum of Two Random Variables
    • 7.3 Moment Generating Function
    • 7.4 MGF of the Sum of Independent Random Variables
    • 7.5 Sums of Independent Gaussian Random Variables
    • 7.6 Random Sums of Independent Random Variables
    • 7.7 Central Limit Theorem
    • 7.8 Applications of the Central Limit Theorem
      • Chapter Summary
      • Problems
  • CHAPTER 8 THE SAMPLE MEAN
    • 8.1 Expected Value and Variance
    • 8.2 Useful Inequalities
    • 8.3 Sample Mean of Large Numbers
    • 8.4 Laws of Large Numbers
      • Chapter Summary
      • Problems
  • CHAPTER 9 STATISTICAL INFERENCE
    • 9.1 Significance Testing
    • 9.2 Binary Hypothesis Testing
    • 9.3 Multiple Hypothesis Test
    • 9.4 Estimation of a Random Variable
    • 9.5 Linear Estimation of X given Y
    • 9.6 MAP and ML Estimation
    • 9.7 Estimation of Model Parameters
      • Chapter Summary
      • Problems
  • CHAPTER 10 RANDOM SIGNAL PROCESSING
    • 10.1 Linear Filtering of a Random Process
    • 10.2 Power Spectral Density
    • 10.3 Cross Correlations
    • 10.4 Gaussian Processes
    • 10.5 White Gaussian Noise Processes
    • 10.6 Digital Signal Processing
      • Chapter Summary
      • Problems
  • CHAPTER 11 RENEWAL PROCESSES AND MARKOV CHAINS
    • 11.1 Renewal Processes
    • 11.2 Poisson Process
    • 11.3 Renewal-Reward Processes
    • 11.4 Discrete Time Markov Chains
    • 11.5 Discrete Time Markov Chain Dynamics
    • 11.6 Limiting State Probabilities
    • 11.7 State Classification
    • 11.8 Limit Theorems For Discrete Time Markov Chains
    • 11.9 Periodic States and Multiple Communicating Classes
    • 11.10 Continuous Time Markov Chains
    • 11.11 Birth-Death Processes and Queueing Systems
      • Chapter Summary
      • Problems
  • APPENDIX A COMMON RANDOM VARIABLES
    • A.1 Discrete Random Variables
    • A.2 Continuous Random Variables
  • APPENDIX B QUIZ SOLUTIONS


    • Quiz Solutions – Chapter 1

  • REFERENCES

CHAPTER 1

EXPERIMENTS, MODELS, AND PROBABILITIES

GETTING STARTED WITH PROBABILITY

You have read the “Hints on Studying Probability” in the Preface. Now you can begin. The title of this book is Probability and Stochastic Processes. We say and hear and read the word probability and its relatives (possible, probable, probably) in many contexts. Within the realm of applied mathematics, the meaning of probability is a question that has occupied mathematicians, philosophers, scientists, and social scientists for hundreds of years.

Everyone accepts that the probability of an event is a number between 0 and 1. Some people interpret probability as a physical property (like mass or volume or temperature) that can be measured. This is tempting when we talk about the probability that a coin flip will come up heads. This probability is closely related to the nature of the coin. Fiddling around with the coin can alter the probability of heads.

Another interpretation of probability relates to the knowledge that we have about something. We might assign a low probability to the truth of the statement It is raining now in Phoenix, Arizona, because of our knowledge that Phoenix is in the desert. However, our knowledge changes if we learn that it was raining an hour ago in Phoenix. This knowledge would cause us to assign a higher probability to the truth of the statement It is raining now in Phoenix.

Both views are useful when we apply probability theory to practical problems. Whichever view we take, we will rely on the abstract mathematics of probability, which consists of definitions, axioms, and inferences (theorems) that follow from the axioms. While the structure of the subject conforms to principles of pure logic, the terminology is not entirely abstract. Instead, it reflects the practical origins of probability theory, which was developed to describe phenomena that cannot be predicted with certainty.

The point of view is different from the one we took when we started studying physics. There we said that if you do the same thing in the same way over and over again – send a space shuttle into orbit, for example – the result will always be the same. To predict the result, you have to take account of all relevant facts.

The mathematics of probability begins when the situation is so complex that we just can’t replicate everything important exactly – like when we fabricate and test an integrated circuit. In this case, repetitions of the same procedure yield different results. The situation is not totally chaotic, however. While each outcome may be unpredictable, there are consistent patterns to be observed when you repeat the procedure a large number of times. Understanding these patterns helps engineers establish test procedures to ensure that a factory meets quality objectives. In this repeatable procedure (making and testing a chip) with unpredictable outcomes (the quality of individual chips), the probability is a number between 0 and 1 that states the proportion of times we expect a certain thing to happen, such as the proportion of chips that pass a test.

As an introduction to probability and stochastic processes, this book serves three purposes:

• It introduces students to the logic of probability theory.
• It helps students develop intuition into how the theory applies to practical situations.
• It teaches students how to apply probability theory to solving engineering problems.
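To make the relative-frequency idea in the chip-testing discussion above concrete, here is a minimal MATLAB sketch (MATLAB being the language of the companion-website examples); it is not taken from the text, and the pass probability and sample size below are made-up illustrative values:

% Minimal sketch with assumed values (not from the text): estimate the
% probability that a chip passes a test as the proportion of passes
% observed over many repetitions of the same procedure.
p = 0.9;                          % hypothetical probability that one chip passes
n = 10000;                        % number of simulated chips
passes = rand(1, n) < p;          % each chip passes independently with probability p
relativeFrequency = sum(passes) / n   % proportion of simulated chips that passed

As n grows, the relative frequency settles near p; this is the sense in which a probability states the proportion of times we expect a chip to pass the test.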

To exhibit the logic of the subject, we show clearly in the text three categories of theoretical material: definitions, axioms, and theorems. Definitions establish the logic of probability theory, while axioms are facts that we have to accept without proof. Theorems are consequences that follow logically from definitions and axioms. Each theorem has a proof that refers to definitions, axioms, and other theorems. Although there are dozens of definitions and theorems, there are only three axioms of probability theory. These three axioms are the foundation on which the entire subject rests.

To meet our goal of presenting the logic of the subject, we could set out the material as dozens of definitions followed by three axioms followed by dozens of theorems. Each theorem would be accompanied by a complete proof. While rigorous, this approach would completely fail to meet our second aim of conveying the intuition necessary to work on practical problems. To address this goal, we augment the purely mathematical material with a large number of examples of practical phenomena that can be analyzed by means of probability theory. We also interleave definitions and theorems, presenting some theorems with complete proofs, others with partial proofs, and omitting some proofs altogether. We find that most engineering students study probability with the aim of using it to solve practical problems, and we cater mostly to this goal. We also encourage students to take an interest in the logic of the subject – it is very elegant – and we feel that the material presented will be sufficient to enable these students to fill in the gaps we have left in the proofs.

Therefore, as you read this book you will find a progression of definitions, axioms, theorems, more definitions, and more theorems, all interleaved with examples and comments designed to contribute to your understanding of the theory. We also include brief quizzes that you should try to solve as you read the book. Each one will help you decide whether you have grasped the material presented just before the quiz. The problems at the end of each chapter give you more practice applying the material introduced in the chapter. They vary considerably in their level of difficulty. Some of them take you more deeply into the subject than the examples and quizzes do.
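For readers who want to see them immediately, the three axioms mentioned above are, in their standard form (Chapter 1 states them formally; its wording and notation may differ slightly from what follows): for a sample space $S$ and events $A, A_1, A_2, \ldots$,

\[
\text{Axiom 1: } P[A] \ge 0;
\qquad
\text{Axiom 2: } P[S] = 1;
\]
\[
\text{Axiom 3: } P[A_1 \cup A_2 \cup \cdots] = P[A_1] + P[A_2] + \cdots
\text{ for mutually exclusive events } A_1, A_2, \ldots
\]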