Chapter 1 Mathematical Methods

1.1 Time average vs. ensemble average

time average:
• mean: $\bar{x} = \lim_{T\to\infty}\frac{1}{T}\int_{-T/2}^{T/2}x(t)\,dt$
• mean-square: $\overline{x^2} = \lim_{T\to\infty}\frac{1}{T}\int_{-T/2}^{T/2}x(t)^2\,dt$
• variance: $\sigma^2 = \overline{x^2} - \bar{x}^2$
• auto-correlation: $\phi_x(\tau) = \overline{x(t)x(t+\tau)} = \lim_{T\to\infty}\frac{1}{T}\int_{-T/2}^{T/2}x(t)\,x(t+\tau)\,dt$

ensemble average:
• mean: $\langle x(t)\rangle = \int_{-\infty}^{\infty}x\,f_1(x,t)\,dx$
• mean-square: $\langle x(t)^2\rangle = \int_{-\infty}^{\infty}x^2\,f_1(x,t)\,dx$
• variance: $\sigma^2 = \langle x(t)^2\rangle - \langle x(t)\rangle^2$
• covariance: $\langle x(t_1)x(t_2)\rangle = \iint x_1 x_2\,f_2(x_1,t_1;x_2,t_2)\,dx_1\,dx_2$

$f_1(x,t)$: first-order probability density function
$f_2(x_1,t_1;x_2,t_2)$: second-order probability density function

If time average = ensemble average, the ensemble is called an "ergodic ensemble."
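The time averages above can be estimated directly from a sampled waveform. The following Python sketch is an added illustration, not part of the original notes; the AR(1) test waveform and its 50 ms memory time are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder waveform: exponentially correlated noise generated by a
# discrete-time AR(1) recursion with memory time tau_c = 50 ms.
dt, N, tau_c = 1e-3, 100_000, 0.05
a = np.exp(-dt / tau_c)
x = np.zeros(N)
for n in range(1, N):
    x[n] = a * x[n - 1] + np.sqrt(1 - a**2) * rng.standard_normal()

def autocorr(x, k):
    """Time-average autocorrelation phi_x(k*dt), averaged over t."""
    return np.mean(x[: len(x) - k] * x[k:])

print(x.mean())                                # time-average mean (~0)
print(x.var())                                 # time-average variance (~1)
print([autocorr(x, k) for k in (0, 50, 100)])  # decays as exp(-k*dt/tau_c)
```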
1.2 Stationary vs. non-stationary processes

• If the k-th order probability density function is invariant with respect to a shift of the time origin, the process is stationary of order k.
• If a stochastic process is stationary of any order k = 1, 2, …, it is strictly stationary.
• If $\langle x(t)\rangle$ and $\langle x(t)^2\rangle$ are independent of $t$, i.e. constants, and $\langle x(t_1)x(t_2)\rangle$ depends only on the time difference $\tau = t_2 - t_1$, the process is wide-sense stationary (weakly stationary).
• If $\bar{x} = \langle x(t)\rangle$, the process is ergodic in the mean.
• If $\overline{x(t)x(t+\tau)} = \langle x(t)x(t+\tau)\rangle$, the process is ergodic in the autocorrelation.

example 1: a basket full of batteries. Each member function x(t) is the constant voltage of one battery drawn from the basket: the ensemble statistics do not depend on time, but the time average over a single battery never reproduces the ensemble average — stationary but not ergodic.

example 2: $x(t) = \cos(\omega_0 t + \theta)$ with $\theta$ uniformly distributed in $[0, 2\pi)$ — ergodic in both mean and autocorrelation.
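A quick numerical check of example 2 (an added illustration; the carrier frequency and record length are arbitrary): the time average over one member function agrees with the ensemble average over the random phase.

```python
import numpy as np

rng = np.random.default_rng(1)
w0 = 2 * np.pi * 5.0                       # assumed carrier frequency
t = np.linspace(0, 100, 100_001)           # long record for the time average

# Time average over a single member function (one random phase):
theta = rng.uniform(0, 2 * np.pi)
x = np.cos(w0 * t + theta)
time_mean = x.mean()                       # -> 0
time_ac = np.mean(x[:-200] * x[200:])      # phi_x(tau) at tau = t[200]

# Ensemble average at fixed t over many random phases:
thetas = rng.uniform(0, 2 * np.pi, 100_000)
tau = t[200]
ens_mean = np.cos(w0 * t[0] + thetas).mean()                 # -> 0
ens_ac = np.mean(np.cos(w0 * t[0] + thetas) *
                 np.cos(w0 * (t[0] + tau) + thetas))         # -> cos(w0*tau)/2

print(time_mean, ens_mean)                 # both ~ 0
print(time_ac, ens_ac, 0.5 * np.cos(w0 * tau))
```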
1.3 Basic stochastic processes

1.3.1 Probability distribution functions and characteristic functions

probability mass function (PMF): $P(x)$ — discrete random variable
z-transform: $P(z) \equiv \sum_x P(x)\,z^x = \langle z^x\rangle$
mean: $\langle x\rangle = \frac{d}{dz}P(z)\big|_{z=1}$, mean-square: $\langle x^2\rangle = \big[\frac{d^2}{dz^2}P(z) + \frac{d}{dz}P(z)\big]_{z=1}$

probability density function (PDF): $f(x)$ — continuous random variable
s-transform: $F(s) \equiv \int_{-\infty}^{\infty} f(x)\,e^{-sx}\,dx = \langle e^{-sx}\rangle$

The cumulants follow from the derivatives of $\ln F(s)$ at $s = 0$:
mean: $\langle x\rangle = -\frac{d}{ds}\ln F(s)\big|_{s=0}$
variance: $\sigma^2 = \langle x^2\rangle - \langle x\rangle^2 = \frac{d^2}{ds^2}\ln F(s)\big|_{s=0}$
third order cumulant: $\kappa_3 = \langle(x-\langle x\rangle)^3\rangle = -\frac{d^3}{ds^3}\ln F(s)\big|_{s=0}$
fourth order cumulant: $\kappa_4 = \langle(x-\langle x\rangle)^4\rangle - 3\sigma^4 = \frac{d^4}{ds^4}\ln F(s)\big|_{s=0}$
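As a numerical illustration of the cumulants (added here; the exponential PDF with rate λ = 2 is an arbitrary test case, for which $\kappa_n = (n-1)!/\lambda^n$):

```python
import numpy as np

rng = np.random.default_rng(2)
lam = 2.0                                   # assumed rate of an exponential PDF
x = rng.exponential(1 / lam, 2_000_000)

m = x.mean()
c2 = np.mean((x - m) ** 2)                  # kappa_2 = 1/lam^2
c3 = np.mean((x - m) ** 3)                  # kappa_3 = 2/lam^3
c4 = np.mean((x - m) ** 4) - 3 * c2 ** 2    # kappa_4 = 6/lam^4

print(c2, 1 / lam**2)
print(c3, 2 / lam**3)
print(c4, 6 / lam**4)
```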
1.3.2 The Bernoulli process

A. Bernoulli trial
PMF: $P(1) = p$, $P(0) = 1 - p$
z-transform: $P(z) = 1 - p + pz$
$\langle x\rangle = p,\qquad \sigma_x^2 = p(1-p)$
B. Binomial distribution
A series of n independent Bernoulli trials with the same probability of success p produces k successes.
z-transform: by the definition of the z-transform and the independence of the trials,
$P(z) = (1 - p + pz)^n$
PMF (the successive terms of the expansion correspond to no success, one success, two successes, …):
$P(k) = \binom{n}{k}p^k(1-p)^{n-k},\qquad k = 0, 1, \ldots, n$
$\langle k\rangle = np,\qquad \sigma_k^2 = np(1-p)$
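A minimal simulation of the binomial distribution as a sum of Bernoulli trials (an added check; n, p, and the number of realizations are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, trials = 40, 0.3, 200_000

# Each row is one series of n Bernoulli trials; k counts the successes.
k = (rng.random((trials, n)) < p).sum(axis=1)

print(k.mean(), n * p)                     # <k> = np
print(k.var(), n * p * (1 - p))            # var  = np(1-p)
```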
C. Geometric distribution
Number of Bernoulli trials after any one success, up to and including the next success.
PMF (successive failures followed by the first success):
$P(l_1) = (1-p)^{l_1-1}\,p,\qquad l_1 = 1, 2, \ldots$
z-transform: $P(z) = \dfrac{pz}{1-(1-p)z}$
$\langle l_1\rangle = \dfrac{1}{p},\qquad \sigma_{l_1}^2 = \dfrac{1-p}{p^2}$
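A corresponding check for the geometric distribution (added; NumPy's geometric sampler counts trials up to and including the first success, matching the definition above):

```python
import numpy as np

rng = np.random.default_rng(4)
p, trials = 0.2, 500_000

# Number of trials up to and including the next success.
l1 = rng.geometric(p, trials)

print(l1.mean(), 1 / p)                    # <l1> = 1/p
print(l1.var(), (1 - p) / p**2)            # var  = (1-p)/p^2
```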
1.3.3 The Poisson process

A. Poisson distribution
A series of identical and independent Bernoulli trials, one every $\Delta t$, with a probability of success $p = \lambda\Delta t$, for a sufficiently small $\Delta t$.

Number of successes over a time interval [0, t]? Summing the mutually exclusive histories that lead to k successes at $t + \Delta t$,
$P(k, t+\Delta t) = P(k, t)\,(1-\lambda\Delta t) + P(k-1, t)\,\lambda\Delta t$
(continuous limit $\Delta t \to 0$)
$\frac{d}{dt}P(k,t) = -\lambda P(k,t) + \lambda P(k-1,t)$
Iterative solution for k = 0, 1, 2, … with the initial condition $P(k,0) = \delta_{k,0}$:
$P(k) = \frac{\mu^k}{k!}\,e^{-\mu}$ : PMF
$\mu = \lambda t$ : average number of successful events
z-transform: $P(z) = e^{\mu(z-1)}$
$\langle k\rangle = \mu,\qquad \sigma_k^2 = \mu$
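The Bernoulli-trial construction can be checked numerically (an added sketch; λ, t, and Δt are arbitrary). Each realization consists of n = t/Δt Bernoulli trials with p = λΔt, drawn here in one call as a binomial variate; the counts follow the Poisson PMF with μ = λt:

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(5)
lam, t, dt = 3.0, 2.0, 1e-3                # rate, time interval, trial spacing
n, p = int(t / dt), lam * dt               # many trials, small success probability
k = rng.binomial(n, p, 500_000)            # successes in [0, t] per realization

mu = lam * t
for kk in range(8):
    print(kk, np.mean(k == kk), mu**kk * exp(-mu) / factorial(kk))
```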
B. Erlang distribution
Time interval $l_r$ between any one success and the r-th success after that:
(r − 1) successful events in $(0, l_r)$ and one successful event in $(l_r, l_r + dl_r)$.
PDF: $f(l_r) = \frac{(\lambda l_r)^{r-1}}{(r-1)!}\,\lambda\,e^{-\lambda l_r}$
For r = 1, $f(l_1) = \lambda e^{-\lambda l_1}$ (exponential distribution)
s-transform: $F(s) = \left(\frac{\lambda}{\lambda+s}\right)^r$
$l_r$ is the sum of r independent, exponentially distributed random variables, so
$\langle l_1\rangle = \frac{1}{\lambda},\qquad \sigma_{l_1}^2 = \frac{1}{\lambda^2}$
$\langle l_r\rangle = \frac{r}{\lambda},\qquad \sigma_{l_r}^2 = \frac{r}{\lambda^2}$
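Since $l_r$ is the sum of r independent exponential intervals, its moments are easy to verify (an added sketch; λ and r are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)
lam, r, trials = 2.0, 5, 500_000

# l_r as the sum of r independent exponential inter-event times.
lr = rng.exponential(1 / lam, (trials, r)).sum(axis=1)

print(lr.mean(), r / lam)                  # <l_r> = r/lam
print(lr.var(), r / lam**2)                # var   = r/lam^2
```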
C. Addition and random deletion of Poisson processes
i) w = x + y, where x and y are two independent Poisson random variables:
$P_x(z) = e^{\langle x\rangle(z-1)},\qquad P_y(z) = e^{\langle y\rangle(z-1)}$
compound PMF (due to the independence of x and y, the z-transforms multiply):
$P_w(z) = P_x(z)\,P_y(z) = e^{(\langle x\rangle + \langle y\rangle)(z-1)}$
$P(w) = \frac{w_0^{\,w}}{w!}\,e^{-w_0},\qquad w_0 = \langle x\rangle + \langle y\rangle$ : Poisson distribution

cf. The weak law of large numbers: the sample mean $M_n = \frac{1}{n}(x_1 + x_2 + \cdots + x_n)$ of n independent, identically distributed random variables converges in probability to the ensemble mean $\langle x\rangle$ as $n \to \infty$.
ii) random deletion:
initial Poisson distribution $P(n_0) = \frac{\langle n_0\rangle^{n_0}}{n_0!}e^{-\langle n_0\rangle}$; each particle survives independently with probability $\eta$ (binomial distribution):
$P(n) = \sum_{n_0=n}^{\infty} P(n_0)\binom{n_0}{n}\eta^n(1-\eta)^{n_0-n} = \frac{(\eta\langle n_0\rangle)^n}{n!}\,e^{-\eta\langle n_0\rangle}$ : final Poisson distribution
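A numerical check that random deletion preserves the Poisson character (added; μ and η are arbitrary). A Poisson distribution is identified by mean = variance:

```python
import numpy as np

rng = np.random.default_rng(7)
mu, eta, trials = 20.0, 0.3, 500_000

n0 = rng.poisson(mu, trials)               # initial Poisson variable
n = rng.binomial(n0, eta)                  # random deletion, survival prob. eta

# A Poisson distribution has mean = variance:
print(n.mean(), n.var(), eta * mu)
```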
D. Binomial to Poisson distribution
$n \to \infty$, $p \to 0$ (very small probability of success), with $np = \mu$ fixed:
$P(k) = \binom{n}{k}p^k(1-p)^{n-k} \;\longrightarrow\; \frac{\mu^k}{k!}\,e^{-\mu}$ : Poisson distribution
A sequence of single Bernoulli trials with a constant and small probability of success produces a Poisson distribution: n independent Bernoulli trials with probability of success $p = \mu/n$ ($\mu$: constant, $n \to \infty$) = definition of the Poisson process. Physically, it corresponds to a memoryless system with a very fast internal relaxation.
1.3.4 The Gaussian process
binomial distribution with n very large and p, 1 − p not very close to zero: P(k) has a pronounced peak at $k_0 = np$ and can be considered a function of a continuous variable k.
Taylor series expansion of $\ln P(k)$ about $k_0$ for a small deviation $k - k_0$:
$\ln P(k) = \ln P(k_0) + \frac{1}{2}\frac{d^2}{dk^2}\ln P(k)\Big|_{k_0}(k-k_0)^2 + \cdots$
(the first derivative vanishes at the peak). Evaluating the second derivative from the binomial PMF and truncating the Taylor expansion,
$P(k) \simeq \frac{1}{\sqrt{2\pi np(1-p)}}\exp\left[-\frac{(k-np)^2}{2np(1-p)}\right]$ : Gaussian distribution
cf. The central limit theorem: regardless of the individual random-variable PDF, the sum of n independent, identically distributed random variables converges to the Gaussian PDF as $n \to \infty$.
s-transform: $F(s) = \exp\left(-\langle x\rangle s + \frac{\sigma^2 s^2}{2}\right)$
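The Gaussian limit of the binomial PMF can be verified directly (an added sketch; n = 1000 and p = 0.4 are arbitrary):

```python
import numpy as np
from math import comb, exp, pi, sqrt

n, p = 1000, 0.4
k0, var = n * p, n * p * (1 - p)

# Compare the exact binomial PMF with its Gaussian approximation near the peak.
for k in range(int(k0) - 2, int(k0) + 3):
    binom = comb(n, k) * p**k * (1 - p) ** (n - k)
    gauss = exp(-((k - k0) ** 2) / (2 * var)) / sqrt(2 * pi * var)
    print(k, binom, gauss)
```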
1.4 Burgess variance theorem

Random deletion with a constant survival probability η: for a fixed number $n_0$ of incident particles, the surviving number n obeys a binomial distribution. If the number of incident particles fluctuates with an initial distribution $P(n_0)$, the final particle number does not obey a simple binomial distribution:
$P(n) = \sum_{n_0}P(n_0)\binom{n_0}{n}\eta^n(1-\eta)^{n_0-n}$ (initial distribution × binomial distribution)
$\langle n\rangle = \eta\,\langle n_0\rangle$
$\sigma_n^2 = \eta^2\,\sigma_{n_0}^2 + \eta(1-\eta)\,\langle n_0\rangle$ : Burgess variance theorem
The first term is the initial variance reduced by the attenuation factor $\eta^2$; the second term is the partition noise added by the random deletion.
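A simulation of the Burgess variance theorem (added; the thermal-like Bose-Einstein input distribution with mean 10 and the attenuation η = 0.4 are arbitrary test choices):

```python
import numpy as np

rng = np.random.default_rng(8)
eta, trials = 0.4, 500_000

# Fluctuating input: Bose-Einstein (shifted geometric) distribution with
# mean 10 and variance <n0>(<n0>+1).
n0 = rng.geometric(1 / 11.0, trials) - 1   # support {0,1,2,...}, mean 10
n = rng.binomial(n0, eta)                  # random deletion

pred = eta**2 * n0.var() + eta * (1 - eta) * n0.mean()
print(n.mean(), eta * n0.mean())           # <n> = eta <n0>
print(n.var(), pred)                       # Burgess variance theorem
```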
1.5 Fourier transform (analysis)
If x(t) is absolutely integrable, $\int_{-\infty}^{\infty}|x(t)|\,dt < \infty$, the Fourier transform of x(t) exists and is defined by
$X(i\omega) = \int_{-\infty}^{\infty}x(t)\,e^{-i\omega t}\,dt$
The inverse relation is
$x(t) = \frac{1}{2\pi}\int_{-\infty}^{\infty}X(i\omega)\,e^{i\omega t}\,d\omega$
A statistically stationary process is not absolutely integrable, so strictly speaking, its Fourier transform does not exist.
gated function:
$x_T(t) = \begin{cases} x(t), & |t| \le T/2 \\ 0, & |t| > T/2 \end{cases}$
The gated function is absolutely integrable, so its Fourier transform $X_T(i\omega)$ exists.
1.5.1 Parseval theorem
$\int_{-\infty}^{\infty}x_1(t)\,x_2(t)\,dt = \frac{1}{2\pi}\int_{-\infty}^{\infty}X_1(i\omega)\,X_2(i\omega)^*\,d\omega$ : Parseval theorem
For example, for $x_1(t) = x_2(t) = x(t)$,
$\int_{-\infty}^{\infty}x(t)^2\,dt = \frac{1}{2\pi}\int_{-\infty}^{\infty}|X(i\omega)|^2\,d\omega$ : energy theorem
$X(i\omega)$ : complex amplitude of the component of x(t) at $\omega$
$|X(i\omega)|^2$ : energy density of the component of x(t) at $\omega$
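The energy theorem can be checked numerically with a discrete Fourier transform (an added sketch; the random test record and sampling step are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(15)
dt, N = 1e-3, 4096
x = rng.standard_normal(N)                 # arbitrary finite-energy record

X = np.fft.rfft(x) * dt                    # X(i w) at discrete frequencies
domega = 2 * np.pi / (N * dt)
lhs = np.sum(x**2) * dt                    # integral of x(t)^2 dt
# (1/2pi) * integral of |X|^2 dw, doubling positive frequencies (x is real):
w2 = 2 * np.abs(X) ** 2
w2[0] /= 2                                 # DC counted once
w2[-1] /= 2                                # Nyquist counted once (N even)
rhs = np.sum(w2) * domega / (2 * np.pi)
print(lhs, rhs)                            # equal up to float rounding
```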
1.5.2 Power spectral density
average power of $x_T(t)$:
$\frac{1}{T}\int_{-T/2}^{T/2}x(t)^2\,dt = \frac{1}{2\pi T}\int_{-\infty}^{\infty}|X_T(i\omega)|^2\,d\omega$
ensemble averaged power of $x_T(t)$:
$\left\langle\frac{1}{T}\int_{-T/2}^{T/2}x(t)^2\,dt\right\rangle = \frac{1}{2\pi}\int_{-\infty}^{\infty}\frac{\langle|X_T(i\omega)|^2\rangle}{T}\,d\omega$
power spectral density:
$S_x(\omega) \equiv \lim_{T\to\infty}\frac{\langle|X_T(i\omega)|^2\rangle}{T}$
For a statistically stationary process this limit exists and is independent of the choice of time origin; for a statistically non-stationary process the ratio depends on the measurement time T, and the power spectral density must be understood as a measurement-time-dependent quantity.
1.5.3 Wiener-Khintchine theorem
Starting from the Parseval theorem and taking the ensemble average,
$\frac{\langle|X_T(i\omega)|^2\rangle}{T} = \frac{1}{T}\int_{-T/2}^{T/2}\!dt\int_{-T/2}^{T/2}\!dt'\,\langle x(t)x(t')\rangle\,e^{-i\omega(t'-t)}$
where $\langle x(t)x(t')\rangle$ is the ensemble averaged auto-correlation (covariance) function.
For a non-stationary process, $\langle x(t)x(t')\rangle$ depends on both t and t'; for a stationary process it depends only on $\tau = t' - t$, and
$S_x(\omega) = \int_{-\infty}^{\infty}\langle x(t)x(t+\tau)\rangle\,e^{-i\omega\tau}\,d\tau$
inverse relation:
$\langle x(t)x(t+\tau)\rangle = \frac{1}{2\pi}\int_{-\infty}^{\infty}S_x(\omega)\,e^{i\omega\tau}\,d\omega$
The power spectral density and the ensemble averaged auto-correlation function form a Fourier transform pair.

Example 1: White noise
$\langle x(t)x(t+\tau)\rangle = \langle x^2\rangle\,e^{-|\tau|/\tau_c}$
$\langle x^2\rangle$ : mean square, $\tau_c$ : correlation time (memory time)
$S_x(\omega) = \frac{2\langle x^2\rangle\,\tau_c}{1+\omega^2\tau_c^2}$ : Lorentzian
If $\tau_c \to 0$ (infinitesimally short memory time), the power spectrum becomes white.
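The Wiener-Khintchine pair of Example 1 can be checked numerically (an added sketch; the sampling step, memory time, and segment count are arbitrary). An exponentially correlated waveform is generated by an AR(1) recursion, its unilateral spectrum $2\langle|X_T|^2\rangle/T$ is averaged over segments, and the result is compared with the unilateral Lorentzian $4\langle x^2\rangle\tau_c/(1+\omega^2\tau_c^2)$:

```python
import numpy as np

rng = np.random.default_rng(9)
dt, tau_c, x2 = 1e-3, 0.02, 1.0            # sampling step, memory time, <x^2>
nseg, nfft = 200, 4096

a = np.exp(-dt / tau_c)                    # AR(1) coefficient
S = np.zeros(nfft // 2 + 1)
x = 0.0
for _ in range(nseg):
    w = rng.standard_normal(nfft) * np.sqrt(x2 * (1 - a**2))
    seg = np.empty(nfft)
    for n in range(nfft):
        x = a * x + w[n]
        seg[n] = x
    X = np.fft.rfft(seg) * dt              # gated-function Fourier transform
    S += 2 * np.abs(X) ** 2 / (nfft * dt)  # unilateral |X_T|^2 / T
S /= nseg

omega = 2 * np.pi * np.fft.rfftfreq(nfft, dt)
lorentz = 4 * x2 * tau_c / (1 + (omega * tau_c) ** 2)
print(S[1:6])
print(lorentz[1:6])
```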
[Figure: Stationary vs. nonstationary noisy waveforms — a statistically stationary noisy waveform x(t) from t = 0, and a statistically nonstationary noisy waveform from t = 0.]
Example 2: Wiener-Levy process
The Wiener-Levy process is the cumulative (integrated) version of an uncorrelated noise: $x(t) = \int_0^t y(t')\,dt'$.
covariance:
$\langle x(t_1)x(t_2)\rangle = \int_0^{t_1}\!dt'\int_0^{t_2}\!dt''\,\langle y(t')y(t'')\rangle$
If y(t) is ergodic in the correlation, the Wiener-Khintchine theorem expresses $\langle y(t')y(t'')\rangle$ through the power spectral density $S_y(\omega)$. If
$\langle y(t')y(t'')\rangle = 2D\,\delta(t'-t'')$ (white noise),
then
$\langle x(t_1)x(t_2)\rangle = 2D\,\min(t_1, t_2)$
D : diffusion constant; y(t) : no correlation; x(t) : cumulative process.
In particular $\langle x(t)^2\rangle = 2Dt$ grows linearly in time: the Wiener-Levy process is statistically non-stationary.
physical system            | y(t) (no correlation) | x(t) (cumulative process)
laser                      | frequency $\omega(t)$ | phase $\phi(t)$
Brownian particle          | velocity $v(t)$       | position $x(t)$
current-carrying resistor  | current $i(t)$        | charge $q(t)$
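A direct check of the cumulative process (an added sketch; D, the step size, and the number of realizations are arbitrary): integrating white noise gives a variance growing as 2Dt.

```python
import numpy as np

rng = np.random.default_rng(10)
D, dt, nstep, trials = 0.5, 1e-3, 2000, 5000

# x(t) = integral of white noise: each step adds an increment of variance 2*D*dt.
dx = rng.standard_normal((trials, nstep)) * np.sqrt(2 * D * dt)
x = np.cumsum(dx, axis=1)

t = dt * np.arange(1, nstep + 1)
for i in (199, 999, 1999):
    print(t[i], x[:, i].var(), 2 * D * t[i])   # <x(t)^2> = 2 D t, non-stationary
```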
[Figure: The autocorrelation function and unilateral power spectrum of a stationary noisy waveform.]

[Figure: The autocorrelation function and unilateral power spectrum of a nonstationary noisy waveform y(t).]
1.5.4 Cross-correlation
x(t), y(t): statistically stationary processes
cross-correlation function:
$\phi_{xy}(\tau) = \langle x(t)\,y(t+\tau)\rangle$
cross-spectral density:
$S_{xy}(\omega) \equiv \lim_{T\to\infty}\frac{\langle X_T(i\omega)^*\,Y_T(i\omega)\rangle}{T}$ : a c-number (it carries both the amplitude and the phase)
The Parseval theorem applied to the gated functions gives, as before,
$S_{xy}(\omega) = \int_{-\infty}^{\infty}\phi_{xy}(\tau)\,e^{-i\omega\tau}\,d\tau$ : generalized Wiener-Khintchine theorem
coherence function:
$C(\omega) = \frac{S_{xy}(\omega)}{\sqrt{S_x(\omega)\,S_y(\omega)}}$
$C(\omega) = 1$ : complete positive correlation
$C(\omega) = -1$ : complete negative correlation
$C(\omega) = 0$ : no correlation
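An added numerical illustration of the coherence function (the signal construction is an arbitrary test case): two waveforms sharing an anti-correlated common component give $C(\omega) \approx -0.8$ at all frequencies here.

```python
import numpy as np

rng = np.random.default_rng(11)
N, nseg = 4096, 200

Sxy = np.zeros(N // 2 + 1, dtype=complex)
Sx = np.zeros(N // 2 + 1)
Sy = np.zeros(N // 2 + 1)
for _ in range(nseg):
    common = rng.standard_normal(N)              # shared component
    x = common + 0.5 * rng.standard_normal(N)
    y = -common + 0.5 * rng.standard_normal(N)   # anti-correlated with x
    X, Y = np.fft.rfft(x), np.fft.rfft(y)
    Sxy += np.conj(X) * Y
    Sx += np.abs(X) ** 2
    Sy += np.abs(Y) ** 2

C = Sxy / np.sqrt(Sx * Sy)                 # coherence function
print(C[1:5].real)                         # ~ -1/1.25 = -0.8 at all frequencies
```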
1.6 Random pulse train

1.6.1 Carson theorem
[Figure: a random pulse train — pulses of amplitudes $a_1, a_2, a_3, \ldots, a_k$ emitted at random times $t_1, t_2, t_3, \ldots, t_k$ along the time axis t.]

random pulse train:
$x(t) = \sum_{k=1}^{K}a_k\,f(t-t_k)$
where the amplitudes $a_k$, the emission times $t_k$, and the number K of pulses in [0, T] are random variables, and f(t) is the normalized pulse shape.
Fourier transform:
$X_T(i\omega) = \sum_k a_k\,F(i\omega)\,e^{-i\omega t_k}$
power spectral density (unilateral):
$S_x(\omega) = \lim_{T\to\infty}\frac{2\langle|X_T(i\omega)|^2\rangle}{T} = \lim_{T\to\infty}\frac{2|F(i\omega)|^2}{T}\Big\langle\sum_k\sum_m a_k a_m\,e^{i\omega(t_m-t_k)}\Big\rangle$
i) k = m terms:
$\frac{2|F(i\omega)|^2}{T}\,\langle K\rangle\langle a^2\rangle = 2\nu\,\langle a^2\rangle\,|F(i\omega)|^2$
$\nu = \langle K\rangle/T$ : average rate of pulse emission
$\langle a^2\rangle$ : mean-square of the pulse amplitude
ii) k ≠ m terms: if the pulse emission times form a Poisson point process and the pulse amplitudes are completely independent, each $t_k$ is uniformly distributed in [0, T], so $\langle e^{i\omega t_m}\rangle = \frac{1}{T}\int_0^T e^{i\omega t_m}\,dt_m$ vanishes for $\omega \neq 0$ as $T \to \infty$ and contributes only a DC term:
$4\pi\,\nu^2\langle a\rangle^2\,|F(0)|^2\,\delta(\omega)$
$\langle a\rangle$ : mean of the pulse amplitude

Carson theorem:
$S_x(\omega) = 2\nu\,\langle a^2\rangle\,|F(i\omega)|^2 + 4\pi\,\nu^2\langle a\rangle^2\,|F(0)|^2\,\delta(\omega)$
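A simulation of Carson's theorem (an added sketch; the exponential pulse shape $f(t) = e^{-t/\tau_p}$, the exponential amplitude distribution, and all parameter values are arbitrary choices). The DC delta term is excluded by comparing only at nonzero frequencies:

```python
import numpy as np

rng = np.random.default_rng(12)
dt, T, nu, tau_p = 1e-4, 10.0, 200.0, 2e-3  # sampling, record, rate, pulse decay
N = int(T / dt)
omega = 2 * np.pi * np.fft.rfftfreq(N, dt)
Fw = tau_p / (1 + 1j * omega * tau_p)       # F(iw) of f(t) = exp(-t/tau_p)

S = np.zeros(N // 2 + 1)
nseg = 50
for _ in range(nseg):
    K = rng.poisson(nu * T)                 # Poisson number of pulses in [0, T)
    tk = rng.uniform(0, T, K)               # uniformly distributed emission times
    ak = rng.exponential(1.0, K)            # independent pulse amplitudes
    imp = np.zeros(N)
    np.add.at(imp, (tk / dt).astype(int), ak / dt)   # impulse train
    X = np.fft.rfft(imp) * dt * Fw          # X_T(iw) = sum_k a_k F(iw) e^{-iw t_k}
    S += 2 * np.abs(X) ** 2 / (N * dt)      # unilateral <|X_T|^2>/T
S /= nseg

a2 = 2.0                                    # <a^2> for exponential(1) amplitudes
carson = 2 * nu * a2 * np.abs(Fw) ** 2      # AC part of Carson theorem
print(S[10:14])
print(carson[10:14])
```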
1.6.2 Campbell’s theorem
By the Wiener-Khintchine theorem, the inverse Fourier transform of the Carson power spectral density gives the autocorrelation function; the Parseval (energy) theorem converts the frequency integral of $|F(i\omega)|^2$ into a time integral of $f(t)^2$:
$\langle\Delta x^2\rangle = \frac{1}{2\pi}\int_0^{\infty}2\nu\langle a^2\rangle|F(i\omega)|^2\,d\omega = \nu\,\langle a^2\rangle\int_{-\infty}^{\infty}f(t)^2\,dt$ : Campbell’s theorem of mean-square
$\langle x\rangle = \nu\,\langle a\rangle\int_{-\infty}^{\infty}f(t)\,dt$ : Campbell’s theorem of mean
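Both Campbell theorems are easy to verify on one long realization (an added sketch; the same kind of exponential pulse train as above, with $\int f\,dt = \tau_p$ and $\int f^2\,dt = \tau_p/2$):

```python
import numpy as np

rng = np.random.default_rng(13)
dt, T, nu, tau_p = 1e-4, 20.0, 300.0, 5e-3
N = int(T / dt)
t = np.arange(N) * dt

# One long realization of the random pulse train (circular convolution
# via FFT; the wrap-around is statistically harmless here).
K = rng.poisson(nu * T)
tk = rng.uniform(0, T, K)
ak = rng.exponential(1.0, K)               # <a> = 1, <a^2> = 2
imp = np.zeros(N)
np.add.at(imp, (tk / dt).astype(int), ak / dt)
f = np.exp(-t / tau_p)                     # pulse shape
x = np.fft.irfft(np.fft.rfft(imp) * np.fft.rfft(f) * dt, N)

print(x.mean(), nu * 1.0 * tau_p)          # Campbell mean: nu <a> (integral of f)
print(x.var(), nu * 2.0 * tau_p / 2)       # Campbell mean-square: nu <a^2> (integral of f^2)
```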
1.6.3 Shot noise in a vacuum diode
[Figure: a vacuum diode biased at a voltage V. An electron of charge $-q$ in transit induces surface charges $Q_C = -CV$ on the cathode and $Q_A = CV$ on the anode; the external circuit carries the current i(t).]
V When an electron is thermionically emitted, this event creates an additional surface charge of +q on the cathode. This surface charge shields the electric field created by the electron and realizes charge neutrality inside the cathode conductor.
As the electron travels from the cathode to the anode, the surface charge on the cathode decreases and the surface charge on the anode increases. This change in the surface charge is achieved by an external circuit current.
Ramo theorem: if the external circuit has a negligible resistance, the voltage between the two electrodes is kept constant (constant voltage operation). Equating the energy gain of the electron per unit time, $q(V/d)\,v(t)$, to the energy supplied through the circuit relaxation current, $V\,i(t)$:
$i(t) = \frac{q\,v(t)}{d}$
where v(t) is the electron velocity and d the electrode spacing. The current pulse lasts for the transit time and carries the total charge $\int i(t)\,dt = q$.
If each electron emission is independent, such a memoryless system obeys Carson’s theorem. If the electron transit time is much shorter than any relevant time constants, we can assume the relaxation current pulse is an impulse with a constant area q: $a = q$, $f(t) = \delta(t)$, $|F(i\omega)|^2 = 1$ at all frequencies (formally infinite total noise power). Carson’s theorem then gives
$S_i(\omega) = 2\nu\langle a^2\rangle = 2\nu q^2 = 2q\,\bar{i}$ : white
: Schottky formula of shot noise
(using $\bar{i} = \nu q$ for the average current)
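For orientation (an added numerical example), the magnitude of the Schottky formula for a 1 mA average current:

```python
# Schottky formula S_i = 2 q I: numerical magnitude for a 1 mA average current.
q = 1.602e-19                              # electron charge [C]
I = 1e-3                                   # average current [A]
S_i = 2 * q * I                            # white shot-noise density [A^2/Hz]
print(S_i)                                 # ~3.2e-22 A^2/Hz
# rms current noise in a 1 MHz bandwidth:
print((S_i * 1e6) ** 0.5)                  # ~1.8e-8 A
```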
If the electron transit time is not negligible, the Fourier transform of i(t) provides the information about the cut-off of the shot noise component: the total noise power becomes finite. If the external circuit has a finite resistance, and thus a finite circuit relaxation time, the voltage between the two electrodes is no longer constant. However, if the average inter-emission time of electrons is much longer than the circuit relaxation time, each electron emission process is still considered an independent process (constant voltage = memoryless).
[Figure: current pulses i(t) vs. t, and the corresponding electrode voltage V(t) relaxing with the circuit time constant.]
If the average inter-emission time of electrons becomes shorter than the circuit relaxation time, the electron emission process becomes a self-regulated, sub-Poissonian process (constant current operation).
ensemble averaged autocorrelation:
$\langle i(t)\,i(t+\tau)\rangle = \langle\Delta i(t)\,\Delta i(t+\tau)\rangle + \langle i(t)\rangle^2$
with the mean and mean-square given by Campbell’s theorem. The Wiener-Khintchine theorem then gives
$S_i(\omega) = 2q\,\langle i(t)\rangle + 4\pi\,\langle i(t)\rangle^2\,\delta(\omega)$
The first term is the full shot noise; the second is the DC term at $\omega = 0$. For a finite transit time, the shot noise component is cut off above a cut-off frequency of the order of the inverse transit time.
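An added closing sketch (all parameter values are arbitrary): Poisson-emitted charges q filtered by a rectangular transit pulse of duration $\tau_t$ reproduce the full shot noise $2q\langle i\rangle$ at low frequencies and the cut-off above $\sim 1/\tau_t$.

```python
import numpy as np

rng = np.random.default_rng(14)
q = 1.602e-19                              # electron charge [C]
dt, T = 1e-10, 2e-6                        # 0.1 ns sampling, 2 us record
tau_t = 1e-9                               # assumed electron transit time [s]
I = 1e-9                                   # average current, 1 nA
nu = I / q                                 # average emission rate
N = int(T / dt)

omega = 2 * np.pi * np.fft.rfftfreq(N, dt)
# |F(iw)|^2 of a unit-area rectangular pulse of duration tau_t:
F2 = np.sinc(omega * tau_t / (2 * np.pi)) ** 2

S = np.zeros(N // 2 + 1)
nseg = 50
for _ in range(nseg):
    K = rng.poisson(nu * T)                # Poisson number of emissions in [0, T)
    tk = rng.uniform(0, T, K)              # independent emission times
    imp = np.zeros(N)
    np.add.at(imp, (tk / dt).astype(int), q / dt)   # impulses of area q
    X = np.fft.rfft(imp) * dt * np.sqrt(F2)         # apply the pulse shape
    S += 2 * np.abs(X) ** 2 / (N * dt)              # unilateral spectrum
S /= nseg

print(S[5], 2 * q * I)                     # low frequency: full shot noise 2qI
j = np.searchsorted(omega, np.pi / tau_t)  # half-way to the first sinc zero
print(S[j], 2 * q * I * F2[j])             # suppressed above the cut-off
```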