Lecture 1
BASICS OF COMMUNICATION SYSTEMS
Introduction • Electronic Communication – The transmission, reception, and processing of information with the use of electronic circuits
• Information – Knowledge or intelligence that is communicated (i.e., transmitted or received) between two or more points
Introduction • Digital Modulation – The transmittal of digitally modulated analog signals (carriers) between two or more points in a communications system – Sometimes referred to as digital radio because digitally modulated signals can be propagated through Earth’s atmosphere and used in wireless communications systems
Introduction • Digital Communications – Include systems in which relatively high-frequency analog carriers are modulated by relatively low-frequency digital signals (digital radio) and systems involving the transmission of digital pulses (digital transmission)
Introduction • Common digital modulation techniques: ASK (amplitude-shift keying), FSK (frequency-shift keying), PSK (phase-shift keying), and QAM (quadrature amplitude modulation)
Applications
1. Relatively low-speed voice-band data communications modems, such as those found in most personal computers
2. High-speed data transmission systems, such as broadband digital subscriber lines (DSL)
3. Digital microwave and satellite communications systems
4. Cellular telephone Personal Communications Systems (PCS)
Basic Telecommunication System
[Figure: Source → Transducer → Transmission Medium (attenuation) → Transducer → Sink]
In an electrical communication system, at the transmitting side, a transducer converts the real-life information into an electrical signal. At the receiving side, a transducer converts the electrical signal back into real-life information.
Basic Telecommunication System
[Figure: Source → Transducer → Transmission Medium (noise introduced) → Transducer → Sink]
Note: As the electrical signal passes through the transmission medium, the signal gets attenuated. In addition, the transmission medium introduces noise and, as a result, the signal gets distorted.
Basic Telecommunication System
The objective of designing a communication system is to reproduce the electrical signal at the receiving end with minimal distortion.
Basic Telecommunication System
[Figure: Two computers connected directly through their RS 232 ports; the copper cable between them is the channel]
Note: The serial ports of two computers can be connected directly using a copper cable. However, due to the signal attenuation, the distance cannot be more than 100 meters.
Basic Telecommunication System
Two computers can communicate with each other through the telephone network, using a modem at each end. The modem converts the digital signals generated by the computer into analog form for transmission over the medium at the transmitting end and the reverse at the receiving end.
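To make the modem's digital-to-analog conversion concrete, here is a minimal sketch in Python of binary FSK, one of the modulation techniques listed earlier: each bit is sent as a short burst of one of two tones. The bit pattern, tone frequencies, bit rate, and sample rate are illustrative values chosen for the example, not parameters of any particular modem standard.

    import numpy as np

    def fsk_modulate(bits, f0=1200.0, f1=2200.0, bit_rate=300.0, fs=48000.0):
        """Map each bit to a burst of a low (f0) or high (f1) tone."""
        samples_per_bit = int(fs / bit_rate)
        t = np.arange(samples_per_bit) / fs
        bursts = [np.sin(2 * np.pi * (f1 if b else f0) * t) for b in bits]
        return np.concatenate(bursts)

    waveform = fsk_modulate([1, 0, 1, 1, 0])   # analog-like waveform ready for the channel

The receiving modem performs the reverse operation, deciding for each burst which of the two tones is present and outputting the corresponding bit.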
Basic Telecommunication System
[Figure: (a) Transmitting side: Source → Baseband Signal Processing → Medium Access Processing → Transmitter → Medium; (b) Receiving side: Medium → Receiver → Decoding of Data → Baseband Signal Processing → Sink]
Basic Telecommunication System In the case of a radio communication system for broadcasting audio programs, the electrical signal is transformed into a high-frequency signal and sent through the air (free space). A radio transmitter is used to do this. A reverse of this transformation – converting the high-frequency signal back into an audio signal – is performed at the receiving station. Since it is a broadcasting system, many receivers receive the information.
Basic Telecommunication System • In a mobile communication system, a radio channel has to be shared by a number of users. Each user gets the radio channel for a short time, during which it transmits its data, and then waits for its next turn. This mechanism of sharing the channel is known as multiple access.
Basic Telecommunication System Depending on the type of communication, the distance to be covered, etc., a communication system will consist of a number of elements, each element carrying out a specific function. Some important elements are:
1. Multiplexer
2. Multiple access
3. Error detection and correction
4. Source coding
5. Signaling
Basic Telecommunication System Note: Two voice signals cannot be mixed directly because it would not be possible to separate them at the receiving end. Instead, the two voice signals can be shifted to different frequencies, combined, and sent together over the medium, as sketched below.
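A minimal Python sketch of this idea (frequency-division multiplexing): two baseband tones stand in for the voice signals, each is shifted to its own carrier frequency, and the results are summed onto one medium. The sample rate, tone frequencies, and carrier frequencies are illustrative values, not taken from the lecture.

    import numpy as np

    fs = 100_000.0                          # sample rate (Hz), chosen for the example
    t = np.arange(0, 0.01, 1 / fs)

    voice1 = np.sin(2 * np.pi * 500 * t)    # first "voice" signal (500 Hz tone)
    voice2 = np.sin(2 * np.pi * 800 * t)    # second "voice" signal (800 Hz tone)

    # Shift each signal to a different carrier frequency before combining them
    combined = (voice1 * np.cos(2 * np.pi * 10_000 * t)
                + voice2 * np.cos(2 * np.pi * 20_000 * t))

At the receiving end, band-pass filtering around 10 kHz or 20 kHz separates the two signals again, which is exactly why they must occupy different frequencies before being mixed.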
Types of Communication
1. Point-to-point communication
2. Point-to-multipoint communication
3. Broadcasting
4. Simplex communication
5. Half-duplex communication
6. Full-duplex communication
Transmission Impairments
1. Attenuation – The amplitude of the signal wave decreases as the signal travels through the medium.
2. Delay distortion – Occurs as a result of different frequency components arriving at different times in the guided media such as copper wire or coaxial cable.
3. Noise – Thermal noise, intermodulation noise, crosstalk, impulse noise.
Transmission Impairments
• Thermal Noise – occurs due to the thermal agitation of electrons in a conductor (also called white noise). The noise power is N = kTB, where k is Boltzmann’s constant, T is the temperature in kelvin, and B is the bandwidth in Hz. • Intermodulation Noise – When two signals of different frequencies are sent through the medium, due to nonlinearity of the transmitters, frequency components such as f1 + f2 and f1 – f2 are produced; these are unwanted components and need to be filtered out.
Transmission Impairments • Crosstalk – Unwanted coupling between signal paths • Impulse Noise – occurs due to external electromagnetic disturbances such as lightning. This also causes bursts of errors.
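As a worked example of the thermal-noise formula N = kTB above, the short computation below estimates the noise power for an assumed room temperature of 290 K and a 3100 Hz voice-band channel; both values are illustrative assumptions, not figures from the lecture.

    import math

    k = 1.38e-23          # Boltzmann's constant (J/K)
    T = 290               # assumed temperature (K)
    B = 3100              # assumed bandwidth (Hz)

    N = k * T * B                          # thermal noise power in watts
    N_dBm = 10 * math.log10(N / 1e-3)      # same value expressed in dBm
    print(f"N = {N:.3e} W  ({N_dBm:.1f} dBm)")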
Analog Versus Digital Transmission

Analog Communication:
• The signal, whose amplitude varies continuously, is transmitted over the medium.
• Reproducing the analog signal at the receiving end is very difficult due to transmission impairments.

Digital Communication:
• 1s and 0s are transmitted as voltage pulses, so even if a pulse is distorted due to noise, it is not very difficult to detect the pulses at the receiving end.
• Much more immune to noise.
Advantages of Digital Transmission
• More reliable transmission – because only discrimination between ones and zeros is required
• Less costly implementation – because of the advances in digital logic chips
• Ease of combining various types of signals (voice, video, etc.)
• Ease of developing secure communication systems
Questions:
1. What are the advantages of digital communication over analog communication?
2. Explain the different types of communication systems.
3. What are the different types of transmission impairments?
4. What is multiplexing?
5. What is signaling?
Lecture 2
INFORMATION THEORY
Claude Shannon laid the foundation of information theory in 1948. His paper “A Mathematical Theory of Communication,” published in the Bell System Technical Journal, is the basis for the telecommunications developments that have taken place during the last five decades. A good understanding of the concepts proposed by Shannon is a must for every budding telecommunication professional.
Requirements of a Communication System The requirement of a communication system is to transmit the information from the source to the sink without errors, in spite of the fact that noise is always introduced in the communication medium.
The Communication System
[Figure: Generic communication system – Information Source → Transmitter → Channel (with Noise Source) → Receiver → Information Sink]
Symbols produced:     A B B A A A B A B A
Bit stream produced:  1 0 0 1 1 1 0 1 0 1
Bit stream received:  1 0 0 1 1 1 1 1 0 1
In a digital communication system, due to the effect of noise, errors are introduced. As a result, 1 may become a 0 and 0 may become a 1.
[Figure: Generic communication system as proposed by Shannon – Information Source → Source Encoder → Channel Encoder → Modulator (modulating signal → modulated signal) → Channel → Demodulator → Channel Decoder → Source Decoder → Information Sink]
Explanation of Each Block
• Information Source: produces the symbols
• Source Encoder: converts the signal produced by the information source into a data stream
• Channel Encoder: adds redundant bits to the source-encoded data
• Modulator: transforms the bit stream into a signal suitable for transmission over the medium
• Demodulator: performs the inverse operation of the modulator
• Channel Decoder: analyzes the received bit stream and detects and corrects errors
• Source Decoder: converts the bit stream back into the actual information
• Information Sink: absorbs the information
Types of Source Encoding • Source encoding is done to reduce the redundancy in the signal.
1. Lossless coding
2. Lossy coding
The compression utilities we use to compress data files use lossless encoding techniques. JPEG image compression is a lossy technique because some information is lost.
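To illustrate lossless coding, the snippet below compresses a highly redundant byte string with Python's standard zlib module and checks that decompression restores the data exactly. This is only an illustration of the lossless property, not the algorithm used by any particular utility mentioned above.

    import zlib

    data = b"AAAAABBBAAAAABBBAAAAABBB" * 100          # highly redundant data
    compressed = zlib.compress(data)

    print(len(data), "->", len(compressed), "bytes")  # far fewer bytes after compression
    assert zlib.decompress(compressed) == data        # lossless: original recovered exactly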
Channel Encoding • Redundancy is introduced so that at the receiving end, the redundant bits can be used for error detection or error correction
Entropy of an Information Source
What is information? How do we measure information?
Entropy of an Information Source H = log2 N bits/symbol, where N is the number of symbols produced by the source, all with equal probability.
Entropy of an Information Source Example: Assume that a source produces the English letters (from A to Z, including space), and all these symbols will be produced with equal probability. Determine the entropy. Ans. H = 4.75 bits/symbol
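A one-line check of this result, assuming 27 equally probable symbols (A to Z plus space):

    import math
    H = math.log2(27)                    # entropy for 27 equally likely symbols
    print(round(H, 2), "bits/symbol")    # 4.75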
Entropy of an Information Source
If a source produces the i-th symbol with a probability of P(i), the entropy is H = −Σ P(i) log2 P(i) bits/symbol, with the sum taken over all symbols.
Entropy of an Information Source • Example: Consider a source that produces four symbols with probabilities of ½, ¼, 1/8, and 1/8, and all symbols are independent of each other. Determine the entropy. Ans. 7/4 bits/symbol
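The same calculation for a source with unequal probabilities can be sketched as a small Python function; applied to the probabilities above it reproduces the 7/4 bits/symbol answer.

    import math

    def entropy(probabilities):
        """H = -sum(p * log2(p)), in bits per symbol."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits/symbol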
Channel Capacity • The limit at which data can be transmitted through a medium: C = W log2(1 + S/N)
Where:
C = channel capacity (bps)
W = bandwidth of the channel (Hz)
S/N = signal-to-noise ratio as a power ratio; the SNR in dB is 10 log10(S/N)
Channel Capacity • Example: Consider a voice-grade line for which W = 3100 Hz, SNR = 30 dB. Determine the channel capacity.
Ans: 30.898 kbps
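A quick verification of this example, converting the SNR from dB to a linear ratio before applying C = W log2(1 + S/N):

    import math

    W = 3100                        # bandwidth in Hz
    snr_db = 30                     # signal-to-noise ratio in dB
    snr = 10 ** (snr_db / 10)       # linear power ratio (= 1000)

    C = W * math.log2(1 + snr)      # channel capacity in bits per second
    print(round(C / 1000, 3), "kbps")   # ~30.898 kbps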
Question
1. To increase C, can we increase W?
2. To increase C, can we increase SNR?
Ans.
1. No, because increasing W increases the noise power (N = kTB) as well, so the SNR is reduced.
2. No, because increasing the signal power results in more noise, called intermodulation noise.
Shannon’s Theorems • In a digital communication system, the aim of the designer is to convert any information into a digital signal, pass it through the transmission medium and, at the receiving end, reproduce the digital signal exactly.
Shannon’s Theorems • Requirements:
• To code any type of information into digital format
• To ensure that the data sent over the channel is not corrupted
Source Coding Theorem • States that “the number of bits required to uniquely describe an information source can be approximated to the information content as closely as desired.”
Example: Consider a source that produces two symbols A and B with equal probability.

Symbol   Probability   Code Word
A        0.5           1
B        0.5           0
Now, consider a source that produces these same two symbols. But instead of coding A and B directly, we can code the pairs AA, AB, BA, BB.

Symbol   Probability   Code Word
AA       0.45          0
AB       0.45          10
BA       0.05          110
BB       0.05          111
• NOTE: Assigning short code words to high-probability symbols and long code words to low-probability symbols results in efficient coding.
For example, the sequence AABABAABBB is split into the pairs AA BA BA AB BB and encoded as 0 110 110 10 111.
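A minimal sketch of how such a variable-length prefix code can be encoded and decoded, using the code table above (the dictionary and helper names are only for illustration):

    # Code table from the example: short words for likely pairs, long words for unlikely ones
    code = {"AA": "0", "AB": "10", "BA": "110", "BB": "111"}
    decode_table = {word: pair for pair, word in code.items()}

    def encode(symbols):
        """Encode a string of A/B symbols two at a time."""
        pairs = [symbols[i:i + 2] for i in range(0, len(symbols), 2)]
        return "".join(code[p] for p in pairs)

    def decode(bits):
        """Walk the bit string, emitting a pair whenever a code word is matched (prefix code)."""
        out, word = [], ""
        for bit in bits:
            word += bit
            if word in decode_table:
                out.append(decode_table[word])
                word = ""
        return "".join(out)

    encoded = encode("AABABAABBB")          # -> "011011010111"
    assert decode(encoded) == "AABABAABBB"  # prefix property makes decoding unambiguous

With the probabilities in the table, the average code length is 0.45×1 + 0.45×2 + 0.05×3 + 0.05×3 = 1.65 bits per pair, i.e. 0.825 bits per original symbol, which is less than the 1 bit per symbol of the direct code.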
Channel Coding Theorem • States that “the error rate of data transmitted over a bandwidth-limited noisy channel can be reduced to an arbitrarily small amount if the information rate is lower than the channel capacity.”
Example: Consider the example of a source producing the symbols A and B. A is coded as 1 and B as 0.
Symbols produced:  A B B A B
Bit stream:        1 0 0 1 0
Transmitted: 111 000 000 111 000
Received:    101 000 010 111 000
Each bit is transmitted three times; even though some of the repeated bits are corrupted in the channel, the original bit stream 1 0 0 1 0 can still be recovered at the receiver.
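The example above is a rate-1/3 repetition code: every bit is sent three times and the receiver takes a majority vote over each group of three. A minimal sketch (the function names are only for illustration):

    def channel_encode(bits):
        """Repeat each bit three times before transmission."""
        return [b for b in bits for _ in range(3)]

    def channel_decode(received):
        """Majority vote over each group of three received bits."""
        groups = [received[i:i + 3] for i in range(0, len(received), 3)]
        return [1 if sum(g) >= 2 else 0 for g in groups]

    transmitted = channel_encode([1, 0, 0, 1, 0])                     # 111 000 000 111 000
    received = [1, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1, 1, 0, 0, 0]          # two bits corrupted
    print(channel_decode(received))                                    # [1, 0, 0, 1, 0]

Practical systems use far more efficient error-correcting codes than simple repetition, but the principle is the same: the redundant bits added by the channel encoder let the receiver detect and correct errors.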
NOTE • Source coding is used mainly to reduce the redundancy in the signal, whereas channel coding is used to introduce redundancy to overcome the effect of noise.
Questions:
1. Draw the block diagram of a communication system and explain the function of each block.
2. What is entropy of an information source? Illustrate with examples.
3. What is source coding? What is the difference between lossless coding and lossy coding?
4. Explain the concept of channel capacity with an example.
5. What is channel coding? Explain the concept of error correcting codes.
Exercises
1. A source produces 42 symbols with equal probability. Calculate the entropy of the source.
2. A source produces two symbols A and B with probabilities of 0.6 and 0.4, respectively. Calculate the entropy of the source.
3. The ASCII code is used to represent characters in the computer. Is it an efficient coding technique from Shannon’s point of view? If not, why?
Answers
1. 5.39 bits/symbol
2. 0.971 bits/symbol
3. In ASCII, each character is represented by seven bits. The frequency of occurrence of the English letters is not taken into consideration at all. If the frequency of occurrence is taken into consideration, then the most frequently occurring letters have to be represented by short code words (such as 2 bits) and less frequently occurring letters have to be represented by long code words. According to Shannon’s theory, ASCII is not an efficient coding technique. However, note that if an efficient coding technique is followed, then a lot of additional processing is involved, which causes delay in decoding the text.