Coding Schemes, Adaptive Modulation/Coding, Hybrid ARQ

Published: 19-07-2017
Point-to-Point Wireless Communication (III): Coding Schemes, Adaptive Modulation/Coding, Hybrid ARQ/FEC
Shivkumar Kalyanaraman, Rensselaer Polytechnic Institute ("shiv rpi")

References

- David MacKay, Information Theory, Inference & Learning Algorithms. PDF available online: http://www.inference.phy.cam.ac.uk/mackay/itprnn/book.html
  - Chapter 1: Introduction to Information Theory
  - Skim Chapter 9: Communication over a Noisy Channel
  - Section 11.4: Capabilities of Practical Error-Correcting Codes
  - Skim Chapter 13: Binary Codes (ideas of "distance", "perfect"/MDS codes, concatenation)
  - Skim Chapter 47: LDPC Codes
  - Chapter 48: Convolutional & Turbo Codes
  - Optional browsing: Chapter 49: Digital Fountain (Erasure) Codes
- Article by Berlekamp, "Application of Error Control Coding to Communications" (especially the discussions of RS coding, concatenated codes, and hybrid ARQ/FEC strategies)

Context: Time Diversity

- Time diversity can be obtained by interleaving and coding over symbols across different coherence time periods.
- The channel offers time diversity/selectivity, but fading is correlated across successive symbols.
- (Repetition) coding without interleaving: a full codeword can be lost during a single fade, so coding alone is not sufficient.
- With interleaving of sufficient depth (on the order of the coherence time), at most one symbol of each codeword is lost.

What is Channel Coding?

- Transforming signals to improve communications performance by increasing robustness against channel impairments (noise, interference, fading, ...).
- It is a time-diversity technique, but can be thought of more broadly as a set of techniques for making better use of the degrees of freedom in channels (e.g., space-time codes).
- Waveform coding: transforming waveforms into better waveforms.
- Structured sequences: transforming data sequences into better sequences having structured redundancy.
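The interleaving argument above (at most one lost symbol per codeword) can be made concrete with a toy block interleaver. The depth of 4 and the symbol labels below are illustrative choices, not from the slides:

```python
# Depth-4 block interleaver (illustrative sketch): write codewords as rows,
# transmit column by column. A fade that wipes out a run of 4 consecutive
# transmitted symbols then touches each codeword at most once.
codewords = [list(cw) for cw in ("AAAA", "BBBB", "CCCC", "DDDD")]

# interleave: transmit columns instead of rows
tx = [codewords[r][c] for c in range(4) for r in range(4)]
print("".join(tx))                 # ABCDABCDABCDABCD

# a deep fade destroys 4 consecutive symbols on the channel
rx = tx[:4] + ["?"] * 4 + tx[8:]

# de-interleave back into codewords: symbol i came from row i % 4, column i // 4
rows = [["" for _ in range(4)] for _ in range(4)]
for i, sym in enumerate(rx):
    rows[i % 4][i // 4] = sym
print(rows)                        # each codeword has exactly one erased symbol
```

Without the interleaver the same 4-symbol burst would erase one entire codeword, which no amount of per-codeword coding could repair.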
- "Better" in the sense of making the decision process less subject to errors.
- Constraints are introduced on transmitted codewords so that there is greater "distance" between them.
- Note: channel coding was developed in the context of AWGN channels, and we shall study it in the same context.

Channel (Modified) Block Diagram

(Figure: transmit chain Format -> Source encode -> Channel encode -> Pulse modulate -> Bandpass modulate, i.e. digital modulation; receive chain Demodulate & Sample -> Detect -> Channel decode -> Source decode -> Format, i.e. digital demodulation.)

Channel Coding Schemes: Block, Convolutional, Turbo

Coding Gain: The Value of Coding

- Trade-offs: error performance vs. bandwidth, power vs. bandwidth, data rate vs. bandwidth, capacity vs. bandwidth.
- (Figure: BER vs. Eb/N0 curves for coded and uncoded systems.)
- Coding gain: for a given bit-error probability, the reduction in Eb/N0 that can be realized through use of the code:

  G (dB) = (Eb/N0)_u (dB) - (Eb/N0)_c (dB)

  where the subscripts u and c denote the uncoded and coded systems.

Coding Gain Potential

Gap from the Shannon limit at BER = 10^-5: 9.6 + 1.59 = 11.2 dB (about 7.8 dB if spectral efficiency is maintained).

The Ultimate Shannon Limit

- Goal: what is the minimum Eb/N0 for any spectral efficiency (rho -> 0)?
- Spectral efficiency rho = B/W = log2(1 + SNR), where B is the bit rate, W the bandwidth, and SNR = Es/N0 with Es the energy per symbol.
- Equivalently, SNR = 2^rho - 1.
- Eb/N0 = (Es/N0)(W/B) = SNR/rho = (2^rho - 1)/rho.
- As rho -> 0, Eb/N0 -> ln 2 = -1.59 dB. Let us try to appreciate what Shannon's bound means by designing some simple codes and comparing them to it.
- Fixing rho = 2 bits/s/Hz: Eb/N0 = (2^rho - 1)/rho = 3/2 = 1.76 dB.
- Gap to capacity at BER = 10^-5: 9.6 + 1.59 = 11.2 dB (without regard for spectral efficiency), or 9.6 - 1.76 = 7.84 dB (keeping spectral efficiency
constant).

Binary Symmetric Channel (BSC)

- Given a BER f, we can construct a BSC with crossover probability f.

Reliable Disk Drive Application

- Goal: build a disk drive that writes 1 GB/day for 10 years, with a desired BER of 10^-15.
- Physical solution: use more reliable components and reduce noise.
- System solution: accept the noisy channel and detect/correct errors, i.e. engineer reliability over unreliable channels.

Repetition Code (R3) & Majority-Vote Decoding

(Figure: R3 encoding and majority-vote decoding examples over the BSC and AWGN channels.)

Performance of R3

- The error probability is dominated by the probability that two bits in a block of three are flipped, which scales as f^2.
- For a BSC with f = 0.1, the R3 code has a decoded error probability of p_b = 0.03 per bit, or 3%.
- Rate penalty: we need 3 noisy disks just to get the loss probability down to 3%. To reach a BER of 10^-15, we would need 61 disks.

Coding: A Rate-BER Tradeoff?

- Starting from the repetition code R3, let us try to design a "better" code: the Hamming code.
- Shannon: the perception that there is a necessary tradeoff between rate and BER is illusory; no such tradeoff exists at rates up to a critical value, the channel capacity.
- You only need to design better codes to obtain the coding gain.

Hamming Code: A Linear Block Code

- A block code is a rule for converting a sequence of source bits s, of length K, say, into a transmitted sequence t of length N bits.
- In a linear block code, the extra N - K bits are linear functions of the original K bits; these extra bits are called parity-check bits.
- The (7, 4) Hamming code transmits N = 7 bits for every K = 4 source bits.
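Before working through the Hamming construction, the R3 figures quoted above are easy to check, both in closed form and by simulation. This is an illustrative sketch; the seed and trial count are arbitrary choices:

```python
import random

# R3 over a BSC with flip probability f: majority-vote decoding fails
# exactly when two or three of the three copies are flipped.
f = 0.1
pb_exact = 3 * f**2 * (1 - f) + f**3
print(pb_exact)                      # about 0.028, i.e. the ~3% quoted above

# quick Monte-Carlo confirmation
random.seed(0)
n = 100_000
wrong = sum(
    sum(random.random() < f for _ in range(3)) >= 2   # >= 2 of 3 copies flipped
    for _ in range(n)
)
print(wrong / n)                     # close to 0.028
```

The f^2 scaling is visible in the closed form: the dominant term is 3 f^2 (1 - f), the three ways of flipping exactly two of the three copies.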
- The first four transmitted bits, t1 t2 t3 t4, are set equal to the four source bits, s1 s2 s3 s4.
- The parity-check bits t5 t6 t7 are set so that the parity within each circle (see the figure below) is even.

Hamming Code (contd.)

(Figure: the three-circle parity-check picture for the (7, 4) Hamming code.)

Hamming Code: Syndrome Decoding

- If the channel is a BSC and all source vectors are equiprobable, then the optimal decoder identifies the source vector s whose encoding t(s) differs from the received vector r in the fewest bits.
- This is similar to the "closest-distance" decision rule seen in demodulation.
- Can we do it more efficiently? Yes: syndrome decoding.
- The decoding task is to find the smallest set of flipped bits that can account for the observed violations of the parity rules. The pattern of violations of the parity checks is called the syndrome; for example, the syndrome is z = (1, 1, 0) when the first two circles are 'unhappy' (parity 1) and the third circle is 'happy' (parity 0).

Syndrome Decoding (contd.)

- Can we find a unique bit that lies inside all the 'unhappy' circles and outside all the 'happy' circles?
- If so, flipping that bit accounts for the observed syndrome.

Hamming Code: Performance

- A decoding error occurs whenever the noise flips more than one bit in a block of seven.
- The error probability scales as O(f^2), as it did for the repetition code R3, but the Hamming code has a greater rate, R = 4/7.
- "Dilbert test": about 7% of the decoded bits are in error.
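The encoding, syndrome decoding, and the ~7% figure above can all be sketched in a few lines. The parity assignments below (t5 = s1^s2^s3, t6 = s2^s3^s4, t7 = s1^s3^s4) are MacKay's version of the three circles; the simulation parameters are illustrative:

```python
import random

# MacKay's (7,4) Hamming code: t1..t4 are the source bits; the parity bits
# make each of the three circles in the slide's picture have even parity.
def encode(s):
    s1, s2, s3, s4 = s
    return [s1, s2, s3, s4, s1 ^ s2 ^ s3, s2 ^ s3 ^ s4, s1 ^ s3 ^ s4]

def syndrome(r):
    # one parity check per circle: z_i = 1 means circle i is 'unhappy'
    return (r[0] ^ r[1] ^ r[2] ^ r[4],
            r[1] ^ r[2] ^ r[3] ^ r[5],
            r[0] ^ r[2] ^ r[3] ^ r[6])

# Each single-bit flip yields a distinct nonzero syndrome, so the syndrome
# identifies the unique bit inside all unhappy and outside all happy circles.
FLIP = {syndrome([int(i == j) for i in range(7)]): j for j in range(7)}

def decode(r):
    z = syndrome(r)
    if z != (0, 0, 0):
        r = r[:]
        r[FLIP[z]] ^= 1              # flip the one bit explaining the syndrome
    return r[:4]                     # source-bit estimate

# Monte-Carlo estimate of the decoded bit-error rate at f = 0.1
random.seed(1)
f, blocks, errs = 0.1, 50_000, 0
for _ in range(blocks):
    s = [random.randint(0, 1) for _ in range(4)]
    r = [t ^ (random.random() < f) for t in encode(s)]
    errs += sum(a != b for a, b in zip(decode(r), s))
print(errs / (4 * blocks))           # roughly 0.07, the ~7% quoted above
```

Any single flipped bit is corrected exactly; the residual errors come from blocks with two or more flips, where the decoder confidently "corrects" to the wrong codeword.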
- The residual errors are correlated: often two or three successive decoded bits are flipped.
- Generalizations of Hamming codes are called BCH codes.

Shannon's Legacy: The Rate-Reliability of Codes

- The noisy-channel coding theorem defines the achievable rate/reliability regions.
- Note: within the capacity region, you can make the BER as low as desired by designing an appropriate code.
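The numerical limits used throughout (-1.59 dB, 1.76 dB, and the capacity region of a BSC) can be reproduced in a few lines. The BSC capacity formula C = 1 - H2(f) is a standard result, not stated explicitly on the slides:

```python
import math

# Eb/N0 implied by spectral efficiency rho, from rho = log2(1 + SNR)
# and Eb/N0 = SNR / rho:
def ebno(rho):
    return (2 ** rho - 1) / rho

def db(x):
    return 10 * math.log10(x)

print(db(math.log(2)))    # rho -> 0 limit: about -1.59 dB, the ultimate Shannon limit
print(db(ebno(2)))        # rho = 2 bits/s/Hz: about 1.76 dB

# Binary entropy and BSC capacity: rates below C = 1 - H2(f) are achievable
# with BER as low as desired, per the noisy-channel coding theorem.
def H2(f):
    return -f * math.log2(f) - (1 - f) * math.log2(1 - f)

print(1 - H2(0.1))        # about 0.53 bits per channel use at f = 0.1
```

So at f = 0.1, both R3 (rate 1/3) and the (7, 4) Hamming code (rate 4/7) sit above what a vanishing-BER code could do only in their error floor, not their rate: the theorem promises codes with rate up to about 0.53 and arbitrarily small BER.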