The mathematics behind wireless communication
1 June 2008
2 Questions and setting
In wireless communication, information is sent through what is called a channel. The channel is subject to noise, so that there will be some loss of information. How should we send information so that as little information as possible is lost? How should we define the capacity of a channel? Can we find an expression for the capacity in terms of the characteristics of the channel?
3 What is information?
Assume that the random variable X takes values in the alphabet 𝒳 = {α_1, α_2, ...}. Set p_i = Pr(X = α_i). How can we define a measure H of how much choice/uncertainty/information is associated with each outcome? Shannon [1] proposed the following requirements for H:
1. H should be continuous in the p_i.
2. If all the p_i are equal (p_i = 1/n), then H should be an increasing function of n (with equally likely events there is more uncertainty when there are more possible events).
3. If a choice can be broken down into successive choices, the original H should be the weighted sum of the individual values of H: a choice between {α_1, α_2, α_3} can first be split into a choice between {α_1, {α_2, α_3}}, followed by a choice between {α_2, α_3}.
4 Entropy
Definition: The entropy of X is defined by
H(X) = H(p_1, p_2, ...) = -\sum_i p_i \log_2 p_i.
The entropy is measured in bits. Shannon showed that an information measure which satisfies the requirements of the previous foil necessarily has this form! If p_1 = 1/2, p_2 = 1/3, p_3 = 1/6, the weighting described on the previous foil can be verified as
H(1/2, 1/3, 1/6) = H(1/2, 1/2) + \frac{1}{2} H(2/3, 1/3),
where the weight 1/2 appearing on the right-hand side is computed as p_2 + p_3 = 1/2.
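To make the definition concrete, here is a minimal Python sketch (not part of the original foils) that computes entropies and checks the weighting example above numerically:

```python
# Compute the entropy of a discrete distribution and verify Shannon's
# grouping example H(1/2, 1/3, 1/6) = H(1/2, 1/2) + 1/2 * H(2/3, 1/3).
import math

def entropy(probs):
    """Entropy in bits, H = -sum p_i log2 p_i (terms with p_i = 0 contribute 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

lhs = entropy([1/2, 1/3, 1/6])
rhs = entropy([1/2, 1/2]) + 0.5 * entropy([2/3, 1/3])
print(lhs, rhs)   # both are approximately 1.459 bits
```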
5 Shannon's source coding theorem
We would like to represent data generated by the random variable X in a shorter way (i.e. compress it). Shannon's source coding theorem addresses the limits of such compression:
Theorem: Assume that we have independent outcomes of the random variable X (= x_1 x_2 x_3 ...). The average number of bits per symbol for any lossless compression strategy is always greater than or equal to the entropy H(X).
The entropy H is therefore a lower limit for achievable compression. The theoretical limit given by the entropy is also achievable. In a previous talk, I focused on methods for achieving the limit given by the entropy (Huffman coding, arithmetic coding).
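The following sketch (my own illustration, for an assumed four-symbol source, not from the talk) builds a Huffman code and compares its average codeword length with the entropy lower bound:

```python
# Build a binary Huffman code for a {symbol: probability} map and compare the
# average codeword length with the entropy H(X).
import heapq, math

def huffman_code(probs):
    """Return a dict symbol -> codeword (string of '0'/'1')."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)          # two least likely groups
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # assumed source
code = huffman_code(probs)
avg_len = sum(p * len(code[s]) for s, p in probs.items())
H = -sum(p * math.log2(p) for p in probs.values())
print(code, avg_len, H)   # avg_len = H = 1.75 bits here (dyadic probabilities)
```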
6 Sketch of Shannon's proof
There exists a subset A_ε^(n) of all length-n sequences (x_1, x_2, ..., x_n) such that
1. The size of A_ε^(n) is about 2^{nH(X)} (which can be small when compared to the number of all sequences).
2. Pr(A_ε^(n)) > 1 - ε.
A_ε^(n) is called the typical set, and consists of all (x_1, x_2, ..., x_n) with -\frac{1}{n} \log_2 p(x_1, x_2, ..., x_n) (the empirical entropy) close enough to the actual entropy H(X). Shannon proved the source coding theorem by
1. assigning codes with a (smaller) fixed length to ALL elements in the typical set,
2. assigning codes with another (longer) fixed length to ALL elements outside the typical set,
3. letting n → ∞ and ε → 0.
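A small sketch (assuming a Bernoulli source, not part of the original foils) illustrates how the empirical entropy concentrates around H(X), so that almost all probability mass lands in the typical set:

```python
# Estimate the probability of the typical set for an assumed Bernoulli(p) source:
# the empirical entropy -1/n log2 p(x_1,...,x_n) concentrates around H(X).
import math, random

p = 0.2                      # assumed Pr(X = 1)
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
n, eps, trials = 1000, 0.05, 2000

def empirical_entropy(x):
    k = sum(x)               # number of ones in the sequence
    return -(k * math.log2(p) + (len(x) - k) * math.log2(1 - p)) / len(x)

random.seed(0)
typical = 0
for _ in range(trials):
    x = [1 if random.random() < p else 0 for _ in range(n)]
    if abs(empirical_entropy(x) - H) < eps:
        typical += 1
print(H, typical / trials)   # the fraction of typical sequences approaches 1
```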
7 What is a communication channel?
That A communicates with B means that the physical acts of A induce a desired physical state in B. This transfer of information is subject to noise and the imperfections of the physical signaling process itself. The communication is successful if the receiver B and the transmitter A agree on what was sent.
Definition: A discrete channel, denoted (𝒳, p(y|x), 𝒴), consists of two finite sets 𝒳 (the input alphabet) and 𝒴 (the output alphabet), and a probability transition matrix p(y|x) that expresses the probability of observing the output symbol y given that we send the symbol x. The channel is said to be memoryless if the probability distribution of the output depends only on the input at that time, and is conditionally independent of previous channel inputs and outputs.
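As an assumed example (not from the talk), the binary symmetric channel (BSC) is a discrete memoryless channel with 𝒳 = 𝒴 = {0, 1}; a minimal Python sketch:

```python
# The binary symmetric channel as a transition matrix, plus a simple simulation.
import random

crossover = 0.1                          # assumed flip probability
# transition matrix: p_y_given_x[x][y] = Pr(Y = y | X = x)
p_y_given_x = [[1 - crossover, crossover],
               [crossover, 1 - crossover]]

def send(bits, p=crossover, rng=random.Random(0)):
    """Pass a bit sequence through the memoryless channel, one symbol at a time."""
    return [b ^ (rng.random() < p) for b in bits]

print(send([0, 1, 1, 0, 0, 1, 0, 1]))
```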
8 A general scheme for communication
W → Encoder → X^n → Channel p(y|x) → Y^n → Decoder → Ŵ
- W ∈ {1, 2, ..., M} is the message we seek to transfer via the channel.
- The encoder is a map X^n : {1, 2, ..., M} → 𝒳^n, taking values in a codebook from 𝒳^n of size M: (X^n(1), X^n(2), ..., X^n(M)).
- The decoder is a map Y^n : 𝒴^n → {1, 2, ..., M}. This is a deterministic rule that assigns a guess to each possible received vector.
- Ŵ ∈ {1, 2, ..., M} is the message retrieved by the decoder.
- n is the block length. It says how many times the channel is used for each transmission.
- M is the number of possible messages. A message can thus be represented with log_2(M) bits.
9 The encoder/decoder pair is called an (M, n)-code (i.e. a code with M possible messages and n uses of the channel per transmission). When the encoder maps the input to codewords in the data transmission process, it actually adds redundancy in a controlled fashion to combat errors in the channel. This is in contrast to data compression, where one goes the opposite way, i.e. removes redundancy in the data to form the most compressed representation possible. The basic question is how one can construct an encoder/decoder pair such that there is a high probability that the received message Ŵ equals the transmitted message W. A toy example is sketched below.
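Here is a minimal toy (M, n)-code of my own (not from the talk): a length-5 repetition code with M = 2 messages and n = 5 channel uses over a BSC like the one sketched above.

```python
# A length-5 repetition code: the encoder adds redundancy, the decoder votes.
import random

def encode(w):
    """Encoder: message w in {0, 1} -> codeword of n = 5 identical bits."""
    return [w] * 5

def decode(y):
    """Decoder: majority vote over the received vector."""
    return 1 if sum(y) > len(y) / 2 else 0

rng = random.Random(1)
crossover = 0.1
errors, trials = 0, 10000
for _ in range(trials):
    w = rng.randint(0, 1)
    y = [b ^ (rng.random() < crossover) for b in encode(w)]   # BSC
    errors += (decode(y) != w)
print(errors / trials)   # much smaller than 0.1, at the price of rate R = 1/5
```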
10 Definition: Let λ_W be the probability that the received message Ŵ is different from the sent message W. This is called the conditional probability of error given that W was sent. We also define the maximal probability of error as
λ^(n) = max_{W ∈ {1, 2, ..., M}} λ_W.
Definition: The rate of an (M, n)-code is defined as
R = log_2(M) / n,
measured in bits per transmission.
Definition: A rate R is said to be achievable if for each n there exists a (2^{nR}, n)-code such that lim_{n→∞} λ^(n) = 0 (i.e. the maximal probability of error goes to 0).
Definition: The (operational) capacity of a channel is the supremum of all achievable rates.
11 Shannon's channel coding theorem
Expresses the capacity in terms of the probability distribution of the channel, irrespective of the use of encoders/decoders.
Theorem: The capacity of a discrete memoryless channel is given by
C = max_{q(x)} I(X; Y),
where X/Y is the random input/output of the channel, with X having distribution q(x) on 𝒳. Here I(X; Y) is the mutual information between the random variables X and Y, defined by
I(X; Y) = \sum_{x,y} p(x, y) \log_2 \frac{p(x, y)}{p(x) p(y)},
where p(x, y) is the joint p.d.f. of X and Y.
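As an assumed illustration (not from the talk), the theorem can be evaluated numerically for the binary symmetric channel by brute force over the input distribution:

```python
# Compute C = max_q I(X;Y) for a BSC with crossover p, scanning q = Pr(X = 1).
import math

def mutual_information(q, p):
    """I(X;Y) for a BSC with crossover p and input distribution (1-q, q)."""
    px = [1 - q, q]
    pyx = [[1 - p, p], [p, 1 - p]]                 # Pr(Y = y | X = x)
    py = [sum(px[x] * pyx[x][y] for x in range(2)) for y in range(2)]
    I = 0.0
    for x in range(2):
        for y in range(2):
            pxy = px[x] * pyx[x][y]
            if pxy > 0:
                I += pxy * math.log2(pxy / (px[x] * py[y]))
    return I

p = 0.1
C = max(mutual_information(q / 1000, p) for q in range(1, 1000))
print(C)   # approximately 1 - H(p) = 0.531 bits, attained at the uniform input
```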
12 Sketch of proof I
We generalize the definition of the typical set (from the proof of the source coding theorem) to the following. The jointly typical set consists of all jointly typical sequences ((x^n), (y^n)) = ((x_1, x_2, ..., x_n), (y_1, y_2, ..., y_n)), defined as those sequences where
1. the empirical entropy of (x_1, x_2, ..., x_n) is close enough to the actual entropy H(X),
2. the empirical entropy of (y_1, y_2, ..., y_n) is close enough to the actual entropy H(Y),
3. the joint empirical entropy -\frac{1}{n} \log_2 \left( \prod_{i=1}^n p(x_i, y_i) \right) of ((x_1, x_2, ..., x_n), (y_1, y_2, ..., y_n)) is close enough to the actual joint entropy H(X, Y), defined by
H(X, Y) = -\sum_{x ∈ 𝒳} \sum_{y ∈ 𝒴} p(x, y) \log_2 p(x, y),
where p(x, y) is the joint distribution of X and Y.
13 Sketch of proof II
The jointly typical set is, just as the typical set, denoted A_ε^(n). It has the following properties, similar to the corresponding properties for the typical set:
1. The size of A_ε^(n) is approximately 2^{nH(X,Y)} (which is small when compared to the number of all sequences).
2. Pr(A_ε^(n)) → 1 as n → ∞.
14 Sketch of proof III
The channel coding theorem can be proved in the following way for a given rate R < C:
1. Construct a randomly generated codebook (dictated by some fixed distribution of the input) of size 2^{nR} from 𝒳^n. Define the encoder as any mapping from {1, ..., 2^{nR}} into this set.
2. Define the decoder in the following way:
   - if the output (y_1, y_2, ..., y_n) of the channel is jointly typical with a unique codeword (x_1, ..., x_n), define (x_1, ..., x_n) as the output of the decoder;
   - otherwise, the output of the decoder should be some dummy index, declaring an error.
3. One can show that, with high probability (going to 1 as n → ∞), the input to the channel (x_1, x_2, ..., x_n) is jointly typical with the output (y_1, y_2, ..., y_n). The expression for the mutual information enters the picture when computing the probability that the output is jointly typical with another codeword, which is approximately 2^{-nI(X;Y)}.
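A toy random-coding experiment (my own sketch, not from the talk) shows the flavour of the argument for the BSC; for simplicity it decodes by minimum Hamming distance rather than the joint-typicality rule used in the proof:

```python
# Random codebook over a BSC at a rate R below the capacity 1 - H(0.1) ≈ 0.531,
# decoded by minimum Hamming distance.
import random

rng = random.Random(0)
p = 0.1                                   # BSC crossover probability
n, R = 40, 0.25                           # block length, rate (R < C)
M = 2 ** int(n * R)                       # number of messages = 1024
codebook = [[rng.randint(0, 1) for _ in range(n)] for _ in range(M)]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def decode(y):
    """Guess the message whose codeword is closest to the received vector."""
    return min(range(M), key=lambda w: hamming(codebook[w], y))

errors, trials = 0, 200
for _ in range(trials):
    w = rng.randrange(M)
    y = [b ^ (rng.random() < p) for b in codebook[w]]   # send codeword through BSC
    errors += (decode(y) != w)
print(errors / trials)   # small, and it shrinks further as n grows with R fixed
```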
15 More general channels I
In general, channels do not use finite-alphabet inputs/outputs. The most important continuous-alphabet channel is the Gaussian channel. This is a time-discrete channel with output Y_i at time i given by
Y_i = X_i + Z_i,
where X_i is the input and Z_i ~ N(0, N) is noise (Gaussian, variance N).
Capacity can be defined in a similar fashion for such channels. The capacity can be infinite unless we restrict the input. The most common such restriction is a limitation on its variance. Assume that the variance of the input is less than P. One can then show that the capacity of the Gaussian channel is
\frac{1}{2} \log_2 \left( 1 + \frac{P}{N} \right),
and that the capacity is achieved when X ~ N(0, P).
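A one-line evaluation of this formula (my own sketch, with assumed signal-to-noise ratios):

```python
# Gaussian channel capacity C = (1/2) log2(1 + P/N) in bits per channel use.
import math

def gaussian_capacity(P, N):
    """Capacity of the discrete-time Gaussian channel with power limit P, noise variance N."""
    return 0.5 * math.log2(1 + P / N)

for snr_db in (0, 10, 20, 30):
    snr = 10 ** (snr_db / 10)            # P/N on a linear scale
    print(snr_db, "dB:", round(gaussian_capacity(snr, 1.0), 3), "bits/use")
```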
16 More general channels II
In general, communication systems consist of multiple transmitters and receivers, talking to and interfering with each other. Such communication systems are described by a channel matrix, whose dimensions match the number of transmitters and receivers. Its entries are a function of the geometry of the transmitting and receiving antennas. Capacity can be described in a meaningful way for such systems also. It turns out that, for a wide class of channels, the capacity is given by
C = \frac{1}{n} \log_2 \det \left( I_n + \frac{\rho}{m} H H^H \right),
where H is the n × m channel matrix, n and m are the numbers of receiving and transmitting antennas, and ρ is the signal-to-noise ratio P/N (as for the Gaussian channel).
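The log-det expression is easy to evaluate numerically; the following sketch (my own, with an assumed i.i.d. complex Gaussian channel matrix) computes it for one channel realization:

```python
# Evaluate the log-det capacity expression for an assumed random MIMO channel.
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 4                               # receive / transmit antennas (assumed)
rho = 100.0                               # signal-to-noise ratio P/N (20 dB)
# assumed i.i.d. complex Gaussian ("Rayleigh fading") channel entries
H = (rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))) / np.sqrt(2)

C = np.log2(np.linalg.det(np.eye(n) + (rho / m) * H @ H.conj().T)).real / n
print(C)   # bits per channel use (normalized by n), for this channel realization
```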
17 Active areas of research and open problems
How do we construct codebooks which help us achieve rates close to the capacity? In other words, how can we find the input distribution p(x) which maximizes I(X; Y) (the mutual information between the input and the output)? Such codes should also be implementable. Much progress has been made in recent years: convolutional codes, Turbo codes, LDPC (Low-Density Parity-Check) codes.
Error-correcting codes: these codes are able to detect where bit errors have occurred in the received data. Hamming codes are one example (a small sketch follows below).
What is the capacity in more general systems? One has to account for any number of receivers/transmitters, any type of interference, cooperation and feedback between the sending and receiving antennas. The general case is far from being solved.
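A small sketch (standard (7,4) Hamming code, not taken from the talk) showing how parity checks locate and correct a single bit error:

```python
# Hamming(7,4): encode 4 data bits into 7, flip one bit, correct it from the syndrome.
import numpy as np

# generator and parity-check matrices of the (7,4) Hamming code (one common choice)
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

data = np.array([1, 0, 1, 1])
codeword = data @ G % 2
received = codeword.copy()
received[2] ^= 1                          # introduce a single bit error

syndrome = H @ received % 2               # nonzero syndrome reveals an error
# the syndrome equals the column of H at the error position
error_pos = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
corrected = received.copy()
corrected[error_pos] ^= 1
print(np.array_equal(corrected, codeword))   # True
```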
18 Good sources on information theory are the books [2] (which most of these foils are based on) and [3]. Related courses at UNIK: UNIK4190, UNIK4220, UNIK4230. Related courses at NTNU: TTT4125, TTT4110. This talk is available at oyvindry/talks.shtml. My publications are listed at oyvindry/publications.shtml.
19 References
[1] C. E. Shannon, "A mathematical theory of communication," The Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, July and October 1948.
[2] T. M. Cover and J. A. Thomas, Elements of Information Theory, second edition. Wiley, 2006.
[3] D. J. MacKay, Information Theory, Inference, and Learning Algorithms. Cambridge University Press, 2003.