# Chap 3: Huffman Coding


- 3.1 Overview
- 3.2 The Huffman Coding Algorithm
- 3.4 Adaptive Huffman Coding
- 3.5 Golomb Codes
- 3.6 Rice Codes
- 3.7 Tunstall Codes
- 3.8 Applications of Huffman Coding

## 3.2 The Huffman Coding Algorithm

The Huffman procedure is based on two observations regarding optimum prefix codes:

1. In an optimum code, symbols that occur more frequently will have shorter codewords than symbols that occur less frequently.
2. In an optimum code, the two symbols that occur least frequently will have codewords of the same length.

The Huffman procedure is obtained by adding a simple requirement to these two observations.

The requirement is that the codewords corresponding to the two lowest-probability symbols differ only in the last bit. That is, if γ and δ are the two least probable symbols and the codeword for γ is m*0, then the codeword for δ is m*1. Here m is a string of 1s and 0s, and * denotes concatenation.
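To make the construction concrete, here is a minimal sketch of the Huffman procedure in Python (not from the slides; the source and its probabilities are hypothetical). At each merge, the two least probable entries receive codewords that differ only in the bit being prepended:

```python
import heapq

def huffman_code(probabilities):
    """Build a Huffman code by repeatedly merging the two least
    probable entries (a sketch; `huffman_code` is a hypothetical helper,
    not code from the slides)."""
    # Each heap entry: (probability, tie-breaker, {symbol: codeword-so-far})
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probabilities.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # least probable group
        p2, _, c2 = heapq.heappop(heap)   # second least probable group
        # Their codewords differ only in the prepended bit: 0 vs 1
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        count += 1
        heapq.heappush(heap, (p1 + p2, count, merged))
    return heap[0][2]

# Hypothetical dyadic source: codeword lengths come out as 1, 2, 3, 3
code = huffman_code({"a1": 0.5, "a2": 0.25, "a3": 0.125, "a4": 0.125})
```

Because the probabilities here are negative powers of two, the resulting code lengths match the self-information of each symbol exactly.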

### Example: Design of a Huffman Code

The entropy for the source of Table 3.1 is bits/symbol.


### Length of Huffman Codes

For a source S with alphabet A = {a_1, a_2, ..., a_k} and probability model {P(a_1), P(a_2), ..., P(a_k)}, the average number of bits (bit rate) to code a symbol is

$$\bar{l} = \sum_{i=1}^{k} P(a_i)\, l_i$$

where l_i is the length of the codeword for a_i.

A measure of the efficiency of this code is its redundancy: the difference between the average length and the entropy. For Example 3.2.1, the redundancy is bits/symbol. The redundancy is zero when the probabilities are negative powers of two. The Huffman code for a source S has an average code length bounded below by the entropy and bounded above by the entropy plus 1 bit:

$$H(S) \le \bar{l} < H(S) + 1$$
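As a quick numeric check, the sketch below computes entropy, average length, and redundancy for a hypothetical dyadic source (negative powers of two), where the redundancy should come out to zero:

```python
import math

def entropy(probs):
    """H(S) = -sum p*log2(p), in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs)

# Hypothetical dyadic source and a matching set of Huffman codeword lengths
probs = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

avg_len = sum(p * l for p, l in zip(probs, lengths))
redundancy = avg_len - entropy(probs)   # 0.0 for this dyadic source
```

Both the entropy and the average length equal 1.75 bits/symbol here, so the code sits exactly at the lower bound.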

## 3.4 Adaptive Huffman Coding

Huffman coding requires knowledge of the probabilities of the source sequence. If this knowledge is not available, Huffman coding becomes a two-pass procedure: the statistics are collected in the first pass, and the source is encoded in the second pass. Theoretically, if we want to encode the (k+1)-th symbol using the statistics of the first k symbols, we could recompute the code using the Huffman coding procedure each time a symbol is transmitted.

However, this would not be a very practical approach due to the large amount of computation involved. Hence, adaptive Huffman coding procedures were developed. To describe how the adaptive Huffman code works, we add two parameters to the binary tree: the weight of each node, which is written as a number inside the node, and a node number.

The weight of each external node is simply the number of times the symbol corresponding to that leaf has been encountered. The weight of each internal node is the sum of the weights of its offspring. The node number y_i is a unique number assigned to each internal and external node.

### Sibling Property

If we have an alphabet of size n, then the 2n-1 internal and external nodes can be numbered as y_1, ..., y_{2n-1} such that, if x_j is the weight of node y_j, we have

$$x_1 \le x_2 \le \cdots \le x_{2n-1}$$

Furthermore, the nodes y_{2j-1} and y_{2j} are offspring of the same parent node (siblings) for 1 ≤ j < n, and the node number of the parent node is greater than y_{2j-1} and y_{2j}. These two characteristics are called the sibling property, and any tree that possesses this property is a Huffman tree [23].

In the adaptive Huffman coding procedure, neither transmitter nor receiver knows anything about the statistics of the source sequence at the start of transmission. The tree at both the transmitter and the receiver consists of a single node that corresponds to all symbols not yet transmitted (NYT) and has a weight of zero. As transmission progresses, nodes corresponding to symbols transmitted will be added to the tree, and the tree is reconfigured using an update procedure.

Before the beginning of transmission, a fixed code for each symbol is agreed upon between transmitter and receiver. A simple (short) code is as follows: suppose the source has an alphabet (a_1, a_2, ..., a_m) of size m; then pick e and r such that

$$m = 2^e + r, \qquad 0 \le r < 2^e$$

If 1 ≤ k ≤ 2r, the symbol a_k is encoded as the (e+1)-bit binary representation of k-1; otherwise, a_k is encoded as the e-bit binary representation of k-r-1.

For example, suppose m = 26; then e = 4 and r = 10. The symbol a_1 is encoded as 00000, the symbol a_2 is encoded as 00001, and the symbol a_22 is encoded as 1011. In-class exercise: list all the codewords from a_1 to a_26.
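This initial fixed code can be sketched as follows. The sketch assumes the standard scheme in which m = 2^e + r with 0 ≤ r < 2^e, symbols a_1 through a_{2r} get the (e+1)-bit binary code of k-1, and the remaining symbols get the e-bit binary code of k-r-1 (`fixed_code` is a hypothetical helper):

```python
def fixed_code(k, m):
    """Initial fixed code for symbol a_k (1-based index k) from an
    alphabet of size m, agreed upon before adaptive Huffman
    transmission begins."""
    e = m.bit_length() - 1        # largest e with 2**e <= m
    r = m - 2 ** e                # so m = 2**e + r, 0 <= r < 2**e
    if k <= 2 * r:
        return format(k - 1, f"0{e + 1}b")   # (e+1)-bit code of k-1
    return format(k - r - 1, f"0{e}b")       # e-bit code of k-r-1
```

For m = 26 this reproduces the codewords in the example: a_1 is 00000 and a_22 is 1011.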

When a symbol is encountered for the first time, the code for the NYT node is transmitted, followed by the fixed code for the symbol. A node for the symbol is then created, and the symbol is taken out of the NYT list. Both transmitter and receiver start with the same tree structure. The updating procedure used by both transmitter and receiver is identical, so the encoding and decoding processes remain synchronized.

The update procedure requires that the nodes be in a fixed order. This ordering is preserved by numbering the nodes. The largest node number is given to the root of the tree, and the smallest number is assigned to the NYT node. The numbers from the NYT node to the root of the tree are assigned in increasing order from left to right, and from lower level to upper level. The set of nodes with the same weight makes up a block. Figure 3.6 is a flowchart of the updating procedure.


The function of the update procedure is to preserve the sibling property. So that the update procedures at the transmitter and receiver both operate with the same information, the tree at the transmitter is updated after each symbol is encoded, and the tree at the receiver is updated after each symbol is decoded.


### Example 3.4.1: Update Procedure

Assume we are encoding the message [a a r d v]. We begin with only the NYT node. The total number of nodes in this tree will be 2 × 26 - 1 = 51, so we start numbering backwards from 51, with the number of the root node being 51. As a does not yet exist in the tree, we send the binary code for a and then add a to the tree. The NYT node gives birth to a new NYT node and a terminal node corresponding to a. The weight of the terminal node will be higher than that of the NYT node, so we assign the number 49 to the NYT node and 50 to the terminal node corresponding to the letter a.

The second letter to be transmitted is also a. This time the transmitted code is 1. The node corresponding to a has the highest number (if we do not consider its parent), so we do not need to swap nodes. The next letter to be transmitted is r. This letter does not have a corresponding node in the tree, so we send the codeword for the NYT node, which is 0, followed by the index of r, which is 10001. The NYT node gives birth to a new NYT node and an external node corresponding to r. Again, no update is required.

The next letter to be transmitted is d, which is also being sent for the first time. We again send the code for the NYT node, which is now 00, followed by the index for d, which is 00011. The NYT node again gives birth to two new nodes. However, an update is still not required. The next letter, v, is also being sent for the first time. We again send the code for the NYT node, which is now 000, followed by the index for v, which is 1011.

After nodes 43 and 44 are added to the tree, we examine the grandparent node of v (node 47) to see if it has the largest number in its block. As it does not, we swap it with node 48, which has the largest number in its block, increment node 48, and move to its parent, which is node 49. In the block containing node 49, the largest number belongs to node 50, so we swap nodes 49 and 50 and then increment node 50. We then move to the parent of node 50, which is node 51. As this is the root node, all we do is increment node 51.


## 3.5 Golomb Codes

The Golomb-Rice codes belong to a family of codes designed to encode integers under the assumption that the larger an integer, the lower its probability of occurrence. The simplest code for this situation is the unary code. The unary code for a positive integer n is simply n 1s followed by a 0. Thus, the code for 4 is 11110, and the code for 7 is 11111110.
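The unary code is a one-liner; a sketch (hypothetical helper, not from the slides):

```python
def unary(n):
    """Unary code for a nonnegative integer n: n 1s followed by a 0."""
    return "1" * n + "0"
```

This reproduces the examples above: `unary(4)` gives 11110 and `unary(7)` gives 11111110.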

In the Golomb code with parameter m, we represent an integer n > 0 using two numbers, q and r, where n = qm + r. The quotient q is represented by its unary code, and the remainder r is represented in binary. If m is a power of two, we use the log2(m)-bit binary representation of r. If m is not a power of two, we could use ⌈log2 m⌉ bits, where ⌈x⌉ is the smallest integer greater than or equal to x. We can reduce the number of bits required if we use the ⌊log2 m⌋-bit binary representation of r for the first 2^⌈log2 m⌉ - m values, and the ⌈log2 m⌉-bit binary representation of r + 2^⌈log2 m⌉ - m for the remaining values.
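A sketch of a Golomb encoder using the reduced-bit (truncated binary) remainder representation, assuming n ≥ 0 and m ≥ 2 (`golomb` is a hypothetical helper):

```python
import math

def golomb(n, m):
    """Golomb code for integer n >= 0 with parameter m >= 2:
    unary code of q = n // m, then a truncated binary code of r = n % m."""
    q, r = divmod(n, m)
    unary_q = "1" * q + "0"
    b = math.ceil(math.log2(m))
    cutoff = 2 ** b - m           # the first `cutoff` remainders use b-1 bits
    if r < cutoff:
        return unary_q + format(r, f"0{b - 1}b")
    return unary_q + format(r + cutoff, f"0{b}b")
```

For m = 5, for instance, the first 2^3 - 5 = 3 remainder values use 2 bits and the rest use 3 bits; when m is a power of two, `cutoff` is 0 and every remainder uses exactly log2(m) bits.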

### Example 3.5.1: Golomb Codes

## 3.6 Rice Codes

The Rice code was originally developed by Robert F. Rice (1979) and later extended by Pen-Shu Yeh and Warner Miller. The Rice code can be viewed as an adaptive Golomb code. We will study the implementation of the Rice code in the recommendation for lossless compression from the Consultative Committee for Space Data Systems (CCSDS).

The algorithm consists of a preprocessor (the modeling step) and a binary coder (the coding step). The preprocessor removes correlation from the input and generates a sequence of nonnegative integers. This sequence has the property that smaller values are more probable than larger values. The binary coder generates a bitstream to represent the integer sequence.

## 3.7 Tunstall Codes

In the Tunstall code, all codewords are of equal length, but each codeword represents a variable number of source symbols. An example of a 2-bit Tunstall code is shown in Table . The main advantage of a Tunstall code is that errors in codewords do not propagate.

The design of a code that has a fixed codeword length but a variable number of symbols per codeword should satisfy the following conditions:

- We should be able to parse a source output sequence into strings of symbols that appear in the codebook.
- We should maximize the average number of source symbols represented by each codeword.

### Designing a Tunstall Code

Suppose we want an n-bit Tunstall code for a source that generates iid letters from an alphabet of size N. Start with the N letters of the alphabet in the codebook. Remove the entry in the codebook that has the highest probability, and add the N strings obtained by concatenating this entry with every letter in the alphabet (including itself). Perform this step K times, where

$$N + K(N-1) \le 2^n$$

### Example: Tunstall Codes

Let us design a 3-bit Tunstall code for a memoryless source with the following alphabet: A = {A, B, C}, with P(A) = 0.6, P(B) = 0.3, P(C) = 0.1.
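The design procedure above can be sketched as follows (`tunstall` is a hypothetical helper; the assignment of which string gets which fixed-length codeword is arbitrary):

```python
def tunstall(probs, n):
    """Build an n-bit Tunstall codebook for an iid source.
    probs: dict mapping letters to probabilities."""
    N = len(probs)
    entries = dict(probs)               # codebook string -> probability
    # Expand K times, where N + K*(N-1) <= 2**n
    K = (2 ** n - N) // (N - 1)
    for _ in range(K):
        best = max(entries, key=entries.get)   # most probable string
        p = entries.pop(best)
        for letter, q in probs.items():
            entries[best + letter] = p * q     # iid: probabilities multiply
    # Assign fixed-length n-bit codewords to the remaining strings
    return {s: format(i, f"0{n}b") for i, s in enumerate(sorted(entries))}

book = tunstall({"A": 0.6, "B": 0.3, "C": 0.1}, 3)
```

For this source, N = 3 and n = 3 give K = 2 expansions (first A, then AA), leaving 7 strings, each mapped to a distinct 3-bit codeword.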

## 3.8 Applications of Huffman Coding

- Lossless image compression
- Text compression
- Audio compression


Simple Fast and Adaptive Lossless Image Compression Algorithm Roman Starosolski December 20, 2006 This is a preprint of an article published in Software Practice and Experience, 2007, 37(1):65-91, DOI:

### CHAPTER 2 LITERATURE REVIEW

11 CHAPTER 2 LITERATURE REVIEW 2.1 INTRODUCTION Image compression is mainly used to reduce storage space, transmission time and bandwidth requirements. In the subsequent sections of this chapter, general

### H.264 / AVC Context Adaptive Variable Length Coding

White Paper: H.264 / AVC Context Adaptive Variable Length Coding Iain Richardson Vcodex 2002-2011 Variable-Length Coding 1 Introduction The H.264 / AVC standard specifies two types of entropy coding: Context-based

### BACK. Texas Instrument DSP Solution Challenge. Ecole Nationale Supérieure de Télécommunication de Bretagne. Project Team : William TAPSOBA

BACK Block Turbo-Codes Decoding using the Walsh-Hadamard Transform applied to Hartmann Nazarov algorithm. Real Time Implementation on the DSP Texas TMS320C6201 platform Texas Instrument DSP Solution Challenge

### Coding and decoding with convolutional codes. The Viterbi Algor

Coding and decoding with convolutional codes. The Viterbi Algorithm. 8 Block codes: main ideas Principles st point of view: infinite length block code nd point of view: convolutions Some examples Repetition

### 802.11A - OFDM PHY CODING AND INTERLEAVING. Fernando H. Gregorio. Helsinki University of Technology

802.11A - OFDM PHY CODING AND INTERLEAVING Fernando H. Gregorio Helsinki University of Technology Signal Processing Laboratory, POB 3000, FIN-02015 HUT, Finland E-mail:gregorio@wooster.hut.fi 1. INTRODUCTION

### The largest has a 0 in the sign position and 0's in all other positions:

10.2 Sign Magnitude Representation Sign Magnitude is straight-forward method for representing both positive and negative integers. It uses the most significant digit of the digit string to indicate the

### Information Theory. Entropy

Information Theory 19 Information Theory Information theory concerns the measure of information contained in data. The security of a cryptosystem is determined by the relative content of the key, plaintext,

### Diffusion and Data compression for data security. A.J. Han Vinck University of Duisburg/Essen April 2013 Vinck@iem.uni-due.de

Diffusion and Data compression for data security A.J. Han Vinck University of Duisburg/Essen April 203 Vinck@iem.uni-due.de content Why diffusion is important? Why data compression is important? Unicity

### Regular Expressions with Nested Levels of Back Referencing Form a Hierarchy

Regular Expressions with Nested Levels of Back Referencing Form a Hierarchy Kim S. Larsen Odense University Abstract For many years, regular expressions with back referencing have been used in a variety

### Section IV.1: Recursive Algorithms and Recursion Trees

Section IV.1: Recursive Algorithms and Recursion Trees Definition IV.1.1: A recursive algorithm is an algorithm that solves a problem by (1) reducing it to an instance of the same problem with smaller

### Analysis of Compression Algorithms for Program Data

Analysis of Compression Algorithms for Program Data Matthew Simpson, Clemson University with Dr. Rajeev Barua and Surupa Biswas, University of Maryland 12 August 3 Abstract Insufficient available memory