## Image Compression using Huffman Coding and Run Length Coding


## Description

**ABSTRACT**

Data compression is the general term for the various algorithms and programs developed to address the problem of reducing data size. A compression program converts data from an easy-to-use format to one optimized for compactness; a decompression program returns the information to its original form. This research aims to show the effect of a simple lossless compression method, Run Length Encoding (RLE), on another lossless compression algorithm, the Huffman algorithm, which generates an optimal prefix code from a set of symbol probabilities. The Discrete Wavelet Transform (DWT) provides greater accuracy during encoding, while RLE simply replaces each run of repeated bytes with a short description of which byte to repeat and how many times.
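As a minimal illustration of the RLE idea described above (not the project's actual implementation), each run of a repeated byte is replaced by a (byte, count) pair:

```python
def rle_encode(data: bytes) -> list[tuple[int, int]]:
    """Replace each run of a repeated byte with a (byte, count) pair."""
    runs: list[tuple[int, int]] = []
    for b in data:
        if runs and runs[-1][0] == b:
            runs[-1] = (b, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((b, 1))              # start a new run
    return runs


def rle_decode(runs: list[tuple[int, int]]) -> bytes:
    """Expand each (byte, count) pair back into the original run."""
    return bytes(b for b, n in runs for _ in range(n))
```

Because decoding exactly reverses encoding, the method is lossless: `rle_decode(rle_encode(data))` always returns the original bytes.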

**INTRODUCTION**

There are two dimensions along which each of the schemes discussed here may be measured: algorithm complexity and amount of compression. When data compression is used in a data transmission application, the goal is speed. Speed of transmission depends upon the number of bits sent, the time required for the encoder to generate the coded message, and the time required for the decoder to recover the original ensemble. In a data storage application, although the degree of compression is the primary concern, it is nonetheless necessary that the algorithm be efficient in order for the scheme to be practical. Several common measures of compression have been suggested: average message length and compression ratio. Related to each of these measures are assumptions about the characteristics of the source. It is generally assumed in information theory that all statistical parameters of a message source are known with perfect accuracy.
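The two measures mentioned above can be computed as follows (a minimal sketch; the helper names are illustrative, and note that some texts define compression ratio as the reciprocal):

```python
def compression_ratio(original_bits: int, compressed_bits: int) -> float:
    """Ratio of original size to compressed size; > 1 means the data shrank."""
    return original_bits / compressed_bits


def average_message_length(code_lengths: dict[str, int],
                           probabilities: dict[str, float]) -> float:
    """Expected code-word length in bits per symbol: sum of p(s) * len(code(s))."""
    return sum(probabilities[s] * code_lengths[s] for s in probabilities)
```

For example, a code assigning 1 bit to a symbol of probability 0.5 and 2 bits to two symbols of probability 0.25 each has an average message length of 1.5 bits per symbol.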

**EXISTING SYSTEM**

Compression refers to reducing the quantity of data used to represent a file, image, or video without excessively reducing the quality of the original data. The existing system uses the Discrete Cosine Transform (DCT) with normal extraction, which reduces the number of bits required to store and/or transmit digital media. To compress something means to take a piece of data and decrease its size. There are different techniques for doing this, each with its own advantages and disadvantages.

**PROPOSED SYSTEM**

1. Take the input file, which is a random sequence of English alphabet symbols, and compute the probability of each symbol.

2. Apply Huffman coding to the sequence of probabilities; the result is a string of 0 and 1 bits, with the Discrete Wavelet Transform used for the segmentation process.

3. Apply the RLE method to the string of 0 and 1 bits (the result of the Huffman method). RLE is applied after dividing the bit string into 8-bit blocks and converting each block into a byte. It is then applied only to the bytes 0 and 255, where 0 comes from a run of eight zeros and 255 from a run of eight ones; the other byte values are left unchanged.

4. The final compressed file contains: (a) the number of symbols; (b) the symbols; (c) the code word of each symbol; (d) the final string resulting from applying the RLE method to the code produced by substituting the code word of each symbol in the input file.
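The steps above can be sketched as follows. This is a simplified illustration, not the project's source code: it omits the DWT segmentation and the file header of step 4, and uses a standard heap-based Huffman construction:

```python
import heapq
from collections import Counter
from itertools import count


def huffman_codes(probabilities: dict[str, float]) -> dict[str, str]:
    """Build an optimal prefix code from symbol probabilities (steps 1-2)."""
    tiebreak = count()  # unique counter so the heap never compares dicts
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probabilities.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, left = heapq.heappop(heap)
        p2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]


def compress(text: str):
    """Huffman-encode, pack bits into bytes, then RLE the 0/255 bytes (step 3)."""
    probs = {s: n / len(text) for s, n in Counter(text).items()}
    codes = huffman_codes(probs)
    bits = "".join(codes[s] for s in text)
    bits += "0" * (-len(bits) % 8)  # pad to a whole number of bytes
    packed = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    # Per step 3, RLE is applied only to the all-zero (0) and all-one (255) bytes.
    out, i = [], 0
    while i < len(packed):
        b = packed[i]
        if b in (0, 255):
            run = 1
            while i + run < len(packed) and packed[i + run] == b:
                run += 1
            out.append(("run", b, run))
            i += run
        else:
            out.append(("lit", b))
            i += 1
    return codes, out
```

Because Huffman coding produces a prefix code, no code word is a prefix of another, so the bit string can later be decoded unambiguously with the code table from the header.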

**BLOCK DIAGRAM**

**ADVANTAGES**

- No data loss during compression
- No data loss from edge detection
- No loss in accuracy

**APPLICATIONS**

- File compression
- Run-time processing
- Networking and security analysis

**CONCLUSION**

1. The proposed compression method is lossless, since both the Huffman and RLE methods are lossless. This is useful in text compression, where losing a single character can in the worst case make the text dangerously misleading. The method therefore increases the compression ratio without losing information.

2. When the input file contains longer runs of frequently repeated symbols, the proposed method has a stronger effect on the compression ratio and gives better results.

3. When the RLE method is applied together with the Huffman algorithm, if it does not decrease the file size it will not increase it either. This indicates that the RLE method has a beneficial effect on the Huffman method when the two are combined.

