The document discusses different techniques for compressing multimedia data such as text, images, audio and video. It describes how compression works by removing redundancy in digital data and exploiting properties of human perception. It then explains different compression methods including lossless compression, lossy compression, entropy encoding, and specific algorithms like Huffman encoding and arithmetic coding. The goal of compression is to reduce the size of files to reduce storage and bandwidth requirements for transmission.
7. Source Encoders and Destination Decoders Prior to transmitting the source information relating to a multimedia application, a compression algorithm is applied to it. This means that, in order for the destination to reproduce the original source information (or, in some instances, a nearly exact copy of it), a matching decompression algorithm must be applied. Applying the compression algorithm is the main function of the source encoder; applying the decompression algorithm is the main function of the destination decoder.
8. Source Encoders and Destination Decoders In applications that involve two computers communicating with each other, the time required to perform the compression and decompression algorithms is not always critical, so both algorithms are normally implemented in software within the two computers. [Figure: source information flows from a source-encoder program in the source computer, across the network, to a destination-decoder program in the destination computer, which produces a copy of the source information. Caption: Source encoder / destination decoder: software only.]
9. Source Encoders and Destination Decoders In other applications, however, the time required to perform the compression and decompression algorithms in software is not acceptable, and instead the two algorithms must be performed by special processors in separate units. [Figure: source information flows from a source-encoder processor attached to the source computer, across the network, to a destination-decoder processor attached to the destination computer, which produces a copy of the source information. Caption: Source encoder / destination decoder: special processors/hardware.]
14. Statistical Encoding In this technique, patterns of bits (words) that occur more frequently are encoded using shorter codes. Statistical encoding uses a set of variable-length codewords, with the shortest codewords used to represent the most frequently occurring symbols. For example, in a string of text the character A may occur more frequently than, say, the character P, which in turn occurs more frequently than the character Z, and so on. Statistical encoding exploits this property by assigning the shortest codewords to the most frequent characters.
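As a rough sketch of why this saves space (the frequencies and codeword set below are illustrative assumptions, not values from the slides), compare the total bit count of a variable-length code against a fixed 8-bit code:

```python
# Hypothetical symbol frequencies (occurrences per 100 characters) and a
# variable-length code that gives the shortest codewords to the most
# frequent symbols. Both tables are made-up examples for illustration.
freqs = {"A": 50, "B": 25, "P": 15, "Z": 10}
codes = {"A": "0", "B": "10", "P": "110", "Z": "111"}

fixed_bits = sum(freqs.values()) * 8                     # 8 bits/char (ASCII)
var_bits = sum(freqs[s] * len(codes[s]) for s in freqs)  # weighted code lengths

print(f"fixed-length:    {fixed_bits} bits")   # 100 chars * 8 = 800 bits
print(f"variable-length: {var_bits} bits")     # 50*1 + 25*2 + 15*3 + 10*3 = 175 bits
```

For this (assumed) frequency distribution the variable-length code needs 175 bits where fixed-length ASCII needs 800, because the single-bit codeword is used for half of all characters.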
16. In practice, the use of variable-length codewords is not quite as straightforward as it first appears. Clearly, as with run-length encoding, the destination must know the set of codewords being used by the source. With variable-length codewords, however, in order for the decoding operation to be carried out correctly, it is necessary to ensure that a shorter codeword in the set does not form the start of a longer codeword; otherwise the decoder will interpret the bitstream on the wrong codeword boundaries. A codeword set that avoids this is said to possess the prefix property, and an encoding scheme that generates codewords with this property is the Huffman encoding algorithm.
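The prefix property falls out of the standard Huffman construction, because every symbol sits at a leaf of the code tree. The sketch below (symbol frequencies are illustrative assumptions) builds a Huffman code with Python's `heapq` and then checks that no codeword is the start of another:

```python
import heapq
from itertools import count

def huffman_codes(freqs):
    """Build a Huffman codeword table from a symbol -> frequency map."""
    tick = count()  # tie-breaker so heapq never compares tree nodes directly
    heap = [(f, next(tick), sym) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least-frequent subtrees.
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tick), (left, right)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):      # internal node: recurse into children
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                            # leaf: a symbol gets its codeword
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

# Illustrative frequencies: A most common, Z least common (assumed values)
codes = huffman_codes({"A": 50, "B": 25, "P": 15, "Z": 10})

# Prefix property: no codeword is a prefix of any other codeword.
words = list(codes.values())
assert not any(a != b and b.startswith(a) for a in words for b in words)
```

With these frequencies the most common symbol A receives a 1-bit codeword and the rarest symbols receive 3-bit codewords, matching the variable-length idea described above.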
22. Decoding Algorithm [Flowchart:]
Begin
Set CODEWORD to empty.
Repeat until all bits in BITSTREAM have been processed:
- Read the next bit from BITSTREAM and append it to the existing bits in CODEWORD.
- Is CODEWORD already stored? If no, continue reading bits; if yes, load the matching ASCII character into RECEIVE_BUFFER and reset CODEWORD to empty.
End
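The flowchart above can be sketched directly in code. The loop accumulates one bit at a time into CODEWORD and emits a character whenever the accumulated bits match a stored codeword; the codeword table used here is an illustrative assumption, not one from the slides:

```python
def decode(bitstream, codewords):
    """Decode a string of '0'/'1' bits using a prefix-free codeword table.

    codewords maps codeword strings (e.g. "10") to characters.
    """
    receive_buffer = []
    codeword = ""                      # Set CODEWORD to empty
    for bit in bitstream:              # until all bits in BITSTREAM processed
        codeword += bit                # read next bit, append to CODEWORD
        if codeword in codewords:      # is CODEWORD already stored?
            receive_buffer.append(codewords[codeword])  # load matching character
            codeword = ""              # reset and start the next codeword
    return "".join(receive_buffer)

# Illustrative prefix-free codeword table (an assumption, not from the slides)
table = {"0": "A", "10": "B", "110": "P", "111": "Z"}
print(decode("0101100111", table))     # prints "ABPAZ"
```

Because the table has the prefix property, the first match is always the correct codeword boundary, so the decoder never needs to backtrack.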