# Data Compression MCQ Based Question With Answer Set-2

**Data Compression Multiple Choice Questions with Answers** for AKTU. If you find any errors in the questions or their solutions, please comment below; our team will check and correct them.

- **Data Compression MCQ Practice Set-1: Click Here**
- **Data Compression MCQ Practice Set-2: Click Here**
- **Data Compression MCQ Practice Set-3: Click Here**
- **Data Compression MCQ Practice Set-4: Click Here**
- **Data Compression MCQ Practice Set-5: Click Here**

**Data Compression MCQ Online Test**

- **Data Compression MCQ Quiz Set-1: Click Here**
- **Data Compression MCQ Quiz Set-2: Click Here**
- **Data Compression MCQ Quiz Set-3: Click Here**
- **Data Compression MCQ Quiz Set-4: Click Here**
- **Data Compression MCQ Quiz Set-5: Click Here**

**Data Compression MCQ Based Question**

**Check Also:** **Software Engineering Quiz Set-1** | **Software Engineering Quiz Set-2**

**Huffman codes are ______ codes and are optimum for a given model (set of probabilities).**

- Parity
- **Prefix**
- Convolutional code
- Block code

**The Huffman procedure is based on observations regarding optimum prefix codes, which is/are**

- A. In an optimum code, symbols that occur more frequently (have a higher probability of occurrence) will have shorter codewords than symbols that occur less frequently
- B. In an optimum code, the two symbols that occur least frequently will have the same length
- **C. Both (A) and (B)**
- D. None of these

Correct option is C

**The best algorithm for constructing Huffman codes is the**

- Brute force algorithm
- Divide and conquer algorithm
- **Greedy algorithm**
- Exhaustive search

**How many printable characters does the ASCII character set consist of?**

- 128
- **100**
- 98
- 90

**The difference between the entropy and the average length of the Huffman code is called**

- Rate
- **Redundancy**
- Power
- None of these

**The unit of redundancy is**

- bits/second
- symbol/bits
- **bits/symbol**
- none of these

**The redundancy is zero when**

- The probabilities are positive powers of two
- **The probabilities are negative powers of two**
- Both
- None of the above
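As a quick check of the two answers above, here is a small Python sketch (assuming Shannon entropy in bits/symbol) comparing the entropy with the optimal average codeword length for a source whose probabilities are negative powers of two:

```python
import math

def entropy(probs):
    """Shannon entropy in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs)

# Probabilities that are negative powers of two: 1/2, 1/4, 1/8, 1/8
probs = [0.5, 0.25, 0.125, 0.125]

# For such a source, the optimal codeword length for a symbol with
# probability p is exactly -log2(p), so the average length equals H.
avg_len = sum(p * -math.log2(p) for p in probs)

print(entropy(probs))             # 1.75
print(avg_len)                    # 1.75
print(avg_len - entropy(probs))   # redundancy = 0.0
```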

**Which bit is reserved as a parity bit in an ASCII set?**

- Sixth
- Seventh
- **Eighth**
- Ninth

**How many bits are needed for standard (fixed-length) encoding if the size of the character set is X?**

- X+1
- **log(X)**
- X²
- 2X
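The answer above is the ceiling of the base-2 logarithm in practice; a minimal Python sketch:

```python
import math

# A fixed-length ("standard") encoding of a character set of size X
# needs ceil(log2(X)) bits per symbol.
def bits_needed(x):
    return math.ceil(math.log2(x))

print(bits_needed(128))  # 7 bits for the full ASCII set
print(bits_needed(26))   # 5 bits for the English alphabet
```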

**In Huffman coding, data in the tree always occurs in**

- **Leaves**
- Roots
- Left sub-trees
- None of these

**An optimal code will always be present in a full tree.**

- **True**
- False

**Running time of the Huffman encoding algorithm is**

- O(N log(C))
- **O(C log(C))**
- O(C)
- O(log(C))

**The running time of the Huffman algorithm, if its priority queue is implemented using linked lists, is**

- O(log(C))
- O(C log(C))
- **O(C²)**
- O(C)

**The unary code for a positive integer n is simply n ______ followed by a ______.**

- zeros, one
- **ones, zero**
- None of these

**The unary code for 4 is**.

- 11100
- **11110**
- 00001
- 00011
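The two unary-code answers above can be verified with a one-line encoder (a sketch, using the "n ones followed by a zero" convention from the question):

```python
def unary(n):
    """Unary code: n ones followed by a terminating zero."""
    return "1" * n + "0"

for n in range(1, 5):
    print(n, unary(n))
# 4 -> 11110
```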

**In the Tunstall code, all codewords are of ______ length; however, each codeword represents a ______ number of letters.**

- different, equal
- **equal, different**
- none of these

**Tunstall coding is a form of entropy coding used for**

- **Lossless data compression**
- Lossy data compression
- Both
- None of these

**The main advantage of a Tunstall code is that**

- **Errors in codewords do not propagate**
- Errors in codewords propagate
- The disparity between frequencies
- None of these

**Applications of Huffman Coding**

- Text compression
- Audio compression
- Lossless image compression
- **All of the above**

**An alphabet consist of the letters A, B, C and D. The probability of occurrence is P(A) = 0.4, P(B)= 0.1, P(C) = 0.2 and P(D) = 0.3. The Huffman code is**

- **A = 0, B = 111, C = 110, D = 10**
- A = 0, B = 11, C = 10, D = 111
- A = 0, B = 111, C = 11, D = 101
- A = 01, B = 111, C = 110, D = 10
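The answer above can be reproduced with the usual greedy construction. Below is a minimal Python sketch; the exact 0/1 labels depend on tie-breaking, but the codeword lengths (1, 3, 3, 2 for A, B, C, D) are forced by the probabilities:

```python
import heapq
from itertools import count

def huffman_codes(freqs):
    """Greedy Huffman construction; returns {symbol: codeword}.

    freqs: dict mapping symbol -> probability.
    """
    tiebreak = count()  # keeps heap entries comparable when probs tie
    heap = [(p, next(tiebreak), sym) for sym, p in freqs.items()]
    heapq.heapify(heap)
    # Repeatedly merge the two least probable subtrees.
    while len(heap) > 1:
        p1, _, left = heapq.heappop(heap)
        p2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, next(tiebreak), (left, right)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):        # internal node
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                              # leaf: a symbol
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

codes = huffman_codes({"A": 0.4, "B": 0.1, "C": 0.2, "D": 0.3})
print({s: len(c) for s, c in codes.items()})  # {'A': 1, 'B': 3, 'C': 3, 'D': 2}
```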

**The basic idea behind Huffman coding is to**

- compress data by using fewer bits to encode less frequently occurring characters
- **compress data by using fewer bits to encode more frequently occurring characters**
- compress data by using more bits to encode more frequently occurring characters
- expand data by using fewer bits to encode more frequently occurring characters

**Huffman coding is an encoding algorithm used for**

- **lossless data compression**
- broadband systems
- files greater than 1 Mbit
- lossy data compression

**A Huffman encoder takes a set of characters with fixed length and produces a set of characters of**

- random length
- fixed length
- **variable length**
- constant length

**A Huffman code: A = 1, B = 000, C = 001, D = 01, with P(A) = 0.4, P(B) = 0.1, P(C) = 0.2, P(D) = 0.3. The average number of bits per letter is**

- 1 bit
- 2 bits
- **1.9 bits**
- 0.9 bits
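The expected length here is plain arithmetic, weighting each codeword's length by its probability; a short check in Python:

```python
# Codeword lengths from the question: A=1, B=000, C=001, D=01
lengths = {"A": 1, "B": 3, "C": 3, "D": 2}
probs = {"A": 0.4, "B": 0.1, "C": 0.2, "D": 0.3}

# Average bits per letter = sum of P(s) * len(codeword(s))
avg = sum(probs[s] * lengths[s] for s in probs)
print(round(avg, 2))  # 1.9
```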

**Which of the following is not a part of channel coding?**

- Rectangular code
- Checksum checking
- Hamming code
- **Huffman code**

**Which of the following is the first phase of JPEG?**

- **DCT Transformation**
- Quantization
- Data Compression
- None of the above

**Which type of method is used to compress data made up of a combination of symbols?**

- **Run-length encoding**
- Huffman encoding
- Lempel Ziv encoding
- JPEG encoding

**How many passes does lossy compression frequently make?**

- One pass
- **Two passes**
- Three passes
- Four passes

**Information is**

- data
- **meaningful data**
- raw data
- Both A and B