Here is the rationale and procedure:
As a simple example, if you had three symbols with probability 1/3 each, the optimal Huffman code would use the three codewords 0, 10, and 11, for an average of 5/3 bits per symbol.
There are 243 super-symbols created by concatenating 5 of the original symbols, each with probability 1/243, which is much closer to a power of two (1/256). The optimal Huffman code encodes 13 of these groups in 7 bits and the remaining 230 in 8 bits, for an average of 7.9465 bits per group, or 1.5893 bits per original symbol. That is down from 1.6667 bits for the original Huffman code, while arithmetic coding would take 1.5850 bits (the entropy, log2 3).
So in theory you could combine every two symbols into one larger symbol, or every three symbols into one larger symbol, and use Huffman coding on the combinations.
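The blocking argument above is easy to verify numerically. The sketch below (a minimal illustration, not taken from the original post) builds a Huffman code over the 3^5 = 243 equiprobable super-symbols and checks the code lengths and averages quoted above:

```python
import heapq
from math import log2

def huffman_code_lengths(probs):
    """Return the codeword length Huffman's algorithm assigns to each symbol."""
    # Heap entries: (probability, unique tiebreak, symbol indices in this subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        # Merging two subtrees adds one bit to every codeword inside them.
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

# Block 5 ternary symbols together: 3**5 = 243 equiprobable super-symbols.
n = 3 ** 5
lengths = huffman_code_lengths([1 / n] * n)
avg_per_group = sum(lengths) / n        # every group is equally likely
avg_per_symbol = avg_per_group / 5

print(lengths.count(7), lengths.count(8))  # 13 codes of 7 bits, 230 of 8 bits
print(round(avg_per_group, 4))             # 7.9465 bits per group
print(round(avg_per_symbol, 4))            # 1.5893 bits per original symbol
print(round(log2(3), 4))                   # entropy bound: 1.585
```

Since all 243 super-symbols are equally likely, the optimal tree puts every leaf on one of two adjacent levels (depth 7 or 8), which is exactly what the run confirms.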