It sounds like you are writing the string "001111011001" to the file as literal text.
A string is a sequence of characters, and depending on the character encoding, each character takes 8 or 16 bits in the file. If you expand the single character "2" into the three-character sequence "111", you have just tripled the size of your file.
Instead of a sequence of characters, what you are supposed to be creating is a sequence of bits.
I'm not sure how you would do it in Java, but in C you could do it like this:
char *encoded_string = "0011111000110000111100001110001100";
int len = strlen(encoded_string);
unsigned char byte_value = 0;
int ix;                               /* declared outside so the remainder check can use it */
for (ix = 0; ix < len; ix++) {        /* advance one bit (character) at a time */
    byte_value = byte_value << 1;
    if (encoded_string[ix] == '1')
        byte_value |= 1;
    if ((ix % 8) == 7) {              /* a full byte is ready */
        fputc(byte_value, outputfile);
        byte_value = 0;
    }
}
int remainder = ix % 8;               /* leftover bits that didn't fill a byte */
if (remainder) {
    byte_value = byte_value << (8 - remainder);  /* pad the low end with zero bits */
    fputc(byte_value, outputfile);
}
There may be a Java library function that does this conversion for you, but I spelled it out in this form so you can see what actually needs to happen and understand the difference.
This code example assumes you want to encode with the most significant bit first. It sidesteps byte-ordering issues by outputting one byte at a time.
The code pads out the end of the file with zero bits -- that may be an issue for your Huffman encoding scheme. It probably ought to append an end-of-file code before padding with zero bits.