|
I'm sure you understand it (but you do not know that)!
Word width: 8 bits
Signed range: -128 ~ 127
Unsigned range: 0 ~ 255
2^8 = 256
and a range of 1 to 256 can be encoded with 8 bits just as well as the 0 to 255 shown above.
Now you should know that you already knew that
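The 8-bit ranges quoted above can be checked directly; a minimal Python sketch (my own illustration, not from the thread), interpreting the same 256 bit patterns as unsigned or as two's-complement signed values:

```python
def to_signed8(byte):
    """Interpret an unsigned byte (0..255) as a two's-complement int8 (-128..127)."""
    return byte - 256 if byte >= 128 else byte

print(to_signed8(0))     # 0
print(to_signed8(127))   # 127
print(to_signed8(128))   # -128
print(to_signed8(255))   # -1
print(2 ** 8)            # 256 distinct bit patterns in total
```

Either way, it is the same 256 patterns; only the interpretation changes.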
|
|
|
|
|
I think I'm quite comfortable with binary numbers. But again, translated by Google, I don't get the point of this:
Quote: Since it is well known that 2^n different bit patterns can be formed with n bits, it is immediately clear that a system with 2^n states can be completely described with an n-bit file, so that in this case H=n
How can one describe 256 states in a file with one byte?
I'm pretty sure I have a problem understanding the article, but I'd be very happy if somebody could explain what I'm misinterpreting.
|
|
|
|
|
A file with one byte can describe how many states a system has (state value 1 or 2 or ... 256).
If you want to describe 256 states, you need 256 * 8 bits.
|
|
|
|
|
My understanding is that a system at any particular moment is in a state, a single state. Of course a system cannot be in more than one state at any moment, unless of course we are discussing quantum mechanics, which of course we are not, I presume. If it is known that the system can be in any one of 256 possible states, then at any moment only 8 bits are required to specify that state. QED
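A minimal Python sketch of the point above (my own illustration, not from the thread): naming which one of 256 states a system is currently in takes exactly 8 bits, i.e. one byte.

```python
import math

# If a system has 2**n possible states, specifying the state it is in
# right now takes exactly n bits, so H = n.
num_states = 256
bits_needed = math.ceil(math.log2(num_states))
print(bits_needed)                      # 8

# Any single state number fits in one byte:
current_state = 200                     # hypothetical state number
encoded = current_state.to_bytes(1, "big")
print(len(encoded))                     # 1 byte
print(int.from_bytes(encoded, "big"))   # 200
```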
|
|
|
|
|
|
A visualization is literally 256 light switches side by side.
Each switch represents logical branching in the code: appStates is an array of bool ... if (appStates[34] and appStates[42]) ....
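The light-switch picture above can be sketched in Python (a hypothetical illustration; `appStates` is just the poster's example name). Note the distinction running through this thread: 256 independent switches carry 256 bits of information (2**256 combinations), while specifying a single one of 256 states needs only 8 bits.

```python
# 256 independent switches, all off except two:
appStates = [False] * 256
appStates[34] = True
appStates[42] = True

# Branching on individual switches, as in the post:
if appStates[34] and appStates[42]:
    print("both branches enabled")

# Recording ALL switch positions takes 256 bits = 32 bytes:
packed = sum(int(b) << i for i, b in enumerate(appStates))
print(len(packed.to_bytes(32, "little")))   # 32

# Recording WHICH single state (0..255) we are in takes just 1 byte:
state_index = 42
print(len(state_index.to_bytes(1, "little")))   # 1
```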
|
|
|
|
|
0x01AA wrote: Nobody knows what entropy really is
As a Mechanical Engineer by education, I can only say that the Second Law of Thermodynamics dictates that the entropy of the Universe is continuously increasing, dS > 0.
|
|
|
|
|
But is there a limit to this? Something like 2.58###
|
|
|
|
|
As a suggestion: attempting to learn one part of a system of study well will probably always require learning more about the system that contains it first.
So study information theory first.
|
|
|
|
|
Quote: So study information theory first.
I would say Shannon's theory of entropy is exactly that very basic theory.
|
|
|
|
|
There is a simple equation that defines entropy. Chemical Engineers make use of the term to describe the behavior of substances. We use it, for example, to evaluate the performance of a steam turbine. It has been misused by zealous promoters to obfuscate information.
|
|
|
|
|
And that simple equation is?
|
|
|
|
|
|
|
It's been a while, but I don't think that is the same thing as entropy in information theory.
But perhaps they are related.
|
|
|
|
|
|
|
Entropy is an attempt to give a macroscopic number that describes a large number of microscopic states. Taking the "classic" deck of cards, there is only 1 way of arranging the cards in suit order (Clubs, Diamonds, Hearts, Spades) in increasing order in each suit (2, 3, ..., J, Q, K, A). We take the log of the number of states, and the entropy is 0.
If the suits can be in any order, we have 4! possibilities, with an entropy of log(4!).
If the cards are in the order red-black-red-black, we have 26! * 26! possibilities, with an entropy of 2*log(26!).
If the cards can be in any order, we have 52! possibilities, with an entropy of log(52!). This is also the maximum entropy for the card system.
Each of the different ways to arrange the cards in the above examples is called a "micro-state". In physics/chemistry we usually multiply the entropy calculated above by Boltzmann's constant, to fit it in with other units such as temperature, energy, etc.
The connection to information theory comes from the fact that the lower the entropy of the system, the easier it is to predict the next card.
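The deck entropies above can be computed directly; a minimal Python sketch (my own, not from the post), using natural log as the post does:

```python
import math

# Entropy = log(number of micro-states), as in the card examples above.
# Natural log is used here; dividing by log(2) would give the answer in bits.
H_ordered    = math.log(1)                        # one fixed ordering -> 0
H_suit_order = math.log(math.factorial(4))        # suits in any order: log(4!)
H_red_black  = 2 * math.log(math.factorial(26))   # red-black alternating: 2*log(26!)
H_any_order  = math.log(math.factorial(52))       # any order: log(52!), the maximum

print(H_ordered)                                  # 0.0
print(round(H_suit_order, 3))                     # 3.178
print(H_suit_order < H_red_black < H_any_order)   # True: more micro-states, more entropy
```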
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Thank you very much for your answer. I will go through the card deck example and hope I will understand it in detail.
Anyway, statements like Quote: Entropy is an attempt to give a macroscopic number that describes a large number of macroscopic states. make it hard to understand. What exactly do 'macroscopic' and 'macroscopic' mean....
|
|
|
|
|
The second 'macroscopic' should have been 'microscopic'. Corrected in the original.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Wait, wait. It looks more like I quoted it wrong?
This chaos theory leaves me in chaos.
|
|
|
|
|
Hi All,
Just a sign-off for the year from work to you all :santa: (well, he doesn't have an icon? He could well be on the naughty list?). Happy whatever, and a Happy New Year!!!
|
|
|
|
|
And Happy Christmas to you, enjoy the break, and maybe a or two.
|
|
|
|
|
🎅
What do you get when you cross a joke with a rhetorical question?
The metaphorical solid rear-end expulsions have impacted the metaphorical motorized bladed rotating air movement mechanism.
Do questions with multiple question marks annoy you???
|
|
|
|
|
glennPattonWork3 wrote: That point when 'do it next year' is reasonable
The 2nd of January?
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|