Re: [xml-dev] [Shannon: information ~ uncertainty] Ramifications to XML d

I hate to show my stupidity, but if the message regarding the condition of 
the prisoner [a constant] is irrelevant to the condition it is meant to 
express to the intended receiver, then the message leaves the receiver 
just as uncertain as no message at all.  The state of no 
information is the ultimate in uncertainty. 

Do you not mean that high uncertainty equates with low amounts of 
reliable information, because high amounts of reliable information are 
positively related to greater certainty? 

Hence, the reliability of information is a function of the consequences of 
its use, and uncertainty approaches certainty as the predictable 
consequence of relying on it becomes more certain. 

Furthermore, if I am trying to predict the weather, it does not matter 
which message is sent or how many messages are sent, since the message is 
related neither to the condition of the person whose condition is to be 
expressed nor to the weather conditions.

I know nothing of Shannon, but information that has no foundation (that 
is, random information unrelated to the circumstances of its source or use) 
is not information; it is, at its very best, data.  

Information is an object with many modifiers and much utility!

My two bits. 


On Mon, 11 Oct 2004, Roger L. Costello wrote:

> Hi Folks,
> 
> I am trying to get an understanding of Claude Shannon's work on information
> theory. Below I describe one small part of Shannon's work.  I would like to
> hear your thoughts on its ramifications to information exchange using XML. 
> 
> INFORMATION
> 
> Shannon defines information as follows: 
> 
>     Information is proportional to uncertainty.  High uncertainty equates
>     to a high amount of information.  Low uncertainty equates to a low
>     amount of information.
> 
>     More specifically, Shannon talks about a set of possible data.
>     A set comprised of 10 possible choices of data has less information than
>     a set comprised of 100 possible choices.
> 
> This may seem rather counterintuitive, but bear with me as I give an
> example. 
> 
> In a book I am reading[1] the author gives an example which provides a nice
> intuition of Shannon's statement that information is proportional to
> uncertainty.
> 
> EXAMPLE
> 
> Imagine that a man is in prison and wants to send a message to his wife.
> Suppose that the prison allows only one message to be sent: "I am fine".
> Even if the prisoner is deathly ill, all he can send is "I am fine".
> Clearly there is no information in this message.  
> 
> Here the set of possible messages is one.  There is no uncertainty and there
> is no information. 
> 
> Suppose that the prison allows one of two messages to be sent, "I am fine"
> or "I am ill".  If the prisoner sends one of these messages then some
> information will be passed to his wife.
> 
> Here the set of possible messages is two.  There is uncertainty (of which
> message will be sent).  When one of the two messages is selected by the
> prisoner and sent to his wife some information is
> passed.
> 
> Suppose that the prison allows one of four messages to be sent:
> 
> 1. I am healthy and happy
> 2. I am healthy but not happy
> 3. I am happy but not healthy
> 4. I am not happy and not healthy
> 
> If the person sends one of these messages then even more information will be
> passed.
> 
> Thus, the bigger the set of potential messages, the more uncertainty.
> The more uncertainty there is, the more information there is.
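> 
> (To make this concrete with Shannon's measure: for a set of N equally
> likely messages the information is H = log2(N) bits, so a 1-message set
> carries log2(1) = 0 bits, a 2-message set log2(2) = 1 bit, and a
> 4-message set log2(4) = 2 bits.)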
> 
> Interestingly, it doesn't matter what the messages are.  All that matters is
> the "number" of messages in the set.  Thus, there is the same amount of
> information in this set:
> 
>    {"I am fine", "I am ill"}
> 
> as there is in this set:
> 
>    {A, B}
> 
> SIDE NOTES
> 
> a. Part of Shannon's goal was to measure the "amount" of information.
>    In the example above where there are two possible messages the amount
>    of information is 1 bit.  In the example where there are four
>    possible messages the amount of information is 2 bits.  (A short
>    Python sketch of this calculation follows these notes.)
> 
> b. Shannon refers to uncertainty as "entropy".  Thus, the higher the
>    entropy (uncertainty) the higher the information.  The lower the
>    entropy the lower the information.
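> 
> A minimal sketch of that calculation in Python, assuming, as above, a
> set of equally likely messages (the helper name "bits" is mine, not
> Shannon's notation):
> 
>     import math
> 
>     def bits(message_set):
>         # Information, in bits, of a set of N equally likely
>         # messages: H = log2(N).
>         return math.log2(len(message_set))
> 
>     print(bits(["I am fine"]))              # 0.0 -- one message, no info
>     print(bits(["I am fine", "I am ill"]))  # 1.0
>     print(bits(["healthy and happy", "healthy, not happy",
>                 "happy, not healthy", "neither"]))  # 2.0
>     print(bits(["A", "B"]))                 # 1.0 -- same as {fine, ill}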
> 
> QUESTIONS
> 
> 1. How does this aspect (information ~ uncertainty) of Shannon's work relate
> to data exchange using XML?  (I realize that this is a very broad question.
> Its intent is to stimulate discussion on the application of Shannon's
> information/uncertainty ideas to XML data exchange)
> 
> 2. A schema is used to restrict the allowable forms that an instance
> document may take.  So doesn't a schema reduce information?  
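> 
> (A hypothetical illustration of this question in Python -- the value
> counts below are mine, not from any real schema.  If an unconstrained
> element could hold any of 256 equally likely values and a schema
> enumeration restricts it to 4, the per-element information in Shannon's
> terms drops from 8 bits to 2 bits:)
> 
>     import math
> 
>     # Hypothetical value spaces for a single element -- illustrative
>     # numbers only, not taken from a real schema.
>     unconstrained = 256  # any of 256 possible values
>     enumerated = 4       # schema restricts to a 4-value enumeration
> 
>     print(math.log2(unconstrained))  # 8.0 bits per element
>     print(math.log2(enumerated))     # 2.0 bits per element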
> 
> /Roger  
>  
> [1] W. Ross Ashby, An Introduction to Cybernetics
> 
> 