Talk:Emerging Media

From SI410
Revision as of 18:27, 11 November 2011 by Youngjos (Talk | contribs) (Varying definitions of Entropy)


Varying definitions of Entropy

Gleick discusses Claude Shannon and his contributions to information theory. Among those contributions was drawing an analogy between the physical definition of entropy and information. Yes, this is in direct contrast to the term we learned about from Floridi, but Gleick devotes an entire chapter, "Entropy and Its Demons," to this concept, quoting many information theorists and physicists who have contributed greatly to the material we read about in 410. In fact, as I understand this definition of entropy, it may provide greater leeway when talking about information quality in terms of entropy. For example, Gleick would argue that all information deserves to be treated equally, whether "harmless," "harmful," or otherwise. The library should have a copy of this book; I highly recommend checking it out. It will blow your mind! - JYoung
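For anyone curious how Shannon's entropy is actually computed, here is a minimal sketch of the standard formula H = -Σ p·log₂(p). This is an illustration of the concept Gleick describes, not code from the book; the function name is my own.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries 1 bit of entropy (maximum uncertainty for two outcomes);
# a certain outcome carries none.
fair_coin = shannon_entropy([0.5, 0.5])   # 1.0 bit
certainty = shannon_entropy([1.0])        # 0 bits
```

The point relevant to the discussion above: the measure depends only on the probabilities of messages, not on their meaning or quality — which is exactly why "harmless" and "harmful" information look the same to it.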