Talk:Emerging Media

From SI410
Revision as of 22:03, 11 December 2012 by Alexvis (Talk | contribs)


I want to edit this article, but I am hesitant because I simply do not know the direction the original author wanted it to take. This page seems almost unnecessary, as it appears merely to combine ideas and content from other pages (e.g., Floridi's view on entropy).

Varying definitions of Entropy

Gleick discusses Claude Shannon and his contributions to information theory. Among those contributions was comparing the physical (thermodynamic) definition of entropy to information. Yes, this is in direct contrast to the term we learned about from Floridi, but Gleick devotes an entire chapter, "Entropy and Its Demons," to the concept, quoting many information theorists and physicists who have contributed greatly to the material we read in 410. In fact, at least as I understand this definition of entropy, it may provide greater leeway when talking about information quality in terms of entropy. For example, Gleick would argue that all information deserves to be treated equally, whether "harmless," "harmful," or otherwise. The library should have a copy of this book; I highly recommend checking it out. It will blow your mind! - JYoung
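For anyone curious what Shannon's entropy measure actually computes, here is a minimal sketch in Python (the function name and example strings are my own, not from Gleick): it estimates the average information, in bits per symbol, of a message from its symbol frequencies.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Estimate Shannon entropy (bits per symbol) of a message
    from the observed frequency of each symbol."""
    counts = Counter(message)
    total = len(message)
    # H = -sum over symbols of p * log2(p)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Four equally likely symbols carry 2 bits per symbol.
print(shannon_entropy("abcd"))  # → 2.0
# A fully repetitive message is perfectly predictable: zero entropy.
print(shannon_entropy("aaaa"))
```

Note how the measure is indifferent to what the symbols mean, which is exactly the point above: by this definition, "harmful" and "harmless" messages of the same statistical structure carry the same amount of information.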