Relative Information: Theories and Applications

For four decades, information theory has been viewed almost exclusively as a theory based upon the Shannon measure of uncertainty and information, usually referred to as Shannon entropy. Since the publication of Shannon's seminal paper in 1948, the theory has grown extremely rapidly and has been applied with varied success in almost all areas of human endeavor. At this time, the Shannon information theory is a well-established and developed body of knowledge. Among its most significant recent contributions have been the use of the complementary principles of minimum and maximum entropy in dealing with a variety of fundamental systems problems such as predictive systems modelling, pattern recognition, image reconstruction, and the like.

Since its inception in 1948, the Shannon theory has been viewed as a restricted information theory. It has often been argued that the theory is capable of dealing only with syntactic aspects of information, but not with its semantic and pragmatic aspects. This restriction was considered a virtue by some experts and a vice by others. More recently, however, various arguments have been made that the theory can be appropriately modified to account for semantic aspects of information as well. Some of the most convincing arguments in this regard are included in Fred Dretske's Knowledge and the Flow of Information (The M.I.T. Press, Cambridge, Mass., 1981) and in this book by Guy Jumarie.