Godfrey-Smith & Sterelny (2016) define "Shannon's concept of information" in terms of correlational relationships such as mutual information. They say: "This sense of information is associated with Claude Shannon (1948) who showed how the concept of information could be used to quantify facts about contingency and correlation in a useful way, initially for use in communication technology."
Earlier they describe it as "the sense of information isolated by Claude Shannon and used in mathematical information theory".
Adriaans (2019) is more interested in surprisal and entropy than in mutual information.
Owren et al. (2010:759) describe "Shannon and Weaver's (1949) theory of information" and say "the associated concept of Shannon information refers strictly and solely to observable correlations between events in the world".
Dennett (2017:sec. 6.1) says that "Shannon's theory is, at its most fundamental, about the statistical relationship between different states of affairs in the world: What can be gleaned (in principle) about state A from the observation of state B?", later explicitly distinguishing "Shannon information" from semantic information.
Shea (2018:12): "Shannon (1948) developed a formal treatment of correlational information—as a theory of communication, rather than meaning—which forms the foundation of (mathematical) information theory", later invoking "Shannon information" to describe a correlational measure that could be mutual information (Shea, 2018, p. 78, n. 5).
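For reference, the standard quantities at issue (textbook definitions, not quoted from any of the texts above; notation assumed): the surprisal of an outcome $x$ is $-\log p(x)$, entropy is its expected value, and mutual information is the correlational measure:

$$
H(X) = -\sum_{x} p(x)\log p(x), \qquad I(X;Y) = \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X \mid Y).
$$

On these definitions, the contrast in the notes above is between authors who tie "Shannon information" to the correlational quantity $I(X;Y)$ (Godfrey-Smith & Sterelny, Owren et al., Dennett, Shea) and Adriaans, whose focus is on $-\log p(x)$ and $H(X)$.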
Check the texts and complete the paragraph