Article published in: Pragmatics & Cognition
Vol. 26:2/3 (2019), pp. 296–320
How linguistic meaning harmonizes with information through meaning conservation
Published online: 12 February 2021
https://doi.org/10.1075/pc.18018.mon
Abstract
This paper aims to characterize the relationship between information, as defined in the information-theoretic approach, and
linguistic meaning by formulating computations over the lexicon of a natural language. Information in its information-theoretic
sense is assumed not to be equivalent to linguistic meaning, whereas linguistic meaning has an intrinsic connection to information as far
as the form and structure of the lexicon of a language (in a non-lexicological sense) are concerned. We argue that these two apparently
conflicting aspects of the relationship between information and linguistic meaning can be unified by showing that information conserves
linguistic meaning only insofar as computations of a certain kind are defined on the symbolic elements of a lexicon. This has consequences
not merely for the nature of lexical learning – natural, artificial or otherwise – but also for the conservation of information in
axiomatic systems.
Keywords: lexicon, meaning, information, axiomatic systems, syntax
Article outline
- 1. Introductory background
- 2. Information, linguistic meaning and computation
- 3. Why the lexicon?
- 4. The principle of linguistic meaning conservation
- 5. Implications
- 6. Conclusion
- Acknowledgements
- Notes
- References
References (37)
Barwise, Jon & Jerry Seligman. 1997. Information flow: The logic of distributed systems. Cambridge: Cambridge University Press.
Benthem, Johan van & Maricarmen Martinez. 2008. The stories of logic and information. In Pieter Adriaans & Johan van Benthem (eds.), Philosophy of information, vol. 81, 217–280. Amsterdam: Elsevier.
Boland, Richard J. Jr. & Kalle Lyytinen. 2017. The limits to language in doing system design. European Journal of Information Systems 26(3). 248–259.
Bowerman, Melissa. 1988. The ‘no negative evidence’ problem: How do children avoid constructing an overly general grammar? In John Hawkins (ed.), Explaining language universals, 73–101. Oxford: Blackwell.
Carnap, Rudolf & Yehoshua Bar-Hillel. 1952. An outline of a theory of semantic information. Technical Report 247. Research Laboratory of Electronics, MIT.
Clark, Stephen. 2015. Vector space models of lexical meaning. In Shalom Lappin & Chris Fox (eds.), Handbook of contemporary semantic theory, 493–522. Oxford: Blackwell.
Dascal, Marcelo. 2003. Interpretation and understanding. Amsterdam: John Benjamins.
Erteschik-Shir, Nomi. 2007. Information structure: The syntax-discourse interface. New York: Oxford University Press.
Fillmore, Charles. 1976. Frame semantics and the nature of language. Annals of the New York Academy of Sciences: Conference on the Origin and Development of Language and Speech 280(1). 20–32.
Goldberg, Adele. 2006. Constructions at work: The nature of generalization in language. New York: Oxford University Press.
Goldberg, Adele. 2019. Explain me this: Creativity, competition, and the partial productivity of constructions. Princeton: Princeton University Press.
Gregory, Howard. 2015. Language and logics: An introduction to the logical foundations of language. Edinburgh: Edinburgh University Press.
Hampton, James A. 2017. Compositionality and concepts. In James A. Hampton & Yoad Winter (eds.), Language, cognition, and mind: Vol. 3. Compositionality and concepts in linguistics and psychology, 95–122. New York: Springer Nature.
Harris, Zellig. 1991. A theory of language and information: A mathematical approach. Oxford: Clarendon Press.
Jackendoff, Ray. 2002. Foundations of language: Brain, meaning, grammar, evolution. New York: Oxford University Press.
Kamp, Hans & Martin Stokhof. 2008. Information in natural language. In Pieter Adriaans & Johan van Benthem (eds.), Philosophy of information, vol. 81, 49–111. Amsterdam: Elsevier.
Lambrecht, Knud. 1996. Information structure and sentence form: A theory of topic, focus, and the mental representations of discourse referents. Cambridge: Cambridge University Press.
Lyytinen, Kalle. 1985. Implications of theories of language for information systems. MIS Quarterly 9(1). 61–74.
Menzel, Christopher. 1999. The objective conception of contexts and its logic. Minds and Machines 9(1). 29–56.
Mondal, Prakash. 2012. Can internalism and externalism be reconciled in a biological epistemology of language? Biosemiotics 5(1). 61–82.
Mondal, Prakash. 2018. Lexicon, meaning relations and semantic networks. The 2nd Workshop on Natural Language for Artificial Intelligence (NL4AI 2018) 21. 40–52.
Newport, Elissa L., Daphne Bavelier & Helen J. Neville. 2002. Critical thinking about critical periods: Perspectives on a critical period for language acquisition. In Emmanuel Dupoux (ed.), Language, brain and cognitive development, 481–502. Cambridge, Massachusetts: MIT Press.
Pelletier, Francis J. 2017. Compositionality and concepts – A perspective from formal semantics and philosophy of language. In James A. Hampton & Yoad Winter (eds.), Language, cognition, and mind, vol. 3: Compositionality and concepts in linguistics and psychology, 31–94. New York: Springer Nature.
Pollard, Carl & Ivan Sag. 1994. Head-driven phrase structure grammar. Chicago: University of Chicago Press.
Pullum, Geoffrey K. 2013. The central question in comparative syntactic metatheory. Mind and Language 28(4). 492–521.
Pustejovsky, James. 2012. Type theory and lexical decomposition. In James Pustejovsky, Pierrette Bouillon, Hitoshi Isahara, Kyoko Kanzaki & Chungmin Lee (eds.), Advances in generative lexicon theory, 9–38. Heidelberg: Springer.
Seuren, Pieter A. M. 2013. From Whorf to Montague: Explorations in the theory of language. New York: Oxford University Press.