
information theory



n. The theory of the probability of transmission of messages with specified accuracy when the bits of information constituting the messages are subject, with certain probabilities, to transmission failure, distortion, and accidental additions.

information theory

n (Mathematics) a collection of mathematical theories, based on statistics, concerned with methods of coding, transmitting, storing, retrieving, and decoding information

informa′tion the′ory


n. the mathematical theory concerned with the content, transmission, storage, and retrieval of information, usu. in the form of messages or data. [1945–50]
Thesaurus
Noun 1. information theory - (computer science) a statistical theory dealing with the limits and efficiency of information processing
  scientific theory - a theory that explains scientific observations; "scientific theories must be falsifiable"
  statistics - a branch of applied mathematics concerned with the collection and interpretation of quantitative data and the use of probability theory to estimate population parameters
  computer science, computing - the branch of engineering science that studies (with the aid of computers) computable processes and structures
Translations
German: Informationstheorie

information theory

or communication theory, mathematical theory formulated principally by the American scientist Claude E. Shannon (1916–2001) to explain aspects and problems of information and communication. While the theory is not specific in all respects, it proves the existence of optimum coding schemes without showing how to find them. For example, it succeeds remarkably in outlining the engineering requirements of communication systems and the limitations of such systems.

In information theory, the term information is used in a special sense; it is a measure of the freedom of choice with which a message is selected from the set of all possible messages. Information is thus distinct from meaning, since it is entirely possible for a string of nonsense words and a meaningful sentence to be equivalent with respect to information content.

Measurement of Information Content

Numerically, information is measured in bits (short for binary digit; see binary system). One bit is equivalent to the choice between two equally likely alternatives. For example, if we know that a coin is to be tossed but are unable to see it as it falls, a message telling whether the coin came up heads or tails gives us one bit of information. When there are several equally likely choices, the number of bits is equal to the logarithm, to the base two, of the number of choices. For example, if a message specifies one of sixteen equally likely choices, it is said to contain four bits of information. When the various choices are not equally probable, the situation is more complex.
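The rule just stated is easy to check in code. Below is a minimal Python sketch (the function name is illustrative) computing the information content of a choice among n equally likely alternatives:

```python
import math

def bits_for_choices(n: int) -> float:
    """Information, in bits, conveyed by selecting one of n equally likely options."""
    return math.log2(n)

print(bits_for_choices(2))   # a coin toss: 1.0 bit
print(bits_for_choices(16))  # one of sixteen choices: 4.0 bits
```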

Interestingly, the mathematical expression for information content closely resembles the expression for entropy in thermodynamics. The greater the information in a message, the lower its randomness, or "noisiness," and hence the smaller its entropy. Since the information content is, in general, associated with a source that generates messages, it is often called the entropy of the source. Often, because of constraints such as grammar, a source does not use its full range of choice. A source that uses just 70% of its freedom of choice would be said to have a relative entropy of 0.7. The redundancy of such a source is defined as 100% minus the relative entropy, or, in this case, 30%. The redundancy of English is estimated to be about 50%; i.e., about half of the elements used in writing or speaking are freely chosen, and the rest are required by the structure of the language.
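The usual formalization of the relative entropy described here is the ratio of the source's entropy to the maximum entropy possible for its alphabet (the base-two logarithm of the alphabet size), with redundancy equal to one minus that ratio. A short Python sketch under that assumption, using an illustrative distribution:

```python
import math

def entropy(probs):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An illustrative four-symbol source that does not use its full freedom of choice.
probs = [0.7, 0.1, 0.1, 0.1]
h_max = math.log2(len(probs))             # entropy if all symbols were equally likely
relative_entropy = entropy(probs) / h_max
print(round(relative_entropy, 2))         # ~0.68
print(round(1 - relative_entropy, 2))     # redundancy: ~0.32, i.e., about 32%
```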

Analysis of the Transfer of Messages through Channels

A message proceeds along a channel from the source to the receiver; information theory defines for any given channel a limiting capacity or rate at which it can carry information, expressed in bits per second. In general, it is necessary to process, or encode, information from a source before transmitting it through a given channel. For example, a human voice must be encoded before it can be transmitted by telephone. An important theorem of information theory states that if a source with a given entropy feeds information to a channel with a given capacity, and if the source entropy is less than the channel capacity, a code exists for which the frequency of errors may be reduced as low as desired. If the channel capacity is less than the source entropy, no such code exists.
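As an illustration of the theorem, the following Python sketch applies the standard capacity formula for a binary symmetric channel (a textbook channel model, not one discussed in this article); reliable coding is possible exactly when the source entropy falls below the channel capacity:

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy, in bits, of a binary choice with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(crossover: float) -> float:
    """Capacity, in bits per use, of a binary symmetric channel that
    flips each transmitted bit with the given probability."""
    return 1.0 - binary_entropy(crossover)

source_entropy = 0.5              # bits per symbol; an assumed figure
capacity = bsc_capacity(0.1)      # ~0.531 bits per channel use
print(source_entropy < capacity)  # True: arbitrarily reliable codes exist
```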

The theory further shows that noise, or random disturbance of the channel, creates uncertainty as to the correspondence between the received signal and the transmitted signal. The average uncertainty in the message when the signal is known is called the equivocation. It is shown that the net effect of noise is to reduce the information capacity of the channel. However, redundancy in a message, as distinguished from redundancy in a source, makes it more likely that the message can be reconstructed at the receiver without error. For example, if something is already known as a certainty, then all messages about it give no information and are 100% redundant, and the information is thus immune to any disturbances of the channel. Using various mathematical means, Shannon was able to define channel capacity for continuous signals, such as music and speech.
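The equivocation can be computed from the joint distribution of channel input and output using the identity H(X | Y) = H(X, Y) − H(Y); the information actually carried is the source entropy minus the equivocation. A small Python sketch for the same illustrative binary symmetric channel:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_err = 0.1  # assumed crossover probability; inputs 0 and 1 equally likely
joint = {(0, 0): 0.5 * (1 - p_err), (0, 1): 0.5 * p_err,
         (1, 0): 0.5 * p_err,       (1, 1): 0.5 * (1 - p_err)}

h_joint = entropy(joint.values())                      # H(X, Y)
h_output = entropy([sum(v for (x, y), v in joint.items() if y == b)
                    for b in (0, 1)])                  # H(Y)
equivocation = h_joint - h_output                      # H(X | Y)
print(round(equivocation, 3))                          # ~0.469 bits
print(round(1.0 - equivocation, 3))                    # information carried: ~0.531 bits
```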

Bibliography

See C. E. Shannon and W. Weaver, The Mathematical Theory of Communication (1949); M. Mansuripur, Introduction to Information Theory (1987); J. Gleick, The Information: A History, a Theory, a Flood (2011).

Information Theory

 

the mathematical discipline that studies the processes of storage, transformation, and transmission of information. Information theory is an essential part of cybernetics.

At the basis of information theory lies a definite method for measuring the quantity of information contained in given data (“messages”). Information theory proceeds from the idea that the messages designated for retention in a storage device or for transmission over a communication channel are not known in advance with complete certainty. Only the set from which these messages may be selected is known in advance and, at best, how frequently certain of these messages are selected (that is, the probability of the messages). In information theory it is shown that the “uncertainty” encountered in such circumstances admits of a quantitative expression and that precisely this expression (and not the specific nature of the messages themselves) determines the possibility of their storage and transmission.

As such a “measure of uncertainty” in information theory one uses the number of binary digits (bits) necessary to record an arbitrary message from a given source. More precisely, one looks at all possible methods for representing the messages by sequences of the symbols 0 and 1 (binary codes) that satisfy two conditions: (a) different sequences correspond to different messages, and (b) upon the transcription of a certain sequence of messages into coded form, this sequence must be unambiguously recoverable. Then as the measure of uncertainty one takes the average length of the coded sequence that corresponds to the most economical method of encoding; one binary digit serves as the unit of measurement.

For example, let certain messages x1, x2, and x3 appear with probabilities of ½, ⅜, and ⅛, respectively. Any code that is too short, such as

x1 = 0, x2 = 1, x3 = 01

is unsuitable, since it violates condition (b): the sequence 01 can denote either the pair x1x2 or the single message x3. The code

x1 = 0, x2 = 10, x3 = 11

satisfies conditions (a) and (b); to it corresponds an average length of coded sequence equal to

1 · ½ + 2 · ⅜ + 2 · ⅛ = 1.5

It is not hard to see that no other code can give a smaller value; that is, the code indicated is the most economical. In accordance with our choice of a measure of uncertainty, the uncertainty of the given information source should be taken equal to 1.5 binary units.
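The example can be checked mechanically. The Python sketch below (message names as in the text) verifies that no codeword is a prefix of another, which guarantees condition (b), and recomputes the average length:

```python
code = {"x1": "0", "x2": "10", "x3": "11"}
probs = {"x1": 1/2, "x2": 3/8, "x3": 1/8}

# Prefix-freeness: no codeword begins another, so any concatenation of
# codewords can be split back into messages in only one way.
words = list(code.values())
prefix_free = not any(a != b and b.startswith(a) for a in words for b in words)
print(prefix_free)  # True

avg_length = sum(probs[m] * len(code[m]) for m in code)
print(avg_length)   # 1.5 binary digits per message
```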

Here it is appropriate to note that “message,” “communication channel,” and other terms are understood very broadly in information theory. Thus, from the viewpoint of information theory, an information source is described by enumerating the set x1, x2, … of possible messages (which can be the words of some language, results of measurements, or television pictures) and their respective probabilities p1, p2, ….

There is no simple formula expressing the exact minimum H′ of the average number of bits necessary for encoding the messages x1, x2, …, xn in terms of the probabilities p1, p2, …, pn of these messages. However, the specified minimum is not less than the value

H = −(p1 log2 p1 + p2 log2 p2 + … + pn log2 pn)

(where log2 a denotes the logarithm of the quantity a to base 2) and may not exceed it by more than one unit. The quantity H (the entropy of the set of messages) possesses simple formal properties, and for all conclusions of information theory that are of an asymptotic character, corresponding to the case H′ → ∞, the difference between H and H′ is absolutely immaterial. Accordingly, the entropy is taken as the measure of the uncertainty of the messages from a given source. In the example above, the entropy is equal to

H = ½ log2 2 + ⅜ log2 (8/3) + ⅛ log2 8 ≈ 1.406.
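In Python, the entropy of the example source works out as follows (a sketch; the probabilities are those given in the text):

```python
import math

probs = [1/2, 3/8, 1/8]
H = sum(p * math.log2(1 / p) for p in probs)
print(round(H, 3))  # ~1.406: below the best achievable average length of 1.5,
                    # and within one binary digit of it, as the theory asserts
```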
From the viewpoint stated, the entropy of an infinite aggregate, as a rule, turns out to be infinite. Therefore, when applied to an infinite collection it is treated differently: a certain precision level is assigned, and the concept of ε-entropy is introduced as the entropy of the information recorded with a precision of ε, if the message is a continuous quantity or function (for example, of time).

Just as with the concept of entropy, the concept of the amount of information contained in a certain random object (random quantity, random vector, or random function) relative to another is introduced at first for objects with a finite number of possible values. Then the general case is studied with the help of a limiting process. In contrast to entropy, the amount of information, for example, in a certain continuously distributed random variable relative to another continuously distributed variable, very often turns out to be finite.

The concept of a communication channel is of an extremely general nature in information theory. In essence, a communication channel is given by specifying a set of “admissible messages” at the “channel input,” a set of “output messages,” and a collection of conditional probabilities for receiving one or another message at the output for a given input message. These conditional probabilities describe the effect of “noise” distorting the transmitted information. “Connecting” any information source to the channel, one may calculate the amount of information contained in the messages at the output relative to those at the input. The supremum of these amounts of information, taken over all admissible sources, is termed the capacity of the channel. The capacity of a channel is its fundamental information characteristic. Regardless of the effect (possibly strong) of noise in the channel, if the entropy of the incoming information is less than the channel capacity, then almost error-free transmission is possible with correct coding.
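As a sketch of this definition, the snippet below computes the amount of information (mutual information) between channel input and output for a small discrete channel, then approximates the capacity by scanning over input distributions. The channel matrix is an assumed illustration, not one taken from the article:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(p_x, channel):
    """I(X; Y) = H(Y) - H(Y | X) for a channel given as rows of
    conditional probabilities channel[x][y]."""
    p_y = [sum(p_x[x] * channel[x][y] for x in range(len(p_x)))
           for y in range(len(channel[0]))]
    h_y_given_x = sum(p_x[x] * entropy(channel[x]) for x in range(len(p_x)))
    return entropy(p_y) - h_y_given_x

channel = [[0.9, 0.1],   # P(output | input = 0)
           [0.1, 0.9]]   # P(output | input = 1)

# Capacity is the supremum of I(X; Y) over admissible sources; a crude grid search.
capacity = max(mutual_information([p, 1 - p], channel)
               for p in (i / 1000 for i in range(1, 1000)))
print(round(capacity, 3))  # ~0.531 bits per channel use
```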

Information theory searches for methods for transmitting information that are optimal with respect to speed and reliability, having established theoretical limits to the quality attainable. Clearly, information theory is of an essentially statistical character; therefore, a significant portion of its mathematical methods is derived from probability theory.

The foundations of information theory were laid in 1948–49 by the American scientist C. Shannon. Among Soviet scientists, A. N. Kolmogorov and A. Ia. Khinchin contributed to its theoretical branches, and V. A. Kotel'nikov, A. A. Kharkevich, and others to the branches concerning applications.

REFERENCES

Iaglom, A. M., and I. M. Iaglom. Veroiatnost’ i informatsiia, 2nd ed. Moscow, 1960.
Shannon, C. “Statisticheskaia teoriia peredachi elektricheskikh signalov.” In Teoriia peredachi elektricheskikh signalov pri nalichii pomekh: Sb. perevodov. Moscow, 1953.
Goldman, S. Teoriia informatsii. Moscow, 1957. (Translated from English.)
Teoriia informatsii i ee prilozheniia: Sb. perevodov. Moscow, 1959.
Khinchin, A. Ia. “Poniatie entropii v teorii veroiatnostei.” Uspekhi matematicheskikh nauk, 1953, vol. 8, issue 3.
Kolmogorov, A. N. Teoriia peredachi informatsii. Moscow, 1956. (Academy of Sciences of the USSR. Session on the scientific problems of the automation of production. Plenary session.)
Peterson, W. W. Kody, ispravliaiushchie oshibki. Moscow, 1964. (Translated from English.)

IU. V. PROKHOROV

information theory

[‚in·fər′mā·shən ‚thē·ə·rē]
(communications) A branch of theory devoted to problems in communications; it provides criteria for comparing different communications systems on the basis of signaling rate, using a numerical measure of the amount of information gained when the content of a message is learned.
(mathematics) The branch of probability theory concerned with the likelihood of the transmission of messages, accurate to within specified limits, when the bits of information composing the message are subject to possible distortion.

Information theory

A branch of communication theory devoted to problems in coding. A unique feature of information theory is its use of a numerical measure of the amount of information gained when the contents of a message are learned. Information theory relies heavily on the mathematical science of probability. For this reason the term information theory is often applied loosely to other probabilistic studies in communication theory, such as signal detection, random noise, and prediction. See Electrical communications

In designing a one-way communication system from the standpoint of information theory, three parts are considered beyond the control of the system designer: (1) the source, which generates messages at the transmitting end of the system, (2) the destination, which ultimately receives the messages, and (3) the channel, consisting of a transmission medium or device for conveying signals from the source to the destination. The source does not usually produce messages in a form acceptable as input by the channel. The transmitting end of the system contains another device, called an encoder, which prepares the source's messages for input to the channel. Similarly the receiving end of the system will contain a decoder to convert the output of the channel into a form that is recognizable by the destination. The encoder and the decoder are the parts to be designed. In radio systems this design is essentially the choice of a modulator and a detector.

A source is called discrete if its messages are sequences of elements (letters) taken from an enumerable set of possibilities (alphabet). Thus sources producing integer data or written English are discrete. Sources which are not discrete are called continuous, for example, speech and music sources. The treatment of continuous cases is sometimes simplified by noting that a signal of finite bandwidth can be encoded into a discrete sequence of numbers.

The output of a channel need not agree with its input. For example, a channel might, for secrecy purposes, contain a cryptographic device to scramble the message. Still, if the output of the channel can be computed knowing just the input message, then the channel is called noiseless. If, however, random agents make the output unpredictable even when the input is known, then the channel is called noisy. See Communications scrambling, Cryptography

Many encoders first break the message into a sequence of elementary blocks; next they substitute for each block a representative code, or signal, suitable for input to the channel. Such encoders are called block encoders. For example, telegraph and teletype systems both use block encoders in which the blocks are individual letters. Entire words form the blocks of some commercial cablegram systems. It is generally impossible for a decoder to reconstruct with certainty a message received via a noisy channel. Suitable encoding, however, may make the noise tolerable.

Even when the channel is noiseless, a variety of encoding schemes exists and there is a problem of picking a good one. Of all encodings of English letters into dots and dashes, the Continental Morse encoding is nearly the fastest possible one. It achieves its speed by associating short codes with the most common letters. A noiseless binary channel (capable of transmitting two kinds of pulse, 0 and 1, of the same duration) provides the following example. Suppose one had to encode English text for this channel. A simple encoding might just use 27 different five-digit codes to represent word space (denoted by #), A, B, . . . , Z; say # 00000, A 00001, B 00010, C 00011, . . . , Z 11010. The word #CAB would then be encoded into 00000000110000100010. A similar encoding is used in teletype transmission; however, it places a third kind of pulse at the beginning of each code to help the decoder stay in synchronism with the encoder.
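A sketch of that five-digit block code in Python (table and function names are illustrative):

```python
# Word space '#' followed by A..Z, numbered 0 through 26.
alphabet = "#" + "".join(chr(ord("A") + i) for i in range(26))
encode_table = {ch: format(i, "05b") for i, ch in enumerate(alphabet)}
decode_table = {bits: ch for ch, bits in encode_table.items()}

def encode(text: str) -> str:
    return "".join(encode_table[ch] for ch in text)

def decode(bits: str) -> str:
    # Blocks have a fixed length, so the decoder reads five digits at a time.
    return "".join(decode_table[bits[i:i + 5]] for i in range(0, len(bits), 5))

coded = encode("#CAB")
print(coded)          # 00000000110000100010
print(decode(coded))  # #CAB
```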


information theory

The study of encoding and transmitting information. From Claude Shannon's 1948 paper, "A Mathematical Theory of Communication," which proposed the use of binary digits for coding information. Shannon said that all information has a "source rate" that can be measured in bits per second and requires a transmission channel with a capacity equal to or greater than the source rate.


information theory


a mathematical theory dealing with messages or signals, the distortion produced by statistical noise, and methods of coding that reduce distortion to the irreducible minimum.

in·for·ma·tion the·o·ry

in the behavioral sciences, a system for studying the communication process through the detailed analysis, often mathematical, of all aspects of the process including the encoding, transmission, and decoding of signals; not concerned in any direct sense with the meaning of a message.


information theory

the study of the measurement and properties of codes and messages.

Information theory


A mathematical theory, first studied by Claude E. Shannon, that provides a framework for measuring, among other things, the amount of coding needed to communicate information effectively.
