Difference Between Information Theory And Information Science
Information Theory, a branch of the statistical theory of communication. Originated in 1948 by Claude Shannon of the Bell Laboratories, information theory introduced a new quantitative way of measuring the information content of messages and of devising the most efficient codes for transmitting them. Although a part of applied communications science, it has acquired the unique distinction of opening a new path of research in pure mathematics.
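Shannon's quantitative measure of information content is the entropy of a message source, defined in bits as H = −Σ p·log₂(p) over the probabilities of the possible symbols. The following sketch (function names are illustrative, not from any particular library) shows the idea: a fair coin toss carries exactly one bit, while a biased source carries less.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over symbol probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin is more predictable, so each toss conveys less.
print(entropy([0.9, 0.1]))   # ~0.469
```

The drop from 1.0 to about 0.47 bits is what makes efficient coding possible: predictable sources can be compressed toward their entropy.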
Information theory has a wide range of applications, from pure and applied mathematics to communication theory, cybernetics, computer science, translating machines, genetics, psychology, and even medical diagnosis. In psychology, for example, many studies have been carried out concerning the maximum rate of information that a human being can absorb. The main applications have been in the communications sciences—in particular, in the intelligent design of communications systems, including the choice of such efficient codes as will give nearly errorless transmission of signals at a rate approaching the channel capacity. This work is vast in magnitude and importance and will continue over many decades.
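The channel capacity mentioned above can be made concrete with the standard textbook case of the binary symmetric channel, where each transmitted bit is flipped with probability p. Its capacity is C = 1 − H(p) bits per use, where H(p) is the binary entropy function. This is a minimal illustrative sketch, not production code:

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))    # 1.0  (noiseless channel: 1 bit per use)
print(bsc_capacity(0.5))    # 0.0  (pure noise: nothing gets through)
print(bsc_capacity(0.11))   # ~0.5 (half a bit per use survives the noise)
```

Shannon's noisy-channel coding theorem guarantees that codes exist achieving nearly errorless transmission at any rate below this capacity, which is precisely the design target described in the paragraph above.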
Information science is the study of the ways in which organisms process information. Information science integrates parts of other disciplines, such as biology, computer science, physics, librarianship, sociology, and psychology.
Uses of Information Science
In the theoretical sense, information science tries to increase understanding of the ways in which information is generated, stored, made available, and used. In the practical sense, it undertakes specific actions to try to improve these same functions. The information scientist may compare alternative means of making information available, as by indexing (see index), or devise tools and methods for improving the transfer of information.
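Indexing, one of the practical tools mentioned above, can be sketched with a simple inverted index: a mapping from each term to the documents that contain it, which is the core structure behind library catalogs and search engines. The function name and sample documents here are hypothetical, chosen only for illustration:

```python
def build_index(docs):
    """Build an inverted index: each word maps to the set of document ids containing it."""
    index = {}
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(doc_id)
    return index

# Two tiny illustrative documents.
docs = {
    1: "information theory measures the content of messages",
    2: "information science studies how information is used",
}
idx = build_index(docs)
print(idx["information"])  # {1, 2}
print(idx["theory"])       # {1}
```

Looking up a term costs a single dictionary access regardless of collection size, which is why indexing so dramatically improves the availability of stored information.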