MODULE 1: Approaches

The Information Processing Model

The Information Processing Model is a framework used by cognitive psychologists to explain
and describe mental processes. The model likens the thinking process to how a computer works.

Just like a computer, the human mind takes in information, organizes and stores it to be retrieved
at a later time. Just as the computer has an input device, a processing unit, a storage unit, and an
output device, so does the human mind have equivalent structures.

In a computer, information is entered by means of input devices like a keyboard or scanner. In
the human mind, the input device is called the Sensory Register, composed of sensory organs
like the eyes and the ears through which we receive information about our surroundings.
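
To make the computer analogy concrete, the minimal Python sketch below maps the four computer-like
structures onto a toy class. All class and method names are hypothetical, invented for this example
rather than taken from the model itself.

# A toy sketch of the mind-as-computer analogy described above.
# All names here are illustrative assumptions, not a standard model or API.

class InformationProcessor:
    def __init__(self):
        self.storage = {}  # storage unit: analogous to long-term memory

    def sensory_register(self, stimulus: str) -> str:
        """Input device: receive raw information from the surroundings."""
        return stimulus.strip()

    def process_and_store(self, label: str, stimulus: str) -> None:
        """Processing unit: organize the input and store it for later retrieval."""
        self.storage[label] = self.sensory_register(stimulus).lower()

    def recall(self, label: str) -> str:
        """Output: retrieve previously stored information."""
        return self.storage.get(label, "nothing stored under that label")

mind = InformationProcessor()
mind.process_and_store("greeting", "  HELLO there ")
print(mind.recall("greeting"))  # -> "hello there"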

Shannon and Weaver’s Communication Model

Most closely associated with the work of the American electrical engineer Claude Shannon in the
mid-20th century, information theory is chiefly of interest to communication engineers, though
some of the concepts have been adopted and used in such fields as psychology and linguistics.

The first component of the model, the message source, is simply the entity that originally creates the
message. Often the message source is a human, but in Shannon’s model it could also be an animal, a
computer, or some other inanimate object. The encoder is the object that connects the message to the
actual physical signals that are being sent. For example, there are several ways to apply this model to two
people having a telephone conversation.

On one level, the actual speech produced by one person can be considered the message, and the telephone
mouthpiece and its associated electronics can be considered the encoder, which converts the speech into
electrical signals that travel along the telephone network. Alternatively, one can consider the speaker’s
mind as the message source and the combination of the speaker’s brain, vocal system, and telephone
mouthpiece as the encoder.

The channel is the medium that carries the message. The channel might be wires, the air or space in the
case of radio and television transmissions, or fibre-optic cable. In the case of a signal produced simply by
banging on the plumbing, the channel might be the pipe that receives the blow.

Noise is anything that interferes with the transmission of a signal. In telephone conversations, interference
might be caused by static in the line, cross talk from another line, or background sounds. Signals
transmitted optically through the air might suffer interference from clouds or excessive humidity. Clearly,
sources of noise depend upon the particular communication system. A single system may have several
sources of noise, but, if all of these separate sources are understood, it will sometimes be possible to treat
them as a single source.

The decoder is the object that converts the signal, as received, into a form that the message receiver can
comprehend. In the case of the telephone, the decoder could be the earpiece and its electronic circuits.
Depending upon perspective, the decoder could also include the listener’s entire hearing system.

The message receiver is the object that gets the message. It could be a person, an animal, a computer,
or some other inanimate object.
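
To make the chain of components concrete, here is a minimal Python sketch of the source-to-receiver
pipeline described above. The 8-bit character encoding and the random bit-flip noise model are
illustrative assumptions made for this example, not details taken from Shannon's model.

import random

def encode(message: str) -> list[int]:
    """Encoder: turn the message into signal stand-ins (here, bits)."""
    return [int(b) for ch in message for b in format(ord(ch), "08b")]

def channel(bits: list[int], noise_prob: float = 0.0) -> list[int]:
    """Channel with a noise source: each bit is flipped with probability noise_prob."""
    return [b ^ 1 if random.random() < noise_prob else b for b in bits]

def decode(bits: list[int]) -> str:
    """Decoder: convert received bits back into a form the receiver can read."""
    bytes_ = (bits[i:i + 8] for i in range(0, len(bits), 8))
    return "".join(chr(int("".join(map(str, byte)), 2)) for byte in bytes_)

# Message source -> encoder -> channel (+ noise) -> decoder -> message receiver
sent = "hello"
received = decode(channel(encode(sent), noise_prob=0.02))
print(sent, "->", received)  # with noise, some characters may arrive corrupted

With noise_prob set to 0 the chain is the noiseless, linear case; raising it shows why dealing with
noise becomes the central engineering problem.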

Shannon’s theory deals primarily with the encoder, channel, noise source, and decoder. As noted above,
the focus of the theory is on signals and how they can be transmitted accurately and efficiently; questions
of meaning are avoided as much as possible. The seventh element, feedback, is missing from this
version; it was added later by Warren Weaver in a revised version of the theory.

As long as feedback was missing, the model was linear in nature. With the inclusion of feedback,
the model became cyclical in nature.

Shannon-Weaver model of communication

Shannon, an American mathematician and electronic engineer, and Weaver, an American scientist, together
wrote an article in the Bell System Technical Journal called “A Mathematical Theory of Communication”,
which is also known as the “Shannon-Weaver model of communication”. The basic concepts are the same as
those discussed above.

A Mathematical Theory of Communication

In 1948, Shannon published “A Mathematical Theory of Communication” in the Bell System Technical
Journal. It established the basic results of information theory in such a complete form that his framework
and terminology are still used. (The paper appears to contain the first published use of the term bit to
designate a single binary digit.)

An important step taken by Shannon was to separate the technical problem of delivering a message from
the problem of understanding what a message means.

A key step in Shannon’s work was his realization that, in order to have a theory, communication signals
must be treated in isolation from the meaning of the messages that they transmit. This view is in sharp
contrast with the common conception of information, in which meaning has an essential role.

Shannon also realized that the amount of knowledge conveyed by a signal is not directly related to the
size of the message.

Please read this example to understand this concept better:

A famous illustration of this distinction is the correspondence between French novelist Victor Hugo and
his publisher following the publication of Les Misérables in 1862. Hugo sent his publisher a card with
just the symbol “?”. In return he received a card with just the symbol “!”. Within the context of Hugo’s
relations with his publisher and the public, these short messages were loaded with meaning; lacking such
a context, these messages are meaningless. Similarly, a long, complete message in perfect French would
convey little useful knowledge to someone who could understand only English.

This step permitted engineers to focus on the message delivery system. Shannon concentrated on two key
questions in his 1948 paper:

1. Determining the most efficient encoding of a message using a given alphabet in a noiseless
environment, and understanding what additional steps need to be taken in the presence of noise
(a sketch of one such encoding follows this list).
2. He realized that he had to concentrate on the problems associated with sending and receiving
messages and leave the semantic problems (the intrinsic meaning of the message) to later
investigators. So you can see that the technical problem needed to be addressed before the
semantic one, because if the message is not transmitted correctly, the semantic problem is
unlikely ever to be solved satisfactorily. Solving the technical problem was therefore the first
step in developing a reliable communication system.
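
As a concrete, purely illustrative answer to the first question (efficient encoding in a noiseless
setting), the Python sketch below builds a Huffman code, which assigns shorter codewords to more
frequent symbols. Huffman's algorithm dates from 1952, so this illustrates the idea rather than
Shannon's own 1948 construction; the sample message is made up for the example.

import heapq
from collections import Counter

def huffman_code(message: str) -> dict[str, str]:
    """Build a prefix code that gives shorter codewords to more frequent symbols."""
    freq = Counter(message)
    if len(freq) == 1:  # degenerate case: a one-symbol alphabet still needs one bit
        return {next(iter(freq)): "0"}
    # Heap entries: (total frequency, tie-breaker, {symbol: codeword so far})
    heap = [(n, i, {sym: ""}) for i, (sym, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        n1, _, left = heapq.heappop(heap)
        n2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (n1 + n2, tie, merged))
        tie += 1
    return heap[0][2]

msg = "this message is encoded efficiently"
code = huffman_code(msg)
encoded = "".join(code[ch] for ch in msg)
print(f"{8 * len(msg)} bits as plain 8-bit text vs {len(encoded)} bits Huffman-coded")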

Communication Signals and Formal Signal Processing

Communication and signal processing are the science behind our connected digital lives. A
signal is an electrical or electromagnetic current that is used for carrying data from one device or
network to another. It is the key component behind virtually all:

- Communication

- Computing

- Networking

- Electronic devices

A signal can be either analog or digital. An analog signal transmits data in the form of a continuous
wave, whereas a digital signal transmits data in binary form, i.e. as bits. The best example of an
analog signal is the human voice, and the best example of a digital signal is the transmission of data
in a computer. Each signal carries data in some form. The data is fed into the signal using analog or
digital modulation techniques, depending upon the source and destination devices and/or the medium.
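
The following minimal Python sketch shows the analog-to-digital idea: an arbitrary sine wave stands in
for an analog source, and the sample rate and bit depth are made-up values chosen only for readability.

import math

SAMPLE_RATE = 8   # samples per second (deliberately tiny, for readability)
BIT_DEPTH = 4     # bits per sample -> 2**4 = 16 quantization levels

def analog_signal(t: float) -> float:
    """A stand-in for an analog source such as a voice waveform: a 1 Hz sine wave."""
    return math.sin(2 * math.pi * t)

def digitize(duration_s: float) -> list[str]:
    """Sample the analog signal and quantize each sample into a fixed-width binary word."""
    levels = 2 ** BIT_DEPTH
    words = []
    for n in range(int(duration_s * SAMPLE_RATE)):
        value = analog_signal(n / SAMPLE_RATE)                   # sample, in [-1, 1]
        level = min(levels - 1, int((value + 1) / 2 * levels))   # quantize to a level
        words.append(format(level, f"0{BIT_DEPTH}b"))            # encode the level as bits
    return words

print(digitize(1.0))
# -> ['1000', '1101', '1111', '1101', '1000', '0010', '0000', '0010']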

Semiotics and the Semiotic Approach

Semiotics is the study of signs, sign systems, and sign processes. Sign processes can be non-
communicative and communicative. It is the study of how human beings and other organisms
derive meaning from the world around them.

There are 3 types of signs:

I. ICON: Directly represents the object. For example, a painting of a pipe is an icon representing a
pipe.

II. INDEX: Has an implied association with the object; the sign and the object are connected in a
logical way. For example, black sunglasses and a walking stick can indicate blindness.

III. SYMBOL: A symbol is not inherently connected with the object; instead it is a convention of a
particular society and must be explicitly taught. For example, the Star of David in a synagogue is a
symbol of Judaism.

What is the semiotic approach to communication?

In the semiotic approach the message is a construction of signs which, by interacting with the
receiver, generates meaning. The focus is not so much on communication as a process, but rather
on communication as a generator of meaning. The sender (the transmitter of the message) loses
importance. The focus is directed towards the "text" and the way it is "read". "Reading" is
the process of discovering the meaning that emerges when the "reader" interacts, or negotiates,
with the "text". The negotiation takes place when the "reader" filters the message through
his or her cultural patterns, in terms of the signs and codes that make up the message.

Meaning generation is the essence of the semiotic approach and is of paramount importance for
communication, as opposed to the signal-processing approach, which is concerned primarily with
the efficient transmission of messages and not so much with their meaning. The semiotic approach
is an active process, and semioticians use verbs such as to create, to generate, or to negotiate.
Following the "negotiation" process, meaning emerges, that is, the meaning of the message; in
fact, it is the message itself that emerges, as there is no message without meaning (in the same
way that a sign without meaning does not function as a sign). To take an example from the Public
Relations profession, engineering effective and efficient communication requires a research phase
in order to get to know the cultural background of the target audience.

Information, Communication and Significance

Information theory studies the transmission, processing, extraction, and utilization of
information. Information is the source of a communication system, whether it is analog or
digital. Information theory is a mathematical approach to the study of coding of information
along with the quantification, storage, and communication of information. Shannon’s
information theory was tremendously important as an intellectual endeavor. Shannon explained
the features that all communications systems have in common. He invented the bit, allowing us
to quantify the amount of information in a message. Shannon discovered that all messages—no
matter the sender, the recipient, the length, the meaning—were all essentially reducible to the
same thing: bits. He showed engineers how they could compress and encode those bits to
transmit information with flawless accuracy.
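
As a rough numeric illustration of quantifying information in bits, the Python sketch below computes
Shannon's entropy H = -Σ p·log2(p) for a message. Treating the message as a sequence of independent
symbols with their observed frequencies is an assumption made here purely for the example.

import math
from collections import Counter

def entropy_bits_per_symbol(message: str) -> float:
    """Shannon entropy: the average number of bits per symbol, which depends only
    on the symbol probabilities, not on what the message means."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

msg = "information theory quantifies messages in bits"
h = entropy_bits_per_symbol(msg)
print(f"~{h:.2f} bits per symbol, ~{h * len(msg):.0f} bits for the whole message")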
