Assignment#1 Manahan (BSIS-1B)
An Introduction to IT
In a modern context, the term ‘IT’ is commonly used to describe computers and
networks within a business environment. It refers to their applications in generating,
manipulating, storing, retrieving, transmitting, handling, exchanging, studying and
securing data or information in an electronic format. IT is also used as an umbrella
term covering television, telecommunications equipment, software, e-commerce and the
internet.
When thinking about IT, you need to consider IT support in both your personal and
professional life, especially given the increasingly sophisticated cybercrime we see
every day. Good IT support helps keep your personal and business data safe when you
are surfing the web or receiving email. It also covers the technical problems you may
come across, ensuring you are using the most up-to-date software and the best tools
available to complete tasks effectively.
Humanity has been manipulating, storing, and communicating information since the
early Sumerians pioneered the written word in ancient Mesopotamia, circa 3000 BC.
The term ‘IT’ did not appear, however, until the mid-20th century, when an influx of
early office technology arrived. It was first published in the 1958 Harvard Business
Review, where authors Harold J. Leavitt and Thomas C. Whisler wrote: “the new
technology does not yet have a single established name. We shall call it Information
Technology.”
The first mechanical computer device was conceptualized and invented by English
mechanical engineer and polymath Charles Babbage in the early 19th century. Called
the ‘Difference Engine,’ it was originally created to aid in navigation calculations. Often
referred to as the ‘Father of the Computer’, Babbage came up with the more general
‘Analytical Engine’ in 1833, which could be used in fields other than navigation. Funding
constraints meant that Babbage died without seeing his machine completed; however,
his son Henry completed a much simpler version of the machine in 1888, which was
successfully demonstrated to the public in 1906.
Electromechanical computers were not developed until the mid-1900s, when a more
compact analogue electromechanical computer that used trigonometry was installed on
submarines to solve the problem of firing torpedoes at moving targets.
The Z2, an early electromechanical digital computer invented by engineer Konrad Zuse
in 1939, used electric relays to perform its calculations. Devices like the Z2 had very
low operating speeds and were eventually succeeded by faster machines such as
Zuse's Z3 of 1941, the first fully automatic, program-controlled computer.
Colossus, a set of computers created between 1943 and 1945, is widely recognised as
comprising the world’s first programmable electronic digital computers. Used during
World War II, Colossus helped intercept and decipher encrypted German teleprinter
communications produced by the Lorenz cipher machine. English computer scientist,
mathematician, and theoretical biologist Alan Turing had already conceptualized the
modern computer in his seminal 1936 paper ‘On Computable Numbers’, in which
programmable instructions are stored in the memory of a machine.
Another early programmable computer was the Manchester Mark 1 developed by the
Victoria University of Manchester. Frederic C. Williams, Tom Kilburn, and Geoff Tootill
began working on the machine in August 1948, but the first operational version of the
computer was not available for use until 1949. The Manchester Mark 1 caused
controversy when British media outlets referred to it as an electronic brain, which
provoked a long-running debate with the Department of Neurosurgery at the University
of Manchester over whether an electronic computer could ever be truly creative.
It was not until 1951, when electrical engineering company Ferranti created the Ferranti
Mark 1, that the world’s first general-purpose computer became commercially available.
Also called the Manchester Electronic Computer, the Ferranti Mark 1 was first used by
the Victoria University of Manchester.
The first computer used for commercial business applications was LEO I, developed in
1951 by the British catering company J. Lyons and Co. to increase business output.
A brief timeline of some other important events is listed below:
● 1835 – Morse Code invented by Samuel Morse
● 1951 – MIT’s Whirlwind becomes the first computer in the world to allow users to
input commands with a keyboard
● 1958 – Silicon chip: the first integrated circuits are produced independently by Jack
Kilby and Robert Noyce
● 1959 – The Xerox 914, the first successful plain-paper photocopier, enters the
consumer market
● 1967 – Hypertext software invented by Andries Van Dam and Ted Nelson
● 1972 – The first video game console designed for use on TVs, the Magnavox
Odyssey, is invented
● 1982 – WHOIS (pronounced ‘who is’) is released as one of the earliest tools for
looking up registered internet domains
● 1989 – The World Wide Web is invented by Sir Tim Berners-Lee
● 1993 – Benny Landa unveils the E-Print 1000 as the world’s first digital colour
printing press
● 1996 – The Nokia 9000 Communicator is released in Finland as the first internet-
enabled mobile device
● 2002 – LinkedIn is established
● 2004 – Emergence of Web 2.0: internet users move from passively consuming
material to actively participating and creating content
● 2007 – Amazon releases the Kindle, marking a new era in reading and book
technology
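One entry in the timeline above, Morse code, is simple enough to illustrate in a few lines of modern code: each character maps to a variable-length sequence of dots and dashes. The sketch below is only illustrative and uses a deliberately partial table.

```python
# A minimal Morse code encoder, illustrating the dot/dash scheme
# Samuel Morse devised (table truncated to a few letters for brevity).
MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "E": ".",
    "H": "....", "I": "..", "L": ".-..", "O": "---",
    "S": "...", "T": "-",
}

def encode(text):
    """Encode known letters to Morse, separating letters with spaces."""
    return " ".join(MORSE[ch] for ch in text.upper() if ch in MORSE)

print(encode("SOS"))  # ... --- ...
```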
The earliest age of technology has been dated back to the premechanical age
(between 3000 B.C. and 1450 A.D.). Human beings at that time primarily communicated
with each other using simple picture drawings called petroglyphs, which they carved on
rock. This form of language was used to tell stories, to keep a record of how many
animals one owned, and to mark territory. It eventually led to the arrival of the first
writing system, known as ‘cuneiform’. Instead of using pictures to express words, signs
were composed to correspond with spoken sounds. Later, the Phoenician alphabet was
created, a more simplified writing system that used symbols to express single syllables
and consonants. Vowels were then added and names were given to the letters, creating
the alphabet we use today.
As alphabets and writing systems became more popular and common, more and more
information was recorded, which drove the search for better ways to communicate and
keep records. The first writing instrument was simply a pen-like stylus used to make
markings in wet clay. This led to more practical writing materials: from bark, leaves and
leather, to the papyrus plant, to paper made from rags, and finally to the modern paper
we use today. However, as more people adopted these early technologies, they
confronted a new problem: how could all of this information be stored safely for a long
period of time? The answer took the form of different methods of record keeping, such
as clay tablets and scrolls, which in turn led to books and libraries.
The numbering systems and the abacus, the first calculator, were also invented during
this period.
Several mechanical calculating devices followed:
● The slide rule (1600s) – an analogue device that allowed users to multiply and divide
● The Pascaline (around 1642) – a mechanical calculator that allowed users to add,
subtract, multiply and divide two numbers
● Leibniz's machine (1670s) – an improvement on the Pascaline, with additional
components that made it easier to multiply and divide
● The difference engine (1820s) – a machine that could calculate tables of polynomial
values and print the results
Even though these inventions were not as capable as the technologies we use today,
they played a big role in the evolution of information technology.
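The idea behind Babbage's difference engine can be sketched in a few lines of modern code: because the n-th finite differences of a degree-n polynomial are constant, a whole table of values can be produced using nothing but repeated addition, which is exactly what the machine mechanized. The function name below is illustrative, not Babbage's terminology.

```python
# Sketch of the method of finite differences used by the difference
# engine: each table entry comes from cascading additions alone.
def tabulate(initial_differences, steps):
    """Produce polynomial values from initial finite differences.

    initial_differences: [f(0), delta f(0), delta^2 f(0), ...]
    """
    diffs = list(initial_differences)
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        # Update each difference by adding the one below it.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# Example: f(x) = x^2 has f(0)=0, first difference 1, second difference 2.
print(tabulate([0, 1, 2], 6))  # [0, 1, 4, 9, 16, 25]
```

No multiplication appears anywhere; that is the whole point of the design.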
The evolution of information technology and the development of computers are
commonly grouped into five stages, or generations.
During the first generation, computer systems used vacuum tubes. The machines were
huge and took up entire rooms. They consumed a large amount of electricity and
therefore, were expensive to operate. They also generated a lot of heat which resulted
in malfunctions.
First-generation computers relied on machine language and could only solve one
problem at a time; setting up a new problem on the machine was a long, complicated
process. Punched cards and paper tape were used for input, and printouts were used to
display output. ENIAC was an example of a first-generation computer, although it was
programmed with plugboards and switches.
In the second generation, transistors replaced vacuum tubes, making computers
smaller, faster, cheaper and more reliable. By the time the third generation arrived, the
speed and efficiency of computers had increased further thanks to the development of
integrated circuits: transistors were made smaller and placed on silicon chips. Punched
cards and printouts became obsolete as operating systems, keyboards and monitors
were introduced, allowing computers to run many different applications at one time,
with one central program monitoring and storing information. Higher-level programming
languages, such as BASIC, were developed that made programming easier. For the
first time in history, society as a whole had access to computers.
The fourth generation brought the microprocessor, which placed thousands of
integrated circuits on a single chip and made personal computers possible. We are
currently in the fifth generation, and much of today's research focuses on artificial
intelligence: the creation of intelligent machines that function and behave like humans.
Speech recognition, learning, planning and problem solving are some of the capabilities
being tested. Even though artificial intelligence is still in development, some
applications, such as voice recognition, are already in use. Overall, the main goal is to
develop devices that respond to natural-language input and are capable of learning.
In conclusion, information technology has existed for thousands of years and is still
evolving. Human beings have collected information in many ways and have discovered
different forms of communication through the use of technology. Information technology
has been essential to our lives and has made a huge impact throughout history;
without it, we would not have today's advanced technologies.