
Name:

Course:
HISTORY OF COMPUTERS
1801 Joseph Marie Jacquard, a French merchant and inventor, invents a loom
that uses punched wooden cards to automatically weave fabric designs.
Early computers would use similar punch cards.
1821 English mathematician Charles Babbage conceives of a steam-driven
calculating machine that would be able to compute tables of numbers.
Funded by the British government, the project, called the "Difference
Engine," fails due to the lack of technology at the time, according to the
University of Minnesota.
1848 Ada Lovelace, an English mathematician and daughter of the poet Lord
Byron, creates the world's first computer program. Lovelace
produces the first program while translating a paper on Babbage's
Analytical Engine from French to English, according to Anna Siffert, a
professor of theoretical mathematics at the University of Münster in
Germany.
1853 Swedish inventor Per Georg Scheutz and his son Edvard create the
world's first printing calculator. According to Uta C. Merzbach's
book, "Georg Scheutz and the First Printing Calculator," the machine is
important for being the first to "calculate tabular differences and print the
findings" (Smithsonian Institution Press, 1977).
1890 Herman Hollerith designs a punch-card system to help calculate the 1890
U.S. Census. The machine saves the government several years of
calculations and U.S. taxpayers approximately $5 million, according to
Columbia University. Hollerith later establishes a company that will
eventually become International Business Machines Corporation (IBM).
1931 At the Massachusetts Institute of Technology (MIT), Vannevar Bush
invents and builds the Differential Analyzer, the first large-scale automatic
general-purpose mechanical analog computer, according to Stanford
University.
1936 Alan Turing, a British scientist and mathematician, presents the principle
of a universal machine, later called the Turing machine, in a paper called
"On Computable Numbers…" according to Chris Bernhardt's book
"Turing's Vision" (The MIT Press, 2017).
1937 John Vincent Atanasoff, a professor of physics and mathematics at Iowa
State University, submits a grant proposal to build the first electric-only
computer, without using gears, cams, belts or shafts.
1939 David Packard and Bill Hewlett found the Hewlett Packard Company in
Palo Alto, California. The pair decide the name of their new company with
a coin toss, and Hewlett-Packard's first headquarters are in
Packard's garage, according to MIT.
1941 German inventor and engineer Konrad Zuse completes his Z3 machine,
the world's earliest digital computer, according to Gerard O'Regan's book
"A Brief History of Computing" (Springer, 2021). The machine was
destroyed during a bombing raid on Berlin during World War II. Zuse fled
the German capital after the defeat of Nazi Germany and later released
the world's first commercial digital computer, the Z4, in 1950, according to
O'Regan.
1941 Atanasoff and his graduate student, Clifford Berry, design the first digital
electronic computer in the U.S., called the Atanasoff-Berry Computer
(ABC). This marks the first time a computer is able to store information
in its main memory, and it is capable of performing one operation every 15
seconds, according to the book "Birthing the Computer" (Cambridge
Scholars Publishing, 2016).
1945 Two professors at the University of Pennsylvania, John Mauchly and J.
Presper Eckert, design and build the Electronic Numerical Integrator and
Computer (ENIAC). The machine is the first "automatic, general-purpose,
electronic, decimal, digital computer," according to Edwin D. Reilly's book
"Milestones in Computer Science and Information Technology"
(Greenwood Press, 2003).
1946 Mauchly and Eckert leave the University of Pennsylvania and receive
funding from the Census Bureau to build the UNIVAC, the first
commercial computer for business and government applications.
1947 William Shockley, John Bardeen and Walter Brattain of Bell Laboratories
invent the transistor. They discover how to make an electric switch with
solid materials and without the need for a vacuum.
1949 A team at the University of Cambridge develops the Electronic Delay
Storage Automatic Calculator (EDSAC), "the first practical stored-
program computer," according to O'Regan. "EDSAC ran its first program
in May 1949 when it calculated a table of squares and a list of prime
numbers," O'Regan wrote.
1954 John Backus and his team of programmers at IBM publish a paper
describing their newly created FORTRAN programming language, an
acronym for FORmula TRANslation, according to MIT.
1958 Jack Kilby and Robert Noyce unveil the integrated circuit, known as the
computer chip. Kilby is later awarded the Nobel Prize in Physics for his
work.
1968 Douglas Engelbart reveals a prototype of the modern computer at the Fall
Joint Computer Conference in San Francisco. His presentation, called "A
Research Center for Augmenting Human Intellect," includes a live
demonstration of his computer, including a mouse and a graphical user
interface (GUI), according to the Doug Engelbart Institute.
1969 Ken Thompson, Dennis Ritchie and a group of other developers at Bell
Labs produce UNIX, an operating system that made "large-scale
networking of diverse computing systems — and the internet —
practical," according to Bell Labs. The team behind UNIX continues to
develop the operating system using the C programming language.
1970 The newly formed Intel unveils the Intel 1103, the first Dynamic Random
Access Memory (DRAM) chip.
1971 A team of IBM engineers led by Alan Shugart invents the "floppy disk,"
enabling data to be shared among different computers.
1972 Ralph Baer, a German-American engineer, releases the Magnavox Odyssey,
the world's first home game console, in September 1972, according to
the Computer Museum of America. Months later, entrepreneur Nolan
Bushnell and engineer Al Alcorn of Atari release Pong, the world's first
commercially successful video game.
1973 Robert Metcalfe, a member of the research staff at Xerox, develops
Ethernet for connecting multiple computers and other hardware.
1975 The magazine cover of the January issue of "Popular Electronics"
highlights the Altair 8800 as the "world's first minicomputer kit to rival
commercial models." After seeing the magazine issue, two "computer
geeks," Paul Allen and Bill Gates, offer to write software for the Altair,
using the new BASIC language.
1976 Steve Jobs and Steve Wozniak co-found Apple Computer on April Fool's
Day. They unveil the Apple I, the first computer with a single circuit board
and ROM (Read Only Memory), according to MIT.
1977 The Commodore Personal Electronic Transactor (PET) is released onto
the home computer market, featuring an MOS Technology 8-bit 6502
microprocessor, which controls the screen, keyboard and cassette player.
The PET is especially successful in the education market, according to
O'Regan.
1977 The first West Coast Computer Faire is held in San Francisco. Jobs and
Wozniak present the Apple II computer at the Faire; the machine features
color graphics and an audio cassette drive for storage.
1978 VisiCalc, the first computerized spreadsheet program, is introduced.
1979 MicroPro International, founded by software engineer Seymour
Rubenstein, releases WordStar, the world's first commercially successful
word processor. WordStar is programmed by Rob Barnaby, and includes
137,000 lines of code, according to Matthew G. Kirschenbaum's book
"Track Changes: A Literary History of Word Processing" (Harvard
University Press, 2016).
1981 "Acorn," IBM's first personal computer, is released onto the market at a
price point of $1,565, according to IBM. Acorn uses the MS-DOS
operating system from Microsoft. Optional features include a display,
printer, two diskette drives, extra memory, a game adapter and more.
1983 The Apple Lisa, standing for "Local Integrated Software Architecture" but
also the name of Steve Jobs' daughter, according to the National
Museum of American History (NMAH), is the first personal computer to
feature a GUI. The machine also includes drop-down menus and icons.
1984 The Apple Macintosh is announced to the world during a Super Bowl
advertisement. The Macintosh is launched with a retail price of $2,500,
according to the NMAH.
1985 As a response to the Apple Lisa's GUI, Microsoft releases Windows in
November 1985, the Guardian reported. Meanwhile, Commodore
announces the Amiga 1000.
1989 Tim Berners-Lee, a British researcher at the European Organization for
Nuclear Research (CERN), submits his proposal for what would become
the World Wide Web. His paper details his ideas for Hypertext Markup
Language (HTML), the building blocks of the Web.
1993 The Pentium microprocessor advances the use of graphics and music on
PCs.
1996 Sergey Brin and Larry Page develop the Google search engine at
Stanford University.

1997 Microsoft invests $150 million in Apple, which at the time is struggling
financially. This investment ends an ongoing court case in which Apple
accused Microsoft of copying its operating system.
1999 Wi-Fi, the abbreviated term for "wireless fidelity," is developed, initially
covering a distance of up to 300 feet (91 meters), Wired reported.
2001 Mac OS X, later renamed OS X then simply macOS, is released by Apple
as the successor to its standard Mac Operating System. OS X goes
through 16 different versions, each carrying "10" in its title, and the first
nine iterations are nicknamed after big cats, with the first codenamed
"Cheetah," TechRadar reported.
2003 AMD's Athlon 64, the first 64-bit processor for personal computers, is
released to customers.
2004 The Mozilla Corporation launches Mozilla Firefox 1.0. The Web browser
is one of the first major challenges to Internet Explorer, owned by
Microsoft. During its first five years, Firefox exceeds a billion downloads
by users, according to the Web Design Museum.
2005 Google buys Android, a Linux-based mobile phone operating system.
2006 The MacBook Pro from Apple hits the shelves. The Pro is the company's
first Intel-based, dual-core mobile computer.
2009 Microsoft launches Windows 7 on July 22. The new operating system
features the ability to pin applications to the taskbar, minimize other
windows by shaking the active one, easy-to-access jump lists, easier
previews of tiles and more, TechRadar reported.
2010 The iPad, Apple's flagship handheld tablet, is unveiled.
2011 Google releases the Chromebook, which runs on Google Chrome OS.
2015 Apple releases the Apple Watch. Microsoft releases Windows 10.
2016 The first reprogrammable quantum computer is created. "Until now,
there hasn't been any quantum-computing platform that had the
capability to program new algorithms into their system. They're usually
each tailored to attack a particular algorithm," said study lead author
Shantanu Debnath, a quantum physicist and optical engineer.
2017 The Defense Advanced Research Projects Agency (DARPA) is
developing a new "Molecular Informatics" program that uses molecules
as computers. "Chemistry offers a rich set of properties that we may be
able to harness for rapid, scalable information storage and processing,"
Anne Fischer, program manager in DARPA's Defense Sciences Office,
said in a statement. "Millions of molecules exist, and each molecule has a
unique three-dimensional atomic structure as well as variables such as
shape, size, or even color."

Artificial Intelligence for Driverless/Autonomous Cars


Autonomous cars are some of the most talked-about and highly anticipated
technologies today. Inside every one of these vehicles is another technology that gets a
lot of publicity — AI. Without artificial intelligence, driverless cars wouldn’t be possible.
Many automated processes today use AI, from your YouTube suggestions to
your phone's predictive text. Self-driving cars are no exception, but they
rely on this technology even more. In most applications, AI is a
convenience; in driverless vehicles, it's a necessity.
Driverless vehicles have been a reality for a while now without needing
AI. There are 64 automated train lines in 42 cities globally, like Singapore's MRT
system. Letting self-driving cars onto crowded streets is more complicated than running
a driverless railway, though.
Following a pre-determined path from point A to point B is straightforward
enough for a machine. When you add a road full of other drivers and pedestrians into
the equation, though, things get tricky. Driverless cars need to recognize and respond to
moving obstacles, which is where AI comes in.
Self-driving cars use a system of cameras and sensors that feed information to
AI software. This software then analyzes that data to understand the world around it,
like where other cars are going or how close they are. It can then make driving
decisions like braking and turning.
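
To make this pipeline concrete, here is a minimal Python sketch of the
sense-analyze-decide loop described above. Every name in it (Obstacle,
perceive, decide) is a hypothetical placeholder rather than any real
vehicle's API, and the two-second braking threshold is an illustrative
assumption, not an industry standard.

    # Illustrative sketch only: all names and numbers are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Obstacle:
        distance_m: float         # how far away the object is, in meters
        closing_speed_mps: float  # how fast the gap is shrinking, in m/s

    def perceive(camera_frame, lidar_points):
        # Stand-in for the AI perception stage: in a real car, a trained
        # neural network would turn raw sensor data into tracked obstacles.
        # Here we return a canned result so the sketch runs end to end.
        return [Obstacle(distance_m=12.0, closing_speed_mps=8.0)]

    def decide(obstacles):
        # Stand-in for the decision stage: brake if any obstacle would be
        # reached in under two seconds, otherwise keep driving.
        for ob in obstacles:
            if ob.closing_speed_mps > 0 and \
               ob.distance_m / ob.closing_speed_mps < 2.0:
                return "brake"
        return "continue"

    obstacles = perceive(camera_frame=None, lidar_points=[])
    print(decide(obstacles))  # prints "brake": 12 m / 8 m/s = 1.5 s away

In a production vehicle, the perception stage would fuse camera, radar and
lidar data through trained neural networks, and the decision stage would be
a motion planner rather than a single rule, but the overall
sense-analyze-decide flow is the same.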
Robotics
Robots will command a greater presence in the manufacturing industry as
they become cheaper, smarter, and more efficient in their roles on the
factory floor. With advances in robotics technology, these machines can
take on more complex work, gaining heightened dexterity, machine learning,
memory, and the ability to collaborate more effectively. These robots will
usher in a new set of standards that every manufacturer will need to adapt
to in order to remain relevant. Robots have long been relied upon as an
essential part of manufacturing, and their presence provides incredible
benefits, including enhanced accuracy, speed, and tireless labor. However,
they cannot do it all. As a result, smaller and more agile machines on the
manufacturing edge are engineered to work collaboratively alongside their
human counterparts; these are referred to as collaborative robots.
Robotic Check-Ups

A pillar of health reform is improving access to the best health care for more
people. Technology is a cost-effective and increasingly potent means to connect clinics
in the vast and medically underserved rural regions of the United States with big city
medical centers and their specialists. Telemedicine is well established as a tool for
triage and assessment in emergencies, but new medical robots go one step further—
they can now patrol hospital hallways on more routine rounds, checking on patients in
different rooms and managing their individual charts and vital signs without direct
human intervention. The RP-VITA Remote Presence Robot produced jointly by iRobot
Corp. and InTouch Health is the first such autonomous navigation remote-presence
robot to receive FDA clearance for hospital use. The device is a mobile cart with a two-
way video screen and medical monitoring equipment, programmed to maneuver
through the busy halls of a hospital.
Apple iPhone
Though it wasn't the first smartphone, Apple really got the ball rolling with the
introduction of the iPhone in 2007. Social media, messaging and the mobile internet
wouldn't be nearly as powerful or universal if they hadn't been freed from the shackles
of the desktop computer and optimized for the iPhone and its dozens of competitors.

Armed with powerful features and able to run thousands of apps, they squeezed
more functionality into one device than we'd ever seen before. The mobile revolution
also brought the death of point-and-shoot cameras, dashboard GPS units, camcorders,
PDAs and MP3 players. Now we use smartphones to shop, as a flashlight and
sometimes even to call people. It's tech's version of the Swiss Army knife.

Now, 13 years after the iPhone's introduction, more than 3.5 billion people
around the world use a smartphone, nearly half the Earth's population. You may even
be using one to read this article.
Wi-Fi

The smartphone and the internet we use today wouldn't have been possible without
wireless communication technologies such as Wi-Fi. In 1995, if you wanted
to "surf" the internet at home, you had to chain yourself to a network
cable like it was an extension cord. In 1997, Wi-Fi was invented and
released for consumer use. With a router and a dongle for your laptop, you
could unplug from the network cable and roam the house or office while
staying online.

Over the years, Wi-Fi's gotten progressively faster and found its way into computers,
mobile devices and even cars. Wi-Fi is so essential to our personal and professional
lives today that it's almost unheard of to be in a home or public place that doesn't have
it.
