
Computer software, or simply software, is a collection of computer programs and related data that provides the instructions telling a computer what to do and how to do it. Software refers to one or more computer programs and data held in the computer's storage. In other words, software is a set of programs, procedures, algorithms, and documentation concerned with the operation of a data processing system. Program software performs the function of the program it implements, either by directly providing instructions to the digital electronics or by serving as input to another piece of software. The term was coined in contrast to the older term hardware (meaning physical devices); unlike hardware, software "cannot be touched".[1] Software is also sometimes used in a narrower sense, meaning application software only. Sometimes the term includes data that has not traditionally been associated with computers, such as film, tapes, and records.[2]

Software is the language of a computer, and like human language, there are many different computer languages. Essentially, computer software can be divided into three main groups depending on use and application: system software (the operating system, referred to simply as the OS), application software, and programming languages. Most of us interact with a computer through application software.

1. System software: System software, or the operating system, is the software the computer uses to translate inputs from various sources into a language the machine can understand. Basically, the OS coordinates the different hardware components of a computer. There are many operating systems on the market, the most popular of which come from Microsoft. We have all heard of, used, and wondered at Windows, which is an OS; starting with Windows, Microsoft has since moved to Vista, its latest offering on the market. It may come as a surprise to some that there are other operating systems in use as well. Among these, UNIX is used for large office setups with extensive networking, while XENIX is an operating system that has since become obsolete. HP-UX and AIX are UNIX variants used on HP and IBM systems respectively. (Apache, often mentioned in this context, is popular web server software rather than an operating system.) IBM still uses proprietary operating systems for its mainframes; such proprietary systems are generally built as variants of the UNIX operating system.

2. Application software: A normal user rarely gets to see the operating system or to work with it directly, but all of us are familiar with application software, which we must use to interact with a computer. Popular examples are the Microsoft Office suite, which includes Word, Excel, and PowerPoint. Internet Explorer and Mozilla Firefox are two applications used to access the internet, and e-mail software like Outlook Express is used to manage e-mail. In fact, all software a user works with on a computer is classified as application software: every user interface is an application, and so are the antivirus program and the media player.

3. Programming languages: This is a kind of computer software used almost exclusively by computer programmers; unless we are programmers ourselves, we are unlikely to come across programming languages. A simple way to understand programming languages is to think of them as bricks which can be used to create applications and operating systems. C++, Java, and Simlab are some popular programming languages. Generally, Java is used for internet applications; C++ is a language of professional developers, used extensively in developing operating systems; PHP is another language used for internet applications. There is also a newer class of lightweight, modular languages used to design mobile applications.

Computer software thus falls into three basic categories: system software (the operating system), application software, and programming languages. We use applications on a day-to-day basis, and those applications are themselves created using programming languages.
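To make the "bricks" analogy concrete, here is a minimal sketch in Python (chosen for brevity; any of the languages above would serve) showing how small reusable pieces of code combine into a tiny application. The function names are purely illustrative, not taken from any particular library.

```python
# A minimal sketch of the "bricks" idea: small reusable functions
# composed into a tiny application. All names are illustrative.

def word_count(text):
    """One brick: count the words in a piece of text."""
    return len(text.split())

def summarize(document):
    """Another brick, built on the first, forming a small feature."""
    print(f"The document contains {word_count(document)} words.")

if __name__ == "__main__":
    summarize("Computer software is a collection of programs and data.")
```

An operating system or a full application is, in essence, a vastly larger arrangement of such bricks.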

History of Computers

This is a searchable directory about the history of computers and computing; a timeline of the history of computers and early calculating machines has been included, covering developments as far back as the 1600s and their impact on computing. The development of the modern-day computer was the result of advances in technologies and man's need to quantify. (The abacus was one of the first counting machines. Calculating machines were sold commercially before the advent of steel manufacturing technologies. Papyrus was something to write on before we had paper, and writing was a way to record mathematical calculations.) This history of computers site includes the names of early pioneers of math and computing, with links to related sites for further study. A new "Timeline of the History of Computers and Related Technologies" has been added. This site was designed to be used by students assigned topics about the history of computers and computing. Original articles are footnoted and related links are included.

One important purpose of this Web page is to debunk myths some people create, such as "we have computers because of the military" (not true). We have computers because man wanted to quantify as early as the ancient Chinese dynasties, when they created the abacus and used it for calculating, adding and subtracting in particular. Babbage and Lovelace were "programming" machines as early as the 1800s, before any military computer in this country. The Jacquard loom, created in 1801, used punched cards. Cathode ray tubes (CRTs) have been around since 1885, and the US government first used a computer in the 1950s. Great Britain's COLOSSUS was developed before the ENIAC, so people trying to perpetuate the importance of the US military in the development of computer technologies are doing a disservice to students. Electronics, related computer development, and the invention of the transistor were all independent of military intent. If anything, even the totalisator machines were created for statistical purposes and have been used for horse racing, not rocket science. I love my Mac, and it has no military background that I am aware of. Military computers did not have integrated circuits like PC computer chips either. Stop saying computer development was military in origin; you simply can't back it up with fact. Yes, the military also had old computers, just as my Commodore was old, but they weren't related: there were no tubes in my Commodore. That was an altogether different technology from a military monster computer with vacuum tubes and mechanical relays, which was hot stuff in that age.
First Generation Computers (1940s-1950s)

The first electronic computers used vacuum tubes, and they were huge and complex. The first general-purpose electronic computer was the ENIAC (Electronic Numerical Integrator And Computer). It was digital, although it didn't operate with binary code, and was reprogrammable to solve a complete range of computing problems. It was programmed using plugboards and switches, supporting input from an IBM card reader and output to an IBM card punch. It took up 167 square meters, weighed 27 tons, and consumed 150 kilowatts of power. It used thousands of vacuum tubes, crystal diodes, relays, resistors, and capacitors. The first electronic computer that was not general-purpose was the ABC (Atanasoff-Berry Computer); other computers of this era included the German Z3, the ten British Colossus computers, LEO, the Harvard Mark I, and UNIVAC.

Second Generation Computers (1955-1960)

(Image: IBM 1401)

The second generation of computers came about thanks to the invention of the transistor, which then started replacing vacuum tubes in computer design. Transistor computers consumed far less power, produced far less heat, and were much smaller than the first generation, albeit still big by today's standards. The first transistor computer was created at the University of Manchester in 1953. The most popular transistor computer was the IBM 1401. IBM also created the first disk drive in 1956, the IBM 350 RAMAC.

Third Generation Computers (1960s)

(Image: IBM System/360)

The invention of the integrated circuit (IC), also known as the microchip, paved the way for computers as we know them today. Making circuits out of single pieces of silicon, which is a semiconductor, allowed them to be much smaller and more practical to produce. This also started the ongoing process of integrating an ever larger number of transistors onto a single microchip. During the sixties microchips started making their way into computers, but the process was gradual, and second-generation computers still held on.

Minicomputers appeared first. The earliest were still based on discrete (non-microchip) transistors, while later machines were hybrids of transistors and microchips, such as IBM's System/360. They were much smaller and cheaper than the first and second generations of computers, also known as mainframes. Minicomputers can be seen as a bridge between mainframes and microcomputers, which came later as the proliferation of microchips in computers grew.

Fourth Generation Computers (1971-present)

The first microchip-based central processing units consisted of multiple microchips for different CPU components. The drive for ever greater integration and miniaturization led towards single-chip CPUs, where all of the necessary CPU components were put onto a single microchip called a microprocessor. The first single-chip CPU, or microprocessor, was the Intel 4004. The advent of the microprocessor spawned the evolution of microcomputers, the kind that would eventually become the personal computers we are familiar with today.

First Generation of Microcomputers (1971-1976)

(Image: Altair 8800)

The first microcomputers were a weird bunch. They often came in kits, and many were essentially just boxes with lights and switches, usable only by engineers and hobbyists who could understand binary code. Some, however, did come with a keyboard and/or a monitor, bearing somewhat more resemblance to modern computers. It is arguable which of the early microcomputers could be called the first. The CTC Datapoint 2200 is one candidate, although it didn't actually contain a microprocessor (being based on a multi-chip CPU design instead) and wasn't meant to be a standalone computer, but merely a terminal for mainframes. The reason some might consider it the first microcomputer is that it could be used as a de facto standalone computer, it was small enough, and its multi-chip CPU architecture actually became the basis for the x86 architecture later used in the IBM PC and its descendants. Plus, it even came with a keyboard and a monitor, an exception in those days. However, if we are looking for the first microcomputer that came with a proper microprocessor, was meant to be a standalone computer, and didn't come as a kit, then it would be the Micral N, which used the Intel 8008 microprocessor. Popular early microcomputers which did come in kits include the MOS Technology KIM-1, the Altair 8800, and the Apple I. The Altair 8800 in particular spawned a large following among hobbyists and is considered the spark that started the microcomputer revolution, as these hobbyists went on to found companies centered on personal computing, such as Microsoft and Apple.

Second Generation Microcomputers (1977-present)

(Image: Commodore PET 2001, by Tomislav Medak, licensed under CC-BY-SA)

As microcomputers continued to evolve, they became easier to operate, making them accessible to a larger audience. They typically came with a keyboard and a monitor, or could easily be connected to a TV, and they supported visual representation of text and numbers on the screen. In other words, lights and switches were replaced by screens and keyboards, and the necessity of understanding binary code diminished as they increasingly came with programs that could be used by issuing more easily understandable commands. Famous early examples of such computers include the Commodore PET, the Apple II, and, in the 80s, the IBM PC.

The nature of the underlying electronic components didn't change between these computers and the modern computers we know today; what did change was the number of circuits that could be put onto a single microchip. Intel's co-founder Gordon Moore predicted the doubling of the number of transistors on a single chip every two years, which became known as Moore's Law (see the short numeric sketch before the Portable Computers section below). This trend has roughly held for over 30 years thanks to advancing manufacturing processes and microprocessor designs. The consequence was a predictable exponential increase in processing power that could be put into an ever smaller package, which had a direct effect on the possible form factors, as well as the applications, of modern computers; this is what most of the paradigm-shifting innovations in computing described below were about.

Graphical User Interface (GUI)

(Image: Macintosh 128k, by the All About Apple museum, licensed under CC-BY-SA-2.5-it)

Possibly the most significant of those shifts was the invention of the graphical user interface, and of the mouse as a way of controlling it. Doug Engelbart and his team at the Stanford Research Institute developed the first mouse and a graphical user interface, demonstrated in 1968. They were just a few years short of the beginning of the personal computer revolution sparked by the Altair 8800, so their idea didn't take hold. Instead it was picked up and improved upon by researchers at the Xerox PARC research center, which in 1973 developed the Xerox Alto, the first computer with a mouse-driven GUI. It never became a commercial product, however, as Xerox management wasn't ready to dive into the computer market and didn't see the potential of what they had early enough.

It took Steve Jobs negotiating a stock deal with Xerox in exchange for a tour of their research center to finally bring the user-friendly graphical user interface, as well as the mouse, to the masses. Steve Jobs was shown what the Xerox PARC team had developed and directed Apple to improve upon it. In 1984 Apple introduced the Macintosh, the first mass-market computer with a graphical user interface and a mouse. Microsoft later caught on and produced Windows, and the historic competition between the two companies began, resulting in improvements to the graphical user interface that continue to this day. Meanwhile IBM was dominating the PC market with its IBM PC, and Microsoft was riding on its coattails as the company producing and selling the operating system for the IBM PC, known as DOS (Disk Operating System). The Macintosh, with its graphical user interface, was meant to dislodge IBM's dominance, but Microsoft made this more difficult with its PC-compatible Windows operating system and its own GUI.
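As a rough numeric illustration of what that two-year doubling implies, here is a short sketch in Python. It assumes an idealized doubling period of exactly two years and uses the commonly cited figure of roughly 2,300 transistors on the 1971 Intel 4004 as a starting point; real chips deviate from this idealized curve.

```python
# Idealized Moore's Law projection: transistor counts doubling
# every two years, starting from the Intel 4004 (~2,300
# transistors, 1971). Figures are approximations, not real data.

START_YEAR = 1971
START_TRANSISTORS = 2300

def projected_transistors(year):
    """Project the transistor count for a given year."""
    doublings = (year - START_YEAR) / 2
    return START_TRANSISTORS * 2 ** doublings

for year in (1971, 1981, 1991, 2001):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Even this crude projection reaches tens of millions of transistors by 2001, which is the exponential growth behind the shrinking form factors described above.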

Portable Computers

(Image: PowerBook 150, by Dana Sibera, licensed under CC-BY-SA)

As it turned out, the idea of a laptop-like portable computer existed even before it was possible to create one. It was developed at Xerox PARC by Alan Kay, who called it the Dynabook and intended it for children. The first portable computer actually created was the Xerox NoteTaker, but only about 10 were produced. The first laptop to be commercialized was the Osborne 1 in 1981, with a small 5-inch CRT monitor and a keyboard that sat inside the lid when closed. It ran CP/M, the operating system on which 86-DOS, which Microsoft bought and reworked into MS-DOS, was modeled. Later portable computers included the Bondwell 2, released in 1985 and also running CP/M, which was among the first with a hinge-mounted LCD display. The Compaq Portable was the first IBM PC-compatible portable computer; it ran MS-DOS but was less portable than the Bondwell 2. Other examples of early portable computers included the Epson HX-20, the GRiD Compass, the Dulmont Magnum, the Kyotronic 85, the Commodore SX-64, the IBM PC Convertible, and the Toshiba T1100, T1000, and T1200.

The first portable computers to resemble modern laptops in features were Apple's PowerBooks, which first introduced a built-in trackball, and later a trackpad and optional color LCD screens. IBM's ThinkPad was largely inspired by the PowerBook's design, and the evolution of the two led to the laptops and notebook computers we know today. PowerBooks were eventually succeeded by the modern MacBook Pros. Of course, much of the evolution of portable computers was enabled by the evolution of microprocessors, LCD displays, battery technology, and so on. This evolution ultimately allowed computers even smaller and more portable than laptops, such as PDAs, tablets, and smartphones.
