
Computer programming

Computer programming or coding is the composition of sequences of instructions, called programs, that computers can follow to perform tasks.[1][2] It involves designing and implementing algorithms, step-by-step specifications of procedures, by writing code in one or more programming languages. Programmers typically use high-level programming languages that are more easily intelligible to humans than machine code, which is directly executed by the central processing unit. Proficient programming usually requires expertise in several different subjects, including knowledge of the application domain, details of programming languages and generic code libraries, specialized algorithms, and formal logic.

Auxiliary tasks accompanying and related to programming include analyzing requirements, testing, debugging (investigating and fixing problems), implementation of build systems, and management of derived artifacts, such as programs' machine code. While these are sometimes considered programming, often the term software development is used for this larger overall process – with the terms programming, implementation, and coding reserved for the writing and editing of code per se. Sometimes software development is known as software engineering, especially when it employs formal methods or follows an engineering design process.

History

Ada Lovelace, whose notes, added to the end of Luigi Menabrea's paper, included the first algorithm designed for processing by Charles Babbage's Analytical Engine. She is often recognized as history's first computer programmer.
Programmable devices have existed for centuries. As early as the 9th century, a programmable
music sequencer was invented by the Persian Banu Musa brothers, who described an automated
mechanical flute player in the Book of Ingenious Devices.[3][4] In 1206, the Arab engineer Al-Jazari
invented a programmable drum machine where a musical mechanical automaton could be made
to play different rhythms and drum patterns, via pegs and cams.[5][6] In 1801, the Jacquard loom
could produce entirely different weaves by changing the "program" – a series of pasteboard
cards with holes punched in them.

Code-breaking algorithms have also existed for centuries. In the 9th century, the Arab
mathematician Al-Kindi described a cryptographic algorithm for deciphering encrypted code, in A
Manuscript on Deciphering Cryptographic Messages. He gave the first description of cryptanalysis
by frequency analysis, the earliest code-breaking algorithm.[7]

The first computer program is generally dated to 1843 when mathematician Ada Lovelace
published an algorithm to calculate a sequence of Bernoulli numbers, intended to be carried out
by Charles Babbage's Analytical Engine.[8]

Data and instructions were once stored on external punched cards, which were kept in order and arranged in program decks.

In the 1880s, Herman Hollerith invented the concept of storing data in machine-readable form.[9] Later, a control panel (plug board) added to his 1906 Type I Tabulator allowed it to be programmed for different jobs, and by the late 1940s, unit record equipment such as the IBM 602 and IBM 604 was programmed by control panels in a similar way, as were the first electronic computers. However, with the concept of the stored-program computer introduced in 1949, both programs and data were stored and manipulated in the same way in computer memory.[10]

Machine language

Machine code was the language of early programs, written in the instruction set of the particular
machine, often in binary notation. Assembly languages were soon developed that let the
programmer specify instructions in a text format (e.g., ADD X, TOTAL), with abbreviations for
each operation code and meaningful names for specifying addresses. However, because an
assembly language is little more than a different notation for a machine language, two machines
with different instruction sets also have different assembly languages.
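
To make the relationship concrete, the sketch below, written in Python with an invented instruction set, shows how an assembler might translate mnemonics such as ADD and symbolic names such as X and TOTAL into numeric machine code. The opcodes, addresses, and two-operand format are assumptions chosen for illustration, not those of any real machine.

    # Toy assembler for an invented two-operand instruction set.
    # Opcode numbers and symbol addresses are illustrative assumptions.
    OPCODES = {"MOV": 0x1, "ADD": 0x2, "SUB": 0x3}
    SYMBOLS = {"X": 0x10, "TOTAL": 0x11}  # meaningful names for memory addresses

    def assemble(lines):
        """Translate 'OP SRC, DST' text into (opcode, src, dst) number triples."""
        code = []
        for line in lines:
            mnemonic, operands = line.split(maxsplit=1)
            src, dst = (SYMBOLS[name.strip()] for name in operands.split(","))
            code.append((OPCODES[mnemonic], src, dst))
        return code

    print(assemble(["ADD X, TOTAL"]))  # [(2, 16, 17)]

Because each mnemonic maps one-to-one onto a machine operation, the same source text would assemble to different numbers on a machine with a different instruction set, which is the point made above.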

Wired control panel for an IBM 402 Accounting Machine. Wires connect pulse streams from the card reader to counters and other internal logic and ultimately to the printer.

Compiler languages

High-level languages made the process of developing a program simpler and more understandable, and less bound to the underlying hardware. The first compiler-related tool, the A-0 System, was developed in 1952[11] by Grace Hopper, who also coined the term 'compiler'.[12][13] FORTRAN, the first widely used high-level language to have a functional implementation, came out in 1957,[14] and many other languages were soon developed, notably COBOL for commercial data processing and Lisp for computer research.

These compiled languages allow the programmer to write programs in terms that are syntactically richer and better able to abstract the code, making it easier to target varying machine instruction sets via compilation declarations and heuristics. Compilers harnessed the power of computers to make programming easier[14] by allowing programmers to specify calculations by entering a formula using infix notation.
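
As a rough illustration of that convenience, the Python sketch below contrasts an infix formula with the step-by-step operations a compiler might emit for a simple accumulator-style machine. The formula, the variable names, and the accumulator model are invented for this example; the historical languages in question were FORTRAN and its contemporaries.

    # A calculation written as an infix formula, as compiler languages allow.
    principal, rate, years = 1000.0, 0.05, 3
    amount = principal * (1 + rate) ** years

    # The same calculation spelled out as individual operations, roughly what
    # a compiler generates behind the scenes (the ordering of steps is an
    # illustrative assumption).
    acc = 1 + rate           # load 1, add rate
    acc = acc ** years       # raise to the power of years
    acc = acc * principal    # multiply by principal
    assert acc == amount     # both forms compute the same value

The formula form states what to compute; the compiler takes responsibility for working out the sequence of individual machine operations.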

Source code entry

Programs were mostly entered using punched cards or paper tape. By the late 1960s, data storage devices and computer terminals became inexpensive enough that programs could be created by typing directly into the computers.
