Group 1 - History of Computer.pptx

GROUP 1
MEMBERS: MENDOZA SARAH G
ALVAREZ JOHN FRITZ
DE SAGUN GIAN MARKO
WHAT IS A COMPUTER?
• A computer is an electronic tool for handling data or information.
It is capable of storing, retrieving, and processing data. You may
already know that a computer can be used to type documents,
send emails, play games, and surf the Internet.
Additionally, you may use it to make or edit presentations,
spreadsheets, and even films.
• Computers have opened up a new era in manufacturing through
the techniques of automation, and they have enhanced modern
communication systems. They are essential tools in almost every
field of research and applied technology, from constructing
models of the universe to producing tomorrow's weather reports,
and their use has in itself opened up new areas of conjecture.
Database services and computer networks make available a great
variety of information sources. The same advanced techniques
also make possible invasions of privacy and of restricted
information sources, and computer crime has become one of the
many risks that society must face if it is to enjoy the benefits of
modern technology.
HISTORY OF COMPUTER
• When we study the many aspects of computing and
computers, it is important to know about their history.
Charles Babbage, for example, designed the Analytical
Engine, a general-purpose computer. This history helps us
understand the growth and progress of technology
through the times. It is also an important topic for
competitive and banking exams.
• The history of computers goes back over 200 years. First
theorized by mathematicians and entrepreneurs, mechanical
calculating machines were designed and built during the 19th
century to solve increasingly complex number-crunching
challenges. Advances in technology enabled ever more complex
computers by the early 20th century, and computers became
larger and more powerful.
• Today, computers are almost unrecognizable from designs of the
19th century, such as Charles Babbage's Analytical Engine, or
even from the huge computers of the 20th century that occupied
whole rooms, such as the Electronic Numerical Integrator and
Computer (ENIAC).
• Humans have used devices for calculation for thousands of years.
One of the earliest and most well-known was the abacus. Then, in
1822, the father of computers, Charles Babbage, began developing
what would be the first mechanical computer. In 1833 he designed
the Analytical Engine, a general-purpose computer. It contained an
ALU, basic flow-control concepts, and the concept of integrated
memory.
• Then, more than a century later in the history of computers, we got
our first general-purpose electronic computer: the ENIAC, which
stands for Electronic Numerical Integrator and Computer. Its
inventors were John W. Mauchly and J. Presper Eckert.
• As time went on, the technology developed,
and computers got smaller while processing
got faster. We got our first laptops in 1981,
introduced by Adam Osborne and EPSON.
Two main types of computers are in use today, analog and digital,
although the term computer is often used to mean only the digital type.
Analog Computers - The analog computer is an electronic or hydraulic device that
is designed to handle input in terms of, for example, voltage levels or
hydraulic pressures, rather than numerical data. The simplest analog
calculating device is the slide rule, which employs lengths of specially
calibrated scales to facilitate multiplication, division, and other functions.
In a typical electronic analog computer, the inputs are converted into
voltages that may be added or multiplied using specially designed
circuit elements. The answers are continuously generated for display or
for conversion to another desired form.
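As a rough illustration of the continuous behavior described above, the sketch below models two input voltages as functions of time and a summing element that combines them at every instant. The signal shapes are invented for the example; they are not taken from any particular machine.

```python
import math

# Hypothetical input signals, expressed as voltages that vary with time.
def signal_a(t):
    return 2.0 * math.sin(t)   # e.g. a sensor reading

def signal_b(t):
    return 0.5 * t             # e.g. a slowly rising reference voltage

def summer(t):
    # Analogous to a summing circuit element: the output continuously
    # tracks the sum of its input voltages.
    return signal_a(t) + signal_b(t)

# Sample the continuously generated answer at a few instants for display.
for t in (0.0, 1.0, 2.0):
    print(f"t={t:.1f}s  output={summer(t):+.3f} V")
```

Because a real analog computer works with physical quantities, the "answer" exists at every instant; the loop above only samples it for display.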
Digital Computers- Everything that a digital computer does is
based on one operation: the ability to determine if a switch, or
"gate," is open or closed. That is, the computer can recognize only
two states in any of its microscopic circuits: on or off, high voltage
or low voltage, or, in the case of numbers, 0 or 1. The speed at
which the computer performs this simple act, however, is what
makes it a marvel of modern technology.
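The two-state principle above can be sketched directly: treat every circuit as a value that is 0 or 1, and build everything else out of simple gate functions. This is an illustrative sketch, not code for any particular machine.

```python
# Each value is a "switch": 0 (open / low voltage) or 1 (closed / high voltage).
def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def XOR(a, b):
    return a ^ b

def half_adder(a, b):
    # Adding two one-bit numbers needs nothing beyond gates:
    # the sum bit is XOR, and the carry bit is AND.
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

Chaining such adders (with carry handling) gives multi-bit arithmetic, which is the sense in which everything a digital computer does reduces to this one open-or-closed decision.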
The first adding machine, a precursor of the digital computer, was
devised in 1642 by the French philosopher Blaise Pascal. This device
employed a series of ten-toothed wheels, each tooth representing a
digit from 0 to 9. The wheels were connected so that numbers
could be added to each other by advancing the wheels by a correct
number of teeth. In the 1670s the German philosopher and
mathematician Gottfried Wilhelm von Leibniz improved on this
machine by devising one that could also multiply. The French
inventor Joseph Marie Jacquard, in designing an automatic loom,
used thin, perforated wooden boards to control the weaving of
complicated designs.
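Pascal's carry mechanism described above maps neatly onto a short simulation: each ten-toothed wheel holds one decimal digit, and pushing a wheel past 9 turns its neighbor by one tooth. The function and variable names below are my own, chosen for the sketch.

```python
def advance(wheels, position, teeth):
    """Advance the wheel at `position` by `teeth` (0-9), propagating carries.

    `wheels` lists digits least-significant first, like the machine's
    rightmost wheel holding the ones place.
    """
    wheels[position] += teeth
    # A wheel pushed past 9 wraps around and nudges the next wheel: the carry.
    while position < len(wheels) and wheels[position] > 9:
        wheels[position] -= 10
        if position + 1 < len(wheels):
            wheels[position + 1] += 1
        position += 1
    return wheels

# Add 5 + 7 on a three-wheel machine.
wheels = [0, 0, 0]
advance(wheels, 0, 5)
advance(wheels, 0, 7)
print(wheels)  # [2, 1, 0], read most-significant last: 012 = 12
```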
GENERATIONS OF COMPUTERS
In the history of computers, we often refer to
the advancements of modern computers as the
generation of computers. We are currently on
the fifth generation of computers. So let us
look at the important features of these five
generations of computers.
1ST GENERATION
• This generation spanned the period from 1940 to 1955, when
machine language was developed for use in computers. These
machines used vacuum tubes for circuitry and magnetic drums
for memory. They were complicated, large, and expensive, and
mostly relied on batch operating systems and punch cards.
Magnetic tape and paper tape were used as input and output
devices. Examples include ENIAC, UNIVAC-1, EDVAC, and so on.
2ND GENERATION
• The years 1957-1963 are referred to as the “second generation
of computers.” In this generation, computers advanced from
vacuum tubes to transistors, which made them smaller, faster,
and more energy-efficient. They also advanced from binary
machine code to assembly languages, and high-level programming
languages such as COBOL and FORTRAN were introduced. For
instance, the IBM 1620, IBM 7094, CDC 1604, CDC 3600, and so
forth.
3RD GENERATION
• The hallmark of this period (1964-1971) was the development of
the integrated circuit. A single integrated circuit (IC) is made up
of many transistors, which increases the power of a computer
while simultaneously lowering its cost. These computers were
quicker, smaller, more reliable, and less expensive than their
predecessors. High-level programming languages such as
FORTRAN II to IV, COBOL, PASCAL, and PL/1 were utilized. For
example, the IBM-360 series, the Honeywell-6000 series, and the
IBM-370/168.
4TH GENERATION
•The invention of the microprocessor brought
along the fourth generation of computers. The
years 1971-1980 were dominated by fourth
generation computers. C, C++ and Java were the
programming languages utilized in this generation
of computers. For instance, the STAR 1000, PDP 11,
CRAY-1, CRAY-X-MP, and Apple II. This was when
we started producing computers for home use.
5TH GENERATION
• These computers have been utilized since 1980 and continue to
be used now. This is the present and the future of the computer
world. The defining aspect of this generation is artificial
intelligence. The use of parallel processing and superconductors
is making this a reality and provides a lot of scope for the future.
Fifth-generation computers use ULSI (Ultra Large Scale
Integration) technology. These are the most recent and
sophisticated computers. C, C++, Java, .NET, and other
programming languages are used. Examples include Pentium-based
IBM PCs, desktops, laptops, notebooks, ultrabooks, and so on.
1801 Joseph Marie Jacquard, a French merchant and inventor, invents a loom that uses
punched wooden cards to automatically weave fabric designs. Early computers would
use similar punch cards.
1821 English mathematician Charles Babbage conceives of a steam-driven calculating
machine that would be able to compute tables of numbers. Funded by the British
government, the project, called the "Difference Engine," fails due to the lack of
technology at the time, according to the University of Minnesota.
1843 Ada Lovelace, an English mathematician and the daughter of poet Lord Byron,
writes the world's first computer program. According to Anna Siffert, a professor of
theoretical mathematics at the University of Münster in Germany, Lovelace writes the
first program while translating a paper on Babbage's Analytical Engine from French
into English.
1931 At the Massachusetts Institute of Technology (MIT), Vannevar
Bush invents and builds the Differential Analyzer, the first large-
scale automatic general-purpose mechanical analog computer,
according to Stanford University.
1936 Alan Turing, a British scientist and mathematician, presents the
principle of a universal machine, later called the Turing machine, in
a paper called "On Computable Numbers."
1937 John Vincent Atanasoff, a professor of physics and
mathematics at Iowa State University, submits a grant proposal to
build the first electric-only computer, without using gears, cams,
belts or shafts.
• 1945 – ENIAC, built by John W. Mauchly and J. Presper Eckert, was capable of
solving “a vast class of numerical problems” by reprogramming, earning it the title of
“Grandfather of computers.”
• 1946 – The UNIVAC I (Universal Automatic Computer) was the first general-purpose
electronic digital computer designed in the United States for corporate applications.
• 1949 – The Electronic Delay Storage Automatic Calculator (EDSAC), developed by a team
at the University of Cambridge, is the “first practical stored-program computer.”
• 1950 – The Standards Eastern Automatic Computer (SEAC) was built in Washington, DC,
and it was the first stored-program computer completed in the United States.
1953 – Grace Hopper, a computer scientist, develops the first computer language, which
eventually becomes known as COBOL (Common Business-Oriented Language). It allowed a computer
user to give the computer instructions in English-like words rather than numbers.
1954 – John Backus and a team of IBM programmers created the FORTRAN programming language,
an acronym for Formula Translation. In addition, IBM developed the 650.
1958 – The integrated circuit, sometimes known as the computer chip, was created by Jack Kilby
and Robert Noyce.
1962 – Atlas, the computer, makes its appearance. It was the fastest computer in the world at the
time, and it pioneered the concept of “virtual memory.”
1968 Douglas Engelbart reveals a prototype of the modern computer at the Fall Joint Computer
Conference, San Francisco. His presentation, called "A Research Center for Augmenting Human
Intellect" includes a live demonstration of his computer, including a mouse and a graphical user
interface (GUI), according to the Doug Engelbart Institute.
2014 – The University of Michigan Micro Mote (M3), the
world’s smallest computer, was constructed.
2016 – The world’s first reprogrammable quantum
computer is built.
THANK YOU!