Computer Science: Understanding Algorithms and Their Role

An introduction to computer science that debunks common misconceptions and explains the study of algorithms, their properties, and their applications. It covers history, data representation, computer architecture, operating systems, networking, algorithms, theory, database systems, and programming. Central questions of computer science are also discussed, including problem solving, algorithm discovery, representation and communication, analysis and comparison, information manipulation, intelligent behavior, and societal impact.




Introduction to Computer Science

CS A101

What is Computer Science?

  • First, some misconceptions.
  • Misconception 1: I can put together my own PC, am good with Windows, and can surf the net with ease, so I know CS.
  • Misconception 2: Computer science is the study of how to write computer programs.
  • Misconception 3: Computer science is the study of the uses and applications of computers and software.

Computer Science

  • Computer science is the study of algorithms, including:
    • Their formal and mathematical properties
    • Their hardware realizations
    • Their linguistic realizations
    • Their applications

What Will We Cover?

  • A broad survey of computer science topics: some depth in programming, but the emphasis is on breadth
  • Topics
    • History
    • Data representation
    • Computer architecture (software perspective)
    • Operating Systems
    • Networking
    • Algorithms
    • Theory
    • Database Systems
    • Programming (more depth than other topics)

Example: Euclid’s algorithm
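The slide's worked example is not reproduced here. Below is a minimal sketch of Euclid's algorithm in Python; the function name gcd and the sample values are illustrative, not from the slides. The algorithm computes the greatest common divisor of two integers by repeatedly replacing the pair with the smaller number and the remainder:

    def gcd(a, b):
        """Greatest common divisor via Euclid's algorithm."""
        while b != 0:
            a, b = b, a % b   # replace (a, b) with (b, a mod b)
        return a

    # Example: (48, 18) -> (18, 12) -> (12, 6) -> (6, 0), so gcd is 6
    print(gcd(48, 18))        # prints 6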

Central Questions of Computer Science

  • Which problems can be solved by algorithmic processes?
  • How can algorithm discovery be made easier?
  • How can techniques of representing and communicating algorithms be improved?
  • How can characteristics of different algorithms be analyzed and compared?

Central Questions of Computer Science (continued)

  • How can algorithms be used to manipulate information?
  • How can algorithms be applied to produce intelligent behavior?
  • How does the application of algorithms affect society?

The central role of algorithms in computer science

Roots of Computing…

  • Herman Hollerith’s Tabulating Machine
    • A former MIT lecturer, he developed a machine to read punched cards
    • Inspired by train conductors punching tickets
    • Used in the 1890 US Census
    • His company became IBM in 1924

Roots of Computing…

  • 1940, Konrad Zuse's Z
    • First computing machine to use binary code, a precursor to modern digital computers
  • 1944, Harvard Mark I, Howard Aiken
  • 1946, ENIAC, the first all-digital computer
    • Ushered in the "Mainframe" era of computing
    • The "First Generation"
    • 18,000 vacuum tubes

(A vacuum tube is similar to a light bulb, but a plate in the middle controls the flow of electrical current.)

  • On the ENIAC, all programming was done at the digital logic level.
  • Programming the computer involved moving plugs and wires.

The von Neumann Model

Roots of Computing…

  • 1945: John von Neumann defines his architecture for an "automatic computing system", the basis for the architecture of modern computers (see the sketch after this list):
    • The computer accepts input
    • Processes data using a CPU
    • Stores data in memory
    • Uses the stored-program technique: instructions are stored in memory together with the data
    • Produces output
  • Led to the EDVAC and UNIVAC computers
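To make the stored-program idea concrete, here is a toy sketch in Python; it is not from the slides, and the run function, the opcode names, and the memory layout are invented for illustration. The key point it demonstrates is that instructions and data live in the same memory, and a CPU loop fetches, decodes, and executes instructions one at a time:

    def run(memory):
        """Fetch-decode-execute loop over a single shared memory."""
        acc = 0   # accumulator register
        pc = 0    # program counter
        while True:
            op, arg = memory[pc]        # fetch and decode the next instruction
            pc += 1
            if op == "LOAD":
                acc = memory[arg]       # copy a data word into the accumulator
            elif op == "ADD":
                acc += memory[arg]      # add a data word to the accumulator
            elif op == "STORE":
                memory[arg] = acc       # write the accumulator back to memory
            elif op == "PRINT":
                print(acc)              # produce output
            elif op == "HALT":
                return

    # Program and data share one memory, per the stored-program technique.
    memory = [
        ("LOAD", 5),    # 0: acc = memory[5]
        ("ADD", 6),     # 1: acc = acc + memory[6]
        ("STORE", 7),   # 2: memory[7] = acc
        ("PRINT", 0),   # 3: output the accumulator
        ("HALT", 0),    # 4: stop
        2,              # 5: data
        3,              # 6: data
        0,              # 7: result goes here
    ]
    run(memory)         # prints 5

Because the program is just data in memory, it can be loaded, replaced, or even modified like any other data, which is what distinguishes this design from the plug-and-wire programming of the ENIAC.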

The Third Generation: Integrated Circuits (IC)

  • Multiple transistors on a single chip
  • IBM System/360 - the first mainframe to use ICs
  • DEC PDP-8 - generally credited as the first minicomputer
  • End of mainframe era, on to the minicomputer era

Integrated Circuit

  • Invented by Jack Kilby at TI and, independently, by Robert Noyce at Fairchild Semiconductor
  • "What we didn't realize then was that the integrated circuit would reduce the cost of electronic functions by a factor of a million to one; nothing had ever done that for anything before." - Jack Kilby

Minicomputer Era

  • Made possible by DEC, Data General Corporation, and IBM
  • Medium-sized computers, e.g. the DEC PDP series
  • Much less expensive than mainframes, making computing accessible to smaller organizations
  • Used transistors and integrated circuits

Personal Computer Era

  • First microprocessor, the Intel 4004, in 1971
  • MITS Altair "kit" in 1975
  • Apple in 1976
  • IBM PC in 1981, using the Intel 8088
  • Macintosh in 1984, introduced the GUI (Graphical User Interface) we still use today
    • Some critics, e.g. Don Norman on complexity
    • Next interface: delegation instead of direct manipulation?

CPU Clock Speeds

[Chart of CPU clock speeds over time not reproduced]

Moore’s Law

1965: Gordon Moore observes that the number of transistors on a chip (and, loosely, computing power) doubles roughly every 18 months
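As a back-of-the-envelope illustration of how quickly that doubling compounds (the growth_factor function and the 18-month period are assumptions for this sketch, not from the slides):

    def growth_factor(years, doubling_period=1.5):
        """Capacity multiplier after `years`, doubling every `doubling_period` years."""
        return 2 ** (years / doubling_period)

    # 15 years at an 18-month doubling period is 10 doublings:
    print(growth_factor(15))   # 2**10 = 1024, i.e. roughly a thousandfold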

Chip Production

  • An ingot of purified silicon, about 1 meter long, is sliced into thin wafers
  • Chips are etched, much like developing a photograph:
    • UV light is shone through multiple masks
    • Circuits are laid down through the masks
  • The process takes about 3 months

View of Cross-Section

Fabrication

[Figures not reproduced: chip cross-section and fabrication steps, showing doping, annealing, the SiO2 gate, and "wires" laid down by chemical or vapor deposition]