
John Magee 29 July 2013

Some material copyright Jones and Bartlett

CS101 Lecture 25:

The Machinery of Computation:

Computer Architecture


Overview/Questions

– What did we do last time?

– Can we relate this circuit stuff to something we already know about?

– How can we combine these elements to do more complicated tasks?


What did we talk about last time?

– Circuits control the flow of electricity.

– Gates are simple logical systems.

– By combining several gates, we create logic-computing circuits.

– Logic-computing circuits can do binary number addition.


Integrated Circuits

Integrated circuit (also called a chip)

A piece of silicon on which many gates have been embedded.

Silicon pieces are mounted on a plastic or ceramic package, with pins along the edges that can be soldered onto circuit boards or inserted into appropriate sockets.


Integrated Circuits


Central Processor Units

The most important integrated circuit in any computer is the Central Processing Unit, or CPU.

– The Intel Core 2 Duo® processor has more than 1.9 billion (1.9 × 10⁹) transistors on one chip.

The CPU combines many gates to implement a small set of instructions. Examples:

– Add/subtract 2 binary inputs

– Load a value from memory

– Store a value into memory


Recall: Binary Number Addition

Adding two 1-bit numbers together produces:

– A sum bit

– A carry bit

http://www.cs.bu.edu/courses/cs101/labs/ECS_2e/Applets/APPLETS/BINARYADD/applet_frame.htm
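As a rough sketch (not from the slides), the same 1-bit addition can be written in Python: the sum bit is the XOR of the two inputs, and the carry bit is their AND.

```python
# Minimal sketch (not from the slides): adding two 1-bit numbers.
def half_adder(a: int, b: int) -> tuple[int, int]:
    """Return (sum_bit, carry_bit) for two 1-bit inputs."""
    sum_bit = a ^ b    # XOR: 1 when exactly one input is 1
    carry_bit = a & b  # AND: 1 only when both inputs are 1
    return sum_bit, carry_bit

print(half_adder(1, 1))  # (0, 1) -- in binary, 1 + 1 = 10
```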


Recall: The Full Adder

The full adder takes 3 inputs:

– A, B, and a carry-in value

Figure 4.10 A full adder


The Full Adder

Here is the Full Adder, with its internal details hidden (an abstraction).

What matters now:

– inputs: A, B, and CI (carry-in)

– outputs: S (sum) and CO (carry-out)
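A minimal Python sketch of the same abstraction, assuming the standard full-adder logic (the function and variable names here are illustrative, not from the slides):

```python
def full_adder(a: int, b: int, ci: int) -> tuple[int, int]:
    """Return (s, co): sum bit and carry-out for inputs A, B, and carry-in CI."""
    s = a ^ b ^ ci                      # sum bit
    co = (a & b) | (a & ci) | (b & ci)  # carry-out: set when at least two inputs are 1
    return s, co

print(full_adder(1, 1, 0))  # (0, 1)
print(full_adder(1, 1, 1))  # (1, 1) -- in binary, 1 + 1 + 1 = 11
```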


An 8-bit Adder

To add two 8-bit numbers together, we need an 8-bit adder. Notice how the carry-out from one bit’s adder becomes the carry-in to the next adder.


An 8-bit Adder

We can abstract away the 1-bit adders and summarize with this diagram. Notice the inputs and outputs.
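The chaining can also be sketched in Python (an illustration, not the textbook's circuit): eight 1-bit full adders in a row, each passing its carry-out to the next stage's carry-in.

```python
def full_adder(a, b, ci):
    # Same logic as the full-adder sketch above.
    return a ^ b ^ ci, (a & b) | (a & ci) | (b & ci)

def add_8bit(a_bits, b_bits):
    """Add two 8-bit numbers given as bit lists, least significant bit first.
    Returns (sum_bits, carry_out)."""
    carry = 0
    sum_bits = []
    for a, b in zip(a_bits, b_bits):        # one full adder per bit position
        s, carry = full_adder(a, b, carry)  # carry-out feeds the next carry-in
        sum_bits.append(s)
    return sum_bits, carry

# 3 + 5 = 8, with bits listed least significant first
print(add_8bit([1, 1, 0, 0, 0, 0, 0, 0], [1, 0, 1, 0, 0, 0, 0, 0]))
# ([0, 0, 0, 1, 0, 0, 0, 0], 0)
```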


Output from the Adder

The adder produces 2 outputs:

– Sum (multi-bit)

– Carry-out (1-bit)

Where does the output go from here?

Accumulator

A circuit connected to an adder, which stores the adder’s result.


Putting it Together

The accumulator is a memory circuit, and is wired as both an output from the adder and an input back into the adder.


Accumulator Example

Suppose we want to add 3 numbers (sketched in code below):

1) Clear the accumulator (set it to all 0s)

2) Load the next input into the adder

3) Compute the sum of accumulator + input

4) The result flows back into the accumulator

5) Repeat from step 2 with the next input
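A minimal Python sketch of these steps (the names are illustrative; the slides give no code):

```python
def accumulate(inputs):
    """Add a list of numbers using an accumulator, mirroring steps 1-5 above."""
    accumulator = 0                        # step 1: clear the accumulator
    for value in inputs:                   # steps 2 and 5: load each input in turn
        accumulator = accumulator + value  # steps 3-4: the sum flows back into the accumulator
    return accumulator

print(accumulate([3, 5, 7]))  # 15
```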


Input to the Adder

The adder takes inputs:

– A and B are two binary numbers

– (carry-in should be 0)

How do we feed numbers into the adder?

Random Access Memory (RAM)

A large memory unit which stores data before and after processing.

Random Access Memory

Memory cells are circuits which each hold a 1-bit value, grouped into 8-bit bytes.

Each byte of memory has a unique address corresponding to its location within the circuit, so that it can be located.


Memory

Memory Address

A physical location in the computer’s memory.

Addressability

The number of bits stored in each addressable location in memory. (A byte in our example.)

Word Size

The number of bits used in each memory address.

– This dictates how much physical memory can be addressed.

– Example: A 32-bit machine has 2³² = 4,294,967,296 possible memory addresses.
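A quick worked check of that example (assuming byte-addressable memory, as in the slides): the number of address bits determines how many bytes the machine can reach.

```python
# Each address names one byte, so a 32-bit address reaches 2**32 bytes.
address_bits = 32
num_addresses = 2 ** address_bits
print(num_addresses)                      # 4294967296
print(num_addresses // (2 ** 30), "GiB")  # 4 GiB of byte-addressable memory
```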


Putting it together

One issue we’ll need to deal with: how to specify which data to fetch from RAM.

Input to the adder now comes from RAM.


Putting it together

Here’s a more complete version, which takes another input for a memory address:


What about Subtraction?

2s complement

Recall that binary subtraction is accomplished by adding the 2s complement of a number.

Inverter

A circuit built using NOT gates, which inverts all bits, turning 1s into 0s and 0s into 1s.

– The inverter creates a 1s complement of its input.

– Adding 1 to this gives a 2s complement number, suitable for doing subtraction.

– (How could we add 1 to the inverted number?)
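A minimal sketch of the idea in Python, assuming 8-bit values (the mask and helper names are illustrative): the inverter produces the 1s complement, adding 1 produces the 2s complement, and feeding that into the adder performs subtraction.

```python
BITS = 8
MASK = (1 << BITS) - 1  # 0b11111111: keep results to 8 bits

def ones_complement(x):
    return ~x & MASK  # the inverter: flip every bit

def twos_complement(x):
    return (ones_complement(x) + 1) & MASK  # invert, then add 1

def subtract(a, b):
    """Compute a - b by adding the 2s complement of b."""
    return (a + twos_complement(b)) & MASK

print(twos_complement(5))  # 251, the 8-bit pattern for -5
print(subtract(9, 5))      # 4
```

One common answer to the question above is to feed a 1 into the adder’s carry-in (instead of 0) when subtracting, which supplies the extra 1 without a separate addition step.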


Putting it together

Using an inverter, we can do addition and subtraction.

Now we need a way to control the inverter.


From Adding Machine…

What we’ve got is a machine that can do addition/subtraction in circuitry.

It can read data from memory, and write data back to memory.

We haven’t dealt with how to:

– Specify from which memory address to read.

– Specify which operation to perform (add/subtract).

– Specify to which address to write.


… to Automatic Computer

We need a way to tell the machine to:

– Load some data from memory (by address) into the adder.

– Perform Add or Subtract

– Store data from the accumulator into the memory (by address)

What we need is a way to program it.


Computer Programming, 0.X

The earliest digital computers were programmed by wiring them up to perform some specific logic.

Pictured:

Harvard Mark I


Computer Programming, 0.X

Later, instructions were entered by flipping switches.

Pictured:

Digital Equipment PDP-8

Demo:

http://www.youtube.com/watch?v=DPioENtAHuY


Stored Program Computer

John von Neumann (a mathematician who worked on the atomic bomb)

– Described a computer architecture in which instructions are read from the memory space (RAM), just like data.

– This design enables programmability by making it relatively easy to provide new instructions to the computer’s hardware.
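A toy Python sketch of that idea (illustrative instruction names, not any real instruction set): the program and its data sit in the same memory, and a program counter steps through the instructions one at a time.

```python
# Toy stored-program machine: instructions and data share one memory.
memory = {
    0: ("LOAD", 100),   # accumulator <- memory[100]
    1: ("ADD", 101),    # accumulator <- accumulator + memory[101]
    2: ("STORE", 102),  # memory[102] <- accumulator
    3: ("HALT", None),
    100: 9, 101: 5, 102: 0,  # data lives in the same memory as the program
}

pc = 0           # program counter: address of the next instruction
accumulator = 0
while True:
    op, addr = memory[pc]  # fetch the instruction from memory, just like data
    pc += 1
    if op == "LOAD":
        accumulator = memory[addr]
    elif op == "ADD":
        accumulator += memory[addr]
    elif op == "STORE":
        memory[addr] = accumulator
    elif op == "HALT":
        break

print(memory[102])  # 14
```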


Stored Program Computer (Von Neumann Architecture)

Figure 5.1 The von Neumann architecture


Central Processing Unit

Central Processing Unit (CPU)

Refers to the combination of the Arithmetic/Logic Unit and the Control Unit.

The ALU performs basic arithmetic operations and logic operations. Examples:

– Arithmetic: addition, subtraction, multiplication

– Logical operations: AND, OR, NOT, XOR
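A small sketch of ALU-style operations in Python, assuming 8-bit values (illustrative only; a real ALU implements these in gates, not software):

```python
MASK = 0xFF  # keep results to 8 bits

def alu(op, a, b=0):
    """Apply one arithmetic or logic operation, in the spirit of an ALU."""
    if op == "ADD":
        return (a + b) & MASK
    if op == "SUB":
        return (a - b) & MASK
    if op == "AND":
        return a & b
    if op == "OR":
        return a | b
    if op == "XOR":
        return a ^ b
    if op == "NOT":
        return ~a & MASK
    raise ValueError("unknown operation: " + op)

print(alu("ADD", 9, 5))            # 14
print(alu("AND", 0b1100, 0b1010))  # 8 (0b1000)
```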


Central Processing Unit

Central Processing Unit (CPU)

Refers to the combination of the Arithmetic/Logic Unit and the Control Unit.

The control unit coordinates the flow of operations and data in the computer.

– Coordinates the ALU and memory

– Keeps track of which instruction to do in a Program Counter (PC)


Take-Away Points

– Adder

– Inverter

– Accumulator

– Instructions

– Von Neumann Architecture

– CPU: ALU/CU


Student To Dos

– Readings:

Reed ch 7, pp 121-134

– HW posted on course schedule.

Ask for help / visit TF hours.

– Wednesday Lab: Bring a headset or microphone if you have one. Most laptops have built-in microphones as well.

Warning: Homework assignments can be very time-consuming!

