Today's agenda -- ECS15, Lecture 10: Mini-Review & Topic 3.2 Software
- Review the lectures. Sample midterm to be posted late today/tonight.
- Extra credit (1 pt): turn in Monday 9:30am.
- Finish up details on Topic 3: The Computer (Topic 3.1 Hardware, 3.2 Software).
- Question: Do students want lecture notes 2 slides to a page? (Currently in 4-slides-to-a-page format.)

Goals of this course
- Understand how a computer works: 1) input (a string of letters or numbers, a picture, a waveform, etc.); 2) program (an algorithm that transforms the data); 3) output (a human-useful format: string, picture, etc.).
- Become computationally aware and literate in software, programming basics, and internet workings.
- Learn the process of writing a scientifically sound and rigorous paper.

Mini-Review: Outline
High-level view of a computer:
- Digital input
- Logical operations on digital data via hardware, software, memory storage, and the data network
- Digital output
Growth of computers: Moore's law (exponential growth)
Topic 0: Growth of computational power
Moore's Law: the number of transistors on a chip doubles approximately every two years.
Note: the x-axis of the Moore's Law plot is linear, the y-axis is logarithmic.
Counting in base 10: log(10) = 1, log(100) = 2, log(1000) = 3. (What base do computers count in?)

The growth of computation and information
- In your lifetime, the number of operations per second will exceed the capacity of all human brains combined!
- In your lifetime it is possible that the amount of data generated each day will exceed the amount of data ever written previously. (We will soon reach the point where the content on the WWW exceeds all the content ever written.)
(http://en.wikipedia.org/wiki/accelerating_change)

Topic 1: Digital data (Lec 3)
- Decimal / binary / hexadecimal / ASCII, and conversion between them (algorithms)
- Adding two binary numbers
- Biggest number stored in a decimal/binary/hexadecimal number of length n
- Definition of bits, bytes, kilobytes, megabytes, gigabytes, terabytes

Digital sound (Lec 4)
- Sampling (Nyquist frequency)
- Discretization (how many distinct levels to record)
- E.g., how many bytes to store 10 minutes of music?

Topic 2.1: Boolean logic (Lec 5)
- Proposition (denoted P or Q): a declarative statement with a True/False value.
- Operators: NOT, AND, OR, NAND, NOR. (Note: NOT is an operator on a single, individual proposition; the rest are compound operators on two propositions.)
- Truth tables
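The Topic 1 conversions and the music-storage estimate can be checked in Python. This is a review sketch, not lecture code; it uses the built-ins bin(), hex(), and int(), and the CD-quality parameters (44,100 samples/s, 2 bytes per sample, 2 channels) are an assumption for the sake of the example.

```python
# Conversions between decimal, binary, and hexadecimal (Lec 3):
n = 202
print(bin(n))              # binary:      0b11001010
print(hex(n))              # hexadecimal: 0xca
print(int("11001010", 2))  # back to decimal: 202

# Adding two binary numbers:
a = int("1011", 2)         # 11
b = int("0110", 2)         # 6
print(bin(a + b))          # 0b10001  (17)

# Biggest number storable in n digits of a given base is base**n - 1:
n_digits = 4
for base, name in [(10, "decimal"), (2, "binary"), (16, "hexadecimal")]:
    print(name, base**n_digits - 1)

# Bytes to store 10 minutes of music (assumed CD-quality parameters:
# 44,100 samples/s, 2 bytes per sample, 2 channels):
bytes_needed = 44_100 * 2 * 2 * 10 * 60
print(bytes_needed / 1e6, "MB")   # roughly 105.8 MB
```

The last calculation is the pattern to use for the "10 minutes of music" exercise: samples per second, times bytes per sample, times channels, times seconds.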
Truth tables

A  B | A OR B | A AND B | A NOR B | A NAND B
0  0 |   0    |    0    |    1    |    1
0  1 |   1    |    0    |    0    |    1
1  0 |   1    |    0    |    0    |    1
1  1 |   1    |    1    |    0    |    0

Implication P -> Q (Lec 5)
- It means: if P=T then Q=T. (P is sufficient for Q; there could be many things that lead to Q=T, and P is one of them, but not necessarily the only one.)
- It also means: if Q=F then P=F, i.e., if not(Q)=T then not(P)=T. (Q is necessary for P: if Q does not happen, so not(Q)=T, then we know P did not happen, so not(P)=T.)
- Remember: a logically valid argument need not be reasonable. E.g., "If 1+2=5 then every day is Sunday." Here P = "1+2=5" = F, and if P=F, then P -> Q is always a logically valid statement. (P -> Q is invalid only when P=T and Q=F.)

Topic 2.2: Digital logic (Lec 5) -- switches and transistors
- The most basic switch is an inverter (i.e., not(P)).
- Combine switches to make NOT, AND, NAND, OR, NOR gates.
- Parallel ("OR") versus serial/sequential ("AND") wiring of switches (see Lec 7, slide 7).

Topic 3: The computer
- 3.1 Hardware: CPU, Memory, BIOS, Operating System
- 3.2 Software

Overarching topic: Writing a research paper
- When do you need to cite sources
- What is plagiarism
- Purpose of an abstract

Topics from labs:
- Simple HTML tags
- Tables of content

Miscellaneous topics:
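The truth table above, and the rule that P -> Q fails only when P=T and Q=F, can be checked mechanically. A small Python sketch (not from the lecture) that enumerates every input combination, using 1 for True and 0 for False, and the fact that P -> Q is equivalent to not(P) OR Q:

```python
# Print the compound-operator truth table from the slides.
print("A B | OR AND NOR NAND | A->B")
for A in (0, 1):
    for B in (0, 1):
        OR   = A | B
        AND  = A & B
        NOR  = 1 - OR        # NOT of OR
        NAND = 1 - AND       # NOT of AND
        IMP  = (1 - A) | B   # P -> Q  is equivalent to  not(P) OR Q
        print(A, B, "|", OR, " ", AND, " ", NOR, "  ", NAND, " |", IMP)
```

Reading the last column: the implication is 0 (invalid) only on the row A=1, B=0, exactly as stated above.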
Topic 3: Computer Layers
- Hardware (motherboard / keyboard / monitor / mouse / etc.)
- BIOS
- Operating System
- Software

The motherboard: backbone of the computer
- Connects the CPU, keyboard, mouse, monitor, hard drive, video/sound/internet cards, and power supply.
- Slots for memory (RAM) and for the CPU; power supply connector; hard drive connectors.
- Input/Output: keyboard, mouse; extension cards: video, sound, internet.
- Remember the difference between RAM and ROM!!

The Central Processing Unit (CPU)
The CPU consists of three parts:
- the Arithmetic Logic Unit (ALU)
- the Control Unit
- Memory
The CPU performs Fetch/Execute cycles, moving data from input, through the ALU, control unit, and memory, to output.

BIOS: Basic Input/Output System
BIOS refers to the firmware code, usually stored in PROM, EPROM, or flash memory, that is run by a computer when it is first powered on. BIOS performs three major tasks:
- First, the Power On Self Tests (POST) are conducted. These tests verify that the hardware system is operating correctly.
- Second, the BIOS initializes the different hardware components of the system, scanning their own ROM or PROM.
- Third, the BIOS initiates the boot process. The BIOS looks for boot information contained in a file called the master boot record (MBR) at the first sector of the disk. Once an acceptable boot record is found, the operating system is loaded and takes over control of the computer.
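The Fetch/Execute cycle can be illustrated with a toy simulation. This is purely a sketch: a real CPU fetches binary machine instructions, not Python tuples, and the instruction names here (LOAD, ADD, STORE, HALT) are invented for the example.

```python
# Toy fetch/execute loop: the "program" sits in memory; the control
# unit fetches the instruction at the program counter, then the ALU
# executes it on the accumulator register.
memory = [
    ("LOAD", 7),      # put 7 in the accumulator
    ("ADD", 5),       # accumulator += 5
    ("STORE", None),  # write the accumulator to output
    ("HALT", None),
]

pc = 0        # program counter: address of the next instruction
acc = 0       # accumulator register
output = []

while True:
    op, arg = memory[pc]   # FETCH the instruction, then advance
    pc += 1
    if op == "LOAD":       # EXECUTE it
        acc = arg
    elif op == "ADD":
        acc = acc + arg
    elif op == "STORE":
        output.append(acc)
    elif op == "HALT":
        break

print(output)  # [12]
```

Each pass through the loop is one fetch/execute cycle; the program counter plays the role of the control unit's bookkeeping.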
The operating system
- manages the sharing of resources and memory
- coordinates all input/output devices
- provides a software platform on top of which application programs can run

Operating systems can be classified as follows:
- multi-user: allows two or more users to run programs at the same time
- multiprocessing: supports running a program on more than one CPU
- multitasking: allows more than one program to run concurrently
- multithreading: allows different parts of a single program to run concurrently

Memory management
Current computers organize memory resources hierarchically: registers, CPU cache, RAM, and disks. The virtual memory manager coordinates the use of these resources by tracking which are available, which are to be allocated or deallocated, and how to move data between them. If running processes require significantly more RAM than is available, the system may start thrashing.
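As a concrete illustration of multithreading -- different parts of a single program running concurrently -- here is a minimal sketch using Python's standard threading module (an example for this review, not lecture code):

```python
import threading

# Two parts of one program run concurrently: each thread sums half
# of the data, then the main thread combines the partial results.
data = list(range(1, 101))
results = {}

def partial_sum(name, chunk):
    results[name] = sum(chunk)   # each thread fills in its own slot

t1 = threading.Thread(target=partial_sum, args=("first", data[:50]))
t2 = threading.Thread(target=partial_sum, args=("second", data[50:]))
t1.start(); t2.start()           # both threads now run concurrently
t1.join(); t2.join()             # wait for both to finish

print(results["first"] + results["second"])  # 5050
```

Note the distinction from the list above: these threads share one program and one interpreter; using more than one CPU for a program is multiprocessing, a separate classification.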
Application Software
Application software is a class of computer software that uses the capabilities of a computer to perform a task that the user wishes to carry out. The term "application" refers to both the application software and its implementation. Typical examples of software applications are word processors, spreadsheets, and media players.

Software: programming languages
A programming language is an artificial language that can be used to control the behavior of a machine, particularly a computer. Programming languages are used to facilitate communication about the task of organizing and manipulating information, and to express algorithms precisely.

(An algorithm is a list of well-defined instructions for completing a task: given an initial state, it proceeds through a well-defined series of successive states, eventually terminating in an end-state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as probabilistic algorithms, incorporate randomness.)

There are many, many programming languages, and new ones appear every year.
(http://www.engin.umd.umich.edu/cis/course.des/cis400/)
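To make the "series of successive states" concrete, here is a classic example not taken from the slides -- Euclid's algorithm for the greatest common divisor -- where each loop iteration is one well-defined state transition, plus a tiny probabilistic counterpart whose path through states depends on randomness:

```python
import random

def gcd(a, b):
    # The state is the pair (a, b); each iteration is one transition.
    # The algorithm terminates in the end-state (gcd, 0).
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 36))  # 12

def roll_until_six():
    # A probabilistic algorithm: the sequence of states differs from
    # run to run, even though the stopping rule is well-defined.
    count = 0
    while random.randint(1, 6) != 6:
        count += 1
    return count    # number of non-six rolls before the first six

print(roll_until_six())
```

For gcd(48, 36) the states are (48, 36) -> (36, 12) -> (12, 0): three transitions, deterministic every time; the dice-rolling example reaches its end-state after an unpredictable number of transitions.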
Three main levels of programming languages (above the hardware):
- Machine languages: the "ones and zeroes" that processors use as instructions. Give the processor one pattern of bits (such as 11001001) and it will add two numbers; give it a different pattern (11001010) and it will instead subtract one from the other.
- Assembly languages: as close as you can come to writing in machine language, but with the advantage of being human-readable, using a small vocabulary of words of one syllable, such as: MOV A, B
- High-level languages: a vocabulary and set of grammatical rules for instructing a computer to perform specific tasks. Each language has its own set of keywords and its own syntax.

Interpret or compile?
Regardless of what language you use, you eventually need to convert your program into machine language so that the computer can understand it. There are two ways to do this:
- interpret the program through an interpreter (one line at a time)
- compile the program through a compiler
The main disadvantage of interpreters is that an interpreted program runs slower than if it had been compiled.

Interpreters
An interpreter is a program that either:
- executes the source code directly (type I)
- translates source code into some efficient intermediate representation and immediately executes this (type II)
- is invoked to explicitly execute stored precompiled code made by a compiler that is part of the interpreter system (type III)

[Figure: compilation pipeline -- several source-code files are each compiled into object files, which are then linked together with the runtime library into a single executable program.]
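Python itself is an example of a type II interpreter: it first translates source code into an efficient intermediate representation (bytecode) and then executes that. You can peek at this representation with the standard dis module:

```python
import dis

def add(a, b):
    return a + b

# Show the intermediate representation that the Python interpreter
# actually executes; it resembles a tiny assembly language.
dis.dis(add)
# The listing contains instructions such as LOAD_FAST and a binary-add
# instruction (the exact opcode names vary between Python versions):
# one high-level line becomes several lower-level instructions.
```

This also illustrates the three language levels above in miniature: `return a + b` is high-level source, the dis listing is assembly-like, and the interpreter's internal byte patterns play the role of machine code.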
Compilers
A compiler is a program that translates source code into object code. The compiler derives its name from the way it works: it looks at the entire source code, collecting and reorganizing the instructions. Thus, a compiler differs from an interpreter, which analyzes and executes each line of source code in succession, without analyzing the entire program first.

Examples
Interpreted languages:
- Perl, Python, Matlab (type II)
- Java (type III)
Compiled languages:
- Fortran
- C, C++
- Pascal
- Basic
- Cobol
- ADA

Getting started with programming in Python
- Make sure you have Python on your personal computer, or use Python on the machines in the computer lab. (Note the difference between source-code and binary distributions.)
- We will use an IDE (Integrated Development Environment), namely IDLE.
- Read the Python Primer, Introduction documentation.
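Once Python and IDLE are installed, a first program might look like the sketch below. This is a generic warm-up (the actual first exercise is in the Python Primer); it follows the input/program/output pattern from the course goals:

```python
# A first Python program: a little data, a little computation,
# and human-readable output.
name = "ECS15"
bits_in_a_byte = 8

print("Hello from", name)
print("One kilobyte is", 1024 * bits_in_a_byte, "bits")
```

Type it into a new IDLE editor window, save it, and run it with F5 to see the two printed lines in the Python shell.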