CS 265. Computer Architecture. Wei Lu, Ph.D., P.Eng.


Part 4: Memory Organization Our goals: understand the basic types of memory in a computer; understand the memory hierarchy and the general process of accessing memory; know how memory is organized to store data; understand the basics of cache memory, including how main memory addresses are mapped into the cache; understand virtual memory

Part 4: Memory Organization Overview: introduction to memory; the memory hierarchy and how the computer accesses memory; how memory is organized to store data and programs; two memory technologies - cache memory - virtual memory

Cache Memory: Cache Mapping Schemes

A block (K words) of main memory can be placed in the cache according to one of three mapping schemes: direct mapping, fully associative mapping, or set associative mapping.

Why not direct mapped cache? Suppose a program generates a series of memory references such as 50, D0, 50, D0, 50, D0, ... where the two addresses map to the same cache block. The cache will continually evict and replace blocks. In this extreme case the cache miss rate is 100%, so the theoretical advantage offered by the cache is lost. This is the main disadvantage of direct mapped cache. Other cache mapping schemes, including fully associative cache and set associative cache, are designed to prevent this kind of thrashing.
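The thrashing above can be reproduced with a few lines of simulation. This is an illustrative sketch (function and variable names are my own), assuming the geometry used on the following slides: 8-bit addresses, 16 cache blocks, 8 bytes per block, so the address splits as tag (1 bit) | block (4 bits) | word (3 bits).

```python
# Minimal direct-mapped cache simulation (illustrative sketch).
# Geometry assumed: 8-bit addresses, 16 blocks, 8 bytes per block.

def simulate_direct_mapped(refs):
    cache = [None] * 16                 # one tag slot per cache block
    misses = 0
    for addr in refs:
        block = (addr >> 3) & 0xF       # 4-bit block index
        tag = addr >> 7                 # remaining high bit(s)
        if cache[block] != tag:         # miss: evict and refill
            misses += 1
            cache[block] = tag
    return misses

# 0x50 and 0xD0 share block index 10 but have different tags,
# so alternating references thrash: every access misses.
refs = [0x50, 0xD0] * 5
print(simulate_direct_mapped(refs), "misses out of", len(refs))  # 10 misses out of 10
```

Note that 0x50 = 0101 0000 and 0xD0 = 1101 0000 differ only in the tag bit, which is exactly why a direct mapped cache cannot hold both blocks at once.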

Fully associative mapped cache Instead of placing memory blocks in specific cache locations based on the memory address, we could allow a block to go anywhere in the cache. In this way, the cache would have to fill up completely before any blocks are evicted. This is how fully associative cache works. In a fully associative cache, a memory address is partitioned into only two fields: the tag and the word.

Fully associative mapped cache Suppose we have 8-bit memory addresses and a cache with 16 blocks, where each block contains 8 (i.e., 2^3) bytes. The word field needs 3 bits, leaving 8 - 3 = 5 bits for the tag, so the field format of a memory reference is: | Tag (5 bits) | Word (3 bits) |. When the cache is searched, all tags are compared in parallel to retrieve the data quickly. However, this requires very special, costly hardware to implement.
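The tag/word split above is just bit extraction; a minimal sketch (the helper name is my own) under the same 8-bit, 8-bytes-per-block assumptions:

```python
# Field split for a fully associative cache (illustrative sketch):
# an 8-bit address has only a 5-bit tag and a 3-bit word offset.
# The block may sit in ANY cache slot, so hardware compares all
# 16 stored tags against the address tag in parallel.

def split_fully_associative(addr):
    tag = addr >> 3         # upper 5 bits
    word = addr & 0b111     # lower 3 bits select the byte in the block
    return tag, word

print(split_fully_associative(0xD0))   # (26, 0): tag 0b11010, word 0
```

In software we would scan the tags one by one; the costly hardware mentioned above (an associative or content-addressable memory) is what makes the comparison happen in a single step.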

Set associative mapped cache Set associative cache combines the ideas of direct mapped cache and fully associative cache. An N-way set associative cache mapping is like a direct mapped cache in that a memory reference maps to a particular location in the cache. Unlike a direct mapped cache, however, the reference maps to a set of several (N) cache blocks, similar to the way a fully associative cache works. Instead of mapping anywhere in the entire cache, a memory reference can map only to the blocks within its set.

Set associative mapped cache The number of cache blocks per set in a set associative cache varies according to the overall system design. In a 2-way set associative cache, for example, each set contains two different memory blocks.

Set associative mapped cache In set associative cache mapping, a memory reference is divided into three fields: tag, set, and word. The set field determines the set to which the memory block maps. Suppose we have a main memory of 2^8 bytes, mapped to a 2-way set associative cache having 16 blocks, where each block contains 8 (i.e., 2^3) bytes. Since this is a 2-way cache, each set consists of 2 blocks, so there are 8 (i.e., 2^3) sets. Thus we need 3 bits for the set and 3 bits for the word, leaving 8 - 3 - 3 = 2 bits for the tag: | Tag (2 bits) | Set (3 bits) | Word (3 bits) |.
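The three-field split for this example can be sketched the same way (helper name is my own, geometry as given above: 8 sets, 8-byte blocks):

```python
# Field split for the 2-way set associative example (sketch):
# 8-bit address -> tag (2 bits) | set (3 bits) | word (3 bits).

def split_set_associative(addr):
    word = addr & 0b111              # bits 0-2: byte within the block
    set_index = (addr >> 3) & 0b111  # bits 3-5: which of the 8 sets
    tag = addr >> 6                  # bits 6-7: identifies the block
    return tag, set_index, word

print(split_set_associative(0xD0))   # (3, 2, 0)
print(split_set_associative(0x50))   # (1, 2, 0)
```

Notice that the thrashing pair 0x50 and 0xD0 from the direct mapped example now land in the same set (set 2) with different tags, so a 2-way cache can hold both blocks at once and the thrashing disappears.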

Replacement policy in cache memory In a direct mapped cache, each main memory block maps to exactly one cache block, so we have no choice and must replace that block. In a fully associative or set associative cache, a replacement policy is needed. The Least Recently Used (LRU) algorithm keeps track of the last time each block was accessed and evicts the block that has been unused for the longest period of time (e.g., in a 2-way set associative cache, which of the 2 blocks in the set is least recently used?). The First In First Out (FIFO) algorithm replaces the block that has been in the cache the longest, regardless of when it was last used. The random algorithm picks a block at random and replaces it with the new block.
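LRU and FIFO behave identically until a hit changes the recency order; a small sketch for a single 2-way set makes the difference concrete (function names and the reference string are my own illustrations):

```python
# LRU vs. FIFO for one 2-way set (illustrative sketch).
# Each function returns the miss count for a sequence of block tags.

def lru_misses(tags, ways=2):
    blocks = []                      # most recently used tag kept last
    misses = 0
    for t in tags:
        if t in blocks:
            blocks.remove(t)         # hit: refresh recency
        else:
            misses += 1
            if len(blocks) == ways:  # evict the least recently used
                blocks.pop(0)
        blocks.append(t)
    return misses

def fifo_misses(tags, ways=2):
    blocks = []                      # oldest-loaded tag kept first
    misses = 0
    for t in tags:
        if t not in blocks:          # hits do NOT change the order
            misses += 1
            if len(blocks) == ways:
                blocks.pop(0)        # evict the oldest resident
            blocks.append(t)
    return misses

seq = ['A', 'B', 'A', 'C', 'A', 'B']
print(lru_misses(seq), fifo_misses(seq))   # 4 5
```

On this sequence LRU keeps the frequently reused block A resident, while FIFO evicts A simply because it was loaded first, costing one extra miss.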

Summary on Cache Memory Cache memory gives faster access to main memory. A cache maps blocks of main memory to blocks of cache memory. There are three general types of cache mapping: direct mapped, fully associative, and set associative. Most of today's systems employ multilevel cache hierarchies. Level 1 cache (8KB to 64KB) is situated on the processor itself; access time is typically very fast, about 4 ns. Level 2 cache (64KB to 2MB) may be on the motherboard or on an expansion card; access time is usually around 15-20 ns. Level 3 cache (2MB to 256MB) is situated between the processor and main memory.

Thank you for your attendance. Any questions?