A Sophomoric Introduction to Shared-Memory Parallelism and Concurrency Lecture 4 Shared-Memory Concurrency & Mutual Exclusion


Dan Grossman
Last Updated: August 2010
For more information, see http://www.cs.washington.edu/homes/djg/teachingMaterials/

Toward sharing resources (memory)

We have been studying parallel algorithms using fork-join:
- Reduce span via parallel tasks
- The algorithms all had a very simple structure to avoid race conditions: each thread had memory only it accessed (example: an array sub-range)
- On fork, a thread loaned some of its memory to the forkee and did not access that memory again until after joining on the forkee

This strategy won't work well when:
- Memory accessed by threads is overlapping or unpredictable
- Threads are doing independent tasks needing access to the same resources (rather than implementing the same algorithm)

Concurrent Programming

Concurrency: allowing simultaneous or interleaved access to shared resources from multiple clients.

Requires coordination, particularly synchronization, to avoid incorrect simultaneous access:
- Make somebody block
- join is not what we want: block until another thread is done using what we need, not until it is completely done executing

Even correct concurrent applications are usually highly nondeterministic:
- How threads are scheduled affects what operations from other threads they see, and when
- Non-repeatability complicates testing and debugging

Examples

Multiple threads:
1. Processing different bank-account operations
   - What if 2 threads change the same account at the same time?
2. Using a shared cache (e.g., a hashtable) of recent files
   - What if 2 threads insert the same file at the same time?
3. Creating a pipeline (think assembly line) with a queue for handing work to the next thread in sequence
   - What if the enqueuer and dequeuer adjust a circular-array queue at the same time?

Why threads?

Unlike with parallelism, the point is not implementing algorithms faster. But threads are still useful for:
- Code structure for responsiveness
  - Example: respond to GUI events in one thread while another thread performs an expensive computation
- Processor utilization (masking I/O latency)
  - If one thread goes to disk, have something else to do
- Failure isolation
  - Convenient structure if you want to interleave multiple tasks and don't want an exception in one to stop the others

Sharing, again

It is common in concurrent programs that:
- Different threads might access the same resources in an unpredictable order, or even at about the same time
- Program correctness requires that simultaneous access be prevented using synchronization
- Simultaneous access is rare, which makes testing difficult

We must be much more disciplined when designing and implementing a concurrent program. We will discuss common idioms known to work.

Canonical example

Correct code in a single-threaded world:

  class BankAccount {
    private int balance = 0;
    int getBalance() { return balance; }
    void setBalance(int x) { balance = x; }
    void withdraw(int amount) {
      int b = getBalance();
      if(amount > b)
        throw new WithdrawTooLargeException();
      setBalance(b - amount);
    }
    // other operations like deposit, etc.
  }

Interleaving

Suppose:
- Thread T1 calls x.withdraw(100)
- Thread T2 calls y.withdraw(100)

If the second call starts before the first finishes, we say the calls interleave. This could happen even with one processor, since a thread can be pre-empted at any point for time-slicing.
- If x and y refer to different accounts, there is no problem: you cook in your kitchen while I cook in mine
- But if x and y alias, there is possible trouble

A bad interleaving

Interleaved withdraw(100) calls on the same account; assume an initial balance of 150. (Time flows downward.)

  Thread 1                         Thread 2
  --------                         --------
  int b = getBalance();
                                   int b = getBalance();
                                   if(amount > b)
                                     throw new ...;
                                   setBalance(b - amount);
  if(amount > b)
    throw new ...;
  setBalance(b - amount);

Lost withdraw; unhappy bank.
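The lost withdraw above can be reproduced deterministically. Below is a sketch of ours, not code from the lecture: UnsyncAccount mirrors the slide's unsynchronized BankAccount, and the CountDownLatch objects exist only to force exactly this schedule.

```java
import java.util.concurrent.CountDownLatch;

// Hypothetical unsynchronized account, mirroring the slide's BankAccount.
class UnsyncAccount {
    private int balance;
    UnsyncAccount(int b) { balance = b; }
    int getBalance() { return balance; }
    void setBalance(int x) { balance = x; }
}

class LostWithdrawDemo {
    // Forces the interleaving: T1 reads, T2 runs a whole withdraw, T1 finishes.
    static int run() throws InterruptedException {
        UnsyncAccount acct = new UnsyncAccount(150);
        CountDownLatch t1Read = new CountDownLatch(1); // T1 has read the balance
        CountDownLatch t2Done = new CountDownLatch(1); // T2 finished its withdraw

        Thread t1 = new Thread(() -> {
            int b = acct.getBalance();        // reads 150
            t1Read.countDown();               // let T2 run its whole withdraw
            try { t2Done.await(); } catch (InterruptedException e) { return; }
            acct.setBalance(b - 100);         // overwrites T2's update
        });
        Thread t2 = new Thread(() -> {
            try { t1Read.await(); } catch (InterruptedException e) { return; }
            int b = acct.getBalance();        // also reads 150
            acct.setBalance(b - 100);         // balance becomes 50
            t2Done.countDown();
        });
        t1.start(); t2.start();
        t1.join(); t2.join();
        return acct.getBalance();
    }

    public static void main(String[] args) throws InterruptedException {
        // Two withdrawals of 100 from 150: the second should have thrown,
        // but instead one withdraw is simply lost.
        System.out.println(run()); // prints 50
    }
}
```

The latches are stand-ins for the scheduler: in a real run this interleaving happens only occasionally, which is exactly why such bugs are hard to test for.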

Incorrect fix

It is tempting, and almost always wrong, to fix a bad interleaving by rearranging or repeating operations, such as:

  void withdraw(int amount) {
    if(amount > getBalance())
      throw new WithdrawTooLargeException();
    // maybe balance changed
    setBalance(getBalance() - amount);
  }

This fixes nothing!
- It narrows the problem by one statement
- (Not even that, since the compiler could turn it back into the old version because you didn't indicate the need to synchronize)
- And now a negative balance is possible. Why?
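To answer the closing question: with the rearranged code, both threads can pass the amount check before either subtracts, so the second subtraction starts from the already-reduced balance. The following is our own deterministic sketch, not the lecture's code; the Runnable hook exists only so the schedule can be forced.

```java
import java.util.concurrent.CountDownLatch;

// Hypothetical account using the "incorrect fix", with a pause point
// between the check and the update so a test can force the bad schedule.
class RecheckAccount {
    private int balance = 150;
    int getBalance() { return balance; }
    void setBalance(int x) { balance = x; }

    void withdraw(int amount, Runnable afterCheck) {
        if (amount > getBalance())
            throw new IllegalStateException("withdraw too large");
        afterCheck.run(); // another thread may run here
        setBalance(getBalance() - amount);
    }
}

class NegativeBalanceDemo {
    static int run() throws InterruptedException {
        RecheckAccount acct = new RecheckAccount();
        CountDownLatch t1Checked = new CountDownLatch(1);
        CountDownLatch t2Done = new CountDownLatch(1);
        Thread t1 = new Thread(() -> acct.withdraw(100, () -> {
            t1Checked.countDown();                  // passed the check at 150
            try { t2Done.await(); } catch (InterruptedException e) { }
        }));
        Thread t2 = new Thread(() -> {
            try { t1Checked.await(); } catch (InterruptedException e) { return; }
            acct.withdraw(100, () -> { });          // balance: 150 -> 50
            t2Done.countDown();
        });
        t1.start(); t2.start();
        t1.join(); t2.join();
        return acct.getBalance();                   // 50 - 100 = -50
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run()); // prints -50
    }
}
```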

Mutual exclusion

The sane fix: allow at most one thread to withdraw from account A at a time.
- Exclude other simultaneous operations on A too (e.g., deposit)

This is called mutual exclusion: one thread doing something with a resource (here: an account) means another thread must wait.
- a.k.a. critical sections, which technically have other requirements

The programmer must implement critical sections; the compiler has no idea what interleavings should or shouldn't be allowed in your program. But you need language primitives to do it!

Wrong!

Why can't we implement our own mutual-exclusion protocol? It's technically possible under certain assumptions, but it won't work in real languages anyway.

  class BankAccount {
    private int balance = 0;
    private boolean busy = false;
    void withdraw(int amount) {
      while(busy) { /* spin-wait */ }
      busy = true;
      int b = getBalance();
      if(amount > b)
        throw new WithdrawTooLargeException();
      setBalance(b - amount);
      busy = false;
    }
    // deposit would spin on the same boolean
  }

Still just moved the problem!

(Time flows downward: both threads see busy == false before either sets it to true.)

  Thread 1                         Thread 2
  --------                         --------
  while(busy) { }
                                   while(busy) { }
  busy = true;
                                   busy = true;
  int b = getBalance();
                                   int b = getBalance();
  if(amount > b)
    throw new ...;
  setBalance(b - amount);
                                   if(amount > b)
                                     throw new ...;
                                   setBalance(b - amount);

Lost withdraw; unhappy bank.

What we need

There are many ways out of this conundrum, but we need help from the language. One basic solution: locks.
- Not Java yet, though Java's approach is similar and slightly more convenient

A lock is an ADT with operations:
- new: make a new lock
- acquire: blocks if this lock is already currently held; once it is not held, makes the lock held
- release: makes this lock not held; if one or more threads are blocked on it, exactly one will acquire it

Why that works

An ADT with operations new, acquire, release. The lock implementation ensures that given simultaneous acquires and/or releases, a correct thing will happen.
- Example: given two simultaneous acquires, one will "win" and one will block

How can this be implemented?
- We need to check and update "all at once"
- This uses special hardware and O/S support; see a senior-level course in computer architecture or operating systems
- Here, we take this as a primitive and use it
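As a rough illustration of "check and update all at once", here is a minimal spin-lock sketch of ours (not the lecture's Lock, and not re-entrant) built on Java's AtomicBoolean, whose compareAndSet is backed by exactly the kind of hardware support mentioned above.

```java
import java.util.concurrent.atomic.AtomicBoolean;

// A sketch of a (non-reentrant) lock built on an atomic check-and-update
// primitive. Real locks also interact with the scheduler instead of spinning.
class SpinLock {
    private final AtomicBoolean held = new AtomicBoolean(false);

    void acquire() {
        // compareAndSet checks "not held" and sets "held" in one atomic step,
        // which is exactly what the busy-boolean version could not do.
        while (!held.compareAndSet(false, true)) {
            Thread.onSpinWait(); // hint to the runtime that we are spinning
        }
    }

    void release() {
        held.set(false);
    }
}
```

Unlike the busy flag on the previous slide, no interleaving lets two threads both observe "not held" and both proceed, because the observation and the update are a single indivisible step.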

Almost-correct pseudocode

  class BankAccount {
    private int balance = 0;
    private Lock lk = new Lock();
    void withdraw(int amount) {
      lk.acquire(); /* may block */
      int b = getBalance();
      if(amount > b)
        throw new WithdrawTooLargeException();
      setBalance(b - amount);
      lk.release();
    }
    // deposit would also acquire/release lk
  }

Some mistakes

A lock is a very primitive mechanism; it is still up to you to use it correctly to implement critical sections.
- Incorrect: using different locks for withdraw and deposit (mutual exclusion works only when using the same lock)
- Poor performance: using the same lock for every bank account (no simultaneous withdrawals from different accounts)
- Incorrect: forgetting to release a lock (blocks other threads forever!)

The previous slide is wrong because of the exception possibility!

  if(amount > b) {
    lk.release(); // hard to remember!
    throw new WithdrawTooLargeException();
  }

Other operations

If withdraw and deposit use the same lock, then simultaneous calls to these methods are properly synchronized. But what about getBalance and setBalance?
- Assume they're public, which may be reasonable
- If they do not acquire the same lock, then a race between setBalance and withdraw could produce a wrong result
- If they do acquire the same lock, then withdraw would block forever, because it tries to acquire a lock it already has

Re-acquiring locks?

  void setBalance1(int x) {
    balance = x;
  }
  void setBalance2(int x) {
    lk.acquire();
    balance = x;
    lk.release();
  }
  void withdraw(int amount) {
    lk.acquire();
    ...
    setBalanceX(b - amount);
    lk.release();
  }

- We can't let the outside world call setBalance1
- We can't have withdraw call setBalance2

Alternately, we can modify the meaning of the Lock ADT to support re-entrant locks. Java does this. Then we can just use setBalance2.

Re-entrant lock

A re-entrant lock (a.k.a. recursive lock) remembers:
- the thread (if any) that currently holds it
- a count

When the lock goes from not-held to held, the count is set to 0. If (code running in) the current holder calls acquire:
- it does not block
- it increments the count

On release:
- if the count is > 0, the count is decremented
- if the count is 0, the lock becomes not-held
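The behavior described above can be sketched directly. This is an illustrative implementation of ours, not Java's; it uses the object's own monitor only to protect the lock's internal fields.

```java
// Sketch of a re-entrant lock: remember the owning thread and a hold count.
// The count convention matches the description above: 0 while held once.
class ReentrantSketch {
    private Thread owner = null;
    private int count = 0;

    synchronized void acquire() throws InterruptedException {
        Thread me = Thread.currentThread();
        if (owner == me) { count++; return; } // current holder: don't block
        while (owner != null) wait();         // held by someone else: block
        owner = me;                           // not-held -> held, count 0
        count = 0;
    }

    synchronized void release() {
        if (owner != Thread.currentThread())
            throw new IllegalStateException("not the holder");
        if (count > 0) { count--; }           // still held by this thread
        else { owner = null; notify(); }      // count 0: becomes not-held
    }
}
```

With this semantics, withdraw can acquire the lock and then call a setBalance that acquires it again: the inner acquire just bumps the count instead of deadlocking.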

Now some Java

Java has built-in support for re-entrant locks, with several differences from our pseudocode. Focus on the synchronized statement:

  synchronized (expression) {
    statements
  }

1. Evaluates expression to an object
   - Every object (but not primitive types) is a lock in Java
2. Acquires the lock, blocking if necessary
   - If you get past the {, you have the lock
3. Releases the lock at the matching }
   - Even if control leaves due to throw, return, etc.
   - So it is impossible to forget to release the lock
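Point 3 can be checked directly: even when a synchronized block is exited by a throw, the lock is released. A small sketch of ours (the class and exception are made up for illustration):

```java
// Demonstrates that synchronized releases its lock even on a throw.
class ReleaseOnThrow {
    static final Object lk = new Object();

    static void mayThrow() {
        synchronized (lk) {
            throw new RuntimeException("leaving the block abruptly");
        } // the lock is released here even though we leave via throw
    }

    public static void main(String[] args) {
        try { mayThrow(); } catch (RuntimeException e) { /* expected */ }
        // Thread.holdsLock reports whether this thread holds lk's monitor:
        System.out.println(Thread.holdsLock(lk)); // prints false
    }
}
```

Contrast this with the explicit-lock pseudocode, where the throw before lk.release() leaked the lock forever.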

Java example (correct but non-idiomatic)

  class BankAccount {
    private int balance = 0;
    private Object lk = new Object();
    int getBalance() {
      synchronized (lk) { return balance; }
    }
    void setBalance(int x) {
      synchronized (lk) { balance = x; }
    }
    void withdraw(int amount) {
      synchronized (lk) {
        int b = getBalance();
        if(amount > b)
          throw ...
        setBalance(b - amount);
      }
    }
    // deposit would also use synchronized(lk)
  }

Improving the Java

As written, the lock is private.
- That might seem like a good idea
- But it also prevents code in other classes from writing operations that synchronize with the account operations

More idiomatic is to synchronize on this.

Java version #2

  class BankAccount {
    private int balance = 0;
    int getBalance() {
      synchronized (this) { return balance; }
    }
    void setBalance(int x) {
      synchronized (this) { balance = x; }
    }
    void withdraw(int amount) {
      synchronized (this) {
        int b = getBalance();
        if(amount > b)
          throw ...
        setBalance(b - amount);
      }
    }
    // deposit would also use synchronized(this)
  }

Syntactic sugar

Version #2 is slightly poor style because there is a shorter way to say the same thing: putting synchronized before a method declaration means the entire method body is surrounded by synchronized(this) { ... }. Therefore, version #3 (next slide) means exactly the same thing as version #2 but is more concise.

Java version #3 (final version)

  class BankAccount {
    private int balance = 0;
    synchronized int getBalance() { return balance; }
    synchronized void setBalance(int x) { balance = x; }
    synchronized void withdraw(int amount) {
      int b = getBalance();
      if(amount > b)
        throw ...
      setBalance(b - amount);
    }
    // deposit would also use synchronized
  }
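A usage sketch for version #3: with synchronized methods, concurrent updates are no longer lost. SyncAccount below is our own code mirroring the class above, with a deposit method added the way the closing comment suggests; note that deposit calling the synchronized getBalance/setBalance is fine precisely because Java's built-in locks are re-entrant.

```java
// Version #3 style account, plus a deposit method for the demo.
class SyncAccount {
    private int balance = 0;
    synchronized int getBalance() { return balance; }
    synchronized void setBalance(int x) { balance = x; }
    synchronized void deposit(int amount) {
        int b = getBalance();     // re-acquiring this's lock: re-entrant, no block
        setBalance(b + amount);
    }
}

class DepositDemo {
    public static void main(String[] args) throws InterruptedException {
        SyncAccount acct = new SyncAccount();
        Thread[] ts = new Thread[4];
        for (int i = 0; i < ts.length; i++) {
            ts[i] = new Thread(() -> {
                for (int j = 0; j < 1000; j++) acct.deposit(1);
            });
            ts[i].start();
        }
        for (Thread t : ts) t.join();
        System.out.println(acct.getBalance()); // prints 4000: no lost updates
    }
}
```

Removing the synchronized keywords turns deposit back into a read-modify-write race, and the final balance would usually come out below 4000.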

More Java notes

- The class java.util.concurrent.locks.ReentrantLock works much more like our pseudocode
  - Often used with try { ... } finally { ... } to avoid forgetting to release the lock if there's an exception
- There is also library and/or language support for readers/writer locks and condition variables (upcoming lectures)
- Java provides many other features and details. See, for example:
  - Chapter 14 of Core Java, Volume 1 [Horstmann/Cornell]
  - Java Concurrency in Practice [Goetz et al.]
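The ReentrantLock + try/finally idiom mentioned above might look like this; LockedAccount is our own name, and IllegalArgumentException stands in for the lecture's WithdrawTooLargeException.

```java
import java.util.concurrent.locks.ReentrantLock;

// Explicit-lock version of the account, in the try/finally idiom.
class LockedAccount {
    private int balance = 0;
    private final ReentrantLock lk = new ReentrantLock();

    void withdraw(int amount) {
        lk.lock();                 // like acquire in the pseudocode
        try {
            int b = balance;
            if (amount > b)
                throw new IllegalArgumentException("withdraw too large");
            balance = b - amount;
        } finally {
            lk.unlock();           // runs even if we left via the throw
        }
    }

    int getBalance() {
        lk.lock();
        try { return balance; } finally { lk.unlock(); }
    }
}
```

The finally block plays the role that the closing } plays for synchronized: the lock cannot leak past an exception, so the "hard to remember" release on the earlier slide becomes automatic.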