CSE332: Data Abstractions Lecture 22: Shared-Memory Concurrency and Mutual Exclusion. Tyler Robison Summer 2010



Toward sharing resources (memory)

So far we've looked at parallel algorithms using fork-join. ForkJoin algorithms all had a very simple structure that avoided race conditions: each thread had memory only it accessed (example: an array sub-range), the array variable itself was treated as read-only in the parallel portion, and the result of a forked task was not accessed until after join() was called. So the structure (mostly) ensured that bad simultaneous accesses wouldn't occur.

That strategy won't work well when the memory accessed by threads is overlapping or unpredictable, or when threads are doing independent tasks that need access to the same resources (rather than implementing the same algorithm). We'll need to coordinate resources for them to be of use.

What could go wrong?

Imagine 2 threads, running at the same time, both with access to a shared linked-list-based queue (initially empty):

  enqueue(x) {
    if(back == null) {
      back = new Node(x);
      front = back;
    } else {
      back.next = new Node(x);
      back = back.next;
    }
  }

Each thread has its own program counter (and stack, etc.). The queue is shared, so both threads indirectly use the same front and back (which is the whole point of sharing the queue). We have no guarantee what happens first between different threads; they can (and will) arbitrarily interrupt each other. Many things can go wrong: say one thread tries to enqueue a, the other b, and both verify that back is null before either sets back. Result: one assignment of back will be forgotten. In general, any interleaving of results is possible if enqueue is called at the same time by both threads.
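To make the lost-enqueue scenario concrete, here is a small sketch (not from the slides) that replays that exact bad interleaving deterministically in one thread: both "threads" evaluate back == null before either assigns back, so one node is lost. The class and method names are ours, chosen for illustration.

```java
// Replays the bad interleaving deterministically: both "threads" check
// back == null before either assigns it, so one enqueue is forgotten.
public class LostEnqueueDemo {
    static class Node {
        char data; Node next;
        Node(char d) { data = d; }
    }
    static Node front, back;

    public static int replayBadInterleaving() {
        front = back = null;
        // Both threads evaluate the if-condition first...
        boolean t1SawEmpty = (back == null);   // T1: back == null -> true
        boolean t2SawEmpty = (back == null);   // T2: back == null -> true
        // ...then each runs the "empty queue" branch, clobbering the other.
        if (t1SawEmpty) { back = new Node('a'); front = back; }  // T1 enqueues 'a'
        if (t2SawEmpty) { back = new Node('b'); front = back; }  // T2 enqueues 'b'
        // Count the nodes actually reachable from front.
        int count = 0;
        for (Node n = front; n != null; n = n.next) count++;
        return count;  // 1, not 2: one enqueue was lost
    }

    public static void main(String[] args) {
        System.out.println("queue length after two enqueues: " + replayBadInterleaving());
    }
}
```

In a real program the scheduler decides whether this interleaving occurs on any given run; the simulation just pins down one of the bad ones.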

Concurrent Programming

Concurrency: allowing simultaneous or interleaved access to shared resources from multiple clients. This requires coordination, particularly synchronization, to avoid incorrect simultaneous access: make somebody block (wait) until the resource is free. join isn't going to work here; we want to block until another thread is done using what we need, not until it is completely done executing.

Even correct concurrent applications are usually highly non-deterministic: how threads are scheduled affects which operations from other threads they see, and when. This non-repeatability complicates testing and debugging.

Why threads?

Threads are not always used to increase performance (though they can be). They are also used for:

Code structure for responsiveness. Example: respond to GUI events in one thread while another thread performs an expensive computation.

Failure isolation. A convenient structure if you want to interleave multiple tasks and don't want an exception in one to stop the others.

Canonical example

Simple code for a bank account, correct in a single-threaded world:

  class BankAccount {
    private int balance = 0;
    int getBalance() { return balance; }
    void setBalance(int x) { balance = x; }
    void withdraw(int amount) {
      int b = getBalance();
      if(amount > b)
        throw new WithdrawTooLargeException();
      setBalance(b - amount);
    }
    // other operations like deposit, etc.
  }

Interleaving

Suppose we have 2 threads, T1 and T2: thread T1 calls x.withdraw(100) and thread T2 calls y.withdraw(100). If the second call starts before the first finishes, we say the calls interleave. This could happen even with one processor, since a thread can be pre-empted at any point for time-slicing: T1 runs for 50 ms, pauses somewhere, and T2 picks up for 50 ms.

If x and y refer to different accounts, there is no problem (you cook in your kitchen while I cook in mine). But if x and y alias, weird things can occur.

A bad interleaving

Imagine two interleaved withdraw(100) calls on the same account, with an initial balance of 150. From the code we saw before, this should cause a WithdrawTooLargeException. Over time:

  Thread 1: int b = getBalance();
  Thread 2:                           int b = getBalance();
  Thread 2:                           if(amount > b) throw new ...;
  Thread 2:                           setBalance(b - amount);
  Thread 1: if(amount > b) throw new ...;
  Thread 1: setBalance(b - amount);

Instead of an exception, we have a lost withdraw. But if we had written if(amount > getBalance()) instead, this wouldn't have happened, right?
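As with the queue, we can replay this particular interleaving deterministically in a single thread. This is a sketch (not from the slides): the stale local b1 plays the role of Thread 1's b, and IllegalStateException stands in for WithdrawTooLargeException.

```java
// Replays the bad interleaving above: Thread 1 reads the balance, Thread 2
// runs a complete withdraw(100), then Thread 1 finishes with its stale b.
public class LostWithdrawDemo {
    static int balance;

    public static int replay() {
        balance = 150;
        int amount = 100;
        int b1 = balance;            // Thread 1: int b = getBalance();  (150)
        // --- Thread 2 runs its whole withdraw(100) here ---
        int b2 = balance;            // Thread 2: int b = getBalance();  (150)
        if (amount > b2) throw new IllegalStateException();
        balance = b2 - amount;       // Thread 2: setBalance(b - amount); -> 50
        // --- Thread 1 resumes, using its stale b1 ---
        if (amount > b1) throw new IllegalStateException();  // 100 > 150? no
        balance = b1 - amount;       // Thread 1: setBalance(b - amount); -> 50
        return balance;              // 50: two withdrawals, but only one deducted
    }

    public static void main(String[] args) {
        System.out.println("final balance: " + replay());
    }
}
```

Two withdrawals of 100 from a balance of 150 both "succeed," no exception is thrown, and the final balance is 50 rather than the exception the single-threaded code promised.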

Incorrect fix

It is tempting, and almost always wrong, to fix a bad interleaving by rearranging or repeating operations, such as:

  void withdraw(int amount) {
    if(amount > getBalance())
      throw new WithdrawTooLargeException();
    // maybe balance changed
    setBalance(getBalance() - amount);
  }

This fixes nothing! It narrows the problem window by one statement (and not even that, since the compiler could turn it back into the old version because you didn't indicate a need to synchronize). And now a negative balance is possible. Why?

Mutual exclusion

The sane fix: allow at most one thread to withdraw from account A at a time, and exclude other simultaneous operations on A too (e.g., deposit), since other combinations of simultaneous operations on balance could also break things. This "one at a time" idea is embodied in mutual exclusion.

Mutual exclusion: one thread doing something with a resource (here: an account) means another thread must wait. We define critical sections: areas of code that are mutually exclusive. The programmer (that is, you) must identify the critical sections; the compiler has no idea which interleavings should or shouldn't be allowed in your program. But you need language primitives to do it! As with Thread's start() and join(), you can't implement these yourself in Java.

Wrong!

Why can't we implement our own mutual-exclusion protocol? Say we tried to coordinate it ourselves, using a busy flag:

  class BankAccount {
    private int balance = 0;
    private boolean busy = false;
    void withdraw(int amount) {
      while(busy) { /* spin-wait */ }
      busy = true;
      int b = getBalance();
      if(amount > b)
        throw new WithdrawTooLargeException();
      setBalance(b - amount);
      busy = false;
    }
    // deposit would spin on the same boolean
  }

Still just moved the problem!

  Thread 1: while(busy) { }
  Thread 2:                           while(busy) { }
  Thread 1: busy = true;
  Thread 2:                           busy = true;
  Thread 1: int b = getBalance();
  Thread 2:                           int b = getBalance();
  Thread 2:                           if(amount > b) throw new ...;
  Thread 2:                           setBalance(b - amount);
  Thread 1: if(amount > b) throw new ...;
  Thread 1: setBalance(b - amount);

Lost withdraw; unhappy bank. Time elapses between checking busy and setting busy, and a thread can be interrupted there.
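The problem with the busy flag is that the check and the set are two separate steps. As an aside not in the slides: Java's AtomicBoolean exposes exactly the indivisible check-and-set the next slide says we need, so a (naive, spinning) flag can be built correctly with it. This is a sketch; the class name SpinFlag is ours.

```java
import java.util.concurrent.atomic.AtomicBoolean;

// compareAndSet(false, true) checks that the flag is false and sets it to
// true in one indivisible hardware operation, so two threads can never both
// get past acquire() at once. Spinning is still wasteful; real locks block.
public class SpinFlag {
    private final AtomicBoolean busy = new AtomicBoolean(false);

    public void acquire() {
        // Atomically: if busy == false, set it to true and succeed.
        while (!busy.compareAndSet(false, true)) { /* spin-wait */ }
    }

    public void release() {
        busy.set(false);
    }

    public boolean isHeld() {
        return busy.get();
    }
}
```

withdraw and deposit would call acquire() before touching the balance and release() afterwards; unlike the plain boolean, no interleaving lets two threads both "win" the flag.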

What we need

To resolve this issue, we'll need help from the language. One basic solution: locks. Still at a conceptual level for the moment: Lock is not a Java class. It is an ADT with these operations:

new: make a new lock.

acquire: if the lock is not held, make it held; block if the lock is already held. The checking and setting happen together and cannot be interrupted, which fixes the problem we saw before.

release: make the lock not held. If multiple threads are blocked on it, exactly one will acquire it.

Why that works

Lock is an ADT with operations new, acquire, and release. The lock implementation ensures that, given simultaneous acquires and/or releases, a correct thing will happen. Example: given two simultaneous acquires, one will win and one will block.

How can this be implemented? We need to check and update all at once, which requires special hardware and O/S support: see CSE471 and CSE451. In CSE332, we take it as a primitive and use it.

Almost-correct pseudocode

Note: Lock is not an actual Java class.

  class BankAccount {
    private int balance = 0;
    private Lock lk = new Lock();
    void withdraw(int amount) {
      lk.acquire(); /* may block */
      int b = getBalance();
      if(amount > b)
        throw new WithdrawTooLargeException();
      setBalance(b - amount);
      lk.release();
    }
    // deposit would also acquire/release lk
  }

There is one problem with this code...

Some potential Lock mistakes

A lock is a very primitive mechanism; it is still up to you to use it correctly to implement critical sections. Lots of little things can go wrong and completely break your program.

Incorrect: forgetting to release a lock (blocks other threads forever!). The previous slide is wrong because of the exception possibility! We would need:

  if(amount > b) {
    lk.release(); // hard to remember!
    throw new WithdrawTooLargeException();
  }

Incorrect: using different locks for withdraw and deposit. Mutual exclusion works only when threads use the same lock. With one lock for each, we could have a simultaneous withdraw and deposit, which could still break things.

Poor performance: using the same lock for every bank account. Then no simultaneous withdrawals from different accounts are possible.

Other operations

If withdraw and deposit use the same lock (and use it correctly), then simultaneous calls to these methods are properly synchronized. But what about getBalance and setBalance? Assume they're public, which may be reasonable. If they don't acquire the same lock, then a race between setBalance and withdraw could produce a wrong result. If they do acquire the same lock, then withdraw would block forever, because it tries to acquire a lock it already holds.

One (not very good) possibility

  void setBalance1(int x) { balance = x; }
  void setBalance2(int x) {
    lk.acquire();
    balance = x;
    lk.release();
  }
  void withdraw(int amount) {
    lk.acquire();
    ...
    setBalance1(b - amount);
    lk.release();
  }

We can't let the outside world call setBalance1, and we can't have withdraw call setBalance2. This could work (if adhered to), but it is not good style and not very convenient. Alternatively, we can modify the meaning of the Lock ADT to support re-entrant locks. Java does this.

Re-entrant lock

A re-entrant lock (a.k.a. recursive lock) works like this: once acquired, the lock is held by a thread, and subsequent calls to acquire in that thread won't block. The lock remembers the thread (if any) that currently holds it, plus a count. When the lock goes from not-held to held, the count is set to 0. If code in the holding thread calls acquire, it does not block; it increments the count. On release: if the count is greater than 0, the count is decremented; if the count is 0, the lock becomes not-held.

Result: withdraw can acquire the lock and then call setBalance, which can also acquire the lock. Because they're in the same thread and it's a re-entrant lock, the inner acquire won't block.
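The owner-plus-count bookkeeping described above can be sketched in a few lines. To be clear about assumptions: this is only the bookkeeping, not a real lock. A real re-entrant lock would need the atomic check-and-set and blocking machinery from earlier slides; the class name ReentrantSketch is ours.

```java
// Single-threaded sketch of the re-entrant lock's owner/count logic.
// NOT itself thread-safe: the owner check and assignment are not atomic.
public class ReentrantSketch {
    private Thread owner = null;  // thread holding the lock, or null
    private int count = 0;        // acquires beyond the first by the owner

    public void acquire() {
        Thread me = Thread.currentThread();
        if (owner == me) {        // re-entrant case: same thread, don't block
            count++;
            return;
        }
        while (owner != null) { /* spin-wait (a real lock would block) */ }
        owner = me;               // not-held -> held, count starts at 0
        count = 0;
    }

    public void release() {
        if (count > 0) {
            count--;              // matches an inner (nested) acquire
        } else {
            owner = null;         // count == 0: lock becomes not-held
        }
    }

    public boolean heldByCurrentThread() {
        return owner == Thread.currentThread();
    }
}
```

This is exactly why withdraw can call setBalance without deadlocking: the inner acquire just bumps the count, and the matching inner release only decrements it.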

Java's Re-entrant Lock

java.util.concurrent.locks.ReentrantLock has methods lock() and unlock(). As described above, it is conceptually owned by a thread and shared within that thread. It is important to guarantee that the lock is always released; something like this is recommended:

  lock.lock();
  try {
    // method body
  } finally {
    lock.unlock();
  }

Regardless of what happens in the try block, the code in finally will execute afterwards.

Synchronized: A Java convenience

Java has built-in support for re-entrant locks. You can use the synchronized statement as an alternative to declaring a ReentrantLock:

  synchronized (expression) {
    statements
  }

1. Evaluates expression to an object and uses it as a lock. Every object (but not primitive values) is a lock in Java.
2. Acquires the lock, blocking if necessary. If you get past the {, you have the lock.
3. Releases the lock at the matching }, even if control leaves due to a throw, return, etc. So it is impossible to forget to release the lock.

Example of Java's synchronized

  class BankAccount {
    private int balance = 0;
    private Object lk = new Object();
    int getBalance() {
      synchronized (lk) { return balance; }
    }
    void setBalance(int x) {
      synchronized (lk) { balance = x; }
    }
    void withdraw(int amount) {
      synchronized (lk) {
        int b = getBalance();
        if(amount > b)
          throw new WithdrawTooLargeException();
        setBalance(b - amount);
      }
    }
    // deposit would also use synchronized(lk)
  }

Improving the Java

As written, the lock is private. That might seem like a good idea, but it also prevents code in other classes from writing operations that synchronize with the account operations. It is more common to synchronize on this, which is also convenient: there is no need to declare an extra object.

Java version #2

  class BankAccount {
    private int balance = 0;
    int getBalance() {
      synchronized (this) { return balance; }
    }
    void setBalance(int x) {
      synchronized (this) { balance = x; }
    }
    void withdraw(int amount) {
      synchronized (this) {
        int b = getBalance();
        if(amount > b)
          throw new WithdrawTooLargeException();
        setBalance(b - amount);
      }
    }
    // deposit would also use synchronized(this)
  }

Syntactic sugar

synchronized (this) is sufficiently common that there is an even simpler way to write it in Java: putting synchronized before a method declaration means the entire method body is surrounded by synchronized(this) { ... }. Therefore, version #3 (next slide) means exactly the same thing as version #2, but is more concise.

Java version #3 (final version)

  class BankAccount {
    private int balance = 0;
    synchronized int getBalance() { return balance; }
    synchronized void setBalance(int x) { balance = x; }
    synchronized void withdraw(int amount) {
      int b = getBalance();
      if(amount > b)
        throw new WithdrawTooLargeException();
      setBalance(b - amount);
    }
    // deposit would also use synchronized
  }
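To see the final version actually working under contention, here is a sketch (not from the slides) using a deposit method analogous to withdraw: many threads hammer one account, and because the methods are synchronized, no updates are lost. Without the synchronized keywords, runs like this routinely drop increments.

```java
// Stress-tests the synchronized BankAccount: 4 threads x 10,000 deposits
// must yield exactly 40,000, since each read-modify-write is a critical
// section and the nested set/getBalance calls rely on re-entrancy.
public class SyncAccountDemo {
    static class BankAccount {
        private int balance = 0;
        synchronized int getBalance() { return balance; }
        synchronized void setBalance(int x) { balance = x; }
        synchronized void deposit(int amount) {  // read-modify-write, like withdraw
            setBalance(getBalance() + amount);   // re-entrant: no self-deadlock
        }
    }

    public static int run(int threads, int depositsPerThread) {
        BankAccount acct = new BankAccount();
        Thread[] ts = new Thread[threads];
        for (int i = 0; i < threads; i++) {
            ts[i] = new Thread(() -> {
                for (int j = 0; j < depositsPerThread; j++) acct.deposit(1);
            });
            ts[i].start();
        }
        for (Thread t : ts) {                    // wait for all deposits to finish
            try { t.join(); } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
        return acct.getBalance();
    }

    public static void main(String[] args) {
        System.out.println("final balance: " + run(4, 10000));
    }
}
```

Deleting the three synchronized keywords turns deposit back into the racy read-modify-write from the "bad interleaving" slide, and the final balance typically comes up short.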