Introducing Shared-Memory Concurrency

Race Conditions and Atomic Blocks November 19, 2007

Concurrency
Computation where multiple things happen at the same time is inherently more complicated than sequential computation: it brings entirely new kinds of bugs and obligations.
Two forms of concurrency:
Time-slicing: only one computation runs at a time, but the system pre-empts it to provide responsiveness or to mask I/O latency.
True parallelism: more than one CPU (e.g., the lab machines have two, the attu machines have 4, ...).
Within a program, each computation becomes a separate thread.

Why do this?
Convenient structure of code. Example: a web browser, where each tab becomes a separate thread.
Fairness: one slow computation takes only some of the CPU time, without your own complicated timer code. Avoids starvation.
Performance: run other threads while one is reading or writing to disk (or doing anything else slow that can happen in parallel), and use more than one CPU at the same time. This is the way computers will get faster over the next 10 years, so no parallelism means no speedup.

Working in Parallel
Often you have a bunch of threads running at once, and they might need the same mutable memory at the same time, but probably not. We want to be correct without sacrificing parallelism.
Example: a bunch of threads processing bank transactions: withdraw, deposit, transfer, currentBalance, ... The chance of two threads accessing the same account at the same time is very low, but not zero. We want mutual exclusion: a way for threads to keep each other out of the way when there is contention.

Basics
C: the POSIX Threads (pthreads) library. #include <pthread.h>. pthread_create takes a function pointer and an argument for it, and runs the function as a separate thread. The library provides many types, functions, and macros for threads, locks, etc.
Java: built into the language. Subclass java.lang.Thread, overriding run. Create a Thread object and call its start method. Any object can be synchronized on (later).
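As a minimal sketch of the Java route just described (the class name and printed message are my own illustration, not from the lecture demos):

public class HelloThread extends Thread {
    @Override
    public void run() {
        System.out.println("hello from a separate thread");
    }

    public static void main(String[] args) throws InterruptedException {
        Thread t = new HelloThread();
        t.start();   // run() now executes concurrently with main
        t.join();    // wait for the thread to finish
    }
}

The same structure in C would pass a function pointer to pthread_create; the sketches in this transcription stick to Java for consistency.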

Common bug: race conditions
There are several new kinds of bugs that occur in concurrent programs; race conditions are the most fundamental and the most common. A race condition is when the order of thread execution in a program affects the program's output. Race conditions are difficult to identify and fix, because the problematic thread interleavings may be unlikely.
A data race (a common type of race condition) occurs when multiple threads access the same location in memory simultaneously, with at least one access being a write.

Example: TwoThreads.java
What is the intended output of this program? What actual outputs are possible?
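The demo file itself is not reproduced here, but a hypothetical sketch of the same kind of program (two threads incrementing a shared counter; all names are invented) makes the question concrete: the intended output is 2 * ITERATIONS, yet smaller values are possible.

public class TwoThreadsSketch {
    static int counter = 0;                  // shared, unsynchronized
    static final int ITERATIONS = 100_000;

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int k = 0; k < ITERATIONS; k++) {
                counter++;                   // read-modify-write: not atomic
            }
        };
        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        System.out.println("counter = " + counter);  // often less than 200000
    }
}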

What could go wrong?
Simultaneous updates lead to race conditions. Suppose two threads both execute i++. In machine code, this single statement becomes several operations:

Thread 1      Thread 2
r1 = i        r2 = i
r1 += 1       r2 += 1
i = r1        i = r2

If i starts at 0, what is the value of i after execution?

What could go wrong?
Simultaneous updates lead to race conditions. With this interleaving, the final value of i is 2:

Thread 1      Thread 2
r1 = i
r1 += 1
i = r1
              r2 = i
              r2 += 1
              i = r2

What could go wrong?
Simultaneous updates lead to race conditions. With this interleaving, the final value of i is 1; the first update to i is lost:

Thread 1      Thread 2
r1 = i
r1 += 1
              r2 = i
              r2 += 1
i = r1
              i = r2

Detecting race conditions
Without calls to Thread.sleep(), our code ran so fast that the race condition did not manifest. Forcing a thread to yield control is a good way to encourage interesting interleavings. BUT: calling sleep doesn't guarantee that the race condition will affect the output. In general, programs are large and we don't know where to look for bugs, or whether bugs even exist.

Avoiding race conditions
We will try to restrict the number of possible thread interleavings. For example, in TwoThreads we got into trouble because the updates were interleaved. A simple restriction is to define atomic blocks, in which a thread may assume that no other threads will execute. There are lots of variations on the terminology: critical sections, synchronization, etc.

Fixing TwoThreads.java
(Demo.)

Fixing TwoThreads.java
Now the following troublesome interleaving is illegal:

Thread 1      Thread 2
r1 = i
r1 += 1
              r2 = i
              r2 += 1
i = r1
              i = r2

Fixing TwoThreads.java
Instead, we get:

Thread 1      Thread 2
atomic {
  r1 = i
  r1 += 1
  i = r1
}
              atomic {
                r2 = i
                r2 += 1
                i = r2
              }

Atomic blocks
Atomic blocks are a common way of fixing race conditions. They allow us to think single-threaded even in a multi-threaded program.
How much code should be inside the block? Blocks that are too long could cause the program to slow down, as threads sleep waiting for other threads to finish executing atomically. Blocks that are too short might miss race conditions.

Familiar example: bank accounts (BankAccount.java)
(Demo.)
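As a hypothetical sketch of the kind of race the bank-account demo illustrates (the real BankAccount.java may differ), two concurrent withdrawals can both pass the balance check before either subtracts, overdrawing the account:

class BankAccountSketch {
    private int balance = 100;

    void withdraw(int amount) {              // unsynchronized: racy
        if (balance >= amount) {
            // the other thread may also pass the check right here
            balance -= amount;
        }
    }

    public static void main(String[] args) throws InterruptedException {
        BankAccountSketch account = new BankAccountSketch();
        Thread t1 = new Thread(() -> account.withdraw(100));
        Thread t2 = new Thread(() -> account.withdraw(100));
        t1.start(); t2.start();
        t1.join();  t2.join();
        System.out.println("balance = " + account.balance);  // 0, or -100 if the race hits
    }
}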

Producer and consumer threads (ProducerConsumer.java)
One (or more) threads produce values and leave them in a buffer to be processed. One (or more) threads take values from the buffer and consume them. This is a common pattern in operating systems and elsewhere. (Demo; a sketch of the pattern follows.)
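A minimal sketch of this pattern with a one-slot buffer, written in the busy-wait style the next slide critiques (all names are my own, not the demo's):

public class ProducerConsumerSketch {
    private volatile boolean full = false;   // volatile: flag changes visible across threads
    private int slot;

    void produce(int value) {
        while (full) { /* busy wait until the consumer empties the slot */ }
        slot = value;
        full = true;
    }

    int consume() {
        while (!full) { /* busy wait until the producer fills the slot */ }
        int value = slot;
        full = false;
        return value;
    }

    public static void main(String[] args) {
        ProducerConsumerSketch pc = new ProducerConsumerSketch();
        new Thread(() -> { for (int i = 0; i < 5; i++) pc.produce(i); }).start();
        new Thread(() -> { for (int i = 0; i < 5; i++) System.out.println(pc.consume()); }).start();
    }
}

With exactly one producer and one consumer this happens to work, but both threads spin while waiting, which is exactly the limitation discussed next.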

Limitations of atomic blocks
In the producer-consumer example, we see:
Busy waiting: threads spin in a loop waiting for a state change.
Scheduling fairness: no guarantee that the threads will get equal shares of processor time.
There are ways to fix both of these problems with atomic blocks, but we don't have good implementations for Java or C. Instead, we'll look briefly at what mechanisms Java and pthreads do provide (more on Wednesday).

Atomic blocks don't exist yet
You can't (yet) use atomic blocks in real Java code (although we faked it). This is an active area of research; maybe atomic blocks will be integrated into Java at some point. Instead, programmers use locks (mutexes) or other mechanisms, usually to get the behavior of atomic blocks. But misuse of locks will violate the all-at-once policy, or lead to other bugs we haven't seen yet.

Lock basics
A lock is acquired and released by a thread. At most one thread holds it at any moment. Acquiring a lock blocks until the holder releases it and the blocked thread acquires it. Many threads might be waiting; one will win. The lock implementor avoids race conditions on lock acquire. So you create an atomic block by surrounding the code with a lock acquire and release.
Problems: it is easy to mess up (e.g., use two different locks to protect the same location in memory). Deadlock: threads might get stuck forever, waiting for locks that will never be released.
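A hedged sketch of this acquire/release discipline in Java, fixing the counter race from the earlier TwoThreadsSketch (both the explicit java.util.concurrent.locks.ReentrantLock and the built-in synchronized keyword fit the description above):

import java.util.concurrent.locks.ReentrantLock;

public class LockedCounter {
    private final ReentrantLock lock = new ReentrantLock();
    private int count = 0;

    void increment() {
        lock.lock();            // acquire: blocks while another thread holds the lock
        try {
            count++;            // the "atomic block": at most one thread runs this at a time
        } finally {
            lock.unlock();      // release, even if the body throws
        }
    }

    synchronized void incrementWithBuiltInLock() {   // same effect using this object's lock
        count++;
    }

    public static void main(String[] args) throws InterruptedException {
        LockedCounter c = new LockedCounter();
        Runnable work = () -> { for (int i = 0; i < 100_000; i++) c.increment(); };
        Thread t1 = new Thread(work), t2 = new Thread(work);
        t1.start(); t2.start();
        t1.join();  t2.join();
        System.out.println(c.count);   // reliably 200000
    }
}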

Summary
Concurrency introduces bugs that simply don't exist in sequential programming. Atomic blocks are a useful way of thinking about safety in concurrent programs. In real code, you'll use locks or other mechanisms, which eliminate race conditions if used properly but can lead to more bugs if misused.

Blatant plug(in)
For the examples we used AtomEclipse, a plugin that I developed for Eclipse (an IDE for Java and other languages). AtomEclipse is designed for students like you to learn about atomic blocks more easily. Download it and try it out if you want to play around with the examples some more: http://wasp.cs.washington.edu/atomeclipse (linked from the 303 web page). Let me know what you think: effinger@cs, or talk to me after class!