This session gives a basic, high-level introduction to concurrent programming with Java. It covers:
programming basics, OOP concepts, concurrency, concurrent programming, parallel computing, concurrent vs parallel, why concurrency, a real-world example, terms, Moore's Law, Amdahl's Law, types of parallel computation, MIMD variants, the shared memory model, the distributed memory model, the client-server model, the SCOOP mechanism, a SCOOP preview (a sequential program, then the same program in a concurrent setting using SCOOP), and programming then and now (sequential programming, concurrent programming).
3. Learning Outcomes
• Demonstrate foundational computing knowledge of
concurrent systems, their performance opportunities and
how to implement them using:
• Java concurrent features and their semantics
• Java packages and APIs for concurrent programs
• Conventional synchronisation algorithms, data-
structures and APIs
• Wait-free and lock-free synchronisation controls.
4. Learning Outcomes Cont'd
• Apply knowledge of computing principles and technical
skills to parallelise tasks to improve their performance
and response characteristics by:
• Using abstraction and computational thinking
• Developing, implementing and testing the
effectiveness of alternate Java programs with
different levels of concurrency
• Critiquing the approach used to solve a problem by
evaluating its strengths and weaknesses
6. Programming
• The process of writing computer programs.
• Example:

class First {
    public static void main(String[] arguments) {
        System.out.println("Let's do something using Java technology.");
    }
}

• Output:

Let's do something using Java technology.
7. OOP
• Object
• An object is a software bundle of related state and behavior. Software objects are often
used to model the real-world objects that you find in everyday life. This lesson explains how
state and behavior are represented within an object, introduces the concept of data
encapsulation, and explains the benefits of designing your software in this manner.
• Class
• A class is a blueprint or prototype from which objects are created. This section defines a
class that models the state and behavior of a real-world object. It intentionally focuses on
the basics, showing how even a simple class can cleanly model state and behavior.
• Inheritance
• Inheritance provides a powerful and natural mechanism for organizing and structuring your
software. This section explains how classes inherit state and behavior from their
superclasses, and explains how to derive one class from another using the simple syntax
provided by the Java programming language.
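To make the object, class, and inheritance concepts above concrete, here is a minimal Java sketch (the Account/SavingsAccount names are illustrative, not from the original text):

```java
// A class is a blueprint: Account models state (balance) and behavior.
class Account {
    private int balance;                       // encapsulated state

    Account(int initial) { balance = initial; }

    int balance() { return balance; }          // behavior exposed via methods

    void deposit(int amount) { balance += amount; }
}

// Inheritance: SavingsAccount derives from Account, reusing its state
// and behavior while adding a feature of its own.
class SavingsAccount extends Account {
    SavingsAccount(int initial) { super(initial); }

    void addInterest(int percent) {
        deposit(balance() * percent / 100);
    }
}

public class InheritanceDemo {
    public static void main(String[] args) {
        SavingsAccount s = new SavingsAccount(1000);  // an object: an instance of a class
        s.addInterest(5);
        System.out.println(s.balance());              // prints 1050
    }
}
```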
8. OOP Cont'd
• Encapsulation
• If a class disallows calling code from accessing internal object data and forces
access through methods only, this is a strong form of abstraction or information
hiding known as encapsulation.
• Interface
• An interface is a contract between a class and the outside world. When a class
implements an interface, it promises to provide the behavior published by that
interface. This section defines a simple interface and explains the necessary
changes for any class that implements it.
• Package
• A package is a namespace for organizing classes and interfaces in a logical manner.
Placing your code into packages makes large software projects easier to manage.
This section explains why this is useful, and introduces you to the Application
Programming Interface (API) provided by the Java platform.
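The interface-as-contract idea can be sketched in Java as follows (Movable and Point are illustrative names):

```java
// An interface is a contract: any class that implements Movable
// promises to provide the behavior it publishes.
interface Movable {
    void move(int dx, int dy);
}

// Point honors the contract by implementing move().
class Point implements Movable {
    int x, y;

    Point(int x, int y) { this.x = x; this.y = y; }

    @Override
    public void move(int dx, int dy) {
        x += dx;
        y += dy;
    }
}

public class InterfaceDemo {
    public static void main(String[] args) {
        Movable m = new Point(1, 2);   // callers can depend on the contract alone
        m.move(3, 4);
    }
}
```

In a larger project, Movable and Point would live in a named package (e.g. `package geometry;` at the top of the file), which is how the Java platform organizes its own API.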
9. Concurrency
• a property of systems in which several computations are
executing simultaneously, and potentially interacting with
each other.
• The computations may be executing on multiple cores in the
same chip, preemptively time-shared threads on the same
processor, or executed on physically separated processors.
• A number of mathematical models have been developed for
general concurrent computation including Petri nets, process
calculi, the Parallel Random Access Machine model, the
Actor model and the Reo Coordination Language.
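A minimal Java illustration of several computations executing simultaneously and interacting through shared state (a sketch; the task bodies are placeholders):

```java
// Two computations run concurrently: each Thread executes its task,
// and join() waits for both to finish. The completion order is not
// fixed by the program, which is the essence of concurrency.
public class ConcurrencyDemo {
    static String run() {
        StringBuffer log = new StringBuffer();   // StringBuffer appends are thread-safe
        Thread a = new Thread(() -> log.append("A"));
        Thread b = new Thread(() -> log.append("B"));
        a.start();
        b.start();
        try {
            a.join();
            b.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return log.toString();                   // "AB" or "BA", depending on scheduling
    }

    public static void main(String[] args) {
        System.out.println(run());
    }
}
```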
10. Concurrent Programming
• Concurrent object-oriented programming is a
programming paradigm which combines object-
oriented programming (OOP) together with
concurrency.
• While numerous programming languages, such as
Java, combine OOP with concurrency mechanisms
like threads, the phrase "concurrent object-oriented
programming" primarily refers to systems where
objects themselves are a concurrency primitive, such
as when objects are combined with the actor model.
11. Parallel Computing
• a form of computation in which many calculations
are carried out simultaneously, operating on the
principle that large problems can often be divided
into smaller ones, which are then solved at the
same time.
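The divide-and-solve-simultaneously principle is visible in Java's parallel streams, which split a large range across the available cores (a sketch of the idea, not a performance claim):

```java
import java.util.stream.LongStream;

// Summing 1..n is a large problem that divides naturally into
// sub-ranges, which a parallel stream solves at the same time.
public class ParallelSum {
    static long sum(long n) {
        return LongStream.rangeClosed(1, n)
                         .parallel()   // split the range across available cores
                         .sum();       // combine the partial sums
    }

    public static void main(String[] args) {
        System.out.println(sum(100));  // prints 5050
    }
}
```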
12. Concurrent vs Parallel
• you can have two threads (or processes) executing
concurrently on the same core through context switching.
• When the two threads (or processes) are executed on two
different cores (or processors), you have parallelism.
• in the first case, parallelism is only "virtual" (the threads are interleaved on one
core), while the second gives true parallelism.
• Therefore, every parallel program is concurrent, but the
converse is not necessarily true
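The distinction can be shown with executors: the same two tasks run concurrently on a single thread, or in parallel on a pool of two (a sketch; the task bodies are placeholders):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ConcurrentVsParallel {
    // Submit two tasks to the given executor and combine their results.
    static int runOn(ExecutorService pool) {
        try {
            Future<Integer> f1 = pool.submit(() -> 20);
            Future<Integer> f2 = pool.submit(() -> 22);
            return f1.get() + f2.get();
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }

    // Concurrency without parallelism: both tasks share one thread,
    // so any overlap is achieved by interleaving (context switching).
    static int concurrentOnly() {
        return runOn(Executors.newSingleThreadExecutor());
    }

    // Parallelism: the tasks may run on two cores at the same instant.
    static int parallel() {
        return runOn(Executors.newFixedThreadPool(2));
    }

    public static void main(String[] args) {
        System.out.println(concurrentOnly() + " " + parallel());  // prints "42 42"
    }
}
```

Both variants compute the same result; only the execution model differs, which is exactly why every parallel program is also a concurrent one.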
13. Why Concurrency?
• Efficiency
• Time (load sharing)
• Cost (resource sharing)
• Availability
• Multiple access
• Convenience
• Perform several tasks at once
• Modeling power
• Describing systems that are inherently parallel
14. Real World Example
• Computer systems are used for modeling objects in
the real world
• Object-oriented programming
• The world often includes parallel operation
• example:
• Limited number of seats on the same plane
• Several booking agents active at the same time
15. Terms
• Multiprocessing
• the use of more than one processing unit in a
system
• Parallel execution
• processes running at the same time
16. Terms Cont'd
• Interleaving
• several tasks active, only one running at a time
• Multitasking
• the OS runs interleaved executions
• Concurrency
• multiprocessing, multitasking, or any combination
18. Why Does It Matter?
• The “end of Moore’s law as we knew it” has important implications on the
software construction process
• Computing is taking an irreversible step toward parallel architectures
• Hardware construction of ever faster sequential CPUs has hit physical
limits
• Clock speed no longer increases for every new processor generation
• Moore’s Law now expresses itself as an exponentially increasing number of
processing cores per chip
• If we want programs to run faster on the next processor generation, the
software must exploit more concurrency
20. Amdahl’s Law Cont.d
• We go from 1 processor to n. What gain may we
expect?
• Amdahl’s law severely limits our hopes!
• Define the gain as:
• speedup = 1 / ((1 - p) + p/n)
• where p = parallelizable fraction of the program; n = number of processors
• Not everything can be parallelized!
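Plugging numbers into the formula shows how severe the limit is (a small worked example):

```java
// Amdahl's law: with parallelizable fraction p and n processors,
//     speedup = 1 / ((1 - p) + p / n)
public class Amdahl {
    static double speedup(double p, int n) {
        return 1.0 / ((1.0 - p) + p / n);
    }

    public static void main(String[] args) {
        System.out.println(speedup(0.9, 8));          // ~4.71, far below 8
        System.out.println(speedup(0.9, 1_000_000));  // just under 10
    }
}
```

Even with p = 0.9 and an unlimited number of processors, the sequential 10% caps the speedup at 1 / (1 - p) = 10.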
21. Types of Parallel Computation
• Flynn’s taxonomy: classification of computer
architectures
• Considers relationship of instruction streams to
data streams:
• SISD: No parallelism (uniprocessor)
• SIMD: Vector processor, GPU
• MIMD: Multiprocessing (predominant today)
22. MIMD Variants
• SPMD (Single Program Multiple Data):
• All processors run same program, but at
independent speeds; no lockstep as in SIMD
• MPMD (Multiple Program Multiple Data):
• Often manager/worker strategy: manager
distributes tasks, workers return result to
manager
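The manager/worker strategy maps directly onto Java's executor framework (a sketch; the squaring task stands in for real work):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// MPMD manager/worker sketch: the manager distributes tasks to a pool
// of workers and collects the results they return.
public class ManagerWorker {
    static int distribute(int[] tasks) {
        ExecutorService workers = Executors.newFixedThreadPool(4);
        try {
            List<Future<Integer>> results = new ArrayList<>();
            for (int t : tasks) {
                final int task = t;
                results.add(workers.submit(() -> task * task)); // worker does the work
            }
            int total = 0;
            for (Future<Integer> r : results) {
                total += r.get();                               // manager gathers results
            }
            return total;
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            workers.shutdown();
        }
    }

    public static void main(String[] args) {
        System.out.println(distribute(new int[] {1, 2, 3}));    // prints 14
    }
}
```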
23. Shared Memory Model
• All processors share a common memory
• Shared-memory communication
[Diagram: Processors 1–4 all connected to a shared Memory]
24. Distributed Memory Model
• Each processor has own local memory, inaccessible to
others
• Message passing communication
• Common for SPMD architecture
[Diagram: three processors, each with its own local Memory, communicating by Message Passing]
25. Client Server Model
• Specific case of the distributed model
• Examples: Database-centered systems, World-Wide
Web
[Diagram: processors, each with its own local memory, in a client-server arrangement]
26. SCOOP Mechanism
• Simple Concurrent Object-Oriented Programming
• Evolved through previous two decades; CACM (1993) and chap.
32 of Object-Oriented Software Construction, 2nd edition, 1997
• Prototype-implementation at ETH in 2007
• Implementation integrated within EiffelStudio in 2011 (by Eiffel
Software)
• Current reference: ETH PhD Thesis by Piotr Nienaltowski, 2008;
articles by Benjamin Morandi, Sebastian Nanz and Bertrand
Meyer, 2010-2011
27. SCOOP Preview: a Sequential Program

transfer (source, target: ACCOUNT; amount: INTEGER)
        -- If possible, transfer amount from source to target.
    do
        if source.balance >= amount then
            source.withdraw (amount)
            target.deposit (amount)
        end
    end

Typical calls:
    transfer (acc1, acc2, 100)
    transfer (acc1, acc3, 100)
transfer (acc1, acc3, 100)
28. In a Concurrent Setting, Using SCOOP

transfer (source, target: separate ACCOUNT; amount: INTEGER)
        -- If possible, transfer amount from source to target.
    do
        if source.balance >= amount then
            source.withdraw (amount)
            target.deposit (amount)
        end
    end

Typical calls:
    transfer (acc1, acc2, 100)
    transfer (acc1, acc3, 100)
29. A better SCOOP version
transfer (source, target: separate ACCOUNT; amount: INTEGER)
        -- Transfer amount from source to target.
    require
        source.balance >= amount
    do
        source.withdraw (amount)
        target.deposit (amount)
    ensure
        source.balance = old source.balance - amount
        target.balance = old target.balance + amount
    end
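SCOOP is an Eiffel mechanism; Java has no `separate` type or contract clauses. A rough Java analogue of the guarded transfer, using explicit synchronization in place of SCOOP's object reservation (class and method names are illustrative):

```java
// Rough Java analogue of the SCOOP transfer: the balance check and the
// withdraw/deposit happen while both accounts are locked, so no other
// thread can change the balances in between.
public class Bank {
    static class Account {
        private int balance;
        Account(int initial) { balance = initial; }
        int balance() { return balance; }
        void withdraw(int amount) { balance -= amount; }
        void deposit(int amount) { balance += amount; }
    }

    // Locks source then target. A production version would acquire the
    // locks in a globally consistent order (e.g. by object identity) to
    // avoid deadlock; SCOOP's runtime reserves the separate objects
    // for the caller instead of leaving this to the programmer.
    static boolean transfer(Account source, Account target, int amount) {
        synchronized (source) {
            synchronized (target) {
                if (source.balance() >= amount) {
                    source.withdraw(amount);
                    target.deposit(amount);
                    return true;
                }
                return false;
            }
        }
    }

    public static void main(String[] args) {
        Account acc1 = new Account(100);
        Account acc2 = new Account(0);
        System.out.println(transfer(acc1, acc2, 100));  // prints true
    }
}
```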
32. Sequential Programming
• Used to be messy
• Still hard but key improvements:
• Structured programming
• Data abstraction & object technology
• Design by Contract
• Genericity, multiple inheritance
• Architectural techniques
33. Concurrent Programming
• Used to be messy
• Example: threading models in most popular
approaches
• Development level: sixties/seventies
• Only understandable through operational
reasoning