Task Parallel Library. Data Parallelism Patterns presentation

Contents


Task Parallel Library. Data Parallelism Patterns
 Introduction to Parallel Programming
  Multicore system features; potential parallelism
  Parallel programming patterns aspects: decomposition, coordination, scalable sharing of data
  Parallel programming design approaches
  Concurrency and parallelism; the limits of parallelism (Amdahl’s law)
  Parallel programming tips; code examples of this presentation
 Parallel Loops
  Parallel.For, Parallel.ForEach, Parallel LINQ (PLINQ), PLINQ ForAll
  Exceptions
  Parallel loops variations: dependencies between loop iterations
  Breaking out of loops early: Parallel Break, Parallel Stop, ParallelLoopResult, external loop cancellation
  Special handling of small loop bodies; controlling the degree of parallelism
  Task-local state in a loop body; the Random class in parallel
  Using a custom task scheduler
  Anti-patterns; parallel loops design notes
 Parallel Aggregation
  The Parallel Aggregation pattern; calculating a sum (sequential version and PLINQ)
  Parallel aggregation pattern in .NET; using PLINQ aggregation with range selection
  Design notes



Slides and text of this presentation
Slide 1
Task Parallel Library Data Parallelism Patterns


Slide 2
Introduction to Parallel Programming, Parallel Loops, Parallel Aggregation

Slide 3
Introduction to Parallel Programming, Parallel Loops, Parallel Aggregation

Slide 4
Multicore system features
Hardware trends predict more cores instead of faster clock speeds

Slide 5
Potential parallelism
Some parallel applications can be written for specific hardware

Slide 6
Parallel programming patterns aspects
Decomposition

Slide 7
Decomposition
Tasks are sequential operations that work together to perform a larger operation

Slide 8
Coordination
Tasks that are independent of one another can run in parallel

Slide 9
Scalable sharing of data
Tasks often need to share data

Slide 10
Parallel programming design approaches
Understand your problem or application and look for potential parallelism across the entire application as a whole

Slide 11
Concurrency & parallelism
Concurrency is a concept related to multitasking and asynchronous input-output (I/O)

Slide 12
Concurrency & parallelism
With parallelism, concurrent threads execute at the same time on multiple cores

Slide 13
The limits of parallelism
Amdahl’s law says that no matter how many cores you have, the maximum speedup you can ever achieve is 1 / (fraction of time spent in sequential processing)
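As a quick illustration of that bound: if 25% of a program's running time must be spent in sequential processing, the maximum speedup is 1 / 0.25 = 4, no matter how many cores are added; with 10% sequential work the ceiling is 10x.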

Slide 14
Parallel programming tips
Whenever possible, stay at the highest possible level of abstraction and use constructs or a library that does the parallel work for you

Slide 15
Parallel programming tips
Use patterns

Slide 16
Code examples of this presentation
Based on the .NET Framework 4

Slide 17
Introduction to Parallel Programming, Parallel Loops, Parallel Aggregation

Slide 18
Parallel programming patterns

Slide 19
Parallel Loops
Use the Parallel Loop pattern when you need to perform the same independent operation for each element of a collection or for a fixed number of iterations

Slide 20
Parallel.For
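A minimal sketch of a Parallel.For loop, assuming the usual System.Threading.Tasks namespace; the array and the per-element computation are illustrative, not taken from the deck's own code example.

using System;
using System.Threading.Tasks;

double[] results = new double[1000];

// Each index is processed by an independent iteration; the loop body
// must not depend on the results of other iterations.
Parallel.For(0, results.Length, i =>
{
    results[i] = Math.Sqrt(i);   // placeholder for the real per-element work
});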

Slide 21
Parallel.ForEach
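A corresponding sketch for Parallel.ForEach over a collection; the input list and the loop body are again placeholders rather than the deck's example.

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

var fileNames = new List<string> { "a.txt", "b.txt", "c.txt" };   // illustrative input

// The same independent operation is applied to each element of the collection.
Parallel.ForEach(fileNames, fileName =>
{
    Console.WriteLine("processing " + fileName);   // placeholder per-item work
});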

Slide 22
Parallel LINQ (PLINQ)
Almost all LINQ-to-Objects expressions can easily be converted to their parallel counterpart by adding a call to the AsParallel extension method
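For example (a sketch with an illustrative query, not the one from the deck), the sequential and parallel forms differ only by the AsParallel call:

using System.Linq;

int[] numbers = Enumerable.Range(0, 1000000).ToArray();

// LINQ-to-Objects (sequential):
int[] evens = numbers.Where(n => n % 2 == 0).ToArray();

// The parallel counterpart: add AsParallel() to run the query as PLINQ.
int[] evensParallel = numbers.AsParallel().Where(n => n % 2 == 0).ToArray();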

Slide 23
PLINQ ForAll
Use PLINQ’s ForAll extension method in cases where you want to iterate over the input values but you don’t want to select output values to return
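A hedged sketch of ForAll: results are pushed into a thread-safe collection instead of being merged into an output sequence (the collection and the work shown are illustrative).

using System;
using System.Collections.Concurrent;
using System.Linq;

int[] numbers = Enumerable.Range(0, 100).ToArray();
var sink = new ConcurrentBag<int>();

// ForAll runs the action on each result in parallel and skips the final
// merge step that operators such as ToArray would require.
numbers.AsParallel()
       .Where(n => n % 2 == 0)
       .ForAll(n => sink.Add(n * n));

Console.WriteLine(sink.Count);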

Slide 24
Exceptions
The .NET implementation of the Parallel Loop pattern ensures that exceptions that are thrown during the execution of a loop body are not lost
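Exceptions thrown inside the loop body surface as a single AggregateException once the loop ends; a minimal sketch with an illustrative failure:

using System;
using System.Threading.Tasks;

try
{
    Parallel.For(0, 100, i =>
    {
        if (i == 42)
            throw new InvalidOperationException("problem in iteration " + i);
    });
}
catch (AggregateException ae)
{
    // The exceptions from all failing iterations are collected here.
    foreach (var inner in ae.InnerExceptions)
        Console.WriteLine(inner.Message);
}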

Slide 25
Parallel loops variations
Parallel loops

Slide 26
Dependencies between loop iterations
Writing to shared variables
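An illustrative (deliberately broken) sketch of the kind of dependency this slide warns about: every iteration writes the same shared variable, so updates can be lost.

using System.Threading.Tasks;

int[] data = { 1, 2, 3, 4, 5 };
int sum = 0;

// BROKEN: 'sum += x' is a read-modify-write on shared state, so concurrent
// iterations can overwrite each other's updates and the result is unpredictable.
Parallel.ForEach(data, x =>
{
    sum += x;
});

The Parallel Aggregation pattern later in the deck shows how to restructure this kind of loop correctly.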

Slide 27
Dependencies between loop iterations
Referencing data types that are not thread safe

Slide 28
Breaking out of loops early
Sequential iteration

Slide 29
Parallel Break
Use Break to exit a loop early while ensuring that lower-indexed steps complete

Slide 30
Parallel Break
Calling Break doesn’t stop other steps that might have already started running

Slide 31
ParallelLoopResult
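A hedged sketch combining Break with ParallelLoopResult; the data and the break condition are illustrative.

using System;
using System.Threading.Tasks;

int[] data = new int[1000];
data[300] = -1;   // illustrative element that triggers the early exit

ParallelLoopResult result = Parallel.For(0, data.Length, (i, loopState) =>
{
    if (data[i] < 0)
        loopState.Break();   // stop scheduling higher-indexed iterations
});

// IsCompleted is false when the loop was broken; LowestBreakIteration
// reports the smallest index from which Break was called.
Console.WriteLine(result.IsCompleted);
Console.WriteLine(result.LowestBreakIteration);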

Slide 32
Parallel Stop
Use Stop to exit a loop early when you don’t need all lower-indexed iterations to run before terminating the loop
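A sketch of Stop for a simple unordered search; the input and predicate are illustrative.

using System;
using System.Threading.Tasks;

string[] items = { "alpha", "beta", "needle", "gamma" };
string found = null;

Parallel.For(0, items.Length, (i, loopState) =>
{
    if (items[i] == "needle")
    {
        found = items[i];
        loopState.Stop();   // no need for lower-indexed iterations to finish first
    }
});

Console.WriteLine(found ?? "not found");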

Slide 33
External Loop Cancellation
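A minimal sketch of external cancellation: a CancellationToken is passed in through ParallelOptions, and cancelling the source makes the loop throw OperationCanceledException.

using System;
using System.Threading;
using System.Threading.Tasks;

var cts = new CancellationTokenSource();
var options = new ParallelOptions { CancellationToken = cts.Token };

// Elsewhere in the application (for example a Cancel button handler):
// cts.Cancel();

try
{
    Parallel.For(0, 1000000, options, i =>
    {
        // loop body; the loop observes the token between iterations
    });
}
catch (OperationCanceledException)
{
    Console.WriteLine("the loop was cancelled externally");
}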

Slide 34
Special handling of small loop bodies

Slide 35
Special handling of small loop bodies
The number of ranges that will be created by a Partitioner object depends on the number of cores in your computer
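A hedged sketch of range partitioning for small loop bodies: Partitioner.Create splits the index range into chunks, and each chunk is processed with an ordinary sequential inner loop.

using System.Collections.Concurrent;
using System.Threading.Tasks;

double[] results = new double[1000000];

Parallel.ForEach(Partitioner.Create(0, results.Length), range =>
{
    // range is a Tuple<int, int> holding the inclusive start and exclusive end.
    for (int i = range.Item1; i < range.Item2; i++)
        results[i] = i * 0.5;   // tiny per-element work, so chunking pays off
});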

Slide 36
Controlling the degree of parallelism
You usually let the system manage how iterations of a parallel loop are mapped to your computer’s cores, but in some cases you may want additional control

Slide 37
Controlling the degree of parallelism
The PLINQ query in the code example will run with a maximum of eight tasks at any one time
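Hedged sketches of both knobs: ParallelOptions.MaxDegreeOfParallelism for the Parallel class and WithDegreeOfParallelism for PLINQ; the deck's own example uses a maximum of eight tasks.

using System.Linq;
using System.Threading.Tasks;

int[] data = Enumerable.Range(0, 1000).ToArray();

// Parallel loop limited to two concurrent tasks.
var options = new ParallelOptions { MaxDegreeOfParallelism = 2 };
Parallel.ForEach(data, options, item =>
{
    // per-item work
});

// PLINQ query limited to eight concurrent tasks, as on the slide.
int[] squares = data.AsParallel()
                    .WithDegreeOfParallelism(8)
                    .Select(x => x * x)
                    .ToArray();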

Slide 38
Task-local state in a loop body
Sometimes you need to maintain thread-local state during the execution of a parallel loop

Slide 39
Random initialization of the large array

Slide 40
Random class in parallel
Calling the default Random constructor twice in short succession may use the same random seed
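A hedged sketch of the thread-local-state overload applied to random initialization of an array; seeding each per-task Random from a GUID hash is just one way to avoid identical seeds and is not necessarily the approach used in the deck's own example.

using System;
using System.Threading.Tasks;

double[] result = new double[1000000];

Parallel.For<Random>(0, result.Length,
    () => new Random(Guid.NewGuid().GetHashCode()),   // localInit: one Random per task
    (i, loopState, random) =>
    {
        result[i] = random.NextDouble();
        return random;                                 // reused by later iterations on this task
    },
    random => { });                                    // localFinally: nothing to merge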

Slide 41
Using a custom task scheduler
You can substitute custom task scheduling logic for the default task scheduler, which uses ThreadPool worker threads
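A sketch of the wiring only: any TaskScheduler can be supplied through ParallelOptions.TaskScheduler. Here ConcurrentExclusiveSchedulerPair (available from .NET 4.5, so slightly newer than the deck's .NET Framework 4 baseline) stands in for a hand-written scheduler class.

using System.Threading.Tasks;

var pair = new ConcurrentExclusiveSchedulerPair(TaskScheduler.Default, maxConcurrencyLevel: 2);
var options = new ParallelOptions { TaskScheduler = pair.ConcurrentScheduler };

Parallel.For(0, 100, options, i =>
{
    // the loop's tasks are queued to the substituted scheduler
    // instead of the default ThreadPool-based one
});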

Slide 42
Anti-Patterns
Step size other than one
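Parallel.For has no step parameter; a hedged sketch of one common workaround is to iterate over the number of steps and compute the real index inside the body (the deck's own discussion of this anti-pattern may differ in detail).

using System.Threading.Tasks;

double[] data = new double[1000];

// Visit every second element: loop over the step count, derive the index.
Parallel.For(0, data.Length / 2, step =>
{
    int i = step * 2;        // sequential equivalent: for (int i = 0; i < n; i += 2)
    data[i] = i * 0.5;       // placeholder work on the selected elements
});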

Slide 43
Parallel loops design notes
Adaptive partitioning

Slide 44
Introduction to Parallel Programming, Parallel Loops, Parallel Aggregation

Slide 45
Parallel programming patterns

Slide 46
The Parallel Aggregation pattern
The pattern is more general than calculating a sum

Slide 47
Calculating a sum
Sequential version

Slide 48
Calculating a sum
PLINQ
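A hedged sketch of both versions of the sum side by side (the input sequence is illustrative):

using System.Linq;

double[] sequence = Enumerable.Range(0, 1000000)
                              .Select(i => (double)i)
                              .ToArray();

// Sequential version of the sum.
double sequentialSum = 0.0;
foreach (double d in sequence)
    sequentialSum += d;

// PLINQ version: Sum computes per-task partial sums and combines them.
double parallelSum = sequence.AsParallel().Sum();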

Slide 49
Parallel aggregation pattern in .NET
PLINQ is usually the recommended approach

Slide 50
Using PLINQ aggregation with range selection
The PLINQ Aggregate extension method includes an overloaded version that allows a very general application of the parallel aggregation pattern
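A hedged sketch of that general overload, here computing a sum of squares: it takes a per-task seed factory, an element accumulator, a function that combines the per-task accumulators, and a final result selector.

using System.Linq;

int[] data = Enumerable.Range(1, 1000).ToArray();

long sumOfSquares = data.AsParallel().Aggregate(
    () => 0L,                              // seed: each task starts its own accumulator
    (local, x) => local + (long)x * x,     // fold one element into the task-local value
    (total, local) => total + local,       // combine the task-local accumulators
    total => total);                       // select the final result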

Slide 51
Design notes
Aggregation using Parallel For and ForEach
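A hedged sketch of the Parallel.ForEach form of the pattern: each task accumulates into a private subtotal, and only the final merge in localFinally needs a lock.

using System.Linq;
using System.Threading.Tasks;

double[] sequence = Enumerable.Range(0, 1000000)
                              .Select(i => (double)i)
                              .ToArray();

object lockObject = new object();
double sum = 0.0;

Parallel.ForEach<double, double>(
    sequence,
    () => 0.0,                                   // localInit: per-task subtotal
    (item, loopState, subtotal) => subtotal + item,
    subtotal =>
    {
        lock (lockObject) { sum += subtotal; }   // localFinally: runs once per task
    });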

Slide 52
Design notes
Aggregation in PLINQ does not require the developer to use locks

Slide 53
Task Parallel Library Data Parallelism Patterns


