Task Parallel Library. Data Parallelism Patterns


Slides and text of this presentation


Slide 1: Task Parallel Library. Data Parallelism Patterns

Slide 2

Introduction to Parallel Programming
Parallel Loops
Parallel Aggregation

Slide 3

Introduction to Parallel Programming
Parallel Loops
Parallel Aggregation

Slide 4: Multicore system features

Hardware trends predict more cores instead of faster clock speeds.

Slide 5: Potential parallelism

Some parallel applications can be written for specific hardware.

Slide 6: Aspects of parallel programming patterns

Decomposition

Slide 7: Decomposition

Tasks are sequential operations that work together to perform a larger operation.

Slide 8: Coordination

Tasks that are independent of one another can run in parallel.

Slide 9: Scalable sharing of data

Tasks often need to share data.

Slide 10: Parallel programming design approaches

Understand your problem or application and look for potential parallelism across the entire application as a whole.

Slide 11: Concurrency & parallelism

Concurrency is a concept related to multitasking and asynchronous input/output (I/O).

Slide 12: Concurrency & parallelism

With parallelism, concurrent threads execute at the same time on multiple cores.

Slide 13: The limits of parallelism

Amdahl's law says that no matter how many cores you have, the maximum speedup you can ever achieve is 1 / (fraction of time spent in sequential processing).
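
Written out (the standard formulation of Amdahl's law; the symbols are mine, not from the slide), with s the sequential fraction of the work and N the number of cores:

    S(N) = \frac{1}{\,s + \frac{1 - s}{N}\,}, \qquad \lim_{N \to \infty} S(N) = \frac{1}{s}

For example, if 25% of the work is sequential (s = 0.25), the speedup can never exceed 4x, no matter how many cores are added.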

Slide 14: Parallel programming tips

Whenever possible, stay at the highest possible level of abstraction and use constructs or a library that does the parallel work for you.

Slide 15: Parallel programming tips

Use patterns.

Slide 16: Code examples of this presentation

Based on the .NET Framework 4.

Slide 17

Introduction to Parallel Programming
Parallel Loops
Parallel Aggregation

Slide 18: Parallel programming patterns

Slide 19: Parallel Loops

Use the Parallel Loop pattern when you need to perform the same independent operation for each element of a collection or for a fixed number of iterations.

Slide 20: Parallel.For
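
The code from this slide is not reproduced in the page text. Below is a minimal sketch of a Parallel.For loop, assuming the per-index work is independent for every index; the Compute method and the results array are illustrative, not the slide's code:

    using System;
    using System.Threading.Tasks;

    static class ParallelForDemo
    {
        // Placeholder for per-element work; any computation that is independent per index fits here.
        static double Compute(int i) { return Math.Sqrt(i); }

        static void Main()
        {
            double[] results = new double[1000000];

            // Each iteration writes only its own element, so iterations do not interfere.
            Parallel.For(0, results.Length, i =>
            {
                results[i] = Compute(i);
            });

            Console.WriteLine(results[100]);
        }
    }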

Slide 21: Parallel.ForEach
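
Likewise, a minimal sketch of Parallel.ForEach over a collection; the file list and the Process method are placeholders, not the slide's code:

    using System.Collections.Generic;
    using System.Threading.Tasks;

    static class ParallelForEachDemo
    {
        static void Process(string path)
        {
            // Placeholder: any independent per-item work (parsing, image processing, ...).
        }

        static void Main()
        {
            List<string> files = new List<string> { "a.txt", "b.txt", "c.txt" };

            // The body must not depend on the order in which items are processed.
            Parallel.ForEach(files, file =>
            {
                Process(file);
            });
        }
    }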

Slide 22: Parallel LINQ (PLINQ)

Almost all LINQ-to-Objects expressions can easily be converted to their parallel counterpart by adding a call to the AsParallel extension method.
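
A sketch of such a conversion; the query (keeping the even numbers of an illustrative array) is not the slide's example:

    using System;
    using System.Linq;

    static class PlinqDemo
    {
        static void Main()
        {
            int[] source = Enumerable.Range(0, 1000000).ToArray();

            // LINQ to Objects:
            // int[] evens = source.Where(x => x % 2 == 0).ToArray();

            // Parallel counterpart: add AsParallel() at the start of the query.
            int[] evens = source.AsParallel()
                                .Where(x => x % 2 == 0)
                                .ToArray();

            Console.WriteLine(evens.Length);
        }
    }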

Slide 23: PLINQ ForAll

Use PLINQ's ForAll extension method in cases where you want to iterate over the input values but you don't want to select output values to return.
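
A sketch, assuming the per-element action is thread safe and no merged output sequence is needed; the query itself is illustrative:

    using System;
    using System.Linq;

    static class ForAllDemo
    {
        static void Main()
        {
            int[] source = Enumerable.Range(0, 100).ToArray();

            // ForAll runs the action on the query's worker threads and skips the
            // merge step that iterating the query results with foreach would require.
            source.AsParallel()
                  .Where(x => x % 10 == 0)
                  .ForAll(x => Console.WriteLine(x));
        }
    }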

Slide 24: Exceptions

The .NET implementation of the Parallel Loop pattern ensures that exceptions thrown during the execution of a loop body are not lost.
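
They are collected and rethrown as a single AggregateException once the loop finishes; a sketch of handling it (the failing condition is artificial):

    using System;
    using System.Threading.Tasks;

    static class LoopExceptionDemo
    {
        static void Main()
        {
            try
            {
                Parallel.For(0, 100, i =>
                {
                    if (i == 42) throw new InvalidOperationException("bad item " + i);
                });
            }
            catch (AggregateException ae)
            {
                // One inner exception per failed iteration.
                foreach (Exception e in ae.InnerExceptions)
                    Console.WriteLine(e.Message);
            }
        }
    }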

Slide 25: Parallel loop variations

Parallel loops

Slide 26: Dependencies between loop iterations

Writing to shared variables

Slide 27: Dependencies between loop iterations

Referencing data types that are not thread safe

Slide 28: Breaking out of loops early

Sequential iteration

Slide 29: Parallel Break

Use Break to exit a loop early while ensuring that lower-indexed steps complete.
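
A sketch using the loop-body overload that receives a ParallelLoopState; the data and the search condition are illustrative:

    using System;
    using System.Threading.Tasks;

    static class BreakDemo
    {
        static void Main()
        {
            int[] data = { 1, 3, 5, 8, 9, 11 };

            Parallel.For(0, data.Length, (i, loopState) =>
            {
                if (data[i] % 2 == 0)
                {
                    // Break: no new iterations above i are started, but iterations
                    // below i are still guaranteed to run to completion.
                    loopState.Break();
                    return;
                }
                Console.WriteLine("checked index " + i);
            });
        }
    }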

Slide 30: Parallel Break

Calling Break doesn't stop other steps that might have already started running.

Slide 31: ParallelLoopResult
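
Parallel.For and Parallel.ForEach return a ParallelLoopResult describing how the loop ended; a sketch of inspecting it after a Break (the break condition is illustrative):

    using System;
    using System.Threading.Tasks;

    static class LoopResultDemo
    {
        static void Main()
        {
            ParallelLoopResult result = Parallel.For(0, 1000, (i, loopState) =>
            {
                if (i == 500) loopState.Break();
            });

            // IsCompleted is false because the loop was exited early;
            // LowestBreakIteration is the lowest index from which Break was called.
            Console.WriteLine("Completed: " + result.IsCompleted);
            Console.WriteLine("Lowest break iteration: " + result.LowestBreakIteration);
        }
    }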

Slide 32: Parallel Stop

Use Stop to exit a loop early when you don't need all lower-indexed iterations to run before terminating the loop.
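
A sketch of an unordered search that calls Stop as soon as any match is found; the data and the predicate are illustrative:

    using System;
    using System.Threading;
    using System.Threading.Tasks;

    static class StopDemo
    {
        static void Main()
        {
            int[] data = { 7, 3, 9, 4, 5 };
            int found = -1;

            Parallel.For(0, data.Length, (i, loopState) =>
            {
                if (data[i] % 2 == 0)
                {
                    // Record the first match that wins the race, then stop all work.
                    Interlocked.CompareExchange(ref found, data[i], -1);
                    loopState.Stop();
                }
            });

            Console.WriteLine(found);
        }
    }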

Slide 33: External Loop Cancellation
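
A sketch of external cancellation through a CancellationToken supplied in ParallelOptions; in real code the token would be signalled by a UI handler or another component rather than the helper task used here:

    using System;
    using System.Threading;
    using System.Threading.Tasks;

    static class CancellationDemo
    {
        static void Main()
        {
            CancellationTokenSource cts = new CancellationTokenSource();
            ParallelOptions options = new ParallelOptions { CancellationToken = cts.Token };

            // Stand-in for an external cancel request (e.g. a Cancel button).
            Task.Factory.StartNew(() => { Thread.Sleep(100); cts.Cancel(); });

            try
            {
                Parallel.For(0, int.MaxValue, options, i =>
                {
                    Thread.SpinWait(10000);   // stand-in for real work
                });
            }
            catch (OperationCanceledException)
            {
                Console.WriteLine("Loop was cancelled externally.");
            }
        }
    }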

Slide 34: Special handling of small loop bodies

Slide 35: Special handling of small loop bodies

The number of ranges that a Partitioner object creates depends on the number of cores in your computer.
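
When the body is only a few instructions long, per-iteration delegate overhead can dominate; Partitioner.Create hands each task a whole index range instead. A sketch (the array and the per-element work are illustrative):

    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    static class PartitionerDemo
    {
        static void Main()
        {
            double[] results = new double[10000000];

            // Each delegate call now processes a whole [from, to) range,
            // amortizing the invocation cost over many tiny iterations.
            Parallel.ForEach(Partitioner.Create(0, results.Length), range =>
            {
                for (int i = range.Item1; i < range.Item2; i++)
                {
                    results[i] = i * 2.0;   // tiny per-element body
                }
            });
        }
    }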

Slide 36: Controlling the degree of parallelism

You usually let the system manage how the iterations of a parallel loop are mapped to your computer's cores; in some cases, you may want additional control.

Slide 37: Controlling the degree of parallelism

The PLINQ query in the code example will run with a maximum of eight tasks at any one time.
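
A sketch of both knobs, assuming a cap of eight as stated on the slide; the queries themselves are illustrative, not the slide's code:

    using System;
    using System.Linq;
    using System.Threading.Tasks;

    static class DegreeOfParallelismDemo
    {
        static void Main()
        {
            int[] source = Enumerable.Range(0, 1000000).ToArray();

            // PLINQ: at most eight tasks execute the query at any one time.
            int[] squares = source.AsParallel()
                                  .WithDegreeOfParallelism(8)
                                  .Select(x => x * x)
                                  .ToArray();
            Console.WriteLine(squares.Length);

            // Parallel loops use ParallelOptions for the same purpose.
            ParallelOptions options = new ParallelOptions { MaxDegreeOfParallelism = 8 };
            Parallel.For(0, source.Length, options, i => { source[i] = source[i] + 1; });
        }
    }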

Slide 38: Task-local state in a loop body

Sometimes you need to maintain thread-local state during the execution of a parallel loop.
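
A sketch using the Parallel.For overload that threads a task-local value through all iterations run by the same task; counting zero elements is an illustrative use, not the slide's example:

    using System;
    using System.Threading;
    using System.Threading.Tasks;

    static class TaskLocalStateDemo
    {
        static void Main()
        {
            int[] data = new int[1000000];
            int total = 0;

            Parallel.For(0, data.Length,
                () => 0,                                   // localInit: a private subtotal per task
                (i, loopState, localCount) =>              // body: runs many times per task
                {
                    if (data[i] == 0) localCount++;
                    return localCount;
                },
                localCount => Interlocked.Add(ref total, localCount));   // localFinally: merge once per task

            Console.WriteLine(total);
        }
    }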

Slide 39: Random initialization of a large array

Slide 40: The Random class in parallel

Calling the default Random constructor twice in short succession may use the same random seed.
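
A sketch that combines the task-local-state overload with one Random instance per task; the seeding scheme (hashing a fresh Guid) is one common workaround, not necessarily the slide's:

    using System;
    using System.Threading.Tasks;

    static class ParallelRandomDemo
    {
        static void Main()
        {
            double[] values = new double[1000000];

            Parallel.For(0, values.Length,
                // One Random per task, seeded so that tasks created at the same
                // instant still get different seeds.
                () => new Random(Guid.NewGuid().GetHashCode()),
                (i, loopState, random) =>
                {
                    values[i] = random.NextDouble();
                    return random;
                },
                random => { });   // nothing to merge

            Console.WriteLine(values[0]);
        }
    }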

Slide 41: Using a custom task scheduler

You can substitute custom task scheduling logic for the default task scheduler, which uses ThreadPool worker threads.
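
A sketch of plugging a scheduler in through ParallelOptions; the customScheduler parameter stands for whatever TaskScheduler subclass you supply, which is not shown here:

    using System.Threading.Tasks;

    static class CustomSchedulerDemo
    {
        static void Run(TaskScheduler customScheduler)
        {
            // customScheduler would be an instance of your own TaskScheduler subclass,
            // e.g. one that limits concurrency or runs work on dedicated threads.
            ParallelOptions options = new ParallelOptions { TaskScheduler = customScheduler };

            Parallel.For(0, 1000, options, i =>
            {
                // The loop body now runs on tasks queued to customScheduler.
            });
        }
    }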

Slide 42: Anti-Patterns

Step size other than one

Slide 43: Parallel loop design notes

Adaptive partitioning

Slide 44

Introduction to Parallel Programming
Parallel Loops
Parallel Aggregation

Slide 45: Parallel programming patterns

Slide 46: The Parallel Aggregation pattern

The pattern is more general than calculating a sum.

Slide 47: Calculating a sum

Sequential version
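
A sketch of that sequential baseline (the array is passed in; the code is illustrative); the running sum is a loop-carried dependency, which is what makes naive parallelization unsafe:

    static class SequentialSumDemo
    {
        static double SumSequential(double[] data)
        {
            double sum = 0.0;
            for (int i = 0; i < data.Length; i++)
            {
                sum += data[i];   // every iteration depends on the previous value of sum
            }
            return sum;
        }
    }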

Slide 48: Calculating a sum

PLINQ
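
A sketch of the PLINQ version; partial sums are computed per worker and combined by the runtime, so no locks appear in user code:

    using System.Linq;

    static class PlinqSumDemo
    {
        static double SumParallel(double[] data)
        {
            return data.AsParallel().Sum();
        }
    }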

Slide 49: The parallel aggregation pattern in .NET

PLINQ is usually the recommended approach.

Slide 50: Using PLINQ aggregation with range selection

The PLINQ Aggregate extension method includes an overload that allows a very general application of the parallel aggregation pattern.
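
A sketch of that overload computing a sum of squares (an illustrative aggregation, not the slide's); the four delegates supply the per-partition seed, the per-element update, the combination of partition results, and the final projection:

    using System.Linq;

    static class PlinqAggregateDemo
    {
        static double SumOfSquares(double[] data)
        {
            return data.AsParallel().Aggregate(
                () => 0.0,                               // seedFactory: one accumulator per partition
                (subtotal, x) => subtotal + x * x,       // updateAccumulatorFunc: fold one element in
                (total, subtotal) => total + subtotal,   // combineAccumulatorsFunc: merge partitions
                total => total);                         // resultSelector: final projection
        }
    }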

Slide 51: Design notes

Aggregation using Parallel.For and Parallel.ForEach
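
A sketch of the same kind of aggregation done with Parallel.ForEach: each task folds into a task-local subtotal and takes a lock only once, when it merges that subtotal into the shared total (the names are illustrative):

    using System.Collections.Generic;
    using System.Threading.Tasks;

    static class ParallelForEachAggregationDemo
    {
        static double SumOfSquares(IEnumerable<double> data)
        {
            object mergeLock = new object();
            double total = 0.0;

            Parallel.ForEach(data,
                () => 0.0,                                      // localInit: per-task subtotal
                (x, loopState, subtotal) => subtotal + x * x,   // body: no shared state touched
                subtotal =>                                     // localFinally: runs once per task
                {
                    lock (mergeLock) { total += subtotal; }
                });

            return total;
        }
    }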

Slide 52: Design notes

Aggregation in PLINQ does not require the developer to use locks.

Slide 53: Task Parallel Library. Data Parallelism Patterns


