Parallel Computing
Parallel computing is the simultaneous use of multiple compute resources to solve a computational problem:
1. Multiple processors are used.
2. The problem is broken into discrete parts that can be solved concurrently.
3. Each discrete part is further broken down into a series of instructions.
4. Instructions from the different parts execute simultaneously on different processors.
5. An overall control/coordination mechanism is employed.
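The steps above can be sketched in code. The following is a minimal, illustrative Python example (not from the original notes): a problem, summing a large array, is decomposed into discrete parts, each part is a series of instructions over its own slice, and the parts run concurrently on different processors, with the process pool acting as the control/coordination mechanism. All names here are hypothetical.

```python
from multiprocessing import Pool

def sum_chunk(chunk):
    # Each discrete part: a sequence of add instructions over its slice.
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    n_parts = 4  # one part per processor (assuming 4 cores)
    size = len(data) // n_parts
    chunks = [data[i * size:(i + 1) * size] for i in range(n_parts)]
    with Pool(processes=n_parts) as pool:       # control/coordination mechanism
        partials = pool.map(sum_chunk, chunks)  # parts execute concurrently
    total = sum(partials)                       # combine the partial results
    print(total)  # -> 499999500000, same as sum(range(1_000_000))
```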
Concepts and Terminology
1. Supercomputing and HPC (High Performance Computing).
2. Node: composed of multiple processors/CPUs/cores, memory, a network interface, etc.
3. Task: a logically discrete section of computational work.
4.Shared Memory: From a strictly hardware point of view, describes a computer architecture where all processors have direct access to a common physical memory.
5. Symmetric Multi-Processor (SMP).
6. Communications: parallel tasks typically need to exchange data.
7. Synchronization: the coordination of parallel tasks in real time.
8. Granularity: in parallel computing, granularity is a qualitative measure of the ratio of computation to communication.
Coarse: relatively large amounts of computational work are done between communication events.
Fine: relatively small amounts of computational work are done between communication events.
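The coarse/fine distinction in item 8 can be made concrete with a small sketch. In this hypothetical example (not from the original notes), each task hand-off counts as one communication event, so splitting the same work into fewer, larger tasks yields a higher computation-to-communication ratio (coarse), while per-element tasks yield a lower one (fine).

```python
def run_tasks(data, chunk_size):
    """Split data into tasks of chunk_size elements; return
    (result, number_of_communication_events)."""
    comm_events = 0
    total = 0
    for i in range(0, len(data), chunk_size):
        comm_events += 1                      # one exchange per task hand-off
        total += sum(data[i:i + chunk_size])  # computation done within the task
    return total, comm_events

data = list(range(1000))
coarse = run_tasks(data, chunk_size=250)  # few, large tasks
fine = run_tasks(data, chunk_size=1)      # many, tiny tasks
print(coarse)  # (499500, 4)    -> much computation per communication event
print(fine)    # (499500, 1000) -> communication overhead dominates
```

Both runs compute the same result; only the ratio of computation to communication changes, which is exactly what granularity measures.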