1 Jun 2009

PARALLEL COMPUTING

LEARNING DIARY ON PARALLEL COMPUTING
This is one of the presentations, briefly presented in concept by
Mariam Mwakisisile and Abdallah R. Sasya.

INTRODUCTION
Traditionally, software has been written for serial computation: a program runs on a single computer with a single Central Processing Unit (CPU), and its instructions are executed one after another.
(a)CENTRAL PROCESSING UNIT
By definition, the CPU is an integrated electronic device that interprets the instructions of a computer program, performs the logical and arithmetic processing operations, and initiates the input and output operations.

MEANING
Parallel computing is a form of computation in which many calculations are carried out simultaneously, operating on the principle that large problems can often be divided into smaller ones, which are then solved concurrently, "in parallel".
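A minimal sketch of this principle, assuming Python's standard multiprocessing module: a large problem (summing a million numbers) is divided into smaller chunks, and each chunk is summed by a separate worker process. The chunk count and worker count are arbitrary choices for illustration.

```python
# Divide a large problem into smaller ones and solve them "in parallel".
from multiprocessing import Pool

def chunk_sum(chunk):
    """Solve one small sub-problem: sum a slice of the data."""
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    # Divide the large problem into four smaller ones.
    chunks = [data[i::4] for i in range(4)]
    with Pool(processes=4) as pool:
        partial_sums = pool.map(chunk_sum, chunks)  # solved concurrently
    total = sum(partial_sums)  # combine the small results
    print(total == sum(data))  # the divided approach gives the same answer
```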

AIM AND IDEA OF PARALLEL COMPUTING
The ever-increasing need for computing power has traditionally been met by producing new, more powerful processors, but physical limits restrict that growth; carrying out many calculations simultaneously is another way to gain speed. In 1947 the Hungarian-American mathematician John von Neumann asked himself: if a single computer works out a problem in ten seconds, can ten computers solve the same problem in one second? He thereby documented the need for parallelism in computers to solve problems.

FORMS OF PARALLEL COMPUTING
There are at least four forms of parallel computing, presented here as follows.
(i)BIT LEVEL PARALLELISM.
-Is a form of parallel computing based on increasing the processor word size, i.e. the number of bits the processor handles in one operation (the word size or word length).

(ii)INSTRUCTION LEVEL PARALLELISM.
- Is a measure of how many of the operations in a computer program can be performed simultaneously, e.g. a + b = e and c + d = f can run at the same time, while e * f = g must wait for both results.
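The operations above (a + b, c + d, e * f) can be written out explicitly to show which ones are independent:

```python
# The two additions do not depend on each other, so a processor with
# instruction-level parallelism can execute them in the same cycle;
# the multiplication depends on both results, so it must run afterwards.
a, b, c, d = 1, 2, 3, 4

e = a + b  # independent of the next line ...
f = c + d  # ... so these two can execute simultaneously
g = e * f  # depends on e and f, so it runs last

print(g)  # (1 + 2) * (3 + 4) = 21
```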
(iii)DATA PARALLELISM.
- Also known as loop-level parallelism, is a form of parallelization of computation across multiple processors in parallel computing environments. Data parallelism focuses on distributing the data across the various parallel computing nodes. It contrasts with task parallelism as another form of parallelism.
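A minimal data-parallelism sketch, again assuming Python's standard multiprocessing module: the same operation (squaring) is applied to different pieces of the data on different worker processes. The worker count is an arbitrary choice for illustration.

```python
# Data parallelism: one function, many pieces of data.
from multiprocessing import Pool

def square(x):
    return x * x

if __name__ == "__main__":
    data = [1, 2, 3, 4, 5, 6, 7, 8]
    with Pool(processes=4) as pool:
        # Each worker squares its own share of the list.
        result = pool.map(square, data)
    print(result)  # [1, 4, 9, 16, 25, 36, 49, 64]
```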
(iv)TASK PARALLELISM
-Also known as function parallelism or control parallelism, is a form of parallelization of computer code across multiple processors in parallel computing environments. Task parallelism focuses on distributing execution processes (threads) across different parallel computing nodes. It contrasts with data parallelism as another form of parallelism.
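A minimal task-parallelism sketch, assuming Python's standard concurrent.futures module: here two different functions (tasks) run concurrently on different threads, in contrast to the data-parallel case where one function runs on different data.

```python
# Task parallelism: different tasks, run at the same time.
from concurrent.futures import ThreadPoolExecutor

def count_words(text):
    return len(text.split())

def count_chars(text):
    return len(text)

if __name__ == "__main__":
    text = "parallel computing divides work among processors"
    with ThreadPoolExecutor(max_workers=2) as pool:
        # Two distinct tasks submitted to run concurrently.
        words = pool.submit(count_words, text)
        chars = pool.submit(count_chars, text)
    print(words.result(), chars.result())
```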

USES OF PARALLEL COMPUTING
(i)ELECTRICAL ENGINEERING e.g. circuit design.
In this field parallel computing is used to design and simulate the various circuits that we use in our homes.
(ii)COMPUTER SCIENCE e.g. mathematical computation.
In the field of computer science, parallel computing is used for the large mathematical calculations and manipulations that arise in daily life.
(iii)APPLIED PHYSICS e.g. nuclear simulation.
In many laboratories, physicists use parallel computers in nuclear and atomic research, including weapons design.

IMPORTANCE OF PARALLEL COMPUTING
(i)IT SAVES TIME.
-Programs take less time to execute because the work is shared among processors.
(ii)DATA SHARING
-Resources and intermediate results can be passed from one processor to another, so the whole task is accomplished through parallelism.
(iii)SOLVES COMPLEX PROBLEMS
-It helps in solving large problems by breaking them into smaller, manageable ones.
(iv)PROVIDES CONCURRENCY.
-Parallelism lets several computations run at once and combine their inputs into a single result.

CHALLENGES
(i)Very expensive.
-A large amount of money is needed to build and run parallel hardware.
(ii)It increases traffic on the shared memory-to-CPU path.
(iii)The programmer must ensure correct access to global (shared) memory.
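The last challenge above can be sketched with Python's standard threading module: two threads update a shared counter, and the programmer guards it with a lock so that no update is lost when the threads interleave.

```python
# Correct access to shared (global) memory: guard updates with a lock.
import threading

counter = 0
lock = threading.Lock()

def add_many(n):
    global counter
    for _ in range(n):
        with lock:  # ensures each increment completes without interference
            counter += 1

threads = [threading.Thread(target=add_many, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 200000 with the lock; updates can be lost without it
```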

CONCLUSION
Apart from the challenges mentioned above, parallel computing is important because it takes less time to execute a problem, allows information to be shared, and solves large problems.


By Venance
