DocumentCode :
1458385
Title :
Components in the Pipeline
Author :
Gorton, Ian ; Wynne, Adam ; Liu, Yan ; Yin, Jian
Volume :
28
Issue :
3
fYear :
2011
Firstpage :
34
Lastpage :
40
Abstract :
State-of-the-art scientific instruments and simulations routinely produce massive datasets that require intensive processing to disclose key features of the artifact or model under study. Scientists commonly structure this processing as data-processing pipelines, which follow the pipe-and-filter architecture pattern. Different stages typically communicate using files; each stage is an executable program that performs the processing needed at that point in the pipeline. The MeDICi (Middleware for Data-Intensive Computing) Integration Framework supports constructing complex software pipelines from distributed, heterogeneous components and controlling qualities of service to meet performance, reliability, and communication requirements.
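The pipe-and-filter pattern the abstract describes can be sketched in a few lines. This is an illustrative sketch of the general pattern only, not the MeDICi API: the stage functions and the `run_pipeline` helper are hypothetical names, and real pipelines of this kind pass files between executable stages rather than in-memory strings.

```python
# Minimal pipe-and-filter sketch (illustrative; not the MeDICi API).
# Each filter is an independent processing stage; the pipe feeds the
# output of one stage into the input of the next.

def to_upper(text):
    """Filter: normalize the data to upper case."""
    return text.upper()

def drop_blank_lines(text):
    """Filter: remove empty lines from the data."""
    return "\n".join(line for line in text.splitlines() if line.strip())

def run_pipeline(data, stages):
    """Pipe: thread the data through each stage in order."""
    for stage in stages:
        data = stage(data)
    return data

result = run_pipeline("hello\n\nworld", [to_upper, drop_blank_lines])
print(result)  # HELLO\nWORLD
```

Because each stage only agrees on the data format with its neighbors, stages can be replaced, reordered, or distributed across machines, which is the flexibility a framework like MeDICi builds on.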
Keywords :
middleware; pipeline processing; software quality; software reliability; MeDICi integration framework; data-processing pipelines; massive datasets; middleware for data-intensive computing; pipe-and-filter architecture; qualities of service; reliability; state-of-the-art scientific instruments; data preprocessing; protocols; quality of service; software engineering; components; pipelines; scientific software
fLanguage :
English
Journal_Title :
Software, IEEE
Publisher :
IEEE
ISSN :
0740-7459
Type :
jour
DOI :
10.1109/MS.2011.23
Filename :
5719591