Title :
Components in the Pipeline
Author :
Gorton, Ian ; Wynne, Adam ; Liu, Yan ; Yin, Jian
Abstract :
State-of-the-art scientific instruments and simulations routinely produce massive datasets requiring intensive processing to reveal key features of the artifact or model under study. Such processing applications are commonly called data-processing pipelines and are structured according to the pipe-and-filter architecture pattern. Different stages typically communicate using files; each stage is an executable program that performs the processing needed at that point in the pipeline. The MeDICi (Middleware for Data-Intensive Computing) Integration Framework supports constructing complex software pipelines from distributed, heterogeneous components and controlling qualities of service to meet performance, reliability, and communication requirements.
Keywords :
middleware; pipeline processing; software quality; software reliability; MeDICi integration framework; data-processing pipelines; massive datasets; middleware for data-intensive computing; pipe-and-filter architecture; qualities of service; reliability; state-of-the-art scientific instruments; data preprocessing; protocols; quality of service; software engineering; components; pipelines; scientific software
Journal_Title :
IEEE Software