Stateful scheduling is critical to the performance of a distributed stream computing system. In such a system, inappropriate task deployment lowers the resource utilization of the cluster and introduces additional communication between compute nodes. Moreover, an online adjustment of the task deployment scheme suffers from slow state recovery during task restarts. To address these issues, we propose a state-lossless scheduling strategy (Sl-Stream) that optimizes both task deployment and the state recovery process. This paper discusses the strategy from the following aspects: (1) A stream application model and a resource model are constructed, together with formalizations of the subgraph partitioning, task deployment, and stateful scheduling problems...
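To make the abstract's "stream application model" and "resource model" concrete, the following is a minimal sketch, assuming the common directed-acyclic-graph view of a streaming job (vertices are operator tasks with resource demands and state sizes, edges carry tuple rates) and a cluster of nodes with CPU and memory capacities. All class and field names here are illustrative assumptions, not the paper's actual formalization.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

# Illustrative models only; Sl-Stream's actual formalization may differ.

@dataclass
class Task:
    """A vertex of the stream application DAG: one operator task."""
    task_id: str
    cpu_demand: float      # normalized CPU share required
    state_size_mb: float   # operator state size, relevant to recovery cost

@dataclass
class StreamApplication:
    """Stream application modeled as a DAG of tasks and tuple-rate edges."""
    tasks: Dict[str, Task] = field(default_factory=dict)
    # edge (u, v) -> tuples per second flowing from task u to task v
    edges: Dict[Tuple[str, str], float] = field(default_factory=dict)

    def add_task(self, task: Task) -> None:
        self.tasks[task.task_id] = task

    def add_edge(self, src: str, dst: str, rate: float) -> None:
        self.edges[(src, dst)] = rate

@dataclass
class ComputeNode:
    """A compute node in the resource model."""
    node_id: str
    cpu_capacity: float
    mem_capacity_mb: float

def cross_node_traffic(app: StreamApplication,
                       placement: Dict[str, str]) -> float:
    """Total tuple rate crossing node boundaries under a given placement.

    `placement` maps task_id -> node_id; a lower value means fewer tuples
    are shipped between compute nodes, one of the costs a task-deployment
    scheme tries to reduce.
    """
    return sum(rate for (u, v), rate in app.edges.items()
               if placement[u] != placement[v])

if __name__ == "__main__":
    app = StreamApplication()
    for tid, state in [("source", 0), ("count", 256), ("sink", 0)]:
        app.add_task(Task(tid, cpu_demand=0.5, state_size_mb=state))
    app.add_edge("source", "count", rate=10_000)
    app.add_edge("count", "sink", rate=1_000)

    placement = {"source": "node-1", "count": "node-1", "sink": "node-2"}
    print(cross_node_traffic(app, placement))  # -> 1000
```

Under a model like this, subgraph partitioning amounts to choosing the placement map so that cross-node traffic and per-node resource overload are both kept low.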
The era of big data has led to the emergence of new systems for real-time distributed stream proc...
This paper describes a novel scheme for job admission and resource allocation employed by th...
Fog computing is rapidly changing the distributed computing landscape by extending the Cloud computi...
Task scheduling in distributed stream computing systems is an NP-complete problem. Current schedulin...
With ever increasing data volumes, large compute clusters that process data in a distributed manner ...
Distributed Stream Processing Systems (DSPS) are ``Fast Data'' platforms that allow streaming applic...
The velocity dimension of Big Data refers to the need to rapidly process data that arrives continuou...
In the era of big data, with streaming applications such as social media, surveillance monitoring an...
Shuffle grouping is a technique used by stream processing frameworks to share input load among paral...
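Several of the entries above concern shuffle grouping, i.e. spreading incoming tuples across the parallel instances of an operator to share input load. As a point of reference, a minimal round-robin dispatcher is sketched below; this is an assumed baseline, not the load-aware schemes those papers propose, and all names are illustrative.

```python
from itertools import count
from typing import Any, List

class ShuffleGrouping:
    """Round-robin shuffle grouping: each incoming tuple is sent to the
    next downstream task in turn, so load is shared evenly regardless of
    tuple content (appropriate for stateless operators)."""

    def __init__(self, downstream_tasks: List[str]) -> None:
        self.downstream_tasks = downstream_tasks
        self._counter = count()

    def route(self, tuple_: Any) -> str:
        """Return the task id that should receive this tuple."""
        idx = next(self._counter) % len(self.downstream_tasks)
        return self.downstream_tasks[idx]

if __name__ == "__main__":
    grouping = ShuffleGrouping(["task-0", "task-1", "task-2"])
    routed = [grouping.route({"word": w}) for w in ["a", "b", "c", "d"]]
    print(routed)  # ['task-0', 'task-1', 'task-2', 'task-0']
```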
This paper describes the SODA scheduler for System S, a highly scalable distributed stream processin...
In this study, we investigated the problem of scheduling streaming applications on a heterogeneous c...