
AI System Automates Allocation of Workloads to Different Servers


Published on: Aug 26, 2019

Researchers at MIT have developed a new AI system that automatically learns how to schedule data-processing operations across thousands of servers. This task was traditionally handled by imprecise, human-designed algorithms; now, the AI system learns the scheduling algorithm itself and carries out the task with greater precision. The advance is expected to be especially valuable for data centers, which are in constant need of power.

Each data center houses many servers that continuously run data-processing jobs submitted by users and developers. A cluster scheduling algorithm assigns these incoming jobs to servers so that the available computing resources are used effectively. The assignment happens in real time and must be fast.

These scheduling algorithms, however, are typically fine-tuned by humans on the basis of basic policies, trade-offs, and guidelines. An engineer might, for example, embed code in the algorithm to speed up certain tasks or to divide resources equally among jobs. But workloads come in many different sizes, so it is virtually impossible for individuals to optimize their algorithms for every type of workload; as a result, the schedulers fall short of their full potential and efficiency.
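The kind of fixed, hand-tuned heuristic described above can be sketched in a few lines. The example below is purely illustrative (it is not the code of any real cluster scheduler): a "least-loaded" placement rule that greedily sends each job to the server with the shortest queue. Because the rule is baked into the code, it cannot adapt when the workload mix changes.

```python
import heapq

def assign_jobs(job_sizes, num_servers):
    """Greedily place each job on the currently least-loaded server.

    Returns a list mapping each job index to a server index.
    """
    # Min-heap of (current_load, server_index) pairs.
    servers = [(0, s) for s in range(num_servers)]
    heapq.heapify(servers)
    placement = []
    for size in job_sizes:
        load, s = heapq.heappop(servers)        # least-loaded server
        placement.append(s)
        heapq.heappush(servers, (load + size, s))
    return placement

# Example: five jobs of varying size spread over two servers.
print(assign_jobs([4, 2, 1, 3, 5], 2))  # → [0, 1, 1, 1, 0]
```

Note that the heuristic treats every workload the same way; a job mix where, say, short jobs should jump the queue would require a human to rewrite the rule by hand.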

System Helps in Decision-Making and Optimizing Trade-Offs

The researchers offloaded this manual coding to the machines themselves. In a paper presented at SIGCOMM, the MIT team described a system that leverages reinforcement learning (RL), a trial-and-error machine-learning approach, to design scheduling algorithms and make decisions for each type of workload in a particular server cluster. For this task, the team built advanced RL techniques that train the system to allocate complex workloads: it explores different ways of distributing work across the servers and finds an optimal trade-off between fast processing and efficient use of computing resources. According to the study's authors, no human intervention is needed beyond basic instructions.
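The trial-and-error idea can be illustrated with a toy sketch. Everything below is an illustrative assumption, not the MIT system (which uses a much richer graph representation of jobs and a neural policy network): a tabular learner observes which of two servers is busier, picks a server for the next job, and is rewarded for choosing short queues. Over many trials it learns, on its own, to prefer the less-loaded server.

```python
import random

def train_policy(episodes=2000, seed=0):
    """Toy trial-and-error scheduler (illustrative only).

    State:  which server is currently more loaded (0, 1, or 2 for a tie).
    Action: which of two servers receives the next job.
    Reward: negative load of the chosen server (shorter queue = better).
    """
    rng = random.Random(seed)
    # Q[state][action]: estimated value of sending a job to `action`.
    Q = {s: [0.0, 0.0] for s in (0, 1, 2)}
    alpha, epsilon = 0.1, 0.2
    for _ in range(episodes):
        loads = [rng.randint(0, 5), rng.randint(0, 5)]
        if loads[0] > loads[1]:
            state = 0
        elif loads[1] > loads[0]:
            state = 1
        else:
            state = 2
        # Epsilon-greedy: mostly exploit the best-known action,
        # occasionally explore the other one.
        if rng.random() < epsilon:
            action = rng.randrange(2)
        else:
            action = max((0, 1), key=lambda a: Q[state][a])
        reward = -loads[action]
        Q[state][action] += alpha * (reward - Q[state][action])
    return Q

Q = train_policy()
# After training, the learned values favor the less-loaded server:
# when server 0 is busier (state 0), action 1 scores higher, and vice versa.
```

The point of the sketch is that the preference is never written down as a rule; it emerges from rewards alone, which is what lets an RL scheduler adapt to workload patterns a human tuner never anticipated.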