New record for sorting large amounts of data set by researchers at the Karlsruhe Institute of Technology


Researchers at the Karlsruhe Institute of Technology (KIT) have developed a new, robust method for sorting large quantities of data. They beat the previous record held by the Massachusetts Institute of Technology (MIT) and, more importantly, did so while consuming fewer resources.

Networked computers on the Internet produce ever larger quantities of data. Before this data can be processed, it must first be sorted according to the relevant criteria. Efficient sorting of data is therefore of growing importance for search engines and databases, and a central research topic in both theory and practice.
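To illustrate the underlying problem, the sketch below shows a classical external merge sort in Python: data too large for main memory is sorted in chunks and then merged. This is only an illustrative example, not the method used by the KIT team; the chunk size and function names are arbitrary.

```python
# Illustrative sketch of external merge sort: sort a text file that does not
# fit into main memory by sorting fixed-size chunks and merging them.
# NOT the KIT method; chunk size and names are arbitrary choices.
import heapq
import os
import tempfile


def external_sort(input_path, output_path, chunk_bytes=1_000_000):
    """Sort the lines of a large text file using limited memory."""
    chunk_files = []
    with open(input_path) as src:
        while True:
            # Read roughly chunk_bytes worth of lines and sort them in memory.
            chunk = src.readlines(chunk_bytes)
            if not chunk:
                break
            chunk.sort()
            tmp = tempfile.NamedTemporaryFile(mode="w+", delete=False)
            tmp.writelines(chunk)
            tmp.seek(0)
            chunk_files.append(tmp)

    # k-way merge of the sorted chunks into the output file.
    with open(output_path, "w") as dst:
        dst.writelines(heapq.merge(*chunk_files))

    for tmp in chunk_files:
        tmp.close()
        os.unlink(tmp.name)
```

In practice, record-setting sorters distribute this kind of work across many machines and overlap disk I/O, network transfer, and in-memory sorting, which is where most of the engineering effort lies.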

For years, the Sort Benchmark, a table freely available on the Internet, has recorded the current records in data sorting. The premier discipline is sorting at least 100 trillion bytes (100 terabytes) as quickly as possible. A team of researchers led by Prof. Peter Sanders at KIT's Institute of Theoretical Informatics prevailed in two categories of the Sort Benchmark. They managed to sort 100 terabytes in less than three hours, which corresponds to an average of 564 GB per minute. To achieve this, they used a cluster of 200 computing nodes. A team at Internet giant Yahoo had exceeded the mark of 564 GB per minute, but needed 17 times as many nodes to do so.
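The reported average can be checked with a quick back-of-the-envelope calculation, assuming decimal units (1 terabyte = 1,000 gigabytes) and using only the figures quoted above:

```python
# Back-of-the-envelope check of the reported throughput.
# Assumes decimal units (1 TB = 1,000 GB); figures taken from the article.
total_gb = 100 * 1000        # 100 terabytes expressed in gigabytes
rate_gb_per_min = 564        # reported average throughput

minutes = total_gb / rate_gb_per_min
print(f"Total time: {minutes:.1f} min ({minutes / 60:.2f} h)")  # ~177 min, just under 3 hours

per_node = rate_gb_per_min / 200  # spread over the 200 nodes used
print(f"Per node: {per_node:.2f} GB per minute")                # ~2.8 GB/min per node
```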

In addition, the KIT researchers raised the record for the amount of data that can be sorted in one minute. The new value is 950 GB, three times the previous record held by MIT and twice Yahoo's record in this category. The KIT researchers also improved on Google's record for sorting one terabyte the fastest, lowering it from 68 to 64 seconds, again with lower resource consumption.
