# Robotic Automation
COLLABORATIVE EFFORT BY ROBOTS PRODUCES RESULTS
MIT Algorithm Creates New Machine-Learning Model That Doesn't Require Data Aggregation
Researchers at MIT’s Laboratory for Information and Decision Systems have come up with an algorithm that allows independent agents (for example, robots) to collectively develop a machine-learning model without data aggregation. Machine learning, in which computers acquire new competences by looking for patterns in training data, is the technique most autonomous robots use to build models of their surrounding environments. The team will present its findings at the 2014 Conference on Uncertainty in Artificial Intelligence, July 23–27, 2014.
During their experiments, the team’s distributed algorithm outperformed a standard algorithm that relies on data being stored in a single location. They devised an approach in which robots exploring a building gather data and analyze it separately; when pairs of robots cross paths, they exchange analyses and thereby build a more complete picture of their environment. According to Trevor Campbell, who wrote the paper, “If smaller chunks of data are first processed by individual robots and then combined, the final model is less likely to get stuck at a bad solution.”
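As a rough illustration of that idea (not the researchers’ actual algorithm, which is more sophisticated), the hypothetical Python sketch below has each robot keep only a compact summary of its own observations; when two robots meet, they merge summaries rather than exchanging raw data.

```python
import numpy as np

class Robot:
    """Keeps sufficient statistics (count and running sum) of its own
    observations, so two robots can combine their analyses without
    ever exchanging the raw data points themselves."""

    def __init__(self):
        self.count = 0
        self.total = np.zeros(2)  # running sum of 2-D observations

    def observe(self, point):
        # Process each data point locally as it is gathered.
        self.count += 1
        self.total += np.asarray(point, dtype=float)

    def estimate(self):
        # Local model: here, simply the mean of everything seen so far.
        return self.total / self.count if self.count else None

    def merge(self, other):
        # When two robots cross paths they exchange summaries, not raw data,
        # and both leave the encounter with the combined statistics.
        self.count += other.count
        self.total += other.total
        other.count, other.total = self.count, self.total.copy()

# Hypothetical usage: two robots exploring different parts of a building.
rng = np.random.default_rng(0)
a, b = Robot(), Robot()
for p in rng.normal([0.0, 0.0], 1.0, size=(50, 2)):
    a.observe(p)
for p in rng.normal([5.0, 5.0], 1.0, size=(50, 2)):
    b.observe(p)

a.merge(b)           # robots meet and combine their analyses
print(a.estimate())  # roughly [2.5, 2.5], the combined estimate
```

The design point this sketch tries to capture is that each robot only ever transmits a small summary of its local analysis, so no central store of raw data is needed.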
The applications of this distributed algorithm go beyond robot learning; it could also have a significant impact on how big data is handled on the web.