
Google Chips Away at Problems at “Mega-Batch” Scale


As Google’s batch sizes for AI training continue to skyrocket, with some ranging from over 100,000 to one million examples, the company’s research arm is looking at ways to improve everything from efficiency and scalability to privacy for those whose data is used in large-scale training runs.

This week Google Research published a number of pieces on new problems emerging at “mega-batch” training scale for some of its most-used models.

One of the most noteworthy items from the large-scale training trenches concerns batch active learning at million-example batch sizes. In essence, active learning cuts down on the amount of labeled training data (and thus compute resources and time) by selecting only the most informative examples for labeling, which is great for efficiency but comes with downsides in terms of flexibility and accuracy.
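
For context, the standard building block here is margin-based uncertainty sampling: examples where the model’s top two class probabilities are close together are the ones most worth labeling. The sketch below illustrates the idea only; the function names and use of numpy are our own assumptions, not Google’s implementation.

```python
import numpy as np

def margin_scores(probs: np.ndarray) -> np.ndarray:
    """Margin = gap between the top-2 class probabilities per example.

    A small margin means the model is uncertain about that example,
    so labeling it is likely to be informative.
    """
    # Sort class probabilities ascending, keep the two largest.
    top2 = np.sort(probs, axis=1)[:, -2:]
    return top2[:, 1] - top2[:, 0]

def select_batch(probs: np.ndarray, batch_size: int) -> np.ndarray:
    """Pick the batch_size examples with the smallest margins."""
    return np.argsort(margin_scores(probs))[:batch_size]
```

The catch at million-example scale is that plain margin sampling tends to pick clumps of near-duplicate examples, which wastes labeling budget. That is the gap cluster-aware methods aim to close.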

Google Research has developed its own batch active learning algorithm, called Cluster-Margin, to layer into its training pipelines, which they say can operate at…
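
The excerpt cuts off above, but based on Google’s published description of Cluster-Margin, the diversification step works roughly as follows: over-select the lowest-margin candidates, group them by precomputed embedding clusters, then fill the batch round-robin across clusters so it stays diverse. The sketch below is a loose illustration of that idea under those assumptions; the `overselect` factor and variable names are hypothetical.

```python
from collections import defaultdict
import numpy as np

def cluster_margin_select(probs, cluster_ids, batch_size, overselect=10):
    """Round-robin selection across embedding clusters (a sketch of the
    Cluster-Margin idea, not Google's code).

    probs:       (n, num_classes) model probabilities for the unlabeled pool
    cluster_ids: (n,) precomputed cluster assignment per example
    """
    # 1. Over-select the lowest-margin (most uncertain) candidates.
    top2 = np.sort(probs, axis=1)[:, -2:]
    margins = top2[:, 1] - top2[:, 0]
    candidates = np.argsort(margins)[: batch_size * overselect]

    # 2. Group candidates by their precomputed embedding cluster.
    by_cluster = defaultdict(list)
    for idx in candidates:
        by_cluster[cluster_ids[idx]].append(idx)

    # 3. Visit clusters smallest-first, taking one example per cluster
    #    per pass, until the batch is full. This spreads the batch
    #    across clusters instead of piling up near-duplicates.
    queues = sorted(by_cluster.values(), key=len)
    batch = []
    while len(batch) < batch_size and any(queues):
        for q in queues:
            if q and len(batch) < batch_size:
                batch.append(q.pop(0))
    return np.array(batch)
```

The appeal of this structure is that the expensive clustering is done once up front on embeddings, while each selection round only needs cheap margin scores plus a round-robin pass, which is what makes it plausible at million-example batch sizes.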

Full article: www.nextplatform.com