DeepMind breaks 50-year math record using AI; new record falls a week later [Ars Technica]

A colorful 3×3 matrix. (credit: Aurich Lawson / Getty Images)

Matrix multiplication is at the heart of many machine learning breakthroughs, and it just got faster—twice. Last week, DeepMind announced it discovered a more efficient way to perform matrix multiplication, breaking a 50-year-old record. This week, two Austrian researchers at Johannes Kepler University Linz claim they have bested that new record by one step.

Matrix multiplication, which involves multiplying two rectangular arrays of numbers, is often found at the heart of speech recognition, image recognition, smartphone image processing, compression, and generating computer graphics. Graphics processing units (GPUs) are particularly good at performing matrix multiplication due to their massively parallel nature. They can dice a big matrix math problem into many pieces and attack parts of it simultaneously with a special algorithm.
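As a rough illustration (not DeepMind's algorithm), the minimal Python sketch below shows the traditional "schoolbook" approach: three nested loops, where multiplying an m×n matrix by an n×p matrix takes m·n·p scalar multiplications. It is this inner multiply-and-add work that GPUs spread across their many cores.

    # A minimal sketch of "schoolbook" matrix multiplication in plain Python.
    # Multiplying an m x n matrix A by an n x p matrix B this way takes
    # m * n * p scalar multiplications.
    def matmul(A, B):
        m, n, p = len(A), len(B), len(B[0])
        assert all(len(row) == n for row in A), "inner dimensions must match"
        C = [[0] * p for _ in range(m)]
        for i in range(m):
            for j in range(p):
                for k in range(n):
                    C[i][j] += A[i][k] * B[k][j]
        return C

    # Example: two 4x4 matrices -> 4 * 4 * 4 = 64 scalar multiplications.
    A = [[1, 2, 3, 4]] * 4
    B = [[5, 6, 7, 8]] * 4
    print(matmul(A, B))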

In 1969, a German mathematician named Volker Strassen discovered the previous-best algorithm for multiplying 4×4 matrices, which reduces the number of multiplications needed to perform the calculation. Multiplying two 4×4 matrices with the traditional schoolroom method takes 64 multiplications; Strassen’s algorithm, which multiplies 2×2 matrices using seven multiplications instead of eight and can be applied recursively to larger matrices, performs the same feat in 49 multiplications.
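To show where the count of 49 comes from, here is a minimal sketch of Strassen's 2×2 step in Python: seven multiplications instead of eight. Treating each 4×4 matrix as a 2×2 matrix of 2×2 blocks and applying the step recursively gives 7 × 7 = 49 block multiplications, versus 4³ = 64 for the schoolbook method.

    # Strassen's 2x2 step: seven multiplications instead of eight.
    def strassen_2x2(A, B):
        (a11, a12), (a21, a22) = A
        (b11, b12), (b21, b22) = B

        m1 = (a11 + a22) * (b11 + b22)
        m2 = (a21 + a22) * b11
        m3 = a11 * (b12 - b22)
        m4 = a22 * (b21 - b11)
        m5 = (a11 + a12) * b22
        m6 = (a21 - a11) * (b11 + b12)
        m7 = (a12 - a22) * (b21 + b22)

        return [[m1 + m4 - m5 + m7, m3 + m5],
                [m2 + m4,           m1 - m2 + m3 + m6]]

    # Sanity check against the schoolbook result for one pair of 2x2 matrices.
    A = [[1, 2], [3, 4]]
    B = [[5, 6], [7, 8]]
    print(strassen_2x2(A, B))  # [[19, 22], [43, 50]]

The entries here can be numbers or, in the recursive case, 2×2 blocks, since the formulas use only additions, subtractions, and multiplications.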
