Adding two matrices in Java is a common programming exercise that demonstrates the use of two-dimensional arrays. The program defines two matrices of the same dimensions, adds the corresponding elements, and stores each sum in a third matrix.
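The following is a minimal, self-contained sketch of this approach. The class and method names (`MatrixAddition`, `add`) are illustrative choices, not part of any standard API, and the example assumes both input matrices are non-empty, rectangular, and of identical dimensions.

```java
import java.util.Arrays;

public class MatrixAddition {

    // Returns a new matrix sum where sum[i][j] = a[i][j] + b[i][j].
    // Assumes a and b are rectangular and have the same dimensions.
    static int[][] add(int[][] a, int[][] b) {
        int rows = a.length;
        int cols = a[0].length;
        int[][] sum = new int[rows][cols];
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                sum[i][j] = a[i][j] + b[i][j];
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        int[][] a = { {1, 2, 3}, {4, 5, 6} };
        int[][] b = { {7, 8, 9}, {10, 11, 12} };

        int[][] sum = add(a, b);

        // Print the result row by row; expected output:
        // [8, 10, 12]
        // [14, 16, 18]
        for (int[] row : sum) {
            System.out.println(Arrays.toString(row));
        }
    }
}
```

A production version would typically also check that `a` and `b` really do share the same dimensions before looping, throwing an `IllegalArgumentException` otherwise; that validation is omitted here to keep the core loop in focus.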