I'm learning three.js and am trying to multiply two translation matrices together to get a single matrix that describes both translations:
| 1 0 0 -2 |   | 1 0 0 3 |
| 0 1 0  7 | * | 0 1 0 4 |
| 0 0 1  6 |   | 0 0 1 6 |
| 0 0 0  1 |   | 0 0 0 1 |
I was expecting the following matrix:
| 1 0 0 1 |
| 0 1 0 11 |
| 0 0 1 12 |
| 0 0 0 1 |
But instead, I'm getting this matrix:
| 1 0 0 0 |
| 0 1 0 0 |
| 0 0 1 0 |
| 1 11 12 1 |
I'm using three.js's Matrix4.multiply() method to do the multiplication:
var result = matrix1.multiply(matrix2);
Here's a jsfiddle that shows this in action.
And here's a WolframAlpha query that gives the expected result.
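For reference, the setup is roughly the following (a minimal sketch rather than the actual jsfiddle code; note that Matrix4.set() takes its arguments in row-major order):

var matrix1 = new THREE.Matrix4().set(
    1, 0, 0, -2,
    0, 1, 0,  7,
    0, 0, 1,  6,
    0, 0, 0,  1
);
var matrix2 = new THREE.Matrix4().set(
    1, 0, 0, 3,
    0, 1, 0, 4,
    0, 0, 1, 6,
    0, 0, 0, 1
);
var result = matrix1.multiply(matrix2);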
I'm obviously misunderstanding three.js's Matrix4.multiply() method. What is this method doing, and how can I instead achieve the expected behavior?
The problem is in how you're iterating over the matrix elements to print them. The .elements property returns the elements in column-major order (despite the parameters of the constructor and the .set() method being in row-major order!), so you need to transpose the row/column indices when printing; see the sketch below.

Also note that there was a typo in your code: you were printing e[13] twice.
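A minimal sketch of a row-major print (the helper name printMatrix is made up here; since .elements is column-major, entry (row, col) lives at index col * 4 + row):

function printMatrix(matrix) {
    // .elements is column-major, so walk each row with a stride of 4
    var e = matrix.elements;
    for (var row = 0; row < 4; row++) {
        console.log(e[row], e[row + 4], e[row + 8], e[row + 12]);
    }
}

printMatrix(result); // logs the translation (1, 11, 12) in the last column, as expected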