
I have always dealt with matrix–vector multiplication where the vector is the right multiplicand, but I am not sure how to compute the product of a matrix and a vector when the vector is the left multiplicand.

For example, take the row vector

$$\beta = \begin{pmatrix} \beta_0 & \beta_1 \end{pmatrix} \in \mathbb{R}^{1 \times 2}$$

and a general matrix

$$A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22}\end{pmatrix} \in \mathbb{R}^{2 \times 2}$$

What would be the algorithm to multiply $\beta \cdot A$? Of course the result is a $1 \times 2$ row vector.


2 Answers


So essentially you wish to compute: $$ \begin{pmatrix} \beta_0&\beta_1 \end{pmatrix} \begin{pmatrix} a_{11}&a_{12}\\ a_{21}&a_{22} \end{pmatrix} = \begin{pmatrix} \beta_0 a_{11}+\beta_1 a_{21} & \beta_0 a_{12}+\beta_1 a_{22} \end{pmatrix}. $$ Entry $j$ of the result is the dot product of $\beta$ with column $j$ of $A$; that is, you multiply the row vector against each column of the matrix in turn.
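The computation above can be sketched in plain Python (the example values for $\beta$ and $A$ are my own, chosen only for illustration):

```python
def row_times_matrix(beta, A):
    """Multiply a row vector beta by a matrix A (given as a list of rows).

    Entry j of the result is the dot product of beta with column j of A.
    """
    n_cols = len(A[0])
    return [sum(beta[i] * A[i][j] for i in range(len(beta)))
            for j in range(n_cols)]

beta = [2, 3]            # (beta_0, beta_1)
A = [[1, 2],
     [4, 5]]             # rows of A
print(row_times_matrix(beta, A))  # [2*1 + 3*4, 2*2 + 3*5] = [14, 19]
```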


Matrix multiplication is defined so that the entry $(i,j)$ of the product is the dot product of the left matrix's row $i$ and the right matrix's column $j$.

If you want to reduce everything to matrices acting on the left, we have the identity $xA = \big(A^Tx^T\big)^T$ where $T$ denotes the transpose. This is because $(AB)^T = B^TA^T$, and the operation that sends a matrix to its transpose is self-inverse.
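A small numeric check of the identity $xA = \big(A^Tx^T\big)^T$, using plain Python and assumed example values (the helpers `transpose` and `matmul` are ordinary textbook implementations, not anything from the answer itself):

```python
def transpose(M):
    """Transpose of a matrix given as a list of rows."""
    return [list(col) for col in zip(*M)]

def matmul(M, N):
    """Standard matrix product: entry (i, j) is row i of M dotted with column j of N."""
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))]
            for i in range(len(M))]

x = [[2, 3]]             # a 1x2 row vector, stored as a matrix
A = [[1, 2],
     [4, 5]]

left = matmul(x, A)                                    # x A directly
right = transpose(matmul(transpose(A), transpose(x)))  # (A^T x^T)^T
print(left, right)  # both [[14, 19]]
```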

