
Given $p_{A}, p_{B}, p_{C}$ with $p_{A}+ p_{B}+ p_{C}= 1$ and the transition-matrix equation $\begin{bmatrix} a+ m & -b & -s\\ -a & b+ d & 0\\ -m & -d & s\\ \end{bmatrix}\begin{bmatrix} p_{A}\\ p_{B}\\ p_{C}\\ \end{bmatrix}= \begin{bmatrix} 0\\ 0\\ 0\\ \end{bmatrix}$, how can I calculate $p_{A}+ p_{B}$ quickly?

I tried an approach: multiplying by $\begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \\ \end{bmatrix}$ (edit: I'm wrong) and using the associative property of matrix multiplication to get $p_{A}+ p_{B}$ from $\begin{bmatrix} p_{A}\\ p_{B}\\ p_{C}\\ \end{bmatrix}$, but the right-hand side is still the zero vector. I don't know what to do next. I need your help. Thank you.

  • Can you clarify what you mean by multiplying $\begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \\ \end{bmatrix}$? Where are you multiplying this matrix? Commented Oct 26 at 5:08
  • @BrianMoehring I'm sorry; I tried a way to get $p_{A}+ p_{B}$ from $p_{A}+ p_{B}+ p_{C}$. I will fix it. Commented Oct 26 at 5:11

1 Answer


I think your best bet is to solve for $p_C$ and compute $1 - p_C$. I know this isn't exactly what you're looking for, but practically speaking, it is the more efficient approach.

As I understand your question, you seem to want to know if there are quick manipulations, possibly involving matrix multiplication, where we could extract the information about $p_A + p_B$. I don't think what you're looking for is feasible.

If we include the $p_A + p_B + p_C = 1$ restriction, we would get the system of linear equations: $$\begin{bmatrix} 1 & 1 & 1 \\ a+ m & -b & -s\\ -a & b+ d & 0\\ -m & -d & s \end{bmatrix}\begin{bmatrix} p_{A}\\ p_{B}\\ p_{C}\\ \end{bmatrix}= \begin{bmatrix} 1\\ 0\\ 0\\ 0 \end{bmatrix}.$$ Note: the final row is just the negative of the sum of the second and third rows, so it is redundant. We could simply write: $$\begin{bmatrix} 1 & 1 & 1 \\ a+ m & -b & -s\\ -a & b+ d & 0 \end{bmatrix}\begin{bmatrix} p_{A}\\ p_{B}\\ p_{C}\\ \end{bmatrix}= \begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}.$$
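To make the recommended approach concrete, here is a minimal numerical sketch of solving the reduced $3\times 3$ system and reading off $p_A + p_B$ as $1 - p_C$. The rate values $a, b, d, m, s$ below are made-up placeholders; substitute your own.

```python
# Solve the reduced 3x3 system (normalization row plus the first
# two balance equations) and compute p_A + p_B = 1 - p_C.
# The parameter values here are illustrative, not from the question.
import numpy as np

a, b, d, m, s = 0.2, 0.3, 0.1, 0.4, 0.5

M = np.array([
    [1.0,   1.0,   1.0],   # p_A + p_B + p_C = 1
    [a + m, -b,    -s],
    [-a,    b + d, 0.0],
])
rhs = np.array([1.0, 0.0, 0.0])

p = np.linalg.solve(M, rhs)    # p = [p_A, p_B, p_C]
print("p =", p)
print("p_A + p_B =", p[0] + p[1], "= 1 - p_C =", 1.0 - p[2])
```

Since the normalization row is included, the solve returns the full stationary vector, and $p_A + p_B$ falls out immediately as $1 - p_C$.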

Now, in order to suitably extract $p_A + p_B$, we would want a change of variables, which amounts to a change of basis. Let's call our new variables $x, y, z$. We want one of our new variables, $z$ say, to be $p_A + p_B$. We're hoping that, in transforming the system, we can simply read off the value of $z$, so that we can find $z$ without having to perform a full solving procedure. As I said, I don't think this is feasible.

To illustrate this, let's choose a change of variables in this vein: $x = p_B$, $y = p_C$, and $z = p_A + p_B$. Then $p_A = z - x$, $p_B = x$, and $p_C = y$, or in other words, $$\begin{bmatrix} p_A \\ p_B \\ p_C \end{bmatrix} = \begin{bmatrix} -1 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \end{bmatrix}.$$ Our system would become:

$$\begin{bmatrix} 1 & 1 & 1 \\ a+ m & -b & -s\\ -a & b+ d & 0 \end{bmatrix} \begin{bmatrix} -1 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} x\\ y\\ z\\ \end{bmatrix}= \begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix},$$ i.e. $$\begin{bmatrix} 0 & 1 & 1 \\ -a - m - b & -s & a + m\\ a + b + d & 0 & -a \end{bmatrix} \begin{bmatrix} x\\ y\\ z\\ \end{bmatrix}= \begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}.$$ The value of $z$ is not easy to read from the above system without going down the path of Gaussian elimination. But maybe this was down to a bad choice of variable change? If we had chosen $x$ and $y$ more cleverly, could we have read off the value of $z = p_A + p_B$ more readily?
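The matrix product above can be verified symbolically; here is a quick sketch using SymPy (the symbol names simply mirror the answer's notation):

```python
# Symbolic check that the change of basis x = p_B, y = p_C, z = p_A + p_B
# transforms the coefficient matrix as claimed in the answer.
import sympy as sp

a, b, d, m, s = sp.symbols('a b d m s')

A = sp.Matrix([[1,     1,     1],
               [a + m, -b,    -s],
               [-a,    b + d, 0]])

# Columns express (p_A, p_B, p_C) in terms of (x, y, z).
B = sp.Matrix([[-1, 0, 1],
               [1,  0, 0],
               [0,  1, 0]])

print(sp.expand(A * B))
```

No row of the product has the form $[0, 0, \alpha]$, which is what the argument below turns on.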

No, this is not really possible. In order to read off the value of $z$ easily, we would need one of the rows of the matrix to be of the form $[0, 0, \alpha]$ for some $\alpha \neq 0$, giving us an equation of the form $\alpha z = ?$, where $?$ is $0$ or $1$ taken from $\begin{bmatrix}1\\0\\0\end{bmatrix}$, depending on whether the row is the first row or one of the other two. If the $?$ happens to be $0$, this could only work if $p_A + p_B = 0$, which seems unlikely (especially since both are presumably probabilities). There may be some clever choices of $x$ and $y$ that would make the top row into $[0, 0, \alpha]$, but they would involve specific formulas in terms of the unknowns in the matrix, i.e. $a, b, d, m, s$, and finding such clever choices would be a problem at least as hard as solving the original system!

If you think, perhaps, that some manipulation of the right-hand side $\begin{bmatrix}1\\0\\0\end{bmatrix}$ is needed, i.e. by multiplying by matrices on the left, then you're right, but this is exactly what Gaussian elimination does.

All of this to say, I think your idea is a dead end.
