Questions tagged [matrix-multiplication]
The matrix-multiplication tag has no usage guidance.
34 questions
2 votes · 1 answer · 31 views
Classical matrix multiplication via Min-Plus matrix multiplication
Just a thought I had in mind. I can use classical matrix multiplication to compute min-plus matrix multiplication. Generally speaking, considering $(n+1)^{a_{i,j}}$ for each entry and then taking the ...
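For reference, a minimal sketch of the encoding trick the question alludes to, assuming the entries $a_{i,j}$ are small non-negative integers: encode each entry as $(n+1)^{a_{i,j}}$, take one classical product over the integers, and read off each min-plus entry as the largest power of $n+1$ dividing the corresponding result. Function and variable names below are illustrative only.

    def min_plus_via_classical(A, B):
        n = len(A)
        x = n + 1
        A_enc = [[x ** a for a in row] for row in A]
        B_enc = [[x ** b for b in row] for row in B]
        # One ordinary matrix product over the (arbitrary-precision) integers.
        C_enc = [[sum(A_enc[i][k] * B_enc[k][j] for k in range(n))
                  for j in range(n)] for i in range(n)]
        # min_k (A[i][k] + B[k][j]) is the largest e with x**e dividing C_enc[i][j],
        # since at most n < x terms share the minimal exponent.
        C = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                v, e = C_enc[i][j], 0
                while v % x == 0:
                    v, e = v // x, e + 1
                C[i][j] = e
        return C

    A = [[0, 3], [2, 1]]
    B = [[1, 5], [0, 2]]
    print(min_plus_via_classical(A, B))   # [[1, 5], [1, 3]]

The catch, of course, is that the encoded numbers have on the order of $W \log n$ bits when the entries are bounded by $W$, so arithmetic is no longer constant-cost per operation.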
3 votes · 1 answer · 71 views
Usage of matrix multiplication for distance products
This is more of a validation question, for the current best known results.
On one hand, we have classical matrix multiplication. Its running time is denoted as $n^\omega$.
On the other, we have ...
1 vote · 1 answer · 23 views
Algorithm for solving linear equations if interested only in the first component
If I want to solve $\mathbf A \mathbf x = \mathbf b$, but I am only interested in the value of $x_1$, what algorithm should I use, and will it always be strictly more efficient than solving for all of ...
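One option worth knowing about, sketched below under the assumption that $\mathbf A$ is a dense NumPy array: Cramer's rule gives $x_1 = \det(\mathbf A_1)/\det(\mathbf A)$, where $\mathbf A_1$ is $\mathbf A$ with its first column replaced by $\mathbf b$. This avoids forming the full solution vector, but determinants cost about as much as an LU-based solve, so it is a convenience rather than an asymptotic win, and it is numerically less stable. The function name is illustrative.

    import numpy as np

    def first_component(A, b):
        # Cramer's rule for x_1 only: two determinants, no full solve.
        A = np.asarray(A, dtype=float)
        A1 = A.copy()
        A1[:, 0] = b
        return np.linalg.det(A1) / np.linalg.det(A)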
0 votes · 2 answers · 51 views
Floating point operations in a zero padded Strassen multiplication
So I've seen other posts here that discuss this, but I'm not quite sure how the time complexity relates to the actual number of floating-point operations done per second when you're ...
1 vote · 1 answer · 47 views
Compute matrix inversion / multiplication using a black box
Suppose you're given a black box $A$, and you're told $A$ can invert an (invertible) matrix $M$ in $O(T_A)$ time. You're also given a black box $B$, and you're told $B$ can multiply ...
0 votes · 1 answer · 87 views
Complexity of multiplying 3 matrices
There are algorithms that speed up matrix multiplication over the naive $n^3$ algorithm. But supposing you have 3 matrices $A$, $B$ and $C$, is there a way to compute $ABC$ that is asymptotically ...
1 vote · 0 answers · 35 views
Matrix multiplication of natural numbers
I know matrix multiplication of matrices with real numbers is bounded below by $\Omega(n^2 \log n)$, but what if all the numbers are natural? Can we use the same methods to get a lower bound for this ...
2 votes · 0 answers · 45 views
Has Triangle Finding ever been faster than Matrix Multiplication?
The Triangle Finding problem (TF) in Graph Theory was shown by Itai and Rodeh in 1977 [1] to be solvable as fast$^1$ as Boolean Matrix Multiplication (BMM, Matrix Multiplication over $\{0, 1\}$ with ...
3 votes · 0 answers · 89 views
What's the fastest known non-galactic algorithm for matrix multiplication of large matrices
"A galactic algorithm is one that outperforms any other algorithm for problems that are sufficiently large, but where "sufficiently large" is so big that the algorithm is never used in ...
0 votes · 0 answers · 33 views
Fast computation of the $FPF^T$ matrix product
Let $P$ be a symmetric (positive definite, if that helps...) matrix of size $n$.
Let $F$ be a matrix of size $n$.
Is there an existing efficient algorithm or implementation to calculate $FPF^T$?
Is ...
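As far as I know there is no dedicated asymptotically faster routine for this product; below is a minimal NumPy sketch of the straightforward two-product evaluation (function name illustrative). It leans on a tuned BLAS for speed and symmetrizes the result, which is exactly symmetric whenever $P$ is; BLAS libraries also provide symm-style routines for products with one symmetric operand.

    import numpy as np

    def fpft(F, P):
        # Two ordinary n x n products; the result is symmetrized to remove
        # floating-point asymmetry when P is symmetric.
        M = F @ P
        S = M @ F.T
        return 0.5 * (S + S.T)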
1 vote · 2 answers · 649 views
Can taking the dot product of the result of a vector-matrix multiplication speed up the runtime?
Suppose we have a matrix $A$ of dimension $n \times n$ and two vectors $\vec{u}$ and $\vec{v}$ of dimension $n$.
Then we have $A\vec{v} = \vec{x}$ with time complexity $O(n^2)$ and space complexity $O(...
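For what it's worth, the full bilinear form $\vec{u}^{\,T} A \vec{v}$ never requires more than the $O(n^2)$ matrix-vector product the question already describes; a one-line sketch making the associativity explicit (names illustrative):

    import numpy as np

    def bilinear_form(u, A, v):
        # u^T A v evaluated as u . (A v): one O(n^2) matrix-vector product
        # followed by one O(n) dot product; no n x n matrix product is formed.
        return u @ (A @ v)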
0 votes · 0 answers · 27 views
What is the computational complexity, in big-$O$ terms, of the operations shown in the following figure?
Suppose I have $L$ bits. First, I want to multiply the $L$ bits by $L$ orthogonal codes of length $N$, and then I want to add all the resulting vectors.
So, first, I have to do a scalar multiplication with a vector ...
6 votes · 3 answers · 1k views
Are there absolute reasons to prefer row/column-major memory ordering?
I've heard it said that "Fortran uses column-major ordering because it's faster" but I'm not sure that's true. Certainly, matching column-major data to a column-major implementation will ...
0 votes · 3 answers · 239 views
Super-linear parallelism or speedup in parallel matrix multiplication algorithms
I'm reading these slides from an MIT course on parallel software performance. They introduce the concepts of work $T_1$, span $T_\infty$, and parallelism (the ratio $T_1/T_\infty$). What is called ''...
0 votes · 0 answers · 26 views
Runtime of solving a matrix differential equation with respect to the matrix dimensions
Suppose a computer solves coupled differential equations (with given boundary conditions) in which each equation has $2^n \times 2^n$ matrices as solutions. My question is:
Does time-...
3 votes · 0 answers · 82 views
Fast multiplication of a power of a matrix by a vector
I'm interested in computing the product $M^n v$, where $M$ is an $m\times m$ matrix (over a semiring) and $v$ is a column vector, with the smallest number of multiplications in the underlying ...
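A baseline worth comparing against: applying $M$ to the vector $n$ times costs $n \cdot m^2$ semiring multiplications, whereas forming $M^n$ first by repeated squaring costs on the order of $m^3 \log n$. A small semiring-generic sketch; the add/mul/zero parameters are illustrative.

    def mat_vec(M, v, add, mul, zero):
        # One matrix-vector product over an arbitrary semiring: m^2 multiplications.
        m = len(M)
        out = []
        for i in range(m):
            acc = zero
            for j in range(m):
                acc = add(acc, mul(M[i][j], v[j]))
            out.append(acc)
        return out

    def power_times_vector(M, n, v, add, mul, zero):
        # n successive matrix-vector products: n * m^2 semiring multiplications.
        for _ in range(n):
            v = mat_vec(M, v, add, mul, zero)
        return v

    # Example over the (min, +) semiring.
    INF = float('inf')
    print(power_times_vector([[0, 1], [2, 0]], 3, [0, 5],
                             min, lambda a, b: a + b, INF))   # [0, 2]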
5 votes · 2 answers · 135 views
Best-known complexity for $l \times m$ by $m \times n$ matrix multiplication?
I know that the fastest known algorithm for multiplying two $m \times m$ matrices runs in time $m^{\omega}$, where currently we have $\omega \le 2.3728596$ due to Virginia Vassilevska Williams's latest result
(see ...
3 votes · 0 answers · 60 views
Computing a series of matrix power - matrix products
Assuming we have two dense matrices $A \in \mathbb{R}^{m\times m}, B \in \mathbb{R}^{m\times n}$, is there a smart way to compute all entries of the series $A^1 B, A^2 B, A^3 B, \dots, A^k B$ up to ...
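One observation that already helps in practice, sketched below: compute the terms incrementally as $A(A^{i-1}B)$, so each new term is a single $(m\times m)\times(m\times n)$ product and the whole series costs $O(k\,m^2 n)$ with classical multiplication, rather than recomputing each power of $A$ from scratch. The function name is illustrative.

    import numpy as np

    def power_series_products(A, B, k):
        # Returns [A B, A^2 B, ..., A^k B], one matrix product per term.
        out, C = [], B
        for _ in range(k):
            C = A @ C
            out.append(C)
        return out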
-1 votes · 1 answer · 139 views
Matrix Multiplication Verification
This question relates to the 2nd edition of Probability and Computing by Mitzenmacher and Upfal.
The authors then use the Law of Total Probability
My question concerns the line that begins with $\...
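For context, the verification algorithm treated in that part of the book is Freivalds-style randomized checking; a minimal sketch follows (not necessarily the exact notation used in the proof the question asks about).

    import random

    def freivalds(A, B, C, trials=20):
        # Randomized check that A B == C: pick a random 0/1 vector r and
        # compare A(Br) with Cr. Each trial costs O(n^2), and an incorrect C
        # is accepted with probability at most 1/2 per independent trial.
        n = len(A)

        def mat_vec(M, x):
            return [sum(M[i][j] * x[j] for j in range(n)) for i in range(n)]

        for _ in range(trials):
            r = [random.randint(0, 1) for _ in range(n)]
            if mat_vec(A, mat_vec(B, r)) != mat_vec(C, r):
                return False
        return True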
0 votes · 0 answers · 92 views
Time complexity of merging the submatrices in matrix multiplication
This is a problem from CLRS:
What is the largest $k$ such that if you can multiply $3 \times 3$ matrices using $k$ multiplications (not assuming commutativity of multiplication), then you can multiply
$...
0 votes · 2 answers · 1k views
Matrix-vector multiplication using only lower triangular of matrix
Suppose one has a large sparse symmetric positive definite matrix $A$ and wants to multiply it by a vector $x$. Only the lower triangular part of matrix A is stored/known. The multiplication $Ax$ ...
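A small sketch of the usual trick, assuming the lower triangle is available as a dict keyed by (row, column) with row >= column (the storage format here is illustrative): each stored off-diagonal entry $a_{ij}$ contributes to both $y_i$ (via $x_j$) and $y_j$ (via $x_i$), so $Ax$ never needs the upper triangle explicitly.

    def sym_lower_matvec(lower, x):
        # lower[(i, j)] = A[i][j] for i >= j; returns y = A x for symmetric A.
        y = [0.0] * len(x)
        for (i, j), a in lower.items():
            y[i] += a * x[j]
            if i != j:
                y[j] += a * x[i]   # mirrored upper-triangle contribution
        return y

    # Example: A = [[2, 1], [1, 3]] stored as its lower triangle.
    print(sym_lower_matvec({(0, 0): 2.0, (1, 0): 1.0, (1, 1): 3.0}, [1.0, 1.0]))   # [3.0, 4.0]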
0 votes · 1 answer · 275 views
Why do researchers only count the number of multiplications when analysing the time complexity of matrix multiplication?
In this article about the recent breakthrough in matrix multiplication, Chris Umans is quoted:
Multiplications are everything. The exponent on the eventual running time is fully dependent only ...
1 vote · 1 answer · 164 views
Iterated multiplication of permutation matrices
Given $m$ matrices of size $n\times n$, each of which is promised to be a permutation matrix, is it in $\mathit{quasiAC}^0$ or $\mathit{AC}^0$ to multiply the permutations, where
$m=\mathit{poly}(n)$
$m=\...
0 votes · 1 answer · 49 views
Matrix multiplication speedup when the matrix elements are 0, 1, and -1
I would like to compute the matrix product $AB$, where $A$ is $N \times 3$ and $B$ is $3 \times 3$. We also ...
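Assuming it is $B$ whose entries are restricted to $\{-1, 0, 1\}$ (the excerpt is cut off), every column of $AB$ is just a signed sum of columns of $A$, so the product needs additions and subtractions only; a NumPy sketch:

    import numpy as np

    def mult_by_sign_matrix(A, B):
        # A is N x 3, B is 3 x 3 with entries in {-1, 0, 1}: no scalar
        # multiplications, only column additions/subtractions.
        C = np.zeros((A.shape[0], B.shape[1]))
        for j in range(B.shape[1]):
            for k in range(B.shape[0]):
                if B[k, j] == 1:
                    C[:, j] += A[:, k]
                elif B[k, j] == -1:
                    C[:, j] -= A[:, k]
        return C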
2 votes · 3 answers · 1k views
In Strassen's algorithm, why does padding the matrices with zeros not affect the asymptotic complexity?
In Strassen's algorithm, why does padding the matrices with zeros, in order to multiply matrices whose dimensions are not powers of 2, not affect the asymptotic complexity?
Hi, I was reading this question but I ...
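The short version: padding up to the next power of two at most doubles the side length, and since Strassen's cost grows as $\Theta(n^{\log_2 7})$, doubling $n$ multiplies the cost by only the constant $2^{\log_2 7} = 7$, leaving the exponent untouched. A small padding sketch (names illustrative):

    import numpy as np

    def pad_to_power_of_two(M):
        # Embed an n x n matrix in the top-left corner of an n' x n' zero
        # matrix, where n' is the next power of two (n <= n' < 2n).
        n = M.shape[0]
        n2 = 1 << (n - 1).bit_length()
        P = np.zeros((n2, n2), dtype=M.dtype)
        P[:n, :n] = M
        return P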
5 votes · 1 answer · 147 views
Is there a polynomial sized arithmetic formula for iterated matrix multiplication?
I found an article on Catalytic space which describes how additional memory (which must be returned to its arbitrary initial state) can be useful for computation. There's also an expository follow ...
1 vote · 2 answers · 999 views
Why is the weight matrix diagonal in weighted least squares regression?
I was going through the theory for weighted least-squares fitting and I understood its basic underlying concepts, but I couldn't understand why exactly we keep the weights as a diagonal matrix ...
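A one-line way to see where the diagonal comes from, under the usual assumption that the observations carry independent weights $w_i$ (for example, inverse error variances): the per-observation weighted sum of squares is exactly a quadratic form with a diagonal $W$,
$$\sum_{i=1}^{n} w_i\,\bigl(y_i - \mathbf{x}_i^{\top}\beta\bigr)^2 = (\mathbf{y} - X\beta)^{\top} W (\mathbf{y} - X\beta), \qquad W = \operatorname{diag}(w_1,\dots,w_n),$$
which leads to the normal equations $X^{\top} W X\,\hat\beta = X^{\top} W \mathbf{y}$. Off-diagonal entries in $W$ would only appear if the errors were correlated, which is the generalized least-squares setting.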
26 votes · 2 answers · 2k views
What is the intuition behind Strassen's Algorithm?
I came across Strassen's algorithm for matrix multiplication, which has time complexity $O(n^{2.81})$, significantly better than the naive $O(n^3)$. Of course, there have been several other ...
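For readers skimming this listing: the heart of the trick is that a $2\times 2$ block product can be assembled from 7 block multiplications (plus 18 block additions) instead of 8, and recursing gives the $O(n^{\log_2 7}) \approx O(n^{2.81})$ bound. A one-level sketch using the standard Strassen formulas, assuming an even-sized square input:

    import numpy as np

    def strassen_one_level(A, B):
        # Split A and B into 2 x 2 blocks and form the 7 Strassen products.
        n = A.shape[0] // 2
        A11, A12, A21, A22 = A[:n, :n], A[:n, n:], A[n:, :n], A[n:, n:]
        B11, B12, B21, B22 = B[:n, :n], B[:n, n:], B[n:, :n], B[n:, n:]
        M1 = (A11 + A22) @ (B11 + B22)
        M2 = (A21 + A22) @ B11
        M3 = A11 @ (B12 - B22)
        M4 = A22 @ (B21 - B11)
        M5 = (A11 + A12) @ B22
        M6 = (A21 - A11) @ (B11 + B12)
        M7 = (A12 - A22) @ (B21 + B22)
        return np.block([[M1 + M4 - M5 + M7, M3 + M5],
                         [M2 + M4,           M1 - M2 + M3 + M6]])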
1 vote · 0 answers · 87 views
Lower bounds for orthogonal matrix multiplication
Is it possible, according to the current state of knowledge, that orthogonal matrices can be multiplied faster than arbitrary matrices?
More precisely, let $T(N)$ denote the worst-case time of the ...
0 votes · 1 answer · 225 views
Is this benchmark sufficient to consider my algorithm as an efficient matrix multiplication algorithm?
I built a matrix multiplication algorithm and now I need some thoughts about the following benchmark.
Timings measured with C++ chrono::high_resolution_clock, in microseconds:
Dim 256 → (naive algo) 296807, (my algo) ...
1 vote · 1 answer · 35 views
How can I follow this guide to construct a graph with matrices/reachability?
Let's say we have $k$ matrices. For example, we have 3 here, where the first one is $8 \times 5$ ($a_1 \times b_1$), the second is $5 \times 6$ ($a_2 \times b_2$), and the last is $6 \times 8$ ($a_3 \times b_3$). And our goal is to figure out ...
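The example dimensions look like the classic matrix-chain-ordering problem (choosing the cheapest parenthesization). For comparison with whatever graph/reachability construction the guide uses, here is a sketch of the standard dynamic program over the dimension list, run on the question's numbers:

    def matrix_chain_cost(dims):
        # dims = [a1, b1 (= a2), b2 (= a3), b3]; cost[i][j] = fewest scalar
        # multiplications needed to compute the product of matrices i..j.
        k = len(dims) - 1
        cost = [[0] * k for _ in range(k)]
        for length in range(2, k + 1):
            for i in range(k - length + 1):
                j = i + length - 1
                cost[i][j] = min(
                    cost[i][m] + cost[m + 1][j] + dims[i] * dims[m + 1] * dims[j + 1]
                    for m in range(i, j))
        return cost[0][k - 1]

    print(matrix_chain_cost([8, 5, 6, 8]))   # 560, from the split M1 (M2 M3)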
0 votes · 1 answer · 180 views
In an LP problem Ax = b, how do I solve for A instead of x?
I have a multi-objective linear programming problem of the form Ax = b, where A is a matrix and x and b are vectors. x is known, and I'm looking to minimise each row of b by solving for A.
Constraints ...
1 vote · 1 answer · 391 views
Is matrix multiplication cheaper than inverse?
On Wikipedia, it is shown that the time complexities of matrix multiplication and matrix inversion are similar. But people always argue that it is easier to do matrix multiplication than inversion. Is this ...
3 votes · 1 answer · 401 views
Calculate boolean matrix multiplication (BMM) using transitive closure
Let us say I am given an algorithm that calculates the transitive closure of a given graph $G = \{ V, E \}$.
How can I use this algorithm in order to perform the Boolean Matrix Multiplication of two ...
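A sketch of the classical reduction, treating the given algorithm as a black box transitive_closure that maps an adjacency matrix to a reachability matrix (that interface is an assumption here): build a layered graph on $3n$ vertices whose only edges go from layer 1 to layer 2 (given by $A$) and from layer 2 to layer 3 (given by $B$); then vertex $i$ of layer 1 reaches vertex $j$ of layer 3 exactly when $(AB)_{ij} = 1$ in the Boolean product.

    def bmm_via_transitive_closure(A, B, transitive_closure):
        # A, B: n x n 0/1 matrices.
        # transitive_closure: adjacency matrix -> reachability matrix (black box).
        n = len(A)
        N = 3 * n
        G = [[0] * N for _ in range(N)]
        for i in range(n):
            for j in range(n):
                G[i][n + j] = A[i][j]          # layer 1 -> layer 2 edges from A
                G[n + i][2 * n + j] = B[i][j]  # layer 2 -> layer 3 edges from B
        T = transitive_closure(G)
        return [[T[i][2 * n + j] for j in range(n)] for i in range(n)]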