
All Questions

0 votes
0 answers
18 views

Gaussian elimination choice for matrix

This question isn't purely about mathematics; it is also partly computer science. I hope this is the correct place for it. I studied the Gaussian elimination algorithm (I want to implement ...
RaduV • 95
1 vote
2 answers
128 views

Gauss-Jordan elimination gives inconsistent matrix for a consistent system?

I am trying to get an analytical expression for a steady state of an ODE system governing a chemical reaction network via symbolic computer algebra systems. As an example for this question I'll take a ...
linkz • 45
2 votes
1 answer
540 views

LU decomposition of banded matrix with partial pivoting

Disclaimer: I'm rusty as can be in this department. I'm looking into how to implement a banded matrix LU decomposition with partial pivoting ($PA = LU$). So to start with I implemented regular matrix ...
Wout • 123
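For readers who, like the asker, want to start from a dense $PA=LU$ with partial pivoting before specializing it to the banded case, here is a minimal NumPy sketch; the function name and the small test matrix are my own, not from the question:

```python
import numpy as np

def lu_partial_pivot(A):
    """Dense PA = LU with partial pivoting; returns P, L, U.
    A minimal reference implementation, not band-aware."""
    A = A.astype(float)
    n = A.shape[0]
    perm = np.arange(n)               # row permutation as an index vector
    L = np.eye(n)
    for k in range(n - 1):
        # pick the entry of largest magnitude in column k, rows k..n-1
        p = k + int(np.argmax(np.abs(A[k:, k])))
        if p != k:                    # swap the working rows, past multipliers, and perm
            A[[k, p], k:] = A[[p, k], k:]
            L[[k, p], :k] = L[[p, k], :k]
            perm[[k, p]] = perm[[p, k]]
        for i in range(k + 1, n):
            L[i, k] = A[i, k] / A[k, k]
            A[i, k:] -= L[i, k] * A[k, k:]
    P = np.eye(n)[perm]
    return P, L, np.triu(A)

A = np.array([[2., 1., 0.], [4., 3., 1.], [0., 1., 5.]])
P, L, U = lu_partial_pivot(A)
print(np.allclose(P @ A, L @ U))      # True
```

For the banded case the same loop applies, except the elimination and the pivot search only need to touch the rows and columns inside the bandwidth (partial pivoting widens the upper bandwidth by at most the lower bandwidth).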
0 votes
1 answer
44 views

Gaussian Elimination Elements $a^{(r)}_{ij}$

Let $A\in \mathbb{R}^{n\times n}$. We apply GE to it. Prove that: $a^{(r)}_{ij}=\frac{A\begin{pmatrix} 1 & 2 &\cdots & r & i \\ 1 & 2 &...
I Like Algebra
0 votes
0 answers
32 views

Growth of a completely pivoted Matrix

Let $A$ be a CP matrix. (A completely pivoted matrix is one for which Gaussian elimination with full pivoting requires no row or column exchanges.) We apply Gaussian elimination to it....
I Like Algebra
2 votes
0 answers
148 views

Computing Inverse with Gaussian Elimination is not Backward Stable

Suppose that we are operating in a three-digit decimal floating point system. We want to use Gaussian Elimination with Partial Pivoting (GEPP) to find the inverse of a given invertible $2\times2$ ...
Hosein Rahnama
1 vote
1 answer
44 views

What is a separable self-adjoint boundary value problem, and why are direct matrix solvers primarily restricted to this kind of problem?

I encountered this claim while reading a book on multigrid solvers [1]: "Direct methods, of which Gaussian elimination is the prototype, determine a solution exactly (up to machine precision) ...
8cold8hot • 217
2 votes
1 answer
145 views

What are the advantages of solving a system of linear equations with the LR algorithm? And why shouldn't I always use Gaussian elimination?

I was recently introduced to the LR algorithm for solving systems of linear equations. What I don't really get is why this method is more effective for different right-hand sides (RHS) of ...
schmitti2105
4 votes
0 answers
322 views

Growth factor Gaussian elimination

I'm looking at the exercises in the book "Numerical Linear Algebra" by Trefethen and Bau. I'm quite stuck at question 22.2: Experiment with solving $60\times60$ systems of equations $Ax=b$ ...
katdan • 41
1 vote
0 answers
96 views

Show that two combinations of two vectors in two-dimensional space take 7 multiplications [duplicate]

I was reading the book Linear Algebra and Its Applications by Gilbert Strang (4th edition, page 15) and in the first chapter came across the line: Two combinations of two vectors in two dimensional ...
Kazi Abu Rousan
1 vote
1 answer
260 views

A symmetric matrix producing a smaller symmetric matrix upon Gaussian elimination

Given a real $n \times n$ symmetric matrix $A$ and $a_{11}$ is non-zero, if you use Gaussian Elimination to reduce it to $$ \begin{pmatrix} a_{11} & a_1^T \\ 0 & A_2 \end{pmatrix} , $$ $A_2$ ...
user297855
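A quick numerical check of the claim, under the usual identification of $A_2$ with the Schur complement $A_{22} - a_1 a_1^{T}/a_{11}$ that one Gaussian elimination step produces; the variable names are mine:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B + B.T                                   # a random symmetric matrix
assert A[0, 0] != 0

a11 = A[0, 0]
a1 = A[1:, 0]                                 # first column below the pivot (= first row, by symmetry)
A2 = A[1:, 1:] - np.outer(a1, a1) / a11       # trailing block after one elimination step

print(np.allclose(A2, A2.T))                  # True: A2 is symmetric again
```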
-2 votes
1 answer
51 views

Matrices: Using row echelon form [duplicate]

This is my system of linear equations $$ \left\{ \begin{array}{c} 1x_1+2x_2-1x_3=2 \\ -3x_1+1x_2-3x_3=1\\ 4x_1+ax_2-4x_3=b \end{array} \right. $$ My Rank matrix looks like this: $$ \begin{...
Diana • 19
0 votes
1 answer
111 views

Understanding the proof of the iterative improvement method for linear equation solving using LU decomposition in Numerical Recipes

The question is from Section 2.5, Iterative Improvement of a Solution to Linear Equations, in the Numerical Recipes book. When we solve $\mathbf{Ax = b}$ using LU decomposition numerically, the result is ...
austin1511
0 votes
1 answer
35 views

Solving systems of linear equations indirectly

Consider the system of linear equations $$\mathbf A \mathbf x = \mathbf b,$$ where $\mathbf A \in \mathbb R^{n \times n}$ is invertible and $\mathbf b \in \mathbb R^n$. Let $\mathbf D \in \mathbb R^{n ...
1 vote
1 answer
208 views

In the proof of LU decomposition with pivoting, $L = PM^{-1}$ is unit lower triangular

Contexts and definitions of the $LU$ method: Consider a matrix $A \in \mathbb{R}^{n \times n}$: $$ A = \begin{bmatrix}a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} &...
Figurinha • 424
0 votes
1 answer
135 views

Potential problems with the Gaussian elimination method?

We can use Gaussian elimination (GE) to help us solve larger systems of equations. If we have a matrix $A \in \mathbb{K}^{m \times n}$, where $\mathbb{K} \in \{ \mathbb R, \mathbb C, \mathbb G \}$, ...
Ski Mask • 1,938
1 vote
1 answer
45 views

Assume $AX = C$. How to determine which entry of $BX - D$ is non-negative?

I posted this question on https://scicomp.stackexchange.com, but it seems to have received no attention. As soon as I get an answer on one of the sites, I will note it on the other. Let $A,B$ be $n \times n$ matrices ...
Akira • 17.9k
2 votes
2 answers
3k views

Why use elimination method vs. substitution method in linear algebra?

What is the motivation to use the elimination method as opposed to the substitution method? I've always found the latter much easier. Are there any examples where the substitution method fails or when ...
CuriousIndeed
2 votes
2 answers
308 views

How do you decide which element in the augmented matrix is the pivot?

In fact, many articles say that the pivot must be the entry largest in absolute value. But on Wikipedia, the example was... $\begin{bmatrix} 0.00300 & 59.14 & 59.17\\ 5.291 & -6.130 & ...
Supakorn Srisawat
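As a concrete illustration of the partial-pivoting rule most texts describe (take the entry of largest absolute value in the current column, at or below the diagonal), a small NumPy sketch; the function name is mine, and only the two leading coefficients quoted in the excerpt are used:

```python
import numpy as np

def choose_pivot_row(A, k):
    """Partial pivoting: among rows k..n-1, return the row whose entry in
    column k has the largest absolute value."""
    return k + int(np.argmax(np.abs(A[k:, k])))

A = np.array([[0.00300, 59.14],
              [5.291,  -6.130]])
print(choose_pivot_row(A, 0))   # 1: 5.291 beats 0.00300, so swap rows before eliminating
```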
0 votes
1 answer
926 views

Gauss-Seidel: not converging to the actual solution

For the following problem: using the Gauss-Seidel iteration method with partial pivoting, find the solution of the following system up to 5 iterations with initial values $(x,y,z,w) = (0,0,0,0)$: $10x-7y+3z+...
Akki • 65
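For reference, a minimal Gauss-Seidel sweep in NumPy, not tied to the truncated system above; note that plain Gauss-Seidel as usually stated involves no pivoting, so any reordering to make the matrix diagonally dominant is a separate preprocessing step. The example system here is my own:

```python
import numpy as np

def gauss_seidel(A, b, x0, iterations):
    """Plain Gauss-Seidel sweeps:
    x_i <- (b_i - sum_{j<i} a_ij * x_j(new) - sum_{j>i} a_ij * x_j(old)) / a_ii."""
    x = x0.astype(float)
    n = len(b)
    for _ in range(iterations):
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
            x[i] = (b[i] - s) / A[i, i]
    return x

# a diagonally dominant example, so the sweeps converge
A = np.array([[10., -1., 2.], [-1., 11., -1.], [2., -1., 10.]])
b = np.array([6., 25., -11.])
print(gauss_seidel(A, b, np.zeros(3), 5))
```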
1 vote
0 answers
67 views

Number of operations needed to solve tridiagonal linear system

I was doing this question. I just checked some posts related to it, but didn't find anything suitable.
learningstudent
1 vote
1 answer
632 views

Efficiently solve a system of equations in C++ for only certain degrees of freedom given a known structure

I have an algorithm such that at some point I must solve the following system for $X_5$: $$ \left( \begin{array}{cccccccccc} A_1& B_1& & C_1& & & & & \\ B_7& A_2&...
InfiniteElementMethod
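One generic way to solve for only a subset of the unknowns when the blocking is known in advance is a Schur-complement (static condensation) step. A hedged sketch in Python (rather than the asker's C++) with made-up block names, not the $A_i, B_i, C_i$ of the question:

```python
import numpy as np

def solve_for_tail_block(K, f, m):
    """Partition K x = f as [[K11, K12], [K21, K22]] [x1; x2] = [f1; f2],
    where x2 holds the last m unknowns, and return only x2 via the
    Schur complement S = K22 - K21 K11^{-1} K12."""
    K11, K12 = K[:-m, :-m], K[:-m, -m:]
    K21, K22 = K[-m:, :-m], K[-m:, -m:]
    f1, f2 = f[:-m], f[-m:]
    Y = np.linalg.solve(K11, np.column_stack([K12, f1]))   # K11^{-1} [K12 | f1]
    S = K22 - K21 @ Y[:, :-1]
    return np.linalg.solve(S, f2 - K21 @ Y[:, -1])

rng = np.random.default_rng(1)
K = rng.standard_normal((10, 10)) + 10 * np.eye(10)
f = rng.standard_normal(10)
print(np.allclose(solve_for_tail_block(K, f, 3), np.linalg.solve(K, f)[-3:]))  # True
```

When the leading block has the banded/block structure shown in the question, the inner solve with K11 is where that structure should be exploited.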
0 votes
2 answers
405 views

LU decomposition implies Gaussian Elimination without pivoting

If Gaussian elimination can be carried out without pivoting for A, then A has an LU decomposition. Is the converse true: if A has an LU decomposition, then Gaussian elimination can be carried out (in ...
dxdydz • 1,381
1 vote
1 answer
2k views

Outer product reformulation of LU decomposition

For my numerical analysis class, I wanted to implement the rather simple, in-place LU decomposition algorithm. I did this the naive way and found my code was very, very slow, because I was doing every ...
Peiffap • 355
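The rank-1 ("outer product") update that makes an in-place LU fast in interpreted languages replaces the inner double loop with one vectorized statement per step. A small sketch of the idea, without pivoting and assuming the leading minors are nonsingular; names are mine:

```python
import numpy as np

def lu_inplace_outer(A):
    """In-place Doolittle LU without pivoting: after the loop, the strict lower
    triangle of A holds L (unit diagonal implied) and the upper triangle holds U."""
    A = A.astype(float)
    n = A.shape[0]
    for k in range(n - 1):
        A[k+1:, k] /= A[k, k]                               # multipliers: column k of L
        A[k+1:, k+1:] -= np.outer(A[k+1:, k], A[k, k+1:])   # one rank-1 update of the trailing block
    return A

M = np.array([[4., 3., 1.], [8., 7., 3.], [4., 5., 6.]])
LU = lu_inplace_outer(M)
L = np.tril(LU, -1) + np.eye(3)
U = np.triu(LU)
print(np.allclose(L @ U, M))   # True
```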
0 votes
1 answer
367 views

Optimal pivoting strategy in LU factorization

I'm currently reading the book Numerical Linear Algebra by Lloyd N. Trefethen and David Bau, III, working my way through the required lectures for my Numerical Analysis class. The current subject is ...
Peiffap • 355
0 votes
1 answer
1k views

Complexity of matrix multiplication and Gaussian elimination

It is said that if matrix multiplication of $n\times n$ matrices is doable in $O(n^\omega)$ for some $\omega\geq 2$, so is Gaussian elimination and vice versa. Now, this could imply that it is ...
Analysis801
-1 votes
2 answers
2k views

Cost of Gaussian Elimination

Reference: Linear Algebra and Its Applications, Fourth Edition, Gilbert Strang. How many separate arithmetical operations does elimination require, for $n$ equations in $n$ unknowns? While addressing ...
shul • 109
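For reference, the standard count behind Strang's answer: at step $k$ each of the $n-k$ rows below the pivot needs one division (the multiplier) and $n-k$ multiplications across the remaining columns of $A$, so the multiplications and divisions on the coefficient matrix alone total

$$\sum_{k=1}^{n-1}(n-k)(n-k+1) \;=\; \sum_{j=1}^{n-1} j(j+1) \;=\; \frac{n^3-n}{3} \;\approx\; \frac{n^3}{3},$$

with the work on the right-hand side and in back substitution adding only $O(n^2)$ more.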
1 vote
1 answer
2k views

PAQ=LU decomposition

From the decomposition $PAQ=LU$ of a matrix $A$, explain how to solve $Ax=b$, compute $A^{-1}$, and compute $\det(A)$ ($P$ and $Q$ are permutation matrices). When working with the decompositions $A=LU$ or $PA=LU$, this ...
Numbermind • 1,211
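A minimal sketch of the $Ax=b$ part once $PAQ=LU$ is in hand: from $PAQ=LU$ we get $A = P^{T}LUQ^{T}$, so $Ax=b$ becomes $LU(Q^{T}x) = Pb$; solve two triangular systems and undo the column permutation. SciPy is used here only for the triangular solves, and the demo takes $Q=I$ because SciPy's `lu` pivots rows only; the function name is mine:

```python
import numpy as np
from scipy.linalg import lu, solve_triangular

def solve_paq_lu(P, Q, L, U, b):
    """Given PAQ = LU, solve Ax = b: forward-substitute, back-substitute,
    then undo the column permutation x = Q w."""
    y = solve_triangular(L, P @ b, lower=True)    # L y = P b
    w = solve_triangular(U, y, lower=False)       # U w = y
    return Q @ w                                  # x = Q w, since w = Q^T x

A = np.array([[0., 2., 1.], [1., 1., 1.], [2., 1., 3.]])
b = np.array([1., 2., 3.])
p, L, U = lu(A)                                   # SciPy convention: A = p @ L @ U
print(np.allclose(solve_paq_lu(p.T, np.eye(3), L, U, b), np.linalg.solve(A, b)))  # True
```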
5 votes
1 answer
269 views

"Pivot step" that Donald Knuth mentioned in TAOCP

I stumbled upon the following operation on matrices in section 2.2.6 Arrays and Orthogonal Lists in the first volume of The Art of Computer Programming, in an example of working with sparse matrices: $...
Alexey • 2,482
2 votes
1 answer
6k views

Strictly column diagonally dominant matrices and Gaussian elimination with partial pivoting

Suppose $A \in \mathbb{C}^{m \times m}$ is strictly column diagonally dominant, which means that for each $k$, $$\left| a_{kk} \right| > \sum_{j \neq k} \left|a_{jk}\right|$$ Show that if ...
cgmil • 1,335
0 votes
0 answers
55 views

How to know whether Gaussian elimination without pivoting can be carried through to the end

Given a matrix $B$, for example $$ B= \begin{pmatrix} 1 & 1 & 1 \\ 2 & 2 & 5 \\ 4 & 6 & 8 \\ \end{pmatrix} $$ I know how to perform ...
the_firehawk • 2,435
2 votes
1 answer
91 views

Solving linear system using inverse or triangular factorization?

There are at least two general ways to solve the linear equations $Ax=b$. First approach: compute $A^{-1}$ using the Gauss-Jordan method, then $x=A^{-1}b$. Second approach: compute the triangular ...
xiaohan2012
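A quick side-by-side of the two approaches from the question as one would typically write them with NumPy/SciPy (factor once, reuse for many right-hand sides); the usual recommendation in numerical linear algebra texts is the factorization route, for both cost and accuracy. The sizes and data here are my own:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(2)
A = rng.standard_normal((500, 500))
B = rng.standard_normal((500, 20))     # 20 right-hand sides

# First approach: explicit inverse, then a matrix product per solve
x_inv = np.linalg.inv(A) @ B

# Second approach: one LU factorization, then cheap triangular solves
lu_piv = lu_factor(A)                  # PA = LU, stored compactly
x_lu = lu_solve(lu_piv, B)

print(np.max(np.abs(x_inv - x_lu)))    # the two answers agree up to rounding error
```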
3 votes
1 answer
10k views

Swapping columns in a matrix changes the solution vector order? How to keep track?

I am doing full pivoting for Gaussian elimination and in order to do the full pivoting I am required to swap columns. I am a bit rusty on my linear algebra but it seems that this process somehow ...
Darcy • 601
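The bookkeeping the asker needs: a column swap of $A$ is a swap of the corresponding unknowns, so you record the column permutation and apply its inverse to the computed solution at the end. A tiny NumPy illustration; the index-vector bookkeeping is my own choice:

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])
b = np.array([5., 6.])

colperm = np.array([1, 0])          # record: we swapped columns 0 and 1
A_swapped = A[:, colperm]           # solving A_swapped y = b gives y = x[colperm]

y = np.linalg.solve(A_swapped, b)
x = np.empty_like(y)
x[colperm] = y                      # undo the permutation: x[colperm[j]] = y[j]

print(np.allclose(A @ x, b))        # True: x solves the original system
```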
1 vote
1 answer
588 views

Computing $PAQ = LU$ using Gaussian elimination with complete pivoting

Suppose $PAQ = LU$ is computed via Gaussian elimination with complete pivoting. Show that there is no element in $e_i^{T}U$ i.e., row $i$ of $U$, whose magnitude is larger than $|\mu_{ii}| = |e_i^{T}U ...
Wolfy • 6,745
5 votes
1 answer
969 views

Gaussian Elimination preserves S.P.D.

Let $A \in \mathbb{R}^{n \times n} $ be symmetric positive definite with positive diagonal entries. I'm trying to show that at each step $m$ of Gaussian elimination $$ a^{(m+1)}_{ij} = a^{(m)}_{i,j} - ...
KyleW • 592
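The truncated formula is presumably the standard elimination update

$$a^{(m+1)}_{ij} = a^{(m)}_{ij} - \frac{a^{(m)}_{im}\,a^{(m)}_{mj}}{a^{(m)}_{mm}}, \qquad i, j > m,$$

which in block form says the trailing submatrix becomes the Schur complement $A_2 = A_{22} - a_1 a_1^{T}/a_{11}$. Symmetry of $A_2$ is immediate, and positive definiteness follows from $x^{T}A_2 x = y^{T}A y > 0$ for $x \neq 0$, where $y = \begin{pmatrix} -a_1^{T}x/a_{11} \\ x \end{pmatrix}$.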
2 votes
1 answer
546 views

Block Matrix Nonsingular $\iff v^T A^{-1} u\neq 0$

Here is the given question and my work so far: Question: Let $A$ be an $n \times n$ invertible matrix, and let $u$ and $v$ be two vectors in $\mathbb{R}^n$. Find the necessary and sufficient ...
hungryformath
1 vote
2 answers
313 views

Show that $n(n+1)/2$ multiplications are required

$$a_{11}x_1 + a_{12}x_2 + a_{13}x_3 + \cdots + a_{1,n-1}x_{n-1} + a_{1n}x_n = b_1$$ $$a_{22}x_2 + a_{23}x_3 + \cdots + a_{2,n-1}x_{n-1} + a_{2n}x_n = b_2$$ $$a_{33}x_3 + \cdots + a_{3,n-1}x_{n-1} + a_{3n}x_n = \ldots$$ ...
user277100
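The count in the title falls out of back substitution on this triangular system: row $i$ gives $x_i = \bigl(b_i - \sum_{j=i+1}^{n} a_{ij}x_j\bigr)/a_{ii}$, which costs $n-i$ multiplications plus one division. Counting the division with the multiplications, as the exercise presumably intends,

$$\sum_{i=1}^{n}\bigl((n-i)+1\bigr) \;=\; \sum_{i=1}^{n}(n-i+1) \;=\; \frac{n(n+1)}{2}.$$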
6 votes
2 answers
18k views

Number of Arithmetic Operations in Gaussian-elimination/Gauss-Jordan Hybrid Method for Solving Linear Systems

I am stuck on this problem from the book Numerical Analysis, 8th Edition (Burden), Exercise 6.1.16: Consider the following Gaussian-elimination/Gauss-Jordan hybrid method for solving linear ...
MathNerd • 2,517
15 votes
3 answers
34k views

Time complexity of LU decomposition

I am trying to derive the LU decomposition time complexity for an $n \times n$ matrix. Eliminating the first column will require $n$ additions and $n$ multiplications for $n-1$ rows. Therefore, the ...
amaatouq • 525
1 vote
1 answer
1k views

Thomas Algorithm for Tridiagonal System

A professor gave us an assignment to solve a tridiagonal system using the Thomas algorithm. Here is the exercise: I am lost as to what to do with that $(0.2\pi)^2$, and do I just calculate the $\sin(0.2\...
Wilo Maldonado
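For the algorithm itself, independent of the specific right-hand side with the $(0.2\pi)^2$ term, a compact Thomas-algorithm sketch; the array names and the small test system are mine:

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a (length n-1),
    diagonal b (length n), super-diagonal c (length n-1), RHS d (length n)."""
    n = len(b)
    cp, dp = np.empty(n - 1), np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward sweep
        denom = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / denom
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# quick check against a dense solve
a = np.array([1., 1.]); b = np.array([4., 4., 4.]); c = np.array([1., 1.])
d = np.array([5., 6., 5.])
A = np.diag(b) + np.diag(a, -1) + np.diag(c, 1)
print(np.allclose(thomas(a, b, c, d), np.linalg.solve(A, d)))   # True
```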
0 votes
1 answer
695 views

What is the computational cost of reduced row echelon and finding the null space?

I'm taking computational linear algebra and haven't been able to find much information about the computational cost (in terms of $m$ = rows and $n$ = cols) of these two routines: Reduced Row Echelon ...
royherma • 111
0 votes
1 answer
551 views

Solving 2x2 diagonally dominant matrix systems (non-symmetric)

I have a linear system of the form $Ax=b$ where $A\in \mathbb{R}^{2\times2}, b\in \mathbb{R}^{2\times1}$. A is diagonally dominant and non-symmetric. This is a "kernel" that I am using to solve a ...
Alek Sobczyk
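For a fixed $2\times 2$ system used as an inner kernel, it is common to just write the elimination out by hand (one multiplier, one back substitution). A tiny sketch that relies on the diagonal dominance the asker states to skip pivoting; the function name and test values are mine:

```python
def solve_2x2(a11, a12, a21, a22, b1, b2):
    """Solve [[a11, a12], [a21, a22]] [x1, x2] = [b1, b2] by one Gaussian
    elimination step; no pivoting, since diagonal dominance keeps a11
    (and the updated a22) away from zero."""
    m = a21 / a11                 # multiplier for row 2
    a22p = a22 - m * a12          # updated (2,2) entry
    b2p = b2 - m * b1
    x2 = b2p / a22p
    x1 = (b1 - a12 * x2) / a11
    return x1, x2

print(solve_2x2(4.0, 1.0, 1.0, 3.0, 6.0, 7.0))   # (1.0, 2.0)
```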
2 votes
0 answers
2k views

Gaussian elimination vs. Jacobi iteration

How can I determine which matrix solver is faster for a given set of equations: Gaussian elimination or Jacobi iteration? In case I have a banded matrix, is it advisable to use LU ...
user163917
2 votes
1 answer
2k views

Linear system of equations and multiple linear regression: Numerical solving

I am currently implementing a test procedure for data, namely a linear form of the Kramers-Kronig relations (paper here: http://jes.ecsdl.org/content/142/6/1885.abstract). This includes solving a ...
Jens • 167
0 votes
1 answer
1k views

Computational efficiency using Gaussian elimination

Assume it took 2 seconds to solve $Ax=b$ for $x$ (where $A$ is a $3 \times 3$ matrix and $b$ is a $3 \times 1$ vector) using Gaussian elimination. How much longer would it take to: a) use Gaussian ...
Dennis • 33
10 votes
1 answer
3k views

Decompose invertible matrix $A$ as $A = LPU$. (Artin, Chapter 2, Exercise M.11)

Decompose matrix $A$ as $A = LPU$, where $A \in GL_n(\mathbb{R})$, $L$ is lower triangular, $U$ is upper triangular with diagonal elements of $1$, and $P$ is a permutation matrix. It is fairly ...
gnometorule • 4,650
2 votes
1 answer
1k views

Speeding up Gauss Elimination

I was working on something in MATLAB a couple of months back and found a tweak to speed up Gaussian elimination in MATLAB by dividing the original matrix into 4 block matrices and then solving them in a ...