CS5016: Computational Methods and Applications: Linear Systems and Interpolation
Albert Sunny
01 February, 2023
$$Ax = b$$
where $A$ is an $n \times n$ square matrix whose elements are $a_{ij}$, and $x$ and $b$ are column vectors of dimension $n$. Component-wise, the above equation can be written as
$$\sum_{j=1}^{n} a_{ij} x_j = b_i \quad \forall i \in \{1, 2, \ldots, n\}$$
$$Q_j = L_j \, \Delta p_j$$
¹ “Scientific Computing with MATLAB and Octave”, Alfio Quarteroni and Fausto Saleri
Iterative solution method
$$P x = b - (A - P)\,x \qquad (1)$$
This splitting suggests the fixed-point iteration $P x^{(k+1)} = b - (A - P)\,x^{(k)}$: the Jacobi method takes $P$ to be the diagonal part of $A$, while the Gauss-Seidel method takes $P$ to be the lower-triangular part of $A$ (including the diagonal).
Proposition
If the matrix A is strictly diagonally dominant by row, then the Jacobi
method converges.
Caution
There are no general results stating that the Gauss-Seidel method converges faster than Jacobi’s.
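To make the splitting in (1) concrete, here is a minimal NumPy sketch of the Jacobi iteration; the function name `jacobi`, the tolerance, and the example matrix are illustrative choices (not from the slides), and the example matrix is strictly diagonally dominant by row, so the proposition above guarantees convergence.

```python
import numpy as np

def jacobi(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Jacobi iteration for Ax = b: solve P x_{k+1} = b - (A - P) x_k with P = diag(A)."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    D = np.diag(A)               # P: the diagonal entries of A
    R = A - np.diagflat(D)       # A - P: the off-diagonal part
    for _ in range(max_iter):
        x_new = (b - R @ x) / D  # component-wise Jacobi update
        if np.linalg.norm(x_new - x, ord=np.inf) < tol:
            return x_new
        x = x_new
    return x

# Strictly diagonally dominant by row, so the iteration converges
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 5.0, 2.0],
              [0.0, 2.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])
print(jacobi(A, b))              # agrees with np.linalg.solve(A, b)
```

The Gauss-Seidel variant only changes the choice of $P$ (the lower triangle of $A$ instead of its diagonal); as cautioned above, neither method is guaranteed to out-converge the other in general.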
The NumPy linear algebra functions rely on BLAS and LAPACK to provide
efficient low level implementations of standard linear algebra algorithms.
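As a small illustration (the matrix and right-hand side below are just made-up data), `numpy.linalg.solve` performs a direct solve via these LAPACK routines:

```python
import numpy as np

A = np.array([[ 3.0,  2.0, -1.0],
              [ 2.0, -2.0,  4.0],
              [-1.0,  0.5, -1.0]])
b = np.array([1.0, -2.0, 0.0])

x = np.linalg.solve(A, b)      # LU factorization with partial pivoting (LAPACK)
print(x)                       # -> [ 1. -2. -2.]
print(np.allclose(A @ x, b))   # residual check
```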
$$\tilde{f}(x_i) = y_i \quad \forall i \in \{0, 1, 2, \ldots, n\}$$
Can you figure out a cubic function that passes through the points (0, 1), (1, 4), (−1, 0) and (2, 15)? How many such cubic functions exist?
Assume
$$f(x) = a + b x + c x^2 + d x^3$$
Then we get the following linear system
$$\begin{pmatrix} 1 & 0 & 0 & 0 \\ 1 & 1 & 1 & 1 \\ 1 & -1 & 1 & -1 \\ 1 & 2 & 4 & 8 \end{pmatrix}
\begin{pmatrix} a \\ b \\ c \\ d \end{pmatrix} =
\begin{pmatrix} 1 \\ 4 \\ 0 \\ 15 \end{pmatrix}$$
Solving the above linear system will give us the coefficients of the desired cubic polynomial.
What is the complexity of the above method?
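A quick sketch of this approach in NumPy; the slides do not include code, and `np.vander` with `increasing=True` is just one way to build the matrix above.

```python
import numpy as np

xs = np.array([0.0, 1.0, -1.0, 2.0])
ys = np.array([1.0, 4.0, 0.0, 15.0])

# Vandermonde matrix with columns 1, x, x^2, x^3 (the system above)
V = np.vander(xs, 4, increasing=True)
coeffs = np.linalg.solve(V, ys)
print(coeffs)   # [1. 1. 1. 1.], i.e. f(x) = 1 + x + x^2 + x^3
```

Forming and solving this dense system by Gaussian elimination costs $O(n^3)$ operations; the Lagrange form below avoids solving a linear system altogether.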
polynomial interpolation
$$\tilde{f}(x) = a_0 + a_1 x + a_2 x^2 + \cdots + a_n x^n$$
trigonometric interpolation
rational interpolation
$$\tilde{f}(x) = \frac{a_0 + a_1 x + \cdots + a_k x^k}{b_0 + b_1 x + \cdots + b_n x^n}$$
Note that
$$\psi_j(x) = \prod_{i=0,\, i \neq j}^{n} \frac{x - x_i}{x_j - x_i}
\qquad \Longrightarrow \qquad
\psi_j(x_k) = \begin{cases} 1 & \text{if } k = j \\ 0 & \text{otherwise} \end{cases}$$
Then, the required approximation is
$$\tilde{f}(x) = \sum_{j=0}^{n} y_j \, \psi_j(x)$$
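The slides stop at the formula; the following is a minimal Python sketch of Lagrange interpolation, reusing the cubic example from earlier (the helper names `lagrange_basis` and `lagrange_interpolate` are my own, not from the slides).

```python
import numpy as np

def lagrange_basis(xs, j, x):
    """Evaluate psi_j(x) = prod_{i != j} (x - x_i) / (x_j - x_i)."""
    terms = [(x - xs[i]) / (xs[j] - xs[i]) for i in range(len(xs)) if i != j]
    return np.prod(terms, axis=0)

def lagrange_interpolate(xs, ys, x):
    """Evaluate f~(x) = sum_j y_j * psi_j(x)."""
    return sum(ys[j] * lagrange_basis(xs, j, x) for j in range(len(xs)))

# Same data as the cubic example: (0, 1), (1, 4), (-1, 0), (2, 15)
xs = np.array([0.0, 1.0, -1.0, 2.0])
ys = np.array([1.0, 4.0, 0.0, 15.0])
grid = np.linspace(-1.0, 2.0, 7)
print(lagrange_interpolate(xs, ys, grid))   # matches 1 + x + x^2 + x^3 on the grid
```

Unlike the Vandermonde approach, no linear system is solved; each evaluation point costs $O(n^2)$ operations.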