# Basic Linear Algebra

• Erik B. Bajalinov
Part of the Applied Optimization book series (APOP, volume 84)

## Abstract

In this chapter, we begin by giving some familiar definitions for the sake of completeness and to refresh readers’ memory. We survey the topics of linear algebra that will be needed in the rest of the book. First, we discuss the building blocks of linear algebra: vectors, matrices, linear dependence and independence, determinants, etc. We continue the chapter with an introduction to the inverse of a matrix; we then use our knowledge of matrices and vectors to develop a systematic procedure (the Gaussian elimination method) for solving systems of linear equations, which we in turn use to invert matrices. Finally, we close the chapter with a short description of the Gauss-Jordan method for solving systems of linear equations. The material covered in this chapter will be used in our study of linear-fractional programming.
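As a brief illustration of the systematic procedure the chapter develops, Gaussian elimination with partial pivoting can be sketched in Python as follows. This is a minimal sketch, not the book's own presentation; the function name, the augmented-matrix representation, and the omission of singular-matrix handling are all assumptions made for brevity.

```python
def gaussian_elimination(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting.

    A is an n x n list of lists; b is a list of length n.
    Illustrative sketch: assumes A is nonsingular and does no
    error handling.
    """
    n = len(A)
    # Work on a copy, stored as the augmented matrix [A | b].
    M = [row[:] + [b[i]] for i, row in enumerate(A)]

    # Forward elimination with partial pivoting.
    for k in range(n):
        # Pivot: the row with the largest |entry| in column k, at or below row k.
        pivot = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[pivot] = M[pivot], M[k]
        for i in range(k + 1, n):
            factor = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= factor * M[k][j]

    # Back substitution on the resulting upper triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = s / M[i][i]
    return x
```

The pivoting step (swapping in the largest available pivot) is what the keyword "partial pivoting" below refers to; it keeps the elimination factors small and the computation numerically stable.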

## Keywords

Triangular Matrix, Gaussian Elimination, Triangular Form, Lower Triangular Matrix, Partial Pivoting

## Notes

1. Cramer’s rule is beyond the scope of this book, since in this method each component of the solution is computed as a ratio of determinants. Though often taught in elementary linear algebra courses, the method is astronomically expensive for full matrices of nontrivial size. Cramer’s rule is useful mostly as a theoretical tool and is not usually used in operations research.
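To see why the note calls Cramer’s rule expensive, consider a direct sketch in Python (an illustration, not the book's material): each component is x_i = det(A_i) / det(A), where A_i is A with column i replaced by b. Computing determinants by cofactor expansion, as below, takes exponential time, and the rule needs n + 1 of them; the function names and the recursive determinant are assumptions made for this sketch.

```python
def det(A):
    """Determinant by cofactor expansion along the first row.

    Exponential-time recursion; this is precisely why Cramer's rule
    is impractical for full matrices of nontrivial size.
    """
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0.0
    for j in range(n):
        # Minor: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total


def cramer(A, b):
    """Solve Ax = b via Cramer's rule: x_i = det(A_i) / det(A),
    where A_i is A with column i replaced by b."""
    d = det(A)
    n = len(A)
    x = []
    for i in range(n):
        Ai = [row[:i] + [b[r]] + row[i + 1:] for r, row in enumerate(A)]
        x.append(det(Ai) / d)
    return x
```

On the same small system, this agrees with Gaussian elimination, but the cost grows factorially with n rather than cubically, which is the contrast the note draws.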