## Archive for the ‘Linear Algebra’ Category

### Deciding linear independence

February 23, 2012

Quantifier elimination has a bit of a magical feel to it.

Arnab Bhattacharyya (2011)

We would like to decide whether a given (finite) set of vectors $\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\} \subset \mathbb{R}^n$ is linearly independent. Let us first recall the definition of linear independence. From Dym’s book [1], we have:

Definition: A set of vectors $\mathbf{v}_1, \dots, \mathbf{v}_k$ in a vector space $\mathcal{V}$ over $\mathbb{F}$ is said to be linearly independent over $\mathbb{F}$ if the only scalars $\alpha_1, \dots, \alpha_k \in \mathbb{F}$ for which

$\alpha_1 \mathbf{v}_1 + \dots + \alpha_k \mathbf{v}_k = \mathbf{0}$

are $\alpha_1 = \dots = \alpha_k = 0$. This is just another way of saying that none of these vectors can be expressed as a linear combination of the others.

_____

In this post we are interested in sets of real vectors and, thus, we shall restrict our attention to the case where $\mathbb{F} = \mathbb{R}$. Asserting that a given (finite) set of vectors is linearly independent is the same as stating that the set of vectors under study is not linearly dependent. Therefore, we now recall Dym’s definition [1] of linear dependence:

Definition: A set of vectors $\mathbf{v}_1, \dots, \mathbf{v}_k$ in a vector space $\mathcal{V}$ over $\mathbb{F}$ is said to be linearly dependent over $\mathbb{F}$ if there exists a set of scalars $\alpha_1, \dots, \alpha_k \in \mathbb{F}$, not all of which are zero, such that

$\alpha_1 \mathbf{v}_1 + \dots + \alpha_k \mathbf{v}_k = \mathbf{0}$.

Notice that this permits at least one of the given vectors to be expressed as a linear combination of the others.

_____

From this definition it follows that saying that a given (finite) set of vectors $\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\} \subset \mathbb{R}^n$ is linearly dependent is the same as stating that the following existentially quantified formula

$\displaystyle\exists \alpha_1 \exists \alpha_2 \dots \exists \alpha_k \left( \neg \left(\bigwedge_{i=1}^k \alpha_i = 0\right) \land \left(\sum_{i = 1}^k \alpha_i \mathbf{v}_i = \mathbf{0}\right)\right)$

evaluates to $\text{True}$. Note that $\sum_{i = 1}^k \alpha_i \mathbf{v}_i = \mathbf{0}$ is a vector equation encapsulating $n$ scalar equations of the form

$\displaystyle\sum_{i = 1}^k \alpha_i v_i^{(j)} = 0$

where $v_i^{(j)}$ is the $j$-th component of the vector $\mathbf{v}_i$. Finally, saying that a given (finite) set of vectors $\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\} \subset \mathbb{R}^n$ is linearly independent is equivalent to the following formula

$\neg \displaystyle\exists \alpha_1 \exists \alpha_2 \dots \exists \alpha_k \left( \neg \left(\bigwedge_{i=1}^k \alpha_i = 0\right) \land \left(\bigwedge_{j=1}^n \sum_{i = 1}^k \alpha_i v_i^{(j)} = 0\right)\right)$

evaluating to $\text{True}$. We can determine the truth value of the formula above using quantifier elimination.
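Before turning to quantifier elimination, it is worth recording the standard numerical counterpart of this decision problem: the homogeneous system $\sum_i \alpha_i \mathbf{v}_i = \mathbf{0}$ has only the trivial solution exactly when the $n \times k$ matrix whose columns are the vectors has full column rank. A minimal NumPy sketch (the helper name `is_linearly_independent` is mine, not from the post):

```python
import numpy as np

def is_linearly_independent(vectors, tol=1e-10):
    """True iff the given vectors (each in R^n) are linearly independent.

    Equivalent to the quantified formula in the text: the system
    sum_i alpha_i v_i = 0 has only the trivial solution exactly when
    the n x k matrix of column vectors has rank k.
    """
    A = np.column_stack(vectors)  # n x k matrix, one vector per column
    return np.linalg.matrix_rank(A, tol=tol) == A.shape[1]

# Quick sanity checks with vectors in R^2:
print(is_linearly_independent([np.array([1.0, 0.0]),
                               np.array([0.0, 1.0])]))   # True
print(is_linearly_independent([np.array([1.0, 2.0]),
                               np.array([2.0, 4.0])]))   # False
```

Unlike quantifier elimination, this rank test works with floating-point arithmetic and a tolerance, so it trades exactness for speed.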

__________

Example

Consider the following three vectors in $\mathbb{R}^4$:

$\mathbf{v}_1 = \left[\begin{array}{c} 1\\ 0 \\ 0 \\ 0\end{array}\right]$,    $\mathbf{v}_2 = \left[\begin{array}{c} 0\\ 1 \\ 0 \\ 0\end{array}\right]$,    $\mathbf{v}_3 = \left[\begin{array}{c} 1\\ 1 \\ 0 \\ 0\end{array}\right]$.

Is the set of vectors $\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ linearly independent (over $\mathbb{R}$)? It is not, as $\mathbf{v}_3 = \mathbf{v}_1 + \mathbf{v}_2$. However, we would like to arrive at this conclusion using quantifier elimination. Note that in this example we have $k = 3$ (number of vectors), and $n = 4$ (dimensionality).

The vector equation $\alpha_1 \mathbf{v}_1 + \alpha_2 \mathbf{v}_2 + \alpha_3 \mathbf{v}_3 = \mathbf{0}$ then becomes

$\alpha_1 \left[\begin{array}{c} 1\\ 0 \\ 0 \\ 0\end{array}\right] + \alpha_2 \left[\begin{array}{c} 0\\ 1 \\ 0 \\ 0\end{array}\right] + \alpha_3 \left[\begin{array}{c} 1\\ 1 \\ 0 \\ 0\end{array}\right] = \left[\begin{array}{c} 0\\ 0 \\ 0 \\ 0\end{array}\right]$

which yields two (scalar) equations: $\alpha_1 + \alpha_3 = 0$ and $\alpha_2 + \alpha_3 = 0$. Note that the vector equation above also yields two redundant equations of the form $0 = 0$, which we happily discard.
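The expansion of the vector equation into its scalar components can be reproduced symbolically; the following SymPy sketch (the variable names are mine) recovers exactly the two nontrivial equations and discards the $0 = 0$ rows:

```python
import sympy as sp

a1, a2, a3 = sp.symbols('alpha1 alpha2 alpha3')
v1 = sp.Matrix([1, 0, 0, 0])
v2 = sp.Matrix([0, 1, 0, 0])
v3 = sp.Matrix([1, 1, 0, 0])

# The vector equation alpha1*v1 + alpha2*v2 + alpha3*v3 = 0,
# read component by component; trivial 0 = 0 rows are dropped.
combo = a1*v1 + a2*v2 + a3*v3
equations = [sp.Eq(c, 0) for c in combo if c != 0]
print(equations)  # [Eq(alpha1 + alpha3, 0), Eq(alpha2 + alpha3, 0)]
```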

The set of vectors $\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ is linearly independent (over $\mathbb{R}$) if and only if the following formula

$\neg \displaystyle\exists \alpha_1 \exists \alpha_2 \exists \alpha_3 \left( \neg \left(\bigwedge_{i=1}^3 \alpha_i = 0\right) \land \alpha_1 + \alpha_3 = 0 \land \alpha_2 + \alpha_3 = 0\right)$

evaluates to $\text{True}$. Using the following REDUCE + REDLOG script, we can determine the truth value of the formula above:

```
% decide the linear independence of a set of vectors

load_package redlog;
rlset ofsf;

% define linear functions
f1 := alpha1;
f2 := alpha2;
f3 := alpha3;
f4 := alpha1 + alpha3;
f5 := alpha2 + alpha3;

% define existentially quantified formula
phi := not ex({alpha1,alpha2,alpha3},
              not (f1=0 and f2=0 and f3=0) and f4=0 and f5=0);

% perform quantifier elimination
rlqe phi;

end;
```

This script returns $\text{False}$, which allows us to conclude that the set of vectors $\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ is, indeed, not linearly independent.
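As a cross-check outside REDLOG, the witness behind this $\text{False}$ answer can be exhibited directly. A sketch using SymPy's `solve` (a different tool than the post uses) on the two scalar equations of the example:

```python
import sympy as sp

a1, a2, a3 = sp.symbols('alpha1 alpha2 alpha3')

# Solve the two nontrivial scalar equations from the example;
# the system is underdetermined, so alpha3 remains a free parameter.
solution = sp.solve([a1 + a3, a2 + a3], [a1, a2, a3], dict=True)
print(solution)  # [{alpha1: -alpha3, alpha2: -alpha3}]
```

Setting $\alpha_3 = 1$ gives $\alpha_1 = \alpha_2 = -1$, i.e. $-\mathbf{v}_1 - \mathbf{v}_2 + \mathbf{v}_3 = \mathbf{0}$, which recovers $\mathbf{v}_3 = \mathbf{v}_1 + \mathbf{v}_2$.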

__________

Acknowledgements

This post was inspired by Arnab Bhattacharyya’s recent blog post on quantifier elimination.

__________

References

[1] Harry Dym, Linear algebra in action, American Mathematical Society, 2006.