Posts Tagged ‘Polynomials’

Sum of squares decomposition II

July 5, 2011

Last week, we concluded that the quartic form

$F(x, y) = 2 x^4 + 2 x^3 y - x^2 y^2 + 5 y^4$

has the following sum of squares decomposition

$F(x, y) = \displaystyle \frac{1}{2} \left(2 x^2 -3 y^2 + x y\right)^2 + \frac{1}{2} \left(y^2 + 3 x y\right)^2$

and, thus, $F$ is globally nonnegative. More generally, we saw that $F$ can be written as follows

$F(x,y) = \left[\begin{array}{c} x^2\\ y^2\\ x y\end{array}\right]^T \left[\begin{array}{ccc} 2 & -\lambda & 1\\ -\lambda & 5 & 0\\ 1 & 0 & 2 \lambda - 1\end{array}\right] \left[\begin{array}{c} x^2\\ y^2\\ x y\end{array}\right]$

where $\lambda \in \mathbb{R}$ is a parameter. Let us define $z := (x^2, y^2, x y)$ and

$Q := \left[\begin{array}{ccc} 2 & -\lambda & 1\\ -\lambda & 5 & 0\\ 1 & 0 & 2 \lambda - 1\end{array}\right]$

so that we can write the polynomial $F$ in the more compressed form $F(x, y) = z^T Q z$. A sufficient condition for the existence of an SOS decomposition is that $Q$ is positive semidefinite, i.e., $Q \succeq 0$. Why? If $Q$ is positive semidefinite, then it has a Cholesky factorization $Q = L^T L$ and, thus, $F(x, y) = z^T Q z = z^T L^T L z = \| L z \|_2^2$, which finally yields the SOS decomposition

$F(x, y) = \displaystyle \sum_{i =1}^3 \left(L_{i1} x^2 + L_{i2} y^2 + L_{i3} xy \right)^2$

where $L_{ij}$ is the $(i,j)$ entry of matrix $L$. One can also think of the Cholesky factorization as $Q = B B^T$, where $B = L^T$, so that we obtain $F(x, y) = \| B^T z \|_2^2$. However, let us not lose sight of where we are going. The question to be answered is: how do we find a $\lambda \in \mathbb{R}$ such that $Q \succeq 0$?

__________

Admissible values of $\lambda$

Saying that $Q$ is positive semidefinite is equivalent to saying that the $2^3 - 1 = 7$ principal minors of $Q$ are nonnegative. If $Q \succeq 0$, then the three 1st degree principal minors (which are the diagonal entries of $Q$) lead to the three inequalities

$2 \geq 0$,   $5 \geq 0$,   $2 \lambda - 1 \geq 0$.

The first two inequalities are redundant, and the third yields $\lambda \geq \frac{1}{2}$. The three 2nd degree principal minors (determinants of $2 \times 2$ submatrices of $Q$) and the 3rd degree principal minor (determinant of $Q$) can be computed with the following Python / SymPy script:


from sympy import *

# symbolic variable
Lambda = Symbol('Lambda')

# build matrix Q
Q = Matrix([[2, -Lambda, 1], [-Lambda, 5, 0], [1, 0, 2*Lambda - 1]])

print("\n2nd degree principal minors:")
print(Q.extract([0, 1], [0, 1]).det())
print(Q.extract([0, 2], [0, 2]).det())
print(Q.extract([1, 2], [1, 2]).det())

print("\n3rd degree principal minor:")
print(Q.det().factor())



which produces the following output:

2nd degree principal minors:
-Lambda**2 + 10
4*Lambda - 3
10*Lambda - 5

3rd degree principal minor:
-(Lambda - 3)*(2*Lambda**2 + 5*Lambda - 5)

From the three 2nd degree principal minors, we then obtain the three inequalities

$- \lambda^2 + 10 \geq 0$,   $4 \lambda - 3 \geq 0$,   $10 \lambda - 5 \geq 0$

which are equivalent to

$\lambda^2 \leq 10$,   $\lambda \geq \frac{3}{4}$,   $\lambda \geq \frac{1}{2}$

which yields $\lambda \in [\frac{3}{4}, \sqrt{10}]$. From the 3rd degree principal minor, we obtain the inequality

$- 2 \lambda^{3} + \lambda^{2} + 20 \lambda - 15 = - \left(\lambda -3\right) \left(2 \lambda^{2} + 5 \lambda -5\right) \geq 0$.

Hence, we have that admissible values of $\lambda$ must satisfy

$\lambda \in [\frac{3}{4}, \sqrt{10}]$,   $\left(\lambda -3\right) \left(2 \lambda^{2} + 5 \lambda -5\right) \leq 0$.

The roots of the polynomial $2 \lambda^{2} + 5 \lambda -5$ are $(-5 \pm \sqrt{65}) / 4$. Unfortunately, $(-5 + \sqrt{65}) / 4$ lies in $[\frac{3}{4}, \sqrt{10}]$. Hence, let us partition the interval $[\frac{3}{4}, \sqrt{10}]$ into two intervals:

• for $\lambda \in [\frac{3}{4}, \frac{-5 + \sqrt{65}}{4})$, we have that $2 \lambda^{2} + 5 \lambda -5$ is negative, and thus $\left(\lambda -3\right) \left(2 \lambda^{2} + 5 \lambda -5\right) \leq 0$ implies that $\lambda \geq 3$, but

$\{ \lambda \in \mathbb{R} : \lambda \in [\frac{3}{4}, \frac{-5 + \sqrt{65}}{4}) \land \lambda \geq 3\} = \emptyset$.

• for $\lambda \in [\frac{-5 + \sqrt{65}}{4}, \sqrt{10}]$, we have that polynomial $2 \lambda^{2} + 5 \lambda -5$ is nonnegative, and thus $\left(\lambda -3\right) \left(2 \lambda^{2} + 5 \lambda -5\right) \leq 0$ implies that $\lambda \leq 3$. Hence,

$\{ \lambda \in \mathbb{R} : \lambda \in [\frac{-5 + \sqrt{65}}{4}, \sqrt{10}] \land \lambda \leq 3\} = [\frac{-5 + \sqrt{65}}{4}, 3]$.

Finally, we conclude that the set of admissible values of $\lambda$ is $\Lambda := [\frac{-5 + \sqrt{65}}{4}, 3]$. Note that $\frac{-5 + \sqrt{65}}{4} \approx 0.765$.
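As a sanity check on the interval $\Lambda$, we can test a few rational values of $\lambda$ directly; the following sketch assumes a recent SymPy in which the Matrix.is_positive_semidefinite property is available:

```python
from sympy import Matrix, Rational, Symbol

lam = Symbol('lam')
Q = Matrix([[2, -lam, 1], [-lam, 5, 0], [1, 0, 2*lam - 1]])

# 1/2 and 3/4 lie below Lambda, 1 and 3 lie inside, 4 lies above;
# note that 3/4 satisfies the 2nd degree minor inequalities but
# fails the 3rd degree one
for val in [Rational(1, 2), Rational(3, 4), 1, 3, 4]:
    print(val, Q.subs(lam, val).is_positive_semidefinite)
```

The test at $\lambda = 3/4$ shows why the determinant condition cannot be dropped: the lower-order minors alone would wrongly admit it.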

__________

An ensemble of SOS decompositions

Now that we know for which values of $\lambda$ matrix $Q$ is positive semidefinite, we can compute various SOS decompositions. The following Python / SymPy script computes an SOS decomposition for $\lambda = 3$:


from sympy import *

# symbolic variables
Lambda = Symbol('Lambda')
x = Symbol('x')
y = Symbol('y')

# define polynomial
F = 2*(x**4) + 2*(x**3)*y - (x**2)*(y**2) + 5*(y**4)

# build matrix Q
Q = Matrix([[2, -Lambda, 1], [-Lambda, 5, 0], [1, 0, 2*Lambda - 1]])

# build vector of monomials
z = Matrix(3, 1, [x**2, y**2, x*y])

# compute Cholesky factorization for a particular value of Lambda
# (hermitian=False may be needed in recent SymPy versions, since
# Q is singular at Lambda = 3)
B = Q.subs(Lambda, Rational(3, 1)).cholesky(hermitian=False)

# obtain vector of SOS polynomials
f = B.T * z

print("\nSOS polynomials:")
pprint(f[0])
pprint(f[1])
pprint(f[2])

# compute residual polynomial
R = F - sum(f[i]**2 for i in [0, 1, 2])

print("\nResidual polynomial: %s" % R.expand())


and it produces the SOS decomposition

$F(x, y) = \displaystyle \frac{1}{2} \left(2 x^2 -3 y^2 + x y\right)^2 + \frac{1}{2} \left(y^2 + 3 x y\right)^2$

which is the one we obtained last week. For $\lambda = 3/2$, we get

$F(x, y) = \displaystyle \frac{1}{2} \left(2 x^2 - \frac{3}{2} y^2 + x y\right)^2 + 62 \left(\frac{1}{4} y^2 + \frac{3}{62} x y\right)^2 +\frac{1302}{961} x^2 y^2$

and to verify that this is correct, we can run the following snippet (which reuses $x$, $y$, and $F$ from the script above):

h0 = Rational(1,2) * (2*x**2 - Rational(3,2)*y**2 + x*y)**2
h1 = 62 * (Rational(3,62)*x*y + Rational(1,4)*y**2)**2
h2 = Rational(1302,961)*x**2*y**2
pprint((F - (h0 + h1 + h2)).expand())


For $\lambda = 5/2$, we obtain the SOS decomposition

$F(x, y) = \displaystyle \frac{1}{2} \left(2 x^2 - \frac{5}{2} y^2 + x y\right)^2 + \frac{30}{16} \left(y^2 + \frac{2}{3} x y\right)^2 +\frac{8}{3} x^2 y^2$.

But: trust, yet verify!

h0 = Rational(1,2) * (2*x**2 - Rational(5,2)*y**2 + x*y)**2
h1 = Rational(30,16) * (Rational(2,3)*x*y + y**2)**2
h2 = Rational(8,3)*x**2*y**2
pprint((F - (h0 + h1 + h2)).expand())


__________

What happens for non-admissible values of $\lambda$?

We know that if $\lambda \in \Lambda$, then $Q \succeq 0$. What if $\lambda \notin \Lambda$? Then matrix $Q$ is not positive semidefinite, i.e., it has at least one negative eigenvalue, and it will, or will not, have a Cholesky factorization, depending on what convention one uses.

One can think of matrices $L$ and $B$ in the Cholesky factorizations $Q = L^T L$ and $Q = B B^T$, respectively, as square roots of matrix $Q$. Do negative reals have square roots? They do if you are willing to work with complex numbers. Likewise, a matrix $Q$ that is not positive semidefinite will have a Cholesky factorization if one is willing to work with complex matrices. For example, for $\lambda = 0$ we obtain the “SOS” decomposition

$F(x, y) = \displaystyle \sum_{i=0}^2 f_i^2 (x, y)$

where the $f_i$ polynomials are:

$f_0 (x,y) = \frac{\sqrt{2}}{2} (2 x^2 + x y)$,   $f_1 (x,y) = \sqrt{5} y^2$,   $f_2 (x,y) = i \frac{\sqrt{6}}{2} x y$

where $i := \sqrt{-1}$. Note that $f_0, f_1 \in \mathbb{R}[x, y]$, but $f_2 \in \mathbb{C}[x, y]$. If we square $f_2$, we get $f_2^2 = -\frac{3}{2} x^2 y^2$. This term is being subtracted from the sum of squares, which prevents us from concluding global nonnegativity. That is why we impose the condition $Q \succeq 0$.
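This complex factorization can be reproduced in SymPy. The sketch below assumes a SymPy version whose Matrix.cholesky accepts the hermitian=False flag, which skips the positive-definiteness check and allows imaginary pivots:

```python
from sympy import Matrix, Symbol, expand

x = Symbol('x')
y = Symbol('y')

# Q for lambda = 0; it is not positive semidefinite (det(Q) = -15)
Q0 = Matrix([[2, 0, 1], [0, 5, 0], [1, 0, -1]])
z = Matrix(3, 1, [x**2, y**2, x*y])

# hermitian=False skips the positivity check, so the negative pivot
# produces an imaginary entry in the factor
B = Q0.cholesky(hermitian=False)
f = B.T * z
print(f[2])  # the complex polynomial f_2

# the squares still sum to F exactly, but f_2**2 contributes
# the negative term -(3/2) x**2 y**2
F = 2*x**4 + 2*x**3*y - x**2*y**2 + 5*y**4
R = expand(F - sum(fi**2 for fi in f))
print(R)  # 0
```

The residual is zero, so the identity $F = \sum_i f_i^2$ holds, yet it proves nothing about nonnegativity, exactly as discussed above.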

Sum of squares decomposition

June 30, 2011

Consider the following quartic form in $\mathbb{R}[x, y]$

$F(x, y) = 2 x^4 + 2 x^3 y - x^2 y^2 + 5 y^4$

and suppose we would like to decide if $F$ is globally nonnegative. In other words, we want to know whether the proposition

$F(x, y) \geq 0, \qquad{} \forall x, y \in \mathbb{R}$

is true. A sufficient condition for $F(x,y)$ to be globally nonnegative is the existence of a sum of squares decomposition [1, 2]

$F(x, y) = \displaystyle \sum_{i} f_i^2 (x, y)$

where $f_i \in \mathbb{R}[x, y]$. If such a decomposition does indeed exist, how do we compute it? How do we find the $f_i$ polynomials?

Note that $F(x, y)$ is a quartic form, i.e., a polynomial whose monomials all have total degree 4. Using Parrilo’s method [1, 2], let us try to write $F(x, y)$ as a quadratic form in all the monomials of total degree equal to 2, as follows

$F(x,y) = \left[\begin{array}{c} x^2\\ y^2\\ x y\end{array}\right]^T \left[\begin{array}{ccc} q_{11} & q_{12} & q_{13}\\ q_{12} & q_{22} & q_{23}\\ q_{13} & q_{23} & q_{33}\end{array}\right] \left[\begin{array}{c} x^2\\ y^2\\ x y\end{array}\right]$

where the real matrix

$Q := \left[\begin{array}{ccc} q_{11} & q_{12} & q_{13}\\ q_{12} & q_{22} & q_{23}\\ q_{13} & q_{23} & q_{33}\end{array}\right]$

is symmetric. Hence, we obtain

$F(x,y) = q_{11} x^4 + 2 q_{13} x^3 y + (2 q_{12} + q_{33}) x^2 y^2 + 2 q_{23} x y^3 + q_{22} y^4$

and, matching coefficients, we get the equality constraints

$q_{11} = 2$,   $q_{13} = 1$,   $2 q_{12} + q_{33} = -1$,   $q_{23} = 0$,   $q_{22} = 5$.

Let us set $q_{12} = - \lambda$, where $\lambda \in \mathbb{R}$, so that $q_{33} = 2 \lambda - 1$. Hence, we obtain a family of symmetric $Q$ matrices parameterized by $\lambda$

$Q := \left[\begin{array}{ccc} 2 & -\lambda & 1\\ -\lambda & 5 & 0\\ 1 & 0 & 2 \lambda - 1\end{array}\right]$

For each choice of $\lambda$, we get a matrix $Q$. We then conclude that the representation $F(x, y) = z^T Q z$, where $z := (x^2, y^2, x y)$, is not unique. Since there are infinitely many admissible $Q$ matrices, if we are able to find a positive semidefinite one, then $F(x, y) = z^T Q z \geq 0$ for all $z$, and global nonnegativity of the given polynomial $F$ is guaranteed!
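None of this coefficient matching needs to be done by hand. As a sketch (the symbol names q11 through q33 are just illustrative), SymPy can extract the constraints and solve them, leaving one free parameter:

```python
from sympy import symbols, Matrix, Poly, expand, solve

x, y = symbols('x y')
q11, q12, q13, q22, q23, q33 = symbols('q11 q12 q13 q22 q23 q33')

Q = Matrix([[q11, q12, q13],
            [q12, q22, q23],
            [q13, q23, q33]])
z = Matrix(3, 1, [x**2, y**2, x*y])
F = 2*x**4 + 2*x**3*y - x**2*y**2 + 5*y**4

# z^T Q z - F must vanish identically in x and y, so every
# coefficient of the residual must equal zero
residual = expand((z.T * Q * z)[0] - F)
constraints = Poly(residual, x, y).coeffs()
print(constraints)

# five linear equations in six unknowns: one parameter stays free
sol = solve(constraints, [q11, q12, q13, q22, q23, q33], dict=True)[0]
print(sol)
```

Substituting the solution back into the residual and expanding gives zero, confirming the one-parameter family of $Q$ matrices.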

For instance, let us try $\lambda = 3$. We then get

$Q = \left[\begin{array}{ccc} 2 & -3 & 1\\ -3 & 5 & 0\\ 1 & 0 & 5\end{array}\right]$

which is positive semidefinite. Computing a Cholesky-type factorization of $Q$, i.e., finding a matrix $L$ such that $Q = L^T L$ (here $L$ has only two rows, because $Q$ has rank $2$), we get

$L = \displaystyle \frac{1}{\sqrt{2}}\left[\begin{array}{ccc} 2 & -3 & 1\\ 0 & 1 & 3\end{array}\right]$

and, thus, $F(x, y) = z^T Q z = z^T L^T L z = \| L z \|_2^2$, which finally yields a sum of squares (SOS) decomposition

$F(x, y) = \displaystyle \frac{1}{2} \left(2 x^2 -3 y^2 + x y\right)^2 + \frac{1}{2} \left(y^2 + 3 x y\right)^2$.
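The matrix factorization itself can be checked with two quick SymPy commands (a minimal sanity check):

```python
from sympy import Matrix, sqrt

Q = Matrix([[2, -3, 1], [-3, 5, 0], [1, 0, 5]])
L = Matrix([[2, -3, 1], [0, 1, 3]]) / sqrt(2)

print(L.T * L == Q)  # True: the factorization is exact
print(Q.rank())      # 2, which is why two rows suffice
```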

To make sure that this SOS decomposition is correct, we can use a little Python / SymPy script:


from sympy import *

# symbolic variables
x = Symbol('x')
y = Symbol('y')

# define polynomial
F = 2*(x**4) + 2*(x**3)*y - (x**2)*(y**2) + 5*(y**4)

# define SOS polynomials
f1 = (sqrt(2)/2) * (2*(x**2) - 3*(y**2) + x*y)
f2 = (sqrt(2)/2) * (y**2 + 3*x*y)

# construct residual polynomial
R = F - (f1**2 + f2**2)

print("Residual polynomial: %s" % R.expand())



The residual polynomial is $R (x, y) = 0$. The SOS decomposition is correct! We can then conclude that the given polynomial $F$ is globally nonnegative.

__________

Remark: this post is based on example 4.1 of Parrilo’s doctoral dissertation [1] and also on example 3.5 of Parrilo’s paper [2].

__________

References

[1] Pablo Parrilo, Structured semidefinite programs and semialgebraic geometry methods in robustness and optimization, Dissertation (Ph.D.), California Institute of Technology, 2000.

[2] Pablo Parrilo, Semidefinite programming relaxations for semialgebraic problems, Mathematical Programming, 2003.

Building a polynomial from its roots II

December 10, 2008

Suppose we are given an $n$-tuple of (not necessarily distinct) complex numbers $(w_1, w_2, \ldots, w_n)$ and we construct a monic univariate polynomial of degree $n$ over field $\mathbb{C}$ whose $n$ roots are the elements of $(w_1, w_2, \ldots, w_n)$

$p_n(z) = \displaystyle\prod_{i=1}^n (z - w_i) = \displaystyle\sum_{m=0}^n a(m,n) z^m$,

where $a(m,n) \in \mathbb{C}$ is the coefficient of monomial $z^m$ of degree $m$ in polynomial $p_n(z)$ of degree $n$. This double-indexing might seem a bit confusing, but its usefulness will be made evident later on.

Usually, we take a univariate polynomial and try to find its roots. Now we are interested in the reverse operation, i.e., we would like to find the values of the coefficients $\{a(m,n)\}_{m=0}^{n}$ of polynomial $p_n(z)$ in terms of the roots $(w_1, w_2, \ldots, w_n)$. Note that every permutation of the $n$-tuple $(w_1, w_2, \ldots, w_n)$ results in the same monic polynomial of degree $n$.

_____

Constructing polynomials

Say we know the coefficients of polynomial $p_k(z)$ of degree $k$. Then we can write the coefficients $\{a(m, k+1)\}_{m=0}^{k+1}$ of polynomial $p_{k+1}(z)$ in terms of the coefficients $\{a(m, k)\}_{m=0}^{k}$ and the $(k+1)$-th element of the $n$-tuple $(w_1, w_2, \ldots, w_n)$. In my previous post on this topic, I derived the recursion

$a(m,k+1) = \displaystyle\left\{\begin{array}{rl} - w_{k+1} a(0,k) & \text{if} \quad{} m = 0\\ a(m-1,k) - w_{k+1} a(m,k) & \text{if} \quad{} 1 \leq m \leq k\\ a(k,k) & \text{if} \quad{} m=k+1\\ \end{array}\right.$

Let us start with a monic polynomial of degree zero, $p_0(z) = 1$, whose only coefficient is $a(0,0) = 1$. We take the first element of the $n$-tuple $(w_1, w_2, \ldots, w_n)$ and construct the monic polynomial $p_1(z) = z - w_1$ of degree $1$ using the recursion above. Then we take the 2nd root $w_2$ and construct the monic polynomial of degree $2$ whose roots are $w_1, w_2$

$p_2(z) = z^2 - (w_1 + w_2) z + w_1 w_2$.

Iterating successively, we obtain a monic polynomial $p_n(z)$ of degree $n$ whose roots are $w_1, w_2, \ldots, w_n$.
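The iteration just described fits in a few lines of plain Python; this is a sketch (the function name poly_from_roots is my own), returning the coefficients $a(0,n), \ldots, a(n,n)$ in order of increasing degree:

```python
def poly_from_roots(roots):
    """Coefficients a(0,n), ..., a(n,n) of the monic polynomial
    prod (z - w_i), built one root at a time with the recursion
    a(m, k+1) = a(m-1, k) - w_{k+1} a(m, k)."""
    a = [1]  # p_0(z) = 1, whose only coefficient is a(0,0) = 1
    for w in roots:
        prev = a
        a = [0] * (len(prev) + 1)
        a[0] = -w * prev[0]                  # case m = 0
        for m in range(1, len(prev)):        # case 1 <= m <= k
            a[m] = prev[m - 1] - w * prev[m]
        a[-1] = prev[-1]                     # case m = k+1 (stays monic)
    return a

# roots 1 and 2 give z^2 - 3z + 2
print(poly_from_roots([1, 2]))      # [2, -3, 1]
# complex roots work too: (z - i)(z + i) = z^2 + 1
print(poly_from_roots([1j, -1j]))
```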

_____

A graphical approach

Consider the following set of 15 nodes arranged in 5 columns and 5 rows. Each node is labelled with an ordered pair $(i,j)$, where $i$ is the row index and $j$ is the column index.

We now attach a complex number $a(i,j)$ to each node $(i,j)$. The coefficient of monomial $z^m$ in polynomial $p_n(z)$ can be thought of as the number attached to node $(m,n)$. For example, the coefficients of polynomial $p_2(z)$ are the numbers attached to the nodes of the 3rd column (note that we count rows and columns starting at the zero index, like in the C programming language). The double-indexing now makes some sense ;-)

Finally, we add directed edges linking nodes in adjacent columns, and obtain the following directed graph (which reminds me of the trellis diagrams used in convolutional coding)

Recall the recursion formula which allows us to write the coefficients of polynomial $p_{k+1}(z)$ in terms of the coefficients of polynomial $p_{k}(z)$

$a(m,k+1) = \displaystyle\left\{\begin{array}{rl} - w_{k+1} a(0,k) & \text{if} \quad{} m = 0\\ a(m-1,k) - w_{k+1} a(m,k) & \text{if} \quad{} 1 \leq m \leq k\\ a(k,k) & \text{if} \quad{} m=k+1\\ \end{array}\right.$.

Since the coefficients $a(\cdot, \cdot)$ are attached to the nodes of the directed graph depicted above, we can use the recursion formula to compute the values attached to the nodes of a given column as a function of the values attached to the nodes in the column to the left.

This graphical approach is (most likely) of no practical use. Nonetheless, this approach does reveal the “structure” of the process of constructing a polynomial from its roots, and such structure is of much greater beauty than cumbersome algebraic manipulation.

_____

An example

Suppose we want to compute the coefficients of the monic polynomial $p_4(z) = (z - 1)^4$, which has a root of multiplicity $4$ at $z = 1$. In this case we have $w_1 = w_2 = w_3 = w_4 = 1$. From the well-known binomial theorem, we know that

$p_4(z) = z^4 - 4 z^3 + 6 z^2 - 4 z + 1$,

and therefore, the coefficients are $a(0,4) = 1$, $a(1,4) = -4$, $a(2,4) = 6$, $a(3,4) = -4$, and $a(4,4) = 1$. Note that the absolute values of these coefficients are the binomial coefficients and can be found in the 5th row of Pascal’s triangle.

Let us now construct this polynomial one step at a time using the graphical approach described above. We start with monic polynomial $p_0(z) = 1$, whose only coefficient is $a(0,0) = 1$

and then we compute the coefficients of the monic polynomial $p_1(z) = z-1$ using the recursion formula. Graphically, this means that the values of the 2 nodes in the 2nd column can be computed from $a(0,0)$, as follows

where the directed edges now have weights. For example, the edge linking nodes $(0,0)$ and $(0,1)$ has weight $-1$, while the edge linking $(0,0)$ and $(1,1)$ has weight $1$. If the weight of an edge is equal to $1$, then we do not label the edge in order to avoid cluttering the diagram.

From the coefficients of $p_1(z)$ and the second root, $w_2 = 1$, we can compute the coefficients of $p_2(z)$ using the recursion again, i.e., the values of the nodes in the 3rd column can be computed from the values in the nodes in the 2nd column, as follows

The values of the nodes in the 3rd column are known, and since the 3rd root is $w_3 = 1$ we can now obtain the coefficients of $p_3(z)$

and since $w_4 = 1$, we can finally obtain the coefficients of the monic polynomial $p_4(z)$, which are the values of the nodes in the 5th column

In this example the roots I chose have zero imaginary parts, but this approach still works if the roots have nonzero imaginary parts. This method could be used to compute the coefficients of polynomials of degrees larger than $4$, of course.

Building a polynomial from its roots

November 22, 2008

Suppose we are given a set of $n \geq 2$ distinct real numbers $\mathcal{R} = \{r_1, r_2, \ldots, r_n\}$, and we build a monic univariate polynomial (over field $\mathbb{R}$) of degree $n$ whose $n$ distinct roots are the elements of set $\mathcal{R}$

$p_n(x) = \displaystyle\prod_{i=1}^n (x - r_i) = \displaystyle\sum_{m=0}^n a_{m,n} x^m$,

where $a_{m,n} \in \mathbb{R}$ is the coefficient of monomial $x^m$. The second subscript in $a_{m,n}$ tells us that the coefficient corresponds to a polynomial whose degree is $n$.

Problem: for $n \geq 2$, what are the values of the coefficients $\{a_{m,n}\}_{m=0}^{n}$ in terms of the roots $\{r_i\}_{i=1}^n$?

At first glance, this problem seems elementary. However, I had never thought of it and I found it quite interesting. I prefer to think of examples before attempting to see the “big picture”, so let us consider the simplest cases.

_____

Case $n=1$

The monic polynomial of degree $1$ whose root is $r_1$ is

$p_1(x) = x - r_1$

and the coefficients are

$\left[\begin{array}{c}a_{0,1}\\a_{1,1}\end{array}\right] = \left[\begin{array}{c} -r_1\\1\end{array}\right]$.

_____

Case $n=2$

The monic polynomial of degree $2$ whose roots are $r_1, r_2$ is

$\begin{array}{rl} p_2(x) &= (x - r_1) (x - r_2)\\ &= x^2 - (r_1 + r_2) x + r_1 r_2\end{array}$

and the coefficients are

$\left[\begin{array}{c}a_{0,2}\\a_{1,2}\\a_{2,2}\end{array}\right] = \left[\begin{array}{c}r_1 r_2\\-(r_1+r_2)\\1\end{array}\right]$.

_____

Case $n=3$

The monic polynomial of degree $3$ whose roots are $r_1, r_2, r_3$ is

$\begin{array}{rl} p_3(x) &= (x - r_1) (x - r_2) (x - r_3)\\ &= x^3 - (r_1+r_2 + r_3) x^2 + (r_1 r_2 + r_1 r_3 + r_2 r_3) x - r_1 r_2 r_3\\\end{array}$

and the coefficients are

$\left[\begin{array}{c}a_{0,3}\\a_{1,3}\\a_{2,3}\\a_{3,3}\end{array}\right] = \left[\begin{array}{c} - r_1 r_2 r_3\\ r_1 r_2 + r_1 r_3 + r_2 r_3\\ -(r_1+r_2 + r_3)\\1\end{array}\right]$.

_____

The coefficients $\{a_{m,n}\}_{m=0}^{n}$ can be computed in a recursive fashion. Consider the monic polynomials

$p_{k}(x) = \displaystyle\prod_{i=1}^{k} (x - r_i)$

and

$p_{k+1}(x) = \displaystyle\prod_{i=1}^{k+1} (x - r_i) = (x - r_{k+1}) \displaystyle\prod_{i=1}^k (x - r_i)$,

then $p_{k+1}(x) = (x - r_{k+1}) p_k(x)$, and thus

$\begin{array}{rl} p_{k+1}(x) &= (x - r_{k+1})\displaystyle\sum_{m=0}^k a_{m,k} x^m\\ &= \displaystyle\sum_{m=0}^k a_{m,k} x^{m+1} - \displaystyle\sum_{m=0}^k r_{k+1} a_{m,k} x^m.\\\end{array}$

Note that

$\begin{array}{rl} \displaystyle\sum_{m=0}^k a_{m,k} x^{m+1} &= \displaystyle\sum_{m=1}^k a_{m-1,k} x^m + a_{k,k} x^{k+1}\\ \displaystyle\sum_{m=0}^{k} r_{k+1} a_{m,k} x^m &= r_{k+1} a_{0,k} + \displaystyle\sum_{m=1}^k r_{k+1} a_{m,k} x^m\end{array}$

and therefore

$p_{k+1}(x) = a_{k,k} x^{k+1} + \displaystyle\sum_{m=1}^k \left(a_{m-1,k} - r_{k+1} a_{m,k}\right) x^m - r_{k+1} a_{0,k}$.

We can write $p_{k+1}(x)$ in the expanded form in terms of the coefficients $\{a_{m,k+1}\}_{m=0}^{k+1}$

$p_{k+1}(x) = a_{k+1,k+1} x^{k+1} + \displaystyle\sum_{m=1}^k a_{m,k+1} x^m + a_{0,k+1}$

and therefore we can write the coefficients $\{a_{m,k+1}\}_{m=0}^{k+1}$ in terms of the coefficients $\{a_{m,k}\}_{m=0}^{k}$, as follows

$a_{m,k+1} = \displaystyle\left\{\begin{array}{rl} - r_{k+1} a_{0,k} & \text{if} \quad{} m = 0\\ a_{m-1,k} - r_{k+1} a_{m,k} & \text{if} \quad{} 1 \leq m \leq k\\ a_{k,k} & \text{if} \quad{} m=k+1\\ \end{array}\right.$

Hence, starting with the zero-degree monic polynomial $p_0(x)$, whose only coefficient is $a_{0,0} = 1$, we can build $p_1(x)$, the monic polynomial of degree $1$ whose root is $r_1$. From $p_1(x)$, and given a second root $r_2$, we can build $p_2(x)$, the monic polynomial of degree $2$ whose roots are $r_1, r_2$. Iterating successively, we can build $p_n(x)$, the monic polynomial of degree $n$ whose roots are $r_1,r_2, \ldots, r_n$.
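To close, the recursion can be exercised with symbolic roots and compared against direct expansion; here is a SymPy sketch for $n = 3$ (variable names are illustrative):

```python
from sympy import symbols, expand, Poly

x, r1, r2, r3 = symbols('x r1 r2 r3')
roots = [r1, r2, r3]

# build the coefficients a_{m,k} one root at a time via the recursion
a = [1]  # p_0(x) = 1
for r in roots:
    a = ([-r * a[0]]
         + [expand(a[m - 1] - r * a[m]) for m in range(1, len(a))]
         + [a[-1]])

# expand prod (x - r_i) directly and read off its coefficients,
# lowest degree first
p = expand((x - r1) * (x - r2) * (x - r3))
direct = list(reversed(Poly(p, x).all_coeffs()))

print(a)
print(all(expand(u - v) == 0 for u, v in zip(a, direct)))  # True
```

The recursion reproduces exactly the elementary symmetric polynomials obtained in the cases $n = 1, 2, 3$ above.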