# Fliess' theorem for minimising weighted automata

We state and prove Fliess' theorem, which says that the minimal automaton recognising a given formal series has exactly as many states as the rank of its Hankel matrix.

A formal series (here over the reals) is a function $f : \Sigma^* \to \R$. The Hankel matrix of $f$ is a bi-infinite matrix $H_f \in \R^{\Sigma^* \times \Sigma^*}$ defined by

$$H_f(u,v) = f(uv).$$

For recognising formal series we use weighted automata:

$$\A = (Q,\ \alpha \in \R^Q,\ \Delta : \Sigma \to \R^{Q \times Q},\ \beta \in \R^Q),$$

where $Q$ is a finite set of states, $\alpha$ is the initial (row) vector, $\Delta(a)$ is the transition matrix of the letter $a$, and $\beta$ is the final (column) vector.

Such an automaton recognises the function

$$f(w_1 \cdots w_n) = \alpha \cdot \Delta(w_1) \cdots \Delta(w_n) \cdot \beta.$$
**Theorem** (Fliess '74): Let $f$ be a formal series.

- Any automaton recognising $f$ has at least $\rk(H_f)$ many states,
- There exists an automaton recognising $f$ with $\rk(H_f)$ many states.

### First implication: the backward-forward decomposition

Let $\A$ be an automaton recognising $f$.

Define $P_\A(\cdot,q) \in \R^{\Sigma^*}$ to be the formal series recognised by the weighted automaton $\A$ with $e_q$ as final vector, i.e. $P_\A(u,q) = (\alpha \Delta(u))_q$, and $S_\A(\cdot,q) \in \R^{\Sigma^*}$ the formal series recognised by $\A$ with $e_q$ as initial vector, i.e. $S_\A(v,q) = (\Delta(v) \beta)_q$.

We have

$$H_f(u,v) = f(uv) = \alpha \Delta(u) \Delta(v) \beta = \sum_{q \in Q} P_\A(u,q) \cdot S_\A(v,q),$$

in other words $H_f = \sum_{q \in Q} P_\A(\cdot,q) \cdot S_\A(\cdot,q)^T$.

Hence we wrote $H_f$ as a sum of $|Q|$ matrices, each of which is the product of a column vector with a row vector, hence has rank at most $1$. We conclude by noting that the rank of a sum of matrices is at most the sum of the ranks.
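The lower bound can be observed numerically on a finite truncation of the Hankel matrix; a sketch, again assuming for illustration the series $f(w) = $ number of $a$'s in $w$, which is recognised by a $2$-state automaton:

```python
import numpy as np
from itertools import product

# Illustrative example: f(w) = number of 'a's in w, recognised by a
# 2-state weighted automaton.
def f(w):
    return w.count('a')

# Finite truncation of the Hankel matrix H_f(u, v) = f(uv),
# with rows and columns indexed by all words of length <= 3.
words = [''.join(p) for n in range(4) for p in product('ab', repeat=n)]
H = np.array([[f(u + v) for v in words] for u in words], dtype=float)

# Fliess: the rank of H_f is the number of states of a minimal automaton.
print(np.linalg.matrix_rank(H))  # 2
```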

### Converse implication: construction of the minimal automaton

We introduce some notation. For $X,Y$ two subsets of $\Sigma^*$, we let $H_f(X,Y)$ denote the submatrix of $H_f$ restricted to rows in $X$ and columns in $Y$. Observe the following equality: for all words $x,y$ and letters $a$,

$$H_f(xa, y) = f(xay) = H_f(x, ay).$$

#### The backward automaton

Without loss of generality let us assume that $f \neq 0$. It follows that the row $H_f(\varepsilon, \Sigma^*) = f$ is non-zero. We consider the family of vectors $(H_f(x, \Sigma^*))_{x \in \Sigma^*}$ given by the rows of $H_f$. Let us consider $X$ indexing a basis of this family, so $|X| = \rk(H_f)$; thanks to the previous assumption we can assume $\varepsilon \in X$.

We construct an automaton $\A = (X, \alpha, \Delta, \beta)$ as follows.

- The set of states is $X$.
- The initial vector is $\alpha = e_\varepsilon$.
- The transition matrix $\Delta(a)$ is defined to satisfy $H_f(xa, \Sigma^*) = \sum_{x' \in X} \Delta(a)_{x,x'}\, H_f(x', \Sigma^*)$ for all $x \in X$.
- The final vector is $\beta$ given by $\beta_x = f(x)$.

To see that there exists $\Delta(a)$ such that this equality holds, let $x \in X$. We decompose $H_f(xa,\Sigma^*)$ on the basis $(H_f(x', \Sigma^*))_{x' \in X}$: there exist coefficients $(\lambda_{x'})_{x' \in X}$ such that

$$H_f(xa, \Sigma^*) = \sum_{x' \in X} \lambda_{x'}\, H_f(x', \Sigma^*),$$

and we set $\Delta(a)_{x,x'} = \lambda_{x'}$.

We now argue that $\A$ recognises $f$: we show by induction on $w$ that

$$H_f(w, \Sigma^*) = \sum_{x \in X} (\alpha \Delta(w))_x\, H_f(x, \Sigma^*).$$

Projecting on the column $\varepsilon$ yields $f(w) = H_f(w, \varepsilon) = \sum_{x \in X} (\alpha \Delta(w))_x f(x) = \alpha \Delta(w) \beta$.

Note that we used the equality $H_f(xa, y) = \sum_{x' \in X} \Delta(a)_{x,x'}\, H_f(x', y)$ for all words $y$, which is the projection on the column $y$ of the equality defining $\Delta(a)$.

This concludes the proof.
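The backward construction can be carried out mechanically on a finite truncation of the Hankel matrix. A sketch, under two assumptions made for illustration: the series is $f(w) = $ number of $a$'s in $w$, and columns indexed by words of length at most $3$ suffice to witness the full rank:

```python
import numpy as np
from itertools import product

# Sketch of the backward construction on a finite Hankel truncation
# (illustrative example: f(w) = number of 'a's in w).
def f(w):
    return w.count('a')

# Columns of the truncated Hankel matrix: all words of length <= 3.
columns = [''.join(p) for n in range(4) for p in product('ab', repeat=n)]

def row(x):
    """Truncated row H_f(x, .) of the Hankel matrix."""
    return np.array([float(f(x + y)) for y in columns])

# Greedily pick X indexing a basis of the rows; epsilon comes first.
X, basis = [], []
for x in columns:
    if np.linalg.matrix_rank(np.vstack(basis + [row(x)])) > len(basis):
        X.append(x)
        basis.append(row(x))
B = np.vstack(basis)

def coords(v):
    """Coordinates of v in the chosen basis of rows."""
    return np.linalg.lstsq(B.T, v, rcond=None)[0]

alpha = np.eye(len(X))[X.index('')]        # initial vector e_epsilon
Delta = {a: np.vstack([coords(row(x + a)) for x in X]) for a in 'ab'}
beta = np.array([float(f(x)) for x in X])  # final vector beta_x = f(x)

def recognised(w):
    v = alpha
    for c in w:
        v = v @ Delta[c]
    return v @ beta

print(len(X), recognised('abab'))  # 2 states, value approx 2.0
```

The automaton obtained has $|X| = \rk(H_f)$ states, matching the theorem.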

For the sake of curiosity, we show that there exists a dual construction, which considers columns instead of rows.

#### The forward automaton

Since $f \neq 0$, the column $H_f(\Sigma^*, \varepsilon) = f$ is non-zero. We consider the family of vectors $(H_f(\Sigma^*, y))_{y \in \Sigma^*}$ given by the columns of $H_f$. Let us consider $Y$ indexing a basis of this family; thanks to the previous assumption we can assume $\varepsilon \in Y$.

We construct an automaton $\A = (Y, \alpha, \Delta, \beta)$ as follows.

- The set of states is $Y$.
- The initial vector is $\alpha$ given by $\alpha_y = f(y)$.
- The transition matrix $\Delta(a)$ is defined to satisfy $H_f(\Sigma^*, ay) = \sum_{y' \in Y} \Delta(a)_{y',y}\, H_f(\Sigma^*, y')$ for all $y \in Y$.
- The final vector is $\beta = e_\varepsilon$.

To see that there exists $\Delta(a)$ such that this equality holds, let $y \in Y$. We decompose $H_f(\Sigma^*, ay)$ on the basis $(H_f(\Sigma^*, y'))_{y' \in Y}$: there exist coefficients $(\lambda_{y'})_{y' \in Y}$ such that

$$H_f(\Sigma^*, ay) = \sum_{y' \in Y} \lambda_{y'}\, H_f(\Sigma^*, y'),$$

and we set $\Delta(a)_{y',y} = \lambda_{y'}$.

We now argue that $\A$ recognises $f$: we show by induction on $w$ that

$$H_f(\Sigma^*, w) = \sum_{y \in Y} (\Delta(w) \beta)_y\, H_f(\Sigma^*, y).$$

Projecting on the row $\varepsilon$ yields $f(w) = H_f(\varepsilon, w) = \sum_{y \in Y} (\Delta(w) \beta)_y f(y) = \alpha \Delta(w) \beta$.

Note that we used the equality $H_f(x, ay) = \sum_{y' \in Y} \Delta(a)_{y',y}\, H_f(x, y')$ for all words $x$, which is the projection on the row $x$ of the equality defining $\Delta(a)$.

We finish this post with an alternative abstract point of view.

#### An abstract construction without choosing a basis

(Thanks to Pierre Ohlmann for working this out with me!)

Instead of constructing a weighted automaton, we build a representation, which is given by a finite-dimensional vector space $V$ (over the reals), a morphism $\mu : \Sigma^* \to L(V)$ into the linear endomorphisms of $V$, a vector $v_0 \in V$, and a linear form $\lambda : V \to \R$. A representation computes a function $f$ by $f(w) = \lambda(\mu(w)(v_0))$.

One can show that, through the choice of a basis for $V$, a representation induces a weighted automaton, and conversely.

We consider the vector space $\R^{\Sigma^*}$, which has infinite dimension. We simplify the notation $H_f(w, \Sigma^*)$ into $H_w$. We let $V$ denote the vector space generated by $(H_w)_{w \in \Sigma^*}$; it has finite dimension since $\rk(H_f)$ is finite. Define

$$\mu(a)(H_w) = H_{wa}.$$

We first need to show that this is well defined: if $H_w = H_{w'}$, then $H_{wa} = H_{w'a}$. This is easily checked: for every word $u$,

$$H_{wa}(u) = f(wau) = H_w(au) = H_{w'}(au) = f(w'au) = H_{w'a}(u).$$

Then we extend $\mu$ into a morphism $\mu : \Sigma^* \to L(V)$, and let $\lambda(H_w) = f(w)$; note that $\lambda$ is the (linear) evaluation at $\varepsilon$, since $H_w(\varepsilon) = f(w)$. We obtain by proving the invariant $\mu(w)(H_\varepsilon) = H_w$ that the representation computes the function $f$.
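Writing out the invariant, under the convention (assumed here) that letters act on the right, i.e. $\mu(wa) = \mu(a) \circ \mu(w)$:

```latex
% Invariant: \mu(w)(H_\varepsilon) = H_w, by induction on w.
% Base case: \mu(\varepsilon) = \mathrm{id}, so
%   \mu(\varepsilon)(H_\varepsilon) = H_\varepsilon.
% Inductive step, for a word w and a letter a:
\mu(wa)(H_\varepsilon)
  = \mu(a)\bigl(\mu(w)(H_\varepsilon)\bigr)
  = \mu(a)(H_w)
  = H_{wa},
% and therefore
\lambda\bigl(\mu(w)(H_\varepsilon)\bigr) = \lambda(H_w) = f(w).
```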