On the Random Generation of Non-deterministic Automata (up to isomorphism)
Pierre-Cyrille Héam and Jean-Luc Joly
FEMTO-ST, Université de Franche-Comté
CIAA 2015
Finite automata: many algorithmic applications (modelling, pattern matching, machine learning, verification, text processing, ...).
How to evaluate new (optimized) algorithms?
- Worst-case complexity.
- Benchmarking.
- Hard instances.
- Average complexity.
- Random generation.
Uniform Random Generation of FA
Most work concerns deterministic automata:
- 2-letter DFAs [N00], extended to any alphabet in [CP05], in O(n³);
- an O(n^{3/2}) algorithm in [BN07];
- an O(n^{3/2}) rejection algorithm in [CN12];
- [CdF12] for acyclic automata using Markov chains, and [dFN13] for a recursive approach;
- [TV05], for NFAs, using a random graph approach.
Random Generation Using Markov Chains
[Figure: a 4-vertex graph with loops, whose edges carry the transition probabilities (1/4, 1/2, 3/4) of the matrix M below.]

    M = ( 0.75  0.25  0     0    )
        ( 0.25  0     0.5   0.25 )
        ( 0     0.5   0.5   0    )
        ( 0     0.25  0     0.75 )

Starting from an arbitrary vertex, move long enough and return the current vertex. The successive powers of M converge entrywise to 1/4, for instance

    M^20 ≈ ( 0.2516  0.2500  0.2500  0.2484 )
           ( 0.2500  0.2500  0.2500  0.2500 )
           ( 0.2500  0.2500  0.2501  0.2500 )
           ( 0.2484  0.2500  0.2500  0.2516 )
Theorem
If the graph is strongly connected (irreducible) and aperiodic, then there is a stationary distribution.
If, moreover, the matrix is symmetric, then this stationary distribution is uniform.
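Below is a minimal sketch (not from the slides) of this convergence in Python/NumPy, using the example matrix M above:

```python
import numpy as np

# Symmetric, irreducible and aperiodic transition matrix of the 4-vertex example.
M = np.array([
    [0.75, 0.25, 0.00, 0.00],
    [0.25, 0.00, 0.50, 0.25],
    [0.00, 0.50, 0.50, 0.00],
    [0.00, 0.25, 0.00, 0.75],
])

# Each row of M^k gives the distribution after k steps from the corresponding
# start vertex; all rows tend to the uniform distribution (0.25, 0.25, 0.25, 0.25).
for k in (1, 2, 6, 20):
    print(k, np.round(np.linalg.matrix_power(M, k)[0], 4))
```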
Uniform Random Generation of FA
Remarks
- "How many steps?" is a difficult question (mixing time).
- In practice the graphs are huge and defined by local transformations: in our context the vertices are NFAs.
- Metropolis-Hastings algorithm: modify the transition probabilities to obtain a given stationary distribution.
Metropolis-Hastings Algorithm
Given a finite irreducible (strongly connected) Markov chain on Ω with a symmetric matrix M, and a distribution π on Ω, the associated Metropolis chain is defined by the matrix Mπ:

    Mπ(x, y) = M(x, y) · min{1, π(y)/π(x)}                      if x ≠ y
    Mπ(x, x) = 1 − Σ_{z ≠ x} M(x, z) · min{1, π(z)/π(x)}

Requires knowing how to compute π(y)/π(x).
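As an illustration (not from the slides), one Metropolis step can be sketched as follows; `propose` and `pi_ratio` are hypothetical helpers standing for the symmetric chain M and for the ratio π(y)/π(x):

```python
import random

def metropolis_step(x, propose, pi_ratio):
    """One step of the Metropolis chain M_pi built on a symmetric proposal."""
    y = propose(x)          # draw y with probability M(x, y)
    if y == x:
        return x
    # Accept the move with probability min(1, pi(y)/pi(x)); otherwise stay at x.
    if random.random() <= min(1.0, pi_ratio(x, y)):
        return y
    return x
```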
Outline
1. Introduction - Motivation
2. Random Generation (up to isomorphism)
3. Counting Automorphisms
4. Experiments and Conclusion
Studied Classes
The alphabet Σ is fixed; the set of states is {1, . . . , n}.
- N(n): trim NFAs with n states. But most of them accept all words, which is not very interesting for practical use.
- Nm(n): trim NFAs with n states such that each state has at most m outgoing transitions.
- N′m(n): trim NFAs with n states such that, for each letter a and each state p, there are at most m a-labelled outgoing transitions.
- For a class X, X• is the class of elements of X having 1 as unique initial state.
Example
[Figure: a 3-state automaton over {a, b} with states 1, 2, 3.]
An element of N(3)•, N4(3)• and N′2(3)•.
Defining which classes are interesting for testing applications is not the main issue of the paper. The proposed approach is quite easily adaptable to many classes of non-deterministic automata.
Markov Chains
[Figure: the three local moves on an example NFA with states p, q; an initial or final flip is proposed with probability ρ1/|Q| resp. ρ2/|Q|, a transition flip with probability ρ3/(|Σ|·n²).]
Let ρ1, ρ2, ρ3 be positive real numbers such that ρ1 + ρ2 + ρ3 ≤ 1.
Pick x ∈ [0, 1] uniformly.
- If x ≤ ρ1, pick a state p uniformly and switch whether p is initial.
- If ρ1 < x ≤ ρ1 + ρ2, pick a state p uniformly and switch whether p is final.
- If ρ1 + ρ2 < x ≤ ρ1 + ρ2 + ρ3, pick (p, b, q) uniformly and switch whether (p, b, q) is a transition.
- Otherwise do nothing.
If the obtained automaton is in X(n), move; otherwise don't move. A sketch of one such move follows.
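A minimal sketch of one move of this chain, for an NFA given by its sets of states, initial states, final states and transitions; `in_class` is a hypothetical membership test for the chosen class X(n):

```python
import random

def chain_step(states, alphabet, initial, final, transitions,
               rho1, rho2, rho3, in_class):
    """One move: flip an initial flag, a final flag or a transition,
    and keep the move only if the result stays in the class X(n)."""
    x = random.random()
    new_initial, new_final = set(initial), set(final)
    new_transitions = set(transitions)
    if x <= rho1:
        p = random.choice(sorted(states))
        new_initial ^= {p}                  # switch whether p is initial
    elif x <= rho1 + rho2:
        p = random.choice(sorted(states))
        new_final ^= {p}                    # switch whether p is final
    elif x <= rho1 + rho2 + rho3:
        p = random.choice(sorted(states))
        b = random.choice(sorted(alphabet))
        q = random.choice(sorted(states))
        new_transitions ^= {(p, b, q)}      # switch whether (p, b, q) is a transition
    else:
        return initial, final, transitions  # do nothing
    if in_class(states, new_initial, new_final, new_transitions):
        return new_initial, new_final, new_transitions
    return initial, final, transitions      # not in X(n): don't move
```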
Markov Chains
The chains (for the previously defined classes) satisfy:
- The underlying graph is aperiodic and strongly connected.
- The matrix of probabilities is symmetric.
- Moving one step in the graph can be done in linear time (for testing whether automata are trim); see the sketch below.
The stationary distribution is therefore the uniform distribution.
For the class of all NFAs the mixing time is in O(n³).
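For instance, the trimness test (every state accessible from an initial state and co-accessible to a final state) amounts to one forward and one backward traversal; a minimal sketch:

```python
from collections import deque

def is_trim(states, initial, final, transitions):
    """Check that every state is accessible and co-accessible."""
    succ = {p: set() for p in states}
    pred = {p: set() for p in states}
    for (p, _a, q) in transitions:
        succ[p].add(q)
        pred[q].add(p)

    def reachable(sources, edges):
        seen, todo = set(sources), deque(sources)
        while todo:
            p = todo.popleft()
            for q in edges[p]:
                if q not in seen:
                    seen.add(q)
                    todo.append(q)
        return seen

    return (reachable(initial, succ) == set(states)
            and reachable(final, pred) == set(states))
```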
The Isomorphism Problem
Two finite automata A1 and A2 are isomorphic if they are equal up to state names: there exists a one-to-one function from the set of states of A1 to the set of states of A2 preserving transitions, initial states and final states.
[Figure: two isomorphic 4-state automata over {a, b}.]
The Isomorphism Problem
[Figure: two n-state automata over {a, b}, one drawn in blue and one in red.]
There is no other NFA isomorphic to the blue automaton.
There are n! NFAs isomorphic to the red automaton.
Up to isomorphism, the red automaton therefore appears n! times more often than the blue one.
Most problems are not related to state names and require a random generation up to isomorphism.
Metropolis-Hastings Approach
Given a finite irreducible (strongly connected) Markov chain on Ω with a symmetric matrix M, and a distribution π on Ω, the associated Metropolis chain is defined by the matrix Mπ:

    Mπ(x, y) = M(x, y) · min{1, π(y)/π(x)}                      if x ≠ y
    Mπ(x, x) = 1 − Σ_{z ≠ x} M(x, z) · min{1, π(z)/π(x)}

Requires knowing how to compute π(y)/π(x). Here

    π(x) = 1 / (|x̂| · γ)

where
- γ is the number of isomorphism classes,
- x̂ is the isomorphism class of x.
Metropolis-Hastings Approach
How to compute π(y)/π(x), where π(x) = 1/(|x̂| · γ)?
One can prove that

    |x̂| = n! / |Aut(x)|

It follows that

    π(y)/π(x) = |Aut(y)| / |Aut(x)|

It therefore suffices to know how to compute the size of the automorphism group of a finite automaton; the acceptance step then looks as sketched below.
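A minimal sketch of the resulting sampler step, with `chain_step` the symmetric move of the previous section and `count_automorphisms` a hypothetical helper returning |Aut(·)|:

```python
import random

def metropolis_nfa_step(current, chain_step, count_automorphisms):
    """One step of the chain whose stationary distribution is uniform over
    isomorphism classes: accept with probability min(1, |Aut(y)| / |Aut(x)|)."""
    candidate = chain_step(current)
    if candidate == current:
        return current
    ratio = count_automorphisms(candidate) / count_automorphisms(current)
    if random.random() <= min(1.0, ratio):
        return candidate
    return current
```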
Known Results for Directed Graphs
For directed graphs:
- The isomorphism problem and the problem of counting the number of automorphisms are polynomially equivalent.
- For graphs of bounded out-degree, the isomorphism problem is polynomial.
- In general, no polynomial algorithm is known for the isomorphism problem.
- In practice, both problems (isomorphism and counting automorphisms) are tackled using labelling techniques.
For Non-deterministic Automata
We proved that, for NFAs:
- The problem of counting the number of automorphisms can be solved with a polynomial number of calls to the isomorphism problem.
- For automata of bounded out-degree, the isomorphism problem is polynomial (classes Nm(n), N′m(n), Nm(n)•, N′m(n)•).
- In practice, both problems (isomorphism and counting automorphisms) can also be tackled using labelling techniques.
Proof (hints)
For the polynomial reduction, the arguments are the same as for directed graphs.
For the polynomial-time algorithm for automata of bounded degree, a technical encoding of automata into graphs of bounded out-degree is used.
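For intuition only, here is a classical way (not necessarily the exact construction of the paper) to count automorphisms with polynomially many isomorphism queries, via orbit-stabilizer; `iso_with_pins(A, pins1, B, pins2)` is a hypothetical oracle deciding isomorphism while mapping the i-th pinned state of A to the i-th pinned state of B:

```python
def count_automorphisms(states, A, iso_with_pins):
    """|Aut(A)| as a product of orbit sizes: pin states one by one and, at each
    step, compute the orbit of the next state under the automorphisms fixing
    the already-pinned states (O(n^2) oracle calls)."""
    pinned = []
    size = 1
    for v in states:
        orbit = [w for w in states
                 if iso_with_pins(A, pinned + [v], A, pinned + [w])]
        size *= len(orbit)
        pinned.append(v)
    return size
```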
Labellings (idea)
To decide whether (Q1, Σ, E1, I1, F1) and (Q2, Σ, E2, I2, F2) are isomorphic via some ϕ, partition Q1 into the blocks I1 ∩ F1, I1 ∩ F1ᶜ, I1ᶜ ∩ F1, I1ᶜ ∩ F1ᶜ, and Q2 likewise; ϕ must map each block of Q1 onto the corresponding block of Q2.
This leaves |I1 ∩ F1|! · |I1ᶜ ∩ F1|! · |I1 ∩ F1ᶜ|! · |I1ᶜ ∩ F1ᶜ|! possibilities rather than |Q1|!. A sketch of this first refinement follows.
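A minimal sketch of this initial/final refinement (block keys are illustrative):

```python
from math import factorial

def initial_final_blocks(states, initial, final):
    """Partition the states according to (initial?, final?) membership."""
    blocks = {(i, f): set() for i in (True, False) for f in (True, False)}
    for p in states:
        blocks[(p in initial, p in final)].add(p)
    return blocks

def candidate_count(blocks1, blocks2):
    """Number of bijections respecting the partition, i.e. the product of the
    |block|! values; None means the two automata cannot be isomorphic."""
    count = 1
    for key in blocks1:
        if len(blocks1[key]) != len(blocks2[key]):
            return None
        count *= factorial(len(blocks1[key]))
    return count
```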
Labellings - Examples
- Final/initial status of a state.
- Number of outgoing transitions for a given letter.
- Number of incoming transitions for a given letter.
- Smallest word (lexicographically) to reach a final state.
- ...
All these criteria can be computed in linear time and allow, in practice, a fast computation; a sketch combining the first three into a per-state signature follows.
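A minimal sketch of such a signature-based refinement (the signature contents are illustrative):

```python
from collections import defaultdict

def state_signatures(states, alphabet, initial, final, transitions):
    """Isomorphism-invariant label of each state, computed in one pass over the
    transitions: initial/final status plus, per letter, out- and in-degrees."""
    out_deg = {p: {a: 0 for a in alphabet} for p in states}
    in_deg = {p: {a: 0 for a in alphabet} for p in states}
    for (p, a, q) in transitions:
        out_deg[p][a] += 1
        in_deg[q][a] += 1
    return {p: (p in initial, p in final,
                tuple(sorted(out_deg[p].items())),
                tuple(sorted(in_deg[p].items())))
            for p in states}

def refine(states, alphabet, initial, final, transitions):
    """Group states with equal signatures; any isomorphism must map each group
    of one automaton onto the group with the same signature in the other."""
    groups = defaultdict(set)
    sigs = state_signatures(states, alphabet, initial, final, transitions)
    for p in states:
        groups[sigs[p]].add(p)
    return groups
```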
Computation time for the generation
Using a Python prototype:

Average time (s) to sample an NFA in N(n), n³ steps:
    n        10     20     50     70      90
    |Σ| = 2  0.02   0.43   32.5   166.1   569.9
    |Σ| = 3  0.02   0.56   47.1   248.4   848.1

Average time (s) to sample an NFA in N′m(n), n³ steps:
    n                10    20     50     70      90
    m = 2, |Σ| = 2   0.2   0.43   32.5   166.1   566.8
    m = 2, |Σ| = 3   0.2   0.57   47.0   246.7   847.2
    m = 3, |Σ| = 2   0.2   0.43   33.0   167.8   561.9
    m = 3, |Σ| = 3   0.2   0.57   47.2   248.6   851.3
The main problem is not counting automorphisms but moving n³ steps.
Comparison with Tabakov-Vardi Algorithm
Average sizes s of the minimal automata corresponding to automata sampled using TV (with density σ) and in N′2(n)•:

    n          5     8     11    14     17     20
    σ = 1.5    1.5   4.3   4.7   3.8    3.1    2.7
    σ = 2      1.3   3.0   4.8   5.1    4.5    4.0
    σ = 3      2.8   4.8   4.7   3.8    3.4    3.0
    N′2(n)•    3.7   6.1   7.9   10.0   11.5   13.9
Conclusion
We proposed:
- A new way to generate non-deterministic automata, for several classes.
- The generation can be done up to isomorphism.
- The generation is practically tractable for automata with dozens of states.
It remains:
- to prove bounds on the mixing time,
- to provide an efficient implementation,
- to design interesting classes of NFAs,
- to test the performance of algorithms, ...