MULTILEVEL PLACEMENT ALGORITHM

Easy Optimization Problems, Relaxation,
Local Processing for a small subset of variables

Changes in the energy and overlap
X-direction line search and "discrete derivatives"
• For node i, fix all other nodes at their current x̃_j:
  E(x_i) = Σ_j a_ij (x_i − x̃_j)² + Σ_j overlap(i, j)   (overlap at the current positions)
• Line search:
  calculate E(x_i + δx), E(x_i − δx)  ⇒  choose the sign
  calculate E(x_i + sign · 2δx)  ⇒  quadratic approximation
• To effectively calculate the derivative, that is,
  the rate of change in E(x_i) per unit change in x_i:
• Calculate E(x_i + δx_i), E(x_i − δx_i) and average the two difference quotients:
  ( [E(x_i + δx_i) − E(x_i)] / δx_i + [E(x_i − δx_i) − E(x_i)] / (−δx_i) ) / 2
  This average is the "discrete derivative".
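The averaged difference quotient above can be sketched in a few lines; for the quadratic E(x_i) it reproduces the exact derivative regardless of the step δx. Function names are illustrative, not from the slides:

```python
import numpy as np

def energy(xi, x_fixed, a):
    """Energy of node i with all other nodes fixed:
    E(x_i) = sum_j a_ij (x_i - x_j)^2."""
    return np.sum(a * (xi - x_fixed) ** 2)

def discrete_derivative(xi, x_fixed, a, dx=0.5):
    """Average of the forward and backward difference quotients:
    ([E(x+dx)-E(x)]/dx + [E(x-dx)-E(x)]/(-dx)) / 2."""
    e0 = energy(xi, x_fixed, a)
    fwd = (energy(xi + dx, x_fixed, a) - e0) / dx
    bwd = (energy(xi - dx, x_fixed, a) - e0) / (-dx)
    return 0.5 * (fwd + bwd)
```

Because E is quadratic in x_i, the forward and backward one-sided errors cancel in the average, so the discrete derivative is exact here.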
Different types of relaxation
• Variable-by-variable relaxation – strict minimization
• Changing a small subset of variables simultaneously – window strict
  minimization relaxation
• Stochastic relaxation – may increase the energy – should be followed
  by strict minimization
Window strict unconstrained minimization
• Discrete (combinatorial) case:
  permutations of small subsets, P=2, placement

1D database
The nodes:        1 2 3 4 5 6 7 8 9
A permutation p:  5 3 9 6 2 7 1 4 8
p(i) gives the position of node i: p(1) = 7, p(2) = 5, p(3) = 2, …
To find a consecutive subset of nodes in the current permutation,
we need the inverse of p, p⁻¹:
p⁻¹(1) = 5, p⁻¹(2) = 3, p⁻¹(3) = 9, …
In 2D we have to insert a grid and store the list of nodes within each square.
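The slide's bookkeeping can be sketched with 0-based Python lists standing in for the 1-based slide notation (names are illustrative):

```python
def invert_permutation(p):
    """Given p[i-1] = position of node i, return p_inv with
    p_inv[k-1] = node occupying position k."""
    p_inv = [0] * len(p)
    for node, pos in enumerate(p, start=1):
        p_inv[pos - 1] = node
    return p_inv

# The slide's arrangement: position k holds node p_inv[k].
arrangement = [5, 3, 9, 6, 2, 7, 1, 4, 8]   # this is p^-1
p = [0] * 9
for pos, node in enumerate(arrangement, start=1):
    p[node - 1] = pos                        # p(node) = its position
```

With this setup, a consecutive window of positions k..k+w in the current permutation is read off directly from `arrangement` (i.e. p⁻¹), matching the slide's point.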
Window strict unconstrained minimization
• Discrete (combinatorial) case:
  permutations of small subsets, P=2, placement
  Problem: very small number of variables!
• Quadratic case: P=2
Window relaxation for P=2, unconstrained version
• Minimize E(x) = Σ_ij a_ij (x_i − x_j)²
• Pick a window of variables i ∈ W; fix all variables at x̃_i, i ∉ W
• Find a correction δ_i to x̃_i, i ∈ W, so as to minimize
  E(δ) = Σ_{i,j ∈ W} a_ij (x̃_i + δ_i − x̃_j − δ_j)²
       + Σ_{i ∈ W, j ∉ W} a_ij (x̃_i + δ_i − x̃_j)²
• Quadratic functional in many variables – easy to solve!
  ∂E(δ)/∂δ_i = 0  ⇒  solve the linear system A δ = b
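Setting ∂E(δ)/∂δ_i = 0 gives one linear equation per window variable. A sketch of assembling and solving A δ = b with NumPy (the dense weight matrix and the function name are illustrative choices, not from the slides):

```python
import numpy as np

def window_relaxation(x, a, W):
    """One unconstrained window step for E(x) = sum_ij a_ij (x_i - x_j)^2.
    a is a symmetric weight matrix, W a list of window indices.
    Requires at least one neighbor outside W, otherwise A is singular
    (the energy is translation invariant)."""
    W = list(W)
    n = len(W)
    A = np.zeros((n, n))
    b = np.zeros(n)
    for r, i in enumerate(W):
        A[r, r] = a[i].sum() - a[i, i]        # sum of a_ij over all j != i
        for c, j in enumerate(W):
            if j != i:
                A[r, c] -= a[i, j]            # coupling inside the window
        b[r] = -(a[i] * (x[i] - x)).sum()     # residual at the current x
    delta = np.linalg.solve(A, b)
    x_new = x.copy()
    x_new[W] += delta
    return x_new
```

With the boundary variables fixed, one solve brings the window to its exact minimum, which is why the slide calls this "easy to solve".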
Updating the window variables
• For each i in the window W insert the correction: x_i^new = x̃_i + δ_i
• Sort the x_i^new and rearrange the window accordingly
• To improve the result obtained by the inner changes, apply node-by-node
  relaxation on W and on its boundary
• At the end compare the "old" energy with the "new" energy and accept / reject
• Revision process: try a "big" change, improve it by local minimization,
  choose the better of the two
Window relaxation for P=2, constrained version
• To prevent nodes from collapsing on each other
• To express the aim of having {x̃_i + δ_i}_{i∈W} an approximate permutation
  of {x̃_i}_{i∈W}, add 2 constraints:
  Σ_{i∈W} (x̃_i + δ_i)^m v_i = Σ_{i∈W} x̃_i^m v_i ,   m = 1, 2
Exc#4: Permutation's invariants
1) Prove that Σ_{i=1}^n x̃_i^m v_i , m = 1, 2, are invariant under permutation.
2) Is it also true for m = 3?
Window relaxation for P=2, constrained version
• To prevent nodes from collapsing on each other
• To express the aim of having {x̃_i + δ_i}_{i∈W} an approximate permutation
  of {x̃_i}_{i∈W}, add 2 constraints:
  Σ_{i∈W} (x̃_i + δ_i)^m v_i = Σ_{i∈W} x̃_i^m v_i ,   m = 1, 2
  m = 1:  Σ_{i∈W} δ_i v_i = 0
  m = 2:  Σ_{i∈W} x̃_i δ_i v_i = 0
• Minimization with equality constraints
• Lagrange multipliers
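With the constraints linearized, the constrained window step becomes a single KKT (Lagrange multiplier) linear system, as the following slides develop. A sketch, assuming the quadratic part of the window problem is already given as A and b (these names and the dense representation are illustrative):

```python
import numpy as np

def constrained_window_step(A, b, x_win, v_win):
    """Solve  min 1/2 d^T A d - b^T d  subject to the two linearized
    constraints  sum_i v_i d_i = 0  and  sum_i v_i x_i d_i = 0,
    via Lagrange multipliers (the KKT system)."""
    n = len(b)
    C = np.vstack([v_win, v_win * x_win])   # 2 x n constraint matrix
    K = np.zeros((n + 2, n + 2))
    K[:n, :n] = A                           # stationarity block
    K[:n, n:] = C.T                         # multiplier coupling
    K[n:, :n] = C                           # the constraints themselves
    rhs = np.concatenate([b, np.zeros(2)])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:]                 # corrections delta, multipliers
```

The bottom two rows of K are exactly the m = 1, 2 constraints, so the returned δ satisfies them by construction.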
Lagrange multipliers
• Goal: transform a constrained optimization problem with n variables
  and m equality constraints into an unconstrained optimization problem
  with n+m variables. The new m variables are called the Lagrange
  multipliers.
• Geometry explanation:
  2 constraints in 3D;
  the optimal ellipsoid is tangent to the constraints curve
Lagrange multipliers
• Goal: transform a constrained optimization problem with n variables
  and m equality constraints into an unconstrained optimization problem
  with n+m variables. The new m variables are called the Lagrange
  multipliers.
• Geometry explanation
• Construct an augmented functional – the Lagrangian
The Lagrangian
Given E(x) subject to m equality constraints
h_k(x) = 0 , k = 1,…,m , construct the Lagrangian
L(x, λ) = E(x) + Σ_k λ_k h_k(x)
and solve the system of n+m equations:
∂L(x, λ)/∂x_i = 0 ,  for i = 1,…,n
∂L(x, λ)/∂λ_k = 0 ,  for k = 1,…,m   ← the constraints!
The value of λ is meaningful.
The Lagrangian: an example
• Minimize E(x, y) = x + y
• Subject to h(x, y) = x² + y² − 2 = 0
• The Lagrangian: L(x, y, λ) = E(x, y) + λ(x² + y² − 2)
∇L = ∇E + λ∇h = 0 :
  ∂L/∂x = 0  ⇒  1 + 2λx = 0
  ∂L/∂y = 0  ⇒  1 + 2λy = 0
  ∂L/∂λ = 0  ⇒  x² + y² − 2 = 0   ← the constraint!
∇E = −λ∇h : the co-linearity of the gradients
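The system above can be solved by hand: the first two equations give x = y = −1/(2λ), and the constraint then forces x = y = ±1. A short script checking both candidates (a sketch; the candidate list comes from that hand solution):

```python
# Stationary points of L(x, y, l) = x + y + l*(x^2 + y^2 - 2):
#   1 + 2*l*x = 0 and 1 + 2*l*y = 0  =>  x = y = -1/(2*l)
#   x^2 + y^2 = 2                    =>  x = y = +1 or -1
candidates = [(1.0, 1.0, -0.5), (-1.0, -1.0, 0.5)]
for x, y, l in candidates:
    assert abs(1 + 2 * l * x) < 1e-12        # dL/dx = 0
    assert abs(1 + 2 * l * y) < 1e-12        # dL/dy = 0
    assert abs(x**2 + y**2 - 2) < 1e-12      # the constraint
# The minimizer of E = x + y on the circle:
best = min(candidates, key=lambda c: c[0] + c[1])
```

The minimizer is (−1, −1) with E = −2; the other stationary point (1, 1) is the constrained maximum, illustrating that the Lagrangian conditions find all tangency points, not only minima.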
Window relaxation for 1D ordering, constrained/unconstrained version
• Minimize E(x) = Σ_ij a_ij |x_i − x_j|^P
• Pick a window of variables i ∈ W; fix all variables at x̃_i, i ∉ W
• Find a correction δ_i to x̃_i
• Update the window's variables, restore the volume constraints,
  and revise around the window
• Switch to the next window, chosen with overlap
• Use a (small) sequence of variable-size windows,
  for example windows with 5, 10, 15, 20, 25 nodes
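The window schedule might be sketched as a generator. Only the size sequence 5, 10, 15, 20, 25 comes from the slide; the 50% overlap and the start positions are illustrative assumptions:

```python
def window_sweep(n, sizes=(5, 10, 15, 20, 25), overlap=0.5):
    """Yield index windows over nodes 0..n-1, one pass per window size,
    with consecutive windows overlapping by roughly the given fraction."""
    for w in sizes:
        step = max(1, int(w * (1 - overlap)))
        for start in range(0, max(1, n - w + 1), step):
            yield list(range(start, min(start + w, n)))
```

Each window produced here would be handed to the window relaxation step, with the volume constraints restored after each update.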
Easy to solve problems
• Quadratic functional and linear constraints:
  solve a linear system of equations
• Quadratization of the functional: P=1, P>2
Quadratization for P=1 and P>2
• Minimize E(x) = Σ_ij a_ij |x_i − x_j|  ;  E(x) = Σ_ij a_ij (x_i − x_j)⁶
• Given a current approximation x̃_i
• For P=1, minimize
  Ê(x) = Σ_ij [ a_ij / |x̃_i − x̃_j| ] (x_i − x_j)²
• For P=6, minimize
  Ê(x) = Σ_ij a_ij (x̃_i − x̃_j)⁴ (x_i − x_j)²
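A quadratization sweep only needs the reweighting: at the current x̃ the quadratic term w_ij (x_i − x_j)² matches a_ij |x_i − x_j|^P. A sketch for general P (the epsilon guard against division by zero when P < 2 is an implementation choice, not from the slides):

```python
import numpy as np

def quadratized_weights(x_cur, a, p, eps=1e-8):
    """Weights w_ij such that w_ij * (x_i - x_j)^2 equals
    a_ij * |x_i - x_j|^p at the current approximation x_cur."""
    d = np.abs(x_cur[:, None] - x_cur[None, :])
    d = np.where(d < eps, eps, d)    # guard coincident nodes for p < 2
    return a * d ** (p - 2)
```

Minimizing Σ w_ij (x_i − x_j)² with these frozen weights and then recomputing them at the new point repeats the "easy" quadratic solve until the weights stop changing.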
Easy to solve problems
• Quadratic functional and linear constraints:
  solve a linear system of equations
• Quadratization of the functional: P=1, P>2
• Linearization of the constraints: P=2
Window relaxation for P=2, constrained version
• To prevent nodes from collapsing on each other
• To express the aim of having {x̃_i + δ_i}_{i∈W} an approximate permutation
  of {x̃_i}_{i∈W}, add 2 constraints:
  Σ_{i∈W} (x̃_i + δ_i)^m v_i = Σ_{i∈W} x̃_i^m v_i ,   m = 1, 2
  m = 1:  Σ_{i∈W} δ_i v_i = 0
  m = 2:  Σ_{i∈W} x̃_i δ_i v_i = 0
• The δ² terms were neglected, assuming they are small enough
  compared with the other terms in the equation
Easy to solve problems
• Quadratic functional and linear constraints:
  solve a linear system of equations
• Quadratization of the functional: P=1, P>2
• Linearization of the constraints: P=2
• Inequality constraints: active set method