
EXPReS FABRIC
WP 2.2 Correlator Engine
Meeting 25-09-2006
Poznan, Poland
JIVE, Ruud Oerlemans
25-09-2006
EXPReS FABRIC meeting at Poznan, Poland
WP2.2 Correlator Engine
Develop a Correlator Engine that can run on
standard workstations, deployable on clusters
and grid nodes
1. Correlator algorithm design (m5)
2. Correlator computational core, single node (m14)
3. Scaled-up version for clusters (m23)
4. Distributed version, middleware (m33)
5. Interactive visualization (m4)
6. Output definition (m15)
7. Output merge (m24)
Current broadband Software Correlator (SFXC)

[Diagram] Processing chain per station (Station 1 … Station N), with the EVN Mk4 hardware equivalents of each software stage in parentheses:
• Raw data, BW = 16 MHz, Mk4 format on Mk5 disk
• Copy from Mk5 to Linux disk → raw data, 16 MHz, Mk4 format on Linux disk
• Channel extraction (DIM, TRM, CRM) → extracted data
• Delay corrections (DCM, DMM, FR), using pre-calculated delay tables (SU) → delay-corrected data
• Correlation (correlator chip) → data product
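The chain above can be illustrated with a small numerical sketch (not the SFXC code itself): an integer-sample delay correction taken from a pre-computed table, followed by an FX-style correlation of two stations. All names and parameters below are illustrative simplifications.

```python
# Toy version of the software correlator chain: delay correction, then
# FX correlation (FFT both stations, cross-multiply). Illustrative only;
# channel extraction and fractional-sample delays are omitted.
import numpy as np

def delay_correct(signal, delay_samples):
    """Undo a known integer-sample delay from a pre-calculated delay table."""
    return np.roll(signal, -delay_samples)

def fx_correlate(x, y, nfft=32):
    """FX correlation of one segment: cross-power spectrum for this baseline."""
    X = np.fft.rfft(x[:nfft])
    Y = np.fft.rfft(y[:nfft])
    return X * np.conj(Y)

rng = np.random.default_rng(1)
sky = rng.standard_normal(64)      # common signal seen by both stations
station_a = sky
station_b = np.roll(sky, 5)        # same signal, arriving 5 samples later
aligned_b = delay_correct(station_b, 5)
spectrum = fx_correlate(station_a, aligned_b)
```

After the delay correction the two streams line up exactly, so the cross-power spectrum is real and non-negative, as expected for identical aligned signals.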
High level design Distributed Correlation

Principle [diagram]: the investigator makes a schedule with SCHED; together with EOP data this yields a VEX file (handled by the workflow manager, WFM). At every telescope the Field System and the Mark5 System process the VEX file. The central operator runs CALC to generate the delays and produces the correlator control file (CCF). Mark5 data and delay tables are distributed over the grid nodes, which perform the correlation; the output goes to the JIVE archive.
Grid considerations/aspects
• Why use grid processing power?
  • It is available; no hardware investment is required
  • It will be upgraded regularly
• The degree of distribution is a trade-off between:
  • processing power at the grid nodes,
  • data transport capacity to the grid nodes, and
  • data logistics and coordination (more complicated when more distributed).
• Processing at telescope and grid nodes: station-related processing at the telescope site, correlation elsewhere
• All processing at grid nodes
Data distribution over grid sites (1): Baseline slicing
Pros
• Small nodes
• Simple implementation at the node
Cons
• Multiplication of large data rates, especially when the number of baselines is large
• Complex data logistics
• Complex scalability
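The data-rate drawback follows from simple counting: with N stations there are N·(N−1)/2 baselines, and in a pure baseline-sliced layout each station's raw stream is needed by the N−1 nodes that handle its baselines.

```python
# Counting behind the baseline-slicing cost: number of baselines, and how
# many times each station's raw data rate gets duplicated in transport.
def n_baselines(n_stations):
    return n_stations * (n_stations - 1) // 2

def replication_factor(n_stations):
    """Copies of each station's stream needed by baseline nodes."""
    return n_stations - 1

print(n_baselines(6), replication_factor(6))   # 15 baselines, 5 copies
```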
Data distribution over grid sites (2)

1. All data to one site
Pros
• Simple data logistics
• Central processing
• Live processing easy
• Slicing at the grid site
• Dealing with only one site
Cons
• Powerful central processing site required

2. Channel slicing
Pros
• Smaller nodes
• Live processing per channel
• Simple implementation
• Easily scalable
Cons
• Channel extraction at the telescope increases the data rate

3. Time slicing
Pros
• Smaller nodes
• Smaller data rates
• Simple implementation
• Easily scalable
• No data multiplication
Cons
• Complex data logistics after correlation
• Live correlation complex

4. All data to different sites
Pros
• Smaller nodes
• Live processing possible
• Data slicing at the nodes
Cons
• Multiplication of large data rates
• Simultaneous availability of sites required when processing live
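Under the reading that time slicing partitions the streams while "all data to different sites" replicates them at every site, the difference in transported data volume can be sketched in a few lines (illustrative only):

```python
# Toy contrast between two distribution schemes: time slicing sends each
# sample exactly once (no data multiplication), whereas replicating the
# full streams multiplies the transported volume by the number of sites.
def time_slices(samples, n_sites):
    """Contiguous time slices, one per site; every sample is sent once."""
    k = len(samples) // n_sites
    return [samples[i * k:(i + 1) * k] for i in range(n_sites)]

def replicate(samples, n_sites):
    """Full stream copied to every site: data rate multiplied n_sites times."""
    return [list(samples) for _ in range(n_sites)]

stream = list(range(12))
sliced = time_slices(stream, 3)    # 12 samples transported in total
copied = replicate(stream, 3)      # 36 samples transported in total
```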
Correlator architecture for file input (offline processing)

[Diagram] Each time slice (Time slice 1, 2, 3) holds data from stations SA, SB, SC and SD; the slices are handed to correlator cores (Core1, Core2, Core3), and each core delivers its output to a corresponding CP (CP1–CP3).

Properties:
• Each core processes data from one channel
• Easily scalable, because one application has all the functionality
• Can exploit multiple processors using MPI
• Code reuse through OO and C++
• This software architecture can work for data distributions 1, 2 and 3
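A toy sketch of this job structure (illustrative only, not the actual SFXC implementation): every (time slice, channel) combination is an independent unit of work, which is what makes the architecture easy to scale. A plain `map` stands in here for the MPI distribution of jobs over cores.

```python
# Each "core" correlates all baselines of one time slice; station names
# SA..SD follow the diagram. A real deployment would farm the jobs out
# with MPI; map() plays that role in this sketch.
import numpy as np

STATIONS = ["SA", "SB", "SC", "SD"]

def correlate_job(job):
    """One core's work: correlate every station pair of one slice."""
    slice_id, data = job                         # data: station -> samples
    out = {}
    for i, a in enumerate(STATIONS):
        for b in STATIONS[i + 1:]:
            out[(a, b)] = np.vdot(data[a], data[b])   # baseline a-b
    return slice_id, out

rng = np.random.default_rng(2)
jobs = [(t, {s: rng.standard_normal(8) for s in STATIONS})
        for t in range(3)]                       # three time slices
results = dict(map(correlate_job, jobs))         # MPI would distribute this
```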
Correlator architecture for data streams (real-time processing)

[Diagram] Station streams (SA, SB, SC, SD) are written into memory buffers holding short time slices (1.1, 1.2, 1.3, 2.1, …; Buffer 1–Buffer 3). Correlator cores (Core1–Core4) pick up successive slices from the buffers, and the correlated results are gathered (CP) and written to a file on disk.
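The buffer scheme above can be sketched with a bounded queue between a data receiver and a correlator core (illustrative Python; one core instead of several, and slice labels follow the 1.1, 1.2, … numbering of the diagram):

```python
# Bounded in-memory buffer of short time slices between the receiving
# thread and a correlator core, as in the streaming diagram. Sketch only.
import queue
import threading

buffer = queue.Queue(maxsize=4)     # "memory buffer with short time slices"

def receive(n_seconds):
    """Data receiver: push short slices 1.1, 1.2, ... then a sentinel."""
    for sec in range(1, n_seconds + 1):
        for sub in (1, 2, 3):
            buffer.put(f"{sec}.{sub}")   # blocks when the buffer is full
    buffer.put(None)

def core():
    """Correlator core: drain slices until the stream ends."""
    processed = []
    while (item := buffer.get()) is not None:
        processed.append(item)           # correlate(item) in the real system
    return processed

t = threading.Thread(target=receive, args=(2,))
t.start()
done = core()                            # consumes slices as they arrive
t.join()
```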
Other issues
• Swinburne University, Adam Deller: exchange of expertise on their software correlator last summer
• New EXPReS employee: Yurii Pidopryhora
  • Astronomy background
  • Data analysis and testing
• New SCARIe employee: Nico Kruithof
  • Computer science background
  • SCARIe: NWO-funded project aimed at a software correlator on the Dutch Grid
WP 2.2 Status

Work package                        M    Status
1. Correlator algorithm design      5    almost finished
2. Correlator computational core    14   active
3. Scaled-up version for clusters   23   active
4. Distributed version              33   pending
5. Interactive visualization        4    pending
6. Output definition                15   designing
7. Output merge                     24   designing

(M = month, as on the WP2.2 task list)