An Effective & Interactive Approach to Particle Tracking for DNA Melting Curve Analysis
李穎忠
Department of Computer Science & Information Engineering, National Taiwan University

DNA Melting Curve Analysis
- Used for the detection of DNA sequence variants
- DNA melting analysis is performed in a temperature-gradient micro-channel
- [Schematic: temperature-gradient micro-channel with carrier (bead/droplet), thermometer, heater, and substrate]
- [Plot: fluorescent intensity versus temperature; the melting temperature is read from the melting curve]

Motivation
- People label each particle (carrier) frame by frame
- That is time-consuming
- We design an annotation tool to reduce human effort

Related Work
- Particle tracking
  - ParticleTracker: an ImageJ plugin for multiple particle detection and tracking [Sbalzarini et al., Journal of Structural Biology 2005]
  - u-track [Jaqaman et al., Nature Methods 2008]
- Interactive video annotation
  - Tracking with active learning [Vondrick et al., NIPS 2011]
  - Interactive object detection [Yao et al., CVPR 2012]

Proposed System
- User annotation
- Detection of the bounding circle of the particle
- Update of the tracker & labels
- Acquisition of labels at other frames by tracking the particle
- User correction (repeated until all correct labels are acquired)

Detecting the Bounding Circle of a Particle
- Median filter → Otsu's method → dilation → erosion → edge detection → least-squares fitting

Least-Squares Fitting of the Bounding Circle
- Assume the coordinates of the detected edge points are $(x_i, y_i)$, $i = 1, 2, \dots, N$
- Let $(x_c, y_c)$ and $r$ denote the center and the radius of the circle, respectively
- The circle constraints $(x_c - x_i)^2 + (y_c - y_i)^2 = r^2$, $i = 1, \dots, N$, expand to the linear equations
  $2 x_i x_c + 2 y_i y_c + (r^2 - x_c^2 - y_c^2) = x_i^2 + y_i^2$, $i = 1, \dots, N$
- In matrix form $\mathbf{A}\mathbf{z} = \mathbf{B}$ with
  $\mathbf{A} = \begin{bmatrix} 2x_1 & 2y_1 & 1 \\ \vdots & \vdots & \vdots \\ 2x_N & 2y_N & 1 \end{bmatrix}$,
  $\mathbf{z} = \begin{bmatrix} x_c \\ y_c \\ r^2 - x_c^2 - y_c^2 \end{bmatrix}$,
  $\mathbf{B} = \begin{bmatrix} x_1^2 + y_1^2 \\ \vdots \\ x_N^2 + y_N^2 \end{bmatrix}$
- The least-squares solution is $\mathbf{z} = (\mathbf{A}^{\mathsf T}\mathbf{A})^{-1}\mathbf{A}^{\mathsf T}\mathbf{B}$

Possible Choices of Trackers
- Linear interpolation
- Correlation filter based tracker [Zhang et al., ECCV 2014]
- Normalized cross-correlation matching

Linear Interpolation
- The particle location in unlabeled frames is linearly interpolated between the labeled frames
- User correction and update of labels: when the user corrects a label, the locations in the neighboring unlabeled frames are re-interpolated
- [Illustrated on frames 1-12]

Correlation Filter Based Tracker [Zhang et al., ECCV 2014]
- $G = h \otimes f$, where $f$ is the input image, $G$ is the correlation output, and $h$ is the filter
- $h = \mathcal{F}^{-1}\!\left(\dfrac{\mathcal{F}(G)}{\mathcal{F}(f)}\right) = \mathcal{F}^{-1}\!\left(\dfrac{\mathcal{F}\!\left(e^{-\lVert \mathbf{x} - \mathbf{x}^* \rVert / \alpha}\right)}{\mathcal{F}(f)}\right)$

Online Update of the Filter
- Frame 1: $H_1 = h_1$
- Frame 2: locate the particle with the correlation $H_1 \otimes F$, then re-train $h_2$ and update $H_2 = (1-\rho)H_1 + \rho h_2$

One-Way Method (correlation filter)
- Track forward from the labeled frame; at each new frame, locate the particle, re-train the filter $h_t$, and update $H_t = (1-\rho)H_{t-1} + \rho h_t$

Two-Way Method (correlation filter)
- Track both forward and backward from the user-labeled frames
- [Illustrated on frames 1-14]

Normalized Cross-Correlation Matching
- Given an image $f$ and a template $t$, normalized cross-correlation (NCC) measures the similarity between each part of $f$ and $t$:
  $\gamma(u,v) = \dfrac{\sum_{x,y}\left[f(x,y) - \bar{f}_{u,v}\right]\left[t(x-u,\, y-v) - \bar{t}\right]}{\sqrt{\sum_{x,y}\left[f(x,y) - \bar{f}_{u,v}\right]^2 \;\sum_{x,y}\left[t(x-u,\, y-v) - \bar{t}\right]^2}}$
- [Figure: template, input image, and output NCC map]
- Frame 1: the template is cropped around the user-labeled particle; a minimal NumPy sketch of this matching step is given below
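The following is a minimal sketch (not the author's code) of the NCC matching step in Python with NumPy. Restricting the search to a window around the previous location, and the names `ncc_match`, `prev_xy`, and `search_radius`, are illustrative assumptions rather than details taken from the slides.

```python
# Minimal NCC template matching sketch (illustrative, not the author's code).
import numpy as np

def ncc_match(frame, template, prev_xy, search_radius=20):
    """Return the (x, y) in `frame` whose patch best matches `template`.

    frame:     2-D grayscale image (H x W)
    template:  2-D patch cropped around the user-labeled particle
    prev_xy:   (x, y) center of the particle in the previous frame
    """
    frame = frame.astype(np.float64)
    t = template.astype(np.float64)
    th, tw = t.shape
    t_zero = t - t.mean()                       # t(x-u, y-v) - t_bar
    t_norm = np.sqrt((t_zero ** 2).sum())

    px, py = prev_xy
    best_score, best_xy = -np.inf, prev_xy
    for v in range(py - search_radius, py + search_radius + 1):
        for u in range(px - search_radius, px + search_radius + 1):
            top, left = v - th // 2, u - tw // 2
            if top < 0 or left < 0 or top + th > frame.shape[0] or left + tw > frame.shape[1]:
                continue                        # candidate patch leaves the image
            patch = frame[top:top + th, left:left + tw]
            p_zero = patch - patch.mean()       # f(x, y) - f_bar_{u,v}
            denom = np.sqrt((p_zero ** 2).sum()) * t_norm
            if denom == 0:
                continue                        # flat patch or flat template
            gamma = (p_zero * t_zero).sum() / denom   # NCC score gamma(u, v)
            if gamma > best_score:
                best_score, best_xy = gamma, (u, v)
    return best_xy, best_score
```

In practice a library routine such as OpenCV's `cv2.matchTemplate` with the normalized-correlation mode computes the same score map far more efficiently; the loop above is only meant to mirror the formula on the slide.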
Normalized Cross-Correlation Matching (continued)
- Frame 2: the particle is located at the position with the highest NCC score

One-Way Method (NCC)
- Track forward from the labeled frame; after locating the particle in each new frame, update the template

Two-Way Method (NCC)
- Track both forward and backward from the user-labeled frames
- [Illustrated on frames 1-14]

Failure in Tracking with Normalized Cross-Correlation
- The template of particle 1 may match a nearby, similar-looking particle (particle 2 in the example), so NCC alone can drift to the wrong particle

Combining NCC & Extrapolation
- Extrapolate the particle location in frame $t$ from its locations in frames $t-2$ and $t-1$; let $(x', y')$ denote the predicted location
- Score each candidate location by its distance to the prediction:
  $\delta(u,v) = e^{-\lVert (u - x',\, v - y') \rVert_2 / (\sigma \cdot l)}$
- Combined score: $\phi(u,v) = w \cdot \gamma(u,v) + (1-w) \cdot \delta(u,v)$, where $0 \le w \le 1$
- [Figure: NCC map, score of the predicted location, and the combined score]

Experiments
- Evaluate how much human effort our system can reduce
- Simulate the process of annotating video with our system
- Evaluation metric: the number of manual annotations
- A tracked bounding box is counted as a correct label if the distance between its center and the center of the ground-truth bounding box is not more than 10 pixels

Methods
- Interp, CF-1way, CF-2way, NCC-1way, NCC-2way, NCC-Extrap-1way, NCC-Extrap-2way

The Order of Labeling
- For the methods that do not restrict the order of labeling, always correct the label with the maximum center location error
- For the other methods, follow the video display order

Video Dataset

Name      # frames   # particles   # annotations
Droplet1  1203       15            635
Droplet2  637        53            4192
Bead      420        5             727

- Video Droplet1 is used for parameter tuning, which is performed with a brute-force search

Parameter Tuning for CF-1way
- Ground-truth correlation $G = e^{-\lVert \mathbf{x} - \mathbf{x}^* \rVert / \alpha}$ (tuning $\alpha$)
- Filter update $H_t = (1-\rho)H_{t-1} + \rho h_t$ (tuning $\rho$)

Parameter Tuning for NCC-Extrap-1way
- $\delta(u,v) = e^{-\lVert (u - x',\, v - y') \rVert_2 / (\sigma \cdot l)}$ (tuning $\sigma$)
- $\phi(u,v) = w \cdot \gamma(u,v) + (1-w) \cdot \delta(u,v)$ (tuning $w$)

Result

Number of manual annotations (percentage of all annotations):

Method            Droplet2 (# annotations = 4192)   Bead (# annotations = 727)
Interp            457 (10.90%)                      88 (12.10%)
CF-1way           1475 (35.19%)                     79 (10.89%)
CF-2way           1973 (47.07%)                     112 (15.41%)
NCC-1way          56 (1.34%)                        11 (1.51%)
NCC-2way          129 (3.08%)                       21 (2.89%)
NCC-Extrap-1way   53 (1.26%)                        9 (1.24%)
NCC-Extrap-2way   115 (2.74%)                       20 (2.75%)

Error Analysis for NCC-Extrap-1way
- [Figures: example frames showing the target particle and the tracking error]

Conclusions
- We designed a system for particle annotation in video sequences
- Our system can reduce human effort in annotation
- Combining NCC and extrapolation achieves the best result
- It is better to annotate the video in its display order
- Future work: use polynomial curve fitting to predict the location of the particle in the next frame

A minimal sketch of the combined NCC-and-extrapolation scoring is given below.
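The sketch below (not the author's implementation) illustrates the combined score $\phi(u,v) = w \cdot \gamma(u,v) + (1-w) \cdot \delta(u,v)$. It assumes a 2-D array of NCC scores $\gamma(u,v)$ over the search window is already available, and the names `combined_score`, `window_origin`, and `scale` (standing in for the $l$ in $\sigma \cdot l$) are hypothetical.

```python
# Combined NCC + extrapolation scoring sketch (illustrative assumptions noted above).
import numpy as np

def combined_score(ncc_scores, window_origin, pos_t2, pos_t1, sigma, scale, w=0.5):
    """Combine NCC with a penalty on the distance to the extrapolated location.

    ncc_scores:     2-D array of gamma(u, v) over the search window
    window_origin:  (u0, v0), image coordinates of ncc_scores[0, 0]
    pos_t2, pos_t1: particle centers in frames t-2 and t-1
    """
    # Linear extrapolation of the particle location (x', y') in frame t
    x_pred = 2 * pos_t1[0] - pos_t2[0]
    y_pred = 2 * pos_t1[1] - pos_t2[1]

    h, w_sz = ncc_scores.shape
    u0, v0 = window_origin
    us = u0 + np.arange(w_sz)[None, :]         # x coordinate of each candidate
    vs = v0 + np.arange(h)[:, None]            # y coordinate of each candidate
    dist = np.sqrt((us - x_pred) ** 2 + (vs - y_pred) ** 2)
    delta = np.exp(-dist / (sigma * scale))    # delta(u, v): closer to the
                                               # prediction -> larger score
    phi = w * ncc_scores + (1 - w) * delta     # combined score phi(u, v)
    v_best, u_best = np.unravel_index(np.argmax(phi), phi.shape)
    return (u0 + u_best, v0 + v_best), phi
```

Weighting the NCC map by the distance to the extrapolated position is what keeps the tracker from drifting to a similar-looking neighbor, which is the failure case shown for plain NCC.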
Thank you for listening