Walking Your Dog in the Woods in Polynomial Time


Shripad Thite
shripad@caltech.edu

Joint work with Erin Wolf Chambers, Éric Colin de Verdière,
Jeff Erickson, Sylvain Lazard, Francis Lazarus

at SoCG'08, invited to CGTA
1-1
Walking your dog




            You and your dog walk along two given curves
      from beginning to end, continuously without backtracking,
                       joined by a tight leash

2-1
Walking your dog




                  How long must the leash be?



3-1
Walking your dog




The Fréchet distance between the curves is the minimum
leash length that permits such a walk


4-1
Walking your dog

[Figure: curves A (from 0 to 1) and B (from 0 to 1) with parameter values 0.25, 0.5, 0.75 marked along each]

Curves A, B : [0, 1] → E² are continuous
Re-parameterizations u, v : [0, 1] → [0, 1] define a walk
At time t, you are at A(u(t)) and your dog is at B(v(t))
u, v are continuous and monotone
5-1
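For concreteness, here is a minimal Python sketch (not from the slides) of evaluating a polygonal curve at a normalized arc-length parameter, so that A(u(t)) and B(v(t)) can actually be computed; the curves and the re-parameterizations u, v in the demo are invented examples.

```python
import math

def point_on_polyline(poly, s):
    """Evaluate a polygonal curve poly = [(x, y), ...] at normalized
    arc-length s in [0, 1]: the point that fraction of the way along it."""
    lengths = [math.dist(p, q) for p, q in zip(poly, poly[1:])]
    target = max(0.0, min(1.0, s)) * sum(lengths)
    for (p, q), seg in zip(zip(poly, poly[1:]), lengths):
        if target <= seg or seg == 0:
            t = 0.0 if seg == 0 else target / seg
            return (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))
        target -= seg
    return poly[-1]

# Positions of you and your dog at time t, for example re-parameterizations.
A = [(0, 0), (2, 0), (4, 1)]
B = [(0, 1), (2, 2), (4, 2)]
u = lambda t: t          # you walk A at constant speed
v = lambda t: t ** 2     # the dog lags behind, then catches up
t = 0.5
print(point_on_polyline(A, u(t)), point_on_polyline(B, v(t)))
```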
Fréchet distance = minimum feasible leash length

[Figure: curves A (from 0 to 1) and B (from 0 to 1) with parameter values 0.25, 0.5, 0.75 marked along each]

        F(A, B) = inf_{walks u, v : [0,1] → [0,1]}  max_{t ∈ [0,1]}  ||A(u(t)) − B(v(t))||
6-1
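As a point of reference only, here is a hedged sketch of the standard discrete Fréchet distance between two sampled point sequences (the classic dynamic program), which approximates F(A, B) when the curves are sampled densely; it is not the Alt+Godau algorithm and it ignores obstacles.

```python
from functools import lru_cache
from math import dist

def discrete_frechet(P, Q):
    """Discrete Fréchet distance between point sequences P and Q.
    dp(i, j) = min over coupled walks ending at (P[i], Q[j]) of the
    largest pairwise distance seen so far.  Fine for short sequences;
    iterate bottom-up instead of recursing for long ones."""
    @lru_cache(maxsize=None)
    def dp(i, j):
        d = dist(P[i], Q[j])
        if i == 0 and j == 0:
            return d
        if i == 0:
            return max(dp(0, j - 1), d)
        if j == 0:
            return max(dp(i - 1, 0), d)
        # Either you step, the dog steps, or both step.
        return max(min(dp(i - 1, j), dp(i, j - 1), dp(i - 1, j - 1)), d)
    return dp(len(P) - 1, len(Q) - 1)

print(discrete_frechet(((0, 0), (1, 0), (2, 0)),
                       ((0, 1), (1, 2), (2, 1))))
```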
Fréchet distance between curves

[Figure: curves A and B with parameter values 0.25, 0.5, 0.75 marked along each]

A metric defined by Maurice Fréchet (1878–1973)

O(N² log N) algorithm, in the plane, by Alt+Godau '95,
where N = m + n + 2 = total input complexity
7-1
Fréchet distance vs. Hausdorff distance

[Figure: two curves with their Hausdorff distance H and Fréchet distance F indicated]

Fréchet distance is a better measure of similarity since it
accounts for the flow of the curves (handwriting recognition)
8-1
Walking in the Woods




9-1
Woods have trees . . . and other obstacles

       New condition: Leash must move continuously




       If there are obstacles, a longer leash may be required because
       the leash cannot jump over them
       Goal: Walk the dog with the shortest leash possible
10-1
Homotopic Fréchet distance

Dog-leash distance in a general metric space where the
leash must move continuously in the metric space

We give a polynomial-time algorithm to compute the homotopic
Fréchet distance between two polygonal curves in
the plane with obstacles (= punctured plane)
11-1
Application: Morphing




       Leash motion encodes a continuous deformation between A
       and B, without penetrating obstacles
The “cost” of the deformation is the maximum distance any
point has to travel, i.e., the Fréchet distance
12-1
Example 1

[Figure: curves A and B among obstacles, with successive leash positions 1–7 shown as the walk progresses]
13-8
Example 2

[Figure: curves A and B among obstacles, with successive leash positions 1–7 shown as the walk progresses]
14-8
Definitions

         SLOW
         TOPOLOGY
          AHEAD




15-1
Leash map

Continuous function ℓ : [0, 1] × [0, 1] → S
                         (arc-length)  (time)   (metric space)
s.t. ℓ(0, ·) = A ∘ u and ℓ(1, ·) = B ∘ v, where u, v are re-parameterizations of A, B

[Figure: curves A (from u = 0 to u = 1) and B (from v = 0 to v = 1) with parameter values 0.25, 0.5, 0.75 marked]

i.e., ℓ(·, t) is the leash at time t joining A(u(t)) and B(v(t))
16-1
Homotopic Fréchet distance

Continuous function ℓ : [0, 1] × [0, 1] → S
                         (arc-length)  (time)   (metric space)
s.t. ℓ(0, ·) = A ∘ u and ℓ(1, ·) = B ∘ v, where u, v are re-parameterizations of A, B

The cost of a leash map is the maximum length of the
leash at any time during the leash motion:

        cost(ℓ) := max_{t ∈ [0,1]} { Length of ℓ(·, t) }

The homotopic Fréchet distance is the minimum cost of
any leash map:

        F(A, B) := inf_{leash maps ℓ} { cost(ℓ) }
17-2
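To make cost(ℓ) concrete, here is a small hedged sketch that evaluates a leash map given only at sampled times, treating each sampled leash as a polyline; this is a discretization for intuition, not the continuous definition above.

```python
from math import dist

def leash_length(polyline):
    """Length of one sampled leash, given as a list of points."""
    return sum(dist(p, q) for p, q in zip(polyline, polyline[1:]))

def cost(sampled_leash_map):
    """cost(ℓ) ≈ max over the sampled times t of the length of ℓ(·, t);
    sampled_leash_map[t] is the leash (a polyline) at the t-th time sample."""
    return max(leash_length(leash) for leash in sampled_leash_map)

# Two time samples of a leash sweeping between two curves.
print(cost([
    [(0, 0), (0, 1)],             # leash at the first sample: length 1
    [(1, 0), (1.5, 0.5), (1, 1)]  # later the leash bends around an obstacle
]))
```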
Meanwhile,
       back in the woods . . .



18-1
Punctured plane

Let A, B be two given curves in E²
Let P be a set of obstacles, with total complexity k
Punctured Plane = E² \ P
A leash is a curve in E² \ P joining A and B

Leash map ℓ : [0, 1] × [0, 1] → E² \ P must be continuous;
so, the leash cannot jump over obstacles
19-4
Relative homotopy

Two leashes are relatively homotopic if one can be continuously
transformed into the other in the punctured plane
while keeping their endpoints on the respective curves

Functions α, β : X → Y are (freely) homotopic if there is a continuous function
h : X × [0, 1] → Y such that h(·, 0) = α(·) and h(·, 1) = β(·)
20-2
Relative homotopy class

Every leash map ℓ describes a set of leashes belonging to
some relative homotopy class h
21-1
Homotopic Fréchet distance redux

Let h be a relative homotopy class

Let ℓ be a leash map in homotopy class h

Let Fh(A, B) := inf_{ℓ in class h} { cost(ℓ) }

Homotopic Fréchet distance

        F(A, B) := min_h { Fh(A, B) }
22-4
Key Insights
            a.k.a.
       ‘Aha!’ moments


23-1
Insight 1: Geodesic leashes

Lemma: There exists an optimum leash map such that the
leash at every time is the shortest path in its homotopy class

Hence, w.l.o.g., ℓ*(·, t) is the (unique) geodesic in homotopy
class h between its endpoints

Fact: The geodesic between a ∈ A and b ∈ B moves
continuously as a, b move continuously along their respective
curves

but . . . there are infinitely many geodesics with the same
endpoints, one in each of infinitely many homotopy classes
24-4
Insight 2: Proper line segment

Lemma: The optimum homotopy class h* must contain a
proper line segment, i.e., a line segment joining A and B
avoiding all obstacle points

[Figure: curve A with m edges, k point obstacles, curve B with n edges]
25-1
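Testing whether one candidate segment is proper is elementary, since it only has to miss finitely many obstacle points; a small hedged sketch (the function names are illustrative, not from the paper):

```python
from math import dist

def point_on_segment(pt, a, b, eps=1e-9):
    """True if pt lies on segment ab, within tolerance eps.
    Collinear and between the endpoints <=> the two sub-lengths add up."""
    return abs(dist(a, pt) + dist(pt, b) - dist(a, b)) <= eps

def is_proper(segment, obstacle_points):
    """A segment joining A and B is proper if it avoids all obstacle points."""
    a, b = segment
    return not any(point_on_segment(o, a, b) for o in obstacle_points)

print(is_proper(((0, 0), (4, 4)), [(2, 2)]))    # False: hits the obstacle
print(is_proper(((0, 0), (4, 4)), [(2, 2.5)]))  # True
```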
Insight 3: Minimal homotopy classes

       A homotopy class is minimal if it does not bend around
       obstacles unnecessarily

       Lemma: There exists an optimum homotopy class that is
       minimal

       Lemma: Every minimal homotopy class contains a proper
       line segment

       Here onwards, we speak only of minimal homotopy classes




26-1
Insight 4: Pinned leash map

The optimum leash map ℓ* may be pinned at a common
subpath π, i.e., a globally shortest p-q path where p, q are
obstacle boundary points

[Figure: leash pinned along a shortest path π between obstacle points p and q]

The optimum leash map ℓ* contains a direct geodesic
27-1
The Algorithm




28-1
Algorithm for point obstacles

List all candidate homotopy classes h
        There are O(mnk²) extremal proper line segments,
        at least one in each homotopy class

[Figure: curve A with m edges, k point obstacles, curve B with n edges]

Compute Fh(A, B)
        In O(mnk log mnk) time
        using parametric search
29-1
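To illustrate where an O(mnk²)-size family of candidates can come from, the hedged sketch below clips the line through every pair of obstacle points against every edge of A and every edge of B; it does not reproduce the paper's exact characterization of "extremal" segments, and all names are illustrative.

```python
from itertools import combinations

def cross(p, q, x):
    """Cross product of (q - p) and (x - p); its sign says on which side of line pq the point x lies."""
    return (q[0]-p[0])*(x[1]-p[1]) - (q[1]-p[1])*(x[0]-p[0])

def line_hits_edge(p, q, e0, e1):
    """Intersection of the infinite line through p, q with segment e0e1,
    or None if they miss (the parallel case is ignored for brevity)."""
    denom = cross(p, q, e1) - cross(p, q, e0)
    if abs(denom) < 1e-12:
        return None
    t = -cross(p, q, e0) / denom
    if 0.0 <= t <= 1.0:
        return (e0[0] + t*(e1[0]-e0[0]), e0[1] + t*(e1[1]-e0[1]))
    return None

def candidate_segments(A, B, obstacles):
    """For every pair of obstacle points, clip the line through them against
    every edge of A and every edge of B; each surviving (point on A, point on B)
    pair is one candidate segment -- an O(mnk^2)-size family."""
    out = []
    for p, q in combinations(obstacles, 2):
        for a0, a1 in zip(A, A[1:]):
            for b0, b1 in zip(B, B[1:]):
                pa = line_hits_edge(p, q, a0, a1)
                pb = line_hits_edge(p, q, b0, b1)
                if pa is not None and pb is not None:
                    out.append((pa, pb))
    return out

A = [(0, 0), (4, 0)]
B = [(0, 3), (4, 3)]
print(candidate_segments(A, B, [(1, 1), (2, 2), (3, 1)]))
```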
Algorithm for polygonal obstacles

The optimum leash map ℓ* may be pinned at a common
subpath π, i.e., a globally shortest p-q path

[Figure: leash pinned along a shortest path π between obstacle points p and q]

Enumerate O(mnk⁴) pinned homotopy classes h
Compute Fh(A, B) in O(mnk log mnk) time as before
30-1
Summary




31-1
We defined the homotopic Fréchet distance between two
curves in a general metric space

    the most natural generalization of Fréchet distance
    to arbitrary metric spaces

We gave a polynomial-time algorithm to compute the homotopic
Fréchet distance between two polygonal curves A, B
in the plane with point or polygonal obstacles

    the punctured plane is the first metric space
    that we consider
32-1
Open problem: On a convex polyhedron



       Leash is not always a geodesic!

       e.g., leash must have enough slack to cross over a vertex
       (a ‘mountain’)


       Challenge: Characterize an optimum leash map




33-1
Thank you!




34-1
Extra slides




35-1
How to compute
          Fh (A, B)
          in detail


36-1
Computing Fh

Decision problem: Given a real d ≥ 0, is Fh(A, B) ≤ d?

Observation: There are polynomially many critical values
of d at which the answer may change from ‘no’ to ‘yes’

        d1 ≤ d2 ≤ . . . ≤ di−1 ≤ d* ≤ di+1 ≤ di+2 ≤ . . .
        where d* = Fh(A, B)

Goal: Find the smallest critical value d for which the answer
above is ‘yes’
37-4
Is Fh ≤ d?

[Figure: one free-space cell, from corner (i, j) to corner (i + 1, j + 1), with the curves inside the cell across which |P| changes]
Image: http://www.cim.mcgill.ca/∼stephane/cs507/Project.html, Stéphane Pelletier, 2002

Is there a monotone path from (0, 0) to (m, n) in free space?
Key lemma: Free space within each cell Cij is convex
    i.e., leash length is a convex function along any monotone path through Cij
    (think hourglasses)
38-1
Computing free space

[Figure: free-space cell with corners (i, j), (i, j + 1), (i + 1, j), (i + 1, j + 1)]

Given the shortest path between ai and bj, compute:
(1) the interval b ∈ [bj, bj+1] such that dist(ai, b) ≤ d
(2) the interval a ∈ [ai, ai+1] such that dist(bj, a) ≤ d
(3) shortest paths (ai, bj+1), (ai+1, bj), and (ai+1, bj+1)
39-1
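In the obstacle-free (Euclidean) case, item (1) reduces to intersecting the segment [bj, bj+1] with a disk of radius d around ai, i.e., solving a quadratic; the hedged sketch below handles only that case (with obstacles the paper uses geodesic distance, which this does not compute).

```python
import math

def free_interval(a, b0, b1, d):
    """Sub-interval of t in [0, 1] where the point b0 + t*(b1 - b0) lies
    within Euclidean distance d of the point a, or None if it is empty."""
    ux, uy = b0[0] - a[0], b0[1] - a[1]
    wx, wy = b1[0] - b0[0], b1[1] - b0[1]
    A = wx*wx + wy*wy                 # |b1 - b0|^2
    B = 2 * (ux*wx + uy*wy)
    C = ux*ux + uy*uy - d*d
    if A == 0:                        # degenerate edge: a single point
        return (0.0, 1.0) if C <= 0 else None
    disc = B*B - 4*A*C
    if disc < 0:
        return None
    r = math.sqrt(disc)
    lo, hi = (-B - r) / (2*A), (-B + r) / (2*A)
    lo, hi = max(lo, 0.0), min(hi, 1.0)
    return (lo, hi) if lo <= hi else None

# Point at the origin, edge along y = 1 from x = -2 to x = 2, radius 1.5.
print(free_interval((0, 0), (-2, 1), (2, 1), d=1.5))
```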
Funnels

Shortest paths from ai to [bj, bj+1] define a funnel*
*in the universal cover

[Figure: funnel with apex and tail from ai, spanning the edge from bj to bj+1]

Leash evolves like a deque as its endpoints sweep

In O(k log k) time, build a funnel data structure of size
O(k), using a deque, such that each free interval can be
computed in O(log k) time
40-3
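The "leash evolves like a deque" remark can be illustrated on one side of a funnel: that side is an inward-convex chain kept in a deque, and advancing the endpoint pops trailing vertices that stop being corners of the shortest path, much as in Graham's scan. The hedged sketch below shows only this single-sided update under a fixed left-turn convention; apex migration to the other chain and the geodesic bookkeeping of the real funnel structure are omitted.

```python
from collections import deque

def cross(o, a, b):
    """> 0 if o→a→b turns left, < 0 if it turns right, 0 if collinear."""
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def extend_side(side, b):
    """Append a new leash endpoint b to one side of a funnel, stored as a
    deque of leash vertices (apex first).  Trailing vertices at which the
    chain would stop turning strictly left are popped, so the chain stays
    convex -- the deque discipline of the funnel algorithm."""
    side = deque(side)                 # keep the caller's chain intact
    while len(side) >= 2 and cross(side[-2], side[-1], b) <= 0:
        side.pop()                     # no longer a corner of the leash
    side.append(b)
    return side

chain = deque([(0, 0), (1, 1)])        # apex at (0, 0), one bend at (1, 1)
for b in [(1.2, 3.0), (2.5, 3.0), (3.5, 3.0)]:
    # The bend unwinds once b crosses the extension of the edge (0,0)->(1,1).
    print(list(extend_side(chain, b)))
```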
Computing Fh

Decision algorithm: Given a real d ≥ 0, is Fh(A, B) ≤ d?
. . . in O(mn log k) time

Observation: There are polynomially many critical values
of d at which the answer may change from ‘no’ to ‘yes’

        d1 ≤ d2 ≤ . . . ≤ di−1 ≤ d* ≤ di+1 ≤ di+2 ≤ . . .
        where d* = Fh(A, B)

Goal: Find the smallest critical value d for which the answer
above is ‘yes’
41-4
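To make the goal concrete: if the critical values were listed explicitly, a plain binary search driven by the decision algorithm would already return Fh(A, B), as in the hedged sketch below (decide_leq is a toy stand-in for the O(mn log k) decision procedure). The slides instead use parametric search, described next, to accomplish this more cleverly.

```python
def smallest_yes(critical_values, decide_leq):
    """Smallest critical value d with decide_leq(d) == True, assuming the
    answers are monotone ('no' ... 'no' 'yes' ... 'yes') as on the slide."""
    values = sorted(critical_values)
    lo, hi = 0, len(values) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if decide_leq(values[mid]):
            hi = mid          # a 'yes': the answer is here or to the left
        else:
            lo = mid + 1      # a 'no': the answer is strictly to the right
    return values[lo]

# Toy stand-in for the decision algorithm; the true optimum here is 2.7.
decide = lambda d: d >= 2.7
print(smallest_yes([0.5, 1.3, 2.7, 3.1, 4.0], decide))   # -> 2.7
```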
Parametric search                             [Megiddo ’83]

Let As be an algorithm to decide, given a critical value di,
whether Fh(A, B) ≤ di, with running time O(Ts)

        d1 ≤ d2 ≤ . . . ≤ di−1 ≤ d* ≤ . . . ≤ dj ≤ . . .

We simulate As on input d*, with d* as a symbolic variable
The control flow of As depends on comparisons of the form
d* ≤ dj, where dj is a critical value
    Each dj is a distance, i.e., a quadratic function of input coordinates.

d* ≤ dj? Run As on input dj, in O(Ts) time

Total running time = O(Ts²)
42-5
Parametric search on steroids                 [Megiddo ’83]

Let Ap be a parallel algorithm to decide, given di, whether
Fh(A, B) ≤ di, with parallel running time O(Tp) on q processors

        d1 ≤ d2 ≤ . . . ≤ di−1 ≤ d* ≤ . . . ≤ dj ≤ . . .

We simulate Ap sequentially on input d*, with d* as a
symbolic variable
The control flow of Ap depends on comparisons of the form
d* ≤ dj, where dj is a critical value
In each parallel stage, do binary search on O(q) values dj

Total running time = O(Ts Tp log q + q Tp), better than O(Ts²)
43-5
How we use parametric search

Let Ap be a parallel sorting algorithm
(e.g., Cole’s parallel merge sort)
We use Ap to sort O(mn) critical distances with parallel
running time O(log mn)
We simulate Ap sequentially on input d*, with d* as a
symbolic variable
The control flow of Ap depends on comparisons of the form
d* ≤ dj, where dj is a critical value
To simulate each parallel stage, we resolve all O(mn) comparisons
d* ≤ dj by computing O(mnk) critical intervals
and locating each dj in some interval

Total running time = O(N³ log N)
44-5
The End




45-1
