Lecture 8 (Adv): Karger’s algorithms

Randomization
– Algorithmic design patterns:
  – Greedy.
  – Divide-and-conquer.
  – Dynamic programming.
  – Network flow.
  – Randomization.
– Randomization: allow a fair coin flip in unit time (in practice, access to a pseudo-random number generator).
– Why randomize? Randomization can lead to the simplest, fastest, or only known algorithm for a particular problem.
– Examples: symmetry-breaking protocols, graph algorithms, quicksort, hashing, load balancing, Monte Carlo integration, cryptography.

13.2 Global Minimum Cut
Input: A connected, undirected graph G = (V, E). For a set S ⊂ V let δ(S) = {(u,v) ∈ E : u ∈ S, v ∈ V \ S}.
Aim: Find a cut (S, S’ = V \ S) of minimum cardinality.

[Figure: an example graph on vertices v1, …, v7 with a cut (S, S’) of size |δ(S)| = 4.]

[Figure: the same example graph with a smaller cut (S, S’), of size |δ(S)| = 2.]

13.2 Global Minimum Cut
Applications: partitioning items in a database, identifying clusters of related documents, network reliability, network design, circuit design, TSP solvers.
Network flow solution.
– Replace every edge (u, v) with two directed edges (u, v) and (v, u).
– Pick some vertex s and compute a min s-v cut separating s from each other vertex v ∈ V.
Running time: O((n−1) · MaxFlow), i.e., n−1 max-flow computations.
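To make the reduction concrete, here is a minimal sketch in Python, assuming the networkx library and its minimum_cut routine; the function name and the example graph are illustrative, not from the lecture.

```python
# A minimal sketch (assumed helper name and example graph; not from the
# lecture), using the networkx library's minimum_cut routine.
import networkx as nx

def global_min_cut_via_maxflow(edges):
    """Global min cut of a connected, undirected simple graph via n-1 s-v min cuts."""
    G = nx.DiGraph()
    for u, v in edges:
        # Replace every undirected edge with two directed edges of capacity 1.
        G.add_edge(u, v, capacity=1)
        G.add_edge(v, u, capacity=1)

    nodes = list(G.nodes)
    s = nodes[0]                              # fix an arbitrary source s
    best_value, best_partition = float("inf"), None
    for v in nodes[1:]:                       # min s-v cut for every other vertex v
        value, partition = nx.minimum_cut(G, s, v)
        if value < best_value:
            best_value, best_partition = value, partition
    return best_value, best_partition

# Example graph whose global min cut has size 2.
edges = [("v1", "v2"), ("v2", "v3"), ("v3", "v1"), ("v3", "v4"),
         ("v1", "v4"), ("v4", "v5"), ("v5", "v6"), ("v6", "v4")]
print(global_min_cut_via_maxflow(edges))
```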


Karger’s Contraction Algorithm
Definition: A multigraph is a graph that allows multiple edges between a pair of vertices.
Algorithm:
1. Start with the input graph G = (V, E).
2. While |V| > 2: contract an arbitrary edge (u,v) in G.
3. Return the cut (only one possible cut).

Karger’s Contraction Algorithm
Let G = (V, E) be a multigraph (without self-loops). Contract an edge e = (u,v) ∈ E ⇒ Ge:
• Replace u and v by a single new super-node w.
• Replace all edges (u,x) or (v,x) with an edge (w,x).
• Remove self-loops at w.
[Figure: contracting the edge (u,v) merges u and v into the super-node w; self-loops at w are removed.]
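A minimal sketch of the contraction step, assuming a multigraph stored as a plain list of undirected edges; the representation and the function name are illustrative, not from the lecture.

```python
# Illustrative sketch: a multigraph stored as a plain list of undirected
# edges, with parallel edges simply repeated in the list.
def contract(edges, u, v, w):
    """Contract the edge (u, v): merge u and v into the super-node w."""
    new_edges = []
    for (a, b) in edges:
        a = w if a in (u, v) else a      # redirect endpoints u and v to w
        b = w if b in (u, v) else b
        if a != b:                       # drop self-loops at w
            new_edges.append((a, b))
    return new_edges

# Contracting (u, v) keeps the parallel edges to x and removes the loop.
edges = [("u", "v"), ("u", "x"), ("v", "x"), ("v", "y")]
print(contract(edges, "u", "v", "w"))    # [('w', 'x'), ('w', 'x'), ('w', 'y')]
```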


Karger’s contraction algorithm

Karger’s Contraction Algorithm
Observation: Contracting an edge (u,v) preserves every cut (S, S’) in which u and v lie on the same side (both in S or both in S’).
[Figure: contracting (u,v) when both endpoints are on the same side of the cut; the cut edges are unchanged.]

If u, v ∈ S then δG(S) = δGe(S) (with u and v replaced by w).

Algorithm: General idea
– Contract n−2 edges ⇒ two vertices remain in G’.
– The two vertices in G’ correspond to a partition (S, S’) of G.
– The edges remaining in G’ correspond to δG(S).
– Output δG(S).
If we never contract an edge from a minimum cut δ(S*), then the algorithm reports δ(S*).
How do we select the edges?

Karger’s Contraction Algorithm
Algorithm:
1. Start with the input graph G = (V, E).
2. While |V| > 2: contract an edge (u,v) chosen uniformly at random in G.
3. Return the cut S (only one possible cut).
Idea: Since S* is a minimum cut it has few edges, so a uniformly random edge is unlikely to lie in it.
Claim: This algorithm has a reasonable chance of finding a minimum cut.
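A minimal, self-contained sketch of the contraction algorithm with uniformly random edge choices, under an assumed edge-list representation; the function names and example graph are illustrative, not from the lecture.

```python
import random

def contract(edges, u, v, w):
    """Merge u and v into the super-node w, dropping self-loops."""
    out = []
    for (a, b) in edges:
        a = w if a in (u, v) else a
        b = w if b in (u, v) else b
        if a != b:
            out.append((a, b))
    return out

def karger_min_cut(edges):
    """One run of the contraction algorithm; returns the size of the cut found."""
    edges = list(edges)
    vertices = {x for e in edges for x in e}
    while len(vertices) > 2:
        u, v = random.choice(edges)          # uniformly random remaining edge
        edges = contract(edges, u, v, (u, v))
        vertices = {x for e in edges for x in e}
    return len(edges)                        # the remaining edges form delta(S)

# Example graph whose global min cut has size 2. A single run is only
# correct with probability >= 2/n^2, so in practice the run is repeated.
G = [("v1", "v2"), ("v2", "v3"), ("v3", "v1"), ("v3", "v4"),
     ("v1", "v4"), ("v4", "v5"), ("v5", "v6"), ("v6", "v4")]
print(min(karger_min_cut(G) for _ in range(100)))
```

With this naive edge-list representation one run costs O(nm); the O(n²) bound per run quoted on a later slide assumes an adjacency-matrix style implementation.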

Prove the claim
Claim: The algorithm returns a minimum cut with probability ≥ 2/n².
Proof:
Consider a global min cut (S, S’) of G. Let d be the set of edges with one endpoint in S and the other in S’.
Let k = |d| = size of the min cut.
[Figure: the min cut (S, S’) with cut edges d.]

Step 1: we contract an edge in d with probability k/|E|. How large is |E|?

Every node has degree ≥ k, otherwise (S, S’) would not be a min cut. ⇒ |E| ≥ ½·k·n.

Hence in step 1 we contract an edge in d with probability k/|E| ≤ k/(kn/2) = 2/n.

Observation: The minimum degree in any intermediate multigraph is at least k (otherwise there would be a smaller cut). In particular, if an intermediate multigraph has n’ vertices, it has at least n’·k/2 edges.

After step i: the multigraph Gi has n−i vertices and at least (n−i)·k/2 edges.

Let ei = the random edge contracted in step i.
Probability that the algorithm finds the minimum cut?
Pr[the edges in the final graph are exactly d] = Pr[e1, e2, …, en−2 ∉ d]

Proof
Theorem: Pr[e1, e2, …, en−2 ∉ d] ≥ 2/n²
Proof:
Pr[e1, e2, …, en−2 ∉ d]
= Pr[e1 ∉ d] · Π Pr[ei+1 ∉ d | e1, …, ei ∉ d]
≥ (1 − 2/n)·(1 − 2/(n−1))·(1 − 2/(n−2)) ⋯ (1 − 2/3)
= (n−2)/n · (n−3)/(n−1) · (n−4)/(n−2) · (n−5)/(n−3) ⋯ 2/4 · 1/3
= 2/(n(n−1))
≥ 2/n²
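As a quick sanity check on the telescoping product (an illustrative snippet, not part of the lecture), the following compares the product with the closed form 2/(n(n−1)):

```python
# Sanity check (illustrative): the telescoping product equals 2/(n(n-1)).
from math import prod

def survival_probability_bound(n):
    """Product of (1 - 2/j) for j = n, n-1, ..., 3."""
    return prod(1 - 2 / j for j in range(n, 2, -1))

for n in (5, 10, 50):
    print(n, survival_probability_bound(n), 2 / (n * (n - 1)), 2 / n**2)
```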

Amplification
To amplify the probability of success, run the contraction algorithm many times and keep the smallest cut found.
Claim: If we repeat the contraction algorithm r·n² times with independent random choices, the probability that all runs fail is at most
(1 − 1/n²)^(r·n²) ≤ (1/e)^r    [using (1 − 1/x)^x ≤ 1/e; each run succeeds with probability ≥ 2/n² ≥ 1/n²]
Set r = c·ln n for a constant c: the probability of failure is at most e^(−c·ln n) = n^(−c), so the probability of success is at least 1 − 1/n^c.
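A small numeric illustration of the amplification bound (illustrative, not part of the lecture), comparing the exact failure bound with (1/e)^r:

```python
# Illustrative check of the amplification bound for one value of n.
from math import exp

n = 20
for r in (1, 2, 5):
    runs = r * n * n                        # repeat the algorithm r*n^2 times
    fail_bound = (1 - 1 / n**2) ** runs     # probability all runs miss the min cut
    print(runs, fail_bound, exp(-r))        # fail_bound <= (1/e)^r
```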


Karger’s Contraction Algorithm
Algorithm:
1. Start with the input graph G = (V, E).
2. While |V| > 2: contract an edge (u,v) chosen uniformly at random.
3. Return the cut S (only one possible cut).
Running time: n−2 iterations, each requiring O(n) time ⇒ O(n²) per run.
The algorithm is repeated O(n² log n) times ⇒ total running time O(n⁴ log n).

Improved algorithm
Improvement [Karger-Stein 1996]: O(n² log³ n).
– Early iterations are less risky than later ones: the probability of having contracted an edge of the min cut only reaches 50% when about n/√2 nodes remain.
– Run the contraction algorithm until n/√2 nodes remain.
– Run the contraction algorithm twice on the resulting graph (recursively), and return the best of the two cuts.
Running time: T(n) = 2(n² + T(n/√2)) = O(n² log n)   [Master Theorem]
Probability of success: each of the two recursive runs finds the min cut with probability ≥ ½·P(n/√2), so
P(n) ≥ 1 − (1 − ½·P(n/√2))² = Ω(1/log n).
Run the algorithm c·log² n times: the failure probability drops to n^(−Ω(c)) and the total running time is O(n² log³ n).
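A self-contained sketch of the Karger-Stein recursion under the same illustrative edge-list representation; the names, base-case threshold, and example graph are assumptions, not from the lecture.

```python
import math
import random

def contract(edges, u, v, w):
    """Merge u and v into the super-node w, dropping self-loops."""
    out = []
    for (a, b) in edges:
        a = w if a in (u, v) else a
        b = w if b in (u, v) else b
        if a != b:
            out.append((a, b))
    return out

def contract_until(edges, target):
    """Contract uniformly random edges until `target` super-nodes remain."""
    vertices = {x for e in edges for x in e}
    while len(vertices) > target:
        u, v = random.choice(edges)
        edges = contract(edges, u, v, (u, v))
        vertices = {x for e in edges for x in e}
    return edges

def karger_stein(edges):
    """Return the size of the best cut found by the recursive scheme."""
    n = len({x for e in edges for x in e})
    if n <= 6:                                   # small base case: contract to 2
        return len(contract_until(edges, 2))
    t = math.ceil(1 + n / math.sqrt(2))          # keep about n/sqrt(2) nodes
    # Two independent partial contractions, recurse on each, keep the better cut.
    return min(karger_stein(contract_until(list(edges), t)) for _ in range(2))

G = [("v1", "v2"), ("v2", "v3"), ("v3", "v1"), ("v3", "v4"),
     ("v1", "v4"), ("v4", "v5"), ("v5", "v6"), ("v6", "v4")]
print(min(karger_stein(G) for _ in range(30)))   # repeat to boost the success probability
```

The two independent partial contractions followed by recursion mirror the recurrence T(n) = 2(n² + T(n/√2)) above, up to the cost of the naive edge-list representation.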

Best known [Karger 2000]: O(m log³ n), faster than the best known max-flow algorithm or deterministic global min-cut algorithm.

Reading material
Eric Vigoda’s lecture notes:
http://www.cc.gatech.edu/~vigoda/7530-Spring10/Kargers-MinCut.pdf
