Analysis of a randomized algorithm for independent set construction
Problem
Suppose that $G = (V,E)$ is a 3-regular graph on $n$ vertices and $m$ edges. Below I propose a randomized algorithm for obtaining an independent set for $G$.
Step $1$: Delete each vertex (independently) with $\frac{2}{3}$ probability.
Step $2$: For each remaining edge, delete one of its endvertices.
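The two steps above can be sketched in a few lines of Python (the function name, the arbitrary tie-breaking rule in Step $2$, and the demo graph are my own choices, not part of the problem):

```python
import random

def randomized_independent_set(n, edges, rng=None):
    """Two-step algorithm sketch: delete each vertex with probability 2/3,
    then delete one endpoint of every edge that still has both endpoints."""
    rng = rng or random.Random()
    # Step 1: each vertex survives independently with probability 1/3.
    alive = {v for v in range(n) if rng.random() < 1 / 3}
    # Step 2: for each surviving edge, delete one of its endpoints
    # (the choice is arbitrary; here we always drop the first one).
    for u, v in edges:
        if u in alive and v in alive:
            alive.discard(u)
    return alive

# Demo on a small 3-regular graph: a 12-cycle plus its 6 "diameter" chords.
edges = [(i, (i + 1) % 12) for i in range(12)] + [(i, i + 6) for i in range(6)]
ind_set = randomized_independent_set(12, edges, rng=random.Random(0))
assert all(not (u in ind_set and v in ind_set) for u, v in edges)
```

Since deletions in Step $2$ only shrink the surviving set, every edge is checked at a point after which neither endpoint can be revived, so the output is always independent.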
I want to upper bound the probability that this $2$-step algorithm yields an independent set of size smaller than $\frac{n(1-\varepsilon)}{6}$, where $0 < \varepsilon \leq 1$. My partial attempt is below.
Let $X_1, X_2, \dots, X_n$ be independent Poisson trials satisfying $\textbf{Pr}[X_i = 1] = \frac{1}{3}$. Hence, $X = \sum_{i=1}^{n} X_i$ gives us the number of vertices in $G$ that survive Step $1$, and $\textbf{E}[X] = \frac{n}{3}$.
Let $Y_1, Y_2, \dots, Y_m$ be indicator variables for the edges, where $Y_i = 1$ exactly when both endpoints of the $i$-th edge survive Step $1$, so $\textbf{Pr}[Y_i = 1] = \frac{1}{9}$. (Note that the $Y_i$ are not independent, since edges can share endpoints.) Hence, $Y = \sum_{i=1}^{m} Y_i$ gives us the number of edges in $G$ that survive Step $1$, and by linearity of expectation, $\textbf{E}[Y] = \frac{m}{9} = \frac{n}{6}$, since $m = \frac{3n}{2}$ in a $3$-regular graph.
Thus, $X - Y$ is a lower bound on the number of vertices remaining after the algorithm terminates, since Step $2$ deletes at most one vertex per surviving edge.
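A quick Monte Carlo sanity check of the two expectations above (the circulant $3$-regular graph and all names below are my own illustration, not part of the problem):

```python
import random

rng = random.Random(42)
n = 60  # any even n works for the 3-regular circulant graph below
# 3-regular graph: an n-cycle plus the n/2 "diameter" chords.
edges = [(i, (i + 1) % n) for i in range(n)] + [(i, i + n // 2) for i in range(n // 2)]

trials = 5000
avg_X = avg_Y = 0.0
for _ in range(trials):
    # Step 1 only: each vertex survives with probability 1/3.
    alive = {v for v in range(n) if rng.random() < 1 / 3}
    avg_X += len(alive) / trials
    avg_Y += sum(1 for u, v in edges if u in alive and v in alive) / trials

print(avg_X, "vs E[X] = n/3 =", n / 3)
print(avg_Y, "vs E[Y] = m/9 = n/6 =", n / 6)
```

With $n = 60$ the graph has $m = 90 = \frac{3n}{2}$ edges, and the empirical averages land close to $\textbf{E}[X] = 20$ and $\textbf{E}[Y] = 10$.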
Now, the following Chernoff bound seems especially pertinent to this problem, where $Z$ is a sum of independent Poisson trials, $\mu$ is its expectation, and $0 < \delta \leq 1$:
$\textbf{Pr}[Z < (1 - \delta)\mu] < \exp\left(-\frac{\mu \delta^2}{2}\right)$
Of course, $X-Y$ is a lower bound on the size of the output independent set, but I'm not sure how I can apply the expression to the Chernoff bound. Can $X-Y$ be conceived as a sum of Poisson indicator variables, which would be required if I wanted to put $X-Y$ in place of $Z$ in the above bound? If so, then it seems to me that I can use this Chernoff bound to derive an upper bound on the probability that the minimum size of the output independent set is smaller than $\frac{n(1-\varepsilon)}{6}$. How can I get from this to my objective: an upper bound on the probability that the actual size of the output independent set is smaller than $\frac{n(1-\varepsilon)}{6}$?
Solution
You can obtain a weak upper bound by resorting to Markov's inequality instead. Specifically, let the random variable $\small Z$ be the size of the resulting independent set. We then have
\begin{align}
\small
\Pr\left[Z \leq \frac{n(1 - \epsilon)}{6}\right] \leq&~\small\Pr\left[X - Y \leq \frac{n(1 - \epsilon)}{6}\right] \\
=&~\small\Pr\left[n - (X - Y) \geq n - \frac{n(1-\epsilon)}{6}\right] \\
=&~\small\Pr\left[n - (X - Y) \geq \frac{(5 + \epsilon)n}{6}\right] \\
\leq&~\small \frac{5n / 6}{(5 + \epsilon) n/ 6} = \frac{5}{5 + \epsilon}
\end{align}
where the last inequality is by Markov's inequality (applicable since $\small n - (X - Y) \geq 0$) and $\small \mathbb{E}[n - (X - Y)] = n - \frac{n}{3} + \frac{n}{6} = \frac{5n}{6}$.
Though the bound is weak, we can drive the failure probability down by running your algorithm independently multiple times (say, $\small \ell$ times) and keeping the largest independent set found. The probability that the resulting size is still smaller than $\small \frac{n(1- \epsilon)}{6}$ is then at most $\small \left(\frac{5}{5+\epsilon}\right)^\ell$.
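For concreteness, pushing the overall failure bound below a target $\small \delta$ requires $\small \ell \geq \ln(1/\delta) / \ln\frac{5+\epsilon}{5}$ repetitions; a small helper (the function name is mine) to compute this:

```python
import math

def repetitions_needed(eps, delta):
    """Smallest ℓ with (5 / (5 + eps)) ** ℓ <= delta."""
    return math.ceil(math.log(1 / delta) / math.log((5 + eps) / 5))

# With eps = 0.5 the per-run failure bound is 5/5.5 ≈ 0.909, so driving
# the overall failure probability below 1% takes:
print(repetitions_needed(0.5, 0.01))  # → 49
```

Because each run only shaves off a constant factor, the required $\small \ell$ grows like $\small \Theta(\ln(1/\delta)/\epsilon)$ for small $\small \epsilon$.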
Context
StackExchange Computer Science Q#65062, answer score: 4