Minimizing deviations in hypergraph matrices

The definition of good rows in the incidence matrix provides a basis for establishing the stability of a hypergraph even at extreme noise levels.
To begin, let us define the inequality that determines good rows in the incidence matrix. A row $i$ of the matrix $\gamma(\Upsilon)$ will be called good if for each $j = 1, \ldots, r$ the following inequality holds:

$$\epsilon_{ij} \leq \frac{1}{r} - \frac{1}{r^2} - \frac{2}{r \ln r}$$
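As a quick numerical illustration, here is a minimal Python sketch of this check; the function names and the noise matrix `eps` are hypothetical, not part of the original text:

```python
import numpy as np

def good_row_threshold(r: int) -> float:
    # Right-hand side of the good-row inequality: 1/r - 1/r^2 - 2/(r ln r)
    return 1.0 / r - 1.0 / r**2 - 2.0 / (r * np.log(r))

def is_good_row(eps_row: np.ndarray) -> bool:
    # A row is good if every noise term eps_ij satisfies the inequality
    r = len(eps_row)
    return bool(np.all(eps_row <= good_row_threshold(r)))

# Hypothetical example: 5 noise rows with r = 1000 columns
rng = np.random.default_rng(0)
eps = rng.normal(0.0, 1e-5, size=(5, 1000))
print([is_good_row(row) for row in eps])  # expected: [True, True, True, True, True]
```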
Using this fact and the previously determined basis, we obtain:
$$k_i(\gamma(\Upsilon)) \leq \left(6 + \frac{37}{r} + O\!\left(\frac{1}{r^2}\right)\right) \sum_{j=1}^{r} \epsilon_{ij}^2$$

Next, applying the upper bound, we obtain:
$$k_i(\gamma(\Upsilon)) \leq \left(r \ln r + \frac{30}{6} \ln r - \frac{259 \ln r}{36} + \frac{r \ln r}{r} + O\!\left(\frac{1}{r^2}\right)\right) \sum_{j=1}^{r} \epsilon_{ij}^2$$
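Read as a single coefficient multiplying the noise energy $\sum_j \epsilon_{ij}^2$ (one plausible reading of the formula above, not stated explicitly), the bound grows like $r \ln r$; a small sketch of this, with the $O(1/r^2)$ term dropped:

```python
import numpy as np

def k_upper_coeff(r: float) -> float:
    # r ln r + (30/6) ln r - (259/36) ln r + (r ln r)/r, dropping the O(1/r^2) term
    log_r = np.log(r)
    return r * log_r + (30 / 6) * log_r - (259 / 36) * log_r + (r * log_r) / r

for r in (10.0, 100.0, 1000.0):
    print(r, k_upper_coeff(r) / (r * np.log(r)))  # ratio approaches 1, i.e. ~ r ln r growth
```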
Knowing this, consider the case of good rows with $\epsilon_{ij} < 0$:
$$\left(\frac{1}{r^2} + \epsilon_{ij}\right) \ln(1 + r^2 \epsilon_{ij}) \geq \epsilon_{ij} + \frac{1}{2r^2} \left(r^2 \epsilon_{ij}\right)^2 = \epsilon_{ij} + \frac{r^2 \epsilon_{ij}^2}{2}$$

This follows from the estimate $(1+x)\ln(1+x) \geq x + \frac{x^2}{2}$, valid for $-1 < x < 0$, applied with $x = r^2 \epsilon_{ij}$.
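A hedged numerical check of this estimate (the value of $r$ and the noise grid are chosen arbitrarily for illustration):

```python
import numpy as np

r = 100
# Negative noise values in (-1/r^2, 0), so that 1 + r^2 * eps stays positive
eps = -np.linspace(1e-9, 0.99 / r**2, 1000)
lhs = (1 / r**2 + eps) * np.log1p(r**2 * eps)   # (1/r^2 + eps) * ln(1 + r^2 * eps)
rhs = eps + (r**2 * eps**2) / 2
print(np.all(lhs >= rhs))  # expected: True on the whole range
```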
If $\epsilon_{ij} > 0$ but $\epsilon_{ij} \leq \frac{1}{r \ln r} - \frac{1}{r^2}$, we use an estimate of the function $\varphi(x) = (1 + x) \ln (1 + x)$. It is known that $\varphi(x) > x + \frac{x^2}{2(1 + x/3)}$ for $x > 0$. Setting $x = r^2 \epsilon_{ij}$ gives the following estimate:
$$\left(\frac{1}{r^2} + \epsilon_{ij}\right) \ln(1 + r^2 \epsilon_{ij}) \geq \epsilon_{ij} + \frac{r^2 \epsilon_{ij}^2}{2\left(1 + \frac{r^2 \epsilon_{ij}}{3}\right)}$$

Estimating the denominator: since $r^2 \epsilon_{ij} \leq \frac{r}{\ln r}$ in this range, we have $1 + \frac{r^2 \epsilon_{ij}}{3} \leq 1 + \frac{r}{3 \ln r} \leq 1 + r \ln r$, hence:

$$\left(\frac{1}{r^2} + \epsilon_{ij}\right) \ln(1 + r^2 \epsilon_{ij}) \geq \epsilon_{ij} + \frac{r^2 \epsilon_{ij}^2}{2(1 + r \ln r)}$$
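Again a small numerical check of the resulting bound on the stated range (parameters are illustrative only):

```python
import numpy as np

r = 100
upper = 1 / (r * np.log(r)) - 1 / r**2          # right end of the small-positive range
eps = np.linspace(1e-12, upper, 1000)
lhs = (1 / r**2 + eps) * np.log1p(r**2 * eps)
rhs = eps + (r**2 * eps**2) / (2 * (1 + r * np.log(r)))
print(np.all(lhs >= rhs))  # expected: True
```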
Further, if $\epsilon_{ij} \in \left( \frac{1}{r \ln r} - \frac{1}{r^2},\ \frac{1}{r} - \frac{2 \ln \ln r}{\ln r} \right)$, we estimate the summand as follows:
$$\left(\frac{1}{r^2} + \epsilon_{ij}\right) \ln(1 + r^2 \epsilon_{ij}) \geq \left(\frac{1}{r^2} + \epsilon_{ij}\right) \ln \left( \frac{r \ln r}{r \ln r - r} \right)$$

Using the inequality $1 > r \epsilon_{ij} \left(1 - \frac{1}{r} - \frac{2 \ln \ln r}{\ln r}\right) - 1$:

$$\left(\frac{1}{r^2} + \epsilon_{ij}\right) \ln(1 + r^2 \epsilon_{ij}) \geq \epsilon_{ij} + r \ln r \, \epsilon_{ij} \left(1 + \frac{2 \ln \ln r}{\ln r} + O\!\left(\left(\frac{\ln \ln r}{\ln r}\right)^2\right) \left(1 - \frac{\ln \ln r}{\ln r}\right)\right)$$
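The first step above only uses that $\ln(1 + r^2 \epsilon_{ij})$ is increasing in $\epsilon_{ij}$; a sketch checking it numerically just above the left end of the range (the factor 10 is an arbitrary illustration):

```python
import numpy as np

r = 100
lower = 1 / (r * np.log(r)) - 1 / r**2          # left end of the middle range
eps = np.linspace(lower, 10 * lower, 1000)
factor = 1 / r**2 + eps
lhs = factor * np.log1p(r**2 * eps)
rhs = factor * np.log(r * np.log(r) / (r * np.log(r) - r))
print(np.all(lhs >= rhs))  # expected: True, since 1 + r^2*eps >= r/ln(r) > ln(r)/(ln(r)-1) here
```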
Thus, for a good row with sufficiently large $r$ we get:
$$h_i(\gamma(\Upsilon)) - k_i(\gamma(\Upsilon)) \geq \frac{r^2}{4} \sum_{j:\, \epsilon_{ij} < 0} \epsilon_{ij}^2 + \frac{r}{2} \sum_{j:\, \epsilon_{ij} > 0} \epsilon_{ij}^2$$

The rows of the matrix remain good even at a high noise level $\epsilon_{ij}$ (after several operations), and a good row means that the hypergraph is stable.
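To make the final bound concrete, a minimal sketch that evaluates its right-hand side for a hypothetical noise row (the function name and data are not from the original text):

```python
import numpy as np

def deviation_lower_bound(eps_row: np.ndarray) -> float:
    # (r^2/4) * sum of squared negative noise terms + (r/2) * sum of squared positive ones
    r = len(eps_row)
    neg = eps_row[eps_row < 0]
    pos = eps_row[eps_row > 0]
    return (r**2 / 4) * np.sum(neg**2) + (r / 2) * np.sum(pos**2)

rng = np.random.default_rng(1)
eps_row = rng.normal(0.0, 1e-5, size=1000)      # a good row for this noise level
print(deviation_lower_bound(eps_row))           # lower bound on h_i - k_i
```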
The inequality also shows that, for good rows of the matrix, the difference between $h_i(\gamma(\Upsilon))$ and $k_i(\gamma(\Upsilon))$ (at the time of recalculation) is, for large $r$, bounded from below by the weighted sum of squared noise terms $\frac{r^2}{4} \sum_{j:\, \epsilon_{ij} < 0} \epsilon_{ij}^2 + \frac{r}{2} \sum_{j:\, \epsilon_{ij} > 0} \epsilon_{ij}^2$.
A large value of r r r allows the hypergraph system to remain stable, as proven above.