Minimizing deviations in hypergraph matrices

Defining good rows of the incidence matrix establishes a basis for the stability of the hypergraph even at extreme noise levels.


To begin, let's define the inequality that allows us to determine good rows in the incidence matrix. A row $i$ of the matrix $\gamma(\Upsilon)$ will in our case be called *good* if for each $j = 1, \ldots, r$ the following inequality holds:

$$\epsilon_{ij} \leq \frac{1}{r} - \frac{1}{r^2} - \frac{2}{r \ln r}$$
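As a quick illustration (not part of the original derivation), the right-hand side of this condition can be evaluated numerically to classify a row; the helper names below are hypothetical:

```python
import math

def good_row_threshold(r: float) -> float:
    # Right-hand side of the good-row condition: 1/r - 1/r^2 - 2/(r ln r).
    return 1.0 / r - 1.0 / r**2 - 2.0 / (r * math.log(r))

def is_good_row(eps_row, r):
    # A row of the incidence matrix is "good" if every noise term
    # eps_ij satisfies the inequality above.
    t = good_row_threshold(r)
    return all(e <= t for e in eps_row)

print(good_row_threshold(100))
print(is_good_row([0.001, -0.002, 0.004], 100))
```

Note that the threshold is only positive for sufficiently large $r$, which is consistent with the "large $r$" regime assumed throughout this section.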

Using this fact and the previously determined basis, we obtain:

$$k_i(\gamma(\Upsilon)) \leq 6 + \frac{37}{r} + O\left(\frac{1}{r^2}\right) \sum_{j=1}^r \epsilon_{ij}^2$$

Next, applying the upper bound, we obtain:

$$k_i(\gamma(\Upsilon)) \leq r \ln r + \frac{30}{6} \ln r - \frac{259 \ln r}{36} + \frac{r \ln r}{r} + O\left(\frac{1}{r^2}\right) \sum_{j=1}^r \epsilon_{ij}^2$$

Knowing this, consider the case of good rows with $\epsilon_{ij} < 0$:

$$\left(\frac{1}{r^2} + \epsilon_{ij}\right) \ln(1 + r^2 \epsilon_{ij}) \geq \epsilon_{ij} + \frac{1}{2r^2} (r^2 \epsilon_{ij})^2 = \epsilon_{ij} + \frac{r^2 \epsilon_{ij}^2}{2}$$
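This bound for negative noise can be spot-checked numerically: substituting $x = r^2 \epsilon_{ij} \in (-1, 0)$ and multiplying through by $r^2$ reduces it to $(1+x)\ln(1+x) \geq x + \frac{x^2}{2}$. A minimal sketch of that check (my own verification, not from the original text):

```python
import math

# For eps_ij < 0 put x = r^2 * eps_ij; when x lies in (-1, 0) the bound
# above is equivalent to (1 + x) * ln(1 + x) >= x + x^2 / 2.
def lhs(x):
    return (1 + x) * math.log1p(x)  # log1p keeps precision near x = 0

def rhs(x):
    return x + x**2 / 2

for x in [-0.99, -0.5, -0.1, -0.01]:
    assert lhs(x) >= rhs(x)
print("bound holds on the sampled points")
```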

If $\epsilon_{ij} > 0$ but $\epsilon_{ij} \leq \frac{1}{r \ln r} - \frac{1}{r^2}$, we use an estimate of the function $\varphi(x) = (1 + x) \ln(1 + x)$. It is known that $\varphi(x) > x + \frac{x^2}{2(1 + \frac{x}{3})}$ for $x > 0$. Setting $x = r^2 \epsilon_{ij}$ gives the following estimate:

$$\left(\frac{1}{r^2} + \epsilon_{ij}\right) \ln(1 + r^2 \epsilon_{ij}) \geq \epsilon_{ij} + \frac{r^2 \epsilon_{ij}^2}{2\left(1 + \frac{r^2 \epsilon_{ij}}{3}\right)}$$

Estimating the denominator, $1 + \frac{r^2 \epsilon_{ij}}{3} \leq 1 + \frac{3 r \ln r}{r}$, we obtain:

$$\left(\frac{1}{r^2} + \epsilon_{ij}\right) \ln(1 + r^2 \epsilon_{ij}) \geq \epsilon_{ij} + \frac{r^2 \epsilon_{ij}^2}{2(1 + r \ln r)}$$
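The underlying estimate $\varphi(x) > x + \frac{x^2}{2(1 + \frac{x}{3})}$ for $x > 0$ is a standard logarithm bound and can be spot-checked numerically (an illustrative sketch, not from the original text):

```python
import math

def phi(x):
    # phi(x) = (1 + x) * ln(1 + x); log1p keeps precision for small x.
    return (1 + x) * math.log1p(x)

def lower(x):
    # The known lower bound x + x^2 / (2 * (1 + x/3)), valid for x > 0.
    return x + x**2 / (2 * (1 + x / 3))

for x in [0.01, 0.1, 1.0, 10.0, 1000.0]:
    assert phi(x) > lower(x)
print("phi(x) exceeds the bound on the sampled points")
```

The two sides agree up to third order in $x$ (their difference behaves like $\frac{x^4}{36}$ near zero), which is why the resulting estimate stays tight for the small positive noise values considered here.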

Further, if $\epsilon_{ij} \in \left( \frac{1}{r \ln r} - \frac{1}{r^2},\ \frac{1}{r} - \frac{2 \ln \ln r}{\ln r} \right)$, we estimate the summand as follows:

$$\left(\frac{1}{r^2} + \epsilon_{ij}\right) \ln(1 + r^2 \epsilon_{ij}) \geq \left(\frac{1}{r^2} + \epsilon_{ij}\right) \ln\left( \frac{r \ln r}{r \ln r - r} \right)$$

Using the inequality $1 > r \epsilon_{ij} \left(1 - \frac{1}{r} - \frac{2 \ln \ln r}{\ln r}\right) - 1$:

$$\left(\frac{1}{r^2} + \epsilon_{ij}\right) \ln(1 + r^2 \epsilon_{ij}) \geq \epsilon_{ij} + r \ln r \, \epsilon_{ij} \left(1 + \frac{2 \ln \ln r}{\ln r} + O\left(\frac{\ln \ln r}{\ln r}\right)^2 \left(1 - \frac{\ln \ln r}{\ln r}\right)\right)$$

Thus, for a good row with sufficiently large $r$ we get:

$$h_i(\gamma(\Upsilon)) - k_i(\gamma(\Upsilon)) \geq \frac{r^2}{4} \sum_{j:\, \epsilon_{ij} < 0} \epsilon_{ij}^2 + \frac{r}{2} \sum_{j:\, \epsilon_{ij} > 0} \epsilon_{ij}^2$$

The rows in the matrix remain good even at a high noise level $\epsilon_{ij}$ (after several operations). A good row means that the hypergraph is stable.

The inequality also shows that, for good rows of the matrix, the difference between the first element $h_i(\gamma(\Upsilon))$ and the second element $k_i(\gamma(\Upsilon))$ (at the time of recalculation) is, for large $r$, bounded from below by the weighted sum of squared noise $\frac{r^2}{4} \sum_{j:\, \epsilon_{ij} < 0} \epsilon_{ij}^2 + \frac{r}{2} \sum_{j:\, \epsilon_{ij} > 0} \epsilon_{ij}^2$.
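For illustration, that lower bound can be computed for a concrete noise row ($h_i$ and $k_i$ themselves depend on the hypergraph construction and are not reproduced here); the function name below is hypothetical:

```python
def stability_lower_bound(eps_row, r):
    # (r^2 / 4) * sum of squared negative noise terms
    # + (r / 2)   * sum of squared positive noise terms,
    # i.e. the proven lower bound on h_i - k_i for a good row.
    neg = sum(e**2 for e in eps_row if e < 0)
    pos = sum(e**2 for e in eps_row if e > 0)
    return (r**2 / 4) * neg + (r / 2) * pos

print(round(stability_lower_bound([-0.01, 0.02, 0.0], 100), 10))  # 0.27
```

Because the negative-noise terms are scaled by $\frac{r^2}{4}$ rather than $\frac{r}{2}$, they dominate the bound for large $r$, matching the asymmetric treatment of the two noise signs in the derivation above.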

A large value of $r$ allows the hypergraph system to remain stable, as proven above.