by cbro
Last Updated January 14, 2018 08:19 AM

In this paper → http://www.math.tau.ac.il/~nogaa/PDFS/akv3.pdf, on **page 5, "3. Concluding Remarks"**, I am trying to make sense of the second point, which starts with "Our estimation from Theorem 1 is sharp...". I don't quite see how he arrives at "the probability that $\lambda_1$ (the largest eigenvalue) exceeds its median by $t$ is at least $\Omega(e^{-O(t^2)})$".

This is what I have **understood** so far: he gives an **upper bound** on $E(\lambda_1)$ and a **lower bound** on $\lambda_1$, which together give the **smallest deviation from the mean**. But then $t$ is just a deviation parameter... how does he compute the probability that this least difference exceeds $t$? And where does the $e^{-(\cdot)}$ term come from? Does this have something to do with the relation between the binomial and the normal distribution? Is he measuring $t$ in terms of how many standard deviations it lies from the mean?
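To get a feel for the claimed tail behaviour, here is a minimal numerical sketch (not from the paper; the graph size, number of trials, and $t$ values are my own choices for illustration): sample adjacency matrices of $G(n, 1/2)$, compute $\lambda_1$ for each, and estimate $P(\lambda_1 \ge \text{median} + t)$ empirically for a few values of $t$.

```python
import numpy as np

rng = np.random.default_rng(0)

def lambda1_gnp(n, p, rng):
    """Largest eigenvalue of the adjacency matrix of one G(n, p) sample."""
    upper = rng.random((n, n)) < p          # independent edge indicators
    a = np.triu(upper, 1)                   # keep strict upper triangle
    adj = (a | a.T).astype(float)           # symmetrize: undirected graph
    return np.linalg.eigvalsh(adj)[-1]      # eigvalsh returns sorted values

n, p, trials = 200, 0.5, 300
samples = np.array([lambda1_gnp(n, p, rng) for _ in range(trials)])
med = np.median(samples)

# Empirical upper-tail probabilities P(lambda_1 >= median + t).
# If the tail really decays like exp(-Theta(t^2)), these should fall off
# roughly Gaussian-fast as t grows.
for t in (0.0, 0.5, 1.0, 1.5):
    prob = np.mean(samples >= med + t)
    print(f"t = {t:.1f}: empirical P(lambda_1 >= median + t) = {prob:.3f}")
```

The point of the experiment is only qualitative: $\lambda_1$ concentrates tightly around its median (roughly $np$ here), and the empirical tail probabilities drop off quickly in $t$, consistent with a sub-Gaussian-looking tail; it does not by itself distinguish the matching lower bound $\Omega(e^{-O(t^2)})$ from a faster decay.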
