Does anyone know how probability inequalities work? E.g. what can we say about Pr(X = 3) given Pr(X > 1)?
In classical probability theory, there are a few very famous inequalities.
1. Chebyshev's (Tchebysheff's) inequality:
$P(|X - \mu| \geq k\sigma) \leq \frac{1}{k^2}$
for ANY random variable $X$ with mean $\mu$ and s.d. $\sigma$ (and any $k > 0$). I'll let you see why this inequality is so famous (and powerful...)
2. Markov inequality:
$P(X \geq a) \leq \frac{E(X)}{a}$
for any non-negative random variable $X$ with expectation $E(X)$ and any $a > 0$. Exercise: try to prove 1. from this (hint: apply it to the non-negative random variable $(X - \mu)^2$, whose expectation is the variance...).
3. Gaussian inequality:
$P(|Z| \geq \epsilon) \leq \frac{2\exp(-\epsilon^2/2)}{\epsilon}$
for a standard normal random variable $Z$ and any $\epsilon > 0$. (A quick numerical check of all three bounds is sketched just below.)
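To make these concrete, here is a minimal Python sketch (my own illustration, not from the post above) that compares Monte Carlo estimates of the tail probabilities against the three bounds. The choices of distribution, sample size, and thresholds are arbitrary, picked just to show the bounds holding.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # number of Monte Carlo samples (arbitrary choice)

# --- Markov: P(X >= a) <= E[X] / a, for non-negative X ---
# Use an Exponential(1) sample as an example non-negative random variable.
x = rng.exponential(scale=1.0, size=n)
a = 3.0
print("Markov:    P(X >= a)            ~", (x >= a).mean(),
      "  bound:", x.mean() / a)

# --- Chebyshev: P(|X - mu| >= k*sigma) <= 1/k^2, for any X ---
k = 2.0
mu, sigma = x.mean(), x.std()
print("Chebyshev: P(|X-mu| >= k*sigma) ~", (np.abs(x - mu) >= k * sigma).mean(),
      "  bound:", 1 / k**2)

# --- Gaussian tail: P(|Z| >= eps) <= 2*exp(-eps^2/2)/eps, Z ~ N(0,1) ---
z = rng.standard_normal(n)
eps = 2.0
print("Gaussian:  P(|Z| >= eps)        ~", (np.abs(z) >= eps).mean(),
      "  bound:", 2 * np.exp(-eps**2 / 2) / eps)
```

In each case the empirical tail probability comes out well below the bound (e.g. for the Gaussian line, roughly 0.046 versus a bound of about 0.135), which is the point: these inequalities are crude but hold with essentially no assumptions beyond the ones stated.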
There are literally hundreds more... however the majority of them cannot be stated (at least "accurately") using classical probability, and this is where the power of measure theory comes in.
All of 1., 2. and 3. have even more formal statements under measure-theoretic probability (something I urge students to read up on if you're considering a path down probability).