This might be a really bad question, but can anyone briefly explain the difference between using E(X) = SUM[x*p(x)] with Variance = E(X^2) - [E(X)]^2, as opposed to E(X) = np with Variance = np(1-p)?
Aren't these both used for discrete random variables? When do you use which?
No! It is a very good question to ask!
We can prove that the expectation of a binomial random variable is indeed equal to $np$. There are many ways to prove this: we can consider a binomial random variable as a sum of Bernoulli random variables (a proof I'll omit), or we can apply the direct definition of expectation. Consider the following argument:

Let $Y$ be a binomial random variable. By definition,
$$E(Y) = \sum_{y} y\,p(y)$$
where $p(y)$ denotes $P(Y = y)$, so
$$E(Y) = \sum_{y=0}^n y\binom{n}{y}p^y q^{n-y}$$
where $q = 1 - p$. Expanding the binomial coefficient and dropping the vanishing $y = 0$ term,
$$E(Y) = \sum_{y=0}^n y\,\frac{n!}{(n-y)!\,y!}p^y q^{n-y} = \sum_{y=1}^n \frac{n!}{(n-y)!\,(y-1)!}p^y q^{n-y}.$$
Now we use this little trick: let $z = y - 1$ and factor out $np$:
$$E(Y) = np\sum_{y=1}^n \frac{(n-1)!}{(n-y)!\,(y-1)!}p^{y-1}q^{n-y} = np\sum_{z=0}^{n-1} \frac{(n-1)!}{(n-1-z)!\,z!}p^z q^{n-1-z} = np\sum_{z=0}^{n-1}\binom{n-1}{z}p^z q^{n-1-z}.$$
Now note that from the axioms of a discrete random variable, the remaining sum is a binomial$(n-1, p)$ PMF summed over its whole support, so
$$\sum_{z=0}^{n-1}\binom{n-1}{z}p^z q^{n-1-z} = 1.$$
Thus
$$E(Y) = np$$
as required.
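As a quick numerical sanity check (not part of the proof), here is a short Python snippet with illustrative values $n = 10$, $p = 0.3$ that evaluates the defining sum directly and compares it to $np$:

```python
from math import comb

# Illustrative parameters (any n >= 1 and 0 <= p <= 1 work).
n, p = 10, 0.3
q = 1 - p

# Expectation from the direct definition: E(Y) = sum over y of y * P(Y = y).
E_direct = sum(y * comb(n, y) * p**y * q**(n - y) for y in range(n + 1))

print(E_direct, n * p)  # both equal 3.0 up to floating-point rounding
```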
We can also show that the variance of $Y$ is equal to $np(1-p)$. Again, expressing $Y$ as a sum of Bernoulli random variables would make this easy to show, but I'll apply the direct definition of variance, as asked in your question.

Note that
$$E(Y^2) = \sum_{y=0}^n y^2\binom{n}{y}p^y q^{n-y} = \sum_{y=0}^n y^2\,\frac{n!}{y!\,(n-y)!}p^y q^{n-y}.$$
Now note that
$$E(Y(Y-1)) = E(Y^2 - Y) = E(Y^2) - E(Y).$$
Thus
$$E(Y^2) = E(Y(Y-1)) + E(Y),$$
and
$$E(Y(Y-1)) = \sum_{y=0}^n y(y-1)\,\frac{n!}{y!\,(n-y)!}p^y q^{n-y} = \sum_{y=2}^n \frac{n!}{(y-2)!\,(n-y)!}p^y q^{n-y}.$$
Now we apply the little trick again: let $z = y - 2$ and factor out $n(n-1)p^2$:
$$E(Y(Y-1)) = n(n-1)p^2 \sum_{y=2}^n\frac{(n-2)!}{(y-2)!\,(n-y)!}p^{y-2}q^{n-y} = n(n-1)p^2 \sum_{z=0}^{n-2}\frac{(n-2)!}{z!\,(n-2-z)!}p^z q^{n-2-z} = n(n-1)p^2\sum_{z=0}^{n-2}\binom{n-2}{z}p^z q^{n-2-z}.$$
Again note that the remaining sum is a binomial$(n-2, p)$ PMF summed over its whole support, so it equals $1$. Thus we have shown
$$E(Y(Y-1)) = n(n-1)p^2.$$
Now
$$E(Y^2) = n(n-1)p^2 + np,$$
so
$$V(Y) = E(Y^2) - E(Y)^2 = n(n-1)p^2 + np - (np)^2 = np(1-p)$$
as required.
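The same $E(Y(Y-1)) + E(Y) - E(Y)^2$ route can be checked numerically. A small Python sketch, again with the illustrative values $n = 10$, $p = 0.3$:

```python
from math import comb

# Illustrative parameters.
n, p = 10, 0.3
q = 1 - p

# The binomial PMF, P(Y = y) for y = 0, ..., n.
pmf = [comb(n, y) * p**y * q**(n - y) for y in range(n + 1)]

E_Y = sum(y * pmf[y] for y in range(n + 1))              # E(Y)
E_Yfac = sum(y * (y - 1) * pmf[y] for y in range(n + 1))  # E(Y(Y-1))

# V(Y) = E(Y^2) - E(Y)^2, using E(Y^2) = E(Y(Y-1)) + E(Y).
var = E_Yfac + E_Y - E_Y**2

print(var, n * p * (1 - p))  # both equal 2.1 up to floating-point rounding
```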
E(X) = np and Variance = np(1-p) are used for the binomial distribution, when you only have 2 probabilities: one is success and one is failure. They don't affect each other, so you can take Pr(success) = p and Pr(failure) = 1-p.
But E(X) = SUM[x*p(x)] and Variance = E(X^2) - [E(X)]^2 are used for more than 2 probabilities.
You're almost right, but what do you mean by "2 probabilities"? I think a better word to use instead of "probabilities" is 2 "outcomes" or 2 "events". More formally, a binomial experiment consists of $n$ independent trials, each with 2 possible outcomes; on the other hand, there could be infinitely many probabilities attached to the occurrence of those outcomes. In fact, the expressions $E(Y) = np$ and $V(Y) = np(1-p)$ ARE the same as the general definitions of expectation and variance: you can use them interchangeably, as I have shown above. It is just that the expression $np$ is easier to compute than going through the entire derivation each time. Remember, the means and variances of all PMFs and PDFs can be derived from the general definitions (except in the rare cases where they don't exist, but that's beyond high-school level).
Now I will show an easier way to derive the mean and variance of a binomial random variable.

Consider the Bernoulli random variable $X_i$, where $X_i = 1$ if the $i$-th trial is a success and $X_i = 0$ if it is a failure, so that $E(X_i) = p$.
If we consider a binomial random variable $Y$ as a sum of these Bernoulli random variables, that is
$$Y = \sum_{i=1}^n X_i,$$
then by linearity of expectation
$$E(Y) = E\!\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n E(X_i) = np$$
as required.

I will leave deriving the variance from the sum of the Bernoulli random variables as an exercise.
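The Bernoulli-sum viewpoint also lends itself to simulation. A minimal sketch (the sample size 200,000 and the parameters $n = 10$, $p = 0.3$ are illustrative choices, not from the derivation above) that draws each $Y$ as a sum of $n$ independent Bernoulli trials and checks the sample mean and variance against $np$ and $np(1-p)$:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

n, p = 10, 0.3        # illustrative binomial parameters
trials = 200_000      # number of simulated values of Y

# Each sample of Y is a sum of n independent Bernoulli(p) indicators X_i.
samples = [sum(1 for _ in range(n) if random.random() < p)
           for _ in range(trials)]

mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / trials

print(mean, var)  # close to np = 3.0 and np(1-p) = 2.1
```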
