Does $E(X^2) - [E(X)]^2$ also work for sample variance or does it just work for population variance?
What's your question exactly?
If you assume the random variable $X$ follows a probability density function $p(x)$, and this density function is an accurate characterization of the population frequency distribution, then
$E(X^2) - [E(X)]^2 = \sigma^2$,
where $\sigma^2$ is the population variance. If $p(x)$ is not an accurate characterization, then
$E(X^2) - [E(X)]^2 \neq \sigma^2$.
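As a quick numerical sketch (Python with numpy; my own example, not from the thread): for a fair six-sided die the full population distribution is known, so the identity $E(X^2) - [E(X)]^2$ reproduces the population variance exactly.

```python
import numpy as np

# Fair six-sided die: p(x) = 1/6 for x = 1..6 characterizes the whole population.
values = np.arange(1, 7)
p = np.full(6, 1 / 6)

ex = np.sum(values * p)        # E(X)   = 3.5
ex2 = np.sum(values**2 * p)    # E(X^2) = 91/6
var_identity = ex2 - ex**2     # E(X^2) - [E(X)]^2

# Population variance computed directly from its definition E[(X - mu)^2]
var_direct = np.sum((values - ex) ** 2 * p)
```

Both quantities come out to $35/12 \approx 2.9167$, illustrating that the identity holds whenever the assumed density really is the population distribution.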
Now assume we take a random sample
$X_1, X_2, \ldots, X_n$
from the population. Because this is a random sample, the observations are iid (independent and identically distributed), that is,
$E(X_i) = \mu$
and
$\mathrm{Var}(X_i) = \sigma^2$.
Now the question is: how do we estimate $\sigma^2$? In other words, we need to find a point estimator. With that, we also need criteria to assess the "goodness" of our point estimator; there are many favourable properties one wants a good estimator to satisfy: unbiasedness, consistency, sufficiency, efficiency, etc. One can take a guess and estimate $\sigma^2$ with the following expression:
$\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2$
However, $\hat{\sigma}^2$ is itself a random variable, and it can be shown that
$E(\hat{\sigma}^2) \neq \sigma^2$,
i.e., it is a biased estimator. It can also be easily shown that an unbiased estimator of $\sigma^2$ is in fact given by:
$S^2 = \frac{1}{n-1}\sum_{i=1}^{n}(X_i - \bar{X})^2$
that is,
$E(S^2) = \sigma^2$.
We define $S^2$ to be the sample variance. In fact it can be shown that $S^2$ is not only unbiased but also consistent; under normality it is also sufficient and efficient.
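A small simulation (numpy sketch of my own, not part of the original answer, assuming normal data) makes the bias visible: averaged over many samples, the $1/n$ estimator falls short of $\sigma^2$ by the factor $(n-1)/n$, while the $1/(n-1)$ version centres on $\sigma^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma2, reps = 5, 4.0, 200_000  # small n makes the bias pronounced

# reps independent samples of size n from N(0, sigma2)
samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
xbar = samples.mean(axis=1, keepdims=True)
ss = ((samples - xbar) ** 2).sum(axis=1)  # sum of squared deviations

biased_mean = (ss / n).mean()          # approximates E[sigma_hat^2] = (n-1)/n * sigma2 = 3.2
unbiased_mean = (ss / (n - 1)).mean()  # approximates E[S^2] = sigma2 = 4.0
```

With `n = 5` the biased average lands near $3.2$ and the unbiased one near $4.0$, matching $E(\hat{\sigma}^2) = \frac{n-1}{n}\sigma^2$ and $E(S^2) = \sigma^2$.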
With regards to the above posts,
$E(\bar{X}^2) - [E(\bar{X})]^2$
is not the sample variance. That expression is the variance of the sample mean (which is itself a function of random variables, hence why we can compute its variance).
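To see the distinction numerically (again a sketch in numpy, assuming normal data; my own example): the variance of the sample mean shrinks like $\sigma^2/n$, while the sample variance $S^2$ targets $\sigma^2$ itself.

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma2, reps = 10, 4.0, 200_000

samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))

# Variance of the sample mean: Var(X_bar) = sigma2 / n = 0.4
means = samples.mean(axis=1)
var_of_mean = means.var()

# Average sample variance: E[S^2] = sigma2 = 4.0 (ddof=1 gives the 1/(n-1) form)
avg_sample_var = samples.var(axis=1, ddof=1).mean()
```

The two numbers differ by roughly a factor of $n$, which is exactly the point: $\mathrm{Var}(\bar{X}) = \sigma^2/n$ is a different quantity from the sample variance $S^2$.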