Solve for p
ClimbTooHigh's method is very nice and I'll elaborate further on the interesting aspects of it.
Basically, what he did was find the maximum likelihood estimate of the parameter p of the binomial distribution.
What that means is that he selected the value of p which maximises the probability of the observed data. This likelihood approach is very flexible and can be applied to a range of distributions. Consider the normal distribution: one can maximise the likelihood jointly with respect to mu and sigma. The difference there is that you are maximising with respect to two parameters, so you need to take partial derivatives with respect to each and solve the resulting equations simultaneously.
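As a sketch of that two-parameter case (these are standard textbook results, not part of ClimbTooHigh's post): for a normal sample of size m, the log-likelihood is
\ell(\mu, \sigma^2) = -\frac{m}{2} \ln(2\pi\sigma^2) - \frac{1}{2\sigma^2} \sum_{i=1}^{m} (x_i - \mu)^2
and setting \frac{\partial \ell}{\partial \mu} = 0 and \frac{\partial \ell}{\partial \sigma^2} = 0 simultaneously gives
\hat{\mu} = \frac{1}{m} \sum_{i=1}^{m} x_i, \qquad \hat{\sigma}^2 = \frac{1}{m} \sum_{i=1}^{m} (x_i - \hat{\mu})^2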
One can actually show that the maximum likelihood estimator of the parameter p in a binomial distribution is always given by
\hat{p} = \frac{\sum_{i=1}^{m} x_i}{mn}
where m is the sample size and n is the number of trials per observation. This is derived as follows:
Assume we have a sample X_1, \dots, X_m of independent and identically distributed Binomial(n, p) random variables.
The pmf is thus given by
p(x) = \binom{n}{x} p^x (1-p)^{n-x}.
We now construct what is known as the likelihood function, which is a function of the parameter p in this case:
L(p) = \prod_{i=1}^{m} p(x_i) = \prod_{i=1}^{m} \binom{n}{x_i} p^{x_i} (1-p)^{n-x_i}
Now if we take logs of both sides we get:
\ln L(p) = \sum_{i=1}^{m} \ln \binom{n}{x_i} + \ln(p) \sum_{i=1}^{m} x_i + \ln(1-p) \sum_{i=1}^{m} (n - x_i)
Now take the first derivative with respect to p:
\frac{d}{dp} \ln L(p) = \frac{1}{p} \sum_{i=1}^{m} x_i - \frac{1}{1-p} \sum_{i=1}^{m} (n - x_i)
Setting this to 0 and solving for p yields:
\frac{1}{p} \sum_{i=1}^{m} x_i = \frac{1}{1-p} \sum_{i=1}^{m} (n - x_i) \;\Longrightarrow\; (1-p) \sum_{i=1}^{m} x_i = p \left( mn - \sum_{i=1}^{m} x_i \right) \;\Longrightarrow\; \hat{p} = \frac{\sum_{i=1}^{m} x_i}{mn}
as required.
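As a quick numerical sanity check of the closed form above (my own illustrative sketch in Python, with made-up values for n, p and m, not part of the original argument), one can simulate binomial data and compare a brute-force maximiser of the log-likelihood against \sum x_i / (mn):

import numpy as np
from scipy.stats import binom

# Illustrative sketch: simulate m Binomial(n, p) observations and check that the
# log-likelihood is maximised (to grid precision) at sum(x_i) / (m * n).
rng = np.random.default_rng(0)
n, p_true, m = 10, 0.3, 500          # trials per observation, true p, sample size
x = rng.binomial(n, p_true, size=m)  # the sample x_1, ..., x_m

def log_likelihood(p):
    # ln L(p) = sum_i ln C(n, x_i) + ln(p) * sum_i x_i + ln(1 - p) * sum_i (n - x_i)
    return binom.logpmf(x, n, p).sum()

grid = np.linspace(0.001, 0.999, 9999)
p_grid = grid[np.argmax([log_likelihood(p) for p in grid])]
p_closed_form = x.sum() / (m * n)
print(p_grid, p_closed_form)  # both should be close to the true value 0.3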
Furthermore, it is interesting to note that if we take X to be a Bernoulli random variable (the n = 1 case), then the maximum likelihood estimator for p is simply the sum of the Bernoulli observations divided by the sample size, i.e. the sample mean, which makes intuitive sense too. It is also easily shown that in this case the estimator is unbiased, consistent and efficient; all in all, it is the "best" estimator for p.
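To illustrate the Bernoulli special case (again just a hypothetical sketch with an arbitrary true p): the MLE is the sample mean, and averaging it over many repeated samples stays very close to the true p, in line with unbiasedness:

import numpy as np

# Illustrative sketch of the Bernoulli (n = 1) case: the MLE is the sample mean,
# and its average over many repeated samples stays close to the true p,
# consistent with unbiasedness.
rng = np.random.default_rng(1)
p_true, m, reps = 0.3, 200, 20000
estimates = rng.binomial(1, p_true, size=(reps, m)).mean(axis=1)  # one MLE per simulated sample
print(estimates.mean())  # close to 0.3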
Note that a maximum likelihood estimator does not always exist (or have a closed form) for every parameter of every distribution, but when it does, this method is a powerful way of obtaining consistent (though not always unbiased) estimators.
You can read more about it here:
http://en.wikipedia.org/wiki/Maximum_likelihood