When you calibrate a calorimeter, you put in a known amount of energy and then measure the change in temperature of the calorimeter.
If you put in 100 kJ and the temperature of the calorimeter increases by 10 degrees Celsius, you can say, "Right, well, averaging it out, it looks like I need to put in 10 kJ of energy to raise the temperature by 1 degree Celsius."
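To make that concrete, here is a minimal sketch of the calibration step in Python, using the figures from the example above; the variable names are purely illustrative, not from any particular library or standard:

```python
energy_in_kj = 100.0  # known energy put into the calorimeter (kJ)
temp_rise_c = 10.0    # measured temperature rise (degrees Celsius)

# Calibration factor: energy needed per degree of temperature rise
calibration_factor = energy_in_kj / temp_rise_c  # kJ per degree Celsius

print(f"Calibration factor: {calibration_factor} kJ/°C")  # 10.0 kJ/°C
```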
Now, imagine we put some amount of methane into the calorimeter and burnt it. Combustion of methane is an exothermic reaction, so the temperature of the calorimeter went up 5 degrees Celsius.
Now, if it took 10 kJ of energy to raise the calorimeter's temperature by 1 degree Celsius, and your combustion of methane raised the temperature by 5 degrees Celsius, then you, being a brilliant chemistry student, could work out that the reaction caused five times the temperature change and hence released five times the energy. The reaction must have released 50 kJ.
So the calibration factor tells you how much energy you need to raise the temperature of the calorimeter by 1 degree Celsius. Multiply the observed change in temperature by the calibration factor to work out how much energy it took to produce that change.
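Pulling that together, here is a small Python sketch of the general rule, applied to the hypothetical methane figures from above (the function name is my own, purely for illustration):

```python
def energy_from_temp_change(calibration_factor_kj_per_c: float, delta_t_c: float) -> float:
    """Energy (kJ) implied by an observed temperature change.

    calibration_factor_kj_per_c: kJ needed to raise the calorimeter by 1 degree Celsius
    delta_t_c: observed temperature change in degrees Celsius
    """
    return calibration_factor_kj_per_c * delta_t_c

# Methane example: 10 kJ/°C calibration factor, 5 °C temperature rise
print(energy_from_temp_change(10.0, 5.0))  # 50.0 kJ released
```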