The cent is a measure of the ratio of two frequencies M and N, given by 1200 log_{2} (N/M).
The measure was developed by taking an equal tempered chromatic semitone and dividing it into 100 equal intervals on a logarithmic scale. Thus, since there are 12 semitones in an octave, there are 1200 cents in the octave. More precisely, if A = 440 Hz, then the A an octave higher is 2 * 440 Hz = 880 Hz, and the number of cents between these two frequencies is 1200 log_{2} (880 / 440) = 1200.
Take for example A = 440 Hz and the corresponding equal tempered A# of approximately 466.16 Hz, and compute 1200 log_{2} (466.16 / 440) ≈ 100. Thus, an equal tempered semitone is 100 cents.
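The computations above can be sketched in a few lines of Python (the function name `cents` is ours, chosen for illustration):

```python
import math

def cents(m, n):
    """Interval in cents between frequencies m and n: 1200 * log2(n / m)."""
    return 1200 * math.log2(n / m)

a4 = 440.0
print(cents(a4, 2 * a4))                 # octave: 1200 cents
print(cents(a4, a4 * 2 ** (1 / 12)))     # equal tempered semitone: 100 cents
```

Note that the exact ratio 2^(1/12) is used for the semitone rather than the rounded 466.16 Hz, so the result comes out as exactly 100 cents up to floating-point error.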
For a more interesting example, consider that when using Pythagorean tuning we compute D# = 313.24 Hz and Eb = 309.03 Hz (see also Frequency of notes). Then the difference between D# and Eb expressed in cents is 1200 log_{2} (313.24 / 309.03) ≈ 23.46 cents, which is approximately a quarter of a semitone (25 cents). (This difference creates an interval in Pythagorean tuning between D# and Eb which is out of tune and is called a wolf interval.)
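A sketch of where this gap comes from, assuming the standard Pythagorean chain-of-fifths derivation with C taken a major sixth (27/16) below A = 440 Hz; the `cents` helper is defined here for illustration:

```python
import math

def cents(m, n):
    """Interval in cents between frequencies m and n."""
    return 1200 * math.log2(n / m)

# Pythagorean C below A = 440 Hz (A sits 27/16 above C)
c = 440 * 16 / 27                        # ~260.74 Hz

d_sharp = c * (3 / 2) ** 9 / 2 ** 5      # nine fifths up, five octaves down: ~313.24 Hz
e_flat = c * 32 / 27                     # three fifths down, two octaves up:  ~309.03 Hz

print(round(cents(e_flat, d_sharp), 2))  # ~23.46 cents, the Pythagorean comma
```

The gap of about 23.46 cents is the Pythagorean comma, the ratio 531441/524288 left over because twelve pure fifths overshoot seven octaves.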
The cent uses a logarithmic measure because pitch is perceived logarithmically: the ear hears equal frequency ratios, not equal frequency differences, as equal musical intervals (and frequencies that sound good together are related by small whole-number ratios; see Harmonics). The cent is fine-grained enough to capture changes in frequency that the human ear can perceive. Exactly how sensitive the ear is varies with the listener and the context, but a just-noticeable difference of 5-15 cents is usually quoted.