There's a bit of misinformation and a few misconceptions in this thread; hopefully I can clarify a few things.
An IQ score is just a standardised score. If you know what a 'z-score' is, then you'll have no difficulty understanding IQ. IQ is standardised to a mean of 100 and a standard deviation of 15. In other words, if your IQ is 100, then you're bang on the average. If your IQ score is 115, then you're 1 standard deviation above the average. If your IQ score is 80 then you're 1.33 standard deviations below the average and so forth.
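If it helps to see that arithmetic spelled out, here's a minimal sketch in Python (the function names are just mine, for illustration):

```python
# Minimal sketch: converting between IQ scores and z-scores,
# assuming the usual mean of 100 and standard deviation of 15.
MEAN, SD = 100, 15

def iq_to_z(iq):
    """How many standard deviations above or below the average an IQ is."""
    return (iq - MEAN) / SD

def z_to_iq(z):
    """Convert a z-score back onto the IQ scale."""
    return MEAN + SD * z

print(iq_to_z(115))  # 1.0   -> one standard deviation above average
print(iq_to_z(80))   # -1.33 -> about 1.33 standard deviations below
print(z_to_iq(2))    # 130.0
```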
Because an IQ score is just a standardised score, you can actually create an IQ score out of any test (or indeed any instrument) that gives you a score and has a comparison group with a known mean and standard deviation. But IQ scores are most commonly used as a means to quantify intelligence.
Like most human characteristics, intelligence is normally (bell-curve) distributed. This means that we know a lot about its distribution in the population. For the details, google "bell curve", but some basic facts are that about 68% of us fall within one standard deviation either side of the mean, and about 95% of us fall within two standard deviations. In practical terms, this means that most of us are close to the average, and the further away from the average you move, in either direction, the rarer that level of intelligence is in the population. A person at the 98th percentile would have an IQ of about 130, and an IQ above 135 puts you at roughly 1 in 100, with that rarity increasing sharply as the score climbs.
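If you'd like to verify those figures yourself, here's a quick sketch using Python's scipy (my assumption; any stats package with the normal distribution will do):

```python
# Rough check of the bell-curve figures quoted above (illustrative only).
from scipy.stats import norm

iq = norm(loc=100, scale=15)  # IQ scale: mean 100, SD 15

within_1_sd = iq.cdf(115) - iq.cdf(85)   # ~0.68 -> about 68% of people
within_2_sd = iq.cdf(130) - iq.cdf(70)   # ~0.95 -> about 95% of people

pct_at_130 = iq.cdf(130) * 100           # ~97.7th percentile
rarity_135 = 1 / (1 - iq.cdf(135))       # roughly 1 person in ~100

print(within_1_sd, within_2_sd, pct_at_130, rarity_135)
```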
If you're keen to see the exact percentile associated with a specific IQ, you can enter the following formula into Excel:
=NORM.DIST(x, 100, 15, TRUE)*100
Replace x with the IQ you want to test.
If you're keen to see the exact IQ given a percentile, you can use the following formula in Excel:
=NORM.INV(x, 100, 15)
Replace x with a proportion (e.g., 0.95 for the 95th percentile).
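And if Excel isn't your thing, the same two calculations can be done in Python; here's a sketch assuming scipy is installed:

```python
# Python equivalents of the two Excel formulas above (assumes scipy).
from scipy.stats import norm

def iq_to_percentile(iq_score):
    """Same idea as =NORM.DIST(x, 100, 15, TRUE)*100 in Excel."""
    return norm.cdf(iq_score, loc=100, scale=15) * 100

def percentile_to_iq(p):
    """Same idea as =NORM.INV(x, 100, 15); p is a proportion, e.g. 0.95."""
    return norm.ppf(p, loc=100, scale=15)

print(iq_to_percentile(130))   # ~97.7th percentile
print(percentile_to_iq(0.95))  # ~124.7
```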
Now, there are still two problems. The first relates to the fact that an "IQ score" can be derived, in principle, from any measure. E.g., if you know the average person's height and the standard deviation of height in the population, then you could calculate your 'height IQ' very easily. The implication of this is that any test that yields a score can purport to be an 'IQ test'. And in practice there are many different tests out there: different tests measure different characteristics, and many don't actually measure anything useful at all. But provided they give a score, all of them can purport to give you an "IQ score". If you're interested in learning about your actual intelligence, you need to sit a genuine, standardised psychometric test of general intelligence. My best advice is to speak with a professional psychologist if you're interested.
The second problem relates to the fact that an IQ score is relative, not absolute. In other words, your IQ score tells you where you sit in relation to a large group of people. But, unless that chosen group of people is a sensible comparison group, your IQ score won't mean much.
E.g., I am 6 foot 2. This would give me a very high 'height IQ' if you were to compare me to a large group of professional jockeys. But it would give me a very low 'height IQ' if you were to compare me to a large group of professional basketballers.
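To make that concrete, here's a toy calculation (the group means and standard deviations below are invented, purely for illustration):

```python
# Illustrative only: the same raw height gets a very different 'height IQ'
# depending on which comparison group you standardise against.
def standard_score(value, group_mean, group_sd):
    """Express a raw value on an IQ-style scale (mean 100, SD 15),
    relative to a chosen comparison group."""
    return 100 + 15 * (value - group_mean) / group_sd

my_height_cm = 188  # roughly 6 foot 2

# Made-up comparison groups:
print(standard_score(my_height_cm, group_mean=160, group_sd=5))  # vs jockeys: ~184
print(standard_score(my_height_cm, group_mean=200, group_sd=8))  # vs basketballers: ~77.5
```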
The same principle applies to IQ testing. If you are an adult and compare your test score to the scores observed in a sample of 5th graders, you'll probably come off looking like a genius, but if you compare your score to a bunch of Albert Einsteins, then you'll come off looking like a dumbass.
So, when sitting a test, it is vital that your score gets compared to scores observed in a group of people who are similar to you.