I agree that with ever-increasing temperatures, it becomes more and more likely that new high-temperature records will be set.

My idea was to figure out the average number of years since a new temperature record was set, take that average over all 365 days of the year from one weather station, and do a comparison. When I have time, maybe I will do that.

Each year has an equal probability of being the record-holder.

I don't agree with that assumption. With climate change, if, say, it is getting wetter, then the probability of a new record gets higher as each year passes, and it diminishes if it is getting drier.

The year that records began had lots of broken records. What a bumper year that was!

Even a low value still counted as the new record, so it was easy to break. As the record values increase, they become harder and harder to break, even if there is no trend in the climate.

If, let's say, the rainfall in any year follows a normal distribution, then you can estimate the chance of getting a high value in one particular year. Say that value is the new record. Then, in subsequent years, you have to get a value beyond that one in order to set a new record. The 'tail' of the distribution beyond the record will diminish over time.
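The point above can be checked by simulation. This is only a sketch: the mean, standard deviation, series length, and trial count below are assumed for illustration, not taken from the thread. With no trend, year k should set a new record about 1/k of the time, so record-setting becomes rarer as the series grows.

```python
import random

# Sketch of the i.i.d.-normal argument: draw N years of "rainfall" from
# the same normal distribution (assumed mean/sd) and count how often
# year k sets a new record, over many trials.
random.seed(1)
N, TRIALS = 20, 100_000
record_counts = [0] * N

for _ in range(TRIALS):
    best = float("-inf")
    for year in range(N):
        value = random.gauss(100.0, 15.0)  # assumed parameters; any values work
        if value > best:
            best = value
            record_counts[year] += 1

# With no trend, the record rate for year k should be close to 1/k.
for year in (1, 2, 5, 10, 20):
    rate = record_counts[year - 1] / TRIALS
    print(f"year {year:2d}: record rate {rate:.3f} (1/{year} = {1 / year:.3f})")
```

Note that the normal shape doesn't actually matter here: for any continuous distribution with no trend, the k-th draw is the largest so far with probability 1/k.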

Bob

Let's say that there are records going back N years. The probability of each year being the one that holds the record is 1/N.

So, counting the record year itself, the average number of years since the latest record (which is of course the overall record) is

(1/N) * sum over k from 1 to N of (N - k + 1) = (N + 1)/2.

I generated 65 random numbers between 1 and 1300 inclusive and did this 50 times. The average number of numbers since the all-time high number was 30.28, which is close to the (65 + 1)/2 = 33 we would expect.

Let's say you have temperature records going back 130 years; the average number of years since the all-time high record would then be 65.5. To me this seems counter-intuitive, because it seems like later years would be more likely to hold the all-time high record.

What are your possible sequences for 3 years?

This is kind of a strange problem because we are dealing with something varying around an average that is unknown.

True, the setting of 50% is somewhat arbitrary, but in the first year that there is a record, we assume that the rainfall (or temperature) is just as likely to be above as below the average, since we don't know what the average is.

because each of the following 4 years has a 50% chance of breaking it or not.

I'm comparing the following 4 years only to the first year.

There are 2 possible sequences for the first 2 years, with only one sequence having the 2nd year setting a new record. So we have a probability of 1/2 for the 2nd year setting a new record. Similar reasoning shows that the probability of the 3rd year setting a new record is 1/3 and so on...
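The sequence-counting argument above can be verified by brute force. A small sketch, assuming the yearly values are distinct and every ordering is equally likely:

```python
from itertools import permutations

# Over all orderings of n distinct yearly values, count how often
# year k sets a new record (i.e. beats every earlier year).
def record_probabilities(n):
    counts = [0] * n
    total = 0
    for order in permutations(range(n)):
        total += 1
        best = -1
        for year, value in enumerate(order):
            if value > best:
                best = value
                counts[year] += 1
    return [c / total for c in counts]

# For 3 years this gives 1/1, 1/2, 1/3, matching the reasoning above.
print(record_probabilities(3))
```

The same enumeration for any n gives 1/1, 1/2, ..., 1/n, which is where the 1/N in the earlier post comes from.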

because each of the following 4 years has a 50% chance of breaking it or not.

I would question that. Each record would be harder and harder to break. Also, setting the first probability to 50% is arbitrary.

Let's say you have rainfall (or temperature) records at a certain location going back 80 years.

On average, how many years will it have been since the last record was set?

I tried doing the math for 5 years, and I'm not certain whether it's correct or not.

My reasoning is as follows...

The probability of the first year's record still standing after 5 years is (1/2)^4 = 1/16, because each of the following 4 years has a 50% chance of breaking it or not.

The probability of the second year setting a new record is 1/2, and the probability of it still standing after the following 3 years is (1/2)^3 = 1/8.

And so on...

Is this correct or not?
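One way to check the 5-year reasoning numerically is brute-force enumeration. A sketch, assuming the 5 yearly values are distinct and every ordering is equally likely:

```python
from itertools import permutations

# Count, over all equally likely orderings of 5 distinct yearly values,
# how often the first year's record is still standing after all 5 years.
n = 5
total = standing = 0
for order in permutations(range(n)):
    total += 1
    if max(order) == order[0]:  # first year's value was never beaten
        standing += 1

print(f"exact probability: {standing}/{total} = {standing / total}")
print(f"50%-per-year model: (1/2)**4 = {(1 / 2) ** 4}")
```

The enumeration gives 24/120 = 1/5, not the 1/16 the 50%-per-year model suggests, which illustrates the objection raised in the replies: the chance of breaking a standing record is not 50% each year.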