I'm having some problems figuring this one out:
A used car dealership has found that the length of time before a major repair is required on the cars it sells is normally distributed with a mean equal to 10 months and a standard deviation of 3 months. If the dealer only wants 5% of the cars to fail before the end of the guarantee period, for how many months should the cars be guaranteed?
Any help would be greatly appreciated.
You'll need to look at a table of standard normal values (which I'm guessing you have, since you've been given this question). It tells you the probability that a random variable falls less than Z standard deviations above the mean, for lots of values of Z.
So use that table to find the Z-value that gives a cumulative probability of 0.95, and then work out how many months lie that many standard deviations below the mean — the guarantee period has to sit below the mean, because only 5% of the cars should fail before it ends.
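Concretely (assuming the usual table value Z ≈ 1.645 for a cumulative probability of 0.95 — check your own table), the working would look something like this:

```latex
P(X < t) = 0.05
\;\Longrightarrow\;
\frac{t - \mu}{\sigma} = \frac{t - 10}{3} \approx -1.645
\;\Longrightarrow\;
t \approx 10 - 1.645 \times 3 \approx 5.1 \text{ months}
```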
Why did the vector cross the road?
It wanted to be normal.