Old 06-09-03, 11:09 AM
  #4  
Captain Crunch
The simple answer is that you are going slower for longer than you are going faster! Below is a little quiz that illustrates the problem very well. Before you look at the answer, see if you can figure it out on your own.

Question:

You have a 1-mile loop racing track. Your goal is to average 60 mph for two laps. You do the first lap at 30 mph. How fast must you go on the second lap in order to average 60 mph?
.
.
.
(spoiler space)
Here is the correct answer! Good job to those who got it right.


No, it is *NOT* 90 mph. If you do the first lap at 30 mph, it takes you 2 minutes for that lap alone. An average of 60 mph means 2 laps in 2 minutes total, so your time is already used up. Thus, it is *IMPOSSIBLE* to average 60 mph if you do the first lap at 30 mph (even if you could travel at the speed of light--which you can't, because you have mass--you'd still be averaging slightly under 60 mph, because the second lap still takes some time).
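
To see it in numbers, here is a minimal Python sketch (the lap length and speeds are taken from the quiz; the speed-of-light figure of roughly 670,616,629 mph is my own rounding, added only for illustration). No matter how fast the second lap is, the average creeps toward 60 mph but never reaches it.

[CODE]
# Two 1-mile laps; the first lap is always ridden at 30 mph.
LAP_MILES = 1.0

def average_speed(second_lap_mph):
    """Overall average speed for both laps, in mph."""
    time_first = LAP_MILES / 30.0              # hours on lap 1 (= 2 minutes)
    time_second = LAP_MILES / second_lap_mph   # hours on lap 2
    return (2 * LAP_MILES) / (time_first + time_second)

for v in (90, 300, 670_616_629):               # the last value is roughly the speed of light
    print(f"second lap at {v:>11,} mph -> average {average_speed(v):.4f} mph")
[/CODE]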

The place where people's intuition breaks down is that the basis for averaging speed is time, not distance. So, while it is true that one minute at 30 mph plus one minute at 90 mph *does* give you a 60 mph average, it doesn't work if you do one mile at 30 and one mile at 90. The reason is that you spend two minutes doing 30 mph but only 40 seconds doing 90 mph, so you cover 2 miles in 2 minutes 40 seconds (which comes out to 45 mph).
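
For anyone who wants to check the arithmetic, here is a small Python sketch of the two cases above (only the numbers from the post -- 30 mph, 90 mph, one minute, one mile -- go into it):

[CODE]
# Case 1: equal TIME at each speed -- one minute at 30 mph plus one minute at 90 mph.
hours_per_minute = 1 / 60
distance_time_split = 30 * hours_per_minute + 90 * hours_per_minute   # miles covered
average_time_split = distance_time_split / (2 * hours_per_minute)     # -> 60.0 mph

# Case 2: equal DISTANCE at each speed -- one mile at 30 mph plus one mile at 90 mph.
total_time = 1 / 30 + 1 / 90                   # hours (2 minutes + 40 seconds)
average_distance_split = 2 / total_time        # -> 45.0 mph

print(average_time_split, average_distance_split)   # 60.0 45.0
[/CODE]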