A speedometer on, say, a car counts the number of times the wheel has turned and divides by the time elapsed on board the car. Because time is being measured on board the moving vehicle, special relativity comes into play.
What is the relationship between the speed calculated by the speedometer and the speed of the car measured by a stationary observer on the ground (for example, a cop seeking to arrest you for speeding)? This should be easy.
I suspect the speedometer reports speeds greater than those measured by the stationary observer, and that it can report arbitrarily high readings, even faster than the speed of light, though achieving that might require car parts moving faster than light in the reference frame of the car.
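The suspicion matches what relativity calls proper speed (or celerity): distance covered in the ground frame divided by time elapsed on board, which works out to γv. A minimal sketch of the claim, assuming the wheel-based speedometer effectively reports this quantity (the function name is mine, not from the post):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def proper_speed(v: float) -> float:
    """Ground-frame distance per unit of on-board (proper) time: gamma * v."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return gamma * v

# The on-board reading always exceeds the ground-frame speed, and it
# passes c once v > c / sqrt(2), i.e. about 0.707 c.
for frac in (0.5, 0.75, 0.9, 0.99):
    v = frac * C
    print(f"ground speed {frac:.2f}c -> speedometer reads {proper_speed(v) / C:.3f}c")
```

Because γ grows without bound as v approaches c, the reading can be made arbitrarily high, consistent with the guess above.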
One could still practically achieve impossibly high speedometer readings by using a speedometer that works on a principle similar to an optical computer mouse, more specifically, the early optical mice that required special mousepads marked with grid lines of known spacing. (Incidentally, the ball mouse could be considered a sophistication of the mechanical speedometer described previously: it counts the number of times the ball has turned.)

The car's course is marked with landmarks at periodic intervals, and the on-board speedometer calculates distance by sensing and counting landmarks passed. Length contraction might make recognizing the landmarks difficult. (If the sensor compensates for length contraction, then it could calculate speed directly from the amount of length contraction, that is, from the amount of compensation it is doing. But calculating speed from observed length contraction will never result in readings higher than the speed of light.)
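The parenthetical claim can be checked with a quick sketch, assuming a hypothetical sensor that reports the ratio of observed landmark spacing to rest spacing (that ratio is 1/γ, so v = c·√(1 − ratio²)):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def speed_from_contraction(ratio: float) -> float:
    """Infer ground-frame speed from observed/rest landmark spacing.

    ratio = L / L0 = 1/gamma, so 1 - ratio**2 = v**2 / c**2.
    """
    return C * math.sqrt(1.0 - ratio ** 2)

# No matter how extreme the contraction, the inferred speed stays below c:
for ratio in (1.0, 0.5, 0.1, 0.001):
    print(f"spacing ratio {ratio} -> inferred speed {speed_from_contraction(ratio) / C:.6f}c")
```

Since the ratio is always positive for any real landmark spacing, the square root stays strictly below 1, which is the point made above: this method caps out at the speed of light, unlike the landmark-counting reading itself.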