There is not much to FastAnalogIn: you can simply replace regular AnalogIn with it and it should work. But if your loop is already going more than fast enough I wouldn't worry about it; regular AnalogIn averages a few samples, so it is more accurate. I think you are thinking of the ADC's 200 kHz, not 200 Hz ;). That is 5 us per ADC sample; it takes a few samples plus some overhead, so let's say 20 us. And then there is other code on top of that: your remap function, for example, will probably also take a few microseconds.
Regarding which one you need: timer.read() returns the time as a float in seconds, while timer.read_us() returns it in microseconds as an integer. An advantage of the latter is that you don't need to worry about overflow when you calculate the time between two samples (that fixes itself when you subtract them, as long as you store them as unsigned ints).
The reason I assumed it had been running for a few minutes is the times you showed, somewhere in the 200-second range, which is roughly 4 minutes.
Continuing my little project: http://mbed.org/forum/helloworld/post/23256/ and http://mbed.org/forum/helloworld/post/23317/
So far I have created a 1-DOF robot arm that is positioned using feedback from an analog IR distance sensor. Currently the arm moves to maintain a desired sensor reading. I can adjust the desired reading to move the arm, or move a target in front of the sensor to move the arm. This all works pretty well.
My next step is to target a desired rate of change between target values. Right now the arm moves at a rate determined by a fixed PWM step value. I would like to change this so the arm adjusts the PWM step to achieve a certain rate of change in the sensor data. Long story short, I need to know the time between sensor readings so I can calculate the rate of change. I searched around but could not come up with a method to do this. Currently the program runs at nearly 200 Hz, determined with a stopwatch and a counter displayed in Tera Term (the best measurement I could manage).
The ultimate goal is to tell the arm "move 3 inches forward at a rate of 0.2 inches/second" and have it all be controlled by the sensor. I will need to calibrate and linearize the sensor data for this, but the first step is to figure out the rate of change. Any ideas to point me in the right direction?