So, just out of curiosity, I ran the following program on my F401RE:
#include "mbed.h"

Serial pc(USBTX, USBRX);
DigitalOut myled(LED1);

int main() {
    Timer t;

    t.reset(); t.start(); wait(10); t.stop();
    pc.printf("wait(10) - %d ms\r\n", t.read_ms());

    t.reset(); t.start(); wait_ms(10000); t.stop();
    pc.printf("wait_ms(10000) - %d ms\r\n", t.read_ms());

    t.reset(); t.start(); for (int i = 0; i < 1000; i++) wait(0.01); t.stop();
    pc.printf("wait(0.01) x 1000 - %d ms\r\n", t.read_ms());

    t.reset(); t.start(); for (int i = 0; i < 1000; i++) wait_ms(10); t.stop();
    pc.printf("wait_ms(10) x 1000 - %d ms\r\n", t.read_ms());
}
With the following output...
wait(10) - 10000 ms
wait_ms(10000) - 10000 ms
wait(0.01) x 1000 - 10000 ms
wait_ms(10) x 1000 - 10000 ms
That surprised me a bit, since it seems almost TOO perfect. Any ideas on what I might be doing wrong?
Edit:
Also ran your wait_ms(1) x 20000 example with the following output:
wait_ms(1) x 20000 - 20000 ms
Edit2:
Did one last run, but used the same timing mechanism you did (e.g. difftime):
wait_ms(1) x 20000 - 19.000000 s
So mine seems to be pretty accurate if I assume that last result is simply a rounding error?
ST Nucleo F401RE: I am having trouble with the wait() command, although wait_ms() seems to work fine.
#include "mbed.h"

DigitalOut myled(LED1);

int main() {
    while (1) {
        myled = 1;       // LED is ON
        wait_ms(200);    // 200 ms
        myled = 0;       // LED is OFF
        wait_ms(1000);   // 1 sec
    }
}

gives the expected result, but
#include "mbed.h"

DigitalOut myled(LED1);

int main() {
    while (1) {
        myled = 1;      // LED is ON
        wait(0.2);      // 200 ms
        myled = 0;      // LED is OFF
        wait(1.0);      // 1 sec
    }
}

with this version the LED stays off.
My software: Windows 7 with Firefox.