7 years, 2 months ago.

How to compute delta time on an MCU?

I am still working on building an IMU with the MPU6050.

My reference code uses deltat, or what we can call "delta time".

In mbed it is obtained with this code:

  Now = t.read_us();
  deltat = (float)((Now - lastUpdate)/1000000.0f);  // set integration time by time elapsed since last filter update
  lastUpdate = Now;
  sum += deltat;
  sumCount++;

  sum = 0;
  sumCount = 0;

Until now I don't understand how deltat is obtained, because I want to port this code to my STM32 without using mbed.

Please help me.

1 Answer

7 years, 2 months ago.

Almost all Cortex-M parts have a SysTick peripheral, located at the same memory address on every device since it is part of the ARM core.

I use it to count 65 ms between reloads, and each reload interrupt increments a counter for longer time intervals. Code below:

for 48 MHz
  #define SYSTICK_RELOAD_VAL 0x300000   // 48 * 0x10000
  #define SYSTICK_DIV(x)     (((x>>4)*0x5555)>>16)

for 120 MHz
  #define SYSTICK_RELOAD_VAL 0x780000   // 120 * 0x10000
  #define SYSTICK_DIV(x)     (((x>>3)*0x1111)>>16)


volatile unsigned int ms65_count;        // 65.536 ms periods since init

void SysTick_Handler(void) {
	ms65_count++;                        // increment counter
}

void systick_init( void ) {
	// SysTick counter, interrupt every 65 ms
	SysTick->LOAD = SYSTICK_RELOAD_VAL-1;
	SysTick->VAL  = 0;
	SysTick->CTRL  = 4 | 2 | 1; //CLKSOURCE=CPU clock | TICKINT | ENABLE
	ms65_count = 0;
}

unsigned int micros( void )
{
	unsigned int Tick;
	unsigned int count;

    do {
		count = ms65_count;
        Tick = SysTick->VAL;
    } while (SysTick->CTRL & SysTick_CTRL_COUNTFLAG_Msk);

//	divides take a long time in terms of microseconds
//  return (count * 100000 + (SYSTICK_RELOAD_VAL+1 - Tick) / ( (SYSTICK_RELOAD_VAL+1) / 100000 )) & 0xffffffff ;//LPC111x
    return (count << 16) + (0x10000 - SYSTICK_DIV(Tick));
}

Accepted Answer

Hi, thanks for the answer. I think this code configures timing with SysTick; I already use SysTick for my delay.

But the code above needs to calculate the time spent processing... so we have to measure the time, not set the time.

posted by Septian Gusonela 28 Jul 2014

Call the micros() routine before and after an event, then subtract the two numbers to get the elapsed time.

posted by BASICchip Coridium 28 Jul 2014

Thanks for always replying to my problem.

This is my code for delay:

  void TimingDelay_Decrement(void) {
      if (TimingDelay != 0) {
          TimingDelay--;
      }
  }

  void SysTick_Handler(void) {
      TimingDelay_Decrement();
  }

  void Delay_Init() {
      SysTick_Config(21000);
      SysTick_CLKSourceConfig(SysTick_CLKSource_HCLK_Div8);
      Delay_Interrupt();
  }

And I want to ask again: can you give me the values for 168 MHz? Because I use an STM32F407VG.

thanks

posted by Septian Gusonela 30 Jul 2014

For 168 MHz

  #define SYSTICK_RELOAD_VAL 0xA80000   // 168 * 0x10000
  #define SYSTICK_DIV(x)     ((x*0x0186)>>16)

For longer times, or for more accuracy, just read SysTick->VAL, take the difference, and divide by 168, adjusting for rollover if needed.

posted by BASICchip Coridium 30 Jul 2014

thanks again...

How do you get a 65 ms interrupt? Sorry, I'm a newbie :D

posted by Septian Gusonela 31 Jul 2014

SysTick is a 24-bit down counter: when it reaches 0 it reloads with SYSTICK_RELOAD_VAL - 1 (in your case 168 * 65536 - 1) and interrupts. That interrupt calls SysTick_Handler, which increments ms65_count.

These values are chosen so that it is easy to combine them to get a 32-bit microsecond counter, which can time up to about 71 minutes.

posted by BASICchip Coridium 01 Aug 2014

Sorry, I ask again... I'm a very poor newbie.

Yes, I understand SysTick a little now. Can you give me example code that interrupts every 1 ms?

I still don't understand that part.

And should I edit your code to use SysTick_Config(21000), since that is what I use to make 1 ms?

Thanks.

posted by Septian Gusonela 02 Aug 2014