Robert Lopez / CMSIS5

Neural Network Activation Functions
[Neural Network Functions]

Perform activation layers, including ReLU (Rectified Linear Unit), sigmoid, and tanh.

Functions

void arm_nn_activations_direct_q15 (q15_t *data, uint16_t size, uint16_t int_width, arm_nn_activation_type type)
 Q15 neural network activation function using direct table look-up.
void arm_nn_activations_direct_q7 (q7_t *data, uint16_t size, uint16_t int_width, arm_nn_activation_type type)
 Q7 neural network activation function using direct table look-up.
void arm_relu_q15 (q15_t *data, uint16_t size)
 Q15 ReLU function.
void arm_relu_q7 (q7_t *data, uint16_t size)
 Q7 ReLU function.

Detailed Description

Perform activation layers, including ReLU (Rectified Linear Unit), sigmoid, and tanh.


Function Documentation

void arm_nn_activations_direct_q15 (q15_t *data, uint16_t size, uint16_t int_width, arm_nn_activation_type type)

Q15 neural network activation function using direct table look-up.

Parameters:
    [in,out]  data       pointer to input
    [in]      size       number of elements
    [in]      int_width  bit-width of the integer part, assumed to be no larger than 3
    [in]      type       type of activation function
Returns:
    none.

This is the direct table look-up approach.

The integer part of the fixed-point input is assumed to be at most 3 bits. More than 3 integer bits adds nothing: inputs that large already saturate these activation functions, so the result is unchanged.

Definition at line 61 of file arm_nn_activations_q15.c.

void arm_nn_activations_direct_q7 (q7_t *data, uint16_t size, uint16_t int_width, arm_nn_activation_type type)

Q7 neural network activation function using direct table look-up.

Parameters:
    [in,out]  data       pointer to input
    [in]      size       number of elements
    [in]      int_width  bit-width of the integer part, assumed to be no larger than 3
    [in]      type       type of activation function
Returns:
    none.

This is the direct table look-up approach.

The integer part of the fixed-point input is assumed to be at most 3 bits. More than 3 integer bits adds nothing: inputs that large already saturate these activation functions, so the result is unchanged.

Definition at line 61 of file arm_nn_activations_q7.c.
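The int_width parameter fixes how a q7 byte is interpreted: with int_width = 3 the value has 7 - 3 = 4 fractional bits (Q3.4 format, range [-8, 8) in steps of 1/16). A small sketch, with hypothetical helper names not part of CMSIS-NN, of converting between float and such a q7 value:

```c
#include <stdint.h>

typedef int8_t q7_t;

/* Illustrative helpers (not CMSIS-NN API): a q7 value with
 * `int_width` integer bits has 7 - int_width fractional bits. */
static q7_t float_to_q7(float x, int int_width)
{
    int frac_bits = 7 - int_width;
    /* Round to nearest, then saturate to the q7 range. */
    int v = (int)(x * (float)(1 << frac_bits) + (x >= 0 ? 0.5f : -0.5f));
    if (v > 127)  v = 127;
    if (v < -128) v = -128;
    return (q7_t)v;
}

static float q7_to_float(q7_t v, int int_width)
{
    return (float)v / (float)(1 << (7 - int_width));
}
```

The saturation in float_to_q7 mirrors why int_width > 3 is pointless here: values beyond the representable range clip to the extremes, where these activation functions are already flat.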

void arm_relu_q15 (q15_t *data, uint16_t size)

Q15 ReLU function.

Parameters:
    [in,out]  data  pointer to input
    [in]      size  number of elements
Returns:
    none.

Optimized ReLU using QSUB instructions.

Definition at line 55 of file arm_relu_q15.c.
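Functionally, ReLU just clamps negative samples to zero; the CMSIS-NN version speeds this up with SIMD saturating-subtract (QSUB) tricks on Cortex-M cores that support them. A portable reference sketch, equivalent in behavior but not the optimized implementation:

```c
#include <stdint.h>

typedef int16_t q15_t;

/* Portable reference ReLU over a q15 buffer: data[i] = max(data[i], 0).
 * The optimized CMSIS-NN routine computes the same result using
 * QSUB-based SIMD operations, several elements at a time. */
static void relu_q15_ref(q15_t *data, uint16_t size)
{
    for (uint16_t i = 0; i < size; i++) {
        if (data[i] < 0) {
            data[i] = 0;
        }
    }
}
```

Because ReLU only zeroes negatives, it needs no table and no int_width parameter: the result is exact in any fixed-point format.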

void arm_relu_q7 (q7_t *data, uint16_t size)

Q7 ReLU function.

Parameters:
    [in,out]  data  pointer to input
    [in]      size  number of elements
Returns:
    none.

Optimized ReLU using QSUB instructions.

Definition at line 55 of file arm_relu_q7.c.