XCSF  1.4.7
XCSF learning classifier system
neural_activations.c File Reference

Neural network activation functions.

#include "neural_activations.h"
#include "neural_layer.h"
#include "utils.h"


Functions

double neural_activate (const int a, const double x)
 Returns the result from applying a specified activation function.
 
double neural_gradient (const int a, const double x)
 Returns the derivative from applying a specified activation function.
 
const char * neural_activation_string (const int a)
 Returns the name of a specified activation function.
 
int neural_activation_as_int (const char *a)
 Returns the integer representation of an activation function.
 
void neural_activate_array (double *state, double *output, const int n, const int a)
 Applies an activation function to a vector of neuron states.
 
void neural_gradient_array (const double *state, double *delta, const int n, const int a)
 Applies a gradient function to a vector of neuron states.
 

Detailed Description

Neural network activation functions.

Author
Richard Preen rpreen@gmail.com
Date
2012–2020.

Definition in file neural_activations.c.

Function Documentation

◆ neural_activate()

double neural_activate (const int a, const double x)

Returns the result from applying a specified activation function.

Parameters
[in] a The activation function to apply.
[in] x The input to the activation function.
Returns
The result from applying the activation function.

Definition at line 35 of file neural_activations.c.

References COS, cos_activate(), GAUSSIAN, gaussian_activate(), LEAKY, leaky_activate(), LINEAR, linear_activate(), LOGGY, loggy_activate(), LOGISTIC, logistic_activate(), RELU, relu_activate(), SELU, selu_activate(), SIN, sin_activate(), SOFT_PLUS, soft_plus_activate(), TANH, and tanh_activate().

Referenced by neural_activate_array().


◆ neural_activate_array()

void neural_activate_array (double *state, double *output, const int n, const int a)

Applies an activation function to a vector of neuron states.

Parameters
[in,out] state The neuron states.
[in,out] output The neuron outputs.
[in] n The length of the input array.
[in] a The activation function.

Definition at line 199 of file neural_activations.c.

References clamp(), neural_activate(), NEURON_MAX, and NEURON_MIN.

Referenced by neural_layer_connected_forward(), neural_layer_convolutional_forward(), neural_layer_lstm_backward(), and neural_layer_lstm_forward().


◆ neural_activation_as_int()

int neural_activation_as_int (const char *a)

Returns the integer representation of an activation function.

Parameters
[in] a String representing the name of an activation function.
Returns
Integer representing an activation function.

Definition at line 149 of file neural_activations.c.

References COS, GAUSSIAN, LEAKY, LINEAR, LOGGY, LOGISTIC, RELU, SELU, SIN, SOFT_MAX, SOFT_PLUS, STRING_COS, STRING_GAUSSIAN, STRING_LEAKY, STRING_LINEAR, STRING_LOGGY, STRING_LOGISTIC, STRING_RELU, STRING_SELU, STRING_SIN, STRING_SOFT_MAX, STRING_SOFT_PLUS, STRING_TANH, and TANH.

Referenced by layer_args_json_import_activation().

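This is the lookup used when importing a layer's activation from JSON: a configuration string is compared against the STRING_* names and mapped to the matching constant. A sketch of the pattern, with illustrative names, constants, and error handling:

```c
#include <string.h>

/* Hypothetical constants for illustration; see neural_activations.h. */
#define SK_RELU 0
#define SK_TANH 1
#define SK_INVALID (-1)

/* Sketch of neural_activation_as_int(): compare the name against each
 * known activation string and return the matching constant. */
static int
activation_as_int_sketch(const char *a)
{
    if (strcmp(a, "relu") == 0) {
        return SK_RELU;
    }
    if (strcmp(a, "tanh") == 0) {
        return SK_TANH;
    }
    return SK_INVALID; /* unknown name; the real function may abort instead */
}
```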

◆ neural_activation_string()

const char * neural_activation_string (const int a)

Returns the name of a specified activation function.

Parameters
[in] a The activation function.
Returns
The name of the activation function.

Definition at line 110 of file neural_activations.c.

References COS, GAUSSIAN, LEAKY, LINEAR, LOGGY, LOGISTIC, RELU, SELU, SIN, SOFT_MAX, SOFT_PLUS, STRING_COS, STRING_GAUSSIAN, STRING_LEAKY, STRING_LINEAR, STRING_LOGGY, STRING_LOGISTIC, STRING_RELU, STRING_SELU, STRING_SIN, STRING_SOFT_MAX, STRING_SOFT_PLUS, STRING_TANH, and TANH.

Referenced by layer_args_json_export_activation(), neural_layer_connected_json_export(), neural_layer_convolutional_json_export(), neural_layer_lstm_json_export(), and neural_layer_recurrent_json_export().

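This is the inverse mapping, used when exporting a layer to JSON: the activation constant is translated back to its configuration name. A sketch with illustrative constants and names:

```c
#include <string.h>

/* Sketch of neural_activation_string(): map a constant back to the name
 * used in configuration files. Constants and names are illustrative. */
static const char *
activation_string_sketch(const int a)
{
    switch (a) {
        case 0:
            return "relu";
        case 1:
            return "tanh";
        default:
            return "unknown"; /* the real function may abort on bad input */
    }
}
```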

◆ neural_gradient()

double neural_gradient (const int a, const double x)

Returns the derivative from applying a specified activation function.

Parameters
[in] a The activation function applied.
[in] x The input to the activation function.
Returns
The derivative from applying the activation function.

Definition at line 73 of file neural_activations.c.

References COS, cos_gradient(), GAUSSIAN, gaussian_gradient(), LEAKY, leaky_gradient(), LINEAR, linear_gradient(), LOGGY, loggy_gradient(), LOGISTIC, logistic_gradient(), RELU, relu_gradient(), SELU, selu_gradient(), SIN, sin_gradient(), SOFT_PLUS, soft_plus_gradient(), TANH, and tanh_gradient().

Referenced by neural_gradient_array().


◆ neural_gradient_array()

void neural_gradient_array (const double *state, double *delta, const int n, const int a)

Applies a gradient function to a vector of neuron states.

Parameters
[in] state The neuron states.
[in,out] delta The neuron gradients.
[in] n The length of the input array.
[in] a The activation function.

Definition at line 215 of file neural_activations.c.

References neural_gradient().

Referenced by neural_layer_connected_backward(), neural_layer_convolutional_backward(), and neural_layer_lstm_backward().
