XCSF  1.4.7
XCSF learning classifier system
neural_activations.h File Reference

Neural network activation functions.

#include <math.h>

Macros

#define LOGISTIC   (0)
 Logistic (0,1).
 
#define RELU   (1)
 Rectified linear unit [0,inf).
 
#define TANH   (2)
 Tanh (-1,1).
 
#define LINEAR   (3)
 Linear (-inf,inf).
 
#define GAUSSIAN   (4)
 Gaussian (0,1].
 
#define SIN   (5)
 Sine [-1,1].
 
#define COS   (6)
 Cosine [-1,1].
 
#define SOFT_PLUS   (7)
 Soft plus (0,inf).
 
#define LEAKY   (8)
 Leaky rectified linear unit (-inf,inf).
 
#define SELU   (9)
 Scaled exponential linear unit (-1.7581,inf).
 
#define LOGGY   (10)
 Logistic, rescaled to (-1,1).
 
#define NUM_ACTIVATIONS   (11)
 Number of activations available.
 
#define SOFT_MAX   (100)
 Softmax.
 
#define STRING_LOGISTIC   ("logistic\0")
 Logistic.
 
#define STRING_RELU   ("relu\0")
 RELU.
 
#define STRING_TANH   ("tanh\0")
 Tanh.
 
#define STRING_LINEAR   ("linear\0")
 Linear.
 
#define STRING_GAUSSIAN   ("gaussian\0")
 Gaussian.
 
#define STRING_SIN   ("sin\0")
 Sine.
 
#define STRING_COS   ("cos\0")
 Cosine.
 
#define STRING_SOFT_PLUS   ("softplus\0")
 Soft plus.
 
#define STRING_LEAKY   ("leaky\0")
 Leaky.
 
#define STRING_SELU   ("selu\0")
 SELU.
 
#define STRING_LOGGY   ("loggy\0")
 Loggy.
 
#define STRING_SOFT_MAX   ("softmax\0")
 Softmax.
 

Functions

double neural_activate (const int a, const double x)
 Returns the result from applying a specified activation function.
 
double neural_gradient (const int a, const double x)
 Returns the derivative from applying a specified activation function.
 
const char * neural_activation_string (const int a)
 Returns the name of a specified activation function.
 
int neural_activation_as_int (const char *a)
 Returns the integer representation of an activation function.
 
void neural_activate_array (double *state, double *output, const int n, const int a)
 Applies an activation function to a vector of neuron states.
 
void neural_gradient_array (const double *state, double *delta, const int n, const int a)
 Applies a gradient function to a vector of neuron states.
 
static double logistic_activate (const double x)
 
static double logistic_gradient (const double x)
 
static double loggy_activate (const double x)
 
static double loggy_gradient (const double x)
 
static double gaussian_activate (const double x)
 
static double gaussian_gradient (const double x)
 
static double relu_activate (const double x)
 
static double relu_gradient (const double x)
 
static double selu_activate (const double x)
 
static double selu_gradient (const double x)
 
static double linear_activate (const double x)
 
static double linear_gradient (const double x)
 
static double soft_plus_activate (const double x)
 
static double soft_plus_gradient (const double x)
 
static double tanh_activate (const double x)
 
static double tanh_gradient (const double x)
 
static double leaky_activate (const double x)
 
static double leaky_gradient (const double x)
 
static double sin_activate (const double x)
 
static double sin_gradient (const double x)
 
static double cos_activate (const double x)
 
static double cos_gradient (const double x)
 

Detailed Description

Neural network activation functions.

Author
Richard Preen rpreen@gmail.com
Date
2012–2020.

Definition in file neural_activations.h.

Macro Definition Documentation

◆ COS

#define COS   (6)

Cos [-1,1].

Definition at line 34 of file neural_activations.h.

◆ GAUSSIAN

#define GAUSSIAN   (4)

Gaussian (0,1].

Definition at line 32 of file neural_activations.h.

◆ LEAKY

#define LEAKY   (8)

Leaky rectified linear unit (-inf,inf).

Definition at line 36 of file neural_activations.h.

◆ LINEAR

#define LINEAR   (3)

Linear (-inf,inf).

Definition at line 31 of file neural_activations.h.

◆ LOGGY

#define LOGGY   (10)

Logistic, rescaled to (-1,1).

Definition at line 38 of file neural_activations.h.

◆ LOGISTIC

#define LOGISTIC   (0)

Logistic (0,1).

Definition at line 28 of file neural_activations.h.

◆ NUM_ACTIVATIONS

#define NUM_ACTIVATIONS   (11)

Number of activations available.

Definition at line 39 of file neural_activations.h.

◆ RELU

#define RELU   (1)

Rectified linear unit [0,inf).

Definition at line 29 of file neural_activations.h.

◆ SELU

#define SELU   (9)

Scaled exponential linear unit (-1.7581,inf).

Definition at line 37 of file neural_activations.h.

◆ SIN

#define SIN   (5)

Sine [-1,1].

Definition at line 33 of file neural_activations.h.

◆ SOFT_MAX

#define SOFT_MAX   (100)

Softmax.

Definition at line 40 of file neural_activations.h.

◆ SOFT_PLUS

#define SOFT_PLUS   (7)

Soft plus (0,inf).

Definition at line 35 of file neural_activations.h.

◆ STRING_COS

#define STRING_COS   ("cos\0")

Cos.

Definition at line 48 of file neural_activations.h.

◆ STRING_GAUSSIAN

#define STRING_GAUSSIAN   ("gaussian\0")

Gaussian.

Definition at line 46 of file neural_activations.h.

◆ STRING_LEAKY

#define STRING_LEAKY   ("leaky\0")

Leaky.

Definition at line 50 of file neural_activations.h.

◆ STRING_LINEAR

#define STRING_LINEAR   ("linear\0")

Linear.

Definition at line 45 of file neural_activations.h.

◆ STRING_LOGGY

#define STRING_LOGGY   ("loggy\0")

Loggy.

Definition at line 52 of file neural_activations.h.

◆ STRING_LOGISTIC

#define STRING_LOGISTIC   ("logistic\0")

Logistic.

Definition at line 42 of file neural_activations.h.

◆ STRING_RELU

#define STRING_RELU   ("relu\0")

RELU.

Definition at line 43 of file neural_activations.h.

◆ STRING_SELU

#define STRING_SELU   ("selu\0")

SELU.

Definition at line 51 of file neural_activations.h.

◆ STRING_SIN

#define STRING_SIN   ("sin\0")

Sine.

Definition at line 47 of file neural_activations.h.

◆ STRING_SOFT_MAX

#define STRING_SOFT_MAX   ("softmax\0")

Softmax.

Definition at line 53 of file neural_activations.h.

◆ STRING_SOFT_PLUS

#define STRING_SOFT_PLUS   ("softplus\0")

Soft plus.

Definition at line 49 of file neural_activations.h.

◆ STRING_TANH

#define STRING_TANH   ("tanh\0")

Tanh.

Definition at line 44 of file neural_activations.h.

◆ TANH

#define TANH   (2)

Tanh (-1,1).

Definition at line 30 of file neural_activations.h.

Function Documentation

◆ cos_activate()

static double cos_activate (const double x)  [inline, static]

Definition at line 199 of file neural_activations.h.

Referenced by neural_activate().

◆ cos_gradient()

static double cos_gradient (const double x)  [inline, static]

Definition at line 205 of file neural_activations.h.

Referenced by neural_gradient().

◆ gaussian_activate()

static double gaussian_activate (const double x)  [inline, static]

Definition at line 101 of file neural_activations.h.

Referenced by neural_activate().

◆ gaussian_gradient()

static double gaussian_gradient (const double x)  [inline, static]

Definition at line 107 of file neural_activations.h.

Referenced by neural_gradient().

◆ leaky_activate()

static double leaky_activate (const double x)  [inline, static]

Definition at line 175 of file neural_activations.h.

Referenced by neural_activate().

◆ leaky_gradient()

static double leaky_gradient (const double x)  [inline, static]

Definition at line 181 of file neural_activations.h.

Referenced by neural_gradient().

◆ linear_activate()

static double linear_activate (const double x)  [inline, static]

Definition at line 137 of file neural_activations.h.

Referenced by neural_activate().

◆ linear_gradient()

static double linear_gradient (const double x)  [inline, static]

Definition at line 143 of file neural_activations.h.

Referenced by neural_gradient().

◆ loggy_activate()

static double loggy_activate (const double x)  [inline, static]

Definition at line 88 of file neural_activations.h.

Referenced by neural_activate().

◆ loggy_gradient()

static double loggy_gradient (const double x)  [inline, static]

Definition at line 94 of file neural_activations.h.

Referenced by neural_gradient().

◆ logistic_activate()

static double logistic_activate (const double x)  [inline, static]

Definition at line 75 of file neural_activations.h.

Referenced by neural_activate().

◆ logistic_gradient()

static double logistic_gradient (const double x)  [inline, static]

Definition at line 81 of file neural_activations.h.

Referenced by neural_gradient().

◆ neural_activate()

double neural_activate (const int a, const double x)

Returns the result from applying a specified activation function.

Parameters
[in] a The activation function to apply.
[in] x The input to the activation function.
Returns
The result from applying the activation function.

Definition at line 35 of file neural_activations.c.

References COS, cos_activate(), GAUSSIAN, gaussian_activate(), LEAKY, leaky_activate(), LINEAR, linear_activate(), LOGGY, loggy_activate(), LOGISTIC, logistic_activate(), RELU, relu_activate(), SELU, selu_activate(), SIN, sin_activate(), SOFT_PLUS, soft_plus_activate(), TANH, and tanh_activate().

Referenced by neural_activate_array().

◆ neural_activate_array()

void neural_activate_array (double *state, double *output, const int n, const int a)

Applies an activation function to a vector of neuron states.

Parameters
[in,out] state The neuron states.
[in,out] output The neuron outputs.
[in] n The length of the input array.
[in] a The activation function.

Definition at line 199 of file neural_activations.c.

References clamp(), neural_activate(), NEURON_MAX, and NEURON_MIN.

Referenced by neural_layer_connected_forward(), neural_layer_convolutional_forward(), neural_layer_lstm_backward(), and neural_layer_lstm_forward().

◆ neural_activation_as_int()

int neural_activation_as_int (const char *a)

Returns the integer representation of an activation function.

Parameters
[in] a String representing the name of an activation function.
Returns
Integer representing an activation function.

Definition at line 149 of file neural_activations.c.

References COS, GAUSSIAN, LEAKY, LINEAR, LOGGY, LOGISTIC, RELU, SELU, SIN, SOFT_MAX, SOFT_PLUS, STRING_COS, STRING_GAUSSIAN, STRING_LEAKY, STRING_LINEAR, STRING_LOGGY, STRING_LOGISTIC, STRING_RELU, STRING_SELU, STRING_SIN, STRING_SOFT_MAX, STRING_SOFT_PLUS, STRING_TANH, and TANH.

Referenced by layer_args_json_import_activation().

◆ neural_activation_string()

const char * neural_activation_string (const int a)

Returns the name of a specified activation function.

Parameters
[in] a The activation function.
Returns
The name of the activation function.

Definition at line 110 of file neural_activations.c.

References COS, GAUSSIAN, LEAKY, LINEAR, LOGGY, LOGISTIC, RELU, SELU, SIN, SOFT_MAX, SOFT_PLUS, STRING_COS, STRING_GAUSSIAN, STRING_LEAKY, STRING_LINEAR, STRING_LOGGY, STRING_LOGISTIC, STRING_RELU, STRING_SELU, STRING_SIN, STRING_SOFT_MAX, STRING_SOFT_PLUS, STRING_TANH, and TANH.

Referenced by layer_args_json_export_activation(), neural_layer_connected_json_export(), neural_layer_convolutional_json_export(), neural_layer_lstm_json_export(), and neural_layer_recurrent_json_export().

◆ neural_gradient()

double neural_gradient (const int a, const double x)

Returns the derivative from applying a specified activation function.

Parameters
[in] a The activation function applied.
[in] x The input to the activation function.
Returns
The derivative from applying the activation function.

Definition at line 73 of file neural_activations.c.

References COS, cos_gradient(), GAUSSIAN, gaussian_gradient(), LEAKY, leaky_gradient(), LINEAR, linear_gradient(), LOGGY, loggy_gradient(), LOGISTIC, logistic_gradient(), RELU, relu_gradient(), SELU, selu_gradient(), SIN, sin_gradient(), SOFT_PLUS, soft_plus_gradient(), TANH, and tanh_gradient().

Referenced by neural_gradient_array().

◆ neural_gradient_array()

void neural_gradient_array (const double *state, double *delta, const int n, const int a)

Applies a gradient function to a vector of neuron states.

Parameters
[in] state The neuron states.
[in,out] delta The neuron gradients.
[in] n The length of the input array.
[in] a The activation function.

Definition at line 215 of file neural_activations.c.

References neural_gradient().

Referenced by neural_layer_connected_backward(), neural_layer_convolutional_backward(), and neural_layer_lstm_backward().

◆ relu_activate()

static double relu_activate (const double x)  [inline, static]

Definition at line 113 of file neural_activations.h.

Referenced by neural_activate().

◆ relu_gradient()

static double relu_gradient (const double x)  [inline, static]

Definition at line 119 of file neural_activations.h.

Referenced by neural_gradient().

◆ selu_activate()

static double selu_activate (const double x)  [inline, static]

Definition at line 125 of file neural_activations.h.

Referenced by neural_activate().

◆ selu_gradient()

static double selu_gradient (const double x)  [inline, static]

Definition at line 131 of file neural_activations.h.

Referenced by neural_gradient().

◆ sin_activate()

static double sin_activate (const double x)  [inline, static]

Definition at line 187 of file neural_activations.h.

Referenced by neural_activate().

◆ sin_gradient()

static double sin_gradient (const double x)  [inline, static]

Definition at line 193 of file neural_activations.h.

Referenced by neural_gradient().

◆ soft_plus_activate()

static double soft_plus_activate (const double x)  [inline, static]

Definition at line 150 of file neural_activations.h.

Referenced by neural_activate().

◆ soft_plus_gradient()

static double soft_plus_gradient (const double x)  [inline, static]

Definition at line 156 of file neural_activations.h.

Referenced by neural_gradient().

◆ tanh_activate()

static double tanh_activate (const double x)  [inline, static]

Definition at line 162 of file neural_activations.h.

Referenced by neural_activate().

◆ tanh_gradient()

static double tanh_gradient (const double x)  [inline, static]

Definition at line 168 of file neural_activations.h.

Referenced by neural_gradient().