The neural-classifier Reference Manual

This is the neural-classifier Reference Manual, version 0.2, generated automatically by Declt version 4.0 beta 2 "William Riker" on Mon Feb 26 17:25:00 2024 GMT+0.


1 Introduction


2 Systems

The main system appears first, followed by any subsystem dependency.


2.1 neural-classifier

Classification of samples based on neural network.

Author

Vasily Postnicov

License

2-clause BSD

Version

0.2

Dependencies
  • alexandria (system).
  • magicl/ext-blas (system).
  • magicl/ext-lapack (system).
  • snakes (system).
Source

neural-classifier.asd.

Child Components
  • package.lisp (file).
  • magicl-blas.lisp (file).
  • definitions.lisp (file).
  • utility.lisp (file).
  • activation.lisp (file).
  • optimizers.lisp (file).
  • neural-network.lisp (file).

3 Files

Files are sorted by type and then listed depth-first from each system's component tree.


3.1 Lisp


3.1.1 neural-classifier/neural-classifier.asd

Source

neural-classifier.asd.

Parent Component

neural-classifier (system).

ASDF Systems

neural-classifier.


3.1.2 neural-classifier/package.lisp

Source

neural-classifier.asd.

Parent Component

neural-classifier (system).

Packages

neural-classifier.


3.1.3 neural-classifier/magicl-blas.lisp

Dependency

package.lisp (file).

Source

neural-classifier.asd.

Parent Component

neural-classifier (system).


3.1.4 neural-classifier/definitions.lisp

Dependency

magicl-blas.lisp (file).

Source

neural-classifier.asd.

Parent Component

neural-classifier (system).

Public Interface
Internals

3.1.5 neural-classifier/utility.lisp

Dependency

definitions.lisp (file).

Source

neural-classifier.asd.

Parent Component

neural-classifier (system).

Public Interface

idx-abs-max (function).

Internals

3.1.6 neural-classifier/activation.lisp

Dependency

utility.lisp (file).

Source

neural-classifier.asd.

Parent Component

neural-classifier (system).

Public Interface
Internals

3.1.7 neural-classifier/optimizers.lisp

Dependency

activation.lisp (file).

Source

neural-classifier.asd.

Parent Component

neural-classifier (system).

Public Interface
Internals

3.1.8 neural-classifier/neural-network.lisp

Dependency

optimizers.lisp (file).

Source

neural-classifier.asd.

Parent Component

neural-classifier (system).

Public Interface
Internals

4 Packages

Packages are listed by definition order.


4.1 neural-classifier

Source

package.lisp.

Use List
  • alexandria.
  • common-lisp.
Public Interface
Internals

5 Definitions

Definitions are sorted by export status, category, package, and then by lexicographic order.


5.1 Public Interface


5.1.1 Special variables

Special Variable: *decay-rate*

Regularization parameter @c(λ/N), where @c(N) is the number of objects in the training set and @c(λ) should typically be about 1-10. If unsure, start with zero (the default).

Package

neural-classifier.

Source

definitions.lisp.

Special Variable: *learn-rate*

Learning rate for the gradient descent algorithms. Larger values result in faster learning, but values that are too large can make training diverge. The default value works well for the SGD, SGD with momentum, and NAG optimizers; for Adagrad and RMSprop try 0.001f0.

Package

neural-classifier.

Source

definitions.lisp.

Special Variable: *minibatch-size*

Number of samples to be used for one update of network parameters.

Package

neural-classifier.

Source

definitions.lisp.

Special Variable: *momentum-coeff*

Hyperparameter for SGD optimizers which use momentum; a value of zero reduces them to plain SGD. RMSprop also uses this parameter when accumulating the squared partial derivatives of the network parameters. Good values are 0.8-0.9.

Package

neural-classifier.

Source

definitions.lisp.
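
The sketch below is not part of the generated manual; it shows the assumed usage of dynamically rebinding these special variables around a training call. @c(*network*) and @c(*train-data*) are hypothetical placeholders for a network and a generator of @c((data-object . label)) pairs.

  ;; A minimal sketch, assuming hypothetical *NETWORK* and *TRAIN-DATA*.
  (let ((neural-classifier:*learn-rate*     0.005f0)
        (neural-classifier:*momentum-coeff* 0.9f0)
        (neural-classifier:*minibatch-size* 20)
        (neural-classifier:*decay-rate*     0.0f0))
    (neural-classifier:train-epoch *network* *train-data*))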


5.1.2 Ordinary functions

Function: calculate (neural-network object)

Calculate output from the network @c(neural-network) for the object @c(object). The input transformation function (specified by @c(:input-trans) when creating a network) is applied to @c(object), and the output transformation function (specified by @c(:output-trans)) is applied to the Nx1 output matrix produced by the network.

Package

neural-classifier.

Source

neural-network.lisp.

Function: idx-abs-max (matrix)

Return the index of the first element with the maximal absolute value by calling the isamax() function from BLAS. Works only on row or column matrices.

Package

neural-classifier.

Source

utility.lisp.
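
For illustration, a hedged one-liner (not from the manual) that builds a column with magicl and asks for the position of its largest-magnitude entry:

  ;; Returns the index of -0.9f0, the element with the largest absolute value.
  (neural-classifier:idx-abs-max
   (magicl:from-list '(0.1f0 -0.9f0 0.3f0) '(3 1) :type 'single-float))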

Function: make-neural-network (layout &key input-trans output-trans input-trans% label-trans activation-funcs)

Create a new neural network.
@begin(list)
@item(@c(layout) is a list of positive integers which specifies the number of neurons in each layer, starting from the input layer.)
@item(@c(activation-funcs) is a list whose elements are objects of type @c(activation). The length of this list must be equal to the length of @c(layout) minus one, because the input layer does not have an activation function. The last element must be of type @c(output-layer-activation) and all elements but the last must be of type @c(hidden-layer-activation).)
@item(@c(input-trans) is a function which is applied to an object passed to @c(calculate) to transform it into an input column (that is, a matrix of type @c(magicl:matrix/single-float) and shape @c(Nx1), where @c(N) is the first number in @c(layout)). For example, if we are recognizing digits from the MNIST set, this function can take the index of an image in the set and return a @c(784x1) matrix.)
@item(@c(output-trans) is a function which is applied to the output of the @c(calculate) function (a matrix of type @c(magicl:matrix/single-float) and shape @c(Mx1), where @c(M) is the last number in @c(layout)) to return some object with user-defined meaning (called a label). Again, if we are recognizing digits, this function transforms a @c(10x1) matrix into a number from 0 to 9.)
@item(@c(input-trans%) is just like @c(input-trans), but is used while training. It can include additional transformations to extend your training set (e.g. it can add some noise to the input data, rotate an input picture by a small random angle, etc.).)
@item(@c(label-trans) is a function which is applied to a label to get a column (a matrix of type @c(magicl:matrix/single-float) and shape @c(Mx1), where @c(M) is the last number in @c(layout)) which is the optimal output from the network for this object. With digit recognition, this function may take a digit @c(n) and return a @c(10x1) matrix of all zeros except for the @c(n)-th element, which would be @c(1f0).)
@end(list)
Default value for all transformation functions is @c(identity).

Package

neural-classifier.

Source

neural-network.lisp.
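
To make the description above concrete, here is a hedged sketch (not part of the generated documentation) of a 784-100-10 digit classifier. @c(load-digit) is a hypothetical helper which maps an image index to a 784x1 @c(magicl:matrix/single-float).

  ;; A minimal sketch with a hypothetical LOAD-DIGIT helper.
  (defparameter *network*
    (neural-classifier:make-neural-network
     '(784 100 10)
     :input-trans  #'load-digit                  ; image index -> 784x1 column
     :label-trans  (lambda (digit)               ; digit 0-9   -> 10x1 column
                     (let ((column (magicl:zeros '(10 1) :type 'single-float)))
                       (setf (magicl:tref column digit 0) 1.0f0)
                       column))
     :output-trans (lambda (output)              ; 10x1 column -> digit
                     (neural-classifier:idx-abs-max output))))

  ;; Run one object (here an image index) through the network.
  (neural-classifier:calculate *network* 42)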

Function: make-optimizer (type network)

Make an optimizer of type @c(type) for the network @c(network).

Package

neural-classifier.

Source

optimizers.lisp.

Function: rate (neural-network generator &key test)

Calculate the accuracy of @c(neural-network) (the ratio of correctly guessed samples to all samples) using testing data from the generator @c(generator). Each item returned by @c(generator) must be a cons pair in the form @c((data-object . label)), as with the @c(train-epoch) function. @c(test) is a function used to compare the expected label with the label returned by the network.

Package

neural-classifier.

Source

neural-network.lisp.

Function: train-epoch (neural-network generator &key optimizer learn-rate decay-rate minibatch-size)

Perform training of @c(neural-network) on every object returned by the generator @c(generator). Each item returned by @c(generator) must be a cons pair in the form @c((data-object . label)). The @c(input-trans%) and @c(label-trans) functions passed to @c(make-neural-network) are applied to the @c(car) and @c(cdr) of each pair respectively.

Package

neural-classifier.

Source

neural-network.lisp.
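
A hedged sketch of one training epoch followed by an accuracy measurement with @c(rate). It assumes @c(*network*) from the sketch above and hypothetical @c(*train-set*) and @c(*test-set*) lists of @c((data-object . label)) conses; the generators are built with the snakes library (a declared dependency), but the exact generator protocol expected here should be checked against the sources.

  ;; Hedged sketch: wrap a list of (data-object . label) conses in a generator.
  (snakes:defgenerator samples (list)
    (dolist (sample list)
      (snakes:yield sample)))

  (neural-classifier:train-epoch *network* (samples *train-set*))
  (neural-classifier:rate *network* (samples *test-set*) :test #'eql)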


5.1.3 Generic functions

Generic Reader: neural-network-input-trans (object)
Generic Writer: (setf neural-network-input-trans) (object)
Package

neural-classifier.

Methods
Reader Method: neural-network-input-trans ((neural-network neural-network))
Writer Method: (setf neural-network-input-trans) ((neural-network neural-network))

Function which translates an input object to a vector

Source

definitions.lisp.

Target Slot

input-trans.

Generic Reader: neural-network-input-trans% (object)
Generic Writer: (setf neural-network-input-trans%) (object)
Package

neural-classifier.

Methods
Reader Method: neural-network-input-trans% ((neural-network neural-network))
Writer Method: (setf neural-network-input-trans%) ((neural-network neural-network))

Function which translates an input object to a vector (used for training)

Source

definitions.lisp.

Target Slot

input-trans%.

Generic Reader: neural-network-label-trans (object)
Generic Writer: (setf neural-network-label-trans) (object)
Package

neural-classifier.

Methods
Reader Method: neural-network-label-trans ((neural-network neural-network))
Writer Method: (setf neural-network-label-trans) ((neural-network neural-network))

Function which translates a label to a vector

Source

definitions.lisp.

Target Slot

label-trans.

Generic Reader: neural-network-layout (object)
Package

neural-classifier.

Methods
Reader Method: neural-network-layout ((neural-network neural-network))

Number of neurons in each layer of the network

Source

definitions.lisp.

Target Slot

layout.

Generic Reader: neural-network-output-trans (object)
Generic Writer: (setf neural-network-output-trans) (object)
Package

neural-classifier.

Methods
Reader Method: neural-network-output-trans ((neural-network neural-network))
Writer Method: (setf neural-network-output-trans) ((neural-network neural-network))

Function which translates an output vector to a label.

Source

definitions.lisp.

Target Slot

output-trans.


5.1.4 Standalone methods

Method: initialize-instance :after ((neural-network neural-network) &rest initargs)
Source

neural-network.lisp.

Method: initialize-instance :after ((optimizer memoizing-optimizer) &rest initargs &key &allow-other-keys)
Source

optimizers.lisp.


5.1.5 Classes

Class: activation

Generic class for activation functions. Not to be instantiated.

Package

neural-classifier.

Source

activation.lisp.

Direct subclasses
Class: adagrad-optimizer

Adagrad optimizer

Package

neural-classifier.

Source

optimizers.lisp.

Direct superclasses

memoizing-optimizer.

Direct methods

learn.

Direct Default Initargs

Initarg          Value
:initial-value   1.0e-8

Class: hidden-layer-activation

Generic class for activation functions associated with hidden layers. Not to be instantiated.

Package

neural-classifier.

Source

activation.lisp.

Direct superclasses

activation.

Direct subclasses
Class: identity%

Identity activation function (does nothing on its input).

Package

neural-classifier.

Source

activation.lisp.

Direct superclasses

output-layer-activation.

Direct methods

activate.

Class: leaky-relu

Leaky ReLU activation function. It returns its argument when it is greater than zero, or the argument multiplied by @c(coeff) otherwise.

Package

neural-classifier.

Source

activation.lisp.

Direct superclasses

hidden-layer-activation.

Direct methods
Direct slots
Slot: coeff
Initform

0.0

Initargs

:coeff

Readers

leaky-relu-coeff.

Writers

This slot is read-only.
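
Activation objects are created with @c(make-instance) and passed to @c(make-neural-network) via @c(:activation-funcs). A hedged sketch (the layout and the @c(coeff) value are assumptions):

  ;; Leaky ReLU on the hidden layer, softmax on the output layer,
  ;; one activation object per non-input layer of a '(784 100 10) layout.
  (neural-classifier:make-neural-network
   '(784 100 10)
   :activation-funcs
   (list (make-instance 'neural-classifier:leaky-relu :coeff 0.01f0)
         (make-instance 'neural-classifier:softmax)))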

Class: momentum-optimizer

SGD optimizer with momentum

Package

neural-classifier.

Source

optimizers.lisp.

Direct superclasses

memoizing-optimizer.

Direct subclasses

nesterov-optimizer.

Direct methods
Direct slots
Slot: coeff
Type

single-float

Initform

neural-classifier:*momentum-coeff*

Initargs

:coeff

Readers

momentum-coeff.

Writers

(setf momentum-coeff).

Class: nesterov-optimizer

Nesterov accelerated SGD, an improvement of SGD with momentum

Package

neural-classifier.

Source

optimizers.lisp.

Direct superclasses

momentum-optimizer.

Direct methods

learn.

Class: neural-network

Class for neural networks

Package

neural-classifier.

Source

definitions.lisp.

Direct methods
Direct slots
Slot: layout

Number of neurons in each layer of the network

Type

list

Initform

(error "specify number of neurons in each layer")

Initargs

:layout

Readers

neural-network-layout.

Writers

This slot is read-only.

Slot: activation-funcs

List of activation functions.

Type

list

Initargs

:activation-funcs

Readers

neural-network-activation-funcs.

Writers

(setf neural-network-activation-funcs).

Slot: weights

Weight matrices for each layer

Type

list

Readers

neural-network-weights.

Writers

(setf neural-network-weights).

Slot: biases

Bias vectors for each layer

Type

list

Readers

neural-network-biases.

Writers

(setf neural-network-biases).

Slot: input-trans

Function which translates an input object to a vector

Type

function

Initform

(function identity)

Initargs

:input-trans

Readers

neural-network-input-trans.

Writers

(setf neural-network-input-trans).

Slot: output-trans

Function which translates an output vector to a label.

Type

function

Initform

(function identity)

Initargs

:output-trans

Readers

neural-network-output-trans.

Writers

(setf neural-network-output-trans).

Slot: input-trans%

Function which translates an input object to a vector (used for training)

Type

function

Initform

(function identity)

Initargs

:input-trans%

Readers

neural-network-input-trans%.

Writers

(setf neural-network-input-trans%).

Slot: label-trans

Function which translates a label to a vector

Type

function

Initform

(function identity)

Initargs

:label-trans

Readers

neural-network-label-trans.

Writers

(setf neural-network-label-trans).
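
The slot accessors above follow a @c(neural-network-) prefix convention. A hedged sketch of inspecting and tweaking an instance (assuming @c(*network*) from the earlier sketch with a '(784 100 10) layout):

  ;; Read the layout and replace the output transformation.
  (neural-classifier:neural-network-layout *network*)   ; => (784 100 10)
  (setf (neural-classifier:neural-network-output-trans *network*) #'identity)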

Class: output-layer-activation

Generic class for activation functions associated with an output layer. Not to be instantiated.

Package

neural-classifier.

Source

activation.lisp.

Direct superclasses

activation.

Direct subclasses
Class: rmsprop-optimizer

RMSprop optimizer

Package

neural-classifier.

Source

optimizers.lisp.

Direct superclasses

memoizing-optimizer.

Direct methods
Direct Default Initargs

Initarg          Value
:initial-value   1.0e-8

Direct slots
Slot: coeff
Type

single-float

Initform

neural-classifier:*momentum-coeff*

Initargs

:coeff

Readers

momentum-coeff.

Writers

(setf momentum-coeff).

Class: sgd-optimizer

The simplest SGD optimizer

Package

neural-classifier.

Source

optimizers.lisp.

Direct superclasses

optimizer.

Direct methods

learn.

Class: sigmoid

Sigmoid activation function.

Package

neural-classifier.

Source

activation.lisp.

Direct superclasses
Direct methods
Class: softmax

Softmax activation function.

Package

neural-classifier.

Source

activation.lisp.

Direct superclasses

output-layer-activation.

Direct methods

activate.

Class: tanh%

Hyperbolic tangent activation function.

Package

neural-classifier.

Source

activation.lisp.

Direct superclasses
Direct methods

5.2 Internals


5.2.1 Ordinary functions

Function: calculate-delta (neural-network z network-output expected-output)

Calculate partial derivative of the cost function by z for all layers

Package

neural-classifier.

Source

neural-network.lisp.

Function: calculate-gradient (neural-network sample)

Calculate gradient of the cost function

Package

neural-classifier.

Source

neural-network.lisp.

Function: calculate-gradient-minibatch (neural-network samples)

Calculate gradient of the cost function based on multiple input samples

Package

neural-classifier.

Source

neural-network.lisp.

Function: calculate-z-and-out (neural-network input)

Calculate argument and value of activation function for all layers

Package

neural-classifier.

Source

neural-network.lisp.

Function: nrandom-generator (&key μ σ)

Return a function which generates random values from a distribution N(μ, σ).

Package

neural-classifier.

Source

utility.lisp.

Function: sasum (matrix)
Package

neural-classifier.

Source

utility.lisp.

Function: standard-random ()

Return a random value sampled from a distribution N(0, 1).

Package

neural-classifier.

Source

utility.lisp.

Function: σ (z)

Sigmoid activation function.

Package

neural-classifier.

Source

activation.lisp.
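
For reference, a scalar sketch of the standard sigmoid assumed here (the library's version operates on MAGICL matrices):

  ;; sigma(z) = 1 / (1 + exp(-z))
  (defun sigmoid (z)
    (/ 1.0f0 (+ 1.0f0 (exp (- z)))))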


5.2.2 Generic functions

Generic Function: activate (vector activation)

Apply activation function ACTIVATION to a VECTOR. VECTOR is an output vector from a layer of a neural network.

Package

neural-classifier.

Source

activation.lisp.

Methods
Method: activate (vector (activation identity%))
Method: activate (vector (activation softmax))
Method: activate (vector (activation leaky-relu))
Method: activate (vector (activation tanh%))
Method: activate (vector (activation sigmoid))
Generic Function: activate' (vector type)

Apply derivative of activation function ACTIVATION to a VECTOR. VECTOR is an output vector from a layer of a neural network.

Package

neural-classifier.

Source

activation.lisp.

Methods
Method: activate' (vector (activation leaky-relu))
Method: activate' (vector (activation tanh%))
Method: activate' (vector (activation sigmoid))
Generic Reader: leaky-relu-coeff (object)
Package

neural-classifier.

Methods
Reader Method: leaky-relu-coeff ((leaky-relu leaky-relu))

automatically generated reader method

Source

activation.lisp.

Target Slot

coeff.

Generic Function: learn (optimizer neural-network samples)

Update network parameters using SAMPLES for training.

Package

neural-classifier.

Source

optimizers.lisp.

Methods
Method: learn ((optimizer rmsprop-optimizer) neural-network samples)
Method: learn ((optimizer adagrad-optimizer) neural-network samples)
Method: learn ((optimizer nesterov-optimizer) neural-network samples)
Method: learn ((optimizer momentum-optimizer) neural-network samples)
Method: learn ((optimizer sgd-optimizer) neural-network samples)
Generic Reader: momentum-coeff (object)
Package

neural-classifier.

Methods
Reader Method: momentum-coeff ((rmsprop-optimizer rmsprop-optimizer))

automatically generated reader method

Source

optimizers.lisp.

Target Slot

coeff.

Reader Method: momentum-coeff ((momentum-optimizer momentum-optimizer))

automatically generated reader method

Source

optimizers.lisp.

Target Slot

coeff.

Generic Writer: (setf momentum-coeff) (object)
Package

neural-classifier.

Methods
Writer Method: (setf momentum-coeff) ((rmsprop-optimizer rmsprop-optimizer))

automatically generated writer method

Source

optimizers.lisp.

Target Slot

coeff.

Writer Method: (setf momentum-coeff) ((momentum-optimizer momentum-optimizer))

automatically generated writer method

Source

optimizers.lisp.

Target Slot

coeff.

Generic Reader: neural-network-activation-funcs (object)
Generic Writer: (setf neural-network-activation-funcs) (object)
Package

neural-classifier.

Methods
Reader Method: neural-network-activation-funcs ((neural-network neural-network))
Writer Method: (setf neural-network-activation-funcs) ((neural-network neural-network))

List of activation functions.

Source

definitions.lisp.

Target Slot

activation-funcs.

Generic Reader: neural-network-biases (object)
Generic Writer: (setf neural-network-biases) (object)
Package

neural-classifier.

Methods
Reader Method: neural-network-biases ((neural-network neural-network))
Writer Method: (setf neural-network-biases) ((neural-network neural-network))

Bias vectors for each layer

Source

definitions.lisp.

Target Slot

biases.

Generic Reader: neural-network-weights (object)
Generic Writer: (setf neural-network-weights) (object)
Package

neural-classifier.

Methods
Reader Method: neural-network-weights ((neural-network neural-network))
Writer Method: (setf neural-network-weights) ((neural-network neural-network))

Weight matrices for each layer

Source

definitions.lisp.

Target Slot

weights.

Generic Reader: optimizer-biases (object)
Package

neural-classifier.

Methods
Reader Method: optimizer-biases ((memoizing-optimizer memoizing-optimizer))

automatically generated reader method

Source

optimizers.lisp.

Target Slot

biases.

Generic Writer: (setf optimizer-biases) (object)
Package

neural-classifier.

Methods
Writer Method: (setf optimizer-biases) ((memoizing-optimizer memoizing-optimizer))

automatically generated writer method

Source

optimizers.lisp.

Target Slot

biases.

Generic Reader: optimizer-initial-value (object)
Package

neural-classifier.

Methods
Reader Method: optimizer-initial-value ((memoizing-optimizer memoizing-optimizer))

automatically generated reader method

Source

optimizers.lisp.

Target Slot

initial-value.

Generic Writer: (setf optimizer-initial-value) (object)
Package

neural-classifier.

Methods
Writer Method: (setf optimizer-initial-value) ((memoizing-optimizer memoizing-optimizer))

automatically generated writer method

Source

optimizers.lisp.

Target Slot

initial-value.

Generic Reader: optimizer-weights (object)
Package

neural-classifier.

Methods
Reader Method: optimizer-weights ((memoizing-optimizer memoizing-optimizer))

automatically generated reader method

Source

optimizers.lisp.

Target Slot

weights.

Generic Writer: (setf optimizer-weights) (object)
Package

neural-classifier.

Methods
Writer Method: (setf optimizer-weights) ((memoizing-optimizer memoizing-optimizer))

automatically generated writer method

Source

optimizers.lisp.

Target Slot

weights.


5.2.3 Classes

Class: memoizing-optimizer

Optimizer which memoizes some old state related to weights and biases. Not to be instantiated.

Package

neural-classifier.

Source

optimizers.lisp.

Direct superclasses

optimizer.

Direct subclasses
Direct methods
Direct slots
Slot: weights
Type

list

Readers

optimizer-weights.

Writers

(setf optimizer-weights).

Slot: biases
Type

list

Readers

optimizer-biases.

Writers

(setf optimizer-biases).

Slot: initial-value
Type

single-float

Initform

0.0

Initargs

:initial-value

Readers

optimizer-initial-value.

Writers

(setf optimizer-initial-value).

Class: optimizer

Generic optimizer class. Not to be instantiated

Package

neural-classifier.

Source

optimizers.lisp.

Direct subclasses

Appendix A Indexes


A.1 Concepts


A.2 Functions

Index Entry  Section

(
(setf momentum-coeff): Private generic functions
(setf momentum-coeff): Private generic functions
(setf momentum-coeff): Private generic functions
(setf neural-network-activation-funcs): Private generic functions
(setf neural-network-activation-funcs): Private generic functions
(setf neural-network-biases): Private generic functions
(setf neural-network-biases): Private generic functions
(setf neural-network-input-trans%): Public generic functions
(setf neural-network-input-trans%): Public generic functions
(setf neural-network-input-trans): Public generic functions
(setf neural-network-input-trans): Public generic functions
(setf neural-network-label-trans): Public generic functions
(setf neural-network-label-trans): Public generic functions
(setf neural-network-output-trans): Public generic functions
(setf neural-network-output-trans): Public generic functions
(setf neural-network-weights): Private generic functions
(setf neural-network-weights): Private generic functions
(setf optimizer-biases): Private generic functions
(setf optimizer-biases): Private generic functions
(setf optimizer-initial-value): Private generic functions
(setf optimizer-initial-value): Private generic functions
(setf optimizer-weights): Private generic functions
(setf optimizer-weights): Private generic functions

A
activate: Private generic functions
activate: Private generic functions
activate: Private generic functions
activate: Private generic functions
activate: Private generic functions
activate: Private generic functions
activate': Private generic functions
activate': Private generic functions
activate': Private generic functions
activate': Private generic functions

C
calculate: Public ordinary functions
calculate-delta: Private ordinary functions
calculate-gradient: Private ordinary functions
calculate-gradient-minibatch: Private ordinary functions
calculate-z-and-out: Private ordinary functions

F
Function, calculate: Public ordinary functions
Function, calculate-delta: Private ordinary functions
Function, calculate-gradient: Private ordinary functions
Function, calculate-gradient-minibatch: Private ordinary functions
Function, calculate-z-and-out: Private ordinary functions
Function, idx-abs-max: Public ordinary functions
Function, make-neural-network: Public ordinary functions
Function, make-optimizer: Public ordinary functions
Function, nrandom-generator: Private ordinary functions
Function, rate: Public ordinary functions
Function, sasum: Private ordinary functions
Function, standard-random: Private ordinary functions
Function, train-epoch: Public ordinary functions
Function, σ: Private ordinary functions

G
Generic Function, (setf momentum-coeff): Private generic functions
Generic Function, (setf neural-network-activation-funcs): Private generic functions
Generic Function, (setf neural-network-biases): Private generic functions
Generic Function, (setf neural-network-input-trans%): Public generic functions
Generic Function, (setf neural-network-input-trans): Public generic functions
Generic Function, (setf neural-network-label-trans): Public generic functions
Generic Function, (setf neural-network-output-trans): Public generic functions
Generic Function, (setf neural-network-weights): Private generic functions
Generic Function, (setf optimizer-biases): Private generic functions
Generic Function, (setf optimizer-initial-value): Private generic functions
Generic Function, (setf optimizer-weights): Private generic functions
Generic Function, activate: Private generic functions
Generic Function, activate': Private generic functions
Generic Function, leaky-relu-coeff: Private generic functions
Generic Function, learn: Private generic functions
Generic Function, momentum-coeff: Private generic functions
Generic Function, neural-network-activation-funcs: Private generic functions
Generic Function, neural-network-biases: Private generic functions
Generic Function, neural-network-input-trans: Public generic functions
Generic Function, neural-network-input-trans%: Public generic functions
Generic Function, neural-network-label-trans: Public generic functions
Generic Function, neural-network-layout: Public generic functions
Generic Function, neural-network-output-trans: Public generic functions
Generic Function, neural-network-weights: Private generic functions
Generic Function, optimizer-biases: Private generic functions
Generic Function, optimizer-initial-value: Private generic functions
Generic Function, optimizer-weights: Private generic functions

I
idx-abs-max: Public ordinary functions
initialize-instance: Public standalone methods
initialize-instance: Public standalone methods

L
leaky-relu-coeff: Private generic functions
leaky-relu-coeff: Private generic functions
learn: Private generic functions
learn: Private generic functions
learn: Private generic functions
learn: Private generic functions
learn: Private generic functions
learn: Private generic functions

M
make-neural-network: Public ordinary functions
make-optimizer: Public ordinary functions
Method, (setf momentum-coeff): Private generic functions
Method, (setf momentum-coeff): Private generic functions
Method, (setf neural-network-activation-funcs): Private generic functions
Method, (setf neural-network-biases): Private generic functions
Method, (setf neural-network-input-trans%): Public generic functions
Method, (setf neural-network-input-trans): Public generic functions
Method, (setf neural-network-label-trans): Public generic functions
Method, (setf neural-network-output-trans): Public generic functions
Method, (setf neural-network-weights): Private generic functions
Method, (setf optimizer-biases): Private generic functions
Method, (setf optimizer-initial-value): Private generic functions
Method, (setf optimizer-weights): Private generic functions
Method, activate: Private generic functions
Method, activate: Private generic functions
Method, activate: Private generic functions
Method, activate: Private generic functions
Method, activate: Private generic functions
Method, activate': Private generic functions
Method, activate': Private generic functions
Method, activate': Private generic functions
Method, initialize-instance: Public standalone methods
Method, initialize-instance: Public standalone methods
Method, leaky-relu-coeff: Private generic functions
Method, learn: Private generic functions
Method, learn: Private generic functions
Method, learn: Private generic functions
Method, learn: Private generic functions
Method, learn: Private generic functions
Method, momentum-coeff: Private generic functions
Method, momentum-coeff: Private generic functions
Method, neural-network-activation-funcs: Private generic functions
Method, neural-network-biases: Private generic functions
Method, neural-network-input-trans: Public generic functions
Method, neural-network-input-trans%: Public generic functions
Method, neural-network-label-trans: Public generic functions
Method, neural-network-layout: Public generic functions
Method, neural-network-output-trans: Public generic functions
Method, neural-network-weights: Private generic functions
Method, optimizer-biases: Private generic functions
Method, optimizer-initial-value: Private generic functions
Method, optimizer-weights: Private generic functions
momentum-coeff: Private generic functions
momentum-coeff: Private generic functions
momentum-coeff: Private generic functions

N
neural-network-activation-funcs: Private generic functions
neural-network-activation-funcs: Private generic functions
neural-network-biases: Private generic functions
neural-network-biases: Private generic functions
neural-network-input-trans: Public generic functions
neural-network-input-trans: Public generic functions
neural-network-input-trans%: Public generic functions
neural-network-input-trans%: Public generic functions
neural-network-label-trans: Public generic functions
neural-network-label-trans: Public generic functions
neural-network-layout: Public generic functions
neural-network-layout: Public generic functions
neural-network-output-trans: Public generic functions
neural-network-output-trans: Public generic functions
neural-network-weights: Private generic functions
neural-network-weights: Private generic functions
nrandom-generator: Private ordinary functions

O
optimizer-biases: Private generic functions
optimizer-biases: Private generic functions
optimizer-initial-value: Private generic functions
optimizer-initial-value: Private generic functions
optimizer-weights: Private generic functions
optimizer-weights: Private generic functions

R
rate: Public ordinary functions

S
sasum: Private ordinary functions
standard-random: Private ordinary functions

T
train-epoch: Public ordinary functions

Σ
σ: Private ordinary functions


A.4 Data types

Index Entry  Section

A
activation: Public classes
activation.lisp: The neural-classifier/activation․lisp file
adagrad-optimizer: Public classes

C
Class, activation: Public classes
Class, adagrad-optimizer: Public classes
Class, hidden-layer-activation: Public classes
Class, identity%: Public classes
Class, leaky-relu: Public classes
Class, memoizing-optimizer: Private classes
Class, momentum-optimizer: Public classes
Class, nesterov-optimizer: Public classes
Class, neural-network: Public classes
Class, optimizer: Private classes
Class, output-layer-activation: Public classes
Class, rmsprop-optimizer: Public classes
Class, sgd-optimizer: Public classes
Class, sigmoid: Public classes
Class, softmax: Public classes
Class, tanh%: Public classes

D
definitions.lisp: The neural-classifier/definitions․lisp file

F
File, activation.lisp: The neural-classifier/activation․lisp file
File, definitions.lisp: The neural-classifier/definitions․lisp file
File, magicl-blas.lisp: The neural-classifier/magicl-blas․lisp file
File, neural-classifier.asd: The neural-classifier/neural-classifier․asd file
File, neural-network.lisp: The neural-classifier/neural-network․lisp file
File, optimizers.lisp: The neural-classifier/optimizers․lisp file
File, package.lisp: The neural-classifier/package․lisp file
File, utility.lisp: The neural-classifier/utility․lisp file

H
hidden-layer-activation: Public classes

I
identity%: Public classes

L
leaky-relu: Public classes

M
magicl-blas.lisp: The neural-classifier/magicl-blas․lisp file
memoizing-optimizer: Private classes
momentum-optimizer: Public classes

N
nesterov-optimizer: Public classes
neural-classifier: The neural-classifier system
neural-classifier: The neural-classifier package
neural-classifier.asd: The neural-classifier/neural-classifier․asd file
neural-network: Public classes
neural-network.lisp: The neural-classifier/neural-network․lisp file

O
optimizer: Private classes
optimizers.lisp: The neural-classifier/optimizers․lisp file
output-layer-activation: Public classes

P
Package, neural-classifier: The neural-classifier package
package.lisp: The neural-classifier/package․lisp file

R
rmsprop-optimizer: Public classes

S
sgd-optimizer: Public classes
sigmoid: Public classes
softmax: Public classes
System, neural-classifier: The neural-classifier system

T
tanh%: Public classes

U
utility.lisp: The neural-classifier/utility․lisp file