This is the simple-neural-network Reference Manual, version 3.1, generated automatically by Declt version 4.0 beta 2 "William Riker" on Sun Dec 15 07:42:28 2024 GMT+0.
The main system appears first, followed by any subsystem dependency.
simple-neural-network
Simple neural network
Author: Guillaume Le Vaillant
License: GPL-3
Version: 3.1
Dependencies:
  cl-store (system).
  lparallel (system).
Child component:
  simple-neural-network.lisp (file).
Files are sorted by type and then listed depth-first from the systems' component trees.
simple-neural-network/simple-neural-network.asd
Parent component: simple-neural-network (system).
simple-neural-network/simple-neural-network.lisp
Parent component: simple-neural-network (system).
Public Interface:
accuracy
(function).
copy
(function).
create-neural-network
(function).
find-learning-rate
(function).
find-normalization
(function).
index-of-max-value
(function).
mean-absolute-error
(function).
neural-network-biases
(reader).
(setf neural-network-biases)
(writer).
neural-network-layers
(reader).
(setf neural-network-layers)
(writer).
neural-network-weights
(reader).
(setf neural-network-weights)
(writer).
predict
(function).
restore
(function).
same-category-p
(function).
store
(function).
train
(function).
Internals:
%dotimes
(macro).
%mapc
(macro).
activation
(function).
activation-prime
(function).
add-gradients
(function).
add-weight-gradient
(function).
average-gradient
(function).
average-gradients
(function).
backpropagate
(function).
clear-momentum
(function).
clear-momentums
(function).
compute-delta
(function).
compute-output-delta
(function).
compute-single-delta
(function).
compute-value
(function).
compute-values
(function).
copy-neural-network
(function).
denormalize
(function).
double-float-array
(type).
get-output
(function).
make-double-float-array
(function).
make-neural-network
(function).
make-random-weights
(function).
means
(function).
neural-network
(structure).
neural-network-bias-gradients
(reader).
(setf neural-network-bias-gradients)
(writer).
neural-network-bias-momentums
(reader).
(setf neural-network-bias-momentums)
(writer).
neural-network-deltas
(reader).
(setf neural-network-deltas)
(writer).
neural-network-p
(function).
neural-network-weight-gradients
(reader).
(setf neural-network-weight-gradients)
(writer).
neural-network-weight-momentums
(reader).
(setf neural-network-weight-momentums)
(writer).
normalize
(function).
propagate
(function).
set-input
(function).
standard-deviations
(function).
update-weights
(function).
update-weights-and-biases
(function).
Packages are listed by definition order.
simple-neural-network
Nickname: snn
Use list: common-lisp.
Public Interface:
accuracy
(function).
copy
(function).
create-neural-network
(function).
find-learning-rate
(function).
find-normalization
(function).
index-of-max-value
(function).
mean-absolute-error
(function).
neural-network-biases
(reader).
(setf neural-network-biases)
(writer).
neural-network-layers
(reader).
(setf neural-network-layers)
(writer).
neural-network-weights
(reader).
(setf neural-network-weights)
(writer).
predict
(function).
restore
(function).
same-category-p
(function).
store
(function).
train
(function).
Internals:
%dotimes
(macro).
%mapc
(macro).
activation
(function).
activation-prime
(function).
add-gradients
(function).
add-weight-gradient
(function).
average-gradient
(function).
average-gradients
(function).
backpropagate
(function).
clear-momentum
(function).
clear-momentums
(function).
compute-delta
(function).
compute-output-delta
(function).
compute-single-delta
(function).
compute-value
(function).
compute-values
(function).
copy-neural-network
(function).
denormalize
(function).
double-float-array
(type).
get-output
(function).
make-double-float-array
(function).
make-neural-network
(function).
make-random-weights
(function).
means
(function).
neural-network
(structure).
neural-network-bias-gradients
(reader).
(setf neural-network-bias-gradients)
(writer).
neural-network-bias-momentums
(reader).
(setf neural-network-bias-momentums)
(writer).
neural-network-deltas
(reader).
(setf neural-network-deltas)
(writer).
neural-network-p
(function).
neural-network-weight-gradients
(reader).
(setf neural-network-weight-gradients)
(writer).
neural-network-weight-momentums
(reader).
(setf neural-network-weight-momentums)
(writer).
normalize
(function).
propagate
(function).
set-input
(function).
standard-deviations
(function).
update-weights
(function).
update-weights-and-biases
(function).
Definitions are sorted by export status, category, package, and then by lexicographic order.
Exported definitions:
Function: accuracy
Return the rate of good guesses computed by the NEURAL-NETWORK when testing it with some INPUTS and TARGETS. TEST must be a function taking an output and a target and returning T if the output is considered close enough to the target, and NIL otherwise. SAME-CATEGORY-P is used by default.
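For example, given a hypothetical trained network NET and matching INPUTS and TARGETS (a sketch only: the full lambda list is not shown in this manual, so the argument order and the TEST keyword are assumptions):

  ;; Count an output as correct when every value is within 0.1 of the target.
  (snn:accuracy net inputs targets
                :test (lambda (output target)
                        (every (lambda (o tg) (< (abs (- o tg)) 0.1d0))
                               output target)))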
Function: copy
Return a copy of the NEURAL-NETWORK.
Function: create-neural-network
Create a neural network having INPUT-SIZE inputs, OUTPUT-SIZE outputs, and optionally some intermediary layers whose sizes are specified by HIDDEN-LAYERS-SIZES. The neural network is initialized with random weights and biases.
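A minimal sketch of creating a network, assuming the hidden layer sizes are passed as extra arguments after INPUT-SIZE and OUTPUT-SIZE and using the SNN package nickname:

  ;; A network with 2 inputs, 2 outputs and one hidden layer of 4 neurons.
  (defvar *net* (snn:create-neural-network 2 2 4))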
Function: find-learning-rate
Return the best learning rate found in ITERATIONS steps of dichotomic search (between MINIMUM and MAXIMUM). In each step, the NEURAL-NETWORK is trained EPOCHS times using some INPUTS, TARGETS, BATCH-SIZE and MOMENTUM-COEFFICIENT.
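A hedged sketch, assuming NEURAL-NETWORK, INPUTS and TARGETS are the required arguments and that the search parameters (ITERATIONS, MINIMUM, MAXIMUM, EPOCHS, BATCH-SIZE, MOMENTUM-COEFFICIENT) can be omitted to use their defaults:

  ;; INPUTS and TARGETS are the hypothetical training data used in the other
  ;; examples; returns a learning rate between MINIMUM and MAXIMUM.
  (snn:find-learning-rate *net* inputs targets)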
Function: find-normalization
Return four values. The first is a normalization function taking an input and returning a normalized input. Applying this normalization function to the inputs gives a data set in which each variable has mean 0 and standard deviation 1. The second is a denormalization function that can compute the original input from the normalized one. The third is the code of the normalization function. The fourth is the code of the denormalization function.
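For example (a sketch; INPUTS is assumed here to be a list of input vectors):

  (multiple-value-bind (normalize denormalize norm-code denorm-code)
      (snn:find-normalization inputs)
    (declare (ignore norm-code denorm-code))
    ;; Normalize every input before training or prediction.
    (let ((normalized-inputs (mapcar normalize inputs)))
      ;; DENORMALIZE maps a normalized input back to its original scale.
      (funcall denormalize (first normalized-inputs))))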
Function: index-of-max-value
Return the index of the greatest value in VALUES.
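For example (VALUES given here as a vector of double floats):

  ;; The greatest value, 0.7d0, is at index 1.
  (snn:index-of-max-value
   (make-array 3 :element-type 'double-float
                 :initial-contents '(0.1d0 0.7d0 0.2d0)))
  ;; => 1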
Function: mean-absolute-error
Return the mean absolute error on the outputs computed by the NEURAL-NETWORK when testing it with some INPUTS and TARGETS.
Function: predict
Return the output computed by the NEURAL-NETWORK for a given INPUT. If OUTPUT is not NIL, the output is written into it; otherwise a new vector is allocated.
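A sketch of reusing a preallocated output vector (assuming inputs and outputs are vectors of double floats whose lengths match the network's input and output layers):

  ;; *net* is the hypothetical 2-input / 2-output network created above.
  (let ((input (make-array 2 :element-type 'double-float
                             :initial-contents '(1d0 0d0)))
        (output (make-array 2 :element-type 'double-float
                              :initial-element 0d0)))
    ;; Writing the result into OUTPUT avoids allocating a new vector per call.
    (snn:predict *net* input output))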
Function: restore
Restore the neural network stored in PLACE, which must be a stream or a pathname-designator.
Function: same-category-p
Return T if calls to INDEX-OF-MAX-VALUE on OUTPUT and TARGET return the same value, and NIL otherwise.
Function: store
Store the NEURAL-NETWORK to PLACE, which must be a stream or a pathname-designator.
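STORE and RESTORE can be used together to persist a trained network (a sketch; the file name is hypothetical):

  (snn:store *net* #p"/tmp/simple-neural-network.store")
  (defvar *restored-net* (snn:restore #p"/tmp/simple-neural-network.store"))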
Function: train
Train the NEURAL-NETWORK with the given LEARNING-RATE and MOMENTUM-COEFFICIENT using some INPUTS and TARGETS. The weights are updated every BATCH-SIZE inputs.
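Putting the exported functions together, a hedged end-to-end sketch for learning XOR with one-hot targets. The exact lambda list of TRAIN is not shown here, so LEARNING-RATE is assumed to be a required argument with BATCH-SIZE and MOMENTUM-COEFFICIENT left at their defaults; inputs and targets are given as plain vectors of double floats, although the library may prefer specialized double-float arrays:

  (let* ((inputs (list (vector 0d0 0d0) (vector 0d0 1d0)
                       (vector 1d0 0d0) (vector 1d0 1d0)))
         ;; One-hot targets: first value high for "XOR is 0", second for "XOR is 1".
         (targets (list (vector 1d0 0d0) (vector 0d0 1d0)
                        (vector 0d0 1d0) (vector 1d0 0d0)))
         (net (snn:create-neural-network 2 2 4)))
    ;; Present the tiny data set many times.
    (dotimes (i 1000)
      (snn:train net inputs targets 0.1d0))
    ;; Fraction of inputs classified correctly, using the default SAME-CATEGORY-P.
    (snn:accuracy net inputs targets))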
Internal definitions:
Function: activation
Activation function for the neurons.
Function: activation-prime
Derivative of the activation function.
Function: add-gradients
Add the gradients computed for an input to the sum of the gradients for previous inputs.
Function: add-weight-gradient
Add the gradients computed for an input for the weights of the neuron at INDEX in a layer to the sum of the gradients for previous inputs.
Function: average-gradient
Compute the average gradients for a layer.
Function: average-gradients
Compute the average gradients for the whole NEURAL-NETWORK.
Function: backpropagate
Propagate the error of the output layer of the NEURAL-NETWORK back to the first layer and compute the gradients.
Function: clear-momentum
Reset momentum to 0.
Function: clear-momentums
Reset all the momentums to 0.
Function: compute-delta
Compute the error of the OUTPUT layer based on the error of the next layer.
Function: compute-output-delta
Compute the error between the output layer of the NEURAL-NETWORK and the TARGET.
Function: compute-single-delta
Compute the delta for the neuron at INDEX in the OUTPUT layer.
Function: compute-value
Compute the value of the neuron at INDEX in the OUTPUT layer.
Function: compute-values
Compute the values of the neurons in the OUTPUT layer.
Function: denormalize
Return the original input computed from its normalized variant.
Function: get-output
Return the output layer of the NEURAL-NETWORK.
Function: make-double-float-array
Make a new array of SIZE double floats.
Function: make-random-weights
Generate a matrix (OUTPUT-SIZE * INPUT-SIZE) of random weights.
Function: means
Return the means of the variables of the INPUTS.
Function: normalize
Return a normalized variant of the INPUT.
Function: propagate
Propagate the values of the input layer of the NEURAL-NETWORK to the output layer.
Function: set-input
Set the input layer of the NEURAL-NETWORK to INPUT.
Function: standard-deviations
Return the standard deviations of the variables of the INPUTS.
Function: update-weights
Update the WEIGHTS and MOMENTUMS of a layer and clear the GRADIENTS.
Function: update-weights-and-biases
Update all the weights and biases of the NEURAL-NETWORK.
Structure: neural-network
Direct superclasses: structure-object.