public class DifferentiableFeedforwardNeuralNetwork extends FeedforwardNeuralNetwork implements GradientDescendable
| Constructor and Description |
|---|
| `DifferentiableFeedforwardNeuralNetwork(java.util.ArrayList<java.lang.Integer> nodesPerLayer, java.util.ArrayList<DifferentiableUnivariateScalarFunction> layerActivationFunctions, java.util.Random random)` Creates a new instance of DifferentiableFeedforwardNeuralNetwork. |
| `DifferentiableFeedforwardNeuralNetwork(DifferentiableGeneralizedLinearModel... layers)` Creates a new instance of DifferentiableFeedforwardNeuralNetwork. |
| `DifferentiableFeedforwardNeuralNetwork(int numInputs, int numHiddens, int numOutputs, DifferentiableUnivariateScalarFunction scalarFunction, java.util.Random random)` Creates a new instance of DifferentiableFeedforwardNeuralNetwork. |
| `DifferentiableFeedforwardNeuralNetwork(int numInputs, int numHiddens, int numOutputs, DifferentiableVectorFunction activationFunction, java.util.Random random)` Creates a new instance of DifferentiableFeedforwardNeuralNetwork. |
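As a rough illustration of what the `(numInputs, numHiddens, numOutputs, scalarFunction, random)` constructor builds — a three-layer network whose two weight layers are each followed by an element-wise squashing function — here is a minimal self-contained sketch. The class and method names below are illustrative stand-ins, not the Foundry API; `tanh` stands in for an arbitrary `DifferentiableUnivariateScalarFunction`.

```java
import java.util.Random;

// Stand-in for the three-layer constructor: numInputs -> numHiddens -> numOutputs,
// with one univariate squashing function applied element-wise after each weight layer.
// Mirrors the structure only; it is NOT the Foundry implementation.
class TinyFeedforwardSketch {
    final double[][] w1; // numHiddens x numInputs
    final double[][] w2; // numOutputs x numHiddens

    TinyFeedforwardSketch(int numInputs, int numHiddens, int numOutputs, Random random) {
        w1 = randomMatrix(numHiddens, numInputs, random);
        w2 = randomMatrix(numOutputs, numHiddens, random);
    }

    static double[][] randomMatrix(int rows, int cols, Random random) {
        double[][] m = new double[rows][cols];
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++)
                m[i][j] = random.nextGaussian() * 0.1; // small random initial weights
        return m;
    }

    static double squash(double x) { return Math.tanh(x); } // stand-in squashing function

    static double[] apply(double[][] w, double[] x) {
        double[] y = new double[w.length];
        for (int i = 0; i < w.length; i++) {
            double s = 0.0;
            for (int j = 0; j < x.length; j++) s += w[i][j] * x[j];
            y[i] = squash(s);
        }
        return y;
    }

    double[] evaluate(double[] input) {
        return apply(w2, apply(w1, input)); // hidden layer, then output layer
    }
}
```

Note that, as in the first constructor above, an N-layer network has N-1 squashing steps: the input layer itself is not squashed.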
| Modifier and Type | Method and Description |
|---|---|
| `DifferentiableFeedforwardNeuralNetwork` | `clone()` This makes public the clone method on the Object class and removes the exception that it throws. |
| `Matrix` | `computeParameterGradient(Vector input)` Computes the derivative of the function about the input with respect to the parameters of the function. |
| `java.util.ArrayList<DifferentiableGeneralizedLinearModel>` | `getLayers()` Getter for layers. |
Methods inherited from class FeedforwardNeuralNetwork:
convertFromVector, convertToVector, evaluate, evaluateAtEachLayer, setLayers, toString

Methods inherited from class java.lang.Object:
equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Vectorizable:
convertFromVector, convertToVector
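The inherited `convertToVector`/`convertFromVector` pair is what lets a gradient-descent routine treat all of the network's weights as one flat parameter vector. The following is a conceptual sketch of that round trip, assuming a simple row-by-row flattening of each layer's weight matrix; the Foundry's actual parameter ordering may differ.

```java
// Conceptual sketch of convertToVector / convertFromVector for a network whose
// parameters are the entries of its weight matrices, flattened row by row.
// Illustrative only; the library's real ordering may differ.
class WeightFlattenSketch {
    // Flatten every layer's weight matrix into one parameter vector.
    static double[] convertToVector(double[][][] layers) {
        int n = 0;
        for (double[][] w : layers) n += w.length * w[0].length;
        double[] v = new double[n];
        int k = 0;
        for (double[][] w : layers)
            for (double[] row : w)
                for (double x : row) v[k++] = x;
        return v;
    }

    // Write a parameter vector back into the layers' weight matrices, in place.
    static void convertFromVector(double[][][] layers, double[] v) {
        int k = 0;
        for (double[][] w : layers)
            for (int i = 0; i < w.length; i++)
                for (int j = 0; j < w[i].length; j++) w[i][j] = v[k++];
    }
}
```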
public DifferentiableFeedforwardNeuralNetwork(java.util.ArrayList<java.lang.Integer> nodesPerLayer, java.util.ArrayList<DifferentiableUnivariateScalarFunction> layerActivationFunctions, java.util.Random random)

Creates a new instance of DifferentiableFeedforwardNeuralNetwork.

Parameters:
nodesPerLayer - Number of nodes in each layer; must have no fewer than 2 layers.
layerActivationFunctions - Squashing function to assign to each layer; must have one fewer squashing function than you do layers (that is, the input layer has no squashing).
random - The random number generator for initial weights.

public DifferentiableFeedforwardNeuralNetwork(int numInputs, int numHiddens, int numOutputs, DifferentiableVectorFunction activationFunction, java.util.Random random)

Creates a new instance of DifferentiableFeedforwardNeuralNetwork.

Parameters:
numInputs - Number of nodes in the input layer.
numHiddens - Number of nodes in the hidden (middle) layer.
numOutputs - Number of nodes in the output layer.
activationFunction - Squashing function to assign to all layers.
random - The random number generator for the initial weights.

public DifferentiableFeedforwardNeuralNetwork(int numInputs, int numHiddens, int numOutputs, DifferentiableUnivariateScalarFunction scalarFunction, java.util.Random random)

Creates a new instance of DifferentiableFeedforwardNeuralNetwork.

Parameters:
numInputs - Number of nodes in the input layer.
numHiddens - Number of nodes in the hidden (middle) layer.
numOutputs - Number of nodes in the output layer.
scalarFunction - Squashing function to assign to all layers.
random - The random number generator for the initial weights.

public DifferentiableFeedforwardNeuralNetwork(DifferentiableGeneralizedLinearModel... layers)

Creates a new instance of DifferentiableFeedforwardNeuralNetwork.

Parameters:
layers - Layers of the neural network.

public DifferentiableFeedforwardNeuralNetwork clone()
Description copied from class: AbstractCloneableSerializable

This makes public the clone method on the Object class and removes the exception that it throws. Its default behavior is to automatically create a clone of the exact type of object that the clone is called on and to copy all primitives but to keep all references, which means it is a shallow copy. Extensions of this class may want to override this method (but call super.clone()) to implement a "smart copy"; that is, to target the most common use case for creating a copy of the object. Because the default behavior is a shallow copy, extending classes only need to handle fields that need to have a deeper copy (or those that need to be reset). Some of the methods in ObjectUtil may be helpful in implementing a custom clone method. Note: The contract of this method is that you must use super.clone() as the basis for your implementation.

Specified by:
clone in interface GradientDescendable
clone in interface Vectorizable
clone in interface VectorizableVectorFunction
clone in interface CloneableSerializable
Overrides:
clone in class FeedforwardNeuralNetwork
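The "smart copy" contract described above can be sketched as follows. The class below is a hypothetical stand-in (not the Foundry's implementation): it uses `super.clone()` as the required basis, then deep-copies only its one mutable field.

```java
import java.util.ArrayList;

// Sketch of the clone contract: start from super.clone() (a shallow copy),
// then deep-copy only the fields that need independence from the original.
class CloneContractSketch implements Cloneable {
    ArrayList<double[]> layers = new ArrayList<>();

    @Override
    public CloneContractSketch clone() {
        try {
            // super.clone() copies primitives and keeps references (shallow copy).
            CloneContractSketch copy = (CloneContractSketch) super.clone();
            // Deep-copy the mutable layer list so the clone is independent.
            copy.layers = new ArrayList<>();
            for (double[] layer : layers) copy.layers.add(layer.clone());
            return copy;
        } catch (CloneNotSupportedException e) {
            throw new AssertionError(e); // cannot happen: we implement Cloneable
        }
    }
}
```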
public java.util.ArrayList<DifferentiableGeneralizedLinearModel> getLayers()

Description copied from class: FeedforwardNeuralNetwork

Getter for layers.

Overrides:
getLayers in class FeedforwardNeuralNetwork
public Matrix computeParameterGradient(Vector input)

Description copied from interface: GradientDescendable

Computes the derivative of the function about the input with respect to the parameters of the function.

Specified by:
computeParameterGradient in interface GradientDescendable
computeParameterGradient in interface ParameterGradientEvaluator<Vector,Vector,Matrix>

Parameters:
input - Point about which to differentiate w.r.t. the parameters.
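To make the idea of a parameter gradient concrete, here is a self-contained sketch for a single squashed linear layer y_i = f(Σ_j W[i][j]·x[j]) with f = tanh: entry (i, k) of the result is the derivative of output i with respect to flattened parameter k. The flattening order and matrix layout here are illustrative assumptions, not the Foundry's actual conventions.

```java
// For y_i = tanh(s_i), s_i = sum_j W[i][j] * x[j], the derivative of output i
// with respect to parameter W[i][j] is tanh'(s_i) * x[j]; outputs other than i
// do not depend on row i's weights, so those gradient entries are zero.
// Parameters are flattened row by row: k = i * cols + j (an assumed layout).
class ParameterGradientSketch {
    static double[][] computeParameterGradient(double[][] w, double[] x) {
        int rows = w.length, cols = x.length;
        double[][] grad = new double[rows][rows * cols];
        for (int i = 0; i < rows; i++) {
            double s = 0.0;
            for (int j = 0; j < cols; j++) s += w[i][j] * x[j];
            double fprime = 1.0 - Math.tanh(s) * Math.tanh(s); // d tanh(s) / ds
            for (int j = 0; j < cols; j++) grad[i][i * cols + j] = fprime * x[j];
        }
        return grad;
    }
}
```

A finite-difference check (perturb one weight, re-evaluate, and compare the slope) is a useful sanity test for any such gradient computation.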