DenseLayer - Maple Help

DeepLearning

 DenseLayer
 create dense layer

Calling Sequence

 DenseLayer(units, opts)

Parameters

 units - positive integer
 opts - one or more options as specified below

Options

 • activation : string or symbol
 Specifies the activation function to use, one of deserialize, elu, exponential, gelu, get, hard_sigmoid, linear, relu, selu, serialize, sigmoid, softmax, softplus, softsign, swish, or tanh. Default is linear, the identity function.
 • inputshape : list of integers or the symbol auto
 Shape of the input Tensor, not including the batch axis.
 With the default value auto, the shape is inferred. If inference is not possible, an error is issued.
 This option need only be specified when this layer is the first in a Sequential model.
 • usebias : truefalse
 Specifies whether to use a bias vector. Default is true.

Description

 • The DenseLayer(units, opts) command creates a dense neural network layer with the dimensionality of the output space equal to units.
 • This function is part of the DeepLearning package, so it can be used in the short form DenseLayer(..) only after executing the command with(DeepLearning). However, it can always be accessed through the long form of the command by using DeepLearning[DenseLayer](..).

Details

 • The implementation of DenseLayer uses tf.keras.layers.Dense from the TensorFlow Python API. Consult the TensorFlow Python API documentation for tf.keras.layers.Dense for more information.
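 Concretely, a dense layer computes output = activation(input · kernel + bias). The following pure-Python sketch (illustrative only; the function and variable names are assumptions, not part of the Maple or TensorFlow API) shows this computation, with bias = None playing the role of usebias = false and the default activation being the identity (linear):

```python
# Illustrative sketch of the dense-layer computation:
#   output = activation(input . kernel + bias)
# Pure Python; NOT the actual TensorFlow implementation.

def dense_forward(x, kernel, bias=None, activation=None):
    """Apply a dense layer to the input vector x.

    kernel is a list of `units` weight rows, each of length len(x);
    bias (used when usebias = true) has length `units`.
    """
    out = []
    for j, row in enumerate(kernel):
        z = sum(w * xi for w, xi in zip(row, x))
        if bias is not None:              # corresponds to usebias = true
            z += bias[j]
        out.append(activation(z) if activation else z)  # default: linear
    return out

# Example: 3 inputs -> 2 units, linear activation
x = [1.0, 2.0, 3.0]
kernel = [[1.0, 0.0, -1.0],
          [0.5, 0.5, 0.5]]
bias = [0.0, 1.0]
print(dense_forward(x, kernel, bias))  # [-2.0, 4.0]
```

 Passing bias = None skips the bias addition, mirroring usebias = false in the options above.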

Examples

 > $\mathrm{with}\left(\mathrm{DeepLearning}\right)$
 $\left[{\mathrm{AddMultiple}}{,}{\mathrm{ApplyOperation}}{,}{\mathrm{BatchNormalizationLayer}}{,}{\mathrm{BidirectionalLayer}}{,}{\mathrm{BucketizedColumn}}{,}{\mathrm{CategoricalColumn}}{,}{\mathrm{Classify}}{,}{\mathrm{Concatenate}}{,}{\mathrm{Constant}}{,}{\mathrm{ConvolutionLayer}}{,}{\mathrm{DNNClassifier}}{,}{\mathrm{DNNLinearCombinedClassifier}}{,}{\mathrm{DNNLinearCombinedRegressor}}{,}{\mathrm{DNNRegressor}}{,}{\mathrm{Dataset}}{,}{\mathrm{DenseLayer}}{,}{\mathrm{DropoutLayer}}{,}{\mathrm{EinsteinSummation}}{,}{\mathrm{EmbeddingLayer}}{,}{\mathrm{Estimator}}{,}{\mathrm{FeatureColumn}}{,}{\mathrm{Fill}}{,}{\mathrm{FlattenLayer}}{,}{\mathrm{GRULayer}}{,}{\mathrm{GatedRecurrentUnitLayer}}{,}{\mathrm{GetDefaultGraph}}{,}{\mathrm{GetDefaultSession}}{,}{\mathrm{GetEagerExecution}}{,}{\mathrm{GetVariable}}{,}{\mathrm{GradientTape}}{,}{\mathrm{IdentityMatrix}}{,}{\mathrm{LSTMLayer}}{,}{\mathrm{Layer}}{,}{\mathrm{LinearClassifier}}{,}{\mathrm{LinearRegressor}}{,}{\mathrm{LongShortTermMemoryLayer}}{,}{\mathrm{MaxPoolingLayer}}{,}{\mathrm{Model}}{,}{\mathrm{NumericColumn}}{,}{\mathrm{OneHot}}{,}{\mathrm{Ones}}{,}{\mathrm{Operation}}{,}{\mathrm{Optimizer}}{,}{\mathrm{Placeholder}}{,}{\mathrm{RandomTensor}}{,}{\mathrm{ResetDefaultGraph}}{,}{\mathrm{Restore}}{,}{\mathrm{Save}}{,}{\mathrm{Sequential}}{,}{\mathrm{Session}}{,}{\mathrm{SetEagerExecution}}{,}{\mathrm{SetRandomSeed}}{,}{\mathrm{SoftMaxLayer}}{,}{\mathrm{SoftmaxLayer}}{,}{\mathrm{Tensor}}{,}{\mathrm{Variable}}{,}{\mathrm{Variables}}{,}{\mathrm{VariablesInitializer}}{,}{\mathrm{Zeros}}\right]$ (1)
 > $\mathrm{v1}≔\mathrm{Vector}\left(8,i↦i,\mathrm{datatype}=\mathrm{float}\left[8\right]\right)$
 ${\mathrm{v1}}{≔}\left[\begin{array}{c}{1.}\\ {2.}\\ {3.}\\ {4.}\\ {5.}\\ {6.}\\ {7.}\\ {8.}\end{array}\right]$ (2)
 > $\mathrm{v2}≔\mathrm{Vector}\left(8,\left[-1.0,1.0,5.0,11.0,19.0,29.0,41.0,55.0\right],\mathrm{datatype}=\mathrm{float}\left[8\right]\right)$
 ${\mathrm{v2}}{≔}\left[\begin{array}{c}{-1.}\\ {1.}\\ {5.}\\ {11.}\\ {19.}\\ {29.}\\ {41.}\\ {55.}\end{array}\right]$ (3)
 > $\mathrm{model}≔\mathrm{Sequential}\left(\left[\mathrm{DenseLayer}\left(2,\mathrm{inputshape}=\left[1\right]\right)\right]\right)$
 ${\mathrm{model}}{≔}{\mathrm{DeepLearning Model}}$ (4)
 > $\mathrm{model}:-\mathrm{Compile}\left(\mathrm{optimizer}="sgd",\mathrm{loss}="mean_squared_error"\right)$
 > $\mathrm{model}:-\mathrm{Fit}\left(\mathrm{v1},\mathrm{v2},\mathrm{epochs}=500\right)$
 > $\mathrm{model}:-\mathrm{Evaluate}\left(\left[10\right],\left[30\right]\right)$
 $\left\{{"loss"}{=}{849.608520507812}{,}{"accuracy"}{=}{0.}\right\}$ (6)
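 The targets above satisfy v2[i] = i^2 - i - 1, which is not linear in i, so a dense layer with a linear activation can only fit a least-squares line; this is why Evaluate reports a large loss at the point x = 10. What the training amounts to can be sketched in pure Python (illustrative only; this is not the DeepLearning or TensorFlow implementation) as plain per-sample SGD on mean squared error:

```python
# Pure-Python sketch: fit y ~ w*x + b by SGD on mean squared error,
# mirroring the linear fit learned in the example above.
# Illustrative only; NOT the DeepLearning/TensorFlow implementation.

xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
ys = [-1.0, 1.0, 5.0, 11.0, 19.0, 29.0, 41.0, 55.0]

w, b = 0.0, 0.0
lr = 0.005                               # assumed learning rate
for _ in range(500):                     # epochs = 500, as in the example
    for x, y in zip(xs, ys):             # plain per-sample SGD
        err = (w * x + b) - y
        w -= lr * 2 * err * x            # gradient of err^2 w.r.t. w
        b -= lr * 2 * err                # gradient of err^2 w.r.t. b

# The data is quadratic in x, so the linear fit is only approximate
mse = sum(((w * x + b) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
print(w, b, mse)
```

 For this data the exact least-squares line is y = 8x - 16 with mean squared error 21, so the SGD estimates should land near w = 8 and b = -16; the residual error reflects the model's inability to capture the quadratic target, just as in the Evaluate result above.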

Compatibility

 • The DeepLearning[DenseLayer] command was introduced in Maple 2021.
 • For more information on Maple 2021 changes, see Updates in Maple 2021.