Layers

The base layer

This is the interface that every layer contract implements.

interface ILayer {
    error TensorTypeNotSupported();
    error IncorrectTensorType();
    error IncorrectTensorDim();

    // Upload a chunk of weight values starting at position `idx`.
    function appendWeights(Float32x32[] calldata weights, uint idx) external returns (uint, bool);

    // Total number of parameters the layer expects.
    function getParamsCount() external view returns (uint);

    // Run the layer's forward pass over the input tensors.
    function predict(Tensors.TensorData[] calldata input) external returns (Tensors.TensorData memory);
}
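
To show how these pieces fit together, below is a minimal, hypothetical sketch of a contract implementing ILayer. It is not taken from the library: the contract name, the zero-parameter identity behaviour, and the assumption that appendWeights returns the next write index plus a completion flag are all illustrative. The chunked appendWeights signature suggests that large weight sets are uploaded across several transactions (for example, to stay within block gas limits), which is the pattern the sketch follows.

// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Assumes ILayer, Float32x32 and Tensors are imported from the library;
// the exact import paths are not shown on this page.

// Hypothetical layer: it buffers whatever weights it receives and returns
// its input unchanged from predict. Real layers transform the tensors.
contract IdentityLayerSketch is ILayer {
    uint constant TOTAL_PARAMS = 0;   // an identity layer needs no parameters
    Float32x32[] internal weights;    // buffered parameters (unused here)

    // Store an incoming chunk of weights; report the next write position
    // and whether all expected parameters have been received.
    function appendWeights(Float32x32[] calldata chunk, uint idx)
        external
        override
        returns (uint, bool)
    {
        for (uint i = 0; i < chunk.length; i++) {
            weights.push(chunk[i]);
        }
        return (idx + chunk.length, weights.length >= TOTAL_PARAMS);
    }

    function getParamsCount() external pure override returns (uint) {
        return TOTAL_PARAMS;
    }

    // Forward pass: echo the first input tensor back to the caller.
    function predict(Tensors.TensorData[] calldata input)
        external
        override
        returns (Tensors.TensorData memory)
    {
        return input[0];
    }
}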

Layer activations

  • relu

  • sigmoid

  • softmax

  • leakyrelu

  • tanh

  • linear

Coming soon:

  • softplus

  • softsign

Core layers

  • Input layer

  • Dense layer

  • Embedding layer

Coming soon:

  • Masking layer

  • Lambda layer

  • Identity layer

Convolutional layers

  • Conv2D layer

Coming soon:

  • Conv3D layer

Pooling layers

  • MaxPooling2D layer

  • AveragePooling2D layer

Recurrent layers

  • SimpleRNN layer

  • LSTM layer

Coming soon:

  • GRU layer

  • Bidirectional layer

Normalization layers

Coming soon:

  • BatchNormalization layer

  • LayerNormalization layer

  • UnitNormalization layer

  • GroupNormalization layer

Reshaping layers

  • Flatten layer

  • Rescale layer

Coming soon:

  • ZeroPadding2D layer

  • Reshape layer

  • Permute layer

Merging layers

  • Add layer

  • Subtract layer

  • Multiply layer

Coming soon:

  • Concatenate layer

  • Average layer

  • Maximum layer

  • Minimum layer

  • Dot layer

Activation layers

  • ReLU layer

  • Sigmoid layer

Coming soon:

  • Softmax layer

  • LeakyReLU layer

  • PReLU layer

  • ELU layer

Special purpose layers

  • OnesLike layer

  • ZerosLike layer
