Layer API

The Layer class represents a basic computational unit in MLLM’s neural network framework. Layers are typically used to implement specific operations like linear transformations, convolutions, or activation functions.

#include "mllm/nn/Layer.hpp"

Base Class

class Layer

Base class for neural network layers. Each layer wraps a single operation, such as a linear transformation, convolution, or activation function, together with that operation's options.

Constructors

explicit Layer::Layer(const LayerImpl::ptr_t &impl)

Constructor with LayerImpl pointer.

Parameters:

impl – Shared pointer to LayerImpl instance

template<typename T>
Layer::Layer(OpTypes op_type, const T &cargo)

Constructor with operation type and options.

Parameters:
  • op_type – Operation type for the layer

  • cargo – Options for the operation
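
A minimal construction sketch. The op type value (kLinear) and the options struct (LinearOpOptions) used here are assumptions for illustration; the real enumerators and cargo types are defined in MLLM's op headers, and the Layer class is assumed to live under the mllm namespace.

#include "mllm/nn/Layer.hpp"

using namespace mllm;

// Hypothetical cargo type; substitute the real options struct for the op.
struct LinearOpOptions {
  int in_features;
  int out_features;
  bool bias;
};

Layer makeLinear() {
  LinearOpOptions opts{4096, 4096, /*bias=*/false};
  // Template constructor: pairs an operation type with its options (cargo).
  return Layer(OpTypes::kLinear, opts);  // kLinear is assumed, not documented here
}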

Core Methods

LayerImpl::ptr_t Layer::impl() const

Get the underlying LayerImpl pointer.

Returns:

Shared pointer to LayerImpl
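
A small sketch showing how impl() and the LayerImpl constructor pair up: since ptr_t is a shared pointer, two Layer handles built this way refer to the same underlying implementation.

// Sketch: build a second Layer handle over the same implementation object.
Layer shareHandle(const Layer &a) {
  LayerImpl::ptr_t p = a.impl();  // shared pointer to the underlying impl
  return Layer(p);                // both handles now share one LayerImpl
}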

std::vector<Tensor> Layer::__main(const std::vector<Tensor> &inputs)

Main execution method for the layer.

Parameters:

inputs – Input tensors

Returns:

Output tensors
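
A sketch of driving a layer through __main directly; Tensor is assumed to come from MLLM's tensor headers, and in practice the operator() generated by the forward macros (see Helper Macros below) is the more convenient entry point.

// Sketch: run a layer on a single input tensor via __main.
std::vector<Tensor> runOnce(Layer &layer, const Tensor &x) {
  std::vector<Tensor> inputs{x};
  std::vector<Tensor> outputs = layer.__main(inputs);  // executes the layer's op
  return outputs;
}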

OpTypes Layer::opType() const

Get the operation type of the layer.

Returns:

Operation type

BaseOpOptionsBase &Layer::refOptions()

Get reference to the layer’s options.

Returns:

Reference to BaseOpOptionsBase

Layer &Layer::to(DeviceTypes device_type)

Move the layer to the specified device.

Parameters:

device_type – Target device type (kCPU, kCUDA, etc.)

Returns:

Reference to this layer
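
Because to() returns a reference to the layer itself, device placement can be chained with other calls. The sketch below uses kCPU as a bare enumerator, as in the parameter description above; depending on the actual DeviceTypes definition it may need to be scoped (e.g. DeviceTypes::kCPU).

// Sketch: place the layer on the CPU backend and inspect its op type.
void placeOnCpu(Layer &layer) {
  OpTypes t = layer.to(kCPU).opType();  // to() returns *this, so calls chain
  (void)t;
}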

void Layer::__fmt_print(std::stringstream &ss)

Write formatted information about the layer into the given string stream.

Parameters:

ss – String stream that receives the formatted output
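
A short sketch of pretty-printing a layer with the documented __fmt_print signature.

#include <iostream>
#include <sstream>

// Collect the layer's formatted description into a string and print it.
void dumpLayer(Layer &layer) {
  std::stringstream ss;
  layer.__fmt_print(ss);         // the layer writes its description into ss
  std::cout << ss.str() << '\n';
}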

Helper Macros

MLLM_LAYER_ANY_INPUTS_1_OUTPUTS_FORWARD

Macro for defining operator() with any number of inputs and 1 output.

MLLM_LAYER_ANY_INPUTS_2_OUTPUTS_FORWARD

Macro for defining operator() with any number of inputs and 2 outputs.

MLLM_LAYER_ANY_INPUTS_3_OUTPUTS_FORWARD

Macro for defining operator() with any number of inputs and 3 outputs.
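
A hedged sketch of how a concrete single-output wrapper might use one of these macros. The exact expansion of MLLM_LAYER_ANY_INPUTS_1_OUTPUTS_FORWARD is defined in Layer.hpp; the kSiLU op type and the empty SiLUOptions cargo below are assumptions for illustration only.

// Sketch: a single-output activation wrapper. The macro is expected to
// generate an operator() that forwards its Tensor arguments to __main and
// returns the single output tensor; see Layer.hpp for the actual definition.
struct SiLUOptions {};  // hypothetical empty cargo

class SiLU final : public Layer {
 public:
  SiLU() : Layer(OpTypes::kSiLU, SiLUOptions{}) {}  // kSiLU is assumed

  MLLM_LAYER_ANY_INPUTS_1_OUTPUTS_FORWARD
};

// Usage sketch:
//   SiLU act;
//   Tensor y = act(x);  // operator() provided by the macro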