aihwkit.cloud.converter.v1.i_mappings module
Mappings for version 1 of the AIHW Composer format.
- class aihwkit.cloud.converter.v1.i_mappings.Function(id_, args)[source]
Bases: object
Mapping for a function-like entity.
- Parameters:
id_ (str) – identifier of the entity in the AIHW Composer format.
args (Dict) – arguments of the entity.
- from_proto(source, cls, default=None)[source]
Convert a proto object into a destination object.
- Parameters:
source (Any) – source proto object.
cls (type) – type of the destination object.
default (Any | None) – value to use when the source field is not set.
- Return type:
object
- get_argument_from_proto(source, field, default=None)[source]
Get the value of an argument.
- Parameters:
source (Any) – proto object to read the argument from.
field (str) – name of the argument field.
default (Any | None) – value to use when the field is not set.
- Return type:
Dict
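The argument-extraction pattern above can be sketched with a stdlib-only stand-in. This is an illustrative assumption about the behavior, not the library's code: `FakeProto` and its fields are hypothetical placeholders for a real protobuf message.

```python
# Minimal sketch (assumption, not aihwkit source) of how a Function-style
# mapping pulls a named argument out of a proto-like message, falling back
# to a default when the field is absent.
from dataclasses import dataclass, field
from typing import Any, Dict, Optional


@dataclass
class FakeProto:
    """Hypothetical stand-in for a protobuf message with named fields."""
    fields: Dict[str, Any] = field(default_factory=dict)


def get_argument_from_proto(source: FakeProto, name: str,
                            default: Optional[Any] = None) -> Dict:
    """Return {name: value}, using the default when the field is not set."""
    value = source.fields.get(name, default)
    return {name: value}


proto = FakeProto({'kernel_size': 3})
print(get_argument_from_proto(proto, 'kernel_size'))        # {'kernel_size': 3}
print(get_argument_from_proto(proto, 'stride', default=1))  # {'stride': 1}
```

The returned single-entry dict makes it easy to merge several extracted arguments into one keyword-argument mapping for the destination constructor.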
- class aihwkit.cloud.converter.v1.i_mappings.InverseMappings[source]
Bases: object
Mappings between the AIHW Composer format and Python entities.
- activation_functions = {
    'LeakyReLU': torch.nn.modules.activation.LeakyReLU,
    'LogSigmoid': torch.nn.modules.activation.LogSigmoid,
    'LogSoftmax': torch.nn.modules.activation.LogSoftmax,
    'ReLU': torch.nn.modules.activation.ReLU,
    'Sigmoid': torch.nn.modules.activation.Sigmoid,
    'Softmax': torch.nn.modules.activation.Softmax,
    'Tanh': torch.nn.modules.activation.Tanh,
  }
- datasets = {
    'fashion_mnist': torchvision.datasets.mnist.FashionMNIST,
    'svhn': torchvision.datasets.svhn.SVHN,
  }
- layers = {
    'AnalogConv2d': aihwkit.nn.modules.conv.AnalogConv2d,
    'AnalogConv2dMapped': aihwkit.nn.modules.conv_mapped.AnalogConv2dMapped,
    'AnalogLinear': aihwkit.nn.modules.linear.AnalogLinear,
    'AnalogLinearMapped': aihwkit.nn.modules.linear_mapped.AnalogLinearMapped,
    'BatchNorm2d': torch.nn.modules.batchnorm.BatchNorm2d,
    'Conv2d': torch.nn.modules.conv.Conv2d,
    'ConvTranspose2d': torch.nn.modules.conv.ConvTranspose2d,
    'Flatten': torch.nn.modules.flatten.Flatten,
    'Linear': torch.nn.modules.linear.Linear,
    'MaxPool2d': torch.nn.modules.pooling.MaxPool2d,
  }
- loss_functions = {
    'BCELoss': torch.nn.modules.loss.BCELoss,
    'CrossEntropyLoss': torch.nn.modules.loss.CrossEntropyLoss,
    'MSELoss': torch.nn.modules.loss.MSELoss,
    'NLLLoss': torch.nn.modules.loss.NLLLoss,
  }
- optimizers = {
    'AnalogSGD': aihwkit.optim.analog_optimizer.AnalogSGD,
  }
- presets = {
    'InferenceRPUConfig': aihwkit.simulator.configs.configs.InferenceRPUConfig,
    'OldWebComposerInferenceRPUConfig': aihwkit.simulator.presets.web.OldWebComposerInferenceRPUConfig,
    'WebComposerInferenceRPUConfig': aihwkit.simulator.presets.web.WebComposerInferenceRPUConfig,
  }
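The inverse-mapping pattern (serialized name → Python class) can be sketched with hypothetical stand-in classes; the real tables above map names to torch and aihwkit classes instead, and `build_layer` is an illustrative helper, not a library function.

```python
# Illustrative sketch of the inverse-mapping lookup (not aihwkit code):
# a serialized AIHW Composer identifier is resolved to a Python class,
# which is then instantiated with the deserialized arguments.
class Linear:
    """Hypothetical stand-in for torch.nn.Linear."""
    def __init__(self, in_features, out_features):
        self.in_features = in_features
        self.out_features = out_features


INVERSE_LAYERS = {'Linear': Linear}   # mirrors InverseMappings.layers


def build_layer(name: str, args: dict):
    """Instantiate the Python class registered under a serialized name."""
    cls = INVERSE_LAYERS[name]
    return cls(**args)


layer = build_layer('Linear', {'in_features': 4, 'out_features': 2})
print(type(layer).__name__, layer.in_features, layer.out_features)  # Linear 4 2
```

Keying on the serialized string makes deserialization a plain dict lookup, so unsupported layer names fail fast with a `KeyError` instead of silently constructing the wrong class.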
- class aihwkit.cloud.converter.v1.i_mappings.LayerFunction(id_, args)[source]
Bases: Function
Mapping for a function-like entity (layer).
- Parameters:
id_ (str) – identifier of the layer in the AIHW Composer format.
args (Dict) – arguments of the layer.
- class aihwkit.cloud.converter.v1.i_mappings.Mappings[source]
Bases: object
Mappings between Python entities and the AIHW Composer format.
- activation_functions = {
    torch.nn.modules.activation.LeakyReLU: <Function object>,
    torch.nn.modules.activation.LogSigmoid: <Function object>,
    torch.nn.modules.activation.LogSoftmax: <Function object>,
    torch.nn.modules.activation.ReLU: <Function object>,
    torch.nn.modules.activation.Sigmoid: <Function object>,
    torch.nn.modules.activation.Softmax: <Function object>,
    torch.nn.modules.activation.Tanh: <Function object>,
  }
- datasets = {
    torchvision.datasets.mnist.FashionMNIST: 'fashion_mnist',
    torchvision.datasets.svhn.SVHN: 'svhn',
  }
- layers = {
    aihwkit.nn.modules.conv.AnalogConv2d: <LayerFunction object>,
    aihwkit.nn.modules.conv_mapped.AnalogConv2dMapped: <LayerFunction object>,
    aihwkit.nn.modules.linear.AnalogLinear: <LayerFunction object>,
    aihwkit.nn.modules.linear_mapped.AnalogLinearMapped: <LayerFunction object>,
    torch.nn.modules.batchnorm.BatchNorm2d: <LayerFunction object>,
    torch.nn.modules.conv.Conv2d: <LayerFunction object>,
    torch.nn.modules.conv.ConvTranspose2d: <LayerFunction object>,
    torch.nn.modules.flatten.Flatten: <LayerFunction object>,
    torch.nn.modules.linear.Linear: <LayerFunction object>,
    torch.nn.modules.pooling.MaxPool2d: <LayerFunction object>,
  }
- loss_functions = {
    torch.nn.modules.loss.BCELoss: <Function object>,
    torch.nn.modules.loss.CrossEntropyLoss: <Function object>,
    torch.nn.modules.loss.MSELoss: <Function object>,
    torch.nn.modules.loss.NLLLoss: <Function object>,
  }
- optimizers = {
    aihwkit.optim.analog_optimizer.AnalogSGD: <Function object>,
  }
- presets = {
    aihwkit.simulator.configs.configs.InferenceRPUConfig: 'InferenceRPUConfig',
    aihwkit.simulator.presets.web.OldWebComposerInferenceRPUConfig: 'OldWebComposerInferenceRPUConfig',
    aihwkit.simulator.presets.web.WebComposerInferenceRPUConfig: 'WebComposerInferenceRPUConfig',
  }
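The forward direction keys these tables on the class itself, so a live object is serialized by looking up its type. The following is a sketch of that lookup under stated assumptions: `AnalogSGD` here is a hypothetical stand-in class, and `serialize_optimizer` is an illustrative helper, not part of the aihwkit API.

```python
# Sketch of the forward-mapping lookup (assumption about usage, not the
# library's code path): a Python object is serialized by resolving its
# class in a Mappings-style table keyed on the type.
class AnalogSGD:
    """Hypothetical stand-in for aihwkit.optim.analog_optimizer.AnalogSGD."""


OPTIMIZERS = {AnalogSGD: 'AnalogSGD'}   # mirrors Mappings.optimizers


def serialize_optimizer(opt) -> str:
    """Return the AIHW Composer identifier registered for opt's class."""
    try:
        return OPTIMIZERS[type(opt)]
    except KeyError:
        raise ValueError(f'unsupported optimizer: {type(opt).__name__}')


print(serialize_optimizer(AnalogSGD()))  # AnalogSGD
```

Using `type(opt)` rather than `isinstance` checks means subclasses are not matched implicitly; each supported class must be registered explicitly in the table.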