aihwkit.simulator.tiles.module module

Tile module base.

class aihwkit.simulator.tiles.module.TileModule[source]

Bases: Module, TileModuleBase

Class of all tiles based on torch.Module.

A TileModule class inherits from three base classes, for example:

class MyTile(TileModule, MyBaseTile, MySimulatorTileWrapper)

where MyBaseTile itself derives from BaseTile. Assuming this structure, TileModule defines utility methods, such as cuda, that traverse all of these base classes (see the sketch below).
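The following is a minimal structural sketch of that inheritance pattern. MyBaseTile and MySimulatorTileWrapper are placeholder names used only for illustration; they are not classes shipped with aihwkit.

    # Structural sketch only: MyBaseTile and MySimulatorTileWrapper are
    # placeholder names, not classes provided by aihwkit.
    from aihwkit.simulator.tiles.module import TileModule

    class MyBaseTile:
        """Placeholder for the class providing the tile functionality."""

    class MySimulatorTileWrapper:
        """Placeholder for the wrapper around the underlying simulator tile."""

    class MyTile(TileModule, MyBaseTile, MySimulatorTileWrapper):
        """Custom tile combining the three bases.

        Utility methods defined on TileModule (for example cuda) walk
        through all of these base classes when the tile is moved.
        """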

compatible_with(rpu_config)[source]

Checks whether the current RPUConfig is compatible with the given one.

Parameters:

rpu_config (RPUConfigBase) – New RPUConfig to check against

Returns:

  • success – Whether the given RPUConfig is compatible.

  • msg – Error message if not.
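A hedged usage sketch follows. It assumes AnalogTile, SingleRPUConfig and ConstantStepDevice are importable from the paths shown, and it does not rely on the exact return shape (a bare success flag versus a flag plus message), which may differ between versions.

    # Hedged sketch: the tile construction paths are assumptions; the call to
    # compatible_with follows the signature documented above.
    from aihwkit.simulator.tiles import AnalogTile
    from aihwkit.simulator.configs import SingleRPUConfig
    from aihwkit.simulator.configs.devices import ConstantStepDevice

    tile = AnalogTile(4, 6, rpu_config=SingleRPUConfig(device=ConstantStepDevice()))

    # Ask whether a modified RPUConfig could replace the current one. The
    # result is printed as-is since the exact return shape may vary.
    candidate = SingleRPUConfig(device=ConstantStepDevice(dw_min=0.01))
    print(tile.compatible_with(candidate))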

cpu()[source]

Return a copy of this tile in CPU memory.

Returns:

Self with the underlying C++ tile moved to CPU memory.

Raises:

CudaError – if the library has not been compiled with CUDA.

Return type:

TileModule

cuda(device=None)[source]

Return a copy of this tile in CUDA memory.

Parameters:

device (str | device | int | None) – CUDA device

Returns:

Self with the underlying C++ tile moved to CUDA memory.

Raises:

CudaError – if the library has not been compiled with CUDA.

Return type:

TileModule
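A hedged sketch of moving a tile to CUDA memory; tile is any TileModule instance (for example the AnalogTile constructed above), and the CudaError import path is assumed to be aihwkit.exceptions.

    # Hedged sketch: `tile` is any TileModule instance, e.g. the AnalogTile
    # constructed above; CudaError is assumed to live in aihwkit.exceptions.
    import torch
    from aihwkit.exceptions import CudaError

    try:
        tile = tile.cuda(torch.device("cuda", 0))
    except CudaError:
        # Raised when the aihwkit library was not compiled with CUDA support.
        print("aihwkit was built without CUDA; keeping the tile in CPU memory.")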

extra_repr()[source]

Return the extra string representation of the module.

Returns:

A string with the extra representation.

Return type:

str

static get_analog_state_name(prefix)[source]

Returns the analog state name.

Parameters:

prefix (str) –

Return type:

str

is_floating_point()[source]

Dummy implementation so that .to works.

Return type:

bool

replace_with(rpu_config)[source]

Replaces the current RPUConfig with the given one.

Parameters:

rpu_config (RPUConfigBase) – New RPUConfig to replace the current one with

Raises:

TileModuleError – if given RPUConfig is not compatible.

Return type:

None
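A hedged sketch of swapping the RPUConfig in place; the import path of TileModuleError is an assumption, and tile is the AnalogTile built in the earlier sketch.

    # Hedged sketch: the import path of TileModuleError is an assumption;
    # `tile` is the AnalogTile constructed in the earlier sketch.
    from aihwkit.exceptions import TileModuleError
    from aihwkit.simulator.configs import SingleRPUConfig
    from aihwkit.simulator.configs.devices import ConstantStepDevice

    try:
        tile.replace_with(SingleRPUConfig(device=ConstantStepDevice(dw_min=0.01)))
    except TileModuleError as exc:
        print(f"Cannot replace the RPUConfig: {exc}")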

set_load_rpu_config_state(load_rpu_config, strict_rpu_config_check=None)[source]

Sets the behavior when using load_state_dict.

Caution

If load_rpu_config=False the RPU config can be changed from the stored model. However, the user has to make sure that the changed RPU config makes sense.

For instance, changing the device type might change the expected fields in the hidden parameters and result in an error.

Parameters:
  • load_rpu_config (bool | None) – Whether to load the saved RPU config or use the current RPU config of the model.

  • strict_rpu_config_check (bool | None) – Whether to check and throw an error if the current rpu_config is not of the same class type when setting load_rpu_config to False. In case of False, the user has to make sure that the rpu_configs are compatible.

Return type:

None
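A hedged sketch of a checkpoint round trip that keeps the tile's current RPU config instead of the stored one; the state_dict and load_state_dict calls are assumed to follow standard torch.nn.Module semantics.

    # Hedged sketch: `tile` is a TileModule instance; state_dict/load_state_dict
    # are assumed to follow standard torch.nn.Module semantics.
    checkpoint = tile.state_dict()

    # Later, on a tile that was created with a different (but compatible)
    # RPUConfig, keep the current config instead of the stored one:
    tile.set_load_rpu_config_state(load_rpu_config=False, strict_rpu_config_check=False)
    tile.load_state_dict(checkpoint)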

state_dict(destination, prefix='', keep_vars=False)[source]

Overloaded to add the hooks for PyTorch < 1.12.

Parameters:
  • destination (Dict) –

  • prefix (str) –

  • keep_vars (bool) –

Return type:

None

supports_ddp: bool = False

to(*args, **kwargs)[source]

Move analog tile module to a device.

RPUConfig conversions can be done as well.

Note

Please be aware that moving analog tiles from GPU to CPU is currently not supported.

Returns:

This module in the specified device and converted to the specified data type.

Parameters:
  • args (Any) –

  • kwargs (Any) –

Return type:

TileModule
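A hedged sketch of moving a tile with to; following the note above, only the CPU-to-GPU direction is shown, and the RPUConfig-conversion form of to is not demonstrated here.

    # Hedged sketch: move the tile to GPU memory only, since moving analog
    # tiles from GPU back to CPU is noted above as unsupported.
    import torch

    if torch.cuda.is_available():
        tile = tile.to(torch.device("cuda"))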