Replies: 5 comments 1 reply
-
06 Example (Hardware-aware training of simulated CNNs: https://github.com/IBM/aihwkit/blob/master/examples/06_lenet5_hardware_aware.py#L79)
-
Hi @24367452, what are you trying to do exactly: in-memory inference or in-situ training? We have a ReRAM model that is used during in-situ training (that is, in-memory training that uses incremental updates on ReRAM during backpropagation on the analog crossbar arrays) and a ReRAM model that is used for inference only.
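In aihwkit terms, that distinction roughly maps onto two different RPU configurations. The following is only a minimal sketch: the import paths, the ReRamESPresetDevice preset, and the PCM-like noise model for the inference case are assumptions based on the repository's examples, not part of this reply.

```python
# Sketch only: selecting an in-situ training vs. an inference-only RPU config.
# Import paths follow recent aihwkit releases and may differ between versions.
from aihwkit.nn import AnalogLinear
from aihwkit.simulator.configs import SingleRPUConfig, InferenceRPUConfig
from aihwkit.simulator.presets import ReRamESPresetDevice
from aihwkit.inference import PCMLikeNoiseModel

# In-situ (in-memory) training: incremental ReRAM updates are simulated
# during backpropagation on the analog crossbar arrays.
training_rpu = SingleRPUConfig(device=ReRamESPresetDevice())

# Inference only: train hardware-aware in software, then apply a statistical
# noise model when the weights are programmed onto the analog tiles
# (a PCM-like model is used here purely for illustration).
inference_rpu = InferenceRPUConfig()
inference_rpu.noise_model = PCMLikeNoiseModel(g_max=25.0)

# The chosen config is passed to the analog layer.
layer = AnalogLinear(256, 10, rpu_config=training_rpu)
```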
-
Please also refer to the tutorial paper, where this is explained in more detail; see here.
-
Hello @maljoras, thank you for your reply. I would like to use the ReRAM model for in-situ training and evaluate the impact of non-ideal factors on the accuracy of the model. Does the following code fulfil my requirements?
-
Hello @24367452! Did the inputs from @maljoras-sony help you out? Do you need any more help, or can we consider this closed? Thanks for taking the time to use and test the project!
-
I want to use ReRAM as a processing unit for simulating CNNs, but I don't know what interface to use to configure it.
At first, I thought that configuring the noise model of the inference RPU also configured the processing unit as follows:
But then I realised that using `rpu_config = SingleRPUConfig(device=ReRamESPresetDevice())` seems to be the way to configure ReRAM as a processing unit.
In the 06 example (Hardware-aware training of a simulated CNN, [examples/06_lenet5_hardware_aware.py](https://github.com/IBM/aihwkit/blob/master/examples/06_lenet5_hardware_aware.py)), SingleRPUConfig is not used, so how is the processing unit configured in this example? Does it use ReRAM or PCM?
If you know the right way to configure the ReRAM processing unit in a CNN, please let me know. Thank you very much!
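For comparison, here is a hedged sketch of how a ReRAM in-situ training configuration could be attached to CNN layers. AnalogConv2d and convert_to_analog are used as plausible entry points, and the ReRamESPresetDevice choice mirrors the snippet above rather than a confirmed answer from the maintainers.

```python
# Illustrative sketch, not a confirmed answer: attaching a ReRAM in-situ
# training config to CNN layers in aihwkit.
from torch import nn
from aihwkit.nn import AnalogConv2d
from aihwkit.nn.conversion import convert_to_analog
from aihwkit.simulator.configs import SingleRPUConfig
from aihwkit.simulator.presets import ReRamESPresetDevice

rpu_config = SingleRPUConfig(device=ReRamESPresetDevice())

# Option 1: build the analog convolution layer directly.
analog_conv = AnalogConv2d(1, 16, kernel_size=5, rpu_config=rpu_config)

# Option 2: convert an existing digital CNN, layer by layer, to analog tiles
# that all share the same ReRAM device configuration (sizes assume 28x28 inputs).
digital_cnn = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=5),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(16 * 24 * 24, 10),
)
analog_cnn = convert_to_analog(digital_cnn, rpu_config)
```

Training would then proceed with aihwkit's AnalogSGD optimizer, as in the repository examples.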