How to resolve the PyTorch Lightning "can't set attribute" error
PyTorch is a machine learning framework developed by Facebook's artificial intelligence research team. It lets us implement various neural networks, such as recurrent neural networks (RNNs), convolutional neural networks (CNNs), and long short-term memory (LSTM) networks, which help accomplish many tasks.
Using PyTorch, training a neural network becomes a fairly hassle-free procedure. The trained network's predictions can have a varying degree of accuracy depending on factors such as:
Quality and size of training data
Training algorithms and hyperparameters
Preprocessing and feature engineering of the data
Network architecture
Optimization algorithms
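The factors above map onto concrete pieces of a training step. A minimal sketch in plain PyTorch (synthetic data and made-up layer sizes, purely for illustration) shows where each one enters:

```python
import torch

model = torch.nn.Linear(3, 1)                            # network architecture
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # optimizer + hyperparameter

x = torch.randn(8, 3)                                    # (toy) training data
y = torch.randn(8, 1)

loss = torch.nn.functional.mse_loss(model(x), y)         # training objective
optimizer.zero_grad()
loss.backward()                                          # compute gradients
optimizer.step()                                         # one parameter update
```

Changing any of these ingredients (more or cleaner data, a different architecture, another optimizer or learning rate) changes the accuracy of the result.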
The PyTorch framework appears in many aspects of daily life, such as speech recognition when we call out to virtual assistants (e.g., "Hey Siri", "OK Google!") or the autocorrect we often depend on, which uses natural language processing (NLP).
Scenarios where we face this error
Focusing back on the error, we first have to understand the scenarios in which it occurs and the reasons behind it.
Some attributes in the PyTorch Lightning framework can't be modified directly because they are managed by the framework itself, and assigning to them raises an error. The common reasons we face this error are:
Undeclared or private attributes: This error can be raised when we assign a value to an attribute that hasn't been declared, or when the attribute in question is private. Private attributes can be recognized by a leading underscore (e.g., '_my_example').
Attribute shadowing: This occurs when we assign a value to an attribute while the class already defines a property or method with the same name.
Read-only attributes: In PyTorch Lightning modules, various attributes are read-only, and an attempt to modify them raises this error.
Inherited attributes: Some attributes inherited from PyTorch Lightning modules are also read-only, so attempts to modify them raise the error as well.
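Under the hood, "can't set attribute" is the standard Python error for assigning to a property that defines no setter, which is exactly how Lightning exposes its read-only attributes. A minimal sketch in plain Python (no Lightning required; the class and attribute names here are made up):

```python
class Model:
    def __init__(self):
        self._trainer = "trainer-object"

    @property
    def trainer(self):
        # Read-only: there is a getter but no @trainer.setter
        return self._trainer

model = Model()
print(model.trainer)       # reading the property works

try:
    model.trainer = "new"  # writing raises AttributeError
except AttributeError as err:
    # Message is "can't set attribute" on older Python versions;
    # newer versions say the property "has no setter"
    print(err)
```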
Examples
Error: Modifying self.trainer directly
import pytorch_lightning as pl

class MyModel(pl.LightningModule):
    def training_step(self, batch, batch_idx):
        self.trainer.max_epochs = 10  # Raises AttributeError
Solving this error
import pytorch_lightning as pl

class MyModel(pl.LightningModule):
    def training_step(self, batch, batch_idx):
        # Treat self.trainer as a read-only attribute
        current_epochs = self.trainer.max_epochs
        self.log("Current Epochs", current_epochs)
This program raises the AttributeError because the self.trainer object is a framework-managed, read-only attribute. Trying to assign to it, or to read-only fields it exposes, raises this error.
Error: Modifying self.hparams directly
import pytorch_lightning as pl

class MyModel(pl.LightningModule):
    def __init__(self, learning_rate=0.001):
        super().__init__()
        # Attempting to modify self.hparams directly
        self.hparams.learning_rate = learning_rate  # AttributeError raised
Resolving this error
import torch
import pytorch_lightning as pl

class MyModel(pl.LightningModule):
    def __init__(self, learning_rate=0.001):
        super().__init__()
        # Store the value on a plain attribute instead of writing to self.hparams
        self.learning_rate = learning_rate

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=self.learning_rate)
        return optimizer
This is one way to tackle the error. Other workarounds are to update the hparams attribute through self.hparams.update(), or to register the hyperparameters with self.save_hyperparameters().
Summary
To summarize this Answer, we have discussed what the PyTorch framework is, the scenarios in which this error occurs, and the solutions to resolve it.