Custom Layer From Keras To PyTorch
Coming from a TensorFlow background, I am trying to convert a snippet of code for a custom layer from Keras to PyTorch. The custom layer in Keras looks like this: class Attention_mo
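(The question's snippet is truncated above. A hypothetical Keras layer consistent with the translation in Solution 1 might look like the sketch below; the build logic and names are reconstructed for illustration, not the asker's original code.)

import tensorflow as tf

class Attention_module(tf.keras.layers.Layer):
    def __init__(self, class_num, **kwargs):
        super().__init__(**kwargs)
        self.class_num = class_num

    def build(self, input_shape):
        embedding_length = int(input_shape[2])
        # Hypothetical: one Glorot-initialized weight vector per class
        self.Ws = self.add_weight(name="Ws",
                                  shape=(self.class_num, embedding_length),
                                  initializer="glorot_uniform",
                                  trainable=True)
        super().build(input_shape)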
Solution 1:
import torch

class Attention_module(torch.nn.Module):
    def __init__(self, class_num, input_shape):
        super().__init__()
        self.class_num = class_num
        # Feature size comes from axis 2 of the Keras-style input_shape
        embedding_length = int(input_shape[2])
        self.Ws = torch.nn.Embedding(num_embeddings=class_num,
                                     embedding_dim=embedding_length)  # Embedding layer
        torch.nn.init.xavier_uniform_(self.Ws.weight)  # Glorot initialization
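A quick usage sketch (the class count and shapes below are invented for illustration; input_shape is assumed to follow the Keras (batch, timesteps, features) convention):

att = Attention_module(class_num=4, input_shape=(32, 10, 128))

# Each class id indexes one trainable 128-dim vector in Ws
class_ids = torch.tensor([0, 1, 3])
vectors = att.Ws(class_ids)
print(vectors.shape)  # torch.Size([3, 128])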
See the torch.nn.init documentation (https://pytorch.org/docs/stable/nn.init.html) for the available layer initialization methods. Xavier init is another name for Glorot init.
The trailing underscore in torch.nn.init.xavier_uniform_ is a PyTorch convention that signifies an in-place operation.
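A small check of that convention (tensor shape chosen arbitrarily): the init function fills the tensor it is given and returns that same object rather than allocating a new one.

import torch

w = torch.empty(3, 5)
out = torch.nn.init.xavier_uniform_(w)  # fills w in place and returns it
print(out is w)  # True: the same tensor object was modified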
You can also call the torch.nn.init functions at runtime; they don't have to be used within __init__(). Like:
att = Attention_module(class_num, input_shape)
torch.nn.init.xavier_uniform_(att.Ws.weight)  # re-initialize after construction
or:
for param in att.parameters():
    torch.nn.init.xavier_uniform_(param)
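One caveat worth adding (not in the original answer): xavier_uniform_ needs a fan-in and fan-out, so it raises an error on tensors with fewer than two dimensions, such as bias vectors. A guarded sketch:

# Hypothetical guarded variant: only initialize parameters Xavier can handle
for name, param in att.named_parameters():
    if param.dim() >= 2:  # Xavier needs >= 2 dims to compute fan-in/fan-out
        torch.nn.init.xavier_uniform_(param)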