I wrote a small piece of code to experiment with tensor dimension transformations in PyTorch; the experimental code is shown below:
import torch
import numpy as np
a = np.array([[[1, 2, 3, 4, 5], [6, 7, 8, 9, 10]],
              [[11, 12, 13, 14, 15], [16, 17, 18, 19, 20]],
              [[21, 22, 23, 24, 25], [26, 27, 28, 29, 30]]])
b = torch.tensor(a)  # keeps numpy's int64 dtype, so b is a LongTensor
print(b)
print(b.size())
b1 = b.view(3, 2, 1, 5)  # reshape (3, 2, 5) -> (3, 2, 1, 5)
print(b1)
print(b1.size())
b2 = b1.repeat(1, 1, 2, 1)  # tile twice along dim 2 -> (3, 2, 2, 5)
print(b2)
print(b2.size())
b3 = torch.softmax(b2, -2)  # softmax over dim -2; this line raises the error below
print(b3)
print(b3.size())
Running the code above produces the following error:
Traceback (most recent call last):
File "ceshi.py", line 17, in
b3 = torch.softmax(b2, -2)
RuntimeError: "softmax" not implemented for 'torch.LongTensor'
Cause of the error:
With the tensor initialized as in my code, the values are integers, so the numpy array has dtype int64 and torch.tensor keeps that dtype, making b a LongTensor; softmax has no implementation for int64 data, so the data type has to be changed to a floating-point type.
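One way to do that without editing the array literal is to cast when creating the tensor, or to cast an existing integer tensor with .float(); a minimal sketch reusing the same data:

import torch
import numpy as np
a = np.arange(1, 31).reshape(3, 2, 5)      # same integer data as above
b = torch.tensor(a, dtype=torch.float32)   # cast at creation time
# or, for an existing integer tensor: b = torch.tensor(a).float()
b2 = b.view(3, 2, 1, 5).repeat(1, 1, 2, 1)
b3 = torch.softmax(b2, -2)                 # no error: b2 is now a float tensor
print(b3.dtype, b3.size())                 # torch.float32 torch.Size([3, 2, 2, 5])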
In my code, I fixed it by writing the array values as floating-point literals; with the code changed as follows, the error no longer occurs:
import torch
import numpy as np
a = np.array([[[1., 2., 3., 4., 5.], [6., 7., 8., 9., 10.]],
              [[11., 12., 13., 14., 15.], [16., 17., 18., 19., 20.]],
              [[21., 22., 23., 24., 25.], [26., 27., 28., 29., 30.]]])
b = torch.tensor(a)  # numpy float array -> float64 tensor, so softmax works
print(b)
print(b.size())
b1 = b.view(3, 2, 1, 5)
print(b1)
print(b1.size())
b2 = b1.repeat(1, 1, 2, 1)
print(b2)
print(b2.size())
b3 = torch.softmax(b2, -2)
print(b3)
print(b3.size())
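Since b2 is b1 repeated twice along dim 2, the two entries along dim -2 are identical at every position, so the softmax over that dimension is 0.5 everywhere. A small sanity check (not part of the original code) confirms this and verifies that the values along dim -2 sum to 1:

# both repeated entries along dim -2 are equal, so each gets probability 0.5
assert torch.allclose(b3, torch.full_like(b3, 0.5))
# softmax outputs along the chosen dim always sum to 1
assert torch.allclose(b3.sum(dim=-2), torch.ones(3, 2, 5, dtype=b3.dtype))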