The Beginner's Path Series, Part 8: AI from Basics to Mastery, a Hands-On Guide to PyTorch (Part One)
PyTorch Tensors
Learning programming through plenty of worked examples is the most effective approach.
This article is a hands-on tour of PyTorch: by working through the code line by line, readers can master the toolkit's core features and be better prepared to build and deploy their own neural networks.
First, import the torch library.
import torch
Let's look at some basic tensor operations, starting with a few ways to create tensors:
z = torch.zeros(5, 3)
print(z)
print(z.dtype)
Output:
tensor([[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]])
torch.float32
Above, we created a 5x3 matrix filled with zeros, and queried its data type to find that the zeros are 32-bit floating-point numbers, which is PyTorch's default.
What if you wanted integers instead? You can always override the default:
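As a side note, `torch.zeros` is one of several factory functions that share the same shape-and-dtype calling convention; a minimal sketch of two close relatives:

```python
import torch

# torch.zeros is one of several factory functions sharing the same calling convention.
z = torch.zeros(5, 3)   # all zeros
o = torch.ones(5, 3)    # all ones
e = torch.empty(5, 3)   # uninitialized memory -- the values are arbitrary

print(z.shape, o.shape, e.shape)  # all three are 5x3
print(z.dtype)                    # torch.float32, the default for all of them
```

`torch.empty` is the fastest of the three because it skips initialization, which is fine when you intend to overwrite every element anyway.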
i = torch.ones((5, 3), dtype=torch.int16)
print(i)
Output:
tensor([[1, 1, 1],
        [1, 1, 1],
        [1, 1, 1],
        [1, 1, 1],
        [1, 1, 1]], dtype=torch.int16)
You can see that when we change the default, the tensor helpfully reports its dtype when printed.
It's common to initialize learning weights randomly, often with a specific seed for the PRNG so that results are reproducible:
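A tensor's dtype can also be converted after creation with `.to()`, which returns a copy with the requested type; a minimal sketch:

```python
import torch

# Create with an explicit dtype, then convert to another one.
i = torch.ones((5, 3), dtype=torch.int16)  # 16-bit integers instead of the default float32
print(i.dtype)            # torch.int16

f = i.to(torch.float64)   # .to() returns a new tensor with the requested dtype
print(f.dtype)            # torch.float64; the original tensor i is unchanged
```

This matters in practice because most arithmetic between tensors requires compatible dtypes, so explicit conversion is often the cleanest fix.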
torch.manual_seed(1729)
r1 = torch.rand(2, 2)
print('A random tensor:')
print(r1)

r2 = torch.rand(2, 2)
print('\nA different random tensor:')
print(r2) # new values

torch.manual_seed(1729)
r3 = torch.rand(2, 2)
print('\nShould match r1:')
print(r3) # repeats values of r1 because of re-seed
Output:
A random tensor:
tensor([[0.3126, 0.3791],
        [0.3087, 0.0736]])

A different random tensor:
tensor([[0.4216, 0.0691],
        [0.2332, 0.4047]])

Should match r1:
tensor([[0.3126, 0.3791],
        [0.3087, 0.0736]])
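Re-seeding the global PRNG works, but it resets random state for everything in the process. A `torch.Generator` object gives you an independent, reproducible stream without touching global state; a small sketch:

```python
import torch

# A local PRNG, independent of torch.manual_seed's global state.
g = torch.Generator().manual_seed(1729)
a = torch.rand(2, 2, generator=g)

# A second generator with the same seed produces the same sequence.
g2 = torch.Generator().manual_seed(1729)
b = torch.rand(2, 2, generator=g2)

print(torch.equal(a, b))  # True: identical values, global seed untouched
```

This is handy when, say, a data loader and a model initializer each need their own reproducible randomness.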
PyTorch tensors perform arithmetic operations intuitively. Tensors of similar shapes may be added, multiplied, and so on, and operations with scalars are distributed over the whole tensor:
ones = torch.ones(2, 3)
print(ones)

twos = torch.ones(2, 3) * 2   # every element is multiplied by 2
print(twos)

threes = ones + twos          # addition allowed because shapes are similar
print(threes)                 # tensors are added element-wise
print(threes.shape)           # this has the same dimensions as the input tensors

r1 = torch.rand(2, 3)
r2 = torch.rand(3, 2)
# uncomment this line to get a runtime error
# r3 = r1 + r2
Output:
tensor([[1., 1., 1.],
        [1., 1., 1.]])
tensor([[2., 2., 2.],
        [2., 2., 2.]])
tensor([[3., 3., 3.],
        [3., 3., 3.]])
torch.Size([2, 3])
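The shapes do not have to be strictly identical for elementwise arithmetic: PyTorch broadcasts compatible shapes, aligning dimensions from the right, where each pair must be equal or one of them must be 1. A small sketch of both a working and a failing case:

```python
import torch

a = torch.ones(2, 3)
row = torch.tensor([1., 2., 3.])  # shape (3,) broadcasts across both rows of a
print(a + row)                    # result has shape (2, 3)

b = torch.rand(3, 2)
try:
    a + b                         # (2, 3) vs (3, 2): no dimension pair lines up
except RuntimeError as err:
    print('shape mismatch:', err)
```

This is why the commented-out `r1 + r2` line above raises a runtime error: (2, 3) and (3, 2) are not broadcastable, even though they hold the same number of elements.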
Here is a small sample of the mathematical operations available:
r = (torch.rand(2, 2) - 0.5) * 2 # values between -1 and 1
print('A random matrix, r:')
print(r)

# Common mathematical operations are supported:
print('\nAbsolute value of r:')
print(torch.abs(r))

# ...as are trigonometric functions:
print('\nInverse sine of r:')
print(torch.asin(r))

# ...and linear algebra operations like determinant and singular value decomposition:
print('\nDeterminant of r:')
print(torch.det(r))
print('\nSingular value decomposition of r:')
print(torch.svd(r))

# ...and statistical and aggregate operations:
print('\nAverage and standard deviation of r:')
print(torch.std_mean(r))