PyTorch Study Notes: Advanced (Perceptron)
This post records some perceptron fundamentals and walks through, hands-on, the gradient computation covered in earlier posts.
1. Introduction to the Perceptron
The perceptron is a linear model for binary classification: its input is an instance's feature vector, and its output is the instance's class. The figure below shows a single-layer perceptron.
Single-layer perceptron gradient derivation:
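The derivation (shown as an image in the original post) can be sketched as follows, writing σ for the sigmoid, t for the target, and E for the squared-error loss:

```latex
% single-layer perceptron: one sigmoid unit, squared-error loss
o = \sigma\!\Big(\sum_{j} x_j w_j\Big), \qquad E = (o - t)^2

% chain rule, using \sigma'(z) = \sigma(z)\,(1 - \sigma(z)) = o\,(1 - o)
\frac{\partial E}{\partial w_j}
  = 2\,(o - t)\,\frac{\partial o}{\partial w_j}
  = 2\,(o - t)\,o\,(1 - o)\,x_j
```

The factor o(1 − o) is why sigmoid gradients vanish when the output saturates near 0 or 1.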
2. Single-Layer Perceptron: Gradient Code
# single-layer perceptron gradient derivation
import torch
import torch.nn.functional as F

# one sample whose input x has 10 features
x = torch.randn(1, 10)
'''
Out[18]:
tensor([[ 0.8005, -0.2278, 0.4467, 1.4833, 0.2212, -0.0604, -1.8940, -0.1799,
-0.4088, 1.7971]])
'''
w = torch.randn(1, 10, requires_grad=True)
'''
Out[20]:
tensor([[-0.4720, -0.2564, 1.5323, -0.2673, 0.4987, 0.0123, -1.1776, 1.0636,
-0.5548, -1.6960]], requires_grad=True)
'''
o = torch.sigmoid(x@w.t())
o.shape
# Out[22]: torch.Size([1, 1])
loss = F.mse_loss(o, torch.ones(1, 1))  # mse_loss takes (input, target); MSE is symmetric, so the value is the same either way
loss.shape
# Out[24]: torch.Size([])
loss.backward()
w.grad
'''
Out[26]:
tensor([[-0.2372, 0.0675, -0.1323, -0.4395, -0.0655, 0.0179, 0.5612, 0.0533,
0.1211, -0.5325]])
'''
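To check the derivation above, we can compare autograd's result with the hand-derived formula. A minimal sketch (the seed and variable names are my own):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(1, 10)
w = torch.randn(1, 10, requires_grad=True)

o = torch.sigmoid(x @ w.t())            # forward pass, shape [1, 1]
loss = F.mse_loss(o, torch.ones(1, 1))  # (o - 1)^2, scalar
loss.backward()

# analytic gradient: dE/dw_j = 2 (o - t) * o * (1 - o) * x_j, with t = 1
with torch.no_grad():
    manual = 2 * (o - 1) * o * (1 - o) * x  # broadcasts to shape [1, 10]

print(torch.allclose(w.grad, manual))  # prints: True
```

The two gradients agree up to floating-point rounding, confirming the chain-rule derivation.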
3. Multi-Output Perceptron: Gradient Code
Multi-output perceptron gradient derivation (still a single layer, but with two output units):
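The derivation generalizes the single-output case, one sigmoid unit per output k. A sketch, assuming K outputs and mean reduction over them (matching `F.mse_loss`'s default):

```latex
% one layer, K sigmoid output units, mean-squared-error loss
o_k = \sigma\!\Big(\sum_{j} x_j w_{kj}\Big), \qquad
E = \frac{1}{K}\sum_{k}(o_k - t_k)^2

% each weight w_{kj} only affects its own output o_k
\frac{\partial E}{\partial w_{kj}}
  = \frac{2}{K}\,(o_k - t_k)\,o_k\,(1 - o_k)\,x_j
```

Note the gradient for row k of w depends only on output k; there is no cross-term, because the outputs do not interact in a single layer.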
# multi-output perceptron gradient derivation (one layer, two output units)
x = torch.randn(1, 10)
'''
Out[28]:
tensor([[ 0.1166, 1.8313, -0.7689, -1.1989, 1.0554, 0.1023, -0.0929, -0.4683,
-1.4945, -1.2604]])
'''
w = torch.randn(2, 10, requires_grad=True)
'''
Out[30]:
tensor([[ 1.3166, 1.6998, 2.7425, 0.4619, -0.3792, 1.5305, 0.3245, 0.2149,
-0.9502, 1.2917],
[ 0.8944, -0.5217, 0.2363, 0.9228, -1.5709, -1.3228, 0.4027, 1.6695,
1.6203, 1.0451]], requires_grad=True)
'''
o = torch.sigmoid(x@w.t())
o.shape
# Out[32]: torch.Size([1, 2])
loss = F.mse_loss(o, torch.ones(1, 2))  # mean over the 2 outputs
loss
# Out[38]: tensor(0.6221, grad_fn=<MseLossBackward0>)
loss.backward()
w.grad
'''
Out[40]:
tensor([[-1.4420e-02, -2.2642e-01, 9.5069e-02, 1.4823e-01, -1.3048e-01,
-1.2644e-02, 1.1491e-02, 5.7900e-02, 1.8479e-01, 1.5583e-01],
[-2.3947e-05, -3.7601e-04, 1.5788e-04, 2.4615e-04, -2.1669e-04,
-2.0997e-05, 1.9082e-05, 9.6152e-05, 3.0687e-04, 2.5878e-04]])
'''
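As in the single-output case, we can verify the multi-output formula against autograd. A minimal sketch (the seed and variable names are my own); with K = 2 outputs and mean reduction, the 2/K factor becomes 1:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(1, 10)
w = torch.randn(2, 10, requires_grad=True)

o = torch.sigmoid(x @ w.t())            # forward pass, shape [1, 2]
loss = F.mse_loss(o, torch.ones(1, 2))  # mean of (o_k - 1)^2 over K = 2 outputs
loss.backward()

# analytic gradient: dE/dw_kj = (2/K) (o_k - t_k) o_k (1 - o_k) x_j
K = 2
with torch.no_grad():
    delta = (2 / K) * (o - 1) * o * (1 - o)  # shape [1, 2], one term per output
    manual = delta.t() @ x                   # outer product, shape [2, 10]

print(torch.allclose(w.grad, manual))  # prints: True
```

Each row of `w.grad` is the per-output error term times the same input x, which is exactly the outer-product structure computed above.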
Unless otherwise noted, all articles on this blog are licensed under CC BY-NC-SA 4.0. When reposting, please credit 不听话的兔子君!