
【Notes】【Bilibili PyTorch Course】Gradient Descent Model

Course source
Bilibili course: 《PyTorch深度学习实践》 (PyTorch Deep Learning Practice), complete series
Training code

import matplotlib.pyplot as plt

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]

# initial guess for the weight
w = 1.0

# define y_hat: the linear model y_hat = x * w
def forward(x):
    return x * w

# mean squared error over the full training set
def cost(xs, ys):
    cost_value = 0
    for x, y in zip(xs, ys):
        y_pred = forward(x)
        cost_value += (y_pred - y) ** 2
    return cost_value / len(xs)

# analytic gradient: d(cost)/dw = mean of 2 * x * (x * w - y)
def gradient(xs, ys):
    grad = 0
    for x, y in zip(xs, ys):
        grad += 2 * x * (x * w - y)
    return grad / len(xs)

print('Predict (before training)', 4, forward(4))

# epoch indices
epoch_list = []
# loss values
cost_list = []
for epoch in range(100):
    cost_val = cost(x_data, y_data)
    grad_val = gradient(x_data, y_data)
    w -= 0.05 * grad_val  # learning rate 0.05
    print('Epoch:', epoch, 'w=', w, 'loss=', cost_val)
    epoch_list.append(epoch)
    cost_list.append(cost_val)
print('Predict (after training)', 4, forward(4))

Training log

Predict (before training) 4 4.0
Epoch: 0 w= 1.4666666666666668 loss= 4.666666666666667
Epoch: 1 w= 1.7155555555555557 loss= 1.3274074074074067
Epoch: 2 w= 1.8482962962962963 loss= 0.3775736625514398
Epoch: 3 w= 1.9190913580246913 loss= 0.10739873068129853
Epoch: 4 w= 1.9568487242798354 loss= 0.030548972282680543
Epoch: 5 w= 1.976985986282579 loss= 0.008689485449295776
Epoch: 6 w= 1.9877258593507088 loss= 0.0024716758611330204
Epoch: 7 w= 1.9934537916537114 loss= 0.0007030544671667267
Epoch: 8 w= 1.9965086888819794 loss= 0.0001999799373274173
Epoch: 9 w= 1.9981379674037223 loss= 5.688318217313286e-05
Epoch: 10 w= 1.9990069159486519 loss= 1.618010515146993e-05
Epoch: 11 w= 1.9994703551726143 loss= 4.602341020862254e-06
Epoch: 12 w= 1.9997175227587276 loss= 1.3091103348236213e-06
Epoch: 13 w= 1.9998493454713213 loss= 3.723691619052323e-07
Epoch: 14 w= 1.999919650918038 loss= 1.0591833938645004e-07
Epoch: 15 w= 1.999957147156287 loss= 3.0127883203364745e-08
Epoch: 16 w= 1.9999771451500197 loss= 8.569709000094413e-09
Epoch: 17 w= 1.999987810746677 loss= 2.4376061155645925e-09
Epoch: 18 w= 1.9999934990648944 loss= 6.933635173220451e-10
Epoch: 19 w= 1.9999965328346103 loss= 1.9722340048226596e-10
Epoch: 20 w= 1.9999981508451254 loss= 5.609910058542032e-11
Epoch: 21 w= 1.9999990137840669 loss= 1.5957077500436847e-11
Epoch: 22 w= 1.999999474018169 loss= 4.538902045415441e-12
Epoch: 23 w= 1.9999997194763568 loss= 1.2910654708931954e-12
Epoch: 24 w= 1.9999998503873904 loss= 3.6723640039091726e-13
Epoch: 25 w= 1.999999920206608 loss= 1.0445835388749555e-13
Epoch: 26 w= 1.9999999574435243 loss= 2.971259855722779e-14
Epoch: 27 w= 1.9999999773032129 loss= 8.451583569452665e-15
Epoch: 28 w= 1.999999987895047 loss= 2.4040060320624316e-15
Epoch: 29 w= 1.999999993544025 loss= 6.838061337110749e-16
Epoch: 30 w= 1.9999999965568134 loss= 1.9450486402996616e-16
Epoch: 31 w= 1.9999999981636338 loss= 5.532582289380589e-17
Epoch: 32 w= 1.9999999990206048 loss= 1.573712494993027e-17
Epoch: 33 w= 1.9999999994776558 loss= 4.4763367583436385e-18
Epoch: 34 w= 1.9999999997214164 loss= 1.2732695708436325e-18
Epoch: 35 w= 1.9999999998514222 loss= 3.6217462890900584e-19
Epoch: 36 w= 1.9999999999207585 loss= 1.0301831624602159e-19
Epoch: 37 w= 1.9999999999577378 loss= 2.9303006500354953e-20
Epoch: 38 w= 1.9999999999774603 loss= 8.335100760491641e-21
Epoch: 39 w= 1.9999999999879787 loss= 2.3708344012213664e-21
Epoch: 40 w= 1.9999999999935887 loss= 6.743793343241322e-22
Epoch: 41 w= 1.9999999999965807 loss= 1.9182889646896314e-22
Epoch: 42 w= 1.9999999999981763 loss= 5.455821876973161e-23
Epoch: 43 w= 1.9999999999990274 loss= 1.5520779885212614e-23
Epoch: 44 w= 1.9999999999994813 loss= 4.4140317521189105e-24
Epoch: 45 w= 1.9999999999997233 loss= 1.2555468094916013e-24
Epoch: 46 w= 1.9999999999998523 loss= 3.5696409556271286e-25
Epoch: 47 w= 1.9999999999999212 loss= 1.0181467785899592e-25
Epoch: 48 w= 1.999999999999958 loss= 2.896140110957243e-26
Epoch: 49 w= 1.9999999999999776 loss= 8.237499222146303e-27
Epoch: 50 w= 1.999999999999988 loss= 2.357067080993807e-27
Epoch: 51 w= 1.9999999999999936 loss= 6.603423160787553e-28
Epoch: 52 w= 1.9999999999999967 loss= 1.9637706159345563e-28
Epoch: 53 w= 1.9999999999999982 loss= 5.030631731003161e-29
Epoch: 54 w= 1.9999999999999991 loss= 1.4725403564125555e-29
Epoch: 55 w= 1.9999999999999996 loss= 3.681350891031389e-30
Epoch: 56 w= 1.9999999999999998 loss= 1.3805065841367707e-30
Epoch: 57 w= 2.0 loss= 3.4512664603419266e-31
Epoch: 58 w= 2.0 loss= 0.0
Epoch: 59 w= 2.0 loss= 0.0
Epoch: 60 w= 2.0 loss= 0.0
Epoch: 61 w= 2.0 loss= 0.0
Epoch: 62 w= 2.0 loss= 0.0
Epoch: 63 w= 2.0 loss= 0.0
Epoch: 64 w= 2.0 loss= 0.0
Epoch: 65 w= 2.0 loss= 0.0
Epoch: 66 w= 2.0 loss= 0.0
Epoch: 67 w= 2.0 loss= 0.0
Epoch: 68 w= 2.0 loss= 0.0
Epoch: 69 w= 2.0 loss= 0.0
Epoch: 70 w= 2.0 loss= 0.0
Epoch: 71 w= 2.0 loss= 0.0
Epoch: 72 w= 2.0 loss= 0.0
Epoch: 73 w= 2.0 loss= 0.0
Epoch: 74 w= 2.0 loss= 0.0
Epoch: 75 w= 2.0 loss= 0.0
Epoch: 76 w= 2.0 loss= 0.0
Epoch: 77 w= 2.0 loss= 0.0
Epoch: 78 w= 2.0 loss= 0.0
Epoch: 79 w= 2.0 loss= 0.0
Epoch: 80 w= 2.0 loss= 0.0
Epoch: 81 w= 2.0 loss= 0.0
Epoch: 82 w= 2.0 loss= 0.0
Epoch: 83 w= 2.0 loss= 0.0
Epoch: 84 w= 2.0 loss= 0.0
Epoch: 85 w= 2.0 loss= 0.0
Epoch: 86 w= 2.0 loss= 0.0
Epoch: 87 w= 2.0 loss= 0.0
Epoch: 88 w= 2.0 loss= 0.0
Epoch: 89 w= 2.0 loss= 0.0
Epoch: 90 w= 2.0 loss= 0.0
Epoch: 91 w= 2.0 loss= 0.0
Epoch: 92 w= 2.0 loss= 0.0
Epoch: 93 w= 2.0 loss= 0.0
Epoch: 94 w= 2.0 loss= 0.0
Epoch: 95 w= 2.0 loss= 0.0
Epoch: 96 w= 2.0 loss= 0.0
Epoch: 97 w= 2.0 loss= 0.0
Epoch: 98 w= 2.0 loss= 0.0
Epoch: 99 w= 2.0 loss= 0.0
Predict (after training) 4 8.0
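The geometric decay visible in the log can be checked by hand. A small sketch of my own (not part of the course code): with learning rate 0.05 and y = 2x, each update multiplies the error (w - 2) by a fixed contraction factor, so the loss shrinks by the square of that factor every epoch.

```python
# A check of my own (not from the course): with lr = 0.05 and y = 2x,
# each update multiplies the error (w - 2) by a fixed contraction factor:
#   w_{t+1} - 2 = (1 - lr * (2/N) * sum(x^2)) * (w_t - 2)
# so the loss shrinks geometrically, one constant ratio per epoch.
x_data = [1.0, 2.0, 3.0]
lr = 0.05

factor = 1 - lr * 2 * sum(x * x for x in x_data) / len(x_data)
loss_ratio = factor ** 2  # loss is proportional to (w - 2)^2

print(factor)      # ≈ 0.5333
print(loss_ratio)  # ≈ 0.2844, matching e.g. 1.3274... / 4.6666... in the log
```

This also explains why the loss prints exactly 0.0 from epoch 58 on: the true loss keeps shrinking by the same ratio until it falls below the smallest representable float and underflows to zero.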

Analysis
Suppose we have a loss function and start from an initial guess for w; the goal is to reach the w that minimizes the loss.
To get there, w is updated repeatedly against the gradient: w := w - α · d(cost)/dw, with learning rate α = 0.05 here.
As the log above shows, the loss is minimized at w = 2.0.
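As a sanity check of my own (not from the course), the analytic derivative used in `gradient` can be compared against a central finite difference of the cost:

```python
# Sanity check: compare the analytic gradient
#   d(cost)/dw = mean(2 * x * (x * w - y))
# against a central finite difference of the cost at w = 1.0.
x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]

def cost_at(w):
    return sum((x * w - y) ** 2 for x, y in zip(x_data, y_data)) / len(x_data)

def analytic_grad(w):
    return sum(2 * x * (x * w - y) for x, y in zip(x_data, y_data)) / len(x_data)

w, eps = 1.0, 1e-6
numeric_grad = (cost_at(w + eps) - cost_at(w - eps)) / (2 * eps)
print(analytic_grad(w), numeric_grad)  # both ≈ -9.3333
```

The two values agree, which confirms the derivative formula before trusting the training loop built on it.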
Plotting the cost curve

# Plot cost against epoch
plt.plot(epoch_list, cost_list)
plt.ylabel('Cost')
plt.xlabel('Epoch')
plt.show()
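Because the loss underflows to exactly 0.0 after epoch 58, a linear-axis plot hugs the x-axis almost immediately. A log-scale variant (my own addition; it re-runs the same training loop so it stands alone) shows the geometric decay as a near-straight line:

```python
import matplotlib
matplotlib.use('Agg')  # headless backend; drop this line when running interactively
import matplotlib.pyplot as plt

# Re-run the same toy training loop (same data, lr = 0.05, 100 epochs).
x_data, y_data = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]
w, lr = 1.0, 0.05
epoch_list, cost_list = [], []
for epoch in range(100):
    c = sum((x * w - y) ** 2 for x, y in zip(x_data, y_data)) / len(x_data)
    g = sum(2 * x * (x * w - y) for x, y in zip(x_data, y_data)) / len(x_data)
    w -= lr * g
    epoch_list.append(epoch)
    cost_list.append(c)

# A log axis cannot show exact zeros, so keep only the positive losses.
positive = [(e, c) for e, c in zip(epoch_list, cost_list) if c > 0]
plt.semilogy([e for e, _ in positive], [c for _, c in positive])
plt.ylabel('Cost (log scale)')
plt.xlabel('Epoch')
plt.savefig('cost_log.png')
```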

