Python check-in Day 54 @浙大疏锦行
Review of key points:
- A brief history of classic CNNs for computer vision: LeNet --> AlexNet --> VGGNet --> InceptionNet --> ResNet.
  These are called "classic" because most current work instead builds on the backbone-neck-head paradigm.
- The Inception module and network
- Interim summary of feature-fusion methods: element-wise addition, element-wise multiplication, concatenation (which increases the channel count), etc.
- Receptive fields and convolution-kernel variants: understand the design intent behind the different modules and classes
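The three fusion methods above can be contrasted on toy tensors (shapes here are arbitrary examples; any pair of matching feature maps behaves the same way):

```python
import torch

# Two hypothetical feature maps of shape (batch, channels, height, width)
a = torch.randn(1, 16, 8, 8)
b = torch.randn(1, 16, 8, 8)

added = a + b                             # element-wise addition: channel count unchanged (ResNet-style)
multiplied = a * b                        # element-wise multiplication: channel count unchanged (attention gating)
concatenated = torch.cat([a, b], dim=1)   # concatenation: channel counts add up (Inception/DenseNet-style)

print(added.shape)         # torch.Size([1, 16, 8, 8])
print(concatenated.shape)  # torch.Size([1, 32, 8, 8])
```

Note that addition and multiplication require identical shapes, while concat only requires the non-channel dimensions to match.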
Assignment (a slightly more academic one this time):
- Train an Inception network on CIFAR-10 and observe its accuracy
- Ablation study: add a residual mechanism and a CBAM module, and ablate each separately
Inception network ablation-study plan:
1. Basic Inception module (modify src/models/train.py)
import torch
import torch.nn as nn

class InceptionBlock(nn.Module):
    def __init__(self, in_channels):
        super().__init__()
        self.branch1 = nn.Conv2d(in_channels, 16, 1)      # 1x1 branch -> 16 channels
        self.branch3 = nn.Sequential(
            nn.Conv2d(in_channels, 16, 1),
            nn.Conv2d(16, 24, 3, padding=1))              # 1x1 then 3x3 branch -> 24 channels
        self.branch5 = nn.Sequential(
            nn.Conv2d(in_channels, 16, 1),
            nn.Conv2d(16, 24, 5, padding=2))              # 1x1 then 5x5 branch -> 24 channels
        self.pool = nn.MaxPool2d(3, stride=1, padding=1)  # pooling branch keeps in_channels

    def forward(self, x):
        # Concatenate along the channel dim: 16 + 24 + 24 + in_channels output channels
        return torch.cat([self.branch1(x), self.branch3(x),
                          self.branch5(x), self.pool(x)], 1)
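The output width of this block matters for the residual and CBAM variants below, so it is worth spelling out the channel arithmetic (16 for the input width here is just an example):

```python
# Output channels of the four branches when stacked with torch.cat:
in_channels = 16                        # hypothetical input width
branch1, branch3, branch5 = 16, 24, 24  # conv-branch widths from the module above
pool = in_channels                      # max-pooling preserves the channel count
total = branch1 + branch3 + branch5 + pool
print(total)  # 80 -- but only when in_channels == 16
```

So the often-quoted "80 output channels" holds only for a 16-channel input; in general the block emits 64 + in_channels channels.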
2. Residual-enhanced version (add to src/models/train.py)
class ResInceptionBlock(InceptionBlock):
    def __init__(self, in_channels):
        super().__init__(in_channels)
        # The four branches output 16 + 24 + 24 + in_channels channels in total
        # (80 only when in_channels == 16), so match the shortcut to that sum
        # rather than hard-coding 80.
        self.shortcut = nn.Conv2d(in_channels, 64 + in_channels, 1)

    def forward(self, x):
        return super().forward(x) + self.shortcut(x)
3. CBAM-enhanced version (first create src/models/cbam.py)
from src.models.cbam import ChannelGate  # assumes the CBAM module is implemented there

class CBAMInception(InceptionBlock):
    def __init__(self, in_channels):
        super().__init__(in_channels)
        # Gate over the concatenated output: 64 + in_channels channels
        # (80 when in_channels == 16)
        self.cbam = ChannelGate(64 + in_channels)

    def forward(self, x):
        features = super().forward(x)
        return self.cbam(features)
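Since src/models/cbam.py still has to be created, here is a minimal sketch of a channel-attention gate in the spirit of CBAM (this is an assumed implementation, not the paper's reference code; the `reduction` parameter and class name are my choices):

```python
import torch
import torch.nn as nn

class ChannelGate(nn.Module):
    """Channel attention: reweight channels via a shared bottleneck MLP (CBAM-style sketch)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        # Squeeze spatial dims with both average- and max-pooling; share one MLP.
        avg = self.mlp(x.mean(dim=(2, 3)))
        mx = self.mlp(x.amax(dim=(2, 3)))
        scale = torch.sigmoid(avg + mx).unsqueeze(-1).unsqueeze(-1)
        return x * scale  # shape preserved; the spatial gate of full CBAM is omitted here

x = torch.randn(2, 80, 32, 32)
y = ChannelGate(80)(x)
print(y.shape)  # torch.Size([2, 80, 32, 32])
```

Full CBAM applies a spatial gate after the channel gate; for this ablation the channel gate alone is the minimal starting point.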
4. Experiment configuration
# training commands for the three variants
!python train.py --model inception_vanilla --lr 0.001
!python train.py --model inception_residual --lr 0.001
!python train.py --model inception_cbam --lr 0.0005
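For these commands to work, train.py needs to map the --model flag to the right block. A hypothetical sketch of that glue (the registry and flag names below mirror the commands above but are otherwise my assumptions):

```python
import argparse

# Hypothetical model registry; keys match the --model values used above.
MODELS = {
    "inception_vanilla": "InceptionBlock",
    "inception_residual": "ResInceptionBlock",
    "inception_cbam": "CBAMInception",
}

parser = argparse.ArgumentParser()
parser.add_argument("--model", choices=MODELS, default="inception_vanilla")
parser.add_argument("--lr", type=float, default=0.001)

# Simulate one of the command lines above:
args = parser.parse_args(["--model", "inception_cbam", "--lr", "0.0005"])
print(args.model, args.lr)  # inception_cbam 0.0005
```

In the real script the registry would hold the classes themselves and the chosen one would be instantiated before training; keeping the learning rate as a flag makes the lower rate for the CBAM run explicit in the experiment log.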