Convolutional Neural Networks

Convolutional Networks

Pytorch

```python
torch.nn.Conv1d(in_channels, out_channels, kernel_size, stride=1, padding=0, dilation=1, groups=1, bias=True, padding_mode='zeros')
torch.nn.Conv3d(in_channels, out_channels, kernel_size, stride=1, padding=0, dilation=1, groups=1, bias=True, padding_mode='zeros')
```
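As a quick sketch of how these layers transform input shapes (the channel counts and sizes below are arbitrary, chosen only for illustration):

```python
import torch
import torch.nn as nn

# Conv1d: input (N, C_in, L) -> output (N, C_out, L_out)
conv1d = nn.Conv1d(in_channels=16, out_channels=33, kernel_size=3, stride=2)
x1 = torch.randn(20, 16, 50)
y1 = conv1d(x1)
print(y1.shape)  # torch.Size([20, 33, 24]), since L_out = (50 - 3) // 2 + 1 = 24

# Conv3d: input (N, C_in, D, H, W); kernel_size=3 with padding=1 keeps D, H, W
conv3d = nn.Conv3d(in_channels=3, out_channels=8, kernel_size=3, padding=1)
x3 = torch.randn(2, 3, 10, 32, 32)
y3 = conv3d(x3)
print(y3.shape)  # torch.Size([2, 8, 10, 32, 32])
```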

Problem: Deep networks are very hard to train!

Normalization

https://medium.com/techspace-usict/normalization-techniques-in-deep-neural-networks-9121bf100d8

https://zhuanlan.zhihu.com/p/56542480

Batch Normalization

  • Idea: “Normalize” the outputs of a layer so they have zero mean and unit variance

Why? Helps reduce “internal covariate shift”, improves optimization

  • We can normalize a batch of activations like this: given a batch \(x \in \mathbb{R}^{N \times D}\), compute per-feature statistics over the batch and normalize:

\[
\mu_j = \frac{1}{N}\sum_{i=1}^{N} x_{i,j}, \qquad
\sigma_j^2 = \frac{1}{N}\sum_{i=1}^{N} \left(x_{i,j} - \mu_j\right)^2, \qquad
\hat{x}_{i,j} = \frac{x_{i,j} - \mu_j}{\sqrt{\sigma_j^2 + \epsilon}}
\]
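A minimal NumPy sketch of this normalization over a batch (the function name and toy shapes are illustrative, not from the lecture):

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    # x: (N, D) batch of activations; normalize each feature over the batch
    mu = x.mean(axis=0)                    # per-feature mean, shape (D,)
    var = x.var(axis=0)                    # per-feature variance, shape (D,)
    x_hat = (x - mu) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta            # learnable scale and shift

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=(8, 4))
out = batchnorm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
print(out.mean(axis=0))  # ~0 for every feature
print(out.std(axis=0))   # ~1 for every feature
```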

  • Problem: What if zero-mean, unit variance is too hard of a constraint?

Learnable scale and shift parameters \(\gamma, \beta\): the output becomes \(y_{i,j} = \gamma_j \hat{x}_{i,j} + \beta_j\), so the network can recover the identity mapping if it wants to.


Problem: at test time, the output for one image depends on the other images in its batch. For example, the same cat image would get different activations depending on whether it is batched with other cats or with dogs — we do not want predictions to depend on batch composition.

Use constant \(\mu\) and \(\sigma\) at test time (running averages estimated during training) instead of batch statistics!

  • During testing batchnorm becomes a linear operator! Can be fused with the previous fully-connected or conv layer

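A sketch of that fusion for a `Conv2d` + `BatchNorm2d` pair (the helper `fuse_conv_bn` is my own illustration, not a PyTorch API):

```python
import torch
import torch.nn as nn

def fuse_conv_bn(conv: nn.Conv2d, bn: nn.BatchNorm2d) -> nn.Conv2d:
    # At test time BN computes y = gamma * (x - mu) / sqrt(var + eps) + beta,
    # a per-channel affine map, so it can be folded into the conv's weights.
    fused = nn.Conv2d(conv.in_channels, conv.out_channels,
                      conv.kernel_size, conv.stride, conv.padding, bias=True)
    scale = bn.weight / torch.sqrt(bn.running_var + bn.eps)  # per out-channel
    fused.weight.data = conv.weight.data * scale.reshape(-1, 1, 1, 1)
    conv_bias = conv.bias.data if conv.bias is not None else torch.zeros(conv.out_channels)
    fused.bias.data = (conv_bias - bn.running_mean) * scale + bn.bias.data
    return fused

conv = nn.Conv2d(3, 8, 3, padding=1).eval()
bn = nn.BatchNorm2d(8).eval()
bn.running_mean = torch.randn(8)        # pretend these were learned in training
bn.running_var = torch.rand(8) + 0.5

x = torch.randn(2, 3, 16, 16)
with torch.no_grad():
    y_ref = bn(conv(x))                 # conv followed by test-time batchnorm
    y_fused = fuse_conv_bn(conv, bn)(x) # single fused conv
print(torch.allclose(y_ref, y_fused, atol=1e-4))  # True
```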

Layer Normalization

  • Normalize each sample over all of its features (for images, over \(C \times H \times W\)); behaves identically at train and test time, with no dependence on the batch

Instance Normalization

  • Normalize each sample and each channel independently over the spatial dimensions \(H \times W\); also batch-independent

Group Normalization

  • Split the channels into \(G\) groups and normalize each group over \(C/G \times H \times W\) per sample; works well even with small batch sizes
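The four schemes differ only in which axes the statistics are computed over; a quick PyTorch comparison (the channel counts and sizes are arbitrary):

```python
import torch
import torch.nn as nn

x = torch.randn(8, 6, 16, 16)  # (N, C, H, W)

# BatchNorm:    statistics over (N, H, W), one (mu, sigma) per channel
# LayerNorm:    statistics over (C, H, W), one per sample
# InstanceNorm: statistics over (H, W),    one per (sample, channel)
# GroupNorm:    statistics over (C/G, H, W), one per (sample, group)
norms = {
    "batch":    nn.BatchNorm2d(6),
    "layer":    nn.LayerNorm([6, 16, 16]),
    "instance": nn.InstanceNorm2d(6),
    "group":    nn.GroupNorm(num_groups=3, num_channels=6),
}
for name, norm in norms.items():
    print(name, norm(x).shape)  # every variant preserves the input shape
```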

Summary


CNN Architectures


Last updated: 2024-04-21 10:20:14
Created: 2024-02-15 00:02:09