Making Layers Not Learn

To stop a layer from learning further, set the lr_mult of its param entries to 0 in your prototxt. Each param block corresponds to one of the layer's learnable blobs, in order (typically the weights first, then the bias); an lr_mult of 0 means the solver applies no updates to that blob.
layer {
  name: "example"
  type: "example"
  ...
  param {
    lr_mult: 0      # learning rate multiplier for the weights
    decay_mult: 1
  }
  param {
    lr_mult: 0      # learning rate multiplier for the bias
    decay_mult: 0
  }
}
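As a concrete illustration, here is a minimal sketch of freezing a Convolution layer; the layer name conv1, the bottom/top blobs, and the convolution_param values are hypothetical, not taken from the original:

layer {
  name: "conv1"              # hypothetical layer name for illustration
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 0               # freeze the convolution weights
    decay_mult: 0
  }
  param {
    lr_mult: 0               # freeze the bias
    decay_mult: 0
  }
  convolution_param {
    num_output: 64           # illustrative filter count
    kernel_size: 3
  }
}

With lr_mult: 0 on both blobs, the layer's weights keep their initialized (or copied) values during training, which is the usual setup when fine-tuning from a pretrained net. Gradients still flow through the layer to the layers below it.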