Weight Sharing

Weight sharing is a technique in neural networks where the same parameters are reused across different parts of the model, reducing the number of learnable weights and improving generalisation. A good example is the filters in a convolutional layer: each filter's weights are applied at every spatial position of the input, rather than learning a separate set of weights per position.
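As a minimal sketch of this idea, the 1D convolution below (a hypothetical `conv1d` helper using NumPy) applies one shared 3-tap kernel at every position of the input, so only 3 weights are learned regardless of input length:

```python
import numpy as np

def conv1d(x, w):
    # Weight sharing: the SAME kernel `w` is reused at every
    # window position, instead of a separate weight vector per
    # output position (as a fully connected layer would have).
    k = len(w)
    return np.array([np.dot(x[i:i + k], w) for i in range(len(x) - k + 1)])

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
w = np.array([0.5, 0.5, 0.0])  # one shared 3-tap filter

print(conv1d(x, w))  # → [1.5 2.5 3.5]

# Parameter count comparison: a dense layer mapping the 5 inputs to the
# 3 outputs would need 5 * 3 = 15 weights; the shared kernel needs 3.
```

Because every output reuses `w`, gradients from all positions accumulate into the same 3 parameters during training, which is what yields the regularising effect mentioned above.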