Dropout
Dropout is a regularization technique used to prevent overfitting in neural networks. During training, it randomly sets a fraction of a layer's output features (activations) to zero on each forward pass, so the model cannot rely too heavily on any single feature and generalizes better to unseen data. At inference time dropout is disabled and all features pass through.
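A minimal sketch of the idea in NumPy, using the common "inverted dropout" formulation: during training, each activation is zeroed with probability `p` and the survivors are scaled by `1/(1-p)` so the expected activation is unchanged, which lets inference skip any rescaling. The function name and signature here are illustrative, not from a specific library.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout (illustrative sketch).

    During training, zero each element of x with probability p and
    scale the survivors by 1/(1-p) so the expected value matches the
    no-dropout case. At inference (training=False), return x unchanged.
    """
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

# Example: with p=0.5, surviving activations of 1.0 become 2.0,
# and roughly half are zeroed out.
activations = np.ones((2, 4))
dropped = dropout(activations, p=0.5, rng=np.random.default_rng(0))
```

In practice, deep learning frameworks provide this as a layer (e.g. a dropout layer that is active only in training mode), but the underlying mask-and-rescale logic is the same.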