Loss functions in PyTorch
Loss functions can be divided into three categories: regression losses, binary classification losses, and multi-class classification losses. In machine learning, the loss function serves as the learning criterion that ties the model to an optimization problem, and different scenarios call for different loss functions.
The figure below shows the loss functions commonly used in PyTorch; a minimal code sketch with one representative loss from each category follows.
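The following sketch is not from the original post; it simply illustrates one commonly used PyTorch loss per category, assuming only that torch is installed.

import torch
import torch.nn as nn

# Regression: mean squared error between predictions and targets
mse = nn.MSELoss()
print(mse(torch.randn(4, 1), torch.randn(4, 1)))

# Binary classification: BCEWithLogitsLoss takes raw logits (sigmoid is applied internally)
bce = nn.BCEWithLogitsLoss()
print(bce(torch.randn(4), torch.randint(0, 2, (4,)).float()))

# Multi-class classification: CrossEntropyLoss takes raw logits and integer class indices
ce = nn.CrossEntropyLoss()
print(ce(torch.randn(4, 3), torch.randint(0, 3, (4,))))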
A good introduction to cross-entropy and the negative log-likelihood loss: https://blog.csdn.net/tcn760/article/details/123910565?spm=1001.2101.3001.6661.1&utm_medium=distribute.pc_relevant_t0.none-task-blog-2%7Edefault%7ECTRLIST%7ERate-1-123910565-blog-88914652.pc_relevant_3mothn_strategy_recovery&depth_1-utm_source=distribute.pc_relevant_t0.none-task-blog-2%7Edefault%7ECTRLIST%7ERate-1-123910565-blog-88914652.pc_relevant_3mothn_strategy_recovery&utm_relevant_index=1
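The relationship the linked post explains can be checked directly: nn.CrossEntropyLoss is equivalent to nn.LogSoftmax followed by nn.NLLLoss. A minimal sketch, assuming only torch:

import torch
import torch.nn as nn

logits = torch.randn(4, 3)                 # raw scores for 4 samples, 3 classes
targets = torch.randint(0, 3, (4,))

# CrossEntropyLoss applies log-softmax internally, then the NLL loss
ce = nn.CrossEntropyLoss()(logits, targets)

# Equivalent two-step computation
log_probs = nn.LogSoftmax(dim=1)(logits)
nll = nn.NLLLoss()(log_probs, targets)

print(torch.allclose(ce, nll))             # True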
A blog with clear visualizations of the various loss functions: https://builtin.com/machine-learning/common-loss-functions
A fairly detailed walkthrough of PyTorch loss-function usage with examples: https://www.v7labs.com/blog/pytorch-loss-functions
The difference between a loss function and an evaluation metric:
Key difference:
1. A loss function is used during training to optimize the model; it is not by itself a judge of overall performance.
2. An evaluation metric (criterion) is used after training to measure overall performance (see the sketch below).
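A minimal sketch of this distinction, using a hypothetical toy classifier on random data (assuming only torch): the cross-entropy loss drives the optimizer during training, while accuracy is computed afterwards as the evaluation metric.

import torch
import torch.nn as nn

model = nn.Linear(10, 3)                         # toy 3-class classifier
loss_fn = nn.CrossEntropyLoss()                  # loss: optimized during training
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(64, 10)
y = torch.randint(0, 3, (64,))

for _ in range(20):                              # training loop minimizes the loss
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

# Evaluation metric: accuracy, measured after training to judge overall performance
with torch.no_grad():
    accuracy = (model(x).argmax(dim=1) == y).float().mean()
print(f"final loss {loss.item():.4f}, accuracy {accuracy.item():.2%}")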