List: cross-entropy (1)
Happy Sisyphe
Cross entropy loss
Cross-Entropy Loss: Detailed Explanation and Derivation (D2P)

1. What is Cross-Entropy Loss?

Cross-entropy loss is a commonly used loss function in classification tasks. It measures the dissimilarity between the true labels $y$ and the predicted probabilities $\hat{y}$ output by the model. The goal is to minimize the cross-entropy loss to ensure the predicted probabilities match the true labels as..
Programming/ML&DL
2024. 12. 18. 13:42
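
The excerpt above describes cross-entropy as the dissimilarity between the true labels $y$ and the predicted probabilities $\hat{y}$, i.e. $H(y, \hat{y}) = -\sum_i y_i \log \hat{y}_i$ averaged over samples. A minimal NumPy sketch of that formula (the function name, the epsilon clipping, and the example values are illustrative, not taken from the post):

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean cross-entropy between one-hot labels and predicted probabilities.

    y_true: (n_samples, n_classes) one-hot true labels
    y_pred: (n_samples, n_classes) predicted probabilities (rows sum to 1)
    """
    y_pred = np.clip(y_pred, eps, 1.0)  # guard against log(0)
    return -np.sum(y_true * np.log(y_pred), axis=-1).mean()

# Two samples, three classes
y_true = np.array([[1, 0, 0], [0, 1, 0]])
y_pred = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])
print(cross_entropy(y_true, y_pred))  # ~0.29: -(log 0.7 + log 0.8) / 2
```

Only the probability assigned to the true class contributes to each sample's loss, so confident correct predictions drive the loss toward zero while confident wrong ones blow it up.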