Happy Sisyphe


List: cross-entropy (1)


Cross entropy loss

Cross-Entropy Loss: Detailed Explanation and Derivation (D2P)

1. What is Cross-Entropy Loss?

Cross-entropy loss is a commonly used loss function in classification tasks. It measures the dissimilarity between the true labels $y$ and the predicted probabilities $\hat{y}$ output by the model. The goal is to minimize the cross-entropy loss to ensure the predicted probabilities match the true labels as..

Programming/ML&DL 2024. 12. 18. 13:42
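The preview above describes cross-entropy as a measure of dissimilarity between true labels $y$ and predicted probabilities $\hat{y}$. As a minimal sketch (not the post's own code), the formula $L = -\sum_k y_k \log \hat{y}_k$ for a one-hot label can be written as:

```python
import math

def cross_entropy(y_true, y_pred, eps=1e-12):
    # L = -sum_k y_k * log(y_hat_k); eps guards against log(0).
    return -sum(y * math.log(max(p, eps)) for y, p in zip(y_true, y_pred))

# True class is index 1: a confident correct prediction gives a low loss,
# an uncertain one gives a higher loss.
y_true = [0.0, 1.0, 0.0]
print(cross_entropy(y_true, [0.05, 0.90, 0.05]))  # ≈ 0.105 (= -ln 0.9)
print(cross_entropy(y_true, [0.40, 0.30, 0.30]))  # ≈ 1.204 (= -ln 0.3)
```

For a one-hot label the sum collapses to $-\log \hat{y}_c$ for the true class $c$, which is why only the predicted probability of the correct class matters here.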

Blog is powered by kakao / Designed by Tistory
