Regularized Mixtures of Experts in High-Dimensional Data
Mixture of experts (MoE) models are successful neural-network architectures for modeling heterogeneous data in many machine learning problems, including regression, clustering, and classification. Model learning is generally performed by maximum likelihood estimation (MLE). For high-dimensional...
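To make the abstract's setting concrete, the following is a minimal sketch of a mixture-of-experts regression model: a softmax gating network mixes the predictions of K linear experts. This is a generic illustration only, not the regularized estimator proposed in the catalogued paper; all dimensions and parameter names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy MoE regression: K linear experts combined by a softmax gating
# network over the same input x (generic sketch, randomly initialized).
n, d, K = 200, 5, 3                  # samples, input dimension, experts
X = rng.normal(size=(n, d))

W_gate = rng.normal(size=(d, K))     # gating-network parameters
W_exp = rng.normal(size=(K, d))      # one linear expert per row

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

gates = softmax(X @ W_gate)          # (n, K): mixing proportions per sample
expert_out = X @ W_exp.T             # (n, K): each expert's prediction
y_hat = (gates * expert_out).sum(axis=1)  # mixture prediction, shape (n,)
```

In an actual MLE fit, `W_gate` and `W_exp` would be estimated (e.g. by EM or gradient ascent on the log-likelihood), with an added penalty term in the regularized, high-dimensional setting the paper addresses.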
Saved in:
Main Authors:
Format: Conference paper
Language: English
Published: 2023
Online Access: https://scholar.dlu.edu.vn/handle/123456789/2335
Holding library: Thư viện Trường Đại học Đà Lạt