Regularized Mixtures of Experts in High-Dimensional Data

Mixture-of-experts (MoE) models are successful neural-network architectures for modeling heterogeneous data in many machine learning problems, including regression, clustering, and classification. Model learning is generally performed by maximum likelihood estimation (MLE). For high-dimensional...
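To illustrate the MoE idea the abstract refers to, here is a minimal sketch of a forward pass: a softmax gating network weights the predictions of several linear experts. This is a generic toy illustration with made-up weights and dimensions, not the regularized estimation method of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative only): n samples, d features, K experts
n, d, K = 5, 3, 2
X = rng.normal(size=(n, d))

# Gating network: softmax over the K experts, per sample
W_gate = rng.normal(size=(d, K))
logits = X @ W_gate
gates = np.exp(logits - logits.max(axis=1, keepdims=True))
gates /= gates.sum(axis=1, keepdims=True)   # each row sums to 1

# Linear experts: each expert produces its own regression prediction
W_exp = rng.normal(size=(d, K))
expert_preds = X @ W_exp                    # shape (n, K)

# MoE output: gate-weighted mixture of the expert predictions
y_hat = (gates * expert_preds).sum(axis=1)  # shape (n,)
print(y_hat.shape)
```

In MLE-based training, the gating and expert parameters would be fit jointly, typically with an EM-type algorithm; the paper's contribution concerns regularizing this estimation in high dimensions.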


Bibliographic details
Main authors: Chamroukhi, Faicel; Huỳnh, Bảo Tuyên
Format: Conference paper
Language: English
Published: 2023
Subjects:
Online access: https://scholar.dlu.edu.vn/handle/123456789/2335
Holding library: Đà Lạt University Library (Thư viện Trường Đại học Đà Lạt)