Regularised maximum-likelihood inference of mixture of experts for regression and clustering


Bibliographic Details
Main Authors: Huỳnh, Bảo Tuyên; Chamroukhi, Faicel
Format: Conference paper
Language: English
Published: Bruges (Belgium), 2023
Online access: https://scholar.dlu.edu.vn/handle/123456789/2334
Holding library: Da Lat University Library (Thư viện Trường Đại học Đà Lạt)

Description
Abstract: Variable selection is fundamental to high-dimensional statistical modeling and is particularly challenging in unsupervised settings, including mixture models. We propose a regularised maximum-likelihood inference of the Mixture of Experts model which can deal with potentially correlated features and encourages sparse models in potentially high-dimensional scenarios. We develop a hybrid Expectation-Majorization-Maximization (EM/MM) algorithm for model fitting. Unlike state-of-the-art regularised ML inference [1, 2], the proposed modeling does not require an approximation of the regularisation term. The algorithm automatically obtains sparse solutions without thresholding, and relies on coordinate-descent updates that avoid matrix inversion. An experimental study shows the capability of the algorithm to retrieve sparse solutions and to fit models in model-based clustering of regression data.
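To illustrate the kind of update the abstract alludes to, the sketch below shows a generic coordinate-descent solver for a responsibility-weighted L1-penalised least-squares problem, the sub-problem that typically arises in one expert's M-step of a regularised Mixture of Experts EM algorithm. This is not the authors' exact algorithm: the function names, the fixed weight vector `w`, and the penalty scaling `lam` are illustrative assumptions. It does, however, demonstrate the two properties claimed in the abstract: the soft-thresholding operator yields exact zeros (sparsity without post-hoc thresholding), and each coordinate update is closed-form, so no matrix inversion is required.

```python
import numpy as np

def soft_threshold(z, gamma):
    # Soft-thresholding operator: returns exactly 0 when |z| <= gamma,
    # which is why no additional thresholding step is needed for sparsity.
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def weighted_lasso_cd(X, y, w, lam, n_iter=100):
    """Coordinate descent for the weighted L1-penalised least squares
    sub-problem (illustrative stand-in for one expert's M-step):

        min_beta  0.5 * sum_i w_i * (y_i - x_i^T beta)^2 + lam * ||beta||_1

    Here w would hold the E-step responsibilities for this expert.
    Every coordinate update is closed-form: no matrix inversion.
    """
    n, p = X.shape
    beta = np.zeros(p)
    r = y - X @ beta  # current residual
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]              # remove coordinate j's contribution
            rho = np.sum(w * X[:, j] * r)       # weighted partial correlation
            denom = np.sum(w * X[:, j] ** 2)    # weighted column norm
            beta[j] = soft_threshold(rho, lam) / denom
            r -= X[:, j] * beta[j]              # restore coordinate j's contribution
    return beta
```

On a toy regression with a sparse true coefficient vector, the irrelevant coefficients come out exactly zero, mirroring the "sparse solutions without thresholding" behaviour described above.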