Pre-trained Self-Attention Framework: An Efficient Mechanism for Source Separation
Lecture Notes in Networks and Systems (LNNS, volume 882); The 13th Conference on Information Technology and Its Applications (CITA 2024); pp. 99-110.
Saved in:
| Main Authors: | Ha, Minh Tan; Fhadli, Muhammad; Nguyen, Kim Quoc; Vu, Quang Duc |
|---|---|
| Format: | Article |
| Language: | English |
| Published in: | Springer Nature, 2024 |
| Online Access: | https://elib.vku.udn.vn/handle/123456789/4272 https://doi.org/10.1007/978-3-031-74127-2_9 |
| Holding Library: | Trường Đại học Công nghệ Thông tin và Truyền thông Việt Hàn - Đại học Đà Nẵng |
|---|---|
| id |
oai:elib.vku.udn.vn:123456789-4272 |
|---|---|
| record_format |
dspace |
| spelling |
oai:elib.vku.udn.vn:123456789-4272 2024-12-06T03:38:08Z Pre-trained Self-Attention Framework: An Efficient Mechanism for Source Separation Ha, Minh Tan; Fhadli, Muhammad; Nguyen, Kim Quoc; Vu, Quang Duc. Lecture Notes in Networks and Systems (LNNS, volume 882); The 13th Conference on Information Technology and Its Applications (CITA 2024); pp. 99-110. In this work, a pre-trained model of the self-attention framework is proposed for single-channel speech separation. First, all layers of the pre-trained self-attention framework are frozen. The model is then retrained in three stages using a scheduling mechanism for the learning rate, and the layers of the framework are unlocked according to that schedule. In this way, the model is relearned, enhanced, and updated from its previous knowledge. This is an effective way to improve the performance of an advanced model while significantly reducing the time and cost of training. The method is useful for applying an existing model to a similar task or for enhancing model performance. In this strategy, the pre-trained system outperforms the non-pre-trained system because the later phases of training repurpose features extracted by the previously trained early phases. The proposed method is tested and assessed on a standard dataset. Experimental results show that the methodology achieves higher performance than the baseline framework and outperforms current methods on the monaural speech separation task. 2024-12-04T04:04:20Z 2024-12-04T04:04:20Z 2024-11 Working Paper 978-3-031-74126-5 https://elib.vku.udn.vn/handle/123456789/4272 https://doi.org/10.1007/978-3-031-74127-2_9 en application/pdf Springer Nature |
| institution |
Trường Đại học Công nghệ Thông tin và Truyền thông Việt Hàn - Đại học Đà Nẵng |
| collection |
DSpace |
| language |
English |
| topic |
The suggested method is tested and assessed on a conventional dataset; Model is relearned, enhanced |
| spellingShingle |
The suggested method is tested and assessed on a conventional dataset; Model is relearned, enhanced; Ha, Minh Tan; Fhadli, Muhammad; Nguyen, Kim Quoc; Vu, Quang Duc; Pre-trained Self-Attention Framework: An Efficient Mechanism for Source Separation |
| description |
Lecture Notes in Networks and Systems (LNNS,volume 882); The 13th Conference on Information Technology and Its Applications (CITA 2024) ; pp: 99-110. |
| format |
Working Paper |
| author |
Ha, Minh Tan; Fhadli, Muhammad; Nguyen, Kim Quoc; Vu, Quang Duc |
| author_facet |
Ha, Minh Tan; Fhadli, Muhammad; Nguyen, Kim Quoc; Vu, Quang Duc |
| author_sort |
Ha, Minh Tan |
| title |
Pre-trained Self-Attention Framework: An Efficient Mechanism for Source Separation |
| title_short |
Pre-trained Self-Attention Framework: An Efficient Mechanism for Source Separation |
| title_full |
Pre-trained Self-Attention Framework: An Efficient Mechanism for Source Separation |
| title_fullStr |
Pre-trained Self-Attention Framework: An Efficient Mechanism for Source Separation |
| title_full_unstemmed |
Pre-trained Self-Attention Framework: An Efficient Mechanism for Source Separation |
| title_sort |
pre-trained self-attention framework: an efficient mechanism for source separation |
| publisher |
Springer Nature |
| publishDate |
2024 |
| url |
https://elib.vku.udn.vn/handle/123456789/4272 https://doi.org/10.1007/978-3-031-74127-2_9 |
| _version_ |
1849202065031561216 |
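The staged retraining described in the record's abstract (freeze all layers of the pre-trained model, then retrain in three stages, unlocking layers and adjusting the learning rate according to a schedule) can be sketched as follows. This is a minimal illustrative sketch: the layer names, layer groupings, and learning-rate values are assumptions, not the authors' actual configuration.

```python
# Hypothetical sketch of three-stage fine-tuning with scheduled layer
# unfreezing. Layer names, groupings, and learning rates are
# illustrative assumptions, not the paper's exact setup.

LAYERS = ["encoder", "attention_1", "attention_2", "decoder"]

# Stage schedule: which layer groups are newly unlocked at each stage,
# and the learning rate used for that stage. Later stages typically use
# smaller rates so earlier-stage knowledge is preserved.
SCHEDULE = [
    {"unlock": ["decoder"],                    "lr": 1e-3},  # stage 1
    {"unlock": ["attention_1", "attention_2"], "lr": 1e-4},  # stage 2
    {"unlock": ["encoder"],                    "lr": 1e-5},  # stage 3
]

def trainable_layers(stage: int) -> list[str]:
    """Return the layers unfrozen by the end of the given 1-based stage."""
    unlocked: list[str] = []
    for step in SCHEDULE[:stage]:
        unlocked.extend(step["unlock"])
    return unlocked

# After stage 1 only the decoder trains; by stage 3 every layer does.
print(trainable_layers(1))  # ['decoder']
print(trainable_layers(3))
```

In a real framework the same schedule would drive `requires_grad` flags and optimizer learning-rate updates between stages; the sketch only captures the bookkeeping that decides which layers train when.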