Combinations of Fast Activation and Trigonometric Functions in Kolmogorov–Arnold Networks

Bibliographic information
Main authors: Linh, Trần Thị Phương; Tạ, Hoàng Thắng; Thai Duy Quy
Format: Conference paper
Language: English
Published: 2025
Online access: https://scholar.dlu.edu.vn/handle/123456789/4901
Holding library: Thư viện Trường Đại học Đà Lạt
Description
Summary: For years, many neural networks have been developed based on the Kolmogorov–Arnold Representation Theorem (KART), which was originally formulated to address Hilbert's 13th problem. Recently, building on KART, Kolmogorov–Arnold Networks (KANs) have attracted attention from the research community, stimulating the use of basis functions such as B-splines and radial basis functions (RBFs). However, these functions are not efficiently supported on GPU devices and remain less widely used. In this paper, we propose using fast computational functions, namely ReLU and trigonometric functions (sin, cos, arctan), as basis components in KANs. By integrating combinations of these functions into the network structure, we aim to improve computational efficiency. Experimental results show that these combinations maintain competitive performance while offering potential improvements in training time and generalization.
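The record does not reproduce the authors' implementation. As a rough sketch of the idea, one can build a KAN layer whose learnable edge functions are linear combinations of ReLU, sin, cos, and arctan with trainable coefficients. The PyTorch code below is such a minimal sketch; the class name FastTrigKANLayer, the initialization scale, and the layer shapes are illustrative assumptions rather than details taken from the paper.

# Hypothetical illustration (not the authors' code): a single KAN layer whose
# learnable edge functions are weighted combinations of ReLU, sin, cos, and
# arctan, in the spirit of the abstract. Names, shapes, and initialization
# are assumptions made for this sketch.
import torch
import torch.nn as nn

class FastTrigKANLayer(nn.Module):
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # One coefficient per (output, input, basis-function) triple:
        # phi_{j,i}(x_i) = sum_k c[j, i, k] * b_k(x_i),
        # with b_k in {ReLU, sin, cos, arctan}.
        self.coeff = nn.Parameter(0.1 * torch.randn(out_features, in_features, 4))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_features)
        basis = torch.stack(
            (torch.relu(x), torch.sin(x), torch.cos(x), torch.atan(x)), dim=-1
        )  # (batch, in_features, 4)
        # Sum each edge function over the inputs, as in the KAN construction.
        return torch.einsum("bik,jik->bj", basis, self.coeff) + self.bias

# Example usage: stack two such layers into a small network.
if __name__ == "__main__":
    net = nn.Sequential(FastTrigKANLayer(8, 16), FastTrigKANLayer(16, 1))
    y = net(torch.randn(32, 8))
    print(y.shape)  # torch.Size([32, 1])

Because every basis function here is a cheap element-wise GPU primitive, the forward pass reduces to element-wise operations plus one tensor contraction, which is the kind of computational saving the abstract attributes to replacing B-spline or RBF bases.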