Combinations of Fast Activation and Trigonometric Functions in Kolmogorov–Arnold Networks

Bibliographic Details
Main authors: Linh, Trần Thị Phương; Tạ, Hoàng Thắng; Thai Duy Quy
Format: Conference paper
Language: English
Published: 2025
Online access: https://scholar.dlu.edu.vn/handle/123456789/4901
Holding library: Thư viện Trường Đại học Đà Lạt (Da Lat University Library)
Description
Summary: For years, many neural networks have been developed based on the Kolmogorov-Arnold Representation Theorem (KART), which was created to address Hilbert's 13th problem. Recently, relying on KART, Kolmogorov-Arnold Networks (KANs) have attracted attention from the research community, stimulating the use of basis functions such as B-splines and radial basis functions (RBFs). However, these functions are not efficiently supported on GPU devices and remain less widely used. In this paper, we propose the use of fast computational functions, such as ReLU and trigonometric functions (sin, cos, arctan), as basis components in Kolmogorov-Arnold Networks (KANs). By integrating combinations of these functions into the network structure, we aim to enhance computational efficiency. Experimental results show that these combinations maintain competitive performance while offering potential improvements in training time and generalization.
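
To make the idea concrete, below is a minimal sketch in PyTorch, not the authors' released code, of a KAN-style layer whose learnable edge functions are linear combinations of the fast, GPU-friendly bases named in the abstract (ReLU, sin, cos, arctan) instead of B-splines. The class name, tensor shapes, and initialization scale are illustrative assumptions.

# Minimal sketch (assumed structure, not the paper's implementation) of a
# KAN-style layer using fast activation and trigonometric basis functions.
import torch
import torch.nn as nn

class FastBasisKANLayer(nn.Module):
    """Maps (batch, in_dim) -> (batch, out_dim); each (input, output) edge
    carries a learnable function expanded in a fixed four-function basis."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.num_bases = 4  # relu, sin, cos, arctan
        # One learnable coefficient per (output, input, basis) triple.
        self.coeff = nn.Parameter(torch.randn(out_dim, in_dim, self.num_bases) * 0.1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Expand each scalar input into its four basis values.
        # Resulting shape: (batch, in_dim, num_bases).
        bases = torch.stack(
            (torch.relu(x), torch.sin(x), torch.cos(x), torch.arctan(x)),
            dim=-1,
        )
        # Sum the per-edge functions into each output unit:
        # (batch, in, bases) x (out, in, bases) -> (batch, out).
        return torch.einsum("bik,oik->bo", bases, self.coeff)

if __name__ == "__main__":
    layer = FastBasisKANLayer(in_dim=8, out_dim=4)
    y = layer(torch.randn(32, 8))
    print(y.shape)  # torch.Size([32, 4])

In this sketch each edge function is phi(x) = c1*relu(x) + c2*sin(x) + c3*cos(x) + c4*arctan(x) with learnable coefficients, so a forward pass reduces to elementwise activation calls plus one einsum contraction, all of which map directly onto standard GPU kernels, unlike B-spline evaluation.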