Combinations of Fast Activation and Trigonometric Functions in Kolmogorov–Arnold Networks

Bibliographic Details
Main Authors: Trần Thị Phương Linh; Tạ Hoàng Thắng; Thai Duy Quy
Format: Conference paper
Language: English
Published: 2025
Online Access: https://scholar.dlu.edu.vn/handle/123456789/4901
Institutions: Thư viện Trường Đại học Đà Lạt (Dalat University Library)
Description
Summary: For years, many neural networks have been developed based on the Kolmogorov-Arnold Representation Theorem (KART), which was formulated to address Hilbert's 13th problem. Recently, building on KART, Kolmogorov-Arnold Networks (KANs) have attracted attention from the research community, stimulating the use of basis functions such as B-splines and radial basis functions (RBFs). However, these functions lack efficient native GPU support and remain less widely used than standard activations. In this paper, we propose using fast, GPU-friendly functions, namely ReLU and the trigonometric functions sin, cos, and arctan, as basis components in KANs. By integrating combinations of these functions into the network structure, we aim to improve computational efficiency. Experimental results show that these combinations maintain competitive performance while offering potential improvements in training time and generalization.
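
For context, KART states that any continuous multivariate function f on [0,1]^n can be written as f(x_1, ..., x_n) = Σ_{q=0}^{2n} Φ_q( Σ_{p=1}^{n} φ_{q,p}(x_p) ), where the φ_{q,p} and Φ_q are continuous univariate functions; KANs learn these univariate functions on the network's edges. The sketch below is a hypothetical illustration of the paper's idea, not the authors' released code: a KAN-style layer whose per-edge univariate functions are learnable mixtures of the fast bases ReLU, sin, cos, and arctan. The class name FastBasisKANLayer, the mixing-coefficient parameterization, and the layer sizes are all assumptions.

import torch
import torch.nn as nn

class FastBasisKANLayer(nn.Module):
    """KAN-style layer: each edge applies a learned mix of fast basis functions."""
    def __init__(self, in_features, out_features):
        super().__init__()
        # Hypothetical parameterization: one weight per (output, input, basis)
        # triple, with 4 basis functions per edge.
        self.coeffs = nn.Parameter(0.1 * torch.randn(out_features, in_features, 4))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        # x: (batch, in_features) -> basis: (batch, in_features, 4)
        basis = torch.stack(
            (torch.relu(x), torch.sin(x), torch.cos(x), torch.arctan(x)),
            dim=-1,
        )
        # Sum each edge's weighted univariate output over inputs and bases,
        # mirroring the inner and outer sums of the KART form.
        return torch.einsum("bif,oif->bo", basis, self.coeffs) + self.bias

# Usage: stack two layers, as in a small KAN.
model = nn.Sequential(FastBasisKANLayer(2, 16), FastBasisKANLayer(16, 1))
out = model(torch.randn(8, 2))  # shape: (8, 1)

One plausible reading of the abstract's efficiency claim: each of these bases is a single elementwise GPU kernel, whereas B-spline evaluation involves a recursive computation over knot intervals that is harder to parallelize on GPU hardware.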