A non-traditional neural network architecture in which the activation functions are trained rather than fixed, as in a multi-layer perceptron (MLP). The outputs of the activation functions are simply summed in each layer. Each activation function is expressed as a linear combination of basis functions whose coefficients are trained.
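A minimal sketch of the idea, assuming a Fourier basis (as in the FourierKAN repo linked below): each input-output edge carries its own learned activation, built from sine/cosine basis functions with trainable coefficients, and the edge outputs are summed per output unit. The function name and coefficient shapes here are illustrative, not from the linked implementation.

```python
import numpy as np

def fourier_kan_layer(x, coeffs_cos, coeffs_sin):
    """One KAN-style layer using a Fourier basis.

    x: (in_dim,) input vector
    coeffs_cos, coeffs_sin: (out_dim, in_dim, num_freq) trainable coefficients
    """
    out_dim, in_dim, num_freq = coeffs_cos.shape
    k = np.arange(1, num_freq + 1)          # frequencies 1..K
    # basis functions evaluated at each input: shape (in_dim, num_freq)
    c = np.cos(np.outer(x, k))
    s = np.sin(np.outer(x, k))
    # each edge activation = sum_k a_k cos(kx) + b_k sin(kx);
    # then sum over input edges for each output unit
    return (np.einsum('oik,ik->o', coeffs_cos, c)
            + np.einsum('oik,ik->o', coeffs_sin, s))

# tiny usage example with random coefficients
rng = np.random.default_rng(0)
x = rng.standard_normal(3)
a = rng.standard_normal((2, 3, 4))
b = rng.standard_normal((2, 3, 4))
y = fourier_kan_layer(x, a, b)
print(y.shape)  # (2,)
```

In practice the coefficients would be optimized by gradient descent, exactly like MLP weights; the difference is that they parameterize the activation functions themselves rather than linear maps followed by a fixed nonlinearity.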
Read https://github.com/GistNoesis/FourierKAN/ for a simple implementation of the core idea. See further discussion at https://news.ycombinator.com/item?id=40219205.