Links

GitHub

Tags

#Functional Approximation #Neural Networks #Machine Learning #Artificial Intelligence

MFANN

We developed the Multivariate Functional Approximation Neural Network (MFANN), an architecture that combines the principles of multivariate functional approximation (MFA) with the iterative optimization techniques common in the neural network (NN) literature. MFA is a data modeling, compression, and visualization tool that uses tensor products of B-spline functions to build continuous, differentiable representations of input data. We extend MFA to use stochastic iterative mini-batch optimization methods, periodically updating the spline-based models instead of numerically solving for the representation directly. We have demonstrated that MFANN is less prone to common pitfalls of neural network optimization, such as overfitting and sensitivity to hyperparameter selection, while remaining flexible enough to fit complex analytical functions and real-world scientific data. Our work highlights MFANN as a promising paradigm for advancing the theory and practice of data-driven function approximation with a new class of neural networks.
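The core idea above can be sketched in one dimension: a B-spline model whose control points are fit by stochastic mini-batch gradient updates rather than a direct least-squares solve. This is a hypothetical illustration, not the MFANN implementation; the knot layout, learning rate, batch size, and target function are all illustrative choices.

```python
# Hypothetical 1D sketch (not the MFANN code): a B-spline model whose
# control points are trained by mini-batch stochastic gradient descent
# on squared error, instead of a direct numerical solve.
import numpy as np

def bspline_basis(x, knots, degree):
    """Evaluate all B-spline basis functions at x via Cox-de Boor recursion."""
    B = np.zeros((len(x), len(knots) - 1))
    for i in range(len(knots) - 1):
        B[:, i] = (knots[i] <= x) & (x < knots[i + 1])
    # clamped knot vector: let the last basis cover the right endpoint
    B[x >= knots[-1], len(knots) - degree - 2] = 1.0
    for d in range(1, degree + 1):
        Bn = np.zeros((len(x), len(knots) - d - 1))
        for i in range(len(knots) - d - 1):
            left = knots[i + d] - knots[i]
            right = knots[i + d + 1] - knots[i + 1]
            if left > 0:
                Bn[:, i] += (x - knots[i]) / left * B[:, i]
            if right > 0:
                Bn[:, i] += (knots[i + d + 1] - x) / right * B[:, i + 1]
        B = Bn
    return B  # shape: (len(x), number of control points)

rng = np.random.default_rng(0)
degree, n_ctrl = 3, 12
# clamped uniform knot vector on [0, 1]
knots = np.concatenate([np.zeros(degree),
                        np.linspace(0.0, 1.0, n_ctrl - degree + 1),
                        np.ones(degree)])
x_train = rng.uniform(0.0, 1.0, 2000)
y_train = np.sin(2.0 * np.pi * x_train)   # illustrative target function
c = np.zeros(n_ctrl)                      # control points (model parameters)
lr, batch = 1.0, 64
for _ in range(5000):                     # mini-batch SGD updates
    idx = rng.integers(0, len(x_train), batch)
    B = bspline_basis(x_train[idx], knots, degree)
    c -= lr * B.T @ (B @ c - y_train[idx]) / batch
```

Because the model is linear in its control points, each mini-batch step is a simple least-squares gradient update, and the fitted spline remains continuous and differentiable everywhere by construction.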

Publications

Coming soon!

Funding and Acknowledgements

This work is partially funded by the NSF CSGrad4US Fellowship.

People