Abstract
Tree-structured neural networks (TSNNs) are universal approximators that admit extremely fast evaluation if so-called lazy activation functions are used. In this paper, it is shown how to choose a lazy activation function such that the existence of continuous derivatives up to order n is guaranteed. A fast algorithm is presented that evaluates the n-th order derivative of a univariate, multidimensional TSNN function, i.e., a vector-valued function of a single variable, using Taylor expansions of a functional decomposition. The same technique is used to derive a fast algorithm for the evaluation of definite integrals.
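The abstract does not spell out the algorithm, but "Taylor expansions of a functional decomposition" suggests propagating truncated Taylor coefficients through the composed node functions (Taylor-mode automatic differentiation). The following is a minimal sketch of that general idea, not the paper's method: the decomposition exp(x·x) stands in for a single decomposed node, and the helper names (taylor_mul, taylor_exp, nth_derivative) are hypothetical.

```python
import math

def taylor_add(a, b):
    # coefficient-wise sum of two truncated Taylor series
    return [x + y for x, y in zip(a, b)]

def taylor_mul(a, b):
    # truncated Cauchy product of two Taylor series
    n = len(a)
    return [sum(a[j] * b[k - j] for j in range(k + 1)) for k in range(n)]

def taylor_exp(a):
    # exp of a truncated series via the standard recurrence:
    # e_0 = exp(a_0),  k*e_k = sum_{j=1..k} j * a_j * e_{k-j}
    n = len(a)
    e = [math.exp(a[0])] + [0.0] * (n - 1)
    for k in range(1, n):
        e[k] = sum(j * a[j] * e[k - j] for j in range(1, k + 1)) / k
    return e

def nth_derivative(f, x0, n):
    # seed with the identity series x = x0 + 1*(x - x0),
    # push it through the decomposition, and read off n! * c_n
    seed = [x0, 1.0] + [0.0] * (n - 1)
    coeffs = f(seed)
    return math.factorial(n) * coeffs[n]

# toy decomposition: f(x) = exp(x * x)
f = lambda t: taylor_exp(taylor_mul(t, t))
print(nth_derivative(f, 0.5, 3))  # ~8.988, matching (12x + 8x^3) * exp(x^2) at x = 0.5
```

The definite-integral claim plausibly rests on the same coefficient machinery: integrating a truncated series term by term (c_k becomes c_k / (k + 1)) gives a local antiderivative that can be accumulated across subintervals, though the abstract gives no detail on how the paper does this.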