Abstract:
A function is approximated by two-point Hermite interpolating polynomials with an asymmetric orders-of-derivatives distribution at the endpoints of the interval. The local error estimate is examined theoretically and numerically. The position of the maximum of the error estimate is shown to depend on the ratio of the numbers of conditions imposed on the function and its derivatives at the two endpoints. The shape of a universal curve representing a reduced error estimate is found. For a given sum of the orders of derivatives at the endpoints, the orders-of-derivatives distribution is optimized so as to minimize the approximation error. A sufficient condition for the convergence of a sequence of general two-point Hermite polynomials to a given function is established.
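For orientation, a minimal sketch of the standard two-point Hermite setting the abstract refers to (the symbols p, q, H, and the interval [a, b] are illustrative and not taken from the paper): the interpolant H of degree p + q + 1 matches the function f and its derivatives at the endpoints,
\[
H^{(k)}(a) = f^{(k)}(a), \quad k = 0, \dots, p,
\qquad
H^{(k)}(b) = f^{(k)}(b), \quad k = 0, \dots, q,
\]
and, for f sufficiently smooth, the classical local error representation is
\[
f(x) - H(x) = \frac{f^{(p+q+2)}(\xi)}{(p+q+2)!}\,(x-a)^{p+1}(x-b)^{q+1},
\qquad \xi \in (a, b).
\]
The asymmetric case studied here corresponds to p \neq q, so the factor (x-a)^{p+1}(x-b)^{q+1} is skewed and the maximum of the error estimate shifts toward the endpoint with fewer imposed conditions.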
Key words: Hermite interpolating polynomial, Taylor polynomial, asymmetric two-point Hermite polynomial, approximation error estimate, approximation error minimization, asymmetric expansion of a function.