Numerical Derivatives, Projection Coefficients, and Truncation Errors in Analytic Hilbert Space With Gaussian Measure

Published: April 22, 2025 | arXiv ID: 2504.16246v2

By: M. W. AlMasri

Potential Business Impact:
Provides a way to compute derivatives and Taylor series coefficients of complex functions through simple inner products, with potential uses in quantum computing and kernel-based machine learning.

Business Areas:
Data and Analytics

We introduce the projection coefficients algorithm, a novel method for determining the leading terms of the Taylor series expansion of a given holomorphic function from a graph perspective, while also analyzing the associated truncation errors. Let $f(z)$ be a holomorphic function, and let $\langle \cdot, \cdot \rangle$ denote the inner product defined over an analytic Hilbert space equipped with a Gaussian measure. The derivatives $f^{(n)}(z)$ at a point $z_0$ can, in principle, be computed by evaluating an inner product of the form $f^{(n)}(z_0) = \frac{\langle z^n, f(z) \rangle}{C}$, where $C$ is a normalization constant. Specifically, in the Bargmann space (the analytic Hilbert space with a Gaussian weight and orthogonal monomials), this constant is $\pi$. This result assumes that $f(z)$ is a holomorphic function of a single complex variable. The accuracy of the computed derivative values depends on the precision and reliability of the numerical routines used to evaluate these inner products. The projection coefficients offer valuable insights into certain properties of analytic functions, such as whether they are odd or even and whether the $n$-th derivatives exist at a given point $z_0$. Because of its relevance to quantum theory, our approach also establishes a correspondence between quantum circuits derived from quantum systems and the theory of analytic functions. This study lays the groundwork for further applications in numerical analysis and approximation theory in Hilbert spaces equipped with Gaussian measures. Additionally, it holds potential for advancing quantum computing, probabilistic numerics, and reproducing kernel Hilbert space (RKHS) methods, which are widely used in support vector machines (SVMs) and other areas of machine learning.
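The inner-product formula can be sanity-checked numerically at the origin: with the unnormalized Bargmann inner product $\langle g, f \rangle = \int_{\mathbb{C}} \overline{g(z)}\, f(z)\, e^{-|z|^2}\, dA(z)$, the monomials satisfy $\langle z^m, z^n \rangle = \pi\, n!\, \delta_{mn}$, so $\langle z^n, f \rangle = \pi\, n!\, a_n = \pi\, f^{(n)}(0)$ and $C = \pi$. The sketch below is not the paper's graph-based projection coefficients algorithm; it is only a plain quadrature check of this identity at $z_0 = 0$, and the helper name `bargmann_derivative`, the polar grid, and the cutoff radius are all illustrative assumptions.

```python
import numpy as np


def bargmann_derivative(f, n, r_max=8.0, n_r=400, n_theta=256):
    """Approximate f^(n)(0) as <z^n, f> / pi, where <g, f> is the
    Gaussian-weighted integral of conj(g(z)) f(z) exp(-|z|^2) over the plane.

    The polar quadrature (rectangle rule in theta, trapezoid in r) and the
    grid parameters are illustrative choices, not the paper's algorithm."""
    r = np.linspace(0.0, r_max, n_r)
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    R, T = np.meshgrid(r, theta, indexing="ij")
    Z = R * np.exp(1j * T)

    # Integrand conj(z^n) f(z) exp(-|z|^2) times the Jacobian r (dA = r dr dtheta).
    integrand = np.conj(Z) ** n * f(Z) * np.exp(-(R**2)) * R

    # Rectangle rule over the periodic angle, then trapezoid over the radius.
    radial = integrand.sum(axis=1) * (2.0 * np.pi / n_theta)
    inner = np.sum(0.5 * (radial[1:] + radial[:-1]) * np.diff(r))
    return inner / np.pi


if __name__ == "__main__":
    # f(z) = exp(z): every derivative at 0 equals 1, so each line should print ~1.
    for n in range(5):
        print(f"n = {n}: f^({n})(0) ~ {bargmann_derivative(np.exp, n).real:.6f}")
```

Running this prints values close to 1.0 for each $n$; the small residual comes from the quadrature, which plays the role the abstract assigns to the precision and reliability of the numerical routines used to evaluate the inner products.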

Page Count
40 pages

Category
Mathematics: Numerical Analysis