Several papers have explored the possibility of using homeomorphic or diffeomorphic transformations within feed-forward
machine learning models. Invertible maps composed into
``normalizing flows'' were introduced by Rezende and Mohamed (ICML 2015);
viewing the residual steps of the ResNet architecture (He et al., CVPR 2016)
as a discretized flow, Chen et al. (NeurIPS 2018) extended the construction
to a time-continuous form. Continuous-time optimal control as a learning
principle was proposed by Weinan E (Commun. Math. Stat., 2017).
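To make the residual-flow connection concrete, here is a minimal NumPy
sketch (the function and variable names are illustrative, not taken from
the cited papers): one planar-flow step in the style of Rezende and
Mohamed, with its change-of-variables log-density correction, followed by
the explicit-Euler recursion whose continuous-time limit is the flow
studied by Chen et al.

\begin{verbatim}
import numpy as np

def planar_flow(z, u, w, b):
    # One planar-flow step f(z) = z + u * tanh(w.z + b),
    # an invertible residual map for suitable u and w.
    a = np.tanh(z @ w + b)                    # pre-activations, shape (n,)
    f = z + np.outer(a, u)                    # transformed samples, (n, d)
    psi = np.outer(1.0 - a**2, w)             # h'(w.z + b) * w, (n, d)
    log_det = np.log(np.abs(1.0 + psi @ u))   # change-of-variables term
    return f, log_det

def euler_flow(z, velocity, t1=1.0, steps=100):
    # Stacking many small residual steps z <- z + h*v(z) is the
    # explicit-Euler discretization of dz/dt = v(z); the time-
    # continuous model passes to the limit of this recursion.
    h = t1 / steps
    for _ in range(steps):
        z = z + h * velocity(z)
    return z

rng = np.random.default_rng(0)
z0 = rng.standard_normal((5, 2))              # samples from a base density
u, w = rng.standard_normal(2), rng.standard_normal(2)
z1, log_det = planar_flow(z0, u, w, b=0.1)
# Density after the step: log q(z1) = log q0(z0) - log_det.
zT = euler_flow(z0, np.tanh)                  # toy autonomous vector field
\end{verbatim}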
In a similar vein, the LDDMM framework can be adapted to develop powerful
predictors in typical data science contexts. The following two
papers explore this paradigm for classification [1] and regression
[2].
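As a rough illustration of this paradigm (a toy sketch under our own
simplifying assumptions, not the actual constructions of [1] or [2]), one
can flow the inputs along a smooth velocity field until a simple rule
separates them; in the cited papers the velocity field itself is estimated
from data under an LDDMM-style regularity penalty.

\begin{verbatim}
import numpy as np

def deform(x, velocity, t1=1.0, steps=20):
    # Transport the inputs along the flow of a smooth velocity
    # field (explicit Euler); the time-1 map is a diffeomorphism.
    h = t1 / steps
    for _ in range(steps):
        x = x + h * velocity(x)
    return x

# Toy data: two concentric rings, not linearly separable.
rng = np.random.default_rng(1)
n = 200
radii = np.where(rng.random(n) < 0.5, 1.0, 2.0)
angles = rng.uniform(0.0, 2.0 * np.pi, n)
x = np.c_[radii * np.cos(angles), radii * np.sin(angles)]
y = (radii > 1.5).astype(int)

# Hand-picked radial field pushing the rings apart; in practice
# this is the object one would learn from the training data.
v = lambda z: z * np.tanh(np.linalg.norm(z, axis=1, keepdims=True) - 1.5)
x_def = deform(x, v)
acc = np.mean((np.linalg.norm(x_def, axis=1) > 1.5).astype(int) == y)
# After the flow, a radius threshold classifies the rings perfectly.
\end{verbatim}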