Math Musings

1/10/2025

My running compilation of math topics and resources I find interesting. I’m quite bullish on some of these more “pure” concepts converging into elegant and powerful applications in emerging machine learning fields.

Kolmogorov-Arnold Networks

Don't have too many original thoughts beyond the fact that I find the math fascinating and believe using splines can have some nice downstream impacts on interpretability. Good paper here and a nontechnical overview article that cites contributions across the field.
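
For intuition, here's a minimal sketch of the core idea in numpy (dimensions, knot counts, and names are mine, and piecewise-linear interpolation stands in for the paper's B-splines): every edge of the network carries its own learnable univariate spline, and nodes simply sum their incoming edges.

```python
import numpy as np

class KANLayer:
    """One Kolmogorov-Arnold layer: each edge (i, j) carries its own
    learnable 1-D spline phi_ij, and node j outputs sum_i phi_ij(x_i).
    np.interp gives piecewise-linear splines over a fixed knot grid."""

    def __init__(self, in_dim, out_dim, n_knots=8, x_range=(-2.0, 2.0)):
        self.knots = np.linspace(*x_range, n_knots)   # shared knot grid
        # One vector of knot values per edge -- these are the trainable weights.
        self.values = 0.1 * np.random.randn(in_dim, out_dim, n_knots)

    def __call__(self, x):
        # x: shape (in_dim,); returns shape (out_dim,)
        out = np.zeros(self.values.shape[1])
        for i, xi in enumerate(x):
            for j in range(self.values.shape[1]):
                out[j] += np.interp(xi, self.knots, self.values[i, j])
        return out

layer = KANLayer(in_dim=3, out_dim=2)
print(layer(np.array([0.5, -1.0, 1.5])))   # two spline-summed activations
```

Since each edge function is a plain 1-D curve you can just plot it, which is where the interpretability angle comes from.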

Hyperbolic Learning

This is essentially the idea of creating embeddings in hyperbolic spaces instead of Euclidean ones, which can be implemented at each layer with tunable weights for curvature, or as a final layer in the fine-tuning case. Some pretty cool pieces here, here, and here, which discuss using transformers on images and graphs. Also this and this, which both introduce operations and mappings on hyperbolic space to carry out the required computations for a transformer (the last paper is phenomenal).
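
As a toy reference, here's a numpy sketch of the basic Poincaré-ball operations this line of work builds on, using the standard gyrovector formulas rather than any particular paper's code; the curvature parameter c is exactly the tunable weight mentioned above.

```python
import numpy as np

def mobius_add(x, y, c=1.0):
    """Mobius addition on the Poincare ball of curvature -c:
    the hyperbolic analogue of vector addition."""
    xy, x2, y2 = x @ y, x @ x, y @ y
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    return num / (1 + 2 * c * xy + c**2 * x2 * y2)

def hyp_dist(x, y, c=1.0):
    """Geodesic distance; blows up as points approach the boundary."""
    d = np.linalg.norm(mobius_add(-x, y, c))
    return (2 / np.sqrt(c)) * np.arctanh(np.sqrt(c) * d)

x, y = np.array([0.1, 0.2]), np.array([0.6, -0.3])
print(hyp_dist(x, y))          # ~1.73
print(np.linalg.norm(x - y))   # ~0.71, the Euclidean distance
```

Distances stretch as points move toward the boundary, which is what makes the geometry such a natural fit for tree-like and graph data.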

This newer HVT paper is also great: its method uses logarithmic maps to project hyperbolic points onto the Euclidean tangent space, performs the usual Euclidean operations there, and then maps back with the exponential map.
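
A sketch of that round trip (my own toy version, not the paper's code), with the log and exp maps taken at the origin of the Poincaré ball:

```python
import numpy as np

def log0(y, c=1.0):
    """Log map at the origin: Poincare ball -> tangent (Euclidean) space."""
    n = np.linalg.norm(y) + 1e-15     # epsilon avoids 0/0 at the origin
    return np.arctanh(np.sqrt(c) * n) * y / (np.sqrt(c) * n)

def exp0(v, c=1.0):
    """Exp map at the origin: tangent space -> Poincare ball."""
    n = np.linalg.norm(v) + 1e-15
    return np.tanh(np.sqrt(c) * n) * v / (np.sqrt(c) * n)

def tangent_linear(y, W, c=1.0):
    """Lift to the tangent space, apply an ordinary Euclidean linear map,
    then project back onto the ball (tanh keeps the output inside it)."""
    return exp0(W @ log0(y, c), c)

rng = np.random.default_rng(0)
W = 0.5 * rng.standard_normal((2, 2))
y = np.array([0.3, -0.4])   # a point inside the unit ball
out = tangent_linear(y, W)
print(out, np.linalg.norm(out) < 1)   # output stays on the ball
```

The appeal of the trick is that everything in the middle is an ordinary Euclidean layer, so existing transformer machinery carries over unchanged.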

Long Time Derivation

I got the chance to interview Xiao Ma and Yu Deng about their recent paper, Long time derivation of the Boltzmann equation from hard sphere dynamics, which justifiably brought buzz to the differential equations world. They walked me through their semi-novel approach of maintaining a memory of the full collision history, and a rudimentary correctness argument involving partitioning interaction graphs.
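
For context, the object being derived: the Boltzmann equation for the one-particle density f(t, x, v), written here with the hard-sphere collision operator.

```latex
% Transport on the left, binary hard-sphere collisions on the right.
\partial_t f + v \cdot \nabla_x f = Q(f, f),
\qquad
Q(f, f)(v) = \int_{\mathbb{R}^3} \int_{\mathbb{S}^2}
  \bigl| (v - v_*) \cdot \omega \bigr|
  \bigl( f(v')\, f(v_*') - f(v)\, f(v_*) \bigr) \, d\omega \, dv_*
```

Here v' and v_*' are the post-collision velocities. The "long time" in the title is the key point: the classical result of Lanford only justified this derivation for a small fraction of the mean free time.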

Symplectic Geometry

I read Quanta Magazine pretty obsessively. They have several pieces on symplectic geometry, but this one details its applications to spacecraft orbit calculations. Super neat.
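
The connection to orbits, as I understand it: propagators built on symplectic integrators preserve the phase-space structure of Hamiltonian mechanics, so energy errors stay bounded over very long integrations instead of drifting. A minimal leapfrog sketch for the Kepler problem (toy units, arbitrary step size, names mine):

```python
import numpy as np

def leapfrog(q, p, dt, steps):
    """Stormer-Verlet (leapfrog) for the Kepler Hamiltonian
    H = |p|^2 / 2 - 1/|q|. Each step is a symplectic map, so the
    energy error oscillates instead of accumulating."""
    accel = lambda q: -q / np.linalg.norm(q)**3
    for _ in range(steps):
        p = p + 0.5 * dt * accel(q)   # half kick
        q = q + dt * p                # full drift
        p = p + 0.5 * dt * accel(q)   # half kick
    return q, p

energy = lambda q, p: 0.5 * p @ p - 1 / np.linalg.norm(q)
q, p = np.array([1.0, 0.0]), np.array([0.0, 1.0])   # circular orbit, GM = 1
e0 = energy(q, p)
q, p = leapfrog(q, p, dt=0.01, steps=100_000)       # ~160 orbital periods
print(abs(energy(q, p) - e0))   # tiny and bounded, no secular drift
```

Running the same experiment with a non-symplectic scheme like forward Euler shows the orbit spiraling outward, which is exactly the failure mode these methods avoid.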