Formal Power Series for Competitive Programming
The use of formal power series (FPS) in solving competitive programming tasks, whether implicit or explicit, as the author intended or simply to bash,1 has long interested me. Recently, I set out to implement some library code for truncated formal power series, specifically for use in programming contests, with the twist that it would require only implementations of modular integers and the corresponding convolution operation (typically, NTT), allowing interoperability with existing competitive programming libraries, such as AtCoder’s.
The heart of this task lies in implementing various FPS operations. For example, given a truncated FPS, how do we efficiently compute the first \(N\) terms of its multiplicative inverse (which themselves form a truncated FPS)? One common approach to this involves determining, from the first \(C\) coefficients of the answer, its next \(C\) coefficients. Therefore, if the constant term of the result is known, and this lifting process (which doubles the number of known coefficients) is sufficiently fast, we can efficiently solve the task by repeatedly lifting until we reach \(N\) known coefficients.
The lifting process works roughly as follows: we regard the desired, non-truncated result FPS as a zero of some function \(F\). Then, if \(Q_i\) is, for each \(i\), an FPS whose first \(2^i\) coefficients match those of the non-truncated result, it holds that (source: CP-Algorithms)2:
\[\label{eq:newton} Q_{k + 1} \equiv Q_k - \frac {F(Q_k)} {F'(Q_k)} \quad\left(\operatorname{mod}\,{x^{2^{k + 1}}}\right).\]Essentially, if we continually maintain the first \(C\) correct terms of the desired FPS, directly computing the expression on the right-hand side of \eqref{eq:newton}, which typically takes \(O(C \log C)\) time due to polynomial multiplication, provides us with the first \(2 C\) correct terms. As lifting is repeated until \(N\) terms are reached (alternatively, a recursive implementation is possible), we then obtain the time complexity recurrence \(T(N) = T(N/2) + O(N \log N)\), which solves to \(T(N) \in O(N \log N)\). So, FPS multiplicative inverses can be found asymptotically as quickly as a single length \(N\) polynomial multiplication, by using polynomial multiplications! The best is yet to come, however…
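To make the lifting concrete, here is a minimal Python sketch of Newton's method for the multiplicative inverse. Taking \(F(Q) = 1/Q - P\), the update \eqref{eq:newton} simplifies to \(Q \leftarrow Q(2 - PQ)\). This is not the library's code: the names are mine, and a naive \(O(C^2)\) convolution stands in for NTT, so this sketch does not achieve the \(O(N \log N)\) bound.

```python
MOD = 998244353  # a common NTT-friendly prime


def convolve(a, b):
    # Naive O(n^2) convolution mod MOD; a real library would use NTT here.
    res = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            res[i + j] = (res[i + j] + x * y) % MOD
    return res


def inverse(p, n):
    # First n coefficients of 1 / P(x); requires p[0] != 0.
    assert p[0] != 0
    q = [pow(p[0], MOD - 2, MOD)]  # base case: invert the constant term
    while len(q) < n:
        m = 2 * len(q)
        # Newton step: Q <- Q * (2 - P * Q), truncated to m terms,
        # which doubles the number of correct coefficients.
        pq = convolve(p[:m], q)[:m]
        t = [(-c) % MOD for c in pq]
        t[0] = (t[0] + 2) % MOD
        q = convolve(q, t)[:m]
    return q[:n]
```

For instance, `inverse([1, 1], 5)` returns the first five coefficients of \(1/(1+x) = 1 - x + x^2 - \dots\), with \(-1\) represented as `MOD - 1`.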
It is often said that unexpected mathematical connections are sources of mathematical beauty. Perhaps the sight of \eqref{eq:newton} is itself enough to warrant flashbacks to high school math class. If not, recall that we sometimes approximate $\sin x$, for example, by taking the first few terms of its Maclaurin expansion — the more terms we take, the better the approximation. In this sense, the known coefficients in the lifting method described earlier approximate the true desired FPS.
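The "more terms, better approximation" behaviour is easy to see numerically; the snippet below (an illustration of the Maclaurin partial sums, not anything from the library) compares truncations of \(\sin x\) against the true value.

```python
import math


def sin_maclaurin(x, terms):
    # Partial Maclaurin sum: sin x ~ sum of (-1)^k x^(2k+1) / (2k+1)!
    return sum(
        (-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
        for k in range(terms)
    )


# Each extra term shrinks the error at x = 1:
for terms in (1, 2, 4, 8):
    print(terms, abs(sin_maclaurin(1.0, terms) - math.sin(1.0)))
```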
Therefore, to efficiently compute the first $N$ terms of a desired FPS, we considered a function $F$ of which it is a zero, and iteratively obtained better approximations to the true quantity, with the number of correct coefficients doubling in each step. All of this is reminiscent of Newton’s method (!) in numerical analysis, used to iteratively obtain improving real approximations $q_0, q_1, \dots$ to zeroes of a real-valued function $f$ via
\[q_{k + 1} = q_k - \frac {f(q_k)} {f'(q_k)},\]an equation that is (visually) near-identical to \eqref{eq:newton} — remarkable!
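For comparison, the classical real-valued iteration is a few lines of Python (a generic sketch, not tied to the library; the names here are mine):

```python
def newton(f, fprime, q0, steps):
    # Iterate q <- q - f(q) / f'(q) a fixed number of times.
    q = q0
    for _ in range(steps):
        q = q - f(q) / fprime(q)
    return q


# Approximating sqrt(2) as a zero of f(q) = q^2 - 2, starting from q0 = 1:
root = newton(lambda q: q * q - 2, lambda q: 2 * q, 1.0, 6)
```

Just as with the FPS version, convergence is quadratic: the number of correct digits roughly doubles per step, mirroring the doubling of correct coefficients.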
The repeated-lifting method used to implement FPS operations, unnamed so far, is often simply referred to as Newton’s method within the competitive programming community. In fact, it is apparently a specialisation of Hensel lifting, and it has long been maintained by number theorists that the methods from Hensel and Newton are one and the same, though my mathematical knowledge (or lack thereof) prevents me from elaborating or appreciating this further at the moment.
Thanks for reading! The repository for this project, containing the library, can be found here — please consult the README for specifics. Included in the repo are some example problems, solution write-ups, and sample submissions using the library. However, the applications of FPS in competitive programming3 (not their implementation) may deserve a separate blog post, so watch this space!
The term generating functions bashing is sometimes used (e.g., in the editorial of Problem F here) to refer to the act of throwing generating functions at a problem, battling through the resultant math and/or implementation to arrive at a solution that may still require effort to squeeze through the time limit, while the intended solution is often smarter, shorter, and faster. ↩
CP-Algorithms is a great initiative without which this project would not have been possible. Check it out! ↩
Well, maybe just the simple stuff that I do understand. ↩