Here it is:
https://patents.justia.com/patent/20230401438
On Google Patents: https://patents.google.com/patent/US20230401438A1/en
The authors simply implement a continued-fraction library in PyTorch and call backward() on the resulting computation graph.
That is, they chain linear layers and use the reciprocal (not ReLU) as the primary nonlinearity.
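To see how little is going on here, the whole idea fits in a few lines of PyTorch. This is a sketch under my own assumptions (layer widths, depth, and the eps guard are mine, not from the patent text): each "rung" of the ladder is an affine function of the input, the rungs are combined through repeated 1/z, and autograd handles the rest.

```python
import torch
import torch.nn as nn

class ContinuedFractionNet(nn.Module):
    """Sketch of a 'ladder': a_0(x) + b_1(x) / (a_1(x) + b_2(x) / (...)).

    Each a_i, b_i is an ordinary linear layer; the only nonlinearity
    is the reciprocal. Hyperparameters here are illustrative guesses,
    not taken from the patent.
    """
    def __init__(self, in_features, depth=3, eps=1e-6):
        super().__init__()
        self.a = nn.ModuleList(nn.Linear(in_features, 1) for _ in range(depth + 1))
        self.b = nn.ModuleList(nn.Linear(in_features, 1) for _ in range(depth))
        self.eps = eps  # keeps denominators away from zero

    def forward(self, x):
        # Evaluate the fraction bottom-up, starting at the deepest rung.
        out = self.a[-1](x)
        for a, b in zip(reversed(self.a[:-1]), reversed(self.b)):
            # Push the denominator away from zero, preserving its sign.
            denom = out + self.eps * torch.where(out >= 0, 1.0, -1.0)
            out = a(x) + b(x) / denom  # the 1/z "nonlinearity"
        return out

x = torch.randn(8, 4, requires_grad=True)
net = ContinuedFractionNet(4)
y = net(x)
y.sum().backward()  # autograd differentiates through the reciprocals
```

That last line is the entirety of "training" novelty: backward() through a chain of divisions, exactly what autograd already does for any rational function.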
The authors reinvent the wheel countless times:
1. They rename continued fractions and call them ‘ladders’.
2. They label basic division ‘The 1/z nonlinearity’.
3. Ultimately, they take the well-defined concept of generalized continued fractions, call them 'CoFrNets', and get a patent.
IBM's lawyers can strip out all the buzzword garbage if they feel litigious and sue anyone who has written a continued fraction library.
Because that, minus the buzzwords, is what the patent protects.