One of the problems in many of these domains is that the potential higher-order interaction space is so large that it's impossible to make inferences about it (at least in an exploratory way). In genetics, for example, there are many genes, and the number of potential combinations of causal factors is huge. Unless you have an a priori reason to think a particular P-way combination of factors is important, it's impossible to search for them, because the resources required to make inferences about the P-way interactions exceed any computational resources available to study them.
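To make the scale concrete, here is a quick back-of-the-envelope sketch (my own numbers, not from the comment above): the count of candidate P-way interaction sets among n genes is C(n, P), which blows up fast even for modest P.

```python
# Rough illustration of why exhaustive search over P-way interactions
# is hopeless: the number of candidate sets among n genes is C(n, P).
from math import comb

n = 20_000  # roughly the number of human protein-coding genes (assumed)

for p in range(2, 6):
    # comb(n, p) = n! / (p! * (n - p)!), the number of P-way candidates
    print(f"{p}-way candidate sets: {comb(n, p):.3e}")
```

Already at P = 5 the count is on the order of 10^19, far beyond what any exhaustive statistical search could touch.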
This is kind of the idea of emergence, that at some point the information involved in representing a set of higher-order interactions becomes too great to actually represent, so we measure some property of the system that summarizes these interactions instead.
I think the problem scientifically is always knowing whether the required information actually is too large, or whether we just don't understand what exactly is involved. Unknown unknowns, or something like that, but where some of the unknowns might in fact be fundamentally unknowable.
I've always thought it would be useful to have an estimate of the predictability of a system from some level of analysis of its predictors, so we at least know how much we can ever expect to explain from them. In some cases I think this might be doable, in others impossible.
As I see it, emergence comes in two flavours: a higher-order interaction among microscopic parts is already emergent in the sense that it is a non-atomic thing that determines the behaviour of atoms (I use atoms to refer to the 'singletons' or smallest elements of the theory, not necessarily physical atoms). But you're completely right in saying that there is another sense of emergence which only really happens for a 'thermodynamic' number of atoms. The difference seems somehow captured by the contrast between:
-- the whole is more than the sum of the parts
-- the whole is less than the sum of the parts
Both are commonly called emergence! If it turns out that you don't need to keep track of all birds in a flock to describe its behaviour, then we call that emergent because the whole is somehow less than the sum of the parts.
Your example of genetics is interesting, because it is actually what got me interested in this problem in the first place. I spent most of my PhD struggling with calculating up to 7-point interactions among genes, and you indeed need some clever tricks to make this tractable. I used causal discovery methods to rule out most potential interactions based on conditional dependencies. This is now a piece of open-source software: https://www.embopress.org/doi/full/10.1038/s44320-024-00074-...
In a way, calculating quantities q through Möbius inversion is just calculating Euler characteristics, weighted by Q (with some caveats).
In this article, we fix a mereology and a kind of quantity Q that "decomposes" over it, in the sense that Q(p) = sum_{r <= p} q(r) for some function q(r), and then see that Möbius inversion lets us solve for q in terms of Q. In terms of incidence algebras, we're saying: assume Q = zeta q, as a product of elements in an incidence algebra. Then zeta has an inverse mu, so q = mu Q.
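Here is a minimal sketch (my own, not from the article) of this inversion in the simplest setting, the lattice of subsets, where the Möbius function is mu(r, p) = (-1)^{|p| - |r|}:

```python
# Möbius inversion on the subset lattice: given Q(p) = sum_{r <= p} q(r)
# (i.e. Q = zeta q), recover q(p) = sum_{r <= p} (-1)^{|p|-|r|} Q(r)
# (i.e. q = mu Q).
from itertools import combinations

def subsets(s):
    s = list(s)
    return [frozenset(c) for r in range(len(s) + 1)
            for c in combinations(s, r)]

atoms = {"a", "b", "c"}
parts = subsets(atoms)

# Some arbitrary "local" quantity q on each part.
q = {p: len(p) ** 2 for p in parts}

# Aggregate Q by summing q over all sub-parts (Q = zeta q).
Q = {p: sum(q[r] for r in parts if r <= p) for p in parts}

# Möbius inversion recovers q from Q (q = mu Q).
q_recovered = {p: sum((-1) ** (len(p) - len(r)) * Q[r]
                      for r in parts if r <= p)
               for p in parts}

assert q_recovered == q
```

The same pattern works for any locally finite poset; only the Möbius function changes.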
In other situations, we might want to "solve for" a quantity Q that decomposes over some class of mereologies while respecting some properties. The "simpler" and more "homogeneous" the parts of your mereology, the less you can express, but the easier it becomes to reason about Q. A mereology that breaks me up into the empty set, singleton sets with each of my atoms, and the set of all my atoms admits no "decomposing quantities" besides a histogram of my atoms. An attempt to measure "how healthy I am" in terms of that mereology can't do much. On the other hand, if I choose the mereology that breaks me up into the empty set and my whole, all quantities decompose but I have no tools to reason about them.
I guess Euler characteristic could be an example of how the requirement of respecting a certain kind of mereology can "bend" a hard-to-decompose quantity into a weirder but "nicer" quantity. For example, say we're interested in defining a Q that attempts to "count the number of connected regions" of some object, and we insist on using a mereology that lets us divide regions up into "cells". Of course this is impossible, as we can see in the problem of counting connected components of a graph-like object: we can't get the answer just as a function of the number of vertices and edges. However, if we insist on assigning a value of 1 to "blobs" of any dimension, the "compositionality requirement" forces us to define the Euler characteristic. This doesn't help us much with graph algorithms in general, but gives us an unexpectedly easy way to, say, count the number of blob-shaped islands on a map.
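A small sketch of that last point (my own example, not from the comment): for a map of hole-free "blobs" of pixels, the Euler characteristic V - E + F of the cell complex decomposes cell-by-cell, yet it equals the number of islands.

```python
# Count simply connected "islands" on a binary map via the Euler
# characteristic: build the cell complex of filled pixels and compute
# V - E + F. Shared corners and shared sides are deduplicated by sets.
def euler_characteristic(grid):
    vertices, edges, faces = set(), set(), set()
    for y, row in enumerate(grid):
        for x, filled in enumerate(row):
            if not filled:
                continue
            faces.add((x, y))
            # Four corner vertices of this pixel.
            for dx in (0, 1):
                for dy in (0, 1):
                    vertices.add((x + dx, y + dy))
            # Four boundary edges of this pixel.
            edges.add(((x, y), (x + 1, y)))
            edges.add(((x, y + 1), (x + 1, y + 1)))
            edges.add(((x, y), (x, y + 1)))
            edges.add(((x + 1, y), (x + 1, y + 1)))
    return len(vertices) - len(edges) + len(faces)

# Two hole-free islands: chi equals the number of blobs.
island_map = [
    [1, 1, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 1],
]
assert euler_characteristic(island_map) == 2
```

Note the caveat from the text: this only counts components correctly when no blob has a hole, since a hole subtracts one from chi.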
I wonder if there are other examples of this?
Here is one:
A is connected to B
B is connected to C
C is connected to A
This is a description in terms of pairs, and from it, it is trivial to deduce how the rings are connected.
In the text:
If you only consider two of the rings and ignore the third, then any pair can be smoothly separated.
and from looking at the diagram (looking carefully):

A is not connected to B
B is not connected to C
C is not connected to A
A, B, and C are connected
This seems paradoxical, but the paradox is resolved by the 'higher-order' linkage.
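The same "pairwise trivial, jointly linked" structure has a probabilistic analogue (my own analogy, not from the comment): with two fair coin flips X, Y and Z = X XOR Y, every pair of variables is independent, yet the three together are completely constrained.

```python
# Pairwise independence without joint independence: a statistical
# Borromean-rings analogue. X, Y are fair bits and Z = X XOR Y.
from itertools import product

outcomes = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]

# Each pair of coordinates is independent: all four value pairs
# occur equally often across the (equally likely) outcomes.
for i, j in [(0, 1), (0, 2), (1, 2)]:
    counts = {}
    for o in outcomes:
        counts[(o[i], o[j])] = counts.get((o[i], o[j]), 0) + 1
    assert all(c == 1 for c in counts.values())

# But the triple is "linked": only 4 of the 8 joint values occur.
assert len(set(outcomes)) == 4
```

Drop any one variable and the remaining pair carries no dependence; only the three-way description reveals the constraint, just as with the rings.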