Given a block of code, a compiler translates each language expression or statement into a particular sequence of assembly or bytecode instructions, for example converting `a + b` into `ADD a b`.
A mapping-based decompiler simply reverses this: it sees `ADD a b` and emits `a + b`. This is the simplest approach, since it is effectively just a collection of such instruction-to-expression mappings. While it works, its output is often noisier and harder to read than the original source, because:
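A minimal, table-driven sketch of this kind of reverse mapping is shown below. The instruction format and the `NaiveDecompiler` class are illustrative assumptions, not from any real toolchain:

```java
import java.util.Map;

// Hypothetical one-to-one mapping from a three-address style
// instruction like "ADD a b" back to the source expression "a + b".
public class NaiveDecompiler {
    private static final Map<String, String> OPERATORS = Map.of(
            "ADD", "+",
            "SUB", "-",
            "MUL", "*",
            "DIV", "/");

    // "ADD a b" -> "a + b"; unknown shapes are passed through unchanged.
    public static String decompile(String instruction) {
        String[] parts = instruction.trim().split("\\s+");
        String op = OPERATORS.get(parts[0]);
        if (op == null || parts.length != 3) {
            return instruction; // no mapping known for this instruction
        }
        return parts[1] + " " + op + " " + parts[2];
    }
}
```

Because each instruction is translated in isolation, anything the compiler synthesized (null checks, unrolled loops, reordered expressions) is echoed back literally, which is exactly the weakness described next.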
1. it does not handle annotations like `@NotNull` correctly: these come out as `if (arg == null) throw ...`, because that if/throw is the code the compiler actually generated for the annotation;
2. it does not simplify complex expressions into a readable form;
3. it does not detect compiler optimizations such as loop unrolling or expression reordering, so it reproduces the optimized code rather than the original.
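Point (1) looks roughly like this in practice. The method name, message text, and exception type below are illustrative; toolchains that instrument `@NotNull` (such as IntelliJ's annotation instrumenter) vary in the exact code they emit:

```java
// Original source: the null-safety contract lives in the annotation.
//
//     void greet(@NotNull String name) {
//         System.out.println("Hello " + name);
//     }
//
// What a naive mapping-based decompiler shows instead (sketch):
public class Decompiled {
    static void greet(String name) {
        // This check was generated by the compiler for @NotNull,
        // but the decompiler has no way to know that.
        if (name == null) {
            throw new IllegalArgumentException("name must not be null");
        }
        System.out.println("Hello " + name);
    }
}
```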
For (1), an analytical decompiler can recognize the `if (arg == null) throw` statement at the start of a method and map it back to a `@NotNull` annotation.
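One way such recognition could work, sketched over decompiled statements as strings (a real decompiler would match on an intermediate representation; the `NullCheckRecognizer` class and its regex are assumptions for illustration):

```java
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical recognizer: if a method body begins with the canonical
// "if (<param> == null) throw ..." prologue, report that parameter as
// a candidate for a @NotNull annotation.
public class NullCheckRecognizer {
    private static final Pattern PROLOGUE =
            Pattern.compile("if \\((\\w+) == null\\) throw .*");

    // Returns the parameter name if the first statement is a null-check
    // prologue, or null if the pattern does not match.
    public static String detectNotNullParam(List<String> statements) {
        if (statements.isEmpty()) return null;
        Matcher m = PROLOGUE.matcher(statements.get(0));
        return m.matches() ? m.group(1) : null;
    }
}
```

A decompiler using this would drop the matched check from the emitted body and attach `@NotNull` to the named parameter instead.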
Likewise, for (3), it can detect optimizations such as loop unrolling and reconstruct the original, more readable loop.
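The same pattern-matching idea applies to re-rolling an unrolled loop: if consecutive statements are identical except for an index literal counting up from zero, they can be folded back into a loop. The `LoopReroller` class below is an illustrative sketch under that assumption, again operating on statement strings rather than a real intermediate representation:

```java
import java.util.List;

// Hypothetical re-roller: detects a run of statements that differ only
// in the array index 0, 1, 2, ... and folds them into a single loop.
public class LoopReroller {
    public static String reroll(List<String> statements) {
        // Derive a template from the first statement by abstracting
        // the index 0 to the loop variable i.
        String template = statements.get(0).replace("[0]", "[i]");
        for (int i = 0; i < statements.size(); i++) {
            String expected = template.replace("[i]", "[" + i + "]");
            if (!statements.get(i).equals(expected)) {
                // Not an unrolled loop: emit the statements unchanged.
                return String.join("\n", statements);
            }
        }
        return "for (int i = 0; i < " + statements.size() + "; i++) " + template;
    }
}
```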