However, the traditional way of measuring compression efficiency would not be useful for a meaningful analysis of the efficiency of a language. Setting aside issues like having an ideal encoding to bits, or even having the concept of "efficiency" rigorously defined, there are problems right from the outset.
Take context for example.
All useful compression methods have some sort of decompression key involved. This could be a dictionary, a bitmap, or just the know-how (for cases like RLE). In natural languages, the compression/decompression key is stored in a distributed fashion across the minds of a society.
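To make the point concrete, here is a minimal run-length encoding sketch (names and layout are my own, for illustration). The encoded output is short, but it is only decodable by someone who already holds the "key": the convention that the data is a sequence of (character, count) pairs.

```python
from itertools import groupby

def rle_encode(s: str) -> list[tuple[str, int]]:
    # Collapse each run of identical characters into a (char, count) pair.
    return [(ch, len(list(run))) for ch, run in groupby(s)]

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    # Decoding presumes you know the pair convention -- that is the key.
    return "".join(ch * n for ch, n in pairs)

encoded = rle_encode("aaaabbbcca")
# encoded == [('a', 4), ('b', 3), ('c', 2), ('a', 1)]
assert rle_decode(encoded) == "aaaabbbcca"
```

The key here is not a data structure at all; it's pure know-how, which is exactly what makes it easy to overlook when tallying up costs.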
"Darmok and Jalad at Tanagra" is a VERY efficient compression for what is presumably a very long story about two hunters who met at an island and fought a beast together, but it is only efficient to the people who speak that language. The "local" efficiency (to the population who speak the language) is very high, but the "global" efficiency isn't.
So we must account for efficiency in terms of both the size of the compressed message and the size of the compression key. And from my experience, it's a sorta lumpy kinda world out there.