Multiple (unofficial) sources claim that GPT-4 is a mixture of 8 models, each around 220B parameters. Another rumor puts it at 16 experts of ~111B parameters each.
There's a fairly fresh and active project replicating something similar with a herd of LLaMA models: https://github.com/jondurbin/airoboros
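For intuition, here's a toy sketch of the routing idea behind such a "herd" or mixture-of-experts setup: a gating step inspects the query and dispatches it to one specialist model. Everything here is hypothetical and purely illustrative; the expert functions stand in for actual models, and the real gating in GPT-4 (or in airoboros) would be learned, not keyword-based.

```python
# Toy mixture-of-experts dispatch. Expert functions are stand-ins for
# real models; the keyword router stands in for a learned gating network.

def code_expert(prompt: str) -> str:
    return f"[code expert] {prompt}"

def math_expert(prompt: str) -> str:
    return f"[math expert] {prompt}"

def general_expert(prompt: str) -> str:
    return f"[general expert] {prompt}"

EXPERTS = {
    "code": code_expert,
    "math": math_expert,
    "general": general_expert,
}

def route(prompt: str) -> str:
    """Pick an expert name for the prompt (keyword heuristic as a stand-in)."""
    lowered = prompt.lower()
    if any(k in lowered for k in ("python", "function", "bug")):
        return "code"
    if any(k in lowered for k in ("integral", "prove", "equation")):
        return "math"
    return "general"

def answer(prompt: str) -> str:
    """Dispatch the prompt to the selected expert and return its reply."""
    return EXPERTS[route(prompt)](prompt)
```

In a real system the router is itself a model (or part of one) and may blend several experts' outputs per token rather than picking a single model per query.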