It's a combination of what you have already seen, read about or heard of, isn't it?
That being said, you are assuming that something alien is from space, and that they would be something that could even be visually experienced.
ChatGPT can exceed humans in its knowledge store, and it is excellent at doing research. But it's not thinking; it is merely selecting the most likely next words based on some algorithm.
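To make "selecting the most likely next words" concrete, here is a toy sketch. The vocabulary and probabilities below are entirely made up for illustration; a real LLM computes a distribution like this with a neural network over tens of thousands of tokens, then either takes the most likely token (greedy decoding) or samples from the distribution.

```python
import random

# Made-up distribution P(next word | "the cat sat on the").
# In a real model this comes from a neural network, not a hand-written dict.
vocab_probs = {"dog": 0.05, "mat": 0.70, "moon": 0.25}

# Greedy decoding: always pick the single most likely next word.
greedy = max(vocab_probs, key=vocab_probs.get)

def sample_next(probs, rng):
    """Sampling decoding: draw a word proportionally to its probability."""
    r = rng.random()
    cum = 0.0
    for tok, p in probs.items():
        cum += p
        if r < cum:
            return tok
    return tok  # guard against floating-point rounding

print(greedy)                                   # the most likely continuation
print(sample_next(vocab_probs, random.Random(0)))
```

Either way, the model is choosing a continuation from a probability distribution, which is the sense in which "it's not thinking" above.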
If it were up to me I'd try to give it a representation other than just words. I think those models should be trained to represent text as relationship graphs of objects. There's not much natural data like that, but it should be fairly easy to create vast amounts of synthetic data: text generated from relationship graphs. The model should be able to make the connection to natural language.
Once models are taught this representation they might learn how the graphs transform during reasoning just by training on natural language reasoning.
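The synthetic-data idea above can be sketched in a few lines. Everything here is hypothetical: the triples, the relation templates, and the serialization format are invented just to show the shape of a (graph, text) training pair, not any established pipeline.

```python
import random

# A tiny relationship graph as (subject, relation, object) triples.
# All entities and relations are made up for illustration.
GRAPH = [
    ("cat", "chases", "mouse"),
    ("mouse", "eats", "cheese"),
    ("cheese", "is_stored_in", "pantry"),
]

# Hypothetical templates mapping each relation to a sentence form.
TEMPLATES = {
    "chases": "The {s} chases the {o}.",
    "eats": "The {s} eats the {o}.",
    "is_stored_in": "The {s} is stored in the {o}.",
}

def serialize_graph(triples):
    """Flat token form of the graph itself, the 'other representation'."""
    return " ; ".join(f"{s} {r} {o}" for s, r, o in triples)

def graph_to_text(triples):
    """Render the same graph as natural language."""
    return " ".join(TEMPLATES[r].format(s=s, o=o) for s, r, o in triples)

def make_pair(triples, k=2, seed=None):
    """Sample a sub-graph and emit one synthetic (graph, text) example."""
    rng = random.Random(seed)
    sample = rng.sample(triples, k)
    return serialize_graph(sample), graph_to_text(sample)

graph_repr, text = make_pair(GRAPH, k=2, seed=0)
print(graph_repr)
print(text)
```

Training on millions of such paired examples is one way a model could be pushed to "make the connection" between the graph representation and natural language.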
Or this: https://preview.redd.it/finally-made-my-scientist-species-to...
Humans are capable of thinking through and fleshing out novel concepts; current AI is not. Sure, your first attempt will greatly resemble existing things, but as you iterate and get further and further away from them, what you make stops being an imitation and starts being its own thing. Current AI can't do that.
Then once you have an initial concept, you can start adding more similar things, and now you have built a whole new world or ecosystem. That is where all the wondrous things in our current images and stories come from. An AI that is to replace us must be able to achieve something similar.
The wealth of things you see around you doesn't exist in nature. Stick figures don't exist in nature; things in nature don't have black outlines, yet we draw them everywhere in cartoons and the like. Humans have demonstrably imagined many entirely novel things that don't exist in nature. And the creatures I posted have many aspects that are entirely unnatural; you clearly know there are no animals like that even without knowing every animal, so they are something novel and not just more of the same.
Anyway, whenever you put yourself in a position where you can answer "nuh uh, to me that isn't like that!" to everything, you are just tricking yourself.
Personally I think there is a bit of evidence in your comment that we don't really understand our minds or cognition very well.