In what universe was 512x342 better than 640x400?
There was an add-on called a "Flicker Fixer" that buffered the video signal and emitted VGA signals at twice the pixel clock and horizontal scan rate. The Amiga 3000 had one built in.
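For the curious, what it did was simple scan doubling: buffer each line and replay it twice as fast, twice per frame. A rough sketch of the arithmetic in Python (the figures are approximate NTSC/VGA numbers, not exact Amiga chipset timings):

    # Approximate input timings of an NTSC Amiga display signal.
    AMIGA_HSYNC_KHZ = 15.734   # NTSC horizontal scan rate
    AMIGA_PIXEL_MHZ = 14.19    # approx. hires (640-wide) pixel clock

    def scan_double(hsync_khz, pixel_mhz):
        # Each buffered line is replayed twice as fast, twice per frame,
        # turning 15 kHz interlaced into 31 kHz progressive.
        return hsync_khz * 2, pixel_mhz * 2

    hsync, pixel = scan_double(AMIGA_HSYNC_KHZ, AMIGA_PIXEL_MHZ)
    print(f"{hsync:.2f} kHz horizontal, {pixel:.2f} MHz pixel clock")
    # -> 31.47 kHz horizontal, 28.38 MHz pixel clock: right in VGA
    #    territory (standard VGA hsync is 31.469 kHz)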
The ECS and AGA chipsets supported a "Productivity" mode that emitted VGA-rate signals, but ECS allowed only four colours in that mode. All games used the TV modes. "Multisync" monitors that could switch between VGA and television refresh rates were expensive, so few people had them.
Also remember the Amiga was competing with the Mac II line for most of its life. Yes, the Mac was much more expensive... but we are comparing specs, and you could get Mac II displays that supported 256 colors out of 16 million (24-bit). The Amiga didn't have 24-bit color until 1992.
The Amiga was ahead of its time in many ways, and the pre-emptive multitasking was fantastic, but claiming it was some paragon doesn't help anyone. If you wanted a fun home machine attached to a TV, it was great. Even a fun home machine attached to a monitor. If you wanted a business machine with a monitor, it wasn't the safest or best choice, if only due to a lack of software.
That being said, I preferred the Amiga.
The Amiga was a bargain in comparison, but it was not without its flaws, like all early machines. I had an A500 with a 1084 monitor, and the flicker at high res was bothersome to me. I later upgraded to an A3000 w/VGA monitor, and it was a vast improvement. I ran at 640x400 for everything at that point.
I think you are underestimating the price of "flicker fixers" at the time. I looked up the price of a Microway flicker fixer in an old Amiga World from 1988: over $500. You also had to add a VGA monitor: another $400.