How weak are they? Detecting very weak signals (-110 dBm) at hundreds of MHz and even GHz is routinely done by Wi-Fi cards and cellphone radios, and GPS receivers detect signals that are orders of magnitude weaker than that (routinely -150 dBm), though only in signal bandwidths of tens of MHz or less. My eyes routinely detect submillilux signals when I look at the stars at night, with an integration time of well under a second; if I'm doing the calculations correctly, that's about -70 or -80 dBm in the 100–1000 THz band. PMTs (including microchannel plates) and SPADs routinely detect optical signals much weaker than that.
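A rough sanity check of that starlight figure, under two assumptions not in the text above: the photopic peak luminous efficacy of 683 lm/W (an upper bound, since broadband starlight converts less efficiently), and a 7 mm dark-adapted pupil:

```python
import math

# Hedged back-of-the-envelope sketch: convert illuminance at the eye to
# received optical power in dBm.  Assumptions (mine, not from the text):
# 683 lm/W photopic peak efficacy, 7 mm dark-adapted pupil.

LUMENS_PER_WATT = 683.0              # photopic peak, at 555 nm
PUPIL_DIAMETER_M = 7e-3              # dark-adapted pupil
pupil_area = math.pi * (PUPIL_DIAMETER_M / 2) ** 2   # ~3.8e-5 m²

def starlight_dbm(illuminance_lux):
    irradiance = illuminance_lux / LUMENS_PER_WATT   # W/m²
    power_w = irradiance * pupil_area                # W collected by the pupil
    return 10 * math.log10(power_w / 1e-3)           # dBm

print(starlight_dbm(1e-3))   # 1 millilux: about -73 dBm
print(starlight_dbm(1e-4))   # 0.1 millilux: about -83 dBm
```

So submillilux illuminance does land in the -70 to -80 dBm ballpark, give or take the efficacy assumption.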
To a significant extent you can detect arbitrarily weak signals with coding gain and longer averaging times, although if your benchtop machine already takes ten minutes to give you a result, you probably can't afford to wait more than about 36 dB longer (roughly a month, which would only give you another 18 dB of SNR).
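The arithmetic behind those two numbers, assuming incoherent averaging where power SNR grows as the square root of integration time (so each 2 dB of extra time buys 1 dB of SNR):

```python
# Incoherent averaging: SNR gain in dB is half the time increase in dB.
# (Coherent integration would give the full amount, but requires phase
# stability over the entire run.)

BASELINE_MIN = 10.0        # the ten-minute benchtop run
extra_time_db = 36.0       # how much longer we're willing to wait

time_factor = 10 ** (extra_time_db / 10)   # ~3981x longer
snr_gain_db = extra_time_db / 2            # 18 dB

total_days = BASELINE_MIN * time_factor / (60 * 24)
print(f"{time_factor:.0f}x longer = {total_days:.0f} days")   # ~28 days
print(f"SNR gain: {snr_gain_db:.0f} dB")                      # 18 dB
```

Ten minutes times ~4000 is about a month, which is where the "can't afford to wait" line gets drawn.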
So, I'm not worried about the electronics or the signal processing; there's no such thing as an "amount of electronics". Precision analog equipment is not easy to design, calibrate, and build, but you only need a very small "amount" of it, and it can be mass-produced.
Take resistors. When I was a kid back in the 01980s normal resistors were ±20% carbon composition, which would drift by more than 20% over time or if overvolted, with fiendish temperature coefficients. Now you can't buy a ±20% resistor; most resistors are ±1%, ±0.1% resistors are commonplace, and ±0.01% resistors are easily available for a dollar or two. Precision resistors are now made with an extremum of resistance around room temperature, so the temperature coefficient there is literally zero.
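The extremum trick is just calculus: if the R(T) curve is tuned so its minimum (or maximum) sits near room temperature, the slope, and hence the temperature coefficient, vanishes exactly there. A sketch with made-up numbers (not from any datasheet):

```python
# Hypothetical precision resistor with a parabolic R(T) curve centered at
# room temperature: R(T) = R0 * (1 + a*(T - T0)^2).  The temperature
# coefficient dR/dT / R0 = 2*a*(T - T0) is exactly zero at T = T0.

R0 = 10_000.0    # nominal resistance, ohms (hypothetical)
T0 = 25.0        # extremum temperature, deg C (hypothetical)
A = 1e-8         # curvature, 1/degC^2 (hypothetical)

def resistance(t_c):
    return R0 * (1 + A * (t_c - T0) ** 2)

def tempco_ppm_per_c(t_c):
    return 2 * A * (t_c - T0) * 1e6   # ppm/degC

print(tempco_ppm_per_c(25.0))   # 0.0 at the extremum
print(tempco_ppm_per_c(50.0))   # still tiny 25 degrees away
```

The coefficient grows linearly as you move away from T0, but with small enough curvature it stays negligible over any sane operating range.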
No, what I'm worried about is the physics. I'm not surprised YBCO spectrometers turned out to be a pain in the ass; YBCO is a huge pain in the ass in every possible way. What do you think the physical obstacles are?