Most of what a terminal has to do is blit prerendered text bitmaps, which is relatively slow on the CPU and benefits from the much higher memory bandwidth of the GPU. Core Graphics generally does not use the GPU for text blitting in most Mac apps, so a custom renderer can help here. Font rasterization on the Mac does not use the GPU either, but the glyph cache hit rate in a terminal emulator is so high that rasterization cost ends up pretty much irrelevant.
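To see why the glyph cache makes rasterization cost a non-issue, here is a minimal sketch (all names are hypothetical, not from any real terminal): a screen is thousands of cells, but the distinct (codepoint, style) pairs on it number in the dozens, so almost every lookup is a hit and the expensive rasterizer is almost never called.

```python
class GlyphCache:
    """Maps (codepoint, bold, italic) -> a rasterized-bitmap placeholder."""

    def __init__(self):
        self._cache = {}
        self.hits = 0
        self.misses = 0

    def _rasterize(self, key):
        # Stand-in for the expensive CPU rasterization call
        # (Core Text on macOS, DirectWrite/GDI on Windows).
        return f"bitmap:{key}"

    def get(self, codepoint, bold=False, italic=False):
        key = (codepoint, bold, italic)
        if key in self._cache:
            self.hits += 1
        else:
            self.misses += 1
            self._cache[key] = self._rasterize(key)
        return self._cache[key]


# Simulate redrawing an 80x24 screen of mostly-repeated ASCII text.
cache = GlyphCache()
screen = ("hello world " * 200)[:80 * 24]
for ch in screen:
    cache.get(ord(ch))
hit_rate = cache.hits / (cache.hits + cache.misses)
print(f"hit rate: {hit_rate:.3f}")
```

With only 8 distinct characters on a 1,920-cell screen, the rasterizer runs 8 times and the cache serves the other 1,912 lookups, a hit rate above 99%; real terminal sessions are not far off this.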
On Windows, Microsoft ships multiple rendering stacks for legacy compatibility. Most terminals are old Win32 programs that use classic GDI, which as far as I know is partially accelerated but mostly runs on the CPU (and is implemented in the kernel!). Direct2D, the newer API, does use the GPU for blitting text. Like macOS, Windows still does all font rasterization on the CPU. In GDI, font rasterization happens in the kernel (!) (except on Windows 10, where the kernel calls out to a userspace fontdrvhost.exe). In Direct2D, font rasterization happens in userspace.