I thought most font rendering still happened on the CPU. Characters are usually small, so the overhead of shipping glyph data to the GPU doesn't seem worth it.
Even with CPU rendering, there will be differences depending on the specific software libraries and system configuration, e.g. anti-aliasing or hinting settings. A patch to any one piece of the rendering stack can create pixel-level differences in rendered text that could be used to fingerprint a machine.
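As a rough sketch of the idea (using Pillow purely for illustration; in-browser fingerprinting would typically go through the canvas API instead): render some text to a bitmap and hash the raw pixels. Any change in the rendering stack, config, or installed fonts changes the hash.

```python
import hashlib
from PIL import Image, ImageDraw, ImageFont

def render_fingerprint(text: str, font=None) -> str:
    """Render text to a small grayscale canvas and hash the pixels.

    A different library version, anti-aliasing/hinting setting, or
    font file would produce different pixel bytes, hence a different
    hash -- that delta is the fingerprinting signal.
    """
    img = Image.new("L", (200, 32), color=255)
    draw = ImageDraw.Draw(img)
    draw.text((2, 2), text, fill=0, font=font or ImageFont.load_default())
    return hashlib.sha256(img.tobytes()).hexdigest()

print(render_fingerprint("fingerprint me"))
```

Two machines with "the same" setup can still disagree on the hash because of exactly the kind of config drift described above.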
It's actually an interesting case study in how "identical installs" often carry minor config variations that produce a sort of chaos in the end result. Staggered downstream distribution of software updates doesn't help either.
I haven't tested it much myself, but I suspect there's a lot to unpack here.