I know someone who seems to be doing this successfully, whereas I did not succeed. I can't tell you definitively either way, but here was my experience.
My setup consisted of a Sony CRT fed from the DVI port of my X1900 graphics card. The resulting signal was tuned by an ISF engineer to compensate for any gamma/color shift in the signal path, and then re-profiled with an EyeOne Pro. In theory, everything was ideal: direct RGB with no conversions, and the LUT ensured perfect Rec. 709.
However, when I compared it to broadcast monitors, there was a clear difference. I believe there was a scaling issue between PC levels (0-255) and video levels (16-235), which caused colors to clip at 235 instead of being properly rescaled up to 255. (Additionally, there was a gamma shift being introduced by the graphics card itself.) The result was that I had crippled the available gamut of the display, but couldn't see it unless I went outside of "broadcast safe" space, hence all my bars/test patterns looked good. So I would grade things, and then they would come out over-saturated and I'd never know.
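To illustrate the kind of mismatch I suspect was happening, here's a rough sketch in Python. The function names and the exact behavior of the card are my assumptions, not a description of any specific driver; the point is the difference between rescaling a full-range signal into video range versus just clipping it at the video limits:

```python
# Hypothetical illustration of the suspected PC-levels vs video-levels
# mismatch. Function names and behavior are illustrative assumptions,
# not a description of any particular graphics card's pipeline.

def pc_to_video(level):
    """Correctly map a full-range (0-255) code value into
    limited video range (16-235)."""
    return round(16 + level * 219 / 255)

def clip_at_video_limits(level):
    """What a naive pipeline might do instead: pass full-range
    values through and hard-clip at the video limits, so all
    detail above 235 flattens to the same value."""
    return max(16, min(235, level))

# A proper rescale preserves the separation between bright values:
print(pc_to_video(255))          # -> 235
print(pc_to_video(240))          # -> 222

# A bare clip collapses them, which is invisible on "broadcast
# safe" test patterns but destroys highlight/saturation detail:
print(clip_at_video_limits(255))  # -> 235
print(clip_at_video_limits(240))  # -> 235
```

If something like the second path is happening anywhere between the card and the CRT, bars and legal test patterns will still look correct, because they never exercise the range where the clipping occurs.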
For me the issue is not the CRT, it's the data path to the CRT. And a graphics card pipeline, no matter how good the card, didn't work for me.