EDIT: To answer your question... I was not disputing your whole comment, only the one little part claiming that DVI couldn't handle it "without multiple DVI or HDMI in/out ports". It clearly can, via a dual-link connection, which still requires only one port and is part of the spec.
My other point is this... Assuming your graphics card can push it, and your monitor can receive and process it, you are likely to get very good results for computer use (including playing back footage at 24p, 25p, 30p, 60i, 60p, and everything else) by pushing the refresh rate as high as possible. True, you will get the best results by using an exact multiple of the source frame rate (which is why 100 or 120Hz would be best if possible), but I suspect your problem might be mitigated by forcing the refresh rate to 72Hz if you can (which is at least an exact multiple of 24 and is higher than everything else).
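Just to show the arithmetic I mean, here's a quick back-of-the-envelope sketch in Python (my own illustration, not anything your player actually runs) of which common refresh rates divide evenly by which source frame rates:

```python
# Quick check of which refresh rates divide evenly by which source frame rates.
# An even division means every source frame is held for the same number of
# refresh cycles, so there is no uneven cadence (no pulldown judder).
frame_rates = [24, 25, 30, 50, 60]      # common source rates (fps)
refresh_rates = [50, 60, 72, 100, 120]  # candidate display rates (Hz)

for hz in refresh_rates:
    even = [fps for fps in frame_rates if hz % fps == 0]
    print(f"{hz}Hz is an exact multiple of: {even or 'nothing here'}")

# 72Hz  -> only 24 (each film frame held for exactly 3 refreshes)
# 100Hz -> 25 and 50
# 120Hz -> 24, 30 and 60
```

That's really all 72Hz buys you: every 24p frame gets exactly three refreshes, so nothing is ever uneven.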
Now, your display, not strictly being a computer "monitor", might not support those rates (and it appears not to, though I thought I read that it could do 72Hz somehow), but many monitors can, and DVI certainly can.
Of course, this is not to say that your system isn't technically "better". However, I don't know that it is necessary for the vast majority of people (and therefore would probably not be worth the development effort to switch refresh rates on the fly like that).
The reason is, I think (and I am by no means sure), that you are confusing, as I was at first, telecine (which can cause judder) with refresh-rate mismatch (which can cause flicker). I am fairly certain that the computer (or the TV) does not ever insert extra fields (or frames) into content (telecine). Telecine is a method used during the authoring or recording of media (making the DVD, HD DVD, or Blu-ray encoding, or recording the footage with a camera onto an interlaced medium or format). They do detect and remove these extra fields (reverse telecine), but they don't ever add them "on the fly".
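For anyone curious what telecine actually does during authoring, here's a rough sketch of the classic 3:2 pulldown cadence (my own toy illustration of the idea; it ignores top/bottom field pairing and the other cadence variants that exist):

```python
# Rough illustration of 3:2 pulldown: 24 film frames/sec are spread across
# 60 interlaced fields/sec by holding each frame for 3 fields, then 2 fields,
# alternately. Over 4 film frames that's 3+2+3+2 = 10 fields, i.e. 24 frames
# become 60 fields. Reverse telecine just detects and removes the repeated
# fields to recover the original 24 frames.
def pulldown_32(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        hold = 3 if i % 2 == 0 else 2   # alternate 3-field / 2-field cadence
        fields.extend([frame] * hold)
    return fields

frames = ["A", "B", "C", "D"]           # four consecutive film frames
print(pulldown_32(frames))              # ['A','A','A','B','B','C','C','C','D','D']
print(len(pulldown_32(frames)))         # 10 fields per 4 frames = 60 per 24
```

The point being: that cadence gets baked in when the disc or broadcast is mastered, not invented by your HTPC at playback time.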
Flicker can occur when the refresh rate of a monitor doesn't match the frame rate of the video being displayed on it. It is caused by delayed and dropped frames. This happens when the transmitting device (the software application in this case, not the display drivers) sends a frame while the monitor/GPU/driver is halfway through the previous "scan". The new frame can't be drawn until the current one is finished, so it is queued and drawn next, unless a second frame has arrived in the interim (in which case the "middle" frame gets dropped, causing flicker). Flicker is by far most apparent when the frame rate is close to the refresh rate of the monitor (which is why most video games include an option to limit the frame rate to a maximum of the refresh rate).

However, as refresh rates increase, your ability as a human to detect the imperfections in the video signal diminishes dramatically. The reason is simple... At 100Hz, even when playing 60p content (which is exceedingly rare), relatively few frames actually get "dropped". Some might get delayed by less than 1/100th of a second, but that's all. Those that do get dropped are only "missing" for 1/100th of a second, and are exceptionally rare. 24p material should effectively NEVER drop a frame at 100Hz, only occasionally have one slightly delayed (and you won't ever be able to tell), which is why they call them "flicker free".
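To put some rough numbers on that, here's a toy simulation I knocked together (it assumes perfectly even frame delivery and an ideal display that simply shows the newest completed frame on every refresh, so real-world numbers will differ):

```python
from fractions import Fraction

# Toy model: the source delivers frames at a steady fps; the display refreshes
# at a steady hz and, on each refresh, shows the newest frame that has arrived.
# A frame is "dropped" if a newer frame arrives before any refresh samples it,
# and "delayed" by however long it sits between arriving and first being shown.
def simulate(fps, hz, seconds=10):
    frame_times = [Fraction(i, fps) for i in range(fps * seconds)]
    refresh_times = [Fraction(i, hz) for i in range(hz * seconds)]
    shown = set()
    max_delay = Fraction(0)
    for t in refresh_times:
        newest = int(t * fps)                 # newest frame arrived by time t
        if newest < len(frame_times) and newest not in shown:
            shown.add(newest)
            max_delay = max(max_delay, t - frame_times[newest])
    dropped = len(frame_times) - len(shown)
    return dropped, float(max_delay * 1000)   # worst wait in milliseconds

for fps, hz in [(24, 50), (24, 72), (24, 100), (60, 50), (60, 100)]:
    dropped, delay = simulate(fps, hz)
    print(f"{fps}p at {hz}Hz: {dropped} frames never shown, "
          f"worst wait {delay:.1f} ms")
```

If I've got the math right, in this idealized model 24p never loses a frame at 50, 72, or 100Hz (at 72Hz it never even waits), 60p at 50Hz loses frames constantly, and 60p at 100Hz loses none and only ever waits a few milliseconds, which is the gist of what I was saying above.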
That's why "response time" on an LCD monitor is very important, for the exact same reasons. Below roughly 5-8ms response time (real black-white response rates, not the cheesy gray-to-gray ones the cheap monitor manufacturers sometimes list), the human eye (and brain) loses all ability to distinguish flicker or lag. That's also why the "guides" recommend that you disable those framerate limiting options in video games if you run your monitor at higher refresh rates than 60hz (or have a fast LCD monitor). Because there might be a few artifacts there, but you'll never be able to see them and the higher framerate is better on your eyes!
What I wonder is whether the distortion you are seeing would lessen, or go away, if you could force it to receive 72Hz (again, I don't know exactly what model you have, but I've read those Pannys can do it). It may be that the problem isn't really the mismatch between 24fps and 50Hz, but rather that 50Hz isn't high enough to hide the distortion from your eyes (which are probably pretty good)!