To clarify first, there is no "doubling" or anything like that. A video indicates that a frame should be shown for 40ms (at 25 FPS); whether the screen refreshes once or twice in that timespan does not change how long the image is shown - and that's all a video renderer does, it shows the image for the time specified in the video file.
Based on this alone, there is no difference to the resulting image, as long as every frame is shown for exactly as long as it should be. The video renderer only has one job: show the frame for as long as it should be shown. And as pointed out above - this costs literally zero performance, as it's just not rendering a new image; it's not rendering the old one twice (hence no "doubling").
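To make that concrete, here is a tiny made-up simulation (not actual MC code, all names are mine) of presenting 25 fps content on a 50 Hz display. The only work happens when a new frame is due; the extra refresh in between just keeps showing the old image:

```cpp
#include <chrono>
#include <cstdio>

int main() {
    using namespace std::chrono;
    const auto frame_duration = milliseconds(40); // 25 fps -> each frame is due for 40 ms
    const auto refresh_period = milliseconds(20); // 50 Hz display -> a refresh every 20 ms

    auto next_frame_at = milliseconds(0);
    int frame = 0;

    // Simulate ten refresh cycles of a 50 Hz display.
    for (auto now = milliseconds(0); now < milliseconds(200); now += refresh_period) {
        if (now >= next_frame_at) {
            // Time for a new frame: this is the only point where any work happens.
            std::printf("%3lld ms: present frame %d\n", (long long)now.count(), frame++);
            next_frame_at += frame_duration;
        } else {
            // Extra refresh inside the same 40 ms window: the previous image simply
            // stays on screen, nothing is rendered again.
            std::printf("%3lld ms: keep showing frame %d\n", (long long)now.count(), frame - 1);
        }
    }
    return 0;
}
```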
Now, it's certainly possible that some display devices can't really cope with that, while many others do. It's actually a big testing category for TVs, whether they can properly handle integer-multiple refresh rates, or even mismatched ones. Many modern TVs can extract 25 from 50/100, or even 24 from 30/60, without losing any features, which is important when playing using their built-in apps or cheap playback devices like FireTVs, as things like Netflix/Disney+ etc. will generally not do refresh rate matching. Instead, the TV can detect the pattern and just handle it.
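As a rough illustration of the "detect the pattern" part (my own simplification, certainly not how any particular TV implements it): if a new image only arrives every N-th refresh with a constant N, the 50/60 Hz signal effectively carries 25/30 fps content, and the TV can treat it as such.

```cpp
#include <vector>
#include <cstdio>

// Returns the repeat count N if every frame occupies exactly N refresh cycles, else 0.
int detect_cadence(const std::vector<bool>& frame_changed_per_refresh) {
    int n = 0;    // candidate cadence
    int run = -1; // -1 until the first new frame has been seen
    for (bool changed : frame_changed_per_refresh) {
        if (run >= 0) ++run;
        if (changed) {
            if (run > 0) {
                if (n == 0) n = run;          // first complete gap sets the candidate
                else if (run != n) return 0;  // irregular pattern -> no clean cadence
            }
            run = 0;                          // start measuring the next gap
        }
    }
    return n;
}

int main() {
    // 25 fps carried in a 50 Hz signal: a new frame arrives on every 2nd refresh.
    std::vector<bool> signal = {true, false, true, false, true, false, true, false};
    std::printf("cadence: %d (0 = none)\n", detect_cadence(signal)); // prints 2
    return 0;
}
```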
As for why it is how it is? Let's start with 30/60 first, because it's more obvious there. 30 fps content is practically non-existent. You have 24 fps movies, and 60i TV transmissions. 60i may present as 30 fps, but during playback you want it to come out as 60 fps. There might be the odd home video at 30 fps, but commercially produced content at 30 fps is exceedingly rare, and any content flagged as 30 has a high chance of actually wanting to play at 60 after deinterlacing.
25/50 is more complicated. The same interlacing problem applies to (older) TV transmissions, but at the same time, movies are pure 25 fps (and sometimes TV too). 50 is the safe choice that has both play reasonably well (perfectly even, on the many TVs that handle it), and might historically just have been copied from the NTSC 30/60 option as-is anyway.
These settings have actually existed for longer than I have worked on MC, and I've never really given them much thought, because I've always had a TV that does the thing talked about above anyway. And even living in a PAL country myself, 25 fps content is still rare, as I don't watch our local TV.
For 25 in particular, there is also the aforementioned problem of some displays not coping well with such a mode, with 50 working better.
So what do we do? I can certainly add separate 25/30 options to manual mode, and when unset, use the 50/60 options for compatibility. What I'm wary of is changing the default automatic behavior to favor 25/30. Even if we have some information in the library these days that a video is interlaced (which we never had until recently, so a proper automatic choice would've been impossible before), I feel like accidentally playing an interlaced 50i file at 25 is worse than playing a 25p file at 50, which in over a decade of MC I don't remember anyone complaining about before (although I'm not ruling out having forgotten over the years).
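Something along these lines is what I have in mind for the manual options, sketched with made-up names (the actual settings and interlacing detection in MC may end up looking different):

```cpp
#include <optional>
#include <cstdio>

struct RefreshOptions {
    std::optional<int> hz_for_25;  // new, optional manual choice for 25 fps content
    std::optional<int> hz_for_30;  // new, optional manual choice for 30 fps content
    int hz_for_50 = 50;            // existing options, always set
    int hz_for_60 = 60;
};

int choose_refresh(double fps, bool interlaced, const RefreshOptions& opt) {
    // Only the 25/30 fps cases matter for this sketch; 24 fps etc. is handled elsewhere.
    // Interlaced "25/30 fps" streams really want to come out at 50/60 after deinterlacing,
    // so they ignore the new options and stay on the safe 50/60 choice.
    if (fps > 24.5 && fps <= 26.0)
        return (!interlaced && opt.hz_for_25) ? *opt.hz_for_25 : opt.hz_for_50;
    if (fps > 29.0 && fps <= 31.0)
        return (!interlaced && opt.hz_for_30) ? *opt.hz_for_30 : opt.hz_for_60;
    return fps <= 51.0 ? opt.hz_for_50 : opt.hz_for_60;
}

int main() {
    RefreshOptions opt;             // 25/30 options left unset -> behaves like today
    std::printf("%d\n", choose_refresh(25.0, false, opt)); // 50, as today
    opt.hz_for_25 = 25;             // user explicitly opts into 25 Hz output
    std::printf("%d\n", choose_refresh(25.0, false, opt)); // 25
    std::printf("%d\n", choose_refresh(25.0, true,  opt)); // 50, interlaced stays safe
    return 0;
}
```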
Of course, manual mode should be just fine for you guys; Automatic is essentially the same, just selecting the refresh rates to use on its own.
Needless to say, doubling frame rates without frame-interpolation is pointless and simply taxes the GPU needlessly. Thankfully my TV ignores the duplicate frame for its own native interpolation which I have constantly enabled.
As was pointed out before, and confirmed by me, it does not in fact tax the GPU at all. And if your TV handles it perfectly otherwise, 50Hz is actually favorable, as it gives you smoother menus, and an extra refresh cycle to deal with frame drops/repeats, if they happen to be needed.
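The "extra refresh cycle" part is just this bit of arithmetic, as I understand it: when a repeat or drop does become necessary, the visible hiccup is one refresh period, so it's half as long at 50 Hz as it would be at 25 Hz.

```cpp
#include <cstdio>

int main() {
    for (int hz : {25, 50}) {
        double refresh_ms = 1000.0 / hz;
        // A single repeated or dropped refresh shifts playback by one refresh period.
        std::printf("at %d Hz a single repeat/drop shifts playback by %.0f ms\n",
                    hz, refresh_ms);
    }
    return 0;
}
```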
In general, this behavior does not affect people with display devices without any processing (PC monitors, many projectors, TVs without interpolation, or with interpolation disabled), or those with processing that handles this just fine (many TVs).