The point is to keep the original video intact, since the original video, when properly de-interlaced during playback, is better quality than re-encoded material.
Pet Peeve Alert:
This kind of argument always makes me laugh.
You don't have the "original video". BluRay has the holy living CRAP compressed out of it, compared to the way it was shot (or scanned if it was shot on film).
If you recompress using proper settings, you will not be able to tell the difference. Now, if you're trying to take the 50GB source and squeeze it into 4GB (or worse)? Well, of course that's going to have a noticeable quality drop. But the original source of that two-hour movie was probably terabytes in size, and it was shot progressively.
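For the curious, "proper settings" here just means a modern encoder at a sane quality target rather than a crushed fixed file size. A minimal sketch of what I mean, assuming ffmpeg with libx264 is on your PATH (the file names and CRF value are placeholders you'd tune to taste):

```python
# Minimal sketch: re-encode a Blu-ray remux with "proper settings" via ffmpeg.
# Assumes ffmpeg (built with libx264) is on PATH; file names are placeholders.
import subprocess

def reencode(src: str, dst: str, crf: int = 18) -> None:
    """Re-encode video at a quality level that is visually transparent for most content."""
    subprocess.run(
        [
            "ffmpeg",
            "-i", src,
            "-c:v", "libx264",   # H.264 software encoder
            "-preset", "slow",   # slower preset = better compression at the same quality
            "-crf", str(crf),    # constant-quality mode; ~18 is usually transparent
            "-c:a", "copy",      # leave the audio untouched
            dst,
        ],
        check=True,
    )

reencode("movie_remux.mkv", "movie_reencoded.mkv")
```

CRF mode targets constant quality instead of a fixed size, which is exactly the "don't squeeze 50GB into 4GB" point: the output ends up as big as it needs to be to look indistinguishable, and no bigger.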
Almost all commercially available interlaced content can be deinterlaced essentially "perfectly", because the footage was shot with a progressive camera (or scanned from film, which is progressive by nature). The interlacing was added later for broadcast, so a good deinterlacer can simply re-pair the fields back into the original progressive frames (inverse telecine) with no real loss. The only real exceptions are sources that were originally shot interlaced, which is almost always news or sports footage, and if it was shot interlaced the quality is junk anyway (because it was almost certainly shot with a cheap news camera). If you get an interlaced BluRay, it is because the producers were too lazy/cheap to go back and re-encode the source to get rid of the interlacing. It is the mark of a terrible quality disc.
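To make the distinction concrete, here is a rough sketch of both cases, again via ffmpeg (the fieldmatch, decimate, and yadif filters are real; the file names and encode settings are placeholders mirroring the example above):

```python
# Sketch of the two cases: telecined film can be restored by re-pairing fields,
# while genuinely interlaced footage has to be interpolated instead.
import subprocess

def inverse_telecine(src: str, dst: str) -> None:
    """Recover the original progressive frames from telecined (film-sourced) video."""
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-vf", "fieldmatch,decimate",  # re-pair fields, drop the duplicate frames
         "-c:v", "libx264", "-crf", "18", "-c:a", "copy", dst],
        check=True,
    )

def deinterlace(src: str, dst: str) -> None:
    """Interpolate missing lines for footage that was genuinely shot interlaced."""
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-vf", "yadif",                # motion-adaptive deinterlacer
         "-c:v", "libx264", "-crf", "18", "-c:a", "copy", dst],
        check=True,
    )
```

The first path can recover the film frames more or less exactly, which is why "essentially perfectly" is not an exaggeration; the second has to guess at the missing lines, which is where the real quality loss lives.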
To sum up my pet peeve: The version you can obtain commercially != "the original".