INTERACT FORUM


Author Topic: Is there a way to avoid using the GPU for HDMI output?  (Read 473 times)

haggis999

  • Galactic Citizen
  • ****
  • Posts: 461
Is there a way to avoid using the GPU for HDMI output?
« on: May 14, 2024, 02:56:48 am »

The Intel NUC I have been using for several years as an HTPC has just died. Rather than replace it, I decided to use my primary Windows 11 workstation as the HTPC. The HDMI output of the GPU card on this workstation is now connected to my AV amp.

When using this to play multi-channel DSF files (ripped from my SACDs) I hit an unexpected problem. The music would stop as soon as the workstation's monitors went to sleep. However, it didn't take me long to find that JRiver already had a solution to this problem, using the audio setting of "Disable display from turning off (useful for HDMI audio)".

The only issue with that solution is that it leaves the two monitors on my workstation running when they are not being used. I could just switch them off, but is there a way to add an alternative HDMI output to a PC that bypasses the GPU card? I'm guessing that a discrete sound card might be the only way to do this.
Logged

JimH

  • Administrator
  • Citizen of the Universe
  • *****
  • Posts: 71462
  • Where did I put my teeth?
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #1 on: May 14, 2024, 05:22:26 am »


haggis999

  • Galactic Citizen
  • ****
  • Posts: 461
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #2 on: May 14, 2024, 06:24:54 am »

Quote
Like this?
https://www.amazon.com/Adapter-Multi-Display-Converter-Projector-Chromebook/dp/B087PD9KSZ/


I had already tried a similar solution last night, before raising my post this morning, using an adapter I bought to provide an HDMI output for my Dell XPS 15 (which only has USB-C outputs). However, my normal MC32 audio device of DENON-AVRHD (NVIDIA High Definition Audio) [WASAPI] was no longer available, and I had to change my Output Format setting from 5.1 channels to Source number of channels before MC would start to play some music. Even then the result was not positive, as my AV amp did not receive any audio signal.

The key problem is that Windows Playback settings show that my DENON-AVRHD device is not plugged in when using this USB to HDMI adapter.

haggis999

  • Galactic Citizen
  • ****
  • Posts: 461
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #3 on: May 14, 2024, 06:48:38 am »


Quote
Or maybe you have an old video card around?

I suspect that the AV capability of my workstation is provided by my RTX 3070 and there is no alternative AV source from my combination of Gigabyte B550 Vision D-P motherboard and AMD Ryzen 7 5800X processor. If true, that would explain why my USB to HDMI adapter is not working.

Your other suggestion of using a spare video card might do the job, though won't older cards fail to handle newer audio codecs?

tzr916

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 1322
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #4 on: May 14, 2024, 08:30:32 am »

Quote
music would stop as soon as the workstation's monitors went to sleep

Windows EDID strikes again.

Basically, when Windows detects that one of the connected "monitors" has shut off, or that the input on the AVR/TV has changed, playback stops, or Windows "moves/resizes" apps, video and even desktop icons.

The fix is to trick Windows into thinking that the monitor stays connected. I've heard that Windows 11 supposedly has this option, but I run Win 10. Some GPUs have an EDID override to get around this, like the NVIDIA Quadro or some newer Intel NUCs. But if your GPU/driver does not, the BEST way to trick Windows is an external HDMI hardware EDID device.

I tried many - several active HDMI switches, a "monitor detect killer" (pin 19), a 4K HD Fury - but the only one that caused ZERO problems was this: https://atlona.com/product/at-etu-sync/ (it can be found on eBay for less than $50)
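As an aside for anyone curious what these boxes actually do: an EDID emulator just keeps presenting a valid EDID data block to the GPU. Per the VESA standard, a base EDID block is 128 bytes beginning with a fixed 8-byte header, and its last byte is a checksum chosen so the whole block sums to zero mod 256. A minimal Python sketch (the helper names are mine, not from any device's tooling):

```python
# Fixed 8-byte header that every base EDID block starts with (VESA E-EDID).
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_block_valid(block: bytes) -> bool:
    """A base EDID block is 128 bytes whose byte values sum to 0 mod 256."""
    return len(block) == 128 and sum(block) % 256 == 0

def make_minimal_block() -> bytes:
    """Build a synthetic 128-byte block with a correct checksum.
    Illustration only - real blocks also carry timing and vendor data."""
    body = bytearray(EDID_HEADER + bytes(119))   # 127 bytes so far
    body.append((-sum(body)) % 256)              # checksum byte makes sum % 256 == 0
    return bytes(body)
```

This is why a sink that merely caches and replays an EDID can convince the GPU a display is present, even though (as discussed later in the thread) it cannot stop Windows from powering the output down for inactivity.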
JRiverMC v32 •Windows 10 Pro 64bit •Defender Exclusions •ṈØ 3rd party AV
•ASUS TUF gaming WiFi z590 •Thermaltake Toughpower GX2 600W
•i7-11700k @ 3.6GHz~5GHz •32GB PC4-25600 DDR4
•OS on Crucial P5 Plus M.2 PCIe Gen4 •Tv Recordings on SATA 6TB WD Red Pro
•4 OTA & 6 CableCard SiliconDust Tuners
•nVidia RTX2060 •XBR65Z9D •AVRX3700H •Fluance 7.2.2 [FH]
•SMP1000DSPѫeD A3-300[RSS315HE-22] •SPA300DѫYSTSW215[15-PRX8S4]

haggis999

  • Galactic Citizen
  • ****
  • Posts: 461
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #5 on: May 14, 2024, 11:37:43 am »

Quote
Windows EDID strikes again.

Basically, when Windows detects that one of the connected "monitors" has shut off, or that the input on the AVR/TV has changed, playback stops, or Windows "moves/resizes" apps, video and even desktop icons.

The fix is to trick Windows into thinking that the monitor stays connected. I've heard that Windows 11 supposedly has this option, but I run Win 10. Some GPUs have an EDID override to get around this, like the NVIDIA Quadro or some newer Intel NUCs. But if your GPU/driver does not, the BEST way to trick Windows is an external HDMI hardware EDID device.

I tried many - several active HDMI switches, a "monitor detect killer" (pin 19), a 4K HD Fury - but the only one that caused ZERO problems was this: https://atlona.com/product/at-etu-sync/ (it can be found on eBay for less than $50)

That Atlona Etude Sync adapter is currently only available on eBay UK as an import from the USA, which more than doubles the price once shipping is taken into account. It would cost me £88, without any guarantee or technical support.

However, your very useful advice persuaded me to order a Lindy HDMI 2.0 EDID Emulator (https://www.lindy.co.uk/audio-video-c2/hdmi-2-0-edid-emulator-p11244), which comes with a 2-year guarantee and lifetime support. Lindy's tech support in the UK have assured me it will do the job; if it doesn't, I will return it. I was told that this adapter doesn't support Dolby Atmos, but I don't think that is relevant to my needs, as I only have a 5.0 speaker system.

haggis999

  • Galactic Citizen
  • ****
  • Posts: 461
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #6 on: May 16, 2024, 09:48:07 am »

Quote
I tried many - several active HDMI switches, a "monitor detect killer" (pin 19), a 4K HD Fury - but the only one that caused ZERO problems was this: https://atlona.com/product/at-etu-sync/ (it can be found on eBay for less than $50)

Sadly, my results with the Lindy HDMI EDID adapter appear to echo your own bad experiences in this area. This adapter simply failed to keep the music playing when the PC's monitors went to sleep.

After I contacted Lindy UK tech support, they ran a test of their own this afternoon. That test worked fine, but they couldn't explain why it failed with my equipment. They could only speculate that a difference in drivers might be to blame. The adapter will be returned for a refund. 

It looks like I will just have to use JRiver's audio setting of "Disable display from turning off (useful for HDMI audio)" and then manually switch off my PC's monitors when I am listening to music via HDMI (though that has the irritating effect of briefly stopping playback).

tzr916

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 1322
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #7 on: May 16, 2024, 12:57:48 pm »

What's your exact chain of hardware?

E.g. mine is: primary display playing MC on NVIDIA GPU > HDMI > EDID box > AVR > TV... and secondary display via on-board graphics VGA to monitor.

haggis999

  • Galactic Citizen
  • ****
  • Posts: 461
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #8 on: May 16, 2024, 03:50:51 pm »

The chain I was using when testing the Lindy adapter was:

Music playing via MC on Windows 11 workstation > RTX 3070 > Lindy EDID > Denon X8500H AV amp > TV (not switched on)

The workstation has two monitors that I had hoped could turn off after a period of inactivity without stopping the music, if the Lindy had done its job.

tzr916

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 1322
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #9 on: May 16, 2024, 05:52:24 pm »

How are the other two monitors connected - HDMI, DisplayPort or VGA? Are they on the same RTX GPU, on the motherboard, or on another GPU card?

Sadly, Windows likes to change, or reorder, the default display (audio device) whenever it detects one of them going inactive. So unfortunately you might need an EDID emulator on every monitor, or at least on the ones that you want to let sleep/power off.

haggis999

  • Galactic Citizen
  • ****
  • Posts: 461
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #10 on: May 17, 2024, 02:17:30 am »

Quote
How are the other two monitors connected - HDMI, DisplayPort or VGA? Are they on the same RTX GPU, on the motherboard, or on another GPU card?

Sadly, Windows likes to change, or reorder, the default display (audio device) whenever it detects one of them going inactive. So unfortunately you might need an EDID emulator on every monitor, or at least on the ones that you want to let sleep/power off.

The workstation's two monitors are both connected via DisplayPort outputs on the same RTX 3070 GPU as I'm using for the HDMI connection to my AV amp. That's the only GPU on this machine.

zybex

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 2397
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #11 on: May 17, 2024, 01:25:19 pm »

Try setting the screensaver to a black screen and disabling screen sleep in Windows. It's not ideal, as the screen stays powered on, but it works. On my LG TV I can turn the image off while keeping the sound, so I sometimes do that.
Adding a PCIe/USB sound card should also work, as you mention.

An HDMI dongle doesn't help because the audio is still going via the other HDMI output, which still gets put to sleep.

haggis999

  • Galactic Citizen
  • ****
  • Posts: 461
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #12 on: May 17, 2024, 05:27:06 pm »

Quote
Try setting the screensaver to a black screen and disabling screen sleep in Windows. It's not ideal, as the screen stays powered on, but it works. On my LG TV I can turn the image off while keeping the sound, so I sometimes do that.
Adding a PCIe/USB sound card should also work, as you mention.

That's an excellent example of lateral thinking. A blank (= black) screensaver does the job very well :):)


Quote
An HDMI dongle doesn't help because the audio is still going via the other HDMI output, which still gets put to sleep.

What other HDMI output do you mean? I only have the one on my GPU card.

zybex

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 2397
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #13 on: May 18, 2024, 03:37:10 am »

Never mind, I misread it while skimming. I read "EDID Adapter" and thought of the fake-monitor dongles that are used to keep a server/graphics card from sleeping. I thought you had one of those on one of the DP/HDMI outputs, with the AVR/TV connected to another GPU output.

haggis999

  • Galactic Citizen
  • ****
  • Posts: 461
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #14 on: May 18, 2024, 04:16:43 am »

Quote
Never mind, I misread it while skimming. I read "EDID Adapter" and thought of the fake-monitor dongles that are used to keep a server/graphics card from sleeping. I thought you had one of those on one of the DP/HDMI outputs, with the AVR/TV connected to another GPU output.

I'm still using the hardware chain of RTX 3070 HDMI output > Lindy EDID Emulator > HDMI cable > Denon X8500H AV amp, while the GPU's DisplayPort outputs are directly connected to a pair of monitors.

The Lindy EDID Emulator is still failing to prevent the music stopping when the monitors go to sleep, so I'm now using your blank screen saver solution. I've delayed returning this dongle, as Lindy UK tech support are making another attempt to find out why it doesn't work for me.

zybex

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 2397
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #15 on: May 18, 2024, 04:30:05 am »

I don't think it can ever work, even with the adapter. When Windows puts the display to sleep due to inactivity, the GPU will power down the HDMI link completely which also kills the audio transport. The only way is to disable sleep to keep the link alive, either by using the blank screensaver or with MC's "Disable display from turning off" (which sounds like it isn't working for your hardware).

I believe this was an oversight in the HDMI spec: they didn't anticipate the need for this half-sleep scenario.

The main purpose of the EDID emulator is to tell the GPU that a monitor is indeed connected (even if it's not). This doesn't prevent Windows from shutting down the GPU output for inactivity.

haggis999

  • Galactic Citizen
  • ****
  • Posts: 461
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #16 on: May 18, 2024, 06:21:01 am »

As I mentioned in my OP, MC's "Disable display from turning off" setting works fine, but I didn't like my displays being lit up the whole time I was listening to music without actively using the computer.

I share your doubts about the EDID emulator, but the first Lindy tech support person I contacted claimed to have run a test later that same day proving it worked in the desired fashion. Another Lindy techie is now taking a closer look at the matter, and I hope to get his feedback next week.

tzr916

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 1322
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #17 on: May 18, 2024, 08:42:40 am »

Sorry, if you are using the Windows display sleep function, then ALL monitors will sleep, including the HDMI output going to the Denon. An EDID emulator will never be able to help in that case. EDID emulators are designed to help when displays are powered OFF using the physical power button, or when the input changes on an AVR/TV.

An EDID emulator should work IF a Windows setting/tool/app existed to sleep individual monitors in a multi-monitor environment, or if the monitor itself had a sleep function built into its settings menu.

EDIT: Another option (if your AVR supports it) would be to send music using DLNA. No physical connection is required, and I don't think sleeping monitors would interfere.
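For anyone wanting to explore the DLNA route: renderers such as an AVR announce themselves over SSDP, and a controller finds them by multicasting an M-SEARCH request. A rough Python sketch of just the discovery step, using only the standard library (function names are illustrative; this is not how MC itself does it):

```python
import socket

SSDP_ADDR = ("239.255.255.250", 1900)  # standard SSDP multicast group and port
MEDIA_RENDERER = "urn:schemas-upnp-org:device:MediaRenderer:1"

def build_msearch(search_target: str = MEDIA_RENDERER, mx: int = 2) -> bytes:
    """Build an SSDP M-SEARCH request asking DLNA renderers to respond."""
    lines = [
        "M-SEARCH * HTTP/1.1",
        f"HOST: {SSDP_ADDR[0]}:{SSDP_ADDR[1]}",
        'MAN: "ssdp:discover"',
        f"MX: {mx}",                 # renderers reply within MX seconds
        f"ST: {search_target}",      # search target: media renderers only
        "", "",                      # request ends with a blank line
    ]
    return "\r\n".join(lines).encode("ascii")

def discover(timeout: float = 2.0):
    """Multicast the search and collect (address, raw response) pairs."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(build_msearch(), SSDP_ADDR)
    found = []
    try:
        while True:
            data, addr = sock.recvfrom(4096)
            found.append((addr[0], data.decode(errors="replace")))
    except socket.timeout:
        pass
    finally:
        sock.close()
    return found
```

Because the transport is the network rather than an HDMI link, nothing here depends on any display staying awake, which is the appeal of this approach.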

haggis999

  • Galactic Citizen
  • ****
  • Posts: 461
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #18 on: Today at 03:44:38 am »


Quote
EDIT: Another option (if your AVR supports it) would be to send music using DLNA. No physical connection is required, and I don't think sleeping monitors would interfere.

I use JRiver MC for organising all my media files and for playing video. Stereo audio files are played from my NAS through a Naim streamer (without any MC involvement).

Multi-channel DSF audio (ripped from my SACDs) needs to be played via my Denon X8500H AV amp, but unfortunately the Denon only supports stereo DSF files, so a DLNA connection is not going to work. This is why I am choosing to decode the DSFs in MC and send a multi-channel PCM signal to the AV amp.
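For reference, the channel count that matters here lives in the DSF file's "fmt " chunk. A small Python sketch of reading it, based on the published DSF format specification (the helper name is mine): a fixed 28-byte "DSD " chunk is followed by a "fmt " chunk whose payload starts with six little-endian uint32s.

```python
import struct

def read_dsf_format(data: bytes):
    """Return (channel_count, sample_rate_hz) from the start of a DSF file.

    Per the DSF spec: a 28-byte 'DSD ' chunk (id, chunk size, file size,
    metadata pointer), then a 'fmt ' chunk whose payload - after the 4-byte id
    and 8-byte size - begins with six little-endian uint32s: format version,
    format id, channel type, channel count, sampling frequency, bits/sample.
    """
    if data[0:4] != b"DSD ":
        raise ValueError("not a DSF file")
    fmt_off = 28  # 'fmt ' chunk follows the fixed-size 'DSD ' chunk
    if data[fmt_off:fmt_off + 4] != b"fmt ":
        raise ValueError("missing fmt chunk")
    (_version, _fmt_id, _channel_type, channels,
     sample_rate, _bits) = struct.unpack_from("<6I", data, fmt_off + 12)
    return channels, sample_rate
```

A multi-channel SACD rip would report something like 5 or 6 channels at 2822400 Hz (DSD64); a DLNA renderer that only handles stereo DSF rejects anything above 2, which is what forces the decode-to-PCM path described above.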