INTERACT FORUM


Author Topic: Is there a way to avoid using the GPU for HDMI output?  (Read 243 times)

haggis999

  • Galactic Citizen
  • ****
  • Posts: 451
Is there a way to avoid using the GPU for HDMI output?
« on: May 14, 2024, 02:56:48 am »

The Intel NUC I have been using for several years as an HTPC has just died. Rather than replace it, I decided to use my primary Windows 11 workstation as the HTPC. The HDMI output of the GPU card on this workstation is now connected to my AV amp.

When using this to play multi-channel DSF files (ripped from my SACDs) I hit an unexpected problem. The music would stop as soon as the workstation's monitors went to sleep. However, it didn't take me long to find that JRiver already had a solution to this problem, using the audio setting of "Disable display from turning off (useful for HDMI audio)".

The only issue with that solution is that it leaves the two monitors on my workstation running when they are not being used. I could just switch them off, but is there a way to add an alternative HDMI output to a PC that bypasses the GPU card? I'm guessing that a discrete sound card might be the only way to do this.
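For background, settings like "Disable display from turning off" are typically built on the Windows execution-state API (SetThreadExecutionState). Here is a minimal Python sketch of that general mechanism (an assumption about the usual technique, not a claim about how MC implements its setting):

```python
import ctypes
import sys

# Execution-state flags from the Windows API (winbase.h).
ES_CONTINUOUS       = 0x80000000  # keep the requested state until changed
ES_DISPLAY_REQUIRED = 0x00000002  # the display is in use; don't blank it

def display_awake_flags() -> int:
    """Return the flag mask that asks Windows to keep the display on."""
    return ES_CONTINUOUS | ES_DISPLAY_REQUIRED

def keep_display_awake() -> bool:
    """Request that Windows keep the display on while this thread runs.

    Returns True if Windows accepted the request; False on failure or
    on non-Windows platforms, where this is a no-op.
    """
    if sys.platform != "win32":
        return False
    return ctypes.windll.kernel32.SetThreadExecutionState(
        display_awake_flags()) != 0
```

Calling keep_display_awake() once is enough because ES_CONTINUOUS keeps the request in force; passing ES_CONTINUOUS alone later would cancel it.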

JimH

  • Administrator
  • Citizen of the Universe
  • *****
  • Posts: 71452
  • Where did I put my teeth?
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #1 on: May 14, 2024, 05:22:26 am »
Like this?
https://www.amazon.com/Adapter-Multi-Display-Converter-Projector-Chromebook/dp/B087PD9KSZ/

Or maybe you have an old video card around?


haggis999

  • Galactic Citizen
  • ****
  • Posts: 451
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #2 on: May 14, 2024, 06:24:54 am »

Quote
Like this?
https://www.amazon.com/Adapter-Multi-Display-Converter-Projector-Chromebook/dp/B087PD9KSZ/


I had already tried a similar solution last night, before raising my post this morning, using an adapter I originally bought to provide an HDMI output for my Dell XPS 15 (which only has USB-C outputs). However, my normal MC32 audio device of DENON-AVRHD (NVIDIA High Definition Audio) [WASAPI] was no longer available, and I had to change my Output Format setting from 5.1 channels to Source number of channels before MC would start to play some music. Even then it was not a positive result, as my AV amp did not receive any audio signal.

The key problem is that the Windows Playback settings show my DENON-AVRHD device as "Not plugged in" when using this USB to HDMI adapter.

haggis999

  • Galactic Citizen
  • ****
  • Posts: 451
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #3 on: May 14, 2024, 06:48:38 am »


Quote
Or maybe you have an old video card around?

I suspect that the AV capability of my workstation is provided by my RTX 3070 and there is no alternative AV source from my combination of Gigabyte B550 Vision D-P motherboard and AMD Ryzen 7 5800X processor. If true, that would explain why my USB to HDMI adapter is not working.

Your other suggestion of using a spare video card might do the job, though won't older cards fail to handle newer audio codecs?

tzr916

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 1320
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #4 on: May 14, 2024, 08:30:32 am »

Quote
music would stop as soon as the workstation's monitors went to sleep

Windows EDID strikes again.

Basically, when Windows detects that one of the connected "monitors" has shut off, or that the input on the AVR/TV has changed, playback stops, and Windows may also move or resize apps, video windows, and even desktop icons.

The fix is to trick Windows into thinking that the monitor stays connected. I've heard that Windows 11 supposedly has this option, but I run Win 10. Some GPUs have an EDID override to get around this, like NVIDIA Quadro cards or some newer Intel NUCs. But if your GPU/driver does not, the BEST way to trick Windows is an external HDMI hardware EDID device.

I tried many: several active HDMI switches, a "monitor detect killer" (pin 19), and a 4K HDFury, but the only one that caused ZERO problems ended up being https://atlona.com/product/at-etu-sync/ (can be found on eBay for less than $50).
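For anyone curious what these boxes actually do: an EDID emulator presents the source with a static copy of the display's EDID, a 128-byte base block with a fixed 8-byte header and a final checksum byte that makes the whole block sum to zero mod 256. A small Python sketch of that block structure (illustrative only; a real emulator clones the attached display's actual EDID):

```python
# EDID base block: 128 bytes; fixed 8-byte header; byte 127 is a
# checksum chosen so the whole block sums to 0 mod 256.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def is_valid_edid(block: bytes) -> bool:
    """Check the base-block length, header, and checksum."""
    return (len(block) == 128
            and block[:8] == EDID_HEADER
            and sum(block) % 256 == 0)

def make_dummy_edid() -> bytes:
    """Build a minimal, checksum-correct 128-byte EDID block.

    Every descriptor field is zeroed; only the header and checksum
    are filled in, which is enough to pass the structural check.
    """
    body = bytearray(128)
    body[:8] = EDID_HEADER
    body[127] = (-sum(body[:127])) % 256   # fix up the checksum byte
    return bytes(body)
```

Because the emulator always answers with this stored block on the DDC lines, the GPU never sees the hot-plug/EDID change that normally fires when the real display powers down.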
JRiverMC v32 •Windows 10 Pro 64bit •Defender Exclusions •ṈØ 3rd party AV
•ASUS TUF gaming WiFi z590 •Thermaltake Toughpower GX2 600W
•i7-11700k @ 3.6GHz~5GHz •32GB PC4-25600 DDR4
•OS on Crucial P5 Plus M.2 PCIe Gen4 •Tv Recordings on SATA 6TB WD Red Pro
•4 OTA & 6 CableCard SiliconDust Tuners
•nVidia RTX2060 •XBR65Z9D •AVRX3700H •Fluance 7.2.2 [FH]
•SMP1000DSPѫeD A3-300[RSS315HE-22] •SPA300DѫYSTSW215[15-PRX8S4]

haggis999

  • Galactic Citizen
  • ****
  • Posts: 451
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #5 on: May 14, 2024, 11:37:43 am »

Quote
Windows EDID strikes again.

Basically, when Windows detects that one of the connected "monitors" has shut off, or that the input on the AVR/TV has changed, playback stops, and Windows may also move or resize apps, video windows, and even desktop icons.

The fix is to trick Windows into thinking that the monitor stays connected. I've heard that Windows 11 supposedly has this option, but I run Win 10. Some GPUs have an EDID override to get around this, like NVIDIA Quadro cards or some newer Intel NUCs. But if your GPU/driver does not, the BEST way to trick Windows is an external HDMI hardware EDID device.

I tried many: several active HDMI switches, a "monitor detect killer" (pin 19), and a 4K HDFury, but the only one that caused ZERO problems ended up being https://atlona.com/product/at-etu-sync/ (can be found on eBay for less than $50).

That Atlona Etude Sync adapter is currently only available on eBay UK as an import from the USA, which more than doubles the price once shipping is taken into account. It would cost me £88, without any guarantee or technical support.

However, your very useful advice persuaded me to order a Lindy HDMI 2.0 EDID Emulator (https://www.lindy.co.uk/audio-video-c2/hdmi-2-0-edid-emulator-p11244), which comes with a 2-year guarantee and lifetime support. Lindy's tech support in the UK have assured me it will do the job; if it doesn't, I will return it. I was told that this adapter doesn't support Dolby Atmos, but I don't think that is relevant to my needs, as I only have a 5.0 speaker system.

haggis999

  • Galactic Citizen
  • ****
  • Posts: 451
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #6 on: Today at 09:48:07 am »

Quote
I tried many: several active HDMI switches, a "monitor detect killer" (pin 19), and a 4K HDFury, but the only one that caused ZERO problems ended up being https://atlona.com/product/at-etu-sync/ (can be found on eBay for less than $50).

Sadly, my results with the Lindy HDMI EDID adapter appear to echo your own bad experiences in this area. This adapter simply failed to keep the music playing when the PC's monitors went to sleep.

After I contacted Lindy UK tech support, they ran a test of their own this afternoon. That test worked fine, but they couldn't explain why it failed with my equipment. They could only speculate that a difference in drivers might be to blame. The adapter will be returned for a refund. 

It looks like I will just have to use JRiver's audio setting of "Disable display from turning off (useful for HDMI audio)" and then manually switch off my PC's monitors when I am listening to music via HDMI (though that has the irritating effect of briefly stopping playback).
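If reaching for the monitors' power buttons gets tedious, one possible workaround (a sketch of a standard Win32 technique, not something tested in this thread) is to blank them programmatically: broadcasting SC_MONITORPOWER turns the panels off without Windows registering a disconnect, which may sidestep the playback interruption that physical power-off causes.

```python
import ctypes
import sys

# Win32 message constants (from winuser.h).
HWND_BROADCAST  = 0xFFFF   # deliver to all top-level windows
WM_SYSCOMMAND   = 0x0112
SC_MONITORPOWER = 0xF170
MONITOR_OFF     = 2        # lParam: 2 = off, 1 = low power, -1 = on

def monitors_off() -> None:
    """Blank all attached displays (Windows only)."""
    if sys.platform != "win32":
        raise RuntimeError("This sketch only works on Windows")
    ctypes.windll.user32.SendMessageW(
        HWND_BROADCAST, WM_SYSCOMMAND, SC_MONITORPOWER, MONITOR_OFF)
```

The displays wake again on any mouse or keyboard input, so this could be bound to a hotkey before starting playback.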

tzr916

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 1320
Re: Is there a way to avoid using the GPU for HDMI output?
« Reply #7 on: Today at 12:57:48 pm »

What's your exact chain of hardware?

E.g. mine is: primary display playing MC on the NVIDIA GPU > HDMI > EDID box > AVR > TV... and a secondary display via on-board graphics VGA to monitor.