INTERACT FORUM


Author Topic: JRiver audio testing dilema  (Read 32316 times)

blgentry

  • Regular Member
  • Citizen of the Universe
  • *****
  • Posts: 8009
Re: JRiver audio testing dilema
« Reply #50 on: August 23, 2015, 10:04:26 pm »

If you are going to try to make recordings of the output of the speakers and use that as your basis of comparison, I think it's going to be extremely difficult.  I'm betting that you can't make two recordings of exactly the same signal chain with exactly the same song, and have the recordings be identical.  If you *can* make recordings that are identical, bit for bit of two different "plays" of the same thing, then you might have a shot at recording the differences between two *different* playback chains.

But I say, why go all the way to the acoustic level anyway?  If you're trying to prove whether there's a difference, why not start further up the chain, at the digital level?  I don't know how to do it, but I would like to see "recordings" of the digital bit stream being fed to a DAC.  Record the output of one player feeding the DAC.  Then record the output of the other player.  Then compare and see if they are different.
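
A minimal sketch of the comparison step, assuming each player's digital output has been captured to a WAV file by some loopback means (the helper names here are hypothetical, just to illustrate the idea of comparing the audio data bit for bit while ignoring file-header differences):

```python
import hashlib
import wave

def capture_fingerprint(path):
    """Hash only the PCM frames of a WAV capture, ignoring header metadata
    (two identical streams can still differ in chunk layout or tags)."""
    with wave.open(path, "rb") as w:
        frames = w.readframes(w.getnframes())
    return hashlib.sha256(frames).hexdigest()

def captures_identical(path_a, path_b):
    """True if two captures carry bit-identical audio data."""
    return capture_fingerprint(path_a) == capture_fingerprint(path_b)
```

If two plays of the same file through the same chain fingerprint identically, the capture method is repeatable enough to then compare captures from two different players.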

The other way of doing this is pure blind testing.  I see you've already said that your listening environment isn't suited to this.  Blind testing, done correctly, is pretty difficult, so maybe it's good for you that you can't do it.  :)  (Meaning that it's a pain to do, so you save yourself the pain by not doing it.)

Brian.
Logged

Hilton

  • Regular Member
  • Citizen of the Universe
  • *****
  • Posts: 1291
Re: JRiver audio testing dilema
« Reply #51 on: August 24, 2015, 12:02:51 am »

It would be possible to do the blind testing using my HD600 headphones, though, which should be revealing enough. 
I'm thinking about getting a new fully balanced DAC and headphone/pre-amp, the Asus Essence III, which does DSD too.



Logged

JimH

  • Administrator
  • Citizen of the Universe
  • *****
  • Posts: 71429
  • Where did I put my teeth?
Re: JRiver audio testing dilema
« Reply #52 on: August 24, 2015, 12:38:22 am »

Read user mitchco's articles about testing JRiver on computeraudiophile.com.
Logged

RoderickGI

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 8186
Re: JRiver audio testing dilema
« Reply #53 on: August 24, 2015, 05:34:13 am »

Darko's review (which I quite enjoyed):
http://www.digitalaudioreview.net/2015/08/a-stand-off-down-under-puremusic-vs-jriver-vs-audirvana/


Ah. I read as much of that as I could. It wasn't easy.

One question out of that though:
How can you show that each player produced the same output, if the only voting option was to pick one player which was the "best"?

The whole test was preordained to find a difference. That is what we humans do. If we set out to find a difference in something measured in the millions, with a difference defined as being in the 100,000s, and saw none, we would redefine a difference as being in the 10,000s, and if still none, in the 1,000s, and so on, until we see/hear/measure a difference even where there is none, and even when we have no instrument that can measure it (e.g. picosecond jitter).

The world is full of lots of very different people. I guess that is good, mostly. But that article just made me feel . . . yuck!
Logged
What specific version of MC you are running:MC27.0.27 @ Oct 27, 2020 and updating regularly Jim!                        MC Release Notes: https://wiki.jriver.com/index.php/Release_Notes
What OS(s) and Version you are running:     Windows 10 Pro 64bit Version 2004 (OS Build 19041.572).
The JRMark score of the PC with an issue:    JRMark (version 26.0.52 64 bit): 3419
Important relevant info about your environment:     
  Using the HTPC as a MC Server & a Workstation as a MC Client plus some DLNA clients.
  Running JRiver for Android, JRemote2, Gizmo, & MO 4Media on a Sony Xperia XZ Premium Android 9.
  Playing video out to a Sony 65" TV connected via HDMI, playing digital audio out via motherboard sound card, PCIe TV tuner

mwillems

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 5177
  • "Linux Merit Badge" Recipient
Re: JRiver audio testing dilema
« Reply #54 on: August 24, 2015, 07:54:14 am »

If you are going to try to make recordings of the output of the speakers and use that as your basis of comparison, I think it's going to be extremely difficult.  I'm betting that you can't make two recordings of exactly the same signal chain with exactly the same song, and have the recordings be identical.  If you *can* make recordings that are identical, bit for bit of two different "plays" of the same thing, then you might have a shot at recording the differences between two *different* playback chains.

But I say, why go all the way to the acoustic level anyway?  If you're trying to prove whether there's a difference, why not start further up the chain, at the digital level?  I don't know how to do it, but I would like to see "recordings" of the digital bit stream being fed to a DAC.  Record the output of one player feeding the DAC.  Then record the output of the other player.  Then compare and see if they are different.

The reason to involve analog measurement is that jitter, for example, is measurable on an analog output of a DAC (not necessarily fed to a speaker, you'd want to loop it back to a very low distortion ADC or scope to measure the distortion).  Analog output fed to a speaker would also be a really easy way to measure a volume difference.

I think loopback measurements (both digital and analog) are probably the most reliable way to measure and will show a better picture of the various distortion components. However, in-room measurements with speakers can show that most of these exotic kinds of distortion are buried under speaker and room distortion that is orders of magnitude louder, which may render them completely inaudible in practice.

Even quite expensive speakers in treated rooms often have total distortion profiles at or worse than 1% (I've seen speakers that were a good bit better than that, but not orders of magnitude better).  A well-behaved DAC might have a jitter distortion component much lower than .001%.  It's true that jitter is "more audible" than harmonic distortion, but I'm not sure it's necessarily a hundred times more audible, etc.

If anyone is going down this road, here are two articles on measuring jitter, which is by far the hardest distortion component to measure:
http://www.anedio.com/index.php/article/measuring_jitter
http://nwavguy.blogspot.com/2011/02/jitter-does-it-matter.html

If anyone wants to measure plain old harmonic distortion, Holm Impulse has a distortion analyzer built-in that can provide the THD and also graph the individual harmonics.  It can be used for both loopback measurements, and for in room measurements.
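
Not Holm Impulse's actual algorithm, but as a rough sketch of what such a distortion analyzer does with a captured sine test tone (hypothetical FFT-based helper; the +/-5 Hz search band and Hann window are arbitrary choices for the sketch):

```python
import numpy as np

def thd_percent(signal, fs, f0, n_harmonics=5):
    """Estimate total harmonic distortion of a captured sine test tone:
    sqrt(sum of squared harmonic magnitudes) / fundamental magnitude * 100."""
    windowed = signal * np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)

    def peak(f):
        # strongest bin within +/-5 Hz of the target frequency
        band = (freqs > f - 5) & (freqs < f + 5)
        return spectrum[band].max()

    fundamental = peak(f0)
    harmonics = [peak(k * f0) for k in range(2, 2 + n_harmonics)
                 if k * f0 < fs / 2]  # harmonics above Nyquist don't exist
    return 100 * np.sqrt(sum(h * h for h in harmonics)) / fundamental
```

Feeding it a 1 kHz tone with a 1% second harmonic should come back at roughly 1% THD; the graphing of individual harmonics that Holm does is just the list comprehension above before it gets summed.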
Logged

mojave

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 3732
  • Requires "iTunes or better" so I installed JRiver
Re: JRiver audio testing dilema
« Reply #55 on: August 24, 2015, 01:46:04 pm »

If you are going to try to make recordings of the output of the speakers and use that as your basis of comparison, I think it's going to be extremely difficult.  I'm betting that you can't make two recordings of exactly the same signal chain with exactly the same song, and have the recordings be identical.  If you *can* make recordings that are identical, bit for bit of two different "plays" of the same thing, then you might have a shot at recording the differences between two *different* playback chains.

But I say, why go all the way to the acoustic level anyway?  If you're trying to prove whether there's a difference, why not start further up the chain, at the digital level?  I don't know how to do it, but I would like to see "recordings" of the digital bit stream being fed to a DAC.  Record the output of one player feeding the DAC.  Then record the output of the other player.  Then compare and see if they are different.
The MOTU AVB audio devices make this kind of testing very easy thanks to their multi-client drivers and excellent mixer. You can reroute the digital input from a media player back to the computer. I do this by routing movies to Room Equalization Wizard to check the maximum SPL levels on various channels, and by routing to Spectrum Lab to check bass content. You can also reroute back into a DAW or another program and save the file, then compare all the files to see whether the media players are digitally identical.
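
The "compare all files" step can also be done as a null test: align the two captures, subtract, and look at the residual. A hedged sketch, assuming the captures are already loaded as float sample arrays normalized to +/-1 (the helper name is illustrative):

```python
import numpy as np

def null_test(a, b):
    """Align capture b against capture a, subtract, and return the peak
    residual in dBFS. Captures that null completely return -inf; anything
    much above roughly -100 dBFS indicates a real difference."""
    # find the capture-start offset via cross-correlation
    corr = np.correlate(a, b, mode="full")
    lag = int(corr.argmax()) - (len(b) - 1)
    if lag >= 0:
        a = a[lag:]   # signal appears later in a: drop a's leading samples
    else:
        b = b[-lag:]  # signal appears later in b: drop b's leading samples
    n = min(len(a), len(b))
    peak = np.max(np.abs(a[:n] - b[:n]))
    return -np.inf if peak == 0 else 20 * np.log10(peak)
```

A residual at the dither floor says the players differ only in dither; a residual well above it says something in one chain is actually changing the samples.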

You could also have several media players all playing the same content at once to the DAC, and then switch in the mixer which media player is outputting to the speakers. This is using the identical DAC chip and routing for all media players with instantaneous switching.

With a 32-bit DAC, the MOTU AVB audio devices will eliminate dither as being an issue unless 24-bit dither is selected in JRiver in Options > Audio > Audio Device > Device Settings. (Note:  The MOTU Ultralight AVB has a 24-bit DAC)
Logged

gvanbrunt

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 1232
  • MC Nerd
Re: JRiver audio testing dilema
« Reply #56 on: August 24, 2015, 06:40:49 pm »

I'll make this fairly short: you have an opportunity here. What you need to do is talk it out and see how you can improve another round of tests. Don't argue the results of the current one, just suggest ways to improve methodology next time. Also break it down further than just sound. If it wins on sound, having reliability, features etc. would gain some support for other players.

Double-blind ABX is the de facto standard; they need to do that. And prior to that, you need to remove as many variables as possible: volume, EQ, etc. need to go unless they are the same for all players.

That is what I would press them on. Not on how the test was flawed, but here is how to do it better "so we can really get the ultimate player". Play into their thinking rather than against it.
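
On removing the volume variable: RMS level matching before any listening is simple to sketch (hypothetical helper; clips are assumed loaded as float sample arrays):

```python
import numpy as np

def gain_to_match(reference, other):
    """Gain in dB to apply to `other` so its RMS level matches `reference`."""
    rms = lambda x: np.sqrt(np.mean(np.square(x)))
    return 20 * np.log10(rms(reference) / rms(other))
```

A clip at half the amplitude of the reference comes back as roughly +6 dB, which is far more than the fraction-of-a-dB mismatch that is known to skew listening preferences.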
Logged

glynor

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 19608
Re: JRiver audio testing dilema
« Reply #57 on: August 24, 2015, 08:09:12 pm »

I think gvanbrunt has a good point here.

I've seen tests in the past, back in the old days, where people recorded the SPDIF output of MC and then compared the recordings to the original files, in order to prove lossless output. I looked around briefly, but couldn't find any. They were buried in threads on HydrogenAudio, I believe.

I imagine it would be more difficult, but also possible to accomplish the same thing with a USB DAC, and sniff the USB traffic on the bus.  You can do it with Wireshark, but to convince anyone that it was "right at the plug" you'd probably need to use some specialized hardware.

But, I think gvanbrunt is right here. That won't convince anyone (though it might be interesting to catch "cheaters" if any exist). You'd pretty much need to be an engineer to figure it out, and even with a detailed analysis, to the uninformed, it would just be one more conflicting analysis they don't fully grasp to compare to the other ones from the other "side".

It would be nice to do additional, actually rigorous testing. Because that might actually convert some people. The only way to do it for real is an ABX test. You can ask, for each attempt, for people to choose which they prefer, but you need to do the X test to determine if they know what they're talking about.

You also cannot tabulate the results where the participants can see them "as they're collected" (by a show of hands, marking on a board up front, or whatever) because we're pack hunting animals and the psychology of "going with the herd" (or in some cases with some participants, the urge to "go against the herd") is way too strong. You also need to do multiple "rounds" of testing, with a variety of different samples, to get good results.
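
The arithmetic behind needing multiple rounds is worth sketching: a blind guesser gets each X right with probability 1/2, so a one-sided binomial test tells you whether a score actually beats chance (illustrative helper):

```python
from math import comb

def abx_p_value(correct, trials):
    """Chance probability of scoring at least `correct` out of `trials`
    ABX rounds by blind guessing (one-sided binomial test, p = 1/2)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials
```

14 right out of 16 gives p of about 0.002, a result worth taking seriously; 10 out of 16 gives about 0.23, indistinguishable from guessing. With only a handful of rounds, even a perfect score proves little, which is why one pass around the room settles nothing.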

But, in the end, psychology is so strong, I don't know if it matters and if you'll convince anyone. We think we are rational machines, but we're not. We operate nearly entirely on a kind of subconscious "auto-pilot", and those tendencies are especially strong with things like this, that are ephemeral (and, to some people, basically spiritual).

If you're interested, and you get iPlayer or can otherwise dig it out, BBC's Horizon did a fantastic show about how the mind works and how we make decisions a while back: Horizon - s2014 e5 - How You Really Make Decisions (and the episode right before it, was The Power of the Placebo, which was also good).
Logged
"Some cultures are defined by their relationship to cheese."

Visit me on the Interweb Thingie: http://glynor.com/

gvanbrunt

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 1232
  • MC Nerd
Re: JRiver audio testing dilema
« Reply #58 on: August 24, 2015, 08:23:58 pm »

Glynor sums up the technical details well. Also keep in mind to "play at their own level" by motivating them to add additional criteria to improve the results. All in the interest of finding the best-sounding results, of course. :)

I should point out that it is a very interesting situation you are in where you can influence their beliefs and possibly see some interesting results. The key is not to whack them over the head. They love experimenting with stuff and trying things. So build on that.
Logged

Hilton

  • Regular Member
  • Citizen of the Universe
  • *****
  • Posts: 1291
Re: JRiver audio testing dilema
« Reply #59 on: August 24, 2015, 09:38:06 pm »

I agree with all the above. However...
 
I just want to point out I wasn't trying to whack anyone or be overly critical. As we have all agreed, testing is a very complex process, and from the discussion here it's clear a lot of things could be done differently, which may or may not alter the outcome of the testing. All I was trying to do was point out the variability of the test conditions.

Maybe it wasn't a fair test, but the point still remains: JRiver didn't test well under those conditions. There may have been a number of uncontrolled and unknown variables and biases that led to the result.

I'll consider all of the commentary and perhaps discuss with Tom, if he ever speaks to me again! :)

Even if we don't get the opportunity for a "re-match", I still think it would be an interesting experiment to do more detailed testing and comparison at the bit and sample level, as well as some more controlled listening tests with at least blind ABX testing; double-blind is very difficult under these circumstances.

I still think it's possible, and quite likely, that the human ear and brain can resolve more detail than can be measured in digital samples, which means scientific analysis of "data" won't necessarily prove they sound different or "better".

Here's some very interesting reading on USB Audio.
http://www.thewelltemperedcomputer.com/KB/USB.html

A high-speed, real-time, hardware-based USB analyser is about $500 (enough for up to 12 Mbit/s, which covers Class 1 USB Audio, i.e. up to 24-bit/96 kHz).
Logged

glynor

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 19608
Re: JRiver audio testing dilema
« Reply #60 on: August 24, 2015, 10:18:35 pm »

Maybe it wasn't a fair test, but the point still remains: JRiver didn't test well under those conditions. There may have been a number of uncontrolled and unknown variables and biases that led to the result.

See, this is where I vehemently disagree, and is the biggest problem with many of these kinds of "for fun" audio tests (which try to cloak themselves in science without actually practicing it).

It wasn't an audio test. It was a test of some kind, but not an audio test between players. You're saying (and have said a few times) "maybe it was a bad test, but"...

The "but" is... It wasn't a test. If you don't account for some of the factors you didn't account for, rigorously, you have meaningless results. You may have seen some "difference" but you have no way to gauge why you saw the difference, which is pretty much the definition of testing between different audio players. If the effect was "real" (and not driven by psychology), the most likely explanation for your test results, by far, is that volume levels were not properly matched.

It was a waste of time, except for the joy of doing it. Any document published from the results is not serious and should be disregarded.
Logged
"Some cultures are defined by their relationship to cheese."

Visit me on the Interweb Thingie: http://glynor.com/

glynor

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 19608
Re: JRiver audio testing dilema
« Reply #61 on: August 24, 2015, 10:28:52 pm »

I still think it's possible and quite likely that the human ear and brain is able to resolve more detail than can be measured in digital samples which means scientific analysis of "data" wont necessarily prove they sound different or "better".

This is a religious statement, not a scientific one.

If that is your hypothesis, then prove it. ;) If you're saying it is "beyond science" and "beyond proof" then it is either religion, science woo, or grifter-speak. You can believe whatever you want, but then you don't get to wear the trappings of science (at least, and not have people call you on it).
Logged
"Some cultures are defined by their relationship to cheese."

Visit me on the Interweb Thingie: http://glynor.com/

Hilton

  • Regular Member
  • Citizen of the Universe
  • *****
  • Posts: 1291
Re: JRiver audio testing dilema
« Reply #62 on: August 25, 2015, 12:50:05 am »

This is a religious statement, not a scientific one.

If that is your hypothesis, then prove it. ;) If you're saying it is "beyond science" and "beyond proof" then it is either religion, science woo, or grifter-speak. You can believe whatever you want, but then you don't get to wear the trappings of science (at least, and not have people call you on it).

LOL That's too true.  I stand corrected... I'm not sure that I'm qualified to prove such a hypothesis. :)
Logged

JimH

  • Administrator
  • Citizen of the Universe
  • *****
  • Posts: 71429
  • Where did I put my teeth?
Re: JRiver audio testing dilema
« Reply #63 on: August 25, 2015, 01:54:30 am »

I agree with glynor.  That kind of "belief" is passed around in some audiophile circles until eventually it becomes "fact".

We used to call this "hand waving", as in "you understand, don't you?".
Logged

gvanbrunt

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 1232
  • MC Nerd
Re: JRiver audio testing dilema
« Reply #64 on: August 25, 2015, 06:57:46 am »

I still would love to see another round of tests with more rigorous criteria. I don't believe you need to spend money on a USB analyzer, and perhaps some of the technophiles here could come up with a few simple ways to match volume, EQ, etc. for the tests? Anyone have a good way to do ABX that is also simple and cheap? I would bet you can't tell the difference if you get them even close.
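
For what it's worth, the ABX protocol itself costs almost nothing to implement. A minimal harness sketch, with `play` and `ask` as stand-ins for whatever playback and prompting is available (both are hypothetical callbacks, not any real player's API):

```python
import random

def run_abx(trials, play, ask, rng=random):
    """Minimal ABX harness. `play(source)` plays a clip from source "A" or
    "B"; `ask()` returns the listener's guess for X ("A" or "B").
    Returns the number of correct identifications."""
    correct = 0
    for _ in range(trials):
        x = rng.choice("AB")  # hidden identity of X for this round
        play("A")             # reference A, revealed
        play("B")             # reference B, revealed
        play(x)               # X: one of the two, not revealed
        if ask() == x:
            correct += 1
    return correct
```

The only hard part in practice is making `play` switch sources without an audible tell (click, delay, level jump), which is exactly what the mixer-based switching discussed earlier would solve.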
Logged

JimH

  • Administrator
  • Citizen of the Universe
  • *****
  • Posts: 71429
  • Where did I put my teeth?
Re: JRiver audio testing dilema
« Reply #65 on: September 03, 2015, 02:56:48 pm »

Our friend, mwillems, had a modified version of his post published on Computer Audiophile this weekend.

http://www.computeraudiophile.com/content/663-my-lying-ears/

Thanks, mwillems.  
Logged

mwillems

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 5177
  • "Linux Merit Badge" Recipient
Re: JRiver audio testing dilema
« Reply #66 on: September 03, 2015, 03:09:11 pm »

Our friend, mwillems, had a modified version of his post published on Computer Audiophile this weekend.

http://www.computeraudiophile.com/content/663-my-lying-ears/

Thanks, mwillems.  

I'm pleased and humbled by the reaction over there.  I wasn't sure what to expect, but the comments have been very constructive so far.  Some of the discussion over there about the difficulty of describing what different types of distortion or audio filters sound like almost inspired me to try to work up a primer with sample audio clips, but then I remembered that I have a 1-year-old who is not very interested in waterfall plots or linear-phase filters and who thinks microphone cables are great fun to yank on ;D 

But if I were a younger man with more free time...
Logged

jmone

  • Administrator
  • Citizen of the Universe
  • *****
  • Posts: 14267
  • I won! I won!
Re: JRiver audio testing dilema
« Reply #67 on: September 03, 2015, 03:49:40 pm »

A nice (but no doubt brief) period of clarity... Now let's get back to Hospital Grade Power Points!
Logged
JRiver CEO Elect

gvanbrunt

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 1232
  • MC Nerd
Re: JRiver audio testing dilema
« Reply #68 on: September 04, 2015, 09:26:42 am »

Now let's get back to Hospital Grade Power Points!

Oh god no. If I have to hear that one again I'll install one in my home just so I can "ensure maximum power transfer" when I stick my tongue in it.
Logged

astromo

  • MC Beta Team
  • Citizen of the Universe
  • *****
  • Posts: 2239
Re: JRiver audio testing dilema
« Reply #69 on: September 05, 2015, 08:13:33 am »

If you're interested, and you get iPlayer or can otherwise dig it out, BBC's Horizon did a fantastic show about how the mind works and how we make decisions a while back: Horizon - s2014 e5 - How You Really Make Decisions (and the episode right before it, was The Power of the Placebo, which was also good).

Chased up both docos. Thanks for the tip. Well worth the watch.

While not targeting audio listening tests, both programmes show the power of the mind in convincing the brain of truth. Loved the work with monkeys and currency. Revealing.
Logged
MC31, Win10 x64, HD-Plex H5 Gen2 Case, HD-Plex 400W Hi-Fi DC-ATX / AC-DC PSU, Gigabyte Z370 ULTRA Gaming 2.0 MoBo, Intel Core i7 8700 CPU, 4x8GB GSkill DDR4 RAM, Schiit Modi Multibit DAC, Freya Pre, Nelson Pass Aleph J DIY Clone, Ascension Timberwolf 8893BSRTL Speakers, BJC 5T00UP cables, DVB-T Tuner HDHR5-4DT

Hilton

  • Regular Member
  • Citizen of the Universe
  • *****
  • Posts: 1291
Re: JRiver audio testing dilema
« Reply #70 on: September 06, 2015, 08:24:11 am »

Our friend, mwillems, had a modified version of his post published on Computer Audiophile this weekend.

http://www.computeraudiophile.com/content/663-my-lying-ears/

Thanks, mwillems.  

Thanks Jim, Thanks mwillems. Nice story/piece. :)
Logged

georgethesixth

  • Recent member
  • *
  • Posts: 19
Re: JRiver audio testing dilema
« Reply #71 on: October 22, 2015, 08:20:37 pm »

The AES looks into this apparently.

1. http://www.aes.org/events/136/papers/?ID=3894 (some of the resulting papers are relatively cheap, approx. 20 USD, and very interesting IMO).

2. The site provides names of people in the academic world who could be asked about testing methods and methodology.

George



Logged