HDMI iSilencer


I bought an HDMI iSilencer a couple of months ago, and man, I love this thing. I understand snake oil and all that stuff, but I really do think this makes an improvement depending on what it is used on. I have three multichannel playback choices: 1 is an Apple TV, 2 is a Zappiti Neo media player, and 3 is a PC running Roon Core.
On the Apple TV, yes, there is an improvement: much more refined and less harsh sounding. On the Zappiti, not so much difference; it still sounds a bit boomy. On the PC running Roon ROCK, playing multichannel FLACs and DSD, it is really good; everything just sounds more refined. A very good investment.
 
Jitter, jitter, go away, leave HDMI this day! Shades of Agatha (Marvel-Disney). If it works for you, it works. HDMI is a very jittery interface that certainly could have been way better designed from an electrical and jitter standpoint.

My Chris Stephens AppleTV-X (a modded 2021 Apple TV 4K), with its own hand-built linear power supply hard-wired into the modded unit via a short cable, is another example with a very clean HDMI output.
 
My Chris Stephens AppleTV-X (a modded 2021 Apple TV 4K), with its own hand-built linear power supply hard-wired into the modded unit via a short cable, is another example with a very clean HDMI output.
Yes, agreed. I have modded my own Apple TV following the AppleTV-X design.
 
If the difference is measurable, it may be audible. If it isn't measurable, and you're hearing a difference, that's placebo. I guarantee the recording and mixing engineers weren't worried about jitter in their chain; the mastering engineers probably weren't, either. So, why should you? As long as your cables actually meet specifications—and, unfortunately, many don't—you can rest assured that you're getting audibly transparent signal transmission. HDMI isn't such a broken interface that it needs fixing.
 
Yes, agreed. I have modded my own Apple TV following the AppleTV-X design.
So even after the mods performed on your Apple TV 4K, the addition of the iSilencer provided a "much more refined and less harsh sounding" experience. I know the answer to this already, but I don't suppose you had a way to truly confirm this with any real A/B testing?

Sounds to me like the Apple mods fell a little short for ya.
 
Anticipated first thoughts...
Extraordinary claims needing extraordinary evidence.
What's an explanation for why the industry would have an apparently easily cured shortcoming like this?

Then the expected first questions...
What's the definition of 'jitter' being talked about here?
I'm familiar with jitter in the context of sample rate clocks. Lower-jitter clocks are desirable in the AD and DA stages. Since shuttling the digital data in between, as HDMI does, doesn't involve AD or DA, there's nothing beyond data being received or not. So what's the context here in digital data transmission over HDMI?

If there's noise (RF or something) getting into the HDMI line and causing errors or dropouts and this device filters RF noise and cures it, that would be solid. But then someone would say that instead of what sounds like a description of an analog stage process.

If the digital audio is audibly dropout-riddled (but still staying connected and receiving) and the device cures that... Again, I'd expect to hear more than what sounds like a fidelity critique. I'd expect to hear a comment like "had serious static/dropouts before".

Does it strip out HDCP or something? :D
 
So even after the mods performed on your Apple TV 4K, the addition of the iSilencer provided a "much more refined and less harsh sounding" experience. I know the answer to this already, but I don't suppose you had a way to truly confirm this with any real A/B testing?

Sounds to me like the Apple mods fell a little short for ya.
He never said that he uses an iSilencer. And you are one of many who won't believe modding anything makes any difference. At least we all like multichannel or we wouldn't hang out here. That we can agree on.
 
If the difference is measurable, it may be audible. If it isn't measurable, and you're hearing a difference, that's placebo. I guarantee the recording and mixing engineers weren't worried about jitter in their chain; the mastering engineers probably weren't, either. So, why should you? As long as your cables actually meet specifications—and, unfortunately, many don't—you can rest assured that you're getting audibly transparent signal transmission. HDMI isn't such a broken interface that it needs fixing.

The recording and mixing engineers are not designing, manufacturing, marketing & selling playback gear!
Or making decisions on how good or expensive the playback gear should be. Your argument makes no sense!
 
Anticipated first thoughts...
Extraordinary claims needing extraordinary evidence.
What's an explanation for why the industry would have an apparently easily cured shortcoming like this?

Then the expected first questions...
What's the definition of 'jitter' being talked about here?
I'm familiar with jitter in the context of sample rate clocks. Lower-jitter clocks are desirable in the AD and DA stages. Since shuttling the digital data in between, as HDMI does, doesn't involve AD or DA, there's nothing beyond data being received or not. So what's the context here in digital data transmission over HDMI?

If there's noise (RF or something) getting into the HDMI line and causing errors or dropouts and this device filters RF noise and cures it, that would be solid. But then someone would say that instead of what sounds like a description of an analog stage process.

If the digital audio is audibly dropout-riddled (but still staying connected and receiving) and the device cures that... Again, I'd expect to hear more than what sounds like a fidelity critique. I'd expect to hear a comment like "had serious static/dropouts before".

Does it strip out HDCP or something? :D

I can't speak to the iFi stuff or how well, if at all, it works. Haven't tried it.

As for the AppleTV-X, Chris Stephens is very well known in the AV industry. For those of us who attended CES in the late '90s and early 2000s, Chris did all the 5.1 AV setups, both audio and video, that won the Best of CES awards. Chris also did the mods to the Electrohome 9" projectors marketed and sold as Vidikron projectors.
Chris has all sorts of very expensive gear and measures his mods.

https://appletvx.com/
 
I bought an HDMI iSilencer a couple of months ago, and man, I love this thing. I understand snake oil and all that stuff, but I really do think this makes an improvement depending on what it is used on. I have three multichannel playback choices: 1 is an Apple TV, 2 is a Zappiti Neo media player, and 3 is a PC running Roon Core.
On the Apple TV, yes, there is an improvement: much more refined and less harsh sounding. On the Zappiti, not so much difference; it still sounds a bit boomy. On the PC running Roon ROCK, playing multichannel FLACs and DSD, it is really good; everything just sounds more refined. A very good investment.
Interesting! I purchased two of these units a couple of weeks ago and liked what I was hearing and seeing with them installed (not a placebo for me). Normally I am skeptical of the claims made for many of these types of products, but the reason it didn’t take me long to push that Add to Cart button on Amazon is that I already use a couple of other iFi Audio SilentPower enhancer products, and both of those work for me too, in my system. I use one on my ATV’s output into the AVR, another on the output of my BD player into the AVR, and another on the input of my OLED TV. Using the HDMI iSilencer has helped me tolerate many of my discs that have that top-end brightness, including APP’s Turn of a Friendly Card. What amazes me on the video side is how realistic and 3-dimensional the picture looks on my 2016 65” LG OLED TV. It feels like I’m right there in the action, but at a distance from the screen (looking into the action).
 
He never said that he uses an iSilencer. And you are one of many who won't believe modding anything makes any difference. At least we all like multichannel or we wouldn't hang out here. That we can agree on.
Uh, no. The OP says he bought one a couple of months ago and then offers his impressions on the differences it makes across his HDMI sources.

It's not that I don't believe in mods; it's that I don't believe in conclusions arising from subjective listening experiences.
 
Uh, no. The OP says he bought one a couple of months ago and then offers his impressions on the differences it makes across his HDMI sources.

It's not that I don't believe in mods; it's that I don't believe in conclusions arising from subjective listening experiences.

Whoops! I didn't get that the dude who posted about modding his Apple TV 4K is the same dude who started this thread! My bad!

As for conclusions from subjective listening, I would simply say that I don't necessarily believe it will apply to me unless I try it out to my satisfaction in my own system and on my own time. And of course objective info certainly helps, but in the end it's my subjective satisfaction and jumping up and down for joy that matters!
 
I believe most of us dismiss the $10,000 power cord and similar claims out of hand. So a claim that a dongle-like blob in the HDMI channel will make a difference is likely to be met with skepticism.

Personally, I usually just try to keep my signal path as clean and simple as I can, using “pretty good” quality components and good practice design.

So I’ve never tried this bebob. I don’t really know what it does, and my complaints about HDMI connections are all about cables that just give up and break. Maybe I’m bending them too sharply, but I’m definitely trying not to. Since my HDMI connections are working, I don’t believe I’ll try to fix them.

That CD4 demodulator is another story.
 
The recording and mixing engineers are not designing, manufacturing, marketing & selling playback gear!
Or making decisions on how good or expensive the playback gear should be. Your argument makes no sense!
Even the cheapest commodity gear being sold today is not going to suffer from audible jitter. That's why engineers don't worry about it and consumers shouldn't, either.

Then the expected first questions...
What's the definition of 'jitter' being talked about here?
I'm familiar with jitter in the context of sample rate clocks. Lower-jitter clocks are desirable in the AD and DA stages. Since shuttling the digital data in between, as HDMI does, doesn't involve AD or DA, there's nothing beyond data being received or not. So what's the context here in digital data transmission over HDMI?
Jitter is relevant when dealing with synchronous digital connections, of which HDMI is one. The source device provides both the bitstream and clock data. Contrast that with USB, which can be synchronous, but is usually asynchronous when used for audio playback. In that case, the clock of the endpoint device is used to time playback of the samples.
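A crude way to picture that distinction, as a toy Python model (my own sketch, not how any real receiver is built; it ignores the jitter-filtering PLL an actual HDMI sink applies to the recovered clock): in the synchronous case the DAC's conversion instants inherit the source clock's jitter, while in the asynchronous case the endpoint's own oscillator sets them and the link only has to keep a buffer fed.

```python
import numpy as np

# Toy model of synchronous vs. asynchronous clocking. All figures are
# assumptions for illustration; a real HDMI receiver also cleans the
# recovered clock with a PLL, which this ignores.
fs, n = 48_000, 6
rng = np.random.default_rng(1)
ideal = np.arange(n) / fs                     # ideal conversion instants

source_jitter = rng.normal(0.0, 2e-9, n)      # assumed 2 ns RMS on the link
local_jitter = rng.normal(0.0, 50e-12, n)     # assumed 50 ps RMS local clock

sync_dac = ideal + source_jitter              # HDMI-style: slaved to the source
async_dac = ideal + local_jitter              # async-USB-style: local clock rules

print("sync  timing error (ns):", np.round((sync_dac - ideal) * 1e9, 3))
print("async timing error (ns):", np.round((async_dac - ideal) * 1e9, 4))
```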

But it's academic, really. We have long since passed the point where audible jitter was a real problem.
 
You all are makin' me crazy. What the heck are you talking about? No links, no research notes, so I had to look it up myself. I'm just lurking here; I don't own it and have no opinion.
iFi Audio makes it
iSilencer direct link
Moon Audio review

There are many reviews of this product online; they seem to lean a little more to the pro side than the con. Price is $59.00 to $79.00.
 
Jitter is relevant when dealing with synchronous digital connections, of which HDMI is one. The source device provides both the bitstream and clock data. Contrast that with USB, which can be synchronous, but is usually asynchronous when used for audio playback. In that case, the clock of the endpoint device is used to time playback of the samples.

But it's academic, really. We have long since passed the point where audible jitter was a real problem.
I wasn't thinking about clocking off the incoming HDMI stream... Yeah OK, that makes sense. Duh! even. :)

Agreed that clock jitter in AD and DA devices hasn't been an issue for well over 20 years, apart from some very early devices. (And yes, I did use an Aardvark sample rate clock to clock my ADAT machines for a minute back then!) Today... grab a lowly Behringer/Midas UMC interface and the converters and clock are just fine.

I mean, I think a device would need to be really broken, or be some Amazon-sold bootleg facsimile product, to be so bad that some filter on the input can help. But I do suspect such devices are out there in the wild. Amazon, Worst Purchase, et al.

So I'm not straight-out calling BS at all. But it is telling when asking a few technical questions gets either defensive responses or redirection, isn't it! :D (Not meaning to suggest anyone is doing that here! Just mentioning the trend of seeing that.) It just seems like something like this could only be a solution for something really broken, and that leads to looking for a root-cause fix instead. If some audio device has audible jitter issues here in the 21st century, that's on the designer/builder!

Well, I'm using FireWire- and USB-connected devices here. (Yes, I do still have some FireWire interfaces!) I still think avoiding HDMI for anything beyond a display connection is the best advice.
 
If, as I suspect, all it may do is jitter reduction of the HDMI signal, then the only thing that might improve is the bit error rate, and you would see that as an improved eye diagram of input versus output, since that is how it would be tested (significantly, the company making this item shows nothing technical to back up its claims, and looking at their Tech Note, it's just marketing bumph!). However, if there is an improvement, that would indicate non-HDMI-compliant cables and poorly designed and manufactured equipment at either end. In the past we did have poor clocks, as equipment was designed to a particular price, but the cost of high-quality clocks has dropped, so there are no excuses for poor clocking these days; the ones I design in have femtosecond jitter specs. Even if the data rates are related and the input clock is divided down to give an output clock in the equipment, the jitter is reduced by the division ratio, and if you use a PLL or DLL circuit to produce the output clock you can reduce it even further.
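To put a rough number on that division-ratio point, here's a toy Python sketch (my own illustration with assumed figures, not anything from the poster or the manufacturer): if a divider simply passes every Nth edge of the input clock through, the absolute edge jitter carries over unchanged, but it shrinks as a fraction of the now-longer period.

```python
# Toy arithmetic only: clock rate and jitter figures are assumptions chosen
# for illustration. Dividing the clock by N leaves the absolute edge jitter
# alone but makes it N times smaller relative to the output period.
f_in = 148.5e6      # assumed input clock, Hz (an HDMI-ish pixel clock rate)
sigma_t = 10e-12    # assumed absolute RMS edge jitter: 10 ps

for n_div in (1, 4, 16, 64):
    period = n_div / f_in
    print(f"divide by {n_div:3d}: period = {period * 1e9:7.2f} ns, "
          f"jitter = {100 * sigma_t / period:.4f}% of period")
```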

The data rate on HDMI is way beyond the sample rate of the DACs etc. in the receiving equipment, so the serial data on the HDMI link will be read into parallel memory and processed in a DSP at a different clock rate (no jitter will affect this processing). Consequently, the clock rate seen by the output DAC is different from that of the HDMI data.

Jitter issues are another of the fallacies peddled by a lot of 'audiophile' companies to make money. All jitter does at the sample rate is add a bit of noise so far down in level that it is out of hearing range, or so low it is masked by the audio signal.
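For scale, a back-of-envelope Python sketch (mine, using the standard first-order result and assumed jitter figures, not the poster's numbers): the jitter-limited SNR for a full-scale sine at frequency f sampled with RMS timing jitter sigma_t is approximately -20·log10(2π·f·sigma_t).

```python
import math

# Back-of-envelope check of "noise way down in level": jitter-limited SNR
# for a full-scale sine, SNR ≈ -20*log10(2*pi*f*sigma_t). The jitter values
# below are assumptions for illustration.
def jitter_snr_db(f_hz, sigma_t_s):
    return -20.0 * math.log10(2.0 * math.pi * f_hz * sigma_t_s)

for sigma in (1e-9, 100e-12, 1e-12, 100e-15):   # 1 ns down to 100 fs RMS
    print(f"{sigma:7.0e} s RMS -> SNR ≈ {jitter_snr_db(20e3, sigma):5.1f} dB at 20 kHz")
```

Even a full nanosecond of RMS jitter leaves the noise roughly 78 dB below a full-scale 20 kHz tone, and femtosecond-class clocks push it far below anything plausibly audible.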
 
Jitter and data loss and error correction are different artifacts.

Data loss can be glaring once it stops being 100% recoverable. Artifacts get wild as the system gets more unpredictable. Big chunks of dropouts still make chirps, clicks, and pops.

Jitter is uneven clock pulses (viewed under the microscope). A looser clock vs. a tighter, more stable clock usually just makes the audio a little less clear. Visually, imagine an image on a grid, then imagine the grid being just slightly imperfect in its spacing: the image would be just a little blurred or skewed. Clock jitter so bad it creates resonance, with buzzing and zinging sounds, is a full-blown meltdown example. A visual example of something like that might be a moiré pattern. Now you have artifacts on top of the thing that are pretty altering.
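The slightly-imperfect-grid picture is easy to put rough numbers on. A minimal Python sketch (my own, with assumed jitter values spanning a tight clock through a meltdown case): sample a full-scale 1 kHz sine on a perfect time grid and on a slightly uneven one, and measure the size of the resulting blur.

```python
import numpy as np

# Putting numbers on the uneven-grid analogy. Jitter values are assumptions
# chosen to span a tight clock through a badly broken one.
fs, f0, n = 48_000, 1_000.0, 1 << 16
rng = np.random.default_rng(0)
t = np.arange(n) / fs                          # the perfect grid

for sigma in (1e-9, 1e-8, 1e-6):               # assumed RMS jitter, seconds
    t_jit = t + rng.normal(0.0, sigma, n)      # the slightly uneven grid
    err = np.sin(2 * np.pi * f0 * t_jit) - np.sin(2 * np.pi * f0 * t)
    level_db = 20 * np.log10(np.sqrt(np.mean(err ** 2)) / np.sqrt(0.5))
    print(f"{sigma:6.0e} s jitter -> blur ≈ {level_db:6.1f} dB vs. the signal")
```

At nanosecond-level jitter the blur sits around -104 dB relative to the tone; it takes microsecond-scale jitter, i.e. a genuinely broken clock, before anything like the meltdown case appears.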

All the brochures for these questionable devices talk about the problem like it's the meltdown example.
 