P.J wrote:Most of the videos are interlaced
I have interlaced video only from my camera.
All my rips are progressive, since most BDs are too (except the "LSD-driven Japanese companies" that use hard telecine = 30i video that is really 24p). So where did they come from? Recordings from some TV channels (but the shows themselves are still progressive)? I can't count that as "most of the videos".
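Since hard telecine keeps coming up: here is a minimal Python sketch of the classic 3:2 pulldown pattern, just my own illustration of how 24p film ends up stored as ~30i (the frame labels are placeholders, not anything from a real encoder):

[code]
# Toy 3:2 pulldown (hard telecine): every 4 progressive film frames
# become 5 interlaced frames by repeating fields, turning 24p into ~30i.

def telecine_32(frames):
    """Return (top_field, bottom_field) pairs for each group of 4 frames."""
    out = []
    for i in range(0, len(frames) - 3, 4):
        a, b, c, d = frames[i:i + 4]
        # B's top field and D's bottom field repeat, stretching 4 frames to 5
        out += [(a, a), (b, b), (b, c), (c, d), (d, d)]
    return out

print(telecine_32(["A", "B", "C", "D"]))
# [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]
[/code]

An IVTC filter reverses this by matching the repeated fields back into the original 4 frames, which is why such "30i" video is really 24p.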
P.J wrote:and the most of computers don't have powerful CPU.
... but they have powerful GPUs, as described in your link?
P.J wrote:When you have to use GPU, you can't run de-interlacing on CPU.
Nope. With Intel QuickSync you can do any postprocessing you want on the CPU while the video is decoded on the GPU. With nVidia, LAV CUVID (or CoreAVC) and CUDA you can do the same thing if you add ffdshow as a filter for the raw video. Poor ati... again.
E.g. debanding: ffdshow deband + madVR, splash. Deinterlacing is just another postprocessing filter.
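To show that deinterlacing really is just another filter you can run on the CPU, here is a toy "bob" deinterlacer in Python (a frame is just a list of rows here, my own stand-in, not ffdshow's actual code):

[code]
def bob_deinterlace(frame):
    """Split an interlaced frame into its two fields, line-doubling each."""
    top = frame[0::2]                    # even rows = top field
    bottom = frame[1::2]                 # odd rows = bottom field
    double = lambda f: [row for row in f for _ in (0, 1)]
    return double(top), double(bottom)   # two progressive frames out
[/code]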
P.J wrote:Software decoding isn't right way
It's the right way for desktops and notebooks, when you don't care about battery life. E.g. my desktop can decode 1080p video at about 400fps, which is more than 8 times faster than my ati card (the card can't even handle 1080p@50fps). Software decoding has a lot of advantages: much faster seeking, forgetting about profile@level restrictions, 10-bit support and so on.
Any modern dual-core can easily decode 4K, so 1080p is a limit only for really slow processors like a single-core Atom.
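If you want to check numbers like that yourself, here is a rough sketch of timing pure software decode with ffmpeg's null muxer (assumes ffmpeg is on the PATH; the file name is hypothetical):

[code]
import subprocess
import time

def decode_fps(path, frames=2000):
    """Decode `frames` frames to a null sink and return frames per second."""
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-threads", "0", "-i", path,      # 0 = use all cores
         "-frames:v", str(frames), "-f", "null", "-"],
        stderr=subprocess.DEVNULL, check=True)
    return frames / (time.time() - start)

print(round(decode_fps("sample_1080p.mkv")), "fps")  # hypothetical file
[/code]

Decoding to the null sink isolates the CPU decode itself from rendering, so the result is comparable to the ~400fps figure above.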
P.J wrote:only good for some cases like 1080p60
Btw, with the new SB decoder in ffdshow my integrated video can handle 1080p@60fps with 16 ReFrames (High@5.1), decoding perfectly (splash still shows artefacts). Obviously, nVidia can too.
P.J wrote:Just look at the CPU usage while decoding 1080
Max 10% for 1080p on my desktop, or the same 10% for 720p on my acer.
P.J wrote:Ain't it crazy to burn CPU power for decoding it?
When you care about battery life on a notebook - yes. In other situations - no.
But when you care about battery, you don't really care about quality either.
P.J wrote:Did you forget CrystalHD or your previous netbook, hp mini 311?
Yes. Because it wasn't CrystalHD - it was a GeForce 9400M (aka ION, which has a slightly faster video decoding block than my desktop ati card, lol).
P.J wrote:And madVR is just a codec.
Codec = Coder/decoder.
So madVR isn't a codec. It's a renderer, like Haali or EVR or overlay...
The video has a low resolution, so the difference is huge. On HQ sources it's less noticeable, but still noticeable, especially when it comes to gradients: Splash (it decided to stretch it, for some reason), madVR.
Also splash does some extra deblocking that kills a lot of detail: splash, madVR.
madVR with softcubic50, splash (text is pixelated).
madVR with default lanczos4, splash.
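For reference, lanczos4 is just a windowed-sinc kernel with 4 lobes; a minimal sketch of the math (my own illustration, not madVR's code):

[code]
import math

def lanczos(x, a=4):
    """Lanczos kernel: sinc(x) * sinc(x/a) for |x| < a, else 0."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

# Tap weights for resampling a point 0.3 of the way between source pixels
# (a real scaler would normalize these so they sum to 1):
print([round(lanczos(0.3 - k), 3) for k in range(-3, 5)])
[/code]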
P.J wrote:Using it with some programs like MPC-HC is painful.
This is what real pain is