Ask Slashdot: Is There An Open Source Tool Measuring The Sharpness of Streaming Video?

dryriver asks: Is there an open source video analysis tool available that can take a folder full of video captures (e.g. news, sports, movies, music videos, TV shows), analyze the video frames in those captures, and put a hard number on how optically sharp, on average, the digital video provided by any given digital TV or streaming service is?

If such a tool exists, it could be of great use in shaming paid video content delivery services that promise proper "1080 HD" or "4K UHD" quality content but deliver video that is actually YouTube quality or worse. With such a tool, people could channel-hop across their digital TV service's various offerings for an hour or so, capture the video stream to hard disk, and then calculate an "average optical sharpness score" for that service that can be shared with others and published online, possibly shaming the content provider -- satellite TV providers in particular -- into upping their bitrate if the score turns out to be atrociously low for that service....

People in many countries -- particularly developing countries -- cough up hard cash to sign up for satellite TV, digital TV, streaming video and similar services, only to find that the bitrate, compression quality and optical sharpness of the delivered video isn't great at all. At a time when 4K UHD content is available in some countries, many satellite TV and streaming video services still do not deliver properly sharp, well-defined 1080 HD video to their customers, even though the quality advertised before signing up is very much "crystal clear 1080 HD High-Definition".

What's the solution? Leave your thoughts and suggestions in the comments.

And is there an open source tool measuring the sharpness of streaming video?
  • You'll probably run into copy-protection issues with many media sources. If you can get at the bits to analyze them, you can dump them out to create a copy. I'm not even sure sharpness is going to be easy to measure, if a soft/low-res source has been artificially sharpened to give the appearance of quality. Sports broadcasts seem to be the worst for it in my experience, with glowing halos around high-contrast edges to make them stand out even more.

    • Sports are hard to encode because of camera pans and the usual fast action. Many broadcasters do not increase the bitrate for sports even though the broadcast would benefit from it. Adding insult to injury, many broadcasters use 1080i rather than 720p, and many TVs do deinterlacing badly. Images look OK on a news programme, but with any movement the details get blurred.
      • While it may be true that some TVs are not good at deinterlacing, interlacing was in fact designed to smooth motion, not blur it.

        Theoretically, assuming your hardware is capable of doing it properly, a progressive-scan (p) video would need to have nearly twice the framerate of an interlaced (i) video, in order to achieve the same motion smoothness.
    • Yeah, any tool that was capable of doing this on DRM'd sources would be sued to oblivion and back.

    • Agreed... of course, they're not looking at the bitstream itself; the analysis can be done "post transform", and anything stored at that point would only be a lossy generational copy.

      The technical solution to this would be to:
      a) split the video signal between an HDCP-compliant screen and a capture card (such as Black Magic or preferably DekTec)
      b) capture the frames and perform full-frame FFTs to identify the frequency distribution. A large-area DCT would work extremely well. Another option is to use wavelets and identify act
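
      As a rough sketch of what step (b) might look like, here is one way to score a single captured frame by how much of its spectral energy sits in the higher spatial frequencies (numpy/imageio, the file name and the 0.25 cutoff are all illustrative assumptions, not anything a capture card vendor supplies):

        import numpy as np
        import imageio.v3 as iio  # any frame grabber will do; imageio is just an assumption

        def high_frequency_ratio(frame, cutoff=0.25):
            """Fraction of spectral energy above `cutoff` of the Nyquist frequency.
            Softer (blurrier / lower-bitrate) frames tend to score lower."""
            gray = frame.mean(axis=2) if frame.ndim == 3 else frame
            spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray))) ** 2
            h, w = spectrum.shape
            yy, xx = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
            radius = np.sqrt((yy / (h / 2)) ** 2 + (xx / (w / 2)) ** 2)  # 0 = DC, 1 = Nyquist
            return spectrum[radius > cutoff].sum() / spectrum.sum()

        frame = iio.imread("capture_frame_0001.png")  # hypothetical frame dumped by the card
        print(f"high-frequency energy ratio: {high_frequency_ratio(frame):.4f}")

      Averaging that ratio over many frames and channels would give the kind of single "sharpness score" the submitter is asking for, though it says nothing about fidelity to the source.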
      • That will still be a problem, as any properly HDCP-compliant capture card will not (or should not!) give you access to the data either. Every device should check the downstream devices are compliant before passing any data on, and not provide any way to intercept it. Any that knowingly do are breaking the HDCP licensing agreement and are likely to be sued.

  • by Solandri ( 704621 ) on Saturday December 15, 2018 @04:53PM (#57809568)
    It's created by degrading the video in a process called unsharp masking [wikipedia.org]. Basically you find the edges between light and dark objects and exaggerate them: you make the light side of the edge lighter and the dark side darker. This adds no information; in fact it degrades video quality relative to the original unsharpened version (it loses information).

    But your brain has special cells which recognize transitions between light and dark and identify them as an edge. When the exaggerated transition from an unsharp-masked edge hits those cells, they get more excited and signal the rest of your brain that this is a really strong edge, creating the illusion that you are seeing a sharper image when objectively it's a degraded one.

    So any algorithm which detects "sharpness" as interpreted by the brain would actually rate inferior (heavily processed) video streams higher than video streams conveying the maximum amount of information possible. It's why unsharp masking is typically added by the TV or video player, rather than incorporated into the original video stream. (A slight amount of sharpening is done to counteract the blurring caused by the Bayer filter used in camera sensors, but that's another story.)
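
    For reference, a minimal sketch of the effect using Pillow's built-in UnsharpMask filter (the file names are placeholders, and the radius/percent/threshold values are just typical settings, not anything a broadcaster actually uses):

      from PIL import Image, ImageFilter  # Pillow

      original = Image.open("frame.png")  # placeholder: one frame pulled from a capture
      # Exaggerate light/dark transitions: blur radius, strength in percent,
      # and a threshold below which pixels are left untouched.
      sharpened = original.filter(ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))
      sharpened.save("frame_sharpened.png")  # looks "crisper", but carries less information

    A naive edge-energy "sharpness" metric would score frame_sharpened.png above frame.png, which is exactly the problem described above.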
    • How does it lose information if it changes the information it finds?
      That sounds pretty odd

      • by Anubis IV ( 1279820 ) on Saturday December 15, 2018 @11:23PM (#57810938)

        Sharpening the image is an inherently lossy process. You're replacing a more detailed image containing a soft edge with something that has a crisp edge with less gradation. You can't then reverse the process to get back to the original image, since there'd be no way to tell the difference between things that actually had a sharp edge in the original image and things that had merely had their edges sharpened after the fact. They'd both look the same.

        Frankly, sharpness is a really lousy measure for determining how good an image is. It doesn't make images look better, in and of itself, and it's something that the user can already introduce themselves if that's where their preferences lie, simply by bumping up the sharpness on their TV. Of course, if your goal is to have the highest fidelity image, you shouldn't set your TV's sharpness to anything over 0. Back in the day of analog CRTs it used to be the case that sharpness might improve the fidelity, but ever since the switch to digital LCDs anything more than 0 adds sharpness that wasn't there in the original image, thus decreasing the visual fidelity of the resulting image.

      • by MrL0G1C ( 867445 )

        Because a sharpened image overwrites some parts of the image with false information; see a good example here:
        https://www.youtube.com/watch?... [youtube.com]

    • by AmiMoJo ( 196126 )

      The questioner is hopelessly confused anyway. For example, YouTube has some of the best streaming quality, especially with 4k videos. What makes YouTube videos look bad is when the person editing and uploading the video doesn't know what they are doing.

      It certainly would be possible to create some kind of measure of video quality, but it could be gamed and would be of limited use. All you can do is verify that the specs that the streaming services claim are true (e.g. 1080p frames are being sent) and look a

  • by Anonymous Coward

    Video quality is a function of codec and bitrate. It will always be a compromise between image quality, file size and decoding power. Even if the service wanted to serve users super-sharp, high-preset H.265 video + FLAC audio, most users would be unable to use it - file sizes would eat their download caps and their netbooks/phones would be unable to cope with decoding. Obviously on the service side they also don't want to pay for storage and transfer of those huge files, especially if they would be hardly watched

  • Video metrics (Score:4, Informative)

    by sanf780 ( 4055211 ) on Saturday December 15, 2018 @05:05PM (#57809610)
    There are already known metrics, like PSNR and SSIM. The thing is, these metrics are used when you encode the videos from the raw material and there is a tradeoff between bitrate and the desired metric.

    The only thing you can more or less do on the broadcast video is identify encoding artifacts like macroblocks, mosquito noise and deinterlacing side effects. MPEG2 streams are easy to identify; H264 and H265 are slightly more difficult. I do not know of a packaged solution for this. Be aware of news stations that use low-fi mobile feeds!
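
    To make the reference-based nature of these metrics concrete, here is a minimal sketch comparing one received frame against the matching frame from the original master using scikit-image (the file names are placeholders, and channel_axis assumes a recent skimage version; without access to the source there is nothing to pass as the reference):

      import imageio.v3 as iio
      from skimage.metrics import peak_signal_noise_ratio, structural_similarity

      reference = iio.imread("source_frame.png")    # frame from the original master
      received  = iio.imread("captured_frame.png")  # same frame as delivered by the service

      # Both metrics are relative: they only say how far the received frame drifted
      # from the reference, not how "good" the reference itself is.
      print("PSNR:", peak_signal_noise_ratio(reference, received))
      print("SSIM:", structural_similarity(reference, received, channel_axis=-1))

    That is exactly why these numbers are normally computed at encode time, when the raw material is still at hand.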

  • by wolfheart111 ( 2496796 ) on Saturday December 15, 2018 @05:12PM (#57809644)
    It was on G... you could have done it yourself, but here you are. http://www.compression.ru/vide... [compression.ru]
  • by Anonymous Coward

    There are already algorithms for doing this, like PSNR, SSIM, and VMAF. Here's a good read: https://medium.com/netflix-techblog/toward-a-practical-perceptual-video-quality-metric-653f208b9652

    And there are already commercial solutions that apply these metrics as well as network analysis etc. across a range of video delivery methods, e.g.: https://www.telestream.net/iq/
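
    For a do-it-yourself version of the same idea, ffmpeg's built-in ssim filter can be driven from a short script; a sketch, assuming ffmpeg is on the PATH and that you actually have a reference copy to compare against (the file names are placeholders):

      import subprocess

      # Compare a captured stream against a reference using ffmpeg's ssim filter.
      # Both inputs need the same resolution and frame alignment for the score to mean anything.
      cmd = [
          "ffmpeg", "-i", "captured.mp4", "-i", "reference.mp4",
          "-lavfi", "ssim", "-f", "null", "-",
      ]
      result = subprocess.run(cmd, capture_output=True, text=True)
      for line in result.stderr.splitlines():
          if "SSIM" in line:  # ffmpeg prints the summary line on stderr
              print(line)

    VMAF works the same way through the libvmaf filter, but only in ffmpeg builds compiled with libvmaf support.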

  • You need to check the specific streaming service's terms of use to see what they say about their streaming. Most likely most will have a way out if the video is not up to par. Also, the streaming quality is sometimes based on the actual speed of your internet connection at the time of streaming (Amazon Prime, Netflix, etc.). In addition, some service providers will throttle the stream - especially on mobile.

    IMO, getting a measuring tool would really be a waste of time.

  • by Graymalkin ( 13732 ) * on Saturday December 15, 2018 @05:19PM (#57809674)

    What you want is a way to tell not that the video is "sharp" but how closely it matches the original source in fidelity. If all you looked at was sharpness, a scene with motion blur or an out-of-focus effect would run afoul of the algorithm. Your end goal of name-and-shame is incredibly difficult to do well or even accurately.

    There are ways to get video quality measures (SSIM, MSE, etc.), but they require a comparison to the source (or a source you consider good enough), as they're relative measures. Interpreting the results is also not obvious.

    Besides the challenge of the comparison, it's also important to understand the video pipeline from raw source to what you see on screen. The provider might be sending a high-quality stream, but the scaler in your TV might suck and muddy the image on display, or your LCD panel's dithering might be really shitty.

    Video, especially streaming video (either Internet streaming or live delivery like cable), is really complicated. There's a lot of different dimensions where you might judge "quality" and even then it's an envelope and not a single scalar value. There's no objective "good" reference for any recorded scene, even the concept of "life-like" is not clear since the recording is entirely dependent on physical properties of the equipment.

  • Just use bitrate. (Score:5, Insightful)

    by sonoronos ( 610381 ) on Saturday December 15, 2018 @05:20PM (#57809680)

    There's already a metric that basically defines video sharpness: bitrate.

    The sharpness of H.264 and H.265 is very well known. Since commercial streaming services use commercial streaming video codecs, it's a pretty safe bet that you can almost directly correlate resolution to bitrate.

    There's virtually no incentive for streaming companies to deliver lower resolutions at higher bitrates. It would be a technical challenge to deliver higher resolutions at higher bitrates.

    Therefore, bitrate is most likely the simplest and most accurate measure of streaming video sharpness.
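
    If bitrate is the number you want, it is at least easy to pull out of a capture; a sketch using ffprobe (assuming the stream was dumped to a file, ffprobe is on the PATH, and the container actually reports an overall bit_rate - some don't):

      import json
      import subprocess

      def container_bitrate(path):
          """Return the overall bit rate (bits/s) that ffprobe reports for a capture."""
          out = subprocess.run(
              ["ffprobe", "-v", "error", "-show_format", "-of", "json", path],
              capture_output=True, text=True, check=True,
          ).stdout
          return int(json.loads(out)["format"]["bit_rate"])

      print(container_bitrate("capture.ts") / 1e6, "Mbps")  # "capture.ts" is a placeholder

    As the replies below point out, though, the same bitrate can hide very different quality depending on codec, profile and encoder settings.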

    • Sorry, correction to the above: It would be a technical challenge to deliver higher resolutions at LOWER bitrates.
    • Ignoring the point that different algorithms can produce different results at the same given bit rate. If there's a video provider that doesn't really care about the service they're offering, they may ensure everything is sent at a particular bitrate. This can result in inconsistent quality between titles if the algorithms that were used to encode it are different (and again, this is for a content provider that doesn't care about what they're doing). Also complex scenes need higher bitrates in order to rend

    • That's what I was going to say. Compare to competitors using the same codec and more or less bitrate = quality. It's not a PERFECT measure, but it's pretty good and dead simple.

      • No, it's not. Every encoder offers configurations that trade off speed against quality. If you don't want to spend much on hardware, or turnaround time is important, you'll reduce the quality irrespective of bitrate.

        Then live encoding is another issue. x264 is very CPU intensive; watch the CPU spike during scene detection or I-frame generation (I'm not sure which, but they could be the same).

        • That's true, one can decide to spend less CPU encoding.

          Having said that, when widely comparing services such as Dish Network vs Comcast vs Frontier, that part would be different only if they are very stupid. For a given level of quality, the higher CPU encoding gives lower bandwidth. Some of us may be used to thinking of higher quality if it were the same bandwidth, but the flip side of that is it also means lower bandwidth for whatever quality level they accept.

          Since they are going to transfer the stream

          • The idea of encoding content once for everyone (in the streaming case) is actually the totally incorrect thing to do. You actually want to encode a single source a number of times with different settings intended for different client environments. If you're Netflix you even encode different segments of each source video to different profiles.

            Storage and CPU are much cheaper than bandwidth. HTTP streaming protocols like HLS and MPEG-DASH break video streams up temporally into time-based segments. The playlis

            • Of course.

              My point is that when 12 million people watched an episode of Game of Thrones, that means it was transferred 12 million times. It wasn't encoded 12 million times. Sure a scene might be encoded 12 times, then each encoded copy was downloaded a million times.

              If I watch a 7Mbps stream of GoT, a million other people watched the same 7Mbps stream. That 7Mbps version only had to be encoded once, for a million viewers, so it would be crazy to skimp on a dollar of encoding time.

    • by guruevi ( 827432 )

      I think the primary problem is the provider and in-between boxes lying about the bitrate. Your cable box can output 4K HDR while your provider may be streaming 240p, which the box upconverts.

    • There is a bigger factor than just resolution when it comes to the required bitrate: the amount of full-screen motion and detail.
      You can have a 4K clip go on forever and require very little bitrate so long as nothing much is actually moving; the P-frames will be almost entirely empty.

      So the bitrate required is a function of the amount of detail/motion necessary to explain it. This makes it heavily scene dependent and the best example of this in recent memory was in Thor: Ragnarok during a fight
    • That is total bollocks. You'd have to be a video encoding total noob to believe it.

      These video standards have various "profiles" that define the encoding "tools" that can be used, and the characteristics of the target decoders. These can vastly change the bitrate required to achieve a given quality. The quality of the encoder design makes a vast difference too. Finally, more so than anything, the characteristics of the input video affect the required bitrate for a given quality. A football game may need 3x th

    • There's already a metric that basically defines video sharpness: bitrate.

      Sorry, but that is just plain wrong. Even with an identical codec the same bitrate can produce wildly different results in terms of perceived image quality, depending on the many hundreds of encoder settings available, including sharpness. On top of that, perceived sharpness can be faked by applying a local contrast increase right at the point of the change in brightness.

      There's virtually no incentive for streaming companies to deliver lower resolutions at higher bitrates.

      Actually there is. Improving image quality costs time, or hardware which directly translates into actual money. e.g. I could live capture video i

  • You are probably not aware of how video streaming works nowadays, and you should read about adaptive bitrate streaming [wikipedia.org].

    TLDR, nowadays content providers encode each video at different resolutions and quality levels (and bitrates), each with a different level of "sharpness", and the client selects the one that best fits its available network bandwidth.

    This means that playing the same video several times may result in different quality each time. This also implies that the quality of the video you receive may
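
    In rough Python pseudocode, the client-side choice looks something like this (the rendition ladder and the 0.8 safety factor are made up for illustration; real players such as hls.js or dash.js use far more elaborate heuristics):

      # Hypothetical rendition ladder from the manifest: (height, bitrate in bits/s), best first.
      LADDER = [(2160, 16_000_000), (1080, 6_000_000), (720, 3_000_000), (480, 1_200_000)]

      def pick_rendition(measured_throughput_bps, safety=0.8):
          """Pick the highest rendition whose bitrate fits within the measured bandwidth."""
          budget = measured_throughput_bps * safety
          for height, bitrate in LADDER:
              if bitrate <= budget:
                  return height, bitrate
          return LADDER[-1]  # fall back to the lowest rung

      print(pick_rendition(8_000_000))  # an 8 Mbps link would get the 1080p rendition here

    So the "sharpness" you measure may say as much about your connection at that moment as about the service itself.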

  • Fundamentally, a 2-D FFT describes the spatial-frequency content present in any image. Sure, you could compare spectra of a known source displayed via different streaming methods on different screens. However, these numbers are meaningless on their own without taking the context of the viewer into account.

    The FFT must include physical dimensions, starting with the screen's PPI value. So a 4K phone screen is inherently "sharper" than a 4K projector on a 20 meter movie screen. But we all know the movie screen offers a f

    • Curious, why do you need that much detail? I can understand that an eye or brain surgeon may want to see more, but for normal TV content? Even in everyday life, as you walk around and see things and people, you only use the necessary and sufficient detail to do object recognition (I see a bit pattern and it matches my known DB of object xyz). I mean, just because modern tech allows something does not mean we need it.
  • by Anonymous Coward

    Dear Salty Dude,

    Your picture quality is crap because streaming services dynamically vary the quality to match the available bandwidth. If you've got less than 20Mbps or if your ISP is throttling your 20Mbps connection so that you can only get 2Mbps from Netflix, then you're not going to see 4K(UHD). https://fast.com

    Even if you could programmatically characterize the quality of the video, your salty rants are not going to shame the likes of Netflix and Amazon. So don't make a spectacle of yourself or waste yo

  • And that's the law!

  • by Anonymous Coward

    As others here have already sort of mentioned: sharpness is the wrong metric.

    Another person mentioned "macro blocks".

  • First of all... sharpness is something you see and experience. There's no way to measure it. In order to measure something you need a unit, and that doesn't exist. Don't tell me it's 1080p or 4K or 5K; those are not units of sharpness. If you want to measure sharpness, you're at least expecting a certain value, which you don't even know, because again there's no unit.
  • It is very annoying to watch videos on YouTube only to figure out that many of them are ripped off from someone else. These thieves alter the video (changing the speed, mirroring it, and other methods) so YouTube doesn't kill it, which tends to make it unwatchable.
  • FFmpeg can do PSNR calculations. The only problem with what you call 'sharpness' (in technical terms, we call it noise or signal-to-noise ratio) is you have to have something to compare it with. Movie and video makers often use 'blur' either as a proxy to indicate motion or to put emphasis on the thing in the story that the viewer should focus on. It's also natural for things that are in the 'focus' of the camera to be sharp and everything else to be blurry (unsharp). So measuring 'sharpness' of an image is

  • Measuring sharpness is possible, but that would mean nothing. There are many image-sharpening algorithms. Any self-respecting editor can play with sharpness any way they want, and sharpening is even built into some TVs and video players.

    What you want is a measurement of quality and it can't be done without a reference. Think about it, if you can tell how close a video is to the reference without the reference, then the same algorithm could be used to reconstruct the original from a degraded video. And guess what, comp

  • Look at the maximum bandwidth used by a broadcaster around the world, not tests over ADSL or a low-end 3rd-world consumer ISP.
    Work out what the original material was created in: film, HD, 4K, 8K.
    Find the same show globally.
    Sort by the bandwidth, compression and chipsets used.
    Is the operator only selling so much bandwidth to broadcasters?
    Are 2nd- and 3rd-world nations' broadcasters trying to fit many shows into a set amount of bandwidth they can use in that part of the world?

    It's not a conspiracy to make TV look bad in 2nd and 3
  • Since transmission over the last mile is the expensive part, why not just check the received stream's bps? You can assume that they're not going to upsample it and pay more just for the last mile.

  • That reminds me of the abhorrent sharpness setting most TVs have, even 4K TVs.

    Sharpness is not a measure of quality. Quality is a measure of accurate reproduction - how closely does the stream match a high-quality version such as a 4K Blu-ray copy, etc.

    If you introduce measuring sharpness as a way of measuring quality then the end result is cheating that will artificially make all streams sharper at the expense of picture quality.

    So, get it right, sharpness != qual

  • Why sharpness? (Score:4, Informative)

    by thegarbz ( 1787294 ) on Sunday December 16, 2018 @07:14AM (#57811744)

    Image quality is not determined by sharpness. This idea is the reason why those horrible TVs apply a post-process sharpness increase to content (the first thing you should turn off when you buy a TV).

    Quality covers a large number of metrics including artifacting, posterisation, loss of colour fidelity, and loss of contrast ratio to name a few.
