
Video Gross Error Detection




  1. Video Gross Error Detection Royce Fernald, Phillip Corriveau, Audrey Younkin – Intel Corporation

  2. Abstract • Provide an overview of the Gross Error Detection (GED) methodology for measuring the video playback experience • Present initial data on the correlation between objective GED metrics and subjective user opinions • Discuss goals for the GED project and outline current industry standardization efforts

  3. Agenda • Video Gross Error Detection Overview • Gross Error Detection Methodology • Frame Identifier Sampling • Sequence Markers • Temporal Alignment • Intel Video GED Application • GED / Subjective User Experience Correlation • Subjective Assessment Methodology • Results and Analysis • Current Standardization Efforts

  4. Gross Error Detection Overview • Designed to assess the video playback experience • Provides objective data on dropped, repeated, or out-of-sequence frames • Measures frame-rate stability • Suitable for assessing video streaming over unreliable media such as wireless networks • Complements full-reference methodologies such as VQM

  5. GED Methodology • Instrument video clips with a deterministic sequence of color blocks • Play video through system under test and capture the results • Measure missing or repeated frames by examining capture files for the expected frame identifiers

  6. Platform Independence • The GED methodology provides a way to rate the user video experience of any platform or device • Since GED operates directly on capture files, it can characterize the user experience impact of various display technologies, network transports, operating systems, streaming applications, media players and compression formats

  7. GED Frame Identifiers (Figure: Frames 1–3 of the original sequence shown alongside the same frames GED encoded with color-block markers)

  8. Detection of Repeated Frames (Figure: Frames 1–3 original vs. GED encoded; the green marker appears twice, revealing a duplicated frame)

  9. Detection of Dropped Frames (Figure: Frames 1–3 original vs. GED encoded; the green marker is missing, revealing a dropped frame)

  10. GED Methodology – Workflow: 0. Source Material → [GED Encode] → 1. Marked Source Material → [Video Encoder] → 2. Compressed and Marked Clip → [System Under Test] → 3. Capture Results → [GED Decode] → 4. GED Scoring

  11. GED Scoring • GED provides an overall “Gross Error” metric, which is the sum of the dropped, repeated and out-of-sequence frames in the test clip • GED scores correlate strongly with subjective opinions of video playback smoothness (user experience)
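The overall "Gross Error" metric described above can be sketched in Python. This is a minimal illustration of the idea (sum of dropped, repeated, and out-of-sequence frames), not the Intel tool's exact algorithm; all names are hypothetical:

```python
def gross_error_score(decoded_ids, expected_count):
    """Compute a gross-error count from a decoded frame-identifier stream.

    decoded_ids: frame identifiers recovered from the capture, in capture order.
    expected_count: number of distinct frames the source clip contains.
    """
    repeated = 0
    out_of_sequence = 0
    seen = set()
    prev = None
    for fid in decoded_ids:
        if fid == prev:
            repeated += 1                 # same identifier twice in a row
        elif prev is not None and fid < prev:
            out_of_sequence += 1          # identifier went backwards
        seen.add(fid)
        prev = fid
    dropped = expected_count - len(seen)  # identifiers never observed
    return dropped + repeated + out_of_sequence
```

For example, a capture decoding to [0, 1, 1, 2, 4] against a 5-frame source scores 2 (one repeat of frame 1, one drop of frame 3).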

  12. Agenda • Video Gross Error Detection Overview • Gross Error Detection Methodology • Frame Identifier Sampling • Sequence Markers • Temporal Alignment • Intel Video GED Application • GED / Subjective User Experience Correlation • Subjective Assessment Methodology • Results and Analysis • Current Standardization Efforts

  13. Frame Identifier Sampling • GED averages over the entire color patch, discarding the outside edges • Accommodates blurring, minor compression artifacts and other small changes
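A patch-average sampler of the kind the slide describes might look like this. The edge margin and pixel layout are illustrative assumptions, not the tool's actual parameters:

```python
def sample_patch(pixels, x0, y0, w, h, margin=2):
    """Average a marker patch's RGB channels, skipping `margin` pixels at
    each edge to tolerate blur, ringing, and minor compression artifacts.

    pixels: row-major 2D grid of (r, g, b) tuples (a hypothetical layout).
    """
    sums = [0, 0, 0]
    count = 0
    for y in range(y0 + margin, y0 + h - margin):
        for x in range(x0 + margin, x0 + w - margin):
            r, g, b = pixels[y][x]
            sums[0] += r
            sums[1] += g
            sums[2] += b
            count += 1
    return tuple(s / count for s in sums)
```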

  14. Color Sampling Methodology • GED frame identifiers consist of fully saturated colors to tolerate compression, digital-to-analog conversion, color space changes and capture-device deltas • Basic Frame Sequence (RGB24 color space):

      Frame 0: R=0,   G=0,   B=0
      Frame 1: R=0,   G=0,   B=255
      Frame 2: R=0,   G=255, B=0
      Frame 3: R=0,   G=255, B=255
      Frame 4: R=255, G=0,   B=0
      Frame 5: R=255, G=0,   B=255
      Frame 6: R=255, G=255, B=0
      Frame 7: R=255, G=255, B=255
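The eight-color basic sequence maps the frame index's three low bits directly onto the R, G, and B channels. A sketch of the encode/decode mapping (the mid-scale decoding threshold and function names are my own assumptions, not from the Intel tool):

```python
def frame_color(index):
    """Map a frame index (mod 8) to one of the eight fully saturated
    RGB24 colors: bit 2 -> R, bit 1 -> G, bit 0 -> B."""
    i = index % 8
    return (255 * ((i >> 2) & 1), 255 * ((i >> 1) & 1), 255 * (i & 1))

def nearest_index(rgb):
    """Decode a captured (possibly color-shifted) sample back to its
    index by thresholding each channel at mid-scale."""
    r, g, b = rgb
    return ((r > 127) << 2) | ((g > 127) << 1) | int(b > 127)
```

Because each channel only needs to land on the correct side of mid-scale, the mapping survives substantial chroma drift through encoding and capture.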

  15. GED Composite Sequence • GED can use composite marker sequences to uniquely identify large numbers of frames while tolerating normal chroma changes during capture • For instance, a 3x3 grid using 8 colors per element allows 8^9, or roughly 134 million, unique frames (enough for 51 days of video at 30 frames per second) • Composite Sequence (RGB24 color space): (Figure: example composite markers for frames 0–10 and 2162688–2162692)
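The 3x3 composite grid is effectively a 9-digit base-8 number, one saturated color per cell. A sketch of that encoding (the cell ordering is an assumption; the actual grid layout used by the tool is not specified here):

```python
def encode_composite(frame_number, cells=9):
    """Encode a frame number as base-8 digits for a 3x3 marker grid;
    each digit selects one of the eight saturated colors.
    Returns digits least-significant cell first."""
    digits = []
    n = frame_number
    for _ in range(cells):
        digits.append(n % 8)
        n //= 8
    return digits

def decode_composite(digits):
    """Rebuild the frame number from the per-cell base-8 digits."""
    n = 0
    for d in reversed(digits):
        n = n * 8 + d
    return n
```

Nine cells of 8 colors give 8^9 = 134,217,728 distinct identifiers, which matches the "51 days of video" figure at 30 frames per second.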

  16. Agenda • Video Gross Error Detection Overview • Gross Error Detection Methodology • Frame Identifier Sampling • Sequence Markers • Temporal Alignment • Intel Video GED Application • GED / Subjective User Experience Correlation • Subjective Assessment Methodology • Results and Analysis • Current Standardization Efforts

  17. Sequence Markers • Start-of-sequence markers can be used to delineate several video clips that have been concatenated together • Start markers also allow padding at the beginning of video clips during analog capture – this padding can be removed to keep the clips in temporal sync • (Figure: example start-of-sequence marker for the 3x3 grid, RGB24 color space)

  18. Sequence Markers – Multiple Sequences • Multiple sequences in the same test clip can be evaluated and scored separately (Figure: one test clip containing Sequence 1, Frames 1–3, followed by Sequence 2, Frames 4–5)

  19. Sequence Markers – Video Capture Padding (Figure: a 3-frame source file next to a 5-frame capture file; the start marker is used to crop the two extra padding frames, marked X)
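The padding-crop step might look like this in outline. The marker representation here is a hypothetical sentinel value; in practice the marker is itself a recognizable color pattern:

```python
def crop_to_start_marker(frame_ids, start_marker):
    """Drop capture padding that precedes the first start-of-sequence
    marker, keeping the capture in temporal sync with the source clip."""
    for i, fid in enumerate(frame_ids):
        if fid == start_marker:
            return frame_ids[i + 1:]  # keep only frames after the marker
    return frame_ids  # no marker found: leave the capture unchanged
```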

  20. Agenda • Video Gross Error Detection Overview • Gross Error Detection Methodology • Frame Identifier Sampling • Sequence Markers • Temporal Alignment • Intel Video GED Application • GED / Subjective User Experience Correlation • Subjective Assessment Methodology • Results and Analysis • Current Standardization Efforts

  21. Temporal Alignment • Missing frames can be replaced and repeated frames deleted to temporally align capture files with source clips • This feature is useful for full-reference tools such as VQM

  22. Temporal Alignment Example – Source Clip of Three Frames (Figure: the capture contains only 2 frames because a dropped frame was detected; after alignment the clip again has 3 frames, with the missing Frame 2 replaced by a copy of Frame 1 and marked accordingly)
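The alignment step described above can be sketched as follows. This is a simplified model that assumes frame identifiers decode cleanly; the real tool's handling of ambiguous frames may differ:

```python
def temporally_align(captured, source_ids):
    """Rebuild a capture so it is frame-for-frame aligned with the source:
    extra repeats are deleted and dropped frames are filled with a copy of
    the last good frame.

    captured: list of (frame_id, frame_data) pairs in capture order.
    source_ids: the frame identifiers the source clip contains, in order.
    """
    by_id = {}
    for fid, frame in captured:
        by_id.setdefault(fid, frame)   # keep the first copy, drop repeats
    aligned, last = [], None
    for fid in source_ids:
        frame = by_id.get(fid, last)   # dropped frame -> repeat previous
        aligned.append(frame)
        if fid in by_id:
            last = by_id[fid]
    return aligned
```

For the slide's example, a capture of Frames 1 and 3 aligned against a 3-frame source yields [Frame 1, copy of Frame 1, Frame 3].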

  23. Agenda • Video Gross Error Detection Overview • Intel Video GED Application • GED / Subjective User Experience Correlation • Subjective Assessment Methodology • Results and Analysis • Current Standardization Efforts

  24. Intel Video GED Application • The Video GED application is a reference implementation of the GED methodology described above • Video GED is freely available in binary form and unencumbered by patents • It calculates Mean Opinion Scores based on Intel’s research

  25. Intel Video GED Application – Screenshot

  26. Agenda • Video Gross Error Detection Overview • Intel Video GED Application • GED / Subjective User Experience Correlation • Subjective Assessment Methodology • Results and Analysis • Current Standardization Efforts

  27. Subjective Assessment Methodology • Video sequences with varying levels of degradation were generated to represent errors generally encountered during video-over-wireless playback • A series of 240-frame Standard Definition (720x486) clips was assembled into a one-hour test session (10 video clips x 5 conditions x 2 error types) • Fifty non-expert subjects (50% male and 50% female) were asked to evaluate each clip for “playback smoothness and fluidity,” not their opinion of the video content • Before each test session, subjects were shown a sample of the best and worst clips to establish a frame of reference and reduce the impact of participants’ inherent biases

  28. After each video sequence, participants were presented with a choice of five adjectives describing their opinion of the video experience on a subjective scale (Figures: the Subjective Video Quality Scale and the trial structure)

  29. Randomization • The presentation order of the video content was randomized using a pseudo-random number generator tool to prevent ordering effects, i.e. to mask any tendency for a participant to rate a clip in relation to the previous one • Randomization is a key element of psycho-visual testing that ensures participants do not see the material in a repeated fashion that would allow a learning effect

  30. Participants Sat at a Predetermined Viewing Distance (Figure: viewing distance of five picture heights (5H), 20° viewing angle)

  31. Subjective Assessment – Workflow Diagram

  32. Agenda • Video Gross Error Detection Overview • Intel Video GED Application • GED / Subjective User Experience Correlation • Subjective Assessment Methodology • Results and Analysis • Current Standardization Efforts

  33. Overall Means for Dropped and Repeated Frames A significant difference was found between dropped frames and repeated frames.

  34. Collapsed Across Dropped and Repeated for Each Condition Plotted means for both error types.

  35. Collapsed Across Dropped and Repeated for Each Clip User ratings are content dependent.

  36. Mean Opinion Score for Dropped Frames Log fit graph for predicting dropped errors.

  37. Mean Opinion Score for Repeated Frames Log fit graph for predicting repeated errors.

  38. Combined Mean Opinion Score for Dropped and Repeated Frames Log fit graph for predicting dropped and/or repeated errors with high correlation.
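A logarithmic fit of the kind shown in these plots can be computed with ordinary least squares on log-transformed error counts. The model form MOS = a + b·ln(errors + 1) and the data below are illustrative assumptions, not Intel's published fit or results:

```python
import math

def log_fit(errors, mos):
    """Least-squares fit of MOS = a + b * ln(errors + 1).

    errors: gross-error counts for each test clip.
    mos: the corresponding mean opinion scores.
    Returns the fitted intercept a and slope b.
    """
    xs = [math.log(e + 1) for e in errors]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(mos) / n
    # Closed-form simple linear regression on the log-transformed axis.
    b = sum((x - mx) * (y - my) for x, y in zip(xs, mos)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b
```

The "+1" shift keeps the error-free case (zero gross errors) on the curve, anchoring it at MOS = a.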

  39. Results and Analysis

  40. Agenda • Video Gross Error Detection Overview • Intel Video GED Application • GED / Subjective User Experience Correlation • Current Standardization Efforts

  41. Current Standardization Efforts • GED was designed to measure the playback experience over unreliable media, such as 802.11 wireless networks • Initial standardization efforts have focused on IEEE 802.11 Task Group T: User-Centric Wireless Performance Prediction • While generally in favor of the GED methodology, the task group would prefer to include GED in the TGT draft only after it has received the endorsement of a recognized body of video experts • VQEG’s input would be extremely helpful in refining GED and furthering the standards process

  42. Conclusion • Video Gross Error Detection provides an efficient, repeatable method of characterizing video playback performance • GED complements quality tools such as VQM, providing a complete picture of the overall user experience • GED is freely available and intended to help the industry perform platform-independent technology comparisons
