September 2005
Philip J. Corriveau - Intel, Slide 1
doc.:IEEE 802.11-05/0887r0
Submission
Video Testing Strategy
Notice: This document has been prepared to assist IEEE 802.11. It is offered as a basis for discussion and is not binding on the contributing individual(s) or organization(s). The material in this document is subject to change in form and content after further study. The contributor(s) reserve(s) the right to add, amend or withdraw material contained herein.
Release: The contributor grants a free, irrevocable license to the IEEE to incorporate material contained in this contribution, and any modifications thereof, in the creation of an IEEE Standards publication; to copyright in the IEEE’s name any IEEE Standards publication even though it may include portions of this contribution; and at the IEEE’s sole discretion to permit others to reproduce in whole or in part the resulting IEEE Standards publication. The contributor also acknowledges and accepts that this contribution may be made public by IEEE 802.11.
Patent Policy and Procedures: The contributor is familiar with the IEEE 802 Patent Policy and Procedures <http:// ieee802.org/guides/bylaws/sb-bylaws.pdf>, including the statement "IEEE standards may include the known use of patent(s), including patent applications, provided the IEEE receives assurance from the patent holder or applicant with respect to patents essential for compliance with both mandatory and optional portions of the standard." Early disclosure to the Working Group of patent information that might be relevant to the standard is essential to reduce the possibility for delays in the development process and increase the likelihood that the draft publication will be approved for publication. Please notify the Chair <stuart.kerry@philips.com> as early as possible, in written or electronic form, if patented technology (or technology under patent application) might be incorporated into a draft standard being developed within the IEEE 802.11 Working Group. If you have questions, contact the IEEE Patent Committee Administrator at <patcom@ieee.org>.
Date: 2005-09-14
Authors:
• Philip Corriveau, Intel, HF3-96 5200 NE Elam Young Pkwy, Hillsboro, OR 97124, (503) 696-1837, philip.j.corriveau@intel.com
• Rik Logan, Intel, 5200 NE Elam Young Pkwy, Hillsboro, OR 97124, 503-712-1675, rik.e.logan@intel.com
• Neeraj Sharma, Intel, 13290 Evening Creek Drive, San Diego, CA 92128, (858) 385-4112, neeraj.k.sharma@intel.com
• Uriel Lemberger, Intel, PO Box 1659, Matam Industrial Park, Haifa 31015, Israel, +972-4-865-5701, uriel.lemberger@intel.com
• Sasha Tolpin, Intel, PO Box 1659, Matam Industrial Park, Haifa 31015, Israel, +972-4-865-5430, alexander.tolpin@intel.com
Agenda
• Intel Video Strategy Introduction
• Video Quality Definition (Wireless Test Points)
• Subjective & Objective Testing
• Wireless Video Learnings / Adaptability
• Summary
Intel Video Testing Strategy: Concepts and Methods

Philip Corriveau
User Centered Design, Media and Acoustics Perception Lab
Intel Corporation, Hillsboro, Oregon
Intel Expertise
• Philip Corriveau – 16 years of experience; Co-Chair of www.vqeg.org
• Started in HD subjective assessment in 1990
• An internationally recognized video subjective/objective expert is leading Intel's video strategy
• Uses standardized methods from the ITU
Broadcast merges with Personal Computing
MOS = fn(FD, FQ), where FQ = fn(C, FE)

FD – Frames dropped or missing
FQ – Individual frame quality
C – Content being transmitted
FE – Frame encoding technology chosen

Mean Opinion Score – represents real end-users
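The two-level functional form of the MOS model can be sketched in code. This is a minimal illustration of the composition only; the concrete functions, weights, and input scales below are hypothetical and do not come from this submission.

```python
# Minimal sketch of the two-level MOS model:
#   MOS = fn(FD, FQ), where FQ = fn(C, FE)
# The functions and weights below are hypothetical illustrations.

def frame_quality(content_complexity: float, encoder_efficiency: float) -> float:
    """FQ = fn(C, FE): per-frame quality on a 1-5 scale (hypothetical model)."""
    # Harder content lowers quality; a better encoder raises it.
    return max(1.0, min(5.0, 5.0 * encoder_efficiency - 2.0 * content_complexity))

def mean_opinion_score(frames_dropped_ratio: float, fq: float) -> float:
    """MOS = fn(FD, FQ): dropped frames pull the score down (hypothetical model)."""
    return max(1.0, fq * (1.0 - frames_dropped_ratio))

fq = frame_quality(content_complexity=0.5, encoder_efficiency=0.9)
mos = mean_opinion_score(frames_dropped_ratio=0.1, fq=fq)
```

The point of the structure is that frame quality is fixed by the content and the encoder before transmission, while the delivered experience is then further degraded by whatever the link drops.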
Wireless Issues
Wireless setups will vary – PC to PC, PC to DMA, PC to other…

There are 3 basic stages to video analysis:
– Gross Error Detector (Did my video make it there?)
– Signal Integrity Testing (Was my video legal? De-interlaced, etc.)
– Video Quality (Is the quality of the video acceptable to my end-user?)
These stages are used at different test points.

[Diagram: wireless video path – PC to DMA to TV]
Test points
• Was the quality good enough to start with? Computer processing, encoder, etc. were OK.
• Did all my information make it over the link? Am I dealing with adaptive coding?
• Was there any effect of the DMA? Scaling, colour conversion, etc.
• What is the end result? Acceptable to the end-user?

Multiple test points exist in any video path!
Gross Error Detector
• Self-contained package
– Works using coded blocks inserted in the source
– Controlled sources are preprocessed
– More in a minute…
• Concept…
– If you cannot deliver a smooth experience, the end-user experience degrades rapidly

Step one – Did enough information reach the user not to terminate the experience?
Signal Integrity
• Checking to ensure proper video
– Some of this is checked in subsequent steps
• Colour spaces
• De-interlacing
• Frame-rate conversion
• Signal-to-noise ratio
• STL – lab tests from Microsoft – MCE test points

Was the video sent considered legal video?
Quality
• Subjective quality
– Expert – Philip Corriveau
– Non-expert
– Most reliable
– Uses real people
– Conducted in a controlled environment
– Used to correlate objective model performance
• Objective quality
– Software – portable and usable
– Benchmarked to subjective results
– Standardized
– Black-box, non-adaptive algorithm

Was the quality of the video information acceptable?
VQM – The Objective Visual Quality Metric
• What is it?
– VQM is a free, industry-accepted metric for evaluating video quality
– Tested by the Video Quality Experts Group (www.vqeg.org)
– General model standardized in 2003 (ANSI T1.801.03-2003)
– ITU-T J.144 quality metrics standard
• What it does
– Compares a source clip and a processed clip, calculating a score that correlates to non-expert subjective assessment
– The quality of the source is treated as "perfect" for the application; you measure how well the source was reproduced
• What it does not do
– Provide an automated, batch-type mechanism for push-button video pass/fail

Objectify the subjective measurements of video quality.
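The source-versus-processed pattern described above can be shown with a minimal full-reference comparison. PSNR is used here purely as a simple stand-in; the actual VQM general model (ANSI T1.801.03-2003 / ITU-T J.144) extracts perceptual features rather than raw pixel error.

```python
# Minimal full-reference comparison in the spirit of VQM's workflow:
# compare a source clip against a processed clip and compute a score.
# PSNR is only an illustrative stand-in for the VQM model itself.
import math

def psnr(source: list, processed: list, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio between two equal-length pixel arrays."""
    assert len(source) == len(processed)
    mse = sum((s - p) ** 2 for s, p in zip(source, processed)) / len(source)
    if mse == 0:
        return float("inf")  # identical clips: no measurable distortion
    return 10.0 * math.log10(peak * peak / mse)

src = [100, 120, 130, 140]
proc = [101, 119, 131, 139]  # lightly distorted copy of the source
score = psnr(src, proc)
```

The key property shared with VQM is that the metric is *relative*: the source is treated as perfect, so the number measures reproduction fidelity, not absolute quality.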
VQM – Full Reference

[Diagram: the original video and the processed video are compared to produce a score that correlates to subjective assessment. The subjective test paradigm presents a reference clip and a test clip, with responses collected over time on the 5-point impairment rating scale: Imperceptible (5), Perceptible but not annoying (4), Slightly annoying (3), Annoying (2), Very annoying (1).]
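The 5-point impairment scale can be treated as a simple lookup over an objective score. The VQM general model nominally reports roughly 0 (no perceived impairment) to 1 (severe impairment); the band edges below are hypothetical, chosen only to illustrate the mapping, not calibrated values.

```python
# Map an objective score (lower is better, roughly 0..1) onto the
# ITU 5-point impairment scale. Band edges are hypothetical.

IMPAIRMENT_SCALE = [
    (0.2, 5, "Imperceptible"),
    (0.4, 4, "Perceptible but not annoying"),
    (0.6, 3, "Slightly annoying"),
    (0.8, 2, "Annoying"),
    (float("inf"), 1, "Very annoying"),
]

def to_impairment(objective_score: float):
    """Return (grade, label) for an objective score (lower is better)."""
    for upper, grade, label in IMPAIRMENT_SCALE:
        if objective_score < upper:
            return grade, label

grade, label = to_impairment(0.15)
```

Any such mapping must itself be validated against subjective data, which is exactly the calibration role the subjective tests play in the slides that follow.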
VQM Issues
• VQM can tell you whether artifacts happen
– i.e. blocking, blurring, color loss/gain, luminance loss/gain, etc.
• But it cannot replace the human perception of the seriousness of artifacts
– The "goodness" of the processed clip is highly subjective, and a function of many psychological factors
– Scene type, source material color/brightness, motion, detail, etc.
• Therefore, it cannot be fully automated
– You cannot batch thousands of tests in a regression suite with a simple pass/fail
– A human being must correlate the results
• Example on next slide
Example
• Both videos have blockiness in them.
• VQM will report the same level of blockiness in the two sequences.
• However, the perception of this blockiness is different:
– In the top sequence, the blockiness occurs in an area that is visually "acceptable" (the background).
– In the bottom sequence, the blockiness occurs in an area that is visually "unacceptable" (Susie's face).
VQM and Validation
• To measure video quality, therefore, a trained human is also needed.
• This means we cannot run thousands of clips and say pass/fail.
• We must therefore limit the clips to short sequences, with specific features that can create known artifacts when processed improperly.
– People can be trained (by Intel's video expert) to spot these features
– VQM can be used once a level set is created, to judge intermediate changes
– Periodically, the VQM result must be recalibrated with a human eye

A black box that is non-adaptive needs checks and balances!
What we have not addressed!
• Audio – where is it? Not here… Why?
– Approaches to coupling audio and video change
– Separate transport streams
– Same transport stream
– Sync issues
• Focused on audio and video separately, since there are no standardized tools to look at both!
Wireless Testing Philosophy
First set of Experiments
Intel
First tests
We watched HOURS of video looking for frame drops and errors…
In the Grove, California – you can imagine the fun we had.
Then we got smart!
Measuring Video Quality – The Wireless Experiments
• First-pass testing was done using the eye and a video camera.
• The GED now solves these hurdles, and a capture system allows for analysis.
• Standardized sequences – always important.
• Multiple passes required to understand link performance.

If you blink you will miss the artifact… or even a series.
Clips Tested – All standard and accepted
• Building – Helicopter flyover of Manhattan; shot of the skyline. Lots of horizontal/vertical lines, Statue of Liberty in the distance.
• Flowers – Zoom out on flowers blowing in a breeze with leaves falling around. High color saturation, random movement.
• Football – Football game – it's a fumble! High motion.
• Mobile and Calendar – High color saturation, slow movement, toy train against a moving calendar.
• Ship – 1700s-style ship in dock, waves and breeze lapping at flags. Windmills turning in the distance. Little motion, high detail.
• Susie – Close-up of a woman talking on the phone. The human eye is very sensitive to faces, so it will pick up errors that may be hard to see elsewhere.

Varied content is EXTREMELY important!
Typical Failure
• Dropped frames
• Slide show
• Stalled video – see time code on capture below (over a 7-sec stall)

The Gross Error Detector will capture these types of errors.
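A stall like the 7-second one above can be flagged directly from captured frame time codes. A minimal sketch, where the 0.5 s threshold and function name are illustrative rather than values from this document:

```python
# Flag stalls from captured frame time codes: a stall is any gap
# between consecutive presentation times that exceeds a threshold.
# The 0.5 s threshold is an illustrative assumption.

def find_stalls(timestamps, threshold=0.5):
    """Return (start_time, duration) for every gap longer than threshold."""
    stalls = []
    for prev, cur in zip(timestamps, timestamps[1:]):
        gap = cur - prev
        if gap > threshold:
            stalls.append((prev, gap))
    return stalls

# 30 fps playback with one 7-second stall injected after t = 0.1 s
times = [0.0, 1 / 30, 2 / 30, 0.1, 7.1, 7.1 + 1 / 30]
stalls = find_stalls(times)
```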
Wireless Adaptability
• Interference and video performance – microwave, baby monitor
– Totally disrupts the video experience (stops the stream)
• Transport – BW and its impact on the video's ability to be trans-rated
– Link instability leads to tricky adaptive requirements
• UDP and HTTP
– Define delivery stats of packets (is the video frame on time?)
• Player differences exist
– Some players pause and wait, then play; others fast-forward to catch up
• Encode technology
– Different encode technologies perform differently
Gross Error Detector Concept
1. Encode source material with GED frame identifiers
2. Convert to the desired compression format
3. Play video through the system under test and capture the results
4. Analyze GED markers and playback timestamps to detect dropped, delayed, repeated or out-of-sequence frames
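Step 4 can be sketched, assuming the decode stage yields one integer marker ID per captured frame (decoded from the color blocks). The function name and error taxonomy below are illustrative, not the actual GED implementation.

```python
# Sketch of GED step 4: classify playback errors from the sequence
# of marker IDs decoded from the captured frames. Decoding the
# color-block markers themselves (steps 1-3) is outside this sketch.

def analyze_markers(ids):
    """Classify dropped, repeated, and out-of-order frames from marker IDs."""
    dropped, repeated, out_of_order = [], [], []
    prev = None
    for i in ids:
        if prev is not None:
            if i == prev:
                repeated.append(i)                   # same frame shown twice
            elif i < prev:
                out_of_order.append(i)               # frame arrived late
            elif i > prev + 1:
                dropped.extend(range(prev + 1, i))   # gap in the sequence
        prev = i
    return {"dropped": dropped, "repeated": repeated, "out_of_order": out_of_order}

# Captured sequence: frame 1 repeats, frames 2-4 are missing, frame 3 arrives late
report = analyze_markers([0, 1, 1, 4, 3, 5])
```

This is exactly why the markers must survive compression: the analysis only needs the ID sequence, not the picture content.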
GED Methodology
• GED frame identifiers consist of a sequence of color blocks (the GED encode)
• Anomalies in the color sequence indicate playback errors
GED Methodology

[Diagram: 0. Source material → GED encode → 1. Marked source material → WMV encode → 2. Compressed and marked → System under test → 3. Capture results → GED decode → 4. GED analysis]
GED Final stage & handoff
• Once complete, check pass/fail criteria.
• If fail – report out and start again.
• If pass – create files for processing with VQM to investigate quality:
– File conversions to the UYVY colour space
– Check file size
– Account for temporal alignment
– Hand off to the next module

The Gross Error Detector assesses experience before investment in quality evaluation.
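The file-size check in the handoff list is mechanical for raw UYVY captures: UYVY is a packed 4:2:2 format at 2 bytes per pixel, so an intact clip must be exactly width × height × 2 bytes per frame. A sketch, with hypothetical helper names:

```python
# Size sanity checks for raw UYVY clips (packed 4:2:2, 2 bytes/pixel).
# Helper names are illustrative, not from the GED tooling.

def expected_uyvy_size(width: int, height: int, frames: int) -> int:
    """Byte count of a raw UYVY clip."""
    return width * height * 2 * frames

def frame_count(file_size: int, width: int, height: int) -> int:
    """Infer frame count from file size; raise if the size is not frame-aligned."""
    frame_bytes = width * height * 2
    if file_size % frame_bytes != 0:
        raise ValueError("truncated or misaligned UYVY capture")
    return file_size // frame_bytes

size = expected_uyvy_size(720, 486, 300)   # 10 s of 525-line video at 30 fps
n = frame_count(size, 720, 486)
```

A size mismatch at this stage signals a truncated capture, which would otherwise silently break the temporal alignment VQM depends on.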
Summary
• IEEE TGT is starting to standardize the approach to video quality assessment.
• Logical steps have been identified, and tools are being developed and slowly deployed throughout Intel and the industry.
• We are working closely with standards groups to further develop, deploy and validate methods:
– ITU
– VQEG
– ANSI
– SMPTE
– IEEE
Thank you !
Questions?
Comments?
philip.j.corriveau@intel.com