How we test Video at Skype April 2012
Oksana Dementsova
• SDET, Video in Tallinn office
• Microsoft Platforms Team
oksana.dementsova
Slide 2
Agenda
1. Skype Video Team
2. How we develop Video
3. How we test Video
© 2011 Skype. Commercially confidential. Slide 3
Video Team
Slide 4
Video Team • Video Team develops Skype Video functionality (Video Library)
• Structure:
• Video Developers team: 18 Developers
• Video QE team: 9 Quality Engineers (QEs)
• Location: Tallinn and Stockholm
Slide 5
Development process
Slide 6
Video Library • Platform
• Microsoft Platforms
• iOS, OSX Platform
• Embedded and Android
• Streaming
• Codec, processing
Slide 7
Video Library release • RV (release vehicle) – a collection of functionality that provides an end user or third party a complete feature set that is valuable in the market
• RVs at Skype: Video, Audio, iPhone UI, Call Signalling
• The Video Library is released every 2 months
• Important fixes are backported to the release
• Other RVs consume the latest released Video Library during development
Slide 8
Development processes • Scrum teams of 2-5 developers, 1-2 QEs
• Sprint length: 2 weeks
• QEs tasks in Scrum:
• Adding Acceptance Criteria to Product Backlog Items (PBIs)
• Creating Testing Product Backlog Items (PBIs)
• Taking part in task estimation
• Working on task during the Sprint
• Reporting on quality status during Sprint review
Slide 9
What else does Video QE do? • We own the product we develop
• We are responsible for quality
• We make the product quality status visible
• We are a communication channel between user and developers
• Technology scouting
Slide 10
QE Team Communications • Why? To share knowledge, update status outside the Scrum team, and get an overall picture of Video Library status
• How? Skype chats; video calls; face-to-face communication; offsites; presentations; workshops
• When? Weekly PPP (Progress, Problems, Plans) update; weekly Video QE meeting; Video QE summits; Video team offsites
Slide 11
Testing process
Slide 12
Tools used for test planning and reporting • Bug tracking system: Jira
• Test repository: TMT
• Documentation, reporting system: Confluence
Slide 13
Testing metrics for real-time video • Objective:
• Frame rate
• Resolution and aspect ratio
• Color space
• Bitrate
• Delay
• Subjective:
• Smoothness, jerkiness, freezes
• Sharpness, pixelation, artifacts, flickering
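As an illustration of the objective metrics above, here is a minimal sketch (not Skype's internal tooling) that derives frame rate and bitrate from a list of received frames, each recorded as a (timestamp in seconds, size in bytes) pair:

```python
# Hypothetical sketch: compute objective metrics from received frames.
# Each frame is (timestamp_s, size_bytes).

def frame_rate(frames):
    """Average frames per second over the capture interval."""
    if len(frames) < 2:
        return 0.0
    span = frames[-1][0] - frames[0][0]
    return (len(frames) - 1) / span

def bitrate_kbps(frames):
    """Average bitrate in kilobits per second."""
    if len(frames) < 2:
        return 0.0
    span = frames[-1][0] - frames[0][0]
    total_bits = sum(size for _, size in frames) * 8
    return total_bits / span / 1000

# 31 frames captured at 30 fps, 4 KB each
frames = [(i / 30, 4096) for i in range(31)]
print(round(frame_rate(frames)))    # 30
print(round(bitrate_kbps(frames)))  # 1016
```

Delay and smoothness metrics would need per-frame send/receive timestamps rather than this simple aggregate view.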
Slide 14
Objective testing • Logs
• Call technical info
Slide 15
Subjective testing • Visually
• Call quality feedback
• Labs
Slide 16
Types of testing • Manual and automated
• Unit, component, system, integration
• Functional and non-functional (NFR – non-functional requirements)
• Performance
• Integration: exploratory, smoke, interoperability
Slide 17
After integration
Slide 18
Smoke testing
Slide 19
Functional testing • Does it work?
Slide 20
Non-functional testing • How does it work?
Slide 21
Interoperability testing
Slide 22
Performance testing
Slide 23
Release testing • Build configurations
• Release testing matrix for Windows desktop:
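The matrix itself is not reproduced in the transcript. As a hypothetical illustration, such a matrix is simply the cross product of build configurations and target OS versions (the names below are invented for the example):

```python
# Hypothetical sketch of generating a release testing matrix as the
# cross product of build configurations and target Windows versions.
from itertools import product

build_configs = ["Release", "Debug", "Release+logging"]
os_versions = ["Windows XP", "Windows Vista", "Windows 7"]

matrix = list(product(build_configs, os_versions))
print(len(matrix))  # 9
```

Each (configuration, OS) pair then gets the release test suite run against it.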
Slide 24
Tools we use
Slide 25
Automatic calling system (ACS) • Internally developed
• Hundreds of computers in the Tallinn and Stockholm offices, plus many individual machines around the world
Slide 26
Automatic calling system (ACS) • Runs thousands of automated 1:1 and conference calls every day on different platforms; test reports are provided
• Test cases specify the number of calls, their duration, idle time, participants, build version, and video input device (camera or screen sharing)
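The test case parameters listed above can be pictured as a small record type. This is a hypothetical sketch; the field names mirror the slide, not Skype's internal ACS format:

```python
# Hypothetical ACS-style test case description (illustrative only).
from dataclasses import dataclass

@dataclass
class AcsTestCase:
    num_calls: int
    call_duration_s: int
    idle_time_s: int
    participants: list      # machine names taking part in the calls
    build_version: str
    video_input: str = "camera"  # or "screen_sharing"

    def total_runtime_s(self):
        # Each call runs for its duration, then machines idle
        # before the next call starts.
        return self.num_calls * (self.call_duration_s + self.idle_time_s)

case = AcsTestCase(num_calls=100, call_duration_s=300, idle_time_s=30,
                   participants=["acs-tln-01", "acs-sto-02"],
                   build_version="5.10.0.116")
print(case.total_runtime_s())  # 33000
```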
Slide 27
Client logging • Logging must be enabled on the client
• Logs are encrypted
• An internal tool allows:
• Decrypting logs
• Parsing logs
• Saving logs
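The decrypt → parse → save stages can be sketched as three small functions. The XOR "cipher" and the `LEVEL: message` line format below are stand-ins for illustration; the real cipher and log format are internal to Skype:

```python
# Sketch of a decrypt -> parse -> save log pipeline (illustrative only;
# XOR is a placeholder for the real encryption).

def decrypt(data: bytes, key: int = 0x5A) -> str:
    return bytes(b ^ key for b in data).decode("utf-8")

def parse(text: str) -> list:
    # Keep only lines that look like log records: "LEVEL: message"
    records = []
    for line in text.splitlines():
        level, _, message = line.partition(": ")
        if message:
            records.append({"level": level, "message": message})
    return records

def save(records: list, path: str) -> None:
    with open(path, "w") as f:
        for r in records:
            f.write(f"{r['level']}\t{r['message']}\n")

plaintext = "INFO: call started\nDEBUG: fps=29.8\n"
encrypted = bytes(b ^ 0x5A for b in plaintext.encode())
records = parse(decrypt(encrypted))
print(len(records))  # 2
```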
Slide 28
Parsing the log files • Example of parsing Windows Phone log lines
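A regex-based parser for video-related log lines might look like the sketch below. The line format here is invented for illustration and is not the actual Windows Phone log format:

```python
# Hypothetical log line parser; the "[HH:MM:SS.mmm] video: key=value"
# format is invented for this example.
import re

LINE = re.compile(
    r"\[(?P<time>\d{2}:\d{2}:\d{2}\.\d{3})\]\s+"
    r"video:\s+(?P<key>\w+)=(?P<value>[\d.]+)"
)

def parse_line(line):
    """Return the captured fields as a dict, or None for non-matching lines."""
    m = LINE.match(line)
    return m.groupdict() if m else None

sample = "[12:03:45.120] video: fps=24.7"
print(parse_line(sample))  # {'time': '12:03:45.120', 'key': 'fps', 'value': '24.7'}
```

Collecting the parsed key/value pairs over a call gives the raw data for the frame rate, bitrate, and delay metrics discussed earlier.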
Slide 29
Helpful tools • Network emulating tools: Dummynet
• Tools to load the PC: CPU killer, CPU burn
• Virtual cameras
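Dummynet is configured through `ipfw` pipes. A typical setup for degrading a call under test looks like the commands below (run as root on FreeBSD or older Mac OS X; the bandwidth, delay, and loss values are illustrative):

```shell
# Send all IP traffic through dummynet pipe 1
ipfw add pipe 1 ip from any to any

# Shape the pipe: 512 kbit/s bandwidth, 100 ms delay, 1% packet loss
ipfw pipe 1 config bw 512Kbit/s delay 100ms plr 0.01

# Remove the rules when the test run is finished
ipfw -q flush
```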
Slide 30
Helpful Windows Phone tools • Bugsense
• Internal tool for saving call stack
• zSystemInfo
Slide 31
Thank you! Questions?
Slide 32