Video automation testing at Skype
Pierre Gronlier - [email protected]
Video Software Development Engineer in Test - Microsoft Skype division
April 2012 - Kiev
1 The Video Library
2 Continuous Integration: Building, Testing, Feedback
3 Unit, Component, System testing: some wrappers for testing; Test-Driven Development
4 Cross-platform testing: CI team; plugin mechanisms
5 NFR: definition, KPIs, increase visibility
6 Conclusion
The Video Library
What is Skype made of?

Figure: Inside Skype — two clients, each with a UI on top of Network, Messaging, Video and Audio components.

Figure: Inside the Video Library — Streaming, Video Codec and ToolBox modules, plus <Platforms> code for Apple, Android, Windows, Linux, Embedded, ...

<Platforms> contains platform-specific code such as capturing and rendering methods.
Continuous Integration
Continuous integration means:
building continuously.
testing continuously.
having an immediate feedback.
Quickbuild

For the Video Library alone, there are around 20 different build configurations, across platforms and compilation modes:

release/debug, internal/external, stable/experimental, ...

We have a farm of build machines. To enable compilation and maintenance across platforms, Makefiles drive the compilation and the farm agents are written in Java.
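The platform-times-mode matrix that produces those ~20 configurations can be pictured with a tiny sketch. The axis values below are illustrative assumptions, not the team's actual configuration names:

```python
from itertools import product

# Hypothetical axes: the deck says ~20 configurations exist across
# platforms and compilation modes; these concrete names are assumed.
platforms = ["windows", "linux", "mac", "android", "embedded"]
modes = ["release", "debug", "internal", "external"]

def build_configurations():
    """Enumerate the (platform, mode) pairs a farm agent could pick up."""
    return [f"{p}-{m}" for p, m in product(platforms, modes)]

configs = build_configurations()
print(len(configs))  # 5 platforms x 4 modes = 20 configurations
```

Each string would map to one build job on the farm; the point is only that a handful of orthogonal axes multiplies quickly.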
Figure: HeatMap
Cross branches builds
Example

Network: trunk/, branches/ network-69 (*), network-68, ...
Video:   trunk/, branches/ video-42, video-41 (*), ...
Codec:   trunk/, branches/ codec-23 (*), codec-22, ...

(*) marks the latest Long Term Support branch.
To let two dependent teams develop new features without becoming incompatible, we compile our code against the latest stable release of its dependencies.

In addition to trunk source code, we build our latest Long Term Support branch (*) every time a fix is backported to it.
Mode                  Network  Video     Codec
Video stable          ∅        video-41  codec-23
Video release         ∅        trunk     codec-23
Video experimental    ∅        trunk     trunk
Network release       trunk    video-41  codec-23
Network experimental  trunk    trunk     trunk
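The pinning rules above can be encoded as a small lookup table. This is an illustrative sketch, not the team's actual tooling; the mode keys are assumed names and ∅ is represented as None:

```python
# Each build mode pins every component either to a stable branch,
# to trunk, or to nothing at all (None = the component is not built).
PINS = {
    "video-stable":         {"network": None,    "video": "video-41", "codec": "codec-23"},
    "video-release":        {"network": None,    "video": "trunk",    "codec": "codec-23"},
    "video-experimental":   {"network": None,    "video": "trunk",    "codec": "trunk"},
    "network-release":      {"network": "trunk", "video": "video-41", "codec": "codec-23"},
    "network-experimental": {"network": "trunk", "video": "trunk",    "codec": "trunk"},
}

def checkout_plan(mode):
    """Return the component -> branch pairs a build agent should check out."""
    return {comp: branch for comp, branch in PINS[mode].items() if branch is not None}

print(checkout_plan("video-stable"))
```

A build agent would resolve its mode through such a table before fetching sources, so a team's trunk is always compiled against known-good dependency releases.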
CI as a daily tool

Continuous integration means that:

1 every 10 minutes, a script checks for new commits on the video trunk/ or branches/.
2 once a build for a platform succeeds, it triggers a list of short tests; each test lasts around 30 seconds.
3 at night, a list of longer tests is executed.
4 for every test execution, a report is stored in a database and the results are aggregated on a web page for Devs and QEs.
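The poll-build-test-report loop described in those steps might look like the following minimal sketch. Every hook (revision lookup, build, reporting) is passed in as a callable; all names are illustrative, not Skype's infrastructure:

```python
def run_cycle(branches, get_rev, build, short_tests, report, seen):
    """One poll cycle: for each watched branch, build any new commit,
    then run the ~30-second smoke tests and report every result."""
    for branch in branches:
        rev = get_rev(branch)
        if rev == seen.get(branch):
            continue  # nothing committed since the last 10-minute poll
        seen[branch] = rev
        build(branch, rev)
        for test in short_tests:
            report(branch, rev, test.__name__, test())
    return seen

# Toy usage: one branch, a fake revision source, one smoke test.
results = []
def smoke_decode():
    return True  # stand-in for a real 30-second video test

seen = run_cycle(
    branches=["video/trunk"],
    get_rev=lambda b: 101,
    build=lambda b, r: None,
    short_tests=[smoke_decode],
    report=lambda b, r, name, ok: results.append((b, r, name, ok)),
    seen={},
)
```

Running the cycle again with the same revision does nothing, which is the property that makes a 10-minute polling interval cheap.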
CI as a daily tool
Figure: Test results
The importance of visual feedback
Figure: TVs with build/test feedback
Make it visible!
Unit, Component, System testing
Who writes and maintains the tests? Writing tests is writing code.

When you automate testing, QEs are software developers in test.

The closer and deeper you get to the production source code, the more likely the test will be a developer test.
Figure: Inside Skype — the Network level is wrapped for testing in Python; the client level is wrapped in Lua.

Figure: Inside the Video Library — the Streaming, Video Codec and ToolBox modules are wrapped in Lua and C#; <Platforms> covers Apple, Android, Windows, Linux, Embedded, ...
QE and Devs together
1 Don't wait for developers to write your tests.
2 Define the tests when you define the Acceptance Criteria of your PBI (Product Backlog Item).
3 Evaluate the value of your tests (e.g. code coverage).
4 KISS: Keep It Stupid Short and Simple.
Figure: Test plan
Cross-platform testing
Requirements

We want to be able to:

run our tests on different platforms
run our tests with different builds
retrieve the results of our tests and analyze them
save the result of the analysis
output a report, trigger alarms
The cross-platform CI team can provide :
a pool of devices, platforms and capture devices.
access to various builds.
uniform alarming systems (chat, email, sms).
a database.
a storage space.
It is only a matter of contract definition between you and the CI team.
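That "contract" could be sketched as a small interface that a test team implements to plug into the shared CI. This is a hypothetical illustration; the hook names and the example subclass are assumptions, not the real plugin mechanism:

```python
from abc import ABC, abstractmethod

class CITestPlugin(ABC):
    """Hypothetical contract with the CI team: they provide the device
    pool, builds, database, storage and alarming; the test team fills
    in these hooks."""

    @abstractmethod
    def run(self, device, build):
        """Execute the tests on a pooled device with a given build; return full logs."""

    @abstractmethod
    def analyze(self, full_logs):
        """Reduce full logs to the entry inserted/updated in the database."""

    @abstractmethod
    def alarm(self, entry):
        """Decide whether the shared alarming (chat, email, sms) should fire."""

class VideoSmokeTests(CITestPlugin):
    """Toy implementation showing the shape of a concrete plugin."""
    def run(self, device, build):
        return f"ran on {device} with {build}"
    def analyze(self, full_logs):
        return {"passed": "ran" in full_logs}
    def alarm(self, entry):
        return not entry["passed"]
```

Because the base class is abstract, a team that forgets one hook fails at instantiation time rather than in the middle of a night run.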
How to conceive a modular testing framework?

Figure: Framework — targets produce full logs and reduced logs; reduced logs are inserted/updated as entries in a database, full logs go to a storage server, and a parsing server plus a frontend server handle the web rendering.

Targets: tablets, mobile, notebooks with and without hardware-encoding cameras, desktops — on Windows (desktop + mobile), Linux, Mac (desktop + mobile), Android.
NFR
What is non-functional?

Functional vs Non-Functional:

the video works = we see something
vs
the video has a good quality = we enjoy our video call
Key performance indicators
List of KPIs:
resolution and frame rate
bitrate
dropped frames and freeze durations
frame-quality
. . .
List of use cases:
for every codec
for every media protocol version
1-to-1 call and Group Video Calling
software encoding vs hardware encoding
for di�erent network conditions
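The use-case list above is a cross product of independent dimensions, which is why the matrix grows fast. A sketch with assumed values for each axis (the real codec, protocol and network names are not in the deck):

```python
from itertools import product

# Illustrative axes taken from the use-case list; concrete values are assumed.
codecs = ["codec-A", "codec-B"]
protocol_versions = ["v1", "v2"]
call_types = ["1-to-1", "group"]
encodings = ["software", "hardware"]
networks = ["lan", "3g", "lossy-wifi"]

def usecases():
    """Full cross product of the use-case dimensions."""
    return list(product(codecs, protocol_versions, call_types, encodings, networks))

print(len(usecases()))  # 2 * 2 * 2 * 2 * 3 = 48 use cases
```

Every KPI from the first list is then measured against every tuple here, so adding one value to any axis multiplies the nightly run.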
Pass/Fail vs Score

Figure: test setup — Video Call A with controlled inputs → Network Emulation → Video Call B with analyzed outputs.

Example:

KPI            Functional                   Non-functional
pass/fail      —                            0% → 100%
resolution     ≠ 0x0                        max = VGA (1)
framerate      ≠ 0                          max = 15 fps
bitrate        in the range [20..5000]kb    350 kbps ± 10%
frame-quality  frame exists                 PSNR or SSIM (2) score
...

Everything is automated using stats and feedback values from the Video Library.

(1) VGA = 640x480, QVGA = 320x240, QQVGA = 160x120
(2) Image quality measurement algorithms
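The functional/non-functional split means one measurement yields both a pass/fail bit and a score. A toy sketch for the framerate KPI; the scoring formula and thresholds here are assumptions for illustration, not the team's actual metric:

```python
def evaluate_framerate(measured_fps, target_fps=15):
    """Illustrative KPI evaluation: a functional pass/fail
    ('the video works' = frames are flowing) plus a non-functional
    score in [0..1] against the target framerate."""
    functional = measured_fps > 0
    score = min(measured_fps / target_fps, 1.0) if functional else 0.0
    return functional, score
```

A dashboard can then gate releases on the boolean while trending the score across builds, which is the whole point of scoring instead of only pass/fail.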
How to evaluate the best available quality for a call?

The best quality of a call is given by:

    optimal settings = gcd(sender, receiver)

with

    sender   = gcd(max(Encoding power), max(Network), max(Camera))
    receiver = gcd(max(Decoding power), max(Network), max(Screen))

where, with some simplifications,

    Encoding power = f1(CPU power, Power supply mode, Codec perf.)
    Network        = f2(Bandwidth, RTT, Relay/P2P)
    Camera         = f3(Resolution, Framerate)
    Decoding power = f4(CPU power, Power supply mode, Codec perf.)
    Screen         = f5(Resolution)

(gcd = greatest common divisor)
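Reading "gcd" as greatest common capability: since resolutions and rates are totally ordered, the shared capability of two sides reduces to an element-wise minimum. A toy sketch with made-up capability numbers (all values are illustrative):

```python
# Each capability set is a dict of setting -> maximum the side supports.
def common(a, b):
    """'gcd' of two capability sets: element-wise minimum."""
    return {k: min(a[k], b[k]) for k in a}

# Sender is limited by encoder, network budget and camera (assumed numbers).
sender = common(common({"w": 1280, "h": 720, "fps": 30},    # encoding power
                       {"w": 640,  "h": 480, "fps": 30}),   # network
                {"w": 640, "h": 480, "fps": 15})            # camera

# Receiver is limited by decoder, network budget and screen.
receiver = common(common({"w": 1920, "h": 1080, "fps": 30},  # decoding power
                         {"w": 640,  "h": 480,  "fps": 30}), # network
                  {"w": 800, "h": 600, "fps": 60})           # screen

optimal = common(sender, receiver)
print(optimal)  # {'w': 640, 'h': 480, 'fps': 15}
```

Here the camera's 15 fps ends up capping the whole call even though every other component could do 30, which is exactly the behaviour the formula predicts.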
Compare across revisions / branches
Conclusion
Summary
1 Quick feedback between development and testing.
2 Devs and QE in the same team.
3 Collocation helps a lot !
4 Don't over-complicate your tests/frameworks.
5 Measure the efficiency/value of your tests.
Questions
Thank you for your attention!

Questions?