qtdd11_qtmultimediakitonmobile
1. Getting the most out of
QtMultimediaKit on mobile
Gareth Stockwell
2. Accenture
• It's not just a management consultancy ...
• I work for a division called Accenture Mobility
– Provides Mobility aspects of end-to-end solutions
– Work on many mobile/embedded platforms
• Android, iOS, MeeGo, Symbian, Windows Phone ...
– In a wide range of industry sectors
• Healthcare, automotive, retail, consumer electronics ...
– Some of us even get to hack on Qt
3. About me
• Based in London, UK
• 6 years in embedded software
– Mostly on Symbian
– Mostly developing multimedia adaptations, lately some graphics
• 2 years developing Symbian port of Qt
– Phonon MMF backend
– QAudio backend
– QtMultimediaKit, mostly around video rendering
4. Conventions
• Source code for the demos shown is available at
https://gitorious.org/garethstockwell-qtdevdays-2011/mainline
• Path within the repository is shown as
$DD11/examples/foo
• Logos show which platform(s) a topic or example applies
to
5. Overview
• QtMultimediaKit provides cross-platform abstractions
– So why do you need to care whether you are targeting a mobile
device?
– Where are the limitations and leaks in those abstractions?
• How can you tune your code so that it works well in the
constrained mobile environment?
6. Mobile considerations
• Less processing power
• Low latency processing
• Reliance on co-processors
• Multiple connectivity options
• Access point control
• Limited memory
• Memory management and monitoring
• Latency
• Portable – always in your pocket
• Cool demos
8. Sources of latency
[Timeline: Resource allocation → Buffering → Playback]
• Media playback use case
• Real-time use cases
9. Media playback: startup latency
[Timeline: Resource allocation → Buffering → Playback]
MediaStatus: NoMedia → Loading → Loaded → Buffering → Buffered
QML: audio.source = "foo.mp3"; audio.play()
Native: m_audio->setMedia(...); m_audio->play();
• Startup latency can be minimised by pre-loading media
sources where possible
• Buffering latency is not controllable via QtMultimediaKit
high-level APIs
11. Low level audio APIs
[Diagram: QAudioInput → QIODevice; QIODevice → QAudioOutput]
• Push mode: QIODevice *QAudioOutput::start();
+ Easy to use – no need to implement QIODevice
- Requires an extra buffer copy from client data into the audio stack
• Pull mode: void QAudioOutput::start(QIODevice *device);
Best choice for low-latency processing
12. Real time audio latency
[Timeline: Impulse (e.g. user input) → Real-time data generation → Buffering → Playback]
• Keep buffer size to a minimum
– But beware of underflow
• This is controllable via an API
void QAudioOutput::setBufferSize(int value);
– But this isn't supported on Symbian
– Here, you can control the buffering instead via the source QIODevice
qint64 QIODevice::bytesAvailable() const;
13. Real time audio latency
Demo $DD11/qpiano
Hybrid QML / C++ app
Guess what it does ...
15. Access point selection
• By default, the access point will be chosen by the
platform
– May involve popping up a dialog for the user
• Apps may want programmatic control
– For application-specific reasons (e.g. video chat protocol won't
work well over laggy cellular networks)
– To simplify the UX
QList<QNetworkConfiguration>
QNetworkConfigurationManager::allConfigurations(...) const;
void QMediaPlayer::setNetworkConfigurations(
const QList<QNetworkConfiguration> &configurations);
QNetworkConfiguration QMediaPlayer::currentNetworkConfiguration() const;
16. Access point selection
Demo $DD11/networkplayer
Hybrid QML / C++ app
• No access point selection QML bindings provided in QtMultimediaKit
• Network configuration states: Undefined, Defined, Discovered, Active
18. GPU memory constraints
• Current S^3 (Symbian^3) devices have only 32MB graphics memory
– This is shared by the graphics subsystem (GLES and VG
engines) and the multimedia subsystem (video and camera)
– Combination of heavy graphics use cases (e.g. storing large
QPixmaps or textures on the GPU) with MM (e.g. playback of
high-resolution video clips; starting camera viewfinder) can
quickly exhaust the memory
• It can be difficult to understand whether a given bug is
due to GOOM (graphics out of memory)
– There is an API via which GPU memory usage can be queried
– There are some strategies which can be used to minimise GPU
memory usage
• The good news: the next generation of S^3
devices, starting with the C7-01, have 128MB
19. Memory usage monitoring
• EGL_NOK_resource_profiling
– Provides total memory, total used memory, and per-process
usage
– API is not very friendly (particularly when compared to Qt APIs)
– So I wrote a Qt wrapper
class GraphicsMemoryMonitor : public QObject
{
Q_PROPERTY(qint64 totalMemory READ totalMemory
NOTIFY totalMemoryChanged)
Q_PROPERTY(qint64 usedMemory READ usedMemory
NOTIFY usedMemoryChanged)
Q_PROPERTY(qint64 currentProcessUsage READ currentProcessUsage
NOTIFY currentProcessUsageChanged)
...
};
$DD11/snippets/graphicsmemorymonitor
20. Memory usage monitoring
[Chart: total memory vs. total usage vs. current process usage*]
• Not all memory allocations are correctly tagged
– Here, the 3.84MB is the EGL window surface
– Memory consumed by the video decoder is not represented
– So the only reliable measurement is the total usage
Demo $DD11/videoplayer
21. Minimising camera memory usage
Demo $DD11/viewfinder
• The image capture ISP firmware module
consumes several MB
– So if you only need a viewfinder, switch to
video capture mode
m_camera->setCaptureMode(QCamera::CaptureModeVideo);
– You may also need to lower the viewfinder
resolution
QMediaRecorder *mediaRecorder = new QMediaRecorder(m_camera);
QVideoEncoderSettings videoSettings = mediaRecorder->videoSettings();
videoSettings.setResolution(QSize(320, 240));
mediaRecorder->setEncodingSettings(mediaRecorder->audioSettings(), videoSettings);
23. Video post-processing with GLSL
• If QAbstractVideoSurface returns texture handles, we
can use QGLShaderProgram to apply effects
• From Qt 4.7.4, this becomes much easier, as we can
embed GLSL directly in QML via ShaderEffectItem
ShaderEffectItem {
    property variant source: ShaderEffectSource { sourceItem: anItem; hideSource: true }
    property real granularity: 10
    property real targetSize: 256
    fragmentShader: "
        uniform highp float granularity;
        uniform sampler2D source;
        uniform highp float targetSize;
        uniform lowp float qt_Opacity;
        varying highp vec2 qt_TexCoord0;
        void main() {
            vec2 uv = qt_TexCoord0.xy;
            float dx = granularity / targetSize;
            float dy = granularity / targetSize;
            vec2 tc = vec2(dx * (floor(uv.x / dx) + 0.5),
                           dy * (floor(uv.y / dy) + 0.5));
            gl_FragColor = qt_Opacity * texture2D(source, tc);
        }"
}
24. Video post-processing with GLSL
Demo $DD11/qmlvideofx
Pure QML app (!)
• Possible use cases
– Video playback transition effects (e.g. pixellated fade-in)
– Camera 'ageing', applied both to viewfinder and captured images
25. Video and Qt3D
• Qt3D provides the Material abstraction, which can be a wrapper around a texture
– But at present, it doesn't abstract texture streams
• With a bit of C++, we can provide the glue which pipes QAbstractVideoSurface output into a Qt3D material
– Requires definition of some new QML elements

import QtQuick 1.0
import QtQuick3D 1.0
import VideoTexture 1.0
import TextureStream 1.0

Rectangle {
    width: 600
    height: 600
    VideoTexture {
        id: video
        source: "test.mp4"
    }
    Viewport {
        Cube { id: cube }
    }
    TextureStream {
        source: video
        target: cube
    }
}
26. Video and Qt3D
Demo Source not available yet, sorry...
Hybrid QML / C++ app
C++ provides the glue
which sticks
QtMultimediaKit to Qt3D
27. Summary
• Low latency
– Startup
– Steady state
• Network access point control
• Graphics memory usage
– Monitoring
– Reduction
• Advanced video rendering
– Shader effects
– 3D
28. Feedback
Remember to send your session feedback
via the Qt Developer Days app
Get the app by
• Tapping one of the NFC tags on the event floor
• Downloading the “Qt Developer Days” app from Nokia
Store
• Downloading from qt.nokia.com/qtdevdays2011
• Visiting m.qtdevdays2011.qt.nokia.com to use web
version
QtMultimediaKit provides cross-platform abstractions, so the same code should work on both desktop and mobile platforms. But in practice, it often has to be tweaked or tuned to work well in the constrained context of a mobile device. This presentation will look at which aspects of mobile devices are relevant to app developers using QtMultimediaKit, and will provide some tips on how those constraints can be managed. The demos shown have been tested on Symbian and Harmattan.
Demonstrate that pre-loading an audio clip noticeably reduces the startup latency. Show that a second clip can be loaded while the first is still playing – this allows for gapless playlist playback.
On Symbian, buffer size can be managed in pull mode by throttling the amount of data reported by QIODevice::bytesAvailable().
Show that pull mode with a buffer size of ~12-25ms gives the best user experience. Show that further reducing the buffer size causes underflow, leading to nasty audio jittering.
QtMultimediaKit provides various APIs for getting multimedia data from a network stream into the native MM stack. The most high-level is passing a QUrl to QMediaPlayer::setMedia. Where a custom networking protocol and/or DRM scheme is required, the app can manage the network interaction via QTcpSocket, and pass a QIODevice to QMediaPlayer; this is not supported on Symbian due to limitations of the native MM stack. For playback of audio streams which are not encapsulated in containers, a QIODevice can be provided to QAudioOutput. Support for playback of encoded audio streams via QAudioOutput varies by platform; on platforms which support only PCM, the app will need to manage the decoding of the audio stream.
Set up the device beforehand so that it has multiple access points (e.g. cellular and wi-fi). Show that not selecting an access point causes a dialog to pop up when Play is pressed. Show that choosing an AP in this dialog causes the relevant network configuration to go green when playback starts. Reset. Show that selecting an NC from the list causes this to be used without the dialog being shown.
Show this app running, and point out how the memory consumed by the video playback can be calculated.
Show that checking video capture mode leads to a reduction in memory consumption. Show that changing the encoder resolution makes no difference (at least on the C7-01 running qtm-multimedia/master).
Show the demo. Point out that, in Qt4, processing video via ShaderEffectItem is not optimally efficient: it uses an FBO, despite the fact that the video frames are already available as texture handles. This will be resolved in Qt5, in which the texture stream will be directly available to shader effects.