This guide explains how you can integrate the LuciadCPillar map into the Qt framework.
Rendering the map
The Qt framework separates component interaction from component rendering. To comply with the threading rules of the LuciadCPillar map, you must render the map on the render thread. When you integrate with Qt, you can do so by using a custom QQuickFramebufferObject::Renderer implementation. Qt calls this class on the render thread, and it delegates its render call to the map renderer.
The LuciadCPillar map has a mechanism to notify the UI framework that something has changed, and that it needs to be repainted. The Qt integration code is responsible for making sure that a repaint is scheduled when it gets this notification:
_invalidationCallback = IInvalidationCallback::create([this]() { scheduleRepaint(); });
_map->setMapInvalidationCallback(_invalidationCallback);

void QQuickMapObject::scheduleRepaint() {
  if (thread() == QThread::currentThread()) {
    update();
  } else {
    QMetaObject::invokeMethod(this, "update", Qt::QueuedConnection);
  }
}
High-resolution (HiDPI) display handling
One of the challenges of dealing with a HiDPI display is that, most of the time, the display operates with a display scale set through the operating system. The display scale allows applications and the OS to scale UI elements to a more readable size. To properly support those displays in your application, you must take the display scale into account.
Both Qt and LuciadCPillar offer facilities for supporting HiDPI displays. On the Qt side, the default Qt::HighDpiScaleFactorRoundingPolicy::PassThrough should be retained. On the LuciadCPillar side, you must configure the DPI and the display scale. You can retrieve the display scale programmatically through the QScreen::devicePixelRatio() or QQuickWindow::effectiveDevicePixelRatio() methods.
Qt and LuciadCPillar both express their screen coordinates in device-independent pixels. Internally, those device-independent pixels are multiplied or divided by the display scale to convert them to or from physical screen pixels. This means that you don't need to scale events received from Qt before passing them to LuciadCPillar, or the other way around. You must pass screen coordinates as received.
It’s likely that the display scale or DPI changes during the lifetime of the application. It may change because the user changed their display settings, or because the application window moved to another display with a different display scale or DPI. When that happens, re-apply the new values to the map and the gesture forwarder:
if (_map) {
  auto displayScale = getDisplayScale();
  auto dpi = getDpi();
  _map->setDisplayScale(displayScale);
  _map->setDpi(dpi);
  _gestureForwarder.setDisplayScale(displayScale);
  _gestureForwarder.setDpi(dpi);
}
Event handling
Overview
You must first convert events received from the Qt framework to LuciadCPillar input events. To help you with this, the Qt sample integration project offers the GestureForwarder utility class.
You can then pass these LuciadCPillar input events to the MouseGestureRecognizer and TouchGestureRecognizer classes. These utility classes help you convert raw input events into higher-level gestures that are easier to consume for IController implementations. For example:
- Mouse-pressed, mouse-move, and mouse-released events are converted to mouse drag or click events.
- Touch-down, touch-up, or move events are converted to pinch, rotate, or tap events.
You can forward the new events to your controller from your UI-toolkit-specific code. This example shows Qt Quick code:
_gestureForwarder(
    [this](const std::shared_ptr<IInputEvent>& event) {
      auto controller = _map->getController();
      if (controller) {
        controller->onEvent(event);
      }
    },
    [this](const std::shared_ptr<ITask>& task) {
      QMetaObject::invokeMethod(
          this, [task]() { task->execute(); }, Qt::QueuedConnection);
    })
The following sections explain how you can achieve this in more detail.
Enumerations mapping
Each UI toolkit comes with its own representation of modifier keys and its own key values. You must translate them into something LuciadCPillar understands before passing them to the MouseGestureRecognizer. This is an example of the translation for the mouse buttons:
MouseButton toMouseButton(Qt::MouseButton qtMouseButton) {
  switch (qtMouseButton) {
  case Qt::LeftButton:
    return MouseButton::left();
  case Qt::MiddleButton:
    return MouseButton::middle();
  case Qt::RightButton:
    return MouseButton::right();
  default:
    return MouseButton::unknown();
  }
}
For other conversions, see the GestureForwarder class.
Mouse events handling
To receive the mouse events from the Qt toolkit, you must override the related methods from the Qt class. For example, for a MouseMoveEvent, you must add this piece of code:
void mouseMoveEvent(QMouseEvent* event) override;
void QQuickMapObject::mouseMoveEvent(QMouseEvent* event) { _gestureForwarder.doMouseMoveEvent(event); }
Then, you use the information from the QMouseEvent to create the input event and give it to the MouseGestureRecognizer:
ModifierKeys controllerModifiers = toModifiers(event->modifiers());
auto position = toCoordinate(event->pos());
MouseMoveEvent mouseEvent{position, controllerModifiers};
_mouseGestureRecognizer.onMouseMoveEvent(mouseEvent);
The same mechanism applies to the mouse-pressed, mouse-released, hover, wheel, and keyboard key events. Have a look at the GestureForwarder and QQuickMapObject code for the details.
Touch events handling
Touch events need more care. They require the use of a TouchGestureRecognizer, with a more elaborate entry point: onTouchEvent.
The touch gesture recognizer method needs the location, ID, and current state of each touch point: one per finger in use, for example.
Before you can forward the Qt touch events to LuciadCPillar, you must receive them in the integration code:
setAcceptTouchEvents(true);
void touchEvent(QTouchEvent* event) override;
void QQuickMapObject::touchEvent(QTouchEvent* event) { forceActiveFocus(); _gestureForwarder.doTouchEvent(event); }
Then you can process them. The integration code looks like this:
void GestureForwarder::doTouchEvent(QTouchEvent* event) {
  const auto& touchPoints = event->points();
  std::vector<TouchPoint> points;
  points.reserve(touchPoints.count());
  for (const auto& point : touchPoints) {
    auto position = toCoordinate(point.position());
    auto state = point.state();
    TouchPointState pointState = TouchPointState::Stationary;
    switch (state) {
    case QEventPoint::Pressed:
      pointState = TouchPointState::Pressed;
      break;
    case QEventPoint::Released:
      pointState = TouchPointState::Released;
      break;
    case QEventPoint::Stationary:
      pointState = TouchPointState::Stationary;
      break;
    case QEventPoint::Updated:
      pointState = TouchPointState::Moved;
      break;
    default:
      break;
    }
    points.emplace_back(position, pointState, point.id());
  }
  ModifierKeys modifiers = toModifiers(event->modifiers());
  _touchGestureRecognizer.onTouchEvent(TouchPointEvent{points, modifiers});
}
LongPress events
A common gesture on touch devices is the long press. You receive such an event when users press a touch point and stay stationary for a certain amount of time.
Because we won’t receive any new events while the touch point is stationary, the TouchGestureRecognizer must detect the long press with a dedicated timer on a separate thread. As a consequence, the event is also emitted on another thread, which could lead to issues.
To solve that, we give you the ability to inject an ITaskScheduler into the TouchGestureRecognizer. The task scheduler has the responsibility of delegating the execution of the given Task to the desired thread. See Program: Bind Controller to a Recognizer for a demonstration. This way, the gesture recognizers receive all the information they need to generate the correct input events.
Providing credentials
For some data formats, authentication is required to access the data. The article on authentication shows how this can be done using Qt.