Best Approach for Displaying Multiple Camera Streams in Qt 6.9.1 with Microservices

    greed_14
    #1

    Hi,

    I’m working with Qt 6.9.1 in a microservice-based architecture. Our current setup involves parsing RTSP camera streams, converting them into JPEG format, and rendering them through QML’s VideoOutput using a VideoSink.
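
    For reference, the rendering side currently does roughly the following (a simplified sketch rather than our exact code; the QVideoSink is assumed to be the one exposed by the QML VideoOutput’s videoSink property):

    ```cpp
    #include <QByteArray>
    #include <QImage>
    #include <QVideoFrame>
    #include <QVideoFrameFormat>
    #include <QVideoSink>
    #include <cstring>

    // Illustrative helper: pushes one decoded JPEG frame into the sink
    // backing a QML VideoOutput (obtained via its videoSink property).
    void presentJpegFrame(QVideoSink *sink, const QByteArray &jpegData)
    {
        if (!sink)
            return;

        QImage image = QImage::fromData(jpegData, "JPEG");
        if (image.isNull())
            return;
        image.convertTo(QImage::Format_RGBA8888);

        QVideoFrame frame(QVideoFrameFormat(image.size(),
                                            QVideoFrameFormat::Format_RGBA8888));
        if (!frame.map(QVideoFrame::WriteOnly))
            return;

        // Copy row by row; the frame's stride may differ from the image's.
        const int rowBytes = qMin(frame.bytesPerLine(0), image.bytesPerLine());
        for (int y = 0; y < image.height(); ++y)
            std::memcpy(frame.bits(0) + y * frame.bytesPerLine(0),
                        image.constScanLine(y), rowBytes);
        frame.unmap();

        sink->setVideoFrame(frame);
    }
    ```

    Every frame goes through a CPU JPEG decode plus a copy into the QVideoFrame before it reaches the sink.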

    We need to handle multiple video streams simultaneously, potentially 10–15 or more at once. What would be the most scalable and efficient approach to achieve this?

    Regards,
    Adnan

      SGaist
      Lifetime Qt Champion
      #2

      Hi,

      What are the specs of the streams?
      Is there a particular need to convert them to JPEG?
      How are you doing that conversion?
      What is the target OS for running that application?

      Interested in AI ? www.idiap.ch
      Please read the Qt Code of Conduct - https://forum.qt.io/topic/113070/qt-code-of-conduct

        greed_14
        #3

        Hi,

        1. Currently, we are considering 1024×768 @ 30fps and 720p as standard resolutions. However, in some cases, 1080p streams may also be required, depending on the camera capabilities.

        2. The solution needs to be ONVIF compliant, as ONVIF mandates support for streaming MJPEG video over RTSP.

        3. Our approach uses a GStreamer pipeline inside a service application to read the RTSP stream. The raw frames are extracted through an appsink and written to a shared memory segment. A separate streaming service consumes this shared memory, encodes the frames as MJPEG, and sends them to the client UI application via gRPC, where they are displayed through a VideoOutput element’s videoSink (see the sketch after this list). The same shared memory data will also be used for video analysis.

        4. The target OS is Windows for now, but the implementation should remain as cross-platform as possible for future portability.
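
        For context, here is a reduced sketch of the ingest side described in point 3. The pipeline string, camera URL, and the shared-memory write are illustrative placeholders rather than our exact code, and H.264 is assumed for the decode branch:

        ```cpp
        #include <gst/gst.h>
        #include <gst/app/gstappsink.h>

        // Stub standing in for the write into the named shared memory
        // segment that the streaming/analytics services also map.
        static void writeToSharedMemory(const guint8 *data, gsize size)
        {
            (void)data;
            (void)size;
        }

        int main(int argc, char *argv[])
        {
            gst_init(&argc, &argv);

            GError *error = nullptr;
            GstElement *pipeline = gst_parse_launch(
                "rtspsrc location=rtsp://camera-host/stream latency=100 ! "
                "rtph264depay ! avdec_h264 ! videoconvert ! "
                "video/x-raw,format=BGRA ! "
                "appsink name=sink max-buffers=2 drop=true",
                &error);
            if (error) {
                g_printerr("Pipeline message: %s\n", error->message);
                g_clear_error(&error);
            }
            if (!pipeline)
                return 1;

            GstElement *sink = gst_bin_get_by_name(GST_BIN(pipeline), "sink");
            gst_element_set_state(pipeline, GST_STATE_PLAYING);

            // Blocking pull loop; the real service runs one of these per camera.
            while (GstSample *sample = gst_app_sink_pull_sample(GST_APP_SINK(sink))) {
                GstBuffer *buffer = gst_sample_get_buffer(sample);
                GstMapInfo map;
                if (gst_buffer_map(buffer, &map, GST_MAP_READ)) {
                    writeToSharedMemory(map.data, map.size);
                    gst_buffer_unmap(buffer, &map);
                }
                gst_sample_unref(sample);
            }

            gst_element_set_state(pipeline, GST_STATE_NULL);
            gst_object_unref(sink);
            gst_object_unref(pipeline);
            return 0;
        }
        ```

        The MJPEG encoding, the gRPC transport, and the QML display are handled by the other services as described above; only the RTSP-to-shared-memory step is sketched here.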
