Sensor status

This page describes the sensor status buttons on the top right, which indicate the current state of the sensors (cameras and microphones) and allow starting and stopping the sensors.

It also describes the sensor status window, which appears when the left sensor status button is clicked and shows the sensor status in more detail.

Sensor status buttons

The sensor status buttons are always visible on the top right of the application window.

Start / stop sensors

The right button allows starting and stopping the sensors:

Sensor status

The left button shows whether the sensors are currently active (running). In addition, if the sensors are active and the program detects an issue with them, such as an unusual amount of dropped camera frames, the button changes its color to warn about this.
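How such a status color could be derived can be sketched as follows. This is a hypothetical illustration, not ScannedReality Studio's actual implementation; the state names, the dropped-frame ratio input, and the threshold value are all assumptions made for this example.

```python
# Hypothetical sketch: deriving a status-button color from sensor metrics.
# State names and the 5% dropped-frame threshold are illustrative assumptions.
from enum import Enum

class SensorStatus(Enum):
    INACTIVE = "gray"    # sensors are stopped
    ACTIVE = "green"     # sensors are running normally
    WARNING = "orange"   # sensors are running, but an issue was detected

def sensor_status(running: bool, dropped_frame_ratio: float,
                  max_dropped_ratio: float = 0.05) -> SensorStatus:
    """Return a status color based on whether the sensors are running and
    whether an unusual share of camera frames has been dropped."""
    if not running:
        return SensorStatus.INACTIVE
    if dropped_frame_ratio > max_dropped_ratio:
        return SensorStatus.WARNING
    return SensorStatus.ACTIVE
```

For example, `sensor_status(True, 0.2)` would yield the warning state, since 20% dropped frames exceeds the assumed 5% threshold.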

A list of possible states with example images is given below.

Sensor status window

The sensor status window can be opened by clicking the left sensor status button on the top right while the sensors are active.

It shows more details about the connected sensors, which may for example look like this:


For cameras, in addition to the configured frames-per-second, the window also shows the actually measured frames-per-second, which is useful for detecting potential issues with the sensor connections.

Small deviations of the measured frames-per-second from the configured frames-per-second are normal, for example a value of 30.1 instead of 30. However, if the measured frames-per-second differ significantly from the configured frames-per-second, they are highlighted in orange, as for the first color stream in this example:


Note that it is normal for the measured frames-per-second to be low for a moment directly after the cameras are started. However, if they remain low after this, there is an issue. In that case, please refer to the troubleshooting section of the documentation page for your camera type for things to try.
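The measured-FPS check described above can be sketched as follows. This is a hypothetical illustration under assumptions: the FPS is estimated from frame arrival timestamps over a window, and a relative tolerance of 10% decides when a value would be highlighted in orange; the actual window length and tolerance used by ScannedReality Studio are not documented here.

```python
# Hypothetical sketch: estimating the measured FPS from frame arrival
# times and flagging significant deviations from the configured FPS.
# The 10% relative tolerance is an illustrative assumption.

def measured_fps(arrival_times: list[float]) -> float:
    """Estimate FPS from frame arrival timestamps (in seconds) by
    dividing the number of frame intervals by the window's time span."""
    if len(arrival_times) < 2:
        return 0.0
    span = arrival_times[-1] - arrival_times[0]
    return (len(arrival_times) - 1) / span

def deviates_significantly(measured: float, configured: float,
                           tolerance: float = 0.1) -> bool:
    """True if the measured FPS differs from the configured FPS by more
    than the relative tolerance, i.e. the value should be highlighted."""
    return abs(measured - configured) / configured > tolerance
```

With this sketch, a measured value of 30.1 against a configured 30 would not be flagged, while a value of 24 would be.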


For Azure Kinect users: The Azure Kinect may deliver invalid MJPEG frames, seemingly in case of an incompatibility with the USB host controller in use (see here). ScannedReality Studio has to drop such invalid frames when creating volumetric videos. However, these frames are not recognized as dropped by the “measured FPS” counter of the sensor status display, because ScannedReality Studio usually does not decode frames immediately when they are received (doing so would significantly increase the processing load whenever the sensors are running).
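Why deferred decoding causes this can be illustrated with a small sketch. The class and method names here are hypothetical and only demonstrate the ordering described above: frames are counted for the measured FPS when they arrive, but invalid MJPEG data is only detected later, at decode time.

```python
# Hypothetical sketch: frames are counted on receipt (feeding the
# measured-FPS display), while invalid frames are only detected during
# the deferred decode step, too late to affect that counter.

class FrameCounter:
    def __init__(self) -> None:
        self.received = 0          # counted immediately -> measured FPS
        self.decoded_invalid = 0   # detected only at deferred decode time

    def on_frame_received(self, frame: bytes) -> None:
        # Counting happens here, before any decoding takes place.
        self.received += 1

    def on_frame_decoded(self, valid: bool) -> None:
        # Decoding happens later; invalid frames are dropped here,
        # after they have already been counted as received.
        if not valid:
            self.decoded_invalid += 1
```

In this model, a stream with two received frames of which one later fails to decode still reports two frames to the measured-FPS counter.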