Mokap is an easy-to-use multi-camera acquisition software developed for animal behaviour recording, using hardware-triggered (synchronised) machine vision cameras.
- Cross platform (Linux, Windows, macOS)
- Supports Basler and FLIR cameras (they can be used at the same time, yes 🙂)
- Supports hardware-synchronisation of cameras (using a Raspberry Pi, an Arduino, a USB-to-TTL adapter, or one of your cameras itself)
- Supports hardware-accelerated encoding (NVIDIA, AMD, Intel Arc, Apple silicon, etc.)
- Completely tunable encoding parameters (but mokap will automatically select one for you)
- Live multi-camera calibration and pose estimation
- Supports classic USB (and internal) webcams, but their features are limited (no hardware sync, etc.)
- Generate printable calibration boards in 1 click
To get a local copy up and running, follow these steps.
If you wish to use straight-to-video encoding, you will need ffmpeg installed on your machine.
- Linux (most Debian-based distros): `sudo apt install ffmpeg`
- Windows: `winget install --id Gyan.FFmpeg`
- macOS: `brew install ffmpeg`
If you do not want to use ffmpeg, you can still use Mokap in image mode (videos will be written as individual frames), but be aware that this is much slower.
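To see why image mode is so much slower, a rough data-rate estimate helps. The snippet below uses the example resolution and framerate from the configuration further down in this README; it is illustrative arithmetic, not Mokap code:

```python
# Uncompressed data rate for one camera at the example settings:
# 1440x1080 pixels, Mono8 (1 byte per pixel), 60 fps.
width, height, bytes_per_px, fps = 1440, 1080, 1, 60

frame_mb = width * height * bytes_per_px / 2**20   # MiB per frame
rate_mb_s = frame_mb * fps                         # MiB per second

print(f"{frame_mb:.2f} MiB/frame, {rate_mb_s:.1f} MiB/s per camera")
# 1.48 MiB/frame, 89.0 MiB/s per camera
```

Writing that stream as individual image files means one filesystem operation per frame per camera, which is why video encoding through ffmpeg is the recommended path.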
We recommend using uv to manage Python environments and install Mokap easily.
- If you don't have uv installed, see the uv documentation for installation instructions.
- Download the installer package for your system: https://www2.baslerweb.com/en/downloads/software-downloads/
- You need to increase the limit on file descriptors and USB memory. Basler provides a script to do so automatically, but it may not work completely on all distros. Run:

  ```sh
  sudo chmod +x /opt/pylon/share/pylon/setup-usb.sh
  sudo /opt/pylon/share/pylon/setup-usb.sh
  ```

  (assuming you installed the Pylon SDK to the default `/opt/pylon` directory)
- Note: Basler's default increase on USB memory is 1000 MiB. This is, in our case, not enough for more than 3 USB cameras. You can increase it even further by modifying the `/sys/module/usbcore/parameters/usbfs_memory_mb` file. A value of `2048` is enough for our 5 cameras.
- Note: On Arch-based systems, you need to manually add the line `DefaultLimitNOFILE=2048` to `/etc/systemd/user.conf` (or `/etc/systemd/system.conf` if you want to apply it system-wide)
- On systems that do not use GRUB, Basler's script won't make the USB memory setting persistent; you need to change your bootloader options manually. For instance, EndeavourOS uses systemd-boot: edit `/efi/loader/entries/YOURDISTRO.conf` (replace `YOURDISTRO` with the name of the entry for your system, typically the machine-id in the case of EndeavourOS) and add `usbcore.usbfs_memory_mb=2048` to the `options` line.
Refer to FLIR's guide for detailed instructions; in short:
- Download the Spinnaker SDK from FLIR's website.
- Note: You need both the Full SDK with the binaries and the Python SDK, and they need to match in version number. Currently, on Windows and Linux, the Python bindings only support SDK version 3.2.0.62. On macOS, you can install Spinnaker 4.1.
- Install the full SDK with the installer provided
- Make sure you have mokap installed, and activate Mokap's virtual env (see below)
- Install the Spinnaker Python wheels with:

  ```sh
  uv pip install /path/to/spinnaker_python-3.2.0.62-cp310-cp310-win_amd64.whl
  ```

  (replace with the path and the name of the `.whl` file you downloaded)
- Clone this repository: `git clone https://github.com/FlorentLM/mokap.git`
- Create environment: `cd mokap && uv sync`
- Customise `config_example.yaml` and rename it to `config.yaml` (or whatever you want)
Configuration example:

```yaml
# ----------------------------------
#  Mokap Example Configuration File
# ----------------------------------

# --- Global Acquisition Settings ---
base_path: D:/MokapTests    # where the recordings will be stored

hardware_trigger: true      # whether to use an external hardware trigger
framerate: 60               # in frames per second (Hz)
exposure: 15000             # in microseconds
trigger_line: 4             # which GPIO line is used as an input (to listen to the hardware trigger)
gain: 1.0
pixel_format: Mono8         # or Mono10, BGR8, BayerRG8, ...
binning: 1                  # or 2, 3, 4
binning_mode: average       # or sum
black_level: 1.0
gamma: 1.0
roi: [0, 0, 1440, 1080]     # ROI can be [x offset, y offset, width, height] or [width, height] (automatically centered)

# --- Global Saving & Encoding Settings ---
save_format: mp4            # or 'png', 'jpg', 'bmp', 'tiff'
save_quality: 90            # 0-100 scale (only for images, ignored in video encoding)
frame_buffer_size: 200      # max number of frames to buffer in RAM (per camera)

# --- Hardware trigger parameters ---
# You can use a Raspberry Pi
trigger:
  type: raspberry
  pin: 18                   # The GPIO pin you connect your cameras to. Pin 18 is recommended.

## or an Arduino
#trigger:
#  type: arduino
#  port: COM5               # 'COMX' on Windows, '/dev/ttyUSBX' on Linux, '/dev/cu.usbserial-XXXX' on macOS
#  pin: 11                  # The GPIO pin you connect your cameras to. Usually 3 or 11 on Arduino
#  baudrate: 115200         # Optional. If you use one of the two firmwares provided with Mokap, you should not change this

## or a USB-to-TTL adapter (this is less accurate though)
#trigger:
#  type: ftdi
#  port: COM3               # 'COMX' on Windows, '/dev/ttyUSBX' on Linux, '/dev/cu.usbserial-XXXX' on macOS
#  pin: RTS                 # Optional, can be 'RTS' or 'DTR'
#  baudrate: 9600           # Optional. Should not matter too much

## or use one of the cameras to control the others
#trigger:
#  type: camera
#  name: my-first-camera    # use the friendly name from 'sources'
#  output_line: 2           # The GPIO line to use for output

# --- Video encoding parameters ---
ffmpeg:
  path: 'ffmpeg'
  gpu: true
  params:
    # --- CPU Profiles ---
    # H.265
    cpu_h265: >-
      -c:v libx265 -preset superfast -tune zerolatency -crf 20 -x265-params "vbv-maxrate=60000k:vbv-bufsize=120000k:keyint=100:min-keyint=100"
    # H.264
    cpu_h264: >-
      -c:v libx264 -preset superfast -tune zerolatency -crf 21
    # --- NVIDIA Profiles ---
    # H.265
    gpu_nvenc_h265: >-
      -c:v hevc_nvenc -preset fast -tune ll -rc constqp -qp 19 -g 100 -bf 0
    # H.264
    gpu_nvenc_h264: >-
      -c:v h264_nvenc -preset fast -tune ll -rc constqp -qp 20 -g 100 -bf 0
    # --- AMD profile (Windows) ---
    gpu_amf: >-
      -c:v hevc_amf -preset quality -low_latency 1 -rc vbr_hq -quality 20 -g 100 -bf 0
    # --- AMD profile (Linux) ---
    gpu_vaapi: >-
      -vaapi_device /dev/dri/renderD128 -c:v hevc_vaapi -qp 21 -g 100 -bf 0
    # --- Apple profiles ---
    gpu_videotoolbox: >-
      -c:v hevc_videotoolbox -realtime true -q:v 80 -allow_sw 1 -g 100
    # --- Intel QSV profiles (for Intel Arc GPUs, or Intel CPUs with QSV) ---
    gpu_arc_av1: >-
      -c:v av1_qsv -preset veryfast -global_quality 23 -low_power 0 -g 100 -bf 0
    gpu_arc_hevc: >-
      -c:v hevc_qsv -preset veryfast -global_quality 21 -low_power 0 -g 100 -bf 0

# --- Camera-Specific Definitions ---
# This is where you add your cameras
sources:
  my-first-camera:          # This is your defined, friendly name for this camera :)
    vendor: basler
    serial: xxxxxxxx        # you can specify the serial number to make sure it gets the right name, colour etc
    color: da141d
    # # Camera-specific settings can override globals
    # exposure: 9000
    # gain: 2.0
    # gamma: 1.5
    # pixel_format: Mono8
    # blacks: 1.0
    # binning: 1
    # save_format: jpg
    # save_quality: 90      # you can set per-camera writer settings

  some-other-camera:
    vendor: flir
    color: 7a9c21

  # You can also use your laptop's internal camera (or any USB webcam)
  # Features are limited of course, but it is useful for debugging
  laptop-camera:
    vendor: webcam
    color: f3d586
```
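As a side note on the `roi` setting above: when only `[width, height]` is given, the offsets are computed so the ROI is centred on the sensor. A minimal sketch of that arithmetic (the sensor size is a hypothetical example, and this is not Mokap's actual implementation):

```python
def centered_roi(sensor_w, sensor_h, width, height):
    """Expand a [width, height] ROI into [x, y, width, height],
    centred on the sensor (illustrative sketch only)."""
    x = (sensor_w - width) // 2
    y = (sensor_h - height) // 2
    return [x, y, width, height]

# e.g. a hypothetical 1920x1200 sensor with the 1440x1080 ROI from above
print(centered_roi(1920, 1200, 1440, 1080))   # [240, 60, 1440, 1080]
```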
Note: This is temporary; an actual launcher will come soon (along with a full CLI interface)
- Activate the uv environment within mokap.
  Linux: `source .venv/bin/activate`
  Windows: `.venv/Scripts/activate`
- Run `./mokap.py`
Mokap supports Raspberry Pi, Arduino boards, and USB-to-TTL adapters to act as hardware triggers.
For the Raspberry Pi option, the commands are sent from the host PC to the Pi via the network, so you MUST have three environment variables defined.
The recommended way is to create a file named `.env` that contains the three variables:

```
TRIGGER_HOST=192.168.0.10
TRIGGER_USER=pi
TRIGGER_PASS=hunter2
```

(Replace with your trigger's IP or hostname, username, and password)
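A `.env` file like this is just plain `KEY=VALUE` lines. As a sketch of how such a file is read (Mokap presumably uses a dotenv-style loader; this minimal parser is only illustrative):

```python
def parse_env(text):
    """Parse simple KEY=VALUE lines, skipping blanks and comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

example = """TRIGGER_HOST=192.168.0.10
TRIGGER_USER=pi
TRIGGER_PASS=hunter2"""

print(parse_env(example)["TRIGGER_HOST"])   # 192.168.0.10
```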
You must first enable the GPIO and SSH interfaces on the Pi using `sudo raspi-config`.
Make sure that the `pigpiod` service is enabled on the Pi: `sudo systemctl status pigpiod`
You have to make sure the two devices can find each other on the network. If they are on the same local subnet and you're using DHCP, this should be straightforward; otherwise you may need to set IP addresses and subnets explicitly. We recommend using a tunnel like plain WireGuard or Tailscale for secure communication between the devices (especially if they are not on the same subnet!). Test by pinging between devices.
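Whether two addresses even share a subnet can be checked with Python's standard `ipaddress` module (the addresses here are hypothetical, matching the `.env` example above):

```python
import ipaddress

# Hypothetical host PC and Raspberry Pi addresses on a /24 network
net = ipaddress.ip_network("192.168.0.0/24")
host = ipaddress.ip_address("192.168.0.42")
pi = ipaddress.ip_address("192.168.0.10")

# True -> both are on the same subnet, no extra routing needed
print(host in net and pi in net)   # True
```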
Any Arduino board should be compatible. You just have to flash it with one of the two provided firmwares (located in `mokap/triggers/arduino_firmware`).

- `trigger_millis_v1.ino`: Good precision, jitter is in the microsecond order. Supports any frequency and any duty cycle.
- `trigger_tone_v1.ino`: Highest precision, jitter is negligible (nanoseconds). Only supports a 50% duty cycle, and frequencies >= 31 Hz on 16 MHz boards (such as the Arduino Uno).

If you're recording at more than 100 frames per second, you will want to use the `trigger_tone_v1.ino` firmware.
If you're recording at less than 31 fps, you will have to use the `trigger_millis_v1.ino` firmware.
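The 31 Hz floor likely comes from the AVR timer behind Arduino's `tone()`: with a 16 MHz clock, the largest available prescaler (1024) and an 8-bit compare register, the lowest reachable frequency is about 30.5 Hz. A quick check of that arithmetic (this is our reading of the limit, not something stated by Mokap):

```python
# Lowest frequency tone() can reach on a 16 MHz 8-bit AVR:
# f = F_CPU / (2 * prescaler * (OCR + 1)); the timer toggles the
# pin once per compare match, hence the factor of 2.
f_cpu = 16_000_000
prescaler = 1024   # largest prescaler available on the timer
ocr_max = 255      # 8-bit output-compare register

f_min = f_cpu / (2 * prescaler * (ocr_max + 1))
print(f"{f_min:.1f} Hz")   # 30.5 Hz, i.e. the ~31 Hz floor above
```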
This is a very cheap and quick solution, but the timing is under the control of your host computer, so it is less accurate than the other two alternatives (jitter is in the order of 1-15+ milliseconds). Not recommended for recording above 30 fps.
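To put that jitter in perspective (illustrative arithmetic, not a measurement):

```python
# Worst-case USB-to-TTL trigger jitter vs. the frame period at 30 fps
fps = 30
period_ms = 1000 / fps   # ~33.3 ms between frames
jitter_ms = 15           # upper end of the 1-15+ ms range above

print(f"period: {period_ms:.1f} ms, jitter up to {jitter_ms / period_ms:.0%} of a frame")
# period: 33.3 ms, jitter up to 45% of a frame
```

At higher framerates the period shrinks while the jitter does not, which is why this option is not recommended above 30 fps.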
You can also use one of the cameras to act as a hardware trigger for the others. The idea is the same, except that the primary camera is configured with a line output, and is connected to the other cameras' input lines.
Note: You will be subject to the drive-current limit on the primary camera's I/O line, so this probably won't work reliably with more than 3-4 secondary cameras.
- If you plan on recording high framerate from many cameras, you probably want to use a GPU, as the software encoders are slower.
- Add support for Arduino triggers
- Add support for FLIR cameras
- Add support for cameras as trigger
- Add support for other camera brands
- Add support for AMD GPU encoding
- Finish calibration mode (erm... needs some fixes)
- Embed useful metadata in the videos / images
`permission denied: ./mokap.py`
Fix: make the file executable: `chmod u+x ./mokap.py`
`Failed to open device xxxxx for XML file download. Error: 'The device cannot be operated on an USB 2.0 port. The device requires an USB 3.0 compatible port.'`
Fix: Unplug and plug the camera(s) again
`Warning: Cannot change group of xxxx to 'video'.`
Fix: Add the local user to the video group: `sudo usermod -a -G video $USER`
`Error: 'Insufficient system resources exist to complete the API.'` or `Too many open files. Reached open files limit`
Fix: Increase the number of open file descriptors: `ulimit -n 2048` (or more)
Note: Mokap normally does this automatically
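On POSIX systems you can inspect the limit Mokap actually sees from Python, using the standard-library `resource` module (not available on Windows):

```python
import resource

# soft limit: the current cap on open file descriptors for this process
# hard limit: the ceiling a non-root process may raise the soft limit to
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft: {soft}, hard: {hard}")
```

`ulimit -n 2048` raises the soft limit for the shell that launches Mokap; the in-process equivalent is `resource.setrlimit`.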
Distributed under the MIT License. See `LICENSE.txt` for more information.
Florent Le Moël - @opticflo.xyz
Project Link: https://github.com/FlorentLM/mokap