mac-avcapture: Update plugin to ObjC and modern Cocoa APIs by PatTheMav · Pull Request #9344 · obsproject/obs-studio
Description
This PR reimplements video capture device support in OBS Studio using ObjC classes and language features. The old variant is kept as deprecated to ensure that existing scene setups keep working.
Important
This PR has been updated since it was originally published. The changes are detailed in the following section.
Updates to the PR
- New "Capture Card Device" source type was added.
- This source type uses a "fast path" that renders video output directly at the internal OBS Studio frame rate instead of using asynchronous video frames like the normal source. This improves frame pacing, especially when using capture cards with fast-paced video game output.
- All source code has been amended with DocC-style doc-headers, which provide extensive automatic code help in Xcode and also allow Xcode to generate documentation automatically (and add it to the documentation viewer on a local system).
- Strict aliasing has been enabled for the plugin, adding additional checks for safe pointer aliasing.
- The property refresh handling was simplified into a single callback function that checks and refreshes all available properties at once, reducing the overall number of callback calls required when a property refresh is triggered.
Motivation and Context
Application development on macOS has become Swift-oriented, which means that many current and future frameworks (and their associated documentation) assume that Swift is used by default.
To allow OBS Studio to move with the times, existing macOS-specific code needs to be migrated to more Apple-specific coding practices and languages, which can serve as a stepping stone toward full Swift implementations.
Specifically, this requires moving C/C++-oriented code to ObjC (and also migrating ObjC++ code to ObjC, as Cocoa/CoreFoundation solutions are usually more elegant and easier to maintain than C++ code).
This also allowed separating plugin code functionality into specific blocks (plugin main entry point, plugin functionality, property API, native ObjC type classes), which makes the code more readable and avoids having a single big source code file.
Native ObjC types and helper functions also allow pure ObjC code to interact more elegantly with the native C types provided by libobs, which will probably also be necessary to bridge correctly to Swift at a later point.
Note
This PR also makes use of native async queues, which allow otherwise slow or blocking operations (creating and starting a capture session, in this case) to be handled by Cocoa's event loop and its ephemeral worker threads, without the need for bespoke libobs threads. (Function calls that happen exclusively off the main thread are not queued asynchronously, because OBS Studio code mostly expects code in those threads to be synchronous.)
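As a rough sketch of the queueing pattern described in the note, using GCD's C API (Apple platforms only, compiled with blocks support; `start_capture_session` is a hypothetical stand-in for the actual session setup, and the semaphore wait exists only so this standalone sketch does not exit early — real plugin code would not block like this):

```c
#include <dispatch/dispatch.h>
#include <stdio.h>

/* Hypothetical stand-in for the slow, blocking work of creating and
 * starting an AVCaptureSession. */
static void start_capture_session(void)
{
	printf("session started off the main thread\n");
}

int main(void)
{
	dispatch_semaphore_t done = dispatch_semaphore_create(0);

	/* Hand the blocking work to a global concurrent queue so one of
	 * the system's ephemeral worker threads runs it, instead of a
	 * bespoke libobs thread. */
	dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
		start_capture_session();
		dispatch_semaphore_signal(done);
	});

	/* Only for this standalone sketch: keep the process alive until
	 * the async block has run. */
	dispatch_semaphore_wait(done, DISPATCH_TIME_FOREVER);
	return 0;
}
```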
How Has This Been Tested?
Tested on macOS 13 with built-in Studio Display Camera and Logitech C922 external camera:
- Tested use of presets and distinct device configurations
- Tested attachment and detachment of external devices
- Tested scene setup restoration between OBS launches
- Tested preset changes after device configurations
Notable Quirk
Note
There is a quirk that was already present in the old video capture source plugin and has not been fixed in the rewrite (I attempted to do so, but this created a whole new slew of issues):
When a device is initialised with a preset, the sample buffer dimensions provided by AVFoundation are defined by that preset. They are independent of the actual device resolution:
- Select "1280x720" preset
- Observe sample buffer resolution of 1280x720 pixels
- Change internal device resolution to 160x120 pixels
- Observe sample buffer resolution is kept at 1280x720 pixels, but the image is pillarboxed and upscaled from the 160x120 device image
- Select "1920x1080" preset
- Observe sample buffer resolution changes to 1920x1080 pixels, but the image is still pillarboxed and upscaled
Notably, this does not happen if preset usage is disabled before selecting a device, as the capture session is then set up with a device format directly. Interestingly, the sample buffer provided by AVFoundation will then track the actual device resolution; but once presets are used, the buffer size is determined entirely by the preset.
No Automatic Selection of Properties
Important
The automatic property selection of the old plugin has not been reimplemented, requiring users to explicitly select the colour format, frame rate, and other properties when not using presets. Presets are the canonical and best way to use video capture input.
Quirks with iOS Devices
Using iOS devices as camera sources via USB still works (including muxed audio), but some notable quirks remain:
- The device will not be automatically rediscovered/enabled between OBS restarts.
- Only presets can be used; the device will not expose any specific resolutions or frame rates
- Switching between landscape and portrait mode is supported
When used just for camera functionality, "Continuity Camera" should be preferred; capture via USB is best reserved for capturing screen contents.
Types of changes
- Tweak (non-breaking change to improve existing functionality)
- Performance enhancement (non-breaking change which improves efficiency)
- Code cleanup (non-breaking change which makes code smaller or more readable)
Checklist:
- My code has been run through clang-format.
- I have read the contributing document.
- My code is not on the master branch.
- The code has been tested.
- All commit messages are properly formatted and commits squashed where appropriate.
- I have included updates to all appropriate documentation.