VideoFrame

Video frame property settings.

@JsonSerializable(explicitToJson: true, includeIfNull: false)
class VideoFrame {
  const VideoFrame(
      {this.type,
      this.width,
      this.height,
      this.yStride,
      this.uStride,
      this.vStride,
      this.yBuffer,
      this.uBuffer,
      this.vBuffer,
      this.rotation,
      this.renderTimeMs,
      this.avsyncType,
      this.metadataBuffer,
      this.metadataSize,
      this.textureId,
      this.matrix,
      this.alphaBuffer,
      this.alphaStitchMode,
      this.pixelBuffer,
      this.metaInfo,
      this.colorSpace});

  @JsonKey(name: 'type')
  final VideoPixelFormat? type;

  @JsonKey(name: 'width')
  final int? width;

  @JsonKey(name: 'height')
  final int? height;

  @JsonKey(name: 'yStride')
  final int? yStride;

  @JsonKey(name: 'uStride')
  final int? uStride;

  @JsonKey(name: 'vStride')
  final int? vStride;

  @JsonKey(name: 'yBuffer', ignore: true)
  final Uint8List? yBuffer;

  @JsonKey(name: 'uBuffer', ignore: true)
  final Uint8List? uBuffer;

  @JsonKey(name: 'vBuffer', ignore: true)
  final Uint8List? vBuffer;

  @JsonKey(name: 'rotation')
  final int? rotation;

  @JsonKey(name: 'renderTimeMs')
  final int? renderTimeMs;

  @JsonKey(name: 'avsync_type')
  final int? avsyncType;

  @JsonKey(name: 'metadata_buffer', ignore: true)
  final Uint8List? metadataBuffer;

  @JsonKey(name: 'metadata_size')
  final int? metadataSize;

  @JsonKey(name: 'textureId')
  final int? textureId;

  @JsonKey(name: 'matrix')
  final List<double>? matrix;

  @JsonKey(name: 'alphaBuffer', ignore: true)
  final Uint8List? alphaBuffer;

  @JsonKey(name: 'alphaStitchMode')
  final AlphaStitchMode? alphaStitchMode;

  @JsonKey(name: 'pixelBuffer', ignore: true)
  final Uint8List? pixelBuffer;

  @VideoFrameMetaInfoConverter()
  @JsonKey(name: 'metaInfo')
  final VideoFrameMetaInfo? metaInfo;

  @JsonKey(name: 'colorSpace')
  final ColorSpace? colorSpace;

  factory VideoFrame.fromJson(Map<String, dynamic> json) =>
      _$VideoFrameFromJson(json);

  Map<String, dynamic> toJson() => _$VideoFrameToJson(this);
}

Note that the buffer provides a pointer to a pointer. This interface cannot modify the buffer pointer itself, only the contents of the buffer.
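For illustration, a frame carrying raw I420 (YUV 4:2:0) data might be populated as sketched below. The buffers are zero-filled placeholders (real data would come from your capture source), and the enum value `VideoPixelFormat.videoPixelI420` is assumed from the SDK's pixel format enumeration:

```dart
import 'dart:typed_data';

const width = 640;
const height = 360;

// Sketch: wrapping raw I420 data in a VideoFrame for custom video
// capture/processing. For I420, the U and V planes are each half the
// width and half the height of the Y plane.
final frame = VideoFrame(
  type: VideoPixelFormat.videoPixelI420, // assumed enum value
  width: width,
  height: height,
  yStride: width,
  uStride: width ~/ 2,
  vStride: width ~/ 2,
  yBuffer: Uint8List(width * height),
  uBuffer: Uint8List(width * height ~/ 4),
  vBuffer: Uint8List(width * height ~/ 4),
  rotation: 0,
  renderTimeMs: DateTime.now().millisecondsSinceEpoch,
);
```

In practice the strides may be larger than the plane widths if the capture source pads each row; see the stride properties below.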

Properties

type
Pixel format. See VideoPixelFormat.
width
Pixel width of the video.
height
Pixel height of the video.
yStride
For YUV data, the line stride of the Y buffer; for RGBA data, the total data length.
Note: When processing video data, you must handle the offset between each row of pixel data based on this parameter; otherwise, image distortion may occur.
uStride
For YUV data, the line stride of the U buffer; for RGBA data, the value is 0.
Note: When processing video data, you must handle the offset between each row of pixel data based on this parameter; otherwise, image distortion may occur.
vStride
For YUV data, the line stride of the V buffer; for RGBA data, the value is 0.
Note: When processing video data, you must handle the offset between each row of pixel data based on this parameter; otherwise, image distortion may occur.
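As the notes above state, each row of pixel data must be addressed via the stride rather than the width, because the stride may include row padding. A minimal sketch of repacking the Y plane into a tightly packed buffer (the helper name is illustrative, not part of the SDK):

```dart
import 'dart:typed_data';

// Sketch: copy the visible part of the Y plane into a tightly packed
// buffer. Each source row starts at row * yStride, not row * width,
// because yStride may include padding beyond the visible width.
Uint8List packYPlane(Uint8List yBuffer, int width, int height, int yStride) {
  final packed = Uint8List(width * height);
  for (var row = 0; row < height; row++) {
    final srcStart = row * yStride;
    packed.setRange(
        row * width, (row + 1) * width, yBuffer, srcStart);
  }
  return packed;
}
```

The same row-by-row pattern applies to the U and V planes, using uStride and vStride with the half-resolution plane dimensions.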
yBuffer
For YUV data, the pointer to the Y buffer; for RGBA data, the data buffer.
uBuffer
For YUV data, the pointer to the U buffer; for RGBA data, the value is null.
vBuffer
For YUV data, the pointer to the V buffer; for RGBA data, the value is null.
rotation
Clockwise rotation angle of this frame before rendering. Supported values are 0, 90, 180, and 270 degrees.
renderTimeMs
Unix timestamp (in milliseconds) when the video frame is rendered. This timestamp can be used to guide video frame rendering. This parameter is required.
avsyncType
Reserved parameter.
metadataBuffer
This parameter applies only to video data in Texture format. It refers to the data buffer of MetaData. The default value is null.
metadataSize
This parameter applies only to video data in Texture format. It refers to the size of MetaData. Default is 0.
textureId
This parameter applies only to video data in Texture format. Texture ID.
matrix
This parameter applies only to video data in Texture format. A 4x4 transformation matrix input, typically an identity matrix.
colorSpace
Color space properties of the video frame. By default, Full Range and BT.709 configuration is applied. You can customize it according to the needs of custom capture and rendering. See ColorSpace.
alphaBuffer
Alpha channel data output by the portrait segmentation algorithm. This data has the same dimensions as the video frame. Each pixel value is within the range [0, 255], where 0 represents the background and 255 represents the foreground (portrait). By setting this parameter, you can render the video background with various effects such as transparency, solid color, image, or video.
Note:
  • In custom video rendering scenarios, ensure that both the video frame and alphaBuffer are Full Range type; other types may result in incorrect rendering of Alpha data.
  • Make sure that the dimensions of alphaBuffer exactly match the video frame (width × height), otherwise the app may crash.
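Given the crash risk noted above, it can be worth checking the size invariant before attaching Alpha data to a frame. A sketch (the helper name is illustrative, not part of the SDK):

```dart
import 'dart:typed_data';

// Sketch: guard the alphaBuffer size invariant before attaching it to a
// frame. A mismatch between the buffer length and width * height can
// crash the app during rendering.
void checkAlphaBuffer(Uint8List alphaBuffer, int width, int height) {
  if (alphaBuffer.length != width * height) {
    throw ArgumentError(
        'alphaBuffer length (${alphaBuffer.length}) must equal '
        'width * height (${width * height})');
  }
}
```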
alphaStitchMode
When the video frame contains Alpha channel data, sets the relative position of alphaBuffer and the video frame. See AlphaStitchMode.
metaInfo
Metadata in the video frame. This parameter requires contacting technical support to use.