ContentModerationDetection

Information about an inappropriate, unwanted, or offensive content label detection in a stored video.

Types

class Builder
object Companion
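
The Builder and Companion types support DSL-style construction, which is mainly useful when assembling instances by hand (for example in tests or fakes). A minimal sketch, assuming the usual smithy-kotlin builder pattern and the aws.sdk.kotlin.services.rekognition.model package; the property values shown are hypothetical:

```kotlin
import aws.sdk.kotlin.services.rekognition.model.ContentModerationDetection
import aws.sdk.kotlin.services.rekognition.model.ModerationLabel

// Assumed DSL: the companion object's invoke operator applies the block to a Builder.
val detection = ContentModerationDetection {
    startTimestampMillis = 1_000L   // segment starts 1 s into the video
    endTimestampMillis = 4_000L     // segment ends 4 s into the video
    durationMillis = 3_000L         // elapsed time from start to end
    timestamp = 1_000L              // when the label was first detected
    moderationLabel = ModerationLabel {
        name = "ExampleCategory"    // hypothetical label name
        confidence = 90.0f          // hypothetical confidence score
    }
}
```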

Properties

contentTypes

A list of predicted results for the type of content an image contains. For example, the image content might be from animation, sports, or a video game.

durationMillis

The time duration of a segment in milliseconds, i.e. the time elapsed from StartTimestampMillis to EndTimestampMillis.

endTimestampMillis

The time in milliseconds defining the end of the timeline segment containing a continuously detected moderation label.

moderationLabel

The content moderation label detected in the stored video.

startTimestampMillis

The time in milliseconds defining the start of the timeline segment containing a continuously detected moderation label.

timestamp

Time, in milliseconds from the beginning of the video, that the content moderation label was detected. Note that Timestamp is not guaranteed to be accurate to the individual frame where the moderated content first appears.
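
In application code these values are usually read from a GetContentModeration result rather than constructed directly. A hedged sketch, assuming RekognitionClient from aws.sdk.kotlin.services.rekognition, a completed content-moderation job, and a hypothetical jobId; the region and accessor names follow standard SDK conventions and are assumptions:

```kotlin
import aws.sdk.kotlin.services.rekognition.RekognitionClient
import aws.sdk.kotlin.services.rekognition.model.GetContentModerationRequest

suspend fun printModerationTimeline(jobId: String) {
    val rekognition = RekognitionClient { region = "us-east-1" }   // assumed region
    try {
        val response = rekognition.getContentModeration(
            GetContentModerationRequest { this.jobId = jobId }
        )
        // Each ContentModerationDetection places one moderation label on the video timeline.
        response.moderationLabels?.forEach { detection ->
            val label = detection.moderationLabel?.name ?: "unknown"
            println(
                "$label: ${detection.startTimestampMillis} ms - ${detection.endTimestampMillis} ms " +
                    "(duration ${detection.durationMillis} ms, first detected at ${detection.timestamp} ms)"
            )
        }
    } finally {
        rekognition.close()
    }
}
```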

Functions

open operator override fun equals(other: Any?): Boolean
open override fun hashCode(): Int
open override fun toString(): String