PCast™ Express
The PCast™ Express extends the PCast™ API to provide a single-step, configuration-based API for:
- Publishing local media
- Subscribing to published streams
This API is intended to be used as a supplement to the Room Express, although it can also be used for streaming entirely on its own.
Initializing
import PhenixSdk
let pcastExpressOptions = PhenixPCastExpressFactory.createPCastExpressOptionsBuilder()
.withAuthenticationToken("DIGEST:eyJhc...")
.buildPCastExpressOptions()
let pcastExpress = PhenixPCastExpressFactory.createPCastExpress(pcastExpressOptions)
PhenixPCastExpressOptionsBuilder
Name | Type | Default | Description |
---|---|---|---|
withAuthenticationToken (required) | NSString | | The authentication token generated using the Phenix EdgeAuth library. |
withUnrecoverableErrorCallback (optional) | PhenixPCastExpressUnrecoverableErrorCallback | | Function to be called when authentication fails or a failure occurs that is unrecoverable. |
withPCastUri (optional) | NSString | | Allows overriding default PCast™ URI. |
withPCastInitializationOptions (optional) | PhenixPCastInitializeOptions | | Use custom options when initializing PCast™. |
buildPCastExpressOptions | none | | Builds the PhenixPCastExpressOptions |
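The sketch below chains the optional builder methods from the table onto the initialization example above; the PCast™ URI value is only an illustrative placeholder, not a real endpoint.
import PhenixSdk
// Sketch only: chain optional overrides before building the options.
let customizedOptions = PhenixPCastExpressFactory.createPCastExpressOptionsBuilder()
.withAuthenticationToken("DIGEST:eyJhc...")
.withPCastUri("https://pcast.example.com") // placeholder value; overrides the default PCast™ URI
.buildPCastExpressOptions()
let customizedPCastExpress = PhenixPCastExpressFactory.createPCastExpress(customizedOptions)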
Publishing Local Media
Publish local user media:
- Camera
- Microphone
import PhenixSdk
let pcastExpress: PhenixPCastExpress = ... // previously obtained
let userMediaConstraints = PhenixUserMediaOptions()
userMediaConstraints.video.enabled = true
userMediaConstraints.video.capabilityConstraints[PhenixDeviceCapability.facingMode.rawValue] =
[PhenixDeviceConstraint.initWith(PhenixFacingMode.user)]
userMediaConstraints.audio.enabled = true
userMediaConstraints.audio.capabilityConstraints[PhenixDeviceCapability.audioEchoCancelationMode.rawValue] =
[PhenixDeviceConstraint.initWith(PhenixAudioEchoCancelationMode.on)]
let publishOptions = PhenixPCastExpressFactory.createPublishOptionsBuilder()
.withStreamToken("DIGEST:eyJhc...")
.withMediaConstraints(userMediaConstraints)
.buildPublishOptions()
pcastExpress.publish(publishOptions) { (status: PhenixRequestStatus, publisher: PhenixExpressPublisher?) in
if status == .ok {
// Do something with publisher
} else {
// Handle error
}
}
// Create a publisher with an automatically started preview renderer
let renderLayer: CALayer = ... // previously obtained
let publishOptionsWithPreview = PhenixPCastExpressFactory.createPublishOptionsBuilder()
.withStreamToken("DIGEST:eyJhc...")
.withMediaConstraints(userMediaConstraints)
.withPreviewRenderer(renderLayer)
.buildPublishOptions()
pcastExpress.publish(withPreview: publishOptionsWithPreview) {
(status: PhenixRequestStatus, publisher: PhenixExpressPublisher?, preview: PhenixRenderer?) in
if status == .ok {
// Do something with publisher and preview renderer
} else {
// Handle error
}
}
Notes:
- The preview renderer will already have been started by the time it is received by your callback
- The publisher will remain active for as long as you keep a reference to it
Publishing Local Media Parameters
Name | Type | Description |
---|---|---|
options (required) | PhenixPublishOptions | Publish options |
callback (required) | function | Callback for error/success handling |
PhenixPublishOptionsBuilder
Name | Type | Default | Description |
---|---|---|---|
withMediaConstraints (required) | PhenixUserMediaOptions | getUserMedia options | Constraints to get the user media. |
withUserMedia (optional) | PhenixUserMediaStream | | Alternative to withMediaConstraints - you can pass a user media stream returned from getUserMedia. |
withPreviewRenderer (optional) | CALayer | | Render layer on which to display the local preview. If none of the withPreview... methods are called, no preview renderer will be instantiated. |
withPreviewRenderer (optional) | none | | Will trigger instantiation of a preview renderer. Useful for audio-only streams that do not require a render surface. |
withPreviewRendererOptions (optional) | PhenixRendererOptions | | Options passed to the preview renderer. Will trigger instantiation of a preview renderer. |
withMonitor (optional) | PhenixMonitorSetupFailedCallback, PhenixMonitorStreamEndedCallback, PhenixMonitorOptions | | Options for monitoring a publisher for failure. |
withTags (optional) | NSArray of NSStrings | | Tags for the stream. |
withStreamToken (required) | NSString | | The publish token generated using the Phenix EdgeAuth library. |
buildPublishOptions | none | | Builds the PhenixPublishOptions |
Publishing Local Media Callback Arguments
Name | Type | Description |
---|---|---|
status | PhenixRequestStatus | The status of the operation |
publisher | PhenixExpressPublisher | Phenix publisher object |
previewRenderer | PhenixRenderer | Optionally provided if any of the withPreview... methods were called on the options builder and publish is invoked with withPreview.
PhenixExpressPublisher
Shares most methods with regular PhenixPublisher returned by PhenixPCast, see Publish a Stream.
Name | Signature | Returns | Description |
---|---|---|---|
stop | () | void | Stops publisher. Subscribers will receive stream ended. |
stop | (reason) | void | Stops publisher with a custom reason. Subscribers will receive PhenixStreamEndedReasonCustom reason. |
enableAudio | () | void | Unmutes audio. |
disableAudio | () | void | Mutes audio. |
enableVideo | () | void | Unmutes video. |
disableVideo | () | void | Mutes video (black frames). |
setDataQualityChangedCallback | (callback) | void | Listen for Data Quality Feedback |
limitBandwidth | (bandwidthLimitInBps) | PhenixDisposable | Temporarily limit published video bitrate, see Limit Bandwidth |
getStreamId | () | NSString | Returns stream ID of publisher |
hasEnded | () | bool | Indicates whether publisher has ended, e.g. by stop having been invoked |
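A minimal usage sketch for a previously obtained PhenixExpressPublisher, using only the methods listed above:
import PhenixSdk
let publisher: PhenixExpressPublisher = ... // previously obtained
// Mute, then unmute, the published audio
publisher.disableAudio()
publisher.enableAudio()
// Inspect the publisher
let streamId = publisher.getStreamId() // e.g., for logging
if !publisher.hasEnded() {
    // Stop publishing; the stop(reason) variant from the table can pass a custom reason instead
    publisher.stop()
}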
Subscribing to Published Media
Subscribe to streams published with the Phenix platform
import PhenixSdk
let pcastExpress: PhenixPCastExpress = ... // previously obtained
let renderLayer: CALayer = ... // previously obtained
let subscribeOptions = PhenixPCastExpressFactory.createSubscribeOptionsBuilder()
.withStreamId("us-west#us-west1-b.zzzzzzzz.20000000.xxxxxxxx")
.withStreamToken("DIGEST:eyJhc...")
.withRenderer(renderLayer)
.buildSubscribeOptions()
pcastExpress.subscribe(subscribeOptions) { (status: PhenixRequestStatus, subscriber: PhenixExpressSubscriber?, renderer: PhenixRenderer?) in
if status == .ok {
// Do something with subscriber
if let renderer = renderer {
// Returned if `withRenderer...` option was enabled - Do something with renderer
}
} else {
// Handle error
}
}
Notes:
- The renderer will already have been started by the time it is received by your callback
- If a renderer is provided, your PhenixExpressSubscriber will be kept alive for as long as you are referencing that renderer. There is no need to also store a reference to the subscriber in that case
- Once subscriber and renderer references have been released, the renderer and subscription will automatically be stopped
Subscribe Parameters
Name | Type | Description |
---|---|---|
options (required) | PhenixSubscribeOptions | Subscribe options |
callback (required) | function | Callback for error/success handling |
Subscribe Options
Name | Type | Default | Description |
---|---|---|---|
withStreamId (required) | NSString | | The stream ID of the published stream |
withStreamToken (required) | NSString | | The subscribe token generated using the Phenix EdgeAuth library. |
withRenderer (optional) | CALayer | | Render layer on which to display the stream. If none of the withRenderer... methods are called, no renderer will be instantiated. |
withRenderer (optional) | none | | Will trigger instantiation of a renderer. Useful for audio-only streams that do not require a render surface. |
withRendererOptions (optional) | PhenixRendererOptions | | Options passed to the renderer. Will trigger instantiation of a renderer. |
withMonitor (optional) | PhenixMonitorSetupFailedCallback, PhenixMonitorStreamEndedCallback, PhenixMonitorOptions | | Options for monitoring a subscriber for failure. |
withTags (optional) | NSArray of NSStrings | | Tags for the stream |
buildSubscribeOptions | none | | Builds the PhenixSubscribeOptions |
PhenixExpressSubscriber
Shares most methods with regular PhenixMediaStream returned by PhenixPCast, see Subscribe to a Stream.
Name | Signature | Returns | Description |
---|---|---|---|
createRenderer | () | PhenixRenderer | Creates a new renderer. This should only be called if none of the withRenderer... builder methods were invoked. |
createRenderer | (PhenixRendererOptions) | PhenixRenderer | Creates a new renderer with PhenixRendererOptions. This should only be called if none of the withRenderer... builder methods were invoked. |
getAudioTracks | () | NSArray of PhenixMediaStreamTrack | Returns all associated audio tracks of this stream |
getVideoTracks | () | NSArray of PhenixMediaStreamTrack | Returns all associated video tracks of this stream |
getTracks | () | NSArray of PhenixMediaStreamTrack | Returns all associated tracks of this stream |
stop | () | void | Stops subscription. This will trigger the stream ended event. |
disableAudio | () | void | Mutes audio. |
enableVideo | () | void | Unmutes video. |
disableVideo | () | void | Mutes video (black frames). |
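A brief sketch of track access and manual renderer creation, assuming the subscriber was built without any of the withRenderer... options:
import PhenixSdk
let subscriber: PhenixExpressSubscriber = ... // previously obtained (built without withRenderer...)
// Inspect the available tracks
let audioTracks = subscriber.getAudioTracks()
let videoTracks = subscriber.getVideoTracks()
// No renderer was instantiated by the builder, so create one manually
let renderer = subscriber.createRenderer()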
Monitor
Note: On iOS, the monitor options are currently ignored, but the callbacks for stream setup and stream ended will be triggered.
Monitor callbacks and options can be passed to the subscribe and publish options builders. The first callback gets invoked only when we internally fail to set up a stream. The second callback gets invoked whenever a stream ends, whether due to failure or not. The retry PhenixOptionalAction allows you to retry publishing or subscribing the failed stream. You must first test whether a retry action is present by calling isPresent, as it may not be possible to retry the stream (for example, a stream that ended normally cannot be retried). You may call dismiss on the retry action, but you do not have to (the action is automatically dismissed when no longer referenced). You may also defer invoking the retry action.
Example Monitor for subscribing
import PhenixSdk
let monitorOptions = PhenixPCastExpressFactory.createMonitorOptionsBuilder().buildMonitorOptions()
let subscribeOptions = PhenixPCastExpressFactory.createSubscribeOptionsBuilder()
.withStreamId("us-west#us-west1-b.zzzzzzzz.20000000.xxxxxxxx")
.withStreamToken("DIGEST:eyJhc...")
.withMonitor({ (status: PhenixRequestStatus, retry: PhenixOptionalAction?) in
// Stream failed to setup, check if retry is a possibility:
if let retry = retry, retry.isPresent() {
if determineWhetherToRetry() {
retry.perform()
} else {
// Not technically necessary, but here for clarity
retry.dismiss()
}
}
},
{ (reason: PhenixStreamEndedReason, description: String?, retry: PhenixOptionalAction?) in
// Stream has ended, check if due to failure
if let retry = retry, retry.isPresent() {
if reason == .failed {
retry.perform()
} else {
// Not technically necessary, but here for clarity
retry.dismiss()
}
}
},
monitorOptions)
.buildSubscribeOptions()
PhenixOptionalAction
Name | Signature | Returns | Description |
---|---|---|---|
perform | () | void | Performs the action. This will cause a failure if isPresent is false. |
dismiss | () | void | Dismisses the action (if any). Can be called multiple times and will result in isPresent returning false. Dropping the reference to the PhenixOptionalAction has the same effect. |
isPresent | () | bool | Indicates whether an action can be performed. |
PhenixMonitorSetupFailedCallback Callback Arguments
Name | Type | Description |
---|---|---|
status | PhenixRequestStatus | The status of the operation. |
retry | PhenixOptionalAction | Optionally allow retrying the failed stream. |
PhenixMonitorStreamEndedCallback Callback Arguments
Name | Type | Description |
---|---|---|
reason | PhenixStreamEndedReason | Reason for stream ended. |
description | NSString | Optional additional ended reason description. Carries custom message. |
retry | PhenixOptionalAction | Optionally allow retrying the failed stream. For normally ended streams isPresent will always return false. |
Express Get User Media
Get local user media. For now this is merely a wrapper around Get Local User Media.
import PhenixSdk
let pcastExpress: PhenixPCastExpress = ... // previously obtained
let userMediaOptions = PhenixUserMediaOptions()
pcastExpress.getUserMedia(userMediaOptions) { (status: PhenixRequestStatus, userMedia: PhenixUserMediaStream?) in
if status == .ok {
// Do something with user media stream
} else {
// Handle error
}
}
Express Get User Media Parameters
Name | Type | Description |
---|---|---|
options (required) | PhenixUserMediaOptions | User media options |
callback (required) | function | Callback for error/success handling |
Express Get User Media Callback Arguments
Name | Type | Description |
---|---|---|
status | PhenixRequestStatus | The status of the operation. |
userMedia | PhenixUserMediaStream | User media stream |
Get PCast™
Get the underlying PCast™ instance. This is preferable to creating another instance, which would introduce additional overhead.
import PhenixSdk
let pcastExpress: PhenixPCastExpress = ... // previously obtained
let pcast = pcastExpress.pcast
Clean up
Subscribers and publishers are kept alive for as long as they are being referenced in your app. PhenixPCastExpress will only shut down once it is no longer referenced.
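A minimal sketch of this lifecycle, assuming a view controller owns the Express objects; the class and property names are illustrative only.
import UIKit
import PhenixSdk
final class StreamViewController: UIViewController {
    // Keeping these references keeps the publisher and the subscription alive;
    // releasing them (or deallocating this controller) cleans them up.
    private var pcastExpress: PhenixPCastExpress?
    private var publisher: PhenixExpressPublisher?
    private var subscriberRenderer: PhenixRenderer? // also keeps the associated subscriber alive

    func tearDown() {
        publisher?.stop()
        publisher = nil
        subscriberRenderer = nil
    }
}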
Process Raw Frames with Frame Ready API
The Frame Ready API allows access to raw unencoded audio and video frames on the publisher as well as subscriber side. This enables use cases such as the following:
- Injection of raw frames from a custom source
- Composition of raw frames (e.g. for applying watermarks, or stickers)
- Application of effects (e.g. changing audio volume, applying video filters)
- Controlling playback (e.g. temporary slow-motion, pause)
Please note: Any processing needs to keep up with the incoming frame rate; otherwise some of the incoming frames will be dropped to compensate.
Process Raw Frames Publisher side
The API is attached to PhenixUserMediaStream and is called setFrameReadyCallback. A track needs to be passed in to indicate for which frames to receive notifications. PhenixUserMediaStream (via its contained PhenixMediaStream) provides access to the currently available tracks via getVideoTracks and getAudioTracks.
Example Code for processing video raw frames
import PhenixSdk
let userMediaStream: PhenixUserMediaStream = ... // previously obtained
// Assume there is at least one video track:
let videoTrack = userMediaStream.mediaStream.getVideoTracks()[0]
// 1) Example showing how to read an incoming video frame and produce an outgoing one:
userMediaStream.setFrameReadyCallback(videoTrack) { (frameNotification: PhenixFrameNotification?) in
frameNotification?.read { (inputFrame: CMSampleBuffer?) in
let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(inputFrame!)!
// Process, manipulate pixel buffer
// ...
// Assume we generate an output CVPixelBuffer, which may or may not be based
// on the incoming frame:
let outputPixelBuffer: CVPixelBuffer = ...
// Assemble an output frame using the same timestamps as the input sample buffer:
let presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(inputFrame!)
let duration = CMSampleBufferGetDuration(inputFrame!)
var sampleTimingInfo = CMSampleTimingInfo.init(
duration: duration,
presentationTimeStamp: presentationTimeStamp,
decodeTimeStamp: CMTime.invalid)
var formatDescription: CMFormatDescription? = nil
CMVideoFormatDescriptionCreateForImageBuffer(
allocator: kCFAllocatorDefault,
imageBuffer: outputPixelBuffer,
formatDescriptionOut: &formatDescription)
var outputFrame: CMSampleBuffer? = nil
CMSampleBufferCreateReadyWithImageBuffer(
allocator: kCFAllocatorDefault,
imageBuffer: outputPixelBuffer,
formatDescription: formatDescription!,
sampleTiming: &sampleTimingInfo,
sampleBufferOut: &outputFrame)
frameNotification?.write(outputFrame)
}
}
// 2) Example showing how we can just directly write output frames from a custom source without the need to
// read the incoming frame:
userMediaStream.setFrameReadyCallback(videoTrack) { (frameNotification: PhenixFrameNotification?) in
// Assume we have a custom source that is able to provide CMSampleBuffer:
let outputFrame: CMSampleBuffer? = ...
frameNotification?.write(outputFrame)
}
// 3) Example showing how we can stop frames from getting propagated by instructing the notification
// to drop them:
userMediaStream.setFrameReadyCallback(videoTrack) { (frameNotification: PhenixFrameNotification?) in
// We want to prevent frames from getting propagated further:
frameNotification?.drop()
}
// 4) Example showing how to read and convert incoming frames to a specific pixel format:
userMediaStream.setFrameReadyCallback(videoTrack) { (frameNotification) in
frameNotification?.read(with: .BGRA) { (inputFrame: CMSampleBuffer?) in
let bgraPixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(inputFrame!)!
// ...
}
}
Example Code for processing audio raw frames
import PhenixSdk
let userMediaStream: PhenixUserMediaStream = ... // previously obtained
// Assume there is at least one audio track:
let audioTrack = userMediaStream.mediaStream.getAudioTracks()[0]
// 1) Example showing how to read an incoming audio frame and produce an outgoing one:
userMediaStream.setFrameReadyCallback(audioTrack) { (frameNotification: PhenixFrameNotification?) in
frameNotification?.read { (inputFrame: CMSampleBuffer?) in
let blockBuffer = CMSampleBufferGetDataBuffer(inputFrame!)
var totalLength = Int()
var lengthAtOffset = Int()
var dataPointer: UnsafeMutablePointer<Int8>? = nil
guard CMBlockBufferGetDataPointer(
blockBuffer!,
atOffset: 0,
lengthAtOffsetOut: &lengthAtOffset,
totalLengthOut: &totalLength,
dataPointerOut: &dataPointer) == kCMBlockBufferNoErr && lengthAtOffset == totalLength else {
// Handle error, unexpected buffer length
return
}
// Verify audio format:
guard let formatDescription: CMAudioFormatDescription =
CMSampleBufferGetFormatDescription(inputFrame!) else {
return
}
guard let audioStreamBasicDescription =
CMAudioFormatDescriptionGetStreamBasicDescription(formatDescription) else {
return
}
guard (audioStreamBasicDescription.pointee.mFormatFlags & kAudioFormatFlagIsSignedInteger)
== kAudioFormatFlagIsSignedInteger &&
audioStreamBasicDescription.pointee.mBitsPerChannel == 16 else {
return
}
// Raw samples are contained in `dataPointer` as Int16 values
...
// Assume we generate an output CMBlockBuffer, which may or may not be based
// on the incoming audio frame, and which has the same audio format and the same
// length as the incoming frame:
let outputAudioBuffer: CMBlockBuffer = ...
let presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(inputFrame!)
let numberOfSamples =
totalLength /
Int(audioStreamBasicDescription.pointee.mChannelsPerFrame * audioStreamBasicDescription.pointee.mBitsPerChannel / 8)
var outputFrame: CMSampleBuffer? = nil
CMAudioSampleBufferCreateReadyWithPacketDescriptions(
allocator: kCFAllocatorDefault,
dataBuffer: outputAudioBuffer,
formatDescription: formatDescription,
sampleCount: numberOfSamples,
presentationTimeStamp: presentationTimeStamp,
packetDescriptions: nil,
sampleBufferOut: &outputFrame)
frameNotification?.write(outputFrame)
}
}
// Other examples would look very similar to video
Frame Ready Callback Arguments
This callback gets invoked for each frame that is passing through.
Name | Type | Description |
---|---|---|
frameNotification | PhenixFrameNotification | Object representing the current frame |
Frame Notification
Represents the current frame. Allows for reading, writing, and dropping.
Name | Signature | Returns | Description |
---|---|---|---|
read | (ReadFrameCallback) | void | Retrieves the current raw frame in the form of a CMSampleBufferRef |
readWithFormat | (PhenixMediaFormat, ReadFrameCallback) | void | Retrieves the current raw frame converted to the requested media format, in the form of a CMSampleBufferRef |
write | (CMSampleBufferRef) | void | Writes back a processed or newly generated frame. The frame can have a different resolution and timestamps (the buffer attributes are expected to be set correctly) |
drop | () | void | Instructs stream to drop the current frame |
Read Frame Callback Arguments
Receives the current raw frame.
Name | Type | Description |
---|---|---|
frame | CMSampleBufferRef | The raw audio or video frame |
Media Format
Format | Description |
---|---|
PhenixMediaFormatI420 | FourCC planar pixel format YUV-I420, corresponds to Apple kCVPixelFormatType_420YpCbCr8Planar |
PhenixMediaFormatNV12 | FourCC planar pixel format YUV-NV12, corresponds to Apple kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange |
PhenixMediaFormatBGRA | FourCC pixel format RGB-ARGB, corresponds to Apple kCVPixelFormatType_32BGRA |
Process Raw Frames Subscriber side
The Frame Ready API on the subscriber side can be accessed via the renderer, which offers a setFrameReadyCallback method analogous to the publisher side.
When using the Express API, you can access the stream tracks directly via the Express Subscriber. Otherwise, tracks can be accessed via the PhenixMediaStream object.
The example code is somewhat abbreviated, as the contents of the Frame Ready callbacks would look the same as for the publisher side.
Example Code for hooking up frame-ready callback with a PhenixMediaStream object
import PhenixSdk
let mediaStream: PhenixMediaStream = ... // previously obtained
let renderer: PhenixRenderer = ... // previously obtained
// Assume there is at least one video track:
let videoTrack = mediaStream.getVideoTracks()[0]
renderer.setFrameReadyCallback(videoTrack, ...
// Remainder of code identical to Publish Side
Example Code for hooking up frame-ready callback with a PhenixExpressSubscriber object
import PhenixSdk
let expressSubscriber: PhenixExpressSubscriber = ... // previously obtained
let renderer: PhenixRenderer = ... // previously obtained
// Assume there is at least one video track:
let videoTrack = expressSubscriber.getVideoTracks()[0]
renderer.setFrameReadyCallback(videoTrack, ...
// Remainder of code identical to Publish Side
Limit Bandwidth
The outgoing or incoming video bandwidth can be limited on the publisher and subscriber side.
Limit Bandwidth Publisher side
The published video bitrate can be limited temporarily if needed. The returned disposable controls how long the limitation stays in effect. If limitBandwidth is called multiple times before any of the previous disposables are released, only the most recent override remains in effect until its disposable is released. Releasing any of the disposables from earlier limitBandwidth calls has no effect.
Example Code for limiting video bandwidth with a PhenixPublisher object
import PhenixSdk
// Previously obtained
let publisher: PhenixPublisher = ...
// Limit video bitrate to 200kbps for 10 seconds
var disposable = publisher.limitBandwidth(200000)
DispatchQueue.main.asyncAfter(deadline: .now() + 10) {
// Dropping the disposable will undo the bandwidth limitation
disposable = nil
}
Example Code for limiting video bandwidth with a PhenixExpressPublisher object
import PhenixSdk
// Previously obtained
let expressPublisher: PhenixExpressPublisher = ...
// Limit video bitrate to 200kbps for 10 seconds
var disposable = expressPublisher.limitBandwidth(200000)
DispatchQueue.main.asyncAfter(deadline: .now() + 10) {
// Dropping the disposable will undo the bandwidth limitation
disposable = nil
}
Parameters
Name | Type | Description |
---|---|---|
bandwidthLimitInBps (required) | UInt64 | Maximum bitrate limit in bps for video |
Limit Bandwidth Subscriber side
Invoking limitBandwidth on a subscriber will inform the publishing side to lower the video bandwidth to try to match the requested value. The API is attached to PhenixMediaStreamTrack. The semantics with regard to the disposable returned by the API are identical to the publisher side.
Notes:
- Whether a publisher is able to meet the requested bandwidth also depends on the stream capabilities for the given publisher. For MBR (multi-bitrate publisher capability enabled), the publisher will be able to fairly closely match the requested bitrate. With SBR (without the multi-bitrate capability), the bit rate received by the subscriber may not match as well.
- The video bitrate will not immediately change after limitBandwidth has been invoked. Similarly, once the disposable has been released, it may take several seconds for the bitrate to recover.
Example Code for limiting video bandwidth with a PhenixMediaStream object
import PhenixSdk
// Previously obtained
let subscriber: PhenixMediaStream = ...
// Limit video bitrate to 200kbps for 10 seconds
// We assume there is at least one video track on this stream
var disposable = subscriber.getVideoTracks()[0].limitBandwidth(200000)
DispatchQueue.main.asyncAfter(deadline: .now() + 10) {
// Dropping the disposable will undo the bandwidth limitation
disposable = nil
}
Example Code for limiting video bandwidth with a PhenixExpressSubscriber object
import PhenixSdk
// Previously obtained
let subscriber: PhenixExpressSubscriber = ...
// Limit video bitrate to 200kbps for 10 seconds
// We assume there is at least one video track on this stream
var disposable = subscriber.getVideoTracks()[0].limitBandwidth(200000)
DispatchQueue.main.asyncAfter(deadline: .now() + 10) {
// Dropping the disposable will undo the bandwidth limitation
disposable = nil
}
Parameters
Name | Type | Description |
---|---|---|
bandwidthLimitInBps (required) | UInt64 | Maximum bitrate limit in bps for video |