PCast™ API - Deprecated
Initializing
A PCast™ object must be initialized before use, as shown below.
import com.phenixrts.pcast.PCast;
import com.phenixrts.pcast.PCastInitializeOptions;
import com.phenixrts.pcast.android.AndroidPCastFactory;
PCast pcast = AndroidPCastFactory.createPCast(this);
// #1 Default initialization options
pcast.initialize();
// OR
// #2 Custom initialization options (initialized by constructor)
final boolean enableProcessTerminationSignalHandling = false;
PCastInitializeOptions initOptions = new PCastInitializeOptions(
enableProcessTerminationSignalHandling);
pcast.initialize(initOptions);
Initialize Options
Name | Description |
---|---|
enableProcessTerminationSignalHandling | Controls whether process termination signal handling is enabled |
configureLogging | Controls whether Phenix logging is enabled |
streamingSourceMapping | Allows partial override of streaming source URIs |
Stream Source Mapping
An optional field that can be provided with PCastInitializeOptions. It only has an effect on the subscriber side, and only for streams with the "streaming" capability (on both the publisher and subscriber side). This property allows you to redirect requests from the SDK's player to your own CDN.
import com.phenixrts.pcast.PCastInitializeOptions;
import com.phenixrts.pcast.StreamingSourceMapping;
final String patternToReplace = "https:\\/\\/phenixrts\\.com\\/video";
final String replacement = "https://myown.cdn.com";
final StreamingSourceMapping streamingSourceMapping = new StreamingSourceMapping(
patternToReplace, replacement);
final boolean enableProcessTerminationSignalHandling = false;
final boolean configureLogging = true;
final PCastInitializeOptions pcastInitOptions = new PCastInitializeOptions(
enableProcessTerminationSignalHandling,
configureLogging,
streamingSourceMapping);
Name | Description |
---|---|
patternToReplace | A regular expression indicating which part of the incoming streaming source URI to replace |
replacement | The replacement string to insert into the streaming source URI |
Note: The regular expression has to be properly escaped to work correctly.
Connect and Authenticate
Follow the example for connecting and authenticating to PCast™.
import android.util.Log;
import com.phenixrts.pcast.PCast;
import com.phenixrts.pcast.RequestStatus;
// Created via EdgeAuth library
String authenticationToken = ...;
// Previously initialized
PCast pcast = ...;
pcast.start(
authenticationToken,
new PCast.AuthenticationCallback() {
@Override
public void onEvent(
PCast pcast,
RequestStatus requestStatus,
String sessionId) {
if (requestStatus == RequestStatus.OK) {
Log.i("Phenix SDK Example", "PCast started...");
} else {
Log.e("Phenix SDK Example", "Failed to start PCast...");
}
}
},
new PCast.OnlineCallback() {
@Override
public void onEvent(PCast pcast) {
Log.i("Phenix SDK Example", "We are online...");
}
},
new PCast.OfflineCallback() {
@Override
public void onEvent(PCast pcast) {
Log.i("Phenix SDK Example", "We are offline...");
}
});
Connect and Authenticate Parameters
Name | Type | Description |
---|---|---|
authenticationToken (required) | string | The authentication token generated using the Phenix EdgeAuth library |
authenticationCallback (required) | PCast.AuthenticationCallback | Called upon successful authentication or when authentication failed or has to be redone. Upon successful authentication, the authenticationCallback will be called with status=RequestStatus.OK. If at any time a new authenticationToken is required, then the authenticationCallback is called with status=RequestStatus.UNAUTHORIZED to indicate that we are no longer authenticated. |
onlineCallback (required) | PCast.OnlineCallback | Called when the client is connected to the streaming platform |
offlineCallback (required) | PCast.OfflineCallback | Called when the client is disconnected from the streaming platform. Ongoing streams may continue while we are temporarily disconnected. However, no new streams can be started while being disconnected. The client automatically tries to reconnect and will call onlineCallback when it succeeds in doing so or eventually call authenticationCallback to indicate that re-authentication is required. |
Authentication Callback Status Codes
Status | Valid Fields | Description |
---|---|---|
RequestStatus.OK | sessionId | Authentication succeeded, the sessionId is populated |
RequestStatus.UNAUTHORIZED | none | Authentication failed or re-authentication required |
varies | none | Authentication failed for other reasons |
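For illustration, an authentication callback might branch on these status codes as shown in the following sketch; how you obtain and re-submit a new token is entirely up to your application.
import android.util.Log;
import com.phenixrts.pcast.PCast;
import com.phenixrts.pcast.RequestStatus;

PCast.AuthenticationCallback authenticationCallback = new PCast.AuthenticationCallback() {
    @Override
    public void onEvent(PCast pcast, RequestStatus requestStatus, String sessionId) {
        switch (requestStatus) {
            case OK:
                // Authenticated; 'sessionId' is populated
                Log.i("Phenix SDK Example", "Authenticated with session [" + sessionId + "]");
                break;
            case UNAUTHORIZED:
                // No longer authenticated: obtain a fresh authenticationToken
                // from your backend and call pcast.start(...) again
                Log.w("Phenix SDK Example", "Re-authentication required");
                break;
            default:
                // Any other status indicates a failure; check the logs
                Log.e("Phenix SDK Example", "Authentication failed [" + requestStatus + "]");
                break;
        }
    }
};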
Disconnect
Follow the example below to disconnect from PCast™.
import com.phenixrts.pcast.PCast;
import com.phenixrts.pcast.RequestStatus;
// Previously initialized and started
PCast pcast = ...;
pcast.stop();
// Once you are done using PCast (e.g. exiting the app)
pcast.shutdown();
Get Local User Media
Follow the example below to get local user media.
import com.phenixrts.pcast.DeviceCapability;
import com.phenixrts.pcast.DeviceConstraint;
import com.phenixrts.pcast.FacingMode;
import com.phenixrts.pcast.PCast;
import com.phenixrts.pcast.RequestStatus;
import com.phenixrts.pcast.UserMediaOptions;
import com.phenixrts.pcast.UserMediaStream;
import java.util.Arrays;
// Previously initialized and started
PCast pcast = ...;
UserMediaOptions gumOptions = new UserMediaOptions();
// Customize options if desired
gumOptions.getVideoOptions().capabilityConstraints.put(
DeviceCapability.FACING_MODE, Arrays.asList(new DeviceConstraint(FacingMode.USER)));
gumOptions.getAudioOptions().enabled = false;
pcast.getUserMedia(
gumOptions,
new PCast.UserMediaCallback() {
@Override
public void onEvent(
PCast pcast,
RequestStatus status,
UserMediaStream userMediaStream) {
// Check status and store 'userMediaStream'
}
});
Get Local User Media Parameters
Name | Type | Description |
---|---|---|
options (required) | UserMediaOptions | The options defining the requested user media stream |
userMediaCallback (required) | PCast.UserMediaCallback | Upon acquiring the user media stream, the userMediaCallback will be called with status=RequestStatus.OK. If the user media is currently in use by another application, you may receive status=RequestStatus.CONFLICT. If the operation fails with status=RequestStatus.FAILED, please check the logs for more information |
Device Capability
Name | Description |
---|---|
DeviceCapability.WIDTH | Width in pixels |
DeviceCapability.HEIGHT | Height in pixels |
DeviceCapability.FRAME_RATE | Number of frames per second |
DeviceCapability.FACING_MODE | Facing mode |
DeviceCapability.FLASH_MODE | Flash mode |
DeviceCapability.DEVICE_ID | Device ID string (obtained via source device enumeration, see enumerateSourceDevices below) |
DeviceCapability.LOCATION | Device Location |
DeviceCapability.POLAR_PATTERN | Polar pattern |
DeviceCapability.AUDIO_ECHO_CANCELATION_MODE | Audio echo cancelation mode |
Constraint Type
Name | Description |
---|---|
ConstraintType.MIN | Hard constraint: Capability must have at least the specified value |
ConstraintType.MAX | Hard constraint: Capability must have at most the specified value |
ConstraintType.EXACT | Hard constraint: Capability must have exactly the specified value |
ConstraintType.IDEAL | Soft constraint: Capability should have specified value, but other values are acceptable (default) |
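A soft constraint is expressed the same way as the hard constraints shown later in this document. The following sketch (arbitrary values; it assumes ConstraintType, DeviceCapability and DeviceConstraint live in the com.phenixrts.pcast package like the other classes used here) requests an ideal frame rate of 30 fps:
import com.phenixrts.pcast.ConstraintType;
import com.phenixrts.pcast.DeviceCapability;
import com.phenixrts.pcast.DeviceConstraint;
import com.phenixrts.pcast.UserMediaOptions;
import java.util.Arrays;

UserMediaOptions gumOptions = new UserMediaOptions();
// Soft constraint: prefer 30 frames per second, but accept other rates
gumOptions.getVideoOptions().capabilityConstraints.put(
    DeviceCapability.FRAME_RATE,
    Arrays.asList(new DeviceConstraint(30, ConstraintType.IDEAL)));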
Facing Mode
Name | Description |
---|---|
FacingMode.AUTOMATIC | Select a facing mode automatically (default) |
FacingMode.ENVIRONMENT | Facing the surrounding environment (e.g., back camera) |
FacingMode.USER | Facing the user (e.g., front camera) |
Flash Mode
Only applicable to video devices
Name | Description |
---|---|
FlashMode.AUTOMATIC | Flash is turned on automatically when needed (default) |
FlashMode.ALWAYS_ON | Flash is on (if available) |
FlashMode.ALWAYS_OFF | Flash is off |
Device Location
Name | Description |
---|---|
Location.AUTOMATIC | Select any device (default) |
Location.UPPER | Mounted on top of phone/tablet |
Location.LOWER | Mounted at bottom of phone/tablet |
Polar Pattern
Only applicable to audio devices
Name | Description |
---|---|
PolarPattern.AUTOMATIC | Automatically select pattern (default) |
PolarPattern.OMNIDIRECTIONAL | Equally sensitive to sound from any direction |
PolarPattern.CARDIOID | Most sensitive to sound from the direction in which the data source points and is (nearly) insensitive to sound from the opposite direction |
PolarPattern.SUBCARDIOID | Most sensitive to sound from the direction in which the data source points and is less sensitive to sound from the opposite direction |
Note: This option is currently not supported on Android
Audio Echo Cancelation Mode
Only applicable to audio devices
Name | Description |
---|---|
AudioEchoCancelationMode.AUTOMATIC | Automatically select AEC mode (default) |
AudioEchoCancelationMode.ON | Enable AEC if available |
AudioEchoCancelationMode.OFF | Disable AEC |
Note (as of v2019.2.0): Automatic mode is currently always set to "off" on Android.
Note 2 (as of v2019.2.0): AudioEchoCancelationMode must be set to ON in the UserMediaOptions for all published streams and RendererOptions for all subscribed streams in order for AEC to work reliably on all Android devices.
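As a sketch of Note 2, AEC would be requested on both the publishing and the rendering side. The DeviceConstraint form below mirrors the other DeviceCapability examples in this document and is an assumption, not a verbatim SDK sample:
import com.phenixrts.pcast.AudioEchoCancelationMode;
import com.phenixrts.pcast.DeviceCapability;
import com.phenixrts.pcast.DeviceConstraint;
import com.phenixrts.pcast.RendererOptions;
import com.phenixrts.pcast.UserMediaOptions;
import java.util.Arrays;

// Publish side: request AEC in the user media options
UserMediaOptions gumOptions = new UserMediaOptions();
gumOptions.getAudioOptions().capabilityConstraints.put(
    DeviceCapability.AUDIO_ECHO_CANCELATION_MODE,
    Arrays.asList(new DeviceConstraint(AudioEchoCancelationMode.ON)));

// Subscribe side: request AEC in the renderer options
RendererOptions rendererOptions = new RendererOptions();
rendererOptions.audioEchoCancelationMode = AudioEchoCancelationMode.ON;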
Updating Options
Sometimes it is useful to change the camera while a stream is running, or to temporarily turn on the flashlight.
import com.phenixrts.pcast.ConstraintType;
import com.phenixrts.pcast.DeviceCapability;
import com.phenixrts.pcast.DeviceConstraint;
import com.phenixrts.pcast.FacingMode;
import com.phenixrts.pcast.FlashMode;
import com.phenixrts.pcast.UserMediaOptions;
import com.phenixrts.pcast.UserMediaStream;
import java.util.Arrays;
// Previously obtained via 'getUserMedia'
UserMediaStream userMediaStream = ...;
// Previously initialized and used with 'getUserMedia'
UserMediaOptions gumOptions = new UserMediaOptions();
gumOptions.getVideoOptions().capabilityConstraints.put(
DeviceCapability.FACING_MODE, Arrays.asList(new DeviceConstraint(FacingMode.ENVIRONMENT)));
gumOptions.getVideoOptions().capabilityConstraints.put(
DeviceCapability.FLASH_MODE, Arrays.asList(new DeviceConstraint(FlashMode.ALWAYS_ON)));
gumOptions.getVideoOptions().capabilityConstraints.put(
DeviceCapability.HEIGHT,
Arrays.asList(new DeviceConstraint(720, ConstraintType.EXACT)));
gumOptions.getVideoOptions().capabilityConstraints.put(
DeviceCapability.WIDTH,
Arrays.asList(
new DeviceConstraint(800, ConstraintType.MIN),
new DeviceConstraint(1500, ConstraintType.MAX)));
userMediaStream.applyOptions(gumOptions);
Enumerating Source Devices
You can get a list of available source devices.
import android.util.Log;
import com.phenixrts.pcast.MediaType;
import com.phenixrts.pcast.PCast;
import com.phenixrts.pcast.SourceDeviceInfo;
// Previously initialized and started
PCast pcast = ...;
pcast.enumerateSourceDevices(
MediaType.VIDEO,
new PCast.EnumerateSourceDevicesCallback() {
@Override
public void onEvent(PCast pcast, SourceDeviceInfo[] devices) {
// Store devices as needed
}
});
Enumerating Source Devices Parameters
Name | Type | Description |
---|---|---|
mediaType (required) | MediaType | The media type for which to enumerate source devices |
enumerateSourceDevicesCallback (required) | PCast.EnumerateSourceDevicesCallback | Called with the array of available source devices |
SourceDeviceInfo fields
Name | Type | Description |
---|---|---|
id | String | Source device ID |
name | String | Source device name |
mediaType | MediaType | Source device media type |
deviceType | SourceDeviceType | Source device type |
facingMode | FacingMode | Source device facing mode |
Media Type
Name | Description |
---|---|
MediaType.VIDEO | Video |
MediaType.AUDIO | Audio |
Source Device Type
Name | Description |
---|---|
SourceDeviceType.NULL | Null device (e.g. blank screen or silence) |
SourceDeviceType.PHYSICAL | Physical device (e.g. camera or microphone) |
SourceDeviceType.SYSTEM_OUTPUT | System output capture (screencast) |
Screencasting (Android 5.0 or later)
You can also stream the screen of your phone.
In order to do this, you must first enumerate the available source devices to find the screencast device ID.
Find the screencast device ID
import android.util.Log;
import com.phenixrts.pcast.MediaType;
import com.phenixrts.pcast.PCast;
import com.phenixrts.pcast.SourceDeviceInfo;
import com.phenixrts.pcast.SourceDeviceType;
// Previously initialized and started
PCast pcast = ...;
pcast.enumerateSourceDevices(
MediaType.VIDEO,
new PCast.EnumerateSourceDevicesCallback() {
@Override
public void onEvent(PCast pcast, SourceDeviceInfo[] devices) {
for (SourceDeviceInfo info : devices) {
if (info.deviceType == SourceDeviceType.SYSTEM_OUTPUT) {
Log.i("Phenix SDK Example", "Screencasting is available");
String screenCaptureDeviceId = info.id;
// Store screenCaptureDeviceId
}
}
}
});
The screen capture device requires a valid android.media.projection.MediaProjection object to be passed to the SDK before it can be used. The MediaProjection must stay valid for the duration of the screencast.
Next, pass a valid android.media.projection.MediaProjection object
import android.media.projection.MediaProjection;
import com.phenixrts.pcast.android.AndroidPCastFactory;
// Previously obtained by calling
// android.media.projection.MediaProjectionManager.getMediaProjection(),
// see Android API reference.
MediaProjection mediaProjection = ...;
AndroidPCastFactory.setMediaProjection(mediaProjection);
Finally, pass the screen capture device ID
Pass the screen capture device ID to getUserMedia() in the user media options.
import com.phenixrts.pcast.DeviceCapability;
import com.phenixrts.pcast.DeviceConstraint;
import com.phenixrts.pcast.PCast;
import com.phenixrts.pcast.RequestStatus;
import com.phenixrts.pcast.UserMediaOptions;
import com.phenixrts.pcast.UserMediaStream;
import java.util.Arrays;
// Previously initialized and started
PCast pcast = ...;
// Found by calling enumerateSourceDevices() as described above
String screenCaptureDeviceId = ...;
UserMediaOptions gumOptions = new UserMediaOptions();
gumOptions.getVideoOptions().capabilityConstraints.put(
DeviceCapability.DEVICE_ID, Arrays.asList(new DeviceConstraint(screenCaptureDeviceId)));
pcast.getUserMedia(
gumOptions,
new PCast.UserMediaCallback() {
@Override
public void onEvent(
PCast pcast,
RequestStatus status,
UserMediaStream userMediaStream) {
// Check status and store 'userMediaStream'
}
});
The code snippets above cover each of these steps.
Publish a Stream
import android.util.Log;
import com.phenixrts.pcast.MediaStream;
import com.phenixrts.pcast.PCast;
import com.phenixrts.pcast.Publisher;
import com.phenixrts.pcast.RequestStatus;
import com.phenixrts.pcast.StreamEndedReason;
import com.phenixrts.pcast.UserMediaOptions;
import com.phenixrts.pcast.UserMediaStream;
// Previously initialized and started
PCast pcast = ...;
// Previously created via EdgeAuth library
String streamToken = ...;
// Previously obtained via either PCast.subscribe()
// or UserMediaStream.getMediaStream()
MediaStream mediaStream = ...;
String[] tags = { "my-tag" };
pcast.publish(
streamToken,
mediaStream,
new PCast.PublishCallback() {
@Override
public void onEvent(
PCast pcast,
RequestStatus requestStatus,
Publisher publisher) {
// Check status and store 'publisher'
// The "streamId" of the publisher
String streamId = publisher.getStreamId();
if (publisher.hasEnded()) {
// Checks if the publisher has ended
}
// Attach publisher ended callback
publisher.setPublisherEndedCallback(
new Publisher.PublisherEndedCallback() {
@Override
public void onEvent(
Publisher publisher,
StreamEndedReason reason,
String reasonDescription) {
// Called when the stream has ended
Log.i("Phenix SDK Example",
"Publish stream ended with reason ["
+ reasonDescription + "]");
}
});
// To stop later
publisher.stop("I-am-done-publishing");
}
},
tags);
Publishing a Stream Parameters
Name | Type | Description |
---|---|---|
streamToken (required) | String | The publish token generated using the Phenix EdgeAuth library |
mediaStream (required) | MediaStream | The media stream acquired through PCast.subscribe(...) or locally via UserMediaStream.getMediaStream() |
publishCallback (required) | PCast.PublishCallback | Called upon completion of the operation |
tags (optional) | String[] | Tags that will be provided with the stream notifications to your backend callback endpoint |
StreamEndedReason
Reason | Description |
---|---|
StreamEndedReason.ENDED | The stream ended normally |
StreamEndedReason.FAILED | The stream failed |
StreamEndedReason.CENSORED | The stream was censored |
StreamEndedReason.MAINTENANCE | A maintenance event caused this stream to be terminated |
StreamEndedReason.CAPACITY | The stream was terminated due to capacity limitations |
StreamEndedReason.APP_BACKGROUND | The stream was terminated due to the mobile app entering into the background |
StreamEndedReason.CUSTOM | A custom termination reason is provided in the "reasonDescription" field |
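For example, a publisher ended callback could branch on these reasons; this is a sketch only, and the retry policy is entirely application specific:
import android.util.Log;
import com.phenixrts.pcast.Publisher;
import com.phenixrts.pcast.StreamEndedReason;

Publisher.PublisherEndedCallback publisherEndedCallback = new Publisher.PublisherEndedCallback() {
    @Override
    public void onEvent(Publisher publisher, StreamEndedReason reason, String reasonDescription) {
        switch (reason) {
            case ENDED:
                // Normal termination, nothing to do
                break;
            case FAILED:
            case CAPACITY:
                // A retry with back-off may be appropriate
                Log.w("Phenix SDK Example", "Publish ended [" + reason + "], consider retrying");
                break;
            case CUSTOM:
                Log.i("Phenix SDK Example", "Publish ended with custom reason [" + reasonDescription + "]");
                break;
            default:
                Log.i("Phenix SDK Example", "Publish ended [" + reason + "]");
                break;
        }
    }
};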
Process Raw Frames
The frame-ready API allows for pre-processing of raw audio and video frames before they are encoded and transmitted.
Any processing needs to keep up with the incoming frame rate; otherwise some of the incoming frames will be dropped to compensate.
import android.graphics.Bitmap;
import com.phenixrts.media.audio.android.AndroidAudioFrame;
import com.phenixrts.media.video.android.AndroidVideoFrame;
import com.phenixrts.pcast.FrameNotification;
import com.phenixrts.pcast.MediaStreamTrack;
import com.phenixrts.pcast.PCast;
import com.phenixrts.pcast.UserMediaStream;
import com.phenixrts.pcast.android.AndroidReadAudioFrameCallback;
import com.phenixrts.pcast.android.AndroidReadVideoFrameCallback;
// Previously obtained via PCast.getUserMedia
final UserMediaStream userMediaStream = ...;
final MediaStreamTrack[] videoTracks = userMediaStream.getMediaStream().getVideoTracks();
for (MediaStreamTrack videoTrack : videoTracks) {
userMediaStream.setFrameReadyCallback(videoTrack,
(frameNotification) -> {
frameNotification.read(new AndroidReadVideoFrameCallback() {
@Override
public void onVideoFrameEvent(AndroidVideoFrame videoFrame) {
final Bitmap pixelBuffer = videoFrame.bitmap;
final int displayHeightInPixels = pixelBuffer.getHeight();
final int displayWidthInPixels = pixelBuffer.getWidth();
// Manipulate bitmap as needed
...
frameNotification.write(videoFrame);
}
});
});
}
final MediaStreamTrack[] audioTracks = userMediaStream.getMediaStream().getAudioTracks();
for (MediaStreamTrack audioTrack : audioTracks) {
userMediaStream.setFrameReadyCallback(audioTrack,
(frameNotification) -> {
frameNotification.read(new AndroidReadAudioFrameCallback() {
@Override
public void onAudioFrameEvent(AndroidAudioFrame audioFrame) {
// Attenuate audio:
for (int sampleIndex = 0; sampleIndex < audioFrame.audioSamples.length; ++sampleIndex) {
audioFrame.audioSamples[sampleIndex] *= 0.1;
}
frameNotification.write(audioFrame);
}
});
});
}
Frame Notification API
Name | Argument | Description |
---|---|---|
read | FrameReadyForProcessingCallback | Retrieves the current raw frame in the form of an AndroidAudioFrame or AndroidVideoFrame |
write | AndroidAudioFrame/AndroidVideoFrame | Writes back a processed or newly generated frame. The frame may have a different resolution and timestamp |
drop | (none) | Instructs stream to drop the current frame |
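For example, drop can be used to thin out the outgoing video frame rate. The sketch below keeps every other frame and passes the rest through unchanged:
import com.phenixrts.media.video.android.AndroidVideoFrame;
import com.phenixrts.pcast.MediaStreamTrack;
import com.phenixrts.pcast.UserMediaStream;
import com.phenixrts.pcast.android.AndroidReadVideoFrameCallback;

// Previously obtained via PCast.getUserMedia
final UserMediaStream userMediaStream = ...;
final int[] frameCounter = {0};
for (MediaStreamTrack videoTrack : userMediaStream.getMediaStream().getVideoTracks()) {
    userMediaStream.setFrameReadyCallback(videoTrack, (frameNotification) -> {
        if (++frameCounter[0] % 2 == 0) {
            // Drop every other frame before it is encoded
            frameNotification.drop();
        } else {
            frameNotification.read(new AndroidReadVideoFrameCallback() {
                @Override
                public void onVideoFrameEvent(AndroidVideoFrame videoFrame) {
                    // Pass the frame through unchanged
                    frameNotification.write(videoFrame);
                }
            });
        }
    });
}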
AndroidAudioFrame
Name | Type | Description |
---|---|---|
sampleRateInHz | int | Audio sampling frequency, e.g. 48000 |
numberOfChannels | int | Number of channels, e.g. 1 for mono, 2 for stereo |
timestampInMicroseconds | long | Timestamp of current audio frame in us |
audioSamples | short[] | Audio samples; if more than one channel is present, then samples will be interleaved |
AndroidVideoFrame
Name | Type | Description |
---|---|---|
bitmap | android.graphics.Bitmap | Bitmap containing the pixel data. This object also contains other properties such as width and height. |
timestampInMicroseconds | long | Timestamp of current video frame in us |
durationInMicroseconds | long | Duration of the current video frame in us |
Limit Bitrate
The published video bitrate can be limited temporarily if needed. The returned disposable allows you to control how long the limitation stays in effect. If limitBandwidth is called multiple times before any of the previous disposables are released, only the most recent override remains in effect until its disposable is released. Releasing any of the disposables from earlier limitBandwidth calls has no effect.
To release a Disposable deterministically, call close on it.
import com.phenixrts.common.Disposable;
import com.phenixrts.pcast.Publisher;
// Previously obtained
Publisher publisher = ...;
Disposable disposable = publisher.limitBandwidth(200000);
// ... after some time: force dispose to cancel bandwidth limitation:
disposable.close();
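To illustrate the override behavior described above:
import com.phenixrts.common.Disposable;
import com.phenixrts.pcast.Publisher;

// Previously obtained
Publisher publisher = ...;
Disposable first = publisher.limitBandwidth(400000);
Disposable second = publisher.limitBandwidth(200000);
// Only the most recent override (200 kbps) is in effect now
first.close();  // No effect: this is an earlier disposable
second.close(); // Removes the bandwidth limitation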
Limit Bitrate Parameters
Name | Type | Description |
---|---|---|
bandwidthLimitInBps (required) | long | Maximum bitrate limit in bps for video |
Subscribe to a Stream
import android.util.Log;
import com.phenixrts.pcast.MediaStream;
import com.phenixrts.pcast.PCast;
import com.phenixrts.pcast.RequestStatus;
import com.phenixrts.pcast.StreamEndedReason;
// Previously initialized and started
PCast pcast = ...;
// Previously created via EdgeAuth library
String streamToken = ...;
pcast.subscribe(
streamToken,
new PCast.SubscribeCallback() {
@Override
public void onEvent(
PCast pcast,
RequestStatus requestStatus,
MediaStream mediaStream) {
// Check status and store 'mediaStream'
// Attach stream ended callback
mediaStream.setStreamEndedCallback(
new MediaStream.StreamEndedCallback() {
@Override
public void onEvent(
MediaStream mediaStream,
StreamEndedReason reason,
String reasonDescription) {
Log.i("Phenix SDK Example",
"Subscriber stream ended with reason ["
+ reasonDescription + "]");
}
});
// To stop later
mediaStream.stop();
}
});
Subscribe to a Stream Parameters
Name | Type | Description |
---|---|---|
streamToken (required) | String | The stream token generated using the Phenix EdgeAuth library |
subscribeCallback (required) | PCast.SubscribeCallback | Called upon completion of the operation |
View a Stream
In order to view a stream you have to supply a Surface for the Renderer to draw on.
import android.util.Log;
import android.view.Surface;
import com.phenixrts.pcast.MediaStream;
import com.phenixrts.pcast.Renderer;
import com.phenixrts.pcast.RendererStartStatus;
import com.phenixrts.pcast.android.AndroidVideoRenderSurface;
// Previously obtained via either PCast.subscribe
// or UserMediaStream.getMediaStream()
MediaStream mediaStream = ...;
// Previously obtained, e.g., from a SurfaceView
Surface renderSurface = ...;
Renderer renderer = mediaStream.createRenderer();
RendererStartStatus status =
renderer.start(new AndroidVideoRenderSurface(renderSurface));
if (status == RendererStartStatus.OK) {
Log.i("Phenix SDK Example", "Renderer started successfully");
} else {
Log.e("Phenix SDK Example", "Renderer start failed");
}
// To stop later
renderer.stop();
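The Surface itself typically comes from a SurfaceView in your layout. The following is plain Android API, independent of the SDK, and R.id.video_surface is a hypothetical view id:
import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

SurfaceView surfaceView = (SurfaceView) findViewById(R.id.video_surface);
surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        Surface renderSurface = holder.getSurface();
        // Start the renderer with this surface as shown above
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        // No action needed for this example
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        // Stop the renderer before the surface goes away
    }
});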
Renderer options
It is possible to pass additional options when creating a renderer.
import com.phenixrts.pcast.AspectRatioMode;
import com.phenixrts.pcast.MediaStream;
import com.phenixrts.pcast.Renderer;
import com.phenixrts.pcast.RendererOptions;
// Previously obtained via either PCast.subscribe or UserMediaStream.getMediaStream()
MediaStream mediaStream = ...;
RendererOptions options = new RendererOptions();
options.aspectRatioMode = AspectRatioMode.FILL;
Renderer renderer = mediaStream.createRenderer(options);
Properties
Name | Type | Description |
---|---|---|
aspectRatioMode (optional) | AspectRatioMode | How to fill available video render surface |
audioEchoCancelationMode (optional) | AudioEchoCancelationMode | Audio echo cancelation mode |
hardwareAcceleratedDecodingMode (optional) | HardwareAcceleratedDecodingMode | Hardware accelerated decoding mode |
Aspect Ratio Mode
Name | Description |
---|---|
AspectRatioMode.AUTOMATIC | Defaults to fill |
AspectRatioMode.FILL | Fill entire render area. Video may be truncated |
AspectRatioMode.LETTERBOX | Black bars are added on sides or top/bottom of render area, video will not be truncated |
Note: This option is currently only supported for Real-Time streams
Hardware Accelerated Decoding Mode
Name | Description |
---|---|
HardwareAcceleratedDecodingMode.AUTOMATIC | Use hardware decoding on certified devices |
HardwareAcceleratedDecodingMode.ON | Always use hardware decoding |
HardwareAcceleratedDecodingMode.OFF | Always use software decoding |
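Putting the renderer properties together, a renderer might be configured as in the sketch below; direct field assignment mirrors the aspectRatioMode example above:
import com.phenixrts.pcast.AspectRatioMode;
import com.phenixrts.pcast.AudioEchoCancelationMode;
import com.phenixrts.pcast.HardwareAcceleratedDecodingMode;
import com.phenixrts.pcast.MediaStream;
import com.phenixrts.pcast.Renderer;
import com.phenixrts.pcast.RendererOptions;

// Previously obtained via either PCast.subscribe or UserMediaStream.getMediaStream()
MediaStream mediaStream = ...;
RendererOptions options = new RendererOptions();
options.aspectRatioMode = AspectRatioMode.LETTERBOX;
options.audioEchoCancelationMode = AudioEchoCancelationMode.ON;
options.hardwareAcceleratedDecodingMode = HardwareAcceleratedDecodingMode.AUTOMATIC;
Renderer renderer = mediaStream.createRenderer(options);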
Preview Local User Media
import com.phenixrts.pcast.Renderer;
import com.phenixrts.pcast.UserMediaStream;
// Previously obtained via 'getUserMedia'
UserMediaStream userMediaStream = ...;
Renderer renderer = userMediaStream.getMediaStream().createRenderer();
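To actually display the preview, start this renderer on a Surface just as for a subscribed stream. The sketch below reuses the renderer created above:
import android.view.Surface;
import com.phenixrts.pcast.RendererStartStatus;
import com.phenixrts.pcast.android.AndroidVideoRenderSurface;

// Previously obtained, e.g., from a SurfaceView
Surface previewSurface = ...;
RendererStartStatus previewStatus =
    renderer.start(new AndroidVideoRenderSurface(previewSurface));
// To stop the preview later
renderer.stop();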
Muting and Unmuting of Audio
import com.phenixrts.pcast.Renderer;
// Previously obtained from media stream
Renderer renderer = ...;
boolean isMuted = renderer.isAudioMuted();
renderer.muteAudio();
renderer.unmuteAudio();
Taking a Screenshot
If you would like to show a still preview image, you can capture the last rendered video frame from a renderer.
import android.graphics.Bitmap;
import com.phenixrts.pcast.Renderer;
import com.phenixrts.pcast.android.AndroidLastFrameRenderedCallback;
// Previously obtained from media stream and started
Renderer renderer = ...;
renderer.setLastVideoFrameRenderedReceivedCallback(
new AndroidLastFrameRenderedCallback() {
@Override
public void onEvent(Renderer renderer, Bitmap videoFrame) {
// Process the frame as needed
}
});
renderer.requestLastVideoFrameRendered();
You can also take a still image from a local stream.
import android.graphics.Bitmap;
import com.phenixrts.pcast.Renderer;
import com.phenixrts.pcast.UserMediaStream;
import com.phenixrts.pcast.android.AndroidLastFrameCapturedCallback;
// Previously obtained via 'getUserMedia'
UserMediaStream userMediaStream = ...;
userMediaStream.setLastVideoFrameCapturedReceivedCallback(
new AndroidLastFrameCapturedCallback() {
@Override
public void onEvent(Renderer renderer, Bitmap videoFrame) {
// Process the frame as needed
}
});
userMediaStream.requestLastVideoFrameCaptured();
Data Quality Feedback
If you would like to give the user feedback about how their internet connectivity affects stream quality, you can listen for data quality notifications.
import com.phenixrts.pcast.DataQualityReason;
import com.phenixrts.pcast.DataQualityStatus;
import com.phenixrts.pcast.Publisher;
import com.phenixrts.pcast.Renderer;
// Previously obtained from media stream and started
Renderer renderer = ...;
renderer.setDataQualityChangedCallback(
new Renderer.DataQualityChangedCallback() {
@Override
public void onEvent(
Renderer renderer,
DataQualityStatus quality,
DataQualityReason reason) {
// Inform user, take action based on status and reason
}
});
// Previously obtained via PCast.publish
Publisher publisher = ...;
publisher.setDataQualityChangedCallback(
new Publisher.DataQualityChangedCallback() {
@Override
public void onEvent(
Publisher publisher,
DataQualityStatus quality,
DataQualityReason reason) {
// Inform user, take action based on status and reason
}
});
Data Quality Status for Publishers
Status | Reason | Description |
---|---|---|
DataQualityStatus.NO_DATA | DataQualityReason.NONE | The publisher has a bad internet connection and no data is being streamed. |
DataQualityStatus.NO_DATA | DataQualityReason.UPLOAD_LIMITED | The publisher has a bad internet connection and no data is being streamed. |
DataQualityStatus.ALL | DataQualityReason.NONE | Good internet connection and no quality reduction in effect. |
DataQualityStatus.ALL | DataQualityReason.UPLOAD_LIMITED | The publisher has a slow internet connection and the quality of the stream is reduced. |
DataQualityStatus.ALL | DataQualityReason.NETWORK_LIMITED | Subscribers have bad internet connections and the quality of the stream is reduced. |
DataQualityStatus.AUDIO_ONLY | DataQualityReason.UPLOAD_LIMITED | The publisher has a bad internet connection and only audio is streamed. |
DataQualityStatus.AUDIO_ONLY | DataQualityReason.NETWORK_LIMITED | Subscribers have bad internet connections and only audio is streamed. |
Data Quality Status for Viewers
Status | Reason | Description |
---|---|---|
DataQualityStatus.NO_DATA | DataQualityReason.NONE | The subscriber has a bad internet connection and no data is being received. |
DataQualityStatus.NO_DATA | DataQualityReason.DOWNLOAD_LIMITED | The subscriber has a bad internet connection and no data is being received. |
DataQualityStatus.NO_DATA | DataQualityReason.PUBLISHER_LIMITED | The publisher has a bad internet connection and no data is being received. |
DataQualityStatus.NO_DATA | DataQualityReason.NETWORK_LIMITED | The network is limiting the quality of the stream and no data is being received. |
DataQualityStatus.ALL | DataQualityReason.NONE | Good internet connection and no quality reduction in effect. |
DataQualityStatus.ALL | DataQualityReason.DOWNLOAD_LIMITED | The subscriber has a bad internet connection and the quality of the stream is reduced. |
DataQualityStatus.ALL | DataQualityReason.PUBLISHER_LIMITED | The publisher has a bad internet connection and the quality of the stream is reduced. |
DataQualityStatus.ALL | DataQualityReason.NETWORK_LIMITED | Other subscribers have bad internet connections and the quality of the stream is reduced. |
DataQualityStatus.AUDIO_ONLY | DataQualityReason.NONE | Audio only stream, good internet connection and no quality reduction in effect. |
DataQualityStatus.AUDIO_ONLY | DataQualityReason.DOWNLOAD_LIMITED | The subscriber has a bad internet connection and is only receiving audio. |
DataQualityStatus.AUDIO_ONLY | DataQualityReason.PUBLISHER_LIMITED | The publisher has a bad internet connection and the subscriber is only receiving audio. |
DataQualityStatus.AUDIO_ONLY | DataQualityReason.NETWORK_LIMITED | The network is limiting the quality of the stream and the subscriber is only receiving audio. |
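On the viewer side you could, for example, map these combinations to a simple UI indicator; this is a sketch only:
import android.util.Log;
import com.phenixrts.pcast.DataQualityReason;
import com.phenixrts.pcast.DataQualityStatus;
import com.phenixrts.pcast.Renderer;

Renderer.DataQualityChangedCallback viewerQualityCallback = new Renderer.DataQualityChangedCallback() {
    @Override
    public void onEvent(Renderer renderer, DataQualityStatus quality, DataQualityReason reason) {
        switch (quality) {
            case NO_DATA:
                // e.g. show a "reconnecting" overlay
                Log.w("Phenix SDK Example", "No data received [" + reason + "]");
                break;
            case AUDIO_ONLY:
                // e.g. show an audio-only placeholder
                Log.i("Phenix SDK Example", "Audio only [" + reason + "]");
                break;
            case ALL:
                // Full audio and video; hide any warning indicator
                break;
        }
    }
};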
Handling Dimension Changes
Cameras may be switched at runtime and devices may be rotated. Register a handler to receive a notification whenever the video dimension changes.
import com.phenixrts.pcast.Dimensions;
import com.phenixrts.pcast.Renderer;
// Previously obtained from media stream
Renderer renderer = ...;
renderer.setVideoDisplayDimensionsChangedCallback(
new Renderer.VideoDisplayDimensionsChangedCallback() {
@Override
public void onEvent(
Renderer renderer,
Dimensions displayDimensions) {
// Can get called multiple times while rendering a stream
// Values in displayDimensions.width, displayDimensions.height
}
});
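For example, the notification can be used to keep the render SurfaceView's aspect ratio in sync with the video. This is a sketch, and R.id.video_surface is a hypothetical view id:
import android.view.SurfaceView;
import android.view.ViewGroup;
import com.phenixrts.pcast.Dimensions;
import com.phenixrts.pcast.Renderer;

// Previously obtained from media stream
Renderer renderer = ...;
// SurfaceView used as the render target
final SurfaceView surfaceView = (SurfaceView) findViewById(R.id.video_surface);
renderer.setVideoDisplayDimensionsChangedCallback(
    new Renderer.VideoDisplayDimensionsChangedCallback() {
        @Override
        public void onEvent(Renderer renderer, Dimensions displayDimensions) {
            final float aspectRatio =
                (float) displayDimensions.width / (float) displayDimensions.height;
            // View updates must happen on the main thread
            surfaceView.post(() -> {
                ViewGroup.LayoutParams params = surfaceView.getLayoutParams();
                params.height = Math.round(surfaceView.getWidth() / aspectRatio);
                surfaceView.setLayoutParams(params);
            });
        }
    });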