
How does Phenix handle adaptivity for challenging internet connections?

Phenix's proprietary Adaptive Bitrate (ABR) technology (United States Patent Application No. 62/663,182) transcodes streams into multiple resolutions and bitrates, so that each viewer dynamically receives the bitrate best suited to their connection speed at any given time, all while maintaining less than 500 ms of end-to-end latency. ABR is handled entirely by the Phenix platform, which adjusts stream quality for each individual viewer according to their network conditions.

The default ABR policy uses a resolution/bitrate ladder similar to YouTube's recommendations. Phenix automatically transcodes to the applicable quality layers below the published quality level. For example, if you publish an HD stream as the top layer, Phenix automatically creates the corresponding SD, LD, VLD, and ULD layers, as illustrated in the sketch below.
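As a rough illustration (not the Phenix API), the following sketch derives the set of quality layers from a published top layer. The layer names follow the ULD/VLD/LD/SD/HD naming above, but the resolution and bitrate numbers are hypothetical placeholders, not Phenix's actual ladder values.

```typescript
// Illustrative only: a quality-layer ladder derived from a published top layer.
// The resolution and bitrate values below are hypothetical placeholders.
interface QualityLayer {
  name: 'ULD' | 'VLD' | 'LD' | 'SD' | 'HD';
  height: number;      // vertical resolution in pixels (illustrative)
  videoKbps: number;   // target video bitrate in kbit/s (illustrative)
}

const ladder: QualityLayer[] = [
  { name: 'ULD', height: 144, videoKbps: 100 },
  { name: 'VLD', height: 240, videoKbps: 250 },
  { name: 'LD',  height: 360, videoKbps: 500 },
  { name: 'SD',  height: 480, videoKbps: 1000 },
  { name: 'HD',  height: 720, videoKbps: 2500 },
];

// The platform transcodes every layer at or below the published top layer:
// publishing HD therefore yields SD, LD, VLD and ULD renditions as well.
function layersForPublishedQuality(top: QualityLayer['name']): QualityLayer[] {
  const topIndex = ladder.findIndex((layer) => layer.name === top);
  return ladder.slice(0, topIndex + 1);
}

console.log(layersForPublishedQuality('HD').map((layer) => layer.name));
// -> [ 'ULD', 'VLD', 'LD', 'SD', 'HD' ]
```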

There are no special requirements on the viewer side to enable ABR streaming: Phenix automatically connects each viewer to the highest quality layer that their network connection can sustain. This differs from technologies such as DASH, which require the client to parse a manifest and choose a representation based on bitrate and other factors. Viewers are switched between quality layers as needed for the duration of their streams.
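For comparison, a bare WebRTC viewer (shown here with standard browser APIs only, not the Phenix SDK) simply renders whatever track arrives on the peer connection; any quality switch made upstream shows up transparently on the same track, with no manifest to parse on the client.

```typescript
// Standard browser WebRTC consumption: the viewer renders whatever arrives.
// Quality-layer switches made upstream appear on the same track, so there is
// no client-side manifest parsing or representation selection as in DASH.
const peerConnection = new RTCPeerConnection();
const videoElement = document.querySelector('video') as HTMLVideoElement;

peerConnection.ontrack = (event: RTCTrackEvent) => {
  // Attach the incoming remote stream directly to the <video> element.
  if (videoElement.srcObject !== event.streams[0]) {
    videoElement.srcObject = event.streams[0];
  }
};

// Signaling (offer/answer exchange) is handled by the streaming platform's
// SDK and is omitted here.
```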

Phenix real-time streaming takes a packet-based, frame-by-frame approach, in contrast to the chunk-based approaches of HLS and DASH. Phenix uses a proprietary algorithm and architecture for dynamic keyframe generation that scales across large audiences, allowing any viewer to switch quality levels at any time.
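A simplified way to see the difference, using two hypothetical routines (this is not Phenix code, and the actual keyframe-generation algorithm is proprietary): a chunk-based player can only change quality at the next segment boundary, typically seconds away, whereas a frame-by-frame approach can generate a keyframe for the target layer on demand and switch on the very next frame.

```typescript
// Conceptual comparison only; these routines are hypothetical and do not
// correspond to any Phenix or HLS/DASH API.

// Chunk-based (HLS/DASH-style): the switch takes effect at the next segment
// boundary, so with 2-6 second segments the viewer may wait several seconds.
function switchAtNextChunkBoundary(nowMs: number, chunkDurationMs: number): number {
  return Math.ceil(nowMs / chunkDurationMs) * chunkDurationMs;
}

// Frame-by-frame: a keyframe for the target layer is generated on demand,
// so the switch can land on the next frame interval (~33 ms at 30 fps).
function switchAtNextKeyframe(nowMs: number, frameIntervalMs = 1000 / 30): number {
  return nowMs + frameIntervalMs;
}

console.log(switchAtNextChunkBoundary(10_500, 4_000)); // -> 12000 (1.5 s later)
console.log(switchAtNextKeyframe(10_500));             // -> ~10533 (one frame later)
```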
