FAQs
  • How is Phenix real-time streaming different from off the shelf WebRTC?
  • How does Phenix scale WebRTC to millions?
  • What is the difference between a CDN and Phenix?
  • What is the difference between a Channel and a Room?
  • Does Phenix encrypt its Real-Time streams?
  • How does Phenix define Real-Time vs Low Latency?
  • How does Phenix compare with technologies such as CMAF?
  • On which cloud platforms is the Phenix platform deployed?
  • What is the recommended minimum hardware spec for devices to publish from a web browser to Phenix?
  • What WebSocket reconnect mechanisms are built into the Web SDK?
  • What are the benefits of the Phenix hardware encoder?
  • How does Phenix multi-bitrate transcoding work and how do I enable it?
  • What bitrates are used for encoding and publishing?
  • How does Phenix adapt to challenging network connections?
  • How does Phenix handle rapid join rates and broadcast size audiences?
  • Where can I see the status of the Phenix system?
  • When do sessions and sessionIds expire?
  • What is the digest field portion of the Auth token?
  • Which video players support Phenix?
  • Which capture devices are compatible with Phenix?
  • Where can I find documentation of the text chat feature?
  • What size and bitrate should I use for publishing?
  • Does Phenix provide a video player?
  • What effects will 5G have on Phenix?

What bitrates are used for encoding and publishing?

The bitrates used by Phenix correspond to quality capabilities. Each set of bitrates is referred to as a "ladder."

  • phenix-2019 - the current default for contribution encoding; contact Phenix if you need to remain on this ladder

  • phenix-2020 - new default for cloud encoding (including content ingested via RTMP)

  • Streaming - used when the Streaming or OnDemand capabilities are selected (see the reference documentation). Note that the bitrates and resolution are the same as the values in the other ladders, but with a resampled frame rate for the lowest resolutions.
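In other words, the ladder in effect depends on how the stream is encoded and which capabilities are requested. The following TypeScript sketch summarizes the selection rules above; the helper and its option names are illustrative only, not part of the Phenix SDK (see the reference documentation for the actual capability names):

    // Hypothetical helper, not part of the Phenix SDK: picks the bitrate
    // ladder according to the rules listed above.
    type Ladder = 'phenix-2019' | 'phenix-2020' | 'Streaming';

    interface LadderContext {
      streamingOrOnDemand: boolean; // Streaming or OnDemand capability selected
      cloudEncoded: boolean;        // cloud encoding, including content ingested via RTMP
      pinnedToPhenix2019: boolean;  // only if arranged with Phenix
    }

    function selectLadder(ctx: LadderContext): Ladder {
      if (ctx.streamingOrOnDemand) {
        return 'Streaming';   // streaming ladders, with frame-rate resampling at low resolutions
      }
      if (ctx.pinnedToPhenix2019 || !ctx.cloudEncoded) {
        return 'phenix-2019'; // contribution-encoding default
      }
      return 'phenix-2020';   // cloud-encoding default
    }

    // Example: an RTMP-ingested stream without the Streaming/OnDemand capability
    selectLadder({ streamingOrOnDemand: false, cloudEncoded: true, pinnedToPhenix2019: false });
    // -> 'phenix-2020'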

To avoid lip-sync issues, all layers use the same playout buffer value. Once the set of layers for a stream is finalized, the system uses the maximum playout buffer value across those layers.

For example, if the highest resolution for a content stream is 'fhd' when using the phenix-2020 ladder, all layers will use a playout buffer of 0.233 seconds.
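As a worked illustration of this rule, here is a minimal TypeScript sketch (not Phenix SDK code) that takes the layers produced for a stream, using the phenix-2020 playout buffer values from the tables below, and computes the shared playout buffer as the maximum across those layers:

    // Illustrative only: compute the shared playout buffer for a finalized
    // set of layers, using phenix-2020 values from the table below.
    interface Layer {
      name: string;
      playoutBufferSeconds: number;
    }

    // phenix-2020 layers up to and including 'fhd'.
    const layers: Layer[] = [
      { name: 'uld',  playoutBufferSeconds: 0.033 },
      { name: 'vvld', playoutBufferSeconds: 0.033 },
      { name: 'vld',  playoutBufferSeconds: 0.033 },
      { name: 'ld',   playoutBufferSeconds: 0.033 },
      { name: 'sd',   playoutBufferSeconds: 0.033 },
      { name: 'hd',   playoutBufferSeconds: 0.066 },
      { name: 'fhd',  playoutBufferSeconds: 0.233 },
    ];

    // Every layer shares the maximum playout buffer of the finalized set.
    const sharedPlayoutBuffer = Math.max(...layers.map((l) => l.playoutBufferSeconds));
    console.log(sharedPlayoutBuffer); // 0.233 seconds, matching the 'fhd' example above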

Publishing quality information can be found here.

Phenix 2019 Ladder

The frame rate for all layers is equal to the frame rate of the input content.

Layer Name   Resolution   Bitrate (kbps)   Playout Buffer (s)   Type
uld          144          80               0.033                Static
vld          240          350              0.033                Static
ld           360          520              0.033                Dynamic
sd           480          830              0.033                Dynamic
hd           720          1600             0.066                Static
fhd          1080         3000             0.233                Static
xhd          1080         5500             0.266                Static
uhd          1080         8500             0.266                Static

Phenix 2020 Ladder

The frame rate for all layers is equal to the frame rate of the input content.

Layer Name   Resolution   Bitrate (kbps)   Playout Buffer (s)   Type
uld          144          80               0.033                Static
vvld         144          192              0.033                Dynamic
vld          240          350              0.033                Static
ld           360          520              0.033                Both
sd           480          830              0.033                Both
hd           720          1600             0.066                Static
fhd          1080         3000             0.233                Static
xhd          1080         5500             0.266                Static
uhd          1080         8500             0.266                Static

Streaming Ladders

The following table shows the values used with the phenix-2019 ladder.

In the Frame Rate Resampling column:

  • "2/1" indicates that the frame rate is half that of the frame rate of the input content.

  • "none" indicates the frame rate is equal to the frame rate of the input content.

ID     Quality   Resolution   Bitrate (kbps)   Frame Rate Resampling
qcif   uld       144          80               2/1
sif    vld       240          350              2/1
ld     ld        360          520              2/1
sd     sd        480          830              none
hd     hd        720          1600             none
fhd    fhd       1080         3000             none
xhd    xhd       1080         5500             none
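
As a small illustration of the resampling rule (TypeScript, not SDK code), a 30 fps input yields 15 fps for the qcif, sif, and ld layers, while the remaining layers keep the input frame rate:

    // Illustrative only: apply the Frame Rate Resampling column to an input frame rate.
    type Resampling = '2/1' | 'none';

    function outputFrameRate(inputFps: number, resampling: Resampling): number {
      // '2/1' halves the frame rate; 'none' passes it through unchanged.
      return resampling === '2/1' ? inputFps / 2 : inputFps;
    }

    outputFrameRate(30, '2/1');  // 15 fps for qcif, sif, and ld
    outputFrameRate(30, 'none'); // 30 fps for sd and above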