How To Write A Video Streaming Plan


MANAGER'S Workbook

Planning for Video Streaming: What You Really Need to Know

Learning Objectives & Thought Leadership
Learn the key decision points and concepts, and gain enough technical background information to make informed decisions when specifying and integrating streaming media solutions. A familiarity with IP networks and video technology is helpful but not required.

Topics: Audience Requirements | Real Time or On Demand? | Latency Tolerance | Key Content Attributes | Streaming Technology Components | Video Compression | Network Transit, and much more


Perspectives: MANAGER'S Workbook

EDITORIAL: Margot Douaihy, Editorial Director; Cindy Davis, Contributing Editor
SALES & MARKETING: Sue Horwitz, Publisher, sue@audientmarketing.com
ART & PRODUCTION: Nicole Cobban, Senior Art Director; Walter Makarucha Jr., Associate Art Director
NEWBAY MEDIA CORPORATE: Steve Palm, President & CEO; Anthony Savona, VP of Content & Marketing
NEWBAY AV/CONSUMER ELECTRONICS GROUP: Adam Goldstein, Executive VP/Group Publisher

Published by NewBay Media L.L.C., 28 East 28th Street, 12th Floor, New York, NY. Web: nbmedia.com. PRINTED IN THE USA.

Myriad consumer electronics devices, from tablets to PCs and watches, are driving the expectation that video can and should be accessed anywhere, any time, and that the quality should be outstanding. This brand- and product-agnostic workbook is the only one of its kind to help enterprise AV and IT technology managers, integrators, and consultants set expectations and devise a video streaming plan: identifying application and content attributes, choosing the appropriate streaming technology components, and understanding network transit to deliver purpose-built media streaming systems. You don't need to have a technical background; this guide provides you a solid blueprint. (Cindy Davis, Editorial Contributor, AV Technology magazine)

Thought Leader Sponsor: The content in this Workbook is a collaborative effort between the editors of AV Technology magazine, its Thought Leader Sponsors, and editorial contributors. All contributed content has been edited and approved by AVT editors to ensure that it is first and foremost informative and educational, as well as inspiring. This is a marketing-free zone. The content and images in this Workbook are owned by its authors and/or respective companies and may not be reprinted or reproduced in any way.

avnetwork.com Planning for Video Streaming AVTechnology Manager's Workbook

Contents

Getting Started: The Workbook You Need
Learn the key decision points and concepts, and gain enough technical background information to make informed decisions when specifying and integrating streaming media solutions.

Application Attributes
Real Time or On Demand? | Audience (Who is consuming the stream and why? Where are they? What are they viewing the stream on? Do they have special requirements?) | Latency (Latency Tolerance, Sources of Latency) | Bandwidth | Security | Management | Scalability

Content Attributes
Content Source (HDCP Content) | Interface (AV Interface, USB Interface, Software Interface, Direct Streaming Devices) | Resolution and Rate (Video, Common Video Resolutions, Video Scaling, Audio) | Bandwidth

Streaming Technology Components
Video Compression (Lossless vs Lossy Compression) | Compression and Codec Terminology (Group of Pictures (GOP)) | Intra-frame Codecs (M-JPEG, JPEG2000, Proprietary Codecs) | Inter-frame Codecs (MPEG Codecs, Inter-frame Compression, H.264) | Containers (MPEG Transport Stream, Audio Codecs) | Session Controls (Progressive Download, Real Time Streaming Protocol (RTSP), Session Announcement Protocol (SAP), HTTP Live Streaming (HLS)) | Transport Protocols (RTP)

Network Transit
TCP and UDP | When to Use Which | Unicast and Multicast | Mix and Match and Tradeoffs | General Attributes | Prioritized Session Technologies | Bandwidth Requirements and Typical Distribution Latencies

Thought Leader
PAUL ZIELIE, CTS-D,I, Manager, Enterprise Solutions for Harman Professional. Since joining AMX by Harman in the fall of 2013, his expertise in the areas of solutions architecture, network video transport, and security has been crucial to the development of AMX's platform roadmaps. Zielie has over 30 years of experience designing and integrating IT, telecommunications, and audiovisual (AV) solutions. Over the course of his career he has held most of the roles in the AV/IT spectrum, including customer/end user, IT owner, integrator, designer, managed service provider, distributor, presale specifier, executive, and now manufacturer. He is a prolific writer and speaker, and is the recipient of the 2015 InfoComm International Educator of the Year award.

Link to a downloadable video streaming workbook: automate/ multicast-forenterprise-videostreaming.aspx

Getting Started
THE Workbook You Need for Video Streaming

Streaming audio and video content across a network is a broad topic, spanning everything from real time connections replacing a traditional AV matrix switch, to conferencing, to cat videos on YouTube, with a huge range of mix-and-match technologies in any given solution. Choosing the right solution involves considering both the application and content attributes and ruling out the technologies that do not support those attributes. The goal of this document is to explain key decision points and concepts and give enough technical background information to allow the audience to make informed decisions in specifying and integrating streaming media solutions. This document will discuss networks only at a high level, and only where they impact the decision process. A familiarity with IP networks and video technology is helpful but not required.

Application Attributes
Real Time or On Demand? | Audience | Latency | Bandwidth | Security | Management | Scalability

Attribute | Description
Real Time or On Demand (or Both) | Real time streaming is consumed as it is produced. On demand streaming is recorded, and each user consumes the video at their convenience.
Audience | Who is consuming the stream and why? Where are they? What are they viewing the stream on? Do they have special requirements?
Latency | Important in real time streaming. The allowable delay between the time the AV signal leaves the source and the time it is reproduced on the output (display and/or speakers).
Bandwidth | The number of bits per second available on the network for content distribution.
Security | If and how the content is protected from unauthorized users.
Management | The logistical tasks associated with distributing the content.
Scalability | The ability of a technology or system to accommodate a larger number of endpoints or more content: the maximum number of endpoints or amount of accessible content within a single installation for a given technology, and the possibility or ease of adding endpoints or accessible content storage in a given installation.
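One way to keep these attributes in front of you during planning is to capture them as a simple record and run rough consistency checks against them. The sketch below is illustrative only; the field names, thresholds, and checks are hypothetical, not part of the workbook's methodology.

```python
from dataclasses import dataclass

@dataclass
class StreamingPlan:
    real_time: bool            # True: live stream; False: on demand
    audience_size: int         # peak simultaneous viewers
    latency_budget_ms: float   # allowable source-to-display delay
    bandwidth_mbps: float      # per-stream bitrate budget
    restricted_access: bool    # does the content need protection?

    def flags(self) -> list[str]:
        """Naive consistency checks a planner might run (hypothetical rules)."""
        issues = []
        if self.real_time and self.latency_budget_ms < 45 and self.audience_size > 100:
            issues.append("very low latency at large scale is hard: check multicast support")
        if not self.real_time and self.latency_budget_ms < 1000:
            issues.append("on demand playback rarely needs a sub-second latency budget")
        return issues

# A live stream to 500 viewers with a 30ms latency budget trips the first rule.
plan = StreamingPlan(True, 500, 30, 5.0, False)
print(plan.flags())
```

The point is not the specific rules but the habit: writing the attributes down in one place makes conflicts between them visible early.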

REAL TIME OR ON DEMAND?
The first step in characterizing the streaming application is determining whether the application is real time (live) or on demand. Live streaming technologies are appropriate for time-sensitive applications, such as live lectures or television transport, or for ones where interaction with the audience is required. Any production, such as switching between content sources or adding subtitles, will have to be done within the integrated video capture system. On demand video allows for more sophisticated production and editing, with the final product limited only by the level of effort you are willing to put into it. It is appropriate when the content is not time sensitive or should remain available over time. On demand video allows for individual controls such as pause and rewind, since each stream is individually sent to the consumer. A hybrid approach, where a live event is streamed and also captured for on demand availability, is very common. Often these hybrid systems have limited editing capabilities to trim the start and stop times and add bookmarks. Otherwise, the captured video can be exported to a video editing program for more complete editing. If this is an application requirement, then the workflow involved in making the captured content available on demand should be explored to make sure it meets the requirements.

AUDIENCE
The purpose of streaming content is to deliver information to the audience. It is important to characterize the audience in order to make sure the content reaches them and has the required properties for them to effectively consume it.

Who is consuming the stream and why? It is important to understand the size of the audience in terms of simultaneous users in order to properly choose from the various technologies available. Other important considerations include: understanding the requirements for quality based on the audience's needs.
The quality acceptable for a cute cat video may not be acceptable for something that requires close examination, like an advanced math class. Consider also the length of time the audience is viewing: poor quality audio and video is taxing to watch, and longer content generally requires higher quality. If the content is on demand and users want to consume part of the content and continue later, perhaps on another device, this needs to be considered before the system is chosen.

Where are they? This is a network consideration. It is important to know how the audience is accessing the content relative to the source. Possibilities include:
In the same room. The content is streamed as an alternative to traditional AV distribution. Examples include streaming the presentation computer to a shared display, or streaming content such as a video camera or attached microscope to a display at each student's workspace.
On the same campus. The content is available with very few network restrictions because the source and the audience are on the same Local Area Network (LAN).
In the same enterprise. The source and audience may be at different locations, but content is streamed across a network completely under the control of the enterprise.
On the Internet. At least part of the network is completely out of the control of the organization. Available network bandwidth may be unknown, and devices such as firewalls may need to be traversed.

What are they viewing the stream on? You can no longer assume that streaming is being consumed on set top boxes or computers. The audience may want to consume the video on phones, tablets, video game systems, or any combination of operating systems and player software. A decision must be made, based on the audience, about which types of playback devices will be supported.

Do they have special requirements? Are there any special requirements that must be met in order to achieve the goals of the streaming system?
Some possible special requirements include:
Keyword searches to find sections of the content
Closed captioning
Verification or reports that the content has been viewed
The ability to view multiple simultaneous content streams (presenter camera and content)

LATENCY
Tolerance for delay, or latency, is perhaps the least understood critical attribute and is among the hardest to quantify. There are large differences in latency tolerance across use cases, ranging from almost no latency to several seconds (or even minutes).

Latency Tolerance
The human brain is very adaptable to delay and to stitching together perceptions that do not arrive completely or simultaneously, but there are limitations that eventually cause the brain to perceive the interactions as unrelated. This may not immediately be perceived on a conscious level, but it is exhibited in additional fatigue and dissatisfaction with the application, without the user necessarily knowing why. At some point, the delay becomes so high that the application is unusable. The study of these and related phenomena is called psychophysics. Latency considerations are typically only a concern in real time streaming applications, especially those that involve interaction with technology or between people on opposite sides of a streaming link. There is considerable interest in moving AV transport from traditional technologies such as HDMI cables and video switchers to the network. If the content is streamed within an environment that contains both the source and the reproduction, the latency requirements can be quite strict. Imagine the frustration of trying to manipulate a mouse while the display has a 3-5 second delay, common in some streaming products.
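Because every stage of the signal path consumes part of the latency budget, a planner can sanity-check a design by summing per-stage delays against the tolerance for the use case. In this sketch the stage values and thresholds are hypothetical placeholders, not measurements:

```python
# Hypothetical latency budgets per use case, in milliseconds.
LATENCY_BUDGET_MS = {
    "machine_interaction": 45,    # keyboard/mouse while observing the display
    "audio_reinforcement": 30,    # upper edge of a marginal band
    "lip_sync_video": 22.5,
}

# Hypothetical per-stage delays along one candidate signal path.
signal_path_ms = {
    "input_buffer": 2,
    "compression": 15,
    "network_transit": 5,
    "decoding_buffer": 10,
    "decompression": 8,
    "display_lag": 10,   # note how much a non-game-mode display can consume
}

total = sum(signal_path_ms.values())
budget = LATENCY_BUDGET_MS["machine_interaction"]
print(f"path total {total}ms, budget {budget}ms, ok={total <= budget}")
```

With these example numbers the path totals 50ms against a 45ms budget, which illustrates the point made above: the budget is holistic, and a single stage such as display lag can push an otherwise fine design over the limit.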

Sources of Latency
There are many causes of delay along a streaming signal path, so latency has to be treated holistically: the thresholds described before constitute the entire latency budget for the signal path. A little-known fact is that many modern displays introduce 100ms or more of latency because of video enhancement processing. The first community to take notice was video game players, whose scores dropped when they got new monitors; they protested, and many monitors now have a game mode which removes the high-latency processing.

BANDWIDTH
This attribute takes into account the available network bandwidth for the various network links that the media will need to traverse in order to reach the audience. Most network traffic, like email and web browsing, is bursty and delay tolerant. This means that the traffic is intermittent, and if it has to be delayed, that's okay; you probably wouldn't notice if an email showed up a second late. The nature of this traffic allows the network to be oversubscribed, which means that, as a shared resource, the network could fail if everyone used it to capacity simultaneously, but the chance of that happening is very small, and if it did happen momentarily, the network would just buffer (store) the extra traffic until there was capacity. AV streaming is continuous, and it is not tolerant of variations in the amount of delay. The network team specifying the available bandwidth will have to understand the bandwidth available for this type of traffic. This is especially important for Wide Area Networks (WANs) and, if you are using unicast, for the links connected to the video servers on Local Area Networks (LANs). It is possible that the network team will use Quality of Service (QoS) to prioritize the video traffic. Some network organizations will require you to request bandwidth, rather than tell you what they have available.
In the case of video on demand or unicast real time video, the required bandwidth is calculated by multiplying the peak number of simultaneous users by the bandwidth of a single video stream. For multicast, the bandwidth is the peak number of distinct streams times the bandwidth of a single video stream.

SECURITY
Beyond the typical application security requirements, like not letting unauthorized people originate content, you have to consider whether the content is sensitive and should be protected. In many cases it may be open to the public, but in others you will want to limit who has access and keep records of that access. In more extreme cases, there are software packages that can watermark each individual video stream, so that if it shows up in an unauthorized location, you can tell who originally obtained the content.

MANAGEMENT
Many content management packages exist in the streaming market. A content management package that is excellent for live IPTV may not be well suited to streaming video on demand, even though it can technically be done. Once the streaming application is understood and products are considered, it is important to look at the production workflow to make sure it fits your needs.
If a critical attribute of your application requires a complex string of manual tasks, then you may be looking at the wrong package.

Current Accepted Standards for Allowable Latency by Task
Limiting Standard | Max Latency | Reference
Perceptible Threshold | 13ms | Detecting Meaning in Rapid Pictures, Attention, Perception, & Psychophysics, 2014
Distributed Video with Live Audio (Lip Sync) | <22.5ms | ITU BT.1359 (maximum specification for audio to lead video)
Audio Reinforcement (Echo threshold for intelligibility) | Acceptable <18ms, Marginal 18-30ms | ITU-T G.131, Talker echo and its control; The Influence of a Single Echo on the Audibility of Speech, Helmut Haas
Live Audio Distribution (Lip Sync) | <30ms | ITU BT.1359 (maximum specification for video to lead audio)
Machine Interaction (Keyboard, mouse, joystick, etc.) | <45ms | Quantifying the Effects of Latency on Sensory Feedback in Distributed Virtual Environments, Caroline Jay & Roger Hubbold
Video Games (Counterstrike) | Excellent <50ms, Diminished >100ms | QoS Requirements of Multimedia Applications, Brett Berliner, Brian Clark and Albert Hartono
Human Interaction (Audio/Video Conference) | Acceptable <150ms, Marginal >150ms | ITU G.114, One-way transmission time (may require echo cancellation per ITU-T G.131)
Real Time, No Interaction | 2s-5+s | N/A
On Demand | No limit | N/A

In the Current Accepted Standards for Latency by Task table, it is interesting to note that many of these numbers are much lower than they were ten years ago, as new information and measurement techniques have been discovered.

Sources of latency along the streaming signal path (from the diagram): 1: Input Buffer; 2: Compression; 3: FIFO; 4: Congestion Queuing; 5: Distance; 6: Decoding Buffer; 7: Decompression; 8: Display Lag.

SCALABILITY
Scalability is concerned with the ability of the system to accommodate a larger number of endpoints or more content (storage or simultaneous streams). Typically, the originating device will have a limited number of simultaneous streams and may be augmented by a streaming server, which allows a larger number of simultaneous viewers. It is very common for media streaming applications to be installed as a pilot without a known final use case or size. If this is the case, then it is important to examine the ease with which additional capacity may be added without losing portions of the original investment. Another potential scalability consideration is the ability to off-load off-network traffic to a service provider. If an organization typically uses the system internally, but has content that reaches out to an Internet audience, it may be more economical to own the internal streaming assets and outsource the external connections. This is done by uploading the content, either in real time or as on demand files, to the service provider, who then handles the authentication, if required, and the streaming. This is especially attractive when there is a large amount of internal content, which would otherwise be downloaded back into the network if the application were completely hosted by a service provider. Some packages and technologies may be more compatible with a chosen service provider.
AVT Allowable Latency Standards Applied to Use Case and Characteristics
Use Case | Characteristics | Limiting Standard | Maximum Latency
Conference Room / Small Classroom | No sound reinforcement; manipulation of user content while observing display (keyboard, mouse) | Machine interaction | <45ms
Large Classroom | Reinforcement of presenter voice; manipulation of user content while observing display | Audio Reinforcement | <18-30ms
Auditorium, IMAG (Image Magnification) | Video of presenter or performer magnified for view-ability; non-delayed audio | Distributed Video (Lip Sync) | <22.5ms
Auditorium, Live Sound Reinforcement | Sound is digitized for transmission to loudspeakers | Audio Reinforcement / Live Audio Distribution | <18-30ms
Room Overflow (with Q&A) | Presenter in primary room interacts with remote participants | Human interaction | <45ms
Room Overflow (watch only) | Remote participants only watch the presentation; questions, if any, are sent by email or chat | Real Time, No Interaction | 2s-5+s

This table takes the allowed latencies from the Current Accepted Standards for Latency by Task table and applies them to the applicable use case.

Content Attributes
Content Source | Interface | Resolution and Rate | Bandwidth

Attribute | Description
Content Source | What is generating the content? Is there more than one content source? Do multiple sources need to be streamed simultaneously?
Interface | The physical interface of the content source.
Resolution (Source, Transmitted, and Reproduced) | Video: the number of pixels (the smallest elements of an image) in the video, generally specified as width x height. Audio: the number of bits of information in each audio sample; a higher bit depth results in a higher maximum signal-to-noise ratio (S/N).
Rate | Video frame rate: the number of frames (complete images) per second, generally specified as frames per second (fps). Audio sample rate: the number of times per second an audio signal is sampled (measured for its instantaneous value); higher sample rates allow for the reproduction of higher frequencies.
Bandwidth | The number of bits per second required to transmit the AV signal at an acceptable quality.

CONTENT SOURCE
The source(s) of the content will often help determine the type of encoder you may use. Typical content sources include computers, cameras, and video playback devices. If multiple content sources will be used as inputs at various times for a single video stream, some sort of video switching must be accounted for. If the switched sources have different resolutions, then a video scaler, a component which converts video signals from one display resolution to another, will be required, since changing the streaming resolution within a session is impractical. Video scalers can be external devices or can be built into the encoder. If multiple content sources will be streamed simultaneously for use within a single application, either player software which allows for that use case or a hardware player, often called a streaming multi-viewer, will be required.

HDCP Content
High-bandwidth Digital Content Protection (HDCP) is a form of digital copy protection designed to prevent copying of digital audio and video content as it travels between devices. These devices are typically a video source, such as a Blu-ray player, and a sink, such as a monitor. HDCP capabilities are often present in interfaces such as HDMI, Display Port, and DVI-D. Analog connections such as VGA and component do not have HDCP capabilities. The owner of the content can encrypt the content, and an HDCP compliant player will distribute the key to the content to compliant receiving devices (sinks). This works well in the originally intended scenario of a single source connected to a single sink, like a Blu-ray player connected directly to a monitor in your living room, but a video streaming encoder is not typically an HDCP compliant sink. The exceptions are some systems that certify a sink connection on a dedicated streaming decoder; the stream is encrypted and can only be decoded by a compliant decoder from the same manufacturer as the encoder.
INTERFACE
There are many ways to get the content into a video stream. One factor in determining which method to use is the set of possible ways to interface the content to the device that will compress and stream it.

AV Interface
The most common way that organizations have streamed AV content is through a streaming encoder, a standalone device which uses the AV outputs of the content source as its inputs. It then captures, compresses, and streams the content. There are many standard audio and video interfaces which are not interoperable, although some are with simple passive adapters. It is important to understand the available video interfaces that can be used on your sources when specifying an encoder, or to understand the interfaces on your encoder when specifying a source. For non-matching video interfaces, it is possible to get a media converter which will convert the signals. Audio interfaces tend to be more standard, although they may be analog or digital. Audio may also be brought in on the HDMI video interface.

USB Interface
Software-based video encoders are often designed to use a native video interface within a computer, such as a USB web camera. Increasingly, there are USB cameras with higher functionality, such as pan-tilt-zoom capabilities. Often there may be a desire to include non-webcam content, such as the AV interface from a computer. This may be accomplished through a USB capture device, but it will require thorough investigation to ensure compatibility. VGA and standard definition capture devices generally present few interoperability problems. Although there are several HDMI capture devices which interface HDMI inputs to the USB interface on the computer, they are generally designed for recording and use proprietary software not compatible with streaming or web conferencing software. If a USB capture device is desirable, it is best to make sure it supports the USB video device class (also called USB video class or UVC).
UVC 1.1 uses an uncompressed video format or MJPEG to transfer the video from the origin to the computer. UVC 1.5, released in 2012, also allows for H.264 or VP8 compression. The video stream is converted back to an uncompressed state in the computer and re-encoded within the software to the desired format, even if the desired compression matches the original format. UVC 1.5 is not universally supported, so if there is doubt, UVC 1.1 is most likely to be supported by streaming software.

Common Video Interfaces
Interface | Signal | Max Resolution | Compatibility with Adapter
HDMI | Digital | 4K | DVI-D
DVI-D | Digital | 3,840x2,400 | HDMI, DVI-I
DVI-I | Both | 1,920x1,200 | VGA, DVI-D, HDMI
Display Port | Digital | 4K | HDMI*, DVI-D*
VGA | Analog | 1,920x1,200 | DVI-I
Component | Analog | 1080i | No
S-Video | Analog | 576i | No
Composite | Analog | 576i | No
SDI (SMPTE 344M)** | Digital | 576p | No
HD-SDI (SMPTE 292M)** | Digital | | Backwards compatible with previous SDI
Dual Link HD-SDI (SMPTE 372M)** | Digital | 1080p | Backwards compatible with previous SDI
3G-SDI (SMPTE 424M) | Digital | 1080p | Backwards compatible with previous SDI
*Only a Display Port source is compatible
**SDI interfaces only support SMPTE resolutions and not all the standard computer resolutions

Software Interface
In a simple application which requires only content that is displayed on a computer, software running on the computer may capture, compress, and stream the content. This software may be installed and reside on the computer, or it may be a cloud-hosted application. Many cloud-based conferencing services have options which allow multiple participants with web cameras and shared screens to capture and stream the conference. This type of application is likely to grow as computers continue to increase in processing power and web interfaces, especially HTML5, become more sophisticated.

Direct Streaming Devices
Increasingly, there are cameras which have a streaming interface incorporated into the camera. These are primarily designed for the security market and may lack the features required for a content-based streaming application unless paired with production or management software designed for this sort of capture. In the future, these devices and the required management software may become more common, but today they will typically only be applicable to the simplest integrations.

RESOLUTION AND RATE
Video
Digital video consists of a series of bitmap digital images displayed in rapid succession at a constant rate. These individual images are called frames. The terminology for the rate at which these frames are shown is frames per second (fps). Each frame of digital video is composed of an array of pixels. In the computer graphics world this array is called a raster, a term that comes from the early days of CRT displays.

Common Video Resolutions
Video | AR | W | H
QCIF | 4:3 | 176 | 144
CIF | 4:3 | 352 | 288
SIF | 4:3 | 352 | 240
4CIF | 4:3 | 704 | 576
480i (NTSC) | 4:3 | 720 | 480
480p | 4:3 | 720 | 480
576i | 4:3 | 720 | 576
576p | 4:3 | 720 | 576
720p (HDTV) | 16:9 | 1,280 | 720
1080i (HDTV) | 16:9 | 1,920 | 1,080
1080p (HDTV) | 16:9 | 1,920 | 1,080
2160p (UHDTV) | 16:9 | 3,840 | 2,160
4320p (UHDTV or 8K) | 16:9 | 7,680 | 4,320
Frame resolution is traditionally expressed as width x height, although recently it has become common to express it as just the height, with the scan mode indicated by a p or i, and the frame rate. Aspect ratio is the ratio of width to height, expressed as a ratio or a reduced decimal number. Common aspect ratios are 4:3 or 1.33:1 (often represented as 1.33) and 16:9 or 1.78:1. Computer graphics often use a 16:10 ratio. There are two scan modes: progressive and interlaced. Progressive mode (p) displays each full frame. Interlaced mode (i) displays each frame as two fields, the first displaying all the odd horizontal lines and the second displaying all the even lines. It is important to reconcile the source rates and resolutions with the streaming rates and resolutions that meet the audience requirements in order to optimize bandwidth (discussed later on). If the audience is going to be viewing the content on a phone with 480p resolution, then streaming the source content at 1080p is a waste. If the audience is receiving PowerPoint slides that only change every couple of minutes, then streaming at 60fps is a waste. On the other hand, if the streaming resolution or frame rate is too small to properly convey the information, the needs of the audience are not met. Matching the source and streaming resolution can be handled in two ways. The resolution of the source can be changed; this may not be possible or desirable, especially in situations where the streamed source is also being presented to a live audience who can benefit from the additional resolution. Or, the source can be scaled to the desired resolution.

Video Scaling
A video scaler is a component which converts video signals from one resolution to another. Video scalers can be external devices, can be built into the encoder, can be a network service, or, in on demand situations, scaling can be done as a post-capture process in software to re-encode the content at the desired resolution.
The source may be scaled multiple times in cases where multiple resolutions are desired. The original source content should be used as the master for each scaled output in order to avoid generational loss. Typically, better scaling results are obtained when the source is at a higher resolution than the stream. When sources are upscaled (the displayed resolution is increased), either the familiar blocky image with large pixels is produced, or information is invented to make viewing less distracting, information which is not necessarily valid. With downscaling, while fine detail may be lost, the result is typically less distracting and doesn't give that out-of-focus impression.

Common Computer Resolutions
Video | AR | W | H
VGA | 4:3 | 640 | 480
SVGA | 4:3 | 800 | 600
XGA | 4:3 | 1,024 | 768
WXGA | 5:3 | 1,280 | 768
WXGA | 16:10 | 1,280 | 800
SXGA | 5:4 | 1,280 | 1,024
SXGA+ | 4:3 | 1,400 | 1,050
WXGA+ | 16:10 | 1,440 | 900
UXGA | 4:3 | 1,600 | 1,200
WSXGA+ | 16:10 | 1,680 | 1,050
WUXGA | 16:10 | 1,920 | 1,200
QWXGA | 16:9 | 2,048 | 1,152
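The aspect ratio and scaling relationships discussed above can be sketched in a few lines. The helper names are hypothetical, and real scalers use far more sophisticated filtering; this only shows the dimension arithmetic:

```python
from fractions import Fraction

def aspect_ratio(width: int, height: int) -> Fraction:
    """Reduced aspect ratio of a resolution, e.g. 1920x1080 -> 16/9."""
    return Fraction(width, height)

def downscale(src_w: int, src_h: int, target_h: int) -> tuple[int, int]:
    """Compute a downscaled width for a target height, preserving aspect ratio.
    The width is forced even, since many codecs prefer even dimensions."""
    w = round(src_w * target_h / src_h)
    return w - (w % 2), target_h

print(aspect_ratio(1920, 1080))    # 16/9
print(downscale(1920, 1080, 720))  # (1280, 720)
print(downscale(1920, 1080, 480))  # (852, 480): 16:9 height 480 is not exact
```

The last line illustrates why scaled resolutions are sometimes slightly off the nominal ratio: 480 lines at exactly 16:9 would need a fractional width, so the scaler rounds.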

Audio Frequency Ranges
Human Voice | 80-1,200Hz
Piano | 27.5-4,186Hz
Orchestra Strings | 30-16,000Hz
Telephone | 200-3,500Hz
Video Conferencing | 80-8,000Hz
CD Audio | 20-22,000Hz
(Chart scale: 20Hz to 20,000Hz, with regions labeled Pitch, Color, and Presence)

Audio
Audio, which is inherently analog, is converted to a digital representation that can be transmitted through a network by a process with the easy-to-remember name analog-to-digital conversion (ADC). In ADC, the audio signal is sampled (measured) at a regular interval. The resulting set of samples is a Pulse Code Modulated (PCM) signal. PCM has two attributes which determine the quality of the sampled audio: bit depth and sampling rate. Bit depth, the number of bits used to quantify the signal, determines the resolution of a sample. An 8 bit sample has 256 possible values, a 16 bit sample has 65,536, and a 24 bit sample has 16,777,216. The sampling rate is the number of times in a second the sample is taken. This determines the highest frequency that can be reproduced, as described by the Nyquist-Shannon sampling theorem, which states: If a function x(t) contains no frequencies higher than B hertz, it is completely determined by giving its ordinates at a series of points spaced 1/(2B) seconds apart. This means that the highest frequency which can be reproduced is half the sample rate. The full range of human hearing is between 20Hz and 20kHz. Human speech is in the range of 300Hz to 3,500Hz. Common sampling frequencies run from 8,000Hz for telephone calls to 48,000Hz for professional audio recordings. PCM audio can be streamed uncompressed or it can be further compressed before streaming. If the audio source is a digital PCM stream with a different sample rate from the desired stream, that conversion is typically done as part of the encoding process.

BANDWIDTH
In the context of content, bandwidth is the bitrate of a single AV stream crossing the network.
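The sampling arithmetic above, and how it translates into the bitrate of a raw audio stream, can be checked with a short sketch (illustrative helper names):

```python
def bit_depth_levels(bits: int) -> int:
    """Number of distinct values a sample can take at a given bit depth."""
    return 2 ** bits

def nyquist_limit_hz(sample_rate_hz: int) -> float:
    """Highest reproducible frequency: half the sample rate."""
    return sample_rate_hz / 2

def pcm_bitrate_bps(sample_rate_hz: int, bits: int, channels: int) -> int:
    """Raw (uncompressed) PCM bandwidth in bits per second."""
    return sample_rate_hz * bits * channels

print(bit_depth_levels(16))            # 65536 possible values per sample
print(nyquist_limit_hz(48_000))        # 24000.0 Hz, above the range of hearing
print(pcm_bitrate_bps(44_100, 16, 2))  # 1411200 bps: stereo CD audio, ~1.4 Mbps
```

The last figure shows why even audio is often compressed before streaming: uncompressed CD-quality stereo alone consumes about 1.4 Mbps of the content bandwidth budget.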
The biggest trade-offs typically made in a streaming application involve balancing quality against bandwidth. Based on the overall available bandwidth discovered in the application phase, a bandwidth budget for the content must be established and used as a goal when evaluating the technologies presented in the next section.

Technology Components

Streaming Technology Components: Video Compression. Compression and Codec Terminology. Intra-frame Codecs. Inter-frame Codecs. Containers. Session Controls. Transport Protocols

VIDEO COMPRESSION
A single uncompressed 1080p, 30 frames per second (fps) video would consume 1.49Gbps (1920 horizontal pixels x 1080 vertical pixels x 30fps x 24 bits of color information) of bandwidth for the data stream alone. There would also be a minimum of 30Mbps of IP overhead and 10-50Mbps of streaming control overhead. To stream or capture this kind of bandwidth for any usable amount of time would be very difficult, and certainly very expensive. We use compression to combat this problem. The tools used to perform compression are called codecs (COmpressor DECompressor). The term codec is often used interchangeably with the compression algorithm itself, which may or may not be accurate. A codec can contain one or more compression algorithms and methodologies which, depending on the codec's design and settings, may be used together or completely independently. Codecs also define other characteristics, such as the bitstream format: the way the compressed information is organized within a file or transport. Two different codecs could use the exact same compression algorithm and still be incompatible.

Lossless vs Lossy Compression
In lossless compression, the bandwidth of the stream is reduced, and when it is decompressed the output is exactly the same as the input. In lossy compression, some information is discarded, but intelligent choices are made to minimize the perceived impact of the loss. While there are some lossless video codecs, and some codecs have both lossless and lossy profiles, business-class compression is almost always lossy. Trade-offs among bandwidth, resolution, frame rate, color accuracy, and other factors determine how lossy.
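The 1.49Gbps figure above can be verified with simple arithmetic. A minimal sketch (function name is my own):

```python
def uncompressed_video_bps(width, height, fps, bits_per_pixel=24):
    """Raw (pre-compression) video bandwidth in bits per second:
    pixels per frame x frames per second x bits of color per pixel."""
    return width * height * fps * bits_per_pixel

raw = uncompressed_video_bps(1920, 1080, 30)
print(raw)        # 1492992000 -> ~1.49 Gbps, matching the figure above
print(raw / 1e9)  # 1.492992
```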
COMPRESSION AND CODEC TERMINOLOGY
There are two main classes of compression: intra-frame and inter-frame. In intra-frame compression (intra is a Latin prefix meaning within), each frame of video is individually compressed as if it were a standalone picture. Inter-frame compression (inter is a Latin prefix meaning between) uses multiple frames as references for each other, allowing portions of the frames that do not change to simply be repeated rather than resent. In video compression there are three types of frames. I-frames (intra coded frames) are compressed individually and can stand alone as complete pictures. These frames carry the most complete information and are therefore the largest (least compressed).

MPEG-1: Low bandwidth, low framerate, low resolution; rarely used except for the mp3 audio format, which is part of this standard.
MPEG-2: Designed, and still used, for broadcast transport and DVD.
MPEG-3: Originally created for HDTV, but was rolled into the MPEG-2 standard and never ratified.
MPEG-4: Uses encoding with additional complexity to achieve higher compression factors than MPEG-2. The most commonly used codec today.

[Figure: Group of Pictures — a repeating pattern of how the various types of frames are arranged]

P-frames (predictive coded frames) use data from previous frames as a reference and are smaller than I-frames (about 1/2 the size of an I-frame). B-frames (bi-predictive coded frames) use both previous and future frames as references. Because of this, they cannot be encoded until the future frame is received. They are the smallest of the three types (about 1/4 the size of an I-frame).

Group of Pictures (GOP)
Intra-frame compression uses only I-frames. Inter-frame compression bundles the three types of frames into a Group of Pictures. A GOP always starts with an I-frame (sometimes referred to as the key frame). The codec defines how the various P- and B-frames are interleaved, which frames can be used as references, and what compression method to use. GOP size is generally a user-configurable setting on the encoder. Because each GOP contains one I-frame, larger GOP sizes allow for higher compression rates.

INTRA-FRAME CODECS
Intra-frame codecs are generally not used for enterprise video streaming because of the lower compression rates achievable compared to inter-frame codecs. The exception is their growing use in applications that replace AV switching: they offer lower latency than inter-frame codecs and can be very lightly compressed to mimic the results of a directly cabled video connection.

M-JPEG
The Joint Photographic Experts Group (JPEG) image file format was developed by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). JPEG encompasses a group of ISO/IEC standards, first issued in 1992. JPEG is a versatile codec, ranging from lossless compression to very lossy. JPEG compression is based on the Discrete Cosine Transform (DCT) algorithm and uses 4:4:4, 4:2:2, or 4:2:0 chroma subsampling depending on the desired compression and quality. M-JPEG (Motion JPEG) video consists of a series of JPEG images.
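The GOP structure and the rough frame-size ratios above (P about 1/2, B about 1/4 of an I-frame) can be sketched to show why larger GOPs compress better. This is a simplified model, not any real codec's numbers.

```python
# Hypothetical GOP sketch using the rough size ratios given above.

def gop_pattern(gop_size, b_frames=2):
    """Build a GOP as a string: one I-frame, then P-frames separated
    by runs of B-frames, e.g. 'IBBPBB...' for gop_size frames."""
    frames = ["I"]
    while len(frames) < gop_size:
        frames.extend(["B"] * b_frames)
        frames.append("P")
    return "".join(frames[:gop_size])

def relative_size(pattern, i=1.0, p=0.5, b=0.25):
    """Total GOP size relative to encoding every frame as an I-frame."""
    cost = {"I": i, "P": p, "B": b}
    return sum(cost[f] for f in pattern) / (len(pattern) * i)

pat = gop_pattern(15)
print(pat)                           # IBBPBBPBBPBBPBB
print(round(relative_size(pat), 3))  # 0.367 -> roughly a third of all-I size
```

Under these assumed ratios, a 15-frame GOP needs only about 37% of the bits an all-I-frame (intra-only) encoding would, which is the bandwidth advantage inter-frame codecs trade against latency.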
JPEG 2000
JPEG 2000 is an updated standard from the ISO/IEC, published in 2000. It is fairly unique in that it uses the Discrete Wavelet Transform (DWT). DWT allows for multi-resolution image representation, which means different image resolutions can be extracted from a single file without re-compressing. Like M-JPEG, M-JPEG 2000 consists of a series of JPEG 2000 frames and is widely used in broadcast production at very high bit rates. The all-I-frame format and high quality make it easy to perform a second compression to an inter-frame codec for transmission.

Proprietary Codecs
There are several non-standard codecs optimized for low latency, often under 20ms. These typically use very light compression and have correspondingly high bandwidth requirements.

INTER-FRAME CODECS
MPEG Codecs
Like JPEG, the Moving Picture Experts Group (MPEG) is a working group formed by ISO/IEC. MPEG's purpose is to establish standards for audio and video compression and transmission. The working group was established in 1988 and published its first standard, MPEG-1, in 1993. The MPEG family is the basis for many codecs.

The MPEG family of standards is organized as a hierarchy of streams. A mix of streams, including the possibility of multiple video streams, is included in the MPEG standards. Elementary Streams (ES) are the raw bitstreams of encoded audio or video. Elementary streams are fully valid files and can be played alone; the mp3 file format is an MPEG-1 Part 3 (audio) Layer 3 elementary stream. Packetized Elementary Streams (PES) are created by segmenting elementary streams into packets (a formatted unit of data) and adding a PES header (supplemental data placed at the beginning of a block of data being stored or transmitted).

[Figure: GOP Compression/Decompression Order]

This stream hierarchy makes MPEG codecs particularly suitable for transport over IP networks. MPEG encoders' capabilities are defined by profiles and levels. The profile defines the subset of features, such as compression algorithm and chroma format. The level defines the subset of quantitative capabilities, such as maximum bit rate and maximum frame size. Decoders often describe their playback specification using the maximum profile and level they are capable of decoding. For example, a decoder with an MPEG specification of MP@ML can play back any stream encoded up to Main Profile, Main Level.

MPEG defines a bitstream and a decoder function. A valid MPEG file or stream is one that can be decoded by the standard decoder. This leaves the encoding methodology up to the manufacturer. While this means there is room for innovation and growth, video quality and encoding efficiency are not guaranteed by the standard, so due diligence should be performed when choosing an encoder.

Inter-frame Compression
MPEG codecs use a variety of compression techniques based around a GOP. A greatly simplified version of the compression is discussed below. The first frame to be compressed in a GOP is the I-frame.

[Figure: Macroblock Compression Example]
MPEG-4 compression is organized around blocks. The illustration depicts the data in a 16 pixel x 16 pixel macroblock, the smallest unit of video in MPEG-2, with 4:2:0 chroma subsampling. The Y component has 256 Y samples. The Cb and Cr components each contain 64 samples. The macroblock is divided into 6 blocks of 64 samples each: four Y, one Cb, one Cr. These 8 x 8 blocks are then compressed using the Discrete Cosine Transform (DCT), and the resulting compressed values are combined to form the compressed macroblock. The entire I-frame is compressed as an array of macroblocks. If the height or width in pixels of the frame is not evenly divisible by 16, the leftover area must still be encoded as a macroblock; the unallocated area is simply not displayed. I-frames can be considered effectively identical to baseline JPEG images.

Even if the next frame in display order is a B-frame, the next frame to be encoded is a P-frame. This is because a B-frame is bi-predictive: it requires a compressed anchor frame both before and after it as references.

[Figure: P-frame Compression Example]
P-frames exploit the fact that often much of the video changes little, if at all. This is called temporal (over time) redundancy. The P-frame is divided into macroblocks, which are checked to see if any of them equal any macroblocks on the anchor frame. If so, they are recorded as the same as that other macroblock, which takes much less space than DCT compression (shown in red in the illustration). The remaining macroblocks are checked to see if they mostly match any other macroblock on the anchor frame. If they do, the moved macroblock is recorded as a vector (a direction and a distance), as shown in green. Any remaining macroblocks are compressed with DCT. If a video drastically changes from one frame to the next (such as a cut), it is more efficient to encode it as an I-frame.

Next, any B-frames between the two anchor frames are encoded. This uses the
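The macroblock bookkeeping described above is easy to compute. A sketch (helper names are my own) showing the 4:2:0 sample counts and the edge padding to whole macroblocks:

```python
# A 16x16 4:2:0 macroblock holds 256 Y + 64 Cb + 64 Cr samples in six
# 8x8 blocks; frames whose dimensions aren't multiples of 16 are padded
# up to whole macroblocks, as described in the text.

import math

SAMPLES_PER_MACROBLOCK = 256 + 64 + 64   # four Y blocks + one Cb + one Cr

def macroblocks(width, height):
    """Macroblocks needed for a frame, padding edges to multiples of 16."""
    return math.ceil(width / 16) * math.ceil(height / 16)

print(SAMPLES_PER_MACROBLOCK)   # 384
print(macroblocks(1920, 1080))  # 8160 -> 1080/16 = 67.5, padded to 68 rows
```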

same process as the P-frame compression, but with a better chance of matching macroblocks, because there are two frames to match against. B-frames provide maximum compression but require the previous as well as the next frame for computation. Therefore, processing B-frames requires more buffering on both the encode and decode sides. Each successive B-frame without an anchor frame adds one frame of latency on the encode side; in the example above, the two frames of latency are shown in red. Because the frames are sent in the order they are encoded, there is only one additional frame buffer on the decode side, no matter the number of B-frames.

H.264
The H series codecs H.261, H.262, H.263, and H.264 are a group of standards published by the International Telecommunication Union (ITU) Telecommunication Standardization Sector, a standards-setting group dating back to the telegraph. Now a specialized agency of the United Nations, the ITU operates as a public-private partnership. The ITU's standards reach across the spectrum of telecommunications, including standards for fax, modems, optical communications, VoIP, Public Key Infrastructure (PKI), SIM cards, and a host of other technologies, to ensure interoperability between manufacturers and between nations. The standards often cross-reference under umbrella standards like the videoconferencing standards H.320 and H.323, which include H.261, H.263, and H.264. The ITU works with other standards organizations, including ISO/IEC. The H.26X series standards map closely to the MPEG standards; in many cases the ITU and ISO/IEC video standards documents are virtually identical.

H.261 — MPEG-1
H.262 — MPEG-2
H.263 — MPEG-4 Part 2
H.264 — MPEG-4 Part 10

H.264 is an ITU adoption of MPEG-4 Part 10 (AVC). The ITU and ISO/IEC jointly maintain the standards, so they remain identical. It is important to note that not all MPEG-4 video is H.264. An MPEG-4 encoder may or may not be H.264, depending on the profiles implemented, unless it states MPEG-4 Part 10 or AVC. An H.264 decoder will be backwards compatible across all the MPEG-4 profiles, and typically backwards compatible across all the H.26X standards, including H.262 (MPEG-2).

CONTAINERS
A container is a metafile: a file that can store multiple types of data in a structured format. A video container holds all the different files and data the player requires to play the video. This may include one or more audio channels, one or more videos, graphics files, metadata, and data files. Metadata is data about data: a data structure that organizes information the player may need, including, but not limited to, timing information, window size, which codec(s) are being used, titles, program information, bookmarks, and closed captioning. FLV, WMV, Ogg, MOV, ASF, and MP4 are examples of container files. Players support multiple containers, and most containers support multiple file types. Just because a player can read a container file does not mean it has all the tools needed to decode the files within the container. For example, a container may hold a video encoded in H.264; if the player does not have an H.264 codec, it will either not play the video or return an error.

MPEG Transport Stream
The MPEG Transport Stream was initially specified under Part 1 of MPEG-2, so it is still often referred to as the MPEG-2 transport stream, causing confusion. Transport stream is included in Part 1 of the MPEG-4 specification without significant changes; it is more accurately called the MPEG Transport Stream, or just Transport Stream (TS). TS, a data packet format, was originally designed to transmit one data packet in four Asynchronous Transfer Mode (ATM) cells. ATM cell payloads are 48 bytes, which accounts for the fairly small TS packet size of 188 bytes. It is purposely designed for digital video and audio transmission mediums where the beginning and the end of the stream may not be identified, and it is the primary container for digital video transmission in the broadcast world. As a container designed for broadcast, it is self-contained: it doesn't require a separate session descriptor (SDP) file as described below, because that information is encoded in the stream.

Color Model and Color Space
Pixels have only one property: color. The color of a pixel in digital video is represented by a string of bits. A color model is a mathematical way of representing a color with numbers. The most commonly known color model is RGB, which represents a color by a combination of red, green, and blue values. 24-bit RGB video has 8 bits representing each color, allowing 256 x 256 x 256 = 16.7 million individual colors. Other bit depths can be used depending on the application. The color model most often used for digital video is Y′Cb′Cr′. The prime (′) is a mathematical symbol showing that the value is non-linear (the sample values are not equally spaced); the model is very typically written YCbCr, as it will be for the remainder of this document. The term YUV, which is the PAL and SECAM equivalent (and therefore incorrectly used in digital video), is also often used interchangeably. Y is luma (brightness); the simple description of Cb and Cr is Blue minus Y and Red minus Y. Color space is the real-world matchup between the numbers and the colors they represent. Depending on the range of colors that must be represented, the same color model can represent different color spaces. When video from a computer, which uses a 24-bit RGB model, is compressed with a standard video codec, the color space must be converted to YCbCr before it can be compressed. If you compress a video and the colors come out wrong, the likely culprit is color space conversion.
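The RGB-to-YCbCr conversion can be illustrated with the common full-range BT.601 matrix. This is one of several matrices in use (broadcast systems often use limited-range or BT.709 variants instead), so treat it as a sketch of the idea, not the one conversion:

```python
# Full-range (JPEG-style) BT.601 RGB -> YCbCr conversion for one pixel.
# Cb and Cr are centered on 128, the "no color difference" neutral value.

def rgb_to_ycbcr(r, g, b):
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
    clamp = lambda v: max(0, min(255, round(v)))   # keep results 8-bit
    return clamp(y), clamp(cb), clamp(cr)

print(rgb_to_ycbcr(255, 255, 255))  # (255, 128, 128): white is pure luma
print(rgb_to_ycbcr(0, 0, 0))        # (0, 128, 128): black, chroma still neutral
print(rgb_to_ycbcr(0, 0, 255))      # blue pushes Cb far above neutral 128
```

Note that for both black and white, Cb and Cr sit at the neutral 128: all the picture information is in Y. That separation is what lets 4:2:0 subsampling throw away chroma resolution with little visible impact.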

18 Technology Components s SESSION CONTROLS Session controls are used for establishing and controlling media sessions between end points. Most session controls pass the required metadata to the receiver though a file called a Session Description Protocol (SDP) with the extension.sdp. SDP is a standard governed by the Internet Engineering Task Force (IETF). The SDP (often called a session profile) file contains various information elements, including origination address, session name, connection information (multicast address), media information for each stream (stream name, codec type) and stream attributes like frame size. Progressive Download The simplest form of video on demand is the progressive download, it is not really streaming, but the downloading of the container file in a bitstream order so that the file can start to be played before the download is complete. The advantage of progressive download is that it can be implemented on a simple http server without additional streaming services. As a downside, the user is able to get a copy of the video, often as simply as right clicking on the link and choosing save target as. If a viewer wants to see a small portion of the video, especially at the end, they have to download the entire portion of the video, prior to the desired segment, before they can view it. This can be a waste of time and bandwidth. Real Time Streaming Protocol (RTSP) RTSP is used with entertainment and communications systems to control the streams delivered to the clients via commands like play and pause. Since the player controls what parts of the stream is sent, the user can jump to any spot in the stream. The media is typically transferred using RTP and RTSP described later. TRANSPORT PROTOCOLS The various containers need a way to be streamed across a network to a player. The various methods are typically called Transport Protocols, although technically they may be hybrids of different techniques. 
RTP
Real-Time Transport Protocol (RTP) is not a container format; rather, it is an additional protocol (with other required support protocols) that allows for streaming compressed audio and video and performs the functions of a container. RTP performs time stamping and synchronization of separate audio and video streams and includes information in its header for proper reassembly at the receiving end if packets are received out of order. Real Time Control Protocol (RTCP) is a companion protocol to RTP that is used to maintain Quality of Service (QoS). RTP nodes analyze network conditions and periodically send each other RTCP packets that report on network congestion. All receivers are expected to report back RTCP information, even in a multicast, where the QoS information will not be acted upon. Typically RTP is sent on an even-numbered UDP port, with RTCP messages sent over the next higher odd-numbered port. When audio and video are streamed together, each typically has its own RTP/RTCP streams, and they are reassembled using RTP time stamps at the receiver. An exception to this is the MPEG Program Stream, which handles the multiplexing within the container.

Session Announcement Protocol (SAP)
A service using SAP periodically sends the SDP file as a multicast. The receiver subscribes to the SAP multicast with a SAP listener program so it is aware of the available streams. When the receiver wants to view a stream, it gets the SDP file from the local SAP listener and uses the metadata in the SDP file to subscribe to the stream. SAP is primarily used as a program guide for IPTV systems.

HTTP Live Streaming (HLS)
HLS is an HTTP-based media streaming communications protocol originally implemented by Apple Inc. as part of its QuickTime software. It uses an M3U playlist file rather than an SDP. The player makes HTTP requests based on the information in the M3U, and the server returns MPEG Transport Stream packets for playout.
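An HLS playlist is plain text. A minimal, invented example of the M3U file a player would fetch (segment names and durations are illustrative only):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:6.0,
segment0.ts
#EXTINF:6.0,
segment1.ts
#EXT-X-ENDLIST
```

Each EXTINF entry points at a short MPEG Transport Stream chunk; the player simply requests the chunks it needs over HTTP, which is why any web server will do.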
HLS offers the random access control of RTSP without requiring a streaming server: any web server can serve the streaming video.

Network Transit

Network Transit: TCP and UDP. When to Use Which. Unicast and Multicast

TCP AND UDP
Transmission Control Protocol (TCP) and User Datagram Protocol (UDP) are the two most common protocols of the Internet Protocol (IP) suite. Both protocols use port numbers as their addressing scheme. Port numbers are locally significant to a host: they are not used as addressing for traveling across the network, but as logical addresses so that data streams can be directed to the correct decoding process. For example, in RTSP streaming, each media stream is directed to a different port number so the program knows which packets contain audio and which contain video.

TCP
TCP is described as a reliable, connection-oriented protocol. This means the sender and receiver have a conversation about the data to be sent and how fast to send it; if some of it doesn't make it, the receiver knows and requests that the sender resend it. In TCP transmission, if a packet is lost or corrupted (then discarded) and sent later, it is the responsibility of the network layer of the receiver to put the data back in the right order before sending it to the program. Data sent across the network can be segmented across multiple TCP packets as a TCP stream, and the network layer of the receiver must reassemble the data before passing it to the program. (In telecommunications there are two ways to transfer data: messages and streams. Messages have a beginning, an end, a syntax, and a size. Streams do not have set boundaries and can transmit any data. TCP and UDP are stream-oriented protocols; video streaming is also stream-oriented. They are different applications that employ the same communications concept.) For these reasons, TCP is described as a heavyweight protocol.

UDP
UDP is described as an unreliable, connectionless protocol. This means that once the receiver requests the data, the sender just sends it and assumes it gets there.
There is no mechanism within UDP to request retransmission if a packet is lost or corrupted (then discarded), although some programs that use UDP transmission check the data and ask for it to be sent again at the program level. In UDP transmission, the packets contain complete segments of data called datagrams. The network layer of the receiver just passes the datagrams to the program as they come in, without regard to order or other packets. For these reasons, UDP is described as a lightweight protocol.

WHEN TO USE WHICH
TCP is a great protocol for transferring data like a document or a web page, where accuracy is very important; multiple mechanisms ensure the data gets there with no errors. However, all that error correction adds delay, and the larger header size adds bandwidth overhead compared to UDP. UDP is typically used when the data is time sensitive and resilient to some data loss, like streaming video transport. With the large amount of data passed with streaming video, the bandwidth savings is also appreciated. Multicast, with its one-to-many approach, is particularly ill suited to TCP, which includes one-to-one handshaking and error correction; therefore multicast almost universally uses UDP. That's not to say video can't be sent over TCP — YouTube uses TCP transmission. TCP is just better suited to non-real-time, one-to-one transmission like video on demand, which can be transmitted over either TCP or UDP.
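UDP's fire-and-forget datagram model can be seen in a few lines with the standard socket API. A minimal loopback sketch (ports and payloads are arbitrary): each sendto() is one self-contained datagram, with no handshake, no acknowledgment, and no reordering by the network layer.

```python
# Minimal loopback demonstration of UDP datagrams.

import socket

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))      # let the OS pick a free port
port = receiver.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for chunk in (b"frame-1", b"frame-2"):
    sender.sendto(chunk, ("127.0.0.1", port))   # no handshake, no ACK

data, addr = receiver.recvfrom(2048)  # exactly one datagram per recvfrom()
print(data)                           # b'frame-1'
sender.close()
receiver.close()
```

Contrast this with TCP, where the two sides would first complete a handshake and the receiver's network layer would present one continuous byte stream rather than discrete datagrams.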

UNICAST AND MULTICAST
There are two primary ways video is transmitted across an IP network: unicast and multicast. Unicast is used in applications like video on demand, where each user views the content on their own time frame. Due to increased network consumption, it is not preferable for applications where multiple viewers receive the same content simultaneously. Multicast is preferable in real-time applications where the network supports it, typically on a campus network. Multicast on the Internet is not practical because the Internet is generally not multicast-enabled.

Unicast
Unicast is a one-to-one connection between the decoder and the source. Unicast uses IP delivery methods such as Transmission Control Protocol (TCP) or User Datagram Protocol (UDP), which are session-based protocols. When a video decoder receives a unicast stream from a streaming source, that client has a direct relationship with the server. Each unicast client that connects to the server takes up additional bandwidth. For example, if you have 10 clients all playing 1 Mbps streams, those clients as a group are taking up 10 Mbps. If you have only one client playing the 1 Mbps stream, only 1 Mbps is being used.

Multicast
Multicast is a one-to-many connection between the source and multiple decoders. The multicast source relies on multicast-enabled routers to forward the packets to all client subnets that have clients listening. There is no direct relationship between the decoders and the source. The decoders subscribe to a multicast group, and the network ensures delivery of the stream. Each client that listens to the multicast adds no additional overhead on the server. The server sends out only one stream per source; the load on the source is the same whether one client or 1,000 clients are listening.
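The server-load arithmetic above can be stated as two one-line functions (names are my own), which makes the scaling difference explicit:

```python
# Back-of-envelope source load: unicast cost scales with the audience,
# multicast cost does not.

def unicast_load_mbps(stream_mbps, clients):
    """Each unicast client receives its own copy of the stream."""
    return stream_mbps * clients

def multicast_load_mbps(stream_mbps, clients):
    """One copy per source, regardless of how many clients listen."""
    return stream_mbps

print(unicast_load_mbps(1, 10))      # 10 -> ten 1 Mbps viewers cost 10 Mbps
print(multicast_load_mbps(1, 1000))  # 1  -> same load for 1 or 1,000 viewers
```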
Multicast works by the source device addressing the content to a multicast address, unlike unicast, where a copy of the content is addressed to each destination individually. The range of IP addresses reserved for multicast is 224.0.0.0 to 239.255.255.255; however, many address ranges within it are reserved for special purposes. Best practice for streaming is to use the administratively scoped range from 239.0.0.0 to 239.255.255.255, unless there is a specific reason to use other addressing. A properly configured network forwards the multicast content to any user subscribed to the multicast, with only one copy of the content on any given network segment. If an endpoint receives a multicast it is not subscribed to, it ignores it.
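From the decoder's side, "subscribing to a multicast group" is a single socket option. A sketch with the standard socket API — the group address and port are arbitrary examples from the administratively scoped 239.0.0.0/8 range, and the join is wrapped in a try/except because it can fail on a host with no multicast-capable interface:

```python
# Sketch of a decoder joining a multicast group. IP_ADD_MEMBERSHIP
# tells the OS (and, via IGMP, the network) that this host wants
# copies of traffic sent to GROUP.

import socket
import struct

GROUP, PORT = "239.1.1.10", 5004

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# mreq = 4-byte group address + 4-byte local interface (0.0.0.0 = any)
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
try:
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    joined = True
except OSError:
    joined = False   # e.g. no multicast-capable interface available

# A real decoder would now loop on sock.recvfrom(...) for stream packets.
sock.close()
```

Once joined, the network delivers one copy of the stream to the segment; leaving the group (IP_DROP_MEMBERSHIP, or simply closing the socket) tells the routers to stop forwarding it.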


More information

Classes of multimedia Applications

Classes of multimedia Applications Classes of multimedia Applications Streaming Stored Audio and Video Streaming Live Audio and Video Real-Time Interactive Audio and Video Others Class: Streaming Stored Audio and Video The multimedia content

More information

Understanding Video Latency What is video latency and why do we care about it?

Understanding Video Latency What is video latency and why do we care about it? By Pete Eberlein, Sensoray Company, Inc. Understanding Video Latency What is video latency and why do we care about it? When choosing components for a video system, it is important to understand how the

More information

A Look at Emerging Standards in Video Security Systems. Chris Adesanya Panasonic Network Systems Company Chris.Adesanya@us.panasonic.

A Look at Emerging Standards in Video Security Systems. Chris Adesanya Panasonic Network Systems Company Chris.Adesanya@us.panasonic. A Look at Emerging Standards in Video Security Systems Chris Adesanya Panasonic Network Systems Company Chris.Adesanya@us.panasonic.com Standards Standards are published documents that establish specifications

More information

How To Understand How Bandwidth Is Used In A Network With A Realtime Connection

How To Understand How Bandwidth Is Used In A Network With A Realtime Connection CoS and QoS - Managing Bandwidth, Complexity, and Cost One area of networking technology that has been heavily discussed, but less well actually understood is the topic of QoS or Quality of Service. Within

More information

Requirements of Voice in an IP Internetwork

Requirements of Voice in an IP Internetwork Requirements of Voice in an IP Internetwork Real-Time Voice in a Best-Effort IP Internetwork This topic lists problems associated with implementation of real-time voice traffic in a best-effort IP internetwork.

More information

White paper. Latency in live network video surveillance

White paper. Latency in live network video surveillance White paper Latency in live network video surveillance Table of contents 1. Introduction 3 2. What is latency? 3 3. How do we measure latency? 3 4. What affects latency? 4 4.1 Latency in the camera 4 4.1.1

More information

Cisco TelePresence Quick Set C20

Cisco TelePresence Quick Set C20 Cisco TelePresence Quick Set C20 The Cisco TelePresence portfolio creates an immersive, face-to-face experience over the network empowering you to collaborate with others like never before. Through a powerful

More information

Bandwidth Adaptation for MPEG-4 Video Streaming over the Internet

Bandwidth Adaptation for MPEG-4 Video Streaming over the Internet DICTA2002: Digital Image Computing Techniques and Applications, 21--22 January 2002, Melbourne, Australia Bandwidth Adaptation for MPEG-4 Video Streaming over the Internet K. Ramkishor James. P. Mammen

More information

Video display interfaces

Video display interfaces White Paper Video display interfaces Within the last few years, a number of new and competing video display interfaces have emerged. While each claims to have unique benefits, there is some overlap. This

More information

(Refer Slide Time: 4:45)

(Refer Slide Time: 4:45) Digital Voice and Picture Communication Prof. S. Sengupta Department of Electronics and Communication Engineering Indian Institute of Technology, Kharagpur Lecture - 38 ISDN Video Conferencing Today we

More information

QoS issues in Voice over IP

QoS issues in Voice over IP COMP9333 Advance Computer Networks Mini Conference QoS issues in Voice over IP Student ID: 3058224 Student ID: 3043237 Student ID: 3036281 Student ID: 3025715 QoS issues in Voice over IP Abstract: This

More information

Combining Voice over IP with Policy-Based Quality of Service

Combining Voice over IP with Policy-Based Quality of Service TechBrief Extreme Networks Introduction Combining Voice over IP with Policy-Based Quality of Service Businesses have traditionally maintained separate voice and data networks. A key reason for this is

More information

Region 10 Videoconference Network (R10VN)

Region 10 Videoconference Network (R10VN) Region 10 Videoconference Network (R10VN) Network Considerations & Guidelines 1 What Causes A Poor Video Call? There are several factors that can affect a videoconference call. The two biggest culprits

More information

Study and Implementation of Video Compression Standards (H.264/AVC and Dirac)

Study and Implementation of Video Compression Standards (H.264/AVC and Dirac) Project Proposal Study and Implementation of Video Compression Standards (H.264/AVC and Dirac) Sumedha Phatak-1000731131- sumedha.phatak@mavs.uta.edu Objective: A study, implementation and comparison of

More information

Computer Network. Interconnected collection of autonomous computers that are able to exchange information

Computer Network. Interconnected collection of autonomous computers that are able to exchange information Introduction Computer Network. Interconnected collection of autonomous computers that are able to exchange information No master/slave relationship between the computers in the network Data Communications.

More information

EE3414 Multimedia Communication Systems Part I

EE3414 Multimedia Communication Systems Part I EE3414 Multimedia Communication Systems Part I Spring 2003 Lecture 1 Yao Wang Electrical and Computer Engineering Polytechnic University Course Overview A University Sequence Course in Multimedia Communication

More information

Voice over IP. Presentation Outline. Objectives

Voice over IP. Presentation Outline. Objectives Voice over IP Professor Richard Harris Presentation Outline Brief overview of VoIP and applications Challenges of VoIP IP Support for Voice Protocols used for VoIP (current views) RTP RTCP RSVP H.323 Semester

More information

2102642 Computer Vision and Video Electronics

2102642 Computer Vision and Video Electronics What is Video? 2102642 Computer Vision and Video Electronics Chapter 7 Video Signals and Standards Suree Pumrin, Ph.D. 1 Video is a collation of images presented so fast they give the illusion of motion;

More information

Applications that Benefit from IPv6

Applications that Benefit from IPv6 Applications that Benefit from IPv6 Lawrence E. Hughes Chairman and CTO InfoWeapons, Inc. Relevant Characteristics of IPv6 Larger address space, flat address space restored Integrated support for Multicast,

More information

Figure 1: Relation between codec, data containers and compression algorithms.

Figure 1: Relation between codec, data containers and compression algorithms. Video Compression Djordje Mitrovic University of Edinburgh This document deals with the issues of video compression. The algorithm, which is used by the MPEG standards, will be elucidated upon in order

More information

Video over IP WHITE PAPER. Executive Summary

Video over IP WHITE PAPER. Executive Summary Video over IP Executive Summary Thinking as an executive, there are pressures to keep costs down and help a company survive in this challenging market. Let us assume that company A has 10 locations and

More information

Best practices for producing quality digital video files

Best practices for producing quality digital video files University of Michigan Deep Blue deepblue.lib.umich.edu 2011-03-09 Best practices for producing quality digital video files Formats Group, Deep Blue http://hdl.handle.net/2027.42/83222 Best practices for

More information

Network Security Systems Fundamentals for ITS Professionals

Network Security Systems Fundamentals for ITS Professionals Network Security Systems Fundamentals for ITS Professionals Chris Adesanya Sr. Systems Engineer Panasonic System Solutions Company adesanyac@us.panasonic.com BICSI Southeast Regional Meeting Dulles, VA

More information

VoIP Bandwidth Considerations - design decisions

VoIP Bandwidth Considerations - design decisions VoIP Bandwidth Considerations - design decisions When calculating the bandwidth requirements for a VoIP implementation the two main protocols are: a signalling protocol such as SIP, H.323, SCCP, IAX or

More information

Video Conferencing. Femi Alabi UNC-CH - Comp 523 November 22, 2010

Video Conferencing. Femi Alabi UNC-CH - Comp 523 November 22, 2010 Video Conferencing Femi Alabi UNC-CH - Comp 523 November 22, 2010 Introduction Videoconferencing What Is It? Videoconferencing is a method of communicating between two or more locations where sound, vision

More information

HIGH-DEFINITION: THE EVOLUTION OF VIDEO CONFERENCING

HIGH-DEFINITION: THE EVOLUTION OF VIDEO CONFERENCING HIGH-DEFINITION: THE EVOLUTION OF VIDEO CONFERENCING Technology Brief Polycom, Inc. 4750 Willow Road Pleasanton, CA 94588 1.800.POLYCOM This white paper defines high-definition (HD) and how it relates

More information

Clearing the Way for VoIP

Clearing the Way for VoIP Gen2 Ventures White Paper Clearing the Way for VoIP An Alternative to Expensive WAN Upgrades Executive Overview Enterprises have traditionally maintained separate networks for their voice and data traffic.

More information

. ImagePRO. ImagePRO-SDI. ImagePRO-HD. ImagePRO TM. Multi-format image processor line

. ImagePRO. ImagePRO-SDI. ImagePRO-HD. ImagePRO TM. Multi-format image processor line ImagePRO TM. ImagePRO. ImagePRO-SDI. ImagePRO-HD The Folsom ImagePRO TM is a powerful all-in-one signal processor that accepts a wide range of video input signals and process them into a number of different

More information

5. DEPLOYMENT ISSUES Having described the fundamentals of VoIP and underlying IP infrastructure, let s address deployment issues.

5. DEPLOYMENT ISSUES Having described the fundamentals of VoIP and underlying IP infrastructure, let s address deployment issues. 5. DEPLOYMENT ISSUES Having described the fundamentals of VoIP and underlying IP infrastructure, let s address deployment issues. 5.1 LEGACY INTEGRATION In most cases, enterprises own legacy PBX systems,

More information

Troubleshooting VoIP and Streaming Video Problems

Troubleshooting VoIP and Streaming Video Problems Using the ClearSight Analyzer to troubleshoot the top five VoIP problems and troubleshoot Streaming Video With the prevalence of Voice over IP and Streaming Video applications within the enterprise, it

More information

159.334 Computer Networks. Voice over IP (VoIP) Professor Richard Harris School of Engineering and Advanced Technology (SEAT)

159.334 Computer Networks. Voice over IP (VoIP) Professor Richard Harris School of Engineering and Advanced Technology (SEAT) Voice over IP (VoIP) Professor Richard Harris School of Engineering and Advanced Technology (SEAT) Presentation Outline Basic IP phone set up The SIP protocol Computer Networks - 1/2 Learning Objectives

More information

An Introduction to VoIP Protocols

An Introduction to VoIP Protocols An Introduction to VoIP Protocols www.netqos.com Voice over IP (VoIP) offers the vision of a converged network carrying multiple types of traffic (voice, video, and data, to name a few). To carry out this

More information

Curso de Telefonía IP para el MTC. Sesión 2 Requerimientos principales. Mg. Antonio Ocampo Zúñiga

Curso de Telefonía IP para el MTC. Sesión 2 Requerimientos principales. Mg. Antonio Ocampo Zúñiga Curso de Telefonía IP para el MTC Sesión 2 Requerimientos principales Mg. Antonio Ocampo Zúñiga Factors Affecting Audio Clarity Fidelity: Audio accuracy or quality Echo: Usually due to impedance mismatch

More information

Internet Desktop Video Conferencing

Internet Desktop Video Conferencing Pekka Isto 13.11.1998 1(8) Internet Desktop Video Conferencing ABSTRACT: This is report outlines possible use of Internet desktop videoconferencing software in a distributed engineering project and presents

More information

Voice over IP (VoIP) Overview. Introduction. David Feiner ACN 2004. Introduction VoIP & QoS H.323 SIP Comparison of H.323 and SIP Examples

Voice over IP (VoIP) Overview. Introduction. David Feiner ACN 2004. Introduction VoIP & QoS H.323 SIP Comparison of H.323 and SIP Examples Voice over IP (VoIP) David Feiner ACN 2004 Overview Introduction VoIP & QoS H.323 SIP Comparison of H.323 and SIP Examples Introduction Voice Calls are transmitted over Packet Switched Network instead

More information

District of Columbia Courts Attachment 1 Video Conference Bridge Infrastructure Equipment Performance Specification

District of Columbia Courts Attachment 1 Video Conference Bridge Infrastructure Equipment Performance Specification 1.1 Multipoint Control Unit (MCU) A. The MCU shall be capable of supporting (20) continuous presence HD Video Ports at 720P/30Hz resolution and (40) continuous presence ports at 480P/30Hz resolution. B.

More information

PackeTV Mobile. http://www.vsicam.com. http://www.linkedin.com/company/visionary- solutions- inc. http://www.facebook.com/vsiptv

PackeTV Mobile. http://www.vsicam.com. http://www.linkedin.com/company/visionary- solutions- inc. http://www.facebook.com/vsiptv PackeTV Mobile Delivering HLS Video to Mobile Devices White Paper Created by Visionary Solutions, Inc. July, 2013 http://www.vsicam.com http://www.linkedin.com/company/visionary- solutions- inc. http://www.facebook.com/vsiptv

More information

Using the ClearSight Analyzer To Troubleshoot the Top Five VoIP Problems And Troubleshooting Streaming Video

Using the ClearSight Analyzer To Troubleshoot the Top Five VoIP Problems And Troubleshooting Streaming Video Using the ClearSight Analyzer To Troubleshoot the Top Five VoIP Problems And Troubleshooting Streaming Video With the prevalence of Voice over IP applications within the enterprise, it is important to

More information

IP-Telephony Real-Time & Multimedia Protocols

IP-Telephony Real-Time & Multimedia Protocols IP-Telephony Real-Time & Multimedia Protocols Bernard Hammer Siemens AG, Munich Siemens AG 2001 1 Presentation Outline Media Transport RTP Stream Control RTCP RTSP Stream Description SDP 2 Real-Time Protocol

More information

Goal We want to know. Introduction. What is VoIP? Carrier Grade VoIP. What is Meant by Carrier-Grade? What is Meant by VoIP? Why VoIP?

Goal We want to know. Introduction. What is VoIP? Carrier Grade VoIP. What is Meant by Carrier-Grade? What is Meant by VoIP? Why VoIP? Goal We want to know Introduction What is Meant by Carrier-Grade? What is Meant by VoIP? Why VoIP? VoIP Challenges 2 Carrier Grade VoIP Carrier grade Extremely high availability 99.999% reliability (high

More information

SIP Trunking and Voice over IP

SIP Trunking and Voice over IP SIP Trunking and Voice over IP Agenda What is SIP Trunking? SIP Signaling How is Voice encoded and transported? What are the Voice over IP Impairments? How is Voice Quality measured? VoIP Technology Confidential

More information

(Refer Slide Time: 01:46)

(Refer Slide Time: 01:46) Data Communication Prof. A. Pal Department of Computer Science & Engineering Indian Institute of Technology, Kharagpur Lecture - 38 Multimedia Services Hello viewers, welcome to today's lecture on multimedia

More information

Video Codec Requirements and Evaluation Methodology

Video Codec Requirements and Evaluation Methodology -47pt -30pt :white Font : edium t Video Codec Requirements and Evaluation Methodology www.huawei.com draft-filippov-netvc-requirements-02 Alexey Filippov, Jose Alvarez (Huawei Technologies) Contents An

More information

Study and Implementation of Video Compression standards (H.264/AVC, Dirac)

Study and Implementation of Video Compression standards (H.264/AVC, Dirac) Study and Implementation of Video Compression standards (H.264/AVC, Dirac) EE 5359-Multimedia Processing- Spring 2012 Dr. K.R Rao By: Sumedha Phatak(1000731131) Objective A study, implementation and comparison

More information

Audio and Video Synchronization:

Audio and Video Synchronization: White Paper Audio and Video Synchronization: Defining the Problem and Implementing Solutions Linear Acoustic Inc. www.linearacaoustic.com 2004 Linear Acoustic Inc Rev. 1. Introduction With the introduction

More information

WebEx. Network Bandwidth White Paper. WebEx Communications Inc. - 1 -

WebEx. Network Bandwidth White Paper. WebEx Communications Inc. - 1 - WebEx Network Bandwidth White Paper WebEx Communications Inc. - 1 - Copyright WebEx Communications, Inc. reserves the right to make changes in the information contained in this publication without prior

More information

How To Understand The Differences Between A Fax And A Fax On A G3 Network

How To Understand The Differences Between A Fax And A Fax On A G3 Network The Fax on IP Networks White Paper February 2011 2 The Fax on IP Networks Contents Overview... 3 Group 3 Fax Technology... 4 G.711 Fax Pass-Through... 5 T.38 IP Fax Relay... 6 Network Design Considerations...

More information

Video-Conferencing System

Video-Conferencing System Video-Conferencing System Evan Broder and C. Christoher Post Introductory Digital Systems Laboratory November 2, 2007 Abstract The goal of this project is to create a video/audio conferencing system. Video

More information

Computer Networks and Internets, 5e Chapter 6 Information Sources and Signals. Introduction

Computer Networks and Internets, 5e Chapter 6 Information Sources and Signals. Introduction Computer Networks and Internets, 5e Chapter 6 Information Sources and Signals Modified from the lecture slides of Lami Kaya (LKaya@ieee.org) for use CECS 474, Fall 2008. 2009 Pearson Education Inc., Upper

More information

How To Test Video Quality With Real Time Monitor

How To Test Video Quality With Real Time Monitor White Paper Real Time Monitoring Explained Video Clarity, Inc. 1566 La Pradera Dr Campbell, CA 95008 www.videoclarity.com 408-379-6952 Version 1.0 A Video Clarity White Paper page 1 of 7 Real Time Monitor

More information

Conditions affecting performance of a WebEx session.

Conditions affecting performance of a WebEx session. Conditions affecting performance of a WebEx session. WebEx Network Bandwidth White Paper The performance of a WebEx session depends on many factors. While vendors like Cisco can control some of these factors,

More information

Video Conferencing Glossary of Terms

Video Conferencing Glossary of Terms Video Conferencing Glossary of Terms A Algorithm A step-by-step problem-solving procedure. Transmission of compressed video over a communications network requires sophisticated compression algorithms.

More information

Agilent Technologies Performing Pre-VoIP Network Assessments. Application Note 1402

Agilent Technologies Performing Pre-VoIP Network Assessments. Application Note 1402 Agilent Technologies Performing Pre-VoIP Network Assessments Application Note 1402 Issues with VoIP Network Performance Voice is more than just an IP network application. It is a fundamental business and

More information

SERIES H: AUDIOVISUAL AND MULTIMEDIA SYSTEMS Infrastructure of audiovisual services Communication procedures

SERIES H: AUDIOVISUAL AND MULTIMEDIA SYSTEMS Infrastructure of audiovisual services Communication procedures I n t e r n a t i o n a l T e l e c o m m u n i c a t i o n U n i o n ITU-T TELECOMMUNICATION STANDARDIZATION SECTOR OF ITU Technical Paper (11 July 2014) SERIES H: AUDIOVISUAL AND MULTIMEDIA SYSTEMS Infrastructure

More information

Analog vs. Digital Transmission

Analog vs. Digital Transmission Analog vs. Digital Transmission Compare at two levels: 1. Data continuous (audio) vs. discrete (text) 2. Signaling continuously varying electromagnetic wave vs. sequence of voltage pulses. Also Transmission

More information

Datasheet EdgeVision

Datasheet EdgeVision Datasheet Multichannel Quality of Experience Monitoring Stay in control with customizable monitoring and interfaces. offers richly featured, Quality of Experience (QoE) monitoring across an entire network

More information

Troubleshooting Common Issues in VoIP

Troubleshooting Common Issues in VoIP Troubleshooting Common Issues in VoIP 2014, SolarWinds Worldwide, LLC. All rights reserved. Voice over Internet Protocol (VoIP) Introduction Voice over IP, or VoIP, refers to the delivery of voice and

More information

Video Coding Basics. Yao Wang Polytechnic University, Brooklyn, NY11201 yao@vision.poly.edu

Video Coding Basics. Yao Wang Polytechnic University, Brooklyn, NY11201 yao@vision.poly.edu Video Coding Basics Yao Wang Polytechnic University, Brooklyn, NY11201 yao@vision.poly.edu Outline Motivation for video coding Basic ideas in video coding Block diagram of a typical video codec Different

More information

Quality of Service Testing in the VoIP Environment

Quality of Service Testing in the VoIP Environment Whitepaper Quality of Service Testing in the VoIP Environment Carrying voice traffic over the Internet rather than the traditional public telephone network has revolutionized communications. Initially,

More information

Network administrators must be aware that delay exists, and then design their network to bring end-to-end delay within acceptable limits.

Network administrators must be aware that delay exists, and then design their network to bring end-to-end delay within acceptable limits. Delay Need for a Delay Budget The end-to-end delay in a VoIP network is known as the delay budget. Network administrators must design a network to operate within an acceptable delay budget. This topic

More information

NICE-RJCS Issue 2011 Evaluation of Potential Effectiveness of Desktop Remote Video Conferencing for Interactive Seminars Engr.

NICE-RJCS Issue 2011 Evaluation of Potential Effectiveness of Desktop Remote Video Conferencing for Interactive Seminars Engr. NICE-RJCS Issue 2011 Evaluation of Potential Effectiveness of Desktop Remote Video Conferencing for Interactive Seminars Engr. Faryal Zia Abstract This research paper discusses various aspects of desktop

More information

Indepth Voice over IP and SIP Networking Course

Indepth Voice over IP and SIP Networking Course Introduction SIP is fast becoming the Voice over IP protocol of choice. During this 3-day course delegates will examine SIP technology and architecture and learn how a functioning VoIP service can be established.

More information

Network Performance Optimisation: The Technical Analytics Understood Mike Gold VP Sales, Europe, Russia and Israel Comtech EF Data May 2013

Network Performance Optimisation: The Technical Analytics Understood Mike Gold VP Sales, Europe, Russia and Israel Comtech EF Data May 2013 Network Performance Optimisation: The Technical Analytics Understood Mike Gold VP Sales, Europe, Russia and Israel Comtech EF Data May 2013 Copyright 2013 Comtech EF Data Corporation Network Performance

More information

Multimedia Conferencing Standards

Multimedia Conferencing Standards Multimedia Conferencing Standards The ITU/TS sector has produced a number of international standards for real-time digital multimedia communication, including video and data conferencing. This chapter

More information

AT&T Connect Video Conferencing Functional and Architectural Overview. v9.5 October 2012

AT&T Connect Video Conferencing Functional and Architectural Overview. v9.5 October 2012 AT&T Connect Video Conferencing Functional and Architectural Overview v9.5 October 2012 Video Conferencing Functional and Architectural Overview Published by: AT&T Intellectual Property Product: AT&T Connect

More information

Mobile VoIP: Managing, scheduling and refining voice packets to and from mobile phones

Mobile VoIP: Managing, scheduling and refining voice packets to and from mobile phones Mobile VoIP: Managing, scheduling and refining voice packets to and from mobile phones MOHAMMAD ABDUS SALAM Student ID: 01201023 TAPAN BISWAS Student ID: 01201003 \ Department of Computer Science and Engineering

More information

HDMI / Video Wall over IP Transmitter with PoE

HDMI / Video Wall over IP Transmitter with PoE / Wall over IP Transmitter with Key Features Network 1080P ultra high quality video transmitter Assigns video sources to any monitor of the video wall Up to 8 x 8 Screen Array supported Extends high definition

More information

Application Note. IPTV Services. Contents. TVQM Video Quality Metrics Understanding IP Video Performance. Series. Overview. Overview...

Application Note. IPTV Services. Contents. TVQM Video Quality Metrics Understanding IP Video Performance. Series. Overview. Overview... Title Series TVQM Video Quality Metrics Understanding IP Video Performance Date September 2012 (orig. Feb 2008) Overview IPTV, Internet TV, and Video on Demand provide exciting new revenue opportunities

More information

Network Connection Considerations for Microsoft Response Point 1.0 Service Pack 2

Network Connection Considerations for Microsoft Response Point 1.0 Service Pack 2 Network Connection Considerations for Microsoft Response Point 1.0 Service Pack 2 Updated: February 2009 Microsoft Response Point is a small-business phone solution that is designed to be easy to use and

More information

AirCam POE-200HD. H.264 1.3 MegaPixel POE Dome. H.264 Compression. 1.3 Mega-Pixel Video Quality

AirCam POE-200HD. H.264 1.3 MegaPixel POE Dome. H.264 Compression. 1.3 Mega-Pixel Video Quality AirCam POE-200HD H.264 1.3 MegaPixel POE Dome T he AirLive AirCam POE-200HD is a highend 1.3 -megapixel network camera designed for professional indoor surveillance and security applications. Megapixel

More information

Practical advices for setting up IP streaming services.

Practical advices for setting up IP streaming services. Practical advices for setting up IP streaming services. 1. Overview of the problem. I want to stream. I am new to it. How do I go about it? I have a DSL with static IP. Now I can set up a streaming service

More information

12 Quality of Service (QoS)

12 Quality of Service (QoS) Burapha University ก Department of Computer Science 12 Quality of Service (QoS) Quality of Service Best Effort, Integrated Service, Differentiated Service Factors that affect the QoS Ver. 0.1 :, prajaks@buu.ac.th

More information

www.dm-networkvideo.com

www.dm-networkvideo.com Introducing the NVR Media Server A dedicated network appliance offering cost effective High Definition network IP video. HD With its ground breaking embedded network CCTV switch, the NVR Media Server provides

More information

Video Conferencing Protocols

Video Conferencing Protocols Welcome to the TANDBERG University prerequisite. Before commencing you are requested to ensure that you have completed the Introduction to the TANDBERG University elearning Experience Module that is available

More information

CCTV & Video Surveillance over 10G ip

CCTV & Video Surveillance over 10G ip CCTV & Video Surveillance over 10G ip Background With the increase in data, research and development and corporate competition, many companies are realizing the need to not only protect their data, but

More information

Per-Flow Queuing Allot's Approach to Bandwidth Management

Per-Flow Queuing Allot's Approach to Bandwidth Management White Paper Per-Flow Queuing Allot's Approach to Bandwidth Management Allot Communications, July 2006. All Rights Reserved. Table of Contents Executive Overview... 3 Understanding TCP/IP... 4 What is Bandwidth

More information

Cisco TelePresence SX20 Quick Set

Cisco TelePresence SX20 Quick Set Data Sheet Cisco TelePresence SX20 Quick Set Product Overview The Cisco TelePresence SX20 Quick Set (SX20 Quick Set) can transform any flat panel display into a sleek and powerful telepresence system.

More information

Lehrstuhl für Informatik 4 Kommunikation und verteilte Systeme

Lehrstuhl für Informatik 4 Kommunikation und verteilte Systeme Chapter 2: Representation of Multimedia Data Chapter 3: Multimedia Systems Communication Aspects and Services Multimedia Applications and Communication Protocols Quality of Service and Resource Management

More information

Echo Cancelling - Digital Audio Processing Example

Echo Cancelling - Digital Audio Processing Example Videoconference Videoconference Videoconference Videoconference erik.luyten@avnet.kuleuven.be AVNet K.U.Leuven What is it? Two-way communication Real-time (latency < 1second) 2 Sound: echo-cancelation

More information