(19) European Patent Office
(11) EP B1
(12) EUROPEAN PATENT SPECIFICATION
(45) Date of publication and mention of the grant of the patent: Bulletin 2013/49
(21) Application number:
(22) Date of filing:
(51) Int Cl.: H04N 7/24 (2011.01)
(86) International application number: PCT/US0/
(87) International publication number: WO 06/ (Gazette 2006/14)
(54) VIDEO DEMULTIPLEXER AND DECODER WITH EFFICIENT DATA RECOVERY
VIDEODEMULTIPLEXER UND DEKODIERER MIT EFFIZIENTER DATENRÜCKGEWINNUNG
DEMULTIPLEXEUR ET DECODEUR VIDEO A RECUPERATION EFFICACE DE DONNEES
(84) Designated Contracting States: AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR
(30) Priority: US
(43) Date of publication of application: Bulletin 2007/23
(73) Proprietor: Qualcomm Incorporated, San Diego, CA (US)
(72) Inventors: LEE, Yen-Chi, San Diego, CA (US); TSAI, Ming-Chang, San Diego, CA 921 (US); YE, Yan, San Diego, CA 921 (US); LING, Fan, San Diego, CA (US); EL-MALEH, Khaled Helmi, San Diego, CA (US)
(56) References cited: EP-A; WO-A-01/60011; US-A; US-B1-6 0
LIANG J ET AL: "Tools for robust image and video coding in JPEG 2000 and MPEG4 standards", Proceedings of the SPIE, Bellingham, VA, US, vol. 363, January 1999
TALLURI R: "Error-resilient video coding in the ISO MPEG-4 standard", IEEE Communications Magazine, IEEE Service Center, New York, NY, US, vol. 36, no. 6, June 1998
HAGENAUER J ET AL: "Error robust multiplexing for multimedia applications - fundamentals and applications", Signal Processing: Image Communication, Elsevier Science Publishers, Amsterdam, NL, vol. 14, no. 6-8, May 1999
(74) Representative: Reedy, Orlaith et al, Tomkins & Co, Dartmouth Road, Dublin 6 (IE)

Note: Within nine months of the publication of the mention of the grant of the European patent in the European Patent Bulletin, any person may give notice to the European Patent Office of opposition to that patent, in accordance with the Implementing Regulations. Notice of opposition shall not be deemed to have been filed until the opposition fee has been paid. (Art. 99(1) European Patent Convention).

Printed by Jouve, 7001 PARIS (FR)
Description

TECHNICAL FIELD

[0001] The disclosure relates to video decoding and, more particularly, techniques for limiting video data loss due to channel error.

BACKGROUND

[0002] In a typical Moving Picture Experts Group (MPEG)-4 video decoder implementation, when an error is detected, the decoder conceals all macroblocks (MBs) of a corrupted slice, or an entire frame. Concealment prevents the presentation of wrongly decoded MBs in displayed video, which can be very noticeable and visually annoying. In addition, concealment prevents the use of incorrect motion vectors from wrongly decoded MBs, which could otherwise propagate additional errors into the video stream. Hence, concealing all of the MBs of a corrupted slice or frame generally provides a more visually pleasant video signal.

[0003] Although concealment techniques prevent the presentation of corrupted MBs, such techniques also purposely drop correctly received data, which can contain useful MBs at the beginning of a slice or frame. If an error actually occurs at a given MB, for example, the video decoder considers all of the MBs within the applicable slice or frame to be "possibly" corrupted and conceals them. The concealment of correctly received data is inefficient, and can significantly impact performance in some systems in which channel error is prevalent, such as wireless communication systems.

SUMMARY

[0004] In general, the disclosure is directed to a video demultiplexing and decoding technique that includes features for efficient video data recovery in the event of channel error. A demultiplexer detects boundaries between physical layer data units and adds boundary information to adaptation layer data units produced by the demultiplexer. When a video decoder encounters an error in a video data frame, it uses the boundary information produced by the demultiplexer to limit the amount of data to be concealed.
The boundary information may take the form of boundary markers embedded in the video data frame.

[0005] The boundary markers permit the error to be associated with a small segment of data within the video data frame. The segment may be identified based on the location of physical layer data units, which are typically the smallest units that are subject to loss during transmission. The video decoder uses the boundary markers to conceal a small segment of data, rather than the entire slice or frame in which the segment resides. In this manner, the video decoder provides efficient data recovery, limiting the loss of useful data that otherwise would be purposely discarded as part of the concealment process. In some cases, the decoding technique also may rely on error resilience features, such as resynchronization markers, in combination with boundary markers.

[0006] In one embodiment, the disclosure provides a video decoding method comprising generating multiplex layer data units containing video data based on physical layer data units, embedding boundary markers in the multiplex layer data units to indicate boundaries between the physical layer data units, demultiplexing the multiplex layer data units to produce a video data frame, and associating a detected decoding error with a segment of the video data frame using the boundary markers.

[0007] In another embodiment, the disclosure provides a video decoding system comprising a demultiplexing engine to generate multiplex layer data units containing video data based on physical layer data units, and demultiplex the multiplex layer data units, a boundary generator to embed boundary markers in the multiplex layer data units to indicate boundaries between the physical layer data units, and a video decoding engine to decode a video data frame containing the video data, and associate a detected decoding error with a segment of the video data frame using the boundary markers.
[0008] In an added embodiment, the disclosure provides a video demultiplexer comprising a demultiplexing engine to generate multiplex layer data units containing video data based on physical layer data units, and demultiplex the multiplex layer data units, and a boundary generator to embed boundary markers in the multiplex layer data units to indicate boundaries between the physical layer data units to permit a video decoder to associate a detected decoding error with a segment of a video data frame using the boundary markers.

[0009] In a further embodiment, the disclosure provides a wireless communication device comprising a wireless receiver to receive physical layer data units via wireless communication, the physical layer data units containing video data, a demultiplexing engine to generate multiplex layer data units based on the physical layer data units, and demultiplex the multiplex layer data units, a boundary generator to embed boundary markers in the multiplex layer data units to indicate boundaries between the physical layer data units, and a video decoding engine to decode a video data frame containing the video data, and isolate a detected decoding error to a segment of the video data frame using the boundary markers.

[0010] The details of one or more embodiments are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims.

BRIEF DESCRIPTION OF DRAWINGS

[0011] FIG. 1 is a block diagram illustrating a video encoding and decoding system.
[0012] FIG. 2 is a block diagram illustrating a video decoder system that makes use of boundary markers to identify segments of a video data frame corresponding to physical layer data units.
[0013] FIG. 3 is a diagram illustrating a prior art technique for concealment of macroblocks in a video data frame upon detection of an error.
[0014] FIG. 4 is a diagram illustrating a prior art technique for concealment of macroblocks in a video data frame using resynchronization markers upon detection of an error.
[0015] FIG. 5 is a diagram illustrating an exemplary prior art multiplexing and packetization technique.
[0016] FIGS. 6A-6D illustrate different techniques for concealment of macroblocks in a video data frame upon detection of an error.
[0017] FIG. 7 is a diagram illustrating a demultiplexing and depacketization technique that makes use of physical data unit boundary markers embedded in a video data frame.
[0018] FIG. 8 is a diagram illustrating the technique of FIG. 7 when a physical data unit is lost.
[0019] FIG. 9 is a diagram illustrating an alternative demultiplexing and depacketization technique that uses a boundary marker to identify a lost physical data unit within a video data frame.
[0020] FIGS. 10A-10D are diagrams illustrating various demultiplexing and depacketization techniques, including a technique that uses resynchronization markers (RMs), header extension code (HEC) and boundary markers.
[0021] FIG. 11 is a flow diagram illustrating a video decoding technique in accordance with this disclosure.

DETAILED DESCRIPTION

[0022] FIG. 1 is a block diagram illustrating a video encoding and decoding system. As shown in FIG. 1, system 10 includes an encoder system 12 and a decoder system 14 connected by a transmission channel 16. Channel 16 may be any wired or wireless medium suitable for transmission of video information. Decoder system 14 enables efficient video data recovery in the event of channel error.
As will be described in detail, decoder system 14 is configured to limit the loss of useful data that ordinarily would be purposely discarded as part of the concealment process in the event of a channel error. In this manner, decoder system 14 can provide greater efficiency, enhanced decoding performance, and improved error resilience capabilities.

[0023] Encoder system 12 includes a multiplexer (MUX) 18, a video encoder 20 and an audio encoder 22. Video encoder 20 generates encoded video data according to a video compression protocol, such as MPEG-4. Other video compression protocols may be used, such as the International Telecommunication Union (ITU) H.263, ITU H.264, or MPEG-2 protocols. Audio encoder 22 encodes audio data to accompany the video data. Multiplexer 18 multiplexes the video data and audio data to form a series of multiplex data units for transmission via channel 16. As an example, multiplexer 18 may operate according to the H.223 multiplexer protocol, published by the ITU. However, other protocols may be used, such as the user datagram protocol (UDP).

[0024] Channel 16 carries the multiplexed information to decoder system 14 as physical layer data units. Channel 16 may be any physical connection between encoder system 12 and decoder system 14. For example, channel 16 may be a wired connection, such as a local or wide-area network. Alternatively, as described herein, channel 16 may be a wireless connection such as a cellular, satellite or optical connection.

[0025] Decoder system 14 includes a demultiplexer (DEMUX) 26, a video decoder 28, and an audio decoder 30. Demultiplexer 26 identifies the multiplex data units from physical layer data units and demultiplexes the content of the multiplex layer data units to produce video and audio adaptation layer data units. The adaptation layer data units are processed in the adaptation layer to produce video data frames.
Video decoder 28 decodes the video data frames at the application layer to produce a stream of video data for use in driving a display device. Audio decoder 30 decodes the audio data to produce audio.

[0026] In accordance with this disclosure, demultiplexer 26 detects a boundary between the physical layer data units and adds boundary information to the bitstream produced by the demultiplexer. Demultiplexer 26 produces adaptation layer data units, which are processed by the adaptation layer to produce an application layer bitstream. When video decoder 28 encounters an error in the bitstream, it uses the boundary information to limit the amount of video data that must be concealed. In particular, video decoder 28 uses the boundary information to isolate the error to a smaller segment of data, e.g., based on the locations of physical layer data units, in this example. Video decoder 28 conceals a smaller segment of data, rather than the entire slice or frame in which the error resides.

[0027] In operation, demultiplexer 26 generates multiplex layer data units containing video and audio data based on physical layer data units received via channel 16. Demultiplexer 26 embeds one or more boundary markers in the multiplex layer data units to indicate a boundary between the physical layer data units, and demultiplexes the multiplex layer data units to produce a video data frame. Then, upon detecting a decoding error, video decoder 28 associates the detected decoding error with a segment of the video data frame using the boundary markers.

[0028] With the aid of one or more boundary markers, video decoder 28 then conceals the segment of the video data frame in which the error occurred, rather than the entire slice or frame. In some embodiments, video decoder 28 also may make use of resynchronization markers embedded in the multiplex layer data units. For example, if the video data frame includes resynchronization
markers, video decoder 28 may be configured to conceal macroblocks (MBs) within a segment of the video data frame identified by the boundary markers, and MBs up to the next resynchronization marker in the video data frame.

[0029] FIG. 2 is a block diagram illustrating an embodiment of a video decoder system 14 that makes use of boundary markers to identify segments of a video data frame corresponding to physical layer data units. Video decoder system 14 makes use of one or more video boundary markers to limit the amount of data that is concealed in the event of a decoding error. In the example of FIG. 2, video decoder system 14 includes a wireless receiver 33 to receive video and audio data over a wireless channel. Wireless receiver 33 may be configured to receive radio frequency (RF) wireless signals according to any of a variety of wireless transmission techniques such as Code Division Multiple Access (CDMA), wideband CDMA (W-CDMA), or Time Division Multiple Access (TDMA).

[0030] As shown in FIG. 2, demultiplexer (DEMUX) 26 includes a demultiplexing engine 36, a radio link control (RLC) boundary detector 38, and a boundary code generator 40. Demultiplexing engine 36 generates multiplex layer data units containing video and audio data based on physical layer data units received from wireless receiver 33. In some embodiments, the physical layer data units may be W-CDMA radio link control (RLC) packet data units (PDUs), i.e., RLC PDUs. Alternatively, the physical layer data units may take a variety of different forms, such as CDMA2000 1x RLP (Radio Link Protocol) PDUs, CDMA2000 1x EV-DO RLP PDUs, or CDMA2000 EV-DV RLP PDUs. Demultiplexing engine 36 generates multiplex layer packet data units (MUX PDUs) according to a multiplexing protocol, such as H.223.
However, the techniques described herein may be applicable to other video transport protocols, such as SIP-based and H.323 video telephony protocols using RTP/UDP/IP (Real-time Transport Protocol/User Datagram Protocol/Internet Protocol).

[0031] RLC boundary detector 38 detects boundaries between the RLC PDUs. Boundary code generator 40 generates a code for each boundary, and embeds the code as a boundary marker at an appropriate location within the multiplex layer data units produced by demultiplexing engine 36. In this manner, demultiplexer 26 preserves an indication of the boundaries between the physical layer data units. When demultiplexing engine 36 produces a MUX PDU, and the adaptation layer module 44 produces a video data frame, the boundary markers remain intact for use by video decoder 28 in isolating decoding errors to small segments of the video data frame.

[0032] For MPEG-4 wireless transmissions using W-CDMA, an RLC PDU is the smallest unit that is subject to losses during transmission. For example, a W-CDMA RLC PDU is 160 bytes long for every 20 ms. With the aid of boundary markers, video decoder 28 can associate a detected decoding error with a small segment of the video data frame produced by demultiplexer 26. Upon detection of the decoding error, video decoder 28 conceals the small segment of the video data frame rather than an excessive number of MBs, or even the entire video data frame in some instances.

[0033] As further shown in FIG. 2, an adaptation layer module 44 converts the MUX PDUs produced by demultiplexing engine 36 into a video data frame for processing by video decoder 28. In this example, video decoder 28 includes an error detection module 46, a boundary code detector 48, a decoder engine 50 and memory 52. Boundary code detector 48 scans the incoming video frame bitstream to detect boundary markers, which indicate the boundaries between RLC PDUs in the original transmission at the physical layer.
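The boundary-marker embedding described in paragraphs [0031] and [0032] can be sketched in Python. This is a hypothetical illustration rather than the patent's implementation: the marker byte pattern, the use of a fixed 160-byte PDU size, and all function names are assumptions (the disclosure leaves the exact codeword implementation-defined).

```python
BOUNDARY_MARKER = b"\x00\x00\x01\xb2"  # hypothetical boundary codeword
RLC_PDU_SIZE = 160                     # W-CDMA RLC PDU size cited in [0032]

def split_rlc(payload, pdu_size=RLC_PDU_SIZE):
    """Model the physical layer: chop a byte stream into fixed-size RLC PDUs."""
    return [payload[i:i + pdu_size] for i in range(0, len(payload), pdu_size)]

def embed_boundary_markers(rlc_payloads):
    """Concatenate the video payloads of successive RLC PDUs, inserting a
    boundary-marker codeword at each boundary between adjacent PDUs."""
    return BOUNDARY_MARKER.join(rlc_payloads)
```

Note that the join places a marker only between PDUs, never before the first or after the last, so a single-PDU payload passes through unchanged; the decoder-side detector later strips these markers before decoding.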
Boundary code detector 48 removes the boundary markers from the video frame bitstream, and records the locations of the boundary markers in memory 52. When error detection module 46 detects a decoding error, decoder engine 50 makes use of the recorded boundary marker locations to determine the position of the error in terms of the boundaries between RLC PDUs in the original transmission at the physical layer. Decoder engine 50 records the locations in memory 52 so that the size of the segment of concealed MBs can be limited, generally to the size of the RLC PDUs.

[0034] Hence, decoder system 14 provides a unique transport-decoder cross-layer design that promotes efficient video data recovery. Decoder system 14 limits the amount of useful data that must be discarded in the presence of a transmission error. According to this cross-layer design, the transport layers pass additional information to decoder engine 50 in order to recover data that were correctly received before the channel impairments.

[0035] As further shown in FIG. 2, decoder engine 50 produces a decoded video bitstream, and delivers it to a video driver 51. Video driver 51 drives a display device 53 to present video imagery to a user. Video decoder system 14 may support a variety of video applications, including delivery of streaming video or video telephony. In each case, decoder system 14 is effective in limiting the loss of useful data, and thereby enhancing efficiency and performance.

[0036] Video decoder system 14 may be implemented as a decoding process, or coding/decoding (CODEC) process, running on a digital signal processor (DSP) or other processing device. Video decoder system 14 may have a dedicated memory 52 for storing instructions and data, as well as dedicated hardware, software, firmware, or combinations thereof. Various aspects of the techniques described herein may be implemented in hardware, software, firmware, or any combination thereof.
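The pre-screening performed by boundary code detector 48 — scan for markers, record their locations, strip them before decoding — might look like the following sketch, reusing the same hypothetical marker codeword; here the locations are recorded as byte offsets into the cleaned bitstream, which is one possible convention among several:

```python
def prescreen(bitstream, marker=b"\x00\x00\x01\xb2"):
    """Record the location of each boundary marker and strip the markers
    from the bitstream before it is handed to the decoder engine."""
    positions = []           # boundary offsets, relative to the cleaned stream
    clean = bytearray()
    i = 0
    while i < len(bitstream):
        if bitstream[i:i + len(marker)] == marker:
            positions.append(len(clean))   # a PDU boundary falls here
            i += len(marker)
        else:
            clean.append(bitstream[i])
            i += 1
    return bytes(clean), positions
```

The recorded offsets play the role of the locations stored in memory: during decoding, the engine can note which MB is in progress each time an offset is crossed.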
If implemented in software, the techniques may be embodied as instructions on a computer-readable medium such as random access memory (RAM), read-only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory
(EEPROM), FLASH memory, or the like. The instructions cause one or more processors to perform certain aspects of the functionality described in this disclosure.

[0037] FIG. 3 is a diagram illustrating concealment of MBs in a video data frame upon detection of an error according to a prior art technique. FIG. 3 is provided for purposes of comparison to better illustrate the techniques described in this disclosure. As shown in FIG. 3, in a typical prior art concealment process, when an error is detected due to loss of MBs during transmission, a decoder conceals all MBs of a corrupted slice, or an entire frame. Although this approach prevents the presentation of corrupted MBs, it also purposely drops correctly received data, which can contain tens of MBs at the beginning of a slice or frame prior to the position of an error.

[0038] The diagram in FIG. 3 illustrates the general inefficiency of concealing useful data. FIG. 3 identifies respective MBs by sequence numbers within a video data frame extending from MB 0 to MB 98. Successive video data frames are bounded by video object plane (VOP) fields that signify the end of a video data frame. In the example of FIG. 3, an error actually occurs at MB 41, but the video decoder considers the MBs from 0 to 98 as being "possibly" corrupted and conceals all of them. This is equivalent to dropping the data from MB 0 to MB 40. Consequently, MBs following the error are "LOST," while correctly received MBs before the error are "WASTED." Clearly, the MBs 0 to 40 do not include errors, and instead carry useful data. Yet, the prior art concealment techniques result in concealment of all MBs 0 to 98, i.e., the entire video data frame.

[0039] FIG. 4 is a diagram illustrating concealment of MBs in a video data frame using resynchronization markers upon detection of an error according to another prior art technique.
In the example of FIG. 4, resynchronization markers (RMs) are embedded in the data frame to support error resilience techniques. The use of RMs improves the efficiency of the decoding process in the presence of errors, but still results in wasted MBs. In the example of FIG. 4, when RMs are used, the video decoder can only recover data from MB 0 to MB 20, and still must conceal correctly received MBs from 21 to 40, which results in a loss of 20 MBs. Although the error occurs at MB 41 in the example of FIG. 4, this technique requires concealment of MBs between the last RM immediately preceding the error and the first RM immediately following the error, as illustrated in FIG. 4. Hence, the use of resynchronization markers in this manner provides a significant improvement in efficiency, but still results in a significant number of wasted MBs.

[0040] In contrast to the techniques depicted in FIGS. 3 and 4, the use of one or more boundary markers, as described in this disclosure, supports the recovery of correctly decoded MBs positioned prior to an error, but still adequately prevents the presentation of wrongly decoded MBs. The demultiplexing layer, e.g., H.223, passes Video-RLC boundary information to the decoder by embedding one or more boundary markers in the bitstream, e.g., as a special codeword. Video decoder 28 interprets the codeword as a boundary marker that permits identification of all possible locations of data losses in terms of the physical data units received via channel 16. With more exact locations of data losses, video decoder 28 can use such information to associate the errors with smaller segments of the video data frame and recover more of the correctly received MBs.

[0041] FIG. 5 is a diagram illustrating a prior art multiplexing and packetization technique within an encoder, such as encoder system 12 of FIG. 1. The process will be described in the context of the H.223 multiplexing protocol for purposes of illustration.
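The prior-art recovery of FIG. 4 amounts to concealing the half-open interval of MBs between the RM at or before the error and the next RM. A minimal sketch (the function name and interval convention are assumptions, not part of the disclosure):

```python
import bisect

def rm_conceal_range(error_mb, rm_positions, frame_len):
    """Return [start, end) of MBs to conceal: from the RM at or before
    the error to the next RM, or to the end of the frame if none follows."""
    rms = sorted(rm_positions)
    i = bisect.bisect_right(rms, error_mb) - 1
    start = rms[i] if i >= 0 else 0
    end = rms[i + 1] if i + 1 < len(rms) else frame_len
    return start, end
```

With an error at MB 41 and RMs at MBs 0, 21 and 73 in a 99-MB frame, this yields (21, 73), i.e., concealment of MBs 21 through 72, matching the wasted range described for FIG. 4.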
In the example of FIG. 5, video data is packetized into H.223 packets and multiplexed with audio data. The video bitstream at the application layer (APP) is first chopped into one or more application layer service data units (AL-SDUs). One AL-SDU can contain one whole frame or just a slice of a frame, depending on the video encoder implementation. Each AL-SDU is then passed to the H.223 Adaptation Layer (AL), where an AL-PDU packet is formed by adding an optional Sequence Number (SN) at the front, and a 16-bit cyclic redundancy code (CRC) at the end.

[0042] Each video AL-PDU is sent to the H.223 Multiplex Layer (ML) to be fragmented, if necessary, and multiplexed with audio (AU) AL-PDUs into MUX-PDUs by inserting framing information and a MUX header. The last MUX-PDU of a video AL-PDU is tailed with additional framing information (ALT) to indicate the termination of this video AL-PDU. All the MUX-PDUs are carried by physical layer data units. In a wireless application, the physical layer data units are radio link packets, such as W-CDMA RLC PDUs as shown in FIG. 5.

[0043] At a decoder, such as decoder system 14 of FIG. 1, the H.223 demultiplexer receives the RLC-PDUs and locates each MUX-PDU by searching the MUX framing information. The demultiplexer extracts the video and audio data from the MUX-PDU payload according to a MUX table in the MUX header. Once the terminating framing information is found, the demultiplexer de-fragments all of the video data extracted from different MUX-PDUs, but belonging to the same video AL-PDU, and passes the de-fragmented video data to the AL for integrity checking using the CRC. If the CRC check succeeds, the video decoder receives the entire AL-SDU. If the CRC check fails, the corrupted AL-SDU may be passed to the video decoder or discarded, depending on the implementation.

[0044] FIGS. 6A-6D illustrate different techniques for concealment of macroblocks in a video data frame upon detection of an error. In particular, FIGS.
6A-6D show video data frames in conjunction with different concealment techniques. FIGS. 6A and 6C depict prior art techniques that do not employ boundary markers. FIGS. 6B and 6D depict the use of boundary markers, as described in this disclosure.

[0045] FIG. 6A depicts the use of a prior art technique in which no error resilience techniques are used. According to the technique of FIG. 6A, when an error is detected
at MB 41, the entire video data frame, including macroblocks MB [0, 98], is concealed. FIG. 6B illustrates the use of boundary markers 54, in accordance with this disclosure, to associate errors with smaller segments of a video data frame.

[0046] In FIG. 6B, a video data frame includes boundary markers 54 that indicate the boundaries between the video portions of adjacent physical layer data units, such as W-CDMA RLC PDUs. In particular, the boundary markers 54 define segments referred to herein as "Video-RLC" units. One Video-RLC unit is indicated, for example, by boundary markers 54A, 54B. A Video-RLC unit generally corresponds to an RLC PDU, which is the smallest unit in which a loss can occur. In the event of channel error, the RLC PDU can be used as a guide to prevent the concealment of useful information.

[0047] The use of boundary markers 54 allows errors to be associated with a single Video-RLC unit. In the event an error is detected by decoder engine 50, correctly received MBs that are positioned prior to the Video-RLC unit in which the error occurred can be preserved. In particular, this technique permits recovery of correctly received MBs positioned prior to boundary marker 54A.

[0048] Preservation of the correctly received MBs using the technique of FIG. 6B can result in increased efficiency and improved performance, relative to the technique of FIG. 6A. In the example of FIG. 6B, if an error is detected at MB 41, MBs [41, 98] through the end of the video data frame are concealed. However, MBs [0, 40] occurring prior to MB 41 need not be concealed. Boundary marker 54A serves to indicate the start of the Video-RLC unit in which the error was detected. Hence, decoder engine 50 relies on boundary marker 54A in determining which MBs to conceal.

[0049] FIG. 6C depicts the use of an error resilience technique that employs resynchronization markers (RMs).
In FIG. 6C, when an error is detected at MB 41, only MBs [21, 72] between a preceding RM and a subsequent RM are concealed, thereby conserving MBs [0, 20] and MBs [73, 98]. FIG. 6D illustrates the use of boundary markers 54 in combination with RMs 56 for error resilience. In the example of FIG. 6D, when an error occurs at MB 41, MBs [41, 72] are concealed from the beginning of the Video-RLC unit to the next occurring RM 56B.

[0050] The boundary marker technique of FIG. 6D represents an improvement over the basic error resilience technique shown in FIG. 6C. Specifically, MBs are preserved between the preceding RM 56A and the boundary marker 54A denoting the start of the Video-RLC unit in which the error was detected, providing advantages over the conventional use of RMs. At the same time, MBs are preserved between the following RM 56B and the end of the frame. Hence, the combined use of boundary markers 54 and RMs 56 according to the technique of FIG. 6D results in further efficiencies relative to the technique of FIG. 6C.

[0051] A variety of different techniques may be used to provide boundary markers 54. As one example, demultiplexing engine 36 may store the memory address of each RLC boundary in memory 52. However, the stored information may be lost when the memory content is copied to the decoding buffer used by video decoder 28. In addition, it can be difficult to convert recorded memory addresses to addresses in the decoding buffer. Therefore, as an alternative, another approach is to embed the boundary markers in a video data frame, as described herein. In particular, according to this approach, demultiplexer 26 detects the boundaries from the physical layer data units, and embeds boundary markers, which are then passed up through the multiplexing and adaptation layers to the application layer for use by decoder engine 50.
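The behavior of FIGS. 6B and 6D — conceal only from the first MB of the Video-RLC unit containing the error to the next RM, or to the end of the frame when no RM follows — can be sketched as follows; the function name and interval convention are illustrative assumptions:

```python
import bisect

def boundary_conceal_range(error_mb, videorlc_starts, rm_positions, frame_len):
    """Return [start, end) of MBs to conceal: from the start of the
    Video-RLC unit containing the error to the next RM, or to the
    end of the frame if no RM follows the error."""
    starts = sorted(videorlc_starts)
    i = bisect.bisect_right(starts, error_mb) - 1
    start = starts[i] if i >= 0 else 0          # start of the erroneous unit
    later_rms = [rm for rm in sorted(rm_positions) if rm > error_mb]
    end = later_rms[0] if later_rms else frame_len
    return start, end
```

With an error at MB 41, a Video-RLC boundary at MB 41 and RMs at 21 and 73, the sketch returns (41, 73), i.e., concealment of MBs [41, 72] as in FIG. 6D; with no RMs it degenerates to the FIG. 6B case and conceals through the end of the frame.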
[0052] FIG. 7 is a diagram illustrating a demultiplexing and depacketization technique that involves embedding boundary markers in a video data frame, in accordance with this disclosure. FIG. 8 is a diagram illustrating the technique of FIG. 7 when a physical data unit, such as an RLC-PDU, is lost. The functions shown in FIGS. 7 and 8 may be performed by a video decoder system 14 as described with reference to FIG. 2. As shown in FIG. 7, demultiplexing engine 36 receives RLC PDUs at the physical layer and converts them to MUX PDUs at the multiplex layer (ML). In the decoder implementation of FIG. 2, for example, RLC boundary detector 38 detects the boundaries between the RLC PDUs, and boundary code generator 40 embeds boundary markers 54 in the MUX PDUs.

[0053] Demultiplexing engine 36 generates adaptation layer (AL) PDUs, which are then converted to AL SDUs. In this manner, the video data is serialized into a video data frame for bitstream pre-processing followed by video decoding at the application layer (APP). At the multiplex and adaptation layers, the boundary markers 54 that signify the RLC boundaries remain intact for later reference by decoder engine 50. In effect, the multiplex layer keeps track of each RLC-PDU fetched from the physical layer and inserts a special codeword, i.e., a boundary marker, when RLC-PDUs are concatenated. If an RLC-PDU is lost, as shown in FIG. 8, the video decoder is still able to recover correctly received data by tracing back to the MB where the nearest boundary lies, instead of dropping the data of the corrupted slice or frame. In this way, decoder engine 50 can use the boundary markers 54 to associate detected errors with smaller segments within the video data frame, conforming to the original physical layer data units, and thereby avoid excessive and unnecessary concealment of MBs in the video data frame. Decoder engine 50 may detect an error when an RLC-PDU is corrupted or lost in its entirety.
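The "tracing back to the MB where the nearest boundary lies" step can be sketched as a lookup over positions the decoder recorded while crossing boundaries during decoding. The mapping structure and names below are assumptions for illustration:

```python
import bisect

def traceback_mb(error_offset, boundary_offsets, first_mb_at_boundary):
    """Given the bitstream offset where an error was detected, the sorted
    recorded boundary offsets, and the first MB decoded after each
    boundary, return the MB at which concealment should begin."""
    i = bisect.bisect_right(boundary_offsets, error_offset) - 1
    if i < 0:
        return 0                 # error precedes the first recorded boundary
    return first_mb_at_boundary[boundary_offsets[i]]
```

Concealment then runs from the returned MB forward, so all correctly decoded MBs before the nearest boundary are preserved rather than dropped with the slice or frame.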
[0054] The boundary markers may be embedded as a special codeword when an RLC-PDU is fetched by the MUX layer. Again, this boundary information can be passed up all the way to the application layer as boundary markers for use by video decoder 28 (FIG. 2). With reference to FIG. 2, boundary code detector 48 performs
the bitstream pre-screening process to seek these boundary markers, which serve as special codewords. Boundary code detector 48 records the positions of the boundary markers in memory 52, and removes the boundary markers from the bitstream before decoding by decoder engine 50. During decoding, once an RLC boundary is crossed, decoder engine 50 can record which MB is being decoded, by reference to the locations stored in memory 52. Once an error is detected by error detection module 46, decoder engine 50 will conceal MBs extending from the MB it has recorded to the end of the frame, or to the next resynchronization marker (RM) codeword, in the event error resilience techniques are also employed in combination with RLC boundary markers. The characteristics of the particular special codeword used as a boundary marker may be subject to different implementations. However, the codeword should be readily distinguishable from existing bit patterns in bitstreams produced by video compression standards such as MPEG-4 and H.263. In some cases, the special codeword may be implemented using the reserved start code defined in the MPEG-4 and H.263 standards.

[0055] FIG. 9 is a diagram illustrating an alternative demultiplexing and depacketization technique, in accordance with this disclosure, that uses a boundary marker to identify a lost physical data unit within a video data frame. In the example of FIG. 9, demultiplexer 26 embeds an RLC boundary marker to indicate a lost RLC-PDU 57 at the physical layer. In this case, the physical layer is configured to indicate to the multiplex layer which RLC-PDU is lost. Hence, demultiplexer 26 provides video decoder 28 with an advance warning when an RLC-PDU has been lost. This approach is in contrast to providing boundary markers for all RLC-PDUs, and having decoder engine 50 resolve errors or lost RLC-PDUs during decoding.
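The advance-warning variant of FIG. 9, in which the demultiplexer substitutes a distinct codeword for a PDU that the physical layer reports lost, might be sketched as follows; the "lost" codeword value and function name are assumptions:

```python
LOST_PDU_MARKER = b"\x00\x00\x01\xb3"  # hypothetical 'lost PDU' codeword

def mark_lost_pdus(rlc_payloads):
    """Concatenate PDU payloads; where the physical layer reports a lost
    PDU (modeled here as None), embed the 'lost' codeword in place of
    its payload so the decoder knows exactly where data is missing."""
    out = bytearray()
    for pdu in rlc_payloads:
        out += LOST_PDU_MARKER if pdu is None else pdu
    return bytes(out)
```

The decoder can then seek this codeword to locate the missing Video-RLC segment directly, rather than discovering the loss mid-decode.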
If the physical layer is configured to identify a lost RLC-PDU, then demultiplexer 26 embeds a marker as a special codeword within the MUX-PDU in which the lost RLC-PDU occurred. Video decoder engine 50 then seeks this special codeword within memory 52 to locate the lost video-RLC boundary, and conceals macroblocks from that point to the end of the frame, or to the next RM if error resilience techniques are employed. In this manner, the correctly received MBs up to the point of the lost RLC-PDU can be recovered and preserved, rather than concealed and wasted.
[0056] FIGS. 10A-10D are diagrams illustrating various demultiplexing and depacketization techniques, including a technique, shown in FIG. 10D, that uses resynchronization markers (RMs), header extension code (HEC) and boundary markers. For ease of illustration, each diagram in FIGS. 10A-10D includes vertical lines to indicate the position of boundary markers defining video-RLC units, although only FIG. 10D actually illustrates the use of video-RLC boundary markers. In the example of FIG. 10A, no error resilience tools are used. As a result, when an error is detected at the end of a first video data frame and the beginning of a second video data frame, the MBs [0, 98] for the entire second frame are lost, while the MBs for substantially the entire first video data frame must be concealed up to the point of the error. Consequently, the scenario depicted in FIG. 10A can result in a drastic adverse impact on video decoding performance.
[0057] In the example of FIG. 10B, RMs are embedded in the video data frame, in accordance with prior art error resilience techniques. As shown in FIG. 10B, when an error is detected at the end of a first video data frame and the beginning of a second video data frame, as in FIG. 10A, the RMs permit a significant reduction in the number of concealed MBs in the first video data frame.
Although the MBs [0, 98] in the entire second video data frame are lost, and MBs extending from the error to the end of the first video data frame are concealed, the MBs in the first video data frame up to the RM immediately preceding the error are recovered, rather than concealed. Hence, the use of error resilience techniques can provide a substantial performance improvement.
[0058] In the example of FIG. 10C, RMs and HEC bits are embedded in the video data frames. In this scenario, MBs can be recovered up to the RM immediately preceding the error. MBs are concealed between the RM immediately preceding the error and the end of the first video data frame. In these respects, the scenario of FIG. 10C generally conforms to the scenario of FIG. 10B. However, the presence of the HEC bits prevents the loss of the entire second video data frame. Rather, as shown in FIG. 10C, MBs at the start of the second video data frame are concealed, while MBs following the first HEC field in the second video data frame are recovered. In particular, at the start of the HEC field in the second video data frame, a new frame is created. The MBs [0, 44] in the new frame need to be concealed, but the MBs [45, 98] can be decoded.
[0059] In the example of FIG. 10D, decoder system 14 employs advanced error resilience tools, such as RMs and HEC fields, in combination with video-RLC boundary markers in accordance with this disclosure to further reduce the impact of data losses and the number of dropped and concealed MBs. FIG. 10D generally conforms to FIG. 10C. However, the presence of boundary markers permits additional MBs to be recovered prior to the point of error detection. In particular, as shown in FIG. 10D, MBs are recovered up to the boundary marker at the start of the segment in which the error occurred, such that only MBs [70, 98] must be concealed in the first video data frame.
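The concealment logic in the scenarios above can be sketched as a simple range computation. This is an illustrative sketch with hypothetical helper names: given the macroblock indices at which RLC segments begin (from the boundary markers) and the RM positions, the decoder conceals only from the boundary of the errored segment to the next RM, or to frame end if none follows.

```python
def conceal_range(error_mb, boundary_starts, rm_positions, frame_len):
    """Return (first_concealed_mb, last_concealed_mb) for one detected error.

    boundary_starts: MB indices where physical-layer (RLC) segments begin.
    rm_positions:    MB indices of resynchronization markers, ascending.
    Without boundary markers, concealment would start at the preceding RM
    (or frame start); with them, it starts at the finer-grained preceding
    RLC boundary, so fewer correctly received MBs are wasted.
    """
    # Start concealing at the last boundary marker at or before the error.
    start = max((b for b in boundary_starts if b <= error_mb), default=0)
    # Stop at the next RM after the error, where decoding resynchronizes;
    # if none exists, conceal through the end of the frame.
    stop = min((r for r in rm_positions if r > error_mb), default=frame_len)
    return start, stop - 1
```

With illustrative numbers loosely modeled on FIG. 10D (error at MB 80, segment boundary at MB 70, no RM after the error in a 99-MB frame), `conceal_range(80, [0, 35, 70], [45], 99)` yields `(70, 98)`, i.e., only MBs [70, 98] are concealed.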
[0060] The presence of the boundary markers in the video data frames of FIG. 10D permits the error to be associated with a smaller segment of the video data frame. The error segment is significantly smaller than the range between RMs, and actually corresponds to a physical layer data unit, which is the smallest unit in which a loss can occur during transmission. As is apparent from FIG. 10D, the addition of boundary markers results in a significant savings in the recovery of MBs, when compared with the use of RMs and HEC fields alone.
[0061] FIG. 11 is a flow diagram illustrating a video decoding technique in accordance with this disclosure. As shown in FIG. 11, the technique involves receiving physical layer data units containing video and audio information (58), and detecting the boundaries between adjacent physical layer data units (60). In an example wireless application, the physical layer data units may be W-CDMA RLC-PDUs. Upon generation of multiplex layer data units (62), the technique further involves embedding one or more boundary markers in the multiplex layer data units to identify the physical data unit boundaries (64).
[0062] Upon generating a video data frame (66), a video decoder decodes the video data frame (68) and associates any error with a smaller segment of the video data frame using the embedded boundary markers (70). In this manner, MBs positioned prior to the segment in which the error is detected, i.e., prior to the boundary marker signifying the start of the error segment, can be recovered (72), rather than concealed. In addition, if resynchronization markers (RMs) are used, MBs following the next RM occurring after the end of the error segment can be recovered through the end of the applicable frame. The next RM following the error segment can be identified by reference to the boundary marker signifying the end of the segment in which the error was detected.
[0063] Various embodiments have been described. These and other embodiments are within the scope of the following claims.

Claims

1. A video decoding method comprising:

generating multiplex layer data units containing video data based on physical layer data units;
embedding a boundary marker in the multiplex layer data units to indicate a boundary between the physical layer data units;
demultiplexing the multiplex layer data units to generate a video data frame; and
associating a detected decoding error with a segment of the video data frame using the boundary markers.

2. The method of claim 1, wherein the boundary marker identifies a start of a lost physical layer data unit.

3. The method of claim 1, wherein embedding a boundary marker includes embedding a plurality of the boundary markers to identify boundaries between a plurality of the physical layer data units.

4. The method of claim 1, wherein the video data frame includes macroblocks of video data, the method further comprising concealing macroblocks within the segment of the video data frame.

5. The method of claim 1, wherein the video data frame includes resynchronization markers, the method further comprising concealing macroblocks within the segment of the video data frame and macroblocks up to a next one of the resynchronization markers following the detected decoding error in the video data frame.

6. The method of claim 1, further comprising demultiplexing the multiplex layer data units to generate adaptation layer data units, and generating the video data frame based on the adaptation layer data units.

7. The method of claim 1, further comprising receiving the physical layer data units via wireless communication.

8. The method of claim 1, further comprising demultiplexing the multiplex layer units according to the ITU H.223 multiplexing/demultiplexing protocol.

9. The method of claim 1, further comprising demultiplexing the multiplex layer units according to the RTP/UDP/IP multiplexing/demultiplexing protocol.

10. The method of claim 1, wherein the video data frame includes macroblocks of video data conforming to the MPEG-4 standard.

11. The method of claim 1, wherein the video data frame includes macroblocks of video data conforming to one of the ITU H.263, ITU H.264 and MPEG-2 protocols.

12. The method of claim 1, wherein the physical layer data units include W-CDMA radio link control packet data units (RLC PDUs).

13. The method of claim 12, wherein the multiplex layer data units conform to the H.223 multiplexing/demultiplexing protocol.

14. The method of claim 1, wherein the physical layer data units include CDMA2000 1x radio link protocol packet data units (RLP PDUs), CDMA2000 1x EV-DO RLP PDUs, or CDMA2000 EV-DV RLP PDUs.

15. The method of claim 1, wherein the multiplex layer data units conform to the RTP/UDP/IP multiplexing/demultiplexing protocol.

16. The method of claim 1, wherein the physical layer data units include audio and video data, and embedding boundary markers includes embedding boundary markers in the multiplex layer data units to indicate boundaries between video information in the
physical layer data units.

17. A video decoding system comprising:

a demultiplexing engine to generate multiplex layer data units containing video data based on physical layer data units, and demultiplex the multiplex layer data units;
a boundary generator to embed a boundary marker in the multiplex layer data units to indicate a boundary between the physical layer data units; and
a video decoding engine to decode a video data frame containing the video data, and associate a detected decoding error with a segment of the video data frame using the boundary markers.

18. The system of claim 17, wherein the boundary marker identifies a start of a lost physical layer data unit.

19. The system of claim 17, wherein the boundary generator embeds a plurality of the boundary markers to identify boundaries between a plurality of the physical layer data units.

20. The system of claim 17, further comprising a boundary detector to detect the boundaries between the physical layer data units.

21. The system of claim 17, wherein the video data frame includes macroblocks of video data, and the decoding engine conceals macroblocks within the segment of the video data frame.

22. The system of claim 17, wherein the video data frame includes resynchronization markers, and the decoding engine conceals macroblocks within the segment of the video data frame and macroblocks up to a next one of the resynchronization markers following the detected decoding error in the video data frame.

23. The system of claim 17, further comprising an adaptation layer module to generate adaptation layer data units based on the demultiplexed multiplex layer data units, and generate the video data frame based on the adaptation layer data units.

24. The system of claim 17, further comprising a wireless receiver to receive the physical layer data units via wireless communication.

25. The system of claim 17, wherein the demultiplexing engine demultiplexes the multiplex layer units according to the ITU H.223 multiplexing/demultiplexing protocol.

26. The system of claim 17, wherein the demultiplexing engine demultiplexes the multiplex layer units according to the RTP/UDP/IP multiplexing/demultiplexing protocol.

27. The system of claim 17, wherein the video data frame includes macroblocks of video data conforming to the MPEG-4 standard.

28. The system of claim 17, wherein the video data frame includes macroblocks of video data conforming to one of the ITU H.263, ITU H.264 and MPEG-2 protocols.

29. The system of claim 17, wherein the physical layer data units include W-CDMA radio link control packet data units (RLC PDUs).

30. The system of claim 29, wherein the multiplex layer data units conform to the H.223 multiplexing/demultiplexing protocol.

31. The system of claim 17, wherein the physical layer data units include CDMA2000 1x radio link protocol packet data units (RLP PDUs), CDMA2000 1x EV-DO RLP PDUs, or CDMA2000 EV-DV RLP PDUs.

32. The system of claim 17, wherein the multiplex layer data units conform to the RTP/UDP/IP multiplexing/demultiplexing protocol.

33. The system of claim 17, wherein the physical layer data units include audio and video data, and the boundary generator embeds the boundary markers in the multiplex layer data units to indicate boundaries between video information in the physical layer data units.

34. A video demultiplexer comprising:

a demultiplexing engine to generate multiplex layer data units containing video data based on physical layer data units, and demultiplex the multiplex layer data units; and
a boundary generator to embed a boundary marker in the multiplex layer data units to indicate a boundary between the physical layer data units to permit a video decoder to associate a detected decoding error with a segment of a video data frame using the boundary markers.

35.
The demultiplexer of claim 34, wherein the boundary marker identifies a start of a lost physical layer data unit.

36. The demultiplexer of claim 34, wherein the boundary generator embeds a plurality of the boundary markers to identify boundaries between a plurality of the physical layer data units.
37. The demultiplexer of claim 34, further comprising a boundary detector to detect the boundaries between the physical layer data units.

38. The demultiplexer of claim 34, wherein the video data frame includes macroblocks of video data, and the decoding engine conceals macroblocks within the segment of the video data frame.

39. The demultiplexer of claim 34, wherein the demultiplexing engine demultiplexes the multiplex layer data units according to the H.223 multiplexing/demultiplexing protocol.

40. The demultiplexer of claim 34, wherein the demultiplexing engine demultiplexes the multiplex layer data units according to the RTP/UDP/IP multiplexing/demultiplexing protocol.

41. The demultiplexer of claim 34, wherein the video data frame includes macroblocks of video data conforming to the MPEG-4 standard.

42. The demultiplexer of claim 34, wherein the video data frame includes macroblocks of video data conforming to one of the ITU H.263, ITU H.264 and MPEG-2 protocols.

43. The demultiplexer of claim 34, wherein the physical layer data units include W-CDMA radio link control packet data units (RLC PDUs).

44. The demultiplexer of claim 43, wherein the multiplex layer data units conform to the H.223 multiplexing/demultiplexing protocol.

45. The demultiplexer of claim 34, wherein the physical layer data units include CDMA2000 1x radio link protocol packet data units (RLP PDUs), CDMA2000 1x EV-DO RLP PDUs, or CDMA2000 EV-DV RLP PDUs.

46. The demultiplexer of claim 34, wherein the multiplex layer data units conform to the RTP/UDP/IP multiplexing/demultiplexing protocol.

47. The demultiplexer of claim 34, wherein the physical layer data units include audio and video data, and the boundary generator embeds the boundary markers in the multiplex layer data units to indicate boundaries between video information in the physical layer data units.

48. A wireless communication device comprising:

a wireless receiver to receive physical layer data units via wireless communication, the physical layer data units containing video data;
a demultiplexing engine to generate multiplex layer data units based on the physical layer data units, and demultiplex the multiplex layer data units;
a boundary generator to embed a boundary marker in the multiplex layer data units to indicate a boundary between the physical layer data units; and
a video decoding engine to decode a video data frame containing the video data, and associate a detected decoding error with a segment of the video data frame using the boundary markers.

49. The device of claim 48, wherein the boundary marker identifies a start of a lost physical layer data unit.

50. The device of claim 48, wherein the boundary generator embeds a plurality of the boundary markers to identify boundaries between a plurality of the physical layer data units.

51. A video decoding system comprising:

means for generating multiplex layer data units containing video data based on physical layer data units;
means for embedding a boundary marker in the multiplex layer data units to indicate a boundary between the physical layer data units;
means for demultiplexing the multiplex layer data units to generate a video data frame; and
means for associating a detected decoding error with a segment of the video data frame using the boundary markers.

52. The system of claim 51, wherein the boundary marker identifies a start of a lost physical layer data unit.

53. The system of claim 51, wherein the embedding means includes means for embedding a plurality of the boundary markers to identify boundaries between a plurality of the physical layer data units.

54. The system of claim 51, wherein the video data frame includes macroblocks of video data, the system further comprising means for concealing macroblocks within the segment of the video data frame.

55.
The system of claim 51, wherein the video data frame includes resynchronization markers, the system further comprising means for concealing macroblocks within the segment of the video data frame and macroblocks up to a next one of the resynchronization markers following the detected decoding error in the video data frame.

56. The system of claim 51, wherein the demultiplexing means demultiplexes the multiplex layer units according to the ITU H.223 or RTP/UDP/IP multiplexing/demultiplexing protocols.

57. The system of claim 51, wherein the video data frame includes macroblocks of video data conforming to the MPEG-4, ITU H.263, ITU H.264 or MPEG-2 protocols.

58. The system of claim 51, wherein the physical layer data units include W-CDMA radio link control packet data units (RLC PDUs), CDMA2000 1x radio link protocol packet data units (RLP PDUs), CDMA2000 1x EV-DO RLP PDUs, or CDMA2000 EV-DV RLP PDUs.

59. A computer-readable medium comprising instructions to cause one or more processors to:

generate multiplex layer data units containing video data based on physical layer data units;
embed a boundary marker in the multiplex layer data units to indicate a boundary between the physical layer data units;
demultiplex the multiplex layer data units to generate a video data frame; and
associate a detected decoding error with a segment of the video data frame using the boundary markers.

60. The computer-readable medium of claim 59, wherein the boundary marker identifies a start of a lost physical layer data unit.

61. The computer-readable medium of claim 59, further comprising instructions to cause the processor to embed a plurality of the boundary markers to identify boundaries between a plurality of the physical layer data units.

62. The computer-readable medium of claim 59, wherein the video data frame includes macroblocks of video data, further comprising instructions to cause the processor to conceal macroblocks within the segment of the video data frame.

63. The computer-readable medium of claim 59, wherein the video data frame includes resynchronization markers, further comprising instructions to cause the processor to conceal macroblocks within the segment of the video data frame and macroblocks up to a next one of the resynchronization markers following the detected decoding error in the video data frame.

64. The computer-readable medium of claim 59, wherein the instructions cause the processor to demultiplex the multiplex layer units according to the ITU H.223 or RTP/UDP/IP multiplexing/demultiplexing protocol.

65. The computer-readable medium of claim 59, wherein the video data frame includes macroblocks of video data conforming to the MPEG-4, ITU H.263, ITU H.264 or MPEG-2 protocols.

66. The computer-readable medium of claim 59, wherein the physical layer data units include W-CDMA radio link control packet data units (RLC PDUs), CDMA2000 1x radio link protocol packet data units (RLP PDUs), CDMA2000 1x EV-DO RLP PDUs, or CDMA2000 EV-DV RLP PDUs.

Patentansprüche

1. Ein Videodecodierungsverfahren, das Folgendes aufweist:

Generieren von Dateneinheiten der Multiplexierschicht bzw. Multiplex Layer, die Videodaten enthalten, basierend auf Dateneinheiten der physikalischen Schicht bzw. Physical Layer;
Einbetten eines Grenzmarkers in die Dateneinheiten der Multiplex Layer, um eine Grenze zwischen den Dateneinheiten der Physical Layer anzuzeigen;
Demultiplexen der Dateneinheiten der Multiplex Layer, um einen Videodatenrahmen zu generieren; und
Assoziieren eines detektierten Decodierungsfehlers mit einem Segment des Videodatenrahmens unter Verwendung der Grenzmarker.

2. Verfahren nach Anspruch 1, wobei der Grenzmarker einen Beginn einer verlorenen Dateneinheit der Physical Layer anzeigt.

3. Verfahren nach Anspruch 1, wobei das Einbetten eines Grenzmarkers Einbetten einer Vielzahl der Grenzmarker beinhaltet, um Grenzen zwischen einer Vielzahl von Dateneinheiten der Physical Layer zu identifizieren.

4. Verfahren nach Anspruch 1, wobei der Videodatenrahmen Makroblöcke von Videodaten beinhaltet, wobei das Verfahren weiter Concealen bzw. Verbergen von Makroblöcken in dem Segment des Videodatenrahmens aufweist.

5.
Verfahren nach Anspruch 1, wobei der Videodatenrahmen Resynchronisationsmarker beinhaltet, wobei das Verfahren weiter Verbergen von Makroblöcken in dem Segment des Videodatenrahmens und von Makroblöcken bis zu einem nächsten der Resynchronisationsmarker, der dem detektierten Decodierfehler in dem Videodatenrahmen folgt, aufweist.

6. Verfahren nach Anspruch 1, das weiter Demultiplexen der Dateneinheiten der Multiplex Layer aufweist, um Dateneinheiten der Adaptionsschicht bzw. Adaptation Layer zu generieren, und Generieren der Videodatenrahmen basierend auf den Dateneinheiten der Adaptation Layer.

7. Verfahren nach Anspruch 1, das weiter Empfangen der Dateneinheiten der Physical Layer über Drahtloskommunikation aufweist.

8. Verfahren nach Anspruch 1, das weiter Demultiplexen der Einheiten der Multiplex Layer gemäß dem ITU-H.223-Multiplexing/Demultiplexing-Protokoll aufweist.

9. Verfahren nach Anspruch 1, das weiter Demultiplexen der Einheiten der Multiplex Layer gemäß dem RTP/UDP/IP-Multiplexing/Demultiplexing-Protokoll aufweist.

10. Verfahren nach Anspruch 1, wobei der Videodatenrahmen Makroblöcke von Videodaten beinhaltet, die zum MPEG-4-Standard konform sind.

11. Verfahren nach Anspruch 1, wobei der Videodatenrahmen Makroblöcke von Videodaten beinhaltet, die zu einem der ITU-H.263-, ITU-H.264- und MPEG-2-Protokolle konform sind.

12. Verfahren nach Anspruch 1, wobei die Dateneinheiten der Physical Layer W-CDMA-Funkverbindungssteuerungspaketdateneinheiten bzw. W-CDMA RLC PDUs (RLC PDUs = radio link control packet data units) beinhalten.

13. Verfahren nach Anspruch 12, wobei die Dateneinheiten der Multiplex Layer zum H.223-Multiplexing/Demultiplexing-Protokoll konform sind.

14. Verfahren nach Anspruch 1, wobei die Dateneinheiten der Physical Layer CDMA2000-1x-Funkverbindungsprotokollpaketdateneinheiten bzw. CDMA2000 1x RLP PDUs (RLP PDUs = radio link protocol packet data units), CDMA2000 1x EV-DO RLP PDUs oder CDMA2000 EV-DV RLP PDUs beinhalten.

15. Verfahren nach Anspruch 1, wobei die Dateneinheiten der Multiplex Layer zum RTP/UDP/IP-Multiplexing/Demultiplexing-Protokoll konform sind.

16. Verfahren nach Anspruch 1, wobei die Dateneinheiten der Physical Layer Audio- und Videodaten beinhalten, und das Einbetten von Grenzmarkern Einbetten von Grenzmarkern in die Dateneinheiten der Multiplex Layer beinhaltet, um Grenzen zwischen Videoinformation in den Dateneinheiten der Physical Layer anzuzeigen.

17. Ein Videodecodiersystem, das Folgendes aufweist:

eine Demultiplexing-Engine zum Generieren von Dateneinheiten der Multiplexierschicht bzw. Multiplex Layer, die Videodaten beinhalten, und zwar basierend auf Dateneinheiten der physikalischen Schicht bzw. Physical Layer, und zum Demultiplexen der Dateneinheiten der Multiplex Layer;
einen Grenzgenerator zum Einbetten eines Grenzmarkers in die Dateneinheiten der Multiplex Layer, um eine Grenze zwischen den Dateneinheiten der Physical Layer anzuzeigen; und
eine Videodecodier-Engine zum Decodieren eines Videodatenrahmens, der Videodaten enthält, und zum Assoziieren eines detektierten Decodierfehlers mit einem Segment des Videodatenrahmens unter Verwendung der Grenzmarker.

18. System nach Anspruch 17, wobei der Grenzmarker einen Beginn einer verlorenen Dateneinheit der Physical Layer identifiziert.

19. System nach Anspruch 17, wobei der Grenzgenerator eine Vielzahl von Grenzmarkern einbettet, um Grenzen zwischen einer Vielzahl von Dateneinheiten der Physical Layer zu identifizieren.

20. System nach Anspruch 17, das weiter einen Grenzdetektor zum Detektieren der Grenzen zwischen den Dateneinheiten der Physical Layer aufweist.

21. System nach Anspruch 17, wobei der Videodatenrahmen Makroblöcke von Videodaten beinhaltet, und die Decodier-Engine Makroblöcke in dem Segment des Videodatenrahmens verbirgt.

22.
System nach Anspruch 17, wobei der Videodatenrahmen Resynchronisationsmarker beinhaltet, und die Decodier-Engine Makroblöcke in dem Segment des Videodatenrahmens und Makroblöcke bis zu einem nächsten der Resynchronisationsmarker, die dem detektierten Decodierfehler in dem Videodatenrahmen folgen, verbirgt.

23. System nach Anspruch 17, das weiter ein Adaptation-Layer-Modul aufweist zum Generieren von Dateneinheiten der Adaptation Layer basierend auf den demultiplexten Dateneinheiten der Multiplex Layer, und zum Generieren des Videodatenrahmens basierend auf den Dateneinheiten der Adaptation Layer.

24. System nach Anspruch 17, das weiter einen Drahtlosempfänger zum Empfangen der Dateneinheiten der Physical Layer über Drahtloskommunikation aufweist.

25. System nach Anspruch 17, wobei die Demultiplexing-Engine die Einheiten der Multiplex Layer gemäß dem ITU-H.223-Multiplexing/Demultiplexing-Protokoll demultiplext.

26. System nach Anspruch 17, wobei die Demultiplexing-Engine die Einheiten der Multiplex Layer gemäß dem RTP/UDP/IP-Multiplexing/Demultiplexing-Protokoll demultiplext.

27. System nach Anspruch 17, wobei der Videodatenrahmen Makroblöcke von Videodaten beinhaltet, die zum MPEG-4-Standard konform sind.

28. System nach Anspruch 17, wobei der Videodatenrahmen Makroblöcke von Videodaten beinhaltet, die zu einem der ITU-H.263-, ITU-H.264- und MPEG-2-Protokolle konform sind.

29. System nach Anspruch 17, wobei die Dateneinheiten der Physical Layer W-CDMA-Funkverbindungssteuerpaketdateneinheiten bzw. W-CDMA RLC PDUs (RLC PDUs = radio link control packet data units) beinhalten.

30. System nach Anspruch 29, wobei die Dateneinheiten der Multiplex Layer mit dem H.223-Multiplexing/Demultiplexing-Protokoll konform sind.

31. System nach Anspruch 17, wobei die Dateneinheiten der Physical Layer CDMA2000-1x-Funkverbindungsprotokollpaketdateneinheiten bzw. CDMA2000 1x RLP PDUs (RLP PDUs = radio link protocol packet data units), CDMA2000 1x EV-DO RLP PDUs oder CDMA2000 EV-DV RLP PDUs beinhalten.

32. System nach Anspruch 17, wobei die Dateneinheiten der Multiplex Layer zum RTP/UDP/IP-Multiplexing/Demultiplexing-Protokoll konform sind.

33. System nach Anspruch 17, wobei die Dateneinheiten der Physical Layer Audio- und Videodaten beinhalten, und der Grenzgenerator die Grenzmarker in die Dateneinheiten der Multiplex Layer einbettet, um Grenzen zwischen Videoinformation in den Dateneinheiten der Physical Layer anzuzeigen.

34. Ein Videodemultiplexer, der Folgendes aufweist:

eine Demultiplexing-Engine zum Generieren von Dateneinheiten der Multiplexierschicht bzw. Multiplex Layer, die Videodaten beinhalten, und zwar basierend auf Dateneinheiten der physikalischen Schicht bzw. Physical Layer, und zum Demultiplexen der Dateneinheiten der Multiplex Layer; und
einen Grenzgenerator zum Einbetten eines Grenzmarkers in die Dateneinheiten der Multiplex Layer zum Anzeigen einer Grenze zwischen den Dateneinheiten der Physical Layer, um es einem Videodecodierer zu gestatten, einen detektierten Decodierfehler mit einem Segment eines Videodatenrahmens unter Verwendung der Grenzmarker zu assoziieren.

35. Demultiplexer nach Anspruch 34, wobei der Grenzmarker einen Beginn einer verlorenen Dateneinheit der Physical Layer identifiziert.

36. Demultiplexer nach Anspruch 34, wobei der Grenzgenerator eine Vielzahl von Grenzmarkern einbettet, um Grenzen zwischen einer Vielzahl von Dateneinheiten der Physical Layer zu identifizieren.

37. Demultiplexer nach Anspruch 34, der weiter einen Grenzdetektor zum Detektieren der Grenzen zwischen den Dateneinheiten der Physical Layer aufweist.

38. Demultiplexer nach Anspruch 34, wobei der Videodatenrahmen Makroblöcke von Videodaten beinhaltet, und die Decodier-Engine Makroblöcke in dem Segment des Videodatenrahmens verbirgt.

39. Demultiplexer nach Anspruch 34, wobei die Demultiplexing-Engine Einheiten der Multiplex Layer gemäß dem H.223-Multiplexing/Demultiplexing-Protokoll demultiplext.

40. Demultiplexer nach Anspruch 34, wobei die Demultiplexing-Engine die Dateneinheiten der Multiplex Layer gemäß dem RTP/UDP/IP-Multiplexing/Demultiplexing-Protokoll demultiplext.

41. Demultiplexer nach Anspruch 34, wobei der Videodatenrahmen Makroblöcke von Videodaten beinhaltet, die zum MPEG-4-Standard konform sind.

42. Demultiplexer nach Anspruch 34, wobei der Videodatenrahmen Makroblöcke von Videodaten beinhaltet, die zu einem der ITU-H.263-, ITU-H.264- und MPEG-2-Protokolle konform sind.

43.
Demultiplexer nach Anspruch 34, wobei die Dateneinheiten der Physical Layer W-CDMA-Funkverbindungssteuerpaketdateneinheiten bzw. W-CDMA RLC PDUs (RLC PDUs = radio link control packet data units) beinhalten.

44. Demultiplexer nach Anspruch 43, wobei die Dateneinheiten der Multiplex Layer mit dem H.223-Multiplexing/Demultiplexing-Protokoll konform sind.

45. Demultiplexer nach Anspruch 34, wobei die Dateneinheiten der Physical Layer CDMA2000-1x-Funkverbindungsprotokollpaketdateneinheiten bzw. CDMA2000 1x RLP PDUs (RLP PDUs = radio link protocol packet data units), CDMA2000 1x EV-DO RLP PDUs oder CDMA2000 EV-DV RLP PDUs beinhalten.

46. Demultiplexer nach Anspruch 34, wobei die Dateneinheiten der Multiplex Layer zum RTP/UDP/IP-Multiplexing/Demultiplexing-Protokoll konform sind.

47. Demultiplexer nach Anspruch 34, wobei die Dateneinheiten der Physical Layer Audio- und Videodaten beinhalten, und der Grenzgenerator die Grenzmarker in die Dateneinheiten der Multiplex Layer einbettet, um Grenzen zwischen Videoinformation in den Dateneinheiten der Physical Layer anzuzeigen.

48. Eine Drahtloskommunikationseinrichtung, die Folgendes aufweist:

einen Drahtlosempfänger zum Empfangen von Dateneinheiten der physikalischen Schicht bzw. Physical Layer über Drahtloskommunikation, wobei die Dateneinheiten der Physical Layer Videodaten beinhalten;
eine Demultiplexing-Engine zum Generieren von Dateneinheiten der Multiplex Layer basierend auf den Dateneinheiten der Physical Layer, und zum Demultiplexen der Dateneinheiten der Multiplex Layer;
einen Grenzgenerator zum Einbetten eines Grenzmarkers in die Dateneinheiten der Multiplex Layer zum Anzeigen einer Grenze zwischen den Dateneinheiten der Physical Layer; und
eine Videodecodier-Engine zum Decodieren eines Videodatenrahmens, der Videodaten enthält, und zum Assoziieren eines detektierten Decodierfehlers mit einem Segment des Videodatenrahmens unter Verwendung der Grenzmarker.

49.
Einrichtung nach Anspruch 48, wobei der Grenzmarker einen Beginn einer verlorenen Dateneinheit der Physical Layer identifiziert.

50. Einrichtung nach Anspruch 48, wobei der Grenzgenerator eine Vielzahl von Grenzmarkern einbettet, um Grenzen zwischen einer Vielzahl von Dateneinheiten der Physical Layer zu identifizieren.

51. Ein Videodecodiersystem, das Folgendes aufweist:

Mittel zum Generieren von Dateneinheiten der Multiplex Layer, die Videodaten enthalten, und zwar basierend auf Dateneinheiten der Physical Layer;
Mittel zum Einbetten eines Grenzmarkers in die Dateneinheiten der Multiplex Layer, um eine Grenze zwischen den Dateneinheiten der Physical Layer anzuzeigen;
Mittel zum Demultiplexen der Dateneinheiten der Multiplex Layer zum Generieren eines Videodatenrahmens; und
Mittel zum Assoziieren eines detektierten Decodierfehlers mit einem Segment des Videodatenrahmens unter Verwendung der Grenzmarker.

52. System nach Anspruch 51, wobei der Grenzmarker einen Beginn einer verlorenen Dateneinheit der Physical Layer identifiziert.

53. System nach Anspruch 51, wobei die Mittel zum Einbetten Mittel zum Einbetten einer Vielzahl der Grenzmarker beinhalten, um Grenzen zwischen einer Vielzahl von Dateneinheiten der Physical Layer zu identifizieren.

54. System nach Anspruch 51, wobei der Videodatenrahmen Makroblöcke von Videodaten beinhaltet, wobei das System weiter Mittel zum Verbergen von Makroblöcken in dem Segment des Videodatenrahmens aufweist.

55. System nach Anspruch 51, wobei der Videodatenrahmen Resynchronisationsmarker beinhaltet, wobei das System weiter Mittel zum Verbergen von Makroblöcken in dem Segment des Videodatenrahmens und von Makroblöcken bis zu einem nächsten der Resynchronisationsmarker, der dem detektierten Decodierfehler in dem Videodatenrahmen folgt, aufweist.

56. System nach Anspruch 51, wobei die Mittel zum Demultiplexen die Einheiten der Multiplex Layer gemäß den ITU-H.223-Multiplexing/Demultiplexing- oder RTP/UDP/IP-Multiplexing/Demultiplexing-Protokollen demultiplexen.

57. System nach Anspruch 51, wobei der Videodatenrahmen Makroblöcke von Videodaten beinhaltet, die zu einem der MPEG-4-, ITU-H.263-, ITU-H.264- und MPEG-2-Protokolle konform sind.

58. System nach Anspruch 51, wobei die Dateneinheiten der Physical Layer W-CDMA-Funkverbindungssteuerungspaketdateneinheiten bzw. W-CDMA RLC PDUs (RLC PDUs = radio link control packet data units), CDMA2000-1x-Funkverbindungsprotokollpaketdateneinheiten bzw. CDMA2000 1x RLP PDUs (RLP PDUs = radio link protocol packet data units), CDMA2000 1x EV-DO RLP PDUs oder CDMA2000 EV-DV RLP PDUs beinhalten.

59. Ein computerlesbares Medium, das Instruktionen aufweist, um einen oder mehrere Prozessoren zu Folgendem zu veranlassen:

Generieren von Dateneinheiten der Multiplexierschicht bzw. Multiplex Layer, die Videodaten enthalten, basierend auf Dateneinheiten der physikalischen Schicht bzw. Physical Layer;
Einbetten eines Grenzmarkers in die Dateneinheiten der Multiplex Layer, um eine Grenze zwischen den Dateneinheiten der Physical Layer anzuzeigen;
Demultiplexen der Dateneinheiten der Multiplex Layer, um einen Videodatenrahmen zu generieren; und
Assoziieren eines detektierten Decodierungsfehlers mit einem Segment des Videodatenrahmens unter Verwendung der Grenzmarker.

60. Computerlesbares Medium nach Anspruch 59, wobei der Grenzmarker einen Beginn einer verlorenen Dateneinheit der Physical Layer anzeigt.

61. Computerlesbares Medium nach Anspruch 59, das weiter Instruktionen aufweist, um den Prozessor zu veranlassen, eine Vielzahl von Grenzmarkern einzubetten, um Grenzen zwischen einer Vielzahl von Dateneinheiten der Physical Layer zu identifizieren.

62. Computerlesbares Medium nach Anspruch 59, wobei der Videodatenrahmen Makroblöcke von Videodaten beinhaltet, das weiter Instruktionen aufweist, um den Prozessor zu veranlassen, Makroblöcke in dem Segment des Videodatenrahmens zu verbergen.

63. Computerlesbares Medium nach Anspruch 59, wobei der Videodatenrahmen Resynchronisationsmarker beinhaltet, das weiter Instruktionen aufweist, um den Prozessor zu veranlassen, Makroblöcke in dem Segment des Videodatenrahmens und Makroblöcke bis zu einem nächsten der Resynchronisationsmarker, der dem detektierten Decodierfehler in dem Videodatenrahmen folgt, zu verbergen.

64. Computerlesbares Medium nach Anspruch 59, wobei die Instruktionen den Prozessor veranlassen, die Einheiten der Multiplex Layer gemäß den ITU-H.
223-Multiplexing/Demultiplexing- oder RTP/UDP/IP-Multiplexing/Demultiplexing-Protokollen zu demultiplexen. 6. Computerlesbares Medium nach Anspruch 9, wobei der Videodatenrahmen Makroblöcke von Videodaten beinhaltet, die zu einem der MPEG-4-, ITU Computerlesbares Medium nach Anspruch 9, wobei die Dateneinheiten der Physical Layer W-CDMA- Funkverbindungssteuerungspaketdateneinheiten bzw. W-CDMA RLC PDUs (RLC PDUs = radio link control pakket data units), CDMA00-1x- Funkverbindungsprotokollpaketdateneinheiten bzw. CDMA00 1x RLP PDUs (RLP PDUs = radio link protocol packet data units), CDMA00 1x EV- DO RLP PDUs oder CDMA00 EV-DV RLP PDUs beinhalten. Revendications 1. Procédé de décodage vidéo comprenant : générer des unités de données de couche de multiplexage contenant des données vidéo sur la base d unités de données de couche physique ; intégrer un marqueur de frontière dans les unités de données de couche de multiplexage pour indiquer une frontière entre les unités de données de couche physique ; démultiplexer les unités de données de couche de multiplexage pour générer une trame de données vidéo ; et associer une erreur de décodage détectée à un segment de la trame de données vidéo en utilisant les marqueurs de frontière. 2. Procédé selon la revendication 1, dans lequel le marqueur de frontière identifie le début d une unité de données de couche physique perdue. 3. Procédé selon la revendication 1, dans lequel l intégration du marqueur de frontière inclut l intégration de plusieurs marqueurs de frontière pour identifier des frontières entre plusieurs des unités de données de couche physique. 4. Procédé selon la revendication 1, dans lequel la trame de données vidéo comprend des macroblocs de données vidéo, le procédé comprenant en outre la dissimulation de macroblocs dans le segment de la trame de données vidéo.. 
Procédé selon la revendication 1, dans lequel la trame de données vidéo comprend des marqueurs de resynchronisation, le procédé comprenant en outre la dissimulation de macroblocs dans le segment de la trame de données vidéo et de macroblocs jusqu à un marqueur suivant des marqueurs de resynchronisation à la suite de l erreur de décodage détectée dans la trame de données vidéo.
16 29 EP B1 6. Procédé selon la revendication 1, comprenant en outre le démultiplexage des unités de données de couche de multiplexage pour générer des unités de données de couche d adaptation, et la génération de la trame de données vidéo sur la base des unités de données de couche d adaptation. 7. Procédé selon la revendication 1, comprenant en outre la réception des unités de données de couche physique par l intermédiaire d une communication sans fil. 8. Procédé selon la revendication 1, comprenant en outre le démultiplexage des unités de couche de multiplexage conformément au protocole de multiplexage/démultiplexage ITU H Procédé selon la revendication 1, comprenant en outre le démultiplexage des unités de couche de multiplexage conformément au protocole de multiplexage/démultiplexage RTP/UDP/IP.. Procédé selon la revendication 1, dans lequel la trame de données vidéo comprend des macroblocs de données vidéo conformes à la norme MPEG Procédé selon la revendication 1, dans lequel la trame de données vidéo comprend des macroblocs de données vidéo conformes à l un des protocoles ITU H.263, ITU H.264 et MPEG Procédé selon la revendication 1, dans lequel les unités de données de couche physique comprennent des unités de données par paquets de commande de liaison radio (PDU RLC) W-CDMA. 13. Procédé selon la revendication 12, dans lequel les unités de données de couche de multiplexage sont conformes au protocole de multiplexage/démultiplexage H Procédé selon la revendication 1, dans lequel les unités de données de couche physique comprennent des unités de données par paquets de protocole de liaison radio (PDU RLP) CDMA00 1x, des PDU RLP CDMA00 1x EV-DO, ou des PDU RLP CDMA00 EV-DV.. Procédé selon la revendication 1, dans lequel les unités de données de couche de multiplexage sont conformes au protocole de multiplexage/démultiplexage RTP/UDP/IP. 16. 
Procédé selon la revendication 1, dans lequel les unités de données de couche physique comprennent des données audio et vidéo, et l intégration de marqueurs de frontière comprend l intégration de marqueurs de frontière dans les unités de données de couche de multiplexage pour indiquer des frontières entre des informations vidéo dans les unités de données de couche physique. 17. Système de décodage vidéo comprenant : un moteur de démultiplexage pour générer des unités de données de couche de multiplexage contenant des données vidéo sur la base d unités de données de couche physique, et démultiplexer les unités de données de couche de multiplexage ; un générateur de frontière pour intégrer un marqueur de frontière dans les unités de données de couche de multiplexage pour indiquer une frontière entre les unités de données de couche physique ; et un moteur de décodage vidéo pour décoder une trame de données vidéo contenant les données vidéo, et associer une erreur de décodage détectée à un segment de la trame de données vidéo en utilisant les marqueurs de frontière. 18. Système selon la revendication 17, dans lequel le marqueur de frontière identifie le début d une unité de données de couche physique perdue. 19. Système selon la revendication 17, dans lequel le générateur de frontière intègre plusieurs marqueurs de frontière pour identifier des frontières entre plusieurs des unités de données de couche physique.. Système selon la revendication 17, comprenant en outre un détecteur de frontière pour détecter les frontières entre les unités de données de couche physique. 21. Système selon la revendication 17, dans lequel la trame de données vidéo comprend des macroblocs de données vidéo, et le moteur de décodage dissimule des macroblocs dans le segment de la trame de données vidéo. 22. 
Système selon la revendication 17, dans lequel la trame de données vidéo comprend des marqueurs de resynchronisation, et le moteur de décodage dissimule des macroblocs dans le segment de la trame de données vidéo et des macroblocs jusqu à un marqueur suivant des marqueurs de resynchronisation à la suite de l erreur de décodage détectée dans la trame de données vidéo. 23. Système selon la revendication 17, comprenant en outre un module de couche d adaptation pour générer des unités de données de couche d adaptation sur la base des unités de données de couche de multiplexage démultiplexées, et générer la trame de données vidéo sur la base des unités de données 16
17 31 EP B1 32 de couche d adaptation. 24. Système selon la revendication 17, comprenant en outre un récepteur sans fil pour recevoir les unités de données de couche physique par l intermédiaire d une communication sans fil.. Système selon la revendication 17, dans lequel le moteur de démultiplexage démultiplexe les unités de couche de multiplexage conformément au protocole de multiplexage/démultiplexage ITU H Système selon la revendication 17, dans lequel le moteur de démultiplexage démultiplexe les unités de couche de multiplexage conformément au protocole de multiplexage/démultiplexage RTP/UDP/IP. 27. Système selon la revendication 17, dans lequel la trame de données vidéo comprend des macroblocs de données vidéo conformes à la norme MPEG Système selon la revendication 17, dans lequel la trame de données vidéo comprend des macroblocs de données vidéo conformes à l un des protocoles ITU H.263, ITU H.264 et MPEG Système selon la revendication 17, dans lequel les unités de données de couche physique comprennent des unités de données par paquets de commande de liaison radio (PDU RLC) W-CDMA.. Système selon la revendication 29, dans lequel les unités de données de couche de multiplexage sont conformes au protocole de multiplexage/démultiplexage H un moteur de démultiplexage pour générer des unités de données de couche de multiplexage contenant des données vidéo sur la base d unités de données de couche physique, et démultiplexer les unités de données de couche de multiplexage ; et un générateur de frontière pour intégrer un marqueur de frontière dans les unités de données de couche de multiplexage pour indiquer une frontière entre les unités de données de couche physique pour permettre à un décodeur vidéo d associer une erreur de décodage détectée à un segment d une trame de données vidéo en utilisant les marqueurs de frontière. 3. 
Démultiplexeur selon la revendication 34, dans lequel le marqueur de frontière identifie le début d une unité de données de couche physique perdue. 36. Démultiplexeur selon la revendication 34, dans lequel le générateur de frontière intègre plusieurs marqueurs de frontière pour identifier des frontières entre plusieurs des unités de données de couche physique. 37. Démultiplexeur selon la revendication 34, comprenant en outre un détecteur de frontière pour détecter les frontières entre les unités de données de couche physique. 38. Démultiplexeur selon la revendication 34, dans lequel la trame de données vidéo comprend des macroblocs de données vidéo, et le moteur de décodage dissimule des macroblocs dans le segment de la trame de données vidéo. 31. Système selon la revendication 17, dans lequel les unités de données de couche physique comprennent des unités de données par paquets de protocole de liaison radio (PDU RLP) CDMA00 1x, des PDU RLP CDMA00 1x EV-DO, ou des PDU RLP CDMA00 EV-DV. 32. Système selon la revendication 17, dans lequel les unités de données de couche de multiplexage sont conformes au protocole de multiplexage/démultiplexage RTP/UDP/IP Démultiplexeur selon la revendication 34, dans lequel le moteur de démultiplexage démultiplexe les unités de données de couche de multiplexage conformément au protocole de multiplexage/démultiplexage H Démultiplexeur selon la revendication 34, dans lequel le moteur de démultiplexage démultiplexe les unités de données de couche de multiplexage conformément au protocole de multiplexage/démultiplexage RTP/UDP/IP. 33. Système selon la revendication 17, dans lequel les unités de données de couche physique comprennent des données audio et vidéo, et le générateur de frontière intègre les marqueurs de frontière dans les unités de données de couche de multiplexage pour indiquer des frontières entre des informations vidéo dans les unités de données de couche physique. 34. 
Démultiplexeur vidéo comprenant : Démultiplexeur selon la revendication 34, dans lequel la trame de données vidéo comprend des macroblocs de données vidéo conformes à la norme MPEG Démultiplexeur selon la revendication 34, dans lequel la trame de données vidéo comprend des macroblocs de données vidéo conformes à l un des protocoles ITU H.263, ITU H.264 et MPEG-2. 17
18 33 EP B Démultiplexeur selon la revendication 34, dans lequel les unités de données de couche physique comprennent des unités de données par paquets de commande de liaison radio (PDU RLC) W-CDMA. 44. Démultiplexeur selon la revendication 43, dans lequel les unités de données de couche de multiplexage sont conformes au protocole de multiplexage/démultiplexage H Démultiplexeur selon la revendication 34, dans lequel les unités de données de couche physique comprennent des unités de données par paquets de protocole de liaison radio (PDU RLP) CDMA00 1x, des PDU RLP CDMA00 1x EV-DO, ou des PDU RLP CDMA00 EV-DV. 46. Démultiplexeur selon la revendication 34, dans lequel les unités de données de couche de multiplexage sont conformes au protocole de multiplexage/démultiplexage RTP/UDP/IP. 47. Démultiplexeur selon la revendication 34, dans lequel les unités de données de couche physique comprennent des données audio et vidéo, et le générateur de frontière intègre les marqueurs de frontière dans les unités de données de couche de multiplexage pour indiquer des frontières entre des informations vidéo dans les unités de données de couche physique. 48. Dispositif de communication sans fil comprenant : 0. Dispositif selon la revendication 48, dans lequel le générateur de frontière intègre plusieurs marqueurs de frontière pour identifier des frontières entre plusieurs des unités de données de couche physique. 1. 
Système de décodage vidéo comprenant : des moyens pour générer des unités de données de couche de multiplexage contenant des données vidéo sur la base d unités de données de couche physique ; des moyens pour intégrer un marqueur de frontière dans les unités de données de couche de multiplexage pour indiquer une frontière entre les unités de données de couche physique ; des moyens pour démultiplexer les unités de données de couche de multiplexage pour générer une trame de données vidéo ; et des moyens pour associer une erreur de décodage détectée à un segment de la trame de données vidéo en utilisant les marqueurs de frontière. 2. Système selon la revendication 1, dans lequel le marqueur de frontière identifie le début d une unité de données de couche physique perdue. 3. Système selon la revendication 1, dans lequel les moyens pour intégrer comprennent des moyens pour intégrer plusieurs marqueurs de frontière pour identifier des frontières entre plusieurs des unités de données de couche physique. un récepteur sans fil pour recevoir des unités de données de couche physique par l intermédiaire d une communication sans fil, les unités de données de couche physique contenant des données vidéo ; un moteur de démultiplexage pour générer des unités de données de couche de multiplexage sur la base des unités de données de couche physique, et démultiplexer les unités de données de couche de multiplexage ; un générateur de frontière pour intégrer un marqueur de frontière dans les unités de données de couche de multiplexage pour indiquer une frontière entre les unités de données de couche physique ; et un moteur de décodage vidéo pour décoder une trame de données vidéo contenant les données vidéo, et associer une erreur de décodage détectée à un segment de la trame de données vidéo en utilisant les marqueurs de frontière. 49. 
Dispositif selon la revendication 48, dans lequel le marqueur de frontière identifie le début d une unité de données de couche physique perdue Système selon la revendication 1, dans lequel la trame de données vidéo comprend des macroblocs de données vidéo, le système comprenant en outre des moyens pour dissimuler des macroblocs dans le segment de la trame de données vidéo.. Système selon la revendication 1, dans lequel la trame de données vidéo comprend des marqueurs de resynchronisation, le système comprenant en outre des moyens pour dissimuler des macroblocs dans le segment de la trame de données vidéo et des macroblocs jusqu à un marqueur suivant des marqueurs de resynchronisation à la suite de l erreur de décodage détectée dans la trame de données vidéo. 6. Système selon la revendication 1, dans lequel les moyens de démultiplexage démultiplexent les unités de couche de multiplexage selon les protocoles de multiplexage/démultiplexage ITU H.223 ou RTP/UDP/IP. 7. Système selon la revendication 1, dans lequel la trame de données vidéo comprend des macroblocs de données vidéo conformes aux protocoles MPEG- 18
19 3 EP B1 36 4, ITU H.263, ITU H.264 ou MPEG Système selon la revendication 1, dans lequel les unités de données de couche physique comprennent des unités de données par paquets de commande de liaison radio (PDU RLC) W-CDMA, des unités de données par paquets de protocole de liaison radio (PDU RLP) CDMA00 1x, des PDU RLP CDMA00 1x EV-DO, ou des PDU RLP CDMA00 EV-DV. 9. Support lisible par un ordinateur comprenant des instructions pour amener un ou plusieurs processeurs à: générer des unités de données de couche de multiplexage contenant des données vidéo sur la base d unités de données de couche physique ; intégrer un marqueur de frontière dans les unités de données de couche de multiplexage pour indiquer une frontière entre les unités de données de couche physique ; démultiplexer les unités de données de couche de multiplexage pour générer une trame de données vidéo ; et associer une erreur de décodage détectée à un segment de la trame de données vidéo en utilisant les marqueurs de frontière. 60. Support lisible par un ordinateur selon la revendication 9, dans lequel le marqueur de frontière identifie le début d une unité de données de couche physique perdue. 61. Support lisible par un ordinateur selon la revendication 9, comprenant en outre des instructions pour amener le processeur à intégrer plusieurs marqueurs de frontière pour identifier des frontières entre plusieurs des unités de données de couche physique Support lisible par un ordinateur selon la revendication 9, dans lequel les instructions amènent le processeur à démultiplexer les unités de couche de multiplexage selon les protocoles de multiplexage/démultiplexage ITU H.223 ou RTP/UDP/IP. 6. 
Support lisible par un ordinateur selon la revendication 9, dans lequel la trame de données vidéo comprend des macroblocs de données vidéo conformes aux protocoles MPEG-4, ITU H.263, ITU H.264 ou MPEG Support lisible par un ordinateur selon la revendication 9, dans lequel les unités de données de couche physique comprennent des unités de données par paquets de commande de liaison radio (PDU RLC) W-CDMA, des unités de données par paquets de protocole de liaison radio (PDU RLP) CDMA00 1x, des PDU RLP CDMA00 1x EV-DO, ou des PDU RLP CDMA00 EV-DV. 62. Support lisible par un ordinateur selon la revendication 9, dans lequel la trame de données vidéo comprend des macroblocs de données vidéo, comprenant en outre des instructions pour amener le processeur à dissimuler des macroblocs dans le segment de la trame de données vidéo Support lisible par un ordinateur selon la revendication 9, dans lequel la trame de données vidéo comprend des marqueurs de resynchronisation, comprenant en outre des instructions pour amener le processeur à dissimuler des macroblocs dans le segment de la trame de données vidéo et des macroblocs jusqu à un marqueur suivant des marqueurs de resynchronisation à la suite de l erreur de décodage détectée dans la trame de données vidéo. 0 19
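The claims above describe two cooperating steps: a demultiplexer embeds boundary markers in the multiplex-layer stream wherever a physical-layer PDU begins (including where a PDU was lost), and the video decoder uses those markers to map a detected decoding error onto a segment of the frame and conceal macroblocks from that segment up to the next resynchronization marker. A minimal Python sketch of that flow, assuming list-based toy data (all names here are hypothetical illustrations, not the patented H.223/RTP implementation):

```python
BOUNDARY = "<<PDU-BOUNDARY>>"  # in-band marker a real system would encode compactly

def demultiplex(phys_pdus):
    """Flatten physical-layer PDUs into a multiplex-layer stream, inserting a
    boundary marker at each PDU boundary. A lost PDU (modeled as None) still
    contributes a marker, so the decoder can tell where data went missing."""
    stream = []
    for pdu in phys_pdus:
        stream.append(BOUNDARY)
        if pdu is not None:
            stream.extend(pdu)
    return stream

def conceal(frame_mbs, error_index, resync_positions):
    """Replace macroblocks from the erroneous segment up to the next
    resynchronization marker (or end of frame) with a concealment token."""
    end = next((p for p in resync_positions if p > error_index), len(frame_mbs))
    return ["CONCEALED" if error_index <= i < end else mb
            for i, mb in enumerate(frame_mbs)]

# The second PDU is lost in transit; its boundary marker survives.
stream = demultiplex([[1, 2], None, [3]])

# An error detected at macroblock 2 is concealed up to the resync marker at 5.
frame = conceal([f"mb{i}" for i in range(8)], error_index=2, resync_positions=[5])
```

The point of the sketch is the error-localization property: because each PDU boundary is marked, the decoder discards only the macroblocks between the error and the next resynchronization point instead of dropping the whole frame.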
DICTA2002: Digital Image Computing Techniques and Applications, 21--22 January 2002, Melbourne, Australia Bandwidth Adaptation for MPEG-4 Video Streaming over the Internet K. Ramkishor James. P. Mammen
Broadband Networks. Prof. Dr. Abhay Karandikar. Electrical Engineering Department. Indian Institute of Technology, Bombay. Lecture - 29.
Broadband Networks Prof. Dr. Abhay Karandikar Electrical Engineering Department Indian Institute of Technology, Bombay Lecture - 29 Voice over IP So, today we will discuss about voice over IP and internet
APTA TransiTech Conference Communications: Vendor Perspective (TT) Phoenix, Arizona, Tuesday, 3.19.13. VoIP Solution (101)
APTA TransiTech Conference Communications: Vendor Perspective (TT) Phoenix, Arizona, Tuesday, 3.19.13 VoIP Solution (101) Agenda Items Introduction What is VoIP? Codecs Mean opinion score (MOS) Bandwidth
HU CZ FI PL SI PT IT ES NO NL FR DK SE IE GB AT DE CH LU 0 10 20 30 40 Foreigners' share Source: Eurostat More trust 3 4 5 6 7 PL HU CZ SI PT GR ES DK FI SE
TEPZZ 9 _88_A_T EP 2 921 881 A1 (19) (11) EP 2 921 881 A1 (12) EUROPEAN PATENT APPLICATION
(19) TEPZZ 9 _88_A_T (11) EP 2 921 881 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 23.09.1 Bulletin 1/39 (21) Application number: 1416041.2 (1) Int Cl.: G01T 1/ (06.01) G03B 42/02 (06.01)
BLUETOOTH SERIAL PORT PROFILE. iwrap APPLICATION NOTE
BLUETOOTH SERIAL PORT PROFILE iwrap APPLICATION NOTE Thursday, 19 April 2012 Version 1.2 Copyright 2000-2012 Bluegiga Technologies All rights reserved. Bluegiga Technologies assumes no responsibility for
Easy H.264 video streaming with Freescale's i.mx27 and Linux
Libre Software Meeting 2009 Easy H.264 video streaming with Freescale's i.mx27 and Linux July 8th 2009 LSM, Nantes: Easy H.264 video streaming with i.mx27 and Linux 1 Presentation plan 1) i.mx27 & H.264
Performance Evaluation of VoIP Services using Different CODECs over a UMTS Network
Performance Evaluation of VoIP Services using Different CODECs over a UMTS Network Jianguo Cao School of Electrical and Computer Engineering RMIT University Melbourne, VIC 3000 Australia Email: [email protected]
Module 8 VIDEO CODING STANDARDS. Version 2 ECE IIT, Kharagpur
Module 8 VIDEO CODING STANDARDS Version ECE IIT, Kharagpur Lesson H. andh.3 Standards Version ECE IIT, Kharagpur Lesson Objectives At the end of this lesson the students should be able to :. State the
EN 106 EN 4. THE MOBILE USE OF THE INTERNET BY INDIVIDUALS AND ENTERPRISES. 4.1. Introduction
4. THE MOBILE USE OF THE INTERNET BY INDIVIDUALS AND ENTERPRISES 4.1. Introduction This chapter looks at mobile use of the internet by individuals and enterprises, benefiting from new data collected in
Module 7 Internet And Internet Protocol Suite
Module 7 Internet And Internet Protocol Suite Lesson 21 Internet and IPv4 LESSON OBJECTIVE General The lesson will discuss a popular network layer protocol, i.e. the Internet Protocol Specific The focus
ANALYSIS OF LONG DISTANCE 3-WAY CONFERENCE CALLING WITH VOIP
ENSC 427: Communication Networks ANALYSIS OF LONG DISTANCE 3-WAY CONFERENCE CALLING WITH VOIP Spring 2010 Final Project Group #6: Gurpal Singh Sandhu Sasan Naderi Claret Ramos ([email protected]) ([email protected])
EPL 657 Wireless Networks
EPL 657 Wireless Networks Some fundamentals: Multiplexing / Multiple Access / Duplex Infrastructure vs Infrastructureless Panayiotis Kolios Recall: The big picture... Modulations: some basics 2 Multiplexing
Multiplexing on Wireline Telephone Systems
Multiplexing on Wireline Telephone Systems Isha Batra, Divya Raheja Information Technology, Dronacharya College of Engineering Farrukh Nagar, Gurgaon, India ABSTRACT- This Paper Outlines a research multiplexing
ATM. Asynchronous Transfer Mode. Networks: ATM 1
ATM Asynchronous Transfer Mode Networks: ATM 1 Issues Driving LAN Changes Traffic Integration Voice, video and data traffic Multimedia became the buzz word One-way batch Two-way batch One-way interactive
Proactive Video Assurance through QoE and QoS Correlation
A Complete Approach for Quality and Service Assurance W H I T E P A P E R Introduction Video service providers implement new technologies to maximize the quality and diversity of their entertainment program
Audio and Video Synchronization:
White Paper Audio and Video Synchronization: Defining the Problem and Implementing Solutions Linear Acoustic Inc. www.linearacaoustic.com 2004 Linear Acoustic Inc Rev. 1. Introduction With the introduction
MPEG-2 Transport vs. Program Stream
MPEG-2 Transport vs. Program Stream White Paper What is the difference between Program Stream and Transport Stream, and why do we currently only support the Transport Stream format? Well, this topic is
FREE TV AUSTRALIA OPERATIONAL PRACTICE OP- 59 Measurement and Management of Loudness in Soundtracks for Television Broadcasting
Page 1 of 9 1. SCOPE This Operational Practice is recommended by Free TV Australia and refers to the measurement of audio loudness as distinct from audio level. It sets out guidelines for measuring and
ISO/IEC 11172-1 INTERNATIONAL STANDARD
NTERNATONAL STANDARD SO/EC 11172-1 First edition 1993-08-0 1 nformation technology - Coding of moving pictures and associated audio for digital storage media at up to about 1,5 Mbit/s - Part 1: Systems
ASUS Transformer Pad QSG TF300TG 3G Connection Manager
E7210 QSG TF300TG 3G Connection Manager Installing SIM card 1. Use a straightened paper clip to press the SIM card tray eject button. 2. Remove the tray from the slot. Orient and place the SIM card on
Figure 1.Block diagram of inventory management system using Proximity sensors.
Volume 1, Special Issue, March 2015 Impact Factor: 1036, Science Central Value: 2654 Inventory Management System Using Proximity ensors 1)Jyoti KMuluk 2)Pallavi H Shinde3) Shashank VShinde 4)Prof VRYadav
TEPZZ 799965A_T EP 2 799 965 A1 (19) (11) EP 2 799 965 A1. (12) EUROPEAN PATENT APPLICATION published in accordance with Art.
(19) TEPZZ 79996A_T (11) EP 2 799 96 A1 (12) EUROPEAN PATENT APPLICATION published in accordance with Art. 13(4) EPC (43) Date of publication: 0.11.14 Bulletin 14/4 (21) Application number: 14727698.4
SERIES H: AUDIOVISUAL AND MULTIMEDIA SYSTEMS Infrastructure of audiovisual services Transmission multiplexing and synchronization
International Telecommunication Union ITU-T TELECOMMUNICATION STANDARDIZATION SECTOR OF ITU H.222.0 (05/2006) SERIES H: AUDIOVISUAL AND MULTIMEDIA SYSTEMS Infrastructure of audiovisual services Transmission
(51) Int Cl.: G06F 17/00 (2006.01) G06F 15/16 (2006.01) H04N 7/10 (2006.01) H04N 21/235 (2011.01) H04L 12/58 (2006.01)
(19) TEPZZ _Z_9ZB_T (11) EP 2 2 190 B1 (12) EUROPEAN PATENT SPECIFICATION (4) Date of publication and mention of the grant of the patent: 11.09.13 Bulletin 13/37 (21) Application number: 08839334.3 (22)
Video Transmission over Wireless LAN. Hang Liu [email protected]
Video Transmission over Wireless LAN Hang Liu [email protected] Page 1 Introduction! Introduction! Wi-Fi Multimedia and IEEE 802.11e for QoS Enhancement! Error Control Techniques Page 2 Introduction!
An Introduction to VoIP Protocols
An Introduction to VoIP Protocols www.netqos.com Voice over IP (VoIP) offers the vision of a converged network carrying multiple types of traffic (voice, video, and data, to name a few). To carry out this
SINCE 1997, the ITU-T s Video Coding Experts Group
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, VOL. 13, NO. 7, JULY 2003 657 H.264/AVC in Wireless Environments Thomas Stockhammer, Miska M. Hannuksela, and Thomas Wiegand Abstract Video
How To Test Video Quality With Real Time Monitor
White Paper Real Time Monitoring Explained Video Clarity, Inc. 1566 La Pradera Dr Campbell, CA 95008 www.videoclarity.com 408-379-6952 Version 1.0 A Video Clarity White Paper page 1 of 7 Real Time Monitor
Overview of Asynchronous Transfer Mode (ATM) and MPC860SAR. For More Information On This Product, Go to: www.freescale.com
Overview of Asynchronous Transfer Mode (ATM) and MPC860SAR nc. 2 What is ATM? o Protocol that applies primarily to layer 2 of the OSI protocol stack: Application Presentation Session Transport Network
Complexity-bounded Power Control in Video Transmission over a CDMA Wireless Network
Complexity-bounded Power Control in Video Transmission over a CDMA Wireless Network Xiaoan Lu, David Goodman, Yao Wang, and Elza Erkip Electrical and Computer Engineering, Polytechnic University, Brooklyn,
A Transport Protocol for Multimedia Wireless Sensor Networks
A Transport Protocol for Multimedia Wireless Sensor Networks Duarte Meneses, António Grilo, Paulo Rogério Pereira 1 NGI'2011: A Transport Protocol for Multimedia Wireless Sensor Networks Introduction Wireless
Communication Networks. MAP-TELE 2011/12 José Ruela
Communication Networks MAP-TELE 2011/12 José Ruela Network basic mechanisms Network Architectures Protocol Layering Network architecture concept A network architecture is an abstract model used to describe
Computer Network. Interconnected collection of autonomous computers that are able to exchange information
Introduction Computer Network. Interconnected collection of autonomous computers that are able to exchange information No master/slave relationship between the computers in the network Data Communications.
Protocols. Packets. What's in an IP packet
Protocols Precise rules that govern communication between two parties TCP/IP: the basic Internet protocols IP: Internet Protocol (bottom level) all packets shipped from network to network as IP packets
TEPZZ 9 8Z87A_T EP 2 938 087 A1 (19) (11) EP 2 938 087 A1 (12) EUROPEAN PATENT APPLICATION
(19) TEPZZ 9 8Z87A_T (11) EP 2 938 087 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 28..1 Bulletin 1/44 (21) Application number: 14604.2 (1) Int Cl.: H04N 21/23 (11.01) H04N 21/488 (11.01)
How To Understand The Differences Between A Fax And A Fax On A G3 Network
The Fax on IP Networks White Paper February 2011 2 The Fax on IP Networks Contents Overview... 3 Group 3 Fax Technology... 4 G.711 Fax Pass-Through... 5 T.38 IP Fax Relay... 6 Network Design Considerations...
Chapter 6: Broadcast Systems. Mobile Communications. Unidirectional distribution systems DVB DAB. High-speed Internet. architecture Container
Mobile Communications Chapter 6: Broadcast Systems Unidirectional distribution systems DAB DVB architecture Container High-speed Internet Prof. Dr.-Ing. Jochen Schiller, http://www.jochenschiller.de/ MC
Protocol Overhead in IP/ATM Networks
Protocol Overhead in IP/ATM Networks John David Cavanaugh * Minnesota Supercomputer Center, Inc. This paper discusses the sources of protocol overhead in an IP/ATM protocol stack. It quantifies the amount
Efficient Data Recovery scheme in PTS-Based OFDM systems with MATRIX Formulation
Efficient Data Recovery scheme in PTS-Based OFDM systems with MATRIX Formulation Sunil Karthick.M PG Scholar Department of ECE Kongu Engineering College Perundurau-638052 Venkatachalam.S Assistant Professor
The EU Energy Tax Directive: overview about the proposed reform, impacts on national measures and state of play
Environmentally Related Taxes and Fiscal Reform Rome, Thursday, 15 December 2011 The EU Energy Tax Directive: overview about the proposed reform, impacts on national measures and state of play A short
Applicability of UDP-Lite for Voice over IP in UMTS Networks
Applicability of -Lite for Voice over IP in UMTS Networks Frank Mertz, Ulrich Engelke, Peter Vary RWTH Aachen University, Institute of Communication Systems and Data Processing (IND) D-5256 Aachen, Germany
Doro PhoneEasy 331ph
Doro PhoneEasy 331ph 1 2 6 3 4 5 English 1 Ringer indicator 2 Hanging Hook for Handset 3 Redial function 4 Volume control 5 Flash button/programming 6 Speed dial memories This device is intended for the
