
AVEM Classroom 15 - Video Signals




There are two groups of components in a video signal:

  • Color signals
  • Synchronization signals


Color Signals

The light focused on a camera's imager system is separated into the image's primary color components: red, green, and blue signals. These three signals are commonly referred to as RGB. These color signals must be combined with synchronizing information, typically called sync.


Horizontal and Vertical Signals

Sync preserves the time relationship between video frames and correctly positions the image horizontally and vertically. This information is called horizontal sync and vertical sync, usually abbreviated H and V. So R, G, B, H, and V are the five basic elements of a video signal; together they contain all the information needed for a complete video image.

   Video signals were originally scanned line by line, in both the camera used for capturing images and the television picture tube used for displaying them. While the technology used to capture and display images has changed dramatically, a form of image scanning still takes place in all video systems.

   Horizontal and vertical sync signals define the edges of an image. Without these signals, your picture would roll from side to side and from top to bottom.

   Inside each camera is a sync generator that produces the horizontal and vertical sync pulses. Horizontal sync pulses time the edges of the image; they determine the point where the pixels end at the right edge of the screen. Vertical sync pulses time the top and bottom of the screen; vertical sync begins at the bottom-right pixel.

   Sync signals are also used to ensure smooth switching between video sources, enabling switching to occur during the short interval between complete video frames. This is known as vertical interval switching and is the preferred source-switching process for all video systems. Switching sources at any point other than during the vertical interval is known as crash switching and produces a burst of picture instability until the video system re-synchronizes.


Scan Rates

The horizontal scan rate describes the number of horizontal lines displayed per second. The vertical scan rate describes the number of times per second the entire image, or frame, appears. Together, the horizontal and vertical scan rates are known as the scan rate.

   The number of horizontal scan lines in an image depends on the video content. For example, a DVD might be 720 x 480 pixels, with a refresh rate of 30 frames per second, while a computer might be 2560 x 1440, with a refresh rate of 60 frames per second. For both of these types of content to be viewed on the same display, the display must be capable of playing back both scan rates.

   Table 5-4 shows a select sample of scan rates for the visible pixels in analog video, digital video, and computer graphics.



   Each format also defines the scan rate according to a specific horizontal frequency: the number of horizontal lines a device delivers each second, measured in kilohertz, because many horizontal scans are completed in just one vertical scan. The vertical scan rate describes the number of complete video fields delivered per second, measured in hertz; this may also be called the vertical sync rate. Many modern video displays can display more video frames than the source may be delivering, and they can refresh the image at a higher frequency.
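As a rough illustration of the arithmetic (a sketch with a made-up function name, counting only visible lines and ignoring the blanking intervals that real signals add):

```python
# Approximate horizontal scan rate: visible lines per frame x frames per
# second. Actual broadcast figures are somewhat higher because each frame
# also includes non-visible blanking lines.

def horizontal_scan_rate_khz(visible_lines: int, frames_per_second: float) -> float:
    """Horizontal scan rate in kilohertz, visible lines only."""
    return visible_lines * frames_per_second / 1000.0

# DVD-class content: 480 visible lines at 30 frames per second
print(horizontal_scan_rate_khz(480, 30))   # 14.4 kHz (approximate)

# Computer content: 1440 visible lines at 60 frames per second
print(horizontal_scan_rate_khz(1440, 60))  # 86.4 kHz (approximate)
```

Note how the many-horizontal-scans-per-vertical-scan relationship shows up in the units: lines per frame in the hundreds or thousands turns hertz-range frame rates into kilohertz-range horizontal frequencies.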

   Some older display technologies were unable to display all of the horizontal lines in a video frame at the same time, resulting in a system that displayed odd-numbered lines in one pass and even-numbered lines in the next. This method of display scanning is known as an interlaced display and is indicated with an i suffix (such as 480i), while a system that can display all lines in a single pass is known as a progressive display and is indicated with a p suffix (such as 1080p). The frame rate of an interlaced image is one-half its vertical scan rate.
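The interlaced/progressive relationship can be sketched in a few lines (a minimal illustration; the function name is invented here):

```python
# An interlaced signal splits each frame into two fields (odd lines, then
# even lines), so frames per second = fields per second / 2. A progressive
# signal delivers the whole frame in one pass.

def frame_rate(vertical_scan_hz: float, interlaced: bool) -> float:
    """Complete frames per second given the vertical scan rate in hertz."""
    return vertical_scan_hz / 2 if interlaced else vertical_scan_hz

print(frame_rate(60, interlaced=True))   # 480i at 60 Hz -> 30.0 frames/sec
print(frame_rate(60, interlaced=False))  # 720p at 60 Hz -> 60 frames/sec
```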


Bit Depth of Video Signals

The bit depth of a digital signal is a measure of how accurately that signal can potentially be translated back into an analog signal at the end of its journey to a video imaging device or an audio device. The technologies used in digital imaging devices and video display systems continue to advance. Although the color bit depth has mostly remained at 8 bits per channel (RGB), High Dynamic Range (HDR) imaging uses more bits per channel, although some image displays are not yet able to render the extra bits.

   The bit depth of the color channels of video images has been steadily increasing since the introduction of wide-color gamut technologies such as UHD Blu-ray, which offers 10 bits of color resolution per channel.

Some image composition software can work at 16 bits per color channel, but displays with this capability are not yet widely available.
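To see what each extra bit buys, a quick calculation (plain Python, helper names invented here) of the shade counts at the bit depths mentioned above:

```python
# Each bit doubles the number of shades per channel: 2**bits shades.
# Combined R, G, and B colors are (2**bits)**3.

def shades_per_channel(bits: int) -> int:
    return 2 ** bits

def total_colors(bits: int) -> int:
    return shades_per_channel(bits) ** 3

for bits in (8, 10, 16):
    print(bits, shades_per_channel(bits), total_colors(bits))
# 8 bits:  256 shades/channel, 16,777,216 colors
# 10 bits: 1,024 shades/channel, 1,073,741,824 colors (UHD Blu-ray)
# 16 bits: 65,536 shades/channel (some compositing software)
```

Moving from 8 to 10 bits per channel multiplies the available colors by 64, which is why wide-gamut formats such as UHD Blu-ray adopted it.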

Signal Quality and Bandwidth

After video has been captured and converted into signals, you need to transport those signals to devices for display. High-resolution video images take up a lot of bandwidth. To preserve image detail, you need high-quality cable, as well as video equipment that can switch, process, and distribute these signals without degrading them.

   In electronics, bandwidth is the range of frequencies that can pass through a circuit. The difference between the highest and lowest frequencies a circuit can detect and react to is that circuit's bandwidth.

   A video signal covers a range of frequencies:

  • Lower frequencies, in the hertz range, include vertical sync.
  • Middle frequencies, in the kilohertz range, include horizontal sync.
  • Higher frequencies, in the megahertz range, include detailed picture information.
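A back-of-the-envelope estimate (a hypothetical helper computing the uncompressed RGB data rate, ignoring blanking overhead and compression) shows why high-resolution images push signal paths into the megahertz range and beyond:

```python
# Uncompressed video data rate: pixels/frame x frames/sec x bits/pixel.
# Real transport adds blanking overhead; compression reduces the rate greatly.

def raw_bit_rate_mbps(width: int, height: int, fps: float,
                      bits_per_channel: int = 8) -> float:
    """Uncompressed RGB data rate in megabits per second."""
    bits_per_pixel = bits_per_channel * 3  # R, G, and B channels
    return width * height * fps * bits_per_pixel / 1e6

print(round(raw_bit_rate_mbps(720, 480, 30), 1))    # DVD-class: ~248.8 Mbit/s
print(round(raw_bit_rate_mbps(2560, 1440, 60), 1))  # 1440p60: ~5308.4 Mbit/s
```

Even modest standard-definition video produces hundreds of megabits per second uncompressed, which is why cable quality and distribution equipment matter so much.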

A video consists of a continuous sequence of still images, similar to the individual frames on a strip of motion picture film. The camera's lens focuses light onto a recording medium, capturing each frame in rapid succession.

In traditional film cameras, when the shutter opens briefly, light exposes the film, triggering chemical changes that form an image. Once the shutter closes, the film advances to position the next frame for exposure.
