Transcript and Presenter's Notes

Title: Video


1
Video
  • To begin looking at video we need to first go
    over some terms used in its description. Here
    comes the jargon!

2
Describing the video stream
  • Frame rate
  • The number of still pictures per unit of time of
    video:
  • old mechanical cameras: from six or eight fps;
  • new professional cameras: 120 or more frames per
    second.
  • PAL and SECAM standards specify 25 fps;
  • NTSC specifies 29.97 fps.
  • To achieve the illusion of a moving image, the
    minimum frame rate is about ten frames per
    second.
  • Film is shot at the slower frame rate of 24 fps
    (see the sketch below).
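  • A quick sketch (in Python) of what these rates
    mean per frame; the numbers are from this slide
    and the helper name is just illustrative:

      def frame_stats(fps, seconds=60.0):
          """Milliseconds per frame and frames shown in `seconds`."""
          return 1000.0 / fps, fps * seconds

      for fps in (24, 25, 29.97):
          ms_per_frame, frames_per_minute = frame_stats(fps)
          print(f"{fps:>6} fps -> {ms_per_frame:5.1f} ms per frame, "
                f"{frames_per_minute:,.0f} frames per minute")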

3
  • Interlacing vs progressive scan
  • Interlaced
  • Interlacing was invented as a way to achieve good
    visual quality within the limitations of a narrow
    bandwidth.
  • The horizontal scan lines of each interlaced
    frame are numbered consecutively and partitioned
    into two fields: the odd field, consisting of the
    odd-numbered lines, and the even field, consisting
    of the even-numbered lines (see the sketch below).
    NTSC, PAL and SECAM are interlaced formats.
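  • A minimal sketch of that field split, assuming
    NumPy and a small placeholder frame (only the row
    indexing matters):

      import numpy as np

      frame = np.arange(12).reshape(6, 2)   # 6 scan lines, 2 pixels each

      # Lines are numbered consecutively from the top (1, 2, 3, ...).
      odd_field = frame[0::2]    # lines 1, 3, 5 (odd-numbered lines)
      even_field = frame[1::2]   # lines 2, 4, 6 (even-numbered lines)

      # A deinterlacer weaves the two fields back into one frame.
      rebuilt = np.empty_like(frame)
      rebuilt[0::2] = odd_field
      rebuilt[1::2] = even_field
      assert (rebuilt == frame).all()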

4
  • Progressive scan
  • Progressive or non-interlaced scanning is any
    method for displaying, storing or transmitting
    moving images in which the lines of each frame
    are drawn in sequence. This is in contrast to the
    interlacing used in traditional television
    systems.
  • Progressive scan is used in most CRTs used as
    computer monitors. It is also becoming
    increasingly common in high-end television
    equipment, which is often capable of performing
    deinterlacing so that interlaced video can still
    be viewed.

5
  • Advantages of progressive scan include
  • Subjectively increased vertical resolution. The
    perceived vertical resolution of an interlaced
    image is usually equivalent to multiplying the
    active lines by about 0.6. This explains, for
    example, why HDTV standards such as 1080i
    (1920x1080, interlaced) in most cases deliver a
    quality equal to or slightly poorer than that of
    720p (1280x720, progressive), despite containing
    far more lines of resolution: 1080 x 0.6 gives
    roughly 650 perceived lines, fewer than 720p's
    720 progressive lines.
  • No flickering of narrow horizontal patterns
  • Simpler video processing equipment
  • Easier compression

6
  • Video resolution
  • The size of a video image is measured in pixels
    for digital video or horizontal scan lines for
    analog video.
  • Standard-definition television (SDTV) is
    specified as 640x480i60 for NTSC and 720x576i50
    for PAL or SECAM resolution. New high-definition
    televisions (HDTV) are capable of resolutions up
    to 1920x1080p60, i.e. 1920 pixels per scan line
    by 1080 scan lines, progressive, at 60 frames per
    second.

7
(No Transcript)
8
  • Aspect ratio
  • Aspect ratio describes the dimensions of video
    screens and video picture elements.
  • traditional television screen is 4:3, or 1.33:1.
  • High definition televisions: 16:9, or about
    1.78:1.
  • full 35 mm film (also known as "Academy
    standard") is around 1.37:1.

9
  • Some common aspect ratios

10
  • Digital Video
  • Digital video is a type of video recording system
    that works by using a digital, rather than
    analog, representation of the video signal.
  • This generic term is not to be confused with the
    name DV, which is a specific type of digital
    video.
  • Most often recorded on tape, then distributed on
    optical discs, usually DVDs.
  • Some exceptions: camcorders that record directly
    to DVDs, and Digital8 camcorders, which encode
    digital video on conventional analog tapes.

11
  • Main issue with digital video
  • Digital video in its raw form can be huge,
  • e.g. 25 fps of 720 x 576 pixels at 3 bytes per
    pixel (see the calculation below).
  • This leads to two interrelated problems:
  • large storage requirements;
  • high bandwidth requirements during playback and
    recording.
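  • A back-of-the-envelope check of that figure in
    Python, assuming 3 bytes (24-bit RGB) per pixel at
    PAL resolution and frame rate:

      width, height, bytes_per_pixel, fps = 720, 576, 3, 25

      bytes_per_frame = width * height * bytes_per_pixel   # 1,244,160 bytes
      bytes_per_second = bytes_per_frame * fps             # 31,104,000 bytes
      mbit_per_second = bytes_per_second * 8 / 1_000_000   # ~249 Mbit/s

      print(f"{bytes_per_frame:,} bytes per frame")
      print(f"{bytes_per_second / 1e6:.1f} MB/s, about {mbit_per_second:.0f} Mbit/s")
      print(f"one hour of raw video: {bytes_per_second * 3600 / 1e9:.0f} GB")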

12
  • The solution: compression
  • In fact this problem of large storage and high
    bandwidth requirements is a defining feature of
    much multimedia, not just video.
  • We also must deal with compression in the context
    of sound.
  • For now we focus on digital video.

13
Video Compression
  • Compression is a conversion of data to a format
    that requires fewer bits, usually performed so
    that the data can be stored or transmitted more
    efficiently.
  • The inverse process is known as decompression.
    Depending on the compression method used to
    encode the original, decompression may or may not
    yield an exact copy of the original unencoded
    data.
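  • Not a video codec, but a tiny run-length coder
    (Python) makes these terms concrete: compression
    maps the data to fewer symbols, decompression
    inverts it, and because this scheme is lossless
    the round trip yields an exact copy:

      def compress(data):
          runs = []
          for ch in data:
              if runs and runs[-1][0] == ch:
                  runs[-1] = (ch, runs[-1][1] + 1)
              else:
                  runs.append((ch, 1))
          return runs

      def decompress(runs):
          return "".join(ch * count for ch, count in runs)

      original = "aaaaabbbcccccccd"
      encoded = compress(original)
      assert decompress(encoded) == original   # exact copy recovered
      print(encoded)   # [('a', 5), ('b', 3), ('c', 7), ('d', 1)]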

14
  • Two broad types of compression
  • Temporal and spatial (a slightly false
    dichotomy; many codecs use a combination of the
    two).
  • Before we look at these remember what we mean by
    a frame.
  • Frame
  • A frame is a set of all pixels that
    (approximately) correspond to a single point in
    time. However, in interlaced video, the set of
    horizontal lines with even numbers and the set
    with odd numbers are grouped together in fields.
    The term "picture" can refer to a frame or a
    field.

15
  • Spatial Compression (intraframe)
  • Compresses each frame individually. It treats the
    video stream as a series of individual images.
  • takes advantage of the fact that the human eye is
    unable to distinguish small differences in colour
    as easily as it can changes in brightness, and so
    very similar areas of colour can be "averaged
    out" in a similar way to JPEG images (see the
    sketch below).
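  • A simplified illustration of that "averaging out"
    (Python with NumPy): keep brightness at full
    resolution but average colour over 2x2 blocks, as
    4:2:0 chroma subsampling does. Real intraframe
    codecs add a DCT and quantisation on top; the
    arrays here are random placeholders:

      import numpy as np

      h, w = 4, 6
      luma = np.random.rand(h, w)      # brightness: kept at full resolution
      chroma = np.random.rand(h, w)    # one colour-difference channel

      # Average each 2x2 block of chroma -> a quarter of the samples.
      chroma_sub = chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

      print(chroma.shape, "->", chroma_sub.shape)   # (4, 6) -> (2, 3)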

16
  • Temporal compression (interframe)
  • With temporal compression only the changes from
    one frame to the next are encoded.
  • Often a large number of the pixels will be the
    same on a series of frames. (blue sky)
  • By identifying differences between consecutive
    frames of video, and storing just those frame
    differences in the compressed file, temporal
    compression can dramatically decrease video data
    size.
  • Frames compressed with temporal compression are
    called interframes (see the sketch below).
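  • A minimal sketch of frame differencing (Python
    with NumPy, two placeholder greyscale frames).
    Real interframe codecs go further (motion
    estimation, entropy coding), but the core idea is
    storing only what changed:

      import numpy as np

      frame1 = np.zeros((4, 4), dtype=np.int16)
      frame2 = frame1.copy()
      frame2[1, 2] = 200               # only one pixel changes

      diff = frame2 - frame1           # mostly zeros -> compresses well
      print(f"{(diff != 0).sum()} of {diff.size} pixels changed")

      # Decoding: the previous frame plus the stored difference.
      assert ((frame1 + diff) == frame2).all()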

17
Codecs
  • A Codec is a device or program capable of
    performing Encoding and Decoding on a data stream
    or signal.
  • The word "codec" is a portmanteau of any of the
    following: 'Compressor-Decompressor',
    'Coder-Decoder', or 'Compression/Decompression
    algorithm'.

18
  • Cinepak
  • developed by Radius Inc to accommodate 1x (150
    kbyte/s) CD-ROM transfer rates.
  • primary video codec of early versions of
    QuickTime and Microsoft Video for Windows,
  • superseded by Sorenson Video, Intel Indeo, and
    most recently MPEG-4 and H.264.
  • based on vector quantization, which is a
    significantly different algorithm from the
    discrete cosine transform (DCT) algorithm used by
    most current codecs (in particular the MPEG
    family, as well as JPEG). This permitted
    implementation on relatively slow CPUs, but
    tended to result in blocky artifacting at low
    bitrates.

19
  • Sorenson codec (Sorenson Video Codec, Sorenson
    Video Quantizer or SVQ)
  • devised by the company Sorenson Media
  • The Sorenson codec first appeared in QuickTime 3.

  • The specifications of the codec were not public,
    and for a long time the only way to play back
    Sorenson video was to use Apple's QuickTime
    player, or MPlayer for Unix/Linux, which in turn
    piggy-backed on Microsoft Windows DLL files
    extracted from Apple's player.

20
  • According to an anonymous developer of FFmpeg,
    reverse engineering of the SVQ3 codec revealed it
    as a tweaked version of H.264. The same developer
    also added support for this codec to FFmpeg,
    making native playback on all platforms supported
    by FFmpeg possible.

21
  • MPEG-1 video
  • Originally designed for 1.5 Mbit/second data
    rates and 352x240 resolution.
  • Later improvements allow for up to 4 Mbit/second
    for better quality.
  • MPEG-1 is the most compatible format in the MPEG
    family: it is playable in almost all computers
    and VCD/DVD players.
  • One big disadvantage of MPEG-1 is that it
    supports only progressive scan video, which is
    part of the reason that prompted the more
    advanced MPEG-2.

22
  • MPEG-2 Video
  • Standard for DVD encoding
  • similar to MPEG-1, but also provides support for
    interlaced video (the format used by broadcast TV
    systems)
  • not optimized for low bit-rates (less than
    1 Mbit/s), but outperforms MPEG-1 at 3 Mbit/s and
    above.
  • All standards-conforming MPEG-2 Video decoders
    are fully capable of playing back MPEG-1 Video
    streams.

23
  • Mpeg 2 uses a combination of temporal and spatial
    encoding
  • An MPEG-2 video bitstream is made up of a series
    of data frames encoding pictures.
  • The three ways of encoding a picture are
    intra-coded (I picture), forward predictive (P
    picture) and bidirectional predictive (B
    picture).
  • I pictures encode for spatial redundancy, P and B
    pictures for temporal redundancy. Because
    adjacent frames in a video stream are often
    well-correlated, P pictures may be 10% of the
    size of I pictures, and B pictures 2% of their
    size.

24
  • The sequence of different frame types is called
    the Group of Pictures (GOP) structure.
  • There are many possible structures but a common
    one is 15 frames long, and has the sequence
    IBBPBBPBBPBBPBB (see the estimate below).
  • The ratio of I, P and B pictures in the GOP
    structure is determined by the nature of the
    video stream and the bandwidth constraints on the
    output stream, although encoding time may also be
    an issue.
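  • A rough estimate (Python) of how that GOP pays
    off, using the size ratios from the previous
    slide (P about 10%, B about 2% of an I picture);
    the I-picture size is an arbitrary assumption just
    to make the arithmetic concrete:

      gop = "IBBPBBPBBPBBPBB"
      i_size_kb = 100.0                         # assumed I-picture size
      ratio = {"I": 1.0, "P": 0.10, "B": 0.02}

      gop_kb = sum(i_size_kb * ratio[p] for p in gop)   # 100 + 4*10 + 10*2 = 160
      print(f"GOP of {len(gop)} frames: {gop_kb:.0f} kB "
            f"(vs {i_size_kb * len(gop):.0f} kB if all were I pictures)")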

25
  • MPEG-4
  • introduced in late 1998,
  • the designation for a group of audio and video
    coding standards and related technology agreed
    upon by the ISO/IEC Moving Picture Experts Group
    (MPEG)
  • The primary uses for the MPEG-4 standard are web
    (streaming media) and CD distribution,
    conversational (videophone), and broadcast
    television.

26
  • And that's not even the beginning of codecs
  • http://www.fourcc.org/codecs.php

27
Container formats
  • A container format is a computer file format that
    can contain various types of data, compressed by
    means of standardized codecs.
  • The container file is used to identify and
    interleave the different data types. Simpler
    container formats can contain different types of
    audio codecs, while more advanced container
    formats can support audio, video, subtitles,
    chapters, and metadata (tags), along with the
    synchronization information needed to play back
    the various streams together (a toy sketch
    follows below).
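  • A toy sketch (Python) of what a container does:
    tag packets from separate audio and video streams
    with timestamps and interleave them into one
    sequence so a player can keep the streams in
    sync. All names here are illustrative; real
    containers (AVI, MP4, Matroska, ...) define
    binary layouts:

      from dataclasses import dataclass

      @dataclass
      class Packet:
          stream: str        # "video" or "audio"
          timestamp: float   # presentation time in seconds
          payload: bytes     # compressed data from the codec

      video = [Packet("video", t / 25, b"...") for t in range(5)]  # 25 fps
      audio = [Packet("audio", t / 10, b"...") for t in range(3)]  # 0.1 s chunks

      # Interleave by timestamp - the synchronization information.
      for p in sorted(video + audio, key=lambda p: p.timestamp):
          print(f"{p.timestamp:5.2f}s  {p.stream}")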

28
Audio containers
  • Some containers are exclusive to audio, such as
  • WAV (RIFF file format, widely used on Windows
    platform)
  • AIFF (AIFF file format, widely used on Mac OS
    platform)

29
More flexible containers
  • Other flexible containers can hold many types of
    audio and video, as well as other media.
  • The most popular multi-media containers are
  • AVI (the obsolete standard Microsoft Windows
    container, also based on RIFF)
  • MOV (standard QuickTime container)
  • MPEG-2 TS (acronym of Transport Stream, standard
    container for digital broadcasting)

30
  • MP4 (standard container for the MPEG-4 multimedia
    portfolio)
  • Ogg (standard container for Xiph.org codecs)
  • ASF (standard container for Microsoft WMA and
    WMV)
  • RealMedia (standard container for RealVideo and
    RealAudio)
  • Matroska (not standard for any codec or system,
    but it is an open standard).
  • 3gp (used by many mobile phones)

31
Lesser known containers
  • There are many other container formats, such as
    NUT, MPEG, MXF, ratDVD, SVI, and
  • the DivX Media Format (which is becoming very
    well known).

32
Enough definitions! One more
  • DV: digital video. Yet another standard.
  • A video format launched in 1996, and, in its
    smaller tape form factor, MiniDV,
  • it has since become one of the standards for
    consumer and semiprofessional video production.
  • The DV specification (originally known as the
    Blue Book, current official name IEC 61834)
    defines both the codec and the tape format.

33
  • Features include intraframe compression for
    uncomplicated editing,
  • a standard interface for transfer to non-linear
    editing systems (FireWire also known as IEEE
    1394)
  • good video quality, especially compared to
    earlier consumer analog formats such as 8 mm,
    Hi-8 and VHS-C.
  • DV now enables filmmakers to produce movies
    inexpensively, and is often associated with
    no-budget cinema.

34
  • DV's video compression
  • DV uses DCT intraframe compression at a fixed
    bitrate of 25 megabits per second (25.146
    Mbit/s), which, when added to the sound data
    (1.536 Mbit/s), the subcode data, error
    detection, and error correction (approx. 8.7
    Mbit/s), amounts in all to roughly 36 megabits
    per second (approx. 35.382 Mbit/s), or one
    gigabyte every four minutes (checked in the
    sketch below).
  • At equal bitrates, DV performs somewhat better
    than the older MJPEG codec, and is comparable to
    intraframe MPEG-2. (Note that many MPEG-2
    encoders for real-time acquisition applications
    do not use interframe compression.)
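  • A quick check of those figures (Python): add up
    the streams and see how long a gigabyte lasts at
    that rate:

      video, audio, overhead = 25.146, 1.536, 8.7   # Mbit/s, from this slide
      total_mbit_s = video + audio + overhead       # ~35.4 Mbit/s

      seconds_per_gb = 1e9 * 8 / (total_mbit_s * 1e6)
      print(f"{total_mbit_s:.1f} Mbit/s -> one GB every "
            f"{seconds_per_gb / 60:.1f} minutes")   # roughly four minutes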

35
  • DV's audio compression
  • DV allows either 2 digital audio channels
    (usually stereo) at 16 bit resolution and 48 kHz
    sampling rate,
  • or 4 digital audio channels at 12 bit resolution
    and 32 kHz sampling rate.
  • For professional or broadcast applications, 48
    kHz is used almost exclusively.
  • In addition, the DV spec includes the ability to
    record audio at 44.1 kHz (the same sampling rate
    used for CD audio), although in practice this
    option is rarely used.
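  • Deriving the 1.536 Mbit/s audio figure quoted on
    the previous slide from these parameters (Python);
    note that both DV audio modes come out to the
    same rate:

      def audio_rate_mbit(channels, bits, sample_rate_hz):
          return channels * bits * sample_rate_hz / 1_000_000

      print(audio_rate_mbit(2, 16, 48_000))   # 1.536 (16-bit stereo at 48 kHz)
      print(audio_rate_mbit(4, 12, 32_000))   # 1.536 (four 12-bit channels at 32 kHz)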

36
  • Variations on DV
  • Sony's DVCAM is a semiprofessional variant of the
    DV standard that uses the same cassettes as DV
    and MiniDV, but transports the tape 50% faster,
    leading to a higher track width of 15
    micrometres.
  • The codec used is the same as DV, but because of
    the greater track width available to the recorder
    the data are much more robust, producing 50%
    fewer of the errors known as dropouts.

37
  • Other variants include DVCPRO and HDV.
  • Further reading:
  • http://en.wikipedia.org/

38
TO DO
  • Start up Adobe Premiere Pro.
  • Begin a new project that is PAL and 48 kHz.
  • Acquaint yourself with the different areas of the
    interface:
  • the timeline
  • the bin
  • Import some video from the samples folder in the
    Premiere folder into your bin and drop it onto
    the timeline on the Video 1 track and Video 2
    track.