Understanding the Audience Part 2

Transcript and Presenter's Notes

1
Understanding the Audience - Part 2
  • Michael Svennevig
  • 16, Clarendon Place
  • m.svennevig@leeds.ac.uk
  • Ext 31604
  • 0113 343 1604

2
ACTIVE AUDIENCES: Uses and Gratifications - The Problems
  • Circularity / self-justification
  • Media use not always controlled by the individual
  • Relies on lists etc.
  • If you ask questions, you get answers

3
ACTIVE AUDIENCES: Uses and Gratifications - The Problems
  • If you ask stupid questions, you get stupid
    answers
  • If you ask leading questions, you get leading
    answers
  • If you ask incomplete questions, you get
    incomplete answers

4
AUDIENCES Reception Studies
  • Finding out what people do with the media
    messages they encounter (either actively or
    passively)
  • Linked to literary-critical studies.
  • Media use reflects socio-cultural context and
    gives meaning to cultural expressions and
    experiences.

5
AUDIENCES Reception Studies
  • Mainly uses qualitative methodologies -
    interviews, focus groups, observational,
    ethnographic

6
AUDIENCES Reception Studies
  • Media text has to be read: audience perceptions
    construct the meaning and experience of media
    output
  • Media use is situation-specific and involves
    interpretative communities

7
AUDIENCES Reception Studies
  • Media use reflects socio-cultural context and
    gives meaning to cultural expressions and
    experiences.
  • Audiences are never passive, nor are all members
    equal, since some will be more experienced or
    more active than others

8
AUDIENCES Reception Studies
David Morley in the UK set the current scene in
the 1980s: "For instance, the same man may be
simultaneously a productive worker, a trade union
member, a supporter of the Social Democratic
Party, a consumer, a racist, a home owner, a wife
beater and a Christian." (David Morley, Family
Television, Comedia, London, 1986, p.42)
9
AUDIENCES Reception Studies
An individual can make different readings of a text:
  • OPPOSITIONAL - rejecting framework of media argument/logic
  • NEGOTIATED - takes middle way
  • DOMINANT - follows mediated logic
10
AUDIENCES Reception Studies
The problem with this style of approach is the
almost complete inability to consider "the
audience" at all. Taken to its logical extreme,
each person is a separate audience in his or her
own right.
11
QUESTIONS THAT NEED TO BE ASKED ABOUT MEDIA
How are they used? Ron Lembo (Thinking Through Television, 2000):
  • DISCRETE USE - watch programmes, select content, attention
  • UNDIRECTED USE - watch TV, some selective attention
  • CONTINUOUS USE - TV is on, some attention, zapping, grazing, other activities co-exist
12
QUESTIONS THAT NEED TO BE ASKED ABOUT MEDIA
How are they used? Denis McQuail (Audience Analysis, 1997):
  • PRIVATE USE - for personal or immediate social use, possibly reflecting demographics, ethnicity etc.
  • PUBLIC USE - social and societal sharing of meanings, experiences, emotions, spectacle and tradition; cultural transmission
13
QUESTIONS THAT NEED TO BE ASKED ABOUT MEDIA
How are they used? Sonia Livingstone (European Journal of Communication, 2004):
  • INTERPRETATION - text and readers; encoding and decoding of symbols
  • Active and/or Passive - depending on the occasion and the context
  • Content - rather than form or channel
  • Fans - not just viewers
14
AUDIENCE RESEARCH
  • Some real research projects
  • futura.com survey: who uses what, and why
    (mobiles, digital TV, internet etc.)
  • joint research with TV, cable and advertising
    companies

15
AUDIENCE RESEARCH
  • What do you want to find out?
  • actions
  • thoughts/opinions
  • descriptions
  • mass/representative
  • subgroup/depth
  • reactions to/impacts of/effects of media
  • media content

16
AUDIENCE RESEARCH
  • How are you going to find it out?
  • QUANTITATIVE METHODS
  • surveys
  • audits
  • one-off or longitudinal or repeated
  • How many, who, when, how often, where?

17
AUDIENCE RESEARCH
  • What assumptions have you made?
  • people are similar to each other, at least in
    some ways
  • hidden cognitive processes can be revealed
    through research
  • behaviour can be interpreted as evidence of
    thought
  • people are basically truthful when giving
    responses
  • questions are understood, and meanings of words
    are constant

18
AUDIENCE RESEARCH
  • What sort of data/analysis do you want to end up
    with?
  • support for existing theories/models
  • material to develop new theories/models
  • results that have a direct application to
    practice
  • statistical accuracy and data that can be
    manipulated
  • rich data that allow detailed interpretation at
    the individual or small group setting

19
AUDIENCE RESEARCH
  • KEY CONCEPTS AND TERMS
  • POPULATION
  • SAMPLE
  • VALIDITY
  • RELIABILITY
  • ERROR
  • ESTIMATE
  • REPRESENTATIVENESS

20
  • KEY CONCEPTS AND TERMS
  • POPULATION
  • All of those people you want to know about
  • All adults
  • All TV viewers
  • All with a TV set
  • Internet users
  • The British
  • 30-45 year old men who drive 4x4s
  • Violent people

21
  • KEY CONCEPTS AND TERMS
  • SAMPLE
  • A SUBSET of those things/people you want to know
    about
  • 200 adults
  • 15,000 TV viewers
  • 10 focus groups of 6 people each
  • 2000 people randomly telephoned
  • 500 people who complete an on-line questionnaire
  • 100,000 people who enter a competition in a
    magazine

22
  • KEY CONCEPTS AND TERMS
  • SAMPLING
  • For generating ideas and concepts to take
    further
  • Go for the interesting, the weird, to extremes,
    cover the maximum range
  • BUT
  • For finding out about the world: potential
    audiences, tastes among a given population,
    opinions held by TV viewers
  • Go for reliable estimates of what is out there,
    establish what the norms are

23
  • KEY CONCEPTS AND TERMS
  • SAMPLING
  • SELECTION FROM LISTS
  • get a list of the population (electoral register,
    list of house addresses, membership lists,
    guarantees and warranty cards, list of students
    etc.)
  • Take every nth name/address
  • e.g. if you want 2,000 names from a list of 14,000,
    start anywhere and select every 7th name (a
    minimal sketch of this appears below)
  • e.g. knock at every 10th front door
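As a rough illustration of the every-nth-name approach above, here is a minimal Python sketch; the register of names, the function name and the sizes are all hypothetical.

import random

def systematic_sample(population, sample_size):
    """Take every nth element from a list, e.g. every 7th name
    from a list of 14,000 to end up with roughly 2,000 names."""
    step = len(population) // sample_size      # 14000 // 2000 = 7
    start = random.randrange(step)             # start anywhere within the first interval
    return population[start::step][:sample_size]

register = [f"person_{i}" for i in range(14000)]   # hypothetical population list
sample = systematic_sample(register, 2000)         # 2,000 names, every 7th entry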

24
  • KEY CONCEPTS AND TERMS
  • SAMPLING
  • RANDOM METHODS
  • Basic methods
  • draw names from hat
  • select people who have a birthday in the next 3
    weeks
  • Use sources such as telephone directories, but
    with the sample selected by a computer drawing
    random numbers (see the sketch below)
  • Lists may not be available, or may have unwanted
    characteristics
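A minimal sketch of the computer-drawn random selection mentioned above, assuming the source list (e.g. a telephone directory) has already been loaded; the directory entries and sizes here are invented.

import random

directory = [f"entry_{i}" for i in range(50000)]   # hypothetical source list, e.g. a telephone directory

random.seed(42)                            # fixed seed only so the illustration is repeatable
sample = random.sample(directory, 2000)    # 2,000 entries drawn at random, no repeats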

25
  • KEY CONCEPTS AND TERMS
  • SAMPLING
  • NON-RANDOM METHODS
  • Define the characteristics you want
  • Then find people who match these
  • use selection questions in an interview
  • select people on basis of answers/characteristics
  • Often called QUOTA sampling; very widely used,
    since it is relatively cheap and quick (a quota
    sketch follows below)
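A minimal sketch of quota sampling under assumed quotas: screen each respondent with selection questions and keep only those who fall into a cell that is not yet full. The quota targets and screening fields below are hypothetical.

quotas = {("female", "16-34"): 50, ("female", "35+"): 50,
          ("male", "16-34"): 50, ("male", "35+"): 50}
filled = {cell: 0 for cell in quotas}
recruited = []

def try_recruit(respondent):
    """respondent is a dict with 'sex' and 'age_band' taken from screening questions."""
    cell = (respondent["sex"], respondent["age_band"])
    if cell in quotas and filled[cell] < quotas[cell]:
        filled[cell] += 1
        recruited.append(respondent)
        return True
    return False   # out of scope, or that quota cell is already full

try_recruit({"sex": "female", "age_band": "16-34"})   # True until 50 such people are recruited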

26
  • KEY CONCEPTS AND TERMS
  • SAMPLING
  • PROBLEMS
  • Reality gets in the way of theory
  • Can't really do proper random sampling
  • some people not in, or refuse to help
  • need to ensure a sample of different areas and
    different people
  • tower blocks versus leafy suburbs, students
    versus elderly, working versus unemployed, with
    children and without, etc, etc.
  • Some people are more appealing/less scary than
    others, and tend to get recruited

27
  • KEY CONCEPTS AND TERMS
  • SAMPLE
  • Absolutely vital to get this as right as
    possible!
  • Sample wrong, survey estimates wrong
  • Sample wrong, focus groups misleading
  • Samples wrong, failed concept, failed
    productions, end of promising career?
  • http://www.stats.gla.ac.uk/steps/glossary/sampling.html
  • http://en.wikipedia.org/wiki/Random_sampling

28
  • KEY CONCEPTS AND TERMS
  • VALIDITY/RELIABILITY
  • The techniques, methods, questions etc. are
    actually addressing the issues you are
    researching
  • "It was good to see the return of Sil the
    Galactic Slug" (Agree/Disagree)
  • Measuring whether people are in a room with a TV
    set which is switched on
  • Asking people which radio stations they listened
    to yesterday

29
  • KEY CONCEPTS AND TERMS
  • ERROR: sample-based
  • Because samples are used, there is inevitably
    error involved
  • Statistical error in survey estimates: a
    sample of 1,000 people gives results which have
    an error of at least +/- 5% (45%-55% would
    vote for Tony Blair); see the sketch below
  • Systematic error in selecting samples:
    friendly-looking people, people in shopping
    malls rather than on the street, or in the
    home, etc.
  • Volunteers, questionnaires in papers
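For the statistical-error bullet above, a minimal sketch of the standard 95% margin of error for a proportion from a simple random sample; this is pure sampling error only, and quoted survey margins are often wider once design effects and non-sampling error are allowed for.

import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for an estimated proportion p from a simple random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) for a sample of 1,000 people:
print(round(100 * margin_of_error(0.5, 1000), 1))   # about 3.1 percentage points of pure sampling error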

30
  • KEY CONCEPTS AND TERMS
  • ERROR: question-based
  • Because research often uses questions, there is
    inevitably error involved
  • leading questions
  • "Most people ... How about you?"
  • stupid questions
  • Have you stopped beating your wife?
  • phone-in polls, questionnaires sent in from
    newspapers and magazines
  • sensitive issues: swearing, sexual behaviour,
    prejudice

31
  • KEY CONCEPTS AND TERMS
  • REPRESENTATIVENESS
  • Research is based on samples, therefore all we
    get are estimates of reality, not reality itself
  • It is rarely legitimate to say things like
    "research has proved ..."
  • Research can suggest, support, imply
  • Research is only as good as the creative effort
    put into its design

32
  • KEY CONCEPTS AND TERMS
  • THE AVERAGE.
  • Beware the average person: they rarely exist;
    most people are not average in the technical
    sense of the word
  • Two measures available - Mean and Mode (see the
    sketch below)
  • The average person has 1.9 ears (Mean)
  • The typical person has 2 ears (Mode)
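A minimal sketch of the mean-versus-mode point, with an invented group of ten people (nine with two ears, one with one):

from statistics import mean, mode

ears = [2, 2, 2, 2, 2, 2, 2, 2, 2, 1]   # hypothetical counts for ten people

print(mean(ears))   # 1.9 -> the "average" person has 1.9 ears
print(mode(ears))   # 2   -> the typical person has 2 ears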

33
AUDIENCE RESEARCH METHODS
  • INDUSTRIAL BIG RESEARCH FOR MEASURING THINGS
  • AD-HOC MEDIA STUDIES
  • FOCUS GROUPS/CLINICS/HALL TESTS

34
INDUSTRIAL BIG RESEARCH
  • Television Audience Research (e.g. BARB, Nielsen,
    GfK, ATR, Arbitron - peoplemeters, portable
    people meters)
  • assess who sees/hears, what, when, (where?)
  • representativeness: high-quality sampling
  • for TV, usually continuous panel
  • for other media, usually diary, recall or single
    exposure using multiple separate survey waves
    over time (repeated measures)
  • sample size important for the accuracy of estimates

35
INDUSTRIAL BIG RESEARCH
36
INDUSTRIAL BIG RESEARCH
37
INDUSTRIAL BIG RESEARCH
  • Doesn't involve using many words

38
INDUSTRIAL BIG RESEARCH
  • Television Audience Research: BARB
  • 5,100 households, 11,500 people aged 4+
  • Representative of overall UK and individual TV
    regions
  • Annual establishment survey: 52,000 interviews per
    year
  • Sample of households drawn from Establishment
    Survey (and top-ups)
  • All TVs measured
  • Guests measured
  • Measures set use and presence of viewers

39
INDUSTRIAL BIG RESEARCH
  • Television Audience Research
  • sample composition important
  • key markets or subgroups require minimum sample
    size, therefore often over-sample these to
    maintain accuracy
  • tendency towards automation
  • tendency towards monopoly supply and shared data
  • tendency towards non-obtrusive methods (passive
    meters)

40
INDUSTRIAL BIG RESEARCH
  • The Peoplemeter

41
INDUSTRIAL BIG RESEARCH
  • Television Audience Research
  • tendency towards additional marketing and
    segmentation measures beyond simple activity
    logging
  • e.g. attitudinal measures, other media,
    purchasing habits, interests
  • constant need to incorporate change: new
    technology, new channels etc.

42
INDUSTRIAL BIG RESEARCH
  • Television Audience Research
  • Outputs
  • audience size
  • audience behaviour
  • audience composition
  • target audience data
  • demographic segmentation
  • 16-24 males, housewives with kids etc.
  • data fusion

43
INDUSTRIAL BIG RESEARCH
  • Television Audience Research
  • Analyses
  • reach (1 minute, 15 second, 1 hour, all day etc.)
  • frequency of viewing (reach and frequency are
    sketched in code below)
  • averaging/smoothing
  • weighting
  • automatic detection of rogue panellists
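For the reach and frequency analyses above, a minimal sketch of how they could be computed from peoplemeter-style session data; the panellists, session lengths and 1-minute threshold are all hypothetical.

# sessions maps each panellist to the length (in minutes) of each occasion
# on which they viewed a given channel or programme
sessions = {
    "panellist_1": [45, 10],      # two viewing occasions
    "panellist_2": [],            # never tuned in
    "panellist_3": [1],
    "panellist_4": [120, 5, 30],
}

panel_size = len(sessions)

# 1-minute reach: share of the panel with at least one occasion of >= 1 minute
viewers = {p: [m for m in occasions if m >= 1] for p, occasions in sessions.items()}
reached = [p for p, occasions in viewers.items() if occasions]
reach_pct = 100 * len(reached) / panel_size                        # 75.0

# frequency: average number of qualifying occasions among those reached
frequency = sum(len(viewers[p]) for p in reached) / len(reached)   # 2.0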

44
INDUSTRIAL BIG RESEARCH
  • Television Audience Research
  • Problems
  • sample representativeness (long-term volunteers)
  • sample sizes, especially regional and demographic
    subsamples
  • guest viewing
  • out-of-home viewing
  • unattributed viewing
  • smoothing versus noisy data

45
INDUSTRIAL BIG RESEARCH
  • Television Audience Research
  • Problems
  • presence versus attention
  • international differences in methods, samples,
    definitions
  • video and DVD playback
  • DVD recorders, TiVo, Sky Plus
  • Viewing via the internet

46
INDUSTRIAL BIG RESEARCH
  • Television Audience Research
  • Quality control
  • coincident studies (diary, telephone, personal
    visit etc.)
  • observational studies
  • effects of weighting procedures
  • panel maintenance/ recruiting replacements

47
INDUSTRIAL BIG RESEARCH
  • The Passive Personal Meter (PPM)

48
INDUSTRIAL BIG RESEARCH
  • The Passive Personal Meter (PPM)

49
INDUSTRIAL BIG RESEARCH
  • The Passive Personal Meter (PPM)

50
INDUSTRIAL BIG RESEARCH
  • The Passive Personal Meter (PPM)

51
INDUSTRIAL BIG RESEARCH
  • Radio, newspaper audiences/readership (JICRAR,
    NRS)
  • less money available for research
  • emphasis on reach, frequency of use
  • flexibility needed (out-of-home use)
  • NRS: recall of papers and magazines read in the
    past 12 months, when last read and how often
  • RAJAR: weekly diary, soon a personal meter

52
INDUSTRIAL BIG RESEARCH
  • RAJAR

53
INDUSTRIAL BIG RESEARCH
  • NRS Average Issue Readership (AIR)

54
INDUSTRIAL BIG RESEARCH
  • NRS Average Issue Readership (AIR)

55
INDUSTRIAL BIG RESEARCH
  • Measuring Internet and Web Traffic
  • assess how many hits (not the same as people,
    since one user can generate multiple hits) or
    unique users visit a given website (hits versus
    unique users are sketched in code below)
  • gives equivalent of audience size measures
  • uses tracking software which can recognise repeat
    users (unless they prevent this being detected)
  • cumulative sample size important
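A minimal sketch of hits versus unique users, assuming log entries of the form (user identifier, URL); the cookie identifiers and pages below are invented stand-ins for whatever the tracking software uses to recognise repeat users.

log = [
    ("cookie_A", "/home"), ("cookie_A", "/news"), ("cookie_A", "/home"),
    ("cookie_B", "/home"),
    ("cookie_C", "/sport"), ("cookie_C", "/sport"),
]

hits = len(log)                                  # 6 page requests in total
unique_users = len({user for user, _ in log})    # 3 distinct users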

56
INDUSTRIAL BIG RESEARCH
  • Measuring Internet and Web Traffic
  • cumulative sample composition important
  • key markets or subgroups require minimum sample
    size, therefore often over-sample these to
    maintain accuracy
  • tendency towards automation
  • tendency towards additional marketing and
    segmentation measures beyond simple activity
    logging
  • attitudinal measures, other media, purchasing
    habits, interests

57
INDUSTRIAL BIG RESEARCH
  • Measuring Internet and Web Traffic
  • comScore: 2,000,000 net users and growing daily
  • Recruited via Random Direct Dialling (RDD)
  • Software on PC (with permission)
  • Measures purchasing, surfing
  • Links with other data sources where available
    (supermarkets etc.)

58
INDUSTRIAL BIG RESEARCH
59
INDUSTRIAL BIG RESEARCH
60
INDUSTRIAL BIG RESEARCH
  • Measuring Internet and Web Traffic
  • NetRatings
  • Recruited via Random Direct Dialling (RDD)
  • Software on PC (with permission)
  • Measures purchasing, surfing
  • Links with other data sources where available
    (supermarkets etc.)

61
INDUSTRIAL BIG RESEARCH
62
INDUSTRIAL BIG RESEARCH
63
INDUSTRIAL BIG RESEARCH
  • Quality of ... Research (e.g. Audience Reaction
    measures)
  • assess the quality of the experience
  • smaller samples
  • usually panels or ad-hoc samples
  • increasingly rare

64
INDUSTRIAL BIG RESEARCH
  • Marketing and Media Barometers (e.g. Target
    Group Index)
  • assess who sees/hears, what, when, (where?), and
    consumes what
  • aims for representativeness: high-quality
    sampling
  • usually self-completion, one-off survey with
    large samples
  • also electronic home scanning panels
  • sample size important

65
INDUSTRIAL BIG RESEARCH
  • Marketing and Media Barometers (e.g. Target
    Group Index)
  • sample composition important
  • key markets or subgroups require minimum sample
    size, therefore often over-sample these to
    maintain accuracy
  • tendency towards automation (scan questionnaires,
    scan purchases in-home)
  • tendency towards additional marketing measures
    such as attitude statements
  • used as basis for data fusion

66
TGI Self-completion questionnaire
67
TGI Examples
68
TGI Examples
69
TGI Examples
70
TGI Examples
71
INDUSTRIAL BIG RESEARCH
  • Future Trends?
  • Multi-medium research vehicles
  • Multi-country
  • Increasing focus on hard-to-measure and
    attractive groups
  • Increasing use of on-line methods
  • Increasing use of passive or low involvement
    measures
  • Increasing secrecy about results

72
AUDIENCE RESEARCH
  • KEY QUESTIONS
  • What do you want to find out?
  • How are you going to find it out?
  • What assumptions have you made?
  • What sort of information/analysis do you want to
    end up with?

73
AUDIENCE RESEARCH
  • How are you going to find it out?
  • QUALITATIVE METHODS
  • focus groups
  • depth interviews
  • citizens' juries
  • hands-on clinics

74
AUDIENCE RESEARCH
  • How are you going to find it out?
  • MIXED METHODS
  • direct observation
  • evaluation/viewing sessions
  • ethnographic methods
  • Content analysis
  • diary-keeping

75
AUDIENCE RESEARCH
  • Evaluation/viewing sessions
  • Viewers' reactions in real time
  • Reaction measures constantly assessed

76
The Perception Analyser
77
The Perception Analyser
78
The Perception Analyser
79
Star Dials
The STAR System of Testing Audience Reactions is
a broadcast research tool, to measure audience
reactions in real time to audio and video
material. (www.opinion.co.uk). By means of
hand-held dials that read and register
frame-by-frame each individual's responses, the
system measures the reactions of a sampled
audience to the tested broadcast material. The
system instantaneously aggregates the individual
reactions and displays the results as a graph
synchronised with the video material.
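A minimal sketch of that aggregation step, assuming one dial reading per viewer per frame on a 0-100 scale; the viewers and readings below are invented.

from statistics import mean

dial_readings = {
    "viewer_1": [50, 62, 71, 40],   # one reading per frame
    "viewer_2": [55, 60, 65, 45],
    "viewer_3": [48, 58, 80, 42],
}

frames = len(next(iter(dial_readings.values())))
trace = [mean(readings[f] for readings in dial_readings.values()) for f in range(frames)]
# trace -> roughly [51, 60, 72, 42.3]: the aggregate reaction curve, frame by frame, for the graph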

80
Star Dials

81
AUDIENCE RESEARCH
  • Some ICS research projects
  • Viewers' understanding of violence on TV
    (BSC/ITC/BBC/Ch4 etc.)
  • Virtual ethnography cameras: TVAnytime,
    development of a personal video recorder
  • Futura.com survey: who uses what, and why
    (mobiles, digital TV, internet etc.)
  • Research into invasion of privacy