1
Quality in Tourism Assessment Feedback
  • A Presentation by Arkenford Ltd

Wave 2 (November 2007)
2
Background
  • VisitBritain is responsible for assessing
    properties throughout the UK, developing service
    and product standards to be attained by
    accommodation providers
  • Alongside assessments and ratings, the aim of the
    assessments is to provide quality advice and
    guidance
  • This way members can ensure that their property
    is able to attain the desired standards and
    ratings
  • The purpose of this market review is to provide
    feedback from participants who have recently been
    assessed by the assessment provider Quality in
    Tourism (QiT)

3
Methodology
  • Data on establishments assessed in the last 3
    months were provided by QiT
  • This included all types of accommodation
    (Serviced, Self Catering, Hostels and Holiday
    Parks)
  • An invitation to participate was emailed to all
    establishments that had been assessed in this
    period
  • Those who had provided an email address were
    sent a link to an electronic questionnaire, with
    the option to request a postal version of the
    questionnaire
  • Those with no email address were mailed a postal
    version of the questionnaire
  • A total of 3548 invitations to participate were
    sent out overall
  • We received responses from 1303 establishments,
    with 920 participants fully completing the
    questionnaire
  • This is in line with the traditionally high
    strike rate of c.30%

4
The Sample
5
Sample Profile (1)
  • Everyone assessed in June, July and August was
    asked to complete a questionnaire
  • In total 3591 questionnaires were sent out, 1303
    people responded at least in part (a 36% return
    rate), and 927 people fully completed the
    questionnaire (26%; see the check below)
  • Due to people dropping out of the internet survey
    and people skipping questions on the postal
    questionnaires, the base sizes of questions
    vary
  • On average, those questions asked of everybody
    have between 950 and 1000 responses.
  • The sample was broadly representative of the
    market
  • Serviced accommodation makes up about 56% of the
    sample
  • Over half of the responses were from 4 star
    establishments
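A quick arithmetic check of the quoted rates, a sketch only, assuming the counts above (3591 questionnaires sent, 1303 partial responses, 927 full completions) are the correct bases:

    \[ \frac{1303}{3591} \approx 0.363 \approx 36\%, \qquad \frac{927}{3591} \approx 0.258 \approx 26\% \]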

6
Sample Profile (2)
  • Throughout this presentation, the results may be
    broken down by the region, rating, sector or
    cycle type of the respondent.
  • The following table breaks down the region of the
    respondents by these other variables
  • Due to low numbers of respondents having a one
    star rating, or being in the hostel sector, these
    categories will not be shown when breaking down
    the results by rating or sector.
  • It should also be remembered that London and the
    unknown region have lower base sizes than the
    other regions
  • Any significant changes from the previous wave of
    research (carried out in June) will also be
    reported

7
Topics Covered
  • We have presented the information broadly in line
    with the questionnaire content
  • The Assessment Process
  • Reporting
  • Outcome and Future Ratings
  • Administration
  • Value of the Scheme
  • In addition, the questionnaire also covered
    issues relating to
  • The VisitBritain and QiT Websites
  • The VisitBritain e-newsletter
  • Potential scheme developments

8
The Assessment Process
9
Amount of Notice Given (1)
  • The majority of respondents report receiving
    between 1 and 2 weeks' notice before their
    assessment
  • The mean number of days' notice given is 12.65
    (exc. no notice given)
  • As with previous findings, there is variation in
    the amount of notice given between the sectors
  • Self catering establishments are more likely to
    receive a larger amount of notice
  • Over 80% of parks receive no notice (a slight
    decline on the previous wave)
  • The trend for an increased number of serviced
    establishments receiving no notice continues (up
    from 34% to 36%)

10
Amount of Notice Given (2)
  • Respondents were then asked if they felt that
    this was enough prior notice.
  • The vast majority (96%) thought that they had
    been given enough notice.
  • Despite the differences in the amount of notice
    given to self catering and serviced
    establishments, the level of satisfaction is
    similar for both types of accommodation
  • The accommodation type most likely to be
    dissatisfied is Holiday Parks
  • 98% of both re-joiners and new members thought
    they received enough notice
  • Renewals reported more dissatisfaction with the
    amount of notice given, with 5% reporting they
    didn't feel it was enough (even though they were
    less likely to receive no notice)

11
Assessor Punctuality and Manner
  • 97% of respondents reported that the assessor was
    punctual
  • Very little difference can be seen between
    different sectors, ratings, regions or cycle type
  • Comparing the results to the previous wave shows
    that the assessors are consistently seen as
    punctual
  • 99% of respondents judged the assessor to be
    polite and professional in their manner
  • Only 7 individuals reported otherwise
  • As with punctuality there is little variation
    amongst different categories or with previous
    waves of research

12
Overnight Stays in Serviced Accommodation (1)
  • Overall 56% of serviced establishments received
    an overnight visit
  • There is variation across regions as to whether
    an overnight stay was received
  • 63% of establishments in the Northwest received
    an overnight stay compared to 50% in London (NB.
    Low base size for London)
  • 46% of renewals receive an overnight stay
    compared to about 98% for new members and
    re-joiners

13
Overnight Stays in Serviced Accommodation (2)
  • As with previous waves of research, those
    properties with a higher rating are more likely
    to receive an overnight visit
  • There is no significant change overall since last
    wave

14
Number of Rooms/Units Seen (1)
  • The majority of respondents have 2 or 3 rooms
    inspected, and/or 1 unit inspected
  • The pattern of the number of rooms seen by rating
    has stayed consistent with previous findings:
    lower rated properties tend to have more rooms
    inspected
  • The pattern for the numbers of units seen shows
    greater variation; generally, those with the
    highest or lowest rating will have the most units
    seen

15
Number of Rooms/Units Seen (2)
  • The number of rooms seen has stayed relatively
    similar to the previous wave; the number of
    respondents reporting that 8 or more units were
    seen has increased by 5% since last wave
  • On average, 4.2 rooms are viewed during an
    assessment
  • The average number of units seen for a re-joiner
    is 22.3 compared to 9.3 for a new member.
  • Respondents were then asked if they felt the
    assessor had seen enough of their property to
    gain a reliable picture of the service they
    provided
  • 98% of people reported that they felt the
    assessor saw just the right amount, with 1% of
    people reporting they had not seen enough and 1%
    saying they had seen too much

16
Rating Indication
  • As with previous waves, about 95% of respondents
    were given an indication of the rating they would
    receive at the time of the visit
  • Previously it was found that lower rated
    establishments were more likely to be given a
    rating indication at the time of visit; this wave
    that trend is not present.
  • However those with a 5 star rating are still the
    least likely to be given an indication
  • Renewals are the least likely to be given an
    indication, whereas re-joiners are given an
    indication 99% of the time
  • There are no differences across regions or
    sectors

17
Information on Improvements
  • In the previous wave of research, there was a
    clear relationship between the rating of the
    establishment and whether information on
    improvements was offered, with lower ratings
    more likely to receive advice
  • In the current wave (2), this trend is no longer
    present
  • 73% of people report receiving advice regardless
    of rating (with 4 stars being the exception, with
    a slightly lower figure of 68%)
  • Sector and cycle type categories have stayed
    relatively stable with the same patterns present
  • Renewals are less likely to receive advice
  • Little difference across sectors

18
Satisfaction with Assessment
  • Respondents were asked to rate the assessment on
    a number of different factors (using a scale of
    1-7 where 1 is poor and 7 is excellent)
  • Overall satisfaction received a score of 5.75
  • As with previous research, the area rated most
    highly is "time allowed to discuss assessment",
    with a mean score of 5.89
  • "Other advice on marketing issues" received the
    lowest rating at 5.01
  • The higher the rating the higher the levels of
    satisfaction
  • Those new to the scheme show higher levels of
    satisfaction, followed by re-joiners and then
    renewals
  • On average an increase of 0.12 can be seen across
    all categories compared to last wave

19
Improving the Assessment Process
  • Respondents were asked to highlight problems/ways
    to improve the assessment process; 169 people
    gave comments. The table below summarises what
    was said
  • The most common complaints relate to the rating
    requirements themselves and a lack of information
    and advice on marketing

20
Improving the Assessment Process
  • Example comments
  • The Assessor was fine but I do feel that what is
    required re grading between 4 and 5 star are
    getting ridiculous to achieve i.e. because the
    kitchen doesn't have slate floors and granite
    worktops.
  • Help on how to improve marketing without being
    pedantic. Advice should be tailored to
    individual circumstances.
  • More consistency from the assessors. What was
    acceptable for one assessor does not seem to be
    acceptable for another and vice versa.
  • Inspection should take place on turnaround day
    or when unoccupied. I think it's bad manners to
    intrude whilst a guest is in residence.

21
Reporting
22
Reporting (1)
  • 96% of respondents had the rating indicated at
    the time of the visit confirmed in writing
  • Those with the lowest rating were less likely to
    have the indicated rating confirmed
  • 94% received their report within 21 days of the
    assessment
  • a 3% increase on the previous wave
  • Whereas the last wave showed that 5 star
    properties were the least likely to receive their
    report within 21 days, now they are the most
    likely
  • 96% felt the report accurately reflected what was
    discussed at the visit
  • Those with higher ratings are more likely to feel
    the report accurately reflected the visit

23
Reporting (2)
  • 97% of respondents felt the report explained
    their rating
  • However respondents are less likely to think that
    the report provides them with guidance on
    improving ratings or best practice and quality
    awards, with both around 80%
  • This is especially true of 3 and 4 star
    properties
  • Results are similar to the previous wave

24
Satisfaction with Written Report (1)
  • The overall satisfaction level with the report
    was 5.66, a slight improvement on the score of
    5.46 from the previous wave
  • Otherwise results are generally very similar to
    previous waves
  • Those who are new to the scheme or have rejoined
    rate all aspects of the written report higher
    than renewals
  • Those with higher ratings are more likely to give
    higher scores
  • There is little variation between regions or
    sectors

25
Improving Reporting
  • Respondents were asked to highlight problems/ways
    to improve the written reports
  • Only 76 people gave comments (reflecting the
    generally high satisfaction levels)
  • The table opposite summarises what was said
  • Example Comments
  • I was told I must provide an access statement
    and a fire safety report. Despite contacting
    various people have had no real guidance on
    this. If you insist on these as part of the
    rating system, you should at least provide some
    sort of guidance or advice
  • An idea of percentages would help/indicate how
    improvements could be made to achieve a 'Gold'
    award.

26
Outcome and Future Ratings
27
Rating Overview (1)
  • The first chart shows how close people's
    previous, expected and current ratings are
  • Generally people's expectations and achievements
    are fairly close
  • The table demonstrates the percentage of people
    who over or underestimate the rating they would
    achieve
  • 18% of people who expected 5 stars achieved 4
    stars
  • 1% of people who expected 4 stars achieved 5
    stars
  • Overall 6% overestimated what they would achieve,
    8% underestimated, with the remaining 86% being
    accurate with their expectations
  • Only renewals who gave answers for expected and
    past ratings were included in this analysis

28
Rating Overview (2)
  • This chart demonstrates the expectations and
    attainments of those people new to the scheme
  • It shows that more people get 4 stars than
    expected, due to both underestimation and
    overestimation of their property
  • The table shows that of the people that expected
    to get 5 stars, 41% received 4 stars
  • 95% of people expecting 4 stars achieved that
    grade

29
Explanations and Fair Ratings
  • 170 people said there was a difference between
    their expected and achieved rating
  • Of these, 83 said the reasons for this were
    explained
  • The majority of these also said they were told
    how they could obtain the rating they wanted
  • Establishments with a higher rating are more
    likely to think the rating given is fair
  • There is little difference between sectors
  • Renewals are slightly more likely to feel their
    rating is unfair
  • The Northwest has 99% of respondents thinking
    their rating is fair
  • In contrast the Northeast has 89% of respondents
    believing their rating is fair
  • Very little overall change from last wave

30
Rating Expectations
  • When considering their next assessment most
    people indicate that they expect their rating to
    stay the same
  • The lower the rating, the more likely an increase
    is expected
  • When thinking about the future people tend to be
    more optimistic
  • For example 21% of 3 stars expect an increase at
    the next assessment, but 28% do in 3 years' time
  • Compared to last wave, the expected outcome of
    the next assessment has stayed stable
  • In contrast, expected increases in 3 years' time
    have risen by 7%

31
Current Position (1)
  • Compared to last year people are less happy to
    remain at the current rating (from 46% to 39%)
  • showing that there is an intention to improve
    current standards
  • 13% report actively trying to improve their
    rating
  • As seen above there is regional variation in
    current rating positions
  • For example the East Midlands are most likely to
    report wanting to increase their rating but not
    being in a position to do so

32
Current Position (2)
  • The higher the rating the happier people are to
    stay at that rating
  • The lower the rating the more likely they are to
    feel they are not able to improve their rating
  • Self Catering are 14% more likely to be happy at
    their current rating
  • Rejoiners are the most likely to be working
    towards increasing their rating

33
Prevention of Rating Increase
  • Respondents were asked what, if anything, was
    preventing them from increasing their rating.
  • 36% cited financial reasons, with 30% saying it
    was a structural or space issue
  • Example Comments
  • lack of consistent info on what is required.
    Too much personal taste and individual views from
    inspectors
  • Structural alterations required to building out
    of financial reach.
  • Not clear about what I would need to do to move
    from a silver to a gold award. The book is
    singularly unhelpful in this area. intangibles
    are difficult to quantify!

34
Administration
35
Written Administration (1)
  • The overall satisfaction score is 5.47
  • A slight increase on last wave
  • There is little variation amongst the different
    aspects of written administration that were rated
  • As previously seen in other areas, higher rated
    establishments show greater levels of satisfaction
  • Region wise, there is only a range of 0.34 across
    overall satisfaction levels (when London is
    excluded due to its low base size)
  • Suggesting consistency

36
Written Administration (2)
  • The highest rating for each category is
    highlighted in bold
  • Those new to the scheme rate all aspects of
    written administration the highest
  • Renewals' and re-joiners' scores are closer, with
    variation over who rates higher
  • Renewals find it clearer and easier to understand
    than re-joiners
  • Re-joiners find it more accurate
  • Serviced accommodation tends to rate written
    administration higher than other sectors

37
Written Administration (3)
  • When asked how unsatisfactory areas could be
    improved, 60 people commented
  • A third cited generally poor administration
  • A third said they need more information
  • Example Comments
  • The information on what is required could be
    made simpler. The information is repeated for all
    the different levels and it's not clear which
    bits are relevant to your own situation.
  • I found the application information repetitive
    and at times, with a loose page document, it was
    easy to muddle the pages. A different colour for
    each section would certainly have helped as it
    was my first time of applying.

38
Telephone and Electronic Administration (1)
  • Renewals are the least likely to have a reason to
    contact QiT
  • Reasons for contacting QiT are wide ranging
  • 45% of respondents say they contacted QiT about
    their assessment/visit
  • As with previous results, around a third of
    respondents have had a reason to contact QiT by
    phone or email
  • The higher the rating, the more likely they are
    to have made contact

39
Telephone and Electronic Administration (2)
  • The Overall satisfaction score was 4.69
  • Lower than the score for written administration
    (5.45)
  • A slight increase on last wave
  • Again, lower ratings are equated with lower
    scores across all sections of administration
  • Serviced accommodation give the highest rating
    amongst the sectors
  • Renewals give lower scores than new members or
    re-joiners
  • Regional differences are hard to judge due to low
    base sizes

40
Telephone and Electronic Administration (3)
  • 18% of people who answered this section commented
    when asked how unsatisfactory areas could be
    improved
  • Example Comments
  • Avoid giving mixed messages when two or more
    phone calls made on the same issue.
  • They just pointed me to their web-site rather
    than talked things through with me
  • Telephone staff sound like robots with a script
    in front of them. If you ask a question that is
    not on the script they didn't try to help. Staff
    very much like most call centre staff.

41
Website Awareness and Usage
42
Website Awareness and Usage
  • Respondents were asked about their use and
    awareness of both the QiT and VisitBritain
    websites
  • Over half of respondents have used the QiT
    website at least once
  • Fewer have used the VisitBritain website, with
    38% having used it at least once
  • A third of respondents know about each website
    but have not visited them
  • Those who took part in the postal survey were
    much less likely to use either website
  • About 80% have not used the websites
  • Renewals are less likely to use the websites
  • Higher rated establishments are more likely to
    use both websites

43
The Quality in Tourism Website (1)
  • Respondents who had visited the QiT website were
    then asked why they had visited the website
  • Respondents were allowed to select more than one
    reason
  • On average, half of the respondents cited more
    than 1 reason
  • Overall 75% visited the website for general
    information
  • As can be seen by the chart, the reason people
    use the website by cycle type varies
  • Renewals are much less likely to apply for a
    rating online
  • Re-joiners are more likely to use it for
    downloading copies of scheme literature
  • New members are more likely to check for updates
    and to obtain contact details

44
The Quality in Tourism Website (2)
  • The amount of information and the quality of that
    information were rated highest
  • Ease of use was rated the lowest
  • Serviced accommodation and the higher ratings
    gave higher scores as with previous rating scales
  • Those in the Northwest rated the highest on every
    aspect

Uses a 1-7 scale
  • Respondents were also asked how the QiT website
    could be improved
  • Only 26 people commented, about half of whom
    said they couldn't remember the website
  • Of the valid comments, areas raised included
    better search facilities, contact lists for who
    to contact for different problems and more
    detailed advice

45
The VisitBritain Website (1)
  • Respondents who had visited the VisitBritain
    website were then asked why they had visited the
    website
  • Respondents were allowed to select more than one
    reason
  • As with the QiT website, the vast majority used
    the website for general information
  • Other uses are all around the 15-20% mark
  • Those with lower ratings are more likely to be
    downloading copies of scheme literature
  • Self catering establishments are more likely to
    be getting contact information or trying to sort
    out a specific query than other sectors

46
The VisitBritain Website (2)
  • All areas of the VisitBritain website were very
    similarly scored
  • All scores are slightly lower than those received
    by the QiT website
  • While other rating scales have shown the trend of
    higher ratings giving higher scores, with the
    website ratings there is little variation between
    ratings
  • Respondents were also asked how the VisitBritain
    website could be improved
  • Only 25 people commented; however, compared to
    the QiT website, more useful comments were given
  • Comments made included
  • Speed of the website
  • Ease of use
  • Lack of information regarding properties
  • Out of date information

Uses a 1-7 scale
47
Online Forum
  • Respondents were asked how useful they would find
    an online forum where they could discuss
    industry-related topics with other accommodation
    providers
  • For this they used a 1-7 scale, where 1 means
    not at all useful and 7 very useful.
  • The Overall score was 3.82, which is fairly low
  • Further investigation shows that 13% scored it at
    1, with another 13% scoring an indifferent 4
  • However 25% of people did score the forum at 6 or
    7, indicating there is interest
  • Higher ratings showed more interest
  • Renewals showed less interest

48
E-newsletter (1)
  • 22% of respondents currently receive the
    VisitBritain monthly e-newsletter
  • Of those who don't, 40% report wanting to receive
    it
  • Those who completed the postal questionnaire show
    the least interest
  • Those in the Northeast show the most interest,
    with the Northwest and Southeast showing the
    least
  • 12% of two star properties receive the
    e-newsletter, with 20-24% of 3 to 5 star
    properties receiving it
  • Overall the usefulness of the e-newsletter is
    4.09.
  • Those with 5 stars find it most useful followed
    by those with 2 stars

49
E-newsletter (2)
  • Respondents were then asked how useful they would
    find electronic information on various topics
  • News on the latest legislation was seen as the
    most interesting to people
  • Information on research and statistics was seen
    as the least popular
  • Higher ratings showed more interest in all topics
  • Renewals showed the least interest in all topics
  • Serviced accommodation also showed the highest
    levels of interest
  • There were few regional differences

50
E-newsletter (3)
  • When asked what other features they would like to
    see in the VisitBritain e-newsletter, 35 people
    took the opportunity to comment
  • A substantial number stated they would prefer to
    receive this information in the post.
  • Following are some examples of the ideas that
    were suggested
  • Advice on producing fire risk assessments and
    disabled assessments, or websites where you can
    find the relevant form without spending half a
    day looking for the right one.
  • Feature high 4/5 star rated businesses to enable
    newcomers to pick up useful tips and information
    on how best to improve their own business and
    achieve higher rating
  • Planning requirements for B&B and Liquor
    Licensing requirements. Also going green, health
    & safety, local food, and fire safety report.

51
Value of the Scheme
52
Value of the Scheme (1)
  • Obtaining a widely recognised quality sign was
    seen as the most beneficial aspect of the scheme
  • Receiving marketing support was rated the lowest,
    which may reflect comments made in other parts of
    the questionnaire regarding wanting more
    information on marketing support
  • Higher ratings generally score all aspects higher
  • The largest differences between 2 and 5 star
    properties are obtaining a quality sign (a 1.42
    difference), increasing revenue and bookings
    (1.25 and 1.11 respectively)

53
Value of the Scheme (2)
  • As can be seen in the table below, renewals are
    the least positive about the benefits of the
    scheme
  • It is those that have rejoined who see the most
    benefit in signage
  • New members value being on national websites
    more, as well as placing the increase in bookings
    and revenue higher than renewals or rejoiners
  • There are variations amongst regions in the
    scores given
  • The East Midlands and the Northwest give all of
    the highest ratings (excluding London due to its
    low base size)
  • The Southwest and East of England give the lowest
    ratings

54
Value of the Scheme (3)
  • Comparing this wave's findings to the previous
    wave, we find increases in the ratings given
  • The biggest rise in score is for getting into a
    widely circulated guidebook
  • Getting into local guides and on the websites saw
    the least improvement
  • The groups that showed the most interest in being
    on the websites were
  • High rated properties
  • In the serviced sector
  • New to the scheme

55
Scheme Memberships
  • As with previous findings, respondents are more
    likely to be members of LA schemes than other
    national schemes
  • Higher rated establishments are more likely to be
    members of other schemes or to have considered
    joining them
  • As with previous results, VisitBritain is rated
    substantially higher than the other schemes.
  • It has also increased since last wave, whereas a
    decline can be seen for the other schemes
  • The chart opposite shows membership rates of
    sustainable schemes/awards
  • 5 star rated properties are twice as likely to
    subscribe to green awards compared to other
    ratings

56
Potential Scheme Developments
  • Respondents were asked about using consumer
    feedback as part of the scheme
  • They were asked to indicate their interest on a
    7 point scale (1 = not at all interested, 7 =
    extremely interested)
  • There was little difference between the
    different proposed types of consumer feedback
  • Higher graded properties and new members showed
    the highest levels of interest for all
  • There was little variation across regions and
    sectors
  • Each type of consumer feedback's score increased
    by about 0.3 from the last wave.

57
Rating Difference Analysis
  • 75 people had a decline from their previous
    rating to their current rating
  • Additional analysis can show that these people
    are much more likely to answer questions more
    negatively
  • The average mean score over all questions
    requiring an answer on a 1-7 scale is 0.6 lower
  • Below is an example of the ratings for telephone
    and electronic administration
  • All aspects are rated much lower, especially in
    the case of speed of response and being
    friendly and attentive
  • Due to the fairly low proportion of people who
    had a rating decrease, this would not overly
    impact the total mean scores

58
Summary of Findings
  • The findings this wave were similar to previous
    waves
  • Pleasingly respondents were slightly more
    positive overall
  • Most rating scales showed an increase in average
    scores
  • Is this an indication that feedback has been
    acted upon?
  • The area with the lowest levels of satisfaction
    was telephone and electronic administration
  • Half of respondents had visited the QiT website;
    40% had visited the VisitBritain website
  • The QiT website was rated better than the
    VisitBritain website
  • Under a quarter of respondents receive the
    VisitBritain e-newsletter
  • A third showed no interest in receiving it
  • There was only moderate interest in an online
    forum
  • Respondents see high levels of value to most
    aspects of the scheme
  • Those who are disappointed in their rating answer
    questions more negatively