1
Research and Educational Indicators for European Policy and Skills Assessment
Quality Aspects of Computer-Based Assessment
Friedrich Scheuermann
Centre for Research on Lifelong Learning (CRELL)
Knowledge Assessment Methodologies (QSI/KAM), Ispra, Italy
http://crell.jrc.it / http://kam.jrc.it
2
CRELL
3
CRELL at the Joint Research Centre
  • CRELL is hosted by the Unit of Applied Statistics and Econometrics, JRC Ispra.
  • As a Directorate General of the European Commission, the JRC provides scientific and technical support to Community policy-making.
  • 7 Institutes in 5 Member States (total staff 2,700).
  • CRELL was established in 2005 by the Directorate General for Education and Culture and the Joint Research Centre of the European Commission.

4
Results
5
  • CRELL in brief
  • CRELL combines research in education, social sciences, economics, econometrics and statistics in an interdisciplinary approach
  • 12 staff members

6
Educational Research Based on Indicators and
Benchmarks
7
Support to the EU Commission
  • Monitoring: contribution to documents (e.g. Progress Report), evaluating tendered studies
  • Work on indicators: analysis of existing datasets, composite indicators
  • Participating in expert groups
  • Preparation of new surveys
  • Organizing and facilitating meetings
8
Research Infrastructures
  • Networks: maintain and develop networks on active citizenship, learning to learn, VET investment efficiency, school leadership, eAssessment
  • Dissemination: conferences and workshops, CRELL reports, publications in peer-reviewed journals
  • Inventories: institutions active in indicator-based evaluation and indicator-based research initiatives

9
Political Framework
10
Lisbon European Council, March 2000
"Making Europe the most competitive knowledge-based economy by 2010, with better jobs and greater social cohesion."
  • Open method of coordination
  • Guidelines for the Member States
  • Indicators and benchmarks
  • Exchange of good practice
  • Peer reviews and mutual learning processes

11
Stockholm European Council, March 2001
Three strategic objectives for education and training.
A Detailed Work Programme for the implementation of common objectives in the fields of education and training was requested.
12
Barcelona European Council, March 2002
European Education and Training systems to become
a world reference by 2010
  • Detailed Work Programme endorsed
  • Language indicator requested
  • Goal for pre-primary participation

13
Benchmarks and indicators
14
Detailed Work Programme of 2002
3 strategic objectives
13 detailed objectives
5 benchmarks (Reference Levels of Average Performance in EU Member States)
29 indicators for monitoring progress
15
Monitoring progress
Standing Group on Indicators and Benchmarks set up; first meeting in July 2002 (22 meetings 2002-2006/7; members: 27 EU countries, 2 EEA countries, the Commission, OECD, Cedefop, Eurydice, CRELL).
29 indicators for monitoring progress of the implementation of the Detailed Work Programme.
Progress Reports published in 2004, 2005, 2006 and 2007; the 2008 report is forthcoming with a new structure.
16
3 strategic objectives, 13 detailed objectives
  • 1. Improving the quality and effectiveness of education and training systems in the EU
    • Improving education and training for teachers and trainers
    • Developing skills for the knowledge society
    • Ensuring access to ICT for everyone
    • Increasing recruitment to scientific and technical studies
    • Making best use of resources
  • 2. Facilitating the access of all to education and training systems
    • Open learning environment
    • Making learning more attractive
    • Supporting active citizenship, equal opportunities and social cohesion
  • 3. Opening up education and training systems to the wider world
    • Strengthening the links with working life and research, and society at large
    • Developing the spirit of enterprise
    • Improving foreign language learning
    • Increasing mobility and exchange
    • Strengthening European co-operation

17
5 European Reference Levels of Average Performance (benchmarks) to be reached by 2010
  • Reduce the share of 15-year-old low achievers in reading (PISA, level 1) by 20% compared to 2000
  • No more than 10% of young people (aged 18-24) should be early school leavers
  • At least 85% of young people (aged 22) should have completed at least upper secondary education
  • Increase the number of MST graduates by 15%
  • At least 12.5% of adults (aged 25-64) should participate in lifelong learning.

Early school leavers: percentage of the population aged 18-24 with at most lower secondary education and not in further education or training. The lifelong learning indicator refers to persons aged 25-64 who answered that they had received education or training in the four weeks preceding the survey.
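These two definitions are precise enough to compute directly from survey microdata. A minimal sketch in Python, assuming a hypothetical list of person records; the field names and the education-level coding are illustrative, not the actual LFS variables:

    # Minimal sketch of the two indicator definitions above, computed from
    # hypothetical survey microdata. Field names and the education-level
    # coding are assumptions for illustration, not real LFS variables.

    LOWER_SECONDARY = 2  # assumed coding: 0=none, 1=primary, 2=lower secondary, ...

    def early_school_leavers_rate(people):
        """Share (%) of 18-24 year olds with at most lower secondary
        education who are not in further education or training."""
        cohort = [p for p in people if 18 <= p["age"] <= 24]
        leavers = [p for p in cohort
                   if p["highest_level"] <= LOWER_SECONDARY
                   and not p["in_education"]]
        return 100.0 * len(leavers) / len(cohort)

    def lifelong_learning_rate(people):
        """Share (%) of 25-64 year olds who received education or training
        in the four weeks preceding the survey."""
        cohort = [p for p in people if 25 <= p["age"] <= 64]
        return 100.0 * sum(p["trained_last_4_weeks"] for p in cohort) / len(cohort)

    people = [{"age": 20, "highest_level": 2, "in_education": False,
               "trained_last_4_weeks": False}]
    print(early_school_leavers_rate(people))  # 100.0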
18
Reaching the 5 benchmarks in 2010 would imply:
  • 2 million fewer young Europeans having left school early
  • 2 million more young people finishing upper secondary education
  • over 200,000 fewer 15-year-olds performing low in reading literacy
  • nearly 8 million more adults participating in lifelong learning
19
Progress in the 5 benchmarks
  • Based on data 2000-2003/06
  • Benchmark already achieved: mathematics, science and technology graduates
  • Constant, but not sufficient, progress: early school leavers; upper secondary attainment; lifelong learning participation
  • No progress yet: low achievers in PISA

20
Re-launch of the Lisbon Strategy
  • February 2004: Joint Interim Report "The success of the Lisbon Strategy hinges on urgent reforms"
  • May 2005: Council conclusions on new indicators in education and training
  • Establishment of CRELL (2005)
  • Commission Communication, 2007
  • May 2007: Council conclusions on a Coherent Framework of indicators and benchmarks
  • "A Coherent Framework of Indicators and Benchmarks for Monitoring Progress towards the Lisbon Objectives in Education and Training", COM(2007) 61 final (21.02.2007)

21
Coherent Framework on Indicators
22
Communication on a Coherent Framework of Indicators and Benchmarks (Feb. 2007)
Policy areas:
1. Improving equity in education and training
2. Promoting efficiency in education and training
3. Making lifelong learning a reality
4. Key competencies among young people
5. Modernising school education
6. Modernising VET (the Copenhagen process)
7. Modernising higher education (the Bologna process)
8. Employability
23
Data sources (indicator area: source)
  • Participation, mobility, financing, VET, self-reported adult skills, ICT: European Statistical System (ESS) surveys (LFS, UOE, CVTS, AES, SICTU)
  • Maths, reading and science skills: PISA survey
  • Teacher education (CRELL): TALIS survey
  • Adult skills: PIAAC survey
  • Civic skills (CRELL): ICCS survey
  • Language skills: Language survey
  • Learning to learn skills (CRELL): Learning to Learn survey
24
Indicator development
[Diagram: indicator development cycle. Data producers deliver statistical indicators; the Standing Group on Indicators and Benchmarks (SGIB) and the Commission (COM) identify indicators; quantitative analysis and composite indicator development feed back into the framework.]
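The composite-indicator work shown in the diagram typically follows a two-step recipe: normalise each sub-indicator to a common scale, then aggregate with weights. A minimal sketch with invented country values and equal weights; real composite indicators also handle direction of goodness, missing data and robustness analysis:

    # Minimal sketch of composite-indicator construction: min-max
    # normalisation followed by weighted aggregation. Values and weights
    # are invented for illustration; direction-of-goodness adjustments
    # are omitted.

    def min_max(values):
        lo, hi = min(values.values()), max(values.values())
        return {k: (v - lo) / (hi - lo) for k, v in values.items()}

    def composite(indicators, weights):
        """indicators: {name: {country: raw value}}; weights should sum to 1."""
        norm = {name: min_max(vals) for name, vals in indicators.items()}
        countries = next(iter(indicators.values()))
        return {c: sum(weights[n] * norm[n][c] for n in norm) for c in countries}

    scores = composite(
        {"upper_secondary": {"A": 78.0, "B": 85.0, "C": 90.0},
         "lifelong_learning": {"A": 9.0, "B": 12.5, "C": 20.0}},
        weights={"upper_secondary": 0.5, "lifelong_learning": 0.5},
    )
    print(scores)  # {'A': 0.0, 'B': 0.45..., 'C': 1.0}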
25
Communication on a coherent framework of
indicators and benchmarks
16 core indicators
  • Participation in pre-school education
  • Special needs education
  • Early school leavers
  • Literacy in reading, mathematics and science
  • Language skills
  • ICT skills
  • Civic skills
  • Learning to learn skills
  • Upper secondary completion rates of young people
  • Professional development of teachers and trainers
  • Higher education graduates
  • Cross-national mobility of students in higher
    education
  • Participation of adults in lifelong learning
  • Adult skills
  • Educational attainment of the population
  • Investment in education and training

26
Indicators Adopted
11 already existing core indicators (European Statistical System data, ESS):
1. Participation in pre-school education
2. Special needs education ( )
3. Early school leavers
4. Literacy in reading, mathematics and science
6. ICT skills
9. Upper secondary completion rates of young people
11. Higher education graduates
12. Cross-national mobility of students in higher education
13. Participation of adults in lifelong learning
15. Educational attainment of the population
16. Investment in education and training

5 core indicators for which new surveys are being prepared:
5. Language competencies (DG EAC): COM(2007) 184 and pre-call for tenders
7. Civic skills (IEA)
8. Learning to learn skills (DG EAC/CRELL)
10. Professional development of teachers and trainers (OECD)
14. Adult skills (OECD)

Contextual data sources: ESS, OECD, Eurydice, Cedefop
27
Council Conclusions, May 2007: indicators adopted
a) to make full use of the following indicators:
1. Participation in pre-school education
3. Early school leavers
4. Literacy in reading, mathematics and science
9. Upper secondary completion rates of young people
11. Higher education graduates
12. Cross-national mobility of students in higher education
13. Participation of adults in lifelong learning
15. Educational attainment of the population

b) to submit to the Council, for further consideration, information on the definition of the following indicators:
2. Special needs education ( )
6. ICT skills
16. Investment in education and training

c) to pursue the development of indicators on:
7. Civic skills (IEA)
10. Professional development of teachers and trainers (OECD)
14. Adult skills (OECD)

d) to further examine the development of indicators on:
5. Language competencies
8. Learning to learn skills
28
The way ahead
  • Quantitative analysis and development of composite indicators (CRELL: active citizenship)
  • New surveys: Learning to Learn (L2L), language indicator
  • Progress Report 2008

29
Quality Criteria for Computer-Based Assessment
and Open Source Software Tools
30
  • Benefits vs. Challenges
  • CBA has many advantages compared with traditional pencil-and-paper testing, namely:
  • the types of items that can be offered (hence a variety of audiences and specific groups can take such tests),
  • the types of assessments that can be carried out (for instance adaptive testing and other emerging testing forms, different from traditional sequential assessments; see the sketch after this list),
  • speed of test execution and administration,
  • distributed test administration and preparation, etc.
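Adaptive testing, mentioned above, selects each next item from a running ability estimate rather than from a fixed sequence. A minimal sketch under a Rasch (1PL) response model; the item pool, the step-size update and the stopping rule are illustrative assumptions, not any operational test's algorithm:

    import math, random

    # Minimal sketch of a computerised adaptive test (CAT) loop under a
    # Rasch (1PL) model. Real CATs use proper ability estimation (ML/EAP)
    # and item information; the step-size update is a simplification.

    def p_correct(ability, difficulty):
        """Rasch model: P(correct) = 1 / (1 + exp(-(ability - difficulty)))."""
        return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

    def run_cat(item_pool, answer, max_items=10):
        ability, step = 0.0, 1.0       # start at an average ability estimate
        remaining = list(item_pool)
        for _ in range(min(max_items, len(remaining))):
            # Pick the most informative item: difficulty nearest the estimate.
            item = min(remaining, key=lambda d: abs(d - ability))
            remaining.remove(item)
            ability += step if answer(item) else -step
            step *= 0.7                # shrink the step as evidence accumulates
        return ability

    # Simulated test taker with true ability 1.0:
    print(run_cat([-2, -1, -0.5, 0, 0.5, 1, 1.5, 2],
                  answer=lambda d: random.random() < p_correct(1.0, d)))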

31
  • Benefits vs. Challenges
  • High-stakes assessments can raise security issues
  • The infrastructure required (computers, logistics, telecommunications, etc.)
  • Validity (confounding variables)
  • Hardware performance
  • Training at the several interfaces (additional skills are required of test takers, test authors and test administrators)
  • Organisational issues (some countries are highly decentralised, others highly centralised with regard to decision making)
  • Interoperability among testing platforms and sharing of entities
  • Costs are said to require a high initial investment but are expected to decrease over time.

32
  • Experiences
  • Little experience at European level in carrying out CBA and large-scale assessments
  • References
  • DIALANG (http://www.dialang.org)
  • TAO
  • PISA pilot studies
  • Project on Quality Criteria for Open Source assessment platforms

33
  • Project Objectives
  • Review practice in computer-based assessment, focusing especially on open source software applications; outline advantages and disadvantages of computer-based assessment and conditions of usage.
  • Develop e-evaluation quality criteria based on the review and on past experience of open source platforms
  • Tune the quality criteria by testing them with open source systems
  • Outline the value of, and make recommendations for, such systems in contexts of skills assessment relevant to lifelong learning objectives.
  • Recommend desirable architectures, required competencies, interoperability requirements, etc.

34
  • Why Open Source?
  • Little transferability of commercial solutions
  • Guarantees independence from commercial options
  • Allows accessible customisation and development of functionality because it is supplied with source code
  • It can provide the context for a unifying standard with a clear role in European assessments, even linking already ongoing initiatives such as national assessments and worldwide assessments such as PISA or PIAAC
  • However ...

35
  • Why Open Source?
  • However:
  • if such platforms are not supported and/or maintained by a community of developers, they should not be considered appropriate;
  • if such a community of developers exists, it may be financially supported to develop specific functionality;
  • for an OSS CBA platform to be widely adopted, it has to be released to industrial and commercial standards; otherwise it risks being prematurely abandoned.

36
STEP 1: Review of OSS CBA software, to analyse functionality, usability, etc. and to identify the challenges such platforms pose for the evaluation of skills.
  • Methodology:
  • Literature
  • Experimentation with platforms
  • Interviews with developers and users

STEP 2: Develop a set of criteria to evaluate the quality of OSS CBA software in relation to its potential usage in skills assessment.
  • Methodology:
  • Review done in Step 1
  • Literature
  • Experimentation with platforms
  • Interviews with experts

STEP 3: Quality-assure the set of criteria to develop a robust protocol for evaluating the quality of OSS CBA, by testing the protocol with users.
  • Methodology: organisation of a number of focus groups with potential users deploying an OSS CBA platform, where several tasks will be simulated:
  • Creating a test
  • Publishing a test
  • Taking a test
  • Analysing results
  • The aim is to verify whether the quality items considered in the quality protocol are those relevant from the users' point of view, and to tune the protocol with the results.

DELIVERABLE (ongoing): Protocol for evaluating the quality of OSS Computer-Based Assessment Software
37
  • Platforms, Services and Tools
  • Large number of assessment applications on the market (790 products were assessed)
  • General lack of transparency (terminology, reference points, etc.)
  • Software applications:
  • Test creation tools (e.g. Hot Potatoes, WebQuiz, Questiontools)
  • Survey software (e.g. OpenSurveyPilot)
  • Modules of educational platforms that enable the creation and management of limited types of items (e.g. Moodle)
  • Assessment platforms (e.g. TAO)
  • Assessment services (e.g. Pan Testing, Assessment Solutions)
  • Assessment applications (e.g. IQ tests)
  • Management tools (e.g. Gradebook)

38
  • Platforms selected for further consideration
  • TAO, Centre de Recherche Henri Tudor / Univ. of Luxembourg, LU (http://www.tao.lu)
  • TCExam, Tecnick.com S.r.l., Italy (http://sourceforge.net/projects/tcexam/)
  • ECAssignmentBox, University of Magdeburg, DE (http://wwwai.cs.uni-magdeburg.de/software/ecab)
  • TestMaker, RWTH Aachen, DE (http://www.global-assess.rwth-aachen.de/testmaker2/index.php)
39
Software quality: "hard to define, impossible to measure, easy to recognise" (Kitchenham, 1989)
40
  • Quality Dimensions
  • Quality of methods:
  • Pedagogical richness of assessment
  • Testing standards
  • Quality of tools/applications:
  • Technological standards (e.g. ISO/IEC standard 9126 on software quality, IMS Question and Test Interoperability; see the sketch after this list)
  • Quality of implementation (cost/benefit, equal opportunities, etc.):
  • Policy level, contextual data
  • Checklist for evaluating large-scale assessment programs
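To make the technological-standards dimension concrete: ISO/IEC 9126 defines six top-level quality characteristics (functionality, reliability, usability, efficiency, maintainability, portability). A minimal sketch of a checklist score built on them; the 0-5 scale and the equal weighting are assumptions, not part of the standard:

    # Minimal sketch: rate a CBA platform against the six top-level quality
    # characteristics of ISO/IEC 9126. The 0-5 scale and equal weights are
    # illustrative assumptions, not prescribed by the standard.

    ISO9126 = ["functionality", "reliability", "usability",
               "efficiency", "maintainability", "portability"]

    def overall_score(ratings):
        """ratings: characteristic -> 0-5 score; returns the unweighted mean."""
        missing = [c for c in ISO9126 if c not in ratings]
        if missing:
            raise ValueError(f"unrated characteristics: {missing}")
        return sum(ratings[c] for c in ISO9126) / len(ISO9126)

    print(overall_score({"functionality": 4, "reliability": 3, "usability": 4,
                         "efficiency": 3, "maintainability": 5, "portability": 4}))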

41
  • Quality of Methods: Aspects
  • Test rationale (e.g. specification of test purpose, description of measurement, relevance of test content)
  • Quality of test material (e.g. objective scoring system; clearly and completely specified scoring system; ethical issues; quality of test material)
  • Quality of test manual and instructions (e.g. completeness, clarity, examples)
  • Norms (e.g. are norms/cut-off scores provided, explanations of scale or score limitations)
  • Reliability (e.g. information about reliability provided, existing studies; see the sketch after this list)
  • Validity (e.g. correct analysis procedures)
42
Quality of Methods: Item Types
Source: Scalise, K. & Gifford, B. (2006). Computer-Based Assessment in E-Learning: A Framework for Constructing "Intermediate Constraint" Questions and Tasks for Technology Platforms. Journal of Technology, Learning, and Assessment, 4(6). Retrieved 15.12.2006 from http://www.jtla.org
43
Quality of Methods: Innovative Item Types
  • Pictorial Multiple Choice with Sound,
  • Interactive Image with Sound,
  • Video Clips in Listening,
  • Drag and Drop Activity,
  • Re-Organisation,
  • Highlighting/Underlining,
  • Insertion,
  • Deletion,
  • Thematic Grouping,
  • Multiple Response,
  • Indirect Speaking with Audio Clips as alternatives,
  • Benchmarking in Direct Writing,
  • Multiple Benchmarks in Speaking,
  • Drag and Drop Matching (see the sketch after this list),
  • Transformation,
  • Combined Skills,
  • Mapping and Flow Charting,
  • Confidence in Response
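To give these item types a concrete shape, here is a minimal sketch of how a Drag and Drop Matching item might be modelled and scored with partial credit; the schema is an illustrative assumption, not IMS QTI or any platform's actual format:

    # Minimal sketch of a Drag and Drop Matching item with partial-credit
    # scoring. The schema is invented for illustration; it is not IMS QTI.

    from dataclasses import dataclass, field

    @dataclass
    class DragDropMatchingItem:
        prompt: str
        sources: list                            # draggable labels
        targets: list                            # drop zones
        key: dict = field(default_factory=dict)  # source -> correct target

        def score(self, response):
            """Fraction of source labels dropped on their correct target."""
            hits = sum(1 for s, t in response.items() if self.key.get(s) == t)
            return hits / len(self.key)

    item = DragDropMatchingItem(
        prompt="Match each survey to the skill it measures.",
        sources=["PISA", "ICCS", "PIAAC"],
        targets=["reading/maths/science", "civic skills", "adult skills"],
        key={"PISA": "reading/maths/science", "ICCS": "civic skills",
             "PIAAC": "adult skills"},
    )
    print(item.score({"PISA": "reading/maths/science",
                      "ICCS": "adult skills"}))  # ~0.33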

44-48
Quality of Tools
[Slides 44-48: screenshots/tables illustrating the tool quality criteria; not recoverable from the transcript]