Title: Improving Assessment in Software Engineering Team Projects
Slide 1: Improving Assessment in Software Engineering Team Projects
- Marie Devlin, Sarah Drummond, Chris Phillips, Lindsay Marshall
Slide 2: Improving Assessment in Software Engineering Student Team Projects
- Active Learning in Computing (ALiC)
- Aligning pedagogical aims and assessment
- Assessment Methods
- Guidance on using these assessment methods
- Conclusions and future work
Slide 3: Active Learning in Computing
- CETL project; partners are Leeds, Leeds Metropolitan, Durham (CETL lead) and Newcastle
- Group and project work
- Cross-site software development project
- Collaboration via technologies
- Transferable skills, increased employability
- Variety of assessment methods
Slide 4: Aligning Pedagogical Aims and Assessment
Deliverables | Learning Outcomes
Statement of work (C), requirements analysis | Communication, problem solving, requirements analysis
Project plan (C), log books (T, I), reports (T) | Use of initiative, planning, use of software development models, problem solving
Project document (C), software design | Software design, software development models, industry standards, practices for design notation
Source code, software documentation, user manuals, etc., project document (C) | Programming, testing, software development
Personal skills analysis (I), report (I), minutes of meetings (T), observations (T), evaluating performance (T, I), report (T) | Adaptability, leadership, interpersonal communication, cross-site communication, teamwork, fulfilling roles, collaboration, time management, organisation
Presentation (T), reports (C, T), meeting with customer (C), use and evaluation of technologies | Communication
Project document (C), reports (T, I), coding (C), documentation (C) | Written communication skills, using industry-standard notation
Key: C = company deliverable, T = team, I = individual
Slide 5: Assessment Methods
- Tangible products and personal characteristics
- Product and Process
- Contributions of each team member
- Cross-site work presents more of a challenge
Slide 6: Joint Assessment of Company Deliverables
Section | Joe | Mary | Michael | Tanya
1.0 Introduction | Newcastle | Newcastle | Newcastle | Newcastle
1.1 Purpose | CMR | R | R |
2.1.1 PC Modules | CMR | | |
2.1.2 PDA Modules | Newcastle | Newcastle | Newcastle | Newcastle
3.1.1 PC Modules | CMR | CMR | |
3.1.2 PDA Modules | Newcastle | Newcastle | Newcastle | Newcastle
3.2 Inter-process dependencies | CMR | R | MR | CMR
3.2.1 PC Modules | CMR | R | MR | CMR
3.2.2 PDA Modules | Newcastle | Newcastle | Newcastle | Newcastle
4.2.3 PC Process Interface | CMR | | |
4.2.4 PDA Process Interface | Newcastle | Newcastle | Newcastle | Newcastle
Key: C = create, M = modify, R = review (sections marked Newcastle were handled by the Newcastle site)

Contribution Matrix
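As a rough illustration of how such a matrix can be used, the Python sketch below tallies create/modify/review activities per student. The section names and cell assignments are hypothetical example data (the flattened source does not preserve which cells were blank), not the project's actual records.

```python
# Illustrative sketch: tallying a contribution matrix.
# Cell assignments below are hypothetical example data; the letters follow
# the key above (C = create, M = modify, R = review). Sections owned by the
# Newcastle site are omitted, since they are assessed there.
from collections import Counter

matrix = {
    "1.1 Purpose":                {"Joe": "CMR", "Mary": "R", "Michael": "R"},
    "3.2 Inter-process deps":     {"Joe": "CMR", "Mary": "R", "Michael": "MR", "Tanya": "CMR"},
    "4.2.3 PC Process Interface": {"Joe": "CMR"},
}

def tally(matrix):
    """Count each student's create/modify/review activities across sections."""
    totals = {}
    for cells in matrix.values():
        for student, activities in cells.items():
            totals.setdefault(student, Counter()).update(activities)
    return totals

for student, counts in sorted(tally(matrix).items()):
    print(student, dict(counts))
```

A tally like this makes visible at a glance who created, modified and reviewed material, which is the evidence the matrix is meant to provide when individual marks are moderated.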
Slide 7: Peer and Self Assessment
- Newcastle: percentage sharing
- Durham: self and peer ranking
- Cross-site: percentage sharing
- Formative Assessment and Feedback
- Calculating a Final Individual Mark
Weight (%) | Assessor | Component
15 | Monitor | Individual mark (I)
25 | Monitor | Team mark (T, C)
20 | Module Leader | Individual mark (I)
40 | Module Leader | Team mark (T, C)
Results from the peer assessments are used as a weighting for the (T, C) components.
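The slide gives the weights but not the peer-weighting formula. The sketch below is a minimal Python illustration under one common convention: each student's peer-assessed share of the team's credit, relative to an equal share, scales the team-marked (T, C) components. The function name, example marks and the capping rule are assumptions for illustration, not the module's actual scheme.

```python
# Minimal sketch of combining the four components into a final individual
# mark. ASSUMPTION: peer-assessment results act as a multiplicative factor
# on the team-marked (T, C) components; the slide does not give the formula.

WEIGHTS = {
    "monitor_individual":       0.15,  # Monitor (I), individual mark
    "monitor_team":             0.25,  # Monitor (T, C), team mark
    "module_leader_individual": 0.20,  # Module Leader (I), individual mark
    "module_leader_team":       0.40,  # Module Leader (T, C), team mark
}

def final_mark(marks, peer_share, team_size):
    """Weighted sum of components; peer results scale the team components."""
    peer_factor = peer_share * team_size  # 1.0 means an exactly equal share
    total = 0.0
    for component, weight in WEIGHTS.items():
        mark = marks[component]
        if component.endswith("_team"):
            mark *= peer_factor           # apply peer weighting to (T, C)
        total += weight * mark
    return min(total, 100.0)              # cap at the maximum possible mark

# Example: a student awarded 28% of the credit in a four-person team.
marks = {"monitor_individual": 62, "monitor_team": 70,
         "module_leader_individual": 58, "module_leader_team": 75}
print(round(final_mark(marks, peer_share=0.28, team_size=4), 1))  # 74.1
```

Under percentage sharing, a student whose teammates rate their contribution above an equal share gains on the team components, and vice versa; this is what allows intangible work such as coordination to be reflected in the individual mark.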
Slide 8: Using These Assessment Methods
- Agree clear assessment criteria.
- Teach students about peer and self-assessment.
- Continue to actively allay student anxiety.
- Formative assessment needs to be timely and
meaningful.
Slide 9: Conclusions and Further Work
- Cross-site development used a variety of assessment methods and evidence gathering.
- The contribution matrix and peer and self-assessment help to ensure greater fairness and enable intangible tasks such as teamwork and communication to be given value.
- It is difficult to reassure students about the impact on assessment without reducing the requirement for inter-site collaboration.
- The scalability of this work needs some consideration, as does the impact of distributed development on scheduling, completion times, etc.