Title: CrossGrid Quality Assurance Activities Report
1. CrossGrid Quality Assurance Activities - Report
Robert Pajak, Piotr Nowakowski and Grzegorz Mlynarczyk
ACC CYFRONET AGH, Kraków, Poland
www.eu-crossgrid.org
2. Outline
- Recommendations from the CG Review
- QA Procedures for deliverables
- Quality Assurance Reports
- Process Quality Indicators
- Testbed Quality Indicators
- Code Quality Indicators
  - progress and effectiveness of testing
  - static source code metrics
3. Recommendations from the CG Review
- Recommendations for Quality Assurance in the CG project, from the Final Report on the CG Annual Review, 18 June 2003:
  - A project Quality Engineer should be quickly nominated and tightly integrated into the project management structure, reporting directly to the project coordinator. The Quality Engineer must have overall responsibility for all quality-related aspects of the entire project (documents, reports, software). He must permanently monitor the quality of all deliverables and dissemination material. - Done.
  - We recommend that for the next deliverables, up to and including M18, both the draft reports arriving at the project Quality Engineer and the final versions be sent to the Commission. - Done.
4. QA Procedures for deliverables
- After internal review, each deliverable is verified and corrected by the Quality Engineer.
- Both the draft reports arriving at the project Quality Engineer and the final versions are afterwards sent to the Commission.
5-6. Number of monthly reports delivered to CG Office on time
- The following Partners have never sent their monthly report on time during the last 7 months: INP (AC3), UCY (AC12), UAB (AC16), AUTH (AC19), CSIC (CR15).
- The following Partners have most frequently sent their monthly report on time during the last 7 months:
  - AC9 (USTUTT) and CR5 (UVA) - 6 times each
  - AC7 (UNI LINZ) and AC10 (TUM) - 4 times each
7. Number of persons allocated by each Partner to the development of all tasks during Dec 2003 (EU-funded hours)
8-9. Mailing lists activity in December 2003
10. Testbed Quality Indicators - December 2003
11. Testbed Quality Indicators - last 4 months
12. Testbed Quality Indicators - notes
- Number of sites corresponds to the sum of all sites involved in the Production, Development and Validation testbeds.
- Number of users refers to the users registered in all CrossGrid VOs.
- Job submission refers to the number of Globus jobs submitted in the Production testbed.
- Monthly uptime refers to the CE gatekeeper availability; it includes the gatekeeper and ICMP response.
- During the period November-December 2003 several sites were being upgraded; this is the reason why the success rate and uptime are very low.
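The uptime definition above can be sketched as a simple availability calculation. This is a minimal illustration, not the actual monitoring code; the probe-log format (pairs of gatekeeper/ICMP check results) is an assumption:

```python
def monthly_uptime(probes):
    """Fraction of probes in which the CE was fully available.

    `probes` is a list of (gatekeeper_ok, icmp_ok) boolean pairs --
    a hypothetical log format. A probe counts as "up" only when both
    the gatekeeper and the ICMP ping responded.
    """
    if not probes:
        return 0.0
    up = sum(1 for gatekeeper_ok, icmp_ok in probes if gatekeeper_ok and icmp_ok)
    return up / len(probes)
```

Sites that were down for upgrades would accumulate failed probes, which is consistent with the low uptime reported for November-December 2003.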
13-18. Code QIs - Progress and effectiveness of testing - December 2003
19. Code QIs - Progress and effectiveness of testing - missing reports
- The reports from Tasks 1.1, 1.2, 1.3 and 1.4 have not been sent to the Quality Engineer and are still missing.
20-21. Code QIs - Progress and effectiveness of testing - December 2003
- The main conclusions about the above data and the bug-tracking system:
  - Since the beginning of the project only a few bugs have been reported using the bug-tracking system, so no advanced measurements of the testing phase could be calculated. Since last month, people working on Task 1.3 have started to use the bugtracker to report compilation errors.
  - In the configuration of the bugtracker, many projects still have no defined persons responsible for resolving reported bugs, to whom the bugs may and should be assigned.
22. Code QIs - Progress and effectiveness of testing - December 2003
- In the configuration of the bugtracker, many projects still have no defined categories, so even if somebody from the development team wanted to enter a bug, it would not be possible to assign a category to it. This may complicate future analysis of the reported issues/problems and of the stability of distinct modules, because right now projects without categories are treated, from the bug-tracking standpoint, as one big package.
- Feedback after the first quality report indicates that most people use direct e-mail notification to inform each other about bugs, but this form is not sufficient to track the development process. Developers should use the bug-tracking tool as a centralized database for issue reporting; the tool itself provides an e-mail notification mechanism that keeps them informed about problems.
23. Tasks' source code in project hierarchy
24. Tasks' source code in project hierarchy - notes
- On its first level the diagram distinguishes only those tasks of individual WPs whose directories in CVS use the naming convention wpX_Y_Z.
- If an existing project is not classified in this way but should be analyzed, the QE should be informed about such cases.
- The diagram marks in green only those tasks whose source directories contained sources that could be compiled and for which suitable makefiles were delivered.
- Adjusting the source code directories of individual projects may have disrupted the indicator values in the December 2003 QA report, since the values provided a month ago were obtained from different directories.
25. Tasks' source code in project hierarchy - notes
- For backward compatibility of the reports, Work Package data have been gathered using the whole content found in CVS under the Crossgrid/wpx directory. As a result, the values of several indicators at package level may not be a simple sum of the corresponding indicators at project level: project-level indicators may omit (as requested) sources not found in the src directory, e.g. sources of tests or includes.
- The values of indicators at package level presented in this report were adjusted so as not to contain extraneous sources found in CVS. For instance, GTK sources (found in the wp2_4_1-perfmon directory) and duplicated Workload sources were eliminated from the Work Package data.
26-33. Code QIs - Static source code metrics - December 2003
34. Code QIs - Static source code metrics - December 2003
- Total complexity of CrossGrid has been reduced.
35-36. Code QIs - Static source code metrics - December 2003
37-38. Code QIs - Static source code metrics - last 6 months
39-41. Code QIs - Static source code metrics - December 2003
42. Code QIs - Static source code metrics - December 2003
- Number of quality notifications for WP4 tasks:
- The lack of division into individual projects coincides with the situation in which the individual tasks of WP4 do not contain source code generating quality remarks.
- Remarks are generated on the basis of headers present in CVS in directories not recognized in the QA report as task directories.
43-46. Code QIs - Static source code metrics
47. Code QIs - Static source code metrics - December 2003
48. Code QIs - Static source code metrics - December 2003
- The main conclusions about the above data:
- The projects wp2_3 (the bench) and marmot have the largest ratio of comment lines to effective lines of code.
- A value of this coefficient close to 0.9, in the case of the first of these two projects, means that statistically 9 out of 10 lines of actual (effective) code have their own line of comment.
- In practice such large values of this coefficient are not necessary, but it is important to keep it at a level not lower than 0.4; the majority of the examined projects still do not follow this rule.
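The comment-to-code coefficient discussed above can be approximated with a line-based sketch like the following. This is an illustrative heuristic only, not the algorithm of the actual measurement tool; it assumes C/C++-style comments and ignores comments inside string literals:

```python
def comment_ratio(source: str) -> float:
    """Ratio of comment lines to effective (non-blank, non-comment) lines.

    Simplified heuristic for C/C++ sources: a line counts as a comment
    if it starts with // or lies inside a /* ... */ block.
    """
    comments = effective = 0
    in_block = False
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped:
            continue  # blank lines count as neither
        if in_block:
            comments += 1
            if "*/" in stripped:
                in_block = False
        elif stripped.startswith("//"):
            comments += 1
        elif stripped.startswith("/*"):
            comments += 1
            in_block = "*/" not in stripped
        else:
            effective += 1
    return comments / effective if effective else 0.0
```

With this measure, the 0.4 target from the slide means at least two comment lines for every five effective lines of code.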
49. Code QIs - Static source code metrics - December 2003
50. Code QIs - Static source code metrics - December 2003
- The main conclusions about the above data:
- The coefficient is calculated as the ratio of the number of quality notifications/violations to the number of effective lines of code.
- Accepting 0.1 as a satisfactory value of this coefficient (the lower the value, the better the quality), it has been observed that only a few projects are below this level.
- Most projects keep the value of this coefficient between 0.1 and 0.2, which is unacceptable and should be corrected.
- The quality rules used for the analysis can be verified by reading through the reports generated by the RSM tool.
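The violation-density coefficient and its threshold can be sketched as follows; the function names are illustrative, and only the ratio definition and the 0.1 level come from the report:

```python
def violation_density(violations: int, effective_loc: int) -> float:
    """Quality notifications/violations per effective line of code."""
    if effective_loc <= 0:
        raise ValueError("effective_loc must be positive")
    return violations / effective_loc

def assessment(density: float) -> str:
    # 0.1 is the satisfactory level used in the report; lower is better.
    return "satisfactory" if density <= 0.1 else "should be corrected"
```

For example, a project with 150 violations in 1000 effective lines has a density of 0.15, which falls in the 0.1-0.2 band the report flags as unacceptable.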
51. Code QIs - Static source code metrics - December 2003
52. Code QIs - Static source code metrics - December 2003
- The main conclusions about the above data:
- During the analysis, problems with excessively high module complexity were noticed within individual projects.
- Even though the average function complexity in all projects does not exceed 10, the average class complexity in half of the projects exceeds the widely accepted industry value of 15.
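A rough way to estimate the cyclomatic complexity behind these thresholds is to count decision points, with V(G) = decisions + 1. The keyword-scanning sketch below is an illustrative approximation, not how a real static analyzer such as RSM works:

```python
import re

# Rough proxy for decision points in C/C++-like code:
# branching keywords plus short-circuit logical operators.
DECISIONS = re.compile(r"\b(?:if|for|while|case|catch)\b|&&|\|\|")

def cyclomatic_estimate(body: str) -> int:
    """Estimate cyclomatic complexity V(G) as decision points + 1."""
    return len(DECISIONS.findall(body)) + 1

def average_complexity(values):
    """Average complexity over a set of functions or classes."""
    return sum(values) / len(values) if values else 0.0
```

Under the report's thresholds, an average over 10 per function, or over 15 per class, would flag a project as too complex.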
53-69. Code QIs - Static source code metrics - Nov-Dec 2003
70. Quality Assurance Reports
- All Quality Assurance Reports are available at the following website (password-protected):
- http://www.eu-crossgrid.org/wp5-1-login/QA_reports.htm