Senior project metrics
Each team is required to track the time/effort spent on the project, both for each individual team member and aggregated for the entire team.
In addition, pick any two metrics that you believe would add value for your project, and track them throughout the project. You will also need to consider which metrics are appropriate for the development methodology the team is following. A list of categories and some possible metrics in each is provided to assist you; however, you may pick any metrics that are appropriate for your project, including ones not on the list. Pick metrics from at least two different categories.
The selected metrics are to be presented during the senior project presentations at the end of each quarter. In addition, charts updated within the last two weeks should be maintained on the project website, so that project sponsors and faculty coaches can observe the progress.
Note that most of these metrics can be generated quite easily with tools, based on normal project activities: progress metrics from a planning spreadsheet, defect metrics from a defect-tracking tool, effort metrics from an activity-tracking spreadsheet, and design and code metrics from Eclipse plugins.
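For instance, per-member and team effort totals can be pulled straight out of a tracking spreadsheet exported as CSV. The sketch below is illustrative only: the file name effort.csv and its member/week/hours columns are assumptions, not a prescribed format.

```python
import csv
from collections import defaultdict

def effort_summary(path="effort.csv"):
    """Aggregate logged hours per team member and for the whole team.

    Assumes a CSV with header columns: member, week, hours.
    """
    per_member = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            per_member[row["member"]] += float(row["hours"])
    return dict(per_member), sum(per_member.values())

if __name__ == "__main__":
    members, total = effort_summary()
    for name, hours in sorted(members.items()):
        print(f"{name}: {hours:.1f} h")
    print(f"Team total: {total:.1f} h")
```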
A list of useful metrics
The metrics below can be used for traditional waterfall/iterative development as well as agile approaches, and can also be applied to projects other than pure development work. If you are looking for recommendations, consider the categories and metrics identified in bold.
- Progress metrics: These show whether project execution is on schedule.
  - Slippage chart: Shows the number of days you are ahead of/behind schedule, based on planned dates for reaching milestones. To use this, you would plan your project as a series of milestones, identify target dates for reaching each, track the actual dates each milestone is reached, and plot the difference.
  - Earned value chart: Identify the activities for your project, and associate an earned value (# of points) with each. You earn the value (only) when you complete the activity - no points for partially complete activities. Plot total points earned against time. (A computation sketch for both charts follows this list.)
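Both charts reduce to simple date arithmetic. A minimal sketch, with made-up milestone names, dates, and point values:

```python
from datetime import date

# Slippage: positive values mean days behind schedule.
milestones = [
    # (name, planned date, actual date reached) -- all made up
    ("Requirements done", date(2024, 10, 1), date(2024, 10, 3)),
    ("Design done",       date(2024, 11, 1), date(2024, 11, 8)),
]
for name, planned, actual in milestones:
    print(f"{name}: {(actual - planned).days:+d} days")

# Earned value: an activity is worth its points only once complete.
activities = [
    # (points, completion date; None means incomplete -> no credit)
    (5, date(2024, 10, 3)),
    (8, date(2024, 11, 8)),
    (3, None),
]
total = 0
for completed, points in sorted((d, p) for p, d in activities if d):
    total += points
    print(f"{completed}: {total} points earned")
```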
- Defect metrics: Keep track of errors in your artifacts: requirements, design, code, test cases, process definitions, etc. The best practice is to use a defect tracker (such as Bugzilla) and enter problems as they are found, and also track the status of each defect (found, assigned, fixed, deferred). From this information, you can derive several metrics:
  - Defect density: number of errors per KLOC, page, or use case.
  - Pie chart of defects by type (include raw number and density).
  - Bar chart of defects by module. (A counting sketch for these three metrics follows this list.)
  - If you are conducting inspections on all deliverables, you can create a Defect Removal Effectiveness chart: a grid showing the phase in which each defect originated and the phase in which it was found. This is very useful for measuring the effectiveness of inspection activities and of phase activities, but it is probably too much work for the size of projects you are doing.
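A minimal counting sketch for density and the by-type/by-module breakdowns, assuming a defect list exported from the tracker; the field names and KLOC figure are illustrative assumptions:

```python
from collections import Counter

# Hypothetical rows exported from the defect tracker.
defects = [
    {"type": "logic",     "module": "parser"},
    {"type": "interface", "module": "ui"},
    {"type": "logic",     "module": "parser"},
]
kloc = 4.2  # assumed size of the code base, in thousands of lines

print(f"Defect density: {len(defects) / kloc:.2f} defects/KLOC")
print("By type:  ", Counter(d["type"] for d in defects))
print("By module:", Counter(d["module"] for d in defects))
```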
- Effort metrics: Keep track of hours spent.
  - Estimation accuracy: Planned vs. actual effort for each activity. (See the sketch after this list.)
  - Effort by type of activity: Classify effort by activity type: requirements, design, implementation, coordination, documentation, customer interaction, etc. This is useful for identifying where your time is going, so that you can try to improve efficiency in that area.
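Estimation accuracy is just a percent error per activity; the hour figures below are made up for illustration:

```python
# (activity, planned hours, actual hours) -- illustrative numbers
log = [("Requirements", 20, 26), ("Design", 30, 24), ("Coding", 60, 75)]

for activity, planned, actual in log:
    error = (actual - planned) / planned * 100
    print(f"{activity}: planned {planned} h, actual {actual} h ({error:+.0f}%)")
```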
- Activity metrics:
  - Requirements metrics: Requirements volatility (% of requirements that were changed during each week). (A small sketch follows this list.)
  - Design metrics: methods per class, depth of inheritance hierarchy, fan-out (# of methods called per class or per method) …
  - Test metrics: coverage, # of test cases per requirement (use case, user story, etc.), % of tests passed successfully
  - Code metrics: cyclomatic complexity, LOC per method
  - Documentation metrics
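Requirements volatility, for example, reduces to a weekly percentage. A sketch with an invented change log (the requirement count and weekly change numbers are placeholders):

```python
total_requirements = 40  # assumed size of the requirements set

# Requirements changed (added/modified/removed) per week -- placeholders.
changes_per_week = {"W1": 6, "W2": 3, "W3": 1}

for week, changed in changes_per_week.items():
    print(f"{week}: {changed / total_requirements:.0%} volatility")
```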
- Other metrics:
  - Productivity: volume of deliverables produced per unit of effort (e.g., pages per hour, LOC per hour)
  - Product quality metrics: mouse clicks per operation (interface usability), average response time (performance)
  - Average time to fix bugs / respond to customer requests (see the sketch below)
  - Other metrics suggested by the sponsor or preferred by the team.
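Average fix time falls out of the found/fixed timestamps the defect tracker already records; the dates below are illustrative:

```python
from datetime import date

# (date found, date fixed) pairs from the tracker -- illustrative.
fix_log = [
    (date(2024, 10, 2), date(2024, 10, 4)),
    (date(2024, 10, 5), date(2024, 10, 12)),
    (date(2024, 10, 9), date(2024, 10, 10)),
]
avg = sum((fixed - found).days for found, fixed in fix_log) / len(fix_log)
print(f"Average time to fix: {avg:.1f} days")
```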