P. Configuration Audit Checklist
Sr. No. | Point to be Verified |
1 | Does a reviewed and approved Configuration Management Plan exist? |
2 | Is the Configuration Management process implemented as per plan? |
3 | Are all Configurable Items correctly identified and named? |
4 | Is adequate configuration management documentation available (e.g., Change Request (if applicable), Change Control Log)? |
5 | Have the Configurable Items been placed in the Library Structure? |
6 | Are the documents and code units current, complete, and properly formatted? |
7 | Have traceability matrices from the requirements to test cases been prepared? |
8 | Is the traceability matrix correct & complete? |
9 | Does the Traceability matrix reflect the latest Versions of the configuration items? |
10 | Have all findings from the FS Reviews been incorporated or completed? If not, have the recommended action items been listed? |
11 | Have all findings from the TS Reviews been incorporated or completed? If not, have the recommended action items been listed? |
12 | Have you done random sampling on requirements/ requirement changes to check whether they are addressed in the Traceability matrix? |
13 | Have you done random sampling on test cases and source code to check whether the requirements are taken care of as per the Traceability matrix? |
14 | Were test cases identified for all requirements? |
15 | Were all test cases identified actually performed? |
16 | Were all tests (eventually) successful? |
17 | Were all defects documented and subsequently corrected? |
18 | Is user documentation (User's Manual, Operator's Manual) specific for this release available, complete, and current? |
19 | Does the tested system reflect the release package agreed to by the customer? |
20 | Are metrics pertaining to configuration management collected as per QMS requirements? |
21 | Are the latest items/documents in use? |
22 | Are draft versions prior to the review available for all CIs? |
23 | Have Basis activities been checked as per the SAP_Basis Activities Checklist? |
Q. Incidence Checklist
Sr. No. | Point to be Verified |
A | Were transport request nos. mentioned? |
B | Have issues been checked for Estimate & Actual Effort? |
b.2 | Were there any with major difference? |
b.3 | If yes, were the reasons analysed and documented? |
C | Has the ‘Closed MRs’ sheet been made for the month? |
c.2 | Have issues been checked for |
D | If yes, are the reasons for violations known & documented? |
d.1 | Are there any actions planned or taken? |
d.2 | Have closed issues been checked for Solutions? |
E | If yes, were they found informative? |
e.1 | Have issues of type L3 (Customising/ ABAP dev) been checked for? |
e.2 | If yes, were test cases documented? |
R. PES Checklist
Phase | What needs to be done | Who | With / to whom | Documentation | Miscellaneous
Planning | Agree/confirm date of PES | PM | with participants | Project Quality Plan | If date changed, PM must find new date
 | Decide distribution list & documentation | PM | | |
Preparation | Prepare documents for PES | PM | | |
 | Distribute documents for PES | PM | to Decision Maker, TC & FC, Sys Test | |
 | Technical appraisal | TC | | |
 | Financial appraisal / FSR | FC | | |
 | If meeting is planned, draw up invitation to meeting and agenda | PM | with Decision Maker | PES Invitation |
 | Review preparation | Moderator | | |
 | Project overview | PM | | |
 | Status of conditions/tasks | Moderator | with PM | |
Meeting | Discussion | All | | |
 | Acceptance | Decision Maker | | PES Decision Sheet |
 | Next phase planning | PM | | |
 | Technical appraisal | TC | | |
 | Financial appraisal | FC | | |
 | Discussion | All | | |
 | Release | Decision Maker | | PES Decision Sheet |
 | Dates of next meetings | Moderator | | |
 | Project experience | All | | |
 | Agree PES Decision | Moderator / Recorder | | |
Follow-up | Draw up and get PES Decision Sheet signed off | PM / Recorder | to participants | PES Decision Sheet |
 | Distribute PES Decision Sheet | PM / Moderator | | PES Decision Sheet |
 | Track conditions | PM | | |
S. PSR Checklist
Sr. No | Point to be checked |
A | Open Items from previous PSR: |
a.1 | Has the previous PSR been reviewed for open issues? |
a.2 | Have the open issues been carried forward to current PSR? |
B | Customer Satisfaction: |
b.1 | Has the CSS feedback form been received? |
b.2 | If no, has the client been reminded? |
b.3 | If yes, Is it discussed with client? |
b.4 | Detailed analysis done and documented? |
b.5 | Action plan prepared? |
b.6 | Actions are effective? |
b.7 | Has a copy of the CSS form been forwarded to Quality? |
C | Schedule Slippage: |
c.1 | Is there a Schedule Slippage? |
c.2 | Root cause analysis done? |
c.3 | Schedule revision required? |
c.4 | Schedule revision approved by customer? |
c.5 | Project Plan updated? |
c.6 | Percentage Completion of Tasks by Milestone reviewed? |
c.7 | Is the schedule in danger? |
D | Effort Accuracy: |
d.1 | Is effort accuracy different from the goal? |
| Root cause analysis done? |
| Corrective actions taken? |
E | Metrics & Defects: |
e.1 | Has the quality data report been prepared, reviewed, and sent to the P & Q group for the month? |
e.2 | Has defect/bug analysis been done? |
| Root cause analysis done? |
| Preventive action taken? |
e.3 | Has the current metrics report been compared with previous metrics report? |
e.4 | Are the Defect Prevention activities going on as per plan? |
F | Risk Management: |
f.1 | Is Risk Management plan being maintained? |
f.2 | Have all the risks identified in the current phase been assessed? |
f.3 | Has an action plan been made to monitor future activities to avoid/eliminate risks? |
f.4 | Has any new risk been identified and documented? |
G | Resources: |
g.1 | Are resources adequate? |
| Hardware (Terminals/PCs, Disk space, Memory, Link/System response) |
| Software (Licenses) |
| Human resource |
| Infrastructural Resources |
| Any new training required? If yes, then is plan prepared / discussed? |
g.2 | Any change in project team? |
| Project Plan updated? |
| Is Familiarization done for new members? |
| Is Debriefing/Appraisal done for exiting members? |
g.3 | Hand over Process (If Applicable) |
| PM to PM |
| TL to TL |
| PT to PT |
| PQM to PQM |
H | Is Project Billability satisfactory? |
I | Reviews: |
i.1 | Contract review: |
| Is there any change in the scope/technology/skill set? |
| Contract review done for any such change? |
i.2 | Work product reviews done as per plan? |
| Plans (PP,QP,CMP,RMP,TP) reviewed? |
| Review report forms complete? |
J | Configuration Control & Change Control: |
j.1 | Are Customer supplied items under CM? (H/W, S/W, Documents, Stds) |
j.2 | Change control process as per plan |
j.3 | Any Change Requests? |
| Any impact on effort/ schedule? |
| Changes conveyed to Marketing/Client? |
| All affected Baseline items updated? |
j.4 | Configuration items updated? |
| CM status reviewed by PM? |
K | Audit Related: |
k.1 | Any NCs / observations? |
| Any NC repeated? |
| Number of NC increasing? |
| Has preventive action been taken based on Internal /External Audit Analysis report? |
L | General: |
l.1 | Any Inter-group co-ordination related issue? |
l.2 | Any Document Control issues? |
l.3 | Are minutes prepared for Meetings/Tele-cons? |
l.4 | Is Distribution of reports/documents as per plan? |
l.5 | Need for process automation? |
l.6 | Changes to QMS desired? |
l.7 | Down time/Idle time analysed? |
l.8 | Training for members as per plan? |
l.9 | Any Managerial/Personal issues? |
l.10 | Any organisational Issues discussed? |
l.11 | Any Employee Rotation/turnover issue? |
l.12 | Any technical issues? |
l.13 | Are there any testing-related issues? |
l.14 | Are there any customer-related issues? |
l.15 | Were any reusables used in the project? |
l.16 | Were any reusables developed / identified in the project? |
l.17 | Were any best practices followed in the project? |
l.18 | Have all the approved documents to date been uploaded to Livelink? |
T. Risk Checklist
Risk Id | Risk Category | Risk Description | | | | |
C | Customer | | | | | |
C1 | | Credit worthiness doubtful and payment delays likely | | | | |
C2 | | Lack of customer commitment | | | | |
C3 | | Customer lacks technical competence | | | | |
C4 | | Language barrier | | | | |
C5 | | Too much dependence on Customer for infrastructure, libraries, etc. | | | | |
| | | | | | |
S | Supplier/Partner | | | | | |
S1 | | Inadequate back-to-back agreement | | | | |
S2 | | Integration / dependency on third party products and/or partners | | | | |
S3 | | Substantial dependence on partner | | | | |
S4 | | Delivery date selected is too late, making employees under-utilized | | | | |
S5 | | Delivery date selected is too early, ramp up difficult | | | | |
S6 | | | | | | |
| | | | | | |
L | Proposal & Contract | | | | | |
L1 | | Cost estimation risk/no risk elements in cost estimates | | | | |
L2 | | Fixed price contracts based on insufficient specification | | | | |
L3 | | Delivery dates difficult to comply with | | | | |
L4 | | Guarantees regarding functionality, performance, availability, etc., with substantial consequences, being missed out | | | | |
L5 | | Unlimited liability for damages caused by delay, non-compliance, guaranteed functional elements, or violations of legal clauses | | | | |
L6 | | Billing of travel expenses by the project team not included in the pricing | | | | |
L7 | | | | | | |
| | | | | | |
| | | | | | |
T | Technical | | | | | |
T1 | | Use of new or unproven technology | | | | |
T2 | | Use of new development tools, testing tools, etc. | | | | |
T3 | | Risk due to specific IT security requirements | | | | |
T4 | | Very high testing requirement | | | | |
T5 | | Requirements difficult to validate | | | | |
T6 | | High number of products/components/interfaces | | | | |
T7 | | Migration of existing data | | | | |
T8 | | Critical commitments regarding function/performance/scalability/security/availability/time behaviour, with consequences if not delivered | | | | |
T9 | | Obligation to comply with technology cycles requested or specified by the customer | | | | |
T10 | | Service level ( | | | | |
| | | | | | |
R | Resources | | | | | |
R1 | | Manpower with requisite skills not available | | | | |
R2 | | Ramp up difficult | | | | |
R3 | | Software license procuring could cause delay | | | | |
R4 | | Budget is insufficient to meet training, travel, HW and SW procurement | | | | |
R5 | | Back up for key resource | | | | |
| | | | | | |
P | Project Process | | | | | |
P1 | | Requisite technical and domain knowledge not available | | | | |
P2 | | Project team requires substantial training but project schedule does not permit | | | | |
P3 | | Software development standards and process not available | | | | |
P4 | | Prototyping/Simulation is a must | | | | |
P5 | | Regular backups of project data not planned | | | | |
P6 | | Inter/intra team coordination | | | | |
P7 | | Acceptance conditions do not correspond to those usually proposed and result in additional cost | | | | |
P8 | | The acceptance procedure and criteria are not documented or is inadequate | | | | |
P9 | | The acceptance conditions and conventions have not been agreed with or accepted by the customer. | | | | |
P10 | | Change Request has not been addressed contractually | | | | |
P11 | | Nothing has been specified with regard to warranty and warranty period | | | | |
P12 | | The product is used at many sites during the warranty period | | | | |
P13 | | Failure to comply with or take into account the required standards | | | | |
P14 | | There is no effective project organisation at customer end | | | | |
P15 | | Unstable process | | | | |
P16 | | Decision not supported by evaluating techniques | | | | |
| | | | | | |
E | Engineering Process | | | | | |
E1 | | Safety critical codes are involved in the program | | | | |
E2 | | Implementation personnel are not familiar with development environment | | | | |
E3 | | Software requirements can only be validated at system testing level | | | | |
E4 | | The functions and function descriptions are incomplete or unclear | | | | |
E5 | | The technical descriptions do not correspond, or only partially correspond, to the task specified | | | | |
E6 | | The technical solution or parts of it are not checked for feasibility | | | | |
E7 | | The customer has a heterogeneous procedure landscape into which we have to integrate | | | | |
E8 | | Interfaces within the system are not or are only inadequately identified and described | | | | |
E9 | | Customer not informed or inadequately informed of test data and test cases | | | | |
E10 | | Provision of test data/test cases is not agreed with the customer (contractual agreement) | | | | |
E11 | | Test cases not described specifically or clearly | | | | |
| | | | | | |
U. Guidance on Risk Impact Value
Risk Impact Value | Quality | Technical | Cost | Schedule
5 - Catastrophic | Operational failure; unacceptable. | 1. Current design inadequate. 2. Alternate design approach not available. 3. Validation only possible at system testing level. | Cost will exceed the established contract maximum or life-cycle cost maximum. | >50% slip expected.
4 - Major | Loss of operational capability. | Margins can only be met by significant redesign or reallocation of design margins. | Cost will exceed contract target or LCC target by >5% (but will not exceed the maximum of each). | 25% < schedule slippage < 50% expected, or slip impacts critical path, or slip puts item on critical path.
3 - Moderate | Limited operational impact. | Rigorous integration testing required to meet quality parameters. | Cost will exceed contract target or LCC target by <5%. | 10% < schedule slippage < 25% expected.
2 - Minor | Minimal operational impact. | Minor HW or SW design changes required to meet performance. | Cost will exceed by <5% but can be partly recovered from the client. | Schedule delay can be managed within reserves.
1 - Insignificant | No operational impact. | Reviews will improve the quality. | Cost can be recovered from the client. | Schedule delay can be managed easily to meet all deliverables on time.
V. Guidance on Risk Probability
Probability | Quality | Cost | Schedule
High: 0.7 ≤ risk probability < 1.0 | 1. Requirements only through conceptual design. 2. Analysis, demonstration, or simulations have not been conducted. 3. Use of unproven technology. 4. Highly complex. 5. Many critical components. | 1. 20% of the cost estimate is: a. high level; b. without a complexity factor defined; c. without a historic basis. 2. No contractual coverage. | 1. No margins. 2. Schedule estimate is not based on prior experience. 3. Critical dependencies outside the project’s control with no flexibility.
Medium: 0.4 ≤ risk probability < 0.7 | 1. Requirements partially documented. 2. Technical approach never implemented before. 3. Design concept verified by prototype. | 1. 80% of the estimate is at detail level. 2. Contract coverage is bounded. | 1. Schedule estimate is based on little experience. 2. Little schedule margin. 3. Some schedule flexibility for critical dependencies outside the project’s control.
Low: 0.1 ≤ risk probability < 0.4 | 1. Requirements understood and fully documented. 2. Technical approach: a. has been demonstrated on a client or similar program; b. is within the state of the art. | 1. Estimate is: a. fully detailed; b. based on history. 2. Contractual coverage is well established. | 1. Schedule based on closely related experience. 2. No schedule dependencies outside the project’s control.
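A common way to combine the two tables above is a single risk exposure score, exposure = probability × impact. The sketch below is illustrative only: the band midpoints and the `risk_exposure` helper are assumptions, not part of this checklist; only the 1–5 impact scale and the probability bands come from Sections U and V.

```python
# Illustrative sketch: combining the 1-5 impact scale (Section U) with the
# probability bands (Section V) into a single risk exposure score.
# The band midpoints below are assumptions for illustration only.

IMPACT = {
    "insignificant": 1,
    "minor": 2,
    "moderate": 3,
    "major": 4,
    "catastrophic": 5,
}

# Midpoint of each probability band from Section V (assumed convention).
PROBABILITY = {
    "low": 0.25,     # 0.1 <= p < 0.4
    "medium": 0.55,  # 0.4 <= p < 0.7
    "high": 0.85,    # 0.7 <= p < 1.0
}

def risk_exposure(impact_label: str, probability_label: str) -> float:
    """Return probability x impact for the given labels."""
    return PROBABILITY[probability_label] * IMPACT[impact_label]

# Example: a high-probability, major-impact risk scores 0.85 * 4 = 3.4.
print(round(risk_exposure("major", "high"), 2))
```

Projects often rank their Risk Management Plan entries by such a score; the checklist itself does not prescribe a formula, so treat this as one possible convention.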
W. Guidelines for Customer Feedback Rating:
INSTRUCTIONS TO SYSTEMS / PROJECT MANAGER:
1. The SAP_Customer Feedback Form is to be used for formal surveys of customer feedback at the defined periodicity (minimum every 6 months, or at the end of the project).
2. Please give this form (hard copy or soft copy) to the customer’s representative.
3. Please forward the completed forms to the Quality Manager in your region.
4. The Guidelines for Rating Criteria can be used to gain a common understanding before assigning CSI scores.
ASSIGNING SATISFACTION SCORES
A. A high satisfaction rating (>=5) shall be given if the identified attributes for each of the 6 rating criteria are met satisfactorily or SISL has excelled in meeting them.
B. A rating of 4 indicates that the parameters identified were just about enough to execute the project this time.
C. A rating of 3 or below indicates that SISL has failed to live up to many of the customer’s expectations, and SISL may not be considered for the next project unless action is taken to correct the shortcomings.
MAPPING OF SAP_CUSTOMER FEEDBACK CRITERIA TO ORGANIZATIONAL-LEVEL CUSTOMER FEEDBACK CRITERIA
Criteria in SAP_Customer Feedback Form | Criteria in Customer Feedback Form
Did the Organization meet the committed schedules most of the time, given that all requirements were known & frozen? | Adherence to schedule
Were schedule updates due to changes in requirements communicated to the customer on time? | Adherence to schedule
Were changes to baseline schedules approved by the customer? | Adherence to schedule
Did you find the processes well defined? | Process Quality
Were the defined processes followed in project execution? | Process Quality
Were all the issues resolved in the agreed time? | Product / Service Quality
Was the resolution of issues to your satisfaction? | Product / Service Quality
Could the consultant provide a satisfactory solution to your business needs? | Domain Competence
Were there performance-related issues encountered for the developments? | Domain Competence
Did your processes stop because of inadequate post go-live support? | Post-Delivery Support
Did you satisfactorily achieve all the objectives set before the project? | Value for Money
Did the project overshoot the expected budget because of SISL? | Value for Money
What is your overall satisfaction with SISL? | Overall Customer Satisfaction
Will you recommend SISL’s services to others? | Overall Customer Satisfaction
Note:
1. For organizational-level analysis, for each criterion in the Customer Feedback Form, the equivalent rating has to be calculated from the SAP_Customer Feedback Form by taking the average rating of the respective criteria.
2. For scores of 5 and above in each criterion, invite improvement suggestions in specific process areas from customers, if any, and/or ask the customer to share a best practice so that it can be practiced organization-wide (if not already).
3. For scores of 4, invite complaints/suggestions on processes which have fallen short of expectations.
4. For scores of 3 and below, the SBU Manager would be mandated to meet the customer.
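Note 1's roll-up rule, averaging the SAP-level ratings that map to each organizational criterion, can be sketched as follows. The shortened question names and sample scores are hypothetical stand-ins for the mapping table above; only the averaging rule comes from the note.

```python
# Sketch of Note 1: rolling SAP_Customer Feedback ratings up to the
# organizational criteria by averaging. The shortened question names and
# sample scores are illustrative; only the averaging rule is from the note.
from collections import defaultdict

# (SAP-level question -> org-level criterion), abbreviated from the table.
MAPPING = {
    "Met committed schedules": "Adherence to schedule",
    "Schedule updates communicated on time": "Adherence to schedule",
    "Baseline changes approved by customer": "Adherence to schedule",
    "Processes well defined": "Process Quality",
    "Defined processes followed": "Process Quality",
}

def org_level_ratings(sap_scores: dict) -> dict:
    """Average the SAP-level scores per organizational criterion."""
    buckets = defaultdict(list)
    for question, score in sap_scores.items():
        buckets[MAPPING[question]].append(score)
    return {criterion: sum(s) / len(s) for criterion, s in buckets.items()}

scores = {
    "Met committed schedules": 5,
    "Schedule updates communicated on time": 4,
    "Baseline changes approved by customer": 6,
    "Processes well defined": 4,
    "Defined processes followed": 5,
}
print(org_level_ratings(scores))
# {'Adherence to schedule': 5.0, 'Process Quality': 4.5}
```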
X. Checklist for Lessons Learnt
Sr. No | Point to be checked | |
1 | Are technology-related experiences addressed? | |
2 | Are domain-related experiences addressed? | |
3 | Are quality-related experiences addressed? | |
4 | Are all the risks encountered, and how they were handled, addressed? | |
5 | Is the customer’s involvement in the project addressed? | |
6 | Have the achievements of the project been addressed? | |
Y. Contents of Process Database
Effort |
1. The estimation data collected across the projects is consolidated to give a distribution of estimation details in terms of Planned vs. Actual. This may also assist the PM / PQM while doing the estimation for their respective projects. 2. Object-wise / CR-wise estimated effort is captured, and the data is used for future estimation. |
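The Planned-vs-Actual distribution described above is typically summarized as a percentage variance per object or CR. A minimal sketch, assuming a simple (actual − planned) / planned formula and made-up sample rows; the Process Database text does not prescribe a formula.

```python
# Minimal sketch of a Planned-vs-Actual effort consolidation.
# The variance formula and the sample rows are assumptions for illustration.

def effort_variance(planned: float, actual: float) -> float:
    """Percentage deviation of actual effort from the estimate."""
    return (actual - planned) / planned * 100

# Hypothetical object-wise / CR-wise effort data (person-hours).
records = [("Obj-A", 40, 46), ("Obj-B", 80, 76), ("CR-12", 12, 15)]

for name, planned, actual in records:
    print(f"{name}: {effort_variance(planned, actual):+.1f}%")
```

A consolidated list like this gives the distribution the text mentions and flags objects whose estimates drifted most, which is what future estimation draws on.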
Risk Database |
Risk information collected from all the projects is consolidated and made available. This may assist the PM in getting an idea of the risks that occurred in other projects while preparing the Risk Management Plan for their own project. |
Reusables |
Reuse artifacts collected from all the projects are consolidated and made available. This may assist the PM in estimating the probable reuse percentage while preparing the Project Plan for their own project. |
Tools used |
Tools used by different projects are collected, consolidated, and made available here. This may assist the PM in deciding which tools to use while preparing the Project Plan, and helps the PM select an appropriate tool for a particular task. |
Quality Goals |
Once every six months, when the indicators are updated, they are made available for the projects to follow and to update their plans accordingly. |
Process Capability Baselines |
The Process Capability Baselines are calculated once every three months from the data collected across projects. PC Baselines act as input for determining the Quality Goals. SEPG reviews them before they are released to all. The previous baselines are also maintained for reference. |
Coding Guidelines |
The best coding guidelines for each topic (e.g., ABAP, Java) are collected from the projects and made available. This is done with help from SEPG. |
Lessons Learnt |
On successful completion of the project, at the time of the END PES meeting, the lessons learnt during the execution of the project are collected and made available to all for reference. |
Training Materials |
Training materials are collected at the organization level and preserved in the repository for future use. |