Empirical Report To Evaluate and Analysis Actual Defect of System
M.Tech Scholar, CSE Department, Arya College of Engineering, Jaipur, Rajasthan, India
Assistant Professor, Department of CSE & I.T, Arya College of Engineering, Jaipur, Rajasthan, India
ABSTRACT
Current research shows that testing is a valuable activity for every project, guaranteeing its quality and performance under actual operating conditions. Before starting to test a system, we first survey the literature on the known problems in a project; because of these problems, the performance and quality of a project may fall short of its planned targets. We therefore begin by identifying a few of the common errors in projects. To do so, we must clarify the concepts of defect, error, fault, failure, and the other terms relevant to system performance. In this paper we propose simple and valuable stages of testing that capture common errors and increase the performance of a system. The paradigm of actual defects helps to classify the fault type. Finally, we summarize the paper and conclude with future scope.
Testers are required to keep improving their knowledge, because new types of problems are constantly appearing. In the presence of a defect, the functionality of software changes, and the software cannot achieve the required quality. Before proceeding further, the basic differences between the categories of problem must be clarified. A problem may appear under four different terms: fault, failure, error, and defect.

A failure is the departure of the external results of program operation from its requirements. A fault is the defect in the program that, when executed under particular conditions, causes a failure; it is the cause of the failure. A failure is dynamic: the program has to be executing for a failure to occur. A fault, by contrast, is static: it is a property of the program, not of its execution behavior, and it is created when a programmer makes an error. A failure occurs when the user perceives that a software program ceases to deliver the expected service. A fault is uncovered when either a failure of the program occurs or an internal error (e.g., an incorrect state) is detected within the program; the cause of the failure or of the internal error is said to be a fault. It is also referred to as a bug. In most cases the fault can be identified and removed; in other cases it remains a hypothesis that cannot be adequately verified [1].

When we analyze a software system, we often find that a fault is present in the program, yet the program performs the operations we need. In this case, our operational commands do not take the path where the fault is present, so the system gives the desired output, and we may say the system is reliable. But in the other case, when we give an operational command that takes the path where the fault is present, the system does not give the desired
output as we needed; in this case the system exhibits a failure. After statistical analysis we found that if only a fault is present in a software system, the system may still be said to be reliable, but in the case of a failure, the system is not reliable.

Defect: a problem found in a later phase or process than the one in which it was introduced; it cannot be repaired in the phase that created it. Error: a problem found in the current phase or process; a programmatic mistake. Bug: a deviation from our expected result.

Defect Density
Defect density is the proportion of detected defects, relative to the size of a software component or configuration item, during a predefined period of the build process.
DD = M / (M + N) [3]

where M = number of bugs detected by the test team and N = number of defects detected by the client or end user. The size of a project can be measured in function points, feature points, use cases, KLOC, etc. In general terms, defect density compares the number of defects to the (actual) size of the project.

Defect Severity
Defect severity (or impact) is a classification of a software defect (bug) indicating the degree of its negative impact on the quality of the software. On the basis of workaround and of the functionality affected, we can classify defects into four types:

Table 1: Defect Severity Classification

S. No. | Type of Defect | Status | Workaround | Example
1 | Critical | Affects critical functionality or critical data | No | Unsuccessful installation; complete failure of a feature
2 | Major | Affects major functionality or major data | Difficult | A feature is not functional from one module, but the task is doable if 10 complicated indirect steps are followed in another module
3 | Minor | Affects minor functionality or non-critical data | Easy | A minor feature that is not functional in one module, but the same task is easily doable from another module
4 | Trivial | Does not affect functionality or data | Not needed | Petty layout discrepancies; spelling/grammatical errors

Note: Trivial defects do not impact productivity or efficiency; they are merely an inconvenience. Defect severity is represented by DS.
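The defect density formula above can be sketched as a small function (a sketch; the function name and the zero-defect guard are additions, not part of the paper):

```python
def defect_density(team_defects, client_defects):
    """DD = M / (M + N), where M is the number of bugs detected by the
    test team and N the number detected by the client or end user."""
    total = team_defects + client_defects
    if total == 0:
        raise ValueError("no defects recorded")  # guard added for safety
    return team_defects / total

# e.g. 40 defects found by the test team, 10 found by the client:
print(defect_density(40, 10))  # 0.8
```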
CAUTION
Testers often feud with the development team over defect severity. For example, in a project a defect may be classified as of Major severity by the tester, but the developers refuse to agree, believing that the defect the tester classified is of Minor or Trivial severity.

Defect Probability
The visibility or probability of defects (bugs) indicates how many of the defects are actually encountered by users:

If (the user encounters all of the defects) { P(D) = 1; defect probability is HIGH; }
Else if (the user encounters 50% of the defects) { P(D) = 0.5; defect probability is MEDIUM; }
Else { P(D) = 0; defect probability is LOW; }

P(D) stands for defect probability; it can also be measured as a percentage (%). Defect probability (visibility) is measured with respect to the usage of a feature, not of the overall software.

Defect Priority
Fixing defects (bugs) is the highest-priority work in the field of testing, and the classical presentation of this work is known as defect (bug) priority. The priority should be initialized by the software tester and finalized by the project manager. No single authority classifies defect priority, because it is a subjective decision that changes from case to case depending on human judgment. A few common categorizations are listed below:

Table 2: Defect Priority Categorizations

S. No. | Type of Priority | Denotation | Description
1 | Urgent | Dp1 | The defect must be fixed in (before) the next build
2 | High | Dp2 | The defect must be fixed in one of the upcoming builds, but the fix should be included in the release
3 | Medium | Dp3 | The defect may be fixed after the release / in the next release
4 | Low | Dp4 | The defect may or may not be fixed at all
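The defect probability rules above can be written as a small classifier (a sketch; the 0.5 value for the MEDIUM branch and the threshold comparisons are assumptions read from the text, which only names the 50% case):

```python
def defect_probability(encountered, total):
    """Return (P(D), label) following the HIGH/MEDIUM/LOW rules above.

    `encountered` is the number of defects users actually hit, out of
    `total` known defects. The 0.5 MEDIUM value is an assumption.
    """
    if total == 0:
        raise ValueError("no defects recorded")
    p = encountered / total
    if p == 1:
        return p, "HIGH"
    if p >= 0.5:
        return p, "MEDIUM"
    return p, "LOW"

print(defect_probability(10, 10))  # (1.0, 'HIGH')
print(defect_probability(5, 10))   # (0.5, 'MEDIUM')
print(defect_probability(1, 10))   # (0.1, 'LOW')
```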
According to the ISTQB, priority is the level used to classify the importance of an assigned task. It is necessary for the management team, to avoid shipping the product when there may be a large number of outstanding defects.
Figure 2: Bug Life Cycle (Varies from Organization to Organization)
The life cycle of a bug (defect) is a complete tour of the bug's activities, from its identification to its finalization. It may vary from organization to organization.
Figure 3: Bug Life Cycle

Defect Report
The tester recognizes a bug (defect) and creates a proper deficiency report. The main motive of this report is to make the problem visible so that developers can easily reproduce and fix it. It is essential to report defects effectively so that time and effort are not unnecessarily wasted in trying to understand and reproduce the defect.

Defect Detection Efficiency
DDE (Defect Detection Efficiency) is measured as a percentage (%). Suppose that m defects are injected during phase I, all of them are detected during phase I, and none are transmitted to subsequent phases; then the defect detection efficiency is 100%. The cost of fixing a defect is highest in the later phases. DDE is
more applicable in identifying and measuring process quality (process efficiency) within the software development life cycle.

Example

Phase | Injected Defects | Detection Activity | Detected Defects | Detected Defects Injected in the Same Phase | DDE
Requirements | 18 | Requirement Review | 6 | 6 | 33.33% [= 6 / 18]
Design | 23 | Design Review | 16 | 14 | 60.8% [= 14 / 23]
Coding | 175 | Code Review | 23 | 25 | 14.28% [= 25 / 175]
Unit Testing | 0 | Unit Testing | 25 | – | –
Integration Testing | 0 | Integration Testing | 83 | – | –
System Testing | 0 | System Testing | 30 | – | –
Acceptance Testing | 0 | Acceptance Testing | 5 | – | –
Operation | 0 | Operation | 3 | – | –
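The per-phase DDE figures in the example can be reproduced with a short loop (a sketch; only the three phases with non-zero injected defects are included):

```python
# (injected defects, defects detected that were injected in the same phase)
phases = {
    "Requirements": (18, 6),
    "Design": (23, 14),
    "Coding": (175, 25),
}

for name, (injected, same_phase_detected) in phases.items():
    dde = same_phase_detected / injected * 100
    print(f"{name}: DDE = {dde:.2f}%")
# Requirements: DDE = 33.33%
# Design: DDE = 60.87%
# Coding: DDE = 14.29%
```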
The DDE of the Requirements phase is 33.33%, which can definitely be bettered; the requirement review can be strengthened. The DDE of the Design phase is 60.8%, which is relatively good but can be bettered. The DDE of the Coding phase is only 14.28%. The DDE of this phase is usually low, because most defects get injected during it, but one should definitely aim higher by strengthening the code review.

Defect Age

S. No. | Measuring Term | Application | Example
1 | Time | To measure the responsiveness of the development/testing team; the lesser the age, the better the responsiveness. | If a defect was detected on 01/01/2009 10:00:00 AM and closed on 01/05/2009 12:00:00 PM, the defect age is 98 hours.
2 | Phases | For assessing the effectiveness of each phase and of any review/testing activity; the lesser the age, the better the effectiveness. | Say the software life cycle has the phases Requirements Development, High-Level Design, Detail Design, Coding, Unit Testing, Integration Testing, System Testing, and Acceptance Testing. If a defect is identified in System Testing and was introduced in Requirements Development, the defect age is 6.
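Both measures of defect age can be computed directly from the examples above (a sketch; the phase list simply mirrors the life cycle named in the table):

```python
from datetime import datetime

# Defect age in time: from detection to closure.
detected = datetime(2009, 1, 1, 10, 0, 0)
closed = datetime(2009, 1, 5, 12, 0, 0)
age_hours = (closed - detected).total_seconds() / 3600
print(f"{age_hours:.0f} hours")  # 98 hours

# Defect age in phases: distance between injection and detection phases.
PHASES = [
    "Requirements Development", "High-Level Design", "Detail Design",
    "Coding", "Unit Testing", "Integration Testing", "System Testing",
    "Acceptance Testing",
]
age_phases = (PHASES.index("System Testing")
              - PHASES.index("Requirements Development"))
print(f"{age_phases} phases")  # 6 phases
```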
Defect Removal Efficiency (DRE)

DRE = (defects found by the internal team / total number of defects found) * 100

Thus, DRE is the percentage of bugs eliminated by reviews, inspections, tests, etc. Example: if 95 defects are found internally and the client finds around 5 defects, then DRE = (95 / (5 + 95)) * 100 = 95%.
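The DRE example works out as follows (a trivial sketch of the formula above; the function name is an addition):

```python
def dre(internal_defects, client_defects):
    """Defect Removal Efficiency: percentage of all found defects that
    the internal team removed before release."""
    return internal_defects / (internal_defects + client_defects) * 100

print(dre(95, 5))  # 95.0
```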
Select the nearest and most appropriate required area of testing:
(i) Create a list of the common problems in software, together with their appropriate testing techniques.
(ii) Arrange the problems in an order, each with its testing technique.
c. Search for the problem:
(ii) Search for a problem that matches the list of common problems.
(iii) If the problem differs from the general problems, apply a detection technique.
d. Capture the problem.
e. Classify the category of the problem:
(i) Classify the class of the problem; it may belong to the class of fault, failure, error, defect, etc.
(ii) Further classify within those categories. For example, if a problem falls into the defect category, classify the defect with which the problem matches.
(vii) Work from the top down to the bottom of the classification, so that the main center of the problem is clear and can easily be understood and solved.
f. Apply the common solving methods to the problem:
(i) Determine the common methods for the classified category that solve the problem.
(ii) Apply these common methods to solve the problem.
(iii) The solution propagates from the node (the child site of the classified problem) to the top (the main node).
g. Check, one by one, each methodology that may be applicable:
(i) Apply the nearest TMA, step by step.
(ii) Save every result in tabulated form.
(iii) Use the probability of the nearest possible solution (use the result table and look for the result value on which the most methods agree).
(iv) That value may be taken as the final result.
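The overall flow of the steps above might be sketched roughly as follows. Every name and data value here (COMMON_PROBLEMS, detect, classify, pick_final_result) is hypothetical and only illustrates the shape of the procedure, not the paper's actual implementation:

```python
from collections import Counter

# Steps (i)/(ii): a list of common problems with their testing techniques
# (the entries are invented examples, not from the paper).
COMMON_PROBLEMS = {
    "boundary overflow": "boundary-value analysis",
    "null dereference": "unit testing",
}

def detect(problem):
    # Steps (c)(ii)-(iii): use the matching technique if the problem is
    # common; otherwise fall back to a generic detection technique.
    return COMMON_PROBLEMS.get(problem, "generic detection technique")

def classify(found_in_introducing_phase):
    # Step (e), simplified: per the definitions earlier in the paper, a
    # problem found in the phase that introduced it is an error; one
    # found in a later phase is a defect.
    return "error" if found_in_introducing_phase else "defect"

def pick_final_result(method_results):
    # Step (g): tabulate each method's result and keep the value on
    # which the most methods agree.
    return Counter(method_results).most_common(1)[0][0]

print(detect("null dereference"))                      # unit testing
print(classify(False))                                 # defect
print(pick_final_result(["fix A", "fix B", "fix A"]))  # fix A
```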
APPLICATION
This model is applicable to the examination of software and of hardware. Its other most common uses are listed below:
(i) Designing electronic tools
(ii) The software development life cycle (software testing)
(iii) Examining mechanical tools
ACKNOWLEDGEMENTS
The authors are thankful to all the audience and the people who supported us, directly or indirectly.
REFERENCES
1. M. R. Lyu, Articles on Software Reliability Theory, The Chinese University of Hong Kong, 2002.
2. http://www.geekinterview.com/question_details/15436
3. http://softwaretestingfundamentals.com/defect-density/
4. http://geekswithblogs.net/dthakur/archive/2004/07/05/7617.aspx
5. http://www.softwaretestingtimes.com/2010/04/defect-density.html
6. http://blogs.construx.com/forums/p/695/1556.aspx