Code Review Checklist - Best Practices


Automation Development Code Review Checklist.

General Checklist:
S.No.  Review Guidelines  Verified (Yes/No)

1 Test Plan should be approved before starting a test script

2 Test script should be unit tested before sending for Internal/External Review

3 A new file should be created as per the standard naming convention, and the name should signify the test's purpose. E.g.: SanityAliveTests (capitalize the first letter of each word/Pascal casing)

4 Add Comcast Headers for New files


/*
 *===================================================================
 * This file is the intellectual property of Comcast Corp. It
 * may not be copied in whole or in part without the express written
 * permission of Comcast or its designees.
 *===================================================================
 * Copyright (c) 2018 Comcast Corp. All rights reserved.
 *===================================================================
 */

5 Package path and import should be at the top

6 @author should be included.

7 @version should be included as required.

8 Class variables should be declared at the beginning of the class.

9 Variable names should follow the standard naming convention (camel casing, no underscores, no numbers). Boolean
variables should start with 'is'/'does' etc., e.g. isFilePresent. For collections, use plural names.
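
As an illustration of the naming rules in item 9 (the class, method, and variable names below are hypothetical, not from any project library):

```java
import java.util.Arrays;
import java.util.List;

// Illustration of item 9: camelCase variables, boolean names prefixed
// with 'is'/'does', and plural names for collections.
public class NamingConventionExample {

    public static boolean isFilePresent(String fileName, List<String> availableFileNames) {
        // 'availableFileNames' is plural because it holds a collection.
        boolean doesFileExist = availableFileNames.contains(fileName);
        return doesFileExist;
    }

    public static void main(String[] args) {
        List<String> logFileNames = Arrays.asList("receiver.log", "ocapri_log.txt");
        System.out.println(isFilePresent("receiver.log", logFileNames)); // true
    }
}
```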

10 Check for existing utils - Project lib & external lib before creating a new util

11 Add Methods in CommonUtils only if they can be reused across various tests/features

12 Check whether constants already exist in TestConstants, TraceConstants, or CommandConstants before adding new ones

13 All strings/text/versions/numbers/regex patterns should be defined as constants

14 All command/file name constants should start with CMD_/FILE_ respectively

15 Log message constants should start with LOG_

16 Method names should follow the standard naming convention and the name should signify the function's
purpose (camel casing, no underscores, no numbers).

E.g.: tuneToChannelAndVerify() - starts with lower case and every subsequent word starts with an uppercase
character.

17 All Methods should contain ENTERING and EXITING log messages in DEBUG level

18 All methods and files should contain detailed comment headers (Javadoc format)
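
A minimal sketch of items 17-18, using java.util.logging as a stand-in for the project's LOGGER class (the method and its range check are hypothetical):

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Sketch of items 17-18: a Javadoc comment header plus ENTERING/EXITING
// messages at DEBUG level (Level.FINE in java.util.logging).
public class LoggingHeaderExample {

    private static final Logger LOGGER = Logger.getLogger(LoggingHeaderExample.class.getName());

    /**
     * Verifies that the given channel number falls within the tunable range.
     *
     * @param channelNumber the channel number to validate
     * @return true if the channel number is tunable, false otherwise
     */
    public static boolean isChannelTunable(int channelNumber) {
        LOGGER.log(Level.FINE, "ENTERING METHOD isChannelTunable");
        boolean result = channelNumber > 0 && channelNumber <= 9999;
        LOGGER.log(Level.FINE, "EXITING METHOD isChannelTunable");
        return result;
    }

    public static void main(String[] args) {
        System.out.println(isChannelTunable(105)); // true
    }
}
```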

19 All class, variable, and method modifiers should be validated for correctness

20 Classes should be packaged based on the particular feature (Test and Utils)

21 All computed Strings/Objects should have null-check validation using CommonMethods.isNotNull()

22 Complex algorithms should be explained with references/comments.
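
CommonMethods.isNotNull() is project-specific and not available here, so the sketch of item 21 below uses a local stand-in assumed to check for null and empty values:

```java
// Illustrative null-check validation for computed strings (item 21).
// isNotNull() is a local stand-in for the project's CommonMethods.isNotNull(),
// assumed to reject null and blank values.
public class NullCheckExample {

    static boolean isNotNull(String value) {
        return value != null && !value.trim().isEmpty();
    }

    public static String extractFirmwareVersion(String commandResponse) {
        // Validate the computed string before using it.
        if (isNotNull(commandResponse)) {
            return commandResponse.trim();
        }
        return "";
    }

    public static void main(String[] args) {
        System.out.println(extractFirmwareVersion("  2.1.0  "));       // 2.1.0
        System.out.println(extractFirmwareVersion(null).isEmpty());    // true
    }
}
```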

23 TEST_GROUP and TEST_CATEGORY should be appropriately added

24 Use RegEx/PatternFinder/PatternMatcher to perform any string manipulation (try to avoid split/contains etc.)
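
A sketch of item 24: extracting a value with java.util.regex instead of split()/contains(). The log line and pattern are illustrative:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of item 24: extract a value with Pattern/Matcher rather than
// chaining split()/contains() calls.
public class PatternFinderExample {

    public static String findProcessId(String logLine) {
        Pattern pattern = Pattern.compile("pid\\s*=\\s*(\\d+)");
        Matcher matcher = pattern.matcher(logLine);
        // Return the first captured group if the pattern matches, else "".
        return matcher.find() ? matcher.group(1) : "";
    }

    public static void main(String[] args) {
        System.out.println(findProcessId("process started with pid = 4231")); // 4231
    }
}
```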

25 Ideally, call a Utils method/API to validate the step

26 Check whether the code is applicable to all devices; if not, ensure the condition is handled

27 Use grep -I <text> <filename>; do not use cat xxx | grep yyy

28 Be careful when using external code, to avoid Black Duck violations.

29 Incomplete code is marked with //TODO or //FIXME markers.

30 Add JIRA ticket number wherever appropriate

31 Specific Exceptions to be used - Preferrably Test Exception. Others to be used as applicable

32 Array indices are always checked to avoid ArrayIndexOfBounds exceptions.

33 Invalid parameter values are handled properly early in methods (Fast Fail).
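
The fast-fail rule in item 33 can be sketched as follows (method and parameter names are hypothetical):

```java
// Sketch of item 33: validate parameters early and fail fast before
// doing any real work, instead of failing deep inside the logic.
public class FastFailExample {

    public static int parseChannelNumber(String channelNumber) {
        // Fast fail: reject invalid input at the top of the method.
        if (channelNumber == null || !channelNumber.matches("\\d+")) {
            throw new IllegalArgumentException("Invalid channel number: " + channelNumber);
        }
        return Integer.parseInt(channelNumber);
    }

    public static void main(String[] args) {
        System.out.println(parseChannelNumber("105")); // 105
    }
}
```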

34 An Error handler/finally block must clean up state and resources no matter where an error occurs.
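
A minimal sketch of item 34, with a hypothetical FakeConnection standing in for a real resource:

```java
// Sketch of item 34: the finally block cleans up state no matter where
// an error occurs during the step.
public class CleanupExample {

    static class FakeConnection {
        boolean open = true;
        void close() { open = false; }
    }

    public static boolean runStep(FakeConnection connection, boolean failStep) {
        try {
            if (failStep) {
                throw new RuntimeException("step failed");
            }
            return true;
        } catch (RuntimeException e) {
            return false;
        } finally {
            // Executes on both the success and the failure path,
            // so the resource is never leaked.
            connection.close();
        }
    }

    public static void main(String[] args) {
        FakeConnection connection = new FakeConnection();
        System.out.println(runStep(connection, true)); // false
        System.out.println(connection.open);           // false: closed despite the error
    }
}
```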

35 Avoid large objects in memory, or using String to hold large documents which should be handled with better
tools. For example, don't read a large XML document into a String.

36 Do not leave debugging code in production code.

37 Avoid System.out.println() statements in code. Use the LOGGER class

38 Comments should be added for every step/command executed, to explain the purpose of the step

39 Every log added should be clear and precise. 

40 Code should be formatted using XRE_Automation_Formatter_v0.2.xml 

41 Do a PMD check before code review; this will surface violations in the test code

42 For internal/external review, send a mail to the reviewer with the following details:
 Test log attached
 Test plan attached
 Gerrit ID(s) mentioned (project specific)
 Code review checklist

43 For external review (only if the internal review is complete). (Project specific)

 Add the test log to the JIRA ticket (the complete log, not the Jenkins job link)
 Raise the external review on the topic_ci_develop/stable2 branch only

44 All condition blocks and loops must be enclosed in {}.

45 CommonMethods.patternFinder() -> Get regex match output.

46 Use equals() for object comparison (never ==) and equalsIgnoreCase() for case-insensitive value comparison
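
Item 46 in a nutshell (the status strings are illustrative):

```java
// Sketch of item 46: compare objects with equals() rather than ==, and
// use equalsIgnoreCase() when case does not matter.
public class ComparisonExample {

    public static boolean isExpectedStatus(String actualStatus, String expectedStatus) {
        // Case-insensitive value comparison, e.g. "TRUE" vs "true".
        return actualStatus != null && actualStatus.equalsIgnoreCase(expectedStatus);
    }

    public static void main(String[] args) {
        String status = new String("COMPLETED");
        System.out.println(status == "COMPLETED");            // false: reference comparison
        System.out.println(status.equals("COMPLETED"));       // true: object comparison
        System.out.println(isExpectedStatus("True", "true")); // true
    }
}
```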

47 Logging should follow the HTML logging requirements.


48 Test case impact should be updated in Automatics when test weightage is 1.(Project Specific)

49 Use private branches to make changes to the code, get them reviewed, and merge to the develop branch.

50 Keep the Single Responsibility Principle in mind when writing code.

51 Use indentation to mark the beginning and end of control structures

52 Avoid Deep Nesting
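
One common way to satisfy item 52 is guard clauses that return early instead of nesting; the method and statuses below are hypothetical:

```java
// Sketch of item 52: guard clauses keep nesting shallow. Each check
// returns early rather than wrapping the rest of the method in an else.
public class GuardClauseExample {

    public static String classifyResponse(String response) {
        if (response == null) {
            return "NO_RESPONSE";
        }
        if (response.trim().isEmpty()) {
            return "EMPTY_RESPONSE";
        }
        if (!"SUCCESS".equals(response.trim())) {
            return "FAILURE";
        }
        return "SUCCESS";
    }

    public static void main(String[] args) {
        System.out.println(classifyResponse("SUCCESS")); // SUCCESS
        System.out.println(classifyResponse(null));      // NO_RESPONSE
    }
}
```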

53 Avoid creating unnecessary objects. Always remove unused object references

Best Practices / Util Info:

S.No.  Review Guidelines

1 LOGGER.debug() - messages to debug the code

2 LOGGER.info() - logs to help understand the execution

3 LOGGER.error() - only to log failures and exceptions (avoid exception.printStackTrace())

4 LOGGER.info("S1) <TEST STEP>");
LOGGER.info("S1 - EXPECTED - <EXPECTED_RESULT>.");
<VALIDATION STEPS>
result = <COMPUTE_RESULT>;
errorMessage = <ERROR_MESSAGE>;
LOGGER.info("S1 - ACTUAL: " + (result ? <MESSAGE> : errorMessage));

5 Pre-Condition (Throw - catch)


if (!result) {
    errorMessage = "UNABLE TO VERIFY THE PRESENCE OF DCM SCRIPT LOG FILE.";
    LOGGER.error(errorMessage);
    throw new TestException(CompassTestConstants.PRE_CONDITION_ERROR + errorMessage);
}
CommonUtils.updateTestStatusDuringException(tapEnv, settop, testCaseId, step, result, errorMessage, true);

6 Use tapEnv.searchAndWaitForTrace() to search logs in ocapri_log.txt and receiver.log

7 Use CommonUtils.searchLogFiles() to search a file for a non-null response

8 Use CommonUtils.getPidOfProcess() for checking whether a process is running instead of systemctl/ps -ef

9   Test class name -> Compass<Functionality Name>Test.java. For a reboot test, the class name should be CompassRebootTest.java,
etc.

10   Package name -> com.comcast.merlin.<featureName>

11   Utility class name -> Compass<Functionality Name>Utils.java

12   Constant classes -> We already have CompassTestConstants.java, CompassCommandConstants.java,
CompassTraceConstants.java, CompassWebPaConstants, etc. Before defining any new constants, please refer to the parent class
defined in rdkv-utils.

13   CompassCommonUtils.enableDisableReportingService() -> To enable or disable any reporting service along with its polling &
reporting period.

14   CompassCommonUtils.postHttpRequest()

15   CommonUtils.rebootAndWaitForIpAccusition() -> Device reboot

16   Use StringBuffer for string concatenation instead of using “+”.
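
Item 16 can be sketched as follows (the command being built is illustrative; for single-threaded code, StringBuilder is the usual modern alternative):

```java
// Sketch of item 16: build a string with StringBuffer instead of
// repeated "+" concatenation inside a loop.
public class StringBufferExample {

    public static String buildCommand(String[] options) {
        StringBuffer command = new StringBuffer("grep");
        for (String option : options) {
            // Each append reuses the same buffer instead of allocating
            // a new String for every "+" concatenation.
            command.append(" ").append(option);
        }
        return command.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildCommand(new String[] {"-I", "error", "receiver.log"}));
    }
}
```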


Security Checklist:

S.No.  Review Guidelines

1 Always avoid sensitive data leaks (using production accounts in test files, PII, credit card numbers, etc.)

2 DO NOT HARDCODE USERNAME, PASSWORDS, ACCESS KEYS OR LIKEWISE.

3 Log only necessary information and obfuscate sensitive data.

4 ALWAYS STORE YOUR SECRETS IN VAULTS. When not possible, HASH THEM.
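
As a minimal sketch of item 4 when no vault is available (SHA-256 is shown because it ships with the JDK; for passwords, a salted, slow hash such as bcrypt or PBKDF2 is preferable):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Sketch: store and log only a SHA-256 hash of a secret, never the
// plain value.
public class SecretHashExample {

    public static String sha256Hex(String secret) {
        try {
            MessageDigest digest = MessageDigest.getInstance("SHA-256");
            byte[] hashBytes = digest.digest(secret.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : hashBytes) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            // Every compliant JDK ships SHA-256, so this should never happen.
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        // Only the hash ever reaches logs or reports.
        System.out.println(sha256Hex("example-secret").length()); // 64
    }
}
```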

5 Do not log any secrets. NEVER!!!

6 Do not PRINT any secrets in reports generated. NEVER!!!

