
Logging Standards

1 Objective:
2 Centralized Logging.
2.1 What are we using for centralized logging?
3 Use a standard interface across different projects.
3.1 Java Projects:
3.1.1 How do I add slf4j to my project?
3.1.2 How do I use the slf4j interface in Java?
3.2 How do I log in Apache Camel?
3.3 How do I log in BPM processes?
3.4 How do I log in SNOW?
4 Categorization of logs into levels
5 Standard easy to read format.
5.1 The multiple log lines per logging operation problem.
5.2 The 1 to 1 logging format.
5.3 The JSON log fields.
6 Logging patterns and anti patterns.
6.1 Log Flooding:
6.1.1 How to avoid log flooding.
6.1.1.1 Anti Pattern:
6.1.1.2 Pattern:
6.2 Senseless Logging.
6.2.1 How to avoid senseless logging.
6.2.1.1 Anti Pattern Example
6.2.1.2 Pattern:
6.3 Log and Throw
6.3.1 Anti-pattern:
6.3.2 Pattern:
6.4 Withhold the exception
6.4.1 Anti-pattern:
6.4.2 Pattern:
6.5 Use of System.out.println, System.err.println, e.printStackTrace
6.5.1 Anti-pattern
6.5.2 Pattern

Objective:
Define a set of standards and patterns that meet the following criteria:

1. Centralized logging.
2. Use a standard interface across different projects.
3. Categorization of logs.
4. Standard easy to read format.
5. Logging patterns and anti patterns.

Centralized Logging.
To enable management of logs across several projects/applications it is imperative that we use a centralized logging solution. Centralized logging comes
with several benefits, which are listed below:

Improved log data availability.
Better security.
Improved system-wide overview.
Application-level monitoring.

What are we using for centralized logging?


We are currently using the Elasticsearch, Logstash and Kibana (ELK) stack for centralized logging.
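As an illustrative sketch only, a minimal Logstash pipeline that accepts JSON log lines (the format described later in this document) and forwards them to Elasticsearch might look like the fragment below. The port, host and index name here are assumptions for illustration, not our actual configuration:

```
input {
  tcp {
    # applications ship one JSON object per line
    port  => 5000
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
```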

Use a standard interface across different projects.

Java Projects:
For Java projects all logging must happen via the Simple Logging Facade for Java (SLF4J) interface. This interface abstracts the underlying logging implementation so
that developers can simply log using a standard interface. This also allows us to choose or change a logging implementation without developers having to
retrain.
Interface first, then implementation details.

Note that the instructions below are not complete: they only detail the interfaces. SLF4J cannot run by itself; for the sake of sanity we briefly outline the
implementation we are using at the end.

How do I add slf4j to my project?


Add the following dependency to your Maven pom file:

SLF4J Maven Dependency

<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
</dependency>
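Because SLF4J is only a facade, an implementation must also be on the classpath before any log output appears. As a sketch (version numbers omitted on the assumption that they are managed by a parent POM or BOM, as with the slf4j-api entry above), a Logback-based setup that can also produce the JSON format described later in this document would add:

```xml
<!-- Logback implementation behind the SLF4J facade -->
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
</dependency>
<!-- Logstash-friendly JSON encoder for Logback -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
</dependency>
```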

How do I use the slf4j interface in Java?


This is really simple: declare a logger and use it as shown in the code snippet below.
Logging with SLF4J

package com.ventia.wms.opti.api;

import java.io.BufferedReader;
import java.io.DataOutputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class API
{
    public static String address;

    private static final Logger logger = LoggerFactory.getLogger(API.class);

    public static String HTTPPost(String relativePath, String body)
    {
        if (address != null)
        {
            try
            {
                String urlStr = address + "/" + relativePath;
                URL url = new URL(urlStr);
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setRequestMethod("POST");
                conn.setRequestProperty("Accept", "application/json");
                conn.setRequestProperty("Content-Type", "application/json");
                conn.setDoOutput(true);
                DataOutputStream wr = new DataOutputStream(conn.getOutputStream());
                logger.debug("Call: {} {}", urlStr, body);
                wr.writeBytes(body);
                wr.flush();
                wr.close();

                BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
                String inputLine;
                StringBuilder response = new StringBuilder();
                while ((inputLine = in.readLine()) != null)
                    response.append(inputLine);
                in.close();
                logger.info("The REST response was: {}", response);
                return response.toString();
            }
            catch (Exception e)
            {
                logger.error("Error in HTTPPost:", e);
                throw new RuntimeException(e.getMessage(), e.getCause());
            }
        }
        else
        {
            logger.warn("No API URI configured; throwing an exception");
            throw new RuntimeException("No API REST address configured");
        }
    }
}

How do I log in Apache Camel?


In Camel we use the XML DSL, which is a little different from Java logging but still uses SLF4J. Below is an example of Camel logging.

Maven Dependencies

Since Apache Camel is essentially a Java project, you need to add the same dependencies to the Maven file as outlined in the Java section.

Apache Camel Logging Example

<route id="foo_service">
    <from id="foo_service_from" uri="direct:foo_service"/>
    <log logName="com.ventia.wms.fuse.foo_service" loggingLevel="INFO" message="foo_service request header: ${headers}"/>
    <log logName="com.ventia.wms.fuse.foo_service" loggingLevel="INFO" message="foo_service request body: ${body}"/>
    <setHeader headerName="CamelHttpMethod" id="foo_service_set_http_method">
        <constant>GET</constant>
    </setHeader>
    <setHeader headerName="CamelHttpPath" id="foo_service_set_http_path">
        <constant/>
    </setHeader>
    <inOut id="foo_service_call" uri="http4://fooservice.com"/>
    <unmarshal id="foo_service_unmarshal">
        <json library="Jackson"/>
    </unmarshal>
    <log logName="com.ventia.wms.fuse.foo_service" loggingLevel="INFO" message="foo_service response header: ${headers}"/>
    <log logName="com.ventia.wms.fuse.foo_service" loggingLevel="INFO" message="foo_service response body: ${body}"/>
</route>

How do I log in BPM processes?


In BPM there are four logging methods exposed in the Pulse class; they log at the levels indicated below:

logger
logDebug
logInfo
logError

logger Method

The logger method in the Pulse class internally calls logInfo with the provided string.

Include lines like the ones below in your Script task to log in a standard fashion:

Script Task logging example

com.ventia.Pulse.logger(data_task_id, 'I like logging at INFO, but really I should call LogInfo');
com.ventia.Pulse.logDebug(data_task_id, 'I like logging at DEBUG');
com.ventia.Pulse.logInfo(data_task_id, 'I like logging at INFO');
com.ventia.Pulse.logError(data_task_id, 'I like logging at ERROR');

How do I log in SNOW?


ServiceNow provides a couple of GlideSystem methods (e.g. gs.log) for writing to the system log, but these are not ideal because there is no way to set
levels to filter out the noise.

Instead, ServiceNow provides the script include GSLog out of the box.
GSLog provides the following benefits:

It can log at the following levels: Debug, Info, Notice, Warning, Error and Critical.
It will tag the log entry with a caller label so that the source of the entry can easily be identified.
The level of logging can be set simply by modifying a system property.

Usage for non-class functions

//definition
var gl = new GSLog('ventia.log.level', '<script include / business rule name>');

//usage
gl.debug('<Say debug/testing stuff here>');

gl.error('<Say error message here>');

Usage for class based functions

//definition inside class initialize
initialize: function() {
    this.logger = new GSLog('ventia.log.level', '<class name>');
    this.logger.debug('initialized');
},

//usage
this.logger.debug('<Say debug/testing stuff here>');

this.logger.error('<Say error message here>');

Categorization of logs into levels


SLF4J provides several logging levels that we can use. Choosing the correct level to log at is critical, as we can easily filter on the logging level.

See the table for more details:

INFO
    Suitable environments: Local Development, Development Servers, Production Servers
    When to use: Use to log inputs and outputs only.

ERROR
    Suitable environments: Local Development, Development Servers, Production Servers
    When to use: Use to log exceptions when they are handled.

WARN
    Suitable environments: Local Development, Development Servers, Production Servers
    When to use: Use to log when normal logic flow is not working. Use to log exceptions that you throw up the stack.

DEBUG
    Suitable environments: Local Development, Development Servers
    When to use: Use to log normal logic flow and any supporting variables.

TRACE
    Suitable environments: Local Development, Development Servers
    When to use: Use to log everything but the kitchen sink. Beware: this can cause analysis paralysis.
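The filtering described above works because each level has a numeric severity: an event is emitted only if its severity is at or above the configured threshold. The sketch below is illustrative only; the numeric values mirror the level_value numbers seen in the JSON examples in this document (INFO = 20000, ERROR = 40000), which is how Logback numbers its levels, but treat the exact values as an assumption:

```java
public class LevelDemo {
    // Illustrative severity ordering; real backends do this internally.
    enum Level {
        TRACE(5000), DEBUG(10000), INFO(20000), WARN(30000), ERROR(40000);

        final int value;

        Level(int value) { this.value = value; }

        // An event at this level is emitted only when it meets the threshold.
        boolean isEnabledFor(Level threshold) { return this.value >= threshold.value; }
    }

    public static void main(String[] args) {
        // Production servers typically run at INFO:
        System.out.println(Level.DEBUG.isEnabledFor(Level.INFO)); // false: DEBUG is filtered out
        System.out.println(Level.ERROR.isEnabledFor(Level.INFO)); // true: ERROR is kept
    }
}
```

This is why DEBUG and TRACE are safe to use liberally in code: they cost nothing on a production server configured at INFO, yet can be switched on when needed.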

Standard easy to read format.

The multiple log lines per logging operation problem.


Typically logging output does not have a well defined structure, even when developers assume it does. Developers assume that
executing a log operation will result in one event/log line in the logs; however, there are cases where one logging instruction creates multiple log entries.

For example:

Log A Item

log.info("hello world");

Would produce a log like this:

Log Output

2019-09-25 10:22:28.543 INFO 7828 --- [main] c.ventia.wms : hello world

However when logging exceptions this assumption falls apart as stack traces are logged as several log lines, see the example below:

One log event several log lines

04:10:20.374 [@project.name@-thread-84] WARN o.a.c.c.jms.EndpointMessageListener - Execution of JMS message listener failed. Caused by: [org.apache.camel.RuntimeCamelException - com.microsoft.sqlserver.jdbc.SQLServerException: Conversion failed when converting date and/or time from character string.]
org.apache.camel.RuntimeCamelException: com.microsoft.sqlserver.jdbc.SQLServerException: Conversion failed when converting date and/or time from character string.
at org.apache.camel.util.ObjectHelper.wrapRuntimeCamelException(ObjectHelper.java:1826)
at org.apache.camel.component.jms.EndpointMessageListener$EndpointMessageListenerAsyncCallback.done
(EndpointMessageListener.java:196)
at org.apache.camel.component.jms.EndpointMessageListener.onMessage(EndpointMessageListener.java:117)
at org.springframework.jms.listener.AbstractMessageListenerContainer.doInvokeListener
(AbstractMessageListenerContainer.java:736)
at org.springframework.jms.listener.AbstractMessageListenerContainer.invokeListener
(AbstractMessageListenerContainer.java:696)
at org.springframework.jms.listener.AbstractMessageListenerContainer.doExecuteListener
(AbstractMessageListenerContainer.java:674)
at org.springframework.jms.listener.AbstractPollingMessageListenerContainer.doReceiveAndExecute
(AbstractPollingMessageListenerContainer.java:318)
at org.springframework.jms.listener.AbstractPollingMessageListenerContainer.receiveAndExecute
(AbstractPollingMessageListenerContainer.java:245)
at org.springframework.jms.listener.DefaultMessageListenerContainer$AsyncMessageListenerInvoker.
invokeListener(DefaultMessageListenerContainer.java:1189)
at org.springframework.jms.listener.DefaultMessageListenerContainer$AsyncMessageListenerInvoker.
executeOngoingLoop(DefaultMessageListenerContainer.java:1179)
at org.springframework.jms.listener.DefaultMessageListenerContainer$AsyncMessageListenerInvoker.run
(DefaultMessageListenerContainer.java:1076)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Conversion failed when converting date and/or time
from character string.
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:254)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet$FetchBuffer.nextRow(SQLServerResultSet.java:5378)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet.fetchBufferNext(SQLServerResultSet.java:1754)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet.next(SQLServerResultSet.java:1018)
at org.apache.commons.dbcp2.DelegatingResultSet.next(DelegatingResultSet.java:1160)
at org.apache.commons.dbcp2.DelegatingResultSet.next(DelegatingResultSet.java:1160)
at org.apache.camel.component.jdbc.ResultSetIterator.loadNext(ResultSetIterator.java:123)
at org.apache.camel.component.jdbc.ResultSetIterator.<init>(ResultSetIterator.java:66)
at org.apache.camel.component.jdbc.JdbcProducer.setResultSet(JdbcProducer.java:319)
at org.apache.camel.component.jdbc.JdbcProducer.doCreateAndExecuteSqlStatement(JdbcProducer.java:225)
at org.apache.camel.component.jdbc.JdbcProducer.createAndExecuteSqlStatement(JdbcProducer.java:125)
at org.apache.camel.component.jdbc.JdbcProducer.processingSqlBySettingAutoCommit(JdbcProducer.java:86)
at org.apache.camel.component.jdbc.JdbcProducer.process(JdbcProducer.java:67)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process
(AsyncProcessorConverterHelper.java:61)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:148)
at org.apache.camel.processor.interceptor.HandleFaultInterceptor.process(HandleFaultInterceptor.java:42)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:110)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:548)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:138)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:101)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.component.direct.DirectProducer.process(DirectProducer.java:76)
at org.apache.camel.processor.Enricher.process(Enricher.java:191)
at org.apache.camel.processor.interceptor.HandleFaultInterceptor.process(HandleFaultInterceptor.java:42)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:110)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:548)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:138)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:101)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.component.direct.DirectProducer.process(DirectProducer.java:76)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:148)
at org.apache.camel.processor.interceptor.HandleFaultInterceptor.process(HandleFaultInterceptor.java:42)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:110)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:548)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:97)
at org.apache.camel.component.jms.EndpointMessageListener.onMessage(EndpointMessageListener.java:113)
... 11 common frames omitted

Here you can see the logging format fall apart: one log entry does not map to one log line. This makes the standard logging format unsuitable for
centralized logging, as exceptions get spread over several lines and force the user to trawl through the log to reassemble the complete entry. This format
is painful to deal with.

The 1 to 1 logging format.


To give the logs a consistent structure we have decided to implement all logging using JSON as the data format. Using the same logging messages as
previously outlined, the logging changes to the format shown in the examples below:

Hello World Log In Json

{"@timestamp":"2019-09-25T10:33:28.964+10:00","@version":"1","message":"hello world","thread_name":"main","level":"INFO","level_value":20000,"appName":"foo","env":"dev","mvnVersion":"1.16"}

Here you can see the structure of the log very clearly. Even when you log an exception which is multiple lines long you get 1 log line per log operation as
shown below:
Error Log In One Line

{"@timestamp":"2019-09-16T10:22:46.999+10:00","@version":"1","message":"Application run failed","logger_name":"org.springframework.boot.SpringApplication","thread_name":"main","level":"ERROR","level_value":40000,"
stack_trace":"java.lang.IllegalStateException: Logback configuration error detected: \r\nERROR in net.logstash.
logback.composite.GlobalCustomFieldsJsonProvider@2453f95d - Failed to parse custom fields [{\"appname\":\"shrub-
fcc\",\"name2\":info.app.name_IS_UNDEFINED}] com.fasterxml.jackson.core.JsonParseException: Unrecognized token
'info': was expecting ('true', 'false' or 'null')\n at [Source: (String)\"{\"appname\":\"shrub-fcc\",\"name2\":
info.app.name_IS_UNDEFINED}\"; line: 1, column: 36]\r\n\tat org.springframework.boot.logging.logback.
LogbackLoggingSystem.loadConfiguration(LogbackLoggingSystem.java:169)\r\n\tat org.springframework.boot.logging.
AbstractLoggingSystem.initializeWithConventions(AbstractLoggingSystem.java:82)\r\n\tat org.springframework.boot.
logging.AbstractLoggingSystem.initialize(AbstractLoggingSystem.java:60)\r\n\tat org.springframework.boot.
logging.logback.LogbackLoggingSystem.initialize(LogbackLoggingSystem.java:117)\r\n\tat org.springframework.boot.
context.logging.LoggingApplicationListener.initializeSystem(LoggingApplicationListener.java:292)\r\n\tat org.
springframework.boot.context.logging.LoggingApplicationListener.initialize(LoggingApplicationListener.java:265)
\r\n\tat org.springframework.boot.context.logging.LoggingApplicationListener.
onApplicationEnvironmentPreparedEvent(LoggingApplicationListener.java:228)\r\n\tat org.springframework.boot.
context.logging.LoggingApplicationListener.onApplicationEvent(LoggingApplicationListener.java:201)\r\n\tat org.
springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener
(SimpleApplicationEventMulticaster.java:172)\r\n\tat org.springframework.context.event.
SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:165)\r\n\tat org.
springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent
(SimpleApplicationEventMulticaster.java:139)\r\n\tat org.springframework.context.event.
SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:127)\r\n\tat org.
springframework.boot.context.event.EventPublishingRunListener.environmentPrepared(EventPublishingRunListener.
java:75)\r\n\tat org.springframework.boot.SpringApplicationRunListeners.environmentPrepared
(SpringApplicationRunListeners.java:54)\r\n\tat org.springframework.boot.SpringApplication.prepareEnvironment
(SpringApplication.java:347)\r\n\tat org.springframework.boot.SpringApplication.run(SpringApplication.java:306)
\r\n\tat org.springframework.boot.builder.SpringApplicationBuilder.run(SpringApplicationBuilder.java:139)
\r\n\tat org.springframework.cloud.bootstrap.BootstrapApplicationListener.bootstrapServiceContext
(BootstrapApplicationListener.java:208)\r\n\tat org.springframework.cloud.bootstrap.
BootstrapApplicationListener.onApplicationEvent(BootstrapApplicationListener.java:104)\r\n\tat org.
springframework.cloud.bootstrap.BootstrapApplicationListener.onApplicationEvent(BootstrapApplicationListener.
java:70)\r\n\tat org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener
(SimpleApplicationEventMulticaster.java:172)\r\n\tat org.springframework.context.event.
SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:165)\r\n\tat org.
springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent
(SimpleApplicationEventMulticaster.java:139)\r\n\tat org.springframework.context.event.
SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:127)\r\n\tat org.
springframework.boot.context.event.EventPublishingRunListener.environmentPrepared(EventPublishingRunListener.
java:75)\r\n\tat org.springframework.boot.SpringApplicationRunListeners.environmentPrepared
(SpringApplicationRunListeners.java:54)\r\n\tat org.springframework.boot.SpringApplication.prepareEnvironment
(SpringApplication.java:347)\r\n\tat org.springframework.boot.SpringApplication.run(SpringApplication.java:306)
\r\n\tat org.springframework.boot.SpringApplication.run(SpringApplication.java:1260)\r\n\tat org.
springframework.boot.SpringApplication.run(SpringApplication.java:1248)\r\n\tat com.ventia.wms.
PulseFccApplication.main(PulseFccApplication.java:13)\r\n"}

Notice that the stack trace is now logged in the stack_trace element of the JSON structure. The stack trace still has its old format, with line breaks and tabs,
however these are now captured inside a single field rather than as individual log lines.
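The one-line guarantee comes from JSON string escaping: before the event is written, literal line breaks and tabs in the stack trace are replaced with \n, \r and \t escape sequences. In practice the logging encoder does this for us; the sketch below is illustrative only, covering just the line-break problem described above (a real JSON encoder escapes more, e.g. all control characters):

```java
public class OneLineTrace {
    // Escape the characters that would otherwise split a log event
    // across several lines. Backslashes and quotes are escaped first
    // so the result stays valid inside a JSON string.
    static String escapeForJson(String raw) {
        return raw.replace("\\", "\\\\")
                  .replace("\"", "\\\"")
                  .replace("\r", "\\r")
                  .replace("\n", "\\n")
                  .replace("\t", "\\t");
    }

    public static void main(String[] args) {
        String trace = "java.lang.IllegalStateException: boom\r\n"
                     + "\tat com.example.Foo.bar(Foo.java:13)"; // hypothetical frame
        String field = "\"stack_trace\":\"" + escapeForJson(trace) + "\"";
        System.out.println(field.contains("\n")); // false: the event stays on one line
    }
}
```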

The JSON log fields.


To ensure consistency, the following are the minimal log fields we need for standardized logging.

@timestamp: The timestamp of the generated event. Auto-generated by our implementation.

@version: The JSON schema version for Logstash. Auto-generated by our implementation.

message: The value that was logged by the logging operation. Make it descriptive but try to stay away from overly verbose messages.

logger_name: The name of the class that logged this message. Auto-generated by our implementation.

thread_name: The name of the thread that logged this message. Auto-generated by our implementation.

level: The logging level for this logging operation. Valid values are TRACE, DEBUG, INFO, WARN and ERROR. These can be used in filters.

level_value: The numeric logging level value for the logging operation. Auto-generated by our implementation.

HOSTNAME: The host name of the machine the application is running on. Auto-generated by our implementation.

appName: The application name injected from the Maven information in the pom file. Auto-generated by our implementation.

env: The environment the application is running in. Valid values are dev, test, uat and prod. Auto-generated by our implementation.

mvnVersion: The Maven project version. Auto-generated by our implementation.

stack_trace: Contains the multi-line stack trace when there is an error. Auto-generated by our implementation when an exception occurs.
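With these fields in place, filtering becomes a structured query rather than a text search. As an illustrative sketch of an Elasticsearch query using the fields above (the logstash-* index pattern and the appName value foo are assumptions for illustration):

```
GET /logstash-*/_search
{
  "query": {
    "bool": {
      "filter": [
        { "term":  { "level":   "ERROR" } },
        { "term":  { "appName": "foo" } },
        { "range": { "@timestamp": { "gte": "now-1h" } } }
      ]
    }
  }
}
```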

Logging patterns and anti patterns.

Log Flooding:
This is where logs are flooded with messages from a loop/iteration.

How to avoid log flooding.


Some rules that can be applied here:

1. Log all iterations at DEBUG level. Logging normal execution flow is not needed at INFO level.
2. Only log the exceptions with supporting information.

Anti Pattern:

Log Flooding Anti Pattern

for (Customer customer : customerList) {
    LOG.info("processing customer:{}", customer.getId());
    if (customer.getStatus().equalsIgnoreCase("VIP")) {
        LOG.info("Processing a VIP customer");
        try {
            applyDiscount(customer.getId());
            sendToFastQueue(customer.getId());
        }
        catch (Exception e) {
            LOG.error("exception in vip customer:", e);
        }
    }
    else {
        LOG.info("Normal Joe goes to the slow queue");
        try {
            sendToSlowQueue(customer.getId());
        }
        catch (Exception e) {
            LOG.error("exception in normal customer:", e);
        }
    }
}

Pattern:
How Not To Flood The Log

for (Customer customer : customerList) {
    LOG.debug("processing customer:{}", customer.getId());
    try {
        if (customer.getStatus().equalsIgnoreCase("VIP")) {
            applyDiscount(customer.getId());
            sendToFastQueue(customer.getId());
        }
        else {
            sendToSlowQueue(customer.getId());
        }
    }
    catch (Exception e) {
        LOG.warn("while processing customer with id:{} an exception occurred; the exception will be logged next.", customer.getId());
        LOG.error("customer exception:", e);
    }
}

Senseless Logging.
Logging normal execution of logic is senseless.

How to avoid senseless logging.


1. Don't use INFO level for logging normal execution flow. Use DEBUG or TRACE for this type of logging.
2. Only log exceptional flow using the WARN level.

Anti Pattern Example

Senseless Logging Anti Pattern

if (value == 1) {
    LOG.info("value 1 found");
}
else if (value == 2) {
    LOG.info("value 2 found");
}

Pattern:

Senseless Logging Pattern

if (value == 1) {
    ...
}
else if (value == 2) {
    ...
}
else {
    LOG.warn("Houston we have a problem: the value was not 1 or 2");
}

Log and Throw


There’s another well known principle: “Either handle or throw your exception.”

Imagine the following problem with the “log and throw” anti-pattern: if you log your exception and throw it, the calling method might log that
exception too. And maybe the next one does the same. Your log files end up holding the same information several times, which does not improve
the logs’ readability.
Anti-pattern:

Log And Throw Anti Pattern

public void foo() {
    try {
        ...
    } catch (Exception e) {
        LOGGER.error(e.getMessage(), e);
        throw e; // or: throw new WrappingException(e);
    }
}

Pattern:

Log Or Throw Pattern

public void foo() {
    try {
        ...
    } catch (Exception e) {
        LOGGER.error(e.getMessage(), e);
    }
}

public void foo() {
    try {
        ...
    } catch (Exception e) {
        throw e; // or: throw new WrappingException(e);
    }
}

Withhold the exception


Remember the saying “don’t let bugs get lost without a trace”? If you’re not giving the exception to your logger, you simply throw away information. The
exception’s cause and stack trace are meaningful hints when tracing a bug in your application, so always call the logger including the exception.

Anti-pattern:

Withhold Exception Anti Pattern

public void foo() {
    try {
        ...
    } catch (Exception e) {
        LOGGER.error("An exception occurred.");
        // or: LOGGER.error(e.getMessage());
    }
}

Pattern:
Show Exception Pattern

public void foo() {
    try {
        ...
    } catch (Exception e) {
        LOGGER.error("An exception occurred.", e);
        // or: LOGGER.error(e.getMessage(), e);
    }
}

Use of System.out.println, System.err.println, e.printStackTrace


Sad but true: a code smell which doesn’t seem to get exterminated. In the (common) case where your logger uses appenders which don’t only write to
the console, you lose all information written via System.out, System.err and e.printStackTrace. For example, if your logger writes to a specified log file
and you use System.out.println("Extremely useful message"), then your “extremely useful message” will never be written to the log file.

Anti-pattern

Not Using Logger Anti Pattern

public void foo() {
    try {
        ...
    } catch (Exception e) {
        System.out.println(e.getMessage());
        // or: System.err.println(e.getMessage());
        // or: e.printStackTrace();
    }
}

Pattern

Using Logger Pattern

public void foo() {
    try {
        ...
    } catch (Exception e) {
        LOGGER.error(e.getMessage(), e);
    }
}
