Channel: Baeldung

Introduction to Spring Reactor


1. Overview

In this quick article, we’ll introduce the Spring Reactor project. We’ll set up a real-life scenario for a reactive, event-driven application.

2. The Basics of Spring Reactor

2.1. Why Reactor?

The reactive design pattern is an event-based architecture for asynchronous handling of a large volume of concurrent service requests coming from single or multiple service handlers.

And the Spring Reactor project is based on this pattern and has the clear and ambitious goal of building asynchronous, reactive applications on the JVM.

2.2. Example Scenarios

Before we get started, here are a few interesting scenarios where leveraging the reactive architectural style makes sense, just to get an idea of where you might apply it:

  • The notification service of a large online shopping application like Amazon
  • High-volume transaction processing services in the banking sector
  • Share trading businesses, where share prices change continuously

One quick note to be aware of: the event bus implementation offers no persistence of events; just like the default Spring event bus, it’s an in-memory implementation.

3. Maven Dependencies

Let’s start to use Spring Reactor by adding the following dependency into our pom.xml:

<dependency>
    <groupId>io.projectreactor</groupId>
    <artifactId>reactor-bus</artifactId>
    <version>2.0.8.RELEASE</version>
</dependency>

You can check the latest version of reactor-bus in the Central Maven Repository.

4. Building a Demo Application

To better understand the benefits of the reactor-based approach, let’s look at a practical example.

We’re going to build a simple notification app, which will notify users via email and SMS after they finish their order on an online store.

A typical synchronous implementation would naturally be bound by the throughput of the SMS service. Spikes in traffic, such as during holidays, would generally be problematic.

With a reactive approach, the system can be more flexible and adapt better to failures or timeouts in these types of external systems, such as SMS or email servers.

Let’s have a look at the application – starting with the more traditional aspects and moving on to the more reactive constructs.

4.1. Simple POJO

First, let’s create a POJO class to represent the notification data:

public class NotificationData {
	
    private long id;
    private String name;
    private String email;
    private String mobile;
    
    // getter and setter methods
}

4.2. The Service Layer

Let’s now set up a simple service layer:

public interface NotificationService {

    void initiateNotification(NotificationData notificationData) 
      throws InterruptedException;

}

And the implementation, simulating a long operation here:

@Service
public class NotificationServiceImpl implements NotificationService {
	
    @Override
    public void initiateNotification(NotificationData notificationData) 
      throws InterruptedException {

      System.out.println("Notification service started for "
        + "Notification ID: " + notificationData.getId());
		
      Thread.sleep(5000);
		
      System.out.println("Notification service ended for "
        + "Notification ID: " + notificationData.getId());
    }
}

Notice that to illustrate a real-life scenario of sending messages via an SMS or email gateway, we’re intentionally introducing a five-second delay in the initiateNotification method via Thread.sleep(5000).

And so, when a thread hits the service, it will be blocked for five seconds.
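To make the contrast concrete, here’s a minimal plain-Java sketch using a standard ExecutorService – a stand-in for Reactor’s thread-pool dispatcher, not the library’s actual mechanism. Submitting a slow task does not block the submitting thread:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class DispatchSketch {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);

        long start = System.currentTimeMillis();
        // the slow "notification" runs on a pool thread...
        pool.submit(() -> {
            try {
                Thread.sleep(500); // stand-in for the 5-second gateway call
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        // ...while the submitting thread moves on almost immediately
        System.out.println("submit returned after "
          + (System.currentTimeMillis() - start) + " ms");

        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```

This is exactly the property the reactive setup below gives us: the controller submits events and returns, while consumers do the slow work on other threads.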

4.3. The Consumer

Let’s now jump into the more reactive aspects of our application and implement a consumer – which we’ll then map to the reactor event bus:

@Service
public class NotificationConsumer implements 
  Consumer<Event<NotificationData>> {

    @Autowired
    private NotificationService notificationService;
	
    @Override
    public void accept(Event<NotificationData> notificationDataEvent) {
        NotificationData notificationData = notificationDataEvent.getData();
        
        try {
            notificationService.initiateNotification(notificationData);
        } catch (InterruptedException e) {
            // ignore        
        }	
    }
}

As you can see, the consumer simply implements the Consumer<T> interface, with a single accept method. It’s this simple implementation that runs the main logic, just like a typical Spring listener.

4.4. The Controller

Finally, now that we’re able to consume the events, let’s also generate them.

We’re going to do that in a simple controller:

@Controller
public class NotificationController {

    @Autowired
    private EventBus eventBus;

    @GetMapping("/startNotification/{param}")
    public void startNotification(@PathVariable Integer param) {
        for (int i = 0; i < param; i++) {
            NotificationData data = new NotificationData();
            data.setId(i);

            eventBus.notify("notificationConsumer", Event.wrap(data));

            System.out.println(
              "Notification " + i + ": notification task submitted successfully");
        }
    }
}

This is quite self-explanatory – we’re sending events through the EventBus here – using a unique key.

So, simply put – when a client hits the URL with param value 10, a total of 10 events will be sent through the bus.

4.5. The Java Config

We’re almost done; let’s just put everything together with the Java Config and create our Boot application:

import static reactor.bus.selector.Selectors.$;

@Configuration
@EnableAutoConfiguration
@ComponentScan
public class Application implements CommandLineRunner {
	
    @Autowired
    private EventBus eventBus;
	
    @Autowired
    private NotificationConsumer notificationConsumer;
	
    @Bean
    Environment env() {
        return Environment.initializeIfEmpty().assignErrorJournal();
    }
    
    @Bean
    EventBus createEventBus(Environment env) {
        return EventBus.create(env, Environment.THREAD_POOL);
    }

    @Override
    public void run(String... args) throws Exception {
        eventBus.on($("notificationConsumer"), notificationConsumer);
    }

    public static void main(String[] args){
        SpringApplication.run(Application.class, args);
    }
}

It’s here that we’re creating the EventBus bean via the static create API in EventBus.

In our case, we’re instantiating the event bus with a default thread pool available in the environment.

If we wanted a bit more control over the bus, we could also provide a thread count to the implementation:

EventBus evBus = EventBus.create(
  env, 
  Environment.newDispatcher(
    REACTOR_THREAD_COUNT,REACTOR_THREAD_COUNT,   
    DispatcherType.THREAD_POOL_EXECUTOR));

Next – also notice how we’re using a static import of the $ method here.

This feature provides a type-safe mechanism to include static members (in our case, the $ selector) in code without having to reference the class that originally defined them.
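Static imports are a general Java feature; as a quick plain-Java illustration (using java.lang.Math rather than Reactor’s Selectors):

```java
import static java.lang.Math.max;

public class StaticImportDemo {
    public static void main(String[] args) {
        // 'max' is referenced without the Math. prefix,
        // just as we reference '$' without the Selectors. prefix
        int larger = max(3, 7);
        System.out.println(larger); // prints 7
    }
}
```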

We’re making use of this functionality in our run method implementation – where we’re registering our consumer to be triggered when a matching notification arrives.

This is based on a unique selector key that enables each consumer to be identified.

5. Test the Application

After running a Maven build, we can now simply run java -jar name_of_the_application.jar to run the application.

Let’s now create a small JUnit test class to test the application. We’ll use the SpringJUnit4ClassRunner to create the test case:

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = {Application.class}) 
public class DataLoader {

    @Test
    public void exampleTest() {
        RestTemplate restTemplate = new RestTemplate();
        restTemplate.getForObject(
          "http://localhost:8080/startNotification/10", String.class);
    }
}

Now, let’s run this test case to test the application:

Notification 0: notification task submitted successfully
Notification 1: notification task submitted successfully
Notification 2: notification task submitted successfully
Notification 3: notification task submitted successfully
Notification 4: notification task submitted successfully
Notification 5: notification task submitted successfully
Notification 6: notification task submitted successfully
Notification 7: notification task submitted successfully
Notification 8: notification task submitted successfully
Notification 9: notification task submitted successfully
Notification service started for Notification ID: 1
Notification service started for Notification ID: 2
Notification service started for Notification ID: 3
Notification service started for Notification ID: 0
Notification service ended for Notification ID: 1
Notification service ended for Notification ID: 0
Notification service started for Notification ID: 4
Notification service ended for Notification ID: 3
Notification service ended for Notification ID: 2
Notification service started for Notification ID: 6
Notification service started for Notification ID: 5
Notification service started for Notification ID: 7
Notification service ended for Notification ID: 4
Notification service started for Notification ID: 8
Notification service ended for Notification ID: 6
Notification service ended for Notification ID: 5
Notification service started for Notification ID: 9
Notification service ended for Notification ID: 7
Notification service ended for Notification ID: 8
Notification service ended for Notification ID: 9

As you can see, as soon as the endpoint is hit, all 10 tasks get submitted instantly without any blocking. And once submitted, the notification events are processed in parallel.

Keep in mind that in our scenario there’s no need to process these events in any order.

6. Conclusion

In this small application, we definitely get a throughput increase, along with a better-behaved application overall.

However, this scenario is just scratching the surface, and represents just a good base to start understanding the reactive paradigm.

As always, the source code is available over on GitHub.


Introduction to Nashorn


1. Introduction

This article is focused on Nashorn – the new default JavaScript engine for the JVM as of Java 8.

Many sophisticated techniques have been used to make Nashorn orders of magnitude more performant than its predecessor, Rhino, so the change is well worthwhile.

Let’s have a look at some of the ways in which it can be used.

2. Command Line

JDK 1.8 includes a command line interpreter called jjs which can be used to run JavaScript files or, if started with no arguments, as a REPL (interactive shell):

$ $JAVA_HOME/bin/jjs hello.js
Hello World

Here the file hello.js contains a single instruction: print("Hello World");

The same code can be run interactively:

$ $JAVA_HOME/bin/jjs
jjs> print("Hello World")
Hello World

You can also instruct the *nix runtime to use jjs for running a target script by adding a #!$JAVA_HOME/bin/jjs as the first line:

#!$JAVA_HOME/bin/jjs
var greeting = "Hello World";
print(greeting);

And then the file can be run as normal:

$ ./hello.js
Hello World

3. Embedded Script Engine

The second, and probably more common way to run JavaScript from within the JVM is via the ScriptEngine. JSR-223 defines a set of scripting APIs, allowing for a pluggable script engine architecture that can be used for any dynamic language (provided it has a JVM implementation, of course).

Let’s create a JavaScript engine:

ScriptEngine engine = new ScriptEngineManager().getEngineByName("nashorn");

Object result = engine.eval(
   "var greeting='hello world';" +
   "print(greeting);" +
   "greeting");

Here we create a new ScriptEngineManager and immediately ask it for a ScriptEngine named nashorn. Then, we pass a couple of instructions and obtain the result, which, predictably, turns out to be the String “hello world“.

4. Passing Data to the Script

Data can be passed into the engine by defining a Bindings object and passing it as a second parameter to the eval function:

Bindings bindings = engine.createBindings();
bindings.put("count", 3);
bindings.put("name", "baeldung");

String script = "var greeting='Hello ';" +
  "for(var i=count;i>0;i--) { " +
  "greeting+=name + ' '" +
  "}" +
  "greeting";

Object bindingsResult = engine.eval(script, bindings);

Running this snippet produces: “Hello baeldung baeldung baeldung“.

5. Invoking JavaScript Functions

It’s of course possible to call JavaScript functions from your Java code:

engine.eval("function composeGreeting(name) {" +
  "return 'Hello ' + name" +
  "}");
Invocable invocable = (Invocable) engine;

Object funcResult = invocable.invokeFunction("composeGreeting", "baeldung");

This will return “Hello baeldung“.

6. Using Java Objects 

Since we are running in the JVM it is possible to use native Java objects from within JavaScript code.

This is accomplished by using a Java object:

Object map = engine.eval("var HashMap = Java.type('java.util.HashMap');" +
  "var map = new HashMap();" +
  "map.put('hello', 'world');" +
  "map");

7. Language Extensions

Nashorn is targeting ECMAScript 5.1 but it does provide extensions to make JavaScript usage a tad nicer.

7.1. Iterating Collections with for-each

For-each is a convenient extension to make iteration over various collections easier:

String script = "var list = [1, 2, 3, 4, 5];" +
  "var result = '';" +
  "for each (var i in list) {" +
  "result+=i+'-';" +
  "};" +
  "print(result);";

engine.eval(script);

Here, we join the elements of an array using the for-each iteration construct.

The resulting output will be 1-2-3-4-5-.

7.2. Function Literals

In simple function declarations you can omit curly braces:

function increment(num) ++num

Obviously, this can only be done for simple, one-liner functions.

7.3. Conditional Catch Clauses

It is possible to add guarded catch clauses that only execute if the specified condition is true:

try {
    throw "BOOM";
} catch(e if typeof e === 'string') {
    print("String thrown: " + e);
} catch(e) {
    print("this shouldn't happen!");
}

This will print “String thrown: BOOM“.

7.4. Typed Arrays and Type Conversions

It is possible to use Java typed arrays and to convert to and from JavaScript arrays:

function arrays(arr) {
    var javaIntArray = Java.to(arr, "int[]");
    print(javaIntArray[0]);
    print(javaIntArray[1]);
    print(javaIntArray[2]);
}

Nashorn performs some type conversions here to make sure that all the values from the dynamically typed JavaScript array can fit into the integer-only Java arrays.

Calling the above function with the argument [100, “1654”, true] results in the output 100, 1654 and 1 (all numbers).

The String and boolean values were implicitly converted to their logical integer counterparts.
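The conversions at work are roughly equivalent to this plain Java – an illustration of the coercions, not Nashorn’s actual conversion code:

```java
public class CoercionSketch {
    public static void main(String[] args) {
        // roughly what converting each element to int involves:
        int fromNumber = 100;                      // 100 stays 100
        int fromString = Integer.parseInt("1654"); // "1654" becomes 1654
        int fromBoolean = true ? 1 : 0;            // true becomes 1
        System.out.println(fromNumber + " " + fromString + " " + fromBoolean); // prints 100 1654 1
    }
}
```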

7.5. Setting Object’s Prototype with Object.setPrototypeOf

Nashorn defines an API extension that enables us to change the prototype of an object:

Object.setPrototypeOf(obj, newProto)

This function is generally considered a better alternative to Object.prototype.__proto__, so it should be the preferred way to set an object’s prototype in all new code.

7.6. Magical __noSuchProperty__ and __noSuchMethod__

It is possible to define methods on an object that will be invoked whenever an undefined property is accessed or an undefined method is invoked:

var demo = {
    __noSuchProperty__: function (propName) {
        print("Accessed non-existing property: " + propName);
    },
	
    __noSuchMethod__: function (methodName) {
        print("Invoked non-existing method: " + methodName);
    }
};

demo.doesNotExist;
demo.callNonExistingMethod()

This will print:

Accessed non-existing property: doesNotExist
Invoked non-existing method: callNonExistingMethod

7.7. Bind Object Properties with Object.bindProperties

Object.bindProperties can be used to bind properties from one object into another:

var first = {
    name: "Whiskey",
    age: 5
};

var second = {
    volume: 100
};

Object.bindProperties(first, second);

print(first.volume);

second.volume = 1000;
print(first.volume);

Notice that this creates a “live” binding; any updates to the source object are also visible through the binding target.

7.8. Locations

The current file name, directory, and line number can be obtained from the global variables __FILE__, __DIR__, and __LINE__:

print(__FILE__, __LINE__, __DIR__)

7.9. Extensions to String.prototype 

There are two simple, but very useful extensions that Nashorn provides on the String prototype. These are the trimRight and trimLeft functions which, unsurprisingly, return a copy of the String with the whitespace removed:

print("   hello world".trimLeft());
print("hello world     ".trimRight());

This will print “hello world” twice, without leading or trailing spaces.
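For comparison, standard Java strings gained similar methods in Java 11: stripLeading() and stripTrailing(). A quick illustration:

```java
public class TrimDemo {
    public static void main(String[] args) {
        // Java 11+ analogues of Nashorn's trimLeft/trimRight
        System.out.println("   hello world".stripLeading());  // prints "hello world"
        System.out.println("hello world     ".stripTrailing()); // prints "hello world"
    }
}
```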

7.10. Java.asJSONCompatible Function

Using this function, we can obtain an object that is compatible with the expectations of Java JSON libraries.

Namely, if the object itself, or any object transitively reachable through it, is a JavaScript array, then it will be exposed as a JSObject that also implements the List interface for exposing the array elements.

Object obj = engine.eval(
  "Java.asJSONCompatible({ number: 42, greet: 'hello', primes: [2,3,5,7,11,13] })");
Map<String, Object> map = (Map<String, Object>)obj;
 
System.out.println(map.get("greet"));
System.out.println(map.get("primes"));
System.out.println(List.class.isAssignableFrom(map.get("primes").getClass()));

This will print “hello” followed by [2, 3, 5, 7, 11, 13] followed by true.

8. Loading Scripts

It’s also possible to load another JavaScript file from within the ScriptEngine:

load('classpath:script.js')

A script can also be loaded from a URL:

load('http://www.baeldung.com/script.js')

Keep in mind that JavaScript does not have a concept of namespaces, so everything gets piled into the global scope. This makes it possible for loaded scripts to create naming conflicts with your code or with each other. This can be mitigated by using the loadWithNewGlobal function:

var math = loadWithNewGlobal('classpath:math_module.js')
math.increment(5);

With the following math_module.js:

var math = {
    increment: function(num) {
        return ++num;
    }
};

math;

Here we are defining an object named math that has a single function called increment. Using this paradigm we can even emulate basic modularity!

9. Conclusion

This article explored some features of the Nashorn JavaScript engine. The examples showcased here used string-literal scripts, but for real-life scenarios you’ll most likely want to keep your scripts in separate files and load them using a Reader class.

As always, the code in this write-up is all available over on GitHub.

Introduction to PMD


1. Overview

Simply put, PMD is a source code analyzer to find common programming flaws like unused variables, empty catch blocks, unnecessary object creation, and so forth.

It supports Java, JavaScript, Salesforce.com Apex, PLSQL, Apache Velocity, XML, and XSL.

In this article, we’ll focus on how to use PMD to perform static analysis in a Java project.

2. Prerequisites

Let’s start with setting up PMD in a Maven project – using and configuring the maven-pmd-plugin:

<project>
    ...
    <reporting>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-pmd-plugin</artifactId>
                <version>3.7</version>
                <configuration>
                    <rulesets>
                        <ruleset>/rulesets/java/braces.xml</ruleset>
                        <ruleset>/rulesets/java/naming.xml</ruleset>
                    </rulesets>
                </configuration>
            </plugin>
        </plugins>
    </reporting>
</project>

You can find the latest version of maven-pmd-plugin here.

Notice how we’re adding rulesets in the configuration here – these are relative paths to already-defined rules from the PMD core library.

Finally, before running everything, let’s create a simple Java class with some glaring issues – something that PMD can start reporting problems on:

public class Cnt {

    public int d(int a, int b) {
        if (b == 0)
            return Integer.MAX_VALUE;
        else
            return a / b;
    }
}

3. Run PMD

With the simple PMD config and the sample code – let’s generate a report in the build target folder:

mvn site

The generated report is called pmd.html and is located in the target/site folder:

Files

com/baeldung/pmd/Cnt.java

Violation                                                Line

Avoid short class names like Cnt                         1–10
Avoid using short method names                           3
Avoid variables with short names like b                  3
Avoid variables with short names like a                  3
Avoid using if...else statements without curly braces    5
Avoid using if...else statements without curly braces    7

As you can see – we’re now getting results. The report shows the violations and line numbers in our Java code, according to PMD.

4. Rulesets

The PMD plugin uses five default rulesets:

  • basic.xml
  • empty.xml
  • imports.xml
  • unnecessary.xml
  • unusedcode.xml

You may use other rulesets or create your own rulesets, and configure these in the plugin:

<project>
    ...
    <reporting>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-pmd-plugin</artifactId>
                <version>3.7</version>
                <configuration>
                    <rulesets>
                        <ruleset>/rulesets/java/braces.xml</ruleset>
                        <ruleset>/rulesets/java/naming.xml</ruleset>
                        <ruleset>/usr/pmd/rulesets/strings.xml</ruleset>
                        <ruleset>http://localhost/design.xml</ruleset>
                    </rulesets>
                </configuration>
            </plugin>
        </plugins>
    </reporting>
</project>

Notice that we’re using either a relative path, an absolute path, or even a URL as the value of a ruleset element in the configuration.

A clean strategy for customizing which rules to use for a project is to write a custom ruleset file. In this file, we can define which rules to use, add custom rules, and customize which rules to include/exclude from the official rulesets.

5. Custom Ruleset

Let’s now choose the specific rules we want to use from existing sets of rules in PMD – and let’s also customize them.

First, we’ll create a new ruleset.xml file. We can of course use one of the existing ruleset files as an example – copy and paste it into our new file, delete all the old rules from it, and change the name and description:

<?xml version="1.0"?>
<ruleset name="Custom ruleset"
  xmlns="http://pmd.sourceforge.net/ruleset/2.0.0"
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://pmd.sourceforge.net/ruleset/2.0.0
  http://pmd.sourceforge.net/ruleset_2_0_0.xsd">
    <description>
        This ruleset checks my code for bad stuff
    </description>
</ruleset>

Secondly, let’s add some rule references:

<!-- We'll use the entire 'strings' ruleset -->
<rule ref="rulesets/java/strings.xml"/>

Or add some specific rules:

<rule ref="rulesets/java/unusedcode.xml/UnusedLocalVariable"/>
<rule ref="rulesets/java/unusedcode.xml/UnusedPrivateField"/>
<rule ref="rulesets/java/imports.xml/DuplicateImports"/>
<rule ref="rulesets/java/basic.xml/UnnecessaryConversionTemporary"/>

We can customize the message and priority of the rule:

<rule ref="rulesets/java/basic.xml/EmptyCatchBlock"
  message="Must handle exceptions">
    <priority>2</priority>
</rule>

And you can also customize a rule’s property value like this:

<rule ref="rulesets/java/codesize.xml/CyclomaticComplexity">
    <properties>
        <property name="reportLevel" value="5"/>
    </properties>
</rule>

Notice that you can customize individual referenced rules. Everything but the class of the rule can be overridden in your custom ruleset.

Next – you can also exclude rules from a ruleset:

<rule ref="rulesets/java/braces.xml">
    <exclude name="WhileLoopsMustUseBraces"/>
    <exclude name="IfElseStmtsMustUseBraces"/>
</rule>

Next – you can also exclude files from a ruleset using exclude patterns, with an optional overriding include pattern.

A file will be excluded from processing when there is a matching exclude pattern, but no matching include pattern.

Path separators in the source file path are normalized to be the ‘/’ character, so the same ruleset can be used on multiple platforms transparently.
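The decision rule above can be sketched in plain Java – a simplified model, not PMD’s actual implementation – treating the patterns as regular expressions matched against the normalized path:

```java
import java.util.List;

public class ExcludeIncludeSketch {

    // a file is skipped when some exclude pattern matches
    // and no include pattern matches
    static boolean isExcluded(String path,
                              List<String> excludes,
                              List<String> includes) {
        // normalize path separators to '/', as PMD does
        String normalized = path.replace('\\', '/');
        boolean excludeHit = excludes.stream().anyMatch(normalized::matches);
        boolean includeHit = includes.stream().anyMatch(normalized::matches);
        return excludeHit && !includeHit;
    }

    public static void main(String[] args) {
        List<String> excludes = List.of(".*/some/package/.*");
        List<String> includes = List.of(".*/some/package/ButNotThisClass.*");

        // prints true: excluded, no include pattern rescues it
        System.out.println(isExcluded("src/some/package/Foo.java", excludes, includes));
        // prints false: the include pattern overrides the exclusion
        System.out.println(isExcluded("src/some/package/ButNotThisClass.java", excludes, includes));
    }
}
```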

Additionally, this exclude/include technique works regardless of how PMD is used (e.g. command line, IDE, Ant), making it easier to keep the application of your PMD rules consistent throughout your environment.

Here’s a quick example:

<?xml version="1.0"?>
<ruleset ...>
    <description>My ruleset</description>
    <exclude-pattern>.*/some/package/.*</exclude-pattern>
    <exclude-pattern>
       .*/some/other/package/FunkyClassNamePrefix.*
    </exclude-pattern>
    <include-pattern>.*/some/package/ButNotThisClass.*</include-pattern>
    <rule>...
</ruleset>

6. Conclusion

In this quick article, we introduced PMD – a flexible and highly configurable tool focused on static analysis of Java code.

As always, the full code presented in this tutorial is available over on GitHub.

Spring Performance Logging


1. Overview

In this tutorial, we’ll look into a couple of basic options the Spring Framework offers for performance monitoring.

2. PerformanceMonitorInterceptor

As a simple solution to get basic monitoring of our methods’ execution time, we can make use of the PerformanceMonitorInterceptor class from Spring AOP (Aspect-Oriented Programming).

Spring AOP allows the definition of cross-cutting concerns in applications, meaning code that intercepts the execution of one or more methods in order to add extra functionality.

The PerformanceMonitorInterceptor class is an interceptor that can be associated with any custom method, to be executed alongside it. This class uses a StopWatch instance to determine the start and end times of the method run.
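The underlying idea is easy to sketch in plain Java – a simplified stand-in for the StopWatch-based timing, not Spring’s actual implementation:

```java
import java.util.function.Supplier;

public class TimingSketch {

    // time an arbitrary piece of work, like the interceptor
    // times an intercepted method invocation
    static <T> T timed(String name, Supplier<T> work) {
        long start = System.nanoTime();
        try {
            return work.get();
        } finally {
            long millis = (System.nanoTime() - start) / 1_000_000;
            System.out.println("StopWatch '" + name
              + "': running time (millis) = " + millis);
        }
    }

    public static void main(String[] args) {
        String result = timed("demo.getFullName", () -> "Smith John");
        System.out.println(result); // prints Smith John
    }
}
```

The real interceptor wraps the method invocation the same way, but is wired in declaratively via a pointcut instead of an explicit call.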

Let’s create a simple Person class and a PersonService class with two methods that we will monitor:

public class Person {
    private String lastName;
    private String firstName;
    private LocalDate dateOfBirth;

    // standard constructors, getters, setters
}

public class PersonService {
    
    public String getFullName(Person person){
        return person.getLastName()+" "+person.getFirstName();
    }
    
    public int getAge(Person person){
        Period p = Period.between(person.getDateOfBirth(), LocalDate.now());
        return p.getYears();
    }
}

In order to make use of the Spring monitoring interceptor, we need to define a pointcut and advisor:

@Configuration
@EnableAspectJAutoProxy
@Aspect
public class AopConfiguration {
    
    @Pointcut(
      "execution(public String com.baeldung.performancemonitor.PersonService.getFullName(..))"
    )
    public void monitor() { }
    
    @Bean
    public PerformanceMonitorInterceptor performanceMonitorInterceptor() {
        return new PerformanceMonitorInterceptor(true);
    }

    @Bean
    public Advisor performanceMonitorAdvisor() {
        AspectJExpressionPointcut pointcut = new AspectJExpressionPointcut();
        pointcut.setExpression("com.baeldung.performancemonitor.AopConfiguration.monitor()");
        return new DefaultPointcutAdvisor(pointcut, performanceMonitorInterceptor());
    }
    
    @Bean
    public Person person(){
        return new Person("John","Smith", LocalDate.of(1980, Month.JANUARY, 12));
    }
 
    @Bean
    public PersonService personService(){
        return new PersonService();
    }
}

The pointcut contains an expression that identifies the methods that we want to be intercepted — in our case the getFullName() method of the PersonService class.

After configuring the performanceMonitorInterceptor() bean, we need to associate the interceptor with the pointcut. This is achieved through an advisor, as shown in the example above.

Finally, the @EnableAspectJAutoProxy annotation enables AspectJ support for our beans. Simply put, AspectJ is a library created to make the use of Spring AOP easier through convenient annotations like @Pointcut.

After creating the configuration, we need to set the log level of the interceptor class to TRACE, as this is the level at which it logs messages.

For example, using Log4j, we can achieve this through the log4j.properties file:

log4j.logger.org.springframework.aop.interceptor.PerformanceMonitorInterceptor=TRACE, stdout

For every execution of the getFullName() method, we will see the TRACE message in the console log:

2017-01-08 19:19:25 TRACE 
  PersonService:66 - StopWatch 
  'com.baeldung.performancemonitor.PersonService.getFullName': 
  running time (millis) = 10

3. Custom Performance Monitoring Interceptor

If we want more control over the way the performance monitoring is done, we can implement our own custom interceptor.

For this, let’s extend the AbstractMonitoringInterceptor class and override the invokeUnderTrace() method to log the start, end, and duration of a method, as well as a warning if the method execution lasts more than 10 ms:

public class MyPerformanceMonitorInterceptor extends AbstractMonitoringInterceptor {
    
    public MyPerformanceMonitorInterceptor() {
    }

    public MyPerformanceMonitorInterceptor(boolean useDynamicLogger) {
            setUseDynamicLogger(useDynamicLogger);
    }

    @Override
    protected Object invokeUnderTrace(MethodInvocation invocation, Log log) 
      throws Throwable {
        String name = createInvocationTraceName(invocation);
        long start = System.currentTimeMillis();
        log.info("Method " + name + " execution started at:" + new Date());
        try {
            return invocation.proceed();
        }
        finally {
            long end = System.currentTimeMillis();
            long time = end - start;
            log.info("Method "+name+" execution lasted:"+time+" ms");
            log.info("Method "+name+" execution ended at:"+new Date());
            
            if (time > 10){
                log.warn("Method execution longer than 10 ms!");
            }            
        }
    }
}

We need to follow the same steps as in the preceding section to associate the custom interceptor with one or more methods.

Let’s define a pointcut for the getAge() method of PersonService and associate it with the interceptor we have created:

@Pointcut("execution(public int com.baeldung.performancemonitor.PersonService.getAge(..))")
public void myMonitor() { }
    
@Bean
public MyPerformanceMonitorInterceptor myPerformanceMonitorInterceptor() {
    return new MyPerformanceMonitorInterceptor(true);
}
    
@Bean
public Advisor myPerformanceMonitorAdvisor() {
    AspectJExpressionPointcut pointcut = new AspectJExpressionPointcut();
    pointcut.setExpression("com.baeldung.performancemonitor.AopConfiguration.myMonitor()");
    return new DefaultPointcutAdvisor(pointcut, myPerformanceMonitorInterceptor());
}

Let’s set the log level to INFO for the custom interceptor:

log4j.logger.com.baeldung.performancemonitor.MyPerformanceMonitorInterceptor=INFO, stdout

The execution of the getAge() method produced the following output:

2017-01-08 19:19:25 INFO PersonService:26 - 
  Method com.baeldung.performancemonitor.PersonService.getAge 
  execution started at:Sun Jan 08 19:19:25 EET 2017
2017-01-08 19:19:25 INFO PersonService:33 - 
  Method com.baeldung.performancemonitor.PersonService.getAge execution lasted:50 ms
2017-01-08 19:19:25 INFO PersonService:34 - 
  Method com.baeldung.performancemonitor.PersonService.getAge 
  execution ended at:Sun Jan 08 19:19:25 EET 2017
2017-01-08 19:19:25 WARN PersonService:37 - 
  Method execution longer than 10 ms!

4. Conclusion

In this quick tutorial, we’ve introduced simple performance monitoring in Spring.

As always, the full source code for this article can be found over on Github.

How to Work with Dates in Thymeleaf


1. Introduction

Thymeleaf is a Java template engine designed to work directly with Spring. For an intro to Thymeleaf and Spring, have a look at this write-up.

Besides these basic functions, Thymeleaf offers us a set of utility objects that will help us perform common tasks in our application.

In this article, we will discuss processing and formatting the new and old Java Date classes using a handful of features from Thymeleaf 3.0.

2. Maven Dependencies

First, let’s see the configuration needed to integrate Thymeleaf with Spring into our pom.xml:

<dependency>
    <groupId>org.thymeleaf</groupId>
    <artifactId>thymeleaf</artifactId>
    <version>3.0.3.RELEASE</version>
</dependency>
<dependency>
    <groupId>org.thymeleaf</groupId>
    <artifactId>thymeleaf-spring4</artifactId>
    <version>3.0.3.RELEASE</version>
</dependency>

The latest versions of thymeleaf and thymeleaf-spring4 can be found on Maven Central. Note that, for a Spring 3 project, the thymeleaf-spring3 library must be used instead of thymeleaf-spring4.

Moreover, in order to work with new Java 8 Date classes, we will add the following dependency to our pom.xml:

<dependency>
    <groupId>org.thymeleaf.extras</groupId>
    <artifactId>thymeleaf-extras-java8time</artifactId>
    <version>3.0.0.RELEASE</version>
</dependency>

The Thymeleaf Extras library is an optional module, fully supported by the official Thymeleaf team, that was created for compatibility with the Java 8 Time API. It adds a #temporals object to the Context as a utility object processor during expression evaluations. This means that it can be used to evaluate expressions in Object-Graph Navigation Language (OGNL) and Spring Expression Language (SpringEL).

3. Old and New: java.util and java.time

The Time package is a new date, time, and calendar API for the Java SE platform. The main difference between the legacy Date API and the new one is that the new API distinguishes between machine and human views of a timeline. The machine view reveals a sequence of integral values relative to the epoch, whereas the human view reveals a set of fields (e.g., year or day).

To work with the new Time package, we need to configure our template engine to use the new Java8TimeDialect:

private TemplateEngine templateEngine(ITemplateResolver templateResolver) {
    SpringTemplateEngine engine = new SpringTemplateEngine();
    engine.addDialect(new Java8TimeDialect());
    engine.setTemplateResolver(templateResolver);
    return engine;
}

This will add the #temporals object similar to the ones in the Standard Dialect, allowing the formatting and creation of Temporal objects from Thymeleaf templates.

In order to test the processing of new and old classes, we’ll create the following variables and add them as model objects to our controller class:

model.addAttribute("standardDate", new Date());
model.addAttribute("localDateTime", LocalDateTime.now());
model.addAttribute("localDate", LocalDate.now());
model.addAttribute("timestamp", Instant.now());

Now we are ready to use Expression and Temporals Utility Objects provided by Thymeleaf.

3.1. Format Dates

The first function that we want to cover is formatting of a Date object (which is added to the Spring model parameters). We decided to use ISO8601 format:

<h1>Format ISO</h1>
<p th:text="${#dates.formatISO(standardDate)}"></p>
<p th:text="${#temporals.formatISO(localDateTime)}"></p>
<p th:text="${#temporals.formatISO(localDate)}"></p>
<p th:text="${#temporals.formatISO(timestamp)}"></p>

No matter how our Date was set on the back-end side, it will be shown according to the selected standard. The standardDate is going to be processed by the #dates utility. The new LocalDateTime, LocalDate and Instant classes are going to be processed by the #temporals utility. This is the final result we’ll see in the browser:


Moreover, if we want to set the format manually, we can do it by using:

<h1>Format manually</h1>
<p th:text="${#dates.format(standardDate, 'dd-MM-yyyy HH:mm')}"></p>
<p th:text="${#temporals.format(localDateTime, 'dd-MM-yyyy HH:mm')}"></p>
<p th:text="${#temporals.format(localDate, 'MM-yyyy')}"></p>

As we can observe, we cannot process the Instant class with #temporals.format(…) — it will result in an UnsupportedTemporalTypeException. Moreover, formatting a LocalDate is only possible if we specify just the date fields, skipping the time fields.
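One way around that limitation is to convert the Instant into a LocalDateTime on the back-end side before handing it to the template. The sketch below is our own illustration (the class name and the choice of the system default zone are assumptions, not part of the article's code):

```java
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;

public class InstantToTemporal {

    // Converts an Instant into a LocalDateTime that #temporals.format(...)
    // can handle; using the system default zone is an illustrative choice.
    public static LocalDateTime toLocalDateTime(Instant instant) {
        return LocalDateTime.ofInstant(instant, ZoneId.systemDefault());
    }

    public static void main(String[] args) {
        System.out.println(toLocalDateTime(Instant.now()));
    }
}
```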

The final result:

3.2. Obtain Specific Date Fields

In order to obtain the specific fields of the java.util.Date class, we should use the following utility objects:

${#dates.day(date)}
${#dates.month(date)}
${#dates.monthName(date)}
${#dates.monthNameShort(date)}
${#dates.year(date)}
${#dates.dayOfWeek(date)}
${#dates.dayOfWeekName(date)}
${#dates.dayOfWeekNameShort(date)}
${#dates.hour(date)}
${#dates.minute(date)}
${#dates.second(date)}
${#dates.millisecond(date)}

For the new java.time package, we should stick with #temporals utilities:

${#temporals.day(date)}
${#temporals.month(date)}
${#temporals.monthName(date)}
${#temporals.monthNameShort(date)}
${#temporals.year(date)}
${#temporals.dayOfWeek(date)}
${#temporals.dayOfWeekName(date)}
${#temporals.dayOfWeekNameShort(date)}
${#temporals.hour(date)}
${#temporals.minute(date)}
${#temporals.second(date)}
${#temporals.millisecond(date)}

Let’s look at a few examples. First, let’s show the day of the month:

<h1>Show the day of the month</h1>
<p th:text="${#dates.day(standardDate)}"></p>
<p th:text="${#temporals.day(localDateTime)}"></p>
<p th:text="${#temporals.day(localDate)}"></p>

Next, let’s show the name of the week day:

<h1>Show the name of the week day</h1>
<p th:text="${#dates.dayOfWeekName(standardDate)}"></p>
<p th:text="${#temporals.dayOfWeekName(localDateTime)}"></p>
<p th:text="${#temporals.dayOfWeekName(localDate)}"></p>

And finally, let’s show the current second of the day:

<h1>Show the second of the day</h1>
<p th:text="${#dates.second(standardDate)}"></p>
<p th:text="${#temporals.second(localDateTime)}"></p>

Please note that in order to work with time parts, you would need to use LocalDateTime, as LocalDate will throw an error.
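That error originates in the java.time API itself: LocalDate simply carries no time-of-day fields. A minimal stdlib sketch of this behavior (class name is ours, for illustration):

```java
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.temporal.ChronoField;
import java.time.temporal.UnsupportedTemporalTypeException;

public class LocalDateTimeFields {
    public static void main(String[] args) {
        // LocalDateTime carries time-of-day fields, so this works
        LocalDateTime dateTime = LocalDateTime.of(2017, 1, 8, 19, 19, 25);
        System.out.println(dateTime.get(ChronoField.SECOND_OF_MINUTE)); // 25

        // LocalDate does not, so asking for a time field throws
        try {
            LocalDate.of(2017, 1, 8).get(ChronoField.SECOND_OF_MINUTE);
        } catch (UnsupportedTemporalTypeException e) {
            System.out.println("LocalDate has no time fields");
        }
    }
}
```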

4. Conclusion

In this quick tutorial, we discussed Java Date processing features implemented in the Thymeleaf framework, version 3.0.

The full implementation of this tutorial can be found in the GitHub project – this is a Maven-based project that is easy to import and run.

How to test? Our suggestion is to play with the code in a browser first, then check our existing JUnit tests as well.

Please note that our examples do not cover all available options in Thymeleaf. If you want to learn about all types of utilities, then take a look at our article covering Spring and Thymeleaf Expressions.

A Custom Data Binder in Spring MVC


1. Overview

This article will show how we can use Spring’s Data Binding mechanism in order to make our code clearer and more readable by applying automatic primitive-to-object conversions.

2. Bind Request Parameters

By default, Spring only knows how to convert simple types. In other words, once we submit int, String or Boolean data to a controller, it will be bound to the appropriate Java types automatically.

But in real-world projects, that won’t be enough, as we might need to bind more complex types of objects.

2.1. Individual Objects

Let’s start simple and first bind a simple type; we’ll have to provide a custom implementation of the Converter<S, T> interface where S is the type we are converting from, and T is the type we are converting to:

@Component
public class StringToLocalDateTimeConverter
  implements Converter<String, LocalDateTime> {

    @Override
    public LocalDateTime convert(String source) {
        return LocalDateTime.parse(
          source, DateTimeFormatter.ISO_LOCAL_DATE_TIME);
    }
}
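The parsing logic at the heart of this converter can be exercised on its own, without a Spring context. The standalone class below is our own illustration; it simply mirrors the convert(...) body:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class ConverterLogicCheck {

    // Mirrors the convert(...) body above, minus the Spring interface
    static LocalDateTime convert(String source) {
        return LocalDateTime.parse(source, DateTimeFormatter.ISO_LOCAL_DATE_TIME);
    }

    public static void main(String[] args) {
        LocalDateTime parsed = convert("2017-01-08T19:19:25");
        System.out.println(parsed.getHour()); // 19
    }
}
```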

Now we can use the following syntax in our controller:

@GetMapping("/findbydate/{date}")
public GenericEntity findByDate(@PathVariable("date") LocalDateTime date) {
    return ...;
}

2.2. Hierarchy of Objects

Sometimes we need to convert the entire tree of the object hierarchy and it makes sense to have a more centralized binding rather than a set of individual converters.

In this case, we can implement ConverterFactory<S, R> where S will be the type we are converting from and R to be the base type defining the range of classes we can convert to:

@Component
public class StringToEnumConverterFactory
  implements ConverterFactory<String, Enum> {

    private static class StringToEnumConverter<T extends Enum> 
      implements Converter<String, T> {

        private Class<T> enumType;

        public StringToEnumConverter(Class<T> enumType) {
            this.enumType = enumType;
        }

        public T convert(String source) {
            return (T) Enum.valueOf(this.enumType, source.trim());
        }
    }

    @Override
    public <T extends Enum> Converter<String, T> getConverter(
      Class<T> targetType) {
        return new StringToEnumConverter(targetType);
    }
}

As we can see, the only method we must implement is getConverter(), which returns a converter for the needed type. The conversion process is then delegated to this converter.

So, suppose we have an Enum:

public enum Modes {
    ALPHA, BETA;
}

We can let Spring convert incoming values automatically:

@GetMapping("/findbymode/{mode}")
public GenericEntity findByEnum(@PathVariable("mode") Modes mode) {
    return ...;
}
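Under the hood, the factory’s conversion step boils down to Enum.valueOf on the trimmed input. A standalone sketch of that step (the class is ours, for illustration, with a nested copy of the Modes enum):

```java
public class EnumConversionCheck {

    enum Modes { ALPHA, BETA }

    // Mirrors StringToEnumConverter.convert(...) without the Spring types
    static <T extends Enum<T>> T convert(Class<T> enumType, String source) {
        return Enum.valueOf(enumType, source.trim());
    }

    public static void main(String[] args) {
        Modes mode = convert(Modes.class, " ALPHA ");
        System.out.println(mode); // ALPHA
    }
}
```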

3. Bind Domain Objects

There are cases when we want to bind data to objects, but it comes either in a non-direct way (for example, from Session, Header or Cookie variables) or even stored in a data source. In those cases, we need to use a different solution.

3.1. Custom Argument Resolver

First of all, we will define an annotation for such parameters:

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.PARAMETER)
public @interface Version {
}

Then, we will implement a custom HandlerMethodArgumentResolver:

public class HeaderVersionArgumentResolver
  implements HandlerMethodArgumentResolver {

    @Override
    public boolean supportsParameter(MethodParameter methodParameter) {
        return methodParameter.getParameterAnnotation(Version.class) != null;
    }

    @Override
    public Object resolveArgument(
      MethodParameter methodParameter, 
      ModelAndViewContainer modelAndViewContainer, 
      NativeWebRequest nativeWebRequest, 
      WebDataBinderFactory webDataBinderFactory) throws Exception {
 
        HttpServletRequest request 
          = (HttpServletRequest) nativeWebRequest.getNativeRequest();

        return request.getHeader("Version");
    }
}

The last thing is letting Spring know where to search for them:

@Configuration
public class WebConfig extends WebMvcConfigurerAdapter {

    //...

    @Override
    public void addArgumentResolvers(
      List<HandlerMethodArgumentResolver> argumentResolvers) {
        argumentResolvers.add(new HeaderVersionArgumentResolver());
    }
}

That’s it. Now we can use it in a controller:

@GetMapping("/entity/{id}")
public ResponseEntity findByVersion(
  @PathVariable Long id, @Version String version) {
    return ...;
}

As we can see, HandlerMethodArgumentResolver‘s resolveArgument() method returns an Object. In other words, we could return any object, not only String.
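Note that supportsParameter() works because @Version is retained at runtime and targets parameters. A minimal stdlib sketch of that detection mechanism (class and method names here are ours, for illustration):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;
import java.lang.reflect.Parameter;

public class VersionDetectionDemo {

    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.PARAMETER)
    @interface Version { }

    // A handler-like method with one annotated parameter
    static void findByVersion(Long id, @Version String version) { }

    public static void main(String[] args) throws Exception {
        Method m = VersionDetectionDemo.class
          .getDeclaredMethod("findByVersion", Long.class, String.class);
        for (Parameter p : m.getParameters()) {
            System.out.println(p.getType().getSimpleName() + " annotated: "
              + p.isAnnotationPresent(Version.class));
        }
    }
}
```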

4. Conclusion

As a result, we got rid of many routine conversions and let Spring do most of the work for us. To conclude:

  • For individual simple type-to-object conversions, we should use a Converter implementation
  • For encapsulating conversion logic for a range of objects, we can try a ConverterFactory implementation
  • For data that comes indirectly, or that requires additional logic to retrieve, it’s better to use a HandlerMethodArgumentResolver

As usual, all the examples can be found in our GitHub repository.

A Guide to MongoDB with Java


1. Overview

In this article, we’ll have a look at integrating MongoDB, a very popular NoSQL open source database with a standalone Java client.

MongoDB is written in C++ and has quite a number of solid features such as map-reduce, auto-sharding, replication, high availability etc.

2. MongoDB

Let’s start with a few key points about MongoDB itself:

  • stores data in JSON-like documents that can have various structures
  • uses dynamic schemas, which means that we can create records without predefining anything
  • the structure of a record can be changed simply by adding new fields or deleting existing ones

The above-mentioned data model gives us the ability to represent hierarchical relationships, to store arrays and other more complex structures easily.

3. Terminologies

Understanding concepts in MongoDB becomes easier if we can compare them to relational database structures.

Let’s see the analogies between Mongo and a traditional MySQL system:

  • Table in MySQL becomes a Collection in Mongo
  • Row becomes a Document
  • Column becomes a Field
  • Joins are defined as linking and embedded documents

This is a simplistic way to look at the MongoDB core concepts of course, but nevertheless useful.

Now, let’s dive into implementation to understand this powerful database.

4. Maven Dependencies

We need to start by defining the dependency of a Java Driver for MongoDB:

<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongo-java-driver</artifactId>
    <version>3.4.1</version>
</dependency>

To check if any new version of the library has been released – track the releases here.

5. Using MongoDB

Now, let’s start implementing Mongo queries with Java. We will follow with the basic CRUD operations as they are the best to start with.

5.1. Make a Connection with MongoClient

First, let’s make a connection to a MongoDB server. With version >= 2.10.0, we’ll use the MongoClient:

MongoClient mongoClient = new MongoClient("localhost", 27017);

And for older versions, use the Mongo class:

Mongo mongo = new Mongo("localhost", 27017);

5.2. Connecting to a Database

Now, let’s connect to our database. It is interesting to note that we don’t need to create one. When Mongo sees that a database doesn’t exist, it will create it for us:

DB database = mongoClient.getDB("myMongoDb");

Sometimes, by default, MongoDB runs in authenticated mode. In that case, we need to authenticate while connecting to a database.

We can do it as presented below:

MongoClient mongoClient = new MongoClient();
DB database = mongoClient.getDB("myMongoDb");
boolean auth = database.authenticate("username", "pwd".toCharArray());

5.3. Show Existing Databases

Let’s display all existing databases. When using the command line, the syntax to show databases is similar to MySQL:

show databases;

In Java, we display databases using the snippet below:

mongoClient.getDatabaseNames().forEach(System.out::println);

The output will be:

local      0.000GB
myMongoDb  0.000GB

Above, local is the default Mongo database.

5.4. Create a Collection

Let’s start by creating a Collection (table equivalent for MongoDB) for our database. Once we have connected to our database, we can make a Collection as:

database.createCollection("customers", null);

Now, let’s display all existing collections for the current database:

database.getCollectionNames().forEach(System.out::println);

The output will be:

customers

5.5. Save – Insert

The save operation has save-or-update semantics: if an id is present, it performs an update, if not – it does an insert.

When we save a new customer:

DBCollection collection = database.getCollection("customers");
BasicDBObject document = new BasicDBObject();
document.put("name", "Shubham");
document.put("company", "Baeldung");
collection.insert(document);

The entity will be inserted into a database:

{
    "_id" : ObjectId("33a52bb7830b8c9b233b4fe6"),
    "name" : "Shubham",
    "company" : "Baeldung"
}

Next, we’ll look at the same operation – save – with update semantics.

5.6. Save – Update

Let’s now look at save with update semantics, operating on an existing customer:

{
    "_id" : ObjectId("33a52bb7830b8c9b233b4fe6"),
    "name" : "Shubham",
    "company" : "Baeldung"
}

Now, when we save the existing customer – we will update it:

BasicDBObject query = new BasicDBObject();
query.put("name", "Shubham");

BasicDBObject newDocument = new BasicDBObject();
newDocument.put("name", "John");

BasicDBObject updateObject = new BasicDBObject();
updateObject.put("$set", newDocument);

collection.update(query, updateObject);

The database will look like this:

{
    "_id" : ObjectId("33a52bb7830b8c9b233b4fe6"),
    "name" : "John",
    "company" : "Baeldung"
}

As we can see, in this particular example, the operation has update semantics, because we operate on a document that already exists.

5.7. Read a Document from a Collection

Let’s search for a Document in a Collection by making a query:

BasicDBObject searchQuery = new BasicDBObject();
searchQuery.put("name", "John");
DBCursor cursor = collection.find(searchQuery);

while (cursor.hasNext()) {
    System.out.println(cursor.next());
}

It will show the only Document we have by now in our Collection:

[
    {
      "_id" : ObjectId("33a52bb7830b8c9b233b4fe6"),
      "name" : "John",
      "company" : "Baeldung"
    }
]

5.8. Delete a Document

Let’s move forward to our last CRUD operation, deletion:

BasicDBObject searchQuery = new BasicDBObject();
searchQuery.put("name", "John");

collection.remove(searchQuery);

With the above command executed, our only Document will be removed from the Collection.

6. Conclusion

This article was a quick introduction to using MongoDB from Java.

The implementation of all these examples and code snippets can be found over on GitHub – this is a Maven based project, so it should be easy to import and run as it is.

Parsing HTML in Java with Jsoup


1. Overview

Jsoup is an open source Java library used mainly for extracting data from HTML. It also allows you to manipulate and output HTML. It has a steady development line, great documentation, and a fluent and flexible API. Jsoup can also be used to parse and build XML.

In this tutorial, we’ll use the Spring Blog to illustrate a scraping exercise that demonstrates several features of jsoup:

  • Loading: fetching and parsing the HTML into a Document
  • Filtering: selecting the desired data into Elements and traversing it
  • Extracting: obtaining attributes, text, and HTML of nodes
  • Modifying: adding/editing/removing nodes and editing their attributes

2. Maven Dependency

To make use of the jsoup library in your project, add the dependency to your pom.xml:

<dependency>
    <groupId>org.jsoup</groupId>
    <artifactId>jsoup</artifactId>
    <version>1.10.2</version>
</dependency>

You can find the latest version of jsoup in the Maven Central repository.

3. Jsoup at a Glance

Jsoup loads the page HTML and builds the corresponding DOM tree. This tree works the same way as the DOM in a browser, offering methods similar to jQuery and vanilla JavaScript to select, traverse, manipulate text/HTML/attributes and add/remove elements.

If you’re comfortable with client-side selectors and DOM traversing/manipulation, you’ll find jsoup very familiar. Check how easy it is to print the paragraphs of a page:

Document doc = Jsoup.connect("http://example.com").get();
doc.select("p").forEach(System.out::println);

Bear in mind that jsoup interprets HTML only — it does not interpret JavaScript. Therefore changes to the DOM that would normally take place after page loads in a JavaScript-enabled browser will not be seen in jsoup.

4. Loading

The loading phase comprises the fetching and parsing of the HTML into a Document. Jsoup guarantees to parse any HTML, from the most invalid to the fully valid, just as a modern browser would. It can be achieved by loading a String, an InputStream, a File or a URL.

Let’s load a Document from the Spring Blog URL:

String blogUrl = "https://spring.io/blog";
Document doc = Jsoup.connect(blogUrl).get();

Notice the get method; it represents an HTTP GET call. You could also do an HTTP POST with the post method (or you could use a method which receives the HTTP method type as a parameter).

If you need to detect abnormal status codes (e.g. 404), you should catch the HttpStatusException exception:

try {
   Document doc404 = Jsoup.connect("https://spring.io/will-not-be-found").get();
} catch (HttpStatusException ex) {
   //...
}

Sometimes, the connection needs to be a bit more customized. Jsoup.connect(…) returns a Connection which allows you to set, among other things, the user agent, referrer, connection timeout, cookies, post data, and headers:

Connection connection = Jsoup.connect(blogUrl);
connection.userAgent("Mozilla");
connection.timeout(5000);
connection.cookie("cookiename", "val234");
connection.cookie("anothercookie", "ilovejsoup");
connection.referrer("http://google.com");
connection.header("headersecurity", "xyz123");
Document docCustomConn = connection.get();

Since the connection follows a fluent interface, you can chain these methods before calling the desired HTTP method:

Document docCustomConn = Jsoup.connect(blogUrl)
  .userAgent("Mozilla")
  .timeout(5000)
  .cookie("cookiename", "val234")
  .cookie("anothercookie", "ilovejsoup")
  .referrer("http://google.com")
  .header("headersecurity", "xyz123")
  .get();

You can learn more about the Connection settings by browsing the corresponding Javadoc.

5. Filtering

Now that we have the HTML converted into a Document, it’s time to navigate it and find what we are looking for. This is where the resemblance with jQuery/JavaScript is more evident, as its selectors and traversing methods are similar.

5.1. Selecting

The Document select method receives a String representing the selector, using the same selector syntax as in a CSS or JavaScript, and retrieves the matching list of Elements. This list can be empty but not null.

Let’s take a look at some selections using the select method:

Elements links = doc.select("a");
Elements sections = doc.select("section");
Elements logo = doc.select(".spring-logo--container");
Elements pagination = doc.select("#pagination_control");
Elements divsDescendant = doc.select("header div");
Elements divsDirect = doc.select("header > div");

You can also use more explicit methods inspired by the browser DOM instead of the generic select:

Element pag = doc.getElementById("pagination_control");
Elements desktopOnly = doc.getElementsByClass("desktopOnly");

Since Element is a superclass of Document, you can learn more about working with the selection methods in the Document and Element Javadocs.

5.2. Traversing

Traversing means navigating across the DOM tree. Jsoup provides methods that operate on the Document, on a set of Elements, or on a specific Element, allowing you to navigate to a node’s parents, siblings, or children.

Also, you can jump to the first, the last, and the nth (using a 0-based index) Element in a set of Elements:

Element firstSection = sections.first();
Element lastSection = sections.last();
Element secondSection = sections.get(1); // 0-based index
Elements allParents = firstSection.parents();
Element parent = firstSection.parent();
Elements children = firstSection.children();
Elements siblings = firstSection.siblingElements();

You can also iterate through selections. In fact, anything of type Elements can be iterated:

sections.forEach(el -> System.out.println("section: " + el));

You can make a selection restricted to a previous selection (sub-selection):

Elements sectionParagraphs = firstSection.select(".paragraph");

6. Extracting

We now know how to reach specific elements, so it’s time to get their content — namely their attributes, HTML, or child text.

Take a look at this example that selects the first article from the blog and gets its date, its first section text, and finally, its inner and outer HTML:

Element firstArticle = doc.select("article").first();
Element timeElement = firstArticle.select("time").first();
String dateTimeOfFirstArticle = timeElement.attr("datetime");
Element sectionDiv = firstArticle.select("section div").first();
String sectionDivText = sectionDiv.text();
String articleHtml = firstArticle.html();
String outerHtml = firstArticle.outerHtml();

Here are some tips to bear in mind when choosing and using selectors:

  • Rely on “View Source” feature of your browser and not only on the page DOM as it might have changed (selecting at the browser console might yield different results than jsoup)
  • Know your selectors as there are a lot of them and it’s always good to have at least seen them before; mastering selectors takes time
  • Use a playground for selectors to experiment with them (paste a sample HTML there)
  • Be less dependent on page changes: aim for the smallest and least compromising selectors (e.g. prefer id-based selectors)

7. Modifying

Modifying encompasses setting attributes, text, and HTML of elements, as well as appending and removing elements. It is done to the DOM tree previously generated by jsoup – the Document.

7.1. Setting Attributes and Inner Text/HTML

As in jQuery, the methods to set attributes, text, and HTML bear the same names but also receive the value to be set:

  • attr() – sets an attribute’s values (it creates the attribute if it does not exist)
  • text() – sets element inner text, replacing content
  • html() – sets element inner HTML, replacing content

Let’s look at a quick example of these methods:

timeElement.attr("datetime", "2016-12-16 15:19:54.3");
sectionDiv.text("foo bar");
firstArticle.select("h2").html("<div><span></span></div>");

7.2. Creating and Appending Elements

To add a new element, you need to build it first by instantiating Element. Once the Element has been built, you can append it to another Element using the appendChild method. The newly created and appended Element will be inserted at the end of the element where appendChild is called:

Element link = new Element(Tag.valueOf("a"), "")
  .text("Checkout this amazing website!")
  .attr("href", "http://baeldung.com")
  .attr("target", "_blank");
firstArticle.appendChild(link);

7.3. Removing Elements

To remove elements, you need to select them first and run the remove method.

For example, let’s remove all <li> tags that contain the “navbar-link” class from Document, and all images from the first article:

doc.select("li.navbar-link").remove();
firstArticle.select("img").remove();

7.4. Converting the Modified Document to HTML

Finally, since we were changing the Document, we might want to check our work.

To do this, we can explore the Document DOM tree by selecting, traversing, and extracting using the presented methods, or we can simply extract its HTML as a String using the html() method:

String docHtml = doc.html();

The String output is tidy HTML.

8. Conclusion

Jsoup is a great library to scrape any page. If you’re using Java and don’t require browser-based scraping, it’s a library to take into account. It’s familiar and easy to use since it makes use of the knowledge you may have on front-end development and follows good practices and design patterns.

You can learn more about scraping web pages with jsoup by studying the jsoup API and reading the jsoup cookbook.

The source code used in this tutorial can be found in the GitHub project.


Java Web Weekly, Issue 159


1. Spring and Java

>> Java 9 Will Change the Way You Traverse Stack Traces [takipi.com]

The upcoming Java release will feature a very interesting Stack-Walking API.

>> Feedback on Feeding Spring Boot metrics to Elasticsearch [frankel.ch]

A short tutorial explaining how to integrate Spring Boot metrics with Elasticsearch.

>> Java Enums to Be Enhanced with Sharper Type Support [infoq.com]

Java Enums will get some enhancements. Not in Java 9 though 🙂

>> The truth about Optional [insaneprogramming.be]

Optional is not a panacea. Use it where it was designed to be used.

>> Fixing Bugs in Running Java Code with Dynamic Attach [sitepoint.com]

About patching JVM applications on the fly 🙂

>> Why HTTP/2 with TLS is not supported properly in Java – And what you can do about it [vanwilgenburg.com]

An in-depth look at the compatibility of TLS-enabled HTTP/2 and Java.

>> 2017 Predictions [adambien.blog]

Adam Bien’s 11 predictions for 2017.

>> Staring Into My Java Crystal Ball [azul.com]

And another writeup focused on 2017, this time all about the upcoming Java releases.

>> The JVM is not that heavy [opensourcery.co.za]

Some actual numbers opposing the “JVM is too heavy” narrative.

>> Jigsaw’s Missing Pieces [wildfly.org]

Notes from the Wildfly lead on the state of the Jigsaw implementation, and more importantly the gaps in that implementation.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> The Dark Path [cleancoder.com]

Uncle Bob’s thoughts about features available in languages such as Kotlin or Swift.

>> Semantic Versioning is not enough [scottlogic.com]

A few thoughts about the flaws of Semantic Versioning.

>> Flexible group-based permissions management! [dynatrace.com]

This is supposed to be an internal update from Dynatrace.

Ignoring that aspect entirely – it’s a solid, mature example of how a permission management UI can be implemented.

Also worth reading:

3. Musings

>> If You Build It, They Won’t Come [daedtech.com]

Do not underestimate the power of sales and marketing 🙂

>> Publicly Dogfooding Your Culture [zachholman.com]

A very interesting write-up about the importance of transparency when growing a company.

>> Choose wisely [ontestautomation.com]

A few thoughts about APIs and automated testing.

Also worth reading:

4. Comics

And my favorite Dilberts of the week:

>> I’m your CEO, but I’m still like a regular person [dilbert.com]

>> It takes money to make money [dilbert.com]

>> This went differently than expected [dilbert.com]

5. Pick of the Week

>> Quitting something you love [sivers.org]

A Guide to the Spring Task Scheduler


1. Overview

In this article, we’ll discuss the Spring task scheduling mechanism – TaskScheduler and its pre-built implementations – along with the different triggers to use. If you want to read more about scheduling in Spring, check out the @Async and @Scheduled articles.

TaskScheduler was introduced in Spring 3.0 with a variety of methods to schedule tasks to run at some point in the future. It also returns a representation object of the ScheduledFuture interface, which can be used to cancel a scheduled task or check whether it’s done.

All we need to do is select a runnable task for scheduling and then select a proper scheduling policy.

2. ThreadPoolTaskScheduler

ThreadPoolTaskScheduler is well suited for internal thread management, as it delegates tasks to the ScheduledExecutorService and implements the TaskExecutor interface – so that a single instance of it is able to handle asynchronous executions as well as the @Scheduled annotation.

Let’s now define the ThreadPoolTaskScheduler bean in ThreadPoolTaskSchedulerConfig:

@Configuration
@ComponentScan(
  basePackages="org.baeldung.taskscheduler",
  basePackageClasses={ThreadPoolTaskSchedulerExamples.class})
public class ThreadPoolTaskSchedulerConfig {

    @Bean
    public ThreadPoolTaskScheduler threadPoolTaskScheduler(){
        ThreadPoolTaskScheduler threadPoolTaskScheduler
          = new ThreadPoolTaskScheduler();
        threadPoolTaskScheduler.setPoolSize(5);
        threadPoolTaskScheduler.setThreadNamePrefix(
          "ThreadPoolTaskScheduler");
        return threadPoolTaskScheduler;
    }
}

The configured bean threadPoolTaskScheduler can execute tasks asynchronously based on the configured pool size of 5.

Note that all ThreadPoolTaskScheduler related thread names will be prefixed with ThreadPoolTaskScheduler.

Let’s implement a simple task we can then schedule:

class RunnableTask implements Runnable{
    private String message;
    
    public RunnableTask(String message){
        this.message = message;
    }
    
    @Override
    public void run() {
        System.out.println(new Date()+" Runnable Task with "+message
          +" on thread "+Thread.currentThread().getName());
    }
}

We can now simply schedule this task to be executed by the scheduler:

taskScheduler.schedule(
  new RunnableTask("Specific time, 3 Seconds from now"),
  new Date(System.currentTimeMillis() + 3000)
);

The taskScheduler will schedule this runnable task at a known date, exactly 3 seconds after the current time.
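Under the hood, ThreadPoolTaskScheduler delegates to a plain JDK ScheduledExecutorService, so the same one-shot, delayed execution can be sketched with nothing but java.util.concurrent (the class and method names below are our own, not Spring's API):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

public class OneShotScheduler {

    // Schedules the task after the given delay and blocks for its result --
    // the JDK facility behind taskScheduler.schedule(task, date)
    public static String runAfterDelay(String message, long delayMillis) {
        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
        try {
            ScheduledFuture<String> future = scheduler.schedule(
              () -> "Ran task with " + message, delayMillis, TimeUnit.MILLISECONDS);
            return future.get(); // blocks until the delayed task has run
        } catch (Exception e) {
            throw new IllegalStateException(e);
        } finally {
            scheduler.shutdown();
        }
    }

    public static void main(String[] args) {
        System.out.println(runAfterDelay("a 100 ms delay", 100));
    }
}
```

Blocking on the returned future is only for demonstration here; in a real application the scheduler keeps running and the future is used to cancel or inspect the task.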

Let’s now go a bit more in depth with the ThreadPoolTaskScheduler scheduling mechanisms.

3. Scheduling a Runnable Task with a Fixed Delay

Scheduling with a fixed delay can be done with two simple mechanisms:

3.1. Scheduling After a Fixed Delay from the Last Scheduled Execution

Let’s configure a task to run after a fixed delay of 1000 milliseconds:

taskScheduler.scheduleWithFixedDelay(
  new RunnableTask("Fixed 1 second Delay"), 1000);

The RunnableTask will always run with a delay of 1000 milliseconds between the completion of one execution and the start of the next.

3.2. Scheduling after a Fixed Delay of a Specific Date

Let’s configure a task to run after a fixed delay of a given start time:

taskScheduler.scheduleWithFixedDelay(
  new RunnableTask("Current Date Fixed 1 second Delay"),
  new Date(),
  1000);

The RunnableTask will be invoked at the specified execution time – mainly the time at which the @PostConstruct method starts – and subsequently with a delay of 1000 milliseconds.

4. Scheduling at a Fixed Rate

There are two simple mechanisms for scheduling runnable tasks at a fixed rate:

4.1. Scheduling The RunnableTask at a Fixed Rate

Let’s schedule a task to run at a fixed rate of 2000 milliseconds:

taskScheduler.scheduleAtFixedRate(
  new RunnableTask("Fixed Rate of 2 seconds") , 2000);

The next RunnableTask will always run after 2000 milliseconds, no matter the status of the last execution, which may still be running.
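The same fixed-rate behavior is available on the JDK's own ScheduledExecutorService; this small, self-contained sketch (our own helper, not Spring's API) counts down a latch on every run to show the task firing repeatedly:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class FixedRateDemo {

    // Returns true if the task managed to run the requested number of times
    // within the timeout; with a fixed rate, runs start every periodMillis
    // regardless of how long the previous run took.
    public static boolean ranTimes(int times, long periodMillis, long timeoutMillis) {
        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
        CountDownLatch latch = new CountDownLatch(times);
        scheduler.scheduleAtFixedRate(
          latch::countDown, 0, periodMillis, TimeUnit.MILLISECONDS);
        try {
            return latch.await(timeoutMillis, TimeUnit.MILLISECONDS);
        } catch (InterruptedException e) {
            throw new IllegalStateException(e);
        } finally {
            scheduler.shutdownNow();
        }
    }

    public static void main(String[] args) {
        System.out.println("3 runs at a 50 ms rate: " + ranTimes(3, 50, 2000));
    }
}
```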

4.2. Scheduling The RunnableTask at a Fixed Rate from a Given Date

taskScheduler.scheduleAtFixedRate(new RunnableTask(
  "Fixed Rate of 2 seconds"), new Date(), 3000);

The RunnableTask will start running at the given date – here, the current time – and subsequently at a fixed rate of 3000 milliseconds.

5. Scheduling with CronTrigger

CronTrigger is used to schedule a task based on a cron expression:

CronTrigger cronTrigger 
  = new CronTrigger("10 * * * * ?");

The provided trigger can be used to run a task according to a certain specified cadence or schedule:

taskScheduler.schedule(new RunnableTask("Cron Trigger"), cronTrigger);

In this case, the RunnableTask will be executed at the 10th second of every minute.
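The six fields of the expression are second, minute, hour, day of month, month, and day of week, so "10 * * * * ?" fixes only the second. As a toy model of the trigger's semantics (our own illustration, not Spring's implementation), the next fire time can be computed like this:

```java
import java.time.LocalDateTime;

public class TenthSecondCron {

    // Next time matching the cron expression "10 * * * * ?" strictly after
    // 'now': the 10th second of the current minute if it is still ahead,
    // otherwise the 10th second of the next minute.
    public static LocalDateTime next(LocalDateTime now) {
        LocalDateTime candidate = now.withSecond(10).withNano(0);
        return candidate.isAfter(now) ? candidate : candidate.plusMinutes(1);
    }

    public static void main(String[] args) {
        System.out.println(next(LocalDateTime.of(2017, 1, 1, 12, 0, 3)));
        System.out.println(next(LocalDateTime.of(2017, 1, 1, 12, 0, 45)));
    }
}
```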

6. Scheduling with PeriodicTrigger

Let’s use PeriodicTrigger for scheduling a task with a fixed delay of 2000 milliseconds:

PeriodicTrigger periodicTrigger 
  = new PeriodicTrigger(2000, TimeUnit.MILLISECONDS);

The configured PeriodicTrigger bean will be used to run a task after a fixed delay of 2000 milliseconds.

Now let’s schedule the RunnableTask with the PeriodicTrigger:

taskScheduler.schedule(
  new RunnableTask("Periodic Trigger"), periodicTrigger);

We can also configure PeriodicTrigger to fire at a fixed rate rather than with a fixed delay, and we can set an initial delay in milliseconds for the first scheduled task.

All we need to do is add two lines of code before the return statement in the periodicTrigger bean:

periodicTrigger.setFixedRate(true);
periodicTrigger.setInitialDelay(1000);

We used the setFixedRate method to schedule the task at a fixed rate rather than with a fixed delay, and then the setInitialDelay method to set an initial delay that applies only to the first run of the task.

7. Conclusion

In this quick article, we’ve illustrated how to schedule a runnable task using the Spring support for tasks.

We looked at running the task with a fixed delay, at a fixed rate and according to a specified trigger.

And, as always, the code is available as a Maven project over on GitHub.

A Guide to JGit

1. Introduction

JGit is a lightweight, pure Java library implementation of the Git version control system – including repository access routines, network protocols, and core version control algorithms.

JGit is a relatively full-featured implementation of Git written in Java and is widely used in the Java community. The JGit project is under the Eclipse umbrella, and its home can be found at JGit.

In this tutorial, we’ll explain how to work with it.

2. Getting Started

There are a number of ways to connect your project with JGit and start writing code. Probably the easiest way is to use Maven – the integration is accomplished by adding the following snippet to the <dependencies> tag in our pom.xml file:

<dependency>
    <groupId>org.eclipse.jgit</groupId>
    <artifactId>org.eclipse.jgit</artifactId>
    <version>4.6.0.201612231935-r</version>
</dependency>

Please visit the Maven Central repository for the newest version of JGit. Once this step is done, Maven will automatically acquire and use the JGit libraries that we’ll need.

If you prefer OSGi bundles, there is also a p2 repository. Please visit Eclipse JGit to get the necessary information on how to integrate this library.

3. Creating a Repository

JGit has two basic levels of API: plumbing and porcelain. The terminology for these comes from Git itself. JGit is divided into the same areas:

  • porcelain APIs – front-end for common user-level actions (similar to Git command-line tool)
  • plumbing APIs – direct interacting with low-level repository objects

The starting point for most JGit sessions is the Repository class. The first thing we are going to do is create a new Repository instance.

The init command will let us create an empty repository:

Git git = Git.init().setDirectory("/path/to/repo").call();

This will create a repository with a working directory at the location given to setDirectory().

An existing repository can be cloned with the cloneRepository command:

Git git = Git.cloneRepository()
  .setURI("https://github.com/eclipse/jgit.git")
  .setDirectory("/path/to/repo")
  .call();

The code above will clone the JGit repository into the local directory named path/to/repo.

4. Git Objects

All objects are represented by an SHA-1 id in the Git object model. In JGit, this is represented by the AnyObjectId and ObjectId classes.
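For example, a blob's id is the SHA-1 of the header "blob <size>\0" followed by the file content, which we can reproduce with just the JDK; the expected hash in the test below is the well-known one for "test content\n" from the Pro Git book:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class GitBlobId {

    // SHA-1 over "blob <length>\0<content>", hex-encoded -- the same id
    // that JGit's ObjectId wraps for a blob with this content.
    public static String blobId(String content) {
        try {
            byte[] body = content.getBytes(StandardCharsets.UTF_8);
            MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
            sha1.update(("blob " + body.length + "\0").getBytes(StandardCharsets.UTF_8));
            sha1.update(body);
            StringBuilder hex = new StringBuilder();
            for (byte b : sha1.digest()) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(blobId("test content\n"));
    }
}
```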

There are four types of objects in the Git object model:

  • blob – used for storing file data
  • tree –  a directory; it references other trees and blobs
  • commit – points to a single tree
  • tag – marks a commit as special; generally used for marking specific releases

To resolve an object from a repository, simply pass the right revision as in the following function:

ObjectId head = repository.resolve("HEAD");

4.1. Ref

The Ref is a variable that holds a single object identifier. The object identifier can be any valid Git object (blob, tree, commit, tag).

For example, to query for the reference to head, you can simply call:

Ref HEAD = repository.getRef("refs/heads/master");

4.2. RevWalk

The RevWalk walks a commit graph and produces the matching commits in order:

RevWalk walk = new RevWalk(repository);
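Conceptually, the walk traverses the commit DAG from a starting commit back through its parents. A toy, JDK-only model of that idea (our own sketch, not JGit's actual implementation):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class CommitWalk {

    // Walks from 'start' through the parents map and returns each commit
    // once, in the order first reached (a breadth-first history walk).
    public static List<String> walk(String start, Map<String, List<String>> parents) {
        Set<String> seen = new LinkedHashSet<>();
        Deque<String> queue = new ArrayDeque<>();
        queue.add(start);
        while (!queue.isEmpty()) {
            String commit = queue.poll();
            if (seen.add(commit)) {
                queue.addAll(parents.getOrDefault(commit, List.of()));
            }
        }
        return new ArrayList<>(seen);
    }

    public static void main(String[] args) {
        // c2 -> c1 -> c0: a linear three-commit history
        Map<String, List<String>> parents =
          Map.of("c2", List.of("c1"), "c1", List.of("c0"));
        System.out.println(walk("c2", parents));
    }
}
```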

4.3. RevCommit

The RevCommit represents a commit in the Git object model. To parse a commit, use a RevWalk instance:

RevWalk walk = new RevWalk(repository);
RevCommit commit = walk.parseCommit(objectIdOfCommit);

4.4. RevTag

The RevTag represents a tag in the Git object model. You can use a RevWalk instance to parse a tag:

RevWalk walk = new RevWalk(repository);
RevTag tag = walk.parseTag(objectIdOfTag);

4.5. RevTree

The RevTree represents a tree in the Git object model. A RevWalk instance is also used to parse a tree:

RevWalk walk = new RevWalk(repository);
RevTree tree = walk.parseTree(objectIdOfTree);

5. Porcelain API

While JGit contains a lot of low-level code to work with Git repositories, it also contains a higher level API that mimics some of the Git porcelain commands in the org.eclipse.jgit.api package.

5.1. AddCommand (git-add)

The AddCommand allows you to add files to the index via:

  • addFilepattern()

Here’s a quick example of how to add a set of files to the index using the porcelain API:

Git git = new Git(db);
AddCommand add = git.add();
add.addFilepattern("someDirectory").call();

5.2. CommitCommand (git-commit)

The CommitCommand allows you to perform commits and has the following options available:

  • setAuthor()
  • setCommitter()
  • setAll()

Here’s a quick example of how to commit using the porcelain API:

Git git = new Git(db);
CommitCommand commit = git.commit();
commit.setMessage("initial commit").call();

5.3. TagCommand (git-tag)

The TagCommand supports a variety of tagging options:

  • setName()
  • setMessage()
  • setTagger()
  • setObjectId()
  • setForceUpdate()
  • setSigned()

Here’s a quick example of tagging a commit using the porcelain API:

Git git = new Git(db);
RevCommit commit = git.commit().setMessage("initial commit").call();
RevTag tag = git.tag().setName("tag").call();

5.4. LogCommand (git-log)

The LogCommand allows you to easily walk a commit graph.

  • add(AnyObjectId start)
  • addRange(AnyObjectId since, AnyObjectId until)

Here’s a quick example of how to get some log messages:

Git git = new Git(db);
Iterable<RevCommit> log = git.log().call();

6. Ant Tasks

JGit also has some common Ant tasks contained in the org.eclipse.jgit.ant bundle.

To use those tasks:

<taskdef resource="org/eclipse/jgit/ant/ant-tasks.properties">
    <classpath>
        <pathelement location="path/to/org.eclipse.jgit.ant-VERSION.jar"/>
        <pathelement location="path/to/org.eclipse.jgit-VERSION.jar"/>
        <pathelement location="path/to/jsch-0.1.44-1.jar"/>
    </classpath>
</taskdef>

This would provide the git-clone, git-init and git-checkout tasks.

6.1. git-clone

<git-clone uri="http://egit.eclipse.org/jgit.git" />

The following attributes are required:

  • uri: the URI to clone from

The following attributes are optional:

  • dest: the destination to clone to (defaults to use a human readable directory name based on the last path component of the URI)
  • bare: true/false/yes/no to indicate if the cloned repository should be bare or not (defaults to false)
  • branch: the initial branch to check out when cloning the repository (defaults to HEAD)

6.2. git-init

<git-init />

No attributes are required to run the git-init task.

The following attributes are optional:

  • dest: the path where a git repository is initialized (defaults to $GIT_DIR or the current directory)
  • bare: true/false/yes/no to indicate if the repository should be bare or not (defaults to false)

6.3. git-checkout

<git-checkout src="path/to/repo" branch="origin/newbranch" />

The following attributes are required:

  • src: the path to the git repository
  • branch: the initial branch to checkout

The following attributes are optional:

  • createbranch: true/false/yes/no to indicate whether the branch should be created if it does not already exist (defaults to false)
  • force: true/false/yes/no: if true/yes and the branch with the given name already exists, the start-point of an existing branch will be set to a new start-point; if false, the existing branch will not be changed (defaults to false)

7. Conclusion

The high-level JGit API isn’t hard to understand. If you know what git command to use, you can easily guess which classes and methods to use in JGit.

There is a collection of ready-to-run JGit code snippets available here.

If you still have difficulties or questions, please leave a comment here or ask the JGit community for assistance.

Creating PDF Files in Java

1. Introduction

In this quick article, we’ll focus on creating a PDF document from scratch, based on the popular iText and PdfBox libraries.

2. Maven Dependencies

Let’s take a look at the Maven dependencies which need to be included in our project:

<dependency>
    <groupId>com.itextpdf</groupId>
    <artifactId>itextpdf</artifactId>
    <version>5.5.10</version>
</dependency>
<dependency>
    <groupId>org.apache.pdfbox</groupId>
    <artifactId>pdfbox</artifactId>
    <version>2.0.4</version>
</dependency>

The latest version of the libraries can be found here: iText and PdfBox.

One extra dependency is necessary in case our file needs to be encrypted. The Bouncy Castle Provider package contains implementations of cryptographic algorithms and is required by both libraries:

<dependency>
    <groupId>org.bouncycastle</groupId>
    <artifactId>bcprov-jdk15on</artifactId>
    <version>1.56</version>
</dependency>

The latest version of the library can be found here: The Bouncy Castle Provider.

3. Overview

Both iText and PdfBox are Java libraries used for the creation and manipulation of PDF files. Although the final output of the libraries is the same, they operate in a slightly different manner. Let’s take a look at them.

4. Create Pdf in IText

4.1. Insert Text in Pdf

Let’s have a look at how to insert “Hello World” text into a new PDF file:

Document document = new Document();
PdfWriter.getInstance(document, new FileOutputStream("iTextHelloWorld.pdf"));

document.open();
Font font = FontFactory.getFont(FontFactory.COURIER, 16, BaseColor.BLACK);
Chunk chunk = new Chunk("Hello World", font);

document.add(chunk);
document.close();

Creating a PDF with the iText library is based on manipulating objects implementing the Element interface in a Document (in version 5.5.10 there are 45 of those implementations).

The smallest element which can be added to the document and used is called Chunk, which is basically a string with applied font.

Additionally, Chunks can be combined with other elements like Paragraphs, Sections etc., resulting in nice looking documents.

4.2. Inserting Image

The iText library provides an easy way to add an image to the document. We simply need to create an Image instance and add it to the Document.

Path path = Paths.get(ClassLoader.getSystemResource("Java_logo.png").toURI());

Document document = new Document();
PdfWriter.getInstance(document, new FileOutputStream("iTextImageExample.pdf"));
document.open();
Image img = Image.getInstance(path.toAbsolutePath().toString());
document.add(img);

document.close();

4.3. Inserting Table

We might face a problem when we would like to add a table to our PDF. Luckily, iText provides such functionality out-of-the-box.

First, we need to create a PdfPTable object, providing the number of columns for our table in the constructor. Then we can simply add a new cell by calling the addCell method on the newly created table object. iText will create table rows as long as all necessary cells are defined – meaning that if you create a table with 3 columns and add 8 cells to it, only 2 rows with 3 cells each will be displayed.

Let’s take a look at the example:

Document document = new Document();
PdfWriter.getInstance(document, new FileOutputStream("iTextTable.pdf"));

document.open();

PdfPTable table = new PdfPTable(3);
addTableHeader(table);
addRows(table);
addCustomRows(table);

document.add(table);
document.close();

We create a new table with 3 columns and 3 rows. We’ll treat the first row as a table header with a changed background color and border width:

private void addTableHeader(PdfPTable table) {
    Stream.of("column header 1", "column header 2", "column header 3")
      .forEach(columnTitle -> {
        PdfPCell header = new PdfPCell();
        header.setBackgroundColor(BaseColor.LIGHT_GRAY);
        header.setBorderWidth(2);
        header.setPhrase(new Phrase(columnTitle));
        table.addCell(header);
    });
}

The second row will be composed of three cells with just text and no extra formatting.

private void addRows(PdfPTable table) {
    table.addCell("row 1, col 1");
    table.addCell("row 1, col 2");
    table.addCell("row 1, col 3");
}

We can include not only text in cells but also images. Additionally, each cell might be formatted individually; in the example presented below, we apply horizontal and vertical alignment adjustments:

private void addCustomRows(PdfPTable table) 
  throws URISyntaxException, BadElementException, IOException {
    Path path = Paths.get(ClassLoader.getSystemResource("Java_logo.png").toURI());
    Image img = Image.getInstance(path.toAbsolutePath().toString());
    img.scalePercent(10);

    PdfPCell imageCell = new PdfPCell(img);
    table.addCell(imageCell);

    PdfPCell horizontalAlignCell = new PdfPCell(new Phrase("row 2, col 2"));
    horizontalAlignCell.setHorizontalAlignment(Element.ALIGN_CENTER);
    table.addCell(horizontalAlignCell);

    PdfPCell verticalAlignCell = new PdfPCell(new Phrase("row 2, col 3"));
    verticalAlignCell.setVerticalAlignment(Element.ALIGN_BOTTOM);
    table.addCell(verticalAlignCell);
}

4.4. File Encryption

In order to apply permissions using the iText library, we need to have an already created PDF document. In our example, we will use the iTextHelloWorld.pdf file generated previously.

Once we load the file using PdfReader, we need to create a PdfStamper, which is used to apply additional content to the file, like metadata, encryption, etc.:

PdfReader pdfReader = new PdfReader("iTextHelloWorld.pdf");
PdfStamper pdfStamper 
  = new PdfStamper(pdfReader, new FileOutputStream("encryptedPdf.pdf"));

pdfStamper.setEncryption(
  "userpass".getBytes(),
  ".getBytes(),
  0,
  PdfWriter.ENCRYPTION_AES_256
);

pdfStamper.close();

In our example, we encrypted the file with two passwords: the user password (“userpass”), with which a user has only a read-only right and no possibility to print the file, and the owner password (“ownerpass”), which is used as a master key allowing a person full access to the PDF.

If we want to allow the user to print pdf, instead of 0 (third parameter of setEncryption) we can pass:

PdfWriter.ALLOW_PRINTING

Of course, we can mix different permissions like:

PdfWriter.ALLOW_PRINTING | PdfWriter.ALLOW_COPY
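These permission constants are plain int bit flags, so | accumulates permissions and & tests for one. A sketch with illustrative stand-in values (not PdfWriter's real constants):

```java
public class PermissionFlags {

    // Hypothetical stand-ins for PdfWriter.ALLOW_PRINTING / ALLOW_COPY
    static final int ALLOW_PRINTING = 1 << 2;
    static final int ALLOW_COPY = 1 << 4;

    // A flag is granted when all of its bits are present in 'permissions'
    static boolean isAllowed(int permissions, int flag) {
        return (permissions & flag) == flag;
    }

    public static void main(String[] args) {
        int permissions = ALLOW_PRINTING | ALLOW_COPY;
        System.out.println("print allowed: " + isAllowed(permissions, ALLOW_PRINTING));
        System.out.println("copy allowed:  " + isAllowed(permissions, ALLOW_COPY));
        System.out.println("nothing else:  " + isAllowed(0, ALLOW_PRINTING));
    }
}
```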

Keep in mind that when using iText to set access permissions, we are also creating a temporary PDF which should be deleted – if not, it could be fully accessible to anybody.

5. Create Pdf in PdfBox

5.1. Insert Text in Pdf

As opposed to iText, the PdfBox library provides an API based on stream manipulation. There are no classes like Chunk/Paragraph etc. The PDDocument class is an in-memory PDF representation, where the user writes data by manipulating the PDPageContentStream class.

Let’s take a look at the code example:

PDDocument document = new PDDocument();
PDPage page = new PDPage();
document.addPage(page);

PDPageContentStream contentStream = new PDPageContentStream(document, page);

contentStream.setFont(PDType1Font.COURIER, 12);
contentStream.beginText();
contentStream.showText("Hello World");
contentStream.endText();
contentStream.close();

document.save("pdfBoxHelloWorld.pdf");
document.close();

5.2. Inserting Image

Inserting images is straightforward.

First, we need to load a file and create a PDImageXObject, and subsequently draw it on the document (we need to provide the exact x, y coordinates).

That’s all:

PDDocument document = new PDDocument();
PDPage page = new PDPage();
document.addPage(page);

Path path = Paths.get(ClassLoader.getSystemResource("Java_logo.png").toURI());
PDPageContentStream contentStream = new PDPageContentStream(document, page);
PDImageXObject image 
  = PDImageXObject.createFromFile(path.toAbsolutePath().toString(), document);
contentStream.drawImage(image, 0, 0);
contentStream.close();

document.save("pdfBoxImage.pdf");
document.close();

5.3. Inserting a Table

Unfortunately, PdfBox does not provide any out-of-the-box methods for creating tables. What we can do in such a situation is draw the table manually – literally, draw each line until our drawing resembles the desired table.
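The geometry of such a hand-drawn grid is simple: a rows × cols table needs (rows + 1) horizontal and (cols + 1) vertical lines, each of which would then be stroked on the content stream. The helper below (our own sketch, not a PdfBox API) computes those segments; note that PDF y coordinates grow upward, hence the subtraction:

```java
import java.util.ArrayList;
import java.util.List;

public class TableGrid {

    // Returns the segments {x1, y1, x2, y2} of a rows x cols grid whose
    // top-left corner is (x, y): (rows + 1) horizontal plus (cols + 1)
    // vertical lines.
    public static List<float[]> gridLines(float x, float y, float width, float height,
                                          int rows, int cols) {
        List<float[]> lines = new ArrayList<>();
        float rowHeight = height / rows;
        float colWidth = width / cols;
        for (int r = 0; r <= rows; r++) {
            lines.add(new float[] { x, y - r * rowHeight, x + width, y - r * rowHeight });
        }
        for (int c = 0; c <= cols; c++) {
            lines.add(new float[] { x + c * colWidth, y, x + c * colWidth, y - height });
        }
        return lines;
    }

    public static void main(String[] args) {
        // A 3-row, 3-column table needs 4 + 4 = 8 lines
        System.out.println(gridLines(50, 700, 500, 90, 3, 3).size());
    }
}
```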

5.4. File Encryption

The PdfBox library provides the ability to encrypt and adjust file permissions for the user. Compared to iText, it does not require an already existing file, as we simply use PDDocument. PDF file permissions are handled by the AccessPermission class, where we can set whether a user will be able to modify, extract content from, or print a file.

Subsequently, we create a StandardProtectionPolicy object, which adds password-based protection to the document. We can specify two types of passwords: the user password, with which a person will be able to open the file with the applied access permissions, and the owner password, which has no limitations on the file:

PDDocument document = new PDDocument();
PDPage page = new PDPage();
document.addPage(page);

AccessPermission accessPermission = new AccessPermission();
accessPermission.setCanPrint(false);
accessPermission.setCanModify(false);

StandardProtectionPolicy standardProtectionPolicy 
  = new StandardProtectionPolicy("ownerpass", "userpass", accessPermission);
document.protect(standardProtectionPolicy);
document.save("pdfBoxEncryption.pdf");
document.close();

Our example presents a situation where, if a user provides the user password, the file cannot be modified or printed.

6. Conclusions

In this tutorial, we discussed ways of creating a PDF file with two popular Java libraries.

Full examples can be found in the Maven based project over on GitHub.

Guide to Pattern Matching in Javaslang

1. Overview

In this article, we are going to focus on pattern matching with Javaslang. If you do not know what Javaslang is, please read Javaslang’s Overview first.

Pattern matching is a feature that is not natively available in Java. One could think of it as the advanced form of a switch-case statement.

The advantage of Javaslang’s pattern matching is that it saves us from writing stacks of switch cases or if-then-else statements. It, therefore, reduces the amount of code and represents conditional logic in a human-readable way.

We can use the pattern matching API by making the following import from Javaslang 2.0 onwards:

import static javaslang.API.*;

2. How Pattern Matching Works

As we saw in the previous article, pattern matching can be used to replace a switch block:

@Test
public void whenSwitchWorksAsMatcher_thenCorrect() {
    int input = 2;
    String output;
    switch (input) {
    case 0:
        output = "zero";
        break;
    case 1:
        output = "one";
        break;
    case 2:
        output = "two";
        break;
    case 3:
        output = "three";
        break;
    default:
        output = "unknown";
        break;
    }

    assertEquals("two", output);
}

Or multiple if statements:

@Test
public void whenIfWorksAsMatcher_thenCorrect() {
    int input = 3;
    String output;
    if (input == 0) {
        output = "zero";
    }
    if (input == 1) {
        output = "one";
    }
    if (input == 2) {
        output = "two";
    }
    if (input == 3) {
        output = "three";
    } else {
        output = "unknown";
    }

    assertEquals("three", output);
}

The snippets we have seen so far are verbose and therefore error-prone. When using pattern matching, we use three main building blocks: the two static methods Match and Case, and atomic patterns.

Atomic patterns represent the condition that should be evaluated to return a boolean value:

  • $(): a wild-card pattern that is similar to the default case in a switch statement. It handles a scenario where no match is found
  • $(value): this is the equals pattern where a value is simply equals-compared to the input.
  • $(predicate): this is the conditional pattern where a predicate function is applied to the input and the resulting boolean is used to make a decision.

The switch and if approaches could be replaced by a shorter and more concise piece of code as below:

@Test
public void whenMatchworks_thenCorrect() {
    int input = 2;
    String output = Match(input).of(
      Case($(1), "one"), 
      Case($(2), "two"), 
      Case($(3), "three"), 
      Case($(), "?"));
        
    assertEquals("two", output);
}

If the input does not get a match, the wild-card pattern gets evaluated:

@Test
public void whenMatchesDefault_thenCorrect() {
    int input = 5;
    String output = Match(input).of(
      Case($(1), "one"), 
      Case($(), "unknown"));

    assertEquals("unknown", output);
}

If there is no wild-card pattern and the input does not get matched, we will get a match error:

@Test(expected = MatchError.class)
public void givenNoMatchAndNoDefault_whenThrows_thenCorrect() {
    int input = 5;
    Match(input).of(
      Case($(1), "one"), 
      Case($(2), "two"));
}

In this section, we have covered the basics of Javaslang pattern matching and the following sections will cover various approaches to tackling different cases we are likely to encounter in our code.

3. Match With Option

As we saw in the previous section, the wild-card pattern $() matches default cases where no match is found for the input.

However, another alternative to including a wild-card pattern is wrapping the return value of a match operation in an Option instance:

@Test
public void whenMatchWorksWithOption_thenCorrect() {
    int i = 10;
    Option<String> s = Match(i)
      .option(Case($(0), "zero"));

    assertTrue(s.isEmpty());
    assertEquals("None", s.toString());
}

To get a better understanding of Option in Javaslang, you can refer to the introductory article.

4. Match With Inbuilt Predicates

Javaslang ships with some inbuilt predicates that make our code more human-readable. Therefore, our initial examples can be improved further with predicates:

@Test
public void whenMatchWorksWithPredicate_thenCorrect() {
    int i = 3;
    String s = Match(i).of(
      Case(is(1), "one"), 
      Case(is(2), "two"), 
      Case(is(3), "three"),
      Case($(), "?"));

    assertEquals("three", s);
}

Javaslang offers more predicates than this. For example, we can make our condition check the class of the input instead:

@Test
public void givenInput_whenMatchesClass_thenCorrect() {
    Object obj=5;
    String s = Match(obj).of(
      Case(instanceOf(String.class), "string matched"), 
      Case($(), "not string"));

    assertEquals("not string", s);
}

Or whether the input is null or not:

@Test
public void givenInput_whenMatchesNull_thenCorrect() {
    Object obj=5;
    String s = Match(obj).of(
      Case(isNull(), "no value"), 
      Case(isNotNull(), "value found"));

    assertEquals("value found", s);
}

Instead of matching values in equals style, we can use contains style. This way, we can check if an input exists in a list of values with the isIn predicate:

@Test
public void givenInput_whenContainsWorks_thenCorrect() {
    int i = 5;
    String s = Match(i).of(
      Case(isIn(2, 4, 6, 8), "Even Single Digit"), 
      Case(isIn(1, 3, 5, 7, 9), "Odd Single Digit"), 
      Case($(), "Out of range"));

    assertEquals("Odd Single Digit", s);
}

There is more we can do with predicates, like combining multiple predicates as a single match case. To match only when the input passes all of a given group of predicates, we can AND the predicates using the allOf predicate.

A practical case would be where we want to check if a number is contained in a list, as we did in the previous example. The problem is that the list contains nulls as well. So, we want to apply a filter that, apart from rejecting numbers which are not in the list, will also reject nulls:

@Test
public void givenInput_whenMatchAllWorks_thenCorrect() {
    Integer i = null;
    String s = Match(i).of(
      Case(allOf(isNotNull(),isIn(1,2,3,null)), "Number found"), 
      Case($(), "Not found"));

    assertEquals("Not found", s);
}
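The allOf combinator behaves like composing plain java.util.function.Predicates with and(); a JDK-only equivalent of the case above (the helper names are ours, not Javaslang's):

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Predicate;

public class PredicateCombinators {

    static final List<Integer> VALID = Arrays.asList(1, 2, 3, null);

    // allOf(isNotNull(), isIn(1, 2, 3, null)) expressed with Predicate.and();
    // and() short-circuits, so null never reaches the contains check
    static final Predicate<Integer> IS_NOT_NULL = i -> i != null;
    static final Predicate<Integer> IS_IN_LIST = VALID::contains;
    static final Predicate<Integer> NUMBER_FOUND = IS_NOT_NULL.and(IS_IN_LIST);

    public static String describe(Integer i) {
        return NUMBER_FOUND.test(i) ? "Number found" : "Not found";
    }

    public static void main(String[] args) {
        System.out.println(describe(null)); // fails the null check
        System.out.println(describe(2));    // passes both predicates
        System.out.println(describe(9));    // not in the list
    }
}
```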

To match when an input matches any of a given group, we can OR the predicates using the anyOf predicate.

Assume we are screening candidates by their year of birth, and we want only candidates who were born in 1990, 1991 or 1992.

If no such candidate is found, then we can only accept those born in 1986 and we want to make this clear in our code too:

@Test
public void givenInput_whenMatchesAnyOfWorks_thenCorrect() {
    Integer year = 1990;
    String s = Match(year).of(
      Case(anyOf(isIn(1990, 1991, 1992), is(1986)), "Age match"), 
      Case($(), "No age match"));
    assertEquals("Age match", s);
}

Finally, we can also negate predicates using the noneOf predicate so that an input gets a match only when all the given conditions evaluate to false.

To demonstrate this, we can negate the condition in the previous example such that we get candidates who are not in the above age groups:

@Test
public void givenInput_whenMatchesNoneOfWorks_thenCorrect() {
    Integer year = 1990;
    String s = Match(year).of(
      Case(noneOf(isIn(1990, 1991, 1992), is(1986)), "Age match"), 
      Case($(), "No age match"));

    assertEquals("No age match", s);
}

5. Match With Custom Predicates

In the previous section, we explored the inbuilt predicates of Javaslang. But Javaslang does not stop there. With the knowledge of lambdas, we can build and use our own predicates or even just write them inline.

With this new knowledge, we can inline a predicate in the first example of the previous section and rewrite it like this:

@Test
public void whenMatchWorksWithCustomPredicate_thenCorrect() {
    int i = 3;
    String s = Match(i).of(
      Case(n -> n == 1, "one"), 
      Case(n -> n == 2, "two"), 
      Case(n -> n == 3, "three"), 
      Case($(), "?"));
    assertEquals("three", s);
}

We can also apply a functional interface in the place of a predicate in case we need more parameters. The contains example can be rewritten like this, albeit a little more verbose, but it gives us more power over what our predicate does:

@Test
public void givenInput_whenContainsWorks_thenCorrect2() {
    int i = 5;
    BiFunction<Integer, List<Integer>, Boolean> contains 
      = (t, u) -> u.contains(t);

    String s = Match(i).of(
      Case(o -> contains
        .apply(i, Arrays.asList(2, 4, 6, 8)), "Even Single Digit"), 
      Case(o -> contains
        .apply(i, Arrays.asList(1, 3, 5, 7, 9)), "Odd Single Digit"), 
      Case($(), "Out of range"));

    assertEquals("Odd Single Digit", s);
}

In the above example, we created a Java 8 BiFunction which simply checks the isIn relationship between the two arguments.

You could have used Javaslang’s FunctionN for this as well. Therefore, if the inbuilt predicates do not quite match your requirements or you want to have control over the whole evaluation, then use custom predicates.

6. Object Decomposition

Object decomposition is the process of breaking a Java object into its component parts. For example, consider the case of abstracting an employee’s bio-data alongside employment information:

public class Employee {

    private String name;
    private String id;

    //standard constructor, getters and setters
}

We can decompose an Employee’s record into its component parts: name and id. This is quite obvious in Java:

@Test
public void givenObject_whenDecomposesJavaWay_whenCorrect() {
    Employee person = new Employee("Carl", "EMP01");

    String result = "not found";
    if (person != null && "Carl".equals(person.getName())) {
        String id = person.getId();
        result="Carl has employee id "+id;
    }

    assertEquals("Carl has employee id EMP01", result);
}

We create an employee object, then we first check whether it is null before applying a filter to ensure we end up with the record of an employee whose name is Carl. We then go ahead and retrieve his id. The Java way works, but it is verbose and error-prone.

What we are basically doing in the above example is matching what we know with what is coming in. We know we want an employee called Carl, so we try to match this name to the incoming object.

We then break down his details to get a human readable output. The null checks are simply defensive overheads we don’t need.
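For comparison, plain Java 8 can at least fold the null check into an Optional chain. Here is a rough sketch (the OptionalDecomposition class and describe method are illustrative names of ours; Employee is the same two-field class as above), though note it still gives us no real decomposition of the object into its parts:

```java
import java.util.Optional;

public class OptionalDecomposition {

    static class Employee {
        private final String name;
        private final String id;

        Employee(String name, String id) {
            this.name = name;
            this.id = id;
        }

        String getName() { return name; }
        String getId() { return id; }
    }

    // Optional folds the null check and the name filter into one chain,
    // but we still have to pull the id out manually
    static String describe(Employee person) {
        return Optional.ofNullable(person)
          .filter(p -> "Carl".equals(p.getName()))
          .map(p -> "Carl has employee id " + p.getId())
          .orElse("not found");
    }

    public static void main(String[] args) {
        System.out.println(describe(new Employee("Carl", "EMP01")));
        System.out.println(describe(null));
    }
}
```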

With Javaslang’s Pattern Matching API, we can forget about unnecessary checks and simply focus on what is important, resulting in very compact and readable code.

To use this provision, we must have the additional javaslang-match dependency installed in our project. You can get it by following this link.

The above code can then be written as below:

@Test
public void givenObject_whenDecomposesJavaslangWay_thenCorrect() {
    Employee person = new Employee("Carl", "EMP01");

    String result = Match(person).of(
      Case(Employee($("Carl"), $()),
        (name, id) -> "Carl has employee id "+id),
      Case($(),
        () -> "not found"));
         
    assertEquals("Carl has employee id EMP01", result);
}

The key constructs in the above example are the atomic patterns $(“Carl”) and $(), the value pattern and the wildcard pattern, respectively. We discussed these in detail in the Javaslang introductory article.

Both patterns retrieve values from the matched object and store them in the lambda parameters. The value pattern $(“Carl”) can only match when the retrieved value matches what is inside it, i.e. “Carl”.

On the other hand, the wild card pattern $() matches any value at its position and retrieves the value into the id lambda parameter.

For this decomposition to work, we need to define decomposition patterns or what is formally known as unapply patterns.

This means that we must teach the pattern matching API how to decompose our objects, resulting in one entry for each object to be decomposed:

@Patterns
class Demo {
    @Unapply
    static Tuple2<String, String> Employee(Employee employee) {
        return Tuple.of(employee.getName(), employee.getId());
    }

    // other unapply patterns
}

The annotation processing tool will generate a class called DemoPatterns.java which we have to statically import to wherever we want to apply these patterns:

import static com.baeldung.javaslang.DemoPatterns.*;

We can also decompose inbuilt Java objects.

For instance, java.time.LocalDate can be decomposed into a year, month and day of the month. Let us add its unapply pattern to Demo.java:

@Unapply
static Tuple3<Integer, Integer, Integer> LocalDate(LocalDate date) {
    return Tuple.of(
      date.getYear(), date.getMonthValue(), date.getDayOfMonth());
}

Then the test:

@Test
public void givenObject_whenDecomposesJavaslangWay_thenCorrect2() {
    LocalDate date = LocalDate.of(2017, 2, 13);

    String result = Match(date).of(
      Case(LocalDate($(2016), $(2), $(13)), 
        () -> "2016-02-13"),
      Case(LocalDate($(2016), $(), $()),
        (y, m, d) -> "month " + m + " in 2016"),
      Case(LocalDate($(), $(), $()),  
        (y, m, d) -> "month " + m + " in " + y),
      Case($(), 
        () -> "(catch all)")
    );

    assertEquals("month 2 in 2017", result);
}

7. Side Effects in Pattern Matching

By default, Match acts like an expression, meaning it returns a result. However, we can force it to produce a side-effect by using the helper function run within a lambda.

It takes a method reference or a lambda expression and returns Void. 

Consider a scenario where we want to print something when the input is a single-digit even integer, something else when the input is a single-digit odd number, and throw an exception when the input is neither.

The even number printer:

public void displayEven() {
    System.out.println("Input is even");
}

The odd number printer:

public void displayOdd() {
    System.out.println("Input is odd");
}

And the match function:

@Test
public void whenMatchCreatesSideEffects_thenCorrect() {
    int i = 4;
    Match(i).of(
      Case(isIn(2, 4, 6, 8), o -> run(this::displayEven)), 
      Case(isIn(1, 3, 5, 7, 9), o -> run(this::displayOdd)), 
      Case($(), o -> run(() -> {
          throw new IllegalArgumentException(String.valueOf(i));
      })));
}

Which would print:

Input is even

8. Conclusion

In this article, we have explored the most important parts of the Pattern Matching API in Javaslang. Indeed we can now write simpler and more concise code without the verbose switch and if statements, thanks to Javaslang.

To get the full source code for this article, you can check out the GitHub project.

Overview of AI Libraries in Java


1. Introduction

In this article, we’ll go over an overview of Artificial Intelligence (AI) libraries in Java.

Since this article is about libraries, we’ll not make any introduction to AI itself. Note, however, that a theoretical background in AI is necessary in order to use the libraries presented in this article.

AI is a very wide field, so we will be focusing on some of the most popular fields today, like Natural Language Processing, Machine Learning, Neural Networks and more. In the end, we’ll mention a few interesting AI challenges where you can practice your understanding of AI.

2. Expert Systems

2.1. Apache Jena

Apache Jena is an open source Java framework for building semantic web and linked data applications from RDF data. The official website provides a detailed tutorial on how to use this framework with a quick introduction to RDF specification.

2.2. PowerLoom Knowledge Representation and Reasoning System

PowerLoom is a platform for the creation of intelligent, knowledge-based applications. It provides Java API with detailed documentation which can be found on this link.

2.3. d3web 

d3web is an open source reasoning engine for developing, testing and applying problem-solving knowledge onto a given problem situation, with many algorithms already included. The official website provides a quick introduction to the platform with many examples and documentation.

2.4. Eye

Eye is an open source reasoning engine for performing semi-backward reasoning.

2.5. Tweety

Tweety is a collection of Java frameworks for logical aspects of AI and knowledge representation. The official website provides documentation and many examples.

3. Neural Networks

3.1. Neuroph

Neuroph is an open source Java framework for neural network creation. Users can create networks through provided GUI or Java code. Neuroph provides API documentation which also explains what neural network actually is and how it works.

3.2. Deeplearning4j

Deeplearning4j is a deep learning library for JVM but it also provides API for neural network creation. The official website provides many tutorials and simple theoretical explanations for deep learning and neural networks.

4. Natural Language Processing

4.1. Apache OpenNLP

Apache OpenNLP library is a machine learning based toolkit for the processing of natural language text. The official website provides API documentation with information on how to use the library.

4.2. Stanford CoreNLP

Stanford CoreNLP is the most popular Java NLP framework which provides various tools for performing NLP tasks. The official website provides tutorials and documentation with information on how to use this framework.

5. Machine Learning

5.1. Java Machine Learning Library (Java-ML)

Java-ML is an open source Java framework which provides various machine learning algorithms specifically for programmers. The official website provides API documentation with many code samples and tutorials.

5.2. RapidMiner

RapidMiner is a data science platform which provides various machine learning algorithms through a GUI and a Java API. It has a very big community, many available tutorials, and extensive documentation.

5.3. Weka 

Weka is a collection of machine learning algorithms which can be applied directly to a dataset, through the provided GUI or called through the provided API. As with RapidMiner, the community is very big, providing various tutorials for Weka and for machine learning itself.

5.4. Encog Machine Learning Framework

Encog is a Java machine learning framework which supports many machine learning algorithms. It’s developed by Jeff Heaton from Heaton Research. The official website provides documentation and many examples.

6. Genetic algorithms

6.1. Jenetics 

Jenetics is an advanced genetic algorithm library written in Java. It provides a clear separation of the genetic algorithm concepts. The official website provides documentation and a user guide for new users.

6.2. Watchmaker Framework

Watchmaker Framework is a framework for implementing genetic algorithms in Java. The official website provides documentation, examples, and additional information about the framework itself.

6.3. ECJ 23

ECJ 23 is a Java based research framework with strong algorithmic support for genetic algorithms. ECJ is developed at George Mason University’s ECLab Evolutionary Computation Laboratory. The official website provides extensive documentation and tutorials.

6.4. Java Genetic Algorithms Package (JGAP)

JGAP is a genetic programming component provided as a Java framework. The official website provides documentation and tutorials.

7. Automatic programming

7.1. Spring Roo

Spring Roo is a lightweight developer tool from Spring. It’s using AspectJ mixins to provide separation of concerns during round-trip maintenance.

7.2. Acceleo

Acceleo is an open source code generator for Eclipse which generates code from EMF models defined from any metamodel (UML, SysML, etc.).

8. Challenges

Since AI is very interesting and popular topic, there are many challenges and competitions online. This is a list of some interesting competitions where you can train and test your skills:

9. Conclusion

In this article, we presented various Java AI frameworks which can be used in everyday work.

We also saw that AI is a very wide field with many frameworks and services – all of which can make your applications better and more innovative.

Guide to Spring Email


1. Overview

In this article, we’ll walk through the steps needed to send emails from both a plain vanilla Spring application as well as from a Spring Boot application, the former using the JavaMail library and the latter using the spring-boot-starter-mail dependency.

2. Maven Dependencies

First, we need to add the dependencies to our pom.xml.

2.1. Spring

For use in the plain vanilla Spring framework we’ll add:

<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-context-support</artifactId>
    <version>4.3.5.RELEASE</version>
</dependency>

The latest version may be found here.

2.2. Spring Boot

And for Spring Boot:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-mail</artifactId>
    <version>1.4.3.RELEASE</version>
</dependency>

The latest version is available in the Maven Central repository.

3. Mail Server Properties

The interfaces and classes for Java mail support in the Spring framework are organized as follows:

  1. MailSender interface: The top-level interface that provides basic functionality for sending simple emails
  2. JavaMailSender interface: the subinterface of the above MailSender. It supports MIME messages and is mostly used in conjunction with the MimeMessageHelper class for the creation of a MimeMessage. It’s recommended to use the MimeMessagePreparator mechanism with this interface
  3. JavaMailSenderImpl class: provides an implementation of the JavaMailSender interface. It supports the MimeMessage and SimpleMailMessage
  4. SimpleMailMessage class: used to create a simple mail message including the from, to, cc, subject and text fields
  5. MimeMessagePreparator interface: provides a callback interface for the preparation of MIME messages
  6. MimeMessageHelper class: helper class for the creation of MIME messages. It offers support for images, typical mail attachments and text content in an HTML layout

In the following sections we show how these interfaces and classes are used.

3.1. Spring Mail Server Properties

Mail properties that are needed to specify, e.g., the SMTP server may be defined using JavaMailSenderImpl.

For example, for Gmail this can be configured as shown below:

@Bean
public JavaMailSender getJavaMailSender() {
    JavaMailSenderImpl mailSender = new JavaMailSenderImpl();
    mailSender.setHost("smtp.gmail.com");
    mailSender.setPort(587);
    
    mailSender.setUsername("my.gmail@gmail.com");
    mailSender.setPassword("password");
    
    Properties props = mailSender.getJavaMailProperties();
    props.put("mail.transport.protocol", "smtp");
    props.put("mail.smtp.auth", "true");
    props.put("mail.smtp.starttls.enable", "true");
    props.put("mail.debug", "true");
    
    return mailSender;
}

3.2. Spring Boot Mail Server Properties

Once the dependency is in place, the next step is to specify the mail server properties in the application.properties file using the spring.mail.* namespace.

For example, the properties for Gmail SMTP Server can be specified as:

spring.mail.host=smtp.gmail.com
spring.mail.port=587
spring.mail.username=<login user to smtp server>
spring.mail.password=<login password to smtp server>
spring.mail.properties.mail.smtp.auth=true
spring.mail.properties.mail.smtp.starttls.enable=true

Some SMTP servers require a TLS connection, so the property spring.mail.properties.mail.smtp.starttls.enable is used to enable a TLS-protected connection.

3.2.1. Gmail SMTP Properties

We can send an email via the Gmail SMTP server. Have a look at the documentation to see the Gmail outgoing mail SMTP server properties.

Our application.properties file is already configured to use Gmail SMTP (see the previous section).

Note that the password for your account should not be an ordinary password, but an application password generated for your Google account. Follow this link to see the details and to generate your Google App Password.

3.2.2. SES SMTP Properties

To send emails using Amazon SES Service, set your application.properties as we do below:

spring.mail.host=email-smtp.us-west-2.amazonaws.com
spring.mail.username=username
spring.mail.password=password
spring.mail.properties.mail.transport.protocol=smtp
spring.mail.properties.mail.smtp.port=25
spring.mail.properties.mail.smtp.auth=true
spring.mail.properties.mail.smtp.starttls.enable=true
spring.mail.properties.mail.smtp.starttls.required=true

Please be aware that Amazon requires you to verify your credentials before using them. Follow the link to verify your username and password.

4. Sending Email

Once dependency management and configuration are in place, we can use the aforementioned JavaMailSender to send an email.

Since both the plain vanilla Spring framework as well as the Boot version of it handle the composing and sending of e-mails in a similar way, we won’t have to distinguish between the two in the subsections below.

4.1. Sending Simple Emails

Let’s first compose and send a simple email message without any attachments:

@Component
public class EmailServiceImpl implements EmailService {
 
    @Autowired
    public JavaMailSender emailSender;

    public void sendSimpleMessage(
      String to, String subject, String text) {
        ...
        SimpleMailMessage message = new SimpleMailMessage(); 
        message.setTo(to); 
        message.setSubject(subject); 
        message.setText(text);
        emailSender.send(message);
        ...
    }
}

4.2. Sending Emails with Attachments

Sometimes Spring’s simple messaging is not enough for our use cases.

For example, we may want to send an order confirmation email with an invoice attached. In this case, we should use a MIME multipart message from the JavaMail library instead of SimpleMailMessage. Spring supports JavaMail messaging with the org.springframework.mail.javamail.MimeMessageHelper class.

First of all, we’ll add a method to the EmailServiceImpl to send emails with attachments:

@Override
public void sendMessageWithAttachment(
  String to, String subject, String text, String pathToAttachment) {
    // ...
    
    MimeMessage message = emailSender.createMimeMessage();
     
    MimeMessageHelper helper = new MimeMessageHelper(message, true);
    
    helper.setTo(to);
    helper.setSubject(subject);
    helper.setText(text);
        
    FileSystemResource file 
      = new FileSystemResource(new File(pathToAttachment));
    helper.addAttachment("Invoice", file);

    emailSender.send(message);
    // ...
}

4.3. Simple Email Template

The SimpleMailMessage class supports text formatting. We can create a template for emails by defining a template bean in our configuration:

@Bean
public SimpleMailMessage templateSimpleMessage() {
    SimpleMailMessage message = new SimpleMailMessage();
    message.setText(
      "This is the test email template for your email:\n%s\n");
    return message;
}

Now we can use this bean as a template for email and only need to provide necessary parameters to the template:

@Autowired
public SimpleMailMessage template;
...
String text = String.format(template.getText(), templateArgs);  
sendSimpleMessage(to, subject, text);
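Under the hood, this relies only on String.format substituting the argument into the %s placeholder of the template text. A quick standalone illustration (class name ours):

```java
public class TemplateDemo {
    public static void main(String[] args) {
        // The template text with a %s placeholder, as defined in the bean above
        String template = "This is the test email template for your email:\n%s\n";

        // String.format substitutes the argument into the placeholder
        String text = String.format(template, "Your order has shipped.");
        System.out.print(text);
    }
}
```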

5. Handling Send Errors

JavaMail provides SendFailedException to handle situations when a message cannot be sent. But it is possible that you won’t get this exception while sending an email to an incorrect address. The reason is the following:

The SMTP protocol spec in RFC 821 specifies the 550 return code that an SMTP server should return when attempting to send an email to an incorrect address. But most public SMTP servers don’t do this. Instead, they send a “delivery failed” email to your box, or give no feedback at all.

For example, the Gmail SMTP server sends a “delivery failed” message, and you get no exception in your program.

So, there are a few options you can go through to handle this case:

  1. Catch the SendFailedException, which can never be thrown
  2. Check your sender mailbox for a “delivery failed” message for some period of time. This is not straightforward, and the time period is not determined
  3. If your mail server gives no feedback at all, you can do nothing

6. Conclusion

In this quick article, we showed how to set up and send emails from both a plain vanilla Spring application and a Spring Boot application.

The implementation of all these examples and code snippets can be found in the GitHub project; this is a Maven-based project, so it should be easy to import and run as it is.


Messaging With Spring AMQP


1. Overview

In this article, we will explore messaging-based communication over the AMQP protocol using the Spring AMQP framework. First, we’ll cover some of the key concepts of messaging, and then we’ll move on to practical examples in section 5.

1.1. Maven Dependencies

To use spring-amqp and spring-rabbit in your project, just add these dependencies:

<dependencies>
    <dependency>
        <groupId>org.springframework.amqp</groupId>
        <artifactId>spring-amqp</artifactId>
        <version>1.6.6.RELEASE</version>
    </dependency>
    <dependency>
        <groupId>org.springframework.amqp</groupId>
        <artifactId>spring-rabbit</artifactId>
        <version>1.6.6.RELEASE</version>
    </dependency>
</dependencies>

You will find the newest versions in the Maven repository.

2. Messaging Based Communication

Messaging is a technique for inter-application communication that relies on asynchronous message-passing instead of a synchronous request/response-based architecture. Producers and consumers of messages are decoupled by an intermediate messaging layer known as a message broker. A message broker provides features like persistent storage of messages, message filtering, message transformation, etc.

In the case of intercommunication between applications written in Java, the JMS (Java Message Service) API is commonly used for sending and receiving messages. For interoperability between different vendors and platforms, however, we won’t be able to rely on JMS clients and brokers. This is where AMQP comes in handy.

3. AMQP – Advanced Message Queuing Protocol

AMQP is an open standard wire specification for asynchronous messaging-based communication. It provides a description of how a message should be constructed; every byte of transmitted data is specified.

3.1. How AMQP is Different From JMS

Since AMQP is a platform-neutral binary protocol standard, libraries can be written in different programming languages and run on different operating systems and CPU architectures.

There is no vendor-based protocol lock-in, as there is when migrating from one JMS broker to another. For more details refer to JMS vs AMQP and Understanding AMQP. Some of the widely used AMQP brokers are RabbitMQ, OpenAMQ, and StormMQ.

3.2. AMQP Entities

AMQP entities comprise Exchanges, Queues, and Bindings:

  • Exchanges are like post offices or mailboxes and clients always publish a message to an AMQP exchange
  • Queues bind to an exchange using a binding key. A binding is a “link” that you set up to bind a queue to an exchange
  • Messages are sent to the message broker/exchange with a routing key. The exchange then distributes copies of the message to queues. Exchanges provide the abstraction needed to achieve different message routing topologies like fanout, hierarchical routing, etc.

3.3. Exchange Types

Exchanges are the AMQP entities to which messages are sent. Exchanges take a message and route it into zero or more queues. There are four built-in exchange types:

  • Direct Exchange
  • Fanout Exchange
  • Topic Exchange
  • Headers Exchange

For more details, have a look at AMQP Concepts and Routing Topologies.
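To build some intuition for how a topic exchange matches routing keys against binding patterns like “foo.*”, here is a simplified plain-Java sketch of the matching rules (* matches exactly one dot-separated word, # matches one or more). This is an illustration only, not RabbitMQ's actual implementation, and it ignores the zero-word edge case for #:

```java
import java.util.regex.Pattern;

public class TopicMatch {

    // Translate an AMQP binding pattern into a regex:
    // '*' matches exactly one word, '#' matches one or more words
    // (dots are escaped first so the substitutions stay literal)
    static boolean matches(String bindingPattern, String routingKey) {
        String regex = bindingPattern
          .replace(".", "\\.")
          .replace("*", "[^.]+")
          .replace("#", ".+");
        return Pattern.matches(regex, routingKey);
    }

    public static void main(String[] args) {
        System.out.println(matches("foo.*", "foo.bar"));     // true
        System.out.println(matches("foo.*", "foo.bar.baz")); // false: '*' is one word
        System.out.println(matches("foo.#", "foo.bar.baz")); // true: '#' spans words
    }
}
```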

4. Spring AMQP

Spring AMQP comprises two modules: spring-amqp and spring-rabbit, each represented by a jar in the distribution.

spring-amqp is the base abstraction, and in it you will find the classes that represent the core AMQP model: Exchange, Queue and Binding.

We will be covering spring-rabbit specific features below in the article.

4.1. spring-amqp Features

  • Template-based abstraction – the AmqpTemplate interface defines all basic operations for sending/receiving messages, with RabbitTemplate as the implementation
  • Publisher Confirmations/Consumer Acknowledgements support
  • AmqpAdmin tool that allows performing basic operations

4.2. spring-rabbit Features

The AMQP model and the related entities we discussed above are generic and applicable to all implementations, but there are some features which are specific to the implementation. A couple of those spring-rabbit features are explained below.

Connection Management Support – the org.springframework.amqp.rabbit.connection.ConnectionFactory interface is the central component for managing connections to the RabbitMQ broker. The responsibility of the CachingConnectionFactory implementation of the ConnectionFactory interface is to provide instances of the org.springframework.amqp.rabbit.connection.Connection interface. Please note that spring-rabbit provides the Connection interface as a wrapper over the RabbitMQ client’s com.rabbitmq.client.Connection interface.

Asynchronous Message Consumption – For asynchronous message consumption, two key concepts are a callback and a container. The callback is where your application code is integrated with the messaging system.

One of the ways to code a callback is to provide an implementation of the MessageListener interface:

public interface MessageListener {
    void onMessage(Message message);
}

In order to have a clear separation between application code and the messaging API, Spring AMQP also provides the MessageListenerAdapter. Here is an example of configuring one for a POJO listener:

MessageListenerAdapter listener = new MessageListenerAdapter(somePojo);
listener.setDefaultListenerMethod("myMethod");

Now that we’ve seen the various options for the message-listening callback, we can turn our attention to the container. Basically, the container handles the “active” responsibilities so that the listener callback can remain passive. The container is an example of a lifecycle component; it provides methods for starting and stopping.

When configuring the container, you are essentially bridging the gap between an AMQP Queue and the MessageListener instance. By default, the listener container will start a single consumer which will receive messages from the queues.

5. Send and Receive Messages Using Spring AMQP

Here are the steps to send and receive a message via Spring AMQP:

  1. Setup and Start RabbitMQ broker – RabbitMQ installation and setup is straightforward, just follow the steps mentioned here
  2. Setup Java Project – Create a Maven-based Java project with the dependencies mentioned above
  3. Implement Message Producer – We can use RabbitTemplate to send a “Hello, world!” message:
    AbstractApplicationContext ctx
      = new ClassPathXmlApplicationContext("beans.xml");
    AmqpTemplate template = ctx.getBean(RabbitTemplate.class);
    template.convertAndSend("Hello, world!");
    
  4. Implement Message Consumer – As discussed earlier, we can implement message consumer as a POJO:
    public class Consumer {
        public void listen(String foo) {
            System.out.println(foo);
        }
    }
  5. Wiring Dependencies – We will be using the following Spring bean configuration for setting up queues, exchange, and other entities. Most of the entries are self-explanatory. The queue named “myQueue” is bound to the topic exchange “myExchange” using “foo.*” as the binding key. RabbitTemplate has been set up to send messages to the “myExchange” exchange with “foo.bar” as the default routing key. The listener container ensures asynchronous delivery of messages from the “myQueue” queue to the listen() method of the Consumer class:
    <rabbit:connection-factory id="connectionFactory"
      host="localhost" username="guest" password="guest" />
    
    <rabbit:template id="amqpTemplate" connection-factory="connectionFactory"
        exchange="myExchange" routing-key="foo.bar" />
    
    <rabbit:admin connection-factory="connectionFactory" />
    
    <rabbit:queue name="myQueue" />
    
    <rabbit:topic-exchange name="myExchange">
        <rabbit:bindings>
            <rabbit:binding queue="myQueue" pattern="foo.*" />
        </rabbit:bindings>
    </rabbit:topic-exchange>
    
    <rabbit:listener-container connection-factory="connectionFactory">
        <rabbit:listener ref="consumer" method="listen" queue-names="myQueue" />
    </rabbit:listener-container>
    
    <bean id="consumer" class="com.baeldung.springamqp.consumer.Consumer" />

    Note: By default, in order to connect to local RabbitMQ instance, use username “guest” and password “guest”.

  6. Run the Application: 
  • The first step is to make sure RabbitMQ is running, default port being 5672
  • Run the application by running Producer.java, executing the main() method
  • Producer sends the message to the “myExchange” with “foo.bar” as the routing key
  • As per the binding key of “myQueue”, it receives the message
  • The Consumer class, which consumes “myQueue” messages with its listen() method as the callback, receives the message and prints it on the console

6. Conclusion

In this article, we have covered messaging based architecture over AMQP protocol using Spring AMQP for communication between applications.

The complete source code and all code snippets for this article are available on the GitHub project.

Guide to Guava Multimap


1. Overview

In this article, we will look at one of the Map implementations from the Google Guava library – Multimap. It is a collection that maps keys to values, similar to java.util.Map, but in which each key may be associated with multiple values.

2. Maven Dependency

First, let’s add a dependency:

<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>21.0</version>
</dependency>

The latest version can be found here.

3. Multimap Implementation

In the case of Guava Multimap, if we add two values for the same key, the second value will not override the first value. Instead, we will have two values in the resulting map. Let’s look at a test case:

String key = "a-key";
Multimap<String, String> map = ArrayListMultimap.create();

map.put(key, "firstValue");
map.put(key, "secondValue");

assertEquals(2, map.size());

Printing the map‘s content will output:

{a-key=[firstValue, secondValue]}

When we get the values for the key “a-key”, we get a Collection<String> that contains “firstValue” and “secondValue” as a result:

Collection<String> values = map.get(key);

Printing values will output:

[firstValue, secondValue]

4. Compared to the Standard Map

The standard Map from the java.util package doesn’t give us the ability to assign multiple values to the same key. Let’s consider a simple case where we put() two values into a Map using the same key:

String key = "a-key";
Map<String, String> map = new LinkedHashMap<>();

map.put(key, "firstValue");
map.put(key, "secondValue");

assertEquals(1, map.size());

The resulting map has only one entry (“secondValue”), because the second put() operation overrides the first value. Should we want to achieve the same behavior as with Guava’s Multimap, we would need to create a Map that has a List<String> as its value type:

String key = "a-key";
Map<String, List<String>> map = new LinkedHashMap<>();

List<String> values = map.get(key);
if (values == null) {
    values = new LinkedList<>();
    values.add("firstValue");
    values.add("secondValue");
}

map.put(key, values);

assertEquals(1, map.size());

Obviously, it is not very convenient to use. And if we have such a need in our code, then Guava’s Multimap could be a better choice than java.util.Map.
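As a middle ground, Java 8's Map.computeIfAbsent() makes the manual Map<String, List<String>> approach considerably less painful, although Multimap still handles this bookkeeping for us. A brief sketch (class name ours):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class MultimapEmulation {
    public static void main(String[] args) {
        String key = "a-key";
        Map<String, List<String>> map = new LinkedHashMap<>();

        // computeIfAbsent creates the list on first use,
        // so each put collapses into a one-liner
        map.computeIfAbsent(key, k -> new ArrayList<>()).add("firstValue");
        map.computeIfAbsent(key, k -> new ArrayList<>()).add("secondValue");

        System.out.println(map); // {a-key=[firstValue, secondValue]}
    }
}
```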

One thing to notice here is that, although we have a list with two elements in it, the size() method returns 1. In Multimap, size() returns the actual number of values stored, while keySet().size() returns the number of distinct keys.

5. Pros of Multimap

Multimaps are commonly used in places where a Map<K, Collection<V>> would otherwise have appeared. The differences include:

  • There is no need to populate an empty collection before adding an entry with put()
  • The get() method never returns null, only an empty collection (we do not need to check against null like in Map<String, Collection<V>> test case)
  • A key is contained in the Multimap if and only if it maps to at least one value. Any operation that causes a key to have zero associated values has the effect of removing that key from the Multimap (in a Map<String, Collection<V>>, even if we remove all values from the collection, we still keep an empty Collection as a value, which is unnecessary memory overhead)
  • The total entry values count is available as size()

6. Conclusion

This article shows how and when to use Guava’s Multimap. It compares it to the standard java.util.Map and shows the pros of Multimap.

All these examples and code snippets can be found in the GitHub project – this is a Maven project, so it should be easy to import and run as it is.

Exceptions in Java 8 Lambda Expressions


1. Overview

In Java 8, Lambda Expressions started to facilitate functional programming by providing a concise way to express behavior. However, the Functional Interfaces provided by the JDK don’t deal with exceptions very well – and the code becomes verbose and cumbersome when it comes to handling them.

In this article, we’ll explore some ways to deal with exceptions when writing lambda expressions.

2. Handling Unchecked Exceptions

First, let’s understand the problem with an example.

We have a List<Integer>, and we want to divide a constant, say 50, by every element of this list and print the results:

List<Integer> integers = Arrays.asList(3, 9, 7, 6, 10, 20);
integers.forEach(i -> System.out.println(50 / i));

This expression works, but there’s one problem. If any of the elements in the list is 0, then we get an ArithmeticException: / by zero. Let’s fix that with a traditional try-catch block, so that we log any such exception and continue execution for the next elements:

List<Integer> integers = Arrays.asList(3, 9, 7, 0, 10, 20);
integers.forEach(i -> {
    try {
        System.out.println(50 / i);
    } catch (ArithmeticException e) {
        System.err.println(
          "Arithmetic Exception occurred : " + e.getMessage());
    }
});

The use of try-catch solves the problem, but the conciseness of a Lambda Expression is lost and it’s no longer a small function as it’s supposed to be.

To deal with this problem, we can write a lambda wrapper for the lambda function. Let’s look at the code to see how it works:

static Consumer<Integer> lambdaWrapper(Consumer<Integer> consumer) {
    return i -> {
        try {
            consumer.accept(i);
        } catch (ArithmeticException e) {
            System.err.println(
              "Arithmetic Exception occurred : " + e.getMessage());
        }
    };
}
List<Integer> integers = Arrays.asList(3, 9, 7, 0, 10, 20);
integers.forEach(lambdaWrapper(i -> System.out.println(50 / i)));

At first, we wrote a wrapper method that will be responsible for handling the exception and then passed the lambda expression as a parameter to this method.

The wrapper method works as expected, but you may argue that it merely moves the try-catch block out of the lambda expression into another method, and that it doesn’t reduce the actual number of lines of code being written.

This is true in this case where the wrapper is specific to a particular use case but we can make use of generics to improve this method and use it for a variety of other scenarios:

static <T, E extends Exception> Consumer<T>
  consumerWrapper(Consumer<T> consumer, Class<E> clazz) {
 
    return i -> {
        try {
            consumer.accept(i);
        } catch (Exception ex) {
            try {
                E exCast = clazz.cast(ex);
                System.err.println(
                  "Exception occurred : " + exCast.getMessage());
            } catch (ClassCastException ccEx) {
                throw ex;
            }
        }
    };
}
List<Integer> integers = Arrays.asList(3, 9, 7, 0, 10, 20);
integers.forEach(
  consumerWrapper(
    i -> System.out.println(50 / i), 
    ArithmeticException.class));

As we can see, this iteration of our wrapper method takes two arguments: the lambda expression and the type of Exception to be caught. This lambda wrapper is capable of handling all data types, not just Integers, and catches any specific type of exception rather than the superclass Exception.

Also, notice that we have changed the name of the method from lambdaWrapper to consumerWrapper. It’s because this method only handles lambda expressions for Functional Interface of type Consumer. We can write similar wrapper methods for other Functional Interfaces like Function, BiFunction, BiConsumer and so on.
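For illustration, here is what such a sketch could look like for Function (functionWrapper and the fallback parameter are our own additions, not part of the JDK); instead of just logging, it returns a fallback value when the expected exception occurs:

```java
import java.util.function.Function;

public class FunctionWrapperExample {

    // Wraps a Function; if the expected exception type is thrown,
    // logs it and returns the given fallback value instead
    static <T, R, E extends Exception> Function<T, R> functionWrapper(
      Function<T, R> function, Class<E> clazz, R fallback) {

        return t -> {
            try {
                return function.apply(t);
            } catch (Exception ex) {
                if (clazz.isInstance(ex)) {
                    System.err.println("Exception occurred : " + ex.getMessage());
                    return fallback;
                }
                throw ex; // precise rethrow: only unchecked exceptions can reach here
            }
        };
    }

    public static void main(String[] args) {
        Function<Integer, Integer> safeDivide =
          functionWrapper(i -> 50 / i, ArithmeticException.class, -1);

        System.out.println(safeDivide.apply(10)); // 5
        System.out.println(safeDivide.apply(0));  // -1, after logging the exception
    }
}
```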

3. Handling Checked Exceptions

Let’s consider the example from the previous section, but instead of dividing and printing the integers to the console, we want to write them to a file. This operation of writing to a file throws IOException.

static void writeToFile(Integer integer) throws IOException {
    // logic to write to file which throws IOException
}
List<Integer> integers = Arrays.asList(3, 9, 7, 0, 10, 20);
integers.forEach(i -> writeToFile(i));

On compilation, we get the following error:

java.lang.Error: Unresolved compilation problem: Unhandled exception type IOException

Since IOException is a checked exception, it must be handled. There are two options: we can throw the exception and handle it somewhere else, or we can handle it inside the method that contains the lambda expression. Let’s look at each of them one by one.

3.1. Throwing Checked Exception from Lambda Expressions

Let’s throw the exception from the method in which the lambda expression is written, in this case, the main:

public static void main(String[] args) throws IOException {
    List<Integer> integers = Arrays.asList(3, 9, 7, 0, 10, 20);
    integers.forEach(i -> writeToFile(i));
}

Still, while compiling, we get the same error of unhandled IOException. This is because lambda expressions are similar to Anonymous Inner Classes. In this case, the lambda expression is an implementation of accept(T t) method from Consumer<T> interface.

Throwing the exception from main doesn’t help: since the accept method in the parent interface doesn’t declare any exception, its implementation can’t throw one either:

Consumer<Integer> consumer = new Consumer<Integer>() {
 
    @Override
    public void accept(Integer integer) throws Exception {
        writeToFile(integer);
    }
};

The above code doesn’t compile because the implementation of accept method can’t throw any Exception.

The most straightforward way would be to use a try-catch and wrap the checked exception into an unchecked exception and rethrow:

List<Integer> integers = Arrays.asList(3, 9, 7, 0, 10, 20);
integers.forEach(i -> {
    try {
        writeToFile(i);
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
});

This approach gets the code to compile and run but has the same problem as the example in the case of unchecked exceptions in the previous section.

Since we just want to throw the exception, we need to write our own Consumer Functional Interface that can throw an exception, and then a wrapper method using it. Let’s call it ThrowingConsumer:

@FunctionalInterface
public interface ThrowingConsumer<T, E extends Exception> {
    void accept(T t) throws E;
}
static <T> Consumer<T> throwingConsumerWrapper(
  ThrowingConsumer<T, Exception> throwingConsumer) {
 
    return i -> {
        try {
            throwingConsumer.accept(i);
        } catch (Exception ex) {
            throw new RuntimeException(ex);
        }
    };
}

Now we can write a lambda expression that throws exceptions without losing the conciseness:

List<Integer> integers = Arrays.asList(3, 9, 7, 0, 10, 20);
integers.forEach(throwingConsumerWrapper(i -> writeToFile(i)));

3.2. Handling a Checked Exception in Lambda Expression

In this final section, we’ll modify the wrapper to handle checked exceptions. Since our ThrowingConsumer interface uses generics, we can handle any specific exception:

static <T, E extends Exception> Consumer<T> handlingConsumerWrapper(
  ThrowingConsumer<T, E> throwingConsumer, Class<E> exceptionClass) {
 
    return i -> {
        try {
            throwingConsumer.accept(i);
        } catch (Exception ex) {
            try {
                E exCast = exceptionClass.cast(ex);
                System.err.println(
                  "Exception occurred : " + exCast.getMessage());
            } catch (ClassCastException ccEx) {
                throw new RuntimeException(ex);
            }
        }
    };
}

We can use this wrapper in our example to handle only the IOException, and throw any other checked exception by wrapping it in an unchecked exception:

List<Integer> integers = Arrays.asList(3, 9, 7, 0, 10, 20);
integers.forEach(handlingConsumerWrapper(
  i -> writeToFile(i), IOException.class));

As in the case of unchecked exceptions, throwing siblings for other Functional Interfaces, like ThrowingFunction, ThrowingBiFunction, ThrowingBiConsumer and so on, can be written along with their corresponding wrapper methods.
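As a sketch of one such sibling (ThrowingFunction and throwingFunctionWrapper follow the article's naming convention, but are our own illustration, not part of the JDK):

```java
import java.util.function.Function;

public class ThrowingFunctionExample {

    // Function variant that's allowed to throw a checked exception
    @FunctionalInterface
    public interface ThrowingFunction<T, R, E extends Exception> {
        R apply(T t) throws E;
    }

    // Adapts a ThrowingFunction into a plain Function,
    // rethrowing any checked exception as a RuntimeException
    static <T, R> Function<T, R> throwingFunctionWrapper(
      ThrowingFunction<T, R, Exception> throwingFunction) {

        return t -> {
            try {
                return throwingFunction.apply(t);
            } catch (Exception ex) {
                throw new RuntimeException(ex);
            }
        };
    }

    public static void main(String[] args) {
        // String.getBytes(String) declares the checked UnsupportedEncodingException
        Function<String, byte[]> toUtf8 =
          throwingFunctionWrapper(s -> s.getBytes("UTF-8"));

        System.out.println(toUtf8.apply("hi").length); // 2
    }
}
```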

4. Conclusion

In this article, we covered how to handle a specific exception in lambda expressions without losing the conciseness by use of wrapper methods. We also learned how to write throwing alternatives for the Functional Interfaces present in JDK to either throw a checked exception by wrapping them in an unchecked exception or to handle them.

The complete source code of the Functional Interfaces and wrapper methods can be downloaded from here and the test classes from here, over on GitHub.

If you are looking for the out-of-the-box working solutions, Javaslang and ThrowingFunction are worth checking out.

JAX-RS is just an API!


1. Overview

The REST paradigm has been around for quite a few years now and it’s still getting a lot of attention.

A RESTful API can be implemented in Java in a number of ways: you can use Spring, JAX-RS, or you might just write your own bare servlets if you’re good and brave enough. All you need is the ability to expose HTTP methods – the rest is all about how you organize them and how you guide the client when making calls to your API.

As you can make out from the title, this article will cover JAX-RS. But what does “just an API” mean? It means that the focus here is on clarifying the confusion between JAX-RS and its implementations and on offering an example of what a proper JAX-RS webapp looks like.

2. Inclusion in Java EE

JAX-RS is nothing more than a specification, a set of interfaces and annotations offered by Java EE. And then of course we have the implementations; some of the more well known are RESTEasy and Jersey.

Also, if you ever decide to build a JEE-compliant application server, the guys from Oracle will tell you that, among many other things, your server should provide a JAX-RS implementation for the deployed apps to use. That’s why it’s called Java Enterprise Edition Platform.

Another good example of specification and implementation is JPA and Hibernate.

2.1. Lightweight Wars

So how does all this help us, the developers? The benefit is that our deployables can and should be very thin, letting the application server provide the needed libraries. This applies to developing a RESTful API as well: the final artifact should not contain any information about the JAX-RS implementation used.

Sure, we can provide the implementation (here‘s a tutorial for RESTeasy). But then we cannot call our application “Java EE app” anymore. If tomorrow someone comes and says “Ok, time to switch to Glassfish or Payara, JBoss became too expensive!“, we might be able to do it, but it won’t be an easy job.

If we provide our own implementation, we have to make sure the server knows to exclude its own – this usually happens by having a proprietary XML file inside the deployable. Needless to say, such a file ends up containing all sorts of tags and instructions that nobody knows anything about, except the developers who left the company three years ago.

2.2. Always Know your Server

We said so far that we should take advantage of the platform that we’re offered.

Before deciding on a server to use, we should see what JAX-RS implementation (name, vendor, version and known bugs) it provides, at least for Production environments. For instance, Glassfish comes with Jersey, while Wildfly or Jboss come with RESTEasy.

This, of course, means a little time spent on research, but it’s supposed to be done only once, at the beginning of the project or when migrating it to another server.

3. An Example

If you want to start playing with JAX-RS, the shortest path is: have a Maven webapp project with the following dependency in the pom.xml:

<dependency>
    <groupId>javax</groupId>
    <artifactId>javaee-api</artifactId>
    <version>7.0</version>
    <scope>provided</scope>
</dependency>

We’re using Java EE 7 since there are already plenty of application servers implementing it. That API jar contains the annotations that you need to use, located in package javax.ws.rs. Why is the scope “provided”? Because this jar doesn’t need to be in the final build either – we need it at compile time, and the server provides it at run time.

After the dependency is added, we first have to write the entry class: an empty class which extends javax.ws.rs.core.Application and is annotated with javax.ws.rs.ApplicationPath:

@ApplicationPath("/api")
public class RestApplication extends Application {
}

We defined the entry path as being /api. Whatever other paths we declare for our resources, they will be prefixed with /api.

Next, let’s see a resource:

@Path("/notifications")
public class NotificationsResource {
    @GET
    @Path("/ping")
    public Response ping() {
        return Response.ok().entity("Service online").build();
    }

    @GET
    @Path("/get/{id}")
    @Produces(MediaType.APPLICATION_JSON)
    public Response getNotification(@PathParam("id") int id) {
        return Response.ok()
          .entity(new Notification(id, "john", "test notification"))
          .build();
    }

    @POST
    @Path("/post/")
    @Consumes(MediaType.APPLICATION_JSON)
    @Produces(MediaType.APPLICATION_JSON)
    public Response postNotification(Notification notification) {
        return Response.status(201).entity(notification).build();
    }
}

We have a simple ping endpoint to call and check if our app is running, a GET and a POST for a Notification (this is just a POJO with attributes plus getters and setters).
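For completeness, the Notification POJO might look like this (a minimal sketch; the field names match the constructor call above and the JSON used in the curl examples below):

```java
public class Notification {

    private int id;
    private String username;
    private String text;

    // no-arg constructor needed for JSON unmarshalling
    public Notification() {
    }

    public Notification(int id, String username, String text) {
        this.id = id;
        this.username = username;
        this.text = text;
    }

    public int getId() { return id; }
    public void setId(int id) { this.id = id; }

    public String getUsername() { return username; }
    public void setUsername(String username) { this.username = username; }

    public String getText() { return text; }
    public void setText(String text) { this.text = text; }
}
```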

Deploy this war on any application server implementing JEE7 and the following commands will work:

curl http://localhost:8080/simple-jaxrs-ex/api/notifications/ping/

curl http://localhost:8080/simple-jaxrs-ex/api/notifications/get/1

curl -X POST -d '{"id":23,"text":"lorem ipsum","username":"johana"}' \
  http://localhost:8080/simple-jaxrs-ex/api/notifications/post/ \
  --header "Content-Type:application/json"

Where simple-jaxrs-ex is the context-root of the webapp.

This was tested with Glassfish 4.1.0 and Wildfly 9.0.1.Final. Please note that the last two commands won’t work with Glassfish 4.1.1, because of this bug. It is apparently a known issue in this Glassfish version, regarding the serialization of JSON (if you have to use this server version, you’ll have to manage JSON marshaling on your own).

4. Conclusion

At the end of this article, just keep in mind that JAX-RS is a powerful API and most (if not all) of the stuff that you need is already implemented by your web server. No need to turn your deployable into an unmanageable pile of libraries.

This writeup presents a simple example and things might get more complicated. For instance, you might want to write your own marshalers. When that’s needed, look for tutorials that solve your problem with JAX-RS, not with Jersey, Resteasy or other concrete implementation. It’s very likely that your problem can be solved with one or two annotations.

Guide to Spring Retry


1. Overview

Spring Retry provides an ability to automatically re-invoke a failed operation. This is helpful where the errors may be transient in nature (like a momentary network glitch). Spring Retry provides declarative control of the process and policy-based behavior that is easy to extend and customize.

In this article, we’ll see how to use Spring Retry to implement retry logic in Spring applications. We’ll also configure listeners to receive additional callbacks.

2. Maven Dependencies

Let’s begin by adding the dependency into our pom.xml:

<dependency>
    <groupId>org.springframework.retry</groupId>
    <artifactId>spring-retry</artifactId>
    <version>1.1.5.RELEASE</version>
</dependency>

We can check the latest version of spring-retry in Maven Central.

3. Enabling Spring Retry

To enable Spring Retry in an application, we need to add the @EnableRetry annotation to our @Configuration class:

@Configuration
@EnableRetry
public class AppConfig { ... }

4. Retry with Annotations

We can make a method call to be retried on failure by using annotations.

4.1. @Retryable

To add retry functionality to methods, @Retryable can be used:

@Service
public interface MyService {
    @Retryable(
      value = { SQLException.class }, 
      maxAttempts = 2,
      backoff = @Backoff(delay = 5000))
    void retryService(String sql) throws SQLException;
    ...
}

Here, the retry behavior is customized using the attributes of @Retryable. In this example, retry will be attempted only if the method throws an SQLException. The method will be attempted at most 2 times (maxAttempts counts the initial call), with a delay of 5000 milliseconds between attempts.

If @Retryable is used without any attributes and the method fails with an exception, the method will be attempted up to three times, with a delay of one second between attempts.

4.2. @Recover

The @Recover annotation is used to define a separate recovery method when a @Retryable method fails with a specified exception:

@Service
public interface MyService {
    ...
    @Recover
    void recover(SQLException e, String sql);
}

So if the retryService() method throws an SQLException, the recover() method will be called. A suitable recovery handler has its first parameter of type Throwable (optional); the remaining arguments are populated from the argument list of the failed method, in the same order, and the handler must have the same return type as the failed method.

5. RetryTemplate

5.1. RetryOperations

Spring Retry provides the RetryOperations interface, which supplies a set of execute() methods:

public interface RetryOperations {
    <T, E extends Throwable> T execute(RetryCallback<T, E> retryCallback)
      throws E;

    ...
}

The RetryCallback, which is a parameter of execute(), is an interface that allows insertion of the business logic that needs to be retried upon failure:

public interface RetryCallback<T, E extends Throwable> {
    T doWithRetry(RetryContext context) throws E;
}

5.2. RetryTemplate Configuration 

The RetryTemplate is an implementation of the RetryOperations. Let’s configure a RetryTemplate bean in our @Configuration class:

@Configuration
public class AppConfig {
    //...
    @Bean
    public RetryTemplate retryTemplate() {
        RetryTemplate retryTemplate = new RetryTemplate();
		
        FixedBackOffPolicy fixedBackOffPolicy = new FixedBackOffPolicy();
        fixedBackOffPolicy.setBackOffPeriod(2000L);
        retryTemplate.setBackOffPolicy(fixedBackOffPolicy);

        SimpleRetryPolicy retryPolicy = new SimpleRetryPolicy();
        retryPolicy.setMaxAttempts(2);
        retryTemplate.setRetryPolicy(retryPolicy);
		
        return retryTemplate;
    }
}

RetryPolicy determines when an operation should be retried. A SimpleRetryPolicy is used to retry a fixed number of times.

BackOffPolicy is used to control back off between retry attempts. A FixedBackOffPolicy pauses for a fixed period of time before continuing.
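To make the two policies concrete, here is a hand-rolled, non-Spring sketch (our own illustration) of what "retry up to maxAttempts times with a fixed back-off" boils down to; RetryTemplate implements this loop, and much more, for us:

```java
import java.util.concurrent.Callable;

public class FixedRetrySketch {

    // Attempts the callable up to maxAttempts times,
    // sleeping backOffMillis between attempts (FixedBackOffPolicy equivalent)
    static <T> T executeWithRetry(Callable<T> callable, int maxAttempts, long backOffMillis)
      throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return callable.call();
            } catch (Exception ex) {
                last = ex;
                if (attempt < maxAttempts) {
                    Thread.sleep(backOffMillis);
                }
            }
        }
        throw last; // all attempts exhausted
    }

    public static void main(String[] args) throws Exception {
        int[] calls = {0};
        // fails once, then succeeds on the second attempt
        String result = executeWithRetry(() -> {
            if (calls[0]++ == 0) {
                throw new RuntimeException("transient failure");
            }
            return "ok";
        }, 2, 10);
        System.out.println(result + " after " + calls[0] + " calls"); // ok after 2 calls
    }
}
```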

5.3. Using the RetryTemplate

To run code with retry handling we call the retryTemplate.execute():

retryTemplate.execute(new RetryCallback<Void, RuntimeException>() {
    @Override
    public Void doWithRetry(RetryContext arg0) {
        myService.templateRetryService();
        return null;
    }
});

The same could be achieved using a lambda expression instead of an anonymous class:

retryTemplate.execute(arg0 -> {
    myService.templateRetryService();
    return null;
});

6. XML Configuration

Spring Retry can be configured by XML using the Spring AOP namespace.

6.1. Adding XML File

In the classpath, let’s add retryadvice.xml:

...
<beans>
    <aop:config>
        <aop:pointcut id="transactional"
          expression="execution(* MyService.xmlRetryService(..))" />
        <aop:advisor pointcut-ref="transactional"
          advice-ref="taskRetryAdvice" order="-1" />
    </aop:config>

    <bean id="taskRetryAdvice"
      class="org.springframework.retry.interceptor.
        RetryOperationsInterceptor">
        <property name="retryOperations" ref="taskRetryTemplate" />
    </bean>

    <bean id="taskRetryTemplate"
      class="org.springframework.retry.support.RetryTemplate">
        <property name="retryPolicy" ref="taskRetryPolicy" />
        <property name="backOffPolicy" ref="exponentialBackOffPolicy" />
    </bean>

    <bean id="taskRetryPolicy"
        class="org.springframework.retry.policy.SimpleRetryPolicy">
        <constructor-arg index="0" value="5" />
        <constructor-arg index="1">
            <map>
                <entry key="java.lang.RuntimeException" value="true" />
            </map>
        </constructor-arg>
    </bean>

    <bean id="exponentialBackOffPolicy"
      class="org.springframework.retry.backoff.ExponentialBackOffPolicy">
        <property name="initialInterval" value="300" />
        <property name="maxInterval" value="30000" />
        <property name="multiplier" value="2.0" />
    </bean>
</beans>
...

This example uses a custom RetryTemplate inside the interceptor of xmlRetryService method.

6.2. Using XML Configuration

Import retryadvice.xml from the classpath and enable @AspectJ support:

@Configuration
@EnableRetry
@EnableAspectJAutoProxy
@ImportResource("classpath:/retryadvice.xml")
public class AppConfig { ... }

7. Listeners

Listeners provide additional callbacks upon retries. They can be used for various cross-cutting concerns across different retries.

7.1. Adding Callbacks

The callbacks are declared in the RetryListener interface; extending RetryListenerSupport lets us override only the ones we need:

public class DefaultListenerSupport extends RetryListenerSupport {
    @Override
    public <T, E extends Throwable> void close(RetryContext context,
      RetryCallback<T, E> callback, Throwable throwable) {
        logger.info("onClose");
        ...
        super.close(context, callback, throwable);
    }

    @Override
    public <T, E extends Throwable> void onError(RetryContext context,
      RetryCallback<T, E> callback, Throwable throwable) {
        logger.info("onError"); 
        ...
        super.onError(context, callback, throwable);
    }

    @Override
    public <T, E extends Throwable> boolean open(RetryContext context,
      RetryCallback<T, E> callback) {
        logger.info("onOpen");
        ...
        return super.open(context, callback);
    }
}

The open and close callbacks come before and after the entire retry, and onError applies to the individual RetryCallback calls.

7.2. Registering the Listener

Next, we register our listener (DefaultListenerSupport) to our RetryTemplate bean:

@Configuration
public class AppConfig {
    ...

    @Bean
    public RetryTemplate retryTemplate() {
        RetryTemplate retryTemplate = new RetryTemplate();
        ...
        retryTemplate.registerListener(new DefaultListenerSupport());
        return retryTemplate;
    }
}

8. Testing the Results

Let’s verify the results:

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(
  classes = AppConfig.class,
  loader = AnnotationConfigContextLoader.class)
public class SpringRetryTest {

    @Autowired
    private MyService myService;

    @Autowired
    private RetryTemplate retryTemplate;

    @Test(expected = RuntimeException.class)
    public void givenTemplateRetryService_whenCallWithException_thenRetry() {
        retryTemplate.execute(arg0 -> {
            myService.templateRetryService();
            return null;
        });
    }
}

When we run the test case, the following log output confirms that we have successfully configured both the RetryTemplate and the Listener:

2017-01-09 20:04:10 [main] INFO  o.b.s.DefaultListenerSupport - onOpen 
2017-01-09 20:04:10 [main] INFO  o.baeldung.springretry.MyServiceImpl
- throw RuntimeException in method templateRetryService() 
2017-01-09 20:04:10 [main] INFO  o.b.s.DefaultListenerSupport - onError 
2017-01-09 20:04:12 [main] INFO  o.baeldung.springretry.MyServiceImpl
- throw RuntimeException in method templateRetryService() 
2017-01-09 20:04:12 [main] INFO  o.b.s.DefaultListenerSupport - onError 
2017-01-09 20:04:12 [main] INFO  o.b.s.DefaultListenerSupport - onClose

9. Conclusion

In this article, we have introduced Spring Retry. We have seen examples of retry using annotations and RetryTemplate. We have then configured additional callbacks using listeners.

You can find the source code for this article over on GitHub.
