
Custom Information in Spring Boot Info Endpoint


1. Overview

In this quick article, we'll have a look at how to customize the Spring Boot Actuator's /info endpoint.

Please refer to this article to learn more about actuators in Spring Boot and how to configure them.

2. Static Properties in /info

If we have static information, such as the name of the application or its version, that does not change for a long time, then it's a good idea to add those details to our application.properties file:

## Configuring info endpoint
info.app.name=Spring Sample Application
info.app.description=This is my first spring boot application
info.app.version=1.0.0

That’s all we need to do to make this data available on the /info endpoint. Spring will automatically add all the properties prefixed with info to the /info endpoint:

{
  "app": {
    "description": "This is my first spring boot application",
    "version": "1.0.0",
    "name": "Spring Sample Application"
  }
}

3. Environment Variables in /info

Let’s now expose an Environment variable in our /info endpoint:

info.java-vendor = ${java.specification.vendor}

This will expose the Java vendor to our /info endpoint:

{
  "app": {
    "description": "This is my first spring boot application",
    "version": "1.0.0",
    "name": "Spring Sample Application"
  },
  "java-vendor": "Oracle Corporation"
}

Note that all the environment variables are already available on the /env endpoint; however, the same values can be exposed quickly on the /info endpoint as well.
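Along the same lines, any placeholder the Environment can resolve may be surfaced this way; for example (these particular property names are our own illustration):

```properties
## Hypothetical additions to application.properties
info.java-version = ${java.specification.version}
info.os-name = ${os.name}
```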

4. Custom Data From the Persistence Layer

Now let’s go one step further and expose some useful data from the persistence storage.

To achieve this, we need to implement InfoContributor interface and override the contribute() method:

@Component
public class TotalUsersInfoContributor implements InfoContributor {

    @Autowired
    UserRepository userRepository;

    @Override
    public void contribute(Info.Builder builder) {
        Map<String, Integer> userDetails = new HashMap<>();
        userDetails.put("active", userRepository.countByStatus(1));
        userDetails.put("inactive", userRepository.countByStatus(0));

        builder.withDetail("users", userDetails);
    }
}

First, we need to mark the implementing class with @Component. Then we add the required details to the Info.Builder instance provided to the contribute() method.

This approach gives us a lot of flexibility regarding what we expose through our /info endpoint:

{
  ...other /info data...,
  ...
  "users": {
    "inactive": 2,
    "active": 3
  }
}

5. Conclusion

In this tutorial, we looked at various ways to add custom data to our /info endpoint.

Note that we’re also discussing how to add git information into the /info endpoint.

As always, the complete source code of this article can be found over on GitHub.


Custom Scope in Spring


1. Overview

Out of the box, Spring provides two standard bean scopes (“singleton” and “prototype”) that can be used in any Spring application, plus three additional bean scopes (“request”, “session”, and “globalSession”) for use only in web-aware applications.

The standard bean scopes cannot be overridden, and it’s generally considered bad practice to override the web-aware scopes. However, you may have an application requiring different or additional capabilities from those found in the provided scopes.

For example, if you are developing a multi-tenant system, you may want to provide a separate instance of a particular bean or set of beans for each tenant. Spring provides a mechanism for creating custom scopes for scenarios such as this.

In this quick tutorial, we will demonstrate how to create, register, and use a custom scope in a Spring application.

2. Creating a Custom Scope Class

In order to create a custom scope, we must implement the Scope interface. In doing so, we must also ensure that the implementation is thread-safe because scopes can be used by multiple bean factories at the same time.

2.1. Managing the Scoped Objects and Callbacks

One of the first things to consider when implementing a custom Scope class is how you will store and manage the scoped objects and destruction callbacks. This could be done using a map or a dedicated class, for example.

For this article, we’ll do this in a thread-safe manner using synchronized maps.

Let’s begin to define our custom scope class:

public class TenantScope implements Scope {
    private Map<String, Object> scopedObjects
      = Collections.synchronizedMap(new HashMap<String, Object>());
    private Map<String, Runnable> destructionCallbacks
      = Collections.synchronizedMap(new HashMap<String, Runnable>());
...
}

2.2. Retrieving an Object from Scope

To retrieve an object by name from our scope, let's implement the get method. As the JavaDoc states, if the named object does not exist in the scope, this method must create and return a new object.

In our implementation, we check to see if the named object is in our map. If it is, we return it, and if not, we use the ObjectFactory to create a new object, add it to our map, and return it:

@Override
public Object get(String name, ObjectFactory<?> objectFactory) {
    // computeIfAbsent makes the check-then-create step atomic
    // on the synchronized map, avoiding duplicate bean creation
    return scopedObjects.computeIfAbsent(name, key -> objectFactory.getObject());
}

Of the five methods defined by the Scope interface, only the get method is required to have a full implementation of the described behavior. The other four methods are optional and may throw UnsupportedOperationException if they don't need to or can't support the functionality.

2.3. Registering a Destruction Callback

We must also implement the registerDestructionCallback method. This method provides a callback that is to be executed when the named object is destroyed or if the scope itself is destroyed by the application:

@Override
public void registerDestructionCallback(String name, Runnable callback) {
    destructionCallbacks.put(name, callback);
}

2.4. Removing an Object from Scope

Next, let’s implement the remove method, which removes the named object from the scope and also removes its registered destruction callback, returning the removed object:

@Override
public Object remove(String name) {
    destructionCallbacks.remove(name);
    return scopedObjects.remove(name);
}

Note that it is the caller’s responsibility to actually execute the callback and destroy the removed object.
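To make that caller responsibility concrete, here's a plain-Java sketch (no Spring types; class and method names are our own illustration) of the same two-map bookkeeping, showing that remove only forgets the bean and its callback, while destruction stays with the caller:

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.atomic.AtomicBoolean;

public class ScopeRemovalSketch {

    private final Map<String, Object> scopedObjects =
      Collections.synchronizedMap(new HashMap<>());
    private final Map<String, Runnable> destructionCallbacks =
      Collections.synchronizedMap(new HashMap<>());

    public void register(String name, Object bean, Runnable onDestroy) {
        scopedObjects.put(name, bean);
        destructionCallbacks.put(name, onDestroy);
    }

    // mirrors Scope#remove: drop both entries and hand the bean back;
    // the scope itself never runs the destruction callback
    public Object remove(String name) {
        destructionCallbacks.remove(name);
        return scopedObjects.remove(name);
    }

    public static void main(String[] args) {
        ScopeRemovalSketch scope = new ScopeRemovalSketch();
        AtomicBoolean destroyed = new AtomicBoolean(false);
        Object bean = new Object();

        scope.register("foo", bean, () -> destroyed.set(true));
        Object removed = scope.remove("foo");

        // the scope forgot the bean, but nothing has been destroyed yet
        System.out.println(removed == bean);  // true
        System.out.println(destroyed.get());  // false
    }
}
```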

2.5. Getting the Conversation ID

Now, let’s implement the getConversationId method. If your scope supports the concept of a conversation ID, you would return it here. Otherwise, the convention is to return null:

@Override
public String getConversationId() {
    return "tenant";
}

2.6. Resolving Contextual Objects

Finally, let’s implement the resolveContextualObject method. If your scope supports multiple contextual objects, you would associate each with a key value, and you would return the object corresponding to the provided key parameter. Otherwise, the convention is to return null:

@Override
public Object resolveContextualObject(String key) {
    return null;
}

3. Registering the Custom Scope

To make the Spring container aware of your new scope, you need to register it through the registerScope method on a ConfigurableBeanFactory instance. Let’s take a look at this method’s definition:

void registerScope(String scopeName, Scope scope);

The first parameter, scopeName, is used to identify/specify a scope by its unique name. The second parameter, scope, is an actual instance of the custom Scope implementation that you wish to register and use.

Let’s create a custom BeanFactoryPostProcessor and register our custom scope using a ConfigurableListableBeanFactory:

public class TenantBeanFactoryPostProcessor implements BeanFactoryPostProcessor {

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory factory) throws BeansException {
        factory.registerScope("tenant", new TenantScope());
    }
}

Now, let’s write a Spring configuration class that loads our BeanFactoryPostProcessor implementation:

@Configuration
public class TenantScopeConfig {

    @Bean
    public static BeanFactoryPostProcessor beanFactoryPostProcessor() {
        return new TenantBeanFactoryPostProcessor();
    }
}

4. Using the Custom Scope

Now that we have registered our custom scope, we can apply it to any of our beans just as we would with any other bean that uses a scope other than singleton (the default scope) — by using the @Scope annotation and specifying our custom scope by name.

Let’s create a simple TenantBean class — we’ll declare tenant-scoped beans of this type in a moment:

public class TenantBean {
    
    private final String name;
    
    public TenantBean(String name) {
        this.name = name;
    }

    public void sayHello() {
        System.out.println(
          String.format("Hello from %s of type %s",
          this.name, 
          this.getClass().getName()));
    }
}

Note that we did not use the class-level @Component and @Scope annotations on this class.

Now, let’s define some tenant-scoped beans in a configuration class:

@Configuration
public class TenantBeansConfig {

    @Scope(scopeName = "tenant")
    @Bean
    public TenantBean foo() {
        return new TenantBean("foo");
    }
    
    @Scope(scopeName = "tenant")
    @Bean
    public TenantBean bar() {
        return new TenantBean("bar");
    }
}

5. Testing the Custom Scope

Let’s write a test to exercise our custom scope configuration by loading up an ApplicationContext, registering our Configuration classes, and retrieving our tenant-scoped beans:

@Test
public final void whenRegisterScopeAndBeans_thenContextContainsFooAndBar() {
    AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext();
    try {
        ctx.register(TenantScopeConfig.class);
        ctx.register(TenantBeansConfig.class);
        ctx.refresh();
        
        TenantBean foo = ctx.getBean("foo", TenantBean.class);
        foo.sayHello();
        TenantBean bar = ctx.getBean("bar", TenantBean.class);
        bar.sayHello();
        Map<String, TenantBean> foos = ctx.getBeansOfType(TenantBean.class);
        
        assertThat(foo, not(equalTo(bar)));
        assertThat(foos.size(), equalTo(2));
        assertTrue(foos.containsValue(foo));
        assertTrue(foos.containsValue(bar));

        BeanDefinition fooDefinition = ctx.getBeanDefinition("foo");
        BeanDefinition barDefinition = ctx.getBeanDefinition("bar");
        
        assertThat(fooDefinition.getScope(), equalTo("tenant"));
        assertThat(barDefinition.getScope(), equalTo("tenant"));
    }
    finally {
        ctx.close();
    }
}

And the output from our test is:

Hello from foo of type org.baeldung.customscope.TenantBean
Hello from bar of type org.baeldung.customscope.TenantBean

6. Conclusion

In this quick tutorial, we showed how to define, register, and use a custom scope in Spring.

You can read more about custom scopes in the Spring Framework Reference. You can also take a look at Spring’s implementations of various Scope classes in the Spring Framework repository on GitHub.

As usual, you can find the code samples used in this article over on the GitHub project.

Guide to Reactive Microservices Using Lagom Framework


1. Overview

In this article, we’ll explore the Lagom framework and implement an example application using a reactive microservices driven architecture.

Simply put, reactive software applications rely on message-driven asynchronous communication and are highly Responsive, Resilient and Elastic in nature.

By microservice-driven architecture, we mean splitting the system along boundaries between collaborating services in order to achieve goals such as Isolation, Autonomy, Single Responsibility, and Mobility. For further reading on these two concepts, refer to The Reactive Manifesto and Reactive Microservices Architecture.

2. Why Lagom?

Lagom is an open-source framework built with the shift from monolithic to microservice-driven application architectures in mind. It abstracts away the complexity of building, running, and monitoring microservice-driven applications.

Behind the scenes, the Lagom framework uses the Play Framework, an Akka message-driven runtime, Kafka for decoupling services, the Event Sourcing and CQRS patterns, and ConductR support for monitoring and scaling microservices in a container environment.

3. Hello World in Lagom

We’ll be creating a Lagom application to handle a greeting request from the user and reply back with a greeting message along with weather statistics for the day.

And we’ll be developing two separate microservices: Greeting and Weather.

Greeting will focus on handling greeting requests, interacting with the Weather service to reply back to the user. The Weather microservice will serve requests for today's weather statistics.

When an existing user interacts with the Greeting microservice, a different greeting message will be shown.

3.1. Prerequisites

  1. Install Scala (we're currently using version 2.11.8) from here
  2. Install the sbt build tool (we're currently using version 0.13.11) from here

4. Project Setup

Let’s now have a quick look at the steps to set up a working Lagom system.

4.1. SBT Build

Create a project folder lagom-hello-world followed by the build file build.sbt. A Lagom system is typically made up of a set of sbt builds with each build corresponding to a group of related services:

organization in ThisBuild := "org.baeldung"

scalaVersion in ThisBuild := "2.11.8"

lagomKafkaEnabled in ThisBuild := false

lazy val greetingApi = project("greeting-api")
  .settings(
    version := "1.0-SNAPSHOT",
    libraryDependencies ++= Seq(
      lagomJavadslApi
    )
  )

lazy val greetingImpl = project("greeting-impl")
  .enablePlugins(LagomJava)
  .settings(
    version := "1.0-SNAPSHOT",
    libraryDependencies ++= Seq(
      lagomJavadslPersistenceCassandra
    )
  )
  .dependsOn(greetingApi, weatherApi)

lazy val weatherApi = project("weather-api")
  .settings(
    version := "1.0-SNAPSHOT",
    libraryDependencies ++= Seq(
      lagomJavadslApi
    )
  )

lazy val weatherImpl = project("weather-impl")
  .enablePlugins(LagomJava)
  .settings(
    version := "1.0-SNAPSHOT"
  )
  .dependsOn(weatherApi)

def project(id: String) = Project(id, base = file(id))

To start with, we've specified the organization details and Scala version, and disabled Kafka for the current project. Lagom follows a convention of two separate projects for each microservice: an API project and an implementation project.

The API project contains the service interface on which the implementation depends.

We’ve added dependencies to the relevant Lagom modules like lagomJavadslApi, lagomJavadslPersistenceCassandra for using the Lagom Java API in our microservices and storing events related to the persistent entity in Cassandra, respectively.

Also, the greeting-impl project depends on the weather-api project to fetch and serve weather stats while greeting a user.

Support for the Lagom plugin is added by creating a plugins.sbt file inside the project folder, with an entry for the Lagom plugin. It provides all the necessary support for building, running, and deploying our application.

Also, the sbteclipse plugin will be handy if we use Eclipse IDE for this project. The code below shows the contents for both plugins:

addSbtPlugin("com.lightbend.lagom" % "lagom-sbt-plugin" % "1.3.1")
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "3.0.0")

Create project/build.properties file and specify sbt version to use:

sbt.version=0.13.11

4.2. Project Generation

Running sbt command from the project root will generate the following project templates:

  1. greeting-api
  2. greeting-impl
  3. weather-api
  4. weather-impl

Before we start implementing the microservices, let's add the src/main/java and src/main/resources folders inside each of the projects, to follow a Maven-like project directory layout.

Also, two dynamic projects are generated inside project-root/target/lagom-dynamic-projects:

  1. lagom-internal-meta-project-cassandra
  2. lagom-internal-meta-project-service-locator

These projects are used internally by Lagom.

5. Service Interface

In the greeting-api project, we specify the following interface:

public interface GreetingService extends Service {

    public ServiceCall<NotUsed, String> handleGreetFrom(String user);

    @Override
    default Descriptor descriptor() {
        return named("greetingservice")
          .withCalls(restCall(Method.GET, "/api/greeting/:fromUser",
            this::handleGreetFrom))
          .withAutoAcl(true);
    }
}

GreetingService exposes handleGreetFrom() to handle greeting requests from the user. A ServiceCall is used as the return type of this method. ServiceCall takes two type parameters, Request and Response.

The Request parameter is the type of the incoming request message, and the Response parameter is the type of the outgoing response message.

In the example above, we're not using a request payload, so the Request type is NotUsed, and the Response type is a String greeting message.

GreetingService also specifies a mapping to the actual transport used during the invocation, by providing a default implementation of the Service.descriptor() method. A service named greetingservice is returned.

The handleGreetFrom() service call is mapped using a REST identifier: the GET method and the path identifier /api/greeting/:fromUser are mapped to the handleGreetFrom() method. Check out this link for more details on service identifiers.

Along the same lines, we define the WeatherService interface in the weather-api project. The weatherStatsForToday() and descriptor() methods are pretty much self-explanatory:

public interface WeatherService extends Service {
    
    public ServiceCall<NotUsed, WeatherStats> weatherStatsForToday();

    @Override
    default Descriptor descriptor() {
        return named("weatherservice")
          .withCalls(
            restCall(Method.GET, "/api/weather",
              this::weatherStatsForToday))
          .withAutoAcl(true);
    }
}

WeatherStats is defined as an enum with sample values for different weather conditions and a random lookup to return the weather forecast for the day:

public enum WeatherStats {

    STATS_RAINY("Going to Rain, Take Umbrella"), 
    STATS_HUMID("Going to be very humid, Take Water");

    private static final List<WeatherStats> VALUES = Arrays.asList(values());
    private static final Random RANDOM = new Random();

    private final String message;

    WeatherStats(String message) { this.message = message; }

    public String getMessage() { return message; }

    public static WeatherStats forToday() {
        return VALUES.get(RANDOM.nextInt(VALUES.size()));
    }
}

6. Lagom Persistence – Event Sourcing

Simply put, in a system making use of Event Sourcing, we capture all changes as immutable domain events appended one after the other. The current state is derived by replaying and processing these events. This operation is essentially the foldLeft operation known from the functional programming paradigm.

Event sourcing helps to achieve high write performance by appending the events and avoiding updates and deletes of existing events.
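The foldLeft nature of state recovery can be sketched in plain Java (the class, event strings, and method names here are our own illustration, not Lagom API):

```java
import java.util.Arrays;
import java.util.List;

public class EventSourcingSketch {

    // each event yields a brand-new state; nothing is updated in place
    public static String applyEvent(String currentState, String event) {
        return "Hello Again ";
    }

    public static void main(String[] args) {
        // the event log: immutable events appended one after the other
        List<String> events = Arrays.asList("greet:Amit", "greet:Amit");

        // currentState = foldLeft(initialState, events, applyEvent)
        String state = "Hello ";
        for (String event : events) {
            state = applyEvent(state, event);
        }
        System.out.println(state); // Hello Again 
    }
}
```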

Let’s now look at our persistent entity in the greeting-impl project, GreetingEntity:

public class GreetingEntity extends 
  PersistentEntity<GreetingCommand, GreetingEvent, GreetingState> {

      @Override
      public Behavior initialBehavior(
        Optional<GreetingState> snapshotState) {
            BehaviorBuilder b 
              = newBehaviorBuilder(new GreetingState("Hello "));
        
            b.setCommandHandler(
              ReceivedGreetingCommand.class,
              (cmd, ctx) -> {
                  String fromUser = cmd.getFromUser();
                  String currentGreeting = state().getMessage();
                  return ctx.thenPersist(
                    new ReceivedGreetingEvent(fromUser),
                    evt -> ctx.reply(
                      currentGreeting + fromUser + "!"));
              });
        
            b.setEventHandler(
              ReceivedGreetingEvent.class,
              evt -> state().withMessage("Hello Again "));

            return b.build();
      }
}

Lagom provides the PersistentEntity<Command, Event, State> API for processing incoming commands of type Command via setCommandHandler() methods and persisting state changes as events of type Event. The domain object state is updated by applying the event to the current state using the setEventHandler() method. The initialBehavior() abstract method defines the Behavior of the entity.

In initialBehavior(), we build the initial GreetingState with the “Hello ” text. Then we define a ReceivedGreetingCommand command handler, which produces a ReceivedGreetingEvent that gets persisted in the event log.

GreetingState is recalculated to “Hello Again ” by the ReceivedGreetingEvent event handler method. As mentioned earlier, we're not invoking setters; instead, we create a new instance of the state from the current event being processed.

Lagom follows the convention of GreetingCommand and GreetingEvent interfaces for holding together all the supported commands and events:

public interface GreetingCommand extends Jsonable {

    @JsonDeserialize
    public class ReceivedGreetingCommand implements 
      GreetingCommand, 
      CompressedJsonable, 
      PersistentEntity.ReplyType<String> {

          private final String fromUser;

          @JsonCreator
          public ReceivedGreetingCommand(String fromUser) {
              this.fromUser = Preconditions.checkNotNull(
                fromUser, "fromUser");
          }

          public String getFromUser() {
              return fromUser;
          }
    }
}

public interface GreetingEvent extends Jsonable {

    class ReceivedGreetingEvent implements GreetingEvent {

        private final String fromUser;

        @JsonCreator
        public ReceivedGreetingEvent(String fromUser) {
            this.fromUser = fromUser;
        }

        public String getFromUser() {
            return fromUser;
        }
    }
}

7. Service Implementation

7.1. Greeting Service

public class GreetingServiceImpl implements GreetingService {

    private final PersistentEntityRegistry persistentEntityRegistry;
    private final WeatherService weatherService;

    @Inject
    public GreetingServiceImpl(
      PersistentEntityRegistry persistentEntityRegistry, 
      WeatherService weatherService) {
          this.persistentEntityRegistry = persistentEntityRegistry;
          this.weatherService = weatherService;
          persistentEntityRegistry.register(GreetingEntity.class);
      }

    @Override
    public ServiceCall<NotUsed, String> handleGreetFrom(String user) {
        return request -> {
            PersistentEntityRef<GreetingCommand> ref
              = persistentEntityRegistry.refFor(
                GreetingEntity.class, user);
            CompletableFuture<String> greetingResponse 
              = ref.ask(new ReceivedGreetingCommand(user))
                .toCompletableFuture();
            CompletableFuture<WeatherStats> todaysWeatherInfo
              = (CompletableFuture<WeatherStats>) weatherService
                .weatherStatsForToday().invoke();
            
            try {
                return CompletableFuture.completedFuture(
                  greetingResponse.get() + " Today's weather stats: "
                    + todaysWeatherInfo.get().getMessage());
            } catch (InterruptedException | ExecutionException e) {
                return CompletableFuture.completedFuture(
                  "Sorry Some Error at our end, working on it");
            }
        };
    }
}

Simply put, we inject the PersistentEntityRegistry and WeatherService dependencies using @Inject (provided by Guice framework), and we register the persistent GreetingEntity.

The handleGreetFrom() implementation sends a ReceivedGreetingCommand to the GreetingEntity for processing and returns the greeting string asynchronously, using the CompletableFuture implementation of the CompletionStage API.

Similarly, we make an async call to the Weather microservice to fetch today's weather stats.

Finally, we concatenate both outputs and return the final result to the user.
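As a side note, the two blocking get() calls above could also be replaced by composing the futures; here is a minimal sketch of that pattern using only plain CompletableFuture (the class and method names are our own):

```java
import java.util.concurrent.CompletableFuture;

public class CombineSketch {

    // combine both results once each future completes, without blocking
    public static CompletableFuture<String> greetWithWeather(
      CompletableFuture<String> greeting, CompletableFuture<String> weather) {
        return greeting.thenCombine(weather,
          (g, w) -> g + " Today's weather stats: " + w);
    }

    public static void main(String[] args) {
        String result = greetWithWeather(
          CompletableFuture.completedFuture("Hello Amit!"),
          CompletableFuture.completedFuture("Going to Rain, Take Umbrella"))
          .join();
        System.out.println(result);
        // Hello Amit! Today's weather stats: Going to Rain, Take Umbrella
    }
}
```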

To register an implementation of the service descriptor interface GreetingService with Lagom, let’s create GreetingServiceModule class which extends AbstractModule and implements ServiceGuiceSupport:

public class GreetingServiceModule extends AbstractModule 
  implements ServiceGuiceSupport {
 
      @Override
      protected void configure() {
          bindServices(
            serviceBinding(GreetingService.class, GreetingServiceImpl.class));
          bindClient(WeatherService.class);
    }
}

Also, Lagom internally uses the Play Framework, so we can add our module to Play's list of enabled modules in the src/main/resources/application.conf file:

play.modules.enabled
  += org.baeldung.lagom.helloworld.greeting.impl.GreetingServiceModule

7.2. Weather Service

After looking at GreetingServiceImpl, WeatherServiceImpl is pretty straightforward and self-explanatory:

public class WeatherServiceImpl implements WeatherService {
 
    @Override
    public ServiceCall<NotUsed, WeatherStats> weatherStatsForToday() {
        return req -> 
          CompletableFuture.completedFuture(WeatherStats.forToday());
    }
}

We follow the same steps as we did above for the greeting module to register the weather module with Lagom:

public class WeatherServiceModule 
  extends AbstractModule 
  implements ServiceGuiceSupport {
 
      @Override
      protected void configure() {
          bindServices(serviceBinding(
            WeatherService.class, 
            WeatherServiceImpl.class));
      }
}

Also, let's register the weather module in Play's list of enabled modules:

play.modules.enabled
  += org.baeldung.lagom.helloworld.weather.impl.WeatherServiceModule

8. Running the Project

Lagom allows running any number of services together with a single command.

We can start our project by running the following command:

sbt lagom:runAll

This will start the embedded Service Locator, the embedded Cassandra server, and then the microservices in parallel. The same command also reloads each microservice when its code changes, so we don't have to restart them manually.

We can stay focused on our logic while Lagom handles the compilation and reloading. Once started successfully, we'll see the following output:

................
[info] Cassandra server running at 127.0.0.1:4000
[info] Service locator is running at http://localhost:8000
[info] Service gateway is running at http://localhost:9000
[info] Service weather-impl listening for HTTP on 0:0:0:0:0:0:0:0:56231
[info] Service greeting-impl listening for HTTP on 0:0:0:0:0:0:0:0:49356
[info] (Services started, press enter to stop and go back to the console...)

Once everything is up, we can make a curl request for a greeting:

curl http://localhost:9000/api/greeting/Amit

We'll see the following output on the console:

Hello Amit! Today's weather stats: Going to Rain, Take Umbrella

Running the same curl request for an existing user will change the greeting message:

Hello Again Amit! Today's weather stats: Going to Rain, Take Umbrella

9. Conclusion

In this article, we've covered how to use the Lagom framework to create two microservices that interact asynchronously.

The complete source code and all code snippets for this article are available in the GitHub project.

Introduction to JaVers


1. Overview 

In this article, we will be looking at the JaVers library.

This library helps programmers examine and detect changes in the states of simple Java objects. When we use mutable objects in our code, every object can potentially be modified in various places in the application; JaVers would help us discover and audit these changes.

2. Maven Dependency

To get started, let's add the javers-core Maven dependency to our pom.xml:

<dependency>
    <groupId>org.javers</groupId>
    <artifactId>javers-core</artifactId>
    <version>3.1.0</version>
</dependency>

We can find the latest version over on Maven Central.

3. Detecting POJO State Changes

Let’s start with a simple Person class:

public class Person {
    private Integer id;
    private String name;

    // standard getters/constructors
}

Suppose that we created a Person object in one part of our application, and in some other part of the codebase, the name of the person with the same id field was changed. We want to compare them to find out what kind of changes happened to the person object.

We can compare those two objects using the compare() method of the Javers class:

@Test
public void givenPersonObject_whenApplyModificationOnIt_thenShouldDetectChange() {
    // given
    Javers javers = JaversBuilder.javers().build();

    Person person = new Person(1, "Michael Program");
    Person personAfterModification = new Person(1, "Michael Java");

    // when
    Diff diff = javers.compare(person, personAfterModification);

    // then
    ValueChange change = diff.getChangesByType(ValueChange.class).get(0);

    assertThat(diff.getChanges()).hasSize(1);
    assertThat(change.getPropertyName()).isEqualTo("name");
    assertThat(change.getLeft()).isEqualTo("Michael Program");
    assertThat(change.getRight()).isEqualTo("Michael Java");
}

4. Detecting State Change of List of Objects

If we’re working with collections of objects, we similarly need to examine state changes by looking at the each element in the collection. Sometimes, we want to add or remove the particular object from the list, altering its state.

Let’s have a look at an example; say we have a list of objects, and we remove one object from that list.

That change can be undesirable for some reason, and we want to audit the changes that occurred in this list. JaVers allows us to do that using the compareCollections() method:

@Test
public void givenListOfPersons_whenCompare_ThenShouldDetectChanges() {
    // given
    Javers javers = JaversBuilder.javers().build();
    Person personThatWillBeRemoved = new Person(2, "Thomas Link");
    List<Person> oldList = 
      Arrays.asList(new Person(1, "Michael Program"), personThatWillBeRemoved);
    List<Person> newList = 
      Arrays.asList(new Person(1, "Michael Not Program"));

    // when
    Diff diff = javers.compareCollections(oldList, newList, Person.class);

    // then
    assertThat(diff.getChanges()).hasSize(3);

    ValueChange valueChange = 
      diff.getChangesByType(ValueChange.class).get(0);
 
    assertThat(valueChange.getPropertyName()).isEqualTo("name");
    assertThat(valueChange.getLeft()).isEqualTo("Michael Program");
    assertThat(valueChange.getRight()).isEqualTo("Michael Not Program");

    ObjectRemoved objectRemoved = diff.getChangesByType(ObjectRemoved.class).get(0);
    assertThat(
      objectRemoved.getAffectedObject().get().equals(personThatWillBeRemoved))
      .isTrue();

    ListChange listChange = diff.getChangesByType(ListChange.class).get(0);
    assertThat(listChange.getValueRemovedChanges().size()).isEqualTo(1);
}

5. Comparing Object Graphs

In real-world applications, we often deal with object graphs. Let's say that we have a PersonWithAddress class that holds a list of Address objects, and we're adding a new address for a given person.

We can easily find the type of change that has occurred:

@Test
public void givenListOfPerson_whenPersonHasNewAddress_thenDetectThatChange() {
    // given
    Javers javers = JaversBuilder.javers().build();

    PersonWithAddress person = 
      new PersonWithAddress(1, "Tom", Arrays.asList(new Address("England")));

    PersonWithAddress personWithNewAddress = 
      new PersonWithAddress(1, "Tom", 
        Arrays.asList(new Address("England"), new Address("USA")));


    // when
    Diff diff = javers.compare(person, personWithNewAddress);
    List objectsByChangeType = diff.getObjectsByChangeType(NewObject.class);

    // then
    assertThat(objectsByChangeType).hasSize(1);
    assertThat(objectsByChangeType.get(0)).isEqualTo(new Address("USA"));
}

Similarly, removing an address will be detected:

@Test
public void givenListOfPerson_whenPersonRemovedAddress_thenDetectThatChange() {
    // given
    Javers javers = JaversBuilder.javers().build();

    PersonWithAddress person = 
      new PersonWithAddress(1, "Tom", Arrays.asList(new Address("England")));

    PersonWithAddress personWithNewAddress = 
      new PersonWithAddress(1, "Tom", Collections.emptyList());


    // when
    Diff diff = javers.compare(person, personWithNewAddress);
    List objectsByChangeType = diff.getObjectsByChangeType(ObjectRemoved.class);

    // then
    assertThat(objectsByChangeType).hasSize(1);
    assertThat(objectsByChangeType.get(0)).isEqualTo(new Address("England"));
}

6. Conclusion 

In this quick article, we used the JaVers library, a useful library that gives us APIs for detecting state changes in our objects. Not only can it detect a change in a simple POJO, but it can also detect more complex shifts in collections of objects, or even object graphs.

As always, the code is available over on GitHub.

Java Web Weekly, Issue 171


Here we go…

1. Spring and Java

>> The Java in 2017 Survey [docs.google.com]

I’m running my yearly Java “State of the Union” survey for 2017. Please take a few seconds to vote.

>> Memory Puzzle with Lambdas [javaspecialists.eu]

A very interesting Java 8 memory puzzle, with quite surprising results 🙂

>> Common code in Spring MVC, where to put it? [frankel.ch]

It’s not that easy to find a suitable place for common code in Spring MVC apps. This write-up shows a few possible places where you could do this.

>> Optional Dependencies in the Java Platform Module System [codefx.org]

Java 9’s Project Jigsaw finally makes it possible to declare module dependencies that are required at compile time but optional at runtime.

>> CRUD operations on Spring REST resources with Kotlin [codecentric.de]

This is how you can create and consume a simple REST API using Kotlin and Spring.

>> Use Spring Cloud Config as externalized configuration [pragmaticintegrator.wordpress.com]

Externalizing your configuration allows you to build artifacts once and easily swap configurations during runtime or for different environments. It turns out that Git works great as a configuration holder for such scenarios.

>> Test Doubles – Fakes, Mocks and Stubs. [pragmatists.pl]

Stubs are often mistaken for Mocks, and Fakes are often mistaken for Stubs 🙂

The article clarifies which is which.

>> Idiomatic Kotlin Best Practices [blog.philiphauer.de]

Kotlin is becoming more and more popular and it’s important to revisit our Java coding habits and learn the right way of doing stuff in Kotlin.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> Kafka Streams for Stream processing [balamaci.ro]

Quick and practical insights into how Kafka works.

>> The ultimate software QA process [mehdi-khalili.com]

Shocking no one, having a QA in your team is definitely a good idea.

Also worth reading:

3. Musings

>> How to start a peer group [mdswanson.com]

If you don’t know how to start a peer group, here is a simple list of steps to follow. You don’t need this often, but when you do, it’s super handy.

>> QA in Production [martinfowler.com]

Production is always a source of unexpected problems that can be a great feedback for improving your systems.

>> The Relationship between Static Analysis and Continuous Testing [daedtech.com]

Static Analysis and Continuous Testing are two different techniques applied to different areas. Static Analysis will be useful no matter how good your test coverage is, as it will point out potential problems before even running a single test.

Also worth reading:

4. Comics

And my favorite Dilberts of the week:

>> Ten thousand hours [dilbert.com]

>> I don’t know what this is, but I want in [dilbert.com]

>> This isn’t what I wanted [dilbert.com]

5. Pick of the Week

I’ve been following the work on this book for a few months now. It’s finally out:

>> Hibernate Tips (book)

If you’re doing any kind of Hibernate work, this is definitely one to get.

Cucumber and Scenario Outline


1. Introduction

Cucumber is a BDD (Behavioral Driven Development) testing framework.

Using the framework to write repetitive scenarios with different permutations of inputs and outputs can be quite time-consuming, difficult to maintain, and frustrating.

Cucumber provides a solution for reducing this effort: the Scenario Outline, coupled with Examples. In the sections below, we will take up an example and see how we can minimize this effort.

If you want to read more about the approach and Gherkin language, have a look at this article.

2. Adding Cucumber Support

To add support for Cucumber in a simple Maven project, we will need to add the following dependencies:

<dependency>
    <groupId>info.cukes</groupId>
    <artifactId>cucumber-junit</artifactId>
    <version>1.2.5</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>info.cukes</groupId>
    <artifactId>cucumber-java</artifactId>
    <version>1.2.5</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.hamcrest</groupId>
    <artifactId>hamcrest-library</artifactId>
    <version>1.3</version>
    <scope>test</scope>
</dependency>

Useful links to dependencies from Maven Central: cucumber-junit, cucumber-java, hamcrest-library

Since these are testing libraries, they don’t need to be shipped with the actual deployable – which is why they’re all test scoped.

3. A Simple Example

Let’s demonstrate both a bloated way and a concise way of writing feature files. First, let’s define the logic we want to write a test for:

public class Calculator {
    public int add(int a, int b) {
        return a + b;
    }
}

4. Defining Cucumber Tests

4.1. Defining a Feature File

Feature: Calculator
  As a user
  I want to use a calculator to add numbers
  So that I don't need to add myself

  Scenario: Add two numbers -2 & 3
    Given I have a calculator
    When I add -2 and 3
    Then the result should be 1
   
  Scenario: Add two numbers 10 & 15
    Given I have a calculator
    When I add 10 and 15
    Then the result should be 25

As seen here, two different combinations of numbers have been put to the test against the addition logic. Apart from the numbers, all the scenarios are exactly the same.

4.2. “Glue” Code

In order to test out these scenarios, it’s essential to define each step with corresponding code, to translate a statement into a functional piece of code:

public class CalculatorRunSteps {

    private int total;

    private Calculator calculator;

    @Before
    public void init() {
        total = -999;
    }

    @Given("^I have a calculator$")
    public void initializeCalculator() throws Throwable {
        calculator = new Calculator();
    }

    @When("^I add (-?\\d+) and (-?\\d+)$")
    public void testAdd(int num1, int num2) throws Throwable {
        total = calculator.add(num1, num2);
    }

    @Then("^the result should be (-?\\d+)$")
    public void validateResult(int result) throws Throwable {
        Assert.assertThat(total, Matchers.equalTo(result));
    }
}

4.3. A Runner Class

In order to integrate features and the glue code, we can use the JUnit runners:

@RunWith(Cucumber.class)
@CucumberOptions(
  features = { "classpath:features/calculator.feature" },
  glue = {"com.baeldung.cucumber.calculator" })
public class CalculatorTest {}

5. Rewriting Features Using Scenario Outlines

We saw in Section 4.1 how defining a feature file this way can be time-consuming and error-prone. The same feature file can be reduced to a mere few lines using a Scenario Outline:

Feature: Calculator
  As a user
  I want to use a calculator to add numbers
  So that I don't need to add myself

  Scenario Outline: Add two numbers <num1> & <num2>
    Given I have a calculator
    When I add <num1> and <num2>
    Then the result should be <total>

  Examples:
    | num1 | num2 | total |
    | -2   | 3    | 1     |
    | 10   | 15   | 25    |
    | 99   | -99  | 0     |
    | -1   | -10  | -11   |

Compared with a regular Scenario definition, in a Scenario Outline values no longer need to be hard-coded in the step definitions; they are replaced with parameters of the form <parameter_name> in the steps themselves.

At the end of the Scenario Outline, the values are defined in a pipe-delimited table under Examples.

A sample to define Examples is shown below:

Examples:
  | Parameter_Name1 | Parameter_Name2 |
  | Value-1 | Value-2 |
  | Value-X | Value-Y |

6. Conclusion

In this quick article, we’ve shown how scenarios can be made generic, reducing the effort of writing and maintaining them.

The complete source code of this article can be found over on GitHub.

Flattening Nested Collections in Java


1. Overview

In this quick article, we’ll explore how to flatten a nested collection in Java.

2. Example of a Nested Collection

Suppose we have a list of lists of type String.

List<List<String>> nestedList = asList(
  asList("one:one"), 
  asList("two:one", "two:two", "two:three"), 
  asList("three:one", "three:two", "three:three", "three:four"));

3. Flattening the List with forEach

In order to flatten this nested collection into a list of strings, we can use forEach together with a Java 8 method reference:

public <T> List<T> flattenListOfListsImperatively(
    List<List<T>> nestedList) {
    List<T> ls = new ArrayList<>();
    nestedList.forEach(ls::addAll);
    return ls;
}

And here you can see the method in action:

@Test
public void givenNestedList_thenFlattenImperatively() {
    List<String> ls = flattenListOfListsImperatively(nestedList);
    
    assertNotNull(ls);
    assertTrue(ls.size() == 8);
    assertThat(ls, IsIterableContainingInOrder.contains(
      "one:one",
      "two:one", "two:two", "two:three", "three:one",
      "three:two", "three:three", "three:four"));
}

4. Flattening the List with flatMap

We can also flatten the nested list by utilizing the flatMap method from the Stream API.

This allows us to flatten the nested Stream structure and eventually collect all elements to a particular collection:

public <T> List<T> flattenListOfListsStream(List<List<T>> list) {
    return list.stream()
      .flatMap(Collection::stream)
      .collect(Collectors.toList());    
}

And here’s the logic in action:

@Test
public void givenNestedList_thenFlattenFunctionally() {
    List<String> ls = flattenListOfListsStream(nestedList);
    
    assertNotNull(ls);
    assertTrue(ls.size() == 8);
}
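The same idea extends to deeper nesting: each additional level of List simply needs one more flatMap call. Here is a quick, self-contained sketch of our own (the flattenTwoLevels helper is an illustration, not part of the article’s codebase):

```java
import java.util.*;
import java.util.stream.*;

public class DeepFlatten {

    // One flatMap call per level of nesting: List<List<List<T>>> -> List<T>
    static <T> List<T> flattenTwoLevels(List<List<List<T>>> deep) {
        return deep.stream()
          .flatMap(Collection::stream) // Stream<List<List<T>>> -> Stream<List<T>>
          .flatMap(Collection::stream) // Stream<List<T>>       -> Stream<T>
          .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<List<List<String>>> deep = Arrays.asList(
          Arrays.asList(Arrays.asList("a"), Arrays.asList("b", "c")),
          Arrays.asList(Arrays.asList("d")));
        System.out.println(flattenTwoLevels(deep)); // prints [a, b, c, d]
    }
}
```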

5. Conclusion

The forEach and flatMap methods in Java 8, in combination with method references, can be used to flatten nested collections.

You can find the code discussed in this article over on GitHub.

Java in 2017 Survey Results


We've been running the "State of Java" survey for many years now - to get a good read on the state of the Java ecosystem. Last year, 2250 Java developers took the time to answer the questions, so it's fantastic to see that this year the number has almost doubled - we got 4439 answers.

So, before we get into the numbers - I wanted to say "thanks" to everyone who participated.

Let's jump right in and start with the Java adoption.


1. Java Adoption

The 2016 numbers had Java 7 adoption at 29.5% and Java 8 at 64.3%.

The numbers today - April 2017 (exactly one year later) - look quite different:

As you can see, Java 8 adoption has reached a solid 75% of the developer community.

This is quite encouraging to see and it also means that we're very much ready for Java 9 to finally be here.

Let's have a look at the Spring and Spring Boot numbers next.

2. Spring Adoption

The 2016 numbers had Spring 4 adoption at 81% and Spring 3 at 18%.

Let's have a look at the 2017 numbers now:

Spring 4 has inched up from 81% to 85% and Spring 3 has gone down from 18% to about 12% over the course of a year.

What's also quite interesting is that more than 2% of developers are using the Spring 5 milestones - which is a lot higher than the 1% using Java 9 milestones.

Finally, note that these numbers represent the developers that are using Spring. Overall, 25.5% of the developers answered they're not using the framework. 

3. Spring Boot Adoption

Boot is seeing some incredible adoption in the Spring ecosystem - that much is clear. Last year, the adoption numbers were at 53% - which is very high considering just how new the project really is.

Well, this year, growth is still going strong: ​

We can see that, summed up - the adoption number for Boot jumped from 53% to 70% - which is huge year over year growth. ​

4. IDE Market Share

Time to look at the market share of IDEs in 2017:

The trend was pretty clear last year as well - Eclipse is bleeding users to IntelliJ. 

Last year, Eclipse was at a respectable 48% and it's now sitting at 40.5% - a severe, near double-digit drop in a single year.

5. JVM Languages

This year, we asked a new question in the survey - "Are you using other JVM based languages?".

Here are the super interesting results:

Groovy is clearly leading the pack with a strong 40%, Scala's following suit with over 28.5% and Kotlin is number 3 - with a surprising 11.5%.

Note that this data reflects the "yes" answers - developers who are using other JVM languages. Overall, 57% of developers are using only Java.

6. Build Tools Market Share

On to build tools. Last year, we had Maven sitting at 72.5% and Gradle at 19%.

Well, this year's numbers are quite close - surprisingly, Maven is slowly getting even more traction and is now at 76%, while Gradle is just slightly down, at 18%.

The build tools market seems to be a lot more stable than the rest of the Java ecosystem, where things are changing a lot more and a lot quicker.

7. Running your own blog?

We added this question in the survey out of pure curiosity. Here are the results:

Hopefully more and more developers are going to start writing and putting their work out there.

8. Conclusion

The 2017 numbers are quite interesting, and somewhat surprising in some respects.

Java 8 adoption has hit 75%, only a few months away from the GA of Java 9.

The Spring community has fully adopted Spring 4 - over 85% - and Spring Boot is up to 70% as well - which means that most Spring developers are actively using the new framework as well.

On the IDE side of things, IntelliJ is clawing market share from Eclipse with the same effectiveness as last year, and with no signs of slowing down any time soon.

The build landscape is much more quiet, with Maven continuing to be the dominant player and actually gaining ground, despite no major releases this last year.

And finally, JVM languages are getting a lot of traction as well - given that almost half of the developers who answered the survey are actively using a second language.

This is going to be an exciting year in the Java community.


Check If a Number Is Prime in Java


1. Introduction

First, let’s go over some basic theory.

Simply put, a number is prime if it’s only divisible by one and by the number itself. The non-prime numbers are called composite numbers. And number one is neither prime nor composite.

In this article, we’ll have a look at different ways to check the primality of a number in Java.

2. A Custom Implementation

With this approach, we check whether any number between 2 and the square root of the number divides the number evenly.

The following logic will return true if the number is prime:

public boolean isPrime(int number) {
    // 0 and 1 are not prime; checking divisors up to sqrt(number) is sufficient
    return number > 1 
      && IntStream.rangeClosed(2, (int) Math.sqrt(number))
      .noneMatch(n -> (number % n == 0));
}
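As a quick standalone sanity check of the trial-division approach, here is a throwaway class of our own exercising the logic (note the lower bound of 1, so that 2 is correctly reported as prime):

```java
import java.util.stream.IntStream;

public class PrimeCheckDemo {

    // Trial division: test all candidate divisors up to sqrt(number)
    static boolean isPrime(int number) {
        return number > 1
          && IntStream.rangeClosed(2, (int) Math.sqrt(number))
            .noneMatch(n -> number % n == 0);
    }

    public static void main(String[] args) {
        System.out.println(isPrime(2));   // true  (the smallest prime)
        System.out.println(isPrime(1));   // false (neither prime nor composite)
        System.out.println(isPrime(15));  // false (3 * 5)
        System.out.println(isPrime(97));  // true
    }
}
```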

3. Using BigInteger

The BigInteger class is generally used for storing large integers, i.e., those larger than 64 bits. It provides a few useful APIs on top of what int and long offer.

One of those APIs is isProbablePrime. It returns false if the number is definitely composite and true if there is some probability of it being prime. It is useful when dealing with large integers, because fully verifying the primality of these can be quite an intensive computation.

A quick side-note – the isProbablePrime API uses what are known as the “Miller-Rabin” and “Lucas-Lehmer” primality tests to check if the number is probably prime. In cases where the number is less than 100 bits, only the “Miller-Rabin” test is used; otherwise, both tests are used to check the primality of the number.

The “Miller-Rabin” test iterates a fixed number of times to determine the primality of the number, and this iteration count is determined by a simple check involving the bit length of the number and the certainty value passed to the API:

public boolean isPrime(int number) {
    BigInteger bigInt = BigInteger.valueOf(number);
    return bigInt.isProbablePrime(100);
}

4. Using Apache Commons Math 

The Apache Commons Math API provides a utility class named org.apache.commons.math3.primes.Primes, which we will use for checking the primality of a number.

First, we need to import the Apache Commons Math library by adding the following dependency in our pom.xml:

<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-math3</artifactId>
    <version>3.6.1</version>
</dependency>

The latest version of the commons-math3 can be found here.

We could do the check just by calling the method:

Primes.isPrime(number);

5. Conclusion

In this quick write-up, we have seen three ways of checking the primality of a number.

The code for this can be found in the package com.baeldung.primechecker over on GitHub.

JasperReports with Spring


1. Overview

JasperReports is an open source reporting library that enables users to create pixel-perfect reports that can be printed or exported in many formats including PDF, HTML, and XLS.

In this article, we’ll explore its key features and classes, and implement examples to showcase its capabilities.

2. Maven Dependency

First, we need to add the jasperreports dependency to our pom.xml:

<dependency>
    <groupId>net.sf.jasperreports</groupId>
    <artifactId>jasperreports</artifactId>
    <version>6.4.0</version>
</dependency>

The latest version of this artifact can be found here.

3. Report Templates

Report designs are defined in JRXML files. These are ordinary XML files with a particular structure that JasperReports engine can interpret.

Let’s now have a look at only the relevant structure of the JRXML files – to understand better the Java part of the report generation process, which is our primary focus.

Let’s create a simple report to show employee information:

<jasperReport ... >
    <field name="FIRST_NAME" class="java.lang.String"/>
    <field name="LAST_NAME" class="java.lang.String"/>
    <field name="SALARY" class="java.lang.Double"/>
    <field name="ID" class="java.lang.Integer"/>
    <detail>
        <band height="51" splitType="Stretch">
            <textField>
                <reportElement x="0" y="0" width="100" height="20"/>
                <textElement/>
                <textFieldExpression class="java.lang.String">
                  <![CDATA[$F{FIRST_NAME}]]></textFieldExpression>
            </textField>
            <textField>
                <reportElement x="100" y="0" width="100" height="20"/>
                <textElement/>
                <textFieldExpression class="java.lang.String">
                  <![CDATA[$F{LAST_NAME}]]></textFieldExpression>
            </textField>
            <textField>
                <reportElement x="200" y="0" width="100" height="20"/>
                <textElement/>
                <textFieldExpression class="java.lang.String">
                  <![CDATA[$F{SALARY}]]></textFieldExpression>
            </textField>
        </band>
    </detail>
</jasperReport>

3.1. Compiling Reports

JRXML files need to be compiled so the report engine can fill them with data.

Let’s perform this operation with the help of the JasperCompileManager class:

InputStream employeeReportStream
  = getClass().getResourceAsStream("/employeeReport.jrxml");
JasperReport jasperReport
  = JasperCompileManager.compileReport(employeeReportStream);

To avoid compiling it every time, we can save it to a file:

JRSaver.saveObject(jasperReport, "employeeReport.jasper");

4. Populating Reports

The most common way to fill compiled reports is with records from a database. This requires the report to contain a SQL query the engine will execute to obtain the data.

First, let’s modify our report to add a SQL query:

<jasperReport ... >
    <queryString>
        <![CDATA[SELECT * FROM EMPLOYEE]]>
    </queryString>
    ...
</jasperReport>

Now, let’s create a simple data source:

@Bean
public DataSource dataSource() {
    return new EmbeddedDatabaseBuilder()
      .setType(EmbeddedDatabaseType.HSQL)
      .addScript("classpath:employee-schema.sql")
      .build();
}

Now, we can fill the report:

JasperPrint jasperPrint = JasperFillManager.fillReport(
  jasperReport, null, dataSource.getConnection());

Note that we are passing null to the second argument since our report doesn’t receive any parameters yet.

4.1. Parameters

Parameters are useful for passing data to the report engine that it can not find in its data source or when data changes depending on different runtime conditions.

We can also change portions or even the entire SQL query with parameters received in the report filling operation.

First, let’s modify the report to receive three parameters:

<jasperReport ... >
    <parameter name="title" class="java.lang.String" />
    <parameter name="minSalary" class="java.lang.Double" />
    <parameter name="condition" class="java.lang.String">
        <defaultValueExpression>
          <![CDATA["1 = 1"]]></defaultValueExpression>
    </parameter>
    <!-- ... -->
</jasperReport>

Now, let’s add a title section to show the title parameter:

<jasperReport ... >
    <!-- ... -->
    <title>
        <band height="20" splitType="Stretch">
            <textField>
                <reportElement x="238" y="0" width="100" height="20"/>
                <textElement/>
                <textFieldExpression class="java.lang.String">
                  <![CDATA[$P{title}]]></textFieldExpression>
            </textField>
        </band>
    </title>
    <!-- ... -->
</jasperReport>

Next, let’s alter the query to use the minSalary and condition parameters:

SELECT * FROM EMPLOYEE
  WHERE SALARY >= $P{minSalary} AND $P!{condition}

Note the different syntax when using the condition parameter. This tells the engine that the parameter should not be used as a standard PreparedStatement parameter, but as if its value had been written directly into the SQL query.
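For example, with the parameter values passed later in this section, the query the engine effectively executes would look roughly like this (the minSalary value is bound as an ordinary JDBC placeholder, while the condition text is spliced in verbatim):

```sql
-- $P{minSalary} becomes a PreparedStatement placeholder; $P!{condition} is inlined as-is
SELECT * FROM EMPLOYEE
  WHERE SALARY >= ? AND LAST_NAME = 'Smith' ORDER BY FIRST_NAME
```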

Finally, let’s prepare the parameters and fill the report:

Map<String, Object> parameters = new HashMap<>();
parameters.put("title", "Employee Report");
parameters.put("minSalary", 15000.0);
parameters.put("condition", " LAST_NAME ='Smith' ORDER BY FIRST_NAME");

JasperPrint jasperPrint
  = JasperFillManager.fillReport(..., parameters, ...);

Note that the keys of parameters correspond to the parameter names in the report. If the engine detects that a parameter is missing, it will obtain the value from the parameter’s defaultValueExpression, if any.

5. Exporting

To export a report, first, we instantiate an object of an exporter class that matches the file format we need.

Then, we set our previous filled report as input and define where to output the resulting file.

Optionally, we can set corresponding report and export configuration objects to customize the exporting process.

5.1. PDF

JRPdfExporter exporter = new JRPdfExporter();

exporter.setExporterInput(new SimpleExporterInput(jasperPrint));
exporter.setExporterOutput(
  new SimpleOutputStreamExporterOutput("employeeReport.pdf"));

SimplePdfReportConfiguration reportConfig
  = new SimplePdfReportConfiguration();
reportConfig.setSizePageToContent(true);
reportConfig.setForceLineBreakPolicy(false);

SimplePdfExporterConfiguration exportConfig
  = new SimplePdfExporterConfiguration();
exportConfig.setMetadataAuthor("baeldung");
exportConfig.setEncrypted(true);
exportConfig.setAllowedPermissionsHint("PRINTING");

exporter.setConfiguration(reportConfig);
exporter.setConfiguration(exportConfig);

exporter.exportReport();

5.2. XLS

JRXlsxExporter exporter = new JRXlsxExporter();
 
// Set input and output ...
SimpleXlsxReportConfiguration reportConfig
  = new SimpleXlsxReportConfiguration();
reportConfig.setSheetNames(new String[] { "Employee Data" });

exporter.setConfiguration(reportConfig);
exporter.exportReport();

5.3. CSV

JRCsvExporter exporter = new JRCsvExporter();
 
// Set input ...
exporter.setExporterOutput(
  new SimpleWriterExporterOutput("employeeReport.csv"));

exporter.exportReport();

5.4. HTML

HtmlExporter exporter = new HtmlExporter();
 
// Set input ...
exporter.setExporterOutput(
  new SimpleHtmlExporterOutput("employeeReport.html"));

exporter.exportReport();

6. Subreports

Subreports are nothing more than a standard report embedded in another report.

First, let’s create a report to show the emails of an employee:

<jasperReport ... >
    <parameter name="idEmployee" class="java.lang.Integer" />
    <queryString>
        <![CDATA[SELECT * FROM EMAIL WHERE ID_EMPLOYEE = $P{idEmployee}]]>
    </queryString>
    <field name="ADDRESS" class="java.lang.String"/>
    <detail>
        <band height="20" splitType="Stretch">
            <textField>
                <reportElement x="0" y="0" width="156" height="20"/>
                <textElement/>
                <textFieldExpression class="java.lang.String">
                  <![CDATA[$F{ADDRESS}]]></textFieldExpression>
            </textField>
        </band>
    </detail>
</jasperReport>

Now, let’s modify our employee report to include the previous one:

<detail>
    <band ... >
        <subreport>
            <reportElement x="0" y="20" width="300" height="27"/>
            <subreportParameter name="idEmployee">
                <subreportParameterExpression>
                  <![CDATA[$F{ID}]]></subreportParameterExpression>
            </subreportParameter>
            <connectionExpression>
              <![CDATA[$P{REPORT_CONNECTION}]]></connectionExpression>
            <subreportExpression class="java.lang.String">
              <![CDATA["employeeEmailReport.jasper"]]></subreportExpression>
        </subreport>
    </band>
</detail>

Note that we are referencing the subreport by the name of the compiled file and passing it the idEmployee and current report connection as parameters.

Next, let’s compile both reports:

InputStream employeeReportStream
  = getClass().getResourceAsStream("/employeeReport.jrxml");
JasperReport jasperReport
  = JasperCompileManager.compileReport(employeeReportStream);
JRSaver.saveObject(jasperReport, "employeeReport.jasper");

InputStream emailReportStream
  = getClass().getResourceAsStream("/employeeEmailReport.jrxml");
JRSaver.saveObject(
  JasperCompileManager.compileReport(emailReportStream),
  "employeeEmailReport.jasper");

Our code for filling and exporting the report doesn’t require modifications.

7. Conclusion

In this article, we had a brief look at the core features of the JasperReports library.

We were able to compile and populate reports with records from a database; we passed parameters to change the data shown in the report according to different runtime conditions, embedded subreports, and exported the results to the most common formats.

Complete source code for this article can be found over on Github.

Configuring Separate Spring DataSource for Tests


1. Overview

When testing a Spring application that relies on a persistence layer, such as JPA, we may want to set up a test data source to use a smaller, faster database – one that is different from the one we use to run the application – in order to make running our tests much easier.

Configuring a data source in Spring requires defining a bean of type DataSource, either manually or, if using Spring Boot, through standard application properties.

In this quick tutorial, we’re going to take a look at several ways to configure a separate data source for testing in Spring.

2. Maven Dependencies

We are going to create a Spring Boot application using Spring JPA and testing, so we will need the following dependencies:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
    <version>1.5.2.RELEASE</version>
</dependency> 
<dependency>
    <groupId>com.h2database</groupId>
    <artifactId>h2</artifactId>
    <version>1.4.194</version>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-test</artifactId>
    <version>1.5.2.RELEASE</version>
</dependency>

The latest versions of spring-boot-starter-data-jpa, h2 and spring-boot-starter-test can be downloaded from Maven Central.

Let’s take a look at a few different ways to configure a DataSource for testing.

3. Using a Standard Properties File in Spring Boot

The standard properties file that Spring Boot picks up automatically when running an application is called application.properties and resides in the src/main/resources folder.

If we want to use different properties for tests, then we can override the properties file in the main folder by placing another file with the same name in src/test/resources.

The application.properties file in src/test/resources folder should contain the standard key-value pairs necessary for configuring a data source. These properties are prefixed with spring.datasource.

For example, let’s configure an H2 in-memory database as a data source for tests:

spring.datasource.driver-class-name=org.h2.Driver
spring.datasource.url=jdbc:h2:mem:db;DB_CLOSE_DELAY=-1
spring.datasource.username=sa
spring.datasource.password=sa

Spring Boot will use these properties to automatically configure a DataSource bean.

Let’s define a very simple GenericEntity and repository using Spring JPA:

@Entity
public class GenericEntity {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;
    private String value;

    //standard constructors, getters, setters
}
public interface GenericEntityRepository
  extends JpaRepository<GenericEntity, Long> { }

Next, let’s write a JUnit test for the repository. In order for a test in a Spring Boot application to pick up the standard data source properties we have defined, it has to be annotated with @SpringBootTest:

@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(classes = Application.class)
public class SpringBootJPAIntegrationTest {
 
    @Autowired
    private GenericEntityRepository genericEntityRepository;

    @Test
    public void givenGenericEntityRepository_whenSaveAndRetrieveEntity_thenOK() {
        GenericEntity genericEntity = genericEntityRepository
          .save(new GenericEntity("test"));
        GenericEntity foundEntity = genericEntityRepository
          .findOne(genericEntity.getId());
 
        assertNotNull(foundEntity);
        assertEquals(genericEntity.getValue(), foundEntity.getValue());
    }
}

4. Using a Custom Properties File

If we don’t want to use the standard application.properties file and keys, or if we’re not using Spring Boot, we can define a custom .properties file with custom keys, then read this file in a @Configuration class to create a DataSource bean based on the values it contains.

This file will be placed in src/main/resources folder for the normal running mode of the application, and in src/test/resources in order to be picked up by tests.

Let’s create a file called persistence-generic-entity.properties that uses an H2 in-memory database for tests and place it in the src/test/resources folder:

jdbc.driverClassName=org.h2.Driver
jdbc.url=jdbc:h2:mem:db;DB_CLOSE_DELAY=-1
jdbc.username=sa
jdbc.password=sa

Next, we can define the DataSource bean based on these properties in a @Configuration class that loads our persistence-generic-entity.properties as a property source:

@Configuration
@EnableJpaRepositories(basePackages = "org.baeldung.repository")
@PropertySource("persistence-generic-entity.properties")
@EnableTransactionManagement
public class H2JpaConfig {
    // ...
}

For a more detailed example of this configuration, please take a look at our previous article on Self-contained testing with an in-memory database, section “JPA Configuration”.

Then, we can create a JUnit test similar to the previous one, except it will load our configuration class:

@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(classes = {Application.class, H2JpaConfig.class})
public class SpringBootH2IntegrationTest {
    // ...
}

5. Using Spring Profiles

Another way we could configure a separate DataSource for testing is by leveraging Spring Profiles to define a DataSource bean that is only available in a test profile.

For this, we can use a .properties file as before, or we can write the values in the class itself.

Let’s define a DataSource bean for the test profile in a @Configuration class that will be loaded by our test:

@Configuration
@EnableJpaRepositories(basePackages = {
  "org.baeldung.repository",
  "org.baeldung.boot.repository"
})
@EnableTransactionManagement
public class H2TestProfileJPAConfig {

    @Bean
    @Profile("test")
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName("org.h2.Driver");
        dataSource.setUrl("jdbc:h2:mem:db;DB_CLOSE_DELAY=-1");
        dataSource.setUsername("sa");
        dataSource.setPassword("sa");

        return dataSource;
    }
    
    // configure entityManagerFactory
    // configure transactionManager
    // configure additional Hibernate properties
}

Then, in the JUnit test class, we need to specify that we want to use the test profile by adding the @ActiveProfiles annotation:

@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(classes = {
  Application.class, 
  H2TestProfileJPAConfig.class})
@ActiveProfiles("test")
public class SpringBootProfileIntegrationTest {
    // ...
}

6. Conclusion

In this quick tutorial, we’ve seen several ways in which we can configure a separate DataSource for testing in Spring.

As always, the full source code of the examples can be found over on GitHub.

Quick Guide to the Java StringTokenizer

1. Overview

In this quick article, we’ll explore a fundamental class in Java – the StringTokenizer.

2. StringTokenizer

The StringTokenizer class helps us split Strings into multiple tokens.

StreamTokenizer provides similar functionality, but StringTokenizer’s tokenization method is much simpler than the one used by the StreamTokenizer class. Methods of StringTokenizer do not distinguish among identifiers, numbers, and quoted strings, nor do they recognize and skip comments.
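For instance, the standard library’s StreamTokenizer classifies tokens as it reads them – something StringTokenizer never does. Here’s a minimal, runnable sketch of that difference (the class and method names here are our own):

```java
import java.io.IOException;
import java.io.StreamTokenizer;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.ArrayList;
import java.util.List;

public class StreamTokenizerDemo {

    // Label each token with the category StreamTokenizer assigned to it
    public static List<String> classify(String input) {
        List<String> result = new ArrayList<>();
        StreamTokenizer tokenizer = new StreamTokenizer(new StringReader(input));
        try {
            while (tokenizer.nextToken() != StreamTokenizer.TT_EOF) {
                if (tokenizer.ttype == StreamTokenizer.TT_NUMBER) {
                    result.add("number:" + tokenizer.nval);
                } else if (tokenizer.ttype == StreamTokenizer.TT_WORD) {
                    result.add("word:" + tokenizer.sval);
                }
            }
        } catch (IOException e) {
            // cannot happen for an in-memory StringReader
            throw new UncheckedIOException(e);
        }
        return result;
    }

    public static void main(String[] args) {
        // StreamTokenizer recognizes 42 as a number; StringTokenizer
        // would hand back both pieces as plain String tokens
        System.out.println(classify("price 42")); // [word:price, number:42.0]
    }
}
```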

The set of delimiters (the characters that separate tokens) may be specified either at the creation time or on a per-token basis.

3. Using the StringTokenizer

The simplest example of using StringTokenizer will be to split a String based on specified delimiters.

In this quick example, we’re going to split the argument String and add the tokens into a list:

public List<String> getTokens(String str) {
    List<String> tokens = new ArrayList<>();
    StringTokenizer tokenizer = new StringTokenizer(str, ",");
    while (tokenizer.hasMoreElements()) {
        tokens.add(tokenizer.nextToken());
    }
    return tokens;
}

Notice how we’re breaking the String into a list of tokens based on the delimiter ‘,‘. Then, in the loop, we add each token to the ArrayList using the tokens.add() method.

For example, if a user gives the input “Welcome,to,baeldung.com“, this method should return a list containing the three fragments “Welcome“, “to” and “baeldung.com“.

3.1. Java 8 Approach

Since StringTokenizer implements the Enumeration<Object> interface, we can use it with Java‘s Collections framework.

If we consider the earlier example, we can retrieve the same set of tokens using Collections.list() method and Stream API:

public List<String> getTokensWithCollection(String str) {
    return Collections.list(new StringTokenizer(str, ",")).stream()
      .map(token -> (String) token)
      .collect(Collectors.toList());
}

Here, we are passing the StringTokenizer itself as a parameter in the Collections.list() method.

Note that, since Enumeration works with the Object type, we need to type-cast the tokens to String (this depends on the implementation; if we used a List of Integer/Float, we’d cast to Integer/Float instead).

3.2. Variants of StringTokenizer

StringTokenizer comes with two overloaded constructors besides StringTokenizer(String str, String delim): StringTokenizer(String str) and StringTokenizer(String str, String delim, boolean returnDelims):

StringTokenizer(String str, String delim, boolean returnDelims) takes an extra boolean input. If the boolean value is true, then StringTokenizer considers the delimiters themselves as tokens and adds them to its internal pool of tokens.

StringTokenizer(String str) is a shortcut for the previous example; it internally calls the other constructor with the hard-coded delimiter set ” \t\n\r\f” and the boolean value false.
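A quick sketch of both variants in action (the class and helper names here are our own):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

public class TokenizerConstructorsDemo {

    // Drain a tokenizer into a list for easy inspection
    public static List<String> tokens(StringTokenizer tokenizer) {
        List<String> result = new ArrayList<>();
        while (tokenizer.hasMoreTokens()) {
            result.add(tokenizer.nextToken());
        }
        return result;
    }

    public static void main(String[] args) {
        // returnDelims = true: each delimiter comes back as a one-character token
        System.out.println(tokens(new StringTokenizer("a,b", ",", true))); // [a, ,, b]

        // single-argument constructor: splits on the whitespace set " \t\n\r\f"
        System.out.println(tokens(new StringTokenizer("Welcome to\tbaeldung"))); // [Welcome, to, baeldung]
    }
}
```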

3.3. Token Customization

StringTokenizer also comes with an overloaded nextToken() method which takes a String as input. This String replaces the current set of delimiters, and the remaining tokens are computed against the new set.

For example, we can pass “,e” to the nextToken() method to further break the string on both the delimiters ‘,‘ and ‘e‘:

tokens.add(tokenizer.nextToken(",e"));

Hence, for a given string of ‘Hello,baeldung.com‘ we will produce the following tokens:

H
llo
ba
ldung.com
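Note that nextToken(delim) replaces the current delimiter set rather than extending it, so to obtain the four tokens above we must pass “,e”, not just “e”. A small runnable sketch (the class name is our own):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

public class NextTokenDelimDemo {

    public static List<String> retokenize(String input) {
        // initial delimiter set is {','}
        StringTokenizer tokenizer = new StringTokenizer(input, ",");
        List<String> tokens = new ArrayList<>();
        while (tokenizer.hasMoreTokens()) {
            // nextToken(",e") switches the delimiter set to {',', 'e'}
            // for this and all subsequent calls
            tokens.add(tokenizer.nextToken(",e"));
        }
        return tokens;
    }

    public static void main(String[] args) {
        System.out.println(retokenize("Hello,baeldung.com")); // [H, llo, ba, ldung.com]
    }
}
```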

3.4. Token Count

To count the number of tokens still available, we can use StringTokenizer‘s countTokens() method:

int tokenCount = tokenizer.countTokens();

3.5. Reading From CSV File

Now, let’s try using StringTokenizer in a real use case.

There are scenarios where we try to read data from CSV files and parse the data based on the user-given delimiter.

Using StringTokenizer, we can easily get there:

public List<String> getTokensFromFile(String path, String delim) {
    List<String> tokens = new ArrayList<>();
    String currLine;
    StringTokenizer tokenizer;
    try (BufferedReader br = new BufferedReader(
      new InputStreamReader(Application.class.getResourceAsStream("/" + path)))) {
        while ((currLine = br.readLine()) != null) {
            tokenizer = new StringTokenizer(currLine, delim);
            while (tokenizer.hasMoreElements()) {
                tokens.add(tokenizer.nextToken());
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    return tokens;
}

Here, the function takes two arguments; one is the CSV file name (read from the resources [src -> main -> resources] folder) and the other is a delimiter.

Based on these two arguments, the CSV data is read line by line, and each line gets tokenized using StringTokenizer.

For example, we’ve put the following content in the CSV:

1|IND|India
2|MY|Malaysia
3|AU|Australia

Hence, the following tokens should be generated:

1
IND
India
2
MY
Malaysia
3
AU
Australia

3.6. Testing

Now, let’s create a quick test case:

public class TokenizerTest {

    private MyTokenizer myTokenizer = new MyTokenizer();
    private List<String> expectedTokensForString = Arrays.asList(
      "Welcome", "to", "baeldung.com");
    private List<String> expectedTokensForFile = Arrays.asList(
      "1", "IND", "India",
      "2", "MY", "Malaysia",
      "3", "AU", "Australia");

    @Test
    public void givenString_thenGetListOfString() {
        String str = "Welcome,to,baeldung.com";
        List<String> actualTokens = myTokenizer.getTokens(str);

        assertEquals(expectedTokensForString, actualTokens);
    }

    @Test
    public void givenFile_thenGetListOfString() {
        List<String> actualTokens = myTokenizer.getTokensFromFile(
          "data.csv", "|");

        assertEquals(expectedTokensForFile, actualTokens);
    }
}

4. Conclusion

In this quick tutorial, we had a look at some practical examples of using the core Java StringTokenizer.

Like always, the full source code is available over on GitHub.

Testing a REST API with JBehave

1. Introduction

In this article, we’ll have a quick look at JBehave, then focus on testing a REST API from a BDD perspective.

2. JBehave and BDD

JBehave is a Behaviour Driven Development framework. It intends to provide an intuitive and accessible way for automated acceptance testing.

If you’re not familiar with BDD, it’s a good idea to start with this article, covering another BDD testing framework – Cucumber – in which we introduce the general BDD structure and features.

Similar to other BDD frameworks, JBehave adopts the following concepts:

  • Story – represents an automatically executable increment of business functionality, comprises one or more scenarios
  • Scenarios – represent concrete examples of the behavior of the system
  • Steps – represent actual behavior using classic BDD keywords: Given, When and Then

A typical scenario would be:

Given a precondition
When an event occurs
Then the outcome should be captured

Each step in the scenario corresponds to an annotation in JBehave:

  • @Given: initiate the context
  • @When: do the action
  • @Then: test the expected outcome

3. Maven Dependency

To make use of JBehave in our maven project, the jbehave-core dependency should be included in the pom:

<dependency>
    <groupId>org.jbehave</groupId>
    <artifactId>jbehave-core</artifactId>
    <version>4.1</version>
    <scope>test</scope>
</dependency>

4. A Quick Example

To use JBehave, we need to follow the following steps:

  1. Write a user story
  2. Map steps from the user story to Java code
  3. Configure user stories
  4. Run JBehave tests
  5. Review results

4.1. Story

Let’s start with the following simple story: “as a user, I want to increase a counter, so that I can have the counter’s value increase by 1”.

We can define the story in a .story file:

Scenario: when a user increases a counter, its value is increased by 1

Given a counter
And the counter has any integral value
When the user increases the counter
Then the value of the counter must be 1 greater than previous value

4.2. Mapping Steps

Given the steps, let’s implement this in Java:

public class IncreaseSteps {
    private int counter;
    private int previousValue;

    @Given("a counter")
    public void aCounter() {
    }

    @Given("the counter has any integral value")
    public void counterHasAnyIntegralValue() {
        counter = new Random().nextInt();
        previousValue = counter;
    }

    @When("the user increases the counter")
    public void increasesTheCounter() {
        counter++;
    }

    @Then("the value of the counter must be 1 greater than previous value")
    public void theValueOfTheCounterMustBe1Greater() {
        assertTrue(1 == counter - previousValue);
    }
}

Remember that the value in the annotations must accurately match the description.

4.3. Configuring Our Story

To perform the steps, we need to set up the stage for our story:

public class IncreaseStoryLiveTest extends JUnitStories {

    @Override
    public Configuration configuration() {
        return new MostUsefulConfiguration()
          .useStoryLoader(new LoadFromClasspath(this.getClass()))
          .useStoryReporterBuilder(new StoryReporterBuilder()
            .withCodeLocation(codeLocationFromClass(this.getClass()))
            .withFormats(CONSOLE));
    }

    @Override
    public InjectableStepsFactory stepsFactory() {
        return new InstanceStepsFactory(configuration(), new IncreaseSteps());
    }

    @Override
    protected List<String> storyPaths() {
        return Arrays.asList("increase.story");
    }

}

In storyPaths(), we provide our .story file path to be parsed by JBehave. The actual steps implementation is provided in stepsFactory(). Then, in configuration(), the story loader and the story reporter are properly configured.

Now that we have everything ready, we can begin our story simply by running: mvn clean test.

4.4. Reviewing Test Results

We can see our test result in the console. As our tests have passed successfully, the output is the same as our story:

Scenario: when a user increases a counter, its value is increased by 1
Given a counter
And the counter has any integral value
When the user increases the counter
Then the value of the counter must be 1 greater than previous value

If we forget to implement any step of the scenario, the report will let us know. Say we didn’t implement the @When step:

Scenario: when a user increases a counter, its value is increased by 1
Given a counter
And the counter has any integral value
When the user increases the counter (PENDING)
Then the value of the counter must be 1 greater than previous value (NOT PERFORMED)
@When("the user increases the counter")
@Pending
public void whenTheUserIncreasesTheCounter() {
    // PENDING
}

The report says the @When step is pending, and because of that, the @Then step was not performed.

What if our @Then step fails? We can spot the error right away from the report:

Scenario: when a user increases a counter, its value is increased by 1
Given a counter
And the counter has any integral value
When the user increases the counter
Then the value of the counter must be 1 greater than previous value (FAILED)
(java.lang.AssertionError)

5. Testing REST API

Now that we have grasped the basics of JBehave, we’ll see how to test a REST API with it. Our tests will be based on our previous article discussing how to test a REST API with Java.

In that article, we tested the GitHub REST API and mainly focused on the HTTP response code, headers, and payload. For simplicity, we can write them into three separate stories respectively.

5.1. Testing the Status Code

The story:

Scenario: when a user checks a non-existent user on github, github would respond 'not found'

Given github user profile api
And a random non-existent username
When I look for the random user via the api
Then github respond: 404 not found

When I look for eugenp1 via the api
Then github respond: 404 not found

When I look for eugenp2 via the api
Then github respond: 404 not found

The steps:

public class GithubUserNotFoundSteps {

    private String api;
    private String nonExistentUser;
    private int githubResponseCode;

    @Given("github user profile api")
    public void givenGithubUserProfileApi() {
        api = "https://api.github.com/users/%s";
    }

    @Given("a random non-existent username")
    public void givenANonexistentUsername() {
        nonExistentUser = randomAlphabetic(8);
    }

    @When("I look for the random user via the api")
    public void whenILookForTheUserViaTheApi() throws IOException {
        githubResponseCode = getGithubUserProfile(api, nonExistentUser)
          .getStatusLine()
          .getStatusCode();
    }

    @When("I look for $user via the api")
    public void whenILookForSomeNonExistentUserViaTheApi(
      String user) throws IOException {
        githubResponseCode = getGithubUserProfile(api, user)
          .getStatusLine()
          .getStatusCode();
    }

    @Then("github respond: 404 not found")
    public void thenGithubRespond404NotFound() {
        assertTrue(SC_NOT_FOUND == githubResponseCode);
    }

    //...
}

Notice how, in the steps implementation, we used the parameter injection feature. The arguments extracted from the step candidate are matched, in their natural order, to the parameters of the annotated Java method.

Also, annotated named parameters are supported:

@When("I look for $username via the api")
public void whenILookForSomeNonExistentUserViaTheApi(
  @Named("username") String user) throws IOException

5.2. Testing the Media Type

Here’s a simple MIME type testing story:

Scenario: when a user checks a valid user's profile on github, github would respond json data

Given github user profile api
And a valid username
When I look for the user via the api
Then github respond data of type json

And here are the steps:

public class GithubUserResponseMediaTypeSteps {

    private String api;
    private String validUser;
    private String mediaType;

    @Given("github user profile api")
    public void givenGithubUserProfileApi() {
        api = "https://api.github.com/users/%s";
    }

    @Given("a valid username")
    public void givenAValidUsername() {
        validUser = "eugenp";
    }

    @When("I look for the user via the api")
    public void whenILookForTheUserViaTheApi() throws IOException {
        mediaType = ContentType
          .getOrDefault(getGithubUserProfile(api, validUser).getEntity())
          .getMimeType();
    }

    @Then("github respond data of type json")
    public void thenGithubRespondDataOfTypeJson() {
        assertEquals("application/json", mediaType);
    }
}

5.3. Testing the JSON Payload

Then the last story:

Scenario: when a user checks a valid user's profile on github, github's response json should include a login payload with the same username

Given github user profile api
When I look for eugenp via the api
Then github's response contains a 'login' payload same as eugenp

And the plain straight steps implementation:

public class GithubUserResponsePayloadSteps {

    private String api;
    private GitHubUser resource;

    @Given("github user profile api")
    public void givenGithubUserProfileApi() {
        api = "https://api.github.com/users/%s";
    }

    @When("I look for $user via the api")
    public void whenILookForEugenpViaTheApi(String user) throws IOException {
        HttpResponse httpResponse = getGithubUserProfile(api, user);
        resource = RetrieveUtil.retrieveResourceFromResponse(httpResponse, GitHubUser.class);
    }

    @Then("github's response contains a 'login' payload same as $username")
    public void thenGithubsResponseContainsAloginPayloadSameAsEugenp(String username) {
        assertThat(username, Matchers.is(resource.getLogin()));
    }
}

6. Summary

In this article, we have briefly introduced JBehave and implemented BDD-style REST API tests.

When compared to our plain Java test code, code implemented with JBehave looks much clearer and more intuitive, and the test result report looks much more elegant.

As always, the example code can be found in the Github project.

REST Query Language – Implementing OR Operation

1. Overview

In this quick article, we’ll extend the advanced search operations that we implemented in the previous article and include OR-based search criteria into our REST API Query Language.

2. Implementation Approach

Before, all the criteria in the search query parameter formed predicates grouped only by AND operator. Let’s change that.

We could implement this feature either as a simple, quick change to the existing approach or as a new one from scratch.

With the simple approach, we’ll flag the criteria to indicate that it must be combined using the OR operator.

For example, here is the URL to test the API for “firstName OR lastName”:

http://localhost:8080/users?search=firstName:john,'lastName:doe

Note that we have flagged the criteria lastName with a single quote to differentiate it. We capture this OR predicate in our criteria value object – SpecSearchCriteria:

public SpecSearchCriteria(
  String orPredicate, String key, SearchOperation operation, Object value) {
    super();
    
    this.orPredicate 
      = orPredicate != null
      && orPredicate.equals(SearchOperation.OR_PREDICATE_FLAG);
    
    this.key = key;
    this.operation = operation;
    this.value = value;
}

3. UserSpecificationBuilder Improvement

Now, let’s modify our specification builder, UserSpecificationBuilder, to consider the OR qualified criteria when constructing Specification<User>:

public Specification<User> build() {
    if (params.size() == 0) {
        return null;
    }
    Specification<User> result = new UserSpecification(params.get(0));

    for (int i = 1; i < params.size(); i++) {
        result = params.get(i).isOrPredicate()
          ? Specifications.where(result).or(new UserSpecification(params.get(i))) 
          : Specifications.where(result).and(new UserSpecification(params.get(i)));
    }
    return result;
 }

4. UserController Improvement

Finally, let’s set up a new REST endpoint in our controller to use this search functionality with OR operator. The improved parsing logic extracts the special flag that helps in identifying the criteria with OR operator:

@GetMapping("/users/espec")
@ResponseBody
public List<User> findAllByOrPredicate(@RequestParam String search) {
    Specification<User> spec = resolveSpecification(search);
    return dao.findAll(spec);
}

protected Specification<User> resolveSpecification(String searchParameters) {
    UserSpecificationsBuilder builder = new UserSpecificationsBuilder();
    String operationSetExper = Joiner.on("|")
      .join(SearchOperation.SIMPLE_OPERATION_SET);
    Pattern pattern = Pattern.compile(
      "(\\p{Punct}?)(\\w+?)("
      + operationSetExper 
      + ")(\\p{Punct}?)(\\w+?)(\\p{Punct}?),");
    Matcher matcher = pattern.matcher(searchParameters + ",");
    while (matcher.find()) {
        builder.with(matcher.group(1), matcher.group(2), matcher.group(3),
          matcher.group(5), matcher.group(4), matcher.group(6));
    }
    
    return builder.build();
}
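To see what the parsing pattern extracts from a query string, here’s a self-contained sketch of the regex part alone; the hard-coded operation set is our own simplification standing in for SearchOperation.SIMPLE_OPERATION_SET, and the class name is hypothetical:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SearchParamRegexDemo {

    // Simplified operation set; the real one is built from SearchOperation.SIMPLE_OPERATION_SET
    private static final String OPERATION_SET = ":|!|>|<|~";

    public static List<String> parse(String search) {
        Pattern pattern = Pattern.compile(
          "(\\p{Punct}?)(\\w+?)(" + OPERATION_SET + ")(\\p{Punct}?)(\\w+?)(\\p{Punct}?),");
        Matcher matcher = pattern.matcher(search + ",");
        List<String> criteria = new ArrayList<>();
        while (matcher.find()) {
            // group(1) carries the OR flag ('), group(2) the key,
            // group(3) the operation, and group(5) the value
            criteria.add(matcher.group(1) + "|" + matcher.group(2)
              + "|" + matcher.group(3) + "|" + matcher.group(5));
        }
        return criteria;
    }

    public static void main(String[] args) {
        System.out.println(parse("firstName:john,'lastName:doe"));
        // [|firstName|:|john, '|lastName|:|doe]
    }
}
```

Note how the single quote before lastName surfaces in group(1), which is exactly what the builder uses to mark an OR predicate.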

5. Live Test With OR Condition

In this live test example, with the new API endpoint, we’ll search for users by the first name “john” OR last name “doe”. Note that parameter lastName has a single quote, which qualifies it as an “OR predicate”:

private String EURL_PREFIX
  = "http://localhost:8082/spring-security-rest-full/auth/users/espec?search=";

@Test
public void givenFirstOrLastName_whenGettingListOfUsers_thenCorrect() {
    Response response = givenAuth().get(EURL_PREFIX + "firstName:john,'lastName:doe");
    String result = response.body().asString();

    assertTrue(result.contains(userJohn.getEmail()));
    assertTrue(result.contains(userTom.getEmail()));
}

6. Persistence Test With OR Condition

Now, let’s perform the same test we did above, at the persistence level for users with first name “john” OR last name “doe”:

@Test
public void givenFirstOrLastName_whenGettingListOfUsers_thenCorrect() {
    UserSpecificationsBuilder builder = new UserSpecificationsBuilder();

    SpecSearchCriteria spec 
      = new SpecSearchCriteria("firstName", SearchOperation.EQUALITY, "john");
    SpecSearchCriteria spec1 
      = new SpecSearchCriteria("'","lastName", SearchOperation.EQUALITY, "doe");

    List<User> results = repository
      .findAll(builder.with(spec).with(spec1).build());

    assertThat(results, hasSize(2));
    assertThat(userJohn, isIn(results));
    assertThat(userTom, isIn(results));
}

7. Alternative Approach

In the alternative approach, we could provide the search query more like the complete WHERE clause of an SQL query.

For example, here is the URL for a more complex search by firstName and age:

http://localhost:8080/users?search=( firstName:john OR firstName:tom ) AND age>22

Note that we have separated the individual criteria, operators and grouping parentheses with a space to form a valid infix expression.

Let us parse the infix expression with a CriteriaParser. Our CriteriaParser splits the given infix expression into tokens (criteria, parentheses, AND and OR operators) and creates a postfix expression from them:

public Deque<?> parse(String searchParam) {

    Deque<Object> output = new LinkedList<>();
    Deque<String> stack = new LinkedList<>();

    for (String token : searchParam.split("\\s+")) {
        if (ops.containsKey(token)) {
            while (!stack.isEmpty() 
              && isHigerPrecedenceOperator(token, stack.peek())) {
                output.push(stack.pop().equalsIgnoreCase(SearchOperation.OR_OPERATOR)
                  ? SearchOperation.OR_OPERATOR : SearchOperation.AND_OPERATOR);
            }
            stack.push(token.equalsIgnoreCase(SearchOperation.OR_OPERATOR) 
              ? SearchOperation.OR_OPERATOR : SearchOperation.AND_OPERATOR);

        } else if (token.equals(SearchOperation.LEFT_PARANTHESIS)) {
            stack.push(SearchOperation.LEFT_PARANTHESIS);
        } else if (token.equals(SearchOperation.RIGHT_PARANTHESIS)) {
            while (!stack.peek().equals(SearchOperation.LEFT_PARANTHESIS)) { 
                output.push(stack.pop());
            }
            stack.pop();
        } else {
            Matcher matcher = SpecCriteraRegex.matcher(token);
            while (matcher.find()) {
                output.push(new SpecSearchCriteria(
                  matcher.group(1), 
                  matcher.group(2), 
                  matcher.group(3), 
                  matcher.group(4), 
                  matcher.group(5)));
            }
        }
    }

    while (!stack.isEmpty()) {
        output.push(stack.pop());
    }
  
    return output;
}
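The parse() method above is essentially the shunting-yard algorithm. Stripped of the criteria handling, the idea can be sketched in a few lines (the class name and plain-string operator encoding are our own; AND is assumed to bind tighter than OR, as in SQL):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class ShuntingYardDemo {

    // AND binds tighter than OR
    private static int precedence(String op) {
        return "AND".equals(op) ? 2 : 1;
    }

    public static String toPostfix(String infix) {
        Deque<String> stack = new ArrayDeque<>();
        StringBuilder out = new StringBuilder();
        for (String token : infix.split("\\s+")) {
            if (token.equals("AND") || token.equals("OR")) {
                // pop operators of equal or higher precedence before pushing
                while (!stack.isEmpty() && !stack.peek().equals("(")
                  && precedence(stack.peek()) >= precedence(token)) {
                    out.append(stack.pop()).append(' ');
                }
                stack.push(token);
            } else if (token.equals("(")) {
                stack.push(token);
            } else if (token.equals(")")) {
                while (!stack.peek().equals("(")) {
                    out.append(stack.pop()).append(' ');
                }
                stack.pop(); // discard the "("
            } else {
                out.append(token).append(' '); // a criterion goes straight to the output
            }
        }
        while (!stack.isEmpty()) {
            out.append(stack.pop()).append(' ');
        }
        return out.toString().trim();
    }

    public static void main(String[] args) {
        System.out.println(toPostfix("( firstName:john OR firstName:tom ) AND age>22"));
        // firstName:john firstName:tom OR age>22 AND
    }
}
```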

Let us add a new method in our specification builder, GenericSpecificationBuilder, to construct the search Specification from the postfix expression:

public Specification<U> build(Deque<?> postFixedExprStack,
  Function<SpecSearchCriteria, Specification<U>> converter) {

    Deque<Specification<U>> specStack = new LinkedList<>();

    while (!postFixedExprStack.isEmpty()) {
        Object mayBeOperand = postFixedExprStack.pollLast();

        if (!(mayBeOperand instanceof String)) {
            specStack.push(converter.apply((SpecSearchCriteria) mayBeOperand));
        } else {
            Specification<U> operand1 = specStack.pop();
            Specification<U> operand2 = specStack.pop();
            if (mayBeOperand.equals(SearchOperation.AND_OPERATOR)) {
                specStack.push(Specifications.where(operand1)
                  .and(operand2));
            } else if (mayBeOperand.equals(SearchOperation.OR_OPERATOR)) {
                specStack.push(Specifications.where(operand1)
                  .or(operand2));
            }
        }
    }
    return specStack.pop();
}

Finally, let us add another REST endpoint in our UserController to parse the complex expression with the new CriteriaParser:

@GetMapping("/users/spec/adv")
@ResponseBody
public List<User> findAllByAdvPredicate(@RequestParam String search) {
    Specification<User> spec = resolveSpecificationFromInfixExpr(search);
    return dao.findAll(spec);
}

protected Specification<User> resolveSpecificationFromInfixExpr(String searchParameters) {
    CriteriaParser parser = new CriteriaParser();
    GenericSpecificationsBuilder<User> specBuilder = new GenericSpecificationsBuilder<>();
    return specBuilder.build(parser.parse(searchParameters), UserSpecification::new);
}

8. Conclusion

In this tutorial, we’ve improved our REST query language with the capability to search with an OR operator.

The full implementation of this article can be found in the GitHub project. This is a Maven-based project, so it should be easy to import and run as it is.

Working with an Embedded Jetty Server in Java

1. Overview

In this article, we will be looking at the Jetty library. Jetty provides a web server that can run as an embedded container and integrates easily with the javax.servlet library.

2. Maven Dependencies

To get started we’ll add Maven dependencies to jetty-server and jetty-servlet libraries:

<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-server</artifactId>
    <version>9.4.3.v20170317</version>
</dependency>
<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-servlet</artifactId>
    <version>9.4.3.v20170317</version>
</dependency>

3. Starting Jetty Server with Servlet

Starting the Jetty embedded container is simple. We need to instantiate a new Server object and set it to start on a given port:

public class JettyServer {
    private Server server;
    private ServletHandler servletHandler;

    public void start() throws Exception {
        server = new Server();
        ServerConnector connector = new ServerConnector(server);
        connector.setPort(8090);
        server.setConnectors(new Connector[] {connector});

        servletHandler = new ServletHandler();
        server.setHandler(servletHandler);
    }
}

Let’s say that we want to create an endpoint that will respond with the HTTP status code of 200 if everything goes well and a simple JSON payload.

We’ll create a class that extends the HttpServlet class to handle such requests; this class is single-threaded and blocks until completion:

public class BlockingServlet extends HttpServlet {

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
      throws ServletException, IOException {
        response.setContentType("application/json");
        response.setStatus(HttpServletResponse.SC_OK);
        response.getWriter().println("{ \"status\": \"ok\"}");
    }
}

Next, we need to register the BlockingServlet class in the ServletHandler object by using the addServletWithMapping() method and start the server:

servletHandler.addServletWithMapping(BlockingServlet.class, "/status");
server.start();

If we wish to test our Servlet logic, we need to start our server in the test setup, using the previously created JettyServer class, which is a wrapper around the actual Jetty server instance:

@Before
public void setup() throws Exception {
    jettyServer = new JettyServer();
    jettyServer.start();
}

Once started, we will send a test HTTP request to the /status endpoint:

String url = "http://localhost:8090/status";
HttpClient client = HttpClientBuilder.create().build();
HttpGet request = new HttpGet(url);

HttpResponse response = client.execute(request);
 
assertThat(response.getStatusLine().getStatusCode()).isEqualTo(200);

4. Non-Blocking Servlets

Jetty has good support for asynchronous request processing.

Let’s say that we have an enormous resource that is I/O intensive, taking a long time to load and blocking the executing thread for a substantial amount of time. It is better if that thread can be liberated to handle other requests in the meantime, instead of waiting for some I/O resource.

To provide such logic with Jetty, we can create a servlet that will use the AsyncContext class by calling the startAsync() method on the HttpServletRequest. This code will not block the executing thread but will perform the I/O operation in a separate thread, returning the result when ready using the AsyncContext.complete() method:

public class AsyncServlet extends HttpServlet {
    private static String HEAVY_RESOURCE 
      = "This is some heavy resource that will be served in an async way";

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
      throws ServletException, IOException {
        ByteBuffer content = ByteBuffer.wrap(HEAVY_RESOURCE.getBytes(StandardCharsets.UTF_8));

        AsyncContext async = request.startAsync();
        ServletOutputStream out = response.getOutputStream();
        out.setWriteListener(new WriteListener() {
            @Override
            public void onWritePossible() throws IOException {
                while (out.isReady()) {
                    if (!content.hasRemaining()) {
                        response.setStatus(200);
                        async.complete();
                        return;
                    }
                    out.write(content.get());
                }
            }

            @Override
            public void onError(Throwable t) {
                getServletContext().log("Async Error", t);
                async.complete();
            }
        });
    }
}

We are writing the ByteBuffer to the OutputStream, and once the whole buffer is written, we signal that the result is ready to return to the client by invoking the complete() method.

Next, we need to add the AsyncServlet as a Jetty servlet mapping:

servletHandler.addServletWithMapping(AsyncServlet.class, "/heavy/async");

We can now send a request to the /heavy/async endpoint – that request will be handled by Jetty asynchronously:

String url = "http://localhost:8090/heavy/async";
HttpClient client = HttpClientBuilder.create().build();
HttpGet request = new HttpGet(url);
HttpResponse response = client.execute(request);

assertThat(response.getStatusLine().getStatusCode()).isEqualTo(200);
String responseContent = IOUtils.toString(response.getEntity().getContent(), StandardCharsets.UTF_8);
assertThat(responseContent).isEqualTo("This is some heavy resource that will be served in an async way");

When our application is handling requests in an asynchronous way, we should configure the thread pool explicitly. In the next section, we will configure Jetty to use a custom thread pool.

5. Jetty Configuration

When we run our web application in production, we might want to tune how the Jetty server processes requests. This is done by defining a thread pool and applying it to our Jetty server.

To do this, we have three configuration settings that we can set:

  • maxThreads – To specify the maximum number of threads that Jetty can create and use in the pool
  • minThreads – To set the initial number of threads in the pool that Jetty will use
  • idleTimeout – This value in milliseconds defines how long a thread can be idle before it is stopped and removed from the thread pool. The number of remaining threads in the pool will never go below the minThreads setting
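For intuition, these three knobs map onto the constructor parameters of the JDK's own ThreadPoolExecutor. This is only an analogy (Jetty's QueuedThreadPool behaves differently in detail, and the queue here is hypothetical), but it shows the parameter mapping:

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class PoolAnalogy {
    public static void main(String[] args) {
        // core size ~ minThreads, max size ~ maxThreads, keep-alive ~ idleTimeout
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
          10, 100, 120, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<>());
        System.out.println(pool.getCorePoolSize());    // 10
        System.out.println(pool.getMaximumPoolSize()); // 100
        pool.shutdown();
    }
}
```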

With these settings, we can configure the embedded Jetty server programmatically by passing the configured thread pool to the Server constructor:

int maxThreads = 100;
int minThreads = 10;
int idleTimeout = 120;

QueuedThreadPool threadPool = new QueuedThreadPool(maxThreads, minThreads, idleTimeout);

server = new Server(threadPool);

Then, when we start our server, it will use threads from this thread pool.

6. Conclusion

In this quick tutorial, we saw how to work with an embedded Jetty server and tested our web application.

As always, the code is available over on GitHub.


Introduction to Jenetics Library


1. Introduction

The aim of this series is to explain the idea of genetic algorithms and show the most known implementations.

In this tutorial, we’ll describe Jenetics, a very powerful Java library that can be used for solving various optimization problems.

If you feel that you need to learn more about genetic algorithms, we recommend starting with this article.

2. How Does it Work?

According to its official documentation, Jenetics is a library based on an evolutionary algorithm written in Java. Evolutionary algorithms have their roots in biology, as they use mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection.

Jenetics is implemented using the Java Stream interface, so it works smoothly with the rest of the Java Stream API.

The main features are:

  • frictionless minimization – there is no need to change or tweak the fitness function; we can just change the configuration of the Engine class and we are ready to start our first application
  • dependency free – there are no runtime third-party libraries needed to use Jenetics
  • Java 8 ready – full support for Stream and lambda expressions
  • multithreaded – evolutionary steps can be executed in parallel

In order to use Jenetics, we need to add the following dependency into our pom.xml:

<dependency>
    <groupId>io.jenetics</groupId>
    <artifactId>jenetics</artifactId>
    <version>3.7.0</version>
</dependency>

The latest version can be found in Maven Central.

3. Use Cases

To test all features of Jenetics, we’ll try to solve various well-known optimization problems, starting from the simple binary algorithm and ending with the Knapsack problem.

3.1. Simple Genetic Algorithm

Let’s assume that we need to solve the simplest binary problem, where we need to optimize the positions of the 1 bits in the chromosome consisting of 0’s and 1’s. First, we need to define the factory suitable for the problem:

Factory<Genotype<BitGene>> gtf = Genotype.of(BitChromosome.of(10, 0.5));

We created the BitChromosome with a length of 10, and the probability of having 1’s in the chromosome equal to 0.5.

Now, let’s create the execution environment:

Engine<BitGene, Integer> engine
  = Engine.builder(SimpleGeneticAlgorithm::eval, gtf).build();

The eval() method returns the bit count:

private Integer eval(Genotype<BitGene> gt) {
    return gt.getChromosome().as(BitChromosome.class).bitCount();
}

In the final step, we start the evolution and collect the results:

Genotype<BitGene> result = engine.stream()
  .limit(500)
  .collect(EvolutionResult.toBestGenotype());

The final result will look similar to this:

Before the evolution:
[00000010|11111100]
After the evolution:
[00000000|11111111]

We managed to optimize the position of 1’s in the gene.
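The eval() fitness above simply counts the set bits of the chromosome, which is the same thing plain Java's Integer.bitCount() computes; a quick illustrative check against the evolved genotype:

```java
public class BitCountCheck {
    public static void main(String[] args) {
        // the evolved chromosome [00000000|11111111] has eight 1-bits
        System.out.println(Integer.bitCount(0b0000000011111111));
    }
}
```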

3.2. Subset Sum Problem

Another use case for Jenetics is to solve the subset sum problem. In brief, the challenge to optimize is that, given a set of integers, we need to find a non-empty subset whose sum is zero.
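Before wiring this up with Jenetics, the fitness idea can be sketched in plain Java: the fitness of a subset is the absolute value of its sum, so 0 means the subset is a solution. The sample subsets below are illustrative:

```java
import java.util.List;

public class SubsetSumFitness {
    // mirrors the fitness() we'll hand to Jenetics: |sum| of the subset, 0 is optimal
    static int fitness(List<Integer> subset) {
        return Math.abs(subset.stream().mapToInt(Integer::intValue).sum());
    }

    public static void main(String[] args) {
        System.out.println(fitness(List.of(3, -7, 4))); // a solution: sum is 0
        System.out.println(fitness(List.of(5, -2)));    // not a solution
    }
}
```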

There are predefined interfaces in Jenetics to solve such problems:

public class SubsetSum implements Problem<ISeq<Integer>, EnumGene<Integer>, Integer> {
    // implementation
}

As we can see, we implement the Problem<T, G, C> interface, which has three type parameters:

  • <T> – the argument type of the problem fitness function, in our case an immutable, ordered, fixed sized Integer sequence ISeq<Integer>
  • <G> – the gene type the evolution engine is working with, in this case, countable Integer genes EnumGene<Integer>
  • <C> – the result type of the fitness function; here it is an Integer

In order to use the Problem<T, G, C> interface, we need to override two methods:

@Override
public Function<ISeq<Integer>, Integer> fitness() {
    return subset -> Math.abs(subset.stream()
      .mapToInt(Integer::intValue).sum());
}

@Override
public Codec<ISeq<Integer>, EnumGene<Integer>> codec() {
    return codecs.ofSubSet(basicSet, size);
}

In the first one, we define our fitness function, whereas in the second one we use the codecs class, which contains factory methods for creating common problem encodings – for example, for finding the best fixed-size subset from a given basic set, as in our case.

Now we can proceed to the main part. First, we create the problem instance with a randomly generated basic set:

SubsetSum problem = of(500, 15, new LCG64ShiftRandom(101010));

Please note that we are using the LCG64ShiftRandom generator provided by Jenetics.

In the next step, we are building the engine of our solution:

Engine<EnumGene<Integer>, Integer> engine = Engine.builder(problem)
  .minimizing()
  .maximalPhenotypeAge(5)
  .alterers(new PartiallyMatchedCrossover<>(0.4), new Mutator<>(0.3))
  .build();

We try to minimize the result (optimally the result will be 0) by setting the phenotype age and alterers used to alter the offspring. In the next step we can obtain the result:

Phenotype<EnumGene<Integer>, Integer> result = engine.stream()
  .limit(limit.bySteadyFitness(55))
  .collect(EvolutionResult.toBestPhenotype());

Please note that we are using bySteadyFitness(), which returns a predicate that will truncate the evolution stream if no better phenotype can be found after the given number of generations, and then collect the best result.

If we get lucky, and there is a solution to the randomly created set, we’ll see something similar to this:

[85|-76|178|-197|91|-106|-70|-243|-41|-98|94|-213|139|238|219] --> 0

Otherwise, the sum of the subset will be different from 0.

3.3. Knapsack First Fit Problem

The Jenetics library allows us to solve even more sophisticated problems, such as the Knapsack problem. Briefly speaking, in this problem, we have a limited space in our knapsack, and we need to decide which items to put inside.

Let’s start with defining the bag size and number of items:

int nItems = 15;
double ksSize = nItems * 100.0 / 3.0;

In the next step, we’ll generate a random array containing KnapsackItem objects (defined by size and value fields) and we’ll put those items randomly inside the knapsack, using the First Fit method:

KnapsackFF ff = new KnapsackFF(Stream.generate(KnapsackItem::random)
  .limit(nItems)
  .toArray(KnapsackItem[]::new), ksSize);

Next, we need to create the Engine:

Engine<BitGene, Double> engine
  = Engine.builder(ff, BitChromosome.of(nItems, 0.5))
  .populationSize(500)
  .survivorsSelector(new TournamentSelector<>(5))
  .offspringSelector(new RouletteWheelSelector<>())
  .alterers(new Mutator<>(0.115), new SinglePointCrossover<>(0.16))
  .build();

There are a few points to note here:

  • population size is 500
  • the survivors will be chosen through tournament selection, and the offspring through roulette wheel selection
  • as we did in the previous subsection, we need also to define the alterers for the newly created offspring

There is also one very important feature of Jenetics. We can easily collect all statistics and insights from the whole simulation duration. We are going to do this by using the EvolutionStatistics class:

EvolutionStatistics<Double, ?> statistics = EvolutionStatistics.ofNumber();

Finally, let’s run the simulations:

Phenotype<BitGene, Double> best = engine.stream()
  .limit(bySteadyFitness(7))
  .limit(100)
  .peek(statistics)
  .collect(toBestPhenotype());

Please note that we are updating the evolution statistics after each generation, and that the stream is limited to 7 steady generations and a maximum of 100 generations in total. In more detail, there are two possible scenarios:

  • we achieve 7 steady generations, then the simulation stops
  • we cannot get 7 steady generations in less than 100 generations, so the simulation stops due to the second limit()

It’s important to set a maximum generations limit; otherwise, the simulations may not stop in a reasonable time.

The final result contains a lot of information:

+---------------------------------------------------------------------------+
|  Time statistics                                                          |
+---------------------------------------------------------------------------+
|             Selection: sum=0,039207931000 s; mean=0,003267327583 s        |
|              Altering: sum=0,065145069000 s; mean=0,005428755750 s        |
|   Fitness calculation: sum=0,029678433000 s; mean=0,002473202750 s        |
|     Overall execution: sum=0,111383965000 s; mean=0,009281997083 s        |
+---------------------------------------------------------------------------+
|  Evolution statistics                                                     |
+---------------------------------------------------------------------------+
|           Generations: 12                                                 |
|               Altered: sum=7 664; mean=638,666666667                      |
|                Killed: sum=0; mean=0,000000000                            |
|              Invalids: sum=0; mean=0,000000000                            |
+---------------------------------------------------------------------------+
|  Population statistics                                                    |
+---------------------------------------------------------------------------+
|                   Age: max=10; mean=1,792167; var=4,657748                |
|               Fitness:                                                    |
|                      min  = 0,000000000000                                |
|                      max  = 716,684883338605                              |
|                      mean = 587,012666759785                              |
|                      var  = 17309,892287851708                            |
|                      std  = 131,567063841418                              |
+---------------------------------------------------------------------------+

This time, we were able to pack items with a total value of 716,68 in the best scenario. We can also see the detailed evolution and time statistics.

3.4. How to Test?

It is a fairly simple process — just open the main file related to the problem and run the algorithm. Once we have a general idea, we can start playing with the parameters.

4. Conclusion

In this article, we covered the Jenetics library features based on real optimization problems.

The code is available as a Maven project on GitHub. Please note that we provided the code examples for more optimization challenges, such as the Springsteen Record (yes, it exists!) and Traveling Salesman problems.

For all articles in the series, including other examples of genetic algorithms, check out the following links:

Java Web Weekly, Issue 172


Lots of interesting writeups on Java 9 this week.

Here we go…

1. Spring and Java

>> Accessing private state of Java 9 modules [in.relation.to]

The introduction of modularity in Java 9 sheds new light on accessing private fields using “deep reflection”, and creates problems for libraries such as Hibernate or Lombok.

>> Running Spring Boot Apps on Docker Windows Containers with Ansible: A Complete Guide incl Packer, Vagrant & Powershell [codecentric.de]

It turns out we can run real Docker containers on Windows without using virtual machines 🙂

>> Which Java Logging Framework Has the Best Performance? [sitepoint.com]

A comprehensive guide to Java logging from the performance side of things.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> How to Solve Tough Problems Using Genetic Algorithms [blog.takipi.com]

A quick example of how mimicking nature can help us tackle complex problems.

>> Distributed Cache – Overview [techblog.bozho.net]

A short and practical introduction to distributed caches.

>> Kotlin for front-end developers [frankel.ch]

It turns out you can use Kotlin for front-end development too, assuming you use the Kotlin to JavaScript transpiler.

Also worth reading:

3. Musings

>> The Polyglot’s Dilemma [daedtech.com]

It’s crucial to be able to use your skills for solving problems and not just being a universal penknife.

Also worth reading:

4. Comics

And my favorite Dilberts of the week:

>> On networking 1 [dilbert.com]

>> On networking 2 [dilbert.com]

>> They’re trying to kill me [dilbert.com]

5. Pick of the Week

>> The “Java in 2017” Survey Results

Using @JsonComponent in Spring Boot


1. Overview

This quick article is focused on how to use the @JsonComponent annotation in Spring Boot.

The annotation allows us to register an annotated class as a Jackson serializer and/or deserializer without the need to add it to the ObjectMapper manually.

This is part of the core Spring Boot module, so there are no additional dependencies required in a plain Spring Boot application.

2. Serialization

Let’s start with the following User object containing a favorite color:

public class User {
    private Color favoriteColor;

    // standard getters/constructors
}

If we serialize this object using Jackson with default settings we get:

{
  "favoriteColor": {
    "red": 0.9411764740943909,
    "green": 0.9725490212440491,
    "blue": 1.0,
    "opacity": 1.0,
    "opaque": true,
    "hue": 208.00000000000003,
    "saturation": 0.05882352590560913,
    "brightness": 1.0
  }
}

We can make the JSON a lot more condensed and readable by just printing the RGB values – for example, to be used in CSS.

To this extent, we just have to create a class that implements JsonSerializer:

@JsonComponent
public class UserJsonSerializer extends JsonSerializer<User> {

    @Override
    public void serialize(User user, JsonGenerator jsonGenerator, 
      SerializerProvider serializerProvider) throws IOException, 
      JsonProcessingException {
 
        jsonGenerator.writeStartObject();
        jsonGenerator.writeStringField(
          "favoriteColor", 
          getColorAsWebColor(user.getFavoriteColor()));
        jsonGenerator.writeEndObject();
    }

    private static String getColorAsWebColor(Color color) {
        int r = (int) Math.round(color.getRed() * 255.0);
        int g = (int) Math.round(color.getGreen() * 255.0);
        int b = (int) Math.round(color.getBlue() * 255.0);
        return String.format("#%02x%02x%02x", r, g, b);
    }
}

With this serializer, the resulting JSON has been reduced to:

{"favoriteColor":"#f0f8ff"}
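The hex string comes from the String.format() call in getColorAsWebColor(). For Color.ALICEBLUE, whose 0-255 RGB components are (240, 248, 255), we can verify the formatting with plain Java:

```java
public class WebColorFormatCheck {
    public static void main(String[] args) {
        // ALICEBLUE in 0-255 RGB is (240, 248, 255)
        System.out.println(String.format("#%02x%02x%02x", 240, 248, 255));
    }
}
```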

Due to the @JsonComponent annotation, the serializer is registered in the Jackson ObjectMapper in the Spring Boot application. We can test this with the following JUnit test:

@JsonTest
@RunWith(SpringRunner.class)
public class UserJsonSerializerTest {

    @Autowired
    private ObjectMapper objectMapper;

    @Test
    public void testSerialization() throws JsonProcessingException {
        User user = new User(Color.ALICEBLUE);
        String json = objectMapper.writeValueAsString(user);
 
        assertEquals("{\"favoriteColor\":\"#f0f8ff\"}", json);
    }
}

3. Deserialization

Continuing with the same example, we can write a deserializer that will turn the web color String into a JavaFX Color object:

@JsonComponent
public class UserJsonDeserializer extends JsonDeserializer<User> {
 
    @Override
    public User deserialize(JsonParser jsonParser, 
      DeserializationContext deserializationContext) throws IOException, 
      JsonProcessingException {
 
        TreeNode treeNode = jsonParser.getCodec().readTree(jsonParser);
        TextNode favoriteColor
          = (TextNode) treeNode.get("favoriteColor");
        return new User(Color.web(favoriteColor.asText()));
    }
}

Let’s test the new deserializer and make sure everything works as expected:

@JsonTest
@RunWith(SpringRunner.class)
public class UserJsonDeserializerTest {

    @Autowired
    private ObjectMapper objectMapper;

    @Test
    public void testDeserialize() throws IOException {
        String json = "{\"favoriteColor\":\"#f0f8ff\"}";
        User user = objectMapper.readValue(json, User.class);
 
        assertEquals(Color.ALICEBLUE, user.getFavoriteColor());
    }
}

4. Serializer and Deserializer in one Class

When desired, we can combine the serializer and the deserializer in one class by using two inner classes and adding @JsonComponent to the enclosing class:

@JsonComponent
public class UserCombinedSerializer {
 
    public static class UserJsonSerializer 
      extends JsonSerializer<User> {

        @Override
        public void serialize(User user, JsonGenerator jsonGenerator, 
          SerializerProvider serializerProvider) throws IOException, 
          JsonProcessingException {
 
            jsonGenerator.writeStartObject();
            jsonGenerator.writeStringField(
              "favoriteColor", getColorAsWebColor(user.getFavoriteColor()));
            jsonGenerator.writeEndObject();
        }

        private static String getColorAsWebColor(Color color) {
            int r = (int) Math.round(color.getRed() * 255.0);
            int g = (int) Math.round(color.getGreen() * 255.0);
            int b = (int) Math.round(color.getBlue() * 255.0);
            return String.format("#%02x%02x%02x", r, g, b);
        }
    }

    public static class UserJsonDeserializer 
      extends JsonDeserializer<User> {
 
        @Override
        public User deserialize(JsonParser jsonParser, 
          DeserializationContext deserializationContext)
          throws IOException, JsonProcessingException {
 
            TreeNode treeNode = jsonParser.getCodec().readTree(jsonParser);
            TextNode favoriteColor = (TextNode) treeNode.get(
              "favoriteColor");
            return new User(Color.web(favoriteColor.asText()));
        }
    }
}

5. Conclusion

This quick tutorial showed how to quickly add a Jackson serializer/deserializer in a Spring Boot application by leveraging component scanning with the @JsonComponent annotation.

The code snippets can be found over on GitHub.

Dynamic DTO Validation Config Retrieved from DB


1. Overview

In this tutorial, we’re going to take a look at how we can create a custom validation annotation that uses a regular expression retrieved from a database to match against the field value.

We will use Hibernate Validator as a base implementation.

2. Maven Dependencies

For development, we will need the following dependencies:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-thymeleaf</artifactId>
    <version>1.5.2.RELEASE</version>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
    <version>1.5.2.RELEASE</version>
</dependency>
<dependency>
    <groupId>com.h2database</groupId>
    <artifactId>h2</artifactId>
    <version>1.4.194</version>
</dependency>

The latest versions of spring-boot-starter-thymeleaf, spring-boot-starter-data-jpa, and h2 can be downloaded from Maven Central.

3. Custom Validation Annotation

For our example, we will create a custom annotation called @ContactInfo that will validate a value against a regular expression retrieved from a database. We will then apply this validation on the contactInfo field of a POJO class called Customer.

To retrieve regular expressions from a database, we will model these as a ContactInfoExpression entity class.

3.1. Data Models and Repository

Let’s create the Customer class with id and contactInfo fields:

@Entity
public class Customer {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private long id;

    private String contactInfo;

    // standard constructor, getters, setters
}

Next, let’s take a look at the ContactInfoExpression class – which will hold the regular expression values in a property called pattern:

@Entity
public class ContactInfoExpression {

    @Id
    @Column(name="expression_type")
    private String type;
 
    private String pattern;

    // standard constructor, getters, setters
}

Next, let’s add a repository interface based on Spring Data to manipulate the ContactInfoExpression entities:

public interface ContactInfoExpressionRepository 
  extends Repository<ContactInfoExpression, String> {
 
    Optional<ContactInfoExpression> findOne(String id);
}

3.2. Database Setup

For storing regular expressions, we will use an H2 in-memory database with the following persistence configuration:

@EnableJpaRepositories("com.baeldung.dynamicvalidation.dao")
@EntityScan("com.baeldung.dynamicvalidation.model")
@Configuration
public class PersistenceConfig {

    @Bean
    public JdbcTemplate getJdbcTemplate() {
        return new JdbcTemplate(dataSource());
    }

    @Bean
    public DataSource dataSource() {
        EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
        EmbeddedDatabase db = builder.setType(EmbeddedDatabaseType.H2)
          .addScript("schema-expressions.sql")
          .addScript("data-expressions.sql")
          .build();
        return db;
    }
}

The two scripts mentioned are used for creating the schema and inserting the data into the contact_info_expression table:

CREATE TABLE contact_info_expression(
  expression_type varchar(50) not null,
  pattern varchar(500) not null,
  PRIMARY KEY ( expression_type )
);

The data-expressions.sql script will add three records to represent the types email, phone, and website. These are regular expressions for validating that a value is a valid email address, a valid US phone number, or a valid URL:

insert into contact_info_expression values ('email',
  '[a-z0-9!#$%&*+/=?^_`{|}~-]+(?:\.[a-z0-9!#$%&*+/=?^_`{|}~-]+)*@(?:[a-z0-9](?:[a-z0-9-]*[a-z0-9])?\.)+[a-z0-9](?:[a-z0-9-]*[a-z0-9])?');
insert into contact_info_expression values ('phone',
  '^([0-9]( |-)?)?(\(?[0-9]{3}\)?|[0-9]{3})( |-)?([0-9]{3}( |-)?[0-9]{4}|[a-zA-Z0-9]{7})$');
insert into contact_info_expression values ('website',
  '^(http:\/\/www\.|https:\/\/www\.|http:\/\/|https:\/\/)?[a-z0-9]+([\-\.]{1}[a-z0-9]+)*\.[a-z]{2,5}(:[0-9]{1,5})?(\/.*)?$');
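Since these expressions drive validation at runtime, it's worth sanity-checking them with plain java.util.regex before inserting them. Here is a quick check of the phone pattern (the sample inputs are illustrative; note the backslashes must be doubled in a Java string literal):

```java
import java.util.regex.Pattern;

public class RegexSanityCheck {
    public static void main(String[] args) {
        // the 'phone' pattern from data-expressions.sql, escaped for Java
        String phone = "^([0-9]( |-)?)?(\\(?[0-9]{3}\\)?|[0-9]{3})( |-)?([0-9]{3}( |-)?[0-9]{4}|[a-zA-Z0-9]{7})$";
        System.out.println(Pattern.matches(phone, "(555) 123-4567")); // matches
        System.out.println(Pattern.matches(phone, "not-a-phone"));    // does not
    }
}
```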

3.3. Creating the Custom Validator

Let’s create the ContactInfoValidator class that contains the actual validation logic. Following Java Validation specification guidelines, the class implements the ConstraintValidator interface and overrides the isValid() method.

This class will obtain the value of the currently used type of contact info — email, phone, or website — which is set in a property called contactInfoType, then use it to retrieve the regular expression’s value from the database:

public class ContactInfoValidator implements ConstraintValidator<ContactInfo, String> {

    @Value("${contactInfoType}")
    private String expressionType;
 
    @Autowired
    private ContactInfoExpressionRepository expressionRepository;

    @Override
    public void initialize(ContactInfo contactInfo) { }

    @Override
    public boolean isValid(String value, ConstraintValidatorContext context) {
        if (StringUtils.isEmptyOrWhitespace(expressionType)) {
            return false;
        }
        return expressionRepository
          .findOne(expressionType)
          .map(ContactInfoExpression::getPattern)
          .map(p -> Pattern.matches(p, value))
          .orElse(false);
        }
    }
}

The contactInfoType property can be set in the application.properties file to one of the values email, phone or website:

contactInfoType=email

3.4. Creating the Custom Constraint Annotation

And now, let’s create the annotation interface for our custom constraint:

@Constraint(validatedBy = { ContactInfoValidator.class })
@Target({ METHOD, FIELD, ANNOTATION_TYPE, CONSTRUCTOR, PARAMETER })
@Retention(RetentionPolicy.RUNTIME)
public @interface ContactInfo {
    String message() default "Invalid value";

    Class<?>[] groups() default {};

    Class<? extends Payload>[] payload() default {};
}

3.5. Applying the Custom Constraint

Finally, let’s add the validation annotation we have created to the contactInfo field of our Customer class:

public class Customer {
    
    // ...
    @ContactInfo
    private String contactInfo;
    
    // ...
}

4. Spring Controller and HTML Form

To test our validation annotation, we will create a Spring MVC request mapping that uses the @Valid annotation to trigger the validation of a Customer object:

@PostMapping("/customer")
public String validateCustomer(@Valid Customer customer, BindingResult result, Model model) {
    if (result.hasErrors()) {
        model.addAttribute("message", "The information is invalid!");
    } else {
        model.addAttribute("message", "The information is valid!");
    }
    return "customer";
}

The Customer object is sent to the controller from an HTML form:

<form action="customer" method="POST">
Contact Info: <input type="text" name="contactInfo" /> <br />
<input type="submit" value="Submit" />
</form>
<span th:text="${message}"></span>

To wrap it all up, we can run our application as a Spring Boot application:

@SpringBootApplication
public class DynamicValidationApp {
    public static void main(String[] args) {
        SpringApplication.run(DynamicValidationApp.class, args);
    }
}

5. Conclusion

In this example, we have shown how we can create a custom validation annotation that retrieves a regular expression dynamically from a database and uses it to validate the annotated field.

The full source code of the example can be found over on GitHub.

Converters, Listeners and Validators in Java EE 7


1. Overview

Java Enterprise Edition (JEE) 7 provides some useful features, such as validating user input and converting values into appropriate Java data types.

In this tutorial, we’ll focus on those features provided by converters, listeners, and validators.

2. Converters

A converter allows us to transform string input values into Java data types. Predefined converters are located in the javax.faces.convert package, and they are compatible with any Java data type or even standard classes like Date.

To define an Integer converter, first we create our property in the managed bean used as a back end of our JSF form:

private Integer age;
	 
// getters and setters

Then we create the component in our form using the f:converter tag:

<h:outputLabel value="Age:"/>
<h:inputText id="Age" value="#{convListVal.age}">
    <f:converter converterId="javax.faces.Integer" />
</h:inputText>
<h:message for="Age" />

In a similar way, we create the other numeric converters like the Double converter:

private Double average;

Then we create the appropriate JSF component in our view. Please note that we are using the variable average, which is then mapped to the field using the getter and the setter by name convention:

<h:outputLabel value="Average:"/>
<h:inputText id="Average" value="#{convListVal.average}">
    <f:converter converterId="javax.faces.Double" />
</h:inputText>
<h:message for="Average" />

If we want to give feedback to the user, we need to include an h:message tag to be used by the control as a placeholder for error messages.

A useful converter is the DateTime converter because it allows us to validate dates, times and format these values.

First, as in previous converters, we declare our field with the getters and setters:

private Date myDate;
// getters and setters

Then we create the component in our view. Here, the date must be entered using the given pattern; otherwise, we get an error showing an example of correctly formatted input:

<h:outputLabel value="Date:"/>
<h:inputText id="MyDate" value="#{convListVal.myDate}">
    <f:convertDateTime pattern="dd/MM/yyyy" />
</h:inputText>
<h:message for="MyDate" />
<h:outputText value="#{convListVal.myDate}">
    <f:convertDateTime dateStyle="full" locale="en"/>
</h:outputText>

In our case, we can convert our input date and display the posted data, formatted as a full date, in our h:outputText.
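The dd/MM/yyyy pattern follows the java.text.SimpleDateFormat conventions that f:convertDateTime relies on, so we can try the same pattern in plain Java (the sample date is illustrative):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DatePatternDemo {
    public static void main(String[] args) throws ParseException {
        // the same pattern the JSF converter uses
        SimpleDateFormat fmt = new SimpleDateFormat("dd/MM/yyyy");
        fmt.setLenient(false); // reject malformed dates, as the converter would
        Date parsed = fmt.parse("24/05/2017");
        System.out.println(fmt.format(parsed));
    }
}
```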

3. Listeners

A listener allows us to monitor changes in our components; here, we will monitor when the value of a text field changes.

As before we define the properties in our managed bean:

private String name;

Then we define our listener in the view:

<h:outputLabel value="Name:"/>
<h:inputText id="name" size="30" value="#{convListVal.name}">
    <f:valueChangeListener type="com.baeldung.convListVal.MyListener" />
</h:inputText>

We set up our h:inputText tag by adding an f:valueChangeListener; inside the listener tag, we need to specify the class that will perform the tasks when the listener is triggered:

public class MyListener implements ValueChangeListener {
    private static final Logger LOG = Logger.getLogger(MyListener.class.getName());	
        
    @Override
    public void processValueChange(ValueChangeEvent event)
      throws AbortProcessingException {
        if (event.getNewValue() != null) {
            LOG.log(Level.INFO, "\tNew Value:{0}", event.getNewValue());
        }
    }
}

The listener class must implement the ValueChangeListener interface and override the processValueChange() method to perform the listener’s task, in this case writing a log message.

4. Validators

We use a validator to validate JSF component data; JSF provides a set of standard classes for validating user input.

Here, we define a standard validator to check the length of the user input in a text field.

First, we create our field in the managed bean:

private String surname;

Then we create our component in the view:

<h:outputLabel value="surname" for="surname"/>
<h:panelGroup>
    <h:inputText id="surname" value="#{convListVal.surname}">
        <f:validateLength minimum="5" maximum="10"/>
    </h:inputText>
    <h:message for="surname" errorStyle="color:red"  />
</h:panelGroup>

Inside the h:inputText tag we put our validator, to validate the length of the input. Please remember that there are various standard validators predefined in JSF and we can use them in a similar way to the one presented here.

5. Tests

To test this JSF application, we are going to use Arquillian to perform functional testing with Drone, Graphene, and Selenium WebDriver.

First, we deploy our application using ShrinkWrap:

@Deployment(testable = false)
public static WebArchive createDeployment() {
    return (ShrinkWrap.create(
      WebArchive.class, "jee7.war").
      addClasses(ConvListVal.class, MyListener.class)).
      addAsWebResource(new File(WEBAPP_SRC, "ConvListVal.xhtml")).
      addAsWebInfResource(EmptyAsset.INSTANCE, "beans.xml");
}

Then we test the error messages of each component to verify that our application is working properly:

@Test
@RunAsClient
public void givenAge_whenAgeInvalid_thenErrorMessage() throws Exception {
    browser.get(deploymentUrl.toExternalForm() + "ConvListVal.jsf");
    ageInput.sendKeys("stringage");
    guardHttp(sendButton).click();
    assertTrue("Show Age error message",
      browser.findElements(By.id("myForm:ageError")).size() > 0);
}

Similar tests are performed on each component.

6. Summary

In this tutorial, we created implementations of converters, listeners, and validators provided by Java EE 7.

You can find the code from the article over on GitHub.
