
Entity To DTO Conversion for a Spring REST API

I usually post about REST APIs and HTTP on Twitter - you can follow me there:

1. Overview

In this tutorial, we’ll handle the conversions that need to happen between the internal entities of a Spring application and the external DTOs (Data Transfer Objects) that are published back to the client.

2. Model Mapper

Let’s start by introducing the main library that we’re going to use to perform this entity-DTO conversion – ModelMapper.

We will need this dependency in the pom.xml:

<dependency>
    <groupId>org.modelmapper</groupId>
    <artifactId>modelmapper</artifactId>
    <version>0.7.4</version>
</dependency>

To check if there’s any newer version of this library, have a look over on Maven Central.

We’ll then define the ModelMapper bean in our Spring configuration:

@Bean
public ModelMapper modelMapper() {
    return new ModelMapper();
}

3. The DTO

Next, let’s introduce the DTO side of this two-sided problem – Post DTO:

public class PostDto {
    private static final SimpleDateFormat dateFormat = 
      new SimpleDateFormat("yyyy-MM-dd HH:mm");

    private Long id;

    private String title;

    private String url;

    private String date;

    private UserDto user;

    public Date getSubmissionDateConverted(String timezone) throws ParseException {
        dateFormat.setTimeZone(TimeZone.getTimeZone(timezone));
        return dateFormat.parse(this.date);
    }

    public void setSubmissionDate(Date date, String timezone) {
        dateFormat.setTimeZone(TimeZone.getTimeZone(timezone));
        this.date = dateFormat.format(date);
    }
    // standard getters and setters
}

Note that the two custom date-related methods handle the date conversion back and forth between the client and the server:

  • getSubmissionDateConverted() parses the DTO’s date String into a Date in the given timezone, to be used when persisting the Post entity
  • setSubmissionDate() formats the Post’s Date into the DTO’s date String, in the current user’s timezone
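The interplay of these two methods can be sketched in isolation with a plain SimpleDateFormat; the pattern below mirrors the DTO above, while the GMT+2 timezone and the fixed instant are purely illustrative:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class DateRoundTripSketch {
    public static void main(String[] args) throws ParseException {
        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm");
        dateFormat.setTimeZone(TimeZone.getTimeZone("GMT+2"));

        // setSubmissionDate direction: format a fixed instant in the user's timezone
        Date instant = new Date(0L); // 1970-01-01T00:00:00Z
        String dtoDate = dateFormat.format(instant);

        // getSubmissionDateConverted direction: parse the String back with the same timezone
        Date roundTripped = dateFormat.parse(dtoDate);

        System.out.println(dtoDate + " -> " + roundTripped.getTime());
    }
}
```

As long as the same timezone is used in both directions, the original instant survives the round trip.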

4. The Service Layer

Let’s now look at a service level operation – which will obviously work with the Entity (not the DTO):

public List<Post> getPostsList(int page, int size, String sortDir, String sort) {
    PageRequest pageReq = 
      new PageRequest(page, size, Sort.Direction.fromString(sortDir), sort);
    Page<Post> posts = postRepository.findByUser(userService.getCurrentUser(), pageReq);
    return posts.getContent();
}

We’re going to have a look at the layer above the service next – the controller layer. This is also where the conversion will actually happen.

5. The Controller Layer

Let’s now have a look at a standard controller implementation, exposing the simple REST API for the Post resource.

We’re going to show here a few simple CRUD operations: create, update, get one and get all. And given the operations are pretty straightforward, we are especially interested in the Entity-DTO conversion aspects:

@Controller
class PostRestController {

    @Autowired
    private IPostService postService;

    @Autowired
    private IUserService userService;

    @Autowired
    private ModelMapper modelMapper;

    @RequestMapping(method = RequestMethod.GET)
    @ResponseBody
    public List<PostDto> getPosts(...) {
        ...
        List<Post> posts = postService.getPostsList(page, size, sortDir, sort);
        return posts.stream()
          .map(post -> convertToDto(post)).collect(Collectors.toList());
    }

    @RequestMapping(method = RequestMethod.POST)
    @ResponseStatus(HttpStatus.CREATED)
    @ResponseBody
    public PostDto createPost(@RequestBody PostDto postDto) throws ParseException {
        Post post = convertToEntity(postDto);
        Post postCreated = postService.createPost(post);
        return convertToDto(postCreated);
    }

    @RequestMapping(value = "/{id}", method = RequestMethod.GET)
    @ResponseBody
    public PostDto getPost(@PathVariable("id") Long id) {
        return convertToDto(postService.getPostById(id));
    }

    @RequestMapping(value = "/{id}", method = RequestMethod.PUT)
    @ResponseStatus(HttpStatus.OK)
    public void updatePost(@RequestBody PostDto postDto) throws ParseException {
        Post post = convertToEntity(postDto);
        postService.updatePost(post);
    }
}

And here is our conversion from Post entity to PostDto:

private PostDto convertToDto(Post post) {
    PostDto postDto = modelMapper.map(post, PostDto.class);
    postDto.setSubmissionDate(
      post.getSubmissionDate(), userService.getCurrentUser().getPreference().getTimezone());
    return postDto;
}

And here is the conversion from DTO to entity:

private Post convertToEntity(PostDto postDto) throws ParseException {
    Post post = modelMapper.map(postDto, Post.class);
    post.setSubmissionDate(
      postDto.getSubmissionDateConverted(
      userService.getCurrentUser().getPreference().getTimezone()));
    if (postDto.getId() != null) {
        Post oldPost = postService.getPostById(postDto.getId());
        post.setRedditID(oldPost.getRedditID());
        post.setSent(oldPost.isSent());
    }
    return post;
}

So, as you can see, with the help of ModelMapper the conversion logic is quick and simple: we use the mapper’s map API and get the data converted without writing a single line of conversion logic.

6. Unit Testing

Finally, let’s do a very simple test to make sure the conversions between the entity and the DTO work well:

public class PostDtoUnitTest {

    private ModelMapper modelMapper = new ModelMapper();

    @Test
    public void whenConvertPostEntityToPostDto_thenCorrect() {
        Post post = new Post();
        post.setId(Long.valueOf(1));
        post.setTitle(randomAlphabetic(6));
        post.setUrl("www.test.com");

        PostDto postDto = modelMapper.map(post, PostDto.class);
        assertEquals(post.getId(), postDto.getId());
        assertEquals(post.getTitle(), postDto.getTitle());
        assertEquals(post.getUrl(), postDto.getUrl());
    }

    @Test
    public void whenConvertPostDtoToPostEntity_thenCorrect() {
        PostDto postDto = new PostDto();
        postDto.setId(Long.valueOf(1));
        postDto.setTitle(randomAlphabetic(6));
        postDto.setUrl("www.test.com");

        Post post = modelMapper.map(postDto, Post.class);
        assertEquals(postDto.getId(), post.getId());
        assertEquals(postDto.getTitle(), post.getTitle());
        assertEquals(postDto.getUrl(), post.getUrl());
    }
}

7. Conclusion

This was a quick and to the point article on simplifying the conversion from Entity to DTO and from DTO to Entity in a Spring REST API, by using the model mapper library instead of writing these conversions by hand.



Baeldung Weekly Review 34


At the very beginning of last year, I decided to track my reading habits and share the best stuff here, on Baeldung. Haven’t missed a review since.

Here we go…

1. Spring and Java

>> Impulse: “Adventures On The Road to Valhalla” [codefx.org]

Detailed breakdown of the current progress and state of Project Valhalla – which is supposed to land in Java 10 or 11.

The article is following a recent talk that I shared in the review from last week.

>> 8u60 Update Release Notes (+covered on Voxxed) [oracle.com]

A “fixes and improvements” new release of the JDK. Besides just being up to date – here’s specifically why I care.

>> Parameterized integration tests with Spring JUnit Rules [codeleak.pl]

Practical and timely illustration on how to put the new JUnit rules from Spring 4.2 to good use.

>> Migrating a Spring Web MVC application from JSP to AngularJS [spring.io]

A very nice guide on migrating an MVC-style app to a REST API, which then makes it possible to move from server-side JSP to a full client-side AngularJS front end.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> Introduction to Akka Actors [codecentric.de]

Continues the introduction to Akka series with this well put together second installment.

>> Will there be a Distributed HTTP? [mnot.net]

Really good, in-depth writeup on the architecture of the web, potential privacy risks and one possible way to move forward.

>> Eliminating Roundtrips with Preconnect [igvita.com]

Very cool new spec to make the web faster.

>> Tips and Tricks on Optimizing Amazon Web Services [netguru.co]

Interesting cost and speed comparison between S3 and a CDN for serving static assets at scale.

>> Call me Maybe: Chronos [aphyr.com]

Another fantastic, in-depth read in the Call me Maybe series. I’m really enjoying these in-depth writeups of technologies I’m actually using.

Also worth reading:

3. Musings

>> The parable of the 57.6k modem [dandreamsofcoding.com]

Don’t get trapped in a local maximum.

>> Developers Who Can Build Things from Scratch [aaronstannard.com]

Learning to put your work out there often takes being intentional about it. Otherwise, it’s far too easy to land in a job where that simply isn’t part of the engineering culture.

I didn’t do too much of that in the first 3 years of my career; so, after that, I made it a point to prioritize the “release early and often” culture when I took on my next job. So yeah, a good quick write-up with a few solid take-aways.

>> The difference between time and attention [signalvnoise.com]

A good distinction – when you’re asking for someone’s time, you’re also asking for their attention. And though they might have the time, they might not have the attention.

Also worth reading:

4. Comics

And my favorite Dilberts of the week:

>> It’s like stabbing Gandhi

>> You don’t have any friends, do you?

>> Said the engineer with no budget

5. Pick of the Week

Earlier this year I introduced the “Pick of the Week” section here in my “Weekly Review”. If you’re already on my email list – you got the pick already – hope you enjoyed it.

If not – you can share the review and unlock it right here:

Spring Data MongoDB – Indexes, Annotations and Converters


1. Overview

This tutorial will explore some of the core features of Spring Data MongoDB – indexing, common annotations and converters.

2. Indexes

2.1. @Indexed

This annotation marks the field as indexed in MongoDB:

@QueryEntity
@Document
public class User {
    @Indexed
    private String name;
    
    ... 
}

Now that the name field is indexed – let’s have a look at the indexes in MongoDB:

db.user.getIndexes();

Here’s what we have at the database level:

[
    {
        "v" : 1,
        "key" : {
             "_id" : 1
         },
        "name" : "_id_",
        "ns" : "test.user"
    },
    {
         "v" : 1,
         "key" : {
             "name" : 1
          },
          "name" : "name",
          "ns" : "test.user"
     }
]

As you can see, we have two indexes – one of them is _id – which was created by default due to the @Id annotation and the second one is our name field.

2.2. Create an Index Programmatically

We can also create an index programmatically:

mongoOps.indexOps(User.class)
  .ensureIndex(new Index().on("name", Direction.ASC));

We’ve now created an index for the field name and the result will be the same as in the previous section.

2.3. Compound Indexes

MongoDB supports compound indexes, where a single index structure holds references to multiple fields.

Let’s see a quick example using compound indexes:

@QueryEntity
@Document
@CompoundIndexes({
    @CompoundIndex(name = "email_age", def = "{'email.id' : 1, 'age': 1}")
})
public class User {
    //
}

We created a compound index with the email and age fields. Let’s now check out the actual indexes:

{
    "v" : 1,
    "key" : {
        "email.id" : 1,
        "age" : 1
    },
    "name" : "email_age",
    "ns" : "test.user"
}

Note that a DBRef field cannot be marked with @Indexed – such a field can only be part of a compound index.

3. Common Annotations

3.1. @Transient

As you would expect, this simple annotation excludes the field from being persisted in the database:

public class User {
    
    @Transient
    private Integer yearOfBirth;
    // standard getter and setter

}

Let’s insert a user with the yearOfBirth field set:

User user = new User();
user.setName("Alex");
user.setYearOfBirth(1985);
mongoTemplate.insert(user);

Now, if we look at the state of the database, we see that the field yearOfBirth was not saved:

{
    "_id" : ObjectId("55d8b30f758fd3c9f374499b"),
    "name" : "Alex",
    "age" : null
}

So if we query and check:

mongoTemplate.findOne(Query.query(Criteria.where("name").is("Alex")), User.class).getYearOfBirth()

The result will be null.

3.2. @Field

@Field indicates the key to be used for the field in the JSON document:

@Field("email")
private EmailAddress emailAddress;

Now emailAddress will be saved in the database using the key email:

User user = new User();
user.setName("Brendan");
EmailAddress emailAddress = new EmailAddress();
emailAddress.setValue("a@gmail.com");
user.setEmailAddress(emailAddress);
mongoTemplate.insert(user);

And the state of the database:

{
    "_id" : ObjectId("55d076d80bad441ed114419d"),
    "name" : "Brendan",
    "age" : null,
    "email" : {
        "value" : "a@gmail.com"
    }
}

3.3. @PersistenceConstructor and @Value

@PersistenceConstructor marks a constructor, even one that’s package protected, to be the primary constructor used by the persistence logic. The constructor arguments are mapped by name to the key values in the retrieved DBObject.

Let’s look at this constructor for our User class:

@PersistenceConstructor
public User(String name, @Value("#root.age ?: 0") Integer age, EmailAddress emailAddress) {
    this.name =  name;
    this.age = age;
    this.emailAddress =  emailAddress;
}

Notice the use of the standard Spring @Value annotation here. With the help of this annotation, we can use Spring expressions to transform a key’s value retrieved from the database before it’s used to construct a domain object. That is a very powerful and highly useful feature.

In our example, if age is not set, it will default to 0.

Let’s now see how it works:

User user = new User();
user.setName("Alex");
mongoTemplate.insert(user);

Our database will look like this:

{
    "_id" : ObjectId("55d074ca0bad45f744a71318"),
    "name" : "Alex",
    "age" : null
}

So the age field is null, but when we query the document and retrieve age:

mongoTemplate.findOne(Query.query(Criteria.where("name").is("Alex")), User.class).getAge();

The result will be 0.

4. Converters

Let’s now take a look at another very useful feature in Spring Data MongoDB – converters, and specifically at the MongoConverter.

This is used to handle the mapping of all Java types to DBObjects when storing and querying these objects.

We have two options – we can either work with MappingMongoConverter – or SimpleMongoConverter in earlier versions (this was deprecated in Spring Data MongoDB M3 and its functionality has been moved into MappingMongoConverter).

Or we can write our own custom converter. To do that, we would need to implement the Converter interface and register the implementation in MongoConfig.

Let’s look at a quick example. As you’ve seen in some of the JSON output here, all objects saved in the database have a _class field, which is saved automatically. If, however, we’d like to skip that particular field during persistence, we can do so using a MappingMongoConverter.

First – here’s the custom converter implementation:

@Component
public class UserWriterConverter implements Converter<User, DBObject> {
    @Override
    public DBObject convert(User user) {
        DBObject dbObject = new BasicDBObject();
        dbObject.put("name", user.getName());
        dbObject.put("age", user.getAge());
        if (user.getEmailAddress() != null) {
            DBObject emailDbObject = new BasicDBObject();
            emailDbObject.put("value", user.getEmailAddress().getValue());
            dbObject.put("email", emailDbObject);
        }
        dbObject.removeField("_class");
        return dbObject;
    }
}

Notice how easily we achieve the goal of not persisting _class: we simply remove the field directly here.

Now we need to register the custom converter:

private List<Converter<?,?>> converters = new ArrayList<Converter<?,?>>();

@Override
public CustomConversions customConversions() {
    converters.add(new UserWriterConverter());
    return new CustomConversions(converters);
}

We can of course achieve the same result with XML configuration as well, if we need to:

<bean id="mongoTemplate" 
  class="org.springframework.data.mongodb.core.MongoTemplate">
    <constructor-arg name="mongo" ref="mongo"/>
    <constructor-arg ref="mongoConverter" />
    <constructor-arg name="databaseName" value="test"/>
</bean>

<mongo:mapping-converter id="mongoConverter" base-package="org.baeldung.converter">
    <mongo:custom-converters base-package="org.baeldung.converter" />
</mongo:mapping-converter>

Now, when we save a new user:

User user = new User();
user.setName("Chris");
mongoOps.insert(user);

The resulting document in the database no longer contains the class information:

{
    "_id" : ObjectId("55cf09790bad4394db84b853"),
    "name" : "Chris",
    "age" : null
}

5. Conclusion

In this tutorial we’ve covered some core concepts of working with Spring Data MongoDB – indexing, common annotations and converters.

The implementation of all these examples and code snippets can be found in my github project – this is an Eclipse based project, so it should be easy to import and run as it is.

Custom Cascading in Spring Data MongoDB


1. Overview

This tutorial will continue to explore some of the core features of Spring Data MongoDB – the @DBRef annotation and lifecycle events.

2. @DBRef

The mapping framework doesn’t support storing parent-child relations and embedded documents within other documents. What we can do, though, is store them separately and use a DBRef to refer between the documents.

When the object is loaded from MongoDB, those references will be eagerly resolved and we’ll get back a mapped object that looks the same as if it had been stored embedded within our master document.

Let’s look at some code:

@DBRef
private EmailAddress emailAddress;

EmailAddress looks like:

@Document
public class EmailAddress {
    @Id
    private String id;
    
    private String value;
    
    // standard getters and setters
}

Note that the mapping framework doesn’t handle cascading operations. So – for instance – if we trigger a save on a parent, the child won’t be saved automatically – we’ll need to explicitly trigger the save on the child if we want to save it as well.

This is exactly where lifecycle events really come in handy.

3. Lifecycle Events

Spring Data MongoDB publishes some very useful lifecycle events – such as onBeforeConvert, onBeforeSave, onAfterSave, onAfterLoad and onAfterConvert.

To intercept one of the events, we need to register a subclass of AbstractMongoEventListener and override one of its methods. When the event is dispatched, our listener will be called with the domain object passed in.

3.1. Basic Cascade Save

Let’s look at the example we had earlier – saving the user with the emailAddress. We can now listen to the onBeforeConvert event which will be called before a domain object goes into the converter:

public class UserCascadeSaveMongoEventListener extends AbstractMongoEventListener<Object> {
    @Autowired
    private MongoOperations mongoOperations;

    @Override
    public void onBeforeConvert(final Object source) {
        if (source instanceof User && 
          ((User) source).getEmailAddress() != null) {
            mongoOperations.save(((User) source).getEmailAddress());
        }
    }
}

Now we just need to register the listener into MongoConfig:

@Bean
public UserCascadeSaveMongoEventListener userCascadingMongoEventListener() {
    return new UserCascadeSaveMongoEventListener();
}

Or as xml:

<bean class="org.baeldung.event.UserCascadeSaveMongoEventListener" />

And we have cascading semantics all done – albeit only for the user.

3.2. A Generic Cascade Implementation

Let’s now improve the previous solution by making the cascade functionality generic. Let’s start by defining a custom annotation:

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface CascadeSave {
    //
}
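Before wiring this into the listener, the core idea – scanning an object’s fields for the annotation via reflection – can be sketched with plain java.lang.reflect; the Parent class and its field values below are purely illustrative:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;

public class CascadeScanSketch {

    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.FIELD)
    @interface CascadeSave {}

    static class Parent {
        @CascadeSave
        String child = "childDoc";   // would be cascade-saved
        String plain = "notCascaded"; // would be ignored
    }

    public static void main(String[] args) throws Exception {
        Parent parent = new Parent();
        // scan every declared field and act only on the annotated ones
        for (Field field : Parent.class.getDeclaredFields()) {
            if (field.isAnnotationPresent(CascadeSave.class)) {
                field.setAccessible(true);
                System.out.println("would cascade-save: " + field.get(parent));
            }
        }
    }
}
```

Spring’s ReflectionUtils.doWithFields wraps exactly this kind of loop, handing each field to a callback.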

Let’s now work on our custom listener to handle these fields generically and not have to cast to any specific entity:

public class CascadeSaveMongoEventListener extends AbstractMongoEventListener<Object> {

    @Autowired
    private MongoOperations mongoOperations;

    @Override
    public void onBeforeConvert(Object source) {
        ReflectionUtils.doWithFields(source.getClass(), 
          new CascadeCallback(source, mongoOperations));
    }
}

So we’re using the reflection utility out of Spring and we’re running our own callback on all fields that meet our criteria:

@Override
public void doWith(Field field) throws IllegalArgumentException, IllegalAccessException {
    ReflectionUtils.makeAccessible(field);

    if (field.isAnnotationPresent(DBRef.class) && 
      field.isAnnotationPresent(CascadeSave.class)) {
    
        Object fieldValue = field.get(getSource());
        if (fieldValue != null) {
            FieldCallback callback = new FieldCallback();
            ReflectionUtils.doWithFields(fieldValue.getClass(), callback);

            getMongoOperations().save(fieldValue);
        }
    }
}

As you can see, we’re looking for fields that have both the DBRef annotation as well as CascadeSave. Once we find these fields, we save the child entity.

Let’s look at the FieldCallback class which we’re using to check if the child has an @Id annotation:

public class FieldCallback implements ReflectionUtils.FieldCallback {
    private boolean idFound;

    public void doWith(Field field) throws IllegalArgumentException, IllegalAccessException {
        ReflectionUtils.makeAccessible(field);

        if (field.isAnnotationPresent(Id.class)) {
            idFound = true;
        }
    }

    public boolean isIdFound() {
        return idFound;
    }
}

Finally, to make it all work together, the emailAddress field of course needs to be correctly annotated:

@DBRef
@CascadeSave
private EmailAddress emailAddress;

3.3. The Cascade Test

Let’s now have a look at a scenario – we save a User with emailAddress and the save operation cascades to this embedded entity automatically:

User user = new User();
user.setName("Brendan");
EmailAddress emailAddress = new EmailAddress();
emailAddress.setValue("b@gmail.com");
user.setEmailAddress(emailAddress);
mongoTemplate.insert(user);

Let’s check our database:

{
    "_id" : ObjectId("55cee9cc0badb9271768c8b9"),
    "name" : "Brendan",
    "age" : null,
    "email" : {
        "value" : "b@gmail.com"
    }
}

4. Conclusion

In this article we illustrated some cool features of Spring Data MongoDB – the @DBRef annotation, lifecycle events and how we can handle cascading intelligently.

The implementation of all these examples and code snippets can be found in my github project – this is an Eclipse based project, so it should be easy to import and run as it is.



Java Base64 Encoding and Decoding


1. Overview

In this tutorial, we’re going to explore the various utilities that provide Base64 encoding and decoding functionality in Java.

We’re mainly going to illustrate the new Java 8 APIs as well as the utility APIs coming out of Apache Commons.

2. Java 8 for Base 64

Java 8 has finally added Base64 capabilities to the standard API, via the java.util.Base64 utility class.

Let’s start by looking at the basic encoder.

2.1. Java 8 Basic Base64

The basic encoder keeps things simple and encodes the input as is – without any line separation.

The output is mapped to the A-Za-z0-9+/ character set, and the decoder rejects any character outside of this set.

Let’s first encode a simple String:

String originalInput = "test input";
String encodedString = Base64.getEncoder().encodeToString(originalInput.getBytes());

Note how we retrieve the full Encoder API via the simple getEncoder() utility method.

Let’s now decode that String back to the original form:

byte[] decodedBytes = Base64.getDecoder().decode(encodedString);
String decodedString = new String(decodedBytes);
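To illustrate the point about rejecting characters outside the basic alphabet, here’s a quick sketch feeding the basic decoder an input containing URL-safe-alphabet characters (the input string is just an example):

```java
import java.util.Base64;

public class StrictAlphabetSketch {
    public static void main(String[] args) {
        // '-' and '_' belong to the URL-safe alphabet, not the basic one,
        // so the basic decoder throws IllegalArgumentException
        try {
            Base64.getDecoder().decode("a-b_");
            System.out.println("decoded");
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

The same input decodes without complaint via Base64.getUrlDecoder().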

2.2. Java 8 Base64 Encoding without Padding

In Base64 encoding, the length of the encoded output is always a multiple of 4. If the input length is not a multiple of 3, the output is padded with additional pad characters (`=`).

On decoding, these extra padding characters are discarded. To dig deeper into padding in Base64, check out this detailed answer over on StackOverflow.

If you need to skip the padding of the output – perhaps because the resulting String will never be decoded back – you can simply choose to encode without padding:

String encodedString = 
  Base64.getEncoder().withoutPadding().encodeToString(originalInput.getBytes());
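For example, a 4-byte input encodes to 8 output characters, two of which are padding; withoutPadding() simply drops them:

```java
import java.util.Base64;

public class PaddingSketch {
    public static void main(String[] args) {
        // "test" is 4 bytes, so the last Base64 quantum needs two pad characters
        String padded = Base64.getEncoder().encodeToString("test".getBytes());
        String unpadded = Base64.getEncoder().withoutPadding()
          .encodeToString("test".getBytes());
        System.out.println(padded);   // dGVzdA==
        System.out.println(unpadded); // dGVzdA
    }
}
```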

2.3. Java 8 URL Encoding

URL encoding is very similar to the basic encoder we looked at above. It uses the URL and Filename safe Base64 alphabet and does not add any line separation:

String originalUrl = "https://www.google.co.nz/?gfe_rd=cr&ei=dzbFV&gws_rd=ssl#q=java";
String encodedUrl = Base64.getUrlEncoder().encodeToString(originalUrl.getBytes());

Decoding happens in much the same way – the getUrlDecoder() utility method returns a java.util.Base64.Decoder that is then used to decode the URL:

byte[] decodedBytes = Base64.getUrlDecoder().decode(encodedUrl);
String decodedUrl = new String(decodedBytes);

2.4. Java 8 MIME Encoding

Let’s start by generating some basic MIME input to encode:

private static StringBuilder getMimeBuffer() {
    StringBuilder buffer = new StringBuilder();
    for (int count = 0; count < 10; ++count) {
        buffer.append(UUID.randomUUID().toString());
    }
    return buffer;
}

The MIME encoder generates a Base64 encoded output using the basic alphabet but in a MIME friendly format: each line of the output is no longer than 76 characters and ends with a carriage return followed by a linefeed (\r\n):

StringBuilder buffer = getMimeBuffer();
byte[] encodedAsBytes = buffer.toString().getBytes();
String encodedMime = Base64.getMimeEncoder().encodeToString(encodedAsBytes);

The getMimeDecoder() utility method returns a java.util.Base64.Decoder that is then used in the decoding process:

byte[] decodedBytes = Base64.getMimeDecoder().decode(encodedMime);
String decodedMime = new String(decodedBytes);
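The line-wrapping behavior can be sketched directly; the 120-byte zero-filled array below is just a convenient input that is long enough to force a line break:

```java
import java.util.Base64;

public class MimeLineSketch {
    public static void main(String[] args) {
        byte[] input = new byte[120]; // 120 bytes -> 160 Base64 characters

        // the MIME encoder wraps at 76 characters with CRLF separators
        String mime = Base64.getMimeEncoder().encodeToString(input);
        String[] lines = mime.split("\r\n");
        System.out.println(lines[0].length());

        // the basic encoder emits the same data as one unbroken line
        System.out.println(Base64.getEncoder().encodeToString(input).contains("\r"));
    }
}
```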

3. Encoding/Decoding Using Apache Commons Codec

First, we need to define the commons-codec dependency in the pom.xml:

<dependency>
    <groupId>commons-codec</groupId>
    <artifactId>commons-codec</artifactId>
    <version>1.10</version>
</dependency>

Note that you can check if newer versions of the library have been released over on Maven Central.

The main API is the org.apache.commons.codec.binary.Base64 class – which can be parameterized with various constructors:

  • Base64(boolean urlSafe) – creates the Base64 API by controlling the URL-safe mode – on or off
  • Base64(int lineLength) – creates the Base64 API in URL-unsafe mode, controlling the length of the line (default is 76)
  • Base64(int lineLength, byte[] lineSeparator) – creates the Base64 API by accepting an extra line separator, which, by default is CRLF (“\r\n”)

Once the Base64 API is created, both encoding and decoding are quite simple:

String originalInput = "test input";
Base64 base64 = new Base64();
String encodedString = new String(base64.encode(originalInput.getBytes()));

The decode() method of the Base64 class returns the decoded String:

String decodedString = new String(base64.decode(encodedString.getBytes()));

Another simple option is using the static API of Base64 instead of creating an instance:

String originalInput = "test input";
String encodedString = new String(Base64.encodeBase64(originalInput.getBytes()));
String decodedString = new String(Base64.decodeBase64(encodedString.getBytes()));

4. Conclusion

This article explains the basics of how to do Base64 encoding and decoding in Java, using the new APIs introduced in Java 8 as well as Apache Commons.

Preserve the History of Reddit Post Submissions


1. Overview

In this installment of the Reddit App case study, we’re going to start keeping track of the history of submission attempts for a post, and make the statuses more descriptive and easy to understand.

2. Improving The Post Entity

First, let’s start by replacing the old String status in the Post entity with a much more complete list of submission responses, keeping track of a lot more information:

public class Post {
    ...
    @OneToMany(fetch = FetchType.EAGER, mappedBy = "post")
    private List<SubmissionResponse> submissionsResponse;
}

Next, let’s see what we’re actually keeping track of in this new submission response entity:

@Entity
public class SubmissionResponse implements IEntity {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    private int attemptNumber;

    private String content;

    private Date submissionDate;

    private Date scoreCheckDate;

    @JsonIgnore
    @ManyToOne
    @JoinColumn(name = "post_id", nullable = false)
    private Post post;

    public SubmissionResponse(int attemptNumber, String content, Post post) {
        super();
        this.attemptNumber = attemptNumber;
        this.content = content;
        this.submissionDate = new Date();
        this.post = post;
    }

    @Override
    public String toString() {
        StringBuilder builder = new StringBuilder();
        builder.append("Attempt No ").append(attemptNumber).append(" : ").append(content);
        return builder.toString();
    }
}

Note that each consumed submission attempt has a SubmissionResponse, where:

  • attemptNumber: the number of this attempt
  • content: the detailed response of this attempt
  • submissionDate: the submission date of this attempt
  • scoreCheckDate: the date we checked the score of the Reddit Post in this attempt

And here is the simple Spring Data JPA repository:

public interface SubmissionResponseRepository extends JpaRepository<SubmissionResponse, Long> {

    SubmissionResponse findOneByPostAndAttemptNumber(Post post, int attemptNumber);
}

3. Scheduling Service

We now need to start modifying the service layer to keep track of this extra information.

We’ll first make sure we have nicely formatted reasons for why the Post was considered a success or a failure:

private final static String SCORE_TEMPLATE = "score %d %s minimum score %d";
private final static String TOTAL_VOTES_TEMPLATE = "total votes %d %s minimum total votes %d";

protected String getFailReason(Post post, PostScores postScores) { 
    StringBuilder builder = new StringBuilder(); 
    builder.append("Failed because "); 
    builder.append(String.format(
      SCORE_TEMPLATE, postScores.getScore(), "<", post.getMinScoreRequired())); 
    
    if (post.getMinTotalVotes() > 0) { 
        builder.append(" and "); 
        builder.append(String.format(TOTAL_VOTES_TEMPLATE, 
          postScores.getTotalVotes(), "<", post.getMinTotalVotes()));
    } 
    if (post.isKeepIfHasComments()) { 
        builder.append(" and has no comments"); 
    } 
    return builder.toString(); 
}

protected String getSuccessReason(Post post, PostScores postScores) {
    StringBuilder builder = new StringBuilder(); 
    if (postScores.getScore() >= post.getMinScoreRequired()) { 
        builder.append("Succeed because "); 
        builder.append(String.format(SCORE_TEMPLATE, 
          postScores.getScore(), ">=", post.getMinScoreRequired())); 
        return builder.toString(); 
    } 
    if (
      (post.getMinTotalVotes() > 0) && 
      (postScores.getTotalVotes() >= post.getMinTotalVotes())
    ) { 
        builder.append("Succeed because "); 
        builder.append(String.format(TOTAL_VOTES_TEMPLATE, 
          postScores.getTotalVotes(), ">=", post.getMinTotalVotes()));
        return builder.toString(); 
    } 
    return "Succeed because has comments"; 
}
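As a quick sanity check on the templates, here's a self-contained sketch of the message getFailReason() builds for a post with score 2 and a minimum required score of 5 (the values are made up for illustration):

```java
public class ReasonFormatDemo {

    // same template as in the service above
    private static final String SCORE_TEMPLATE = "score %d %s minimum score %d";

    static String failReason(int score, int minScore) {
        return "Failed because " + String.format(SCORE_TEMPLATE, score, "<", minScore);
    }

    public static void main(String[] args) {
        System.out.println(failReason(2, 5));
    }
}
```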

Now, we’ll improve the old logic and keep track of this extra historical information:

private void submitPost(...) {
    ...
    if (errorNode == null) {
        post.setSubmissionsResponse(addAttemptResponse(post, "Submitted to Reddit"));
        ...
    } else {
        post.setSubmissionsResponse(addAttemptResponse(post, errorNode.toString()));
        ...
    }
}
private void checkAndReSubmit(Post post) {
    if (didIntervalPass(...)) {
        PostScores postScores = getPostScores(post);
        if (didPostGoalFail(post, postScores)) {
            ...
            resetPost(post, getFailReason(post, postScores));
        } else {
            ...
            updateLastAttemptResponse(
              post, "Post reached target score successfully " + 
                getSuccessReason(post, postScores));
        }
    }
}
private void checkAndDeleteInternal(Post post) {
    if (didIntervalPass(...)) {
        PostScores postScores = getPostScores(post);
        if (didPostGoalFail(post, postScores)) {
            updateLastAttemptResponse(post, 
              "Deleted from reddit, consumed all attempts without reaching score " + 
                getFailReason(post, postScores));
            ...
        } else {
            updateLastAttemptResponse(post, 
              "Post reached target score successfully " + 
                getSuccessReason(post, postScores));
            ...
        }
    }
}
private void resetPost(Post post, String failReason) {
    ...
    updateLastAttemptResponse(post, "Deleted from Reddit, to be resubmitted " + failReason);
    ...
}

Note what the lower-level methods are actually doing:

  • addAttemptResponse(): creates a new SubmissionResponse record and adds it to the Post (called on every submission attempt)
  • updateLastAttemptResponse(): updates the last attempt’s response (called when checking the post’s score)

4. Scheduled Post DTO

Next, we’ll modify the DTO to make sure this new information gets exposed back to the client:

public class ScheduledPostDto {
    ...

    private String status;

    private List<SubmissionResponseDto> detailedStatus;
}

And here’s the simple SubmissionResponseDto:

public class SubmissionResponseDto {

    private int attemptNumber;

    private String content;

    private String localSubmissionDate;

    private String localScoreCheckDate;
}

We’ll also modify the conversion method in our ScheduledPostRestController:

private ScheduledPostDto convertToDto(Post post) {
    ...
    List<SubmissionResponse> response = post.getSubmissionsResponse();
    if ((response != null) && !response.isEmpty()) {
        // guard against responses shorter than 30 characters
        String lastResponse = response.get(response.size() - 1).toString();
        postDto.setStatus(lastResponse.substring(0, Math.min(30, lastResponse.length())));
        List<SubmissionResponseDto> responsedto = 
          post.getSubmissionsResponse().stream().
            map(res -> generateResponseDto(res)).collect(Collectors.toList());
        postDto.setDetailedStatus(responsedto);
    } else {
        postDto.setStatus("Not sent yet");
        postDto.setDetailedStatus(Collections.emptyList());
    }
    return postDto;
}

private SubmissionResponseDto generateResponseDto(SubmissionResponse responseEntity) {
    SubmissionResponseDto dto = modelMapper.map(responseEntity, SubmissionResponseDto.class);
    String timezone = userService.getCurrentUser().getPreference().getTimezone();
    dto.setLocalSubmissionDate(responseEntity.getSubmissionDate(), timezone);
    if (responseEntity.getScoreCheckDate() != null) {
        dto.setLocalScoreCheckDate(responseEntity.getScoreCheckDate(), timezone);
    }
    return dto;
}
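The local-date conversion inside the DTO relies on the same timezone-aware formatting we used in PostDto earlier. Isolated, it boils down to this (the "yyyy-MM-dd HH:mm" pattern is taken from PostDto; the helper name is ours, for illustration):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class LocalDateDemo {

    // formats a Date in the given user timezone, as the DTO setters do
    static String format(Date date, String timezone) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm");
        fmt.setTimeZone(TimeZone.getTimeZone(timezone));
        return fmt.format(date);
    }

    public static void main(String[] args) {
        // the epoch rendered in two different user timezones
        System.out.println(format(new Date(0), "UTC"));
        System.out.println(format(new Date(0), "GMT+2"));
    }
}
```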

5. Front End

Next, we will modify our front-end scheduledPosts.jsp to handle our new response:

<div class="modal">
    <h4 class="modal-title">Detailed Status</h4>
    <table id="res"></table>
</div>

<script>
var loadedData = [];
var detailedResTable = $('#res').DataTable( {
    "searching":false,
    "paging": false,
    columns: [
        { title: "Attempt Number", data: "attemptNumber" },
        { title: "Detailed Status", data: "content" },
        { title: "Attempt Submitted At", data: "localSubmissionDate" },
        { title: "Attempt Score Checked At", data: "localScoreCheckDate" }
 ]
} );
           
$(document).ready(function() {
    $('#myposts').dataTable( {
        ...
        "columnDefs": [
            { "targets": 2, "data": "status",
              "render": function ( data, type, full, meta ) {
                  return data + 
                    ' <a href="#" onclick="showDetailedStatus('+meta.row+' )">More Details</a>';
              }
            },
            ...
        ],
        ...
    });
});

function showDetailedStatus(row){
    detailedResTable.clear().rows.add(loadedData[row].detailedStatus).draw();
    $('.modal').modal();
}

</script>

6. Tests

Finally, we will perform a simple unit test on our new methods:

First, we’ll test the getSuccessReason() implementation:

@Test
public void whenHasEnoughScore_thenSucceed() {
    Post post = new Post();
    post.setMinScoreRequired(5);
    PostScores postScores = new PostScores(6, 10, 1);

    assertTrue(getSuccessReason(post, postScores).contains("Succeed because score"));
}

@Test
public void whenHasEnoughTotalVotes_thenSucceed() {
    Post post = new Post();
    post.setMinScoreRequired(5);
    post.setMinTotalVotes(8);
    PostScores postScores = new PostScores(2, 10, 1);

    assertTrue(getSuccessReason(post, postScores).contains("Succeed because total votes"));
}

@Test
public void givenKeepPostIfHasComments_whenHasComments_thenSucceed() {
    Post post = new Post();
    post.setMinScoreRequired(5);
    post.setKeepIfHasComments(true);
    final PostScores postScores = new PostScores(2, 10, 1);

    assertTrue(getSuccessReason(post, postScores).contains("Succeed because has comments"));
}

Next, we will test the getFailReason() implementation:

@Test
public void whenNotEnoughScore_thenFail() {
    Post post = new Post();
    post.setMinScoreRequired(5);
    PostScores postScores = new PostScores(2, 10, 1);

    assertTrue(getFailReason(post, postScores).contains("Failed because score"));
}

@Test
public void whenNotEnoughTotalVotes_thenFail() {
    Post post = new Post();
    post.setMinScoreRequired(5);
    post.setMinTotalVotes(15);
    PostScores postScores = new PostScores(2, 10, 1);

    String reason = getFailReason(post, postScores);
    assertTrue(reason.contains("Failed because score"));
    assertTrue(reason.contains("and total votes"));
}

@Test
public void givenKeepPostIfHasComments_whenNotHasComments_thenFail() {
    Post post = new Post();
    post.setMinScoreRequired(5);
    post.setKeepIfHasComments(true);
    final PostScores postScores = new PostScores(2, 10, 0);

    String reason = getFailReason(post, postScores);
    assertTrue(reason.contains("Failed because score"));
    assertTrue(reason.contains("and has no comments"));
}

7. Conclusion

In this installment, we introduced some very useful visibility into the lifecycle of a Reddit post. We can now see exactly when each post was submitted or deleted, along with the exact reason for each operation.

Baeldung Weekly Review 35


I usually post about Dev stuff on Twitter - you can follow me there:

At the very beginning of last year, I decided to track my reading habits and share the best stuff here, on Baeldung. Haven’t missed a review since.

Here we go…

1. Spring and Java

>> Comments on The Twelve-Factor App [techblog.bozho.net]

Very interesting analysis of the well known twelve-factor app recommendations focused on the Java ecosystem.

>> Optionally typechecked StateMachines [benjiweber.co.uk]

A cool and practical implementation of a finite state machine that can replace what would otherwise be an enum.

>> Building Microservices with Polyglot Persistence Using Spring Cloud and Docker [kennybastani.com]

Really well put together writeup on building a microservice with different persistence options and leveraging Docker for deployment. Good stuff.

>> Writing Unit Tests With Spock Framework: Creating a Maven Project [petrikainulainen.net]

A solid intro to setting up a project and working with Spock.

>> Java 8 SE Optional, a strict approach [codefx.org]

A thoughtful rebuttal of the piece from two weeks ago, on recommendations around using Optional.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> PresentationDomainDataLayering [martinfowler.com]

A solid intro to the concept of the three-tiered architecture.

>> How To Ensure Idempotency In An Eventual Consistent DDD/CQRS Application [sapiensworks.com]

Where to handle idempotency and data consistency when doing DDD is definitely an important question answered here.

Also worth reading:

3. Musings

>> Sorry, I Can’t Talk to You This Iteration [frazzleddad]

I found this to be true over and over again – when I allow myself to have margin and breathing room – things grow.

>> Why Write Automated Tests? [jetbrains.com]

If you need convincing, have a read.

>> All evidence points to OOP being bullshit [pivotal.io]

Yeah.

Also worth reading:

4. Comics

And my favorite Dilberts of the week:

>> Not on poker night

>> They know about plan A

>> I think you call it a smartphone

5. Pick of the Week

Earlier this year I introduced the “Pick of the Week” section here in my “Weekly Review”. If you’re already on my email list – you got the pick already – hope you enjoyed it.

If not – you can share the review and unlock it right here:

Hiring a Technical Editor for Baeldung


Let me start off by saying this is not the typical code focused article I usually publish here on Baeldung. In fact, it’s the very first non-technical piece on the site in over 4 years. I usually reserve the meta stuff for meta.baeldung.com.

Jumping right to it – the site is growing, new authors are starting to contribute and I’m looking for a part-time technical editor to help and work with these new authors.

And what better way to find a solid content editor for the site than reaching out to readers and the community.

Who is the right candidate?

First – you need to be a developer yourself, working or actively involved in the Java and Spring ecosystem. All of these articles are code-centric, so being in the trenches and able to code is instrumental.
Second – you need to have your own technical site / blog in the Java ecosystem (or have some similar experience). Also, your own site cannot be super-small – a 3-post blog is really not enough to do a good evaluation.

Finally – and it almost goes without saying – you should have a good command of the English language.

What Will You Be Doing?

You’re going to work with authors, review their new article drafts and provide helpful feedback. The goal is to generally make sure that the article hits a high level of quality before it gets published. More specifically – articles should match the Baeldung formatting, code and style guidelines.
Beyond formatting and style, articles should be code-focused, clean and easy to understand. Sometimes an article is almost there, but not quite – and the author needs to be guided towards a better solution, or a better way of explaining some specific concept.

Typical Time Commitment and Budget

The typical number of new articles in any given week is 3. And the typical article will take about 2 rounds of review until it’s ready to go.

All of this usually takes about 1 hour of work for a small to medium article and can take 1.5 to 2 hours for larger pieces.

Overall, you’ll spend somewhere around 16-20 hours / month. 

The budget for the position is $600 / month.

Apply

If you think you’re well suited for this work, I’d love to work with you to help grow Baeldung.

Email me at eugen@baeldung.com with your details.

Cheers,

Eugen. 


Baeldung Weekly Review 36


I usually post about Dev stuff on Twitter - you can follow me there:

At the very beginning of last year, I decided to track my reading habits and share the best stuff here, on Baeldung. Haven’t missed a review since.

Here we go…

1. Spring and Java

>> Displaying progress of Spring application startup in web browser [nurkiewicz.com]

Showing visual progress during the bootstrap process of a Spring app.

There is cool, and then there is cool. And this is just plain cool. Play the video at the end to see this thing in action.

>> JDBI, a Nice Spring JDBC Alternative [insaneprogramming.be]

Nice quick intro to a SQL Java library I knew nothing about; it looks decent.

>> React.js and Spring Data REST: Part 1 – Basic Features [spring.io]

The first piece in a promising new series to follow – focused on building a Spring Data REST app and a front end for it.

>> Naming Optional query methods [joda.org]

More practical advice for using Optional – this time as a return type for query methods.

>> Java EE 8 MVC: Getting started with Ozark [mscharhag.com]

The first of an interesting series, exploring the upcoming MVC framework out of the next Java EE release.

Also worth reading:

Time to upgrade:

2. Technical

>> Revisiting webapp performance on HTTP/2 [advancedweb.hu]

Very cool look into the speed improvements in HTTP/2, along with hard-numbers for the various optimization techniques.

Crazy how much of a difference there is here – some really nice numbers.

>> Does each microservice really need its own database? [plainoldobjects.com]

This piece discusses some quite important questions touching on CQRS, Event Sourcing and focusing on the oh-so useful Polyglot Persistence aspects of that architecture.

>> Introducing Brutal Coding Constraints [code-cop.org]

Coding with the full set of typical constraints for this kind of session – this must have been a fun day.

Here’s another one to try out if you’re feeling brave – no mouse. Get your mouse and put it in your bag. Really.

I did that a few years back and I learned more keyboard shortcuts (I now use daily) over that weekend than I did for a whole year with the mouse.

>> The Unit of Work and Transactions In Domain Driven Design [sapiensworks.com]

As I’m going deeper into DDD and Event Sourcing myself, I really enjoy these dives into specific aspects of the architecture.

>> Lesson learned, test your migrations on the big dataset [swizec.com]

Hmm – this takes me back a few years, cursing the heavens in the middle of a highly annoying data migration. Good read.

Also worth reading:

3. Musings

>> An In-Depth Look At CQRS [sapiensworks.com]

Solid intro to CQRS and the CQS pattern, while at the same time looking forward towards Event Sourcing.

>> Team Efficiency is Irrelevant [benjiweber.co.uk]

An interesting read about the 80-20 of value in building software, and the idea that maybe, just maybe – this entire track of measuring performance for knowledge workers is more complex than other disciplines.

>> Surviving Software Heroes [daedtech.com]

Solid advice on how to approach the hard, hard problem of improving the team you’re part of – or lead.

Also worth reading:

4. Comics

Here are my favorite comics of the week:

>> The Laser Pointer [theoatmeal.com]

>> How to pet a kitty [theoatmeal.com]

>> How we should have been tough in our senior year of high school [theoatmeal.com]

5. Pick of the Week

Earlier this year I introduced the “Pick of the Week” section here in my “Weekly Review”. If you’re already on my email list – you got the pick already – hope you enjoyed it.

If not – you can share the review and unlock it right here:

Fourth Round of Improvements to the Reddit Application


I just announced the release dates of my upcoming "REST With Spring" Classes:

>> THE "REST WITH SPRING" CLASSES

1. Overview

In this tutorial, we’ll keep improving the simple Reddit application that we’re building as part of this public case study.

2. Better Tables for Admin

First, we’ll bring the tables in the Admin pages to the same level as the tables in the user facing application – by using the jQuery DataTable plugin.

2.1. Get Users Paginated – The Service Layer

Let’s add the pagination enabled operation in the service layer:

public List<User> getUsersList(int page, int size, String sortDir, String sort) {
    PageRequest pageReq = new PageRequest(page, size, Sort.Direction.fromString(sortDir), sort);
    return userRepository.findAll(pageReq).getContent();
}
public PagingInfo generatePagingInfo(int page, int size) {
    return new PagingInfo(page, size, userRepository.count());
}
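The PagingInfo class itself isn't shown here, so here's a hypothetical sketch. The field layout is an assumption, but the toString() output needs to emit key=value pairs with the total count first, because the front end later parses the PAGING_INFO header exactly that way:

```java
public class PagingInfo {
    private final int page;
    private final int size;
    private final long totalCount;

    public PagingInfo(int page, int size, long totalCount) {
        this.page = page;
        this.size = size;
        this.totalCount = totalCount;
    }

    @Override
    public String toString() {
        // the client reads the total count from the first key=value pair
        return "totalCount=" + totalCount + ",page=" + page + ",size=" + size;
    }
}
```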

2.2. A User DTO

Next – let’s make sure that we’re consistently returning clean DTOs to the client.

We’re going to need a User DTO because – up until now – the API was returning the actual User entity back to the client:

public class UserDto {
    private Long id;

    private String username;

    private Set<Role> roles;

    private long scheduledPostsCount;
}

2.3. Get Users Paginated – in the Controller

Now, let’s implement this simple operation in the controller layer as well:

public List<UserDto> getUsersList(
  @RequestParam(value = "page", required = false, defaultValue = "0") int page, 
  @RequestParam(value = "size", required = false, defaultValue = "10") int size,
  @RequestParam(value = "sortDir", required = false, defaultValue = "asc") String sortDir, 
  @RequestParam(value = "sort", required = false, defaultValue = "username") String sort, 
  HttpServletResponse response) {
    response.addHeader("PAGING_INFO", userService.generatePagingInfo(page, size).toString());
    List<User> users = userService.getUsersList(page, size, sortDir, sort);

    return users.stream().map(
      user -> convertUserEntityToDto(user)).collect(Collectors.toList());
}

And here’s the DTO conversion logic:

private UserDto convertUserEntityToDto(User user) {
    UserDto dto = modelMapper.map(user, UserDto.class);
    dto.setScheduledPostsCount(scheduledPostService.countScheduledPostsByUser(user));
    return dto;
}

2.4. Front-end

Finally, on the client side, let’s use this new operation and re-implement our admin users page:

<table><thead><tr>
<th>Username</th><th>Scheduled Posts Count</th><th>Roles</th><th>Actions</th>
</tr></thead></table>

<script>           
$(function(){
    $('table').dataTable( {
        "processing": true,
        "searching":false,
        "columnDefs": [
            { "name": "username",   "targets": 0},
            { "name": "scheduledPostsCount",   "targets": 1,"orderable": false},
            { "targets": 2, "data": "roles", "width":"20%", "orderable": false, 
              "render": 
                function ( data, type, full, meta ) { return extractRolesName(data); } },
            { "targets": 3, "data": "id", "render": function ( data, type, full, meta ) {
                return '<a onclick="showEditModal('+data+',\'' + 
                  extractRolesName(full.roles)+'\')">Modify User Roles</a>'; }}
                     ],
        "columns": [
            { "data": "username" },
            { "data": "scheduledPostsCount" }
        ],
        "serverSide": true,
        "ajax": function(data, callback, settings) {
            $.get('admin/users', {
                size: data.length, 
                page: (data.start/data.length), 
                sortDir: data.order[0].dir, 
                sort: data.columns[data.order[0].column].name
            }, function(res,textStatus, request) {
                var pagingInfo = request.getResponseHeader('PAGING_INFO');
                var total = pagingInfo.split(",")[0].split("=")[1];
                callback({
                    recordsTotal: total,recordsFiltered: total,data: res
            });});
        }
});});
</script>

3. Disable a User

Next, we’re going to build out a simple admin feature – the ability to disable a user.

The first thing we need is the enabled field in the User entity:

private boolean enabled;

Then, we can use that in our UserPrincipal implementation to determine if the principal is enabled or not:

public boolean isEnabled() {
    return user.isEnabled();
}

Here is the API operation that deals with disabling/enabling users:

@PreAuthorize("hasRole('USER_WRITE_PRIVILEGE')")
@RequestMapping(value = "/users/{id}", method = RequestMethod.PUT)
@ResponseStatus(HttpStatus.OK)
public void setUserEnabled(@PathVariable("id") Long id, 
  @RequestParam(value = "enabled") boolean enabled) {
    userService.setUserEnabled(id, enabled);
}

And here’s the simple service layer implementation:

public void setUserEnabled(Long userId, boolean enabled) {
    User user = userRepository.findOne(userId);
    user.setEnabled(enabled);
    userRepository.save(user);
}

4. Handle Session Timeout

Next, let’s configure the app to handle a session timeout – we will add a simple SessionListener to our context to control session timeout:

public class SessionListener implements HttpSessionListener {

    @Override
    public void sessionCreated(HttpSessionEvent event) {
        event.getSession().setMaxInactiveInterval(5 * 60);
    }
}

And here is the Spring Security configuration:

protected void configure(HttpSecurity http) throws Exception {
    http 
    ...
        .sessionManagement()
        .invalidSessionUrl("/?invalidSession=true")
        .sessionFixation().none();
}

Note:

  • We configured our session timeout to be 5 minutes.
  • When the session expires, the user will be redirected to the login page.

5. Enhance Registration

Next, we’ll enhance the registration flow by adding some functionality that was previously missing.

We’re going to only illustrate the main points here; to go deep into registration – check out the Registration series.

5.1. Registration Confirmation Email

One of the features missing from registration was that users weren’t prompted to confirm their email.

We’ll now make users confirm their email address before they’re activated in the system:

public void register(HttpServletRequest request, 
  @RequestParam("username") String username, 
  @RequestParam("email") String email, 
  @RequestParam("password") String password) {
    String appUrl = 
      "http://" + request.getServerName() + ":" + 
       request.getServerPort() + request.getContextPath();
    userService.registerNewUser(username, email, password, appUrl);
}

The service layer also needs a bit of work – basically making sure that the user is disabled initially:

@Override
public void registerNewUser(String username, String email, String password, String appUrl) {
    ...
    user.setEnabled(false);
    userRepository.save(user);
    eventPublisher.publishEvent(new OnRegistrationCompleteEvent(user, appUrl));
}

Now for the confirmation:

@RequestMapping(value = "/user/registrationConfirm", method = RequestMethod.GET)
public String confirmRegistration(Model model, @RequestParam("token") String token) {
    String result = userService.confirmRegistration(token);
    if (result == null) {
        return "redirect:/?msg=registration confirmed successfully";
    }
    model.addAttribute("msg", result);
    return "submissionResponse";
}
public String confirmRegistration(String token) {
    VerificationToken verificationToken = tokenRepository.findByToken(token);
    if (verificationToken == null) {
        return "Invalid Token";
    }

    Calendar cal = Calendar.getInstance();
    if ((verificationToken.getExpiryDate().getTime() - cal.getTime().getTime()) <= 0) {
        return "Token Expired";
    }

    User user = verificationToken.getUser();
    user.setEnabled(true);
    userRepository.save(user);
    return null;
}
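The token expiry check above (and the identical one in the password reset flow below) boils down to a simple date comparison. Here it is isolated as a small self-contained helper (the class and method names are ours, for illustration):

```java
import java.util.Calendar;
import java.util.Date;

public class TokenExpiryCheck {

    // a token is expired once its expiry date is no longer in the future
    static boolean isExpired(Date expiryDate) {
        Calendar cal = Calendar.getInstance();
        return (expiryDate.getTime() - cal.getTime().getTime()) <= 0;
    }

    public static void main(String[] args) {
        System.out.println(isExpired(new Date(System.currentTimeMillis() - 1000)));
    }
}
```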

5.2. Trigger A Password Reset

Now, let’s see how to allow users to reset their own password in case they forget it:

@RequestMapping(value = "/users/passwordReset", method = RequestMethod.POST)
@ResponseStatus(HttpStatus.OK)
public void passwordReset(HttpServletRequest request, @RequestParam("email") String email) {
    String appUrl = "http://" + request.getServerName() + ":" + 
      request.getServerPort() + request.getContextPath();
    userService.resetPassword(email, appUrl);
}

Now, the service layer will simply send an email to the user – with the link where they can reset their password:

public void resetPassword(String userEmail, String appUrl) {
    Preference preference = preferenceRepository.findByEmail(userEmail);
    User user = userRepository.findByPreference(preference);
    if (user == null) {
        throw new UserNotFoundException("User not found");
    }

    String token = UUID.randomUUID().toString();
    PasswordResetToken myToken = new PasswordResetToken(token, user);
    passwordResetTokenRepository.save(myToken);
    SimpleMailMessage email = constructResetTokenEmail(appUrl, token, user);
    mailSender.send(email);
}

5.3. Reset Password

Once the user clicks on the link in the email, they can actually perform the reset password operation:

@RequestMapping(value = "/users/resetPassword", method = RequestMethod.GET)
public String resetPassword(
  Model model, 
  @RequestParam("id") long id, 
  @RequestParam("token") String token) {
    String result = userService.checkPasswordResetToken(id, token);
    if (result == null) {
        return "updatePassword";
    }
    model.addAttribute("msg", result);
    return "submissionResponse";
}

And the service layer:

public String checkPasswordResetToken(long userId, String token) {
    PasswordResetToken passToken = passwordResetTokenRepository.findByToken(token);
    if ((passToken == null) || (passToken.getUser().getId() != userId)) {
        return "Invalid Token";
    }

    Calendar cal = Calendar.getInstance();
    if ((passToken.getExpiryDate().getTime() - cal.getTime().getTime()) <= 0) {
        return "Token Expired";
    }

    UserPrincipal userPrincipal = new UserPrincipal(passToken.getUser());
    Authentication auth = new UsernamePasswordAuthenticationToken(
      userPrincipal, null, userPrincipal.getAuthorities());
    SecurityContextHolder.getContext().setAuthentication(auth);
    return null;
}

Finally, here’s the update password implementation:

@RequestMapping(value = "/users/updatePassword", method = RequestMethod.POST)
@ResponseStatus(HttpStatus.OK)
public void changeUserPassword(@RequestParam("password") String password) {
    userService.changeUserPassword(userService.getCurrentUser(), password);
}

5.4. Change Password

Next, we’re going to implement a similar functionality – changing your password internally:

@RequestMapping(value = "/users/changePassword", method = RequestMethod.POST)
@ResponseStatus(HttpStatus.OK)
public void changeUserPassword(@RequestParam("password") String password, 
  @RequestParam("oldpassword") String oldPassword) {
    User user = userService.getCurrentUser();
    if (!userService.checkIfValidOldPassword(user, oldPassword)) {
        throw new InvalidOldPasswordException("Invalid old password");
    }
    userService.changeUserPassword(user, password);
}
public void changeUserPassword(User user, String password) {
    user.setPassword(passwordEncoder.encode(password));
    userRepository.save(user);
}

6. Bootify the Project

Next, let’s convert/upgrade the project over to Spring Boot; first, we will modify the pom.xml:

...
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>1.2.5.RELEASE</version>
</parent>

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
        
    <dependency>
       <groupId>org.aspectj</groupId>
       <artifactId>aspectjweaver</artifactId>
     </dependency>
...

And also provide a simple Boot application for startup:

@SpringBootApplication
public class Application {

    @Bean
    public SessionListener sessionListener() {
        return new SessionListener();
    }

    @Bean
    public RequestContextListener requestContextListener() {
        return new RequestContextListener();
    }

    public static void main(String... args) {
        SpringApplication.run(Application.class, args);
    }
}

Note that the new base URL will now be http://localhost:8080 instead of the old http://localhost:8080/reddit-scheduler.

7. Externalize Properties

Now that we have Boot in, we can use @ConfigurationProperties to externalize our Reddit properties:

@ConfigurationProperties(prefix = "reddit")
@Component
public class RedditProperties {

    private String clientID;
    private String clientSecret;
    private String accessTokenUri;
    private String userAuthorizationUri;
    private String redirectUri;

    public String getClientID() {
        return clientID;
    }
    
    ...
}
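With @ConfigurationProperties and the reddit prefix, Boot binds these fields from application.properties. The entries might look like the following – the OAuth endpoints shown are Reddit's standard ones, while the client ID, secret and redirect URI are placeholders you'd replace with your own:

```properties
reddit.clientID=<your-client-id>
reddit.clientSecret=<your-client-secret>
reddit.accessTokenUri=https://www.reddit.com/api/v1/access_token
reddit.userAuthorizationUri=https://www.reddit.com/api/v1/authorize
reddit.redirectUri=http://localhost:8080/login
```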

We can now cleanly use these properties in a type-safe manner:

@Autowired
private RedditProperties redditProperties;

@Bean
public OAuth2ProtectedResourceDetails reddit() {
    AuthorizationCodeResourceDetails details = new AuthorizationCodeResourceDetails();
    details.setClientId(redditProperties.getClientID());
    details.setClientSecret(redditProperties.getClientSecret());
    details.setAccessTokenUri(redditProperties.getAccessTokenUri());
    details.setUserAuthorizationUri(redditProperties.getUserAuthorizationUri());
    details.setPreEstablishedRedirectUri(redditProperties.getRedirectUri());
    ...
    return details;
}

8. Conclusion

This round of improvements was a very good step forward for the application.

We’re not adding any more major features, which makes architectural improvements the next logical step – and that’s what the next installment is all about.

Sign Up and get 25% Off my upcoming "REST With Spring" classes on launch:

>> CHECK OUT THE CLASSES

Apply CQRS to a Spring REST API


1. Overview

In this quick article, we’re going to do something new. We’re going to evolve an existing REST Spring API and make it use Command Query Responsibility Segregation – CQRS.

The goal is to cleanly separate both the service and the controller layers to deal with Reads – Queries and Writes – Commands coming into the system separately.

Keep in mind that this is just an early first step towards this kind of architecture, not “an arrival point”. That being said – I’m excited about this one.

Finally – the example API we’re going to be using is publishing User resources and is part of our ongoing Reddit app case study to exemplify how this works – but of course any API will do.

2. The Service Layer

We’ll start simple – by just identifying the read and the write operations in our previous User service – and we’ll split that into 2 separate services – UserQueryService and UserCommandService:

public interface IUserQueryService {

    List<User> getUsersList(int page, int size, String sortDir, String sort);

    String checkPasswordResetToken(long userId, String token);

    String checkConfirmRegistrationToken(String token);

    long countAllUsers();

}
public interface IUserCommandService {

    void registerNewUser(String username, String email, String password, String appUrl);

    void updateUserPassword(User user, String password, String oldPassword);

    void changeUserPassword(User user, String password);

    void resetPassword(String email, String appUrl);

    void createVerificationTokenForUser(User user, String token);

    void updateUser(User user);

}

From reading these APIs, you can clearly see that the query service does all the reading, while the command service reads no data at all – all of its methods return void.

3. The Controller Layer

Next up – the controller layer.

3.1. The Query Controller

Here is our UserQueryRestController:

@Controller
@RequestMapping(value = "/api/users")
public class UserQueryRestController {

    @Autowired
    private IUserQueryService userService;

    @Autowired
    private IScheduledPostQueryService scheduledPostService;

    @Autowired
    private ModelMapper modelMapper;

    @PreAuthorize("hasRole('USER_READ_PRIVILEGE')")
    @RequestMapping(method = RequestMethod.GET)
    @ResponseBody
    public List<UserQueryDto> getUsersList(...) {
        PagingInfo pagingInfo = new PagingInfo(page, size, userService.countAllUsers());
        response.addHeader("PAGING_INFO", pagingInfo.toString());
        
        List<User> users = userService.getUsersList(page, size, sortDir, sort);
        return users.stream().map(user -> convertUserEntityToDto(user)).collect(Collectors.toList());
    }

    private UserQueryDto convertUserEntityToDto(User user) {
        UserQueryDto dto = modelMapper.map(user, UserQueryDto.class);
        dto.setScheduledPostsCount(scheduledPostService.countScheduledPostsByUser(user));
        return dto;
    }
}

What’s interesting here is that the query controller is only injecting query services.

What would be even more interesting is to cut off this controller’s access to the command services – by placing these in a separate module.

3.2. The Command Controller

Now, here’s our command controller implementation:

@Controller
@RequestMapping(value = "/api/users")
public class UserCommandRestController {

    @Autowired
    private IUserCommandService userService;

    @Autowired
    private ModelMapper modelMapper;

    @RequestMapping(value = "/registration", method = RequestMethod.POST)
    @ResponseStatus(HttpStatus.OK)
    public void register(HttpServletRequest request, @RequestBody UserRegisterCommandDto userDto) {
        String appUrl = request.getRequestURL().toString().replace(request.getRequestURI(), "");
        
        userService.registerNewUser(
          userDto.getUsername(), userDto.getEmail(), userDto.getPassword(), appUrl);
    }

    @PreAuthorize("isAuthenticated()")
    @RequestMapping(value = "/password", method = RequestMethod.PUT)
    @ResponseStatus(HttpStatus.OK)
    public void updateUserPassword(@RequestBody UserUpdatePasswordCommandDto userDto) {
        userService.updateUserPassword(
          getCurrentUser(), userDto.getPassword(), userDto.getOldPassword());
    }

    @RequestMapping(value = "/passwordReset", method = RequestMethod.POST)
    @ResponseStatus(HttpStatus.OK)
    public void createAResetPassword(
      HttpServletRequest request, 
      @RequestBody UserTriggerResetPasswordCommandDto userDto) 
    {
        String appUrl = request.getRequestURL().toString().replace(request.getRequestURI(), "");
        userService.resetPassword(userDto.getEmail(), appUrl);
    }

    @RequestMapping(value = "/password", method = RequestMethod.POST)
    @ResponseStatus(HttpStatus.OK)
    public void changeUserPassword(@RequestBody UserChangePasswordCommandDto userDto) {
        userService.changeUserPassword(getCurrentUser(), userDto.getPassword());
    }

    @PreAuthorize("hasRole('USER_WRITE_PRIVILEGE')")
    @RequestMapping(value = "/{id}", method = RequestMethod.PUT)
    @ResponseStatus(HttpStatus.OK)
    public void updateUser(@RequestBody UserUpdateCommandDto userDto) {
        userService.updateUser(convertToEntity(userDto));
    }

    private User convertToEntity(UserUpdateCommandDto userDto) {
        return modelMapper.map(userDto, User.class);
    }
}

A few interesting things are happening here. First – notice how each of these API implementations is using a different command. This is mainly to give us a good base for further improving the design of the API and extracting different resources as they emerge.

Another reason is that, when we take the next step, towards Event Sourcing – we have a clean set of commands that we’re working with.
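For illustration – if we later promote these commands to first-class objects, each one becomes a small immutable value that is trivial to log or append to an event stream. A minimal sketch (the class name and fields are hypothetical, not part of the current code):

```java
// Hypothetical command object - an immutable value capturing the intent
// of one write operation, ready to be logged or event-sourced later.
final class ChangeUserPasswordCommand {

    private final long userId;
    private final String newPassword;

    ChangeUserPasswordCommand(long userId, String newPassword) {
        this.userId = userId;
        this.newPassword = newPassword;
    }

    long getUserId() {
        return userId;
    }

    String getNewPassword() {
        return newPassword;
    }
}
```

The command service methods would then take a single command argument instead of a list of loose parameters.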

3.3. Separate Resource Representations

Let’s now quickly go over the different representations of our User resource, after this separation into commands and queries:

public class UserQueryDto {
    private Long id;

    private String username;

    private boolean enabled;

    private Set<Role> roles;

    private long scheduledPostsCount;
}

Here are our Command DTOs:

  • UserRegisterCommandDto used to represent user registration data:
public class UserRegisterCommandDto {
    private String username;
    private String email;
    private String password;
}
  • UserUpdatePasswordCommandDto used to represent the data to update the current user’s password:
public class UserUpdatePasswordCommandDto {
    private String oldPassword;
    private String password;
}
  • UserTriggerResetPasswordCommandDto used to represent the user’s email, to trigger a password reset by sending an email with a reset-password token:
public class UserTriggerResetPasswordCommandDto {
    private String email;
}
  • UserChangePasswordCommandDto used to represent the new user password – this command is used after the user uses the password reset token:
public class UserChangePasswordCommandDto {
    private String password;
}
  • UserUpdateCommandDto used to represent the user’s new data after modifications:
public class UserUpdateCommandDto {
    private Long id;

    private boolean enabled;

    private Set<Role> roles;
}

4. Conclusion

In this tutorial we laid the groundwork towards a clean CQRS implementation for a Spring REST API.

The next step will be to keep improving the API by identifying some separate responsibilities (and Resources) out into their own services, so that we more closely align with a Resource-centric architecture.

Baeldung Weekly Review 37

I usually post about Dev stuff on Twitter - you can follow me there:

At the very beginning of last year, I decided to track my reading habits and share the best stuff here, on Baeldung. Haven’t missed a review since.

Here we go…

1. Spring and Java

>> The State of the Module System [java.net]

A very good intro – nay – the reference intro to what’s going to be the Java module system. Good stuff.

Here’s a quick outside look at the module system as well.

>> Stream Performance [codefx.org]

Very interesting data on stream performance, and depending on the performance goals of your system, maybe actionable data as well.

>> How to use Java 8 Functional Programming to Generate an Alphabetic Sequence [jooq.org]

Good things come to those who use Java 8 streams, but only the things left behind by those who use Clojure.

>> What’s New In Spring Data Release Gosling? [spring.io]

A myriad of improvements in this release; I’m most excited about the new Querydsl web support and the HAL browser – these look highly useful.

>> Introduction to Event Sourcing and Command-Query Responsibility Segregation [squirrel.pl]

Introducing both CQRS and Event Sourcing in a single piece is not easy, but this definitely is a good initial writeup in what I’m hoping is going to be a long series.

>> 7 Java Performance Metrics to Watch After a Major Release [takipi.com]

A solid set of basic metrics that you really do need to always track, but even more so after putting a major release into production.

>> Using JAX-RS With Spring Boot Instead of MVC [insaneprogramming.be]

Spring Boot with JAX-RS – now that’s something I never thought I’d be reading about.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> Call me Maybe: MariaDB Galera Cluster [aphyr.com]

Another clustering solution that looks decent but ultimately doesn’t hold water yet.

>> How we ended up with microservices [philcalcado.com]

Long but very interesting read on another, much less talked about aspect of microservices – team productivity.

Also worth reading:

3. Musings

>> Knowledge Breadth versus Depth [nealford.com]

The classical knowledge quadrant, as it applies to the technical field and growing as an architect.

Also worth reading:

4. Comics

And my favorite comics of the week:

>> Why the mantis shrimp is my new favorite animal [theoatmeal.com]

>> Dear Sriracha aka Rooster Sauce [theoatmeal.com]

>> What it’s like to own an Apple product [theoatmeal.com]

5. Pick of the Week

Earlier this year I introduced the “Pick of the Week” section here in my “Weekly Review”. If you’re already on my email list – you got the pick already – hope you enjoyed it.

I’m doing a social media push for this week’s review.

If you’ve been enjoying the reviews coming in on Friday for the past year and a half, I’d really appreciate you taking the time to share this particular review on social:

>> Weekly 37 on Twitter

>> Weekly 37 on Reddit

>> Weekly 37 on G+

>> Weekly 37 on Facebook

Of course, just vote the one where you’re actually active and have an account.

Java Web Weekly 38 (formerly the “Baeldung Weekly Review”)

This week I’m announcing – and putting into practice – a name change for the “Baeldung Weekly Review“. The new weekly review will be called “Java Web Weekly”.

The reason behind the change is simple – I found that the old name didn’t really communicate to new readers what the review is about. The new name is much clearer and also fits the content and my own focus perfectly.

And of course, besides the name – the review will be exactly the same.

Here we go…

1. Spring and Java

>> React.js and Spring Data REST: Part 2 – Hypermedia [spring.io]

A tour de force of Spring Data REST and just how easy it makes baking a lot of Hypermedia goodness into an API. Only a few APIs do it and even fewer do it well.

Hypermedia controls are one of my favorite things to really take the API up a notch, especially now that I’m getting closer to recording Course 7 (Evolving, Discovering and Documenting the REST API) of my REST With Spring classes. I was thinking of having a section on Spring Data REST, but I might just have to dedicate a whole bonus course to it to do it justice.

>> Create type-safe queries with the JPA static metamodel [thoughts-on-java.org]

An exploration of the cool static metamodel helper classes out of JPA. Once you get past the process of generating these – they really come in handy to write fluent, clean persistence level logic.

>> Stream Performance – Your Ideas [codefx.org]

New numbers on top of the results from last week – on the performance of Java 8 Streams.

>> An introduction to optimising a hashing strategy [vanillajava]

An interesting deep-dive into improving the hashing strategies we’re using daily.

>> AssertJ’s SoftAssertions – do we need them? [codeleak.pl]

Soft assertions are a new concept (for me) – and I’ll probably be very selective in how I actually use them, but I can certainly see how – in a few scenarios – these would come in really really handy.

>> JDK 9: Highlights from The State of the Module System [marxsoftware]

Some takeaways from the official info that came out last week on how Java 9 modularization will behave. I like the short, distilled notes.

>> Automated tests with Eclipse using MoreUnit [advancedweb.hu]

It may be easy to dismiss the little quirks of your IDE, but it always pays off to improve your craft and your workflow. Here’s an Eclipse plugin that looks promising if you’re doing TDD.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> Achieving Consistency in CQRS with Linear Event Store [squirrel.pl]

The second installment in this Event Sourcing focused series – going into a lot of detail about the choices that go into selecting an Event Store and interacting with it efficiently.

Also worth reading:

3. Musings

>> Putting on the shipping goggles [signalvnoise.com]

This piece really lands if you’ve ever shipped anything – especially your own work.

Also worth reading:

4. Comics

And my favorite comics of the week:

>> Severity One? [cube-drone.com]

>> Versioning [cube-drone.com]

>> The Singleton Fairy [cube-drone.com]

5. Pick of the Week

Along with the name change, another small change in the pick section is that I’m removing the lock mechanism.

Here’s a very cool Markdown app – if you’re doing any kind of writing:

>> Dillinger

Spring RestTemplate Tutorial

I just announced the release dates of my upcoming "REST With Spring" Classes:

>> THE "REST WITH SPRING" CLASSES

1. Overview

In this tutorial we’re going to illustrate the broad range of operations where the Spring RestTemplate can be used, and used well.

For the API side of all examples, we’ll be running the RESTful service from here.

2. Use GET to Retrieve Resources

2.1. Get Plain JSON

Let’s start simple and talk about GET requests – with a quick example using the getForEntity() API:

RestTemplate restTemplate = new RestTemplate();
String fooResourceUrl = "http://localhost:8080/spring-security-rest-full/foos";
ResponseEntity<String> response = 
  restTemplate.getForEntity(fooResourceUrl + "/1", String.class);
assertThat(response.getStatusCode(), is(HttpStatus.OK));

We have full access to the response, so we can do things like checking the status code to make sure the operation was actually successful and then we can start working with the body of the response – which in this case is JSON:

ObjectMapper mapper = new ObjectMapper();
JsonNode root = mapper.readTree(response.getBody());
JsonNode name = root.path("name");
assertThat(name.asText(), is("bar"));

We’re working with the response body as a standard String here – and using Jackson (and the JSON node structure that Jackson provides) to verify some details.

2.2. Retrieving POJO Instead of JSON

We can also map the response directly to a Resource DTO – for example:

public class Foo implements Serializable {
    private long id;

    private String name;
    // standard getters and setters
}

Now – we can simply use the getForObject API in the template:

Foo foo = restTemplate.getForObject(fooResourceUrl + "/1", Foo.class);
assertThat(foo.getName(), is("bar"));
assertThat(foo.getId(), is(1L));

3. Use HEAD to Retrieve Headers

Let’s now have a quick look at using HEAD before moving on to the more common methods – we’re going to be using the headForHeaders() API here:

HttpHeaders httpHeaders = restTemplate.headForHeaders(fooResourceUrl);
assertTrue(httpHeaders.getContentType().includes(MediaType.APPLICATION_JSON));

4. Use POST to Create a Resource

In order to create a new Resource in the API – we can make good use of the postForLocation(), postForObject() or postForEntity() APIs.

The first returns the URI of the newly created Resource, the second returns the Resource itself, and the third returns the full ResponseEntity wrapping it.

4.1. The postForObject API

ClientHttpRequestFactory requestFactory = getClientHttpRequestFactory();
RestTemplate restTemplate = new RestTemplate(requestFactory);

HttpEntity<Foo> request = new HttpEntity<>(new Foo("bar"));
Foo foo = restTemplate.postForObject(fooResourceUrl, request, Foo.class);
assertThat(foo, notNullValue());
assertThat(foo.getName(), is("bar"));

4.2. The postForLocation API

Similarly, let’s have a look at the operation that – instead of returning the full Resource, just returns the Location of that newly created Resource:

HttpEntity<Foo> request = new HttpEntity<>(new Foo("bar"));
URI location = restTemplate.postForLocation(fooResourceUrl, request);
assertThat(location, notNullValue());

4.3. The exchange API

Finally, let’s have a look at how to do a POST with the more generic exchange API:

RestTemplate restTemplate = new RestTemplate();
HttpEntity<Foo> request = new HttpEntity<>(new Foo("bar"));
ResponseEntity<Foo> response = restTemplate.
  exchange(fooResourceUrl, HttpMethod.POST, request, Foo.class);
assertThat(response.getStatusCode(), is(HttpStatus.CREATED));
Foo foo = response.getBody();
assertThat(foo, notNullValue());
assertThat(foo.getName(), is("bar"));

5. Use OPTIONS to Get Allowed Operations

Next we’re going to have a quick look at using an OPTIONS request and exploring the allowed operations on a specific URI using this kind of request; the API is optionsForAllow:

Set<HttpMethod> optionsForAllow = restTemplate.optionsForAllow(fooResourceUrl);
HttpMethod[] supportedMethods = 
  {HttpMethod.GET, HttpMethod.POST, HttpMethod.PUT, HttpMethod.DELETE};
assertTrue(optionsForAllow.containsAll(Arrays.asList(supportedMethods)));

6. Use PUT to Update a Resource

Next, we’ll start looking at PUT – and more specifically the exchange API for this operation, because the template.put API is pretty straightforward.

6.1. Simple PUT with .exchange

We’ll start with a simple PUT operation against the API – and keep in mind that the operation isn’t returning any body back to the client:

Foo updatedInstance = new Foo("newName");
updatedInstance.setId(createResponse.getBody().getId());
String resourceUrl = fooResourceUrl + '/' + createResponse.getBody().getId();
HttpEntity<Foo> requestUpdate = new HttpEntity<>(updatedInstance, headers);
template.exchange(resourceUrl, HttpMethod.PUT, requestUpdate, Void.class);

6.2. PUT with .exchange and a Request Callback

Next, we’re going to be using a request callback to issue a PUT.

Let’s make sure we prepare the callback – where we can set all the headers we need as well as a request body:

RequestCallback requestCallback(final Foo updatedInstance) {
    return clientHttpRequest -> {
        ObjectMapper mapper = new ObjectMapper();
        mapper.writeValue(clientHttpRequest.getBody(), updatedInstance);
        clientHttpRequest.getHeaders().add(
          HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_JSON_VALUE);
        clientHttpRequest.getHeaders().add(
          HttpHeaders.AUTHORIZATION, "Basic " + getBase64EncodedLogPass());
    };
}

Next, we create the Resource with a POST request:

ResponseEntity<Foo> response = restTemplate.
  exchange(fooResourceUrl, HttpMethod.POST, request, Foo.class);
assertThat(response.getStatusCode(), is(HttpStatus.CREATED));

And then we update the Resource:

Foo updatedInstance = new Foo("newName");
updatedInstance.setId(response.getBody().getId());
String resourceUrl = fooResourceUrl + '/' + response.getBody().getId();
restTemplate.execute(
  resourceUrl, HttpMethod.PUT, requestCallback(updatedInstance), clientHttpResponse -> null);

7. Use DELETE to Remove a Resource

To remove an existing Resource we’ll make short work of the delete() API:

String entityUrl = fooResourceUrl + "/" + existingResource.getId();
restTemplate.delete(entityUrl);

8. Conclusion

We went over the main HTTP verbs, using RestTemplate to orchestrate requests with each of them.

If you want to dig into how to do authentication with the template – check out my write-up on Basic Auth with RestTemplate.

The implementation of all these examples and code snippets can be found in my github project – this is an Eclipse based project, so it should be easy to import and run as it is.

Sign Up and get 25% Off my upcoming "REST With Spring" classes on launch:

>> CHECK OUT THE CLASSES

Java Web Weekly 39

I usually post about Dev stuff on Twitter - you can follow me there:

At the very beginning of last year, I decided to track my reading habits and share the best stuff here, on Baeldung. Haven’t missed a review since.

Here we go…

1. Spring and Java

>> JEP 269: Convenience Factory Methods for Collections [java.net]

Quick and clean JDK making collection creation easier in Java. This is how the language gets better.

>> Resource versioning with Spring MVC [mscharhag.com]

Highly useful writeup on making your static assets cacheable with Spring – simple and to the point.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> Writing an Event-Sourced CQRS Read Model [squirrel.pl]

Yet another solid writeup exploring the details of an Event-Sourced architecture – this time focusing on re-projecting all events onto an empty projection.

>> Scale it to Billions — What They Don’t Tell you in the Cassandra README [threatstack.com]

If you’re running a Cassandra cluster, this stuff can and will save your bacon.

Also worth reading:

3. Musings

>> Remember to Reevaluate [bitquabit.com]

Maybe taking a second look at a technology that I formed an opinion on five years ago is a good idea. That being said – I’m not looking at SVN again.

>> The Universality of Postel’s Law [michaelfeathers.silvrback.com]

“Design is a deep topic. One could say it’s the deepest.”

>> What do you want to be known for? [katemats.com]

Good positioning is not something that “just happens” – it’s an intentional, strategic, multi-year effort but one that can lead to some pretty cool things.

>> How we lost (and found) millions by not A/B testing [signalvnoise.com]

Good take-aways from a major design change, a re-positioning campaign and a general re-brand.

Also worth reading:

4. Comics

And my favorite comics of the week:

>> Coder leaves nothing to chance [commitstrip.com]

>> The blob-ject [commitstrip.com]

>> Picture a Grassy Field [xkcd.com]

5. Pick of the Week

I haven’t been super public with this – but a few months back I opened up Baeldung to other authors.

I’ve been slowly realizing that I just can’t cover all the cool stuff readers are asking me to write about, so if you’re interested in writing for the site – send me a line:

>> Write for Baeldung


Fifth Round of Improvements to the Reddit Application

I just announced the release dates of my upcoming "REST With Spring" Classes:

>> THE "REST WITH SPRING" CLASSES

1. Overview

Let’s continue moving the Reddit application from our ongoing case study forward.

2. Send Email Notifications on Post Comments

Reddit is missing email notifications – plain and simple. What I’d like to see is – whenever someone comments on one of my posts, I get a short email notification with the comment.

So – simply put – that’s the goal of this feature here – email notifications on comments.

We’ll implement a simple scheduler that checks:

  • which users should receive email notifications with post replies
  • whether the user got any post replies in their Reddit inbox

It will then simply send out an email notification with the unread post replies.

2.1. User Preferences

First, we will need to modify our Preference entity and DTO by adding:

private boolean sendEmailReplies;

This allows users to choose whether they want to receive email notifications with post replies.

2.2. Notification Scheduler

Next, here is our simple scheduler:

@Component
public class NotificationRedditScheduler {

    @Autowired
    private INotificationRedditService notificationRedditService;

    @Autowired
    private PreferenceRepository preferenceRepository;

    @Scheduled(fixedRate = 60 * 60 * 1000)
    public void checkInboxUnread() {
        List<Preference> preferences = preferenceRepository.findBySendEmailRepliesTrue();
        for (Preference preference : preferences) {
            notificationRedditService.checkAndNotify(preference);
        }
    }
}

Notice that the scheduler runs every hour – but we can of course go with a much shorter cadence if we want to.

2.3. The Notification Service

Now, let’s discuss our notification service:

@Service
public class NotificationRedditService implements INotificationRedditService {
    private Logger logger = LoggerFactory.getLogger(getClass());
    private static final String NOTIFICATION_TEMPLATE = "You have %d unread post replies.";
    private static final String MESSAGE_TEMPLATE = "%s replied on your post %s : %s";

    @Autowired
    @Qualifier("schedulerRedditTemplate")
    private OAuth2RestTemplate redditRestTemplate;

    @Autowired
    private ApplicationEventPublisher eventPublisher;

    @Autowired
    private UserRepository userRepository;

    @Override
    public void checkAndNotify(Preference preference) {
        try {
            checkAndNotifyInternal(preference);
        } catch (Exception e) {
            logger.error(
              "Error occurred while checking and notifying = " + preference.getEmail(), e);
        }
    }

    private void checkAndNotifyInternal(Preference preference) {
        User user = userRepository.findByPreference(preference);
        if ((user == null) || (user.getAccessToken() == null)) {
            return;
        }

        DefaultOAuth2AccessToken token = new DefaultOAuth2AccessToken(user.getAccessToken());
        token.setRefreshToken(new DefaultOAuth2RefreshToken((user.getRefreshToken())));
        token.setExpiration(user.getTokenExpiration());
        redditRestTemplate.getOAuth2ClientContext().setAccessToken(token);

        JsonNode node = redditRestTemplate.getForObject(
          "https://oauth.reddit.com/message/selfreply?mark=false", JsonNode.class);
        parseRepliesNode(preference.getEmail(), node);
    }

    private void parseRepliesNode(String email, JsonNode node) {
        JsonNode allReplies = node.get("data").get("children");
        int unread = 0;
        for (JsonNode msg : allReplies) {
            if (msg.get("data").get("new").asBoolean()) {
                unread++;
            }
        }
        if (unread == 0) {
            return;
        }

        JsonNode firstMsg = allReplies.get(0).get("data");
        String author = firstMsg.get("author").asText();
        String postTitle = firstMsg.get("link_title").asText();
        String content = firstMsg.get("body").asText();

        StringBuilder builder = new StringBuilder();
        builder.append(String.format(NOTIFICATION_TEMPLATE, unread));
        builder.append("\n");
        builder.append(String.format(MESSAGE_TEMPLATE, author, postTitle, content));
        builder.append("\n");
        builder.append("Check all new replies at ");
        builder.append("https://www.reddit.com/message/unread/");

        eventPublisher.publishEvent(new OnNewPostReplyEvent(email, builder.toString()));
    }
}

Note that:

  • We call the Reddit API and get all the replies, then check them one by one to see if they are new – “unread”.
  • If there are unread replies, we fire an event to send this user an email notification.
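Stripped of the Jackson plumbing, the counting step is just a pass over the replies checking the “new” flag – a simplified sketch with plain Maps standing in for the JsonNodes (the class name is ours):

```java
import java.util.List;
import java.util.Map;

final class UnreadCounter {

    // Counts the replies whose "new" flag is true - mirroring the loop
    // in parseRepliesNode(), with Maps standing in for JsonNodes.
    static int countUnread(List<Map<String, Object>> replies) {
        int unread = 0;
        for (Map<String, Object> reply : replies) {
            if (Boolean.TRUE.equals(reply.get("new"))) {
                unread++;
            }
        }
        return unread;
    }
}
```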

2.4. New Reply Event

Here is our simple event:

public class OnNewPostReplyEvent extends ApplicationEvent {
    private String email;
    private String content;

    public OnNewPostReplyEvent(String email, String content) {
        super(email);
        this.email = email;
        this.content = content;
    }
}

2.5. Reply Listener

Finally, here is our listener:

@Component
public class ReplyListener implements ApplicationListener<OnNewPostReplyEvent> {
    @Autowired
    private JavaMailSender mailSender;

    @Autowired
    private Environment env;

    @Override
    public void onApplicationEvent(OnNewPostReplyEvent event) {
        SimpleMailMessage email = constructEmailMessage(event);
        mailSender.send(email);
    }

    private SimpleMailMessage constructEmailMessage(OnNewPostReplyEvent event) {
        String recipientAddress = event.getEmail();
        String subject = "New Post Replies";
        SimpleMailMessage email = new SimpleMailMessage();
        email.setTo(recipientAddress);
        email.setSubject(subject);
        email.setText(event.getContent());
        email.setFrom(env.getProperty("support.email"));
        return email;
    }
}

3. Session Concurrency Control

Next, let’s set up some stricter rules regarding the number of concurrent sessions the application allows. More to the point – let’s not allow concurrent sessions:

@Override
protected void configure(HttpSecurity http) throws Exception {
    http.sessionManagement()
          .maximumSessions(1)
          .maxSessionsPreventsLogin(true);
}

Note that – as we are using a custom UserDetails implementation – we need to override equals() and hashCode() because the session control strategy stores all principals in a map and needs to be able to retrieve them:

public class UserPrincipal implements UserDetails {

    private User user;

    @Override
    public int hashCode() {
        int prime = 31;
        int result = 1;
        result = (prime * result) + ((user == null) ? 0 : user.hashCode());
        return result;
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj) {
            return true;
        }
        if (obj == null) {
            return false;
        }
        if (getClass() != obj.getClass()) {
            return false;
        }
        UserPrincipal other = (UserPrincipal) obj;
        if (user == null) {
            if (other.user != null) {
                return false;
            }
        } else if (!user.equals(other.user)) {
            return false;
        }
        return true;
    }
}
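To see why the overrides matter, here is a quick check with a stripped-down stand-in for UserPrincipal (the class below is just an illustration): without equals() and hashCode(), the second map lookup would fail because it uses a different instance.

```java
import java.util.Objects;

// Stripped-down stand-in for UserPrincipal: identity is derived from
// the wrapped username, so two instances wrapping the same user are
// interchangeable as map keys.
final class SimplePrincipal {

    private final String username;

    SimplePrincipal(String username) {
        this.username = username;
    }

    @Override
    public boolean equals(Object obj) {
        return obj instanceof SimplePrincipal
          && Objects.equals(username, ((SimplePrincipal) obj).username);
    }

    @Override
    public int hashCode() {
        return Objects.hash(username);
    }
}
```

With these overrides in place, a registry keyed by principal can find the existing session even though the lookup uses a freshly created instance.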

4. Separate API Servlet

The application is now serving both the front end and the API out of the same servlet – which is not ideal.

Let’s now split these two major responsibilities apart and pull them into two different servlets:

@Bean
public ServletRegistrationBean frontendServlet() {
    ServletRegistrationBean registration = 
      new ServletRegistrationBean(new DispatcherServlet(), "/*");

    Map<String, String> params = new HashMap<String, String>();
    params.put("contextClass", 
      "org.springframework.web.context.support.AnnotationConfigWebApplicationContext");
    params.put("contextConfigLocation", "org.baeldung.config.frontend");
    registration.setInitParameters(params);
    
    registration.setName("FrontendServlet");
    registration.setLoadOnStartup(1);
    return registration;
}

@Bean
public ServletRegistrationBean apiServlet() {
    ServletRegistrationBean registration = 
      new ServletRegistrationBean(new DispatcherServlet(), "/api/*");
    
    Map<String, String> params = new HashMap<String, String>();
    params.put("contextClass", 
      "org.springframework.web.context.support.AnnotationConfigWebApplicationContext");
    params.put("contextConfigLocation", "org.baeldung.config.api");
    
    registration.setInitParameters(params);
    registration.setName("ApiServlet");
    registration.setLoadOnStartup(2);
    return registration;
}

@Override
protected SpringApplicationBuilder configure(final SpringApplicationBuilder application) {
    application.sources(Application.class);
    return application;
}

Note how we now have a front-end servlet that handles all front end requests and only bootstraps a Spring context specific for the front end; and then we have the API Servlet – bootstrapping an entirely different Spring context for the API.

Also – very important – these two servlet Spring contexts are child contexts. The parent context – created by SpringApplicationBuilder – scans the root package for common configuration like persistence, service, … etc.

Here is our WebFrontendConfig:

@Configuration
@EnableWebMvc
@ComponentScan({ "org.baeldung.web.controller.general" })
public class WebFrontendConfig extends WebMvcConfigurerAdapter {

    @Bean
    public static PropertySourcesPlaceholderConfigurer 
      propertySourcesPlaceholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }

    @Bean
    public ViewResolver viewResolver() {
        InternalResourceViewResolver viewResolver = new InternalResourceViewResolver();
        viewResolver.setPrefix("/WEB-INF/jsp/");
        viewResolver.setSuffix(".jsp");
        return viewResolver;
    }

    @Override
    public void configureDefaultServletHandling(DefaultServletHandlerConfigurer configurer) {
        configurer.enable();
    }

    @Override
    public void addViewControllers(ViewControllerRegistry registry) {
        super.addViewControllers(registry);
        registry.addViewController("/home");
        ...
    }

    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry.addResourceHandler("/resources/**").addResourceLocations("/resources/");
    }
}

And WebApiConfig:

@Configuration
@EnableWebMvc
@ComponentScan({ "org.baeldung.web.controller.rest", "org.baeldung.web.dto" })
public class WebApiConfig extends WebMvcConfigurerAdapter {

    @Bean
    public ModelMapper modelMapper() {
        return new ModelMapper();
    }
}

5. Unshorten Feed URLs

Finally – we’re going to make working with RSS better.

Sometimes, RSS feeds are shortened or redirected through an external service such as Feedburner – so when we’re loading the URL of a feed in the application – we need to make sure we follow that URL through all the redirects until we reach the main URL we actually care about.

So – when we post the article’s link to Reddit, we actually post the correct, original URL:

@RequestMapping(value = "/url/original")
@ResponseBody
public String getOriginalLink(@RequestParam("url") String sourceUrl) {
    try {
        List<String> visited = new ArrayList<String>();
        String currentUrl = sourceUrl;
        while (!visited.contains(currentUrl)) {
            visited.add(currentUrl);
            currentUrl = getOriginalUrl(currentUrl);
        }
        return currentUrl;
    } catch (Exception ex) {
        // log the exception
        return sourceUrl;
    }
}

private String getOriginalUrl(String oldUrl) throws IOException {
    URL url = new URL(oldUrl);
    HttpURLConnection connection = (HttpURLConnection) url.openConnection();
    connection.setInstanceFollowRedirects(false);
    String originalUrl = connection.getHeaderField("Location");
    connection.disconnect();
    if (originalUrl == null) {
        return oldUrl;
    }
    if (originalUrl.indexOf("?") != -1) {
        return originalUrl.substring(0, originalUrl.indexOf("?"));
    }
    return originalUrl;
}

A few things to take note of with this implementation:

  • We’re handling multiple levels of redirection
  • We’re also keeping track of all visited URLs to avoid redirect loops
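The redirect-following loop above can be exercised without touching the network by swapping the HTTP lookup for a map of fake redirects – a small, self-contained sketch (the URLs are made up):

```java
import java.util.*;

public class UnshortenSketch {

    // Hypothetical redirect table standing in for HTTP "Location" headers
    static final Map<String, String> REDIRECTS = Map.of(
        "http://short/a", "http://short/b",
        "http://short/b", "http://example.com/article?utm=feed"
    );

    // Mirrors getOriginalUrl: one hop, strip any query string, else return the input
    static String oneHop(String url) {
        String next = REDIRECTS.get(url);
        if (next == null) {
            return url;
        }
        int q = next.indexOf('?');
        return q != -1 ? next.substring(0, q) : next;
    }

    // Mirrors getOriginalLink: follow hops until a URL repeats
    static String unshorten(String sourceUrl) {
        List<String> visited = new ArrayList<>();
        String current = sourceUrl;
        while (!visited.contains(current)) {
            visited.add(current);
            current = oneHop(current);
        }
        return current;
    }

    public static void main(String[] args) {
        System.out.println(unshorten("http://short/a")); // http://example.com/article
    }
}
```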

6. Conclusion

And that’s it – a few solid improvements to make the Reddit application better. The next step is to do some performance testing of the API and see how it behaves in a production scenario.


Java Web Weekly 40



At the very beginning of last year, I decided to track my reading habits and share the best stuff here, on Baeldung. Haven’t missed a review since.

Here we go…

1. Spring and Java

>> Spring From the Trenches: Parsing Date and Time Information From a Request Parameter [petrikainulainen.net]

A solid, to the point write-up on passing date information to a Spring controller.

>> Annotation-driven event listeners in Spring 4.2+ [solidsoft]

The new event infrastructure in Spring 4.2 is way cool – I’ve been using it since it made it into the first milestone and I’ve never implemented ApplicationListener since :)

>> React.js and Spring Data REST: Part 3 – Conditional Operations [spring.io]

A strong installment of the Spring Data REST Series this week, looking at versioning and conditional operations in HTTP.

This series is getting better and better.

>> ${… } placeholders support in @Value annotations in Spring [codeleak.pl]

Some interesting, advanced use cases of working with properties in Spring.

>> Case for Defaulting to G1 Garbage Collector in Java 9 [infoq.com]

Looks like G1 is finally going to be the default – no more switching to it manually. That being said – I’ve had a few instances (very few) where G1 wasn’t the best option, so it’s always important to test these kinds of low-level changes.

>> ELK, Docker and Spring Boot [labouisse.com]

Getting our log data into an ELK instance is oh-so important. It opens up a lot of insights into what’s happening with the system, especially if we’re talking about a widely distributed system (which I am).

So – have a look at this one, copy the parts that match your scenario, stand up a small ELK instance and get your log data into it.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> Persistence in CQRS Read Models [squirrel.pl]

This is a series I’m following with a lot of interest. Event Sourcing isn’t a good fit for every system, but where it does fit, it’s an order of magnitude architectural improvement over a traditional approach.

>> Microservice Deployment [toomuchcoding]

As I do more and more microservice work – I found this piece quite insightful and anchored in the day to day reality of getting a system deployed.

>> Chaos Engineering Upgraded [netflix.com]

The mythical chaos monkey Netflix unleashed a few years ago is growing up.

Also worth reading:

3. Musings

>> Remote-First vs. Remote-Friendly [zachholman.com]

Building a remote-first workplace is tough, but it can be done and done well. Times, they are a changin.

>> Prediction Markets for Software Estimates [daedtech.com]

A fun read about estimations and an interesting proposal to solve it. Not sure if it would actually work, but it would definitely be fun to try.

Also worth reading:

4. Comics

And my favorite comics of the week:

>> Ingenuity of Sorts [cube-drone.com]

>> Open Workspace Environment [dilbert.com]

>> I hope the next thing you want is sarcasm [dilbert.com]

5. Pick of the Week

After an intensive week of recording, I just launched the Starter Class of “REST With Spring”:

>> REST with Spring – The Starter Class

It feels good to stop calling it “upcoming”.

Scheduling in Spring with Quartz


1. Overview

In this tutorial we’ll build a simple Scheduler in Spring with Quartz.

We’ll begin with a simple goal in mind – to easily configure a new scheduled job.

1.1. Key Components of the Quartz API

Quartz has a modular architecture. It consists of several basic components that can be combined as required. In this tutorial, we’ll focus on the ones that are common to every job: Job, JobDetail, Trigger and Scheduler.

Although we will use Spring to manage the application, each individual component can be configured in two ways: the Quartz way or the Spring way (using its convenience classes).

We will cover both where possible for the sake of completeness, but either approach can be adopted. Let’s start building, one component at a time.

2. Job and JobDetail

2.1. Job

The API provides a Job interface having just one method – execute. It must be implemented by the class that contains the actual work to be done, i.e. the task. When a job’s trigger fires, the scheduler invokes the execute method, passing it a JobExecutionContext object.

The JobExecutionContext provides the job instance with information about its runtime environment, including a handle to the scheduler, a handle to the trigger, and the job’s JobDetail object.

In this quick example – the job delegates the task to a service class:

@Component
public class SampleJob implements Job {

    @Autowired
    private SampleJobService jobService;

    public void execute(JobExecutionContext context) throws JobExecutionException {
        jobService.executeSampleJob();
    }
}

2.2. JobDetail

While the job is the workhorse, Quartz does not store an actual instance of the job class. Instead, we can define an instance of the Job using the JobDetail class. The job’s class must be provided to the JobDetail so that it knows the type of the job to be executed.

2.3. Quartz JobBuilder

The Quartz JobBuilder provides a builder-style API for constructing JobDetail entities.

@Bean
public JobDetail jobDetail() {
    return JobBuilder.newJob().ofType(SampleJob.class)
      .storeDurably()
      .withIdentity("Qrtz_Job_Detail")  
      .withDescription("Invoke Sample Job service...")
      .build();
}

2.4. Spring JobDetailFactoryBean

Spring’s JobDetailFactoryBean provides bean-style usage for configuring JobDetail instances. It uses the Spring bean name as the job name, if not otherwise specified:

@Bean
public JobDetailFactoryBean jobDetail() {
    JobDetailFactoryBean jobDetailFactory = new JobDetailFactoryBean();
    jobDetailFactory.setJobClass(SampleJob.class);
    jobDetailFactory.setDescription("Invoke Sample Job service...");
    jobDetailFactory.setDurability(true);
    return jobDetailFactory;
}

A new instance of the job class is created for every execution. The JobDetail object conveys the detailed properties of the job to that instance. Once the execution is completed, references to the job instance are dropped.

3. Trigger

A Trigger is the mechanism to schedule a Job, i.e. a Trigger instance “fires” the execution of a job. There’s a clear separation of responsibilities between the Job (notion of task) and Trigger (scheduling mechanism).

In addition to Job, the trigger also needs a type that can be chosen based on the scheduling requirements.

Let’s say, we want to schedule our task to execute once every hour, indefinitely – we can use Quartz’s TriggerBuilder or Spring’s SimpleTriggerFactoryBean to do so.

3.1. Quartz TriggerBuilder

TriggerBuilder is a builder-style API for constructing the Trigger entity:

@Bean
public Trigger trigger(JobDetail job) {
    return TriggerBuilder.newTrigger().forJob(job)
      .withIdentity("Qrtz_Trigger")
      .withDescription("Sample trigger")
      .withSchedule(simpleSchedule().repeatForever().withIntervalInHours(1))
      .build();
}

3.2. Spring SimpleTriggerFactoryBean

SimpleTriggerFactoryBean provides bean-style usage for configuring SimpleTrigger. It uses the Spring bean name as the trigger name and defaults to indefinite repetition, if not otherwise specified:

@Bean
public SimpleTriggerFactoryBean trigger(JobDetail job) {
    SimpleTriggerFactoryBean trigger = new SimpleTriggerFactoryBean();
    trigger.setJobDetail(job);
    trigger.setRepeatInterval(3600000);
    trigger.setRepeatCount(SimpleTrigger.REPEAT_INDEFINITELY);
    return trigger;
}
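Note that setRepeatInterval expects milliseconds – 3,600,000 ms is one hour. Computing the value makes the intent clearer than a magic number:

```java
import java.util.concurrent.TimeUnit;

public class IntervalSketch {
    public static void main(String[] args) {
        // the hourly repeat interval, expressed explicitly
        long oneHourMs = TimeUnit.HOURS.toMillis(1);
        System.out.println(oneHourMs); // 3600000
    }
}
```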

4. Configuring the JobStore

JobStore provides the storage mechanism for the Job and Trigger, and is responsible for maintaining all the data relevant to the job scheduler. The API supports both in-memory and persistent stores.

For example purposes, we will use the in-memory RAMJobStore which offers blazing-fast performance and simple configuration via quartz.properties.

org.quartz.jobStore.class=org.quartz.simpl.RAMJobStore

The obvious drawback of the RAMJobStore is that it is volatile in nature. All the scheduling information is lost between shutdowns. If job definitions and schedules must be kept between shutdowns, the persistent JDBCJobStore must be used instead.
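As a sketch, switching to the JDBC store is mostly a matter of pointing quartz.properties at a configured data source (the data source name below is illustrative, and the QRTZ_ tables must already exist in the database):

```properties
org.quartz.jobStore.class=org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.StdJDBCDelegate
org.quartz.jobStore.dataSource=quartzDataSource
org.quartz.jobStore.tablePrefix=QRTZ_
```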

5. Scheduler

The Scheduler interface is the main API for interfacing with the job scheduler.

A Scheduler can be instantiated with a SchedulerFactory. Once created, Jobs and Triggers can be registered with it. Initially, the Scheduler is in “stand-by” mode, and its start method must be invoked to start the threads that fire the execution of jobs.

5.1. Quartz StdSchedulerFactory

By simply invoking the getScheduler method on the StdSchedulerFactory, we can instantiate the Scheduler, initialize it (with the configured JobStore and ThreadPool) and return a handle to its API:

@Bean
public Scheduler scheduler(Trigger trigger, JobDetail job) throws SchedulerException, IOException {
    StdSchedulerFactory factory = new StdSchedulerFactory();
    factory.initialize(new ClassPathResource("quartz.properties").getInputStream());

    Scheduler scheduler = factory.getScheduler();
    scheduler.setJobFactory(springBeanJobFactory());
    scheduler.scheduleJob(job, trigger);

    scheduler.start();
    return scheduler;
}

5.2. Spring SchedulerFactoryBean

Spring’s SchedulerFactoryBean provides bean-style usage for configuring a Scheduler, manages its life-cycle within the application context, and exposes the Scheduler as a bean for dependency injection:

@Bean
public SchedulerFactoryBean scheduler(Trigger trigger, JobDetail job) {
    SchedulerFactoryBean schedulerFactory = new SchedulerFactoryBean();
    schedulerFactory.setConfigLocation(new ClassPathResource("quartz.properties"));

    schedulerFactory.setJobFactory(springBeanJobFactory());
    schedulerFactory.setJobDetails(job);
    schedulerFactory.setTriggers(trigger);
    return schedulerFactory;
}

5.3. Configuring SpringBeanJobFactory

The SpringBeanJobFactory provides support for injecting the scheduler context, job data map, and trigger data entries as properties into the job bean while creating an instance.

However, it lacks support for injecting bean references from the application context. Thanks to the author of this blog post, we can add auto-wiring support to SpringBeanJobFactory like so:

@Bean
public SpringBeanJobFactory springBeanJobFactory() {
    AutoWiringSpringBeanJobFactory jobFactory = new AutoWiringSpringBeanJobFactory();
    jobFactory.setApplicationContext(applicationContext);
    return jobFactory;
}
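The AutoWiringSpringBeanJobFactory class itself isn’t listed here; a common sketch of it, following the pattern from that blog post, autowires each job instance right after Quartz creates it:

```java
import org.quartz.spi.TriggerFiredBundle;
import org.springframework.beans.factory.config.AutowireCapableBeanFactory;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.scheduling.quartz.SpringBeanJobFactory;

public final class AutoWiringSpringBeanJobFactory extends SpringBeanJobFactory
  implements ApplicationContextAware {

    private transient AutowireCapableBeanFactory beanFactory;

    @Override
    public void setApplicationContext(ApplicationContext context) {
        beanFactory = context.getAutowireCapableBeanFactory();
    }

    @Override
    protected Object createJobInstance(TriggerFiredBundle bundle) throws Exception {
        // let the base factory create the job, then wire in its Spring dependencies
        Object job = super.createJobInstance(bundle);
        beanFactory.autowireBean(job);
        return job;
    }
}
```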

6. Conclusion

That’s all. We have just built our first basic scheduler using the Quartz API as well as Spring’s convenience classes.

The key takeaway from this tutorial is that we were able to configure a job with just a few lines of code and without using any XML-based configuration.

The complete source code for the example is available in this github project. It is a Maven project which can be imported and run as-is. The default setting uses Spring’s convenience classes, which can be easily switched to Quartz API with a run-time parameter (refer to the README.md in the repository).

Java Web Weekly 41



At the very beginning of last year, I decided to track my reading habits and share the best stuff here, on Baeldung. Haven’t missed a review since.

Here we go…

1. Java and Spring

>> Using Jenkins Job DSL for Job Lifecycle Management [codecentric.de]

Jenkins can be quick and easy. Jenkins done right however, can be quite powerful at scale, paired with the right tools.

>> Java EE 8 MVC: A detailed look at Controllers [mscharhag.com]

A quick look at Controllers in Java EE 8. I’ve been covering a lot of Spring, so this might be an interesting change of scenery.

>> Simpler handling of asynchronous transaction bound events in Spring 4.2+ [solidsoft]

A quick exploration of the new transactional support that comes along with the new event support in Spring 4.2 – good stuff.

>> The Spring Boot Dashboard in STS – Part 1: Local Boot Apps [spring.io]

A cool new Spring Boot dashboard in Eclipse STS – makes working with microservices a whole lot easier.

>> Deploying Spring Boot applications to Heroku [codecentric.de]

A practical, end to end process of running a Boot app over on Heroku.

>> What the Heck Is Mutation Testing? [codeaffine.com]

Code coverage is such an imperfect metric (to put it kindly). To go beyond it is definitely worthwhile – and makes sense to evaluate, provided you have the basic stuff dialed in already.

>> Rapid Development with Hibernate in CQRS Read Models [squirrel.pl]

Some solid practical tips on using Hibernate for an Event Sourced system.

Also worth reading:

Time to upgrade:

2. Technical

>> Apache JMeter Tutorial [codefx.org]

A solid intro to JMeter.

I’ve personally been using Gatling more and more lately, but JMeter is still a go to tool that I recommend and use.

>> Navigating DRY IETF Specs [bizcoder.com]

Reading the official IETF specs well is more of an art than a science; here are some very good tips helping out.

Also worth reading:

3. Musings

>> Why All The Fear of Electronic Voting? [techblog.bozho.net]

Good notes around the thorny problem of moving the voting system to the web.

>> Doing the dishes [dandreamsofcoding.com]

Who would have thought? Doing the dishes and coding are actually similar. And someone else out there has stolen my dish-washing technique.

>> The Evolution of a Freelancer: Lessons from the Hallway Track at DYFConf [swizec.com]

What he said.

Also worth reading:

4. Comics

And my favorite Dilberts of the week:

>> Doc, scrub in, we got the liver [dilbert.com]

>> An army of mole-people [dilbert.com]

>> A sewerside mission [dilbert.com]

5. Pick of the Week

>> Hiring Engineers, a Process [hueniverse.com]

JSON API in a Java Web Application


1. Overview

In this article, we’ll start exploring the JSON-API spec and how it can be integrated into a Java-backed REST API. We’re also using some minor Spring annotations, but these can easily be taken out to make the project fully independent of Spring.

We’ll use the Katharsis implementation of JSON-API in Java – and we’ll set up a Katharsis powered Servlet – so all we need is a servlet-based application.

2. Maven

First, let’s take a look at our maven configuration – we need to add the following dependency into our pom.xml:

<dependency>
    <groupId>io.katharsis</groupId>
    <artifactId>katharsis-servlet</artifactId>
    <version>1.0.0</version>
</dependency>

3. A User Resource

Next, let’s take a look at our User resource:

@JsonApiResource(type = "users")
public class User {

    @JsonApiId
    private Long id;

    private String name;

    private String email;
}

Note that:

  • @JsonApiResource annotation is used to define our resource User
  • @JsonApiId annotation is used to define the resource identifier

And very briefly – the persistence for this example is going to be a Spring Data repository here (but of course it doesn’t have to be):

public interface UserRepository extends JpaRepository<User, Long> {}

4. A Resource Repository

Next, let’s discuss our resource repository – each resource should have a ResourceRepository to publish the API operations available on it:

@Component
public class UserResourceRepository implements ResourceRepository<User, Long> {

    @Autowired
    private UserRepository userRepository;

    @Override
    public User findOne(Long id, RequestParams params) {
        return userRepository.findOne(id);
    }

    @Override
    public Iterable<User> findAll(RequestParams params) {
        return userRepository.findAll();
    }

    @Override
    public Iterable<User> findAll(Iterable<Long> ids, RequestParams params) {
        return userRepository.findAll(ids);
    }

    @Override
    public <S extends User> S save(S entity) {
        return userRepository.save(entity);
    }

    @Override
    public void delete(Long id) {
        userRepository.delete(id);
    }
}

A quick note here – if you’re used to a Spring style of application, then this is of course very similar to a Spring controller.

5. Katharsis Filter

As we’re using katharsis-servlet, we’ll need to implement a filter by extending the AbstractKatharsisFilter:

@Component
public class JsonApiFilter extends AbstractKatharsisFilter implements BeanFactoryAware {

    private static final String DEFAULT_RESOURCE_SEARCH_PACKAGE = "org.baeldung.persistence";
    private static final String RESOURCE_DEFAULT_DOMAIN = "http://localhost:8080";

    private BeanFactory beanFactory;

    @Override
    public void setBeanFactory(BeanFactory beanFactory) throws BeansException {
        this.beanFactory = beanFactory;
    }

    @Override
    protected KatharsisInvokerBuilder createKatharsisInvokerBuilder() {
        KatharsisInvokerBuilder builder = new KatharsisInvokerBuilder();

        builder.resourceSearchPackage(DEFAULT_RESOURCE_SEARCH_PACKAGE).
          resourceDefaultDomain(RESOURCE_DEFAULT_DOMAIN).
          jsonServiceLocator(new JsonServiceLocator() {
            @Override
            public <T> T getInstance(Class<T> clazz) {
                return beanFactory.getBean(clazz);
            }
        });

        return builder;
    }
}

Notice that – in this case, we are using a few minor Spring related artifacts – turning this filter into a bean – just so that it can play well in a Spring enabled system.

Also note how we only need to override the createKatharsisInvokerBuilder() method – based on our project configuration.

With that – we can now start consuming the API; for example:

  • GET “http://localhost:8080/users“: to get all users.
  • POST “http://localhost:8080/users“: to add a new user – and more.

6. Relationships

Next, let’s discuss how to handle entities relationships in our JSON API.

6.1. Role Resource

First, let’s introduce a new resource – Role:

@JsonApiResource(type = "roles")
public class Role {

    @JsonApiId
    private Long id;

    private String name;

    @JsonApiToMany
    private Set<User> users;
}

And then set up a many-to-many relation between User and Role:

@JsonApiToMany
@JsonApiIncludeByDefault
private Set<Role> roles;

6.2. Role Resource Repository

Very quickly – here is our Role resource repository:

@Component
public class RoleResourceRepository implements ResourceRepository<Role, Long> {

    @Autowired
    private RoleRepository roleRepository;

    @Override
    public Role findOne(Long id, RequestParams params) {
        return roleRepository.findOne(id);
    }

    @Override
    public Iterable<Role> findAll(RequestParams params) {
        return roleRepository.findAll();
    }

    @Override
    public Iterable<Role> findAll(Iterable<Long> ids, RequestParams params) {
        return roleRepository.findAll(ids);
    }

    @Override
    public <S extends Role> S save(S entity) {
        return roleRepository.save(entity);
    }

    @Override
    public void delete(Long id) {
        roleRepository.delete(id);
    }
}

It’s important to understand that this single resource repository doesn’t handle the relationship aspect – that takes a separate repository.

6.3. Relationship Repository

In order to handle the many-to-many relationship between User and Role, we need to create a new style of repository:

@Component
public class UserToRoleRelationshipRepository 
  implements RelationshipRepository<User, Long, Role, Long> {

    @Autowired
    private UserRepository userRepository;

    @Autowired
    private RoleRepository roleRepository;

    @Override
    public void setRelation(User user, Long roleId, String fieldName) { }

    @Override
    public void setRelations(User user, Iterable<Long> roleIds, String fieldName) {
        Set<Role> roles = new HashSet<Role>();
        roles.addAll(roleRepository.findAll(roleIds));
        user.setRoles(roles);
        userRepository.save(user);
    }

    @Override
    public void addRelations(User user, Iterable<Long> roleIds, String fieldName) {
        Set<Role> roles = user.getRoles();
        roles.addAll(roleRepository.findAll(roleIds));
        user.setRoles(roles);
        userRepository.save(user);
    }

    @Override
    public void removeRelations(User user, Iterable<Long> roleIds, String fieldName) {
        Set<Role> roles = user.getRoles();
        roles.removeAll(roleRepository.findAll(roleIds));
        user.setRoles(roles);
        userRepository.save(user);
    }

    @Override
    public Role findOneTarget(
      Long sourceId, String fieldName, RequestParams requestParams) {
        return null;
    }

    @Override
    public Iterable<Role> findManyTargets(
      Long sourceId, String fieldName, RequestParams requestParams) {
        User user = userRepository.findOne(sourceId);
        return user.getRoles();
    }
}

We’re ignoring the singular methods here, in the relationship repository.

7. Test

Finally, let’s analyze a few requests and really understand what the JSON-API output looks like.

We’re going to start by retrieving a single User resource (with id = 2):

GET http://localhost:8080/users/2

{
    "data":{
        "type":"users",
        "id":"2",
        "attributes":{
            "email":"tom@test.com",
            "username":"tom"
        },
        "relationships":{
            "roles":{
                "links":{
                    "self":"http://localhost:8080/users/2/relationships/roles",
                    "related":"http://localhost:8080/users/2/roles"
                }
            }
        },
        "links":{
            "self":"http://localhost:8080/users/2"
        }
    },
    "included":[
        {
            "type":"roles",
            "id":"1",
            "attributes":{
                "name":"ROLE_USER"
            },
            "relationships":{
                "users":{
                    "links":{
                        "self":"http://localhost:8080/roles/1/relationships/users",
                        "related":"http://localhost:8080/roles/1/users"
                    }
                }
            },
            "links":{
                "self":"http://localhost:8080/roles/1"
            }
        }
    ]
}

Take-aways:

  • The main attributes of the Resource are found in data.attributes
  • The main relationships of the Resource are found in data.relationships
  • As we used @JsonApiIncludeByDefault for the roles relationship, it is included in the JSON, under the included node

Next – let’s get the collection resource containing the Roles:

GET http://localhost:8080/roles

{
    "data":[
        {
            "type":"roles",
            "id":"1",
            "attributes":{
                "name":"ROLE_USER"
            },
            "relationships":{
                "users":{
                    "links":{
                        "self":"http://localhost:8080/roles/1/relationships/users",
                        "related":"http://localhost:8080/roles/1/users"
                    }
                }
            },
            "links":{
                "self":"http://localhost:8080/roles/1"
            }
        },
        {
            "type":"roles",
            "id":"2",
            "attributes":{
                "name":"ROLE_ADMIN"
            },
            "relationships":{
                "users":{
                    "links":{
                        "self":"http://localhost:8080/roles/2/relationships/users",
                        "related":"http://localhost:8080/roles/2/users"
                    }
                }
            },
            "links":{
                "self":"http://localhost:8080/roles/2"
            }
        }
    ],
    "included":[

    ]
}

The quick take-away here is that we get all Roles in the system – as an array in the data node.

8. Conclusion

JSON-API is a fantastic spec – finally adding some structure in the way we use JSON in our APIs and really powering a true Hypermedia API.

This piece explored one way to set it up in a web app. We’ll definitely explore other ways – perhaps more idiomatic to Spring – in the future. But regardless of that implementation, the spec itself is – in my view – very, very promising work.

The complete source code for the example is available in this github project. It is a Maven project which can be imported and run as-is.
