
Baeldung Weekly Review 25


I usually post about Dev stuff on Twitter - you can follow me there:

At the very beginning of 2014 I decided to track my reading habits and share the best stuff here, on Baeldung.

2014 has been quite the year, covering each week with a review. I’ve been doing a lot more reading to make sure I cover and curate stuff that has value and is actually worth reading.

Let me know in the comments if you’re finding my reviews interesting and useful.

Here we go…

1. Spring and Java

>> Motivation And Goals Of Project Jigsaw

A solid introduction to Jigsaw and the modularization work going into Java 9 – both context and attention to detail.

>> Oracle Proposes G1 as the Default Garbage Collector for Java 9

A very interesting change potentially slated for JDK 9.

>> JDT Improvements, Top Eclipse Mars Feature #5

Some cool improvements dropping with Eclipse Mars in a few days, including better Java 8 support.

>> Cache auto-configuration in Spring Boot 1.3

Caching support becomes easier with the upcoming Boot.

>> DevTools in Spring Boot 1.3

A nice set of additions to Boot, with the clear goal of making development easier across the board.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> HTTP/2 Implementation Status

The State of the Union for HTTP/2.

>> Tor for Technologists

A great writeup on Tor – how it works and why you would use it.

Also worth reading:

3. Musings

>> My Candidate Description

An interesting thought exercise and a very useful writeup for two reasons. One – by writing something like this, you’ll likely bring a lot of clarity to your own goals and aspirations, and two – it’s a good link to hand over to recruiters.

Also worth reading:

4. Comics

And my favorite Dilberts of the week:

>> I think mauve has the most ram

>> Here’s a nickel, kid

>> The network token fell out and is in the room somewhere

5. Pick of the Week

Earlier this year I introduced the “Pick of the Week” section here in my “Weekly Review”. If you’re already on my email list – you got the pick already – hope you enjoyed it.

If not – you can share the review and unlock it right here:


A User Profile in the Reddit App


1. Overview

In this article we’re going to be building a Profile for the user of our Reddit application – to allow them to configure user-specific preferences.

The goal is simple – instead of having the user fill in the same data each time they schedule a new post, they can set it once – in the preferences of their profile. Of course the user can always tune these settings for each post – but the idea is they don’t have to.

2. The Preference Entity

Overall, most things that can now be configured in the application are going to become globally configurable in the user profile.

First, let’s start with a Preference entity:

@Entity
public class Preference {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    private String email;

    private String subreddit;

    private boolean sendReplies;

    // for post re-submission
    private int noOfAttempts;
    private int timeInterval;
    private int minScoreRequired;
    private int minUpvoteRatio;
    private boolean keepIfHasComments;
    private boolean deleteAfterLastAttempt;
}

So, what can we now configure? Simply put – defaults for pretty much every setting in the application.

We’re also storing the email of the user to allow them to receive notifications about what’s happening to their posts.

Now, let’s link the preferences to the user:

@Entity
public class User {
    ...
    
    @OneToOne
    @JoinColumn(name = "preference_id")
    private Preference preference;
}

As you can see, we have a simple one-to-one relation between User and Preference.

3. Simple Profile Page

First, let’s create our simple profile page:

<form >
    <input type="hidden" name="id" />
    <input name="email" type="email"/>
    <input name="subreddit"/>
    ...
   <button onclick="editPref()" >Save Changes</button>
</form>
<script>
$(function() {
    $.get("user/preference", function (data){
        $.each(data, function(key, value) {
            $('*[name="'+key+'"]').val(value);
        });
    });
});
function editPref(){
    var data = {};
    $('form').serializeArray().map(function(x){data[x.name] = x.value;});
    $.ajax({
        url: "user/preference/"+$('input[name="id"]').val(),
        data: JSON.stringify(data),
        type: 'PUT',
        contentType:'application/json'
    }).done(function() { window.location.href = "./"; })
      .fail(function(error) { alert(error.responseText); }); 
}
</script>

Nothing fancy here – just some plain HTML and JavaScript.

Let’s also add a quick link to the new profile:

<h1>Welcome, <a href="profile" sec:authentication="principal.username">username</a></h1>

4. The API

And here’s the controller for creating and editing the user’s preferences:

@Controller
@RequestMapping(value = "/user/preference")
public class UserPreferenceController {

    @Autowired
    private PreferenceRepository preferenceRepository;

    @RequestMapping(method = RequestMethod.GET)
    @ResponseBody
    public Preference getCurrentUserPreference() {
        return getCurrentUser().getPreference();
    }

    @RequestMapping(value = "/{id}", method = RequestMethod.PUT)
    @ResponseStatus(HttpStatus.OK)
    public void updateUserPreference(@RequestBody Preference pref) {
        preferenceRepository.save(pref);
        getCurrentUser().setPreference(pref);
    }
}
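
The getCurrentUser() helper isn’t shown here; a minimal sketch – assuming the authenticated principal is our own User entity – might look like this:

private User getCurrentUser() {
    // assumption: the principal stored by Spring Security is our User entity
    return (User) SecurityContextHolder.getContext()
      .getAuthentication().getPrincipal();
}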

Finally, we need to make sure that, when the user is created, their preferences are also initialized:

public void loadAuthentication(String name, OAuth2AccessToken token) {
    ...
    Preference pref = new Preference();
    preferenceRepository.save(pref);
    user.setPreference(pref);
    userRepository.save(user);
    ...   
}

5. Load/Use Preferences

Now – let’s see how to use these preferences and fill them in whenever they’re required.

We’ll start with the main Post Schedule page – where we’ll load in the preferences of the user:

$(function() {
    $.get("user/preference", function (data){
        $.each(data, function(key, value) {
            $('*[name="'+key+'"]').val(value);
        });
    });
});

6. Testing and Conclusion

We’re almost done – we just need to implement some basic integration tests for the new Preference entity we just introduced.

For the most part, we’re simply going to be extending the existing base persistence test and inherit a battery of tests from that.
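
Purely as an illustration – the base class name and its hook methods below are hypothetical, not the project’s actual test infrastructure – such a test could look along these lines:

public class PreferencePersistenceIntegrationTest
  extends AbstractPersistenceIntegrationTest<Preference> {

    @Autowired
    private PreferenceRepository preferenceRepository;

    // the (hypothetical) base class runs its battery of CRUD tests
    // against whatever entity and repository the subclass supplies
    @Override
    protected Preference createNewEntity() {
        Preference pref = new Preference();
        pref.setEmail("john@test.com");
        return pref;
    }

    @Override
    protected JpaRepository<Preference, Long> getRepository() {
        return preferenceRepository;
    }
}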

Finally – we can wrap up the new user profile functionality – users are now able to set up their own preferences.

Baeldung Weekly Review 26


I usually post about Dev stuff on Twitter - you can follow me there:

At the very beginning of 2014 I decided to track my reading habits and share the best stuff here, on Baeldung.

2014 has been quite the year, covering each week with a review. I’ve been doing a lot more reading to make sure I cover and curate stuff that has value and is actually worth reading.

Let me know in the comments if you’re finding my reviews interesting and useful.

Here we go…

1. Spring and Java

>> Eclipse Ships Tenth Annual Release Train (+ InfoQ coverage)

Eclipse Mars (4.5) is out – here’s what’s cool about this release. Also – here’s the rundown of the top 10 features in this release.

>> Building Microservices: Using an API Gateway

A must-read piece on using the API Gateway pattern in a Microservices centric architecture.

In my own experience, I found the proxy/gateway pattern does a great job consolidating and clearly identifying responsibilities in the system. It also addresses a host of problems like – authentication, CORS, service discovery and clients being affected by early internal refactorings of the microservices.

Overall, a very good read.

>> Writing a download server. Part I: Always stream, never keep fully in memory

Good caching goes a long way towards a well behaved server implementation.

>> Blue-Green Deployment With a Single Database

If you’ve been deploying software to production for a while, you’re probably doing some version of this process. Setting it up for the first time though – that feels good :)

>> Java Logging Basics

A comprehensive deep-dive into Java logging – definitely bookmark material.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> Upgrades Without Tears Part 1 — Introduction to Blue/Green Deployment on AWS

>> Upgrades Without Tears Part 2 — Blue/Green Deployment Step By Step on AWS

Another solid look at blue-green deployments.

>> Refactoring with Loops and Collection Pipelines

A practical, step-by-step flow of moving from processing a collection with a loop control structure towards a functional, lambda based approach. A very good read if you’re starting on that path.

>> A Story about How Just a Few Characters Can Make Such a Big Difference in Performance

The performance of a good vs bad regex, alongside a deep-dive into how the bad expression actually matches. A good one to have pen and paper next to you.

Also worth reading:

3. Musings

>> 5 Tips for Being an Effective Tech Lead

Words to live by, and definitely a must read for devs that are transitioning into a lead or architect role.

>> It’s a Large Batch Life for Us

On the words we use.

>> Just Wear Headphones

A quick but interesting piece on wearing headphones to block out the noise of an improperly designed office.

>> Why offices are where work goes to die

Very good read on working in an office as opposed to being remote. This really hits home with me, as I’ve been mostly working remote for over a year now.

This style of work jives with me and my personality quite well. Overall, it’s of course highly dependent on a bunch of internal and external factors, and it may not fit well with everyone.

Also worth reading:

4. Comics

And my favorite Dilberts of the week:

>> I’m changing all my estimates to: To Be Determined

>> That idea won’t work

>> Apparently you don’t understand science

5. Pick of the Week

Earlier this year I introduced the “Pick of the Week” section here in my “Weekly Review”. If you’re already on my email list – you got the pick already – hope you enjoyed it.

If not – you can share the review and unlock it right here:

Displaying Dates in the Timezone of the User


1. Overview

In this installment of the Reddit app case study, we’re going to add the ability to schedule posts according to the user’s timezone.

Dealing with timezones is notoriously difficult and the technical options are wide open. Our first concern is that we need to show dates to the user according to their own (configurable) timezone. We also need to decide what format the date will be saved as, in the database.

2. A New User Preference – timezone

First, we’ll add a new field – timezone – to our already existing preferences:

@Entity
public class Preference {
    ...
    private String timezone;
}

We then simply make the timezone configurable in the user Preferences Page – leveraging a simple but very useful jQuery plugin:

<select id="timezone" name="timezone"></select>
<script>
    $(function() {
        $('#timezone').timezones();
    });
</script>

Note that the default timezone is the server timezone – which runs on UTC.
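
Purely as an illustration – the actual server setup isn’t shown in this series, so treat this as an assumption about the environment – one common way to pin a JVM to UTC is:

// assumption: the server JVM is explicitly pinned to UTC at startup
TimeZone.setDefault(TimeZone.getTimeZone("UTC"));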

3. The Controller

Now, for the fun part. We need to convert dates from the user’s timezone to the server’s timezone:

@Controller
@RequestMapping(value = "/api/scheduledPosts")
public class ScheduledPostRestController {
    private static final SimpleDateFormat dateFormat = 
      new SimpleDateFormat("yyyy-MM-dd HH:mm");
     
    @RequestMapping(method = RequestMethod.POST)
    @ResponseStatus(HttpStatus.OK)
    public void schedule(
      @RequestBody Post post, 
      @RequestParam(value = "date") String date) throws ParseException 
    {
        post.setSubmissionDate(
          calculateSubmissionDate(date, getCurrentUser().getPreference().getTimezone()));
        ...
    }
     
    @RequestMapping(value = "/{id}", method = RequestMethod.PUT)
    @ResponseStatus(HttpStatus.OK)
    public void updatePost(
      @RequestBody Post post, 
      @RequestParam(value = "date") String date) throws ParseException 
    {
        post.setSubmissionDate(
          calculateSubmissionDate(date, getCurrentUser().getPreference().getTimezone()));
        ...
    }
    
    private Date calculateSubmissionDate(String dateString, String userTimeZone) 
      throws ParseException {
        dateFormat.setTimeZone(TimeZone.getTimeZone(userTimeZone));
        return dateFormat.parse(dateString);
    }
}

The conversion is pretty straightforward, but do note that it’s only happening on write operations – the server still returns UTC for reads.

That’s perfectly fine for our client, because we’ll do the conversion in JS – but it’s worth understanding that, for read operations, the server still returns UTC dates.

4. The Front-End

Now – let’s see how to use the user’s timezone in the front-end:

4.1. Display the Posts

We will need to display the post’s submissionDate using the user’s timezone:

<table><thead><tr>
<th>Post title</th>
<th>Submission Date 
  (<span id="timezone" sec:authentication="principal.preference.timezone">UTC</span>)</th>
</tr></thead></table>

And here is our function loadPage():

function loadPage(page){
    ...
    $('.table').append('<tr><td>'+post.title+'</td><td>'+
      convertDate(post.submissionDate)+'</td></tr>');
    ...
}
function convertDate(date){
    var serverTimezone = [[${#dates.format(#calendars.createToday(), 'z')}]];
    var serverDate = moment.tz(date, serverTimezone);
    var clientDate = serverDate.clone().tz($("#timezone").html());
    var myformat = "YYYY-MM-DD HH:mm";
    return clientDate.format(myformat);
}

Moment.js helps here with the timezone conversion.

4.2. Schedule a new Post

We also need to modify our schedulePostForm.html:

Submission Date (<span sec:authentication="principal.preference.timezone">UTC</span>)
<input id="date" name="date" />

<script type="text/javascript">
function schedulePost(){
    var data = {};
    $('form').serializeArray().map(function(x){data[x.name] = x.value;});
    $.ajax({
        url: 'api/scheduledPosts?date='+$("#date").val(),
        data: JSON.stringify(data),
        type: 'POST',
        contentType:'application/json',
        success: function(result) {
            window.location.href="scheduledPosts";
        },
        error: function(error) {
            alert(error.responseText);
        }   
    }); 
}
</script>

Finally – we also need to modify our editPostForm.html to localize the old submissionDate value:

$(function() {
    var serverTimezone = [[${#dates.format(#calendars.createToday(), 'z')}]];
    var serverDate = moment.tz($("#date").val(), serverTimezone);
    var clientDate = serverDate.clone().tz($("#timezone").html());
    var myformat = "YYYY-MM-DD HH:mm";
    $("#date").val(clientDate.format(myformat));
});

5. Conclusion

In this quick article, we introduced a simple but highly useful feature into the Reddit app – the ability to see everything according to your own timezone.

This was one of the main pain points as I was using the app – the fact that everything was in UTC. Now – all dates are properly displayed in the timezone of the user, as they should be.

Baeldung Weekly Review 27


I usually post about Dev stuff on Twitter - you can follow me there:

At the very beginning of 2014 I decided to track my reading habits and share the best stuff here, on Baeldung.

2014 has been quite the year, covering each week with a review. I’ve been doing a lot more reading to make sure I cover and curate stuff that has value and is actually worth reading.

Let me know in the comments if you’re finding my reviews interesting and useful.

Here we go…

1. Spring and Java

>> The Features Project Jigsaw Brings To Java 9

We’re continuing the deep-dive into Jigsaw here, with a lot more practical details.

>> Who Cares About toString Performance?

Cool and useful insights into the performance of various toString implementations – generated with various libraries by the IDE. Who knew?

>> String Substring

The low level memory implications of String.substring over the history of Java.

>> More compact Mockito with Java 8, lambda expressions and Mockito-Java8 add-ons

Some very cool Mockito goodness.

>> Spring Tool Suite 3.7.0 released

On the tail of the Eclipse Mars release last week, the new Spring-specific version of Eclipse – STS – is out. Here are the New and Noteworthy.

>> Mars on Linux

And – are you using Eclipse Mars on Linux? I am, and I’ve been seeing some weird visual artifacts – this is probably why.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> We Don’t Have Time to Learn It

A tactical and nuanced analysis of proposing a new tool to your team. Some good take-aways here.

>> Microservice Trade-Offs

The discussion around microservices is clearly becoming more mature. This is a fantastic read whether you’re doing microservices or not.

>> The Little Singleton

This is a fun read.

Also worth reading:

3. Musings

>> Office Politics 101 for Recovering Idealists

An insightful, solid piece on the nuances of office politics and interactions; a very good read.

Also worth reading:

4. Comics

And my favorite Dilberts of the week:

>> Breakthroughs in green energy

>> My people call it an avatar

>> Seven more layers of management

5. Pick of the Week

Earlier this year I introduced the “Pick of the Week” section here in my “Weekly Review”. If you’re already on my email list – you got the pick already – hope you enjoyed it.

If not – you can share the review and unlock it right here:

Baeldung Weekly Review 28


I usually post about Dev stuff on Twitter - you can follow me there:

At the very beginning of 2014 I decided to track my reading habits and share the best stuff here, on Baeldung.

2014 has been quite the year, covering each week with a review. I’ve been doing a lot more reading to make sure I cover and curate stuff that has value and is actually worth reading.

Let me know in the comments if you’re finding my reviews interesting and useful.

Here we go…

1. Spring and Java

>> Casting In Java 8 (And Beyond?) [codefx]

Quite a good back to basics on casting in Java. Some interesting takeaways here.

>> Spring Data JPA Tutorial: Creating Database Queries From Method Names [petrikainulainen]

>> Spring Data JPA Tutorial: Creating Database Queries With the @Query Annotation [petrikainulainen]

All about Spring Data JPA and creating/generating queries.

>> The Java Garbage Collection Mini-Book [infoq]

You know how every few months, there’s another good article on Garbage Collection in Java? This 100 page mini-book might mean you can skip all of them.

Good reference material and a solid candidate for weekend reading.

>> JDK 8 Massive Open and Online Course: Lambdas and Streams Introduction [oracle]

An interesting initiative. I haven’t gone through the course, but it looks well made and potentially worth exploring.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> Microservices Resource Guide [martinfowler]

This might be the home base for anything Microservices.

>> Comments on “You Do it Too” [aphyr]

Some useful nuances of CAP.

Also worth reading:

3. Musings

>> DevOpsCulture [martinfowler]

Beyond the buzzword, the DevOps movement holds some intrinsically valuable ideas for building a healthy and holistic culture.

>> How Much Does an Experienced Programmer Use Google? [two-wrongs]

Surgical use of Google. That should be a course for new developers.

Also worth reading:

4. Comics

And my favorite Dilberts of the week:

>> Task Force to Eliminate Redundancies

>> I Assume You Hate Your Customers’ Guts

>> Becoming a Useful Member of Society

5. Pick of the Week

Earlier this year I introduced the “Pick of the Week” section here in my “Weekly Review”. If you’re already on my email list – you got the pick already – hope you enjoyed it.

If not – you can share the review and unlock it right here:

Decoupling Registration from Login in the Reddit App


1. Overview

In this tutorial – we’ll replace the Reddit backed OAuth2 authentication process with a simpler, form-based login. We’ll still be able to authenticate the application with Reddit; we just won’t use Reddit to drive our main login flow.

2. Basic User Registration

First, let’s replace the old authentication flow.

2.1. The User Entity

We’ll make a few changes to the User entity: make the username unique, add a password field (temporary) and implement the UserDetails interface:

@Entity
public class User implements UserDetails{
    ...

    @Column(nullable = false, unique = true)
    private String username;

    private String password;

    @Override
    public Collection<? extends GrantedAuthority> getAuthorities() {
        return Arrays.asList(new SimpleGrantedAuthority("ROLE_USER"));
    }

    @Override
    public boolean isAccountNonExpired() {
        return true;
    }
    @Override
    public boolean isAccountNonLocked() {
        return true;
    }
    @Override
    public boolean isCredentialsNonExpired() {
        return true;
    }
    @Override
    public boolean isEnabled() {
        return true;
    }
    ...
}

2.2. Register A New User

Next – let’s see how to register a new user in the backend:

@Controller
@RequestMapping(value = "/user")
public class UserController {

    @Autowired
    private UserService service;

    @RequestMapping(value = "/register", method = RequestMethod.POST)
    @ResponseStatus(HttpStatus.OK)
    public void register(
      @RequestParam("username") String username, 
      @RequestParam("email") String email,
      @RequestParam("password") String password) 
    {
        service.registerNewUser(username, email, password);
    }
}

Obviously this is a basic create operation for the user – no bells and whistles.

Here’s the actual implementation, in the service layer:

@Service
public class UserService {
    @Autowired
    private UserRepository userRepository;

    @Autowired
    private PreferenceRepository preferenceRepository;

    @Autowired
    private PasswordEncoder passwordEncoder;

    public void registerNewUser(String username, String email, String password) {
        User existingUser = userRepository.findByUsername(username);
        if (existingUser != null) {
            throw new UsernameAlreadyExistsException("Username already exists");
        }
        
        User user = new User();
        user.setUsername(username);
        user.setPassword(passwordEncoder.encode(password));
        Preference pref = new Preference();
        pref.setTimezone(TimeZone.getDefault().getID());
        pref.setEmail(email);
        preferenceRepository.save(pref);
        user.setPreference(pref);
        userRepository.save(user);
    }
}

2.3. Dealing with Exceptions

And the simple UsernameAlreadyExistsException:

public class UsernameAlreadyExistsException extends RuntimeException {

    public UsernameAlreadyExistsException(String message) {
        super(message);
    }
    public UsernameAlreadyExistsException(String message, Throwable cause) {
        super(message, cause);
    }
}

The exception is dealt with in the main exception handler of the application:

@ExceptionHandler({ UsernameAlreadyExistsException.class })
public ResponseEntity<Object> 
  handleUsernameAlreadyExists(RuntimeException ex, WebRequest request) {
    logger.error("400 Status Code", ex);
    String bodyOfResponse = ex.getLocalizedMessage();
    return new 
      ResponseEntity<Object>(bodyOfResponse, new HttpHeaders(), HttpStatus.BAD_REQUEST);
}

2.4. A Simple Register Page

Finally – a simple front-end signup.html:

<form>
    <input  id="username"/>
    <input  id="email"/>
    <input type="password" id="password" />
    <button onclick="register()">Sign up</button>
</form>

<script>
function register(){
    $.post("user/register", {username: $("#username").val(),
      email: $("#email").val(), password: $("#password").val()}, 
      function (data){
        window.location.href= "./";
    }).fail(function(error){
        alert("Error: "+ error.responseText);
    }); 
}
</script>

It’s worth mentioning again that this isn’t a fully mature registration process – just a very quick flow. For a complete registration flow, you can check out the main registration series here on Baeldung.

3. New Login Page

Here is our new and simple login page:

<div th:if="${param.containsKey('error')}">
Invalid username or password
</div>
<form method="post" action="j_spring_security_check">
    <input name="username" />
    <input type="password" name="password"/>  
    <button type="submit" >Login</button>
</form>
<a href="signup">Sign up</a>

4. Security Configuration

Now – let’s take a look at the new security configuration:

@Configuration
@EnableWebSecurity
@ComponentScan({ "org.baeldung.security" })
public class SecurityConfig extends WebSecurityConfigurerAdapter {

    @Autowired
    private MyUserDetailsService userDetailsService;

    @Override
    protected void configure(AuthenticationManagerBuilder auth) throws Exception {
        auth.userDetailsService(userDetailsService).passwordEncoder(encoder());
    }

    @Override
    protected void configure(HttpSecurity http) throws Exception {
        http
            ...
            .formLogin()
            .loginPage("/")
            .loginProcessingUrl("/j_spring_security_check")
            .defaultSuccessUrl("/home")
            .failureUrl("/?error=true")
            .usernameParameter("username")
            .passwordParameter("password")
            ...
    }

    @Bean
    public PasswordEncoder encoder() { 
        return new BCryptPasswordEncoder(11); 
    }
}

Most things are pretty straightforward, so we won’t go over them in detail here.

And here’s the custom UserDetailsService:

@Service
public class MyUserDetailsService implements UserDetailsService {

    @Autowired
    private UserRepository userRepository;

    @Override
    public UserDetails loadUserByUsername(String username) {
        User user = userRepository.findByUsername(username); 
        if (user == null) {
            // no such user – treat it as an authentication failure
            throw new UsernameNotFoundException(username);
        }
        return user;
    }
}

Note: we used our custom User entity instead of the default Spring Security User.

5. Authenticate Reddit

Now that we’re no longer relying on Reddit for our authentication flow, we need to enable users to connect their accounts to Reddit after they log in.

First – we need to modify the old Reddit login logic:

@RequestMapping("/redditLogin")
public String redditLogin() {
    OAuth2AccessToken token = redditTemplate.getAccessToken();
    service.connectReddit(redditTemplate.needsCaptcha(), token);
    return "redirect:home";
}

And the actual implementation – the connectReddit() method:

@Override
public void connectReddit(boolean needsCaptcha, OAuth2AccessToken token) {
    User currentUser = 
      (User) SecurityContextHolder.getContext().getAuthentication().getPrincipal();
    currentUser.setNeedCaptcha(needsCaptcha);
    currentUser.setAccessToken(token.getValue());
    currentUser.setRefreshToken(token.getRefreshToken().getValue());
    currentUser.setTokenExpiration(token.getExpiration());
    userRepository.save(currentUser);
}

Note how the redditLogin() logic is now used to connect the user’s account in our system with his Reddit account by obtaining the user’s AccessToken.

As for the frontend – that’s quite simple:

<h1>Welcome, 
<small><a href="profile" sec:authentication="principal.username">Bob</a></small>
</h1>
<a th:if="${#authentication.principal.accessToken == null}" href="redditLogin" >
    Connect your Account to Reddit
</a>

We also need to make sure that users connect their accounts to Reddit before trying to submit posts:

@RequestMapping("/post")
public String showSubmissionForm(Model model) {
    if (getCurrentUser().getAccessToken() == null) {
        model.addAttribute("msg", "Sorry, You did not connect your account to Reddit yet");
        return "submissionResponse";
    }
    ...
}

6. Conclusion

The small Reddit app is definitely moving forward.

The old authentication flow – fully backed by Reddit – was causing some problems. So now we have a clean and simple form-based login, while still being able to connect the user’s account to the Reddit API in the back end.

Good stuff.

Testing the API of the Reddit App


1. Overview

We’ve been building out the REST API of our simple Reddit App for a while now – it’s time to get serious and start testing it.

And now that we finally switched to a simpler authentication mechanism, it’s easier to do so as well. We’re going to be using the powerful rest-assured library for all of these live tests.

2. Initial Setup

API tests need a user to run; in order to simplify running tests against the API, we’ll have a test user created beforehand – on application bootstrap:

@Component
public class Setup {
    @Autowired
    private UserRepository userRepository;

    @Autowired
    private PreferenceRepository preferenceRepository;

    @Autowired
    private PasswordEncoder passwordEncoder;

    @PostConstruct
    private void createTestUser() {
        User userJohn = userRepository.findByUsername("john");
        if (userJohn == null) {
            userJohn = new User();
            userJohn.setUsername("john");
            userJohn.setPassword(passwordEncoder.encode("123"));
            userJohn.setAccessToken("token");
            userRepository.save(userJohn);
            final Preference pref = new Preference();
            pref.setTimezone(TimeZone.getDefault().getID());
            pref.setEmail("john@test.com");
            preferenceRepository.save(pref);
            userJohn.setPreference(pref);
            userRepository.save(userJohn);
        }
    }
}

Notice how Setup is a plain bean and we’re using the @PostConstruct annotation to hook in the actual setup logic.

3. Support for Live Tests

Before we start to actually write our tests, let’s first set up some basic supporting functionality we can then leverage.

We need things like authentication, URL paths, and maybe some JSON marshalling and unmarshalling capabilities:

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(
  classes = { TestConfig.class }, 
  loader = AnnotationConfigContextLoader.class)
public class AbstractLiveTest {
    public static final SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm");

    @Autowired
    private CommonPaths commonPaths;

    protected String urlPrefix;

    protected ObjectMapper objectMapper = new ObjectMapper().setDateFormat(dateFormat);

    @Before
    public void setup() {
        urlPrefix = commonPaths.getServerRoot();
    }
    
    protected RequestSpecification givenAuth() {
        FormAuthConfig formConfig 
          = new FormAuthConfig(urlPrefix + "/j_spring_security_check", "username", "password");
        return RestAssured.given().auth().form("john", "123", formConfig);
    }

    protected RequestSpecification withRequestBody(RequestSpecification req, Object obj) 
      throws JsonProcessingException {
        return req.contentType(MediaType.APPLICATION_JSON_VALUE)
          .body(objectMapper.writeValueAsString(obj));
    }
}

We are just defining some simple helper methods and fields to make the actual testing easier:

  • givenAuth(): to perform the authentication
  • withRequestBody(): to send the JSON representation of an Object as the body of the HTTP request

And here is our simple bean – CommonPaths – providing a clean abstraction to the URLs of the system:

@Component
@PropertySource({ "classpath:web-${envTarget:local}.properties" })
public class CommonPaths {

    @Value("${http.protocol}")
    private String protocol;

    @Value("${http.port}")
    private String port;

    @Value("${http.host}")
    private String host;

    @Value("${http.address}")
    private String address;

    public String getServerRoot() {
        if (port.equals("80")) {
            return protocol + "://" + host + "/" + address;
        }
        return protocol + "://" + host + ":" + port + "/" + address;
    }
}

And the local version of the properties file: web-local.properties:

http.protocol=http
http.port=8080
http.host=localhost
http.address=reddit-scheduler

Finally, the very simple test Spring configuration:

@Configuration
@ComponentScan({ "org.baeldung.web.live" })
public class TestConfig {
    @Bean
    public static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }
}

4. Test the /scheduledPosts API

The first API we’re going to test is the /scheduledPosts API:

public class ScheduledPostLiveTest extends AbstractLiveTest {
    private static final String date = "2016-01-01 00:00";

    private Post createPost() throws ParseException, IOException {
        Post post = new Post();
        post.setTitle("test");
        post.setUrl("test.com");
        post.setSubreddit("test");
        post.setSubmissionDate(dateFormat.parse(date));

        Response response = withRequestBody(givenAuth(), post)
          .post(urlPrefix + "/api/scheduledPosts?date=" + date);
        
        return objectMapper.reader().forType(Post.class).readValue(response.asString());
    }
}

First, let’s test scheduling a new post:

@Test
public void whenScheduleANewPost_thenCreated() 
  throws ParseException, IOException {
    Post post = new Post();
    post.setTitle("test");
    post.setUrl("test.com");
    post.setSubreddit("test");
    post.setSubmissionDate(dateFormat.parse(date));

    Response response = withRequestBody(givenAuth(), post)
      .post(urlPrefix + "/api/scheduledPosts?date=" + date);

    assertEquals(201, response.statusCode());
    Post result = objectMapper.reader().forType(Post.class).readValue(response.asString());
    assertEquals(result.getUrl(), post.getUrl());
}

Next, let’s test retrieving all scheduled posts of a user:

@Test
public void whenGettingUserScheduledPosts_thenCorrect() 
  throws ParseException, IOException {
    createPost();

    Response response = givenAuth().get(urlPrefix + "/api/scheduledPosts?page=0");

    assertEquals(200, response.statusCode());
    assertTrue(response.as(List.class).size() > 0);
}

Next, let’s test editing a scheduled post:

@Test
public void whenUpdatingScheduledPost_thenUpdated() 
  throws ParseException, IOException {
    Post post = createPost();

    post.setTitle("new title");
    Response response = withRequestBody(givenAuth(), post).
      put(urlPrefix + "/api/scheduledPosts/" + post.getId() + "?date=" + date);

    assertEquals(200, response.statusCode());
    response = givenAuth().get(urlPrefix + "/api/scheduledPosts/" + post.getId());
    assertTrue(response.asString().contains(post.getTitle()));
}

Finally, let’s test the delete operation in the API:

@Test
public void whenDeletingScheduledPost_thenDeleted() 
  throws ParseException, IOException {
    Post post = createPost();
    Response response = givenAuth().delete(urlPrefix + "/api/scheduledPosts/" + post.getId());

    assertEquals(204, response.statusCode());
}

5. Test the /sites API

Next – let’s test the API publishing the Sites resources – the sites defined by a user:

public class MySitesLiveTest extends AbstractLiveTest {

    private Site createSite() throws ParseException, IOException {
        Site site = new Site("http://www.baeldung.com/feed/");
        site.setName("baeldung");
        
        Response response = withRequestBody(givenAuth(), site)
          .post(urlPrefix + "/sites");

        return objectMapper.reader().forType(Site.class).readValue(response.asString());
    }
}

Let’s test retrieving all the sites of the user:

@Test
public void whenGettingUserSites_thenCorrect() 
  throws ParseException, IOException {
    createSite();
    Response response = givenAuth().get(urlPrefix + "/sites");

    assertEquals(200, response.statusCode());
    assertTrue(response.as(List.class).size() > 0);
}

And also retrieving the articles of a site:

@Test
public void whenGettingSiteArticles_thenCorrect() 
  throws ParseException, IOException {
    Site site = createSite();
    Response response = givenAuth().get(urlPrefix + "/sites/articles?id=" + site.getId());

    assertEquals(200, response.statusCode());
    assertTrue(response.as(List.class).size() > 0);
}

Next, let’s test adding a new Site:

@Test
public void whenAddingNewSite_thenCorrect() 
  throws ParseException, IOException {
    Site site = createSite();

    Response response = givenAuth().get(urlPrefix + "/sites");
    assertTrue(response.asString().contains(site.getUrl()));
}

And deleting it:

@Test
public void whenDeletingSite_thenDeleted() throws ParseException, IOException {
    Site site = createSite();
    Response response = givenAuth().delete(urlPrefix + "/sites/" + site.getId());

    assertEquals(204, response.statusCode());
}

6. Test the /user/preferences API

Finally, let’s focus on the API exposing the preferences of the user.

First, let’s test getting the user’s preferences:

@Test
public void whenGettingPreference_thenCorrect() {
    Response response = givenAuth().get(urlPrefix + "/user/preference");

    assertEquals(200, response.statusCode());
    assertTrue(response.as(Preference.class).getEmail().contains("john"));
}

And editing them:

@Test
public void whenUpdatingPreference_thenCorrect()
  throws JsonProcessingException {
    Preference pref = givenAuth().get(urlPrefix + "/user/preference").as(Preference.class);
    pref.setEmail("john@xxxx.com");
    Response response = withRequestBody(givenAuth(), pref).
      put(urlPrefix + "/user/preference/" + pref.getId());

    assertEquals(200, response.statusCode());
    response = givenAuth().get(urlPrefix + "/user/preference");
    assertEquals(response.as(Preference.class).getEmail(), pref.getEmail());
}

7. Conclusion

In this quick article we put together some basic testing for our REST API.

Nothing too fancy though – more advanced scenarios are needed – but this isn’t about perfection, it’s about progress and iterating in public.


Baeldung Weekly Review 29


I usually post about Dev stuff on Twitter - you can follow me there:

 

>> On Reddit

At the very beginning of 2014 I decided to track my reading habits and share the best stuff here, on Baeldung.

2014 has been quite the year, covering each week with a review. I’ve been doing a lot more reading to make sure I cover and curate stuff that has value and is actually worth reading.

Let me know in the comments if you’re finding my reviews interesting and useful.

Here we go…

1. Spring and Java

>> Microservices with Spring [spring]

A quick and practical intro to building microservices with Spring. It touches on some interesting stuff, definitely worth a read.

>> OpenJDK Requesting Community Feedback on Java 9 Features [infoq]

An in-depth interview on some of the changes in Java 9 as well as how the community can participate and shape the release.

>> Spring Data JPA Tutorial: Creating Database Queries With Named Queries [petrikainulainen]

The next installment in the Spring Data JPA series – further exploring the various ways you can define queries.

>> Writing Clean Tests – Java 8 to the Rescue [petrikainulainen]

Testing on steroids – I’m liking the new approach.

>> Tomcat’s Default Connector(s) [bozho]

A very quick intro to the connectors available in Tomcat and the tradeoffs between them.

>> Removal of sun.misc.Unsafe in Java 9 – A disaster in the making and >> Understanding sun.misc.Unsafe [dripstat], [dzone]

A discussion that has been making the rounds this week, about the plans to sunset Unsafe from the JDK.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical and Musings

>> Dissecting a tech talk: How I topped the charts at NDC [troyhunt]

Oh, I like this piece. It’s a very worthwhile talk by itself, but the meta stuff is even more interesting.

This is how you get better at public speaking.

>> Let’s Put Some Dignity Back into Job Seeking [daedtech]

Resumes can be a messy affair. That being said, hiring – solid, well matched hiring – is a tough nut to crack.

Also worth reading:

3. Comics

And my favorite Dilberts of the week:

>> Desk Standardization Policies

>> The Committee for Naming Conventions

>> Layers of Management

4. Pick of the Week

Earlier this year I introduced the “Pick of the Week” section here in my “Weekly Review”. If you’re already on my email list – you got the pick already – hope you enjoyed it.

If not – you can share the review and unlock it right here:

Adding Roles and Privileges To the Reddit App


1. Overview

In this installment, we’ll introduce simple roles and privileges into our Reddit app, to then be able to do some interesting things such as – limit how many posts a normal user can schedule to Reddit daily.

And since we’re going to have an Admin role – and implicitly an admin user – we’re also going to add an admin management area as well.

2. User, Role and Privilege Entities

First, we will modify the User entity – that we’ve been using throughout our Reddit App series – to add roles:

@Entity
public class User {
    ...

    @ManyToMany(fetch = FetchType.EAGER)
    @JoinTable(name = "users_roles", 
      joinColumns = @JoinColumn(name = "user_id", referencedColumnName = "id"), 
      inverseJoinColumns = @JoinColumn(name = "role_id", referencedColumnName = "id"))
    private Collection<Role> roles;

    ...
}

Note how the User-Role relationship is a flexible many-to-many.

Next, we’re going to define the Role and the Privilege entities. For the full details of that implementation, check out this article on Baeldung.
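
In short – leaving the full details to that article – a minimal sketch of the two entities might look like this (the join table and column names here are assumptions):

@Entity
public class Role {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    private String name;

    // a role groups a set of privileges
    @ManyToMany
    @JoinTable(name = "roles_privileges", 
      joinColumns = @JoinColumn(name = "role_id", referencedColumnName = "id"), 
      inverseJoinColumns = @JoinColumn(name = "privilege_id", referencedColumnName = "id"))
    private Collection<Privilege> privileges;
}

@Entity
public class Privilege {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    private String name;
}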

3. Setup

Next, we’re going to run some basic setup on project bootstrap, to create these roles and privileges:

private void createRoles() {
    Privilege adminReadPrivilege = createPrivilegeIfNotFound("ADMIN_READ_PRIVILEGE");
    Privilege adminWritePrivilege = createPrivilegeIfNotFound("ADMIN_WRITE_PRIVILEGE");
    Privilege postLimitedPrivilege = createPrivilegeIfNotFound("POST_LIMITED_PRIVILEGE");
    Privilege postUnlimitedPrivilege = createPrivilegeIfNotFound("POST_UNLIMITED_PRIVILEGE");

    createRoleIfNotFound("ROLE_ADMIN", Arrays.asList(adminReadPrivilege, adminWritePrivilege));
    createRoleIfNotFound("ROLE_SUPER_USER", Arrays.asList(postUnlimitedPrivilege));
    createRoleIfNotFound("ROLE_USER", Arrays.asList(postLimitedPrivilege));
}
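
The createPrivilegeIfNotFound() and createRoleIfNotFound() helpers aren’t listed here; a minimal sketch – assuming findByName() lookups on the two repositories and simple name-based constructors – could be:

private Privilege createPrivilegeIfNotFound(String name) {
    // assumption: PrivilegeRepository exposes findByName(), like RoleRepository does
    Privilege privilege = privilegeRepository.findByName(name);
    if (privilege == null) {
        privilege = new Privilege(name);
        privilegeRepository.save(privilege);
    }
    return privilege;
}

private Role createRoleIfNotFound(String name, Collection<Privilege> privileges) {
    Role role = roleRepository.findByName(name);
    if (role == null) {
        role = new Role(name);
        role.setPrivileges(privileges);
        roleRepository.save(role);
    }
    return role;
}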

And make our test user an admin:

private void createTestUser() {
    Role adminRole = roleRepository.findByName("ROLE_ADMIN");
    Role superUserRole = roleRepository.findByName("ROLE_SUPER_USER");
    ...
    userJohn.setRoles(Arrays.asList(adminRole, superUserRole));
}

4. Register Standard Users

We’ll also need to make sure that we’re registering standard users via the registerNewUser() implementation:

@Override
public void registerNewUser(String username, String email, String password) {
    ...
    Role role = roleRepository.findByName("ROLE_USER");
    user.setRoles(Arrays.asList(role));
}

Note that the Roles in the system are:

  1. ROLE_USER: for regular users (the default role) – these have a limit on how many posts they can schedule a day
  2. ROLE_SUPER_USER: no scheduling limit
  3. ROLE_ADMIN: additional admin options

5. The Principal

Next, let’s integrate these new privileges into our principal implementation:

public class UserPrincipal implements UserDetails {
    ...

    @Override
    public Collection<? extends GrantedAuthority> getAuthorities() {
        List<GrantedAuthority> authorities = new ArrayList<GrantedAuthority>();
        for (Role role : user.getRoles()) {
            for (Privilege privilege : role.getPrivileges()) {
                authorities.add(new SimpleGrantedAuthority(privilege.getName()));
            }
        }
        return authorities;
    }
}

6. Restrict Scheduled Posts by Standard Users

Let’s now take advantage of the new roles and privileges and restrict standard users from scheduling more than – say – 3 new articles a day – to avoid spamming Reddit.

6.1. Post Repository

First, we’ll add a new operation to our PostRepository implementation – to count the scheduled posts by a specific user in a specific time period:

public interface PostRepository extends JpaRepository<Post, Long> {
    ...
    
    Long countByUserAndSubmissionDateBetween(User user, Date start, Date end);

}

6.2. Scheduled Post Controller

Then, we will add a simple check to both schedule() and updatePost() methods:

public class ScheduledPostRestController {
    private static final int LIMIT_SCHEDULED_POSTS_PER_DAY = 3;

    public Post schedule(HttpServletRequest request,...) throws ParseException {
        ...
        if (!checkIfCanSchedule(submissionDate, request)) {
            throw new InvalidDateException("Scheduling Date exceeds daily limit");
        }
        ...
    }

    private boolean checkIfCanSchedule(Date date, HttpServletRequest request) {
        if (request.isUserInRole("POST_UNLIMITED_PRIVILEGE")) {
            return true;
        }
        Date start = DateUtils.truncate(date, Calendar.DATE);
        Date end = DateUtils.addDays(start, 1);
        long count = postRepository.
          countByUserAndSubmissionDateBetween(getCurrentUser(), start, end);
        return count < LIMIT_SCHEDULED_POSTS_PER_DAY;
    }
}

There are a couple of interesting things going on here. First – notice how we’re manually interacting with Spring Security and checking if the currently logged-in user has a privilege or not. That’s not something you do every day – but when you do have to do it, the API is very useful.

As the logic currently stands – if a user has the POST_UNLIMITED_PRIVILEGE – they’re able to – surprise – schedule however much they choose to.

If however, they don’t have that privilege, they’ll be able to queue up a max of 3 posts per day.

7. The Admin Users Page

Next – now that we have a clear separation of users, based on the role they have – let’s implement some very simple user management for the admin of our small Reddit app.

7.1. Display All Users

First, let’s create a basic page listing all the users in the system.

Here’s the API for listing out all users:

@PreAuthorize("hasRole('ADMIN_READ_PRIVILEGE')")
@RequestMapping(value="/admin/users", method = RequestMethod.GET)
@ResponseBody
public List<User> getUsersList() {
    return service.getUsersList();
}

And the service layer implementation:

@Transactional
public List<User> getUsersList() {
    return userRepository.findAll();
}

Then, the simple front-end:

<table>
    <thead>
        <tr>
            <th>Username</th>
            <th>Roles</th>
            <th>Actions</th></tr>
    </thead>
</table>

<script>
$(function(){
    var userRoles="";
    $.get("admin/users", function(data){
        $.each(data, function( index, user ) {
            userRoles = extractRolesName(user.roles);
            $('.table').append('<tr><td>'+user.username+'</td><td>'+
              userRoles+'</td><td><a href="#" onclick="showEditModal('+
              user.id+',\''+userRoles+'\')">Modify User Roles</a></td></tr>');
        });
    });
});

function extractRolesName(roles){ 
    var result =""; 
    $.each(roles, function( index, role ) { 
        result+= role.name+" "; 
    }); 
    return result; 
}
</script>

7.2. Modify User’s Role

Next, some simple logic to manage the roles of these users; let’s start with the controller:

@PreAuthorize("hasRole('USER_WRITE_PRIVILEGE')")
@RequestMapping(value = "/user/{id}", method = RequestMethod.PUT)
@ResponseStatus(HttpStatus.OK)
public void modifyUserRoles(
  @PathVariable("id") Long id, 
  @RequestParam(value = "roleIds") String roleIds) {
    service.modifyUserRoles(id, roleIds);
}

@PreAuthorize("hasRole('USER_READ_PRIVILEGE')")
@RequestMapping(value = "/admin/roles", method = RequestMethod.GET)
@ResponseBody
public List<Role> getRolesList() {
    return service.getRolesList();
}

And the service layer:

@Transactional
public List<Role> getRolesList() {
    return roleRepository.findAll();
}
@Transactional
public void modifyUserRoles(Long userId, String ids) {
    List<Long> roleIds = new ArrayList<Long>();
    String[] arr = ids.split(",");
    for (String str : arr) {
        roleIds.add(Long.parseLong(str));
    }
    List<Role> roles = roleRepository.findAll(roleIds);
    User user = userRepository.findOne(userId);
    user.setRoles(roles);
    userRepository.save(user);
}

Finally – the simple front-end:

<div id="myModal">
    <h4 class="modal-title">Modify User Roles</h4>
    <input type="hidden" name="id" id="userId"/>
    <div id="allRoles"></div>
    <button onclick="modifyUserRoles()">Save changes</button>
</div>

<script>
function showEditModal(userId, roleNames){
    $("#userId").val(userId);
    $.get("admin/roles", function(data){
        $.each(data, function( index, role ) {
            if(roleNames.indexOf(role.name) != -1){
                $('#allRoles').append(
                  '<input type="checkbox" name="roleIds" value="'+role.id+'" checked/> '+role.name+'<br/>')
            } else{
                $('#allRoles').append(
                  '<input type="checkbox" name="roleIds" value="'+role.id+'" /> '+role.name+'<br/>')
            }
       });
       $("#myModal").modal();
    });
}

function modifyUserRoles(){
    var roles = [];
    $.each($("input[name='roleIds']:checked"), function(){ 
        roles.push($(this).val());
    }); 
    if(roles.length == 0){
        alert("Error, at least select one role");
        return;
    }
 
    $.ajax({
        url: "user/"+$("#userId").val()+"?roleIds="+roles.join(","),
        type: 'PUT',
        contentType:'application/json'
        }).done(function() { window.location.href="users";
        }).fail(function(error) { alert(error.responseText); 
    }); 
}
</script>

8. Security Configuration

Finally, we need to modify the security configuration to redirect the admin users to this new, separate page in the system:

@Autowired 
private AuthenticationSuccessHandler successHandler;

@Override
protected void configure(HttpSecurity http) throws Exception {
    http.
    ...
    .authorizeRequests()
    .antMatchers("/adminHome","/users").hasAuthority("ADMIN_READ_PRIVILEGE")    
    ...
    .formLogin().successHandler(successHandler)
}

We’re using a custom authentication success handler to decide where the user lands after login:

@Component
public class MyAuthenticationSuccessHandler implements AuthenticationSuccessHandler {

    @Override
    public void onAuthenticationSuccess(
      HttpServletRequest request, HttpServletResponse response, Authentication auth) 
      throws IOException, ServletException {
        Set<String> privileges = AuthorityUtils.authorityListToSet(auth.getAuthorities());
        if (privileges.contains("ADMIN_READ_PRIVILEGE")) {
            response.sendRedirect("adminHome");
        } else {
            response.sendRedirect("home");
        }
    }
}

And the extremely simple admin homepage adminHome.html:

<html>
<body>
    <h1>Welcome, <small><span sec:authentication="principal.username">Bob</span></small></h1>
    <br/>
    <a href="users">Display Users List</a>
</body>
</html>

9. Conclusion

In this new part of the case study, we added some simple security artifacts into our app – roles and privileges. With that support, we built two simple features – a scheduling limit for standard users and a bare-bones admin area for admin users.

A Guide To Spring Redirects


I usually post about Spring stuff on Twitter - you can follow me there:

1. Overview

This article will focus on implementing a Redirect in Spring and will discuss the reasoning behind each strategy.

2. Why Do A Redirect?

Let’s first consider the reasons why you may need to do a redirect in a Spring application.

There are a host of examples and reasons of course. One simple one might be POSTing form data, working around the double submission problem, or simply delegating the execution flow to another controller method.

A quick side note here is that the typical Post/Redirect/Get pattern doesn’t fully address double submission issues – problems such as refreshing the page before the initial submission has completed may still result in a double submission.
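
As a quick, purely illustrative sketch of that pattern (the URLs and view names below are made up), a POST handler hands off to a GET endpoint via a redirect, so a browser refresh only repeats the harmless GET:

@Controller
public class FormController {

    // handle the form POST, then redirect – a refresh now re-issues only the GET
    @RequestMapping(value = "/saveForm", method = RequestMethod.POST)
    public String handleSubmit() {
        // ... persist the submitted data ...
        return "redirect:/formSaved";
    }

    // the GET target simply renders a confirmation view
    @RequestMapping(value = "/formSaved", method = RequestMethod.GET)
    public String showConfirmation() {
        return "formSaved";
    }
}

The redirect: prefix used here is covered in detail below.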

3. Redirect with RedirectView

Let’s start with this simple approach – and go straight to an example:

@Controller
@RequestMapping("/")
public class RedirectController {
    
    @RequestMapping(value = "/redirectWithRedirectView", method = RequestMethod.GET)
    public RedirectView redirectWithUsingRedirectView(RedirectAttributes redirectAttributes) {
        redirectAttributes.addFlashAttribute("flashAttribute", "redirectWithRedirectView");
        redirectAttributes.addAttribute("attribute", "redirectWithRedirectView");
        return new RedirectView("redirectedUrl");
    }
}

Behind the scenes, RedirectView will trigger an HttpServletResponse.sendRedirect() – which will perform the actual redirect.

Notice here how we’re injecting the redirect attributes into the method – the framework will do the heavy lifting here and allow us to interact with these attributes cleanly.

We’re adding a model attribute named attribute – which will be exposed as an HTTP query parameter. The model must contain only objects that can be converted to Strings – generally Strings themselves.

Let’s now test our redirect – with the help of a simple curl command:

curl -i http://localhost:8080/spring-rest/redirectWithRedirectView

The result will be:

HTTP/1.1 302 Found
Server: Apache-Coyote/1.1
Location: http://localhost:8080/spring-rest/redirectedUrl?attribute=redirectWithRedirectView

4. Redirect with the prefix redirect:

The previous approach – using RedirectView – is suboptimal for a few reasons.

First – we’re now coupled to the Spring API, because we’re using RedirectView directly in our own code.

Second – we now need to know from the start, when implementing that controller operation – that the end result will always be a redirect – which may not always be the case.

A better option is using the prefix redirect: – the redirect view name is injected into the controller like any other logical view name. The controller is not even aware that redirection is happening.

Here’s what that looks like:

@Controller
@RequestMapping("/")
public class RedirectController {
    
    @RequestMapping(value = "/redirectWithRedirectPrefix", method = RequestMethod.GET)
    public ModelAndView redirectWithUsingRedirectPrefix(ModelMap model) {
        model.addAttribute("attribute", "redirectWithRedirectPrefix");
        return new ModelAndView("redirect:/redirectedUrl", model);
    }
}

When a view name is returned with the prefix redirect: the UrlBasedViewResolver (and all its subclasses) will recognize this as a special indication that a redirect needs to happen. The rest of the view name will be used as the redirect URL.

A quick but important note here is that – when we use this logical view name here – redirect:/redirectedUrl – we’re doing a redirect relative to the current Servlet context.

We can use a view name such as redirect:http://localhost:8080/spring-redirect/redirectedUrl if we need to redirect to an absolute URL.
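
For example – with a handler name and mapping invented purely for illustration:

@RequestMapping(value = "/redirectToExternal", method = RequestMethod.GET)
public ModelAndView redirectWithAbsoluteUrl() {
    // an absolute URL in the view name takes us outside the current Servlet context
    return new ModelAndView("redirect:http://localhost:8080/spring-redirect/redirectedUrl");
}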

So now, when we execute the curl command:

curl -i http://localhost:8080/spring-rest/redirectWithRedirectPrefix

We’ll immediately get redirected:

HTTP/1.1 302 Found
Server: Apache-Coyote/1.1
Location: http://localhost:8080/spring-rest/redirectedUrl?attribute=redirectWithRedirectPrefix

5. Forward with the prefix forward:

Let’s now see how to do something slightly different – a forward.

Before the code, let’s go over a quick high level overview of the semantics of forward vs redirect:

  • redirect will respond with a 302 and the new URL in the Location header; the browser/client will then make another request to the new URL
  • forward happens entirely on a server side; the Servlet container forwards the same request to the target URL; the URL won’t change in the browser

Now let’s look at the code:

@Controller
@RequestMapping("/")
public class RedirectController {
    
    @RequestMapping(value = "/forwardWithForwardPrefix", method = RequestMethod.GET)
    public ModelAndView redirectWithUsingForwardPrefix(ModelMap model) {
        model.addAttribute("attribute", "forwardWithForwardPrefix");
        return new ModelAndView("forward:/redirectedUrl", model);
    }
}

Same as redirect:, the forward: prefix will be resolved by UrlBasedViewResolver and its subclasses. Internally, this will create an InternalResourceView which does a RequestDispatcher.forward() to the new view.

When we execute the command with curl:

curl -I http://localhost:8080/spring-rest/forwardWithForwardPrefix

The result will be:

HTTP/1.1 405 Method Not Allowed
Server: Apache-Coyote/1.1
Allow: GET
Content-Type: text/html;charset=utf-8

To wrap up, compared to the 2 requests that we had in the case of the redirect solution, in this case we only have a single request going out from the browser/client to the server side. The attribute that was previously added by the redirect is of course missing as well.

6. Attributes With RedirectAttributes

Next – let’s look closer at passing attributes in a redirect – making full use of the framework with RedirectAttributes:

@RequestMapping(value = "/redirectWithRedirectAttributes", method = RequestMethod.GET)
public RedirectView redirectWithRedirectAttributes(RedirectAttributes redirectAttributes) {
    redirectAttributes.addFlashAttribute("flashAttribute", "redirectWithRedirectAttributes");
    redirectAttributes.addAttribute("attribute", "redirectWithRedirectAttributes");
    return new RedirectView("redirectedUrl");
}

As we saw before, we can inject the attributes object in the method directly – which makes this mechanism very easy to use.

Notice that we are also adding a flash attribute – an attribute that won’t make it into the URL. With this kind of attribute, we can later access the flash attribute using @ModelAttribute(“flashAttribute”) – but only in the method that is the final target of the redirect:

@RequestMapping(value = "/redirectedUrl", method = RequestMethod.GET)
public ModelAndView redirection(
  ModelMap model, @ModelAttribute("flashAttribute") Object flashAttribute) {
     model.addAttribute("redirectionAttribute", flashAttribute);
     return new ModelAndView("redirection", model);
 }

So, to wrap up – if we test the functionality with curl:

curl -i http://localhost:8080/spring-rest/redirectWithRedirectAttributes

The result will be:

HTTP/1.1 302 Found
Server: Apache-Coyote/1.1
Set-Cookie: JSESSIONID=4B70D8FADA2FD6C22E73312C2B57E381; Path=/spring-rest/; HttpOnly
Location: http://localhost:8080/spring-rest/redirectedUrl;
  jsessionid=4B70D8FADA2FD6C22E73312C2B57E381?attribute=redirectWithRedirectAttributes

That way, using RedirectAttributes instead of a ModelMap gives us the ability to share only some attributes between the two methods involved in the redirect operation.

7. An Alternative Configuration Without The Prefix

Let’s now explore an alternative configuration – a redirect without using the prefix.

To achieve this, we need to use a org.springframework.web.servlet.view.XmlViewResolver:

<bean class="org.springframework.web.servlet.view.XmlViewResolver">
    <property name="location">
        <value>/WEB-INF/spring-views.xml</value>
    </property>
    <property name="order" value="0" />
</bean>

Instead of the org.springframework.web.servlet.view.InternalResourceViewResolver that we used in the previous configuration:

<bean class="org.springframework.web.servlet.view.InternalResourceViewResolver"></bean>

We also need to define a RedirectView bean in the configuration:

<bean id="RedirectedUrl" class="org.springframework.web.servlet.view.RedirectView">
    <property name="url" value="redirectedUrl" />
</bean>

Now we can trigger the redirect by referencing this new bean by its id:

@Controller
@RequestMapping("/")
public class RedirectController {
    
    @RequestMapping(value = "/redirectWithXMLConfig", method = RequestMethod.GET)
    public ModelAndView redirectWithUsingXMLConfig(ModelMap model) {
        model.addAttribute("attribute", "redirectWithXMLConfig");
        return new ModelAndView("RedirectedUrl", model);
    }
}

8. Conclusion

This article illustrated 3 different approaches to implementing a redirect in Spring as well as how to handle/pass attributes when doing these redirects.

I usually post about Spring stuff on Twitter - you can follow me there:


Baeldung Weekly Review 30

$
0
0

I usually post about Dev stuff on Twitter - you can follow me there:

At the very beginning of 2014 I decided to track my reading habits and share the best stuff here, on Baeldung.

2014 has been quite the year, covering each week with a review. I’ve been doing a lot more reading to make sure I cover and curate stuff that has value and is actually worth reading.

Let me know in the comments if you’re finding my reviews interesting and useful.

Here we go…

1. Spring and Java

>> Spring Data JPA Tutorial: Auditing, Part One [petrikainulainen]

Auditing strategies with Spring Data and JPA – this is an important first part of any mature, production-grade system.

>> Testing Web-Applications with JBehave, PhantomJS and PageObjects [codecentric]

I’ve always found the Page Object pattern instrumental in testing web apps. The fluent, English-like syntax that you can get is super useful, and this article is a good illustration of that.

>> What the sun.misc.Unsafe Misery Teaches Us [jooq]

A quick, level headed look at the whole Unsafe removal debacle.

>> Java 9’s New HTTP/2 and REPL [infoq]

Java 9 is indeed coming with a lot of cool new things that are going to move both the language and the ecosystem forward in the next few years.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> Under the Hood of Amazon EC2 Container Service [allthingsdistributed]

Looking under the hood of the new Container Service from EC2. Very interesting stuff, for a lot of strategies I’m currently seeing being implemented manually.

>> Monitoring Microservices: Three Ways to Overcome the Biggest Challenges [loggly]

Quick and interesting read on monitoring and alerting in a microservice architecture.

>> RESTful considered harmful [nurkiewicz]

The nuts and bolts of some of the disadvantages of adhering to a RESTful architecture.

Some of these have nothing to do with REST itself, more with the way it happens to be implemented in the wild, but most of these points have good take-aways, whether you agree with them or not.

Also worth reading:

3. Musings

>> Your Code Is Data [daedtech]

A solid piece exploring static analysis below surface level. Definitely a must read whether you’re using static analysis tools or not – and most definitely if you aren’t.

>> Group Flow in Software Development [hypesystem]

A group in flow is rare, but certainly doable and worth it when you’re there.

Also worth reading:

4. Comics

And my favorite comics of the week:

>> New intern knows best: GOTO

>> I no longer understand what employees say

>> When you start to understand a concept, it marks the beginning of its decline

5. Pick of the Week

Earlier this year I introduced the “Pick of the Week” section here in my “Weekly Review”. If you’re already on my email list – you got the pick already – hope you enjoyed it.

If not – you can share the review and unlock it right here:

Third Round of Improvements to the Reddit Application

$
0
0

1. Overview

In this article we’re going to keep moving our little case study app forward by implementing small but useful improvements to the already existing features.

2. Better Tables

Let’s start by using the jQuery DataTables plugin to replace the old, basic tables the app was using before.

2.1. Post Repository and Service

First, we’ll add a method to count the scheduled posts of a user – leveraging the Spring Data syntax of course:

public interface PostRepository extends JpaRepository<Post, Long> {
    ...
    Long countByUser(User user);
}

Next, let’s take a quick look at the service layer implementation – retrieving the posts of a user based on pagination parameters:

@Override
public List<SimplePostDto> getPostsList(int page, int size, String sortDir, String sort) {
    PageRequest pageReq = new PageRequest(page, size, Sort.Direction.fromString(sortDir), sort);
    Page<Post> posts = postRepository.findByUser(userService.getCurrentUser(), pageReq);
    return constructDataAccordingToUserTimezone(posts.getContent());
}

We’re converting the dates based on the timezone of the user:

private List<SimplePostDto> constructDataAccordingToUserTimezone(List<Post> posts) {
    String timeZone = userService.getCurrentUser().getPreference().getTimezone();
    return posts.stream().map(post -> new SimplePostDto(
      post, convertToUserTimeZone(post.getSubmissionDate(), timeZone)))
      .collect(Collectors.toList());
}

private String convertToUserTimeZone(Date date, String timeZone) {
    dateFormat.setTimeZone(TimeZone.getTimeZone(timeZone));
    return dateFormat.format(date);
}

2.2. The API with Pagination

Next, we’re going to publish this operation with full pagination and sorting, via the API:

@RequestMapping(method = RequestMethod.GET)
@ResponseBody
public List<SimplePostDto> getScheduledPosts(
  @RequestParam(value = "page", required = false, defaultValue = "0") int page, 
  @RequestParam(value = "size", required = false, defaultValue = "10") int size,
  @RequestParam(value = "sortDir", required = false, defaultValue = "asc") String sortDir, 
  @RequestParam(value = "sort", required = false, defaultValue = "title") String sort, 
  HttpServletResponse response) {
    response.addHeader("PAGING_INFO", 
      scheduledPostService.generatePagingInfo(page, size).toString());
    return scheduledPostService.getPostsList(page, size, sortDir, sort);
}

Note how we’re using a custom header to pass the pagination info to the client. There are other, slightly more standard ways to do this – ways we might explore later.
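For reference, one of those more standard options is the Link header defined by RFC 5988; a rough sketch of a helper that could emit it is below – the helper itself and the nextPageUri value are purely illustrative and not part of the application:

// illustrative only – emit an RFC 5988 style pagination link instead of a custom header
private void addLinkHeader(HttpServletResponse response, String nextPageUri) {
    response.addHeader("Link", "<" + nextPageUri + ">; rel=\"next\"");
}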

This implementation however is simple enough – we have a small method to generate the paging information:

public PagingInfo generatePagingInfo(int page, int size) {
    long total = postRepository.countByUser(userService.getCurrentUser());
    return new PagingInfo(page, size, total);
}

And the PagingInfo itself:

public class PagingInfo {
    private long totalNoRecords;
    private int totalNoPages;
    private String uriToNextPage;
    private String uriToPrevPage;

    public PagingInfo(int page, int size, long totalNoRecords) {
        this.totalNoRecords = totalNoRecords;
        // use ceiling division so a partial last page still counts as a page
        this.totalNoPages = (int) Math.ceil((double) totalNoRecords / size);
        if (page > 0) {
            this.uriToPrevPage = "page=" + (page - 1) + "&size=" + size;
        }
        if (page < this.totalNoPages) {
            this.uriToNextPage = "page=" + (page + 1) + "&size=" + size;
        }
    }
}

2.3. Front End

Finally, the simple front-end will use a custom JS method to interact with the API and handle the jQuery DataTable parameters:

<table>
<thead><tr>
<th>Post title</th><th>Submission Date</th><th>Status</th>
<th>Resubmit Attempts left</th><th>Actions</th>
</tr></thead>   
</table>

<script>
$(document).ready(function() {
    $('table').dataTable( {
        "processing": true,
        "searching":false,
        "columnDefs": [
            { "name": "title", "targets": 0 },
            { "name": "submissionDate", "targets": 1 },
            { "name": "submissionResponse", "targets": 2 },
            { "name": "noOfAttempts", "targets": 3 } ],
        "columns": [
            { "data": "title" },
            { "data": "submissionDate" },
            { "data": "submissionResponse" },
            { "data": "noOfAttempts" }],
        "serverSide": true,
        "ajax": function(data, callback, settings) {
            $.get('api/scheduledPosts', {
              size: data.length,
              page: (data.start/data.length),
              sortDir: data.order[0].dir,
              sort: data.columns[data.order[0].column].name
              }, function(res,textStatus, request) {
                var pagingInfo = request.getResponseHeader('PAGING_INFO');
                var total = pagingInfo.split(",")[0].split("=")[1];
                callback({recordsTotal: total, recordsFiltered: total,data: res});
              });
          }
    } );
} );
</script>

2.4. API Testing for Paging

With the API now published, we can write a few simple API tests to make sure the basics of the paging mechanism work as expected:

@Test
public void givenMoreThanOnePage_whenGettingUserScheduledPosts_thenNextPageExist() 
  throws ParseException, IOException {
    createPost();
    createPost();
    createPost();

    Response response = givenAuth().
      params("page", 0, "size", 2).get(urlPrefix + "/api/scheduledPosts");

    assertEquals(200, response.statusCode());
    assertTrue(response.as(List.class).size() > 0);

    String pagingInfo = response.getHeader("PAGING_INFO");
    long totalNoRecords = Long.parseLong(pagingInfo.split(",")[0].split("=")[1]);
    String uriToNextPage = pagingInfo.split(",")[2].replace("uriToNextPage=", "").trim();

    assertTrue(totalNoRecords > 2);
    assertEquals(uriToNextPage, "page=1&size=2");
}

@Test
public void givenMoreThanOnePage_whenGettingUserScheduledPostsForSecondPage_thenCorrect() 
  throws ParseException, IOException {
    createPost();
    createPost();
    createPost();

    Response response = givenAuth().
      params("page", 1, "size", 2).get(urlPrefix + "/api/scheduledPosts");

    assertEquals(200, response.statusCode());
    assertTrue(response.as(List.class).size() > 0);

    String pagingInfo = response.getHeader("PAGING_INFO");
    long totalNoRecords = Long.parseLong(pagingInfo.split(",")[0].split("=")[1]);
    String uriToPrevPage = pagingInfo.split(",")[3].replace("uriToPrevPage=", "").trim();

    assertTrue(totalNoRecords > 2);
    assertEquals(uriToPrevPage, "page=0&size=2");
}

3. Email Notifications

Next, we’re going to build out a basic email notification flow – where a user receives emails when their scheduled posts are being sent:

3.1. Email Configuration

First, let’s do the email configuration:

@Bean
public JavaMailSenderImpl javaMailSenderImpl() {
    JavaMailSenderImpl mailSenderImpl = new JavaMailSenderImpl();
    mailSenderImpl.setHost(env.getProperty("smtp.host"));
    mailSenderImpl.setPort(env.getProperty("smtp.port", Integer.class));
    mailSenderImpl.setProtocol(env.getProperty("smtp.protocol"));
    mailSenderImpl.setUsername(env.getProperty("smtp.username"));
    mailSenderImpl.setPassword(env.getProperty("smtp.password"));
    Properties javaMailProps = new Properties();
    javaMailProps.put("mail.smtp.auth", true);
    javaMailProps.put("mail.smtp.starttls.enable", true);
    mailSenderImpl.setJavaMailProperties(javaMailProps);
    return mailSenderImpl;
}

Along with the necessary properties to get SMTP working:

smtp.host=email-smtp.us-east-1.amazonaws.com
smtp.port=465
smtp.protocol=smtps
smtp.username=example
smtp.password=
support.email=example@example.com

3.2. Fire an Event When a Post is Published

Let’s now make sure we fire off an event when a scheduled post gets published to Reddit successfully:

private void updatePostFromResponse(JsonNode node, Post post) {
    JsonNode errorNode = node.get("json").get("errors").get(0);
    if (errorNode == null) {
        ...
        String email = post.getUser().getPreference().getEmail();
        eventPublisher.publishEvent(new OnPostSubmittedEvent(post, email));
    } 
    ...
}
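For reference – the eventPublisher used here is assumed to be Spring’s standard ApplicationEventPublisher, simply injected into the service:

@Autowired
private ApplicationEventPublisher eventPublisher;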

3.3. Event and Listener

The event implementation is pretty straightforward:

public class OnPostSubmittedEvent extends ApplicationEvent {
    private Post post;
    private String email;

    public OnPostSubmittedEvent(Post post, String email) {
        super(post);
        this.post = post;
        this.email = email;
    }
}

And the listener:

@Component
public class SubmissionListener implements ApplicationListener<OnPostSubmittedEvent> {
    @Autowired
    private JavaMailSender mailSender;

    @Autowired
    private Environment env;

    @Override
    public void onApplicationEvent(OnPostSubmittedEvent event) {
        SimpleMailMessage email = constructEmailMessage(event);
        mailSender.send(email);
    }

    private SimpleMailMessage constructEmailMessage(OnPostSubmittedEvent event) {
        String recipientAddress = event.getEmail();
        String subject = "Your scheduled post submitted";
        SimpleMailMessage email = new SimpleMailMessage();
        email.setTo(recipientAddress);
        email.setSubject(subject);
        email.setText(constructMailContent(event.getPost()));
        email.setFrom(env.getProperty("support.email"));
        return email;
    }

    private String constructMailContent(Post post) {
        return "Your post " + post.getTitle() + " is submitted.\n" +
          "http://www.reddit.com/r/" + post.getSubreddit() + 
          "/comments/" + post.getRedditID();
    }
}

4. Using Post Total Votes

Next, we’ll do some work to simplify the resubmit options – instead of working with the upvote ratio (which was difficult to understand), they will now work with the total number of votes.

We can calculate total number of votes using post score and upvote ratio:

  • Score = upvotes – downvotes
  • Total number of votes = upvotes + downvotes
  • Upvote ratio = upvotes/total number of votes

And so, substituting upvotes = ratio * total and downvotes = (1 – ratio) * total into the score formula, we get score = (2 * ratio – 1) * total, which gives us:

Total number of votes = Math.round( score / ((2 * upvote ratio) – 1) )

For example, a post with a score of 10 and an upvote ratio of 0.75 has 10 / 0.5 = 20 total votes (15 up, 5 down).

First, we’ll modify our scores logic to calculate and keep track of this total number of votes:

public PostScores getPostScores(Post post) {
    ...

    float ratio = node.get("upvote_ratio").floatValue();
    postScore.setTotalVotes(Math.round(postScore.getScore() / ((2 * ratio) - 1)));
    
    ...
}

And of course we’re going to make use of it when checking if a post is considered failed or not:

private boolean didPostGoalFail(Post post) {
    PostScores postScores = getPostScores(post);
    int totalVotes = postScores.getTotalVotes();
    ...
    return (((score < post.getMinScoreRequired()) || 
            (totalVotes < post.getMinTotalVotes())) && 
            !((noOfComments > 0) && post.isKeepIfHasComments()));
}

Finally, we’ll of course remove the old ratio fields from use.

5. Validate Resubmit Options

Finally, we will help the user by adding some validations to the complex resubmit options:

5.1. ScheduledPost Service

Here is the simple checkIfValidResubmitOptions() method:

private boolean checkIfValidResubmitOptions(Post post) {
    return checkIfAllNonZero(
      post.getNoOfAttempts(),
      post.getTimeInterval(),
      post.getMinScoreRequired());
}

private boolean checkIfAllNonZero(int... args) {
    for (int tmp : args) {
        if (tmp == 0) {
            return false;
        }
    }
    return true;
}

We’ll make good use of this validation when scheduling a new post:

public Post schedulePost(boolean isSuperUser, Post post, boolean resubmitOptionsActivated) 
  throws ParseException {
    if (resubmitOptionsActivated && !checkIfValidResubmitOptions(post)) {
        throw new InvalidResubmitOptionsException("Invalid Resubmit Options");
    }
    ...        
}

Note that if the resubmit logic is on – the following fields need to have non-zero values:

  • Number of attempts
  • Time interval
  • Minimum score required

5.2. Exception Handling

Finally – in case of invalid input, the InvalidResubmitOptionsException is handled in our main error handling logic:

@ExceptionHandler({ InvalidResubmitOptionsException.class })
public ResponseEntity<Object> handleInvalidResubmitOptions
  (RuntimeException ex, WebRequest request) {
    
    logger.error("400 Status Code", ex);
    String bodyOfResponse = ex.getLocalizedMessage();
    return new ResponseEntity<Object>(
      bodyOfResponse, new HttpHeaders(), HttpStatus.BAD_REQUEST);
}

5.3. Test Resubmit Options

Finally, let’s test our resubmit options – we will cover both the activated and the deactivated conditions:

public class ResubmitOptionsLiveTest extends AbstractLiveTest {
    private static final String date = "2016-01-01 00:00";

    @Test
    public void 
      givenResubmitOptionsDeactivated_whenSchedulingANewPost_thenCreated() 
      throws ParseException, IOException {
        Post post = createPost();

        Response response = withRequestBody(givenAuth(), post)
          .queryParams("resubmitOptionsActivated", false)
          .post(urlPrefix + "/api/scheduledPosts");

        assertEquals(201, response.statusCode());
        Post result = objectMapper.reader().forType(Post.class).readValue(response.asString());
        assertEquals(result.getUrl(), post.getUrl());
    }

    @Test
    public void 
      givenResubmitOptionsActivated_whenSchedulingANewPostWithZeroAttempts_thenInvalid() 
      throws ParseException, IOException {
        Post post = createPost();
        post.setNoOfAttempts(0);
        post.setMinScoreRequired(5);
        post.setTimeInterval(60);

        Response response = withRequestBody(givenAuth(), post)
          .queryParams("resubmitOptionsActivated", true)
          .post(urlPrefix + "/api/scheduledPosts");

        assertEquals(400, response.statusCode());
        assertTrue(response.asString().contains("Invalid Resubmit Options"));
    }

    @Test
    public void 
      givenResubmitOptionsActivated_whenSchedulingANewPostWithZeroMinScore_thenInvalid() 
      throws ParseException, IOException {
        Post post = createPost();
        post.setMinScoreRequired(0);
        post.setNoOfAttempts(3);
        post.setTimeInterval(60);

        Response response = withRequestBody(givenAuth(), post)
          .queryParams"resubmitOptionsActivated", true)
          .post(urlPrefix + "/api/scheduledPosts");

        assertEquals(400, response.statusCode());
        assertTrue(response.asString().contains("Invalid Resubmit Options"));
    }

    @Test
    public void 
      givenResubmitOptionsActivated_whenSchedulingANewPostWithZeroTimeInterval_thenInvalid() 
      throws ParseException, IOException {
        Post post = createPost();
        post.setTimeInterval(0);
        post.setMinScoreRequired(5);
        post.setNoOfAttempts(3);

        Response response = withRequestBody(givenAuth(), post)
          .queryParams("resubmitOptionsActivated", true)
          .post(urlPrefix + "/api/scheduledPosts");

        assertEquals(400, response.statusCode());
        assertTrue(response.asString().contains("Invalid Resubmit Options"));
    }

    @Test
    public void 
      givenResubmitOptionsActivated_whenSchedulingNewPostWithValidResubmitOptions_thenCreated() 
      throws ParseException, IOException {
        Post post = createPost();
        post.setMinScoreRequired(5);
        post.setNoOfAttempts(3);
        post.setTimeInterval(60);

        Response response = withRequestBody(givenAuth(), post)
          .queryParams("resubmitOptionsActivated", true)
          .post(urlPrefix + "/api/scheduledPosts");

        assertEquals(201, response.statusCode());
        Post result = objectMapper.reader().forType(Post.class).readValue(response.asString());
        assertEquals(result.getUrl(), post.getUrl());
    }

    private Post createPost() throws ParseException {
        Post post = new Post();
        post.setTitle(randomAlphabetic(6));
        post.setUrl("test.com");
        post.setSubreddit(randomAlphabetic(6));
        post.setSubmissionDate(dateFormat.parse(date));
        return post;
    }
}

6. Conclusion

In this installment, we made several improvements that are moving the case study app in the right direction – ease of use.

The whole idea of the Reddit Scheduler app is to allow the user to quickly schedule new articles to Reddit, by getting into the app, doing the work and getting out.

It’s getting there.

Baeldung Weekly Review 31

$
0
0

I usually post about Dev stuff on Twitter - you can follow me there:

At the very beginning of 2014 I decided to track my reading habits and share the best stuff here, on Baeldung.

2014 has been quite the year, covering each week with a review. I’ve been doing a lot more reading to make sure I cover and curate stuff that has value and is actually worth reading.

Let me know in the comments if you’re finding my reviews interesting and useful.

Here we go…

1. Spring and Java

>> Spring Data JPA Tutorial: Auditing, Part Two [petrikainulainen]

A new installment continuing to explore audit functionality with Spring Data JPA – good stuff.

I’m actually gearing up for a similar audit implementation now and will be using these articles as a reference point.

>> A Map of Akka [codecentric]

An intro to what Akka brings to the table.

>> Testing your Liquibase Migrations in Continuous Integration [codecentric]

Testing these kinds of flows in your system – such as DB evolution and migration – is quite important, unless you’re actively looking for trouble.

>> Build High Performance JVM Microservices with Ratpack & Spring Boot [infoq]

A very interesting match between Ratpack and Boot, to address some of the common problems when building microservices.

>> Spring Boot @ConfigurationProperties [java-allandsundry]

Cleanly working with properties is a sign of project maturity. One of many, sure, and maybe a pet peeve of mine, but certainly useful in practice.

This is how Spring Boot makes all of that easier.

>> Ludicrous speed, GO! [brettwooldridge]

Getting HikariCP (the connection pool) up to speed – real numbers included.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> Bypassing Google Authentication on Periscope’s Administration Panel [fin1te] and

>> Messenger.com Site-Wide CSRF [fin1te]

Two great pieces of security reading.

I really enjoy reading these detailed analyses of security issues – they make me triple check everything in my own implementations.

>> Stream processing, Event sourcing, Reactive, CEP… and making sense of it all [confluent]

A good introduction to Event Sourcing with the goal of “finding the wisdom behind the buzzwords“. Solid read.

Also worth reading:

3. Musings

>> Thinking of Freelancing? Don’t Quit Your Day Job. [joelklettke]

Solid advice if you’re thinking of striking out on your own.

>> Why Guessing is not Estimating and Estimating is not Guessing [herdingcats]

A quick writeup about the spectrum between pure guessing and careful estimation.

>> Doing Terrible Things To Your Code [codinghorror]

Starting out? Read it. In your third decade of building software? Read it.

Also worth reading:

4. Comics

And my favorite Dilberts of the week:

>> Crumb-Snatcher? No! Niche Player

>> Anything I don’t understand is easy to do

>> Your script was almost perfect, keep up the good work buddy

5. Pick of the Week

Earlier this year I introduced the “Pick of the Week” section here in my “Weekly Review”. If you’re already on my email list – you got the pick already – hope you enjoyed it.

If not – you can share the review and unlock it right here:

Introduction to Spring Data MongoDB

$
0
0

I usually post about Persistence on Twitter - you can follow me there:

1. Overview

This article will be a quick and practical introduction to Spring Data MongoDB.

We’ll go over the basics using both the MongoTemplate and the MongoRepository, with practical tests to illustrate each operation.

2. MongoTemplate and MongoRepository

The MongoTemplate follows the standard template pattern in Spring and provides a ready to go, basic API to the underlying persistence engine.

The repository follows the Spring Data centric approach and comes with more flexible and complex API operations, based on the well-known access patterns in all Spring Data projects.

For both, we need to start by defining the dependency – for example, in the pom.xml, with Maven:

<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-mongodb</artifactId>
    <version>1.7.2.RELEASE</version>
</dependency>

To check if any new version of the library has been released – track the releases here.

3. Configuration for MongoTemplate

3.1. XML Configuration

Let’s start with the simple XML configuration for the Mongo template:

<bean id="mongo" class="org.springframework.data.mongodb.core.MongoFactoryBean">
    <property name="host" value="localhost" />
</bean>

First, we need to define the factory bean responsible for creating Mongo instances.

Next – we need to actually define (and configure) the template bean:

<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
    <constructor-arg name="mongo" ref="mongo" />
    <constructor-arg name="databaseName" value="test" />
</bean>

And finally we need to define a post processor to translate any MongoExceptions thrown in @Repository annotated classes:

<bean class=
  "org.springframework.dao.annotation.PersistenceExceptionTranslationPostProcessor"/>

3.2. Java Configuration

Let’s now create a similar configuration, only with Java:

@Configuration
public class MongoConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "test";
    }

    @Override
    public Mongo mongo() throws Exception {
        return new MongoClient("127.0.0.1", 27017);
    }

    @Override
    protected String getMappingBasePackage() {
        return "org.baeldung";
    }
}

4. Configuration for MongoRepository

4.1. XML Configuration

To make use of custom repositories (extending the MongoRepository) – we need to continue the configuration from section 3.1 and set up the repos:

<mongo:repositories 
  base-package="org.baeldung.repository" mongo-template-ref="mongoTemplate"/>

4.2. Java Configuration

Similarly, we’ll build on the configuration we already created in section 3.2 and add a new annotation into the mix:

@EnableMongoRepositories(basePackages = "org.baeldung.repository")

4.3. Create the Repository

Now, after the configuration, we need to actually create a repository – extending the existing MongoRepository interface:

public interface UserRepository extends MongoRepository<User, String> {
    // 
}

Now we can autowire this UserRepository and use the operations from MongoRepository, or add custom operations.
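We can also declare derived query methods on the interface and Spring Data will implement them for us. For example, since the User document used in the examples below has a name field, a finder could look like this (the method is just an illustration and isn’t used elsewhere in this article):

public interface UserRepository extends MongoRepository<User, String> {
    List<User> findByName(String name);
}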

5. Using MongoTemplate

5.1. Insert

Let’s start with the insert operation; let’s also start with a completely empty database:

{
}

Now if we insert a new user:

User user = new User();
user.setName("Jon");
mongoTemplate.insert(user, "user");

The database will look like this:

{
    "_id" : ObjectId("55b4fda5830b550a8c2ca25a"),
    "_class" : "org.baeldung.model.User",
    "name" : "Jon"
}

5.2. Save – Insert

The save operation has save-or-update semantics: if an id is present, it performs an update; if not, it does an insert.

Let’s look at the first type of semantic – the insert; here’s the initial state of the database:

{
}

When we now save a new user:

User user = new User();
user.setName("Albert"); 
mongoTemplate.save(user, "user");

The entity will be inserted in the database:

{
    "_id" : ObjectId("55b52bb7830b8c9b544b6ad5"),
    "_class" : "org.baeldung.model.User",
    "name" : "Albert"
}

Next, we’ll look at the same operation – save – with update semantics.

5.3. Save – Update

Let’s now look at save with update semantics, operating on an existing entity:

{
    "_id" : ObjectId("55b52bb7830b8c9b544b6ad5"),
    "_class" : "org.baeldung.model.User",
    "name" : "Jack"
}

Now, when we save the existing user – we will update it:

user = mongoTemplate.findOne(Query.query(Criteria.where("name").is("Jack")), User.class);
user.setName("Jim");
mongoTemplate.save(user, "user");

The database will look like this:

{
    "_id" : ObjectId("55b52bb7830b8c9b544b6ad5"),
    "_class" : "org.baeldung.model.User",
    "name" : "Jim"
}

As you can see, in this particular example, save uses the semantics of update, because we use an object with an existing _id.

5.4. UpdateFirst

updateFirst updates the very first document that matches the query.

Let’s start with the initial state of the database:

[
    {
        "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
        "_class" : "org.baeldung.model.User",
        "name" : "Alex"
    },
    {
        "_id" : ObjectId("55b5ffa5511fee0e45ed614c"),
        "_class" : "org.baeldung.model.User",
        "name" : "Alex"
    }
]

When we now run the updateFirst:

Query query = new Query();
query.addCriteria(Criteria.where("name").is("Alex"));
Update update = new Update();
update.set("name", "James");
mongoTemplate.updateFirst(query, update, User.class);

Only the first entry will be updated:

[
    {
        "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
        "_class" : "org.baeldung.model.User",
        "name" : "James"
    },
    {
        "_id" : ObjectId("55b5ffa5511fee0e45ed614c"),
        "_class" : "org.baeldung.model.User",
        "name" : "Alex"
    }
]

5.5. UpdateMulti

The updateMulti operation updates all documents that match the given query.

First – here’s the state of database before doing the updateMulti:

[
    {
        "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
        "_class" : "org.baeldung.model.User",
        "name" : "Eugen"
    },
    {
        "_id" : ObjectId("55b5ffa5511fee0e45ed614c"),
        "_class" : "org.baeldung.model.User",
        "name" : "Eugen"
    }
]

Now, let’s run the updateMulti operation:

Query query = new Query();
query.addCriteria(Criteria.where("name").is("Eugen"));
Update update = new Update();
update.set("name", "Victor");
mongoTemplate.updateMulti(query, update, User.class);

Both existing objects will be updated in the database:

[
    {
        "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
        "_class" : "org.baeldung.model.User",
        "name" : "Victor"
    },
    {
        "_id" : ObjectId("55b5ffa5511fee0e45ed614c"),
        "_class" : "org.baeldung.model.User",
        "name" : "Victor"
    }
]

5.6. FindAndModify

This operation works like updateFirst, but it returns the object as it was before the modification.

First – the state of database before calling findAndModify:

{
    "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
    "_class" : "org.baeldung.model.User",
    "name" : "Markus"
}

Let’s look at actual operation code:

Query query = new Query();
query.addCriteria(Criteria.where("name").is("Markus"));
Update update = new Update();
update.set("name", "Nick");
User user = mongoTemplate.findAndModify(query, update, User.class);

The returned user object has the same values as the initial state in the database.

However, the new state in the database is:

{
    "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
    "_class" : "org.baeldung.model.User",
    "name" : "Nick"
}

5.7. Upsert

The upsert operation works on find-and-modify-or-create semantics: if the document is matched, update it; otherwise, create a new document by combining the query and the update object.

Let’s start with the initial state of the database:

{
    "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
    "_class" : "org.baeldung.model.User",
    "name" : "Markus"
}

Now – let’s run the upsert:

Query query = new Query();
query.addCriteria(Criteria.where("name").is("Markus"));
Update update = new Update();
update.set("name", "Nick");
mongoTemplate.upsert(query, update, User.class);

Here’s the state of database after the operation:

{
    "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
    "_class" : "org.baeldung.model.User",
    "name" : "Nick"
}

5.8. Remove

The state of database before calling remove:

{
    "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
    "_class" : "org.baeldung.model.User",
    "name" : "Benn"
}

Let’s now run remove:

mongoTemplate.remove(user, "user");

The result will be as expected:

{
}

6. Using MongoRepository

6.1. Insert

First – the state of the database before running the insert:

{
}

Now, when we insert a new user:

User user = new User();
user.setName("Jon");
userRepository.insert(user);

Here’s the end state of the database:

{
    "_id" : ObjectId("55b4fda5830b550a8c2ca25a"),
    "_class" : "org.baeldung.model.User",
    "name" : "Jon"
}

Note how the operation works exactly the same as the insert in the MongoTemplate API.

6.2. Save Insert

Similarly – save works exactly the same as the save operation in the MongoTemplate API.

Let’s start by looking at the insert semantics of the operation; here’s the initial state of the database:

{
}

Now – we execute the save operation:

User user = new User();
user.setName("Aaron");
userRepository.save(user);

This results in the user being added to the database:

{
    "_id" : ObjectId("55b52bb7830b8c9b544b6ad5"),
    "_class" : "org.baeldung.model.User",
    "name" : "Aaron"
}

Note again how, in this example, save works with insert semantics, because we are inserting a new object.

6.3. Save Update

Let’s now look at the same operation but with update semantics.

First – here’s the state of the database before running the new save:

{
    "_id" : ObjectId("55b52bb7830b8c9b544b6ad5"),
    "_class" : "org.baeldung.model.User",
    "name" : "Jack"
}

Now – we execute the operation:

user = mongoTemplate.findOne(Query.query(Criteria.where("name").is("Jack")), User.class);
user.setName("Jim");
userRepository.save(user);

Finally, here is the state of the database:

{
    "_id" : ObjectId("55b52bb7830b8c9b544b6ad5"),
    "_class" : "org.baeldung.model.User",
    "name" : "Jim"
}

Note again how, in this example, save works with update semantics, because we are using an existing object.

6.4. Delete

The state of database before calling delete:

{
    "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
    "_class" : "org.baeldung.model.User",
    "name" : "Benn"
}

Let’s run delete:

userRepository.delete(user);

The result will simply be:

{
}

6.5. FindOne

The state of database when findOne is called:

{
    "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
    "_class" : "org.baeldung.model.User",
    "name" : "Chris"
}

Let’s now execute the findOne:

userRepository.findOne(user.getId())

The result will be the existing data:

{
    "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
    "_class" : "org.baeldung.model.User",
    "name" : "Chris"
}

6.6. Exists

The state of database before calling exists:

{
    "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
    "_class" : "org.baeldung.model.User",
    "name" : "Harris"
}

Now, let’s run exists:

boolean isExists = userRepository.exists(user.getId());

Which of course will return true.

6.7. FindAll with Sort

The state of database before calling findAll:

[
    {
        "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
        "_class" : "org.baeldung.model.User",
        "name" : "Brendan"
    },
    {
       "_id" : ObjectId("67b5ffa5511fee0e45ed614b"),
       "_class" : "org.baeldung.model.User",
       "name" : "Adam"
    }
]

Let’s now run findAll with Sort:

List<User> users = userRepository.findAll(new Sort(Sort.Direction.ASC, "name"));

The result will be sorted by name in ascending order:

[
    {
        "_id" : ObjectId("67b5ffa5511fee0e45ed614b"),
        "_class" : "org.baeldung.model.User",
        "name" : "Adam"
    },
    {
        "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
        "_class" : "org.baeldung.model.User",
        "name" : "Brendan"
    }
]

6.8. FindAll with Pageable

The state of database before calling findAll:

[
    {
        "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
        "_class" : "org.baeldung.model.User",
        "name" : "Brendan"
    },
    {
        "_id" : ObjectId("67b5ffa5511fee0e45ed614b"),
        "_class" : "org.baeldung.model.User",
        "name" : "Adam"
    }
]

Let’s now execute findAll with Pageable:

Pageable pageableRequest = new PageRequest(0, 1);
Page<User> page = userRepository.findAll(pageableRequest);
List<User> users = page.getContent();

The resulting users list will contain only one user:

{
    "_id" : ObjectId("55b5ffa5511fee0e45ed614b"),
    "_class" : "org.baeldung.model.User",
    "name" : "Brendan"
}

7. Annotations

Finally, let’s also go over the simple annotations that Spring Data uses to drive these API operations.

@Id
private String id;

The field level @Id annotation can decorate any type, including long and string.

If the value on the @Id field is not null, it’s stored in the database as-is; otherwise the converter will assume you want to store an ObjectId in the database (either ObjectId, String or BigInteger work).

Next – @Document:

@Document
public class User {
    //
}

This annotation simply marks a class as being a domain object that needs to be persisted to the database, along with allowing us to choose the name of the collection to be used.
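For example, if we want the documents stored in a collection with a name different from the default, we can specify it via the annotation – the collection name here is just an example:

@Document(collection = "app_users")
public class User {
    //
}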

8. Conclusion

This article was a quick but comprehensive introduction to using MongoDB with Spring Data, both via the MongoTemplate API as well as making use of MongoRepository.

The implementation of all these examples and code snippets can be found in my github project – this is an Eclipse based project, so it should be easy to import and run as it is.

I usually post about Persistence on Twitter - you can follow me there:



Externalize Setup Data via CSV in a Spring Application

$
0
0

1. Overview

In this article, we’ll externalize the setup data of an application using CSV files, instead of hardcoding it.

2. A CSV Library

Let’s start by introducing a simple library to work with CSV – the Jackson CSV extension:

<dependency>
    <groupId>com.fasterxml.jackson.dataformat</groupId>
    <artifactId>jackson-dataformat-csv</artifactId>       
    <version>2.5.3</version>
</dependency>

There are of course a host of available libraries to work with CSVs in the Java ecosystem.

The reason we’re going with Jackson here is that – it’s likely that Jackson is already in use in the application, and the processing we need to read the data is fairly straightforward.

3. The Setup Data

Different projects will need to set up different data.

In this tutorial, we’re going to be setting up User data – basically preparing the system with a few default users.

Here’s the simple CSV file containing the users:

id,username,password,accessToken
1,john,123,token
2,tom,456,test

Note how the first row of the file is the header row – listing out the names of the fields in each row of data.

4. CSV Data Loader

Let’s start by creating a simple data loader to read data from the CSV files into working memory.

4.1. Load a List of Objects

We’ll implement the loadObjectList() functionality to load a fully parametrized List of a specific Object type from the file:

public <T> List<T> loadObjectList(Class<T> type, String fileName) {
    try {
        CsvSchema bootstrapSchema = CsvSchema.emptySchema().withHeader();
        CsvMapper mapper = new CsvMapper();
        File file = new ClassPathResource(fileName).getFile();
        MappingIterator<T> readValues = 
          mapper.reader(type).with(bootstrapSchema).readValues(file);
        return readValues.readAll();
    } catch (Exception e) {
        logger.error("Error occurred while loading object list from file " + fileName, e);
        return Collections.emptyList();
    }
}

Notes:

  • We created the CsvSchema based on the first “header” row.
  • The implementation is generic enough to handle any type of object.
  • If any error occurs, an empty list will be returned.
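Using the loader is then a one-liner – for example, loading the users from the CSV file shown in Section 3 (assuming it’s stored as users.csv on the classpath):

List<User> users = csvDataLoader.loadObjectList(User.class, "users.csv");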

4.2. Handle Many to Many Relationship

Nested objects are not well supported in Jackson CSV – we’ll need to use an indirect way to load Many to Many relationships.

We’ll represent these similarly to simple Join Tables – so naturally we’ll load them from disk as a list of arrays:

public List<long[]> loadManyToManyRelationship(String fileName) {
    try {
        CsvMapper mapper = new CsvMapper();
        CsvSchema bootstrapSchema = CsvSchema.emptySchema().withSkipFirstDataRow(true);
        mapper.enable(CsvParser.Feature.WRAP_AS_ARRAY);
        File file = new ClassPathResource(fileName).getFile();
        MappingIterator<long[]> readValues = 
          mapper.reader(long[].class).with(bootstrapSchema).readValues(file);
        return readValues.readAll();
    } catch (Exception e) {
        logger.error(
          "Error occurred while loading many to many relationship from file = " + fileName, e);
        return Collections.emptyList();
    }
}

Here’s how one of these relationships – Roles <-> Privileges – is represented in a simple CSV file:

role_id,privilege_id
1,1
1,2
2,4
3,3

Note how we’re ignoring the header in this implementation, as we don’t really need that information.

5. Setup Data

Now, we’ll use a simple Setup bean to do all the work of setting up privileges, roles and users from CSV files:

@Component
public class Setup {
    ...
    
    @PostConstruct
    private void setupData() {
        setupRolesAndPrivileges();
        setupUsers();
    }
    
    ...
}

5.1. Setup Roles and Privileges

First, let’s load roles and privileges from disk into working memory, and then persist them as part of the setup process:

public List<Privilege> getPrivileges() {
    return csvDataLoader.loadObjectList(Privilege.class, PRIVILEGES_FILE);
}

public List<Role> getRoles() {
    List<Privilege> allPrivileges = getPrivileges();
    List<Role> roles = csvDataLoader.loadObjectList(Role.class, ROLES_FILE);
    List<long[]> rolesPrivileges = 
      csvDataLoader.loadManyToManyRelationship(SetupData.ROLES_PRIVILEGES_FILE);

    for (long[] rolePrivilege : rolesPrivileges) {
        Role role = findById(roles, rolePrivilege[0]);
        Set<Privilege> privileges = role.getPrivileges();
        if (privileges == null) {
            privileges = new HashSet<Privilege>();
        }
        privileges.add(findById(allPrivileges, rolePrivilege[1]));
        role.setPrivileges(privileges);
    }
    return roles;
}

private <T extends IEntity> T findById(List<T> list, long id) { 
    return list.stream().filter(item -> item.getId() == id).findFirst().get(); 
}

Then we’ll do the persist work here:

private void setupRolesAndPrivileges() {
    privilegeRepository.save(setupData.getPrivileges());
    roleRepository.save(setupData.getRoles());
}

Note how, after we load both Roles and Privileges into working memory, we load their relationships one by one.

5.2. Setup Initial Users

Next – let’s load the users into memory and persist them:

public List<User> getUsers() {
    List<Role> allRoles = getRoles();
    List<User> users = csvDataLoader.loadObjectList(User.class, SetupData.USERS_FILE);
    List<long[]> usersRoles = 
      csvDataLoader.loadManyToManyRelationship(SetupData.USERS_ROLES_FILE);

    for (long[] userRole : usersRoles) {
        User user = findById(users, userRole[0]);
        Set<Role> roles = user.getRoles();
        if (roles == null) {
            roles = new HashSet<Role>();
        }
        roles.add(findById(allRoles, userRole[1]));
        user.setRoles(roles);
    }
    return users;
}

Next, let’s focus on persisting the users:

private void setupUsers() {
    List<User> users = setupData.getUsers();
    for (User user : users) {
        setupService.setupUser(user);
    }
}

And here is our SetupService:

@Transactional
public void setupUser(User user) {
    try {
        setupUserInternal(user);
    } catch (Exception e) {
        logger.error("Error occurred while saving user " + user.toString(), e);
    }
}

private void setupUserInternal(User user) {
    user.setPassword(passwordEncoder.encode(user.getPassword()));
    user.setPreference(createSimplePreference(user));
    userRepository.save(user);
}

And here is createSimplePreference() method:

private Preference createSimplePreference(User user) {
    Preference pref = new Preference();
    pref.setId(user.getId());
    pref.setTimezone(TimeZone.getDefault().getID());
    pref.setEmail(user.getUsername() + "@test.com");
    return preferenceRepository.save(pref);
}

Note how, before we save a user, we create a simple Preference entity for it and persist that first.

6. Test CSV Data Loader

Next, let’s perform a simple unit test on our CsvDataLoader:

We will test loading lists of Users, Roles and Privileges:

@Test
public void whenLoadingUsersFromCsvFile_thenLoaded() {
    List<User> users = csvDataLoader.
      loadObjectList(User.class, CsvDataLoader.USERS_FILE);
    assertFalse(users.isEmpty());
}

@Test
public void whenLoadingRolesFromCsvFile_thenLoaded() {
    List<Role> roles = csvDataLoader.
      loadObjectList(Role.class, CsvDataLoader.ROLES_FILE);
    assertFalse(roles.isEmpty());
}

@Test
public void whenLoadingPrivilegesFromCsvFile_thenLoaded() {
    List<Privilege> privileges = csvDataLoader.
      loadObjectList(Privilege.class, CsvDataLoader.PRIVILEGES_FILE);
    assertFalse(privileges.isEmpty());
}

Next, let’s test loading some Many to Many relationships via the data loader:

@Test
public void whenLoadingUsersRolesRelationFromCsvFile_thenLoaded() {
    List<long[]> usersRoles = csvDataLoader.
      loadManyToManyRelationship(CsvDataLoader.USERS_ROLES_FILE);
    assertFalse(usersRoles.isEmpty());
}

@Test
public void whenLoadingRolesPrivilegesRelationFromCsvFile_thenLoaded() {
    List<long[]> rolesPrivileges = csvDataLoader.
      loadManyToManyRelationship(CsvDataLoader.ROLES_PRIVILEGES_FILE);
    assertFalse(rolesPrivileges.isEmpty());
}

7. Test Setup Data

Finally, let’s perform a simple unit test on our bean SetupData:

@Test
public void whenGettingUsersFromCsvFile_thenCorrect() {
    List<User> users = setupData.getUsers();

    assertFalse(users.isEmpty());
    for (User user : users) {
        assertFalse(user.getRoles().isEmpty());
    }
}

@Test
public void whenGettingRolesFromCsvFile_thenCorrect() {
    List<Role> roles = setupData.getRoles();

    assertFalse(roles.isEmpty());
    for (Role role : roles) {
        assertFalse(role.getPrivileges().isEmpty());
    }
}

@Test
public void whenGettingPrivilegesFromCsvFile_thenCorrect() {
    List<Privilege> privileges = setupData.getPrivileges();
    assertFalse(privileges.isEmpty());
}

8. Conclusion

In this quick article we explored an alternative setup method for the initial data that usually needs to be loaded into a system on startup. This is of course just a simple Proof of Concept and a good base to build upon – not a production ready solution.

We’re also going to use this solution in the Reddit web application tracked by this ongoing case study.

Baeldung Weekly Review 32

$
0
0

I usually post about Dev stuff on Twitter - you can follow me there:

At the very beginning of last year, I decided to track my reading habits and share the best stuff here, on Baeldung. Haven’t missed a review since.

Here we go…

1. Spring and Java

>> Spring Framework 4.2 goes GA [spring]

I’m upgrading today. Literally. Nuff said.

>> Coming up in 2016: Spring Framework 4.3 & 5.0 [spring]

On top of the earlier 4.2 GA, here’s what the next steps in the Spring ecosystem look like.

>> Project Jigsaw is Really Coming in Java 9 [infoq]

A solid deep dive into Jigsaw and the upcoming Java 9.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> Apache Kafka, Samza, and the Unix Philosophy of Distributed Data [confluent]

A long but worthwhile read about a few things the Unix architecture can teach us.

>> Events Don’t Eliminate Dependencies [bozho]

This piece makes a good point – using an event just for the sake of decoupling can, in a lot of cases, do more harm than good.

That being said, events are certainly a powerful abstraction and, if the choice to use them is well founded and justified, can lead to an elegant implementation.

>> Amazon EC2 2015 Benchmark: Testing Speeds Between AWS EC2 and S3 Regions [takipi]

Interesting data tracking the speed of all EC2 regions this year compared to last year.

>> Microservices lessons from trenches [mehdi-khalili]

Some good take-aways about what’s good about a microservice architecture. And that’s not the hype.

Also worth reading:

3. Musings

>> Code Kata? How About Product Kata? and

>> Ship Something for Yourself, Even if you Only Earn A Dollar and

>> The Phoenix of My New Site from the Ashes of my Battle with CSS [daedtech]

“Learning when the pressure is off” removes a lot of the roadblocks people have when trying to pick up a new skill.

And learning “product” – yeah, I wish I started to wise up and get started a decade ago.

>> We’re heading Straight for AOL 2.0 [jacquesmattheij]

An enthralling and insightful read on protocols and protocol design.

A sentence that was said by no one ever.

Also worth reading:

4. Comics

And my favorite Dilberts of the week:

>> My PowerPoint Slides are in the Louvre

>> “Death Eater” Grey

>> Reverse my sense of right and wrong

5. Pick of the Week

Earlier this year I introduced the “Pick of the Week” section here in my “Weekly Review”. If you’re already on my email list – you got the pick already – hope you enjoyed it.

If not – you can share the review and unlock it right here:

Use Liquibase to Safely Evolve Your Database Schema

$
0
0

I usually post about Persistence on Twitter - you can follow me there:

1. Overview

In this quick tutorial, we’ll make use of Liquibase to evolve the database schema of a Java web application.

We’re going to focus on a general Java app first, and we’re also going to take a focused look at some interesting options available for Spring and Hibernate.

Very briefly, the core of using Liquibase is the changeLog file – an XML file that keeps track of all changes that need to run to update the DB.

Let’s start with the Maven dependency we need to add into our pom.xml:

<dependency>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-core</artifactId>
    <version>3.4.1</version>
</dependency>

You can also check if there’s a newer version of liquibase-core here.

2. The Database Change Log

Now, let’s take a look at a simple changeLog file – this one only adds a column “address” to the table “users“:

<databaseChangeLog 
  xmlns="http://www.liquibase.org/xml/ns/dbchangelog" 
  xmlns:ext="http://www.liquibase.org/xml/ns/dbchangelog-ext" 
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
  xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog-ext
   http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-ext.xsd 
   http://www.liquibase.org/xml/ns/dbchangelog 
   http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.4.xsd">
    
    <changeSet author="John" id="someUniqueId">
        <addColumn tableName="users">
            <column name="address" type="varchar(255)" />
        </addColumn>
    </changeSet>
    
</databaseChangeLog>

Note how the change set is identified by an id and an author – to make sure it can be uniquely identified and only applied once.

Let’s now see how to wire this into our application and make sure it runs when the application starts up.

3. Run Liquibase with a Spring Bean

Our first option to run the changes on application startup is via a Spring bean. There are of course many other ways, but if we’re dealing with a Spring application – this is a good, simple way to go:

@Bean
public SpringLiquibase liquibase() {
    SpringLiquibase liquibase = new SpringLiquibase();
    liquibase.setChangeLog("classpath:liquibase-changeLog.xml");
    liquibase.setDataSource(dataSource());
    return liquibase;
}

Note how we’re pointing it to a valid changeLog file that needs to exist on the classpath.
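The dataSource() referenced above is just a regular Spring DataSource bean; here’s a minimal sketch, reusing the connection details from the liquibase.properties shown later in the article:

@Bean
public DataSource dataSource() {
    DriverManagerDataSource dataSource = new DriverManagerDataSource();
    dataSource.setDriverClassName("com.mysql.jdbc.Driver");
    dataSource.setUrl("jdbc:mysql://localhost:3306/oauth_reddit");
    dataSource.setUsername("tutorialuser");
    dataSource.setPassword("tutorialmy5ql");
    return dataSource;
}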

4. Generate the changeLog with a Maven Plugin

Instead of writing the changeLog file manually – we can use the Liquibase Maven plugin to generate one and save ourselves a lot of work.

4.1. Plugin Configuration

Here are the changes to our pom.xml:

<dependency>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>3.4.1</version>
</dependency> 
...
<plugins>
    <plugin>
        <groupId>org.liquibase</groupId>
        <artifactId>liquibase-maven-plugin</artifactId>
        <version>3.4.1</version>
        <configuration>                  
            <propertyFile>src/main/resources/liquibase.properties</propertyFile>
        </configuration>                
    </plugin> 
</plugins>

4.2. Generate a changeLog from an existing Database

We can use the plugin to generate a changeLog from an existing database:

mvn liquibase:generateChangeLog

Here are the liquibase properties:

url=jdbc:mysql://localhost:3306/oauth_reddit
username=tutorialuser
password=tutorialmy5ql
driver=com.mysql.jdbc.Driver
outputChangeLogFile=src/main/resources/liquibase-outputChangeLog.xml

The end result is a changeLog file that we can use either to create the initial DB schema or to populate data. Here’s what that would look like for our example app:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<databaseChangeLog ...>
    
    <changeSet author="John (generated)" id="1439225004329-1">
        <createTable tableName="APP_USER">
            <column autoIncrement="true" name="id" type="BIGINT">
                <constraints primaryKey="true"/>
            </column>
            <column name="accessToken" type="VARCHAR(255)"/>
            <column name="needCaptcha" type="BIT(1)">
                <constraints nullable="false"/>
            </column>
            <column name="password" type="VARCHAR(255)"/>
            <column name="refreshToken" type="VARCHAR(255)"/>
            <column name="tokenExpiration" type="datetime"/>
            <column name="username" type="VARCHAR(255)">
                <constraints nullable="false"/>
            </column>
            <column name="preference_id" type="BIGINT"/>
            <column name="address" type="VARCHAR(255)"/>
        </createTable>
    </changeSet>
    ...
</databaseChangeLog>

4.3. Generate a changeLog from diff between two databases

We can use the plugin to generate a changeLog file from the differences between two existing databases (for example: development and production):

mvn liquibase:diff

Here are the properties:

changeLogFile=src/main/resources/liquibase-changeLog.xml
url=jdbc:mysql://localhost:3306/oauth_reddit
username=tutorialuser
password=tutorialmy5ql
driver=com.mysql.jdbc.Driver
referenceUrl=jdbc:h2:mem:oauth_reddit
diffChangeLogFile=src/main/resources/liquibase-diff-changeLog.xml
referenceDriver=org.h2.Driver
referenceUsername=sa
referencePassword=

And here’s a snippet of the generated changeLog:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<databaseChangeLog ...>
    <changeSet author="John" id="1439227853089-1">
        <dropColumn columnName="address" tableName="APP_USER"/>
    </changeSet>
</databaseChangeLog>

This is a super powerful way to evolve your DB – for example, by allowing Hibernate to auto-generate a new schema for development and then using that as a reference point against the old schema.

5. Use the Liquibase Hibernate Plugin

If the application uses Hibernate, there’s a very useful way of generating the changeLog – via the liquibase-hibernate plugin.

5.1. Plugin Configuration

First, let’s get the new plugin configured and using the right dependencies:

<plugins>
    <plugin>
        <groupId>org.liquibase</groupId>
        <artifactId>liquibase-maven-plugin</artifactId>
        <version>3.4.1</version>
        <configuration>                  
            <propertyFile>src/main/resources/liquibase.properties</propertyFile>
        </configuration> 
        <dependencies>
            <dependency>
                <groupId>org.liquibase.ext</groupId>
                <artifactId>liquibase-hibernate4</artifactId>
                <version>3.5</version>
            </dependency>
            <dependency>
                <groupId>org.springframework</groupId>
                <artifactId>spring-beans</artifactId>
                <version>4.1.7.RELEASE</version>
            </dependency>
            <dependency>
                <groupId>org.springframework.data</groupId>
                <artifactId>spring-data-jpa</artifactId>
                <version>1.7.3.RELEASE</version>
            </dependency>
        </dependencies>               
    </plugin> 
</plugins>

5.2. Generate a changeLog from diffs between a Database and Persistence Entities

Now, for the fun part. We can use this plugin to generate a changeLog file from the differences between an existing database (for example production) and our new persistence entities.

So – to make things simple – once an entity is modified, you can simply generate the changes against the old DB schema, getting a clean, powerful way to evolve your schema in production.

Here are the liquibase properties:

changeLogFile=classpath:liquibase-changeLog.xml
url=jdbc:mysql://localhost:3306/oauth_reddit
username=tutorialuser
password=tutorialmy5ql
driver=com.mysql.jdbc.Driver
referenceUrl=hibernate:spring:org.baeldung.persistence.model
  ?dialect=org.hibernate.dialect.MySQLDialect
diffChangeLogFile=src/main/resources/liquibase-diff-changeLog.xml

Note: The referenceUrl is using package scan, so the dialect parameter is required.
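With these properties in place, the diff against the mapped entities is generated with the same Maven goal as before:

mvn liquibase:diff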

6. Conclusion

In this tutorial we illustrated several ways to use Liquibase and get to a safe and mature way of evolving and refactoring the DB schema of a Java app.

The implementation of all these examples and code snippets can be found in my github project – this is an Eclipse-based project, so it should be easy to import and run as it is.

I usually post about Persistence on Twitter - you can follow me there:


A Guide to Queries in Spring Data MongoDB


I usually post about Persistence on Twitter - you can follow me there:

1. Overview

This article will focus on building out different types of queries in Spring Data MongoDB.

We’re going to be looking at querying documents with Query and Criteria classes, auto-generated query methods, JSON queries and QueryDSL.

2. Documents Query

One of the more common ways to query MongoDB with Spring Data is by making use of the Query and Criteria classes – which very closely mirror native operators.
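Before diving in, a quick note on setup – the mongoTemplate used in these examples is a plain MongoTemplate bean. Here’s a minimal sketch, where the host, port and database name are assumptions for this article:

@Bean
public MongoTemplate mongoTemplate() throws Exception {
    // connection details are assumptions for this sketch
    return new MongoTemplate(new MongoClient("localhost", 27017), "test");
}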

2.1. Is

This is simply a criterion using equality – let’s see how it works.

In the following example – we’re looking for users named Eric.

Let’s look at our database:

[
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581907"),
        "_class" : "org.baeldung.model.User",
        "name" : "Eric",
        "age" : 45
    },
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581908"),
        "_class" : "org.baeldung.model.User",
        "name" : "Antony",
        "age" : 55
    }
]

Now let’s look at query code:

Query query = new Query();
query.addCriteria(Criteria.where("name").is("Eric"));
List<User> users = mongoTemplate.find(query, User.class);

This logic returns, as expected:

{
    "_id" : ObjectId("55c0e5e5511f0a164a581907"),
    "_class" : "org.baeldung.model.User",
    "name" : "Eric",
    "age" : 45
}

2.2. Regex

A more flexible and powerful type of query is the regex. This creates a criterion using the MongoDB $regex operator and returns all records whose value for this field matches the given regexp.

It works similarly to the startingWith and endingWith operations – let’s look at an example.

We’re now looking for all users that have names starting with A.

Here’s the state of the database:

[
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581907"),
        "_class" : "org.baeldung.model.User",
        "name" : "Eric",
        "age" : 45
    },
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581908"),
        "_class" : "org.baeldung.model.User",
        "name" : "Antony",
        "age" : 33
    },
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581909"),
        "_class" : "org.baeldung.model.User",
        "name" : "Alice",
        "age" : 35
    }
]

Let’s now create the query:

Query query = new Query();
query.addCriteria(Criteria.where("name").regex("^A"));
List<User> users = mongoTemplate.find(query,User.class);

This runs and returns 2 records:

[
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581908"),
        "_class" : "org.baeldung.model.User",
        "name" : "Antony",
        "age" : 33
    },
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581909"),
        "_class" : "org.baeldung.model.User",
        "name" : "Alice",
        "age" : 35
    }
]

Here’s another quick example, this time looking for all users that have names ending with c:

Query query = new Query();
query.addCriteria(Criteria.where("name").regex("c$"));
List<User> users = mongoTemplate.find(query, User.class);

So the result will be:

{
    "_id" : ObjectId("55c0e5e5511f0a164a581907"),
    "_class" : "org.baeldung.model.User",
    "name" : "Eric",
    "age" : 45
}

2.3. Lt and gt

These operators create a criterion using the $lt (less than) and $gt (greater than) operators.

Let’s have a quick look at an example – we’re looking for all users with age between 20 and 50.

The database is:

[
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581907"),
        "_class" : "org.baeldung.model.User",
        "name" : "Eric",
        "age" : 45
    },
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581908"),
        "_class" : "org.baeldung.model.User",
        "name" : "Antony",
        "age" : 55
    }
]

This query code:

Query query = new Query();
query.addCriteria(Criteria.where("age").lt(50).gt(20));
List<User> users = mongoTemplate.find(query,User.class);

And the result – all users with an age greater than 20 and less than 50:

{
    "_id" : ObjectId("55c0e5e5511f0a164a581907"),
    "_class" : "org.baeldung.model.User",
    "name" : "Eric",
    "age" : 45
}

2.4. Sort

Sort is used to specify a sort order for the results.

The example below returns all users sorted by age in ascending order.

First – here’s the existing data:

[
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581907"),
        "_class" : "org.baeldung.model.User",
        "name" : "Eric",
        "age" : 45
    },
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581908"),
        "_class" : "org.baeldung.model.User",
        "name" : "Antony",
        "age" : 33
    },
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581909"),
        "_class" : "org.baeldung.model.User",
        "name" : "Alice",
        "age" : 35
    }
]

After executing sort:

Query query = new Query();
query.with(new Sort(Sort.Direction.ASC, "age"));
List<User> users = mongoTemplate.find(query,User.class);

And here’s the result of the query – nicely sorted by age:

[
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581908"),
        "_class" : "org.baeldung.model.User",
        "name" : "Antony",
        "age" : 33
    },
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581909"),
        "_class" : "org.baeldung.model.User",
        "name" : "Alice",
        "age" : 35
    },
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581907"),
        "_class" : "org.baeldung.model.User",
        "name" : "Eric",
        "age" : 45
    }
]

2.5. Pageable

Let’s look at a quick example using pagination.

Here’s the state of the database:

[
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581907"),
        "_class" : "org.baeldung.model.User",
        "name" : "Eric",
        "age" : 45
    },
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581908"),
        "_class" : "org.baeldung.model.User",
        "name" : "Antony",
        "age" : 33
    },
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581909"),
        "_class" : "org.baeldung.model.User",
        "name" : "Alice",
        "age" : 35
    }
]

Now, the query logic, simply asking for a page of size 2:

final Pageable pageableRequest = new PageRequest(0, 2);
Query query = new Query();
query.with(pageableRequest);
List<User> users = mongoTemplate.find(query, User.class);

And the result – the 2 documents, as expected:

[
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581907"),
        "_class" : "org.baeldung.model.User",
        "name" : "Eric",
        "age" : 45
    },
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581908"),
        "_class" : "org.baeldung.model.User",
        "name" : "Antony",
        "age" : 33
    }
]

To explore the full details of this API, here is the documentation for the Query and Criteria classes.

3. Generated Query Methods

Let’s now explore the more common type of query that Spring Data usually provides – auto-generated queries out of method names.

The only thing we need to do to leverage these kinds of queries is to declare the method on the repository interface:

public interface UserRepository 
  extends MongoRepository<User, String>, QueryDslPredicateExecutor<User> {
    ...
}

3.1. FindByX

We’ll start simple, by exploring the findBy type of query – in this case, find by name:

List<User> findByName(String name);

As in the previous section – 2.1 – this query will produce the same results, finding all users with the given name:

List<User> users = userRepository.findByName("Eric");

3.2. StartingWith and EndingWith

In 2.2, we explored a regex-based query. StartingWith and EndingWith are of course less powerful, but nevertheless quite useful – especially since we don’t have to actually implement them.

Here’s a quick example of how the operations would look like:

List<User> findByNameStartingWith(String regexp);
List<User> findByNameEndingWith(String regexp);

The example of actually using this would of course be very simple:

List<User> users = userRepository.findByNameStartingWith("A");
List<User> users = userRepository.findByNameEndingWith("c");

And the results are exactly the same.

3.3. Between

Similar to 2.3, this will return all users with age between ageGT and ageLT:

List<User> findByAgeBetween(int ageGT, int ageLT);

Calling the method will result in exactly the same documents being found:

List<User> users = userRepository.findByAgeBetween(20, 50);

3.4. Like and OrderBy

Let’s have a look at a more advanced example this time – combining two types of modifiers for the generated query.

We’re going to be looking for all users that have names containing the letter A and we’re also going to order the results by age, in ascending order:

List<User> users = userRepository.findByNameLikeOrderByAgeAsc("A");
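For this call to work, the repository interface needs a matching derived-query declaration – a sketch following the same naming convention as the earlier methods:

List<User> findByNameLikeOrderByAgeAsc(String name);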

For the database we used in 2.4 – the result will be:

[
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581908"),
        "_class" : "org.baeldung.model.User",
        "name" : "Antony",
        "age" : 33
    },
    {
        "_id" : ObjectId("55c0e5e5511f0a164a581909"),
        "_class" : "org.baeldung.model.User",
        "name" : "Alice",
        "age" : 35
    }
]

4. JSON Query Methods

If we can’t represent a query with the help of a method name or criteria, we can do something more low-level – use the @Query annotation.

With this annotation, we can specify a raw query – as a Mongo JSON query string.

4.1. FindBy

Let’s start simple and look at how we would represent a find by type of method first:

@Query("{ 'name' : ?0 }")
List<User> findUsersByName(String name);

This method should return users by name – the placeholder ?0 references the first parameter of the method.

List<User> users = userRepository.findUsersByName("Eric");

4.2. $regex

Let’s also look at a regex driven query – which of course produces the same result as in 2.2 and 3.2:

@Query("{ 'name' : { $regex: ?0 } }")
List<User> findUsersByRegexpName(String regexp);

The usage is also exactly the same:

List<User> users = userRepository.findUsersByRegexpName("^A");
List<User> users = userRepository.findUsersByRegexpName("c$");

4.3. $lt and $gt

Let’s now implement the lt and gt query:

@Query("{ 'age' : { $gt: ?0, $lt: ?1 } }")
List<User> findUsersByAgeBetween(int ageGT, int ageLT);

Note how, now that the method has two parameters, we’re referencing each of these by index in the raw query: ?0 and ?1.

List<User> users = userRepository.findUsersByAgeBetween(20, 50);

5. QueryDSL Queries

MongoRepository has good support for the QueryDSL project – so we can leverage that nice, type-safe API here as well.

5.1. The Maven Dependencies

First, let’s make sure we have the correct Maven dependencies defined in the pom:

<dependency>
    <groupId>com.mysema.querydsl</groupId>
    <artifactId>querydsl-mongodb</artifactId>
    <version>3.6.6</version>
</dependency>
<dependency>
    <groupId>com.mysema.querydsl</groupId>
    <artifactId>querydsl-apt</artifactId>
    <version>3.6.6</version>
</dependency>

5.2. Q-classes

QueryDSL uses Q-classes for creating queries. But since we don’t really want to create these by hand, we need to generate them somehow.

We’re going to use the apt-maven-plugin to do that:

<plugin>    
    <groupId>com.mysema.maven</groupId>
    <artifactId>apt-maven-plugin</artifactId>
    <version>1.1.3</version>
    <executions>
        <execution>
            <goals>
                <goal>process</goal>
            </goals>
            <configuration>
                <outputDirectory>target/generated-sources/java</outputDirectory>
                <processor>
                  org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor
                </processor>
            </configuration>
        </execution>
     </executions>
</plugin>

Let’s look at the User class – focusing specifically on the @QueryEntity annotation:

@QueryEntity 
@Document
public class User {
 
    @Id
    private String id;
    private String name;
    private Integer age;
 
    // standard getters and setters
}

After running the process goal of the Maven lifecycle (or any other goal after that one) – the apt plugin will generate the new classes under target/generated-sources/java/{your package structure}:

/**
 * QUser is a Querydsl query type for User
 */
@Generated("com.mysema.query.codegen.EntitySerializer")
public class QUser extends EntityPathBase<User> {

    private static final long serialVersionUID = ...;

    public static final QUser user = new QUser("user");

    public final NumberPath<Integer> age = createNumber("age", Integer.class);

    public final StringPath id = createString("id");

    public final StringPath name = createString("name");

    public QUser(String variable) {
        super(User.class, forVariable(variable));
    }

    public QUser(Path<? extends User> path) {
        super(path.getType(), path.getMetadata());
    }

    public QUser(PathMetadata<?> metadata) {
        super(User.class, metadata);
    }
}

It’s with the help of this class that we’re going to be creating our queries.

As a side note – if you’re using Eclipse, introducing this plugin will generate the following warning in the pom:

You need to run build with JDK or have tools.jar on the classpath. If this occurs during eclipse build make sure you run eclipse under JDK as well (com.mysema.maven:apt-maven-plugin:1.1.3:process:default:generate-sources

Maven install works fine and the QUser class is generated, but the plugin is highlighted in the pom.

A quick fix is to manually point to the JDK in eclipse.ini:

...
-vm
{path_to_jdk}\jdk{your_version}\bin\javaw.exe

5.3. The New Repository

Now we need to actually enable QueryDSL support in our repositories – which is done by simply extending the QueryDslPredicateExecutor interface:

public interface UserRepository extends 
  MongoRepository<User, String>, QueryDslPredicateExecutor<User> {
    ...
}

5.4. Eq

With support enabled, let’s now implement the same queries as the ones we illustrated before.

We’ll start with simple equality:

QUser qUser = new QUser("user");
Predicate predicate = qUser.name.eq("Eric");
List<User> users = (List<User>) userRepository.findAll(predicate);

5.5. StartingWith and EndingWith

Similarly, let’s implement the previous queries – and find users with names starting with A:

QUser qUser = new QUser("user");
Predicate predicate = qUser.name.startsWith("A");
List<User> users = (List<User>) userRepository.findAll(predicate);

And ending with c:

QUser qUser = new QUser("user");
Predicate predicate = qUser.name.endsWith("c");
List<User> users = (List<User>) userRepository.findAll(predicate);

The result is the same as in 2.2, 3.2 and 4.2.

5.6. Between

The next query will return users with age between 20 and 50 – similar to the previous sections:

QUser qUser = new QUser("user");
Predicate predicate = qUser.age.between(20, 50);
List<User> users = (List<User>) userRepository.findAll(predicate);
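As a quick side note – Querydsl predicates compose nicely. Here’s a sketch (not one of the original examples) combining two of the previous criteria into a single query:

QUser qUser = new QUser("user");
Predicate predicate = qUser.name.startsWith("A").and(qUser.age.between(20, 50));
List<User> users = (List<User>) userRepository.findAll(predicate);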

6. Conclusion

In this article we explored the many ways we can query using Spring Data MongoDB.

It’s interesting to take a step back and see just how many powerful ways we have to query MongoDB – varying from limited control all the way to full control with raw queries.

The implementation of all these examples and code snippets can be found in my github project – this is an Eclipse-based project, so it should be easy to import and run as it is.

I usually post about Persistence on Twitter - you can follow me there:


Baeldung Weekly Review 33


I usually post about Dev stuff on Twitter - you can follow me there:

At the very beginning of last year, I decided to track my reading habits and share the best stuff here, on Baeldung. Haven’t missed a review since.

Here we go…

1. Spring and Java

>> Java SE 8 Optional, a pragmatic approach [joda]

Some solid, to the point tips on using Optional in new code.

>> Performance Guru Kirk Pepperdine Reflects on Results of RebelLabs’ Performance Survey [infoq]

Chock full of goodness in terms of mindset and approach to tuning and profiling a Java application – or any kind of application for that matter. Also things like this:

“Think of profilers as the blind men examining an elephant”. So there you go.

>> What’s the Fastest Garbage Collector in Java 8 for Heavy Calculations? [voxxed]

Some very interesting numbers to inform your decision of picking a GC algorithm.

>> Your Maven build is slow. Speed it up! [zeroturnaround]

Quick feedback is essential for a good dev flow, and a slow build never did anyone any good. Have a read and speed up your Maven build.

>> What REPL means for Java [javaworld]

I like the new REPL already.

>> How to persist LocalDate and LocalDateTime with JPA [thoughts-on-java]

This is definitely handy, especially if you’re tired of working with the Date class, which is showing its age.

>> Spicing up Java projects with Clojure [jstaffans]

I really like this – I might give it a go over the weekend.

Also worth reading:

Webinars and presentations:

Time to upgrade:

2. Technical

>> How Docker changed me [odino]

A quick, practical but very interesting read about the way to use containers.

>> Half-done Versus Fake: The Intermediate Result/Mock Tradeoff [facebook]

Very interesting piece on how testing – and more specifically mocking – affects code style, what’s worth doing and what’s too much. A bit of a terse read.

>> Getting a Card’s Info From Trello [daedtech]

This is a cool quick intro to consuming an API. No muss, no fuss.

Also worth reading:

3. Musings

>> Use These 7 Tricks to Win Meetings and Get Promoted [daedtech]

Decent giggle to take-away ratio. Worth reading to get you in a good mood. Plus, it’s link-bait, so how can you not click?

>> We’re struggling to get traction with SSL because it’s still a “premium service” [troyhunt]

A fantastic read about the state of SSL in blog-land. It is admittedly a bit on the meta side, but it’s of course a topic that hits close to home with me as a site owner.

Also worth reading:

4. Comics

And my favorite Dilberts of the week:

>> I’m developing an app in my spare time

>> Plans to bestshore

>> International Billing Errors

5. Pick of the Week

Earlier this year I introduced the “Pick of the Week” section here in my “Weekly Review”. If you’re already on my email list – you got the pick already – hope you enjoyed it.

If not – you can share the review and unlock it right here:
