Saturday, 24 December 2016

A quick peek into Spring Boot

Spring Boot is a framework that simplifies the bootstrapping and development of new Spring applications, resulting in rapid application development. Spring Boot lets developers add out-of-the-box functionality in an opinionated manner, and it is a perfect tool for building cloud-based microservices as well as RESTful web services.

Basic configuration required for a Spring Boot application:

1. POM

Adding out-of-the-box functionality in an opinionated manner is enabled through POM entries.
1.1. The first thing we have to ensure is that we add the starter parent. This ensures that the project gets sensible defaults, resource filtering and plugin configuration.
1.2. We add a dependency for any of the out-of-the-box functionality we intend to use, for example a JPA dependency if we intend to use JPA.
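A minimal sketch of these POM entries, assuming Spring Boot 1.4.x and the web, JPA and H2 starters (the version and the exact set of starters are assumptions):

```xml
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>1.4.2.RELEASE</version>
</parent>

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-jpa</artifactId>
    </dependency>
    <!-- embedded database, auto-configured by Spring Boot -->
    <dependency>
        <groupId>com.h2database</groupId>
        <artifactId>h2</artifactId>
        <scope>runtime</scope>
    </dependency>
</dependencies>
```

Note that the starter dependencies need no version of their own; the starter parent manages them.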



2. Application class

This is the entry point for a Spring Boot application.
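A minimal application class might look like the sketch below (the class name Application is an assumption):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class Application {

    public static void main(String[] args) {
        // Bootstraps the application: creates the ApplicationContext,
        // starts the embedded container and runs any CommandLineRunner beans
        SpringApplication.run(Application.class, args);
    }
}
```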



Note: 


A.  The @SpringBootApplication annotation is equivalent to using @Configuration, @EnableAutoConfiguration and @ComponentScan with their default attributes.

B.  The SpringApplication class creates an ApplicationContext instance, registers a CommandLinePropertySource to expose command-line arguments as Spring properties, loads all singleton beans and triggers any CommandLineRunner beans.

C.  run() is a static method of the SpringApplication class which is used for bootstrapping our application.


Spring Boot database: 

Spring Boot can auto-configure embedded H2, HSQL and Derby databases. We don't need to provide any connection URLs; we simply include a build dependency for the embedded database we want to use. As seen in the first screenshot, we have added the H2 dependency in the POM.

One of the important things that we may like to configure while working with any database is the ability to see its state. As H2 is embedded into the Spring Boot application, we need some configuration to view the state of the database.
The following Spring configuration declares the servlet wrapper for the H2 database console and maps it to the path /console.
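A sketch of that configuration (the configuration class name is an assumption; in Spring Boot 1.4 ServletRegistrationBean lives in org.springframework.boot.web.servlet):

```java
import org.h2.server.web.WebServlet;
import org.springframework.boot.web.servlet.ServletRegistrationBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class WebConfiguration {

    // Wraps the H2 console servlet and exposes it at /console
    @Bean
    ServletRegistrationBean h2servletRegistration() {
        ServletRegistrationBean registration =
                new ServletRegistrationBean(new WebServlet());
        registration.addUrlMappings("/console/*");
        return registration;
    }
}
```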



Once we deploy the application, we can view the database state at the following URL.

To log in to the database, use the following connection URL and credentials (no password is required).



The database can be queried once we log in, as shown below.
Note: H2 is an in-memory database, so every time we deploy the application the database is recreated.


Spring Boot Resources

Sometimes we may want to use resources with a Spring Boot application, such as properties files or SQL files to initialize the database. This can be done easily by adding the files to the resource path; nothing more needs to be done.


Creating a web service using Spring Boot

Controller

To create a RESTful web service we need a controller class as shown below.
The @RestController annotation indicates that this is a RESTful web service controller, and we use @RequestMapping to map URL paths.
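A sketch of such a controller (the class name, URL paths and service-method names are assumptions):

```java
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/students")
public class StudentController {

    @Autowired
    private StudentService studentService;

    // GET /students -> all students
    @RequestMapping
    public List<Student> getAllStudents() {
        return studentService.findAll();
    }

    // GET /students/{id} -> a single student
    @RequestMapping("/{id}")
    public Student getStudent(@PathVariable Long id) {
        return studentService.findById(id);
    }
}
```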



Service Layer: 

This is a normal service layer, indicated by the stereotype @Service.
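A sketch, assuming the service simply delegates to the repository from the DAO layer (class and method names are illustrative):

```java
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class StudentService {

    @Autowired
    private StudentRepository studentRepository;

    public List<Student> findAll() {
        return studentRepository.findAll();
    }

    public Student findById(Long id) {
        return studentRepository.findById(id);
    }
}
```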




Dao Layer:

We can extend our interface from the marker interface org.springframework.data.repository.Repository, which captures the domain type to manage as well as the domain type's ID type. Its general purpose is to hold type information and to allow interfaces that extend it to be discovered during classpath scanning for easy Spring bean creation.
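A sketch of such a repository interface (the entity and method names are assumptions; Spring Data derives the queries from the method names at runtime):

```java
import java.util.List;
import org.springframework.data.repository.Repository;

// Marker interface: only the methods declared here are exposed
public interface StudentRepository extends Repository<Student, Long> {

    List<Student> findAll();

    Student findById(Long id);
}
```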



Entity Class

We create an entity class, but we have to ensure that there is a no-argument constructor if we add any other constructor.
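A sketch of the entity (the fields are illustrative):

```java
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

@Entity
public class Student {

    @Id
    @GeneratedValue
    private Long id;

    private String name;

    // JPA requires a no-argument constructor once another constructor exists
    public Student() {
    }

    public Student(String name) {
        this.name = name;
    }

    // getters and setters omitted for brevity
}
```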



Deploying and running Spring Boot application


Once we package the Spring Boot application we can run it with the java -jar command.


The source code for the sample project can be found at the link below.




Monday, 12 September 2016

Database migration with Liquibase

Liquibase is an open-source, database-independent library for managing database migrations. Some of the important concepts that we need to understand before we can use Liquibase are as follows.

1. DatabaseChangeLog: This is the file where we specify all the changes that should go into the database. The file can be in XML, SQL, YAML or JSON format.

2. ChangeSet: This defines an individual change or a group of changes that go into the database together. A changeSet is identified by an id and an author.

3. PreCondition: As the name implies, this ensures that the database is in a particular state before a changeSet is executed.

4. Context: This ensures that a changeSet runs only in the specified environment.

Basic structure of a DatabaseChangeLog
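A sketch of a DatabaseChangeLog in XML form, showing the concepts above (table and author names are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.4.xsd">

    <!-- A changeSet is identified by its id and author -->
    <changeSet id="1" author="manas">
        <!-- PreCondition: skip this changeSet if the table already exists -->
        <preConditions onFail="MARK_RAN">
            <not><tableExists tableName="student"/></not>
        </preConditions>
        <createTable tableName="student">
            <column name="id" type="bigint" autoIncrement="true">
                <constraints primaryKey="true" nullable="false"/>
            </column>
            <column name="name" type="varchar(255)"/>
        </createTable>
    </changeSet>
</databaseChangeLog>
```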



Configuring Liquibase in the POM file.

Liquibase can be configured as a plugin in the POM file.
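A sketch of the plugin configuration (the driver, URL, credentials and file paths are placeholders for your own environment):

```xml
<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>3.4.2</version>
    <configuration>
        <!-- the master DatabaseChangeLog applied by liquibase:update -->
        <changeLogFile>src/main/resources/db/changelog/db.changelog-master.xml</changeLogFile>
        <driver>com.mysql.jdbc.Driver</driver>
        <url>jdbc:mysql://localhost:3306/blogdb</url>
        <username>root</username>
        <password>password</password>
        <!-- liquibase:diff writes its output here -->
        <diffChangeLogFile>src/main/resources/db/changelog/changelog.xml</diffChangeLogFile>
    </configuration>
</plugin>
```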



Steps for Database Migration:

Use case 1: This is the initial state of the database.

-  The database is created, but no database objects have been created in it yet.

-  We have created the initial DatabaseChangeLog XML in our project. No changeSet other than the default has yet been introduced in the DatabaseChangeLog.

1. Run mvn liquibase:diff
This will create a changelog.xml file in the location specified in the POM.
As there is no data in the database yet, there will be no entries in this file.

2. Run mvn liquibase:updateSQL
This will create a SQL file containing the queries required to bring the database to the current version as specified in the DatabaseChangeLog.
The initial run includes some default tables that Liquibase itself needs, such as the DATABASECHANGELOG and DATABASECHANGELOGLOCK tables.
The file will be created in the path we specify in the POM.
Although this SQL file is not itself used to update the database, we can use it to verify what changes will actually be applied when we run liquibase:update.



3. Run  mvn liquibase:tag -Dliquibase.tag=initial_state
We will need to run this to create a rollback checkpoint in case we need to revert our changes.

4. Run mvn liquibase:update
This applies the DatabaseChangeLog to the database. Once this runs successfully, you should be able to see the updated database.

5. Run mvn liquibase:diff
This will create a new changelog.xml file containing the database changes made by the previous update.



Use case 2: We add a changeSet to the DatabaseChangeLog file

1. Update the DatabaseChangeLog to add the new changeSet
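A sketch of such a changeSet, assuming we are adding the address table that the walkthrough verifies later (column names are illustrative):

```xml
<changeSet id="2" author="manas">
    <createTable tableName="address">
        <column name="id" type="bigint" autoIncrement="true">
            <constraints primaryKey="true" nullable="false"/>
        </column>
        <column name="city" type="varchar(255)"/>
        <column name="state" type="varchar(255)"/>
    </createTable>
    <!-- explicit rollback so liquibase:rollback can undo this changeSet -->
    <rollback>
        <dropTable tableName="address"/>
    </rollback>
</changeSet>
```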



2. Update the POM to add a new version number. [Not mandatory, but recommended]


3. Run mvn liquibase:diff
This will create a changelog.xml file and help us keep a benchmark before we update the database.

4. Run mvn liquibase:updateSQL

We will get the below file for the changeSet we have introduced.



5. Run mvn liquibase:tag -Dliquibase.tag=checkpoint_release_version_1
This will create a rollback checkpoint for this release



6. Run mvn liquibase:update
This will update the database with the new changes introduced in the changeSet. We can see that the address table is now present in the database.


Use case 3: Rollback to a pre-existing state

Existing database state: two checkpoints defined.

1. Run mvn liquibase:rollback -Dliquibase.rollbackTag=checkpoint_release_version_2



Friday, 15 April 2016

JSON parsing with Java

JavaScript Object Notation (JSON) is a syntax for storing and exchanging data. JSON is very lightweight, which is why it has become an alternative to XML.

But to use it in a Java program we have to parse the JSON file and convert it into object form. This blog demonstrates a couple of ways to convert a JSON file to a Java object.

Let us assume the JSON file below.


{
    "id" : 1,
    "name" : "Manas",
    "address" : {
      "city" : "Pune",
      "state" : "Maharastra",
      "country" : "India"
    
    }
  }




The corresponding Java entities are:


1. Student

public class Student {
    private Long id;
    private String name;
    private Address address;

    // getter/setter methods
}
 
2. Address

public class Address {
    private String city;
    private String state;
    private String country;

    // getter/setter methods
}




Below are two ways of parsing a JSON document into a Java object.


1. Using json-simple


This is a more traditional way of converting JSON to a Java object: we manually map each attribute of the JSON to the corresponding Java instance variable.


Characteristics of this approach are:

 

1. If we add, delete or modify any attribute in the JSON document, we have to make a corresponding modification to the JSON parsing method to reflect the changes.
2. The JSON attributes and the instance variables of the Java entity do not need to have the same names, as the mapping is done explicitly.
3. The JSON document does not need to match the structure of the Java entity exactly, since we only read the attributes we map.

Steps


1. Add the below dependency to the pom.xml:
<dependency>
    <groupId>com.googlecode.json-simple</groupId>
    <artifactId>json-simple</artifactId>
    <version>1.1</version>
</dependency>


2. The parsing method will look as shown below.
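A sketch of such a parsing method against the Student/Address entities above (the class name and file path are illustrative; json-simple returns integral numbers as Long):

```java
import java.io.FileReader;
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;

public class JsonSimpleParserDemo {

    public static Student parseStudent(String path) throws Exception {
        JSONParser parser = new JSONParser();
        JSONObject json = (JSONObject) parser.parse(new FileReader(path));

        // each attribute is mapped by hand
        Student student = new Student();
        student.setId((Long) json.get("id"));
        student.setName((String) json.get("name"));

        JSONObject addressJson = (JSONObject) json.get("address");
        Address address = new Address();
        address.setCity((String) addressJson.get("city"));
        address.setState((String) addressJson.get("state"));
        address.setCountry((String) addressJson.get("country"));
        student.setAddress(address);

        return student;
    }
}
```

This assumes the entities expose the usual getter/setter methods mentioned earlier.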




2. Using FasterXML Jackson


This has become the more standard way of parsing JSON files.

Characteristics of this approach:

 

1. We do not have to worry if any field is added to or removed from the JSON document; the API takes care of the mapping. Hence this approach is maintainable and scalable.
2. The names of the attributes in the JSON document and the instance variables of the Java entity should be the same.
3. The JSON should be well formed.

Steps

 

1. Add the below dependencies to the pom.xml:
<!-- Dependencies for Jackson -->
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>2.7.3</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.7.3</version>
</dependency>


2. The parsing method will look as shown below
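A sketch of the Jackson version (the class name and file path are illustrative):

```java
import java.io.File;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JacksonParserDemo {

    public static Student parseStudent(String path) throws Exception {
        // Jackson maps JSON attributes to the entity fields by name,
        // including the nested address object
        ObjectMapper mapper = new ObjectMapper();
        return mapper.readValue(new File(path), Student.class);
    }
}
```

One line of mapping replaces the attribute-by-attribute code of the json-simple approach.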




We can configure the ObjectMapper to have more control over deserialization using the on/off properties in DeserializationFeature.
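For example (a fragment; the two feature flags shown are illustrative choices):

```java
ObjectMapper mapper = new ObjectMapper();
// Do not fail when the JSON contains attributes the entity does not declare
mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
// Fail when a JSON null is bound to a primitive field
mapper.configure(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES, true);
Student student = mapper.readValue(new File("student.json"), Student.class);
```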

The complete project can be found in my GitHub repository: https://github.com/manaspratimdas/blog


