Java 8 Lambda Expression for Design Patterns – Strategy Design Pattern

The Strategy pattern defines a family of algorithms, encapsulated behind a driver class usually known as the Context, and makes the algorithms interchangeable. It provides a mechanism to choose the appropriate algorithm at a particular time.

The algorithms (strategies) are chosen at runtime either by a Client or by the Context. The Context class handles all the data during the interaction with the client.

The key participants of the Strategy pattern are represented below:

[Figure: Strategy pattern – key participants]
  • Strategy – Specifies the interface for all algorithms. This interface is used to invoke the algorithms defined by a ConcreteStrategy.
  • Context – Maintains a reference to a Strategy object.
  • ConcreteStrategy – Actual implementation of an algorithm as per the Strategy interface.

Now let’s look at a concrete example of the Strategy pattern and see how it gets transformed with lambda expressions. Suppose we have different types of rates for calculating income tax. Based on whether tax is paid in advance or late, there is a rebate or a penalty, respectively. We could encapsulate this functionality in a single class as different methods, but then the class would need to be modified whenever some other tax calculation is required in the future. This is not an efficient approach; changing the implementation of a class should be the last resort.

Let’s take a better approach by using the Strategy pattern. We will define a TaxStrategy interface with a basic method:

public interface TaxStrategy {

	public double calculateTax(double income);
}
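
Since TaxStrategy declares exactly one abstract method, it already qualifies as a functional interface. As an optional refinement (not part of the original listing), the intent can be made explicit with the @FunctionalInterface annotation, which makes the compiler reject any accidental second abstract method:

@FunctionalInterface
public interface TaxStrategy {

	//Single abstract method; the annotation ensures it stays the only one
	double calculateTax(double income);
}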

Now let’s define the concrete strategy for normal income tax.

public class PersonalTaxStrategy implements TaxStrategy {

	public PersonalTaxStrategy() { }

	@Override
	public double calculateTax(double income) {

		System.out.println("PersonalTax");

		double tax = income * 0.3;
		return tax;
	}
}

The PersonalTaxStrategy class conforms to the TaxStrategy interface. Similarly, let’s define a concrete strategy for late tax payment which incurs a penalty.

public class PersonalTaxPenaltyStrategy implements TaxStrategy {

	public PersonalTaxPenaltyStrategy() { }

	@Override
	public double calculateTax(double income) {

		System.out.println("PersonalTaxWithPenalty");

		double tax = income * 0.4;
		return tax;
	}
}

Next, let’s define a concrete strategy for advance tax payment, which results in a tax rebate.

public class PersonalTaxRebateStrategy implements TaxStrategy {

	public PersonalTaxRebateStrategy() { }

	@Override
	public double calculateTax(double income) {

		System.out.println("PersonalTaxWithRebate");

		double tax = income * 0.2;
		return tax;
	}
}

Now let’s combine all the classes and interfaces defined so far to leverage the power of the Strategy pattern. Let the main method act as the Context for the different strategies. Here is just one sample interplay of all these classes:

import java.util.Arrays;
import java.util.List;

public class TaxStrategyMain {

	public static void main(String [] args) {

		//Create a List of Tax strategies for different scenarios
		List<TaxStrategy> taxStrategyList =
				Arrays.asList(
						new PersonalTaxStrategy(),
						new PersonalTaxPenaltyStrategy(),
						new PersonalTaxRebateStrategy());

		//Calculate Tax for different scenarios with corresponding strategies
		for (TaxStrategy taxStrategy : taxStrategyList) {
			System.out.println(taxStrategy.calculateTax(30000.0));
		}
	}
}

Running this gives the following output:

PersonalTax
9000.0
PersonalTaxWithPenalty
12000.0
PersonalTaxWithRebate
6000.0

 

This clearly demonstrates how different tax rates can be calculated by using the appropriate concrete strategy class. I have combined all the concrete strategies (algorithms) in a list and then accessed them by iterating over the list.
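
For completeness, the Context participant can also be modeled as a dedicated class rather than the main method. The sketch below is my own addition (the name TaxContext does not appear in the original code); it simply holds a TaxStrategy reference and delegates to it:

public class TaxContext {

	private TaxStrategy taxStrategy;

	public TaxContext(TaxStrategy taxStrategy) {
		this.taxStrategy = taxStrategy;
	}

	//The strategy can be swapped at runtime
	public void setTaxStrategy(TaxStrategy taxStrategy) {
		this.taxStrategy = taxStrategy;
	}

	//Delegates the calculation to the currently configured strategy
	public double calculateTax(double income) {
		return taxStrategy.calculateTax(income);
	}
}

A client would then create a TaxContext with, say, new PersonalTaxStrategy() and call calculateTax(30000.0) on the context instead of on the strategy directly.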

What we have seen till now is just the standard Strategy pattern, and it has been around for a long time. In these times, when functional programming is the new buzzword, one may wonder whether, with the support of lambda expressions in Java, things can be done differently. Indeed, since TaxStrategy has a single abstract method, it is effectively a functional interface, and we can rewrite the example using lambda expressions. Let’s see what the code looks like:

import java.util.Arrays;
import java.util.List;

public class TaxStrategyMainWithLambda {

	public static void main(String [] args) {

		//Create a List of Tax strategies for different scenarios with inline logic using Lambda
		List<TaxStrategy> taxStrategyList =
				Arrays.asList(
						(income) -> { System.out.println("PersonalTax"); return 0.30 * income; },
						(income) -> { System.out.println("PersonalTaxWithPenalty"); return 0.40 * income; },
						(income) -> { System.out.println("PersonalTaxWithRebate"); return 0.20 * income; }
			);

		//Calculate Tax for different scenarios with corresponding strategies
		taxStrategyList.forEach((strategy) -> System.out.println(strategy.calculateTax(30000.0)));
	}
}

Running this gives the same output:

PersonalTax
9000.0
PersonalTaxWithPenalty
12000.0
PersonalTaxWithRebate
6000.0

 

We can see that the use of lambda expressions makes the separate classes for the concrete strategies redundant. You don’t need additional classes; simply specify the required behavior using a lambda expression.
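
If a particular lambda-based strategy is needed in more than one place, it can simply be assigned to a named variable and reused, just like a concrete class. A small sketch using the same TaxStrategy interface:

//Named lambda strategies can be reused just like the concrete classes
TaxStrategy personalTax = income -> 0.30 * income;
TaxStrategy personalTaxWithPenalty = income -> 0.40 * income;

System.out.println(personalTax.calculateTax(30000.0));            //prints 9000.0
System.out.println(personalTaxWithPenalty.calculateTax(30000.0)); //prints 12000.0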

All the code snippets can be accessed from my GitHub repo.


2015 in review

The WordPress.com stats helper monkeys prepared a 2015 annual report for this blog.

Here's an excerpt:

A San Francisco cable car holds 60 people. This blog was viewed about 1,700 times in 2015. If it were a cable car, it would take about 28 trips to carry that many people.

Click here to see the complete report.


Java 8 Lambda Expression for Design Patterns – Decorator Design Pattern

The Decorator pattern (also known as Wrapper) allows behavior to be added to an individual object, either statically or dynamically, without affecting the behavior of other objects from the same class. It can be considered an alternative to subclassing. We know that subclassing adds behavior at compile time, and the change affects all instances of the original class. Decorating, on the other hand, can provide new behavior at run time for selected objects.

The decorator conforms to the interface of the component it decorates so that it is transparent to the component’s clients. The decorator forwards requests to the component and may perform additional actions before or after forwarding. Transparency allows decorators to be nested recursively, thereby allowing an unlimited number of added responsibilities. The key participants of the Decorator pattern are represented below:

[Figure: Decorator pattern – key participants]

  • Component – Specifies the interface for objects that can have responsibilities added to them dynamically.
  • ConcreteComponent – Defines an object to which additional responsibilities can be added.
  • Decorator – Keeps a reference to a Component object and conforms to Component’s interface. It contains the Component object to be decorated.
  • ConcreteDecorator – Adds responsibility to the component.

Now let’s look at a concrete example of the Decorator pattern and see how it gets transformed with lambda expressions. Suppose we have different types of books, and these books may come in different covers or belong to different categories. We could choose any book and specify its category or cover by using inheritance: Book can be abstracted as a class, and any other class can extend the Book class and override the methods for cover or category. But this is not an efficient approach. Under this approach, the subclasses may end up with unnecessary methods inherited from the superclass, and if we had to add more attributes or characteristics, the parent class would have to change. Changes in the implementation of a class should be the last resort.

Let’s take a better approach by using the Decorator pattern. We will make an interface for Book with a basic method:

public interface Book {

    public String describe();

}

A BasicBook class can implement this interface to provide minimal abstraction:

public class BasicBook implements Book {

    @Override
    public String describe() {

        return "Book";

    }

}

Next, let’s define the abstract class BookDecorator which will act as the Decorator:

abstract class BookDecorator implements Book {

    protected Book book;

    BookDecorator(Book book) {
        this.book = book;
    }

    @Override
    public String describe() {
        return book.describe();
    }
}

The BookDecorator class conforms to the Book interface and also stores a reference to a Book. If we want to add a category to a Book, we can use a concrete class that extends BookDecorator. For the Fiction category we can have the following decorator:

public class FictionBookDecorator extends BookDecorator {

    FictionBookDecorator(Book book) {
        super(book);
    }

    @Override
    public String describe() {
        return ("Fiction " + super.describe());
    }
}

You can see that FictionBookDecorator adds the category of the book to the original operation, i.e. describe. Similarly, if we want to specify the Science category, we can have the corresponding ScienceBookDecorator:

public class ScienceBookDecorator extends BookDecorator {

    ScienceBookDecorator(Book book) {
        super(book);
    }

    @Override
    public String describe() {
        return ("Science " + super.describe());
    }
}

ScienceBookDecorator also adds the category of the book to the original operation. There can also be another set of decorators which indicate what type of cover the book has. We can have a SoftCoverDecorator describing that the book has a soft cover:

public class SoftCoverDecorator extends BookDecorator {

	SoftCoverDecorator(Book book) {
		super(book);
	}
	
	@Override
	public String describe() {	
		return (super.describe() + " with Soft Cover");
	}
}

We can also have a HardCoverDecorator describing that the book has a hard cover:

public class HardCoverDecorator extends BookDecorator {
	
	HardCoverDecorator(Book book) {
		super(book);
	}
	
	@Override
	public String describe() {	
		return (super.describe() + " with Hard Cover");
	}
}

Now let’s combine all the classes and interfaces defined so far to leverage the power of the Decorator pattern. Here is just one sample interplay of all these classes:

import java.util.List;
import java.util.ArrayList;

public class BookDescriptionMain {
	
	public static void main(String [] args) {
		
		BasicBook book = new BasicBook();
		
		//Specify book as Fiction category
		FictionBookDecorator fictionBook = new FictionBookDecorator(book);
		
		//Specify that the book has a hard cover
		HardCoverDecorator hardCoverBook = new HardCoverDecorator(book);
		
		//What if we want to specify both the category and cover type together
		HardCoverDecorator hardCoverFictionBook = new HardCoverDecorator(fictionBook);				
		
		//Specify book as Science category
		ScienceBookDecorator scienceBook = new ScienceBookDecorator(book);
		
		//What if we want to specify both the category and cover type together
		HardCoverDecorator hardCoverScienceBook = new HardCoverDecorator(scienceBook);				

		//Add all the decorated book items in a list
		List<Book> bookList = new ArrayList<Book>() {
			{
				add(book);
				add(fictionBook);
				add(hardCoverBook);
				add(hardCoverFictionBook);
				add(scienceBook);
				add(hardCoverScienceBook);
			}
		};
		
		//Traverse the list to access all the book items
		for(Book b: bookList) {
			System.out.println(b.describe());
		}		
	}
}

Running this gives the following output:

Book
Fiction Book
Book with Hard Cover
Fiction Book with Hard Cover
Science Book
Science Book with Hard Cover

This clearly demonstrates how different properties can be added to any pre-defined class or object, and how multiple properties can be combined. I have combined all the decorated book items in a list and then accessed them by iterating over the list.
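
Since every decorator itself implements Book, decorators can also be stacked in a single expression. For example, combining category and cover in one go:

//Decorators can be nested in a single expression
Book hardCoverFiction = new HardCoverDecorator(new FictionBookDecorator(new BasicBook()));

System.out.println(hardCoverFiction.describe()); //prints "Fiction Book with Hard Cover"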

What we have seen till now is just the standard Decorator pattern, and it has been around for a long time. In these times, when functional programming is the new buzzword, one may wonder whether, with the support of lambda expressions in Java, things can be done differently. Indeed, since the Book interface has a single abstract method, it is effectively a functional interface, and we can rewrite the example using lambda expressions. Let’s see what the code looks like:

import java.util.List;
import java.util.ArrayList;

public class BookDescriptionMainWithLambda {
	
	public static void main(String [] args) {
		
		BasicBook book = new BasicBook();
		
		//Specify book as Fiction category using Lambda expression
		Book fictionBook = () -> "Fiction " + book.describe();
		
		//Specify that the book has a hard cover using Lambda expression
		Book hardCoverBook = () -> book.describe() + " with Hard Cover";
		
		//What if we want to specify both the category and cover type together
		Book hardCoverFictionBook = () -> fictionBook.describe() + " with Hard Cover";				
		
		//Specify book as Science category using Lambda expression
		Book scienceBook = () -> "Science " + book.describe();
		
		//What if we want to specify both the category and cover type together
		Book hardCoverScienceBook = () -> scienceBook.describe() + " with Hard Cover";

		List<Book> bookList = new ArrayList<Book>() {
			{
				add(book);
				add(fictionBook);
				add(hardCoverBook);
				add(hardCoverFictionBook);
				add(scienceBook);
				add(hardCoverScienceBook);
			}
		};
		
		bookList.forEach(b -> System.out.println(b.describe()));
	}
}

Running this gives the same output:

Book
Fiction Book
Book with Hard Cover
Fiction Book with Hard Cover
Science Book
Science Book with Hard Cover

We can see that the use of lambda expressions makes the additional classes for the decorators redundant. You don’t need extra classes; simply specify the additional behavior using a lambda expression. There is a trade-off, though: a concrete decorator class is easy to find and reuse the next time, whereas an inline lambda expression exists only at the point where it is written.
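
If reuse of the lambda-based decorations is desired, one variation (my own sketch, not part of the original post) is to model each decoration as a java.util.function.Function<Book, Book> and compose the decorations with andThen:

import java.util.function.Function;

public class BookDescriptionWithFunctions {

	public static void main(String[] args) {

		//Each decoration is a reusable function from Book to Book
		Function<Book, Book> fiction = book -> () -> "Fiction " + book.describe();
		Function<Book, Book> hardCover = book -> () -> book.describe() + " with Hard Cover";

		//Compose the decorations and apply them to a basic book
		Book hardCoverFictionBook = fiction.andThen(hardCover).apply(new BasicBook());

		System.out.println(hardCoverFictionBook.describe()); //prints "Fiction Book with Hard Cover"
	}
}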

All the code snippets can be accessed from my GitHub repo.


Book Review – OCP Java SE 7 Programmer II Certification Guide: Prepare for the 1ZO-804 exam

Although this book is essentially for preparing for the OCP certification, I feel it is a great reference for anyone interested in learning Java 7 features. Oracle has been adding tons of features with each Java release, and it is no mean task to keep up with them. The author has done a great job in providing adequate treatment of all topics. I thoroughly enjoyed reading this book.

High Points:

  • Each chapter has a well-defined mapping to exam objectives, along with exam tips, sidebars and Twist in the Tale exercises.
  • Interesting illustrations which keep the reader focused, like the thread life cycle.
  • A fun way to introduce concepts like overloading and exceptions. I loved the airplane example for exceptions.
  • Inner details explained with decompiled code (e.g. Enum).
  • Good introduction to design patterns like Singleton, Factory, DAO, etc.
  • Lucid explanation of Collections with interesting examples.
  • A novel way to explain the difference between Assertions and Exceptions.
  • Review notes and sample exam questions at the end of each chapter.

Improvement Areas:

  • Compiler errors could be shown in a different color.
  • I think page 249 has a typo: the phrase “Won’t compile; no way to pass argument to T” should be “Won’t compile; no way to pass argument T”.
  • More emphasis should be given to StringBuffer and StringBuilder, as developers typically get confused between them.
  • Exception handling – more emphasis on why we should not catch Errors. Moreover, I don’t think extending RuntimeException should be encouraged.
  • More examples for Java I/O.
  • Concurrency is not an easy area, so a more detailed treatment with sufficient examples should be given.
  • A consolidated mock paper at the end would help.

I feel this book would be a good reference for preparing for the OCP certification.

This review originally appeared on the Amazon listing of the book.


Java 8 Lambda Expression for Design Patterns – Command Design Pattern

In this blog I will illustrate implementing the Command pattern in a functional programming style using Java 8 lambda expressions. The intent of the Command pattern is to encapsulate a request as an object, thereby letting you parameterize clients with different requests, queue or log requests, and support undoable operations. The Command pattern is a way of writing generic code that sequences and executes methods based on run-time decisions. The participants in this pattern are the following:

  • Command – Declares an interface for executing an operation.
  • ConcreteCommand – Defines a binding between a Receiver object and an action.
  • Client – Creates a ConcreteCommand instance and sets its receiver.
  • Invoker – Controls the command(s) to carry out the request(s).
  • Receiver – Performs the actual work.

The relationship between these participants is depicted below:

[Figure: Command pattern – key participants]

Let’s look at a concrete example of the Command pattern and see how it gets transformed with lambda expressions. Suppose we have a file system utility with actions upon it that we’ll be calling, such as opening a file, writing to the file and closing the file. This can be implemented as macro functionality – that is, a series of operations that can be recorded and then run later as a single operation. The file system utility is our receiver.

public interface FileReceiver {
	void openFile();
	void writeFile();
	void closeFile();
}

Each of the operations, such as openFile and writeFile, is a command. We can create a generic command interface to fit these different operations into. Let’s call this interface Action, as it represents performing a single action within our domain. This is the interface that all our command objects will implement.

public interface Action {
    public void perform();
}

Let us now implement our Action interface for each of the operations. All these classes need to do is call a single method on FileReceiver and wrap this call in our Action interface. Let’s name the classes after the operations they wrap, following the usual class naming convention – so the openFile method corresponds to a class called OpenFile.

public class OpenFile implements Action {

    private final FileReceiver fileReceiver;

    public OpenFile(FileReceiver fileReceiver) {
        this.fileReceiver = fileReceiver;
    }

    public void perform() {
        fileReceiver.openFile();
    }

}
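
Only OpenFile is shown here; WriteFile and CloseFile follow exactly the same shape. A sketch of WriteFile (CloseFile is analogous) would look like this:

public class WriteFile implements Action {

    private final FileReceiver fileReceiver;

    public WriteFile(FileReceiver fileReceiver) {
        this.fileReceiver = fileReceiver;
    }

    //Wraps the receiver's writeFile call in the Action interface
    public void perform() {
        fileReceiver.writeFile();
    }
}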

Now let’s implement our Macro class. A macro consists of a sequence of actions that can be invoked in turn, and this class acts as the invoker. It can record actions and run them collectively. We can store the sequence of actions in a List and then fetch each action in turn in order to execute it.

import java.util.ArrayList;
import java.util.List;

public class Macro {
    private final List<Action> actions;

    public Macro() {
        actions = new ArrayList<>();
    }

    public void record(Action action) {
        actions.add(action);
    }

    public void run() {
        actions.forEach(Action::perform);
    }
}

While populating the macro, we add an instance of each command to be recorded to the Macro object. Simply running the macro will then call each of the commands in turn. This is our client code:

Macro macro = new Macro();
macro.record(new OpenFile(fileReceiver));
macro.record(new WriteFile(fileReceiver));
macro.record(new CloseFile(fileReceiver));
macro.run();
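
Here fileReceiver is assumed to be an instance of some concrete FileReceiver implementation; the original snippets don’t include one, so the stub below (the name SimpleFileReceiver is hypothetical) is purely illustrative:

public class SimpleFileReceiver implements FileReceiver {

    //The receiver performs the actual work; here it just logs each step
    public void openFile()  { System.out.println("Opening file"); }
    public void writeFile() { System.out.println("Writing to file"); }
    public void closeFile() { System.out.println("Closing file"); }
}

FileReceiver fileReceiver = new SimpleFileReceiver();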

If you have been with me till this point, you may be wondering where lambda expressions fit into all this. Actually, all our command classes, such as OpenFile, WriteFile and CloseFile, are really just lambda expressions wanting to break out of their wrappers. They are just behavior being passed around as classes. This entire pattern becomes a lot simpler with lambda expressions because we can do away with these classes completely. Let’s see how the client code can use lambda expressions instead of command classes:

Macro macro = new Macro();
macro.record(() -> fileReceiver.openFile());
macro.record(() -> fileReceiver.writeFile());
macro.record(() -> fileReceiver.closeFile());
macro.run();

This can be improved further by recognizing that each of these lambda expressions performs a single method call, so method references can be used directly:

Macro macro = new Macro();
macro.record(fileReceiver::openFile);
macro.record(fileReceiver::writeFile);
macro.record(fileReceiver::closeFile);
macro.run();

The Command pattern is easily extendable: new action methods can be added to receivers to create new Command implementations without changing the client code. The Runnable interface (java.lang.Runnable) in the JDK is a popular example of the Command pattern in use. In this blog, I have tried to express the Command pattern with Java 8 lambda expressions. As you have seen, lambda expressions require a lot less boilerplate, leading to cleaner code.
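
To illustrate that last point, here is a small example of my own where Runnable plays the Command role and an ExecutorService acts as the invoker:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class RunnableAsCommand {

	public static void main(String[] args) {

		ExecutorService invoker = Executors.newSingleThreadExecutor();

		//Each Runnable is a command object submitted to the invoker
		invoker.execute(() -> System.out.println("First command executed"));
		invoker.execute(() -> System.out.println("Second command executed"));

		invoker.shutdown();
	}
}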


Microservices Orchestration

In my earlier post, I introduced the concept of microservices. As a quick recap, the microservices model promises ease of development and maintenance, flexibility for developers and teams to work on different things, building blocks for a scalable system, and a true polyglot model of development. However, this model is not without challenges, the biggest being the complexity of a distributed system. Since we now have to deal with multiple services spread across multiple hosts, it becomes difficult to keep track of the different hosts and services. To scale further, the number of service instances increases, which in turn leads to an increase in the number of hosts.

Container technology can help in reducing the complexity of microservice orchestration. The idea of operating system containers is not new and has been present in one form or another for decades. Operating systems in general provide access to various resources in some standard way and share them amongst the running processes. Container technology takes this idea further by not only sharing but also segregating access to networks, filesystems, CPU and memory. Each container can have its own restricted amount of CPU, memory and network, independent of the other containers.

Let’s see how container technology helps with microservice deployment.

  • The isolation between containers running on the same host makes it easy to deploy microservices developed using different languages and tools.
  • The portability of containers makes deployment of microservices seamless. Whenever a new version of a service is to be rolled out, the running container can be stopped and a new container can be started using the latest version of the service code. All the other containers keep running unaffected.
  • Efficient resource utilization, as containers comprise just the application and its dependencies.
  • Strictly controlled resource requirements without using virtualization.

There are two popular container technologies.

Docker – Google was amongst the biggest contributors to the underlying container technology in Linux, which evolved into LXC. Docker was initially developed as an LXC client and allows the easy creation of containers. The container images are essentially filesystems with a configuration file that gives a startup profile for the process that runs inside the container. The biggest USP of Docker is its ease of use, wherein a single command can download an existing image and boot it into a running container. This has helped its popularity and widespread use. The latest versions of Docker have no dependency on LXC and, through the libcontainer project, have a full stack of container technology all the way down to the kernel. Docker is all set to shake up the virtualization market by consuming it from the bottom.

There are some open-source additions on top of Docker – Kubernetes can deploy and manage multiple Docker containers of the same type, and TOSCA can help with more complex orchestration involving multiple tiers.

Amazon ECS – The Amazon EC2 Container Service (Amazon ECS) defines a pool of compute resources called a “cluster.” Each cluster consists of one or more Amazon EC2 instances. Amazon ECS manages the state of all container-based applications running in the cluster, provides telemetry and logging, and manages capacity utilization of the cluster, allowing for efficient scheduling of work. A group of containers can be defined using a construct called a “task definition”. Each container in the task definition specifies the resources required by that container, and Amazon ECS will schedule that task for execution based on the available resources in the cluster. Amazon ECS also simplifies “service discovery” by integrating with third-party tools like ZooKeeper.

I would say the use of containers to deploy microservices is an evolutionary step, driven mostly by the need to make better use of compute resources and to maintain increasingly complex applications. Combining the microservices architectural philosophy with containers addresses these needs; containers offer the right abstraction for encapsulating microservices.


Microservices in a Nutshell

At the outset, “Microservices” sounds like yet another term on the long list of software architecture styles. Many experts consider it lightweight or fine-grained SOA. Though it is not an entirely new idea, microservices seem to have peaked in popularity in recent years, what with numerous blogs, articles and conferences evangelizing the benefits of building software systems in this style. The coming together of trends like Cloud, DevOps and Continuous Delivery has contributed to this popularity. Netflix, in particular, has been the poster boy of the microservices architectural model.

Nevertheless, microservices deserve their place as a separate architectural style which, while borrowing from earlier styles, has a certain novelty. The fundamental idea behind microservices is to develop systems as a set of fine-grained, independent and collaborating services which run in their own process space and communicate over lightweight mechanisms like HTTP. The stress is on services being small, and they can evolve over time.

In order to better understand the concept of microservices, let’s start by comparing them with traditional monolithic systems. Consider an E-commerce store which sells products online. Customers can browse through the catalog for the product of their choice and place orders. The store verifies the inventory, takes the payment and then ships the order. The figure below shows a sample E-commerce store.

[Figure: Monolithic architecture of the sample E-commerce store]

The online store consists of many components, like the UI which helps customers interact with the store. Besides the UI, there are other components for organizing the product catalog, order processing, taking care of any deals or discounts, and managing the customer’s account. The business domain shares resources like Product, Order and Deals, which are persisted in a relational database. The customers may access the store using a desktop browser or a mobile device.

Despite being a modularly designed system, the application is still monolithic. In the Java world the application would be deployed as a WAR file on a Web Server like Tomcat. The monolithic architecture has its own benefits:

  • Simple to develop as most of the IDEs and development tools are tailored for developing monolithic applications.
  • Ease of testing as only one application has to be launched.
  • Deployment friendly as in most cases a single file or directory has to be deployed on a server.

As long as the monolithic application is simple, everything seems normal from the outside. However, when the application starts getting complex in terms of scale or functionality, the handicaps become apparent. Some of the shortcomings of a monolithic application are the following:

  • A large monolithic system becomes unmanageable when it comes to bug fixes or new features.
  • Even a single line of code change leads to a significant cycle time for deploying on production environment. The entire application has to be rebuilt and redeployed even if one module gets changed.
  • The entry barrier is high for trial and adoption of new technologies. It is difficult for different modules to use different technologies.
  • It is not possible to scale different modules differently. This can be a huge headache as monolithic architectures don’t scale to support large, long lived applications.

This is where the book “The Art of Scalability” comes as a big help. It describes architectural styles that do scale, using a scalability model based on a three-dimensional scale cube.

[Figure: The scale cube – X, Y and Z axes of scalability]

X-axis scaling is the easiest way of scaling an application. It requires running multiple identical copies of the application behind a load balancer. In many cases it can provide a good improvement in capacity and availability of an application without any refactoring.

In the case of Z-axis scaling, each server runs an identical copy of the application. It sounds similar to X-axis scaling, but the crucial difference is that each server is responsible for only a subset of the data. Requests are routed to the appropriate server based on some intelligent mechanism. A common way is to route on the basis of the primary key of the entity being accessed, i.e. sharding. This technique is similar to X-axis scaling in that it improves the application’s capacity and availability. However, neither of these approaches does a good job of easing the development and application complexity. This is where Y-axis scaling helps.
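
As a purely illustrative sketch (my own, not from any particular framework), a Z-axis router could pick a partition by hashing the primary key of the entity being accessed:

//Route a request to one of numShards partitions based on the entity's primary key
public static int shardFor(String customerId, int numShards) {
	return Math.floorMod(customerId.hashCode(), numShards);
}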

Y-axis scaling, or functional decomposition, is the third dimension of scaling. We have seen that Z-axis scaling splits things that are similar; Y-axis scaling, on the other hand, splits things that are different. It divides a monolithic application into a set of services, each corresponding to a related piece of functionality like catalog, order management, etc. How do you split a system into services? It is not as easy as it appears to be. One way could be to split by verb or use case. For example, the UI for the compare-products use case could be a separate service.

Another approach could be to split on the basis of nouns or system resources, where a service takes ownership of all operations on an entity or resource. For example, there could be a separate catalog service which manages the catalog of products.

This leads to another interesting question: how big should a service really be? One school of thought recommends that each service should have only a small set of responsibilities. As per Robert C. Martin (Uncle Bob), services should be designed using the Single Responsibility Principle (SRP). The SRP suggests that a class should have only one reason to change, and it makes great sense to apply the SRP to service design as well.

Another way of addressing this is to design services just like Unix utilities. Unix provides a large number of utilities such as grep, cat, ls and find. Each utility does exactly one thing, often exceptionally well, and can be combined with other utilities using a shell script to perform complex tasks. While designing services, try to model them on Unix utilities so that they perform a single function.

Some people argue that lines of code (10–100 LOC) could be a benchmark for service size. I am not sure that is a good yardstick, what with some languages being more verbose than others; for example, Java is quite verbose compared to Scala. An interesting take is that a microservice should be sized such that an entire microservice can be developed from scratch by a scrum team in one sprint.

Amid all these interesting and diverse viewpoints, it should be remembered that the goal is to create a scalable system without the problems associated with a monolithic architecture. It is perfectly fine to have some services which are very tiny while others are substantially larger.

On applying the Y-axis decomposition to the example monolithic application, we would get the following microservices based system:

[Figure: Microservices-based architecture of the sample E-commerce store]

As you can see, after splitting the monolithic architecture there are different frontend services, each addressing a specific UI need and interacting with multiple backend services. One such example is the Catalog UI service, which communicates with both the Catalog service and the Product service. The backend services encompass the same logical modules which we saw earlier in the monolithic system.

Benefits of Microservices

The microservices based architecture has numerous benefits and some of them can be found below:

  • The small size of microservice leads to ease of development and maintenance.
  • Multiple developers and teams can deliver relatively independently of each other.
  • Each microservice can be deployed independently of other services. If there is a change in a particular service then only that service need be deployed. Other services are unaffected. This in turn leads to easy transition to continuous deployment.
  • This architectural pattern leads to truly scalable systems. Each service can be scaled independently of other services.
  • The microservice architecture helps in troubleshooting. For example, a memory leak in one service only affects that service. Other services will continue to run normally.
  • Microservice architecture leads to a true polyglot model of development. Different services can deploy different technology stacks as per their needs. This gives a lot of flexibility to experiment with various tools and technologies. The small nature of these services makes the task of rewriting the service possible without much effort.

Drawbacks of Microservices

No technology is a silver bullet and microservice architecture is no exception. Let’s look at some challenges with this model.

  • There is the added complexity of dealing with a distributed system. Each service is a separate process, and an inter-process communication mechanism is required for message exchange amongst services. There are lots of remote procedure calls, REST APIs or messages to glue components together across different processes and servers. We have to consider challenges like network latency, fault tolerance, message serialization, unreliable networks, asynchronicity, versioning, varying loads within our application tiers, etc.
  • Microservices are more likely to communicate with each other in an asynchronous manner. This helps decompose work into genuinely separate, independent tasks which can happen out of order at different times. Things start getting complex with the need to manage correlation IDs and distributed transactions to tie various actions together.
  • Different test strategies are required, as asynchronous communication and dynamic message loads make it much harder to test the system. Services have to be tested in isolation, and their interactions have to be tested as well.
  • Since there are many moving parts, complexity shifts towards orchestration. Managing this typically calls for a PaaS-like technology such as Pivotal Cloud Foundry.
  • Careful planning is required for features spanning multiple services.
  • Substantial DevOps skills are required, as different services may use different tools, and developers also need to be involved in operations.

Summary

I am sure with all the aspects mentioned above you must be wondering whether you should start splitting your legacy monolithic architectures into microservices or in fact start any fresh system with this architectural style. The answer as in most cases is – “It Depends!” There are some organizations like Netflix, Amazon, The Guardian, etc. who have pioneered this architectural style and demonstrated that it can give tremendous benefits.

Success with microservices depends a lot on how well you componentize your system. If the components do not compose cleanly, then all you are doing is shifting complexity from inside a component to the connections between components. Not only that, it also moves complexity to a place that’s less explicit and harder to control.

The success of using microservices depends on the complexity of the system you’re contemplating. This approach introduces its own set of complexities like continuous deployment, monitoring, dealing with failure, eventual consistency, and other factors that a distributed system introduces. These complexities can be handled, but it takes extra effort. So, I would say if you can manage to keep your system simple, then you may not need microservices. If your system runs the risk of becoming a complex one, then you can certainly think about microservices.
