Spring Boot: How to Integrate a Filter in a Spring Boot Application

What Is a Filter?

A Filter is an object that performs filtering tasks on the request to a resource (a servlet or static content), on the response from a resource, or both. Filters do their work in the doFilter method. Every Filter has access to a FilterConfig object from which it can obtain its initialization parameters and a reference to the ServletContext, which it can use, for example, to load resources needed for filtering tasks. In this article we will see how to integrate a Filter with a Spring Boot application. A Filter defines the following methods:

void destroy(): Called by the web container to indicate to a filter that it is being taken out of service.

void doFilter(ServletRequest request, ServletResponse response, FilterChain chain): Called by the container each time a request/response pair is passed through the chain due to a client request for a resource at the end of the chain.

void init(FilterConfig filterConfig): Called by the web container to indicate to a filter that it is being placed into service.

In our filter, WebFilter, we will intercept the request in the doFilter method and add a header parameter named remote_addr to the request. We will then retrieve the same header parameter in our controller class to verify that our filter implementation is working. Following is our filter implementation.

package com.tuturself.spring.boot.filter;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.core.Ordered;
import org.springframework.core.annotation.Order;
import org.springframework.stereotype.Component;

import javax.servlet.*;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;

@Component
@Order(Ordered.HIGHEST_PRECEDENCE)
public class WebFilter implements Filter {

	private static final Logger logger = LoggerFactory.getLogger(WebFilter.class);
	
	private static final boolean CONDITION = true;
	
	@Override
	public void init(FilterConfig filterConfig) throws ServletException {
		logger.debug("Initiating WebFilter >> ");
	}
	
	@Override
	public void doFilter(ServletRequest request, ServletResponse response,
			FilterChain chain) throws IOException, ServletException {
		if (CONDITION) {
			HttpServletRequest req = (HttpServletRequest) request;
			HeaderMapRequestWrapper requestWrapper = new 
					HeaderMapRequestWrapper(req);
			String remote_addr = request.getRemoteAddr();
			requestWrapper.addHeader("remote_addr", remote_addr);
            // Goes to default servlet
			chain.doFilter(requestWrapper, response); 
		} else {
			((HttpServletResponse) response)
				.setStatus(HttpServletResponse.SC_BAD_REQUEST);
		}
	}
	
	@Override
	public void destroy() {
		logger.debug("Destroying WebFilter >> ");
	}
}

The filter is registered as a bean via the @Component annotation. @Order(Ordered.HIGHEST_PRECEDENCE) controls where the filter sits in the filter chain: the lower the order value, the higher the precedence, and the filter with the highest precedence runs first. If you have multiple filters, you can use numeric values to specify their order, like so:

@Component
@Order(2)
public class WebFilter implements Filter {
}

@Component
@Order(1)
public class AnotherWebFilter implements Filter {
}

Now we need to create our own HttpServletRequestWrapper subclass for adding a new header parameter to the request. Following is the HeaderMapRequestWrapper.

package com.tuturself.spring.boot.filter;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletRequestWrapper;
import java.util.*;


public class HeaderMapRequestWrapper extends HttpServletRequestWrapper {
    private final Map<String, String> headerMap = new HashMap<>();

    /**
     * Constructs a wrapper for this request.
     *
     * @param request the request to wrap
     */
    public HeaderMapRequestWrapper(HttpServletRequest request) {
        super(request);
    }

    /**
     * add a header with given name and value
     *
     * @param name
     * @param value
     */
    public void addHeader(String name, String value) {
        headerMap.put(name, value);
    }

    @Override
    public String getHeader(String name) {
        String headerValue = super.getHeader(name);
        if (headerMap.containsKey(name)) {
            headerValue = headerMap.get(name);
        }
        return headerValue;
    }

    /**
     * get the Header names
     */
    @Override
    public Enumeration<String> getHeaderNames() {
        List<String> names = Collections.list(super.getHeaderNames());
        names.addAll(headerMap.keySet());
        return Collections.enumeration(names);
    }

    @Override
    public Enumeration<String> getHeaders(String name) {
        List<String> values = Collections.list(super.getHeaders(name));
        if (headerMap.containsKey(name)) {
            values.add(headerMap.get(name));
        }
        return Collections.enumeration(values);
    }
}

Now we create our controller class, where we check that the header was added correctly to the request.

package com.tuturself.spring.boot.filter;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.RequestHeader;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/test")
public class WebApi {

    private static final Logger logger = LoggerFactory.getLogger(WebApi.class);

    @RequestMapping(method = RequestMethod.GET)
    public ResponseEntity<Object> doSomething(@RequestHeader(name = "remote_addr") 
    	String remoteAddress) {
        logger.debug("The Remote address added by WebFilter is :: {}", remoteAddress);
        return new ResponseEntity<>("SUCCESS", HttpStatus.OK);
    }
}

Here in the doSomething() method we retrieve the header parameter that we added in the Filter by:

@RequestHeader(name = "remote_addr") String remoteAddress


How to Effectively Use ExecutorService in Kafka Consumers

Apache Kafka is one of today’s most commonly used event streaming platforms. While using Kafka, we quite often run into a scenario where we have to process a large number of events/messages placed on a broker. Traditional approaches, where a consumer listens to a topic and processes the messages within the consumer itself, can become a performance bottleneck when the rate at which messages are placed on the topic exceeds the rate at which a single consumer can process them. A potential solution in such a scenario is to offload message processing to worker threads in a thread pool.

In this section, we will take a look into how a Kafka consumer can offload its work to a thread pool. We will leverage Java’s ExecutorService framework to create a thread pool.

This approach primarily involves two steps. The first step is to create a KafkaConsumer that reads messages from a topic; once the messages are read, they are delivered to a thread pool for further processing. The second step is to create worker threads that perform the processing of each message.

Step 1, Kafka Consumer Implementation: Here, we read the messages from a topic and dispatch them to a thread pool created using ThreadPoolExecutor.

import java.util.Arrays;
import java.util.Properties;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class KafkaProcessor {

    private static final Properties KAFKA_PROPERTIES = new Properties();

    static {
        KAFKA_PROPERTIES.put("bootstrap.servers", "localhost:9092");
        KAFKA_PROPERTIES.put("group.id", "test-consumer-group");
        KAFKA_PROPERTIES.put("enable.auto.commit", "true");
        KAFKA_PROPERTIES.put("auto.commit.interval.ms", "1000");
        KAFKA_PROPERTIES.put("session.timeout.ms", "30000");
        KAFKA_PROPERTIES.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        KAFKA_PROPERTIES.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
    }

    private final KafkaConsumer<String, String> myConsumer;
    private ExecutorService executor;

    public KafkaProcessor() {
        this.myConsumer = new KafkaConsumer<>(KAFKA_PROPERTIES);
        this.myConsumer.subscribe(Arrays.asList("testTopic"));
    }

    public void init(int numberOfThreads) {
        // Create a thread pool; CallerRunsPolicy throttles the consumer by
        // running the task on the polling thread when the queue is full.
        executor = new ThreadPoolExecutor(numberOfThreads, numberOfThreads,
                0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<Runnable>(1000),
                new ThreadPoolExecutor.CallerRunsPolicy());
        // Poll loop: runs until the process is stopped.
        while (true) {
            ConsumerRecords<String, String> records = myConsumer.poll(100);
            for (final ConsumerRecord<String, String> record : records) {
                executor.submit(new KafkaRecordHandler(record));
            }
        }
    }

    public void shutdown() {
        if (myConsumer != null) {
            myConsumer.close();
        }
        if (executor != null) {
            executor.shutdown();
            try {
                if (!executor.awaitTermination(60, TimeUnit.SECONDS)) {
                    executor.shutdownNow();
                }
            } catch (InterruptedException e) {
                executor.shutdownNow();
            }
        }
    }
}

Step 2, Worker Thread (Message/Record Handler) Implementation: Here, we perform the further processing of each message.

import org.apache.kafka.clients.consumer.ConsumerRecord;

public class KafkaRecordHandler implements Runnable {

    private final ConsumerRecord<String, String> record;

    public KafkaRecordHandler(ConsumerRecord<String, String> record) {
        this.record = record;
    }

    @Override
    public void run() { // this is where further processing happens
        System.out.println("value = " + record.value());
        System.out.println("Thread id = " + Thread.currentThread().getId());
    }
}

The final step is to create the KafkaProcessor and specify the number of worker threads through the init() method.

public class ConsumerTest {
    public static void main(String[] args) {
        KafkaProcessor processor = new KafkaProcessor();
        try {
            processor.init(5);
        } catch (Exception exp) {
            processor.shutdown();
        }
    }
}

This approach might not be needed/suitable for all scenarios. You have to carefully evaluate the best approach to be used with your Kafka consumer implementation.

Passing Multiple Arguments Into Stream Filter Predicates

When I am working with Java streams, I use filters extensively to find objects. I often encounter the situation where I’d like to pass two arguments to the filter function. Unfortunately, the standard API only accepts a Predicate and not a BiPredicate.

To work around this limitation, I define all of my predicates as methods in a class, for example, Predicates. That predicate class then takes a constant parameter through its constructor.

public class Predicates {

    private final String pattern;

    public Predicates(String pattern) {
        this.pattern = pattern;
    }

    public boolean containsPattern(String string) {
        return string.contains(pattern);
    }
}

When using Predicates, I instantiate an instance with the constant parameter of my choice. Then, I can use the instance methods as method references passed to the filter, like so:

Predicates predicates = new Predicates("SSH");
System.getenv().keySet().stream().filter(predicates::containsPattern).collect(Collectors.toSet());

This way, you can easily pass additional parameters to the filter, and your code stays easy to read, even if you have multiple filters in the chain. Also, you can reuse the predicates in the Predicates class in other Collection operations.
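The same idea can also be expressed without a dedicated class, using a static factory method that captures the extra argument and returns a one-argument Predicate. This is a minimal sketch; the PatternPredicates class and its method name are my own, not part of the original example:

import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

public class PatternPredicates {

    // Captures the extra argument and returns a one-argument Predicate.
    public static Predicate<String> containsPattern(String pattern) {
        return string -> string.contains(pattern);
    }

    public static void main(String[] args) {
        List<String> keys = List.of("SSH_AUTH_SOCK", "PATH", "SSH_AGENT_PID");
        List<String> matched = keys.stream()
                .filter(containsPattern("SSH"))
                .collect(Collectors.toList());
        System.out.println(matched); // [SSH_AUTH_SOCK, SSH_AGENT_PID]
    }
}

Both styles are equivalent; the class-based version is handy when several related predicates share the same parameter.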

Happy coding!

Why Do We Need an Interface in OOP?

Most of us feel confused about the usefulness of interfaces at a first glance. We have questions like “Why do we need interfaces?” and “What is the advantage of using interfaces?” In this blog, we will try to find the answers to these questions.

Let’s begin with a simple example. Assume that you are a student and need to prepare for an exam. You know that there will be distractions during your exam preparation. We will use mobile applications and friends as our example. Let’s describe each as a class:

class Friend {
    public void askHelp(){
        System.out.println("I will do my best to help you!");
    }
}
class MobileApplication {
    public void installApp(){
        System.out.println("The installation is complete.");
    }
}

I think most would agree with me that not every friend and mobile application can be a distraction. For example, we can have hardworking friends. Or, we can install useful mobile applications on our smartphones. So, adding the distract() method to these classes is not good. That is why we need to describe these distractions in greater detail:

class AdventureLoverFriend extends Friend{
   public void getAdviceAboutTravel(){
        System.out.println("I will help you travel better :)");
   }
}
class Facebook extends MobileApplication{
    public void connectPeople(){
        System.out.println("Stay in touch with friends from all over the world!");
    }
}

You know that your upcoming exam is going to be very difficult and you don’t have much time to prepare. This is why you decided to print a list of distractions with headers like “SAY NO!” and stick them on the wall in your room. How can you achieve this?

Well, you can think about inheritance. Because if we create a superclass named Distraction, and both the Facebook class and AdventureLoverFriend class extend it, we can collect all distractable things in one list. Because we can refer to the subclass object with the superclass reference variable, we can conduct the needed operation on this list. But Facebook and AdventureLoverFriend cannot extend the Distraction class. This is because, in Java, one class cannot extend more than one class.

At this time, we can see how the interface is useful. Let’s create an interface named Distractable to further demonstrate this:

interface Distractable{
    void distract();
}

And then, let’s implement it as follows:

class AdventureLoverFriend extends Friend implements Distractable{
    public void getAdviceAboutTravel(){
        System.out.println("I will help you travel better :)");
    }
    @Override
    public void distract() {
        System.out.println("I’m having a party this weekend and would love for you to come ^_^");
    }
}
class Facebook extends MobileApplication implements Distractable{
    public void connectPeople(){
        System.out.println("Stay in touch with friends from all over the world!");
    }
    @Override
    public void distract() {
        System.out.println("Go through your entire Facebook news feeds again and again :/");
    }
}

As you can see, the interfaces allow us to define common behavior that can be implemented by any class, regardless of its inheritance. Although the AdventureLoverFriend class extends the Friend class and the Facebook class extends the MobileApplication class, we can add common distractable behavior to them by implementing the Distractable interface. This means that we can “cut across” the inheritance hierarchy to implement functionality as we see fit.

Since Java allows us to refer to implementation class objects with an interface reference variable, we can write the following in the ExamPreparation class:

class ExamPreparation {
    public static void main(String[] args) {
        List<Distractable> distractableList = getListOfDistractableThings();
        System.out.println("\t\t\t\t\t\t  SAY NO! ");
        printList(distractableList);
    }
    public static void printList(List<Distractable> distractableList){
        for(Distractable distractableThing: distractableList){
            distractableThing.distract();
        }
    }
    public static List<Distractable> getListOfDistractableThings(){
        List<Distractable> distractables = new ArrayList<>();
        Distractable facebook = new Facebook();
        distractables.add(facebook);
        Distractable adventureLoverFriend = new AdventureLoverFriend();
        distractables.add(adventureLoverFriend);
        return distractables;
    }
}

So, we print the list as we want:

                         SAY NO!
Go through your entire Facebook news feeds again and again :/
I’m having a party this weekend and would love for you to come ^_^

Also, note that we only focus on the common distractable behavior in the printList method. We don’t care about the other behaviors, because inside printList we look at each element as a Distractable object, not as a Facebook or AdventureLoverFriend object. We can show this in code:

Distractable facebook = new Facebook();
facebook.connectPeople(); // doesn't compile: connectPeople() is not declared on Distractable

There are many other reasons why we need interfaces in Java. In this post, I tried to explain one of the more important concepts. Hope it was helpful!

TOPICS

Question / Possible Answer
How to make a class thread-safe?
What are the different scopes of a Spring bean and how are they used? request, session, global session, singleton (default), prototype
How to make a resource accept both XML and JSON as input? Accept header: application/json and application/xml
How to validate a query parameter for its existence? Check for null
What are pessimistic and optimistic locking?
What is left push (LPUSH) w.r.t. Redis?
What is pipelining in Redis?
What is xAPI
What is your tech stack
What are the dbs you have worked on
What are the commands of redis
Spring abstraction for redis.
Questions will mostly be on whatever you have worked on previously.
Junits.
Create a generic class with bounded nature.
What does Spring Boot help us with? Spring starters help to load all configurations
How to use utility classes in Spring (constants, etc.)?
Disadvantages of Spring Boot: less code readability, because you need to exclude auto-configuration for custom implementations
Maven lifecycle
Benefits of Boot: auto-configuration, dependency reduction, and embedded HTTP servers
Java 8 stream program
REST service basics
How are you building the response object in REST services? Response, ResponseEntity
Cassandra replication factors, nodes
What is the difference between an abstract class and an interface?
What is ConcurrentHashMap?
What is a Spring profile?
What are the Collection interfaces and their implementing classes?
What is JPA? How does the implementation process work?
What is the difference between JPA and Hibernate?
What is TDD, and what is the JUnit/Mockito flow?
What is NoSQL? Non-structured data
What is the Spring bean lifecycle?
What is the difference between Spring and Spring Boot? Explain annotations
What is a RESTful service and the flow of its implementation?
How to call one REST service from another REST service in Spring? RestTemplate, WebClient, RESTEasy
Table relationships in Hibernate? Associations (one-to-many, etc.)
What is IoC?
What is the JUnit-with-Mockito implementation flow for a class and a method?
Reactive Java: asynchronous, non-blocking, functional programming
If the scope of a bean is singleton, how will it be thread-safe?
Spring Boot starters
How to convert a POJO to JSON in REST
MongoDB features
@Qualifier
Condition-based bean creation: @Conditional
@Profile
How to restrict auto-configuration in a Spring Boot application
JUnit with @Mock, @Spy, @Captor, @InjectMocks
How to achieve default configuration in all applications?
Agile development
Code review rules: SonarQube, PMD, FindBugs, Checkstyle
Exception handling
Checked and runtime exceptions
Difference between Throwable and Exception
Difference between an array and an ArrayList
Difference between List and Set
Java 8 features: lambdas, functional interfaces (Function, Consumer, Supplier, Predicate), streams, date and time API, collectors
Git commands: pull, push, commit, rebase, add, diff, apply, merge, reset
PermGen error: handle resources properly (open and close) to avoid memory leaks. The first thing one can do is to make the permanent generation heap space bigger.
This cannot be done with the usual -Xms (set initial heap size) and -Xmx (set maximum heap size) JVM arguments, since, as mentioned, the permanent generation heap space is entirely separate from the regular Java heap space, and those arguments set the space for the regular Java heap. However, there is a similar argument which can be used (at least with the Sun/OpenJDK JVMs) to make the permanent generation heap bigger:

-XX:MaxPermSize=256m

Garbage collectors: System.gc()
Database design for an attendance system: normalization techniques
How to maintain properties files as environment-specific? @Profile
How to make beans environment-specific? @Conditional
What are web services? Difference between SOAP and REST
Write the query for who has 99% attendance in the attendance table: SELECT student_name, AVG(col1) AS avgval FROM attendance GROUP BY student_name HAVING AVG(col1) > 99
How to consume a REST service in Spring Boot? RestTemplate
How to join tables (documents) in MongoDB
Difference between SQL and NoSQL: NoSQL stores data in the form of documents (JSON)
Are you aware of Continuous Integration/Deployment? Jenkins
How does Spring autowiring work?
Observable zip and subscribing the calls to register?
How does Spring Boot get initialized?
Difference between @Component and @Configuration?
Command-line commands of Spring Boot?
Example to iterate a List using a lambda expression and filter the stream?
Circuit breaker design pattern?
What is multithreading?
How can we achieve thread safety?
Why do we use @EnableAutoConfiguration?
What is NoSQL?
Did you use Optional in Java 8?
What is component scan?
What is Optional in Java 8?
What is the ConcurrentHashMap implementation?
Write a program on streams to filter objects
What is the Maven command for creating a JAR and executing test classes?
What is the Jenkins command for compiling test cases?
Why do we use Spring Boot in our applications?
Write a sample program in a Spring Boot app
What is volatile?
Difference between map() and flatMap() in Java 8 streams?
Did you use any Java 8 concurrency utilities like CountDownLatch and ForkJoinPool?
SQL vs. NoSQL
Monolithic vs. microservices

Authentication and Authorization in Microservices

Microservices architecture has been gaining a lot of ground as the preferred architecture for implementing solutions, as it provides benefits like scalability, logical and physical separation, small teams managing parts of the functionality, and flexibility in technology. But since microservices are distributed, the complexity of managing them increases.

One of the key challenges is how to implement authentication and authorization in microservices so that we can manage security and access control.

In this post, we will try to explore a few approaches that are available and see how we should implement them.

There are three approaches that we can follow:

Local Authentication and Authorization (Microservices are responsible for Authentication and Authorization)

      • Pros
        • A different authentication mechanism can be implemented for each microservice.
        • Authorization can be more fine-grained.
      • Cons
        • The code gets bulkier.
        • The probability of each service using a different authentication mechanism is very low, so code gets duplicated.
        • The developer needs to know the permission matrix and should understand what each permission does.
        • The probability of making mistakes is quite high.

Global Authentication and Authorization (an all-or-nothing approach: if the authorization for a service is there, then it is accessible to all, else to none)

      • Pros
        • Authentication and authorization are centralized, so there’s no repetition of code.
        • A future change to the authentication mechanism is easy, as there’s only one place to change the code.
        • Microservices’ code is very light and focuses on business logic only.
      • Cons
        • Microservices have no control over what the user can or cannot access, and finer-grained permissions cannot be granted.
        • Failure is centralized and will cause everything to stop working.

Global Authentication with Authorization as a Part of Microservices

    • Pros
      • Fine-grained object permissions are possible, as microservices can decide what the user will or will not see.
      • Global authentication becomes easier to manage, as its load is lighter.
      • Since authorization is controlled by the respective microservice, there’s no extra network latency, and it will be faster.
      • No centralized point of failure for authorization.
    • Cons
      • Slightly more code for developers to write, as they have to focus on permission control.
      • It takes some effort to understand what you can do with each permission.

In my opinion, the third option is the best one, as most applications have a common authentication mechanism, so global authentication makes perfect sense. Microservices can be accessed from multiple applications and clients, and these might have different data needs, so global authorization becomes a limiting factor on what each application can see. With local authorization, microservices can make sure that the client application is only authorized to see what it needs to see.

My organization implemented the same approach in one of the projects we were working on recently. We built an authentication service that was mainly responsible for integrating with the LDAP system to verify the user and then contacting the RBAC (Role-Based Access Control) service to populate the permission matrix based on the role the user plays in the context of the application; e.g., the same user can be a normal user in one application and an admin in another. So we need to understand the context the user is coming from, and RBAC is the place where we decode the context and populate the relevant set of permissions. The permission matrix was then sent to the microservice as part of the claims in the JWT token. Each microservice then applies those permissions and returns only what is required. Please see the below diagram to see how we orchestrate the flow.
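As a rough sketch of what the permission check inside a microservice might look like, assuming the claims have already been extracted from a verified JWT (the claim name "permissions" and the permission strings below are hypothetical, not from the actual project):

import java.util.List;
import java.util.Map;

public class PermissionCheck {

    // Looks up a required permission in the (hypothetical) "permissions"
    // claim of an already-verified token payload.
    public static boolean hasPermission(Map<String, List<String>> claims,
                                        String permission) {
        return claims.getOrDefault("permissions", List.of()).contains(permission);
    }

    public static void main(String[] args) {
        Map<String, List<String>> claims =
                Map.of("permissions", List.of("orders:read"));
        System.out.println(hasPermission(claims, "orders:read"));  // true
        System.out.println(hasPermission(claims, "orders:write")); // false
    }
}

In a real service this check would sit behind JWT signature verification; the sketch only shows the authorization decision itself.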

[Diagram: Architecture flow for authentication and authorization in microservices]

Conclusion

The above solution, where authentication is global and microservices control the authorizations of their content based on the permissions that are passed to it, is one of the possible solutions for handling authentication and authorization modules in microservices development. Also, we can enhance this solution by building a sidecar in a service mesh-type architecture, where we offload the authorization to the sidecar.

Java Garbage Collection Basics

Objects dynamically created with the new operator are deallocated automatically. The technique that accomplishes this is called garbage collection, and it works like this: when no references to an object exist, that object is assumed to be no longer needed, and the memory occupied by the object can be reclaimed.
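The reachability rule above can be illustrated with a minimal sketch; note that System.gc() is only a hint, and the JVM may ignore it:

public class GcDemo {

    public static void main(String[] args) {
        Object o = new Object(); // reachable through the reference 'o'
        o = null;                // no references remain; the object is now
                                 // eligible for garbage collection
        System.gc();             // only a hint: the JVM decides when (and
                                 // whether) to actually collect
        System.out.println("done");
    }
}

There is no way to force collection of a specific object; eligibility is all the program controls.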

How Does Automatic Garbage Collection Work?

Automatic garbage collection works by looking at heap memory, identifying which objects are being referenced and which are not, and deleting the unused objects. This is widely known as the “Mark and Sweep Algorithm.”

The process of deallocating memory is handled automatically by the garbage collector in two steps:

Step 1: Marking

This is the first step, where the garbage collector identifies which pieces of memory are in use and which are not, and marks the objects that are no longer referenced.

[Figure: Garbage collection, marking. Source: Oracle Java Docs]

Step 2: Deleting

In this step, all the objects marked in Step 1 as no longer referenced are deleted, and the memory is reclaimed.

Deletion can be done in two ways:

2a) Normal Deletion

In normal deletion, the memory allocator holds references to blocks of free space where new objects can be allocated, as shown in the figure below.

[Figure: Garbage collection, normal deletion. Source: Oracle Java Docs]

2b) Deletion With Compacting

In this process, in addition to deleting unreferenced objects, the remaining referenced objects are compacted to further improve performance. Moving the referenced objects together makes new memory allocation much easier and faster.

[Figure: Garbage collection, deletion with compacting. Source: Oracle Java Docs]

For further information on how garbage collection works, please refer to the Oracle Java documentation.