1. Functional Interface, Stream API?
Functional interfaces are also called Single Abstract Method interfaces (SAM interfaces). As the name suggests, a functional interface permits exactly one abstract method. Java 8 introduced the @FunctionalInterface annotation, which produces a compile-time error if a functional interface violates this contract.
What are the functional interfaces available in Java?
Some examples of functional interfaces in Java are the Runnable, ActionListener, and Comparable interfaces.
Stream API:
Java 8 provides a new package, java.util.stream. This package consists of classes, interfaces, and enums that allow functional-style operations on elements. You can use streams by importing the java.util.stream package.
Streams provide the following features:
· Streams do not store elements. A stream simply conveys elements from a source, such as a data structure, an array, or an I/O channel, through a pipeline of computational operations.
· Streams are functional in nature. Operations performed on a stream do not modify its source. For example, filtering a stream obtained from a collection produces a new stream without the filtered elements, rather than removing elements from the source collection.
· Streams are lazy and evaluate code only when required.
· The elements of a stream are visited only once during the life of the stream. Like an Iterator, a new stream must be generated to revisit the same elements of the source.
· You can use streams to filter, collect, print, and convert from one data structure to another. In the following examples, we have applied various operations with the help of streams.
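The features above can be illustrated with a short pipeline (a minimal sketch; the list and values are invented for the example):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class StreamFeaturesDemo {
    public static void main(String[] args) {
        List<Integer> source = Arrays.asList(3, 8, 12, 5, 20);

        // Filtering produces a new stream; the source list is untouched
        List<Integer> big = source.stream()
                .filter(n -> n > 5)      // lazy: runs only when collect() is invoked
                .collect(Collectors.toList());

        System.out.println(big);     // [8, 12, 20]
        System.out.println(source);  // source is unmodified: [3, 8, 12, 5, 20]
    }
}
```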
Java 8 Features?
1. Lambda Expressions
2. Functional Interfaces
3. Default Methods
4. Streams
5. Date/Time API
6. StringJoiner
Lambda Expression: In the Java programming language, a lambda expression (or function) is just an anonymous function, i.e., a function with no name and not bound to an identifier. Lambda expressions are written exactly in the place where they are needed, typically as a parameter to some other function.
Syntax:
(parameters)->expression
(parameters)->{statements;}
()->expression
(x,y)->x+y
1. A lambda expression can have zero, one or
more parameters.
2. The type of the parameters can be explicitly declared,
or it can be inferred from the context.
3. Multiple parameters are enclosed in mandatory
parentheses and separated by commas. Empty parentheses are used to represent an
empty set of parameters.
4. When there is a single parameter, if its type
is inferred, it is not mandatory to use parentheses.
5. The body of the lambda expressions can
contain zero, one, or more statements.
6. If the body of the lambda expression has a single expression, curly brackets are not mandatory, and the return type of the anonymous function is the same as that of the body expression. When there is more than one statement in the body, they must be enclosed in curly brackets.
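The rules above can be seen in a small demo (the SAM interfaces here are invented for illustration):

```java
public class LambdaSyntaxDemo {
    interface Adder { int add(int x, int y); }  // hypothetical SAM interface for the demo
    interface Greeter { void greet(); }

    public static void main(String[] args) {
        // Two parameters, single expression: no braces, return type inferred
        Adder sum = (x, y) -> x + y;

        // Empty parameter list, block body with explicit braces
        Greeter hello = () -> {
            System.out.println("hello");
        };

        System.out.println(sum.add(2, 3)); // 5
        hello.greet();
    }
}
```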
Functional Interfaces
Functional interfaces are also called Single Abstract Method interfaces (SAM interfaces). As the name suggests, a functional interface permits exactly one abstract method. Java 8 introduced the @FunctionalInterface annotation, which produces a compile-time error if a functional interface violates this contract.
Functional Interface Example
//Optional annotation
@FunctionalInterface
public interface MyFirstFunctionalInterface {
    public void firstWork();
}
For example, given below is a perfectly valid functional interface, because methods overridden from the Object class do not count toward the single abstract method:
@FunctionalInterface
public interface MyFirstFunctionalInterface {
    public void firstWork();

    @Override
    public String toString(); //Overridden from Object class

    @Override
    public boolean equals(Object obj); //Overridden from Object class
}
3. Default Methods
Java 8 allows us to add non-abstract methods to interfaces. These methods must be declared with the default keyword. Default methods were introduced in Java 8 to let existing interfaces evolve to support lambda expressions. Default methods enable us to introduce new functionality to the interfaces of our libraries and ensure binary compatibility with code written for older versions of those interfaces.
public interface Moveable {
    default void move() {
        System.out.println("I am moving");
    }
}

public class Animal implements Moveable {
    public static void main(String[] args) {
        Animal tiger = new Animal();
        tiger.move();
    }
}
Output:
I am moving
4. Java 8 Streams
Another major change introduced in Java 8 is the Streams API, which provides a mechanism for processing a set of data in various ways, including filtering, transformation, or any other way that may be useful to an application. The Streams API in Java 8 supports a different type of iteration, where we simply define the set of items to be processed, the operation(s) to be performed on each item, and where the output of those operations is to be stored.
4.1. Stream
API Example
In this example, items is a collection of String values and we want to remove the entries that begin with some prefix text.
List<String> items;
String prefix;
List<String> filteredList = items.stream()
    .filter(e -> !e.startsWith(prefix))
    .collect(Collectors.toList());
5. Java 8 Date/Time API Changes
The new Date and Time APIs/classes (JSR-310), also called ThreeTen, have changed the way we handle dates in Java applications.
5.1. Date Classes
The Date class has become obsolete. The new classes intended to replace the Date class are LocalDate, LocalTime, and LocalDateTime.
The LocalDate class represents a date. There
is no representation of a time or time-zone.
The LocalTime class represents a time. There
is no representation of a date or time-zone.
The LocalDateTime class represents a date-time. There is no representation of a time-zone.
Example:
LocalDate localDate = LocalDate.now();
LocalTime localTime = LocalTime.of(12, 20);
LocalDateTime localDateTime = LocalDateTime.now();
OffsetDateTime offsetDateTime = OffsetDateTime.now();
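A few more common operations on these classes (a small sketch; the dates used are arbitrary):

```java
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.LocalTime;
import java.time.format.DateTimeFormatter;

public class DateTimeDemo {
    public static void main(String[] args) {
        LocalDate date = LocalDate.of(2020, 1, 31);
        System.out.println(date.plusDays(1));               // arithmetic: 2020-02-01
        System.out.println(LocalDate.parse("2020-06-15"));  // parsing ISO-8601: 2020-06-15

        LocalTime time = LocalTime.of(12, 20);
        LocalDateTime dt = LocalDateTime.of(date, time);
        // custom formatting
        System.out.println(dt.format(DateTimeFormatter.ofPattern("dd-MM-yyyy HH:mm"))); // 31-01-2020 12:20
    }
}
```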
6. StringJoiner
Java 8 added a new final class, StringJoiner, in the java.util package. It is used to construct a sequence of characters separated by a delimiter. You can create strings by passing delimiters like comma (,), hyphen (-), etc.
Eg:
public class StringJoinerExample {
    public static void main(String[] args) {
        // passing comma(,) as delimiter:
        // StringJoiner joinNames = new StringJoiner(",");

        // passing comma(,) as delimiter, with square brackets as prefix and suffix
        StringJoiner joinNames = new StringJoiner(",", "[", "]");

        // Adding values to StringJoiner
        joinNames.add("Rahul");
        joinNames.add("Raju");
        joinNames.add("Peter");
        joinNames.add("Raheem");

        System.out.println(joinNames); // [Rahul,Raju,Peter,Raheem]
    }
}
Collectors
Collectors is a final class that extends Object. It provides reduction operations, such as accumulating elements into collections and summarizing elements according to various criteria.
Example:
List<Product> productsList = new ArrayList<Product>();
//Adding Products
productsList.add(new Product(1, "HP Laptop", 25000f));
productsList.add(new Product(2, "Dell Laptop", 30000f));
productsList.add(new Product(3, "Lenovo Laptop", 28000f));
productsList.add(new Product(4, "Sony Laptop", 28000f));
productsList.add(new Product(5, "Apple Laptop", 90000f));
Set<Float> productPriceList = productsList.stream()
    .map(x -> x.price)            // fetching price
    .collect(Collectors.toSet()); // collecting as a set (duplicates removed)
System.out.println(productPriceList);
Example 2:
Long noOfElements = productsList.stream().collect(Collectors.counting());
System.out.println("Total elements : " + noOfElements);
Example 3:
Double average = productsList.stream().collect(Collectors.averagingDouble(p -> p.price));
System.out.println("Average price is: " + average);
Supplier: The Supplier interface is part of the java.util.function package, introduced in Java 8 to support functional programming in Java. It represents a function that takes no arguments but produces a value of type T.
Java 8's Supplier is a functional interface whose functional method is get(). The Supplier interface represents an operation that takes no argument and returns a result. As a functional interface, it can be used as the assignment target for a lambda expression or method reference.
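A minimal Supplier sketch (the greeting text is made up for the example):

```java
import java.util.function.Supplier;

public class SupplierDemo {
    public static void main(String[] args) {
        // Takes no argument; produces a value only when get() is called
        Supplier<String> greeting = () -> "Hello from Supplier";
        System.out.println(greeting.get());

        // A method reference can also serve as a Supplier
        Supplier<Double> random = Math::random;
        System.out.println(random.get() >= 0.0); // true
    }
}
```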
2. SpringBoot:
What different annotations have you implemented?
@SpringBootApplication
A single @SpringBootApplication annotation can be used to enable three features: @SpringBootConfiguration (a specialized @Configuration) marks the class as a source of bean definitions; @EnableAutoConfiguration enables Spring Boot's auto-configuration mechanism; and @ComponentScan enables @Component scanning on the package where the application is located.
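A minimal sketch of a Spring Boot entry point (assumes Spring Boot on the classpath; the class name is invented):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Equivalent to @SpringBootConfiguration + @EnableAutoConfiguration + @ComponentScan
@SpringBootApplication
public class DemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}
```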
3. In Spring Boot, how can you implement DI without a bean.xml file?
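One common answer is annotation-based configuration: mark classes with stereotype annotations and let component scanning wire them, with no bean.xml at all. A sketch (assumes Spring on the classpath; the class names are invented for illustration):

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.stereotype.Service;

@Component
class PaymentGateway {
    String pay() { return "paid"; }
}

@Service
class OrderService {
    private final PaymentGateway gateway;

    @Autowired  // constructor injection; no XML bean definition needed
    OrderService(PaymentGateway gateway) { this.gateway = gateway; }

    String checkout() { return gateway.pay(); }
}
// @SpringBootApplication (which includes @ComponentScan) picks both beans up automatically.
```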
4. Difference between @Autowired and @Qualifier, main significance of @Qualifier, @Primary?
The @Autowired annotation (which throws NoSuchBeanDefinitionException when no matching bean is found) provides fine-grained control over where and how autowiring should be done. This annotation can be used to autowire a bean on setter methods, constructors, properties, or methods with arbitrary names or multiple arguments. By default, it is a type-driven injection.
When you create more than one bean of the same type and want to wire only one of them to a property, you can use the @Qualifier annotation along with @Autowired to remove the ambiguity by specifying which exact bean should be wired.
Ex: Here we have two classes, Employee and EmpAccount. In EmpAccount, using @Qualifier, it is specified that the bean with id emp1 must be wired. (Without it, multiple candidates of the same type would cause a NoUniqueBeanDefinitionException.)
In the Spring framework, the @Primary annotation is used to give higher preference to a bean when there are multiple beans of the same type. The @Primary annotation may be used on any class directly or indirectly annotated with @Component, or on methods annotated with @Bean.
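The interplay of @Qualifier and @Primary can be sketched as follows (assumes Spring on the classpath; the interface and bean names are invented):

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Primary;
import org.springframework.stereotype.Component;

interface MessageSender { void send(String msg); }

@Component("emailSender")
@Primary                      // chosen by default when no qualifier is given
class EmailSender implements MessageSender {
    public void send(String msg) { /* ... */ }
}

@Component("smsSender")
class SmsSender implements MessageSender {
    public void send(String msg) { /* ... */ }
}

@Component
class NotificationService {
    @Autowired
    @Qualifier("smsSender")   // disambiguates: without this, @Primary (emailSender) wins
    private MessageSender sender;
}
```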
5. ExecutorService in multithreading.
We use the Executors.newSingleThreadExecutor() method to create an ExecutorService that uses a single worker thread for executing tasks. If a task is submitted for execution and the thread is currently busy executing another task, then the new task will wait in a queue until the thread is free to execute it.
The ExecutorService helps in maintaining a pool of
threads and assigns them tasks. It also provides the facility to queue up
tasks until there is a free thread available if the number of tasks is more
than the threads available.
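The single-thread behavior described above can be sketched as follows (the task bodies are invented for the example):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class SingleThreadDemo {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService executor = Executors.newSingleThreadExecutor();

        // Both tasks go to the same worker thread; the second waits in the queue
        executor.submit(() -> System.out.println("task 1"));
        executor.submit(() -> System.out.println("task 2"));

        executor.shutdown();                          // stop accepting new tasks
        executor.awaitTermination(5, TimeUnit.SECONDS);
        System.out.println("done");
    }
}
```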
6. Volatile Keyword in Java?
The volatile keyword is used to communicate memory contents between threads: a write to a volatile variable by one thread is guaranteed to be visible to subsequent reads by other threads.
public final class Singleton {
    private static volatile Singleton instance = null;

    private Singleton() {}

    public static Singleton getInstance() {
        if (instance == null) {
            synchronized (Singleton.class) {
                if (instance == null) {
                    instance = new Singleton();
                }
            }
        }
        return instance;
    }
}
7. Hibernate:
Why do we use it and what are its main features?
Hibernate is a Java framework that simplifies the development of Java applications that interact with a database. It is an open-source, lightweight ORM (Object Relational Mapping) tool. Hibernate implements the specifications of JPA (Java Persistence API) for data persistence.
·
It provides Simple Querying of data.
·
An application server is not required to
operate.
·
The complex associations of objects in the
database can be manipulated.
·
Database access is minimized with smart
fetching strategies.
·
It manages the mapping of Java classes to
database tables without writing any code.
·
In case of any required change in the database, only the properties of the XML mapping file need to change.
There are many features in Hibernate Java framework.
·
Relationship: Hibernate supports relationships like One-To-One, One-To-Many, Many-To-One, and Many-To-Many.
·
Support Association: Hibernate
supports association type of relationship: composition and aggregation.
·
Support Inheritance: In Hibernate, if we save a derived-class object, its base-class state is also stored in the database; that is, Hibernate supports the inheritance mechanism.
·
Support Primary Key: Hibernate can generate primary keys automatically while storing records in the database.
·
Support Composite Key: Hibernate can generate composite primary keys based on the provided metadata.
·
Support Validation: Hibernate supports domain-model validation, enabled by annotations. Hibernate has a sub-project called Hibernate Validator, which provides a rich set of validations.
·
Support Full-Text Search: Hibernate has a sub-project called Hibernate Search, which supports full-text search on domain-model objects.
·
Supports Collection (List/Set/Map): Hibernate
supports Java collections data structure: List, Set, Map.
·
Caching: Hibernate supports two levels of caching, first level and second level. The caching mechanism reduces the number of round trips between the application and the database, improving application performance and user experience.
·
HQL (Hibernate Query Language): Hibernate has its own query language, HQL, which is database independent.
·
No lock-in to database-specific SQL: Hibernate recommends not tying your project to the database by writing your own SQL queries; prefer not to write raw SQL in the project if you are using Hibernate.
·
No need for try-catch blocks: When we write JDBC code, we have to catch SQLException and transform it into another exception; since all JDBC exceptions are checked exceptions, we must write try-catch blocks and throws clauses everywhere. In Hibernate, we only have unchecked/runtime exceptions, so there is no need for try-catch blocks or throws clauses. Hibernate has a translator that converts checked/compile-time exceptions into unchecked/runtime exceptions.
·
Auto-Generation: When inserting a record, if the table does not exist in the database, JDBC raises an error like "table or view does not exist" and throws an exception; but Hibernate, if it does not find the table in the database, will create the table for us, provided this is configured in our metadata configuration file.
·
Auto-Generation Query on Console: If query logging is enabled, the generated SQL is printed on the console or to a log file, which helps a great deal when debugging an issue.
·
Support Annotation: Hibernate
supports annotations, apart from XML
·
Support many databases: Hibernate provides Dialect classes, so we do not need to write database-specific SQL in Hibernate; instead we use the methods provided by the API.
·
Pagination Support: Getting
pagination in hibernate is quite simple.
·
Hierarchical Data: Fetching hierarchical data in Hibernate is very simple; e.g. in a catalog system, you can pull catalog -> category -> product -> product variation in one go, which is very complex to do in plain JDBC.
·
Smart Query Generation: In practice, developers often cannot tune their own queries: there may not be enough time, the knowledge may be missing, or the focus is on finishing functionality and tuning is postponed indefinitely. Hibernate generates well-formed queries for us using its query-generation engine.
·
Support Query Criteria: This is a powerful component of Hibernate that enables a rich set of filters and projections. Generating your own query at runtime is difficult and time-consuming to test in plain JDBC, but with Hibernate's Criteria API it is easy to build conditional queries, especially for search functionality over three or four fields.
8.First level cache, second level cache in
hibernate?
The main
difference between the first level and second level cache in
Hibernate is that the first level is maintained at the Session level and
accessible only to the Session, while the second level cache is maintained at
the SessionFactory level and available to all Sessions.
9.Different Cache providers?
EHCache: It can cache in memory or on disk and clustered caching and it
supports the optional Hibernate query result cache.
OSCache: Supports caching to memory and disk in a single JVM with a rich
set of expiration policies and query cache support.
SwarmCache: A cluster cache based on JGroups. It uses clustered invalidation but doesn't support the Hibernate query cache.
JBoss Cache: A fully transactional replicated clustered cache also based on
the JGroups multicast library. It supports replication or invalidation,
synchronous or asynchronous communication, and optimistic and pessimistic
locking. The Hibernate query cache is supported.
10.Difference between Synchronous and Asynchronous
API?
Synchronous means that
you call a web service (or function or whatever) and wait until it returns -
all other code execution and user interaction is stopped until the call
returns.
Asynchronous means that you do not halt
all other operations while waiting for the web service call to return.
11. How did you use SOAP in a project?
12.WSDL and XSD?
XSD defines a
schema which is a definition of how an XML document can be structured. You can
use it to check that a given XML document is valid and follows the rules you've
laid out in the schema.
WSDL is an XML document that describes a web service. It shows which operations are available and how data should be structured to send to those operations. WSDL documents have an associated XSD that shows what is valid to put in a WSDL document.
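Validating an XML document against an XSD can be sketched with the JDK's built-in javax.xml.validation API (the schema and document here are toy examples; validate() throws SAXException if the document is invalid):

```java
import java.io.StringReader;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

public class XsdCheck {
    public static void main(String[] args) throws Exception {
        // A tiny schema declaring a single string element
        String xsd = "<?xml version=\"1.0\"?>"
                + "<xs:schema xmlns:xs=\"http://www.w3.org/2001/XMLSchema\">"
                + "<xs:element name=\"greeting\" type=\"xs:string\"/>"
                + "</xs:schema>";
        String xml = "<greeting>hello</greeting>";

        SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = factory.newSchema(new StreamSource(new StringReader(xsd)));
        Validator validator = schema.newValidator();
        validator.validate(new StreamSource(new StringReader(xml))); // throws if invalid
        System.out.println("document is valid");
    }
}
```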
13.Oracle:
What is inner join and outer join?
Inner Join: Returns
records that have matching values in both tables.
E.g. SELECT Orders.OrderID,
Customers.CustomerName
FROM Orders
INNER JOIN Customers ON Orders.CustomerID = Customers.CustomerID;
LEFT OUTER
JOIN:
Returns all records from the left table, and the matched records from the right
table.
E.g. SELECT Customers.CustomerName,
Orders.OrderID FROM Customers
LEFT JOIN Orders
ON Customers.CustomerID = Orders.CustomerID
ORDER BY Customers.CustomerName;
RIGHT (OUTER)
JOIN:
Returns all records from the right table, and the matched records from the left
table.
E.g. SELECT Orders.OrderID,
Employees.LastName, Employees.FirstName
FROM Orders
RIGHT JOIN Employees ON Orders.EmployeeID = Employees.EmployeeID
ORDER BY Orders.OrderID;
FULL OUTER JOIN / FULL JOIN: Returns all records when there is a match in either the left or the right table.
SELECT Customers.CustomerName,
Orders.OrderID
FROM Customers
FULL OUTER JOIN Orders ON Customers.CustomerID=Orders.CustomerID
ORDER BY Customers.CustomerName;
14.Stored procedure and Stored function
difference?
A function must return a value, but in a stored procedure it is optional; a procedure can return zero or n values.
Functions can have
only input parameters for it whereas Procedures can have input or output
parameters.
Functions can be
called from Procedure whereas Procedures cannot be called from a
Function.
Advance
Differences between Stored Procedure and Function in SQL Server
The procedure
allows SELECT as well as DML(INSERT/UPDATE/DELETE) statement in it whereas
Function allows only SELECT statement in it.
Procedures cannot be
utilized in a SELECT statement whereas Function can be embedded in a
SELECT statement.
Stored Procedures cannot be
used in the SQL statements anywhere in the WHERE/HAVING/SELECT section whereas Function
can be.
Functions that return
tables can be treated as another Rowset. This can be used in JOINs with
other tables.
Inline functions can be thought of as views that take parameters and can be used in JOINs and other rowset operations.
15.In previous project, how you save documents
in Documentum?
Documentum
is like a normal filesystem (hard drive) on steroids. Instead of storing your
files on your own hard disk, you store them inside the Documentum system. This
allows people to access your files if they need to and allows you to access
their files. It's kind of like a network file server, but much fancier.
16.What is Security Encryption?
Encryption “attempts to
make information unreadable by anyone who is not explicitly authorized to view
that data”. People or devices can be authorized to access encrypted data in
many ways, but typically this access is granted via passwords or decryption
keys.
17.Difference between Encryption and Encoding?
Encryption is the process of securely encoding data in such a way that only authorized users with a key or password can decrypt the data to reveal the original.
Encoding, by contrast, transforms data into another publicly known format (e.g. Base64) and provides no security, since anyone who knows the scheme can reverse it.
18. What are different keywords used in DB for SELECT statements?
SELECT          - Selects data from a database
SELECT DISTINCT - Selects only distinct (different) values
SELECT INTO     - Copies data from one table into a new table
SELECT TOP      - Specifies the number of records to return in the result set
19.Lambda expressions?
A lambda expression (or function) is just an anonymous function, i.e., a function with no name and not bound to an identifier. Lambda expressions are written exactly in the place where they are needed, typically as a parameter to some other function.
Syntax:
(parameters)->expression
(parameters)->{statements;}
()->expression
(x,y)->x+y
20.Messaging queue?
A message
queue provides a lightweight buffer which temporarily stores messages, and
endpoints that allow software components to connect to the queue in order to
send and receive messages. The messages are usually small, and can be
things like requests, replies, error messages, or just plain information.
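The same buffer-between-producer-and-consumer idea can be sketched in-process with a BlockingQueue (a toy stand-in for a real broker such as RabbitMQ or SQS; the message names are invented):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class QueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // A bounded buffer that temporarily stores messages
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(10);

        Thread producer = new Thread(() -> {
            try {
                queue.put("request-1");   // blocks if the buffer is full
                queue.put("request-2");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();

        // Consumer end: take() blocks until a message arrives
        System.out.println(queue.take());  // request-1
        System.out.println(queue.take());  // request-2
        producer.join();
    }
}
```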
21. Condition-based annotations?
Using the Spring @Conditional annotation you can conditionally register a component. With @Conditional, you specify a condition, and the component is registered only if the condition evaluates to true. To specify the condition, you implement the org.springframework.context.annotation.Condition interface.
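A sketch of such a condition (assumes Spring on the classpath; the property name and bean are invented for illustration):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Condition;
import org.springframework.context.annotation.ConditionContext;
import org.springframework.context.annotation.Conditional;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.type.AnnotatedTypeMetadata;

// Condition: register the bean only when a property has a given value
class OnDevProperty implements Condition {
    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        return "dev".equals(context.getEnvironment().getProperty("app.mode"));
    }
}

@Configuration
class DevConfig {
    @Bean
    @Conditional(OnDevProperty.class)   // bean exists only if the condition returns true
    public String devBanner() {
        return "running in dev mode";
    }
}
```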
22.Java Threads and future classes?
Extending the Thread class will make your class unable to extend other classes, because of single inheritance in Java; however, it gives you a simpler code structure. If you implement Runnable, you gain better object-oriented design and consistency, and avoid the single-inheritance problem.
If you just want to achieve the basic functionality of a thread, you can simply implement the Runnable interface and override the run() method. But if you want to do something more with the thread object, since Thread has other methods that are not available in the Runnable interface, then you may prefer to extend the Thread class.
Signature of Thread:
public class Thread extends Object implements Runnable {}
Thread Class Priority Constants
MAX_PRIORITY  - the maximum priority that a thread can have
MIN_PRIORITY  - the minimum priority that a thread can have
NORM_PRIORITY - the default priority that a thread has
The default priority of a thread is 5 (NORM_PRIORITY). The value of MIN_PRIORITY is 1 and the value of MAX_PRIORITY is 10.
The Thread class also defines many methods for managing threads:
setName()        - give the thread a name
getName()        - return the thread's name
getPriority()    - return the thread's priority
isAlive()        - test if the thread is still running
join()           - wait for a thread to end
run()            - entry point for the thread
sleep()          - suspend the thread for a specified time
start()          - start the thread by calling its run() method
activeCount()    - returns an estimate of the number of active threads in the current thread's thread group and its subgroups
checkAccess()    - determines if the currently running thread has permission to modify this thread
currentThread()  - returns a reference to the currently executing thread object
dumpStack()      - prints a stack trace of the current thread to the standard error stream
getId()          - returns the identifier of this thread
getState()       - returns the state of this thread
getThreadGroup() - returns the thread group to which this thread belongs
interrupt()      - interrupts this thread
interrupted()    - tests whether the current thread has been interrupted
isDaemon()       - tests if this thread is a daemon thread
isInterrupted()  - tests whether this thread has been interrupted
setDaemon(boolean on) - marks this thread as either a daemon thread or a user thread
setPriority(int newPriority) - changes the priority of this thread
yield()          - a hint to the scheduler that the current thread is willing to yield its current use of a processor
23. Runnable vs Callable?
Callable has a call() method, while Runnable has a run() method. call() returns a value, while run() does not return any value. call() can throw checked exceptions, while run() cannot. A Callable is put into the task queue with the submit() method, while a Runnable can also be put into the task queue with the execute() method.
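The differences above can be sketched side by side (the task bodies are invented for the example):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class CallableDemo {
    public static void main(String[] args) throws Exception {
        ExecutorService executor = Executors.newSingleThreadExecutor();

        Runnable task = () -> System.out.println("runnable: no result");
        Callable<Integer> compute = () -> 2 + 3;  // returns a value, may throw checked exceptions

        executor.execute(task);                   // Runnable: fire and forget
        Future<Integer> future = executor.submit(compute); // Callable: result via Future
        System.out.println("callable result: " + future.get()); // 5

        executor.shutdown();
    }
}
```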
24. How Spring security works and
GrantedAuthority and different authorities?
A granted authority in Spring Security is a "permission" or "right" given to a role. Some examples of granted authorities are:
1) READ_AUTHORITY 2) WRITE_AUTHORITY 3) UPDATE_AUTHORITY 4) DELETE_AUTHORITY
Eg:
public User(String username, String password, boolean enabled,
        boolean accountNonExpired, boolean credentialsNonExpired,
        boolean accountNonLocked,
        Collection<? extends GrantedAuthority> authorities)
Roles can be seen as coarse-grained GrantedAuthorities, represented as a String with the prefix "ROLE_". We can use a role directly in a Spring Security application with hasRole("CUSTOMER"). For simple applications, you can think of roles as GrantedAuthorities. Here are some examples of Spring Security roles:
1) ROLE_ADMIN
2) ROLE_MANAGER
3) ROLE_USER
We can also use roles as containers for authorities or privileges. This approach provides flexibility to map roles based on business rules. Let's look at a few examples to understand it clearly.
A user with the ROLE_ADMIN role has the authority to READ, DELETE, WRITE, and UPDATE.
A user with the ROLE_USER role has the authority to READ only.
A user with ROLE_MANAGER can perform READ, WRITE, and UPDATE operations.
Using Granted
Authority vs Role in Spring Security
Spring Security uses hasRole() and hasAuthority() almost interchangeably. With Spring Security 4 this is more consistent, and we should also be consistent in our approach when using the hasRole() and hasAuthority() methods. Keep in mind the following simple rules:
Always add the ROLE_ prefix when using the hasAuthority() method (e.g. hasAuthority("ROLE_CUSTOMER")).
When using hasRole(), do not add the ROLE_ prefix, as it is added automatically by Spring Security (hasRole("CUSTOMER")).
25.How Spring boot connect to DB?
To access a relational database using JdbcTemplate in a Spring Boot application, we need to add the Spring Boot Starter JDBC dependency to our build configuration file. Then, if you @Autowired the JdbcTemplate, Spring Boot automatically connects to the database and sets the DataSource for the JdbcTemplate object.
How does Spring Boot connect to an external DB?
Updating the Spring Boot project step by step:
Step 1 - Add a dependency for your database connector to pom.xml. ...
Step 2 - Remove the H2 dependency from pom.xml, or at least make its scope test. ...
Step 3 - Set up your MySQL database. ...
Step 4 - Configure your connection to your database. ...
Step 5 - Restart and you are ready!
How does a Spring Boot application connect to a database using JDBC?
How to use JDBC with Spring Boot:
Step 1 - Create the database. Suppose that we have a table named books in a schema named bookshop. ...
Step 2 - Create the Spring Boot project. I use the Eclipse IDE. ...
Step 3 - Configure the database connection properties. ...
Step 4 - Code the Java model class. ...
Step 5 - Code the Spring Boot JDBC application.
How do I know if my Spring Boot app is connected to a database?
The easiest way to test the database connection from Spring Boot is to start the application and check the debug logs. So, start the application in debug mode. To see the debug logs from the Hikari connection pool, set your logger in Spring Boot to debug mode.
26. How two or more Microservices communicate
with each other?
Another
communication pattern we can leverage in a microservice architecture is message-based
communication. Unlike HTTP communication, the services involved do not
directly communicate with each other. Instead, the services push messages to a
message broker that other services subscribe to.
27.How two or more Microservices
communicate with each other asynchronously?
In asynchronous communication, microservices use asynchronous messages or HTTP polling to communicate with other microservices, but the client request is served right away.
Microservices that communicate in an asynchronous
manner can use a protocol such as AMQP to exchange messages via a message
broker. The intended service receives the message in its own time. The sending
service is not locked to the broker. It simply fires and forgets.
28.Intermediates and Terminals in
java8 Streams?
A
Stream supports several operations, and these operations are divided into
intermediate and terminal operations.
The distinction between these operations is that an intermediate operation is lazy while a terminal operation is not. When you invoke an intermediate operation on a stream, the operation is not executed immediately. It is executed only when a terminal operation is invoked on that stream. In a way, an intermediate operation is memorized and is recalled as soon as a terminal operation is invoked. You can chain multiple intermediate operations, and none of them will do anything until you invoke a terminal operation. At that time, all the intermediate operations that you invoked earlier will be executed along with the terminal operation.
All intermediate operations return a Stream (so they can be chained), while terminal operations do not. Intermediate operations are:
filter(Predicate<T>), map(Function<T>), flatMap(Function<T>), sorted(Comparator<T>), peek(Consumer<T>), distinct(), limit(long n), skip(long n)
Terminal operations produce a non-stream result (they cannot be chained), such as a primitive value, a collection, or no value at all. Terminal operations are:
forEach, forEachOrdered, toArray, reduce, collect, min, max, count, anyMatch, allMatch, noneMatch, findFirst, findAny
In Java 8, Stream.reduce() combines the elements of a stream to produce a single value. A simple sum operation using reduce:
List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5, 6);
int result = numbers.stream().reduce(0, (subtotal, element) -> subtotal + element);
assertThat(result).isEqualTo(21);

int result2 = numbers.stream().reduce(0, Integer::sum);
assertThat(result2).isEqualTo(21);

List<String> letters = Arrays.asList("a", "b", "c", "d", "e");
String concatenated = letters.stream().reduce("", String::concat);
assertThat(concatenated).isEqualTo("abcde");

List<Integer> ages = Arrays.asList(25, 30, 45, 28, 32);
int computedAges = ages.parallelStream().reduce(0, (a, b) -> a + b, Integer::sum);
29. In Java 8, find the list of duplicated elements in 10,15,8,49,25,98,98,32,15
ArrayList<Integer> numbersList = new ArrayList<>(Arrays.asList(10, 15, 8, 49, 25, 98, 98, 32, 15));
System.out.println(numbersList);

// remove duplicated elements
List<Integer> listWithoutDuplicates = numbersList.stream().distinct().collect(Collectors.toList());
System.out.println(listWithoutDuplicates);

// print the duplicated elements
numbersList.stream()
    .filter(i -> Collections.frequency(numbersList, i) > 1)
    .collect(Collectors.toSet())
    .forEach(System.out::println);
30. Employee class - write Java 8 syntax to sort a list of Employee by age/name?
Collections.sort(employees, Comparator.comparing(Employee::getFname)
    .thenComparingInt(Employee::getAge));

List<Employee> list = new ArrayList<Employee>();
list.stream()
    .sorted(Comparator.comparing(Employee::getName).thenComparing(Employee::getAge))
    .collect(Collectors.toList())
    .forEach(System.out::println);
31. Given a list of integers, find all the numbers starting with 1 using stream functions: 10,15,8,49,25,98,32
List<Integer> myList = Arrays.asList(10, 15, 8, 49, 25, 98, 32);
// Convert each integer to a String, then filter
myList.stream()
    .map(s -> s + "")
    .filter(s -> s.startsWith("1"))
    .forEach(System.out::println);
32. What is the importance of the hashCode() and equals() methods?
The equals() and hashCode() methods are two important methods provided by the Object class for comparing objects. Since Object is the parent class of all Java classes, every object inherits their default implementations.
You must override hashCode() in every class that overrides equals(). Failure to do so violates the general contract of Object.hashCode(), which prevents your class from functioning properly with hash-based collections, including HashMap, HashSet, and Hashtable.
hashCode() also helps programs run faster: comparing two objects by hash code is far cheaper than a full equals() comparison, which is why hash-based data structures like HashMap internally organize elements into an array of buckets indexed by hash code.
The difference is that equals() compares two objects for equality, while hashCode() is used in hashing to decide which bucket an object belongs to.
To achieve a fully working custom equality mechanism, it is mandatory to override hashCode() each time you override equals(). Follow the rules below and your custom equality mechanism will not leak:
- If two objects are equal, they MUST have the same hash code.
- If two objects have the same hash code, it does not mean that they are equal.
- Overriding equals() alone will break your code with hashing data structures such as HashSet, HashMap, and Hashtable.
- Overriding hashCode() alone does not make Java ignore memory addresses when comparing two objects.
· Whenever it is invoked on the same object more than
once during an execution of a Java application, the hashCode method must
consistently return the same integer, provided no information used in equals
comparisons on the object is modified. This integer need not remain consistent
from one execution of an application to another execution of the same
application.
· If two objects are equal according to the equals(Object) method,
then calling the hashCode method on each of the two objects must produce the
same integer result.
· It is not required that two objects that are unequal according to the equals(java.lang.Object) method produce distinct hashCode results. However, the programmer should be aware that producing distinct integer results for unequal objects may improve the performance of hash tables.
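To make the contract concrete, here is a minimal sketch (the Point class and its fields are illustrative, not from the source) that overrides both methods so the object behaves correctly in a HashSet:

```java
import java.util.HashSet;
import java.util.Objects;
import java.util.Set;

public class PointDemo {
    static final class Point {
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof Point)) return false;
            Point p = (Point) o;
            return x == p.x && y == p.y;   // logical equality on the fields
        }

        @Override
        public int hashCode() {
            return Objects.hash(x, y);     // equal objects -> equal hash codes
        }
    }

    // Equal points collapse to a single entry only because BOTH methods are overridden.
    public static int distinctCount() {
        Set<Point> set = new HashSet<>();
        set.add(new Point(1, 2));
        set.add(new Point(1, 2));  // duplicate by value
        set.add(new Point(3, 4));
        return set.size();
    }

    public static void main(String[] args) {
        System.out.println(distinctCount()); // prints 2
    }
}
```

If only equals() were overridden here, the duplicate Point(1, 2) would land in a different bucket and the set size would be 3.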
33. What is the hierarchy of the collection framework?
The hierarchy of the collection framework consists of the core interfaces Collection, List, Set, and Map (Map sits in a separate hierarchy from Collection), plus two specialized interfaces named SortedSet and SortedMap for sorting. All the interfaces and classes of the collection framework are in the java.util package.
34. How does HashMap work internally?
- equals(): checks the equality of two objects; HashMap uses it to compare keys. It is a method of the Object class and can be overridden. If you override equals(), it is mandatory to also override hashCode().
- hashCode(): a method of the Object class whose default implementation returns an identity-based integer. HashMap spreads this value and uses it to compute the bucket index, which is the address of the element inside the map. The hash code of a null key is treated as 0.
- Buckets: the array of nodes is called the bucket table. Each bucket holds a linked-list-like chain of nodes, so more than one node can share the same bucket. The table's capacity can change over time.
How does HashMap work internally in Java 8?
o In Java 8, HashMap replaces the linked list in a bucket with a balanced tree once the number of entries in that bucket reaches a threshold (TREEIFY_THRESHOLD = 8). Separately, the default load factor of 0.75 (e.g. 12/16) controls when the whole table is resized. While converting the list to a tree, the hash code is used as a branching variable. This JDK 8 change applies to HashMap, LinkedHashMap, and ConcurrentHashMap.
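A small sketch of the bucket mechanics described above (the Key class is illustrative): two distinct keys that collide on hashCode() still produce separate entries, because HashMap falls back to equals() within the bucket:

```java
import java.util.HashMap;
import java.util.Map;

public class BucketDemo {
    // Two distinct keys deliberately forced into the same bucket.
    static final class Key {
        final String name;
        Key(String name) { this.name = name; }

        @Override
        public int hashCode() { return 42; }  // every Key collides

        @Override
        public boolean equals(Object o) {
            return o instanceof Key && ((Key) o).name.equals(name);
        }
    }

    public static int entryCount() {
        Map<Key, String> map = new HashMap<>();
        map.put(new Key("a"), "first");
        map.put(new Key("b"), "second"); // same bucket, different key -> new entry
        map.put(new Key("a"), "third");  // same bucket, equal key -> overwrite
        return map.size();
    }

    public static void main(String[] args) {
        System.out.println(entryCount()); // prints 2
    }
}
```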
35. How is Spring Boot different from Spring and Spring MVC?
S.No. | Spring MVC | Spring Boot
1. | Spring MVC is a Model-View-Controller based web framework widely used to develop web applications. | Spring Boot is built on top of the conventional Spring framework and is widely used to develop REST APIs.
2. | With Spring MVC, we need to build the configuration manually. | With Spring Boot, there is no need to build the configuration manually.
3. | In Spring MVC, a deployment descriptor is required. | In Spring Boot, there is no need for a deployment descriptor.
4. | Spring MVC specifies each dependency separately. | Spring Boot wraps the dependencies together in a single unit.
5. | The Spring MVC framework consists of four components: Model, View, Controller, and Front Controller. | Spring Boot has four main layers: Presentation Layer, Data Access Layer, Service Layer, and Integration Layer.
6. | It takes more time in development. | It reduces development time and increases productivity.
Spring: The Spring Framework is the most popular application development framework for Java. Its main feature is Dependency Injection, or Inversion of Control (IoC), which lets us develop loosely coupled applications. Plain Spring is a better fit when the application's type and characteristics are purely defined.
36. What is the difference between @RestController and @Controller? Are @Service, @Component, @Controller, and @RestController interchangeable in a Spring Boot context?
@RestController: a combination of @Controller and @ResponseBody, used for creating RESTful controllers. It converts the response to JSON or XML and ensures that the data returned by each method is written straight into the response body instead of resolving a view template.
@Controller: a class-level annotation and a specialization of @Component. It marks a class as a web request handler and is often used to serve web pages. By default, a handler method returns a string naming the view to render (or redirect to). It is mostly used together with the @RequestMapping annotation.
Scenario-Based Interview Questions:
1) class A {
       A() {}
   }
   class B extends A {
       B() {
           super();
       }
   }
   class C extends B {}
   class Main {
       public static void main(String[] args) {
           A a1 = new A(); // works fine
           A a2 = new B(); // also works fine: B() chains to A() via super()
       }
   }
Note: neither line throws. Constructor chaining always runs the superclass constructor first; an exception would occur only if a constructor in the chain explicitly threw one.
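A minimal sketch of the chaining order (the log list is illustrative): constructing the subclass runs the superclass constructor first, and no exception is thrown:

```java
import java.util.ArrayList;
import java.util.List;

public class ChainDemo {
    static final List<String> log = new ArrayList<>();

    static class A {
        A() { log.add("A"); }
    }
    static class B extends A {
        B() {
            super();          // explicit here, but inserted implicitly anyway
            log.add("B");
        }
    }
    static class C extends B { } // implicit C() chains to B()

    public static List<String> constructC() {
        log.clear();
        new C();              // runs A(), then B(), then the implicit C()
        return log;
    }

    public static void main(String[] args) {
        System.out.println(constructC()); // prints [A, B]
    }
}
```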
Question 2: Given the input [[2, 3, 5], [7, 11, 13], [17, 19, 23]], produce the output [2, 3, 5, 7, 11, 13, 17, 19, 23].

List<List<Integer>> list = Arrays.asList(
    Arrays.asList(2, 3, 5), Arrays.asList(7, 11, 13), Arrays.asList(17, 19, 23));

List<Integer> listOfAllIntegers = list.stream()
    .flatMap(x -> x.stream())
    .collect(Collectors.toList());
System.out.println(listOfAllIntegers); // [2, 3, 5, 7, 11, 13, 17, 19, 23]

// Similar on String data:
String[][] dataArray = new String[][]{{"a", "b"}, {"c", "d"}, {"e", "f"}, {"g", "h"}};
List<String> listOfAllChars = Arrays.stream(dataArray)
    .flatMap(x -> Arrays.stream(x))
    .collect(Collectors.toList());
System.out.println(listOfAllChars);
3) How would you design a search API based on n parameters?
A sketch (the criteria map and searchService are illustrative placeholders):

@RestController
@RequestMapping("/search")
public class MySearch {
    @PostMapping
    public List<?> searchAPI(@RequestBody Map<String, Object> criteria) {
        // build a dynamic query from whichever of the n parameters are present
        return searchService.search(criteria); // searchService is an illustrative dependency
    }
}
4) What are the rules for method overriding (Java 7)?
- An overriding method can only be written in a subclass, not in the same class.
- The argument list must be exactly the same as that of the overridden method.
- The return type must be the same as, or a subtype of, the return type declared in the original overridden method in the superclass.
- The overriding method must not throw new or broader checked exceptions; unchecked exceptions are unrestricted, as in this example:

class A {
    public void display() throws NullPointerException {
        System.out.println("Class A");
    }
}
class B extends A {
    public void display() throws RuntimeException { // allowed: unchecked
        System.out.println("Class B");
    }
}
5) If you have an array and you want to add a number x to every element, how do you do it? And how do you sum the elements of an array in Java 8?

int[] a = {10, 20, 30, 40, 50};
int sum = IntStream.of(a).sum();
System.out.println("The sum is " + sum);

int[] arr = {1, 2, 3, 4};
int sum = Arrays.stream(arr).sum(); // 10

int[] array = new int[]{1, 2, 3, 4, 5};
int sum = IntStream.of(array).reduce(0, (a, b) -> a + b);
System.out.println("The summation of array is " + sum);
System.out.println("Another way to find summation: " + IntStream.of(array).sum());

List<Integer> integers = Arrays.asList(1, 2, 3, 4, 5);
Integer total = integers.stream().collect(Collectors.summingInt(Integer::intValue));
System.out.println(total);

List<Integer> list = Arrays.asList(10, 12, 83, 46, 59);
Integer sumOfElements = list.stream().mapToInt(Integer::intValue).sum();
System.out.println(sumOfElements);
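The first half of the question (adding x to every element) is not covered by the sum snippets; a minimal sketch using IntStream.map:

```java
import java.util.Arrays;
import java.util.stream.IntStream;

public class AddXDemo {
    // Returns a NEW array with x added to every element; the input is untouched.
    public static int[] addToEach(int[] input, int x) {
        return IntStream.of(input).map(n -> n + x).toArray();
    }

    public static void main(String[] args) {
        int[] result = addToEach(new int[]{10, 20, 30}, 5);
        System.out.println(Arrays.toString(result)); // prints [15, 25, 35]
    }
}
```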
37. Microservices: given an employee domain, how would you decide which functionality belongs to which microservice? In which scenarios would you prefer microservices?
Typical scenarios where you can consider a microservice style of architecture: migrating a monolithic application because of improvements needed in scalability, manageability, agility, or speed of delivery; or re-platforming a legacy application by transforming its functions/modules into microservices.
Microservices design considerations:
- Single Responsibility Principle: a microservice should have a single responsibility so that it is easy to maintain and reusable.
- Stateless design.
- Programming frameworks.
- Data handling.
- Secrets management.
- Dependency graph.
- Versioning.
- Containers.
38. Microservices: which REST client have you used? Postman.
Reference: Microservice Architecture pattern (microservices.io)
39. What are profiles in Spring Boot?
Spring Boot allows you to define profile-specific property files in the form application-{profile}.properties. It automatically loads the properties in the application.properties file for all profiles, and the ones in profile-specific property files only for the specified profile.
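A sketch of the layout (property names and values are illustrative):

```properties
# application.properties: always loaded
spring.application.name=employee-service

# application-dev.properties: only when the 'dev' profile is active
server.port=8081
logging.level.root=DEBUG

# application-prod.properties: only when the 'prod' profile is active
server.port=8080
logging.level.root=WARN
```

The active profile is chosen with spring.profiles.active=dev (in application.properties, as a command-line argument, or as an environment variable).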
40. On what criteria do you decide that a particular API should be under a given microservice?
1. The Single Responsibility Principle: just like with code, where a class should have only a single reason to change, microservices should be modeled in a similar fashion. Building bloated services that are subject to change for more than one business context is bad practice.
2. Have a separate data store(s)
for your microservice
It defeats the purpose of having microservices if you
are using a monolithic database that all your microservices share. Any change
or downtime to that database would then impact all the microservices that use
the database. Choose the right database for your microservice needs, customize
the infrastructure and storage to the data that it maintains, and let it be
exclusive to your microservice. Ideally, any other microservice that needs
access to that data would only access it through the APIs that the microservice
with write access has exposed.
3. Use asynchronous communication
to achieve loose coupling
To avoid building a mesh of tightly coupled
components, consider using asynchronous communication between
microservices.
a. Make calls to your dependencies asynchronously,
example below.
Example: Let’s say you have a Service A that calls
Service B. Once Service B returns a response, Service A returns success to the
caller. If the caller is not interested in Service B’s output, then Service A
can asynchronously invoke Service B and instantly respond with a success to the
caller.
b. An even better option is to use events for
communicating between microservices. Your microservice would publish an event
to a message bus either indicating a state change or a failure and whichever
microservice is interested in that event, would pick it up and process
it.
Example: in a pizza ordering system, sending a notification to the customer once their order is captured, or status messages as the order gets fulfilled and delivered, can happen over asynchronous communication: a notification service listens for an "order submitted" event and processes the notification to the customer.
4. Fail fast by using a circuit
breaker to achieve fault tolerance
If your microservice is dependent on another system to
provide a response, and that system takes forever to respond, your overall
response SLAs will be impacted. To avoid this scenario and quickly respond, one
simple microservices best practice you can follow is to use a circuit breaker
to timeout the external call and return a default response or an error. The
Circuit Breaker pattern is explained in the references below. This will isolate
the failing services that your service is dependent on without causing cascade
failures, keeping your microservice in good health. You can choose to use
popular products like Hystrix that
Netflix developed. This is better than using the HTTP CONNECT_TIMEOUT and
READ_TIMEOUT settings as it does not spin up additional threads beyond what’s
been configured.
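The fail-fast idea can be sketched without any library (a toy state machine, not Hystrix's actual implementation; the threshold and names are illustrative, and a real breaker would also add a half-open state with a retry timer):

```java
import java.util.function.Supplier;

public class CircuitBreaker {
    private final int failureThreshold;
    private int consecutiveFailures = 0;
    private boolean open = false;

    public CircuitBreaker(int failureThreshold) {
        this.failureThreshold = failureThreshold;
    }

    // Runs the call unless the circuit is open, in which case the fallback
    // is returned immediately instead of waiting on the unhealthy dependency.
    public <T> T call(Supplier<T> remoteCall, T fallback) {
        if (open) return fallback;               // fail fast
        try {
            T result = remoteCall.get();
            consecutiveFailures = 0;             // dependency is healthy again
            return result;
        } catch (RuntimeException e) {
            if (++consecutiveFailures >= failureThreshold) open = true;
            return fallback;
        }
    }

    public boolean isOpen() { return open; }

    public static void main(String[] args) {
        CircuitBreaker cb = new CircuitBreaker(2);
        cb.call(() -> { throw new RuntimeException("timeout"); }, "default");
        cb.call(() -> { throw new RuntimeException("timeout"); }, "default");
        // After 2 consecutive failures the breaker opens; calls now skip the dependency.
        System.out.println(cb.isOpen());                      // prints true
        System.out.println(cb.call(() -> "live", "default")); // prints default
    }
}
```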
5. Proxy your microservice requests
through an API Gateway
Instead of every microservice in the system performing
the functions of API authentication, request / response logging, and
throttling, having an API gateway doing these for you upfront will add a lot of
value. Clients calling your microservices will connect to the API Gateway
instead of directly calling your service. This way you will avoid making all
those additional calls from your microservice and the internal URLs of your service
would be hidden, giving you the flexibility to redirect the traffic from the
API Gateway to a newer version of your service. This is even more necessary
when a third party is accessing your service, as you can throttle the incoming
traffic and reject unauthorized requests from the API gateway before they reach
your microservice. You can also choose to have a separate API gateway that
accepts traffic from external networks.
6. Ensure your API changes are
backwards compatible
You can safely introduce changes to your API and release them fast as long as they don't break existing callers. One option is to notify your callers and have them sign off on your changes via integration testing; however, this is expensive, as all the dependencies need to line up in one environment, and the coordination will slow you down. A better option is to adopt contract testing for your APIs: the consumers of your APIs provide contracts describing the response they expect, and you, as the provider, integrate those contract tests into your builds so they safeguard against breaking changes. Consumers can test against the stubs you publish as part of their own builds. This way you can go to production faster, independently testing your contract changes.
7. Version your microservices for
breaking changes
It's not always possible to make backwards compatible
changes. When you are making a breaking change, expose a new version of your
endpoint while continuing to support older versions. Consumers can choose to
use the new version at their convenience. However, having too many versions of
your API can create a nightmare for those maintaining the code. Hence, have a
disciplined approach to deprecate older versions by working with your clients
or internally rerouting the traffic to the newer versions.
8. Have dedicated infrastructure
hosting your microservice
You can have the best designed microservice meeting
all the checks, but with a bad design of the hosting platform it would still
behave poorly. Isolate your microservice infrastructure from other components
to get fault isolation and best performance. It is also important to isolate
the infrastructure of the components that your microservice depends on.
Example: In the pizza order example above, let's say
the inventory microservice uses an inventory database. It is not only important
for the Inventory Service to have dedicated host machines, but also the
inventory database needs to have dedicated host machines.
9. Create a separate release train
Your microservice needs to have its own separate
release vehicle which is not tied to other components within your organization.
This way you are not stepping on each other’s toes and wasting time
coordinating with multiple teams.
10. Create Organizational
Efficiencies
While microservices give you the freedom to develop
and release independently, certain standards need to be followed for cross
cutting concerns so that every team doesn’t spend time creating unique
solutions for these. This is very important in a distributed architecture such
as microservices, where you need to be able to connect all the pieces of the
puzzle to see a holistic picture. Hence, enterprise solutions are necessary for
API security, log aggregation, monitoring, API documentation, secrets
management, config management, distributed tracing, etc.
41. Are streams faster than collections?
Streams are more about coding convenience and safety; a convenience/speed trade-off is at work. In simple micro-benchmarks, a stream pipeline can actually be slower than an equivalent loop over a collection, because time is spent on stream initialization and passing values through the pipeline.
Collections are used to store and group data in a particular data structure such as a List, Set, or Map. Streams, by contrast, are used to perform complex data-processing operations (filtering, matching, mapping, etc.) on stored data such as arrays, collections, or I/O resources.
42. What are the benefits of streams?
There are many benefits to using streams in Java: the ability to write functions at a more abstract level, which can reduce code bugs; compacting logic into fewer, more readable lines of code; and the ease they offer for parallelization.
43. How do you handle timeouts in a REST client? How will you add a timeout?
If a subflow for an operation is processing a message and does not respond to the client within the expected time limit, the message is routed to the Timeout error handler. The Timeout error handler can then pass a response back to the client to inform it that the operation has timed out.
The default timeout is 10 seconds; the minimum is 1 millisecond and the maximum is 120 seconds (these particular limits are from Salesforce Apex HTTP callouts). If the callout is timing out, try increasing the timeout on the HTTP request.
How do I set a timeout in a browser fetch API call?
Use an AbortController with a timer, e.g. const controller = new AbortController(); const timeout = setTimeout(() => controller.abort(), ms);
- If the timeout is reached before the resource is fetched, the fetch is aborted.
- If the resource is fetched before the timeout is reached, the timeout is cleared.
- If the input signal is aborted, the fetch is aborted and the timeout is cleared.
RestTemplate default timeout:
private int connectTimeout = -1;
private int readTimeout = -1;
By default, RestTemplate uses the timeout behavior of the JDK installed on the machine, which is infinite if not overridden. To override the default JVM timeout, we can pass these properties during JVM start.
How do you increase the client timeout (SSH example)?
Set the keep-alive options in the client configuration file: log in to the client machine and open the /etc/ssh/ssh_config file to set the necessary parameter values to increase the SSH connection timeout. The ServerAliveInterval and ServerAliveCountMax parameters are set to increase the connection timeout.
What is the default timeout for RestTemplate?
Infinite. By default, RestTemplate uses SimpleClientHttpRequestFactory, which in turn uses HttpURLConnection.
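Since SimpleClientHttpRequestFactory ultimately delegates to HttpURLConnection, the underlying knobs can be sketched with the JDK alone (the URL is illustrative; nothing is sent over the network until connect() is called):

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class TimeoutDemo {
    public static HttpURLConnection configure(String url) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setConnectTimeout(2000); // ms allowed to establish the TCP connection
        conn.setReadTimeout(5000);    // ms allowed to wait for data once connected
        return conn;                  // 0 would mean "infinite", the JDK default
    }

    public static void main(String[] args) throws Exception {
        HttpURLConnection conn = configure("http://example.com/");
        System.out.println(conn.getConnectTimeout()); // prints 2000
        System.out.println(conn.getReadTimeout());    // prints 5000
    }
}
```

In Spring, the equivalent is creating a SimpleClientHttpRequestFactory, calling setConnectTimeout/setReadTimeout on it, and passing it to the RestTemplate constructor.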
44. How are asynchronous calls handled in Spring?
Simply put, annotating a method of a bean with @Async makes it execute in a separate thread; in other words, the caller does not wait for the completion of the called method. One interesting aspect of Spring is that the framework's event support also handles async processing if necessary.
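The "caller does not wait" behavior can be illustrated with plain CompletableFuture, which is also a common return type for @Async methods (a sketch of the semantics, not Spring's actual proxy machinery):

```java
import java.util.concurrent.CompletableFuture;

public class AsyncDemo {
    // Analogous to an @Async service method: the work runs on another thread
    // and the caller immediately gets a future instead of blocking.
    public static CompletableFuture<String> fetchReport() {
        return CompletableFuture.supplyAsync(() -> {
            // slow work would happen here, off the caller's thread
            return "report-ready";
        });
    }

    public static void main(String[] args) {
        CompletableFuture<String> future = fetchReport(); // returns at once
        System.out.println("caller continues immediately");
        System.out.println(future.join()); // blocks only when the result is needed
    }
}
```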
Hystrix is a library that controls the interaction between microservices to provide latency and fault tolerance. Additionally, it makes sense to modify the UI to let the user know that something might not have worked as expected or would take more time.
Fault tolerance can be achieved with the help of a circuit breaker. It is a pattern that wraps requests to external services and detects when they fail. If a failure is detected, the circuit breaker opens. All the subsequent requests immediately return an error instead of making requests to the unhealthy service. It monitors and detects the service which is down and misbehaves with other services. It rejects calls until it becomes healthy again.
How to process bulk records to apache Kafka?
45. How does Hibernate transaction management work?
Transaction is an interface in the org.hibernate package, associated with the Session. Within a transaction, if any single step fails, the complete transaction fails. A transaction can be described with the ACID properties.
Example:

Session session = null;
Transaction tx = null;
try {
    session = sessionFactory.openSession();
    tx = session.beginTransaction();
    // some action
    tx.commit();
} catch (Exception ex) {
    if (tx != null) tx.rollback();
} finally {
    if (session != null) session.close();
}
46. How do you roll back a transaction?
ROLLBACK is the SQL command used for reverting changes performed by a transaction. When a ROLLBACK command is issued, it reverts all the changes since the last COMMIT or ROLLBACK.
How you will create new transactions and @transactional ?
Hibernate deals with database specific
transactions, whereas spring provides a general transaction management service.
@Transactional is a nice way of configuring transaction management behavior.
Transactions : -
Transactions are
basically units of work (ie changes to something) that are managed as a single
operation that can be either committed or rolled back. There are lots of
different types of transactions in the java world - database, messaging systems
like JMS, inter application transactions (for those who are not faint of heart)
or anything else that may need to be included in a transaction.
Spring is designed to be used as an
all-encompassing master of objects and services within your application, so its
concept of a transaction is at a higher level than the database specific
transactions that hibernate concerns itself with. Spring Transactions are
designed to give you fine grained control of all your transactional resources
while abstracting away the often messy coding required to co-ordinate the
transactions.
@Transactional
Spring provides a few different methods for using transactions: among others, XML-based aspects, coding to the API, and annotation-based declarative transactions. The annotation-based transactions are handy because you don't need to add transaction-management boilerplate to your app (even using PlatformTransactionManager via the API has quite a bit of coding overhead).
So basically, what happens with @Transactional is that at runtime Spring scans your code base for @Transactional classes and methods and wraps them in transaction-specific management code, based on what you have configured via the annotation. So a method like this:
So a method like this:
@Transactional(propagation
= REQUIRES_NEW, rollbackFor = {Exception.class})
public void
saveAndSendMessage(Foo foo) throws Exception {
dbManager.save(foo);
Bar bar = transform(foo);
jmsSystem.send(bar);
}
47. Circular
Dependency in Spring?
Circular
dependency in Spring happens when two or more beans require instance of each
other through constructor dependency injections. For example: There is a ClassA
that requires an instance of ClassB through constructor injection and ClassB
requires an instance of class A through constructor injection.
What is meant
by circular dependency?
In software
engineering, a circular dependency is a relation between two or more modules
which either directly or indirectly depend on each other to function properly.
Such modules are also known as mutually recursive.
How do you resolve a circular dependency when it occurs?
· Use events to signal from one class to another.
· If that works but events seem wrong, consider applying the Observer pattern.
· If the communication must truly go both ways, use a Mediator through which the components can communicate.
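In Spring, the usual fixes are switching one side of the cycle to setter injection or marking one dependency @Lazy. The shape of the problem and the setter-based fix can be sketched in plain Java (class names are illustrative):

```java
public class CircularDemo {
    // With constructor injection, neither class could be built first:
    //   static class ClassA { ClassA(ClassB b) { ... } }
    //   static class ClassB { ClassB(ClassA a) { ... } }

    // Setter injection breaks the cycle: build both, then wire them.
    static class ClassA {
        ClassB b;
        void setB(ClassB b) { this.b = b; }
    }
    static class ClassB {
        ClassA a;
        void setA(ClassA a) { this.a = a; }
    }

    public static boolean wire() {
        ClassA a = new ClassA();   // both constructed without the other
        ClassB b = new ClassB();
        a.setB(b);                 // the cycle is closed only after construction,
        b.setA(a);                 // which is what setter/@Lazy injection allows
        return a.b == b && b.a == a;
    }

    public static void main(String[] args) {
        System.out.println(wire()); // prints true
    }
}
```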
48. What is the difference between the Spring singleton bean scope and the Singleton design pattern? How are they different?
The Singleton pattern guarantees one instance per class loader.
Singleton bean scope is one instance per Spring container: Spring simply creates one instance of the class, and it is available in the container to all class loaders that use that container.
Suppose you have two scenarios:
1. There are multiple class loaders inside the same Spring container.
2. There are multiple containers using the same class loader.
In the first case you will get one instance; in the second case you will get multiple instances.
49. What is the difference between a microservice and a monolith?
A monolithic architecture is built as one large system and is usually one codebase. A monolith is often deployed all at once, front-end and back-end code together, regardless of what was changed.
A microservices architecture, however, is one where an app is built as a suite of small services, each with its own codebase.
50.Scope of beans?
1. singleton (default): Scopes a single bean definition to a single object instance per Spring IoC container.
2. prototype: Scopes a single bean definition to any number of object instances.
3. request: Scopes a single bean definition to the
lifecycle of a single HTTP request; that is every HTTP request will have its
own instance of a bean created off the back of a single bean definition. Only
valid in the context of a web-aware Spring ApplicationContext.
4. session: Scopes a single bean definition to the lifecycle of an HTTP Session. Only valid in the context of a web-aware Spring ApplicationContext.
5. global session: Scopes a single bean definition
to the lifecycle of a global HTTP Session. Typically only valid when used in a
portlet context. Only valid in the context of a web-aware Spring
ApplicationContext.
51. Can you insert prototype bean to a singleton bean and
if yes how?
You cannot
dependency-inject a prototype-scoped bean into your singleton bean because that
injection occurs only once, when the
Spring container is instantiating the singleton bean and resolving and
injecting its dependencies.
Yes. For singleton beans with prototype-bean dependencies, use the following method:
Lookup Method Injection
When you use singleton-scoped beans with
dependencies on prototype beans, be aware that dependencies are resolved
at instantiation time. Thus if you dependency-inject
a prototype-scoped bean into a singleton-scoped bean, a new prototype
bean is instantiated and then dependency-injected into the singleton bean. The
prototype instance is the sole instance that is ever supplied to the singleton-scoped
bean.
“Lookup method injection is the ability of the
container to override methods on container managed beans, to return
the lookup result for another named bean in the container. The lookup typically
involves a prototype bean as in the scenario described in the preceding
section. The Spring Framework implements this method injection by using
bytecode generation from the CGLIB library to generate dynamically a subclass
that overrides the method”
<!-- a stateful bean deployed as a prototype (non-singleton) -->
<bean id="command" class="fiona.apple.AsyncCommand" scope="prototype">
    <!-- inject dependencies here as required -->
</bean>

<!-- commandManager uses lookup-method injection to fetch a fresh command -->
<bean id="commandManager" class="fiona.apple.CommandManager">
    <lookup-method name="createCommand" bean="command"/>
</bean>
52. How do you configure an additional service alongside the default Tomcat server?
To be able to deploy an additional application on a different port, you need to create an additional Service configuration. To do so, edit the server.xml file and add another configuration group. The new group must have a different name and different ports, for both HTTP and AJP traffic.
<Service name="Catalina">
    <Connector port="8080" protocol="HTTP/1.1" connectionTimeout="20000" redirectPort="8443" />
    <Connector port="8009" protocol="AJP/1.3" redirectPort="8443" />
    <Engine name="Catalina" defaultHost="localhost">
        <Realm className="org.apache.catalina.realm.UserDatabaseRealm" resourceName="UserDatabase"/>
        <Host name="localhost" appBase="webapps" unpackWARs="true" autoDeploy="true" xmlValidation="false" xmlNamespaceAware="false"></Host>
    </Engine>
</Service>
</Server>
The Engine and Host groups must also have a different name. Note that in the Engine component the defaultHost attribute must correspond to the Host component name. Finally, the appBase attribute of the Host component must point to an existing directory where the deployment will be performed (the directories you need to create/copy are described below).
<Service name="Catalina-xyz">
    <Connector port="8081" protocol="HTTP/1.1" connectionTimeout="20000" redirectPort="8444" />
    <Connector port="8008" protocol="AJP/1.3" redirectPort="8444" />
    <Engine name="Catalina-xyz" defaultHost="xyz">
        <Realm className="org.apache.catalina.realm.UserDatabaseRealm" resourceName="UserDatabase"/>
        <Host name="xyz" appBase="xyz" unpackWARs="true" autoDeploy="true" xmlValidation="false" xmlNamespaceAware="false"></Host>
    </Engine>
</Service>
Before
restarting the Tomcat instance, we need to make sure that all required
directories are correctly set. First of all, create a Catalina directory copy
with the Catalina-xyz name, located in your TOMCAT_HOME directory (you must
have both the new and the old directories in the TOMCAT_HOME present). Next,
inside of the Catalina-xyz directory rename the localhost to xyz (or whatever
name you have chosen for your host). Next, copy the webapps directory (in my
case it was located inside the /var/lib/tomcat/ directory) with the xyz name to
the same folder. Finally, restart the Tomcat instance. Now you will be able to
deploy the two ROOT.war archives to the two different Tomcat ports.
53.Microservice Architecture?
Microservices architecture (often shortened to
microservices) refers to an architectural style for developing
applications. Microservices allow a large application to be
separated into smaller independent parts, with each part having its own realm
of responsibility.
Broadly speaking, there are two
types of microservices:
- Stateless
microservices.
- Stateful
microservices.
Microservices
- also known as the microservice architecture - is an architectural style that
structures an application as a collection of services that are
- Highly maintainable and testable
- Loosely coupled
- Independently deployable
- Organized around business capabilities
- Owned by a small team
The microservice architecture enables the rapid, frequent,
and reliable delivery of large, complex applications. It also enables an
organization to evolve its technology stack.
54. How do
you deploy a Web application in a production server?
Assign deployment
attributes for your Web Application:
- Open
the Administration Console.
- Select
the Web Applications node.
- Select
your Web Application.
- Assign
your Web Application to a WebLogic Server, cluster, or Virtual Host.
- Select
the File tab and define the appropriate attributes.
55. How do you change the embedded server in Spring Boot?
You need to update pom.xml and add the dependency for the replacement server, e.g. spring-boot-starter-jetty or spring-boot-starter-undertow. You also need to exclude the default spring-boot-starter-tomcat dependency.

<!-- pick ONE replacement server, e.g. Undertow ... -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-undertow</artifactId>
</dependency>
<!-- ... or Jetty -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-jetty</artifactId>
</dependency>

spring-boot-starter-web comes with embedded Tomcat, so we need to exclude that dependency:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<exclusions>
<exclusion>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-tomcat</artifactId>
</exclusion>
</exclusions>
</dependency>
56. What would happen if hashCode() always returned 1? What will the behavior be if we override the hashCode method to always return 1?
If you only override hashCode and keep the default equals, only references to the same object will compare as equal. In other words, objects you expected to be equal will not be equal when equals is called. In addition, every key hashes to the same bucket, so HashMap lookups degrade to a linear scan.
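A minimal sketch of this behavior (the Key class is hypothetical): hashCode() is fixed at 1 and equals() is left as the default reference comparison, so a logically equal key is not found, and all entries collide into one bucket.

```java
import java.util.HashMap;
import java.util.Map;

public class HashCodeOne {
    // Hypothetical key class: hashCode() always returns 1; equals() is NOT
    // overridden, so Object's reference-equality implementation is used.
    static class Key {
        final String value;
        Key(String value) { this.value = value; }
        @Override public int hashCode() { return 1; }
    }

    public static void main(String[] args) {
        Map<Key, String> map = new HashMap<>();
        Key k1 = new Key("a");
        map.put(k1, "first");

        // Same reference: found, because default equals() compares references.
        System.out.println(map.get(k1));             // first

        // Logically "equal" key, but a different reference: NOT found.
        System.out.println(map.get(new Key("a")));   // null

        // All keys land in one bucket, so every lookup is a linear scan.
    }
}
```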
57. When a bean is missing how do you handle it?
The @ConditionalOnMissingBean annotation is used to load a bean only if a given bean is missing:
@Bean
@ConditionalOnMissingBean(SomeBean.class)
public SomeBean otherBean() {
    return new SomeBean();
}
The above bean will get loaded by Spring only if there is no other bean of this type present in the context. On the other hand, if there is already a bean of the type SomeBean present in the application context, the above bean will not be created.
58. How do we connect to a database when we need to connect to Oracle in production and some other database in development? How do you handle these connection properties?
Write an individual properties file per environment, configure each environment's connection details there, and activate the matching Spring profile in each environment.
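A common layout for this (a sketch; the hostnames, database names, and credentials are hypothetical) is one profile-specific properties file per environment, selected via spring.profiles.active:

```properties
# application-dev.properties  (active when spring.profiles.active=dev)
spring.datasource.url=jdbc:mysql://localhost:3306/devdb
spring.datasource.username=dev_user
spring.datasource.password=dev_pass

# application-prod.properties (active when spring.profiles.active=prod)
spring.datasource.url=jdbc:oracle:thin:@prod-db-host:1521/PRODDB
spring.datasource.username=prod_user
spring.datasource.password=${DB_PASSWORD}
```

Activate the profile at startup, e.g. with -Dspring.profiles.active=prod or the SPRING_PROFILES_ACTIVE environment variable; Spring Boot then merges the profile-specific file over application.properties.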
59. Code repository, branching in git / Source Control management tool: GIT
· Git branching allows developers to diverge from the production version of code to fix a bug or add a feature.
· As you create commits in the new branch, Git creates new pointers to track the changes.
· Git knows which branch you have checked out by using a special pointer called HEAD.
Branch Naming Strategies
· username/description
· username/workitem
You can name a branch to indicate the branch’s function, like a feature, bug fix, or hotfix:
· bugfix/description
· feature/feature-name
· hotfix/description
Option 1: Creating a Branch
git branch <branch name>
git branch
Option 2: Creating a Branch using Checkout
git checkout -b <branch name>
git branch
Option 3: Creating a Branch from a Commit
git log
git branch <branch name> <identifier>
Option 4: Creating a Branch from Another Branch
git checkout -b feature4 develop
Option 5: Download Branch from Remote Repository
git pull origin <branch name>
git branch
git checkout <branch name>
git branch
Merging Branches
Once you’ve completed work on your branch, it is time to merge it into the main branch. Merging takes your branch changes and applies them to the main branch. Depending on the commit history, Git performs merges in two ways: fast-forward and three-way merge.
When you merge the hotfix branch into the main branch, Git will move the main branch pointer forward to commit nr7jk. Git does this because the hotfix branch shares a direct ancestor commit with the main branch and is directly ahead of its commit. This is a fast-forward merge.
Once you
merge the hotfix branch, continue working on the feature1 branch. As you
continue making commits on the feature1 branch, the commit history diverges.
Git is unable
to move the pointer to the latest commit like in a fast-forward commit. To
bring the feature1 branch into the main branch, Git performs a three-way merge.
Git takes a snapshot of three different commits to create a new one:
·
The common commit both branches share (a90hb)
·
The latest commit of the branch (az84f)
·
The commit of the branch to merge into (nr7jk)
Merging Branches in a Local Repository
To merge branches locally, use git checkout to switch to the branch you want to merge into. This branch is typically the main branch. Next, use git merge and specify the name of the other branch to bring into this branch. This example merges the jeff/feature1 branch into the main branch. Note that this is a fast-forward merge.
git checkout main
git merge jeff/feature1
Work continues on the main and other branches, so they no longer share a common commit history. Now a developer wants to merge the jeff/feature2 branch into the main branch. This time, Git performs a three-way (or recursive) merge commit.
git checkout main
git merge jeff/feature2
Merging Branches to Remote Repository
git push --set-upstream origin <branch name>
Merging Main into a Branch
git checkout <branch name>
git merge main
60. Name all core APIs of Kafka?
Kafka APIs:
· The Admin API to manage and inspect topics, brokers, and other Kafka objects.
· The Producer API to publish (write) a stream of events to one or more Kafka topics.
· The Consumer API to subscribe to (read) one or more topics and to process the stream of events produced to them.
· The Kafka Streams API to implement stream-processing applications and microservices.
· The Connect API to build and run reusable data import/export connectors.
If there are multiple consumers and multiple producers with a single topic, how are records processed?
Split the topic into multiple partitions and put the consumers in a single consumer group: Kafka assigns each partition to exactly one consumer in the group, so records are processed in parallel without being consumed twice, while multiple producers can write to the topic concurrently.
61. How to check logs in microservices?
Below are some logging tools that can handle a microservice architecture:
- Logstash: a free, open-source tool that runs on the Java Virtual Machine (JVM)
- Riemann
- Prometheus
- Elastic Stack
- Kibana
- Glowroot
- AWS CloudWatch
- Datadog
Here are a few microservices logging best practices:
· Use a correlation ID
· Structure logs appropriately
· Provide informative application logs
· Visualize log data
· Use centralized log storage
· Query logs
· Handle failures
62. How are ACID properties implemented in the microservices world?
Microservices guidelines strongly recommend the Single Repository Principle (SRP): each microservice maintains its own database, and no other service should access another service's database directly. There is no direct and simple way of maintaining ACID properties across multiple databases; consistency is typically achieved eventually, for example with the Saga pattern.
63. Have you worked with multiple databases in microservices?
Creating a single database for different microservices is an anti-pattern; the correct way is to create a database for each microservice.
This means we can use different database technologies for different microservices, so one service may use an SQL database and another a NoSQL database. That flexibility allows using the most efficient database for each service's requirements and functionality.
64. Can
multiple microservices share the same database?
In the shared-database-per-service
pattern, the same database is shared by several microservices. This
pattern does not reduce dependencies between development teams, and introduces
runtime coupling because all microservices share the same database.
How do microservices communicate with each other?
Microservices are distributed, so they communicate with each other over the network. Each microservice has its own instance and process; therefore, services must interact using inter-service communication protocols such as HTTP, gRPC, or message brokers speaking AMQP.
How do I interact with one microservice from another microservice?
There are two basic messaging patterns that microservices can use to communicate with other microservices.
- Synchronous communication: a service calls an API that another service exposes, using a protocol such as HTTP or gRPC, and waits for the response.
- Asynchronous message passing: a service sends a message without waiting for a response, and one or more services process it later.
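A sketch of the synchronous style using the JDK's built-in HTTP client (Java 11+); the inventory-service URL and path are hypothetical:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OrderClient {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Hypothetical endpoint exposed by another microservice.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://inventory-service/api/stock/42"))
                .GET()
                .build();

        // Blocking (synchronous) call: the caller waits for the response.
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```

In the asynchronous style the same client offers sendAsync, which returns a CompletableFuture instead of blocking; message brokers decouple the two services even further.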
Is it considered good practice to connect to two different databases in one microservice?
The main thing is that you have only one microservice per database, but it is OK to have multiple databases per microservice if the business case requires it.
Your microservice can abstract multiple data sources, connect them, etc., and then just give a consistent API to whoever is using it. The consumer doesn't care how many data sources there actually are.
It becomes an issue if you have the same database abstracted by multiple microservices. Then your microservice is no longer isolated and can break because the data source you are using was changed by another team that uses the same data source.
This is one of the main problems in the microservice architecture and is addressed using the Database per Service pattern.
Also associated with this pattern is:
· Command Query Responsibility Segregation (CQRS)
But this solves only part of the problem.
What are the exceptions in Spring Boot?
· Spring BeanDefinitionStoreException.
· Spring BeanCreationException.
org.springframework.beans.factory.BeanCreationException – this is a very common exception thrown when the BeanFactory creates beans from the bean definitions and encounters a problem. When autowiring Spring beans, a common exception is a BeanCreationException. It means that Spring found a bean to create but was unable to fulfill the dependencies needed to create that Spring bean.
· Unsatisfied Dependency in Spring.
org.springframework.beans.factory.UnsatisfiedDependencyException: "Error creating bean with name defined in file: Unsatisfied dependency expressed through field" occurs when a bean is auto-wired on other beans that have not been loaded into the Spring Boot application context.
The best solution is to properly isolate beans. The DispatcherServlet is responsible for routing and handling requests, so all related beans should go into its context. The ContextLoaderListener, which loads the root context, should initialize any beans the rest of your application needs: services, repositories, etc.
· The BeanDefinitionOverrideException in Spring Boot.
Bean overriding is a default behavior that happens when we define a bean within an ApplicationContext that has the same name as another bean. It works by simply replacing the former bean in case of a name conflict.
· Spring
Boot Error ApplicationContextException.
Usage of Actuator in Spring Boot?
Spring Boot Actuator exposes production-ready endpoints (such as /actuator/health, /actuator/metrics, and /actuator/info) for monitoring and managing a running application.
2. Working experience in a team of 10+ developers, everyone working on the same feature
3. No. of team members in last project. Scrum master role handled by
4. Experience on NoSQL DB
5. Experience on Cassandra DB
6. Other components in project apart from microservices
7. Different database/schema for all microservices or same, w.r.t. the last project I worked on
8. Use of functional interface
9. What is a functional interface
10. List of clients previously worked with
11. Awareness of SDLC cycle
Method references in Java 8?
A method reference (written ClassName::methodName) is shorthand for a lambda expression that simply calls an existing method; for example, String::toUpperCase is equivalent to s -> s.toUpperCase().
Kubernetes – What is POD ? Optional
– not mandatory
What have you done in Kubernetes? Optional – not mandatory
Where in Kubernetes will you add the scalability?
Optional – not mandatory
What is enterprise integration pattern ?
Have you worked in Apache camel ?
Have you worked on Cassandra ? Optional – not mandatory
What is partition key ?
Optional – not mandatory .
How
to process bulk records to apache Kafka
How security handled in current/previous
project
Previous
project POC s
Asked
about previous Springboot project implementation?
1010101010
How many substrings with a digit sum of 2 can be created from the string above?
For example:
101
0101
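One way to answer this programmatically is the standard prefix-sum technique: every earlier prefix whose sum equals (current sum − target) closes a valid substring at the current position. A sketch:

```java
import java.util.HashMap;
import java.util.Map;

public class SubstringSum {
    // Counts substrings of a digit string whose digits sum to target.
    static int countSubstringsWithSum(String s, int target) {
        Map<Integer, Integer> prefixCounts = new HashMap<>();
        prefixCounts.put(0, 1); // the empty prefix
        int sum = 0, count = 0;
        for (char c : s.toCharArray()) {
            sum += c - '0';
            // Each earlier prefix equal to (sum - target) ends a valid substring here.
            count += prefixCounts.getOrDefault(sum - target, 0);
            prefixCounts.merge(sum, 1, Integer::sum);
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countSubstringsWithSum("1010101010", 2)); // 14
    }
}
```

For "1010101010" this counts 14 substrings whose digits sum to 2, including the "101" and "0101" examples above.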
public ModelAndView getMembers(HttpServletRequest request, Authentication auth) {
    if (auth != null) {
        for (GrantedAuthority ga : auth.getAuthorities()) {
            // works fine and logs "ADMIN"; btw, I'm using SimpleGrantedAuthority
            this.logger.debug("{}", ga);
        }
    }
}

public UserDetails loadUserByUsername(String username) {
    .....
    Collection<GrantedAuthority> authorities = new ArrayList<>();
    authorities.add(new SimpleGrantedAuthority("ADMIN"));
    return new org.springframework.security.core.userdetails.User(username,
            password, enabled, true, true, true, authorities);
    ...
}
Abstract Factory vs Factory?
The factory method is just a method; it can be overridden in a subclass, whereas the abstract factory is an object that has multiple factory methods on it. The Factory Method pattern uses inheritance and relies on a subclass to handle the desired object instantiation.
Thread pools in Java?
A Java thread pool represents a group of worker threads that are waiting for jobs and are reused many times.
In the case of a thread pool, a group of fixed-size threads is created. A thread from the thread pool is pulled out and assigned a job by the service provider. After completion of the job, the thread is returned to the thread pool.
65. Thread Pool Methods
newFixedThreadPool(int s): creates a thread pool of the fixed size s.
newCachedThreadPool(): creates a thread pool that creates new threads when needed but will still reuse previously created threads whenever they are available.
Advantage of a Java Thread Pool
Better performance: it saves time because there is no need to create a new thread for every task.
Real-time usage
It is used in Servlets and JSP, where the container creates a thread pool to process requests.
Example:
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

class WorkerThread implements Runnable {
    private String message;

    public WorkerThread(String s) {
        this.message = s;
    }

    public void run() {
        System.out.println(Thread.currentThread().getName() + " (Start) message = " + message);
        processmessage(); // call processmessage method that sleeps the thread for 2 seconds
        System.out.println(Thread.currentThread().getName() + " (End)"); // prints thread name
    }

    private void processmessage() {
        try { Thread.sleep(2000); } catch (InterruptedException e) { e.printStackTrace(); }
    }
}

public class TestThreadPool {
    public static void main(String[] args) {
        ExecutorService executor = Executors.newFixedThreadPool(5); // creating a pool of 5 threads
        for (int i = 0; i < 10; i++) {
            Runnable worker = new WorkerThread("" + i);
            executor.execute(worker); // calling execute method of ExecutorService
        }
        executor.shutdown();
        while (!executor.isTerminated()) { }
        System.out.println("Finished all threads");
    }
}
66. Builder design pattern?
What if a developer forgets to call a particular setter method? We could end up with an object that is only partially initialized, and again, the compiler wouldn't see any problems with it. Thus, there are two specific problems that we need to solve:
o Too many constructor arguments / too many constructors to maintain.
o Incorrect object state / error-prone because many fields have the same type.
This is where the Builder pattern comes into play.
Builder is a creational design pattern "which allows constructing complex objects step by step." Unlike other creational patterns, Builder doesn't require products to have a common interface. That makes it possible to produce different products using the same construction process.
"The builder pattern provides a builder object which is used to construct a complex object called the product. It encapsulates the logic of constructing the different pieces of the product."
Builder (recognizable by creational methods returning the instance itself):
java.lang.StringBuilder#append() (unsynchronized)
java.lang.StringBuffer#append() (synchronized)
java.nio.ByteBuffer#put() (also on CharBuffer, ShortBuffer, IntBuffer, LongBuffer, FloatBuffer and DoubleBuffer)
javax.swing.GroupLayout.Group#addComponent()
All implementations of java.lang.Appendable
java.util.stream.Stream.Builder
Builder is a creational pattern and should be used when the number of parameters required in the constructor is more than manageable, usually more than 4 or at most 5.
Advantages:
1) More maintainable if the number of fields required to create an object is more than 4 or 5.
2) Less error-prone, as users will know what they are passing because of the explicit method call.
3) More robust, as only a fully constructed object will be available to the client.
Disadvantages:
1) Verbose and duplicated code, as the Builder needs to copy all fields from the original or Item class.
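A minimal sketch of the pattern (the User class and its fields are hypothetical): the builder's chained setters make each value explicit, and only build() produces the (fully constructed, immutable) product.

```java
public class User {
    // All fields are set once by the builder; the object is immutable afterwards.
    private final String firstName;
    private final String lastName;
    private final int age;
    private final String email;

    private User(Builder b) {
        this.firstName = b.firstName;
        this.lastName = b.lastName;
        this.age = b.age;
        this.email = b.email;
    }

    public String getEmail() { return email; }

    public static class Builder {
        private String firstName;
        private String lastName;
        private int age;
        private String email;

        // Each setter returns the builder itself, allowing method chaining.
        public Builder firstName(String v) { this.firstName = v; return this; }
        public Builder lastName(String v)  { this.lastName = v;  return this; }
        public Builder age(int v)          { this.age = v;       return this; }
        public Builder email(String v)     { this.email = v;     return this; }

        // Only fully constructed objects ever escape to the client.
        public User build() { return new User(this); }
    }

    public static void main(String[] args) {
        User u = new User.Builder()
                .firstName("Jane").lastName("Doe").age(30).email("jane@example.com")
                .build();
        System.out.println(u.getEmail()); // jane@example.com
    }
}
```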
67. Decorator design pattern?
Usage of the Decorator Pattern: it is used when you want to add responsibilities transparently and dynamically to objects without affecting other objects, or when you want to add responsibilities to an object that you may want to change in the future and extending functionality by sub-classing is no longer practical.
The Decorator pattern achieves a single objective: dynamically adding responsibilities to any object. It can be used to attach additional responsibilities to an object either statically or dynamically.
It enhances the extensibility of the object because changes are made by coding new classes. It simplifies coding by allowing you to develop a series of functionality in targeted classes instead of coding all the behavior into the object.
Usage examples: the Decorator is pretty standard in Java code, especially in code related to streams. Here are some examples of Decorator in the core Java libraries:
o All subclasses of java.io.InputStream, OutputStream, Reader and Writer have constructors that accept objects of their own type.
o java.util.Collections: the checkedXXX(), synchronizedXXX() and unmodifiableXXX() methods.
o javax.servlet.http.HttpServletRequestWrapper and HttpServletResponseWrapper.
Identification: a Decorator can be recognized by creation methods or a constructor that accepts objects of the same class or interface as the current class.
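A minimal sketch of the pattern (the Coffee classes are hypothetical): the decorator wraps another object of the same interface and adds behavior at construction time, much like new BufferedReader(new FileReader(...)) in java.io.

```java
// Component interface shared by concrete objects and decorators.
interface Coffee {
    double cost();
    String description();
}

class SimpleCoffee implements Coffee {
    public double cost() { return 2.0; }
    public String description() { return "coffee"; }
}

// The decorator wraps another Coffee and adds behavior without modifying it.
class MilkDecorator implements Coffee {
    private final Coffee inner;
    MilkDecorator(Coffee inner) { this.inner = inner; }
    public double cost() { return inner.cost() + 0.5; }
    public String description() { return inner.description() + " + milk"; }
}

public class DecoratorDemo {
    public static void main(String[] args) {
        // Responsibilities are attached dynamically by nesting constructors.
        Coffee order = new MilkDecorator(new SimpleCoffee());
        System.out.println(order.description() + " costs " + order.cost()); // coffee + milk costs 2.5
    }
}
```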
68. Versioning in REST APIs?
API versioning is the practice of transparently managing changes to your API. Versioning is effective communication around changes to your API, so consumers know what to expect from it. You are delivering data to the public in some fashion, and you need to communicate when you change the way that data is delivered.
69. Difference between Hibernate JPA
vs Spring Data JPA?
Hibernate is a
JPA provider and ORM that maps Java objects to relational database tables.
Spring Data JPA is an abstraction that makes working with the JPA
provider less verbose. Using Spring Data JPA you can eliminate
a lot of the boilerplate code involved in managing a JPA provider like
Hibernate.
Hibernate
is a JPA implementation, while Spring Data JPA is a JPA data access
abstraction. Spring Data JPA cannot work without a JPA provider.
Spring
Data offers a solution to the DDD Repository
pattern or
the legacy GenericDao
custom implementations. It can also generate
JPA queries on your behalf through method name conventions.
With
Spring Data, you may use Hibernate, EclipseLink, or any other JPA provider. A
very interesting benefit of using Spring or Java EE is that you can control
transaction boundaries declaratively using the @Transactional
annotation.
Spring
JDBC is much more lightweight, and it's intended for native querying, and if
you only intend to use JDBC alone, then you are better off using Spring JDBC to
deal with the JDBC verbosity.
Therefore,
Hibernate and Spring Data are complementary rather than competitors.
70. SOLID
Principles ?
In Java, the SOLID principles are an object-oriented approach applied to software structure design. They were conceptualized by Robert C. Martin (also known as Uncle Bob). These five principles have changed the world of object-oriented programming, and also changed the way software is written.
o Single Responsibility Principle (SRP)
o Open-Closed Principle (OCP)
o Liskov Substitution Principle (LSP)
o Interface Segregation Principle (ISP)
o Dependency Inversion Principle (DIP)
Single Responsibility Principle
Just because
you can doesn’t mean you should
·
Every class should have a single responsibility
·
There should never be more than one reason for a class
to change
·
Your classes should be small. No more than a screen
full of code
·
Avoid ‘god’ classes.
·
Split big classes into smaller classes.
·
Class should only have one responsibility.
Furthermore, it should only have one reason to change.
Open/Closed Principle
· Your classes should be open for extension
· but closed for modification.
· You should be able to extend a class's behaviour without modifying it.
· Use private variables with getters and setters – only where you need them.
· Use abstract base classes.
· Simply put, classes should be open for extension but closed for modification. In doing so, we stop ourselves from modifying existing code and causing potential new bugs in an otherwise happy application. Of course, the one exception to the rule is when fixing bugs in existing code.
Liskov Substitution Principle
Introduced by Barbara Liskov in 1987.
· Objects in a program should be replaceable with instances of their subtypes WITHOUT altering the correctness of the program.
· Violations will often fail the "is a" test.
· A Square "is a" Rectangle.
· However, a Rectangle "is not" a Square.
· If class A is a subtype of class B, we should be able to replace B with A without disrupting the behaviour of our program.
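The classic Square/Rectangle violation can be sketched as follows: Square keeps its sides equal, which breaks the expectation a Rectangle client relies on.

```java
class Rectangle {
    protected int width, height;
    public void setWidth(int w)  { this.width = w; }
    public void setHeight(int h) { this.height = h; }
    public int area() { return width * height; }
}

// Mathematically a square "is a" rectangle, but as a subtype it breaks
// the client's expectations: setting one side silently changes the other.
class Square extends Rectangle {
    @Override public void setWidth(int w)  { this.width = w; this.height = w; }
    @Override public void setHeight(int h) { this.width = h; this.height = h; }
}

public class LspDemo {
    static int resize(Rectangle r) {
        r.setWidth(2);
        r.setHeight(5);
        return r.area(); // a client of Rectangle reasonably expects 10
    }

    public static void main(String[] args) {
        System.out.println(resize(new Rectangle())); // 10
        System.out.println(resize(new Square()));    // 25 -- substituting the subtype changed the behaviour
    }
}
```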
Interface Segregation Principle
· Make fine-grained interfaces that are client specific.
· Many client-specific interfaces are better than one "general purpose" interface.
· Keep your components focused and minimize the dependencies between them.
· Notice the relationship to the Single Responsibility Principle, i.e. avoid 'god' interfaces.
· Larger interfaces should be split into smaller ones. By doing so, we can ensure that implementing classes only need to be concerned with the methods that are of interest to them.
Dependency Inversion Principle
· Abstractions should not depend upon details.
· Details should depend upon abstractions.
· It is important that higher-level and lower-level objects depend on the same abstract interaction.
· This is not the same as Dependency Injection, which is about how objects obtain the objects they depend on.
· The principle of dependency inversion refers to the decoupling of software modules. This way, instead of high-level modules depending on low-level modules, both will depend on abstractions.
Summary
The SOLID
principles of OOP will lead you to better quality code
·
Your code will be more testable and easier to maintain
·
A Key theme avoiding tight coupling in your code
71. Redis cache?
Redis is a popular, open-source, in-memory data structure store that can be used as a database, cache, or message broker. Every time you update or delete information stored in a local cache on one machine, you must update the in-memory caches on all machines that are part of the distributed cache.
You can't store arbitrary objects directly in Redis, so convert the object into a String and then put it in Redis. To do that, the object must be serializable: convert the object to a byte array, apply an encoding algorithm (e.g. Base64 encoding) to get a String, store that String in Redis, and reverse the process when reading it back.
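A sketch of that encode/decode round trip using only the JDK (the Session class is hypothetical, and the actual SET/GET against Redis via a client such as Jedis is omitted):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Base64;

public class RedisSerialization {
    // The object must implement Serializable to be turned into a byte array.
    static class Session implements Serializable {
        private static final long serialVersionUID = 1L;
        final String userId;
        Session(String userId) { this.userId = userId; }
    }

    // Serialize to bytes, then Base64-encode so it can be stored as a Redis String value.
    static String toBase64(Serializable obj) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        return Base64.getEncoder().encodeToString(bos.toByteArray());
    }

    // Reverse the process after reading the String value back from Redis.
    static Object fromBase64(String s) throws IOException, ClassNotFoundException {
        byte[] bytes = Base64.getDecoder().decode(s);
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        String encoded = toBase64(new Session("user-42")); // the value you would SET
        Session restored = (Session) fromBase64(encoded);  // the value you would GET back
        System.out.println(restored.userId); // user-42
    }
}
```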
Advantage of using Redis Cache in your application?
Reads are served from memory, which lowers latency and reduces load on the primary database; Redis also offers rich data structures and key expiry (TTL) out of the box.
Where does Spring Boot look for configuration files? The bootstrap context and the application context have different conventions for locating external configuration files: the bootstrap context searches for a bootstrap.properties or a bootstrap.yaml file, whereas the application context searches for an application.properties or application.yaml file. Spring Boot's common property groups include:
- Core properties (logging properties, thread properties)
- Integration properties (RabbitMQ properties, ActiveMQ properties)
- Web properties (HTTP properties, MVC properties)
- Security properties (LDAP properties, OAuth2 properties)
Spring security ?
Spring Security is a Java/Java EE framework that provides
authentication, authorization, and other security features for enterprise
applications.
Core java collections
73. TreeSet/HashMap in Java (hashCode and equals)
Does the tree data structure use hashCode and equals? How does sorting happen internally in TreeSet/TreeMap?
Sorting is based on the compareTo method (or a supplied Comparator), and the hashCode() method is insignificant in this scenario. TreeSet is backed by TreeMap, so internally TreeMap is used for sorting.
When we implement a TreeSet, it creates a TreeMap to store the elements. It sorts the elements either naturally or using a user-defined comparator.
When the object of a TreeSet is created, it automatically invokes the default constructor, creates an object of TreeMap, and assigns the comparator as null.
TreeSet is an implementation of SortedSet that does not allow duplicate values. The elements in the TreeSet are by default sorted in ascending order.
TreeSet is not thread-safe and does not allow duplicates; it also does not allow null elements (inserting null throws a NullPointerException when natural ordering is used).
The TreeSet class provides an add method that is used to add a specific element to the TreeSet. It also provides the addAll method, which accepts any other collection as an argument and then adds all the elements of that collection to the TreeSet.
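A short sketch of both orderings: natural ascending order with duplicates dropped, and a user-defined Comparator that reverses it.

```java
import java.util.Comparator;
import java.util.TreeSet;

public class TreeSetDemo {
    public static void main(String[] args) {
        // Natural (ascending) ordering via compareTo; duplicates are silently dropped.
        TreeSet<Integer> numbers = new TreeSet<>();
        numbers.add(5);
        numbers.add(1);
        numbers.add(5); // duplicate, ignored
        numbers.add(3);
        System.out.println(numbers); // [1, 3, 5]

        // User-defined comparator: descending order instead of the natural order.
        TreeSet<String> names = new TreeSet<>(Comparator.reverseOrder());
        names.add("alice");
        names.add("bob");
        System.out.println(names.first()); // bob
    }
}
```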
How
to Create REST APIs with Java and Spring Boot (twilio.com)
https://github.com/MohammedAliIbrahim/spring-boot-college-management-system.git