Understanding Maven -2

I had shared a basic understanding of Maven some time back.

http://kamalmeet.com/java/understanding-maven/

Here I will try to get into some more details.

As mentioned earlier, we need a POM file for any Maven project. POM stands for Project Object Model, and it contains the configuration details for the project.

To start with, we provide basic information such as:

<groupId>com.companyname.project-group</groupId>
<artifactId>project</artifactId>
<version>1.0</version>

groupId: identifies the project group, say finance
artifactId: a project name that is unique within the group, say payroll
version: the release version, e.g. 1.0 or 2.1.2
repositories: locations where all the jars, plugins and other artifacts are stored and from where Maven fetches them

<repositories>
  <repository>
    <id>custom.id</id>
    <url>http://validrepo.com/mavenrepo</url>
  </repository>
</repositories>

Types of repositories

Local repository: a folder on the local machine where all the downloaded dependencies are stored. By default Maven creates this repository under the user's home directory (~/.m2/repository); the path can be overridden with the <localRepository> tag in settings.xml (see the snippet after this list).
Central repository: provided by Maven by default, and holds most of the common jars. No separate configuration is needed for this.
Remote repository: the developer has an option to provide a repository explicitly, so if a given dependency is not found in the central repository, it is fetched from the remote repository.
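
For example, overriding the local repository location is a one-line entry in settings.xml (the path below is only a placeholder):

<settings>
  <localRepository>/data/maven/repository</localRepository>
</settings>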

Dependencies: all the jar files on which the project depends; these are downloaded before the build.


<dependencies>
  <dependency>
    <groupId>test</groupId>
    <artifactId>a</artifactId>
    <version>1.2</version>
  </dependency>
</dependencies>
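
Putting these pieces together, a minimal pom.xml (reusing the sample coordinates above; just a sketch of the overall structure) looks like this:

<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>

  <groupId>com.companyname.project-group</groupId>
  <artifactId>project</artifactId>
  <version>1.0</version>

  <dependencies>
    <dependency>
      <groupId>test</groupId>
      <artifactId>a</artifactId>
      <version>1.2</version>
    </dependency>
  </dependencies>
</project>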


Check Disk usage in Linux

Here are a few important commands you would need to check the disk usage on a Linux machine.

A simple df (disk filesystem) command gives us important information about the Linux file system; the -h flag makes the output more human readable.

$ df
$ df -h

More variations of the df command are covered here:

http://www.tecmint.com/how-to-check-disk-space-in-linux/

Another important command is du (disk usage), which reports the usage of a particular folder. du -h gives the same data in human-readable form.

Other useful variations of the du command:

du -sh                    # summarized, human-readable size of the current directory
du -sh *                  # size of each item in the current directory
du -Pshx /* 2>/dev/null   # top-level directories only, staying on one filesystem, errors discarded
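
To quickly spot which directories are eating the space, the du output can be piped through sort; a small example, assuming GNU coreutils (whose sort -h understands the human-readable sizes):

du -sh * | sort -rh | head -n 10   # ten largest items in the current directory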

Open-Closed principle Revisited

Reference: http://kamalmeet.com/system-design-and-documentation/understanding-openclosed-principle/

The open-closed principle states that your classes should be open for extension but closed for modification. One way to look at it: when you provide a library or a jar file to a system, its users can of course use the classes or extend them, but they cannot get into the code and update it.

At a practical level, this means you should code in a manner that you never need to update a class once it is written. One major reason behind this principle is that once a class has been reviewed and unit tested, you would not like someone to modify it and possibly corrupt the code.

How do I make sure that my class follows the open-closed principle?

Let’s look at the design of this MyPizza class:

public class MyPizza {
    public void createPizza(Pizza pizza)
    {
        if(pizza.type.equals("Cheese"))
        {
            //create a cheese pizza
        }
        else if(pizza.type.equals("Veg"))
        {
            //create a veg pizza
        }
    }
}

The following pizza type classes are used with it:

class Pizza
{
    String type;
}

class CheesePizza extends Pizza{
    CheesePizza()
    {
        this.type = "Cheese";
    }
}

class VegPizza extends Pizza{
    VegPizza()
    {
        this.type = "Veg";
    }
}

The above design clearly violates the open-closed principle. What if I need to add a double cheese pizza here? I will have to go into the MyPizza class and update it, which breaks the "closed for modification" rule.

How can we fix this design?

public class MyPizza {
    public void createPizza(Pizza pizza)
    {
        pizza.create();
    }
}
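
For pizza.create() to compile, the Pizza base class now has to declare the create method. The post does not show the updated base class, so here is my assumption of how it would look:

abstract class Pizza
{
    String type;

    // every concrete pizza type knows how to create itself
    public abstract void create();
}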


class CheesePizza extends Pizza{
    CheesePizza()
    {
        this.type = "Cheese";
    }

    public void create()
    {
        //do the creation here
    }
}

With this simple modification we make sure that we do not need to change the code in the MyPizza class even when we add new types of pizza, as the actual responsibility of creation lies with the new class being created (e.g. DoubleCheese).
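
For example, a hypothetical DoubleCheesePizza is just one more subclass; MyPizza stays untouched:

class DoubleCheesePizza extends Pizza
{
    DoubleCheesePizza()
    {
        this.type = "DoubleCheese";
    }

    public void create()
    {
        //create a double cheese pizza here
    }
}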

Reverse Engineering: MySQL Workbench

In the last post I talked about creating sequence diagrams using MaintainJ. Another important aspect you would want to understand for a project is the database schema design: how many tables are there, how do they relate to each other, and so on.

The best way to understand this design is to look at an ER (Entity Relationship) diagram. Ideally one would create the ER diagram first and then implement the database.

In case we do not have an ER diagram available, we can create one by reverse engineering the database.

For MySQL, we can use the MySQL Workbench tool to create one.

Download the installer from https://www.mysql.com/products/workbench/

Once installed, you can connect to your MySQL database in Workbench. Then, from the Database menu at the top, select the Reverse Engineer option and choose the schema you want to reverse engineer.

Reverse Engineering: MaintainJ

The best way to analyze a codebase with hundreds of Java classes is to look into the documentation, class diagrams, sequence diagrams, etc. to understand the flow and usage. Unfortunately, there are times when you will not be provided with any such documentation.

Reverse engineering tools can help up to some level. MaintainJ is one such tool for Java.

So if you have a working codebase for a web application which you need to analyze, here are the steps:

1. Download the MaintainJ war file from http://maintainj.com/userGuide.jsp?param=install
2. Deploy the war file to the server where the main application (the one to be analyzed) is running; for example, if your project war file is deployed to tomcat/webapps, add MaintainJ.war there as well.
3. Now if you visit the MaintainJ link on the server, e.g. http://localhost:8080/MaintainJ/, it will let you provide the package to be traced and the directory where the output files should be written.
4. It will show simple settings to be added to catalina.sh (or the corresponding server configuration).
5. Once all settings are done, restart the server.
6. Go to the MaintainJ link and start tracing.
7. Now browse through the actual application; MaintainJ will create sequence diagrams in the directory whose path you provided.

You can view the .ser files created by MaintainJ in Eclipse by adding the MaintainJ plugin to Eclipse. Create a new project of the MaintainJ trace type and copy the generated .ser files into a folder in this project.

A good overall demo is provided at http://maintainj.com/userGuide.jsp?param=overviewDemo

Shared Nothing vs Shared Everything

In a database cluster implementation, there are multiple ways in which the different nodes can share resources and communicate with each other.

Shared nothing: none of the nodes uses another node's memory or storage. This is best suited for solutions where inter-node communication is not required, i.e. a node can come up with a solution on its own.

Shared memory: in this approach memory is shared, i.e. each node/processor works with the same memory. This is used when nodes need to share solutions or calculations done by other nodes that are available in memory.

Shared everything: in this approach nodes share both memory and storage. This makes sense when nodes are working on a problem where the calculations and data created or used by one node depend on the others.

Further Reads:

https://www.quora.com/What-are-the-differences-between-shared-nothing-shared-memory-and-shared-storage-architectures-in-the-context-of-scalable-computing-analytics

https://en.wikipedia.org/wiki/Shared_nothing_architecture

How to check if a port is open on a server?

I wanted to check whether the service I am trying to access on a server is actually listening on the port I am hitting. I looked for a ping variant which could tell me if the port is listening and the required service is up and running.

The nmap command turned out to be the answer.

nmap -p 80 google.com

Starting Nmap 6.40 ( http://nmap.org ) at 2016-03-02 16:05 IST
Nmap scan report for google.com (216.58.197.78)
Host is up (0.050s latency).
rDNS record for 216.58.197.78: maa03s21-in-f14.1e100.net
PORT STATE SERVICE
80/tcp open http

It checks the port state and also reports which service is listening on the port.
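
nmap can also check several ports in one go; for example, assuming you want to test SSH, HTTP and HTTPS on the same host:

nmap -p 22,80,443 google.com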

Inject Or Autowired

Sometimes in Spring framework code we see the annotations @Inject and @Autowired being used interchangeably. Simply put, @Autowired is the Spring-specific annotation for implementing dependency injection, whereas @Inject is part of the Java EE 6 specification and hence more generic.
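
As a quick illustration, both annotations can inject a dependency into a Spring-managed bean in the same way. This is only a sketch: the PayrollService, EmployeeRepository and SalaryCalculator names are made up, and @Inject additionally needs the javax.inject jar on the classpath.

import javax.inject.Inject;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

// hypothetical collaborators, only here to make the example compile
interface EmployeeRepository {}
interface SalaryCalculator {}

@Service
public class PayrollService {

    // Spring-specific annotation
    @Autowired
    private EmployeeRepository employeeRepository;

    // JSR 330 annotation; Spring wires it the same way
    @Inject
    private SalaryCalculator salaryCalculator;
}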

JSR 299, part of the Java EE 6 specification, provides a feature called Contexts and Dependency Injection (CDI): http://docs.oracle.com/javaee/6/tutorial/doc/giwhb.html

JSR 330 more specifically talks about dependency injection: https://dzone.com/articles/what-relation-betwe-there

A more elaborate explanation of the usage of various DI annotations: http://blogs.sourceallies.com/2011/08/spring-injection-with-resource-and-autowired/#more-2350