Archive | March, 2012

Basic File Upload with Apache Commons

31 Mar

Below is a piece of servlet code you can try out if you ever have a requirement to implement basic file uploads on your website using the Apache Commons FileUpload library.

/*
* To change this template, choose Tools | Templates
* and open the template in the editor.
*/
package test;

import java.io.File;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.Iterator;
import java.util.List;

import javax.servlet.ServletConfig;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.commons.fileupload.FileItem;
import org.apache.commons.fileupload.FileUploadException;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;
import org.apache.commons.io.FilenameUtils;

public class UploadPDF extends HttpServlet {

    private static final String TMP_DIR_PATH = "D:\\temp";
    private File tmpDir;
    private static final String DESTINATION_DIR_PATH = "D:\\UploadedPDF";
    private File destinationDir;

    @Override
    public void init(ServletConfig config) throws ServletException {
        super.init(config);
        tmpDir = new File(TMP_DIR_PATH);
        if (!tmpDir.isDirectory()) {
            throw new ServletException(TMP_DIR_PATH + " is not a directory");
        }
        destinationDir = new File(DESTINATION_DIR_PATH);
        if (!destinationDir.isDirectory()) {
            throw new ServletException(DESTINATION_DIR_PATH + " is not a directory");
        }

    }

    @Override
    protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        response.setContentType("text/plain");
        PrintWriter out = response.getWriter();

        DiskFileItemFactory fileItemFactory = new DiskFileItemFactory();
        /*
         *Set the size threshold, above which content will be stored on disk.
         */
        fileItemFactory.setSizeThreshold(1 * 1024 * 1024); //1 MB
        /*
         * Set the temporary directory to store the uploaded files of size above threshold.
         */
        fileItemFactory.setRepository(tmpDir);

        ServletFileUpload uploadHandler = new ServletFileUpload(fileItemFactory);

        try {
            /*
             * Parse the request
             */
            List<FileItem> items = uploadHandler.parseRequest(request);
            Iterator<FileItem> itr = items.iterator();
            while (itr.hasNext()) {
                FileItem item = itr.next();
                if (item.isFormField()) {
                    /*
                     * Handle form fields.
                     */
                    out.println("Field Name = " + item.getFieldName() + ", Value = " + item.getString());
                } else {
                    /*
                     * Handle uploaded files.
                     */
                    String fileName = item.getName();
                    out.println("Field Name = " + item.getFieldName()
                            + ", File Name = " + item.getName()
                            + ", Content type = " + item.getContentType()
                            + ", File Size = " + item.getSize()
                            + ", boolean isInMemory = " + item.isInMemory());
                    /*
                     * Write file to the location.
                     */

                    out.println("File Name is = " + FilenameUtils.getName(fileName));
                    File file = new File(destinationDir, FilenameUtils.getName(fileName));
                    out.println(destinationDir);
                    out.println(item.getName());
                    item.write(file);
                }
            }
        } catch (FileUploadException ex) {
            log("Error encountered while parsing the request", ex);
        } catch (Exception ex) {
            log("Error encountered while uploading file", ex);
        } finally {
            out.close();
        }
    }
}

Please make sure that you have the Apache Commons IO library in your container's lib folder, or the commons-io JAR in your application's lib folder. In addition to the Commons IO library, you will also need to download the Commons FileUpload JAR and place it in your webapp's lib folder. All these JARs are required for this code to run. More details on the location of these files can be found here.

In addition to having all the dependencies imported into your project, you also have to make sure that the proper encoding is set on your form's page, that is, the request encoding type should be set as

enctype="multipart/form-data"

In addition to this, we also need to ensure that the file name, file size, and content are properly sent by the client (browser). Sometimes, due to security policies, some of this information might be blocked, so I suggest checking for these possibilities if you run into issues. We can check this by adding a couple of print statements and making sure that none of the values come out as null.
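As a minimal sketch of that null check, and of deriving a safe base file name on the server side: the class and method names below are my own for illustration, and the helper uses plain Java only, so it does not assume the commons-io FilenameUtils class.

```java
// Minimal sketch: defensive handling of the file name a browser sends.
// Some browsers (notably older IE) send the full client-side path, so we
// strip any Windows or Unix path prefix, much like FilenameUtils.getName.
public class UploadChecks {

    // Returns the base file name, or null if the client sent nothing.
    static String baseName(String clientFileName) {
        if (clientFileName == null) {
            return null; // caller should reject the upload and log it
        }
        int cut = Math.max(clientFileName.lastIndexOf('/'),
                           clientFileName.lastIndexOf('\\'));
        return clientFileName.substring(cut + 1);
    }

    public static void main(String[] args) {
        // Simulated values as they might arrive from different browsers.
        System.out.println(baseName("C:\\Users\\me\\report.pdf")); // report.pdf
        System.out.println(baseName("report.pdf"));                // report.pdf
        System.out.println(baseName(null));                        // null -> reject
    }
}
```

A null result here is exactly the symptom mentioned above: the client or a security policy withheld the value, so the upload should be rejected rather than written to disk.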

At the time of writing this post, I have verified that this code works with IE9, FF10, Safari 5.03, and Opera 10.x.


Do you know how many cores your server has ?

27 Mar

Do you know how many cores your server has? If you don't have an answer to this, then you should probably get this information internally from the concerned team before you present your load-testing results to the stakeholders, or read this post. It's a very essential piece of information that hardly any performance engineer can ignore.

Ok, I understand that sometimes, for various reasons, it happens during load-testing engagements that the performance team is unaware of the capacity of the hardware infrastructure, like the number of CPUs the server has, network capacity, etc. Without this information it becomes really hard to draw meaningful conclusions from the test results. However, there are ways we can collect all this information ourselves. So let me share a quick tip on how to find the number of CPUs on Windows and Solaris servers.

Windows Task Manager is one of the best tools to find out the number of cores your server has. All you need to do is go to the command prompt and type taskmgr, or open the Task Manager window directly.

Cores

The 4 histograms in the above picture tell me that I have 4 CPU cores in my laptop. Each histogram represents a single core in the box. So if you see N histograms, it generally means you have N cores on that box.

We can also check the same information by going to the command prompt and typing msconfig. Once inside the System Configuration window, click on the Boot tab and then Advanced options. The drop-down list will show you the number of processors the server has. (Please note that on a server machine you might or might not have access to msconfig, but you can always ask the person who has access to check this for you.)

Cores_01

Cores_02

In addition to the above methods, we can also use Process Explorer to get this information. Just install Process Explorer, go to the View menu, and open System Information; here you should see a screen similar to the one in Windows Task Manager. Please ensure that you select the “Show one graph per CPU” option.

Cores_03
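If you can run Java on the box, the JVM itself can also report the CPU count; a minimal sketch:

```java
// Minimal sketch: ask the JVM how many logical CPUs the OS exposes.
// Note this counts logical CPUs (hyper-threads included) -- the same
// number of graphs Task Manager shows, not physical sockets.
public class CpuCount {
    public static void main(String[] args) {
        int logicalCpus = Runtime.getRuntime().availableProcessors();
        System.out.println("Logical CPUs visible to the JVM: " + logicalCpus);
    }
}
```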

Sometimes, for some databases (a rare case in my experience) like Oracle/MS SQL, depending on the license type, we might have to dig deeper to understand the relation between NUMA node, socket, logical CPU, core, and hyper-threading. Here is the thumb rule which I picked up from the Microsoft MSSQL team some years back:

CPU Socket – the physical holder of a CPU package, which may be multicore.

CPU Core (Core) – a single physical CPU; it can be part of a package placed in a given socket. Dual or multiple cores, depending on packaging, might or might not share the L2 cache.

Logical CPU (logical thread, hyper-thread) – a single logical CPU that has its own set of dedicated components on the physical CPU but shares the rest of the physical core with the other logical CPUs.

NUMA Node – a set of CPU sockets packaged together with a block of dedicated memory. A node might consist of just one CPU socket.

So based on the above, a NUMA node contains sockets, sockets contain cores, and cores contain logical CPUs. The number of CPUs reported by Windows in Task Manager or PerfMon can be calculated as Nodes * Sockets/Node * Cores/Socket * LogicalCPUs/Core. For example, for a single-node system with two sockets, two cores per socket, and two logical CPUs per core, the OS will show 1 * 2 * 2 * 2 = 8 CPUs.
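The arithmetic above can be sketched as a tiny helper (the class and method names are mine, not from any library):

```java
// Minimal sketch of the Nodes * Sockets * Cores * LogicalCPUs rule of thumb.
public class CpuTopology {

    static int logicalCpuCount(int numaNodes, int socketsPerNode,
                               int coresPerSocket, int threadsPerCore) {
        return numaNodes * socketsPerNode * coresPerSocket * threadsPerCore;
    }

    public static void main(String[] args) {
        // Single node, two sockets, two cores per socket, two hyper-threads per core.
        System.out.println(logicalCpuCount(1, 2, 2, 2)); // 8
    }
}
```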

For a Solaris box, the command below will give you information about the CPUs and their associated cores. You can get more information on this command by referring to the man pages (below is a copied extract from the man pages).

psrinfo -pv

DESCRIPTION
psrinfo displays information about processors. Each physical processor may support multiple virtual processors. Each virtual processor is an entity with its own interrupt ID, capable of executing independent threads. Without the processor_id operand, psrinfo displays one line for each configured processor, displaying whether it is on-line, non-interruptible (designated by no-intr), spare, off-line, faulted or powered off, and when that status last changed. Use the processor_id operand to display information about a specific processor. See OPERANDS.

OPTIONS

The following options are supported:

-s processor_id
Silent mode. Displays 1 if the specified processor is fully on-line, and 0 if the specified processor is non-interruptible, off-line, or powered off. Use silent mode when using psrinfo in shell scripts.

-p

Display the number of physical processors in a system. When combined with the -v option, reports additional information about each physical processor.

-v

Verbose mode. Displays additional information about the specified processors, including: processor type, floating point unit type, and clock speed. If any of this information cannot be determined, psrinfo displays unknown.

When combined with the -p option, reports additional information about each physical processor.

OPERANDS
The following operands are supported:

processor_id – The processor ID of the processor about which information is to be displayed.

Hope this helps.


Java Performance Series -1

26 Mar

Over the years of working with applications built on various Java technologies, I have seen the below maximum Java heap sizes with regard to various operating systems.

Operating System   | Heap Size                  | JVM (Java Virtual Machine)
Windows            | < 1500 MB                  | 32-bit
Windows            | 1600 MB to 32 GB           | 64-bit
Linux              | < 2 GB (approx. 1800 MB)   | 32-bit
Linux              | 1600 MB to 32 GB           | 64-bit
Solaris            | < 3 GB                     | 32-bit
Solaris            | 3 GB to 32 GB              | 64-bit

There always exist restrictions on 32-bit systems, given that the OS typically reserves around 2 GB of the address space for its own use, leaving roughly 2 GB for programs.

Among the operating systems, Solaris tends to give more horsepower to applications compared to Windows/Linux when it comes to performance. I believe Solaris works faster, or gives more horsepower, because it is non-GUI or has limited GUI features compared to other OSes. The heap size is often configured via

-Xmx(size) – to set the maximum Java heap size

-Xms(size) – to set the initial Java heap size
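A running JVM can report the heap limits these flags produce; a minimal sketch (the MB figures will of course vary with your own -Xms/-Xmx settings):

```java
// Minimal sketch: report the heap limits of the current JVM.
// maxMemory() reflects the -Xmx ceiling; totalMemory() is the heap
// currently committed, which starts near the -Xms value.
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        System.out.println("Max heap (-Xmx ceiling): " + rt.maxMemory() / mb + " MB");
        System.out.println("Committed heap:          " + rt.totalMemory() / mb + " MB");
        System.out.println("Free within committed:   " + rt.freeMemory() / mb + " MB");
    }
}
```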

When it comes to hosting on 64-bit systems, there often exist performance issues with the application in spite of sufficient memory and horsepower, and quite often I have seen people use the -XX:+UseCompressedOops flag to boost the performance of the application. However, testing should confirm the gain in performance.

If you want to check what version of the VM your application is using, use the below command:

C:\>java -version
java version "1.6.0_31"
Java(TM) SE Runtime Environment (build 1.6.0_31-b05)
Java HotSpot(TM) Client VM (build 20.6-b01, mixed mode, sharing)
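The same information is available from inside the application via system properties; a minimal sketch (note that sun.arch.data.model is a Sun/Oracle HotSpot-specific property and may be absent on other JVMs):

```java
// Minimal sketch: query the JVM version and word size from inside the app.
public class VmInfo {
    public static void main(String[] args) {
        System.out.println("java.version        = " + System.getProperty("java.version"));
        System.out.println("java.vm.name        = " + System.getProperty("java.vm.name"));
        // HotSpot-specific: "32" or "64"; may be null on non-Sun/Oracle JVMs.
        System.out.println("sun.arch.data.model = " + System.getProperty("sun.arch.data.model"));
    }
}
```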


Java Performance Series

22 Mar

It’s really been a long time since I worked on performance testing of Java-based applications (a good 2+ years), so in order to refresh my past experience, I am thinking of starting a series of posts showcasing my thoughts on testing, identifying, isolating, and fixing some of the key performance issues I have observed while working on Java-based applications.

We know that performance tuning of Java-based applications is a painful, iterative process where there is no one-size-fits-all solution for determining the optimum memory requirements of an application. I call this a painful process for the simple reason that there are very few people ready to make changes to the code base in order to fix a performance issue, and almost no one when we are dealing with legacy systems that have no original SMEs still working on them. Any change in the code base is considered a high-risk item unless it’s a very low-hanging fruit, something external that still impacts application performance (think load balancing). So I believe that’s one of the primary reasons so many people turn to tuning memory allocation rather than fixing badly written or outdated data-structure code used by the application. Another valid reason I can think of is that hardware has become a lot cheaper than hiring a developer to fix the issue; however, this approach by no means assures the business that it will fix the original issue without side effects on other parts of the code. There always exists a risk of regression.

Allocating the right amount of memory to the Java heap, along with the right JVM runtime environment, can help mitigate some or even most performance issues, but definitely not all, especially if you have designed the application without keeping performance engineering requirements in mind. Memory requirements for Java-based applications are quite often described and measured in terms of Java heap size. Many folks say that the larger the heap size, the better the performance in terms of latency and throughput, but I believe otherwise, for the simple reason that if you have bad code consuming a lot of memory, a larger heap will give that bad piece of code extra time to live rather than make it fail fast. That’s a band-aid and not a permanent fix. (The app pool recycling technique used by IIS is one good example of this.)

Tuning the JVM often helps ensure that the application meets acceptable levels of response time, throughput, and availability. To a large extent we can also improve the start-up time, latency, throughput, and manageability of the application by tuning the JVM and using the right runtime environment. The availability of an application can also be improved by deploying it across multiple JVMs, provided the application is designed in a way that supports this. The client JVM runtime often has better start-up time than the server JVM runtime, but lacks the aggressive code optimization techniques used by the server runtime, which generally yield better long-run throughput. Depending on the application and system requirements, one can choose between the client and server runtime environments.

That’s it for now, stay tuned for next post on some of my weird thoughts on Java performance stuff.


James Whittaker, Google, Privacy, Ads…

17 Mar

James Whittaker, yes you read the name right: the test-engineering guru, and his rant has become a viral hit on the internet. If you do not know what I am talking about, read here, and after reading that article do a quick Google search; you should be able to find many other news channels that have reported on it.

So what’s so special about this? First, he is a known and respected test-engineering expert, and I have been reading his work via the Google Testing Blog. So we share the same profession, with the only difference being that, I guess, he does more mentoring/leadership roles and I am a contractor doing small tasks here and there.

Another thing I share with him is a concern for privacy, and I agree with him that Google has lost its commitment to user privacy. I just hate it when Google starts reading my mail contents and displaying ads matching the content of my mails. This really annoys me and gives me the impression that someone has access to my mailbox. There are many other Google products which show similar behavior and track you ruthlessly.

So in case you want to prevent Google or someone else from tracking you, I would say go ahead and install this in your IE browser. In case you are on Firefox, I would suggest installing Collusion and the Adblock Plus blocker. With Collusion, you will be able to see how many ad companies track you and your browsing session. I am sure this data will surprise you. IE also has tracking protection lists which can block Google’s tracking features. At the time of writing, IE has protections for the below Google sites:

-d news.google.com
-d youtube.com
-d blogger.com
– apis.google.com/*plusone*
-d plus.google.com
-d googleadservices.com
-d googletagservices.com
-d googlesyndication.com
-d googleadservices.com
-d google-analytics.com
-d doubleclick.net
-d doubleclick.com
http://google.*/api/sclk?
http://google.*/client_204?
http://google.*/gen204?
– google.com*/lh/ajaxlog?
– google.com*/uds/stats?
– google.com*/bin/stats?
– google.com*/log?
– google.com*/buzz

Finally, there is something I disagree with James on moral/ethical grounds: it looks bad when you join a competitor and rant loudly about your ex-company. Perhaps a personal blog would have made more sense, but then it probably would not have gone viral, and this is how society in general works.

My Thoughts on “Customer sues Epicor after ERP software project attempt ends in ‘big mess’”

11 Mar


This is an interesting case in which the customer sues the vendor for failing to implement an ERP project within the agreed timelines and, probably, budget. ERP projects have their own challenges with regard to implementation and the requirements-gathering process. I fully agree with ParknPool that it impacts the bottom line of the company whenever key systems go offline. I have seen the mess that happens, especially if you are a multi-million-dollar company with sales and order-booking partners located across the country. Given that most of the key information about clients, like credit limits, account balances, banking details, etc., is interdependent and held in the same ERP system, it becomes a challenge for company officials to work without this information. Manually taking orders and updating payments is just impossible, given that during peak seasons there are many chances of human error, or of intentional gaps created by the sales team. Remember: no matter how high your position in the company, or how close your relationship with your clients, there always exists a risk that you might go beyond the credit limit allocated to a customer if you are a super-busy sales guy with a bonus attached to your monthly targets.

So why is this case interesting? There are a number of reasons:

  • The implementer failed to read the requirements correctly. “Because we’re a drop-ship business, we need to invoice our client after the last item ships, because they could ship from multiple locations,” Fonner said. “The Epicor system couldn’t deal with that.” Though the requirement looks like a 4-line statement, I am sure it takes at least 2 weeks to get clarity on it and a minimum of 4 weeks to implement (this is based on my experience).
  • There was also an attempt to change scope, which we often see in projects for various reasons. “Epicor also performed something of a bait-and-switch with ParknPool, initially saying that the company’s need would be met with a specific set of software modules, but then saying that more were required after the project started,” Fonner said. Most service providers try to do this indirectly in cases where they feel the client can provide still more business. This is just an immature and terrible idea that is surely going to backfire.
  • The software implemented was untested. If you deploy untested software in production, there is a risk of doing business at a loss; at least this is true for software as complex as ERP systems. These systems hold a lot of information with regard to inventory, finance, rates, client data, etc. I could go on about how various departments in a company see an ERP system. It is just a very high-risk decision to put an ERP into actual usage without testing the validity of the business rules implemented.
  • It is not easy to sue the seller here, for the simple reason that during the implementation phase there apparently weren’t any legal checkpoints or milestones to measure percent completion of work. If there had been checkpoints, ParknPool could have seen the red flags soon after execution started. Checkpoints are a must in any project to see where we stand.
  • This should be an interesting case to follow, as I have seen many companies change vendors due to bad management, improper execution, operational loopholes, or lack of quality in the work done. Suing the vendor for bad quality or an incorrect or incomplete implementation should have some impact on the way the IT industry and outsourcing work. In all, this case should benefit the IT industry in some way, at least in defining the legal definition of “DONE.”



Quote for the Week

10 Mar

“If you wish to persuade me, you must think my thoughts, feel my feelings, and  speak my words.”

This is a very powerful quote by a Roman philosopher, a very wise man who lived some 2,000 years before I was born. The good thing about this quote is that many of us use it on a daily basis, and it is quite often used to train people in sales.
