Does It Make Any Sense Today to Use Java for Web Applications?

I spent nine years as a Java architect/programmer through mid-2008 and was always a big proponent of the Java ecosystem; today, though, I wonder if the use cases for building web applications in Java simply aren't there any more.

In the earliest days of J2EE I worked with one other programmer to build a large JSP-based system for SABRE. It comprised about 100 JSPs and used nothing but Java and JDBC. EJBs were useless, and in any case we had no tools to build them. There were no other frameworks, no AJAX, basically only the bare bones of what WebLogic provided. Still, with such a minimal ecosystem we were able to build an application with a customer (travel agency) site and an internal workflow and management system with full auditing of the data (corporate flight discount contract details) in six months, at which point a number of agencies started using the beta. It took SABRE a further six months to integrate a login system with their reservation system, during which time we expanded the feature set. It was very successful for SABRE, improving their bottom line and making the customers happy.
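For anyone who never saw that era, nearly every page looked something like the sketch below. This is from memory, and the table, column, and connection details are hypothetical, but the pattern of raw JDBC calls embedded straight in the JSP is accurate:

    <%@ page import="java.sql.*" %>
    <html><body>
    <table>
    <%
        // Hypothetical connection string and credentials, for illustration only.
        Connection conn = DriverManager.getConnection(
            "jdbc:oracle:thin:@dbhost:1521:orcl", "appuser", "secret");
        try {
            // Look up discount contracts for the agency named in the request.
            PreparedStatement ps = conn.prepareStatement(
                "SELECT contract_id, discount_pct FROM contracts WHERE agency_id = ?");
            ps.setString(1, request.getParameter("agency"));
            ResultSet rs = ps.executeQuery();
            while (rs.next()) {
    %>
      <tr><td><%= rs.getString("contract_id") %></td>
          <td><%= rs.getBigDecimal("discount_pct") %></td></tr>
    <%
            }
        } finally {
            conn.close();  // no connection pools or ORMs, just raw JDBC
        }
    %>
    </table>
    </body></html>

Ugly by today's standards, but there was almost nothing between you and the page, which is a large part of why two people could move so fast.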

I think that today, using all the frameworks and tools available, it would take just as long or even longer to deliver the same application. Using a modern framework stack like Hibernate/Spring, I just don't see that performance or productivity would be sufficiently greater than with non-Java options to justify building it in the Java world today.

During my Java-only days I pooh-poohed PHP in this blog (and got rightly smacked down), but after working with it more extensively, as well as looking at Ruby and Python alternatives, I've found that I can get more work done faster with these "less enterprisey" language ecosystems.

Java is not going away anytime soon, as it is still firmly entrenched in lots of corporate IT groups, especially in places where being able to "file a ticket" and get support (or pass blame) is more important than any other factor. The Java world also has extensive ways to connect to other systems which, once you decide to go that way, become harder to replace.

The question is, does the need for Java plumbing require that the actual web application be in Java? I think not, though I imagine (and have enough experience on architecture teams to know) this would be a hard sell in many IT shops.

In the early days of AJAX I worked on two directory applications for finding and searching information on employees and field offices. Java was only used for fetching data from the databases, and all of the actual work was done in JavaScript. I found it refreshingly productive, being able to make changes interactively without compilation or restarting the application server. This led to my trying PHP and later Ruby, and looking at various JVM alternatives such as Groovy.
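The division of labor was roughly this; the sketch below uses hypothetical names (the real apps queried a directory database, hard-coded here), but the shape is right: Java shrinks to a thin data endpoint, and everything that changes often lives in the browser.

    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical endpoint: the servlet only fetches and serializes data;
    // all presentation and interaction logic lives in browser-side JavaScript.
    public class EmployeeSearchServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            String q = req.getParameter("q");
            resp.setContentType("application/json");
            PrintWriter out = resp.getWriter();
            // In the real apps this came from a database query; hard-coded here.
            out.print("{\"results\":[{\"name\":\"" + escape(q) + "\"}]}");
        }

        private static String escape(String s) {
            return s == null ? "" : s.replace("\\", "\\\\").replace("\"", "\\\"");
        }
    }

Since the JavaScript is just files served to the browser, tweaking the UI meant saving and reloading the page; nothing needed compiling or restarting until the data contract itself changed.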

At the same company the customer service application was built using BEA WebLogic Portal Server, despite the site having no portlets. The added burden of the additional layers and configuration utilities, plus the need to repeatedly restart WebLogic itself, made progress maddeningly slow. Of course this was a stupid decision, but even if we had used real portlets, the environment would still have been unproductive and costly to work in.

At my next position I worked for a digital printing company whose online store was in PHP (written by one guy in a couple of months); someone in the company had convinced them to replace it with Java. I gave up within a few months, as no progress was ever made and the design was a series of wishful thoughts. The company eventually went through multiple development teams, and to this day (five years or so later) it is still limited by the original PHP-based store. The new store was supposed to support a more flexible business model but otherwise have similar features to the original.

Naturally it could be argued that the fault in these examples has nothing to do with Java and everything to do with stupid management, but the more productive the environment, the more likely you are to get something working before brainless management can make a bad decision.

Productivity, and the cost savings derived from it, can be a powerful argument in IT, assuming you can get anyone to listen. My two AJAX-based apps made more of an impact in the company than their importance as applications would suggest.

So what does Java have that is truly a reason to use it?

Better performance would seem the obvious answer, but the time to process an individual request is dominated by I/O and database access in any environment, and scaling is best handled horizontally anyway, so the speed of a single thread is rather unimportant. To make up some plausible numbers: if a request spends 80ms in database round-trips and 10ms in application code, even doubling the language's raw speed shaves only 5ms off the total.

Productivity? From my experience with everything Java-web-oriented, and just from listening to others, it's obvious that things take longer even in the modern Java era of annotations and a myriad of front-end frameworks like JSF. If I can build a page from scratch five times in the time it takes you to build it once, I not only get the job done faster, I can redo the page until it's right instead of building it once and never having the time (or stomach for the refactoring pain) to change it. I can build test cases in the time it takes you to get all your configurations to match.

Connectivity? The argument here might be better, certainly if you live in a Java SOA world it might seem easier to stick with Java for everything. But is it sufficiently better than having a split environment with something else as the web piece?

In any case, making a case for not using Java for web applications in a corporate IT world wedded to Java is going to be hard no matter what the advantages are. That said, if you can truly look at the alternatives without bias toward corporate realities, and measure them according to their advantages, it might be hard to find a use case for strictly using Java for web applications.