Applied Computer Systems
ISSN 2255-8691 (online)
ISSN 2255-8683 (print)
May 2018, vol. 23, no. 1, pp. 37–44
doi: 10.2478/acss-2018-0005
https://www.degruyter.com/view/j/acss
©2018 Kristiāns Kronis, Marina Uhanova.
This is an open access article licensed under the Creative Commons Attribution License
(http://creativecommons.org/licenses/by/4.0), in the manner agreed with De Gruyter Open.
Performance Comparison of Java EE and ASP.NET
Core Technologies for Web API Development
Kristiāns Kronis 1*, Marina Uhanova 2
1, 2 Riga Technical University, Riga, Latvia
Abstract – The paper describes the implementation of organic
benchmarks for Java EE and ASP.NET Core, which are used to
compare the performance characteristics of the language
runtimes. The benchmarks are created as REST services, which
process data in the JSON format. The ASP.NET Core
implementation utilises the Kestrel web server, while the Java EE
implementation uses Apache TomEE, which is based on Apache
Tomcat. A separate service is created for invoking the benchmarks
and collecting their results. It uses Express with ES6 (for its async
features), Redis and MySQL. A web-based interface for utilising
this service and displaying the results is also created, using
Angular 5.
Keywords – Benchmark testing, computer languages,
programming, software performance.
I. INTRODUCTION
Both Java EE (Java Platform, Enterprise Edition), developed
by Oracle, and ASP.NET (Active Server Pages .NET),
developed by Microsoft, offer features fit for the creation of
web-based applications. However, Java EE has historically had
better cross-platform support: the HotSpot implementation of the
JVM (Java Virtual Machine), which is supported by Oracle, can be
installed on both Windows and GNU/Linux operating systems. The
full .NET Framework, by contrast, does not run on GNU/Linux, and
Mono must be used instead; Mono began as an open-source project
and was acquired by Microsoft only in 2016 [1]. It does not
support WPF (Windows Presentation Foundation) or WWF (Windows
Workflow Foundation), and offers only limited support for WCF
(Windows Communication Foundation) and ASP.NET [2].
However, with the release of .NET Core in 2016 and,
subsequently, ASP.NET Core [3], Microsoft now supports more
operating systems. With a first-party CLR (Common Language
Runtime) implementation available on GNU/Linux, a modern rewrite
of ASP.NET, and a new web server, Kestrel [4], it is worth
re-evaluating which technology stack is better suited for new
projects.
The evaluation can be performed by examining their differences:
how the Kestrel web server differs from IIS, which it is intended
to replace, and from the most popular Java web servers, such as
Apache Tomcat [5]; and how runtime performance differs in typical
use cases, under similar, commonly used configurations.
The present paper describes an implementation of a system,
which is to be used for running organic benchmarks (real-world
tests) and collecting their results, offering immediate visual
feedback to the user. The main goal of the benchmarking is to
gain an approximation of how performant both technology stacks
are on the GNU/Linux operating system and to highlight any
obvious differences. General guidelines are also laid out for the
software architecture and implementation practices to ensure the
capability of generating hundreds of concurrent requests and
efficiently processing them, as well as handling any errors.

* Corresponding author's e-mail: kristians.kronis@edu.rtu.lv
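To make the concurrency requirement concrete, the following sketch shows one common way to generate many simultaneous requests in Java and to collect both successes and errors. The `sendRequest` method here is a hypothetical stand-in for an actual HTTP call to a benchmark endpoint; the paper's own test driver uses Express with ES6 async features instead.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch of a load generator: fire N "requests" concurrently and
// collect per-request outcomes, recording failures without aborting.
public class LoadGenerator {

    // Hypothetical stand-in for an HTTP request; a real implementation
    // would call the REST API under test and return its response body.
    static String sendRequest(int id) {
        if (id % 50 == 0) {                      // simulate an occasional failure
            throw new RuntimeException("request " + id + " failed");
        }
        return "ok-" + id;
    }

    public static void main(String[] args) throws Exception {
        int total = 200;                          // hundreds of concurrent requests
        ExecutorService pool = Executors.newFixedThreadPool(50);
        List<Future<String>> futures = new ArrayList<>();

        for (int i = 0; i < total; i++) {
            final int id = i;
            futures.add(pool.submit(() -> sendRequest(id)));
        }

        int ok = 0, failed = 0;
        for (Future<String> f : futures) {
            try {
                f.get();                          // wait for and collect the result
                ok++;
            } catch (ExecutionException e) {      // errors are counted, not fatal
                failed++;
            }
        }
        pool.shutdown();
        System.out.println(ok + " succeeded, " + failed + " failed");
    }
}
```

A real driver would replace the fixed thread pool with whatever concurrency model fits the client (the paper's choice of Node.js sidesteps thread management entirely by using asynchronous I/O).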
A common REST (Representational State Transfer) API
(Application Programming Interface), which uses JSON
(JavaScript Object Notation) for data transfer is described and
implemented in both technologies and deployed on identical
servers. A separate application is also created, consisting of a
front-end for test configuration, written in Angular 5, and a
back-end service for test execution and result processing,
written in Express and Node.js, which uses Redis for temporary
storage and MySQL for result logging. This system is designed
modularly: the servers implementing the testing APIs can be
configured in the front-end interface, in addition to configuring
Redis and MySQL logging.
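As an illustration, a call to one such benchmark endpoint might exchange payloads of the following shape. The endpoint path, field names and values below are hypothetical, chosen only to show the JSON request/response pattern, and are not taken from the described implementation:

```
POST /api/benchmark/sort HTTP/1.1
Content-Type: application/json

{ "size": 10000, "iterations": 5 }

HTTP/1.1 200 OK
Content-Type: application/json

{ "elapsedMs": 412, "status": "ok" }
```

Because both implementations expose the same API over HTTP and JSON, the test driver can treat them interchangeably and only needs each server's address.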
While no claims are made that the results will be objective,
the system should serve as a starting point and allow for
extensibility: adding more servers, which can run different
languages and software or hardware configurations, without code
changes, or extending the list of benchmarks to be run, should
there be a need for more specific tests in the future.
II. JAVA EE
Java EE (Java Platform, Enterprise Edition) is a superset of
Java SE (Java Platform, Standard Edition) that extends the
general-purpose Java APIs with features useful in an enterprise
setting, such as dependency injection (CDI, EJB), transaction
management (JTA) and dynamic webpage functionality (JSP, JSF), as
well as features for creating web services (JAX-RS, JAX-WS),
while also reducing development time and software complexity
(Fig. 1) [6]. Development is organised through the Java Community
Process (JCP) and based on Java Specification Requests (JSRs).
The present paper describes a setup that uses Java EE 7,
which was released in 2013 [7]. The Java EE 7 platform can be
further divided into the Full Platform and Web Profile, the
purpose of which is to provide a more limited set of features,
which is easier to support [8]. The developed API
implementation takes advantage of Servlets, JSON, CDI and