Repositories? We don’t need no stinkin’ repositories!

March 26, 2026

So, if you’re an old curmudgeon like me, you’ve written your JPA entities with static methods that perform finder-type operations, bulk updates, or whatever else you need beyond what EntityManager gives you. Let’s take the examples from our previous exploration of Panache.Next:

Class Diagram

So, if we needed to look up an Organization by name, we might have something like this:

@Entity
@NamedQueries({
    @NamedQuery(name = "Organization.findByName", query = "SELECT o FROM Organization o WHERE o.name = :name")
})
public class Organization extends Party {

    // private fields

    protected Organization() {}

    public Organization(String identifier, String name) {
        super(identifier, name);
    }

    public static List<Organization> findByName(EntityManager em, String name) {
        return em.createNamedQuery("Organization.findByName", Organization.class)
                .setParameter("name", name)
                .getResultList();
    }

    // public accessors, methods that do things, etc.
}

We would then rely on some external class to obtain an EntityManager instance and pass it into the method (because we can’t do dependency injection in entities, something I stare at in bewilderment all the time).

Then came the Hibernate with Panache extension in Quarkus. Suddenly, we had (very specific) dependency injection (ok, just the EntityManager) in entities! Yay! Now, with our parent Party class extending from PanacheEntity or PanacheEntityBase, we could make the above finder method simpler:

    public static List<Organization> findByName(String name) {
        return getEntityManager().createNamedQuery("Organization.findByName", Organization.class)
                .setParameter("name", name)
                .getResultList();
    }

Then, tragedy struck (ok, too dramatic). The static getEntityManager() method was removed in favor of defining repository interfaces. If you’re building your entities like this (the Active Record pattern), then you’ve probably uttered the title of this post, or something similar. For me, the presence of a repository class acting upon the entity means you have an anemic domain object that has no agency. An entity is an object, which means it has state and behavior, dadgumit! You shouldn’t need another class (one per entity, by the way) to do the things that it should be able to do itself.

Well, what do we do then? If we want to keep on using this style, then we need to make a few adjustments.

A new LTS version dropped recently, so let’s use it since it has the latest changes to the extension formerly known as Panache.Next.

Option 1: Hacker’s Paradise

So, we use our parent class (or make a new parent class for all of our entities), and re-add a getEntityManager() method:

    public static EntityManager getEntityManager() {
        return CDI.current().select(EntityManager.class).get();
    }

There’s probably some unforeseen problem with this, but it passes the hacker’s test: 1. It compiles. 2. It worked (on my machine). Ship it!

In a conversation I had with some of the developers of the extension, I mentioned this, and their heads didn’t immediately explode, but your mileage may vary, caveat emptor, etc.

Option 2: Let Them Eat Cake

You can define a repository interface inside the entity, which at least keeps cohesion high. But you have to do this in every entity, so you’re now doubling the number of classes, because something has to implement each interface, even if it’s generated…

    public interface Repo extends PanacheRepository<Organization> {
        @Find
        List<Organization> findByName(String name);
    }

The problem here is that you lose the connection to the named query, and any optimizations that come with it. The query should be cached after the first call, however.

Now, how do we use it? Well, you can use the entity metadata to access it:

Organization_.repo().findByName("Bob");

Or, even better, we can hide all of that behind the static finder method:

    public static List<Organization> findByName(String name) {
        return Organization_.repo().findByName(name);
    }

Option 3: There’s No School Like Old School

If we still want to use our named queries, we can still get access to a Session, which extends EntityManager, and we don’t need anything except an empty repository interface:

    public static List<Organization> findByName(String name) {
        return Organization_.repo().getSession()
                .createNamedQuery("Organization.findByName", Organization.class)
                .setParameter("name", name)
                .getResultList();
    }

    public interface Repo extends PanacheRepository<Organization> {
    }

These repository interfaces are not my favorite thing, but they are not going away. The only other option would be to not use the soon-to-be-renamed extension, putting yourself back into the old Java EE days of having to pass the EntityManager into the entity. The ThreadLocal trick is out of favor too, with the advent of Virtual Threads. You could use Option 1 from above, though.

Perhaps it’s best to live by the wisdom of The Man’s Prayer: “I’m a man, but I can change, if I have to, I guess”.


What’s Next for Panache?

February 4, 2026

Update: There are some updates noted below after sharing this info with the team working on Panache.Next.

When Hibernate with Panache (Panache) was introduced in Quarkus, I was skeptical at first. After I used it a few times, I was sold. It solved the main problem that I had in Java EE / Jakarta EE projects: entity classes do not support dependency injection. More specifically, they could not inject the EntityManager, so I had to pass it into my static finder / updater / whatever-er methods on the entity.

Now with Panache, the EntityManager was already there, just by extending PanacheEntity or PanacheEntityBase. Hurray! Big problem solved.

Now, with Panache.Next (or Panache 2 or whatever the final name will be) recently released in Quarkus 3.31, it’s time to kick the tires and see what’s different.

Setting The Stage

The documentation shows a simple example, so I’ll leave that as an exercise for the reader to try out. What about something a little more complex, like, inheritance? Let’s dust off an old idea from the early days – Party, Person and Organization (Coad Letter Issues 103 and 107; Java Modeling in Color with UML).

Simply put, a Party is a legal entity that can be sued in a court of law. Person and Organization are more specific instances that inherit from Party, where Person is generally a human, and Organization is generally a collection of humans (business, trade association, club).

Class Diagram
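Before the JPA annotations come into play, the hierarchy in the diagram can be sketched in plain Java (the field and accessor names here are mine, purely illustrative):

```java
// Bare-bones sketch of the Party hierarchy from the diagram.
// Field and method names are illustrative, not taken from the project.
abstract class Party {
    private final String name;

    protected Party(String name) {
        this.name = name;
    }

    public String name() {
        return name;
    }
}

// A Person is generally a human.
class Person extends Party {
    Person(String name) {
        super(name);
    }
}

// An Organization is generally a collection of humans.
class Organization extends Party {
    Organization(String name) {
        super(name);
    }
}
```

Any Party reference can hold either subtype, which is exactly the polymorphism the mapping strategies below have to express in table form.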

When mapping this class structure to the database (we’re using PostgreSQL for this example), we have four options (note that Party is an abstract class):

  1. @MappedSuperclass – Party is annotated with this and we have 2 tables, one each for Person and Organization
  2. @Inheritance(strategy = JOINED) – Party is annotated with this and we have 3 tables, one each for Party, Person, and Organization
  3. @Inheritance(strategy = SINGLE_TABLE) – Party is annotated with this and we have 1 table for everything with a discriminator column to tell which record is what type
  4. @Inheritance(strategy = TABLE_PER_CLASS) – Party is annotated with this and we have 2 tables, like the @MappedSuperclass style
(Schema diagrams: @MappedSuperclass and InheritanceType.TABLE_PER_CLASS; InheritanceType.JOINED; InheritanceType.SINGLE_TABLE)

I set all of this up with a Quarkus 3.27.2 app, and the code can be found at https://github.com/nderwin/party-time on the main branch. I only ran this in dev mode, using the database that was spun up as a dev service. The only Liquibase bit needed was to create the schemas, all table and field creation is left to Hibernate to do for simplicity of this example.

What Changed?

For starters, I created a new branch, spike/panache.next and then used the quarkus command line tool to upgrade the project to Quarkus 3.31.2. Initially, there was a bug with 3.31.0 and 3.31.1 with the dependency name that has since been fixed in 3.31.2. Now, what changes were made?

pom.xml

  • update from Java 21 to Java 25 (not necessary for this, but since it is supposed to support it, why not?)
  • Quarkus platform version 3.27.2 -> 3.31.2 (yeah, we knew that)
  • quarkus-junit5 -> quarkus-junit dependency
  • fix for the surefire argLine not including previously set values

Now, to use Panache.Next, we need to do some further changes:

  • quarkus-hibernate-orm-panache -> quarkus-hibernate-panache-next dependency change
  • add some annotation processor path stuff to the maven-compiler-plugin config section (see documentation)

That’s it! Now, the important changes are in the entities themselves.

Code Changes

In the REST API endpoint classes, we were simply calling Organization.listAll() or Person.listAll() for the endpoint that just returned all of their respective items. Those methods are no longer generated for your entity. Now, you must introduce some sort of repository interface in order to access that functionality.

Once you have that repository interface defined, then you @Inject it into wherever you need to perform those operations, and use that instead.

Update: Instead of injecting the repository, you can access it through the generated metadata class.

    // Old Way
    public List<Person> list() {
        return Person.listAll();
    }

    // New Way
    @Inject
    Person.Repo repo;

    public List<Person> list() {
        return repo.listAll();
    }

    // Alternative new way
    public List<Person> list() {
        return Person_.repo().listAll();
    }

But wait, it gets better. You need to define repository interfaces on every single class in the hierarchy (except if you are using @MappedSuperclass, in which case you only have to define repositories on every child class).

Update: There are repository interfaces already defined for you when extending PanacheEntity, available through the generated metadata class – managedBlocking() and statelessBlocking() – so you are not required to define any repository interfaces in your classes. Unfortunately, if you remove all of them, you end up with compile errors in the generated metadata classes. I logged https://github.com/quarkusio/quarkus/issues/52440 for the compile errors.

@MappedSuperclass
public abstract class Party extends PanacheEntity {
}

@Table(schema = "mapped", name = "organization")
@Entity(name = "MappedOrganization")
public class Organization extends Party {

    public interface Repo extends PanacheRepository<Organization> {
        @Find
        Organization findByName(String name);
    }
}

@Table(schema = "mapped", name = "person")
@Entity(name = "MappedPerson")
public class Person extends Party {

    public interface Repo extends PanacheRepository<Person> {
        @Find
        Person findByName(String name);
    }
}

That’s right, you can’t just have an empty interface either – it must have at least one operation defined in it, otherwise you get a compile error with the generated metadata classes (Person_, Organization_) because it won’t generate an import statement for the Repo interface if it doesn’t have any methods!

Update: It should just work to have a repository interface with no members. This is being addressed in https://github.com/quarkusio/quarkus/issues/50178 at the time of this update.

What about the other 3 types of inheritance mapping? That depends on what type of repository interface you define in your parent class.

If you use PanacheRepository, then you have to define 3 interfaces in your parent class, and 2 in each of the subclasses.

If you use PanacheManagedBlockingRepository, then you only have to define 2 interfaces in your parent class, and then 2 in each of the subclasses.

Oh, don’t forget you have to also define a method in every single one of those interface definitions, otherwise you get compile errors.

Why 2 or 3 interfaces in the parent and 2 in the children? To avoid compile errors in the generated metadata classes.

Note that the generics are probably not necessary; I was trying to make things simpler for the subclasses, but it didn’t help.

@Table(schema = "joined", name = "party")
@Entity(name = "JoinedParty")
@Inheritance(strategy = JOINED)
public abstract class Party extends PanacheEntity {

    protected interface Repo<T extends Party> extends PanacheRepository<T> {
        @Find
        T findByName(String name);
    }

    protected interface StatelessRepo<T extends Party> extends PanacheRepository<T> {
        @Find
        T findByName(String name);
    }

    protected interface StatelessBlockingRepo<T extends Party> extends PanacheStatelessBlockingRepository<T> {
        @Find
        T findByName(String name);
    }
}

@Table(schema = "joined", name = "organization")
@Entity(name = "JoinedOrganization")
public class Organization extends Party {

    public interface Repo extends Party.Repo<Organization> {
        @Find
        @Override
        Organization findByName(String name);
    }

    public interface StatelessRepo extends Party.StatelessRepo<Organization> {
        @Find
        @Override
        Organization findByName(String name);
    }

    public interface StatelessBlockingRepo extends Party.StatelessBlockingRepo<Organization> {
        @Find
        @Override
        Organization findByName(String name);
    }
}

@Table(schema = "joined", name = "person")
@Entity(name = "JoinedPerson")
public class Person extends Party {

    public interface Repo extends Party.Repo<Person> {
        @Find
        @Override
        Person findByName(String name);
    }

    public interface StatelessRepo extends Party.StatelessRepo<Person> {
        @Find
        @Override
        Person findByName(String name);
    }

    public interface StatelessBlockingRepo extends Party.StatelessBlockingRepo<Person> {
        @Find
        @Override
        Person findByName(String name);
    }
}

Compare this to using PanacheManagedBlockingRepository:

@Table(schema = "multiple", name = "party")
@Entity(name = "TablePerClassParty")
@Inheritance(strategy = TABLE_PER_CLASS)
public abstract class Party extends PanacheEntity {

    protected interface Repo<T extends Party> extends PanacheManagedBlockingRepository<T> {
        @Find
        T findByName(String name);
    }

    protected interface StatelessRepo<T extends Party> extends PanacheStatelessBlockingRepository<T> {
        @Find
        T findByName(String name);
    }
}

@Table(schema = "multiple", name = "organization")
@Entity(name = "TablePerClassOrganization")
public class Organization extends Party {

    public interface Repo extends Party.Repo<Organization> {
        @Find
        @Override
        Organization findByName(String name);
    }

    public interface StatelessRepo extends Party.StatelessRepo<Organization> {
        @Find
        @Override
        Organization findByName(String name);
    }
}

@Table(schema = "multiple", name = "person")
@Entity(name = "TablePerClassPerson")
public class Person extends Party {

    public interface Repo extends Party.Repo<Person> {
        @Find
        @Override
        Person findByName(String name);
    }

    public interface StatelessRepo extends Party.StatelessRepo<Person> {
        @Find
        @Override
        Person findByName(String name);
    }
}

Conclusion

The extension is marked as experimental at this point, so it is subject to change. I would hope that the necessity of defining a method in the repository interface is removed because that’s a lot of repetition. I also hope the stateless repository bits become optional.

My biggest hope would be to get rid of defining any repository interface, but I doubt I would win that battle.


Just Say No – Succinctly – with Java 25

January 2, 2026

Somehow I stumbled upon this git repo:  https://github.com/hotheadhacker/no-as-a-service. Cute.  Funny.  Oh, huh, it’s JavaScript.

Then I started thinking, surely, I can do this with Java.  Just the JDK, no dependencies.  So, that led me to forking the project into this:  https://github.com/nderwin/no-as-a-service-java.

With the latest improvements in Java 25, I was able to pare this down into a single source file, in less than 30 lines.  That’s not even lines of code, it’s the total number of lines in the file!  Let’s dive into it.

The Code

import module jdk.httpserver;

void main() throws IOException {
    final CopyOnWriteArrayList<String> reasons = new CopyOnWriteArrayList<>();
    Stream.of(Pattern.compile("\\\\u([0-9a-fA-F]{4})")
            .matcher(Files.readString(Paths.get("reasons.json")))
            .replaceAll((m) -> String.valueOf((char) Integer.parseInt(m.group(1), 16)))
            .split(System.lineSeparator())
    )
            .filter((t) -> !t.startsWith("[") && !t.startsWith("]"))
            .forEach((t) -> reasons.add(t.substring(3, t.length() - 2).trim()));

    final HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
    server.createContext("/no", (final HttpExchange exchange) -> {
        final byte[] response = "{\"reason\":\"%s\"}".formatted(reasons.get(
                ThreadLocalRandom.current().nextInt(reasons.size())
        )).getBytes(StandardCharsets.UTF_8);

        exchange.getResponseHeaders().set("Content-Type", "application/json");
        exchange.sendResponseHeaders(200, response.length);
        try (OutputStream os = exchange.getResponseBody()) {
            os.write(response);
        }
    });

    server.setExecutor(null);
    server.start();
    System.getLogger(this.getClass().getName()).log(System.Logger.Level.INFO, "Server started successfully");
}

What is this doing?

Line 1 – Hey, wait a minute… where’s the package declaration? What is this module keyword?

We’re told “put your code in a package, always” as a best practice. That’s nice, but here we have a single class. We are not using it in another class. We’re not even getting out of the main method (as we’ll see later). Putting this class into a package is unnecessary, so we don’t need that line.

Module imports are a new feature of Java 25 (previewed in earlier releases). It basically functions like a star import, but it can encompass multiple packages. Here, the jdk.httpserver module covers the com.sun.net.httpserver and com.sun.net.httpserver.spi packages. This allows us access to the HttpServer class to serve our API endpoint.

You’ll also notice some other classes being referenced in the code with no corresponding import statement. Those all belong to packages that are part of the java.base module, which is imported for you so you don’t even need to declare it.

Line 3 – Where’s the class declaration? Don’t you need public static before that method declaration? What about the method argument?

This is more Java 25 goodness – since this is a simple, single class application, we don’t have to declare the class, nor the additional bits about the main() method. These are just implied, so we can get right down to writing our code with less boilerplate.

Lines 4 – 11 – The intent here is to load a file of reasons, from which we will later randomly pull to provide the API response. The original project stored this info as a JSON array. To make it more interesting, there were also Unicode escaped characters in the reason text, for example

[
  "My future self wrote me a note: \u2018Please don\u2019t do this again.\u2019"
]

Note the \u2018 and \u2019 characters – when you read the file, you get that exact string, not the Unicode character it represents, unless you do some shenanigans with java.util.Properties, which will do the decoding for you. Instead, I’m using a Pattern to replace each escape by parsing its 4 hex “digits” into an int, casting that to a char, and turning it into a String.

So, the whole chain of events goes like this:

  1. get the path to the reason file – Paths.get()
  2. read the whole file into a String – Files.readString()
  3. pass that off to the pattern matcher – Pattern.compile().matcher()
  4. replace all of the encoded Unicode characters with their decoded values – .replaceAll()
  5. split the file into individual lines using the line separator character from the system – .split()
  6. wrap that String[] with a Stream so we can process each line – Stream.of()
  7. filter out the opening and closing square brackets of the JSON array – .filter()
  8. for each of the remaining lines, let’s strip indentation and double quotes, trim any leading or trailing spaces, and add it to the reasons array list
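Step 4 is the only tricky part of the chain, so here it is isolated as a standalone sketch (the class and method names are mine, not from the project):

```java
import java.util.regex.Pattern;

public class UnicodeDecode {

    // Replace each literal \uXXXX escape sequence in the input with the
    // character it encodes, mirroring step 4 of the chain above:
    // parse the 4 hex digits to an int, cast to char, emit as a String.
    public static String decode(String raw) {
        return Pattern.compile("\\\\u([0-9a-fA-F]{4})")
                .matcher(raw)
                .replaceAll(m -> String.valueOf((char) Integer.parseInt(m.group(1), 16)));
    }
}
```

For example, decoding the raw text `don\u2019t` yields `don’t`, with the escape replaced by the right single quotation mark.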

Line 13 – Create an HTTP server on port 8080 with no backlog.

Danger Will Robinson! This class is in a com.sun package which means that while it’s available for use in this Java version, there is no guarantee it will be available in the next one. Hopefully, they’ll move it to a java or javax package in the future.

Lines 14 – 24 – This is our endpoint handler. We do not have JAX-RS here, so no @GET or @Path annotations to do the work for us. We don’t even have any annotation processing with this project.

This is fairly straightforward; take a String template, format it with a randomly chosen value from the array of reasons, then turn it into a byte array for output.

We set up our response headers to make callers understand that we are returning a JSON object, set the status as successful, and then write the response body.

Lines 26 – 27 – Here we are just setting up the server to use the default thread executor instead of defining our own, and then starting the HTTP server.

Make It So

Here’s the fun part – we don’t even need to compile the code to run it, simply

java src/main/java/No.java

Once the reason file is loaded, you will see

INFO: Server started successfully

on the command line. You can then curl http://localhost:8080/no or paste the URL into your favorite browser to receive a cleverly worded “No”. This is another one of the new features of Java 25 – being able to just run a single Java source file without the need for compilation.

This was a fun exercise into some of the new features of Java 25. I can see how this would be handy for making small, command line applications that could replace some of what a shell script would do.


Add some color to your Maven output in NetBeans

January 19, 2024

Missing the colorized output from Maven when building projects in NetBeans? Getting it back is as simple as adding two Global Execution Options (Tools -> Options, Java tab):

-Djansi.passthrough=true -Dstyle.color=always

After you’ve added these, the next Maven command you run will include color in the Output window.

Enjoy!


Augment Your Quarkus Identity

January 20, 2020

As of Quarkus 3.20.1, the documentation has been updated to discourage use of a @Dependent scoped bean, making the code simpler to implement.

In a recent thread in the Quarkus developer chat, I learned of the existence of an interface, SecurityIdentityAugmentor, that could be used to add or modify information in the current security context of a REST request.

I wondered if it would be possible to use the current identity to look up additional information from a database, and add it to the security context. After a few attempts, I found a way.

It can be simply implemented as an ApplicationScoped bean as such

import io.quarkus.security.identity.AuthenticationRequestContext;
import io.quarkus.security.identity.SecurityIdentity;
import io.quarkus.security.identity.SecurityIdentityAugmentor;
import java.util.concurrent.CompletionStage;
import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;

@ApplicationScoped
public class MyAugmentor implements SecurityIdentityAugmentor {

    @Inject
    MySupplier supplier;

    @Override
    public int priority() {
        return 0;
    }

    @Override
    public CompletionStage<SecurityIdentity> augment(final SecurityIdentity identity, final AuthenticationRequestContext context) {
        supplier.setIdentity(identity);
        return context.runBlocking(supplier);
    }

}

Note the injected MySupplier class. This is necessary so that we can add the appropriate annotations to avoid errors being thrown when using the EntityManager.

Note also using the context to run the blocking operation, which also is needed for the EntityManager.

The Supplier is where the magic happens

import io.quarkus.security.identity.SecurityIdentity;
import io.quarkus.security.runtime.QuarkusSecurityIdentity;
import java.util.function.Supplier;
import javax.enterprise.context.Dependent;
import javax.enterprise.context.control.ActivateRequestContext;
import javax.inject.Inject;
import javax.persistence.EntityManager;
import javax.transaction.Transactional;

@Dependent
public class MySupplier implements Supplier<SecurityIdentity> {

    @Inject
    EntityManager em;

    private SecurityIdentity identity;

    @ActivateRequestContext
    @Transactional
    @Override
    public SecurityIdentity get() {
        if (identity.isAnonymous()) {
            return identity;
        } else {
            // Copy the existing identity to the builder
            QuarkusSecurityIdentity.Builder builder = QuarkusSecurityIdentity.builder()
                    .setPrincipal(identity.getPrincipal())
                    .addAttributes(identity.getAttributes())
                    .addCredentials(identity.getCredentials())
                    .addRoles(identity.getRoles());

            // Read whatever data you need from the EntityManager here

            // Add whatever you need to the builder

            return builder.build();
        }
    }

    public void setIdentity(final SecurityIdentity identity) {
        this.identity = identity;
    }

}

It’s just that easy. Inject a SecurityIdentity into your code, and the additional properties will be available for use.


Project LEE7 – Give it a REST

September 24, 2014

I started thinking about how I would document the REST endpoints for use by outside processes (web UI, desktop UI, maybe some other server side code), when I stumbled upon Swagger. After playing around with it for a little bit, it became clear that the existing REST endpoints were not quite right. Thus, I began another round of refactoring.

REST Endpoints Part Deux

Fortunately, REST has been around for long enough now that some standard naming conventions and response values have been established. Comparing what I had originally with what I read at http://www.restapitutorial.com, led to the following changes.

Instead of having the application namespace at the top level, we will just deploy to the root of the server, leaving application-specific names to the respective ApplicationConfig classes.

<jboss-web>
    <context-root>/</context-root>
</jboss-web>

@ApplicationPath("contact")
public class ApplicationConfig extends Application {
}

The next step was to update the service classes to pluralize the resource names, and remove the HTTP actions from the endpoints. I also took the opportunity to standardize on JSON for all requests and responses, and introduce the remaining CRUD operations.

@Stateless
@LocalBean
@TransactionAttribute(TransactionAttributeType.REQUIRES_NEW)
@Path("/organizations")
@Produces(MediaType.APPLICATION_JSON)
@Consumes(MediaType.APPLICATION_JSON)
public class OrganizationResource {

    @GET
    @Path("/")
    public Response getAll(
            @DefaultValue("50") @QueryParam("limit") final int limit, 
            @DefaultValue("0") @QueryParam("offset") final int offset) {
        ...
    }

    @GET
    @Path("/{id}")
    public Response get(@PathParam("id") final Long id) {
        ...
    }

    @POST
    @Path("/")
    public Response save(@Context HttpServletRequest request, final Organization organization) throws URISyntaxException {
        ...
    }

    @PUT
    @Path("/{id}")
    public Response update(@PathParam("id") final Long id, final Organization organization) {
        ...
    }

    @DELETE
    @Path("/{id}")
    public Response delete(@PathParam("id") final Long id) {
        ...
    }
}

Now, to access these endpoints, we construct endpoint calls that look much more reasonable

  • GET {host}/contact/organizations?limit=50&offset=0 – return a paged list of all organizations
  • GET {host}/contact/organizations/{id} – return a specific organization
  • POST {host}/contact/organizations – persist a new organization
  • PUT {host}/contact/organizations/{id} – update an existing organization
  • DELETE {host}/contact/organizations/{id} – delete an existing organization

Another trick I found to reduce the amount of repeated code was to create an ExceptionMapper class. This can then be used to provide a more useful error to the endpoint client, rather than just giving a 500 Server Error, and hoping that the logs have enough detail to debug the issue.

@Provider
public class ResourceExceptionMapper implements ExceptionMapper<Exception> {

    @Override
    public Response toResponse(Exception exception) {
        ...
    }
}

Here we can map specific exceptions to HTTP response codes, as well as log information from the exception.
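As a sketch of what that mapping might look like, here is the decision logic pulled out as a plain function (the exception-to-status choices are hypothetical, not from the project, since the post elides the body of toResponse()):

```java
import java.util.NoSuchElementException;

public class StatusCodes {

    // Hypothetical mapping from exception type to HTTP status code,
    // the kind of decision an ExceptionMapper's toResponse() would make
    // before building the Response and logging the details.
    public static int statusFor(Exception exception) {
        if (exception instanceof IllegalArgumentException) {
            return 400; // Bad Request: the client sent invalid input
        }
        if (exception instanceof NoSuchElementException) {
            return 404; // Not Found: the requested resource does not exist
        }
        return 500; // anything unexpected stays a server error
    }
}
```

Keeping the mapping in one place like this is what removes the repeated try/catch blocks from each endpoint method.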

Testing, Always Testing

You could just deploy the war file to a server, and then manually test the endpoints with something like RESTClient, or using browser plugins. Since I had previously set up Arquillian to do integration testing, it was only natural to continue using it. I did, however, run into some issues which caused a lot of refactoring in the integration tests.

For starters, by using the exception mapper as described above, I could no longer make calls as I had been to the methods being tested. This was not going through all of the framework code, so the exceptions that would have been mapped were making their way into the test code. A simple, I thought, fix would be to change the tests to run in client mode (@RunAsClient). While this did take care of the issue of the exception mapper not being called, it then caused the tests to not work as I had originally written them.

In order to test the endpoints, I needed a way to call them as the clients would be calling them. This is where the javax.ws.rs.client package comes in. It defines a set of interfaces that you can use to programmatically create a REST client, and then use it in the test methods. We need a couple of dependencies added to the pom, and since JBoss is the target server, RESTEasy is a natural choice.

        <dependency>
            <groupId>org.jboss.resteasy</groupId>
            <artifactId>resteasy-client</artifactId>
            <version>3.0.6.Final</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.resteasy</groupId>
            <artifactId>resteasy-jackson-provider</artifactId>
            <version>3.0.6.Final</version>
            <scope>test</scope>
        </dependency>

Now, each test method will have to set up its own test data, but that is the beauty of having a separate database for testing. We can create whatever data is needed for the tests, and not worry about it interfering with live data. It all will be wiped clean the next time the tests are run. This also seemed like a better choice than chaining the test methods together using the @InSequence annotation and static properties to pass state between test methods, and running the integration tests with a single thread.

I will briefly mention that I have added JaCoCo to the build to get a rough idea of the amount of test coverage I had. Since changing the integration tests over to running in client mode, JaCoCo cannot measure the integration test coverage of all of those REST endpoints due to how the war is deployed in the test container. Hopefully that will be added to the JaCoCo Arquillian extension in the near future.

As always, the full project can be found on GitHub. Questions and comments are welcome.


Project LEE7 – Persistent Issues

September 8, 2014

Old Business

Getting back to an issue I had earlier with validation logic on some of the entity properties, I started looking into what it would take to call the field setters in the object constructor. Fortunately, I have a lot of warnings turned on in my IDE, so it can gently remind me (or encourage me to research why) some of the things I wanted to code were bad ideas. This was one of them.

The first step would have been to replace setting the field directly in the constructor with a call to the setter. Sounds simple enough: now we have all the validation logic on the field in one place! But now I get a warning from the IDE: you shouldn’t call a method in the constructor that can be overridden. Fine. So, I’ll go and make the setter final. That led me off on a different trail with VetoableChangeListener, so I could notify classes that use the method of bad input data and allow other objects to add more “business logic” to determine valid values to set (you know, the kind of thing one would do by overriding the setter). After all that, I see another warning from the IDE: methods of persistent properties should not be final.

Fine, time to do some research. Then I find in the Java EE 7 Tutorial, section 37.1.1, Requirements for Entity Classes: “The class must not be declared final. No methods or persistent instance variables must be declared final.” Well, alrighty then. Now I see why Spring continues to be a thing. The specs seem to encourage anemic domain models.

There has to be a way to have a more robust domain object. The “logic” that I had in there wasn’t anything too complex, so… enter Bean Validation! Now, all the code that I had in two places is replaced by two simple annotations on the field: @NotNull and @Size(min, max).

    @Basic(optional = false)
    @Column(name = "givenname", nullable = false, length = 50)
    @NotNull
    @Size(min = 1, max = 50)
    private String givenName;

This also means that the setter now simply sets the field to the value passed in. The unit tests for these need to be updated, and a new test dependency is needed for a validation implementation.
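For reference, that simplified setter is now trivial; here is a stripped-down sketch of just the accessor pair (the JPA and Bean Validation annotations from the field above are omitted, and a getter is included so the snippet stands alone):

```java
// Sketch of the simplified accessors; the real entity keeps the
// @NotNull/@Size annotations on the field.
class Person {

    private String givenName;

    // no more guard clauses -- constraints are enforced when the
    // entity is validated, not on every assignment
    public void setGivenName(final String givenName) {
        this.givenName = givenName;
    }

    public String getGivenName() {
        return givenName;
    }
}
```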

    <dependencies>
...
        <!-- Hibernate Validator -->
        <dependency>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate-validator</artifactId>
            <version>5.1.2.Final</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
import java.util.Set;

import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;
import javax.validation.ValidatorFactory;

import org.junit.Before;
import org.junit.Test;

import static org.junit.Assert.assertEquals;

public class PersonTest {

    private Validator validator;

    @Before
    public void setUp() {
        ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
        validator = factory.getValidator();
    }

    @Test
    public void testConstructor() {
        Person testMe = new Person("", null);
        Set<ConstraintViolation<Person>> violations = validator.validate(testMe);
        assertEquals(1, violations.size());
    }

    @Test
    public void testSetGivenName() {
        Person testMe = new Person("Smith", "John");
        testMe.setGivenName(null);
        Set<ConstraintViolation<Person>> violations = validator.validate(testMe);
        assertEquals(1, violations.size());
    }
}

New Business

With that little bit cleared up, we can move on now to persisting the entities into a more permanent database. For this, I have chosen to use PostgreSQL 9.3, which I have used on a couple of past projects with great success.

PostgreSQL Configuration

After installing PostgreSQL (and pgAdmin so that we have a decent visual admin tool), we need to alter the password for the postgres user. We do this because by default, there is no password, and pgAdmin won’t work if you do not have a password for a login role (user). So, a little shell command will clear this up (set the password to whatever you want, hopefully something a little more secure than this).

sudo -u postgres psql -c "ALTER USER postgres PASSWORD 'postgres';"

Now, using pgAdmin, we can create some login roles, databases, and schemas. We shall start with the login role that will own the databases, and it shall be named “lee7”.

[Screenshot: new-login-role-1]

Set the password, but leave the expiration date blank.

[Screenshot: new-login-role-2]

Select the create databases and create roles checkboxes.

[Screenshot: new-login-role-3]

Accept the changes, and then we will move on to the login role for the contact application, “contact”. Follow the same procedures as above, but do not give this user the ability to create databases or roles. Now, a new database is needed for the application, so we will create it, and make the lee7 login role the owner.

[Screenshot: new-database-1]

Set a few more options about the database, and then accept the changes.

[Screenshot: new-database-2]

Now, create a schema to hold the data for the contact application. I have plans on adding more applications/modules in the future, each of which will be separated into their own schemas. You could also separate them into different databases, but that would make query joins more difficult than necessary.

[Screenshot: new-schema-1]

Next, we need a separate database (lee7_test) for running the integration tests so we do not mix automated test data with manual test data or production data. We can just use the existing database as a template for the new one to save a few configuration steps. Note that pgAdmin will complain if there are any connections to the source database when doing this, so make sure everything is disconnected first.

[Screenshot: new-database-3]
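If you prefer the command line to pgAdmin, the same setup can be scripted; here is a hedged SQL equivalent of the steps above (run via psql as the postgres user; the passwords are placeholders, so pick something better):

```sql
-- owner role for the databases (can create databases and roles)
CREATE ROLE lee7 LOGIN PASSWORD 'lee7' CREATEDB CREATEROLE;

-- application role for the contact module (no extra privileges)
CREATE ROLE contact LOGIN PASSWORD 'contact';

-- main database owned by lee7, plus the contact schema
CREATE DATABASE lee7 OWNER lee7;
\c lee7
CREATE SCHEMA contact AUTHORIZATION lee7;

-- test database cloned from lee7 (disconnect from the template first)
CREATE DATABASE lee7_test OWNER lee7 TEMPLATE lee7;
```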

JBoss Configuration

If you want to use the web admin for JBoss, you will first need to create a management user, which you will then use to log into the management console (port 9990). Here, $JBOSS_INSTALL_DIR is wherever you have JBoss installed. For me, that is /opt/wildfly-8.0.0.Final for the main installation, and ~/.m2/arquillian/contact/wildfly-8.0.0.Final for the installation that Arquillian uses when running integration tests.

$JBOSS_INSTALL_DIR/bin/add-user.sh admin Admin --silent=true

You will also need to install the PostgreSQL JDBC driver and configure a datasource. I defer to http://www.mastertheboss.com/jboss-datasource/configuring-a-datasource-with-postgresql-and-jboss/wildfly, which does a fine job of explaining the process. Note that the datasource name I used is contactDS.
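Once the driver is installed, the resulting datasource definition in standalone.xml ends up looking roughly like this (a sketch with pool and validation settings omitted; the driver name assumes you registered the PostgreSQL module as "postgresql", and the JNDI name matches what persistence.xml expects):

```xml
<datasource jndi-name="java:jboss/datasources/contactDS" pool-name="contactDS" enabled="true">
    <connection-url>jdbc:postgresql://localhost:5432/lee7</connection-url>
    <driver>postgresql</driver>
    <security>
        <user-name>contact</user-name>
        <password>contact</password>
    </security>
</datasource>
```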

persistence.xml Configuration

The persistence.xml file needs a couple of tweaks to use the new datasource:

<jta-data-source>java:jboss/datasources/contactDS</jta-data-source>

The property for schema generation should be removed; from this point on, all database changes are handled with Liquibase. I have also turned off the options for Hibernate to show and format the SQL commands in the log for the main configuration, but left them turned on for the test configuration.

Liquibase Configuration

In the src/main/resources directory of the project, we create conf and db directories to hold the configuration and Liquibase changelog files, respectively. The configuration files for Liquibase are simply .properties files, with an environment name embedded in the file name so we can pick an environment-specific configuration. Each one contains the database connection information, along with some Liquibase behavior attributes.

# liquibase.dev.properties
driver=org.postgresql.Driver
url=jdbc:postgresql://localhost:5432/lee7
username=contact
password=contact
verbose=true
dropFirst=false
promptOnNonLocalDatabase=true
contexts=dev

# liquibase.test.properties
driver=org.postgresql.Driver
url=jdbc:postgresql://localhost:5432/lee7_test
username=contact
password=contact
verbose=true
dropFirst=true
promptOnNonLocalDatabase=true
contexts=test

The main differences are which database is used in each, and that in test we drop everything before running the changesets to make sure we have a fresh start for the tests. The changelog files are standard XML. In the past, I have used one changelog file per ticket, one changelog file per feature, and a mixture of both. The Liquibase documentation suggests one per version, so that is what will be used here. It really does not matter what convention you want to use, just be consistent with it.

<?xml version="1.0" encoding="UTF-8"?>
<!-- changelog.xml -->
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:ext="http://www.liquibase.org/xml/ns/dbchangelog-ext"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.1.xsd
    http://www.liquibase.org/xml/ns/dbchangelog-ext http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-ext.xsd">

    <include file="changelog-1.0.0.xml" relativeToChangelogFile="true"/> 
</databaseChangeLog>

<?xml version="1.0" encoding="UTF-8"?>
<!-- changelog-1.0.0.xml -->
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:ext="http://www.liquibase.org/xml/ns/dbchangelog-ext"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.1.xsd
    http://www.liquibase.org/xml/ns/dbchangelog-ext http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-ext.xsd">

    <changeSet id="create-legal-entity-sequence" author="nderwin">
        <comment>Creating the primary key sequence for LegalEntity</comment>
        <createSequence schemaName="contact" sequenceName="legal_entity_seq" />
    </changeSet>
    
    <changeSet id="create-organization" author="nderwin">
        <comment>Creating the table for the Organization entity</comment>
        <createTable schemaName="contact" tableName="organization">
            <column name="id" type="BIGINT">
                <constraints nullable="false" primaryKey="true" primaryKeyName="contact_organization_PK" />
            </column>
            <column name="name" type="VARCHAR(50)">
                <constraints nullable="false" />
            </column>
        </createTable>
    </changeSet>
    
    <changeSet id="create-person" author="nderwin">
        <comment>Creating the table for the Person entity</comment>
        <createTable schemaName="contact" tableName="person">
            <column name="id" type="BIGINT">
                <constraints nullable="false" primaryKey="true" primaryKeyName="contact_person_PK" />
            </column>
            <column name="surname" type="VARCHAR(50)" />
            <column name="givenname" type="VARCHAR(50)">
                <constraints nullable="false" />
            </column>
        </createTable>
    </changeSet>
</databaseChangeLog>

Maven Configuration

At this point, we can bring all the previous work together in the pom file. I was not satisfied with the integration tests firing off every time I started a build of the code, so I created an integration testing profile that contained the maven-failsafe-plugin configuration. This is also where the liquibase-maven-plugin is set to run.

    <profiles>
        <profile>
            <activation>
                <activeByDefault>false</activeByDefault>
            </activation>
            <id>integration-test</id>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.liquibase</groupId>
                        <artifactId>liquibase-maven-plugin</artifactId>
                        <version>3.2.2</version>
                        <dependencies>
                            <dependency>
                                <groupId>org.postgresql</groupId>
                                <artifactId>postgresql</artifactId>
                                <version>9.3-1102-jdbc41</version>
                            </dependency>
                        </dependencies>
                        <executions>
                            <execution>
                                <phase>process-test-resources</phase>
                                <goals>
                                    <goal>update</goal>
                                </goals>
                                <configuration>
                                    <driver>org.postgresql.Driver</driver>
                                    <propertyFile>target/classes/conf/liquibase.test.properties</propertyFile>
                                    <propertyFileWillOverride>true</propertyFileWillOverride>
                                    <changeLogFile>target/classes/db/changelog.xml</changeLogFile>
                                    <changelogSchemaName>contact</changelogSchemaName>
                                    <defaultSchemaName>contact</defaultSchemaName>
                                </configuration>
                            </execution>
                        </executions>
                    </plugin>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-failsafe-plugin</artifactId>
                        <version>2.17</version>
                        <configuration>
                            <parallel>methods</parallel>
                            <threadCount>10</threadCount>
                            <systemPropertyVariables>
                                <project.artifactId>${project.artifactId}</project.artifactId>
                                <arquillian.launch>wildfly</arquillian.launch>
                            </systemPropertyVariables>
                            <systemProperties>
                                <property>
                                    <name>java.util.logging.manager</name>
                                    <value>org.jboss.logmanager.LogManager</value>
                                </property>
                            </systemProperties>
                        </configuration>
                        <executions>
                            <execution>
                                <goals>
                                    <goal>integration-test</goal>
                                    <goal>verify</goal>
                                </goals>
                            </execution>
                        </executions>
                    </plugin>
                </plugins>
            </build>
        </profile>
        <profile>
            <activation>
                <activeByDefault>false</activeByDefault>
            </activation>
            <id>update-dev-database</id>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.liquibase</groupId>
                        <artifactId>liquibase-maven-plugin</artifactId>
                        <version>3.2.2</version>
                        <dependencies>
                            <dependency>
                                <groupId>org.postgresql</groupId>
                                <artifactId>postgresql</artifactId>
                                <version>9.3-1102-jdbc41</version>
                            </dependency>
                        </dependencies>
                        <executions>
                            <execution>
                                <phase>process-test-resources</phase>
                                <goals>
                                    <goal>update</goal>
                                </goals>
                                <configuration>
                                    <propertyFile>target/classes/conf/liquibase.dev.properties</propertyFile>
                                    <propertyFileWillOverride>true</propertyFileWillOverride>
                                    <changeLogFile>src/main/resources/db/changelog.xml</changeLogFile>
                                    <changelogSchemaName>contact</changelogSchemaName>
                                    <defaultSchemaName>contact</defaultSchemaName>
                                </configuration>
                            </execution>
                        </executions>
                    </plugin>
                </plugins>
            </build>
        </profile>
    </profiles>

You will also note the update-dev-database profile. This allows for running the Liquibase change sets against the development database. For now, the profiles are working well; however, I may look at moving all of this back into the main plugin area in the future and controlling it with command line switches instead. Remember, be Agile! For me, that means try it out, and if it does not work as well as you hoped, find a better way.
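For reference, both profiles are activated with Maven's -P switch; the invocations look something like this (profile ids as defined in the pom above):

```shell
# full build plus integration tests (Liquibase refreshes lee7_test first)
mvn clean verify -Pintegration-test

# apply the current changesets to the development database
mvn process-test-resources -Pupdate-dev-database
```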

As always, the full project can be found on GitHub. Questions and comments welcome.


Project LEE7 – Unit and Integration Testing

August 29, 2014

Overview

In the previous post, we started working on the server side bits to serve up some RESTful content.  Before we get too much farther into adding more functionality, we should really take some time to work on the testing infrastructure.  This is where the die-hard Test Driven Development people would be shouting “You should have had your tests written already!”.  Obviously, I am not one of those types of people, but I do think that we have a minimum amount of functioning code, so we should start adding some tests now rather than trying to do it all in one shot later.

Unit Testing

For unit testing, we will use the venerable JUnit framework and maven-surefire-plugin for executing the tests during the build cycle.  There’s nothing really special to see here, so I’ll just give the pom bits and move along.

 <dependencies>
 ...
   <dependency>
     <groupId>junit</groupId>
     <artifactId>junit</artifactId>
     <version>4.11</version>
     <scope>test</scope>
   </dependency>
 </dependencies>
 <build>
 ...
   <plugins>
 ...
     <plugin>
       <groupId>org.apache.maven.plugins</groupId>
       <artifactId>maven-surefire-plugin</artifactId>
       <version>2.17</version>
     </plugin>
   </plugins>
 </build>

I’ve also added the surefire report plugin for when the maven project site is generated.  This will create a page listing the results of the unit tests.  It’s not necessary, but I wanted to have it in there for an example.  What interests me more is code coverage, but to be useful, it needs to be an aggregation of unit and integration tests.

 <reporting>
   <plugins>
 ...
     <plugin>
       <groupId>org.apache.maven.plugins</groupId>
       <artifactId>maven-surefire-report-plugin</artifactId>
       <version>2.17</version>
     </plugin>
   </plugins>
 </reporting>

The unit tests themselves are very thin.  We are testing the entity classes to ensure that we can construct and change their attributes so that they are in a valid state.  There is a non-nullable, non-empty String name field in each that acts as an identifier.  There are two ways to set these name fields – either through the constructor or the setter for the field.  Why two places?  The constructor ensures that the entity is constructed in a valid state (yes, I see the protected default constructor sitting there, quiet you).  The setter ensures that later changes to the entity are also valid.  These name fields need to be able to change over time, because, well, things change names over time.

Note:  I am a little uncomfortable with having the name checking logic in the constructor and setter.  This is a compromise to keep from having the setter declared as final so that it can be called in the constructor.  I may change it to be final anyway just to make it DRYer.

We could also test the hashCode, equals, and toString methods, but at that point you are starting to test the JDK.  Yes, you can argue that you are testing to make sure that hashCode and equals are working on the same set of attributes, but a good code review could do that.  Besides, if you forget to add a new field to the test as well as either the hashCode or equals methods, you still didn’t gain anything.

As for the rest of the code in the project, the REST boundaries, we could write unit tests and use a mock framework for the EntityManager.  I am going to choose to do integration testing on these because you end up spending a lot of time writing mock code and expectations in your tests once your class starts doing interesting things.  Sometimes those interesting things are not easily unit testable, especially if you are using third party libraries.  It will be easier in the long run to wire up an integration test, and then you can really test transaction boundaries, response codes, and the like.  If you find you are spending more time mocking things for unit tests than writing test code, then it is time to move to an integration test.

Integration Testing

To start the integration testing, we will use the maven-failsafe-plugin in conjunction with Arquillian.  There are some good introductory tutorials on the Arquillian website.  One important thing to note is that to use this, you need to have a copy of your application server set up so that it can be started by Arquillian when the tests are run.  Borrowing an idea from a colleague, I set up a copy of Wildfly 8.0 in my local .m2 directory, and then using maven property replacement, referenced that as the location of the server for Arquillian.

 <build>
...
   <plugins>
...
     <plugin>
       <groupId>org.apache.maven.plugins</groupId>
       <artifactId>maven-resources-plugin</artifactId>
       <version>2.6</version>
       <executions>
         <execution>
           <phase>process-test-resources</phase>
           <goals>
             <goal>copy-resources</goal>
           </goals>
           <configuration>
             <outputDirectory>${user.home}/.m2/arquillian/${project.artifactId}/wildfly-8.0.0.Final/standalone/configuration</outputDirectory>
             <resources>
               <resource>
                 <directory>src/test/resources-wildfly</directory>
                 <filtering>false</filtering>
               </resource>
             </resources>
           </configuration>
         </execution>
       </executions>
     </plugin>
...
     <plugin>
       <groupId>org.apache.maven.plugins</groupId>
       <artifactId>maven-failsafe-plugin</artifactId>
       <version>2.17</version>
       <configuration>
         <parallel>methods</parallel>
         <threadCount>10</threadCount>
         <systemPropertyVariables>
           <project.artifactId>${project.artifactId}</project.artifactId>
           <arquillian.launch>wildfly</arquillian.launch>
         </systemPropertyVariables>
         <systemProperties>
           <property>
             <name>java.util.logging.manager</name>
             <value>org.jboss.logmanager.LogManager</value>
           </property>
         </systemProperties>
       </configuration>
       <executions>
         <execution>
           <goals>
             <goal>integration-test</goal>
             <goal>verify</goal>
           </goals>
         </execution>
       </executions>
     </plugin>
...
   </plugins>
...
 </build>

By using the resources plugin, we can substitute a standalone configuration file that specifies a different database connection for the datasource, thereby guaranteeing a consistent testing environment.  We get the added benefit of not corrupting our other database with test data.
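The arquillian.launch system property set in the failsafe configuration above selects a container by qualifier from arquillian.xml (conventionally in src/test/resources). A minimal sketch of what that file might look like, assuming the managed Wildfly container adapter and the local .m2 copy of the server described earlier:

```xml
<arquillian xmlns="http://jboss.org/schema/arquillian"
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:schemaLocation="http://jboss.org/schema/arquillian
                http://jboss.org/schema/arquillian/arquillian_1_0.xsd">
    <container qualifier="wildfly">
        <configuration>
            <!-- points at the copy of Wildfly kept under the local .m2 directory -->
            <property name="jbossHome">${user.home}/.m2/arquillian/contact/wildfly-8.0.0.Final</property>
        </configuration>
    </container>
</arquillian>
```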

With the default configuration, the integration tests are run for every build.  In the future, I will look at adding parameters to enable the integration tests, mainly for CI builds, or when a developer does not mind the extra execution time.  Maybe I will name it such that somewhere in the command line you will see “coffee.break=true”.

Reporting

One final thing about testing: it would be nice if other people could see the results of our testing work, to inspire confidence in the quality of the build.  For this, we set up the maven-site-plugin and the maven-surefire-report-plugin to give a nice overview of the project and test results.

 <build>
...
   <plugins>
...
     <plugin>
       <groupId>org.apache.maven.plugins</groupId>
       <artifactId>maven-site-plugin</artifactId>
       <version>3.4</version>
       <configuration>
         <skipDeploy>true</skipDeploy>
       </configuration>
     </plugin>
...
   </plugins>
...
 </build>
...
    <reporting>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-report-plugin</artifactId>
                <version>2.17</version>
            </plugin>
        </plugins>
    </reporting>

There is some more configuration to be done with the maven-scm-publish-plugin, which I may cover at some other point.  To generate the site documentation, simply run

mvn clean site:site

As always, the full project can be found on GitHub.  Questions and comments welcome.


Project LEE7 – Server Bits

April 14, 2014

This post is part of a series about Learning Java EE 7.  The previous post gives a quick introduction to the project.

Overview

[Class diagram: Server Bits Overview]

We will start with a UML class diagram that gives an overview of the classes we use on the server side.  As you can see, we do not need a lot of Java code to get this up and running.  The introduction of annotations has greatly simplified the configuration, though we do still need a few XML files to complete the plumbing.

Persistence

So, let us start with a little deeper dive into the persistence layer.

[Class diagram: Contact Entities]

We have an abstract parent class, LegalEntity, which represents some thing that a court of law would recognize as a party to a proceeding.  Or, in other words, something that could be sued.  Coad’s writings called this a Party, which could also be used.  The main point is that we have a common starting point that will allow us to use people and groups of people (organizations) interchangeably at times in the application.

The two concrete subclasses of this then form the basis for our contact management software – a person and an organization.  Later we may have more specific types of organizations such as non-profits, companies, clubs, etc.  In the interest of keeping it simple, we will focus on just having simple name attributes for these classes.   To really model the many complex cases for people’s names across many cultures would mean making the Person class much more complex to handle any number of name parts.  Again, let us keep it simple so we can actually complete something shall we?  Be agile, we will make it more complex in a future iteration.

The entity classes themselves are your typical POJO style Java classes with some persistence annotations.  The parent class is marked as abstract and denotes the inheritance strategy of TABLE_PER_CLASS.  We will also let the persistence provider/database deal with generating the unique ids for each entry in the database.

@Entity(name = "LegalEntity")
@Inheritance(strategy = InheritanceType.TABLE_PER_CLASS)
public abstract class LegalEntity implements Serializable {
    private static final long serialVersionUID = -1822203393624550172L;
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    @Column(name = "id", nullable = false)
    private Long id;
    public Long getId() {
        return id;
    }
}

A setter for the id attribute is not provided since there’s no reason for the application code to ever set a value for this property.  It exists solely as a means for uniquely identifying a record in the database, so leave it to the persistence provider to deal with that property.

The concrete subclasses are a little more interesting.

@Entity(name = "Person")
@Table(schema = "contact", name = "person")
public class Person extends LegalEntity implements Serializable {

    private static final long serialVersionUID = -6010293547035868894L;

    @Basic
    @Column(name = "surname")
    private String surname;

    @Basic(optional = false)
    @Column(name = "givenname", nullable = false)
    private String givenName;

    protected Person() {
    }

    public Person(final String surname, final String givenName) {
        if (givenName == null || givenName.isEmpty()) {
            throw new IllegalArgumentException("givenName must contain a value");
        }

        this.surname = surname;
        this.givenName = givenName;
    }
}

@Entity(name = "Organization")
@Table(schema = "contact", name = "organization")
public class Organization extends LegalEntity implements Serializable {

    private static final long serialVersionUID = 542507467814592302L;

    @Basic(optional = false)
    @Column(name = "name", nullable = false)
    private String name;

    protected Organization() {
    }

    public Organization(final String name) {
        if (name == null || name.isEmpty()) {
            throw new IllegalArgumentException("name must contain a value");
        }

        this.name = name;
    }
}

Note the protected default constructors, which are provided solely to satisfy the JPA contract.  The public constructors make sure that we are constructing classes in a meaningful and valid state.  We also have accessor methods for the listed properties because it is possible that names can change over time, so the application will need to be able to change them.  We also implement the equals, hashCode, and toString methods.  Until we have some other uniquely identifying properties for a Person, we will be looking at the two name properties to determine if two instances are the same.  We know this is not going to work long-term as people are free to be named the same thing.  See George Foreman and his kids.  This will be a future iteration point for sure.
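As an illustration of comparing on the two name properties, an equals/hashCode pair can be sketched with java.util.Objects (the field names follow the Person entity above, but this is a stand-alone sketch rather than the exact code in the repository):

```java
import java.util.Objects;

// Minimal sketch: identity based solely on the two name properties,
// which is all we have until a better unique identifier exists.
class Person {

    private final String surname;
    private final String givenName;

    Person(final String surname, final String givenName) {
        this.surname = surname;
        this.givenName = givenName;
    }

    @Override
    public boolean equals(final Object obj) {
        if (this == obj) {
            return true;
        }
        if (!(obj instanceof Person)) {
            return false;
        }
        final Person other = (Person) obj;
        // Objects.equals handles the nullable surname safely
        return Objects.equals(surname, other.surname)
                && Objects.equals(givenName, other.givenName);
    }

    @Override
    public int hashCode() {
        // must use the same properties as equals
        return Objects.hash(surname, givenName);
    }
}
```

Note that both methods deliberately use the same set of properties; as mentioned above, if a new field is added to one but not the other, the equals/hashCode contract breaks.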

Now we are ready for some XML configuration.  As previously mentioned, we will deploy this to a JBoss server, Wildfly 8 to be specific.  By default, an H2 in-memory datasource is configured in the server, so let us use that to start.  We will also use Hibernate as the persistence provider.  Here is where using a schema in the Table annotation is going to bite us.  There is a bug in Hibernate where it does not create the schema before trying to create the tables from the annotated entities.  For now, we will work around that by changing the datasource definition to create the schema.  Add the following to the end of the connection URL for the ExampleDS datasource:

;INIT=CREATE SCHEMA IF NOT EXISTS contact

This will make sure that the schema exists before Hibernate starts creating the tables when the application is deployed.

And finally, we have the persistence.xml configuration:

<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.1" xmlns="http://xmlns.jcp.org/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence http://xmlns.jcp.org/xml/ns/persistence/persistence_2_1.xsd">
  <persistence-unit name="contactPU" transaction-type="JTA">
    <provider>org.hibernate.ejb.HibernatePersistence</provider>
    <jta-data-source>java:jboss/datasources/ExampleDS</jta-data-source>
    <exclude-unlisted-classes>false</exclude-unlisted-classes>
    <shared-cache-mode>ALL</shared-cache-mode>
    <properties>
      <property name="hibernate.show_sql" value="true"/>
      <property name="hibernate.format_sql" value="true"/>
      <property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
    </properties>
  </persistence-unit>
</persistence>

For debugging purposes, Hibernate is configured to log the SQL commands it issues.  In a future iteration, a more permanent database will be created and used.

REST Boundary

To enable REST web services, we need a class that extends the javax.ws.rs.core.Application class.  The only interesting thing here is the ApplicationPath annotation that has the URL path to our services.

@ApplicationPath("resources")
public class ApplicationConfig extends Application {
}

In an earlier iteration, I had created a single REST service class that provided methods to interact with people and organizations.  However, that started making the URLs for the resources convoluted, so it has now been refactored into two separate REST services.

@Stateless
@LocalBean
@TransactionAttribute(TransactionAttributeType.REQUIRES_NEW)
@Path("person")
@Consumes(MediaType.APPLICATION_JSON)
public class PersonResource {

    @PersistenceContext
    EntityManager em;

    @GET
    @Path("get/{id}")
    public Response getPerson(@PathParam("id") final Long id) {
        Person p = em.find(Person.class, id);
        return Response.ok(p).build();
    }
}

@Stateless
@LocalBean
@TransactionAttribute(TransactionAttributeType.REQUIRES_NEW)
@Path("organization")
@Consumes(MediaType.APPLICATION_JSON)
public class OrganizationResource {

    @PersistenceContext
    EntityManager em;

    @GET
    @Path("get/{id}")
    public Response getOrganization(@PathParam("id") final Long id) {
        Organization c = em.find(Organization.class, id);
        return Response.ok(c).build();
    }
}

Looking at it now, it seems there are just about equal parts code and annotations!  Briefly, we have a couple of stateless session beans without interfaces since we don’t need them.  A new transaction is used for each method call, and we are expecting JSON data to be passed into those calls.  I also have a POST method in each service that I was using for some manual testing, but those may or may not last long term.

A few more bits of configuration are found in the web.xml and jboss-web.xml files.  The web.xml file just defines a session timeout of 30 minutes and sets the welcome file to a simple HTML page.  The jboss-web.xml file formally defines the context-root (/contact) of the war.  If you don’t define this, JBoss will use the war file name (less the .war part) as the context root.
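The jboss-web.xml in question is tiny; assuming only the context root named above, it looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<jboss-web>
    <!-- without this, JBoss would use the war file name as the context root -->
    <context-root>/contact</context-root>
</jboss-web>
```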

After compiling and deploying, we have a minimally functioning web application.  A quick test of the GET methods can be accomplished with a browser.  The URLs are simply

/contact/resources/person/get/0
/contact/resources/organization/get/0

                                 ^----- @GET method
                        ^----- Class @Path
             ^----- @ApplicationPath
    ^----- context-root
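The "resources" segment in the breakdown above comes from the JAX-RS Application subclass, which isn’t shown here; a minimal sketch would look like this (the class name RestApplication is an assumption).

```java
import javax.ws.rs.ApplicationPath;
import javax.ws.rs.core.Application;

// Supplies the "resources" path segment for every JAX-RS resource
// in the war.  With an empty body, the container scans the archive
// and registers all @Path-annotated classes automatically.
@ApplicationPath("resources")
public class RestApplication extends Application {
}
```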

So far I have only tested the classes manually, which works, but we really need to start creating unit and integration tests, and get in the habit of updating them as new features are added.  I think I’ll cover that in the next installment.

The final results are available in the GitHub repository dev branch.

As always, questions and comments are welcome.


Introducing Project LEE7 – Learning Java EE 7

April 7, 2014

As a means of diving deeper into the architectural aspects of Java EE programming, I decided to work on a series of posts where I investigate creating a modern enterprise application.  A simple modern enterprise application, but one that takes advantage of the latest bells and whistles in Java EE 7.  Just in time too, since Java 8 dropped a couple of weeks ago!

Technology Stack

As I mentioned, Java 8 was recently released, so we’ll be using that as the basis for this project.  Maven is used for the project structure and build scripting.  We’ll be building a war file that contains the RESTful web services, which is deployed to a JBoss WildFly 8.0.0 application server.  Initially, we’ll use the H2 database that comes with JBoss, but we will eventually migrate to a standalone database, most likely PostgreSQL.

Additionally, I will be creating some different REST clients for the web services.  There will be a Swing desktop app, a JavaFX desktop app (yes, I know, not much difference there), an AngularJS client, and, if I get ambitious, an Android client.  All of these sub-projects are housed in my GitHub repository.

Design Philosophy

I’ll be taking some cues from the book Real World Java EE Patterns – Rethinking Best Practices by Adam Bien.  It’s been a good book for looking at ways to do EE development in a simpler, easier-to-understand manner.

I’m also looking way back to a couple of blog posts from The Coad Letter: Modeling and Design, specifically issues 103 and 107.  These talk about how to design a generic structure for modeling contact information, though it may be a bit overkill with the number of classes.

Comments and suggestions are welcome.  Now, on to the server bits.

