Thursday, September 3, 2020

My minimum standards for coding in Java

People have a tendency to overlook certain issues, and may require gentle reminders every so often to nudge them in the right direction. I'd imagine a simplified checklist would help with rehearsing the drill. This happens everywhere, including at work with my team, with regard to Java coding for webapp development. Here's what I've come up with, succinctly abbreviated as "SCROLL":
  • Switches
  • Comments
  • Reusability
  • OWASP
  • LogLevels

Switches

Implement soft toggles that allow features to be switched on or off in the running application without a server restart. This is particularly useful for rolling out enhancements ahead of the actual go-live date.

One way is to store such values in the database as system variables or code-table entries. The "enhanced" code should then check the switch each time it is called. Of course, not every situation can adopt this method, but doing so wherever possible will be of great help.
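As a sketch of the idea (all names here are hypothetical, and the in-memory map stands in for the database-backed system-variable or code table):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class FeatureSwitches {
    // Stand-in for a SELECT against a SYSTEM_VARIABLE/code table.
    private static final Map<String, String> CODE_TABLE = new ConcurrentHashMap<>();

    /** The "enhanced" code checks this each time it is called. */
    public static boolean isEnabled(String featureKey) {
        return "Y".equals(CODE_TABLE.getOrDefault(featureKey, "N"));
    }

    public static void set(String featureKey, String value) {
        CODE_TABLE.put(featureKey, value);
    }

    public static void main(String[] args) {
        set("NEW_CHECKOUT_FLOW", "N");
        System.out.println(isEnabled("NEW_CHECKOUT_FLOW") ? "enhanced path" : "legacy path");

        // Flip the switch at runtime -- no server restart needed.
        set("NEW_CHECKOUT_FLOW", "Y");
        System.out.println(isEnabled("NEW_CHECKOUT_FLOW") ? "enhanced path" : "legacy path");
    }
}
```

The key point is that the check happens on every call, so an update to the table takes effect immediately.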

Comments

Elaborate explanations in the code will help others understand your thought processes in the future. I find them equally helpful when I revisit very old code written by yours truly. Explanations for changes in workflows will be useful for troubleshooting several years down the road.

A recommended format for single-liners would be //name, date, description. The next person can then approach the person who built it, and discern a timeline of which set of changes came after which.

An added bonus would be if the comments followed the conventional Javadoc /** ... */ format, so that they show up properly in generated Javadocs.
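A contrived example showing both styles together (the name, date, ticket and workflow change are all made up for illustration):

```java
public class InvoiceCalc {
    /**
     * Computes the payable amount after discount, in cents.
     * Shows up in generated Javadoc for the next maintainer.
     *
     * @param gross    amount before discount, in cents
     * @param discount discount amount, in cents
     * @return payable amount in cents
     */
    public static long payable(long gross, long discount) {
        // Tan, 2020-09-03, apply discount before tax per revised workflow (INC-1234)
        return gross - discount;
    }

    public static void main(String[] args) {
        System.out.println(payable(10000, 1500)); // 8500 cents payable
    }
}
```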

Reusability

Optimised code can be refactored into methods that are write-once-run-anywhere (at the code level). This includes system variables and constants. Also in this category are Util classes that serve a common, generic purpose.
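For instance, a minimal Util class of the kind described (the null-safe trim helper is just an illustration of the pattern):

```java
public final class TextUtil {
    private TextUtil() {} // utility class: no instances

    /** Null-safe trim: returns an empty string for null input. */
    public static String safeTrim(String s) {
        return s == null ? "" : s.trim();
    }

    public static void main(String[] args) {
        System.out.println("[" + safeTrim("  hello  ") + "]");
        System.out.println("[" + safeTrim(null) + "]");
    }
}
```

Writing the null handling once here means no caller ever repeats (or forgets) it.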

OWASP

Security should never be an afterthought, and OWASP's guidelines remain the recommended baseline for web developers to work with. Ensure proper validations (even basic "!= null" checks); this will pay off in the long run when code scans and security tests come around.
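A small sketch of what "proper validation" can look like in practice: the basic null check plus an allow-list pattern (the username policy here is entirely hypothetical):

```java
import java.util.regex.Pattern;

public final class InputValidator {
    // Allow-list: alphanumerics only, 1-32 characters (hypothetical policy).
    private static final Pattern USERNAME = Pattern.compile("^[A-Za-z0-9]{1,32}$");

    /** Null-safe, allow-list validation: reject anything not explicitly permitted. */
    public static boolean isValidUsername(String input) {
        return input != null && USERNAME.matcher(input).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValidUsername("alice01"));            // valid
        System.out.println(isValidUsername(null));                 // the basic "!= null" check
        System.out.println(isValidUsername("bob'; DROP TABLE--")); // injection-looking input rejected
    }
}
```

Allow-lists tend to survive code scans better than deny-lists, since anything unexpected fails closed.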

LogLevels

Logging is important, but so is the correct use of log levels. Use INFO level for production environments, and aim to output a single line containing all the useful information rather than generating extraneous logs. A sub-point would be to avoid logging sensitive data, sanitising the output where necessary.
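To illustrate the single-line-with-masking idea (System.out stands in for your logging framework's INFO call here, and the order and card values are fabricated):

```java
public class OrderLogDemo {
    /** Masks all but the last four digits of a card number. */
    static String mask(String pan) {
        if (pan == null || pan.length() < 4) return "****";
        return "****" + pan.substring(pan.length() - 4);
    }

    public static void main(String[] args) {
        String orderId = "ORD-1001";
        String cardNumber = "4111111111111111";

        // One INFO-worthy line carrying all the useful context, sensitive data masked.
        System.out.println("INFO order processed id=" + orderId
                + " card=" + mask(cardNumber) + " status=OK");
    }
}
```

One greppable line per event keeps production logs small and useful; the mask keeps the card number out of them entirely.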


The above are meant to highlight specific areas of focus, in the name of brevity. Do you have any others you'd consider adding to, or replacing on, the list?

Wednesday, August 19, 2020

StackOverflowError during Maven assembly

While trying to rebuild a project after reviewing changes from a team member, the build threw a StackOverflowError (not the website) in the process of assembling the final JAR file.

Exception in thread "main" java.lang.StackOverflowError
    at sun.nio.cs.SingleByte.withResult(SingleByte.java:44)
    at sun.nio.cs.SingleByte.access$000(SingleByte.java:38)
    at sun.nio.cs.SingleByte$Encoder.encodeArrayLoop(SingleByte.java:187)
    at sun.nio.cs.SingleByte$Encoder.encodeLoop(SingleByte.java:219)
    at java.nio.charset.CharsetEncoder.encode(CharsetEncoder.java:579)
    at sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:271)
    at sun.nio.cs.StreamEncoder.write(StreamEncoder.java:125)
    at java.io.OutputStreamWriter.write(OutputStreamWriter.java:207)
    at java.io.BufferedWriter.flushBuffer(BufferedWriter.java:129)
    at java.io.PrintStream.write(PrintStream.java:526)
    at java.io.PrintStream.print(PrintStream.java:669)
    at java.io.PrintStream.println(PrintStream.java:806)
    at org.slf4j.impl.SimpleLogger.write(SimpleLogger.java:381)
    at org.slf4j.impl.SimpleLogger.log(SimpleLogger.java:376)
    at org.slf4j.impl.SimpleLogger.info(SimpleLogger.java:538)
    at org.apache.maven.cli.logging.Slf4jLogger.info(Slf4jLogger.java:59)
    at org.codehaus.plexus.archiver.AbstractArchiver$1.hasNext(AbstractArchiver.java:464)
    at org.codehaus.plexus.archiver.AbstractArchiver$1.hasNext(AbstractArchiver.java:467)
    at org.codehaus.plexus.archiver.AbstractArchiver$1.hasNext(AbstractArchiver.java:467)

Some suggestions pointed at the JVM heap size; another hinted at the thread stack size instead. The latter turned out to be correct. Naturally, setting MAVEN_OPTS as a system environment variable did not help, and neither did restarting Eclipse.

My next line of thought: what if the value was passed to the JRE directly when executing the Maven build from Eclipse?


Here's what I did:

  1. Navigate to Eclipse
  2. Run Configurations > [Select build profile]
  3. JRE tab > VM arguments
  4. Input "-Xss2m"
  5. Apply and Run
The thread stack size for the build is thereby increased to 2MB, the complaint goes away, and the build completes successfully (for me, at least).
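For the curious, the error itself is easy to reproduce, and the depth reached before it fires scales roughly with whatever -Xss is in effect (the exact number varies by JVM and platform, so don't read too much into it):

```java
public class StackDepthDemo {
    static int depth = 0;

    static void recurse() {
        depth++;
        recurse(); // no base case: guaranteed to exhaust the thread stack
    }

    public static void main(String[] args) {
        try {
            recurse();
        } catch (StackOverflowError e) {
            // A larger -Xss (e.g. -Xss2m) lets this reach a proportionally greater depth.
            System.out.println("StackOverflowError at depth " + depth);
        }
    }
}
```

In the Maven case the deep recursion lived inside the plexus-archiver iterator rather than my code, but the remedy is the same: give the thread more stack.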

Thursday, July 2, 2020

Death to Java Applets

Preface
Why are we still on this topic in the second half of the year 2020? Unfortunately, not all enterprise applications can cut off such dependencies as easily as your next-door neighbour hosting their WordPress e-commerce shop in the cloud.

Intro
In the heyday of applets, their UI components and integration with webpages were highly sought after. Certain operations and functionalities were useful way back when, including but not limited to the following:
  1. Remote Method Invocation;
  2. Native Library Interfaces;
  3. Function calls on the local filesystem;
While the world prepares to mourn the passing of the Adobe Flash Player a few months from now, the other veteran from the same era has received less attention, partly because it served less consumer-centric purposes.

In this day and age, what could possibly replace such a utilitarian platform for enterprises that employ web-based applications, built around access via browsers? While it may be apparent to some, it might not be equally obvious to others. Let's break it down some.

Ajax and JSON
The basic transport employed by web apps these days has to be HTTP(S). jQuery, amongst many other libraries, has been a staple for some time. Some of the functionality of Java RMI can effectively be subsumed by asynchronously transmitting JSON to be dealt with on the server side. There is less dependency on Java objects, which proves more useful in a heterogeneous application ecosystem. Testing is simpler via SoapUI or Postman, and there are standard HTTP client libraries available in your favourite language.

Local Services
While Cross-Origin Resource Sharing (CORS) is quite a mouthful, it's rather trivial to implement the access controls in your code. Yes, what I'm proposing is hosting a lightweight server that responds to requests local to the workstation. Of course, security has to be enforced to prevent remote abuse. Once the various concerns have been addressed, a simple tray program makes for a powerful utility supplementing your web application. The service essentially acts as a local IPC conduit between the browser and native functions, using HTTP as the transport and JSON as the message format.
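A minimal sketch of such a local service using the JDK's built-in com.sun.net.httpserver (the /ping endpoint and the allowed origin are hypothetical, and a real tray program would layer authentication on top):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class LocalServiceDemo {
    public static void main(String[] args) throws Exception {
        // Bind to loopback only, so remote hosts cannot reach the service at all.
        HttpServer server = HttpServer.create(new InetSocketAddress("127.0.0.1", 0), 0);

        server.createContext("/ping", exchange -> {
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            // CORS: only our (hypothetical) web app's origin may call this from a browser.
            exchange.getResponseHeaders().add("Access-Control-Allow-Origin", "https://app.example.com");
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            exchange.getResponseBody().write(body);
            exchange.close();
        });
        server.start();
        int port = server.getAddress().getPort(); // ephemeral port for this demo

        // Exercise the endpoint once, much as the browser would via Ajax.
        try (InputStream in = new URL("http://127.0.0.1:" + port + "/ping").openStream()) {
            System.out.println(new String(in.readAllBytes(), StandardCharsets.UTF_8));
        }
        server.stop(0);
    }
}
```

The loopback bind does the heavy lifting security-wise; CORS then controls which pages in the browser may talk to it.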

Web Sockets
Standard REST APIs typically employ asynchronous GET/POST requests, which should suffice if your operations are straightforward enough. The only time you might want to consider taking out the big guns is if you require bidirectional communication with a native device. This is where Web Sockets prove useful. That said, it is less trivial to implement SockJS or STOMP on your mini service.

Conclusion
Taking all the above into consideration, I worked on a side project that was effectively a Local Service program. It would only respond to requests from localhost, and was capable of loading pluggable handler classes at runtime. These plugins each serve a different purpose, each running off a different context path. No Web Sockets for now. Operations that could be offloaded to the server side were handled that way, but we still had to deal with certain tasks local to the user workstation. This was our way out.

Let me know if you've found a different way out.

The Java Applet is dead. Long live the Java Applet.

Thursday, January 23, 2020

Poor Image Quality in Generated PDF

Happy New Year!

We recently had the opportunity to upgrade some code related to PDF generation. The source was previously written to manually position elements (e.g. 100px from the left, 20px from the top) in the output. The revised strategy was to adopt a Word document as the template, with placeholders prepared; an XML document is then bound to the template at runtime. This is trivial only for text content, though. The images were slightly more complicated, but we were able to overcome that with slight adjustments to Docx4J.

The problem arose, however, when we noticed that an image with fine detail would appear very poor in quality when viewed in the PDF. Yet, if the same image were copied from the PDF and pasted into a separate image viewer, it looked clean and crisp.

A couple of my initial assessments led to dead ends:
  1. It was not due to AffineTransform having lousy output;
  2. It was not due to the process of converting the DOCX to PDF format;
  3. It did not make a difference regardless of the image format provided (PNG/JPEG/BMP);
  4. It was not due to the difference of colour spaces (e.g. BufferedImage.TYPE_INT_ARGB);
  5. It did not make a difference setting the DPI into the PNG metadata;

After a bit of investigation, stepping into the Docx4J classes at runtime, I noticed that the DPI handling was suspect. Diving into the library sources, I saw that the preprocessing stage for PDF generation attempts to determine the dimensions of each image. From there, I surmised that I could compensate with further changes on my end.

import org.docx4j.dml.CTPositiveSize2D;
import org.docx4j.dml.wordprocessingDrawing.Inline;
import org.docx4j.openpackaging.parts.WordprocessingML.BinaryPartAbstractImage;

//create the image part from the raw bytes
BinaryPartAbstractImage imagePart = BinaryPartAbstractImage.createImagePart(wordMLPackage, imageBytes);
//derive the inline element
Inline inline = imagePart.createImageInline( null, "image alt", 0, 1, true);
//retrieve the computed extent, then scale it down to fix the effective DPI
CTPositiveSize2D ext = inline.getExtent();
ext.setCx((long) (ext.getCx()*0.75));
ext.setCy((long) (ext.getCy()*0.75));
inline.setExtent(ext);

After adding the fix (the extent scaling above), the image appeared much cleaner. When copied into an external image editor, the image looked much closer in scale to its counterpart viewed in the PDF viewer at 100% zoom. The image dimensions (in pixels) would then, of course, have to be adjusted larger to compensate.
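My reading of the 0.75 factor: Word extents are measured in EMUs (914,400 per inch), and 0.75 matches the 72-points-per-inch versus 96-pixels-per-inch ratio, so the same pixels get packed into a smaller printed area, i.e. a higher effective DPI. A quick sanity check on the arithmetic (the 2-inch width is an arbitrary example):

```java
public class ExtentMath {
    public static void main(String[] args) {
        final long EMU_PER_INCH = 914_400L;   // OOXML English Metric Units per inch
        long cx = 2 * EMU_PER_INCH;           // hypothetical extent for a 2-inch-wide image
        long scaled = (long) (cx * 0.75);     // the 0.75 factor = 72 pt / 96 px per inch
        System.out.println(scaled);           // scaled extent in EMUs
        System.out.println(scaled / (double) EMU_PER_INCH); // back to inches: 1.5
    }
}
```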

As an aside, I also learnt about 2 new units of measurement:
  1. "mpt" - millipoints
  2. "twip" - twentieth of a point
Not that they are useful in any way outside of this situation.