So, Java 9 came out last year... What now? Where to get started? If that's what you're asking yourself, then you've come to the right place! This Java 9 tutorial is a condensation of all you need to know to find your way around the new release, to get you ready to explore it in more depth. Most topics begin with a block of code, so you can see right away how it works.
We start with setup (including tool support and migration challenges) before coming to Java 9's upsides: language changes (e.g. private interface methods), new and improved APIs (e.g. collection factory methods and improvements to streams and optionals), changes to the JVM (e.g. multi-release JARs), and finally the new release's flagship feature, the module system. There will be plenty of links for you to explore these topics further.
▚Getting Started With Java 9
You can download JDK 9 from Oracle. Personally, I prefer to download ZIPs and just unpack them instead of making JDK 9 my default JVM, but that might be a leftover from using the early-access builds. Nowadays you could give it a try.
▚Tool Support
For the best integration into your favorite IDE, you should use its most current version, as Java 9 support is constantly being improved. If the cutting edge isn't for you, you should at least be on IntelliJ IDEA 2017.2 or Eclipse Oxygen.1a (before that version, Eclipse needed Java 9 support plugins - they are obsolete now).
Similarly, use a current version of your build tool. For Maven, that means at least 3.5.0 of Maven itself (although, e.g., this bug was only fixed in 3.6.1) and 3.7.0 of the compiler plugin. For Gradle, use at least 4.2.1.
⇝ Six tips for running Maven on Java 9.
▚Migration Challenges
While modularization remains fully optional, migrating to Java 9, i.e. simply building and executing a project on the new release, may require a few changes. The entire JDK has been modularized, and together with some other internal changes this causes migration challenges when compiling and running code on Java 9. They can usually be fixed in the short term with new command line options, so you can take your time to properly resolve them.
Here are the seven most common challenges you might encounter:
- illegal access to internal APIs
- dependencies on Java EE modules
- split packages
- casting to `URLClassLoader`
- rummaging around in runtime images
- boot class path
- new version strings
⇝ Read my post on migration challenges to learn how to overcome them.
▚Language Changes
Java 8 revolutionized how we write code - Java 9 does not even get close. But it does improve a few details and people looking for clean and warning-free code will appreciate them.
▚Private Interface Methods
public interface InJava8 {
default boolean evenSum(int... numbers) {
return sum(numbers) % 2 == 0;
}
default boolean oddSum(int... numbers) {
return sum(numbers) % 2 == 1;
}
// before Java 9, this had to be `default`
// and hence public
private int sum(int[] numbers) {
return IntStream.of(numbers).sum();
}
}
As you can see, private interface methods are just that: the possibility to add private methods to interfaces. They are exactly like other private methods:

- can not be `abstract`, i.e. must contain a body
- can not be overridden
- can only be called in the same source file
Their only use case is to share code between default methods without requiring you to add another default method to the interface's API.
▚Try With Effectively Final Resources
void doSomethingWith(Connection connection)
throws Exception {
// before Java 9, this had to be:
// try (Connection c = connection)
try(connection) {
connection.doSomething();
}
}
If `connection` is effectively final, you can write `try (connection)` instead of the laborious `try (Connection c = connection)` that you had to use before Java 9. Finally!
▚Diamond Operator
<T> Box<T> createBox(T content) {
// before Java 9, we had to put `T` there
return new Box<>(content) {
// useless anonymous class
};
}
The diamond operator can now be applied to anonymous classes. In some cases the compiler might infer a type that the Java type system cannot express (I didn't know those existed; they are called non-denotable types), in which case you get a compile error (this is the reason why diamonds on anonymous classes were not allowed in the first place). Here's an example:
Box<?> createCrazyBox(Object content) {
List<?> innerList = Arrays.asList(content);
// compile error
return new Box<>(innerList) {
// useless anonymous class
};
}
▚Private Safe Varargs And Fewer Deprecation Warnings
import java.io.LineNumberInputStream;
@Deprecated
public class DeprecatedImportsAndSafeVarargs<T> {
LineNumberInputStream stream;
@SafeVarargs
private void compareToNext(T... args) {
// [...]
}
}
On Java 8, the import directive would cause a warning because `java.io.LineNumberInputStream` is deprecated, and the `@SafeVarargs` annotation would cause a compile error because it was not applicable to non-final methods. From Java 9 on, imports no longer cause deprecation warnings and `@SafeVarargs` can be applied to private methods (final or not).
▚New And Improved APIs
The lack of cohesion of the new and improved APIs might make it seem that nothing much happened, but that's far from the truth! Much work went into them - they just don't have a well-marketable label like "Streams and Lambdas".
▚Stream API
public Stream<LogMessage> fromWarningToError() {
return messages.stream()
.dropWhile(message -> message.lessThan(WARNING))
// this actually excludes the error
        .takeWhile(message -> message.lessThan(ERROR));
}
The stream API saw good improvements, of which a single example can only show a little. The changes are:
- `Stream::ofNullable` creates a stream of either zero or one element, depending on whether the parameter passed to the method was `null` or not.
- `Stream::iterate` creates a stream much like a `for` loop (see the snippet after this list).
- `Stream::dropWhile` takes a predicate and removes elements from the stream's beginning until the predicate fails for the first time - from then on, the stream remains the same and no more elements are tested against the predicate (unlike `filter` would do).
- `Stream::takeWhile` takes a predicate and returns elements from the stream's beginning until the predicate fails for the first time - there the stream ends and no more elements are tested against the predicate (unlike `filter` would do).
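To make the two new factory methods concrete, here is a minimal, self-contained sketch (the class name `StreamAdditions` and the values are made up for illustration):

import java.util.stream.Stream;

public class StreamAdditions {

    public static void main(String[] args) {
        // Stream::ofNullable - empty stream for null, one-element stream otherwise
        Stream.ofNullable(null).forEach(System.out::println); // prints nothing
        Stream.ofNullable("value").forEach(System.out::println); // prints "value"

        // Stream::iterate with a hasNext predicate - behaves much like a for loop
        Stream.iterate(1, i -> i <= 5, i -> i + 1)
            .forEach(System.out::println); // prints 1 to 5
    }

}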
⇝ More on stream improvements.
▚Optional API
public interface Search {
Optional<Customer> inMemory(String id);
Optional<Customer> onDisk(String id);
Optional<Customer> remotely(String id);
default void logLogin(String id, Logger logger) {
inMemory(id)
.or(() -> onDisk(id))
        .or(() -> remotely(id))
.ifPresentOrElse(
logger::customerLogin,
() -> logger.unknownLogin(id));
}
}
The `Optional` API was improved as well - too much for a single example. The changes are:
- `Optional::stream` creates a stream of either zero or one element, depending on whether the optional is empty or not - great to replace `.filter(Optional::isPresent).map(Optional::get)` stream pipelines with `.flatMap(Optional::stream)` (see the snippet after this list).
- `Optional::or` takes a supplier of another `Optional` and, when empty, returns the instance supplied by it; otherwise returns itself.
- `Optional::ifPresentOrElse` extends `Optional::ifPresent` to take an additional parameter, a `Runnable`, that is called if the `Optional` is empty.
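As a small, hypothetical sketch of that replacement (class and variable names are illustrative):

import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class OptionalStreamExample {

    public static void main(String[] args) {
        Stream<Optional<String>> optionals =
            Stream.of(Optional.of("a"), Optional.empty(), Optional.of("c"));

        // before Java 9: .filter(Optional::isPresent).map(Optional::get)
        List<String> present = optionals
            .flatMap(Optional::stream)
            .collect(Collectors.toList());

        System.out.println(present); // prints [a, c]
    }

}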
⇝ More on `Optional` improvements.
▚Collection Factories
List<String> list = List.of("a", "b", "c");
Set<String> set = Set.of("a", "b", "c");
Map<String, Integer> mapImmediate = Map.of(
"one", 1,
"two", 2,
"three", 3);
Map<String, Integer> mapEntries = Map.ofEntries(
entry("one", 1),
entry("two", 2),
entry("three", 3));
The new collection factory methods `List::of`, `Set::of`, `Map::of` return collections that:
- are immutable (unlike e.g. `Arrays::asList`, where elements can be replaced) but do not express that in the type system - calling e.g. `List::add` causes an `UnsupportedOperationException` (see the snippet after this list)
- roundly reject `null` as elements/keys/values (unlike `ArrayList`, `HashSet`, and `HashMap`, but like `ConcurrentHashMap`)
- for `Set` and `Map`, randomize iteration order between JDK runs
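Here's a short, hedged demonstration of the first two points (the class name `FactoryBehavior` is made up):

import java.util.List;

public class FactoryBehavior {

    public static void main(String[] args) {
        List<String> letters = List.of("a", "b", "c");
        try {
            letters.add("d");
        } catch (UnsupportedOperationException ex) {
            System.out.println("List.of returns an immutable list");
        }

        try {
            List.of("a", null);
        } catch (NullPointerException ex) {
            System.out.println("null elements are rejected");
        }
    }

}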
⇝ Here's a good introduction to collection factory methods.
▚Reactive Streams
I'm gonna break with the code-first approach here, because for reactive streams there is too much code involved - have a look at the demo.
Reactive streams require three basic types:

- `Publisher` produces items to consume and can be subscribed to.
- `Subscriber` subscribes to a publisher and offers the methods `onNext` (for new items to consume), `onError` (to inform if the publisher encountered an error), and `onComplete` (if the publisher is done).
- `Subscription` is the connection between publisher and subscriber and can be used to `request` items or `cancel` the subscription.
The programmatic flow is as follows:

- Creation and subscription:
    - create `Publisher pub` and `Subscriber sub`
    - call `pub.subscribe(sub)`
    - `pub` creates `Subscription script` and calls `sub.onSubscribe(script)`
    - `sub` stores `script`
- Streaming:
    - `sub` calls `script.request(10)`
    - `pub` calls `sub.onNext(element)` (max 10x)
- Canceling:
    - `pub` may call `sub.onError(err)` or `sub.onComplete()`
    - `sub` may call `script.cancel()`
There are no reactive APIs in JDK 9. For now, it only contains these interfaces (in `java.util.concurrent.Flow`) to offer reactive libraries like RxJava, which implement those interfaces, a common integration point in the JDK. In the future, JDK APIs might make use of them themselves.
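If you do want to see the moving parts in code, here is a minimal, hypothetical sketch using `SubmissionPublisher`, the JDK's ready-made `Flow.Publisher` implementation (the `PrintSubscriber` class and the string items are made up for illustration):

import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class FlowDemo {

    public static void main(String[] args) throws InterruptedException {
        // SubmissionPublisher implements Flow.Publisher and AutoCloseable;
        // closing it completes all subscribers
        try (SubmissionPublisher<String> publisher = new SubmissionPublisher<>()) {
            publisher.subscribe(new PrintSubscriber());
            publisher.submit("hello");
            publisher.submit("flow");
        }
        // delivery is asynchronous - give it a moment to finish
        Thread.sleep(500);
    }

    // illustrative subscriber that requests one element at a time
    static class PrintSubscriber implements Flow.Subscriber<String> {

        private Flow.Subscription subscription;

        @Override
        public void onSubscribe(Flow.Subscription subscription) {
            this.subscription = subscription;
            subscription.request(1);
        }

        @Override
        public void onNext(String item) {
            System.out.println("Received: " + item);
            subscription.request(1);
        }

        @Override
        public void onError(Throwable error) {
            error.printStackTrace();
        }

        @Override
        public void onComplete() {
            System.out.println("Done");
        }

    }

}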
⇝ Here's a good introduction to the flow API.
▚Stack-Walking
private static Class<?> getCallingClass() {
return StackWalker
.getInstance(RETAIN_CLASS_REFERENCE)
.walk(frames -> frames
.map(StackFrame::getDeclaringClass)
.filter(declaringClass -> declaringClass != Utils.class)
.findFirst()
        .orElseThrow(IllegalStateException::new));
}
The new stack-walking API makes it easier to walk the Java call stack and considerably improves performance of partial walks (e.g. when only the immediate caller is needed, as logging frameworks do) and of walks that require cheaper information (i.e. no source code information like line numbers).
The trick is to first get a `StackWalker` instance and then hand a `Function<Stream<StackFrame>, T>` (plus wildcards) to `walk`, so when the walker hands you a stream of frames, you do your thing and compute your `T` (in the case above, finding the `Class` that called into `Utils`), which `walk` will then return.

Why doesn't `walk` simply return a `Stream<StackFrame>`? Because the stream is lazy (that's the whole point of the new API) and you could get weird results when evaluating it at some random future time. Hence `walk` forces you to evaluate the frames within its call.
⇝ Deep dive into stack-walking API.
▚OS Processes
public static void main(String[] args) throws Exception {
// tree -i /home/nipa | grep pdf
ProcessBuilder ls = new ProcessBuilder()
.command("tree", "-i")
.directory(Paths.get("/home/nipa").toFile());
ProcessBuilder grepPdf = new ProcessBuilder()
.command("grep", "pdf")
.redirectOutput(Redirect.INHERIT);
List<Process> lsThenGrep = ProcessBuilder
// new in Java 9
.startPipeline(asList(ls, grepPdf));
System.out.println("Started processes...");
CompletableFuture[] lsThenGrepFutures = lsThenGrep.stream()
// onExit returns a CompletableFuture<Process>
.map(Process::onExit)
.map(processFuture -> processFuture.thenAccept(
process -> System.out.println(
"Process " + process.getPid() + " finished.")))
.toArray(CompletableFuture[]::new);
// wait until all processes are finished
CompletableFuture
.allOf(lsThenGrepFutures)
.join();
System.out.println("Processes done");
}
The process API got a few new methods to create process pipelines as well as new methods on `Process`...

- `boolean supportsNormalTermination()`
- `long pid()`
- `CompletableFuture<Process> onExit()`
- `Stream<ProcessHandle> children()`
- `Stream<ProcessHandle> descendants()`
- `ProcessHandle toHandle()`

... and a new type `ProcessHandle` with some interesting static factory methods:

- `Stream<ProcessHandle> allProcesses()`
- `Optional<ProcessHandle> of(long pid)`
- `ProcessHandle current()`
Looks like all you need to build a simple task manager with Java. 😊
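As a hedged sketch of that idea (the class name, the output format, and the limit of ten processes are arbitrary):

public class TinyTaskManager {

    public static void main(String[] args) {
        // list the first ten processes whose command is visible to us
        ProcessHandle.allProcesses()
            .filter(process -> process.info().command().isPresent())
            .limit(10)
            .forEach(process -> System.out.println(
                process.pid() + " : " + process.info().command().get()));
    }

}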
▚Version API
Version version = Runtime.version();
System.out.println(
version.major()
+ "." + version.minor()
+ "." + version.security());
Java 9 changed the version scheme (and Java 10 changes it again), which made all that prodding of system properties and parsing their values all the more error-prone.
Java 9 finally resolves that with `Runtime.Version`, which gives you safe access to Java 9's (and 10+'s) version information with methods like `major` and `minor` (which have been renamed for Java 10+ to `feature` and `interim`).
▚Further Changed APIs
- multi-resolution images (JEP 251)
- native desktop integration (JEP 272)
- deserialization filter (JEP 290)
- experimental HTTP/2 support (JEP 110; **fully supported in Java 11**), DTLS (JEP 219), TLS ALPN and OCSP stapling (JEP 244)
▚JVM Changes
Not only the language and APIs were improved, though. The JVM got some new features as well. Naturally, it's a little tougher to show them code-first, but I'll do my best.
▚Multi-Release JARs
Say you have a class `Main` and another one `Version`. `Version` is special because you need it to run different code on Java 8 and 9. With multi-release JARs you can do that as follows:
- write `Main` for Java 8 and compile it into the folder `classes-8`
- create two implementations of `Version` with the same fully-qualified name and the same public API; one targets Java 8, the other Java 9 (sketched below)
- compile them into two different folders `classes-8` and `classes-9`
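For illustration, the two `Version` implementations might look like this - a hedged sketch where the package name and the `describe` method are made up:

// classes-8/org/codefx/demo/Version.java - compiled for Java 8
package org.codefx.demo;

public class Version {

    public String describe() {
        // Java 8 has no Runtime.Version, so fall back to the system property
        return System.getProperty("java.version");
    }

}

// classes-9/org/codefx/demo/Version.java - same fully-qualified name and public API
package org.codefx.demo;

public class Version {

    public String describe() {
        // Java 9 can use the new Runtime.Version API
        return Runtime.version().toString();
    }

}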
With Java 9's `jar` you can do this:

jar --create --file mr.jar
    -C classes-8 .
    --release 9 -C classes-9 .
Without the last line, that command is just the typical way to package a bunch of classes into a JAR, which would look like this:
└ org
    └ codefx ... (moar folders)
        ├ Main.class
        └ Version.class
With the last line the JAR looks like this, though:
└ org
    └ codefx ... (moar folders)
        ├ Main.class
        └ Version.class
└ META-INF
    └ versions
        └ 9
            └ org
                └ codefx ... (moar folders)
                    └ Version.class
JVMs up to Java 8 ignore the `META-INF/versions` folder, but Java 9 will first look there when loading classes. That means running the JAR on Java 9 will execute a different `Version` class than running it on Java 8. With multi-release JARs you can create artifacts that execute different code, depending on the JVM version they run on. This allows your library to use the best API on each JVM version, for example the throwable-creating (for stack information) and property-parsing (for version information) approaches on Java 8 and earlier and the `StackWalker` and `Runtime.Version` APIs on Java 9.
⇝ Read my detailed guide to multi-release JARs.
▚Redirected Platform Logging
No code this time, because you're unlikely to write any.
This is the job of your favorite logging framework's maintainers, so they can get their project ready to be the backend for all JDK log messages (not JVM logging).
Because from Java 9 on, the JDK sends its log messages through a set of interfaces (`System.LoggerFinder`, `System.Logger`) for which logging frameworks can provide implementations. This feature works well with multi-release JARs, which allow a framework to work fine on older Java versions while benefiting from the additional functionality when run on Java 9.
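Just to give an impression of what such a backend looks like, here is a minimal, hypothetical `System.LoggerFinder` that writes to `System.out` - a real framework would forward to its own loggers and register the finder as a service (registration is omitted here):

import java.text.MessageFormat;
import java.util.ResourceBundle;

public class ConsoleLoggerFinder extends System.LoggerFinder {

    @Override
    public System.Logger getLogger(String name, Module module) {
        return new ConsoleLogger(name);
    }

    private static class ConsoleLogger implements System.Logger {

        private final String name;

        ConsoleLogger(String name) {
            this.name = name;
        }

        @Override
        public String getName() {
            return name;
        }

        @Override
        public boolean isLoggable(Level level) {
            return true;
        }

        @Override
        public void log(Level level, ResourceBundle bundle, String message, Throwable thrown) {
            System.out.println("[" + level + "] " + name + " - " + message);
        }

        @Override
        public void log(Level level, ResourceBundle bundle, String format, Object... params) {
            // the format string uses MessageFormat-style placeholders
            String message = params == null || params.length == 0
                ? format
                : MessageFormat.format(format, params);
            System.out.println("[" + level + "] " + name + " - " + message);
        }

    }

}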
▚Unified Logging
$ java -Xlog:gc*=debug -version
> [0.006s][info][gc,heap] Heap region size: 1M
> [0.006s][debug][gc,heap] Minimum heap 8388608  Initial heap 262144000  Maximum heap 4192206848
# truncated about two dozen messages
> [0.072s][info ][gc,heap,exit ] Heap
# truncated a few messages showing final GC statistics
This time it's about JVM logging. Thanks to a unified infrastructure (JEP 158, JEP 271), log messages from most (in the future, all) JVM subsystems can be configured with the same command line flag. Internally, it works similarly to common logging frameworks, with messages getting a level, a message, a time stamp, tags, etc. What's a little unusual is the configuration with `-Xlog`.
⇝ In-depth guide to unified logging.
▚JVM Performance Improvements
As usual, the JVM once again got faster in Java 9. Here's a list of the performance-related changes:
- compact strings reduce average heap size by 10% to 15% (JEP 254)
- improved ("indified") string concatenation significantly reduces overhead when putting strings together (JEP 280)
- Java 9 is aware of cgroup memory limits, which makes it play nicer with Docker et al (this was backported to Java 8)
- something with interned strings and class data sharing (JEP 250)
- improved contended locking reduces the overhead of contended object monitors (JEP 143)
- security manager performance hit was reduced (JEP 232)
- Java 2D rendering got better with the Marlin renderer (JEP 265)
⇝ There's a great talk by Aleksey Shipilëv about the challenges and impact of implementing compact strings and indified string concatenation.
▚Further JVM Changes
There are many more changes I can't go into detail on. For something approaching completeness, I will list them instead.
- new version strings (JEP 223)
- GNU-style command line options (JEP 293)
- command line flag validation (JEP 245)
- reserved stack areas (JEP 270)
▚Module System
The Java Platform Module System (JPMS) is undoubtedly Java 9's major feature. It posits that artifacts should no longer be plain JARs but JARs that describe a module, modular JARs, so to speak, and that they should be represented at runtime as modules.
A JAR is made modular by adding a module descriptor, `module-info.class`, which gets compiled from a module declaration, `module-info.java`:
module com.example.project {
requires org.library;
requires io.framework;
exports com.example.project.pack;
}
As you can see, a module has a name, expresses dependencies, and defines some exports. The module system has many features, but its two cornerstones are:
- making sure all required modules are present when an application gets compiled or launched (called reliable configuration)
- preventing access to all classes except the public ones in exported packages (called strong encapsulation)
This allows the compiler and runtime to fail faster when dependencies are missing or code does things it's not supposed to, and will make Java applications, particularly large ones, more stable.
Other interesting features are more refined imports and exports (e.g. optional dependencies), services, and the possibility to create runtime images with `jlink` that contain exactly the modules your application needs.
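For illustration, a hypothetical module declaration using a few of these features might look like this (all module and package names are made up):

module com.example.project {
    // optional dependency: required at compile time, optional at run time
    requires static org.annotations;
    // consume a service provided by other modules
    uses com.example.spi.Plugin;
    // provide a service implementation of our own
    provides com.example.spi.Plugin
        with com.example.project.internal.DefaultPlugin;
    // qualified export: the package is only visible to the named module
    exports com.example.project.internal to io.framework;
}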
By aligning the JVM's conception (which sees all code on the class path as a big ball of mud) with ours (which usually sees trees of dependencies between artifacts that have names, dependencies, and APIs), a jarring conceptual dissonance is mended.
To process modules, the module system introduces a concept paralleling the class path: the module path. It expects modular JARs and represents the artifacts it finds as modules.
The class path won't go anywhere, though, and remains a completely appropriate way to build and run projects. This and a few specific mechanisms (mostly the unnamed module and automatic modules) allow the Java ecosystem's projects to modularize almost independently from one another, without forcing any project to either go modular or stay plain against its maintainers' will.
For a thorough introduction to the module system:
- ⇝ read the Code-First Java 9 Module System Tutorial
- ⇝ get my book The Java Module System (Manning)
▚Reflection
And that's it. Phew...