Rob Gingell, chief engineer, fellow, and vice president at Sun Microsystems, Inc. is responsible for the overall technical architecture and direction of Sun's product line.
He is also chair of the Java Community Process[sm] (JCP[sm]) program. As part of JavaWorld's ongoing Java Community Process coverage, JavaWorld columnist Frank Sommers interviews Rob in the Q&A that follows.
Sun has long claimed that Java technology needs a high level of compatibility to
preserve the "Write Once, Run Anywhere" promise, and that open-source licenses could
not enforce that degree of compatibility. Hence the Java Community Process (JCP) program, and
its requirement that specifications and reference implementations be accompanied by a
technology compatibility kit (TCK): all subsequent implementations of a JCP-originated Java
standard must pass those TCKs if they claim compatibility with that standard.
The latest changes to the Java Community Process program - JCP 2.5, inaugurated in October
2002 - give Java Specification Request (JSR) leads leeway in deciding licensing
policy for their work: JSRs can now be developed in an open-source manner, and the
resulting work, including the reference implementation, may also be licensed under an
open-source license, such as the GPL. How does the need to ensure compatibility mesh
with the new JCP policy?
Your question involves a confusion between two aspects that tend to get mixed up: the
process by which specifications are developed, and the manner in which the resulting
work is licensed. The JCP is a process. It does not
actually license anything directly. Project leaders under the JCP are the licensors.
When Sun is the licensor, we use SCSL [Sun Community Source License], a license that
requires derivative uses of the work to maintain compatibility with the specification
as part of the terms for having access to the work.
The JCP now requires that the work products of a JSR - specification, reference
implementation (RI), and conformance tests (TCKs) - be licensed separately from each
other. The general principle is that those doing the work should be able to decide
how they make their work available.
How does that licensing freedom still ensure that all JSR implementations are compatible?
To illustrate the question, suppose that I propose and develop a new Java-based API
through the JCP - say, an API to interact with Java-enabled toaster ovens. After the
work is complete, I release the reference implementation under an Apache-style
license. Given that license's terms, anyone can now download my implementation and
hack away at the code. At some point Big Bad Toasters, Inc., decides to fork my
original Java toaster-API implementation such that the new code branch works only
with their toaster ovens and breaks some of the code written against my original
implementation. Can that occur under the JCP 2.5?
As specification lead of the Java toaster JSR, you have decided to make the reference
implementation (RI), and maybe the TCK, available under an open-source license. If
Big Bad Toasters (BBT) takes that RI and creates an incompatible derivative work from
it while still claiming to implement the specification, then they would be in
violation of the specification license.
On the other hand, if they took your work and implemented a completely different
specification from it, say,
com.BBT.Toaster, that would be a legitimate,
though annoying, thing for them to do. The specification IP [intellectual property] protection says that you
can't lie to Java programmers about what Java is. The JCP defines that truth, and the
materials produced in JSRs are used to validate it, and collectively the artifacts
and the process work to maintain that assurance. In this case, however, they're not
lying to anyone. They're not claiming it to be an implementation of the JSR, nor are
they offering an artifact that would poach upon developers who were expecting it to
be the JSR, since it lives in BBT's namespace.
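The namespace distinction can be sketched in code. Everything below is hypothetical - the toaster API, the class names, and the packages (a real JSR API would live in something like javax.toaster, and BBT's alternative in its own com.bbt namespace); nested types stand in for separate packages so the sketch stays self-contained:

```java
// Sketch of the distinction: claiming the JSR contract vs. competing from
// your own namespace. All names here are hypothetical illustrations.
public class NamespaceSketch {

    // Stand-in for the JSR-defined API (would be javax.toaster.Toaster).
    interface Toaster {
        String toast(String bread);
    }

    // A conforming implementation: it claims the JSR contract, so to call
    // itself compatible it must pass the JSR's TCK.
    static class CommunityToaster implements Toaster {
        public String toast(String bread) {
            return "toasted " + bread;
        }
    }

    // BBT's alternative lives in its own namespace (would be com.bbt.Toaster)
    // and makes no claim about the JSR - legitimate, though annoying, since
    // it competes with the specification rather than misrepresenting it.
    static class BbtToaster {
        String bbtToast(String bread) {
            return "BBT-toasted " + bread;
        }
    }

    public static void main(String[] args) {
        Toaster jsr = new CommunityToaster();
        BbtToaster bbt = new BbtToaster();
        System.out.println(jsr.toast("rye"));
        System.out.println(bbt.bbtToast("rye"));
    }
}
```

The point of the sketch is that nothing in BbtToaster lies to a Java programmer: no type in the JSR's reserved namespace is touched, and no compatibility claim is made.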
If you make your RI available without the requirement that others using it create
implementations compatible to the original specification, a competitor, such as BBT,
can create a different API and bootstrap their market effort with your work. If they
then work in the marketplace to cause their API to become popular, that's
permissible, unless they infringed on a patent or other IP not licensed to them.
They can also operate much more communally to the same effect. For instance, they can
subclass your JSR-specified API. They will then be compatible with all existing
applications of the original API, but create some new functionality which has great
appeal. If they're successful, it may come to pass that there is no application for
toasters which ends up directly using your original API anymore, making it
marketplace-irrelevant. BBT has done nothing harmful, but merely made its
non-community-defined extensions more popular than the base defined by the JCP.
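The subclassing route Gingell describes can also be sketched. Again, all the names are hypothetical; the point is that existing callers of the JSR-specified type keep working while new code comes to depend on an extension the JCP never defined:

```java
// Sketch of the "communal" route: subclass the JSR-specified type, stay
// compatible with existing applications, and add appealing extensions.
// All class and method names are hypothetical.
public class SubclassSketch {

    // Stand-in for a JSR-specified class.
    static class Toaster {
        String toast(String bread) {
            return "toasted " + bread;
        }
    }

    // BBT's extension: substitutable wherever a Toaster is expected, but
    // bagelMode() is functionality the community specification never defined.
    static class BbtToaster extends Toaster {
        String bagelMode(String bagel) {
            return "perfectly toasted " + bagel;
        }
    }

    // Existing application code, written against the original API only.
    static String breakfast(Toaster t) {
        return t.toast("wheat");
    }

    public static void main(String[] args) {
        BbtToaster t = new BbtToaster();
        System.out.println(breakfast(t));         // old API still works
        System.out.println(t.bagelMode("plain")); // the proprietary draw
    }
}
```

If enough new code calls bagelMode() directly, the base API becomes marketplace-irrelevant exactly as described, even though nothing incompatible was ever shipped.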
There's nothing about the JCP that prevents people from succeeding with Java technology, even
where that success overshadows the JCP - that's what competition is all about.
Indeed, the changes adopted in the JCP serve to increase the competition for
compatible implementations. The main thing the JCP strives for is to ensure that
those who write Java applications are not lied to by those who make the
implementations they build and deliver upon.
Does a JCP JSR represent an "official" Java standard? What would prevent developers
from favoring a non-JCP-developed solution to a problem over a relevant JSR?
The JCP reserves the java.* and javax.* namespaces for JSR-defined work. Other
than namespace use, there's nothing about the JCP that requires anyone to use the
APIs created through it. If there were, say, a
javax.toaster.* family of
classes, there's nothing that stops anyone from working outside the JCP to create an alternative.
The JCP would tend to resist having a competing
activity under its roof, but couldn't do anything about someone setting up a
completely different thing in competition.
One reason we don't see instances of many competing APIs is that there's a general
appreciation that Java's value lies mostly in the over 3 million developers who see
that an investment in a single set of skills gives them a wide market in which to
work. Fragmentation would be inconsistent with the value proposition perceived by
those developers, and thus counter-productive to the motivations that would lead one
to want to make toaster-based APIs in the first place.
Ultimately any community is defined by what makes up [its members'] common
self-interest, and while that self-interest might be codified into agreements and
practices and process rules, what really makes it work is the shared set of values
behind it. If you're building something you want developers to target, you're not
well-served by fragmenting that developer pool.
You mentioned in an interview on www.sun.com that customers care mostly
about binary compatibility, given that it's binary code that delivers the benefits of
the software they use. When the JCP talks about compatibility, does it mean binary or
source-level (API) compatibility? In general, could you please explain a bit the
differences, and what the JCP's focus is in that regard?
We actually mean both source and binary level compatibility in the JCP.
We mean source so that the developers who have invested their energies in learning
Java technology have a wide range of uses against which to apply that skill. Source
compatibility is for programmers.
Binary compatibility is for products - and for the users of those products. Some of
those users are also programmers who want to deploy anywhere, so they get a benefit
from binary compatibility too, and that is the source of the notion of "write once, run anywhere."
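One concrete way to see the difference is a library change that is source-compatible but binary-incompatible. The example below is hypothetical: suppose a library widens a method parameter from int to long between releases. Callers recompile cleanly, because int widens to long at the source level; but a caller .class compiled against the old toast(int) descriptor fails at link time with NoSuchMethodError, because the JVM resolves methods by their exact descriptor:

```java
// Source vs. binary compatibility, with a hypothetical library.
public class CompatSketch {

    // Version 2 of the library: toast(long), formerly toast(int).
    // Old binaries linked against toast(int) would throw NoSuchMethodError;
    // old *source* recompiles without any change.
    static class ToasterLib {
        static String toast(long seconds) {
            return "toasted for " + seconds + "s";
        }
    }

    public static void main(String[] args) {
        int seconds = 30;
        // Compiles against v2 because int widens to long - source compatible.
        System.out.println(ToasterLib.toast(seconds));
    }
}
```

This is why a healthy source community can coexist with a fragmented binary marketplace: the two kinds of compatibility break under different changes.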
The UNIX [R] industry was regarded as fragmented by customers because you couldn't
interchange the binary artifacts between systems, either in whole or in part. Yet we
had all those UNIX standards, and everyone claimed to conform to them. So what went wrong?
Well, in some ways, nothing - only, we had an audience mismatch. UNIX was very
successful in having a technology standard - source code was relatively
portable. Linux's rapid rise is in part due to a ready-made set of applications, and
its historical evolution was partly driven by finding an application it couldn't run
and then adding what was needed. The programmers actually did, and largely do, enjoy
"open systems" in UNIX.
End-customers, however, did not. For them, the UNIX standards had about the same import
that steel standards have to a person trying to buy a car: no one cares about them
explicitly, nor makes a purchase decision based explicitly on them. The JCP manages
Java technology in both spaces, providing both programmers and end-users of their work the
benefits they're seeking.
Do you mean that for Java's continued success in the marketplace, binary
compatibility is much more important than source code compatibility?
I think binary compatibility is much more market- and economically relevant than
source compatibility independent of the technology. The power of Java technology stems in part from
being partitioned into two pieces: The JVM [Java Virtual Machine] as the basis for an ISA [instruction set
architecture], which is universal, and then the means used to target it, which is
largely, but not exclusively, Java technology. I wouldn't be surprised to see additional things
targeting the JVM, and some of what we know of as "the Java language" to see some
diversity in coming years as we consider more areas of computing.
There are very few models of the industry that are both simple and accurate.
One which seems to pass that test says that the industry can be modelled by looking
at the positive feedback loop among developers: Developers write applications. That
produces volume, which then attracts more developers, and so on. And that model is
fundamentally a model that applies to binaries. It explains much that source level
compatibility doesn't explain.
For instance, the Solaris operating environment has essentially 100% coverage of the UNIX applications market.
Every UNIX application that exists has a Solaris/SPARC [R] technology instance for it. You could
not, therefore, imagine a more trivial recompilation exercise than to make the same
application available for Solaris/IA32 [Solaris, Intel x86 Edition]. So how come it
didn't happen? According to the source-code theory of the world, it should have.
Or, consider Alpha. How come Digital had to essentially buy off people to make Alpha
versions of applications? Aren't they all UNIX applications? Isn't it just a
recompile, or maybe a recompile with a little work? How come they had to be paid to do it?
Then, when Linux came around, which is really a UNIX/IA32 system, how come all the
applications showed up?
The answer in all [those] cases relates to anticipated volume of binaries. Having a
shared space of binaries is much more vital and powerful than having a shared space
of source. That's not to say that shared spaces of source are not valuable in their
own right. It's just that the properties that attend to them are not the ones that
have historically explained economic behavior in the industry.
You witnessed first-hand the fragmentation of the Unix market in the 1980s and early
90s. During all that time, Unix was not open-source. On the other hand, Linux, which
is open-source, has thus far exhibited remarkable coherence, numerous Linux
distributions notwithstanding: Most programs compiled for Linux run on RedHat,
Mandrake, Caldera, SuSE, or any other Linux variety, without modification. Given the
Unix experience, if the JCP's goal is to preserve Java compatibility, would Java not
be better served in a fully open-source environment?
Fragmentation in the marketplace of products relates to the existence of binary
standards. The key point in your question was the phrase "programs compiled for
Linux," yielding a binary. Binaries are largely interchangeable between [Linux]
distributions, and it's that attribute - and not the terms under which the source is
available - that prevents, or causes, fragmentation. No one considers the PC
marketplace "fragmented" - because there's one binary standard even though there's no
source availability at all.
The paradox of Linux's marketplace surge isn't that it's got a community of
developers - UNIX always had that too until around 1990 - but rather the fact that
the volume computer, the PC, has a de facto binary standard for it shared across a
number of suppliers with sufficient volume to matter. Conversely, the reason other
UNIXs are criticized for "fragmentation" is that they never did have a binary
standard because, well, they all arrived on different instruction sets and thus had none.
To be precise about it, we should be including an ISA [instruction-set architecture]
reference when we talk about these binaries - UNIX/IA32 is effectively the space
defined by all the popular Linux distributions, and that's what you were referring to
when you said "compiled for Linux." Indeed, several years ago, we [at Sun] stopped
trying to capture binaries for Solaris/IA32, and simply told anyone who cared that
we'd run the UNIX/IA32 binaries from the far more voluminous Linux/IA32 community. We
weren't going to beat them, and it wasn't an objective to do so. So we just joined them.
The market presence of that "installed base" is the most important factor in
resisting fragmentation. If one of the Linux distributions starts making it necessary
that most, or all, binaries for them have to be unique to that distribution, then
that's going to fragment the market, even if the source is common to other
distributions, and even if the source is available on open-source terms. That new
binary probably wouldn't be successful unless it was someone with a very large
percentage of the market, like Red Hat. They might do it either consciously, as a
device to lock in the share of their distribution in the market, or unconsciously -
gosh, this would sure be better! - but in any case they have to have the wherewithal
to make it happen.
If the source is available, then the means are there to cope with that fragmentation.
That's one appeal of open source. But we shouldn't confuse having the ability to
recompile things and do things ourselves, and the freedom we think that provides,
with a circumstance where that becomes an obligation. I have Linux systems I
use that are convenient because I can sling binaries around. It'd be possible to use
them if I could only sling source, but far less convenient - so much so, in fact,
that I'd hardly put up with it for long.
Java's proven to be successful because it, too, is a binary standard, and indeed the
only existence proof of any such thing managed as such in the industry. Its binaries
substitute universally for any ISA. Java applications are effectively Java/JVM
binaries. There could be others and, indeed, there are Ada/JVM and FORTRAN/JVM binaries.
Java technology established that [binary] standard largely before there was a big population of
applications. Linux on the other hand, is a UNIX system produced 30 years after the
start of UNIX. There was a pile of applications that only needed to be compiled for
it, and once done so, shared among all the distributions. The applications inertia
came largely for free for Linux. That wasn't true for Java technology, which has had to
withstand some determined attempts to force it to fragment at that all-important binary level.
Which leads us to the last part of the question: Would Java technology be better served in a
fully open-source environment? My answer is that it'd be differently served. The
notions of community - involving more intellects than any one organization has - and
the ideas of shared knowledge have a lot of appeal. But with respect to
fragmentation, to be open-sourced or not is an orthogonal issue - what matters is the
presence or absence of a binary standard.
Just as the Solaris ABI [application binary interface] has served to ensure
compatibility between Solaris and its applications, will the Java binary format and
JVM assume similar roles in Sun's future? In other words, will Java be Sun's future ABI?
Yes, I tend to tell people that the primary ABI of our future lies in IP/JVM, JVM
being the ISA, and a collection of IP-based protocols serving the role that we used
to ascribe to operating systems. That's a softer definition than what we said for most of
the 1990s - namely, Solaris/SPARC. That's not a denial of Solaris/SPARC or UNIX or
microprocessor development in general, but is a recognition of the growth of a new
class of applications, enabled by the ubiquity of the network. Those applications add
to our existing business in a powerful way.