joshivineet / protobuf

Automatically exported from code.google.com/p/protobuf

Java code should detect incompatible runtime library version #210

Open GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?
1. Generate Java sources from any proto file using protoc 2.0.3.
2. Compile the generated source files against the protobuf-java 2.0.3 library 
and create a model jar.
3. Use the compiled model jar in a runtime environment with protobuf-java 2.3.0.

What is the expected output? What do you see instead?
It is expected that the runtime would succeed; instead it fails with an 
exception due to method signature changes in 
Descriptors.FileDescriptor.internalBuildGeneratedFileFrom(...)

What version of the product are you using? On what operating system?
CentOS 5 with protobuf 2.0.3 and 2.3.0

Please provide any additional information below.

java.lang.NoSuchMethodError: com.google.protobuf.Descriptors$FileDescriptor.internalBuildGeneratedFileFrom(Ljava/lang/String;[Lcom/google/protobuf/Descriptors$FileDescriptor;Lcom/google/protobuf/Descriptors$FileDescriptor$InternalDescriptorAssigner;)V

The change was made to com.google.protobuf.Descriptors.java in r189:
http://code.google.com/p/protobuf/source/diff?spec=svn189&r=189&format=side&path=/trunk/java/src/main/java/com/google/protobuf/Descriptors.java

The suggested fix is to add the old method back as an overload that delegates 
to the new signature by wrapping the single String into a new 
String[] {descriptorData}, to preserve backwards compatibility with library 
artifacts that were compiled against 2.0.x.
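
For illustration, a minimal sketch of such a compatibility overload (a 
hypothetical shim based on the stack trace above, not the actual library 
source):

  // Hypothetical overload added back to
  // com.google.protobuf.Descriptors.FileDescriptor:
  public static void internalBuildGeneratedFileFrom(
      final String descriptorData,
      final FileDescriptor[] dependencies,
      final InternalDescriptorAssigner descriptorAssigner) {
    // Delegate to the newer array-based signature by wrapping the single
    // descriptor string.
    internalBuildGeneratedFileFrom(
        new String[] {descriptorData}, dependencies, descriptorAssigner);
  }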

Original issue reported on code.google.com by aant...@gmail.com on 29 Jul 2010 at 4:22

GoogleCodeExporter commented 9 years ago
protoc and the runtime library are tightly coupled.  You cannot use an older 
protoc with a newer runtime nor vice versa.  Your suggested fix would only 
solve one small problem of many.  Sorry, I realize this could be inconvenient, 
but that's how the system is designed.

Original comment by kenton@google.com on 29 Jul 2010 at 6:48

GoogleCodeExporter commented 9 years ago
What is the expected compatibility between the library and protoc?  Only exact 
versions are ever compatible?

Original comment by jacob.to...@gmail.com on 29 Jul 2010 at 6:54

GoogleCodeExporter commented 9 years ago
Yes, only exact versions.

I'm re-opening this bug with a different purpose:  The Java generated code 
should detect incompatible runtime library versions and throw an exception.  
The C++ code already does this.

Original comment by kenton@google.com on 29 Jul 2010 at 7:02

GoogleCodeExporter commented 9 years ago
Kenton, one of the reasons we've created the "shared" pre-compiled model 
artifact is to avoid re-generating the source files and recompiling them every 
time we run a build, which many projects do. Is it possible to have protoc be 
smart about the files it re-generates and, just like javac, only "compile" the 
ones that have changed, not the ones that have already been generated before?

Also, since you've indicated that versions of protobuf-java are not really 
intended to be backwards compatible with each other, would it be better to 
change the versioning scheme to be just sequential, instead of following the 
major.minor.patch convention? That way, build systems like Maven and Ivy would 
detect the incompatibility at build time and not treat something like 2.0.3 
and 2.3.0 as build-compatible.

Original comment by aant...@gmail.com on 29 Jul 2010 at 7:41

GoogleCodeExporter commented 9 years ago
> Kenton, one of the reasons we've created the "shared" pre-compiled model
> artifact is to avoid re-generating the source files and recompiling them
> every time we run a build, which many projects do. Is it possible to have
> protoc be smart about the files it re-generates and, just like javac, only
> "compile" the ones that have changed, not the ones that have already been
> generated before?

I don't understand what this has to do with the bug.  Can't you just update 
your pre-compiled protos whenever you update your protobuf version?

But to answer your question:  it's generally the responsibility of the build 
system to detect when changes have occurred and avoid re-running commands.  If 
none of the input .protos have changed, and the protoc binary has not changed, 
then the build system should recognize that there's no reason to run it.

> Also, since you've indicated that versions of protobuf-java are not really
> intended to be backwards compatible with each other, would it be better to
> change the versioning scheme to be just sequential, instead of following the
> major.minor.patch convention? That way, build systems like Maven and Ivy
> would detect the incompatibility at build time and not treat something like
> 2.0.3 and 2.3.0 as build-compatible.

No, our version numbering scheme should not be dependent on some build system's 
conventions.

However, if there is a way to communicate the incompatibility to Maven without 
changing the public-facing version number, I'm happy to do that.  We do 
something like that in C++:  we bump the SONAME with every release, so protobuf 
2.3.0 corresponds to libprotobuf.so.6, whereas 2.2.0 was libprotobuf.so.5, etc.

Original comment by kenton@google.com on 29 Jul 2010 at 8:00

GoogleCodeExporter commented 9 years ago
In Java, the same can be done when publishing the protobuf-java jar.
Right now, if you look into the 'java' folder of the project, the pom.xml 
carries the same version as the overall protobuf release, i.e. protobuf 2.3.0 
produces protobuf-java-2.3.0.jar. Both Maven and Ivy treat it as a minor, 
backwards-compatible update of protobuf-java-2.2.0.jar, etc. Having the jar 
named protobuf-java-6.jar vs protobuf-java-5.jar would do the trick, since 
those would be major-line differences, and thus deemed not 
backwards-compatible.

Original comment by aant...@gmail.com on 29 Jul 2010 at 8:17

GoogleCodeExporter commented 9 years ago
The thing is, the Libtool docs very explicitly say "This SO version number 
should have nothing at all to do with your project version number.", and as 
such no one expects these to be related.  Can the same be said of the version 
number given to Maven?

Original comment by temporal on 4 Aug 2010 at 3:17

GoogleCodeExporter commented 9 years ago
I would dare to say "yes", as the protobuf-java jar doesn't really have much 
to do with the protoc version. Just as in C the *.so file is linked against 
during compilation, in Java you link against classes provided in a jar when 
compiling your program.

The version of protoc should match the definitions in the *.proto files (if 
new syntax is added). The Java code that gets generated does not really 
follow the backwards (or any other) compatibility rules of protoc itself 
(actually, according to Kenton, the Java versions, just like the C++ ones, 
are never backwards or forwards compatible with anything but themselves).

Original comment by aant...@gmail.com on 4 Aug 2010 at 3:29

GoogleCodeExporter commented 9 years ago
It is actually very important that you match protoc versions with 
libprotobuf.jar versions.  However, in the future we'd like to spin off the 
whole Java side into a separate package (runtime library + code generator 
plugin), at which point its version numbers wouldn't necessarily have to match 
protoc's.  Although, we still may want them to match to express which protoc 
feature set is implemented by each Java implementation version.

Original comment by temporal on 4 Aug 2010 at 3:41

GoogleCodeExporter commented 9 years ago
So how is this different from using protoc to generate the C++ code and 
having that be bound to a particular version of the *.so? Doesn't the same 
hold true there: the version of protoc must match the version of 
libprotobuf.so?

Original comment by aant...@gmail.com on 4 Aug 2010 at 3:44

GoogleCodeExporter commented 9 years ago
Right.  My point is that the so version isn't expected to relate to anything 
else, whereas I suspect that the version on the jar file / in the POM generally 
is expected to match the publicized version number.  Is that true or not?  If 
not, then twiddling that seems like a fine solution.  I just don't know enough 
about Maven best practices to know how it is likely to be interpreted.

Original comment by kenton@google.com on 4 Aug 2010 at 8:43

GoogleCodeExporter commented 9 years ago
Sorry for not replying to this sooner, things have been a bit hectic. The 
basic answer is yes: Maven, as well as Ivy, resolves an artifact's version 
based on the jar's naming convention - name-major.minor.patch-extra.extension. 
So if a jar is named protobuf-java-2.3.0.jar, by default conventions it is 
expected to be compatible with protobuf-java-2.2.0.jar, etc. If, however, the 
names are protobuf-java-2.jar and protobuf-java-3.jar, they would not be 
viewed as compatible by default (one would have to explicitly declare them 
major-line compatible).

In the case of protocol buffers, I see a number of artifacts produced by the 
project. There is the version of the protobuf format, which defines what is 
available as far as syntax and options that can go into the .proto files 
(meaning a protoc compiler of version 2.3 should be able to compile proto 
files that were intended for protoc 2.2, etc). And there are the 
language-specific libraries that enable various languages (Java, C++, Python, 
etc) to consume the binary payload and represent it as objects within those 
languages. To me, those appear to be separate artifacts with a separate 
versioning scheme plus an indication of being able to handle a protobuf byte 
stream as defined by a given version of the protobuf encoding specification. 
So just as you separately version the .so libs for C++, the same applies to 
the Java versioning scheme and other languages as well (meaning there is no 
need to match the Java protobuf jar to the version of protoc and the .proto 
syntax spec).

Original comment by aant...@gmail.com on 26 Aug 2010 at 1:40

GoogleCodeExporter commented 9 years ago
Hmm, I'm not very clear about the Maven version compatibility... So suppose 
you have a pom.xml:

<project>
<dependencies>
  <dependency>
    <groupId>com.google.protobuf</groupId>
    <artifactId>protobuf-java</artifactId>
    <version>2.3</version>
  </dependency>
</dependencies>
</project>

Will it also accept protobuf-java-2.2.0.jar? Or would you simply write 
<version>2</version>?

Original comment by liujisi@google.com on 8 Dec 2010 at 5:37

GoogleCodeExporter commented 9 years ago
It would if you specify what Maven calls a range, e.g. [1.0,2.0), which 
basically reads as anything from 1.0 inclusive up to, but excluding, 2.0. If 
you just leave it as 2.3, that is what Maven calls a "soft" requirement on 
2.3 (just a recommendation that helps select the correct version if it 
matches all ranges): it basically means that you would like 2.3 if possible, 
but Maven will do its best to select the nearest matching version (by default 
within the minor range, keeping the same major).
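
For example, the dependency from the pom.xml above could be pinned to a 
single minor line with a hard range (the version numbers here are only 
illustrative):

  <dependency>
    <groupId>com.google.protobuf</groupId>
    <artifactId>protobuf-java</artifactId>
    <!-- Hard requirement: any 2.3.x version, nothing from another minor line. -->
    <version>[2.3.0,2.4.0)</version>
  </dependency>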

Original comment by aant...@gmail.com on 8 Dec 2010 at 5:53

GoogleCodeExporter commented 9 years ago
Urgh, this is problematic.  When people declare a dependency on protocol 
buffers, I want them to be able to use the version number which we advertise 
publicly, not some other version number which exists solely to trick Maven into 
doing what we want.

It's unfortunate that Maven tries to impose a version numbering convention 
based on low-level technical details.  This doesn't work.  In practice, version 
numbers are chosen based on much higher-level product traits.  The only 
reasonable strategy for determining compatibility between versions is for the 
product itself to somehow express compatibility relationships.  For example, 
the product could define a regex which defines the prefix of the version number 
that determines binary compatibility, or it could simply provide a 
compatibility table that enumerates all known versions and which versions they 
are compatible with.

In any case, I think that it would be too confusing for the Maven version 
numbers to differ from the advertised version numbers.  Therefore I think what 
we need to do is detect version skew at runtime.  We can have every generated 
class start with a block like:

  static {
    // If this fails to compile or run, the runtime library
    // version does not match the protoc version.
    com.google.protobuf.Internal.checkVersionIs_2_3();
  }
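
For context, the runtime-library side of such a check could be as simple as 
the following sketch (hypothetical; not the actual com.google.protobuf.Internal 
source):

  public final class Internal {
    // Called from the static initializer of every generated class.  The
    // check is the method's mere existence: generated code built for a
    // different runtime series fails immediately with a NoSuchMethodError
    // naming this method, instead of on some arbitrary internal call later.
    public static void checkVersionIs_2_3() {
      // Intentionally empty; the method only exists in 2.3.x releases.
    }

    private Internal() {}
  }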

Original comment by kenton@google.com on 9 Dec 2010 at 1:30

GoogleCodeExporter commented 9 years ago
[deleted comment]
GoogleCodeExporter commented 9 years ago
I like the proposal that the version number is checked at runtime. I also 
propose that the version number be added to each generated source file.
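
For illustration, a generated file could then begin with something like the 
following (a hypothetical sketch; the file and class names are just the usual 
tutorial examples, and checkVersionIs_2_3() is the method proposed above):

  // Generated by the protocol buffer compiler (protoc 2.3.0).  DO NOT EDIT!
  // source: addressbook.proto
  package com.example.tutorial;

  public final class AddressBookProtos {
    static {
      // Fails fast if the protobuf-java jar on the classpath does not come
      // from the 2.3.x series this file was generated for.
      com.google.protobuf.Internal.checkVersionIs_2_3();
    }
    private AddressBookProtos() {}
  }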

Original comment by gustav%v...@gtempaccount.com on 18 Feb 2011 at 12:22

GoogleCodeExporter commented 9 years ago
Kenton, I'm not sure I entirely agree with you about Maven. It actually 
follows the common and well-accepted major.minor.patch compatibility 
convention. So in this particular case it is actually the protobuf-java 
library that violates the versioning convention, because Java library version 
2.3 is NOT backwards compatible with 2.0.3 (which, looking at the version 
numbers, is the common expectation). The problem is that while the "product" 
version of Protocol Buffers dictates compatibility of the .proto syntax 
format (meaning a .proto file created for the 2.0.3 compiler will happily 
compile with the 2.3 compiler), the Java library, just like its analogous C++ 
library, follows not the definition-compatibility scheme but runtime/compile 
compatibility, which is totally different from 2.0.3 to 2.3, etc.

So it is only natural to expect that the protobuf-java library would be 
major-versioned every time a non-backwards-compatible change is made, just 
like you already do with the C++ one by increasing the SONAME by a full 
version (4->5->6).

Original comment by aant...@gmail.com on 13 Apr 2011 at 9:07

GoogleCodeExporter commented 9 years ago

Original comment by kenton@google.com on 17 Apr 2011 at 7:47

GoogleCodeExporter commented 9 years ago

Original comment by xiaof...@google.com on 4 Feb 2013 at 1:49