HADOOP-19219. Resolve Certificate error in Hadoop-auth tests. #6939
Conversation
💔 -1 overall
This message was automatically generated.
Is it only required for tests, but not required at runtime?
@pan3793 I have been trying to compile it locally using Maven, and I hit this issue during compilation.
Of course, those parameters are not required for compilation, but it's dangerous to add them only for tests and not at runtime.
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
  <argLine>--add-exports=java.base/sun.security.x509=ALL-UNNAMED --add-exports=java.base/sun.security.util=ALL-UNNAMED</argLine>
BTW, I would suggest adding those parameters unconditionally, just like Apache Spark does.
@pan3793 Can you please clarify "unconditionally" here? Do you mean "regardless of Java version"? --add-exports and --add-opens do not work with Java 8. I guess Spark does not support building/testing with Java 8.
> Can you please clarify "unconditionally" here? Do you mean "regardless of Java version"?

Correct.

> I guess Spark does not support building/testing with Java 8

Spark 3.3 ~ 3.5 supports Java 8 ~ 17; you can check Spark branch-3.5:
https://github.com/apache/spark/blob/branch-3.5/pom.xml#L300
Thanks. The Spark build makes it work by also setting -XX:+IgnoreUnrecognizedVMOptions:
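Roughly, Spark's POM defines one property that is always passed to test JVMs, with the ignore flag first so a Java 8 VM skips the JPMS options it doesn't understand. A trimmed sketch from memory, not the verbatim file (the real flag list in branch-3.5 is much longer):

<extraJavaTestArgs>
  -XX:+IgnoreUnrecognizedVMOptions
  --add-opens=java.base/java.lang=ALL-UNNAMED
  --add-opens=java.base/java.util=ALL-UNNAMED
</extraJavaTestArgs>

The surefire argLine then references ${extraJavaTestArgs}.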
Yes, sorry, I should have mentioned IgnoreUnrecognizedVMOptions too.
Do you think Hadoop can also follow that? It would simplify the shell scripts and pom.xml.
I'm happy with -XX:+IgnoreUnrecognizedVMOptions in our build/test scripts.
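To make that concrete: without the flag, a Java 8 launcher rejects JPMS options outright; with it, they are ignored. A hypothetical demo (paths and exact messages depend on the local JDK installs):

$ /usr/lib/jvm/java-8-openjdk-amd64/bin/java \
    --add-exports=java.base/sun.security.x509=ALL-UNNAMED -version
Unrecognized option: --add-exports=java.base/sun.security.x509=ALL-UNNAMED
Error: Could not create the Java Virtual Machine.

$ /usr/lib/jvm/java-8-openjdk-amd64/bin/java -XX:+IgnoreUnrecognizedVMOptions \
    --add-exports=java.base/sun.security.x509=ALL-UNNAMED -version
openjdk version "1.8.0_..."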
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <source>${maven.compiler.source}</source>
    <target>${maven.compiler.target}</target>
  </configuration>
</plugin>
Setting the properties maven.compiler.source and maven.compiler.target has the same effect as creating a configuration for the plugin with source and target set. We only need one of these changes. I'd keep the properties.
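That is, the plugin block can be dropped and only the properties kept, since the compiler plugin reads these user properties by default. A minimal sketch:

<properties>
  <maven.compiler.source>1.8</maven.compiler.source>
  <maven.compiler.target>1.8</maven.compiler.target>
</properties>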
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>3.0.0</version>
Nit: the version is not needed; it is defined in the root POM.
<executions>
  <execution>
    <id>enforce-java-version</id>
    <goals>
      <goal>enforce</goal>
    </goals>
    <configuration>
      <rules>
        <requireJavaVersion>
          <version>1.8</version>
        </requireJavaVersion>
      </rules>
    </configuration>
  </execution>
</executions>
Why require Java version 1.8 in a profile activated by <jdk>1.8</jdk>?
<maven.compiler.source>${javac.version}</maven.compiler.source>
<maven.compiler.target>${javac.version}</maven.compiler.target>
Looks like these properties are set regardless of Java version, so they could be moved out of the profiles. However, we can use the profiles to set maven.compiler.release for Java 9+ instead of the other two properties.

Java 9 introduced the new javac option --release, which helps ensure compatibility with the given Java version (blog with great explanation). The goal of this change is to set this option when compiling with JDK 9+, instead of --source and --target.

--release expects the version without the old-style 1. prefix (i.e. 8 instead of 1.8). maven-enforcer-plugin added support for the same in version 3.2.1, so we also need to bump the plugin.
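A minimal sketch of the suggested shape (the profile id and version values are illustrative, not the exact patch):

<profile>
  <id>java-9-plus</id>
  <activation>
    <jdk>[9,)</jdk>
  </activation>
  <properties>
    <maven.compiler.release>8</maven.compiler.release>
  </properties>
</profile>

With maven.compiler.release set, maven-compiler-plugin passes --release to javac instead of -source/-target.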
Do you want to leverage --release to allow Hadoop to build against Java 17 but still run with Java 8 and above?
source and target are set for the same reason: to be able to build with a newer Java but still create binaries for Java 8. release is similar, but better.
I'm not sure it really works. I have experienced a lot of the following errors when running, on JDK 8, a jar built with a newer JDK with release 8:

java.lang.NoSuchMethodError: java.nio.ByteBuffer.limit(I)Ljava/nio/ByteBuffer;
> I have experienced a lot of the following errors when running, on JDK 8, a jar built with a newer JDK with release 8

Using -target or --release? Because that's exactly the kind of problem --release solves compared to -target.
It's easy to reproduce the problem and verify the fix:
$ cat > NoSuchMethod.java <<EOF
import java.nio.ByteBuffer;
public class NoSuchMethod {
public static void main(String[] args) {
System.out.println(ByteBuffer.allocate(100).position(20).limit(62).remaining());
}
}
EOF
$ /usr/lib/jvm/java-11-openjdk-amd64/bin/javac -Xlint:none -source 8 -target 8 NoSuchMethod.java \
&& /usr/lib/jvm/java-8-openjdk-amd64/bin/java NoSuchMethod
Exception in thread "main" java.lang.NoSuchMethodError: java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;
at NoSuchMethod.main(NoSuchMethod.java:6)
$ /usr/lib/jvm/java-11-openjdk-amd64/bin/javac --release 8 NoSuchMethod.java \
&& /usr/lib/jvm/java-8-openjdk-amd64/bin/java NoSuchMethod
42
Wow, thanks for letting me know it can be resolved by --release
That ByteBuffer thing is trouble, as even recent Java 8 OpenJDKs have that ByteBuffer override; you need to build on an Oracle JDK right now. This will be an improvement if we can trust it.
<profile>
  <id>java-8</id>
  <activation>
    <jdk>1.8</jdk>
  </activation>
Why are these profiles being added in hadoop-common-project/hadoop-auth/pom.xml instead of the root POM?
- We may need the --add-opens args for other tests.
- More importantly, the compiler settings should be global.
> Why are these profiles being added in hadoop-common-project/hadoop-auth/pom.xml instead of the root POM?
> - We may need the --add-opens args for other tests.
> - More importantly, the compiler settings should be global.
@adoroszlai Sure, I will move it to the hadoop-main POM.
@adoroszlai, @pan3793 Thank you for the review. I have changed the implementation as per your suggestions. Please have a look!
💔 -1 overall
This message was automatically generated.
Runtime JPMS args mentioned here #6939 (comment) should be considered; otherwise the runtime errors will be hidden in testing.
@@ -167,9 +167,11 @@
-->
<enforced.java.version>[${javac.version},)</enforced.java.version>
<enforced.maven.version>[3.3.0,)</enforced.maven.version>

<extraJavaTestArgs>
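The property value is truncated in the hunk above; given the flags discussed in this thread, it presumably carries the ignore flag plus the two exports, along the lines of (a sketch, not the verbatim patch):

<extraJavaTestArgs>
  -XX:+IgnoreUnrecognizedVMOptions
  --add-exports=java.base/sun.security.x509=ALL-UNNAMED
  --add-exports=java.base/sun.security.util=ALL-UNNAMED
</extraJavaTestArgs>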
JDK 8 does not support --add-exports, and the test will fail.
read here
I'm happy with this
💔 -1 overall
This message was automatically generated.
@@ -618,8 +618,7 @@ function hadoop_bootstrap

export HADOOP_OS_TYPE=${HADOOP_OS_TYPE:-$(uname -s)}

# defaults
export HADOOP_OPTS=${HADOOP_OPTS:-"-Djava.net.preferIPv4Stack=true"}
export HADOOP_OPTS="${HADOOP_OPTS:-"-Djava.net.preferIPv4Stack=true"} -XX:+IgnoreUnrecognizedVMOptions --add-exports=java.base/sun.security.x509=ALL-UNNAMED --add-exports=java.base/sun.security.util=ALL-UNNAMED"
I would suggest adding the JPMS opts in function hadoop_finalize_hadoop_opts, or adding a dedicated function hadoop_finalize_jpms_opts.

Also, we should add comments on both sides (here and in pom.xml) to remind future contributors to keep the JPMS opts in sync.
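A rough sketch of the dedicated-function idea for hadoop-functions.sh, assuming the existing hadoop_add_param helper; the function name and placement are illustrative:

## @description  Append JPMS options to HADOOP_OPTS exactly once.
## NOTE: keep this list in sync with extraJavaTestArgs in the root pom.xml.
function hadoop_finalize_jpms_opts
{
  # ignore flag first, matching the Spark approach, so Java 8 skips the rest
  hadoop_add_param HADOOP_OPTS IgnoreUnrecognizedVMOptions "-XX:+IgnoreUnrecognizedVMOptions"
  hadoop_add_param HADOOP_OPTS sun.security.x509 "--add-exports=java.base/sun.security.x509=ALL-UNNAMED"
  hadoop_add_param HADOOP_OPTS sun.security.util "--add-exports=java.base/sun.security.util=ALL-UNNAMED"
}

hadoop_finalize could then call it alongside hadoop_finalize_hadoop_opts.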
I concur. We also need to make sure all forked processes (terasort...) are happy.
Looks like this PR has been inactive for a while; given this is a blocker for JDK 17 support, I opened #7084 to take over.
It was the suggestion from #6939 (comment).
Description of PR
Implemented a JDK 17 profile to suppress sun.security certificate errors encountered during migration from JDK 8, ensuring seamless compilation and compatibility.
How was this patch tested?
Tested locally by compiling the project on JDK 17.
For code changes:
Added a profile for JDK 17.