
[SPARK-49966][SQL] Codegen Support for JsonToStructs(from_json) #48466


Closed
wants to merge 2 commits into from

Conversation

panbingkun
Contributor

@panbingkun panbingkun commented Oct 15, 2024

What changes were proposed in this pull request?

This PR adds codegen support for JsonToStructs (from_json).

Why are the changes needed?

  • Improve codegen coverage.

Does this PR introduce any user-facing change?

No.

How was this patch tested?

Passes GA and existing UTs (e.g. JsonFunctionsSuite#*from_json*).

Was this patch authored or co-authored using generative AI tooling?

No.

@github-actions github-actions bot added the SQL label Oct 15, 2024
@panbingkun panbingkun marked this pull request as ready for review October 15, 2024 05:45
@panbingkun
Contributor Author

cc @MaxGekk @cloud-fan

code"""
|${eval.code}
|$resultType $resultTerm = ($resultType) $refEvaluator.evaluate(
| ${eval.isNull} ? null : ${eval.value});
Member


Why do you need this check? It seems like evaluate() already does this.

Contributor Author


Yes, it's redundant. I have already removed it.

Contributor Author

@panbingkun panbingkun Oct 16, 2024


The code generated for an example query looks roughly like this:

  • Before
/* 031 */       boolean localtablescan_isNull_0 = localtablescan_row_0.isNullAt(0);
/* 032 */       UTF8String localtablescan_value_0 = localtablescan_isNull_0 ?
/* 033 */       null : (localtablescan_row_0.getUTF8String(0));
/* 034 */       InternalRow project_result_0 = (InternalRow) ((org.apache.spark.sql.catalyst.expressions.json.JsonToStructsEvaluator) references[1] /* evaluator */).evaluate(
/* 035 */         localtablescan_isNull_0 ? null : localtablescan_value_0);
/* 036 */       boolean project_isNull_0 = project_result_0 == null;
/* 037 */       InternalRow project_value_0 = null;
/* 038 */       if (!project_isNull_0) {
/* 039 */         project_value_0 = project_result_0;
/* 040 */       }
/* 041 */       project_mutableStateArray_0[0].reset();
/* 042 */
/* 043 */       project_mutableStateArray_0[0].zeroOutNullBytes();
  • After
/* 031 */       boolean localtablescan_isNull_0 = localtablescan_row_0.isNullAt(0);
/* 032 */       UTF8String localtablescan_value_0 = localtablescan_isNull_0 ?
/* 033 */       null : (localtablescan_row_0.getUTF8String(0));
/* 034 */       InternalRow project_result_0 = (InternalRow) ((org.apache.spark.sql.catalyst.expressions.json.JsonToStructsEvaluator) references[1] /* evaluator */).evaluate(localtablescan_value_0);
/* 035 */       boolean project_isNull_0 = project_result_0 == null;
/* 036 */       InternalRow project_value_0 = null;
/* 037 */       if (!project_isNull_0) {
/* 038 */         project_value_0 = project_result_0;
/* 039 */       }
/* 040 */       project_mutableStateArray_0[0].reset();
/* 041 */
/* 042 */       project_mutableStateArray_0[0].zeroOutNullBytes();
  • The extra null check is obviously unnecessary, so it has been removed.
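The redundancy can be demonstrated with a small standalone Java sketch (illustrative names only, not Spark code): whenever isNull is true, the value variable already holds null, so the guard `isNull ? null : value` can never change the argument passed to a null-safe evaluate().

```java
public class NullGuardDemo {
    // Stand-in for a null-safe evaluator like JsonToStructsEvaluator.evaluate.
    static String evaluate(String json) {
        return json == null ? null : "parsed:" + json;
    }

    public static void main(String[] args) {
        String[] inputs = {null, "{\"a\": 1}"};
        for (String value : inputs) {
            boolean isNull = (value == null);
            // "Before": the generated code guarded the argument.
            String guarded = evaluate(isNull ? null : value);
            // "After": the guard is dropped; value is already null whenever isNull is true.
            String direct = evaluate(value);
            System.out.println(java.util.Objects.equals(guarded, direct)); // prints true twice
        }
    }
}
```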

@panbingkun panbingkun requested a review from MaxGekk October 16, 2024 01:25
@MaxGekk
Copy link
Member

MaxGekk commented Oct 16, 2024

+1, LGTM. Merging to master.
Thank you, @panbingkun.

@MaxGekk MaxGekk closed this in 2a13011 Oct 16, 2024

override def nullSafeEval(json: Any): Any = evaluator.evaluate(json.asInstanceOf[UTF8String])

override def doGenCode(ctx: CodegenContext, ev: ExprCode): ExprCode = {
Contributor


is it possible to use Invoke with Literal(new JsonToStructsEvaluator(...), ObjectType(...)) to rewrite the expression?

Contributor Author


Let me investigate it.
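For context, the suggested rewrite would look roughly like this (a hedged Scala sketch against Spark-internal APIs, not standalone-runnable; the constructor arguments for the evaluator are illustrative):

```
// Hedged sketch, assuming Spark's internal
// org.apache.spark.sql.catalyst.expressions.objects.Invoke:
// wrap the evaluator in a Literal of ObjectType and invoke its method.
val invoke = Invoke(
  Literal.create(evaluator, ObjectType(classOf[JsonToStructsEvaluator])),
  "evaluate",
  dataType,
  Seq(child))
```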

MaxGekk pushed a commit that referenced this pull request Oct 18, 2024
…on`)

### What changes were proposed in this pull request?
This PR uses `Invoke` to implement `JsonToStructs` (`from_json`).

### Why are the changes needed?
Based on cloud-fan's suggestion, I believe that implementing `JsonToStructs`(`from_json`) with `Invoke` can greatly simplify the code.
#48466 (comment)

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
Updated existing UTs.

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes #48509 from panbingkun/SPARK-49966_FOLLOWUP.

Authored-by: panbingkun <[email protected]>
Signed-off-by: Max Gekk <[email protected]>
himadripal pushed a commit to himadripal/spark that referenced this pull request Oct 19, 2024
### What changes were proposed in this pull request?
This PR adds `Codegen` support for `JsonToStructs` (`from_json`).

### Why are the changes needed?
- Improve codegen coverage.

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
Passes GA and existing UTs (e.g. JsonFunctionsSuite#`*from_json*`).

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes apache#48466 from panbingkun/SPARK-49966.

Authored-by: panbingkun <[email protected]>
Signed-off-by: Max Gekk <[email protected]>
himadripal pushed a commit to himadripal/spark that referenced this pull request Oct 19, 2024
…on`)

### What changes were proposed in this pull request?
This PR uses `Invoke` to implement `JsonToStructs` (`from_json`).

### Why are the changes needed?
Based on cloud-fan's suggestion, I believe that implementing `JsonToStructs`(`from_json`) with `Invoke` can greatly simplify the code.
apache#48466 (comment)

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
Updated existing UTs.

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes apache#48509 from panbingkun/SPARK-49966_FOLLOWUP.

Authored-by: panbingkun <[email protected]>
Signed-off-by: Max Gekk <[email protected]>
@LuciferYang
Contributor

I found that this change makes SubExprEliminationBenchmark fail to run.

git reset --hard f3b2535d8d92c2210501f15c5845dd589414ffe3   // before this one 
build/sbt clean "sql/Test/runMain org.apache.spark.sql.execution.SubExprEliminationBenchmark"

SubExprEliminationBenchmark can be executed successfully.

git reset --hard 2a1301133138ba0d5e2d969fc6428153903ffff1 // this one
build/sbt clean "sql/Test/runMain org.apache.spark.sql.execution.SubExprEliminationBenchmark"

then

[info] Running benchmark: from_json as subExpr in Filter
[info]   Running case: subExprElimination false, codegen: true
[info] 00:40:56.209 ERROR org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator: Failed to compile the generated Java code.
[info] org.codehaus.commons.compiler.InternalCompilerException: Compiling "GeneratedClass" in File 'generated.java', Line 1, Column 1: File 'generated.java', Line 24, Column 16: Compiling "processNext()"
[info] 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:402)
[info] 	at org.codehaus.janino.UnitCompiler.access$000(UnitCompiler.java:236)
[info] 	at org.codehaus.janino.UnitCompiler$2.visitCompilationUnit(UnitCompiler.java:363)
[info] 	at org.codehaus.janino.UnitCompiler$2.visitCompilationUnit(UnitCompiler.java:361)
[info] 	at org.codehaus.janino.Java$CompilationUnit.accept(Java.java:371)
[info] 	at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:361)
[info] 	at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:264)
[info] 00:40:56.222 ERROR org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator: 
[info] /* 001 */ public Object generate(Object[] references) {
[info] /* 002 */   return new GeneratedIteratorForCodegenStage1(references);
[info] /* 003 */ }
[info] /* 004 */
[info] /* 005 */ // codegenStageId=1
[info] /* 006 */ final class GeneratedIteratorForCodegenStage1 extends org.apache.spark.sql.execution.BufferedRowIterator {
[info] /* 007 */   private Object[] references;
[info] /* 008 */   private scala.collection.Iterator[] inputs;
[info] /* 009 */   private scala.collection.Iterator inputadapter_input_0;
[info] /* 010 */   private org.apache.spark.sql.catalyst.e...
[error] Exception in thread "main" org.codehaus.commons.compiler.InternalCompilerException: Failed to compile: org.codehaus.commons.compiler.InternalCompilerException: Compiling "GeneratedClass" in File 'generated.java', Line 1, Column 1: File 'generated.java', Line 24, Column 16: Compiling "processNext()"
[error] 	at org.apache.spark.sql.errors.QueryExecutionErrors$.internalCompilerError(QueryExecutionErrors.scala:649)
[error] 	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.doCompile(CodeGenerator.scala:1552)
[error] 	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.$anonfun$cache$1(CodeGenerator.scala:1638)
[error] 	at org.apache.spark.util.NonFateSharingCache$$anon$1.load(NonFateSharingCache.scala:68)
[error] 	at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3576)
[error] 	at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2318)
[error] 	at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2191)
[error] 	at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2081)
[error] 	at com.google.common.cache.LocalCache.get(LocalCache.java:4019)
[error] 	at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4042)
[error] 	at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5024)
[error] 	at org.apache.spark.util.NonFateSharingLoadingCache.$anonfun$get$2(NonFateSharingCache.scala:94)
[error] 	at org.apache.spark.util.KeyLock.withLock(KeyLock.scala:64)
[error] 	at org.apache.spark.util.NonFateSharingLoadingCache.get(NonFateSharingCache.scala:94)
[error] 	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1489)
[error] 	at org.apache.spark.sql.execution.WholeStageCodegenExec.liftedTree1$1(WholeStageCodegenExec.scala:732)
[error] 	at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:731)
[error] 	at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeRDD$1(SparkPlan.scala:187)
[error] 	at scala.util.Try$.apply(Try.scala:217)
[error] 	at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1375)
[error] 	at org.apache.spark.util.Utils$.getTryWithCallerStacktrace(Utils.scala:1429)
[error] 	at org.apache.spark.util.LazyTry.get(LazyTry.scala:58)
[error] 	at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:108)
[error] 	at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:169)
[error] 	at org.apache.spark.sql.internal.DataFrameWriterImpl.runCommand(DataFrameWriterImpl.scala:614)
[error] 	at org.apache.spark.sql.internal.DataFrameWriterImpl.saveInternal(DataFrameWriterImpl.scala:196)
[error] 	at org.apache.spark.sql.internal.DataFrameWriterImpl.save(DataFrameWriterImpl.scala:125)
[error] 	at org.apache.spark.sql.execution.SubExprEliminationBenchmark$.$anonfun$withFilter$6(SubExprEliminationBenchmark.scala:112)
[error] 	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
[error] 	at org.apache.spark.sql.catalyst.SQLConfHelper.withSQLConf(SQLConfHelper.scala:56)
[error] 	at org.apache.spark.sql.catalyst.SQLConfHelper.withSQLConf$(SQLConfHelper.scala:38)
[error] 	at org.apache.spark.sql.execution.SubExprEliminationBenchmark$.withSQLConf(SubExprEliminationBenchmark.scala:38)
[error] 	at org.apache.spark.sql.execution.SubExprEliminationBenchmark$.$anonfun$withFilter$5(SubExprEliminationBenchmark.scala:108)
[error] 	at org.apache.spark.benchmark.Benchmark.$anonfun$addCase$1(Benchmark.scala:77)
[error] 	at org.apache.spark.benchmark.Benchmark.$anonfun$addCase$1$adapted(Benchmark.scala:75)
[error] 	at org.apache.spark.benchmark.Benchmark.measure(Benchmark.scala:144)
[error] 	at org.apache.spark.benchmark.Benchmark.$anonfun$run$1(Benchmark.scala:108)
[error] 	at scala.collection.StrictOptimizedIterableOps.map(StrictOptimizedIterableOps.scala:100)
[error] 	at scala.collection.StrictOptimizedIterableOps.map$(StrictOptimizedIterableOps.scala:87)
[error] 	at scala.collection.mutable.ArrayBuffer.map(ArrayBuffer.scala:42)
[error] 	at org.apache.spark.benchmark.Benchmark.run(Benchmark.scala:106)
[error] 	at org.apache.spark.sql.execution.SubExprEliminationBenchmark$.$anonfun$withFilter$1(SubExprEliminationBenchmark.scala:117)
[error] 	at org.apache.spark.sql.execution.SubExprEliminationBenchmark$.$anonfun$withFilter$1$adapted(SubExprEliminationBenchmark.scala:83)
[error] 	at org.apache.spark.sql.catalyst.plans.SQLHelper.withTempPath(SQLHelper.scala:41)
[error] 	at org.apache.spark.sql.catalyst.plans.SQLHelper.withTempPath$(SQLHelper.scala:38)
[error] 	at org.apache.spark.sql.execution.SubExprEliminationBenchmark$.withTempPath(SubExprEliminationBenchmark.scala:38)
[error] 	at org.apache.spark.sql.execution.SubExprEliminationBenchmark$.withFilter(SubExprEliminationBenchmark.scala:83)
[error] 	at org.apache.spark.sql.execution.SubExprEliminationBenchmark$.$anonfun$runBenchmarkSuite$1(SubExprEliminationBenchmark.scala:125)
[error] 	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
[error] 	at org.apache.spark.benchmark.BenchmarkBase.runBenchmark(BenchmarkBase.scala:42)
[error] 	at org.apache.spark.sql.execution.SubExprEliminationBenchmark$.runBenchmarkSuite(SubExprEliminationBenchmark.scala:123)
[error] 	at org.apache.spark.benchmark.BenchmarkBase.main(BenchmarkBase.scala:72)
[error] 	at org.apache.spark.sql.execution.SubExprEliminationBenchmark.main(SubExprEliminationBenchmark.scala)
[error] 	Suppressed: java.lang.Exception: Stacktrace under doTryWithCallerStacktrace
[error] 		at org.apache.spark.sql.errors.QueryExecutionErrors$.internalCompilerError(QueryExecutionErrors.scala:649)
[error] 		at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.doCompile(CodeGenerator.scala:1552)
[error] 		at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.$anonfun$cache$1(CodeGenerator.scala:1638)
[error] 		at org.apache.spark.util.NonFateSharingCache$$anon$1.load(NonFateSharingCache.scala:68)
[error] 		at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3576)
[error] 		at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2318)
[error] 		at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2191)
[error] 		at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2081)
[error] 		at com.google.common.cache.LocalCache.get(LocalCache.java:4019)
[error] 		at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4042)
[error] 		at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5024)
[error] 		at org.apache.spark.util.NonFateSharingLoadingCache.$anonfun$get$2(NonFateSharingCache.scala:94)
[error] 		at org.apache.spark.util.KeyLock.withLock(KeyLock.scala:64)
[error] 		at org.apache.spark.util.NonFateSharingLoadingCache.get(NonFateSharingCache.scala:94)
[error] 		at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1489)
[error] 		at org.apache.spark.sql.execution.WholeStageCodegenExec.liftedTree1$1(WholeStageCodegenExec.scala:732)
[error] 		at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:731)
[error] 		at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeRDD$1(SparkPlan.scala:187)
[error] 		at scala.util.Try$.apply(Try.scala:217)
[error] 		at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1375)
[error] 	Suppressed: java.lang.Exception: Full stacktrace of original doTryWithCallerStacktrace caller
[error] 		at org.apache.spark.sql.errors.QueryExecutionErrors$.internalCompilerError(QueryExecutionErrors.scala:649)
[error] 		at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.doCompile(CodeGenerator.scala:1552)
[error] 		at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.$anonfun$cache$1(CodeGenerator.scala:1638)
[error] 		at org.apache.spark.util.NonFateSharingCache$$anon$1.load(NonFateSharingCache.scala:68)
[error] 		at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3576)
[error] 		at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2318)
[error] 		at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2191)
[error] 		at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2081)
[error] 		at com.google.common.cache.LocalCache.get(LocalCache.java:4019)
[error] 		at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4042)
[error] 		at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5024)
[error] 		at org.apache.spark.util.NonFateSharingLoadingCache.$anonfun$get$2(NonFateSharingCache.scala:94)
[error] 		at org.apache.spark.util.KeyLock.withLock(KeyLock.scala:64)
[error] 		at org.apache.spark.util.NonFateSharingLoadingCache.get(NonFateSharingCache.scala:94)
[error] 		at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1489)
[error] 		at org.apache.spark.sql.execution.WholeStageCodegenExec.liftedTree1$1(WholeStageCodegenExec.scala:732)
[error] 		at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:731)
[error] 		at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeRDD$1(SparkPlan.scala:187)
[error] 		at scala.util.Try$.apply(Try.scala:217)
[error] 		at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1375)
[error] 		at org.apache.spark.util.LazyTry.tryT$lzycompute(LazyTry.scala:46)
[error] 		at org.apache.spark.util.LazyTry.tryT(LazyTry.scala:46)
[error] 		at org.apache.spark.util.LazyTry.get(LazyTry.scala:58)
[error] 		at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:200)
[error] 		at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:259)
[error] 		at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
[error] 		at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:256)
[error] 		at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:196)
[error] 		at org.apache.spark.sql.execution.datasources.v2.V2TableWriteExec.writeWithV2(WriteToDataSourceV2Exec.scala:367)
[error] 		at org.apache.spark.sql.execution.datasources.v2.V2TableWriteExec.writeWithV2$(WriteToDataSourceV2Exec.scala:365)
[error] 		at org.apache.spark.sql.execution.datasources.v2.OverwriteByExpressionExec.writeWithV2(WriteToDataSourceV2Exec.scala:249)
[error] 		at org.apache.spark.sql.execution.datasources.v2.V2ExistingTableWriteExec.run(WriteToDataSourceV2Exec.scala:343)
[error] 		at org.apache.spark.sql.execution.datasources.v2.V2ExistingTableWriteExec.run$(WriteToDataSourceV2Exec.scala:342)
[error] 		at org.apache.spark.sql.execution.datasources.v2.OverwriteByExpressionExec.run(WriteToDataSourceV2Exec.scala:249)
[error] 		at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
[error] 		at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:43)
[error] 		at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:49)
[error] 		at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$2(QueryExecution.scala:132)
[error] 		at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$7(SQLExecution.scala:160)
[error] 		at org.apache.spark.sql.execution.SQLExecution$.withSessionTagsApplied(SQLExecution.scala:264)
[error] 		at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$6(SQLExecution.scala:123)
[error] 		at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:287)
[error] 		at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:123)
[error] 		at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:748)
[error] 		at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:77)
[error] 		at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:229)
[error] 		at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:132)
[error] 		at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:629)
[error] 		at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$eagerlyExecute$1(QueryExecution.scala:131)
[error] 		at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$3.applyOrElse(QueryExecution.scala:146)
[error] 		at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$3.applyOrElse(QueryExecution.scala:141)
[error] 		at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:470)
[error] 		at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:86)
[error] 		at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:470)
[error] 		at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:37)
[error] 		at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:330)
[error] 		at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:326)
[error] 		at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:37)
[error] 		at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:37)
[error] 		at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:446)
[error] 		at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:141)
[error] 		at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyCommandExecuted$1(QueryExecution.scala:103)
[error] 		at scala.util.Try$.apply(Try.scala:217)
[error] 		at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1375)
[error] 		at org.apache.spark.util.LazyTry.tryT$lzycompute(LazyTry.scala:46)
[error] 		at org.apache.spark.util.LazyTry.tryT(LazyTry.scala:46)
[error] 		... 32 more
...
[error] Caused by: org.codehaus.commons.compiler.InternalCompilerException: Compiling "GeneratedClass" in File 'generated.java', Line 1, Column 1: File 'generated.java', Line 24, Column 16: Compiling "processNext()"
[error] 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:402)
[error] 	at org.codehaus.janino.UnitCompiler.access$000(UnitCompiler.java:236)
[error] 	at org.codehaus.janino.UnitCompiler$2.visitCompilationUnit(UnitCompiler.java:363)
[error] 	at org.codehaus.janino.UnitCompiler$2.visitCompilationUnit(UnitCompiler.java:361)
[error] 	at org.codehaus.janino.Java$CompilationUnit.accept(Java.java:371)
[error] 	at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:361)
[error] 	at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:264)
[error] 	at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:294)
[error] 	at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:288)
[error] 	at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:267)
[error] 	at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:82)
[error] 	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.doCompile(CodeGenerator.scala:1546)
[error] 	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.$anonfun$cache$1(CodeGenerator.scala:1638)
[error] 	at org.apache.spark.util.NonFateSharingCache$$anon$1.load(NonFateSharingCache.scala:68)
[error] 	at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3576)
[error] 	at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2318)
[error] 	at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2191)
[error] 	at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2081)
[error] 	at com.google.common.cache.LocalCache.get(LocalCache.java:4019)
[error] 	at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4042)
[error] 	at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5024)
[error] 	at org.apache.spark.util.NonFateSharingLoadingCache.$anonfun$get$2(NonFateSharingCache.scala:94)
[error] 	at org.apache.spark.util.KeyLock.withLock(KeyLock.scala:64)
[error] 	at org.apache.spark.util.NonFateSharingLoadingCache.get(NonFateSharingCache.scala:94)
[error] 	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1489)
[error] 	at org.apache.spark.sql.execution.WholeStageCodegenExec.liftedTree1$1(WholeStageCodegenExec.scala:732)
[error] 	at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:731)
[error] 	at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeRDD$1(SparkPlan.scala:187)
[error] 	at scala.util.Try$.apply(Try.scala:217)
[error] 	at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1375)
[error] 	at org.apache.spark.util.LazyTry.tryT$lzycompute(LazyTry.scala:46)
[error] 	at org.apache.spark.util.LazyTry.tryT(LazyTry.scala:46)
[error] 	at org.apache.spark.util.LazyTry.get(LazyTry.scala:58)
[error] 	at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:200)
[error] 	at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:259)
[error] 	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
[error] 	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:256)
[error] 	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:196)
[error] 	at org.apache.spark.sql.execution.datasources.v2.V2TableWriteExec.writeWithV2(WriteToDataSourceV2Exec.scala:367)
[error] 	at org.apache.spark.sql.execution.datasources.v2.V2TableWriteExec.writeWithV2$(WriteToDataSourceV2Exec.scala:365)
[error] 	at org.apache.spark.sql.execution.datasources.v2.OverwriteByExpressionExec.writeWithV2(WriteToDataSourceV2Exec.scala:249)
[error] 	at org.apache.spark.sql.execution.datasources.v2.V2ExistingTableWriteExec.run(WriteToDataSourceV2Exec.scala:343)
[error] 	at org.apache.spark.sql.execution.datasources.v2.V2ExistingTableWriteExec.run$(WriteToDataSourceV2Exec.scala:342)
[error] 	at org.apache.spark.sql.execution.datasources.v2.OverwriteByExpressionExec.run(WriteToDataSourceV2Exec.scala:249)
[error] 	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
[error] 	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:43)
[error] 	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:49)
[error] 	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$2(QueryExecution.scala:132)
[error] 	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$7(SQLExecution.scala:160)
[error] 	at org.apache.spark.sql.execution.SQLExecution$.withSessionTagsApplied(SQLExecution.scala:264)
[error] 	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$6(SQLExecution.scala:123)
[error] 	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:287)
[error] 	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:123)
[error] 	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:748)
[error] 	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:77)
[error] 	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:229)
[error] 	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:132)
[error] 	at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:629)
[error] 	at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$eagerlyExecute$1(QueryExecution.scala:131)
[error] 	at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$3.applyOrElse(QueryExecution.scala:146)
[error] 	at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$3.applyOrElse(QueryExecution.scala:141)
[error] 	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:470)
[error] 	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:86)
[error] 	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:470)
[error] 	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:37)
[error] 	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:330)
[error] 	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:326)
[error] 	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:37)
[error] 	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:37)
[error] 	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:446)
[error] 	at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:141)
[error] 	at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyCommandExecuted$1(QueryExecution.scala:103)
[error] 	at scala.util.Try$.apply(Try.scala:217)
[error] 	at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1375)
[error] 	at org.apache.spark.util.LazyTry.tryT$lzycompute(LazyTry.scala:46)
[error] 	at org.apache.spark.util.LazyTry.tryT(LazyTry.scala:46)
[error] 	... 32 more
[error] Caused by: org.codehaus.commons.compiler.InternalCompilerException: File 'generated.java', Line 24, Column 16: Compiling "processNext()"
[error] 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:3333)
[error] 	at org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:1447)
[error] 	at org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:1420)
[error] 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:829)
[error] 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1026)
[error] 	at org.codehaus.janino.UnitCompiler.access$700(UnitCompiler.java:236)
[error] 	at org.codehaus.janino.UnitCompiler$3.visitMemberClassDeclaration(UnitCompiler.java:425)
[error] 	at org.codehaus.janino.UnitCompiler$3.visitMemberClassDeclaration(UnitCompiler.java:418)
[error] 	at org.codehaus.janino.Java$MemberClassDeclaration.accept(Java.java:1533)
[error] 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:418)
[error] 	at org.codehaus.janino.UnitCompiler.compileDeclaredMemberTypes(UnitCompiler.java:1397)
[error] 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:864)
[error] 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:442)
[error] 	at org.codehaus.janino.UnitCompiler.access$400(UnitCompiler.java:236)
[error] 	at org.codehaus.janino.UnitCompiler$3.visitPackageMemberClassDeclaration(UnitCompiler.java:422)
[error] 	at org.codehaus.janino.UnitCompiler$3.visitPackageMemberClassDeclaration(UnitCompiler.java:418)
[error] 	at org.codehaus.janino.Java$PackageMemberClassDeclaration.accept(Java.java:1688)
[error] 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:418)
[error] 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:392)
[error] 	... 107 more
[error] Caused by: java.lang.RuntimeException: File 'generated.java', Line 25, Column 1
[error] 	at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1663)
[error] 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:3658)
[error] 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:3329)
[error] 	... 125 more
[error] Caused by: java.lang.RuntimeException: File 'generated.java', Line 25, Column 1
[error] 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1604)
[error] 	at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1661)
[error] 	... 127 more
[error] Caused by: java.lang.RuntimeException: File 'generated.java', Line 25, Column 41
[error] 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1604)
[error] 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2001)
[error] 	at org.codehaus.janino.UnitCompiler.access$2200(UnitCompiler.java:236)
[error] 	at org.codehaus.janino.UnitCompiler$6.visitWhileStatement(UnitCompiler.java:1584)
[error] 	at org.codehaus.janino.UnitCompiler$6.visitWhileStatement(UnitCompiler.java:1575)
[error] 	at org.codehaus.janino.Java$WhileStatement.accept(Java.java:3389)
[error] 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1575)
[error] 	... 128 more
[error] Caused by: java.lang.RuntimeException: File 'generated.java', Line 28, Column 1
[error] 	at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1663)
[error] 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1646)
[error] 	at org.codehaus.janino.UnitCompiler.access$1700(UnitCompiler.java:236)
[error] 	at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1579)
[error] 	at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1575)
[error] 	at org.codehaus.janino.Java$Block.accept(Java.java:3115)
[error] 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1575)
[error] 	... 134 more
[error] Caused by: java.lang.RuntimeException: File 'generated.java', Line 28, Column 1
[error] 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1604)
[error] 	at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1661)
[error] 	... 140 more
[error] Caused by: java.lang.RuntimeException: File 'generated.java', Line 28, Column 4
[error] 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1604)
[error] 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1691)
[error] 	at org.codehaus.janino.UnitCompiler.access$2600(UnitCompiler.java:236)
[error] 	at org.codehaus.janino.UnitCompiler$6.visitDoStatement(UnitCompiler.java:1588)
[error] 	at org.codehaus.janino.UnitCompiler$6.visitDoStatement(UnitCompiler.java:1575)
[error] 	at org.codehaus.janino.Java$DoStatement.accept(Java.java:3794)
[error] 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1575)
[error] 	... 141 more
[error] Caused by: java.lang.RuntimeException: File 'generated.java', Line 11938, Column 1
[error] 	at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1663)
[error] 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1646)
[error] 	at org.codehaus.janino.UnitCompiler.access$1700(UnitCompiler.java:236)
[error] 	at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1579)
[error] 	at org.codehaus.janino.UnitCompiler$6.visitBlock(UnitCompiler.java:1575)
[error] 	at org.codehaus.janino.Java$Block.accept(Java.java:3115)
[error] 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1575)
[error] 	... 147 more
[error] Caused by: java.lang.RuntimeException: File 'generated.java', Line 11938, Column 1
[error] 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1604)
[error] 	at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1661)
[error] 	... 153 more
[error] Caused by: java.lang.RuntimeException: File 'generated.java', Line 11938, Column 9
[error] 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2745)
[error] 	at org.codehaus.janino.UnitCompiler.access$2700(UnitCompiler.java:236)
[error] 	at org.codehaus.janino.UnitCompiler$6.visitLocalVariableDeclarationStatement(UnitCompiler.java:1589)
[error] 	at org.codehaus.janino.UnitCompiler$6.visitLocalVariableDeclarationStatement(UnitCompiler.java:1575)
[error] 	at org.codehaus.janino.Java$LocalVariableDeclarationStatement.accept(Java.java:3842)
[error] 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1575)
[error] 	... 154 more
[error] Caused by: org.codehaus.commons.compiler.InternalCompilerException: Code grows beyond 64 KB
[error] 	at org.codehaus.janino.CodeContext.makeSpace(CodeContext.java:699)
[error] 	at org.codehaus.janino.CodeContext.write(CodeContext.java:558)
[error] 	at org.codehaus.janino.UnitCompiler.write(UnitCompiler.java:13079)
[error] 	at org.codehaus.janino.UnitCompiler.store(UnitCompiler.java:12752)
[error] 	at org.codehaus.janino.UnitCompiler.store(UnitCompiler.java:12730)
[error] 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2742)
[error] 	... 159 more
[error] Nonzero exit code returned from runner: 1
[error] (sql / Test / runMain) Nonzero exit code returned from runner: 1

@panbingkun Do you have time to take a look at this issue? Thanks ~

also cc @MaxGekk and @cloud-fan
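For context on the failure above: the root cause at the bottom of the trace is Janino's `InternalCompilerException: Code grows beyond 64 KB`. The JVM class-file format caps a single method body (`Code` attribute) at 65535 bytes, so whole-stage codegen that inlines one evaluator call per column can overflow a single `processNext()` once the row is wide enough. Below is a minimal sketch of the usual mitigation idea, splitting a long run of generated statements into helper methods that each stay under a size budget. This is an illustration only, not Spark's actual `CodegenContext.splitExpressions` implementation; `MethodSplitter`, the budget value, and the source-length proxy for bytecode size are all assumptions.

```java
import java.util.ArrayList;
import java.util.List;

public class MethodSplitter {
    // Group generated statements into chunks whose combined source length
    // stays under a per-method budget (a crude proxy for the 64 KB bytecode
    // limit; real codegen would measure something closer to bytecode size).
    public static List<List<String>> split(List<String> statements, int budget) {
        List<List<String>> methods = new ArrayList<>();
        List<String> current = new ArrayList<>();
        int size = 0;
        for (String stmt : statements) {
            if (size + stmt.length() > budget && !current.isEmpty()) {
                methods.add(current);          // close out the current helper method
                current = new ArrayList<>();
                size = 0;
            }
            current.add(stmt);
            size += stmt.length();
        }
        if (!current.isEmpty()) {
            methods.add(current);
        }
        return methods;
    }

    public static void main(String[] args) {
        // 330 columns, one generated evaluator call each (hypothetical shape).
        List<String> stmts = new ArrayList<>();
        for (int i = 0; i < 330; i++) {
            stmts.add("Object col" + i + " = evaluator.evaluate(row.get(" + i + "));");
        }
        List<List<String>> methods = MethodSplitter.split(stmts, 4096);
        System.out.println(methods.size());   // number of helper methods produced
    }
}
```

Each chunk would become one `private void processNext_N()` helper in the generated class, keeping every individual method under the JVM limit.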

@LuciferYang
Contributor

LuciferYang commented Jan 7, 2025

def withFilter(rowsNum: Int, numIters: Int): Unit = {
  val benchmark = new Benchmark("from_json as subExpr in Filter", rowsNum, output = output)
  withTempPath { path =>
    prepareDataInfo(benchmark)
    val numCols = 500
    val schema = writeWideRow(path.getAbsolutePath, rowsNum, numCols)
If numCols is reduced to 330, SubExprEliminationBenchmark runs successfully, but that is already at the limit: anything above 330 makes the generated Java code for this case exceed the 64 KB method limit. I am currently running SubExprEliminationBenchmark on GA with val numCols = 330 to check whether its comparison data meets expectations.
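A rough back-of-envelope check consistent with the ~330-column cutoff: if 330 inlined columns fit just under the 65535-byte method limit but 331 do not, each column's generated evaluation contributes on the order of 200 bytes of bytecode. This assumes code size grows linearly with column count and ignores the method's fixed overhead, so it is an estimate, not a measurement.

```java
public class BudgetEstimate {
    public static void main(String[] args) {
        int limit = 65535;       // JVM max bytes per method body (Code attribute)
        int colsThatFit = 330;   // observed cutoff from the benchmark
        // Rough per-column bytecode cost, assuming linear growth and no overhead.
        System.out.println(limit / colsThatFit);
    }
}
```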

@panbingkun
Contributor Author

Thank you for noticing this issue, let me investigate it.

@LuciferYang
Contributor

LuciferYang commented Jan 8, 2025

  • before this PR: commit f3b2535 with numCols=330
OpenJDK 64-Bit Server VM 17.0.13+11-LTS on Linux 6.8.0-1017-azure
AMD EPYC 7763 64-Core Processor
from_json as subExpr in Filter:           Best Time(ms)   Avg Time(ms)   Stdev(ms)    Rate(M/s)   Per Row(ns)   Relative
------------------------------------------------------------------------------------------------------------------------
subExprElimination false, codegen: true            2766           3046         284          0.0    27655915.4       1.0X
subExprElimination false, codegen: false           2884           2940          51          0.0    28837016.0       1.0X
subExprElimination true, codegen: true              886            901          25          0.0     8860269.8       3.1X
subExprElimination true, codegen: false             854            857           3          0.0     8539656.4       3.2X
  • after this PR: commit 2a13011 with numCols=330
OpenJDK 64-Bit Server VM 17.0.13+11-LTS on Linux 6.8.0-1017-azure
AMD EPYC 7763 64-Core Processor
from_json as subExpr in Filter:           Best Time(ms)   Avg Time(ms)   Stdev(ms)    Rate(M/s)   Per Row(ns)   Relative
------------------------------------------------------------------------------------------------------------------------
subExprElimination false, codegen: true            3314           3570         277          0.0    33143936.7       1.0X
subExprElimination false, codegen: false           2823           3027         193          0.0    28234617.4       1.2X
subExprElimination true, codegen: true             3501           3579          68          0.0    35009514.9       0.9X
subExprElimination true, codegen: false             841            865          28          0.0     8409037.4       3.9X
  • master: commit 194aa18 with numCols=330
OpenJDK 64-Bit Server VM 17.0.13+11-LTS on Linux 6.8.0-1017-azure
AMD EPYC 7763 64-Core Processor
from_json as subExpr in Filter:           Best Time(ms)   Avg Time(ms)   Stdev(ms)    Rate(M/s)   Per Row(ns)   Relative
------------------------------------------------------------------------------------------------------------------------
subExprElimination false, codegen: true            3445           3643         270          0.0    34451139.9       1.0X
subExprElimination false, codegen: false           3017           3122          91          0.0    30169801.5       1.1X
subExprElimination true, codegen: true             3205           3290          95          0.0    32052385.0       1.1X
subExprElimination true, codegen: false             768            797          30          0.0     7683521.6       4.5X

There appears to be a performance regression in the withFilter scenario for subExprElimination true, codegen: true: the best time goes from 886 ms before this PR to 3501 ms after it (3501, 3579, 68, 0.0, 35009514.9, 0.9X).
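For readers unfamiliar with the table format: if I read the benchmark output correctly, the Relative column is the first row's best time divided by the given row's best time, rounded to one decimal, so values above 1.0X are faster than the baseline and values below it are slower. A quick check against the numbers in the tables above:

```java
public class RelativeCheck {
    // Relative = baseline best time / this row's best time, rounded to 1 decimal.
    static double relative(double baselineMs, double rowMs) {
        return Math.round(baselineMs / rowMs * 10.0) / 10.0;
    }

    public static void main(String[] args) {
        double baseline = 3314;  // "after" table: subExprElimination false, codegen: true
        System.out.println(relative(baseline, 3501));  // the regression row: 0.9X
        System.out.println(relative(baseline, 841));   // fastest row: 3.9X
    }
}
```

This reproduces the 0.9X that flags the regression, and the 3.1X in the "before" table (2766 / 886).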

@dongjoon-hyun
Member

Thank you for reporting, @LuciferYang .

@panbingkun
Contributor Author

The number of data rows in this benchmark is a bit small (100 rows). Let me first verify the performance data locally with more rows.

@panbingkun
Contributor Author

I think the filter scenario has an issue after the codegen implementation. I will submit a PR (#49411) to revert the codegen support first, and re-submit once the root cause is located.

@cloud-fan

LuciferYang pushed a commit that referenced this pull request Jan 8, 2025
### What changes were proposed in this pull request?

This PR aims to regenerate benchmark results as a new baseline before cutting branch-4.0.

### Why are the changes needed?

To make the result up-to-date and make it easy to validate the regression.
- `SubExprEliminationBenchmark` benchmark result is not included because it's broken currently.
  - #48466 (comment)

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Manual review.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #49409 from dongjoon-hyun/bm_20250107.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: yangjie01 <[email protected]>