jojoldu opened BATCH-2741 and commented

Hi,
I updated to Spring Boot 2.0 and am using Spring Batch through Spring Boot, but one part behaves differently from Spring Boot 1.5 (Spring Batch 3.0). (It may be a bug.)

As far as I know, when a Spring Batch job has a failed execution in its history and the same job is launched again, the job parameters of the failed execution are reused. However, my understanding is that if the same key is passed with a different value, the job should run with the new value instead.

That is not what happens in the latest version. If there is a failed execution, the new value for the same key is ignored and the job parameters of the failed execution are always used. So with the current version of Spring Batch, you have to delete the job metadata tables whenever a job has failed; otherwise, no matter how you change the job parameters, only the parameters of the old failed execution are used.
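To make it concrete, here is a minimal sketch of what I mean (the job, the `requestDate` parameter key, and the dates are made-up examples; the `getNextJobParameters` call mirrors, as far as I understand it, how Spring Boot 2.0 builds the parameters before launching):

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.launch.JobLauncher;

public class RelaunchExample {

    // job, jobLauncher and jobExplorer are assumed to be injected Spring beans
    public void relaunchAfterFailure(Job job, JobLauncher jobLauncher, JobExplorer jobExplorer) throws Exception {
        // The previous execution of this job failed with requestDate=2018-05-01.
        // Launch it again, passing a new value for the same identifying key.
        JobParameters parameters = new JobParametersBuilder(jobExplorer)
                .addString("requestDate", "2018-05-02")
                .getNextJobParameters(job) // merges in the parameters of the last (failed) execution
                .toJobParameters();

        // Expected: the job runs with requestDate=2018-05-02.
        // Observed in 4.0.1: requestDate=2018-05-01 from the failed execution is used,
        // so the old parameters win no matter what is passed in.
        jobLauncher.run(job, parameters);
    }
}
```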
I found the problem in the Spring Batch code and opened a PR.
Affects: 4.0.1
Issue Links:
BATCH-2711 Existing failed job is restarted, even if new job contains different job parameters ("duplicates")

Referenced from: pull request #625
Thank you for reporting this issue and for opening a PR! Indeed, new (identifying) job parameters are overridden with those of the previous execution if it has failed.