CI cannot be started #2361

Closed
addaleax opened this issue Jun 20, 2020 · 5 comments

@addaleax
Member

Attempting to start a node-test-pull-request run for any Node.js PR fails silently, i.e. the form can be submitted but no CI job appears to be started as a result.

https://github.com/nodejs/node/labels/needs-ci has a list of PRs for which @ronag or I (so far) have tried to start CI runs but couldn’t.

I’m not sure what to do about this, since it seems like a Jenkins bug? @nodejs/build-infra

@richardlau
Member

In the Jenkins system logs (https://ci.nodejs.org/log/all) there are multiple errors with this stack trace:

Jun 20, 2020 10:48:08 AM WARNING net.bull.javamelody.internal.common.JavaLogger warn
exception while collecting data: java.io.IOException: No space left on device
java.io.IOException: No space left on device
	at java.io.FileOutputStream.writeBytes(Native Method)
	at java.io.FileOutputStream.write(FileOutputStream.java:326)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
	at java.util.zip.DeflaterOutputStream.deflate(DeflaterOutputStream.java:253)
	at java.util.zip.DeflaterOutputStream.write(DeflaterOutputStream.java:211)
	at java.util.zip.GZIPOutputStream.write(GZIPOutputStream.java:145)
	at net.bull.javamelody.internal.model.CounterStorage$CounterOutputStream.write(CounterStorage.java:73)
	at java.io.ObjectOutputStream$BlockDataOutputStream.drain(ObjectOutputStream.java:1877)
	at java.io.ObjectOutputStream$BlockDataOutputStream.flush(ObjectOutputStream.java:1822)
	at java.io.ObjectOutputStream.flush(ObjectOutputStream.java:719)
	at java.io.ObjectOutputStream.close(ObjectOutputStream.java:740)
	at net.bull.javamelody.internal.model.CounterStorage.writeToFile(CounterStorage.java:130)
	at net.bull.javamelody.internal.model.CounterStorage.writeToFile(CounterStorage.java:117)
	at net.bull.javamelody.internal.model.Counter.writeToFile(Counter.java:962)
	at net.bull.javamelody.internal.model.Collector.collectCounterData(Collector.java:742)
	at net.bull.javamelody.internal.model.Collector.collect(Collector.java:363)
	at net.bull.javamelody.internal.model.Collector.collectWithoutErrors(Collector.java:329)
	at net.bull.javamelody.NodesCollector.collectWithoutErrorsNow(NodesCollector.java:173)
	at net.bull.javamelody.NodesCollector.collectWithoutErrors(NodesCollector.java:147)
	at net.bull.javamelody.NodesCollector$1.run(NodesCollector.java:91)
	at java.util.TimerThread.mainLoop(Timer.java:555)
	at java.util.TimerThread.run(Timer.java:505)

I think this is on the Jenkins server itself, which may need someone from @nodejs/build-infra to address:
https://ci.nodejs.org/computer/(master)/
[screenshot of the Jenkins master node status at ci.nodejs.org]
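
(As an aside, a minimal sketch of how one could report free space on the master from the machine itself; using /var/lib/jenkins as JENKINS_HOME is an assumption, not confirmed for this host:)

#!/usr/bin/env python3
# Minimal sketch: report disk usage for the (assumed) JENKINS_HOME partition.
import shutil

JENKINS_HOME = "/var/lib/jenkins"  # assumption; adjust to the real JENKINS_HOME

total, used, free = shutil.disk_usage(JENKINS_HOME)
gib = 1024 ** 3
print(f"total {total / gib:.1f} GiB, used {used / gib:.1f} GiB, free {free / gib:.1f} GiB")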

@jbergstroem
Member

So, we're at roughly 300 GB worth of build history across all jobs. Seeing how the job diversity/matrix has grown over time, I can't help but feel that we should increase disk space.
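
(For context, tallying per-job build-history sizes looks roughly like the sketch below; it assumes the conventional JENKINS_HOME/jobs/<job>/builds layout and is not the exact command used here:)

#!/usr/bin/env python3
# Sketch: tally the on-disk size of each job's build history and print the
# ten largest. Assumes the conventional JENKINS_HOME/jobs/<job>/builds layout.
import os

JENKINS_HOME = "/var/lib/jenkins"  # assumed location

def dir_size(path):
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # file disappeared mid-scan (e.g. a build being deleted)
    return total

jobs_dir = os.path.join(JENKINS_HOME, "jobs")
sizes = {
    job: dir_size(os.path.join(jobs_dir, job, "builds"))
    for job in os.listdir(jobs_dir)
    if os.path.isdir(os.path.join(jobs_dir, job, "builds"))
}

for job, size in sorted(sizes.items(), key=lambda kv: kv[1], reverse=True)[:10]:
    print(f"{size / 1024 ** 3:8.1f} GiB  {job}")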

@jbergstroem
Member

jbergstroem commented Jun 20, 2020

Since the Windows axis builds take up the most space (~60% of the total across all builds), I took the liberty of removing the two oldest weeks of build folders (not the logs, though) for win-2019 and win-2019-x86. Apologies for the inconvenience. This gives us some breathing room to find a more permanent solution. FWIW, it looks like we're storing one month's worth of build output, which may or may not be too long. Since disk space is cheap, I'd prefer just getting a new disk, twice the size, attached to Jenkins alone.
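
(For illustration only, the cleanup amounts to something like the sketch below; JENKINS_HOME, the job names and the 14-day cutoff are assumptions, and the real win-2019 builds live under the matrix job's per-axis configuration folders rather than directly under jobs/<name>, so the paths are simplified:)

#!/usr/bin/env python3
# Illustrative sketch only -- not the script used for the actual cleanup.
# Removes the contents of build directories older than a cutoff for the given
# jobs, keeping each build's "log" file. All paths/names below are assumptions.
import os
import shutil
import time

JENKINS_HOME = "/var/lib/jenkins"            # assumed location
JOBS = ["win-2019", "win-2019-x86"]          # simplified; really matrix axes
CUTOFF = time.time() - 14 * 24 * 3600        # two weeks, as in the cleanup above

for job in JOBS:
    builds_dir = os.path.join(JENKINS_HOME, "jobs", job, "builds")
    if not os.path.isdir(builds_dir):
        continue
    for build in os.listdir(builds_dir):
        build_dir = os.path.join(builds_dir, build)
        if os.path.islink(build_dir) or not os.path.isdir(build_dir):
            continue                          # skip permalinks and stray files
        if os.path.getmtime(build_dir) > CUTOFF:
            continue                          # newer than the cutoff: keep it
        for entry in os.listdir(build_dir):
            if entry == "log":
                continue                      # keep the console log
            path = os.path.join(build_dir, entry)
            if os.path.isdir(path) and not os.path.islink(path):
                shutil.rmtree(path, ignore_errors=True)
            else:
                os.remove(path)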

If someone could confirm that things are on their way again, that'd be great.

@jbergstroem
Member

I will be expanding the disk space available to Jenkins this coming Sunday. I will close this issue after that has been done and confirmed working, unless there are outstanding issues.

@jbergstroem
Member

Going to close this since we identified space-related issues and fixed them. Please reopen if I'm incorrect in my assessment.
