node_exporter fails to restart #1881

Closed
sam-celer opened this issue Nov 7, 2020 · 9 comments

@sam-celer

Host operating system: output of uname -a

Linux 3.10.0-1127.19.1.el7.x86_64 #1 SMP Tue Aug 25 17:23:54 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux

node_exporter version: output of node_exporter --version

node_exporter, version 1.0.1 (branch: HEAD, revision: 3715be6)
build user: root@1f76dbbcfa55
build date: 20200616-12:44:12
go version: go1.14.4

node_exporter command line flags

--collector.textfile.directory /opt/celertech/releases/tools/prometheus/node-exporter/current/data/collector --log.level info
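
For reference, these flags correspond to an invocation along the following lines; only the flags themselves are taken from the report, and the wrapper that actually launches and restarts the binary is not stated:

node_exporter \
  --collector.textfile.directory /opt/celertech/releases/tools/prometheus/node-exporter/current/data/collector \
  --log.level info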

Are you running node_exporter in Docker?

No

What did you do that produced an error?

Scheduled restart of the process

What did you expect to see?

Process coming up

What did you see instead?

Process crashing

The issue is intermittent: sometimes the process restarts fine, sometimes it doesn't. Stack trace below:

level=info ts=2020-11-07T12:05:05.510Z caller=node_exporter.go:177 msg="Starting node_exporter" version="(version=1.0.1, branch=HEAD, revision=3715be6ae899f2a9b9dbfd9c39f3e09a7bd4559f)"
level=info ts=2020-11-07T12:05:05.510Z caller=node_exporter.go:178 msg="Build context" build_context="(go=go1.14.4, user=root@1f76dbbcfa55, date=20200616-12:44:12)"
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:105 msg="Enabled collectors"
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=arp
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=bcache
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=bonding
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=btrfs
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=conntrack
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=cpu
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=cpufreq
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=diskstats
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=edac
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=entropy
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=filefd
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=filesystem
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=hwmon
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=infiniband
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=ipvs
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=loadavg
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=mdadm
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=meminfo
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=netclass
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=netdev
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=netstat
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=nfs
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=nfsd
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=powersupplyclass
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=pressure
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=rapl
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=schedstat
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=sockstat
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=softnet
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=stat
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=textfile
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=thermal_zone
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=time
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=timex
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=udp_queues
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=uname
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=vmstat
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=xfs
level=info ts=2020-11-07T12:05:05.512Z caller=node_exporter.go:112 collector=zfs
level=info ts=2020-11-07T12:05:05.515Z caller=node_exporter.go:191 msg="Listening on" address=:9100
level=info ts=2020-11-07T12:05:05.515Z caller=tls_config.go:170 msg="TLS is disabled and it cannot be enabled on the fly." http2=false
runtime: checkdead: find g 10261 in status 1
fatal error: checkdead: runnable g

runtime stack:
runtime.throw(0xbd3314, 0x15)
/usr/local/go/src/runtime/panic.go:1116 +0x72
runtime.checkdead()
/usr/local/go/src/runtime/proc.go:4407 +0x390
runtime.mput(...)
/usr/local/go/src/runtime/proc.go:4824
runtime.stopm()
/usr/local/go/src/runtime/proc.go:1832 +0x95
runtime.exitsyscall0(0xc000244a80)
/usr/local/go/src/runtime/proc.go:3268 +0x111
runtime.mcall(0x0)
/usr/local/go/src/runtime/asm_amd64.s:318 +0x5b

goroutine 1 [IO wait, 8 minutes]:
internal/poll.runtime_pollWait(0x7f7a0fa6af18, 0x72, 0x0)
/usr/local/go/src/runtime/netpoll.go:203 +0x55
internal/poll.(*pollDesc).wait(0xc0001e4198, 0x72, 0x0, 0x0, 0xbc7ee1)
/usr/local/go/src/internal/poll/fd_poll_runtime.go:87 +0x45
internal/poll.(*pollDesc).waitRead(...)
/usr/local/go/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0xc0001e4180, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
/usr/local/go/src/internal/poll/fd_unix.go:384 +0x1d4
net.(*netFD).accept(0xc0001e4180, 0x84bee676a2cfadf3, 0x0, 0x84bee676a2cfadf3)
/usr/local/go/src/net/fd_unix.go:238 +0x42
net.(*TCPListener).accept(0xc0001e2b00, 0x5fa68d75, 0xc0001a7b58, 0x4bee86)
/usr/local/go/src/net/tcpsock_posix.go:139 +0x32
net.(*TCPListener).Accept(0xc0001e2b00, 0xc0001a7ba8, 0x18, 0xc000000180, 0x6d09ac)
/usr/local/go/src/net/tcpsock.go:261 +0x64
net/http.(*Server).Serve(0xc0001ec000, 0xcd8400, 0xc0001e2b00, 0x0, 0x0)
/usr/local/go/src/net/http/server.go:2901 +0x25d
net/http.(*Server).ListenAndServe(0xc0001ec000, 0xc000095e80, 0x4)
/usr/local/go/src/net/http/server.go:2830 +0xb7
github.com/prometheus/node_exporter/https.Listen(0xc0001ec000, 0x0, 0x0, 0xccade0, 0xc00017e960, 0x0, 0xc0001e8f60)
/app/https/tls_config.go:171 +0x18d
main.main()
/app/node_exporter.go:193 +0x1244

goroutine 23 [IO wait]:
internal/poll.runtime_pollWait(0x7f7a0fa6ae38, 0x72, 0xffffffffffffffff)
/usr/local/go/src/runtime/netpoll.go:203 +0x55
internal/poll.(*pollDesc).wait(0xc0001e4218, 0x72, 0x1000, 0x1000, 0xffffffffffffffff)
/usr/local/go/src/internal/poll/fd_poll_runtime.go:87 +0x45
internal/poll.(*pollDesc).waitRead(...)
/usr/local/go/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0xc0001e4200, 0xc000123000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
/usr/local/go/src/internal/poll/fd_unix.go:169 +0x19b
net.(*netFD).Read(0xc0001e4200, 0xc000123000, 0x1000, 0x1000, 0x203000, 0x203000, 0x7f7a0fa70158)
/usr/local/go/src/net/fd_unix.go:202 +0x4f
net.(*conn).Read(0xc00000e950, 0xc000123000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
/usr/local/go/src/net/net.go:184 +0x8e
net/http.(*connReader).Read(0xc0001e9260, 0xc000123000, 0x1000, 0x1000, 0x7f7a0fa71e90, 0x0, 0xc00007f429)
/usr/local/go/src/net/http/server.go:786 +0xf4
bufio.(*Reader).fill(0xc000197800)
/usr/local/go/src/bufio/bufio.go:100 +0x103
bufio.(*Reader).ReadSlice(0xc000197800, 0xc00006190a, 0x7f7a0fa71e90, 0xc0000619c0, 0x40c5d6, 0xc000364000, 0x100)
/usr/local/go/src/bufio/bufio.go:359 +0x3d
bufio.(*Reader).ReadLine(0xc000197800, 0xc0000619c8, 0x11ab240, 0x7f7a366f7108, 0x0, 0xb8bdc0, 0xc0001e92c0)
/usr/local/go/src/bufio/bufio.go:388 +0x34
net/textproto.(*Reader).readLineSlice(0xc0001e92c0, 0xc000364000, 0x0, 0x43745c, 0xc000061a28, 0x0)
/usr/local/go/src/net/textproto/reader.go:58 +0x6c
net/textproto.(*Reader).ReadLine(...)
/usr/local/go/src/net/textproto/reader.go:39
net/http.readRequest(0xc000197800, 0x0, 0xc000364000, 0x0, 0x0)
/usr/local/go/src/net/http/request.go:1015 +0xa4
net/http.(*conn).readRequest(0xc000167540, 0xcdac00, 0xc000095f00, 0x0, 0x0, 0x0)
/usr/local/go/src/net/http/server.go:966 +0x191
net/http.(*conn).serve(0xc000167540, 0xcdac00, 0xc000095f00)
/usr/local/go/src/net/http/server.go:1822 +0x6d4
created by net/http.(*Server).Serve
/usr/local/go/src/net/http/server.go:2933 +0x35c

goroutine 77 [select]:
github.com/prometheus/client_golang/prometheus.(*Registry).Gather(0xc000031720, 0x0, 0x0, 0x0, 0x0, 0x0)
/app/vendor/github.com/prometheus/client_golang/prometheus/registry.go:510 +0xb6a
github.com/prometheus/client_golang/prometheus.Gatherers.Gather(0xc0001e27c0, 0x2, 0x2, 0xc00003c360, 0xc0002137fe, 0x451500, 0x8f8273, 0x115a801)
/app/vendor/github.com/prometheus/client_golang/prometheus/registry.go:716 +0xc8c
github.com/prometheus/client_golang/prometheus/promhttp.HandlerFor.func1(0x7f7a0fa6fb88, 0xc000535810, 0xc0003a4500)
/app/vendor/github.com/prometheus/client_golang/prometheus/promhttp/http.go:126 +0x93
net/http.HandlerFunc.ServeHTTP(0xc0001c3260, 0x7f7a0fa6fb88, 0xc000535810, 0xc0003a4500)
/usr/local/go/src/net/http/server.go:2012 +0x44
github.com/prometheus/client_golang/prometheus/promhttp.InstrumentHandlerInFlight.func1(0x7f7a0fa6fb88, 0xc000535810, 0xc0003a4500)
/app/vendor/github.com/prometheus/client_golang/prometheus/promhttp/instrument_server.go:40 +0xab
net/http.HandlerFunc.ServeHTTP(0xc0001e8e40, 0x7f7a0fa6fb88, 0xc000535810, 0xc0003a4500)
/usr/local/go/src/net/http/server.go:2012 +0x44
github.com/prometheus/client_golang/prometheus/promhttp.InstrumentHandlerCounter.func1(0xcd86c0, 0xc0001ec1c0, 0xc0003a4500)
/app/vendor/github.com/prometheus/client_golang/prometheus/promhttp/instrument_server.go:100 +0xda
net/http.HandlerFunc.ServeHTTP(0xc0001e8f30, 0xcd86c0, 0xc0001ec1c0, 0xc0003a4500)
/usr/local/go/src/net/http/server.go:2012 +0x44
main.(*handler).ServeHTTP(0xc0000958c0, 0xcd86c0, 0xc0001ec1c0, 0xc0003a4500)
/app/node_exporter.go:77 +0x53c
net/http.(*ServeMux).ServeHTTP(0x11aa4a0, 0xcd86c0, 0xc0001ec1c0, 0xc0003a4500)
/usr/local/go/src/net/http/server.go:2387 +0x1a5
net/http.serverHandler.ServeHTTP(0xc0001ec000, 0xcd86c0, 0xc0001ec1c0, 0xc0003a4500)
/usr/local/go/src/net/http/server.go:2807 +0xa3
net/http.(*conn).serve(0xc000167b80, 0xcdac00, 0xc0002b3940)
/usr/local/go/src/net/http/server.go:1895 +0x86c
created by net/http.(*Server).Serve
/usr/local/go/src/net/http/server.go:2933 +0x35c

goroutine 10228 [semacquire]:
sync.runtime_Semacquire(0xc000346ea8)
/usr/local/go/src/runtime/sema.go:56 +0x42
sync.(*WaitGroup).Wait(0xc000346ea0)
/usr/local/go/src/sync/waitgroup.go:130 +0x64
github.com/prometheus/node_exporter/collector.NodeCollector.Collect(0xc00017eb70, 0xccade0, 0xc00017e960, 0xc0001963c0)
/app/collector/collector.go:148 +0x15d
github.com/prometheus/client_golang/prometheus.(*Registry).Gather.func1()
/app/vendor/github.com/prometheus/client_golang/prometheus/registry.go:443 +0x19d
created by github.com/prometheus/client_golang/prometheus.(*Registry).Gather
/app/vendor/github.com/prometheus/client_golang/prometheus/registry.go:535 +0xe36

goroutine 10227 [semacquire]:
sync.runtime_Semacquire(0xc000346e94)
/usr/local/go/src/runtime/sema.go:56 +0x42
sync.(*WaitGroup).Wait(0xc000346e94)
/usr/local/go/src/sync/waitgroup.go:130 +0x64
github.com/prometheus/client_golang/prometheus.(*Registry).Gather.func2(0xc000346e94, 0xc0001963c0, 0xc000196420)
/app/vendor/github.com/prometheus/client_golang/prometheus/registry.go:460 +0x2b
created by github.com/prometheus/client_golang/prometheus.(*Registry).Gather
/app/vendor/github.com/prometheus/client_golang/prometheus/registry.go:459 +0x5d8

goroutine 10261 [runnable]:
syscall.Syscall(0x3, 0xa, 0x0, 0x0, 0x0, 0x0, 0x0)
/usr/local/go/src/syscall/asm_linux_amd64.s:18 +0x5
syscall.Close(0xa, 0xc000260180, 0x0)
/usr/local/go/src/syscall/zsyscall_linux_amd64.go:285 +0x40
internal/poll.(*FD).destroy(0xc000260180, 0xc000330701, 0x0)
/usr/local/go/src/internal/poll/fd_unix.go:78 +0x43
internal/poll.(*FD).decref(0xc000260180, 0x1, 0x0)
/usr/local/go/src/internal/poll/fd_mutex.go:213 +0x42
internal/poll.(*FD).Close(0xc000260180, 0xf, 0x0)
/usr/local/go/src/internal/poll/fd_unix.go:100 +0x4f
os.(*file).close(0xc000260180, 0xc00026e990, 0xc000340cd3)
/usr/local/go/src/os/file_unix.go:248 +0x38
os.(*File).Close(0xc000392468, 0x1, 0xc00026e990)
/usr/local/go/src/os/file_unix.go:237 +0x33
path/filepath.glob(0xc000340cc0, 0x12, 0xc000340cd3, 0x12, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
/usr/local/go/src/path/filepath/match.go:336 +0x236
path/filepath.Glob(0xc000340cc0, 0x25, 0x2, 0xc000340cc0, 0x25, 0x1, 0x1)
/usr/local/go/src/path/filepath/match.go:251 +0x36c
github.com/prometheus/procfs/sysfs.FS.ClassThermalZoneStats(0xbc5090, 0x4, 0xc000014430, 0x0, 0x0, 0x0, 0xc000014498)
/app/vendor/github.com/prometheus/procfs/sysfs/class_thermal.go:42 +0x165
github.com/prometheus/node_exporter/collector.(*thermalZoneCollector).Update(0xc0000959c0, 0xc0001963c0, 0x11aa540, 0xc000014678)
/app/collector/thermal_zone_linux.go:70 +0x4f
github.com/prometheus/node_exporter/collector.execute(0xbcc945, 0xc, 0xccb680, 0xc0000959c0, 0xc0001963c0, 0xccade0, 0xc00017e960)
/app/collector/collector.go:153 +0x84
github.com/prometheus/node_exporter/collector.NodeCollector.Collect.func1(0xc0001963c0, 0xc00017eb70, 0xccade0, 0xc00017e960, 0xc000346ea0, 0xbcc945, 0xc, 0xccb680, 0xc0000959c0)
/app/collector/collector.go:144 +0x6d
created by github.com/prometheus/node_exporter/collector.NodeCollector.Collect
/app/collector/collector.go:143 +0x133

goroutine 10253 [runnable]:
syscall.Syscall6(0x106, 0xffffffffffffff9c, 0xc0002bbfc0, 0xc000184b98, 0x100, 0x0, 0x0, 0x0, 0xc000184b98, 0x0)
/usr/local/go/src/syscall/asm_linux_amd64.s:41 +0x5
syscall.fstatat(0xffffffffffffff9c, 0xc0002bbfa0, 0x18, 0xc000184b98, 0x100, 0xc0002bbfa0, 0xc0002bbfb2)
/usr/local/go/src/syscall/zsyscall_linux_amd64.go:1480 +0xc6
syscall.Lstat(...)
/usr/local/go/src/syscall/syscall_linux_amd64.go:76
os.lstatNolog(0xc0002bbfa0, 0x18, 0x3, 0x3, 0xc0002bbfa0, 0x18)
/usr/local/go/src/os/stat_unix.go:42 +0x6e
os.Lstat(0xc0002bbfa0, 0x18, 0x11, 0xbc46e6, 0x1, 0xc0004d7320)
/usr/local/go/src/os/stat.go:22 +0x4d
os.(*File).readdir(0xc00000ef40, 0xffffffffffffffff, 0x0, 0x0, 0xc00000ef40, 0x0, 0x0)
/usr/local/go/src/os/file_unix.go:385 +0x1bd
os.(*File).Readdir(...)
/usr/local/go/src/os/dir.go:26
io/ioutil.ReadDir(0xc0002bba40, 0x11, 0x7f7a366f7108, 0x0, 0x0, 0x0, 0x203000)
/usr/local/go/src/io/ioutil/ioutil.go:98 +0x1eb
github.com/prometheus/procfs/sysfs.NetClass.parseNetClassIface(0xc0004f4b10, 0xc0002bba40, 0x11, 0xc0002bba40, 0x11, 0x0)
/app/vendor/github.com/prometheus/procfs/sysfs/net_class.go:110 +0x6d
github.com/prometheus/procfs/sysfs.FS.NetClass(0xbc5090, 0x4, 0x0, 0xc000544a70, 0xc000339860)
/app/vendor/github.com/prometheus/procfs/sysfs/net_class.go:95 +0x29a
github.com/prometheus/node_exporter/collector.(*netClassCollector).getNetClassInfo(0xc000095980, 0x0, 0x1142559, 0x1142559)
/app/collector/netclass_linux.go:175 +0x4f
github.com/prometheus/node_exporter/collector.(*netClassCollector).Update(0xc000095980, 0xc0001963c0, 0x11aa540, 0xc00001570a)
/app/collector/netclass_linux.go:62 +0x43
github.com/prometheus/node_exporter/collector.execute(0xbc967d, 0x8, 0xccb3e0, 0xc000095980, 0xc0001963c0, 0xccade0, 0xc00017e960)
/app/collector/collector.go:153 +0x84
github.com/prometheus/node_exporter/collector.NodeCollector.Collect.func1(0xc0001963c0, 0xc00017eb70, 0xccade0, 0xc00017e960, 0xc000346ea0, 0xbc967d, 0x8, 0xccb3e0, 0xc000095980)
/app/collector/collector.go:144 +0x6d
created by github.com/prometheus/node_exporter/collector.NodeCollector.Collect
/app/collector/collector.go:143 +0x133

goroutine 10221 [IO wait]:
internal/poll.runtime_pollWait(0x7f7a0fa6ad58, 0x72, 0xffffffffffffffff)
/usr/local/go/src/runtime/netpoll.go:203 +0x55
internal/poll.(*pollDesc).wait(0xc0003b2318, 0x72, 0x0, 0x1, 0xffffffffffffffff)
/usr/local/go/src/internal/poll/fd_poll_runtime.go:87 +0x45
internal/poll.(*pollDesc).waitRead(...)
/usr/local/go/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0xc0003b2300, 0xc00039b3f1, 0x1, 0x1, 0x0, 0x0, 0x0)
/usr/local/go/src/internal/poll/fd_unix.go:169 +0x19b
net.(*netFD).Read(0xc0003b2300, 0xc00039b3f1, 0x1, 0x1, 0xc000095fd8, 0xc00023f768, 0x47de1c)
/usr/local/go/src/net/fd_unix.go:202 +0x4f
net.(*conn).Read(0xc000398908, 0xc00039b3f1, 0x1, 0x1, 0x0, 0x0, 0x0)
/usr/local/go/src/net/net.go:184 +0x8e
net/http.(*connReader).backgroundRead(0xc00039b3e0)
/usr/local/go/src/net/http/server.go:678 +0x58
created by net/http.(*connReader).startBackgroundRead
/usr/local/go/src/net/http/server.go:674 +0xd0

@SuperQ
Member

SuperQ commented Nov 7, 2020

That is quite the strange crash. I have not encountered that in a Go program before.

@SuperQ SuperQ added the bug label Nov 7, 2020
@sam-celer
Author

Maybe related to this bug: prometheus/consul_exporter#184

It looks like it affects Go 1.14.4, which is the Go version this node_exporter release was built with.
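
As a side note, the toolchain a binary was built with can also be confirmed directly with go version (supported since Go 1.13), independent of the exporter's own --version output; the binary path below is a placeholder:

# Hypothetical path; point it at the deployed binary
go version /path/to/node_exporter
# expected output: /path/to/node_exporter: go1.14.4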

@SuperQ
Member

SuperQ commented Nov 7, 2020

Ahh, good find. I had forgotten about that one. I guess a new release is in order. I've been thinking about releasing 1.1 soon anyway.

@sam-celer
Author

Hi there, do you have a rough ETA for the next release? Thanks

@tbuckel

tbuckel commented Dec 17, 2020

Hi there, I'm hitting the same error; it causes random stops of the node_exporter service, which triggers my system alerting. It seems to be related to prometheus/consul_exporter#184 (mentioned earlier), which in turn points to golang/go#40368, and that looks to have been fixed in Go at the end of July.
When is the 1.1 release planned?
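
Not discussed in the thread, but if the exporter happens to be managed by systemd, a drop-in restart policy is one possible stopgap for the random crashes until a build on a fixed Go toolchain ships. The unit name node_exporter.service is an assumption:

# Assumes a systemd unit named node_exporter.service (hypothetical)
sudo mkdir -p /etc/systemd/system/node_exporter.service.d
sudo tee /etc/systemd/system/node_exporter.service.d/restart.conf >/dev/null <<'EOF'
[Service]
Restart=on-failure
RestartSec=5
EOF
sudo systemctl daemon-reload
sudo systemctl restart node_exporter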

@discordianfish
Member

Anyone here able to reproduce this with >=1.1.x? Otherwise we can close this.
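
For anyone checking which version a running instance reports, the build info is exposed on the metrics endpoint (port 9100 as in the log above); the node_exporter_build_info metric should carry version and goversion labels:

curl -s http://localhost:9100/metrics | grep node_exporter_build_info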

@tbuckel

tbuckel commented May 27, 2021

Hi @discordianfish, it hasn't happened for me since I upgraded.

@sam-celer
Author

sam-celer commented May 27, 2021 via email

@discordianfish
Member

Thanks for confirming!
