
Endless query loop causing unusual CPU usage #14367


Closed
seadra opened this issue Jan 17, 2021 · 5 comments
Labels
performance/speed (performance issues with slow downs)

Comments


seadra commented Jan 17, 2021

  • Gitea version (or commit ref): 1.13.1 (self compiled with sqlite3, using go 1.15.5)
  • Git version: 2.20.1
  • Operating system: Debian (armv7l)
  • Database (use [x]):
    • [ ] PostgreSQL
    • [ ] MySQL
    • [ ] MSSQL
    • [x] SQLite
  • Can you reproduce the bug at https://try.gitea.io:
    • [ ] Yes (provide example URL)
    • [ ] No
  • Log gist:

Description

gitea has unusually high CPU usage. From the logs, it seems gitea is endlessly running this query in the background:

..m.io/xorm/core/db.go:286:afterProcess() [I] [SQL] SELECT user_id, count(*) AS count FROM notification WHERE user_id IN (SELECT user_id FROM notification WHERE updated_unix >= ? AND updated_unix < ?) AND status = ? GROUP BY user_id [1610911440 1610911450 1] - 1.048278ms

The numbers 1610911440 and 1610911450 (the updated_unix bounds) are incremented by 10 on each iteration. gitea has been running for 2 days, and the log file is continuously filling up with this query.

Screenshots


noerw commented Jan 17, 2021

Increased CPU usage is a known issue, see #7910.
This query is related to checking notifications for a signed-in user every 10 seconds, and is most likely not responsible for the base load.
Closing this as a duplicate, unless you can provide more context.

noerw closed this as completed Jan 17, 2021
@zeripath
Contributor

https://docs.gitea.io/en-us/config-cheat-sheet/#ui---notification-uinotification

You can also change the frequency, or turn it off, by changing the EVENT_SOURCE_UPDATE_TIME value.
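
For illustration, a sketch of the relevant app.ini setting based on the linked cheat sheet (the 30s value is only an example; see the cheat sheet for the exact value that disables the polling):

```ini
[ui.notification]
; How often notification counts are refreshed for signed-in users.
; The default is 10s, which matches the query interval seen in the logs above.
EVENT_SOURCE_UPDATE_TIME = 30s
```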


seadra commented Jan 17, 2021

How can I get more context? Is there an easy way to profile gitea to see what is causing the CPU load?

I don't think my issue is a duplicate of #7910 because it didn't happen in versions <=1.12.
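
For reference, the pprof endpoint queried in the next comment is Gitea's built-in profiling listener; assuming the standard [server] setting, it listens on 127.0.0.1:6060 once enabled in app.ini:

```ini
[server]
; Expose Go's pprof HTTP endpoints on 127.0.0.1:6060 (disabled by default)
ENABLE_PPROF = true
```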


seadra commented Jan 17, 2021

I tried running `go tool pprof http://localhost:6060/debug/pprof/profile?seconds=30` and `top10` shows the following:

Duration: 30s, Total samples = 1.72s ( 5.73%)
Entering interactive mode (type "help" for commands, "o" for options)
(pprof) top10
Showing nodes accounting for 1420ms, 82.56% of 1720ms total
Showing top 10 nodes out of 91
      flat  flat%   sum%        cum   cum%
     400ms 23.26% 23.26%      400ms 23.26%  runtime.futex
     320ms 18.60% 41.86%      320ms 18.60%  runtime.epollwait
     190ms 11.05% 52.91%      190ms 11.05%  runtime._LostSIGPROFDuringAtomic64
     150ms  8.72% 61.63%      150ms  8.72%  runtime._ExternalCode
     140ms  8.14% 69.77%      140ms  8.14%  runtime.usleep
     120ms  6.98% 76.74%      120ms  6.98%  kernelcas
      40ms  2.33% 79.07%      740ms 43.02%  runtime.findrunnable
      20ms  1.16% 80.23%      100ms  5.81%  code.gitea.io/gitea/modules/queue.(*WorkerPool).doWork
      20ms  1.16% 81.40%       20ms  1.16%  runtime.heapBitsSetType
      20ms  1.16% 82.56%       40ms  2.33%  runtime.mallocgc
(pprof)

During the 30 seconds, no requests were made to gitea (it was supposed to be completely idle), but it kept consuming 4-10% CPU.

Any ideas?

noerw reopened this Jan 17, 2021
noerw added the performance/speed (performance issues with slow downs) label Jan 17, 2021

zeripath commented Feb 7, 2021

The doWork loop here implies that there is some background work going on.

To understand the runtime.futex load you'd need to find out what is calling that - it's too low level to talk about. It's probably just things waiting for work - maybe even sqlite.
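
A couple of standard pprof commands that can help find those callers (generic Go tooling, nothing Gitea-specific):

```
(pprof) peek futex     # show callers and callees of runtime.futex in the profile
(pprof) web            # render a call graph (requires graphviz)
```

Running `go tool pprof -http=:8081 <profile>` also serves an interactive flame-graph view of the same data.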

Now there is a polling loop:

func (q *ByteFIFOQueue) readToChan() {

which ideally would be changed to something that blocks, rather than a timer loop, but unfortunately I've not been able to find a way to do that.
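
For context, here is a minimal sketch of the two loop shapes being compared, with made-up names, types and interval (this is not the actual ByteFIFOQueue code): a ticker-driven poll that wakes periodically regardless of work, versus a loop that blocks on a channel until a producer hands it something.

```go
package main

import (
	"fmt"
	"time"
)

// Illustrative only: NOT Gitea's ByteFIFOQueue implementation, just a
// sketch of the two loop shapes being discussed.

// pollLoop is the timer-driven shape: it wakes on every tick even when
// there is nothing to do, so an idle process still burns a little CPU.
func pollLoop(fetch func() (string, bool), out chan<- string, stop <-chan struct{}) {
	ticker := time.NewTicker(100 * time.Millisecond) // interval is a made-up example value
	defer ticker.Stop()
	for {
		select {
		case <-stop:
			return
		case <-ticker.C:
			if item, ok := fetch(); ok {
				out <- item
			}
		}
	}
}

// blockingLoop is the alternative shape: it sleeps until a producer
// sends work, so an idle queue costs essentially nothing.
func blockingLoop(work <-chan string, out chan<- string, stop <-chan struct{}) {
	for {
		select {
		case <-stop:
			return
		case item := <-work:
			out <- item
		}
	}
}

func main() {
	work := make(chan string, 1)
	out := make(chan string, 1)
	stop := make(chan struct{})

	go blockingLoop(work, out, stop)
	work <- "hello"
	fmt.Println(<-out) // prints "hello", then the loop goes back to sleep
	close(stop)
}
```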

There is always some background polling going on, and 4% is hardly a lot of work on a Raspberry Pi.

go-gitea locked and limited conversation to collaborators May 3, 2023