Extension is writing a lot of data to the disk #5362
Hi @aggserp4. Could you tell us what files are being written to, and their sizes? It is expected that when a project is created, or its configuration changed, the extension will (re)build a database of symbols in all headers in all configured paths, in order to provide that information via IntelliSense.
This doesn't appear to be specifically affected by my creating new projects or including more files. It just seems to write more and more data as I write more code. I was working on a couple of files for about an hour, just writing some functions, and it wrote about 3.6 GB to the disk over that period. I also noticed that the "Updating IntelliSense..." symbol was appearing the whole time while I was editing, and it's still showing up now regardless of which projects I open. That wasn't the case when I opened this issue, though. As for which files it's writing to, I don't know. I can see which files the process has opened at any given time, but I don't know which ones it's writing to. I don't think it's draining any space, if that's of any help.
This sounds like the IntelliSense cache. You can change the size or location via the intelliSenseCacheSize and intelliSenseCachePath settings (or set the size to 0 to disable it).
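As a sketch, those two settings go in the user or workspace settings.json (which accepts JSONC comments). The values below are illustrative, assuming the size setting is expressed in MB as the 5 GB default later in this thread suggests:

```jsonc
{
  // Illustrative: cap the IntelliSense cache (value assumed to be in MB;
  // set it to 0 to disable caching entirely, as discussed in this thread).
  "C_Cpp.intelliSenseCacheSize": 1024,

  // Illustrative: keep the cache inside the workspace so the ipch
  // files are easy to find and watch.
  "C_Cpp.intelliSenseCachePath": "${workspaceFolder}/.vscode/ipch"
}
```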
Sounds about right, I turned it off and it was writing very little data. Turned it back on and it started writing in the same pattern as before. Still doesn't explain why the "Updating IntelliSense..." symbol occasionally shows up and never goes away, but perhaps that's a separate issue.
@aggserp4 The "Updating IntelliSense..." means we're waiting on work from cpptools-srv. So if you see that process using CPU then it's busy doing something (potentially in an infinite loop, although we don't know of any such cases currently). If you don't see cpptools-srv then it may have crashed, or if you see cpptools using CPU instead, it could be stuck computing recursive includes before it's able to launch cpptools-srv (removing "**" from your includePath disables recursive includes).
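For reference, removing the recursive glob means listing include roots explicitly in c_cpp_properties.json. A minimal sketch, with purely illustrative paths:

```jsonc
{
  "configurations": [
    {
      "name": "Linux",
      // Before: "${workspaceFolder}/**" triggers recursive include
      // resolution. After: list the include roots explicitly
      // (these paths are illustrative).
      "includePath": [
        "${workspaceFolder}/include",
        "${workspaceFolder}/src"
      ]
    }
  ],
  "version": 4
}
```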
For the record, I'm having this very issue, but in my case it's not that I'm worried about the amount of data written, it's that writing this data makes the whole machine totally unusable. I don't have a laptop with a top-of-the-line NVMe SSD... and 300 MB to 600 MB "drops" every few seconds lead to Linux freezing the whole machine, regularly. I'm happy I finally found why the whole vscode would freeze regularly. I was considering just returning to plain vim.
@Colengms I first tried to check which file was being written by stracing cpptools, but nothing was coming up... and then realized the ipch file is mmapped. So, that's one hypothesis. Disabling the cache (by setting its size to zero) also solves the problem. Second hint.
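This is why strace comes up empty: writes to an mmapped file never go through write(2), but on Linux the flushed pages are still counted in the per-process I/O accounting at /proc/&lt;pid&gt;/io. A sketch of checking that counter (the file contents below are made-up sample data standing in for a real process; on a live system you would read /proc/$(pgrep -o cpptools-srv)/io instead):

```shell
# Sample contents standing in for a real /proc/<pid>/io (values made up).
# write_bytes counts bytes that actually hit the block layer, including
# pages flushed from an mmap, which strace's syscall view misses.
cat > /tmp/sample_io <<'EOF'
rchar: 52428800
wchar: 3865470566
read_bytes: 1048576
write_bytes: 3865470566
cancelled_write_bytes: 0
EOF

# Convert write_bytes to GiB for readability:
awk '/^write_bytes/ { printf "%.1f GiB\n", $2 / (1024 * 1024 * 1024) }' /tmp/sample_io
# -> 3.6 GiB
```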
I'm not sure. It's so unusable that I just disabled the cache for now. Big deadline coming up, so I won't have much time to look into it further.
I'm not sure I fully understand the implication of this change.
@aggserp4 could you expand on why you are closing this issue?
Apologies, I was looking through my issues and closed this one without paying much attention, because it had had no responses for a while and I had solved my issue by disabling the cache. However, it is clear that your problem remains, so I will reopen the issue. This was my mistake. I should mention that I do not seem to be having the problem you mentioned. I am using a mechanical drive, and despite the large amount of data being written in a short period of time with the cache enabled, my computer remains responsive.
I have the same problem as you. I edited a cpp project for one day and I see in the system monitor that cpptools-srv has written 467 GiB of data.
@kilasuelika Do you know what files are getting written? Does changing the C_Cpp.intelliSenseCacheSize to 0 fix the issue? The ipch cache files should be the only files that cpptools-srv creates. The default C_Cpp.intelliSenseCacheSize is 5 GB and the location is shared by all projects using the default C_Cpp.intelliSenseCachePath, so I'm not sure how you could be getting 467 GB of data generated... the system monitor might be counting repeated writes of memory to the same location.
@sean-mcmanus, I'm interpreting @kilasuelika's message as:
cpptools-srv uses large memory mapped ipch files by default, so every time memory is written it might eventually show up as a file write. You can set C_Cpp.intelliSenseCacheSize to 0 to stop doing that, to see if it improves performance for you. Our performance testing showed that using the caching improved performance, but we've heard some users seeing opposite results.
This is a bit of a tone-deaf answer. I'm sure it improves performance, even more so on very high quality NVMe SSDs. However, and I think everyone who ever had to look into this issue would agree, the "might" in your sentence is downplaying the issue. It does end up as a very large amount of actual writes. In the case of @kilasuelika, ~450 GiB for one day, which matches my own experience. That's 45 GiB per hour. It's a massive amount of writes. cpptools-srv does seem to be frequently modifying a lot of pages in the cache mmap. What is worrying in this thread is that you basically do not seem to consider this an issue. Consumer-grade SSDs are rated for amounts of writes smaller than 1 full disk write per day. For mine, it's 0.3. 450 GiB per day can definitely kill such an SSD unreasonably rapidly, on a stock install of vscode.
@doudou Does setting C_Cpp.intelliSenseCacheSize to 0 not fix the issue for you, or are you saying that setting should be set to 0 by default, or what other change are you proposing? Our feature matches what is used by VS. I can forward your message to that team to see what their thoughts are.
@doudou @kilasuelika When you repro the issue, are you editing a header file, or do you see "failed to read response" (or "crash") messages in the logging? Those scenarios might be causing the excessive writes, but under other conditions that would not be expected, since the ipch data would stay unchanged and not be re-written to disk. https://github.com/microsoft/vscode-cpptools/releases/tag/1.2.0-preview disables the caching for a TU after a crash, so using that would fix that case. Or are you opening lots of new source files that don't have auto-pch files generated? Or is some configuration changing that could cause the ipch files to need to be regenerated? The design philosophy behind the ipch files is to store the processed header files to disk so they don't need to be reprocessed. Would some intelliSenseCacheSizeDiskUsagePerHour setting be useful, or are you okay with just disabling the feature?
From my experience (and this discussion), I would say that the setting's default should be zero, and there should be some mechanism inside cpptools-srv to detect excessive writes and either (loudly) warn the user or automatically disable the cache.
I would argue that editing a header is a rather expected part of the development workflow. If editing a header is expected to generate excessive writes, I'd say that these excessive writes are expected in general.
I'm not sure what "auto-pch files generated" means. Do you mean the files generated by cpptools-srv itself? Possibly. Since I have no way to know whether a file has a PCH or not, it's hard to say.
auto-pch is another name for the ipch files that get written to disk. If you set C_Cpp.intelliSenseCachePath to something like ${workspaceFolder}/.vscode then it's easier to see when a file is generated or regenerated. We'll look into possibly changing the default. Not sure yet...
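A sketch of that suggestion in settings.json, keeping the cache where regeneration is easy to observe:

```jsonc
{
  // Relocate the ipch cache into the workspace so regenerated files
  // are easy to spot (e.g. by watching the folder's timestamps).
  "C_Cpp.intelliSenseCachePath": "${workspaceFolder}/.vscode"
}
```

With this in place, a file in that folder whose timestamp keeps changing while you edit would point at the cache being repeatedly invalidated.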
I find that the problem has disappeared, maybe due to updates. It only writes a few GiB of data now. What's more, formatting speed has also increased. Before, I had to wait some seconds after Ctrl+S.
@kilasuelika Which update? 1.2.0-insiders? We didn't intentionally change anything ipch-related, unless you were hitting crashes that were causing the caching to be repeatedly invalidated. Formatting speed could have increased due to clang-format changes we made.
I guess it is. Now I see that the version is indeed "1.2.0-insiders". Yesterday when I opened the extension manager, some extensions showed "installing" by themselves and quickly finished.
I had been worrying that the C++ extension tools would destroy my hard drive by writing 100 GB to it every day. The intelliSenseCacheSize = 0 trick is so nice. Thanks, guys!
Had to disable the cpp extension because of high CPU, memory and disk r/w on Ubuntu 22.04.
@bach001 Are you able to see what file type is using the disk space, i.e. .ipch or .db? You can try setting C_Cpp.files.exclude for a smaller .db (use a folder glob, not a glob that ends with "**", or you'll need to change the C_Cpp.exclusionPolicy setting), and C_Cpp.intelliSenseCacheSize for fewer .ipch files.
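A sketch of those two settings together in settings.json; the excluded folder name is purely illustrative:

```jsonc
{
  // Illustrative: exclude a large vendored folder from the symbol
  // database using a folder glob (not one ending in "**", which would
  // require changing C_Cpp.exclusionPolicy as noted above).
  "C_Cpp.files.exclude": {
    "${workspaceFolder}/third_party": true
  },

  // Shrink the ipch cache (value assumed to be in MB; 0 disables it).
  "C_Cpp.intelliSenseCacheSize": 256
}
```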
Hey, I'm experiencing this same issue, and after reading this thread I have a few questions and suggestions.
I didn't look into the details, but I observed that every tool employing clangd for this job manifests the same symptom. I guess it's the many .pch files (for Windows VS). I took a close look at clangd working behind Qt Creator; the essence is the same. So this is actually a clangd issue.
OS: Ubuntu 18.04 LTS
VS Code version: 1.44.2
Extension version: 0.27.0
The extension appears to be writing a lot of data to the disk very quickly, for reasons I can't figure out. My projects usually only contain a couple of files that are a few KB in size, yet when I make the slightest edit, save, or do just about anything the extension writes a lot of data to the disk. After about 20 minutes of usage, it has already written over 1 gigabyte of data.
Is this normal behavior or is something wrong? What's causing all those writes?