Stream: t-compiler/wg-rls-2.0

Topic: slow first query


simulacrum (Feb 14 2020 at 23:46, on Zulip):

I'm not sure if this is expected, but even on relatively small projects (e.g., a single lib.rs with ~200 lines), when I first open a file in vim, the server pretty much instantaneously prints "workspace loaded, 12 rust packages". Then, when I use "go to definition" on anything, the server will jump to 100% CPU for maybe 2-3 seconds before responding.

On larger projects (e.g., rustc) that 2-3 seconds is probably more like 15 seconds, though I haven't measured. Certainly non-instantaneous. This means that my workflow in pretty much any coding session is to open vim, then go to some random line in a file, and "goto definition" and wait. After the first time I've done this, future queries are pretty much instantaneous -- but waiting for the first time is annoying.

I don't know whether I should file a bug. I guess the behavior I would have expected is to provide some way to say "please eagerly compute things in the background" -- most of the time, I'll only run a query after dozens of seconds have passed, so I don't need instantaneous feedback as soon as my editor opens... but I would prefer that I don't have to wait once I do decide to run a query.

One consideration is that in my case, at least, I don't care about battery, CPU, or memory usage at all, because I'm running rust-analyzer on a remote server (where I also run rustc and cargo), whereas my editor is local (on my laptop). That server has enough CPU cores and so forth that I'm not worried about rust-analyzer doing work up front.

Happy to provide more details -- this is probably long enough already :)

matklad (Feb 14 2020 at 23:49, on Zulip):

Yeah, this is a known issue: https://github.com/rust-analyzer/rust-analyzer/issues/1650

matklad (Feb 14 2020 at 23:51, on Zulip):

The fundamental fix here is to implement on-disk persistence, so that everything can be cached and quickly loaded from disk.

However, we probably should prioritize a short-term remedy: eagerly "prime" freshly opened files, and parallelize the analysis
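
For illustration, here is a minimal sketch of what such an eager warm-up pass could look like. The `prime_caches` helper and the `analyze` callback are hypothetical stand-ins for whatever deep query (type inference, diagnostics) the server would force; this is not rust-analyzer's actual code:

```rust
use std::path::{Path, PathBuf};

use rayon::prelude::*;

// Hypothetical sketch: after "workspace loaded", spawn a background thread
// and run the expensive per-file analysis for every file in parallel, so
// the first interactive goto-definition hits warm caches.
fn prime_caches<F>(files: Vec<PathBuf>, analyze: F)
where
    F: Fn(&Path) + Send + Sync + 'static,
{
    std::thread::spawn(move || {
        // `analyze` stands in for whatever deep query fills the caches
        // that later interactive queries reuse.
        files.par_iter().for_each(|file| analyze(file));
    });
}
```

In practice such priming work would presumably need to be cancellable so it doesn't compete with interactive requests once the user starts typing.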

simulacrum (Feb 15 2020 at 02:14, on Zulip):

Yes, to be clear, I'd be more than happy with doing lots of (potentially unnecessary) work at startup. Disk space is actually at more of a premium :)

simulacrum (Feb 15 2020 at 03:31, on Zulip):

Thanks for the feedback though, that issue was an interesting read. I'll have to take a look at the Cargo.lock vs toml thing at least.

Jeremy Kolb (Feb 15 2020 at 16:38, on Zulip):

Can we notify the user through progress notifications on the first slow query while everything is 'warming up' so they at least know that something is going on?
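
For reference, LSP's work-done progress mechanism (added in LSP 3.15) would cover this: the server asks the client to create a progress token and then streams `$/progress` begin/report/end notifications against it. A minimal sketch of the payload shapes follows; the token and message strings are made up, and this is not rust-analyzer's implementation:

```rust
use serde_json::{json, Value};

// Sketch of the work-done progress messages a server could emit while the
// initial analysis is warming up. How they are written to the transport is
// server-specific; the JSON shapes follow the LSP specification.
fn warmup_progress_messages() -> Vec<Value> {
    vec![
        // The server first asks the client to create a progress token.
        json!({
            "method": "window/workDoneProgress/create",
            "params": { "token": "rustAnalyzer/warmup" }
        }),
        json!({
            "method": "$/progress",
            "params": {
                "token": "rustAnalyzer/warmup",
                "value": { "kind": "begin", "title": "indexing", "percentage": 0 }
            }
        }),
        json!({
            "method": "$/progress",
            "params": {
                "token": "rustAnalyzer/warmup",
                "value": { "kind": "report", "message": "3/12 crates", "percentage": 25 }
            }
        }),
        json!({
            "method": "$/progress",
            "params": {
                "token": "rustAnalyzer/warmup",
                "value": { "kind": "end", "message": "done" }
            }
        }),
    ]
}
```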

matklad (Feb 15 2020 at 22:05, on Zulip):

We can, but I do feel that there's some low hanging fruit to make it actually faster that we should pick first. Like, an extremely basic thing to do would be to collect profile traces for this case and see if we do something stupid.

matklad (Feb 16 2020 at 17:32, on Zulip):

Pushed a bench harness for goto-def specifically: https://github.com/rust-analyzer/rust-analyzer/pull/3172

Here's a run for one of my projects, haven't investigated this yet

https://gist.github.com/matklad/1812e96746e1d2befa5858da0cc888c4

matklad (Feb 16 2020 at 17:58, on Zulip):

Looked a little bit and found an interesting high-level bit we should optimize. A lot of time is spent collecting the impls. We need to collect all impls eagerly for trait solving, that's how it works. However, our code for impls is lazy: we process each impl one by one, and, I think, we also look into impl bodies for that.

I think we need to change this to store impl headers (trait and type name, but probably not generics and where clauses) inside modules in the crate def map, and only compute impl bodies lazily. Cc @Florian Diebold, this looks like a fun find.
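
A rough sketch of that split, with made-up type names (`ImplHeader`, `CrateDefMap`, and their fields do not match rust-analyzer's real data structures): impl headers are collected eagerly when the def map is built, while bodies stay behind separate lazy queries.

```rust
// Illustrative only: the def map stores just the cheap impl "headers"
// (self type plus implemented trait), which is all trait solving needs in
// order to enumerate candidate impls. Impl bodies (items, generics,
// where-clauses) would be computed by separate, lazy queries.
struct ImplHeader {
    /// Path of the implemented trait, if any (`None` for inherent impls).
    trait_path: Option<String>,
    /// Textual name of the self type, e.g. "Foo<T>".
    self_ty: String,
}

struct ModuleData {
    /// Collected eagerly while building the crate def map.
    impl_headers: Vec<ImplHeader>,
}

struct CrateDefMap {
    modules: Vec<ModuleData>,
}

impl CrateDefMap {
    /// Trait solving can enumerate candidate impls from headers alone,
    /// without ever touching impl bodies.
    fn impls_of_trait<'a>(
        &'a self,
        trait_path: &'a str,
    ) -> impl Iterator<Item = &'a ImplHeader> {
        self.modules
            .iter()
            .flat_map(|m| m.impl_headers.iter())
            .filter(move |h| h.trait_path.as_deref() == Some(trait_path))
    }
}
```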

matklad (Feb 25 2020 at 00:31, on Zulip):

@simulacrum what exactly are you using with vim? There are many LSP client implementations for vim, and I wonder if this somehow relates to that as well...

simulacrum (Feb 25 2020 at 00:32, on Zulip):

ah, coc with vim 8, I think

simulacrum (Feb 25 2020 at 00:48, on Zulip):

@matklad in theory I can try to run some experiments or so if it would be helpful

Last update: Feb 25 2020 at 02:10 UTC