A user posted an issue in which the lifetime of a reference was unnecessarily inferred to include a yield point. Since this can impact the auto traits of the resulting state machine, it'd be ideal if this didn't happen. I was just wondering if anyone is thinking about how yield points interact with region inference. https://internals.rust-lang.org/t/async-await-and-references-in-pattern-match/9409/2
Also this is my first time using Zulip so sorry if I've not done things properly somehow. :))
hmm there's definitely something funny happening there
I want to double-check something: Is the claim here that this issue arises with and without AST-borrowck (which is non-trivial to check since one cannot run this example on the 2015 edition)?
and that the reason NLL is relevant is because one might hope that the increased precision of NLL would allow sidestepping the problem?
Or is the claim that this code works with AST-borrowck and NLL is somehow regressing it?
@boats ^ ?
I haven't investigated closely, but I don't have any reason to think it's a regression.
I find it fascinating that pulling the `f.foo()` expression out into its own local sidesteps the error. I'm more used to things going in the other direction. :smiley:
(except of course for the cases where pulling it into a local is also introducing changes to evaluation order. But I do not think that is happening here. Then again I might be misunderstanding the generator desugaring.)
Yeah, it was very surprising to me as well that that code change made it compile.
The most interesting thing to me in this thread is the design goal that could be factored into future NLL/Polonius work: reducing the number of values live across yield points. That matters both for auto traits, as in this example, and for optimizing the size of the state machine.