Ensure frozen snapshots don't return less data if solution already has compilation data #75796
Conversation
// literal they're hovering over.
// Partial semantics should always be sufficient because the (unconverted) type
// of a literal can always easily be determined.
var (_, semanticModel) = await document.GetFullOrPartialSemanticModelAsync(cancellationToken).ConfigureAwait(false);
removed this extension as the main WithFrozenPartialSemantics extension has these semantics now.
// Make sure the new document is frozen before we try to get the semantic model. This is to avoid triggering source
// generators, which is expensive and not needed for calculating the change. Pass in 'forceFreeze: true' to
// ensure all further transformations we make do not run generators either.
document = document.WithSyntaxRoot(annotatedRoot).WithFrozenPartialSemantics(forceFreeze: true, cancellationToken);
here's an example of where we want 'forceFreeze: true'. We are making several transformation passes, and we don't want generators running between each phase.
{
    if (!forceFreeze && this.Project.TryGetCompilation(out _))
possibly silly question - is there an issue where the caller might now unexpectedly cause source generators in other projects to run if the solution is not 'frozen' (b/c this project had a compilation, but others did not)?
Or does the concept of frozen-ness already only apply to this project/compilation?
The latter. And we doc it as such in the doc comments.
Specifically: /// when getting its compilation. However, all other projects will still run generators when their compilations are
@@ -487,18 +487,35 @@ public ImmutableArray<DocumentId> GetLinkedDocumentIds()
    return filteredDocumentIds.Remove(this.Id);
}

/// <inheritdoc cref="WithFrozenPartialSemantics(bool, CancellationToken)"/>
internal Document WithFrozenPartialSemantics(CancellationToken cancellationToken)
note: in a followup i'm renaming this to WithFullOrFrozenSemantics. But that would hit several dozen files, so i'm not touching that here.
/// Returns the semantic model for this document that may be produced from partial semantics. The semantic model
/// is only guaranteed to contain the syntax tree for <paramref name="document"/> and nothing else.
/// </summary>
public static async Task<(Document document, SemanticModel? semanticModel)> GetFullOrPartialSemanticModelAsync(this Document document, CancellationToken cancellationToken)
removed. WithFrozenPartialSemantics has these semantics by default now.
var partialProject = project.Documents.Single().WithFrozenPartialSemantics(forceFreeze, CancellationToken.None).Project;

// If we're forcing the freeze, we must get a different project instance. If we're not, we'll get the same
Assert.NotSame(partialProject, project);
@CyrusNajmabadi when you feel this is ready, would you please give me a 17.12 port?
@arunchndr that is: #75803
Fixes an issue we're seeing with classification sometimes 'flickering' while people scroll within a text buffer.
The core issue relates to how two disparate systems interact with each other. The first is tagging, and the second is 'frozen partial caching'.
With tagging, we always perform two passes. An initial pass where we use frozen-partial (i.e. 'incomplete') semantics, and a second pass where we compute with full semantics. We do things this way as computing full semantics can be extraordinarily expensive (sometimes seconds or more, depending on how expensive things like generators are). For an experience we want to run in milliseconds, this is unacceptable. So tagging will continually try to quickly tag with incomplete semantics, while always finishing with a full-semantics tagging pass.
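In sketch form, the two-pass shape looks roughly like this (hedged: `ProduceTagsAsync` is an illustrative stand-in; the real tagging infrastructure is considerably more involved):

```csharp
// Sketch only: real tagging runs through Roslyn's async tagger infrastructure.
async Task TagInTwoPassesAsync(Document document, CancellationToken cancellationToken)
{
    // Pass 1: frozen-partial ('incomplete') semantics, so tags show up in milliseconds.
    var frozenDocument = document.WithFrozenPartialSemantics(cancellationToken);
    await ProduceTagsAsync(frozenDocument, cancellationToken);

    // Pass 2: full semantics, which may take seconds if generators/skeletons are expensive.
    await ProduceTagsAsync(document, cancellationToken);
}
```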
The problem then occurs while scrolling. Consider the following sequence of events:
The issue happens right at the end. In step '2', when we compute the frozen partial snapshot for this document, we cache it so that any other feature that needs it will get that snapshot back, including any semantics it has cached. However, when a scroll happens, there are no real changes to our workspace, so we continue using the same Solution snapshots (and the cached frozen partial solutions they are holding onto). This means that in step '5' we end up producing tags against the cached frozen partial snapshot, producing the same results we produced in step '2'. This can cause classification to disappear for a symbol, only to return when the next full tag pass happens.
The fix here is to tweak how frozen-partial requests happen. Specifically, instead of always forking to produce a frozen partial solution, we will not take that step in the case where full compilation info has already been computed and cached in the solution for that document's project. In this case, since the compilation is already computed and available, we have already run generators/skeletons, and there's no perf concern. This allows avoiding a fork and allows greater reuse of symbol information from that compilation.
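The decision mirrors the check visible in the diff above. A minimal sketch, assuming a hypothetical `GetFrozenPartialDocument` helper for the fallback path:

```csharp
internal Document WithFrozenPartialSemantics(bool forceFreeze, CancellationToken cancellationToken)
{
    // If the full compilation is already computed and cached, generators/skeletons
    // have already run. Returning this document as-is avoids a fork and lets
    // callers reuse symbol information from that existing compilation.
    if (!forceFreeze && this.Project.TryGetCompilation(out _))
        return this;

    // Otherwise, fork to a frozen-partial snapshot as before (illustrative name).
    return this.GetFrozenPartialDocument(cancellationToken);
}
```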
This PR makes that the default behavior for features asking for frozen-partial semantics. However, there is a wrinkle. We don't always want to return the existing solution if compilation data is ready for it. Getting a frozen-partial snapshot also ensures that any forks of that snapshot are frozen-partial as well. This is good for features which do some up-front analysis, and then want to make a series of complex transformations (and don't care if generators run after each step). To continue supporting this scenario, callers can request that the frozen-partial fork always happen, even if compilation information is available.
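That multi-pass scenario, which 'forceFreeze: true' exists to support, looks roughly like this (hedged sketch; the rewrite-pass helpers are illustrative):

```csharp
// Freeze once up front. Because forceFreeze is true, every subsequent fork of
// this document stays frozen, so generators never run between passes.
document = document.WithSyntaxRoot(annotatedRoot)
    .WithFrozenPartialSemantics(forceFreeze: true, cancellationToken);

// Hypothetical transformation passes; each WithSyntaxRoot fork inherits frozen-ness.
var root = await document.GetSyntaxRootAsync(cancellationToken).ConfigureAwait(false);
document = document.WithSyntaxRoot(FirstRewritePass(root!));

root = await document.GetSyntaxRootAsync(cancellationToken).ConfigureAwait(false);
document = document.WithSyntaxRoot(SecondRewritePass(root!));
```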
--
Note: i have a followup PR where i do some renaming to make everything clearer. However, that greatly increases the size of this PR (which we want to backport to 17.12). So I'm holding off on that.