
Run constraint.GetAllConstrainedEntities after timer #337

Merged
FPtje merged 4 commits into FPtje:master on Jul 25, 2024

Conversation

wrefgtzweve (Contributor)

Currently, every removed constraint triggers constraint.GetAllConstrainedEntities, which is a pretty expensive function to run. Since its output is only used when ent1 and ent2 are valid, it can just be run after that check, inside the timer. This saves a lot of lag when a large dupe is removed.

Before: [profiling screenshot]

After: [profiling screenshot]
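
For illustration, the change amounts to something like the following GLua sketch. Only handleConstraintRemoved, constraint.GetAllConstrainedEntities and FPP.RecalculateConstrainedEntities are taken from the code under discussion; the timer call, the zero-second delay and the way ent1/ent2 are obtained are assumptions, not the actual diff.

-- Sketch only: defer the expensive call until the timer fires and the two
-- constrained entities have been validated.
local function handleConstraintRemoved(ent)
    local ent1, ent2 = ent:GetConstrainedEntities() -- assumed accessor for the constraint's entities

    timer.Simple(0, function() -- assumed delay; the real code uses its own timer
        if not IsValid(ent1) or not IsValid(ent2) then return end

        -- Only pay for the contraption traversal when the result is actually used.
        local constrainedEnts = constraint.GetAllConstrainedEntities(ent1)
        FPP.RecalculateConstrainedEntities(player.GetAll(), constrainedEnts)
    end)
end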

FPtje (Owner) commented on Jul 20, 2024

Thanks for the PR! I wonder if this is still correct. The removed comment notes that the to-be-removed constraint is still there when the constrained entities are fetched. After the timer, that would no longer be the case. That means the call to get constrained entities would return fewer entities, especially when only the constraint was removed, and not the entities.

What are your thoughts on this?

wrefgtzweve (Contributor, Author)

I didn't notice any weird behavior, but I do see the problem: as it stands, it doesn't properly account for ent2's constraints.
Maybe running constraint.GetAllConstrainedEntities and FPP.RecalculateConstrainedEntities for both ent1 and ent2 would be better:

local constrainedEnts1 = constraint.GetAllConstrainedEntities(ent1)
FPP.RecalculateConstrainedEntities(player.GetAll(), constrainedEnts1)

local constrainedEnts2 = constraint.GetAllConstrainedEntities(ent2)
FPP.RecalculateConstrainedEntities(player.GetAll(), constrainedEnts2)
FPtje (Owner)


I think it's quite possible for constrainedEnts1 and constrainedEnts2 to overlap, especially if a different constraint somewhere else keeps the whole contraption together.

You could build the union of the two tables, but I have a different idea (see next post)

@@ -516,10 +520,7 @@ local function handleConstraintRemoved(ent)
entMem[ent1] = true
entMem[ent2] = true
FPtje (Owner) commented on Jul 21, 2024


What if we used this entMem differently: instead of storing just the two entities, what about storing ALL the constrained entities in it?

That way, instead of calling the expensive function once (or twice) per unique pair of entities, it would call it once per contraption.

I got this idea from a quick look. Would it work?
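
As a rough illustration of that idea (the entity accessor is an assumption, and the existing timer and the clearing of entMem are omitted for brevity):

-- Sketch: remember every entity of the contraption in entMem, so later
-- removed constraints from the same contraption skip the expensive call.
local function handleConstraintRemoved(ent)
    local ent1, ent2 = ent:GetConstrainedEntities() -- assumed accessor
    if entMem[ent1] or entMem[ent2] then return end -- contraption already handled

    local constrainedEnts = constraint.GetAllConstrainedEntities(ent1)
    for _, e in pairs(constrainedEnts) do
        entMem[e] = true
    end

    FPP.RecalculateConstrainedEntities(player.GetAll(), constrainedEnts)
end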

wrefgtzweve (Contributor, Author) commented on Jul 24, 2024


I don't think this would work, as entMem is checked before the timer is created.
The timer here seems buggy, though: it creates a single named timer that gets overwritten each time the function runs, likely dropping a lot of the constraintRemovedTimer calls that should be made.

The current code has pretty much no lag impact even when calling it twice (tested with some workshop dupes with 300+ constraints and 300+ props), so performance should be fine as far as I can see. I did merge the tables to avoid the double call.
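
For reference, merging the two result tables before a single recalculation call could look like this minimal sketch (variable names follow the earlier snippet; this is not the actual merged code):

-- Union of both result sets, so overlapping entities are only processed once.
local constrainedEnts = constraint.GetAllConstrainedEntities(ent1)
for k, v in pairs(constraint.GetAllConstrainedEntities(ent2)) do
    constrainedEnts[k] = v
end
FPP.RecalculateConstrainedEntities(player.GetAll(), constrainedEnts)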

FPtje (Owner)


The profile says more than intuition. Thanks for taking a look 👍

FPtje merged commit cfd4014 into FPtje:master on Jul 25, 2024. 1 check passed.
FPtje (Owner) commented on Jul 25, 2024

Thanks for the performance improvement!
