Different ply returned by same position #41
mcostalba added a commit that referenced this issue on Feb 5, 2017:
Introduces inconsistencies in the search (see #41). Fixing them requires even more tweaks and logic, so just revert to the simple convention used before. Now 'ply' is no longer what players may expect, but at least it is consistent across all rules and conditions. It is up to the GUI to 'fix' it; indeed, it is not even a value visible to the user, and it is assumed to be consumed by the GUI.
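The "GUI fixes it" idea above can be sketched concretely. A minimal Python example, assuming the usual convention that ply 1 is White's first half-move (the helper name and the convention are assumptions for illustration, not part of scoutfish):

```python
def ply_to_move(ply: int) -> tuple[int, str]:
    """Convert a 1-based ply index into (move number, side to move).

    Assumes ply 1 is White's first move, ply 2 is Black's reply, and
    so on. A GUI consuming scoutfish's raw ply values would apply a
    mapping like this before displaying results to the user.
    """
    move_number = (ply + 1) // 2
    side = "white" if ply % 2 == 1 else "black"
    return move_number, side
```

With this mapping, ply 3 displays as White's second move, regardless of which internal convention the engine settles on.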
gbtami added a commit to pychess/scoutfish that referenced this issue on Feb 7, 2017:
Use the nth-ply move from the .pgn in the nth-ply position in move-related rules.
Created a patch with the correct move_rule logic as pull request #42 to fix this.
Updated the patch to current master and fixed it to pass all unit tests as well.
@mcostalba can I do anything else to get pull request #42 applied? I think this is the correct way to really fix issue #33 as well.
gbtami referenced this issue in rozim/scoutfish on Dec 5, 2017:
e.g. QQ vs Q+RR. My understanding is that "material" and "material-imbalance" don't handle this.
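The QQ vs Q+RR example can be made concrete with conventional piece values (an assumption for illustration; scoutfish's material rules match piece sets rather than point totals):

```python
# Conventional piece point values (an assumption for illustration).
PIECE_VALUES = {"Q": 9, "R": 5, "B": 3, "N": 3, "P": 1}

def material_points(pieces: str) -> int:
    """Sum conventional point values for a string of piece letters.

    Illustrates why QQ vs Q+RR is a near-equal material trade in
    points, even though the exact piece sets differ.
    """
    return sum(PIECE_VALUES[p] for p in pieces)
```

Here `material_points("QQ")` is 18 and `material_points("QRR")` is 19: nearly balanced in points, yet distinct piece sets, which is why a rule matching exact material would treat them as different.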
As a test case, take the position after 1. e4.
The value of m2[0] is wrong: it should report ply [1], since the position is the same, and in any case the sub-fen is matched at ply 1, not at ply 2.
The error arises because move_rule is not reset after a single condition matches, but only at the end of the game. Moreover, even if we correct the reset, we would end up with repeated plies in the matches output, something like:
{u'ofs': 666, u'ply': [1, 1]}
So we would need additional logic to remove the duplicates before outputting the result.
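The extra deduplication step described above could look like the following. This is a minimal sketch, assuming match records shaped like scoutfish's JSON output; the helper name is hypothetical:

```python
def dedup_plies(match: dict) -> dict:
    """Remove duplicate ply entries from one match record.

    'match' mirrors the JSON shape shown above, e.g.
    {'ofs': 666, 'ply': [1, 1]}. Order of the remaining
    plies is preserved.
    """
    seen = set()
    unique = []
    for p in match["ply"]:
        if p not in seen:
            seen.add(p)
            unique.append(p)
    return {**match, "ply": unique}
```

Running this over each match before emitting results would turn the duplicated output above into {'ofs': 666, 'ply': [1]}.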