
[Feature] Alignment over longer spans to assess priming decay #24

Open · fusaroli opened this issue May 10, 2018 · 5 comments
@fusaroli
Collaborator
Several papers have pointed to differential patterns and effects of alignment over longer spans:

  • Reitter and Moore (2017): syntactic alignment at a distance of 20 turns (but not at shorter distances) predicts task performance
  • children with ASD show patterns of immediate lexical priming similar to TD children, but different patterns over longer timecourses (Harper-Hill, Copland, & Arnott, 2014; Henderson, Clarke, & Snowling, 2011)

We should add the option to measure alignment over longer spans: not only adjacent turns, but turns at a distance of n. This would also require measuring self-alignment at increasing spans as a control.
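A minimal sketch of what pairing turns at a distance of n could look like (this is illustrative, not the ALIGN API; the function name, turn representation, and speaker labels are all assumptions):

```python
def turn_pairs_at_delay(turns, delay):
    """Yield (direction, prime, target) for turn pairs separated by `delay`.

    `turns` is a list of (speaker, utterance) tuples in temporal order.
    With delay=1 this reduces to the usual adjacent-turn pairing; larger
    delays probe longer-span (decayed) priming. In a strictly alternating
    two-party dialogue, odd delays yield other-pairs (A>B, B>A) and even
    delays yield self-pairs (A>A, B>B).
    """
    for i in range(len(turns) - delay):
        prime_speaker, prime = turns[i]
        target_speaker, target = turns[i + delay]
        direction = f"{prime_speaker}>{target_speaker}"  # e.g. "A>B" or "A>A"
        yield direction, prime, target

turns = [("A", "hi"), ("B", "hello"), ("A", "how are you"), ("B", "fine")]
for direction, prime, target in turn_pairs_at_delay(turns, delay=2):
    print(direction, "|", prime, "->", target)  # A>A and B>B self-pairs
```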

@fusaroli fusaroli self-assigned this May 10, 2018
@fusaroli
Collaborator Author

fusaroli commented May 18, 2018

The fix could be a loop around the TurnByTurnAnalysis function, running it from delay 1 to delay n and adding a delay column to the output.
As far as I can see, there is no explicit check for whether self- or other-alignment is being measured, but the direction column keeps track of that (e.g., A>B vs. A>A).
The point of adding self-alignment is that it lets us check whether temporal changes in alignment to the interlocutor over longer delays are confounded by the interlocutor repeating themselves.
Not implementing this yet, so we can keep the package as-is during the review process.
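A minimal sketch of that loop, assuming the analysis function (standing in for TurnByTurnAnalysis above) takes a transcript and a delay and returns a pandas DataFrame of alignment scores; that signature is an assumption, not the package's actual API:

```python
import pandas as pd

def alignment_over_delays(analyze_fn, transcript, max_delay):
    """Run a turn-by-turn alignment analysis at delays 1..max_delay.

    `analyze_fn(transcript, delay=...)` is assumed to return a pandas
    DataFrame of alignment scores for that delay (the real ALIGN
    signature may differ).
    """
    frames = []
    for delay in range(1, max_delay + 1):
        scores = analyze_fn(transcript, delay=delay).copy()
        scores["delay"] = delay  # tag rows so priming decay can be modeled downstream
        frames.append(scores)
    return pd.concat(frames, ignore_index=True)
```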

@jseale-asapp

The way delay is implemented now, is it the case that, if the delay is, say, 20, the alignment will be measured between the prime and the one target exactly 20 turns later?

@jseale-asapp

I'm seeing different patterns in comprehension-production and production-production priming in my data, so a self-alignment option would be great.

@fusaroli
Collaborator Author

fusaroli commented Aug 13, 2019 via email

Yes, exactly that.

@jseale-asapp

jseale-asapp commented Aug 13, 2019

Would it be of interest to create a window of turns from the delay-indicated turn to, say, the end of the dialogue?

There may be some short-term priming effects within the window that would need to be taken into consideration. For example, using 20 as the delay, if a prime construction is repeated in turn 21 and turn 22, then the probability that it is repeated in turn 22 is bolstered by its presence in turn 21. However, the larger window may still help to detect longer-term priming: if the prime-target pairs are restricted to one target per prime within the window, instead of allowing multiple targets, that would give a roughly binary indication that priming has occurred (see the sketch below).
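A rough sketch of that one-target-per-prime restriction, with turns as plain strings and substring matching standing in for real construction detection (both simplifications):

```python
def first_target_in_window(turns, construction, delay):
    """Return (prime_index, target_index) pairs where `construction`
    occurs in a prime turn and in at most one later turn at distance
    >= delay, so repetitions inside the window do not inflate counts."""
    pairs = []
    for i, prime in enumerate(turns):
        if construction not in prime:
            continue
        for j in range(i + delay, len(turns)):
            if construction in turns[j]:
                pairs.append((i, j))  # binary indication: primed or not
                break  # one target per prime; ignore later repetitions
    return pairs

turns = ["the dog that barked", "yes", "the cat that meowed", "the bird that sang"]
print(first_target_in_window(turns, "that", delay=2))  # [(0, 2)]
```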

