The ASSISTment 09-10 dataset has a field `order_id`, which the official website explains as: "these id's are chronological, and refer to the id of the original problem log."

So when processing this dataset, shouldn't we sort by `order_id` after grouping by `user_id`? Otherwise the chronological order of each user's answers is destroyed. Although the preprocessing step constructs a timestamp, it cannot fully guarantee the order in which a user answered questions. After sorting each user's problems by `order_id`, the program produces different results. A minimal sketch of the suggested fix is shown below.
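To illustrate, here is a minimal pandas sketch of the proposed ordering. The file name and column names (`skill_builder_data.csv`, `user_id`, `order_id`, `skill_id`, `correct`) follow the public ASSISTments 09-10 schema and are assumptions; the repository's actual preprocessing script may use different identifiers:

```python
import pandas as pd

# Assumed file/column names from the public ASSISTments 09-10 schema.
df = pd.read_csv('skill_builder_data.csv', encoding='latin-1')

# Sort by user first, then by order_id within each user, so every
# user's interactions follow the original chronological problem log.
df = df.sort_values(['user_id', 'order_id']).reset_index(drop=True)

# Grouping afterwards preserves the per-user order established above.
for user, seq in df.groupby('user_id', sort=False):
    skills = seq['skill_id'].tolist()
    corrects = seq['correct'].tolist()
```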
This is probably a real issue. We did not notice the `order_id` field and assumed the original order in the dataset was already chronological. Maybe we should rerun the experiments on this dataset.
1. We reran the experiments on this dataset and obtained the following results:
2. The ASSISTment 2012 dataset is sorted by `timestamp = (int(start) + int(end)) // 2`, which can break the temporal ordering of the data. For example, one record with start = 1, end = 7 gets timestamp = 4, while another with start = 2, end = 4 gets timestamp = 3, so the record that started later sorts first. Wouldn't it make more sense to sort the dataset by start_time or end_time? @THUwangcy (A small sketch reproducing this is below.)
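A small self-contained pandas sketch of the example above, showing how midpoint timestamps can invert an order that sorting by start time preserves. The column names here are illustrative, not necessarily the repository's:

```python
import pandas as pd

# Toy records reproducing the example above: the second interaction
# starts later (start=2) but gets an earlier midpoint timestamp (3 < 4).
logs = pd.DataFrame({'user_id': [1, 1],
                     'start_time': [1, 2],
                     'end_time': [7, 4]})
logs['timestamp'] = (logs['start_time'] + logs['end_time']) // 2

by_midpoint = logs.sort_values(['user_id', 'timestamp'])
by_start = logs.sort_values(['user_id', 'start_time'])

print(by_midpoint['start_time'].tolist())  # [2, 1] -- order inverted
print(by_start['start_time'].tolist())     # [1, 2] -- original order kept
```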