In the current state of the package, there are two functions that take considerable time to execute:
1. Reading a tracking data file into the right dataframe format
2. Synchronising the tracking and event data, more specifically running the Needleman-Wunsch alignment.
For this issue I would like to challenge you to make the code for reading tracking data (independent of which provider/type) reasonably faster (~30%)!
Let's also start a discussion on ideas to make it faster; one possible direction is sketched below. For now, the Tracab .dat tracking data loader is the fastest, because that is the one I use the most. But improvements for other providers are also more than welcome!
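
One direction that often speeds up loaders like this: collect all rows in plain Python lists and build the DataFrame once at the end, instead of growing a DataFrame frame by frame. A minimal sketch, assuming a hypothetical Tracab-style .dat layout (frame id, then semicolon-separated player chunks of `team,target_id,jersey,x,y,speed`, then ball data); the exact column order in real files may differ:

```python
import pandas as pd

def read_tracab_dat_fast(path: str) -> pd.DataFrame:
    """Parse a Tracab-style .dat file into one long dataframe.

    Assumes (hypothetically) each line looks like:
        frame_id:team,target_id,jersey,x,y,speed;...;:ball_x,ball_y,...;:
    Rows are accumulated in a plain list and the DataFrame is
    constructed once, which is much cheaper than per-frame appends.
    """
    rows = []
    with open(path, "r") as f:
        for line in f:
            # Split off the frame id and the player section; ignore the ball part here.
            frame_part, players_part, _ = line.rstrip("\n").split(":", 2)
            frame_id = int(frame_part)
            for chunk in players_part.split(";"):
                if not chunk:
                    continue
                team, target_id, jersey, x, y, speed = chunk.split(",")
                rows.append(
                    (frame_id, int(team), int(target_id), int(jersey),
                     float(x), float(y), float(speed))
                )
    return pd.DataFrame(
        rows,
        columns=["frame", "team", "target_id", "jersey", "x", "y", "speed"],
    )
```

This is just a sketch of the idea, not a drop-in replacement for the current loader; converting the hot loop to vectorized string operations (or `numpy.loadtxt`-style parsing) could push it further.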