scan multiple demos for kills of one player #390
-
I am trying to scan about 1,200 demo files of a teammate of mine. The aim is to find his best matches to make a frag movie. When I use the example code to scan ONE demo for kills, it works fine. But when I loop through all the demo files in a folder, parseStream() trips me up. Does anybody have some advice for me? Here is my code:

const testFolderPath = '/Volumes/Backup/biggie-demos/';
const fs = require('fs');
const path = require('path');
const demofile = require("demofile");
fs.readdirSync(testFolderPath).forEach(tempFile => {
  var demoFile = new demofile.DemoFile();

  demoFile.gameEvents.on("player_death", e => {
    var victim = demoFile.entities.getByUserId(e.userid);
    var victimName = victim ? victim.name : "unnamed";
    var attacker = demoFile.entities.getByUserId(e.attacker);
    var attackerName = attacker ? attacker.name : "unnamed";
    var headshotText = e.headshot ? " HS" : "";
    console.log(`${attackerName} [${e.weapon}${headshotText}] ${victimName}`);
  });

  var tempFilePath = path.resolve(testFolderPath, tempFile);
  demoFile.parseStream(fs.createReadStream(tempFilePath));
  console.log(tempFilePath);
});

When I run the code, I get the list of all the demo files in the folder (printed by the last line of code inside the forEach loop). What I really don't understand is this: the code lists the file names of all the demo files in the folder, even though, as I understood it, parseStream() for the first demo should run to completion before the loop moves on to the next demo file. This works perfectly fine with one demo thanks to the example code, but when I put it in a loop over more demo files it doesn't work, even though I want to print the filename and the kills within ONE iteration of the loop.
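As an aside, a minimal sketch can make this ordering visible. It only reuses the demofile calls already shown above, and the .dem file name is hypothetical: parseStream() returns straight away, and the "end" event fires later, once parsing has actually finished.

const fs = require("fs");
const demofile = require("demofile");

const demoFile = new demofile.DemoFile();

// Fires asynchronously, only after the whole demo has been parsed.
demoFile.on("end", () => {
  console.log("finished parsing");
});

// Hypothetical file name, just for illustration.
demoFile.parseStream(fs.createReadStream("/Volumes/Backup/biggie-demos/example.dem"));

// Prints immediately, before "finished parsing": parseStream() does not block.
console.log("after parseStream()");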
Replies: 2 comments 2 replies
-
Please can you edit your question with details on exactly what the issue is?
-
The issue is that you are running out of memory because you are trying to read 1,200 demos at the same time (in parallel). parseStream is not a synchronous function. It will immediately return and process the stream asynchronously. You're likely better off running through the demos one by one to ensure you don't run out of memory. One way to do this would be to await a promise that resolves when the demo has finished parsing. See below for an example:
const testFolderPath = "/Volumes/Backup/biggie-demos/";
const fs = require("fs");
const path = require("path");
const demofile = require("demofile");
for (let tempFile of fs.readdirSync(testFolderPath)) {
  await new Promise((resolve, reject) => {
    var demoFile = new demofile.DemoFile();

    demoFile.on("end", e => {
      if (e.error) {
        reject(e.error);
      } else {
        resolve(null);
      }
    });

    demoFile.gameEvents.on("player_death", e => {
      var victim = demoFile.entities.getByUserId(e.userid);
      var victimName = victim ? victim.name : "unnamed";
      var attacker = demoFile.entities.getByUserId(e.attacker);
      var attackerName = attacker ? attacker.name : "unnamed";
      var headshotText = e.headshot ? " HS" : "";
      console.log(`${attackerName} [${e.weapon}${headshotText}] ${victimName}`);
    });

    var tempFilePath = path.resolve(testFolderPath, tempFile);
    demoFile.parseStream(fs.createReadStream(tempFilePath));
    console.log(tempFilePath);
  });
}

Note that top-level 'await' expressions are only allowed when the TypeScript 'module' option is set to 'es2022', 'esnext', 'system', 'node16', or 'nodenext', and the 'target' option is set to 'es2017' or higher.
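If changing those compiler options is not desirable, one common workaround is to move the await out of the top level by wrapping the loop in an async function. Below is a minimal sketch of that wrapper; the function name main is just illustrative, and the loop body simply reuses the per-demo promise from the example above.

const fs = require("fs");
const path = require("path");
const demofile = require("demofile");

const testFolderPath = "/Volumes/Backup/biggie-demos/";

// Hypothetical wrapper: moves the await out of module top level so no
// special 'module'/'target' settings are needed.
async function main() {
  for (let tempFile of fs.readdirSync(testFolderPath)) {
    // Same per-demo promise as above: resolve or reject when "end" fires.
    await new Promise((resolve, reject) => {
      const demoFile = new demofile.DemoFile();

      demoFile.on("end", e => (e.error ? reject(e.error) : resolve(null)));

      demoFile.gameEvents.on("player_death", e => {
        const victim = demoFile.entities.getByUserId(e.userid);
        const attacker = demoFile.entities.getByUserId(e.attacker);
        const headshotText = e.headshot ? " HS" : "";
        console.log(
          `${attacker ? attacker.name : "unnamed"} [${e.weapon}${headshotText}] ${victim ? victim.name : "unnamed"}`
        );
      });

      const tempFilePath = path.resolve(testFolderPath, tempFile);
      demoFile.parseStream(fs.createReadStream(tempFilePath));
      console.log(tempFilePath);
    });
  }
}

main().catch(err => console.error(err));

The demos are still parsed strictly one after another, so the memory behaviour is the same as with the top-level await version.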