Errors while removing background on deployed server #106

Open
Rolstenhouse opened this issue Feb 23, 2024 · 11 comments
Comments

@Rolstenhouse

When I attempt to remove the background from a file on my server, I get one of the following errors:

corrupted size vs. prev_size
OR
free(): invalid size
OR
munmap_chunk(): invalid pointer

I haven't been able to identify what triggers which error, but it feels like it might be an issue with the ML model (I'm using the small one).

Docker host: 20.10.12 linux x86_64
Node version: Node.js v21.6.2
Package version: 1.4.4

@Rolstenhouse Rolstenhouse changed the title Issue with background-removal-js-node Errors while removing background on deployed server Feb 23, 2024
@LevwTech

Same issue

@DanielHauschildt
Contributor

Would you be open to sharing a minimal example? I cannot reproduce it.

@Rolstenhouse
Author

Rolstenhouse commented Feb 27, 2024

Sure, here's some more context.

Snippet

    mediaUrl = "https://api.twilio.com/2010-04-01/Accounts/ACfbfe2e1e70ce74b02a4151bf91b23693/Messages/MM3fa6329883117973ec3cd7b180c6caca/Media/ME76f45b7483238aac2516ab5429c5018a";
    try {
      ort.env.debug = true;
      ort.env.logLevel = "warning";

      logger.info("Removing background for image", { mediaUrl });
      const localPath = `file://${process.cwd()}/public/imgly/`;
      logger.info("localPath", { localPath });
      const blob: Blob = await removeBackground(mediaUrl, {
        publicPath:
          process.env.NODE_ENV === "production"
            ? "file:///myapp/public/imgly/"
            : localPath,
        // publicPath: "https://stickerfy.xyz/imgly/",
        debug: true,
        model: "small",
        progress: (key, current, total) => {
          logger.warn(`Downloading ${key}: ${current} of ${total}`);
        },
      });
      buffer = Buffer.from(await blob.arrayBuffer());
    } catch (error) {
      logger.error("Error while removing background for image", {
        mediaUrl,
        error,
        errorMessage: error.message,
        errorStack: error.stack,
        errorName: error.name,
      });
    }

    // Write the buffer to S3
    if (buffer) {
      // Upload to S3
      logger.info("Uploading image to S3", {
        info: {
          key: mediaSid!,
          contentType: "image/png",
          userId: user?.autoId || 0,
          buffer: buffer.length,
        },
      });
      backgroundRemovedImage = await uploadImageToS3({
        key: mediaSid!,
        buffer,
        contentType: "image/png",
        userId: user?.autoId || 0,
      });
    }

Here's a screenshot of the logs (and a CSV with the log output):
[screenshot of logs]

Also note: this snippet uses the local file path, but I also ran into the issue when referencing the hosted model.

The deployed server is running on fly.io, by the way (not sure if that might be an issue).

extract-2024-02-27T00_49_29.879Z.csv

@LevwTech

LevwTech commented Feb 27, 2024

My server is running on DigitalOcean with the same issue.
Droplet info:
Ubuntu 23.10 x64
Node 20
No gpu

@n3m3s7s

n3m3s7s commented Mar 7, 2024

Hi,
I get the same error on WSL2 (Ubuntu).

Could this be related to "onnxruntime-node" or WASM, and the fact that TensorFlow and the model need a GPU that isn't available on a server or in a remote environment?

I noticed that in the source, the function:

async function createOnnxSession(model, config) {
  if (config.debug) {
    ort.env.debug = true;
    ort.env.logLevel = "verbose";
    console.debug("ort.env.wasm:", ort.env.wasm);
  }
  // ...
}

on my WSL2 environment actually prints an empty object to the console:

fetch /models/medium 100%
ort.env.wasm: {}
free(): invalid size

Thanks!

@DanielHauschildt
Contributor

onnxruntime-node should work without a GPU.
ort.env.wasm looks wrong, but the Node version does not yet support the WASM backend, so it seems OK that it's empty.

I have no access to such a machine at the moment, so unfortunately I cannot reproduce the error.
Also, I have no idea what the cause is.

@Rolstenhouse
Author

Thanks for looking into it. For other devs who might encounter this: I switched to a different package, rembg on Replicate, and just paid the small out-of-pocket cost.

@jemeetala

npm ERR! code 1
npm ERR! path /home1/freeback/nodevenv/imageremove/18/lib/node_modules/onnxruntime-node
npm ERR! command failed
npm ERR! command sh -c node ./script/install
npm ERR! Downloading "https://github.com/microsoft/onnxruntime/releases/download/v1.17.3/onnxruntime-linux-x64-gpu-1.17.3.tgz"...
npm ERR! node:internal/deps/undici/undici:7534
npm ERR! return await WebAssembly.instantiate(mod, {
npm ERR! ^
npm ERR!
npm ERR! RangeError: WebAssembly.instantiate(): Out of memory: wasm memory
npm ERR! at lazyllhttp (node:internal/deps/undici/undici:7534:32)
npm ERR!
npm ERR! Node.js v18.18.2
npm ERR! A complete log of this run can be found in: /home1/freeback/.npm/_logs/2024-08-27T13_49_31_015Z-debug-0.log

I'm getting this error on cPanel; it works fine locally. Can anyone help me out?

@Das-Felix

Hello, I am having the same issue. I use the background remover locally without any problems. When I deploy my code to Docker I get: munmap_chunk(): invalid pointer

Docker version: 27.2.1
Node: node:22-bookworm-slim image
Package version: 1.4.5

Is there a fix for this? Thanks.

@Songthamt

Getting the same issue... I use the background remover locally without any problems. When I deploy my code to Docker I get free(): invalid size.

This is frustrating because the npm package is too big for a Lambda layer, and even when using a container it doesn't work.

@Das-Felix

I made a simple removeBackground API endpoint with Express as a test, and it runs without any problems with the same Node image on Docker. So maybe it's conflicting with some other package I use in my main project.

Anyway, I use this as a workaround now. Here is the code, if you are interested:

import { removeBackground } from "@imgly/background-removal-node";
import express from "express";
import multer from "multer";
import fs from "fs/promises";
import path from "path";

const app = express();
const port = 3838;
const apiKey = process.env.REMOVE_BG_API_KEY;

const upload = multer({ dest: 'uploads/' });

app.post('/removeBackground', upload.single('image'), async (req, res) => {
    if (!apiKey) {
        return res.status(500).send('API key is not set');
    }

    if (req.body.apiKey !== apiKey) {
        return res.status(401).send('Unauthorized');
    }

    if (!req.file) {
        return res.status(400).send('No image uploaded');
    }

    try {
        const imagePath = req.file.path;
        const outputImagePath = path.join('uploads', `removed-background-${Date.now()}.png`);

        const imageBuffer = await fs.readFile(imagePath);
        const imageBlob = new Blob([imageBuffer], { type: 'image/png' });

        const removedBackground = await removeBackground(imageBlob);
        const arrayBuffer = await removedBackground.arrayBuffer();
        const removedBackgroundBuffer = Buffer.from(arrayBuffer);
        await fs.writeFile(outputImagePath, removedBackgroundBuffer);

        // Clean up only after the response has finished, so the file
        // isn't deleted while it is still being streamed to the client.
        res.sendFile(outputImagePath, { root: './' }, async () => {
            await fs.unlink(imagePath);
            await fs.unlink(outputImagePath);
        });
    } catch (error) {
        console.error(error);
        res.status(500).send('Error processing image');
    }
});

app.listen(port, () => {
    console.log(`Server is running on port ${port}`);
});
And the Dockerfile:

FROM node:22-bookworm-slim
WORKDIR /usr/src/app

COPY package*.json ./

RUN npm install

COPY . .


EXPOSE 3838

CMD ["node", "server.js"]
