
change modelBuffer size in tflite.cc does not work #41

Open · carter54 opened this issue Mar 9, 2022 · 4 comments

carter54 commented Mar 9, 2022

Hi @Volcomix, thanks for the nice project.

I tried to build the tflite and tflite-simd WASM with your code; the only thing I changed is the size of modelBuffer here, as I want to try a float32 model.

I modified it to char modelBuffer[1024 * 1024];. Thanks to your Dockerfile, I was able to rebuild the tflite-simd WASM successfully.
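
For reference, this is the only edit I made, just the buffer declaration (roughly):

```cpp
// tflite/tflite.cc -- statically allocated buffer the model bytes are copied into
// before _loadModel runs. Enlarged to 1 MB so it can hold the ~485 KB float32 model.
char modelBuffer[1024 * 1024];
```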

However, when I apply the model, this error appears:

Uncaught (in promise) RuntimeError: memory access out of bounds
    at :3001/tflite/tflite-simd.wasm
    at :3001/tflite/tflite-simd.wasm
    at :3001/tflite/tflite-simd.wasm
    at Object.Module._loadModel (tflite-simd.js:9:14734)
    at loadTFLiteModel (bundle.js:558:21)
    at async getModel (bundle.js:39:17)

My model size is

[WASM] Loading model of size: 496468

which is much smaller than the modelBuffer size I set, 1024 * 1024 (= 1048576).

Did I make any mistake or miss something?

Volcomix (Owner) commented:

Hi @carter54,

The change you described looks fine to me.

When stress testing the app by changing the model to load many times, I have seen the same exception in a very few cases, and I'm wondering if there could be something wrong that is unrelated to the model size. However, this is very hard to investigate because I can't reproduce it often. Maybe your issue is something completely different, but it would probably be easier if we manage to tackle your case.
Is there any chance you could share your model so I can reproduce your issue and try to debug it?

Volcomix (Owner) commented Mar 12, 2022

Maybe one thing you could try, though, is to add a printf after loading the model, i.e. right after this block: https://github.com/volcomix/virtual-background/blob/main/tflite/tflite.cc#L67-L70

Maybe the memory issue doesn't happen when loading the model into memory but rather when building the Interpreter and allocating memory for all the tensors (which could require more memory for a float32 model). I'd expect Emscripten to make the memory grow in this situation, but maybe that's the issue.
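
Something like this, just as a rough sketch (the actual names and structure in tflite.cc may differ a bit, and the error handling is simplified):

```cpp
// Rough sketch of the model loading path with printf checkpoints added,
// to narrow down where the out-of-bounds access happens.
#include <cstdio>
#include <memory>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

char modelBuffer[1024 * 1024];

std::unique_ptr<tflite::FlatBufferModel> model;
std::unique_ptr<tflite::Interpreter> interpreter;

extern "C" int loadModel(int bufferSize) {
  model = tflite::FlatBufferModel::BuildFromBuffer(modelBuffer, bufferSize);
  printf("[WASM] FlatBufferModel built\n");  // checkpoint right after the linked block

  tflite::ops::builtin::BuiltinOpResolver resolver;
  tflite::InterpreterBuilder builder(*model, resolver);
  builder(&interpreter);
  printf("[WASM] Interpreter built\n");

  // Allocating the tensors is where a float32 model needs noticeably more memory,
  // so this is my main suspect.
  if (interpreter->AllocateTensors() != kTfLiteOk) {
    printf("[WASM] AllocateTensors failed\n");
    return -1;
  }
  printf("[WASM] Tensors allocated\n");
  return 0;
}
```

If the first printf shows up but the later ones don't, the crash happens while building the interpreter or allocating tensors rather than while copying the model into modelBuffer.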

Volcomix (Owner) commented:

You can also try adding "-s ASSERTIONS=1" to the linking options to get some memory debugging info:
https://github.com/volcomix/virtual-background/blob/main/tflite/BUILD#L28-L32
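
Roughly like this, as a hypothetical excerpt of tflite/BUILD (the existing flags stay as they are in the repo; only the ASSERTIONS flag is the addition):

```starlark
linkopts = [
    # ...existing Emscripten link flags unchanged...
    "-s ASSERTIONS=1",  # enable runtime checks, incl. more informative memory errors
],
```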

carter54 (Author) commented:

@Volcomix Thanks for the reply. The model I used is from @PINTO0309 and can be downloaded from this link: saved_model_openvino/model_float32.tflite.
I will also try your suggestions and see if they work. Thank you~
