Description
I have created a few large LightGBM models (250 MB+ when saved as .txt), and when I run dump_model() on them in Python it raises:
"JSONDecodeError: Expecting ',' delimiter"
Reproducible example
import lightgbm as lgb

# load the large saved model, then dump it to a dict; this is where the error occurs
lgb_model = lgb.Booster(model_file="VeryLargeModel.txt")
lgb_model.dump_model()
Environment info
python == 3.9.7
lightgbm == 3.3.2
Additional Comments
I have experienced this issue for several months, and my colleagues run into it as well. I'm not certain whether it's a LightGBM issue or a JSON issue more generally. I've tried a few fixes from Stack Overflow for similar (but not identical) problems, and none of them work. The models themselves are completely fine: I use them daily with no issues during training, prediction, etc.
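In case it helps with triage, here is a minimal diagnostic sketch. It assumes the exception propagates out of dump_model() as a standard json.JSONDecodeError (which is what the traceback suggests), so its .doc and .pos attributes can show the text around the parse failure:

import json
import lightgbm as lgb

lgb_model = lgb.Booster(model_file="VeryLargeModel.txt")
try:
    lgb_model.dump_model()
except json.JSONDecodeError as e:
    # e.doc holds the full JSON string and e.pos the character offset
    # where parsing failed; print a window around it to see what is malformed
    print(e.msg, "at position", e.pos)
    print(e.doc[max(0, e.pos - 80):e.pos + 80])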
Can you please either provide the model file where you've observed this issue, or sample code using publicly-available data (e.g., the datasets from sklearn.datasets) that reproduces it?
Without those specifics, it'll be difficult for maintainers here to work on this problem.
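For illustration, something along the following lines would be ideal (a sketch only; the parameter choices are hypothetical and not confirmed to produce a model large enough to trigger the failure):

import lightgbm as lgb
from sklearn.datasets import make_regression

# synthetic regression data from sklearn.datasets
X, y = make_regression(n_samples=50_000, n_features=100, random_state=42)
train_set = lgb.Dataset(X, label=y)

# many boosting rounds with wide trees, aiming for a saved model in the 250 MB+ range
params = {"objective": "regression", "num_leaves": 255, "verbosity": -1}
booster = lgb.train(params, train_set, num_boost_round=5_000)

booster.save_model("large_model.txt")
lgb.Booster(model_file="large_model.txt").dump_model()  # expected to raise JSONDecodeError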
Excellent, thanks for that! I or another maintainer will try to investigate the issue at some point, but I can't make any guarantees about how soon that will be.
Linking this related (although not identical) issue: #3858