
Support for large data #9

Open
ghost opened this issue Feb 14, 2019 · 1 comment

ghost commented Feb 14, 2019

Hi,
when I try to encode large JSON data, I get this error: InvalidArgumentException with message 'Data too large'. Is it possible to increase the maximum data size? Thanks in advance.

z38 (Owner) commented Feb 15, 2019

Aztec codes can contain up to about 1914 bytes (which includes error correction data). Depending on your application, you might be able to encode more data if you lower the ECC rate (second parameter of Encoder::encode).
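For reference, lowering the ECC rate might look like the following sketch. The `Aztec\Encoder\Encoder` namespace, the default ECC percentage, and the exact exception behavior are assumptions here, not confirmed by this thread; check the library's README for the actual class path and defaults.

```php
<?php

require 'vendor/autoload.php';

use Aztec\Encoder\Encoder; // namespace assumed; verify against the library's README

$json = json_encode(['example' => str_repeat('x', 1000)]);

try {
    // A lower ECC percentage (second parameter, e.g. 10 instead of the
    // encoder's default) leaves more of the symbol's capacity for payload
    // data, at the cost of weaker error correction.
    $code = Encoder::encode($json, 10);
} catch (\InvalidArgumentException $e) {
    // Still too large: the payload would need to be split across multiple
    // codes or compressed before encoding.
    echo $e->getMessage(), PHP_EOL;
}
```

Since the ~1914-byte ceiling is a hard limit of the Aztec symbology itself, lowering the ECC rate only helps up to that point; beyond it, splitting or compressing the data is the only option.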
