This community package contains a node for working with BPE tokens, such as those used under the hood by OpenAI's GPT models. In fact, this node pairs nicely with the OpenAI node.
You can:
- Encode a string into BPE tokens (handy for custom training)
- Decode an array of BPE tokens back into a string (for funzies?)
- Determine a string's token length before submitting it to the OpenAI API
- Calculate costs before submitting to the OpenAI API
- Split a text into chunks that exactly match a definable token limit
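The cost calculation boils down to simple arithmetic: count the tokens, then multiply by the provider's price per token. A minimal sketch in TypeScript, where `estimateCost` is a hypothetical helper and the price used in the example is a placeholder, not OpenAI's actual pricing:

```typescript
// Estimate the API cost of a prompt before submitting it.
// `pricePer1kTokens` is whatever your model's current rate is --
// check OpenAI's pricing page for real values.
function estimateCost(tokenCount: number, pricePer1kTokens: number): number {
  return (tokenCount / 1000) * pricePer1kTokens;
}

// e.g. a 1500-token prompt at a hypothetical $0.002 per 1K tokens
const cost = estimateCost(1500, 0.002);
```

In a workflow, you would feed the Count Tokens output into a calculation like this before deciding whether to call the OpenAI node.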
n8n is a fair-code licensed workflow automation platform.
- Supported Operations
- Installation
- Compatibility
- About
- Version History
## Supported Operations

| Operation | Description | Options |
| --- | --- | --- |
| Encode | Encodes a string into BPE tokens. Returns an array of tokens. | - |
| Decode | Decodes an array of BPE tokens back into a string. Returns a string. | - |
| Count Tokens | Counts the tokens a string produces. Returns the number of tokens. | - |
| Check Token Limit | Checks whether a given string exceeds a defined token limit. Returns a boolean. | Optional: throw an error if the token limit is exceeded. |
| Slice to Max Token Limit | Slices the string into blocks that exactly match the provided token limit. Returns an array of strings. | - |
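The Slice to Max Token Limit idea can be sketched as: encode the text, cut the token array into blocks of at most the limit, and decode each block back into a string. The node itself relies on gpt-tokenizer for encoding; the sketch below uses a toy character-level tokenizer so it stays self-contained, and `sliceToTokenLimit` is a hypothetical name, not the node's internal API:

```typescript
// A tokenizer is anything that can turn text into tokens and back.
type Tokenizer = {
  encode: (text: string) => number[];
  decode: (tokens: number[]) => string;
};

// Toy tokenizer: one token per character (a stand-in for real BPE).
const charTokenizer: Tokenizer = {
  encode: (text) => Array.from(text).map((c) => c.codePointAt(0)!),
  decode: (tokens) => tokens.map((t) => String.fromCodePoint(t)).join(""),
};

// Cut the token stream into blocks of at most `limit` tokens,
// then decode each block back into a string chunk.
function sliceToTokenLimit(text: string, limit: number, tok: Tokenizer): string[] {
  const tokens = tok.encode(text);
  const chunks: string[] = [];
  for (let i = 0; i < tokens.length; i += limit) {
    chunks.push(tok.decode(tokens.slice(i, i + limit)));
  }
  return chunks;
}

// "hello world" with a limit of 4 tokens per chunk
const parts = sliceToTokenLimit("hello world", 4, charTokenizer);
// parts: ["hell", "o wo", "rld"]
```

With a real BPE tokenizer plugged in, each chunk stays within the model's context window, which is what makes this operation useful for splitting long documents before an OpenAI call.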
## Installation

Follow the installation guide in the n8n community nodes documentation. Installing this package should also automatically install its dependency, https://www.npmjs.com/package/gpt-tokenizer, a port of the original BPE token Python library.
## Compatibility

The latest version of n8n. If you encounter any problems, feel free to open an issue on GitHub.
## About

Hi, I'm geckse and I let your work flow! 👋 I hope you enjoy these nodes. If you are in need of smooth automation, steady integration, or custom code, check out my page: https://let-the-work-flow.com
## Version History

- just polishing the npm release
- initial release