A package that contains core functionality for LFX UI products. It includes design tokens and PrimeTek theme configuration that is shared across LFX UI products.
The generated tokens are organized into three layers:
- Primitive Tokens: Base-level design values (colors, spacing, typography, etc.)
- Semantic Tokens: Purpose-driven tokens that reference primitive tokens
- Component Tokens: Component-specific tokens that reference semantic tokens
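The layering above can be sketched as follows. The token names here are illustrative only, not the actual tokens shipped by the package:

```typescript
// Hypothetical sketch of the three token layers and how they reference
// each other; real tokens are generated from the package's token source.
const primitive = {
  blue500: '#1d4ed8', // base-level design value
};

const semantic = {
  // purpose-driven token that references a primitive token
  colorPrimary: primitive.blue500,
};

const component = {
  // component-specific token that references a semantic token
  buttonBackground: semantic.colorPrimary,
};

console.log(component.buttonBackground); // '#1d4ed8'
```

Because each layer only references the layer below it, changing a primitive value propagates consistently through semantic and component tokens.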
```bash
npm install @linuxfoundation/lfx-ui-core
```
After installing the package, you can import and use the tokens in your application:
```typescript
import { Component } from '@angular/core';
import { PrimeNGConfig } from 'primeng/api';
import { definePreset } from 'primeng/themes';
import { Aura } from 'primeng/themes/aura';
import { lfxPreset } from '@linuxfoundation/lfx-ui-core';

// Extend the Aura base theme with the LFX primitive tokens
const customPreset = definePreset(Aura, {
  primitive: lfxPreset.primitive,
});

@Component({
  selector: 'app-root',
  templateUrl: './app.component.html',
  styleUrls: ['./app.component.css'],
})
export class AppComponent {
  constructor(private config: PrimeNGConfig) {
    this.config.theme.set({
      preset: customPreset,
      options: {
        prefix: 'p',
        darkModeSelector: '.dark-mode',
      },
    });
  }
}
```
The tokens are strongly typed, providing autocomplete support and type safety in your IDE.
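To illustrate what that buys you (using a hypothetical token shape, since the real types are generated from the package's token source), invalid token paths are rejected at compile time rather than failing at runtime:

```typescript
// Hypothetical token interface for illustration only; the actual types
// ship with the generated token files in this package.
interface PrimitiveTokens {
  blue: { 500: string; 600: string };
}

const tokens: PrimitiveTokens = {
  blue: { 500: '#1d4ed8', 600: '#1e40af' },
};

// Valid access compiles and autocompletes in the IDE:
const primary: string = tokens.blue[500];

// Invalid access is a compile-time error, not a runtime undefined:
// const oops = tokens.blue[700]; // TS error: property '700' does not exist
```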
- Node.js 20.x
- npm
- Clone the repository:

```bash
git clone https://github.com/linuxfoundation/lfx-ui
cd lfx-ui-core
```

- Install dependencies:

```bash
npm ci
```

- Build the tokens:

```bash
npm run build
```
- The source tokens are defined in `src/design/tokens/tokens.json`
- Modify the tokens file according to your needs, or update it in Figma using Tokens Studio
- Run the build script to generate updated token files:

```bash
npm run build
```
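For reference, a Tokens Studio source file is JSON in roughly this shape, with `{…}` curly-brace references linking one layer to another. The group and token names below are illustrative, not the package's actual contents:

```json
{
  "primitive": {
    "color": {
      "blue-500": { "value": "#1d4ed8", "type": "color" }
    }
  },
  "semantic": {
    "color-primary": { "value": "{primitive.color.blue-500}", "type": "color" }
  }
}
```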
- Create a new version tag following semver conventions:

```bash
git tag v1.0.0
git push origin v1.0.0
```
- The GitHub Action will automatically:
  - Build the package
  - Update the version
  - Publish to npm
- Follow semantic versioning for releases
- Update documentation when adding new token categories
- Add comments to explain complex token relationships
- Test tokens in a real application before releasing