
Feat/datalist runtime override #2040


Merged · 24 commits · Sep 22, 2023
23798d1
fix: duplicate flow error check
chrismclarke Aug 19, 2023
bd58a34
chore: remove deprecated code
chrismclarke Aug 19, 2023
e60c39c
refactor: app-data overrrides shared and spec
chrismclarke Aug 19, 2023
41bfc6f
refactor: separate file and console loggers, test utils
chrismclarke Aug 19, 2023
27e7366
feat: add backend support for data list override condition
chrismclarke Aug 19, 2023
d087b1e
chore: code tidying
chrismclarke Aug 19, 2023
f807f2b
test: data list overrides
chrismclarke Aug 19, 2023
84ad3e2
refactor: move test logging utils
chrismclarke Aug 19, 2023
d84f6e2
Merge branch 'fix/sync-local-workflow' of https://github.com/IDEMSInt…
chrismclarke Aug 19, 2023
021ed2a
feat: add templated data list method
chrismclarke Aug 20, 2023
748de4e
feat: add data variable service
chrismclarke Aug 21, 2023
604c96b
feat: add base and field variable handlers
chrismclarke Aug 21, 2023
792e4f4
feat: data variable service spec tests
chrismclarke Aug 21, 2023
f929516
Merge branch 'master' of https://github.com/IDEMSInternational/parent…
chrismclarke Aug 25, 2023
0ae72ce
chore: update logging outputs
chrismclarke Aug 25, 2023
cdb2237
chore: fix tests
chrismclarke Aug 25, 2023
e6c46d3
chore: fix shared import
chrismclarke Aug 25, 2023
5ee8e3b
Merge branch 'master' into feat/datalist-runtime-override
jfmcquade Sep 1, 2023
b11dfdd
fix: test db service init
chrismclarke Sep 4, 2023
ba23a70
test: tidy mock services
chrismclarke Sep 4, 2023
54ea596
Merge branch 'master' into feat/datalist-runtime-override
esmeetewinkel Sep 19, 2023
647d0a8
chore: fix typo
chrismclarke Sep 22, 2023
bfba8cb
chore: fix typo
chrismclarke Sep 22, 2023
86ef2fd
chore: update docs
chrismclarke Sep 22, 2023
34 changes: 25 additions & 9 deletions documentation/docs/authors/advanced/overrides.md
@@ -1,8 +1,15 @@
# Template Overrides

Alternative templates can be used to override initial templates for use cases such as A/B testing or language variations.
There are various ways to override data within the app.

## Example
## Case 1 - Template and data lists at runtime
Both templates and data lists support `override_target` and `override_condition` columns on their contents sheet, which determine the conditions under which these flows override their targets.

!!! warning
Other flow types and generated flows do not currently support this feature.

### Example

In the following example, a template named `default_template` will be overridden by a template named `override_template` whenever the app language starts with the letters `es` (e.g. es_sp, es_ca).

@@ -14,25 +21,34 @@
[Google Sheet Demo](https://docs.google.com/spreadsheets/d/1MpoH3BxhECZRmYM10HZ0pTOoe69FJ-fEW9FwzK-Q6yw/edit#gid=1745157248)
[Live Preview Demo](https://plh-teens-app1.web.app/template/example_override_default)

## Parameters
### Parameters

| Parameter | Description |
| ------------------ | ------------------------------------------------------- |
| override_target | Name of template to override |
| override_condition | Condition statement that must be satisfied for override |
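The runtime selection described above can be sketched as follows. This is an illustrative sketch only: `resolveTemplate`, `OverrideEntry`, and the condition evaluator are hypothetical names, not the app's actual API.

```typescript
// Hypothetical sketch of runtime override resolution: given a requested
// flow name, return the overriding flow if its condition is satisfied.
interface OverrideEntry {
  flow_name: string; // name of the overriding flow
  override_target: string; // flow it replaces
  override_condition: string; // condition that must be satisfied
}

function resolveTemplate(
  requested: string,
  overrides: OverrideEntry[],
  evaluate: (condition: string) => boolean
): string {
  const match = overrides.find(
    (o) => o.override_target === requested && evaluate(o.override_condition)
  );
  // Fall back to the requested flow when no override condition matches
  return match ? match.flow_name : requested;
}

// Example: override whenever the app language starts with "es"
const appLanguage = "es_sp";
const resolved = resolveTemplate(
  "default_template",
  [
    {
      flow_name: "override_template",
      override_target: "default_template",
      override_condition: "language_starts_with_es",
    },
  ],
  () => appLanguage.startsWith("es")
);
```

With `appLanguage = "es_sp"` the condition matches, so `resolved` is `"override_template"`; for a non-matching language the default name would be returned unchanged.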


## Self-reference
### Template Self-reference
Sometimes it is useful to reference the original template from the override sheet, for example to display the original template with different variables, or with additional content above or below.

In this case an additional `is_override_target` parameter is required when referencing the default template from the override.

*override_template*

| type | name | value | is_override_target |
|--------- | --------------- | ----------------- |------------------- |
| title | title_1 | This appears above default content | |
| begin_template | default_template | default_template | ==TRUE== |
| end_template | | | |

This is to prevent an infinite loop that would otherwise occur as the default_template is replaced by the override.
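How the `is_override_target` flag breaks the cycle can be sketched as below. This is not the real renderer; the row shape and `resolveRowTemplate` helper are illustrative assumptions.

```typescript
// Illustrative sketch: rows flagged as the override target skip the
// override lookup, so the default template renders as-is.
interface TemplateRowSketch {
  type: string;
  value?: string;
  is_override_target?: boolean;
}

function resolveRowTemplate(
  row: TemplateRowSketch,
  resolveOverride: (name: string) => string
): string | undefined {
  if (row.type !== "begin_template" || !row.value) return undefined;
  // Without this guard, default_template would resolve back to the
  // override, which embeds default_template again: an infinite loop.
  return row.is_override_target ? row.value : resolveOverride(row.value);
}
```

So the `begin_template` row in the table above resolves directly to `default_template`, while any unflagged reference elsewhere would still be redirected to the override.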


## Case 2 - All flow types at build time
It is also possible to replace any other flow type during sync and build by specifying additional sheets with the same `flow_type` and `flow_name` properties. In this case the last-synced (or generated) flow takes priority.

This can be useful where multiple apps share the same core content and only need to apply a small number of replacements or additions.

!!! example "Experimental"
See open issue [2081](https://github.com/IDEMSInternational/parenting-app-ui/issues/2081) for proposals to add multiple data sources, which would make this type of override system easier to implement.
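The "last synced wins" behaviour can be sketched as a simple merge. The real converter writes flows to an output folder during sync; this minimal in-memory sketch (hypothetical names) only illustrates the priority rule.

```typescript
// Minimal sketch of build-time flow replacement: later flows with the same
// (flow_type, flow_name) key replace earlier ones.
interface FlowSketch {
  flow_type: string;
  flow_name: string;
  rows: unknown[];
}

function mergeFlows(flowsInSyncOrder: FlowSketch[]): FlowSketch[] {
  const byKey = new Map<string, FlowSketch>();
  for (const flow of flowsInSyncOrder) {
    // Map.set overwrites, so the last-synced flow for a key wins
    byKey.set(`${flow.flow_type}:${flow.flow_name}`, flow);
  }
  return [...byKey.values()];
}
```

Two sheets sharing `flow_type: data_list` and `flow_name: list_1` would therefore collapse to a single flow, keeping whichever was synced last.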
2 changes: 1 addition & 1 deletion packages/data-models/functions.ts
@@ -19,7 +19,7 @@ const DYNAMIC_STRING_REGEX = /[`!]?@([a-z]+)\.([0-9a-z_]+)([0-9a-z_.]*)/gi;
* Store these references in a separate object so they can be evaluated at runtime
*/
export function extractDynamicFields(data: any) {
let dynamicFields: any = {};
let dynamicFields: FlowTypes.IDynamicField = {};
switch (typeof data) {
case "object":
// simply convert array to object to handle in next case
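The `DYNAMIC_STRING_REGEX` above drives this extraction. A simplified, self-contained sketch of the kind of scan it enables follows; note the real `extractDynamicFields` builds a nested `IDynamicField` object, whereas this illustrative helper only collects `(prefix, field)` pairs.

```typescript
// Simplified sketch of dynamic reference scanning, using the same regex as
// the source: matches strings like "@local.user_name" or "@fields.score".
const DYNAMIC_STRING_REGEX = /[`!]?@([a-z]+)\.([0-9a-z_]+)([0-9a-z_.]*)/gi;

function extractReferences(text: string): { prefix: string; field: string }[] {
  const refs: { prefix: string; field: string }[] = [];
  // Fresh regex instance so lastIndex state never leaks between calls
  const re = new RegExp(DYNAMIC_STRING_REGEX.source, "gi");
  let m: RegExpExecArray | null;
  while ((m = re.exec(text)) !== null) {
    refs.push({ prefix: m[1], field: m[2] });
  }
  return refs;
}

// e.g. extractReferences("Hello @local.user_name, score: @fields.total_score")
```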
@@ -53,8 +53,8 @@ describe("App Data Converter", () => {
const errors = getLogs("error");
const errorMessages = errors.map((err) => err.message);
expect(errorMessages).toEqual([
"Duplicate flow name",
"No parser available for flow_type: test_invalid_type",
"Duplicate flows found",
]);
});
it("Throws on duplicate flows", async () => {
@@ -76,14 +76,17 @@ describe("App Data Converter - Error Checking", () => {
emptyDirSync(paths.outputFolder);
}
});
beforeEach(() => {
errorConverter = new AppDataConverter(errorPaths);
});
it("Tracks number of conversion errors", async () => {
errorConverter = new AppDataConverter(errorPaths);
const { errors } = await errorConverter.run();
expect(errors.length).toBeGreaterThan(0);
});
it("Throws on duplicate flows (2)", async () => {
await errorConverter.run().catch((err) => {
expect(err.message.includes("Duplicate flows found")).toBe(true);
expect(err.message.includes("Duplicate flows found")).toBeTrue();
});
});
});
10 changes: 2 additions & 8 deletions packages/scripts/src/commands/app-data/convert/index.ts
@@ -11,7 +11,7 @@ import { JsonFileCache } from "./cacheStrategy/jsonFile";
import {
generateFolderFlatMap,
IContentsEntry,
createChildLogger,
createChildFileLogger,
logSheetsSummary,
getLogs,
Logger,
@@ -59,7 +59,7 @@ export class AppDataConverter {

public activeDeployment = ActiveDeployment.get();

public logger = createChildLogger({ source: "converter" });
public logger = createChildFileLogger({ source: "converter" });

cache: JsonFileCache;

@@ -199,12 +199,6 @@
);

fs.ensureDirSync(path.dirname(flowOutputPath));
if (fs.existsSync(flowOutputPath)) {
this.logger.error({
message: "Duplicate flows found",
details: [flow, fs.readJsonSync(flowOutputPath)],
});
}
// ensure newline characters are standardised (i.e. replace "\r\n" with "\n")
fs.writeFileSync(flowOutputPath, standardiseNewlines(JSON.stringify(flow, null, 2)));
});
@@ -4,13 +4,13 @@ import path from "path";
import PQueue from "p-queue";
import { Logger } from "winston";
import { IConverterPaths } from "../types";
import { IContentsEntry, createChildLogger } from "../utils";
import { IContentsEntry, createChildFileLogger } from "../utils";
import { JsonFileCache } from "../cacheStrategy/jsonFile";
import chalk from "chalk";

class BaseProcessor<T = any, V = any> {
/** Used to invalidate cache */
public cacheVersion = 20221026.0;
public cacheVersion = 20230818.0;

public logger: Logger;

@@ -28,7 +28,7 @@
*/
constructor(public context: { namespace: string; paths: IConverterPaths }) {
const { namespace } = context;
this.logger = createChildLogger({ source: namespace });
this.logger = createChildFileLogger({ source: namespace });
this.setupCache();
}
/**
@@ -10,6 +10,10 @@ const paths = {
SHEETS_INPUT_FOLDER: path.resolve(testDataDir, "input"),
SHEETS_OUTPUT_FOLDER: path.resolve(testDataDir, "output"),
};
// Export method to allow use in parser-specific tests (to test on multiple instances of a flow type)
export function getTestFlowParserProcessor() {
return new FlowParserProcessor(paths);
}

// NOTE - inputs are just to test general structure and not run actual parser code
const testInputs: FlowTypes.FlowTypeWithData[] = [
@@ -44,7 +48,7 @@
let processor: FlowParserProcessor;
describe("FlowParser Processor", () => {
beforeAll(() => {
processor = new FlowParserProcessor(paths);
processor = getTestFlowParserProcessor();
processor.cache.clear();
});
beforeEach(() => {
@@ -107,7 +111,7 @@
/** Additional tests for data pipe integration */
describe("FlowParser Processor - Data Pipes", () => {
beforeAll(() => {
processor = new FlowParserProcessor(paths);
processor = getTestFlowParserProcessor();
processor.cache.clear();
});
beforeEach(() => {
@@ -1,11 +1,11 @@
import { FlowTypes } from "data-models";
import * as Parsers from "./parsers";
import { IConverterPaths, IFlowHashmapByType, IParsedWorkbookData } from "../../types";
import { arrayToHashmap, groupJsonByKey, IContentsEntry } from "../../utils";
import { arrayToHashmap, groupJsonByKey, IContentsEntry, Logger } from "../../utils";
import BaseProcessor from "../base";

export class FlowParserProcessor extends BaseProcessor<FlowTypes.FlowTypeWithData> {
public cacheVersion = 20230509.3;
public cacheVersion = 20230818.3;

public parsers: { [flowType in FlowTypes.FlowType]: Parsers.DefaultParser } = {
data_list: new Parsers.DataListParser(this),
@@ -18,14 +18,22 @@

/** Keep track of all processed flows by type and name (used in data_pipes) */
public processedFlowHashmap: {
[flowType in FlowTypes.FlowType]?: { [flow_name: string]: FlowTypes.FlowTypeWithData["rows"] };
[flowType in FlowTypes.FlowType]?: { [flow_name: string]: any[] };
} = {};

/**
* Additional hashmap with full flow data (not just rows), for use in tracking flow duplicates
* (could use processedFlowHashmap but would require refactor to retain _xlsx path as well as rows)
*/
public processedFlowHashmapWithMeta: {
[flowType in FlowTypes.FlowType]?: { [flow_name: string]: FlowTypes.FlowTypeWithData };
} = {};

constructor(paths: IConverterPaths) {
super({ paths, namespace: "flowParser" });
}

public processInput(flow: FlowTypes.FlowTypeWithData) {
public override processInput(flow: FlowTypes.FlowTypeWithData) {
const { flow_name, flow_type, _xlsxPath } = flow;
const parser = this.parsers[flow_type];
if (!parser) {
@@ -54,12 +62,20 @@ export class FlowParserProcessor extends BaseProcessor<FlowTypes.FlowTypeWithData> {
}

public updateProcessedFlowHashmap(flow: FlowTypes.FlowTypeWithData) {
const { flow_name, flow_type } = flow;
const { flow_name, flow_type, _xlsxPath } = flow;
if (!this.processedFlowHashmap[flow_type]) {
this.processedFlowHashmap[flow_type] = {};
this.processedFlowHashmapWithMeta[flow_type] = {};
}
const duplicateFlow = this.processedFlowHashmapWithMeta[flow_type][flow_name];
if (duplicateFlow) {
this.logger.error({
message: "Duplicate flow name",
details: { flow_name, flow_type, _xlsxPaths: [_xlsxPath, duplicateFlow._xlsxPath] },
});
}
// Key should be unique as duplicates checked in main convert method
this.processedFlowHashmap[flow_type][flow_name] = flow.rows;
this.processedFlowHashmapWithMeta[flow_type][flow_name] = flow;
}

/**
@@ -1,4 +1,5 @@
import { DataListParser } from ".";
import { getTestFlowParserProcessor } from "../flowParser.spec";

const testFlow = {
flow_type: "data_list",
@@ -14,7 +15,7 @@ const testFlow = {
],
};

describe("data_list Parser", () => {
describe("data_list Parser (single)", () => {
let outputRows: any[];
beforeAll(() => {
const parser = new DataListParser({ processedFlowHashmap: {} } as any);
@@ -41,3 +42,32 @@
expect(test_notification_schedule).toEqual({ key_1: "value_1", key_2: "value_2" });
});
});

describe("data_list Parser (multiple)", () => {
const parser = getTestFlowParserProcessor();
beforeAll(() => {
parser.cache.clear();
});
afterAll(() => {
parser.cache.clear();
});
it("Adds override targets to flows", async () => {
await parser.process([
{ flow_type: "data_list", flow_name: "list_1", rows: [] },
{
flow_type: "data_list",
flow_name: "list_1_override",
rows: [],
override_target: "list_1",
override_condition: "example_condition",
},
]);
const { processedFlowHashmapWithMeta } = parser;
expect(processedFlowHashmapWithMeta.data_list.list_1).toEqual({
flow_type: "data_list",
flow_name: "list_1",
rows: [],
_overrides: { list_1_override: "example_condition" },
});
});
});
@@ -1,5 +1,6 @@
import { extractDynamicFields, FlowTypes } from "data-models";
import {
assignFlowOverrides,
extractConditionList,
parseAppDataCollectionString,
setNestedProperty,
@@ -31,4 +32,9 @@ export class DataListParser extends DefaultParser {
}
return row;
}

public postProcessFlows(flows: FlowTypes.FlowTypeWithData[]) {
const flowsWithOverrides = assignFlowOverrides(flows);
return flowsWithOverrides;
}
}
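The shared `assignFlowOverrides` utility introduced here is not shown in this diff. A hedged sketch of what it presumably does, mirroring the former template-specific `assignTemplateOverrides` logic and the expected `_overrides` output tested in the spec (names and exact shape are assumptions):

```typescript
// Sketch of a shared flow-override assignment step: each flow declaring an
// override_target is attached to that target's _overrides map, keyed by the
// overriding flow's name with its condition as the value.
interface FlowWithOverrides {
  flow_name: string;
  override_target?: string;
  override_condition?: string;
  _overrides?: { [flowName: string]: string | undefined };
}

function assignFlowOverridesSketch(
  flows: FlowWithOverrides[]
): FlowWithOverrides[] {
  const byName = new Map<string, FlowWithOverrides>();
  for (const flow of flows) byName.set(flow.flow_name, flow);
  for (const flow of flows) {
    if (!flow.override_target) continue;
    const target = byName.get(flow.override_target);
    // The real utility logs a warning for missing targets; skipped here
    if (!target) continue;
    target._overrides = {
      ...target._overrides,
      [flow.flow_name]: flow.override_condition,
    };
  }
  return [...byName.values()];
}
```

Running this over a `list_1` flow plus a `list_1_override` flow targeting it would leave `list_1` with `_overrides: { list_1_override: "example_condition" }`, matching the behaviour exercised by the `data_list Parser (multiple)` spec above.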
Expand Up @@ -2,9 +2,8 @@ import { FlowTypes } from "data-models";
import { extractDynamicFields } from "data-models";
import { DefaultParser } from "./default.parser";
import {
arrayToHashmap,
assignFlowOverrides,
extractDynamicDependencies,
logWarning,
parseAppDataCollectionString,
parseAppDataListString,
} from "../../../utils";
@@ -71,32 +70,10 @@ export class TemplateParser extends DefaultParser {
}

public postProcessFlows(flows: FlowTypes.FlowTypeWithData[]) {
const flowsWithOverrides = this.assignTemplateOverrides(flows);
const flowsWithOverrides = assignFlowOverrides(flows);
return flowsWithOverrides;
}

/** Check all templates for specified overrides and link to override_target row where exists */
private assignTemplateOverrides(flows: FlowTypes.FlowTypeWithData[]) {
const flowsByName = arrayToHashmap(flows, "flow_name");
for (const flow of flows) {
const { override_target, override_condition, flow_name } = flow;
if (override_target) {
if (!flowsByName[override_target]) {
logWarning({
msg1: `Override target does not exist: ${override_target}`,
msg2: flow_name,
});
} else {
if (!flowsByName[override_target]._overrides) {
flowsByName[override_target]._overrides = {};
}
flowsByName[override_target]._overrides[flow_name] = override_condition;
}
}
}
return Object.values(flowsByName);
}

private parseParameterList(parameterList: string[]) {
const parameterObj: FlowTypes.TemplateRow["parameter_list"] = {};
parameterList.forEach((p) => {