-
File URL paths ---> https://nodejs.org/api/fs.html#fs_file_url_paths

face: {
  detector: { modelPath: 'file://models/faceboxes.json' }
}

I am a dependent of your module:

humanLibrary = require('@vladmandic/human/dist/human.node.js').default;

Depending on the user's OS and how people decide to install stuff, @vladmandic/human can end up at different paths, so:

var humanModulePath = path.dirname(require.resolve("@vladmandic/human/package.json"));

I then get a nice path as the value of humanModulePath. I try plugging this into your:

detector: { modelPath: 'file:/' + correctedModulePath + 'models/faceboxes.json' }

and I get:

I know I can write up some spaghetti code using .indexOf('/') to correct for it, but there is also the issue that require.resolve will use whatever the OS uses for paths, and reading up on File URL paths tells me I'll have to do something like:

var forceLinuxPath = humanModulePath.replace(/\\/g, "/");

Before I go coding around this, are you willing to code it up so it works for a dependent package? I may also just make a copy of the models in my own folder, but then it's duplicating a bunch of files and increasing the install size of the package. Is my brain right? Thank you,
-
oh, I just noticed I missed a '/' ..... fixed, and it is still an issue:

detector: { modelPath: 'file:/' + correctedModulePath + '/models/faceboxes.json' }
-
instead of adding strings, it's better to use path.join so Node.js takes care of directory separators:

detector: { modelPath: 'file:/' + path.join(correctedModulePath, 'models/faceboxes.json') }
-
My poor brain. If I use

I get

But I know it's a good path. Let's change meeki to some false path like notmeeki

I get

I don't understand :(
-
i've just tried with both absolute and relative paths. i've also just tried with something like:

config.body.modelPath = new URL('/posenet.json', 'file://models').toString();

and it also loaded the model correctly (note that
-
Thank you! I never thought about just using the object like that.....! I got rid of the symbolic link to the project, published it to npm, and then installed it like a user would, to get a proper dir structure for testing my package at /home/meeki/.node-red/node_modules/node-red-contrib-human-recognition:

myConfig.face.detector.modelPath = new URL('/faceboxes.json', 'file://.node-red/node_modules/@vladmandic/human/models').toString();

output is

So I then thought.... I'll just throw the faceboxes.json and .bin in that dir and give it what it wants... well......

output of process.env.PWD

However, output of

I've even copied the models dir to /home/meeki/.node-red/node_modules/node-red-contrib-human-recognition/models

Grr, I'm missing something...... I don't expect you to be my tech support. If you don't have time to respond I'll be fine. Going to read..... A LOT. It's kinda fun when you figure it out. I enjoy the challenge. Thanks M8,
-
quick test program to load a model (in the same manner that human does):

const tf = require('@tensorflow/tfjs-node');
async function main() {
const modelPath = 'file://models/blazepalm/graph/blazepalm.json';
console.log('model path:', modelPath);
console.log('platform: browser', tf.env().getBool('IS_BROWSER'), 'node', tf.env().getBool('IS_NODE'));
const model = await tf.loadGraphModel(modelPath);
console.log('loaded using:', model.handler);
console.log('bytes used:', tf.engine().state.numBytes);
console.log('model signature:', model.signature);
tf.dispose(model);
}
main();
actual load function is provided by

now, the default io handler is platform-default for the fetch function. in your case, it seems that the handler stays as platform fetch, and then fetch expectedly fails with

and to confirm the theory, that's the same error i get if i manually call

in case of
-
Got to go to work, but these are my objects from the test:

{"MODEL_JSON_FILENAME":"model.json","WEIGHTS_BINARY_FILENAME":"weights.bin","MODEL_BINARY_FILENAME":"tensorflowjs.pb","path":"/home/meeki/.node-red/node_modules/@vladmandic/human/models/faceboxes.json"}

{"outputs":{"scores":{"name":"scores"},"boxes":{"name":"boxes"},"num_boxes":{"name":"num_boxes"}}}

platform: browser false node true

I'll clean up the code later tonight and make it output pretty :)
-
oh, not good. I got that from running the "quick test program to load model (in the same manner that human does)":

////////////////////////////////////////////////////////
//const modelPath = 'file://.node-red/node_modules/@vladmandic/human/models/faceboxes.json';
modelPath = 'file://.node-red/node_modules/@vladmandic/human/models/faceboxes.json';
notify_user_errors('model path:' + modelPath);
notify_user_errors('platform: browser' + tf.env().getBool('IS_BROWSER') + 'node' + tf.env().getBool('IS_NODE'));
model = await tf.loadGraphModel(modelPath);
notify_user_errors('loaded using:' + model.handler);
notify_user_errors('bytes used:' + tf.engine().state.numBytes);
notify_user_errors('model signature:' + model.signature);
msg.modelhandler = model.handler;
msg.modelsignature = model.signature;
tf.dispose(model);
//////////////////////////////////////////////////////
send(msg);
-
ah, understood. let me know.
-
Not sure what "plays nice with ts" means? Does the quick earlier test work in node-red?

loading models handler.............

So far it's looking to be a node-red issue, and that is not your responsibility, as the node-red free node that is installed using:

is working fine. It loves my promises and spits out all the checks as good. Sorry if I've wasted your time.
-
FOUND IT!!!! The error msg sent me down the wrong rabbit hole! If the config is not a perfectly formatted object it sends:
Example failure, by renaming the object key to bodys instead of body:

const myConfig = {
backend: 'tensorflow',
console: true,
videoOptimized: false,
async: false,
face: {
detector: { modelPath: 'file://.node-red/node_modules/node-red-contrib-human-recognition/models/faceboxes.json' }, // cannot use blazeface in nodejs due to missing required kernel function in tfjs-node
mesh: { modelPath: 'file://.node-red/node_modules/node-red-contrib-human-recognition/models/facemesh.json' },
iris: { modelPath: 'file://.node-red/node_modules/node-red-contrib-human-recognition/models/iris.json' },
age: { modelPath: 'file://.node-red/node_modules/node-red-contrib-human-recognition/models/age-ssrnet-imdb.json' },
gender: { modelPath: 'file://.node-red/node_modules/node-red-contrib-human-recognition/models/gender-ssrnet-imdb.json' },
emotion: { modelPath: 'file://.node-red/node_modules/node-red-contrib-human-recognition/models/emotion-large.json' },
},
bodys: { modelPath: 'file://.node-red/node_modules/node-red-contrib-human-recognition/models/posenet.json' },
hand: {
detector: { modelPath: 'file://.node-red/node_modules/node-red-contrib-human-recognition/models/handdetect.json' },
skeleton: { modelPath: 'file://.node-red/node_modules/node-red-contrib-human-recognition/models/handskeleton.json' },
},
};

Output
Or if you try to send it an object that is incomplete:

face: {
detector: { modelPath: 'file://.node-red/node_modules/node-red-contrib-human-recognition/models/faceboxes.json' }, // cannot use blazeface in nodejs due to missing required kernel function in tfjs-node
}

Output
You have to give every object key a value:

const myConfig = {
backend: 'tensorflow',
console: true,
videoOptimized: false,
async: false,
face: {
detector: { modelPath: 'file://.node-red/node_modules/node-red-contrib-human-recognition/models/faceboxes.json' }, // cannot use blazeface in nodejs due to missing required kernel function in tfjs-node
mesh: { enabled: false },
iris: { enabled: false },
age: { enabled: false },
gender: { enabled: false },
emotion: { enabled: false },
},
body: { enabled: false },
hand: { enabled: false },
};

Output:
As you can see, it's now loaded :) You outline it in your https://github.com/vladmandic/human/wiki/Configuration wiki, but I missed the part where you have to give every object key something.
-
So you may be wondering how I figured this one out. I would love to spout some technobabble about how I did this and that....... nope..... I was playing around with the plain Node.js (non-node-red) version and decided to try just facial rec, no other options, like I did with the node-red version. Got the same output I was getting in the node-red version and went: "What did I just do.... What did I change....... How did I make this happen?" Played around with the code for 10 min and put together what was happening. Corrected the object in the node-red version and BINGO!
-
interesting! how are you using the myConfig object? if you're doing

const human = new Human();
human.config = myConfig;

then I'd expect it to fail if it's incomplete, as you just overwrote the defaults. but if you pass it during construction or load, such as

const human = new Human(myConfig);

then

if (userConfig) this.config = mergeDeep(this.config, userConfig);

kicks in. the helper function performs a deep merge of multiple objects, so it allows full inheritance with overrides:

function mergeDeep(...objects) {
function mergeDeep(...objects) {
const isObject = (obj) => obj && typeof obj === 'object';
return objects.reduce((prev, obj) => {
Object.keys(obj || {}).forEach((key) => {
const pVal = prev[key];
const oVal = obj[key];
if (Array.isArray(pVal) && Array.isArray(oVal)) prev[key] = pVal.concat(...oVal);
else if (isObject(pVal) && isObject(oVal)) prev[key] = mergeDeep(pVal, oVal);
else prev[key] = oVal;
});
return prev;
}, {});
}
-
how are you using object myConfig?

//************************//
const humanLibrary = require('@vladmandic/human/dist/human.node.js').default; // or const Human = require('../dist/human.node-gpu.js').default;
//************************//
///////////////////////////////////
//Global stuff used by entire node
///////////////////////////////////
//blabla code here
////////////////////////
// construct the node //
////////////////////////
function humanrecognitionNode(config)
{
RED.nodes.createNode(this,config);
let human = null;
//************************//
const myConfig = {
//************************//
backend: 'tensorflow',
console: true,
videoOptimized: false,
async: false,
face: {
detector: { modelPath: 'file://.node-red/node_modules/node-red-contrib-human-recognition/models/faceboxes.json' }, // cannot use blazeface in nodejs due to missing required kernel function in tfjs-node
mesh: { enabled: false },
iris: { enabled: false },
age: { enabled: false },
gender: { enabled: false },
emotion: { enabled: false },
},
body: { enabled: false },
hand: { enabled: false },
};
//is tfjs_ready
var tfjs_ready; //error check of tfjs_ready
Promise.resolve
(
tf.ready()
)
.then(() => { tfjs_ready = true; }) // callback form, so the flag is set when the promise resolves
.catch(error =>
{
tfjs_ready = ("tfjs is not ready " + error),
this.warn(tfjs_ready),
this.status(
{
fill: 'red',
shape: 'dot',
text: "detected error"
});
});
//is human_ready - pre-load models
var human_ready; //error check of human_ready
//************************//
human = new humanLibrary(myConfig);
//************************//
this.warn('Human:' + human.version);
this.warn('Current folder:' + process.env.PWD);
this.warn('Active Configuration:' + JSON.stringify(human.config));
this.warn('TFJS Version:' + JSON.stringify(tf.version_core));
this.warn('TFJS Backend:' + JSON.stringify(tf.getBackend()));
this.warn('TFJS Flags:' + JSON.stringify(tf.env().features));
this.warn('Loading models:');
Promise.resolve
(
//************************//
human.load()
//************************//
)
.then(() => { human_ready = true; }) // callback form, so the flag is set when the promise resolves
.catch(error =>
{
human_ready = ("human is not ready " + error),
this.warn(human_ready),
this.status(
{
fill: 'red',
shape: 'dot',
text: "detected error"
});
});
for (const model of Object.keys(human.models)) this.warn(' Loaded:' + model);
this.warn('Memory state:' + JSON.stringify(human.tf.engine().memory()));
this.warn('Test Complete');

I don't think I overrode the config........
-
NOTE: I would love to use async in the node-red constructor, but the node-red class constructor function does not support async/await directly.

var node = this;
this.on('input', async function(msg, send, done)

But if I want human loaded and ready for a user input (buffered image), I have to use promises in the constructor. ..... There is a method to my madness: if I instead loaded the human models in the user-input (buffered image) handler with async, it would take the time to load the models again and again for every image the user sends....... that's a lot of overhead.
-
You ask for:

this.warn('Human:' + human.version);

I think you want:

this.warn('Active Configuration:' + JSON.stringify(human.config));

This is partly my fault: I limit the number of characters my debug can send per msg so I don't risk the chance of a lock-up, so when I got the output for human.config it was chopped off before I could notice the end. Here is the complete output for human.config, using bodys instead of body for myConfig, just like the example a few threads above.

As you can see, it adds your:

back in, and then adds my incorrect object at the bottom:

I have been ignoring the output from your checks:

for (const model of Object.keys(human.models)) this.warn(' Loaded:' + model);

because it only checks for the key and not its value. So even though I may use:

iris: { enabled: false },

it will still give me an output:

I will code up a check that looks for the object key and its value, so if it's false it will tell me it's not loaded. Then the last output I get is the failure of the Promise for human.load():

for the failure of:

I hope this gives you the information you need. Oh, the output of this.warn('Human:' + human.version);