To create the target model (do not forget to edit the properties file, download the required data, etc.):
- Clone the repository & run ./compile (you only need Java & Maven to do that).
- Run ./runBuildDBpediaProductModel to build the RDF dataset (this step is not entirely straightforward, because it requires querying DBpedia Spotlight).
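The build script performs the DBpedia Spotlight requests itself; purely for illustration, here is a minimal Python sketch of such an annotation request against the public Spotlight REST endpoint (the endpoint and parameters follow the public Spotlight API, not this repository's code, and the input text is an arbitrary example):

```python
# Illustration only: one annotation request to the public DBpedia Spotlight
# endpoint, similar in spirit to what the build step does for product names.
import requests

SPOTLIGHT_URL = "https://api.dbpedia-spotlight.org/en/annotate"

def annotate(text, confidence=0.5):
    """Ask DBpedia Spotlight to link entities in `text` to DBpedia resources."""
    response = requests.get(
        SPOTLIGHT_URL,
        params={"text": text, "confidence": confidence},
        headers={"Accept": "application/json"},
        timeout=30,
    )
    response.raise_for_status()
    # Each annotation carries the URI of the DBpedia resource it was linked to.
    return [r["@URI"] for r in response.json().get("Resources", [])]

if __name__ == "__main__":
    print(annotate("Apache HTTP Server runs on Linux"))
```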
To create the test NVD model:
- Download the NVD data (e.g., run ./update_NVDfeeds).
- Run ./runBuildNVDModel to build the NVD dataset.
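A quick way to sanity-check a built model outside Protege is to load it with rdflib and count the triples; the file name below is only a placeholder for whatever file the build script actually produces:

```python
# Minimal sanity check of a built model with rdflib.
# "nvd_model.owl" is a placeholder; use the file produced by ./runBuildNVDModel.
from rdflib import Graph

g = Graph()
g.parse("nvd_model.owl", format="xml")  # assuming RDF/XML serialization
print(f"{len(g)} triples loaded")
```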
More details are here
If you want to refer to the CPE/DBpedia model, please cite:
Brazhuk A. Building annotated semantic model of software products towards integration of DBpedia with NVD vulnerability dataset // International Journal of Open Information Technologies (ISSN: 2307-8162). – 2019. – Vol. 7. – No. 7. – pp. 35-41.
To test the target model, you can use Protege with the FaCT++ or Pellet reasoner.
For DL queries, use the standard DL Query tab:
For SPARQL queries, use the snap-sparql-query plugin:
- Examples of SPARQL queries (note: queries containing data properties only work with Pellet).
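Outside Protege, the same kind of SPARQL query can be run over the asserted triples with rdflib (no reasoning is applied, so the Pellet note above does not come into play); the file name is again a placeholder:

```python
# Plain SPARQL over the asserted triples, no reasoner involved.
# "target_model.owl" is a placeholder for the file produced by the build scripts.
from rdflib import Graph

g = Graph()
g.parse("target_model.owl", format="xml")  # assuming RDF/XML serialization

# List some of the classes declared in the model.
query = """
PREFIX owl: <http://www.w3.org/2002/07/owl#>
SELECT ?cls WHERE { ?cls a owl:Class } LIMIT 20
"""
for row in g.query(query):
    print(row.cls)
```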
To create the target model:
- Clone the repository & run ./compile (you only need Java & Maven to do that).
- Run ./runBuildSemanticModelv3 to build the OWL file (do not forget to edit the properties file, download the required data, etc.).
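For a rough overview of what the built OWL file contains, an aggregate SPARQL query with rdflib can count individuals per class (again over asserted triples only; the file name is a placeholder):

```python
# Count individuals per class in the built OWL file (asserted triples only).
# "capec_cwe_model.owl" is a placeholder for the file produced by the build script.
from rdflib import Graph

g = Graph()
g.parse("capec_cwe_model.owl", format="xml")  # assuming RDF/XML serialization

query = """
PREFIX owl: <http://www.w3.org/2002/07/owl#>
SELECT ?cls (COUNT(?ind) AS ?n)
WHERE {
  ?ind a ?cls .
  ?cls a owl:Class .
}
GROUP BY ?cls
ORDER BY DESC(?n)
"""
for row in g.query(query):
    print(row.cls, row.n)
```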
A full description is here
If you want to refer to the CAPEC&CWE model, please cite:
Brazhuk A. Semantic model of attacks and vulnerabilities based on CAPEC and CWE dictionaries // International Journal of Open Information Technologies (ISSN: 2307-8162). – 2019. – Vol. 7. – No. 3. – pp. 38-41.
To create the target model:
- Clone the repository & run ./compile (you only need Java & Maven to do that).
- Run ./runBuildSemanticModel to build the OWL file (do not forget to edit the properties file, download the required data, etc.).