Add open sourcing bits. (#1)
matz-e authored May 31, 2024
1 parent ba4f3c4 commit fe55b29
Showing 9 changed files with 50 additions and 79 deletions.
34 changes: 0 additions & 34 deletions .gitlab-ci.hpc.yml

This file was deleted.

25 changes: 0 additions & 25 deletions .gitlab-ci.nse.yml

This file was deleted.

15 changes: 0 additions & 15 deletions .gitlab-ci.yml

This file was deleted.

1 change: 1 addition & 0 deletions AUTHORS.txt
@@ -1,4 +1,5 @@
Matthias Wolf
Fernando Pereira
Sergio Rivas Gomez
Luc Dominic Grosheintz-Laval
Tristan Carel
5 changes: 5 additions & 0 deletions Dockerfile
@@ -0,0 +1,5 @@
FROM python:latest

# MPI development headers are needed to build the mpi4py dependency
RUN apt-get update \
    && apt-get install -y libopenmpi-dev
# Expects the functionalizer sources to be present under /workspace when the image is built
RUN pip install /workspace
35 changes: 32 additions & 3 deletions README.rst
@@ -1,3 +1,6 @@
.. image:: doc/source/_static/banner.jpg
   :alt: A nice banner for functionalizer

Functionalizer
==============

@@ -10,9 +13,27 @@ To process the large quantities of data optimally, this software uses PySpark.
Installation
------------

For manual installation via `pip`, a compiler handling C++17 will be necessary.
Otherwise, all `git` submodules should be checked out and `cmake` as well as `ninja` be
installed.
The easiest way to install `functionalizer` is via:

.. code-block:: console

   pip install functionalizer

Due to a dependency on ``mpi4py``, an MPI implementation needs to be installed on the
system. On Ubuntu, this can be achieved with:

.. code-block:: console

   apt-get install -y libopenmpi-dev

For manual installation from source via ``pip``, a compiler supporting C++17 is
required. Furthermore, all ``git`` submodules should be checked out:

.. code-block:: console

   gh repo clone BlueBrain/functionalizer -- --recursive --shallow-submodules
   cd functionalizer
   pip install .

Spark and Hadoop should be installed and set up as runtime dependencies.
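
A minimal sketch of providing those runtime pieces in a plain Python environment, assuming
the PyPI distribution of PySpark (which bundles Spark together with its Hadoop client
libraries) is sufficient and a Java runtime is already available:

.. code-block:: console

   pip install pyspark
   python -c "import pyspark; print(pyspark.__version__)"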

@@ -27,4 +48,12 @@ Where the final argument `edges.h5` may also be a directory of Parquet files. When
running on a cluster with multiple nodes, care should be taken that every rank occupies a
whole node; Spark will then spread out across each node.
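
As an illustrative sketch only, assuming a SLURM scheduler and a ``functionalizer``
console entry point (the actual command-line arguments are not shown in this diff and are
placeholders here), a one-rank-per-node launch could look like:

.. code-block:: console

   # one MPI rank per node; Spark workers then fan out within each node
   srun --nodes=4 --ntasks-per-node=1 functionalizer <recipe and circuit arguments> edges.h5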

Acknowledgment
--------------
The development of this software was supported by funding to the Blue Brain Project,
a research center of the École polytechnique fédérale de Lausanne (EPFL),
from the Swiss government's ETH Board of the Swiss Federal Institutes of Technology.

Copyright (c) 2017-2024 Blue Brain Project/EPFL

.. _SONATA extension: https://sonata-extension.readthedocs.io
Binary file added doc/source/_static/banner.jpg
1 change: 1 addition & 0 deletions doc/source/conf.py
@@ -56,3 +56,4 @@
html_theme_options = {"metadata_distribution": "functionalizer"}
html_title = "functionalizer"
html_show_sourcelink = False
html_static_path = ['_static']
13 changes: 11 additions & 2 deletions pyproject.toml
@@ -1,12 +1,12 @@
[build-system]
requires = ["scikit-build-core[pyproject]", "setuptools_scm"]
requires = ["scikit-build-core[pyproject]", "setuptools_scm", "cmake", "ninja"]
build-backend = "scikit_build_core.build"

[project]
name = "functionalizer"
dynamic = ["version"]
description = "A PySpark implementation of the Blue Brain Project Functionalizer"
license = {"text" = "Blue Brain Project License"}
license = {"file" = "LICENSE.txt"}
authors = [{"name" = "BlueBrain HPC", "email" = "bbp-ou-hpc@epfl.ch"}]
maintainers = [
{"name" = "Matthias Wolf", "email" = "matthias.wolf@epfl.ch"},
@@ -74,3 +74,12 @@ max-args = 8
[tool.black]
extend-exclude = 'deps\/.*$'
line_length = 100

[tool.cibuildwheel]
skip = ["cp3{6,7,8}-*", "pp*", "*-win32", "*-manylinux_i686", "*-musllinux_i686", "*-musllinux_x86_64", "*-musllinux_aarch64"]
# MPI needed for testing with mpi4py
before-all = "yum install -y openmpi3-devel java-11-openjdk"
environment = { MPICC="/usr/lib64/openmpi3/bin/mpicc" }
# Remove setuptools with Spark v4 - v3 has implicit dependency on distutils
test-requires = ["pytest", "mock", "setuptools"]
test-command = "pytest --run-slow {project}/tests"
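
As a rough sketch of how these settings could be exercised locally, assuming Docker and
the ``cibuildwheel`` tool are available (the exact CI invocation is not part of this
commit):

.. code-block:: console

   pip install cibuildwheel
   cibuildwheel --platform linux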
