Fixing the get_leaves function and setup issues for Python 3.12 #404


Merged: 7 commits, Oct 7, 2024
Changes from 4 commits
2 changes: 1 addition & 1 deletion .github/workflows/tests_01.yml
@@ -12,7 +12,7 @@ jobs:
     runs-on: ubuntu-latest
     strategy:
       matrix:
-        python: [3.11]
+        python: [3.12]
     env:
       BIGML_USERNAME: ${{ secrets.BIGML_USERNAME }}
       BIGML_API_KEY: ${{ secrets.BIGML_API_KEY }}
2 changes: 1 addition & 1 deletion .github/workflows/tests_05.yml
@@ -12,7 +12,7 @@ jobs:
     runs-on: ubuntu-latest
     strategy:
       matrix:
-        python: [3.11]
+        python: [3.12]
     env:
       BIGML_USERNAME: ${{ secrets.BIGML_USERNAME }}
       BIGML_API_KEY: ${{ secrets.BIGML_API_KEY }}
2 changes: 1 addition & 1 deletion .github/workflows/tests_22.yml
@@ -12,7 +12,7 @@ jobs:
     runs-on: ubuntu-latest
     strategy:
       matrix:
-        python: [3.11]
+        python: [3.12]
     env:
       BIGML_USERNAME: ${{ secrets.BIGML_USERNAME }}
       BIGML_API_KEY: ${{ secrets.BIGML_API_KEY }}
2 changes: 1 addition & 1 deletion .github/workflows/tests_23.yml
@@ -12,7 +12,7 @@ jobs:
     runs-on: ubuntu-latest
     strategy:
       matrix:
-        python: [3.11]
+        python: [3.12]
     env:
       BIGML_USERNAME: ${{ secrets.BIGML_USERNAME }}
       BIGML_API_KEY: ${{ secrets.BIGML_API_KEY }}
2 changes: 1 addition & 1 deletion .github/workflows/tests_36.yml
@@ -12,7 +12,7 @@ jobs:
     runs-on: ubuntu-latest
     strategy:
       matrix:
-        python: [3.11]
+        python: [3.12]
     env:
       BIGML_USERNAME: ${{ secrets.BIGML_USERNAME }}
       BIGML_API_KEY: ${{ secrets.BIGML_API_KEY }}
15 changes: 14 additions & 1 deletion .readthedocs.yaml
@@ -1,9 +1,22 @@
 # .readthedocs.yaml
 # Read the Docs configuration file
 # See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

 # Required
 version: 2

 # Set the version of Python and other tools you might need
 build:
   os: ubuntu-22.04
   tools:
-    python: "3.10"
+    python: "3.11"

Review comment (Member): not 3.12?

 # Build documentation in the docs/ directory with Sphinx
 sphinx:
   configuration: docs/conf.py

 # We recommend specifying your dependencies to enable reproducible builds:
 # https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
 python:
   install:
     - requirements: docs/requirements.txt
7 changes: 7 additions & 0 deletions HISTORY.rst
@@ -3,6 +3,13 @@
 History
 -------

+9.8.0 (2024-10-02)
+------------------
+
+- Fixing the get_leaves function for local decision trees.
+- Fixing setup issues in Python 3.12.
+- Changing documentation templates.
+
 9.8.0.dev1 (2024-02-28)
 -----------------------
10 changes: 5 additions & 5 deletions bigml/dataset.py
@@ -20,7 +20,6 @@
 """
 import os
 import logging
-import warnings
 import subprocess

 from bigml.fields import Fields, sorted_headers, get_new_fields
@@ -40,12 +39,13 @@

 #pylint: disable=locally-disabled,bare-except,ungrouped-imports
 try:
-    # avoiding tensorflow info logging
-    warnings.filterwarnings("ignore", category=DeprecationWarning)
-    os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'
+    # bigml-sensenet should be installed for image processing
+    logging.disable(logging.WARNING)
+    os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'
     import tensorflow as tf
     tf.get_logger().setLevel('ERROR')
     tf.autograph.set_verbosity(0)
+    logging.getLogger("tensorflow").setLevel(logging.ERROR)
     import sensenet
     from bigml.images.featurizers import ImageFeaturizer as Featurizer
 except:
     pass
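The silencing pattern applied in this hunk can be sketched in isolation: the environment variable must be set before TensorFlow is imported, while the logger levels can be adjusted afterwards. This is a hedged sketch, not the library's code; `HAS_TF` is an illustrative name that does not appear in `dataset.py` (which uses a bare `except: pass` instead).

```python
import logging
import os

# TF_CPP_MIN_LOG_LEVEL must be set *before* tensorflow is imported:
# '3' suppresses INFO, WARNING and ERROR output from the C++ runtime.
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'
# Quiet the Python-side "tensorflow" logger as well.
logging.getLogger("tensorflow").setLevel(logging.ERROR)

try:
    import tensorflow as tf  # optional heavy dependency
    tf.get_logger().setLevel('ERROR')  # TF's own Python logger
    tf.autograph.set_verbosity(0)      # silence autograph conversion notes
    HAS_TF = True
except Exception:
    HAS_TF = False  # image processing simply stays unavailable
```

If TensorFlow is absent the flag is the only visible effect; nothing is printed either way, which is the point of the pattern.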
13 changes: 4 additions & 9 deletions bigml/deepnet.py
@@ -62,20 +62,15 @@
 import bigml.laminar.preprocess_np as pp

 try:
-    # avoiding tensorflow info logging
-    warnings.filterwarnings("ignore", category=DeprecationWarning)
-    os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'
+    logging.disable(logging.WARNING)
+    os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'
+    logging.getLogger("tensorflow").setLevel(logging.ERROR)
     import tensorflow as tf
-    tf.get_logger().setLevel('ERROR')
-    tf.autograph.set_verbosity(0)
-    LAMINAR_VERSION = False
-except Exception:
-    LAMINAR_VERSION = True
-
-try:
     from sensenet.models.wrappers import create_model
     from bigml.images.utils import to_relative_coordinates
     from bigml.constants import IOU_REMOTE_SETTINGS
     LAMINAR_VERSION = False
 except Exception:
     LAMINAR_VERSION = True
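The `LAMINAR_VERSION` flag above implements an optional-backend probe: if the tensorflow/sensenet stack imports cleanly the flag is `False`, otherwise the pure-NumPy "laminar" code path is used. The same logic can be factored as a small helper; `needs_laminar_fallback` is illustrative only and is not part of the bigml library.

```python
import importlib


def needs_laminar_fallback(module_name):
    """Return True when the optional backend module is missing and the
    pure-Python fallback code path must be used instead."""
    try:
        importlib.import_module(module_name)
        return False
    except ImportError:
        return True


# stdlib 'json' always imports, so no fallback is needed for it:
print(needs_laminar_fallback("json"))  # → False
# a nonexistent module triggers the fallback flag:
print(needs_laminar_fallback("definitely_not_a_real_module"))  # → True
```

Catching only `ImportError` (rather than bare `Exception`, as the diff does) is a design choice: it avoids masking unrelated bugs inside the optional dependency.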
1 change: 0 additions & 1 deletion bigml/ensemble.py
@@ -214,7 +214,6 @@ def __init__(self, ensemble, api=None, max_models=None, cache_get=None,
         # avoid checking fields because of old ensembles
         ensemble = retrieve_resource(self.api, self.resource_id,
                                      no_check_fields=True)
-
         self.parent_id = ensemble.get('object', {}).get('dataset')
         self.name = ensemble.get('object', {}).get('name')
         self.description = ensemble.get('object', {}).get('description')
9 changes: 6 additions & 3 deletions bigml/generators/model.py
@@ -135,8 +135,9 @@ def get_leaves(model, path=None, filter_function=None):

     offsets = model.offsets

-    def get_tree_leaves(tree, fields, path, leaves, filter_function=None):
+    def get_tree_leaves(tree, fields, path, filter_function=None):

+        leaves = []
         node = get_node(tree)
         predicate = get_predicate(tree)
         if isinstance(predicate, list):
@@ -149,10 +150,12 @@ def get_tree_leaves(tree, fields, path, leaves, filter_function=None):

         if children:
             for child in children:
+
                 leaves += get_tree_leaves(child, fields,
-                                          path[:], leaves,
+                                          path[:],
                                           filter_function=filter_function)
         else:
+            print("id:", node[offsets["id"]])

Review comment (Member): debug leftover?

             leaf = {
                 'id': node[offsets["id"]],
                 'confidence': node[offsets["confidence"]],
@@ -171,7 +174,7 @@ def get_tree_leaves(tree, fields, path, leaves, filter_function=None):
                     or filter_function(leaf)):
                 leaves += [leaf]
         return leaves
-    return get_tree_leaves(model.tree, model.fields, path, leaves,
+    return get_tree_leaves(model.tree, model.fields, path,
                            filter_function)
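The functional change in this hunk shows up clearly on a toy tree: the old signature threaded a shared `leaves` accumulator through the recursion *and* concatenated each call's return value, so deeper leaves were collected more than once; the fixed version builds a fresh list per call. The dict-based tree below is a simplification of the library's node structure, not its real API.

```python
def leaves_buggy(tree, leaves):
    # Old shape: the shared accumulator is mutated by the recursive call
    # and its return value (the same list!) is concatenated again,
    # duplicating already-collected leaves.
    children = tree.get("children", [])
    if not children:
        leaves.append(tree["id"])
        return leaves
    for child in children:
        leaves += leaves_buggy(child, leaves)
    return leaves


def leaves_fixed(tree):
    # New shape: a fresh list per call; each child's result is
    # concatenated exactly once.
    children = tree.get("children", [])
    if not children:
        return [tree["id"]]
    leaves = []
    for child in children:
        leaves += leaves_fixed(child)
    return leaves


tree = {"id": 0, "children": [
    {"id": 1},
    {"id": 2, "children": [{"id": 3}, {"id": 4}]},
]}
print(leaves_fixed(tree))               # → [1, 3, 4]
print(len(leaves_buggy(tree, [])) > 3)  # duplicates: → True
```

Dropping the `leaves` parameter entirely, as the PR does, removes the aliasing by construction instead of relying on callers to pass a fresh list.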
6 changes: 6 additions & 0 deletions bigml/tests/create_model_steps.py
@@ -31,6 +31,7 @@
 from bigml.deepnet import Deepnet
 from bigml.fusion import Fusion
 from bigml.ensemble import Ensemble
+from bigml.generators.model import get_leaves


 from .read_resource_steps import wait_until_status_code_is
@@ -690,3 +691,8 @@ def the_cloned_logistic_regression_is(step, logistic_regression):
 def check_deepnet_id_local_id(step):
     """Checking that deepnet ID and local deepnet ID match"""
     eq_(world.deepnet["resource"], step.bigml["local_deepnet"].resource_id)
+
+
+def check_leaves_number(step, leaves_number):
+    """Checking the number of leaves in a tree local model"""
+    eq_(len(get_leaves(step.bigml["local_model"])), leaves_number)
2 changes: 1 addition & 1 deletion bigml/tests/test_14_create_evaluations.py
@@ -194,7 +194,7 @@ def test_scenario4(self):
                    "evaluation_wait", "metric", "value"]
         examples = [
             ['data/iris.csv', '50', '50', '800', '80', 'average_phi',
-             '0.97007']]
+             '0.98029']]
         for example in examples:
             example = dict(zip(headers, example))
             show_method(self, self.bigml["method"], example)
2 changes: 1 addition & 1 deletion bigml/tests/test_36_compare_predictions.py
@@ -77,7 +77,7 @@ def test_scenario1(self):
              'Iris-versicolor', '{}'],
             ['data/iris_missing2.csv', '30', '50', '60', '{}', '000004',
              'Iris-versicolor', '{}'],
-            ['data/grades.csv', '30', '50', '60', '{}', '000005', 55.6560,
+            ['data/grades.csv', '30', '50', '60', '{}', '000005', 47.04852,
              '{}'],
             ['data/spam.csv', '30', '50', '60', '{}', '000000', 'ham', '{}']]
         show_doc(self.test_scenario1)
10 changes: 6 additions & 4 deletions bigml/tests/test_40_local_from_file.py
@@ -66,17 +66,18 @@ def test_scenario1(self):
             When I create a local model from the file "<exported_file>"
             Then the model ID and the local model ID match
             And the prediction for "<input_data>" is "<prediction>"
+            And the number of leaves is "<leaves#>"
         """
         show_doc(self.test_scenario1)
         headers = ["data", "source_wait", "dataset_wait", "model_wait",
                    "pmml", "exported_file", "input_data", "prediction",
-                   "model_conf"]
+                   "model_conf", 'leaves#']
         examples = [
             ['data/iris.csv', '10', '10', '10', False,
-             './tmp/model.json', {}, "Iris-setosa", '{}'],
+             './tmp/model.json', {}, "Iris-setosa", '{}', 9],
             ['data/iris.csv', '10', '10', '10', False,
              './tmp/model_dft.json', {}, "Iris-versicolor",
-             '{"default_numeric_value": "mean"}']]
+             '{"default_numeric_value": "mean"}', 9]]
         for example in examples:
             example = dict(zip(headers, example))
             show_method(self, self.bigml["method"], example)
@@ -97,6 +98,7 @@ def test_scenario1(self):
             model_create.check_model_id_local_id(self)
             model_create.local_model_prediction_is(
                 self, example["input_data"], example["prediction"])
+            model_create.check_leaves_number(self, example["leaves#"])

     def test_scenario2(self):
         """
@@ -211,7 +213,7 @@ def test_scenario4(self):
             ['data/iris.csv', '10', '10', '500', './tmp/deepnet.json', {},
              'Iris-versicolor', '{}'],
             ['data/iris.csv', '10', '10', '500', './tmp/deepnet_dft.json', {},
-             'Iris-virginica', '{"default_numeric_value": "maximum"}']]
+             'Iris-versicolor', '{"default_numeric_value": "maximum"}']]
         for example in examples:
             example = dict(zip(headers, example))
             show_method(self, self.bigml["method"], example)
16 changes: 8 additions & 8 deletions bigml/tests/test_49_local_pipeline.py
@@ -210,28 +210,28 @@ def test_scenario4(self):
         examples = [
             ['data/dates2.csv', '20', '45', '160',
              '{"time-1": "1910-05-08T19:10:23.106", "cat-0":"cat2"}',
-             '000002', -0.02616, "pipeline1"],
+             '000002', -0.4264, "pipeline1"],
             ['data/dates2.csv', '20', '45', '160',
              '{"time-1": "2011-04-01T00:16:45.747", "cat-0":"cat2"}',
-             '000002', 0.13352, "pipeline2"],
+             '000002', 0.11985, "pipeline2"],
             ['data/dates2.csv', '20', '45', '160',
              '{"time-1": "1969-W29-1T17:36:39Z", "cat-0":"cat1"}',
-             '000002', 0.10071, "pipeline3"],
+             '000002', -0.08211, "pipeline3"],
             ['data/dates2.csv', '20', '45', '160',
              '{"time-1": "1920-06-45T20:21:20.320", "cat-0":"cat1"}',
-             '000002', 0.10071, "pipeline4"],
+             '000002', -0.08211, "pipeline4"],
             ['data/dates2.csv', '20', '45', '160',
              '{"time-1": "2001-01-05T23:04:04.693", "cat-0":"cat2"}',
-             '000002', 0.15235, "pipeline5"],
+             '000002', 0.00388, "pipeline5"],
             ['data/dates2.csv', '20', '45', '160',
              '{"time-1": "1950-11-06T05:34:05.602", "cat-0":"cat1"}',
-             '000002', -0.07686, "pipeline6"],
+             '000002', -0.04976, "pipeline6"],
             ['data/dates2.csv', '20', '45', '160',
              '{"time-1": "1932-01-30T19:24:11.440", "cat-0":"cat2"}',
-             '000002', 0.0017, "pipeline7"],
+             '000002', -0.36264, "pipeline7"],
             ['data/dates2.csv', '20', '45', '160',
              '{"time-1": "Mon Jul 14 17:36 +0000 1969", "cat-0":"cat1"}',
-             '000002', 0.10071, "pipeline8"]]
+             '000002', -0.08211, "pipeline8"]]
         show_doc(self.test_scenario4)
         for example in examples:
             example = dict(zip(headers, example))
2 changes: 1 addition & 1 deletion bigml/version.py
@@ -1 +1 @@
-__version__ = '9.8.0.dev1'
+__version__ = '9.8.0'
4 changes: 2 additions & 2 deletions docs/101_anomaly.rst
@@ -1,8 +1,8 @@
 .. toctree::
    :hidden:

-BigML Bindings: 101 - Using an anomaly detector
-===============================================
+101 - Anomaly detector usage
+============================

 Following the schema described in the `prediction workflow <api_sketch.html>`_,
 document, this is the code snippet that shows the minimal workflow to
4 changes: 2 additions & 2 deletions docs/101_association.rst
@@ -1,8 +1,8 @@
 .. toctree::
    :hidden:

-BigML Bindings: 101 - Using Association Discovery
-=================================================
+101 - Association Discovery usage
+=================================

 Following the schema described in the `prediction workflow <api_sketch.html>`_,
 document, this is the code snippet that shows the minimal workflow to
4 changes: 2 additions & 2 deletions docs/101_cluster.rst
@@ -1,8 +1,8 @@
 .. toctree::
    :hidden:

-BigML Bindings: 101 - Using a Cluster
-=====================================
+101 - Cluster Usage
+===================

 Following the schema described in the `prediction workflow <api_sketch.html>`_,
 document, this is the code snippet that shows the minimal workflow to
4 changes: 2 additions & 2 deletions docs/101_deepnet.rst
@@ -1,8 +1,8 @@
 .. toctree::
    :hidden:

-BigML Bindings: 101 - Using a Deepnet Model
-===========================================
+101 - Deepnet usage
+===================

 Following the schema described in the `prediction workflow <api_sketch.html>`_,
 document, this is the code snippet that shows the minimal workflow to
4 changes: 2 additions & 2 deletions docs/101_ensemble.rst
@@ -1,8 +1,8 @@
 .. toctree::
    :hidden:

-BigML Bindings: 101 - Using an Ensemble
-=======================================
+101 - Ensemble usage
+====================

 Following the schema described in the `prediction workflow <api_sketch.html>`_,
 document, this is the code snippet that shows the minimal workflow to
4 changes: 2 additions & 2 deletions docs/101_fusion.rst
@@ -1,8 +1,8 @@
 .. toctree::
    :hidden:

-BigML Bindings: 101 - Using a Fusion Model
-==========================================
+101 - Fusion usage
+==================

 Following the schema described in the `prediction workflow <api_sketch.html>`_,
 document, this is the code snippet that shows the minimal workflow to
4 changes: 2 additions & 2 deletions docs/101_images_classification.rst
@@ -1,8 +1,8 @@
 .. toctree::
    :hidden:

-BigML Bindings: 101 - Images Classification
-===========================================
+101 - Images Classification
+===========================

 Following the schema described in the `prediction workflow <api_sketch.html>`_,
 document, this is the code snippet that shows the minimal workflow to
4 changes: 2 additions & 2 deletions docs/101_images_feature_extraction.rst
@@ -1,8 +1,8 @@
 .. toctree::
    :hidden:

-BigML Bindings: 101 - Images Feature Extraction
-===============================================
+101 - Images Feature Extraction
+===============================

 Following the schema described in the `prediction workflow <api_sketch.html>`_,
 document, this is the code snippet that shows the minimal workflow to
4 changes: 2 additions & 2 deletions docs/101_linear_regression.rst
@@ -1,8 +1,8 @@
 .. toctree::
    :hidden:

-BigML Bindings: 101 - Using a Linear Regression
-=================================================
+101 - Linear Regression usage
+=============================

 Following the schema described in the `prediction workflow <api_sketch.html>`_,
 document, this is the code snippet that shows the minimal workflow to
4 changes: 2 additions & 2 deletions docs/101_logistic_regression.rst
@@ -1,8 +1,8 @@
 .. toctree::
    :hidden:

-BigML Bindings: 101 - Using a Logistic Regression
-=================================================
+101 - Logistic Regression usage
+===============================

 Following the schema described in the `prediction workflow <api_sketch.html>`_,
 document, this is the code snippet that shows the minimal workflow to