This repository has been archived by the owner on Jan 3, 2023. It is now read-only.

Merge pull request #535 from NervanaSystems/jennifer/doc
doc fixes, use overview for index page, fix icon
Jennifer Myers authored Nov 17, 2016
2 parents 5fac560 + 7da2f8e commit 475387a
Showing 9 changed files with 23 additions and 82 deletions.
11 changes: 9 additions & 2 deletions doc/source/axes.rst
@@ -54,6 +54,7 @@ The ``Axis`` object represents one dimension of a tensor, and can be created wit
For tensors with multiple dimensions, we create an ``Axes`` passing in a list of individual ``Axis`` objects. Note that
the ordering does *not* matter in specifying the axes, and has no bearing on the eventual data layout during execution. See Properties
for a full description of axes properties.

::

axes = ng.make_axes([H, W])
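Since axis order has no bearing on a tensor's type, two ``Axes`` built from the same ``Axis`` objects in different orders describe the same type. A minimal sketch of that equivalence in plain Python (the ``Axis`` class here is a hypothetical stand-in, not the ngraph implementation):

```python
# Toy model: an axes "type" is order-insensitive, so compare as a frozenset.
class Axis:
    def __init__(self, name, length):
        self.name = name
        self.length = length

def same_type(axes_a, axes_b):
    """Two axes lists describe the same tensor type if they contain
    the same axes, regardless of order."""
    return (frozenset((a.name, a.length) for a in axes_a)
            == frozenset((b.name, b.length) for b in axes_b))

H = Axis("H", 3)
W = Axis("W", 4)
print(same_type([H, W], [W, H]))  # True: order does not matter
```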
@@ -125,7 +126,7 @@ We can also use ``ng.cast_axis`` to recast the axes of an already defined tensor
Properties
----------

- 1. The order of Axes does not matter. ::
+ 1. The order of Axes does not matter.

- Two tensors ``x`` and ``y`` are considered having the same type if

@@ -166,6 +167,7 @@ Properties
A set of standard neon axes is defined for the neon frontends.

- Axes roles

::

ar = ng.make_name_scope(name="ar")
@@ -177,6 +179,7 @@
ar.Time = ng.make_axis_role()

- Image / feature map

::

ax = ng.make_name_scope(name="ax")
@@ -187,6 +190,7 @@
ax.W = ng.make_axis(roles=[ar.Width], docstring="input image width")

- Filter (convolution kernel)

::

ax.R = ng.make_axis(roles=[ar.Height], docstring="filter height")
@@ -196,18 +200,21 @@
ax.K = ng.make_axis(roles=[ar.Channelout], docstring="number of output feature maps")

- Output

::

ax.M = ng.make_axis(roles=[ar.Depth], docstring="output image depth")
ax.P = ng.make_axis(roles=[ar.Height], docstring="output image height")
ax.Q = ng.make_axis(roles=[ar.Width], docstring="output image width")

- Recurrent

::

ax.REC = ng.make_axis(roles=[ar.Time], recurrent=True, docstring="recurrent axis")

- Target

::

ax.Y = ng.make_axis(docstring="target")
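The role annotations above let downstream code find an axis by its function rather than its name. A sketch of that lookup idea in plain Python (the ``Axis`` class and role strings here are illustrative stand-ins, not the ngraph API):

```python
class Axis:
    def __init__(self, name, roles=(), docstring=""):
        self.name = name
        self.roles = tuple(roles)
        self.docstring = docstring

def find_by_role(axes, role):
    """Return the axes that carry the given role, in order."""
    return [a for a in axes if role in a.roles]

# Mirroring the axes above: R and P both play the "Height" role.
R = Axis("R", roles=["Height"], docstring="filter height")
P = Axis("P", roles=["Height"], docstring="output image height")
K = Axis("K", roles=["Channelout"], docstring="number of output feature maps")
print([a.name for a in find_by_role([R, P, K], "Height")])  # ['R', 'P']
```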
@@ -238,7 +245,7 @@ Elementwise Binary Ops
(C,) + (H, W) -> (C, H, W)
(H, W) + (C,) -> (H, W, C)

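The pairings above follow a simple rule: keep the left operand's axes in order, then append any axes of the right operand not already present. A sketch of that rule in plain Python (an illustrative model, not the ngraph code):

```python
def broadcast_axes(lhs, rhs):
    """Output axes: lhs axes first, then rhs axes not already in lhs."""
    return list(lhs) + [a for a in rhs if a not in lhs]

print(broadcast_axes(["C"], ["H", "W"]))  # ['C', 'H', 'W']
print(broadcast_axes(["H", "W"], ["C"]))  # ['H', 'W', 'C']
```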
- In the following example, `z` from left and right are equivalent, although
+ In the following example, ``z`` from left and right are equivalent, although
the axis orders are different.

::
2 changes: 1 addition & 1 deletion doc/source/building_graphs.rst
@@ -116,7 +116,7 @@ We can guard the ``update_op`` with a context ``ng.Op.saved_user_deps`` to make
with ng.Op.saved_user_deps():
update_op = ng.assign(w, w + 1)
- This modification will then allow the `w_comp()` to properly print ``0, 0, 0`` for each call. Ops that are defined inside the context are not included in the dependencies of the computation unless explicitly named. To recreate the ``1, 2, 3`` behavior now that the ``update_op`` is guarded, we would have to explicitly name the ``update_op`` in the computation:
+ This modification will then allow the ``w_comp()`` to properly print ``0, 0, 0`` for each call. Ops that are defined inside the context are not included in the dependencies of the computation unless explicitly named. To recreate the ``1, 2, 3`` behavior now that the ``update_op`` is guarded, we would have to explicitly name the ``update_op`` in the computation:

.. code-block:: python
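The guarding behaviour described above can be modelled with a small context manager: ops created inside it are recorded in a separate list and kept out of the automatic dependency list. This is a hypothetical pure-Python sketch of the idea, not ngraph's actual ``Op.saved_user_deps``:

```python
import contextlib

auto_deps = []    # ops that computations depend on by default
saved_deps = []   # ops created inside the guard; run only if explicitly named
_guarded = False

def make_op(name):
    """Register an op; guarded ops go to the saved list instead."""
    (saved_deps if _guarded else auto_deps).append(name)
    return name

@contextlib.contextmanager
def saved_user_deps():
    global _guarded
    _guarded = True
    try:
        yield
    finally:
        _guarded = False

make_op("w_comp")
with saved_user_deps():
    make_op("update_op")

print(auto_deps)   # ['w_comp']: update_op is excluded from the dependencies
print(saved_deps)  # ['update_op']: available only when named explicitly
```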
7 changes: 1 addition & 6 deletions doc/source/index.rst
@@ -13,12 +13,7 @@
.. limitations under the License.
.. ---------------------------------------------------------------------------
- |Geon|
- ================
-
- :Release: |version|
- :Date: |today|
+ .. include:: overview.rst

.. toctree::
:hidden:
5 changes: 3 additions & 2 deletions doc/source/ngraph_theme/layout.html
@@ -103,15 +103,16 @@
<div class="wy-side-nav-search">
{% block sidebartitle %}

- {% if logo and theme_logo_only %}
+ {% if logo %}
<a href="{{ pathto(master_doc) }}">
{% else %}
- <a href="{{ pathto(master_doc) }}" class="icon icon-home"> {{ project }}
+ <a href="{{ pathto(master_doc) }}" class="icon icon-home">
{% endif %}

{% if logo %}
{# Not strictly valid HTML, but it's the only way to display/scale it properly, without weird scripting or heaps of work #}
<img src="{{ pathto('_static/' + logo, 1) }}" class="logo" />
+ {{ project }}
{% endif %}
</a>

Binary file modified doc/source/ngraph_theme/static/favicon.ico
100644 → 100755
63 changes: 0 additions & 63 deletions doc/source/op_graph.rst

This file was deleted.

7 changes: 5 additions & 2 deletions doc/source/overview.rst
@@ -14,7 +14,10 @@
.. ---------------------------------------------------------------------------
Overview
- ********
+ ========

+ :Release: |version|
+ :Date: |today|

.. Note::
Nervana Graph is currently a preview release and the APIs and implementation are subject to change. We encourage you to contribute to the discussion and help shape the future of Nervana Graph.
@@ -118,4 +121,4 @@ We are actively working towards:

Join us
-------
- With the rapid pace in the deep learning community we realize that a project like this won't succeed without community participation, which is what motivated us to put this preview release out there to get feedback and encourage people like you to come join us in defining the next wave of deep learning tooling. Feel free to make pull requests/suggestions/comments on `Github <https://github.com/NervanaSystems/ngraph>`_) or reach out to us on our `mailing list <https://groups.google.com/forum/#!forum/neon-users>`_. We are also hiring for full-time and internship positions.
+ With the rapid pace in the deep learning community we realize that a project like this won't succeed without community participation, which is what motivated us to put this preview release out there to get feedback and encourage people like you to come join us in defining the next wave of deep learning tooling. Feel free to make pull requests/suggestions/comments on `Github <https://github.com/NervanaSystems/ngraph>`_) or reach out to us on our `mailing list <https://groups.google.com/forum/#!forum/neon-users>`_. We are also hiring for full-time and internship positions.
8 changes: 4 additions & 4 deletions doc/source/transformer_usage.rst
@@ -55,7 +55,7 @@ Computation objects are created by the transformer and provide an interface to e
Computation Creation
--------------------

- Computations are created with the ``Transformer.computation`` method. When creating a computation, the user must specify a list of results which should be evaluated by the computation. These results should be ngraph ``Op``s. The transformer is able to traverse the graph backwards from these results to determine the entire subset of graph nodes required to evaluate these results, so it is not necessary for the user to specify the entire subset of nodes to execute. The user must also specify a list of graph nodes to be set as inputs to the computation. Typically these are placeholder tensors. Continuing from the above code example, a simple graph and computation can be created:
+ Computations are created with the ``Transformer.computation`` method. When creating a computation, the user must specify a list of results which should be evaluated by the computation. These results should be ngraph ``Op`` s. The transformer is able to traverse the graph backwards from these results to determine the entire subset of graph nodes required to evaluate these results, so it is not necessary for the user to specify the entire subset of nodes to execute. The user must also specify a list of graph nodes to be set as inputs to the computation. Typically these are placeholder tensors. Continuing from the above code example, a simple graph and computation can be created:

.. code-block:: python
@@ -76,7 +76,7 @@ After all computations are created, the ``Transformer.initialize`` method must b
Computation Execution
---------------------

- This computation object can be executed with is ``__call__`` method by specifying the input ``c``.
+ This computation object can be executed with its ``__call__`` method by specifying the input ``c``.

.. code-block:: python
@@ -94,7 +94,7 @@ In real world cases, we often want computations that return multiple results. Fo
example_comp2 = transformer.computation([d, e], b, c)
result_d, result_e = example_comp2(2, 7)
- In addition to returning the final result as above, this example will also set result_d to the result of the d operation, which should be 8.
+ In addition to returning the final result as above, this example will also set ``result_d`` to the result of the ``d`` operation, which should be 8.
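The backward traversal described earlier, from the requested results to the minimal set of nodes needed to evaluate them, can be sketched as a small dependency walk. This is a hypothetical model in plain Python, not the actual transformer; the graph and node names below are made up for illustration:

```python
def required_nodes(results, deps):
    """Walk backwards from the requested results, collecting every
    node they transitively depend on."""
    needed, stack = set(), list(results)
    while stack:
        node = stack.pop()
        if node not in needed:
            needed.add(node)
            stack.extend(deps.get(node, ()))
    return needed

# Hypothetical graph: d depends on b and c, e on d; f is never requested.
deps = {"d": ["b", "c"], "e": ["d"], "f": ["b"]}
print(sorted(required_nodes(["e"], deps)))  # ['b', 'c', 'd', 'e'] - 'f' is skipped
```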

Transformed Graph State
-----------------------
@@ -110,7 +110,7 @@ For convenience, an executor utility is provided in ngraph.util.utils. This exec

.. code-block:: python
- from ngraph.util.utils import executor
+ from ngraph.util.utils import executor
example_comp = executor(e, b, c)
result_e = example_comp(2, 7)
2 changes: 0 additions & 2 deletions examples/walk_through/Logistic_Regression_Part_1.ipynb
@@ -191,8 +191,6 @@
},
"outputs": [],
"source": [
"ngt.make_transformer()\n",
"\n",
"transformer = ngt.make_transformer()\n",
"update_fun = transformer.computation([L, W, update], alpha, X, Y)"
]
