Practical exploration of TensorFlow eager vs graph execution modes with real code examples, performance benchmarks, control flow, safe patterns for side-effects, and variable updates — demonstrating deep understanding of TensorFlow mechanics.


TensorFlow Eager vs Graph Execution – Practical Exploration

1. Problem Statement and Goal of Project

The goal of this project is to investigate, compare, and document the practical differences between TensorFlow’s eager execution and graph execution (@tf.function), focusing on:

  • Numerical correctness between modes.
  • Execution time trade-offs for small and repeated operations.
  • Behavior of Python control flow inside graphs.
  • Issues with Python side-effects (e.g., list mutation) and safe graph-friendly alternatives.
  • Variable mutation and in-place updates in @tf.function.

2. Solution Approach

The notebook takes a hands-on experimental approach, using small, controlled code snippets to explore specific behaviors:

  1. Basic numerical test – Implement a simple function (matrix multiplication + bias addition) and run in both eager and graph modes.
  2. Control flow with AutoGraph – Demonstrate how Python if/else is converted into graph operations and inspect the generated graph definition.
  3. Micro-benchmarking – Use timeit to measure performance differences for single vs repeated calls.
  4. Model tracing – Build a minimal Keras model and examine graph tracing.
  5. Side-effects – Show that Python list appends behave unexpectedly inside graphs, then introduce the safe pattern using tf.TensorArray and tf.range.
  6. Variable mutation – Demonstrate correct in-graph state updates using .assign and tf.tensor_scatter_nd_update.
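Step 4 (model tracing) can be sketched as follows. The architecture below is illustrative, not the notebook's exact layer sizes:

```python
import tensorflow as tf
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Conv2D, Dense, Flatten

# Illustrative architecture (layer sizes are arbitrary, not the notebook's)
inputs = Input(shape=(8, 8, 1))
x = Conv2D(4, 3, activation="relu")(inputs)
x = Flatten()(x)
outputs = Dense(2)(x)
model = Model(inputs, outputs)

# Wrap the forward pass in tf.function and trace a concrete graph
forward = tf.function(lambda t: model(t))
concrete = forward.get_concrete_function(
    tf.TensorSpec(shape=(None, 8, 8, 1), dtype=tf.float32)
)
print(len(concrete.graph.as_graph_def().node))  # number of ops in the traced graph
```

Inspecting `concrete.graph` shows how the Keras forward pass is lowered into individual graph operations.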

3. Technologies & Libraries

From imports and usage:

  • TensorFlow (tensorflow) – eager execution, tf.function, TensorArray, variables, and tensor ops.
  • Keras (via TensorFlow) – Input, Model, Conv2D, Dense, Flatten.
  • NumPy – tensor creation and manipulation.
  • timeit – performance measurement.
  • (Optional) ipdb – debugging (not required to run the notebook).

4. Dataset Description

Not provided – all examples use synthetic/randomly generated tensors; no external dataset is loaded.


5. Installation & Execution Guide

Requirements:

pip install tensorflow numpy

(Optional)

pip install ipdb

Run the notebook:

jupyter notebook eager_vs_graph_me.ipynb

or in JupyterLab:

jupyter lab eager_vs_graph_me.ipynb

Execute cells sequentially to reproduce experiments and outputs.


6. Key Results / Performance

Functional Equivalence (Numerical)

Both modes produce identical results for the test function:

Eager function value:
[[22.7]]
Graph mode function value:
[[22.7]]
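The comparison pattern looks like this; the inputs below are illustrative, so they do not reproduce the notebook's exact `[[22.7]]` value:

```python
import tensorflow as tf

def affine(x, w, b):
    # Matrix multiplication plus bias; runs eagerly by default
    return tf.matmul(x, w) + b

# The same Python function compiled to a graph
graph_affine = tf.function(affine)

x = tf.constant([[1.0, 2.0]])
w = tf.constant([[3.0], [4.0]])
b = tf.constant([[0.5]])

print("Eager:", affine(x, w, b).numpy())        # [[11.5]]
print("Graph:", graph_affine(x, w, b).numpy())  # [[11.5]]
```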

Control Flow (AutoGraph)

Branching with Python if/else is correctly lowered into graph ops:

first_branch:1
sec_branch:0

The generated graph can be printed via:

tf_simple_relu.get_concrete_function(tf.constant(1)).graph.as_graph_def()
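A minimal version of the branching experiment (the notebook's `tf_simple_relu` may differ in detail):

```python
import tensorflow as tf

@tf.function
def tf_simple_relu(x):
    # AutoGraph lowers this Python if/else into a tf.cond graph op
    if tf.greater(x, 0):
        return x
    return tf.constant(0, dtype=x.dtype)

print(tf_simple_relu(tf.constant(1)))   # first branch taken  -> 1
print(tf_simple_relu(tf.constant(-1)))  # second branch taken -> 0

# Inspect the graph generated for a given input signature
graph_def = tf_simple_relu.get_concrete_function(tf.constant(1)).graph.as_graph_def()
print(len(graph_def.node))
```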

Performance (timeit)

Single call:

egar_function time 0.0099053
graph_finction time 0.0362043

10,000 calls:

egar_function time 0.2091298
graph_finction time 1.2625766

(Values are from the recorded run; actual times vary by hardware and environment.)
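The benchmark pattern, sketched here with an illustrative operation and size:

```python
import timeit
import tensorflow as tf

def square(x):
    return tf.matmul(x, x)

graph_square = tf.function(square)
x = tf.random.uniform((10, 10))

graph_square(x)  # warm-up call so the one-time tracing cost is excluded

eager_t = timeit.timeit(lambda: square(x), number=1000)
graph_t = timeit.timeit(lambda: graph_square(x), number=1000)
print(f"eager: {eager_t:.4f}s  graph: {graph_t:.4f}s")
```

For tiny operations like this, per-call `tf.function` dispatch overhead can outweigh any graph optimization, which is consistent with the recorded timings above; graph mode pays off when the traced computation is large enough to amortize that overhead.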

Side-effects & Safe Pattern

Appending to a Python list inside @tf.function captures the symbolic tf.Tensor objects produced during tracing, not eager values. The safe pattern uses tf.TensorArray:

<tf.Tensor: shape=(3,), dtype=int32, numpy=array([2, 4, 6])>
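A sketch of the accumulator pattern that yields the output above:

```python
import tensorflow as tf

@tf.function
def doubled(n):
    # tf.TensorArray is the graph-safe accumulator; appending to a
    # Python list here would capture symbolic tensors instead of values
    ta = tf.TensorArray(dtype=tf.int32, size=n)
    for i in tf.range(n):
        ta = ta.write(i, (i + 1) * 2)
    return ta.stack()

print(doubled(3))  # tf.Tensor([2 4 6], shape=(3,), dtype=int32)
```

AutoGraph converts the `for i in tf.range(n)` loop into a `tf.while_loop`, threading the TensorArray through each iteration.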

Variable Updates Inside Graph

Demonstrates mutation with .assign and tf.tensor_scatter_nd_update:

<tf.Tensor: shape=(3,), dtype=int32, numpy=array([1, 5, 5])>
tf.Tensor([50  4  6], shape=(3,), dtype=int32)
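Both outputs above can be reproduced with a sketch like the following:

```python
import tensorflow as tf

v = tf.Variable([1, 2, 3])

@tf.function
def bump(var):
    # In-graph mutation must go through .assign, not Python assignment
    var.assign([1, 5, 5])
    return var.read_value()

print(bump(v))  # [1 5 5]

# Functional scatter update: returns a new tensor, original is unchanged
t = tf.constant([2, 4, 6])
print(tf.tensor_scatter_nd_update(t, indices=[[0]], updates=[50]))  # [50 4 6]
```

Note the asymmetry: `.assign` mutates the variable in place, while `tf.tensor_scatter_nd_update` is purely functional and leaves its input tensor untouched.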

7. Screenshots / Sample Output

(Extracted directly from notebook outputs)

Eager vs Graph Equality

Eager function value:
[[22.7]]
Graph mode function value:
[[22.7]]

Performance – 10k calls

egar_function time 0.2091298
graph_finction time 1.2625766

TensorArray Output

<tf.Tensor: shape=(3,), dtype=int32, numpy=array([2, 4, 6])>

8. Additional Learnings / Reflections

  • Eager mode is often faster for small, repeated operations in interactive contexts because it avoids per-call graph dispatch overhead.
  • Graph mode is valuable for complex computations, deployment, or hardware optimization, but may introduce pitfalls with Python side-effects.
  • AutoGraph enables Pythonic control flow in graphs but requires care with data types and shapes.
  • State changes in graph mode must be expressed using TensorFlow ops (assign, TensorArray) rather than Python constructs.

💡 Some interactive outputs (e.g., plots, widgets) may not display correctly on GitHub. If so, please view this notebook via nbviewer.org for full rendering.


👤 Author

Mehran Asgari
Email: imehranasgari@gmail.com
GitHub: https://github.com/imehranasgari


📄 License

This project is licensed under the Apache 2.0 License – see the LICENSE file for details.

