TensorFlow is a machine learning framework that offers flexibility, scalability and performance for deep learning tasks. tf.function helps optimize and accelerate computation by leveraging graph-based execution. In this article, we will cover the concept of tf.function in TensorFlow.
What is tf.function in TensorFlow?
tf.function is a decorator provided by TensorFlow that transforms Python functions into graph operations. This transformation enables TensorFlow to compile and optimize the function's computation, leading to better performance and efficiency. Unlike ordinary Python functions that execute eagerly, a tf.function uses graph-based execution, which can significantly improve execution speed, especially for repetitive tasks.
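As a rough, illustrative sketch of that speed-up (assuming TensorFlow 2.x; the function dense_layer_eager, the tensor shapes and the iteration count below are made-up examples, not an official benchmark), the same computation can be timed in eager mode and as a tf.function-compiled graph:
Python
import timeit
import tensorflow as tf

# The same computation, written once as a plain Python function.
def dense_layer_eager(x, w, b):
    return tf.nn.relu(tf.matmul(x, w) + b)

# Wrapping it with tf.function compiles it into a TensorFlow graph.
dense_layer_graph = tf.function(dense_layer_eager)

x = tf.random.uniform((64, 256))
w = tf.random.uniform((256, 256))
b = tf.zeros((256,))

# The first graph call also performs tracing, so run it once before timing.
dense_layer_graph(x, w, b)

print("eager:", timeit.timeit(lambda: dense_layer_eager(x, w, b), number=1000))
print("graph:", timeit.timeit(lambda: dense_layer_graph(x, w, b), number=1000))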
Features of tf.function
Let's understand the important characteristics of tf.function: it converts Python functions into callable TensorFlow graphs, caches traced graphs and reuses them for matching input signatures, and applies graph-level optimizations that speed up repeated execution.
The work of tf.function involves tracing, compilation and optimization. During tracing, TensorFlow records the operations performed by the Python function; the recorded operations are then compiled into a graph, which is optimized for efficient execution. Let's explore the tracing step with a small example:
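A minimal sketch of the tracing step (the function traced_add and its arguments are illustrative): a Python side effect such as print() runs only while tf.function traces the function, not on later calls that reuse the already-compiled graph:
Python
import tensorflow as tf

@tf.function
def traced_add(x, y):
    # This Python print runs only during tracing, not during graph execution.
    print("Tracing traced_add with", x, y)
    return tf.add(x, y)

traced_add(tf.constant(1), tf.constant(2))      # first call: traces the graph and prints
traced_add(tf.constant(3), tf.constant(4))      # same signature: cached graph reused, no print
traced_add(tf.constant(1.0), tf.constant(2.0))  # new dtype: traced again, prints again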
We can use tf.function in TensorFlow as a decorator. Let's have a look at the implementation:
1. Define Your Python Function: Start by defining a Python function that contains TensorFlow operations.
Python3
import tensorflow as tf

def my_function(x, y):
    return tf.add(x, y)
2. Decorate the Function: Apply the @tf.function decorator to your Python function.
@tf.function
def my_function(x, y):
    return tf.add(x, y)
3. Call the Function: Once decorated, you can call your function as you would any other Python function.
result = my_function(tf.constant(2), tf.constant(3))
print(result)
Output:
tf.Tensor(5, shape=(), dtype=int32)
How can we generate graphs using tf.function?
A graph is a data structure that represents a computation as a set of nodes (operations) and edges (the tensors flowing between them). Graphs help TensorFlow optimize and deploy your models.
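As a hedged sketch of what such a graph contains (assuming TensorFlow 2.x; the function affine and the tensor shapes are illustrative), we can ask a decorated function for the concrete graph it builds for one input signature and list that graph's operations, i.e. its nodes:
Python
import tensorflow as tf

@tf.function
def affine(x, w, b):
    return tf.matmul(x, w) + b

# Build the graph for one specific input signature.
concrete = affine.get_concrete_function(
    tf.TensorSpec([1, 2], tf.float32),
    tf.TensorSpec([2, 2], tf.float32),
    tf.TensorSpec([2], tf.float32))

# Each operation is a node; the tensors flowing between operations are the edges.
for op in concrete.graph.get_operations():
    print(op.name, op.type)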
To create a graph with tf.function, you can either use it as a decorator or call it directly on an existing function.
1. As a decorator:
Python3
@tf.function
def add(a, b):
    return a + b
2. As a direct call:
Python
def multiply(a, b):
    return a * b

multiply = tf.function(multiply)
Both methods will convert the Python functions into PolymorphicFunction objects, which can build TensorFlow graphs when called.
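A minimal sketch of that behaviour (assuming a recent TensorFlow 2.x release, where tf.function objects expose pretty_printed_concrete_signatures(); the function multiply mirrors the direct-call example above): the returned object is not a plain Python function, and a single object can hold one traced graph per input signature:
Python
import tensorflow as tf

def multiply(a, b):
    return a * b

multiply = tf.function(multiply)

print(type(multiply))                                # a polymorphic tf.function wrapper, not a plain function
print(multiply(tf.constant(2), tf.constant(3)))      # traces and runs an int32 graph
print(multiply(tf.constant(2.0), tf.constant(3.0)))  # traces a second, float32 graph

# List the input signatures for which graphs have been traced so far.
print(multiply.pretty_printed_concrete_signatures())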
Using TensorBoard
To visualize these graphs, we use TensorBoard. TensorBoard is a powerful tool that allows us to visualize and analyze our TensorFlow models and data, including the computational graphs that tf.function builds.
To use TensorBoard, we need to install it and launch it pointing at the directory where the graph logs are written:
pip install tensorboard
tensorboard --logdir logs/graphs
The tf.function annotation "autographs", i.e. transforms, a Python computation function into a high-performance TensorFlow graph. To visualize such graphs, we use the TensorFlow Summary Trace API to log autographed functions for display in TensorBoard.
To use the Summary Trace API:
1. Define and annotate a function with tf.function.
2. Use tf.summary.trace_on() immediately before your function call site; pass profiler=True to add profile information (memory, CPU time) to the graph.
3. With a Summary file writer, call tf.summary.trace_export() to save the log data.
We can then use TensorBoard to see how our function behaves.
Python
import tensorflow as tf
from datetime import datetime

# The function to be traced.
@tf.function
def my_func(x, y):
    # A simple hand-rolled layer.
    return tf.nn.relu(tf.matmul(x, y))

# Set up logging.
stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
logdir = 'logs/func/%s' % stamp
writer = tf.summary.create_file_writer(logdir)

# Sample data for our function.
x = tf.random.uniform((3, 3))
y = tf.random.uniform((3, 3))

# Bracket the function call with
# tf.summary.trace_on() and tf.summary.trace_export().
tf.summary.trace_on(graph=True, profiler=True)
# Call only one tf.function when tracing.
z = my_func(x, y)
with writer.as_default():
    tf.summary.trace_export(
        name="my_func_trace",
        step=0,
        profiler_outdir=logdir)
Run the following command in the command prompt:
tensorboard --logdir logs/func
Output:
[Output image: the computational graph of my_func as rendered in TensorBoard]