10 Useful Python Decorators with Code Examples
This article introduces ten practical Python decorators—including @timer, @memoize, @validate_input, and others—explaining their purpose, providing detailed code implementations, and demonstrating how they can improve performance, error handling, logging, validation, and visualization in data‑science and general programming tasks.
Decorators are a powerful and flexible feature in Python that allow modification or enhancement of functions or classes without changing the original code.
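Before the individual examples, a minimal sketch of the pattern may help (the names shout and greet here are illustrative, not from the examples below): a decorator is just a callable that takes a function and returns a replacement for it.

```python
def shout(func):
    # Wrap func so that its string result is upper-cased.
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

@shout  # equivalent to: greet = shout(greet)
def greet(name):
    return f"hello, {name}"

print(greet("Ada"))  # HELLO, ADA
```

In production code you would usually also apply functools.wraps(func) to the inner wrapper so the decorated function keeps its original name and docstring; the examples below omit that refinement for brevity.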
1. @timer – Measure Execution Time
The @timer decorator helps track the execution time of a function, useful for identifying performance bottlenecks.
import time

def timer(func):
    def wrapper(*args, **kwargs):
        start_time = time.time()
        result = func(*args, **kwargs)
        end_time = time.time()
        print(f"{func.__name__} took {end_time - start_time:.2f} seconds to execute.")
        return result
    return wrapper

@timer
def my_data_processing_function():
    # Your data processing code here
    pass

Combining @timer with other decorators can provide comprehensive performance analysis.
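To see the decorator in action, here is a self-contained run on a toy function (slow_sum is an illustrative stand-in for real work): the timing line is printed and the result is still returned unchanged.

```python
import time

def timer(func):
    def wrapper(*args, **kwargs):
        start_time = time.time()
        result = func(*args, **kwargs)
        end_time = time.time()
        print(f"{func.__name__} took {end_time - start_time:.2f} seconds to execute.")
        return result
    return wrapper

@timer
def slow_sum(n):
    time.sleep(0.1)  # simulate expensive work
    return sum(range(n))

total = slow_sum(1000)  # prints something like: slow_sum took 0.10 seconds to execute.
```

For very short calls, time.perf_counter() offers higher resolution than time.time() and is a drop-in replacement here.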
2. @memoize – Cache Results
The @memoize decorator caches function results to avoid redundant calculations, dramatically speeding up expensive recursive or data‑science functions.
def memoize(func):
    cache = {}
    def wrapper(*args):
        if args in cache:
            return cache[args]
        result = func(*args)
        cache[args] = result
        return result
    return wrapper

@memoize
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

It can also be applied to other recursive functions for optimization.
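With the cache in place, fibonacci(30) needs only about 30 underlying computations instead of more than a million recursive calls. The standard library provides the same behavior out of the box via functools.lru_cache, which also supports keyword arguments and a bounded cache size:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # unbounded cache, like the hand-written memoize
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(30))  # 832040
```

Note that the hand-written memoize above only handles positional arguments and requires them to be hashable; lru_cache has the same hashability requirement.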
3. @validate_input – Input Validation
This decorator checks that function arguments meet predefined criteria before the function runs, ensuring data integrity.
def validate_input(func):
    def wrapper(*args, **kwargs):
        valid_data = True  # Your data validation logic here
        if valid_data:
            return func(*args, **kwargs)
        else:
            raise ValueError("Invalid data. Please check your inputs.")
    return wrapper

@validate_input
def analyze_data(data):
    # Your data analysis code here
    pass

It provides a consistent way to enforce input standards in data-science projects.
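One concrete way to fill in the validation placeholder (the rule chosen here, non-empty lists only, is an assumption for illustration, not part of the original example):

```python
def validate_input(func):
    def wrapper(*args, **kwargs):
        # Hypothetical rule: every positional argument must be a non-empty list.
        valid_data = all(isinstance(a, list) and a for a in args)
        if valid_data:
            return func(*args, **kwargs)
        raise ValueError("Invalid data. Please check your inputs.")
    return wrapper

@validate_input
def analyze_data(data):
    return sum(data) / len(data)

print(analyze_data([1, 2, 3]))  # 2.0
# analyze_data([]) would raise ValueError before the function body runs.
```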
4. @log_results – Log Output
The @log_results decorator writes a function’s result to a log file, aiding debugging and monitoring.
def log_results(func):
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        with open("results.log", "a") as log_file:
            log_file.write(f"{func.__name__} - Result: {result}\n")
        return result
    return wrapper

@log_results
def calculate_metrics(data):
    # Your metric calculation code here
    pass

It can be combined with more advanced logging libraries for richer functionality.
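For richer output, the same pattern works with the standard logging module instead of a raw file handle; this sketch assumes the default root-logger configuration and logs at INFO level:

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

def log_results(func):
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        # Lazy %-formatting avoids building the message when the level is disabled.
        logger.info("%s - Result: %s", func.__name__, result)
        return result
    return wrapper

@log_results
def calculate_metrics(data):
    return {"mean": sum(data) / len(data)}

metrics = calculate_metrics([1, 2, 3, 4])
```

Handlers, formatters, and log rotation can then be configured centrally without touching the decorator.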
5. @suppress_errors – Graceful Error Handling
The @suppress_errors decorator catches exceptions, prints a friendly message, and returns None so the program can continue.
def suppress_errors(func):
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as e:
            print(f"Error in {func.__name__}: {e}")
            return None
    return wrapper

@suppress_errors
def preprocess_data(data):
    # Your data preprocessing code here
    pass

This prevents unexpected crashes while still providing error details for debugging.
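A quick self-contained demonstration, using a hypothetical safe_divide helper: the failing call prints the error message and yields None instead of raising.

```python
def suppress_errors(func):
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as e:
            print(f"Error in {func.__name__}: {e}")
            return None
    return wrapper

@suppress_errors
def safe_divide(a, b):
    return a / b

print(safe_divide(10, 2))  # 5.0
print(safe_divide(10, 0))  # prints the ZeroDivisionError message, returns None
```

Be aware that callers must then handle the None sentinel; blanket except Exception can also mask genuine bugs, so this decorator is best reserved for truly non-critical paths.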
6. @validate_output – Output Validation
After a function runs, this decorator checks that the result satisfies certain quality criteria, raising an error if not.
def validate_output(func):
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        valid_output = True  # Your output validation logic here
        if valid_output:
            return result
        else:
            raise ValueError("Invalid output. Please check your function logic.")
    return wrapper

@validate_output
def clean_data(data):
    # Your data cleaning code here
    pass

It ensures that downstream processing receives reliable data.
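As with input validation, the placeholder check can be made concrete; the quality rule here (no None values may remain after cleaning) is a hypothetical choice for illustration:

```python
def validate_output(func):
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        # Hypothetical quality rule: the cleaned result must contain no None values.
        if all(x is not None for x in result):
            return result
        raise ValueError("Invalid output. Please check your function logic.")
    return wrapper

@validate_output
def clean_data(data):
    return [x for x in data if x is not None]

print(clean_data([1, None, 2]))  # [1, 2]
```

If a buggy cleaning function left None values in place, the decorator would raise immediately at the source of the problem rather than letting bad data flow downstream.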
7. @retry – Automatic Retries
The @retry decorator attempts to re‑execute a function a specified number of times with a delay between attempts, improving resilience against transient failures.
import time

def retry(max_attempts, delay):
    def decorator(func):
        def wrapper(*args, **kwargs):
            attempts = 0
            while attempts < max_attempts:
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    print(f"Attempt {attempts + 1} failed: {e}. Retrying in {delay} seconds.")
                    attempts += 1
                    time.sleep(delay)
            raise Exception("Max retry attempts exceeded.")
        return wrapper
    return decorator

@retry(max_attempts=3, delay=2)
def fetch_data_from_api(api_url):
    # Your API data fetching code here
    pass

Use it judiciously to avoid excessive retry loops.
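A self-contained demonstration with a simulated flaky operation (the flaky_fetch function and its failure counter are illustrative): the first two calls raise, the third succeeds, and the caller only ever sees the successful result.

```python
import time

def retry(max_attempts, delay):
    def decorator(func):
        def wrapper(*args, **kwargs):
            attempts = 0
            while attempts < max_attempts:
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    print(f"Attempt {attempts + 1} failed: {e}. Retrying in {delay} seconds.")
                    attempts += 1
                    time.sleep(delay)
            raise Exception("Max retry attempts exceeded.")
        return wrapper
    return decorator

calls = {"count": 0}

@retry(max_attempts=3, delay=0.01)
def flaky_fetch():
    # Simulated transient failure: succeeds on the third call.
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("temporary network error")
    return {"status": "ok"}

result = flaky_fetch()
```

For real services, an exponential backoff (multiplying the delay after each failure) is usually kinder to the remote endpoint than a fixed delay.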
8. @visualize_results – Automatic Visualization
This decorator runs a function, then creates a Matplotlib figure to display the results, streamlining the generation of visual output.
import matplotlib.pyplot as plt

def visualize_results(func):
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        plt.figure()
        # Your visualization code here
        plt.show()
        return result
    return wrapper

@visualize_results
def analyze_and_visualize(data):
    # Your combined analysis and visualization code here
    pass

9. @debug – Debugging Helper
The @debug decorator prints the function name along with its positional and keyword arguments before execution, simplifying troubleshooting.
def debug(func):
    def wrapper(*args, **kwargs):
        print(f"Debugging {func.__name__} - args: {args}, kwargs: {kwargs}")
        return func(*args, **kwargs)
    return wrapper

@debug
def complex_data_processing(data, threshold=0.5):
    # Your complex data processing code here
    pass

10. @deprecated – Mark Deprecated Functions
This decorator issues a DeprecationWarning when a function is called, alerting users that the function will be removed in future versions.
import warnings

def deprecated(func):
    def wrapper(*args, **kwargs):
        warnings.warn(
            f"{func.__name__} is deprecated and will be removed in future versions.",
            DeprecationWarning,
            stacklevel=2,  # attribute the warning to the caller, not the wrapper
        )
        return func(*args, **kwargs)
    return wrapper

@deprecated
def old_data_processing(data):
    # Your old data processing code here
    pass

Decorators are a versatile tool in Python, useful for caching, logging, access control, and many other scenarios; employing the examples above can simplify development and make code more robust.
Python Programming Learning Circle
A global community of Chinese Python developers offering technical articles, columns, original video tutorials, and problem sets. Topics include web full‑stack development, web scraping, data analysis, natural language processing, image processing, machine learning, automated testing, DevOps automation, and big data.