
The Python Decorator Handbook

by Icecream

Python decorators provide a simple yet powerful syntax for modifying and extending the behavior of functions in your code.

A decorator is essentially a function that takes another function, augments its functionality, and returns a new function – without permanently modifying the original function itself.

This tutorial will walk you through 11 useful decorators that add functionality like timing execution, caching, rate limiting, debugging, and more. Whether you want to profile performance, improve efficiency, validate data, or handle errors, these decorators have you covered!

The examples here focus on common usage patterns and utilities of decorators that come in handy in day-to-day programming and save you a lot of effort. Understanding the flexibility of decorators will help you write clean, resilient, and optimized application code.

Table of Contents

Here are the decorators covered in this tutorial:

  • Log Arguments and Return Value of a Function
  • Get the Execution Time of a Function
  • Convert Function Return Value to a Specified Data Type
  • Cache Function Results
  • Validate Function Arguments Based on Condition
  • Retry a Function Multiple Times on Failure
  • Enforce Rate Limits on a Function
  • Handle Exceptions and Provide Default Response
  • Enforce Type Checking on Function Arguments
  • Measure Memory Usage of a Function
  • Cache Function Results with Expiration Time

But first, just a little introduction.

How Python Decorators Work

Before diving in, let's understand some key benefits of decorators in Python:

  • Enhancing functions without invasive modifications: Decorators augment functions transparently without altering the original code, keeping the core logic clean and maintainable.
  • Reusing functionality across places: Common concerns like logging, caching, and rate limiting can be built once as decorators and applied wherever needed.
  • Readable and declarative syntax: The @decorator syntax clearly conveys the functionality enhancement at the definition site.
  • Modularity and separation of concerns: Decorators promote loose coupling between functional logic and secondary concerns like performance, security, logging, and so on.

The takeaway is that decorators unlock simple yet versatile ways of transparently enhancing Python functions for improved code organization, efficiency, and reuse without introducing complexity or redundancy.

Here is a basic example of decorator syntax in Python with annotations:

# Decorator function
def my_decorator(func):

    # Wrapper function
    def wrapper():
        print("Before the function call") # Extra processing before the function
        func() # Call the actual function being decorated
        print("After the function call") # Extra processing after the function
    return wrapper # Return the nested wrapper function

# Function to decorate
def my_function():
    print("Inside my function")

# Apply the decorator to the function
@my_decorator
def my_function():
    print("Inside my function")

# Call the decorated function
my_function()
Skeleton code template to make the simplest of decorators

A decorator in Python is a function that takes another function as an argument and extends its behavior without modifying it. The decorator function wraps the original function by defining a wrapper function inside it. This wrapper function executes code before and after calling the original function.

Specifically, when defining a decorator function such as my_decorator in the example, it takes a function as an argument, which we typically call func. This func will be the actual function that gets decorated under the hood.

The wrapper function inside my_decorator can execute arbitrary code before and after calling func(), which invokes the original function. When you apply @my_decorator above the definition of my_function, Python passes my_function as an argument to my_decorator, so func refers to my_function in that context.

The decorator then returns the enhanced wrapper function. So now my_function has been decorated by my_decorator. When it is later called, the wrapper code inside my_decorator executes before and after my_function runs. This allows decorators to transparently extend the behavior of a function, without needing to modify the function itself.

And as you will recall, the original my_function stays unchanged, keeping decorators non-invasive and flexible.

When my_function() is decorated with @my_decorator, it is automatically enhanced. The my_decorator function here returns a wrapper function, and this wrapper function is what gets executed whenever my_function() is called now.

First, the wrapper prints "Before the function call" before actually calling the original my_function() being decorated. Then, after my_function() executes, it prints "After the function call".

So, extra behavior and printed messages are added before and after the my_function() execution in the wrapper, without directly modifying my_function() itself. The decorator lets you extend my_function() in a transparent way without affecting its core logic, since the wrapper handles the enhanced behavior.

Applying a Decorator to a Function
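
One refinement worth knowing, not shown in the skeleton above: because callers now receive the wrapper, the decorated function's __name__ and docstring appear to be those of wrapper. The standard library's functools.wraps copies that metadata across. A minimal sketch of the same skeleton with this addition:

import functools

def my_decorator(func):
    @functools.wraps(func)  # preserve func.__name__, __doc__, etc. on the wrapper
    def wrapper():
        print("Before the function call")
        func()
        print("After the function call")
    return wrapper

@my_decorator
def my_function():
    """Prints a message."""
    print("Inside my function")

my_function()
print(my_function.__name__)  # prints 'my_function' instead of 'wrapper'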

So let's start exploring the top 11 practical decorators that every Python developer should know.

Log Arguments and Return Value of a Function

The Log Arguments and Return Value decorator tracks the input parameters and output of functions. This helps debugging by logging a clear record of the data flow through complex operations.

def log_decorator(original_function):
    def wrapper(*args, **kwargs):
        print(f"Calling {original_function.__name__} with args: {args}, kwargs: {kwargs}")

        # Call the original function
        result = original_function(*args, **kwargs)

        # Log the return value
        print(f"{original_function.__name__} returned: {result}")

        # Return the result
        return result
    return wrapper

# Example usage
@log_decorator
def calculate_product(x, y):
    return x * y

# Call the decorated function
result = calculate_product(10, 20)
print("Result:", result)
Decorator that logs arguments and return values of a function

Output:

Calling calculate_product with args: (10, 20), kwargs: {}
calculate_product returned: 200
Result: 200

In this example, the decorator function is named log_decorator() and accepts a function, original_function, as its argument. Within log_decorator(), a nested function called wrapper() is defined. This wrapper() function is what the decorator returns, and it effectively replaces the original function.

When the wrapper() function is invoked, it prints logging statements about the function call. Then it calls the original function, original_function, captures its result, prints the result, and returns it.

The @log_decorator syntax above the calculate_product() function is the Python convention for applying log_decorator as a decorator to the calculate_product function. So when calculate_product() is invoked, it is actually invoking the wrapper() function returned by log_decorator(). Therefore, log_decorator() acts as a wrapper, introducing logging statements before and after the execution of the original calculate_product() function.

Usage and Applications

This decorator is widely adopted in application development for adding runtime logging without interfering with the business logic implementation.

For example, consider a banking application that processes financial transactions. The core transaction processing logic resides in functions like transfer_funds() and accept_payment(). To monitor these transactions, logging can be added by placing @log_decorator above each function.

Then when transactions are triggered by calling transfer_funds(), you can print the function name and arguments like the sender, receiver, and amount before the actual transfer. After the function returns, you can print whether the transfer succeeded or failed.

This kind of logging with decorators lets you track transactions without adding any code to core functions like transfer_funds(). The logic stays clean while debuggability and observability improve. Logging messages can also be directed to a monitoring dashboard or log analytics system.
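
As a rough sketch of that banking scenario, here is the log_decorator from above applied to a hypothetical transfer_funds() function (the function body and its return value are placeholders, not part of the original example):

# Hypothetical banking function, for illustration only
@log_decorator
def transfer_funds(sender, receiver, amount):
    # The real transfer logic would live here; we just simulate a success flag
    return {"status": "success", "from": sender, "to": receiver, "amount": amount}

transfer_funds("alice", "bob", 250)
# Calling transfer_funds with args: ('alice', 'bob', 250), kwargs: {}
# transfer_funds returned: {'status': 'success', 'from': 'alice', 'to': 'bob', 'amount': 250}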

Get the Execution Time of a Function

This decorator is your ally in the quest for performance optimization. By measuring and logging the execution time of a function, it facilitates a deep dive into the efficiency of your code, helping you pinpoint bottlenecks and streamline your application's performance.

It's ideal for scenarios where speed is crucial, such as real-time applications or large-scale data processing, and it lets you identify and address performance bottlenecks systematically.

import time

def measure_execution_time(func):
    def timed_execution(*args, **kwargs):
        start_timestamp = time.time()
        result = func(*args, **kwargs)
        end_timestamp = time.time()
        execution_duration = end_timestamp - start_timestamp
        print(f"Function {func.__name__} took {execution_duration:.2f} seconds to execute")
        return result
    return timed_execution

# Example usage
@measure_execution_time
def multiply_numbers(numbers):
    product = 1
    for num in numbers:
        product *= num
    return product

# Call the decorated function
result = multiply_numbers([i for i in range(1, 10)])
print(f"Result: {result}")
Decorator that shows the execution time of the function

Output:

Function multiply_numbers took 0.00 seconds to execute
Result: 362880

This code showcases a decorator that is designed to measure the execution duration of functions.

The measure_execution_time() decorator takes a function, func, and defines an inner function, timed_execution(), to wrap the original function. Upon invocation, timed_execution() records the start time, calls the original function, records the end time, calculates the duration, and prints it.

The @measure_execution_time syntax applies this decorator to the function below it, such as multiply_numbers(). Consequently, when multiply_numbers() is called, it invokes the timed_execution() wrapper, which logs the duration alongside returning the function result.

This example illustrates how decorators seamlessly augment existing functions with extra functionality, like timing, without direct modification.

Usage and Applications

This decorator is helpful for profiling functions to identify performance bottlenecks in applications. For example, consider an e-commerce website with several backend functions like get_recommendations(), calculate_shipping(), and so on. By decorating them with @measure_execution_time, you can monitor their runtime.

When get_recommendations() is invoked in a user session, the decorator times its execution by recording a start and end timestamp. After execution, it prints the time taken before returning the recommendations.

Doing this systematically across the application and analyzing the output will show you the functions that are taking an unusually long time. The development team can then optimize such functions through caching, parallel processing, and other techniques to improve overall application performance.

Without such timing decorators, finding optimization candidates would require tedious logging code additions. Decorators provide this visibility easily without contaminating business logic.
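
One small caveat: time.time() reads the wall clock, whose resolution can be coarse for very short functions, so time.perf_counter() is usually a better fit for measuring durations. A drop-in variant of the same decorator, as a sketch:

import time

def measure_execution_time(func):
    def timed_execution(*args, **kwargs):
        start = time.perf_counter()            # high-resolution monotonic clock
        result = func(*args, **kwargs)
        duration = time.perf_counter() - start
        print(f"Function {func.__name__} took {duration:.6f} seconds to execute")
        return result
    return timed_execution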

Convert Function Return Value to a Specified Data Type

The Convert Return Value Type decorator enhances data consistency in functions by automatically converting the return value to a specified data type, promoting predictability and preventing unexpected errors. It is particularly helpful for downstream processes that require consistent data types, reducing runtime errors.

def convert_to_data_type(target_type):
    def type_converter_decorator(func):
        def wrapper(*args, **kwargs):
            result = func(*args, **kwargs)
            return target_type(result)
        return wrapper
    return type_converter_decorator

@convert_to_data_type(int)
def add_values(a, b):
    return a + b

int_result = add_values(10, 20)
print("Result:", int_result, type(int_result))

@convert_to_data_type(str)
def concatenate_strings(str1, str2):
    return str1 + str2

str_result = concatenate_strings("Python", " Decorator")
print("Result:", str_result, type(str_result))
Decorator that converts the function return value to a specific data type

Output:

Result: 30 <class 'int'>
Result: Python Decorator <class 'str'>

The above code example shows a decorator that is designed to convert the return value of a function to a specified data type.

The decorator, named convert_to_data_type(), takes the target data type as a parameter and returns a decorator named type_converter_decorator(). Within this decorator, a wrapper() function is defined to call the original function, convert its return value to the target type using target_type(), and then return the converted result.

The syntax @convert_to_data_type(int), applied above a function (such as add_values()), uses this decorator to convert the return value to an integer. Similarly, for concatenate_strings(), passing str formats the return value as a string.

This example also showcases how decorators seamlessly adapt function outputs to desired formats without altering the core logic of the functions.

Usage and Application

This return value transformation decorator proves useful in applications where you need to automatically adapt functions to expected data formats.

For instance, you might use it with a weather API that by default returns temperatures in decimal format, like 23.456 degrees, while the consumer front-end application expects an integer value to display.

Instead of changing the API function to return an integer, just decorate it with @convert_to_data_type(int). This seamlessly converts the decimal temperature to the integer 23, in this example, before returning it to the client app. Without any API function modification, you have reformatted the return value.

Similarly, for backend processing that expects JSON, return values can be converted by passing a serializer such as json.dumps as the target (the decorator accepts any callable). The core logic stays unchanged while the presentation format adapts to your use case's needs. This avoids duplicating format-handling code across functions.

Decorators externally impose the required data representations, enabling seamless integration and reuse across application layers with mismatched formats.
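
Because convert_to_data_type() accepts any callable, not just a built-in type, a serializer works as well. A minimal sketch assuming you want JSON strings back, with json.dumps as the converter and a hypothetical get_user_profile() function:

import json

@convert_to_data_type(json.dumps)
def get_user_profile(user_id):
    # Hypothetical lookup returning a plain dict
    return {"id": user_id, "name": "Ada", "active": True}

profile_json = get_user_profile(42)
print(profile_json, type(profile_json))  # {"id": 42, "name": "Ada", "active": true} <class 'str'>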

Cache Function Results

This decorator optimizes performance by storing and retrieving function results, eliminating redundant computations for repeated inputs and improving application responsiveness, especially for time-consuming computations.

def cached_result_decorator(func):
    result_cache = {}

    def wrapper(*args, **kwargs):
        cache_key = (*args, *kwargs.items())

        if cache_key in result_cache:
            return f"[FROM CACHE] {result_cache[cache_key]}"

        result = func(*args, **kwargs)
        result_cache[cache_key] = result

        return result

    return wrapper

# Example usage

@cached_result_decorator
def multiply_numbers(a, b):
    return f"Product = {a * b}"

# Call the decorated function multiple times
print(multiply_numbers(4, 5))  # Calculation is performed
print(multiply_numbers(4, 5))  # Result is retrieved from cache
print(multiply_numbers(5, 7))  # Calculation is performed
print(multiply_numbers(5, 7))  # Result is retrieved from cache
print(multiply_numbers(-3, 7))  # Calculation is performed
print(multiply_numbers(-3, 7))  # Result is retrieved from cache
Decorator that caches the return values of a function

Output:

Product = 20
[FROM CACHE] Product = 20
Product = 35
[FROM CACHE] Product = 35
Product = -21
[FROM CACHE] Product = -21

This code sample showcases a decorator that is designed to cache and reuse function call results efficiently.

The cached_result_decorator() function takes another function and returns a wrapper. Within this wrapper, a cache dictionary (result_cache) stores unique call parameters and their corresponding results.

Before executing the actual function, the wrapper() checks whether the result for the current parameters is already in the cache. If so, it retrieves and returns the cached result – otherwise, it calls the function, stores the result in the cache, and returns it.

The @cached_result_decorator syntax applies this caching logic to any function, such as multiply_numbers(). This ensures that, upon subsequent calls with the same arguments, the cached result is reused, preventing redundant calculations.

In essence, the decorator enhances functionality by optimizing performance through result caching.

Usage and Applications

Caching decorators like this are extremely useful in application development for optimizing the performance of repetitive function calls.

For example, consider a recommendation engine calling predictive model functions to generate user suggestions. get_user_recommendations() prepares the input data and feeds it into the model for every user request. Instead of re-running those computations, it can be decorated with @cached_result_decorator to introduce a caching layer.

Now, the first time unique user parameters are passed, the model runs and the result is cached. Subsequent calls with the same inputs immediately return the cached model outputs, skipping the model recalculation.

This drastically improves latency when responding to user requests by avoiding duplicate model inferences. You can monitor cache hit rates to justify scaling down model server infrastructure costs.

Decoupling such optimization concerns through caching decorators, rather than mixing them into function logic, improves modularity and readability and enables rapid performance gains. Caches can be configured and invalidated independently without intruding on business functions.
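
For plain in-memory caching like this, note that the standard library already provides functools.lru_cache, which adds a size bound and hit/miss statistics; the hand-rolled decorator above mainly shows how the mechanism works. A quick sketch:

from functools import lru_cache

@lru_cache(maxsize=128)               # keep up to the 128 most recent results
def multiply_numbers(a, b):
    return a * b

multiply_numbers(4, 5)                # computed
multiply_numbers(4, 5)                # served from the cache
print(multiply_numbers.cache_info())  # CacheInfo(hits=1, misses=1, maxsize=128, currsize=1)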

Validate Function Arguments Based on Condition

This one checks whether input arguments meet predefined criteria before execution, enhancing function reliability and preventing unexpected behavior. It is useful for parameters that must be, say, positive integers or non-empty strings.

def check_condition_positive(condition):
    def argument_validator(func):
        def validate_and_calculate(*args, **kwargs):
            if condition(*args, **kwargs):
                return func(*args, **kwargs)
            else:
                raise ValueError("Invalid arguments passed to the function")
        return validate_and_calculate
    return argument_validator

@check_condition_positive(lambda x: x > 0)
def compute_cubed_result(number):
    return number ** 3

print(compute_cubed_result(5))  # Output: 125
print(compute_cubed_result(-2))  # Raises ValueError: Invalid arguments passed to the function
Decorator to validate function arguments conditionally

Output:

125
Traceback (most recent call last):
  File "C:\Program Files\Sublime Text 3\test.py", line 16, in <module>
    print(compute_cubed_result(-2))  # Raises ValueError: Invalid arguments passed to the function
  File "C:\Program Files\Sublime Text 3\test.py", line 7, in validate_and_calculate
    raise ValueError("Invalid arguments passed to the function")
ValueError: Invalid arguments passed to the function

This code showcases how you can implement a decorator for validating function arguments.

check_condition_positive() is a decorator factory that generates an argument_validator() decorator. This validator, when applied with @check_condition_positive() above the compute_cubed_result() function, checks whether the condition (in this case, that the argument should be greater than 0) holds true for the passed arguments.

If the condition is met, the decorated function is executed – otherwise, a ValueError exception is raised.

This succinct example illustrates how decorators serve as a mechanism for validating function arguments before execution, ensuring adherence to specified conditions.

Usage and Applications

Such parameter validation decorators are extremely useful in applications to help enforce business rules, security constraints, and so on.

For example, an insurance claims processing system would have a function process_claim() that takes details like the claim ID, approver name, and so on. Certain business rules dictate who can approve claims.

Rather than cluttering the function logic itself, you can decorate it with @check_condition_positive(), passing a condition that validates whether the approver's role matches the claim amount. If a junior agent tries approving a large claim (thus violating the rules), the decorator catches it by raising an exception even before process_claim() executes.

Similarly, input data validation constraints for security and compliance can be imposed without touching individual functions. Decorators externally ensure that rule-violating arguments never reach the application's core logic.

Common validation patterns can be reused across multiple functions. This improves security and promotes separation of concerns by isolating constraints from the core logic flow in a modular way.
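
As a rough sketch of the insurance example, where process_claim(), the role names, and the 10,000 threshold are all hypothetical placeholders:

# Junior agents may only approve claims up to a hypothetical 10,000 limit
@check_condition_positive(
    lambda claim_id, approver_role, amount:
        approver_role == "senior" or amount <= 10_000
)
def process_claim(claim_id, approver_role, amount):
    return f"Claim {claim_id} approved for {amount}"

print(process_claim("C-101", "senior", 50_000))   # allowed
print(process_claim("C-102", "junior", 50_000))   # raises ValueError before the function runs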

Retry a Function Multiple Times on Failure

This decorator comes in handy when you want to automatically retry a function after a failure, enhancing resilience in situations involving transient failures. It is useful for external services or network requests prone to intermittent failures.

import sqlite3
import time

def retry_on_failure(max_attempts, retry_delay=1):
    def decorator(func):
        def wrapper(*args, **kwargs):
            for _ in range(max_attempts):
                try:
                    result = func(*args, **kwargs)
                    return result
                except Exception as error:
                    print(f"Error occurred: {error}. Retrying...")
                    time.sleep(retry_delay)
            raise Exception("Maximum attempts exceeded. Function failed.")

        return wrapper
    return decorator

@retry_on_failure(max_attempts=3, retry_delay=2)
def establish_database_connection():
    connection = sqlite3.connect("example.db")
    db_cursor = connection.cursor()
    db_cursor.execute("SELECT * FROM users")
    query_result = db_cursor.fetchall()
    db_cursor.close()
    connection.close()
    return query_result

try:
    retrieved_data = establish_database_connection()
    print("Data retrieved successfully:", retrieved_data)
except Exception as error_message:
    print(f"Failed to establish database connection: {error_message}")
Decorator to Retry a Function Multiple Times on Failure

Output:

Error occurred: no such table: users. Retrying...
Error occurred: no such table: users. Retrying...
Error occurred: no such table: users. Retrying...
Failed to establish database connection: Maximum attempts exceeded. Function failed.

This example introduces a decorator that is designed to retry function execution in the event of failures, with a specified maximum attempt count and a delay between retries.

retry_on_failure() is a decorator factory, taking parameters for the maximum retry count and the delay, and returning a decorator() that manages the retry logic.

Within the wrapper() function, the decorated function is executed in a loop, attempting up to the specified maximum number of times.

In case of an exception, it prints an error message, introduces a delay specified by retry_delay, and retries. If all attempts fail, it raises an exception indicating that the maximum attempts have been exceeded.

The @retry_on_failure() applied above establish_database_connection() integrates this retry logic, allowing up to 3 retries with a 2-second delay between each attempt in case the database connection encounters failures.

This demonstrates the utility of decorators in seamlessly incorporating retry capabilities without altering the core function code.

Usage and Application

This retry decorator can prove extremely useful in application development for adding resilience against temporary or intermittent errors.

For instance, consider a flight booking app that calls a payment gateway API process_payment() to handle customer transactions. Sometimes network blips or high load at the payment provider's end might cause transient errors in the API response.

Rather than immediately showing failures to customers, the process_payment() function can be decorated with @retry_on_failure to handle such scenarios implicitly. Now when a payment fails once, it will seamlessly retry sending the request up to 3 times before finally reporting the error if it persists.

This provides shielding from temporary hiccups without exposing users directly to unreliable infrastructure behavior. The application also stays reliably available even when dependent services fail occasionally.

The decorator helps confine the retry logic neatly without spreading it across the API code. Failures beyond the app's control are handled gracefully rather than immediately impacting users through application faults. This demonstrates how decorators lend greater resilience without complicating business logic.
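
A common refinement, not part of the example above, is exponential backoff: the delay grows after each failed attempt so a struggling service gets breathing room. A minimal sketch under that assumption:

import time

def retry_with_backoff(max_attempts, base_delay=1):
    def decorator(func):
        def wrapper(*args, **kwargs):
            for attempt in range(max_attempts):
                try:
                    return func(*args, **kwargs)
                except Exception as error:
                    if attempt == max_attempts - 1:
                        raise                            # re-raise after the last attempt
                    delay = base_delay * (2 ** attempt)  # 1s, 2s, 4s, ...
                    print(f"Error occurred: {error}. Retrying in {delay}s...")
                    time.sleep(delay)
        return wrapper
    return decorator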

Enforce Rate Limits on a Function

By controlling how frequently a function can be called, the Enforce Rate Limits decorator ensures effective resource management and guards against misuse. It is especially helpful in scenarios like API abuse prevention or resource conservation where limiting function calls is essential.

import time

def rate_limiter(max_allowed_calls, reset_period_seconds):
    def decorate_rate_limited_function(original_function):
        calls_count = 0
        last_reset_time = time.time()

        def wrapper_function(*args, **kwargs):
            nonlocal calls_count, last_reset_time
            elapsed_time = time.time() - last_reset_time

            # If the elapsed time is greater than the reset period, reset the call count
            if elapsed_time > reset_period_seconds:
                calls_count = 0
                last_reset_time = time.time()

            # Check if the call count has reached the maximum allowed limit
            if calls_count >= max_allowed_calls:
                raise Exception("Rate limit exceeded. Please try again later.")

            # Increment the call count
            calls_count += 1

            # Call the original function
            return original_function(*args, **kwargs)

        return wrapper_function
    return decorate_rate_limited_function

# Allow a maximum of 6 API calls within 10 seconds.
@rate_limiter(max_allowed_calls=6, reset_period_seconds=10)
def make_api_call():
    print("API call executed successfully...")

# Make API calls
for _ in range(8):
    try:
        make_api_call()
    except Exception as error:
        print(f"Error occurred: {error}")
time.sleep(10)
make_api_call()
Decorator to Enforce Rate Limits on a Function

Output:

API call executed successfully...
API call executed successfully...
API call executed successfully...
API call executed successfully...
API call executed successfully...
API call executed successfully...
Error occurred: Rate limit exceeded. Please try again later.
Error occurred: Rate limit exceeded. Please try again later.
API call executed successfully...

This code showcases the implementation of a rate-limiting mechanism for function calls using a decorator.

The rate_limiter() function, configured with a maximum number of calls and a period in seconds after which the count resets, serves as the core of the rate-limiting logic. The decorator, decorate_rate_limited_function(), employs a wrapper to manage the rate limits, resetting the count once the period has elapsed. It checks whether the count has reached the maximum allowed, and then either raises an exception or increments the count and executes the function accordingly.

Applied to make_api_call() using @rate_limiter(), it restricts the function to 6 calls within any 10-second window. This introduces rate limiting without altering the function logic, ensuring that calls adhere to the limits and preventing excessive use within set intervals.

Usage and Application

Rate limiting decorators like this are very useful in application development for controlling usage of APIs and preventing abuse.

For instance, a travel booking application may rely on a third-party Flight Search API for checking live seat availability across airlines. While most usage is legitimate, some users might call this API excessively, degrading overall service performance.

By decorating the API integration module with @rate_limiter(100, 60), the application can restrict excessive calls internally, too. This would limit the booking module to making only 100 Flight API calls per minute. Additional calls get rejected immediately by the decorator without even reaching the actual API.

This protects the downstream service from overuse, enabling a fairer distribution of capacity for normal application functionality.

Decorators provide easy rate control for both internal and external-facing APIs without altering functional code. You no longer have to weave usage-quota checks into business logic while still safeguarding services and infrastructure and bounding adoption risk – all thanks to application-side controls using wrappers.
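
A small sketch of that flight-search scenario, where search_flights() and its body are hypothetical stand-ins for the real API integration:

@rate_limiter(max_allowed_calls=100, reset_period_seconds=60)
def search_flights(origin, destination):
    # Hypothetical placeholder for the real Flight Search API call
    return f"Live seats for {origin} -> {destination}"

try:
    print(search_flights("LOS", "JFK"))
except Exception as error:
    print(f"Error occurred: {error}")  # raised once the 100-calls-per-minute budget is spent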

Handle Exceptions and Provide Default Response

The Handle Exceptions decorator is a safety net for functions, gracefully handling exceptions and providing default responses when they occur. It shields the application from crashing due to unforeseen circumstances, ensuring smooth operation.

def handle_exceptions(default_response_msg):
    def exception_handler_decorator(func):
        def decorated_function(*args, **kwargs):
            try:
                # Call the original function
                return func(*args, **kwargs)
            except Exception as error:
                # Handle the exception and provide the default response
                print(f"Exception occurred: {error}")
                return default_response_msg
        return decorated_function
    return exception_handler_decorator

# Example usage
@handle_exceptions(default_response_msg="An error occurred!")
def divide_numbers_safely(dividend, divisor):
    return dividend / divisor

# Call the decorated function
result = divide_numbers_safely(7, 0)  # This will raise a ZeroDivisionError
print("Result:", result)
Decorator that Handles Exceptions and Provides a Default Response

Output:

Exception occurred: division by zero
Result: An error occurred!

This code showcases exception handling in functions using decorators.

The handle_exceptions() decorator factory, accepting a default response, produces exception_handler_decorator(). This decorator, when applied to a function, attempts to execute the original function. If an exception arises, it prints the error details and returns the specified default response.

The @handle_exceptions() syntax above a function incorporates this exception-handling logic. For instance, in divide_numbers_safely(), division by zero triggers an exception, which the decorator catches, preventing a crash and returning the default "An error occurred!" response.

Essentially, these decorators adeptly capture exceptions in functions, providing a seamless means of incorporating handling logic and preventing crashes.

Usage and Applications

Exception handling decorators greatly simplify application error management and help hide unreliable behavior from users.

For example, an e-commerce website may rely on payment, inventory, and shipping services to complete orders. Instead of complicated exception blocks everywhere, a core order processing function like place_order() can be decorated to gain resilience.

The @handle_exceptions decorator applied above it would absorb any third-party service outage or intermittent issue during order finalization. On exception, it logs errors for debugging while serving a graceful "Order failed, please try again later" message to the customer. This avoids exposing complex failure root causes like payment timeouts to the end user.

Decorators protect customers from unreliable service issues without altering business code. They provide friendly default responses when errors happen, which improves the customer experience.

Also, decorators give developers visibility into these errors behind the scenes, so they can focus on systematically fixing the root causes of failures. This separation of concerns through decorators reduces complexity. Customers see more reliability, and you get actionable insights into faults – all while keeping business logic untouched.
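
A rough sketch of that order flow, where place_order() and charge_payment() are hypothetical placeholders for the real services:

def charge_payment(order_id):
    # Hypothetical downstream call that can fail intermittently
    raise TimeoutError("payment gateway timed out")

@handle_exceptions(default_response_msg="Order failed, please try again later")
def place_order(order_id):
    charge_payment(order_id)
    return f"Order {order_id} placed successfully"

print(place_order("ORD-1001"))
# Exception occurred: payment gateway timed out
# Order failed, please try again later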

Enforce Type Checking on Function Arguments

The Enforce Type Checking decorator ensures data integrity by verifying that function arguments conform to the specified data types, preventing type-related errors and promoting code reliability. It is particularly useful in situations where strict data type adherence is crucial.

import inspect

def enforce_type_checking(func):
    def type_checked_wrapper(*args, **kwargs):
        # Get the function signature and parameter names
        function_signature = inspect.signature(func)
        function_parameters = function_signature.parameters

        # Iterate over the positional arguments
        for i, arg_value in enumerate(args):
            parameter_name = list(function_parameters.keys())[i]
            parameter_type = function_parameters[parameter_name].annotation
            if not isinstance(arg_value, parameter_type):
                raise TypeError(f"Argument '{parameter_name}' must be of type '{parameter_type.__name__}'")

        # Iterate over the keyword arguments
        for keyword_name, arg_value in kwargs.items():
            parameter_type = function_parameters[keyword_name].annotation
            if not isinstance(arg_value, parameter_type):
                raise TypeError(f"Argument '{keyword_name}' must be of type '{parameter_type.__name__}'")

        # Call the original function
        return func(*args, **kwargs)

    return type_checked_wrapper

# Example usage
@enforce_type_checking
def multiply_numbers(factor_1: int, factor_2: int) -> int:
    return factor_1 * factor_2

# Call the decorated function
result = multiply_numbers(5, 7)  # No type errors, returns 35
print("Result:", result)

result = multiply_numbers("5", 7)  # Type error: 'factor_1' must be of type 'int'
Decorator that Enforces Type Checking on Function Arguments

Output:

Result: 35
Traceback (most recent call last):
  File "C:\Program Files\Sublime Text 3\test.py", line 36, in <module>
    result = multiply_numbers("5", 7)  # Type error: 'factor_1' must be of type 'int'
  File "C:\Program Files\Sublime Text 3\test.py", line 14, in type_checked_wrapper
    raise TypeError(f"Argument '{parameter_name}' must be of type '{parameter_type.__name__}'")
TypeError: Argument 'factor_1' must be of type 'int'

The enforce_type_checking decorator validates whether the arguments passed to a function match the specified type annotations.

Inside type_checked_wrapper, it examines the signature of the decorated function, retrieves the parameter names and type annotations, and ensures that the provided arguments align with the expected types. This includes checking positional arguments against their order, and keyword arguments against parameter names. If a type mismatch is detected, a TypeError is raised.

This decorator is exemplified by its application to the multiply_numbers function, where the arguments are annotated as integers. Attempting to pass a string results in an exception, while passing integers executes the function without issues. This type checking is enforced without altering the original function body.

Usage and Applications

Type checking decorators are applied to detect issues early and improve reliability. For example, consider a web application backend with a data access layer function get_user_data() annotated to expect integer user IDs. Its queries would fail if string IDs were passed in from frontend code.

Rather than adding explicit checks and raising exceptions locally, you can use this decorator. Now any upstream or client code passing invalid types will be caught automatically during function execution. The decorator compares annotations against argument types and raises errors accordingly, before anything reaches the database layer.

This runtime protection through decorators ensures that only valid data shapes flow across layers, preventing obscure errors. Type safety is imposed without extra checks cluttering the cleaner logic.
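
One caveat about the implementation above: parameters without a type annotation report inspect.Parameter.empty as their annotation, so the isinstance() check would reject them with a confusing error. A variant that simply skips unannotated parameters, as a sketch:

import inspect

def enforce_type_checking(func):
    def type_checked_wrapper(*args, **kwargs):
        parameters = inspect.signature(func).parameters

        for name, value in zip(parameters, args):            # positional arguments
            annotation = parameters[name].annotation
            if annotation is inspect.Parameter.empty:
                continue                                      # no annotation -> nothing to enforce
            if not isinstance(value, annotation):
                raise TypeError(f"Argument '{name}' must be of type '{annotation.__name__}'")

        for name, value in kwargs.items():                    # keyword arguments
            annotation = parameters[name].annotation
            if annotation is not inspect.Parameter.empty and not isinstance(value, annotation):
                raise TypeError(f"Argument '{name}' must be of type '{annotation.__name__}'")

        return func(*args, **kwargs)

    return type_checked_wrapper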

Measure Memory Usage of a Function

When it comes to large, dataset-intensive applications or resource-constrained environments, the Measure Memory Usage decorator is a memory detective that offers insights into a function's memory consumption, helping you optimize memory usage.

import tracemalloc

def measure_memory_usage(target_function):
    def wrapper(*args, **kwargs):
        tracemalloc.start()

        # Call the original function
        result = target_function(*args, **kwargs)

        snapshot = tracemalloc.take_snapshot()
        top_stats = snapshot.statistics("lineno")

        # Print the top memory-consuming lines
        print(f"Memory usage of {target_function.__name__}:")
        for stat in top_stats[:5]:
            print(stat)

        # Return the result
        return result

    return wrapper

# Example usage
@measure_memory_usage
def calculate_factorial_recursive(number):
    if number == 0:
        return 1
    else:
        return number * calculate_factorial_recursive(number - 1)

# Call the decorated function
result_factorial = calculate_factorial_recursive(3)
print("Factorial:", result_factorial)

Output:

Memory usage of calculate_factorial_recursive:
C:\Program Files\Sublime Text 3\test.py:29: size=1552 B, count=6, average=259 B
C:\Program Files\Sublime Text 3\test.py:8: size=896 B, count=3, average=299 B
C:\Program Files\Sublime Text 3\test.py:10: size=416 B, count=1, average=416 B
Memory usage of calculate_factorial_recursive:
C:\Program Files\Sublime Text 3\test.py:29: size=1552 B, count=6, average=259 B
C:\Program Files\Python310\lib\tracemalloc.py:226: size=880 B, count=3, average=293 B
C:\Program Files\Sublime Text 3\test.py:8: size=832 B, count=2, average=416 B
C:\Program Files\Python310\lib\tracemalloc.py:173: size=800 B, count=2, average=400 B
C:\Program Files\Python310\lib\tracemalloc.py:505: size=592 B, count=2, average=296 B
Memory usage of calculate_factorial_recursive:
C:\Program Files\Sublime Text 3\test.py:29: size=1440 B, count=4, average=360 B
C:\Program Files\Python310\lib\tracemalloc.py:535: size=1240 B, count=3, average=413 B
C:\Program Files\Python310\lib\tracemalloc.py:67: size=1216 B, count=19, average=64 B
C:\Program Files\Python310\lib\tracemalloc.py:193: size=1104 B, count=23, average=48 B
C:\Program Files\Python310\lib\tracemalloc.py:226: size=880 B, count=3, average=293 B
Memory usage of calculate_factorial_recursive:
C:\Program Files\Python310\lib\tracemalloc.py:558: size=1416 B, count=29, average=49 B
C:\Program Files\Python310\lib\tracemalloc.py:67: size=1408 B, count=22, average=64 B
C:\Program Files\Sublime Text 3\test.py:29: size=1392 B, count=3, average=464 B
C:\Program Files\Python310\lib\tracemalloc.py:535: size=1240 B, count=3, average=413 B
C:\Program Files\Python310\lib\tracemalloc.py:226: size=832 B, count=2, average=416 B
Factorial: 6

This code showcases a decorator, measure_memory_usage, designed to measure the memory consumption of functions.

The decorator, when applied, starts memory tracing before the original function is called. Once the function completes its execution, a memory snapshot is taken and the top five lines consuming the most memory are printed.

Illustrated by the example of calculate_factorial_recursive(), the decorator lets you monitor memory usage without altering the function itself, offering valuable insights for optimization purposes.

In essence, it provides a straightforward means to assess and analyze the memory consumption of any function during its runtime.

Usage and Applications

Memory measurement decorators like these are extremely valuable in application development for identifying and troubleshooting memory bloat or leak issues.

For example, consider a data streaming pipeline with critical ETL components like transform_data() that process large volumes of records. Though the process seems fine during regular loads, high-volume data like Black Friday sales might cause excessive memory usage and crashes.

Rather than manual debugging, decorating processors with @measure_memory_usage can reveal useful insights. It will print the top memory-intensive lines during peak data flow without any code change.

Your goal should be to pinpoint the specific stages that are rapidly consuming memory and address them through better algorithms or optimization.

Such decorators help bake diagnostic views into critical paths to recognize abnormal consumption trends early. Instead of delayed production issues, problems can be identified preemptively through profiling before launch. They reduce debugging headaches and minimize runtime failures via easier instrumentation for memory monitoring.
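
One caveat worth noting about the decorator above: it starts tracing on every call and never calls tracemalloc.stop(), and with a recursive function each nested call re-enters the wrapper, which is why the sample output prints several reports. A sketch of a variant that traces only the outermost call and always shuts tracing down:

import tracemalloc

def measure_memory_usage(target_function):
    def wrapper(*args, **kwargs):
        if tracemalloc.is_tracing():               # nested/recursive call: don't restart tracing
            return target_function(*args, **kwargs)

        tracemalloc.start()
        try:
            result = target_function(*args, **kwargs)
            snapshot = tracemalloc.take_snapshot()
            print(f"Memory usage of {target_function.__name__}:")
            for stat in snapshot.statistics("lineno")[:5]:
                print(stat)
            return result
        finally:
            tracemalloc.stop()                     # always release the tracing overhead
    return wrapper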

Cache Function Results with Expiration Time

Designed for data that can go stale, the Cache Function Results with Expiration Time decorator combines caching with a time-based expiration feature, guaranteeing that cached data is regularly refreshed to prevent staleness and maintain relevance.

import time

def cached_function_with_expiry(expiry_time):
    def decorator(original_function):
        cache = {}

        def wrapper(*args, **kwargs):
            key = (*args, *kwargs.items())

            if key in cache:
                cached_value, cached_timestamp = cache[key]

                if time.time() - cached_timestamp < expiry_time:
                    return f"[CACHED] - {cached_value}"

            result = original_function(*args, **kwargs)
            cache[key] = (result, time.time())

            return result

        return wrapper

    return decorator

# Example usage

@cached_function_with_expiry(expiry_time=5)  # Cache expiry time set to 5 seconds
def calculate_product(x, y):
    return f"PRODUCT - {x * y}"

# Call the decorated function multiple times
print(calculate_product(23, 5))  # Calculation is performed
print(calculate_product(23, 5))  # Result is retrieved from cache
time.sleep(5)
print(calculate_product(23, 5))  # Calculation is performed (cache expired)

Output:

PRODUCT - 115
[CACHED] - PRODUCT - 115
PRODUCT - 115

This code showcases a caching decorator with an automatic cache expiration time.

The function cached_function_with_expiry() generates a decorator that, when applied, uses a dictionary called cache to store function results and their corresponding timestamps. The wrapper() function checks whether the result for the current arguments is in the cache. If it is present and within the expiry time, it returns the cached result – otherwise, it calls the function.

Illustrated using calculate_product(), the decorator initially calculates and caches the result. Subsequent calls retrieve the cached result until the expiry period passes, at which point the cache is refreshed by a recalculation.

In essence, this implementation prevents redundant calculations while automatically refreshing results after the specified expiry period.

Usage and Applications

Automatic cache expiry decorators are very useful in application development for optimizing the performance of data fetching modules.

For example, consider a travel website that calls a backend API get_flight_prices() to show live prices to users. While caches reduce calls to expensive flight data sources, static caching leads to displaying stale prices.

Instead, you can use @cached_function_with_expiry(60) to auto-refresh every minute. Now the first user call fetches live prices and caches them, while subsequent requests within a 60-second window efficiently reuse the cached pricing. But the cache automatically invalidates after the expiry period to guarantee fresh data.

This lets you optimize flows without worrying about corner cases related to outdated data. The decorator handles the scenario reliably, keeping caches in sync with upstream changes through configurable refreshing. There is no redundant recalculation, and you still deliver the most up-to-date information possible to end users. Common caching patterns get packaged conveniently for reuse across the codebase with customized expiry rules.
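
A small sketch of that pricing scenario, where get_flight_prices() and its return value are hypothetical placeholders for the real backend call:

@cached_function_with_expiry(expiry_time=60)   # refresh prices at most once per minute
def get_flight_prices(route):
    # Hypothetical placeholder for an expensive call to the flight data source
    return {"route": route, "price": 499}

print(get_flight_prices("LOS-JFK"))   # live fetch, result cached for 60 seconds
print(get_flight_prices("LOS-JFK"))   # served from the cache until it expires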

Conclusion

Python decorators continue to see widespread usage in application development for cleanly inserting common cross-cutting concerns. Authentication, monitoring, and access restrictions are some standard examples of use cases that rely on decorators in frameworks like Django and Flask.

The popularity of web APIs has also led to widespread adoption of rate limiting and caching decorators for performance.

Decorators have actually been around since early Python releases. Guido van Rossum wrote about enhancement with decorators in a 1990 paper on Python. Later, when the function decorator syntax stabilized in Python 2.4 in 2004, it opened the doors for more elegant, declarative solutions. From web to data science, they continue to empower abstraction and modularity across Python domains.

The examples in this handbook only scratch the surface of what custom-tailored decorators can enable. Based on any specific goal like security, throttling user requests, transparent encryption, and so on, you can create innovative decorators to address your needs. Structuring logic processing pipelines as a composition of specialized single-responsibility decorators also encourages reuse over redundancy.

Understanding decorators not only improves your development skills but also unlocks ways to shape program behaviour flexibly. I encourage you to review common needs across your codebases that can be abstracted into standalone decorators. With some practice, it becomes easy to spot cross-cutting concerns and extend functions efficiently without breaking a sweat.

If you liked this lesson and want to explore more insightful tech content, including Python, Django, and System Design reads, check out my Blog. You can also view my projects with proof of work on GitHub and connect with me on LinkedIn for a chat.
