Overview
Python’s asyncio runs on a single thread, but tasks switch whenever they hit an `await` (I/O wait) point. If an update to a shared variable or resource spans one of those switch points, several tasks can interleave their reads and writes and cause a race condition.
To prevent this, we use `asyncio.Lock()`. It ensures that only one task can access a specific resource at a time, keeping your data accurate and consistent.
Specifications (Input/Output)
- Input: Multiple coroutines that update shared resources (variables, files, databases, etc.). Each process includes an `await asyncio.sleep()` to intentionally trigger contention.
- Output: A shared resource with a correct, consistent value after all tasks finish.
- Logic:
  - Without a lock, the final result will be wrong (e.g., counting to 95 instead of 100); see the sketch after this list.
  - With a lock, the result matches the expected count.
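To make the “without a lock” case concrete, here is a minimal sketch with a hypothetical shared counter (not part of the bank-account example later in this section): every task reads the counter before any of them writes back, so almost all increments are lost.

```python
import asyncio

counter = 0  # shared resource

async def unsafe_increment():
    global counter
    current = counter        # 1. read the shared value
    await asyncio.sleep(0)   # 2. yield to the event loop (context switch)
    counter = current + 1    # 3. write back, clobbering other tasks' updates

async def main():
    await asyncio.gather(*(unsafe_increment() for _ in range(100)))
    # Far below 100: every task read 0 before any of them wrote back
    print(f"Counter: {counter} (Expected: 100)")

asyncio.run(main())
```

Wrapping steps 1 to 3 in `async with lock:` (as shown below) restores the expected count.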
Basic Usage
The standard way to implement exclusive control is to wrap the “critical section” (the part that shouldn’t run at the same time) with async with lock:.
Syntax and Meaning
| Syntax | Meaning |
| --- | --- |
| `lock = asyncio.Lock()` | Creates a lock object (mutex). The initial state is “unlocked.” |
| `async with lock:` | Acquires the lock, runs the block, and releases it automatically. Other tasks wait if the lock is held. |
| `await lock.acquire()` | Manually acquires the lock (using `async with` is recommended). |
| `lock.release()` | Manually releases the lock. |
```python
# Basic pattern
import asyncio

lock = asyncio.Lock()

async def safe_operation():
    async with lock:
        # Only one task can execute here at a time
        print("Locked processing...")
```
Full Code
This example creates a “Bank Account” class. It simulates a situation where multiple ATMs (tasks) try to “check balance and deposit” at the same time. The lock prevents other tasks from interfering between reading and writing the balance.
```python
import asyncio


class BankAccount:
    def __init__(self, balance: int = 0):
        self.balance = balance
        # Create a lock object for exclusive control
        self.lock = asyncio.Lock()

    async def deposit(self, amount: int, task_name: str):
        """
        Method to deposit money.
        Uses a lock to maintain consistency between reading and writing.
        """
        print(f"[{task_name}] Waiting for lock...")

        # --- Start Critical Section ---
        async with self.lock:
            print(f"[{task_name}] Lock acquired. Starting process.")

            # 1. Get current balance
            current_value = self.balance
            print(f" [{task_name}] Read balance: {current_value}")

            # 2. Simulate a context switch (other tasks could run here)
            await asyncio.sleep(0.1)

            # 3. Calculate and update
            new_value = current_value + amount
            self.balance = new_value
            print(f" [{task_name}] Updated balance: {new_value}")
        # --- End Critical Section (Lock is automatically released) ---

        print(f"[{task_name}] Process finished. Lock released.")


async def main():
    # Create an account (Initial balance 0)
    account = BankAccount()

    print("--- Starting concurrent deposit processing ---")

    # Three tasks try to deposit 100 yen at the same time
    # Without a lock, they might all read "0" and the final balance would be 100
    await asyncio.gather(
        account.deposit(100, "ATM-A"),
        account.deposit(100, "ATM-B"),
        account.deposit(100, "ATM-C")
    )

    print("--- All processes finished ---")
    print(f"Final Balance: {account.balance} (Expected: 300)")


if __name__ == "__main__":
    asyncio.run(main())
```
Sample Execution Result
The logs show that each task takes turns acquiring the lock and processing.
```
--- Starting concurrent deposit processing ---
[ATM-A] Waiting for lock...
[ATM-B] Waiting for lock...
[ATM-C] Waiting for lock...
[ATM-A] Lock acquired. Starting process.
 [ATM-A] Read balance: 0
 [ATM-A] Updated balance: 100
[ATM-A] Process finished. Lock released.
[ATM-B] Lock acquired. Starting process.
 [ATM-B] Read balance: 100
 [ATM-B] Updated balance: 200
[ATM-B] Process finished. Lock released.
[ATM-C] Lock acquired. Starting process.
 [ATM-C] Read balance: 200
 [ATM-C] Updated balance: 300
[ATM-C] Process finished. Lock released.
--- All processes finished ---
Final Balance: 300 (Expected: 300)
```
Customization Points
Lock Granularity (Range)
Keep the range of async with lock: as small as possible. If you include unrelated heavy tasks (like complex math or network calls) inside the lock, you lose the benefits of concurrency, and your program will slow down as tasks wait unnecessarily.
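For instance, here is a sketch (the `fetch_amount` helper and the shared `total` are hypothetical) that keeps the slow I/O outside the lock and holds it only for the read-modify-write of shared state:

```python
import asyncio

lock = asyncio.Lock()
total = 0

async def fetch_amount() -> int:
    await asyncio.sleep(0.5)   # stands in for a slow network call
    return 100

async def add_to_total():
    global total
    # Slow, independent work happens OUTSIDE the lock so tasks can overlap it
    amount = await fetch_amount()

    # The lock protects only the read-modify-write that spans an await
    async with lock:
        current = total
        await asyncio.sleep(0.01)   # e.g., persisting the value somewhere
        total = current + amount

async def main():
    await asyncio.gather(add_to_total(), add_to_total(), add_to_total())
    print(f"Total: {total}")   # 300

asyncio.run(main())
```

If the fetch sat inside the lock, the three tasks would run one after another (roughly 3 × 0.5 s); kept outside, the fetches overlap and only the short updates are serialized.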
Global Variables vs Class Members
While a global lock works for simple scripts, storing it as a class member (self.lock) is better. This allows different account objects to have their own independent locks, so ATM-A using Account-1 doesn’t stop ATM-B from using Account-2.
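As a sketch reusing the `BankAccount` class from the full code above (assumed to be defined in the same module): deposits into different accounts proceed independently, while deposits into the same account still take turns.

```python
import asyncio

async def main():
    # BankAccount is the class from the "Full Code" section;
    # each instance owns its own asyncio.Lock()
    account_1 = BankAccount()
    account_2 = BankAccount()

    # ATM-A and ATM-B target different accounts, so their locks never conflict;
    # only the two deposits into account_1 are serialized.
    await asyncio.gather(
        account_1.deposit(100, "ATM-A"),
        account_2.deposit(100, "ATM-B"),
        account_1.deposit(100, "ATM-C"),
    )
    print(account_1.balance, account_2.balance)   # 200 100

asyncio.run(main())
```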
Important Notes
Deadlock
asyncio.Lock is not reentrant, so a task that tries to acquire a lock it already holds will wait forever. The same kind of deadlock happens when two tasks acquire two locks in opposite orders. If you must use multiple locks, always acquire them in a consistent order, as in the sketch below.
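A sketch of the ordering rule with two hypothetical locks: both coroutines acquire `lock_a` before `lock_b`, so neither can end up holding one lock while waiting forever for the other.

```python
import asyncio

lock_a = asyncio.Lock()
lock_b = asyncio.Lock()

async def transfer_1():
    # Always acquire in the same order: lock_a, then lock_b
    async with lock_a:
        await asyncio.sleep(0.1)
        async with lock_b:
            print("transfer_1 done")

async def transfer_2():
    # Same order here. Acquiring lock_b first instead could deadlock:
    # transfer_1 would hold lock_a and wait for lock_b, while transfer_2
    # held lock_b and waited for lock_a.
    async with lock_a:
        await asyncio.sleep(0.1)
        async with lock_b:
            print("transfer_2 done")

async def main():
    await asyncio.gather(transfer_1(), transfer_2())

asyncio.run(main())
```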
Await Placement
When you await something inside a lock block, the lock remains held, and other tasks are forced to wait for that entire duration. Conversely, if a block contains no await points, the event loop cannot switch tasks in the middle of it, so you may not need a lock there at all (see the sketch below).
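A minimal sketch with a hypothetical counter: the increment has no `await` in the middle, so it stays correct without a lock (this only holds for pure asyncio code on a single thread).

```python
import asyncio

counter = 0

async def safe_enough_increment():
    global counter
    # No await between the read and the write, so the event loop cannot
    # switch to another task in the middle: no lock needed here.
    counter += 1
    await asyncio.sleep(0)   # switching AFTER the update is harmless

async def main():
    await asyncio.gather(*(safe_enough_increment() for _ in range(100)))
    print(counter)   # 100

asyncio.run(main())
```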
Not Thread-Safe
asyncio.Lock is only for preventing contention within an asyncio event loop. It cannot protect resources from being accessed by a different thread running with the threading module. For OS-level threads, you need threading.Lock.
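A sketch of the distinction, using a hypothetical blocking worker run in the default thread pool via `run_in_executor`: data touched from both the event loop and worker threads needs `threading.Lock`.

```python
import asyncio
import threading

results = []
thread_lock = threading.Lock()   # protects `results` across OS threads

def blocking_worker(n: int):
    # Runs in a worker thread, so asyncio.Lock would not help here
    with thread_lock:
        results.append(n)

async def main():
    loop = asyncio.get_running_loop()
    # Run the blocking function in the default thread pool
    await asyncio.gather(
        *(loop.run_in_executor(None, blocking_worker, i) for i in range(5))
    )
    # This acquire is brief, so blocking the event loop here is acceptable
    with thread_lock:
        print(results)

asyncio.run(main())
```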
Advanced Usage
This pattern uses lock.acquire() and lock.release() manually. While async with is preferred, this is useful when you need to hold a lock across multiple functions.
```python
import asyncio

lock = asyncio.Lock()


async def manual_lock_example():
    print("Acquiring lock...")
    await lock.acquire()
    # Use try-finally to ensure the lock is released even if an error occurs
    try:
        print("Executing critical section")
        await asyncio.sleep(0.1)
    finally:
        print("Releasing lock")
        lock.release()


if __name__ == "__main__":
    asyncio.run(manual_lock_example())
```
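For example, here is a sketch with hypothetical `begin_update` / `finish_update` helpers that acquire the lock in one function and release it in another, something `async with` cannot express directly:

```python
import asyncio

lock = asyncio.Lock()

async def begin_update(name: str):
    # Acquire here; the caller is responsible for calling finish_update()
    await lock.acquire()
    print(f"[{name}] update started (lock held)")

async def finish_update(name: str):
    print(f"[{name}] update finished (lock released)")
    lock.release()

async def worker(name: str):
    await begin_update(name)
    try:
        await asyncio.sleep(0.1)   # work that spans the two helper functions
    finally:
        await finish_update(name)  # always release, even on error

async def main():
    await asyncio.gather(worker("A"), worker("B"))

asyncio.run(main())
```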
Summary
Locks are essential for managing shared states in asyncio to prevent data inconsistency.
- Use Cases: Updating databases, writing to files, or aggregating global variables where multiple tasks perform read-write operations.
- Key Point: Wrap only the truly conflicting parts with `async with lock:`.
- Warning: Unnecessary locks will slow down your processing.
asyncio makes it easy to write concurrent code, but you must always be careful about data contention.
