Top 50 Python Interview Questions & Answers
From fundamentals to advanced concepts — everything you need to ace your Python interview in 2026.
Beginner Questions
Q1 – Q17
Python is a high-level, general-purpose, interpreted programming language created by Guido van Rossum and first released in 1991. It emphasizes code readability and simplicity.
Key Features:
- Easy to learn & read — clean, English-like syntax
- Interpreted — runs line by line without prior compilation
- Dynamically typed — no need to declare variable types
- Multi-paradigm — supports OOP, functional, and procedural styles
- Extensive standard library — "batteries included" philosophy
- Cross-platform — runs on Windows, Linux, macOS
- Large ecosystem — PyPI hosts 500,000+ packages
- Memory management — automatic garbage collection
Python is both compiled and interpreted. The process has two stages:
- Compilation to bytecode: Python source (.py) is first compiled to bytecode (.pyc files in __pycache__).
- Interpretation: The Python Virtual Machine (PVM) then executes that bytecode, one instruction at a time, at runtime.
This is why Python is usually called interpreted — the compilation step is hidden and automatic. Compared to truly compiled languages like C++, Python's interpretation adds overhead, making it slower for CPU-bound tasks but much faster to develop with.
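You can see the compilation step yourself with the standard dis module, which disassembles a function into the bytecode CPython actually runs. A quick sketch (exact opcode names vary by Python version):

```python
import dis

def add(a, b):
    return a + b

# Print the human-readable bytecode listing
dis.dis(add)

# Opcode names are also available programmatically
ops = [instr.opname for instr in dis.Bytecode(add)]
print(ops)  # opcode names vary by CPython version
```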
| Mutable | Immutable |
|---|---|
| Can be changed after creation | Cannot be changed after creation |
| list, dict, set, bytearray | int, float, str, tuple, frozenset, bytes |
| Same object is modified in-place | Any "change" creates a new object |
# Mutable — list changes in place
lst = [1, 2, 3]
print(id(lst)) # e.g. 140234567
lst.append(4)
print(id(lst)) # same id — same object
# Immutable — string creates new object
s = "hello"
print(id(s)) # e.g. 140234999
s += " world"
print(id(s)) # different id — new object
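A classic consequence of mutability is the mutable-default-argument pitfall: a default list is created once, at definition time, and shared across every call. A minimal sketch:

```python
def bad(item, bucket=[]):      # default list is created ONCE, at def time
    bucket.append(item)
    return bucket

print(bad(1))  # [1]
print(bad(2))  # [1, 2] — the same list is shared across calls!

def good(item, bucket=None):   # idiomatic fix: None sentinel
    if bucket is None:
        bucket = []            # fresh list per call
    bucket.append(item)
    return bucket

print(good(1))  # [1]
print(good(2))  # [2]
```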
Common pitfall: a mutable default argument (e.g. def f(x, lst=[])) is created once at definition time and shared across calls — use None as default and create the object inside the function.
| Feature | List | Tuple | Set |
|---|---|---|---|
| Syntax | [1,2,3] | (1,2,3) | {1,2,3} |
| Ordered | ✅ Yes | ✅ Yes | ❌ No |
| Mutable | ✅ Yes | ❌ No | ✅ Yes |
| Duplicates | ✅ Allowed | ✅ Allowed | ❌ Unique only |
| Indexable | ✅ Yes | ✅ Yes | ❌ No |
| Use case | General data | Fixed records | Membership tests |
Sets support fast O(1) membership testing using hashing, making them ideal for deduplication and intersection/union operations.
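The set operations mentioned above, sketched concretely:

```python
a = {1, 2, 3, 4}
b = {3, 4, 5}

print(3 in a)   # True — O(1) average membership test
print(a & b)    # {3, 4} — intersection
print(a | b)    # {1, 2, 3, 4, 5} — union
print(a - b)    # {1, 2} — difference

# Deduplicate a list (order is not preserved)
print(set([1, 1, 2, 3]))  # {1, 2, 3}
```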
- Numeric: int, float, complex
- Sequence: str, list, tuple, range
- Mapping: dict
- Set types: set, frozenset
- Boolean: bool (True/False)
- Binary: bytes, bytearray, memoryview
- None type: NoneType
Python uses duck typing: the type of an object is determined by its behavior (methods it supports), not its declared type. Use type() or isinstance() to check types at runtime.
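A minimal duck-typing sketch (the class names here are illustrative):

```python
class Duck:
    def speak(self):
        return "Quack"

class Robot:
    def speak(self):
        return "Beep"

def make_noise(thing):
    return thing.speak()   # only the behavior matters, not the declared type

print(make_noise(Duck()))   # Quack
print(make_noise(Robot()))  # Beep

# Explicit runtime checks when you need them
print(type(42))             # <class 'int'>
print(isinstance(42, int))  # True
```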
List comprehension provides a concise, readable way to create lists from iterables, often replacing multi-line for loops.
Syntax: [expression for item in iterable if condition]
# Traditional loop
squares = []
for x in range(10):
if x % 2 == 0:
squares.append(x ** 2)
# List comprehension — same result
squares = [x**2 for x in range(10) if x % 2 == 0]
# [0, 4, 16, 36, 64]
# Nested comprehension (matrix flatten)
flat = [num for row in matrix for num in row]
Comprehensions are usually faster and clearer than equivalent for loops, but avoid nesting more than 2 levels deep — readability suffers.
*args allows a function to accept any number of positional arguments (collected as a tuple). **kwargs allows any number of keyword arguments (collected as a dict).
def demo(*args, **kwargs):
print(args) # tuple of positional args
print(kwargs) # dict of keyword args
demo(1, 2, 3, name="Alice", age=30)
# (1, 2, 3)
# {'name': 'Alice', 'age': 30}
# Unpacking with * and **
nums = [1, 2, 3]
info = {"sep": "-"}
print(*nums, **info) # 1-2-3
A lambda is an anonymous, single-expression function defined inline. Syntax: lambda arguments: expression.
# Named function vs lambda
def square(x): return x ** 2
square_l = lambda x: x ** 2
# Common use: sorting with key
people = [("Bob", 25), ("Alice", 30), ("Charlie", 20)]
people.sort(key=lambda p: p[1])
# sorted by age: [('Charlie', 20), ('Bob', 25), ('Alice', 30)]
# With map() and filter()
doubled = list(map(lambda x: x*2, [1,2,3])) # [2,4,6]
Use lambdas for short, throwaway functions. For anything more complex, prefer a named def for readability.
Dictionaries are key-value stores with O(1) average lookup time. Since Python 3.7+, they preserve insertion order.
d = {"name": "Alice", "age": 30}
d.get("name") # "Alice" (safe, no KeyError)
d.get("city", "Unknown") # default value
d.keys() # dict_keys(['name', 'age'])
d.values() # dict_values(['Alice', 30])
d.items() # dict_items([('name','Alice'),...])
d.update({"city": "NY"}) # merge another dict
d.pop("age") # remove & return value
d.setdefault("x", 0) # set only if key missing
# Merge (Python 3.9+)
merged = d | {"extra": 1}
Python uses try / except / else / finally blocks for exception handling.
try:
result = 10 / 0
except ZeroDivisionError as e:
print(f"Error: {e}")
except (TypeError, ValueError):
print("Type or value error")
else:
print("No exception!") # runs only if no exception
finally:
print("Always runs") # cleanup, always executes
# Raising custom exceptions
class InvalidAgeError(ValueError):
pass
raise InvalidAgeError("Age cannot be negative")
- Encapsulation — bundling data and methods inside a class; hiding internal state using private (__attr) and protected (_attr) attributes.
- Inheritance — a child class inherits attributes and methods from a parent class. Enables code reuse.
- Polymorphism — different classes can be used interchangeably if they share the same interface (same method name). Python achieves this via duck typing and method overriding.
- Abstraction — hiding implementation details and exposing only essential interfaces, often via abstract base classes (ABC).
class Animal:
def __init__(self, name):
self.name = name # encapsulation
def speak(self): # polymorphism
raise NotImplementedError
class Dog(Animal): # inheritance
def speak(self):
return f"{self.name} says Woof!"
Python supports: Single, Multiple, Multilevel, Hierarchical, and Hybrid inheritance.
# Single
class Child(Parent): pass
# Multiple
class C(A, B): pass
# Multilevel
class GrandChild(Child): pass
# super() — call parent method
class Dog(Animal):
def __init__(self, name, breed):
super().__init__(name) # calls Animal.__init__
self.breed = breed
You can inspect the method resolution order with ClassName.__mro__.
| __new__ | __init__ |
|---|---|
| Creates the object (allocates memory) | Initializes the object |
| Static method; takes cls | Instance method; takes self |
| Called first | Called after __new__ returns instance |
| Must return an instance | Must return None |
You rarely need to override __new__ except when subclassing immutable types (like int or str) or implementing singletons/metaclasses.
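A hedged sketch of the immutable-subclass case: since the int value is already fixed by the time __init__ runs, validation has to happen in __new__ (PositiveInt is an illustrative name, not a standard class):

```python
class PositiveInt(int):
    def __new__(cls, value):
        if value < 0:
            raise ValueError("value must be non-negative")
        # int is immutable, so the value must be set here, not in __init__
        return super().__new__(cls, value)

n = PositiveInt(5)
print(n + 1)  # 6 — behaves like a normal int
```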
s = " Hello, World! "
s.strip() # "Hello, World!" (removes whitespace)
s.lower() # " hello, world! "
s.upper() # " HELLO, WORLD! "
s.replace("World", "Python")
s.split(",") # [" Hello", " World! "]
s.find("World") # 8 (index), -1 if not found
s.startswith(" H") # True
s.endswith("! ") # True
s.count("l") # 3
" ".join(["a","b"]) # "a b"
s.zfill(20) # zero-pad
f"Hello, {'Python'}!" # f-string (recommended)
# Always use context manager (with) — auto-closes file
with open("file.txt", "r", encoding="utf-8") as f:
content = f.read() # entire file as string
lines = f.readlines() # list of lines
# Write
with open("out.txt", "w") as f:
f.write("Hello\n")
# Append
with open("out.txt", "a") as f:
f.write("More text\n")
# Modes: r, w, a, x, rb, wb, r+
range(start, stop, step) generates an immutable sequence of integers. It is lazy — it doesn't create all numbers in memory at once, making it memory-efficient even for large ranges.
range(5) # 0, 1, 2, 3, 4
range(2, 8) # 2, 3, 4, 5, 6, 7
range(0, 10, 2) # 0, 2, 4, 6, 8
range(10, 0, -2) # 10, 8, 6, 4, 2
import sys
print(sys.getsizeof(range(1000000))) # 48 bytes — always!
Python uses the LEGB rule for name resolution: Local → Enclosing → Global → Built-in.
x = "global"
def outer():
    x = "enclosing"
    def inner():
        nonlocal x   # rebind the enclosing x (use `global x` for the module-level one)
        x = "modified"
    inner()
    print(x)         # "modified"
# Use global/nonlocal keywords to modify outer variables
Intermediate Questions
Q18 – Q35
A decorator is a function that takes another function as input, adds functionality, and returns a new function — without modifying the original function's source code. Decorators are applied with the @ syntactic sugar.
import functools, time
def timer(func):
@functools.wraps(func) # preserves metadata
def wrapper(*args, **kwargs):
start = time.perf_counter()
result = func(*args, **kwargs)
elapsed = time.perf_counter() - start
print(f"{func.__name__} took {elapsed:.4f}s")
return result
return wrapper
@timer
def slow_func():
time.sleep(0.1)
# Decorator with arguments
def repeat(n):
def decorator(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
for _ in range(n):
result = func(*args, **kwargs)
return result
return wrapper
return decorator
@repeat(3)
def greet(): print("Hello!")
A generator is a function that uses yield to produce values lazily, one at a time. It maintains its state between calls and does not load all values into memory.
# Generator function
def fibonacci(n):
a, b = 0, 1
for _ in range(n):
yield a
a, b = b, a + b
for num in fibonacci(10):
print(num) # 0 1 1 2 3 5 8 13 21 34
# Generator expression (lazy list comprehension)
gen = (x**2 for x in range(1000000))
next(gen) # 0 — computed one at a time
# send() to generator
def accumulator():
total = 0
while True:
value = yield total
if value is None: break
total += value
A generator expression uses ( ) while a list comprehension uses [ ].
| Iterable | Iterator |
|---|---|
| Has __iter__() method | Has both __iter__() and __next__() |
| Can be looped over | Produces one item at a time |
| list, str, dict, set, range | generator, file objects, zip, map |
| Can restart iteration | Exhausted after one pass |
lst = [1, 2, 3] # iterable
it = iter(lst) # iterator
next(it) # 1
next(it) # 2
next(it) # 3
next(it) # raises StopIteration
Context managers handle setup and teardown logic automatically using the with statement. They implement __enter__ and __exit__ methods (or use @contextmanager).
# Class-based context manager
class DBConnection:
def __enter__(self):
self.conn = connect_db()
return self.conn
def __exit__(self, exc_type, exc_val, tb):
self.conn.close()
return False # don't suppress exceptions
# Generator-based (cleaner)
from contextlib import contextmanager
@contextmanager
def db_connection():
conn = connect_db()
try:
yield conn
finally:
conn.close()
with db_connection() as conn:
conn.execute("SELECT * FROM users")
A closure is an inner function that "remembers" the variables from its enclosing scope, even after the outer function has finished executing.
def make_multiplier(factor):
def multiply(x):
return x * factor # 'factor' is closed over
return multiply
double = make_multiplier(2)
triple = make_multiplier(3)
double(5) # 10
triple(5) # 15
# Check closure variables
print(double.__closure__[0].cell_contents) # 2
Closures are the mechanism behind decorators. They're also used in factory functions and to implement data encapsulation without classes.
from functools import reduce
nums = [1, 2, 3, 4, 5]
# map() — apply function to every element
squares = list(map(lambda x: x**2, nums))
# [1, 4, 9, 16, 25]
# filter() — keep elements where function is True
evens = list(filter(lambda x: x % 2 == 0, nums))
# [2, 4]
# reduce() — fold list to single value
product = reduce(lambda acc, x: acc * x, nums)
# 120 (1*2*3*4*5)
# Modern equivalents (often preferred)
squares = [x**2 for x in nums]
evens = [x for x in nums if x % 2 == 0]
| Shallow Copy | Deep Copy |
|---|---|
| Copies top-level object | Recursively copies all nested objects |
| Nested objects share references | Nested objects are fully independent |
| Faster, less memory | Slower, more memory |
| copy.copy() or list[:] | copy.deepcopy() |
import copy
original = [[1, 2], [3, 4]]
shallow = copy.copy(original)
deep = copy.deepcopy(original)
original[0].append(99)
print(shallow[0]) # [1, 2, 99] — affected!
print(deep[0]) # [1, 2] — unaffected
Dunder (double underscore) methods, also called magic or special methods, allow you to define how objects behave with Python's built-in operations.
class Vector:
def __init__(self, x, y):
self.x, self.y = x, y
def __repr__(self): # repr(v)
return f"Vector({self.x}, {self.y})"
def __add__(self, other): # v1 + v2
return Vector(self.x + other.x, self.y + other.y)
def __len__(self): # len(v)
return int((self.x**2 + self.y**2)**0.5)
def __eq__(self, other): # v1 == v2
return self.x == other.x and self.y == other.y
def __bool__(self): # bool(v)
return self.x != 0 or self.y != 0
| | Instance Method | @classmethod | @staticmethod |
|---|---|---|---|
| First arg | self (instance) | cls (class) | None |
| Access instance | ✅ Yes | ❌ No | ❌ No |
| Access class | ✅ Yes | ✅ Yes | ❌ No |
| Use case | Normal methods | Alternative constructors | Utility functions |
class Date:
def __init__(self, y, m, d):
self.y, self.m, self.d = y, m, d
@classmethod
def from_string(cls, s): # alternative constructor
y, m, d = map(int, s.split("-"))
return cls(y, m, d)
@staticmethod
def is_valid_year(year): # utility function
return 1900 <= year <= 2100
d = Date.from_string("2026-04-01")
Python uses the C3 linearisation algorithm (also called C3 MRO) to determine the order in which base classes are searched when looking up a method. This avoids the "diamond problem".
class A:
def who(self): print("A")
class B(A):
def who(self): print("B")
class C(A):
def who(self): print("C")
class D(B, C): pass # Diamond
D().who() # "B" — MRO: D → B → C → A
print(D.__mro__)
# (D, B, C, A, object)
- Module: A single .py file containing Python code (functions, classes, variables).
- Package: A directory containing multiple modules, identified by an __init__.py file (which can be empty). Enables hierarchical namespace organization.
# Importing a module
import math
from math import sqrt, pi
# Importing from a package
from mypackage.utils import helper
# Package structure:
# mypackage/
# __init__.py
# utils.py
# models/
# __init__.py
# user.py
import re
text = "Contact: alice@example.com or bob@test.org"
# Find first match
match = re.search(r'\b\w+@\w+\.\w+\b', text)
# Find all matches
emails = re.findall(r'[\w.-]+@[\w.-]+\.\w+', text)
# Substitute
cleaned = re.sub(r'\s+', ' ', "hello world")
# Compile for reuse (faster)
pattern = re.compile(r'\d{4}-\d{2}-\d{2}')
dates = pattern.findall("2026-04-01 and 2025-12-31")
# Named groups
m = re.match(r'(?P<year>\d{4})-(?P<month>\d{2})', "2026-04")
m.group('year') # '2026'
Virtual environments create isolated Python environments for each project, preventing dependency conflicts between projects that require different package versions.
# Create virtual environment
python -m venv .venv
# Activate (Linux/macOS)
source .venv/bin/activate
# Activate (Windows)
.venv\Scripts\activate
# Install packages (isolated)
pip install requests pandas
# Freeze dependencies
pip freeze > requirements.txt
# Install from requirements
pip install -r requirements.txt
# Modern alternative: uv (2024+)
uv venv && uv pip install requests
Slicing syntax: lst[start:stop:step]. All parameters are optional and can be negative.
lst = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
lst[2:6] # [2, 3, 4, 5]
lst[:4] # [0, 1, 2, 3]
lst[6:] # [6, 7, 8, 9]
lst[:] # shallow copy
lst[::-1] # [9,8,7,...,0] — reversed
lst[1:8:2] # [1, 3, 5, 7]
lst[-3:] # [7, 8, 9]
lst[2:-2] # [2, 3, 4, 5, 6, 7]
# Slice assignment
lst[2:5] = [20, 30] # replaces elements
- Reference Counting: Every object tracks how many references point to it. When the count hits zero, memory is freed immediately.
- Cyclic Garbage Collector: Handles circular references (A → B → A) that reference counting can't resolve. Can be triggered with gc.collect().
- Memory Pools: Python's memory allocator (pymalloc) manages small objects (<512 bytes) using memory pools for efficiency.
- Interning: Small integers (−5 to 256) and short strings are cached and reused to save memory.
import sys, gc
a = []
print(sys.getrefcount(a)) # reference count
print(sys.getsizeof(a)) # object size in bytes
gc.collect() # force garbage collection
| | threading | multiprocessing |
|---|---|---|
| Unit | Thread (lightweight) | Process (heavy) |
| Memory | Shared | Separate memory space |
| GIL | Blocked by GIL | Bypasses GIL |
| Best for | I/O-bound (network, files) | CPU-bound (computation) |
| Overhead | Low | High |
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor
# I/O bound — use threads
with ThreadPoolExecutor(max_workers=4) as ex:
results = list(ex.map(fetch_url, urls))
# CPU bound — use processes
with ProcessPoolExecutor(max_workers=4) as ex:
results = list(ex.map(heavy_compute, data))
words = ["apple", "banana", "cherry", "apple"]
# Set comprehension — unique items
unique = {w for w in words}
# {'apple', 'banana', 'cherry'}
# Dict comprehension
lengths = {w: len(w) for w in words}
# {'apple': 5, 'banana': 6, 'cherry': 6}
# Invert a dict
inv = {v: k for k, v in lengths.items()}
# Conditional dict comprehension
long_words = {w: len(w) for w in words if len(w) > 5}
# {'banana': 6, 'cherry': 6}
Pickling is the process of serializing a Python object into a byte stream (for storage or transmission). Unpickling is the reverse — deserializing bytes back into an object.
import pickle
data = {"name": "Alice", "scores": [95, 87, 92]}
# Pickle to file
with open("data.pkl", "wb") as f:
pickle.dump(data, f)
# Unpickle from file
with open("data.pkl", "rb") as f:
loaded = pickle.load(f)
# In-memory bytes
b = pickle.dumps(data)
original = pickle.loads(b)
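Because unpickling can execute arbitrary code, the json module is the safe counterpart for plain data. A quick sketch:

```python
import json

data = {"name": "Alice", "scores": [95, 87, 92]}

# json round-trips basic types (dict, list, str, numbers, bool, None)
# with no code execution on load — safe for untrusted input
text = json.dumps(data)
restored = json.loads(text)
print(restored == data)  # True
```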
Pickle can execute arbitrary code during unpickling — never unpickle data from untrusted sources; prefer json for safe data exchange.
Advanced Questions
Q36 – Q50
The GIL is a mutex in CPython that ensures only one thread executes Python bytecode at a time, even on multi-core systems. It simplifies memory management (especially reference counting) but prevents true parallelism in CPU-bound multi-threaded programs.
Implications:
- CPU-bound code: Multiple threads don't speed things up — use multiprocessing or C extensions instead.
- I/O-bound code: Threads ARE effective because the GIL is released during I/O waits.
- PEP 703: Accepted in 2023, it makes the GIL optional in CPython.
- Python 3.13: A free-threaded (GIL-free) build is available as an experimental opt-in via python3.13t, marking a historic shift.
A metaclass is the "class of a class" — it defines how classes themselves are created. In Python, type is the default metaclass. When Python sees a class statement, it calls the metaclass to build the class object.
# Custom metaclass
class SingletonMeta(type):
_instances = {}
def __call__(cls, *args, **kwargs):
if cls not in cls._instances:
cls._instances[cls] = super().__call__(*args, **kwargs)
return cls._instances[cls]
class Database(metaclass=SingletonMeta):
def __init__(self):
self.connection = "connected"
db1 = Database()
db2 = Database()
print(db1 is db2) # True — same instance
# Metaclass for enforcing API
class EnforceMeta(type):
def __new__(mcs, name, bases, namespace):
if 'process' not in namespace:
raise TypeError(f"{name} must define process()")
return super().__new__(mcs, name, bases, namespace)
Python's asyncio implements cooperative multitasking via an event loop. async def defines a coroutine; await suspends it, allowing other tasks to run. This achieves concurrency without threads, ideal for I/O-bound workloads.
import asyncio
async def fetch_data(url: str) -> str:
await asyncio.sleep(1) # simulate I/O
return f"Data from {url}"
async def main():
# Run concurrently (not sequentially)
results = await asyncio.gather(
fetch_data("url1"),
fetch_data("url2"),
fetch_data("url3"),
)
# All 3 complete in ~1s, not ~3s
print(results)
asyncio.run(main())
# Async context manager
async with aiohttp.ClientSession() as session:
async with session.get(url) as resp:
data = await resp.json()
A descriptor is any object that defines __get__, __set__, or __delete__. When an attribute is a descriptor, Python calls these methods instead of directly accessing the object's __dict__. Properties, functions, and class/static methods are all implemented as descriptors.
class Validated:
"""A descriptor that validates positive numbers."""
def __set_name__(self, owner, name):
self.name = name
def __get__(self, obj, objtype=None):
if obj is None: return self
return obj.__dict__.get(self.name)
def __set__(self, obj, value):
if value < 0:
raise ValueError(f"{self.name} must be positive")
obj.__dict__[self.name] = value
class Circle:
radius = Validated() # descriptor instance
c = Circle()
c.radius = 5 # OK
c.radius = -1 # raises ValueError
ABCs define a common interface that subclasses must implement. They enforce contracts at class definition time, not at runtime call time. Use them to build plugin systems, define APIs, and enable isinstance checks against interfaces.
from abc import ABC, abstractmethod
class Shape(ABC):
@abstractmethod
def area(self) -> float: ...
@abstractmethod
def perimeter(self) -> float: ...
def describe(self): # concrete method
return f"Area={self.area():.2f}"
class Circle(Shape):
def __init__(self, r): self.r = r
def area(self): return 3.14159 * self.r**2
def perimeter(self): return 2 * 3.14159 * self.r
Shape() # TypeError: Can't instantiate abstract class
Type hints (PEP 484+) add optional static type annotations to Python. They don't affect runtime behavior but enable tools like mypy, pyright, and IDE type checkers to catch bugs before execution.
from __future__ import annotations
from typing import Optional, Union, TypeVar, Generic
from collections.abc import Callable, Sequence
def greet(name: str, times: int = 1) -> str:
return (name + " ") * times
def process(items: list[int] | None) -> dict[str, int]:
if items is None: return {}
return {f"item_{i}": v for i, v in enumerate(items)}
T = TypeVar('T')
class Stack(Generic[T]):
def __init__(self) -> None:
self._items: list[T] = []
def push(self, item: T) -> None:
self._items.append(item)
def pop(self) -> T:
return self._items.pop()
@dataclass auto-generates __init__, __repr__, __eq__ (and optionally __lt__, __hash__, __slots__) from field annotations.
from dataclasses import dataclass, field, KW_ONLY
@dataclass(order=True, frozen=True)
class Point:
x: float
y: float
_: KW_ONLY
label: str = ""
def distance(self) -> float:
return (self.x**2 + self.y**2)**0.5
p = Point(3.0, 4.0, label="origin")
p.distance() # 5.0
# frozen=True makes instances hashable
# order=True adds comparison operators
By default, Python stores instance attributes in a __dict__ per instance. Defining __slots__ replaces this with fixed-size slot arrays, reducing memory usage by ~40–60% and speeding up attribute access.
class Point:
__slots__ = ('x', 'y')
def __init__(self, x, y):
self.x, self.y = x, y
p = Point(1, 2)
p.z = 3 # AttributeError — no __dict__
import sys
class Normal:
def __init__(self): self.x = self.y = 0
# sys.getsizeof() excludes the per-instance __dict__ — which is
# exactly what __slots__ eliminates
print(sys.getsizeof(Normal().__dict__)) # extra dict per instance
print(sys.getsizeof(Point(0, 0)))       # no __dict__ at all
Use __slots__ when you need to create millions of small objects (e.g., nodes in a graph, pixels, events). Avoid it if you need dynamic attributes or if multiple inheritance gets complex.
A memoryview exposes the internal buffer of a bytes-like object without copying it. This is critical for high-performance binary processing where copying large buffers (images, audio, network packets) would be expensive.
data = bytearray(1_000_000) # 1 MB buffer
# Without memoryview — each slice copies its bytes
chunk = data[1000:2000] # new 1000-byte bytearray
# With memoryview — zero-copy slice
mv = memoryview(data)
chunk = mv[1000:2000] # no copy, same buffer
chunk[0] = 42 # modifies original data
# Used heavily in: NumPy, Pillow, socket.recv_into
# e.g. sock.recv_into(mv[offset:], nbytes) — fills the buffer in place
A weak reference doesn't increase an object's reference count. When the object's only remaining references are weak, it can be garbage collected. Useful for caches, observer patterns, and preventing memory leaks in cyclic structures.
import weakref
class Cache:
def __init__(self):
self._store = weakref.WeakValueDictionary()
def set(self, key, value):
self._store[key] = value
def get(self, key):
return self._store.get(key) # None if GC'd
# WeakValueDictionary auto-removes entries
# when values are garbage collected
cache = Cache()
obj = SomeLargeObject()
cache.set("key", obj)
del obj # object GC'd; cache entry removed
When Python performance is insufficient for CPU-bound work, you can call C code from Python via several approaches, in increasing order of complexity:
- ctypes: Call shared C libraries directly with zero compilation.
- cffi: C Foreign Function Interface — cleaner than ctypes.
- Cython: Write Python-like code compiled to C. Popular in NumPy/SciPy.
- Python C API: Write full extension modules in C for maximum control.
- Numba: JIT-compile numerical Python code to LLVM — no C needed.
- PyO3 (Rust): Write Python extensions in Rust — increasingly popular in 2026.
# ctypes example
import ctypes
lib = ctypes.CDLL("./mylib.so")
lib.add.restype = ctypes.c_int
lib.add.argtypes = [ctypes.c_int, ctypes.c_int]
result = lib.add(3, 4) # 7
# Numba example (no C needed)
from numba import jit
@jit(nopython=True)
def fast_sum(arr):
total = 0
for x in arr:
total += x
return total
Golden rule: Always measure before optimizing. Use profiling to find the actual bottleneck, not where you guess it is.
# 1. timeit — micro-benchmarking
import timeit
timeit.timeit("[x**2 for x in range(100)]", number=10000)
# 2. cProfile — function-level profiling
# python -m cProfile -s cumtime my_script.py
# 3. line_profiler — line-level
@profile
def slow_func(): ...
# kernprof -l -v my_script.py
# 4. memory_profiler
@memory_profiler.profile
def mem_func(): ...
# 5. py-spy — sampling profiler (no code changes)
# py-spy record -o profile.svg -- python app.py
Common optimizations: use built-ins (written in C), prefer generators over lists for large datasets, use collections.deque for queue operations, avoid + string concatenation in loops, cache with functools.lru_cache.
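A few of those optimizations sketched concretely:

```python
from collections import deque
from functools import lru_cache

# deque: O(1) appends/pops at both ends (list.pop(0) is O(n))
q = deque([1, 2, 3])
q.appendleft(0)
print(q.popleft())            # 0

# str.join instead of += in a loop (avoids repeated copying)
parts = [str(x) for x in range(5)]
print(",".join(parts))        # 0,1,2,3,4

# generator keeps memory flat for large aggregations
total = sum(x * x for x in range(1_000_000))

# lru_cache memoizes pure functions
@lru_cache(maxsize=None)
def slow_square(n):
    return n * n

print(slow_square(12))        # 144
```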
# SINGLETON via metaclass (see Q37)
# OBSERVER pattern
class EventEmitter:
def __init__(self):
self._handlers: dict = {}
def on(self, event, fn):
self._handlers.setdefault(event, []).append(fn)
def emit(self, event, *args):
for fn in self._handlers.get(event, []):
fn(*args)
# FACTORY method
class ShapeFactory:
_registry = {}
@classmethod
def register(cls, name):
def decorator(klass):
cls._registry[name] = klass
return klass
return decorator
@classmethod
def create(cls, name, **kwargs):
return cls._registry[name](**kwargs)
# Memoization with lru_cache
from functools import lru_cache
@lru_cache(maxsize=128)
def fib(n):
return n if n < 2 else fib(n-1) + fib(n-2)
import pytest
from unittest.mock import Mock, patch
# Fixtures for setup/teardown
@pytest.fixture
def db(tmp_path):
conn = create_db(tmp_path / "test.db")
yield conn
conn.close()
# Parametrize for many inputs
@pytest.mark.parametrize("n,expected", [
(0, 0), (1, 1), (10, 55), (20, 6765)
])
def test_fib(n, expected):
assert fib(n) == expected
# Mocking external services
def test_api_call():
with patch('requests.get') as mock_get:
mock_get.return_value.json.return_value = {"ok": True}
result = fetch_users()
assert result == {"ok": True}
# Test exceptions
def test_invalid():
with pytest.raises(ValueError, match="positive"):
Circle(radius=-1)
Python 3.12 (Oct 2023):
- PEP 695: New type parameter syntax — type Alias = list[int], def func[T](x: T) -> T
- PEP 692: TypedDict with **kwargs using Unpack
- Better error messages: Clearer SyntaxErrors and tracebacks
- f-strings: Nested f-strings and quotes inside f-strings
- ~5% performance improvement over 3.11
Python 3.13 (Oct 2024):
- PEP 703: Free-threaded CPython (no-GIL build) — experimental but available as python3.13t
- Experimental JIT compiler: Opt-in with --enable-experimental-jit
- Improved REPL: Multi-line editing, color highlighting
- PEP 667: locals() now returns a proper mapping
- Deprecations removed: Many Python 2 legacy APIs cleaned up
# Python 3.12 — new generic syntax (PEP 695)
def first[T](lst: list[T]) -> T:
return lst[0]
class Stack[T]:
def push(self, item: T) -> None: ...
type Vector = list[float] # type alias statement
# Python 3.12 — f-strings improved
name = "world"
print(f"{'hello'!r} {name}") # quotes inside f-string