Особенности Python для интервью¶
~27 минут чтения
Предварительно: Подготовка к Python-интервью
Python -- 1-й по популярности язык на технических интервью (Tech Interview Handbook, 2025). 13 уникальных языковых особенностей (generators, decorators, comprehensions, context managers, GIL, metaclasses и др.) отличают Python-собеседования от интервью на Java/C++. Каждая из 13 тем ниже содержит объяснение, почему это Python-специфично, типичные задачи и ссылки на практические ресурсы.
1. Generators и yield¶
Почему это Python-специфично?¶
Generators в Python предоставляют ленивое вычисление (lazy evaluation) через yield keyword, что принципиально отличает Python от языков с eager evaluation. В отличие от функций с return, генераторы:
- Сохраняют состояние между вызовами
- Производят значения "на лету" без хранения в памяти
- Идеальны для бесконечных последовательностей и больших датасетов
Ключевое отличие от return:
"In Python, 'return' sends a value and terminates a function, while 'yield' produces a value but retains the function's state, allowing it to resume from where it left off."
Практическая ценность для интервью¶
- Экономия памяти: Генератор для 1M элементов vs список - разница в 100+ MB RAM
- Memory efficiency: "Generators do not store all the values in memory, they generate the values on the fly"
- Признак зрелости: "Being able to write a small generator function with yield instantly signals maturity"
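Минимальный набросок, демонстрирующий оба свойства (сохранение состояния и ленивость) на бесконечной последовательности Фибоначчи:

```python
import itertools

def fibonacci():
    a, b = 0, 1
    while True:  # бесконечная последовательность: память не растет
        yield a  # состояние (a, b) сохраняется между вызовами next()
        a, b = b, a + b

# islice "снимает" первые 6 значений с бесконечного генератора
print(list(itertools.islice(fibonacci(), 6)))  # [0, 1, 1, 2, 3, 5]
```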
Типичные задачи для live coding¶
Источник: W3Resource - Python Generators Practice
17 основных типов задач (85 вариаций); в таблице приведены первые 11:
| # | Задача | Паттерн |
|---|---|---|
| 1 | Генератор кубов (1 to n) | Базовый yield в цикле |
| 2 | Генератор случайных чисел | yield + random module |
| 3 | Простые числа в диапазоне | yield + математическая логика |
| 4 | Последовательность Фибоначчи | yield + состояние переменных |
| 5 | Перестановки списка | yield + itertools или рекурсия |
| 6 | Комбинации элементов | yield + алгоритмы комбинаторики |
| 7 | Последовательность Коллатца | yield + условная логика |
| 8 | Генератор палиндромов | yield + проверка условия |
| 9 | Простые множители | yield + факторизация |
| 10 | Генератор "счастливых чисел" | yield + математика |
| 11 | Генератор running average | yield + аккумуляция состояния |
Продвинутая концепция: yield from
"Generators can be delegated via yield from, flattening nested generators and auto-propagating StopIteration.value."
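Типовой набросок делегирования через yield from (рекурсивное "выпрямление" вложенных списков):

```python
def flatten(nested):
    for item in nested:
        if isinstance(item, list):
            yield from flatten(item)  # делегирование под-генератору
        else:
            yield item

print(list(flatten([1, [2, [3, 4]], 5])))  # [1, 2, 3, 4, 5]
```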
Источники¶
- GeeksforGeeks - Top 50+ Python Interview Questions 2025
- W3Resource - Generators Practice
- InterviewBit - Python Interview Questions 2025
2. Decorators¶
Почему это Python-специфично?¶
Декораторы - это синтаксический сахар для higher-order functions, позволяющий модифицировать функции без изменения их кода. Это Python-specific паттерн метапрограммирования.
"A decorator is a design pattern that allows you to modify the functionality of a function by wrapping it in another function."
Real-world use cases¶
12 типичных задач для интервью:
Источник: W3Resource - Decorator Practice
- Logging функций - логирование аргументов и return values
- Измерение времени выполнения - performance profiling
- Конвертация типов - преобразование результатов
- Кеширование (memoization) - оптимизация повторных вызовов
- Валидация аргументов - type/value checking
- Retry механизм - автоматические повторы при сбоях
- Rate limiting - ограничение частоты вызовов
- Exception handling с fallback - graceful degradation
- Type enforcement - строгая типизация аргументов
- Memory usage tracking - профилирование памяти
- TTL кеширование - кеш с временем жизни
Встроенные декораторы¶
Built-in decorators в классах:
- @staticmethod - метод без self/cls
- @classmethod - метод с cls
- @property - getter/setter/deleter
Chaining decorators¶
"To chain decorators in Python, we can apply multiple decorators to a single function by placing them one after the other, with the most inner decorator being applied first."
Порядок применения - снизу вверх (bottom-up fashion).
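Набросок chaining с демонстрацией порядка bottom-up (имена декораторов условные):

```python
import functools

def exclaim(func):
    @functools.wraps(func)  # сохраняет __name__ и docstring оборачиваемой функции
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs) + "!"
    return wrapper

def upper(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

@exclaim   # применяется вторым (внешний)
@upper     # применяется первым (ближе к функции)
def greet(name):
    return f"hello, {name}"

# эквивалентно greet = exclaim(upper(greet))
print(greet("bob"))  # HELLO, BOB!
```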
Источники¶
- W3Resource - Python Decorators Practice
- GeeksforGeeks - Decorators in Python
- Terminal.io - 15 Python Interview Questions
3. Context Managers (with statement)¶
Почему это важно для интервью¶
Context managers - это Python-specific протокол для автоматического управления ресурсами. Они реализуют паттерн RAII (Resource Acquisition Is Initialization) через __enter__ и __exit__ methods.
"Context managers are often used for resource management tasks such as file handling, database connections, and locks."
Преимущества перед try-finally¶
Почему context managers > try-finally:
- Автоматическая очистка ресурсов
- Предотвращение resource leaks
- Exception-safe гарантии
- Меньше boilerplate кода
"Using with reduces code complexity and prevents resource leaks by ensuring proper resource release, even if exceptions occur."
Способы создания¶
- Class-based - реализация __enter__ и __exit__
- Generator-based - через @contextmanager decorator из contextlib
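Набросок обоих способов на одном примере (Managed/managed - условные имена):

```python
from contextlib import contextmanager

events = []

# Class-based: протокол __enter__/__exit__
class Managed:
    def __enter__(self):
        events.append("enter")
        return self
    def __exit__(self, exc_type, exc, tb):
        events.append("exit")
        return False  # False = исключения не подавляются

# Generator-based: код до yield = __enter__, код после = __exit__
@contextmanager
def managed():
    events.append("enter")
    try:
        yield
    finally:
        events.append("exit")

with Managed():
    events.append("body")
with managed():
    events.append("body")
print(events)  # ['enter', 'body', 'exit', 'enter', 'body', 'exit']
```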
Типичные вопросы интервью¶
Источник: DynamicDuniya - 50+ Context Manager Q&A
- Какие методы должен реализовать custom context manager?
- Что происходит при exception внутри with блока?
- В чем разница между __exit__() и finally?
- Как использовать context managers без with statement?
- Что такое contextlib.suppress()?
- Как работают async context managers (async with)?
Продвинутые концепции¶
Async context managers: реализуют __aenter__ и __aexit__ и используются через async with (типичный пример - асинхронные соединения с базой данных).
contextlib utilities:
- contextlib.suppress() - игнорирование исключений
- contextlib.closing() - автоматический close()
- contextlib.ExitStack() - динамическое количество контекстов
Источники¶
- GeeksforGeeks - Context Manager in Python
- Real Python - Python's with Statement
- DynamicDuniya - 50+ Context Manager Interview Questions
4. List/Dict/Set Comprehensions¶
Почему это важно¶
Comprehensions - это синтаксический сахар для создания коллекций, демонстрирующий понимание Pythonic code style. Это одна из самых характерных особенностей Python.
"List comprehensions provide an intuitive way to generate and modify lists in a single, succinct line of code, making them a valuable tool for solving array-related questions in coding interviews."
Преимущества¶
Ключевые преимущества:
1. Читаемость - выражение логики в одной строке
2. Меньше ошибок - меньше кода = меньше багов
3. Performance boost - "faster than using a loop because it is optimized for list creation"
4. Pythonic style - показывает зрелость разработчика
"Using list comprehensions shows you're not just writing 'code in Python' but writing 'Pythonic code'—idiomatic, elegant, and efficient."
Performance детали¶
Time Complexity: O(n) - каждый элемент обрабатывается один раз
Performance notes:
- Comprehensions быстрее простых циклов
- list.append() в цикле значительно замедляет код
- Для ОЧЕНЬ больших датасетов использовать generator expressions
"List comprehensions are memory-efficient for small to moderately sized datasets, but when working with very large datasets, they create lists in memory that can become a performance bottleneck."
Generator Expressions как альтернатива¶
"Although similar to list comprehensions in their syntax, generator expressions return values only when asked for, as opposed to a whole list in the former case. As a result, they use less memory and by dint of that are more efficient."
Syntax:
- List comp: [x**2 for x in range(10)]
- Generator exp: (x**2 for x in range(10))
Типичные задачи интервью¶
Источник: GeeksforGeeks - List Comprehension Interview Questions
Категории вопросов:
Fundamentals:
- Что такое list comprehension и его преимущества?
- Conditional filtering с if statements
- Combining comprehensions с функциями
Intermediate:
- Nested loops в comprehensions
- Working с enumerate()
- Multiple iterables одновременно
Advanced:
- Flattening nested lists
- Extracting values из dictionaries в lists
- Conversion to generators
Dictionary Comprehension¶
"Dict comprehensions are just like list comprehensions, except that you group the expression using curly braces instead of square braces. Also, the left part before the for keyword expresses both a key and a value, separated by a colon."
Syntax: {key: value for item in iterable}
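Пара коротких примеров dict comprehension (данные условные):

```python
words = ["apple", "fig", "kiwi"]
lengths = {w: len(w) for w in words}  # пара ключ: значение до for
print(lengths)  # {'apple': 5, 'fig': 3, 'kiwi': 4}

# с условной фильтрацией, как в list comprehension
squares = {n: n ** 2 for n in range(5) if n % 2 == 0}
print(squares)  # {0: 0, 2: 4, 4: 16}
```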
Когда НЕ использовать¶
"When your code is too elaborate or complex, it is better to avoid list comprehension as it can be difficult to understand the code and hamper its performance."
Источники¶
- GeeksforGeeks - List Comprehension Interview Questions
- Medium - List Comprehensions: Key to Array Questions
- Zero To Mastery - Beginner's Guide to List Comprehension
5. *args / **kwargs и Unpacking Operators¶
Почему это Python-специфично¶
*args и **kwargs - это Python-specific механизм для работы с произвольным количеством аргументов. Это fundamental feature для создания гибких API и wrapper functions.
Основные концепции¶
Два типа packing:
- *args - Non-keyword arguments → tuple
- **kwargs - Keyword arguments → dictionary
"The special syntax *args allows us to pass any number of positional (non-keyword) arguments to a function. These arguments are collected into a tuple."
Unpacking Operators¶
"The unpacking operators are operators that unpack the values from iterable objects in Python. The single asterisk operator * can be used on any iterable that Python provides, while the double asterisk operator ** can only be used on dictionaries."
Enhanced in Python 3.5+ (PEP 448):
- Более мощные возможности unpacking
- Работа в любых контекстах, не только в вызовах функций
Практические примеры¶
Unpacking tuple:
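Набросок (имена функций условные): * распаковывает iterable в позиционные аргументы, а extended unpacking (PEP 3132) работает и в присваиваниях:

```python
point = (3, 4)

def distance(x, y):
    return (x ** 2 + y ** 2) ** 0.5

print(distance(*point))  # 5.0 - tuple распакован в два аргумента

# extended unpacking в присваивании
first, *middle, last = [1, 2, 3, 4, 5]
print(middle)  # [2, 3, 4]
```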
Unpacking dictionary:
```python
def info(name, age, country):
    print(f"Name: {name}, Age: {age}, Country: {country}")

data = {"name": "Alice", "age": 30, "country": "USA"}
info(**data)  # Unpacks dict into keyword arguments
```
Combining *args and **kwargs:
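Минимальный набросок: позиционные аргументы собираются в tuple, именованные - в dict:

```python
def report(*args, **kwargs):
    # args - tuple, kwargs - dict
    return f"positional={args}, keyword={kwargs}"

print(report(1, 2, mode="fast"))
# positional=(1, 2), keyword={'mode': 'fast'}
```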
Правильный порядок аргументов¶
"The correct order of arranging the arguments in a function definition is: Standard variable arguments, *args arguments and then the **kwargs arguments."
Syntax: def func(standard_args, *args, **kwargs)
Keyword-Only Arguments (Python 3+)¶
"Python 3 introduced the ability to enforce that arguments appearing after a bare * or after *args must be supplied as keyword arguments."
Dictionary Merging (Modern Pattern)¶
```python
default_config = {'timeout': 30, 'retries': 3}
user_overrides = {'retries': 5, 'log_level': 'DEBUG'}
final_config = {**default_config, **user_overrides}
# {'timeout': 30, 'retries': 5, 'log_level': 'DEBUG'} - при конфликте побеждает правый dict
```
Это cleaner альтернатива старым способам объединения словарей (dict.update() или dict(a, **b)).
Real-world use cases¶
"These operators are used in socket programming where we need to send unknown (infinite) number of requests to server, and in web frameworks like Django to send variable arguments to view functions."
Источники¶
- Real Python - Python kwargs and args Demystified
- GeeksforGeeks - Packing and Unpacking Arguments
- Towards Data Science - Args and Kwargs Made Easy
6. Dunder Methods (Magic Methods)¶
Что это такое¶
"Python Magic methods are the methods starting and ending with double underscores '__'. They are defined by built-in classes in Python and commonly used for operator overloading."
Alternative names: Dunder methods (Double Under), Special methods
Почему важно для интервью¶
"Python automatically calls magic methods as a response to certain operations, such as instantiation, sequence indexing, attribute managing, and much more."
Dunder methods - это Python-specific протокол, определяющий поведение объектов в различных контекстах. Все операторы Python (+, ==, in) используют dunder methods.
Категории dunder methods¶
Object Creation & Representation:
- __init__ - constructor (инициализация объекта)
- __new__ - создание нового instance (вызывается до __init__)
- __repr__ - "official" string representation (для разработчиков)
- __str__ - "informal" string representation (для пользователей)
- __del__ - destructor (cleanup при уничтожении)
Operator Overloading:
- __add__ - поведение оператора +
- __eq__ - поведение оператора ==
- __lt__, __le__, __gt__, __ge__ - сравнения (<, <=, >, >=)
- __len__ - поведение len()
- __contains__ - поведение оператора in
Iterator Protocol:
- __iter__ - возвращает iterator object (for loop)
- __next__ - следующий элемент в итерации
Context Manager Protocol:
- __enter__ - вход в with block
- __exit__ - выход из with block
Callable Objects:
- __call__ - делает объект callable как функция
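Перечисленные протоколы можно показать на одном наброске (класс Playlist условный):

```python
class Playlist:
    def __init__(self, tracks):
        self.tracks = list(tracks)

    def __repr__(self):                # representation для разработчиков
        return f"Playlist({self.tracks!r})"

    def __len__(self):                 # поддержка len()
        return len(self.tracks)

    def __contains__(self, track):     # поддержка оператора in
        return track in self.tracks

    def __add__(self, other):          # поддержка оператора +
        return Playlist(self.tracks + other.tracks)

    def __eq__(self, other):           # поддержка оператора ==
        return isinstance(other, Playlist) and self.tracks == other.tracks

p = Playlist(["a", "b"]) + Playlist(["c"])
print(len(p), "b" in p, p == Playlist(["a", "b", "c"]))  # 3 True True
```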
Типичные задачи интервью¶
Источник: GeeksforGeeks - Dunder Magic Methods
- Как реализовать operator overloading для matrix multiplication?
- Создать custom context manager через
__enter__и__exit__ - Использовать
__call__для memoization function - Реализовать custom iteration через
__iter__и__next__ - Override
__eq__для сравнения объектов с разными атрибутами
Best Practices¶
"Note that because magic methods have special meaning for Python itself, you should avoid naming custom methods using leading and trailing double underscores."
Важное правило:
"Resist the temptation to create new dunder names outside the standard Python data model. Sticking to the built-in set of dunder methods keeps your code clear and prevents unexpected behavior."
Источники¶
- GeeksforGeeks - Dunder Magic Methods
- Real Python - Python's Magic Methods
- DataCamp - Python Dunder Methods
- Analytics Vidhya - MCQs on Python Special Methods
7. Descriptors и @property¶
Что такое Descriptors¶
"In Python, a descriptor is any object that implements at least one of the following methods: __get__(self, instance, owner), __set__(self, instance, value), or __delete__(self, instance)."
Descriptors - это Python-specific протокол для кастомного контроля доступа к атрибутам. Они являются фундаментом для работы properties, methods, и static methods.
Два типа Descriptors¶
Data Descriptors:
- Реализуют __get__, __set__, и/или __delete__
- Могут управлять и чтением, и записью
- Имеют приоритет над instance attributes
Non-Data Descriptors:
- Реализуют только __get__
- Могут управлять только чтением
- Instance attributes могут их override
Когда использовать¶
"You would use descriptors when you want custom control over attribute access, when you're building a reusable pattern like data validation or transformation, or when you're working with ORMs, framework internals, or data-binding logic."
@property Decorator¶
"The @property decorator is a built-in decorator in Python which is helpful in defining the properties effortlessly without manually calling the inbuilt function property()."
Ключевая деталь:
"The @property is implemented as a descriptor under the hood."
Property vs Descriptor¶
@property advantages:
- Проще в использовании
- Меньше boilerplate кода
- Встроенная поддержка getter/setter/deleter

Descriptors advantages:
- Переиспользуемые паттерны валидации
- Работают на уровне класса, не инстанса
- Более гибкие для сложных сценариев
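Для сравнения: та же идея валидации через @property (класс Circle условный):

```python
class Circle:
    def __init__(self, radius):
        self.radius = radius  # присваивание проходит через setter ниже

    @property
    def radius(self):
        return self._radius

    @radius.setter
    def radius(self, value):
        if value < 0:
            raise ValueError("radius must be non-negative")
        self._radius = value

    @property
    def area(self):  # вычисляемый read-only атрибут
        return 3.14159 * self._radius ** 2

c = Circle(2)
print(c.area)  # 12.56636
```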
Практический пример¶
Descriptor для валидации:
```python
class StringDescriptor:
    def __set_name__(self, owner, name):
        self.name = name  # Called automatically by Python 3.6+

    def __get__(self, instance, owner):
        if instance is None:
            return self
        return instance.__dict__.get(self.name)

    def __set__(self, instance, value):
        if not isinstance(value, str):
            raise ValueError("Must be string")
        instance.__dict__[self.name] = value

class User:
    name = StringDescriptor()  # __set_name__ sets self.name = "name"
```
Coverage в ORM системах¶
"Coverage of descriptors often starts with a close look at Python's property built-in function and dynamic attribute look up. ORM-like field validation descriptors are common examples that leverage class decorators to solve usability issues."
Важное замечание¶
"Always keep in mind that there is nothing that descriptors can do that can't be achieved with plain methods. However, descriptors provide a cleaner, more Pythonic interface for attribute management."
Источники¶
- GeeksforGeeks - Descriptor in Python
- GeeksforGeeks - Python Property Decorator
- Web Asha - Advanced Python Interview Questions
8. Metaclasses¶
Что такое Metaclasses¶
"Metaclasses are classes that define how other classes are created. While regular classes act as blueprints for creating objects, metaclasses serve as blueprints for creating classes themselves."
Ключевая концепция:
"When we create a class in Python, it is, in fact, an instance of a metaclass. By default, every class in Python is an instance of the built-in type metaclass."
Когда использовать¶
"Metaclasses allow you to customize how classes are created. When you define a class, Python's default metaclass (type) is used to create that class object. By defining your own metaclass, you can intercept and modify this class creation process."
Use cases:
- Автоматическая регистрация классов в registry
- Enforcing coding standards
- Creating custom class behavior
- Framework development
Важное замечание для интервью¶
"For the most part, you don't need to be aware of metaclasses. Most Python programmers rarely, if ever, have to think about metaclasses."
Когда НЕ использовать:
"Custom metaclasses mostly aren't necessary. If it isn't pretty obvious that a problem calls for them, then it will probably be cleaner and more readable if solved in a simpler way."
Современная альтернатива¶
"Luckily for us, since Python 3.6 there is another hook available: __init_subclass__. It is able to replace the majority (if not all) metaclasses."
__init_subclass__() - это более простой и понятный способ кастомизации создания классов.
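Классический пример, который раньше требовал metaclass, а теперь решается через __init_subclass__ - авто-регистрация подклассов (имена условные):

```python
class Plugin:
    registry = {}

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        Plugin.registry[cls.__name__] = cls  # вызывается при создании каждого подкласса

class JsonPlugin(Plugin):
    pass

class CsvPlugin(Plugin):
    pass

print(sorted(Plugin.registry))  # ['CsvPlugin', 'JsonPlugin']
```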
Interview perspective¶
"Interviewers may ask about metaclasses to evaluate your understanding of Python's advanced object model and ability to work with complex metaprogramming concepts."
Black Belt концепция:
"Metaclasses are mentioned among the most advanced features of Python. Knowing how to write one is perceived like having a Python black belt."
Decorators vs Metaclasses¶
"Some problems in Python can be solved using decorators or metaclasses. While decorators provide a simple solution for many tasks, there are certain situations where only metaclasses can provide a more efficient or scalable solution."
Источники¶
- GeeksforGeeks - Metaprogramming with Metaclasses
- Real Python - Python Metaclasses
- breadcrumbscollector.tech - When to use metaclasses
9. GIL и его влияние¶
Что такое GIL¶
"The GIL is a mutex (mutual exclusion lock) that protects access to Python objects, preventing multiple threads from executing Python bytecode simultaneously. In simple terms, the GIL ensures that only one thread can execute Python code at any given time within a single Python process."
Зачем был введен¶
"The GIL was introduced to simplify memory management in Python as many internal operations, such as object creation, are not thread safe by default. Without a GIL, multiple threads trying to access the shared resources will require complex locks or synchronisation mechanisms to prevent race conditions and data corruption."
Важная деталь¶
"The GIL is specific to CPython. Other implementations like Jython and PyPy don't have it."
Threading vs Multiprocessing vs Asyncio¶
General rule:
"The general rule of thumb is: use multiprocessing for CPU-bound tasks and use asyncio or threads for I/O-bound tasks."
Multithreading¶
Характеристики:
- Threads share same memory and resources
- GIL ограничивает эффективность для CPU-bound tasks
- Threads release GIL during I/O waits
- Подходит для I/O-bound operations (file reads, network, DB queries)
Overhead:
"Threading can consume more resources due to the need for context switching between threads and the overhead of managing multiple threads."
Multiprocessing¶
Характеристики:
- Each process has its own GIL
- True parallel execution на multiple CPU cores
- Bypasses GIL limitations
- Подходит для CPU-intensive tasks (data processing, calculations, image manipulation)
Memory considerations:
"Using multiprocessing consumes more memory than multithreading because each process runs its own Python interpreter with a separate memory space."
Asyncio¶
Характеристики:
- Single-threaded event loop
- Cooperative multitasking
- Non-blocking I/O operations
- Подходит для I/O-bound tasks с тысячами concurrent connections
GIL и asyncio:
"asyncio bypasses this constraint for I/O-bound operations because it uses single-threaded non-blocking I/O. While waiting for I/O (like network/file read), coroutines yield control using await, letting other coroutines run. There is no need for multiple threads to achieve concurrency, so GIL is not a limiting factor."
Ключевые interview points¶
For CPU-bound tasks: multiprocessing
For I/O-bound tasks: multithreading или asyncio
Для 1 million I/O tasks:
"If the requirement is to handle one million I/O tasks, multiprocessing is not the right choice because it will consume excessive memory and suffer from process creation overhead. Since the workload is I/O-bound rather than CPU-bound, asyncio or multithreading is a much better fit."
Workarounds для GIL¶
- Multiprocessing - bypasses GIL через multiple processes
- C extensions (NumPy, TensorFlow) - heavy lifting в C/C++ где GIL released
- Async I/O (asyncio) - для I/O-bound tasks без threading
Источники¶
- Towards Data Science - Deep Dive into Multithreading, Multiprocessing, and Asyncio
- GeeksforGeeks - Asyncio Vs Threading
- DEV Community - GIL in Python: Everything for Interviews
10. asyncio vs threading vs multiprocessing¶
Когда использовать каждый подход¶
Основное правило:
"The general rule of thumb is: use multiprocessing for CPU-bound tasks and use asyncio or threads for I/O-bound tasks."
Multithreading - детально¶
Best for: I/O-bound operations (file reads, network requests, database queries)
Почему работает:
- GIL's impact минимален
- Threads release GIL during I/O waits
- Другие threads могут выполняться

Drawbacks:
- GIL limits CPU-bound effectiveness
- Context switching overhead
- Higher resource consumption
Multiprocessing - детально¶
Best for: CPU-intensive tasks (data processing, calculations, image manipulation)
Advantages:
- True parallelism на multi-core processors
- Each process has own GIL
- Fully utilizes multiple CPU cores

Drawbacks:
- Higher memory usage (each process = own Python interpreter)
- Process creation overhead
- Inter-process communication complexity
Asyncio - детально¶
Best for: High-concurrency I/O operations с многими simultaneous connections
Mechanism:
"Asyncio utilizes a single-threaded event loop to handle concurrency. It is designed to efficiently manage I/O-bound tasks by using asynchronous coroutines and non-blocking operations."
Advantages:
- Minimal overhead vs threading
- Может handle thousands/millions of connections
- Single-threaded = no GIL limitations для I/O
How it works:
"asyncio achieves concurrency within one thread by cooperative multitasking. Threads achieve parallelism through OS-level scheduling but share GIL limitations."
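Идею cooperative multitasking иллюстрирует набросок: пока одна корутина "ждет I/O" в await, event loop выполняет остальные (задержки и имена условные):

```python
import asyncio

async def fetch(i):
    await asyncio.sleep(0.01)  # имитация I/O: управление уходит другим корутинам
    return i * 2

async def main():
    # все задачи выполняются конкурентно в одном потоке
    return await asyncio.gather(*(fetch(i) for i in range(5)))

print(asyncio.run(main()))  # [0, 2, 4, 6, 8]
```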
Практическое сравнение¶
1 million I/O tasks scenario:
"If the requirement is to handle one million I/O tasks, multiprocessing is not the right choice because it will consume excessive memory and suffer from process creation overhead. Since the workload is I/O-bound rather than CPU-bound, asyncio or multithreading is a much better fit. Async I/O allows the program to scale to thousands or even millions of connections with minimal overhead because it does not block on I/O operations."
Interview ключевые точки¶
"Python provides three main approaches to handle multiple tasks simultaneously: multithreading, multiprocessing, and asyncio. Choosing the right model is crucial for maximising your program's performance and efficiently using system resources. It is also a common interview question!"
Источники¶
- Towards Data Science - Deep Dive into Multithreading, Multiprocessing, and Asyncio
- GeeksforGeeks - Asyncio Vs Threading
- DEV Community - Asyncio Interview Questions
11. BONUS: Дополнительные Python-специфичные концепции¶
11.1. Walrus Operator := (Python 3.8+)¶
Что это:
"Python's walrus operator := allows you to assign a value to a variable within an expression, combining assignment and use in a single step."
Название:
"During early discussions, it was dubbed the walrus operator because the := syntax resembles the eyes and tusks of a walrus lying on its side."
Ключевое отличие:
"An assignment expression returns the value, while a traditional assignment doesn't."
Use cases:
1. While loops - assignment в условии
2. List comprehensions - избежание повторных вызовов функции
3. Conditional statements - assign + test одновременно
Syntax note:
"To avoid confusion between the assignment statement = and assignment expression :=, there's no situation in which both options are syntactically valid. For this reason, it's often necessary to enclose the assignment expression in parentheses."
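Набросок двух типичных случаев (данные и функция f условные):

```python
data = [1, 4, 9, 16]

# assign + test в одном выражении (скобки обязательны)
if (n := len(data)) > 3:
    print(f"list is long ({n} elements)")

def f(x):
    return x * 10

# f(x) вызывается один раз вместо двух (filter + значение)
results = [y for x in data if (y := f(x)) > 50]
print(results)  # [90, 160]
```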
Sources:
- Real Python - Walrus Operator
- DataCamp - Python Walrus Operator Tutorial
- GeeksforGeeks - Walrus Operator in Python 3.8
11.2. itertools Module¶
Что это:
"itertools is a module that provides various functions that work on iterators to produce complex iterators. It works as a fast, memory-efficient tool that is used either by themselves or in combination to form iterator algebra."
Common patterns:
Combinatoric Iterators:
- permutations() - all orderings (order matters)
- combinations() - groups where order doesn't matter
- product() - Cartesian products
Grouping:
- groupby(iterable, key) - group consecutive elements
- Important: Sort data first!
Infinite Iterators:
- count(start, step) - infinite counting
- repeat(val, num) - repeated values
Chaining:
- chain() - combine multiple sequences
- islice() - slicing with start/stop/step
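Перечисленные паттерны в одном наброске (данные условные):

```python
import itertools

# combinatorics: порядок не важен
print(list(itertools.combinations("ABC", 2)))  # [('A', 'B'), ('A', 'C'), ('B', 'C')]

# groupby группирует только СОСЕДНИЕ элементы - сортируем по тому же ключу
words = ["ant", "bee", "bat", "ape"]
words.sort(key=lambda w: w[0])
groups = {k: list(g) for k, g in itertools.groupby(words, key=lambda w: w[0])}
print(groups)  # {'a': ['ant', 'ape'], 'b': ['bee', 'bat']}

# chain + islice: склейка последовательностей и ленивый срез
print(list(itertools.islice(itertools.chain([1, 2], [3, 4]), 3)))  # [1, 2, 3]
```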
Sources:
- Towards Data Science - itertools and functools
- GeeksforGeeks - Python Itertools
11.3. functools Module¶
Что это:
"Python's functools module is a powerful utility for working with higher-order functions, decorators, and other functional programming constructs."
Common patterns:
reduce():
"applies function of two arguments cumulatively to the items of iterable, from left to right, so as to reduce the iterable to a single value."
partial():
"allows you to freeze a portion of the arguments by assigning single values to at least one argument."
Caching:
- @lru_cache - memoization с LRU eviction
- @functools.cache - unlimited cache (Python 3.9+)
cmp_to_key():
- Transforms old-style comparison function to key function
- Used with sorted(), min(), max(), heapq
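Три упомянутых паттерна в одном наброске:

```python
import functools

# reduce: свертка iterable в одно значение слева направо
total = functools.reduce(lambda acc, x: acc + x, [1, 2, 3, 4])
print(total)  # 10

# partial: "замораживание" части аргументов
parse_binary = functools.partial(int, base=2)
print(parse_binary("1010"))  # 10

# lru_cache: мемоизация повторных вызовов
@functools.lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040
```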
Sources:
- Towards Data Science - itertools and functools
11.4. __slots__ Memory Optimization¶
Что это:
"__slots__ is a special attribute you can define in a Python class to explicitly declare a fixed set of attributes for instances of that class."
Benefits:
- Memory reduction: 20-50% savings
- Faster attribute access
- Prevents dynamic attribute creation
Real numbers:
"Creating 1 million objects without slots used 171.18 MiB RAM, while the same with slots used only 70.56 MiB RAM - saving approximately 100 MiB."
When to use:
- Large numbers of instances
- Memory-constrained environments
- Data-intensive applications

Limitations:
- No dynamic attributes
- Deep inheritance complexity
- Library compatibility issues
Dataclass integration (Python 3.10+):
Sources:
- Machine Learning Plus - Memory Optimization with slots
- GeeksforGeeks - Python Use of slots
11.5. Dataclass vs NamedTuple¶
Dataclass (Python 3.7+):
- Mutable by default (frozen=True for immutable)
- Full class features
- Can use __post_init__
- Better performance
- Support для default values
NamedTuple:
- Always immutable
- Tuple-like behavior
- Access by index AND name
- More memory efficient than dataclass
- Less flexible
Performance:
"DataClass is 8.18% faster to create objects than NamedTuple."
Memory:
"Named tuples are much more memory efficient than data classes, but data classes with slots are more memory efficient."
When to use:
- Dataclass: when performance matters, need mutability
- NamedTuple: want tuples that are easier to read, immutability required
Sources:
- Earthly Blog - Python Data Classes vs Named Tuples
- GeeksforGeeks - DataClass vs NamedTuple vs Object
11.6. Type Hints и Static Analysis¶
Что это:
"Type hints are a feature in Python that allow developers to annotate their code with expected types for variables and function arguments."
typing Module (Python 3.5+):
- Generic types (List, Dict, Tuple, Set)
- Optional, Union
- Callable signatures
- Protocols
Static Analysis:
"mypy is a popular static type checker for Python. It analyzes Python code with type hints and reports type inconsistencies or errors."
TYPE_CHECKING constant:
"Sometimes there's code that must be seen by a type checker but should not be executed. For such situations the typing module defines a constant, TYPE_CHECKING, that is considered True during type checking but False at runtime."
Benefits:
- Improved readability (self-documenting)
- Static type checking (catch errors before runtime)
- Better IDE support (autocomplete, inline docs)

Drawbacks:
- Increased boilerplate
- Learning curve for beginners
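Короткий набросок self-documenting сигнатуры (синтаксис dict[str, int] - Python 3.9+, имена условные):

```python
from typing import Optional

def find_user(users: dict[str, int], name: str) -> Optional[int]:
    """Тип возврата сразу документирует: вернется id или None."""
    return users.get(name)

print(find_user({"alice": 1}, "alice"))  # 1
print(find_user({"alice": 1}, "bob"))    # None
```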
Sources:
- GeeksforGeeks - Type Hints in Python
- Matics Academy - Type Hints Complete Guide
11.7. Duck Typing и EAFP vs LBYL¶
Duck Typing:
"A programming style which does not look at an object's type to determine if it has the right interface; instead, the method or attribute is simply called or used."
Philosophy:
"If it looks like a duck and quacks like a duck, it must be a duck."
EAFP (Easier to Ask Forgiveness than Permission):
"Try first, and handle exceptions if needed. Preferred in Pythonic code."
LBYL (Look Before You Leap):
"Check conditions before performing an action. More common in statically typed languages."
When to use:
- EAFP: Errors rare, want cleaner code, multi-threaded environment
- LBYL: Cheaper to check, exceptions too broad/slow
Performance:
"With LBYL an extra operation will always occur to validate. With EAFP an extra operation will only sometimes occur on failure."
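Оба стиля на одном примере (данные условные):

```python
config = {"timeout": 30}

# LBYL: проверка перед действием
if "retries" in config:
    retries = config["retries"]
else:
    retries = 3

# EAFP: пробуем и обрабатываем исключение - более Pythonic
try:
    retries = config["retries"]
except KeyError:
    retries = 3

print(retries)  # 3
```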
Sources:
- Real Python - Duck Typing in Python
- Medium - EAFP vs LBYL
11.8. Built-in Functions: any, all, zip, enumerate¶
enumerate():
"Adds a counter to an iterable and returns it in a form of an enumerate object."
zip():
"Combines two or more iterables into a single iterator of tuples. Each tuple contains elements that share the same index."
Combining both:
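Набросок совместного использования (данные условные):

```python
names = ["Alice", "Bob"]
scores = [90, 85]

# zip склеивает пары, enumerate добавляет счетчик (start=1 - с единицы)
for i, (name, score) in enumerate(zip(names, scores), start=1):
    print(f"{i}. {name}: {score}")
# 1. Alice: 90
# 2. Bob: 85

print(any(s > 88 for s in scores), all(s > 88 for s in scores))  # True False
```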
any() / all():
- any() - Returns True if any element is true
- all() - Returns True if all elements are true (or empty iterable)
Sources:
- GeeksforGeeks - enumerate and zip together
- note.nkmk.me - enumerate and zip Together
12. Coding Patterns (не Python-специфичные, но важные)¶
Two Pointers Pattern¶
"The two-pointer technique is a pattern where two pointers iterate over the data structure in tandem or separately until they satisfy a certain condition."
Time Complexity: O(n) вместо O(n²)
Variants:
- Opposite Direction (start/end moving toward each other)
- Fast and Slow Pointers (Floyd's cycle detection)
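Классический пример варианта "Opposite Direction" - two sum в отсортированном массиве:

```python
def two_sum_sorted(nums, target):
    """Встречные указатели в отсортированном массиве: O(n) вместо O(n^2)."""
    lo, hi = 0, len(nums) - 1
    while lo < hi:
        s = nums[lo] + nums[hi]
        if s == target:
            return lo, hi
        if s < target:
            lo += 1   # сумма мала: двигаем левый указатель вправо
        else:
            hi -= 1   # сумма велика: двигаем правый указатель влево
    return None

print(two_sum_sorted([1, 3, 4, 6, 9], 10))  # (0, 4)
```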
Sliding Window Pattern¶
"A Sliding Window is an ingenious pattern for tackling problems involving subarrays or substrings. You slide a window (which can expand or shrink) across the data."
Key difference:
"Does the data between the pointers matter? Yes -> Sliding Window. No -> Two Pointers."
Keywords: "Longest substring", "Shortest subarray", "Max sum subarray of size K"
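Набросок задачи "Max sum subarray of size K" - окно фиксированного размера сдвигается за O(1):

```python
def max_sum_subarray(nums, k):
    """Sliding window: максимальная сумма подмассива длины k за O(n)."""
    window = sum(nums[:k])  # первое окно
    best = window
    for i in range(k, len(nums)):
        window += nums[i] - nums[i - k]  # добавили правый элемент, убрали левый
        best = max(best, window)
    return best

print(max_sum_subarray([2, 1, 5, 1, 3, 2], 3))  # 9 (окно [5, 1, 3])
```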
13. Что делает Python СПЕЦИФИЧНЫМ¶
Отличительные черты от других языков¶
Источник: InterviewBit - Python Interview Questions 2025
1. Indentation-Based Blocks:
"Where in other programming languages the indentation in code is for readability only, the indentation in Python is very important. Python uses indentation to indicate a block of code."
2. Interpreted Language:
"Python does not need to be compiled before it is run."
3. Dynamic Typing:
"You don't need to state the types of variables when you declare them."
4. First-Class Functions:
"Functions can be assigned to variables, returned from other functions and passed into functions."
5. Multiple Inheritance Support:
"Python does support multiple inheritances, unlike Java."
6. No Access Specifiers:
"Python does not have access specifiers (like C++'s public, private). Python lays down the concept of prefixing the name with single or double underscore."
7. Built-in Garbage Collector:
"Python has an inbuilt garbage collector, which recycles all the unused memory and frees the memory."
8. Strongly Typed:
"In a strongly-typed language, such as Python, '1' + 2 will result in a type error since these languages don't allow for 'type-coercion'."
9. Multi-Paradigm:
"It supports functional and structured programming methods as well as OOP."
10. Extensive Library:
"Python has a huge library of functions and data structures available."
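Несколько пунктов из списка (dynamic typing, strong typing, first-class functions, underscore-"приватность") можно проиллюстрировать одним наброском; имена `twice` и `Account` здесь условные:

```python
# 3. Dynamic typing: тип привязан к значению, а не к имени переменной
x = 1
x = "now a string"  # допустимо, тип переменной не фиксирован

# 8. Strong typing: неявной коэрции типов нет
coerced = True
try:
    "1" + 2  # в слабо типизированном языке получилось бы "12" или 3
except TypeError:
    coerced = False

# 4. First-class functions: функцию можно передать и вернуть
def twice(f):
    return lambda v: f(f(v))

inc2 = twice(lambda v: v + 1)  # inc2(3) -> 5

# 6. Нет access specifiers: двойной underscore даёт лишь name mangling
class Account:
    def __init__(self):
        self.__balance = 0  # снаружи всё равно доступно как _Account__balance
```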
Итоговая таблица: Что спрашивают чаще всего¶
| Концепция | Частота | Сложность | Why Python-Specific |
|---|---|---|---|
| Generators/yield | ⭐⭐⭐⭐⭐ | Medium | Lazy evaluation, state preservation |
| Decorators | ⭐⭐⭐⭐⭐ | Medium | Higher-order functions syntax sugar |
| List Comprehensions | ⭐⭐⭐⭐⭐ | Easy | Pythonic code style marker |
| Context Managers | ⭐⭐⭐⭐ | Medium | Resource management protocol |
| *args/**kwargs | ⭐⭐⭐⭐ | Easy | Flexible function signatures |
| Dunder Methods | ⭐⭐⭐⭐ | Medium | Operator overloading protocol |
| GIL | ⭐⭐⭐⭐ | Hard | CPython concurrency model |
| Asyncio | ⭐⭐⭐ | Hard | Single-threaded concurrency |
| Descriptors | ⭐⭐ | Hard | Attribute access protocol |
| Metaclasses | ⭐ | Very Hard | Rarely needed, black belt |
Типичные заблуждения¶
Заблуждение: @property и descriptor -- разные механизмы
@property -- это descriptor под капотом. Когда интервьюер спрашивает "как работает property?", ожидается ответ про __get__/__set__/__delete__ протокол. Property -- частный случай data descriptor.
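Минимальный набросок того же механизма руками -- data descriptor с `__set_name__`/`__get__`/`__set__` (имена `Celsius` и `Thermometer` условные):

```python
class Celsius:
    """Data descriptor: по сути то, что @property делает под капотом."""
    def __set_name__(self, owner, name):
        self.name = "_" + name  # где хранить значение в экземпляре

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self  # доступ через класс возвращает сам descriptor
        return getattr(obj, self.name)

    def __set__(self, obj, value):
        if value < -273.15:
            raise ValueError("below absolute zero")
        setattr(obj, self.name, value)


class Thermometer:
    celsius = Celsius()  # атрибут класса -- descriptor

    def __init__(self, value=0.0):
        self.celsius = value  # присваивание идёт через Celsius.__set__
```

Проверить тезис "property -- это descriptor" можно напрямую: у самого `property` есть `__get__`/`__set__`/`__delete__`.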
Заблуждение: asyncio заменяет threading
asyncio -- single-threaded cooperative multitasking. Если хотя бы одна корутина делает CPU-bound работу без await, весь event loop блокируется. Для 1 миллиона I/O-задач asyncio идеален (минимальный overhead), но для CPU-bound нужен multiprocessing.
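Набросок, показывающий кооперативность: три корутины с `await asyncio.sleep` выполняются конкурентно за время одной, тогда как блокирующий `time.sleep` внутри любой из них остановил бы весь event loop:

```python
import asyncio
import time


async def io_task(delay):
    # await уступает управление event loop'у;
    # time.sleep(delay) на этом месте заблокировал бы ВСЕ корутины
    await asyncio.sleep(delay)
    return delay


async def main():
    start = time.perf_counter()
    results = await asyncio.gather(io_task(0.1), io_task(0.1), io_task(0.1))
    elapsed = time.perf_counter() - start
    return results, elapsed


results, elapsed = asyncio.run(main())
# elapsed ~0.1s (конкурентно), а не 0.3s (последовательно)
```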
Заблуждение: metaclasses нужны для продвинутого Python
99% задач, решаемых metaclasses, проще решить через __init_subclass__ (Python 3.6+) или class decorators. Django ORM использует metaclasses исторически, но это не значит, что вам нужно. На интервью достаточно знать type как metaclass и __init_subclass__ как альтернативу.
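Типичный пример такой замены -- реестр подклассов (классический use case metaclasses в ORM), решённый через `__init_subclass__`; имена классов в наброске условные:

```python
class Plugin:
    """Регистрация подклассов без metaclass: __init_subclass__ (Python 3.6+)."""
    registry = {}

    def __init_subclass__(cls, name=None, **kwargs):
        super().__init_subclass__(**kwargs)
        # вызывается автоматически при создании каждого подкласса
        Plugin.registry[name or cls.__name__.lower()] = cls


class JsonExporter(Plugin, name="json"):  # keyword-аргумент уходит в __init_subclass__
    pass


class CsvExporter(Plugin):  # без имени -- ключ по имени класса
    pass
```

Та же логика через metaclass потребовала бы отдельного класса, наследующего `type`, и переопределения `__new__` -- заметно больше кода ради того же эффекта.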
Вопросы на собеседовании¶
"Объясните разницу между generator и list comprehension"
Слабый ответ: "Generator -- ленивый, list -- нет"
Сильный ответ: "Generator expression (x for x in data) возвращает iterator, занимает O(1) памяти, но одноразовый -- нельзя перебрать дважды, нет len() и индексации []. List comprehension [x for x in data] создаёт весь список сразу -- O(n) памяти, но с произвольным доступом. Для 1M элементов разница: десятки мегабайт у списка против пары сотен байт у генератора."
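Оба тезиса -- O(1) память и одноразовость -- легко продемонстрировать через `sys.getsizeof` (конкретные размеры зависят от версии CPython):

```python
import sys

data = range(1_000_000)
as_list = [x for x in data]   # весь список в памяти
as_gen = (x for x in data)    # только состояние итератора

list_size = sys.getsizeof(as_list)  # мегабайты: хранятся все ссылки
gen_size = sys.getsizeof(as_gen)    # константа, не зависит от числа элементов

first_pass = sum(as_gen)   # исчерпывает генератор
second_pass = sum(as_gen)  # повторный проход даёт 0 -- генератор одноразовый
```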
"Как работает __slots__?"
Слабый ответ: "Ограничивает атрибуты класса"
Сильный ответ: "__slots__ заменяет __dict__ фиксированным набором атрибутов. 1M объектов без slots -- 171 MiB, с slots -- 71 MiB (экономия ~100 MiB). Минусы: нет динамических атрибутов, сложности с наследованием. Python 3.10+: @dataclass(slots=True)."
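Набросок, показывающий оба следствия -- отсутствие `__dict__` и запрет динамических атрибутов (имена классов условные):

```python
class PointDict:
    """Обычный класс: у каждого экземпляра есть __dict__."""
    def __init__(self, x, y):
        self.x, self.y = x, y


class PointSlots:
    """Фиксированный набор атрибутов: __dict__ у экземпляров не создаётся."""
    __slots__ = ("x", "y")

    def __init__(self, x, y):
        self.x, self.y = x, y
        # в Python 3.10+ то же даёт @dataclass(slots=True)


p = PointSlots(1, 2)
# p.z = 3  -> AttributeError: динамические атрибуты запрещены
```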
"Когда использовать threading vs multiprocessing vs asyncio?"
Слабый ответ: "multiprocessing для тяжёлых задач, threading для лёгких"
Сильный ответ: "CPU-bound (ML inference, image processing) -> multiprocessing (обходит GIL, true parallelism). I/O-bound с малым числом connections -> threading (проще, shared memory). I/O-bound с тысячами+ connections -> asyncio (minimal overhead, single thread). C-extensions (NumPy) освобождают GIL, поэтому threading работает для них."
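Практически это выбор executor'а в `concurrent.futures`; набросок с условной I/O-bound задачей `fetch` (реального сетевого вызова здесь нет):

```python
from concurrent.futures import ThreadPoolExecutor


def fetch(url):
    # условная I/O-bound задача; во время реального сетевого вызова
    # GIL освобождается, поэтому потоки дают выигрыш
    return len(url)


urls = ["https://example.com/a", "https://example.com/bb"]

# I/O-bound, небольшое число задач -> ThreadPoolExecutor (shared memory, просто).
# CPU-bound -> замените на ProcessPoolExecutor: отдельные процессы обходят GIL.
# I/O-bound с тысячами соединений -> asyncio (см. пункт про event loop выше).
with ThreadPoolExecutor(max_workers=4) as pool:
    sizes = list(pool.map(fetch, urls))
```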
Источники (полный список)¶
Generators¶
- GeeksforGeeks - Top 50+ Python Interview Questions 2025
- W3Resource - Python Generators Practice
- InterviewBit - Python Interview Questions 2025
Decorators¶
- W3Resource - Python Decorators Practice
- GeeksforGeeks - Decorators in Python
- Terminal.io - 15 Python Interview Questions
Context Managers¶
- GeeksforGeeks - Context Manager in Python
- Real Python - Python's with Statement
- DynamicDuniya - 50+ Context Manager Q&A
Dunder Methods¶
- GeeksforGeeks - Dunder Magic Methods
- Real Python - Python's Magic Methods
- DataCamp - Python Dunder Methods
GIL & Concurrency¶
- Towards Data Science - Multithreading, Multiprocessing, Asyncio
- GeeksforGeeks - Asyncio Vs Threading
- DEV Community - GIL in Python
List Comprehensions¶
- GeeksforGeeks - List Comprehension Interview Questions
- Medium - List Comprehensions for Array Questions