Implementing Custom Serializers

Custom serializers allow you to override how pydantic-core transforms Python objects into serialized formats (such as JSON or plain Python dictionaries). You can implement them as "plain" serializers, which perform a direct transformation, or as "wrap" serializers, which intercept and modify the default serialization process.

In this tutorial, you will build a serialization pipeline that handles custom type formatting, collection filtering, and context-aware field serialization.

Prerequisites

To follow this tutorial, you need pydantic-core installed. You will primarily use the core_schema module to define your serialization logic.

from pydantic_core import SchemaSerializer, core_schema, PydanticOmit
from collections import deque
from typing import Any

Step 1: Creating a Plain Serializer

A plain serializer is the simplest form of custom serialization. It takes a value and returns the serialized version. You use plain_serializer_function_ser_schema to define it.

In this example, you'll create a serializer that converts any value to its string representation using repr().

def repr_function(value: Any, info: core_schema.SerializationInfo) -> str:
    # info.mode_is_json() tells us whether we are serializing to JSON or Python
    return repr(value)

schema = core_schema.any_schema(
    serialization=core_schema.plain_serializer_function_ser_schema(
        repr_function,
        info_arg=True,
    )
)

serializer = SchemaSerializer(schema)
print(serializer.to_python(123))
# Output: 123 (the returned value is the string '123')

How it works

  • function: The first argument is your custom serialization function.
  • info_arg=True: This tells Pydantic to pass a SerializationInfo object as the second argument to your function. This object contains metadata like the serialization mode (json or python).
  • plain_serializer_function_ser_schema: This creates a schema that completely replaces the default serialization logic for the target field or type.
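
To see the mode distinction from SerializationInfo in action, here is a minimal, self-contained sketch; the mode_aware function and its tag strings are illustrative, not part of the pydantic-core API:

```python
from typing import Any
from pydantic_core import SchemaSerializer, core_schema

def mode_aware(value: Any, info: core_schema.SerializationInfo) -> str:
    # Branch on the serialization mode reported by SerializationInfo
    return f'json:{value}' if info.mode_is_json() else f'python:{value}'

schema = core_schema.any_schema(
    serialization=core_schema.plain_serializer_function_ser_schema(
        mode_aware,
        info_arg=True,
    )
)

s = SchemaSerializer(schema)
print(s.to_python(1))  # python:1
print(s.to_json(1))    # b'"json:1"'
```

Note that to_json returns bytes containing the JSON document, so the string result is wrapped in JSON quotes.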

Step 2: Implementing a Wrap Serializer

Wrap serializers are more powerful because they allow you to call the "default" serialization logic inside your custom function. This is useful for pre-processing or post-processing data.

You will now create a serializer for a deque that respects Pydantic's include and exclude parameters by using the SerializerFunctionWrapHandler.

def serialize_deque(
    value: Any,
    handler: core_schema.SerializerFunctionWrapHandler,
    info: core_schema.SerializationInfo,
) -> Any:
    items = []
    for index, item in enumerate(value):
        try:
            # The handler takes the value and an optional index/key,
            # which allows Pydantic to check include/exclude rules
            v = handler(item, index)
        except PydanticOmit:
            # PydanticOmit is raised if the item should be excluded
            continue
        else:
            items.append(v)

    if info.mode_is_json():
        return items
    return deque(items)

schema = core_schema.any_schema(
    serialization=core_schema.wrap_serializer_function_ser_schema(
        serialize_deque,
        info_arg=True,
        schema=core_schema.any_schema(),
    )
)

s = SchemaSerializer(schema)
data = deque([10, 20, 30])
print(s.to_python(data, exclude={1}))
# Output: deque([10, 30])

How it works

  • handler: A SerializerFunctionWrapHandler that you call to perform the next step in the serialization chain.
  • index_key: When calling handler(item, index), you pass the index so Pydantic knows which element is being processed, enabling correct filtering via exclude.
  • PydanticOmit: Catching this exception allows your custom logic to gracefully skip items that Pydantic's internal logic has decided to exclude.
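
Because filtering goes through the same handler mechanism, include should work symmetrically to exclude. Here is a self-contained sketch (the ser function name and the sample deque are illustrative):

```python
from typing import Any
from collections import deque
from pydantic_core import SchemaSerializer, core_schema, PydanticOmit

def ser(
    value: Any,
    handler: core_schema.SerializerFunctionWrapHandler,
    info: core_schema.SerializationInfo,
) -> Any:
    items = []
    for i, item in enumerate(value):
        try:
            # Passing the index lets Pydantic apply include/exclude filters
            items.append(handler(item, i))
        except PydanticOmit:
            continue
    return items if info.mode_is_json() else deque(items)

schema = core_schema.any_schema(
    serialization=core_schema.wrap_serializer_function_ser_schema(
        ser,
        info_arg=True,
        schema=core_schema.any_schema(),
    )
)

s = SchemaSerializer(schema)
d = deque(['a', 'b', 'c'])
print(s.to_python(d, include={0, 2}))  # deque(['a', 'c'])
print(s.to_json(d))                    # b'["a","b","c"]'
```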

Step 3: Context-Aware Field Serializers

Sometimes you need to know which field you are serializing or access the parent object. By setting is_field_serializer=True, your function receives the parent object as the first argument and a FieldSerializationInfo object.

def ser_x(
    parent: Any,
    value: Any,
    handler: core_schema.SerializerFunctionWrapHandler,
    info: core_schema.FieldSerializationInfo,
) -> str:
    # Access the parent object (e.g., a dict or model instance)
    # Access the field name via info.field_name
    serialized_val = handler(value)
    return f"{serialized_val}-{info.field_name}"

schema = core_schema.typed_dict_schema(
    {
        'x': core_schema.typed_dict_field(
            core_schema.int_schema(
                serialization=core_schema.wrap_serializer_function_ser_schema(
                    ser_x,
                    is_field_serializer=True,
                    info_arg=True,
                )
            )
        ),
    }
)

s = SchemaSerializer(schema)
print(s.to_python({'x': 1000}))
# Output: {'x': '1000-x'}

How it works

  • is_field_serializer=True: Changes the function signature to (parent, value, [handler], info).
  • FieldSerializationInfo: Provides the field_name property, which is essential for generic serializers applied to multiple fields.
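
Since a generic field serializer only depends on info.field_name, the same serialization schema can be attached to several fields. A sketch using a plain field serializer, whose signature is (parent, value, info); the tag_with_field helper and the field names a and b are illustrative:

```python
from typing import Any
from pydantic_core import SchemaSerializer, core_schema

def tag_with_field(parent: Any, value: Any, info: core_schema.FieldSerializationInfo) -> str:
    # Tag each serialized value with the name of the field it came from
    return f'{value}@{info.field_name}'

ser = core_schema.plain_serializer_function_ser_schema(
    tag_with_field,
    is_field_serializer=True,
    info_arg=True,
)

schema = core_schema.typed_dict_schema(
    {
        'a': core_schema.typed_dict_field(core_schema.int_schema(serialization=ser)),
        'b': core_schema.typed_dict_field(core_schema.int_schema(serialization=ser)),
    }
)

s = SchemaSerializer(schema)
print(s.to_python({'a': 1, 'b': 2}))  # {'a': '1@a', 'b': '2@b'}
```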

Step 4: Controlling When Serializers Run

You can control when your custom logic is triggered using the when_used parameter in both plain and wrap schemas.

  • 'always' (default): always use the custom serializer.
  • 'unless-none': use the custom serializer unless the value is None.
  • 'json': only use the custom serializer when serializing to JSON.
  • 'json-unless-none': only use it for JSON, and only if the value is not None.

# Example: Only use custom logic for JSON output
serialization_schema = core_schema.plain_serializer_function_ser_schema(
    repr_function,
    when_used='json',
)
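
You can confirm the when_used='json' behavior by serializing the same value in both modes; the as_repr helper below is illustrative:

```python
from typing import Any
from pydantic_core import SchemaSerializer, core_schema

def as_repr(value: Any) -> str:
    # Without info_arg=True, the function receives only the value
    return repr(value)

schema = core_schema.int_schema(
    serialization=core_schema.plain_serializer_function_ser_schema(
        as_repr,
        when_used='json',
    )
)

s = SchemaSerializer(schema)
print(s.to_python(123))  # 123 -- custom serializer skipped in python mode
print(s.to_json(123))    # b'"123"' -- custom serializer applied for JSON
```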

Complete Result

By combining these techniques, you can handle complex data structures. Here is a complete example of a wrap serializer that handles a custom collection while respecting global serialization settings:

from pydantic_core import SchemaSerializer, core_schema, PydanticOmit
from collections import deque

def custom_collection_serializer(value, handler, info):
    results = []
    for i, v in enumerate(value):
        try:
            results.append(handler(v, i))
        except PydanticOmit:
            continue
    return results if info.mode_is_json() else deque(results)

schema = core_schema.any_schema(
    serialization=core_schema.wrap_serializer_function_ser_schema(
        custom_collection_serializer,
        info_arg=True,
        when_used='always',
    )
)

serializer = SchemaSerializer(schema)
# This will now correctly handle deques, JSON mode, and exclusions.

Next, you might want to explore return_schema to validate the output of your custom serializers or use context within SerializationInfo to pass runtime data into your serialization logic.
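
As a starting point for the context idea, here is a hedged sketch. It assumes a pydantic-core version whose to_python accepts a context argument that is then exposed as info.context; the ctx_aware function and the 'shout' key are illustrative:

```python
from typing import Any
from pydantic_core import SchemaSerializer, core_schema

def ctx_aware(value: Any, info: core_schema.SerializationInfo) -> Any:
    # info.context carries whatever was passed as `context=` at call time
    ctx = info.context or {}
    return value.upper() if ctx.get('shout') else value

schema = core_schema.str_schema(
    serialization=core_schema.plain_serializer_function_ser_schema(
        ctx_aware,
        info_arg=True,
    )
)

s = SchemaSerializer(schema)
print(s.to_python('hi'))                           # hi
print(s.to_python('hi', context={'shout': True}))  # HI
```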