types¶
Custom SQLAlchemy types for use with the ORM.
Core Types¶
- class advanced_alchemy.types.GUID[source]¶
Bases: TypeDecorator

Platform-independent GUID type.

Uses PostgreSQL's UUID type (Postgres, DuckDB, CockroachDB), MSSQL's UNIQUEIDENTIFIER type, and Oracle's RAW(16) type; otherwise uses BINARY(16) or CHAR(32), storing values as stringified hex.

Accepts stringified UUIDs (hex strings) as well as actual UUID objects.
- cache_ok: Optional[bool] = True¶
Indicate if statements using this ExternalType are “safe to cache”.

The default value None will emit a warning and then not allow caching of a statement which includes this type. Set to False to disable statements using this type from being cached at all without a warning. When set to True, the object’s class and selected elements from its state will be used as part of the cache key. For example, using a TypeDecorator:

    class MyType(TypeDecorator):
        impl = String
        cache_ok = True

        def __init__(self, choices):
            self.choices = tuple(choices)
            self.internal_only = True

The cache key for the above type would be equivalent to:

    >>> MyType(["a", "b", "c"])._static_cache_key
    (<class '__main__.MyType'>, ('choices', ('a', 'b', 'c')))

The caching scheme will extract attributes from the type that correspond to the names of parameters in the __init__() method. Above, the “choices” attribute becomes part of the cache key but “internal_only” does not, because there is no parameter named “internal_only”. The requirement for cacheable elements is that they are hashable and that they indicate the same SQL rendered for expressions using this type every time for a given cache value.

To accommodate datatypes that refer to unhashable structures such as dictionaries, sets, and lists, these objects can be made “cacheable” by assigning hashable structures to the attributes whose names correspond with the names of the arguments. For example, a datatype which accepts a dictionary of lookup values may publish this as a sorted series of tuples. Given a previously un-cacheable type as:

    class LookupType(UserDefinedType):
        """a custom type that accepts a dictionary as a parameter.

        this is the non-cacheable version, as "self.lookup" is not hashable.
        """

        def __init__(self, lookup):
            self.lookup = lookup

        def get_col_spec(self, **kw):
            return "VARCHAR(255)"

        def bind_processor(self, dialect): ...  # works with "self.lookup" ...

Where “lookup” is a dictionary. The type will not be able to generate a cache key:

    >>> type_ = LookupType({"a": 10, "b": 20})
    >>> type_._static_cache_key
    <stdin>:1: SAWarning: UserDefinedType LookupType({'a': 10, 'b': 20}) will not
    produce a cache key because the ``cache_ok`` flag is not set to True. Set this
    flag to True if this type object's state is safe to use in a cache key, or
    False to disable this warning.
    symbol('no_cache')

If we did set up such a cache key, it wouldn’t be usable. We would get a tuple structure that contains a dictionary inside of it, which cannot itself be used as a key in a “cache dictionary” such as SQLAlchemy’s statement cache, since Python dictionaries aren’t hashable:

    >>> # set cache_ok = True
    >>> type_.cache_ok = True
    >>> # this is the cache key it would generate
    >>> key = type_._static_cache_key
    >>> key
    (<class '__main__.LookupType'>, ('lookup', {'a': 10, 'b': 20}))
    >>> # however this key is not hashable, will fail when used with
    >>> # SQLAlchemy statement cache
    >>> some_cache = {key: "some sql value"}
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    TypeError: unhashable type: 'dict'

The type may be made cacheable by assigning a sorted tuple of tuples to the “.lookup” attribute:

    class LookupType(UserDefinedType):
        """a custom type that accepts a dictionary as a parameter.

        The dictionary is stored both as itself in a private variable, and
        published in a public variable as a sorted tuple of tuples, which is
        hashable and will also return the same value for any two equivalent
        dictionaries. Note it assumes the keys and values of the dictionary
        are themselves hashable.
        """

        cache_ok = True

        def __init__(self, lookup):
            self._lookup = lookup
            # assume keys/values of "lookup" are hashable; otherwise
            # they would also need to be converted in some way here
            self.lookup = tuple((key, lookup[key]) for key in sorted(lookup))

        def get_col_spec(self, **kw):
            return "VARCHAR(255)"

        def bind_processor(self, dialect): ...  # works with "self._lookup" ...

Where above, the cache key for LookupType({"a": 10, "b": 20}) will be:

    >>> LookupType({"a": 10, "b": 20})._static_cache_key
    (<class '__main__.LookupType'>, ('lookup', (('a', 10), ('b', 20))))

Added in version 1.4.14: added the cache_ok flag to allow some configurability of caching for TypeDecorator classes.

Added in version 1.4.28: added the ExternalType mixin, which generalizes the cache_ok flag to both the TypeDecorator and UserDefinedType classes.
- property python_type: type[UUID]¶
Return the Python type object expected to be returned by instances of this type, if known.
Basically, for those types which enforce a return type, or are known across the board to do such for all common DBAPIs (like int, for example), will return that type.

If a return type is not defined, raises NotImplementedError.

Note that any type also accommodates NULL in SQL, which means you can also get back None from any type in practice.
- __init__(*args, binary=True, **kwargs)[source]¶
Construct a TypeDecorator.

Arguments sent here are passed to the constructor of the class assigned to the impl class-level attribute, assuming the impl is a callable, and the resulting object is assigned to the self.impl instance attribute (thus overriding the class attribute of the same name).

If the class-level impl is not a callable (the unusual case), it will be assigned to the same instance attribute ‘as-is’, ignoring those arguments passed to the constructor.

Subclasses can override this to customize the generation of self.impl entirely.
- load_dialect_impl(dialect)[source]¶
Return a TypeEngine object corresponding to a dialect.

This is an end-user override hook that can be used to provide differing types depending on the given dialect. It is used by the TypeDecorator implementation of type_engine() to help determine what type should ultimately be returned for a given TypeDecorator.

By default returns self.impl.
- process_bind_param(value, dialect)[source]¶
Receive a bound parameter value to be converted.
Custom subclasses of _types.TypeDecorator should override this method to provide custom behaviors for incoming data values. This method is called at statement execution time and is passed the literal Python data value which is to be associated with a bound parameter in the statement.

The operation could be anything desired to perform custom behavior, such as transforming or serializing data. This could also be used as a hook for validation logic.
- process_result_value(value, dialect)[source]¶
Receive a result-row column value to be converted.
Custom subclasses of _types.TypeDecorator should override this method to provide custom behaviors for data values being received in result rows coming from the database. This method is called at result fetching time and is passed the literal Python data value that’s extracted from a database result row.

The operation could be anything desired to perform custom behavior, such as transforming or deserializing data.
SQLAlchemy type for UUID/GUID columns with database-specific implementations.
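A minimal usage sketch (model and column names are illustrative; assumes a standard SQLAlchemy 2.0 declarative setup):

    import uuid

    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    from advanced_alchemy.types import GUID


    class Base(DeclarativeBase):
        pass


    class User(Base):
        __tablename__ = "user_account"

        # Stored as a native UUID on PostgreSQL, RAW(16) on Oracle,
        # and BINARY(16)/CHAR(32) elsewhere; read back as uuid.UUID.
        id: Mapped[uuid.UUID] = mapped_column(GUID, primary_key=True, default=uuid.uuid4)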
- advanced_alchemy.types.JsonB¶
alias of JSON()
Enhanced JSON type with JSONB support for PostgreSQL and optimized storage for other databases.
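A short sketch of a JsonB column (names are illustrative; assumes the declarative setup shown above). Per the variant composition documented under ORA_JSONB below, this renders as JSONB on PostgreSQL, Oracle’s binary JSON on Oracle, and plain JSON elsewhere:

    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    from advanced_alchemy.types import JsonB


    class Base(DeclarativeBase):
        pass


    class Setting(Base):
        __tablename__ = "setting"

        id: Mapped[int] = mapped_column(primary_key=True)
        # JsonB is a pre-built type instance, so it is passed directly.
        payload: Mapped[dict] = mapped_column(JsonB, default=dict)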
- class advanced_alchemy.types.DateTimeUTC[source]¶
Bases: TypeDecorator

Timezone-aware DateTime.

Ensures UTC is stored in the database and that timezone-aware datetimes are returned for all dialects.
- cache_ok: Optional[bool] = True¶
Indicate if statements using this ExternalType are “safe to cache”. See GUID.cache_ok above for the full discussion of statement caching and the cacheable-attribute requirements.
- property python_type: type[datetime]¶
Return the Python type object expected to be returned by instances of this type, if known.
Basically, for those types which enforce a return type, or are known across the board to do such for all common DBAPIs (like int, for example), will return that type.

If a return type is not defined, raises NotImplementedError.

Note that any type also accommodates NULL in SQL, which means you can also get back None from any type in practice.
- process_bind_param(value, dialect)[source]¶
Receive a bound parameter value to be converted.
Custom subclasses of _types.TypeDecorator should override this method to provide custom behaviors for incoming data values. This method is called at statement execution time and is passed the literal Python data value which is to be associated with a bound parameter in the statement.

The operation could be anything desired to perform custom behavior, such as transforming or serializing data. This could also be used as a hook for validation logic.
- process_result_value(value, dialect)[source]¶
Receive a result-row column value to be converted.
Custom subclasses of _types.TypeDecorator should override this method to provide custom behaviors for data values being received in result rows coming from the database. This method is called at result fetching time and is passed the literal Python data value that’s extracted from a database result row.

The operation could be anything desired to perform custom behavior, such as transforming or deserializing data.
DateTime type that enforces UTC timezone and provides timezone-aware operations.
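A sketch of a UTC-enforcing column (names are illustrative). Bound values are expected to be timezone-aware; they are normalized to UTC before storage and come back timezone-aware on every dialect:

    import datetime

    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    from advanced_alchemy.types import DateTimeUTC


    class Base(DeclarativeBase):
        pass


    class Event(Base):
        __tablename__ = "event"

        id: Mapped[int] = mapped_column(primary_key=True)
        # The default is a callable so the timestamp is taken at insert time.
        created_at: Mapped[datetime.datetime] = mapped_column(
            DateTimeUTC,
            default=lambda: datetime.datetime.now(datetime.timezone.utc),
        )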
- class advanced_alchemy.types.ORA_JSONB[source]¶
Bases: TypeDecorator, SchemaType

Oracle Binary JSON type.

    JsonB = _JSON().with_variant(PG_JSONB, "postgresql").with_variant(ORA_JSONB, "oracle")
- cache_ok: Optional[bool] = True¶
Indicate if statements using this ExternalType are “safe to cache”. See GUID.cache_ok above for the full discussion of statement caching and the cacheable-attribute requirements.
- property python_type: type[dict[str, Any]]¶
Return the Python type object expected to be returned by instances of this type, if known.
Basically, for those types which enforce a return type, or are known across the board to do such for all common DBAPIs (like int, for example), will return that type.

If a return type is not defined, raises NotImplementedError.

Note that any type also accommodates NULL in SQL, which means you can also get back None from any type in practice.
- coerce_compared_value(op, value)[source]¶
Suggest a type for a ‘coerced’ Python value in an expression.
By default, returns self. This method is called by the expression system when an object using this type is on the left or right side of an expression against a plain Python object which does not yet have a SQLAlchemy type assigned:

    expr = table.c.somecolumn + 35

Where above, if somecolumn uses this type, this method will be called with the value operator.add and 35. The return value is whatever SQLAlchemy type should be used for 35 for this particular operation.
- load_dialect_impl(dialect)[source]¶
Return a TypeEngine object corresponding to a dialect.

This is an end-user override hook that can be used to provide differing types depending on the given dialect. It is used by the TypeDecorator implementation of type_engine() to help determine what type should ultimately be returned for a given TypeDecorator.

By default returns self.impl.

- Parameters:
  dialect (Dialect)
- Return type:
  TypeEngine
- process_bind_param(value, dialect)[source]¶
Receive a bound parameter value to be converted.
Custom subclasses of _types.TypeDecorator should override this method to provide custom behaviors for incoming data values. This method is called at statement execution time and is passed the literal Python data value which is to be associated with a bound parameter in the statement.

The operation could be anything desired to perform custom behavior, such as transforming or serializing data. This could also be used as a hook for validation logic.
- process_result_value(value, dialect)[source]¶
Receive a result-row column value to be converted.
Custom subclasses of _types.TypeDecorator should override this method to provide custom behaviors for data values being received in result rows coming from the database. This method is called at result fetching time and is passed the literal Python data value that’s extracted from a database result row.

The operation could be anything desired to perform custom behavior, such as transforming or deserializing data.
Oracle-specific JSON type using native JSON column support (Oracle 21c+).
File Storage¶
- class advanced_alchemy.types.FileObject[source]¶
Bases: object

Represents file metadata during processing using a dataclass structure.

This class provides a unified interface for handling file metadata and operations across different storage backends.

Content or a source path can optionally be provided at initialization via kwargs; it is stored internally, and the save/save_async methods persist this pending data using the configured backend.
- __init__(backend, filename, to_filename=None, content_type=None, size=None, last_modified=None, checksum=None, etag=None, version_id=None, metadata=None, source_path=None, content=None)[source]¶
Perform post-initialization validation and setup.
Handles default path, content type guessing, backend protocol inference, and processing of ‘content’ or ‘source_path’ from extra kwargs.
- Raises:
ValueError – If filename is not provided, size is negative, backend/protocol mismatch, or both ‘content’ and ‘source_path’ are provided.
- to_dict()[source]¶
Convert FileObject to a dictionary for storage or serialization.
- Note: The ‘backend’ attribute is intentionally excluded, as it’s often not serializable or relevant for storage representations. The ‘extra’ dict is included.
- async get_content_async(*, options=None)[source]¶
Get the file content from the storage backend asynchronously.
- async sign_async(*, expires_in=None, for_upload=False)[source]¶
Generate a signed URL for the file asynchronously.
- delete()[source]¶
Delete the file from storage.
- Raises:
RuntimeError – If no backend is configured or path is missing.
- Return type:
  None
- save(data=None, *, use_multipart=None, chunk_size=5 * 1024 * 1024, max_concurrency=12)[source]¶
Save data to the storage backend using this FileObject’s metadata.
If data is provided, it is used directly. If data is None, checks internal source_content or source_path. Clears pending attributes after successful save.
- Parameters:
  data (IO[bytes] | Path | bytes | Iterator[bytes] | Iterable[bytes] | None) – Optional data to save (bytes, iterator, file-like, Path). If None, internal pending data is used.
  use_multipart (bool | None) – Passed to the backend’s save method.
  chunk_size (int) – Passed to the backend’s save method.
  max_concurrency (int) – Passed to the backend’s save method.
- Return type:
  FileObject
- Returns:
  The updated FileObject instance returned by the backend.
- Raises:
TypeError – If trying to save async data synchronously.
- async save_async(data=None, *, use_multipart=None, chunk_size=5 * 1024 * 1024, max_concurrency=12)[source]¶
Save data to the storage backend asynchronously.
If data is provided, it is used directly. If data is None, checks internal source_content or source_path. Clears pending attributes after successful save. Uses asyncio.to_thread for reading source_path if backend doesn’t handle Path directly.
- Parameters:
  data (IO[bytes] | Path | bytes | AsyncIterator[bytes] | AsyncIterable[bytes] | Iterator[bytes] | Iterable[bytes] | None) – Optional data to save (bytes, async iterator, file-like, Path, etc.). If None, internal pending data is used.
  use_multipart (bool | None) – Passed to the backend’s async save method.
  chunk_size (int) – Passed to the backend’s async save method.
  max_concurrency (int) – Passed to the backend’s async save method.
- Return type:
  FileObject
- Returns:
  The updated FileObject instance returned by the backend.
- Raises:
TypeError – If trying to save sync data asynchronously.
- classmethod __get_pydantic_core_schema__(source_type, handler)[source]¶
Get the Pydantic core schema for FileObject.
This method defines how Pydantic should validate and serialize FileObject instances. It creates a schema that validates dictionaries with the required fields and converts them to FileObject instances.
- Raises:
MissingDependencyError – If Pydantic is not installed when this method is called.
- Parameters:
  source_type (Any) – The source type being validated.
  handler (GetCoreSchemaHandler) – The Pydantic schema handler.
- Return type:
  CoreSchema (the union of pydantic-core schema TypedDicts)
- Returns:
A Pydantic core schema for FileObject
SQLAlchemy type for file storage with support for multiple backends (fsspec, obstore).
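A hedged sketch of the pending-content flow described above. The "attachments" backend key is an assumption (it must already be registered in the storage registry), and the synchronous sign counterpart to sign_async is assumed from the class’s sync/async pairing:

    from advanced_alchemy.types import FileObject

    # Content is held on the instance until save() pushes it to the backend.
    obj = FileObject(
        backend="attachments",          # assumed registered backend key
        filename="notes.txt",
        content_type="text/plain",
        content=b"hello world",
    )
    obj.save()                          # persists the pending content
    url = obj.sign(expires_in=3600)     # sync counterpart of sign_async (assumed)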
- advanced_alchemy.types.FileObjectList¶
alias of MutableList[FileObject]
List of FileObject instances with SQLAlchemy integration.
- class advanced_alchemy.types.StoredObject[source]¶
Bases: TypeDecorator

Custom SQLAlchemy type for storing single or multiple file metadata.
Stores file metadata in JSONB and handles file validation, processing, and storage operations through a configured storage backend.
- cache_ok: Optional[bool] = True¶
Indicate if statements using this ExternalType are “safe to cache”. See GUID.cache_ok above for the full discussion of statement caching and the cacheable-attribute requirements.
- property python_type: type[FileObject | list[FileObject] | set[FileObject] | MutableList[FileObject] | None]¶
Specifies the Python type used, accounting for the multiple flag.
- property backend: StorageBackend¶
Resolves and returns the storage backend instance.
- __init__(backend, multiple=False, *args, **kwargs)[source]¶
Initialize StoredObject type.
- Parameters:
backend (str | StorageBackend) – Key used to retrieve the backend from the storage registry, or a storage backend instance to use directly.
  multiple (bool) – If True, stores a list of files; otherwise, a single file.
  *args (Any) – Additional positional arguments for TypeDecorator.
  **kwargs (Any) – Additional keyword arguments for TypeDecorator.
- Return type:
None
- process_bind_param(value, dialect)[source]¶
Convert FileObject(s) to JSON representation for the database.
Injects the configured backend into the FileObject before conversion.
- Note: This method expects an already processed FileObject or its dict representation; use handle_upload() or handle_upload_async() for processing raw uploads.
- Parameters:
  value (FileObject | list[FileObject] | None) – The FileObject(s) to convert.
  dialect (Dialect) – The SQLAlchemy dialect in use.
- Raises:
TypeError – If the input value is not a FileObject or a list of FileObjects.
- Returns:
A dictionary representing the file metadata, or None if the input value is None.
- Return type:
Internal representation of a stored file.
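A sketch tying StoredObject to a model column (the backend key and model names are illustrative; with multiple=True the column would instead hold a MutableList[FileObject]):

    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    from advanced_alchemy.types import FileObject, StoredObject


    class Base(DeclarativeBase):
        pass


    class Document(Base):
        __tablename__ = "document"

        id: Mapped[int] = mapped_column(primary_key=True)
        # "attachments" is an assumed key resolved via the storage registry.
        attachment: Mapped[FileObject] = mapped_column(StoredObject(backend="attachments"))

On flush, process_bind_param injects the configured backend into the FileObject and serializes its metadata to the JSON column; result rows are rehydrated back into FileObject instances.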
- class advanced_alchemy.types.StorageBackend[source]¶
Bases: ABC

Unified protocol for storage backend implementations supporting both sync and async operations.
- abstractmethod async get_content_async(path, *, options=None)[source]¶
Get the content of a file asynchronously.
- abstractmethod save_object(file_object, data, *, use_multipart=None, chunk_size=5 * 1024 * 1024, max_concurrency=12)[source]¶
Store a file using information from a FileObject.
- Parameters:
  file_object (FileObject) – A FileObject instance containing metadata like path and content_type.
  data (DataLike) – The file data to store.
  use_multipart (Optional[bool]) – Whether to use multipart upload.
  chunk_size (int) – Size of chunks for multipart upload.
  max_concurrency (int) – Maximum number of concurrent uploads.
- Returns:
  The stored file object, potentially updated with backend info (size, etag, etc.).
- Return type:
  FileObject
- abstractmethod async save_object_async(file_object, data, *, use_multipart=None, chunk_size=5 * 1024 * 1024, max_concurrency=12)[source]¶
Store a file asynchronously using information from a FileObject.
- Parameters:
  file_object (FileObject) – A FileObject instance containing metadata like path and content_type.
  data (IO[bytes] | Path | bytes | AsyncIterator[bytes] | AsyncIterable[bytes] | Iterator[bytes] | Iterable[bytes]) – The file data to store.
  use_multipart (Optional[bool]) – Whether to use multipart upload.
  chunk_size (int) – Size of chunks for multipart upload.
  max_concurrency (int) – Maximum number of concurrent uploads.
- Returns:
  The stored file object, potentially updated with backend info (size, etag, etc.).
- Return type:
  FileObject
- abstractmethod sign(paths, *, expires_in=None, for_upload=False)[source]¶
Generate a signed URL for one or more files.
- Parameters:
  paths (str | Path | PathLike[Any] | Sequence[str | Path | PathLike[Any]]) – Path or paths to generate URLs for.
  expires_in (int | None) – Optional expiration time in seconds.
  for_upload (bool)
- Returns:
  The signed URL.
- Return type:
- abstractmethod async sign_async(paths, *, expires_in=None, for_upload=False)[source]¶
Generate a signed URL for one or more files asynchronously.
- Parameters:
  paths (str | Path | PathLike[Any] | Sequence[str | Path | PathLike[Any]]) – Path or paths to generate URLs for.
  expires_in (int | None) – Optional expiration time in seconds.
  for_upload (bool)
- Returns:
  The signed URL.
- Return type:
- class advanced_alchemy.types.StorageRegistry[source]¶
Bases: object

A registry for configuring and managing storage backends.
- __init__(json_serializer=encode_json, json_deserializer=decode_json, default_backend=DEFAULT_BACKEND)[source]¶
Initialize the StorageRegistry.
- set_default_backend(default_backend)[source]¶
Set the default storage backend.
- Parameters:
  default_backend (Union[str, type[StorageBackend]]) – The default storage backend.
- Return type:
None
- get_backend(key)[source]¶
Retrieve a configured storage backend from the registry.
- Returns:
  The storage backend associated with the given key.
- Return type:
  StorageBackend
- Raises:
ImproperConfigurationError – If no storage backend is registered with the given key.
- Parameters:
key (str)
- register_backend(value, key=None)[source]¶
Register a new storage backend in the registry.
- Parameters:
  value (Union[StorageBackend, str]) – The storage backend to register.
  key (Optional[str]) – The key to register the storage backend with.
- Raises:
ImproperConfigurationError – If a string value is provided without a key.
- Return type:
None
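A sketch of registry usage based on the methods documented above. The string-registration form is inferred from the ImproperConfigurationError note (a string value requires a key); the exact string format a backend accepts is an assumption here:

    from advanced_alchemy.types import StorageRegistry


    registry = StorageRegistry()

    # Register by string; the "memory://" protocol-style URI is an assumption.
    registry.register_backend("memory://", key="attachments")

    # Alternatively, register a concrete StorageBackend instance
    # (an fsspec- or obstore-backed implementation) under a key.

    backend = registry.get_backend("attachments")  # ImproperConfigurationError if unknown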
Security Types¶
- class advanced_alchemy.types.EncryptedString[source]¶
Bases: TypeDecorator

SQLAlchemy TypeDecorator for storing encrypted string values in a database.

This type provides transparent encryption/decryption of string values using the specified backend. It extends sqlalchemy.types.TypeDecorator and implements String as its underlying type.

- Parameters:
key¶ (str | bytes | Callable[[], str | bytes] | None) – The encryption key. Can be a string, bytes, or callable returning either. Defaults to os.urandom(32).
backend¶ (Type[EncryptionBackend] | None) – The encryption backend class to use. Defaults to FernetBackend.
length¶ (int | None) – The length of the unencrypted string. This is used for documentation and validation purposes only, as encrypted strings will be longer.
**kwargs¶ (Any | None) – Additional arguments passed to the underlying String type.
- backend¶
The encryption backend instance.
- Type:
  EncryptionBackend
- cache_ok: Optional[bool] = True¶
Indicate if statements using this ExternalType are “safe to cache”. See GUID.cache_ok above for the full discussion of statement caching and the cacheable-attribute requirements.
- __init__(key=DEFAULT_ENCRYPTION_KEY, backend=FernetBackend, length=None, **kwargs)[source]¶
Initializes the EncryptedString TypeDecorator.
- Parameters:
key¶ (str | bytes | Callable[[], str | bytes] | None) – The encryption key. Can be a string, bytes, or callable returning either. Defaults to os.urandom(32).
backend¶ (Type[EncryptionBackend] | None) – The encryption backend class to use. Defaults to FernetBackend.
length¶ (int | None) – The length of the unencrypted string. This is used for documentation and validation purposes only.
**kwargs¶ (Any | None) – Additional arguments passed to the underlying String type.
- property python_type: type[str]¶
Returns the Python type for this type decorator.
- Returns:
The Python string type.
- Return type:
Type[str]
- load_dialect_impl(dialect)[source]¶
Loads the appropriate dialect implementation based on the database dialect.
Note: The actual column length will be larger than the specified length due to encryption overhead. For most encryption methods, the encrypted string will be approximately 1.35x longer than the original.
- process_bind_param(value, dialect)[source]¶
Processes the value before binding it to the SQL statement.
This method encrypts the value using the specified backend and validates length if specified.
SQLAlchemy type for encrypted string storage.
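A sketch of an encrypted column (names are illustrative). A callable key is shown because the constructor accepts one, which defers reading the key to runtime rather than import time:

    import os

    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    from advanced_alchemy.types import EncryptedString


    class Base(DeclarativeBase):
        pass


    class Patient(Base):
        __tablename__ = "patient"

        id: Mapped[int] = mapped_column(primary_key=True)
        # length documents the plaintext size; the stored ciphertext is longer.
        ssn: Mapped[str] = mapped_column(
            EncryptedString(key=lambda: os.environ["APP_ENCRYPTION_KEY"], length=11)
        )

Note that relying on the default key (os.urandom(32)) means previously stored values cannot be decrypted after a process restart, so a stable, explicitly supplied key is effectively required in practice.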
- class advanced_alchemy.types.EncryptedText[source]¶
Bases: EncryptedString

SQLAlchemy TypeDecorator for storing encrypted text/CLOB values in a database.

This type provides transparent encryption/decryption of text values using the specified backend. It extends EncryptedString and implements Text as its underlying type. This is suitable for storing larger encrypted text content compared to EncryptedString.

- Parameters:
key¶ (str | bytes | Callable[[], str | bytes] | None) – The encryption key. Can be a string, bytes, or callable returning either. Defaults to os.urandom(32).
backend¶ (Type[EncryptionBackend] | None) – The encryption backend class to use. Defaults to FernetBackend.
**kwargs¶ (Any | None) – Additional arguments passed to the underlying String type.
- cache_ok: Optional[bool] = True¶
Indicate if statements using this ExternalType are “safe to cache”. See GUID.cache_ok above for the full discussion of statement caching and the cacheable-attribute requirements.
SQLAlchemy type for encrypted text storage (longer content).
- class advanced_alchemy.types.password_hash.base.HashedPassword[source]¶
Bases: object

Wrapper class for a hashed password.
This class holds the hash string and provides a method to verify a plain text password against it.
SQLAlchemy type for password hashing with pluggable hashers.
- class advanced_alchemy.types.PasswordHash[source]¶
Bases: TypeDecorator

SQLAlchemy TypeDecorator for storing hashed passwords in a database.

This type provides transparent hashing of password values using the specified backend. It extends sqlalchemy.types.TypeDecorator and implements String as its underlying type.

- cache_ok: Optional[bool] = True¶
Indicate if statements using this ExternalType are “safe to cache”. See GUID.cache_ok above for the full discussion of statement caching and the cacheable-attribute requirements.
- property python_type: type[str]¶
Returns the Python type for this type decorator.
- Returns:
The Python string type.
- process_bind_param(value, dialect)[source]¶
Process the value before binding it to the SQL statement.
This method hashes the value using the specified backend. If the backend returns a SQLAlchemy FunctionElement (for DB-side hashing), it is returned directly. Otherwise, the hashed string is returned.
- process_result_value(value, dialect)[source]¶
Process the value after retrieving it from the database.
This method wraps the hash string in a HashedPassword object.
- Parameters:
  value (str | None) – The hash string retrieved from the database.
  dialect (Dialect) – The SQLAlchemy dialect in use.
- Returns:
  A HashedPassword object, or None if the input is None.
- Return type:
  Union[HashedPassword, None]
Password hash type wrapper.
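A hedged sketch of the hashing flow. The PasswordHash constructor arguments are not excerpted above, so the backend keyword is an assumption based on the pluggable-hasher design; likewise the no-argument Argon2Hasher construction:

    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    from advanced_alchemy.types import PasswordHash
    from advanced_alchemy.types.password_hash.argon2 import Argon2Hasher


    class Base(DeclarativeBase):
        pass


    class Account(Base):
        __tablename__ = "account"

        id: Mapped[int] = mapped_column(primary_key=True)
        # backend= is assumed; check the constructor signature in your version.
        password: Mapped[str] = mapped_column(PasswordHash(backend=Argon2Hasher()))

Plain strings are hashed transparently in process_bind_param on the way in; values read back are wrapped in HashedPassword, whose verification method checks a plain-text candidate against the stored hash.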
Encryption Backends¶
- class advanced_alchemy.types.EncryptionBackend[source]¶
Bases: ABC

Abstract base class for encryption backends.
This class defines the interface that all encryption backends must implement. Concrete implementations should provide the actual encryption/decryption logic.
- abstractmethod encrypt(value)[source]¶
Encrypts the given value.
- Parameters:
  value (Any) – The value to encrypt.
- Returns:
  The encrypted value.
- Return type:
  str
- Raises:
NotImplementedError – If the method is not implemented by the subclass.
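A minimal subclass sketch. Only encrypt() is excerpted above, so the decrypt() counterpart and the mount_vault() hook (mirrored from FernetBackend below) are assumptions about the full interface:

    import base64

    from advanced_alchemy.types import EncryptionBackend


    class Base64Backend(EncryptionBackend):
        """Obfuscation-only stand-in; not real encryption."""

        def mount_vault(self, key):
            # assumed hook, mirrors FernetBackend.mount_vault
            self.key = key

        def encrypt(self, value):
            return base64.b64encode(str(value).encode()).decode()

        def decrypt(self, value):
            # assumed counterpart to encrypt()
            return base64.b64decode(value.encode()).decode()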
- class advanced_alchemy.types.FernetBackend[source]¶
Bases: EncryptionBackend

Fernet-based encryption backend.
This backend uses the Python cryptography library’s Fernet implementation for encryption/decryption operations. Provides symmetric encryption with built-in rotation support.
- fernet¶
The Fernet instance used for encryption/decryption.
- mount_vault(key)[source]¶
Mounts the vault with the provided encryption key.
This method hashes the key using SHA256 before initializing the engine.
- class advanced_alchemy.types.encrypted_string.PGCryptoBackend[source]¶
Bases: EncryptionBackend

PostgreSQL pgcrypto-based encryption backend.
This backend uses PostgreSQL’s pgcrypto extension for encryption/decryption operations. Requires the pgcrypto extension to be installed in the database.
Password Hashers¶
- class advanced_alchemy.types.password_hash.argon2.Argon2Hasher[source]¶
Bases: HashingBackend

Hashing backend using Argon2 via the argon2-cffi library.
- class advanced_alchemy.types.password_hash.passlib.PasslibHasher[source]¶
Bases: HashingBackend

Hashing backend using Passlib.
Relies on the passlib package being installed. Install with pip install passlib or uv pip install passlib.
- __init__(context)[source]¶
Initialize the PasslibHasher.
- Parameters:
context (CryptContext) – The Passlib CryptContext to use for hashing and verification.
- Return type:
None
- compare_expression(column, plain)[source]¶
Direct SQL comparison is not supported for Passlib.
- Raises:
NotImplementedError – Always raised.
- Return type:
- Parameters:
column (ColumnElement[str])
plain (Any)
- class advanced_alchemy.types.password_hash.pwdlib.PwdlibHasher[source]¶
Bases: HashingBackend

Hashing backend using Pwdlib.
- __init__(hasher)[source]¶
Initialize the PwdlibHasher.
- Parameters:
hasher (HasherProtocol) – The Pwdlib hasher to use for hashing and verification.
- Return type:
None
- compare_expression(column, plain)[source]¶
Direct SQL comparison is not supported for Pwdlib.
- Raises:
NotImplementedError – Always raised.
- Return type:
- Parameters:
column (ColumnElement[str])
plain (Any)
Mutable Types¶
- class advanced_alchemy.types.MutableList[source]¶
Bases: MutableList[T]

A list type that implements Mutable.

The MutableList object implements a list that will emit change events to the underlying mapping when the contents of the list are altered, including when values are added or removed.

This is a replication of the default MutableList provided by SQLAlchemy. The difference is the added _removed property, which keeps every element removed from the list so that they can be deleted after a commit, and retained when the session is rolled back.
- pop(*arg)[source]¶
Remove and return item at index (default last).
Raises IndexError if list is empty or index is out of range.
- remove(i)[source]¶
Remove first occurrence of value.
Raises ValueError if the value is not present.
- Return type:
  None
- Parameters:
  i (T) – The value to remove.
- sort(**kw)[source]¶
Sort the list in ascending order and return None.
The sort is in-place (i.e. the list itself is modified) and stable (i.e. the order of two equal elements is maintained).
If a key function is given, apply it once to each list item and sort them, ascending or descending, according to their function values.
The reverse flag can be set to sort in descending order.
SQLAlchemy mutable list type that tracks changes.
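A sketch of in-place change tracking, pairing MutableList with a JSON column via the standard Mutable.as_mutable hook it inherits (names are illustrative):

    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column
    from sqlalchemy.types import JSON

    from advanced_alchemy.types import MutableList


    class Base(DeclarativeBase):
        pass


    class Post(Base):
        __tablename__ = "post"

        id: Mapped[int] = mapped_column(primary_key=True)
        tags: Mapped[list[str]] = mapped_column(MutableList.as_mutable(JSON()))


    post = Post(tags=["python"])
    post.tags.append("sqlalchemy")   # flagged dirty; picked up at flush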
Utilities¶
- advanced_alchemy.types.BigIntIdentity¶
alias of BigInteger()
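A one-column sketch (names are illustrative; BigIntIdentity is the BigInteger alias intended for auto-incrementing identity primary keys):

    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    from advanced_alchemy.types import BigIntIdentity


    class Base(DeclarativeBase):
        pass


    class Order(Base):
        __tablename__ = "orders"

        id: Mapped[int] = mapped_column(BigIntIdentity, primary_key=True, autoincrement=True)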
Constants¶
- advanced_alchemy.types.NANOID_INSTALLED¶
Flag indicating if nanoid library is installed.
- advanced_alchemy.types.UUID_UTILS_INSTALLED¶
Flag indicating if uuid-utils library is installed.