Please consider the following script structure:
import abc

class BaseMeta(abc.ABCMeta):
    __registry__ = {}

    def __init__(cls, name, bases, namespace):
        if bases:
            BaseMeta.__registry__[cls.__name__] = cls
        # other modifications of namespaces
        super().__init__(name, bases, namespace)

class Base(metaclass=BaseMeta):
    pass  # Various abstract properties and methods

class Derived0(Base): pass
class Derived1(Base): pass
# ...
class DerivedN(Base): pass
The code above was adapted from this answer; I cannot use __subclasses__() because I need to track indirect as well as direct subclasses. While I could circumvent this requirement for now, I would like to keep it for forward compatibility.
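For concreteness, the distinction matters because __subclasses__() only reports direct children, whereas the metaclass registry also captures grandchildren. A minimal demonstration (class names here are hypothetical, not from the real hierarchy):

```python
import abc

class BaseMeta(abc.ABCMeta):
    __registry__ = {}

    def __init__(cls, name, bases, namespace):
        if bases:
            # Runs for every class created from this metaclass,
            # however deep in the hierarchy it sits.
            BaseMeta.__registry__[cls.__name__] = cls
        super().__init__(name, bases, namespace)

class Base(metaclass=BaseMeta):
    pass

class Direct(Base):
    pass

class Indirect(Direct):
    pass

# __subclasses__() sees only the direct child:
print([c.__name__ for c in Base.__subclasses__()])  # ['Direct']
# the metaclass registry sees both:
print(sorted(BaseMeta.__registry__))                # ['Direct', 'Indirect']
```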
I would like to dynamically dispatch these subclasses via __getattr__() from Base's __registry__. Because these subclasses can grow to several hundred SLOC and can depend on dozens of files, primarily SQL schemas, I would like to organize them into packages, e.g.
__main__.py
gui                          # Irrelevant package
db                           # Irrelevant package
# [...] Other irrelevant packages
relevant
    __init__.py              # Contains Base
    derived0
        __init__.py          # Contains Derived0 and related machinery
    derived1
        __init__.py          # Contains Derived1 and related machinery; imports module-xxx.py
        module-xxx.py
        schema_main.sql
        copy-query-.xxx.sql
        # [...]
    # [...]
The problem, then, is that the subpackages would need to be imported dynamically as well. The only solution I am aware of is to use pkgutil to walk the packages and exec(compile()) their buffers; this, however, messes up the namespaces.
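For concreteness, here is the kind of walk I mean, demonstrated against a throwaway on-disk package that stands in for the real layout (all names hypothetical). It uses importlib.import_module instead of exec(compile()), and I am not sure it behaves identically to a plain import:

```python
import importlib
import pkgutil
import sys
import tempfile
from pathlib import Path

# Build a throwaway stand-in for the 'relevant' package tree.
root = Path(tempfile.mkdtemp())
pkg = root / "relevant"
(pkg / "derived0").mkdir(parents=True)
(pkg / "__init__.py").write_text("REGISTRY = {}\n")
(pkg / "derived0" / "__init__.py").write_text(
    "from relevant import REGISTRY\n"
    "class Derived0: pass\n"
    "REGISTRY['Derived0'] = Derived0\n"
)

sys.path.insert(0, str(root))
relevant = importlib.import_module("relevant")

# Walk the package and import each subpackage through the normal
# import machinery, so every module keeps its own qualified namespace.
for info in pkgutil.iter_modules(relevant.__path__, prefix="relevant."):
    importlib.import_module(info.name)

print(sorted(relevant.REGISTRY))  # ['Derived0']
```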
I believe this problem to be somewhat analogous to this one, which seems as idiomatic as code for such problems can get, but I do not know whether it results in exactly the same behavior as a simple import from a parent package. Does anyone know how to solve this?
All help is more than welcome, as is any other critique of the above code. Thank you very much.