33 Commits

Author SHA1 Message Date
long2ice
d2e0a68351 Fix packaging error. (#92) 2020-12-02 23:03:15 +08:00
long2ice
ee6cc20c7d Fix empty items 2020-11-30 11:14:09 +08:00
long2ice
4e917495a0 Fix upgrade in new db. (#96) 2020-11-30 11:02:48 +08:00
long2ice
bfa66f6dd4 update changelog 2020-11-29 11:15:43 +08:00
long2ice
f00715d4c4 Merge pull request #97 from TrDex/pathlib-for-path-resolving
Use `pathlib` for path resolving
2020-11-29 11:02:44 +08:00
Mykola Solodukha
6e3105690a Use pathlib for path resolving 2020-11-28 19:23:34 +02:00
long2ice
c707f7ecb2 bug fix 2020-11-28 14:31:41 +08:00
long2ice
0bbc471e00 Fix sqlite stuck. (#90) 2020-11-26 23:38:57 +08:00
long2ice
fb6cc62047 update README and CHANGELOG 2020-11-23 16:44:16 +08:00
long2ice
e9ceaf471f Merge pull request #87 from ALexALed/remove-default-detections-for-callable
Remove callable detection for defaults
2020-11-23 16:41:30 +08:00
alexaled
85fc3b2aa2 Remove callable detection for defaults 2020-11-23 10:35:40 +02:00
long2ice
a677d506a9 Fix ci error 2020-11-19 10:41:52 +08:00
long2ice
9879004fee Add rename column support MySQL5 2020-11-19 10:11:52 +08:00
long2ice
5760fe2040 Merge pull request #83 from SakuraSound/fix-migrate-unlink
Catch OSError (if read-only file system)
2020-11-18 15:40:29 +08:00
Joir-dan Gumbs
b229c30558 Catch OSError (if read-only file system) 2020-11-17 23:28:00 -08:00
long2ice
5d2f1604c3 update github action poetry 2020-11-17 10:57:56 +08:00
long2ice
499c4e1c02 Fix black 2020-11-17 10:50:57 +08:00
long2ice
1463ee30bc update deps 2020-11-17 10:43:27 +08:00
long2ice
3b801932f5 Merge remote-tracking branch 'origin/dev' into dev 2020-11-17 10:36:14 +08:00
long2ice
c2eb4dc9e3 update poetry in github actions 2020-11-17 10:35:51 +08:00
long2ice
5927febd0c Delete .DS_Store 2020-11-17 10:10:32 +08:00
long2ice
a1c10ff330 exclude .DS_store 2020-11-17 10:09:37 +08:00
long2ice
f2013c931a Fix test error 2020-11-16 22:32:19 +08:00
long2ice
b21b954d32 Use .sql instead of .json to store version file. (#79) 2020-11-16 22:25:01 +08:00
long2ice
f5588a35c5 update deps 2020-11-12 21:27:58 +08:00
long2ice
f5dff84476 Fix encoding error. (#75) 2020-11-08 23:00:44 +08:00
long2ice
e399821116 update deps 2020-11-05 17:43:41 +08:00
long2ice
648f25a951 Compatible with models file in directory. (#70) 2020-10-30 19:51:46 +08:00
long2ice
fa73e132e2 remove .vscode 2020-10-30 16:45:12 +08:00
long2ice
1bac33cd33 add confirmation_option when downgrade 2020-10-30 16:39:14 +08:00
long2ice
4e76f12ccf update README.md 2020-10-28 17:12:23 +08:00
long2ice
724379700e Support multiple databases. (#68) 2020-10-28 17:02:02 +08:00
long2ice
bb929f2b55 update deps 2020-10-25 17:48:05 +08:00
17 changed files with 749 additions and 440 deletions

View File

@@ -11,11 +11,14 @@ jobs:
- uses: actions/setup-python@v2
with:
python-version: '3.x'
- uses: dschep/install-poetry-action@v1.3
- name: Install and configure Poetry
uses: snok/install-poetry@v1.1.1
with:
virtualenvs-create: false
- name: Build dists
run: make build
- name: Pypi Publish
uses: pypa/gh-action-pypi-publish@master
with:
user: __token__
password: ${{ secrets.pypi_password }}
password: ${{ secrets.pypi_password }}

View File

@@ -1,5 +1,5 @@
name: test
on: [push, pull_request]
on: [ push, pull_request ]
jobs:
testall:
runs-on: ubuntu-latest
@@ -19,7 +19,10 @@ jobs:
- uses: actions/setup-python@v2
with:
python-version: '3.x'
- uses: dschep/install-poetry-action@v1.3
- name: Install and configure Poetry
uses: snok/install-poetry@v1.1.1
with:
virtualenvs-create: false
- name: CI
env:
MYSQL_PASS: root
@@ -28,4 +31,4 @@ jobs:
POSTGRES_PASS: 123456
POSTGRES_HOST: 127.0.0.1
POSTGRES_PORT: 5432
run: make ci
run: make ci

4
.gitignore vendored
View File

@@ -143,4 +143,6 @@ cython_debug/
.idea
migrations
aerich.ini
src
src
.vscode
.DS_Store

View File

@@ -1,7 +1,32 @@
# ChangeLog
## 0.4
### 0.4.2
- Use `pathlib` for path resolving. (#89)
- Fix upgrade in new db. (#96)
- Fix packaging error. (#92)
### 0.4.1
- Bug fix. (#91 #93)
### 0.4.0
- Use `.sql` instead of `.json` to store version file.
- Add `rename` column support for MySQL 5.
- Remove callable detection for defaults. (#87)
- Fix `sqlite` stuck. (#90)
## 0.3
### 0.3.3
- Fix encoding error. (#75)
- Support multiple databases. (#68)
- Compatible with models file in directory. (#70)
### 0.3.2
- Fix migrate to new database error. (#62)

View File

@@ -3,8 +3,10 @@ black_opts = -l 100 -t py38
py_warn = PYTHONDEVMODE=1
MYSQL_HOST ?= "127.0.0.1"
MYSQL_PORT ?= 3306
MYSQL_PASS ?= "123456"
POSTGRES_HOST ?= "127.0.0.1"
POSTGRES_PORT ?= 5432
POSTGRES_PASS ?= "123456"
help:
@echo "Aerich development makefile"
@@ -22,7 +24,7 @@ up:
@poetry update
deps:
@poetry install -E dbdrivers --no-root
@poetry install -E dbdrivers
style: deps
isort -src $(checkfiles)

View File

@@ -10,7 +10,7 @@
Aerich is a database migrations tool for Tortoise-ORM, which is like Alembic for SQLAlchemy, or like Django ORM with its
own migrations solution.
**If you upgrade aerich from <= 0.2.5 to >= 0.3.0, see [changelog](https://github.com/tortoise/aerich/blob/dev/CHANGELOG.md) for upgrade steps.**
**Important: You can only use absolute imports in your `models.py` to make `aerich` work.**
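For example, a minimal sketch of a compliant `models.py` (the commented-out relative import is only illustrative):
```python
# models.py -- use absolute imports only, as noted above.
from tortoise import Model, fields       # absolute import: fine
# from . import mixins                   # relative import: not supported by aerich


class Category(Model):
    name = fields.CharField(max_length=200)
```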
## Install
@@ -103,22 +103,20 @@ If your Tortoise-ORM app is not default `models`, you must specify
```shell
> aerich migrate --name drop_column
Success migrate 1_202029051520102929_drop_column.json
Success migrate 1_202029051520102929_drop_column.sql
```
The format of the migration filename is
`{version_num}_{datetime}_{name|update}.json`.
`{version_num}_{datetime}_{name|update}.sql`.
If `aerich` guesses that you are renaming a column, it will ask `Rename {old_column} to {new_column} [True]`. Choose `True` to rename the column without dropping it, or `False` to drop the column and then create a new one.
If you use `MySQL`, note that only MySQL 8.0+ supports the `rename..to` syntax.
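For MySQL 5, this release falls back to a `CHANGE` statement instead (see the `_CHANGE_COLUMN_TEMPLATE` added to `BaseDDL` further down). A rough sketch of the SQL involved, with a made-up table and columns:
```python
# The column-change template added to aerich.ddl.BaseDDL (copied from the diff below).
_CHANGE_COLUMN_TEMPLATE = (
    'ALTER TABLE "{table_name}" CHANGE {old_column_name} {new_column_name} {new_column_type}'
)

# Renaming "name" to "title" on a hypothetical "category" table under MySQL 5.x:
print(_CHANGE_COLUMN_TEMPLATE.format(
    table_name="category",
    old_column_name="name",
    new_column_name="title",
    new_column_type="VARCHAR(200) NOT NULL",
))
# ALTER TABLE "category" CHANGE name title VARCHAR(200) NOT NULL

# On MySQL 8.0+, the rename needs no column type, e.g.:
#   ALTER TABLE category RENAME COLUMN name TO title
```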
### Upgrade to latest version
```shell
> aerich upgrade
Success upgrade 1_202029051520102929_drop_column.json
Success upgrade 1_202029051520102929_drop_column.sql
```
Now your database is migrated to the latest version.
@@ -134,13 +132,17 @@ Usage: aerich downgrade [OPTIONS]
Options:
-v, --version INTEGER Specified version, default to last. [default: -1]
-d, --delete Delete version files at the same time. [default:
False]
--yes Confirm the action without prompting.
-h, --help Show this message and exit.
```
```shell
> aerich downgrade
Success downgrade 1_202029051520102929_drop_column.json
Success downgrade 1_202029051520102929_drop_column.sql
```
Now your database is rolled back to the specified version.
@@ -150,7 +152,7 @@ Now your db rollback to specified version.
```shell
> aerich history
1_202029051520102929_drop_column.json
1_202029051520102929_drop_column.sql
```
### Show heads to be migrated
@@ -158,13 +160,27 @@ Now your db rollback to specified version.
```shell
> aerich heads
1_202029051520102929_drop_column.json
1_202029051520102929_drop_column.sql
```
## Support this project
### Multiple databases
- Just give a star!
- Donation.
```python
tortoise_orm = {
"connections": {
"default": expand_db_url(db_url, True),
"second": expand_db_url(db_url_second, True),
},
"apps": {
"models": {"models": ["tests.models", "aerich.models"], "default_connection": "default"},
"models_second": {"models": ["tests.models_second"], "default_connection": "second",},
},
}
```
You only need to specify `aerich.models` in one app, and you must pass `--app` when running `aerich migrate` and the other commands.
## Support this project
| AliPay | WeChatPay | PayPal |
| -------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------- | ---------------------------------------------------------------- |

View File

@@ -1 +1 @@
__version__ = "0.3.2"
__version__ = "0.4.2"

View File

@@ -1,9 +1,9 @@
import asyncio
import json
import os
import sys
from configparser import ConfigParser
from functools import wraps
from pathlib import Path
import click
from click import Context, UsageError
@@ -13,7 +13,13 @@ from tortoise.transactions import in_transaction
from tortoise.utils import get_schema_sql
from aerich.migrate import Migrate
from aerich.utils import get_app_connection, get_app_connection_name, get_tortoise_config
from aerich.utils import (
get_app_connection,
get_app_connection_name,
get_tortoise_config,
get_version_content_from_file,
write_version_file,
)
from . import __version__
from .enums import Color
@@ -38,7 +44,11 @@ def coro(f):
@click.group(context_settings={"help_option_names": ["-h", "--help"]})
@click.version_option(__version__, "-V", "--version")
@click.option(
"-c", "--config", default="aerich.ini", show_default=True, help="Config file.",
"-c",
"--config",
default="aerich.ini",
show_default=True,
help="Config file.",
)
@click.option("--app", required=False, help="Tortoise-ORM app name.")
@click.option(
@@ -57,7 +67,7 @@ async def cli(ctx: Context, config, app, name):
invoked_subcommand = ctx.invoked_subcommand
if invoked_subcommand != "init":
if not os.path.exists(config):
if not Path(config).exists():
raise UsageError("You must exec init first", ctx=ctx)
parser.read(config)
@@ -66,13 +76,13 @@ async def cli(ctx: Context, config, app, name):
tortoise_config = get_tortoise_config(ctx, tortoise_orm)
app = app or list(tortoise_config.get("apps").keys())[0]
if "aerich.models" not in tortoise_config.get("apps").get(app).get("models"):
raise UsageError("Check your tortoise config and add aerich.models to it.", ctx=ctx)
ctx.obj["config"] = tortoise_config
ctx.obj["location"] = location
ctx.obj["app"] = app
Migrate.app = app
if invoked_subcommand != "init-db":
if not Path(location, app).exists():
raise UsageError("You must exec init-db first", ctx=ctx)
await Migrate.init_with_old_models(tortoise_config, app, location)
@@ -102,12 +112,11 @@ async def upgrade(ctx: Context):
exists = False
if not exists:
async with in_transaction(get_app_connection_name(config, app)) as conn:
file_path = os.path.join(Migrate.migrate_location, version_file)
with open(file_path, "r", encoding="utf-8") as f:
content = json.load(f)
upgrade_query_list = content.get("upgrade")
for upgrade_query in upgrade_query_list:
await conn.execute_script(upgrade_query)
file_path = Path(Migrate.migrate_location, version_file)
content = get_version_content_from_file(file_path)
upgrade_query_list = content.get("upgrade")
for upgrade_query in upgrade_query_list:
await conn.execute_script(upgrade_query)
await Aerich.create(
version=version_file,
app=app,
@@ -116,7 +125,7 @@ async def upgrade(ctx: Context):
click.secho(f"Success upgrade {version_file}", fg=Color.green)
migrated = True
if not migrated:
click.secho("No migrate items", fg=Color.yellow)
click.secho("No upgrade items found", fg=Color.yellow)
@cli.command(help="Downgrade to specified version.")
@@ -128,9 +137,20 @@ async def upgrade(ctx: Context):
show_default=True,
help="Specified version, default to last.",
)
@click.option(
"-d",
"--delete",
is_flag=True,
default=False,
show_default=True,
help="Delete version files at the same time.",
)
@click.pass_context
@click.confirmation_option(
prompt="Downgrade is dangerous, which maybe lose your data, are you sure?",
)
@coro
async def downgrade(ctx: Context, version: int):
async def downgrade(ctx: Context, version: int, delete: bool):
app = ctx.obj["app"]
config = ctx.obj["config"]
if version == -1:
@@ -146,16 +166,17 @@ async def downgrade(ctx: Context, version: int):
for version in versions:
file = version.version
async with in_transaction(get_app_connection_name(config, app)) as conn:
file_path = os.path.join(Migrate.migrate_location, file)
with open(file_path, "r", encoding="utf-8") as f:
content = json.load(f)
downgrade_query_list = content.get("downgrade")
if not downgrade_query_list:
return click.secho("No downgrade item found", fg=Color.yellow)
for downgrade_query in downgrade_query_list:
await conn.execute_query(downgrade_query)
await version.delete()
os.unlink(file_path)
file_path = Path(Migrate.migrate_location, file)
content = get_version_content_from_file(file_path)
downgrade_query_list = content.get("downgrade")
if not downgrade_query_list:
click.secho("No downgrade items found", fg=Color.yellow)
return
for downgrade_query in downgrade_query_list:
await conn.execute_query(downgrade_query)
await version.delete()
if delete:
os.unlink(file_path)
click.secho(f"Success downgrade {file}", fg=Color.green)
@@ -193,16 +214,21 @@ async def history(ctx: Context):
help="Tortoise-ORM config module dict variable, like settings.TORTOISE_ORM.",
)
@click.option(
"--location", default="./migrations", show_default=True, help="Migrate store location."
"--location",
default="./migrations",
show_default=True,
help="Migrate store location.",
)
@click.pass_context
@coro
async def init(
ctx: Context, tortoise_orm, location,
ctx: Context,
tortoise_orm,
location,
):
config_file = ctx.obj["config_file"]
name = ctx.obj["name"]
if os.path.exists(config_file):
if Path(config_file).exists():
return click.secho("You have inited", fg=Color.yellow)
parser.add_section(name)
@@ -212,7 +238,7 @@ async def init(
with open(config_file, "w", encoding="utf-8") as f:
parser.write(f)
if not os.path.isdir(location):
if not Path(location).is_dir():
os.mkdir(location)
click.secho(f"Success create migrate location {location}", fg=Color.green)
@@ -234,12 +260,14 @@ async def init_db(ctx: Context, safe):
location = ctx.obj["location"]
app = ctx.obj["app"]
dirname = os.path.join(location, app)
if not os.path.isdir(dirname):
dirname = Path(location, app)
if not dirname.is_dir():
os.mkdir(dirname)
click.secho(f"Success create app migrate location {dirname}", fg=Color.green)
else:
return click.secho(f"Inited {app} already", fg=Color.yellow)
return click.secho(
f"Inited {app} already, or delete {dirname} and try again.", fg=Color.yellow
)
await Tortoise.init(config=config)
connection = get_app_connection(config, app)
@@ -249,16 +277,21 @@ async def init_db(ctx: Context, safe):
version = await Migrate.generate_version()
await Aerich.create(
version=version, app=app, content=Migrate.get_models_content(config, app, location)
version=version,
app=app,
content=Migrate.get_models_content(config, app, location),
)
with open(os.path.join(dirname, version), "w", encoding="utf-8") as f:
content = {
"upgrade": [schema],
}
json.dump(content, f, ensure_ascii=False, indent=2)
return click.secho(f'Success generate schema for app "{app}"', fg=Color.green)
content = {
"upgrade": [schema],
}
write_version_file(Path(dirname, version), content)
click.secho(f'Success generate schema for app "{app}"', fg=Color.green)
def main():
sys.path.insert(0, ".")
cli()
if __name__ == "__main__":
main()

View File

@@ -22,6 +22,9 @@ class BaseDDL:
_DROP_FK_TEMPLATE = 'ALTER TABLE "{table_name}" DROP FOREIGN KEY "{fk_name}"'
_M2M_TABLE_TEMPLATE = 'CREATE TABLE "{table_name}" ("{backward_key}" {backward_type} NOT NULL REFERENCES "{backward_table}" ("{backward_field}") ON DELETE CASCADE,"{forward_key}" {forward_type} NOT NULL REFERENCES "{forward_table}" ("{forward_field}") ON DELETE {on_delete}){extra}{comment};'
_MODIFY_COLUMN_TEMPLATE = 'ALTER TABLE "{table_name}" MODIFY COLUMN {column}'
_CHANGE_COLUMN_TEMPLATE = (
'ALTER TABLE "{table_name}" CHANGE {old_column_name} {new_column_name} {new_column_type}'
)
def __init__(self, client: "BaseDBAsyncClient"):
self.client = client
@@ -136,6 +139,16 @@ class BaseDDL:
new_column_name=new_column_name,
)
def change_column(
self, model: "Type[Model]", old_column_name: str, new_column_name: str, new_column_type: str
):
return self._CHANGE_COLUMN_TEMPLATE.format(
table_name=model._meta.db_table,
old_column_name=old_column_name,
new_column_name=new_column_name,
new_column_type=new_column_type,
)
def add_index(self, model: "Type[Model]", field_names: List[str], unique=False):
return self._ADD_INDEX_TEMPLATE.format(
unique="UNIQUE" if unique else "",

View File

@@ -1,15 +1,17 @@
import json
import inspect
import os
import re
from datetime import datetime
from importlib import import_module
from io import StringIO
from pathlib import Path
from typing import Dict, List, Optional, Tuple, Type
import click
from tortoise import (
BackwardFKRelation,
BackwardOneToOneRelation,
BaseDBAsyncClient,
ForeignKeyFieldInstance,
ManyToManyFieldInstance,
Model,
@@ -20,7 +22,7 @@ from tortoise.fields import Field
from aerich.ddl import BaseDDL
from aerich.models import MAX_VERSION_LENGTH, Aerich
from aerich.utils import get_app_connection
from aerich.utils import get_app_connection, write_version_file
class Migrate:
@@ -41,15 +43,16 @@ class Migrate:
app: str
migrate_location: str
dialect: str
_db_version: Optional[str] = None
@classmethod
def get_old_model_file(cls, app: str, location: str):
return os.path.join(location, app, cls.old_models + ".py")
return Path(location, app, cls.old_models + ".py")
@classmethod
def get_all_version_files(cls) -> List[str]:
return sorted(
filter(lambda x: x.endswith("json"), os.listdir(cls.migrate_location)),
filter(lambda x: x.endswith("sql"), os.listdir(cls.migrate_location)),
key=lambda x: int(x.split("_")[0]),
)
@@ -64,18 +67,25 @@ class Migrate:
def remove_old_model_file(cls, app: str, location: str):
try:
os.unlink(cls.get_old_model_file(app, location))
except FileNotFoundError:
except (OSError, FileNotFoundError):
pass
@classmethod
async def _get_db_version(cls, connection: BaseDBAsyncClient):
if cls.dialect == "mysql":
sql = "select version() as version"
ret = await connection.execute_query(sql)
cls._db_version = ret[1][0].get("version")
@classmethod
async def init_with_old_models(cls, config: dict, app: str, location: str):
await Tortoise.init(config=config)
last_version = await cls.get_last_version()
cls.app = app
cls.migrate_location = os.path.join(location, app)
cls.migrate_location = Path(location, app)
if last_version:
content = last_version.content
with open(cls.get_old_model_file(app, location), "w") as f:
with open(cls.get_old_model_file(app, location), "w", encoding="utf-8") as f:
f.write(content)
migrate_config = cls._get_migrate_config(config, app, location)
@@ -96,6 +106,7 @@ class Migrate:
from aerich.ddl.postgres import PostgresDDL
cls.ddl = PostgresDDL(connection)
await cls._get_db_version(connection)
@classmethod
async def _get_last_version_num(cls):
@@ -110,8 +121,8 @@ class Migrate:
now = datetime.now().strftime("%Y%m%d%H%M%S").replace("/", "")
last_version_num = await cls._get_last_version_num()
if last_version_num is None:
return f"0_{now}_init.json"
version = f"{last_version_num + 1}_{now}_{name}.json"
return f"0_{now}_init.sql"
version = f"{last_version_num + 1}_{now}_{name}.sql"
if len(version) > MAX_VERSION_LENGTH:
raise ValueError(f"Version name exceeds maximum length ({MAX_VERSION_LENGTH})")
return version
@@ -122,13 +133,12 @@ class Migrate:
# delete if same version exists
for version_file in cls.get_all_version_files():
if version_file.startswith(version.split("_")[0]):
os.unlink(os.path.join(cls.migrate_location, version_file))
os.unlink(Path(cls.migrate_location, version_file))
content = {
"upgrade": cls.upgrade_operators,
"downgrade": cls.downgrade_operators,
}
with open(os.path.join(cls.migrate_location, version), "w", encoding="utf-8") as f:
json.dump(content, f, indent=2, ensure_ascii=False)
write_version_file(Path(cls.migrate_location, version), content)
return version
@classmethod
@@ -181,8 +191,7 @@ class Migrate:
:param location:
:return:
"""
path = os.path.join(location, app, cls.old_models)
path = path.replace(os.sep, ".").lstrip(".")
path = Path(location, app, cls.old_models).as_posix().replace("/", ".")
config["apps"][cls.diff_app] = {
"models": [path],
"default_connection": config.get("apps").get(app).get("default_connection", "default"),
@@ -201,7 +210,15 @@ class Migrate:
old_model_files = []
models = config.get("apps").get(app).get("models")
for model in models:
old_model_files.append(import_module(model).__file__)
module = import_module(model)
possible_models = [getattr(module, attr_name) for attr_name in dir(module)]
for attr in filter(
lambda x: inspect.isclass(x) and issubclass(x, Model) and x is not Model,
possible_models,
):
file = inspect.getfile(attr)
if file not in old_model_files:
old_model_files.append(file)
pattern = rf"(\n)?('|\")({app})(.\w+)('|\")"
str_io = StringIO()
for i, model_file in enumerate(old_model_files):
@@ -293,13 +310,26 @@ class Migrate:
else:
is_rename = diff_key in cls._rename_new
if is_rename:
cls._add_operator(
cls._rename_field(new_model, old_field, new_field), upgrade,
)
if (
cls.dialect == "mysql"
and cls._db_version
and cls._db_version.startswith("5.")
):
cls._add_operator(
cls._change_field(new_model, old_field, new_field),
upgrade,
)
else:
cls._add_operator(
cls._rename_field(new_model, old_field, new_field),
upgrade,
)
break
else:
cls._add_operator(
cls._add_field(new_model, new_field), upgrade, cls._is_fk_m2m(new_field),
cls._add_field(new_model, new_field),
upgrade,
cls._is_fk_m2m(new_field),
)
else:
old_field = old_fields_map.get(new_key)
@@ -315,7 +345,9 @@ class Migrate:
cls._add_operator(
cls._alter_null(new_model, new_field), upgrade=upgrade
)
if new_field.default != old_field.default:
if new_field.default != old_field.default and not callable(
new_field.default
):
cls._add_operator(
cls._alter_default(new_model, new_field), upgrade=upgrade
)
@@ -350,11 +382,15 @@ class Migrate:
if isinstance(new_field, ForeignKeyFieldInstance):
if old_field.db_constraint and not new_field.db_constraint:
cls._add_operator(
cls._drop_fk(new_model, new_field), upgrade, True,
cls._drop_fk(new_model, new_field),
upgrade,
True,
)
if new_field.db_constraint and not old_field.db_constraint:
cls._add_operator(
cls._add_fk(new_model, new_field), upgrade, True,
cls._add_fk(new_model, new_field),
upgrade,
True,
)
for old_key in old_keys:
@@ -364,12 +400,20 @@ class Migrate:
not upgrade and old_key not in cls._rename_new
):
cls._add_operator(
cls._remove_field(old_model, field), upgrade, cls._is_fk_m2m(field),
cls._remove_field(old_model, field),
upgrade,
cls._is_fk_m2m(field),
)
for new_index in new_indexes:
if new_index not in old_indexes:
cls._add_operator(cls._add_index(new_model, new_index,), upgrade)
cls._add_operator(
cls._add_index(
new_model,
new_index,
),
upgrade,
)
for old_index in old_indexes:
if old_index not in new_indexes:
cls._add_operator(cls._remove_index(old_model, old_index), upgrade)
@@ -465,6 +509,15 @@ class Migrate:
def _rename_field(cls, model: Type[Model], old_field: Field, new_field: Field):
return cls.ddl.rename_column(model, old_field.model_field_name, new_field.model_field_name)
@classmethod
def _change_field(cls, model: Type[Model], old_field: Field, new_field: Field):
return cls.ddl.change_column(
model,
old_field.model_field_name,
new_field.model_field_name,
new_field.get_for_dialect(cls.dialect, "SQL_TYPE"),
)
@classmethod
def _add_fk(cls, model: Type[Model], field: ForeignKeyFieldInstance):
"""

View File

@@ -1,17 +1,24 @@
import importlib
from typing import Dict
from click import BadOptionUsage, Context
from tortoise import BaseDBAsyncClient, Tortoise
def get_app_connection_name(config, app) -> str:
def get_app_connection_name(config, app_name: str) -> str:
"""
get connection name
:param config:
:param app:
:param app_name:
:return:
"""
return config.get("apps").get(app).get("default_connection", "default")
app = config.get("apps").get(app_name)
if app:
return app.get("default_connection", "default")
raise BadOptionUsage(
option_name="--app",
message=f'Can\'t get app named "{app_name}"',
)
def get_app_connection(config, app) -> BaseDBAsyncClient:
@@ -49,3 +56,55 @@ def get_tortoise_config(ctx: Context, tortoise_orm: str) -> dict:
ctx=ctx,
)
return config
_UPGRADE = "##### upgrade #####\n"
_DOWNGRADE = "##### downgrade #####\n"
def get_version_content_from_file(version_file: str) -> Dict:
"""
get version content
:param version_file:
:return:
"""
with open(version_file, "r", encoding="utf-8") as f:
content = f.read()
first = content.index(_UPGRADE)
try:
second = content.index(_DOWNGRADE)
except ValueError:
second = len(content) - 1
upgrade_content = content[first + len(_UPGRADE) : second].strip() # noqa:E203
downgrade_content = content[second + len(_DOWNGRADE) :].strip() # noqa:E203
ret = {
"upgrade": list(filter(lambda x: x or False, upgrade_content.split(";\n"))),
"downgrade": list(filter(lambda x: x or False, downgrade_content.split(";\n"))),
}
return ret
def write_version_file(version_file: str, content: Dict):
"""
write version file
:param version_file:
:param content:
:return:
"""
with open(version_file, "w", encoding="utf-8") as f:
f.write(_UPGRADE)
upgrade = content.get("upgrade")
if len(upgrade) > 1:
f.write(";\n".join(upgrade) + ";\n")
else:
f.write(f"{upgrade[0]}")
if not upgrade[0].endswith(";"):
f.write(";")
f.write("\n")
downgrade = content.get("downgrade")
if downgrade:
f.write(_DOWNGRADE)
if len(downgrade) > 1:
f.write(";\n".join(downgrade) + ";\n")
else:
f.write(f"{downgrade[0]};\n")
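A short usage sketch of these two helpers (the filename and statements below are made up; the file is written to the current directory):
```python
# Round-trip a version dict through the new .sql version-file format.
from aerich.utils import get_version_content_from_file, write_version_file

content = {
    "upgrade": ['CREATE TABLE "demo" ("id" INT NOT NULL PRIMARY KEY)'],
    "downgrade": ['DROP TABLE "demo"'],
}
write_version_file("0_20201201000000_init.sql", content)

# The file stores statements under "##### upgrade #####" and "##### downgrade #####"
# headers, one ";"-terminated statement per line; reading it back restores the lists
# (with the trailing ";" kept on the statements).
print(get_version_content_from_file("0_20201201000000_init.sql"))
# {'upgrade': ['CREATE TABLE "demo" ("id" INT NOT NULL PRIMARY KEY);'],
#  'downgrade': ['DROP TABLE "demo";']}
```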

View File

@@ -13,10 +13,15 @@ from aerich.ddl.sqlite import SqliteDDL
from aerich.migrate import Migrate
db_url = os.getenv("TEST_DB", "sqlite://:memory:")
db_url_second = os.getenv("TEST_DB_SECOND", "sqlite://:memory:")
tortoise_orm = {
"connections": {"default": expand_db_url(db_url, True)},
"connections": {
"default": expand_db_url(db_url, True),
"second": expand_db_url(db_url_second, True),
},
"apps": {
"models": {"models": ["tests.models", "aerich.models"], "default_connection": "default"},
"models_second": {"models": ["tests.models_second"], "default_connection": "second"},
},
}
@@ -62,5 +67,5 @@ async def initialize_tests(event_loop, request):
Migrate.ddl = SqliteDDL(client)
elif client.schema_generator is AsyncpgSchemaGenerator:
Migrate.ddl = PostgresDDL(client)
Migrate.dialect = Migrate.ddl.DIALECT
request.addfinalizer(lambda: event_loop.run_until_complete(Tortoise._drop_databases()))

716
poetry.lock generated

File diff suppressed because it is too large

View File

@@ -1,6 +1,6 @@
[tool.poetry]
name = "aerich"
version = "0.3.2"
version = "0.4.2"
description = "A database migrations tool for Tortoise ORM."
authors = ["long2ice <long2ice@gmail.com>"]
license = "Apache-2.0"
@@ -25,7 +25,7 @@ asyncpg = {version = "*", optional = true}
[tool.poetry.dev-dependencies]
flake8 = "*"
isort = "*"
black = "^19.10b0"
black = "^20.8b1"
pytest = "*"
pytest-xdist = "*"
pytest-asyncio = "*"

63
tests/models_second.py Normal file
View File

@@ -0,0 +1,63 @@
import datetime
from enum import IntEnum
from tortoise import Model, fields
class ProductType(IntEnum):
article = 1
page = 2
class PermissionAction(IntEnum):
create = 1
delete = 2
update = 3
read = 4
class Status(IntEnum):
on = 1
off = 0
class User(Model):
username = fields.CharField(max_length=20, unique=True)
password = fields.CharField(max_length=200)
last_login = fields.DatetimeField(description="Last Login", default=datetime.datetime.now)
is_active = fields.BooleanField(default=True, description="Is Active")
is_superuser = fields.BooleanField(default=False, description="Is SuperUser")
avatar = fields.CharField(max_length=200, default="")
intro = fields.TextField(default="")
class Email(Model):
email = fields.CharField(max_length=200)
is_primary = fields.BooleanField(default=False)
user = fields.ForeignKeyField("models_second.User", db_constraint=False)
class Category(Model):
slug = fields.CharField(max_length=200)
name = fields.CharField(max_length=200)
user = fields.ForeignKeyField("models_second.User", description="User")
created_at = fields.DatetimeField(auto_now_add=True)
class Product(Model):
categories = fields.ManyToManyField("models_second.Category")
name = fields.CharField(max_length=50)
view_num = fields.IntField(description="View Num")
sort = fields.IntField()
is_reviewed = fields.BooleanField(description="Is Reviewed")
type = fields.IntEnumField(ProductType, description="Product Type")
image = fields.CharField(max_length=200)
body = fields.TextField()
created_at = fields.DatetimeField(auto_now_add=True)
class Config(Model):
label = fields.CharField(max_length=200)
key = fields.CharField(max_length=20)
value = fields.JSONField()
status: Status = fields.IntEnumField(Status, default=Status.on)

View File

@@ -42,7 +42,7 @@ def test_create_table():
"id" SERIAL NOT NULL PRIMARY KEY,
"slug" VARCHAR(200) NOT NULL,
"name" VARCHAR(200) NOT NULL,
"created_at" TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
"created_at" TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP,
"user_id" INT NOT NULL REFERENCES "user" ("id") ON DELETE CASCADE
);
COMMENT ON COLUMN "category"."user_id" IS 'User';"""

View File

@@ -62,18 +62,18 @@ def test_sort_all_version_files(mocker):
mocker.patch(
"os.listdir",
return_value=[
"1_datetime_update.json",
"11_datetime_update.json",
"10_datetime_update.json",
"2_datetime_update.json",
"1_datetime_update.sql",
"11_datetime_update.sql",
"10_datetime_update.sql",
"2_datetime_update.sql",
],
)
Migrate.migrate_location = "."
assert Migrate.get_all_version_files() == [
"1_datetime_update.json",
"2_datetime_update.json",
"10_datetime_update.json",
"11_datetime_update.json",
"1_datetime_update.sql",
"2_datetime_update.sql",
"10_datetime_update.sql",
"11_datetime_update.sql",
]