34 Commits

Author SHA1 Message Date
long2ice
d2e0a68351 Fix packaging error. (#92) 2020-12-02 23:03:15 +08:00
long2ice
ee6cc20c7d Fix empty items 2020-11-30 11:14:09 +08:00
long2ice
4e917495a0 Fix upgrade in new db. (#96) 2020-11-30 11:02:48 +08:00
long2ice
bfa66f6dd4 update changelog 2020-11-29 11:15:43 +08:00
long2ice
f00715d4c4 Merge pull request #97 from TrDex/pathlib-for-path-resolving
Use `pathlib` for path resolving
2020-11-29 11:02:44 +08:00
Mykola Solodukha
6e3105690a Use pathlib for path resolving 2020-11-28 19:23:34 +02:00
long2ice
c707f7ecb2 bug fix 2020-11-28 14:31:41 +08:00
long2ice
0bbc471e00 Fix sqlite stuck. (#90) 2020-11-26 23:38:57 +08:00
long2ice
fb6cc62047 update README and CHANGELOG 2020-11-23 16:44:16 +08:00
long2ice
e9ceaf471f Merge pull request #87 from ALexALed/remove-default-detections-for-callable
Remove callable detection for defaults
2020-11-23 16:41:30 +08:00
alexaled
85fc3b2aa2 Remove callable detection for defaults 2020-11-23 10:35:40 +02:00
long2ice
a677d506a9 Fix ci error 2020-11-19 10:41:52 +08:00
long2ice
9879004fee Add rename column support MySQL5 2020-11-19 10:11:52 +08:00
long2ice
5760fe2040 Merge pull request #83 from SakuraSound/fix-migrate-unlink
Catch OSError (if read-only file system)
2020-11-18 15:40:29 +08:00
Joir-dan Gumbs
b229c30558 Catch OSError (if read-only file system) 2020-11-17 23:28:00 -08:00
long2ice
5d2f1604c3 update github action poetry 2020-11-17 10:57:56 +08:00
long2ice
499c4e1c02 Fix black 2020-11-17 10:50:57 +08:00
long2ice
1463ee30bc update deps 2020-11-17 10:43:27 +08:00
long2ice
3b801932f5 Merge remote-tracking branch 'origin/dev' into dev 2020-11-17 10:36:14 +08:00
long2ice
c2eb4dc9e3 update poetry in github actions 2020-11-17 10:35:51 +08:00
long2ice
5927febd0c Delete .DS_Store 2020-11-17 10:10:32 +08:00
long2ice
a1c10ff330 exclude .DS_store 2020-11-17 10:09:37 +08:00
long2ice
f2013c931a Fix test error 2020-11-16 22:32:19 +08:00
long2ice
b21b954d32 Use .sql instead of .json to store version file. (#79) 2020-11-16 22:25:01 +08:00
long2ice
f5588a35c5 update deps 2020-11-12 21:27:58 +08:00
long2ice
f5dff84476 Fix encoding error. (#75) 2020-11-08 23:00:44 +08:00
long2ice
e399821116 update deps 2020-11-05 17:43:41 +08:00
long2ice
648f25a951 Compatible with models file in directory. (#70) 2020-10-30 19:51:46 +08:00
long2ice
fa73e132e2 remove .vscode 2020-10-30 16:45:12 +08:00
long2ice
1bac33cd33 add confirmation_option when downgrade 2020-10-30 16:39:14 +08:00
long2ice
4e76f12ccf update README.md 2020-10-28 17:12:23 +08:00
long2ice
724379700e Support multiple databases. (#68) 2020-10-28 17:02:02 +08:00
long2ice
bb929f2b55 update deps 2020-10-25 17:48:05 +08:00
long2ice
6339dc86a8 Fix migrate to new database error 2020-10-14 20:33:23 +08:00
17 changed files with 764 additions and 448 deletions


@@ -11,7 +11,10 @@ jobs:
       - uses: actions/setup-python@v2
         with:
           python-version: '3.x'
-      - uses: dschep/install-poetry-action@v1.3
+      - name: Install and configure Poetry
+        uses: snok/install-poetry@v1.1.1
+        with:
+          virtualenvs-create: false
       - name: Build dists
         run: make build
       - name: Pypi Publish


@@ -1,5 +1,5 @@
 name: test
-on: [push, pull_request]
+on: [ push, pull_request ]
 jobs:
   testall:
     runs-on: ubuntu-latest
@@ -19,7 +19,10 @@ jobs:
       - uses: actions/setup-python@v2
         with:
           python-version: '3.x'
-      - uses: dschep/install-poetry-action@v1.3
+      - name: Install and configure Poetry
+        uses: snok/install-poetry@v1.1.1
+        with:
+          virtualenvs-create: false
       - name: CI
         env:
           MYSQL_PASS: root

.gitignore (vendored): 2 additions

@@ -144,3 +144,5 @@ cython_debug/
 migrations
 aerich.ini
 src
+.vscode
+.DS_Store


@@ -1,7 +1,36 @@
 # ChangeLog
 
+## 0.4
+
+### 0.4.2
+
+- Use `pathlib` for path resolving. (#89)
+- Fix upgrade in new db. (#96)
+- Fix packaging error. (#92)
+
+### 0.4.1
+
+- Bug fix. (#91 #93)
+
+### 0.4.0
+
+- Use `.sql` instead of `.json` to store version file.
+- Add `rename` column support MySQL5.
+- Remove callable detection for defaults. (#87)
+- Fix `sqlite` stuck. (#90)
+
 ## 0.3
 
+### 0.3.3
+
+- Fix encoding error. (#75)
+- Support multiple databases. (#68)
+- Compatible with models file in directory. (#70)
+
+### 0.3.2
+
+- Fix migrate to new database error. (#62)
+
 ### 0.3.1
 
 - Fix first version error.


@@ -3,8 +3,10 @@ black_opts = -l 100 -t py38
 py_warn = PYTHONDEVMODE=1
 MYSQL_HOST ?= "127.0.0.1"
 MYSQL_PORT ?= 3306
+MYSQL_PASS ?= "123456"
 POSTGRES_HOST ?= "127.0.0.1"
 POSTGRES_PORT ?= 5432
+POSTGRES_PASS ?= "123456"
 
 help:
 	@echo "Aerich development makefile"
@@ -22,7 +24,7 @@ up:
 	@poetry update
 
 deps:
-	@poetry install -E dbdrivers --no-root
+	@poetry install -E dbdrivers
 
 style: deps
 	isort -src $(checkfiles)


@@ -10,7 +10,7 @@
 Aerich is a database migrations tool for Tortoise-ORM, which like alembic for SQLAlchemy, or Django ORM with it\'s
 own migrations solution.
 
-**If you upgrade aerich from <= 0.2.5 to >= 0.3.0, see [changelog](https://github.com/tortoise/aerich/blob/dev/CHANGELOG.md) for upgrade steps.**
+**Important: You can only use absolutely import in your `models.py` to make `aerich` work.**
 
 ## Install
@@ -103,22 +103,20 @@ If your Tortoise-ORM app is not default `models`, you must specify
 ```shell
 > aerich migrate --name drop_column
-Success migrate 1_202029051520102929_drop_column.json
+Success migrate 1_202029051520102929_drop_column.sql
 ```
 
 Format of migrate filename is
-`{version_num}_{datetime}_{name|update}.json`.
+`{version_num}_{datetime}_{name|update}.sql`.
 
 And if `aerich` guess you are renaming a column, it will ask `Rename {old_column} to {new_column} [True]`, you can choice `True` to rename column without column drop, or choice `False` to drop column then create.
 
-If you use `MySQL`, only MySQL8.0+ support `rename..to` syntax.
-
 ### Upgrade to latest version
 
 ```shell
 > aerich upgrade
-Success upgrade 1_202029051520102929_drop_column.json
+Success upgrade 1_202029051520102929_drop_column.sql
 ```
 
 Now your db is migrated to latest.
@@ -134,13 +132,17 @@ Usage: aerich downgrade [OPTIONS]
 Options:
   -v, --version INTEGER  Specified version, default to last.  [default: -1]
+  -d, --delete           Delete version files at the same time.  [default:
+                         False]
+  --yes                  Confirm the action without prompting.
   -h, --help             Show this message and exit.
 ```
 
 ```shell
 > aerich downgrade
-Success downgrade 1_202029051520102929_drop_column.json
+Success downgrade 1_202029051520102929_drop_column.sql
 ```
 
 Now your db rollback to specified version.
@@ -150,7 +152,7 @@ Now your db rollback to specified version.
 ```shell
 > aerich history
-1_202029051520102929_drop_column.json
+1_202029051520102929_drop_column.sql
 ```
 
 ### Show heads to be migrated
@@ -158,13 +160,27 @@ Now your db rollback to specified version.
 ```shell
 > aerich heads
-1_202029051520102929_drop_column.json
+1_202029051520102929_drop_column.sql
 ```
 
-## Support this project
-
-- Just give a star!
-- Donation.
+### Multiple databases
+
+```python
+tortoise_orm = {
+    "connections": {
+        "default": expand_db_url(db_url, True),
+        "second": expand_db_url(db_url_second, True),
+    },
+    "apps": {
+        "models": {"models": ["tests.models", "aerich.models"], "default_connection": "default"},
+        "models_second": {"models": ["tests.models_second"], "default_connection": "second",},
+    },
+}
+```
+
+You need only specify `aerich.models` in one app, and must specify `--app` when run `aerich migrate` and so on.
+
+## Support this project
 
 | AliPay | WeChatPay | PayPal |
 | ------ | --------- | ------ |

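The multiple-databases snippet added to the README above is lifted from the test suite (`expand_db_url`, `db_url_second`). For an application config the same layout looks roughly like the sketch below; the module paths, credentials and app names are illustrative assumptions, not part of the diff. The point carried over from the README text is that `aerich.models` is registered under exactly one app, and the `--app` option then selects which app a `migrate`/`upgrade` run targets.

```python
# settings.py -- illustrative sketch only; names and URLs are assumptions
TORTOISE_ORM = {
    "connections": {
        "default": "mysql://root:123456@127.0.0.1:3306/main_db",
        "second": "mysql://root:123456@127.0.0.1:3306/second_db",
    },
    "apps": {
        # aerich.models is listed under exactly one app
        "models": {
            "models": ["app.models", "aerich.models"],
            "default_connection": "default",
        },
        # the second app only lists its own models
        "models_second": {
            "models": ["app.models_second"],
            "default_connection": "second",
        },
    },
}
```

aerich's `init` command then takes a dotted path to such a dict; the help text in the cli.py diff below gives `settings.TORTOISE_ORM` as the example.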

@@ -1 +1 @@
-__version__ = "0.3.1"
+__version__ = "0.4.2"


@@ -1,9 +1,9 @@
 import asyncio
-import json
 import os
 import sys
 from configparser import ConfigParser
 from functools import wraps
+from pathlib import Path
 
 import click
 from click import Context, UsageError
@@ -13,7 +13,13 @@ from tortoise.transactions import in_transaction
 from tortoise.utils import get_schema_sql
 
 from aerich.migrate import Migrate
-from aerich.utils import get_app_connection, get_app_connection_name, get_tortoise_config
+from aerich.utils import (
+    get_app_connection,
+    get_app_connection_name,
+    get_tortoise_config,
+    get_version_content_from_file,
+    write_version_file,
+)
 
 from . import __version__
 from .enums import Color
@@ -38,7 +44,11 @@ def coro(f):
 @click.group(context_settings={"help_option_names": ["-h", "--help"]})
 @click.version_option(__version__, "-V", "--version")
 @click.option(
-    "-c", "--config", default="aerich.ini", show_default=True, help="Config file.",
+    "-c",
+    "--config",
+    default="aerich.ini",
+    show_default=True,
+    help="Config file.",
 )
 @click.option("--app", required=False, help="Tortoise-ORM app name.")
 @click.option(
@@ -57,7 +67,7 @@ async def cli(ctx: Context, config, app, name):
     invoked_subcommand = ctx.invoked_subcommand
     if invoked_subcommand != "init":
-        if not os.path.exists(config):
+        if not Path(config).exists():
             raise UsageError("You must exec init first", ctx=ctx)
         parser.read(config)
@@ -66,13 +76,13 @@ async def cli(ctx: Context, config, app, name):
         tortoise_config = get_tortoise_config(ctx, tortoise_orm)
         app = app or list(tortoise_config.get("apps").keys())[0]
-        if "aerich.models" not in tortoise_config.get("apps").get(app).get("models"):
-            raise UsageError("Check your tortoise config and add aerich.models to it.", ctx=ctx)
         ctx.obj["config"] = tortoise_config
         ctx.obj["location"] = location
         ctx.obj["app"] = app
         Migrate.app = app
         if invoked_subcommand != "init-db":
+            if not Path(location, app).exists():
+                raise UsageError("You must exec init-db first", ctx=ctx)
             await Migrate.init_with_old_models(tortoise_config, app, location)
@@ -102,9 +112,8 @@ async def upgrade(ctx: Context):
             exists = False
         if not exists:
             async with in_transaction(get_app_connection_name(config, app)) as conn:
-                file_path = os.path.join(Migrate.migrate_location, version_file)
-                with open(file_path, "r", encoding="utf-8") as f:
-                    content = json.load(f)
+                file_path = Path(Migrate.migrate_location, version_file)
+                content = get_version_content_from_file(file_path)
                 upgrade_query_list = content.get("upgrade")
                 for upgrade_query in upgrade_query_list:
                     await conn.execute_script(upgrade_query)
@@ -116,7 +125,7 @@
             click.secho(f"Success upgrade {version_file}", fg=Color.green)
             migrated = True
     if not migrated:
-        click.secho("No migrate items", fg=Color.yellow)
+        click.secho("No upgrade items found", fg=Color.yellow)
 
 
 @cli.command(help="Downgrade to specified version.")
@@ -128,9 +137,20 @@ async def upgrade(ctx: Context):
     show_default=True,
     help="Specified version, default to last.",
 )
+@click.option(
+    "-d",
+    "--delete",
+    is_flag=True,
+    default=False,
+    show_default=True,
+    help="Delete version files at the same time.",
+)
 @click.pass_context
+@click.confirmation_option(
+    prompt="Downgrade is dangerous, which maybe lose your data, are you sure?",
+)
 @coro
-async def downgrade(ctx: Context, version: int):
+async def downgrade(ctx: Context, version: int, delete: bool):
     app = ctx.obj["app"]
     config = ctx.obj["config"]
     if version == -1:
@@ -146,15 +166,16 @@ async def downgrade(ctx: Context, version: int):
     for version in versions:
         file = version.version
         async with in_transaction(get_app_connection_name(config, app)) as conn:
-            file_path = os.path.join(Migrate.migrate_location, file)
-            with open(file_path, "r", encoding="utf-8") as f:
-                content = json.load(f)
+            file_path = Path(Migrate.migrate_location, file)
+            content = get_version_content_from_file(file_path)
             downgrade_query_list = content.get("downgrade")
             if not downgrade_query_list:
-                return click.secho("No downgrade item found", fg=Color.yellow)
+                click.secho("No downgrade items found", fg=Color.yellow)
+                return
             for downgrade_query in downgrade_query_list:
                 await conn.execute_query(downgrade_query)
             await version.delete()
-            os.unlink(file_path)
+            if delete:
+                os.unlink(file_path)
             click.secho(f"Success downgrade {file}", fg=Color.green)
@@ -193,16 +214,21 @@ async def history(ctx: Context):
     help="Tortoise-ORM config module dict variable, like settings.TORTOISE_ORM.",
 )
 @click.option(
-    "--location", default="./migrations", show_default=True, help="Migrate store location."
+    "--location",
+    default="./migrations",
+    show_default=True,
+    help="Migrate store location.",
 )
 @click.pass_context
 @coro
 async def init(
-    ctx: Context, tortoise_orm, location,
+    ctx: Context,
+    tortoise_orm,
+    location,
 ):
     config_file = ctx.obj["config_file"]
     name = ctx.obj["name"]
-    if os.path.exists(config_file):
+    if Path(config_file).exists():
         return click.secho("You have inited", fg=Color.yellow)
 
     parser.add_section(name)
@@ -212,7 +238,7 @@ async def init(
     with open(config_file, "w", encoding="utf-8") as f:
         parser.write(f)
 
-    if not os.path.isdir(location):
+    if not Path(location).is_dir():
         os.mkdir(location)
 
     click.secho(f"Success create migrate location {location}", fg=Color.green)
@@ -234,12 +260,14 @@ async def init_db(ctx: Context, safe):
     location = ctx.obj["location"]
     app = ctx.obj["app"]
 
-    dirname = os.path.join(location, app)
-    if not os.path.isdir(dirname):
+    dirname = Path(location, app)
+    if not dirname.is_dir():
         os.mkdir(dirname)
         click.secho(f"Success create app migrate location {dirname}", fg=Color.green)
     else:
-        return click.secho(f"Inited {app} already", fg=Color.yellow)
+        return click.secho(
+            f"Inited {app} already, or delete {dirname} and try again.", fg=Color.yellow
+        )
 
     await Tortoise.init(config=config)
     connection = get_app_connection(config, app)
@@ -249,16 +277,21 @@
     version = await Migrate.generate_version()
     await Aerich.create(
-        version=version, app=app, content=Migrate.get_models_content(config, app, location)
+        version=version,
+        app=app,
+        content=Migrate.get_models_content(config, app, location),
     )
-    with open(os.path.join(dirname, version), "w", encoding="utf-8") as f:
-        content = {
-            "upgrade": [schema],
-        }
-        json.dump(content, f, ensure_ascii=False, indent=2)
-    return click.secho(f'Success generate schema for app "{app}"', fg=Color.green)
+    content = {
+        "upgrade": [schema],
+    }
+    write_version_file(Path(dirname, version), content)
+    click.secho(f'Success generate schema for app "{app}"', fg=Color.green)
 
 
 def main():
     sys.path.insert(0, ".")
     cli()
+
+
+if __name__ == "__main__":
+    main()


@@ -22,6 +22,9 @@ class BaseDDL:
     _DROP_FK_TEMPLATE = 'ALTER TABLE "{table_name}" DROP FOREIGN KEY "{fk_name}"'
     _M2M_TABLE_TEMPLATE = 'CREATE TABLE "{table_name}" ("{backward_key}" {backward_type} NOT NULL REFERENCES "{backward_table}" ("{backward_field}") ON DELETE CASCADE,"{forward_key}" {forward_type} NOT NULL REFERENCES "{forward_table}" ("{forward_field}") ON DELETE {on_delete}){extra}{comment};'
     _MODIFY_COLUMN_TEMPLATE = 'ALTER TABLE "{table_name}" MODIFY COLUMN {column}'
+    _CHANGE_COLUMN_TEMPLATE = (
+        'ALTER TABLE "{table_name}" CHANGE {old_column_name} {new_column_name} {new_column_type}'
+    )
 
     def __init__(self, client: "BaseDBAsyncClient"):
         self.client = client
@@ -136,6 +139,16 @@ class BaseDDL:
             new_column_name=new_column_name,
         )
 
+    def change_column(
+        self, model: "Type[Model]", old_column_name: str, new_column_name: str, new_column_type: str
+    ):
+        return self._CHANGE_COLUMN_TEMPLATE.format(
+            table_name=model._meta.db_table,
+            old_column_name=old_column_name,
+            new_column_name=new_column_name,
+            new_column_type=new_column_type,
+        )
+
     def add_index(self, model: "Type[Model]", field_names: List[str], unique=False):
         return self._ADD_INDEX_TEMPLATE.format(
             unique="UNIQUE" if unique else "",

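For context on the new template: MySQL 5.x has no `RENAME COLUMN`, so a rename has to go through `CHANGE`, which must restate the full column type. A rough sketch of the SQL `change_column` would emit, formatting the template string from the hunk above by hand with made-up table and column names:

```python
# Sketch: formatting the _CHANGE_COLUMN_TEMPLATE added above by hand.
# Table name, column names and the column type below are hypothetical examples.
_CHANGE_COLUMN_TEMPLATE = (
    'ALTER TABLE "{table_name}" CHANGE {old_column_name} {new_column_name} {new_column_type}'
)

sql = _CHANGE_COLUMN_TEMPLATE.format(
    table_name="user",
    old_column_name="name",
    new_column_name="username",
    new_column_type="VARCHAR(50) NOT NULL",
)
print(sql)
# ALTER TABLE "user" CHANGE name username VARCHAR(50) NOT NULL
```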

@@ -1,25 +1,28 @@
-import json
+import inspect
 import os
 import re
 from datetime import datetime
 from importlib import import_module
 from io import StringIO
-from typing import Dict, List, Tuple, Type
+from pathlib import Path
+from typing import Dict, List, Optional, Tuple, Type
 
 import click
 from tortoise import (
     BackwardFKRelation,
     BackwardOneToOneRelation,
+    BaseDBAsyncClient,
     ForeignKeyFieldInstance,
     ManyToManyFieldInstance,
     Model,
     Tortoise,
 )
+from tortoise.exceptions import OperationalError
 from tortoise.fields import Field
 
 from aerich.ddl import BaseDDL
 from aerich.models import MAX_VERSION_LENGTH, Aerich
-from aerich.utils import get_app_connection
+from aerich.utils import get_app_connection, write_version_file
 
 
 class Migrate:
@@ -40,43 +43,53 @@ class Migrate:
     app: str
     migrate_location: str
     dialect: str
+    _db_version: Optional[str] = None
 
     @classmethod
     def get_old_model_file(cls, app: str, location: str):
-        return os.path.join(location, app, cls.old_models + ".py")
+        return Path(location, app, cls.old_models + ".py")
 
     @classmethod
     def get_all_version_files(cls) -> List[str]:
         return sorted(
-            filter(lambda x: x.endswith("json"), os.listdir(cls.migrate_location)),
+            filter(lambda x: x.endswith("sql"), os.listdir(cls.migrate_location)),
             key=lambda x: int(x.split("_")[0]),
         )
 
     @classmethod
-    async def get_last_version(cls) -> Aerich:
-        return await Aerich.filter(app=cls.app).first()
+    async def get_last_version(cls) -> Optional[Aerich]:
+        try:
+            return await Aerich.filter(app=cls.app).first()
+        except OperationalError:
+            pass
 
     @classmethod
     def remove_old_model_file(cls, app: str, location: str):
         try:
             os.unlink(cls.get_old_model_file(app, location))
-        except FileNotFoundError:
+        except (OSError, FileNotFoundError):
             pass
 
+    @classmethod
+    async def _get_db_version(cls, connection: BaseDBAsyncClient):
+        if cls.dialect == "mysql":
+            sql = "select version() as version"
+            ret = await connection.execute_query(sql)
+            cls._db_version = ret[1][0].get("version")
+
     @classmethod
     async def init_with_old_models(cls, config: dict, app: str, location: str):
         await Tortoise.init(config=config)
         last_version = await cls.get_last_version()
+        cls.app = app
+        cls.migrate_location = Path(location, app)
         if last_version:
             content = last_version.content
-            with open(cls.get_old_model_file(app, location), "w") as f:
+            with open(cls.get_old_model_file(app, location), "w", encoding="utf-8") as f:
                 f.write(content)
         migrate_config = cls._get_migrate_config(config, app, location)
-        cls.app = app
         cls.migrate_config = migrate_config
-        cls.migrate_location = os.path.join(location, app)
         await Tortoise.init(config=migrate_config)
         connection = get_app_connection(config, app)
@@ -93,6 +106,7 @@ class Migrate:
             from aerich.ddl.postgres import PostgresDDL
 
             cls.ddl = PostgresDDL(connection)
+        await cls._get_db_version(connection)
 
     @classmethod
     async def _get_last_version_num(cls):
@@ -107,8 +121,8 @@
         now = datetime.now().strftime("%Y%m%d%H%M%S").replace("/", "")
         last_version_num = await cls._get_last_version_num()
         if last_version_num is None:
-            return f"0_{now}_init.json"
-        version = f"{last_version_num + 1}_{now}_{name}.json"
+            return f"0_{now}_init.sql"
+        version = f"{last_version_num + 1}_{now}_{name}.sql"
         if len(version) > MAX_VERSION_LENGTH:
             raise ValueError(f"Version name exceeds maximum length ({MAX_VERSION_LENGTH})")
         return version
@@ -119,13 +133,12 @@
         # delete if same version exists
         for version_file in cls.get_all_version_files():
             if version_file.startswith(version.split("_")[0]):
-                os.unlink(os.path.join(cls.migrate_location, version_file))
+                os.unlink(Path(cls.migrate_location, version_file))
 
         content = {
             "upgrade": cls.upgrade_operators,
             "downgrade": cls.downgrade_operators,
         }
-        with open(os.path.join(cls.migrate_location, version), "w", encoding="utf-8") as f:
-            json.dump(content, f, indent=2, ensure_ascii=False)
+        write_version_file(Path(cls.migrate_location, version), content)
         return version
 
     @classmethod
@@ -178,8 +191,7 @@
         :param location:
         :return:
         """
-        path = os.path.join(location, app, cls.old_models)
-        path = path.replace(os.sep, ".").lstrip(".")
+        path = Path(location, app, cls.old_models).as_posix().replace("/", ".")
         config["apps"][cls.diff_app] = {
             "models": [path],
             "default_connection": config.get("apps").get(app).get("default_connection", "default"),
@@ -198,7 +210,15 @@
         old_model_files = []
         models = config.get("apps").get(app).get("models")
         for model in models:
-            old_model_files.append(import_module(model).__file__)
+            module = import_module(model)
+            possible_models = [getattr(module, attr_name) for attr_name in dir(module)]
+            for attr in filter(
+                lambda x: inspect.isclass(x) and issubclass(x, Model) and x is not Model,
+                possible_models,
+            ):
+                file = inspect.getfile(attr)
+                if file not in old_model_files:
+                    old_model_files.append(file)
         pattern = rf"(\n)?('|\")({app})(.\w+)('|\")"
         str_io = StringIO()
         for i, model_file in enumerate(old_model_files):
@@ -290,13 +310,26 @@
                         else:
                             is_rename = diff_key in cls._rename_new
                         if is_rename:
-                            cls._add_operator(
-                                cls._rename_field(new_model, old_field, new_field), upgrade,
-                            )
+                            if (
+                                cls.dialect == "mysql"
+                                and cls._db_version
+                                and cls._db_version.startswith("5.")
+                            ):
+                                cls._add_operator(
+                                    cls._change_field(new_model, old_field, new_field),
+                                    upgrade,
+                                )
+                            else:
+                                cls._add_operator(
+                                    cls._rename_field(new_model, old_field, new_field),
+                                    upgrade,
+                                )
                             break
                 else:
                     cls._add_operator(
-                        cls._add_field(new_model, new_field), upgrade, cls._is_fk_m2m(new_field),
+                        cls._add_field(new_model, new_field),
+                        upgrade,
+                        cls._is_fk_m2m(new_field),
                     )
             else:
                 old_field = old_fields_map.get(new_key)
@@ -312,7 +345,9 @@
                     cls._add_operator(
                         cls._alter_null(new_model, new_field), upgrade=upgrade
                     )
-                if new_field.default != old_field.default:
+                if new_field.default != old_field.default and not callable(
+                    new_field.default
+                ):
                     cls._add_operator(
                         cls._alter_default(new_model, new_field), upgrade=upgrade
                     )
@@ -347,11 +382,15 @@
             if isinstance(new_field, ForeignKeyFieldInstance):
                 if old_field.db_constraint and not new_field.db_constraint:
                     cls._add_operator(
-                        cls._drop_fk(new_model, new_field), upgrade, True,
+                        cls._drop_fk(new_model, new_field),
+                        upgrade,
+                        True,
                     )
                 if new_field.db_constraint and not old_field.db_constraint:
                     cls._add_operator(
-                        cls._add_fk(new_model, new_field), upgrade, True,
+                        cls._add_fk(new_model, new_field),
+                        upgrade,
+                        True,
                     )
 
         for old_key in old_keys:
@@ -361,12 +400,20 @@
                     not upgrade and old_key not in cls._rename_new
                 ):
                     cls._add_operator(
-                        cls._remove_field(old_model, field), upgrade, cls._is_fk_m2m(field),
+                        cls._remove_field(old_model, field),
+                        upgrade,
+                        cls._is_fk_m2m(field),
                     )
 
         for new_index in new_indexes:
             if new_index not in old_indexes:
-                cls._add_operator(cls._add_index(new_model, new_index,), upgrade)
+                cls._add_operator(
+                    cls._add_index(
+                        new_model,
+                        new_index,
+                    ),
+                    upgrade,
+                )
         for old_index in old_indexes:
             if old_index not in new_indexes:
                 cls._add_operator(cls._remove_index(old_model, old_index), upgrade)
@@ -462,6 +509,15 @@
     def _rename_field(cls, model: Type[Model], old_field: Field, new_field: Field):
         return cls.ddl.rename_column(model, old_field.model_field_name, new_field.model_field_name)
 
+    @classmethod
+    def _change_field(cls, model: Type[Model], old_field: Field, new_field: Field):
+        return cls.ddl.change_column(
+            model,
+            old_field.model_field_name,
+            new_field.model_field_name,
+            new_field.get_for_dialect(cls.dialect, "SQL_TYPE"),
+        )
+
     @classmethod
     def _add_fk(cls, model: Type[Model], field: ForeignKeyFieldInstance):
         """

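Two small behaviours in the migrate.py hunks above are easy to miss: version files are now named `{version_num}_{datetime}_{name}.sql`, and `get_all_version_files` orders them by the integer prefix rather than lexically. The sketch below re-implements that naming and sorting standalone for illustration; it is not the aerich API itself.

```python
# Illustrative re-implementation of the version-file naming/ordering shown above.
from datetime import datetime
from typing import Iterable, List, Optional


def build_version_name(last_version_num: Optional[int], name: str) -> str:
    now = datetime.now().strftime("%Y%m%d%H%M%S")
    if last_version_num is None:
        return f"0_{now}_init.sql"
    return f"{last_version_num + 1}_{now}_{name}.sql"


def sort_version_files(files: Iterable[str]) -> List[str]:
    # numeric sort on the leading counter, as in Migrate.get_all_version_files
    return sorted(
        (f for f in files if f.endswith("sql")),
        key=lambda x: int(x.split("_")[0]),
    )


print(build_version_name(1, "drop_column"))
# e.g. 2_20201202230315_drop_column.sql
print(sort_version_files(["11_x_update.sql", "2_x_update.sql", "1_x_update.sql"]))
# ['1_x_update.sql', '2_x_update.sql', '11_x_update.sql']
```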

@@ -1,17 +1,24 @@
 import importlib
+from typing import Dict
 
 from click import BadOptionUsage, Context
 from tortoise import BaseDBAsyncClient, Tortoise
 
 
-def get_app_connection_name(config, app) -> str:
+def get_app_connection_name(config, app_name: str) -> str:
     """
     get connection name
     :param config:
-    :param app:
+    :param app_name:
     :return:
     """
-    return config.get("apps").get(app).get("default_connection", "default")
+    app = config.get("apps").get(app_name)
+    if app:
+        return app.get("default_connection", "default")
+    raise BadOptionUsage(
+        option_name="--app",
+        message=f'Can\'t get app named "{app_name}"',
+    )
 
 
 def get_app_connection(config, app) -> BaseDBAsyncClient:
@@ -49,3 +56,55 @@ def get_tortoise_config(ctx: Context, tortoise_orm: str) -> dict:
             ctx=ctx,
         )
     return config
+
+
+_UPGRADE = "##### upgrade #####\n"
+_DOWNGRADE = "##### downgrade #####\n"
+
+
+def get_version_content_from_file(version_file: str) -> Dict:
+    """
+    get version content
+    :param version_file:
+    :return:
+    """
+    with open(version_file, "r", encoding="utf-8") as f:
+        content = f.read()
+        first = content.index(_UPGRADE)
+        try:
+            second = content.index(_DOWNGRADE)
+        except ValueError:
+            second = len(content) - 1
+        upgrade_content = content[first + len(_UPGRADE) : second].strip()  # noqa:E203
+        downgrade_content = content[second + len(_DOWNGRADE) :].strip()  # noqa:E203
+        ret = {
+            "upgrade": list(filter(lambda x: x or False, upgrade_content.split(";\n"))),
+            "downgrade": list(filter(lambda x: x or False, downgrade_content.split(";\n"))),
+        }
+        return ret
+
+
+def write_version_file(version_file: str, content: Dict):
+    """
+    write version file
+    :param version_file:
+    :param content:
+    :return:
+    """
+    with open(version_file, "w", encoding="utf-8") as f:
+        f.write(_UPGRADE)
+        upgrade = content.get("upgrade")
+        if len(upgrade) > 1:
+            f.write(";\n".join(upgrade) + ";\n")
+        else:
+            f.write(f"{upgrade[0]}")
+            if not upgrade[0].endswith(";"):
+                f.write(";")
+            f.write("\n")
+        downgrade = content.get("downgrade")
+        if downgrade:
+            f.write(_DOWNGRADE)
+            if len(downgrade) > 1:
+                f.write(";\n".join(downgrade) + ";\n")
+            else:
+                f.write(f"{downgrade[0]};\n")

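A rough usage sketch for the two helpers added above, assuming this version of `aerich.utils` is importable; the SQL statements and the file name are made up. `write_version_file` stores plain SQL under the `##### upgrade #####` / `##### downgrade #####` markers with statements separated by `;` and newlines, and `get_version_content_from_file` parses that back into the dict shape that `upgrade`/`downgrade` in cli.py iterate over.

```python
# Sketch only: round-tripping a version file through the new helpers.
from pathlib import Path

from aerich.utils import get_version_content_from_file, write_version_file

content = {
    "upgrade": ['ALTER TABLE "category" DROP COLUMN "name"'],
    "downgrade": ['ALTER TABLE "category" ADD "name" VARCHAR(200)'],
}
path = Path("1_20201202230315_drop_column.sql")  # hypothetical version file
write_version_file(path, content)

print(path.read_text(encoding="utf-8"))
# -> dict with "upgrade" and "downgrade" lists of SQL statements
print(get_version_content_from_file(path))
```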

@@ -13,10 +13,15 @@ from aerich.ddl.sqlite import SqliteDDL
 from aerich.migrate import Migrate
 
 db_url = os.getenv("TEST_DB", "sqlite://:memory:")
+db_url_second = os.getenv("TEST_DB_SECOND", "sqlite://:memory:")
 tortoise_orm = {
-    "connections": {"default": expand_db_url(db_url, True)},
+    "connections": {
+        "default": expand_db_url(db_url, True),
+        "second": expand_db_url(db_url_second, True),
+    },
     "apps": {
         "models": {"models": ["tests.models", "aerich.models"], "default_connection": "default"},
+        "models_second": {"models": ["tests.models_second"], "default_connection": "second"},
     },
 }
@@ -62,5 +67,5 @@ async def initialize_tests(event_loop, request):
         Migrate.ddl = SqliteDDL(client)
     elif client.schema_generator is AsyncpgSchemaGenerator:
         Migrate.ddl = PostgresDDL(client)
-
+    Migrate.dialect = Migrate.ddl.DIALECT
     request.addfinalizer(lambda: event_loop.run_until_complete(Tortoise._drop_databases()))

poetry.lock (generated): 716 changes

File diff suppressed because it is too large.


@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "aerich"
-version = "0.3.1"
+version = "0.4.2"
 description = "A database migrations tool for Tortoise ORM."
 authors = ["long2ice <long2ice@gmail.com>"]
 license = "Apache-2.0"
@@ -25,7 +25,7 @@ asyncpg = {version = "*", optional = true}
 [tool.poetry.dev-dependencies]
 flake8 = "*"
 isort = "*"
-black = "^19.10b0"
+black = "^20.8b1"
 pytest = "*"
 pytest-xdist = "*"
 pytest-asyncio = "*"

tests/models_second.py (new file): 63 additions

@@ -0,0 +1,63 @@
+import datetime
+from enum import IntEnum
+
+from tortoise import Model, fields
+
+
+class ProductType(IntEnum):
+    article = 1
+    page = 2
+
+
+class PermissionAction(IntEnum):
+    create = 1
+    delete = 2
+    update = 3
+    read = 4
+
+
+class Status(IntEnum):
+    on = 1
+    off = 0
+
+
+class User(Model):
+    username = fields.CharField(max_length=20, unique=True)
+    password = fields.CharField(max_length=200)
+    last_login = fields.DatetimeField(description="Last Login", default=datetime.datetime.now)
+    is_active = fields.BooleanField(default=True, description="Is Active")
+    is_superuser = fields.BooleanField(default=False, description="Is SuperUser")
+    avatar = fields.CharField(max_length=200, default="")
+    intro = fields.TextField(default="")
+
+
+class Email(Model):
+    email = fields.CharField(max_length=200)
+    is_primary = fields.BooleanField(default=False)
+    user = fields.ForeignKeyField("models_second.User", db_constraint=False)
+
+
+class Category(Model):
+    slug = fields.CharField(max_length=200)
+    name = fields.CharField(max_length=200)
+    user = fields.ForeignKeyField("models_second.User", description="User")
+    created_at = fields.DatetimeField(auto_now_add=True)
+
+
+class Product(Model):
+    categories = fields.ManyToManyField("models_second.Category")
+    name = fields.CharField(max_length=50)
+    view_num = fields.IntField(description="View Num")
+    sort = fields.IntField()
+    is_reviewed = fields.BooleanField(description="Is Reviewed")
+    type = fields.IntEnumField(ProductType, description="Product Type")
+    image = fields.CharField(max_length=200)
+    body = fields.TextField()
+    created_at = fields.DatetimeField(auto_now_add=True)
+
+
+class Config(Model):
+    label = fields.CharField(max_length=200)
+    key = fields.CharField(max_length=20)
+    value = fields.JSONField()
+    status: Status = fields.IntEnumField(Status, default=Status.on)


@@ -42,7 +42,7 @@ def test_create_table():
     "id" SERIAL NOT NULL PRIMARY KEY,
     "slug" VARCHAR(200) NOT NULL,
     "name" VARCHAR(200) NOT NULL,
-    "created_at" TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    "created_at" TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP,
     "user_id" INT NOT NULL REFERENCES "user" ("id") ON DELETE CASCADE
 );
 COMMENT ON COLUMN "category"."user_id" IS 'User';"""


@@ -62,18 +62,18 @@ def test_sort_all_version_files(mocker):
     mocker.patch(
         "os.listdir",
         return_value=[
-            "1_datetime_update.json",
-            "11_datetime_update.json",
-            "10_datetime_update.json",
-            "2_datetime_update.json",
+            "1_datetime_update.sql",
+            "11_datetime_update.sql",
+            "10_datetime_update.sql",
+            "2_datetime_update.sql",
         ],
     )
 
     Migrate.migrate_location = "."
 
     assert Migrate.get_all_version_files() == [
-        "1_datetime_update.json",
-        "2_datetime_update.json",
-        "10_datetime_update.json",
-        "11_datetime_update.json",
+        "1_datetime_update.sql",
+        "2_datetime_update.sql",
+        "10_datetime_update.sql",
+        "11_datetime_update.sql",
     ]