Compare commits
61 Commits
40d0823c01, 467406ed20, 484b5900ce, b8b6df0b65, f0bc3126e9, dbc0d9e7ef, 818dd29991, e199e03b53,
d79dc25ee8, c6d51a4dcf, 241b30a710, 8cf50c58d7, 1c9b65cc37, 3fbf9febfb, 7b6545d4e1, 52b50a2161,
90943a473c, d7ecd97e88, 20aebc4413, f8e1f9ff44, ab31445fb2, 28d19a4b7b, 9da99824fe, 75db7cea60,
d777c9c278, e9b76bdd35, 8b7864d886, bef45941f2, 7b472d7a84, 1f0a6dfb50, 36282f123f, 3cd4e24050,
f8c2f1b551, 131d97a3d6, 1a0371e977, e5b092fd08, 7a109f3c79, 8c2ecbaef1, b141363c51, 9dd474d79f,
e4bb9d838e, 029d522c79, d6627906c7, 3c88833154, 8f68f08eba, 60ba6963fd, 4c35c44bd2, bdeaf5495e,
db33059ec9, 44b96058f8, abff753b6a, dcd8441a05, b4a735b814, 83ba13e99a, d7b1c07d13, 1ac16188fc,
4abc464ce0, d4430cec0d, 40c7ef7fd6, 7a826df43f, b1b9cc1454
.github/workflows/ci.yml (vendored) | 4
@@ -2,10 +2,10 @@ name: ci
 on:
   push:
     branches-ignore:
-      - master
+      - main
   pull_request:
     branches-ignore:
-      - master
+      - main
 jobs:
   ci:
     runs-on: ubuntu-latest
CHANGELOG.md | 35
@@ -1,5 +1,40 @@
 # ChangeLog
 
+## 0.7
+
+### 0.7.2
+
+- Support virtual fields.
+- Fix modifying a column multiple times. (#279)
+- Add `-i` and `--in-transaction` options to the `aerich upgrade` command. (#296)
+- Fix generating two semicolons in a row. (#301)
+
+### 0.7.1
+
+- Fix syntax error with python3.8.10. (#265)
+- Fix SQL generation error. (#263)
+- Fix initializing an empty database. (#267)
+
+### 0.7.1rc1
+
+- Fix postgres SQL error. (#263)
+
+### 0.7.0
+
+**Now aerich uses `.py` files to record versions.**
+
+Upgrade Note:
+
+1. Drop the `aerich` table
+2. Delete the `migrations/models` folder
+3. Run `aerich init-db`
+
+- Improve `inspectdb`, adding support for the `postgresql::numeric` data type
+- Add support for dynamically loading DDL classes, making it easy to add support for
+  new databases without changing the `Migrate` class logic
+- Fix decimal field change. (#246)
+- Support add/remove field with index.
+
 ## 0.6
 
 ### 0.6.3
Makefile | 2
@@ -20,7 +20,7 @@ style: deps
 
 check: deps
	@black --check $(black_opts) $(checkfiles) || (echo "Please run 'make style' to auto-fix style issues" && false)
-	@pflake8 $(checkfiles)
+	@ruff $(checkfiles)
 
 test: deps
	$(py_warn) TEST_DB=sqlite://:memory: py.test
README.md | 14
@@ -5,6 +5,8 @@
 [](https://github.com/tortoise/aerich/actions?query=workflow:pypi)
 [](https://github.com/tortoise/aerich/actions?query=workflow:ci)
 
+English | [Русский](./README_RU.md)
+
 ## Introduction
 
 Aerich is a database migrations tool for TortoiseORM, which is like alembic for SQLAlchemy, or like Django ORM with
@@ -101,11 +103,11 @@ e.g. `aerich --app other_models init-db`.
 ```shell
 > aerich migrate --name drop_column
 
-Success migrate 1_202029051520102929_drop_column.sql
+Success migrate 1_202029051520102929_drop_column.py
 ```
 
 Format of migrate filename is
-`{version_num}_{datetime}_{name|update}.sql`.
+`{version_num}_{datetime}_{name|update}.py`.
 
 If `aerich` guesses you are renaming a column, it will ask `Rename {old_column} to {new_column} [True]`. You can choose
 `True` to rename column without column drop, or choose `False` to drop the column then create. Note that the latter may
@@ -116,7 +118,7 @@ lose data.
 ```shell
 > aerich upgrade
 
-Success upgrade 1_202029051520102929_drop_column.sql
+Success upgrade 1_202029051520102929_drop_column.py
 ```
 
 Now your db is migrated to latest.
@@ -142,7 +144,7 @@ Options:
 ```shell
 > aerich downgrade
 
-Success downgrade 1_202029051520102929_drop_column.sql
+Success downgrade 1_202029051520102929_drop_column.py
 ```
 
 Now your db is rolled back to the specified version.
@@ -152,7 +154,7 @@ Now your db is rolled back to the specified version.
 ```shell
 > aerich history
 
-1_202029051520102929_drop_column.sql
+1_202029051520102929_drop_column.py
 ```
 
 ### Show heads to be migrated
@@ -160,7 +162,7 @@ Now your db is rolled back to the specified version.
 ```shell
 > aerich heads
 
-1_202029051520102929_drop_column.sql
+1_202029051520102929_drop_column.py
 ```
 
 ### Inspect db tables to TortoiseORM model
README_RU.md (new file) | 274
@@ -0,0 +1,274 @@
# Aerich

[](https://pypi.python.org/pypi/aerich)
[](https://github.com/tortoise/aerich)
[](https://github.com/tortoise/aerich/actions?query=workflow:pypi)
[](https://github.com/tortoise/aerich/actions?query=workflow:ci)

[English](./README.md) | Русский

## Introduction

Aerich is a database migrations tool for TortoiseORM, analogous to Alembic for SQLAlchemy or the built-in migrations of Django ORM.

## Installation

Just install from pypi:

```shell
pip install aerich
```

## Quick Start

```shell
> aerich -h

Usage: aerich [OPTIONS] COMMAND [ARGS]...

Options:
  -V, --version      Show the version and exit.
  -c, --config TEXT  Config file.  [default: pyproject.toml]
  --app TEXT         Tortoise-ORM app name.
  -h, --help         Show this message and exit.

Commands:
  downgrade  Downgrade to specified version.
  heads      Show current available heads in migrate location.
  history    List all migrate items.
  init       Init config file and generate root migrate location.
  init-db    Generate schema and generate app migrate location.
  inspectdb  Introspects the database tables to standard output as...
  migrate    Generate migrate changes file.
  upgrade    Upgrade to specified version.
```

## Usage

First, you need to add aerich.models to the configuration of your Tortoise-ORM. Example:

```python
TORTOISE_ORM = {
    "connections": {"default": "mysql://root:123456@127.0.0.1:3306/test"},
    "apps": {
        "models": {
            "models": ["tests.models", "aerich.models"],
            "default_connection": "default",
        },
    },
}
```

### Initialization

```shell
> aerich init -h

Usage: aerich init [OPTIONS]

  Init config file and generate root migrate location.

Options:
  -t, --tortoise-orm TEXT  Tortoise-ORM config module dict variable, like
                           settings.TORTOISE_ORM.  [required]
  --location TEXT          Migrate store location.  [default: ./migrations]
  -s, --src_folder TEXT    Folder of the source, relative to the project root.
  -h, --help               Show this message and exit.
```

Initialize the config file and set the migrations location:

```shell
> aerich init -t tests.backends.mysql.TORTOISE_ORM

Success create migrate location ./migrations
Success write config to pyproject.toml
```

### Initialize the database

```shell
> aerich init-db

Success create app migrate location ./migrations/models
Success generate schema for app "models"
```

If your Tortoise-ORM app is not the default app named models, you must specify the correct app name with the --app option, e.g. aerich --app other_models init-db.

### Update models and create a migration

```shell
> aerich migrate --name drop_column

Success migrate 1_202029051520102929_drop_column.py
```

The migration filename format is `{version_num}_{datetime}_{name|update}.py`.

If aerich guesses that you are renaming a column, it will ask:
`Rename {old_column} to {new_column} [True]`. You can choose `True`
to rename the column without dropping it, or choose `False` to drop the column
and then create a new one. Note that the latter option may lose data.


### Upgrade to the latest version

```shell
> aerich upgrade

Success upgrade 1_202029051520102929_drop_column.py
```

Now your database is upgraded to the latest version.

### Downgrade to a specified version

```shell
> aerich downgrade -h

Usage: aerich downgrade [OPTIONS]

  Downgrade to specified version.

Options:
  -v, --version INTEGER  Specified version, default to last.  [default: -1]
  -d, --delete           Delete version files at the same time.  [default:
                         False]

  --yes                  Confirm the action without prompting.
  -h, --help             Show this message and exit.
```

```shell
> aerich downgrade

Success downgrade 1_202029051520102929_drop_column.py
```

Now your database is rolled back to the specified version.

### Show history

```shell
> aerich history

1_202029051520102929_drop_column.py
```

### Show heads to be migrated

```shell
> aerich heads

1_202029051520102929_drop_column.py
```

### Inspect db tables to TortoiseORM model

Currently `inspectdb` supports MySQL, Postgres, and SQLite.

```shell
Usage: aerich inspectdb [OPTIONS]

  Introspects the database tables to standard output as TortoiseORM model.

Options:
  -t, --table TEXT  Which tables to inspect.
  -h, --help        Show this message and exit.
```

Inspect all tables and print them to the console:

```shell
aerich --app models inspectdb
```

Inspect the specified table in the default app and redirect it to models.py:

```shell
aerich inspectdb -t user > models.py
```

For example, suppose your table looks like this:

```sql
CREATE TABLE `test`
(
    `id`       int NOT NULL AUTO_INCREMENT,
    `decimal`  decimal(10, 2) NOT NULL,
    `date`     date DEFAULT NULL,
    `datetime` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
    `time`     time DEFAULT NULL,
    `float`    float DEFAULT NULL,
    `string`   varchar(200) COLLATE utf8mb4_general_ci DEFAULT NULL,
    `tinyint`  tinyint DEFAULT NULL,
    PRIMARY KEY (`id`),
    KEY `asyncmy_string_index` (`string`)
) ENGINE = InnoDB
  DEFAULT CHARSET = utf8mb4
  COLLATE = utf8mb4_general_ci
```

Now run `aerich inspectdb -t test` to see the generated model:

```python
from tortoise import Model, fields


class Test(Model):
    date = fields.DateField(null=True, )
    datetime = fields.DatetimeField(auto_now=True, )
    decimal = fields.DecimalField(max_digits=10, decimal_places=2, )
    float = fields.FloatField(null=True, )
    id = fields.IntField(pk=True, )
    string = fields.CharField(max_length=200, null=True, )
    time = fields.TimeField(null=True, )
    tinyint = fields.BooleanField(null=True, )
```

Note that this command has limitations and cannot automatically infer some fields, such as `IntEnumField`, `ForeignKeyField`, and others.

### Multiple databases

```python
tortoise_orm = {
    "connections": {
        "default": expand_db_url(db_url, True),
        "second": expand_db_url(db_url_second, True),
    },
    "apps": {
        "models": {"models": ["tests.models", "aerich.models"], "default_connection": "default"},
        "models_second": {"models": ["tests.models_second"], "default_connection": "second", },
    },
}
```

You only need to specify `aerich.models` in one app, and you must pass `--app` when running `aerich migrate` and other commands.

## Restore aerich workflow

In some cases, e.g. when something goes wrong after upgrading `aerich`, you cannot run `aerich migrate` or `aerich upgrade`. In that case you can do the following:

1. Drop the `aerich` table.
2. Delete the `migrations/{app}` directory.
3. Rerun `aerich init-db`.

Note that these actions are safe, and you can also use them to reset your migrations if you have too many migration files.

## Use aerich in an application

You can use `aerich` outside the command line via the `Command` class.

```python
from aerich import Command

command = Command(tortoise_config=config, app='models')
await command.init()
await command.migrate('test')
```
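
As a minimal end-to-end sketch (the sqlite URL and the `tests.models` module below are placeholder assumptions), the same `Command` API can also apply the generated migration; `run_in_transaction` mirrors the `-i/--in-transaction` CLI option added in this changeset:

```python
import asyncio

from aerich import Command

# Placeholder config: same shape as the TORTOISE_ORM dict in the Usage section.
TORTOISE_ORM = {
    "connections": {"default": "sqlite://db.sqlite3"},
    "apps": {
        "models": {
            "models": ["tests.models", "aerich.models"],
            "default_connection": "default",
        },
    },
}


async def main():
    command = Command(tortoise_config=TORTOISE_ORM, app="models")
    await command.init()
    # Generate a migration named "test", then apply all pending versions.
    await command.migrate("test")
    migrated = await command.upgrade(run_in_transaction=True)
    print(migrated)  # e.g. a list like ["1_20221031114145_test.py"]


asyncio.run(main())
```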

## License

This project is licensed under the [Apache-2.0](https://github.com/long2ice/aerich/blob/master/LICENSE) License.
@@ -8,17 +8,16 @@ from tortoise.transactions import in_transaction
 from tortoise.utils import get_schema_sql
 
 from aerich.exceptions import DowngradeError
-from aerich.inspect.mysql import InspectMySQL
-from aerich.inspect.postgres import InspectPostgres
-from aerich.inspect.sqlite import InspectSQLite
-from aerich.migrate import Migrate
+from aerich.inspectdb.mysql import InspectMySQL
+from aerich.inspectdb.postgres import InspectPostgres
+from aerich.inspectdb.sqlite import InspectSQLite
+from aerich.migrate import MIGRATE_TEMPLATE, Migrate
 from aerich.models import Aerich
 from aerich.utils import (
     get_app_connection,
     get_app_connection_name,
     get_models_describe,
-    get_version_content_from_file,
-    write_version_file,
+    import_py_file,
 )
@@ -37,7 +36,18 @@ class Command:
     async def init(self):
         await Migrate.init(self.tortoise_config, self.app, self.location)
 
-    async def upgrade(self):
+    async def _upgrade(self, conn, version_file):
+        file_path = Path(Migrate.migrate_location, version_file)
+        m = import_py_file(file_path)
+        upgrade = getattr(m, "upgrade")
+        await conn.execute_script(await upgrade(conn))
+        await Aerich.create(
+            version=version_file,
+            app=self.app,
+            content=get_models_describe(self.app),
+        )
+
+    async def upgrade(self, run_in_transaction: bool = True):
         migrated = []
         for version_file in Migrate.get_all_version_files():
             try:
@@ -45,19 +55,13 @@
             except OperationalError:
                 exists = False
             if not exists:
-                async with in_transaction(
-                    get_app_connection_name(self.tortoise_config, self.app)
-                ) as conn:
-                    file_path = Path(Migrate.migrate_location, version_file)
-                    content = get_version_content_from_file(file_path)
-                    upgrade_query_list = content.get("upgrade")
-                    for upgrade_query in upgrade_query_list:
-                        await conn.execute_script(upgrade_query)
-                    await Aerich.create(
-                        version=version_file,
-                        app=self.app,
-                        content=get_models_describe(self.app),
-                    )
+                app_conn_name = get_app_connection_name(self.tortoise_config, self.app)
+                if run_in_transaction:
+                    async with in_transaction(app_conn_name) as conn:
+                        await self._upgrade(conn, version_file)
+                else:
+                    app_conn = get_app_connection(self.tortoise_config, self.app)
+                    await self._upgrade(app_conn, version_file)
                 migrated.append(version_file)
         return migrated
@@ -81,12 +85,12 @@
             async with in_transaction(
                 get_app_connection_name(self.tortoise_config, self.app)
             ) as conn:
                 file_path = Path(Migrate.migrate_location, file)
-                content = get_version_content_from_file(file_path)
-                downgrade_query_list = content.get("downgrade")
-                if not downgrade_query_list:
+                m = import_py_file(file_path)
+                downgrade = getattr(m, "downgrade")
+                downgrade_sql = await downgrade(conn)
+                if not downgrade_sql.strip():
                     raise DowngradeError("No downgrade items found")
-                for downgrade_query in downgrade_query_list:
-                    await conn.execute_query(downgrade_query)
+                await conn.execute_script(downgrade_sql)
                 await version.delete()
                 if delete:
                     os.unlink(file_path)
@@ -102,11 +106,8 @@
         return ret
 
     async def history(self):
-        ret = []
         versions = Migrate.get_all_version_files()
-        for version in versions:
-            ret.append(version)
-        return ret
+        return [version for version in versions]
 
     async def inspectdb(self, tables: List[str] = None) -> str:
         connection = get_app_connection(self.tortoise_config, self.app)
@@ -143,7 +144,7 @@
             app=app,
             content=get_models_describe(app),
         )
-        content = {
-            "upgrade": [schema],
-        }
-        write_version_file(Path(dirname, version), content)
+        version_file = Path(dirname, version)
+        content = MIGRATE_TEMPLATE.format(upgrade_sql=schema, downgrade_sql="")
+        with open(version_file, "w", encoding="utf-8") as f:
+            f.write(content)
@@ -26,11 +26,11 @@ def coro(f):
     def wrapper(*args, **kwargs):
         loop = asyncio.get_event_loop()
 
-        # Close db connections at the end of all all but the cli group function
+        # Close db connections at the end of all but the cli group function
         try:
             loop.run_until_complete(f(*args, **kwargs))
         finally:
             if Tortoise._inited:
                 if f.__name__ not in ["cli", "init"]:
                     loop.run_until_complete(Tortoise.close_connections())
 
     return wrapper
@@ -54,10 +54,10 @@ async def cli(ctx: Context, config, app):
 
     invoked_subcommand = ctx.invoked_subcommand
     if invoked_subcommand != "init":
-        if not Path(config).exists():
+        config_path = Path(config)
+        if not config_path.exists():
             raise UsageError("You must exec init first", ctx=ctx)
-        with open(config, "r") as f:
-            content = f.read()
+        content = config_path.read_text()
         doc = tomlkit.parse(content)
         try:
             tool = doc["tool"]["aerich"]
@@ -90,11 +90,18 @@ async def migrate(ctx: Context, name):
 
 
 @cli.command(help="Upgrade to specified version.")
+@click.option(
+    "--in-transaction",
+    "-i",
+    default=True,
+    type=bool,
+    help="Make migrations in transaction or not. Can be helpful for large migrations or creating concurrent indexes.",
+)
 @click.pass_context
 @coro
-async def upgrade(ctx: Context):
+async def upgrade(ctx: Context, in_transaction: bool):
     command = ctx.obj["command"]
-    migrated = await command.upgrade()
+    migrated = await command.upgrade(run_in_transaction=in_transaction)
     if not migrated:
         click.secho("No upgrade items found", fg=Color.yellow)
     else:
@@ -192,9 +199,9 @@ async def init(ctx: Context, tortoise_orm, location, src_folder):
     # check that we can find the configuration, if not we can fail before the config file gets created
     add_src_path(src_folder)
     get_tortoise_config(ctx, tortoise_orm)
-    if Path(config_file).exists():
-        with open(config_file, "r") as f:
-            content = f.read()
+    config_path = Path(config_file)
+    if config_path.exists():
+        content = config_path.read_text()
         doc = tomlkit.parse(content)
     else:
         doc = tomlkit.parse("[tool.aerich]")
@@ -204,8 +211,7 @@ async def init(ctx: Context, tortoise_orm, location, src_folder):
     table["src_folder"] = src_folder
     doc["tool"]["aerich"] = table
 
-    with open(config_file, "w") as f:
-        f.write(tomlkit.dumps(doc))
+    config_path.write_text(tomlkit.dumps(doc))
 
     Path(location).mkdir(parents=True, exist_ok=True)
@@ -215,15 +221,17 @@ async def init(ctx: Context, tortoise_orm, location, src_folder):
 
 @cli.command(help="Generate schema and generate app migrate location.")
 @click.option(
     "-s",
     "--safe",
     type=bool,
+    is_flag=True,
     default=True,
     help="When set to true, creates the table only when it does not already exist.",
+    show_default=True,
 )
 @click.pass_context
 @coro
-async def init_db(ctx: Context, safe):
+async def init_db(ctx: Context, safe: bool):
     command = ctx.obj["command"]
     app = command.app
     dirname = Path(command.location, app)
@@ -23,7 +23,12 @@ class BaseDDL:
     _DROP_INDEX_TEMPLATE = 'ALTER TABLE "{table_name}" DROP INDEX "{index_name}"'
     _ADD_FK_TEMPLATE = 'ALTER TABLE "{table_name}" ADD CONSTRAINT "{fk_name}" FOREIGN KEY ("{db_column}") REFERENCES "{table}" ("{field}") ON DELETE {on_delete}'
     _DROP_FK_TEMPLATE = 'ALTER TABLE "{table_name}" DROP FOREIGN KEY "{fk_name}"'
-    _M2M_TABLE_TEMPLATE = 'CREATE TABLE "{table_name}" ("{backward_key}" {backward_type} NOT NULL REFERENCES "{backward_table}" ("{backward_field}") ON DELETE CASCADE,"{forward_key}" {forward_type} NOT NULL REFERENCES "{forward_table}" ("{forward_field}") ON DELETE {on_delete}){extra}{comment}'
+    _M2M_TABLE_TEMPLATE = (
+        'CREATE TABLE "{table_name}" (\n'
+        '    "{backward_key}" {backward_type} NOT NULL REFERENCES "{backward_table}" ("{backward_field}") ON DELETE CASCADE,\n'
+        '    "{forward_key}" {forward_type} NOT NULL REFERENCES "{forward_table}" ("{forward_field}") ON DELETE {on_delete}\n'
+        "){extra}{comment}"
+    )
     _MODIFY_COLUMN_TEMPLATE = 'ALTER TABLE "{table_name}" MODIFY COLUMN {column}'
     _CHANGE_COLUMN_TEMPLATE = (
         'ALTER TABLE "{table_name}" CHANGE {old_column_name} {new_column_name} {new_column_type}'
@@ -35,7 +40,9 @@ class BaseDDL:
         self.schema_generator = self.schema_generator_cls(client)
 
     def create_table(self, model: "Type[Model]"):
-        return self.schema_generator._get_table_sql(model, True)["table_creation_string"]
+        return self.schema_generator._get_table_sql(model, True)["table_creation_string"].rstrip(
+            ";"
+        )
 
     def drop_table(self, table_name: str):
         return self._DROP_TABLE_TEMPLATE.format(table_name=table_name)
@@ -180,7 +187,7 @@ class BaseDDL:
                 "idx" if not unique else "uid", model, field_names
             ),
             table_name=model._meta.db_table,
-            column_names=", ".join([self.schema_generator.quote(f) for f in field_names]),
+            column_names=", ".join(self.schema_generator.quote(f) for f in field_names),
         )
 
     def drop_index(self, model: "Type[Model]", field_names: List[str], unique=False):
@@ -22,6 +22,11 @@ class MysqlDDL(BaseDDL):
     _DROP_INDEX_TEMPLATE = "ALTER TABLE `{table_name}` DROP INDEX `{index_name}`"
     _ADD_FK_TEMPLATE = "ALTER TABLE `{table_name}` ADD CONSTRAINT `{fk_name}` FOREIGN KEY (`{db_column}`) REFERENCES `{table}` (`{field}`) ON DELETE {on_delete}"
     _DROP_FK_TEMPLATE = "ALTER TABLE `{table_name}` DROP FOREIGN KEY `{fk_name}`"
-    _M2M_TABLE_TEMPLATE = "CREATE TABLE `{table_name}` (`{backward_key}` {backward_type} NOT NULL REFERENCES `{backward_table}` (`{backward_field}`) ON DELETE CASCADE,`{forward_key}` {forward_type} NOT NULL REFERENCES `{forward_table}` (`{forward_field}`) ON DELETE CASCADE){extra}{comment}"
+    _M2M_TABLE_TEMPLATE = (
+        "CREATE TABLE `{table_name}` (\n"
+        "    `{backward_key}` {backward_type} NOT NULL REFERENCES `{backward_table}` (`{backward_field}`) ON DELETE CASCADE,\n"
+        "    `{forward_key}` {forward_type} NOT NULL REFERENCES `{forward_table}` (`{forward_field}`) ON DELETE CASCADE\n"
+        "){extra}{comment}"
+    )
     _MODIFY_COLUMN_TEMPLATE = "ALTER TABLE `{table_name}` MODIFY COLUMN {column}"
     _RENAME_TABLE_TEMPLATE = "ALTER TABLE `{old_table_name}` RENAME TO `{new_table_name}`"
@@ -30,8 +30,13 @@ class Column(BaseModel):
             index = "index=True, "
         if self.data_type in ["varchar", "VARCHAR"]:
             length = f"max_length={self.length}, "
-        if self.data_type == "decimal":
-            length = f"max_digits={self.max_digits}, decimal_places={self.decimal_places}, "
+        if self.data_type in ["decimal", "numeric"]:
+            length_parts = []
+            if self.max_digits:
+                length_parts.append(f"max_digits={self.max_digits}")
+            if self.decimal_places:
+                length_parts.append(f"decimal_places={self.decimal_places}")
+            length = ", ".join(length_parts)
         if self.null:
             null = "null=True, "
         if self.default is not None:
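
This new branch backs the CHANGELOG's `postgresql::numeric` item: a Postgres `numeric` column may omit precision and scale, so each keyword is emitted only when present. A hypothetical standalone re-implementation, for illustration only:

```python
def decimal_length(data_type: str, max_digits=None, decimal_places=None) -> str:
    # Mirrors the new branch: emit only the parts the column actually has.
    if data_type in ["decimal", "numeric"]:
        parts = []
        if max_digits:
            parts.append(f"max_digits={max_digits}")
        if decimal_places:
            parts.append(f"decimal_places={decimal_places}")
        return ", ".join(parts)
    return ""


print(decimal_length("numeric", 10, 2))  # max_digits=10, decimal_places=2
print(decimal_length("numeric"))         # "" -- unconstrained postgres numeric
```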
@@ -1,6 +1,6 @@
 from typing import List
 
-from aerich.inspect import Column, Inspect
+from aerich.inspectdb import Column, Inspect
 
 
 class InspectMySQL(Inspect):
@@ -2,7 +2,7 @@ from typing import List, Optional
 
 from tortoise import BaseDBAsyncClient
 
-from aerich.inspect import Column, Inspect
+from aerich.inspectdb import Column, Inspect
 
 
 class InspectPostgres(Inspect):
@@ -25,6 +25,7 @@ class InspectPostgres(Inspect):
             "date": self.date_field,
             "time": self.time_field,
             "decimal": self.decimal_field,
+            "numeric": self.decimal_field,
             "uuid": self.uuid_field,
             "jsonb": self.json_field,
             "bytea": self.binary_field,
@@ -1,6 +1,6 @@
 from typing import List
 
-from aerich.inspect import Column, Inspect
+from aerich.inspectdb import Column, Inspect
 
 
 class InspectSQLite(Inspect):
@@ -1,3 +1,4 @@
+import importlib
 import os
 from datetime import datetime
 from hashlib import md5
@@ -12,12 +13,20 @@ from tortoise.indexes import Index
 
 from aerich.ddl import BaseDDL
 from aerich.models import MAX_VERSION_LENGTH, Aerich
-from aerich.utils import (
-    get_app_connection,
-    get_models_describe,
-    is_default_function,
-    write_version_file,
-)
+from aerich.utils import get_app_connection, get_models_describe, is_default_function
+
+MIGRATE_TEMPLATE = """from tortoise import BaseDBAsyncClient
+
+
+async def upgrade(db: BaseDBAsyncClient) -> str:
+    return \"\"\"
+        {upgrade_sql}\"\"\"
+
+
+async def downgrade(db: BaseDBAsyncClient) -> str:
+    return \"\"\"
+        {downgrade_sql}\"\"\"
+"""
 
 
 class Migrate:
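
For illustration, a version file rendered from this template looks roughly like the following once `_generate_diff_py` (later in this diff) fills `{upgrade_sql}` and `{downgrade_sql}` with the collected operators; the ALTER statements here are hypothetical:

```python
from tortoise import BaseDBAsyncClient


async def upgrade(db: BaseDBAsyncClient) -> str:
    return """
        ALTER TABLE `user` ADD `age` INT NOT NULL DEFAULT 0;"""


async def downgrade(db: BaseDBAsyncClient) -> str:
    return """
        ALTER TABLE `user` DROP COLUMN `age`;"""
```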
@@ -41,7 +50,7 @@ class Migrate:
     @classmethod
     def get_all_version_files(cls) -> List[str]:
         return sorted(
-            filter(lambda x: x.endswith("sql"), os.listdir(cls.migrate_location)),
+            filter(lambda x: x.endswith("py"), os.listdir(cls.migrate_location)),
             key=lambda x: int(x.split("_")[0]),
         )
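
The numeric key matters because a plain lexicographic sort would place `10_...` before `2_...`; the `test_sort_all_version_files` change at the bottom of this diff exercises exactly this. A one-line illustration with made-up filenames:

```python
files = ["1_x_update.py", "11_x_update.py", "10_x_update.py", "2_x_update.py"]
# Sort by the leading version number, not by string order.
print(sorted(files, key=lambda x: int(x.split("_")[0])))
# ['1_x_update.py', '2_x_update.py', '10_x_update.py', '11_x_update.py']
```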
@@ -63,6 +72,11 @@
         ret = await connection.execute_query(sql)
         cls._db_version = ret[1][0].get("version")
 
+    @classmethod
+    async def load_ddl_class(cls):
+        ddl_dialect_module = importlib.import_module(f"aerich.ddl.{cls.dialect}")
+        return getattr(ddl_dialect_module, f"{cls.dialect.capitalize()}DDL")
+
     @classmethod
     async def init(cls, config: dict, app: str, location: str):
         await Tortoise.init(config=config)
@@ -74,18 +88,8 @@
 
         connection = get_app_connection(config, app)
         cls.dialect = connection.schema_generator.DIALECT
-        if cls.dialect == "mysql":
-            from aerich.ddl.mysql import MysqlDDL
-
-            cls.ddl = MysqlDDL(connection)
-        elif cls.dialect == "sqlite":
-            from aerich.ddl.sqlite import SqliteDDL
-
-            cls.ddl = SqliteDDL(connection)
-        elif cls.dialect == "postgres":
-            from aerich.ddl.postgres import PostgresDDL
-
-            cls.ddl = PostgresDDL(connection)
+        cls.ddl_class = await cls.load_ddl_class()
+        cls.ddl = cls.ddl_class(connection)
         await cls._get_db_version(connection)
 
     @classmethod
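
These two hunks implement the CHANGELOG's "dynamically load DDL classes" item: the dialect-specific if/elif chain is gone, and any module `aerich.ddl.<dialect>` exposing a `<Dialect>DDL` class is picked up by name. A standalone sketch of the lookup rule (`resolve_ddl_class` is a hypothetical helper name):

```python
import importlib


def resolve_ddl_class(dialect: str):
    # "mysql"    -> module aerich.ddl.mysql,    class MysqlDDL
    # "postgres" -> module aerich.ddl.postgres, class PostgresDDL
    module = importlib.import_module(f"aerich.ddl.{dialect}")
    return getattr(module, f"{dialect.capitalize()}DDL")


print(resolve_ddl_class("sqlite").__name__)  # SqliteDDL
```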
@@ -101,24 +105,28 @@
         now = datetime.now().strftime("%Y%m%d%H%M%S").replace("/", "")
         last_version_num = await cls._get_last_version_num()
         if last_version_num is None:
-            return f"0_{now}_init.sql"
-        version = f"{last_version_num + 1}_{now}_{name}.sql"
+            return f"0_{now}_init.py"
+        version = f"{last_version_num + 1}_{now}_{name}.py"
         if len(version) > MAX_VERSION_LENGTH:
             raise ValueError(f"Version name exceeds maximum length ({MAX_VERSION_LENGTH})")
         return version
 
     @classmethod
-    async def _generate_diff_sql(cls, name):
+    async def _generate_diff_py(cls, name):
         version = await cls.generate_version(name)
         # delete if same version exists
         for version_file in cls.get_all_version_files():
             if version_file.startswith(version.split("_")[0]):
                 os.unlink(Path(cls.migrate_location, version_file))
-        content = {
-            "upgrade": list(dict.fromkeys(cls.upgrade_operators)),
-            "downgrade": list(dict.fromkeys(cls.downgrade_operators)),
-        }
-        write_version_file(Path(cls.migrate_location, version), content)
+
+        version_file = Path(cls.migrate_location, version)
+        content = MIGRATE_TEMPLATE.format(
+            upgrade_sql=";\n        ".join(cls.upgrade_operators) + ";",
+            downgrade_sql=";\n        ".join(cls.downgrade_operators) + ";",
+        )
+
+        with open(version_file, "w", encoding="utf-8") as f:
+            f.write(content)
         return version
 
     @classmethod
@@ -137,7 +145,7 @@
         if not cls.upgrade_operators:
             return ""
 
-        return await cls._generate_diff_sql(name)
+        return await cls._generate_diff_py(name)
 
     @classmethod
     def _add_operator(cls, operator: str, upgrade=True, fk_m2m_index=False):
@@ -148,6 +156,7 @@
         :param fk_m2m_index:
         :return:
         """
+        operator = operator.rstrip(";")
         if upgrade:
             if fk_m2m_index:
                 cls._upgrade_fk_m2m_index_operators.append(operator)
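
The new `rstrip(";")` addresses the CHANGELOG's #301 ("two semicolons in a row"): `_generate_diff_py` above joins operators with `";\n        "` and appends a final `";"`, so an operator that already ends in `;` would otherwise yield `;;`. A toy illustration with made-up statements:

```python
operators = [
    "ALTER TABLE `user` ADD `age` INT NOT NULL;",  # trailing ";" from the DDL layer
    "ALTER TABLE `user` DROP COLUMN `avatar`",
]
cleaned = [op.rstrip(";") for op in operators]
upgrade_sql = ";\n        ".join(cleaned) + ";"
print(upgrade_sql)  # exactly one ";" terminates each statement
```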
@@ -273,8 +282,18 @@
         # remove indexes
         for index in old_indexes.difference(new_indexes):
             cls._add_operator(cls._drop_index(model, index, False), upgrade, True)
-        old_data_fields = old_model_describe.get("data_fields")
-        new_data_fields = new_model_describe.get("data_fields")
+        old_data_fields = list(
+            filter(
+                lambda x: x.get("db_field_types") is not None,
+                old_model_describe.get("data_fields"),
+            )
+        )
+        new_data_fields = list(
+            filter(
+                lambda x: x.get("db_field_types") is not None,
+                new_model_describe.get("data_fields"),
+            )
+        )
 
         old_data_fields_name = list(map(lambda x: x.get("name"), old_data_fields))
         new_data_fields_name = list(map(lambda x: x.get("name"), new_data_fields))
@@ -347,26 +366,44 @@
                 ),
                 upgrade,
             )
+            if new_data_field["indexed"]:
+                cls._add_operator(
+                    cls._add_index(
+                        model, {new_data_field["db_column"]}, new_data_field["unique"]
+                    ),
+                    upgrade,
+                    True,
+                )
         # remove fields
         for old_data_field_name in set(old_data_fields_name).difference(
             set(new_data_fields_name)
         ):
-            # don't remove field if is rename
+            # don't remove field if is renamed
            if (upgrade and old_data_field_name in cls._rename_old) or (
                not upgrade and old_data_field_name in cls._rename_new
            ):
                continue
+            old_data_field = next(
+                filter(lambda x: x.get("name") == old_data_field_name, old_data_fields)
+            )
+            db_column = old_data_field["db_column"]
             cls._add_operator(
                 cls._remove_field(
                     model,
-                    next(
-                        filter(
-                            lambda x: x.get("name") == old_data_field_name, old_data_fields
-                        )
-                    ).get("db_column"),
+                    db_column,
                 ),
                 upgrade,
             )
+            if old_data_field["indexed"]:
+                cls._add_operator(
+                    cls._drop_index(
+                        model,
+                        {db_column},
+                    ),
+                    upgrade,
+                    True,
+                )
 
         old_fk_fields = old_model_describe.get("fk_fields")
         new_fk_fields = new_model_describe.get("fk_fields")
@@ -412,6 +449,7 @@
                 filter(lambda x: x.get("name") == field_name, new_data_fields)
             )
             changes = diff(old_data_field, new_data_field)
+            modified = False
             for change in changes:
                 _, option, old_new = change
                 if option == "indexed":
@@ -426,8 +464,14 @@
                         cls._drop_index(model, (field_name,), unique), upgrade, True
                     )
                 elif option == "db_field_types.":
-                    # continue since repeated with others
-                    continue
+                    if new_data_field.get("field_type") == "DecimalField":
+                        # modify column
+                        cls._add_operator(
+                            cls._modify_field(model, new_data_field),
+                            upgrade,
+                        )
+                    else:
+                        continue
                 elif option == "default":
                     if not (
                         is_default_function(old_new[0]) or is_default_function(old_new[1])
@@ -443,11 +487,14 @@
                     # change nullable
                     cls._add_operator(cls._alter_null(model, new_data_field), upgrade)
                 else:
+                    if modified:
+                        continue
                     # modify column
                     cls._add_operator(
                         cls._modify_field(model, new_data_field),
                         upgrade,
                     )
+                    modified = True
 
         for old_model in old_models:
             if old_model not in new_models.keys():
@@ -1,9 +1,9 @@
-import importlib
+import importlib.util
 import os
 import re
 import sys
 from pathlib import Path
-from typing import Dict, Union
+from typing import Dict
 
 from click import BadOptionUsage, ClickException, Context
 from tortoise import BaseDBAsyncClient, Tortoise
@@ -11,7 +11,7 @@ from tortoise import BaseDBAsyncClient, Tortoise
 
 def add_src_path(path: str) -> str:
     """
-    add a folder to the paths so we can import from there
+    add a folder to the paths, so we can import from there
     :param path: path to add
     :return: absolute path
     """
@@ -77,60 +77,6 @@ def get_tortoise_config(ctx: Context, tortoise_orm: str) -> dict:
     return config
 
 
-_UPGRADE = "-- upgrade --\n"
-_DOWNGRADE = "-- downgrade --\n"
-
-
-def get_version_content_from_file(version_file: Union[str, Path]) -> Dict:
-    """
-    get version content
-    :param version_file:
-    :return:
-    """
-    with open(version_file, "r", encoding="utf-8") as f:
-        content = f.read()
-        first = content.index(_UPGRADE)
-        try:
-            second = content.index(_DOWNGRADE)
-        except ValueError:
-            second = len(content) - 1
-        upgrade_content = content[first + len(_UPGRADE) : second].strip()  # noqa:E203
-        downgrade_content = content[second + len(_DOWNGRADE) :].strip()  # noqa:E203
-        ret = {
-            "upgrade": list(filter(lambda x: x or False, upgrade_content.split(";\n"))),
-            "downgrade": list(filter(lambda x: x or False, downgrade_content.split(";\n"))),
-        }
-        return ret
-
-
-def write_version_file(version_file: Path, content: Dict):
-    """
-    write version file
-    :param version_file:
-    :param content:
-    :return:
-    """
-    with open(version_file, "w", encoding="utf-8") as f:
-        f.write(_UPGRADE)
-        upgrade = content.get("upgrade")
-        if len(upgrade) > 1:
-            f.write(";\n".join(upgrade))
-            if not upgrade[-1].endswith(";"):
-                f.write(";\n")
-        else:
-            f.write(f"{upgrade[0]}")
-            if not upgrade[0].endswith(";"):
-                f.write(";")
-            f.write("\n")
-        downgrade = content.get("downgrade")
-        if downgrade:
-            f.write(_DOWNGRADE)
-            if len(downgrade) > 1:
-                f.write(";\n".join(downgrade) + ";\n")
-            else:
-                f.write(f"{downgrade[0]};\n")
-
-
 def get_models_describe(app: str) -> Dict:
     """
     get app models describe
@@ -146,3 +92,11 @@ def get_models_describe(app: str) -> Dict:
 
 def is_default_function(string: str):
     return re.match(r"^<function.+>$", str(string or ""))
+
+
+def import_py_file(file: Path):
+    module_name, file_ext = os.path.splitext(os.path.split(file)[-1])
+    spec = importlib.util.spec_from_file_location(module_name, file)
+    module = importlib.util.module_from_spec(spec)
+    spec.loader.exec_module(module)
+    return module
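
A short sketch of how this utility is used by `Command._upgrade` earlier in this diff: load a version file as a module, call its `upgrade` coroutine, and execute the returned SQL. The `migrations/models` path below is an illustrative assumption; `Command` actually uses `Migrate.migrate_location`:

```python
from pathlib import Path

from aerich.utils import import_py_file


async def apply_version(conn, version_file: str):
    # Load e.g. migrations/models/1_20221031114145_test.py as a module.
    module = import_py_file(Path("migrations/models", version_file))
    upgrade = getattr(module, "upgrade")
    sql = await upgrade(conn)
    await conn.execute_script(sql)
```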
@@ -1 +1 @@
-__version__ = "0.6.3"
+__version__ = "0.7.2"
poetry.lock (generated) | 1308

File diff suppressed because it is too large
@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "aerich"
-version = "0.6.3"
+version = "0.7.2"
 description = "A database migrations tool for Tortoise ORM."
 authors = ["long2ice <long2ice@gmail.com>"]
 license = "Apache-2.0"
@@ -19,13 +19,13 @@ python = "^3.7"
 tortoise-orm = "*"
 click = "*"
 asyncpg = { version = "*", optional = true }
-asyncmy = { version = "*", optional = true }
+asyncmy = { version = "^0.2.8rc1", optional = true, allow-prereleases = true }
 pydantic = "*"
 dictdiffer = "*"
 tomlkit = "*"
 
 [tool.poetry.dev-dependencies]
-flake8 = "*"
+ruff = "*"
 isort = "*"
 black = "*"
 pytest = "*"
@@ -34,7 +34,6 @@ pytest-asyncio = "*"
 bandit = "*"
 pytest-mock = "*"
 cryptography = "*"
-pyproject-flake8 = "*"
 
 [tool.poetry.extras]
 asyncmy = ["asyncmy"]
@@ -63,5 +62,5 @@ asyncio_mode = 'auto'
 pretty = true
 ignore_missing_imports = true
 
-[tool.flake8]
-ignore = 'E501,W503,E203'
+[tool.ruff]
+ignore = ['E501']
@@ -29,6 +29,7 @@ class User(Model):
     is_active = fields.BooleanField(default=True, description="Is Active")
     is_superuser = fields.BooleanField(default=False, description="Is SuperUser")
     intro = fields.TextField(default="")
+    longitude = fields.DecimalField(max_digits=10, decimal_places=8)
 
 
 class Email(Model):
@@ -29,6 +29,7 @@ class User(Model):
     is_superuser = fields.BooleanField(default=False, description="Is SuperUser")
     avatar = fields.CharField(max_length=200, default="")
     intro = fields.TextField(default="")
+    longitude = fields.DecimalField(max_digits=12, decimal_places=9)
 
 
 class Email(Model):
@@ -17,7 +17,7 @@ def test_create_table():
     `created_at` DATETIME(6) NOT NULL DEFAULT CURRENT_TIMESTAMP(6),
     `user_id` INT NOT NULL COMMENT 'User',
     CONSTRAINT `fk_category_user_e2e3874c` FOREIGN KEY (`user_id`) REFERENCES `user` (`id`) ON DELETE CASCADE
-) CHARACTER SET utf8mb4;"""
+) CHARACTER SET utf8mb4"""
         )
 
     elif isinstance(Migrate.ddl, SqliteDDL):
@@ -29,7 +29,7 @@ def test_create_table():
     "name" VARCHAR(200),
     "created_at" TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
     "user_id" INT NOT NULL REFERENCES "user" ("id") ON DELETE CASCADE /* User */
-);"""
+)"""
         )
 
     elif isinstance(Migrate.ddl, PostgresDDL):
@@ -42,7 +42,7 @@ def test_create_table():
     "created_at" TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP,
     "user_id" INT NOT NULL REFERENCES "user" ("id") ON DELETE CASCADE
 );
-COMMENT ON COLUMN "category"."user_id" IS 'User';"""
+COMMENT ON COLUMN "category"."user_id" IS 'User'"""
         )
@@ -72,18 +72,16 @@ def test_modify_column():
     ret1 = Migrate.ddl.modify_column(User, User._meta.fields_map.get("is_active").describe(False))
     if isinstance(Migrate.ddl, MysqlDDL):
         assert ret0 == "ALTER TABLE `category` MODIFY COLUMN `name` VARCHAR(200)"
-        assert (
-            ret1
-            == "ALTER TABLE `user` MODIFY COLUMN `is_active` BOOL NOT NULL COMMENT 'Is Active' DEFAULT 1"
-        )
     elif isinstance(Migrate.ddl, PostgresDDL):
         assert (
             ret0
             == 'ALTER TABLE "category" ALTER COLUMN "name" TYPE VARCHAR(200) USING "name"::VARCHAR(200)'
         )
 
+    if isinstance(Migrate.ddl, MysqlDDL):
+        assert (
+            ret1
+            == "ALTER TABLE `user` MODIFY COLUMN `is_active` BOOL NOT NULL COMMENT 'Is Active' DEFAULT 1"
+        )
+    elif isinstance(Migrate.ddl, PostgresDDL):
+        assert (
+            ret1 == 'ALTER TABLE "user" ALTER COLUMN "is_active" TYPE BOOL USING "is_active"::BOOL'
+        )
@@ -644,6 +644,21 @@ old_models_describe = {
                 "constraints": {},
                 "db_field_types": {"": "TEXT", "mysql": "LONGTEXT"},
             },
+            {
+                "name": "longitude",
+                "unique": False,
+                "default": None,
+                "indexed": False,
+                "nullable": False,
+                "db_column": "longitude",
+                "docstring": None,
+                "generated": False,
+                "field_type": "DecimalField",
+                "constraints": {},
+                "description": None,
+                "python_type": "decimal.Decimal",
+                "db_field_types": {"": "DECIMAL(12,9)", "sqlite": "VARCHAR(40)"},
+            },
         ],
         "fk_fields": [],
         "backward_fk_fields": [
@@ -787,104 +802,148 @@ def test_migrate(mocker: MockerFixture):
     Migrate.diff_models(models_describe, old_models_describe, False)
     Migrate._merge_operators()
     if isinstance(Migrate.ddl, MysqlDDL):
-        assert sorted(Migrate.upgrade_operators) == sorted(
-            [
-                "ALTER TABLE `category` MODIFY COLUMN `name` VARCHAR(200)",
-                "ALTER TABLE `category` MODIFY COLUMN `slug` VARCHAR(100) NOT NULL",
-                "ALTER TABLE `config` ADD `user_id` INT NOT NULL COMMENT 'User'",
-                "ALTER TABLE `config` ADD CONSTRAINT `fk_config_user_17daa970` FOREIGN KEY (`user_id`) REFERENCES `user` (`id`) ON DELETE CASCADE",
-                "ALTER TABLE `config` ALTER COLUMN `status` DROP DEFAULT",
-                "ALTER TABLE `email` ADD `address` VARCHAR(200) NOT NULL",
-                "ALTER TABLE `email` DROP COLUMN `user_id`",
-                "ALTER TABLE `configs` RENAME TO `config`",
-                "ALTER TABLE `product` RENAME COLUMN `image` TO `pic`",
-                "ALTER TABLE `email` RENAME COLUMN `id` TO `email_id`",
-                "ALTER TABLE `product` ADD INDEX `idx_product_name_869427` (`name`, `type_db_alias`)",
-                "ALTER TABLE `email` ADD INDEX `idx_email_email_4a1a33` (`email`)",
-                "ALTER TABLE `product` ADD UNIQUE INDEX `uid_product_name_869427` (`name`, `type_db_alias`)",
-                "ALTER TABLE `product` ALTER COLUMN `view_num` SET DEFAULT 0",
-                "ALTER TABLE `user` DROP COLUMN `avatar`",
-                "ALTER TABLE `user` MODIFY COLUMN `password` VARCHAR(100) NOT NULL",
-                "CREATE TABLE IF NOT EXISTS `newmodel` (\n    `id` INT NOT NULL PRIMARY KEY AUTO_INCREMENT,\n    `name` VARCHAR(50) NOT NULL\n) CHARACTER SET utf8mb4;",
-                "ALTER TABLE `user` ADD UNIQUE INDEX `uid_user_usernam_9987ab` (`username`)",
-                "CREATE TABLE `email_user` (`email_id` INT NOT NULL REFERENCES `email` (`email_id`) ON DELETE CASCADE,`user_id` INT NOT NULL REFERENCES `user` (`id`) ON DELETE CASCADE) CHARACTER SET utf8mb4",
-            ]
-        )
+        expected_upgrade_operators = {
+            "ALTER TABLE `category` MODIFY COLUMN `name` VARCHAR(200)",
+            "ALTER TABLE `category` MODIFY COLUMN `slug` VARCHAR(100) NOT NULL",
+            "ALTER TABLE `config` ADD `user_id` INT NOT NULL COMMENT 'User'",
+            "ALTER TABLE `config` ADD CONSTRAINT `fk_config_user_17daa970` FOREIGN KEY (`user_id`) REFERENCES `user` (`id`) ON DELETE CASCADE",
+            "ALTER TABLE `config` ALTER COLUMN `status` DROP DEFAULT",
+            "ALTER TABLE `config` MODIFY COLUMN `value` JSON NOT NULL",
+            "ALTER TABLE `email` ADD `address` VARCHAR(200) NOT NULL",
+            "ALTER TABLE `email` DROP COLUMN `user_id`",
+            "ALTER TABLE `configs` RENAME TO `config`",
+            "ALTER TABLE `product` RENAME COLUMN `image` TO `pic`",
+            "ALTER TABLE `email` RENAME COLUMN `id` TO `email_id`",
+            "ALTER TABLE `product` ADD INDEX `idx_product_name_869427` (`name`, `type_db_alias`)",
+            "ALTER TABLE `email` ADD INDEX `idx_email_email_4a1a33` (`email`)",
+            "ALTER TABLE `product` ADD UNIQUE INDEX `uid_product_name_869427` (`name`, `type_db_alias`)",
+            "ALTER TABLE `product` ALTER COLUMN `view_num` SET DEFAULT 0",
+            "ALTER TABLE `product` MODIFY COLUMN `created_at` DATETIME(6) NOT NULL DEFAULT CURRENT_TIMESTAMP(6)",
+            "ALTER TABLE `product` MODIFY COLUMN `is_reviewed` BOOL NOT NULL COMMENT 'Is Reviewed'",
+            "ALTER TABLE `user` DROP COLUMN `avatar`",
+            "ALTER TABLE `user` MODIFY COLUMN `password` VARCHAR(100) NOT NULL",
+            "ALTER TABLE `user` MODIFY COLUMN `intro` LONGTEXT NOT NULL",
+            "ALTER TABLE `user` MODIFY COLUMN `last_login` DATETIME(6) NOT NULL COMMENT 'Last Login'",
+            "ALTER TABLE `user` MODIFY COLUMN `is_active` BOOL NOT NULL COMMENT 'Is Active' DEFAULT 1",
+            "ALTER TABLE `user` MODIFY COLUMN `is_superuser` BOOL NOT NULL COMMENT 'Is SuperUser' DEFAULT 0",
+            "ALTER TABLE `user` MODIFY COLUMN `longitude` DECIMAL(10,8) NOT NULL",
+            "ALTER TABLE `user` ADD UNIQUE INDEX `uid_user_usernam_9987ab` (`username`)",
+            "CREATE TABLE `email_user` (\n    `email_id` INT NOT NULL REFERENCES `email` (`email_id`) ON DELETE CASCADE,\n    `user_id` INT NOT NULL REFERENCES `user` (`id`) ON DELETE CASCADE\n) CHARACTER SET utf8mb4",
+            "CREATE TABLE IF NOT EXISTS `newmodel` (\n    `id` INT NOT NULL PRIMARY KEY AUTO_INCREMENT,\n    `name` VARCHAR(50) NOT NULL\n) CHARACTER SET utf8mb4",
+            "ALTER TABLE `category` MODIFY COLUMN `created_at` DATETIME(6) NOT NULL DEFAULT CURRENT_TIMESTAMP(6)",
+            "ALTER TABLE `product` MODIFY COLUMN `body` LONGTEXT NOT NULL",
+            "ALTER TABLE `email` MODIFY COLUMN `is_primary` BOOL NOT NULL DEFAULT 0",
+        }
+        expected_downgrade_operators = {
+            "ALTER TABLE `category` MODIFY COLUMN `name` VARCHAR(200) NOT NULL",
+            "ALTER TABLE `category` MODIFY COLUMN `slug` VARCHAR(200) NOT NULL",
+            "ALTER TABLE `config` DROP COLUMN `user_id`",
+            "ALTER TABLE `config` DROP FOREIGN KEY `fk_config_user_17daa970`",
+            "ALTER TABLE `config` ALTER COLUMN `status` SET DEFAULT 1",
+            "ALTER TABLE `email` ADD `user_id` INT NOT NULL",
+            "ALTER TABLE `email` DROP COLUMN `address`",
+            "ALTER TABLE `config` RENAME TO `configs`",
+            "ALTER TABLE `product` RENAME COLUMN `pic` TO `image`",
+            "ALTER TABLE `email` RENAME COLUMN `email_id` TO `id`",
+            "ALTER TABLE `product` DROP INDEX `idx_product_name_869427`",
+            "ALTER TABLE `email` DROP INDEX `idx_email_email_4a1a33`",
+            "ALTER TABLE `product` DROP INDEX `uid_product_name_869427`",
+            "ALTER TABLE `product` ALTER COLUMN `view_num` DROP DEFAULT",
+            "ALTER TABLE `user` ADD `avatar` VARCHAR(200) NOT NULL DEFAULT ''",
+            "ALTER TABLE `user` DROP INDEX `idx_user_usernam_9987ab`",
+            "ALTER TABLE `user` MODIFY COLUMN `password` VARCHAR(200) NOT NULL",
+            "DROP TABLE IF EXISTS `email_user`",
+            "DROP TABLE IF EXISTS `newmodel`",
+            "ALTER TABLE `user` MODIFY COLUMN `intro` LONGTEXT NOT NULL",
+            "ALTER TABLE `config` MODIFY COLUMN `value` TEXT NOT NULL",
+            "ALTER TABLE `category` MODIFY COLUMN `created_at` DATETIME(6) NOT NULL DEFAULT CURRENT_TIMESTAMP(6)",
+            "ALTER TABLE `product` MODIFY COLUMN `created_at` DATETIME(6) NOT NULL DEFAULT CURRENT_TIMESTAMP(6)",
+            "ALTER TABLE `product` MODIFY COLUMN `is_reviewed` BOOL NOT NULL COMMENT 'Is Reviewed'",
+            "ALTER TABLE `user` MODIFY COLUMN `last_login` DATETIME(6) NOT NULL COMMENT 'Last Login'",
+            "ALTER TABLE `user` MODIFY COLUMN `is_active` BOOL NOT NULL COMMENT 'Is Active' DEFAULT 1",
+            "ALTER TABLE `user` MODIFY COLUMN `is_superuser` BOOL NOT NULL COMMENT 'Is SuperUser' DEFAULT 0",
+            "ALTER TABLE `user` MODIFY COLUMN `longitude` DECIMAL(12,9) NOT NULL",
+            "ALTER TABLE `product` MODIFY COLUMN `body` LONGTEXT NOT NULL",
+            "ALTER TABLE `email` MODIFY COLUMN `is_primary` BOOL NOT NULL DEFAULT 0",
+        }
+        assert not set(Migrate.upgrade_operators).symmetric_difference(expected_upgrade_operators)
 
-        assert sorted(Migrate.downgrade_operators) == sorted(
-            [
-                "ALTER TABLE `category` MODIFY COLUMN `name` VARCHAR(200) NOT NULL",
-                "ALTER TABLE `category` MODIFY COLUMN `slug` VARCHAR(200) NOT NULL",
-                "ALTER TABLE `config` DROP COLUMN `user_id`",
-                "ALTER TABLE `config` DROP FOREIGN KEY `fk_config_user_17daa970`",
-                "ALTER TABLE `config` ALTER COLUMN `status` SET DEFAULT 1",
-                "ALTER TABLE `email` ADD `user_id` INT NOT NULL",
-                "ALTER TABLE `email` DROP COLUMN `address`",
-                "ALTER TABLE `config` RENAME TO `configs`",
-                "ALTER TABLE `product` RENAME COLUMN `pic` TO `image`",
-                "ALTER TABLE `email` RENAME COLUMN `email_id` TO `id`",
-                "ALTER TABLE `product` DROP INDEX `idx_product_name_869427`",
-                "ALTER TABLE `email` DROP INDEX `idx_email_email_4a1a33`",
-                "ALTER TABLE `product` DROP INDEX `uid_product_name_869427`",
-                "ALTER TABLE `product` ALTER COLUMN `view_num` DROP DEFAULT",
-                "ALTER TABLE `user` ADD `avatar` VARCHAR(200) NOT NULL DEFAULT ''",
-                "ALTER TABLE `user` DROP INDEX `idx_user_usernam_9987ab`",
-                "ALTER TABLE `user` MODIFY COLUMN `password` VARCHAR(200) NOT NULL",
-                "DROP TABLE IF EXISTS `email_user`",
-                "DROP TABLE IF EXISTS `newmodel`",
-            ]
+        assert not set(Migrate.downgrade_operators).symmetric_difference(
+            expected_downgrade_operators
         )
 
     elif isinstance(Migrate.ddl, PostgresDDL):
-        assert sorted(Migrate.upgrade_operators) == sorted(
-            [
-                'ALTER TABLE "category" ALTER COLUMN "name" DROP NOT NULL',
-                'ALTER TABLE "category" ALTER COLUMN "slug" TYPE VARCHAR(100) USING "slug"::VARCHAR(100)',
-                'ALTER TABLE "config" ADD "user_id" INT NOT NULL',
-                'ALTER TABLE "config" ADD CONSTRAINT "fk_config_user_17daa970" FOREIGN KEY ("user_id") REFERENCES "user" ("id") ON DELETE CASCADE',
-                'ALTER TABLE "config" ALTER COLUMN "status" DROP DEFAULT',
-                'ALTER TABLE "configs" RENAME TO "config"',
-                'ALTER TABLE "email" ADD "address" VARCHAR(200) NOT NULL',
-                'ALTER TABLE "email" DROP COLUMN "user_id"',
-                'ALTER TABLE "email" RENAME COLUMN "id" TO "email_id"',
-                'ALTER TABLE "product" ALTER COLUMN "view_num" SET DEFAULT 0',
-                'ALTER TABLE "product" RENAME COLUMN "image" TO "pic"',
-                'ALTER TABLE "user" ALTER COLUMN "password" TYPE VARCHAR(100) USING "password"::VARCHAR(100)',
-                'ALTER TABLE "user" DROP COLUMN "avatar"',
-                'CREATE INDEX "idx_product_name_869427" ON "product" ("name", "type_db_alias")',
-                'CREATE INDEX "idx_email_email_4a1a33" ON "email" ("email")',
-                'CREATE TABLE "email_user" ("email_id" INT NOT NULL REFERENCES "email" ("email_id") ON DELETE CASCADE,"user_id" INT NOT NULL REFERENCES "user" ("id") ON DELETE CASCADE)',
-                'CREATE TABLE IF NOT EXISTS "newmodel" (\n    "id" SERIAL NOT NULL PRIMARY KEY,\n    "name" VARCHAR(50) NOT NULL\n);\nCOMMENT ON COLUMN "config"."user_id" IS \'User\';',
-                'CREATE UNIQUE INDEX "uid_product_name_869427" ON "product" ("name", "type_db_alias")',
-                'CREATE UNIQUE INDEX "uid_user_usernam_9987ab" ON "user" ("username")',
-            ]
-        )
-        assert sorted(Migrate.downgrade_operators) == sorted(
-            [
-                'ALTER TABLE "category" ALTER COLUMN "name" SET NOT NULL',
-                'ALTER TABLE "category" ALTER COLUMN "slug" TYPE VARCHAR(200) USING "slug"::VARCHAR(200)',
-                'ALTER TABLE "config" ALTER COLUMN "status" SET DEFAULT 1',
-                'ALTER TABLE "config" DROP COLUMN "user_id"',
-                'ALTER TABLE "config" DROP CONSTRAINT "fk_config_user_17daa970"',
-                'ALTER TABLE "config" RENAME TO "configs"',
-                'ALTER TABLE "email" ADD "user_id" INT NOT NULL',
-                'ALTER TABLE "email" DROP COLUMN "address"',
-                'ALTER TABLE "email" RENAME COLUMN "email_id" TO "id"',
-                'ALTER TABLE "product" ALTER COLUMN "view_num" DROP DEFAULT',
-                'ALTER TABLE "product" RENAME COLUMN "pic" TO "image"',
-                'ALTER TABLE "user" ADD "avatar" VARCHAR(200) NOT NULL DEFAULT \'\'',
-                'ALTER TABLE "user" ALTER COLUMN "password" TYPE VARCHAR(200) USING "password"::VARCHAR(200)',
-                'DROP INDEX "idx_product_name_869427"',
-                'DROP INDEX "idx_email_email_4a1a33"',
-                'DROP INDEX "idx_user_usernam_9987ab"',
-                'DROP INDEX "uid_product_name_869427"',
-                'DROP TABLE IF EXISTS "email_user"',
-                'DROP TABLE IF EXISTS "newmodel"',
-            ]
+        expected_upgrade_operators = {
+            'ALTER TABLE "category" ALTER COLUMN "name" DROP NOT NULL',
+            'ALTER TABLE "category" ALTER COLUMN "slug" TYPE VARCHAR(100) USING "slug"::VARCHAR(100)',
+            'ALTER TABLE "category" ALTER COLUMN "created_at" TYPE TIMESTAMPTZ USING "created_at"::TIMESTAMPTZ',
+            'ALTER TABLE "config" ADD "user_id" INT NOT NULL',
+            'ALTER TABLE "config" ADD CONSTRAINT "fk_config_user_17daa970" FOREIGN KEY ("user_id") REFERENCES "user" ("id") ON DELETE CASCADE',
+            'ALTER TABLE "config" ALTER COLUMN "status" DROP DEFAULT',
+            'ALTER TABLE "config" ALTER COLUMN "value" TYPE JSONB USING "value"::JSONB',
+            'ALTER TABLE "configs" RENAME TO "config"',
+            'ALTER TABLE "email" ADD "address" VARCHAR(200) NOT NULL',
+            'ALTER TABLE "email" DROP COLUMN "user_id"',
+            'ALTER TABLE "email" RENAME COLUMN "id" TO "email_id"',
+            'ALTER TABLE "email" ALTER COLUMN "is_primary" TYPE BOOL USING "is_primary"::BOOL',
+            'ALTER TABLE "product" ALTER COLUMN "view_num" SET DEFAULT 0',
+            'ALTER TABLE "product" RENAME COLUMN "image" TO "pic"',
+            'ALTER TABLE "product" ALTER COLUMN "is_reviewed" TYPE BOOL USING "is_reviewed"::BOOL',
+            'ALTER TABLE "product" ALTER COLUMN "body" TYPE TEXT USING "body"::TEXT',
+            'ALTER TABLE "product" ALTER COLUMN "created_at" TYPE TIMESTAMPTZ USING "created_at"::TIMESTAMPTZ',
+            'ALTER TABLE "user" ALTER COLUMN "password" TYPE VARCHAR(100) USING "password"::VARCHAR(100)',
+            'ALTER TABLE "user" DROP COLUMN "avatar"',
+            'ALTER TABLE "user" ALTER COLUMN "is_superuser" TYPE BOOL USING "is_superuser"::BOOL',
+            'ALTER TABLE "user" ALTER COLUMN "last_login" TYPE TIMESTAMPTZ USING "last_login"::TIMESTAMPTZ',
+            'ALTER TABLE "user" ALTER COLUMN "intro" TYPE TEXT USING "intro"::TEXT',
+            'ALTER TABLE "user" ALTER COLUMN "is_active" TYPE BOOL USING "is_active"::BOOL',
+            'ALTER TABLE "user" ALTER COLUMN "longitude" TYPE DECIMAL(10,8) USING "longitude"::DECIMAL(10,8)',
+            'CREATE INDEX "idx_product_name_869427" ON "product" ("name", "type_db_alias")',
+            'CREATE INDEX "idx_email_email_4a1a33" ON "email" ("email")',
+            'CREATE TABLE "email_user" (\n    "email_id" INT NOT NULL REFERENCES "email" ("email_id") ON DELETE CASCADE,\n    "user_id" INT NOT NULL REFERENCES "user" ("id") ON DELETE CASCADE\n)',
+            'CREATE TABLE IF NOT EXISTS "newmodel" (\n    "id" SERIAL NOT NULL PRIMARY KEY,\n    "name" VARCHAR(50) NOT NULL\n);\nCOMMENT ON COLUMN "config"."user_id" IS \'User\'',
+            'CREATE UNIQUE INDEX "uid_product_name_869427" ON "product" ("name", "type_db_alias")',
+            'CREATE UNIQUE INDEX "uid_user_usernam_9987ab" ON "user" ("username")',
+        }
+        expected_downgrade_operators = {
+            'ALTER TABLE "category" ALTER COLUMN "name" SET NOT NULL',
+            'ALTER TABLE "category" ALTER COLUMN "slug" TYPE VARCHAR(200) USING "slug"::VARCHAR(200)',
+            'ALTER TABLE "category" ALTER COLUMN "created_at" TYPE TIMESTAMPTZ USING "created_at"::TIMESTAMPTZ',
+            'ALTER TABLE "config" ALTER COLUMN "status" SET DEFAULT 1',
+            'ALTER TABLE "config" DROP COLUMN "user_id"',
+            'ALTER TABLE "config" DROP CONSTRAINT "fk_config_user_17daa970"',
+            'ALTER TABLE "config" RENAME TO "configs"',
+            'ALTER TABLE "config" ALTER COLUMN "value" TYPE JSONB USING "value"::JSONB',
+            'ALTER TABLE "email" ADD "user_id" INT NOT NULL',
+            'ALTER TABLE "email" DROP COLUMN "address"',
+            'ALTER TABLE "email" RENAME COLUMN "email_id" TO "id"',
+            'ALTER TABLE "email" ALTER COLUMN "is_primary" TYPE BOOL USING "is_primary"::BOOL',
+            'ALTER TABLE "product" ALTER COLUMN "view_num" DROP DEFAULT',
+            'ALTER TABLE "product" RENAME COLUMN "pic" TO "image"',
+            'ALTER TABLE "user" ADD "avatar" VARCHAR(200) NOT NULL DEFAULT \'\'',
+            'ALTER TABLE "user" ALTER COLUMN "password" TYPE VARCHAR(200) USING "password"::VARCHAR(200)',
+            'ALTER TABLE "user" ALTER COLUMN "last_login" TYPE TIMESTAMPTZ USING "last_login"::TIMESTAMPTZ',
+            'ALTER TABLE "user" ALTER COLUMN "is_superuser" TYPE BOOL USING "is_superuser"::BOOL',
+            'ALTER TABLE "user" ALTER COLUMN "is_active" TYPE BOOL USING "is_active"::BOOL',
+            'ALTER TABLE "user" ALTER COLUMN "intro" TYPE TEXT USING "intro"::TEXT',
+            'ALTER TABLE "user" ALTER COLUMN "longitude" TYPE DECIMAL(12,9) USING "longitude"::DECIMAL(12,9)',
+            'ALTER TABLE "product" ALTER COLUMN "created_at" TYPE TIMESTAMPTZ USING "created_at"::TIMESTAMPTZ',
+            'ALTER TABLE "product" ALTER COLUMN "is_reviewed" TYPE BOOL USING "is_reviewed"::BOOL',
+            'ALTER TABLE "product" ALTER COLUMN "body" TYPE TEXT USING "body"::TEXT',
+            'DROP INDEX "idx_product_name_869427"',
+            'DROP INDEX "idx_email_email_4a1a33"',
+            'DROP INDEX "idx_user_usernam_9987ab"',
+            'DROP INDEX "uid_product_name_869427"',
+            'DROP TABLE IF EXISTS "email_user"',
+            'DROP TABLE IF EXISTS "newmodel"',
+        }
+        assert not set(Migrate.upgrade_operators).symmetric_difference(expected_upgrade_operators)
+        assert not set(Migrate.downgrade_operators).symmetric_difference(
+            expected_downgrade_operators
         )
 
     elif isinstance(Migrate.ddl, SqliteDDL):
         assert Migrate.upgrade_operators == []
 
         assert Migrate.downgrade_operators == []
@@ -892,18 +951,18 @@ def test_sort_all_version_files(mocker):
     mocker.patch(
         "os.listdir",
         return_value=[
-            "1_datetime_update.sql",
-            "11_datetime_update.sql",
-            "10_datetime_update.sql",
-            "2_datetime_update.sql",
+            "1_datetime_update.py",
+            "11_datetime_update.py",
+            "10_datetime_update.py",
+            "2_datetime_update.py",
         ],
     )
 
     Migrate.migrate_location = "."
 
     assert Migrate.get_all_version_files() == [
-        "1_datetime_update.sql",
-        "2_datetime_update.sql",
-        "10_datetime_update.sql",
-        "11_datetime_update.sql",
+        "1_datetime_update.py",
+        "2_datetime_update.py",
+        "10_datetime_update.py",
+        "11_datetime_update.py",
     ]
tests/test_utils.py (new file) | 6
@@ -0,0 +1,6 @@
from aerich.utils import import_py_file


def test_import_py_file():
    m = import_py_file("aerich/utils.py")
    assert getattr(m, "import_py_file")