Compare commits

..

107 Commits

Author SHA1 Message Date
Bastien Gérard
fb8f02d0c0 Merge pull request #2441 from bagerard/remove_useless_cls_var
minor improvement in code
2020-12-08 23:22:57 +01:00
Bastien Gerard
a025199294 Merge branch 'master' of github.com:MongoEngine/mongoengine into remove_useless_cls_var 2020-12-08 23:02:17 +01:00
Bastien Gérard
87babaaa30 Merge pull request #2439 from bagerard/improve_doc
Improve Fields documentation
2020-12-08 22:58:00 +01:00
Bastien Gerard
a4fff15491 minor improvement in code 2020-12-08 22:41:27 +01:00
Bastien Gerard
a190dfe2c4 additional improvements to fields constructor 2020-12-08 21:48:54 +01:00
Bastien Gerard
3926473917 Improve Fields documentation + remove versionadded/changed as it's not maintained 2020-12-06 22:25:12 +01:00
Bastien Gérard
9ffe0bcdee Merge pull request #2432 from bagerard/remove_deprecated_landscape
remove landscape integration
2020-12-01 21:40:56 +01:00
Bastien Gérard
4fa3134294 remove landscape integration as it is dead 2020-11-29 21:15:02 +01:00
Bastien Gérard
92f6fce77d Merge pull request #2431 from bagerard/remove_pillow_test_restriction
Remove restriction on Pillow version in tests
2020-11-29 20:48:13 +01:00
Bastien Gérard
b1a2cf061d update test requirement for pillow > 7.0 2020-11-29 10:22:59 +01:00
Bastien Gérard
0a05c1f590 Fix image size that needs to be forced in test 2020-11-29 10:08:58 +01:00
Bastien Gérard
7dbc217768 Remove restriction on Pillow version as it was there due to Py2 support 2020-11-28 22:27:24 +01:00
Bastien Gérard
bf411ab2ca Update changelog with recent changes that were merged 2020-11-28 22:21:51 +01:00
Bastien Gérard
277b827d4d Merge pull request #2426 from volfpeter/master
Fix LazyReferenceField dereferencing in embedded documents
2020-11-26 21:45:12 +01:00
Peter Volf
e0bec881bc removed unused variable to fix a warning 2020-11-25 15:17:52 +01:00
Peter Volf
cc5e2ba054 fix LazyReferenceField dereferencing bug in embedded documents, refs #2375 2020-11-25 11:05:37 +01:00
Bastien Gérard
904fcd1a0a Merge pull request #2424 from bagerard/bump_0_21_0
Prepare 0.21.0 release
2020-11-19 09:02:22 +01:00
Bastien Gérard
2ec454447f Bump version to 0.21.0 and update changelog 2020-11-18 22:28:06 +01:00
Bastien Gérard
ecd297e227 Merge pull request #2414 from bagerard/fix_db_fields_inconsistencies_in_constructor
Fix some issues related with db_field in constructor
2020-11-18 22:25:25 +01:00
Bastien Gérard
079ee3c191 Merge pull request #2417 from bagerard/add_migration_documentation
Add migration documentation
2020-11-18 22:19:19 +01:00
Bastien Gérard
f2638ecd02 update changelog 2020-11-14 15:31:39 +01:00
Bastien Gérard
ad6ff819fe Merge branch 'master' of github.com:MongoEngine/mongoengine into fix_db_fields_inconsistencies_in_constructor 2020-11-14 14:44:08 +01:00
Bastien Gérard
48357640c6 improve deprecated pymongo call 2020-11-14 14:41:59 +01:00
Bastien Gérard
e6c2169f76 Merge pull request #2418 from bagerard/add_black_formatting_badge
Add black badge to readme
2020-11-14 13:42:57 +01:00
Bastien Gérard
1d17dc4663 Add black badge to readme to emphasize that repo is using autoformatter black as it is often forgotten in PR and makes CI failing 2020-11-12 22:13:53 +01:00
Bastien Gérard
eeac3bd2e6 Merge pull request #2416 from bagerard/remove_python_35
Remove Py3.5 as it is EOL and added 3.9 to CI
2020-11-12 09:47:44 +01:00
Bastien Gérard
3f5a15d236 improve changelog 2020-11-12 00:43:22 +01:00
Bastien Gérard
91493a1e79 improve migration doc 2020-11-12 00:40:52 +01:00
Bastien Gérard
0c274908ec Merge branch 'master' of github.com:MongoEngine/mongoengine into add_migration_documentation 2020-11-11 21:18:52 +01:00
Bastien Gérard
338c40b5d5 Remove Py3.5 as it is EOL and added 3.9 to CI 2020-11-11 21:14:54 +01:00
Bastien Gérard
fc3ccf9606 Merge pull request #2415 from bagerard/add_srv_uri_connect_doc
Document fact that srv URI can be used with host #1956
2020-11-11 21:05:52 +01:00
Bastien Gérard
746faceb5c Document fact that srv URI can be used with host #1956 2020-11-08 22:55:24 +01:00
Bastien Gérard
8c3058d99b Fix some issues related with db_field in constructor by removing field/db_field translation that shouldn't occur in constructor 2020-11-08 22:36:58 +01:00
Bastien Gérard
eb56fb9bda Merge pull request #2413 from bagerard/dynamic_document_parsing_known_fields
Bug fix in DynamicDocument which is not parsing known fields
2020-11-08 13:17:07 +01:00
Bastien Gérard
161493c0d2 Merge pull request #2408 from bagerard/refactoring_remove_useless_code_only_fields
Removed code related to Document.__only_fields
2020-11-08 13:16:03 +01:00
Bastien Gérard
cb9f329d11 Merge pull request #2401 from SMASHDOCs/bugfix-save-sharding
Bugfix #2154
2020-11-07 21:43:49 +01:00
Bastien Gérard
03af784ebe Bug fix in DynamicDocument which isn not parsing known fields in constructor like Document do #2412 2020-11-07 21:30:23 +01:00
Felix Schultheiß
e5f6e4584a Merge commit master into bugfix-save-sharding 2020-11-03 10:05:31 +01:00
Felix Schultheiß
79f9f223d0 added to authors 2020-11-03 10:00:15 +01:00
Felix Schultheiß
0bc18cd6e1 fixed shard test case for old mongodb version 2020-11-03 10:00:02 +01:00
Felix Schultheiß
30a3c6a5b7 added testcase for save create with shard key 2020-11-02 17:30:24 +01:00
Bastien Gérard
90c5d83f84 remove deprecated comment 2020-11-02 15:02:11 +01:00
Bastien Gérard
d8b8ff6851 Removed code related to Document.__only_fields and Queryset.only_fields which appear to have no effect 2020-11-02 14:52:02 +01:00
Bastien Gérard
ee664f0c90 Merge pull request #2406 from bagerard/improve_enumfield_doc
improve EnumField Doc and add quick test
2020-11-01 23:08:30 +01:00
Bastien Gérard
f8d371229e tmp work on migration doc 2020-11-01 20:25:35 +01:00
Bastien Gérard
94a7e813b1 fix difference in test for certain version of pymongo 2020-11-01 19:37:13 +01:00
Bastien Gérard
8ef7213426 improve EnumField Doc and add quick test 2020-11-01 14:05:58 +01:00
Bastien Gérard
2f4464ead5 Merge pull request #2404 from mas15/add-enum-field
Add EnumField
2020-11-01 13:41:20 +01:00
Bastien Gérard
89b93461ac Merge pull request #2405 from bagerard/remove_encoding_declarations
remove utf8 encoding declaration in test files
2020-11-01 13:16:57 +01:00
Mateusz Stankiewicz
9e40f3ae83 PR ammends 2020-10-31 10:47:20 +01:00
Bastien Gérard
f4962fbc40 remove utf8 encoding declaration in test files as it's not needed/recommended 2020-10-30 21:10:21 +01:00
Mateusz Stankiewicz
c9d53ca5d5 Add EnumField 2020-10-30 13:06:37 +01:00
Bastien Gérard
65f50fd713 Merge pull request #2387 from bagerard/fix_change_fields_inconsistencies
fix inconsistencies in ._changed_fields computation
2020-10-29 21:15:31 +01:00
Felix Schultheiß
bf1d04e399 black reformatting 2020-10-29 14:56:08 +01:00
Felix Schultheiß
5a8e5e5a40 updated docstring 2020-10-27 16:34:57 +01:00
Felix Schultheiß
f3919dd839 stripped out integrating shard key from _save_update, use it also in _save_create 2020-10-27 12:55:35 +01:00
Bastien Gérard
9f82a02ddf Merge pull request #2106 from bagerard/add_validation_to_doc
Add a documentation page for validation
2020-10-20 00:27:05 +02:00
Bastien Gérard
015a36c85f minor styling fix in .rst 2020-10-19 23:59:12 +02:00
Bastien Gérard
fbd3388a59 Merge branch 'master' of github.com:MongoEngine/mongoengine into add_validation_to_doc 2020-10-19 23:36:12 +02:00
Bastien Gérard
d8a52d68c5 improve doc in .readthedocs.yml 2020-10-19 23:34:24 +02:00
Bastien Gérard
4286708e2e fix mongoengine setup.py path in .readthedocs.yml 2020-10-18 22:47:00 +02:00
Bastien Gérard
e362d089e1 install mongoengine for readthedocs build to work 2020-10-18 22:44:06 +02:00
Bastien Gérard
6b657886a5 remove explicit install from .readthedocs.yml to rely on default instead 2020-10-18 22:21:45 +02:00
Bastien Gérard
eb16945147 fix requirements.txt location for readthedocs 2020-10-18 22:06:15 +02:00
Bastien Gérard
38047ca992 Merge pull request #2396 from bagerard/fix_readthedocs_failed_build
Fix readthedocs build that failed
2020-10-18 22:04:01 +02:00
Bastien Gérard
c801e79d4b Fix readthedocs build that failed by making it use python3 instead of default python2.7 2020-10-18 21:33:30 +02:00
Bastien Gérard
3fca3739de rework validation documentation based on review 2020-10-18 21:11:16 +02:00
Bastien Gérard
c218c8bb6c Merge branch 'master' of github.com:MongoEngine/mongoengine into add_validation_to_doc 2020-10-17 15:05:27 +02:00
Bastien Gérard
0bbc05995a Merge pull request #2393 from bagerard/fix_listfield_change_0
Fix listfield change detection of index 0
2020-10-11 10:15:18 +02:00
Bastien Gérard
3adb67901b update changelog for #2392 2020-10-11 00:53:46 +02:00
Bastien Gérard
d4350e7da4 Fix for ListField that isnt detecting properly that item 0 is changed 2020-10-10 23:32:22 +02:00
Bastien Gérard
4665658145 Merge pull request #2390 from bagerard/bump_latest_lib_ci
Upgrade pymongo and mongodb versions used in CI
2020-10-07 21:41:12 +02:00
Bastien Gérard
0d289fd5a1 upgrade pymongo and mongodb versions used in CI 2020-10-07 21:30:43 +02:00
Bastien Gérard
aabc18755c fix inconsistencies in ._changed_fields computation 2020-10-07 00:01:09 +02:00
Bastien Gérard
1f2a5db016 fix deprecated use of .update in test suite 2020-08-12 22:30:52 +02:00
Bastien Gérard
ff40f66291 Merge pull request #2243 from bagerard/fix_count_documents_deprecation
Fix count documents deprecation
2020-08-12 22:18:02 +02:00
Bastien Gérard
7f77084e0e minor fixes in doc links 2020-08-12 21:56:38 +02:00
Bastien Gérard
aca4de728e Merge branch 'master' of github.com:MongoEngine/mongoengine into fix_count_documents_deprecation 2020-08-11 23:01:33 +02:00
Bastien Gérard
9e7ca43cad Merge pull request #2365 from hiimdoublej/fix/queryTransform
Fix query transformation regarding special operators
2020-08-11 22:18:33 +02:00
Bastien Gérard
7116dec74a run black to please ci 2020-08-11 21:55:22 +02:00
Bastien Gérard
a5302b870b Merge branch 'fix/queryTransform' of git://github.com/hiimdoublej/mongoengine into hiimdoublej-fix/queryTransform 2020-08-11 21:48:00 +02:00
Bastien Gérard
604e9974b6 Merge pull request #2363 from bagerard/AttributeError_message_attr
fix py3 incompatible code
2020-08-03 21:37:36 +02:00
Johnny Chang
3e1c83f8fa Fix query transformation regarding special operators 2020-08-04 00:30:15 +08:00
Bastien Gérard
e431e27cb2 #2360 fix py3 incompatible code 2020-08-01 15:09:10 +02:00
Bastien Gérard
4f188655d0 Merge pull request #2335 from bagerard/fix_limit0_bug
Fix bug with Doc.objects.limit(0) which should return all docs
2020-05-27 09:43:35 +02:00
Bastien Gérard
194b0cac88 improve doc + changelog 2020-05-26 23:45:35 +02:00
Bastien Gérard
7b4175fc5c Merge branch 'master' of github.com:MongoEngine/mongoengine into fix_limit0_bug 2020-05-26 23:44:05 +02:00
Bastien Gérard
adb5f74ddb Fix a bug in limit0 #2311 2020-05-26 23:37:55 +02:00
Bastien Gérard
107a1c34c8 Merge pull request #2331 from abarto/fix/clone-retain-read-preference-read-concern
Add read_concern to cloned properties. Add read_concern to aggregate().
2020-05-23 23:22:56 +02:00
Bastien Gérard
dc7da5204f Merge branch 'terencehonles-patch-1' 2020-05-23 23:12:33 +02:00
Bastien Gérard
0301bca176 Merge branch 'patch-1' of https://github.com/terencehonles/mongoengine into terencehonles-patch-1 2020-05-23 23:12:01 +02:00
Bastien Gérard
49f9bca23b fix black formatting 2020-05-23 23:08:56 +02:00
Agustin Barto
31498bd7dd Update changelog 2020-05-20 18:58:18 -03:00
Agustin Barto
1698f398eb Add _read_concern to copied properties. Add read_concern to aggregate. Add test to check the read_concern and read_preference values are kept after cloning. 2020-05-20 18:56:13 -03:00
Bastien Gérard
4275c2d7b7 Merge pull request #2330 from terencehonles/fix-empty-deprecation-warning-in-q-node
fix self inflicted deprecation warnings in QNode
2020-05-19 22:02:12 +02:00
Terence D. Honles
22bff8566d fix self inflicted deprecation warnings in QNode 2020-05-19 11:00:30 -07:00
Terence Honles
d8657be320 Fix requirement Pillow < 7 to mention it is for tests only 2020-05-19 10:23:07 -07:00
Bastien Gérard
412bed0f6d fix bug in legacy .count due to with_limit_and_skip that was missing 2020-01-12 11:04:05 +01:00
Bastien Gérard
53cf26b9af Merge branch 'master' of github.com:MongoEngine/mongoengine into fix_count_documents_deprecation 2020-01-12 10:07:36 +01:00
Bastien Gérard
2fa48cd9e5 fix for pymongo < 3.7 2020-01-07 22:24:55 +01:00
Bastien Gérard
e64a7a9448 reformat with latest black 2020-01-07 22:11:24 +01:00
Bastien Gérard
84f3dce492 fix flake8 findings 2020-01-05 22:50:19 +01:00
Bastien Gérard
60c42dddd5 finalize code related to count_documents migration 2020-01-05 22:29:13 +01:00
Bastien Gérard
f93f9406ee improve doc next to code 2020-01-05 21:08:20 +01:00
Bastien Gérard
928770c43a switching to count_documents 2020-01-05 21:01:03 +01:00
Bastien Gérard
d37a30e083 improve doc (based on review) 2019-06-30 20:46:40 +02:00
Bastien Gérard
c9ed930606 Add a documentation page for validation 2019-06-25 22:31:48 +02:00
68 changed files with 1258 additions and 478 deletions

.landscape.yml

@@ -1,17 +0,0 @@
-pylint:
-    disable:
-        # We use this a lot (e.g. via document._meta)
-        - protected-access
-
-    options:
-        additional-builtins:
-            # add long as valid built-ins.
-            - long
-
-pyflakes:
-    disable:
-        # undefined variables are already covered by pylint (and exclude long)
-        - F821
-
-ignore-paths:
-    - benchmark.py

.readthedocs.yml Normal file

@@ -0,0 +1,20 @@
+# .readthedocs.yml
+# Read the Docs configuration file
+# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
+
+# Required
+version: 2
+
+# Build documentation in the docs/ directory with Sphinx
+sphinx:
+  configuration: docs/conf.py
+
+# Optionally set the version of Python and requirements required to build your docs
+python:
+  version: 3.7
+  install:
+    - requirements: docs/requirements.txt
+    # docs/conf.py is importing mongoengine
+    # so mongoengine needs to be installed as well
+    - method: setuptools
+      path: .

.travis.yml

@@ -16,26 +16,26 @@
 language: python
 dist: xenial
 python:
-  - 3.5
   - 3.6
   - 3.7
   - 3.8
+  - 3.9
   - pypy3

env:
   global:
-    - MONGODB_3_4=3.4.17
-    - MONGODB_3_6=3.6.12
+    - MONGODB_3_4=3.4.19
+    - MONGODB_3_6=3.6.13
     - MONGODB_4_0=4.0.13

     - PYMONGO_3_4=3.4
     - PYMONGO_3_6=3.6
     - PYMONGO_3_9=3.9
-    - PYMONGO_3_10=3.10
+    - PYMONGO_3_11=3.11

     - MAIN_PYTHON_VERSION=3.7
   matrix:
-    - MONGODB=${MONGODB_3_4} PYMONGO=${PYMONGO_3_10}
+    - MONGODB=${MONGODB_3_4} PYMONGO=${PYMONGO_3_11}

matrix:
   # Finish the build as soon as one job fails
@@ -47,9 +47,9 @@ matrix:
   - python: 3.7
     env: MONGODB=${MONGODB_3_6} PYMONGO=${PYMONGO_3_9}
   - python: 3.7
-    env: MONGODB=${MONGODB_3_6} PYMONGO=${PYMONGO_3_10}
+    env: MONGODB=${MONGODB_3_6} PYMONGO=${PYMONGO_3_11}
   - python: 3.8
-    env: MONGODB=${MONGODB_4_0} PYMONGO=${PYMONGO_3_10}
+    env: MONGODB=${MONGODB_4_0} PYMONGO=${PYMONGO_3_11}

install:
   # Install Mongo
@@ -75,7 +75,7 @@ script:
   - tox -e $(echo py$TRAVIS_PYTHON_VERSION-mg$PYMONGO | tr -d . | sed -e 's/pypypy/pypy/') -- -a "--cov=mongoengine"

after_success:
   - if [[ $TRAVIS_PYTHON_VERSION == $MAIN_PYTHON_VERSION ]]; then coveralls --verbose; else echo "coveralls only sent for py37"; fi

notifications:
   irc: irc.freenode.org#mongoengine
@@ -103,5 +103,5 @@ deploy:
   on:
     tags: true
     repo: MongoEngine/mongoengine
-    condition: ($PYMONGO = ${PYMONGO_3_10}) && ($MONGODB = ${MONGODB_3_4})
+    condition: ($PYMONGO = ${PYMONGO_3_11}) && ($MONGODB = ${MONGODB_3_4})
   python: 3.7

AUTHORS

@@ -257,3 +257,5 @@ that much better:
 * Matthew Simpson (https://github.com/mcsimps2)
 * Leonardo Domingues (https://github.com/leodmgs)
 * Agustin Barto (https://github.com/abarto)
+* Stankiewicz Mateusz (https://github.com/mas15)
+* Felix Schultheiß (https://github.com/felix-smashdocs)

README.rst

@@ -12,9 +12,8 @@ MongoEngine
 .. image:: https://coveralls.io/repos/github/MongoEngine/mongoengine/badge.svg?branch=master
    :target: https://coveralls.io/github/MongoEngine/mongoengine?branch=master
-.. image:: https://landscape.io/github/MongoEngine/mongoengine/master/landscape.svg?style=flat
-   :target: https://landscape.io/github/MongoEngine/mongoengine/master
-   :alt: Code Health
+.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
+   :target: https://github.com/ambv/black

 About
 =====

docs/Makefile

@@ -33,7 +33,7 @@ clean:
 html:
 	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
 	@echo
-	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
+	@echo "Build finished. Check $(BUILDDIR)/html/index.html"

 dirhtml:
 	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml

docs/changelog.rst

@@ -6,6 +6,24 @@ Changelog
 Development
 ===========
 - (Fill this out as you fix issues and develop your features).
+- Fix LazyReferenceField dereferencing in embedded documents #2426
+
+Changes in 0.21.0
+=================
+- Bug fix in DynamicDocument which is not parsing known fields in constructor like Document do #2412
+- When using pymongo >= 3.7, make use of Collection.count_documents instead of Collection.count
+  and Cursor.count that got deprecated in pymongo >= 3.7.
+  This should have a negative impact on performance of count see Issue #2219
+- Fix a bug that made the queryset drop the read_preference after clone().
+- Remove Py3.5 from CI as it reached EOL and add Python 3.9
+- Fix some issues related with db_field/field conflict in constructor #2414
+- BREAKING CHANGE: Fix the behavior of Doc.objects.limit(0) which should return all documents (similar to mongodb) #2311
+- Bug fix in ListField when updating the first item, it was saving the whole list, instead of
+  just replacing the first item (as usually done when updating 1 item of the list) #2392
+- Add EnumField: ``mongoengine.fields.EnumField``
+- Refactoring - Remove useless code related to Document.__only_fields and Queryset.only_fields
+- Fix query transformation regarding special operators #2365
+- Bug Fix: Document.save() fails when shard_key is not _id #2154
+
 Changes in 0.20.0
 =================

@@ -28,7 +46,7 @@ Changes in 0.20.0

 Changes in 0.19.1
 =================
-- Requires Pillow < 7.0.0 as it dropped Python2 support
+- Tests require Pillow < 7.0.0 as it dropped Python2 support
 - DEPRECATION: The interface of ``QuerySet.aggregate`` method was changed, it no longer takes an unpacked list of
   pipeline steps (*pipeline) but simply takes the pipeline list just like ``pymongo.Collection.aggregate`` does. #2079

@@ -456,9 +474,6 @@ Changes in 0.8.3
 - Document.select_related() now respects ``db_alias`` (#377)
 - Reload uses shard_key if applicable (#384)
 - Dynamic fields are ordered based on creation and stored in _fields_ordered (#396)
-
-  **Potential breaking change:** http://docs.mongoengine.org/en/latest/upgrade.html#to-0-8-3
-
 - Fixed pickling dynamic documents ``_dynamic_fields`` (#387)
 - Fixed ListField setslice and delslice dirty tracking (#390)
 - Added Django 1.5 PY3 support (#392)

docs/conf.py

@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 #
 # MongoEngine documentation build configuration file, created by
 # sphinx-quickstart on Sun Nov 22 18:14:13 2009.

docs/guide/connecting.rst

@@ -31,6 +31,8 @@ the :attr:`host` to
 connect('project1', host='mongodb://localhost/database_name')

+.. note:: URI containing SRV records (e.g mongodb+srv://server.example.com/) can be used as well as the :attr:`host`
+
 .. note:: Database, username and password from URI string overrides
    corresponding parameters in :func:`~mongoengine.connect`: ::

docs/guide/defining-documents.rst

@@ -76,6 +76,7 @@ are as follows:
 * :class:`~mongoengine.fields.EmailField`
 * :class:`~mongoengine.fields.EmbeddedDocumentField`
 * :class:`~mongoengine.fields.EmbeddedDocumentListField`
+* :class:`~mongoengine.fields.EnumField`
 * :class:`~mongoengine.fields.FileField`
 * :class:`~mongoengine.fields.FloatField`
 * :class:`~mongoengine.fields.GenericEmbeddedDocumentField`
@@ -426,19 +427,6 @@ either a single field name, or a list or tuple of field names::
     first_name = StringField()
     last_name = StringField(unique_with='first_name')

-Skipping Document validation on save
-------------------------------------
-You can also skip the whole document validation process by setting
-``validate=False`` when calling the :meth:`~mongoengine.document.Document.save`
-method::
-
-    class Recipient(Document):
-        name = StringField()
-        email = EmailField()
-
-    recipient = Recipient(name='admin', email='root@localhost')
-    recipient.save()                # will raise a ValidationError while
-    recipient.save(validate=False)  # won't
-
 Document collections
 ====================

docs/guide/document-instances.rst

@@ -41,35 +41,6 @@ already exist, then any changes will be updated atomically. For example::
 .. seealso::
     :ref:`guide-atomic-updates`

-Pre save data validation and cleaning
--------------------------------------
-MongoEngine allows you to create custom cleaning rules for your documents when
-calling :meth:`~mongoengine.Document.save`. By providing a custom
-:meth:`~mongoengine.Document.clean` method you can do any pre validation / data
-cleaning.
-
-This might be useful if you want to ensure a default value based on other
-document values for example::
-
-    class Essay(Document):
-        status = StringField(choices=('Published', 'Draft'), required=True)
-        pub_date = DateTimeField()
-
-        def clean(self):
-            """Ensures that only published essays have a `pub_date` and
-            automatically sets `pub_date` if essay is published and `pub_date`
-            is not set"""
-            if self.status == 'Draft' and self.pub_date is not None:
-                msg = 'Draft entries should not have a publication date.'
-                raise ValidationError(msg)
-            # Set the pub_date for published items if not set.
-            if self.status == 'Published' and self.pub_date is None:
-                self.pub_date = datetime.now()
-
-.. note::
-    Cleaning is only called if validation is turned on and when calling
-    :meth:`~mongoengine.Document.save`.
-
 Cascading Saves
 ---------------
 If your document contains :class:`~mongoengine.fields.ReferenceField` or

docs/guide/gridfs.rst

@@ -2,8 +2,6 @@
 GridFS
 ======

-.. versionadded:: 0.4
-
 Writing
 -------

docs/guide/index.rst

@@ -10,8 +10,10 @@ User Guide
    defining-documents
    document-instances
    querying
+   validation
    gridfs
    signals
    text-indexes
+   migration
    logging-monitoring
    mongomock

docs/guide/migration.rst Normal file

@@ -0,0 +1,267 @@
===================
Documents migration
===================
The structure of your documents and their associated mongoengine schemas are likely
to change over the lifetime of an application. This section provides guidance and
recommendations on how to deal with migrations.
Due to the very flexible nature of MongoDB, model migrations aren't trivial, and for those who know
`alembic` from `sqlalchemy`, there is unfortunately no equivalent
library that manages migrations automatically for MongoEngine.
Example 1: Addition of a field
==============================
Let's start with a simple example of a model change and review the different options you
have to deal with the migration.
Let's assume we start with the following schema and save an instance:
.. code-block:: python
class User(Document):
name = StringField()
User(name="John Doe").save()
# print the objects as they exist in mongodb
print(User.objects().as_pymongo()) # [{u'_id': ObjectId('5d06b9c3d7c1f18db3e7c874'), u'name': u'John Doe'}]
In the next version of your application, let's assume that a new field `enabled` is added to the
existing ``User`` model with ``default=True``. You simply update the ``User`` class to the following:
.. code-block:: python
class User(Document):
name = StringField(required=True)
enabled = BooleanField(default=True)
Without applying any migration, we now reload an object from the database into the ``User`` class
and check its `enabled` attribute:
.. code-block:: python
assert User.objects.count() == 1
user = User.objects().first()
assert user.enabled is True
assert User.objects(enabled=True).count() == 0 # uh?
assert User.objects(enabled=False).count() == 0 # uh?
# this is consistent with what we have in the database
# in fact, 'enabled' does not exist
print(User.objects().as_pymongo().first()) # {u'_id': ObjectId('5d06b9c3d7c1f18db3e7c874'), u'name': u'John Doe'}
assert User.objects(enabled=None).count() == 1
As you can see, even if the document wasn't updated, mongoengine applies the default value seamlessly when it
loads the pymongo dict into a ``User`` instance. At first sight it looks like you don't need to migrate the
existing documents when adding new fields but this actually leads to inconsistencies when it comes to querying.
In fact, when querying, MongoEngine doesn't account for the default value of the new field, so
if you don't actually migrate the existing documents, queries and updates
risk missing relevant records.
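The querying pitfall described above can be illustrated without a database. The sketch below models how MongoDB evaluates an equality filter against a raw document, including the special rule that ``{field: None}`` also matches documents where the field is missing (``matches`` is a hypothetical helper for illustration, not a MongoEngine or pymongo API):

```python
_MISSING = object()

def matches(doc, flt):
    """Minimal model of MongoDB equality matching on a raw document."""
    for field, value in flt.items():
        if value is None:
            # In MongoDB, {field: None} matches both null and missing fields
            if doc.get(field) is not None:
                return False
        elif doc.get(field, _MISSING) != value:
            return False
    return True

raw_user = {"_id": 1, "name": "John Doe"}  # 'enabled' was never written
assert not matches(raw_user, {"enabled": True})   # the default is NOT applied
assert not matches(raw_user, {"enabled": False})
assert matches(raw_user, {"enabled": None})       # only this filter matches
```

This mirrors the asserts above: the un-migrated document is invisible to both ``enabled=True`` and ``enabled=False`` queries.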
When adding fields/modifying default values, you can use any of the following to do the migration
as a standalone script:
.. code-block:: python
# Use mongoengine to set a default value for a given field
User.objects().update(enabled=True)
# or use pymongo
user_coll = User._get_collection()
user_coll.update_many({}, {'$set': {'enabled': True}})
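To make the effect of the ``$set`` migration concrete, here is a database-free sketch of what ``update_many({}, {'$set': {...}})`` does to each raw document (``apply_set`` is a hypothetical stand-in for the server-side operation):

```python
def apply_set(docs, changes):
    """Mimic update_many({}, {'$set': changes}) on a list of plain dicts."""
    for doc in docs:
        doc.update(changes)  # $set creates the field or overwrites it
    return docs

users = [
    {"_id": 1, "name": "John Doe"},                    # pre-migration document
    {"_id": 2, "name": "Jane Doe", "enabled": False},  # already has the field
]
apply_set(users, {"enabled": True})
# After the migration every document carries an explicit 'enabled' value,
# so a query filtering on {'enabled': True} now matches as expected.
```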
Example 2: Inheritance change
=============================
Let's consider the following example:
.. code-block:: python
class Human(Document):
name = StringField()
meta = {"allow_inheritance": True}
class Jedi(Human):
dark_side = BooleanField()
light_saber_color = StringField()
Jedi(name="Darth Vader", dark_side=True, light_saber_color="red").save()
Jedi(name="Obi Wan Kenobi", dark_side=False, light_saber_color="blue").save()
assert Human.objects.count() == 2
assert Jedi.objects.count() == 2
# Let's check how these documents got stored in mongodb
print(Jedi.objects.as_pymongo())
# [
# {'_id': ObjectId('5fac4aaaf61d7fb06046e0f9'), '_cls': 'Human.Jedi', 'name': 'Darth Vader', 'dark_side': True, 'light_saber_color': 'red'},
# {'_id': ObjectId('5fac4ac4f61d7fb06046e0fa'), '_cls': 'Human.Jedi', 'name': 'Obi Wan Kenobi', 'dark_side': False, 'light_saber_color': 'blue'}
# ]
As you can observe, when you use inheritance, MongoEngine stores a field named '_cls' behind the scenes to keep
track of the Document class.
Let's now take the scenario where you want to refactor the inheritance schema and:
- have the Jedis with ``dark_side=False`` become ``GoodJedi`` and those with ``dark_side=True`` become ``BadSith``
- get rid of the 'dark_side' field
move to the following schemas:
.. code-block:: python
# unchanged
class Human(Document):
name = StringField()
meta = {"allow_inheritance": True}
# attribute 'dark_side' removed
class GoodJedi(Human):
light_saber_color = StringField()
# new class
class BadSith(Human):
light_saber_color = StringField()
MongoEngine doesn't know about the change or how to map it onto the existing data,
so if you don't apply any migration, you will observe strange behavior, as if the collection were suddenly
empty.
.. code-block:: python
# As a reminder, the documents that we inserted
# have the _cls field = 'Human.Jedi'
# Following has no match
# because the query that is used behind the scene is
# filtering on {'_cls': 'Human.GoodJedi'}
assert GoodJedi.objects().count() == 0
# Following has also no match
# because it is filtering on {'_cls': {'$in': ('Human', 'Human.GoodJedi', 'Human.BadSith')}}
# which has no match
assert Human.objects.count() == 0
assert Human.objects.first() is None
# If we bypass MongoEngine and make use of underlying driver (PyMongo)
# we can see that the documents are there
humans_coll = Human._get_collection()
assert humans_coll.count_documents({}) == 2
# print first document
print(humans_coll.find_one())
# {'_id': ObjectId('5fac4aaaf61d7fb06046e0f9'), '_cls': 'Human.Jedi', 'name': 'Darth Vader', 'dark_side': True, 'light_saber_color': 'red'}
As you can see, the first obvious problem is that we need to modify the '_cls' values based on the existing
values of the 'dark_side' field.
.. code-block:: python
humans_coll = Human._get_collection()
old_class = 'Human.Jedi'
good_jedi_class = 'Human.GoodJedi'
bad_sith_class = 'Human.BadSith'
humans_coll.update_many({'_cls': old_class, 'dark_side': False}, {'$set': {'_cls': good_jedi_class}})
humans_coll.update_many({'_cls': old_class, 'dark_side': True}, {'$set': {'_cls': bad_sith_class}})
Let's now check if querying improved in MongoEngine:
.. code-block:: python
assert GoodJedi.objects().count() == 1 # Hoorah!
assert BadSith.objects().count() == 1 # Hoorah!
assert Human.objects.count() == 2 # Hoorah!
# let's now check that documents load correctly
jedi = GoodJedi.objects().first()
# raises FieldDoesNotExist: The fields "{'dark_side'}" do not exist on the document "Human.GoodJedi"
In fact we only took care of renaming the _cls values but we haven't removed the 'dark_side' field,
which no longer exists on the GoodJedi and BadSith models.
Let's remove the field from the collections:
.. code-block:: python
humans_coll = Human._get_collection()
humans_coll.update_many({}, {'$unset': {'dark_side': 1}})
.. note:: We did this migration in 2 different steps for the sake of example but it could have been combined
with the migration of the _cls fields: ::
humans_coll.update_many(
{'_cls': old_class, 'dark_side': False},
{
'$set': {'_cls': good_jedi_class},
'$unset': {'dark_side': 1}
}
)
And verify that the documents now load correctly:
.. code-block:: python
jedi = GoodJedi.objects().first()
assert jedi.name == "Obi Wan Kenobi"
sith = BadSith.objects().first()
assert sith.name == "Darth Vader"
Another way of dealing with this migration is to iterate over
the documents and update/replace them one by one. This is much slower, but
it is often useful for complex migrations of Document models.
.. code-block:: python
for doc in humans_coll.find():
if doc['_cls'] == 'Human.Jedi':
doc['_cls'] = 'Human.BadSith' if doc['dark_side'] else 'Human.GoodJedi'
doc.pop('dark_side')
humans_coll.replace_one({'_id': doc['_id']}, doc)
.. warning:: Be aware of this `flaw <https://groups.google.com/g/mongodb-user/c/AFC1ia7MHzk>`_ if you modify documents while iterating
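One way to make that loop safer is to factor the per-document transformation into a pure function that can be unit-tested against sample documents before touching the live collection. This is a minimal sketch; the ``migrate_human`` helper is our own name, not part of MongoEngine:

```python
def migrate_human(doc):
    """Return the migrated version of a raw Human document (a plain dict).

    Pure function: it does not touch the database, so it can be tested
    against sample documents before running the real migration.
    """
    doc = dict(doc)  # work on a copy, don't mutate the caller's dict
    if doc.get('_cls') == 'Human.Jedi':
        doc['_cls'] = 'Human.BadSith' if doc['dark_side'] else 'Human.GoodJedi'
    doc.pop('dark_side', None)
    return doc

# The migration loop then becomes:
# for doc in humans_coll.find():
#     humans_coll.replace_one({'_id': doc['_id']}, migrate_human(doc))
```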
Recommendations
===============
- Write migration scripts whenever you make changes to your model schemas
- Using :class:`~mongoengine.DynamicDocument` or ``meta = {"strict": False}`` may help you avoid some migrations, or allow two versions of your application to co-exist.
- Write post-processing checks to verify that your migration scripts worked. See below
Post-processing checks
======================
The following recipe can be used to sanity-check a Document collection after you have applied a migration.
It does not make any assumptions about what was migrated; it will fetch 1000 objects at random and
run some quick checks on the documents to make sure they look OK. As written, it fails
on the first occurrence of an error, but this can be adapted to your needs.
.. code-block:: python
def get_random_oids(collection, sample_size):
pipeline = [{"$project": {'_id': 1}}, {"$sample": {"size": sample_size}}]
return [s['_id'] for s in collection.aggregate(pipeline)]
def get_random_documents(DocCls, sample_size):
doc_collection = DocCls._get_collection()
random_oids = get_random_oids(doc_collection, sample_size)
return DocCls.objects(id__in=random_oids)
def check_documents(DocCls, sample_size):
for doc in get_random_documents(DocCls, sample_size):
# general validation (types and values)
doc.validate()
# load all subfields,
# this may trigger additional queries if you have ReferenceFields
# so it may be slow
for field in doc._fields:
try:
getattr(doc, field)
except Exception:
LOG.warning(f"Could not load field {field} in Document {doc.id}")
raise
check_documents(Human, sample_size=1000)
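If you prefer to surface every problem in one pass rather than failing fast, the loop can collect failures instead. This is an illustrative variant of ``check_documents`` with names of our choosing; it accepts any iterable of documents (e.g. the result of ``get_random_documents``):

```python
import logging

LOG = logging.getLogger(__name__)


def check_documents_report(documents):
    """Validate each document, collecting (id, error) pairs instead of
    raising on the first failure.
    """
    failures = []
    for doc in documents:
        try:
            doc.validate()
        except Exception as exc:
            LOG.warning("Document %s failed validation: %s", getattr(doc, "id", None), exc)
            failures.append((getattr(doc, "id", None), exc))
    return failures
```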


@@ -609,7 +609,7 @@ to push values with index::
.. note:: .. note::
Currently only top level lists are handled, future versions of mongodb / Currently only top level lists are handled, future versions of mongodb /
pymongo plan to support nested positional operators. See `The $ positional pymongo plan to support nested positional operators. See `The $ positional
operator <http://www.mongodb.org/display/DOCS/Updating#Updating-The%24positionaloperator>`_. operator <https://docs.mongodb.com/manual/tutorial/update-documents/#Updating-The%24positionaloperator>`_.
Server-side javascript execution Server-side javascript execution
================================ ================================

docs/guide/validation.rst

@@ -0,0 +1,123 @@
====================
Document Validation
====================
By design, MongoEngine strictly validates the documents right before they are inserted in MongoDB
and makes sure they are consistent with the fields defined in your models.
MongoEngine assumes that documents that already exist in the DB comply with the schema.
This means that MongoEngine will not validate a document when an object is loaded from the DB into an instance
of your model, but this operation may fail under some circumstances (e.g. if there is a field in
the document fetched from the database that is not defined in your model).
Built-in validation
===================
MongoEngine provides various fields that encapsulate the corresponding validation
out of the box. Validation runs when calling `.validate()` or `.save()`.
.. code-block:: python
from mongoengine import Document, EmailField, IntField
class User(Document):
email = EmailField()
age = IntField(min_value=0, max_value=99)
user = User(email='invalid@', age=24)
user.validate() # raises ValidationError (Invalid email address: ['email'])
user.save() # raises ValidationError (Invalid email address: ['email'])
user2 = User(email='john.doe@garbage.com', age=1000)
user2.save() # raises ValidationError (Integer value is too large: ['age'])
Custom validation
=================
The following features can be used to customize validation:
* Field `validation` parameter
.. code-block:: python
def not_john_doe(name):
if name == 'John Doe':
raise ValidationError("John Doe is not a valid name")
class Person(Document):
full_name = StringField(validation=not_john_doe)
Person(full_name='Billy Doe').save()
Person(full_name='John Doe').save() # raises ValidationError (John Doe is not a valid name)
* Document `clean` method
This method is called as part of :meth:`~mongoengine.document.Document.save` and should be used to provide
custom model validation and/or to modify some of the field values prior to validation.
For instance, you could use it to automatically provide a value for a field, or to do validation
that requires access to more than a single field.
.. code-block:: python
class Essay(Document):
status = StringField(choices=('Published', 'Draft'), required=True)
pub_date = DateTimeField()
def clean(self):
# Validate that only published essays have a `pub_date`
if self.status == 'Draft' and self.pub_date is not None:
raise ValidationError('Draft entries should not have a publication date.')
# Set the pub_date for published items if not set.
if self.status == 'Published' and self.pub_date is None:
self.pub_date = datetime.now()
.. note::
Cleaning is only called if validation is turned on and when calling
:meth:`~mongoengine.Document.save`.
* Adding custom Field classes
We recommend using the fields provided by MongoEngine as much as possible. However, it is also possible
to subclass a Field and encapsulate custom validation by overriding the `validate` method.
.. code-block:: python
class AgeField(IntField):
def validate(self, value):
super(AgeField, self).validate(value) # let IntField.validate run first
if value == 60:
self.error('60 is not allowed')
class Person(Document):
age = AgeField(min_value=0, max_value=99)
Person(age=20).save() # passes
Person(age=1000).save() # raises ValidationError (Integer value is too large: ['age'])
Person(age=60).save() # raises ValidationError (Person:None) (60 is not allowed: ['age'])
.. note::
When overriding `validate`, use `self.error("your-custom-error")` instead of raising a ValidationError explicitly,
as it will provide better context for the error message.
Skipping validation
====================
Although discouraged, as it allows you to violate field constraints, if for some reason you need to disable
the validation and cleaning of a document when you call :meth:`~mongoengine.document.Document.save`, you can use `.save(validate=False)`.
.. code-block:: python
class Person(Document):
age = IntField(max_value=100)
Person(age=1000).save() # raises ValidationError (Integer value is too large)
Person(age=1000).save(validate=False)
person = Person.objects.first()
assert person.age == 1000

docs/requirements.txt

@@ -0,0 +1,3 @@
pymongo>=3.11
Sphinx==3.2.1
sphinx-rtd-theme==0.5.0


@@ -28,7 +28,7 @@ __all__ = (
) )
VERSION = (0, 20, 0) VERSION = (0, 21, 0)
def get_version(): def get_version():


@@ -179,7 +179,7 @@ class BaseList(list):
def _mark_as_changed(self, key=None): def _mark_as_changed(self, key=None):
if hasattr(self._instance, "_mark_as_changed"): if hasattr(self._instance, "_mark_as_changed"):
if key: if key is not None:
self._instance._mark_as_changed( self._instance._mark_as_changed(
"{}.{}".format(self._name, key % len(self)) "{}.{}".format(self._name, key % len(self))
) )
@@ -215,7 +215,7 @@ class EmbeddedDocumentList(BaseList):
Filters the list by only including embedded documents with the Filters the list by only including embedded documents with the
given keyword arguments. given keyword arguments.
This method only supports simple comparison (e.g: .filter(name='John Doe')) This method only supports simple comparison (e.g. .filter(name='John Doe'))
and does not support operators like __gte, __lte, __icontains like queryset.filter does and does not support operators like __gte, __lte, __icontains like queryset.filter does
:param kwargs: The keyword arguments corresponding to the fields to :param kwargs: The keyword arguments corresponding to the fields to


@@ -64,8 +64,6 @@ class BaseDocument:
It may contain additional reserved keywords, e.g. "__auto_convert". It may contain additional reserved keywords, e.g. "__auto_convert".
:param __auto_convert: If True, supplied values will be converted :param __auto_convert: If True, supplied values will be converted
to Python-type values via each field's `to_python` method. to Python-type values via each field's `to_python` method.
:param __only_fields: A set of fields that have been loaded for
this document. Empty if all fields have been loaded.
:param _created: Indicates whether this is a brand new document :param _created: Indicates whether this is a brand new document
or whether it's already been persisted before. Defaults to true. or whether it's already been persisted before. Defaults to true.
""" """
@@ -80,8 +78,6 @@ class BaseDocument:
__auto_convert = values.pop("__auto_convert", True) __auto_convert = values.pop("__auto_convert", True)
__only_fields = set(values.pop("__only_fields", values))
_created = values.pop("_created", True) _created = values.pop("_created", True)
signals.pre_init.send(self.__class__, document=self, values=values) signals.pre_init.send(self.__class__, document=self, values=values)
@@ -105,37 +101,32 @@ class BaseDocument:
self._dynamic_fields = SON() self._dynamic_fields = SON()
# Assign default values to the instance. # Assign default values for fields
# We set default values only for fields loaded from DB. See # not set in the constructor
# https://github.com/mongoengine/mongoengine/issues/399 for more info. for field_name in self._fields:
for key, field in self._fields.items(): if field_name in values:
if self._db_field_map.get(key, key) in __only_fields:
continue continue
value = getattr(self, key, None) value = getattr(self, field_name, None)
setattr(self, key, value) setattr(self, field_name, value)
if "_cls" not in values: if "_cls" not in values:
self._cls = self._class_name self._cls = self._class_name
# Set passed values after initialisation # Set actual values
if self._dynamic:
dynamic_data = {} dynamic_data = {}
for key, value in values.items():
if key in self._fields or key == "_id":
setattr(self, key, value)
else:
dynamic_data[key] = value
else:
FileField = _import_class("FileField") FileField = _import_class("FileField")
for key, value in values.items(): for key, value in values.items():
key = self._reverse_db_field_map.get(key, key)
if key in self._fields or key in ("id", "pk", "_cls"):
if __auto_convert and value is not None:
field = self._fields.get(key) field = self._fields.get(key)
if field or key in ("id", "pk", "_cls"):
if __auto_convert and value is not None:
if field and not isinstance(field, FileField): if field and not isinstance(field, FileField):
value = field.to_python(value) value = field.to_python(value)
setattr(self, key, value) setattr(self, key, value)
else: else:
if self._dynamic:
dynamic_data[key] = value
else:
# For strict Document
self._data[key] = value self._data[key] = value
# Set any get_<field>_display methods # Set any get_<field>_display methods
@@ -314,7 +305,8 @@ class BaseDocument:
def clean(self): def clean(self):
""" """
Hook for doing document level data cleaning before validation is run. Hook for doing document level data cleaning (usually validation or assignment)
before validation is run.
Any ValidationError raised by this method will not be associated with Any ValidationError raised by this method will not be associated with
a particular field; it will have a special-case association with the a particular field; it will have a special-case association with the
@@ -537,6 +529,9 @@ class BaseDocument:
"""Using _get_changed_fields iterate and remove any fields that """Using _get_changed_fields iterate and remove any fields that
are marked as changed. are marked as changed.
""" """
ReferenceField = _import_class("ReferenceField")
GenericReferenceField = _import_class("GenericReferenceField")
for changed in self._get_changed_fields(): for changed in self._get_changed_fields():
parts = changed.split(".") parts = changed.split(".")
data = self data = self
@@ -549,7 +544,8 @@ class BaseDocument:
elif isinstance(data, dict): elif isinstance(data, dict):
data = data.get(part, None) data = data.get(part, None)
else: else:
data = getattr(data, part, None) field_name = data._reverse_db_field_map.get(part, part)
data = getattr(data, field_name, None)
if not isinstance(data, LazyReference) and hasattr( if not isinstance(data, LazyReference) and hasattr(
data, "_changed_fields" data, "_changed_fields"
@@ -558,10 +554,40 @@ class BaseDocument:
continue continue
data._changed_fields = [] data._changed_fields = []
elif isinstance(data, (list, tuple, dict)):
if hasattr(data, "field") and isinstance(
data.field, (ReferenceField, GenericReferenceField)
):
continue
BaseDocument._nestable_types_clear_changed_fields(data)
self._changed_fields = [] self._changed_fields = []
def _nestable_types_changed_fields(self, changed_fields, base_key, data): @staticmethod
def _nestable_types_clear_changed_fields(data):
"""Inspect nested data for changed fields
:param data: data to inspect for changes
"""
Document = _import_class("Document")
# Loop list / dict fields as they contain documents
# Determine the iterator to use
if not hasattr(data, "items"):
iterator = enumerate(data)
else:
iterator = data.items()
for index_or_key, value in iterator:
if hasattr(value, "_get_changed_fields") and not isinstance(
value, Document
): # don't follow references
value._clear_changed_fields()
elif isinstance(value, (list, tuple, dict)):
BaseDocument._nestable_types_clear_changed_fields(value)
@staticmethod
def _nestable_types_changed_fields(changed_fields, base_key, data):
"""Inspect nested data for changed fields """Inspect nested data for changed fields
:param changed_fields: Previously collected changed fields :param changed_fields: Previously collected changed fields
@@ -586,7 +612,9 @@ class BaseDocument:
changed = value._get_changed_fields() changed = value._get_changed_fields()
changed_fields += ["{}{}".format(item_key, k) for k in changed if k] changed_fields += ["{}{}".format(item_key, k) for k in changed if k]
elif isinstance(value, (list, tuple, dict)): elif isinstance(value, (list, tuple, dict)):
self._nestable_types_changed_fields(changed_fields, item_key, value) BaseDocument._nestable_types_changed_fields(
changed_fields, item_key, value
)
def _get_changed_fields(self): def _get_changed_fields(self):
"""Return a list of all fields that have explicitly been changed. """Return a list of all fields that have explicitly been changed.
@@ -721,11 +749,9 @@ class BaseDocument:
return cls._meta.get("collection", None) return cls._meta.get("collection", None)
@classmethod @classmethod
def _from_son(cls, son, _auto_dereference=True, only_fields=None, created=False): def _from_son(cls, son, _auto_dereference=True, created=False):
"""Create an instance of a Document (subclass) from a PyMongo SON.""" """Create an instance of a Document (subclass) from a PyMongo SON (dict)
if not only_fields: """
only_fields = []
if son and not isinstance(son, dict): if son and not isinstance(son, dict):
raise ValueError( raise ValueError(
"The source SON object needs to be of type 'dict' but a '%s' was found" "The source SON object needs to be of type 'dict' but a '%s' was found"
@@ -738,6 +764,8 @@ class BaseDocument:
# Convert SON to a data dict, making sure each key is a string and # Convert SON to a data dict, making sure each key is a string and
# corresponds to the right db field. # corresponds to the right db field.
# This is needed as _from_son is currently called both from BaseDocument.__init__
# and from EmbeddedDocumentField.to_python
data = {} data = {}
for key, value in son.items(): for key, value in son.items():
key = str(key) key = str(key)
@@ -780,9 +808,7 @@ class BaseDocument:
if cls.STRICT: if cls.STRICT:
data = {k: v for k, v in data.items() if k in cls._fields} data = {k: v for k, v in data.items() if k in cls._fields}
obj = cls( obj = cls(__auto_convert=False, _created=created, **data)
__auto_convert=False, _created=created, __only_fields=only_fields, **data
)
obj._changed_fields = [] obj._changed_fields = []
if not _auto_dereference: if not _auto_dereference:
obj._fields = fields obj._fields = fields


@@ -1,5 +1,4 @@
import operator import operator
import warnings
import weakref import weakref
from bson import DBRef, ObjectId, SON from bson import DBRef, ObjectId, SON
@@ -16,11 +15,9 @@ __all__ = ("BaseField", "ComplexBaseField", "ObjectIdField", "GeoJsonBaseField")
class BaseField: class BaseField:
"""A base class for fields in a MongoDB document. Instances of this class """A base class for fields in a MongoDB document. Instances of this class
may be added to subclasses of `Document` to define a document's schema. may be added to subclasses of `Document` to define a document's schema.
.. versionchanged:: 0.5 - added verbose and help text
""" """
name = None name = None # set in TopLevelDocumentMetaclass
_geo_index = False _geo_index = False
_auto_gen = False # Call `generate` to generate a value _auto_gen = False # Call `generate` to generate a value
_auto_dereference = True _auto_dereference = True
@@ -265,11 +262,11 @@ class ComplexBaseField(BaseField):
Allows for nesting of embedded documents inside complex types. Allows for nesting of embedded documents inside complex types.
Handles the lazy dereferencing of a queryset by lazily dereferencing all Handles the lazy dereferencing of a queryset by lazily dereferencing all
items in a list / dict rather than one at a time. items in a list / dict rather than one at a time.
.. versionadded:: 0.5
""" """
field = None def __init__(self, field=None, **kwargs):
self.field = field
super().__init__(**kwargs)
def __get__(self, instance, owner): def __get__(self, instance, owner):
"""Descriptor to automatically dereference references.""" """Descriptor to automatically dereference references."""
@@ -521,8 +518,6 @@ class ObjectIdField(BaseField):
class GeoJsonBaseField(BaseField): class GeoJsonBaseField(BaseField):
"""A geo json field storing a geojson style object. """A geo json field storing a geojson style object.
.. versionadded:: 0.8
""" """
_geo_index = pymongo.GEOSPHERE _geo_index = pymongo.GEOSPHERE


@@ -74,8 +74,6 @@ def _get_connection_settings(
: param kwargs: ad-hoc parameters to be passed into the pymongo driver, : param kwargs: ad-hoc parameters to be passed into the pymongo driver,
for example maxpoolsize, tz_aware, etc. See the documentation for example maxpoolsize, tz_aware, etc. See the documentation
for pymongo's `MongoClient` for a full list. for pymongo's `MongoClient` for a full list.
.. versionchanged:: 0.10.6 - added mongomock support
""" """
conn_settings = { conn_settings = {
"name": name or db or DEFAULT_DATABASE_NAME, "name": name or db or DEFAULT_DATABASE_NAME,
@@ -201,8 +199,6 @@ def register_connection(
: param kwargs: ad-hoc parameters to be passed into the pymongo driver, : param kwargs: ad-hoc parameters to be passed into the pymongo driver,
for example maxpoolsize, tz_aware, etc. See the documentation for example maxpoolsize, tz_aware, etc. See the documentation
for pymongo's `MongoClient` for a full list. for pymongo's `MongoClient` for a full list.
.. versionchanged:: 0.10.6 - added mongomock support
""" """
conn_settings = _get_connection_settings( conn_settings = _get_connection_settings(
db=db, db=db,
@@ -386,8 +382,6 @@ def connect(db=None, alias=DEFAULT_CONNECTION_NAME, **kwargs):
See the docstring for `register_connection` for more details about all See the docstring for `register_connection` for more details about all
supported kwargs. supported kwargs.
.. versionchanged:: 0.6 - added multiple database support.
""" """
if alias in _connections: if alias in _connections:
prev_conn_setting = _connection_settings[alias] prev_conn_setting = _connection_settings[alias]


@@ -1,5 +1,4 @@
import re import re
import warnings
from bson.dbref import DBRef from bson.dbref import DBRef
import pymongo import pymongo
@@ -367,15 +366,6 @@ class Document(BaseDocument, metaclass=TopLevelDocumentMetaclass):
meta['cascade'] = True. Also you can pass different kwargs to meta['cascade'] = True. Also you can pass different kwargs to
the cascade save using cascade_kwargs which overwrites the the cascade save using cascade_kwargs which overwrites the
existing kwargs with custom values. existing kwargs with custom values.
.. versionchanged:: 0.8.5
Optional save_condition that only overwrites existing documents
if the condition is satisfied in the current db record.
.. versionchanged:: 0.10
:class:`OperationError` exception raised if save_condition fails.
.. versionchanged:: 0.10.1
:class: save_condition failure now raises a `SaveConditionError`
.. versionchanged:: 0.10.7
Add signal_kwargs argument
""" """
signal_kwargs = signal_kwargs or {} signal_kwargs = signal_kwargs or {}
@@ -464,9 +454,9 @@ class Document(BaseDocument, metaclass=TopLevelDocumentMetaclass):
# insert_one will provoke UniqueError alongside save does not # insert_one will provoke UniqueError alongside save does not
# therefore, it need to catch and call replace_one. # therefore, it need to catch and call replace_one.
if "_id" in doc: if "_id" in doc:
raw_object = wc_collection.find_one_and_replace( select_dict = {"_id": doc["_id"]}
{"_id": doc["_id"]}, doc select_dict = self._integrate_shard_key(doc, select_dict)
) raw_object = wc_collection.find_one_and_replace(select_dict, doc)
if raw_object: if raw_object:
return doc["_id"] return doc["_id"]
@@ -489,6 +479,23 @@ class Document(BaseDocument, metaclass=TopLevelDocumentMetaclass):
return update_doc return update_doc
def _integrate_shard_key(self, doc, select_dict):
"""Integrates the collection's shard key to the `select_dict`, which will be used for the query.
The value from the shard key is taken from the `doc` and finally the select_dict is returned.
"""
# Need to add shard key to query, or you get an error
shard_key = self._meta.get("shard_key", tuple())
for k in shard_key:
path = self._lookup_field(k.split("."))
actual_key = [p.db_field for p in path]
val = doc
for ak in actual_key:
val = val[ak]
select_dict[".".join(actual_key)] = val
return select_dict
def _save_update(self, doc, save_condition, write_concern): def _save_update(self, doc, save_condition, write_concern):
"""Update an existing document. """Update an existing document.
@@ -504,15 +511,7 @@ class Document(BaseDocument, metaclass=TopLevelDocumentMetaclass):
select_dict["_id"] = object_id select_dict["_id"] = object_id
# Need to add shard key to query, or you get an error select_dict = self._integrate_shard_key(doc, select_dict)
shard_key = self._meta.get("shard_key", tuple())
for k in shard_key:
path = self._lookup_field(k.split("."))
actual_key = [p.db_field for p in path]
val = doc
for ak in actual_key:
val = val[ak]
select_dict[".".join(actual_key)] = val
update_doc = self._get_update_doc() update_doc = self._get_update_doc()
if update_doc: if update_doc:
@@ -621,9 +620,6 @@ class Document(BaseDocument, metaclass=TopLevelDocumentMetaclass):
For example, ``save(..., w: 2, fsync: True)`` will For example, ``save(..., w: 2, fsync: True)`` will
wait until at least two servers have recorded the write and wait until at least two servers have recorded the write and
will force an fsync on the primary server. will force an fsync on the primary server.
.. versionchanged:: 0.10.7
Add signal_kwargs argument
""" """
signal_kwargs = signal_kwargs or {} signal_kwargs = signal_kwargs or {}
signals.pre_delete.send(self.__class__, document=self, **signal_kwargs) signals.pre_delete.send(self.__class__, document=self, **signal_kwargs)
@@ -639,7 +635,7 @@ class Document(BaseDocument, metaclass=TopLevelDocumentMetaclass):
write_concern=write_concern, _from_doc_delete=True write_concern=write_concern, _from_doc_delete=True
) )
except pymongo.errors.OperationFailure as err: except pymongo.errors.OperationFailure as err:
message = "Could not delete document (%s)" % err.message message = "Could not delete document (%s)" % err.args
raise OperationError(message) raise OperationError(message)
signals.post_delete.send(self.__class__, document=self, **signal_kwargs) signals.post_delete.send(self.__class__, document=self, **signal_kwargs)
@@ -705,8 +701,6 @@ class Document(BaseDocument, metaclass=TopLevelDocumentMetaclass):
def select_related(self, max_depth=1): def select_related(self, max_depth=1):
"""Handles dereferencing of :class:`~bson.dbref.DBRef` objects to """Handles dereferencing of :class:`~bson.dbref.DBRef` objects to
a maximum depth in order to cut down the number queries to mongodb. a maximum depth in order to cut down the number queries to mongodb.
.. versionadded:: 0.5
""" """
DeReference = _import_class("DeReference") DeReference = _import_class("DeReference")
DeReference()([self], max_depth + 1) DeReference()([self], max_depth + 1)
@@ -717,10 +711,6 @@ class Document(BaseDocument, metaclass=TopLevelDocumentMetaclass):
:param fields: (optional) args list of fields to reload :param fields: (optional) args list of fields to reload
:param max_depth: (optional) depth of dereferencing to follow :param max_depth: (optional) depth of dereferencing to follow
.. versionadded:: 0.1.2
.. versionchanged:: 0.6 Now chainable
.. versionchanged:: 0.9 Can provide specific fields to reload
""" """
max_depth = 1 max_depth = 1
if fields and isinstance(fields[0], int): if fields and isinstance(fields[0], int):
@@ -822,9 +812,6 @@ class Document(BaseDocument, metaclass=TopLevelDocumentMetaclass):
Raises :class:`OperationError` if the document has no collection set Raises :class:`OperationError` if the document has no collection set
(i.g. if it is `abstract`) (i.g. if it is `abstract`)
.. versionchanged:: 0.10.7
:class:`OperationError` exception raised if no collection available
""" """
coll_name = cls._get_collection_name() coll_name = cls._get_collection_name()
if not coll_name: if not coll_name:
@@ -919,7 +906,7 @@ class Document(BaseDocument, metaclass=TopLevelDocumentMetaclass):
@classmethod @classmethod
def list_indexes(cls): def list_indexes(cls):
""" Lists all of the indexes that should be created for given """Lists all of the indexes that should be created for given
collection. It includes all the indexes from super- and sub-classes. collection. It includes all the indexes from super- and sub-classes.
""" """
if cls._meta.get("abstract"): if cls._meta.get("abstract"):
@@ -984,7 +971,7 @@ class Document(BaseDocument, metaclass=TopLevelDocumentMetaclass):
@classmethod @classmethod
def compare_indexes(cls): def compare_indexes(cls):
""" Compares the indexes defined in MongoEngine with the ones """Compares the indexes defined in MongoEngine with the ones
existing in the database. Returns any missing/extra indexes. existing in the database. Returns any missing/extra indexes.
""" """
@@ -1079,8 +1066,6 @@ class MapReduceDocument:
an ``ObjectId`` found in the given ``collection``, an ``ObjectId`` found in the given ``collection``,
the object can be accessed via the ``object`` property. the object can be accessed via the ``object`` property.
:param value: The result(s) for this key. :param value: The result(s) for this key.
.. versionadded:: 0.3
""" """
def __init__(self, document, collection, key, value): def __init__(self, document, collection, key, value):


@@ -36,7 +36,6 @@ from mongoengine.common import _import_class
from mongoengine.connection import DEFAULT_CONNECTION_NAME, get_db from mongoengine.connection import DEFAULT_CONNECTION_NAME, get_db
from mongoengine.document import Document, EmbeddedDocument from mongoengine.document import Document, EmbeddedDocument
from mongoengine.errors import DoesNotExist, InvalidQueryError, ValidationError from mongoengine.errors import DoesNotExist, InvalidQueryError, ValidationError
from mongoengine.mongodb_support import MONGODB_36, get_mongodb_version
from mongoengine.queryset import DO_NOTHING from mongoengine.queryset import DO_NOTHING
from mongoengine.queryset.base import BaseQuerySet from mongoengine.queryset.base import BaseQuerySet
from mongoengine.queryset.transform import STRING_OPERATORS from mongoengine.queryset.transform import STRING_OPERATORS
@@ -87,6 +86,7 @@ __all__ = (
"PolygonField", "PolygonField",
"SequenceField", "SequenceField",
"UUIDField", "UUIDField",
"EnumField",
"MultiPointField", "MultiPointField",
"MultiLineStringField", "MultiLineStringField",
"MultiPolygonField", "MultiPolygonField",
@@ -100,6 +100,12 @@ class StringField(BaseField):
"""A unicode string field.""" """A unicode string field."""
def __init__(self, regex=None, max_length=None, min_length=None, **kwargs): def __init__(self, regex=None, max_length=None, min_length=None, **kwargs):
"""
:param regex: (optional) A string pattern that will be applied during validation
:param max_length: (optional) A max length that will be applied during validation
:param min_length: (optional) A min length that will be applied during validation
:param kwargs: Keyword arguments passed into the parent :class:`~mongoengine.BaseField`
"""
self.regex = re.compile(regex) if regex else None self.regex = re.compile(regex) if regex else None
self.max_length = max_length self.max_length = max_length
self.min_length = min_length self.min_length = min_length
@@ -155,10 +161,7 @@ class StringField(BaseField):
class URLField(StringField): class URLField(StringField):
"""A field that validates input as an URL. """A field that validates input as an URL."""
.. versionadded:: 0.3
"""
_URL_REGEX = LazyRegexCompiler( _URL_REGEX = LazyRegexCompiler(
r"^(?:[a-z0-9\.\-]*)://" # scheme is validated separately r"^(?:[a-z0-9\.\-]*)://" # scheme is validated separately
@@ -173,6 +176,11 @@ class URLField(StringField):
_URL_SCHEMES = ["http", "https", "ftp", "ftps"] _URL_SCHEMES = ["http", "https", "ftp", "ftps"]
def __init__(self, url_regex=None, schemes=None, **kwargs): def __init__(self, url_regex=None, schemes=None, **kwargs):
"""
:param url_regex: (optional) Overwrite the default regex used for validation
:param schemes: (optional) Overwrite the default URL schemes that are allowed
:param kwargs: Keyword arguments passed into the parent :class:`~mongoengine.StringField`
"""
self.url_regex = url_regex or self._URL_REGEX self.url_regex = url_regex or self._URL_REGEX
self.schemes = schemes or self._URL_SCHEMES self.schemes = schemes or self._URL_SCHEMES
super().__init__(**kwargs) super().__init__(**kwargs)
@@ -191,7 +199,6 @@ class URLField(StringField):
 class EmailField(StringField):
     """A field that validates input as an email address.
-
-    .. versionadded:: 0.4
     """

     USER_REGEX = LazyRegexCompiler(
@@ -228,16 +235,11 @@ class EmailField(StringField):
         *args,
         **kwargs
     ):
-        """Initialize the EmailField.
-
-        Args:
-            domain_whitelist (list) - list of otherwise invalid domain
-                names which you'd like to support.
-            allow_utf8_user (bool) - if True, the user part of the email
-                address can contain UTF8 characters.
-                False by default.
-            allow_ip_domain (bool) - if True, the domain part of the email
-                can be a valid IPv4 or IPv6 address.
+        """
+        :param domain_whitelist: (optional) list of valid domain names applied during validation
+        :param allow_utf8_user: Allow user part of the email to contain utf8 char
+        :param allow_ip_domain: Allow domain part of the email to be an IPv4 or IPv6 address
+        :param kwargs: Keyword arguments passed into the parent :class:`~mongoengine.StringField`
         """
         self.domain_whitelist = domain_whitelist or []
         self.allow_utf8_user = allow_utf8_user
@@ -309,6 +311,11 @@ class IntField(BaseField):
     """32-bit integer field."""

     def __init__(self, min_value=None, max_value=None, **kwargs):
+        """
+        :param min_value: (optional) A min value that will be applied during validation
+        :param max_value: (optional) A max value that will be applied during validation
+        :param kwargs: Keyword arguments passed into the parent :class:`~mongoengine.BaseField`
+        """
         self.min_value, self.max_value = min_value, max_value
         super().__init__(**kwargs)
@@ -342,6 +349,11 @@ class LongField(BaseField):
     """64-bit integer field. (Equivalent to IntField since the support to Python2 was dropped)"""

     def __init__(self, min_value=None, max_value=None, **kwargs):
+        """
+        :param min_value: (optional) A min value that will be applied during validation
+        :param max_value: (optional) A max value that will be applied during validation
+        :param kwargs: Keyword arguments passed into the parent :class:`~mongoengine.BaseField`
+        """
         self.min_value, self.max_value = min_value, max_value
         super().__init__(**kwargs)
@@ -378,6 +390,11 @@ class FloatField(BaseField):
     """Floating point number field."""

     def __init__(self, min_value=None, max_value=None, **kwargs):
+        """
+        :param min_value: (optional) A min value that will be applied during validation
+        :param max_value: (optional) A max value that will be applied during validation
+        :param kwargs: Keyword arguments passed into the parent :class:`~mongoengine.BaseField`
+        """
         self.min_value, self.max_value = min_value, max_value
         super().__init__(**kwargs)
@@ -414,9 +431,6 @@ class FloatField(BaseField):
 class DecimalField(BaseField):
     """Fixed-point decimal number field. Stores the value as a float by default unless `force_string` is used.
     If using floats, beware of Decimal to float conversion (potential precision loss)
-
-    .. versionchanged:: 0.8
-    .. versionadded:: 0.3
     """

     def __init__(
@@ -429,11 +443,11 @@ class DecimalField(BaseField):
         **kwargs
     ):
         """
-        :param min_value: Validation rule for the minimum acceptable value.
-        :param max_value: Validation rule for the maximum acceptable value.
+        :param min_value: (optional) A min value that will be applied during validation
+        :param max_value: (optional) A max value that will be applied during validation
         :param force_string: Store the value as a string (instead of a float).
             Be aware that this affects query sorting and operation like lte, gte (as string comparison is applied)
-            and some query operator won't work (e.g: inc, dec)
+            and some query operator won't work (e.g. inc, dec)
         :param precision: Number of decimal places to store.
         :param rounding: The rounding rule from the python decimal library:
@@ -447,7 +461,7 @@ class DecimalField(BaseField):
             - decimal.ROUND_05UP (away from zero if last digit after rounding towards zero would have been 0 or 5; otherwise towards zero)

             Defaults to: ``decimal.ROUND_HALF_UP``
+        :param kwargs: Keyword arguments passed into the parent :class:`~mongoengine.BaseField`
         """
         self.min_value = min_value
         self.max_value = max_value
@@ -497,10 +511,7 @@ class DecimalField(BaseField):
 class BooleanField(BaseField):
-    """Boolean field type.
-
-    .. versionadded:: 0.1.2
-    """
+    """Boolean field type."""
     def to_python(self, value):
         try:
@@ -545,12 +556,13 @@ class DateTimeField(BaseField):
         if callable(value):
             return value()

-        if not isinstance(value, str):
+        if isinstance(value, str):
+            return self._parse_datetime(value)
+        else:
             return None

-        return self._parse_datetime(value)
-
-    def _parse_datetime(self, value):
+    @staticmethod
+    def _parse_datetime(value):
         # Attempt to parse a datetime from a string
         value = value.strip()
         if not value:
@@ -626,13 +638,12 @@ class ComplexDateTimeField(StringField):
     keyword when initializing the field.

     Note: To default the field to the current datetime, use: DateTimeField(default=datetime.utcnow)
-
-    .. versionadded:: 0.5
     """

     def __init__(self, separator=",", **kwargs):
         """
         :param separator: Allows to customize the separator used for storage (default ``,``)
+        :param kwargs: Keyword arguments passed into the parent :class:`~mongoengine.StringField`
         """
         self.separator = separator
         self.format = separator.join(["%Y", "%m", "%d", "%H", "%M", "%S", "%f"])
@@ -773,6 +784,9 @@ class EmbeddedDocumentField(BaseField):
     def prepare_query_value(self, op, value):
         if value is not None and not isinstance(value, self.document_type):
+            # Short circuit for special operators, returning them as is
+            if isinstance(value, dict) and all(k.startswith("$") for k in value.keys()):
+                return value
             try:
                 value = self.document_type._from_son(value)
             except ValueError:
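The short circuit added in this hunk treats a dict whose keys are all Mongo operators (e.g. `{"$exists": True}`) as a raw query fragment rather than embedded-document data. A minimal standalone sketch of that predicate (the helper name is hypothetical; the logic mirrors the added lines):

```python
def is_operator_dict(value):
    """True when value is a dict made up purely of Mongo operator keys
    such as '$gt' or '$exists' -- the short-circuit case above."""
    return isinstance(value, dict) and all(k.startswith("$") for k in value.keys())

print(is_operator_dict({"$exists": True}))        # operator query
print(is_operator_dict({"name": "x"}))            # embedded-document data
print(is_operator_dict({"$gt": 1, "name": "x"}))  # mixed keys do not qualify
```

Note that `all()` over an empty dict is vacuously true, so `{}` would also short-circuit with the code as written.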
@@ -844,8 +858,7 @@ class DynamicField(BaseField):
     Used by :class:`~mongoengine.DynamicDocument` to handle dynamic data"""

     def to_mongo(self, value, use_db_field=True, fields=None):
-        """Convert a Python type to a MongoDB compatible type.
-        """
+        """Convert a Python type to a MongoDB compatible type."""

         if isinstance(value, str):
             return value
@@ -910,10 +923,9 @@ class ListField(ComplexBaseField):
     """

     def __init__(self, field=None, max_length=None, **kwargs):
-        self.field = field
         self.max_length = max_length
         kwargs.setdefault("default", lambda: [])
-        super().__init__(**kwargs)
+        super().__init__(field=field, **kwargs)

     def __get__(self, instance, owner):
         if instance is None:
@@ -972,16 +984,13 @@ class EmbeddedDocumentListField(ListField):
     .. note::
         The only valid list values are subclasses of
         :class:`~mongoengine.EmbeddedDocument`.
-
-    .. versionadded:: 0.9
     """

     def __init__(self, document_type, **kwargs):
         """
         :param document_type: The type of
             :class:`~mongoengine.EmbeddedDocument` the list will hold.
-        :param kwargs: Keyword arguments passed directly into the parent
-            :class:`~mongoengine.ListField`.
+        :param kwargs: Keyword arguments passed into the parent :class:`~mongoengine.ListField`
         """
         super().__init__(field=EmbeddedDocumentField(document_type), **kwargs)
@@ -996,19 +1005,11 @@ class SortedListField(ListField):
     save the whole list then other processes trying to save the whole list
     as well could overwrite changes. The safest way to append to a list is
     to perform a push operation.
-
-    .. versionadded:: 0.4
-    .. versionchanged:: 0.6 - added reverse keyword
     """

-    _ordering = None
-    _order_reverse = False
-
     def __init__(self, field, **kwargs):
-        if "ordering" in kwargs.keys():
-            self._ordering = kwargs.pop("ordering")
-        if "reverse" in kwargs.keys():
-            self._order_reverse = kwargs.pop("reverse")
+        self._ordering = kwargs.pop("ordering", None)
+        self._order_reverse = kwargs.pop("reverse", False)
         super().__init__(field, **kwargs)

     def to_mongo(self, value, use_db_field=True, fields=None):
@@ -1055,17 +1056,13 @@ class DictField(ComplexBaseField):
     .. note::
         Required means it cannot be empty - as the default for DictFields is {}
-
-    .. versionadded:: 0.3
-    .. versionchanged:: 0.5 - Can now handle complex / varying types of data
     """

     def __init__(self, field=None, *args, **kwargs):
-        self.field = field
         self._auto_dereference = False
         kwargs.setdefault("default", lambda: {})
-        super().__init__(*args, **kwargs)
+        super().__init__(*args, field=field, **kwargs)

     def validate(self, value):
         """Make sure that a list of valid fields is being used."""
@@ -1121,8 +1118,6 @@ class MapField(DictField):
     """A field that maps a name to a specified field type. Similar to
     a DictField, except the 'value' of each item must match the specified
     field type.
-
-    .. versionadded:: 0.5
     """

     def __init__(self, field=None, *args, **kwargs):
@@ -1170,8 +1165,6 @@ class ReferenceField(BaseField):
         org = ReferenceField('Org', reverse_delete_rule=CASCADE)

         User.register_delete_rule(Org, 'owner', DENY)
-
-    .. versionchanged:: 0.5 added `reverse_delete_rule`
     """

     def __init__(
@@ -1179,10 +1172,12 @@ class ReferenceField(BaseField):
     ):
         """Initialises the Reference Field.

+        :param document_type: The type of Document that will be referenced
         :param dbref: Store the reference as :class:`~pymongo.dbref.DBRef`
           or as the :class:`~pymongo.objectid.ObjectId`.id .
         :param reverse_delete_rule: Determines what to do when the referring
           object is deleted
+        :param kwargs: Keyword arguments passed into the parent :class:`~mongoengine.BaseField`

         .. note ::
             A reference to an abstract document type is always stored as a
@@ -1304,17 +1299,16 @@ class ReferenceField(BaseField):
 class CachedReferenceField(BaseField):
-    """
-    A referencefield with cache fields to purpose pseudo-joins
-
-    .. versionadded:: 0.9
+    """A referencefield with cache fields to purpose pseudo-joins
     """

     def __init__(self, document_type, fields=None, auto_sync=True, **kwargs):
         """Initialises the Cached Reference Field.

+        :param document_type: The type of Document that will be referenced
         :param fields: A list of fields to be cached in document
-        :param auto_sync: if True documents are auto updated.
+        :param auto_sync: if True documents are auto updated
+        :param kwargs: Keyword arguments passed into the parent :class:`~mongoengine.BaseField`
         """
         if fields is None:
             fields = []
@@ -1482,8 +1476,6 @@ class GenericReferenceField(BaseField):
       it.

     * You can use the choices param to limit the acceptable Document types
-
-    .. versionadded:: 0.3
     """

     def __init__(self, *args, **kwargs):
@@ -1619,16 +1611,76 @@ class BinaryField(BaseField):
         return super().prepare_query_value(op, self.to_mongo(value))


+class EnumField(BaseField):
+    """Enumeration Field. Values are stored underneath as is,
+    so it will only work with simple types (str, int, etc) that
+    are bson encodable
+
+    Example usage:
+
+    .. code-block:: python
+
+        class Status(Enum):
+            NEW = 'new'
+            DONE = 'done'
+
+        class ModelWithEnum(Document):
+            status = EnumField(Status, default=Status.NEW)
+
+        ModelWithEnum(status='done')
+        ModelWithEnum(status=Status.DONE)
+
+    Enum fields can be searched using enum or its value:
+
+    .. code-block:: python
+
+        ModelWithEnum.objects(status='new').count()
+        ModelWithEnum.objects(status=Status.NEW).count()
+
+    Note that choices cannot be set explicitly, they are derived
+    from the provided enum class.
+    """
+
+    def __init__(self, enum, **kwargs):
+        self._enum_cls = enum
+        if "choices" in kwargs:
+            raise ValueError(
+                "'choices' can't be set on EnumField, "
+                "it is implicitly set as the enum class"
+            )
+        kwargs["choices"] = list(self._enum_cls)
+        super().__init__(**kwargs)
+
+    def __set__(self, instance, value):
+        is_legal_value = value is None or isinstance(value, self._enum_cls)
+        if not is_legal_value:
+            try:
+                value = self._enum_cls(value)
+            except Exception:
+                pass
+        return super().__set__(instance, value)
+
+    def to_mongo(self, value):
+        if isinstance(value, self._enum_cls):
+            return value.value
+        return value
+
+    def validate(self, value):
+        if value and not isinstance(value, self._enum_cls):
+            try:
+                self._enum_cls(value)
+            except Exception as e:
+                self.error(str(e))
+
+    def prepare_query_value(self, op, value):
+        if value is None:
+            return value
+        return super().prepare_query_value(op, self.to_mongo(value))
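The coercion in the new `EnumField.__set__` relies only on the standard `Enum` constructor accepting a member's raw value. That round-trip can be sketched without mongoengine (the `coerce` helper is hypothetical and mirrors the logic of the diff above):

```python
from enum import Enum

class Status(Enum):
    NEW = "new"
    DONE = "done"

def coerce(enum_cls, value):
    """Mirror EnumField.__set__: keep None or an existing member as-is,
    otherwise try to build a member from the raw stored value."""
    if value is None or isinstance(value, enum_cls):
        return value
    try:
        return enum_cls(value)
    except Exception:
        # Invalid raw values are left untouched; validate() reports them later
        return value
```

Here `coerce(Status, "done")` yields `Status.DONE`, while an unknown value such as `"bogus"` passes through unchanged so that `validate()` can surface a proper error instead of `__set__` raising.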
 class GridFSError(Exception):
     pass


 class GridFSProxy:
     """Proxy object to handle writing and reading of files to and from GridFS
-
-    .. versionadded:: 0.4
-    .. versionchanged:: 0.5 - added optional size param to read
-    .. versionchanged:: 0.6 - added collection name param
     """

     _fs = None
@@ -1792,10 +1844,6 @@ class GridFSProxy:
 class FileField(BaseField):
     """A GridFS storage field.
-
-    .. versionadded:: 0.4
-    .. versionchanged:: 0.5 added optional size param for read
-    .. versionchanged:: 0.6 added db_alias for multidb support
     """

     proxy_class = GridFSProxy
@@ -1878,11 +1926,7 @@ class FileField(BaseField):
 class ImageGridFsProxy(GridFSProxy):
-    """
-    Proxy for ImageField
-
-    versionadded: 0.6
-    """
+    """Proxy for ImageField"""

     def put(self, file_obj, **kwargs):
         """
@@ -2016,8 +2060,6 @@ class ImageField(FileField):
     :param size: max size to store images, provided as (width, height, force)
         if larger, it will be automatically resized (ex: size=(800, 600, True))
     :param thumbnail_size: size to generate a thumbnail, provided as (width, height, force)
-
-    .. versionadded:: 0.6
     """

     proxy_class = ImageGridFsProxy
@@ -2042,7 +2084,7 @@ class ImageField(FileField):
 class SequenceField(BaseField):
     """Provides a sequential counter see:
-    http://www.mongodb.org/display/DOCS/Object+IDs#ObjectIDs-SequenceNumbers
+    https://docs.mongodb.com/manual/reference/method/ObjectId/#ObjectIDs-SequenceNumbers

     .. note::
@@ -2065,9 +2107,6 @@ class SequenceField(BaseField):
     In case the counter is defined in the abstract document, it will be
     common to all inherited documents and the default sequence name will
     be the class name of the abstract document.
-
-    .. versionadded:: 0.5
-    .. versionchanged:: 0.8 added `value_decorator`
     """

     _auto_gen = True
@@ -2181,8 +2220,6 @@ class SequenceField(BaseField):
 class UUIDField(BaseField):
     """A UUID field.
-
-    .. versionadded:: 0.6
     """

     _binary = None
@@ -2192,9 +2229,6 @@ class UUIDField(BaseField):
         Store UUID data in the database

         :param binary: if False store as a string.
-
-        .. versionchanged:: 0.8.0
-        .. versionchanged:: 0.6.19
         """
         self._binary = binary
         super().__init__(**kwargs)
@@ -2239,8 +2273,6 @@ class GeoPointField(BaseField):
     representing a geo point. It admits 2d indexes but not "2dsphere" indexes
     in MongoDB > 2.4 which are more natural for modeling geospatial points.
     See :ref:`geospatial-indexes`
-
-    .. versionadded:: 0.4
     """

     _geo_index = pymongo.GEO2D
@@ -2272,8 +2304,6 @@ class PointField(GeoJsonBaseField):
     to set the value.

     Requires mongodb >= 2.4
-
-    .. versionadded:: 0.8
     """

     _type = "Point"
@@ -2292,8 +2322,6 @@ class LineStringField(GeoJsonBaseField):
     You can either pass a dict with the full information or a list of points.

     Requires mongodb >= 2.4
-
-    .. versionadded:: 0.8
     """

     _type = "LineString"
@@ -2315,8 +2343,6 @@ class PolygonField(GeoJsonBaseField):
     holes.

     Requires mongodb >= 2.4
-
-    .. versionadded:: 0.8
     """

     _type = "Polygon"
@@ -2336,8 +2362,6 @@ class MultiPointField(GeoJsonBaseField):
     to set the value.

     Requires mongodb >= 2.6
-
-    .. versionadded:: 0.9
     """

     _type = "MultiPoint"
@@ -2357,8 +2381,6 @@ class MultiLineStringField(GeoJsonBaseField):
     You can either pass a dict with the full information or a list of points.

     Requires mongodb >= 2.6
-
-    .. versionadded:: 0.9
     """

     _type = "MultiLineString"
@@ -2385,8 +2407,6 @@ class MultiPolygonField(GeoJsonBaseField):
     of Polygons.

     Requires mongodb >= 2.6
-
-    .. versionadded:: 0.9
     """

     _type = "MultiPolygon"
@@ -2399,8 +2419,6 @@ class LazyReferenceField(BaseField):
     Instead, access will return a :class:`~mongoengine.base.LazyReference` class
     instance, allowing access to `pk` or manual dereference by using
     ``fetch()`` method.
-
-    .. versionadded:: 0.15
     """
     def __init__(
@@ -2503,6 +2521,7 @@ class LazyReferenceField(BaseField):
         if not isinstance(value, (DBRef, Document, EmbeddedDocument)):
             collection = self.document_type._get_collection_name()
             value = DBRef(collection, self.document_type.id.to_python(value))
+        value = self.build_lazyref(value)
         return value

     def validate(self, value):
@@ -2563,8 +2582,6 @@ class GenericLazyReferenceField(GenericReferenceField):
       it.

     * You can use the choices param to limit the acceptable Document types
-
-    .. versionadded:: 0.15
     """

     def __init__(self, *args, **kwargs):

mongoengine/pymongo_support.py

@@ -2,6 +2,7 @@
 Helper functions, constants, and types to aid with PyMongo v2.7 - v3.x support.
 """
 import pymongo
+from pymongo.errors import OperationFailure

 _PYMONGO_37 = (3, 7)
@@ -10,13 +11,41 @@ PYMONGO_VERSION = tuple(pymongo.version_tuple[:2])
 IS_PYMONGO_GTE_37 = PYMONGO_VERSION >= _PYMONGO_37


-def count_documents(collection, filter):
-    """Pymongo>3.7 deprecates count in favour of count_documents"""
+def count_documents(
+    collection, filter, skip=None, limit=None, hint=None, collation=None
+):
+    """Pymongo>3.7 deprecates count in favour of count_documents
+    """
+    if limit == 0:
+        return 0  # Pymongo raises an OperationFailure if called with limit=0
+
+    kwargs = {}
+    if skip is not None:
+        kwargs["skip"] = skip
+    if limit is not None:
+        kwargs["limit"] = limit
+    if hint not in (-1, None):
+        kwargs["hint"] = hint
+    if collation is not None:
+        kwargs["collation"] = collation
+
+    # count_documents appeared in pymongo 3.7
     if IS_PYMONGO_GTE_37:
-        return collection.count_documents(filter)
-    else:
-        count = collection.find(filter).count()
-        return count
+        try:
+            return collection.count_documents(filter=filter, **kwargs)
+        except OperationFailure:
+            # OperationFailure - accounts for some operators that used to work
+            # with .count but are no longer working with count_documents (i.e $geoNear, $near, and $nearSphere)
+            # fallback to deprecated Cursor.count
+            # Keeping this should be reevaluated the day pymongo removes .count entirely
+            pass
+
+    cursor = collection.find(filter)
+    for option, option_value in kwargs.items():
+        cursor_method = getattr(cursor, option)
+        cursor = cursor_method(option_value)
+    with_limit_and_skip = "skip" in kwargs or "limit" in kwargs
+    return cursor.count(with_limit_and_skip=with_limit_and_skip)


 def list_collection_names(db, include_system_collections=False):
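The legacy fallback in `count_documents` applies `skip`/`limit`/`hint`/`collation` by looking up a cursor method of the same name and chaining the result. That dispatch can be sketched against a stand-in cursor (both the class and the helper below are hypothetical illustrations, not pymongo API):

```python
class FakeCursor:
    """Stand-in for pymongo's Cursor: each option method records its
    value and returns the cursor so the calls can be chained."""
    def __init__(self):
        self.applied = {}

    def skip(self, n):
        self.applied["skip"] = n
        return self

    def limit(self, n):
        self.applied["limit"] = n
        return self

def apply_options(cursor, kwargs):
    # Same dispatch as the fallback loop: resolve the method by option name
    for option, option_value in kwargs.items():
        cursor = getattr(cursor, option)(option_value)
    return cursor

cursor = apply_options(FakeCursor(), {"skip": 5, "limit": 10})
print(cursor.applied)  # {'skip': 5, 'limit': 10}
```

The design choice here is that pymongo's cursor options happen to share names with `count_documents` keyword arguments, so one kwargs dict can drive both code paths.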

mongoengine/queryset/base.py

@@ -29,6 +29,7 @@ from mongoengine.errors import (
     NotUniqueError,
     OperationError,
 )
+from mongoengine.pymongo_support import count_documents
 from mongoengine.queryset import transform
 from mongoengine.queryset.field_list import QueryFieldList
 from mongoengine.queryset.visitor import Q, QNode
@@ -83,13 +84,20 @@ class BaseQuerySet:
         self._cursor_obj = None
         self._limit = None
         self._skip = None
+
         self._hint = -1  # Using -1 as None is a valid value for hint
         self._collation = None
         self._batch_size = None
-        self.only_fields = []
         self._max_time_ms = None
         self._comment = None

+        # Hack - As people expect cursor[5:5] to return
+        # an empty result set. It's hard to do that right, though, because the
+        # server uses limit(0) to mean 'no limit'. So we set _empty
+        # in that case and check for it when iterating. We also unset
+        # it anytime we change _limit. Inspired by how it is done in pymongo.Cursor
+        self._empty = False
+
     def __call__(self, q_obj=None, **query):
         """Filter the selected documents by calling the
         :class:`~mongoengine.queryset.QuerySet` with a query.
@@ -162,6 +170,7 @@ class BaseQuerySet:
             [<User: User object>, <User: User object>]
         """
         queryset = self.clone()
+        queryset._empty = False

         # Handle a slice
         if isinstance(key, slice):
@@ -169,6 +178,8 @@ class BaseQuerySet:
             queryset._skip, queryset._limit = key.start, key.stop
             if key.start and key.stop:
                 queryset._limit = key.stop - key.start
+            if queryset._limit == 0:
+                queryset._empty = True

             # Allow further QuerySet modifications to be performed
             return queryset
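The slice handling above turns `qs[a:b]` into `skip=a`, `limit=b-a`, and flags the degenerate `qs[5:5]` case as empty, because sending `limit=0` to the server would mean "no limit". A standalone sketch of that arithmetic (the function name is hypothetical):

```python
def slice_to_query(key):
    """Mirror __getitem__'s slice handling: return (skip, limit, empty)."""
    skip, limit = key.start, key.stop
    if key.start and key.stop:
        # A bounded slice: the server wants a count, not an end index
        limit = key.stop - key.start
    empty = limit == 0
    return skip, limit, empty

print(slice_to_query(slice(5, 10)))   # (5, 5, False)
print(slice_to_query(slice(5, 5)))    # (5, 0, True)  -> the _empty case
print(slice_to_query(slice(None, 3))) # (None, 3, False)
```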
@@ -178,9 +189,7 @@ class BaseQuerySet:
             if queryset._scalar:
                 return queryset._get_scalar(
                     queryset._document._from_son(
-                        queryset._cursor[key],
-                        _auto_dereference=self._auto_dereference,
-                        only_fields=self.only_fields,
+                        queryset._cursor[key], _auto_dereference=self._auto_dereference,
                     )
                 )
@@ -188,9 +197,7 @@ class BaseQuerySet:
                 return queryset._cursor[key]

             return queryset._document._from_son(
-                queryset._cursor[key],
-                _auto_dereference=self._auto_dereference,
-                only_fields=self.only_fields,
+                queryset._cursor[key], _auto_dereference=self._auto_dereference,
             )

         raise TypeError("Provide a slice or an integer index")
@@ -249,8 +256,6 @@ class BaseQuerySet:
         `DocumentName.MultipleObjectsReturned` exception if multiple results
         and :class:`~mongoengine.queryset.DoesNotExist` or
         `DocumentName.DoesNotExist` if no results are found.
-
-        .. versionadded:: 0.3
         """
         queryset = self.clone()
         queryset = queryset.order_by().limit(2)
@@ -275,8 +280,6 @@ class BaseQuerySet:
     def create(self, **kwargs):
         """Create new object. Returns the saved object instance.
-
-        .. versionadded:: 0.4
         """
         return self._document(**kwargs).save(force_insert=True)
@@ -309,10 +312,6 @@ class BaseQuerySet:
         By default returns document instances, set ``load_bulk`` to False to
         return just ``ObjectIds``
-
-        .. versionadded:: 0.5
-        .. versionchanged:: 0.10.7
-            Add signal_kwargs argument
         """

         Document = _import_class("Document")
@@ -394,9 +393,36 @@ class BaseQuerySet:
         :meth:`skip` that has been applied to this cursor into account when
         getting the count
         """
-        if self._limit == 0 and with_limit_and_skip is False or self._none:
+        # mimic the fact that setting .limit(0) in pymongo sets no limit
+        # https://docs.mongodb.com/manual/reference/method/cursor.limit/#zero-value
+        if (
+            self._limit == 0
+            and with_limit_and_skip is False
+            or self._none
+            or self._empty
+        ):
             return 0
-        count = self._cursor.count(with_limit_and_skip=with_limit_and_skip)
+
+        kwargs = (
+            {"limit": self._limit, "skip": self._skip} if with_limit_and_skip else {}
+        )
+
+        if self._limit == 0:
+            # mimic the fact that historically .limit(0) sets no limit
+            kwargs.pop("limit", None)
+
+        if self._hint not in (-1, None):
+            kwargs["hint"] = self._hint
+
+        if self._collation:
+            kwargs["collation"] = self._collation
+
+        count = count_documents(
+            collection=self._cursor.collection,
+            filter=self._cursor._Cursor__spec,
+            **kwargs
+        )
+
         self._cursor_obj = None
         return count
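The guard at the top of `count()` leans on Python's operator precedence: `and` binds tighter than `or`, so the original one-liner already parsed as `(limit == 0 and not with_limit_and_skip) or none`, and the multi-line rewrite keeps that grouping while appending `or empty`. A small demonstration (the helper name is hypothetical):

```python
def returns_zero(limit, with_limit_and_skip, none, empty):
    # Parenthesized exactly as Python parses the condition in count():
    # 'and' binds tighter than 'or'
    return (limit == 0 and with_limit_and_skip is False) or none or empty

# limit(0) only short-circuits when limit/skip are being ignored...
print(returns_zero(0, False, False, False))   # True
print(returns_zero(0, True, False, False))    # False
# ...while an explicitly empty slice (qs[5:5]) always does
print(returns_zero(None, True, False, True))  # True
```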
@@ -516,8 +542,6 @@ class BaseQuerySet:
         :param update: Django-style update keyword arguments
         :returns the number of updated documents (unless ``full_result`` is True)
-
-        .. versionadded:: 0.2
         """
         if not update and not upsert:
             raise OperationError("No update parameters, would remove data")
@@ -569,8 +593,6 @@ class BaseQuerySet:
         :param update: Django-style update keyword arguments
         :returns the new or overwritten document
-
-        .. versionadded:: 0.10.2
         """

         atomic_update = self.update(
@@ -604,7 +626,6 @@ class BaseQuerySet:
         :param update: Django-style update keyword arguments
             full_result
         :returns the number of updated documents (unless ``full_result`` is True)
-        .. versionadded:: 0.2
         """
         return self.update(
             upsert=upsert,
@@ -636,8 +657,6 @@ class BaseQuerySet:
         :param new: return updated rather than original document
             (default ``False``)
         :param update: Django-style update keyword arguments
-
-        .. versionadded:: 0.9
         """

         if remove and new:
@@ -680,12 +699,10 @@ class BaseQuerySet:
         if full_response:
             if result["value"] is not None:
-                result["value"] = self._document._from_son(
-                    result["value"], only_fields=self.only_fields
-                )
+                result["value"] = self._document._from_son(result["value"])
         else:
             if result is not None:
-                result = self._document._from_son(result, only_fields=self.only_fields)
+                result = self._document._from_son(result)

         return result
@@ -695,8 +712,6 @@ class BaseQuerySet:
         `None` if no document exists with that id.

         :param object_id: the value for the id of the document to look up
-
-        .. versionchanged:: 0.6 Raises InvalidQueryError if filter has been set
         """
         queryset = self.clone()
         if not queryset._query_obj.empty:
@@ -710,32 +725,28 @@ class BaseQuerySet:
:param object_ids: a list or tuple of ObjectId's :param object_ids: a list or tuple of ObjectId's
:rtype: dict of ObjectId's as keys and collection-specific :rtype: dict of ObjectId's as keys and collection-specific
Document subclasses as values. Document subclasses as values.
.. versionadded:: 0.3
""" """
doc_map = {} doc_map = {}
docs = self._collection.find({"_id": {"$in": object_ids}}, **self._cursor_args) docs = self._collection.find({"_id": {"$in": object_ids}}, **self._cursor_args)
if self._scalar: if self._scalar:
for doc in docs: for doc in docs:
doc_map[doc["_id"]] = self._get_scalar( doc_map[doc["_id"]] = self._get_scalar(self._document._from_son(doc))
self._document._from_son(doc, only_fields=self.only_fields)
)
elif self._as_pymongo: elif self._as_pymongo:
for doc in docs: for doc in docs:
doc_map[doc["_id"]] = doc doc_map[doc["_id"]] = doc
else: else:
for doc in docs: for doc in docs:
doc_map[doc["_id"]] = self._document._from_son( doc_map[doc["_id"]] = self._document._from_son(
doc, doc, _auto_dereference=self._auto_dereference,
only_fields=self.only_fields,
_auto_dereference=self._auto_dereference,
) )
return doc_map return doc_map
def none(self): def none(self):
"""Helper that just returns a list""" """Returns a queryset that never returns any objects and no query will be executed when accessing the results
inspired by django none() https://docs.djangoproject.com/en/dev/ref/models/querysets/#none
"""
queryset = self.clone() queryset = self.clone()
queryset._none = True queryset._none = True
return queryset return queryset
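The `none()` hunk above changes the docstring to describe Django-style behaviour: the queryset yields nothing and executes no query. A minimal sketch (a stand-in class, not the real `BaseQuerySet`) of how a `_none` flag can short-circuit iteration:

```python
# Sketch only: shows how a `_none` flag makes a queryset yield nothing
# without ever touching the database, as the new docstring describes.
class FakeQuerySet:
    def __init__(self, docs):
        self._docs = docs
        self._none = False

    def none(self):
        # Clone, then mark the clone as "never returns anything".
        clone = FakeQuerySet(self._docs)
        clone._none = True
        return clone

    def __iter__(self):
        if self._none:
            return iter(())  # no query is executed
        return iter(self._docs)


qs = FakeQuerySet([1, 2, 3])
assert list(qs.none()) == []      # the none() clone is always empty
assert list(qs) == [1, 2, 3]      # the original queryset is untouched
```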
@@ -755,8 +766,6 @@ class BaseQuerySet:
evaluated against if you are using more than one database. evaluated against if you are using more than one database.
:param alias: The database alias :param alias: The database alias
.. versionadded:: 0.9
""" """
with switch_db(self._document, alias) as cls: with switch_db(self._document, alias) as cls:
@@ -789,16 +798,17 @@ class BaseQuerySet:
"_snapshot", "_snapshot",
"_timeout", "_timeout",
"_read_preference", "_read_preference",
"_read_concern",
"_iter", "_iter",
"_scalar", "_scalar",
"_as_pymongo", "_as_pymongo",
"_limit", "_limit",
"_skip", "_skip",
"_empty",
"_hint", "_hint",
"_collation", "_collation",
"_auto_dereference", "_auto_dereference",
"_search_text", "_search_text",
"only_fields",
"_max_time_ms", "_max_time_ms",
"_comment", "_comment",
"_batch_size", "_batch_size",
@@ -817,8 +827,6 @@ class BaseQuerySet:
"""Handles dereferencing of :class:`~bson.dbref.DBRef` objects or """Handles dereferencing of :class:`~bson.dbref.DBRef` objects or
:class:`~bson.object_id.ObjectId` a maximum depth in order to cut down :class:`~bson.object_id.ObjectId` a maximum depth in order to cut down
the number queries to mongodb. the number queries to mongodb.
.. versionadded:: 0.5
""" """
# Make select related work the same for querysets # Make select related work the same for querysets
max_depth += 1 max_depth += 1
@@ -834,6 +842,7 @@ class BaseQuerySet:
""" """
queryset = self.clone() queryset = self.clone()
queryset._limit = n queryset._limit = n
queryset._empty = False # cancels the effect of empty
# If a cursor object has already been created, apply the limit to it. # If a cursor object has already been created, apply the limit to it.
if queryset._cursor_obj: if queryset._cursor_obj:
@@ -866,8 +875,6 @@ class BaseQuerySet:
Hinting will not do anything if the corresponding index does not exist. Hinting will not do anything if the corresponding index does not exist.
The last hint applied to this cursor takes precedence over all others. The last hint applied to this cursor takes precedence over all others.
.. versionadded:: 0.5
""" """
queryset = self.clone() queryset = self.clone()
queryset._hint = index queryset._hint = index
@@ -929,10 +936,6 @@ class BaseQuerySet:
.. note:: This is a command and won't take ordering or limit into .. note:: This is a command and won't take ordering or limit into
account. account.
.. versionadded:: 0.4
.. versionchanged:: 0.5 - Fixed handling references
.. versionchanged:: 0.6 - Improved db_field refrence handling
""" """
queryset = self.clone() queryset = self.clone()
@@ -996,12 +999,8 @@ class BaseQuerySet:
field filters. field filters.
:param fields: fields to include :param fields: fields to include
.. versionadded:: 0.3
.. versionchanged:: 0.5 - Added subfield support
""" """
fields = {f: QueryFieldList.ONLY for f in fields} fields = {f: QueryFieldList.ONLY for f in fields}
self.only_fields = list(fields.keys())
return self.fields(True, **fields) return self.fields(True, **fields)
def exclude(self, *fields): def exclude(self, *fields):
@@ -1018,8 +1017,6 @@ class BaseQuerySet:
field filters. field filters.
:param fields: fields to exclude :param fields: fields to exclude
.. versionadded:: 0.5
""" """
fields = {f: QueryFieldList.EXCLUDE for f in fields} fields = {f: QueryFieldList.EXCLUDE for f in fields}
return self.fields(**fields) return self.fields(**fields)
@@ -1046,8 +1043,6 @@ class BaseQuerySet:
:param kwargs: A set of keyword arguments identifying what to :param kwargs: A set of keyword arguments identifying what to
include, exclude, or slice. include, exclude, or slice.
.. versionadded:: 0.5
""" """
# Check for an operator and transform to mongo-style if there is # Check for an operator and transform to mongo-style if there is
@@ -1089,8 +1084,6 @@ class BaseQuerySet:
.exclude(). :: .exclude(). ::
post = BlogPost.objects.exclude('comments').all_fields() post = BlogPost.objects.exclude('comments').all_fields()
.. versionadded:: 0.5
""" """
queryset = self.clone() queryset = self.clone()
queryset._loaded_fields = QueryFieldList( queryset._loaded_fields = QueryFieldList(
@@ -1163,9 +1156,6 @@ class BaseQuerySet:
"""Enable or disable snapshot mode when querying. """Enable or disable snapshot mode when querying.
:param enabled: whether or not snapshot mode is enabled :param enabled: whether or not snapshot mode is enabled
..versionchanged:: 0.5 - made chainable
.. deprecated:: Ignored with PyMongo 3+
""" """
msg = "snapshot is deprecated as it has no impact when using PyMongo 3+." msg = "snapshot is deprecated as it has no impact when using PyMongo 3+."
warnings.warn(msg, DeprecationWarning) warnings.warn(msg, DeprecationWarning)
@@ -1177,8 +1167,6 @@ class BaseQuerySet:
"""Enable or disable the default mongod timeout when querying. (no_cursor_timeout option) """Enable or disable the default mongod timeout when querying. (no_cursor_timeout option)
:param enabled: whether or not the timeout is used :param enabled: whether or not the timeout is used
..versionchanged:: 0.5 - made chainable
""" """
queryset = self.clone() queryset = self.clone()
queryset._timeout = enabled queryset._timeout = enabled
@@ -1266,10 +1254,7 @@ class BaseQuerySet:
def from_json(self, json_data): def from_json(self, json_data):
"""Converts json data to unsaved objects""" """Converts json data to unsaved objects"""
son_data = json_util.loads(json_data) son_data = json_util.loads(json_data)
return [ return [self._document._from_son(data) for data in son_data]
self._document._from_son(data, only_fields=self.only_fields)
for data in son_data
]
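The simplified `from_json()` above drops the `only_fields` plumbing and just builds one document per deserialized element. A rough sketch of that path, using the stdlib `json` module and a hypothetical `FakeDocument` in place of `bson.json_util` and a real mongoengine document:

```python
import json


# Stand-in for a mongoengine Document; only illustrates the _from_son hook.
class FakeDocument:
    def __init__(self, data):
        self.__dict__.update(data)

    @classmethod
    def _from_son(cls, son):
        return cls(son)


def from_json(json_data):
    """Converts json data to unsaved objects (sketch of the new code path)."""
    son_data = json.loads(json_data)
    return [FakeDocument._from_son(data) for data in son_data]


docs = from_json('[{"name": "a"}, {"name": "b"}]')
assert [d.name for d in docs] == ["a", "b"]
```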
def aggregate(self, pipeline, *suppl_pipeline, **kwargs): def aggregate(self, pipeline, *suppl_pipeline, **kwargs):
"""Perform a aggregate function based in your queryset params """Perform a aggregate function based in your queryset params
@@ -1280,7 +1265,6 @@ class BaseQuerySet:
parameter will be removed shortly parameter will be removed shortly
:param kwargs: (optional) kwargs dictionary to be passed to pymongo's aggregate call :param kwargs: (optional) kwargs dictionary to be passed to pymongo's aggregate call
See https://api.mongodb.com/python/current/api/pymongo/collection.html#pymongo.collection.Collection.aggregate See https://api.mongodb.com/python/current/api/pymongo/collection.html#pymongo.collection.Collection.aggregate
.. versionadded:: 0.9
""" """
using_deprecated_interface = isinstance(pipeline, dict) or bool(suppl_pipeline) using_deprecated_interface = isinstance(pipeline, dict) or bool(suppl_pipeline)
user_pipeline = [pipeline] if isinstance(pipeline, dict) else list(pipeline) user_pipeline = [pipeline] if isinstance(pipeline, dict) else list(pipeline)
@@ -1311,10 +1295,11 @@ class BaseQuerySet:
final_pipeline = initial_pipeline + user_pipeline final_pipeline = initial_pipeline + user_pipeline
collection = self._collection collection = self._collection
if self._read_preference is not None: if self._read_preference is not None or self._read_concern is not None:
collection = self._collection.with_options( collection = self._collection.with_options(
read_preference=self._read_preference read_preference=self._read_preference, read_concern=self._read_concern
) )
return collection.aggregate(final_pipeline, cursor={}, **kwargs) return collection.aggregate(final_pipeline, cursor={}, **kwargs)
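The `aggregate()` hunk widens the guard: `with_options()` is now applied when either a read preference or a read concern is set, and both are forwarded together. A sketch of just that branching logic (the collection class and option values are stand-ins, not pymongo objects):

```python
# Sketch of the new guard in aggregate(): apply with_options() when either
# read_preference OR read_concern is configured on the queryset.
class FakeCollection:
    def __init__(self):
        self.options = {}

    def with_options(self, **opts):
        clone = FakeCollection()
        clone.options = opts
        return clone


def pick_collection(coll, read_preference=None, read_concern=None):
    if read_preference is not None or read_concern is not None:
        return coll.with_options(
            read_preference=read_preference, read_concern=read_concern
        )
    return coll


base = FakeCollection()
assert pick_collection(base) is base  # nothing configured: untouched
tuned = pick_collection(base, read_concern="majority")
assert tuned.options == {"read_preference": None, "read_concern": "majority"}
```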
# JS functionality # JS functionality
@@ -1351,12 +1336,6 @@ class BaseQuerySet:
Map/Reduce changed in server version **>= 1.7.4**. The PyMongo Map/Reduce changed in server version **>= 1.7.4**. The PyMongo
:meth:`~pymongo.collection.Collection.map_reduce` helper requires :meth:`~pymongo.collection.Collection.map_reduce` helper requires
PyMongo version **>= 1.11**. PyMongo version **>= 1.11**.
.. versionchanged:: 0.5
- removed ``keep_temp`` keyword argument, which was only relevant
for MongoDB server versions older than 1.7.4
.. versionadded:: 0.3
""" """
queryset = self.clone() queryset = self.clone()
@@ -1493,8 +1472,6 @@ class BaseQuerySet:
.. note:: When using this mode of query, the database will call your .. note:: When using this mode of query, the database will call your
function, or evaluate your predicate clause, for each object function, or evaluate your predicate clause, for each object
in the collection. in the collection.
.. versionadded:: 0.5
""" """
queryset = self.clone() queryset = self.clone()
where_clause = queryset._sub_js_fields(where_clause) where_clause = queryset._sub_js_fields(where_clause)
@@ -1571,9 +1548,6 @@ class BaseQuerySet:
:param field: the field to use :param field: the field to use
:param normalize: normalize the results so they add to 1.0 :param normalize: normalize the results so they add to 1.0
:param map_reduce: Use map_reduce over exec_js :param map_reduce: Use map_reduce over exec_js
.. versionchanged:: 0.5 defaults to map_reduce and can handle embedded
document lookups
""" """
if map_reduce: if map_reduce:
return self._item_frequencies_map_reduce(field, normalize=normalize) return self._item_frequencies_map_reduce(field, normalize=normalize)
@@ -1584,7 +1558,7 @@ class BaseQuerySet:
def __next__(self): def __next__(self):
"""Wrap the result in a :class:`~mongoengine.Document` object. """Wrap the result in a :class:`~mongoengine.Document` object.
""" """
if self._limit == 0 or self._none: if self._none or self._empty:
raise StopIteration raise StopIteration
raw_doc = next(self._cursor) raw_doc = next(self._cursor)
@@ -1593,9 +1567,7 @@ class BaseQuerySet:
return raw_doc return raw_doc
doc = self._document._from_son( doc = self._document._from_son(
raw_doc, raw_doc, _auto_dereference=self._auto_dereference,
_auto_dereference=self._auto_dereference,
only_fields=self.only_fields,
) )
if self._scalar: if self._scalar:
@@ -1603,12 +1575,8 @@ class BaseQuerySet:
return doc return doc
next = __next__ # For Python2 support
def rewind(self): def rewind(self):
"""Rewind the cursor to its unevaluated state. """Rewind the cursor to its unevaluated state.
.. versionadded:: 0.3
""" """
self._iter = False self._iter = False
self._cursor.rewind() self._cursor.rewind()


@@ -144,14 +144,13 @@ class QuerySet(BaseQuerySet):
return super().count(with_limit_and_skip) return super().count(with_limit_and_skip)
if self._len is None: if self._len is None:
# cache the length
self._len = super().count(with_limit_and_skip) self._len = super().count(with_limit_and_skip)
return self._len return self._len
def no_cache(self): def no_cache(self):
"""Convert to a non-caching queryset """Convert to a non-caching queryset
.. versionadded:: 0.8.3 Convert to non caching queryset
""" """
if self._result_cache is not None: if self._result_cache is not None:
raise OperationError("QuerySet already cached") raise OperationError("QuerySet already cached")
@@ -164,15 +163,11 @@ class QuerySetNoCache(BaseQuerySet):
def cache(self): def cache(self):
"""Convert to a caching queryset """Convert to a caching queryset
.. versionadded:: 0.8.3 Convert to caching queryset
""" """
return self._clone_into(QuerySet(self._document, self._collection)) return self._clone_into(QuerySet(self._document, self._collection))
def __repr__(self): def __repr__(self):
"""Provides the string representation of the QuerySet """Provides the string representation of the QuerySet
.. versionchanged:: 0.6.13 Now doesnt modify the cursor
""" """
if self._iter: if self._iter:
return ".. queryset mid-iteration .." return ".. queryset mid-iteration .."


@@ -7,6 +7,11 @@ from mongoengine.queryset import transform
__all__ = ("Q", "QNode") __all__ = ("Q", "QNode")
def warn_empty_is_deprecated():
msg = "'empty' property is deprecated in favour of using 'not bool(filter)'"
warnings.warn(msg, DeprecationWarning, stacklevel=2)
class QNodeVisitor: class QNodeVisitor:
"""Base visitor class for visiting Q-object nodes in a query tree. """Base visitor class for visiting Q-object nodes in a query tree.
""" """
@@ -98,19 +103,18 @@ class QNode:
object. object.
""" """
# If the other Q() is empty, ignore it and just use `self`. # If the other Q() is empty, ignore it and just use `self`.
if getattr(other, "empty", True): if not bool(other):
return self return self
# Or if this Q is empty, ignore it and just use `other`. # Or if this Q is empty, ignore it and just use `other`.
if self.empty: if not bool(self):
return other return other
return QCombination(operation, [self, other]) return QCombination(operation, [self, other])
@property @property
def empty(self): def empty(self):
msg = "'empty' property is deprecated in favour of using 'not bool(filter)'" warn_empty_is_deprecated()
warnings.warn(msg, DeprecationWarning)
return False return False
def __or__(self, other): def __or__(self, other):
@@ -152,8 +156,7 @@ class QCombination(QNode):
@property @property
def empty(self): def empty(self):
msg = "'empty' property is deprecated in favour of using 'not bool(filter)'" warn_empty_is_deprecated()
warnings.warn(msg, DeprecationWarning)
return not bool(self.children) return not bool(self.children)
def __eq__(self, other): def __eq__(self, other):
@@ -186,4 +189,5 @@ class Q(QNode):
@property @property
def empty(self): def empty(self):
warn_empty_is_deprecated()
return not bool(self.query) return not bool(self.query)
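The Q-node hunks above do two things: factor the repeated deprecation warning into `warn_empty_is_deprecated()` (with `stacklevel=2` so the warning points at the caller), and replace internal uses of the deprecated `empty` property with `not bool(...)`. A condensed sketch of both pieces on a toy `Q` class:

```python
import warnings


def warn_empty_is_deprecated():
    # stacklevel=2 attributes the warning to the code accessing `empty`,
    # not to this helper, mirroring the change above.
    msg = "'empty' property is deprecated in favour of using 'not bool(filter)'"
    warnings.warn(msg, DeprecationWarning, stacklevel=2)


class Q:
    def __init__(self, **query):
        self.query = query

    def __bool__(self):
        # Truthiness replaces the deprecated `empty` check internally.
        return bool(self.query)

    @property
    def empty(self):
        warn_empty_is_deprecated()
        return not bool(self.query)


assert not bool(Q())           # empty filter is falsy
assert bool(Q(name="foo"))     # non-empty filter is truthy
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    assert Q().empty is True   # still works, but warns
assert caught[0].category is DeprecationWarning
```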


@@ -1,3 +0,0 @@
pymongo>=3.4
Sphinx==1.5.5
sphinx-rtd-theme==0.2.4


@@ -115,7 +115,7 @@ extra_opts = {
"pytest-cov", "pytest-cov",
"coverage<5.0", # recent coverage switched to sqlite format for the .coverage file which isn't handled properly by coveralls "coverage<5.0", # recent coverage switched to sqlite format for the .coverage file which isn't handled properly by coveralls
"blinker", "blinker",
"Pillow>=2.0.0, <7.0.0", # 7.0.0 dropped Python2 support "Pillow>=7.0.0",
], ],
} }


@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
import unittest import unittest
from mongoengine import * from mongoengine import *


@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
import unittest import unittest
from bson import SON from bson import SON
@@ -29,7 +28,8 @@ class TestDelta(MongoDBTestCase):
self.delta(Document) self.delta(Document)
self.delta(DynamicDocument) self.delta(DynamicDocument)
def delta(self, DocClass): @staticmethod
def delta(DocClass):
class Doc(DocClass): class Doc(DocClass):
string_field = StringField() string_field = StringField()
int_field = IntField() int_field = IntField()
@@ -428,13 +428,20 @@ class TestDelta(MongoDBTestCase):
assert doc.dict_field == {"hello": "world"} assert doc.dict_field == {"hello": "world"}
assert doc.list_field == ["1", 2, {"hello": "world"}] assert doc.list_field == ["1", 2, {"hello": "world"}]
def test_delta_recursive_db_field(self): def test_delta_recursive_db_field_on_doc_and_embeddeddoc(self):
self.delta_recursive_db_field(Document, EmbeddedDocument) self.delta_recursive_db_field(Document, EmbeddedDocument)
def test_delta_recursive_db_field_on_doc_and_dynamicembeddeddoc(self):
self.delta_recursive_db_field(Document, DynamicEmbeddedDocument) self.delta_recursive_db_field(Document, DynamicEmbeddedDocument)
def test_delta_recursive_db_field_on_dynamicdoc_and_embeddeddoc(self):
self.delta_recursive_db_field(DynamicDocument, EmbeddedDocument) self.delta_recursive_db_field(DynamicDocument, EmbeddedDocument)
def test_delta_recursive_db_field_on_dynamicdoc_and_dynamicembeddeddoc(self):
self.delta_recursive_db_field(DynamicDocument, DynamicEmbeddedDocument) self.delta_recursive_db_field(DynamicDocument, DynamicEmbeddedDocument)
def delta_recursive_db_field(self, DocClass, EmbeddedClass): @staticmethod
def delta_recursive_db_field(DocClass, EmbeddedClass):
class Embedded(EmbeddedClass): class Embedded(EmbeddedClass):
string_field = StringField(db_field="db_string_field") string_field = StringField(db_field="db_string_field")
int_field = IntField(db_field="db_int_field") int_field = IntField(db_field="db_int_field")
@@ -487,6 +494,7 @@ class TestDelta(MongoDBTestCase):
doc = doc.reload(10) doc = doc.reload(10)
assert doc.embedded_field.dict_field == {} assert doc.embedded_field.dict_field == {}
assert doc._get_changed_fields() == []
doc.embedded_field.list_field = [] doc.embedded_field.list_field = []
assert doc._get_changed_fields() == ["db_embedded_field.db_list_field"] assert doc._get_changed_fields() == ["db_embedded_field.db_list_field"]
assert doc.embedded_field._delta() == ({}, {"db_list_field": 1}) assert doc.embedded_field._delta() == ({}, {"db_list_field": 1})
@@ -537,6 +545,7 @@ class TestDelta(MongoDBTestCase):
{}, {},
) )
doc.save() doc.save()
assert doc._get_changed_fields() == []
doc = doc.reload(10) doc = doc.reload(10)
assert doc.embedded_field.list_field[0] == "1" assert doc.embedded_field.list_field[0] == "1"
@@ -634,6 +643,7 @@ class TestDelta(MongoDBTestCase):
doc.save() doc.save()
doc = doc.reload(10) doc = doc.reload(10)
assert doc._delta() == ({}, {},)
del doc.embedded_field.list_field[2].list_field del doc.embedded_field.list_field[2].list_field
assert doc._delta() == ( assert doc._delta() == (
{}, {},
@@ -732,12 +742,12 @@ class TestDelta(MongoDBTestCase):
assert organization._get_changed_fields() == [] assert organization._get_changed_fields() == []
updates, removals = organization._delta() updates, removals = organization._delta()
assert {} == removals assert removals == {}
assert {} == updates assert updates == {}
organization.employees.append(person) organization.employees.append(person)
updates, removals = organization._delta() updates, removals = organization._delta()
assert {} == removals assert removals == {}
assert "employees" in updates assert "employees" in updates
def test_delta_with_dbref_false(self): def test_delta_with_dbref_false(self):
@@ -749,12 +759,12 @@ class TestDelta(MongoDBTestCase):
assert organization._get_changed_fields() == [] assert organization._get_changed_fields() == []
updates, removals = organization._delta() updates, removals = organization._delta()
assert {} == removals assert removals == {}
assert {} == updates assert updates == {}
organization.employees.append(person) organization.employees.append(person)
updates, removals = organization._delta() updates, removals = organization._delta()
assert {} == removals assert removals == {}
assert "employees" in updates assert "employees" in updates
def test_nested_nested_fields_mark_as_changed(self): def test_nested_nested_fields_mark_as_changed(self):
@@ -767,19 +777,46 @@ class TestDelta(MongoDBTestCase):
MyDoc.drop_collection() MyDoc.drop_collection()
mydoc = MyDoc( MyDoc(name="testcase1", subs={"a": {"b": EmbeddedDoc(name="foo")}}).save()
name="testcase1", subs={"a": {"b": EmbeddedDoc(name="foo")}}
).save()
mydoc = MyDoc.objects.first() mydoc = MyDoc.objects.first()
subdoc = mydoc.subs["a"]["b"] subdoc = mydoc.subs["a"]["b"]
subdoc.name = "bar" subdoc.name = "bar"
assert ["name"] == subdoc._get_changed_fields() assert subdoc._get_changed_fields() == ["name"]
assert ["subs.a.b.name"] == mydoc._get_changed_fields() assert mydoc._get_changed_fields() == ["subs.a.b.name"]
mydoc._clear_changed_fields() mydoc._clear_changed_fields()
assert [] == mydoc._get_changed_fields() assert mydoc._get_changed_fields() == []
def test_nested_nested_fields_db_field_set__gets_mark_as_changed_and_cleaned(self):
class EmbeddedDoc(EmbeddedDocument):
name = StringField(db_field="db_name")
class MyDoc(Document):
embed = EmbeddedDocumentField(EmbeddedDoc, db_field="db_embed")
name = StringField(db_field="db_name")
MyDoc.drop_collection()
MyDoc(name="testcase1", embed=EmbeddedDoc(name="foo")).save()
mydoc = MyDoc.objects.first()
mydoc.embed.name = "foo1"
assert mydoc.embed._get_changed_fields() == ["db_name"]
assert mydoc._get_changed_fields() == ["db_embed.db_name"]
mydoc = MyDoc.objects.first()
embed = EmbeddedDoc(name="foo2")
embed.name = "bar"
mydoc.embed = embed
assert embed._get_changed_fields() == ["db_name"]
assert mydoc._get_changed_fields() == ["db_embed"]
mydoc._clear_changed_fields()
assert mydoc._get_changed_fields() == []
def test_lower_level_mark_as_changed(self): def test_lower_level_mark_as_changed(self):
class EmbeddedDoc(EmbeddedDocument): class EmbeddedDoc(EmbeddedDocument):
@@ -794,17 +831,17 @@ class TestDelta(MongoDBTestCase):
mydoc = MyDoc.objects.first() mydoc = MyDoc.objects.first()
mydoc.subs["a"] = EmbeddedDoc() mydoc.subs["a"] = EmbeddedDoc()
assert ["subs.a"] == mydoc._get_changed_fields() assert mydoc._get_changed_fields() == ["subs.a"]
subdoc = mydoc.subs["a"] subdoc = mydoc.subs["a"]
subdoc.name = "bar" subdoc.name = "bar"
assert ["name"] == subdoc._get_changed_fields() assert subdoc._get_changed_fields() == ["name"]
assert ["subs.a"] == mydoc._get_changed_fields() assert mydoc._get_changed_fields() == ["subs.a"]
mydoc.save() mydoc.save()
mydoc._clear_changed_fields() mydoc._clear_changed_fields()
assert [] == mydoc._get_changed_fields() assert mydoc._get_changed_fields() == []
def test_upper_level_mark_as_changed(self): def test_upper_level_mark_as_changed(self):
class EmbeddedDoc(EmbeddedDocument): class EmbeddedDoc(EmbeddedDocument):
@@ -821,15 +858,15 @@ class TestDelta(MongoDBTestCase):
subdoc = mydoc.subs["a"] subdoc = mydoc.subs["a"]
subdoc.name = "bar" subdoc.name = "bar"
assert ["name"] == subdoc._get_changed_fields() assert subdoc._get_changed_fields() == ["name"]
assert ["subs.a.name"] == mydoc._get_changed_fields() assert mydoc._get_changed_fields() == ["subs.a.name"]
mydoc.subs["a"] = EmbeddedDoc() mydoc.subs["a"] = EmbeddedDoc()
assert ["subs.a"] == mydoc._get_changed_fields() assert mydoc._get_changed_fields() == ["subs.a"]
mydoc.save() mydoc.save()
mydoc._clear_changed_fields() mydoc._clear_changed_fields()
assert [] == mydoc._get_changed_fields() assert mydoc._get_changed_fields() == []
def test_referenced_object_changed_attributes(self): def test_referenced_object_changed_attributes(self):
"""Ensures that when you save a new reference to a field, the referenced object isn't altered""" """Ensures that when you save a new reference to a field, the referenced object isn't altered"""


@@ -37,6 +37,19 @@ class TestDynamicDocument(MongoDBTestCase):
# Confirm no changes to self.Person # Confirm no changes to self.Person
assert not hasattr(self.Person, "age") assert not hasattr(self.Person, "age")
def test_dynamic_document_parse_values_in_constructor_like_document_do(self):
class ProductDynamicDocument(DynamicDocument):
title = StringField()
price = FloatField()
class ProductDocument(Document):
title = StringField()
price = FloatField()
product = ProductDocument(title="Blabla", price="12.5")
dyn_product = ProductDynamicDocument(title="Blabla", price="12.5")
assert product.price == dyn_product.price == 12.5
def test_change_scope_of_variable(self): def test_change_scope_of_variable(self):
"""Test changing the scope of a dynamic field has no adverse effects""" """Test changing the scope of a dynamic field has no adverse effects"""
p = self.Person() p = self.Person()


@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
import unittest import unittest
from datetime import datetime from datetime import datetime
@@ -551,8 +550,9 @@ class TestIndexes(unittest.TestCase):
assert 5 == query_result.count() assert 5 == query_result.count()
incorrect_collation = {"arndom": "wrdo"} incorrect_collation = {"arndom": "wrdo"}
with pytest.raises(OperationFailure): with pytest.raises(OperationFailure) as exc_info:
BlogPost.objects.collation(incorrect_collation).count() BlogPost.objects.collation(incorrect_collation).count()
assert "Missing expected field" in str(exc_info.value)
query_result = BlogPost.objects.collation({}).order_by("name") query_result = BlogPost.objects.collation({}).order_by("name")
assert [x.name for x in query_result] == sorted(names) assert [x.name for x in query_result] == sorted(names)


@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
import unittest import unittest
import warnings import warnings


@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
import os import os
import pickle import pickle
import unittest import unittest
@@ -188,7 +187,7 @@ class TestDocumentInstance(MongoDBTestCase):
def test_queryset_resurrects_dropped_collection(self): def test_queryset_resurrects_dropped_collection(self):
self.Person.drop_collection() self.Person.drop_collection()
assert [] == list(self.Person.objects()) assert list(self.Person.objects()) == []
# Ensure works correctly with inhertited classes # Ensure works correctly with inhertited classes
class Actor(self.Person): class Actor(self.Person):
@@ -196,7 +195,7 @@ class TestDocumentInstance(MongoDBTestCase):
Actor.objects() Actor.objects()
self.Person.drop_collection() self.Person.drop_collection()
assert [] == list(Actor.objects()) assert list(Actor.objects()) == []
def test_polymorphic_references(self): def test_polymorphic_references(self):
"""Ensure that the correct subclasses are returned from a query """Ensure that the correct subclasses are returned from a query
@@ -501,7 +500,7 @@ class TestDocumentInstance(MongoDBTestCase):
doc.reload() doc.reload()
Animal.drop_collection() Animal.drop_collection()
def test_update_shard_key_routing(self): def test_save_update_shard_key_routing(self):
"""Ensures updating a doc with a specified shard_key includes it in """Ensures updating a doc with a specified shard_key includes it in
the query. the query.
""" """
@@ -529,6 +528,29 @@ class TestDocumentInstance(MongoDBTestCase):
Animal.drop_collection() Animal.drop_collection()
def test_save_create_shard_key_routing(self):
"""Ensures inserting a doc with a specified shard_key includes it in
the query.
"""
class Animal(Document):
_id = UUIDField(binary=False, primary_key=True, default=uuid.uuid4)
is_mammal = BooleanField()
name = StringField()
meta = {"shard_key": ("is_mammal",)}
Animal.drop_collection()
doc = Animal(is_mammal=True, name="Dog")
with query_counter() as q:
doc.save()
query_op = q.db.system.profile.find({"ns": "mongoenginetest.animal"})[0]
assert query_op["op"] == "command"
assert query_op["command"]["findAndModify"] == "animal"
assert set(query_op["command"]["query"].keys()) == set(["_id", "is_mammal"])
Animal.drop_collection()
def test_reload_with_changed_fields(self): def test_reload_with_changed_fields(self):
"""Ensures reloading will not affect changed fields""" """Ensures reloading will not affect changed fields"""
@@ -578,7 +600,8 @@ class TestDocumentInstance(MongoDBTestCase):
doc.embedded_field.list_field.append(1) doc.embedded_field.list_field.append(1)
doc.embedded_field.dict_field["woot"] = "woot" doc.embedded_field.dict_field["woot"] = "woot"
assert doc._get_changed_fields() == [ changed = doc._get_changed_fields()
assert changed == [
"list_field", "list_field",
"dict_field.woot", "dict_field.woot",
"embedded_field.list_field", "embedded_field.list_field",
@@ -3411,7 +3434,7 @@ class TestDocumentInstance(MongoDBTestCase):
assert obj3 != dbref2 assert obj3 != dbref2
assert dbref2 != obj3 assert dbref2 != obj3
def test_default_values(self): def test_default_values_dont_get_override_upon_save_when_only_is_used(self):
class Person(Document): class Person(Document):
created_on = DateTimeField(default=lambda: datetime.utcnow()) created_on = DateTimeField(default=lambda: datetime.utcnow())
name = StringField() name = StringField()
@@ -3799,5 +3822,95 @@ class ObjectKeyTestCase(MongoDBTestCase):
assert book._object_key == {"pk": book.pk, "author__name": "Author"} assert book._object_key == {"pk": book.pk, "author__name": "Author"}
class DBFieldMappingTest(MongoDBTestCase):
def setUp(self):
class Fields(object):
w1 = BooleanField(db_field="w2")
x1 = BooleanField(db_field="x2")
x2 = BooleanField(db_field="x3")
y1 = BooleanField(db_field="y0")
y2 = BooleanField(db_field="y1")
z1 = BooleanField(db_field="z2")
z2 = BooleanField(db_field="z1")
class Doc(Fields, Document):
pass
class DynDoc(Fields, DynamicDocument):
pass
self.Doc = Doc
self.DynDoc = DynDoc
def tearDown(self):
for collection in list_collection_names(self.db):
self.db.drop_collection(collection)
def test_setting_fields_in_constructor_of_strict_doc_uses_model_names(self):
doc = self.Doc(z1=True, z2=False)
assert doc.z1 is True
assert doc.z2 is False
def test_setting_fields_in_constructor_of_dyn_doc_uses_model_names(self):
doc = self.DynDoc(z1=True, z2=False)
assert doc.z1 is True
assert doc.z2 is False
def test_setting_unknown_field_in_constructor_of_dyn_doc_does_not_overwrite_model_fields(
self,
):
doc = self.DynDoc(w2=True)
assert doc.w1 is None
assert doc.w2 is True
def test_unknown_fields_of_strict_doc_do_not_overwrite_dbfields_1(self):
doc = self.Doc()
doc.w2 = True
doc.x3 = True
doc.y0 = True
doc.save()
reloaded = self.Doc.objects.get(id=doc.id)
assert reloaded.w1 is None
assert reloaded.x1 is None
assert reloaded.x2 is None
assert reloaded.y1 is None
assert reloaded.y2 is None
def test_dbfields_are_loaded_to_the_right_modelfield_for_strict_doc_2(self):
doc = self.Doc()
doc.x2 = True
doc.y2 = True
doc.z2 = True
doc.save()
reloaded = self.Doc.objects.get(id=doc.id)
assert (
reloaded.x1,
reloaded.x2,
reloaded.y1,
reloaded.y2,
reloaded.z1,
reloaded.z2,
) == (doc.x1, doc.x2, doc.y1, doc.y2, doc.z1, doc.z2)
def test_dbfields_are_loaded_to_the_right_modelfield_for_dyn_doc_2(self):
doc = self.DynDoc()
doc.x2 = True
doc.y2 = True
doc.z2 = True
doc.save()
reloaded = self.DynDoc.objects.get(id=doc.id)
assert (
reloaded.x1,
reloaded.x2,
reloaded.y1,
reloaded.y2,
reloaded.z1,
reloaded.z2,
) == (doc.x1, doc.x2, doc.y1, doc.y2, doc.z1, doc.z2)
if __name__ == "__main__": if __name__ == "__main__":
unittest.main() unittest.main()


@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
import unittest import unittest
from datetime import datetime from datetime import datetime


@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
import uuid import uuid
from bson import Binary from bson import Binary


@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
import pytest import pytest
from mongoengine import * from mongoengine import *


@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
from decimal import Decimal from decimal import Decimal
import pytest import pytest


@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
import datetime import datetime
import itertools import itertools
import math import math


@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
import datetime import datetime
import pytest import pytest


@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
import datetime as dt import datetime as dt
import pytest import pytest


@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
from decimal import Decimal from decimal import Decimal
import pytest import pytest


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 from bson import InvalidDocument
 import pytest
@@ -113,7 +112,7 @@ class TestDictField(MongoDBTestCase):
         post.info.setdefault("authors", [])
         post.save()
         post.reload()
-        assert [] == post.info["authors"]
+        assert post.info["authors"] == []

     def test_dictfield_dump_document(self):
         """Ensure a DictField can handle another document's dump."""


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 import sys
 import pytest


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 import pytest
 from mongoengine import (


@@ -0,0 +1,122 @@
from enum import Enum

from bson import InvalidDocument
import pytest

from mongoengine import *
from tests.utils import MongoDBTestCase, get_as_pymongo


class Status(Enum):
    NEW = "new"
    DONE = "done"


class ModelWithEnum(Document):
    status = EnumField(Status)


class TestStringEnumField(MongoDBTestCase):
    def test_storage(self):
        model = ModelWithEnum(status=Status.NEW).save()
        assert get_as_pymongo(model) == {"_id": model.id, "status": "new"}

    def test_set_enum(self):
        ModelWithEnum.drop_collection()
        ModelWithEnum(status=Status.NEW).save()
        assert ModelWithEnum.objects(status=Status.NEW).count() == 1
        assert ModelWithEnum.objects.first().status == Status.NEW

    def test_set_by_value(self):
        ModelWithEnum.drop_collection()
        ModelWithEnum(status="new").save()
        assert ModelWithEnum.objects.first().status == Status.NEW

    def test_filter(self):
        ModelWithEnum.drop_collection()
        ModelWithEnum(status="new").save()
        assert ModelWithEnum.objects(status="new").count() == 1
        assert ModelWithEnum.objects(status=Status.NEW).count() == 1
        assert ModelWithEnum.objects(status=Status.DONE).count() == 0

    def test_change_value(self):
        m = ModelWithEnum(status="new")
        m.status = Status.DONE
        m.save()
        assert m.status == Status.DONE

    def test_set_default(self):
        class ModelWithDefault(Document):
            status = EnumField(Status, default=Status.DONE)

        m = ModelWithDefault().save()
        assert m.status == Status.DONE

    def test_enum_field_can_be_empty(self):
        ModelWithEnum.drop_collection()
        m = ModelWithEnum().save()
        assert m.status is None
        assert ModelWithEnum.objects()[0].status is None
        assert ModelWithEnum.objects(status=None).count() == 1

    def test_set_none_explicitly(self):
        ModelWithEnum.drop_collection()
        ModelWithEnum(status=None).save()
        assert ModelWithEnum.objects.first().status is None

    def test_cannot_create_model_with_wrong_enum_value(self):
        m = ModelWithEnum(status="wrong_one")
        with pytest.raises(ValidationError):
            m.validate()

    def test_user_is_informed_when_tries_to_set_choices(self):
        with pytest.raises(ValueError, match="'choices' can't be set on EnumField"):
            EnumField(Status, choices=["my", "custom", "options"])


class Color(Enum):
    RED = 1
    BLUE = 2


class ModelWithColor(Document):
    color = EnumField(Color, default=Color.RED)


class TestIntEnumField(MongoDBTestCase):
    def test_enum_with_int(self):
        ModelWithColor.drop_collection()
        m = ModelWithColor().save()
        assert m.color == Color.RED
        assert ModelWithColor.objects(color=Color.RED).count() == 1
        assert ModelWithColor.objects(color=1).count() == 1
        assert ModelWithColor.objects(color=2).count() == 0

    def test_create_int_enum_by_value(self):
        model = ModelWithColor(color=2).save()
        assert model.color == Color.BLUE

    def test_storage_enum_with_int(self):
        model = ModelWithColor(color=Color.BLUE).save()
        assert get_as_pymongo(model) == {"_id": model.id, "color": 2}

    def test_validate_model(self):
        with pytest.raises(ValidationError, match="Value must be one of"):
            ModelWithColor(color=3).validate()

        with pytest.raises(ValidationError, match="Value must be one of"):
            ModelWithColor(color="wrong_type").validate()


class TestFunkyEnumField(MongoDBTestCase):
    def test_enum_incompatible_bson_type_fails_during_save(self):
        class FunkyColor(Enum):
            YELLOW = object()

        class ModelWithFunkyColor(Document):
            color = EnumField(FunkyColor)

        m = ModelWithFunkyColor(color=FunkyColor.YELLOW)
        with pytest.raises(InvalidDocument, match="[cC]annot encode object"):
            m.save()
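For context, the value conversion these EnumField tests exercise can be sketched without a database. This is a hypothetical helper (`to_storage` is not part of mongoengine) showing the idea of accepting either an enum member or its raw value and storing the underlying value, assuming only the standard library:

```python
from enum import Enum

class Status(Enum):
    NEW = "new"
    DONE = "done"

def to_storage(enum_cls, value):
    # Accept an enum member or its raw value; Enum(value) raises
    # ValueError for anything outside the declared members, which is
    # the behaviour the validation tests above rely on.
    member = value if isinstance(value, enum_cls) else enum_cls(value)
    return member.value

assert to_storage(Status, Status.NEW) == "new"
assert to_storage(Status, "done") == "done"
```

A sketch only: the real field also handles `None`, defaults, and BSON encoding of non-primitive enum values, as the tests show.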


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 import datetime
 import unittest
@@ -336,7 +335,7 @@ class TestField(MongoDBTestCase):
         doc.save()

         # Unset all the fields
-        HandleNoneFields._get_collection().update(
+        HandleNoneFields._get_collection().update_one(
             {"_id": doc.id},
             {"$unset": {"str_fld": 1, "int_fld": 1, "flt_fld": 1, "comp_dt_fld": 1}},
         )
@@ -1084,7 +1083,7 @@ class TestField(MongoDBTestCase):
         e = Simple().save()
         e.mapping = []
-        assert [] == e._changed_fields
+        assert e._changed_fields == []

         class Simple(Document):
             mapping = DictField()
@@ -1093,7 +1092,7 @@ class TestField(MongoDBTestCase):
         e = Simple().save()
         e.mapping = {}
-        assert [] == e._changed_fields
+        assert e._changed_fields == []

     def test_slice_marks_field_as_changed(self):
         class Simple(Document):
@@ -2273,6 +2272,13 @@ class TestField(MongoDBTestCase):
         with pytest.raises(FieldDoesNotExist):
             Doc(bar="test")

+    def test_undefined_field_works_no_confusion_with_db_field(self):
+        class Doc(Document):
+            foo = StringField(db_field="bar")
+
+        with pytest.raises(FieldDoesNotExist):
+            Doc(bar="test")
+

 class TestEmbeddedDocumentListField(MongoDBTestCase):
     def setUp(self):


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 import copy
 import os
 import tempfile
@@ -430,7 +429,7 @@ class TestFileField(MongoDBTestCase):
     @require_pil
     def test_image_field_resize(self):
         class TestImage(Document):
-            image = ImageField(size=(185, 37))
+            image = ImageField(size=(185, 37, True))

         TestImage.drop_collection()
@@ -472,7 +471,7 @@ class TestFileField(MongoDBTestCase):
     @require_pil
     def test_image_field_thumbnail(self):
         class TestImage(Document):
-            image = ImageField(thumbnail_size=(92, 18))
+            image = ImageField(thumbnail_size=(92, 18, True))

         TestImage.drop_collection()


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 import pytest
 from mongoengine import *


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 import unittest
 from mongoengine import *
@@ -381,7 +380,7 @@ class TestGeoField(MongoDBTestCase):
             meta = {"indexes": [[("location", "2dsphere"), ("datetime", 1)]]}

-        assert [] == Log._geo_indices()
+        assert Log._geo_indices() == []

         Log.drop_collection()
         Log.ensure_indexes()
@@ -401,7 +400,7 @@ class TestGeoField(MongoDBTestCase):
                 "indexes": [{"fields": [("location", "2dsphere"), ("datetime", 1)]}]
             }

-        assert [] == Log._geo_indices()
+        assert Log._geo_indices() == []

         Log.drop_collection()
         Log.ensure_indexes()


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 import pytest
 from mongoengine import *


@@ -1,9 +1,9 @@
-# -*- coding: utf-8 -*-
 from bson import DBRef, ObjectId
 import pytest

 from mongoengine import *
 from mongoengine.base import LazyReference
+from mongoengine.context_managers import query_counter
 from tests.utils import MongoDBTestCase
@@ -331,6 +331,50 @@ class TestLazyReferenceField(MongoDBTestCase):
         occ.in_embedded.in_list = [animal1.id, animal2.id]
         check_fields_type(occ)

+    def test_lazy_reference_embedded_dereferencing(self):
+        # Test case for #2375
+
+        # -- Test documents
+        class Author(Document):
+            name = StringField()
+
+        class AuthorReference(EmbeddedDocument):
+            author = LazyReferenceField(Author)
+
+        class Book(Document):
+            authors = EmbeddedDocumentListField(AuthorReference)
+
+        # -- Cleanup
+        Author.drop_collection()
+        Book.drop_collection()
+
+        # -- Create test data
+        author_1 = Author(name="A1").save()
+        author_2 = Author(name="A2").save()
+        author_3 = Author(name="A3").save()
+        book = Book(
+            authors=[
+                AuthorReference(author=author_1),
+                AuthorReference(author=author_2),
+                AuthorReference(author=author_3),
+            ]
+        ).save()
+
+        with query_counter() as qc:
+            book = Book.objects.first()
+            # Accessing the list must not trigger dereferencing.
+            book.authors
+            assert qc == 1
+
+        for ref in book.authors:
+            with pytest.raises(AttributeError):
+                ref["author"].name
+            assert isinstance(ref.author, LazyReference)
+            assert isinstance(ref.author.id, ObjectId)
+

 class TestGenericLazyReferenceField(MongoDBTestCase):
     def test_generic_lazy_reference_simple(self):
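The behaviour the lazy-reference test above checks can be illustrated with a plain-Python proxy. This is only a sketch of the general idea (no mongoengine involved, and `LazyRef` is a hypothetical class, not the library's `LazyReference`): holding the reference costs nothing, and the target is fetched only on an explicit call.

```python
class LazyRef:
    """Holds a primary key and fetches the target only on demand."""

    def __init__(self, pk, fetch_fn):
        self.id = pk
        self._fetch_fn = fetch_fn
        self.fetches = 0  # instrumentation, like query_counter in the test

    def fetch(self):
        # Dereferencing is explicit: merely holding or passing the
        # reference around performs no lookup.
        self.fetches += 1
        return self._fetch_fn(self.id)

db = {1: "A1"}            # stand-in for a collection
ref = LazyRef(1, db.get)
assert ref.fetches == 0   # no query has happened yet
assert ref.fetch() == "A1"
assert ref.fetches == 1   # exactly one query, on demand
```

The point of the bug fix above is that references embedded in documents must stay in this lazy state after the parent is loaded, rather than being eagerly dereferenced.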


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 import datetime
 import pytest


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 from bson import DBRef, SON
 import pytest


@@ -1,5 +1,3 @@
-# -*- coding: utf-8 -*-
 from mongoengine import *
 from tests.utils import MongoDBTestCase


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 import pytest
 from mongoengine import *


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 import uuid
 import pytest


@@ -1,5 +1,3 @@
-# -*- coding: utf-8 -*-
 import datetime
 import unittest
 import uuid
@@ -114,6 +112,38 @@ class TestQueryset(unittest.TestCase):
         assert person.name == "User A"
         assert person.age == 20

+    def test_slicing_sets_empty_limit_skip(self):
+        self.Person.objects.insert(
+            [self.Person(name="User {}".format(i), age=i) for i in range(5)],
+            load_bulk=False,
+        )
+        self.Person.objects.create(name="User B", age=30)
+        self.Person.objects.create(name="User C", age=40)
+
+        qs = self.Person.objects()[1:2]
+        assert (qs._empty, qs._skip, qs._limit) == (False, 1, 1)
+        assert len(list(qs)) == 1
+
+        # Test edge case of [1:1] which should return nothing
+        # and require a hack so that it doesn't clash with limit(0)
+        qs = self.Person.objects()[1:1]
+        assert (qs._empty, qs._skip, qs._limit) == (True, 1, 0)
+        assert len(list(qs)) == 0
+
+        qs2 = qs[1:5]  # Make sure that further slicing resets _empty
+        assert (qs2._empty, qs2._skip, qs2._limit) == (False, 1, 4)
+        assert len(list(qs2)) == 4
+
+    def test_limit_0_returns_all_documents(self):
+        self.Person.objects.create(name="User A", age=20)
+        self.Person.objects.create(name="User B", age=30)
+
+        n_docs = self.Person.objects().count()
+
+        persons = list(self.Person.objects().limit(0))
+        assert len(persons) == 2 == n_docs
+
     def test_limit(self):
         """Ensure that QuerySet.limit works as expected."""
         user_a = self.Person.objects.create(name="User A", age=20)
@@ -377,6 +407,9 @@ class TestQueryset(unittest.TestCase):
         assert list(A.objects.none()) == []
         assert list(A.objects.none().all()) == []
+        assert list(A.objects.none().limit(1)) == []
+        assert list(A.objects.none().skip(1)) == []
+        assert list(A.objects.none()[:5]) == []

     def test_chaining(self):
         class A(Document):
@@ -4021,6 +4054,32 @@ class TestQueryset(unittest.TestCase):
         Number.drop_collection()

+    def test_clone_retains_settings(self):
+        """Ensure that cloning retains the read_preference and read_concern
+        """
+
+        class Number(Document):
+            n = IntField()
+
+        Number.drop_collection()
+
+        qs = Number.objects
+        qs_clone = qs.clone()
+        assert qs._read_preference == qs_clone._read_preference
+        assert qs._read_concern == qs_clone._read_concern
+
+        qs = Number.objects.read_preference(ReadPreference.PRIMARY_PREFERRED)
+        qs_clone = qs.clone()
+        assert qs._read_preference == ReadPreference.PRIMARY_PREFERRED
+        assert qs._read_preference == qs_clone._read_preference
+
+        qs = Number.objects.read_concern({"level": "majority"})
+        qs_clone = qs.clone()
+        assert qs._read_concern.document == {"level": "majority"}
+        assert qs._read_concern == qs_clone._read_concern
+
+        Number.drop_collection()
+
     def test_using(self):
         """Ensure that switching databases for a queryset is possible
         """
@@ -4442,7 +4501,9 @@ class TestQueryset(unittest.TestCase):
         assert len(people) == 1
         assert people[0] == "User B"

-        people = list(self.Person.objects[1:1].scalar("name"))
+        # people = list(self.Person.objects[1:1].scalar("name"))
+        people = self.Person.objects[1:1]
+        people = people.scalar("name")
         assert len(people) == 0

         # Test slice out of range
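The slicing tests above pin down how a Python slice maps onto MongoDB's skip/limit, including the awkward `[1:1]` case: MongoDB treats `limit(0)` as "no limit", so an empty slice needs a separate flag. A hypothetical helper (`slice_to_skip_limit` is not part of mongoengine) sketching that mapping:

```python
def slice_to_skip_limit(start, stop):
    """Map a queryset slice [start:stop] to (empty, skip, limit).

    limit(0) means "no limit" in MongoDB, so the empty slice [1:1]
    cannot be expressed as skip=1/limit=0 alone; the queryset has to
    carry an _empty flag instead (the "hack" the test comment mentions).
    """
    skip = start or 0
    limit = None if stop is None else stop - skip
    empty = limit == 0
    return empty, skip, limit

assert slice_to_skip_limit(1, 2) == (False, 1, 1)
assert slice_to_skip_limit(1, 1) == (True, 1, 0)   # empty, not "no limit"
```

Further slicing resets the flag, which is why `qs[1:5]` after `qs[1:1]` returns documents again in the test above.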


@@ -1,5 +1,3 @@
-# -*- coding: utf-8 -*-
 import unittest
 import warnings


@@ -344,6 +344,31 @@ class TestTransform(unittest.TestCase):
         )
         assert update == {"$pull": {"content.text": {"word": {"$nin": ["foo", "bar"]}}}}

+    def test_transform_embedded_document_list_fields(self):
+        """
+        Test added to check filtering
+        EmbeddedDocumentListField which is inside a EmbeddedDocumentField
+        """
+
+        class Drink(EmbeddedDocument):
+            id = StringField()
+            meta = {"strict": False}
+
+        class Shop(Document):
+            drinks = EmbeddedDocumentListField(Drink)
+
+        Shop.drop_collection()
+        drinks = [Drink(id="drink_1"), Drink(id="drink_2")]
+        Shop.objects.create(drinks=drinks)
+        q_obj = transform.query(
+            Shop, drinks__all=[{"$elemMatch": {"_id": x.id}} for x in drinks]
+        )
+        assert q_obj == {
+            "drinks": {"$all": [{"$elemMatch": {"_id": x.id}} for x in drinks]}
+        }
+
+        Shop.drop_collection()
+

 if __name__ == "__main__":
     unittest.main()
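The filter document that assertion expects can be built as plain data; this sketch only constructs the expected dict (it does not call mongoengine's `transform` module) to show the `$all` + `$elemMatch` shape, where every listed clause must match some element of the embedded array:

```python
drink_ids = ["drink_1", "drink_2"]

# $all of $elemMatch clauses: the "drinks" array must contain, for each
# clause, at least one embedded document whose _id matches.
expected = {
    "drinks": {"$all": [{"$elemMatch": {"_id": i}} for i in drink_ids]}
}

assert expected["drinks"]["$all"] == [
    {"$elemMatch": {"_id": "drink_1"}},
    {"$elemMatch": {"_id": "drink_2"}},
]
```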


@@ -282,7 +282,7 @@ class ConnectionTest(unittest.TestCase):
         # database won't exist until we save a document
         some_document.save()
         assert conn.get_default_database().name == "mongoenginetest"
-        assert conn.database_names()[0] == "mongoenginetest"
+        assert conn.list_database_names()[0] == "mongoenginetest"

     @require_mongomock
     def test_connect_with_host_list(self):


@@ -9,10 +9,14 @@ from mongoengine.base.datastructures import BaseDict, BaseList, StrictDict
 class DocumentStub(object):
     def __init__(self):
         self._changed_fields = []
+        self._unset_fields = []

     def _mark_as_changed(self, key):
         self._changed_fields.append(key)

+    def _mark_as_unset(self, key):
+        self._unset_fields.append(key)
+

 class TestBaseDict:
     @staticmethod
@@ -314,7 +318,7 @@ class TestBaseList:
     def test___setitem___item_0_calls_mark_as_changed(self):
         base_list = self._get_baselist([True])
         base_list[0] = False
-        assert base_list._instance._changed_fields == ["my_name"]
+        assert base_list._instance._changed_fields == ["my_name.0"]
         assert base_list == [False]

     def test___setitem___item_1_calls_mark_as_changed(self):
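The updated assertion above expects item assignment to record a per-index change key like `"my_name.0"` rather than marking the whole list dirty. A minimal sketch of that idea (a hypothetical `ChangeTrackingList`, not mongoengine's `BaseList`):

```python
class ChangeTrackingList(list):
    """Records a per-index change key like "my_name.0" on item assignment."""

    def __init__(self, items, instance, name):
        super().__init__(items)
        self._instance = instance
        self._name = name

    def __setitem__(self, index, value):
        super().__setitem__(index, value)
        # Record the precise element that changed, so a save can issue a
        # targeted update on "my_name.0" instead of rewriting the list.
        self._instance._changed_fields.append("%s.%s" % (self._name, index))

class Stub:
    def __init__(self):
        self._changed_fields = []

stub = Stub()
lst = ChangeTrackingList([True], stub, "my_name")
lst[0] = False
assert stub._changed_fields == ["my_name.0"]
assert list(lst) == [False]
```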


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 import unittest
 from bson import DBRef, ObjectId
@@ -370,8 +369,7 @@ class FieldTest(unittest.TestCase):
         assert Post.objects.all()[0].user_lists == [[u1, u2], [u3]]

     def test_circular_reference(self):
-        """Ensure you can handle circular references
-        """
+        """Ensure you can handle circular references"""

         class Relation(EmbeddedDocument):
             name = StringField()
@@ -426,6 +424,7 @@ class FieldTest(unittest.TestCase):
         daughter.relations.append(mother)
         daughter.relations.append(daughter)
+        assert daughter._get_changed_fields() == ["relations"]
         daughter.save()

         assert "[<Person: Mother>, <Person: Daughter>]" == "%s" % Person.objects()


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 import unittest
 from mongoengine import *


@@ -50,7 +50,7 @@ def _decorated_with_ver_requirement(func, mongo_version_req, oper):
     ran against MongoDB < v3.6.

     :param mongo_version_req: The mongodb version requirement (tuple(int, int))
-    :param oper: The operator to apply (e.g: operator.ge)
+    :param oper: The operator to apply (e.g. operator.ge)
     """

     def _inner(*args, **kwargs):


@@ -1,5 +1,5 @@
 [tox]
-envlist = {py35,pypy3}-{mg34,mg36,mg39,mg310}
+envlist = {py35,pypy3}-{mg34,mg36,mg39,mg311}

 [testenv]
 commands =
@@ -8,6 +8,6 @@ deps =
     mg34: pymongo>=3.4,<3.5
     mg36: pymongo>=3.6,<3.7
     mg39: pymongo>=3.9,<3.10
-    mg310: pymongo>=3.10,<3.11
+    mg311: pymongo>=3.11,<3.12

 setenv =
     PYTHON_EGG_CACHE = {envdir}/python-eggs