Compare commits


50 Commits

Author SHA1 Message Date
Stefan Wojcik
5e0b97e90c add a test with an invalid data type 2017-04-16 11:50:28 -04:00
Stefan Wojcik
a0a3805e2d Revert "switch from octal to hex for consistency" (because of pypy3)
This reverts commit 7d5caf8368.
2017-04-15 23:14:19 -04:00
Stefan Wojcik
7d5caf8368 switch from octal to hex for consistency 2017-04-15 22:20:19 -04:00
Stefan Wojcik
dee5465440 dont run the unicode email test on pypy3 2017-04-15 18:07:32 -04:00
Stefan Wojcik
33e50e48c1 use six.u 2017-04-15 16:46:36 -04:00
Stefan Wojcik
41371e5fc5 empty whitelist by default + allow_ip_domain option 2017-04-10 10:00:49 -04:00
Stefan Wojcik
ce86ea4c9a flake8 fixes 2017-04-10 08:18:56 -04:00
Stefan Wojcik
601b79865d support unicode in EmailField 2017-04-09 22:33:11 -04:00
Stefan Wojcik
b52d3e3a7b added one more item to the v0.12.0 changelog 2017-04-07 10:34:04 -04:00
Stefan Wojcik
888a6da4a5 update the changelog and bump the version to v0.12.0 2017-04-07 10:18:39 -04:00
Omer Katz
972ac73dd9 Merge pull request #1497 from userlocalhost/feature/order_guarantee
added a feature to save object data in order
2017-04-07 10:49:39 +03:00
Hiroyasu OHYAMA
d8b238d5f1 Refactored the implementation of DynamicField extension for storing data in order 2017-04-06 00:42:11 +00:00
Omer Katz
63206c3da2 Merge pull request #1520 from ZoetropeLabs/fix/allow-reference-fields-take-object-ids
Allow ReferenceFields to take ObjectIds
2017-04-02 13:57:58 +03:00
Richard Fortescue-Webb
5713de8966 Use the objectid in the test 2017-03-29 11:34:57 +01:00
Richard Fortescue-Webb
58f293fef3 Allow ReferenceFields to take ObjectIds 2017-03-29 10:34:50 +01:00
Hiroyasu OHYAMA
ffbb2c9689 This is Additional tests for the container_class parameter of DynamicField
This tests DynamicField dereference with ordering guarantee.
2017-03-08 14:46:04 +00:00
Hiroyasu OHYAMA
9cd3dcdebf Added a test for the change of the condition in DeReference processing
This checks DBRef conversion using DynamicField with the ordering
guarantee.
2017-03-08 14:45:43 +00:00
Hiroyasu OHYAMA
f2fe58c3c5 Added a condition to store data to ObjectDict when the items type is it
Previous dereference implementation re-contains data as `dict` except
for the predicted type.
But the OrderedDict is not predicted, so the its data would be converted
`dict` implicitly.
As the result, the order of stored data get wrong. And this patch
prevents it.
2017-03-08 14:35:50 +00:00
Stefan Wojcik
b78010aa94 remove test_last_field_name_like_operator (it's a dupe of the same test in tests/queryset/transform.py) 2017-03-05 21:24:46 -05:00
Stefan Wójcik
49035543b9 cleanup BaseQuerySet.__getitem__ (#1502) 2017-03-05 21:17:53 -05:00
Stefan Wójcik
f9ccf635ca Respect db fields in multiple layers of embedded docs (#1501) 2017-03-05 18:20:09 -05:00
Stefan Wojcik
e8ea294964 test negative indexes (closes #1119) 2017-03-05 18:12:01 -05:00
Stefan Wojcik
19ef2be88b fix #937 2017-03-05 00:05:33 -05:00
Stefan Wojcik
30e8b8186f clean up document instance tests 2017-03-02 00:25:56 -05:00
Stefan Wójcik
741643af5f clean up field unit tests (#1498) 2017-03-02 00:05:10 -05:00
Hiroyasu OHYAMA
6aaf9ba470 removed a checking of dict order because this order is not cared (some implementation might be in ordered, but other one is not) 2017-03-01 09:32:28 +00:00
Hiroyasu OHYAMA
5957dc72eb To achive storing object data in order with minimum implementation, I
changed followings.

- added optional parameter `container_class` which enables to choose
  intermediate class at encoding Python data, instead of additional
  field class.
- removed OrderedDocument class because the equivalent feature could
  be implemented by the outside of Mongoengine.
2017-03-01 09:20:57 +00:00
Hiroyasu OHYAMA
e32a9777d7 added test for OrderedDynamicField and OrderedDocument 2017-02-28 03:35:53 +00:00
Hiroyasu OHYAMA
84a8f1eb2b added OrderedDocument class to decode BSON data to OrderedDict for retrieving data in order 2017-02-28 03:35:39 +00:00
Hiroyasu OHYAMA
6810953014 added OrderedDynamicField class to store data in the defined order because of #203 2017-02-28 03:34:42 +00:00
Ephraim Berkovitch
398964945a Document.objects.create should raise NotUniqueError upon saving duplicate primary key (#1485) 2017-02-27 09:42:44 -05:00
Stefan Wójcik
5f43c032f2 revamp the "connecting" user guide and test more ways of connecting to a replica set (#1490) 2017-02-26 21:29:06 -05:00
Stefan Wojcik
627cf90de0 tutorial tweaks: better copy + use py3-friendly syntax 2017-02-26 20:30:37 -05:00
Omer Katz
2bedb36d7f Test against multiple MongoDB versions in Travis (#1074) 2017-02-26 14:52:43 -05:00
Stefan Wójcik
e93a95d0cb Test and document controlling the size of the connection pool (#1489) 2017-02-25 14:09:10 -05:00
Stefan Wójcik
3f31666796 Fix the exception message when validating unicode URLs (#1486) 2017-02-24 16:18:34 -05:00
Stefan Wojcik
3fe8031cf3 fix EmbeddedDocumentListFieldTestCase 2017-02-22 12:44:05 -05:00
bagerard
b27c7ce11b allow to use sets in field choices (#1482) 2017-02-15 08:51:47 -05:00
Stefan Wojcik
ed34c2ca68 update the changelog and upgrade docs 2017-02-09 12:13:56 -08:00
Stefan Wójcik
3ca2e953fb Fix limit/skip/hint/batch_size chaining (#1476) 2017-02-09 12:02:46 -08:00
martin sereinig
d8a7328365 Fix docs regarding reverse_delete_rule and delete signals (#1473) 2017-02-06 14:11:42 -07:00
Stefan Wojcik
f33cd625bf nicer readme 2017-01-17 02:47:45 -05:00
Stefan Wojcik
80530bb13c nicer readme 2017-01-17 02:46:37 -05:00
Stefan Wójcik
affc12df4b Update README.rst 2017-01-17 02:43:29 -05:00
Stefan Wojcik
4eedf00025 nicer readme note about dependencies 2017-01-17 02:42:23 -05:00
Eli Boyarski
e5acbcc0dd Improved a docstring for FieldDoesNotExist (#1466) 2017-01-09 11:24:27 -05:00
Stefan Wojcik
1b6743ee53 add a changelog entry about broken references raising DoesNotExist 2017-01-08 14:50:16 -05:00
Eli Boyarski
b5fb82d95d Typo fix (#1463) 2017-01-08 12:57:36 -05:00
lanf0n
193aa4e1f2 [#1459] fix typo __neq__ to __ne__ (#1461) 2017-01-05 22:37:09 -05:00
Stefan Wójcik
ebd34427c7 Cleaner Document.save (#1458) 2016-12-30 05:43:56 -05:00
33 changed files with 3615 additions and 2991 deletions

.gitignore

@@ -17,3 +17,4 @@ tests/test_bugfix.py
 htmlcov/
 venv
 venv3
+scratchpad


@@ -0,0 +1,23 @@
#!/bin/bash
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 7F0CEB10
if [ "$MONGODB" = "2.4" ]; then
echo "deb http://downloads-distro.mongodb.org/repo/ubuntu-upstart dist 10gen" | sudo tee /etc/apt/sources.list.d/mongodb.list
sudo apt-get update
sudo apt-get install mongodb-10gen=2.4.14
sudo service mongodb start
elif [ "$MONGODB" = "2.6" ]; then
echo "deb http://downloads-distro.mongodb.org/repo/ubuntu-upstart dist 10gen" | sudo tee /etc/apt/sources.list.d/mongodb.list
sudo apt-get update
sudo apt-get install mongodb-org-server=2.6.12
# service should be started automatically
elif [ "$MONGODB" = "3.0" ]; then
echo "deb http://repo.mongodb.org/apt/ubuntu precise/mongodb-org/3.0 multiverse" | sudo tee /etc/apt/sources.list.d/mongodb.list
sudo apt-get update
sudo apt-get install mongodb-org-server=3.0.14
# service should be started automatically
else
echo "Invalid MongoDB version, expected 2.4, 2.6, or 3.0."
exit 1
fi;
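The version dispatch in the script above can be condensed into a small shell function for local experimentation. This is a sketch only: the function name `mongodb_pkg` is hypothetical, and it merely selects the apt package spec from the script without installing anything.

```shell
# Sketch: the same MongoDB-version dispatch as a testable function.
# Package specs copied verbatim from the install script above.
mongodb_pkg() {
  case "$1" in
    2.4) echo "mongodb-10gen=2.4.14" ;;
    2.6) echo "mongodb-org-server=2.6.12" ;;
    3.0) echo "mongodb-org-server=3.0.14" ;;
    *)   echo "Invalid MongoDB version, expected 2.4, 2.6, or 3.0." >&2
         return 1 ;;
  esac
}
```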


@@ -1,28 +1,48 @@
+# For full coverage, we'd have to test all supported Python, MongoDB, and
+# PyMongo combinations. However, that would result in an overly long build
+# with a very large number of jobs, hence we only test a subset of all the
+# combinations:
+# * MongoDB v2.4 & v3.0 are only tested against Python v2.7 & v3.5.
+# * MongoDB v2.4 is tested against PyMongo v2.7 & v3.x.
+# * MongoDB v3.0 is tested against PyMongo v3.x.
+# * MongoDB v2.6 is currently the "main" version tested against Python v2.7,
+#   v3.5, PyPy & PyPy3, and PyMongo v2.7, v2.8 & v3.x.
+#
+# Reminder: Update README.rst if you change MongoDB versions we test.
 language: python
 python:
-- '2.7'
-- '3.3'
-- '3.4'
-- '3.5'
+- 2.7
+- 3.5
 - pypy
 - pypy3
 env:
-- PYMONGO=2.7
-- PYMONGO=2.8
-- PYMONGO=3.0
-- PYMONGO=dev
+- MONGODB=2.6 PYMONGO=2.7
+- MONGODB=2.6 PYMONGO=2.8
+- MONGODB=2.6 PYMONGO=3.0
 matrix:
+  # Finish the build as soon as one job fails
   fast_finish: true
+  include:
+  - python: 2.7
+    env: MONGODB=2.4 PYMONGO=2.7
+  - python: 2.7
+    env: MONGODB=2.4 PYMONGO=3.0
+  - python: 2.7
+    env: MONGODB=3.0 PYMONGO=3.0
+  - python: 3.5
+    env: MONGODB=2.4 PYMONGO=2.7
+  - python: 3.5
+    env: MONGODB=2.4 PYMONGO=3.0
+  - python: 3.5
+    env: MONGODB=3.0 PYMONGO=3.0
 before_install:
-- travis_retry sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 7F0CEB10
-- echo 'deb http://downloads-distro.mongodb.org/repo/ubuntu-upstart dist 10gen' |
-  sudo tee /etc/apt/sources.list.d/mongodb.list
-- travis_retry sudo apt-get update
-- travis_retry sudo apt-get install mongodb-org-server
+- bash .install_mongodb_on_travis.sh
 install:
 - sudo apt-get install python-dev python3-dev libopenjpeg-dev zlib1g-dev libjpeg-turbo8-dev
@@ -30,14 +50,17 @@ install:
   python-tk
 - travis_retry pip install --upgrade pip
 - travis_retry pip install coveralls
-- travis_retry pip install flake8
+- travis_retry pip install flake8 flake8-import-order
 - travis_retry pip install tox>=1.9
 - travis_retry pip install "virtualenv<14.0.0" # virtualenv>=14.0.0 has dropped Python 3.2 support (and pypy3 is based on py32)
 - travis_retry tox -e $(echo py$TRAVIS_PYTHON_VERSION-mg$PYMONGO | tr -d . | sed -e 's/pypypy/pypy/') -- -e test
+# Cache dependencies installed via pip
+cache: pip
 # Run flake8 for py27
 before_script:
-- if [[ $TRAVIS_PYTHON_VERSION == '2.7' ]]; then tox -e flake8; fi
+- if [[ $TRAVIS_PYTHON_VERSION == '2.7' ]]; then flake8 .; else echo "flake8 only runs on py27"; fi
 script:
 - tox -e $(echo py$TRAVIS_PYTHON_VERSION-mg$PYMONGO | tr -d . | sed -e 's/pypypy/pypy/') -- --with-coverage
@@ -45,22 +68,34 @@ script:
 # For now only submit coveralls for Python v2.7. Python v3.x currently shows
 # 0% coverage. That's caused by 'use_2to3', which builds the py3-compatible
 # code in a separate dir and runs tests on that.
-after_script:
+after_success:
 - if [[ $TRAVIS_PYTHON_VERSION == '2.7' ]]; then coveralls --verbose; fi
 notifications:
   irc: irc.freenode.org#mongoengine
+# Only run builds on the master branch and GitHub releases (tagged as vX.Y.Z)
 branches:
   only:
   - master
   - /^v.*$/
+# Whenever a new release is created via GitHub, publish it on PyPI.
 deploy:
   provider: pypi
   user: the_drow
   password:
     secure: QMyatmWBnC6ZN3XLW2+fTBDU4LQcp1m/LjR2/0uamyeUzWKdlOoh/Wx5elOgLwt/8N9ppdPeG83ose1jOz69l5G0MUMjv8n/RIcMFSpCT59tGYqn3kh55b0cIZXFT9ar+5cxlif6a5rS72IHm5li7QQyxexJIII6Uxp0kpvUmek=
+  # create a source distribution and a pure python wheel for faster installs
+  distributions: "sdist bdist_wheel"
+  # only deploy on tagged commits (aka GitHub releases) and only for the
+  # parent repo's builds running Python 2.7 along with dev PyMongo (we run
+  # Travis against many different Python and PyMongo versions and we don't
+  # want the deploy to occur multiple times).
   on:
     tags: true
     repo: MongoEngine/mongoengine
+    condition: "$PYMONGO = 3.0"
+    python: 2.7
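The tox environment name used in the `install` and `script` steps above is derived by stripping dots from the Python/PyMongo versions and collapsing the doubled `pypy` prefix. A quick sketch of that mangling (the version values here are just examples):

```shell
# Reproduce the env-name derivation from the Travis config above.
TRAVIS_PYTHON_VERSION=2.7
PYMONGO=2.8
TOX_ENV=$(echo "py$TRAVIS_PYTHON_VERSION-mg$PYMONGO" | tr -d . | sed -e 's/pypypy/pypy/')
echo "$TOX_ENV"   # py27-mg28

# For pypy builds the sed fixup matters: "py" + "pypy" would
# otherwise yield "pypypy".
TRAVIS_PYTHON_VERSION=pypy
TOX_ENV=$(echo "py$TRAVIS_PYTHON_VERSION-mg$PYMONGO" | tr -d . | sed -e 's/pypypy/pypy/')
echo "$TOX_ENV"   # pypy-mg28
```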


@@ -29,19 +29,20 @@ Style Guide
 -----------
 MongoEngine aims to follow `PEP8 <http://www.python.org/dev/peps/pep-0008/>`_
-including 4 space indents. When possible we try to stick to 79 character line limits.
-However, screens got bigger and an ORM has a strong focus on readability and
-if it can help, we accept 119 as maximum line length, in a similar way as
-`django does <https://docs.djangoproject.com/en/dev/internals/contributing/writing-code/coding-style/#python-style>`_
+including 4 space indents. When possible we try to stick to 79 character line
+limits. However, screens got bigger and an ORM has a strong focus on
+readability and if it can help, we accept 119 as maximum line length, in a
+similar way as `django does
+<https://docs.djangoproject.com/en/dev/internals/contributing/writing-code/coding-style/#python-style>`_
 Testing
 -------
 All tests are run on `Travis <http://travis-ci.org/MongoEngine/mongoengine>`_
-and any pull requests are automatically tested by Travis. Any pull requests
-without tests will take longer to be integrated and might be refused.
+and any pull requests are automatically tested. Any pull requests without
+tests will take longer to be integrated and might be refused.
-You may also submit a simple failing test as a PullRequest if you don't know
+You may also submit a simple failing test as a pull request if you don't know
 how to fix it, it will be easier for other people to work on it and it may get
 fixed faster.
@@ -49,13 +50,18 @@ General Guidelines
 ------------------
 - Avoid backward breaking changes if at all possible.
+- If you *have* to introduce a breaking change, make it very clear in your
+  pull request's description. Also, describe how users of this package
+  should adapt to the breaking change in docs/upgrade.rst.
 - Write inline documentation for new classes and methods.
 - Write tests and make sure they pass (make sure you have a mongod
   running on the default port, then execute ``python setup.py nosetests``
   from the cmd line to run the test suite).
-- Ensure tests pass on every Python and PyMongo versions.
-  You can test on these versions locally by executing ``tox``
-- Add enhancements or problematic bug fixes to docs/changelog.rst
+- Ensure tests pass on all supported Python, PyMongo, and MongoDB versions.
+  You can test various Python and PyMongo versions locally by executing
+  ``tox``. For different MongoDB versions, you can rely on our automated
+  Travis tests.
+- Add enhancements or problematic bug fixes to docs/changelog.rst.
 - Add yourself to AUTHORS :)
 Documentation
@@ -69,3 +75,6 @@ just make your changes to the inline documentation of the appropriate
 branch and submit a `pull request <https://help.github.com/articles/using-pull-requests>`_.
 You might also use the github `Edit <https://github.com/blog/844-forking-with-the-edit-button>`_
 button.
+
+If you want to test your documentation changes locally, you need to install
+the ``sphinx`` package.


@@ -19,32 +19,42 @@ MongoEngine
 About
 =====
 MongoEngine is a Python Object-Document Mapper for working with MongoDB.
-Documentation available at https://mongoengine-odm.readthedocs.io - there is currently
-a `tutorial <https://mongoengine-odm.readthedocs.io/tutorial.html>`_, a `user guide
-<https://mongoengine-odm.readthedocs.io/guide/index.html>`_ and an `API reference
-<https://mongoengine-odm.readthedocs.io/apireference.html>`_.
+Documentation is available at https://mongoengine-odm.readthedocs.io - there
+is currently a `tutorial <https://mongoengine-odm.readthedocs.io/tutorial.html>`_,
+a `user guide <https://mongoengine-odm.readthedocs.io/guide/index.html>`_, and
+an `API reference <https://mongoengine-odm.readthedocs.io/apireference.html>`_.
+
+Supported MongoDB Versions
+==========================
+MongoEngine is currently tested against MongoDB v2.4, v2.6, and v3.0. Future
+versions should be supported as well, but aren't actively tested at the moment.
+Make sure to open an issue or submit a pull request if you experience any
+problems with MongoDB v3.2+.
 Installation
 ============
 We recommend the use of `virtualenv <https://virtualenv.pypa.io/>`_ and of
 `pip <https://pip.pypa.io/>`_. You can then use ``pip install -U mongoengine``.
-You may also have `setuptools <http://peak.telecommunity.com/DevCenter/setuptools>`_ and thus
-you can use ``easy_install -U mongoengine``. Otherwise, you can download the
+You may also have `setuptools <http://peak.telecommunity.com/DevCenter/setuptools>`_
+and thus you can use ``easy_install -U mongoengine``. Otherwise, you can download the
 source from `GitHub <http://github.com/MongoEngine/mongoengine>`_ and run ``python
 setup.py install``.
 Dependencies
 ============
-- pymongo>=2.7.1
-- sphinx (optional - for documentation generation)
+All of the dependencies can easily be installed via `pip <https://pip.pypa.io/>`_.
+At the very least, you'll need these two packages to use MongoEngine:
+
+- pymongo>=2.7.1
+- six>=1.10.0
+
+If you utilize a ``DateTimeField``, you might also use a more flexible date parser:
-Optional Dependencies
----------------------
-- **Image Fields**: Pillow>=2.0.0
 - dateutil>=2.1.0
-.. note
-   MongoEngine always runs it's test suite against the latest patch version of each dependecy. e.g.: PyMongo 3.0.1
+If you need to use an ``ImageField`` or ``ImageGridFsProxy``:
+
+- Pillow>=2.0.0
 Examples
 ========
@@ -104,11 +114,11 @@ Some simple examples of what MongoEngine code looks like:
 Tests
 =====
 To run the test suite, ensure you are running a local instance of MongoDB on
-the standard port and have ``nose`` installed. Then, run: ``python setup.py nosetests``.
+the standard port and have ``nose`` installed. Then, run ``python setup.py nosetests``.
-To run the test suite on every supported Python version and every supported PyMongo version,
-you can use ``tox``.
-tox and each supported Python version should be installed in your environment:
+To run the test suite on every supported Python and PyMongo version, you can
+use ``tox``. You'll need to make sure you have each supported Python version
+installed in your environment and then:
 .. code-block:: shell
@@ -117,13 +127,16 @@ tox and each supported Python version should be installed in your environment:
    # Run the test suites
    $ tox
-If you wish to run one single or selected tests, use the nosetest convention. It will find the folder,
-eventually the file, go to the TestClass specified after the colon and eventually right to the single test.
-Also use the -s argument if you want to print out whatever or access pdb while testing.
+If you wish to run a subset of tests, use the nosetests convention:
 .. code-block:: shell
-   $ python setup.py nosetests --tests tests/fields/fields.py:FieldTest.test_cls_field -s
+   # Run all the tests in a particular test file
+   $ python setup.py nosetests --tests tests/fields/fields.py
+   # Run only particular test class in that file
+   $ python setup.py nosetests --tests tests/fields/fields.py:FieldTest
+   # Use the -s option if you want to print some debug statements or use pdb
+   $ python setup.py nosetests --tests tests/fields/fields.py:FieldTest -s
 Community
 =========


@@ -4,15 +4,30 @@ Changelog
 Development
 ===========
-- (Fill this out as you fix issues and develop you features).
+- (Fill this out as you fix issues and develop your features).
+
+Changes in 0.12.0
+=================
+- POTENTIAL BREAKING CHANGE: Fixed limit/skip/hint/batch_size chaining #1476
+- POTENTIAL BREAKING CHANGE: Changed a public `QuerySet.clone_into` method to a private `QuerySet._clone_into` #1476
+- Fixed the way `Document.objects.create` works with duplicate IDs #1485
 - Fixed connecting to a replica set with PyMongo 2.x #1436
+- Fixed using sets in field choices #1481
+- Fixed deleting items from a `ListField` #1318
 - Fixed an obscure error message when filtering by `field__in=non_iterable`. #1237
+- Fixed behavior of a `dec` update operator #1450
+- Added a `rename` update operator #1454
+- Added validation for the `db_field` parameter #1448
+- Fixed the error message displayed when querying an `EmbeddedDocumentField` by an invalid value #1440
+- Fixed the error message displayed when validating unicode URLs #1486
+- Raise an error when trying to save an abstract document #1449
 Changes in 0.11.0
 =================
 - BREAKING CHANGE: Renamed `ConnectionError` to `MongoEngineConnectionError` since the former is a built-in exception name in Python v3.x. #1428
 - BREAKING CHANGE: Dropped Python 2.6 support. #1428
 - BREAKING CHANGE: `from mongoengine.base import ErrorClass` won't work anymore for any error from `mongoengine.errors` (e.g. `ValidationError`). Use `from mongoengine.errors import ErrorClass instead`. #1428
+- BREAKING CHANGE: Accessing a broken reference will raise a `DoesNotExist` error. In the past it used to return `None`. #1334
 - Fixed absent rounding for DecimalField when `force_string` is set. #1103
 Changes in 0.10.8


@@ -42,13 +42,18 @@ the :attr:`host` to
 will establish connection to ``production`` database using
 ``admin`` username and ``qwerty`` password.
-ReplicaSets
-===========
+Replica Sets
+============
-MongoEngine supports
-:class:`~pymongo.mongo_replica_set_client.MongoReplicaSetClient`. To use them,
-please use an URI style connection and provide the ``replicaSet`` name
-in the connection kwargs.
+MongoEngine supports connecting to replica sets::
+
+    from mongoengine import connect
+
+    # Regular connect
+    connect('dbname', replicaset='rs-name')
+
+    # MongoDB URI-style connect
+    connect(host='mongodb://localhost/dbname?replicaSet=rs-name')
 Read preferences are supported through the connection or via individual
 queries by passing the read_preference ::
@@ -59,76 +64,74 @@ queries by passing the read_preference ::
 Multiple Databases
 ==================
-Multiple database support was added in MongoEngine 0.6. To use multiple
-databases you can use :func:`~mongoengine.connect` and provide an `alias` name
-for the connection - if no `alias` is provided then "default" is used.
+To use multiple databases you can use :func:`~mongoengine.connect` and provide
+an `alias` name for the connection - if no `alias` is provided then "default"
+is used.
 In the background this uses :func:`~mongoengine.register_connection` to
 store the data and you can register all aliases up front if required.
 Individual documents can also support multiple databases by providing a
-`db_alias` in their meta data. This allows :class:`~pymongo.dbref.DBRef` objects
-to point across databases and collections. Below is an example schema, using
-3 different databases to store data::
+`db_alias` in their meta data. This allows :class:`~pymongo.dbref.DBRef`
+objects to point across databases and collections. Below is an example schema,
+using 3 different databases to store data::
     class User(Document):
         name = StringField()
-        meta = {"db_alias": "user-db"}
+        meta = {'db_alias': 'user-db'}
     class Book(Document):
         name = StringField()
-        meta = {"db_alias": "book-db"}
+        meta = {'db_alias': 'book-db'}
     class AuthorBooks(Document):
         author = ReferenceField(User)
        book = ReferenceField(Book)
-        meta = {"db_alias": "users-books-db"}
+        meta = {'db_alias': 'users-books-db'}
 Context Managers
 ================
-Sometimes you may want to switch the database or collection to query against
-for a class.
+Sometimes you may want to switch the database or collection to query against.
 For example, archiving older data into a separate database for performance
-reasons or writing functions that dynamically choose collections to write
-document to.
+reasons or writing functions that dynamically choose collections to write
+a document to.
 Switch Database
 ---------------
 The :class:`~mongoengine.context_managers.switch_db` context manager allows
-you to change the database alias for a given class allowing quick and easy
-access the same User document across databases::
+you to change the database alias for a given class allowing quick and easy
+access to the same User document across databases::
     from mongoengine.context_managers import switch_db
     class User(Document):
         name = StringField()
-        meta = {"db_alias": "user-db"}
+        meta = {'db_alias': 'user-db'}
     with switch_db(User, 'archive-user-db') as User:
-        User(name="Ross").save()  # Saves the 'archive-user-db'
+        User(name='Ross').save()  # Saves the 'archive-user-db'
 Switch Collection
 -----------------
 The :class:`~mongoengine.context_managers.switch_collection` context manager
-allows you to change the collection for a given class allowing quick and easy
-access the same Group document across collection::
+allows you to change the collection for a given class allowing quick and easy
+access to the same Group document across collection::
     from mongoengine.context_managers import switch_collection
     class Group(Document):
         name = StringField()
-    Group(name="test").save()  # Saves in the default db
+    Group(name='test').save()  # Saves in the default db
     with switch_collection(Group, 'group2000') as Group:
-        Group(name="hello Group 2000 collection!").save()  # Saves in group2000 collection
+        Group(name='hello Group 2000 collection!').save()  # Saves in group2000 collection
 .. note:: Make sure any aliases have been registered with
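The mechanics of a `switch_db`-style context manager can be sketched without a running MongoDB instance. This is an illustrative stand-in, not MongoEngine's actual implementation: the `Document` base class and `db_alias` attribute here are simplified hypotheticals that only capture the rebind-then-restore behavior.

```python
# Illustrative sketch only -- not MongoEngine internals.
from contextlib import contextmanager


class Document:
    """Hypothetical base class holding a per-class database alias."""
    db_alias = 'default'


@contextmanager
def switch_db(cls, alias):
    """Temporarily rebind the class-level alias, restoring it on exit."""
    previous = cls.db_alias
    cls.db_alias = alias          # point the class at the other database
    try:
        yield cls
    finally:
        cls.db_alias = previous   # always restore the original alias


class User(Document):
    db_alias = 'user-db'


with switch_db(User, 'archive-user-db') as ArchivedUser:
    assert ArchivedUser.db_alias == 'archive-user-db'

assert User.db_alias == 'user-db'  # restored after the block
```

The `finally` clause is what makes the restore reliable even if the body of the `with` block raises.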


@@ -150,7 +150,7 @@ arguments can be set on all fields:
     .. note:: If set, this field is also accessible through the `pk` field.
 :attr:`choices` (Default: None)
-    An iterable (e.g. a list or tuple) of choices to which the value of this
+    An iterable (e.g. list, tuple or set) of choices to which the value of this
     field should be limited.
     Can be either be a nested tuples of value (stored in mongo) and a
@@ -361,11 +361,6 @@ Its value can take any of the following constants:
 In Django, be sure to put all apps that have such delete rule declarations in
 their :file:`models.py` in the :const:`INSTALLED_APPS` tuple.
-.. warning::
-   Signals are not triggered when doing cascading updates / deletes - if this
-   is required you must manually handle the update / delete.
 Generic reference fields
 ''''''''''''''''''''''''
 A second kind of reference field also exists,


@@ -2,13 +2,13 @@
 Installing MongoEngine
 ======================
-To use MongoEngine, you will need to download `MongoDB <http://mongodb.org/>`_
+To use MongoEngine, you will need to download `MongoDB <http://mongodb.com/>`_
 and ensure it is running in an accessible location. You will also need
 `PyMongo <http://api.mongodb.org/python>`_ to use MongoEngine, but if you
 install MongoEngine using setuptools, then the dependencies will be handled for
 you.
-MongoEngine is available on PyPI, so to use it you can use :program:`pip`:
+MongoEngine is available on PyPI, so you can use :program:`pip`:
 .. code-block:: console


@@ -340,14 +340,19 @@ Javascript code that is executed on the database server.
Counting results
----------------

Just as with limiting and skipping results, there is a method on a
:class:`~mongoengine.queryset.QuerySet` object --
:meth:`~mongoengine.queryset.QuerySet.count`::

    num_users = User.objects.count()

You could technically use ``len(User.objects)`` to get the same result, but it
would be significantly slower than :meth:`~mongoengine.queryset.QuerySet.count`.
When you execute a server-side count query, you let MongoDB do the heavy
lifting and receive a single integer over the wire. Meanwhile, ``len()``
retrieves all the results, places them in a local cache, and finally counts
them. If we compare the performance of the two operations, ``len()`` is much
slower than :meth:`~mongoengine.queryset.QuerySet.count`.
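To make the tradeoff concrete without a running database, here is a toy stand-in; ``FakeQuerySet`` and its attributes are purely illustrative, not MongoEngine API:

```python
class FakeQuerySet(object):
    """Illustrative stand-in for a queryset, contrasting a server-side
    count with len()'s fetch-everything-then-count behavior."""

    def __init__(self, docs_on_server):
        self._docs_on_server = docs_on_server
        self._fetched = None  # local result cache, empty until len() runs

    def count(self):
        # A server-side count: only a single integer crosses the wire.
        return len(self._docs_on_server)

    def __len__(self):
        # len() must first materialize every result into the local cache.
        if self._fetched is None:
            self._fetched = list(self._docs_on_server)
        return len(self._fetched)

qs = FakeQuerySet([{'name': 'Ross'}, {'name': 'Monica'}])
assert qs.count() == 2
assert qs._fetched is None       # count() fetched no documents
assert len(qs) == 2
assert qs._fetched is not None   # len() pulled them all locally
```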
Further aggregation
-------------------


@@ -142,11 +142,4 @@ cleaner looking while still allowing manual execution of the callback::
    modified = DateTimeField()

.. _blinker: http://pypi.python.org/pypi/blinker


@@ -3,11 +3,10 @@ Tutorial
========

This tutorial introduces **MongoEngine** by means of example --- we will walk
through how to create a simple **Tumblelog** application. A tumblelog is a
blog that supports mixed media content, including text, images, links, video,
audio, etc. For simplicity's sake, we'll stick to text, image, and link
entries. As the purpose of this tutorial is to introduce MongoEngine, we'll
focus on the data-modelling side of the application, leaving out a user
interface.
@@ -16,14 +15,14 @@ Getting started
Before we start, make sure that a copy of MongoDB is running in an accessible
location --- running it locally will be easier, but if that is not an option
then it may be run on a remote server. If you haven't installed MongoEngine,
simply use pip to install it like so::

    $ pip install mongoengine

Before we can start using MongoEngine, we need to tell it how to connect to our
instance of :program:`mongod`. For this we use the :func:`~mongoengine.connect`
function. If running locally, the only argument we need to provide is the name
of the MongoDB database to use::

    from mongoengine import *
@@ -39,18 +38,18 @@ Defining our documents
MongoDB is *schemaless*, which means that no schema is enforced by the database
--- we may add and remove fields however we want and MongoDB won't complain.
This makes life a lot easier in many regards, especially when there is a change
to the data model. However, defining schemas for our documents can help to iron
out bugs involving incorrect types or missing fields, and also allow us to
define utility methods on our documents in the same way that traditional
:abbr:`ORMs (Object-Relational Mappers)` do.

In our Tumblelog application we need to store several different types of
information. We will need to have a collection of **users**, so that we may
link posts to an individual. We also need to store our different types of
**posts** (eg: text, image and link) in the database. To aid navigation of our
Tumblelog, posts may have **tags** associated with them, so that the list of
posts shown to the user may be limited to posts that have been assigned a
specific tag. Finally, it would be nice if **comments** could be added to
posts. We'll start with **users**, as the other document models are slightly
more involved.
@@ -78,7 +77,7 @@ Now we'll think about how to store the rest of the information. If we were
using a relational database, we would most likely have a table of **posts**, a
table of **comments** and a table of **tags**. To associate the comments with
individual posts, we would put a column in the comments table that contained a
foreign key to the posts table. We'd also need a link table to provide the
many-to-many relationship between posts and tags. Then we'd need to address the
problem of storing the specialised post-types (text, image and link). There are
several ways we can achieve this, but each of them has its problems --- none
@@ -96,7 +95,7 @@ using* the new fields we need to support video posts. This fits with the
Object-Oriented principle of *inheritance* nicely. We can think of
:class:`Post` as a base class, and :class:`TextPost`, :class:`ImagePost` and
:class:`LinkPost` as subclasses of :class:`Post`. In fact, MongoEngine supports
this kind of modeling out of the box --- all you need to do is turn on
inheritance by setting :attr:`allow_inheritance` to True in the :attr:`meta`::

    class Post(Document):
@@ -128,8 +127,8 @@ link table, we can just store a list of tags in each post. So, for both
efficiency and simplicity's sake, we'll store the tags as strings directly
within the post, rather than storing references to tags in a separate
collection. Especially as tags are generally very short (often even shorter
than a document's id), this denormalization won't impact the size of the
database very strongly. Let's take a look at the code of our modified
:class:`Post` class::

    class Post(Document):
@@ -141,7 +140,7 @@ The :class:`~mongoengine.fields.ListField` object that is used to define a Post'
takes a field object as its first argument --- this means that you can have
lists of any type of field (including lists).

.. note:: We don't need to modify the specialized post types as they all
   inherit from :class:`Post`.

Comments
@@ -149,7 +148,7 @@ Comments
A comment is typically associated with *one* post. In a relational database, to
display a post with its comments, we would have to retrieve the post from the
database and then query the database again for the comments associated with the
post. This works, but there is no real reason to be storing the comments
separately from their associated posts, other than to work around the
relational model. Using MongoDB we can store the comments as a list of
@@ -219,8 +218,8 @@ Now that we've got our user in the database, let's add a couple of posts::
    post2.tags = ['mongoengine']
    post2.save()

.. note:: If you change a field on an object that has already been saved and
   then call :meth:`save` again, the document will be updated.

Accessing our data
==================
@@ -232,17 +231,17 @@ used to access the documents in the database collection associated with that
class. So let's see how we can get our posts' titles::

    for post in Post.objects:
        print(post.title)
Retrieving type-specific information
------------------------------------

This will print the titles of our posts, one on each line. But what if we want
to access the type-specific data (link_url, content, etc.)? One way is simply
to use the :attr:`objects` attribute of a subclass of :class:`Post`::

    for post in TextPost.objects:
        print(post.content)

Using TextPost's :attr:`objects` attribute only returns documents that were
created using :class:`TextPost`. Actually, there is a more general rule here:
@@ -259,16 +258,14 @@ instances of :class:`Post` --- they were instances of the subclass of
practice::

    for post in Post.objects:
        print(post.title)
        print('=' * len(post.title))

        if isinstance(post, TextPost):
            print(post.content)

        if isinstance(post, LinkPost):
            print('Link: {}'.format(post.link_url))

This would print the title of each post, followed by the content if it was a
text post, and "Link: <url>" if it was a link post.
@@ -283,7 +280,7 @@ your query. Let's adjust our query so that only posts with the tag "mongodb"
are returned::

    for post in Post.objects(tags='mongodb'):
        print(post.title)

There are also methods available on :class:`~mongoengine.queryset.QuerySet`
objects that allow different results to be returned, for example, calling
@@ -292,11 +289,11 @@ the first matched by the query you provide. Aggregation functions may also be
used on :class:`~mongoengine.queryset.QuerySet` objects::

    num_posts = Post.objects(tags='mongodb').count()
    print('Found {} posts with tag "mongodb"'.format(num_posts))

Learning more about MongoEngine
-------------------------------

If you got this far you've made a great start, so well done! The next step on
your MongoEngine journey is the `full user guide <guide/index.html>`_, where
you can learn in-depth about how to use MongoEngine and MongoDB.


@@ -2,6 +2,23 @@
Upgrading
#########
Development
***********
(Fill this out whenever you introduce breaking changes to MongoEngine)
0.12.0
******
This release includes various fixes for the `BaseQuerySet` methods and how they
are chained together. Since version 0.10.1, applying limit/skip/hint/batch_size
to an already-existing queryset wouldn't modify the underlying PyMongo cursor.
This has now been fixed, so make sure your code doesn't rely on the broken
behavior.

Additionally, the public `BaseQuerySet.clone_into` method has been renamed to a
private `_clone_into`. If you used that method directly in your code, you'll
need to rename its occurrences.
0.11.0
******

This release includes a major rehaul of MongoEngine's code quality and


@@ -23,7 +23,7 @@ __all__ = (list(document.__all__) + list(fields.__all__) +
           list(signals.__all__) + list(errors.__all__))

VERSION = (0, 12, 0)


def get_version():


@@ -429,7 +429,7 @@ class StrictDict(object):
    def __eq__(self, other):
        return self.items() == other.items()

    def __ne__(self, other):
        return self.items() != other.items()

    @classmethod
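The rename from ``__neq__`` to ``__ne__`` matters because ``__neq__`` is not a special method name Python ever looks up; the ``!=`` operator dispatches to ``__ne__``. A minimal demonstration (the ``Tracked`` class is illustrative only):

```python
class Tracked(object):
    """Records whether Python's `!=` operator actually calls __ne__."""

    def __init__(self):
        self.ne_called = False

    def __eq__(self, other):
        return True

    def __ne__(self, other):
        self.ne_called = True
        return False

    def __neq__(self, other):  # never invoked by the interpreter
        raise AssertionError('Python does not dispatch != to __neq__')

t = Tracked()
result = (t != Tracked())
assert t.ne_called       # != went through __ne__, not __neq__
assert result is False
```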


@@ -684,8 +684,13 @@ class BaseDocument(object):
        # class if unavailable
        class_name = son.get('_cls', cls._class_name)

        # Convert SON to a data dict, making sure each key is a string and
        # corresponds to the right db field.
        data = {}
        for key, value in son.iteritems():
            key = str(key)
            key = cls._db_field_map.get(key, key)
            data[key] = value

        # Return correct subclass for document type
        if class_name != cls._class_name:


@@ -193,7 +193,8 @@ class BaseField(object):
        EmbeddedDocument = _import_class('EmbeddedDocument')

        choice_list = self.choices
        if isinstance(next(iter(choice_list)), (list, tuple)):
            # next(iter) is useful for sets
            choice_list = [k for k, _ in choice_list]

        # Choices which are other types of Documents
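The switch from ``choice_list[0]`` to ``next(iter(choice_list))`` exists because ``choices`` may be given as a set, which supports iteration but not indexing. A quick sketch:

```python
choices = {('S', 'Small'), ('M', 'Medium')}  # choices given as a set of pairs

try:
    first = choices[0]  # fails: sets don't support indexing
except TypeError:
    first = next(iter(choices))  # works for any iterable

# The unwrapping step then extracts the stored values from the pairs.
if isinstance(first, (list, tuple)):
    values = [k for k, _ in choices]

assert sorted(values) == ['M', 'S']
```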


@@ -51,7 +51,9 @@ def register_connection(alias, name=None, host=None, port=None,
        MONGODB-CR (MongoDB Challenge Response protocol) for older servers.
    :param is_mock: explicitly use mongomock for this connection
        (can also be done by using `mongomock://` as db host prefix)
    :param kwargs: ad-hoc parameters to be passed into the pymongo driver,
        for example maxpoolsize, tz_aware, etc. See the documentation
        for pymongo's `MongoClient` for a full list.

    .. versionchanged:: 0.10.6 - added mongomock support
    """
@@ -241,9 +243,12 @@ def connect(db=None, alias=DEFAULT_CONNECTION_NAME, **kwargs):
    running on the default port on localhost. If authentication is needed,
    provide username and password arguments as well.

    Multiple databases are supported by using aliases. Provide a separate
    `alias` to connect to a different instance of :program:`mongod`.

    See the docstring for `register_connection` for more details about all
    supported kwargs.

    .. versionchanged:: 0.6 - added multiple database support.
    """
    if alias not in _connections:


@@ -1,3 +1,4 @@
from collections import OrderedDict
from bson import DBRef, SON
import six
@@ -201,6 +202,10 @@ class DeReference(object):
            as_tuple = isinstance(items, tuple)
            iterator = enumerate(items)
            data = []
        elif isinstance(items, OrderedDict):
            is_list = False
            iterator = items.iteritems()
            data = OrderedDict()
        else:
            is_list = False
            iterator = items.iteritems()
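The new ``OrderedDict`` branch keeps dereferenced results in the caller's key order because the output container mirrors the input container. A rough sketch of the pattern (``rebuild`` is illustrative, not the library's code; ``items()`` stands in for the Python 2 ``iteritems()``):

```python
from collections import OrderedDict

def rebuild(items):
    """Re-create a mapping using the same container class as the input,
    so an OrderedDict input keeps its key order in the output."""
    data = OrderedDict() if isinstance(items, OrderedDict) else {}
    for key, value in items.items():
        data[key] = value
    return data

items = OrderedDict([('b', 2), ('a', 1), ('c', 3)])
out = rebuild(items)
assert isinstance(out, OrderedDict)
assert list(out.keys()) == ['b', 'a', 'c']  # insertion order survives
```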


@@ -50,8 +50,8 @@ class FieldDoesNotExist(Exception):
    or an :class:`~mongoengine.EmbeddedDocument`.

    To avoid this behavior on data loading,
    you should set :attr:`strict` to ``False``
    in the :attr:`meta` dictionary.
    """


@@ -2,9 +2,11 @@ import datetime
import decimal
import itertools
import re
import socket
import time
import uuid
import warnings
from collections import Mapping
from operator import itemgetter

from bson import Binary, DBRef, ObjectId, SON
@@ -139,12 +141,12 @@ class URLField(StringField):
        # Check first if the scheme is valid
        scheme = value.split('://')[0].lower()
        if scheme not in self.schemes:
            self.error(u'Invalid scheme {} in URL: {}'.format(scheme, value))
            return

        # Then check full URL
        if not self.url_regex.match(value):
            self.error(u'Invalid URL: {}'.format(value))
            return
@@ -153,21 +155,105 @@ class EmailField(StringField):
    .. versionadded:: 0.4
    """

    USER_REGEX = re.compile(
        # `dot-atom` defined in RFC 5322 Section 3.2.3.
        r"(^[-!#$%&'*+/=?^_`{}|~0-9A-Z]+(\.[-!#$%&'*+/=?^_`{}|~0-9A-Z]+)*\Z"
        # `quoted-string` defined in RFC 5322 Section 3.2.4.
        r'|^"([\001-\010\013\014\016-\037!#-\[\]-\177]|\\[\001-\011\013\014\016-\177])*"\Z)',
        re.IGNORECASE
    )

    UTF8_USER_REGEX = re.compile(
        six.u(
            # RFC 6531 Section 3.3 extends `atext` (used by dot-atom) to
            # include `UTF8-non-ascii`.
            r"(^[-!#$%&'*+/=?^_`{}|~0-9A-Z\u0080-\U0010FFFF]+(\.[-!#$%&'*+/=?^_`{}|~0-9A-Z\u0080-\U0010FFFF]+)*\Z"
            # `quoted-string`
            r'|^"([\001-\010\013\014\016-\037!#-\[\]-\177]|\\[\001-\011\013\014\016-\177])*"\Z)'
        ), re.IGNORECASE | re.UNICODE
    )

    DOMAIN_REGEX = re.compile(
        r'((?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+)(?:[A-Z0-9-]{2,63}(?<!-))\Z',
        re.IGNORECASE
    )

    error_msg = u'Invalid email address: %s'
    def __init__(self, domain_whitelist=None, allow_utf8_user=False,
                 allow_ip_domain=False, *args, **kwargs):
        """Initialize the EmailField.

        Args:
            domain_whitelist (list) - list of otherwise invalid domain
                names which you'd like to support.
            allow_utf8_user (bool) - if True, the user part of the email
                address can contain UTF8 characters.
                False by default.
            allow_ip_domain (bool) - if True, the domain part of the email
                can be a valid IPv4 or IPv6 address.
        """
        self.domain_whitelist = domain_whitelist or []
        self.allow_utf8_user = allow_utf8_user
        self.allow_ip_domain = allow_ip_domain
        super(EmailField, self).__init__(*args, **kwargs)
    def validate_user_part(self, user_part):
        """Validate the user part of the email address. Return True if
        valid and False otherwise.
        """
        if self.allow_utf8_user:
            return self.UTF8_USER_REGEX.match(user_part)
        return self.USER_REGEX.match(user_part)
    def validate_domain_part(self, domain_part):
        """Validate the domain part of the email address. Return True if
        valid and False otherwise.
        """
        # Skip domain validation if it's in the whitelist.
        if domain_part in self.domain_whitelist:
            return True

        if self.DOMAIN_REGEX.match(domain_part):
            return True

        # Validate IPv4/IPv6, e.g. user@[192.168.0.1]
        if (
            self.allow_ip_domain and
            domain_part[0] == '[' and
            domain_part[-1] == ']'
        ):
            for addr_family in (socket.AF_INET, socket.AF_INET6):
                try:
                    socket.inet_pton(addr_family, domain_part[1:-1])
                    return True
                except (socket.error, UnicodeEncodeError):
                    pass

        return False
    def validate(self, value):
        super(EmailField, self).validate(value)

        if '@' not in value:
            self.error(self.error_msg % value)

        user_part, domain_part = value.rsplit('@', 1)

        # Validate the user part.
        if not self.validate_user_part(user_part):
            self.error(self.error_msg % value)

        # Validate the domain and, if invalid, see if it's IDN-encoded.
        if not self.validate_domain_part(domain_part):
            try:
                domain_part = domain_part.encode('idna').decode('ascii')
            except UnicodeError:
                self.error(self.error_msg % value)
            else:
                if not self.validate_domain_part(domain_part):
                    self.error(self.error_msg % value)
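The IDNA fallback and the IP-literal check can be exercised in isolation. The regex below is the field's ``DOMAIN_REGEX``; the surrounding driver code is a sketch:

```python
import re
import socket

DOMAIN_REGEX = re.compile(
    r'((?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+)(?:[A-Z0-9-]{2,63}(?<!-))\Z',
    re.IGNORECASE
)

# An internationalized domain fails the ASCII regex, but its
# IDNA ("xn--") encoding passes -- the same fallback validate() uses.
domain = u'm\xfcnchen.de'
assert not DOMAIN_REGEX.match(domain)
ascii_domain = domain.encode('idna').decode('ascii')
assert ascii_domain == 'xn--mnchen-3ya.de'
assert DOMAIN_REGEX.match(ascii_domain)

# With allow_ip_domain, a literal like [192.168.0.1] is checked with
# inet_pton instead of the domain regex; no exception means it's valid.
socket.inet_pton(socket.AF_INET, '192.168.0.1')
```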
class IntField(BaseField):
    """32-bit integer field."""
@@ -619,6 +705,14 @@ class DynamicField(BaseField):
    Used by :class:`~mongoengine.DynamicDocument` to handle dynamic data"""

    def __init__(self, container_class=dict, *args, **kwargs):
        self._container_cls = container_class
        if not issubclass(self._container_cls, Mapping):
            self.error('The class specified in the `container_class` '
                       'parameter must be a subclass of `Mapping`.')
        super(DynamicField, self).__init__(*args, **kwargs)
    def to_mongo(self, value, use_db_field=True, fields=None):
        """Convert a Python type to a MongoDB compatible type.
        """
@@ -644,7 +738,7 @@ class DynamicField(BaseField):
            is_list = True
            value = {k: v for k, v in enumerate(value)}

        data = self._container_cls()
        for k, v in value.iteritems():
            data[k] = self.to_mongo(v, use_db_field, fields)
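Why the container class matters: rebuilding the output with the configured mapping type preserves key order when that type is ``OrderedDict``. A standalone sketch (``convert`` is illustrative, not the library's method):

```python
from collections import OrderedDict

def convert(value, container_class=OrderedDict):
    """Rebuild a mapping with a chosen container class, converting each
    value recursively -- the pattern to_mongo() follows above."""
    if not isinstance(value, dict):
        return value
    data = container_class()
    for k, v in value.items():
        data[k] = convert(v, container_class)
    return data

doc = OrderedDict([('z', 1), ('a', OrderedDict([('y', 2), ('b', 3)]))])
out = convert(doc)
assert list(out.keys()) == ['z', 'a']        # top-level order kept
assert list(out['a'].keys()) == ['y', 'b']   # nested order kept too
```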
@@ -888,10 +982,6 @@ class ReferenceField(BaseField):
        Foo.register_delete_rule(Bar, 'foo', NULLIFY)

    .. versionchanged:: 0.5 added `reverse_delete_rule`
    """
@@ -1002,8 +1092,8 @@ class ReferenceField(BaseField):
    def validate(self, value):
        if not isinstance(value, (self.document_type, DBRef, ObjectId)):
            self.error('A ReferenceField only accepts DBRef, ObjectId or documents')

        if isinstance(value, Document) and value.id is None:
            self.error('You can only reference documents once they have been '


@@ -86,6 +86,7 @@ class BaseQuerySet(object):
        self._batch_size = None
        self.only_fields = []
        self._max_time_ms = None
        self._comment = None

    def __call__(self, q_obj=None, class_check=True, read_preference=None,
                 **query):
@@ -157,44 +158,49 @@ class BaseQuerySet(object):
        # self._cursor
    def __getitem__(self, key):
        """Return a document instance corresponding to a given index if
        the key is an integer. If the key is a slice, translate its
        bounds into a skip and a limit, and return a cloned queryset
        with that skip/limit applied. For example:

        >>> User.objects[0]
        <User: User object>
        >>> User.objects[1:3]
        [<User: User object>, <User: User object>]
        """
        queryset = self.clone()

        # Handle a slice
        if isinstance(key, slice):
            queryset._cursor_obj = queryset._cursor[key]
            queryset._skip, queryset._limit = key.start, key.stop
            if key.start and key.stop:
                queryset._limit = key.stop - key.start

            # Allow further QuerySet modifications to be performed
            return queryset

        # Handle an index
        elif isinstance(key, int):
            if queryset._scalar:
                return queryset._get_scalar(
                    queryset._document._from_son(
                        queryset._cursor[key],
                        _auto_dereference=self._auto_dereference,
                        only_fields=self.only_fields
                    )
                )

            if queryset._as_pymongo:
                return queryset._get_as_pymongo(queryset._cursor[key])

            return queryset._document._from_son(
                queryset._cursor[key],
                _auto_dereference=self._auto_dereference,
                only_fields=self.only_fields
            )

        raise AttributeError('Provide a slice or an integer index')
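The slice branch boils down to a small translation from Python slice bounds to MongoDB skip/limit values, which can be checked on its own (``slice_to_skip_limit`` is an illustrative extraction, not library API):

```python
def slice_to_skip_limit(key):
    """Translate a slice into the (skip, limit) pair that __getitem__
    stores on the cloned queryset."""
    skip, limit = key.start, key.stop
    if key.start and key.stop:
        limit = key.stop - key.start
    return skip, limit

assert slice_to_skip_limit(slice(1, 3)) == (1, 2)         # [1:3] -> skip 1, limit 2
assert slice_to_skip_limit(slice(None, 5)) == (None, 5)   # [:5]  -> limit 5
assert slice_to_skip_limit(slice(4, None)) == (4, None)   # [4:]  -> skip 4
```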
    def __iter__(self):
        raise NotImplementedError
@@ -285,7 +291,7 @@ class BaseQuerySet(object):
        .. versionadded:: 0.4
        """
        return self._document(**kwargs).save(force_insert=True)

    def first(self):
        """Retrieve the first object matching the query."""
@@ -706,39 +712,36 @@ class BaseQuerySet(object):
        with switch_db(self._document, alias) as cls:
            collection = cls._get_collection()

        return self._clone_into(self.__class__(self._document, collection))
    def clone(self):
        """Create a copy of the current queryset."""
        return self._clone_into(self.__class__(self._document, self._collection_obj))

    def _clone_into(self, new_qs):
        """Copy all of the relevant properties of this queryset to
        a new queryset (which has to be an instance of
        :class:`~mongoengine.queryset.base.BaseQuerySet`).
        """
        if not isinstance(new_qs, BaseQuerySet):
            raise OperationError(
                '%s is not a subclass of BaseQuerySet' % new_qs.__name__)

        copy_props = ('_mongo_query', '_initial_query', '_none', '_query_obj',
                      '_where_clause', '_loaded_fields', '_ordering', '_snapshot',
                      '_timeout', '_class_check', '_slave_okay', '_read_preference',
                      '_iter', '_scalar', '_as_pymongo', '_as_pymongo_coerce',
                      '_limit', '_skip', '_hint', '_auto_dereference',
                      '_search_text', 'only_fields', '_max_time_ms', '_comment')

        for prop in copy_props:
            val = getattr(self, prop)
            setattr(new_qs, prop, copy.copy(val))

        if self._cursor_obj:
            new_qs._cursor_obj = self._cursor_obj.clone()

        return new_qs
    def select_related(self, max_depth=1):
        """Handles dereferencing of :class:`~bson.dbref.DBRef` objects or
@@ -760,7 +763,11 @@ class BaseQuerySet(object):
         """
         queryset = self.clone()
         queryset._limit = n if n != 0 else 1
-        # Return self to allow chaining
+
+        # If a cursor object has already been created, apply the limit to it.
+        if queryset._cursor_obj:
+            queryset._cursor_obj.limit(queryset._limit)
+
         return queryset
     def skip(self, n):

@@ -771,6 +778,11 @@ class BaseQuerySet(object):
         """
         queryset = self.clone()
         queryset._skip = n
+
+        # If a cursor object has already been created, apply the skip to it.
+        if queryset._cursor_obj:
+            queryset._cursor_obj.skip(queryset._skip)
+
         return queryset
     def hint(self, index=None):

@@ -788,6 +800,11 @@ class BaseQuerySet(object):
         """
         queryset = self.clone()
         queryset._hint = index
+
+        # If a cursor object has already been created, apply the hint to it.
+        if queryset._cursor_obj:
+            queryset._cursor_obj.hint(queryset._hint)
+
         return queryset
     def batch_size(self, size):

@@ -801,6 +818,11 @@ class BaseQuerySet(object):
         """
         queryset = self.clone()
         queryset._batch_size = size
+
+        # If a cursor object has already been created, apply the batch size to it.
+        if queryset._cursor_obj:
+            queryset._cursor_obj.batch_size(queryset._batch_size)
+
         return queryset

     def distinct(self, field):
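The four hunks above (`limit`, `skip`, `hint`, `batch_size`) share one fix: after cloning, if a PyMongo cursor was already created for the queryset, the option is pushed onto that cursor too, so it is not silently ignored. A toy sketch of the pattern (the classes below are stand-ins; the real code also clones the cursor rather than sharing it):

```python
import copy

class RecordingCursor(object):
    """Toy stand-in for a PyMongo cursor that records applied options."""
    def __init__(self):
        self.applied = {}
    def limit(self, n):
        self.applied['limit'] = n
        return self

class QS(object):
    def __init__(self):
        self._cursor_obj = None
        self._limit = None

    def clone(self):
        return copy.copy(self)

    def limit(self, n):
        queryset = self.clone()
        queryset._limit = n if n != 0 else 1
        # If a cursor already exists, push the option onto it as well,
        # so an already-evaluated queryset doesn't drop the call.
        if queryset._cursor_obj:
            queryset._cursor_obj.limit(queryset._limit)
        return queryset

qs = QS()
qs._cursor_obj = RecordingCursor()  # simulate a queryset that was iterated
qs2 = qs.limit(10)
print(qs2._cursor_obj.applied)
```

Note the `n if n != 0 else 1` guard carried over from the diff: `limit(0)` is coerced to `1` rather than meaning "no limit".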
@@ -972,13 +994,31 @@ class BaseQuerySet(object):
     def order_by(self, *keys):
         """Order the :class:`~mongoengine.queryset.QuerySet` by the keys. The
         order may be specified by prepending each of the keys by a + or a -.
-        Ascending order is assumed.
+        Ascending order is assumed. If no keys are passed, existing ordering
+        is cleared instead.

         :param keys: fields to order the query results by; keys may be
             prefixed with **+** or **-** to determine the ordering direction
         """
         queryset = self.clone()
-        queryset._ordering = queryset._get_order_by(keys)
+
+        old_ordering = queryset._ordering
+        new_ordering = queryset._get_order_by(keys)
+
+        if queryset._cursor_obj:
+            # If a cursor object has already been created, apply the sort to it
+            if new_ordering:
+                queryset._cursor_obj.sort(new_ordering)
+
+            # If we're trying to clear a previous explicit ordering, we need
+            # to clear the cursor entirely (because PyMongo doesn't allow
+            # clearing an existing sort on a cursor).
+            elif old_ordering:
+                queryset._cursor_obj = None
+
+        queryset._ordering = new_ordering
+
         return queryset

     def comment(self, text):
@@ -1424,10 +1464,13 @@ class BaseQuerySet(object):
             raise StopIteration

         raw_doc = self._cursor.next()
+
         if self._as_pymongo:
             return self._get_as_pymongo(raw_doc)
-        doc = self._document._from_son(raw_doc,
-            _auto_dereference=self._auto_dereference, only_fields=self.only_fields)
+
+        doc = self._document._from_son(
+            raw_doc, _auto_dereference=self._auto_dereference,
+            only_fields=self.only_fields)

         if self._scalar:
             return self._get_scalar(doc)
@@ -1437,7 +1480,6 @@ class BaseQuerySet(object):
     def rewind(self):
         """Rewind the cursor to its unevaluated state.
-
         .. versionadded:: 0.3
         """
         self._iter = False
@@ -1487,43 +1529,54 @@ class BaseQuerySet(object):
     @property
     def _cursor(self):
-        if self._cursor_obj is None:
-            # In PyMongo 3+, we define the read preference on a collection
-            # level, not a cursor level. Thus, we need to get a cloned
-            # collection object using `with_options` first.
-            if IS_PYMONGO_3 and self._read_preference is not None:
-                self._cursor_obj = self._collection\
-                    .with_options(read_preference=self._read_preference)\
-                    .find(self._query, **self._cursor_args)
-            else:
-                self._cursor_obj = self._collection.find(self._query,
-                                                         **self._cursor_args)
-            # Apply where clauses to cursor
-            if self._where_clause:
-                where_clause = self._sub_js_fields(self._where_clause)
-                self._cursor_obj.where(where_clause)
+        """Return a PyMongo cursor object corresponding to this queryset."""
+
+        # If _cursor_obj already exists, return it immediately.
+        if self._cursor_obj is not None:
+            return self._cursor_obj
+
+        # Create a new PyMongo cursor.
+        # XXX In PyMongo 3+, we define the read preference on a collection
+        # level, not a cursor level. Thus, we need to get a cloned collection
+        # object using `with_options` first.
+        if IS_PYMONGO_3 and self._read_preference is not None:
+            self._cursor_obj = self._collection\
+                .with_options(read_preference=self._read_preference)\
+                .find(self._query, **self._cursor_args)
+        else:
+            self._cursor_obj = self._collection.find(self._query,
+                                                     **self._cursor_args)
+
+        # Apply "where" clauses to cursor
+        if self._where_clause:
+            where_clause = self._sub_js_fields(self._where_clause)
+            self._cursor_obj.where(where_clause)

-            if self._ordering:
-                # Apply query ordering
-                self._cursor_obj.sort(self._ordering)
-            elif self._ordering is None and self._document._meta['ordering']:
-                # Otherwise, apply the ordering from the document model, unless
-                # it's been explicitly cleared via order_by with no arguments
-                order = self._get_order_by(self._document._meta['ordering'])
-                self._cursor_obj.sort(order)
+        # Apply ordering to the cursor.
+        # XXX self._ordering can be equal to:
+        # * None if we didn't explicitly call order_by on this queryset.
+        # * A list of PyMongo-style sorting tuples.
+        # * An empty list if we explicitly called order_by() without any
+        #   arguments. This indicates that we want to clear the default
+        #   ordering.
+        if self._ordering:
+            # explicit ordering
+            self._cursor_obj.sort(self._ordering)
+        elif self._ordering is None and self._document._meta['ordering']:
+            # default ordering
+            order = self._get_order_by(self._document._meta['ordering'])
+            self._cursor_obj.sort(order)

-            if self._limit is not None:
-                self._cursor_obj.limit(self._limit)
+        if self._limit is not None:
+            self._cursor_obj.limit(self._limit)

-            if self._skip is not None:
-                self._cursor_obj.skip(self._skip)
+        if self._skip is not None:
+            self._cursor_obj.skip(self._skip)

-            if self._hint != -1:
-                self._cursor_obj.hint(self._hint)
+        if self._hint != -1:
+            self._cursor_obj.hint(self._hint)

-            if self._batch_size is not None:
-                self._cursor_obj.batch_size(self._batch_size)
+        if self._batch_size is not None:
+            self._cursor_obj.batch_size(self._batch_size)

         return self._cursor_obj
@@ -1698,7 +1751,13 @@ class BaseQuerySet(object):
         return ret

     def _get_order_by(self, keys):
-        """Creates a list of order by fields"""
+        """Given a list of MongoEngine-style sort keys, return a list
+        of sorting tuples that can be applied to a PyMongo cursor. For
+        example:
+
+        >>> qs._get_order_by(['-last_name', 'first_name'])
+        [('last_name', -1), ('first_name', 1)]
+        """
         key_list = []
         for key in keys:
             if not key:

@@ -1711,17 +1770,19 @@ class BaseQuerySet(object):
             direction = pymongo.ASCENDING
             if key[0] == '-':
                 direction = pymongo.DESCENDING
+
             if key[0] in ('-', '+'):
                 key = key[1:]
+
             key = key.replace('__', '.')
             try:
                 key = self._document._translate_field_name(key)
             except Exception:
+                # TODO this exception should be more specific
                 pass
+
             key_list.append((key, direction))

-        if self._cursor_obj and key_list:
-            self._cursor_obj.sort(key_list)
         return key_list

     def _get_scalar(self, doc):
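With the cursor side effect removed, `_get_order_by` is now a pure key parser, which makes it easy to exercise standalone. A self-contained version of the parsing shown in the hunk (field-name translation against the document class is omitted, and the direction constants stand in for `pymongo.ASCENDING`/`pymongo.DESCENDING`):

```python
ASCENDING, DESCENDING = 1, -1  # same values as pymongo's constants

def get_order_by(keys):
    """Convert MongoEngine-style sort keys into PyMongo sort tuples."""
    key_list = []
    for key in keys:
        if not key:
            continue
        # A leading '-' means descending; '+' (or no prefix) means ascending.
        direction = DESCENDING if key[0] == '-' else ASCENDING
        if key[0] in ('-', '+'):
            key = key[1:]
        # Double underscores address embedded fields, dot-style.
        key_list.append((key.replace('__', '.'), direction))
    return key_list

print(get_order_by(['-last_name', 'first_name', 'address__city']))
```

This matches the docstring example added in the diff: `['-last_name', 'first_name']` becomes `[('last_name', -1), ('first_name', 1)]`.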
@@ -1819,10 +1880,21 @@ class BaseQuerySet(object):
         return code

     def _chainable_method(self, method_name, val):
-        """call a particular chainable method
-        with the provided value.
-        """
+        """Call a particular method on the PyMongo cursor
+        with the provided value.
+        """
         queryset = self.clone()
-        method = getattr(queryset._cursor, method_name)
-        method(val)
+
+        # Get an existing cursor object or create a new one
+        cursor = queryset._cursor
+
+        # Find the requested method on the cursor and call it with the
+        # provided value
+        getattr(cursor, method_name)(val)
+
+        # Cache the value on the queryset._{method_name}
+
         setattr(queryset, '_' + method_name, val)
+
         return queryset

     # Deprecated
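`_chainable_method` is the generic building block behind cursor-option methods such as `max_time_ms`: call the identically named method on the cursor, then cache the value as `_<method_name>` so clones can replay it. A toy sketch of the pattern (in the real code `_cursor` is the lazy property shown earlier; here it is a plain attribute for brevity):

```python
import copy

class FakeCursor(object):
    """Records method calls, standing in for a PyMongo cursor."""
    def __init__(self):
        self.calls = []
    def max_time_ms(self, val):
        self.calls.append(('max_time_ms', val))

class ChainableQS(object):
    def __init__(self):
        self._cursor = FakeCursor()

    def clone(self):
        return copy.copy(self)

    def _chainable_method(self, method_name, val):
        queryset = self.clone()
        # Call e.g. cursor.max_time_ms(val) on the clone's cursor...
        getattr(queryset._cursor, method_name)(val)
        # ...and cache the value as queryset._max_time_ms for later clones.
        setattr(queryset, '_' + method_name, val)
        return queryset

    def max_time_ms(self, val):
        return self._chainable_method('max_time_ms', val)

qs = ChainableQS().max_time_ms(500)
print(qs._max_time_ms, qs._cursor.calls)
```

Caching the value on the queryset (not only on the cursor) is what keeps the option alive across `clone()`, since a clone copies attributes but may build a brand-new cursor later.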

View File

@@ -136,13 +136,15 @@ class QuerySet(BaseQuerySet):
         return self._len

     def no_cache(self):
-        """Convert to a non_caching queryset
+        """Convert to a non-caching queryset

         .. versionadded:: 0.8.3 Convert to non caching queryset
         """
         if self._result_cache is not None:
             raise OperationError('QuerySet already cached')
-        return self.clone_into(QuerySetNoCache(self._document, self._collection))
+
+        return self._clone_into(QuerySetNoCache(self._document,
+                                                self._collection))


 class QuerySetNoCache(BaseQuerySet):

@@ -153,7 +155,7 @@ class QuerySetNoCache(BaseQuerySet):
         .. versionadded:: 0.8.3 Convert to caching queryset
         """
-        return self.clone_into(QuerySet(self._document, self._collection))
+        return self._clone_into(QuerySet(self._document, self._collection))

     def __repr__(self):
         """Provides the string representation of the QuerySet

View File

@@ -2,14 +2,14 @@
 import unittest
 import sys

-import pymongo
 from nose.plugins.skip import SkipTest
 from datetime import datetime

+import pymongo
+
 from mongoengine import *
-from mongoengine.connection import get_db, get_connection
+from mongoengine.connection import get_db
+from tests.utils import get_mongodb_version, needs_mongodb_v26

 __all__ = ("IndexesTest", )
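The new `tests.utils` helpers replace ad-hoc `server_info()['versionArray']` checks with `get_mongodb_version()` and a `needs_mongodb_v26` skip decorator. The implementation of those helpers is not part of this diff; a plausible sketch of how such a version-gating decorator can be built (names here are hypothetical):

```python
import functools
import unittest

def needs_min_version(min_version, get_version):
    """Decorator factory in the spirit of `needs_mongodb_v26`:
    skip the test when the server is older than `min_version`.
    (A guess at the pattern; the real tests.utils code is not shown.)"""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if get_version() < min_version:
                raise unittest.SkipTest(
                    'requires server >= %d.%d' % min_version)
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@needs_min_version((2, 6), lambda: (3, 2))
def test_text_indexes():
    return 'ran'

@needs_min_version((2, 6), lambda: (2, 4))
def test_on_old_server():
    return 'ran'

print(test_text_indexes())  # runs: (3, 2) >= (2, 6)
```

Raising `unittest.SkipTest` inside the wrapper is what both nose and the stdlib runner interpret as a skipped (not failed) test, which is why the diff can delete the inline `raise SkipTest(...)` blocks below.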
@@ -412,7 +412,6 @@ class IndexesTest(unittest.TestCase):
         User.ensure_indexes()
         info = User.objects._collection.index_information()
         self.assertEqual(sorted(info.keys()), ['_cls_1_user_guid_1', '_id_'])
-        User.drop_collection()

     def test_embedded_document_index(self):
         """Tests settings an index on an embedded document

@@ -434,7 +433,6 @@ class IndexesTest(unittest.TestCase):
         info = BlogPost.objects._collection.index_information()
         self.assertEqual(sorted(info.keys()), ['_id_', 'date.yr_-1'])
-        BlogPost.drop_collection()

     def test_list_embedded_document_index(self):
         """Ensure list embedded documents can be indexed

@@ -461,7 +459,6 @@ class IndexesTest(unittest.TestCase):
         post1 = BlogPost(title="Embedded Indexes tests in place",
                          tags=[Tag(name="about"), Tag(name="time")])
         post1.save()
-        BlogPost.drop_collection()

     def test_recursive_embedded_objects_dont_break_indexes(self):

@@ -494,8 +491,7 @@ class IndexesTest(unittest.TestCase):
         obj = Test(a=1)
         obj.save()

-        connection = get_connection()
-        IS_MONGODB_3 = connection.server_info()['versionArray'][0] >= 3
+        IS_MONGODB_3 = get_mongodb_version()[0] >= 3

         # Need to be explicit about covered indexes as mongoDB doesn't know if
         # the documents returned might have more keys in that here.

@@ -623,8 +619,6 @@ class IndexesTest(unittest.TestCase):
         post3 = BlogPost(title='test3', date=Date(year=2010), slug='test')
         self.assertRaises(OperationError, post3.save)

-        BlogPost.drop_collection()
-
     def test_unique_embedded_document(self):
         """Ensure that uniqueness constraints are applied to fields on embedded documents.
         """

@@ -652,8 +646,6 @@ class IndexesTest(unittest.TestCase):
                          sub=SubDocument(year=2010, slug='test'))
         self.assertRaises(NotUniqueError, post3.save)

-        BlogPost.drop_collection()
-
     def test_unique_embedded_document_in_list(self):
         """
         Ensure that the uniqueness constraints are applied to fields in

@@ -684,8 +676,6 @@ class IndexesTest(unittest.TestCase):
         self.assertRaises(NotUniqueError, post2.save)

-        BlogPost.drop_collection()
-
     def test_unique_with_embedded_document_and_embedded_unique(self):
         """Ensure that uniqueness constraints are applied to fields on
         embedded documents. And work with unique_with as well.

@@ -719,8 +709,6 @@ class IndexesTest(unittest.TestCase):
                          sub=SubDocument(year=2009, slug='test-1'))
         self.assertRaises(NotUniqueError, post3.save)

-        BlogPost.drop_collection()
-
     def test_ttl_indexes(self):

         class Log(Document):

@@ -733,14 +721,6 @@ class IndexesTest(unittest.TestCase):
         Log.drop_collection()

-        if pymongo.version_tuple[0] < 2 and pymongo.version_tuple[1] < 3:
-            raise SkipTest('pymongo needs to be 2.3 or higher for this test')
-
-        connection = get_connection()
-        version_array = connection.server_info()['versionArray']
-        if version_array[0] < 2 and version_array[1] < 2:
-            raise SkipTest('MongoDB needs to be 2.2 or higher for this test')
-
         # Indexes are lazy so use list() to perform query
         list(Log.objects)
         info = Log.objects._collection.index_information()

@@ -768,13 +748,11 @@ class IndexesTest(unittest.TestCase):
             raise AssertionError("We saved a dupe!")
         except NotUniqueError:
             pass
-        Customer.drop_collection()

     def test_unique_and_primary(self):
         """If you set a field as primary, then unexpected behaviour can occur.
         You won't create a duplicate but you will update an existing document.
         """
         class User(Document):
             name = StringField(primary_key=True, unique=True)
             password = StringField()

@@ -790,8 +768,23 @@ class IndexesTest(unittest.TestCase):
         self.assertEqual(User.objects.count(), 1)
         self.assertEqual(User.objects.get().password, 'secret2')

+    def test_unique_and_primary_create(self):
+        """Create a new record with a duplicate primary key
+        throws an exception
+        """
+        class User(Document):
+            name = StringField(primary_key=True)
+            password = StringField()
+
         User.drop_collection()

+        User.objects.create(name='huangz', password='secret')
+        with self.assertRaises(NotUniqueError):
+            User.objects.create(name='huangz', password='secret2')
+
+        self.assertEqual(User.objects.count(), 1)
+        self.assertEqual(User.objects.get().password, 'secret')
+
     def test_index_with_pk(self):
         """Ensure you can use `pk` as part of a query"""

@@ -874,8 +867,8 @@ class IndexesTest(unittest.TestCase):
                          info['provider_ids.foo_1_provider_ids.bar_1']['key'])
         self.assertTrue(info['provider_ids.foo_1_provider_ids.bar_1']['sparse'])

+    @needs_mongodb_v26
     def test_text_indexes(self):
         class Book(Document):
             title = DictField()
             meta = {

View File

@@ -28,8 +28,6 @@ TEST_IMAGE_PATH = os.path.join(os.path.dirname(__file__),
 __all__ = ("InstanceTest",)


 class InstanceTest(unittest.TestCase):
-
     def setUp(self):

@@ -72,8 +70,7 @@ class InstanceTest(unittest.TestCase):
         self.assertEqual(field._instance, instance)

     def test_capped_collection(self):
-        """Ensure that capped collections work properly.
-        """
+        """Ensure that capped collections work properly."""
         class Log(Document):
             date = DateTimeField(default=datetime.now)
             meta = {

@@ -181,8 +178,7 @@ class InstanceTest(unittest.TestCase):
         self.assertEqual('<Article: привет мир>', repr(doc))

     def test_repr_none(self):
-        """Ensure None values handled correctly
-        """
+        """Ensure None values are handled correctly."""
         class Article(Document):
             title = StringField()

@@ -190,25 +186,23 @@ class InstanceTest(unittest.TestCase):
             return None

         doc = Article(title=u'привет мир')
+
         self.assertEqual('<Article: None>', repr(doc))

     def test_queryset_resurrects_dropped_collection(self):
         self.Person.drop_collection()
+
         self.assertEqual([], list(self.Person.objects()))

-        # Ensure works correctly with inhertited classes
         class Actor(self.Person):
             pass

+        # Ensure works correctly with inhertited classes
         Actor.objects()
         self.Person.drop_collection()
         self.assertEqual([], list(Actor.objects()))

     def test_polymorphic_references(self):
-        """Ensure that the correct subclasses are returned from a query when
-        using references / generic references
+        """Ensure that the correct subclasses are returned from a query
+        when using references / generic references
         """
         class Animal(Document):
             meta = {'allow_inheritance': True}

@@ -258,9 +252,6 @@ class InstanceTest(unittest.TestCase):
         classes = [a.__class__ for a in Zoo.objects.first().animals]
         self.assertEqual(classes, [Animal, Fish, Mammal, Dog, Human])

-        Zoo.drop_collection()
-        Animal.drop_collection()
-
     def test_reference_inheritance(self):
         class Stats(Document):
             created = DateTimeField(default=datetime.now)
@@ -287,8 +278,7 @@ class InstanceTest(unittest.TestCase):
         self.assertEqual(list_stats, CompareStats.objects.first().stats)

     def test_db_field_load(self):
-        """Ensure we load data correctly
-        """
+        """Ensure we load data correctly from the right db field."""
         class Person(Document):
             name = StringField(required=True)
             _rank = StringField(required=False, db_field="rank")

@@ -307,8 +297,7 @@ class InstanceTest(unittest.TestCase):
         self.assertEqual(Person.objects.get(name="Fred").rank, "Private")

     def test_db_embedded_doc_field_load(self):
-        """Ensure we load embedded document data correctly
-        """
+        """Ensure we load embedded document data correctly."""
         class Rank(EmbeddedDocument):
             title = StringField(required=True)

@@ -333,8 +322,7 @@ class InstanceTest(unittest.TestCase):
         self.assertEqual(Person.objects.get(name="Fred").rank, "Private")

     def test_custom_id_field(self):
-        """Ensure that documents may be created with custom primary keys.
-        """
+        """Ensure that documents may be created with custom primary keys."""
         class User(Document):
             username = StringField(primary_key=True)
             name = StringField()

@@ -382,10 +370,7 @@ class InstanceTest(unittest.TestCase):
         self.assertEqual(user_son['_id'], 'mongo')
         self.assertTrue('username' not in user_son['_id'])

-        User.drop_collection()
-
     def test_document_not_registered(self):
-
         class Place(Document):
             name = StringField()

@@ -407,7 +392,6 @@ class InstanceTest(unittest.TestCase):
             list(Place.objects.all())

     def test_document_registry_regressions(self):
-
         class Location(Document):
             name = StringField()
             meta = {'allow_inheritance': True}

@@ -421,18 +405,16 @@ class InstanceTest(unittest.TestCase):
         self.assertEqual(Area, get_document("Location.Area"))

     def test_creation(self):
-        """Ensure that document may be created using keyword arguments.
-        """
+        """Ensure that document may be created using keyword arguments."""
         person = self.Person(name="Test User", age=30)
         self.assertEqual(person.name, "Test User")
         self.assertEqual(person.age, 30)

     def test_to_dbref(self):
-        """Ensure that you can get a dbref of a document"""
+        """Ensure that you can get a dbref of a document."""
         person = self.Person(name="Test User", age=30)
         self.assertRaises(OperationError, person.to_dbref)
+
         person.save()
         person.to_dbref()

     def test_save_abstract_document(self):
@@ -445,8 +427,7 @@ class InstanceTest(unittest.TestCase):
         Doc(name='aaa').save()

     def test_reload(self):
-        """Ensure that attributes may be reloaded.
-        """
+        """Ensure that attributes may be reloaded."""
         person = self.Person(name="Test User", age=20)
         person.save()

@@ -479,7 +460,6 @@ class InstanceTest(unittest.TestCase):
         doc = Animal(superphylum='Deuterostomia')
         doc.save()
         doc.reload()
-        Animal.drop_collection()

     def test_reload_sharded_nested(self):
         class SuperPhylum(EmbeddedDocument):

@@ -493,11 +473,9 @@ class InstanceTest(unittest.TestCase):
         doc = Animal(superphylum=SuperPhylum(name='Deuterostomia'))
         doc.save()
         doc.reload()
-        Animal.drop_collection()

     def test_reload_referencing(self):
-        """Ensures reloading updates weakrefs correctly
-        """
+        """Ensures reloading updates weakrefs correctly."""
         class Embedded(EmbeddedDocument):
             dict_field = DictField()
             list_field = ListField()

@@ -569,8 +547,7 @@ class InstanceTest(unittest.TestCase):
             self.assertFalse("Threw wrong exception")

     def test_reload_of_non_strict_with_special_field_name(self):
-        """Ensures reloading works for documents with meta strict == False
-        """
+        """Ensures reloading works for documents with meta strict == False."""
         class Post(Document):
             meta = {
                 'strict': False

@@ -591,8 +568,7 @@ class InstanceTest(unittest.TestCase):
         self.assertEqual(post.items, ["more lorem", "even more ipsum"])

     def test_dictionary_access(self):
-        """Ensure that dictionary-style field access works properly.
-        """
+        """Ensure that dictionary-style field access works properly."""
         person = self.Person(name='Test User', age=30, job=self.Job())
         self.assertEqual(person['name'], 'Test User')

@@ -634,8 +610,7 @@ class InstanceTest(unittest.TestCase):
         self.assertEqual(sub_doc.to_mongo().keys(), ['id'])

     def test_embedded_document(self):
-        """Ensure that embedded documents are set up correctly.
-        """
+        """Ensure that embedded documents are set up correctly."""
         class Comment(EmbeddedDocument):
             content = StringField()

@@ -643,8 +618,7 @@ class InstanceTest(unittest.TestCase):
         self.assertFalse('id' in Comment._fields)

     def test_embedded_document_instance(self):
-        """Ensure that embedded documents can reference parent instance
-        """
+        """Ensure that embedded documents can reference parent instance."""
         class Embedded(EmbeddedDocument):
             string = StringField()

@@ -652,6 +626,7 @@ class InstanceTest(unittest.TestCase):
             embedded_field = EmbeddedDocumentField(Embedded)

         Doc.drop_collection()
+
         doc = Doc(embedded_field=Embedded(string="Hi"))
         self.assertHasInstance(doc.embedded_field, doc)
@@ -661,7 +636,8 @@ class InstanceTest(unittest.TestCase):

     def test_embedded_document_complex_instance(self):
         """Ensure that embedded documents in complex fields can reference
-        parent instance"""
+        parent instance.
+        """
         class Embedded(EmbeddedDocument):
             string = StringField()

@@ -677,8 +653,7 @@ class InstanceTest(unittest.TestCase):
         self.assertHasInstance(doc.embedded_field[0], doc)

     def test_embedded_document_complex_instance_no_use_db_field(self):
-        """Ensure that use_db_field is propagated to list of Emb Docs
-        """
+        """Ensure that use_db_field is propagated to list of Emb Docs."""
         class Embedded(EmbeddedDocument):
             string = StringField(db_field='s')

@@ -690,7 +665,6 @@ class InstanceTest(unittest.TestCase):
         self.assertEqual(d['embedded_field'], [{'string': 'Hi'}])

     def test_instance_is_set_on_setattr(self):
-
         class Email(EmbeddedDocument):
             email = EmailField()

@@ -698,6 +672,7 @@ class InstanceTest(unittest.TestCase):
             email = EmbeddedDocumentField(Email)

         Account.drop_collection()
+
         acc = Account()
         acc.email = Email(email='test@example.com')
         self.assertHasInstance(acc._data["email"], acc)

@@ -707,7 +682,6 @@ class InstanceTest(unittest.TestCase):
         self.assertHasInstance(acc1._data["email"], acc1)

     def test_instance_is_set_on_setattr_on_embedded_document_list(self):
-
         class Email(EmbeddedDocument):
             email = EmailField()

@@ -853,32 +827,28 @@ class InstanceTest(unittest.TestCase):
         self.assertDbEqual([dict(other_doc.to_mongo()), dict(doc.to_mongo())])

     def test_save(self):
-        """Ensure that a document may be saved in the database.
-        """
+        """Ensure that a document may be saved in the database."""
+
         # Create person object and save it to the database
         person = self.Person(name='Test User', age=30)
         person.save()
+
         # Ensure that the object is in the database
         collection = self.db[self.Person._get_collection_name()]
         person_obj = collection.find_one({'name': 'Test User'})
         self.assertEqual(person_obj['name'], 'Test User')
         self.assertEqual(person_obj['age'], 30)
         self.assertEqual(person_obj['_id'], person.id)

         # Test skipping validation on save
         class Recipient(Document):
             email = EmailField(required=True)

-        recipient = Recipient(email='root@localhost')
+        recipient = Recipient(email='not-an-email')
         self.assertRaises(ValidationError, recipient.save)
-
-        try:
-            recipient.save(validate=False)
-        except ValidationError:
-            self.fail()
+        recipient.save(validate=False)

     def test_save_to_a_value_that_equates_to_false(self):
         class Thing(EmbeddedDocument):
             count = IntField()
@@ -898,7 +868,6 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(user.thing.count, 0)

def test_save_max_recursion_not_hit(self):
    class Person(Document):
        name = StringField()
        parent = ReferenceField('self')

@@ -924,7 +893,6 @@ class InstanceTest(unittest.TestCase):
p0.save()

def test_save_max_recursion_not_hit_with_file_field(self):
    class Foo(Document):
        name = StringField()
        picture = FileField()

@@ -948,7 +916,6 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(b.picture, b.bar.picture, b.bar.bar.picture)

def test_save_cascades(self):
    class Person(Document):
        name = StringField()
        parent = ReferenceField('self')

@@ -971,7 +938,6 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(p1.name, p.parent.name)

def test_save_cascade_kwargs(self):
    class Person(Document):
        name = StringField()
        parent = ReferenceField('self')

@@ -992,7 +958,6 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(p1.name, p2.parent.name)

def test_save_cascade_meta_false(self):
    class Person(Document):
        name = StringField()
        parent = ReferenceField('self')

@@ -1021,7 +986,6 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(p1.name, p.parent.name)

def test_save_cascade_meta_true(self):
    class Person(Document):
        name = StringField()
        parent = ReferenceField('self')

@@ -1046,7 +1010,6 @@ class InstanceTest(unittest.TestCase):
self.assertNotEqual(p1.name, p.parent.name)

def test_save_cascades_generically(self):
    class Person(Document):
        name = StringField()
        parent = GenericReferenceField()

@@ -1072,7 +1035,6 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(p1.name, p.parent.name)

def test_save_atomicity_condition(self):
    class Widget(Document):
        toggle = BooleanField(default=False)
        count = IntField(default=0)
@@ -1150,7 +1112,8 @@ class InstanceTest(unittest.TestCase):
def test_update(self):
    """Ensure that an existing document is updated instead of being
    overwritten.
    """
    # Create person object and save it to the database
    person = self.Person(name='Test User', age=30)
    person.save()

@@ -1254,7 +1217,6 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(2, self.Person.objects.count())

def test_can_save_if_not_included(self):
    class EmbeddedDoc(EmbeddedDocument):
        pass

@@ -1341,10 +1303,7 @@ class InstanceTest(unittest.TestCase):
doc2.update(set__name=doc1.name)

def test_embedded_update(self):
    """Test update on `EmbeddedDocumentField` fields."""
    class Page(EmbeddedDocument):
        log_message = StringField(verbose_name="Log message",
                                  required=True)

@@ -1365,11 +1324,9 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(site.page.log_message, "Error: Dummy message")

def test_embedded_update_db_field(self):
    """Test update on `EmbeddedDocumentField` fields when db_field
    is other than default.
    """
    class Page(EmbeddedDocument):
        log_message = StringField(verbose_name="Log message",
                                  db_field="page_log_message",

@@ -1392,9 +1349,7 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(site.page.log_message, "Error: Dummy message")

def test_save_only_changed_fields(self):
    """Ensure save only sets / unsets changed fields."""
    class User(self.Person):
        active = BooleanField(default=True)

@@ -1514,8 +1469,8 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(q, 3)

def test_set_unset_one_operation(self):
    """Ensure that $set and $unset actions are performed in the
    same operation.
    """
    class FooBar(Document):
        foo = StringField(default=None)

@@ -1536,9 +1491,7 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(1, q)

def test_save_only_changed_fields_recursive(self):
    """Ensure save only sets / unsets changed fields."""
    class Comment(EmbeddedDocument):
        published = BooleanField(default=True)

@@ -1578,8 +1531,7 @@ class InstanceTest(unittest.TestCase):
self.assertFalse(person.comments_dict['first_post'].published)
def test_delete(self):
    """Ensure that document may be deleted using the delete method."""
    person = self.Person(name="Test User", age=30)
    person.save()
    self.assertEqual(self.Person.objects.count(), 1)

@@ -1587,33 +1539,34 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(self.Person.objects.count(), 0)

def test_save_custom_id(self):
    """Ensure that a document may be saved with a custom _id."""
    # Create person object and save it to the database
    person = self.Person(name='Test User', age=30,
                         id='497ce96f395f2f052a494fd4')
    person.save()

    # Ensure that the object is in the database with the correct _id
    collection = self.db[self.Person._get_collection_name()]
    person_obj = collection.find_one({'name': 'Test User'})
    self.assertEqual(str(person_obj['_id']), '497ce96f395f2f052a494fd4')

def test_save_custom_pk(self):
    """Ensure that a document may be saved with a custom _id using
    pk alias.
    """
    # Create person object and save it to the database
    person = self.Person(name='Test User', age=30,
                         pk='497ce96f395f2f052a494fd4')
    person.save()

    # Ensure that the object is in the database with the correct _id
    collection = self.db[self.Person._get_collection_name()]
    person_obj = collection.find_one({'name': 'Test User'})
    self.assertEqual(str(person_obj['_id']), '497ce96f395f2f052a494fd4')

def test_save_list(self):
    """Ensure that a list field may be properly saved."""
    class Comment(EmbeddedDocument):
        content = StringField()
@@ -1636,8 +1589,6 @@ class InstanceTest(unittest.TestCase):
for comment_obj, comment in zip(post_obj['comments'], comments):
    self.assertEqual(comment_obj['content'], comment['content'])

def test_list_search_by_embedded(self):
    class User(Document):
        username = StringField(required=True)

@@ -1697,8 +1648,8 @@ class InstanceTest(unittest.TestCase):
list(Page.objects.filter(comments__user=u3)))

def test_save_embedded_document(self):
    """Ensure that a document with an embedded document field may
    be saved in the database.
    """
    class EmployeeDetails(EmbeddedDocument):
        position = StringField()

@@ -1717,13 +1668,13 @@ class InstanceTest(unittest.TestCase):
employee_obj = collection.find_one({'name': 'Test Employee'})
self.assertEqual(employee_obj['name'], 'Test Employee')
self.assertEqual(employee_obj['age'], 50)

# Ensure that the 'details' embedded object saved correctly
self.assertEqual(employee_obj['details']['position'], 'Developer')

def test_embedded_update_after_save(self):
    """Test update of `EmbeddedDocumentField` attached to a newly
    saved document.
    """
    class Page(EmbeddedDocument):
        log_message = StringField(verbose_name="Log message",
@@ -1744,8 +1695,8 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(site.page.log_message, "Error: Dummy message")

def test_updating_an_embedded_document(self):
    """Ensure that a document with an embedded document field may
    be saved in the database.
    """
    class EmployeeDetails(EmbeddedDocument):
        position = StringField()

@@ -1780,7 +1731,6 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(promoted_employee.details, None)

def test_object_mixins(self):
    class NameMixin(object):
        name = StringField()

@@ -1819,9 +1769,9 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(t.count, 12)

def test_save_reference(self):
    """Ensure that a document reference field may be saved in the
    database.
    """
    class BlogPost(Document):
        meta = {'collection': 'blogpost_1'}
        content = StringField()

@@ -1852,8 +1802,6 @@ class InstanceTest(unittest.TestCase):
author = list(self.Person.objects(name='Test User'))[-1]
self.assertEqual(author.age, 25)

def test_duplicate_db_fields_raise_invalid_document_error(self):
    """Ensure an InvalidDocumentError is thrown if duplicate fields
    declare the same db_field.
@@ -1864,7 +1812,7 @@ class InstanceTest(unittest.TestCase):
name2 = StringField(db_field='name')

def test_invalid_son(self):
    """Raise an error if loading invalid data."""
    class Occurrence(EmbeddedDocument):
        number = IntField()

@@ -1887,9 +1835,9 @@ class InstanceTest(unittest.TestCase):
Word._from_son('this is not a valid SON dict')

def test_reverse_delete_rule_cascade_and_nullify(self):
    """Ensure that a referenced document is also deleted upon
    deletion.
    """
    class BlogPost(Document):
        content = StringField()
        author = ReferenceField(self.Person, reverse_delete_rule=CASCADE)

@@ -1944,7 +1892,8 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(Book.objects.count(), 0)

def test_reverse_delete_rule_with_shared_id_among_collections(self):
    """Ensure that cascade delete rule doesn't mix id among
    collections.
    """
    class User(Document):
        id = IntField(primary_key=True)

@@ -1975,10 +1924,9 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(Book.objects.get(), book_2)

def test_reverse_delete_rule_with_document_inheritance(self):
    """Ensure that a referenced document is also deleted upon
    deletion of a child document.
    """
    class Writer(self.Person):
        pass

@@ -2010,10 +1958,9 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(BlogPost.objects.count(), 0)

def test_reverse_delete_rule_cascade_and_nullify_complex_field(self):
    """Ensure that a referenced document is also deleted upon
    deletion for complex fields.
    """
    class BlogPost(Document):
        content = StringField()
        authors = ListField(ReferenceField(

@@ -2022,7 +1969,6 @@ class InstanceTest(unittest.TestCase):
self.Person, reverse_delete_rule=NULLIFY))

self.Person.drop_collection()
BlogPost.drop_collection()

author = self.Person(name='Test User')

@@ -2046,10 +1992,10 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(BlogPost.objects.count(), 0)

def test_reverse_delete_rule_cascade_triggers_pre_delete_signal(self):
    """Ensure the pre_delete signal is triggered upon a cascading
    deletion: set up a blog post with content, an author and editor;
    delete the author, which triggers deletion of the blog post via
    cascade; the blog post's pre_delete signal alters an editor
    attribute.
    """
    class Editor(self.Person):
        review_queue = IntField(default=0)
@@ -2077,6 +2023,7 @@ class InstanceTest(unittest.TestCase):
# delete the author, the post is also deleted due to the CASCADE rule
author.delete()

# the pre-delete signal should have decremented the editor's queue
editor = Editor.objects(name='Max P.').get()
self.assertEqual(editor.review_queue, 0)

@@ -2085,7 +2032,6 @@ class InstanceTest(unittest.TestCase):
    """Ensure that bi-directional relationships work with
    reverse_delete_rule.
    """
    class Bar(Document):
        content = StringField()
        foo = ReferenceField('Foo')

@@ -2131,8 +2077,8 @@ class InstanceTest(unittest.TestCase):
mother = ReferenceField('Person', reverse_delete_rule=DENY)

def test_reverse_delete_rule_cascade_recurs(self):
    """Ensure that a chain of documents is also deleted upon
    cascaded deletion.
    """
    class BlogPost(Document):
        content = StringField()

@@ -2162,15 +2108,10 @@ class InstanceTest(unittest.TestCase):
author.delete()
self.assertEqual(Comment.objects.count(), 0)

def test_reverse_delete_rule_deny(self):
    """Ensure that a document cannot be referenced if there are
    still documents referring to it.
    """
    class BlogPost(Document):
        content = StringField()
        author = ReferenceField(self.Person, reverse_delete_rule=DENY)
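The three delete rules exercised by these tests can be summarized with a small in-memory sketch. This is not MongoEngine's implementation (which issues queryset updates and deletes under the hood); `delete_author` and the dict-based posts are hypothetical, purely to illustrate the observable semantics of CASCADE, NULLIFY and DENY:

```python
# Hypothetical in-memory model of reverse_delete_rule semantics.
CASCADE, NULLIFY, DENY = range(3)

class OperationError(Exception):
    """Raised when DENY blocks a delete (name borrowed from mongoengine.errors)."""

def delete_author(author_id, posts, rule):
    """Delete an author; `posts` is a list of dicts referencing authors."""
    referrers = [p for p in posts if p.get('author') == author_id]
    if rule == DENY and referrers:
        # DENY: refuse while any document still references the target
        raise OperationError('still referenced by %d post(s)' % len(referrers))
    if rule == CASCADE:
        # CASCADE: referring documents are deleted along with the target
        return [p for p in posts if p.get('author') != author_id]
    if rule == NULLIFY:
        # NULLIFY: referring documents survive, but the reference is cleared
        return [dict(p, author=None) if p.get('author') == author_id else p
                for p in posts]
    return posts

posts = [{'title': 'a', 'author': 1}, {'title': 'b', 'author': 2}]
assert delete_author(1, posts, CASCADE) == [{'title': 'b', 'author': 2}]
assert delete_author(1, posts, NULLIFY)[0]['author'] is None
```

The same split shows up in the assertions above: CASCADE drops the blog post, NULLIFY keeps it with a cleared author, and DENY leaves the person count untouched.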
@@ -2198,11 +2139,7 @@ class InstanceTest(unittest.TestCase):
author.delete()
self.assertEqual(self.Person.objects.count(), 1)

def subclasses_and_unique_keys_works(self):
    class A(Document):
        pass
@@ -2218,12 +2155,9 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(A.objects.count(), 2)
self.assertEqual(B.objects.count(), 1)

def test_document_hash(self):
    """Test document in list, dict, set."""
    class User(Document):
        pass
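test_document_hash relies on documents hashing by primary key so they behave sanely in lists, dicts and sets. A minimal sketch of that identity contract — the `Doc` class is illustrative, not MongoEngine's actual `Document.__hash__`:

```python
class Doc:
    """Illustrative pk-based identity: equal pk => equal, same hash."""

    def __init__(self, pk):
        self.pk = pk

    def __eq__(self, other):
        return isinstance(other, Doc) and self.pk == other.pk

    def __hash__(self):
        # Hashing by pk lets two instances loaded separately from the DB
        # be treated as the same document in sets and as dict keys.
        return hash(self.pk)

u1, u1_again = Doc(1), Doc(1)
users = {u1, Doc(2)}
assert u1_again in users
assert len({u1, u1_again}) == 1
```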
@@ -2266,11 +2200,9 @@ class InstanceTest(unittest.TestCase):
# in Set
all_user_set = set(User.objects.all())
self.assertTrue(u1 in all_user_set)

def test_picklable(self):
    pickle_doc = PickleTest(number=1, string="One", lists=['1', '2'])
    pickle_doc.embedded = PickleEmbedded()
    pickled_doc = pickle.dumps(pickle_doc)  # make sure pickling works even before the doc is saved

@@ -2296,7 +2228,6 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(pickle_doc.lists, ["1", "2", "3"])

def test_regular_document_pickle(self):
    pickle_doc = PickleTest(number=1, string="One", lists=['1', '2'])
    pickled_doc = pickle.dumps(pickle_doc)  # make sure pickling works even before the doc is saved
    pickle_doc.save()

@@ -2319,7 +2250,6 @@ class InstanceTest(unittest.TestCase):
fixtures.PickleTest = PickleTest

def test_dynamic_document_pickle(self):
    pickle_doc = PickleDynamicTest(
        name="test", number=1, string="One", lists=['1', '2'])
    pickle_doc.embedded = PickleDynamicEmbedded(foo="Bar")

@@ -2358,7 +2288,6 @@ class InstanceTest(unittest.TestCase):
validate = DictField()

def test_mutating_documents(self):
    class B(EmbeddedDocument):
        field1 = StringField(default='field1')

@@ -2366,6 +2295,7 @@ class InstanceTest(unittest.TestCase):
b = EmbeddedDocumentField(B, default=lambda: B())

A.drop_collection()

a = A()
a.save()
a.reload()

@@ -2389,12 +2319,13 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(a.b.field2.c_field, 'new value')

def test_can_save_false_values(self):
    """Ensures you can save False values on save."""
    class Doc(Document):
        foo = StringField()
        archived = BooleanField(default=False, required=True)

    Doc.drop_collection()

    d = Doc()
    d.save()
    d.archived = False

@@ -2403,11 +2334,12 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(Doc.objects(archived=False).count(), 1)

def test_can_save_false_values_dynamic(self):
    """Ensures you can save False values on dynamic docs."""
    class Doc(DynamicDocument):
        foo = StringField()

    Doc.drop_collection()

    d = Doc()
    d.save()
    d.archived = False

@@ -2447,7 +2379,7 @@ class InstanceTest(unittest.TestCase):
Collection.update = orig_update

def test_db_alias_tests(self):
    """DB Alias tests."""
    # mongoenginetest - Is default connection alias from setUp()
    # Register Aliases
    register_connection('testdb-1', 'mongoenginetest2')
@@ -2509,8 +2441,7 @@ class InstanceTest(unittest.TestCase):
get_db("testdb-3")[AuthorBooks._get_collection_name()])

def test_db_alias_overrides(self):
    """Test db_alias can be overridden."""
    # Register a connection with db_alias testdb-2
    register_connection('testdb-2', 'mongoenginetest2')

@@ -2534,8 +2465,7 @@ class InstanceTest(unittest.TestCase):
B._get_collection().database.name)

def test_db_alias_propagates(self):
    """db_alias propagates?"""
    register_connection('testdb-1', 'mongoenginetest2')

    class A(Document):
@@ -2548,8 +2478,7 @@ class InstanceTest(unittest.TestCase):
self.assertEqual('testdb-1', B._meta.get('db_alias'))

def test_db_ref_usage(self):
    """DB Ref usage in dict_fields."""
    class User(Document):
        name = StringField()

@@ -2784,7 +2713,6 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(user.thing._data['data'], [1, 2, 3])

def test_spaces_in_keys(self):
    class Embedded(DynamicEmbeddedDocument):
        pass

@@ -2873,7 +2801,6 @@ class InstanceTest(unittest.TestCase):
log.machine = "127.0.0.1"

def test_kwargs_simple(self):
    class Embedded(EmbeddedDocument):
        name = StringField()

@@ -2893,7 +2820,6 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(classic_doc._data, dict_doc._data)

def test_kwargs_complex(self):
    class Embedded(EmbeddedDocument):
        name = StringField()

@@ -2916,36 +2842,35 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(classic_doc._data, dict_doc._data)

def test_positional_creation(self):
    """Ensure that document may be created using positional arguments."""
    person = self.Person("Test User", 42)
    self.assertEqual(person.name, "Test User")
    self.assertEqual(person.age, 42)

def test_mixed_creation(self):
    """Ensure that document may be created using mixed arguments."""
    person = self.Person("Test User", age=42)
    self.assertEqual(person.name, "Test User")
    self.assertEqual(person.age, 42)

def test_positional_creation_embedded(self):
    """Ensure that embedded document may be created using positional
    arguments.
    """
    job = self.Job("Test Job", 4)
    self.assertEqual(job.name, "Test Job")
    self.assertEqual(job.years, 4)

def test_mixed_creation_embedded(self):
    """Ensure that embedded document may be created using mixed
    arguments.
    """
    job = self.Job("Test Job", years=4)
    self.assertEqual(job.name, "Test Job")
    self.assertEqual(job.years, 4)

def test_mixed_creation_dynamic(self):
    """Ensure that document may be created using mixed arguments."""
    class Person(DynamicDocument):
        name = StringField()

@@ -2954,14 +2879,14 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(person.age, 42)

def test_bad_mixed_creation(self):
    """Ensure that document gives correct error when duplicating
    arguments.
    """
    with self.assertRaises(TypeError):
        return self.Person("Test User", 42, name="Bad User")

def test_data_contains_id_field(self):
    """Ensure that asking for _data returns 'id'."""
    class Person(Document):
        name = StringField()

@@ -2973,7 +2898,6 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(person._data.get('id'), person.id)

def test_complex_nesting_document_and_embedded_document(self):
    class Macro(EmbeddedDocument):
        value = DynamicField(default="UNDEFINED")

@@ -3016,7 +2940,6 @@ class InstanceTest(unittest.TestCase):
system.nodes["node"].parameters["param"].macros["test"].value)

def test_embedded_document_equality(self):
    class Test(Document):
        field = StringField(required=True)

@@ -3202,8 +3125,7 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(idx, 2)

def test_falsey_pk(self):
    """Ensure that we can create and update a document with Falsey PK."""
    class Person(Document):
        age = IntField(primary_key=True)
        height = FloatField()


@@ -18,15 +18,13 @@ try:
except ImportError:
    HAS_PIL = False

from tests.utils import MongoDBTestCase

TEST_IMAGE_PATH = os.path.join(os.path.dirname(__file__), 'mongoengine.png')
TEST_IMAGE2_PATH = os.path.join(os.path.dirname(__file__), 'mongodb_leaf.png')


class FileTest(MongoDBTestCase):

    def tearDown(self):
        self.db.drop_collection('fs.files')


@@ -1,105 +1,139 @@
-from datetime import datetime, timedelta
+import datetime
 import unittest
+from pymongo.errors import OperationFailure
 from mongoengine import *
-from mongoengine.connection import get_connection
-from nose.plugins.skip import SkipTest
+from tests.utils import MongoDBTestCase, needs_mongodb_v3
 __all__ = ("GeoQueriesTest",)
-class GeoQueriesTest(unittest.TestCase):
-    def setUp(self):
-        connect(db='mongoenginetest')
-    def test_geospatial_operators(self):
-        """Ensure that geospatial queries are working.
-        """
+class GeoQueriesTest(MongoDBTestCase):
+    def _create_event_data(self, point_field_class=GeoPointField):
+        """Create some sample data re-used in many of the tests below."""
         class Event(Document):
             title = StringField()
             date = DateTimeField()
-            location = GeoPointField()
+            location = point_field_class()
             def __unicode__(self):
                 return self.title
+        self.Event = Event
         Event.drop_collection()
-        event1 = Event(title="Coltrane Motion @ Double Door",
-                       date=datetime.now() - timedelta(days=1),
-                       location=[-87.677137, 41.909889]).save()
-        event2 = Event(title="Coltrane Motion @ Bottom of the Hill",
-                       date=datetime.now() - timedelta(days=10),
-                       location=[-122.4194155, 37.7749295]).save()
-        event3 = Event(title="Coltrane Motion @ Empty Bottle",
-                       date=datetime.now(),
-                       location=[-87.686638, 41.900474]).save()
+        event1 = Event.objects.create(
+            title="Coltrane Motion @ Double Door",
+            date=datetime.datetime.now() - datetime.timedelta(days=1),
+            location=[-87.677137, 41.909889])
+        event2 = Event.objects.create(
+            title="Coltrane Motion @ Bottom of the Hill",
+            date=datetime.datetime.now() - datetime.timedelta(days=10),
+            location=[-122.4194155, 37.7749295])
+        event3 = Event.objects.create(
+            title="Coltrane Motion @ Empty Bottle",
+            date=datetime.datetime.now(),
+            location=[-87.686638, 41.900474])
+        return event1, event2, event3
+    def test_near(self):
+        """Make sure the "near" operator works."""
+        event1, event2, event3 = self._create_event_data()
         # find all events "near" pitchfork office, chicago.
         # note that "near" will show the san francisco event, too,
         # although it sorts to last.
-        events = Event.objects(location__near=[-87.67892, 41.9120459])
+        events = self.Event.objects(location__near=[-87.67892, 41.9120459])
         self.assertEqual(events.count(), 3)
         self.assertEqual(list(events), [event1, event3, event2])
+        # ensure ordering is respected by "near"
+        events = self.Event.objects(location__near=[-87.67892, 41.9120459])
+        events = events.order_by("-date")
+        self.assertEqual(events.count(), 3)
+        self.assertEqual(list(events), [event3, event1, event2])
+    def test_near_and_max_distance(self):
+        """Ensure the "max_distance" operator works alongside the "near"
+        operator.
+        """
+        event1, event2, event3 = self._create_event_data()
+        # find events within 10 degrees of san francisco
+        point = [-122.415579, 37.7566023]
+        events = self.Event.objects(location__near=point,
+                                    location__max_distance=10)
+        self.assertEqual(events.count(), 1)
+        self.assertEqual(events[0], event2)
+    # $minDistance was added in MongoDB v2.6, but continued being buggy
+    # until v3.0; skip for older versions
+    @needs_mongodb_v3
+    def test_near_and_min_distance(self):
+        """Ensure the "min_distance" operator works alongside the "near"
+        operator.
+        """
+        event1, event2, event3 = self._create_event_data()
+        # find events at least 10 degrees away of san francisco
+        point = [-122.415579, 37.7566023]
+        events = self.Event.objects(location__near=point,
+                                    location__min_distance=10)
+        self.assertEqual(events.count(), 2)
+    def test_within_distance(self):
+        """Make sure the "within_distance" operator works."""
+        event1, event2, event3 = self._create_event_data()
         # find events within 5 degrees of pitchfork office, chicago
         point_and_distance = [[-87.67892, 41.9120459], 5]
-        events = Event.objects(location__within_distance=point_and_distance)
+        events = self.Event.objects(
+            location__within_distance=point_and_distance)
         self.assertEqual(events.count(), 2)
         events = list(events)
         self.assertTrue(event2 not in events)
         self.assertTrue(event1 in events)
         self.assertTrue(event3 in events)
-        # ensure ordering is respected by "near"
-        events = Event.objects(location__near=[-87.67892, 41.9120459])
-        events = events.order_by("-date")
-        self.assertEqual(events.count(), 3)
-        self.assertEqual(list(events), [event3, event1, event2])
-        # find events within 10 degrees of san francisco
-        point = [-122.415579, 37.7566023]
-        events = Event.objects(location__near=point, location__max_distance=10)
-        self.assertEqual(events.count(), 1)
-        self.assertEqual(events[0], event2)
-        # find events at least 10 degrees away of san francisco
-        point = [-122.415579, 37.7566023]
-        events = Event.objects(location__near=point, location__min_distance=10)
-        # The following real test passes on MongoDB 3 but minDistance seems
-        # buggy on older MongoDB versions
-        if get_connection().server_info()['versionArray'][0] > 2:
-            self.assertEqual(events.count(), 2)
-        else:
-            self.assertTrue(events.count() >= 2)
         # find events within 10 degrees of san francisco
         point_and_distance = [[-122.415579, 37.7566023], 10]
-        events = Event.objects(location__within_distance=point_and_distance)
+        events = self.Event.objects(
+            location__within_distance=point_and_distance)
         self.assertEqual(events.count(), 1)
         self.assertEqual(events[0], event2)
         # find events within 1 degree of greenpoint, broolyn, nyc, ny
         point_and_distance = [[-73.9509714, 40.7237134], 1]
-        events = Event.objects(location__within_distance=point_and_distance)
+        events = self.Event.objects(
+            location__within_distance=point_and_distance)
         self.assertEqual(events.count(), 0)
         # ensure ordering is respected by "within_distance"
         point_and_distance = [[-87.67892, 41.9120459], 10]
-        events = Event.objects(location__within_distance=point_and_distance)
+        events = self.Event.objects(
+            location__within_distance=point_and_distance)
         events = events.order_by("-date")
         self.assertEqual(events.count(), 2)
         self.assertEqual(events[0], event3)
+    def test_within_box(self):
+        """Ensure the "within_box" operator works."""
+        event1, event2, event3 = self._create_event_data()
         # check that within_box works
         box = [(-125.0, 35.0), (-100.0, 40.0)]
-        events = Event.objects(location__within_box=box)
+        events = self.Event.objects(location__within_box=box)
         self.assertEqual(events.count(), 1)
         self.assertEqual(events[0].id, event2.id)
+    def test_within_polygon(self):
+        """Ensure the "within_polygon" operator works."""
+        event1, event2, event3 = self._create_event_data()
         polygon = [
             (-87.694445, 41.912114),
             (-87.69084, 41.919395),
@@ -107,7 +141,7 @@ class GeoQueriesTest(unittest.TestCase):
             (-87.654276, 41.911731),
             (-87.656164, 41.898061),
         ]
-        events = Event.objects(location__within_polygon=polygon)
+        events = self.Event.objects(location__within_polygon=polygon)
         self.assertEqual(events.count(), 1)
         self.assertEqual(events[0].id, event1.id)
@@ -116,13 +150,151 @@ class GeoQueriesTest(unittest.TestCase):
             (-1.225891, 52.792797),
             (-4.40094, 53.389881)
         ]
-        events = Event.objects(location__within_polygon=polygon2)
+        events = self.Event.objects(location__within_polygon=polygon2)
         self.assertEqual(events.count(), 0)
-    def test_geo_spatial_embedded(self):
+    def test_2dsphere_near(self):
+        """Make sure the "near" operator works with a PointField, which
+        corresponds to a 2dsphere index.
+        """
+        event1, event2, event3 = self._create_event_data(
+            point_field_class=PointField
+        )
+        # find all events "near" pitchfork office, chicago.
+        # note that "near" will show the san francisco event, too,
+        # although it sorts to last.
+        events = self.Event.objects(location__near=[-87.67892, 41.9120459])
+        self.assertEqual(events.count(), 3)
+        self.assertEqual(list(events), [event1, event3, event2])
+        # ensure ordering is respected by "near"
+        events = self.Event.objects(location__near=[-87.67892, 41.9120459])
+        events = events.order_by("-date")
+        self.assertEqual(events.count(), 3)
+        self.assertEqual(list(events), [event3, event1, event2])
+    def test_2dsphere_near_and_max_distance(self):
+        """Ensure the "max_distance" operator works alongside the "near"
+        operator with a 2dsphere index.
+        """
+        event1, event2, event3 = self._create_event_data(
+            point_field_class=PointField
+        )
+        # find events within 10km of san francisco
+        point = [-122.415579, 37.7566023]
+        events = self.Event.objects(location__near=point,
+                                    location__max_distance=10000)
+        self.assertEqual(events.count(), 1)
+        self.assertEqual(events[0], event2)
+        # find events within 1km of greenpoint, broolyn, nyc, ny
+        events = self.Event.objects(location__near=[-73.9509714, 40.7237134],
+                                    location__max_distance=1000)
+        self.assertEqual(events.count(), 0)
+        # ensure ordering is respected by "near"
+        events = self.Event.objects(
+            location__near=[-87.67892, 41.9120459],
+            location__max_distance=10000
+        ).order_by("-date")
+        self.assertEqual(events.count(), 2)
+        self.assertEqual(events[0], event3)
+    def test_2dsphere_geo_within_box(self):
+        """Ensure the "geo_within_box" operator works with a 2dsphere
+        index.
+        """
+        event1, event2, event3 = self._create_event_data(
+            point_field_class=PointField
+        )
+        # check that within_box works
+        box = [(-125.0, 35.0), (-100.0, 40.0)]
+        events = self.Event.objects(location__geo_within_box=box)
+        self.assertEqual(events.count(), 1)
+        self.assertEqual(events[0].id, event2.id)
+    def test_2dsphere_geo_within_polygon(self):
+        """Ensure the "geo_within_polygon" operator works with a
+        2dsphere index.
+        """
+        event1, event2, event3 = self._create_event_data(
+            point_field_class=PointField
+        )
+        polygon = [
+            (-87.694445, 41.912114),
+            (-87.69084, 41.919395),
+            (-87.681742, 41.927186),
+            (-87.654276, 41.911731),
+            (-87.656164, 41.898061),
+        ]
+        events = self.Event.objects(location__geo_within_polygon=polygon)
+        self.assertEqual(events.count(), 1)
+        self.assertEqual(events[0].id, event1.id)
+        polygon2 = [
+            (-1.742249, 54.033586),
+            (-1.225891, 52.792797),
+            (-4.40094, 53.389881)
+        ]
+        events = self.Event.objects(location__geo_within_polygon=polygon2)
+        self.assertEqual(events.count(), 0)
+    # $minDistance was added in MongoDB v2.6, but continued being buggy
+    # until v3.0; skip for older versions
+    @needs_mongodb_v3
+    def test_2dsphere_near_and_min_max_distance(self):
+        """Ensure "min_distace" and "max_distance" operators work well
+        together with the "near" operator in a 2dsphere index.
+        """
+        event1, event2, event3 = self._create_event_data(
+            point_field_class=PointField
+        )
+        # ensure min_distance and max_distance combine well
+        events = self.Event.objects(
+            location__near=[-87.67892, 41.9120459],
+            location__min_distance=1000,
+            location__max_distance=10000
+        ).order_by("-date")
+        self.assertEqual(events.count(), 1)
+        self.assertEqual(events[0], event3)
+        # ensure ordering is respected by "near" with "min_distance"
+        events = self.Event.objects(
+            location__near=[-87.67892, 41.9120459],
+            location__min_distance=10000
+        ).order_by("-date")
+        self.assertEqual(events.count(), 1)
+        self.assertEqual(events[0], event2)
+    def test_2dsphere_geo_within_center(self):
+        """Make sure the "geo_within_center" operator works with a
+        2dsphere index.
+        """
+        event1, event2, event3 = self._create_event_data(
+            point_field_class=PointField
+        )
+        # find events within 5 degrees of pitchfork office, chicago
+        point_and_distance = [[-87.67892, 41.9120459], 2]
+        events = self.Event.objects(
+            location__geo_within_center=point_and_distance)
+        self.assertEqual(events.count(), 2)
+        events = list(events)
+        self.assertTrue(event2 not in events)
+        self.assertTrue(event1 in events)
+        self.assertTrue(event3 in events)
+    def _test_embedded(self, point_field_class):
+        """Helper test method ensuring given point field class works
+        well in an embedded document.
+        """
         class Venue(EmbeddedDocument):
-            location = GeoPointField()
+            location = point_field_class()
             name = StringField()
         class Event(Document):
@@ -148,16 +320,18 @@ class GeoQueriesTest(unittest.TestCase):
         self.assertEqual(events.count(), 3)
         self.assertEqual(list(events), [event1, event3, event2])
-    def test_spherical_geospatial_operators(self):
-        """Ensure that spherical geospatial queries are working
-        """
-        # Needs MongoDB > 2.6.4 https://jira.mongodb.org/browse/SERVER-14039
-        connection = get_connection()
-        info = connection.test.command('buildInfo')
-        mongodb_version = tuple([int(i) for i in info['version'].split('.')])
-        if mongodb_version < (2, 6, 4):
-            raise SkipTest("Need MongoDB version 2.6.4+")
+    def test_geo_spatial_embedded(self):
+        """Make sure GeoPointField works properly in an embedded document."""
+        self._test_embedded(point_field_class=GeoPointField)
+    def test_2dsphere_point_embedded(self):
+        """Make sure PointField works properly in an embedded document."""
+        self._test_embedded(point_field_class=PointField)
+    # Needs MongoDB > 2.6.4 https://jira.mongodb.org/browse/SERVER-14039
+    @needs_mongodb_v3
+    def test_spherical_geospatial_operators(self):
+        """Ensure that spherical geospatial queries are working."""
         class Point(Document):
             location = GeoPointField()
@@ -177,7 +351,10 @@ class GeoQueriesTest(unittest.TestCase):
         # Same behavior for _within_spherical_distance
         points = Point.objects(
-            location__within_spherical_distance=[[-122, 37.5], 60 / earth_radius]
+            location__within_spherical_distance=[
+                [-122, 37.5],
+                60 / earth_radius
+            ]
         )
         self.assertEqual(points.count(), 2)
@@ -194,14 +371,9 @@ class GeoQueriesTest(unittest.TestCase):
         # Test query works with min_distance, being farer from one point
         points = Point.objects(location__near_sphere=[-122, 37.8],
                                location__min_distance=60 / earth_radius)
-        # The following real test passes on MongoDB 3 but minDistance seems
-        # buggy on older MongoDB versions
-        if get_connection().server_info()['versionArray'][0] > 2:
-            self.assertEqual(points.count(), 1)
-            far_point = points.first()
-            self.assertNotEqual(close_point, far_point)
-        else:
-            self.assertTrue(points.count() >= 1)
+        self.assertEqual(points.count(), 1)
+        far_point = points.first()
+        self.assertNotEqual(close_point, far_point)
         # Finds both points, but orders the north point first because it's
         # closer to the reference point to the north.
@@ -220,141 +392,15 @@ class GeoQueriesTest(unittest.TestCase):
         # Finds only one point because only the first point is within 60km of
         # the reference point to the south.
         points = Point.objects(
-            location__within_spherical_distance=[[-122, 36.5], 60/earth_radius])
+            location__within_spherical_distance=[
+                [-122, 36.5],
+                60 / earth_radius
+            ]
+        )
         self.assertEqual(points.count(), 1)
         self.assertEqual(points[0].id, south_point.id)
-    def test_2dsphere_point(self):
-        class Event(Document):
-            title = StringField()
-            date = DateTimeField()
-            location = PointField()
-            def __unicode__(self):
-                return self.title
-        Event.drop_collection()
-        event1 = Event(title="Coltrane Motion @ Double Door",
-                       date=datetime.now() - timedelta(days=1),
-                       location=[-87.677137, 41.909889])
-        event1.save()
-        event2 = Event(title="Coltrane Motion @ Bottom of the Hill",
-                       date=datetime.now() - timedelta(days=10),
-                       location=[-122.4194155, 37.7749295]).save()
-        event3 = Event(title="Coltrane Motion @ Empty Bottle",
-                       date=datetime.now(),
-                       location=[-87.686638, 41.900474]).save()
-        # find all events "near" pitchfork office, chicago.
-        # note that "near" will show the san francisco event, too,
-        # although it sorts to last.
-        events = Event.objects(location__near=[-87.67892, 41.9120459])
-        self.assertEqual(events.count(), 3)
-        self.assertEqual(list(events), [event1, event3, event2])
-        # find events within 5 degrees of pitchfork office, chicago
-        point_and_distance = [[-87.67892, 41.9120459], 2]
-        events = Event.objects(location__geo_within_center=point_and_distance)
-        self.assertEqual(events.count(), 2)
-        events = list(events)
-        self.assertTrue(event2 not in events)
-        self.assertTrue(event1 in events)
-        self.assertTrue(event3 in events)
-        # ensure ordering is respected by "near"
-        events = Event.objects(location__near=[-87.67892, 41.9120459])
-        events = events.order_by("-date")
-        self.assertEqual(events.count(), 3)
-        self.assertEqual(list(events), [event3, event1, event2])
-        # find events within 10km of san francisco
-        point = [-122.415579, 37.7566023]
-        events = Event.objects(location__near=point, location__max_distance=10000)
-        self.assertEqual(events.count(), 1)
-        self.assertEqual(events[0], event2)
-        # find events within 1km of greenpoint, broolyn, nyc, ny
-        events = Event.objects(location__near=[-73.9509714, 40.7237134], location__max_distance=1000)
-        self.assertEqual(events.count(), 0)
-        # ensure ordering is respected by "near"
-        events = Event.objects(location__near=[-87.67892, 41.9120459],
-                               location__max_distance=10000).order_by("-date")
-        self.assertEqual(events.count(), 2)
-        self.assertEqual(events[0], event3)
-        # ensure min_distance and max_distance combine well
-        events = Event.objects(location__near=[-87.67892, 41.9120459],
-                               location__min_distance=1000,
-                               location__max_distance=10000).order_by("-date")
-        self.assertEqual(events.count(), 1)
-        self.assertEqual(events[0], event3)
-        # ensure ordering is respected by "near"
-        events = Event.objects(location__near=[-87.67892, 41.9120459],
-                               # location__min_distance=10000
-                               location__min_distance=10000).order_by("-date")
-        self.assertEqual(events.count(), 1)
-        self.assertEqual(events[0], event2)
-        # check that within_box works
-        box = [(-125.0, 35.0), (-100.0, 40.0)]
-        events = Event.objects(location__geo_within_box=box)
-        self.assertEqual(events.count(), 1)
-        self.assertEqual(events[0].id, event2.id)
-        polygon = [
-            (-87.694445, 41.912114),
-            (-87.69084, 41.919395),
-            (-87.681742, 41.927186),
-            (-87.654276, 41.911731),
-            (-87.656164, 41.898061),
-        ]
-        events = Event.objects(location__geo_within_polygon=polygon)
-        self.assertEqual(events.count(), 1)
-        self.assertEqual(events[0].id, event1.id)
-        polygon2 = [
-            (-1.742249, 54.033586),
-            (-1.225891, 52.792797),
-            (-4.40094, 53.389881)
-        ]
-        events = Event.objects(location__geo_within_polygon=polygon2)
-        self.assertEqual(events.count(), 0)
-    def test_2dsphere_point_embedded(self):
-        class Venue(EmbeddedDocument):
-            location = GeoPointField()
-            name = StringField()
-        class Event(Document):
-            title = StringField()
-            venue = EmbeddedDocumentField(Venue)
-        Event.drop_collection()
-        venue1 = Venue(name="The Rock", location=[-87.677137, 41.909889])
-        venue2 = Venue(name="The Bridge", location=[-122.4194155, 37.7749295])
-        event1 = Event(title="Coltrane Motion @ Double Door",
-                       venue=venue1).save()
-        event2 = Event(title="Coltrane Motion @ Bottom of the Hill",
-                       venue=venue2).save()
-        event3 = Event(title="Coltrane Motion @ Empty Bottle",
-                       venue=venue1).save()
-        # find all events "near" pitchfork office, chicago.
-        # note that "near" will show the san francisco event, too,
-        # although it sorts to last.
-        events = Event.objects(venue__location__near=[-87.67892, 41.9120459])
-        self.assertEqual(events.count(), 3)
-        self.assertEqual(list(events), [event1, event3, event2])
     def test_linestring(self):
         class Road(Document):
             name = StringField()
             line = LineStringField()
@@ -410,7 +456,6 @@ class GeoQueriesTest(unittest.TestCase):
         self.assertEqual(1, roads)
     def test_polygon(self):
         class Road(Document):
             name = StringField()
             poly = PolygonField()
@@ -507,5 +552,6 @@ class GeoQueriesTest(unittest.TestCase):
         loc = Location.objects.as_pymongo()[0]
         self.assertEqual(loc["poly"], {"type": "Polygon", "coordinates": [[[40, 4], [40, 6], [41, 6], [40, 4]]]})
 if __name__ == '__main__':
     unittest.main()
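The spherical tests in this file pass distances as `60 / earth_radius` because MongoDB's spherical operators ($nearSphere, $centerSphere) measure distance in radians, i.e. as a fraction of the Earth's radius. A small sketch of the conversion, assuming the commonly quoted 6371 km mean radius (the constant is ours; the diff itself only uses the ratio):

```python
EARTH_RADIUS_KM = 6371.0  # assumed mean Earth radius; not defined in the diff

def km_to_radians(km):
    """Convert a surface distance in km into the radians that
    MongoDB's spherical operators expect."""
    return km / EARTH_RADIUS_KM

def radians_to_km(rad):
    """Inverse conversion, handy when reading query parameters back."""
    return rad * EARTH_RADIUS_KM
```

The 60 km radius used throughout the spherical tests therefore corresponds to a very small angle, just under 0.01 radians.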


@@ -19,6 +19,9 @@ from mongoengine.python_support import IS_PYMONGO_3
 from mongoengine.queryset import (DoesNotExist, MultipleObjectsReturned,
                                   QuerySet, QuerySetManager, queryset_manager)
+from tests.utils import needs_mongodb_v26, skip_pymongo3
 __all__ = ("QuerySetTest",)
@@ -32,37 +35,6 @@ class db_ops_tracker(query_counter):
         return list(self.db.system.profile.find(ignore_query))
-def skip_older_mongodb(f):
-    def _inner(*args, **kwargs):
-        connection = get_connection()
-        info = connection.test.command('buildInfo')
-        mongodb_version = tuple([int(i) for i in info['version'].split('.')])
-        if mongodb_version < (2, 6):
-            raise SkipTest("Need MongoDB version 2.6+")
-        return f(*args, **kwargs)
-    _inner.__name__ = f.__name__
-    _inner.__doc__ = f.__doc__
-    return _inner
-def skip_pymongo3(f):
-    def _inner(*args, **kwargs):
-        if IS_PYMONGO_3:
-            raise SkipTest("Useless with PyMongo 3+")
-        return f(*args, **kwargs)
-    _inner.__name__ = f.__name__
-    _inner.__doc__ = f.__doc__
-    return _inner
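The removed `skip_older_mongodb` helper above illustrates the pattern this diff consolidates into `tests.utils` as `needs_mongodb_v26`/`needs_mongodb_v3`. A hypothetical sketch of such a version gate (the names and signature are ours, not the actual `tests.utils` implementation; `functools.wraps` replaces the manual `__name__`/`__doc__` copying):

```python
import functools
import unittest

def needs_version(min_version, get_version):
    """Skip the decorated test when get_version() reports a server
    older than min_version (a tuple such as (2, 6))."""
    def decorator(f):
        @functools.wraps(f)  # keeps the test's name and docstring
        def wrapper(*args, **kwargs):
            if get_version() < min_version:
                raise unittest.SkipTest(
                    'Needs MongoDB %s+' % '.'.join(map(str, min_version)))
            return f(*args, **kwargs)
        return wrapper
    return decorator

def parse_version(version_string):
    """Parse buildInfo's version string into a comparable tuple,
    the same way the removed decorator did."""
    return tuple(int(part) for part in version_string.split('.'))
```

Tuple comparison makes the gate correct across multi-digit components, e.g. `(2, 6, 10) > (2, 6, 4)`, which naive string comparison would get wrong.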
 class QuerySetTest(unittest.TestCase):
     def setUp(self):
@@ -106,58 +78,111 @@ class QuerySetTest(unittest.TestCase):
         list(BlogPost.objects(author2__name="test"))
     def test_find(self):
-        """Ensure that a query returns a valid set of results.
-        """
-        self.Person(name="User A", age=20).save()
-        self.Person(name="User B", age=30).save()
+        """Ensure that a query returns a valid set of results."""
+        user_a = self.Person.objects.create(name='User A', age=20)
+        user_b = self.Person.objects.create(name='User B', age=30)
         # Find all people in the collection
         people = self.Person.objects
         self.assertEqual(people.count(), 2)
         results = list(people)
         self.assertTrue(isinstance(results[0], self.Person))
         self.assertTrue(isinstance(results[0].id, (ObjectId, str, unicode)))
-        self.assertEqual(results[0].name, "User A")
+        self.assertEqual(results[0], user_a)
+        self.assertEqual(results[0].name, 'User A')
         self.assertEqual(results[0].age, 20)
-        self.assertEqual(results[1].name, "User B")
+        self.assertEqual(results[1], user_b)
+        self.assertEqual(results[1].name, 'User B')
         self.assertEqual(results[1].age, 30)
-        # Use a query to filter the people found to just person1
+        # Filter people by age
         people = self.Person.objects(age=20)
         self.assertEqual(people.count(), 1)
         person = people.next()
+        self.assertEqual(person, user_a)
         self.assertEqual(person.name, "User A")
         self.assertEqual(person.age, 20)
-        # Test limit
+    def test_limit(self):
+        """Ensure that QuerySet.limit works as expected."""
+        user_a = self.Person.objects.create(name='User A', age=20)
+        user_b = self.Person.objects.create(name='User B', age=30)
+        # Test limit on a new queryset
         people = list(self.Person.objects.limit(1))
         self.assertEqual(len(people), 1)
-        self.assertEqual(people[0].name, 'User A')
+        self.assertEqual(people[0], user_a)
-        # Test skip
+        # Test limit on an existing queryset
+        people = self.Person.objects
+        self.assertEqual(len(people), 2)
+        people2 = people.limit(1)
+        self.assertEqual(len(people), 2)
+        self.assertEqual(len(people2), 1)
+        self.assertEqual(people2[0], user_a)
+        # Test chaining of only after limit
+        person = self.Person.objects().limit(1).only('name').first()
+        self.assertEqual(person, user_a)
+        self.assertEqual(person.name, 'User A')
+        self.assertEqual(person.age, None)
+    def test_skip(self):
+        """Ensure that QuerySet.skip works as expected."""
+        user_a = self.Person.objects.create(name='User A', age=20)
+        user_b = self.Person.objects.create(name='User B', age=30)
+        # Test skip on a new queryset
         people = list(self.Person.objects.skip(1))
         self.assertEqual(len(people), 1)
-        self.assertEqual(people[0].name, 'User B')
+        self.assertEqual(people[0], user_b)
-        person3 = self.Person(name="User C", age=40)
-        person3.save()
+        # Test skip on an existing queryset
+        people = self.Person.objects
+        self.assertEqual(len(people), 2)
+        people2 = people.skip(1)
+        self.assertEqual(len(people), 2)
+        self.assertEqual(len(people2), 1)
+        self.assertEqual(people2[0], user_b)
+        # Test chaining of only after skip
+        person = self.Person.objects().skip(1).only('name').first()
+        self.assertEqual(person, user_b)
+        self.assertEqual(person.name, 'User B')
+        self.assertEqual(person.age, None)
+    def test_slice(self):
+        """Ensure slicing a queryset works as expected."""
+        user_a = self.Person.objects.create(name='User A', age=20)
+        user_b = self.Person.objects.create(name='User B', age=30)
+        user_c = self.Person.objects.create(name="User C", age=40)
         # Test slice limit
         people = list(self.Person.objects[:2])
         self.assertEqual(len(people), 2)
-        self.assertEqual(people[0].name, 'User A')
-        self.assertEqual(people[1].name, 'User B')
+        self.assertEqual(people[0], user_a)
+        self.assertEqual(people[1], user_b)
         # Test slice skip
         people = list(self.Person.objects[1:])
         self.assertEqual(len(people), 2)
-        self.assertEqual(people[0].name, 'User B')
-        self.assertEqual(people[1].name, 'User C')
+        self.assertEqual(people[0], user_b)
+        self.assertEqual(people[1], user_c)
         # Test slice limit and skip
         people = list(self.Person.objects[1:2])
         self.assertEqual(len(people), 1)
-        self.assertEqual(people[0].name, 'User B')
+        self.assertEqual(people[0], user_b)
+        # Test slice limit and skip on an existing queryset
+        people = self.Person.objects
+        self.assertEqual(len(people), 3)
+        people2 = people[1:2]
+        self.assertEqual(len(people2), 1)
+        self.assertEqual(people2[0], user_b)
         # Test slice limit and skip cursor reset
         qs = self.Person.objects[1:2]
@@ -168,6 +193,7 @@ class QuerySetTest(unittest.TestCase):
         self.assertEqual(len(people), 1)
         self.assertEqual(people[0].name, 'User B')
+        # Test empty slice
         people = list(self.Person.objects[1:1])
         self.assertEqual(len(people), 0)
@@ -187,12 +213,6 @@ class QuerySetTest(unittest.TestCase):
         self.assertEqual("[<Person: Person object>, <Person: Person object>]",
                          "%s" % self.Person.objects[51:53])
-        # Test only after limit
-        self.assertEqual(self.Person.objects().limit(2).only('name')[0].age, None)
-        # Test only after skip
-        self.assertEqual(self.Person.objects().skip(2).only('name')[0].age, None)
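The slice tests above exercise the usual translation from Python slicing onto MongoDB's skip/limit pair. A plain-Python sketch of that mapping (illustrative only, not MongoEngine's actual `__getitem__` implementation):

```python
def slice_to_skip_limit(start, stop):
    """Map qs[start:stop] onto a (skip, limit) pair; limit is None
    for an open-ended slice and 0 for an empty one."""
    skip = start or 0  # an omitted start means no skip
    limit = None if stop is None else max(stop - skip, 0)
    return skip, limit
```

Under this mapping, `self.Person.objects[1:2]` corresponds to skip 1 / limit 1, and the empty slice `[1:1]` to skip 1 / limit 0, which is why it returns no documents.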
     def test_find_one(self):
         """Ensure that a query using find_one returns a valid result.
         """
@@ -551,16 +571,23 @@ class QuerySetTest(unittest.TestCase):
         self.assertEqual(post.comments[0].by, 'joe')
         self.assertEqual(post.comments[0].votes.score, 4)
+    @needs_mongodb_v26
     def test_update_min_max(self):
         class Scores(Document):
             high_score = IntField()
             low_score = IntField()
-        scores = Scores(high_score=800, low_score=200)
-        scores.save()
+        scores = Scores.objects.create(high_score=800, low_score=200)
         Scores.objects(id=scores.id).update(min__low_score=150)
-        self.assertEqual(Scores.objects(id=scores.id).get().low_score, 150)
+        self.assertEqual(Scores.objects.get(id=scores.id).low_score, 150)
         Scores.objects(id=scores.id).update(min__low_score=250)
-        self.assertEqual(Scores.objects(id=scores.id).get().low_score, 150)
+        self.assertEqual(Scores.objects.get(id=scores.id).low_score, 150)
+        Scores.objects(id=scores.id).update(max__high_score=1000)
+        self.assertEqual(Scores.objects.get(id=scores.id).high_score, 1000)
+        Scores.objects(id=scores.id).update(max__high_score=500)
+        self.assertEqual(Scores.objects.get(id=scores.id).high_score, 1000)
     def test_updates_can_have_match_operators(self):
@@ -964,7 +991,7 @@ class QuerySetTest(unittest.TestCase):
         self.assertEqual(person.name, "User A")
         self.assertEqual(person.age, 20)
-    @skip_older_mongodb
+    @needs_mongodb_v26
     @skip_pymongo3
     def test_cursor_args(self):
         """Ensures the cursor args can be set as expected
@@ -1226,6 +1253,7 @@ class QuerySetTest(unittest.TestCase):
         BlogPost.drop_collection()
+        # default ordering should be used by default
         with db_ops_tracker() as q:
             BlogPost.objects.filter(title='whatever').first()
             self.assertEqual(len(q.get_ops()), 1)
@@ -1234,11 +1262,28 @@ class QuerySetTest(unittest.TestCase):
                 {'published_date': -1}
             )
+        # calling order_by() should clear the default ordering
         with db_ops_tracker() as q:
             BlogPost.objects.filter(title='whatever').order_by().first()
             self.assertEqual(len(q.get_ops()), 1)
             self.assertFalse('$orderby' in q.get_ops()[0]['query'])
+        # calling an explicit order_by should use a specified sort
+        with db_ops_tracker() as q:
+            BlogPost.objects.filter(title='whatever').order_by('published_date').first()
+            self.assertEqual(len(q.get_ops()), 1)
+            self.assertEqual(
+                q.get_ops()[0]['query']['$orderby'],
+                {'published_date': 1}
+            )
+        # calling order_by() after an explicit sort should clear it
+        with db_ops_tracker() as q:
+            qs = BlogPost.objects.filter(title='whatever').order_by('published_date')
+            qs.order_by().first()
+            self.assertEqual(len(q.get_ops()), 1)
+            self.assertFalse('$orderby' in q.get_ops()[0]['query'])
def test_no_ordering_for_get(self):
""" Ensure that Doc.objects.get doesn't use any ordering.
"""
@@ -3063,7 +3108,7 @@ class QuerySetTest(unittest.TestCase):
self.assertEqual(Foo.objects.distinct("bar"), [bar])
-@skip_older_mongodb
+@needs_mongodb_v26
def test_text_indexes(self):
class News(Document):
title = StringField()
@@ -3150,7 +3195,7 @@ class QuerySetTest(unittest.TestCase):
'brasil').order_by('$text_score').first()
self.assertEqual(item.get_text_score(), max_text_score)
-@skip_older_mongodb
+@needs_mongodb_v26
def test_distinct_handles_references_to_alias(self):
register_connection('testdb', 'mongoenginetest2')
@@ -4825,6 +4870,7 @@ class QuerySetTest(unittest.TestCase):
self.assertTrue(Person.objects._has_data(),
'Cursor has data and returned False')
@needs_mongodb_v26
def test_queryset_aggregation_framework(self):
class Person(Document):
name = StringField()
@@ -4859,17 +4905,13 @@ class QuerySetTest(unittest.TestCase):
{'_id': p1.pk, 'name': "ISABELLA LUANNA"}
])
-data = Person.objects(
-    age__gte=17, age__lte=40).order_by('-age').aggregate(
-    {'$group': {
-        '_id': None,
-        'total': {'$sum': 1},
-        'avg': {'$avg': '$age'}
-        }
-    }
-)
+data = Person.objects(age__gte=17, age__lte=40).order_by('-age').aggregate({
+    '$group': {
+        '_id': None,
+        'total': {'$sum': 1},
+        'avg': {'$avg': '$age'}
+    }
+})
self.assertEqual(list(data), [
{'_id': None, 'avg': 29, 'total': 2}
])
@@ -4910,28 +4952,16 @@ class QuerySetTest(unittest.TestCase):
self.assertEquals(Animal.objects(folded_ears=True).count(), 1)
self.assertEquals(Animal.objects(whiskers_length=5.1).count(), 1)
-def test_loop_via_invalid_id_does_not_crash(self):
+def test_loop_over_invalid_id_does_not_crash(self):
class Person(Document):
name = StringField()
-Person.objects.delete()
-Person._get_collection().update({"name": "a"}, {"$set": {"_id": ""}}, upsert=True)
+Person.drop_collection()
+Person._get_collection().insert({'name': 'a', 'id': ''})
for p in Person.objects():
self.assertEqual(p.name, 'a')
def test_last_field_name_like_operator(self):
class EmbeddedItem(EmbeddedDocument):
type = StringField()
class Doc(Document):
item = EmbeddedDocumentField(EmbeddedItem)
Doc.drop_collection()
doc = Doc(item=EmbeddedItem(type="axe"))
doc.save()
self.assertEqual(1, Doc.objects(item__type__="axe").count())
def test_len_during_iteration(self):
"""Tests that calling len on a queryset during iteration doesn't
stop paging.


@@ -35,8 +35,7 @@ class ConnectionTest(unittest.TestCase):
mongoengine.connection._dbs = {}
def test_connect(self):
-"""Ensure that the connect() method works properly.
-"""
+"""Ensure that the connect() method works properly."""
connect('mongoenginetest')
conn = get_connection()
@@ -146,8 +145,7 @@ class ConnectionTest(unittest.TestCase):
self.assertEqual(expected_connection, actual_connection)
def test_connect_uri(self):
-"""Ensure that the connect() method works properly with uri's
-"""
+"""Ensure that the connect() method works properly with URIs."""
c = connect(db='mongoenginetest', alias='admin')
c.admin.system.users.remove({})
c.mongoenginetest.system.users.remove({})
@@ -200,19 +198,6 @@ class ConnectionTest(unittest.TestCase):
self.assertTrue(isinstance(db, pymongo.database.Database))
self.assertEqual(db.name, 'test')
def test_connect_uri_with_replicaset(self):
"""Ensure connect() works when specifying a replicaSet."""
if IS_PYMONGO_3:
c = connect(host='mongodb://localhost/test?replicaSet=local-rs')
db = get_db()
self.assertTrue(isinstance(db, pymongo.database.Database))
self.assertEqual(db.name, 'test')
else:
# PyMongo < v3.x raises an exception:
# "localhost:27017 is not a member of replica set local-rs"
with self.assertRaises(MongoEngineConnectionError):
c = connect(host='mongodb://localhost/test?replicaSet=local-rs')
def test_uri_without_credentials_doesnt_override_conn_settings(self):
"""Ensure connect() uses the username & password params if the URI
doesn't explicitly specify them.
@@ -227,9 +212,8 @@ class ConnectionTest(unittest.TestCase):
self.assertRaises(OperationFailure, get_db)
def test_connect_uri_with_authsource(self):
-"""Ensure that the connect() method works well with
-the option `authSource` in URI.
-This feature was introduced in MongoDB 2.4 and removed in 2.6
-"""
+"""Ensure that the connect() method works well with `authSource`
+option in the URI.
+"""
# Create users
c = connect('mongoenginetest')
@@ -238,30 +222,31 @@ class ConnectionTest(unittest.TestCase):
# Authentication fails without "authSource"
if IS_PYMONGO_3:
-test_conn = connect('mongoenginetest', alias='test1',
-    host='mongodb://username2:password@localhost/mongoenginetest')
+test_conn = connect(
+    'mongoenginetest', alias='test1',
+    host='mongodb://username2:password@localhost/mongoenginetest'
+)
self.assertRaises(OperationFailure, test_conn.server_info)
else:
self.assertRaises(
-MongoEngineConnectionError, connect, 'mongoenginetest',
-alias='test1',
+MongoEngineConnectionError,
+connect, 'mongoenginetest', alias='test1',
host='mongodb://username2:password@localhost/mongoenginetest'
)
self.assertRaises(MongoEngineConnectionError, get_db, 'test1')
# Authentication succeeds with "authSource"
-connect(
+authd_conn = connect(
'mongoenginetest', alias='test2',
host=('mongodb://username2:password@localhost/'
'mongoenginetest?authSource=admin')
)
# This will fail starting from MongoDB 2.6+
db = get_db('test2')
self.assertTrue(isinstance(db, pymongo.database.Database))
self.assertEqual(db.name, 'mongoenginetest')
# Clear all users
-c.admin.system.users.remove({})
+authd_conn.admin.system.users.remove({})
def test_register_connection(self):
"""Ensure that connections with different aliases may be registered.
@@ -285,8 +270,7 @@ class ConnectionTest(unittest.TestCase):
self.assertTrue(isinstance(conn, pymongo.mongo_client.MongoClient))
def test_connection_kwargs(self):
-"""Ensure that connection kwargs get passed to pymongo.
-"""
+"""Ensure that connection kwargs get passed to pymongo."""
connect('mongoenginetest', alias='t1', tz_aware=True)
conn = get_connection('t1')
@@ -296,6 +280,32 @@ class ConnectionTest(unittest.TestCase):
conn = get_connection('t2')
self.assertFalse(get_tz_awareness(conn))
def test_connection_pool_via_kwarg(self):
"""Ensure we can specify a max connection pool size using
a connection kwarg.
"""
# Use "max_pool_size" or "maxpoolsize" depending on PyMongo version
# (former was changed to the latter as described in
# https://jira.mongodb.org/browse/PYTHON-854).
# TODO remove once PyMongo < 3.0 support is dropped
if pymongo.version_tuple[0] >= 3:
pool_size_kwargs = {'maxpoolsize': 100}
else:
pool_size_kwargs = {'max_pool_size': 100}
conn = connect('mongoenginetest', alias='max_pool_size_via_kwarg', **pool_size_kwargs)
self.assertEqual(conn.max_pool_size, 100)
def test_connection_pool_via_uri(self):
"""Ensure we can specify a max connection pool size using
an option in a connection URI.
"""
if pymongo.version_tuple[0] == 2 and pymongo.version_tuple[1] < 9:
raise SkipTest('maxpoolsize as a URI option is only supported in PyMongo v2.9+')
conn = connect(host='mongodb://localhost/test?maxpoolsize=100', alias='max_pool_size_via_uri')
self.assertEqual(conn.max_pool_size, 100)
def test_write_concern(self):
"""Ensure write concern can be specified in connect() via
a kwarg or as part of the connection URI.
@@ -309,6 +319,38 @@ class ConnectionTest(unittest.TestCase):
self.assertEqual(dict(conn1.write_concern), {'w': 1, 'j': True})
self.assertEqual(dict(conn2.write_concern), {'w': 1, 'j': True})
def test_connect_with_replicaset_via_uri(self):
"""Ensure connect() works when specifying a replicaSet via the
MongoDB URI.
"""
if IS_PYMONGO_3:
c = connect(host='mongodb://localhost/test?replicaSet=local-rs')
db = get_db()
self.assertTrue(isinstance(db, pymongo.database.Database))
self.assertEqual(db.name, 'test')
else:
# PyMongo < v3.x raises an exception:
# "localhost:27017 is not a member of replica set local-rs"
with self.assertRaises(MongoEngineConnectionError):
c = connect(host='mongodb://localhost/test?replicaSet=local-rs')
def test_connect_with_replicaset_via_kwargs(self):
"""Ensure connect() works when specifying a replicaSet via the
connection kwargs.
"""
if IS_PYMONGO_3:
c = connect(replicaset='local-rs')
self.assertEqual(c._MongoClient__options.replica_set_name,
'local-rs')
db = get_db()
self.assertTrue(isinstance(db, pymongo.database.Database))
self.assertEqual(db.name, 'test')
else:
# PyMongo < v3.x raises an exception:
# "localhost:27017 is not a member of replica set local-rs"
with self.assertRaises(MongoEngineConnectionError):
c = connect(replicaset='local-rs')
def test_datetime(self):
connect('mongoenginetest', tz_aware=True)
d = datetime.datetime(2010, 5, 5, tzinfo=utc)


@@ -2,10 +2,15 @@
import unittest
from bson import DBRef, ObjectId
from collections import OrderedDict
from mongoengine import *
from mongoengine.connection import get_db
from mongoengine.context_managers import query_counter
from mongoengine.python_support import IS_PYMONGO_3
from mongoengine.base import TopLevelDocumentMetaclass
if IS_PYMONGO_3:
from bson import CodecOptions
class FieldTest(unittest.TestCase):
@@ -1287,5 +1292,70 @@ class FieldTest(unittest.TestCase):
self.assertEqual(q, 2)
def test_dynamic_field_dereference(self):
class Merchandise(Document):
name = StringField()
price = IntField()
class Store(Document):
merchandises = DynamicField()
Merchandise.drop_collection()
Store.drop_collection()
merchandises = {
'#1': Merchandise(name='foo', price=100).save(),
'#2': Merchandise(name='bar', price=120).save(),
'#3': Merchandise(name='baz', price=110).save(),
}
Store(merchandises=merchandises).save()
store = Store.objects().first()
for obj in store.merchandises.values():
self.assertFalse(isinstance(obj, Merchandise))
store.select_related()
for obj in store.merchandises.values():
self.assertTrue(isinstance(obj, Merchandise))
def test_dynamic_field_dereference_with_ordering_guarantee_on_pymongo3(self):
# run only on PyMongo 3+, where 'codec_options' is supported
if IS_PYMONGO_3:
class OrderedDocument(Document):
my_metaclass = TopLevelDocumentMetaclass
__metaclass__ = TopLevelDocumentMetaclass
@classmethod
def _get_collection(cls):
collection = super(OrderedDocument, cls)._get_collection()
opts = CodecOptions(document_class=OrderedDict)
return collection.with_options(codec_options=opts)
class Merchandise(Document):
name = StringField()
price = IntField()
class Store(OrderedDocument):
merchandises = DynamicField(container_class=OrderedDict)
Merchandise.drop_collection()
Store.drop_collection()
merchandises = OrderedDict()
merchandises['#1'] = Merchandise(name='foo', price=100).save()
merchandises['#2'] = Merchandise(name='bar', price=120).save()
merchandises['#3'] = Merchandise(name='baz', price=110).save()
Store(merchandises=merchandises).save()
store = Store.objects().first()
store.select_related()
# confirm that data loads in the same order in which it was stored
self.assertTrue(isinstance(store.merchandises, OrderedDict))
self.assertEqual(','.join(store.merchandises.keys()), '#1,#2,#3')
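The `_get_collection` override in the test above is the core trick: re-wrapping the collection so decoded documents come back as `OrderedDict` instead of plain `dict`. A minimal stand-in sketch of the same override pattern, using hypothetical stub classes in place of real PyMongo objects (`StubCollection` mimics `Collection.with_options(codec_options=...)`):

```python
from collections import OrderedDict


class StubCollection(object):
    """Stand-in for a PyMongo collection (hypothetical)."""
    def __init__(self, document_class=dict):
        self.document_class = document_class

    def with_options(self, document_class):
        # Return a copy configured to decode into `document_class`,
        # mirroring pymongo's Collection.with_options(codec_options=...).
        return StubCollection(document_class=document_class)

    def find_one(self, pairs):
        # Decode a list of (key, value) pairs into the configured class,
        # the way the BSON decoder would.
        return self.document_class(pairs)


class BaseDocument(object):
    @classmethod
    def _get_collection(cls):
        return StubCollection()


class OrderedDocument(BaseDocument):
    @classmethod
    def _get_collection(cls):
        # Same shape as the test above: take the parent's collection and
        # re-wrap it so decoded documents keep their key order.
        collection = super(OrderedDocument, cls)._get_collection()
        return collection.with_options(document_class=OrderedDict)


doc = OrderedDocument._get_collection().find_one([('#1', 1), ('#2', 2), ('#3', 3)])
print(type(doc).__name__, list(doc.keys()))  # → OrderedDict ['#1', '#2', '#3']
```

Because BSON documents store fields in a defined order on disk, choosing an order-preserving `document_class` is enough to round-trip that order back to Python.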
if __name__ == '__main__':
unittest.main()

tests/utils.py (new file)

@@ -0,0 +1,78 @@
import unittest
from nose.plugins.skip import SkipTest
from mongoengine import connect
from mongoengine.connection import get_db, get_connection
from mongoengine.python_support import IS_PYMONGO_3
MONGO_TEST_DB = 'mongoenginetest'
class MongoDBTestCase(unittest.TestCase):
"""Base class for tests that need a MongoDB connection.

The test database is dropped automatically.
"""
@classmethod
def setUpClass(cls):
cls._connection = connect(db=MONGO_TEST_DB)
cls._connection.drop_database(MONGO_TEST_DB)
cls.db = get_db()
@classmethod
def tearDownClass(cls):
cls._connection.drop_database(MONGO_TEST_DB)
def get_mongodb_version():
"""Return the version tuple of the MongoDB server that the default
connection is connected to.
"""
return tuple(get_connection().server_info()['versionArray'])
def _decorated_with_ver_requirement(func, ver_tuple):
"""Return a given function decorated with the version requirement
for a particular MongoDB version tuple.
"""
def _inner(*args, **kwargs):
mongodb_ver = get_mongodb_version()
if mongodb_ver >= ver_tuple:
return func(*args, **kwargs)
raise SkipTest('Needs MongoDB v{}+'.format(
'.'.join([str(v) for v in ver_tuple])
))
_inner.__name__ = func.__name__
_inner.__doc__ = func.__doc__
return _inner
def needs_mongodb_v26(func):
"""Raise a SkipTest exception if we're working with MongoDB version
lower than v2.6.
"""
return _decorated_with_ver_requirement(func, (2, 6))
def needs_mongodb_v3(func):
"""Raise a SkipTest exception if we're working with MongoDB version
lower than v3.0.
"""
return _decorated_with_ver_requirement(func, (3, 0))
def skip_pymongo3(f):
"""Raise a SkipTest exception if we're running a test against
PyMongo v3.x.
"""
def _inner(*args, **kwargs):
if IS_PYMONGO_3:
raise SkipTest("Useless with PyMongo 3+")
return f(*args, **kwargs)
_inner.__name__ = f.__name__
_inner.__doc__ = f.__doc__
return _inner
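Both `needs_mongodb_v26` and `needs_mongodb_v3` above delegate to `_decorated_with_ver_requirement`, which defers the server-version check until the test actually runs, so test modules can be imported without a live server. A self-contained sketch of the same deferred-check pattern, with a stubbed version getter and a plain `RuntimeError` standing in for the real `get_mongodb_version()` call and nose's `SkipTest`:

```python
def make_version_gate(get_version):
    """Build a decorator factory bound to a version-lookup callable."""
    def needs_version(ver_tuple):
        def decorator(func):
            def _inner(*args, **kwargs):
                # The check runs at call time, not at decoration time, so
                # decorated tests can be collected without a live server.
                if get_version() >= ver_tuple:
                    return func(*args, **kwargs)
                raise RuntimeError(  # stands in for nose's SkipTest
                    'Needs v{}+'.format('.'.join(str(v) for v in ver_tuple)))
            # Preserve identity so test runners report the right name.
            _inner.__name__ = func.__name__
            _inner.__doc__ = func.__doc__
            return _inner
        return decorator
    return needs_version


# Stub: pretend the connected server reports version 2.6.0.
needs = make_version_gate(lambda: (2, 6, 0))

@needs((2, 6))
def runs():
    return 'ran'

@needs((3, 0))
def skipped():
    return 'ran'

print(runs())  # → ran; the (2, 6) requirement is met by the stubbed (2, 6, 0)
```

Tuple comparison does the heavy lifting here: `(2, 6, 0) >= (2, 6)` is true while `(2, 6, 0) >= (3, 0)` is false, so `skipped()` raises instead of running.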

tox.ini

@@ -1,5 +1,5 @@
[tox]
-envlist = {py26,py27,py33,py34,py35,pypy,pypy3}-{mg27,mg28},flake8
+envlist = {py27,py35,pypy,pypy3}-{mg27,mg28,mg30}
[testenv]
commands =
@@ -7,16 +7,7 @@ commands =
deps =
nose
mg27: PyMongo<2.8
-mg28: PyMongo>=2.8,<3.0
+mg28: PyMongo>=2.8,<2.9
mg30: PyMongo>=3.0
mgdev: https://github.com/mongodb/mongo-python-driver/tarball/master
setenv =
PYTHON_EGG_CACHE = {envdir}/python-eggs
passenv = windir
[testenv:flake8]
deps =
flake8
flake8-import-order
commands =
flake8