Merge branch 'master' of github.com:MongoEngine/mongoengine into fix_baselist_marked_changed_bug

This commit is contained in:
Bastien Gérard 2018-12-15 20:36:42 +01:00
commit 4492874d08
57 changed files with 2051 additions and 1062 deletions

View File

@@ -3,12 +3,7 @@
sudo apt-get remove mongodb-org-server sudo apt-get remove mongodb-org-server
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 7F0CEB10 sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 7F0CEB10
if [ "$MONGODB" = "2.4" ]; then if [ "$MONGODB" = "2.6" ]; then
echo "deb http://downloads-distro.mongodb.org/repo/ubuntu-upstart dist 10gen" | sudo tee /etc/apt/sources.list.d/mongodb.list
sudo apt-get update
sudo apt-get install mongodb-10gen=2.4.14
sudo service mongodb start
elif [ "$MONGODB" = "2.6" ]; then
echo "deb http://downloads-distro.mongodb.org/repo/ubuntu-upstart dist 10gen" | sudo tee /etc/apt/sources.list.d/mongodb.list echo "deb http://downloads-distro.mongodb.org/repo/ubuntu-upstart dist 10gen" | sudo tee /etc/apt/sources.list.d/mongodb.list
sudo apt-get update sudo apt-get update
sudo apt-get install mongodb-org-server=2.6.12 sudo apt-get install mongodb-org-server=2.6.12
@@ -18,8 +13,14 @@ elif [ "$MONGODB" = "3.0" ]; then
sudo apt-get update sudo apt-get update
sudo apt-get install mongodb-org-server=3.0.14 sudo apt-get install mongodb-org-server=3.0.14
# service should be started automatically # service should be started automatically
elif [ "$MONGODB" = "3.2" ]; then
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv EA312927
echo "deb http://repo.mongodb.org/apt/ubuntu trusty/mongodb-org/3.2 multiverse" | sudo tee /etc/apt/sources.list.d/mongodb-org-3.2.list
sudo apt-get update
sudo apt-get install mongodb-org-server=3.2.20
# service should be started automatically
else else
echo "Invalid MongoDB version, expected 2.4, 2.6, or 3.0." echo "Invalid MongoDB version, expected 2.6, 3.0, or 3.2"
exit 1 exit 1
fi; fi;

View File

@@ -2,12 +2,10 @@
# PyMongo combinations. However, that would result in an overly long build # PyMongo combinations. However, that would result in an overly long build
# with a very large number of jobs, hence we only test a subset of all the # with a very large number of jobs, hence we only test a subset of all the
# combinations: # combinations:
# * MongoDB v2.4 & v3.0 are only tested against Python v2.7 & v3.5.
# * MongoDB v2.4 is tested against PyMongo v2.7 & v3.x.
# * MongoDB v3.0 is tested against PyMongo v3.x.
# * MongoDB v2.6 is currently the "main" version tested against Python v2.7, # * MongoDB v2.6 is currently the "main" version tested against Python v2.7,
# v3.5, PyPy & PyPy3, and PyMongo v2.7, v2.8 & v3.x. # v3.5, v3.6, PyPy, and PyMongo v3.x.
# # * MongoDB v3.0 & v3.2 are tested against Python v2.7, v3.5 & v3.6
# and Pymongo v3.5 & v3.x
# Reminder: Update README.rst if you change MongoDB versions we test. # Reminder: Update README.rst if you change MongoDB versions we test.
language: python language: python
@@ -27,17 +25,17 @@ matrix:
include: include:
- python: 2.7 - python: 2.7
env: MONGODB=2.4 PYMONGO=3.5 env: MONGODB=3.0 PYMONGO=3.5
- python: 2.7 - python: 2.7
env: MONGODB=3.0 PYMONGO=3.x env: MONGODB=3.2 PYMONGO=3.x
- python: 3.5 - python: 3.5
env: MONGODB=2.4 PYMONGO=3.5 env: MONGODB=3.0 PYMONGO=3.5
- python: 3.5 - python: 3.5
env: MONGODB=3.0 PYMONGO=3.x env: MONGODB=3.2 PYMONGO=3.x
- python: 3.6 - python: 3.6
env: MONGODB=2.4 PYMONGO=3.5 env: MONGODB=3.0 PYMONGO=3.5
- python: 3.6 - python: 3.6
env: MONGODB=3.0 PYMONGO=3.x env: MONGODB=3.2 PYMONGO=3.x
before_install: before_install:
- bash .install_mongodb_on_travis.sh - bash .install_mongodb_on_travis.sh

View File

@@ -247,3 +247,5 @@ that much better:
* Erdenezul Batmunkh (https://github.com/erdenezul) * Erdenezul Batmunkh (https://github.com/erdenezul)
* Andy Yankovsky (https://github.com/werat) * Andy Yankovsky (https://github.com/werat)
* Bastien Gérard (https://github.com/bagerard) * Bastien Gérard (https://github.com/bagerard)
* Trevor Hall (https://github.com/tjhall13)
* Gleb Voropaev (https://github.com/buggyspace)

View File

@@ -26,19 +26,21 @@ an `API reference <https://mongoengine-odm.readthedocs.io/apireference.html>`_.
Supported MongoDB Versions Supported MongoDB Versions
========================== ==========================
MongoEngine is currently tested against MongoDB v2.4, v2.6, and v3.0. Future MongoEngine is currently tested against MongoDB v2.6, v3.0 and v3.2. Future
versions should be supported as well, but aren't actively tested at the moment. versions should be supported as well, but aren't actively tested at the moment.
Make sure to open an issue or submit a pull request if you experience any Make sure to open an issue or submit a pull request if you experience any
problems with MongoDB v3.2+. problems with MongoDB v3.4+.
Installation Installation
============ ============
We recommend the use of `virtualenv <https://virtualenv.pypa.io/>`_ and of We recommend the use of `virtualenv <https://virtualenv.pypa.io/>`_ and of
`pip <https://pip.pypa.io/>`_. You can then use ``pip install -U mongoengine``. `pip <https://pip.pypa.io/>`_. You can then use ``pip install -U mongoengine``.
You may also have `setuptools <http://peak.telecommunity.com/DevCenter/setuptools>`_ You may also have `setuptools <http://peak.telecommunity.com/DevCenter/setuptools>`_
and thus you can use ``easy_install -U mongoengine``. Otherwise, you can download the and thus you can use ``easy_install -U mongoengine``. Another option is
source from `GitHub <http://github.com/MongoEngine/mongoengine>`_ and run ``python `pipenv <https://docs.pipenv.org/>`_. You can then use ``pipenv install mongoengine``
setup.py install``. to both create the virtual environment and install the package. Otherwise, you can
download the source from `GitHub <http://github.com/MongoEngine/mongoengine>`_ and
run ``python setup.py install``.
Dependencies Dependencies
============ ============

View File

@@ -2,9 +2,64 @@
Changelog Changelog
========= =========
Changes in 0.15.4 Development
===========
- (Fill this out as you fix issues and develop your features).
- Fix .only() working improperly after using .count() on the same QuerySet instance
================= =================
- Added `DateField` #513 Changes in 0.16.3
=================
- Fix $push with $position operator not working with lists in embedded document #1965
=================
Changes in 0.16.2
=================
- Fix .save() that fails when called with write_concern=None (regression of 0.16.1) #1958
=================
Changes in 0.16.1
=================
- Fix `_cls` that is not set properly in Document constructor (regression) #1950
- Fix bug in _delta method - Update of a ListField depends on an unrelated dynamic field update #1733
- Remove deprecated `save()` method and use `insert_one()` instead #1899
=================
Changes in 0.16.0
=================
- Various improvements to the documentation
- Improvements to code quality
- POTENTIAL BREAKING CHANGES:
- EmbeddedDocumentField will no longer accept references to Document classes in its constructor #1661
- Get rid of the `basecls` parameter from the DictField constructor (dead code) #1876
- Default value of ComplexDateTime is now None (and no longer the current datetime) #1368
- Fix unhashable TypeError when referencing a Document with a compound key in an EmbeddedDocument #1685
- Fix bug where an EmbeddedDocument with the same id as its parent would not be tracked for changes #1768
- Fix the fact that bulk `insert()` was not setting primary keys of inserted document instances #1919
- Fix bug when referencing the abstract class in a ReferenceField #1920
- Allow modifications made to the document in pre_save_post_validation to be taken into account #1202
- Replaced MongoDB 2.4 tests in CI with MongoDB 3.2 #1903
- Fix side effects of using queryset.`no_dereference` on other documents #1677
- Fix TypeError when using lazy django translation objects as translated choices #1879
- Improve Python 2-3 codebase compatibility #1889
- Fix the support for changing the default value of ComplexDateTime #1368
- Improve the error message when an EmbeddedDocumentListField receives an EmbeddedDocument instance
instead of a list #1877
- Fix the Decimal operator inc/dec #1517 #1320
- Ignore killcursors queries in `query_counter` context manager #1869
- Fix the fact that `query_counter` was modifying the initial profiling_level in case it was != 0 #1870
- Repaired the `no_sub_classes` context manager and fixed the fact that it was swallowing exceptions #1865
- Fix index creation error that was swallowed by hasattr under python2 #1688
- QuerySet limit function behaviour: Passing 0 as parameter will return all the documents in the cursor #1611
- Bulk insert now updates the ids of the input document instances #1919
- Fix a harmless bug related to GenericReferenceField where modifications in the generic-referenced document
were tracked in the parent #1934
- Improve validator of BinaryField #273
- Implemented lazy regex compiling in Field classes to improve 'import mongoengine' performance #1806
- Updated GridFSProxy.__str__ so that it would always print both the filename and grid_id #710
- Add __repr__ to Q and QCombination #1843
- Fix bug in the BaseList.__iter__ operator (occurred when modifying a BaseList while iterating over it) #1676
- Added `DateField` #513
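A minimal illustration (plain Python lists, not MongoEngine internals; all names hypothetical) of the iteration hazard behind the BaseList.__iter__ fix (#1676): removing items from a list while iterating over it skips elements, while iterating over a copy is safe.

```python
# Mutating `items` during iteration would skip elements; iterating over
# a copy (list(items)) makes removal safe.
items = [1, 2, 3, 4]
seen = []
for x in list(items):  # copy, so the removal below cannot skip elements
    seen.append(x)
    if x == 2:
        items.remove(x)
```

After the loop, every original element has been visited even though `items` shrank mid-iteration.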
Changes in 0.15.3 Changes in 0.15.3
================= =================

View File

@@ -45,27 +45,27 @@ post2.link_url = 'http://tractiondigital.com/labs/mongoengine/docs'
post2.tags = ['mongoengine'] post2.tags = ['mongoengine']
post2.save() post2.save()
print 'ALL POSTS' print('ALL POSTS')
print print()
for post in Post.objects: for post in Post.objects:
print post.title print(post.title)
#print '=' * post.title.count() #print '=' * post.title.count()
print "=" * 20 print("=" * 20)
if isinstance(post, TextPost): if isinstance(post, TextPost):
print post.content print(post.content)
if isinstance(post, LinkPost): if isinstance(post, LinkPost):
print 'Link:', post.link_url print('Link:', post.link_url)
print print()
print print()
print 'POSTS TAGGED \'MONGODB\'' print('POSTS TAGGED \'MONGODB\'')
print print()
for post in Post.objects(tags='mongodb'): for post in Post.objects(tags='mongodb'):
print post.title print(post.title)
print print()
num_posts = Post.objects(tags='mongodb').count() num_posts = Post.objects(tags='mongodb').count()
print 'Found %d posts with tag "mongodb"' % num_posts print('Found %d posts with tag "mongodb"' % num_posts)

View File

@@ -155,7 +155,7 @@ arguments can be set on all fields:
An iterable (e.g. list, tuple or set) of choices to which the value of this An iterable (e.g. list, tuple or set) of choices to which the value of this
field should be limited. field should be limited.
Can be either be a nested tuples of value (stored in mongo) and a Can either be nested tuples of value (stored in mongo) and a
human readable key :: human readable key ::
SIZE = (('S', 'Small'), SIZE = (('S', 'Small'),
@@ -492,7 +492,9 @@ the field name with a **#**::
] ]
} }
If a dictionary is passed then the following options are available: If a dictionary is passed then additional options become available. Valid options include,
but are not limited to:
:attr:`fields` (Default: None) :attr:`fields` (Default: None)
The fields to index. Specified in the same format as described above. The fields to index. Specified in the same format as described above.
@@ -513,8 +515,15 @@ If a dictionary is passed then the following options are available:
Allows you to automatically expire data from a collection by setting the Allows you to automatically expire data from a collection by setting the
time in seconds to expire a field. time in seconds to expire a field.
:attr:`name` (Optional)
Allows you to specify a name for the index
:attr:`collation` (Optional)
Allows you to create case-insensitive indexes (MongoDB v3.4+ only)
.. note:: .. note::
Additional options are forwarded as **kwargs to pymongo's create_index method.
Inheritance adds extra indices on fields; see :ref:`document-inheritance`. Inheritance adds extra indices on fields; see :ref:`document-inheritance`.
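As a sketch of the new `name` and `collation` index options documented above (not runnable against a database; the index name and collation values are illustrative assumptions), an index entry might look like the following. The collation document follows MongoDB's collation format, where locale plus strength 2 gives case-insensitive matching:

```python
# Hypothetical index specification combining the documented options.
index_spec = {
    'fields': ['title'],
    'name': 'title_ci_idx',                        # assumed index name
    'collation': {'locale': 'en', 'strength': 2},  # strength 2 = case-insensitive
}
```

Such a dictionary would go in a Document's ``meta['indexes']`` list; unrecognized keys are forwarded to pymongo's create_index as **kwargs.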
Global index default options Global index default options
@@ -526,7 +535,7 @@ There are a few top level defaults for all indexes that can be set::
title = StringField() title = StringField()
rating = StringField() rating = StringField()
meta = { meta = {
'index_options': {}, 'index_opts': {},
'index_background': True, 'index_background': True,
'index_cls': False, 'index_cls': False,
'auto_create_index': True, 'auto_create_index': True,
@@ -534,8 +543,8 @@ There are a few top level defaults for all indexes that can be set::
} }
:attr:`index_options` (Optional) :attr:`index_opts` (Optional)
Set any default index options - see the `full options list <http://docs.mongodb.org/manual/reference/method/db.collection.ensureIndex/#db.collection.ensureIndex>`_ Set any default index options - see the `full options list <https://docs.mongodb.com/manual/reference/method/db.collection.createIndex/#db.collection.createIndex>`_
:attr:`index_background` (Optional) :attr:`index_background` (Optional)
Set the default value for if an index should be indexed in the background Set the default value for if an index should be indexed in the background
@@ -551,8 +560,7 @@ There are a few top level defaults for all indexes that can be set::
:attr:`index_drop_dups` (Optional) :attr:`index_drop_dups` (Optional)
Set the default value for if an index should drop duplicates Set the default value for if an index should drop duplicates
Since MongoDB 3.0 drop_dups is not supported anymore. Raises a Warning
.. note:: Since MongoDB 3.0 drop_dups is not supported anymore. Raises a Warning
and has no effect and has no effect
@@ -734,6 +742,9 @@ document.::
.. note:: From 0.8 onwards :attr:`allow_inheritance` defaults .. note:: From 0.8 onwards :attr:`allow_inheritance` defaults
to False, meaning you must set it to True to use inheritance. to False, meaning you must set it to True to use inheritance.
Setting :attr:`allow_inheritance` to True is also required on an
:class:`~mongoengine.EmbeddedDocument` class if you need to subclass it
Working with existing data Working with existing data
-------------------------- --------------------------
As MongoEngine no longer defaults to needing :attr:`_cls`, you can quickly and As MongoEngine no longer defaults to needing :attr:`_cls`, you can quickly and

View File

@@ -57,7 +57,8 @@ document values for example::
def clean(self): def clean(self):
"""Ensures that only published essays have a `pub_date` and """Ensures that only published essays have a `pub_date` and
automatically sets the pub_date if published and not set""" automatically sets `pub_date` if essay is published and `pub_date`
is not set"""
if self.status == 'Draft' and self.pub_date is not None: if self.status == 'Draft' and self.pub_date is not None:
msg = 'Draft entries should not have a publication date.' msg = 'Draft entries should not have a publication date.'
raise ValidationError(msg) raise ValidationError(msg)

View File

@@ -456,14 +456,14 @@ data. To turn off dereferencing of the results of a query use
:func:`~mongoengine.queryset.QuerySet.no_dereference` on the queryset like so:: :func:`~mongoengine.queryset.QuerySet.no_dereference` on the queryset like so::
post = Post.objects.no_dereference().first() post = Post.objects.no_dereference().first()
assert(isinstance(post.author, ObjectId)) assert(isinstance(post.author, DBRef))
You can also turn off all dereferencing for a fixed period by using the You can also turn off all dereferencing for a fixed period by using the
:class:`~mongoengine.context_managers.no_dereference` context manager:: :class:`~mongoengine.context_managers.no_dereference` context manager::
with no_dereference(Post) as Post: with no_dereference(Post) as Post:
post = Post.objects.first() post = Post.objects.first()
assert(isinstance(post.author, ObjectId)) assert(isinstance(post.author, DBRef))
# Outside the context manager dereferencing occurs. # Outside the context manager dereferencing occurs.
assert(isinstance(post.author, User)) assert(isinstance(post.author, User))

View File

@@ -6,6 +6,11 @@ Development
*********** ***********
(Fill this out whenever you introduce breaking changes to MongoEngine) (Fill this out whenever you introduce breaking changes to MongoEngine)
URLField's constructor no longer takes `verify_exists`
0.15.0
******
0.14.0 0.14.0
****** ******
This release includes a few bug fixes and a significant code cleanup. The most This release includes a few bug fixes and a significant code cleanup. The most

View File

@@ -23,7 +23,7 @@ __all__ = (list(document.__all__) + list(fields.__all__) +
list(signals.__all__) + list(errors.__all__)) list(signals.__all__) + list(errors.__all__))
VERSION = (0, 15, 3) VERSION = (0, 16, 3)
def get_version(): def get_version():

View File

@@ -3,10 +3,10 @@ from mongoengine.errors import NotRegistered
__all__ = ('UPDATE_OPERATORS', 'get_document', '_document_registry') __all__ = ('UPDATE_OPERATORS', 'get_document', '_document_registry')
UPDATE_OPERATORS = set(['set', 'unset', 'inc', 'dec', 'mul', UPDATE_OPERATORS = {'set', 'unset', 'inc', 'dec', 'mul',
'pop', 'push', 'push_all', 'pull', 'pop', 'push', 'push_all', 'pull',
'pull_all', 'add_to_set', 'set_on_insert', 'pull_all', 'add_to_set', 'set_on_insert',
'min', 'max', 'rename']) 'min', 'max', 'rename'}
_document_registry = {} _document_registry = {}
@@ -19,7 +19,7 @@ def get_document(name):
# Possible old style name # Possible old style name
single_end = name.split('.')[-1] single_end = name.split('.')[-1]
compound_end = '.%s' % single_end compound_end = '.%s' % single_end
possible_match = [k for k in _document_registry.keys() possible_match = [k for k in _document_registry
if k.endswith(compound_end) or k == single_end] if k.endswith(compound_end) or k == single_end]
if len(possible_match) == 1: if len(possible_match) == 1:
doc = _document_registry.get(possible_match.pop(), None) doc = _document_registry.get(possible_match.pop(), None)
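The old-style name lookup in `get_document` above can be sketched standalone; the registry contents and helper name below are hypothetical:

```python
# A name like 'User' should match the old-style compound key 'App.User',
# either by exact match or by '.<name>' suffix.
_registry = {'App.User': object, 'Post': object}

def possible_matches(name):
    single_end = name.split('.')[-1]
    compound_end = '.%s' % single_end
    return [k for k in _registry
            if k.endswith(compound_end) or k == single_end]
```

Iterating the dict directly (rather than `.keys()`) is the idiom the diff adopts; both produce the same keys.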

View File

@@ -35,10 +35,9 @@ class BaseDict(dict):
_name = None _name = None
def __init__(self, dict_items, instance, name): def __init__(self, dict_items, instance, name):
Document = _import_class('Document') BaseDocument = _import_class('BaseDocument')
EmbeddedDocument = _import_class('EmbeddedDocument')
if isinstance(instance, (Document, EmbeddedDocument)): if isinstance(instance, BaseDocument):
self._instance = weakref.proxy(instance) self._instance = weakref.proxy(instance)
self._name = name self._name = name
super(BaseDict, self).__init__(dict_items) super(BaseDict, self).__init__(dict_items)
@@ -56,11 +55,11 @@ class BaseDict(dict):
EmbeddedDocument = _import_class('EmbeddedDocument') EmbeddedDocument = _import_class('EmbeddedDocument')
if isinstance(value, EmbeddedDocument) and value._instance is None: if isinstance(value, EmbeddedDocument) and value._instance is None:
value._instance = self._instance value._instance = self._instance
elif not isinstance(value, BaseDict) and isinstance(value, dict): elif isinstance(value, dict) and not isinstance(value, BaseDict):
value = BaseDict(value, None, '%s.%s' % (self._name, key)) value = BaseDict(value, None, '%s.%s' % (self._name, key))
super(BaseDict, self).__setitem__(key, value) super(BaseDict, self).__setitem__(key, value)
value._instance = self._instance value._instance = self._instance
elif not isinstance(value, BaseList) and isinstance(value, list): elif isinstance(value, list) and not isinstance(value, BaseList):
value = BaseList(value, None, '%s.%s' % (self._name, key)) value = BaseList(value, None, '%s.%s' % (self._name, key))
super(BaseDict, self).__setitem__(key, value) super(BaseDict, self).__setitem__(key, value)
value._instance = self._instance value._instance = self._instance
@@ -100,10 +99,9 @@ class BaseList(list):
_name = None _name = None
def __init__(self, list_items, instance, name): def __init__(self, list_items, instance, name):
Document = _import_class('Document') BaseDocument = _import_class('BaseDocument')
EmbeddedDocument = _import_class('EmbeddedDocument')
if isinstance(instance, (Document, EmbeddedDocument)): if isinstance(instance, BaseDocument):
self._instance = weakref.proxy(instance) self._instance = weakref.proxy(instance)
self._name = name self._name = name
super(BaseList, self).__init__(list_items) super(BaseList, self).__init__(list_items)
@@ -119,12 +117,12 @@ class BaseList(list):
EmbeddedDocument = _import_class('EmbeddedDocument') EmbeddedDocument = _import_class('EmbeddedDocument')
if isinstance(value, EmbeddedDocument) and value._instance is None: if isinstance(value, EmbeddedDocument) and value._instance is None:
value._instance = self._instance value._instance = self._instance
elif not isinstance(value, BaseDict) and isinstance(value, dict): elif isinstance(value, dict) and not isinstance(value, BaseDict):
# Replace dict by BaseDict # Replace dict by BaseDict
value = BaseDict(value, None, '%s.%s' % (self._name, key)) value = BaseDict(value, None, '%s.%s' % (self._name, key))
super(BaseList, self).__setitem__(key, value) super(BaseList, self).__setitem__(key, value)
value._instance = self._instance value._instance = self._instance
elif not isinstance(value, BaseList) and isinstance(value, list): elif isinstance(value, list) and not isinstance(value, BaseList):
# Replace list by BaseList # Replace list by BaseList
value = BaseList(value, None, '%s.%s' % (self._name, key)) value = BaseList(value, None, '%s.%s' % (self._name, key))
super(BaseList, self).__setitem__(key, value) super(BaseList, self).__setitem__(key, value)
@@ -218,6 +216,9 @@ class EmbeddedDocumentList(BaseList):
Filters the list by only including embedded documents with the Filters the list by only including embedded documents with the
given keyword arguments. given keyword arguments.
This method only supports simple comparisons (e.g. ``.filter(name='John Doe')``)
and does not support operators such as ``__gte``, ``__lte`` or ``__icontains`` the way ``QuerySet.filter`` does
:param kwargs: The keyword arguments corresponding to the fields to :param kwargs: The keyword arguments corresponding to the fields to
filter on. *Multiple arguments are treated as if they are ANDed filter on. *Multiple arguments are treated as if they are ANDed
together.* together.*
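The simple-equality semantics described in the note can be sketched over plain dicts standing in for embedded documents (names below are illustrative): multiple kwargs are ANDed together, and no ``__gte``-style operators are interpreted.

```python
docs = [{'name': 'John Doe', 'age': 30}, {'name': 'Jane Doe', 'age': 25}]

def filter_simple(items, **kwargs):
    # Keep only items whose fields all equal the given values.
    return [d for d in items
            if all(d.get(k) == v for k, v in kwargs.items())]
```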
@@ -358,7 +359,7 @@ class EmbeddedDocumentList(BaseList):
class StrictDict(object): class StrictDict(object):
__slots__ = () __slots__ = ()
_special_fields = set(['get', 'pop', 'iteritems', 'items', 'keys', 'create']) _special_fields = {'get', 'pop', 'iteritems', 'items', 'keys', 'create'}
_classes = {} _classes = {}
def __init__(self, **kwargs): def __init__(self, **kwargs):

View File

@@ -1,11 +1,8 @@
import copy import copy
import numbers import numbers
from collections import Hashable
from functools import partial from functools import partial
from bson import ObjectId, json_util from bson import DBRef, ObjectId, SON, json_util
from bson.dbref import DBRef
from bson.son import SON
import pymongo import pymongo
import six import six
@@ -19,6 +16,7 @@ from mongoengine.base.fields import ComplexBaseField
from mongoengine.common import _import_class from mongoengine.common import _import_class
from mongoengine.errors import (FieldDoesNotExist, InvalidDocumentError, from mongoengine.errors import (FieldDoesNotExist, InvalidDocumentError,
LookUpError, OperationError, ValidationError) LookUpError, OperationError, ValidationError)
from mongoengine.python_support import Hashable
__all__ = ('BaseDocument', 'NON_FIELD_ERRORS') __all__ = ('BaseDocument', 'NON_FIELD_ERRORS')
@@ -302,7 +300,7 @@ class BaseDocument(object):
data['_cls'] = self._class_name data['_cls'] = self._class_name
# only root fields ['test1.a', 'test2'] => ['test1', 'test2'] # only root fields ['test1.a', 'test2'] => ['test1', 'test2']
root_fields = set([f.split('.')[0] for f in fields]) root_fields = {f.split('.')[0] for f in fields}
for field_name in self: for field_name in self:
if root_fields and field_name not in root_fields: if root_fields and field_name not in root_fields:
@@ -404,7 +402,15 @@ class BaseDocument(object):
@classmethod @classmethod
def from_json(cls, json_data, created=False): def from_json(cls, json_data, created=False):
"""Converts json data to an unsaved document instance""" """Converts json data to a Document instance
:param json_data: The json data to load into the Document
:param created: If True, the document will be considered brand new.
If False and an id is provided, the data being loaded is assumed to
correspond to what's already in the database (this has an impact on
subsequent calls to ``.save()``).
If False and no id is provided, the data is considered a new document
(default ``False``)
"""
return cls._from_son(json_util.loads(json_data), created=created) return cls._from_son(json_util.loads(json_data), created=created)
def __expand_dynamic_values(self, name, value): def __expand_dynamic_values(self, name, value):
@@ -495,7 +501,13 @@ class BaseDocument(object):
self._changed_fields = [] self._changed_fields = []
def _nestable_types_changed_fields(self, changed_fields, key, data, inspected): def _nestable_types_changed_fields(self, changed_fields, base_key, data):
"""Inspect nested data for changed fields
:param changed_fields: Previously collected changed fields
:param base_key: The base key to prepend to changes detected in this data
:param data: data to inspect for changes
"""
# Loop list / dict fields as they contain documents # Loop list / dict fields as they contain documents
# Determine the iterator to use # Determine the iterator to use
if not hasattr(data, 'items'): if not hasattr(data, 'items'):
@@ -503,68 +515,60 @@ class BaseDocument(object):
else: else:
iterator = data.iteritems() iterator = data.iteritems()
for index, value in iterator: for index_or_key, value in iterator:
list_key = '%s%s.' % (key, index) item_key = '%s%s.' % (base_key, index_or_key)
# don't check anything lower if this key is already marked # don't check anything lower if this key is already marked
# as changed. # as changed.
if list_key[:-1] in changed_fields: if item_key[:-1] in changed_fields:
continue continue
if hasattr(value, '_get_changed_fields'): if hasattr(value, '_get_changed_fields'):
changed = value._get_changed_fields(inspected) changed = value._get_changed_fields()
changed_fields += ['%s%s' % (list_key, k) changed_fields += ['%s%s' % (item_key, k) for k in changed if k]
for k in changed if k]
elif isinstance(value, (list, tuple, dict)): elif isinstance(value, (list, tuple, dict)):
self._nestable_types_changed_fields( self._nestable_types_changed_fields(
changed_fields, list_key, value, inspected) changed_fields, item_key, value)
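The key construction performed by `_nestable_types_changed_fields` above can be sketched in isolation (helper name is hypothetical): each entry under `base_key` gets a ``'<base_key><index_or_key>.'`` prefix, whether the container is dict-like or list-like.

```python
def nested_keys(base_key, data):
    # Dict-like containers iterate (key, value); lists iterate (index, value).
    iterator = data.items() if hasattr(data, 'items') else enumerate(data)
    return ['%s%s.' % (base_key, k) for k, _ in iterator]
```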
def _get_changed_fields(self, inspected=None): def _get_changed_fields(self):
"""Return a list of all fields that have explicitly been changed. """Return a list of all fields that have explicitly been changed.
""" """
EmbeddedDocument = _import_class('EmbeddedDocument') EmbeddedDocument = _import_class('EmbeddedDocument')
DynamicEmbeddedDocument = _import_class('DynamicEmbeddedDocument')
ReferenceField = _import_class('ReferenceField') ReferenceField = _import_class('ReferenceField')
GenericReferenceField = _import_class('GenericReferenceField')
SortedListField = _import_class('SortedListField') SortedListField = _import_class('SortedListField')
changed_fields = [] changed_fields = []
changed_fields += getattr(self, '_changed_fields', []) changed_fields += getattr(self, '_changed_fields', [])
inspected = inspected or set()
if hasattr(self, 'id') and isinstance(self.id, Hashable):
if self.id in inspected:
return changed_fields
inspected.add(self.id)
for field_name in self._fields_ordered: for field_name in self._fields_ordered:
db_field_name = self._db_field_map.get(field_name, field_name) db_field_name = self._db_field_map.get(field_name, field_name)
key = '%s.' % db_field_name key = '%s.' % db_field_name
data = self._data.get(field_name, None) data = self._data.get(field_name, None)
field = self._fields.get(field_name) field = self._fields.get(field_name)
if hasattr(data, 'id'): if db_field_name in changed_fields:
if data.id in inspected: # Whole field already marked as changed, no need to go further
continue
if isinstance(field, ReferenceField):
continue continue
elif (
isinstance(data, (EmbeddedDocument, DynamicEmbeddedDocument)) and if isinstance(field, ReferenceField): # Don't follow referenced documents
db_field_name not in changed_fields continue
):
if isinstance(data, EmbeddedDocument):
# Find all embedded fields that have been changed # Find all embedded fields that have been changed
changed = data._get_changed_fields(inspected) changed = data._get_changed_fields()
changed_fields += ['%s%s' % (key, k) for k in changed if k] changed_fields += ['%s%s' % (key, k) for k in changed if k]
elif (isinstance(data, (list, tuple, dict)) and elif isinstance(data, (list, tuple, dict)):
db_field_name not in changed_fields):
if (hasattr(field, 'field') and if (hasattr(field, 'field') and
isinstance(field.field, ReferenceField)): isinstance(field.field, (ReferenceField, GenericReferenceField))):
continue continue
elif isinstance(field, SortedListField) and field._ordering: elif isinstance(field, SortedListField) and field._ordering:
# if ordering is affected whole list is changed # if ordering is affected whole list is changed
if any(map(lambda d: field._ordering in d._changed_fields, data)): if any(field._ordering in d._changed_fields for d in data):
changed_fields.append(db_field_name) changed_fields.append(db_field_name)
continue continue
self._nestable_types_changed_fields( self._nestable_types_changed_fields(
changed_fields, key, data, inspected) changed_fields, key, data)
return changed_fields return changed_fields
def _delta(self): def _delta(self):
@@ -576,7 +580,6 @@ class BaseDocument(object):
set_fields = self._get_changed_fields() set_fields = self._get_changed_fields()
unset_data = {} unset_data = {}
parts = []
if hasattr(self, '_changed_fields'): if hasattr(self, '_changed_fields'):
set_data = {} set_data = {}
# Fetch each set item from its path # Fetch each set item from its path
@@ -586,15 +589,13 @@ class BaseDocument(object):
new_path = [] new_path = []
for p in parts: for p in parts:
if isinstance(d, (ObjectId, DBRef)): if isinstance(d, (ObjectId, DBRef)):
# Don't dig in the references
break break
elif isinstance(d, list) and p.lstrip('-').isdigit(): elif isinstance(d, list) and p.isdigit():
if p[0] == '-': # An item of a list (identified by its index) is updated
p = str(len(d) + int(p)) d = d[int(p)]
try:
d = d[int(p)]
except IndexError:
d = None
elif hasattr(d, 'get'): elif hasattr(d, 'get'):
# dict-like (dict, embedded document)
d = d.get(p) d = d.get(p)
new_path.append(p) new_path.append(p)
path = '.'.join(new_path) path = '.'.join(new_path)
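The dotted-path walk in the loop above can be sketched as a standalone helper (hypothetical name; plain dicts and lists standing in for documents): purely numeric path parts index into lists, other parts are looked up on dict-like values.

```python
def traverse(doc, path):
    d = doc
    for p in path.split('.'):
        if isinstance(d, list) and p.isdigit():
            # An item of a list, identified by its index
            d = d[int(p)]
        elif hasattr(d, 'get'):
            # dict-like (dict, embedded document)
            d = d.get(p)
    return d
```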
@@ -606,26 +607,26 @@ class BaseDocument(object):
# Determine if any changed items were actually unset. # Determine if any changed items were actually unset.
for path, value in set_data.items(): for path, value in set_data.items():
if value or isinstance(value, (numbers.Number, bool)): if value or isinstance(value, (numbers.Number, bool)): # Account for 0 and False, which are falsy but meaningful
continue continue
# If we've set a value that ain't the default value don't unset it. parts = path.split('.')
default = None
if (self._dynamic and len(parts) and parts[0] in if (self._dynamic and len(parts) and parts[0] in
self._dynamic_fields): self._dynamic_fields):
del set_data[path] del set_data[path]
unset_data[path] = 1 unset_data[path] = 1
continue continue
elif path in self._fields:
# If we've set a value that ain't the default value don't unset it.
default = None
if path in self._fields:
default = self._fields[path].default default = self._fields[path].default
else: # Perform a full lookup for lists / embedded lookups else: # Perform a full lookup for lists / embedded lookups
d = self d = self
parts = path.split('.')
db_field_name = parts.pop() db_field_name = parts.pop()
for p in parts: for p in parts:
if isinstance(d, list) and p.lstrip('-').isdigit(): if isinstance(d, list) and p.isdigit():
if p[0] == '-':
p = str(len(d) + int(p))
d = d[int(p)] d = d[int(p)]
elif (hasattr(d, '__getattribute__') and elif (hasattr(d, '__getattribute__') and
not isinstance(d, dict)): not isinstance(d, dict)):
@ -643,10 +644,9 @@ class BaseDocument(object):
                        default = None

            if default is not None:
-                if callable(default):
-                    default = default()
+                default = default() if callable(default) else default

-            if default != value:
+            if value != default:
                continue
            del set_data[path]
@ -692,7 +692,7 @@ class BaseDocument(object):
        fields = cls._fields
        if not _auto_dereference:
-            fields = copy.copy(fields)
+            fields = copy.deepcopy(fields)

        for field_name, field in fields.iteritems():
            field._auto_dereference = _auto_dereference
@ -1083,6 +1083,6 @@ class BaseDocument(object):
            sep = getattr(field, 'display_sep', ' ')
            values = value if field.__class__.__name__ in ('ListField', 'SortedListField') else [value]
            return sep.join([
-                dict(field.choices).get(val, val)
+                six.text_type(dict(field.choices).get(val, val))
                for val in values or []])
        return value


@ -55,7 +55,7 @@ class BaseField(object):
        field. Generally this is deprecated in favour of the
        `FIELD.validate` method
    :param choices: (optional) The valid choices
-    :param null: (optional) Is the field value can be null. If no and there is a default value
+    :param null: (optional) If the field value can be null. If no and there is a default value
        then the default value is set
    :param sparse: (optional) `sparse=True` combined with `unique=True` and `required=False`
        means that uniqueness won't be enforced for `None` values
@ -130,7 +130,6 @@ class BaseField(object):
def __set__(self, instance, value): def __set__(self, instance, value):
"""Descriptor for assigning a value to a field in a document. """Descriptor for assigning a value to a field in a document.
""" """
# If setting to None and there is a default # If setting to None and there is a default
# Then set the value to the default value # Then set the value to the default value
if value is None: if value is None:
@ -267,13 +266,15 @@ class ComplexBaseField(BaseField):
        ReferenceField = _import_class('ReferenceField')
        GenericReferenceField = _import_class('GenericReferenceField')
        EmbeddedDocumentListField = _import_class('EmbeddedDocumentListField')
-        dereference = (self._auto_dereference and
+
+        auto_dereference = instance._fields[self.name]._auto_dereference
+        dereference = (auto_dereference and
                       (self.field is None or isinstance(self.field,
                                                         (GenericReferenceField, ReferenceField))))
+
        _dereference = _import_class('DeReference')()
-        self._auto_dereference = instance._fields[self.name]._auto_dereference
        if instance._initialised and dereference and instance._data.get(self.name):
            instance._data[self.name] = _dereference(
                instance._data.get(self.name), max_depth=1, instance=instance,
@ -294,7 +295,7 @@ class ComplexBaseField(BaseField):
            value = BaseDict(value, instance, self.name)
            instance._data[self.name] = value

-        if (self._auto_dereference and instance._initialised and
+        if (auto_dereference and instance._initialised and
                isinstance(value, (BaseList, BaseDict)) and
                not value._dereferenced):
            value = _dereference(
@ -313,11 +314,16 @@ class ComplexBaseField(BaseField):
        if hasattr(value, 'to_python'):
            return value.to_python()

+        BaseDocument = _import_class('BaseDocument')
+        if isinstance(value, BaseDocument):
+            # Something is wrong, return the value as it is
+            return value
+
        is_list = False
        if not hasattr(value, 'items'):
            try:
                is_list = True
-                value = {k: v for k, v in enumerate(value)}
+                value = {idx: v for idx, v in enumerate(value)}
            except TypeError:  # Not iterable return the value
                return value
@ -502,7 +508,7 @@ class GeoJsonBaseField(BaseField):
    def validate(self, value):
        """Validate the GeoJson object based on its type."""
        if isinstance(value, dict):
-            if set(value.keys()) == set(['type', 'coordinates']):
+            if set(value.keys()) == {'type', 'coordinates'}:
                if value['type'] != self._type:
                    self.error('%s type must be "%s"' %
                               (self._name, self._type))


@ -18,14 +18,14 @@ class DocumentMetaclass(type):
    """Metaclass for all documents."""

    # TODO lower complexity of this method
-    def __new__(cls, name, bases, attrs):
-        flattened_bases = cls._get_bases(bases)
-        super_new = super(DocumentMetaclass, cls).__new__
+    def __new__(mcs, name, bases, attrs):
+        flattened_bases = mcs._get_bases(bases)
+        super_new = super(DocumentMetaclass, mcs).__new__

        # If a base class just call super
        metaclass = attrs.get('my_metaclass')
        if metaclass and issubclass(metaclass, DocumentMetaclass):
-            return super_new(cls, name, bases, attrs)
+            return super_new(mcs, name, bases, attrs)

        attrs['_is_document'] = attrs.get('_is_document', False)
        attrs['_cached_reference_fields'] = []
@ -121,7 +121,8 @@ class DocumentMetaclass(type):
            # inheritance of classes where inheritance is set to False
            allow_inheritance = base._meta.get('allow_inheritance')
            if not allow_inheritance and not base._meta.get('abstract'):
-                raise ValueError('Document %s may not be subclassed' %
+                raise ValueError('Document %s may not be subclassed. '
+                                 'To enable inheritance, use the "allow_inheritance" meta attribute.' %
                                 base.__name__)

        # Get superclasses from last base superclass
@ -138,7 +139,7 @@ class DocumentMetaclass(type):
        attrs['_types'] = attrs['_subclasses']  # TODO depreciate _types

        # Create the new_class
-        new_class = super_new(cls, name, bases, attrs)
+        new_class = super_new(mcs, name, bases, attrs)

        # Set _subclasses
        for base in document_bases:
@ -147,7 +148,7 @@ class DocumentMetaclass(type):
            base._types = base._subclasses   # TODO depreciate _types

        (Document, EmbeddedDocument, DictField,
-         CachedReferenceField) = cls._import_classes()
+         CachedReferenceField) = mcs._import_classes()

        if issubclass(new_class, Document):
            new_class._collection = None
@ -219,29 +220,26 @@ class DocumentMetaclass(type):
        return new_class

-    def add_to_class(self, name, value):
-        setattr(self, name, value)
-
    @classmethod
-    def _get_bases(cls, bases):
+    def _get_bases(mcs, bases):
        if isinstance(bases, BasesTuple):
            return bases
        seen = []
-        bases = cls.__get_bases(bases)
+        bases = mcs.__get_bases(bases)
        unique_bases = (b for b in bases if not (b in seen or seen.append(b)))
        return BasesTuple(unique_bases)

    @classmethod
-    def __get_bases(cls, bases):
+    def __get_bases(mcs, bases):
        for base in bases:
            if base is object:
                continue
            yield base
-            for child_base in cls.__get_bases(base.__bases__):
+            for child_base in mcs.__get_bases(base.__bases__):
                yield child_base

    @classmethod
-    def _import_classes(cls):
+    def _import_classes(mcs):
        Document = _import_class('Document')
        EmbeddedDocument = _import_class('EmbeddedDocument')
        DictField = _import_class('DictField')
@ -254,9 +252,9 @@ class TopLevelDocumentMetaclass(DocumentMetaclass):
    collection in the database.
    """

-    def __new__(cls, name, bases, attrs):
-        flattened_bases = cls._get_bases(bases)
-        super_new = super(TopLevelDocumentMetaclass, cls).__new__
+    def __new__(mcs, name, bases, attrs):
+        flattened_bases = mcs._get_bases(bases)
+        super_new = super(TopLevelDocumentMetaclass, mcs).__new__

        # Set default _meta data if base class, otherwise get user defined meta
        if attrs.get('my_metaclass') == TopLevelDocumentMetaclass:
@ -319,7 +317,7 @@ class TopLevelDocumentMetaclass(DocumentMetaclass):
                    not parent_doc_cls._meta.get('abstract', False)):
                msg = 'Abstract document cannot have non-abstract base'
                raise ValueError(msg)
-            return super_new(cls, name, bases, attrs)
+            return super_new(mcs, name, bases, attrs)

        # Merge base class metas.
        # Uses a special MetaDict that handles various merging rules
@ -360,7 +358,7 @@ class TopLevelDocumentMetaclass(DocumentMetaclass):
        attrs['_meta'] = meta

        # Call super and get the new class
-        new_class = super_new(cls, name, bases, attrs)
+        new_class = super_new(mcs, name, bases, attrs)

        meta = new_class._meta
@ -394,7 +392,7 @@ class TopLevelDocumentMetaclass(DocumentMetaclass):
            '_auto_id_field', False)
        if not new_class._meta.get('id_field'):
            # After 0.10, find not existing names, instead of overwriting
-            id_name, id_db_name = cls.get_auto_id_names(new_class)
+            id_name, id_db_name = mcs.get_auto_id_names(new_class)
            new_class._auto_id_field = True
            new_class._meta['id_field'] = id_name
            new_class._fields[id_name] = ObjectIdField(db_field=id_db_name)
@ -419,7 +417,7 @@ class TopLevelDocumentMetaclass(DocumentMetaclass):
        return new_class

    @classmethod
-    def get_auto_id_names(cls, new_class):
+    def get_auto_id_names(mcs, new_class):
        id_name, id_db_name = ('id', '_id')
        if id_name not in new_class._fields and \
                id_db_name not in (v.db_field for v in new_class._fields.values()):

mongoengine/base/utils.py (new file)

@ -0,0 +1,22 @@
import re
class LazyRegexCompiler(object):
"""Descriptor to allow lazy compilation of regex"""
def __init__(self, pattern, flags=0):
self._pattern = pattern
self._flags = flags
self._compiled_regex = None
@property
def compiled_regex(self):
if self._compiled_regex is None:
self._compiled_regex = re.compile(self._pattern, self._flags)
return self._compiled_regex
def __get__(self, instance, owner):
return self.compiled_regex
def __set__(self, instance, value):
raise AttributeError("Can not set attribute LazyRegexCompiler")
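The descriptor above defers `re.compile` until the attribute is first read, so modules defining many regex-backed fields import faster. A self-contained usage sketch (the `Validator` class and its pattern are hypothetical, not part of the diff):

```python
import re


class LazyRegexCompiler(object):
    """Descriptor that defers re.compile until the attribute is first read."""

    def __init__(self, pattern, flags=0):
        self._pattern = pattern
        self._flags = flags
        self._compiled_regex = None

    @property
    def compiled_regex(self):
        # Compile on first access, then reuse the cached pattern object.
        if self._compiled_regex is None:
            self._compiled_regex = re.compile(self._pattern, self._flags)
        return self._compiled_regex

    def __get__(self, instance, owner):
        return self.compiled_regex


class Validator(object):
    # Nothing is compiled at class-creation time; the pattern is compiled
    # on first attribute access and cached for every later read.
    EMAIL_REGEX = LazyRegexCompiler(r'^[^@\s]+@[^@\s]+$')


assert Validator.EMAIL_REGEX.match('user@example.com') is not None
assert Validator.EMAIL_REGEX.match('not-an-email') is None
```

Because `compiled_regex` caches its result, repeated attribute reads return the very same compiled pattern object.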


@ -104,6 +104,18 @@ def register_connection(alias, db=None, name=None, host=None, port=None,
                conn_settings['authentication_source'] = uri_options['authsource']
            if 'authmechanism' in uri_options:
                conn_settings['authentication_mechanism'] = uri_options['authmechanism']
+            if IS_PYMONGO_3 and 'readpreference' in uri_options:
+                read_preferences = (
+                    ReadPreference.NEAREST,
+                    ReadPreference.PRIMARY,
+                    ReadPreference.PRIMARY_PREFERRED,
+                    ReadPreference.SECONDARY,
+                    ReadPreference.SECONDARY_PREFERRED)
+                read_pf_mode = uri_options['readpreference'].lower()
+                for preference in read_preferences:
+                    if preference.name.lower() == read_pf_mode:
+                        conn_settings['read_preference'] = preference
+                        break
        else:
            resolved_hosts.append(entity)
    conn_settings['host'] = resolved_hosts
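The added block matches the `readPreference` URI option case-insensitively against the PyMongo constants' `.name` attributes. A standalone sketch of that matching, using stand-in constants (the real `ReadPreference` objects carry more state, and the names here are illustrative):

```python
from collections import namedtuple

# Stand-ins for pymongo's ReadPreference constants; only .name matters
# for the matching performed in register_connection above.
Pref = namedtuple('Pref', 'name')
READ_PREFERENCES = (
    Pref('nearest'),
    Pref('primary'),
    Pref('primaryPreferred'),
    Pref('secondary'),
    Pref('secondaryPreferred'),
)


def pick_read_preference(uri_value):
    # Case-insensitive match of the URI option against the constant names;
    # returns None when nothing matches, leaving conn_settings untouched.
    mode = uri_value.lower()
    for preference in READ_PREFERENCES:
        if preference.name.lower() == mode:
            return preference
    return None


assert pick_read_preference('SecondaryPreferred').name == 'secondaryPreferred'
assert pick_read_preference('bogus') is None
```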


@ -145,66 +145,85 @@ class no_sub_classes(object):
        :param cls: the class to turn querying sub classes on
        """
        self.cls = cls
+        self.cls_initial_subclasses = None

    def __enter__(self):
        """Change the objects default and _auto_dereference values."""
-        self.cls._all_subclasses = self.cls._subclasses
-        self.cls._subclasses = (self.cls,)
+        self.cls_initial_subclasses = self.cls._subclasses
+        self.cls._subclasses = (self.cls._class_name,)
        return self.cls

    def __exit__(self, t, value, traceback):
        """Reset the default and _auto_dereference values."""
-        self.cls._subclasses = self.cls._all_subclasses
-        delattr(self.cls, '_all_subclasses')
-        return self.cls
+        self.cls._subclasses = self.cls_initial_subclasses
class query_counter(object):
-    """Query_counter context manager to get the number of queries."""
+    """Query_counter context manager to get the number of queries.
+
+    This works by updating the `profiling_level` of the database so that all queries get logged,
+    resetting the db.system.profile collection at the beginning of the context and counting the new entries.
+
+    This was designed for debugging purposes. In fact it is a global counter, so queries issued by other
+    threads/processes can interfere with it.
+
+    Be aware that:
+    - Iterating over a large amount of documents (>101) makes pymongo issue `getmore` queries to fetch
+      the next batch of documents (https://docs.mongodb.com/manual/tutorial/iterate-a-cursor/#cursor-batches)
+    - Some queries are ignored by default by the counter (killcursors, db.system.indexes)
+    """

    def __init__(self):
-        """Construct the query_counter."""
-        self.counter = 0
+        """Construct the query_counter"""
        self.db = get_db()
+        self.initial_profiling_level = None
+        self._ctx_query_counter = 0  # number of queries issued by the context
+
+        self._ignored_query = {
+            'ns':
+                {'$ne': '%s.system.indexes' % self.db.name},
+            'op':  # MONGODB < 3.2
+                {'$ne': 'killcursors'},
+            'command.killCursors':  # MONGODB >= 3.2
+                {'$exists': False}
+        }

-    def __enter__(self):
-        """On every with block we need to drop the profile collection."""
+    def _turn_on_profiling(self):
+        self.initial_profiling_level = self.db.profiling_level()
        self.db.set_profiling_level(0)
        self.db.system.profile.drop()
        self.db.set_profiling_level(2)
+
+    def _resets_profiling(self):
+        self.db.set_profiling_level(self.initial_profiling_level)
+
+    def __enter__(self):
+        self._turn_on_profiling()
        return self

    def __exit__(self, t, value, traceback):
-        """Reset the profiling level."""
-        self.db.set_profiling_level(0)
+        self._resets_profiling()

    def __eq__(self, value):
-        """== Compare querycounter."""
        counter = self._get_count()
        return value == counter

    def __ne__(self, value):
-        """!= Compare querycounter."""
        return not self.__eq__(value)

    def __lt__(self, value):
-        """< Compare querycounter."""
        return self._get_count() < value

    def __le__(self, value):
-        """<= Compare querycounter."""
        return self._get_count() <= value

    def __gt__(self, value):
-        """> Compare querycounter."""
        return self._get_count() > value

    def __ge__(self, value):
-        """>= Compare querycounter."""
        return self._get_count() >= value

    def __int__(self):
-        """int representation."""
        return self._get_count()

    def __repr__(self):
@ -212,10 +231,12 @@ class query_counter(object):
        return u"%s" % self._get_count()

    def _get_count(self):
-        """Get the number of queries."""
-        ignore_query = {'ns': {'$ne': '%s.system.indexes' % self.db.name}}
-        count = self.db.system.profile.find(ignore_query).count() - self.counter
-        self.counter += 1
+        """Get the number of queries by counting the current number of entries in db.system.profile
+        and subtracting the queries issued by this context. In fact every time this is called, 1 query
+        is issued, so we need to balance that.
+        """
+        count = self.db.system.profile.find(self._ignored_query).count() - self._ctx_query_counter
+        self._ctx_query_counter += 1  # Account for the query we just issued to gather the information
        return count
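The self-correcting arithmetic in `_get_count` can be modelled without a database: each count query is itself profiled but only becomes visible to later calls, so previously issued counter queries must be subtracted. A minimal sketch (all names here are illustrative, not mongoengine API):

```python
profile = []  # stand-in for the db.system.profile collection


def app_query():
    profile.append('app')  # a profiled application query


class CounterSketch(object):
    """Models query_counter._get_count bookkeeping (no MongoDB needed)."""

    def __init__(self):
        self._ctx_query_counter = 0  # queries issued by the counter itself

    def get_count(self):
        # The find() on system.profile is profiled too, but only becomes
        # visible to later calls; subtract the earlier counter queries.
        count = len(profile) - self._ctx_query_counter
        profile.append('counter')  # this call's own query gets logged
        self._ctx_query_counter += 1
        return count


counter = CounterSketch()
app_query()
app_query()
assert counter.get_count() == 2
assert counter.get_count() == 2  # stable across repeated checks
```

The second assertion is the point of the bookkeeping: without subtracting `_ctx_query_counter`, every check would inflate the next one.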


@ -52,26 +52,40 @@ class DeReference(object):
[i.__class__ == doc_type for i in items.values()]): [i.__class__ == doc_type for i in items.values()]):
return items return items
elif not field.dbref: elif not field.dbref:
# We must turn the ObjectIds into DBRefs
# Recursively dig into the sub items of a list/dict
# to turn the ObjectIds into DBRefs
def _get_items_from_list(items):
new_items = []
for v in items:
value = v
if isinstance(v, dict):
value = _get_items_from_dict(v)
elif isinstance(v, list):
value = _get_items_from_list(v)
elif not isinstance(v, (DBRef, Document)):
value = field.to_python(v)
new_items.append(value)
return new_items
def _get_items_from_dict(items):
new_items = {}
for k, v in items.iteritems():
value = v
if isinstance(v, list):
value = _get_items_from_list(v)
elif isinstance(v, dict):
value = _get_items_from_dict(v)
elif not isinstance(v, (DBRef, Document)):
value = field.to_python(v)
new_items[k] = value
return new_items
if not hasattr(items, 'items'): if not hasattr(items, 'items'):
items = _get_items_from_list(items)
def _get_items(items):
new_items = []
for v in items:
if isinstance(v, list):
new_items.append(_get_items(v))
elif not isinstance(v, (DBRef, Document)):
new_items.append(field.to_python(v))
else:
new_items.append(v)
return new_items
items = _get_items(items)
else: else:
items = { items = _get_items_from_dict(items)
k: (v if isinstance(v, (DBRef, Document))
else field.to_python(v))
for k, v in items.iteritems()
}
self.reference_map = self._find_references(items) self.reference_map = self._find_references(items)
self.object_map = self._fetch_objects(doc_type=doc_type) self.object_map = self._fetch_objects(doc_type=doc_type)
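The mutually recursive helpers walk arbitrarily nested list/dict structures and convert only the leaves. A runnable sketch of that shape (the reference-type check is dropped for brevity, and `to_python` stands in for `field.to_python`):

```python
def convert_list(items, to_python):
    # Walk nested lists/dicts and convert leaf values, in the spirit of
    # _get_items_from_list above.
    out = []
    for v in items:
        if isinstance(v, dict):
            out.append(convert_dict(v, to_python))
        elif isinstance(v, list):
            out.append(convert_list(v, to_python))
        else:
            out.append(to_python(v))
    return out


def convert_dict(items, to_python):
    # Counterpart of _get_items_from_dict: recurse into containers,
    # convert everything else.
    out = {}
    for k, v in items.items():
        if isinstance(v, list):
            out[k] = convert_list(v, to_python)
        elif isinstance(v, dict):
            out[k] = convert_dict(v, to_python)
        else:
            out[k] = to_python(v)
    return out


assert convert_dict({'a': [1, {'b': 2}]}, str) == {'a': ['1', {'b': '2'}]}
```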
@ -133,7 +147,12 @@ class DeReference(object):
        """
        object_map = {}
        for collection, dbrefs in self.reference_map.iteritems():
-            if hasattr(collection, 'objects'):  # We have a document class for the refs
+
+            # we use getattr instead of hasattr because hasattr swallows any exception under python2
+            # so it could hide nasty things without raising exceptions (cf. bug #1688)
+            ref_document_cls_exists = (getattr(collection, 'objects', None) is not None)
+
+            if ref_document_cls_exists:
                col_name = collection._get_collection_name()
                refs = [dbref for dbref in dbrefs
                        if (col_name, dbref) not in object_map]
@ -141,7 +160,7 @@ class DeReference(object):
                for key, doc in references.iteritems():
                    object_map[(col_name, key)] = doc
            else:  # Generic reference: use the refs data to convert to document
-                if isinstance(doc_type, (ListField, DictField, MapField,)):
+                if isinstance(doc_type, (ListField, DictField, MapField)):
                    continue
                refs = [dbref for dbref in dbrefs


@ -12,7 +12,9 @@ from mongoengine.base import (BaseDict, BaseDocument, BaseList,
TopLevelDocumentMetaclass, get_document) TopLevelDocumentMetaclass, get_document)
from mongoengine.common import _import_class from mongoengine.common import _import_class
from mongoengine.connection import DEFAULT_CONNECTION_NAME, get_db from mongoengine.connection import DEFAULT_CONNECTION_NAME, get_db
from mongoengine.context_managers import switch_collection, switch_db from mongoengine.context_managers import (set_write_concern,
switch_collection,
switch_db)
from mongoengine.errors import (InvalidDocumentError, InvalidQueryError, from mongoengine.errors import (InvalidDocumentError, InvalidQueryError,
SaveConditionError) SaveConditionError)
from mongoengine.python_support import IS_PYMONGO_3 from mongoengine.python_support import IS_PYMONGO_3
@ -39,7 +41,7 @@ class InvalidCollectionError(Exception):
pass pass
-class EmbeddedDocument(BaseDocument):
+class EmbeddedDocument(six.with_metaclass(DocumentMetaclass, BaseDocument)):
    """A :class:`~mongoengine.Document` that isn't stored in its own
    collection. :class:`~mongoengine.EmbeddedDocument`\ s should be used as
    fields on :class:`~mongoengine.Document`\ s through the
@ -58,7 +60,6 @@ class EmbeddedDocument(BaseDocument):
    # The __metaclass__ attribute is removed by 2to3 when running with Python3
    # my_metaclass is defined so that metaclass can be queried in Python 2 & 3
    my_metaclass = DocumentMetaclass
-    __metaclass__ = DocumentMetaclass
# A generic embedded document doesn't have any immutable properties # A generic embedded document doesn't have any immutable properties
# that describe it uniquely, hence it shouldn't be hashable. You can # that describe it uniquely, hence it shouldn't be hashable. You can
@ -95,7 +96,7 @@ class EmbeddedDocument(BaseDocument):
self._instance.reload(*args, **kwargs) self._instance.reload(*args, **kwargs)
-class Document(BaseDocument):
+class Document(six.with_metaclass(TopLevelDocumentMetaclass, BaseDocument)):
    """The base class used for defining the structure and properties of
    collections of documents stored in MongoDB. Inherit from this class, and
    add fields as class attributes to define a document's structure.
@ -150,7 +151,6 @@ class Document(BaseDocument):
    # The __metaclass__ attribute is removed by 2to3 when running with Python3
    # my_metaclass is defined so that metaclass can be queried in Python 2 & 3
    my_metaclass = TopLevelDocumentMetaclass
-    __metaclass__ = TopLevelDocumentMetaclass

    __slots__ = ('__objects',)
@ -172,8 +172,8 @@ class Document(BaseDocument):
        """
        if self.pk is None:
            return super(BaseDocument, self).__hash__()
-        else:
-            return hash(self.pk)
+
+        return hash(self.pk)

    @classmethod
    def _get_db(cls):
@ -370,6 +370,8 @@ class Document(BaseDocument):
        signals.pre_save_post_validation.send(self.__class__, document=self,
                                              created=created, **signal_kwargs)
+        # it might be refreshed by the pre_save_post_validation hook, e.g., for etag generation
+        doc = self.to_mongo()

        if self._meta.get('auto_create_index', True):
            self.ensure_indexes()
@ -429,11 +431,18 @@ class Document(BaseDocument):
        Helper method, should only be used inside save().
        """
        collection = self._get_collection()
-        if force_insert:
-            return collection.insert(doc, **write_concern)
-
-        object_id = collection.save(doc, **write_concern)
+        with set_write_concern(collection, write_concern) as wc_collection:
+            if force_insert:
+                return wc_collection.insert_one(doc).inserted_id
+            # insert_one will provoke UniqueError where save() does not;
+            # therefore we need to catch it and call replace_one.
+            if '_id' in doc:
+                raw_object = wc_collection.find_one_and_replace(
+                    {'_id': doc['_id']}, doc)
+                if raw_object:
+                    return doc['_id']
+
+            object_id = wc_collection.insert_one(doc).inserted_id

        # In PyMongo 3.0, the save() call calls internally the _update() call
        # but they forget to return the _id value passed back, therefore getting it back here
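The replacement flow above (replace by `_id` when the document already exists, otherwise insert) can be sketched against an in-memory stand-in. All names here are illustrative; the real code also wraps the collection in `set_write_concern`, and MongoDB would generate a missing `_id`, which this toy collection does not:

```python
class FakeCollection(object):
    """Minimal in-memory stand-in for a PyMongo collection (illustration only)."""

    def __init__(self):
        self.docs = {}

    def insert_one(self, doc):
        self.docs[doc['_id']] = doc
        # Mimic pymongo's InsertOneResult via a throwaway class attribute.
        return type('Result', (), {'inserted_id': doc['_id']})

    def find_one_and_replace(self, filter_doc, doc):
        # Returns the previous document, or None when nothing matched.
        old = self.docs.get(filter_doc['_id'])
        if old is not None:
            self.docs[filter_doc['_id']] = doc
        return old


def save_doc(collection, doc, force_insert=False):
    # Mirrors the insert-or-replace flow in the diff: replace an existing
    # document by _id when possible, otherwise fall back to insert_one.
    if force_insert:
        return collection.insert_one(doc).inserted_id
    if '_id' in doc:
        if collection.find_one_and_replace({'_id': doc['_id']}, doc) is not None:
            return doc['_id']
    return collection.insert_one(doc).inserted_id


col = FakeCollection()
assert save_doc(col, {'_id': 1, 'x': 1}) == 1  # first save inserts
assert save_doc(col, {'_id': 1, 'x': 2}) == 1  # second save replaces
```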
@ -585,9 +594,8 @@ class Document(BaseDocument):
        :param signal_kwargs: (optional) kwargs dictionary to be passed to
            the signal calls.
-        :param write_concern: Extra keyword arguments are passed down which
-            will be used as options for the resultant
-            ``getLastError`` command. For example,
-            ``save(..., write_concern={w: 2, fsync: True}, ...)`` will
+        :param write_concern: Extra keyword arguments are passed down which
+            will be used as options for the resultant ``getLastError`` command.
+            For example, ``save(..., w: 2, fsync: True)`` will
            wait until at least two servers have recorded the write and
            will force an fsync on the primary server.
@ -997,7 +1005,7 @@ class Document(BaseDocument):
        return {'missing': missing, 'extra': extra}


-class DynamicDocument(Document):
+class DynamicDocument(six.with_metaclass(TopLevelDocumentMetaclass, Document)):
    """A Dynamic Document class allowing flexible, expandable and uncontrolled
    schemas. As a :class:`~mongoengine.Document` subclass, acts in the same
    way as an ordinary document but has expanded style properties. Any data
@ -1014,7 +1022,6 @@ class DynamicDocument(Document):
    # The __metaclass__ attribute is removed by 2to3 when running with Python3
    # my_metaclass is defined so that metaclass can be queried in Python 2 & 3
    my_metaclass = TopLevelDocumentMetaclass
-    __metaclass__ = TopLevelDocumentMetaclass

    _dynamic = True
@ -1030,7 +1037,7 @@ class DynamicDocument(Document):
        super(DynamicDocument, self).__delattr__(*args, **kwargs)


-class DynamicEmbeddedDocument(EmbeddedDocument):
+class DynamicEmbeddedDocument(six.with_metaclass(DocumentMetaclass, EmbeddedDocument)):
    """A Dynamic Embedded Document class allowing flexible, expandable and
    uncontrolled schemas. See :class:`~mongoengine.DynamicDocument` for more
    information about dynamic documents.
@ -1039,7 +1046,6 @@ class DynamicEmbeddedDocument(EmbeddedDocument):
    # The __metaclass__ attribute is removed by 2to3 when running with Python3
    # my_metaclass is defined so that metaclass can be queried in Python 2 & 3
    my_metaclass = DocumentMetaclass
-    __metaclass__ = DocumentMetaclass

    _dynamic = True


@ -71,6 +71,7 @@ class ValidationError(AssertionError):
    _message = None

    def __init__(self, message='', **kwargs):
+        super(ValidationError, self).__init__(message)
        self.errors = kwargs.get('errors', {})
        self.field_name = kwargs.get('field_name')
        self.message = message


@ -5,7 +5,6 @@ import re
import socket
import time
import uuid
-import warnings
from operator import itemgetter

from bson import Binary, DBRef, ObjectId, SON
@ -25,15 +24,18 @@ try:
except ImportError:
    Int64 = long

from mongoengine.base import (BaseDocument, BaseField, ComplexBaseField,
                              GeoJsonBaseField, LazyReference, ObjectIdField,
                              get_document)
+from mongoengine.base.utils import LazyRegexCompiler
from mongoengine.common import _import_class
from mongoengine.connection import DEFAULT_CONNECTION_NAME, get_db
from mongoengine.document import Document, EmbeddedDocument
from mongoengine.errors import DoesNotExist, InvalidQueryError, ValidationError
from mongoengine.python_support import StringIO
-from mongoengine.queryset import DO_NOTHING, QuerySet
+from mongoengine.queryset import DO_NOTHING
+from mongoengine.queryset.base import BaseQuerySet

try:
    from PIL import Image, ImageOps
@@ -41,6 +43,12 @@ except ImportError:
     Image = None
     ImageOps = None

+if six.PY3:
+    # Useless as long as 2to3 gets executed
+    # as it turns `long` into `int` blindly
+    long = int
+
+
 __all__ = (
     'StringField', 'URLField', 'EmailField', 'IntField', 'LongField',
     'FloatField', 'DecimalField', 'BooleanField', 'DateTimeField', 'DateField',
@@ -123,9 +131,9 @@ class URLField(StringField):
     .. versionadded:: 0.3
     """

-    _URL_REGEX = re.compile(
+    _URL_REGEX = LazyRegexCompiler(
         r'^(?:[a-z0-9\.\-]*)://'  # scheme is validated separately
-        r'(?:(?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+(?:[A-Z]{2,6}\.?|[A-Z0-9-]{2,}(?<!-)\.?)|'  # domain...
+        r'(?:(?:[A-Z0-9](?:[A-Z0-9-_]{0,61}[A-Z0-9])?\.)+(?:[A-Z]{2,6}\.?|[A-Z0-9-]{2,}(?<!-)\.?)|'  # domain...
         r'localhost|'  # localhost...
         r'\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}|'  # ...or ipv4
         r'\[?[A-F0-9]*:[A-F0-9:]+\]?)'  # ...or ipv6
@@ -133,8 +141,7 @@ class URLField(StringField):
         r'(?:/?|[/?]\S+)$', re.IGNORECASE)
     _URL_SCHEMES = ['http', 'https', 'ftp', 'ftps']

-    def __init__(self, verify_exists=False, url_regex=None, schemes=None, **kwargs):
-        self.verify_exists = verify_exists
+    def __init__(self, url_regex=None, schemes=None, **kwargs):
         self.url_regex = url_regex or self._URL_REGEX
         self.schemes = schemes or self._URL_SCHEMES
         super(URLField, self).__init__(**kwargs)
@@ -157,7 +164,7 @@ class EmailField(StringField):
     .. versionadded:: 0.4
     """

-    USER_REGEX = re.compile(
+    USER_REGEX = LazyRegexCompiler(
         # `dot-atom` defined in RFC 5322 Section 3.2.3.
         r"(^[-!#$%&'*+/=?^_`{}|~0-9A-Z]+(\.[-!#$%&'*+/=?^_`{}|~0-9A-Z]+)*\Z"
         # `quoted-string` defined in RFC 5322 Section 3.2.4.
@@ -165,7 +172,7 @@ class EmailField(StringField):
         re.IGNORECASE
     )

-    UTF8_USER_REGEX = re.compile(
+    UTF8_USER_REGEX = LazyRegexCompiler(
         six.u(
             # RFC 6531 Section 3.3 extends `atext` (used by dot-atom) to
             # include `UTF8-non-ascii`.
@@ -175,7 +182,7 @@ class EmailField(StringField):
         ), re.IGNORECASE | re.UNICODE
     )

-    DOMAIN_REGEX = re.compile(
+    DOMAIN_REGEX = LazyRegexCompiler(
         r'((?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+)(?:[A-Z0-9-]{2,63}(?<!-))\Z',
         re.IGNORECASE
     )
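[Editor's note] The hunks above swap `re.compile` for `LazyRegexCompiler`, which defers regex compilation until a pattern is first used. The sketch below shows the general idea with a hypothetical descriptor class; the names and internals are illustrative assumptions, not MongoEngine's actual implementation:

```python
import re


class LazyRegex(object):
    """Descriptor that defers re.compile() until the pattern is first accessed.

    A sketch of the lazy-compilation idea; not MongoEngine's LazyRegexCompiler.
    """

    def __init__(self, pattern, flags=0):
        self._pattern = pattern
        self._flags = flags
        self._compiled = None  # compiled lazily on first access

    def __get__(self, instance, owner):
        if self._compiled is None:
            self._compiled = re.compile(self._pattern, self._flags)
        return self._compiled


class Validator(object):
    # Class attribute: nothing is compiled at import time,
    # which speeds up importing modules full of big regexes.
    DOMAIN = LazyRegex(r'^[a-z0-9.-]+$', re.IGNORECASE)


# The regex is only compiled here, on first attribute access.
match = Validator.DOMAIN.match('Example.COM')
```

The pay-off is at import time: a module defining many field classes no longer compiles every validation regex up front.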
@@ -267,14 +274,14 @@ class IntField(BaseField):
     def to_python(self, value):
         try:
             value = int(value)
-        except ValueError:
+        except (TypeError, ValueError):
             pass
         return value

     def validate(self, value):
         try:
             value = int(value)
-        except Exception:
+        except (TypeError, ValueError):
             self.error('%s could not be converted to int' % value)

         if self.min_value is not None and value < self.min_value:
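[Editor's note] The broadened `except (TypeError, ValueError)` matters because `int()` raises different exceptions for different bad inputs; catching only `ValueError` let values like `None` crash `to_python`. A standalone illustration of the coerce-or-passthrough behavior:

```python
def to_int_or_passthrough(value):
    """Mimic the field's to_python: coerce when possible, else return unchanged."""
    try:
        return int(value)
    except (TypeError, ValueError):
        # int('abc') raises ValueError; int(None) raises TypeError
        return value
```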
@@ -300,7 +307,7 @@ class LongField(BaseField):
     def to_python(self, value):
         try:
             value = long(value)
-        except ValueError:
+        except (TypeError, ValueError):
             pass
         return value
@@ -310,7 +317,7 @@ class LongField(BaseField):
     def validate(self, value):
         try:
             value = long(value)
-        except Exception:
+        except (TypeError, ValueError):
             self.error('%s could not be converted to long' % value)

         if self.min_value is not None and value < self.min_value:
@@ -364,7 +371,8 @@ class FloatField(BaseField):
 class DecimalField(BaseField):
-    """Fixed-point decimal number field.
+    """Fixed-point decimal number field. Stores the value as a float by default unless `force_string` is used.
+    If using floats, beware of Decimal to float conversion (potential precision loss)

     .. versionchanged:: 0.8
     .. versionadded:: 0.3
@@ -375,7 +383,9 @@ class DecimalField(BaseField):
         """
         :param min_value: Validation rule for the minimum acceptable value.
         :param max_value: Validation rule for the maximum acceptable value.
-        :param force_string: Store as a string.
+        :param force_string: Store the value as a string (instead of a float).
+            Be aware that this affects query sorting and operation like lte, gte (as string comparison is applied)
+            and some query operator won't work (e.g: inc, dec)
         :param precision: Number of decimal places to store.
         :param rounding: The rounding rule from the python decimal library:
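[Editor's note] The new `force_string` warning comes down to lexicographic vs numeric comparison: once decimals are stored as strings, MongoDB sorts and compares them character by character. A self-contained demonstration of the underlying pitfall:

```python
# Numeric comparison orders by magnitude; string comparison is lexicographic,
# which is what a sort over string-stored decimal values would effectively apply.
numeric_sorted = sorted([2, 10, 1])
string_sorted = sorted(['2', '10', '1'])

# '10' < '2' lexicographically, so string sorting misorders the numbers.
```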
@@ -406,7 +416,7 @@ class DecimalField(BaseField):
         # Convert to string for python 2.6 before casting to Decimal
         try:
             value = decimal.Decimal('%s' % value)
-        except decimal.InvalidOperation:
+        except (TypeError, ValueError, decimal.InvalidOperation):
             return value
         return value.quantize(decimal.Decimal('.%s' % ('0' * self.precision)), rounding=self.rounding)
@@ -423,7 +433,7 @@ class DecimalField(BaseField):
             value = six.text_type(value)
         try:
             value = decimal.Decimal(value)
-        except Exception as exc:
+        except (TypeError, ValueError, decimal.InvalidOperation) as exc:
             self.error('Could not convert value to decimal: %s' % exc)

         if self.min_value is not None and value < self.min_value:
@@ -462,6 +472,8 @@ class DateTimeField(BaseField):
     installed you can utilise it to convert varying types of date formats into valid
     python datetime objects.

+    Note: To default the field to the current datetime, use: DateTimeField(default=datetime.utcnow)
+
     Note: Microseconds are rounded to the nearest millisecond.
         Pre UTC microsecond support is effectively broken.
         Use :class:`~mongoengine.fields.ComplexDateTimeField` if you
@@ -557,11 +569,15 @@ class ComplexDateTimeField(StringField):
     The `,` as the separator can be easily modified by passing the `separator`
     keyword when initializing the field.

+    Note: To default the field to the current datetime, use: DateTimeField(default=datetime.utcnow)
+
     .. versionadded:: 0.5
     """

     def __init__(self, separator=',', **kwargs):
-        self.names = ['year', 'month', 'day', 'hour', 'minute', 'second', 'microsecond']
+        """
+        :param separator: Allows to customize the separator used for storage (default ``,``)
+        """
         self.separator = separator
         self.format = separator.join(['%Y', '%m', '%d', '%H', '%M', '%S', '%f'])
         super(ComplexDateTimeField, self).__init__(**kwargs)
@@ -588,20 +604,24 @@ class ComplexDateTimeField(StringField):
         >>> ComplexDateTimeField()._convert_from_string(a)
         datetime.datetime(2011, 6, 8, 20, 26, 24, 92284)
         """
-        values = map(int, data.split(self.separator))
+        values = [int(d) for d in data.split(self.separator)]
         return datetime.datetime(*values)

     def __get__(self, instance, owner):
+        if instance is None:
+            return self
+
         data = super(ComplexDateTimeField, self).__get__(instance, owner)
-        if data is None:
-            return None if self.null else datetime.datetime.now()
-        if isinstance(data, datetime.datetime):
+
+        if isinstance(data, datetime.datetime) or data is None:
             return data
         return self._convert_from_string(data)

     def __set__(self, instance, value):
-        value = self._convert_from_datetime(value) if value else value
-        return super(ComplexDateTimeField, self).__set__(instance, value)
+        super(ComplexDateTimeField, self).__set__(instance, value)
+        value = instance._data[self.name]
+        if value is not None:
+            instance._data[self.name] = self._convert_from_datetime(value)

     def validate(self, value):
         value = self.to_python(value)
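[Editor's note] `ComplexDateTimeField` stores datetimes as separator-joined strings so they sort correctly as plain strings. The roundtrip the hunk above exercises can be reproduced with nothing but `strftime`/`strptime`-style parsing; the helper names here are illustrative, not the field's API:

```python
import datetime

SEPARATOR = ','
FORMAT = SEPARATOR.join(['%Y', '%m', '%d', '%H', '%M', '%S', '%f'])


def to_string(dt):
    """Serialize like the field does: zero-padded, separator-joined components."""
    return dt.strftime(FORMAT)


def from_string(data):
    # Equivalent to the [int(d) for d in data.split(separator)] in the diff
    values = [int(d) for d in data.split(SEPARATOR)]
    return datetime.datetime(*values)


dt = datetime.datetime(2011, 6, 8, 20, 26, 24, 92284)
stored = to_string(dt)
```

Zero-padding each component is what makes lexicographic ordering of the stored strings match chronological ordering.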
@@ -645,9 +665,17 @@ class EmbeddedDocumentField(BaseField):
     def document_type(self):
         if isinstance(self.document_type_obj, six.string_types):
             if self.document_type_obj == RECURSIVE_REFERENCE_CONSTANT:
-                self.document_type_obj = self.owner_document
+                resolved_document_type = self.owner_document
             else:
-                self.document_type_obj = get_document(self.document_type_obj)
+                resolved_document_type = get_document(self.document_type_obj)
+
+            if not issubclass(resolved_document_type, EmbeddedDocument):
+                # Due to the late resolution of the document_type
+                # There is a chance that it won't be an EmbeddedDocument (#1661)
+                self.error('Invalid embedded document class provided to an '
+                           'EmbeddedDocumentField')
+            self.document_type_obj = resolved_document_type
+
         return self.document_type_obj

     def to_python(self, value):
@@ -824,8 +852,7 @@ class ListField(ComplexBaseField):
     def validate(self, value):
         """Make sure that a list of valid fields is being used."""
-        if (not isinstance(value, (list, tuple, QuerySet)) or
-                isinstance(value, six.string_types)):
+        if not isinstance(value, (list, tuple, BaseQuerySet)):
             self.error('Only lists and tuples may be used in a list field')
         super(ListField, self).validate(value)
@@ -932,14 +959,9 @@ class DictField(ComplexBaseField):
     .. versionchanged:: 0.5 - Can now handle complex / varying types of data
     """

-    def __init__(self, basecls=None, field=None, *args, **kwargs):
+    def __init__(self, field=None, *args, **kwargs):
         self.field = field
         self._auto_dereference = False
-        self.basecls = basecls or BaseField
-
-        # XXX ValidationError raised outside of the "validate" method.
-        if not issubclass(self.basecls, BaseField):
-            self.error('DictField only accepts dict values')

         kwargs.setdefault('default', lambda: {})
         super(DictField, self).__init__(*args, **kwargs)
@@ -959,7 +981,7 @@ class DictField(ComplexBaseField):
         super(DictField, self).validate(value)

     def lookup_member(self, member_name):
-        return DictField(basecls=self.basecls, db_field=member_name)
+        return DictField(db_field=member_name)

     def prepare_query_value(self, op, value):
         match_operators = ['contains', 'icontains', 'startswith',
@@ -969,7 +991,7 @@ class DictField(ComplexBaseField):
         if op in match_operators and isinstance(value, six.string_types):
             return StringField().prepare_query_value(op, value)

-        if hasattr(self.field, 'field'):
+        if hasattr(self.field, 'field'):  # Used for instance when using DictField(ListField(IntField()))
             if op in ('set', 'unset') and isinstance(value, dict):
                 return {
                     k: self.field.prepare_query_value(op, v)
@@ -1027,11 +1049,13 @@ class ReferenceField(BaseField):
     .. code-block:: python

-        class Bar(Document):
-            content = StringField()
-            foo = ReferenceField('Foo')
+        class Org(Document):
+            owner = ReferenceField('User')

-        Foo.register_delete_rule(Bar, 'foo', NULLIFY)
+        class User(Document):
+            org = ReferenceField('Org', reverse_delete_rule=CASCADE)
+
+        User.register_delete_rule(Org, 'owner', DENY)

     .. versionchanged:: 0.5 added `reverse_delete_rule`
     """
@@ -1079,9 +1103,9 @@ class ReferenceField(BaseField):
         # Get value from document instance if available
         value = instance._data.get(self.name)
-        self._auto_dereference = instance._fields[self.name]._auto_dereference
+        auto_dereference = instance._fields[self.name]._auto_dereference

         # Dereference DBRefs
-        if self._auto_dereference and isinstance(value, DBRef):
+        if auto_dereference and isinstance(value, DBRef):
             if hasattr(value, 'cls'):
                 # Dereference using the class type specified in the reference
                 cls = get_document(value.cls)
@@ -1152,16 +1176,6 @@ class ReferenceField(BaseField):
             self.error('You can only reference documents once they have been '
                        'saved to the database')

-        if (
-            self.document_type._meta.get('abstract') and
-            not isinstance(value, self.document_type)
-        ):
-            self.error(
-                '%s is not an instance of abstract reference type %s' % (
-                    self.document_type._class_name
-                )
-            )
-
     def lookup_member(self, member_name):
         return self.document_type._fields.get(member_name)
@@ -1242,9 +1256,10 @@ class CachedReferenceField(BaseField):
         # Get value from document instance if available
         value = instance._data.get(self.name)
-        self._auto_dereference = instance._fields[self.name]._auto_dereference
+        auto_dereference = instance._fields[self.name]._auto_dereference
+
         # Dereference DBRefs
-        if self._auto_dereference and isinstance(value, DBRef):
+        if auto_dereference and isinstance(value, DBRef):
             dereferenced = self.document_type._get_db().dereference(value)
             if dereferenced is None:
                 raise DoesNotExist('Trying to dereference unknown document %s' % value)
@@ -1377,8 +1392,8 @@ class GenericReferenceField(BaseField):
         value = instance._data.get(self.name)

-        self._auto_dereference = instance._fields[self.name]._auto_dereference
-        if self._auto_dereference and isinstance(value, (dict, SON)):
+        auto_dereference = instance._fields[self.name]._auto_dereference
+        if auto_dereference and isinstance(value, (dict, SON)):
             dereferenced = self.dereference(value)
             if dereferenced is None:
                 raise DoesNotExist('Trying to dereference unknown document %s' % value)
@@ -1460,10 +1475,10 @@ class BinaryField(BaseField):
         return Binary(value)

     def validate(self, value):
-        if not isinstance(value, (six.binary_type, six.text_type, Binary)):
+        if not isinstance(value, (six.binary_type, Binary)):
             self.error('BinaryField only accepts instances of '
                        '(%s, %s, Binary)' % (
-                           six.binary_type.__name__, six.text_type.__name__))
+                           six.binary_type.__name__, Binary.__name__))

         if self.max_bytes is not None and len(value) > self.max_bytes:
             self.error('Binary value is too long')
@@ -1508,9 +1523,11 @@ class GridFSProxy(object):
     def __get__(self, instance, value):
         return self

-    def __nonzero__(self):
+    def __bool__(self):
         return bool(self.grid_id)

+    __nonzero__ = __bool__  # For Py2 support
+
     def __getstate__(self):
         self_dict = self.__dict__
         self_dict['_fs'] = None
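[Editor's note] The `__nonzero__ = __bool__` pattern recurs throughout this diff: Python 3 consults `__bool__` for truthiness while Python 2 consults `__nonzero__`, so defining the logic once and aliasing it keeps one implementation for both. A minimal, self-contained version with a hypothetical stand-in class:

```python
class GridHandle(object):
    """Toy stand-in for GridFSProxy-style truthiness (illustrative class)."""

    def __init__(self, grid_id=None):
        self.grid_id = grid_id

    def __bool__(self):          # Python 3 truth protocol
        return bool(self.grid_id)

    __nonzero__ = __bool__       # Python 2 calls __nonzero__ instead
```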
@@ -1850,12 +1867,9 @@ class ImageField(FileField):
     """
     A Image File storage field.

-    @size (width, height, force):
-        max size to store images, if larger will be automatically resized
-        ex: size=(800, 600, True)
-
-    @thumbnail (width, height, force):
-        size to generate a thumbnail
+    :param size: max size to store images, provided as (width, height, force)
+        if larger, it will be automatically resized (ex: size=(800, 600, True))
+    :param thumbnail_size: size to generate a thumbnail, provided as (width, height, force)

     .. versionadded:: 0.6
     """
@@ -1926,8 +1940,7 @@ class SequenceField(BaseField):
         self.collection_name = collection_name or self.COLLECTION_NAME
         self.db_alias = db_alias or DEFAULT_CONNECTION_NAME
         self.sequence_name = sequence_name
-        self.value_decorator = (callable(value_decorator) and
-                                value_decorator or self.VALUE_DECORATOR)
+        self.value_decorator = value_decorator if callable(value_decorator) else self.VALUE_DECORATOR
         super(SequenceField, self).__init__(*args, **kwargs)

     def generate(self):
@@ -2036,7 +2049,7 @@ class UUIDField(BaseField):
                 if not isinstance(value, six.string_types):
                     value = six.text_type(value)
                 return uuid.UUID(value)
-            except Exception:
+            except (ValueError, TypeError, AttributeError):
                 return original_value
         return value
@@ -2058,7 +2071,7 @@ class UUIDField(BaseField):
                 value = str(value)
             try:
                 uuid.UUID(value)
-            except Exception as exc:
+            except (ValueError, TypeError, AttributeError) as exc:
                 self.error('Could not convert to UUID: %s' % exc)


@@ -6,11 +6,7 @@ import pymongo
 import six

-if pymongo.version_tuple[0] < 3:
-    IS_PYMONGO_3 = False
-else:
-    IS_PYMONGO_3 = True
+IS_PYMONGO_3 = pymongo.version_tuple[0] >= 3

 # six.BytesIO resolves to StringIO.StringIO in Py2 and io.BytesIO in Py3.
 StringIO = six.BytesIO
@@ -23,3 +19,10 @@ if not six.PY3:
         pass
     else:
         StringIO = cStringIO.StringIO
+
+
+if six.PY3:
+    from collections.abc import Hashable
+else:
+    # raises DeprecationWarnings in Python >=3.7
+    from collections import Hashable
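[Editor's note] The `Hashable` hunk reflects the relocation of the ABCs to `collections.abc` (importing them from `collections` warns on Python 3.7+ and breaks on 3.10+). The same guard works standalone with a plain version check instead of `six`:

```python
import sys

if sys.version_info[0] >= 3:
    from collections.abc import Hashable
else:
    # collections.Hashable raises DeprecationWarning on Python >= 3.7
    from collections import Hashable

# Tuples are hashable, lists are not
checks = (isinstance((1, 2), Hashable), isinstance([1, 2], Hashable))
```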


@@ -2,7 +2,6 @@ from __future__ import absolute_import

 import copy
 import itertools
-import operator
 import pprint
 import re
 import warnings
@@ -39,8 +38,6 @@ CASCADE = 2
 DENY = 3
 PULL = 4

-RE_TYPE = type(re.compile(''))
-

 class BaseQuerySet(object):
     """A set of results returned from a query. Wraps a MongoDB cursor,
@@ -209,14 +206,12 @@ class BaseQuerySet(object):
         queryset = self.order_by()
         return False if queryset.first() is None else True

-    def __nonzero__(self):
-        """Avoid to open all records in an if stmt in Py2."""
-        return self._has_data()
-
     def __bool__(self):
         """Avoid to open all records in an if stmt in Py3."""
         return self._has_data()

+    __nonzero__ = __bool__  # For Py2 support
+
     # Core functions

     def all(self):
@@ -269,13 +264,13 @@ class BaseQuerySet(object):
             queryset = queryset.filter(*q_objs, **query)

         try:
-            result = queryset.next()
+            result = six.next(queryset)
         except StopIteration:
             msg = ('%s matching query does not exist.'
                    % queryset._document._class_name)
             raise queryset._document.DoesNotExist(msg)
         try:
-            queryset.next()
+            six.next(queryset)
         except StopIteration:
             return result
@@ -359,7 +354,7 @@ class BaseQuerySet(object):
         try:
             inserted_result = insert_func(raw)
-            ids = return_one and [inserted_result.inserted_id] or inserted_result.inserted_ids
+            ids = [inserted_result.inserted_id] if return_one else inserted_result.inserted_ids
         except pymongo.errors.DuplicateKeyError as err:
             message = 'Could not save document (%s)'
             raise NotUniqueError(message % six.text_type(err))
@@ -377,17 +372,20 @@ class BaseQuerySet(object):
                 raise NotUniqueError(message % six.text_type(err))
             raise OperationError(message % six.text_type(err))

+        # Apply inserted_ids to documents
+        for doc, doc_id in zip(docs, ids):
+            doc.pk = doc_id
+
         if not load_bulk:
             signals.post_bulk_insert.send(
                 self._document, documents=docs, loaded=False, **signal_kwargs)
-            return return_one and ids[0] or ids
+            return ids[0] if return_one else ids

         documents = self.in_bulk(ids)
-        results = []
-        for obj_id in ids:
-            results.append(documents.get(obj_id))
+        results = [documents.get(obj_id) for obj_id in ids]
         signals.post_bulk_insert.send(
             self._document, documents=results, loaded=True, **signal_kwargs)
-        return return_one and results[0] or results
+        return results[0] if return_one else results
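[Editor's note] Several hunks in this diff replace the old `cond and a or b` idiom with a conditional expression. The old form silently returns `b` whenever `a` is falsy, which is exactly the bug class the rewrite removes; a standalone comparison:

```python
def pick_old(return_one, ids):
    # Pre-diff idiom: breaks when ids[0] is falsy (e.g. 0 or '')
    return return_one and ids[0] or ids


def pick_new(return_one, ids):
    # Post-diff conditional expression: correct even for falsy first elements
    return ids[0] if return_one else ids
```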
def count(self, with_limit_and_skip=False): def count(self, with_limit_and_skip=False):
"""Count the selected elements in the query. """Count the selected elements in the query.
@ -396,9 +394,11 @@ class BaseQuerySet(object):
:meth:`skip` that has been applied to this cursor into account when :meth:`skip` that has been applied to this cursor into account when
getting the count getting the count
""" """
if self._limit == 0 and with_limit_and_skip or self._none: if self._limit == 0 and with_limit_and_skip is False or self._none:
return 0 return 0
return self._cursor.count(with_limit_and_skip=with_limit_and_skip) count = self._cursor.count(with_limit_and_skip=with_limit_and_skip)
self._cursor_obj = None
return count
def delete(self, write_concern=None, _from_doc_delete=False, def delete(self, write_concern=None, _from_doc_delete=False,
cascade_refs=None): cascade_refs=None):
@@ -775,10 +775,11 @@ class BaseQuerySet(object):
         """Limit the number of returned documents to `n`. This may also be
         achieved using array-slicing syntax (e.g. ``User.objects[:5]``).

-        :param n: the maximum number of objects to return
+        :param n: the maximum number of objects to return if n is greater than 0.
+            When 0 is passed, returns all the documents in the cursor
         """
         queryset = self.clone()
-        queryset._limit = n if n != 0 else 1
+        queryset._limit = n

         # If a cursor object has already been created, apply the limit to it.
         if queryset._cursor_obj:
@@ -976,11 +977,10 @@ class BaseQuerySet(object):
         # explicitly included, and then more complicated operators such as
         # $slice.
         def _sort_key(field_tuple):
-            key, value = field_tuple
-            if isinstance(value, (int)):
+            _, value = field_tuple
+            if isinstance(value, int):
                 return value  # 0 for exclusion, 1 for inclusion
-            else:
-                return 2  # so that complex values appear last
+            return 2  # so that complex values appear last

         fields = sorted(cleaned_fields, key=_sort_key)
@@ -1477,13 +1477,13 @@ class BaseQuerySet(object):

     # Iterator helpers

-    def next(self):
+    def __next__(self):
         """Wrap the result in a :class:`~mongoengine.Document` object.
         """
         if self._limit == 0 or self._none:
             raise StopIteration

-        raw_doc = self._cursor.next()
+        raw_doc = six.next(self._cursor)

         if self._as_pymongo:
             return self._get_as_pymongo(raw_doc)
@@ -1497,6 +1497,8 @@ class BaseQuerySet(object):
         return doc

+    next = __next__  # For Python2 support
+
     def rewind(self):
         """Rewind the cursor to its unevaluated state.
@@ -1872,8 +1874,8 @@ class BaseQuerySet(object):
             # Substitute the correct name for the field into the javascript
             return '.'.join([f.db_field for f in fields])

-        code = re.sub(u'\[\s*~([A-z_][A-z_0-9.]+?)\s*\]', field_sub, code)
-        code = re.sub(u'\{\{\s*~([A-z_][A-z_0-9.]+?)\s*\}\}', field_path_sub,
+        code = re.sub(r'\[\s*~([A-z_][A-z_0-9.]+?)\s*\]', field_sub, code)
+        code = re.sub(r'\{\{\s*~([A-z_][A-z_0-9.]+?)\s*\}\}', field_path_sub,
                       code)
         return code


@@ -63,9 +63,11 @@ class QueryFieldList(object):
         self._only_called = True
         return self

-    def __nonzero__(self):
+    def __bool__(self):
         return bool(self.fields)

+    __nonzero__ = __bool__  # For Py2 support
+
     def as_dict(self):
         field_list = {field: self.value for field in self.fields}
         if self.slice:


@@ -36,7 +36,7 @@ class QuerySetManager(object):
         queryset_class = owner._meta.get('queryset_class', self.default)
         queryset = queryset_class(owner, owner._get_collection())
         if self.get_queryset:
-            arg_count = self.get_queryset.func_code.co_argcount
+            arg_count = self.get_queryset.__code__.co_argcount
             if arg_count == 1:
                 queryset = self.get_queryset(queryset)
             elif arg_count == 2:
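[Editor's note] The `func_code` → `__code__` change is needed because `func_code` was removed in Python 3, while `__code__` exists on functions in both Python 2.6+ and 3. The manager's argument-count introspection therefore keeps working unchanged:

```python
def get_queryset_one(queryset):
    return queryset


def get_queryset_two(doc_cls, queryset):
    return queryset


# co_argcount reports the number of positional parameters,
# which is how the manager decides whether to pass the document class.
counts = (get_queryset_one.__code__.co_argcount,
          get_queryset_two.__code__.co_argcount)
```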


@@ -89,7 +89,7 @@ class QuerySet(BaseQuerySet):
                 yield self._result_cache[pos]
                 pos += 1

-            # Raise StopIteration if we already established there were no more
+            # return if we already established there were no more
             # docs in the db cursor.
             if not self._has_more:
                 return
@@ -115,7 +115,7 @@ class QuerySet(BaseQuerySet):
         # the result cache.
         try:
             for _ in six.moves.range(ITER_CHUNK_SIZE):
-                self._result_cache.append(self.next())
+                self._result_cache.append(six.next(self))
         except StopIteration:
             # Getting this exception means there are no more docs in the
             # db cursor. Set _has_more to False so that we can use that
@@ -170,7 +170,7 @@ class QuerySetNoCache(BaseQuerySet):
         data = []
         for _ in six.moves.range(REPR_OUTPUT_SIZE + 1):
             try:
-                data.append(self.next())
+                data.append(six.next(self))
             except StopIteration:
                 break
@@ -186,10 +186,3 @@ class QuerySetNoCache(BaseQuerySet):
         queryset = self.clone()
         queryset.rewind()
         return queryset
-
-
-class QuerySetNoDeRef(QuerySet):
-    """Special no_dereference QuerySet"""
-
-    def __dereference(items, max_depth=1, instance=None, name=None):
-        return items


@@ -147,7 +147,7 @@ def query(_doc_cls=None, **kwargs):
         if op is None or key not in mongo_query:
             mongo_query[key] = value
         elif key in mongo_query:
-            if isinstance(mongo_query[key], dict):
+            if isinstance(mongo_query[key], dict) and isinstance(value, dict):
                 mongo_query[key].update(value)
                 # $max/minDistance needs to come last - convert to SON
                 value_dict = mongo_query[key]
@ -201,30 +201,37 @@ def update(_doc_cls=None, **update):
format. format.
""" """
mongo_update = {} mongo_update = {}
for key, value in update.items(): for key, value in update.items():
if key == '__raw__': if key == '__raw__':
mongo_update.update(value) mongo_update.update(value)
continue continue
parts = key.split('__') parts = key.split('__')
# if there is no operator, default to 'set' # if there is no operator, default to 'set'
if len(parts) < 3 and parts[0] not in UPDATE_OPERATORS: if len(parts) < 3 and parts[0] not in UPDATE_OPERATORS:
parts.insert(0, 'set') parts.insert(0, 'set')
# Check for an operator and transform to mongo-style if there is # Check for an operator and transform to mongo-style if there is
op = None op = None
if parts[0] in UPDATE_OPERATORS: if parts[0] in UPDATE_OPERATORS:
op = parts.pop(0) op = parts.pop(0)
# Convert Pythonic names to Mongo equivalents # Convert Pythonic names to Mongo equivalents
if op in ('push_all', 'pull_all'): operator_map = {
op = op.replace('_all', 'All') 'push_all': 'pushAll',
elif op == 'dec': 'pull_all': 'pullAll',
'dec': 'inc',
'add_to_set': 'addToSet',
'set_on_insert': 'setOnInsert'
}
if op == 'dec':
# Support decrement by flipping a positive value's sign # Support decrement by flipping a positive value's sign
# and using 'inc' # and using 'inc'
op = 'inc'
value = -value value = -value
elif op == 'add_to_set': # If the operator isn't found in the operator map, the op value
op = 'addToSet' # will stay unchanged
elif op == 'set_on_insert': op = operator_map.get(op, op)
op = 'setOnInsert'
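The refactor above replaces the if/elif chain with a single dict lookup. A standalone sketch of the translation (hypothetical `translate` helper, mirroring the mapping shown):

```python
# Map Pythonic update-operator names to their MongoDB spellings; unknown
# operators fall through unchanged, and 'dec' flips the value's sign
# before being rewritten to 'inc'.
operator_map = {
    'push_all': 'pushAll',
    'pull_all': 'pullAll',
    'dec': 'inc',
    'add_to_set': 'addToSet',
    'set_on_insert': 'setOnInsert',
}

def translate(op, value):
    if op == 'dec':
        value = -value
    return operator_map.get(op, op), value

print(translate('dec', 5))         # ('inc', -5)
print(translate('push_all', [1]))  # ('pushAll', [1])
print(translate('set', 'x'))       # ('set', 'x')
```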
match = None match = None
if parts[-1] in COMPARISON_OPERATORS: if parts[-1] in COMPARISON_OPERATORS:
@ -291,6 +298,8 @@ def update(_doc_cls=None, **update):
value = field.prepare_query_value(op, value) value = field.prepare_query_value(op, value)
elif op == 'unset': elif op == 'unset':
value = 1 value = 1
elif op == 'inc':
value = field.prepare_query_value(op, value)
if match: if match:
match = '$' + match match = '$' + match
@ -336,7 +345,7 @@ def update(_doc_cls=None, **update):
value = {key: {'$each': value}} value = {key: {'$each': value}}
elif op in ('push', 'pushAll'): elif op in ('push', 'pushAll'):
if parts[-1].isdigit(): if parts[-1].isdigit():
key = parts[0] key = '.'.join(parts[0:-1])
position = int(parts[-1]) position = int(parts[-1])
# $position expects an iterable. If pushing a single value, # $position expects an iterable. If pushing a single value,
# wrap it in a list. # wrap it in a list.
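The positional-push fix above matters for nested paths: for a key like `book__authors__1` the trailing digit is the array position, and the remaining parts must be rejoined with dots rather than keeping only the first part. A minimal sketch (hypothetical field path):

```python
# 'book__authors__1' splits into parts; the digit tail is the position and
# the rest rejoins as 'book.authors' -- previously only 'book' was kept,
# which dropped nested paths.
parts = ['book', 'authors', '1']
if parts[-1].isdigit():
    key = '.'.join(parts[0:-1])
    position = int(parts[-1])
print(key, position)  # book.authors 1
```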
@ -420,7 +429,6 @@ def _infer_geometry(value):
'type and coordinates keys') 'type and coordinates keys')
elif isinstance(value, (list, set)): elif isinstance(value, (list, set)):
# TODO: shouldn't we test value[0][0][0][0] to see if it is MultiPolygon? # TODO: shouldn't we test value[0][0][0][0] to see if it is MultiPolygon?
# TODO: should both TypeError and IndexError be alike interpreted?
try: try:
value[0][0][0] value[0][0][0]


@ -3,7 +3,7 @@ import copy
from mongoengine.errors import InvalidQueryError from mongoengine.errors import InvalidQueryError
from mongoengine.queryset import transform from mongoengine.queryset import transform
__all__ = ('Q',) __all__ = ('Q', 'QNode')
class QNodeVisitor(object): class QNodeVisitor(object):
@ -131,6 +131,10 @@ class QCombination(QNode):
else: else:
self.children.append(node) self.children.append(node)
def __repr__(self):
op = ' & ' if self.operation is self.AND else ' | '
return '(%s)' % op.join([repr(node) for node in self.children])
def accept(self, visitor): def accept(self, visitor):
for i in range(len(self.children)): for i in range(len(self.children)):
if isinstance(self.children[i], QNode): if isinstance(self.children[i], QNode):
@ -151,6 +155,9 @@ class Q(QNode):
def __init__(self, **query): def __init__(self, **query):
self.query = query self.query = query
def __repr__(self):
return 'Q(**%s)' % repr(self.query)
def accept(self, visitor): def accept(self, visitor):
return visitor.visit_query(self) return visitor.visit_query(self)
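The two `__repr__` methods added above make Q trees readable while debugging. A stripped-down standalone sketch (hypothetical minimal classes, not the real mongoengine implementations):

```python
class Q(object):
    """Minimal stand-in for mongoengine's Q, showing the new __repr__."""
    def __init__(self, **query):
        self.query = query

    def __repr__(self):
        return 'Q(**%s)' % repr(self.query)


class QCombination(object):
    """Minimal stand-in for QCombination, showing the new __repr__."""
    AND, OR = 0, 1

    def __init__(self, operation, children):
        self.operation = operation
        self.children = children

    def __repr__(self):
        op = ' & ' if self.operation == self.AND else ' | '
        return '(%s)' % op.join([repr(node) for node in self.children])


combo = QCombination(QCombination.AND, [Q(name='John'), Q(age=20)])
print(repr(combo))  # (Q(**{'name': 'John'}) & Q(**{'age': 20}))
```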


@ -5,7 +5,7 @@ detailed-errors=1
cover-package=mongoengine cover-package=mongoengine
[flake8] [flake8]
ignore=E501,F401,F403,F405,I201,I202 ignore=E501,F401,F403,F405,I201,I202,W504,W605
exclude=build,dist,docs,venv,venv3,.tox,.eggs,tests exclude=build,dist,docs,venv,venv3,.tox,.eggs,tests
max-complexity=47 max-complexity=47
application-import-names=mongoengine,tests application-import-names=mongoengine,tests


@ -44,9 +44,8 @@ CLASSIFIERS = [
"Programming Language :: Python :: 2", "Programming Language :: Python :: 2",
"Programming Language :: Python :: 2.7", "Programming Language :: Python :: 2.7",
"Programming Language :: Python :: 3", "Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.3",
"Programming Language :: Python :: 3.4",
"Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.5",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: Implementation :: CPython", "Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: Implementation :: PyPy", "Programming Language :: Python :: Implementation :: PyPy",
'Topic :: Database', 'Topic :: Database',


@ -1,4 +1,4 @@
from all_warnings import AllWarnings from .all_warnings import AllWarnings
from document import * from .document import *
from queryset import * from .queryset import *
from fields import * from .fields import *


@ -1,13 +1,13 @@
import unittest import unittest
from class_methods import * from .class_methods import *
from delta import * from .delta import *
from dynamic import * from .dynamic import *
from indexes import * from .indexes import *
from inheritance import * from .inheritance import *
from instance import * from .instance import *
from json_serialisation import * from .json_serialisation import *
from validation import * from .validation import *
if __name__ == '__main__': if __name__ == '__main__':
unittest.main() unittest.main()


@ -5,7 +5,7 @@ from mongoengine import *
from mongoengine.queryset import NULLIFY, PULL from mongoengine.queryset import NULLIFY, PULL
from mongoengine.connection import get_db from mongoengine.connection import get_db
from tests.utils import needs_mongodb_v26 from tests.utils import requires_mongodb_gte_26
__all__ = ("ClassMethodsTest", ) __all__ = ("ClassMethodsTest", )
@ -66,10 +66,10 @@ class ClassMethodsTest(unittest.TestCase):
""" """
collection_name = 'person' collection_name = 'person'
self.Person(name='Test').save() self.Person(name='Test').save()
self.assertTrue(collection_name in self.db.collection_names()) self.assertIn(collection_name, self.db.collection_names())
self.Person.drop_collection() self.Person.drop_collection()
self.assertFalse(collection_name in self.db.collection_names()) self.assertNotIn(collection_name, self.db.collection_names())
def test_register_delete_rule(self): def test_register_delete_rule(self):
"""Ensure that register delete rule adds a delete rule to the document """Ensure that register delete rule adds a delete rule to the document
@ -188,7 +188,7 @@ class ClassMethodsTest(unittest.TestCase):
self.assertEqual(BlogPostWithTags.compare_indexes(), { 'missing': [], 'extra': [] }) self.assertEqual(BlogPostWithTags.compare_indexes(), { 'missing': [], 'extra': [] })
self.assertEqual(BlogPostWithCustomField.compare_indexes(), { 'missing': [], 'extra': [] }) self.assertEqual(BlogPostWithCustomField.compare_indexes(), { 'missing': [], 'extra': [] })
@needs_mongodb_v26 @requires_mongodb_gte_26
def test_compare_indexes_for_text_indexes(self): def test_compare_indexes_for_text_indexes(self):
""" Ensure that compare_indexes behaves correctly for text indexes """ """ Ensure that compare_indexes behaves correctly for text indexes """
@ -340,7 +340,7 @@ class ClassMethodsTest(unittest.TestCase):
meta = {'collection': collection_name} meta = {'collection': collection_name}
Person(name="Test User").save() Person(name="Test User").save()
self.assertTrue(collection_name in self.db.collection_names()) self.assertIn(collection_name, self.db.collection_names())
user_obj = self.db[collection_name].find_one() user_obj = self.db[collection_name].find_one()
self.assertEqual(user_obj['name'], "Test User") self.assertEqual(user_obj['name'], "Test User")
@ -349,7 +349,7 @@ class ClassMethodsTest(unittest.TestCase):
self.assertEqual(user_obj.name, "Test User") self.assertEqual(user_obj.name, "Test User")
Person.drop_collection() Person.drop_collection()
self.assertFalse(collection_name in self.db.collection_names()) self.assertNotIn(collection_name, self.db.collection_names())
def test_collection_name_and_primary(self): def test_collection_name_and_primary(self):
"""Ensure that a collection with a specified name may be used. """Ensure that a collection with a specified name may be used.


@ -694,7 +694,7 @@ class DeltaTest(unittest.TestCase):
organization.employees.append(person) organization.employees.append(person)
updates, removals = organization._delta() updates, removals = organization._delta()
self.assertEqual({}, removals) self.assertEqual({}, removals)
self.assertTrue('employees' in updates) self.assertIn('employees', updates)
def test_delta_with_dbref_false(self): def test_delta_with_dbref_false(self):
person, organization, employee = self.circular_reference_deltas_2(Document, Document, False) person, organization, employee = self.circular_reference_deltas_2(Document, Document, False)
@ -709,7 +709,7 @@ class DeltaTest(unittest.TestCase):
organization.employees.append(person) organization.employees.append(person)
updates, removals = organization._delta() updates, removals = organization._delta()
self.assertEqual({}, removals) self.assertEqual({}, removals)
self.assertTrue('employees' in updates) self.assertIn('employees', updates)
def test_nested_nested_fields_mark_as_changed(self): def test_nested_nested_fields_mark_as_changed(self):
class EmbeddedDoc(EmbeddedDocument): class EmbeddedDoc(EmbeddedDocument):


@ -174,8 +174,8 @@ class DynamicTest(unittest.TestCase):
Employee.drop_collection() Employee.drop_collection()
self.assertTrue('name' in Employee._fields) self.assertIn('name', Employee._fields)
self.assertTrue('salary' in Employee._fields) self.assertIn('salary', Employee._fields)
self.assertEqual(Employee._get_collection_name(), self.assertEqual(Employee._get_collection_name(),
self.Person._get_collection_name()) self.Person._get_collection_name())
@ -189,7 +189,7 @@ class DynamicTest(unittest.TestCase):
self.assertEqual(1, Employee.objects(age=20).count()) self.assertEqual(1, Employee.objects(age=20).count())
joe_bloggs = self.Person.objects.first() joe_bloggs = self.Person.objects.first()
self.assertTrue(isinstance(joe_bloggs, Employee)) self.assertIsInstance(joe_bloggs, Employee)
def test_embedded_dynamic_document(self): def test_embedded_dynamic_document(self):
"""Test dynamic embedded documents""" """Test dynamic embedded documents"""


@ -1,15 +1,14 @@
# -*- coding: utf-8 -*- # -*- coding: utf-8 -*-
import unittest import unittest
import sys from datetime import datetime
from nose.plugins.skip import SkipTest from nose.plugins.skip import SkipTest
from datetime import datetime from pymongo.errors import OperationFailure
import pymongo import pymongo
from mongoengine import * from mongoengine import *
from mongoengine.connection import get_db from mongoengine.connection import get_db
from tests.utils import get_mongodb_version, requires_mongodb_gte_26, MONGODB_32, MONGODB_3
from tests.utils import get_mongodb_version, needs_mongodb_v26
__all__ = ("IndexesTest", ) __all__ = ("IndexesTest", )
@ -19,6 +18,7 @@ class IndexesTest(unittest.TestCase):
def setUp(self): def setUp(self):
self.connection = connect(db='mongoenginetest') self.connection = connect(db='mongoenginetest')
self.db = get_db() self.db = get_db()
self.mongodb_version = get_mongodb_version()
class Person(Document): class Person(Document):
name = StringField() name = StringField()
@ -70,7 +70,7 @@ class IndexesTest(unittest.TestCase):
self.assertEqual(len(info), 4) self.assertEqual(len(info), 4)
info = [value['key'] for key, value in info.iteritems()] info = [value['key'] for key, value in info.iteritems()]
for expected in expected_specs: for expected in expected_specs:
self.assertTrue(expected['fields'] in info) self.assertIn(expected['fields'], info)
def _index_test_inheritance(self, InheritFrom): def _index_test_inheritance(self, InheritFrom):
@ -102,7 +102,7 @@ class IndexesTest(unittest.TestCase):
self.assertEqual(len(info), 4) self.assertEqual(len(info), 4)
info = [value['key'] for key, value in info.iteritems()] info = [value['key'] for key, value in info.iteritems()]
for expected in expected_specs: for expected in expected_specs:
self.assertTrue(expected['fields'] in info) self.assertIn(expected['fields'], info)
class ExtendedBlogPost(BlogPost): class ExtendedBlogPost(BlogPost):
title = StringField() title = StringField()
@ -117,7 +117,7 @@ class IndexesTest(unittest.TestCase):
info = ExtendedBlogPost.objects._collection.index_information() info = ExtendedBlogPost.objects._collection.index_information()
info = [value['key'] for key, value in info.iteritems()] info = [value['key'] for key, value in info.iteritems()]
for expected in expected_specs: for expected in expected_specs:
self.assertTrue(expected['fields'] in info) self.assertIn(expected['fields'], info)
def test_indexes_document_inheritance(self): def test_indexes_document_inheritance(self):
"""Ensure that indexes are used when meta[indexes] is specified for """Ensure that indexes are used when meta[indexes] is specified for
@ -226,7 +226,7 @@ class IndexesTest(unittest.TestCase):
list(Person.objects) list(Person.objects)
info = Person.objects._collection.index_information() info = Person.objects._collection.index_information()
info = [value['key'] for key, value in info.iteritems()] info = [value['key'] for key, value in info.iteritems()]
self.assertTrue([('rank.title', 1)] in info) self.assertIn([('rank.title', 1)], info)
def test_explicit_geo2d_index(self): def test_explicit_geo2d_index(self):
"""Ensure that geo2d indexes work when created via meta[indexes] """Ensure that geo2d indexes work when created via meta[indexes]
@ -246,7 +246,7 @@ class IndexesTest(unittest.TestCase):
Place.ensure_indexes() Place.ensure_indexes()
info = Place._get_collection().index_information() info = Place._get_collection().index_information()
info = [value['key'] for key, value in info.iteritems()] info = [value['key'] for key, value in info.iteritems()]
self.assertTrue([('location.point', '2d')] in info) self.assertIn([('location.point', '2d')], info)
def test_explicit_geo2d_index_embedded(self): def test_explicit_geo2d_index_embedded(self):
"""Ensure that geo2d indexes work when created via meta[indexes] """Ensure that geo2d indexes work when created via meta[indexes]
@ -269,7 +269,7 @@ class IndexesTest(unittest.TestCase):
Place.ensure_indexes() Place.ensure_indexes()
info = Place._get_collection().index_information() info = Place._get_collection().index_information()
info = [value['key'] for key, value in info.iteritems()] info = [value['key'] for key, value in info.iteritems()]
self.assertTrue([('current.location.point', '2d')] in info) self.assertIn([('current.location.point', '2d')], info)
def test_explicit_geosphere_index(self): def test_explicit_geosphere_index(self):
"""Ensure that geosphere indexes work when created via meta[indexes] """Ensure that geosphere indexes work when created via meta[indexes]
@ -289,7 +289,7 @@ class IndexesTest(unittest.TestCase):
Place.ensure_indexes() Place.ensure_indexes()
info = Place._get_collection().index_information() info = Place._get_collection().index_information()
info = [value['key'] for key, value in info.iteritems()] info = [value['key'] for key, value in info.iteritems()]
self.assertTrue([('location.point', '2dsphere')] in info) self.assertIn([('location.point', '2dsphere')], info)
def test_explicit_geohaystack_index(self): def test_explicit_geohaystack_index(self):
"""Ensure that geohaystack indexes work when created via meta[indexes] """Ensure that geohaystack indexes work when created via meta[indexes]
@ -311,7 +311,7 @@ class IndexesTest(unittest.TestCase):
Place.ensure_indexes() Place.ensure_indexes()
info = Place._get_collection().index_information() info = Place._get_collection().index_information()
info = [value['key'] for key, value in info.iteritems()] info = [value['key'] for key, value in info.iteritems()]
self.assertTrue([('location.point', 'geoHaystack')] in info) self.assertIn([('location.point', 'geoHaystack')], info)
def test_create_geohaystack_index(self): def test_create_geohaystack_index(self):
"""Ensure that geohaystack indexes can be created """Ensure that geohaystack indexes can be created
@ -323,7 +323,7 @@ class IndexesTest(unittest.TestCase):
Place.create_index({'fields': (')location.point', 'name')}, bucketSize=10) Place.create_index({'fields': (')location.point', 'name')}, bucketSize=10)
info = Place._get_collection().index_information() info = Place._get_collection().index_information()
info = [value['key'] for key, value in info.iteritems()] info = [value['key'] for key, value in info.iteritems()]
self.assertTrue([('location.point', 'geoHaystack'), ('name', 1)] in info) self.assertIn([('location.point', 'geoHaystack'), ('name', 1)], info)
def test_dictionary_indexes(self): def test_dictionary_indexes(self):
"""Ensure that indexes are used when meta[indexes] contains """Ensure that indexes are used when meta[indexes] contains
@ -356,7 +356,7 @@ class IndexesTest(unittest.TestCase):
value.get('unique', False), value.get('unique', False),
value.get('sparse', False)) value.get('sparse', False))
for key, value in info.iteritems()] for key, value in info.iteritems()]
self.assertTrue(([('addDate', -1)], True, True) in info) self.assertIn(([('addDate', -1)], True, True), info)
BlogPost.drop_collection() BlogPost.drop_collection()
@ -491,7 +491,7 @@ class IndexesTest(unittest.TestCase):
obj = Test(a=1) obj = Test(a=1)
obj.save() obj.save()
IS_MONGODB_3 = get_mongodb_version()[0] >= 3 IS_MONGODB_3 = get_mongodb_version() >= MONGODB_3
# Need to be explicit about covered indexes as mongoDB doesn't know if # Need to be explicit about covered indexes as mongoDB doesn't know if
# the documents returned might have more keys than the index covers. # the documents returned might have more keys than the index covers.
@ -541,19 +541,24 @@ class IndexesTest(unittest.TestCase):
[('categories', 1), ('_id', 1)]) [('categories', 1), ('_id', 1)])
def test_hint(self): def test_hint(self):
MONGO_VER = self.mongodb_version
TAGS_INDEX_NAME = 'tags_1'
class BlogPost(Document): class BlogPost(Document):
tags = ListField(StringField()) tags = ListField(StringField())
meta = { meta = {
'indexes': [ 'indexes': [
'tags', {
'fields': ['tags'],
'name': TAGS_INDEX_NAME
}
], ],
} }
BlogPost.drop_collection() BlogPost.drop_collection()
for i in range(0, 10): for i in range(10):
tags = [("tag %i" % n) for n in range(0, i % 2)] tags = [("tag %i" % n) for n in range(i % 2)]
BlogPost(tags=tags).save() BlogPost(tags=tags).save()
self.assertEqual(BlogPost.objects.count(), 10) self.assertEqual(BlogPost.objects.count(), 10)
@ -563,18 +568,18 @@ class IndexesTest(unittest.TestCase):
if pymongo.version != '3.0': if pymongo.version != '3.0':
self.assertEqual(BlogPost.objects.hint([('tags', 1)]).count(), 10) self.assertEqual(BlogPost.objects.hint([('tags', 1)]).count(), 10)
if MONGO_VER == MONGODB_32:
# Mongo32 throws an error if an index exists (i.e. `tags` in our case)
# and you use hint on an index name that does not exist
with self.assertRaises(OperationFailure):
BlogPost.objects.hint([('ZZ', 1)]).count()
else:
self.assertEqual(BlogPost.objects.hint([('ZZ', 1)]).count(), 10) self.assertEqual(BlogPost.objects.hint([('ZZ', 1)]).count(), 10)
if pymongo.version >= '2.8': self.assertEqual(BlogPost.objects.hint(TAGS_INDEX_NAME).count(), 10)
self.assertEqual(BlogPost.objects.hint('tags').count(), 10)
else:
def invalid_index():
BlogPost.objects.hint('tags').next()
self.assertRaises(TypeError, invalid_index)
def invalid_index_2(): with self.assertRaises(Exception):
return BlogPost.objects.hint(('tags', 1)).next() BlogPost.objects.hint(('tags', 1)).next()
self.assertRaises(Exception, invalid_index_2)
def test_unique(self): def test_unique(self):
"""Ensure that uniqueness constraints are applied to fields. """Ensure that uniqueness constraints are applied to fields.
@ -749,7 +754,7 @@ class IndexesTest(unittest.TestCase):
except NotUniqueError: except NotUniqueError:
pass pass
def test_unique_and_primary(self): def test_primary_save_duplicate_update_existing_object(self):
"""If you set a field as primary, then unexpected behaviour can occur. """If you set a field as primary, then unexpected behaviour can occur.
You won't create a duplicate but you will update an existing document. You won't create a duplicate but you will update an existing document.
""" """
@ -803,7 +808,7 @@ class IndexesTest(unittest.TestCase):
info = BlogPost.objects._collection.index_information() info = BlogPost.objects._collection.index_information()
info = [value['key'] for key, value in info.iteritems()] info = [value['key'] for key, value in info.iteritems()]
index_item = [('_id', 1), ('comments.comment_id', 1)] index_item = [('_id', 1), ('comments.comment_id', 1)]
self.assertTrue(index_item in info) self.assertIn(index_item, info)
def test_compound_key_embedded(self): def test_compound_key_embedded(self):
@ -850,8 +855,8 @@ class IndexesTest(unittest.TestCase):
info = MyDoc.objects._collection.index_information() info = MyDoc.objects._collection.index_information()
info = [value['key'] for key, value in info.iteritems()] info = [value['key'] for key, value in info.iteritems()]
self.assertTrue([('provider_ids.foo', 1)] in info) self.assertIn([('provider_ids.foo', 1)], info)
self.assertTrue([('provider_ids.bar', 1)] in info) self.assertIn([('provider_ids.bar', 1)], info)
def test_sparse_compound_indexes(self): def test_sparse_compound_indexes(self):
@ -867,7 +872,7 @@ class IndexesTest(unittest.TestCase):
info['provider_ids.foo_1_provider_ids.bar_1']['key']) info['provider_ids.foo_1_provider_ids.bar_1']['key'])
self.assertTrue(info['provider_ids.foo_1_provider_ids.bar_1']['sparse']) self.assertTrue(info['provider_ids.foo_1_provider_ids.bar_1']['sparse'])
@needs_mongodb_v26 @requires_mongodb_gte_26
def test_text_indexes(self): def test_text_indexes(self):
class Book(Document): class Book(Document):
title = DictField() title = DictField()
@ -876,9 +881,9 @@ class IndexesTest(unittest.TestCase):
} }
indexes = Book.objects._collection.index_information() indexes = Book.objects._collection.index_information()
self.assertTrue("title_text" in indexes) self.assertIn("title_text", indexes)
key = indexes["title_text"]["key"] key = indexes["title_text"]["key"]
self.assertTrue(('_fts', 'text') in key) self.assertIn(('_fts', 'text'), key)
def test_hashed_indexes(self): def test_hashed_indexes(self):
@ -889,8 +894,8 @@ class IndexesTest(unittest.TestCase):
} }
indexes = Book.objects._collection.index_information() indexes = Book.objects._collection.index_information()
self.assertTrue("ref_id_hashed" in indexes) self.assertIn("ref_id_hashed", indexes)
self.assertTrue(('ref_id', 'hashed') in indexes["ref_id_hashed"]["key"]) self.assertIn(('ref_id', 'hashed'), indexes["ref_id_hashed"]["key"])
def test_indexes_after_database_drop(self): def test_indexes_after_database_drop(self):
""" """
@ -1013,7 +1018,7 @@ class IndexesTest(unittest.TestCase):
TestDoc.ensure_indexes() TestDoc.ensure_indexes()
index_info = TestDoc._get_collection().index_information() index_info = TestDoc._get_collection().index_information()
self.assertTrue('shard_1_1__cls_1_txt_1_1' in index_info) self.assertIn('shard_1_1__cls_1_txt_1_1', index_info)
if __name__ == '__main__': if __name__ == '__main__':


@ -2,14 +2,11 @@
import unittest import unittest
import warnings import warnings
from datetime import datetime from mongoengine import (BooleanField, Document, EmbeddedDocument,
EmbeddedDocumentField, GenericReferenceField,
from tests.fixtures import Base IntField, ReferenceField, StringField, connect)
from mongoengine import Document, EmbeddedDocument, connect
from mongoengine.connection import get_db from mongoengine.connection import get_db
from mongoengine.fields import (BooleanField, GenericReferenceField, from tests.fixtures import Base
IntField, StringField)
__all__ = ('InheritanceTest', ) __all__ = ('InheritanceTest', )
@ -26,6 +23,27 @@ class InheritanceTest(unittest.TestCase):
continue continue
self.db.drop_collection(collection) self.db.drop_collection(collection)
def test_constructor_cls(self):
# Ensures _cls is properly set during construction
# and when object gets reloaded (prevent regression of #1950)
class EmbedData(EmbeddedDocument):
data = StringField()
meta = {'allow_inheritance': True}
class DataDoc(Document):
name = StringField()
embed = EmbeddedDocumentField(EmbedData)
meta = {'allow_inheritance': True}
test_doc = DataDoc(name='test', embed=EmbedData(data='data'))
assert test_doc._cls == 'DataDoc'
assert test_doc.embed._cls == 'EmbedData'
test_doc.save()
saved_doc = DataDoc.objects.with_id(test_doc.id)
assert test_doc._cls == saved_doc._cls
assert test_doc.embed._cls == saved_doc.embed._cls
test_doc.delete()
def test_superclasses(self): def test_superclasses(self):
"""Ensure that the correct list of superclasses is assembled. """Ensure that the correct list of superclasses is assembled.
""" """
@ -258,9 +276,10 @@ class InheritanceTest(unittest.TestCase):
name = StringField() name = StringField()
# can't inherit because Animal didn't explicitly allow inheritance # can't inherit because Animal didn't explicitly allow inheritance
with self.assertRaises(ValueError): with self.assertRaises(ValueError) as cm:
class Dog(Animal): class Dog(Animal):
pass pass
self.assertIn("Document Animal may not be subclassed", str(cm.exception))
# Check that _cls etc aren't present on simple documents # Check that _cls etc aren't present on simple documents
dog = Animal(name='dog').save() dog = Animal(name='dog').save()
@ -268,7 +287,7 @@ class InheritanceTest(unittest.TestCase):
collection = self.db[Animal._get_collection_name()] collection = self.db[Animal._get_collection_name()]
obj = collection.find_one() obj = collection.find_one()
self.assertFalse('_cls' in obj) self.assertNotIn('_cls', obj)
def test_cant_turn_off_inheritance_on_subclass(self): def test_cant_turn_off_inheritance_on_subclass(self):
"""Ensure if inheritance is on in a subclass you cant turn it off. """Ensure if inheritance is on in a subclass you cant turn it off.
@ -277,9 +296,10 @@ class InheritanceTest(unittest.TestCase):
name = StringField() name = StringField()
meta = {'allow_inheritance': True} meta = {'allow_inheritance': True}
with self.assertRaises(ValueError): with self.assertRaises(ValueError) as cm:
class Mammal(Animal): class Mammal(Animal):
meta = {'allow_inheritance': False} meta = {'allow_inheritance': False}
self.assertEqual(str(cm.exception), 'Only direct subclasses of Document may set "allow_inheritance" to False')
def test_allow_inheritance_abstract_document(self): def test_allow_inheritance_abstract_document(self):
"""Ensure that abstract documents can set inheritance rules and that """Ensure that abstract documents can set inheritance rules and that
@ -292,13 +312,48 @@ class InheritanceTest(unittest.TestCase):
class Animal(FinalDocument): class Animal(FinalDocument):
name = StringField() name = StringField()
with self.assertRaises(ValueError): with self.assertRaises(ValueError) as cm:
class Mammal(Animal): class Mammal(Animal):
pass pass
# Check that _cls isn't present in simple documents # Check that _cls isn't present in simple documents
doc = Animal(name='dog') doc = Animal(name='dog')
self.assertFalse('_cls' in doc.to_mongo()) self.assertNotIn('_cls', doc.to_mongo())
def test_using_abstract_class_in_reference_field(self):
# Ensures no regression of #1920
class AbstractHuman(Document):
meta = {'abstract': True}
class Dad(AbstractHuman):
name = StringField()
class Home(Document):
dad = ReferenceField(AbstractHuman) # Referencing the abstract class
address = StringField()
dad = Dad(name='5').save()
Home(dad=dad, address='street').save()
home = Home.objects.first()
home.address = 'garbage'
home.save() # Was failing with ValidationError
def test_abstract_class_referencing_self(self):
# Ensures no regression of #1920
class Human(Document):
meta = {'abstract': True}
creator = ReferenceField('self', dbref=True)
class User(Human):
name = StringField()
user = User(name='John').save()
user2 = User(name='Foo', creator=user).save()
user2 = User.objects.with_id(user2.id)
user2.name = 'Bar'
user2.save() # Was failing with ValidationError
def test_abstract_handle_ids_in_metaclass_properly(self): def test_abstract_handle_ids_in_metaclass_properly(self):
@ -358,11 +413,11 @@ class InheritanceTest(unittest.TestCase):
meta = {'abstract': True, meta = {'abstract': True,
'allow_inheritance': False} 'allow_inheritance': False}
bkk = City(continent='asia') city = City(continent='asia')
self.assertEqual(None, bkk.pk) self.assertEqual(None, city.pk)
# TODO: expected error? Shouldn't we create a new error type? # TODO: expected error? Shouldn't we create a new error type?
with self.assertRaises(KeyError): with self.assertRaises(KeyError):
setattr(bkk, 'pk', 1) setattr(city, 'pk', 1)
def test_allow_inheritance_embedded_document(self): def test_allow_inheritance_embedded_document(self):
"""Ensure embedded documents respect inheritance.""" """Ensure embedded documents respect inheritance."""
@ -374,14 +429,14 @@ class InheritanceTest(unittest.TestCase):
pass pass
doc = Comment(content='test') doc = Comment(content='test')
self.assertFalse('_cls' in doc.to_mongo()) self.assertNotIn('_cls', doc.to_mongo())
class Comment(EmbeddedDocument): class Comment(EmbeddedDocument):
content = StringField() content = StringField()
meta = {'allow_inheritance': True} meta = {'allow_inheritance': True}
doc = Comment(content='test') doc = Comment(content='test')
self.assertTrue('_cls' in doc.to_mongo()) self.assertIn('_cls', doc.to_mongo())
def test_document_inheritance(self): def test_document_inheritance(self):
"""Ensure mutliple inheritance of abstract documents """Ensure mutliple inheritance of abstract documents
@ -434,8 +489,8 @@ class InheritanceTest(unittest.TestCase):
for cls in [Animal, Fish, Guppy]: for cls in [Animal, Fish, Guppy]:
self.assertEqual(cls._meta[k], v) self.assertEqual(cls._meta[k], v)
self.assertFalse('collection' in Animal._meta) self.assertNotIn('collection', Animal._meta)
self.assertFalse('collection' in Mammal._meta) self.assertNotIn('collection', Mammal._meta)
self.assertEqual(Animal._get_collection_name(), None) self.assertEqual(Animal._get_collection_name(), None)
self.assertEqual(Mammal._get_collection_name(), None) self.assertEqual(Mammal._get_collection_name(), None)


@ -8,9 +8,12 @@ import weakref
from datetime import datetime from datetime import datetime
from bson import DBRef, ObjectId from bson import DBRef, ObjectId
from pymongo.errors import DuplicateKeyError
from tests import fixtures from tests import fixtures
from tests.fixtures import (PickleEmbedded, PickleTest, PickleSignalsTest, from tests.fixtures import (PickleEmbedded, PickleTest, PickleSignalsTest,
PickleDynamicEmbedded, PickleDynamicTest) PickleDynamicEmbedded, PickleDynamicTest)
from tests.utils import MongoDBTestCase
from mongoengine import * from mongoengine import *
from mongoengine.base import get_document, _document_registry from mongoengine.base import get_document, _document_registry
@ -22,7 +25,7 @@ from mongoengine.queryset import NULLIFY, Q
from mongoengine.context_managers import switch_db, query_counter from mongoengine.context_managers import switch_db, query_counter
from mongoengine import signals from mongoengine import signals
from tests.utils import needs_mongodb_v26 from tests.utils import requires_mongodb_gte_26
TEST_IMAGE_PATH = os.path.join(os.path.dirname(__file__), TEST_IMAGE_PATH = os.path.join(os.path.dirname(__file__),
'../fields/mongoengine.png') '../fields/mongoengine.png')
@ -30,12 +33,9 @@ TEST_IMAGE_PATH = os.path.join(os.path.dirname(__file__),
__all__ = ("InstanceTest",) __all__ = ("InstanceTest",)
class InstanceTest(unittest.TestCase): class InstanceTest(MongoDBTestCase):
def setUp(self): def setUp(self):
connect(db='mongoenginetest')
self.db = get_db()
class Job(EmbeddedDocument): class Job(EmbeddedDocument):
name = StringField() name = StringField()
years = IntField() years = IntField()
@ -357,7 +357,7 @@ class InstanceTest(unittest.TestCase):
user_son = User.objects._collection.find_one() user_son = User.objects._collection.find_one()
self.assertEqual(user_son['_id'], 'test') self.assertEqual(user_son['_id'], 'test')
self.assertTrue('username' not in user_son['_id']) self.assertNotIn('username', user_son['_id'])
User.drop_collection() User.drop_collection()
@ -370,7 +370,7 @@ class InstanceTest(unittest.TestCase):
user_son = User.objects._collection.find_one() user_son = User.objects._collection.find_one()
self.assertEqual(user_son['_id'], 'mongo') self.assertEqual(user_son['_id'], 'mongo')
self.assertTrue('username' not in user_son['_id']) self.assertNotIn('username', user_son['_id'])
def test_document_not_registered(self): def test_document_not_registered(self):
class Place(Document): class Place(Document):
@ -550,21 +550,14 @@ class InstanceTest(unittest.TestCase):
pass pass
f = Foo() f = Foo()
try: with self.assertRaises(Foo.DoesNotExist):
f.reload() f.reload()
except Foo.DoesNotExist:
pass
except Exception:
self.assertFalse("Threw wrong exception")
f.save() f.save()
f.delete() f.delete()
try:
with self.assertRaises(Foo.DoesNotExist):
f.reload() f.reload()
except Foo.DoesNotExist:
pass
except Exception:
self.assertFalse("Threw wrong exception")
def test_reload_of_non_strict_with_special_field_name(self): def test_reload_of_non_strict_with_special_field_name(self):
"""Ensures reloading works for documents with meta strict == False.""" """Ensures reloading works for documents with meta strict == False."""
@ -601,10 +594,10 @@ class InstanceTest(unittest.TestCase):
# Length = length(assigned fields + id) # Length = length(assigned fields + id)
self.assertEqual(len(person), 5) self.assertEqual(len(person), 5)
self.assertTrue('age' in person) self.assertIn('age', person)
person.age = None person.age = None
self.assertFalse('age' in person) self.assertNotIn('age', person)
self.assertFalse('nationality' in person) self.assertNotIn('nationality', person)
def test_embedded_document_to_mongo(self): def test_embedded_document_to_mongo(self):
class Person(EmbeddedDocument): class Person(EmbeddedDocument):
@ -634,8 +627,8 @@ class InstanceTest(unittest.TestCase):
class Comment(EmbeddedDocument): class Comment(EmbeddedDocument):
content = StringField() content = StringField()
self.assertTrue('content' in Comment._fields) self.assertIn('content', Comment._fields)
self.assertFalse('id' in Comment._fields) self.assertNotIn('id', Comment._fields)
def test_embedded_document_instance(self): def test_embedded_document_instance(self):
"""Ensure that embedded documents can reference parent instance.""" """Ensure that embedded documents can reference parent instance."""
@ -734,12 +727,12 @@ class InstanceTest(unittest.TestCase):
t = TestDocument(status="draft", pub_date=datetime.now()) t = TestDocument(status="draft", pub_date=datetime.now())
try: with self.assertRaises(ValidationError) as cm:
t.save() t.save()
except ValidationError as e:
expect_msg = "Draft entries may not have a publication date." expected_msg = "Draft entries may not have a publication date."
self.assertTrue(expect_msg in e.message) self.assertIn(expected_msg, cm.exception.message)
self.assertEqual(e.to_dict(), {'__all__': expect_msg}) self.assertEqual(cm.exception.to_dict(), {'__all__': expected_msg})
t = TestDocument(status="published") t = TestDocument(status="published")
t.save(clean=False) t.save(clean=False)
@ -773,12 +766,13 @@ class InstanceTest(unittest.TestCase):
TestDocument.drop_collection() TestDocument.drop_collection()
t = TestDocument(doc=TestEmbeddedDocument(x=10, y=25, z=15)) t = TestDocument(doc=TestEmbeddedDocument(x=10, y=25, z=15))
try:
with self.assertRaises(ValidationError) as cm:
t.save() t.save()
except ValidationError as e:
expect_msg = "Value of z != x + y" expected_msg = "Value of z != x + y"
self.assertTrue(expect_msg in e.message) self.assertIn(expected_msg, cm.exception.message)
self.assertEqual(e.to_dict(), {'doc': {'__all__': expect_msg}}) self.assertEqual(cm.exception.to_dict(), {'doc': {'__all__': expected_msg}})
t = TestDocument(doc=TestEmbeddedDocument(x=10, y=25)).save() t = TestDocument(doc=TestEmbeddedDocument(x=10, y=25)).save()
self.assertEqual(t.doc.z, 35) self.assertEqual(t.doc.z, 35)
@ -846,12 +840,18 @@ class InstanceTest(unittest.TestCase):
self.assertDbEqual([dict(other_doc.to_mongo()), dict(doc.to_mongo())]) self.assertDbEqual([dict(other_doc.to_mongo()), dict(doc.to_mongo())])
@needs_mongodb_v26 @requires_mongodb_gte_26
def test_modify_with_positional_push(self): def test_modify_with_positional_push(self):
class Content(EmbeddedDocument):
keywords = ListField(StringField())
class BlogPost(Document): class BlogPost(Document):
tags = ListField(StringField()) tags = ListField(StringField())
content = EmbeddedDocumentField(Content)
post = BlogPost.objects.create(
tags=['python'], content=Content(keywords=['ipsum']))
post = BlogPost.objects.create(tags=['python'])
self.assertEqual(post.tags, ['python']) self.assertEqual(post.tags, ['python'])
post.modify(push__tags__0=['code', 'mongo']) post.modify(push__tags__0=['code', 'mongo'])
self.assertEqual(post.tags, ['code', 'mongo', 'python']) self.assertEqual(post.tags, ['code', 'mongo', 'python'])
@ -862,6 +862,16 @@ class InstanceTest(unittest.TestCase):
['code', 'mongo', 'python'] ['code', 'mongo', 'python']
) )
self.assertEqual(post.content.keywords, ['ipsum'])
post.modify(push__content__keywords__0=['lorem'])
self.assertEqual(post.content.keywords, ['lorem', 'ipsum'])
# Assert same order of the list items is maintained in the db
self.assertEqual(
BlogPost._get_collection().find_one({'_id': post.pk})['content']['keywords'],
['lorem', 'ipsum']
)
def test_save(self): def test_save(self):
"""Ensure that a document may be saved in the database.""" """Ensure that a document may be saved in the database."""
@ -1428,6 +1438,60 @@ class InstanceTest(unittest.TestCase):
self.assertEqual(person.age, 21) self.assertEqual(person.age, 21)
self.assertEqual(person.active, False) self.assertEqual(person.active, False)
def test__get_changed_fields_same_ids_embedded_with_dict_pk_does_not_enter_infinite_loop(self):
# Refers to Issue #1685
class EmbeddedChildModel(EmbeddedDocument):
id = DictField(primary_key=True)
class ParentModel(Document):
child = EmbeddedDocumentField(EmbeddedChildModel)
emb = EmbeddedChildModel(id={'1': [1]})
ParentModel(child=emb)._get_changed_fields()
def test__get_changed_fields_same_ids_reference_field_does_not_enter_infinite_loop(self):
class User(Document):
id = IntField(primary_key=True)
name = StringField()
class Message(Document):
id = IntField(primary_key=True)
author = ReferenceField(User)
Message.drop_collection()
# All objects share the same id, but each in a different collection
user = User(id=1, name='user-name').save()
message = Message(id=1, author=user).save()
message.author.name = 'tutu'
self.assertEqual(message._get_changed_fields(), [])
self.assertEqual(user._get_changed_fields(), ['name'])
def test__get_changed_fields_same_ids_embedded(self):
# Refers to Issue #1768
class User(EmbeddedDocument):
id = IntField()
name = StringField()
class Message(Document):
id = IntField(primary_key=True)
author = EmbeddedDocumentField(User)
Message.drop_collection()
# The message and its embedded author share the same id
user = User(id=1, name='user-name')
message = Message(id=1, author=user).save()
message.author.name = 'tutu'
self.assertEqual(message._get_changed_fields(), ['author.name'])
message.save()
message_fetched = Message.objects.with_id(message.id)
self.assertEqual(message_fetched.author.name, 'tutu')
def test_query_count_when_saving(self): def test_query_count_when_saving(self):
"""Ensure references don't cause extra fetches when saving""" """Ensure references don't cause extra fetches when saving"""
class Organization(Document): class Organization(Document):
@ -1461,9 +1525,9 @@ class InstanceTest(unittest.TestCase):
user = User.objects.first() user = User.objects.first()
# Even if stored as ObjectIds internally, mongoengine uses DBRefs,
# as ObjectIds aren't automatically dereferenced
self.assertTrue(isinstance(user._data['orgs'][0], DBRef)) self.assertIsInstance(user._data['orgs'][0], DBRef)
self.assertTrue(isinstance(user.orgs[0], Organization)) self.assertIsInstance(user.orgs[0], Organization)
self.assertTrue(isinstance(user._data['orgs'][0], Organization)) self.assertIsInstance(user._data['orgs'][0], Organization)
# Changing a value # Changing a value
with query_counter() as q: with query_counter() as q:
@ -1843,9 +1907,8 @@ class InstanceTest(unittest.TestCase):
post_obj = BlogPost.objects.first() post_obj = BlogPost.objects.first()
# Test laziness # Test laziness
self.assertTrue(isinstance(post_obj._data['author'], bson.DBRef)) self.assertIsInstance(post_obj._data['author'], bson.DBRef)
self.assertTrue(isinstance(post_obj.author, self.Person)) self.assertIsInstance(post_obj.author, self.Person)
self.assertEqual(post_obj.author.name, 'Test User') self.assertEqual(post_obj.author.name, 'Test User')
# Ensure that the dereferenced object may be changed and saved # Ensure that the dereferenced object may be changed and saved
@ -2251,12 +2314,12 @@ class InstanceTest(unittest.TestCase):
# Make sure docs are properly identified in a list (__eq__ is used # Make sure docs are properly identified in a list (__eq__ is used
# for the comparison). # for the comparison).
all_user_list = list(User.objects.all()) all_user_list = list(User.objects.all())
self.assertTrue(u1 in all_user_list) self.assertIn(u1, all_user_list)
self.assertTrue(u2 in all_user_list) self.assertIn(u2, all_user_list)
self.assertTrue(u3 in all_user_list) self.assertIn(u3, all_user_list)
self.assertTrue(u4 not in all_user_list) # New object self.assertNotIn(u4, all_user_list) # New object
self.assertTrue(b1 not in all_user_list) # Other object self.assertNotIn(b1, all_user_list) # Other object
self.assertTrue(b2 not in all_user_list) # Other object self.assertNotIn(b2, all_user_list) # Other object
# Make sure docs can be used as keys in a dict (__hash__ is used # Make sure docs can be used as keys in a dict (__hash__ is used
# for hashing the docs). # for hashing the docs).
@ -2274,10 +2337,10 @@ class InstanceTest(unittest.TestCase):
# Make sure docs are properly identified in a set (__hash__ is used # Make sure docs are properly identified in a set (__hash__ is used
# for hashing the docs). # for hashing the docs).
all_user_set = set(User.objects.all()) all_user_set = set(User.objects.all())
self.assertTrue(u1 in all_user_set) self.assertIn(u1, all_user_set)
self.assertTrue(u4 not in all_user_set) self.assertNotIn(u4, all_user_set)
self.assertTrue(b1 not in all_user_list) self.assertNotIn(b1, all_user_list)
self.assertTrue(b2 not in all_user_list) self.assertNotIn(b2, all_user_list)
# Make sure duplicate docs aren't accepted in the set # Make sure duplicate docs aren't accepted in the set
self.assertEqual(len(all_user_set), 3) self.assertEqual(len(all_user_set), 3)
@ -2978,7 +3041,7 @@ class InstanceTest(unittest.TestCase):
Person(name="Harry Potter").save() Person(name="Harry Potter").save()
person = Person.objects.first() person = Person.objects.first()
self.assertTrue('id' in person._data.keys()) self.assertIn('id', person._data.keys())
self.assertEqual(person._data.get('id'), person.id) self.assertEqual(person._data.get('id'), person.id)
def test_complex_nesting_document_and_embedded_document(self): def test_complex_nesting_document_and_embedded_document(self):
@ -3070,36 +3133,36 @@ class InstanceTest(unittest.TestCase):
dbref2 = f._data['test2'] dbref2 = f._data['test2']
obj2 = f.test2 obj2 = f.test2
self.assertTrue(isinstance(dbref2, DBRef)) self.assertIsInstance(dbref2, DBRef)
self.assertTrue(isinstance(obj2, Test2)) self.assertIsInstance(obj2, Test2)
self.assertTrue(obj2.id == dbref2.id) self.assertEqual(obj2.id, dbref2.id)
self.assertTrue(obj2 == dbref2) self.assertEqual(obj2, dbref2)
self.assertTrue(dbref2 == obj2) self.assertEqual(dbref2, obj2)
dbref3 = f._data['test3'] dbref3 = f._data['test3']
obj3 = f.test3 obj3 = f.test3
self.assertTrue(isinstance(dbref3, DBRef)) self.assertIsInstance(dbref3, DBRef)
self.assertTrue(isinstance(obj3, Test3)) self.assertIsInstance(obj3, Test3)
self.assertTrue(obj3.id == dbref3.id) self.assertEqual(obj3.id, dbref3.id)
self.assertTrue(obj3 == dbref3) self.assertEqual(obj3, dbref3)
self.assertTrue(dbref3 == obj3) self.assertEqual(dbref3, obj3)
self.assertTrue(obj2.id == obj3.id) self.assertEqual(obj2.id, obj3.id)
self.assertTrue(dbref2.id == dbref3.id) self.assertEqual(dbref2.id, dbref3.id)
self.assertFalse(dbref2 == dbref3) self.assertNotEqual(dbref2, dbref3)
self.assertFalse(dbref3 == dbref2) self.assertNotEqual(dbref3, dbref2)
self.assertTrue(dbref2 != dbref3) self.assertNotEqual(dbref2, dbref3)
self.assertTrue(dbref3 != dbref2) self.assertNotEqual(dbref3, dbref2)
self.assertFalse(obj2 == dbref3) self.assertNotEqual(obj2, dbref3)
self.assertFalse(dbref3 == obj2) self.assertNotEqual(dbref3, obj2)
self.assertTrue(obj2 != dbref3) self.assertNotEqual(obj2, dbref3)
self.assertTrue(dbref3 != obj2) self.assertNotEqual(dbref3, obj2)
self.assertFalse(obj3 == dbref2) self.assertNotEqual(obj3, dbref2)
self.assertFalse(dbref2 == obj3) self.assertNotEqual(dbref2, obj3)
self.assertTrue(obj3 != dbref2) self.assertNotEqual(obj3, dbref2)
self.assertTrue(dbref2 != obj3) self.assertNotEqual(dbref2, obj3)
def test_default_values(self): def test_default_values(self):
class Person(Document): class Person(Document):
@ -3148,6 +3211,64 @@ class InstanceTest(unittest.TestCase):
self.assertEquals(p.id, None) self.assertEquals(p.id, None)
p.id = "12345" # in case it is not working: "OperationError: Shard Keys are immutable..." will be raised here p.id = "12345" # in case it is not working: "OperationError: Shard Keys are immutable..." will be raised here
def test_from_son_created_False_without_id(self):
class MyPerson(Document):
name = StringField()
MyPerson.objects.delete()
p = MyPerson.from_json('{"name": "a_fancy_name"}', created=False)
self.assertFalse(p._created)
self.assertIsNone(p.id)
p.save()
self.assertIsNotNone(p.id)
saved_p = MyPerson.objects.get(id=p.id)
self.assertEqual(saved_p.name, 'a_fancy_name')
def test_from_son_created_False_with_id(self):
# Refers to Issue #1854
class MyPerson(Document):
name = StringField()
MyPerson.objects.delete()
p = MyPerson.from_json('{"_id": "5b85a8b04ec5dc2da388296e", "name": "a_fancy_name"}', created=False)
self.assertFalse(p._created)
self.assertEqual(p._changed_fields, [])
self.assertEqual(p.name, 'a_fancy_name')
self.assertEqual(p.id, ObjectId('5b85a8b04ec5dc2da388296e'))
p.save()
with self.assertRaises(DoesNotExist):
# Since created=False, an id was given in the JSON, and _changed_fields
# is empty, mongoengine assumes the document already exists in the db
# with that structure, so calling .save() didn't actually save anything
MyPerson.objects.get(id=p.id)
self.assertFalse(p._created)
p.name = 'a new fancy name'
self.assertEqual(p._changed_fields, ['name'])
p.save()
saved_p = MyPerson.objects.get(id=p.id)
self.assertEqual(saved_p.name, p.name)
def test_from_son_created_True_with_an_id(self):
class MyPerson(Document):
name = StringField()
MyPerson.objects.delete()
p = MyPerson.from_json('{"_id": "5b85a8b04ec5dc2da388296e", "name": "a_fancy_name"}', created=True)
self.assertTrue(p._created)
self.assertEqual(p._changed_fields, [])
self.assertEqual(p.name, 'a_fancy_name')
self.assertEqual(p.id, ObjectId('5b85a8b04ec5dc2da388296e'))
p.save()
saved_p = MyPerson.objects.get(id=p.id)
self.assertEqual(saved_p, p)
self.assertEqual(p.name, 'a_fancy_name')
def test_null_field(self): def test_null_field(self):
# Refers to Issue #734
class User(Document): class User(Document):
@ -3221,7 +3342,7 @@ class InstanceTest(unittest.TestCase):
person.update(set__height=2.0) person.update(set__height=2.0)
@needs_mongodb_v26 @requires_mongodb_gte_26
def test_push_with_position(self): def test_push_with_position(self):
"""Ensure that push with position works properly for an instance.""" """Ensure that push with position works properly for an instance."""
class BlogPost(Document): class BlogPost(Document):
@ -3248,6 +3369,23 @@ class InstanceTest(unittest.TestCase):
blog.reload() blog.reload()
self.assertEqual(blog.tags, [["value1", 123]]) self.assertEqual(blog.tags, [["value1", 123]])
def test_accessing_objects_with_indexes_error(self):
insert_result = self.db.company.insert_many([{'name': 'Foo'},
{'name': 'Foo'}])  # Force 2 docs with the same name
REF_OID = insert_result.inserted_ids[0]
self.db.user.insert_one({'company': REF_OID})  # An existing user referencing the first company
class Company(Document):
name = StringField(unique=True)
class User(Document):
company = ReferenceField(Company)
# Ensure index creation exceptions aren't swallowed (#1688)
with self.assertRaises(DuplicateKeyError):
User.objects().select_related()
if __name__ == '__main__': if __name__ == '__main__':
unittest.main() unittest.main()


@ -20,16 +20,16 @@ class ValidatorErrorTest(unittest.TestCase):
# 1st level error schema # 1st level error schema
error.errors = {'1st': ValidationError('bad 1st'), } error.errors = {'1st': ValidationError('bad 1st'), }
self.assertTrue('1st' in error.to_dict()) self.assertIn('1st', error.to_dict())
self.assertEqual(error.to_dict()['1st'], 'bad 1st') self.assertEqual(error.to_dict()['1st'], 'bad 1st')
# 2nd level error schema # 2nd level error schema
error.errors = {'1st': ValidationError('bad 1st', errors={ error.errors = {'1st': ValidationError('bad 1st', errors={
'2nd': ValidationError('bad 2nd'), '2nd': ValidationError('bad 2nd'),
})} })}
self.assertTrue('1st' in error.to_dict()) self.assertIn('1st', error.to_dict())
self.assertTrue(isinstance(error.to_dict()['1st'], dict)) self.assertIsInstance(error.to_dict()['1st'], dict)
self.assertTrue('2nd' in error.to_dict()['1st']) self.assertIn('2nd', error.to_dict()['1st'])
self.assertEqual(error.to_dict()['1st']['2nd'], 'bad 2nd') self.assertEqual(error.to_dict()['1st']['2nd'], 'bad 2nd')
# moar levels # moar levels
@ -40,10 +40,10 @@ class ValidatorErrorTest(unittest.TestCase):
}), }),
}), }),
})} })}
self.assertTrue('1st' in error.to_dict()) self.assertIn('1st', error.to_dict())
self.assertTrue('2nd' in error.to_dict()['1st']) self.assertIn('2nd', error.to_dict()['1st'])
self.assertTrue('3rd' in error.to_dict()['1st']['2nd']) self.assertIn('3rd', error.to_dict()['1st']['2nd'])
self.assertTrue('4th' in error.to_dict()['1st']['2nd']['3rd']) self.assertIn('4th', error.to_dict()['1st']['2nd']['3rd'])
self.assertEqual(error.to_dict()['1st']['2nd']['3rd']['4th'], self.assertEqual(error.to_dict()['1st']['2nd']['3rd']['4th'],
'Inception') 'Inception')
@ -58,7 +58,7 @@ class ValidatorErrorTest(unittest.TestCase):
try: try:
User().validate() User().validate()
except ValidationError as e: except ValidationError as e:
self.assertTrue("User:None" in e.message) self.assertIn("User:None", e.message)
self.assertEqual(e.to_dict(), { self.assertEqual(e.to_dict(), {
'username': 'Field is required', 'username': 'Field is required',
'name': 'Field is required'}) 'name': 'Field is required'})
@ -68,7 +68,7 @@ class ValidatorErrorTest(unittest.TestCase):
try: try:
user.save() user.save()
except ValidationError as e: except ValidationError as e:
self.assertTrue("User:RossC0" in e.message) self.assertIn("User:RossC0", e.message)
self.assertEqual(e.to_dict(), { self.assertEqual(e.to_dict(), {
'name': 'Field is required'}) 'name': 'Field is required'})
@ -116,7 +116,7 @@ class ValidatorErrorTest(unittest.TestCase):
try: try:
Doc(id="bad").validate() Doc(id="bad").validate()
except ValidationError as e: except ValidationError as e:
self.assertTrue("SubDoc:None" in e.message) self.assertIn("SubDoc:None", e.message)
self.assertEqual(e.to_dict(), { self.assertEqual(e.to_dict(), {
"e": {'val': 'OK could not be converted to int'}}) "e": {'val': 'OK could not be converted to int'}})
@ -127,14 +127,14 @@ class ValidatorErrorTest(unittest.TestCase):
doc = Doc.objects.first() doc = Doc.objects.first()
keys = doc._data.keys() keys = doc._data.keys()
self.assertEqual(2, len(keys)) self.assertEqual(2, len(keys))
self.assertTrue('e' in keys) self.assertIn('e', keys)
self.assertTrue('id' in keys) self.assertIn('id', keys)
doc.e.val = "OK" doc.e.val = "OK"
try: try:
doc.save() doc.save()
except ValidationError as e: except ValidationError as e:
self.assertTrue("Doc:test" in e.message) self.assertIn("Doc:test", e.message)
self.assertEqual(e.to_dict(), { self.assertEqual(e.to_dict(), {
"e": {'val': 'OK could not be converted to int'}}) "e": {'val': 'OK could not be converted to int'}})


@ -1,3 +1,3 @@
from fields import * from .fields import *
from file_tests import * from .file_tests import *
from geo import * from .geo import *


@ -175,7 +175,7 @@ class FieldTest(MongoDBTestCase):
self.assertEqual(person.name, None) self.assertEqual(person.name, None)
self.assertEqual(person.age, 30) self.assertEqual(person.age, 30)
self.assertEqual(person.userid, 'test') self.assertEqual(person.userid, 'test')
self.assertTrue(isinstance(person.created, datetime.datetime)) self.assertIsInstance(person.created, datetime.datetime)
self.assertEqual(person._data['name'], person.name) self.assertEqual(person._data['name'], person.name)
self.assertEqual(person._data['age'], person.age) self.assertEqual(person._data['age'], person.age)
@ -186,6 +186,31 @@ class FieldTest(MongoDBTestCase):
data_to_be_saved = sorted(person.to_mongo().keys()) data_to_be_saved = sorted(person.to_mongo().keys())
self.assertEqual(data_to_be_saved, ['age', 'created', 'userid']) self.assertEqual(data_to_be_saved, ['age', 'created', 'userid'])
def test_default_value_is_not_used_when_changing_value_to_empty_list_for_strict_doc(self):
"""List field with default can be set to the empty list (strict)"""
# Issue #1733
class Doc(Document):
x = ListField(IntField(), default=lambda: [42])
doc = Doc(x=[1]).save()
doc.x = []
doc.save()
reloaded = Doc.objects.get(id=doc.id)
self.assertEqual(reloaded.x, [])
def test_default_value_is_not_used_when_changing_value_to_empty_list_for_dyn_doc(self):
"""List field with default can be set to the empty list (dynamic)"""
# Issue #1733
class Doc(DynamicDocument):
x = ListField(IntField(), default=lambda: [42])
doc = Doc(x=[1]).save()
doc.x = []
doc.y = 2 # Was triggering the bug
doc.save()
reloaded = Doc.objects.get(id=doc.id)
self.assertEqual(reloaded.x, [])
def test_default_values_when_deleting_value(self): def test_default_values_when_deleting_value(self):
"""Ensure that default field values are used after non-default """Ensure that default field values are used after non-default
values are explicitly deleted. values are explicitly deleted.
@ -211,7 +236,7 @@ class FieldTest(MongoDBTestCase):
self.assertEqual(person.name, None) self.assertEqual(person.name, None)
self.assertEqual(person.age, 30) self.assertEqual(person.age, 30)
self.assertEqual(person.userid, 'test') self.assertEqual(person.userid, 'test')
self.assertTrue(isinstance(person.created, datetime.datetime)) self.assertIsInstance(person.created, datetime.datetime)
self.assertNotEqual(person.created, datetime.datetime(2014, 6, 12)) self.assertNotEqual(person.created, datetime.datetime(2014, 6, 12))
self.assertEqual(person._data['name'], person.name) self.assertEqual(person._data['name'], person.name)
@ -264,12 +289,11 @@ class FieldTest(MongoDBTestCase):
# Retrieve data from db and verify it.
ret = HandleNoneFields.objects.all()[0] ret = HandleNoneFields.objects.all()[0]
self.assertEqual(ret.str_fld, None) self.assertIsNone(ret.str_fld)
self.assertEqual(ret.int_fld, None) self.assertIsNone(ret.int_fld)
self.assertEqual(ret.flt_fld, None) self.assertIsNone(ret.flt_fld)
# Return current time if retrieved value is None.
self.assertTrue(isinstance(ret.comp_dt_fld, datetime.datetime)) self.assertIsNone(ret.comp_dt_fld)
def test_not_required_handles_none_from_database(self): def test_not_required_handles_none_from_database(self):
"""Ensure that every field can handle null values from the """Ensure that every field can handle null values from the
@ -287,7 +311,7 @@ class FieldTest(MongoDBTestCase):
doc.str_fld = u'spam ham egg' doc.str_fld = u'spam ham egg'
doc.int_fld = 42 doc.int_fld = 42
doc.flt_fld = 4.2 doc.flt_fld = 4.2
doc.com_dt_fld = datetime.datetime.utcnow() doc.comp_dt_fld = datetime.datetime.utcnow()
doc.save() doc.save()
# Unset all the fields # Unset all the fields
@ -302,12 +326,10 @@ class FieldTest(MongoDBTestCase):
# Retrieve data from db and verify it.
ret = HandleNoneFields.objects.first() ret = HandleNoneFields.objects.first()
self.assertEqual(ret.str_fld, None) self.assertIsNone(ret.str_fld)
self.assertEqual(ret.int_fld, None) self.assertIsNone(ret.int_fld)
self.assertEqual(ret.flt_fld, None) self.assertIsNone(ret.flt_fld)
self.assertIsNone(ret.comp_dt_fld)
# ComplexDateTimeField returns current time if retrived value is None.
self.assertTrue(isinstance(ret.comp_dt_fld, datetime.datetime))
# Retrieved object shouldn't pass validation when a re-save is # Retrieved object shouldn't pass validation when a re-save is
# attempted. # attempted.
@ -428,6 +450,16 @@ class FieldTest(MongoDBTestCase):
scheme_link.url = 'ws://google.com' scheme_link.url = 'ws://google.com'
scheme_link.validate() scheme_link.validate()
def test_url_allowed_domains(self):
"""Allow underscores in domain names."""
class Link(Document):
url = URLField()
link = Link()
link.url = 'https://san_leandro-ca.geebo.com'
link.validate()
def test_int_validation(self): def test_int_validation(self):
"""Ensure that invalid values cannot be assigned to int fields. """Ensure that invalid values cannot be assigned to int fields.
""" """
@ -611,6 +643,8 @@ class FieldTest(MongoDBTestCase):
self.assertRaises(ValidationError, person.validate) self.assertRaises(ValidationError, person.validate)
person.admin = 'Yes' person.admin = 'Yes'
self.assertRaises(ValidationError, person.validate) self.assertRaises(ValidationError, person.validate)
person.admin = 'False'
self.assertRaises(ValidationError, person.validate)
def test_uuid_field_string(self): def test_uuid_field_string(self):
"""Test UUID fields storing as String """Test UUID fields storing as String
@ -928,137 +962,6 @@ class FieldTest(MongoDBTestCase):
logs = LogEntry.objects.filter(date__gte=datetime.datetime(1980, 1, 1)) logs = LogEntry.objects.filter(date__gte=datetime.datetime(1980, 1, 1))
self.assertEqual(logs.count(), 10) self.assertEqual(logs.count(), 10)
def test_complexdatetime_storage(self):
"""Tests for complex datetime fields - which can handle
microseconds without rounding.
"""
class LogEntry(Document):
date = ComplexDateTimeField()
date_with_dots = ComplexDateTimeField(separator='.')
LogEntry.drop_collection()
# Post UTC - microseconds are rounded (down) nearest millisecond and
# dropped - with default datetimefields
d1 = datetime.datetime(1970, 1, 1, 0, 0, 1, 999)
log = LogEntry()
log.date = d1
log.save()
log.reload()
self.assertEqual(log.date, d1)
# Post UTC - microseconds are rounded (down) nearest millisecond - with
# default datetimefields
d1 = datetime.datetime(1970, 1, 1, 0, 0, 1, 9999)
log.date = d1
log.save()
log.reload()
self.assertEqual(log.date, d1)
# Pre UTC dates microseconds below 1000 are dropped - with default
# datetimefields
d1 = datetime.datetime(1969, 12, 31, 23, 59, 59, 999)
log.date = d1
log.save()
log.reload()
self.assertEqual(log.date, d1)
# Pre UTC microseconds above 1000 is wonky - with default datetimefields
# log.date has an invalid microsecond value so I can't construct
# a date to compare.
for i in range(1001, 3113, 33):
d1 = datetime.datetime(1969, 12, 31, 23, 59, 59, i)
log.date = d1
log.save()
log.reload()
self.assertEqual(log.date, d1)
log1 = LogEntry.objects.get(date=d1)
self.assertEqual(log, log1)
# Test string padding
microsecond = map(int, [math.pow(10, x) for x in range(6)])
mm = dd = hh = ii = ss = [1, 10]
for values in itertools.product([2014], mm, dd, hh, ii, ss, microsecond):
stored = LogEntry(date=datetime.datetime(*values)).to_mongo()['date']
self.assertTrue(re.match('^\d{4},\d{2},\d{2},\d{2},\d{2},\d{2},\d{6}$', stored) is not None)
# Test separator
stored = LogEntry(date_with_dots=datetime.datetime(2014, 1, 1)).to_mongo()['date_with_dots']
self.assertTrue(re.match('^\d{4}.\d{2}.\d{2}.\d{2}.\d{2}.\d{2}.\d{6}$', stored) is not None)
def test_complexdatetime_usage(self):
"""Tests for complex datetime fields - which can handle
microseconds without rounding.
"""
class LogEntry(Document):
date = ComplexDateTimeField()
LogEntry.drop_collection()
d1 = datetime.datetime(1950, 1, 1, 0, 0, 1, 999)
log = LogEntry()
log.date = d1
log.save()
log1 = LogEntry.objects.get(date=d1)
self.assertEqual(log, log1)
# create extra 59 log entries for a total of 60
for i in range(1951, 2010):
d = datetime.datetime(i, 1, 1, 0, 0, 1, 999)
LogEntry(date=d).save()
self.assertEqual(LogEntry.objects.count(), 60)
# Test ordering
logs = LogEntry.objects.order_by("date")
i = 0
while i < 59:
self.assertTrue(logs[i].date <= logs[i + 1].date)
i += 1
logs = LogEntry.objects.order_by("-date")
i = 0
while i < 59:
self.assertTrue(logs[i].date >= logs[i + 1].date)
i += 1
# Test searching
logs = LogEntry.objects.filter(date__gte=datetime.datetime(1980, 1, 1))
self.assertEqual(logs.count(), 30)
logs = LogEntry.objects.filter(date__lte=datetime.datetime(1980, 1, 1))
self.assertEqual(logs.count(), 30)
logs = LogEntry.objects.filter(
date__lte=datetime.datetime(2011, 1, 1),
date__gte=datetime.datetime(2000, 1, 1),
)
self.assertEqual(logs.count(), 10)
LogEntry.drop_collection()
# Test microsecond-level ordering/filtering
for microsecond in (99, 999, 9999, 10000):
LogEntry(
date=datetime.datetime(2015, 1, 1, 0, 0, 0, microsecond)
).save()
logs = list(LogEntry.objects.order_by('date'))
for next_idx, log in enumerate(logs[:-1], start=1):
next_log = logs[next_idx]
self.assertTrue(log.date < next_log.date)
logs = list(LogEntry.objects.order_by('-date'))
for next_idx, log in enumerate(logs[:-1], start=1):
next_log = logs[next_idx]
self.assertTrue(log.date > next_log.date)
logs = LogEntry.objects.filter(
date__lte=datetime.datetime(2015, 1, 1, 0, 0, 0, 10000))
self.assertEqual(logs.count(), 4)
def test_list_validation(self):
"""Ensure that a list field only accepts lists with valid elements."""
AccessLevelChoices = (
@@ -1311,7 +1214,7 @@ class FieldTest(MongoDBTestCase):
# aka 'del list[index]'
# aka 'operator.delitem(list, index)'
reset_post()
del post.info[2]  # del from middle ('2')
self.assertEqual(post.info, ['0', '1', '3', '4', '5'])
post.save()
post.reload()
@@ -1321,7 +1224,7 @@ class FieldTest(MongoDBTestCase):
# aka 'del list[i:j]'
# aka 'operator.delitem(list, slice(i,j))'
reset_post()
del post.info[1:3]  # removes '1', '2'
self.assertEqual(post.info, ['0', '3', '4', '5'])
post.save()
post.reload()
@@ -1736,8 +1639,8 @@ class FieldTest(MongoDBTestCase):
e.save()
e2 = Simple.objects.get(id=e.id)
self.assertIsInstance(e2.mapping[0], StringSetting)
self.assertIsInstance(e2.mapping[1], IntegerSetting)
# Test querying
self.assertEqual(
@@ -1906,8 +1809,8 @@ class FieldTest(MongoDBTestCase):
e.save()
e2 = Simple.objects.get(id=e.id)
self.assertIsInstance(e2.mapping['somestring'], StringSetting)
self.assertIsInstance(e2.mapping['someint'], IntegerSetting)
# Test querying
self.assertEqual(
@@ -1950,6 +1853,48 @@ class FieldTest(MongoDBTestCase):
with self.assertRaises(ValueError):
e.update(set__mapping={"somestrings": ["foo", "bar", ]})
def test_dictfield_with_referencefield_complex_nesting_cases(self):
"""Ensure complex nesting inside DictField handles dereferencing of ReferenceField(dbref=True | False)"""
# Relates to Issue #1453
class Doc(Document):
s = StringField()
class Simple(Document):
mapping0 = DictField(ReferenceField(Doc, dbref=True))
mapping1 = DictField(ReferenceField(Doc, dbref=False))
mapping2 = DictField(ListField(ReferenceField(Doc, dbref=True)))
mapping3 = DictField(ListField(ReferenceField(Doc, dbref=False)))
mapping4 = DictField(DictField(field=ReferenceField(Doc, dbref=True)))
mapping5 = DictField(DictField(field=ReferenceField(Doc, dbref=False)))
mapping6 = DictField(ListField(DictField(ReferenceField(Doc, dbref=True))))
mapping7 = DictField(ListField(DictField(ReferenceField(Doc, dbref=False))))
mapping8 = DictField(ListField(DictField(ListField(ReferenceField(Doc, dbref=True)))))
mapping9 = DictField(ListField(DictField(ListField(ReferenceField(Doc, dbref=False)))))
Doc.drop_collection()
Simple.drop_collection()
d = Doc(s='aa').save()
e = Simple()
e.mapping0['someint'] = e.mapping1['someint'] = d
e.mapping2['someint'] = e.mapping3['someint'] = [d]
e.mapping4['someint'] = e.mapping5['someint'] = {'d': d}
e.mapping6['someint'] = e.mapping7['someint'] = [{'d': d}]
e.mapping8['someint'] = e.mapping9['someint'] = [{'d': [d]}]
e.save()
s = Simple.objects.first()
self.assertIsInstance(s.mapping0['someint'], Doc)
self.assertIsInstance(s.mapping1['someint'], Doc)
self.assertIsInstance(s.mapping2['someint'][0], Doc)
self.assertIsInstance(s.mapping3['someint'][0], Doc)
self.assertIsInstance(s.mapping4['someint']['d'], Doc)
self.assertIsInstance(s.mapping5['someint']['d'], Doc)
self.assertIsInstance(s.mapping6['someint'][0]['d'], Doc)
self.assertIsInstance(s.mapping7['someint'][0]['d'], Doc)
self.assertIsInstance(s.mapping8['someint'][0]['d'][0], Doc)
self.assertIsInstance(s.mapping9['someint'][0]['d'][0], Doc)
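The assertions above all reduce to one idea: dereferencing must recurse through arbitrary dict/list nesting until it hits a reference. A minimal, library-free sketch of that walk (the `'ref:<id>'` placeholder and `resolve` name are illustrative, not mongoengine's internals):

```python
def resolve(value, lookup):
    # Recursively replace reference placeholders ('ref:<id>') with documents,
    # descending through dicts and lists the way nested DictField/ListField
    # dereferencing must.
    if isinstance(value, dict):
        return {k: resolve(v, lookup) for k, v in value.items()}
    if isinstance(value, list):
        return [resolve(v, lookup) for v in value]
    if isinstance(value, str) and value.startswith('ref:'):
        return lookup[value[4:]]
    return value

docs = {'42': {'s': 'aa'}}
nested = {'someint': [{'d': ['ref:42']}]}
print(resolve(nested, docs))  # {'someint': [{'d': [{'s': 'aa'}]}]}
```

Each `mappingN` case in the test is just a different nesting depth fed through the same recursion.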
def test_mapfield(self):
"""Ensure that the MapField handles the declared type."""
class Simple(Document):
@@ -1991,8 +1936,8 @@ class FieldTest(MongoDBTestCase):
e.save()
e2 = Extensible.objects.get(id=e.id)
self.assertIsInstance(e2.mapping['somestring'], StringSetting)
self.assertIsInstance(e2.mapping['someint'], IntegerSetting)
with self.assertRaises(ValidationError):
e.mapping['someint'] = 123
@@ -2147,6 +2092,15 @@ class FieldTest(MongoDBTestCase):
]))
self.assertEqual(a.b.c.txt, 'hi')
def test_embedded_document_field_cant_reference_using_a_str_if_it_does_not_exist_yet(self):
raise SkipTest("Using a string reference in an EmbeddedDocumentField does not work if the class isn't registered yet")
class MyDoc2(Document):
emb = EmbeddedDocumentField('MyDoc')
class MyDoc(EmbeddedDocument):
name = StringField()
def test_embedded_document_validation(self):
"""Ensure that invalid embedded documents cannot be assigned to
embedded document fields.
@@ -2688,7 +2642,7 @@ class FieldTest(MongoDBTestCase):
bm = Bookmark.objects(bookmark_object=post_1).first()
self.assertEqual(bm.bookmark_object, post_1)
self.assertIsInstance(bm.bookmark_object, Post)
bm.bookmark_object = link_1
bm.save()
@@ -2696,7 +2650,7 @@ class FieldTest(MongoDBTestCase):
bm = Bookmark.objects(bookmark_object=link_1).first()
self.assertEqual(bm.bookmark_object, link_1)
self.assertIsInstance(bm.bookmark_object, Link)
def test_generic_reference_list(self):
"""Ensure that a ListField properly dereferences generic references.
@@ -2931,7 +2885,32 @@ class FieldTest(MongoDBTestCase):
doc = Doc.objects.get(ref=DBRef('doc', doc1.pk))
self.assertEqual(doc, doc2)
def test_generic_reference_is_not_tracked_in_parent_doc(self):
"""Ensure that modifications of related documents (through generic reference) don't influence
the owner changed fields (#1934)
"""
class Doc1(Document):
name = StringField()
class Doc2(Document):
ref = GenericReferenceField()
refs = ListField(GenericReferenceField())
Doc1.drop_collection()
Doc2.drop_collection()
doc1 = Doc1(name='garbage1').save()
doc11 = Doc1(name='garbage11').save()
doc2 = Doc2(ref=doc1, refs=[doc11]).save()
doc2.ref.name = 'garbage2'
self.assertEqual(doc2._get_changed_fields(), [])
doc2.refs[0].name = 'garbage3'
self.assertEqual(doc2._get_changed_fields(), [])
self.assertEqual(doc2._delta(), ({}, {}))
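The invariant guarded above (mutating a referenced document must not dirty its owner, #1934) can be sketched with a tiny per-document change tracker. This is an illustration of the idea only, not mongoengine's actual `_get_changed_fields` machinery:

```python
class TrackedDoc:
    # Minimal sketch: each document records only its OWN attribute
    # mutations, so writes that go through a reference dirty the
    # referenced document, never the owner.
    def __init__(self, **fields):
        object.__setattr__(self, '_changed', set())
        for name, value in fields.items():
            object.__setattr__(self, name, value)  # init bypasses tracking

    def __setattr__(self, name, value):
        object.__setattr__(self, name, value)
        self._changed.add(name)

ref = TrackedDoc(name='garbage1')
owner = TrackedDoc(ref=ref)
owner.ref.name = 'garbage2'  # mutates the referenced doc only
print(owner._changed)  # set() -- the owner itself stays clean
print(ref._changed)    # {'name'}
```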
def test_generic_reference_field(self):
"""Ensure we can search for a specific generic reference by """Ensure we can search for a specific generic reference by
providing its DBRef. providing its DBRef.
""" """
@ -2943,7 +2922,7 @@ class FieldTest(MongoDBTestCase):
doc1 = Doc.objects.create() doc1 = Doc.objects.create()
doc2 = Doc.objects.create(ref=doc1) doc2 = Doc.objects.create(ref=doc1)
self.assertTrue(isinstance(doc1.pk, ObjectId)) self.assertIsInstance(doc1.pk, ObjectId)
doc = Doc.objects.get(ref=doc1.pk) doc = Doc.objects.get(ref=doc1.pk)
self.assertEqual(doc, doc2) self.assertEqual(doc, doc2)
@ -2967,37 +2946,33 @@ class FieldTest(MongoDBTestCase):
self.assertEqual(MIME_TYPE, attachment_1.content_type) self.assertEqual(MIME_TYPE, attachment_1.content_type)
self.assertEqual(BLOB, six.binary_type(attachment_1.blob)) self.assertEqual(BLOB, six.binary_type(attachment_1.blob))
def test_binary_validation(self): def test_binary_validation_succeeds(self):
"""Ensure that invalid values cannot be assigned to binary fields. """Ensure that valid values can be assigned to binary fields.
""" """
class Attachment(Document):
blob = BinaryField()
class AttachmentRequired(Document): class AttachmentRequired(Document):
blob = BinaryField(required=True) blob = BinaryField(required=True)
class AttachmentSizeLimit(Document): class AttachmentSizeLimit(Document):
blob = BinaryField(max_bytes=4) blob = BinaryField(max_bytes=4)
Attachment.drop_collection()
AttachmentRequired.drop_collection()
AttachmentSizeLimit.drop_collection()
attachment = Attachment()
attachment.validate()
attachment.blob = 2
self.assertRaises(ValidationError, attachment.validate)
attachment_required = AttachmentRequired() attachment_required = AttachmentRequired()
self.assertRaises(ValidationError, attachment_required.validate) self.assertRaises(ValidationError, attachment_required.validate)
attachment_required.blob = Binary(six.b('\xe6\x00\xc4\xff\x07')) attachment_required.blob = Binary(six.b('\xe6\x00\xc4\xff\x07'))
attachment_required.validate() attachment_required.validate()
attachment_size_limit = AttachmentSizeLimit( _5_BYTES = six.b('\xe6\x00\xc4\xff\x07')
blob=six.b('\xe6\x00\xc4\xff\x07')) _4_BYTES = six.b('\xe6\x00\xc4\xff')
self.assertRaises(ValidationError, attachment_size_limit.validate) self.assertRaises(ValidationError, AttachmentSizeLimit(blob=_5_BYTES).validate)
attachment_size_limit.blob = six.b('\xe6\x00\xc4\xff') AttachmentSizeLimit(blob=_4_BYTES).validate()
attachment_size_limit.validate()
def test_binary_validation_fails(self):
"""Ensure that invalid values cannot be assigned to binary fields."""
class Attachment(Document):
blob = BinaryField()
for invalid_data in (2, u'Im_a_unicode', ['some_str']):
self.assertRaises(ValidationError, Attachment(blob=invalid_data).validate)
def test_binary_field_primary(self):
class Attachment(Document):
@@ -3546,13 +3521,13 @@ class FieldTest(MongoDBTestCase):
person.save()
person = Person.objects.first()
self.assertIsInstance(person.like, Car)
person.like = Dish(food="arroz", number=15)
person.save()
person = Person.objects.first()
self.assertIsInstance(person.like, Dish)
def test_generic_embedded_document_choices(self):
"""Ensure you can limit GenericEmbeddedDocument choices."""
@@ -3577,7 +3552,7 @@ class FieldTest(MongoDBTestCase):
person.save()
person = Person.objects.first()
self.assertIsInstance(person.like, Dish)
def test_generic_list_embedded_document_choices(self):
"""Ensure you can limit GenericEmbeddedDocument choices inside
@@ -3604,7 +3579,7 @@ class FieldTest(MongoDBTestCase):
person.save()
person = Person.objects.first()
self.assertIsInstance(person.likes[0], Dish)
def test_recursive_validation(self):
"""Ensure that a validation result to_dict is available."""
@@ -3630,18 +3605,17 @@ class FieldTest(MongoDBTestCase):
except ValidationError as error:
# ValidationError.errors property
self.assertTrue(hasattr(error, 'errors'))
self.assertIsInstance(error.errors, dict)
self.assertIn('comments', error.errors)
self.assertIn(1, error.errors['comments'])
self.assertIsInstance(error.errors['comments'][1]['content'], ValidationError)
# ValidationError.schema property
error_dict = error.to_dict()
self.assertIsInstance(error_dict, dict)
self.assertIn('comments', error_dict)
self.assertIn(1, error_dict['comments'])
self.assertIn('content', error_dict['comments'][1])
self.assertEqual(error_dict['comments'][1]['content'],
u'Field is required')
@@ -3757,7 +3731,7 @@ class FieldTest(MongoDBTestCase):
# Passes regex validation
user = User(email='me@example.com')
self.assertIsNone(user.validate())
def test_tuples_as_tuples(self):
"""Ensure that tuples remain tuples when they are inside
@@ -3784,10 +3758,10 @@ class FieldTest(MongoDBTestCase):
doc.items = tuples
doc.save()
x = TestDoc.objects().get()
self.assertIsNotNone(x)
self.assertEqual(len(x.items), 1)
self.assertIn(tuple(x.items[0]), tuples)
self.assertIn(x.items[0], tuples)
def test_dynamic_fields_class(self):
class Doc2(Document):
@@ -3859,7 +3833,7 @@ class FieldTest(MongoDBTestCase):
assert isinstance(doc.field, ToEmbedChild)
assert doc.field == to_embed_child
def test_dict_field_invalid_dict_value(self):
class DictFieldTest(Document):
dictionary = DictField(required=True)
@@ -3873,6 +3847,22 @@ class FieldTest(MongoDBTestCase):
test.dictionary  # Just access to test getter
self.assertRaises(ValidationError, test.validate)
def test_dict_field_raises_validation_error_if_wrongly_assign_embedded_doc(self):
class DictFieldTest(Document):
dictionary = DictField(required=True)
DictFieldTest.drop_collection()
class Embedded(EmbeddedDocument):
name = StringField()
embed = Embedded(name='garbage')
doc = DictFieldTest(dictionary=embed)
with self.assertRaises(ValidationError) as ctx_err:
doc.validate()
self.assertIn("'dictionary'", str(ctx_err.exception))
self.assertIn('Only dictionaries may be used in a DictField', str(ctx_err.exception))
def test_cls_field(self):
class Animal(Document):
meta = {'allow_inheritance': True}
@@ -3937,8 +3927,8 @@ class FieldTest(MongoDBTestCase):
doc = TestLongFieldConsideredAsInt64(some_long=42).save()
db = get_db()
self.assertIsInstance(db.test_long_field_considered_as_int64.find()[0]['some_long'], Int64)
self.assertIsInstance(doc.some_long, six.integer_types)
class EmbeddedDocumentListFieldTestCase(MongoDBTestCase):
@@ -3971,6 +3961,28 @@ class EmbeddedDocumentListFieldTestCase(MongoDBTestCase):
self.Comments(author='user3', message='message1')
]).save()
def test_fails_upon_validate_if_provide_a_doc_instead_of_a_list_of_doc(self):
# Relates to Issue #1464
comment = self.Comments(author='John')
class Title(Document):
content = StringField()
# Test with an embeddedDocument instead of a list(embeddedDocument)
# It's an edge case but it used to fail with a vague error, making it difficult to troubleshoot it
post = self.BlogPost(comments=comment)
with self.assertRaises(ValidationError) as ctx_err:
post.validate()
self.assertIn("'comments'", str(ctx_err.exception))
self.assertIn('Only lists and tuples may be used in a list field', str(ctx_err.exception))
# Test with a Document
post = self.BlogPost(comments=Title(content='garbage'))
with self.assertRaises(ValidationError) as ctx_err:
post.validate()
self.assertIn("'comments'", str(ctx_err.exception))
self.assertIn('Only lists and tuples may be used in a list field', str(ctx_err.exception))
def test_no_keyword_filter(self):
"""
Tests the filter method of a List of Embedded Documents
@@ -4388,6 +4400,45 @@ class EmbeddedDocumentListFieldTestCase(MongoDBTestCase):
self.assertEqual(custom_data['a'], CustomData.c_field.custom_data['a'])
class TestEmbeddedDocumentField(MongoDBTestCase):
def test___init___(self):
class MyDoc(EmbeddedDocument):
name = StringField()
field = EmbeddedDocumentField(MyDoc)
self.assertEqual(field.document_type_obj, MyDoc)
field2 = EmbeddedDocumentField('MyDoc')
self.assertEqual(field2.document_type_obj, 'MyDoc')
def test___init___throw_error_if_document_type_is_not_EmbeddedDocument(self):
with self.assertRaises(ValidationError):
EmbeddedDocumentField(dict)
def test_document_type_throw_error_if_not_EmbeddedDocument_subclass(self):
class MyDoc(Document):
name = StringField()
emb = EmbeddedDocumentField('MyDoc')
with self.assertRaises(ValidationError) as ctx:
emb.document_type
self.assertIn('Invalid embedded document class provided to an EmbeddedDocumentField', str(ctx.exception))
def test_embedded_document_field_only_allow_subclasses_of_embedded_document(self):
# Relates to #1661
class MyDoc(Document):
name = StringField()
with self.assertRaises(ValidationError):
class MyFailingDoc(Document):
emb = EmbeddedDocumentField(MyDoc)
with self.assertRaises(ValidationError):
class MyFailingdoc2(Document):
emb = EmbeddedDocumentField('MyDoc')
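The tests above hinge on `EmbeddedDocumentField('MyDoc')` storing the *string* and resolving it lazily through a class registry when `document_type` is first accessed. A minimal sketch of that mechanism (the registry, metaclass, and `EmbeddedFieldSketch` names are illustrative, not mongoengine's internals):

```python
_registry = {}

class RegisteredMeta(type):
    # Register every subclass by name, like mongoengine's document
    # registry, so a field declared with a string can resolve it later.
    def __new__(mcs, name, bases, namespace):
        cls = super().__new__(mcs, name, bases, namespace)
        _registry[name] = cls
        return cls

class EmbeddedBase(metaclass=RegisteredMeta):
    pass

class EmbeddedFieldSketch:
    def __init__(self, document_type):
        self.document_type_obj = document_type  # a class or its name

    @property
    def document_type(self):
        # Resolve a string reference at access time; fail loudly if the
        # class was never registered (mirrors the ValidationError above).
        if isinstance(self.document_type_obj, str):
            try:
                self.document_type_obj = _registry[self.document_type_obj]
            except KeyError:
                raise LookupError('embedded document class not registered: %s'
                                  % self.document_type_obj)
        return self.document_type_obj

field = EmbeddedFieldSketch('MyDoc')  # declared before MyDoc exists

class MyDoc(EmbeddedBase):
    pass

print(field.document_type is MyDoc)  # True
```

The skipped test earlier in this file documents the case this sketch handles but the real field does not: a string reference declared before the class is registered.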
class CachedReferenceFieldTest(MongoDBTestCase):
def test_cached_reference_field_get_and_save(self):
@@ -4451,7 +4502,7 @@ class CachedReferenceFieldTest(MongoDBTestCase):
ocorrence = Ocorrence.objects(animal__tag='heavy').first()
self.assertEqual(ocorrence.person, "teste")
self.assertIsInstance(ocorrence.animal, Animal)
def test_cached_reference_field_decimal(self):
class PersonAuto(Document):
@@ -4768,7 +4819,7 @@ class CachedReferenceFieldTest(MongoDBTestCase):
animal__tag='heavy',
animal__owner__tp='u').first()
self.assertEqual(ocorrence.person, "teste")
self.assertIsInstance(ocorrence.animal, Animal)
def test_cached_reference_embedded_list_fields(self):
class Owner(EmbeddedDocument):
@@ -4822,7 +4873,7 @@ class CachedReferenceFieldTest(MongoDBTestCase):
animal__tag='heavy',
animal__owner__tags='cool').first()
self.assertEqual(ocorrence.person, "teste 2")
self.assertIsInstance(ocorrence.animal, Animal)
class LazyReferenceFieldTest(MongoDBTestCase):
@@ -5342,5 +5393,180 @@ class GenericLazyReferenceFieldTest(MongoDBTestCase):
check_fields_type(occ)
class ComplexDateTimeFieldTest(MongoDBTestCase):
def test_complexdatetime_storage(self):
"""Tests for complex datetime fields - which can handle
microseconds without rounding.
"""
class LogEntry(Document):
date = ComplexDateTimeField()
date_with_dots = ComplexDateTimeField(separator='.')
LogEntry.drop_collection()
# Post UTC - microseconds are rounded (down) nearest millisecond and
# dropped - with default datetimefields
d1 = datetime.datetime(1970, 1, 1, 0, 0, 1, 999)
log = LogEntry()
log.date = d1
log.save()
log.reload()
self.assertEqual(log.date, d1)
# Post UTC - microseconds are rounded (down) nearest millisecond - with
# default datetimefields
d1 = datetime.datetime(1970, 1, 1, 0, 0, 1, 9999)
log.date = d1
log.save()
log.reload()
self.assertEqual(log.date, d1)
# Pre UTC dates microseconds below 1000 are dropped - with default
# datetimefields
d1 = datetime.datetime(1969, 12, 31, 23, 59, 59, 999)
log.date = d1
log.save()
log.reload()
self.assertEqual(log.date, d1)
# Pre UTC microseconds above 1000 is wonky - with default datetimefields
# log.date has an invalid microsecond value so I can't construct
# a date to compare.
for i in range(1001, 3113, 33):
d1 = datetime.datetime(1969, 12, 31, 23, 59, 59, i)
log.date = d1
log.save()
log.reload()
self.assertEqual(log.date, d1)
log1 = LogEntry.objects.get(date=d1)
self.assertEqual(log, log1)
# Test string padding
microsecond = map(int, [math.pow(10, x) for x in range(6)])
mm = dd = hh = ii = ss = [1, 10]
for values in itertools.product([2014], mm, dd, hh, ii, ss, microsecond):
stored = LogEntry(date=datetime.datetime(*values)).to_mongo()['date']
self.assertTrue(re.match('^\d{4},\d{2},\d{2},\d{2},\d{2},\d{2},\d{6}$', stored) is not None)
# Test separator
stored = LogEntry(date_with_dots=datetime.datetime(2014, 1, 1)).to_mongo()['date_with_dots']
self.assertTrue(re.match('^\d{4}.\d{2}.\d{2}.\d{2}.\d{2}.\d{2}.\d{6}$', stored) is not None)
def test_complexdatetime_usage(self):
"""Tests for complex datetime fields - which can handle
microseconds without rounding.
"""
class LogEntry(Document):
date = ComplexDateTimeField()
LogEntry.drop_collection()
d1 = datetime.datetime(1950, 1, 1, 0, 0, 1, 999)
log = LogEntry()
log.date = d1
log.save()
log1 = LogEntry.objects.get(date=d1)
self.assertEqual(log, log1)
# create extra 59 log entries for a total of 60
for i in range(1951, 2010):
d = datetime.datetime(i, 1, 1, 0, 0, 1, 999)
LogEntry(date=d).save()
self.assertEqual(LogEntry.objects.count(), 60)
# Test ordering
logs = LogEntry.objects.order_by("date")
i = 0
while i < 59:
self.assertTrue(logs[i].date <= logs[i + 1].date)
i += 1
logs = LogEntry.objects.order_by("-date")
i = 0
while i < 59:
self.assertTrue(logs[i].date >= logs[i + 1].date)
i += 1
# Test searching
logs = LogEntry.objects.filter(date__gte=datetime.datetime(1980, 1, 1))
self.assertEqual(logs.count(), 30)
logs = LogEntry.objects.filter(date__lte=datetime.datetime(1980, 1, 1))
self.assertEqual(logs.count(), 30)
logs = LogEntry.objects.filter(
date__lte=datetime.datetime(2011, 1, 1),
date__gte=datetime.datetime(2000, 1, 1),
)
self.assertEqual(logs.count(), 10)
LogEntry.drop_collection()
# Test microsecond-level ordering/filtering
for microsecond in (99, 999, 9999, 10000):
LogEntry(
date=datetime.datetime(2015, 1, 1, 0, 0, 0, microsecond)
).save()
logs = list(LogEntry.objects.order_by('date'))
for next_idx, log in enumerate(logs[:-1], start=1):
next_log = logs[next_idx]
self.assertTrue(log.date < next_log.date)
logs = list(LogEntry.objects.order_by('-date'))
for next_idx, log in enumerate(logs[:-1], start=1):
next_log = logs[next_idx]
self.assertTrue(log.date > next_log.date)
logs = LogEntry.objects.filter(
date__lte=datetime.datetime(2015, 1, 1, 0, 0, 0, 10000))
self.assertEqual(logs.count(), 4)
def test_no_default_value(self):
class Log(Document):
timestamp = ComplexDateTimeField()
Log.drop_collection()
log = Log()
self.assertIsNone(log.timestamp)
log.save()
fetched_log = Log.objects.with_id(log.id)
self.assertIsNone(fetched_log.timestamp)
def test_default_static_value(self):
NOW = datetime.datetime.utcnow()
class Log(Document):
timestamp = ComplexDateTimeField(default=NOW)
Log.drop_collection()
log = Log()
self.assertEqual(log.timestamp, NOW)
log.save()
fetched_log = Log.objects.with_id(log.id)
self.assertEqual(fetched_log.timestamp, NOW)
def test_default_callable(self):
NOW = datetime.datetime.utcnow()
class Log(Document):
timestamp = ComplexDateTimeField(default=datetime.datetime.utcnow)
Log.drop_collection()
log = Log()
self.assertGreaterEqual(log.timestamp, NOW)
log.save()
fetched_log = Log.objects.with_id(log.id)
self.assertGreaterEqual(fetched_log.timestamp, NOW)
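The padding and ordering assertions in `ComplexDateTimeFieldTest` all follow from one property: the field stores datetimes as zero-padded, separator-joined strings, so lexicographic order of the stored values matches chronological order down to the microsecond. A standalone sketch of that serialization (the `to_complex_string` helper is illustrative, not the library's API):

```python
import datetime

def to_complex_string(dt, separator=','):
    # Zero-pad every component (%f is microseconds padded to 6 digits)
    # so comparing the stored strings compares the datetimes.
    fmt = separator.join(['%Y', '%m', '%d', '%H', '%M', '%S', '%f'])
    return dt.strftime(fmt)

early = to_complex_string(datetime.datetime(2015, 1, 1, 0, 0, 0, 999))
late = to_complex_string(datetime.datetime(2015, 1, 1, 0, 0, 0, 10000))
print(early)         # 2015,01,01,00,00,00,000999
print(early < late)  # True -- string order matches datetime order
```

This is also why the regex in `test_complexdatetime_storage` expects exactly `\d{4}` followed by six `\d{2}`/`\d{6}` groups, and why pre-1970 microsecond values survive without the rounding a plain DateTimeField exhibits.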
if __name__ == '__main__':
unittest.main()
@@ -53,7 +53,7 @@ class FileTest(MongoDBTestCase):
putfile.save()
result = PutFile.objects.first()
self.assertEqual(putfile, result)
self.assertEqual("%s" % result.the_file, "<GridFSProxy: hello (%s)>" % result.the_file.grid_id)
self.assertEqual(result.the_file.read(), text)
self.assertEqual(result.the_file.content_type, content_type)
@@ -71,7 +71,7 @@ class FileTest(MongoDBTestCase):
putfile.save()
result = PutFile.objects.first()
self.assertEqual(putfile, result)
self.assertEqual(result.the_file.read(), text)
self.assertEqual(result.the_file.content_type, content_type)
result.the_file.delete()
@@ -96,7 +96,7 @@ class FileTest(MongoDBTestCase):
streamfile.save()
result = StreamFile.objects.first()
self.assertEqual(streamfile, result)
self.assertEqual(result.the_file.read(), text + more_text)
self.assertEqual(result.the_file.content_type, content_type)
result.the_file.seek(0)
@@ -132,7 +132,7 @@ class FileTest(MongoDBTestCase):
streamfile.save()
result = StreamFile.objects.first()
self.assertEqual(streamfile, result)
self.assertEqual(result.the_file.read(), text + more_text)
# self.assertEqual(result.the_file.content_type, content_type)
result.the_file.seek(0)
@@ -161,7 +161,7 @@ class FileTest(MongoDBTestCase):
setfile.save()
result = SetFile.objects.first()
self.assertEqual(setfile, result)
self.assertEqual(result.the_file.read(), text)
# Try replacing file with new one
@@ -169,7 +169,7 @@ class FileTest(MongoDBTestCase):
result.save()
result = SetFile.objects.first()
self.assertEqual(setfile, result)
self.assertEqual(result.the_file.read(), more_text)
result.the_file.delete()
@@ -231,8 +231,8 @@ class FileTest(MongoDBTestCase):
test_file_dupe = TestFile()
data = test_file_dupe.the_file.read()  # Should be None
self.assertNotEqual(test_file.name, test_file_dupe.name)
self.assertNotEqual(test_file.the_file.read(), data)
TestFile.drop_collection()
@@ -291,7 +291,7 @@ class FileTest(MongoDBTestCase):
the_file = FileField()
test_file = TestFile()
self.assertNotIn(test_file.the_file, [{"test": 1}])
def test_file_disk_space(self):
""" Test disk space usage when we delete/replace a file """
@@ -298,9 +298,9 @@ class GeoFieldTest(unittest.TestCase):
polygon = PolygonField()
geo_indicies = Event._geo_indices()
self.assertIn({'fields': [('line', '2dsphere')]}, geo_indicies)
self.assertIn({'fields': [('polygon', '2dsphere')]}, geo_indicies)
self.assertIn({'fields': [('point', '2dsphere')]}, geo_indicies)
def test_indexes_2dsphere_embedded(self):
"""Ensure that indexes are created automatically for GeoPointFields.
@@ -316,9 +316,9 @@ class GeoFieldTest(unittest.TestCase):
venue = EmbeddedDocumentField(Venue)
geo_indicies = Event._geo_indices()
self.assertIn({'fields': [('venue.line', '2dsphere')]}, geo_indicies)
self.assertIn({'fields': [('venue.polygon', '2dsphere')]}, geo_indicies)
self.assertIn({'fields': [('venue.point', '2dsphere')]}, geo_indicies)
def test_geo_indexes_recursion(self):
@@ -335,9 +335,9 @@ class GeoFieldTest(unittest.TestCase):
Parent(name='Berlin').save()
info = Parent._get_collection().index_information()
self.assertNotIn('location_2d', info)
info = Location._get_collection().index_information()
self.assertIn('location_2d', info)
self.assertEqual(len(Parent._geo_indices()), 0)
self.assertEqual(len(Location._geo_indices()), 1)
@@ -1,6 +1,6 @@
from .transform import *
from .field_list import *
from .queryset import *
from .visitor import *
from .geo import *
from .modify import *
@@ -181,7 +181,7 @@ class OnlyExcludeAllTest(unittest.TestCase):
employee.save()
obj = self.Person.objects(id=employee.id).only('age').get()
self.assertIsInstance(obj, Employee)
# Check field names are looked up properly
obj = Employee.objects(id=employee.id).only('salary').get()
@@ -3,7 +3,7 @@ import unittest
from mongoengine import *
from tests.utils import MongoDBTestCase, requires_mongodb_gte_3
__all__ = ("GeoQueriesTest",)
@@ -72,7 +72,7 @@ class GeoQueriesTest(MongoDBTestCase):
# $minDistance was added in MongoDB v2.6, but continued being buggy
# until v3.0; skip for older versions
@requires_mongodb_gte_3
def test_near_and_min_distance(self):
"""Ensure the "min_distance" operator works alongside the "near"
operator.
@@ -95,9 +95,9 @@ class GeoQueriesTest(MongoDBTestCase):
location__within_distance=point_and_distance)
self.assertEqual(events.count(), 2)
events = list(events)
self.assertNotIn(event2, events)
self.assertIn(event1, events)
self.assertIn(event3, events)
# find events within 10 degrees of san francisco
point_and_distance = [[-122.415579, 37.7566023], 10]
@@ -245,7 +245,7 @@ class GeoQueriesTest(MongoDBTestCase):
# $minDistance was added in MongoDB v2.6, but continued being buggy
# until v3.0; skip for older versions
@requires_mongodb_gte_3
def test_2dsphere_near_and_min_max_distance(self):
"""Ensure "min_distance" and "max_distance" operators work well
together with the "near" operator in a 2dsphere index.
@@ -285,9 +285,9 @@ class GeoQueriesTest(MongoDBTestCase):
location__geo_within_center=point_and_distance)
self.assertEqual(events.count(), 2)
events = list(events)
self.assertNotIn(event2, events)
self.assertIn(event1, events)
self.assertIn(event3, events)
def _test_embedded(self, point_field_class):
"""Helper test method ensuring given point field class works
@@ -329,7 +329,7 @@ class GeoQueriesTest(MongoDBTestCase):
self._test_embedded(point_field_class=PointField)
# Needs MongoDB > 2.6.4 https://jira.mongodb.org/browse/SERVER-14039
@requires_mongodb_gte_3
def test_spherical_geospatial_operators(self):
"""Ensure that spherical geospatial queries are working.""" """Ensure that spherical geospatial queries are working."""
class Point(Document): class Point(Document):


@ -2,7 +2,7 @@ import unittest
from mongoengine import connect, Document, IntField, StringField, ListField from mongoengine import connect, Document, IntField, StringField, ListField
from tests.utils import needs_mongodb_v26 from tests.utils import requires_mongodb_gte_26
__all__ = ("FindAndModifyTest",) __all__ = ("FindAndModifyTest",)
@ -96,7 +96,7 @@ class FindAndModifyTest(unittest.TestCase):
self.assertEqual(old_doc.to_mongo(), {"_id": 1}) self.assertEqual(old_doc.to_mongo(), {"_id": 1})
self.assertDbEqual([{"_id": 0, "value": 0}, {"_id": 1, "value": -1}]) self.assertDbEqual([{"_id": 0, "value": 0}, {"_id": 1, "value": -1}])
@needs_mongodb_v26 @requires_mongodb_gte_26
def test_modify_with_push(self): def test_modify_with_push(self):
class BlogPost(Document): class BlogPost(Document):
tags = ListField(StringField()) tags = ListField(StringField())

File diff suppressed because it is too large


@ -48,15 +48,15 @@ class TransformTest(unittest.TestCase):
for k, v in (("set", "$set"), ("set_on_insert", "$setOnInsert"), ("push", "$push")): for k, v in (("set", "$set"), ("set_on_insert", "$setOnInsert"), ("push", "$push")):
update = transform.update(DicDoc, **{"%s__dictField__test" % k: doc}) update = transform.update(DicDoc, **{"%s__dictField__test" % k: doc})
self.assertTrue(isinstance(update[v]["dictField.test"], dict)) self.assertIsInstance(update[v]["dictField.test"], dict)
# Update special cases # Update special cases
update = transform.update(DicDoc, unset__dictField__test=doc) update = transform.update(DicDoc, unset__dictField__test=doc)
self.assertEqual(update["$unset"]["dictField.test"], 1) self.assertEqual(update["$unset"]["dictField.test"], 1)
update = transform.update(DicDoc, pull__dictField__test=doc) update = transform.update(DicDoc, pull__dictField__test=doc)
self.assertTrue(isinstance(update["$pull"]["dictField"]["test"], dict)) self.assertIsInstance(update["$pull"]["dictField"]["test"], dict)
update = transform.update(LisDoc, pull__foo__in=['a']) update = transform.update(LisDoc, pull__foo__in=['a'])
self.assertEqual(update, {'$pull': {'foo': {'$in': ['a']}}}) self.assertEqual(update, {'$pull': {'foo': {'$in': ['a']}}})
@ -88,17 +88,15 @@ class TransformTest(unittest.TestCase):
post = BlogPost(**data) post = BlogPost(**data)
post.save() post.save()
self.assertTrue('postTitle' in BlogPost.objects(title=data['title'])._query) self.assertIn('postTitle', BlogPost.objects(title=data['title'])._query)
self.assertFalse('title' in BlogPost.objects(title=data['title'])._query) self.assertFalse('title' in BlogPost.objects(title=data['title'])._query)
self.assertEqual(BlogPost.objects(title=data['title']).count(), 1) self.assertEqual(BlogPost.objects(title=data['title']).count(), 1)
self.assertTrue('_id' in BlogPost.objects(pk=post.id)._query) self.assertIn('_id', BlogPost.objects(pk=post.id)._query)
self.assertEqual(BlogPost.objects(pk=post.id).count(), 1) self.assertEqual(BlogPost.objects(pk=post.id).count(), 1)
self.assertTrue('postComments.commentContent' in BlogPost.objects(comments__content='test')._query) self.assertIn('postComments.commentContent', BlogPost.objects(comments__content='test')._query)
self.assertEqual(BlogPost.objects(comments__content='test').count(), 1) self.assertEqual(BlogPost.objects(comments__content='test').count(), 1)
BlogPost.drop_collection() BlogPost.drop_collection()
@ -116,8 +114,8 @@ class TransformTest(unittest.TestCase):
post = BlogPost(**data) post = BlogPost(**data)
post.save() post.save()
self.assertTrue('_id' in BlogPost.objects(pk=data['title'])._query) self.assertIn('_id', BlogPost.objects(pk=data['title'])._query)
self.assertTrue('_id' in BlogPost.objects(title=data['title'])._query) self.assertIn('_id', BlogPost.objects(title=data['title'])._query)
self.assertEqual(BlogPost.objects(pk=data['title']).count(), 1) self.assertEqual(BlogPost.objects(pk=data['title']).count(), 1)
BlogPost.drop_collection() BlogPost.drop_collection()
@ -260,31 +258,31 @@ class TransformTest(unittest.TestCase):
events = Event.objects(location__within=box) events = Event.objects(location__within=box)
with self.assertRaises(InvalidQueryError): with self.assertRaises(InvalidQueryError):
events.count() events.count()
def test_update_pull_for_list_fields(self): def test_update_pull_for_list_fields(self):
""" """
Test added to check pull operation in update for Test added to check pull operation in update for
EmbeddedDocumentListField which is inside an EmbeddedDocumentField EmbeddedDocumentListField which is inside an EmbeddedDocumentField
""" """
class Word(EmbeddedDocument): class Word(EmbeddedDocument):
word = StringField() word = StringField()
index = IntField() index = IntField()
class SubDoc(EmbeddedDocument): class SubDoc(EmbeddedDocument):
heading = ListField(StringField()) heading = ListField(StringField())
text = EmbeddedDocumentListField(Word) text = EmbeddedDocumentListField(Word)
class MainDoc(Document): class MainDoc(Document):
title = StringField() title = StringField()
content = EmbeddedDocumentField(SubDoc) content = EmbeddedDocumentField(SubDoc)
word = Word(word='abc', index=1) word = Word(word='abc', index=1)
update = transform.update(MainDoc, pull__content__text=word) update = transform.update(MainDoc, pull__content__text=word)
self.assertEqual(update, {'$pull': {'content.text': SON([('word', u'abc'), ('index', 1)])}}) self.assertEqual(update, {'$pull': {'content.text': SON([('word', u'abc'), ('index', 1)])}})
update = transform.update(MainDoc, pull__content__heading='xyz') update = transform.update(MainDoc, pull__content__heading='xyz')
self.assertEqual(update, {'$pull': {'content.heading': 'xyz'}}) self.assertEqual(update, {'$pull': {'content.heading': 'xyz'}})
if __name__ == '__main__': if __name__ == '__main__':
unittest.main() unittest.main()


@ -196,7 +196,7 @@ class QTest(unittest.TestCase):
test2 = test.clone() test2 = test.clone()
self.assertEqual(test2.count(), 3) self.assertEqual(test2.count(), 3)
self.assertFalse(test2 == test) self.assertNotEqual(test2, test)
test3 = test2.filter(x=6) test3 = test2.filter(x=6)
self.assertEqual(test3.count(), 1) self.assertEqual(test3.count(), 1)
@ -296,6 +296,18 @@ class QTest(unittest.TestCase):
obj = self.Person.objects(Q(name__not=re.compile('^Gui'))).first() obj = self.Person.objects(Q(name__not=re.compile('^Gui'))).first()
self.assertEqual(obj, None) self.assertEqual(obj, None)
def test_q_repr(self):
self.assertEqual(repr(Q()), 'Q(**{})')
self.assertEqual(repr(Q(name='test')), "Q(**{'name': 'test'})")
self.assertEqual(
repr(Q(name='test') & Q(age__gte=18)),
"(Q(**{'name': 'test'}) & Q(**{'age__gte': 18}))")
self.assertEqual(
repr(Q(name='test') | Q(age__gte=18)),
"(Q(**{'name': 'test'}) | Q(**{'age__gte': 18}))")
def test_q_lists(self): def test_q_lists(self):
"""Ensure that Q objects query ListFields correctly. """Ensure that Q objects query ListFields correctly.
""" """


@ -39,15 +39,15 @@ class ConnectionTest(unittest.TestCase):
connect('mongoenginetest') connect('mongoenginetest')
conn = get_connection() conn = get_connection()
self.assertTrue(isinstance(conn, pymongo.mongo_client.MongoClient)) self.assertIsInstance(conn, pymongo.mongo_client.MongoClient)
db = get_db() db = get_db()
self.assertTrue(isinstance(db, pymongo.database.Database)) self.assertIsInstance(db, pymongo.database.Database)
self.assertEqual(db.name, 'mongoenginetest') self.assertEqual(db.name, 'mongoenginetest')
connect('mongoenginetest2', alias='testdb') connect('mongoenginetest2', alias='testdb')
conn = get_connection('testdb') conn = get_connection('testdb')
self.assertTrue(isinstance(conn, pymongo.mongo_client.MongoClient)) self.assertIsInstance(conn, pymongo.mongo_client.MongoClient)
def test_connect_in_mocking(self): def test_connect_in_mocking(self):
"""Ensure that the connect() method works properly in mocking. """Ensure that the connect() method works properly in mocking.
@ -59,31 +59,31 @@ class ConnectionTest(unittest.TestCase):
connect('mongoenginetest', host='mongomock://localhost') connect('mongoenginetest', host='mongomock://localhost')
conn = get_connection() conn = get_connection()
self.assertTrue(isinstance(conn, mongomock.MongoClient)) self.assertIsInstance(conn, mongomock.MongoClient)
connect('mongoenginetest2', host='mongomock://localhost', alias='testdb2') connect('mongoenginetest2', host='mongomock://localhost', alias='testdb2')
conn = get_connection('testdb2') conn = get_connection('testdb2')
self.assertTrue(isinstance(conn, mongomock.MongoClient)) self.assertIsInstance(conn, mongomock.MongoClient)
connect('mongoenginetest3', host='mongodb://localhost', is_mock=True, alias='testdb3') connect('mongoenginetest3', host='mongodb://localhost', is_mock=True, alias='testdb3')
conn = get_connection('testdb3') conn = get_connection('testdb3')
self.assertTrue(isinstance(conn, mongomock.MongoClient)) self.assertIsInstance(conn, mongomock.MongoClient)
connect('mongoenginetest4', is_mock=True, alias='testdb4') connect('mongoenginetest4', is_mock=True, alias='testdb4')
conn = get_connection('testdb4') conn = get_connection('testdb4')
self.assertTrue(isinstance(conn, mongomock.MongoClient)) self.assertIsInstance(conn, mongomock.MongoClient)
connect(host='mongodb://localhost:27017/mongoenginetest5', is_mock=True, alias='testdb5') connect(host='mongodb://localhost:27017/mongoenginetest5', is_mock=True, alias='testdb5')
conn = get_connection('testdb5') conn = get_connection('testdb5')
self.assertTrue(isinstance(conn, mongomock.MongoClient)) self.assertIsInstance(conn, mongomock.MongoClient)
connect(host='mongomock://localhost:27017/mongoenginetest6', alias='testdb6') connect(host='mongomock://localhost:27017/mongoenginetest6', alias='testdb6')
conn = get_connection('testdb6') conn = get_connection('testdb6')
self.assertTrue(isinstance(conn, mongomock.MongoClient)) self.assertIsInstance(conn, mongomock.MongoClient)
connect(host='mongomock://localhost:27017/mongoenginetest7', is_mock=True, alias='testdb7') connect(host='mongomock://localhost:27017/mongoenginetest7', is_mock=True, alias='testdb7')
conn = get_connection('testdb7') conn = get_connection('testdb7')
self.assertTrue(isinstance(conn, mongomock.MongoClient)) self.assertIsInstance(conn, mongomock.MongoClient)
def test_connect_with_host_list(self): def test_connect_with_host_list(self):
"""Ensure that the connect() method works when host is a list """Ensure that the connect() method works when host is a list
@ -97,27 +97,27 @@ class ConnectionTest(unittest.TestCase):
connect(host=['mongomock://localhost']) connect(host=['mongomock://localhost'])
conn = get_connection() conn = get_connection()
self.assertTrue(isinstance(conn, mongomock.MongoClient)) self.assertIsInstance(conn, mongomock.MongoClient)
connect(host=['mongodb://localhost'], is_mock=True, alias='testdb2') connect(host=['mongodb://localhost'], is_mock=True, alias='testdb2')
conn = get_connection('testdb2') conn = get_connection('testdb2')
self.assertTrue(isinstance(conn, mongomock.MongoClient)) self.assertIsInstance(conn, mongomock.MongoClient)
connect(host=['localhost'], is_mock=True, alias='testdb3') connect(host=['localhost'], is_mock=True, alias='testdb3')
conn = get_connection('testdb3') conn = get_connection('testdb3')
self.assertTrue(isinstance(conn, mongomock.MongoClient)) self.assertIsInstance(conn, mongomock.MongoClient)
connect(host=['mongomock://localhost:27017', 'mongomock://localhost:27018'], alias='testdb4') connect(host=['mongomock://localhost:27017', 'mongomock://localhost:27018'], alias='testdb4')
conn = get_connection('testdb4') conn = get_connection('testdb4')
self.assertTrue(isinstance(conn, mongomock.MongoClient)) self.assertIsInstance(conn, mongomock.MongoClient)
connect(host=['mongodb://localhost:27017', 'mongodb://localhost:27018'], is_mock=True, alias='testdb5') connect(host=['mongodb://localhost:27017', 'mongodb://localhost:27018'], is_mock=True, alias='testdb5')
conn = get_connection('testdb5') conn = get_connection('testdb5')
self.assertTrue(isinstance(conn, mongomock.MongoClient)) self.assertIsInstance(conn, mongomock.MongoClient)
connect(host=['localhost:27017', 'localhost:27018'], is_mock=True, alias='testdb6') connect(host=['localhost:27017', 'localhost:27018'], is_mock=True, alias='testdb6')
conn = get_connection('testdb6') conn = get_connection('testdb6')
self.assertTrue(isinstance(conn, mongomock.MongoClient)) self.assertIsInstance(conn, mongomock.MongoClient)
def test_disconnect(self): def test_disconnect(self):
"""Ensure that the disconnect() method works properly """Ensure that the disconnect() method works properly
@ -163,10 +163,10 @@ class ConnectionTest(unittest.TestCase):
connect("testdb_uri", host='mongodb://username:password@localhost/mongoenginetest') connect("testdb_uri", host='mongodb://username:password@localhost/mongoenginetest')
conn = get_connection() conn = get_connection()
self.assertTrue(isinstance(conn, pymongo.mongo_client.MongoClient)) self.assertIsInstance(conn, pymongo.mongo_client.MongoClient)
db = get_db() db = get_db()
self.assertTrue(isinstance(db, pymongo.database.Database)) self.assertIsInstance(db, pymongo.database.Database)
self.assertEqual(db.name, 'mongoenginetest') self.assertEqual(db.name, 'mongoenginetest')
c.admin.system.users.remove({}) c.admin.system.users.remove({})
@ -179,10 +179,10 @@ class ConnectionTest(unittest.TestCase):
connect("mongoenginetest", host='mongodb://localhost/') connect("mongoenginetest", host='mongodb://localhost/')
conn = get_connection() conn = get_connection()
self.assertTrue(isinstance(conn, pymongo.mongo_client.MongoClient)) self.assertIsInstance(conn, pymongo.mongo_client.MongoClient)
db = get_db() db = get_db()
self.assertTrue(isinstance(db, pymongo.database.Database)) self.assertIsInstance(db, pymongo.database.Database)
self.assertEqual(db.name, 'mongoenginetest') self.assertEqual(db.name, 'mongoenginetest')
def test_connect_uri_default_db(self): def test_connect_uri_default_db(self):
@ -192,10 +192,10 @@ class ConnectionTest(unittest.TestCase):
connect(host='mongodb://localhost/') connect(host='mongodb://localhost/')
conn = get_connection() conn = get_connection()
self.assertTrue(isinstance(conn, pymongo.mongo_client.MongoClient)) self.assertIsInstance(conn, pymongo.mongo_client.MongoClient)
db = get_db() db = get_db()
self.assertTrue(isinstance(db, pymongo.database.Database)) self.assertIsInstance(db, pymongo.database.Database)
self.assertEqual(db.name, 'test') self.assertEqual(db.name, 'test')
def test_uri_without_credentials_doesnt_override_conn_settings(self): def test_uri_without_credentials_doesnt_override_conn_settings(self):
@ -242,7 +242,7 @@ class ConnectionTest(unittest.TestCase):
'mongoenginetest?authSource=admin') 'mongoenginetest?authSource=admin')
) )
db = get_db('test2') db = get_db('test2')
self.assertTrue(isinstance(db, pymongo.database.Database)) self.assertIsInstance(db, pymongo.database.Database)
self.assertEqual(db.name, 'mongoenginetest') self.assertEqual(db.name, 'mongoenginetest')
# Clear all users # Clear all users
@ -255,10 +255,10 @@ class ConnectionTest(unittest.TestCase):
self.assertRaises(MongoEngineConnectionError, get_connection) self.assertRaises(MongoEngineConnectionError, get_connection)
conn = get_connection('testdb') conn = get_connection('testdb')
self.assertTrue(isinstance(conn, pymongo.mongo_client.MongoClient)) self.assertIsInstance(conn, pymongo.mongo_client.MongoClient)
db = get_db('testdb') db = get_db('testdb')
self.assertTrue(isinstance(db, pymongo.database.Database)) self.assertIsInstance(db, pymongo.database.Database)
self.assertEqual(db.name, 'mongoenginetest2') self.assertEqual(db.name, 'mongoenginetest2')
def test_register_connection_defaults(self): def test_register_connection_defaults(self):
@ -267,7 +267,7 @@ class ConnectionTest(unittest.TestCase):
register_connection('testdb', 'mongoenginetest', host=None, port=None) register_connection('testdb', 'mongoenginetest', host=None, port=None)
conn = get_connection('testdb') conn = get_connection('testdb')
self.assertTrue(isinstance(conn, pymongo.mongo_client.MongoClient)) self.assertIsInstance(conn, pymongo.mongo_client.MongoClient)
def test_connection_kwargs(self): def test_connection_kwargs(self):
"""Ensure that connection kwargs get passed to pymongo.""" """Ensure that connection kwargs get passed to pymongo."""
@ -326,7 +326,7 @@ class ConnectionTest(unittest.TestCase):
if IS_PYMONGO_3: if IS_PYMONGO_3:
c = connect(host='mongodb://localhost/test?replicaSet=local-rs') c = connect(host='mongodb://localhost/test?replicaSet=local-rs')
db = get_db() db = get_db()
self.assertTrue(isinstance(db, pymongo.database.Database)) self.assertIsInstance(db, pymongo.database.Database)
self.assertEqual(db.name, 'test') self.assertEqual(db.name, 'test')
else: else:
# PyMongo < v3.x raises an exception: # PyMongo < v3.x raises an exception:
@ -343,7 +343,7 @@ class ConnectionTest(unittest.TestCase):
self.assertEqual(c._MongoClient__options.replica_set_name, self.assertEqual(c._MongoClient__options.replica_set_name,
'local-rs') 'local-rs')
db = get_db() db = get_db()
self.assertTrue(isinstance(db, pymongo.database.Database)) self.assertIsInstance(db, pymongo.database.Database)
self.assertEqual(db.name, 'test') self.assertEqual(db.name, 'test')
else: else:
# PyMongo < v3.x raises an exception: # PyMongo < v3.x raises an exception:
@ -364,6 +364,12 @@ class ConnectionTest(unittest.TestCase):
date_doc = DateDoc.objects.first() date_doc = DateDoc.objects.first()
self.assertEqual(d, date_doc.the_date) self.assertEqual(d, date_doc.the_date)
def test_read_preference_from_parse(self):
if IS_PYMONGO_3:
from pymongo import ReadPreference
conn = connect(host="mongodb://a1.vpc,a2.vpc,a3.vpc/prod?readPreference=secondaryPreferred")
self.assertEqual(conn.read_preference, ReadPreference.SECONDARY_PREFERRED)
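The new test relies on PyMongo extracting the readPreference option from the connection URI. As a rough standalone illustration of that option parsing (a hypothetical helper built on the standard library, not PyMongo's own code):

```python
from urllib.parse import urlparse, parse_qs

def read_preference_from_uri(uri):
    # Hypothetical helper for illustration only; PyMongo performs this
    # parsing itself when handed a mongodb:// URI.
    opts = parse_qs(urlparse(uri).query)
    # parse_qs maps each option name to a list of values; take the first
    values = opts.get('readPreference')
    return values[0] if values else None

uri = 'mongodb://a1.vpc,a2.vpc,a3.vpc/prod?readPreference=secondaryPreferred'
```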
def test_multiple_connection_settings(self): def test_multiple_connection_settings(self):
connect('mongoenginetest', alias='t1', host="localhost") connect('mongoenginetest', alias='t1', host="localhost")
@ -371,8 +377,8 @@ class ConnectionTest(unittest.TestCase):
mongo_connections = mongoengine.connection._connections mongo_connections = mongoengine.connection._connections
self.assertEqual(len(mongo_connections.items()), 2) self.assertEqual(len(mongo_connections.items()), 2)
self.assertTrue('t1' in mongo_connections.keys()) self.assertIn('t1', mongo_connections.keys())
self.assertTrue('t2' in mongo_connections.keys()) self.assertIn('t2', mongo_connections.keys())
if not IS_PYMONGO_3: if not IS_PYMONGO_3:
self.assertEqual(mongo_connections['t1'].host, 'localhost') self.assertEqual(mongo_connections['t1'].host, 'localhost')
self.assertEqual(mongo_connections['t2'].host, '127.0.0.1') self.assertEqual(mongo_connections['t2'].host, '127.0.0.1')


@ -89,15 +89,15 @@ class ContextManagersTest(unittest.TestCase):
with no_dereference(Group) as Group: with no_dereference(Group) as Group:
group = Group.objects.first() group = Group.objects.first()
self.assertTrue(all([not isinstance(m, User) for m in group.members]))
for m in group.members:
self.assertNotIsInstance(m, User)
self.assertFalse(isinstance(group.ref, User)) self.assertNotIsInstance(group.ref, User)
self.assertFalse(isinstance(group.generic, User)) self.assertNotIsInstance(group.generic, User)
self.assertTrue(all([isinstance(m, User) for m in group.members]))
for m in group.members:
self.assertIsInstance(m, User)
self.assertTrue(isinstance(group.ref, User)) self.assertIsInstance(group.ref, User)
self.assertTrue(isinstance(group.generic, User)) self.assertIsInstance(group.generic, User)
def test_no_dereference_context_manager_dbref(self): def test_no_dereference_context_manager_dbref(self):
"""Ensure that DBRef items in ListFields aren't dereferenced. """Ensure that DBRef items in ListFields aren't dereferenced.
@ -129,19 +129,17 @@ class ContextManagersTest(unittest.TestCase):
group = Group.objects.first() group = Group.objects.first()
self.assertTrue(all([not isinstance(m, User) self.assertTrue(all([not isinstance(m, User)
for m in group.members])) for m in group.members]))
self.assertFalse(isinstance(group.ref, User)) self.assertNotIsInstance(group.ref, User)
self.assertFalse(isinstance(group.generic, User)) self.assertNotIsInstance(group.generic, User)
self.assertTrue(all([isinstance(m, User) self.assertTrue(all([isinstance(m, User)
for m in group.members])) for m in group.members]))
self.assertTrue(isinstance(group.ref, User)) self.assertIsInstance(group.ref, User)
self.assertTrue(isinstance(group.generic, User)) self.assertIsInstance(group.generic, User)
def test_no_sub_classes(self): def test_no_sub_classes(self):
class A(Document): class A(Document):
x = IntField() x = IntField()
y = IntField()
meta = {'allow_inheritance': True} meta = {'allow_inheritance': True}
class B(A): class B(A):
@ -152,29 +150,29 @@ class ContextManagersTest(unittest.TestCase):
A.drop_collection() A.drop_collection()
A(x=10, y=20).save() A(x=10).save()
A(x=15, y=30).save() A(x=15).save()
B(x=20, y=40).save() B(x=20).save()
B(x=30, y=50).save() B(x=30).save()
C(x=40, y=60).save() C(x=40).save()
self.assertEqual(A.objects.count(), 5) self.assertEqual(A.objects.count(), 5)
self.assertEqual(B.objects.count(), 3) self.assertEqual(B.objects.count(), 3)
self.assertEqual(C.objects.count(), 1) self.assertEqual(C.objects.count(), 1)
with no_sub_classes(A) as A: with no_sub_classes(A):
self.assertEqual(A.objects.count(), 2) self.assertEqual(A.objects.count(), 2)
for obj in A.objects: for obj in A.objects:
self.assertEqual(obj.__class__, A) self.assertEqual(obj.__class__, A)
with no_sub_classes(B) as B: with no_sub_classes(B):
self.assertEqual(B.objects.count(), 2) self.assertEqual(B.objects.count(), 2)
for obj in B.objects: for obj in B.objects:
self.assertEqual(obj.__class__, B) self.assertEqual(obj.__class__, B)
with no_sub_classes(C) as C: with no_sub_classes(C):
self.assertEqual(C.objects.count(), 1) self.assertEqual(C.objects.count(), 1)
for obj in C.objects: for obj in C.objects:
@ -185,18 +183,124 @@ class ContextManagersTest(unittest.TestCase):
self.assertEqual(B.objects.count(), 3) self.assertEqual(B.objects.count(), 3)
self.assertEqual(C.objects.count(), 1) self.assertEqual(C.objects.count(), 1)
def test_no_sub_classes_modification_to_document_class_are_temporary(self):
class A(Document):
x = IntField()
meta = {'allow_inheritance': True}
class B(A):
z = IntField()
self.assertEqual(A._subclasses, ('A', 'A.B'))
with no_sub_classes(A):
self.assertEqual(A._subclasses, ('A',))
self.assertEqual(A._subclasses, ('A', 'A.B'))
self.assertEqual(B._subclasses, ('A.B',))
with no_sub_classes(B):
self.assertEqual(B._subclasses, ('A.B',))
self.assertEqual(B._subclasses, ('A.B',))
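The restore-on-exit behaviour these tests pin down follows the standard try/finally context-manager pattern; a self-contained sketch of that pattern (the name mirrors the mongoengine API, but this is not its actual implementation):

```python
from contextlib import contextmanager

@contextmanager
def no_sub_classes(cls):
    # Temporarily narrow cls._subclasses to the class itself, and always
    # restore the original value on exit, even if the body raises.
    saved = cls._subclasses
    cls._subclasses = (cls.__name__,)
    try:
        yield cls
    finally:
        cls._subclasses = saved

class A(object):
    _subclasses = ('A', 'A.B')

with no_sub_classes(A) as cls:
    assert cls._subclasses == ('A',)
```

Because the restore lives in a finally block, the modification is temporary and exceptions from the body propagate unchanged, which is exactly what the two tests above assert.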
def test_no_subclass_context_manager_does_not_swallow_exception(self):
class User(Document):
name = StringField()
with self.assertRaises(TypeError):
with no_sub_classes(User):
raise TypeError()
def test_query_counter_does_not_swallow_exception(self):
with self.assertRaises(TypeError):
with query_counter() as q:
raise TypeError()
def test_query_counter_temporarily_modifies_profiling_level(self):
connect('mongoenginetest')
db = get_db()
initial_profiling_level = db.profiling_level()
try:
NEW_LEVEL = 1
db.set_profiling_level(NEW_LEVEL)
self.assertEqual(db.profiling_level(), NEW_LEVEL)
with query_counter() as q:
self.assertEqual(db.profiling_level(), 2)
self.assertEqual(db.profiling_level(), NEW_LEVEL)
except Exception:
db.set_profiling_level(initial_profiling_level)  # Ensure the level is reset no matter the outcome of the test
raise
def test_query_counter(self): def test_query_counter(self):
connect('mongoenginetest') connect('mongoenginetest')
db = get_db() db = get_db()
db.test.find({})
collection = db.query_counter
collection.drop()
def issue_1_count_query():
collection.find({}).count()
def issue_1_insert_query():
collection.insert_one({'test': 'garbage'})
def issue_1_find_query():
collection.find_one()
counter = 0
with query_counter() as q:
self.assertEqual(q, counter)
self.assertEqual(q, counter) # Ensures previous count query did not get counted
for _ in range(10):
issue_1_insert_query()
counter += 1
self.assertEqual(q, counter)
for _ in range(4):
issue_1_find_query()
counter += 1
self.assertEqual(q, counter)
for _ in range(3):
issue_1_count_query()
counter += 1
self.assertEqual(q, counter)
def test_query_counter_counts_getmore_queries(self):
connect('mongoenginetest')
db = get_db()
collection = db.query_counter
collection.drop()
many_docs = [{'test': 'garbage %s' % i} for i in range(150)]
collection.insert_many(many_docs) # first batch of documents contains 101 documents
with query_counter() as q: with query_counter() as q:
self.assertEqual(0, q) self.assertEqual(q, 0)
list(collection.find())
self.assertEqual(q, 2) # 1st select + 1 getmore
for i in range(1, 51):
db.test.find({}).count()
self.assertEqual(50, q)
def test_query_counter_ignores_particular_queries(self):
connect('mongoenginetest')
db = get_db()
collection = db.query_counter
collection.insert_many([{'test': 'garbage %s' % i} for i in range(10)])
with query_counter() as q:
self.assertEqual(q, 0)
cursor = collection.find()
self.assertEqual(q, 0) # cursor wasn't opened yet
_ = next(cursor) # opens the cursor and fires the find query
self.assertEqual(q, 1)
cursor.close() # issues a `killcursors` query that is ignored by the context
self.assertEqual(q, 1)
_ = db.system.indexes.find_one() # queries on db.system.indexes are ignored as well
self.assertEqual(q, 1)
if __name__ == '__main__': if __name__ == '__main__':
unittest.main() unittest.main()
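Taken together, these tests pin down which operations the counter sees: finds and getmores are counted, killcursors and system-collection lookups are not. The bookkeeping can be pictured with a toy tally (the real query_counter instead reads MongoDB's profiler collection at profiling level 2 rather than being fed operation names):

```python
class ToyQueryCounter(object):
    """Toy model of the counting rules exercised above: every issued
    operation is tallied except a small ignore set such as killcursors."""

    IGNORED = {'killcursors'}

    def __init__(self):
        self._ops = []

    def record(self, op_name):
        self._ops.append(op_name)

    @property
    def count(self):
        return sum(1 for op in self._ops if op not in self.IGNORED)

q = ToyQueryCounter()
q.record('find')         # opening a cursor issues the find
q.record('getmore')      # batches beyond the first count as extra queries
q.record('killcursors')  # closing the cursor is ignored
```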


@ -200,8 +200,8 @@ class FieldTest(unittest.TestCase):
group = Group(author=user, members=[user]).save() group = Group(author=user, members=[user]).save()
raw_data = Group._get_collection().find_one() raw_data = Group._get_collection().find_one()
self.assertTrue(isinstance(raw_data['author'], DBRef)) self.assertIsInstance(raw_data['author'], DBRef)
self.assertTrue(isinstance(raw_data['members'][0], DBRef)) self.assertIsInstance(raw_data['members'][0], DBRef)
group = Group.objects.first() group = Group.objects.first()
self.assertEqual(group.author, user) self.assertEqual(group.author, user)
@ -224,8 +224,8 @@ class FieldTest(unittest.TestCase):
self.assertEqual(group.members, [user]) self.assertEqual(group.members, [user])
raw_data = Group._get_collection().find_one() raw_data = Group._get_collection().find_one()
self.assertTrue(isinstance(raw_data['author'], ObjectId)) self.assertIsInstance(raw_data['author'], ObjectId)
self.assertTrue(isinstance(raw_data['members'][0], ObjectId)) self.assertIsInstance(raw_data['members'][0], ObjectId)
def test_recursive_reference(self): def test_recursive_reference(self):
"""Ensure that ReferenceFields can reference their own documents. """Ensure that ReferenceFields can reference their own documents.
@ -469,7 +469,7 @@ class FieldTest(unittest.TestCase):
self.assertEqual(q, 4) self.assertEqual(q, 4)
for m in group_obj.members: for m in group_obj.members:
self.assertTrue('User' in m.__class__.__name__) self.assertIn('User', m.__class__.__name__)
# Document select_related # Document select_related
with query_counter() as q: with query_counter() as q:
@ -485,7 +485,7 @@ class FieldTest(unittest.TestCase):
self.assertEqual(q, 4) self.assertEqual(q, 4)
for m in group_obj.members: for m in group_obj.members:
self.assertTrue('User' in m.__class__.__name__) self.assertIn('User', m.__class__.__name__)
# Queryset select_related # Queryset select_related
with query_counter() as q: with query_counter() as q:
@ -502,7 +502,7 @@ class FieldTest(unittest.TestCase):
self.assertEqual(q, 4) self.assertEqual(q, 4)
for m in group_obj.members: for m in group_obj.members:
-                self.assertTrue('User' in m.__class__.__name__)
+                self.assertIn('User', m.__class__.__name__)

         UserA.drop_collection()
         UserB.drop_collection()

@@ -560,7 +560,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 4)

             for m in group_obj.members:
-                self.assertTrue('User' in m.__class__.__name__)
+                self.assertIn('User', m.__class__.__name__)

         # Document select_related
         with query_counter() as q:
@@ -576,7 +576,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 4)

             for m in group_obj.members:
-                self.assertTrue('User' in m.__class__.__name__)
+                self.assertIn('User', m.__class__.__name__)

         # Queryset select_related
         with query_counter() as q:
@@ -593,7 +593,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 4)

             for m in group_obj.members:
-                self.assertTrue('User' in m.__class__.__name__)
+                self.assertIn('User', m.__class__.__name__)

         UserA.drop_collection()
         UserB.drop_collection()
@@ -633,7 +633,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 2)

             for k, m in group_obj.members.iteritems():
-                self.assertTrue(isinstance(m, User))
+                self.assertIsInstance(m, User)

         # Document select_related
         with query_counter() as q:
@@ -646,7 +646,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 2)

             for k, m in group_obj.members.iteritems():
-                self.assertTrue(isinstance(m, User))
+                self.assertIsInstance(m, User)

         # Queryset select_related
         with query_counter() as q:
@@ -660,7 +660,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 2)

             for k, m in group_obj.members.iteritems():
-                self.assertTrue(isinstance(m, User))
+                self.assertIsInstance(m, User)

         User.drop_collection()
         Group.drop_collection()
@@ -715,7 +715,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 4)

             for k, m in group_obj.members.iteritems():
-                self.assertTrue('User' in m.__class__.__name__)
+                self.assertIn('User', m.__class__.__name__)

         # Document select_related
         with query_counter() as q:
@@ -731,7 +731,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 4)

             for k, m in group_obj.members.iteritems():
-                self.assertTrue('User' in m.__class__.__name__)
+                self.assertIn('User', m.__class__.__name__)

         # Queryset select_related
         with query_counter() as q:
@@ -748,7 +748,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 4)

             for k, m in group_obj.members.iteritems():
-                self.assertTrue('User' in m.__class__.__name__)
+                self.assertIn('User', m.__class__.__name__)

         Group.objects.delete()
         Group().save()
@@ -806,7 +806,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 2)

             for k, m in group_obj.members.iteritems():
-                self.assertTrue(isinstance(m, UserA))
+                self.assertIsInstance(m, UserA)

         # Document select_related
         with query_counter() as q:
@@ -822,7 +822,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 2)

             for k, m in group_obj.members.iteritems():
-                self.assertTrue(isinstance(m, UserA))
+                self.assertIsInstance(m, UserA)

         # Queryset select_related
         with query_counter() as q:
@@ -839,7 +839,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 2)

             for k, m in group_obj.members.iteritems():
-                self.assertTrue(isinstance(m, UserA))
+                self.assertIsInstance(m, UserA)

         UserA.drop_collection()
         Group.drop_collection()
@@ -894,7 +894,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 4)

             for k, m in group_obj.members.iteritems():
-                self.assertTrue('User' in m.__class__.__name__)
+                self.assertIn('User', m.__class__.__name__)

         # Document select_related
         with query_counter() as q:
@@ -910,7 +910,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 4)

             for k, m in group_obj.members.iteritems():
-                self.assertTrue('User' in m.__class__.__name__)
+                self.assertIn('User', m.__class__.__name__)

         # Queryset select_related
         with query_counter() as q:
@@ -927,7 +927,7 @@ class FieldTest(unittest.TestCase):
             self.assertEqual(q, 4)

             for k, m in group_obj.members.iteritems():
-                self.assertTrue('User' in m.__class__.__name__)
+                self.assertIn('User', m.__class__.__name__)

         Group.objects.delete()
         Group().save()
@@ -1029,7 +1029,6 @@ class FieldTest(unittest.TestCase):
         self.assertEqual(type(foo.bar), Bar)
         self.assertEqual(type(foo.baz), Baz)

-
     def test_document_reload_reference_integrity(self):
         """
         Ensure reloading a document with multiple similar id
@@ -1209,10 +1208,10 @@ class FieldTest(unittest.TestCase):
         # Can't use query_counter across databases - so test the _data object
         book = Book.objects.first()
-        self.assertFalse(isinstance(book._data['author'], User))
+        self.assertNotIsInstance(book._data['author'], User)

         book.select_related()
-        self.assertTrue(isinstance(book._data['author'], User))
+        self.assertIsInstance(book._data['author'], User)

     def test_non_ascii_pk(self):
         """

tests/test_utils.py (new file, 38 lines)

@@ -0,0 +1,38 @@
+import unittest
+import re
+
+from mongoengine.base.utils import LazyRegexCompiler
+
+signal_output = []
+
+
+class LazyRegexCompilerTest(unittest.TestCase):
+
+    def test_lazy_regex_compiler_verify_laziness_of_descriptor(self):
+        class UserEmail(object):
+            EMAIL_REGEX = LazyRegexCompiler('@', flags=32)
+
+        descriptor = UserEmail.__dict__['EMAIL_REGEX']
+        self.assertIsNone(descriptor._compiled_regex)
+
+        regex = UserEmail.EMAIL_REGEX
+        self.assertEqual(regex, re.compile('@', flags=32))
+        self.assertEqual(regex.search('user@domain.com').group(), '@')
+
+        user_email = UserEmail()
+        self.assertIs(user_email.EMAIL_REGEX, UserEmail.EMAIL_REGEX)
+
+    def test_lazy_regex_compiler_verify_cannot_set_descriptor_on_instance(self):
+        class UserEmail(object):
+            EMAIL_REGEX = LazyRegexCompiler('@')
+
+        user_email = UserEmail()
+        with self.assertRaises(AttributeError):
+            user_email.EMAIL_REGEX = re.compile('@')
+
+    def test_lazy_regex_compiler_verify_can_override_class_attr(self):
+        class UserEmail(object):
+            EMAIL_REGEX = LazyRegexCompiler('@')
+
+        UserEmail.EMAIL_REGEX = re.compile('cookies')
+        self.assertEqual(UserEmail.EMAIL_REGEX.search('Cake & cookies').group(), 'cookies')
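These tests pin down the descriptor contract of `LazyRegexCompiler`: no compilation before first access, assignment blocked on instances, and the class attribute still overridable. A minimal sketch of a descriptor satisfying that contract (an assumption for illustration; the real implementation lives in `mongoengine.base.utils` and may differ):

```python
import re


class LazyRegexCompiler(object):
    """Data descriptor that defers re.compile() until first attribute access."""

    def __init__(self, pattern, flags=0):
        self._pattern = pattern
        self._flags = flags
        self._compiled_regex = None  # stays None until first access (laziness)

    def __get__(self, instance, owner):
        # Compile once, on demand, then reuse the cached pattern object.
        if self._compiled_regex is None:
            self._compiled_regex = re.compile(self._pattern, self._flags)
        return self._compiled_regex

    def __set__(self, instance, value):
        # Defining __set__ makes this a *data* descriptor, so assigning
        # through an instance is rejected -- matching the second test.
        raise AttributeError('Can not set attribute LazyRegexCompiler')
```

Assigning directly on the class (`UserEmail.EMAIL_REGEX = re.compile('cookies')`) still works, because class-level assignment replaces the descriptor in the class `__dict__` rather than going through `__set__` -- which is exactly what the third test verifies.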


@@ -7,12 +7,19 @@ from mongoengine.connection import get_db, get_connection
 from mongoengine.python_support import IS_PYMONGO_3

-MONGO_TEST_DB = 'mongoenginetest'
+MONGO_TEST_DB = 'mongoenginetest'  # standard name for the test database
+
+# Constants that can be used to compare against the version retrieved with
+# get_mongodb_version()
+MONGODB_26 = (2, 6)
+MONGODB_3 = (3, 0)
+MONGODB_32 = (3, 2)


 class MongoDBTestCase(unittest.TestCase):
     """Base class for tests that need a mongodb connection.

-    db is being dropped automatically
+    It ensures that the db is clean at the beginning and dropped at the end automatically.
     """

     @classmethod
@@ -27,40 +34,46 @@ class MongoDBTestCase(unittest.TestCase):

 def get_mongodb_version():
-    """Return the version tuple of the MongoDB server that the default
-    connection is connected to.
-    """
-    return tuple(get_connection().server_info()['versionArray'])
+    """Return the version of the connected MongoDB (first 2 digits).
+
+    :return: tuple(int, int)
+    """
+    version_list = get_connection().server_info()['versionArray'][:2]  # e.g. (3, 2)
+    return tuple(version_list)


-def _decorated_with_ver_requirement(func, ver_tuple):
+def _decorated_with_ver_requirement(func, version):
     """Return a given function decorated with the version requirement
     for a particular MongoDB version tuple.
+
+    :param version: The version required (tuple(int, int))
     """
     def _inner(*args, **kwargs):
-        mongodb_ver = get_mongodb_version()
-        if mongodb_ver >= ver_tuple:
+        mongodb_version = get_mongodb_version()
+        if mongodb_version >= version:
             return func(*args, **kwargs)
-        raise SkipTest('Needs MongoDB v{}+'.format(
-            '.'.join([str(v) for v in ver_tuple])
-        ))
+        raise SkipTest('Needs MongoDB v{}+'.format('.'.join(str(n) for n in version)))

     _inner.__name__ = func.__name__
     _inner.__doc__ = func.__doc__
     return _inner


-def needs_mongodb_v26(func):
+def requires_mongodb_gte_26(func):
     """Raise a SkipTest exception if we're working with MongoDB version
     lower than v2.6.
     """
-    return _decorated_with_ver_requirement(func, (2, 6))
+    return _decorated_with_ver_requirement(func, MONGODB_26)


-def needs_mongodb_v3(func):
+def requires_mongodb_gte_3(func):
     """Raise a SkipTest exception if we're working with MongoDB version
     lower than v3.0.
     """
-    return _decorated_with_ver_requirement(func, (3, 0))
+    return _decorated_with_ver_requirement(func, MONGODB_3)
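The renamed decorators follow a common version-gating pattern: compare the server's version tuple against a required minimum and raise `SkipTest` when it is too old. A self-contained sketch of the same pattern, with the server lookup stubbed out (a real run would query it via `get_connection().server_info()['versionArray']`):

```python
from unittest import SkipTest


def get_server_version():
    # Hypothetical stub standing in for get_mongodb_version();
    # pretend the connected server is MongoDB 3.0.
    return (3, 0)


def requires_version_gte(version):
    """Skip the decorated test unless the server version is >= `version`."""
    def decorator(func):
        def _inner(*args, **kwargs):
            if get_server_version() >= version:
                return func(*args, **kwargs)
            raise SkipTest('Needs MongoDB v{}+'.format('.'.join(str(n) for n in version)))
        # Preserve the test's identity for the test runner's reporting.
        _inner.__name__ = func.__name__
        _inner.__doc__ = func.__doc__
        return _inner
    return decorator


@requires_version_gte((2, 6))
def test_text_search():
    return 'ran'  # (3, 0) >= (2, 6), so this executes


@requires_version_gte((3, 2))
def test_partial_index():
    return 'ran'  # (3, 0) < (3, 2), so calling this raises SkipTest
```

Tuple comparison does the heavy lifting here: Python compares `(3, 0) >= (3, 2)` element-wise, which is exactly the semantics a two-digit version check needs.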

 def skip_pymongo3(f):
     """Raise a SkipTest exception if we're running a test against