commit d59c4044b7
Merge branch 'master' of https://github.com/MongoEngine/mongoengine into yalon-master
@@ -91,11 +91,11 @@ deploy:
   distributions: "sdist bdist_wheel"

   # only deploy on tagged commits (aka GitHub releases) and only for the
-  # parent repo's builds running Python 2.7 along with PyMongo v3.0 (we run
+  # parent repo's builds running Python 2.7 along with PyMongo v3.x (we run
   # Travis against many different Python and PyMongo versions and we don't
   # want the deploy to occur multiple times).
   on:
     tags: true
     repo: MongoEngine/mongoengine
-    condition: "$PYMONGO = 3.0"
+    condition: "$PYMONGO = 3.x"
     python: 2.7
AUTHORS
@@ -246,3 +246,4 @@ that much better:
 * Renjianxin (https://github.com/Davidrjx)
 * Erdenezul Batmunkh (https://github.com/erdenezul)
 * Andy Yankovsky (https://github.com/werat)
+* Bastien Gérard (https://github.com/bagerard)
@@ -22,8 +22,11 @@ Supported Interpreters

 MongoEngine supports CPython 2.7 and newer. Language
 features not supported by all interpreters can not be used.
-Please also ensure that your code is properly converted by
-`2to3 <http://docs.python.org/library/2to3.html>`_ for Python 3 support.
+The codebase is written in Python 2, so you must be using Python 2
+when developing new features. Compatibility of the library with Python 3
+relies on the 2to3 package that gets executed as part of the installation
+build. You should ensure that your code is properly converted by
+`2to3 <http://docs.python.org/library/2to3.html>`_.

 Style Guide
 -----------
@@ -2,8 +2,12 @@
 Changelog
 =========

-dev
-===
+Changes in 0.15.4
+=================
+- Added `DateField` #513
+
+Changes in 0.15.3
+=================
 - Subfield resolve error in generic_embedded_document query #1651 #1652
 - use each modifier only with $position #1673 #1675
 - Improve LazyReferenceField and GenericLazyReferenceField with nested fields #1704
@@ -18,10 +18,10 @@ provide the :attr:`host` and :attr:`port` arguments to

     connect('project1', host='192.168.1.35', port=12345)

-If the database requires authentication, :attr:`username` and :attr:`password`
-arguments should be provided::
+If the database requires authentication, :attr:`username`, :attr:`password`
+and :attr:`authentication_source` arguments should be provided::

-    connect('project1', username='webapp', password='pwd123')
+    connect('project1', username='webapp', password='pwd123', authentication_source='admin')

 URI style connections are also supported -- just supply the URI as
 the :attr:`host` to
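The new :attr:`authentication_source` argument maps onto MongoDB's standard `authSource` URI option, so the same setup can also be written URI-style (a sketch reusing the host, port, database and credentials from the example above):

```
mongodb://webapp:pwd123@192.168.1.35:12345/project1?authSource=admin
```

Passing this string as the :attr:`host` argument is equivalent to supplying the individual keyword arguments.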
@@ -513,6 +513,9 @@ If a dictionary is passed then the following options are available:
     Allows you to automatically expire data from a collection by setting the
     time in seconds to expire a field.

+:attr:`name` (Optional)
+    Allows you to specify a name for the index
+
 .. note::

     Inheritance adds extra fields indices; see :ref:`document-inheritance`.
@@ -57,7 +57,8 @@ document values for example::

     def clean(self):
         """Ensures that only published essays have a `pub_date` and
-        automatically sets the pub_date if published and not set"""
+        automatically sets `pub_date` if the essay is published and `pub_date`
+        is not set"""
         if self.status == 'Draft' and self.pub_date is not None:
             msg = 'Draft entries should not have a publication date.'
             raise ValidationError(msg)
@@ -53,7 +53,8 @@ Deletion

 Deleting stored files is achieved with the :func:`delete` method::

-    marmot.photo.delete()
+    marmot.photo.delete()  # deletes the GridFS document
+    marmot.save()          # saves the now-None GridFS reference contained in the marmot instance

 .. warning::

@@ -71,4 +72,5 @@ Files can be replaced with the :func:`replace` method. This works just like
 the :func:`put` method so even metadata can (and should) be replaced::

     another_marmot = open('another_marmot.png', 'rb')
-    marmot.photo.replace(another_marmot, content_type='image/png')
+    marmot.photo.replace(another_marmot, content_type='image/png')  # replaces the GridFS document
+    marmot.save()  # saves the updated GridFS reference contained in the marmot instance
@@ -113,6 +113,10 @@ handlers within your subclass::

     signals.pre_save.connect(Author.pre_save, sender=Author)
     signals.post_save.connect(Author.post_save, sender=Author)

+.. warning::
+
+   Note that EmbeddedDocument only supports pre/post_init signals; pre/post_save,
+   etc. should be attached to Document classes only. Attaching pre_save to an
+   EmbeddedDocument is silently ignored.
+
 Finally, you can also use this small decorator to quickly create a number of
 signals and attach them to your :class:`~mongoengine.Document` or
 :class:`~mongoengine.EmbeddedDocument` subclasses as class decorators::
@@ -23,7 +23,7 @@ __all__ = (list(document.__all__) + list(fields.__all__) +
            list(signals.__all__) + list(errors.__all__))


-VERSION = (0, 15, 0)
+VERSION = (0, 15, 3)


 def get_version():
@@ -128,8 +128,8 @@ class BaseList(list):
         return value

     def __iter__(self):
-        for i in six.moves.range(self.__len__()):
-            yield self[i]
+        for v in super(BaseList, self).__iter__():
+            yield v

     def __setitem__(self, key, value, *args, **kwargs):
         if isinstance(key, slice):
@@ -138,7 +138,7 @@ class BaseList(list):
             self._mark_as_changed(key)
         return super(BaseList, self).__setitem__(key, value)

-    def __delitem__(self, key, *args, **kwargs):
+    def __delitem__(self, key):
         self._mark_as_changed()
         return super(BaseList, self).__delitem__(key)

@@ -187,7 +187,7 @@ class BaseList(list):
         self._mark_as_changed()
         return super(BaseList, self).remove(*args, **kwargs)

-    def reverse(self, *args, **kwargs):
+    def reverse(self):
         self._mark_as_changed()
         return super(BaseList, self).reverse()

@@ -234,6 +234,9 @@ class EmbeddedDocumentList(BaseList):
         Filters the list by only including embedded documents with the
         given keyword arguments.

+        This method only supports simple comparison (e.g. .filter(name='John Doe'))
+        and does not support operators like __gte, __lte and __icontains, as
+        queryset.filter does.
+
         :param kwargs: The keyword arguments corresponding to the fields to
             filter on. *Multiple arguments are treated as if they are ANDed
             together.*
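The restriction described in the new docstring can be sketched independently of mongoengine; a minimal equality-only filter over plain objects, where multiple keyword arguments are ANDed together (`Comment` and `filter_simple` are illustrative names, not mongoengine API):

```python
def filter_simple(embedded_docs, **kwargs):
    """Equality-only filter: every kwarg must match exactly (ANDed);
    no __gte/__lte/__icontains style operators are supported."""
    return [
        doc for doc in embedded_docs
        if all(getattr(doc, k, None) == v for k, v in kwargs.items())
    ]


class Comment(object):
    def __init__(self, author, votes):
        self.author = author
        self.votes = votes


comments = [Comment('John Doe', 5), Comment('Jane', 2), Comment('John Doe', 1)]
assert len(filter_simple(comments, author='John Doe')) == 2        # single kwarg
assert len(filter_simple(comments, author='John Doe', votes=5)) == 1  # kwargs ANDed
```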
@@ -100,13 +100,11 @@ class BaseDocument(object):
             for key, value in values.iteritems():
                 if key in self._fields or key == '_id':
                     setattr(self, key, value)
-                elif self._dynamic:
+                else:
                     dynamic_data[key] = value
         else:
             FileField = _import_class('FileField')
             for key, value in values.iteritems():
-                if key == '__auto_convert':
-                    continue
                 key = self._reverse_db_field_map.get(key, key)
                 if key in self._fields or key in ('id', 'pk', '_cls'):
                     if __auto_convert and value is not None:
@@ -406,7 +404,15 @@ class BaseDocument(object):

     @classmethod
     def from_json(cls, json_data, created=False):
-        """Converts json data to an unsaved document instance"""
+        """Converts json data to a Document instance
+
+        :param json_data: The json data to load into the Document
+        :param created: If True, the document will be considered as a brand new document
+            If False and an id is provided, it will consider that the data being
+            loaded corresponds to what's already in the database (this has an impact
+            on subsequent calls to .save())
+            If False and no id is provided, it will consider the data as a new document
+            (default ``False``)
+        """
         return cls._from_son(json_util.loads(json_data), created=created)

     def __expand_dynamic_values(self, name, value):
mongoengine/base/utils.py (new file, 22 lines)
@@ -0,0 +1,22 @@
+import re
+
+
+class LazyRegexCompiler(object):
+    """Descriptor to allow lazy compilation of regex"""
+
+    def __init__(self, pattern, flags=0):
+        self._pattern = pattern
+        self._flags = flags
+        self._compiled_regex = None
+
+    @property
+    def compiled_regex(self):
+        if self._compiled_regex is None:
+            self._compiled_regex = re.compile(self._pattern, self._flags)
+        return self._compiled_regex
+
+    def __get__(self, obj, objtype):
+        return self.compiled_regex
+
+    def __set__(self, instance, value):
+        raise AttributeError("Can not set attribute LazyRegexCompiler")
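Since the new file is self-contained, the descriptor can be exercised standalone: compilation is deferred until the pattern is first accessed instead of happening at import time, and assignment is blocked. A small demo (the `Email` holder class and its pattern are illustrative, not the real field definitions):

```python
import re


class LazyRegexCompiler(object):
    """Descriptor that defers re.compile until first attribute access."""

    def __init__(self, pattern, flags=0):
        self._pattern = pattern
        self._flags = flags
        self._compiled_regex = None

    @property
    def compiled_regex(self):
        # compile once, on first access, then cache
        if self._compiled_regex is None:
            self._compiled_regex = re.compile(self._pattern, self._flags)
        return self._compiled_regex

    def __get__(self, obj, objtype):
        return self.compiled_regex

    def __set__(self, instance, value):
        raise AttributeError("Can not set attribute LazyRegexCompiler")


class Email(object):
    # no compilation cost is paid until USER_REGEX is actually used
    USER_REGEX = LazyRegexCompiler(r'^[a-z0-9._]+$', re.IGNORECASE)


assert Email.USER_REGEX.match('John.Doe')    # works on the class...
assert Email().USER_REGEX.match('jane_doe')  # ...and on instances
try:
    Email().USER_REGEX = 'nope'              # __set__ blocks accidental overwrites
except AttributeError:
    pass
```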
@@ -145,18 +145,17 @@ class no_sub_classes(object):
        :param cls: the class to turn querying sub classes on
        """
        self.cls = cls
+       self.cls_initial_subclasses = None

    def __enter__(self):
        """Change the objects default and _auto_dereference values."""
-       self.cls._all_subclasses = self.cls._subclasses
-       self.cls._subclasses = (self.cls,)
+       self.cls_initial_subclasses = self.cls._subclasses
+       self.cls._subclasses = (self.cls._class_name,)
        return self.cls

    def __exit__(self, t, value, traceback):
        """Reset the default and _auto_dereference values."""
-       self.cls._subclasses = self.cls._all_subclasses
-       delattr(self.cls, '_all_subclasses')
-       return self.cls
+       self.cls._subclasses = self.cls_initial_subclasses


class query_counter(object):
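The corrected context manager follows a plain save/restore pattern: stash the original value on the context manager itself (not on the class) and put it back on exit. A generic standalone sketch of the same idea (`scoped_override` and `Animal` are illustrative names, not mongoengine code):

```python
class scoped_override(object):
    """Temporarily override a class attribute, restoring it on exit --
    the same save/restore pattern the fixed no_sub_classes uses."""

    def __init__(self, cls, name, value):
        self.cls = cls
        self.name = name
        self.value = value
        self._initial = None  # filled in on __enter__

    def __enter__(self):
        self._initial = getattr(self.cls, self.name)
        setattr(self.cls, self.name, self.value)
        return self.cls

    def __exit__(self, t, value, traceback):
        # restore the saved value even if the block raised
        setattr(self.cls, self.name, self._initial)


class Animal(object):
    _subclasses = ('Animal', 'Animal.Dog')


with scoped_override(Animal, '_subclasses', ('Animal',)):
    assert Animal._subclasses == ('Animal',)       # narrowed inside the block
assert Animal._subclasses == ('Animal', 'Animal.Dog')  # restored afterwards
```

Keeping the saved state on the context manager instance avoids the `delattr` bookkeeping of the old version and makes nested/overlapping uses less error-prone.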
@@ -215,7 +214,7 @@ class query_counter(object):
        """Get the number of queries."""
        ignore_query = {'ns': {'$ne': '%s.system.indexes' % self.db.name}}
        count = self.db.system.profile.find(ignore_query).count() - self.counter
-       self.counter += 1
+       self.counter += 1  # account for the query we just fired
        return count

@@ -133,7 +133,12 @@ class DeReference(object):
        """
        object_map = {}
        for collection, dbrefs in self.reference_map.iteritems():
-           if hasattr(collection, 'objects'):  # We have a document class for the refs
+           # We use getattr instead of hasattr because hasattr swallows any
+           # exception under Python 2, so it could hide nasty things without
+           # raising them (cf. bug #1688)
+           ref_document_cls_exists = (getattr(collection, 'objects', None) is not None)
+
+           if ref_document_cls_exists:
                col_name = collection._get_collection_name()
                refs = [dbref for dbref in dbrefs
                        if (col_name, dbref) not in object_map]
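The getattr-over-hasattr rationale can be demonstrated standalone: `hasattr` reports `False` whenever attribute access raises (on Python 2, *any* exception; on Python 3, `AttributeError`), so a bug inside a property is indistinguishable from a missing attribute. A sketch with illustrative classes, not mongoengine code:

```python
class Doc(object):
    @property
    def objects(self):
        # simulate a buggy property: this AttributeError is an internal bug,
        # not a signal that `objects` is missing
        raise AttributeError('hidden bug inside the property')


d = Doc()
assert not hasattr(d, 'objects')            # the internal error is silently swallowed
assert getattr(d, 'objects', None) is None  # same result, but the default is explicit
```

Using `getattr(..., None) is not None` makes the "does a default exist" test explicit instead of relying on `hasattr`'s exception swallowing.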
@@ -585,9 +585,8 @@ class Document(BaseDocument):
        :param signal_kwargs: (optional) kwargs dictionary to be passed to
            the signal calls.
        :param write_concern: Extra keyword arguments are passed down which
-           will be used as options for the resultant
-           ``getLastError`` command. For example,
-           ``save(..., write_concern={w: 2, fsync: True}, ...)`` will
+           will be used as options for the resultant ``getLastError`` command.
+           For example, ``save(..., w: 2, fsync: True)`` will
            wait until at least two servers have recorded the write and
            will force an fsync on the primary server.

@@ -715,7 +714,7 @@ class Document(BaseDocument):
            except (KeyError, AttributeError):
                try:
                    # If field is a special field, e.g. items is stored as _reserved_items,
-                   # an KeyError is thrown. So try to retrieve the field from _data
+                   # a KeyError is thrown. So try to retrieve the field from _data
                    setattr(self, field, self._reload(field, obj._data.get(field)))
                except KeyError:
                    # If field is removed from the database while the object
@@ -1000,7 +999,7 @@
 class DynamicDocument(Document):
    """A Dynamic Document class allowing flexible, expandable and uncontrolled
    schemas. As a :class:`~mongoengine.Document` subclass, acts in the same
-   way as an ordinary document but has expando style properties. Any data
+   way as an ordinary document but has expanded style properties. Any data
    passed or set against the :class:`~mongoengine.DynamicDocument` that is
    not a field is automatically converted into a
    :class:`~mongoengine.fields.DynamicField` and data can be attributed to that
@@ -5,7 +5,6 @@ import re
 import socket
 import time
 import uuid
-import warnings
 from operator import itemgetter

 from bson import Binary, DBRef, ObjectId, SON
@@ -28,6 +27,7 @@ except ImportError:
 from mongoengine.base import (BaseDocument, BaseField, ComplexBaseField,
                               GeoJsonBaseField, LazyReference, ObjectIdField,
                               get_document)
+from mongoengine.base.utils import LazyRegexCompiler
 from mongoengine.common import _import_class
 from mongoengine.connection import DEFAULT_CONNECTION_NAME, get_db
 from mongoengine.document import Document, EmbeddedDocument
@@ -43,7 +43,7 @@ except ImportError:

 __all__ = (
     'StringField', 'URLField', 'EmailField', 'IntField', 'LongField',
-    'FloatField', 'DecimalField', 'BooleanField', 'DateTimeField',
+    'FloatField', 'DecimalField', 'BooleanField', 'DateTimeField', 'DateField',
     'ComplexDateTimeField', 'EmbeddedDocumentField', 'ObjectIdField',
     'GenericEmbeddedDocumentField', 'DynamicField', 'ListField',
     'SortedListField', 'EmbeddedDocumentListField', 'DictField',
@@ -123,7 +123,7 @@ class URLField(StringField):
    .. versionadded:: 0.3
    """

-   _URL_REGEX = re.compile(
+   _URL_REGEX = LazyRegexCompiler(
        r'^(?:[a-z0-9\.\-]*)://'  # scheme is validated separately
        r'(?:(?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+(?:[A-Z]{2,6}\.?|[A-Z0-9-]{2,}(?<!-)\.?)|'  # domain...
        r'localhost|'  # localhost...
@@ -157,7 +157,7 @@ class EmailField(StringField):

    .. versionadded:: 0.4
    """
-   USER_REGEX = re.compile(
+   USER_REGEX = LazyRegexCompiler(
        # `dot-atom` defined in RFC 5322 Section 3.2.3.
        r"(^[-!#$%&'*+/=?^_`{}|~0-9A-Z]+(\.[-!#$%&'*+/=?^_`{}|~0-9A-Z]+)*\Z"
        # `quoted-string` defined in RFC 5322 Section 3.2.4.
@@ -165,7 +165,7 @@ class EmailField(StringField):
        re.IGNORECASE
    )

-   UTF8_USER_REGEX = re.compile(
+   UTF8_USER_REGEX = LazyRegexCompiler(
        six.u(
            # RFC 6531 Section 3.3 extends `atext` (used by dot-atom) to
            # include `UTF8-non-ascii`.
@@ -175,7 +175,7 @@ class EmailField(StringField):
        ), re.IGNORECASE | re.UNICODE
    )

-   DOMAIN_REGEX = re.compile(
+   DOMAIN_REGEX = LazyRegexCompiler(
        r'((?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+)(?:[A-Z0-9-]{2,63}(?<!-))\Z',
        re.IGNORECASE
    )
@@ -462,6 +462,8 @@ class DateTimeField(BaseField):
    installed you can utilise it to convert varying types of date formats into valid
    python datetime objects.

+   Note: To default the field to the current datetime, use: DateTimeField(default=datetime.utcnow)
+
    Note: Microseconds are rounded to the nearest millisecond.
        Pre UTC microsecond support is effectively broken.
        Use :class:`~mongoengine.fields.ComplexDateTimeField` if you
@@ -525,6 +527,22 @@ class DateTimeField(BaseField):
        return super(DateTimeField, self).prepare_query_value(op, self.to_mongo(value))


+class DateField(DateTimeField):
+    def to_mongo(self, value):
+        value = super(DateField, self).to_mongo(value)
+        # drop hours, minutes, seconds
+        if isinstance(value, datetime.datetime):
+            value = datetime.datetime(value.year, value.month, value.day)
+        return value
+
+    def to_python(self, value):
+        value = super(DateField, self).to_python(value)
+        # convert datetime to date
+        if isinstance(value, datetime.datetime):
+            value = datetime.date(value.year, value.month, value.day)
+        return value
+
+
 class ComplexDateTimeField(StringField):
    """
    ComplexDateTimeField handles microseconds exactly instead of rounding
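The truncation the new `DateField` performs can be sketched standalone, mirroring the `to_mongo`/`to_python` bodies above (the free function names are illustrative):

```python
import datetime


def truncate_to_day(value):
    """Mirror DateField.to_mongo: drop hours, minutes, seconds
    so only midnight-of-the-day is stored."""
    if isinstance(value, datetime.datetime):
        return datetime.datetime(value.year, value.month, value.day)
    return value


def to_date(value):
    """Mirror DateField.to_python: convert a stored datetime to a plain date."""
    if isinstance(value, datetime.datetime):
        return datetime.date(value.year, value.month, value.day)
    return value


dt = datetime.datetime(2018, 5, 1, 13, 37, 59)
assert truncate_to_day(dt) == datetime.datetime(2018, 5, 1)  # time-of-day dropped
assert to_date(dt) == datetime.date(2018, 5, 1)              # read back as a date
```

Storing a midnight datetime (rather than a bare date) keeps the value representable in BSON, which has no date-only type.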
@@ -629,9 +647,17 @@ class EmbeddedDocumentField(BaseField):
    def document_type(self):
        if isinstance(self.document_type_obj, six.string_types):
            if self.document_type_obj == RECURSIVE_REFERENCE_CONSTANT:
-               self.document_type_obj = self.owner_document
+               resolved_document_type = self.owner_document
            else:
-               self.document_type_obj = get_document(self.document_type_obj)
+               resolved_document_type = get_document(self.document_type_obj)
+
+           if not issubclass(resolved_document_type, EmbeddedDocument):
+               # Due to the late resolution of the document_type
+               # there is a chance that it won't be an EmbeddedDocument (#1661)
+               self.error('Invalid embedded document class provided to an '
+                          'EmbeddedDocumentField')
+           self.document_type_obj = resolved_document_type
+
        return self.document_type_obj

    def to_python(self, value):
@@ -1142,8 +1168,7 @@ class ReferenceField(BaseField):
        ):
            self.error(
                '%s is not an instance of abstract reference type %s' % (
-                   self.document_type._class_name
-               )
+                   value, self.document_type._class_name)
            )

    def lookup_member(self, member_name):
@@ -1512,9 +1537,9 @@ class GridFSProxy(object):
        return '<%s: %s>' % (self.__class__.__name__, self.grid_id)

    def __str__(self):
-       name = getattr(
-           self.get(), 'filename', self.grid_id) if self.get() else '(no file)'
-       return '<%s: %s>' % (self.__class__.__name__, name)
+       gridout = self.get()
+       filename = getattr(gridout, 'filename') if gridout else '<no file>'
+       return '<%s: %s (%s)>' % (self.__class__.__name__, filename, self.grid_id)

    def __eq__(self, other):
        if isinstance(other, GridFSProxy):
@@ -6,11 +6,7 @@ import pymongo
 import six


-if pymongo.version_tuple[0] < 3:
-    IS_PYMONGO_3 = False
-else:
-    IS_PYMONGO_3 = True
+IS_PYMONGO_3 = pymongo.version_tuple[0] >= 3


 # six.BytesIO resolves to StringIO.StringIO in Py2 and io.BytesIO in Py3.
 StringIO = six.BytesIO
@@ -8,9 +8,12 @@ import weakref

 from datetime import datetime
 from bson import DBRef, ObjectId
+from pymongo.errors import DuplicateKeyError

 from tests import fixtures
 from tests.fixtures import (PickleEmbedded, PickleTest, PickleSignalsTest,
                             PickleDynamicEmbedded, PickleDynamicTest)
+from tests.utils import MongoDBTestCase

 from mongoengine import *
 from mongoengine.base import get_document, _document_registry
@@ -30,12 +33,9 @@ TEST_IMAGE_PATH = os.path.join(os.path.dirname(__file__),
 __all__ = ("InstanceTest",)


-class InstanceTest(unittest.TestCase):
+class InstanceTest(MongoDBTestCase):

     def setUp(self):
-        connect(db='mongoenginetest')
-        self.db = get_db()
-
         class Job(EmbeddedDocument):
             name = StringField()
             years = IntField()
@@ -550,21 +550,14 @@ class InstanceTest(unittest.TestCase):
            pass

        f = Foo()
-       try:
-           f.reload()
-       except Foo.DoesNotExist:
-           pass
-       except Exception:
-           self.assertFalse("Threw wrong exception")
+       with self.assertRaises(Foo.DoesNotExist):
+           f.reload()

        f.save()
        f.delete()
-       try:
-           f.reload()
-       except Foo.DoesNotExist:
-           pass
-       except Exception:
-           self.assertFalse("Threw wrong exception")
+
+       with self.assertRaises(Foo.DoesNotExist):
+           f.reload()

    def test_reload_of_non_strict_with_special_field_name(self):
        """Ensures reloading works for documents with meta strict == False."""
@@ -734,12 +727,12 @@ class InstanceTest(unittest.TestCase):

        t = TestDocument(status="draft", pub_date=datetime.now())

-       try:
+       with self.assertRaises(ValidationError) as cm:
            t.save()
-       except ValidationError as e:
-           expect_msg = "Draft entries may not have a publication date."
-           self.assertTrue(expect_msg in e.message)
-           self.assertEqual(e.to_dict(), {'__all__': expect_msg})
+
+       expected_msg = "Draft entries may not have a publication date."
+       self.assertIn(expected_msg, cm.exception.message)
+       self.assertEqual(cm.exception.to_dict(), {'__all__': expected_msg})

        t = TestDocument(status="published")
        t.save(clean=False)
@@ -773,12 +766,13 @@ class InstanceTest(unittest.TestCase):
        TestDocument.drop_collection()

        t = TestDocument(doc=TestEmbeddedDocument(x=10, y=25, z=15))
-       try:
+
+       with self.assertRaises(ValidationError) as cm:
            t.save()
-       except ValidationError as e:
-           expect_msg = "Value of z != x + y"
-           self.assertTrue(expect_msg in e.message)
-           self.assertEqual(e.to_dict(), {'doc': {'__all__': expect_msg}})
+
+       expected_msg = "Value of z != x + y"
+       self.assertIn(expected_msg, cm.exception.message)
+       self.assertEqual(cm.exception.to_dict(), {'doc': {'__all__': expected_msg}})

        t = TestDocument(doc=TestEmbeddedDocument(x=10, y=25)).save()
        self.assertEqual(t.doc.z, 35)
@@ -3148,6 +3142,64 @@ class InstanceTest(unittest.TestCase):
         self.assertEquals(p.id, None)
         p.id = "12345"  # in case it is not working: "OperationError: Shard Keys are immutable..." will be raised here

+    def test_from_son_created_False_without_id(self):
+        class MyPerson(Document):
+            name = StringField()
+
+        MyPerson.objects.delete()
+
+        p = MyPerson.from_json('{"name": "a_fancy_name"}', created=False)
+        self.assertFalse(p._created)
+        self.assertIsNone(p.id)
+        p.save()
+        self.assertIsNotNone(p.id)
+        saved_p = MyPerson.objects.get(id=p.id)
+        self.assertEqual(saved_p.name, 'a_fancy_name')
+
+    def test_from_son_created_False_with_id(self):
+        # 1854
+        class MyPerson(Document):
+            name = StringField()
+
+        MyPerson.objects.delete()
+
+        p = MyPerson.from_json('{"_id": "5b85a8b04ec5dc2da388296e", "name": "a_fancy_name"}', created=False)
+        self.assertFalse(p._created)
+        self.assertEqual(p._changed_fields, [])
+        self.assertEqual(p.name, 'a_fancy_name')
+        self.assertEqual(p.id, ObjectId('5b85a8b04ec5dc2da388296e'))
+        p.save()
+
+        with self.assertRaises(DoesNotExist):
+            # Since created=False and we gave an id in the json and _changed_fields is empty
+            # mongoengine assumes that the document exists with that structure already
+            # and calling .save() didn't save anything
+            MyPerson.objects.get(id=p.id)
+
+        self.assertFalse(p._created)
+        p.name = 'a new fancy name'
+        self.assertEqual(p._changed_fields, ['name'])
+        p.save()
+        saved_p = MyPerson.objects.get(id=p.id)
+        self.assertEqual(saved_p.name, p.name)
+
+    def test_from_son_created_True_with_an_id(self):
+        class MyPerson(Document):
+            name = StringField()
+
+        MyPerson.objects.delete()
+
+        p = MyPerson.from_json('{"_id": "5b85a8b04ec5dc2da388296e", "name": "a_fancy_name"}', created=True)
+        self.assertTrue(p._created)
+        self.assertEqual(p._changed_fields, [])
+        self.assertEqual(p.name, 'a_fancy_name')
+        self.assertEqual(p.id, ObjectId('5b85a8b04ec5dc2da388296e'))
+        p.save()
+
+        saved_p = MyPerson.objects.get(id=p.id)
+        self.assertEqual(saved_p, p)
+        self.assertEqual(p.name, 'a_fancy_name')
+
     def test_null_field(self):
         # 734
         class User(Document):
@@ -3248,6 +3300,23 @@ class InstanceTest(unittest.TestCase):
         blog.reload()
         self.assertEqual(blog.tags, [["value1", 123]])

+    def test_accessing_objects_with_indexes_error(self):
+        insert_result = self.db.company.insert_many([{'name': 'Foo'},
+                                                     {'name': 'Foo'}])  # Force 2 docs with same name
+        REF_OID = insert_result.inserted_ids[0]
+        self.db.user.insert_one({'company': REF_OID})
+
+        class Company(Document):
+            name = StringField(unique=True)
+
+        class User(Document):
+            company = ReferenceField(Company)
+
+        # Ensure index creation exceptions aren't swallowed (#1688)
+        with self.assertRaises(DuplicateKeyError):
+            User.objects().select_related()
+
+
 if __name__ == '__main__':
     unittest.main()
@@ -46,6 +46,17 @@ class FieldTest(MongoDBTestCase):
         md = MyDoc(dt='')
         self.assertRaises(ValidationError, md.save)

+    def test_date_from_empty_string(self):
+        """
+        Ensure an exception is raised when trying to
+        cast an empty string to datetime.
+        """
+        class MyDoc(Document):
+            dt = DateField()
+
+        md = MyDoc(dt='')
+        self.assertRaises(ValidationError, md.save)
+
     def test_datetime_from_whitespace_string(self):
         """
         Ensure an exception is raised when trying to
@@ -57,6 +68,17 @@ class FieldTest(MongoDBTestCase):
         md = MyDoc(dt=' ')
         self.assertRaises(ValidationError, md.save)

+    def test_date_from_whitespace_string(self):
+        """
+        Ensure an exception is raised when trying to
+        cast a whitespace-only string to datetime.
+        """
+        class MyDoc(Document):
+            dt = DateField()
+
+        md = MyDoc(dt=' ')
+        self.assertRaises(ValidationError, md.save)
+
     def test_default_values_nothing_set(self):
         """Ensure that default field values are used when creating
         a document.
@@ -66,13 +88,14 @@ class FieldTest(MongoDBTestCase):
             age = IntField(default=30, required=False)
             userid = StringField(default=lambda: 'test', required=True)
             created = DateTimeField(default=datetime.datetime.utcnow)
+            day = DateField(default=datetime.date.today)

         person = Person(name="Ross")

         # Confirm saving now would store values
         data_to_be_saved = sorted(person.to_mongo().keys())
         self.assertEqual(data_to_be_saved,
-                         ['age', 'created', 'name', 'userid']
+                         ['age', 'created', 'day', 'name', 'userid']
                          )

         self.assertTrue(person.validate() is None)
@@ -81,16 +104,18 @@ class FieldTest(MongoDBTestCase):
         self.assertEqual(person.age, person.age)
         self.assertEqual(person.userid, person.userid)
         self.assertEqual(person.created, person.created)
+        self.assertEqual(person.day, person.day)

         self.assertEqual(person._data['name'], person.name)
         self.assertEqual(person._data['age'], person.age)
         self.assertEqual(person._data['userid'], person.userid)
         self.assertEqual(person._data['created'], person.created)
+        self.assertEqual(person._data['day'], person.day)

         # Confirm introspection changes nothing
         data_to_be_saved = sorted(person.to_mongo().keys())
         self.assertEqual(
-            data_to_be_saved, ['age', 'created', 'name', 'userid'])
+            data_to_be_saved, ['age', 'created', 'day', 'name', 'userid'])

     def test_default_values_set_to_None(self):
         """Ensure that default field values are used even when
@@ -662,6 +687,32 @@ class FieldTest(MongoDBTestCase):
         log.time = 'ABC'
         self.assertRaises(ValidationError, log.validate)

+    def test_date_validation(self):
+        """Ensure that invalid values cannot be assigned to date
+        fields.
+        """
+        class LogEntry(Document):
+            time = DateField()
+
+        log = LogEntry()
+        log.time = datetime.datetime.now()
+        log.validate()
+
+        log.time = datetime.date.today()
+        log.validate()
+
+        log.time = datetime.datetime.now().isoformat(' ')
+        log.validate()
+
+        if dateutil:
+            log.time = datetime.datetime.now().isoformat('T')
+            log.validate()
+
+        log.time = -1
+        self.assertRaises(ValidationError, log.validate)
+        log.time = 'ABC'
+        self.assertRaises(ValidationError, log.validate)
+
     def test_datetime_tz_aware_mark_as_changed(self):
         from mongoengine import connection

@@ -733,6 +784,51 @@ class FieldTest(MongoDBTestCase):
         self.assertNotEqual(log.date, d1)
         self.assertEqual(log.date, d2)

+    def test_date(self):
+        """Tests showing pymongo date fields
+
+        See: http://api.mongodb.org/python/current/api/bson/son.html#dt
+        """
+        class LogEntry(Document):
+            date = DateField()
+
+        LogEntry.drop_collection()
+
+        # Test can save dates
+        log = LogEntry()
+        log.date = datetime.date.today()
+        log.save()
+        log.reload()
+        self.assertEqual(log.date, datetime.date.today())
+
+        d1 = datetime.datetime(1970, 1, 1, 0, 0, 1, 999)
+        d2 = datetime.datetime(1970, 1, 1, 0, 0, 1)
+        log = LogEntry()
+        log.date = d1
+        log.save()
+        log.reload()
+        self.assertEqual(log.date, d1.date())
+        self.assertEqual(log.date, d2.date())
+
+        d1 = datetime.datetime(1970, 1, 1, 0, 0, 1, 9999)
+        d2 = datetime.datetime(1970, 1, 1, 0, 0, 1, 9000)
+        log.date = d1
+        log.save()
+        log.reload()
+        self.assertEqual(log.date, d1.date())
+        self.assertEqual(log.date, d2.date())
+
+        if not six.PY3:
+            # Pre UTC dates microseconds below 1000 are dropped
+            # This does not seem to be true in PY3
+            d1 = datetime.datetime(1969, 12, 31, 23, 59, 59, 999)
+            d2 = datetime.datetime(1969, 12, 31, 23, 59, 59)
+            log.date = d1
+            log.save()
+            log.reload()
+            self.assertEqual(log.date, d1.date())
+            self.assertEqual(log.date, d2.date())
+
     def test_datetime_usage(self):
         """Tests for regular datetime fields"""
         class LogEntry(Document):
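The equality assertions in `test_date` above rest on plain `datetime.datetime.date()` truncation: two datetimes that differ only below the day boundary collapse to the same value once a DateField round-trips them. The core of it, independent of MongoDB:

```python
import datetime

# Same second, different microseconds -- exactly the d1/d2 pairs in the test.
d1 = datetime.datetime(1970, 1, 1, 0, 0, 1, 9999)
d2 = datetime.datetime(1970, 1, 1, 0, 0, 1, 9000)

# Truncating to the date makes them indistinguishable, which is why the test
# can assert both log.date == d1.date() and log.date == d2.date() after reload.
assert d1.date() == d2.date() == datetime.date(1970, 1, 1)
```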
@@ -787,6 +883,51 @@ class FieldTest(MongoDBTestCase):
         )
         self.assertEqual(logs.count(), 5)

+    def test_date_usage(self):
+        """Tests for regular date fields"""
+        class LogEntry(Document):
+            date = DateField()
+
+        LogEntry.drop_collection()
+
+        d1 = datetime.datetime(1970, 1, 1, 0, 0, 1)
+        log = LogEntry()
+        log.date = d1
+        log.validate()
+        log.save()
+
+        for query in (d1, d1.isoformat(' ')):
+            log1 = LogEntry.objects.get(date=query)
+            self.assertEqual(log, log1)
+
+        if dateutil:
+            log1 = LogEntry.objects.get(date=d1.isoformat('T'))
+            self.assertEqual(log, log1)
+
+        # create additional 19 log entries for a total of 20
+        for i in range(1971, 1990):
+            d = datetime.datetime(i, 1, 1, 0, 0, 1)
+            LogEntry(date=d).save()
+
+        self.assertEqual(LogEntry.objects.count(), 20)
+
+        # Test ordering
+        logs = LogEntry.objects.order_by("date")
+        i = 0
+        while i < 19:
+            self.assertTrue(logs[i].date <= logs[i + 1].date)
+            i += 1
+
+        logs = LogEntry.objects.order_by("-date")
+        i = 0
+        while i < 19:
+            self.assertTrue(logs[i].date >= logs[i + 1].date)
+            i += 1
+
+        # Test searching
+        logs = LogEntry.objects.filter(date__gte=datetime.datetime(1980, 1, 1))
+        self.assertEqual(logs.count(), 10)
+
     def test_complexdatetime_storage(self):
         """Tests for complex datetime fields - which can handle
         microseconds without rounding.
@@ -2006,6 +2147,15 @@ class FieldTest(MongoDBTestCase):
         ]))
         self.assertEqual(a.b.c.txt, 'hi')

+    def test_embedded_document_field_cant_reference_using_a_str_if_it_does_not_exist_yet(self):
+        raise SkipTest("Using a string reference in an EmbeddedDocumentField does not work if the class isn't registered yet")
+
+        class MyDoc2(Document):
+            emb = EmbeddedDocumentField('MyDoc')
+
+        class MyDoc(EmbeddedDocument):
+            name = StringField()
+
     def test_embedded_document_validation(self):
         """Ensure that invalid embedded documents cannot be assigned to
         embedded document fields.
@@ -4247,6 +4397,44 @@ class EmbeddedDocumentListFieldTestCase(MongoDBTestCase):
         self.assertEqual(custom_data['a'], CustomData.c_field.custom_data['a'])


+class TestEmbeddedDocumentField(MongoDBTestCase):
+    def test___init___(self):
+        class MyDoc(EmbeddedDocument):
+            name = StringField()
+
+        field = EmbeddedDocumentField(MyDoc)
+        self.assertEqual(field.document_type_obj, MyDoc)
+
+        field2 = EmbeddedDocumentField('MyDoc')
+        self.assertEqual(field2.document_type_obj, 'MyDoc')
+
+    def test___init___throw_error_if_document_type_is_not_EmbeddedDocument(self):
+        with self.assertRaises(ValidationError):
+            EmbeddedDocumentField(dict)
+
+    def test_document_type_throw_error_if_not_EmbeddedDocument_subclass(self):
+
+        class MyDoc(Document):
+            name = StringField()
+
+        emb = EmbeddedDocumentField('MyDoc')
+        with self.assertRaises(ValidationError) as ctx:
+            emb.document_type
+        self.assertIn('Invalid embedded document class provided to an EmbeddedDocumentField', str(ctx.exception))
+
+    def test_embedded_document_field_only_allow_subclasses_of_embedded_document(self):
+        # Relates to #1661
+        class MyDoc(Document):
+            name = StringField()
+
+        with self.assertRaises(ValidationError):
+            class MyFailingDoc(Document):
+                emb = EmbeddedDocumentField(MyDoc)
+
+        with self.assertRaises(ValidationError):
+            class MyFailingdoc2(Document):
+                emb = EmbeddedDocumentField('MyDoc')
+
+
 class CachedReferenceFieldTest(MongoDBTestCase):

     def test_cached_reference_field_get_and_save(self):
@@ -54,7 +54,7 @@ class FileTest(MongoDBTestCase):

         result = PutFile.objects.first()
         self.assertTrue(putfile == result)
-        self.assertEqual("%s" % result.the_file, "<GridFSProxy: hello>")
+        self.assertEqual("%s" % result.the_file, "<GridFSProxy: hello (%s)>" % result.the_file.grid_id)
         self.assertEqual(result.the_file.read(), text)
         self.assertEqual(result.the_file.content_type, content_type)
         result.the_file.delete()  # Remove file from GridFS
@@ -4465,7 +4465,6 @@ class QuerySetTest(unittest.TestCase):
         self.assertNotEqual(bars._CommandCursor__collection.read_preference,
                             ReadPreference.SECONDARY_PREFERRED)

-
     def test_json_simple(self):

         class Embedded(EmbeddedDocument):
@@ -140,8 +140,6 @@ class ContextManagersTest(unittest.TestCase):
     def test_no_sub_classes(self):
         class A(Document):
             x = IntField()
-            y = IntField()
-
             meta = {'allow_inheritance': True}

         class B(A):
@@ -152,29 +150,29 @@ class ContextManagersTest(unittest.TestCase):

         A.drop_collection()

-        A(x=10, y=20).save()
-        A(x=15, y=30).save()
-        B(x=20, y=40).save()
-        B(x=30, y=50).save()
-        C(x=40, y=60).save()
+        A(x=10).save()
+        A(x=15).save()
+        B(x=20).save()
+        B(x=30).save()
+        C(x=40).save()

         self.assertEqual(A.objects.count(), 5)
         self.assertEqual(B.objects.count(), 3)
         self.assertEqual(C.objects.count(), 1)

-        with no_sub_classes(A) as A:
+        with no_sub_classes(A):
             self.assertEqual(A.objects.count(), 2)

             for obj in A.objects:
                 self.assertEqual(obj.__class__, A)

-        with no_sub_classes(B) as B:
+        with no_sub_classes(B):
             self.assertEqual(B.objects.count(), 2)

             for obj in B.objects:
                 self.assertEqual(obj.__class__, B)

-        with no_sub_classes(C) as C:
+        with no_sub_classes(C):
             self.assertEqual(C.objects.count(), 1)

             for obj in C.objects:
@@ -185,6 +183,32 @@ class ContextManagersTest(unittest.TestCase):
         self.assertEqual(B.objects.count(), 3)
         self.assertEqual(C.objects.count(), 1)

+    def test_no_sub_classes_modification_to_document_class_are_temporary(self):
+        class A(Document):
+            x = IntField()
+            meta = {'allow_inheritance': True}
+
+        class B(A):
+            z = IntField()
+
+        self.assertEqual(A._subclasses, ('A', 'A.B'))
+        with no_sub_classes(A):
+            self.assertEqual(A._subclasses, ('A',))
+        self.assertEqual(A._subclasses, ('A', 'A.B'))
+
+        self.assertEqual(B._subclasses, ('A.B',))
+        with no_sub_classes(B):
+            self.assertEqual(B._subclasses, ('A.B',))
+        self.assertEqual(B._subclasses, ('A.B',))
+
+    def test_no_subclass_context_manager_does_not_swallow_exception(self):
+        class User(Document):
+            name = StringField()
+
+        with self.assertRaises(TypeError):
+            with no_sub_classes(User):
+                raise TypeError()
+
     def test_query_counter(self):
         connect('mongoenginetest')
         db = get_db()
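The new tests pin two properties of `no_sub_classes`: the `_subclasses` tuple is restored on exit, and exceptions raised inside the block propagate. A generic context-manager sketch with that contract — an illustration of the pattern, not mongoengine's actual implementation:

```python
from contextlib import contextmanager


@contextmanager
def hide_subclasses(cls):
    """Temporarily shrink cls._subclasses to the class itself; restore the
    original tuple on exit. try/finally guarantees restoration even when the
    body raises, and @contextmanager re-raises the exception -- the behavior
    test_no_subclass_context_manager_does_not_swallow_exception checks.
    """
    original = cls._subclasses
    cls._subclasses = (cls.__name__,)
    try:
        yield cls
    finally:
        cls._subclasses = original


class A(object):
    _subclasses = ('A', 'A.B')


with hide_subclasses(A):
    assert A._subclasses == ('A',)    # modification visible inside the block
assert A._subclasses == ('A', 'A.B')  # ...and undone afterwards
```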
@@ -1,6 +1,21 @@
 import unittest

-from mongoengine.base.datastructures import StrictDict
+from mongoengine.base.datastructures import StrictDict, BaseList
+
+
+class TestBaseList(unittest.TestCase):
+
+    def test_iter_simple(self):
+        values = [True, False, True, False]
+        base_list = BaseList(values, instance=None, name='my_name')
+        self.assertEqual(values, list(base_list))
+
+    def test_iter_allow_modification_while_iterating_without_error(self):
+        # a regular list allows this, so this subclass must comply with that
+        base_list = BaseList([True, False, True, False], instance=None, name='my_name')
+        for idx, val in enumerate(base_list):
+            if val:
+                base_list.pop(idx)


 class TestStrictDict(unittest.TestCase):
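The second BaseList test only asserts that popping while iterating raises nothing, because a plain `list` tolerates it too. A bare list subclass shows the same index-based iteration behavior:

```python
class TrackedList(list):
    """Toy list subclass standing in for BaseList; inherits list iteration."""


values = TrackedList([True, False, True, False])
for idx, val in enumerate(values):
    if val:
        values.pop(idx)  # no RuntimeError, unlike mutating a dict mid-loop

# List iteration is index-based, so removals shift later elements under the
# iterator; here both True values get popped and the loop ends early, leaving
# only the False entries behind.
print(values)
```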
tests/test_utils.py (new file, 38 lines)
@@ -0,0 +1,38 @@
+import unittest
+import re
+
+from mongoengine.base.utils import LazyRegexCompiler
+
+signal_output = []
+
+
+class LazyRegexCompilerTest(unittest.TestCase):
+
+    def test_lazy_regex_compiler_verify_laziness_of_descriptor(self):
+        class UserEmail(object):
+            EMAIL_REGEX = LazyRegexCompiler('@', flags=32)
+
+        descriptor = UserEmail.__dict__['EMAIL_REGEX']
+        self.assertIsNone(descriptor._compiled_regex)
+
+        regex = UserEmail.EMAIL_REGEX
+        self.assertEqual(regex, re.compile('@', flags=32))
+        self.assertEqual(regex.search('user@domain.com').group(), '@')
+
+        user_email = UserEmail()
+        self.assertIs(user_email.EMAIL_REGEX, UserEmail.EMAIL_REGEX)
+
+    def test_lazy_regex_compiler_verify_cannot_set_descriptor_on_instance(self):
+        class UserEmail(object):
+            EMAIL_REGEX = LazyRegexCompiler('@')
+
+        user_email = UserEmail()
+        with self.assertRaises(AttributeError):
+            user_email.EMAIL_REGEX = re.compile('@')
+
+    def test_lazy_regex_compiler_verify_can_override_class_attr(self):
+        class UserEmail(object):
+            EMAIL_REGEX = LazyRegexCompiler('@')
+
+        UserEmail.EMAIL_REGEX = re.compile('cookies')
+        self.assertEqual(UserEmail.EMAIL_REGEX.search('Cake & cookies').group(), 'cookies')
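The new test_utils.py pins three behaviors of `LazyRegexCompiler`: no compilation before first access, a shared cached pattern afterwards, and instance assignment rejected while class-level override still works. A descriptor sketch consistent with those tests — not claimed to be the exact mongoengine source:

```python
import re


class LazyRegexCompiler(object):
    """Descriptor that defers re.compile until first attribute access, then
    caches the compiled pattern on the descriptor itself (a sketch matching
    the behavior the tests above assert).
    """

    def __init__(self, pattern, flags=0):
        self._pattern = pattern
        self._flags = flags
        self._compiled_regex = None

    def __get__(self, instance, owner):
        # Compile lazily on first access (class or instance), then reuse.
        if self._compiled_regex is None:
            self._compiled_regex = re.compile(self._pattern, self._flags)
        return self._compiled_regex

    def __set__(self, instance, value):
        # Defining __set__ makes this a data descriptor, so assigning on an
        # *instance* is rejected -- while assigning on the *class* simply
        # replaces the descriptor object and therefore still works.
        raise AttributeError("Can not set attribute LazyRegexCompiler")


class UserEmail(object):
    EMAIL_REGEX = LazyRegexCompiler('@', flags=32)


assert UserEmail.__dict__['EMAIL_REGEX']._compiled_regex is None  # not compiled yet
regex = UserEmail.EMAIL_REGEX                                     # first access compiles
assert regex.search('user@domain.com').group() == '@'
assert UserEmail().EMAIL_REGEX is UserEmail.EMAIL_REGEX           # cached, shared object
```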
@@ -7,12 +7,12 @@ from mongoengine.connection import get_db, get_connection
 from mongoengine.python_support import IS_PYMONGO_3


-MONGO_TEST_DB = 'mongoenginetest'
+MONGO_TEST_DB = 'mongoenginetest'  # standard name for the test database


 class MongoDBTestCase(unittest.TestCase):
     """Base class for tests that need a mongodb connection
-    db is being dropped automatically
+    It ensures that the db is clean at the beginning and dropped at the end automatically
     """

     @classmethod
@@ -32,6 +32,7 @@ def get_mongodb_version():
     """
     return tuple(get_connection().server_info()['versionArray'])

+
 def _decorated_with_ver_requirement(func, ver_tuple):
     """Return a given function decorated with the version requirement
     for a particular MongoDB version tuple.
@@ -50,18 +51,21 @@ def _decorated_with_ver_requirement(func, ver_tuple):

     return _inner

+
 def needs_mongodb_v26(func):
     """Raise a SkipTest exception if we're working with MongoDB version
     lower than v2.6.
     """
     return _decorated_with_ver_requirement(func, (2, 6))

+
 def needs_mongodb_v3(func):
     """Raise a SkipTest exception if we're working with MongoDB version
     lower than v3.0.
     """
     return _decorated_with_ver_requirement(func, (3, 0))

+
 def skip_pymongo3(f):
     """Raise a SkipTest exception if we're running a test against
     PyMongo v3.x.
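`_decorated_with_ver_requirement`, used by `needs_mongodb_v26`/`needs_mongodb_v3` above, follows the standard skip-decorator pattern. A generic sketch with the server version stubbed out — `FAKE_SERVER_VERSION` is an assumption for the example; the real helper queries the live connection:

```python
import unittest
from functools import wraps

# Stubbed server version (assumption); the real code derives this from
# get_connection().server_info()['versionArray'].
FAKE_SERVER_VERSION = (2, 4, 0)


def needs_version(ver_tuple):
    """Decorator factory: skip the wrapped callable when the (stubbed)
    server version tuple is below the requirement."""
    def decorator(func):
        @wraps(func)
        def _inner(*args, **kwargs):
            if FAKE_SERVER_VERSION < ver_tuple:
                raise unittest.SkipTest(
                    'Needs MongoDB v%s+' % '.'.join(map(str, ver_tuple)))
            return func(*args, **kwargs)
        return _inner
    return decorator


@needs_version((2, 6))
def run_only_on_26_plus():
    return 'ran'
```

Tuple comparison does the version math for free: `(2, 4, 0) < (2, 6)` is True element by element, so the wrapped call is skipped under the stub.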