Merge branch 'master' into fix/add-dev-tools

Filip Kucharczyk 2020-03-27 15:15:14 +01:00 committed by GitHub
commit 2be28a22a7
31 changed files with 674 additions and 404 deletions


@@ -3,7 +3,7 @@
# with a very large number of jobs, hence we only test a subset of all the
# combinations:
# * MongoDB v3.4 & the latest PyMongo v3.x is currently the "main" setup,
-#   tested against Python v2.7, v3.5, v3.6, and PyPy.
+#   tested against Python v2.7, v3.5, v3.6, v3.7, v3.8, PyPy and PyPy3.
# * Besides that, we test the lowest actively supported Python/MongoDB/PyMongo
#   combination: MongoDB v3.4, PyMongo v3.4, Python v2.7.
# * MongoDB v3.6 is tested against Python v3.6, and PyMongo v3.6, v3.7, v3.8.
@@ -23,6 +23,7 @@ python:
- 3.5
- 3.6
- 3.7
+- 3.8
- pypy
- pypy3
@@ -32,14 +33,16 @@ env:
global:
  - MONGODB_3_4=3.4.17
  - MONGODB_3_6=3.6.12
-  - PYMONGO_3_9=3.9
+  - MONGODB_4_0=4.0.13
-  - PYMONGO_3_6=3.6
  - PYMONGO_3_4=3.4
+  - PYMONGO_3_6=3.6
+  - PYMONGO_3_9=3.9
+  - PYMONGO_3_10=3.10
matrix:
-  - MONGODB=${MONGODB_3_4} PYMONGO=${PYMONGO_3_9}
+  - MONGODB=${MONGODB_3_4} PYMONGO=${PYMONGO_3_10}
matrix:
  # Finish the build as soon as one job fails
  fast_finish: true
@@ -50,7 +53,10 @@ matrix:
    env: MONGODB=${MONGODB_3_6} PYMONGO=${PYMONGO_3_6}
  - python: 3.7
    env: MONGODB=${MONGODB_3_6} PYMONGO=${PYMONGO_3_9}
+  - python: 3.7
+    env: MONGODB=${MONGODB_3_6} PYMONGO=${PYMONGO_3_10}
+  - python: 3.8
+    env: MONGODB=${MONGODB_4_0} PYMONGO=${PYMONGO_3_10}
install:
  # Install Mongo
@@ -110,5 +116,5 @@ deploy:
  on:
    tags: true
    repo: MongoEngine/mongoengine
-    condition: ($PYMONGO = ${PYMONGO_3_6}) && ($MONGODB = ${MONGODB_3_4})
+    condition: ($PYMONGO = ${PYMONGO_3_10}) && ($MONGODB = ${MONGODB_3_4})
  python: 2.7


@@ -253,3 +253,6 @@ that much better:
* Gaurav Dadhania (https://github.com/GVRV)
* Yurii Andrieiev (https://github.com/yandrieiev)
* Filip Kucharczyk (https://github.com/Pacu2)
+* Eric Timmons (https://github.com/daewok)
+* Matthew Simpson (https://github.com/mcsimps2)
+* Leonardo Domingues (https://github.com/leodmgs)


@@ -26,10 +26,10 @@ an `API reference <https://mongoengine-odm.readthedocs.io/apireference.html>`_.
Supported MongoDB Versions
==========================
-MongoEngine is currently tested against MongoDB v3.4 and v3.6. Future versions
+MongoEngine is currently tested against MongoDB v3.4, v3.6 and v4.0. Future versions
should be supported as well, but aren't actively tested at the moment. Make
sure to open an issue or submit a pull request if you experience any problems
-with MongoDB version > 3.6.
+with MongoDB version > 4.0.

Installation
============
@@ -91,12 +91,11 @@ Some simple examples of what MongoEngine code looks like:

    # Iterate over all posts using the BlogPost superclass
    >>> for post in BlogPost.objects:
-    ...     print '===', post.title, '==='
+    ...     print('===', post.title, '===')
    ...     if isinstance(post, TextPost):
-    ...         print post.content
+    ...         print(post.content)
    ...     elif isinstance(post, LinkPost):
-    ...         print 'Link:', post.url
+    ...         print('Link:', post.url)
-    ...     print
    ...

    # Count all blog posts and its subtypes


@@ -6,8 +6,26 @@ Changelog
Development
===========
- (Fill this out as you fix issues and develop your features).
-- Documentation improvements:
-    - Documented how `pymongo.monitoring` can be used to log all queries issued by MongoEngine to the driver.
+- Add Mongo 4.0 to Travis
+- Fixed a bug causing inaccurate query results, while combining ``__raw__`` and regular filters for the same field #2264
+- Add support for the `elemMatch` projection operator in .fields() (e.g BlogPost.objects.fields(elemMatch__comments="test")) #2267
+- DictField validate failed without default connection (bug introduced in 0.19.0) #2239
+- Remove methods deprecated years ago:
+    - name parameter in Field constructor e.g `StringField(name="...")`, was replaced by db_field
+    - Queryset.slave_okay() was deprecated since pymongo3
+    - dropDups was dropped with MongoDB3
+    - ``Queryset._ensure_indexes`` and ``Queryset.ensure_indexes``, the right method to use is ``Document.ensure_indexes``
+- Added pre-commit #2212
+- Renamed requirements-lint.txt to requirements-dev.txt #2212
+
+Changes in 0.19.1
+=================
+- Requires Pillow < 7.0.0 as it dropped Python2 support
+- DEPRECATION: The interface of ``QuerySet.aggregate`` method was changed, it no longer takes an unpacked list of
+  pipeline steps (*pipeline) but simply takes the pipeline list just like ``pymongo.Collection.aggregate`` does. #2079
+
+Changes in 0.19.0
+=================
- BREAKING CHANGE: ``class_check`` and ``read_preference`` keyword arguments are no longer available when filtering a ``QuerySet``. #2112
    - Instead of ``Doc.objects(foo=bar, read_preference=...)`` use ``Doc.objects(foo=bar).read_preference(...)``.
    - Instead of ``Doc.objects(foo=bar, class_check=False)`` use ``Doc.objects(foo=bar).clear_cls_query(...)``.
@@ -17,16 +35,23 @@ Development
    - If you catch/use ``MongoEngineConnectionError`` in your code, you'll have to rename it.
- BREAKING CHANGE: Positional arguments when instantiating a document are no longer supported. #2103
    - From now on keyword arguments (e.g. ``Doc(field_name=value)``) are required.
-- Improve error message related to InvalidDocumentError #2180
+- BREAKING CHANGE: A ``LazyReferenceField`` is now stored in the ``_data`` field of its parent as a ``DBRef``, ``Document``, or ``EmbeddedDocument`` (``ObjectId`` is no longer allowed). #2182
+- DEPRECATION: ``Q.empty`` & ``QNode.empty`` are marked as deprecated and will be removed in a next version of MongoEngine. #2210
+    - Added ability to check if Q or QNode are empty by parsing them to bool.
+    - Instead of ``Q(name="John").empty`` use ``not Q(name="John")``.
- Fix updating/modifying/deleting/reloading a document that's sharded by a field with ``db_field`` specified. #2125
+- Only set no_cursor_timeout when requested (fixes an incompatibility with MongoDB 4.2) #2148
- ``ListField`` now accepts an optional ``max_length`` parameter. #2110
-- Switch from nosetest to pytest as test runner #2114
-- The codebase is now formatted using ``black``. #2109
+- Improve error message related to InvalidDocumentError #2180
+- Added BulkWriteError to replace NotUniqueError which was misleading in bulk write insert #2152
+    - In bulk write insert, the detailed error message would raise in exception.
- Added ability to compare Q and Q operations #2204
- Added ability to use a db alias on query_counter #2194
-- Added pre-commit #2212
-- Renamed requirements-lint.txt to requirements-dev.txt #2212
+- Added ability to specify collations for querysets with ``Doc.objects.collation`` #2024
+- Fix updates of a list field by negative index #2094
+- Switch from nosetest to pytest as test runner #2114
+- The codebase is now formatted using ``black``. #2109
+- Documentation improvements:
+    - Documented how `pymongo.monitoring` can be used to log all queries issued by MongoEngine to the driver.

Changes in 0.18.2
=================
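For context on the ``__raw__`` fix listed above (#2264), the kind of combination that used to return wrong results looks roughly like this. A minimal sketch; the ``Doc`` model and field are illustrative, not from the changeset::

    from mongoengine import Document, IntField

    class Doc(Document):
        v = IntField()

    # A raw condition and a regular filter on the same field are now merged
    # into one query document instead of one silently clobbering the other
    docs = Doc.objects(__raw__={"v": {"$gte": 2}}, v__lte=5)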


@@ -352,7 +352,7 @@ Its value can take any of the following constants:
    Deletion is denied if there still exist references to the object being
    deleted.
:const:`mongoengine.NULLIFY`
-    Any object's fields still referring to the object being deleted are removed
+    Any object's fields still referring to the object being deleted are set to None
    (using MongoDB's "unset" operation), effectively nullifying the relationship.
:const:`mongoengine.CASCADE`
    Any object containing fields that are referring to the object being deleted
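A minimal sketch of how these rules are wired up in practice (the ``Author``/``Book``/``Review`` models here are illustrative, not from the changeset)::

    from mongoengine import Document, StringField, ReferenceField, CASCADE, NULLIFY

    class Author(Document):
        name = StringField()

    class Book(Document):
        title = StringField()
        # When the referenced Author is deleted, delete the Book too
        author = ReferenceField(Author, reverse_delete_rule=CASCADE)

    class Review(Document):
        text = StringField()
        # When the referenced Book is deleted, set this field to None instead
        book = ReferenceField(Book, reverse_delete_rule=NULLIFY)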
@@ -555,7 +555,6 @@ There are a few top level defaults for all indexes that can be set::
        'index_background': True,
        'index_cls': False,
        'auto_create_index': True,
-        'index_drop_dups': True,
    }
@@ -574,11 +573,6 @@ There are a few top level defaults for all indexes that can be set::
    in systems where indexes are managed separately. Disabling this will improve
    performance.

-:attr:`index_drop_dups` (Optional)
-    Set the default value for if an index should drop duplicates
-    Since MongoDB 3.0 drop_dups is not supported anymore. Raises a Warning
-    and has no effect

Compound Indexes and Indexing sub documents
-------------------------------------------
@@ -744,7 +738,7 @@ Document inheritance
To create a specialised type of a :class:`~mongoengine.Document` you have
defined, you may subclass it and add any extra fields or methods you may need.
-As this is new class is not a direct subclass of
+As this new class is not a direct subclass of
:class:`~mongoengine.Document`, it will not be stored in its own collection; it
will use the same collection as its superclass uses. This allows for more
convenient and efficient retrieval of related documents -- all you need do is
@@ -767,6 +761,27 @@ document.::
    Setting :attr:`allow_inheritance` to True should also be used in
    :class:`~mongoengine.EmbeddedDocument` class in case you need to subclass it

+When it comes to querying using :attr:`.objects()`, querying `Page.objects()` will query
+both `Page` and `DatedPage` whereas querying `DatedPage` will only query the `DatedPage` documents.
+Behind the scenes, MongoEngine deals with inheritance by adding a :attr:`_cls` attribute that contains
+the class name in every document. When a document is loaded, MongoEngine checks
+its :attr:`_cls` attribute and uses that class to construct the instance.::
+
+    Page(title='a funky title').save()
+    DatedPage(title='another title', date=datetime.utcnow()).save()
+
+    print(Page.objects().count())         # 2
+    print(DatedPage.objects().count())    # 1
+
+    # print documents in their native form
+    # we remove 'id' to avoid polluting the output with unnecessary detail
+    qs = Page.objects.exclude('id').as_pymongo()
+    print(list(qs))
+    # [
+    #     {'_cls': u'Page', 'title': 'a funky title'},
+    #     {'_cls': u'Page.DatedPage', 'title': u'another title', 'date': datetime.datetime(2019, 12, 13, 20, 16, 59, 993000)}
+    # ]
+
Working with existing data
--------------------------
As MongoEngine no longer defaults to needing :attr:`_cls`, you can quickly and


@@ -10,8 +10,9 @@ Writing
GridFS support comes in the form of the :class:`~mongoengine.fields.FileField` field
object. This field acts as a file-like object and provides a couple of
different ways of inserting and retrieving data. Arbitrary metadata such as
-content type can also be stored alongside the files. In the following example,
-a document is created to store details about animals, including a photo::
+content type can also be stored alongside the files. The object returned when accessing a
+FileField is a proxy to `Pymongo's GridFS <https://api.mongodb.com/python/current/examples/gridfs.html#gridfs-example>`_.
+In the following example, a document is created to store details about animals, including a photo::

    class Animal(Document):
        genus = StringField()
@@ -20,8 +21,8 @@ a document is created to store details about animals, including a photo::

    marmot = Animal(genus='Marmota', family='Sciuridae')

-    marmot_photo = open('marmot.jpg', 'rb')
-    marmot.photo.put(marmot_photo, content_type = 'image/jpeg')
+    with open('marmot.jpg', 'rb') as fd:
+        marmot.photo.put(fd, content_type = 'image/jpeg')
    marmot.save()

Retrieval
@@ -34,6 +35,20 @@ field. The file can also be retrieved just as easily::
    photo = marmot.photo.read()
    content_type = marmot.photo.content_type

+.. note:: If you need to read() the content of a file multiple times, you'll need to "rewind"
+    the file-like object using `seek`::
+
+        marmot = Animal.objects(genus='Marmota').first()
+        content1 = marmot.photo.read()
+        assert content1 != ""
+
+        content2 = marmot.photo.read()    # will be empty
+        assert content2 == ""
+
+        marmot.photo.seek(0)    # rewind the file by setting the current position of the cursor in the file to 0
+        content3 = marmot.photo.read()
+        assert content3 == content1
+
Streaming
---------


@@ -21,7 +21,7 @@ or with an alias:
    conn = get_connection('testdb')

Example of test file:
---------
+---------------------
.. code-block:: python

    import unittest
@@ -45,4 +45,4 @@ Example of test file:
        pers.save()

        fresh_pers = Person.objects().first()
-        self.assertEqual(fresh_pers.name, 'John')
+        assert fresh_pers.name == 'John'


@@ -222,6 +222,18 @@ keyword argument::

.. versionadded:: 0.4

+Sorting/Ordering results
+========================
+It is possible to order the results by 1 or more keys using :meth:`~mongoengine.queryset.QuerySet.order_by`.
+The order may be specified by prepending each of the keys by "+" or "-". Ascending order is assumed if there's no prefix.::
+
+    # Order by ascending date
+    blogs = BlogPost.objects().order_by('date')    # equivalent to .order_by('+date')
+
+    # Order by ascending date first, then descending title
+    blogs = BlogPost.objects().order_by('+date', '-title')
+
Limiting and skipping results
=============================
Just as with traditional ORMs, you may limit the number of results returned or
@@ -388,7 +400,7 @@ would be generating "tag-clouds"::

MongoDB aggregation API
-----------------------
-If you need to run aggregation pipelines, MongoEngine provides an entry point `Pymongo's aggregation framework <https://api.mongodb.com/python/current/examples/aggregation.html#aggregation-framework>`_
+If you need to run aggregation pipelines, MongoEngine provides an entry point to `Pymongo's aggregation framework <https://api.mongodb.com/python/current/examples/aggregation.html#aggregation-framework>`_
through :meth:`~mongoengine.queryset.QuerySet.aggregate`. Check out Pymongo's documentation for the syntax and pipeline.
An example of its use would be::
@@ -402,7 +414,7 @@ An example of its use would be::
        {"$sort" : {"name" : -1}},
        {"$project": {"_id": 0, "name": {"$toUpper": "$name"}}}
    ]
-    data = Person.objects().aggregate(*pipeline)
+    data = Person.objects().aggregate(pipeline)
    assert data == [{'name': 'BOB'}, {'name': 'JOHN'}]

Query efficiency and performance
@@ -585,7 +597,8 @@ cannot use the `$` syntax in keyword arguments it has been mapped to `S`::
    ['database', 'mongodb']

From MongoDB version 2.6, push operator supports $position value which allows
-to push values with index.
+to push values with index::

    >>> post = BlogPost(title="Test", tags=["mongo"])
    >>> post.save()
    >>> post.update(push__tags__0=["database", "code"])


@@ -52,7 +52,7 @@ rename its occurrences.
This release includes a major rehaul of MongoEngine's code quality and
introduces a few breaking changes. It also touches many different parts of
the package and although all the changes have been tested and scrutinized,
-you're encouraged to thorougly test the upgrade.
+you're encouraged to thoroughly test the upgrade.

First breaking change involves renaming `ConnectionError` to `MongoEngineConnectionError`.
If you import or catch this exception, you'll need to rename it in your code.


@@ -28,7 +28,7 @@ __all__ = (
)

-VERSION = (0, 18, 2)
+VERSION = (0, 19, 1)

def get_version():


@@ -120,6 +120,9 @@ class BaseList(list):
        super(BaseList, self).__init__(list_items)

    def __getitem__(self, key):
+        # change index to a positive value because MongoDB does not support negative ones
+        if isinstance(key, int) and key < 0:
+            key = len(self) + key
        value = super(BaseList, self).__getitem__(key)

        if isinstance(key, slice):
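A quick sketch of what this enables, mirroring the new test further below (the ``Company`` model is illustrative)::

    from mongoengine import Document, StringField, ListField, DictField

    class Company(Document):
        name = StringField()
        employees = ListField(field=DictField())

    comp = Company(name="BigBank", employees=[{"name": "John"}, {"name": "Bill"}]).save()
    # Mutating through a negative index now maps to the correct positive
    # position when the change is persisted
    comp.employees[-1]["color"] = "blue"
    comp.save()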


@@ -36,7 +36,6 @@ class BaseField(object):
    def __init__(
        self,
        db_field=None,
-        name=None,
        required=False,
        default=None,
        unique=False,
@@ -51,7 +50,6 @@ class BaseField(object):
        """
        :param db_field: The database field to store this field in
            (defaults to the name of the field)
-        :param name: Deprecated - use db_field
        :param required: If the field is required. Whether it has to have a
            value or not. Defaults to False.
        :param default: (optional) The default value for this field if no value
@@ -75,11 +73,8 @@ class BaseField(object):
        existing attributes. Common metadata includes `verbose_name` and
        `help_text`.
        """
-        self.db_field = (db_field or name) if not primary_key else "_id"
-        if name:
-            msg = 'Field\'s "name" attribute deprecated in favour of "db_field"'
-            warnings.warn(msg, DeprecationWarning)
+        self.db_field = db_field if not primary_key else "_id"

        self.required = required or primary_key
        self.default = default
        self.unique = bool(unique or unique_with)
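For anyone still passing the removed ``name`` argument, the migration is a one-word rename (sketch; the ``Page`` model is illustrative)::

    from mongoengine import Document, StringField

    class Page(Document):
        # Before (removed): title = StringField(name="page_title")
        # After: store the field under the "page_title" key in MongoDB
        title = StringField(db_field="page_title")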


@@ -284,7 +284,6 @@ class TopLevelDocumentMetaclass(DocumentMetaclass):
        "indexes": [],  # indexes to be ensured at runtime
        "id_field": None,
        "index_background": False,
-        "index_drop_dups": False,
        "index_opts": None,
        "delete_rules": None,
        # allow_inheritance can be True, False, and None. True means


@@ -56,7 +56,7 @@ class InvalidCollectionError(Exception):

class EmbeddedDocument(six.with_metaclass(DocumentMetaclass, BaseDocument)):
-    """A :class:`~mongoengine.Document` that isn't stored in its own
+    r"""A :class:`~mongoengine.Document` that isn't stored in its own
    collection. :class:`~mongoengine.EmbeddedDocument`\ s should be used as
    fields on :class:`~mongoengine.Document`\ s through the
    :class:`~mongoengine.EmbeddedDocumentField` field type.
@@ -332,7 +332,7 @@ class Document(six.with_metaclass(TopLevelDocumentMetaclass, BaseDocument)):
    ):
        """Save the :class:`~mongoengine.Document` to the database. If the
        document already exists, it will be updated, otherwise it will be
-        created.
+        created. Returns the saved object instance.

        :param force_insert: only try to create a new document, don't allow
            updates of existing documents.
@@ -851,17 +851,13 @@ class Document(six.with_metaclass(TopLevelDocumentMetaclass, BaseDocument)):
        index_spec = cls._build_index_spec(keys)
        index_spec = index_spec.copy()
        fields = index_spec.pop("fields")
-        drop_dups = kwargs.get("drop_dups", False)
-        if drop_dups:
-            msg = "drop_dups is deprecated and is removed when using PyMongo 3+."
-            warnings.warn(msg, DeprecationWarning)
        index_spec["background"] = background
        index_spec.update(kwargs)

        return cls._get_collection().create_index(fields, **index_spec)

    @classmethod
-    def ensure_index(cls, key_or_list, drop_dups=False, background=False, **kwargs):
+    def ensure_index(cls, key_or_list, background=False, **kwargs):
        """Ensure that the given indexes are in place. Deprecated in favour
        of create_index.

@@ -869,12 +865,7 @@ class Document(six.with_metaclass(TopLevelDocumentMetaclass, BaseDocument)):
            construct a multi-field index); keys may be prefixed with a **+**
            or a **-** to determine the index ordering
        :param background: Allows index creation in the background
-        :param drop_dups: Was removed/ignored with MongoDB >2.7.5. The value
-            will be removed if PyMongo3+ is used
        """
-        if drop_dups:
-            msg = "drop_dups is deprecated and is removed when using PyMongo 3+."
-            warnings.warn(msg, DeprecationWarning)
        return cls.create_index(key_or_list, background=background, **kwargs)

    @classmethod
@@ -887,12 +878,8 @@ class Document(six.with_metaclass(TopLevelDocumentMetaclass, BaseDocument)):
        `auto_create_index` to False in the documents meta data
        """
        background = cls._meta.get("index_background", False)
-        drop_dups = cls._meta.get("index_drop_dups", False)
        index_opts = cls._meta.get("index_opts") or {}
        index_cls = cls._meta.get("index_cls", True)
-        if drop_dups:
-            msg = "drop_dups is deprecated and is removed when using PyMongo 3+."
-            warnings.warn(msg, DeprecationWarning)

        collection = cls._get_collection()
        # 746: when connection is via mongos, the read preference is not necessarily an indication that


@@ -41,6 +41,7 @@ from mongoengine.common import _import_class
from mongoengine.connection import DEFAULT_CONNECTION_NAME, get_db
from mongoengine.document import Document, EmbeddedDocument
from mongoengine.errors import DoesNotExist, InvalidQueryError, ValidationError
+from mongoengine.mongodb_support import MONGODB_36, get_mongodb_version
from mongoengine.python_support import StringIO
from mongoengine.queryset import DO_NOTHING
from mongoengine.queryset.base import BaseQuerySet
@@ -1051,6 +1052,15 @@ def key_has_dot_or_dollar(d):
            return True

+def key_starts_with_dollar(d):
+    """Helper function to recursively determine if any key in a
+    dictionary starts with a dollar
+    """
+    for k, v in d.items():
+        if (k.startswith("$")) or (isinstance(v, dict) and key_starts_with_dollar(v)):
+            return True
+
class DictField(ComplexBaseField):
    """A dictionary field that wraps a standard Python dictionary. This is
    similar to an embedded document, but the structure is not defined.
@@ -1077,10 +1087,15 @@ class DictField(ComplexBaseField):
        if key_not_string(value):
            msg = "Invalid dictionary key - documents must have only string keys"
            self.error(msg)
-        if key_has_dot_or_dollar(value):
+
+        # Following condition applies to MongoDB >= 3.6
+        # older Mongo has stricter constraints but
+        # it will be rejected upon insertion anyway
+        # Having a validation that depends on the MongoDB version
+        # is not straightforward as the field isn't aware of the connected Mongo
+        if key_starts_with_dollar(value):
            self.error(
-                'Invalid dictionary key name - keys may not contain "."'
-                ' or startswith "$" characters'
+                'Invalid dictionary key name - keys may not startswith "$" characters'
            )
        super(DictField, self).validate(value)
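In practice, the relaxed validation behaves roughly like this (a sketch; the model is illustrative)::

    from mongoengine import Document, DictField, ValidationError

    class BlogPost(Document):
        info = DictField()

    post = BlogPost(info={"$title": "test"})
    try:
        post.validate()                 # keys starting with "$" are still rejected
    except ValidationError:
        pass

    post.info = {"the.title": "test"}   # dotted keys now pass field validation;
    post.validate()                     # MongoDB < 3.6 would reject them at save() time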
@@ -2502,6 +2517,13 @@ class LazyReferenceField(BaseField):
        else:
            return pk

+    def to_python(self, value):
+        """Convert a MongoDB-compatible type to a Python type."""
+        if not isinstance(value, (DBRef, Document, EmbeddedDocument)):
+            collection = self.document_type._get_collection_name()
+            value = DBRef(collection, self.document_type.id.to_python(value))
+        return value
+
    def validate(self, value):
        if isinstance(value, LazyReference):
            if value.collection != self.document_type._get_collection_name():
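The effect of the new ``to_python`` is that a raw ``ObjectId`` is normalised to a ``DBRef`` before it lands in ``_data``, which is what makes the embedded-document equality in the new test below hold. Roughly, under illustrative models::

    from mongoengine import (
        Document, EmbeddedDocument, EmbeddedDocumentField, LazyReferenceField,
    )

    class Boss(Document):
        pass

    class Job(EmbeddedDocument):
        boss = LazyReferenceField(Boss)

    class Worker(Document):
        job = EmbeddedDocumentField(Job)

    boss = Boss().save()
    worker = Worker(job=Job(boss=boss.id)).save()  # passing the raw id...
    # ...is normalised, so the loaded job compares equal to one built from the object
    assert Worker.objects.get(id=worker.id).job == Job(boss=boss)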


@@ -11,7 +11,7 @@ MONGODB_36 = (3, 6)

def get_mongodb_version():
-    """Return the version of the connected mongoDB (first 2 digits)
+    """Return the version of the default connected mongoDB (first 2 digits)

    :return: tuple(int, int)
    """


@@ -60,7 +60,6 @@ class BaseQuerySet(object):
        self._ordering = None
        self._snapshot = False
        self._timeout = True
-        self._slave_okay = False
        self._read_preference = None
        self._iter = False
        self._scalar = []
@@ -302,7 +301,7 @@ class BaseQuerySet(object):
            ``insert(..., {w: 2, fsync: True})`` will wait until at least
            two servers have recorded the write and will force an fsync on
            each server being written to.
-        :parm signal_kwargs: (optional) kwargs dictionary to be passed to
+        :param signal_kwargs: (optional) kwargs dictionary to be passed to
            the signal calls.

        By default returns document instances, set ``load_bulk`` to False to
@@ -694,8 +693,8 @@ class BaseQuerySet(object):
    def in_bulk(self, object_ids):
        """Retrieve a set of documents by their ids.

-        :param object_ids: a list or tuple of ``ObjectId``\ s
-        :rtype: dict of ObjectIds as keys and collection-specific
+        :param object_ids: a list or tuple of ObjectId's
+        :rtype: dict of ObjectId's as keys and collection-specific
            Document subclasses as values.

        .. versionadded:: 0.3
@@ -775,7 +774,6 @@ class BaseQuerySet(object):
            "_ordering",
            "_snapshot",
            "_timeout",
-            "_slave_okay",
            "_read_preference",
            "_iter",
            "_scalar",
@@ -1026,9 +1024,11 @@ class BaseQuerySet(object):

            posts = BlogPost.objects(...).fields(comments=0)

-        To retrieve a subrange of array elements:
+        To retrieve a subrange or sublist of array elements,
+        support exist for both the `slice` and `elemMatch` projection operator:

            posts = BlogPost.objects(...).fields(slice__comments=5)
+            posts = BlogPost.objects(...).fields(elemMatch__comments="test")

        :param kwargs: A set of keyword arguments identifying what to
            include, exclude, or slice.
@@ -1037,7 +1037,7 @@ class BaseQuerySet(object):
        """

        # Check for an operator and transform to mongo-style if there is
-        operators = ["slice"]
+        operators = ["slice", "elemMatch"]
        cleaned_fields = []
        for key, value in kwargs.items():
            parts = key.split("__")
@@ -1140,7 +1140,7 @@ class BaseQuerySet(object):
    def explain(self):
        """Return an explain plan record for the
-        :class:`~mongoengine.queryset.QuerySet`\ 's cursor.
+        :class:`~mongoengine.queryset.QuerySet` cursor.
        """
        return self._cursor.explain()
@@ -1170,20 +1170,6 @@ class BaseQuerySet(object):
        queryset._timeout = enabled
        return queryset

-    # DEPRECATED. Has no more impact on PyMongo 3+
-    def slave_okay(self, enabled):
-        """Enable or disable the slave_okay when querying.
-
-        :param enabled: whether or not the slave_okay is enabled
-
-        .. deprecated:: Ignored with PyMongo 3+
-        """
-        msg = "slave_okay is deprecated as it has no impact when using PyMongo 3+."
-        warnings.warn(msg, DeprecationWarning)
-        queryset = self.clone()
-        queryset._slave_okay = enabled
-        return queryset
-
    def read_preference(self, read_preference):
        """Change the read_preference when querying.
@@ -1255,16 +1241,27 @@ class BaseQuerySet(object):
            for data in son_data
        ]

-    def aggregate(self, *pipeline, **kwargs):
-        """
-        Perform a aggregate function based in your queryset params
+    def aggregate(self, pipeline, *suppl_pipeline, **kwargs):
+        """Perform a aggregate function based in your queryset params
+
        :param pipeline: list of aggregation commands,\
            see: http://docs.mongodb.org/manual/core/aggregation-pipeline/
+        :param suppl_pipeline: unpacked list of pipeline steps (added to support deprecation of the old interface);
+            this parameter will be removed shortly
+        :param kwargs: (optional) kwargs dictionary to be passed to pymongo's aggregate call
+            See https://api.mongodb.com/python/current/api/pymongo/collection.html#pymongo.collection.Collection.aggregate
        .. versionadded:: 0.9
        """
+        using_deprecated_interface = isinstance(pipeline, dict) or bool(suppl_pipeline)
+        user_pipeline = [pipeline] if isinstance(pipeline, dict) else list(pipeline)
+
+        if using_deprecated_interface:
+            msg = "Calling .aggregate() with an unpacked list (*pipeline) is deprecated, it will soon change and will expect a list (similar to pymongo.Collection.aggregate interface), see documentation"
+            warnings.warn(msg, DeprecationWarning)
+
+        user_pipeline += suppl_pipeline
+
        initial_pipeline = []
        if self._query:
            initial_pipeline.append({"$match": self._query})
@@ -1281,14 +1278,14 @@ class BaseQuerySet(object):
        if self._skip is not None:
            initial_pipeline.append({"$skip": self._skip})

-        pipeline = initial_pipeline + list(pipeline)
+        final_pipeline = initial_pipeline + user_pipeline

+        collection = self._collection
        if self._read_preference is not None:
-            return self._collection.with_options(
+            collection = self._collection.with_options(
                read_preference=self._read_preference
-            ).aggregate(pipeline, cursor={}, **kwargs)
+            )

-        return self._collection.aggregate(pipeline, cursor={}, **kwargs)
+        return collection.aggregate(final_pipeline, cursor={}, **kwargs)
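Usage-wise, the deprecation boils down to wrapping the steps in a list (sketch; the ``Person`` model is illustrative)::

    pipeline = [
        {"$match": {"name": "John"}},
        {"$project": {"_id": 0, "name": {"$toUpper": "$name"}}},
    ]

    # Deprecated: unpacked steps, still accepted but emits a DeprecationWarning
    data = Person.objects().aggregate(*pipeline)

    # Preferred: pass the pipeline list itself, as pymongo.Collection.aggregate does
    data = Person.objects().aggregate(pipeline)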
    # JS functionality
    def map_reduce(
@@ -1947,23 +1944,3 @@ class BaseQuerySet(object):
                setattr(queryset, "_" + method_name, val)

        return queryset
-
-    # Deprecated
-    def ensure_index(self, **kwargs):
-        """Deprecated use :func:`Document.ensure_index`"""
-        msg = (
-            "Doc.objects()._ensure_index() is deprecated. "
-            "Use Doc.ensure_index() instead."
-        )
-        warnings.warn(msg, DeprecationWarning)
-        self._document.__class__.ensure_index(**kwargs)
-        return self
-
-    def _ensure_indexes(self):
-        """Deprecated use :func:`~Document.ensure_indexes`"""
-        msg = (
-            "Doc.objects()._ensure_indexes() is deprecated. "
-            "Use Doc.ensure_indexes() instead."
-        )
-        warnings.warn(msg, DeprecationWarning)
-        self._document.__class__.ensure_indexes()


@@ -169,9 +169,9 @@ def query(_doc_cls=None, **kwargs):
            key = ".".join(parts)

-        if op is None or key not in mongo_query:
+        if key not in mongo_query:
            mongo_query[key] = value
-        elif key in mongo_query:
+        else:
            if isinstance(mongo_query[key], dict) and isinstance(value, dict):
                mongo_query[key].update(value)
                # $max/minDistance needs to come last - convert to SON


@@ -1,4 +1,5 @@
import copy
+import warnings

from mongoengine.errors import InvalidQueryError
from mongoengine.queryset import transform
@@ -108,6 +109,8 @@ class QNode(object):
    @property
    def empty(self):
+        msg = "'empty' property is deprecated in favour of using 'not bool(filter)'"
+        warnings.warn(msg, DeprecationWarning)
        return False

    def __or__(self, other):
@@ -137,6 +140,11 @@ class QCombination(QNode):
        op = " & " if self.operation is self.AND else " | "
        return "(%s)" % op.join([repr(node) for node in self.children])

+    def __bool__(self):
+        return bool(self.children)
+
+    __nonzero__ = __bool__  # For Py2 support
+
    def accept(self, visitor):
        for i in range(len(self.children)):
            if isinstance(self.children[i], QNode):
@@ -146,6 +154,8 @@ class QCombination(QNode):
    @property
    def empty(self):
+        msg = "'empty' property is deprecated in favour of using 'not bool(filter)'"
+        warnings.warn(msg, DeprecationWarning)
        return not bool(self.children)

    def __eq__(self, other):
@@ -167,12 +177,17 @@ class Q(QNode):
    def __repr__(self):
        return "Q(**%s)" % repr(self.query)

+    def __bool__(self):
+        return bool(self.query)
+
+    __nonzero__ = __bool__  # For Py2 support
+
+    def __eq__(self, other):
+        return self.__class__ == other.__class__ and self.query == other.query
+
    def accept(self, visitor):
        return visitor.visit_query(self)

    @property
    def empty(self):
        return not bool(self.query)
-
-    def __eq__(self, other):
-        return self.__class__ == other.__class__ and self.query == other.query
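The replacement idiom for the deprecated ``empty`` property, in short::

    from mongoengine import Q

    assert not Q()                  # an empty Q is falsy
    assert bool(Q(name="John"))     # a populated Q is truthy

    # instead of Q(name="John").empty, write:
    if not Q(name="John"):
        pass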


@@ -108,6 +108,10 @@ CLASSIFIERS = [
    "Topic :: Software Development :: Libraries :: Python Modules",
]

+PYTHON_VERSION = sys.version_info[0]
+PY3 = PYTHON_VERSION == 3
+PY2 = PYTHON_VERSION == 2
+
extra_opts = {
    "packages": find_packages(exclude=["tests", "tests.*"]),
    "tests_require": [
@@ -115,10 +119,11 @@ extra_opts = {
        "pytest-cov",
        "coverage<5.0",  # recent coverage switched to sqlite format for the .coverage file which isn't handled properly by coveralls
        "blinker",
-        "Pillow>=2.0.0",
+        "Pillow>=2.0.0, <7.0.0",  # 7.0.0 dropped Python2 support
+        "zipp<2.0.0",  # (dependency of pytest) dropped python2 support
    ],
}

-if sys.version_info[0] == 3:
+if PY3:
    extra_opts["use_2to3"] = True
    if "test" in sys.argv:
        extra_opts["packages"] = find_packages()
@@ -143,7 +148,7 @@ setup(
    long_description=LONG_DESCRIPTION,
    platforms=["any"],
    classifiers=CLASSIFIERS,
-    install_requires=["pymongo>=3.4", "six>=1.10.0"],
+    install_requires=["pymongo>=3.4, <4.0", "six>=1.10.0"],
    cmdclass={"test": PyTest},
    **extra_opts
)


@@ -806,18 +806,6 @@ class TestIndexes(unittest.TestCase):
        info = Log.objects._collection.index_information()
        assert 3600 == info["created_1"]["expireAfterSeconds"]

-    def test_index_drop_dups_silently_ignored(self):
-        class Customer(Document):
-            cust_id = IntField(unique=True, required=True)
-            meta = {
-                "indexes": ["cust_id"],
-                "index_drop_dups": True,
-                "allow_inheritance": False,
-            }
-
-        Customer.drop_collection()
-        Customer.objects.first()
-
    def test_unique_and_indexes(self):
        """Ensure that 'unique' constraints aren't overridden by
        meta.indexes.
@@ -1058,10 +1046,6 @@ class TestIndexes(unittest.TestCase):
                del index_info[key][
                    "ns"
                ]  # drop the index namespace - we don't care about that here, MongoDB 3+
-                if "dropDups" in index_info[key]:
-                    del index_info[key][
-                        "dropDups"
-                    ]  # drop the index dropDups - it is deprecated in MongoDB 3+

        assert index_info == {
            "txt_1": {"key": [("txt", 1)], "background": False},


@@ -523,7 +523,6 @@ class TestInheritance(MongoDBTestCase):

        defaults = {
            "index_background": True,
-            "index_drop_dups": True,
            "index_opts": {"hello": "world"},
            "allow_inheritance": True,
            "queryset_class": "QuerySet",


@@ -41,7 +41,7 @@ from tests.utils import MongoDBTestCase, get_as_pymongo
TEST_IMAGE_PATH = os.path.join(os.path.dirname(__file__), "../fields/mongoengine.png")

-class TestInstance(MongoDBTestCase):
+class TestDocumentInstance(MongoDBTestCase):
    def setUp(self):
        class Job(EmbeddedDocument):
            name = StringField()
@@ -3319,6 +3319,39 @@ class TestInstance(MongoDBTestCase):
        f1.ref  # Dereferences lazily
        assert f1 == f2
def test_embedded_document_equality_with_lazy_ref(self):
class Job(EmbeddedDocument):
boss = LazyReferenceField("Person")
boss_dbref = LazyReferenceField("Person", dbref=True)
class Person(Document):
job = EmbeddedDocumentField(Job)
Person.drop_collection()
boss = Person()
worker = Person(job=Job(boss=boss, boss_dbref=boss))
boss.save()
worker.save()
worker1 = Person.objects.get(id=worker.id)
# worker1.job should be equal to the job used originally to create the
# document.
assert worker1.job == worker.job
# worker1.job should be equal to a newly created Job EmbeddedDocument
# using either the Boss object or his ID.
assert worker1.job == Job(boss=boss, boss_dbref=boss)
assert worker1.job == Job(boss=boss.id, boss_dbref=boss.id)
# The above equalities should also hold after worker1.job.boss has been
# fetch()ed.
worker1.job.boss.fetch()
assert worker1.job == worker.job
assert worker1.job == Job(boss=boss, boss_dbref=boss)
assert worker1.job == Job(boss=boss.id, boss_dbref=boss.id)
    def test_dbref_equality(self):
        class Test2(Document):
            name = StringField()
@@ -3584,6 +3617,51 @@ class TestInstance(MongoDBTestCase):
        assert b._instance == a
        assert idx == 2
def test_updating_listfield_manipulate_list(self):
class Company(Document):
name = StringField()
employees = ListField(field=DictField())
Company.drop_collection()
comp = Company(name="BigBank", employees=[{"name": "John"}])
comp.save()
comp.employees.append({"name": "Bill"})
comp.save()
stored_comp = get_as_pymongo(comp)
self.assertEqual(
stored_comp,
{
"_id": comp.id,
"employees": [{"name": "John"}, {"name": "Bill"}],
"name": "BigBank",
},
)
comp = comp.reload()
comp.employees[0]["color"] = "red"
comp.employees[-1]["color"] = "blue"
comp.employees[-1].update({"size": "xl"})
comp.save()
assert len(comp.employees) == 2
assert comp.employees[0] == {"name": "John", "color": "red"}
assert comp.employees[1] == {"name": "Bill", "size": "xl", "color": "blue"}
stored_comp = get_as_pymongo(comp)
self.assertEqual(
stored_comp,
{
"_id": comp.id,
"employees": [
{"name": "John", "color": "red"},
{"size": "xl", "color": "blue", "name": "Bill"},
],
"name": "BigBank",
},
)
    def test_falsey_pk(self):
        """Ensure that we can create and update a document with Falsey PK."""
@@ -3660,13 +3738,13 @@ class TestInstance(MongoDBTestCase):
        value = u"I_should_be_a_dict"
        coll.insert_one({"light_saber": value})

-        with self.assertRaises(InvalidDocumentError) as cm:
+        with pytest.raises(InvalidDocumentError) as exc_info:
            list(Jedi.objects)
-        self.assertEqual(
-            str(cm.exception),
-            "Invalid data to create a `Jedi` instance.\nField 'light_saber' - The source SON object needs to be of type 'dict' but a '%s' was found"
-            % type(value),
-        )
+        assert str(
+            exc_info.value
+        ) == "Invalid data to create a `Jedi` instance.\nField 'light_saber' - The source SON object needs to be of type 'dict' but a '%s' was found" % type(
+            value
+        )


@@ -65,7 +65,7 @@ class ComplexDateTimeFieldTest(MongoDBTestCase):
        for values in itertools.product([2014], mm, dd, hh, ii, ss, microsecond):
            stored = LogEntry(date=datetime.datetime(*values)).to_mongo()["date"]
            assert (
-                re.match("^\d{4},\d{2},\d{2},\d{2},\d{2},\d{2},\d{6}$", stored)
+                re.match(r"^\d{4},\d{2},\d{2},\d{2},\d{2},\d{2},\d{6}$", stored)
                is not None
            )
@@ -74,7 +74,7 @@ class ComplexDateTimeFieldTest(MongoDBTestCase):
            "date_with_dots"
        ]
        assert (
-            re.match("^\d{4}.\d{2}.\d{2}.\d{2}.\d{2}.\d{2}.\d{6}$", stored) is not None
+            re.match(r"^\d{4}.\d{2}.\d{2}.\d{2}.\d{2}.\d{2}.\d{6}$", stored) is not None
        )

    def test_complexdatetime_usage(self):


@@ -1,8 +1,10 @@
# -*- coding: utf-8 -*-
import pytest
+from bson import InvalidDocument

from mongoengine import *
from mongoengine.base import BaseDict
+from mongoengine.mongodb_support import MONGODB_36, get_mongodb_version
from tests.utils import MongoDBTestCase, get_as_pymongo

@@ -18,22 +20,24 @@ class TestDictField(MongoDBTestCase):
        post = BlogPost(info=info).save()
        assert get_as_pymongo(post) == {"_id": post.id, "info": info}

-    def test_general_things(self):
-        """Ensure that dict types work as expected."""
+    def test_validate_invalid_type(self):
+        class BlogPost(Document):
+            info = DictField()
+
+        BlogPost.drop_collection()
+
+        invalid_infos = ["my post", ["test", "test"], {1: "test"}]
+        for invalid_info in invalid_infos:
+            with pytest.raises(ValidationError):
+                BlogPost(info=invalid_info).validate()
+
+    def test_keys_with_dots_or_dollars(self):
        class BlogPost(Document):
            info = DictField()

        BlogPost.drop_collection()
        post = BlogPost()

-        post.info = "my post"
-        with pytest.raises(ValidationError):
-            post.validate()
-
-        post.info = ["test", "test"]
-        with pytest.raises(ValidationError):
-            post.validate()
-
        post.info = {"$title": "test"}
        with pytest.raises(ValidationError):
@@ -43,19 +47,38 @@ class TestDictField(MongoDBTestCase):
        with pytest.raises(ValidationError):
            post.validate()

-        post.info = {"the.title": "test"}
+        post.info = {"$title.test": "test"}
        with pytest.raises(ValidationError):
            post.validate()

        post.info = {"nested": {"the.title": "test"}}
-        with pytest.raises(ValidationError):
+        if get_mongodb_version() < MONGODB_36:
+            # MongoDB < 3.6 rejects dots
+            # To avoid checking the mongodb version from the DictField class
+            # we rely on MongoDB to reject the data during the save
            post.validate()
+            with pytest.raises(InvalidDocument):
+                post.save()
+        else:
+            post.validate()

-        post.info = {1: "test"}
-        with pytest.raises(ValidationError):
+        post.info = {"dollar_and_dot": {"te$st.test": "test"}}
+        if get_mongodb_version() < MONGODB_36:
            post.validate()
+            with pytest.raises(InvalidDocument):
+                post.save()
+        else:
+            post.validate()

-        post.info = {"title": "test"}
+    def test_general_things(self):
+        """Ensure that dict types work as expected."""
+
+        class BlogPost(Document):
+            info = DictField()
+
+        BlogPost.drop_collection()
+
+        post = BlogPost(info={"title": "test"})
        post.save()

        post = BlogPost()


@@ -151,7 +151,7 @@ class TestFileField(MongoDBTestCase):
        result = StreamFile.objects.first()
        assert streamfile == result
        assert result.the_file.read() == text + more_text
-        # self.assertEqual(result.the_file.content_type, content_type)
+        # assert result.the_file.content_type == content_type
        result.the_file.seek(0)
        assert result.the_file.tell() == 0
        assert result.the_file.read(len(text)) == text


@@ -14,7 +14,7 @@ import six
from six import iteritems

from mongoengine import *
-from mongoengine.connection import get_connection, get_db
+from mongoengine.connection import get_db
from mongoengine.context_managers import query_counter, switch_db
from mongoengine.errors import InvalidQueryError
from mongoengine.mongodb_support import MONGODB_36, get_mongodb_version
@@ -4476,6 +4476,74 @@ class TestQueryset(unittest.TestCase):
        expected = "[u'A1', u'A2']"
        assert expected == "%s" % sorted(names)
def test_fields(self):
class Bar(EmbeddedDocument):
v = StringField()
z = StringField()
class Foo(Document):
x = StringField()
y = IntField()
items = EmbeddedDocumentListField(Bar)
Foo.drop_collection()
Foo(x="foo1", y=1).save()
Foo(x="foo2", y=2, items=[]).save()
Foo(x="foo3", y=3, items=[Bar(z="a", v="V")]).save()
Foo(
x="foo4",
y=4,
items=[
Bar(z="a", v="V"),
Bar(z="b", v="W"),
Bar(z="b", v="X"),
Bar(z="c", v="V"),
],
).save()
Foo(
x="foo5",
y=5,
items=[
Bar(z="b", v="X"),
Bar(z="c", v="V"),
Bar(z="d", v="V"),
Bar(z="e", v="V"),
],
).save()
foos_with_x = list(Foo.objects.order_by("y").fields(x=1))
assert all(o.x is not None for o in foos_with_x)
foos_without_y = list(Foo.objects.order_by("y").fields(y=0))
assert all(o.y is None for o in foos_without_y)
foos_with_sliced_items = list(Foo.objects.order_by("y").fields(slice__items=1))
assert foos_with_sliced_items[0].items == []
assert foos_with_sliced_items[1].items == []
assert len(foos_with_sliced_items[2].items) == 1
assert foos_with_sliced_items[2].items[0].z == "a"
assert len(foos_with_sliced_items[3].items) == 1
assert foos_with_sliced_items[3].items[0].z == "a"
assert len(foos_with_sliced_items[4].items) == 1
assert foos_with_sliced_items[4].items[0].z == "b"
foos_with_elem_match_items = list(
Foo.objects.order_by("y").fields(elemMatch__items={"z": "b"})
)
assert foos_with_elem_match_items[0].items == []
assert foos_with_elem_match_items[1].items == []
assert foos_with_elem_match_items[2].items == []
assert len(foos_with_elem_match_items[3].items) == 1
assert foos_with_elem_match_items[3].items[0].z == "b"
assert foos_with_elem_match_items[3].items[0].v == "W"
assert len(foos_with_elem_match_items[4].items) == 1
assert foos_with_elem_match_items[4].items[0].z == "b"
    def test_elem_match(self):
        class Foo(EmbeddedDocument):
            shape = StringField()
@@ -4658,21 +4726,6 @@ class TestQueryset(unittest.TestCase):
        )
        assert_read_pref(bars, ReadPreference.SECONDARY_PREFERRED)
def test_read_preference_aggregation_framework(self):
class Bar(Document):
txt = StringField()
meta = {"indexes": ["txt"]}
# Aggregates with read_preference
bars = Bar.objects.read_preference(
ReadPreference.SECONDARY_PREFERRED
).aggregate()
assert (
bars._CommandCursor__collection.read_preference
== ReadPreference.SECONDARY_PREFERRED
)
    def test_json_simple(self):
        class Embedded(EmbeddedDocument):
            string = StringField()
@@ -5399,225 +5452,6 @@ class TestQueryset(unittest.TestCase):
        assert Person.objects.first().name == "A"
        assert Person.objects._has_data(), "Cursor has data and returned False"
def test_queryset_aggregation_framework(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
data = Person.objects(age__lte=22).aggregate(
{"$project": {"name": {"$toUpper": "$name"}}}
)
assert list(data) == [
{"_id": p1.pk, "name": "ISABELLA LUANNA"},
{"_id": p2.pk, "name": "WILSON JUNIOR"},
]
data = (
Person.objects(age__lte=22)
.order_by("-name")
.aggregate({"$project": {"name": {"$toUpper": "$name"}}})
)
assert list(data) == [
{"_id": p2.pk, "name": "WILSON JUNIOR"},
{"_id": p1.pk, "name": "ISABELLA LUANNA"},
]
data = (
Person.objects(age__gte=17, age__lte=40)
.order_by("-age")
.aggregate(
{"$group": {"_id": None, "total": {"$sum": 1}, "avg": {"$avg": "$age"}}}
)
)
assert list(data) == [{"_id": None, "avg": 29, "total": 2}]
data = Person.objects().aggregate({"$match": {"name": "Isabella Luanna"}})
assert list(data) == [{u"_id": p1.pk, u"age": 16, u"name": u"Isabella Luanna"}]
def test_queryset_aggregation_with_skip(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
data = Person.objects.skip(1).aggregate(
{"$project": {"name": {"$toUpper": "$name"}}}
)
assert list(data) == [
{"_id": p2.pk, "name": "WILSON JUNIOR"},
{"_id": p3.pk, "name": "SANDRA MARA"},
]
def test_queryset_aggregation_with_limit(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
data = Person.objects.limit(1).aggregate(
{"$project": {"name": {"$toUpper": "$name"}}}
)
assert list(data) == [{"_id": p1.pk, "name": "ISABELLA LUANNA"}]
def test_queryset_aggregation_with_sort(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
data = Person.objects.order_by("name").aggregate(
{"$project": {"name": {"$toUpper": "$name"}}}
)
assert list(data) == [
{"_id": p1.pk, "name": "ISABELLA LUANNA"},
{"_id": p3.pk, "name": "SANDRA MARA"},
{"_id": p2.pk, "name": "WILSON JUNIOR"},
]
def test_queryset_aggregation_with_skip_with_limit(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
data = list(
Person.objects.skip(1)
.limit(1)
.aggregate({"$project": {"name": {"$toUpper": "$name"}}})
)
assert list(data) == [{"_id": p2.pk, "name": "WILSON JUNIOR"}]
# Make sure limit/skip chaining order has no impact
data2 = (
Person.objects.limit(1)
.skip(1)
.aggregate({"$project": {"name": {"$toUpper": "$name"}}})
)
assert data == list(data2)
def test_queryset_aggregation_with_sort_with_limit(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
data = (
Person.objects.order_by("name")
.limit(2)
.aggregate({"$project": {"name": {"$toUpper": "$name"}}})
)
assert list(data) == [
{"_id": p1.pk, "name": "ISABELLA LUANNA"},
{"_id": p3.pk, "name": "SANDRA MARA"},
]
# Verify adding limit/skip steps works as expected
data = (
Person.objects.order_by("name")
.limit(2)
.aggregate({"$project": {"name": {"$toUpper": "$name"}}}, {"$limit": 1})
)
assert list(data) == [{"_id": p1.pk, "name": "ISABELLA LUANNA"}]
data = (
Person.objects.order_by("name")
.limit(2)
.aggregate(
{"$project": {"name": {"$toUpper": "$name"}}},
{"$skip": 1},
{"$limit": 1},
)
)
assert list(data) == [{"_id": p3.pk, "name": "SANDRA MARA"}]
def test_queryset_aggregation_with_sort_with_skip(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
data = (
Person.objects.order_by("name")
.skip(2)
.aggregate({"$project": {"name": {"$toUpper": "$name"}}})
)
assert list(data) == [{"_id": p2.pk, "name": "WILSON JUNIOR"}]
def test_queryset_aggregation_with_sort_with_skip_with_limit(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
data = (
Person.objects.order_by("name")
.skip(1)
.limit(1)
.aggregate({"$project": {"name": {"$toUpper": "$name"}}})
)
assert list(data) == [{"_id": p3.pk, "name": "SANDRA MARA"}]
def test_delete_count(self):
[self.Person(name="User {0}".format(i), age=i * 10).save() for i in range(1, 4)]
assert (
@@ -0,0 +1,255 @@
# -*- coding: utf-8 -*-
import unittest
import warnings
from pymongo.read_preferences import ReadPreference
from mongoengine import *
from tests.utils import MongoDBTestCase
class TestQuerysetAggregate(MongoDBTestCase):
def test_read_preference_aggregation_framework(self):
class Bar(Document):
txt = StringField()
meta = {"indexes": ["txt"]}
# Aggregate with a read_preference applied
pipeline = []
bars = Bar.objects.read_preference(
ReadPreference.SECONDARY_PREFERRED
).aggregate(pipeline)
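# CommandCursor keeps its collection in a private attribute, so the
# name-mangled _CommandCursor__collection is peeked at below to verify
# which read preference was actually applied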
assert (
bars._CommandCursor__collection.read_preference
== ReadPreference.SECONDARY_PREFERRED
)
def test_queryset_aggregation_framework(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
pipeline = [{"$project": {"name": {"$toUpper": "$name"}}}]
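# The queryset filter (age__lte=22) is applied as a $match ahead of the
# user-supplied pipeline, so only p1 and p2 reach the $project stage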
data = Person.objects(age__lte=22).aggregate(pipeline)
assert list(data) == [
{"_id": p1.pk, "name": "ISABELLA LUANNA"},
{"_id": p2.pk, "name": "WILSON JUNIOR"},
]
pipeline = [{"$project": {"name": {"$toUpper": "$name"}}}]
data = Person.objects(age__lte=22).order_by("-name").aggregate(pipeline)
assert list(data) == [
{"_id": p2.pk, "name": "WILSON JUNIOR"},
{"_id": p1.pk, "name": "ISABELLA LUANNA"},
]
pipeline = [
{"$group": {"_id": None, "total": {"$sum": 1}, "avg": {"$avg": "$age"}}}
]
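# {"$sum": 1} counts the matched documents; $avg averages their ages: (21 + 37) / 2 == 29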
data = (
Person.objects(age__gte=17, age__lte=40)
.order_by("-age")
.aggregate(pipeline)
)
assert list(data) == [{"_id": None, "avg": 29, "total": 2}]
pipeline = [{"$match": {"name": "Isabella Luanna"}}]
data = Person.objects().aggregate(pipeline)
assert list(data) == [{u"_id": p1.pk, u"age": 16, u"name": u"Isabella Luanna"}]
def test_queryset_aggregation_with_skip(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
pipeline = [{"$project": {"name": {"$toUpper": "$name"}}}]
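# The .skip(1) modifier is applied ahead of the user pipeline, so p1 drops out before $project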
data = Person.objects.skip(1).aggregate(pipeline)
assert list(data) == [
{"_id": p2.pk, "name": "WILSON JUNIOR"},
{"_id": p3.pk, "name": "SANDRA MARA"},
]
def test_queryset_aggregation_with_limit(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
pipeline = [{"$project": {"name": {"$toUpper": "$name"}}}]
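# .limit(1) is likewise applied first, leaving only the first inserted document for $project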
data = Person.objects.limit(1).aggregate(pipeline)
assert list(data) == [{"_id": p1.pk, "name": "ISABELLA LUANNA"}]
def test_queryset_aggregation_with_sort(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
pipeline = [{"$project": {"name": {"$toUpper": "$name"}}}]
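# .order_by("name") is applied as a sort before the projection, hence the alphabetical output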
data = Person.objects.order_by("name").aggregate(pipeline)
assert list(data) == [
{"_id": p1.pk, "name": "ISABELLA LUANNA"},
{"_id": p3.pk, "name": "SANDRA MARA"},
{"_id": p2.pk, "name": "WILSON JUNIOR"},
]
def test_queryset_aggregation_with_skip_with_limit(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
pipeline = [{"$project": {"name": {"$toUpper": "$name"}}}]
data = list(Person.objects.skip(1).limit(1).aggregate(pipeline))
assert list(data) == [{"_id": p2.pk, "name": "WILSON JUNIOR"}]
# Make sure limit/skip chaining order has no impact
data2 = Person.objects.limit(1).skip(1).aggregate(pipeline)
assert data == list(data2)
def test_queryset_aggregation_with_sort_with_limit(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
pipeline = [{"$project": {"name": {"$toUpper": "$name"}}}]
data = Person.objects.order_by("name").limit(2).aggregate(pipeline)
assert list(data) == [
{"_id": p1.pk, "name": "ISABELLA LUANNA"},
{"_id": p3.pk, "name": "SANDRA MARA"},
]
# Verify adding limit/skip steps works as expected
pipeline = [{"$project": {"name": {"$toUpper": "$name"}}}, {"$limit": 1}]
data = Person.objects.order_by("name").limit(2).aggregate(pipeline)
assert list(data) == [{"_id": p1.pk, "name": "ISABELLA LUANNA"}]
pipeline = [
{"$project": {"name": {"$toUpper": "$name"}}},
{"$skip": 1},
{"$limit": 1},
]
data = Person.objects.order_by("name").limit(2).aggregate(pipeline)
assert list(data) == [{"_id": p3.pk, "name": "SANDRA MARA"}]
def test_queryset_aggregation_with_sort_with_skip(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
pipeline = [{"$project": {"name": {"$toUpper": "$name"}}}]
data = Person.objects.order_by("name").skip(2).aggregate(pipeline)
assert list(data) == [{"_id": p2.pk, "name": "WILSON JUNIOR"}]
def test_queryset_aggregation_with_sort_with_skip_with_limit(self):
class Person(Document):
name = StringField()
age = IntField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna", age=16)
p2 = Person(name="Wilson Junior", age=21)
p3 = Person(name="Sandra Mara", age=37)
Person.objects.insert([p1, p2, p3])
pipeline = [{"$project": {"name": {"$toUpper": "$name"}}}]
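# Semantically: sort by name ([Isabella, Sandra, Wilson]), skip 1, take 1 -> Sandra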
data = Person.objects.order_by("name").skip(1).limit(1).aggregate(pipeline)
assert list(data) == [{"_id": p3.pk, "name": "SANDRA MARA"}]
def test_queryset_aggregation_deprecated_interface(self):
class Person(Document):
name = StringField()
Person.drop_collection()
p1 = Person(name="Isabella Luanna")
p2 = Person(name="Wilson Junior")
p3 = Person(name="Sandra Mara")
Person.objects.insert([p1, p2, p3])
pipeline = [{"$project": {"name": {"$toUpper": "$name"}}}]
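# aggregate(*pipeline) unpacks the stages into positional arguments - the old
# calling convention that now triggers a DeprecationWarning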
# Make sure a warning is emitted
with warnings.catch_warnings():
warnings.simplefilter("error", DeprecationWarning)
with self.assertRaises(DeprecationWarning):
Person.objects.order_by("name").limit(2).aggregate(*pipeline)
# Make sure old interface works as expected with a 1-step pipeline
data = Person.objects.order_by("name").limit(2).aggregate(*pipeline)
assert list(data) == [
{"_id": p1.pk, "name": "ISABELLA LUANNA"},
{"_id": p3.pk, "name": "SANDRA MARA"},
]
# Make sure old interface works as expected with a 2-step pipeline
pipeline = [{"$project": {"name": {"$toUpper": "$name"}}}, {"$limit": 1}]
data = Person.objects.order_by("name").limit(2).aggregate(*pipeline)
assert list(data) == [{"_id": p1.pk, "name": "ISABELLA LUANNA"}]
if __name__ == "__main__":
unittest.main()
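For reference, a minimal sketch of the interface change these tests exercise, reusing the Person model defined above (the pipeline contents are illustrative only):
# New interface: the whole pipeline is passed as a single list
Person.objects.aggregate([{"$match": {"name": "Sandra Mara"}}, {"$limit": 1}])
# Deprecated interface: each stage as a separate positional argument,
# which now emits a DeprecationWarning
Person.objects.aggregate({"$match": {"name": "Sandra Mara"}}, {"$limit": 1})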
@@ -24,6 +24,12 @@ class TestTransform(unittest.TestCase):
}
assert transform.query(friend__age__gte=30) == {"friend.age": {"$gte": 30}}
assert transform.query(name__exists=True) == {"name": {"$exists": True}}
assert transform.query(name=["Mark"], __raw__={"name": {"$in": "Tom"}}) == {
"$and": [{"name": ["Mark"]}, {"name": {"$in": "Tom"}}]
}
assert transform.query(name__in=["Tom"], __raw__={"name": "Mark"}) == {
"$and": [{"name": {"$in": ["Tom"]}}, {"name": "Mark"}]
}
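# i.e. __raw__ conditions are merged with regular keyword filters under $and
# rather than one silently overwriting the other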
def test_transform_update(self):
class LisDoc(Document):
@@ -407,6 +407,17 @@ class TestQ(unittest.TestCase):
def test_combine_or_both_empty(self):
assert Q() | Q() == Q()
def test_q_bool(self):
assert Q(name="John")
assert not Q()
def test_combine_bool(self):
assert not Q() & Q()
assert Q() & Q(name="John")
assert Q(name="John") & Q()
assert Q() | Q(name="John")
assert Q(name="John") | Q()
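# i.e. an empty Q is falsy and acts as the identity element for & and |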
if __name__ == "__main__":
unittest.main()
@@ -1,5 +1,5 @@
[tox]
envlist = {py27,py35,pypy,pypy3}-{mg34,mg36}
envlist = {py27,py35,pypy,pypy3}-{mg34,mg36,mg39,mg310}
[testenv]
commands =
@@ -7,6 +7,7 @@ commands =
deps =
mg34: pymongo>=3.4,<3.5
mg36: pymongo>=3.6,<3.7
mg39: pymongo>=3.9,<4.0
mg39: pymongo>=3.9,<3.10
mg310: pymongo>=3.10,<3.11
setenv =
PYTHON_EGG_CACHE = {envdir}/python-eggs
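Usage note: with the expanded envlist, any single combination can be run locally, e.g. "tox -e py27-mg310" to test Python 2.7 against pymongo 3.10 (the environment name combines one factor from each group in envlist; tox must be installed).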