Compare commits

62 Commits

SHA1: 74343841e4, 3b3738b36b, b15c3f6a3f, 2459f9b0aa, 6ff1bd9b3c, 1bc2d2ec37,
d7fd6a4628, 9236f365fa, 90d22c2a28, c9f6e6b62a, 260d9377f5, 22d1ce6319,
6997e02476, 155d79ff4d, 452cd125fa, e62c35b040, d5ec3c6a31, ad983dc279,
bb15bf8d13, 94adc207ad, 376d1c97ab, 4fe87b40da, b10d76cf4b, 3bdc9a2f09,
9d52e18659, 653c4259ee, 9f5ab8149f, 66c6d14f7a, 2c0fc142a3, 0da2dfd191,
787fc1cd8b, c31488add9, 31ec7907b5, 12f3f8c694, 79098e997e, dc1849bad5,
e2d826c412, e6d796832e, 6f0a6df4f6, 7a877a00d5, e8604d100e, 1647441ce8,
9f8d6b3a00, 4b2ad25405, 3ce163b1a0, 7c1ee28f13, 2645e43da1, 59bfe551a3,
e2c78047b1, 6a4351e44f, adb60ef1ac, 3090adac04, b9253d86cc, ab4d4e6230,
7cd38c56c6, 864053615b, db2366f112, 4defc82192, 5949970a95, 0ea4abda81,
5c6035d636, a2183e3dcc
```diff
@@ -1,5 +1,6 @@
 # http://travis-ci.org/#!/MongoEngine/mongoengine
 language: python
+services: mongodb
 python:
 - 2.5
 - 2.6
@@ -25,4 +26,4 @@ notifications:
 branches:
   only:
     - master
-    - 0.7
+    - 0.8
```
AUTHORS (7 changes)

```diff
@@ -106,7 +106,7 @@ that much better:
 * Adam Reeve
 * Anthony Nemitz
 * deignacio
-* shaunduncan
+* Shaun Duncan
 * Meir Kriheli
 * Andrey Fedoseev
 * aparajita
@@ -123,3 +123,8 @@ that much better:
 * psychogenic
 * Stefan Wójcik
 * dimonb
+* Garry Polley
+* Adrian Scott
+* Peter Teichman
+* Jakub Kot
+* Jorge Bastida
```
CONTRIBUTING.rst (new file, 61 lines)

```diff
@@ -0,0 +1,61 @@
+Contributing to MongoEngine
+===========================
+
+MongoEngine has a large `community
+<https://raw.github.com/MongoEngine/mongoengine/master/AUTHORS>`_ and
+contributions are always encouraged. Contributions can be as simple as
+minor tweaks to the documentation. Please read these guidelines before
+sending a pull request.
+
+Bugfixes and New Features
+-------------------------
+
+Before starting to write code, look for existing `tickets
+<https://github.com/MongoEngine/mongoengine/issues?state=open>`_ or `create one
+<https://github.com/MongoEngine/mongoengine/issues>`_ for your specific
+issue or feature request. That way you avoid working on something
+that might not be of interest or that has already been addressed. If in doubt
+post to the `user group <http://groups.google.com/group/mongoengine-users>`
+
+Supported Interpreters
+----------------------
+
+PyMongo supports CPython 2.5 and newer. Language
+features not supported by all interpreters can not be used.
+Please also ensure that your code is properly converted by
+`2to3 <http://docs.python.org/library/2to3.html>`_ for Python 3 support.
+
+Style Guide
+-----------
+
+MongoEngine aims to follow `PEP8 <http://www.python.org/dev/peps/pep-0008/>`_
+including 4 space indents and 79 character line limits.
+
+Testing
+-------
+
+All tests are run on `Travis <http://travis-ci.org/MongoEngine/mongoengine>`_
+and any pull requests are automatically tested by Travis. Any pull requests
+without tests will take longer to be integrated and might be refused.
+
+General Guidelines
+------------------
+
+- Avoid backward breaking changes if at all possible.
+- Write inline documentation for new classes and methods.
+- Write tests and make sure they pass (make sure you have a mongod
+  running on the default port, then execute ``python setup.py test``
+  from the cmd line to run the test suite).
+- Add yourself to AUTHORS.rst :)
+
+Documentation
+-------------
+
+To contribute to the `API documentation
+<http://docs.mongoengine.org/en/latest/apireference.html>`_
+just make your changes to the inline documentation of the appropriate
+`source code <https://github.com/MongoEngine/mongoengine>`_ or `rst file
+<https://github.com/MongoEngine/mongoengine/tree/master/docs>`_ in a
+branch and submit a `pull request <https://help.github.com/articles/using-pull-requests>`_.
+You might also use the github `Edit <https://github.com/blog/844-forking-with-the-edit-button>`_
+button.
```
```diff
@@ -14,7 +14,7 @@ About
 MongoEngine is a Python Object-Document Mapper for working with MongoDB.
 Documentation available at http://mongoengine-odm.rtfd.org - there is currently
 a `tutorial <http://readthedocs.org/docs/mongoengine-odm/en/latest/tutorial.html>`_, a `user guide
-<http://readthedocs.org/docs/mongoengine-odm/en/latest/userguide.html>`_ and an `API reference
+<https://mongoengine-odm.readthedocs.org/en/latest/guide/index.html>`_ and an `API reference
 <http://readthedocs.org/docs/mongoengine-odm/en/latest/apireference.html>`_.
 
 Installation
@@ -92,6 +92,4 @@ Community
 
 Contributing
 ============
-The source is available on `GitHub <http://github.com/MongoEngine/mongoengine>`_ - to
-contribute to the project, fork it on GitHub and send a pull request, all
-contributions and suggestions are welcome!
+We welcome contributions! see the`Contribution guidelines <https://github.com/MongoEngine/mongoengine/blob/master/CONTRIBUTING.rst>`_
```
```diff
@@ -2,6 +2,45 @@
 Changelog
 =========
 
+Changes in 0.7.9
+================
+- Better fix handling for old style _types
+- Embedded SequenceFields follow collection naming convention
+
+Changes in 0.7.8
+================
+- Fix sequence fields in embedded documents (MongoEngine/mongoengine#166)
+- Fix query chaining with .order_by() (MongoEngine/mongoengine#176)
+- Added optional encoding and collection config for Django sessions (MongoEngine/mongoengine#180, MongoEngine/mongoengine#181, MongoEngine/mongoengine#183)
+- Fixed EmailField so can add extra validation (MongoEngine/mongoengine#173, MongoEngine/mongoengine#174, MongoEngine/mongoengine#187)
+- Fixed bulk inserts can now handle custom pk's (MongoEngine/mongoengine#192)
+- Added as_pymongo method to return raw or cast results from pymongo (MongoEngine/mongoengine#193)
+
+Changes in 0.7.7
+================
+- Fix handling for old style _types
+
+Changes in 0.7.6
+================
+- Unicode fix for repr (MongoEngine/mongoengine#133)
+- Allow updates with match operators (MongoEngine/mongoengine#144)
+- Updated URLField - now can have a override the regex (MongoEngine/mongoengine#136)
+- Allow Django AuthenticationBackends to work with Django user (hmarr/mongoengine#573)
+- Fixed reload issue with ReferenceField where dbref=False (MongoEngine/mongoengine#138)
+
+Changes in 0.7.5
+================
+- ReferenceFields with dbref=False use ObjectId instead of strings (MongoEngine/mongoengine#134)
+  See ticket for upgrade notes (https://github.com/MongoEngine/mongoengine/issues/134)
+
+Changes in 0.7.4
+================
+- Fixed index inheritance issues - firmed up testcases (MongoEngine/mongoengine#123) (MongoEngine/mongoengine#125)
+
+Changes in 0.7.3
+================
+- Reverted EmbeddedDocuments meta handling - now can turn off inheritance (MongoEngine/mongoengine#119)
+
 Changes in 0.7.2
 ================
 - Update index spec generation so its not destructive (MongoEngine/mongoengine#113)
```
```diff
@@ -344,6 +344,10 @@ Its value can take any of the following constants:
 their :file:`models.py` in the :const:`INSTALLED_APPS` tuple.
 
 
+.. warning::
+    Signals are not triggered when doing cascading updates / deletes - if this
+    is required you must manually handle the update / delete.
+
 Generic reference fields
 ''''''''''''''''''''''''
 A second kind of reference field also exists,
```
```diff
@@ -465,7 +469,7 @@ If a dictionary is passed then the following options are available:
     Whether the index should be sparse.
 
 :attr:`unique` (Default: False)
-    Whether the index should be sparse.
+    Whether the index should be unique.
 
 .. note ::
 
```
```diff
@@ -50,4 +50,11 @@ Example usage::
     signals.post_save.connect(Author.post_save, sender=Author)
 
 
+ReferenceFields and signals
+---------------------------
+
+Currently `reverse_delete_rules` do not trigger signals on the other part of
+the relationship. If this is required you must manually handled the
+reverse deletion.
+
 .. _blinker: http://pypi.python.org/pypi/blinker
```
```diff
@@ -34,10 +34,10 @@ To get help with using MongoEngine, use the `MongoEngine Users mailing list
 Contributing
 ------------
 
-The source is available on `GitHub <http://github.com/hmarr/mongoengine>`_ and
+The source is available on `GitHub <http://github.com/MongoEngine/mongoengine>`_ and
 contributions are always encouraged. Contributions can be as simple as
 minor tweaks to this documentation. To contribute, fork the project on
-`GitHub <http://github.com/hmarr/mongoengine>`_ and send a
+`GitHub <http://github.com/MongoEngine/mongoengine>`_ and send a
 pull request.
 
 Also, you can join the developers' `mailing list
```
```diff
@@ -61,6 +61,13 @@ stored in rather than as string representations. Your code may need to be
 updated to handle native types rather than strings keys for the results of
 item frequency queries.
 
+BinaryFields
+------------
+
+Binary fields have been updated so that they are native binary types. If you
+previously were doing `str` comparisons with binary field values you will have
+to update and wrap the value in a `str`.
+
 0.5 to 0.6
 ==========
 
```
```diff
@@ -12,7 +12,7 @@ from signals import *
 __all__ = (document.__all__ + fields.__all__ + connection.__all__ +
            queryset.__all__ + signals.__all__)
 
-VERSION = (0, 7, 2)
+VERSION = (0, 7, 9)
 
 
 def get_version():
```
```diff
@@ -53,7 +53,7 @@ class ValidationError(AssertionError):
         self.message = message
 
     def __str__(self):
-        return self.message
+        return txt_type(self.message)
 
     def __repr__(self):
         return '%s(%s,)' % (self.__class__.__name__, self.message)
```
```diff
@@ -121,10 +121,11 @@ class ValidationError(AssertionError):
 def get_document(name):
     doc = _document_registry.get(name, None)
     if not doc:
-        # Possible old style names
-        end = ".%s" % name
+        # Possible old style name
+        single_end = name.split('.')[-1]
+        compound_end = '.%s' % single_end
         possible_match = [k for k in _document_registry.keys()
-                          if k.endswith(end)]
+                          if k.endswith(compound_end) or k == single_end]
         if len(possible_match) == 1:
             doc = _document_registry.get(possible_match.pop(), None)
         if not doc:
```
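The revised `get_document` lookup above can be sketched in isolation: a bare name like `Circle` should now match a registered compound key like `Shape.Circle`, and a compound query should match a bare registered key. The registry contents below are hypothetical illustration data, not MongoEngine's real registry.

```python
# Hypothetical registry: maps class names to document classes (strings here).
_document_registry = {'Shape.Circle': 'CircleDoc', 'Square': 'SquareDoc'}

def get_document(name):
    doc = _document_registry.get(name, None)
    if not doc:
        # Possible old style name: fall back to matching on the last
        # dotted component, in either direction.
        single_end = name.split('.')[-1]
        compound_end = '.%s' % single_end
        possible_match = [k for k in _document_registry.keys()
                          if k.endswith(compound_end) or k == single_end]
        if len(possible_match) == 1:
            doc = _document_registry.get(possible_match.pop(), None)
    return doc
```

The extra `or k == single_end` clause is what lets a compound query name resolve against a class registered under its bare name.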
```diff
@@ -235,7 +236,8 @@ class BaseField(object):
         pass
 
     def _validate(self, value):
-        from mongoengine import Document, EmbeddedDocument
+        Document = _import_class('Document')
+        EmbeddedDocument = _import_class('EmbeddedDocument')
         # check choices
         if self.choices:
             is_cls = isinstance(value, (Document, EmbeddedDocument))
```
```diff
@@ -283,7 +285,9 @@ class ComplexBaseField(BaseField):
         if instance is None:
             # Document class being used rather than a document object
             return self
-        from fields import GenericReferenceField, ReferenceField
+
+        ReferenceField = _import_class('ReferenceField')
+        GenericReferenceField = _import_class('GenericReferenceField')
         dereference = self.field is None or isinstance(self.field,
             (GenericReferenceField, ReferenceField))
         if not self._dereference and instance._initialised and dereference:
```
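The `if instance is None: return self` guard above is the standard descriptor idiom: accessed on the class, a field returns the field object itself; accessed on an instance, it returns the stored value. A minimal sketch, with illustrative names (`Field`, `Doc`, `_data` are stand-ins, not MongoEngine's real classes):

```python
class Field(object):
    def __get__(self, instance, owner):
        if instance is None:
            # Document class being used rather than a document object
            return self
        return instance._data.get('value')

class Doc(object):
    value = Field()

    def __init__(self):
        self._data = {'value': 42}

d = Doc()
```

`Doc.value` yields the `Field` descriptor (useful for building queries against the class), while `d.value` goes through `__get__` and yields the stored data.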
```diff
@@ -310,6 +314,7 @@ class ComplexBaseField(BaseField):
             )
             value._dereferenced = True
             instance._data[self.name] = value
+
         return value
 
     def __set__(self, instance, value):
```
```diff
@@ -321,7 +326,7 @@ class ComplexBaseField(BaseField):
     def to_python(self, value):
         """Convert a MongoDB-compatible type to a Python type.
         """
-        from mongoengine import Document
+        Document = _import_class('Document')
 
         if isinstance(value, basestring):
             return value
```
```diff
@@ -363,7 +368,7 @@ class ComplexBaseField(BaseField):
     def to_mongo(self, value):
         """Convert a Python type to a MongoDB-compatible type.
         """
-        from mongoengine import Document
+        Document = _import_class("Document")
 
         if isinstance(value, basestring):
             return value
```
```diff
@@ -399,7 +404,7 @@ class ComplexBaseField(BaseField):
                     meta.get('allow_inheritance', ALLOW_INHERITANCE)
                     == False)
                 if allow_inheritance and not self.field:
-                    from fields import GenericReferenceField
+                    GenericReferenceField = _import_class("GenericReferenceField")
                     value_dict[k] = GenericReferenceField().to_mongo(v)
                 else:
                     collection = v._get_collection_name()
```
```diff
@@ -460,7 +465,7 @@ class ComplexBaseField(BaseField):
     @property
     def _dereference(self,):
         if not self.__dereference:
-            from dereference import DeReference
+            DeReference = _import_class("DeReference")
             self.__dereference = DeReference()  # Cached
         return self.__dereference
 
```
```diff
@@ -508,6 +513,10 @@ class DocumentMetaclass(type):
 
         attrs['_is_document'] = attrs.get('_is_document', False)
 
+        # EmbeddedDocuments could have meta data for inheritance
+        if 'meta' in attrs:
+            attrs['_meta'] = attrs.pop('meta')
+
         # Handle document Fields
 
         # Merge all fields from subclasses
```
```diff
@@ -571,6 +580,24 @@ class DocumentMetaclass(type):
                 superclasses[base._class_name] = base
                 superclasses.update(base._superclasses)
 
+            if hasattr(base, '_meta'):
+                # Warn if allow_inheritance isn't set and prevent
+                # inheritance of classes where inheritance is set to False
+                allow_inheritance = base._meta.get('allow_inheritance',
+                                                   ALLOW_INHERITANCE)
+                if (not getattr(base, '_is_base_cls', True)
+                    and allow_inheritance is None):
+                    warnings.warn(
+                        "%s uses inheritance, the default for "
+                        "allow_inheritance is changing to off by default. "
+                        "Please add it to the document meta." % name,
+                        FutureWarning
+                    )
+                elif (allow_inheritance == False and
+                      not base._meta.get('abstract')):
+                    raise ValueError('Document %s may not be subclassed' %
+                                     base.__name__)
+
         attrs['_class_name'] = '.'.join(reversed(class_name))
         attrs['_superclasses'] = superclasses
 
```
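The check added above follows a common deprecation pattern: warn with `FutureWarning` while the old default still applies, and raise once the option is explicitly disallowed. A standalone sketch under assumed, simplified semantics (the real check inspects base classes, and `ALLOW_INHERITANCE` here is just a placeholder for the historical default):

```python
import warnings

ALLOW_INHERITANCE = None  # placeholder for the historical default

def check_inheritance(name, meta):
    allow_inheritance = meta.get('allow_inheritance', ALLOW_INHERITANCE)
    if allow_inheritance is None:
        # Old default still in effect: warn, do not break existing code.
        warnings.warn(
            "%s uses inheritance, the default for "
            "allow_inheritance is changing to off by default. "
            "Please add it to the document meta." % name,
            FutureWarning)
    elif allow_inheritance is False and not meta.get('abstract'):
        raise ValueError('Document %s may not be subclassed' % name)
```

`FutureWarning` is the right category here because it targets end users of the library, whom the default `warnings` filter does not silence.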
```diff
@@ -609,17 +636,6 @@ class DocumentMetaclass(type):
                           "field name" % field.name)
                     raise InvalidDocumentError(msg)
 
-        # Merge in exceptions with parent hierarchy
-        exceptions_to_merge = (DoesNotExist, MultipleObjectsReturned)
-        module = attrs.get('__module__')
-        for exc in exceptions_to_merge:
-            name = exc.__name__
-            parents = tuple(getattr(base, name) for base in flattened_bases
-                            if hasattr(base, name)) or (exc,)
-            # Create new exception and set to new_class
-            exception = type(name, parents, {'__module__': module})
-            setattr(new_class, name, exception)
-
         # Add class to the _document_registry
         _document_registry[new_class._class_name] = new_class
 
```
```diff
@@ -745,21 +761,6 @@ class TopLevelDocumentMetaclass(DocumentMetaclass):
                 if hasattr(base, 'meta'):
                     meta.merge(base.meta)
                 elif hasattr(base, '_meta'):
-                    # Warn if allow_inheritance isn't set and prevent
-                    # inheritance of classes where inheritance is set to False
-                    allow_inheritance = base._meta.get('allow_inheritance',
-                                                       ALLOW_INHERITANCE)
-                    if not base._is_base_cls and allow_inheritance is None:
-                        warnings.warn(
-                            "%s uses inheritance, the default for "
-                            "allow_inheritance is changing to off by default. "
-                            "Please add it to the document meta." % name,
-                            FutureWarning
-                        )
-                    elif (allow_inheritance == False and
-                          not base._meta.get('abstract')):
-                        raise ValueError('Document %s may not be subclassed' %
-                                         base.__name__)
                     meta.merge(base._meta)
 
         # Set collection in the meta if its callable
```
```diff
@@ -825,6 +826,17 @@ class TopLevelDocumentMetaclass(DocumentMetaclass):
             new_class._fields['id'] = ObjectIdField(db_field='_id')
             new_class.id = new_class._fields['id']
 
+        # Merge in exceptions with parent hierarchy
+        exceptions_to_merge = (DoesNotExist, MultipleObjectsReturned)
+        module = attrs.get('__module__')
+        for exc in exceptions_to_merge:
+            name = exc.__name__
+            parents = tuple(getattr(base, name) for base in flattened_bases
+                            if hasattr(base, name)) or (exc,)
+            # Create new exception and set to new_class
+            exception = type(name, parents, {'__module__': module})
+            setattr(new_class, name, exception)
+
         return new_class
 
     @classmethod
```
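The exception-merge block moved above builds a per-class `DoesNotExist` / `MultipleObjectsReturned` subclass with `type()`, inheriting from the parents' versions so that `except Parent.DoesNotExist` also catches the child's. A standalone sketch of the same technique, with `Document` / `Book` and `attach_exceptions` as illustrative stand-ins:

```python
class DoesNotExist(Exception):
    pass

class MultipleObjectsReturned(Exception):
    pass

def attach_exceptions(new_class, bases):
    for exc in (DoesNotExist, MultipleObjectsReturned):
        name = exc.__name__
        # Inherit from the parents' versions when present, so the child's
        # exception is caught by handlers written against the parent class.
        parents = tuple(getattr(base, name) for base in bases
                        if hasattr(base, name)) or (exc,)
        exception = type(name, parents, {'__module__': new_class.__module__})
        setattr(new_class, name, exception)
    return new_class

class Document(object):
    pass

attach_exceptions(Document, (object,))

class Book(Document):
    pass

attach_exceptions(Book, (Document,))
```

After this, `Book.DoesNotExist` is distinct from `Document.DoesNotExist` yet a subclass of it, which is exactly what lets ORM-style code scope "not found" errors to one document class.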
```diff
@@ -936,7 +948,7 @@ class BaseDocument(object):
 
         field = None
         if not hasattr(self, name) and not name.startswith('_'):
-            from fields import DynamicField
+            DynamicField = _import_class("DynamicField")
             field = DynamicField(db_field=name)
             field.name = name
             self._dynamic_fields[name] = field
```
```diff
@@ -1114,7 +1126,8 @@ class BaseDocument(object):
     def _get_changed_fields(self, key='', inspected=None):
         """Returns a list of all fields that have explicitly been changed.
         """
-        from mongoengine import EmbeddedDocument, DynamicEmbeddedDocument
+        EmbeddedDocument = _import_class("EmbeddedDocument")
+        DynamicEmbeddedDocument = _import_class("DynamicEmbeddedDocument")
         _changed_fields = []
         _changed_fields += getattr(self, '_changed_fields', [])
 
```
```diff
@@ -1245,7 +1258,9 @@ class BaseDocument(object):
         geo_indices = []
         inspected.append(cls)
 
-        from fields import EmbeddedDocumentField, GeoPointField
+        EmbeddedDocumentField = _import_class("EmbeddedDocumentField")
+        GeoPointField = _import_class("GeoPointField")
+
         for field in cls._fields.values():
             if not isinstance(field, (EmbeddedDocumentField, GeoPointField)):
                 continue
```
```diff
@@ -1319,10 +1334,11 @@ class BaseDocument(object):
 
     def __repr__(self):
         try:
-            u = txt_type(self)
+            u = self.__str__()
         except (UnicodeEncodeError, UnicodeDecodeError):
            u = '[Bad Unicode data]'
-        return '<%s: %s>' % (self.__class__.__name__, u)
+        repr_type = type(u)
+        return repr_type('<%s: %s>' % (self.__class__.__name__, u))
```
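The `__repr__` change above builds the repr in whatever string type `__str__` actually returned (`str` or `unicode` under Python 2), instead of forcing one type. A minimal sketch of the pattern in plain Python 3, where both types collapse to `str`; `Doc` is an illustrative stand-in class:

```python
class Doc(object):
    def __str__(self):
        return 'my doc'

    def __repr__(self):
        try:
            u = self.__str__()
        except (UnicodeEncodeError, UnicodeDecodeError):
            # __str__ may blow up on documents holding undecodable bytes.
            u = '[Bad Unicode data]'
        # Mirror the type of u so str input yields str, unicode yields unicode.
        repr_type = type(u)
        return repr_type('<%s: %s>' % (self.__class__.__name__, u))
```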
```diff
@@ -1330,7 +1346,7 @@ class BaseDocument(object):
                 return self.__unicode__()
             else:
                 return unicode(self).encode('utf-8')
-        return '%s object' % self.__class__.__name__
+        return txt_type('%s object' % self.__class__.__name__)
 
     def __eq__(self, other):
         if isinstance(other, self.__class__) and hasattr(other, 'id'):
```
```diff
@@ -1479,14 +1495,30 @@ def _import_class(cls_name):
     """Cached mechanism for imports"""
     if cls_name in _class_registry:
         return _class_registry.get(cls_name)
-    if cls_name == 'Document':
-        from mongoengine.document import Document as cls
-    elif cls_name == 'EmbeddedDocument':
-        from mongoengine.document import EmbeddedDocument as cls
-    elif cls_name == 'DictField':
-        from mongoengine.fields import DictField as cls
-    elif cls_name == 'OperationError':
-        from queryset import OperationError as cls
 
-    _class_registry[cls_name] = cls
-    return cls
+    doc_classes = ['Document', 'DynamicEmbeddedDocument', 'EmbeddedDocument']
+    field_classes = ['DictField', 'DynamicField', 'EmbeddedDocumentField',
+                     'GenericReferenceField', 'GeoPointField',
+                     'ReferenceField']
+    queryset_classes = ['OperationError']
+    deref_classes = ['DeReference']
+
+    if cls_name in doc_classes:
+        from mongoengine import document as module
+        import_classes = doc_classes
+    elif cls_name in field_classes:
+        from mongoengine import fields as module
+        import_classes = field_classes
+    elif cls_name in queryset_classes:
+        from mongoengine import queryset as module
+        import_classes = queryset_classes
+    elif cls_name in deref_classes:
+        from mongoengine import dereference as module
+        import_classes = deref_classes
+    else:
+        raise ValueError('No import set for: ' % cls_name)
+
+    for cls in import_classes:
+        _class_registry[cls] = getattr(module, cls)
+
+    return _class_registry.get(cls_name)
```
```diff
@@ -31,10 +31,10 @@ class DeReference(object):
             items = [i for i in items]
 
         self.max_depth = max_depth
 
         doc_type = None
 
         if instance and instance._fields:
-            doc_type = instance._fields[name]
+            doc_type = instance._fields.get(name)
             if hasattr(doc_type, 'field'):
                 doc_type = doc_type.field
```
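The switch from `instance._fields[name]` to `instance._fields.get(name)` matters because dynamic documents can dereference names that were never declared as fields: `.get()` yields `None` for those instead of raising `KeyError`. A tiny sketch with illustrative field names:

```python
# Hypothetical declared-fields mapping; 'tags' is a dynamic, undeclared name.
_fields = {'author': 'ReferenceField'}

def lookup(name):
    # dict.get returns None for undeclared (dynamic) fields
    # where _fields[name] would raise KeyError.
    return _fields.get(name)
```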
```diff
@@ -134,7 +134,7 @@ class DeReference(object):
             elif doc_type is None:
                 doc = get_document(
                     ''.join(x.capitalize()
                             for x in col.split('_')))._from_son(ref)
             else:
                 doc = doc_type._from_son(ref)
             object_map[doc.id] = doc
```
```diff
@@ -166,7 +166,7 @@ class DeReference(object):
                 return self.object_map.get(items['_ref'].id, items)
             elif '_types' in items and '_cls' in items:
                 doc = get_document(items['_cls'])._from_son(items)
-                doc._data = self._attach_objects(doc._data, depth, doc, name)
+                doc._data = self._attach_objects(doc._data, depth, doc, None)
                 return doc
 
         if not hasattr(items, 'items'):
```
```diff
@@ -3,6 +3,8 @@ import datetime
 from mongoengine import *
 
 from django.utils.encoding import smart_str
+from django.contrib.auth.models import _user_get_all_permissions
+from django.contrib.auth.models import _user_has_perm
 from django.contrib.auth.models import AnonymousUser
 from django.utils.translation import ugettext_lazy as _
 
```
```diff
@@ -104,6 +106,25 @@ class User(Document):
         """
         return check_password(raw_password, self.password)
 
+    def get_all_permissions(self, obj=None):
+        return _user_get_all_permissions(self, obj)
+
+    def has_perm(self, perm, obj=None):
+        """
+        Returns True if the user has the specified permission. This method
+        queries all available auth backends, but returns immediately if any
+        backend returns True. Thus, a user who has permission from a single
+        auth backend is assumed to have permission in general. If an object is
+        provided, permissions for this specific object are checked.
+        """
+
+        # Active superusers have all permissions.
+        if self.is_active and self.is_superuser:
+            return True
+
+        # Otherwise we need to check the backends.
+        return _user_has_perm(self, perm, obj)
+
     @classmethod
     def create_user(cls, username, password, email=None):
         """Create (and save) a new user with the given username, password and
```
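The `has_perm` flow added above is: active superusers short-circuit to `True`, otherwise every auth backend is consulted and the first `True` wins. A dependency-free sketch of that flow; the `backends` parameter and `_user_has_perm` signature here are illustrative stand-ins, not Django's real API (Django discovers backends from settings rather than taking them as an argument):

```python
def _user_has_perm(user, perm, obj, backends):
    # First backend to grant the permission wins.
    for backend in backends:
        if backend(user, perm, obj):
            return True
    return False

class User(object):
    def __init__(self, is_active=False, is_superuser=False):
        self.is_active = is_active
        self.is_superuser = is_superuser

    def has_perm(self, perm, obj=None, backends=()):
        # Active superusers have all permissions.
        if self.is_active and self.is_superuser:
            return True
        # Otherwise we need to check the backends.
        return _user_has_perm(self, perm, obj, backends)
```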
```diff
@@ -15,13 +15,23 @@ MONGOENGINE_SESSION_DB_ALIAS = getattr(
     settings, 'MONGOENGINE_SESSION_DB_ALIAS',
     DEFAULT_CONNECTION_NAME)
 
+# a setting for the name of the collection used to store sessions
+MONGOENGINE_SESSION_COLLECTION = getattr(
+    settings, 'MONGOENGINE_SESSION_COLLECTION',
+    'django_session')
+
+# a setting for whether session data is stored encoded or not
+MONGOENGINE_SESSION_DATA_ENCODE = getattr(
+    settings, 'MONGOENGINE_SESSION_DATA_ENCODE',
+    True)
 
 class MongoSession(Document):
     session_key = fields.StringField(primary_key=True, max_length=40)
-    session_data = fields.StringField()
+    session_data = fields.StringField() if MONGOENGINE_SESSION_DATA_ENCODE \
+        else fields.DictField()
     expire_date = fields.DateTimeField()
 
-    meta = {'collection': 'django_session',
+    meta = {'collection': MONGOENGINE_SESSION_COLLECTION,
             'db_alias': MONGOENGINE_SESSION_DB_ALIAS,
             'allow_inheritance': False}
 
```
@@ -34,7 +44,10 @@ class SessionStore(SessionBase):
|
|||||||
try:
|
try:
|
||||||
s = MongoSession.objects(session_key=self.session_key,
|
s = MongoSession.objects(session_key=self.session_key,
|
||||||
expire_date__gt=datetime.now())[0]
|
expire_date__gt=datetime.now())[0]
|
||||||
return self.decode(force_unicode(s.session_data))
|
if MONGOENGINE_SESSION_DATA_ENCODE:
|
||||||
|
return self.decode(force_unicode(s.session_data))
|
||||||
|
else:
|
||||||
|
return s.session_data
|
||||||
except (IndexError, SuspiciousOperation):
|
except (IndexError, SuspiciousOperation):
|
||||||
self.create()
|
self.create()
|
||||||
return {}
|
return {}
|
||||||
@@ -57,7 +70,10 @@ class SessionStore(SessionBase):
|
|||||||
if self.session_key is None:
|
if self.session_key is None:
|
||||||
self._session_key = self._get_new_session_key()
|
self._session_key = self._get_new_session_key()
|
||||||
s = MongoSession(session_key=self.session_key)
|
s = MongoSession(session_key=self.session_key)
|
||||||
s.session_data = self.encode(self._get_session(no_load=must_create))
|
if MONGOENGINE_SESSION_DATA_ENCODE:
|
||||||
|
s.session_data = self.encode(self._get_session(no_load=must_create))
|
||||||
|
else:
|
||||||
|
s.session_data = self._get_session(no_load=must_create)
|
||||||
s.expire_date = self.get_expiry_date()
|
s.expire_date = self.get_expiry_date()
|
||||||
try:
|
try:
|
||||||
s.save(force_insert=must_create, safe=True)
|
s.save(force_insert=must_create, safe=True)
|
||||||
|
```diff
@@ -25,6 +25,14 @@ class EmbeddedDocument(BaseDocument):
     collection. :class:`~mongoengine.EmbeddedDocument`\ s should be used as
     fields on :class:`~mongoengine.Document`\ s through the
     :class:`~mongoengine.EmbeddedDocumentField` field type.
+
+    A :class:`~mongoengine.EmbeddedDocument` subclass may be itself subclassed,
+    to create a specialised version of the embedded document that will be
+    stored in the same collection. To facilitate this behaviour, `_cls` and
+    `_types` fields are added to documents (hidden through the MongoEngine
+    interface). To disable this behaviour and remove the dependence on
+    the presence of `_cls` and `_types`, set :attr:`allow_inheritance` to
+    ``False`` in the :attr:`meta` dictionary.
     """

     # The __metaclass__ attribute is removed by 2to3 when running with Python3

@@ -353,7 +361,12 @@ class Document(BaseDocument):
         id_field = self._meta['id_field']
         obj = self.__class__.objects(
                 **{id_field: self[id_field]}
-              ).first().select_related(max_depth=max_depth)
+              ).limit(1).select_related(max_depth=max_depth)
+        if obj:
+            obj = obj[0]
+        else:
+            msg = "Reloaded document has been deleted"
+            raise OperationError(msg)
         for field in self._fields:
             setattr(self, field, self._reload(field, obj[field]))
         if self._dynamic:
```
```diff
@@ -1,10 +1,12 @@
 import datetime
 import decimal
+import itertools
 import re
 import time
+import urllib2
+import urlparse
 import uuid
 import warnings
-import itertools

 from operator import itemgetter

 import gridfs
```
```diff
@@ -101,25 +103,30 @@ class URLField(StringField):
     .. versionadded:: 0.3
     """

-    URL_REGEX = re.compile(
-        r'^https?://'
-        r'(?:(?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+[A-Z]{2,6}\.?|'
-        r'localhost|'
-        r'\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})'
-        r'(?::\d+)?'
-        r'(?:/?|[/?]\S+)$', re.IGNORECASE
-    )
+    _URL_REGEX = re.compile(
+        r'^(?:http|ftp)s?://'  # http:// or https://
+        r'(?:(?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+(?:[A-Z]{2,6}\.?|[A-Z0-9-]{2,}\.?)|'  # domain...
+        r'localhost|'  # localhost...
+        r'\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})'  # ...or ip
+        r'(?::\d+)?'  # optional port
+        r'(?:/?|[/?]\S+)$', re.IGNORECASE)

-    def __init__(self, verify_exists=False, **kwargs):
+    def __init__(self, verify_exists=False, url_regex=None, **kwargs):
         self.verify_exists = verify_exists
+        self.url_regex = url_regex or self._URL_REGEX
         super(URLField, self).__init__(**kwargs)

     def validate(self, value):
-        if not URLField.URL_REGEX.match(value):
+        if not self.url_regex.match(value):
             self.error('Invalid URL: %s' % value)
+            return

         if self.verify_exists:
-            import urllib2
+            warnings.warn(
+                "The URLField verify_exists argument has intractable security "
+                "and performance issues. Accordingly, it has been deprecated.",
+                DeprecationWarning
+            )
             try:
                 request = urllib2.Request(value)
                 urllib2.urlopen(request)
```
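The behaviour of the new default pattern can be checked standalone with the `re` module. This is a sketch that copies the regex from the hunk above; `is_valid_url` is a hypothetical helper, not part of the MongoEngine API — the field itself calls `self.url_regex.match(...)` in `validate`.

```python
import re

# The default URL pattern from the patch: now also accepts ftp:// schemes,
# longer TLD-like labels, localhost, and bare IPs with an optional port.
URL_REGEX = re.compile(
    r'^(?:http|ftp)s?://'  # http:// or https://
    r'(?:(?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+(?:[A-Z]{2,6}\.?|[A-Z0-9-]{2,}\.?)|'  # domain...
    r'localhost|'  # localhost...
    r'\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})'  # ...or ip
    r'(?::\d+)?'  # optional port
    r'(?:/?|[/?]\S+)$', re.IGNORECASE)

def is_valid_url(value, url_regex=None):
    # Mirror the field's override mechanism: a custom regex wins if given.
    return bool((url_regex or URL_REGEX).match(value))

print(is_valid_url('ftp://files.example.com/pub'))   # True
print(is_valid_url('http://localhost:8000/path'))    # True
print(is_valid_url('not a url'))                     # False
```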
```diff
@@ -142,6 +149,7 @@ class EmailField(StringField):
     def validate(self, value):
         if not EmailField.EMAIL_REGEX.match(value):
             self.error('Invalid Mail-address: %s' % value)
+        super(EmailField, self).validate(value)


 class IntField(BaseField):
```
```diff
@@ -709,6 +717,10 @@ class ReferenceField(BaseField):

         Bar.register_delete_rule(Foo, 'bar', NULLIFY)

+    .. note ::
+        `reverse_delete_rules` do not trigger pre / post delete signals to be
+        triggered.
+
     .. versionchanged:: 0.5 added `reverse_delete_rule`
     """
```
```diff
@@ -766,7 +778,7 @@ class ReferenceField(BaseField):
     def to_mongo(self, document):
         if isinstance(document, DBRef):
             if not self.dbref:
-                return "%s" % DBRef.id
+                return document.id
             return document
         elif not self.dbref and isinstance(document, basestring):
             return document

@@ -788,7 +800,7 @@ class ReferenceField(BaseField):
             collection = self.document_type._get_collection_name()
             return DBRef(collection, id_)

-        return "%s" % id_
+        return id_

     def to_python(self, value):
         """Convert a MongoDB-compatible type to a Python type.
```
```diff
@@ -1326,7 +1338,7 @@ class SequenceField(IntField):

     .. versionadded:: 0.5
     """
-    def __init__(self, collection_name=None, db_alias = None, sequence_name = None, *args, **kwargs):
+    def __init__(self, collection_name=None, db_alias=None, sequence_name=None, *args, **kwargs):
         self.collection_name = collection_name or 'mongoengine.counters'
         self.db_alias = db_alias or DEFAULT_CONNECTION_NAME
         self.sequence_name = sequence_name

@@ -1336,7 +1348,7 @@ class SequenceField(IntField):
         """
         Generate and Increment the counter
         """
-        sequence_name = self.sequence_name or self.owner_document._get_collection_name()
+        sequence_name = self.get_sequence_name()
         sequence_id = "%s.%s" % (sequence_name, self.name)
         collection = get_db(alias=self.db_alias)[self.collection_name]
         counter = collection.find_and_modify(query={"_id": sequence_id},

@@ -1345,6 +1357,16 @@ class SequenceField(IntField):
                                              upsert=True)
         return counter['next']

+    def get_sequence_name(self):
+        if self.sequence_name:
+            return self.sequence_name
+        owner = self.owner_document
+        if issubclass(owner, Document):
+            return owner._get_collection_name()
+        else:
+            return ''.join('_%s' % c if c.isupper() else c
+                           for c in owner._class_name).strip('_').lower()
+
     def __get__(self, instance, owner):

         if instance is None:
```
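The fallback naming rule in `get_sequence_name()` converts a CamelCase class name to snake_case when the owner is not a top-level `Document`. A standalone sketch of that one expression (the helper name here is hypothetical):

```python
# CamelCase class name -> snake_case sequence name, as in the patch:
# prefix each uppercase letter with '_', then strip and lowercase.
def sequence_name_from_class_name(class_name):
    return ''.join('_%s' % c if c.isupper() else c
                   for c in class_name).strip('_').lower()

print(sequence_name_from_class_name('Comment'))        # comment
print(sequence_name_from_class_name('BlogPostStats'))  # blog_post_stats
```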
```diff
@@ -353,6 +353,8 @@ class QuerySet(object):
         self._slave_okay = False
         self._iter = False
         self._scalar = []
+        self._as_pymongo = False
+        self._as_pymongo_coerce = False

         # If inheritance is allowed, only return instances and instances of
         # subclasses of the class being used
```
```diff
@@ -501,8 +503,10 @@ class QuerySet(object):
         """
         if isinstance(spec, basestring):
             spec = {'fields': [spec]}
-        if isinstance(spec, (list, tuple)):
-            spec = {'fields': spec}
+        elif isinstance(spec, (list, tuple)):
+            spec = {'fields': list(spec)}
+        elif isinstance(spec, dict):
+            spec = dict(spec)

         index_list = []
         direction = None
```
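The point of this hunk is that every accepted spec shape is normalised into a *fresh* dict, so the caller's list or dict argument is never mutated later. A minimal sketch (using `str` where the Python 2 code uses `basestring`; `normalize_spec` is a hypothetical name):

```python
# Normalise an index spec: string, list/tuple, or dict all become a new
# {'fields': [...]}-style dict, copying mutable inputs defensively.
def normalize_spec(spec):
    if isinstance(spec, str):
        spec = {'fields': [spec]}
    elif isinstance(spec, (list, tuple)):
        spec = {'fields': list(spec)}
    elif isinstance(spec, dict):
        spec = dict(spec)
    return spec

print(normalize_spec('title'))                # {'fields': ['title']}
print(normalize_spec(('title', '-created')))  # {'fields': ['title', '-created']}
```

Because `list(spec)` and `dict(spec)` copy, a later in-place edit of the normalised spec cannot leak back into the caller's object.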
```diff
@@ -606,11 +610,13 @@ class QuerySet(object):
         if self._where_clause:
             self._cursor_obj.where(self._where_clause)

-        # apply default ordering
         if self._ordering:
+            # Apply query ordering
             self._cursor_obj.sort(self._ordering)
         elif self._document._meta['ordering']:
+            # Otherwise, apply the ordering from the document model
             self.order_by(*self._document._meta['ordering'])
+            self._cursor_obj.sort(self._ordering)

         if self._limit is not None:
             self._cursor_obj.limit(self._limit - (self._skip or 0))
```
```diff
@@ -925,7 +931,7 @@ class QuerySet(object):
             if not isinstance(doc, self._document):
                 msg = "Some documents inserted aren't instances of %s" % str(self._document)
                 raise OperationError(msg)
-            if doc.pk:
+            if doc.pk and not doc._created:
                 msg = "Some documents have ObjectIds use doc.update() instead"
                 raise OperationError(msg)
             raw.append(doc.to_mongo())
```
```diff
@@ -984,6 +990,9 @@ class QuerySet(object):
             for doc in docs:
                 doc_map[doc['_id']] = self._get_scalar(
                         self._document._from_son(doc))
+        elif self._as_pymongo:
+            for doc in docs:
+                doc_map[doc['_id']] = self._get_as_pymongo(doc)
         else:
             for doc in docs:
                 doc_map[doc['_id']] = self._document._from_son(doc)
```
```diff
@@ -1000,6 +1009,9 @@ class QuerySet(object):
             if self._scalar:
                 return self._get_scalar(self._document._from_son(
                         self._cursor.next()))
+            if self._as_pymongo:
+                return self._get_as_pymongo(self._cursor.next())

             return self._document._from_son(self._cursor.next())
         except StopIteration, e:
             self.rewind()
```
```diff
@@ -1182,6 +1194,8 @@ class QuerySet(object):
             if self._scalar:
                 return self._get_scalar(self._document._from_son(
                     self._cursor[key]))
+            if self._as_pymongo:
+                return self._get_as_pymongo(self._cursor.next())
             return self._document._from_son(self._cursor[key])
         raise AttributeError
```
```diff
@@ -1300,7 +1314,7 @@ class QuerySet(object):
             key_list.append((key, direction))

         self._ordering = key_list
-        self._cursor.sort(key_list)
         return self

     def explain(self, format=False):
```
```diff
@@ -1393,6 +1407,8 @@ class QuerySet(object):
         """
         operators = ['set', 'unset', 'inc', 'dec', 'pop', 'push', 'push_all',
                      'pull', 'pull_all', 'add_to_set']
+        match_operators = ['ne', 'gt', 'gte', 'lt', 'lte', 'in', 'nin', 'mod',
+                           'all', 'size', 'exists', 'not']

         mongo_update = {}
         for key, value in update.items():
```
```diff
@@ -1416,6 +1432,10 @@ class QuerySet(object):
             elif op == 'add_to_set':
                 op = op.replace('_to_set', 'ToSet')

+            match = None
+            if parts[-1] in match_operators:
+                match = parts.pop()
+
             if _doc_cls:
                 # Switch field names to proper names [set in Field(name='foo')]
                 fields = QuerySet._lookup_field(_doc_cls, parts)
```
```diff
@@ -1449,16 +1469,22 @@ class QuerySet(object):
             elif field.required or value is not None:
                 value = field.prepare_query_value(op, value)

+            if match:
+                match = '$' + match
+                value = {match: value}
+
             key = '.'.join(parts)

             if not op:
-                raise InvalidQueryError("Updates must supply an operation eg: set__FIELD=value")
+                raise InvalidQueryError("Updates must supply an operation "
+                                        "eg: set__FIELD=value")

             if 'pull' in op and '.' in key:
                 # Dot operators don't work on pull operations
                 # it uses nested dict syntax
                 if op == 'pullAll':
-                    raise InvalidQueryError("pullAll operations only support a single field depth")
+                    raise InvalidQueryError("pullAll operations only support "
+                                            "a single field depth")

                 parts.reverse()
                 for key in parts:
```
|
|||||||
|
|
||||||
return tuple(data)
|
return tuple(data)
|
||||||
|
|
||||||
|
def _get_as_pymongo(self, row):
|
||||||
|
# Extract which fields paths we should follow if .fields(...) was
|
||||||
|
# used. If not, handle all fields.
|
||||||
|
if not getattr(self, '__as_pymongo_fields', None):
|
||||||
|
self.__as_pymongo_fields = []
|
||||||
|
for field in self._loaded_fields.fields - set(['_cls', '_id', '_types']):
|
||||||
|
self.__as_pymongo_fields.append(field)
|
||||||
|
while '.' in field:
|
||||||
|
field, _ = field.rsplit('.', 1)
|
||||||
|
self.__as_pymongo_fields.append(field)
|
||||||
|
|
||||||
|
all_fields = not self.__as_pymongo_fields
|
||||||
|
|
||||||
|
def clean(data, path=None):
|
||||||
|
path = path or ''
|
||||||
|
|
||||||
|
if isinstance(data, dict):
|
||||||
|
new_data = {}
|
||||||
|
for key, value in data.iteritems():
|
||||||
|
new_path = '%s.%s' % (path, key) if path else key
|
||||||
|
if all_fields or new_path in self.__as_pymongo_fields:
|
||||||
|
new_data[key] = clean(value, path=new_path)
|
||||||
|
data = new_data
|
||||||
|
elif isinstance(data, list):
|
||||||
|
data = [clean(d, path=path) for d in data]
|
||||||
|
else:
|
||||||
|
if self._as_pymongo_coerce:
|
||||||
|
# If we need to coerce types, we need to determine the
|
||||||
|
# type of this field and use the corresponding .to_python(...)
|
||||||
|
from mongoengine.fields import EmbeddedDocumentField
|
||||||
|
obj = self._document
|
||||||
|
for chunk in path.split('.'):
|
||||||
|
obj = getattr(obj, chunk, None)
|
||||||
|
if obj is None:
|
||||||
|
break
|
||||||
|
elif isinstance(obj, EmbeddedDocumentField):
|
||||||
|
obj = obj.document_type
|
||||||
|
if obj and data is not None:
|
||||||
|
data = obj.to_python(data)
|
||||||
|
return data
|
||||||
|
return clean(row)
|
||||||
|
|
||||||
def scalar(self, *fields):
|
def scalar(self, *fields):
|
||||||
"""Instead of returning Document instances, return either a specific
|
"""Instead of returning Document instances, return either a specific
|
||||||
value or a tuple of values in order.
|
value or a tuple of values in order.
|
||||||
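The recursive `clean()` helper above filters a raw SON dict down to the dotted field paths selected with `.fields(...)`. A pure-Python sketch of that filtering, without the type-coercion branch (note that parent paths must be listed too, matching how the patch expands `a.b` into `a` and `a.b`):

```python
def clean(data, allowed, path=''):
    # Rebuild dicts keeping only keys whose dotted path is in `allowed`;
    # an empty `allowed` set means "keep everything".
    if isinstance(data, dict):
        out = {}
        for key, value in data.items():
            new_path = '%s.%s' % (path, key) if path else key
            if not allowed or new_path in allowed:
                out[key] = clean(value, allowed, new_path)
        return out
    if isinstance(data, list):
        return [clean(d, allowed, path) for d in data]
    return data

row = {'name': 'Bob', 'address': {'city': 'NYC', 'zip': '10001'}}
# Keep name and address.city only; 'address' itself must be allowed too.
print(clean(row, {'name', 'address', 'address.city'}))
```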
```diff
@@ -1588,6 +1656,16 @@ class QuerySet(object):
         """An alias for scalar"""
         return self.scalar(*fields)

+    def as_pymongo(self, coerce_types=False):
+        """Instead of returning Document instances, return raw values from
+        pymongo.
+
+        :param coerce_types: Field types (if applicable) will be used to coerce types.
+        """
+        self._as_pymongo = True
+        self._as_pymongo_coerce = coerce_types
+        return self
+
     def _sub_js_fields(self, code):
         """When fields are specified with [~fieldname] syntax, where
         *fieldname* is the Python name of a field, *fieldname* will be
```
```diff
@@ -5,7 +5,7 @@
 %define srcname mongoengine

 Name:           python-%{srcname}
-Version:        0.7.2
+Version:        0.7.9
 Release:        1%{?dist}
 Summary:        A Python Document-Object Mapper for working with MongoDB
```
```diff
@@ -53,7 +53,7 @@ class TestWarnings(unittest.TestCase):
         p2.parent = p1
         p2.save(cascade=False)

-        self.assertEqual(len(self.warning_list), 1)
+        self.assertTrue(len(self.warning_list) > 0)
         warning = self.warning_list[0]
         self.assertEqual(FutureWarning, warning["category"])
         self.assertTrue("ReferenceFields will default to using ObjectId"

@@ -77,6 +77,8 @@ class TestWarnings(unittest.TestCase):
         p2.save()

         self.assertEqual(len(self.warning_list), 1)
+        if len(self.warning_list) > 1:
+            print self.warning_list
         warning = self.warning_list[0]
         self.assertEqual(FutureWarning, warning["category"])
         self.assertTrue("Cascading saves will default to off in 0.8"
```
```diff
@@ -1,7 +1,7 @@
 from __future__ import with_statement
 import unittest

-from bson import DBRef
+from bson import DBRef, ObjectId

 from mongoengine import *
 from mongoengine.connection import get_db

@@ -42,6 +42,12 @@ class FieldTest(unittest.TestCase):
             group_obj = Group.objects.first()
             self.assertEqual(q, 1)

+            len(group_obj._data['members'])
+            self.assertEqual(q, 1)
+
+            len(group_obj.members)
+            self.assertEqual(q, 2)
+
             [m for m in group_obj.members]
             self.assertEqual(q, 2)

@@ -84,6 +90,7 @@ class FieldTest(unittest.TestCase):

         group = Group(members=User.objects)
         group.save()
+        group.reload()  # Confirm reload works

         with query_counter() as q:
             self.assertEqual(q, 0)

@@ -187,8 +194,8 @@ class FieldTest(unittest.TestCase):
         self.assertEqual(group.members, [user])

         raw_data = Group._get_collection().find_one()
-        self.assertTrue(isinstance(raw_data['author'], basestring))
-        self.assertTrue(isinstance(raw_data['members'][0], basestring))
+        self.assertTrue(isinstance(raw_data['author'], ObjectId))
+        self.assertTrue(isinstance(raw_data['members'][0], ObjectId))

     def test_recursive_reference(self):
         """Ensure that ReferenceFields can reference their own documents.
```
```diff
@@ -1,3 +1,4 @@
+# -*- coding: utf-8 -*-
 from __future__ import with_statement
 import bson
 import os

@@ -14,7 +15,7 @@ from datetime import datetime
 from tests.fixtures import Base, Mixin, PickleEmbedded, PickleTest

 from mongoengine import *
-from mongoengine.base import NotRegistered, InvalidDocumentError
+from mongoengine.base import NotRegistered, InvalidDocumentError, get_document
 from mongoengine.queryset import InvalidQueryError
 from mongoengine.connection import get_db, get_connection
```
```diff
@@ -85,6 +86,22 @@ class DocumentTest(unittest.TestCase):
         # Ensure Document isn't treated like an actual document
         self.assertFalse(hasattr(Document, '_fields'))

+    def test_repr(self):
+        """Ensure that unicode representation works
+        """
+        class Article(Document):
+            title = StringField()
+
+            def __unicode__(self):
+                return self.title
+
+        Article.drop_collection()
+
+        Article(title=u'привет мир').save()
+
+        self.assertEqual('<Article: привет мир>', repr(Article.objects.first()))
+        self.assertEqual('[<Article: привет мир>]', repr(Article.objects.all()))
+
     def test_collection_naming(self):
         """Ensure that a collection with a specified name may be used.
         """
```
```diff
@@ -338,7 +355,6 @@ class DocumentTest(unittest.TestCase):
                 meta = {'allow_inheritance': False}
         self.assertRaises(ValueError, create_employee_class)

-
     def test_allow_inheritance_abstract_document(self):
         """Ensure that abstract documents can set inheritance rules and that
         _cls and _types will not be used.
```
```diff
@@ -366,6 +382,31 @@ class DocumentTest(unittest.TestCase):

         Animal.drop_collection()

+    def test_allow_inheritance_embedded_document(self):
+
+        # Test the same for embedded documents
+        class Comment(EmbeddedDocument):
+            content = StringField()
+            meta = {'allow_inheritance': False}
+
+        def create_special_comment():
+            class SpecialComment(Comment):
+                pass
+
+        self.assertRaises(ValueError, create_special_comment)
+
+        comment = Comment(content='test')
+        self.assertFalse('_cls' in comment.to_mongo())
+        self.assertFalse('_types' in comment.to_mongo())
+
+        class Comment(EmbeddedDocument):
+            content = StringField()
+            meta = {'allow_inheritance': True}
+
+        comment = Comment(content='test')
+        self.assertTrue('_cls' in comment.to_mongo())
+        self.assertTrue('_types' in comment.to_mongo())
+
     def test_document_inheritance(self):
         """Ensure multiple inheritance of abstract docs works
         """
```
```diff
@@ -396,6 +437,9 @@ class DocumentTest(unittest.TestCase):
             'indexes': ['name']
         }

+        self.assertEqual(Animal._meta['index_specs'],
+                         [{'fields': [('_types', 1), ('name', 1)]}])
+
         Animal.drop_collection()

         dog = Animal(name='dog')

@@ -417,6 +461,9 @@ class DocumentTest(unittest.TestCase):
             'allow_inheritance': False,
             'indexes': ['name']
         }

+        self.assertEqual(Animal._meta['index_specs'],
+                         [{'fields': [('name', 1)]}])
+
         collection.update({}, {"$unset": {"_types": 1, "_cls": 1}}, multi=True)

         # Confirm extra data is removed
```
```diff
@@ -634,6 +681,12 @@ class DocumentTest(unittest.TestCase):
             'allow_inheritance': True
         }

+        self.assertEqual(BlogPost._meta['index_specs'],
+                         [{'fields': [('_types', 1), ('addDate', -1)]},
+                          {'fields': [('tags', 1)]},
+                          {'fields': [('_types', 1), ('category', 1),
+                                      ('addDate', -1)]}])
+
         BlogPost.drop_collection()

         info = BlogPost.objects._collection.index_information()

@@ -657,6 +710,13 @@ class DocumentTest(unittest.TestCase):
             title = StringField()
             meta = {'indexes': ['title']}

+        self.assertEqual(ExtendedBlogPost._meta['index_specs'],
+                         [{'fields': [('_types', 1), ('addDate', -1)]},
+                          {'fields': [('tags', 1)]},
+                          {'fields': [('_types', 1), ('category', 1),
+                                      ('addDate', -1)]},
+                          {'fields': [('_types', 1), ('title', 1)]}])
+
         BlogPost.drop_collection()

         list(ExtendedBlogPost.objects)
```
@@ -687,6 +747,8 @@ class DocumentTest(unittest.TestCase):
|
|||||||
description = StringField()
|
description = StringField()
|
||||||
|
|
||||||
self.assertEqual(A._meta['index_specs'], B._meta['index_specs'])
|
self.assertEqual(A._meta['index_specs'], B._meta['index_specs'])
|
||||||
|
self.assertEqual([{'fields': [('_types', 1), ('title', 1)]}],
|
||||||
|
A._meta['index_specs'])
|
||||||
|
|
||||||
def test_build_index_spec_is_not_destructive(self):
|
def test_build_index_spec_is_not_destructive(self):
|
||||||
|
|
||||||
@@ -767,6 +829,9 @@ class DocumentTest(unittest.TestCase):
                 'allow_inheritance': False
             }
 
+        self.assertEqual([{'fields': [('rank.title', 1)]}],
+                         Person._meta['index_specs'])
+
         Person.drop_collection()
 
         # Indexes are lazy so use list() to perform query
@@ -785,6 +850,10 @@ class DocumentTest(unittest.TestCase):
                 '*location.point',
             ],
         }
 
+        self.assertEqual([{'fields': [('location.point', '2d')]}],
+                         Place._meta['index_specs'])
+
         Place.drop_collection()
 
         info = Place.objects._collection.index_information()
@@ -810,6 +879,10 @@ class DocumentTest(unittest.TestCase):
             ],
         }
 
+        self.assertEqual([{'fields': [('addDate', -1)], 'unique': True,
+                           'sparse': True, 'types': False}],
+                         BlogPost._meta['index_specs'])
+
         BlogPost.drop_collection()
 
         info = BlogPost.objects._collection.index_information()
@@ -1263,7 +1336,6 @@ class DocumentTest(unittest.TestCase):
 
         User.drop_collection()
 
-
     def test_document_not_registered(self):
 
         class Place(Document):
@@ -1288,6 +1360,19 @@ class DocumentTest(unittest.TestCase):
             print Place.objects.all()
         self.assertRaises(NotRegistered, query_without_importing_nice_place)
 
+    def test_document_registry_regressions(self):
+
+        class Location(Document):
+            name = StringField()
+            meta = {'allow_inheritance': True}
+
+        class Area(Location):
+            location = ReferenceField('Location', dbref=True)
+
+        Location.drop_collection()
+
+        self.assertEquals(Area, get_document("Area"))
+        self.assertEquals(Area, get_document("Location.Area"))
+
     def test_creation(self):
         """Ensure that document may be created using keyword arguments.
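The regression test above asserts that a subclass is retrievable from the document registry both by its own name (`"Area"`) and by its dotted inheritance path (`"Location.Area"`). A dict-backed sketch of that behaviour (names here are stand-ins, not `mongoengine.base` internals):

```python
# Minimal document-registry sketch: register a class under its own name
# and, for subclasses, under the dotted 'Parent.Child' path as well.
_registry = {}

def register_document(name, cls, parent=None):
    _registry[name] = cls
    if parent is not None:
        _registry['%s.%s' % (parent, name)] = cls

def get_document(name):
    return _registry[name]

class Location(object):
    pass

class Area(Location):
    pass

register_document('Location', Location)
register_document('Area', Area, parent='Location')
print(get_document('Area') is get_document('Location.Area'))  # True
```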
@@ -1,3 +1,4 @@
+# -*- coding: utf-8 -*-
 from __future__ import with_statement
 import datetime
 import os
@@ -7,7 +8,7 @@ import tempfile
 
 from decimal import Decimal
 
-from bson import Binary, DBRef
+from bson import Binary, DBRef, ObjectId
 import gridfs
 
 from nose.plugins.skip import SkipTest
@@ -1104,7 +1105,17 @@ class FieldTest(unittest.TestCase):
         p = Person.objects.get(name="Ross")
         self.assertEqual(p.parent, p1)
 
-    def test_str_reference_fields(self):
+    def test_dbref_to_mongo(self):
+        class Person(Document):
+            name = StringField()
+            parent = ReferenceField('self', dbref=False)
+
+        p1 = Person._from_son({'name': "Yakxxx",
+                               'parent': "50a234ea469ac1eda42d347d"})
+        mongoed = p1.to_mongo()
+        self.assertTrue(isinstance(mongoed['parent'], ObjectId))
+
+    def test_objectid_reference_fields(self):
 
         class Person(Document):
             name = StringField()
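With `dbref=False`, a `ReferenceField` stores the bare id rather than a `DBRef`, so the new test checks that a string id loaded via `_from_son` is coerced back to an `ObjectId` by `to_mongo()`. A sketch of that coercion, assuming the 24-hex-character id format; `ObjectIdStub` and `reference_to_mongo` are illustrative stand-ins, not bson or MongoEngine APIs:

```python
class ObjectIdStub(object):
    """Stand-in for bson.ObjectId: accepts a 24-char hex string."""
    def __init__(self, oid):
        if not (isinstance(oid, str) and len(oid) == 24
                and all(c in '0123456789abcdef' for c in oid.lower())):
            raise ValueError('%r is not a valid ObjectId' % (oid,))
        self.hex_id = oid.lower()

def reference_to_mongo(value):
    """Coerce a stored reference (string or id object) to an id object,
    as a dbref=False ReferenceField would when serialising."""
    return value if isinstance(value, ObjectIdStub) else ObjectIdStub(value)

parent = reference_to_mongo("50a234ea469ac1eda42d347d")
print(isinstance(parent, ObjectIdStub))  # True
```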
@@ -1117,7 +1128,7 @@ class FieldTest(unittest.TestCase):
 
         col = Person._get_collection()
         data = col.find_one({'name': 'Ross'})
-        self.assertEqual(data['parent'], "%s" % p1.pk)
+        self.assertEqual(data['parent'], p1.pk)
 
         p = Person.objects.get(name="Ross")
         self.assertEqual(p.parent, p1)
@@ -2174,6 +2185,28 @@ class FieldTest(unittest.TestCase):
         c = self.db['mongoengine.counters'].find_one({'_id': 'animal.id'})
         self.assertEqual(c['next'], 10)
 
+    def test_embedded_sequence_field(self):
+        class Comment(EmbeddedDocument):
+            id = SequenceField()
+            content = StringField(required=True)
+
+        class Post(Document):
+            title = StringField(required=True)
+            comments = ListField(EmbeddedDocumentField(Comment))
+
+        self.db['mongoengine.counters'].drop()
+        Post.drop_collection()
+
+        Post(title="MongoEngine",
+             comments=[Comment(content="NoSQL Rocks"),
+                       Comment(content="MongoEngine Rocks")]).save()
+
+        c = self.db['mongoengine.counters'].find_one({'_id': 'comment.id'})
+        self.assertEqual(c['next'], 2)
+        post = Post.objects.first()
+        self.assertEqual(1, post.comments[0].id)
+        self.assertEqual(2, post.comments[1].id)
+
     def test_generic_embedded_document(self):
         class Car(EmbeddedDocument):
             name = StringField()
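The embedded `SequenceField` test above relies on ids being allocated from a shared counter record keyed `'comment.id'` in the `mongoengine.counters` collection, so the two embedded comments get 1 and 2 and the counter's `next` lands on 2. A dict-backed sketch of that counter behaviour (the dict stands in for the counters collection):

```python
counters = {}  # stands in for the 'mongoengine.counters' collection

def next_sequence(key):
    """Increment-and-return, like an atomic find_and_modify with
    {'$inc': {'next': 1}} against the counters collection."""
    counters[key] = counters.get(key, 0) + 1
    return counters[key]

ids = [next_sequence('comment.id') for _ in range(2)]
print(ids)                     # [1, 2]
print(counters['comment.id'])  # 2 -- the 'next' value the test asserts
```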
@@ -2298,6 +2331,18 @@ class FieldTest(unittest.TestCase):
         post.comments[1].content = 'here we go'
         post.validate()
 
+    def test_email_field_honors_regex(self):
+        class User(Document):
+            email = EmailField(regex=r'\w+@example.com')
+
+        # Fails regex validation
+        user = User(email='me@foo.com')
+        self.assertRaises(ValidationError, user.validate)
+
+        # Passes regex validation
+        user = User(email='me@example.com')
+        self.assertTrue(user.validate() is None)
+
 
 if __name__ == '__main__':
     unittest.main()
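The `EmailField` test above layers a user-supplied regex on top of the field's normal email validation: `me@foo.com` is a syntactically valid address but fails the custom pattern. A stdlib sketch of just the regex layer (the function name and error type here are illustrative, not MongoEngine's):

```python
import re

def validate_email(value, regex=r'\w+@example.com'):
    """Reject values that do not match the user-supplied regex,
    mirroring EmailField(regex=...)'s extra check."""
    if re.match(regex, value) is None:
        raise ValueError('%r failed the custom regex' % value)

validate_email('me@example.com')   # passes, returns None
try:
    validate_email('me@foo.com')
except ValueError:
    print('rejected')              # fails the custom regex
```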
@@ -230,6 +230,30 @@ class QuerySetTest(unittest.TestCase):
 
         Blog.drop_collection()
 
+    def test_chaining(self):
+        class A(Document):
+            pass
+
+        class B(Document):
+            a = ReferenceField(A)
+
+        A.drop_collection()
+        B.drop_collection()
+
+        a1 = A().save()
+        a2 = A().save()
+
+        B(a=a1).save()
+
+        # Works
+        q1 = B.objects.filter(a__in=[a1, a2], a=a1)._query
+
+        # Doesn't work
+        q2 = B.objects.filter(a__in=[a1, a2])
+        q2 = q2.filter(a=a1)._query
+
+        self.assertEqual(q1, q2)
+
     def test_update_write_options(self):
         """Test that passing write_options works"""
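`test_chaining` pins down that two chained `filter()` calls must produce the same `_query` as one combined call: a second condition on the same field has to be ANDed in, not dropped or overwritten. A plain-dict sketch of that merge rule (illustrative only; MongoEngine's real query-merging lives in its Q-node machinery):

```python
def merge_filters(q1, q2):
    """Combine two filter condition dicts; a clash on the same field
    is ANDed rather than overwritten."""
    merged = dict(q1)
    for field, cond in q2.items():
        if field in merged and merged[field] != cond:
            return {'$and': [q1, q2]}
        merged[field] = cond
    return merged

chained = merge_filters({'a': {'$in': [1, 2]}}, {'a': 1})
print(chained)  # {'$and': [{'a': {'$in': [1, 2]}}, {'a': 1}]}
```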
@@ -414,6 +438,30 @@ class QuerySetTest(unittest.TestCase):
         self.assertEqual(post.comments[0].by, 'joe')
         self.assertEqual(post.comments[0].votes.score, 4)
 
+    def test_updates_can_have_match_operators(self):
+
+        class Post(Document):
+            title = StringField(required=True)
+            tags = ListField(StringField())
+            comments = ListField(EmbeddedDocumentField("Comment"))
+
+        class Comment(EmbeddedDocument):
+            content = StringField()
+            name = StringField(max_length=120)
+            vote = IntField()
+
+        Post.drop_collection()
+
+        comm1 = Comment(content="very funny indeed", name="John S", vote=1)
+        comm2 = Comment(content="kind of funny", name="Mark P", vote=0)
+
+        Post(title='Fun with MongoEngine', tags=['mongodb', 'mongoengine'],
+             comments=[comm1, comm2]).save()
+
+        Post.objects().update_one(pull__comments__vote__lt=1)
+
+        self.assertEqual(1, len(Post.objects.first().comments))
+
     def test_mapfield_update(self):
         """Ensure that the MapField can be updated."""
         class Member(EmbeddedDocument):
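The update above, `pull__comments__vote__lt=1`, translates to a `$pull` with a match condition: every embedded comment whose `vote` is below 1 is removed, leaving one comment. The list-level effect can be sketched in plain Python (this mimics the outcome, not the server-side `$pull` itself):

```python
def pull_matching(items, predicate):
    """Remove every list element the predicate matches, like a $pull
    with a match condition on embedded documents."""
    return [item for item in items if not predicate(item)]

comments = [{'name': 'John S', 'vote': 1},
            {'name': 'Mark P', 'vote': 0}]
comments = pull_matching(comments, lambda c: c['vote'] < 1)
print([c['name'] for c in comments])  # ['John S'] -- one comment left
```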
@@ -543,6 +591,10 @@ class QuerySetTest(unittest.TestCase):
 
         self.assertRaises(OperationError, throw_operation_error)
 
+        # Test can insert new doc
+        new_post = Blog(title="code", id=ObjectId())
+        Blog.objects.insert(new_post)
+
         # test handles other classes being inserted
        def throw_operation_error_wrong_doc():
             class Author(Document):
@@ -1885,6 +1937,22 @@ class QuerySetTest(unittest.TestCase):
         ages = [p.age for p in self.Person.objects.order_by('-name')]
         self.assertEqual(ages, [30, 40, 20])
 
+    def test_order_by_chaining(self):
+        """Ensure that an order_by query chains properly and allows .only()
+        """
+        self.Person(name="User A", age=20).save()
+        self.Person(name="User B", age=40).save()
+        self.Person(name="User C", age=30).save()
+
+        only_age = self.Person.objects.order_by('-age').only('age')
+
+        names = [p.name for p in only_age]
+        ages = [p.age for p in only_age]
+
+        # The .only('age') clause should mean that all names are None
+        self.assertEqual(names, [None, None, None])
+        self.assertEqual(ages, [40, 30, 20])
+
     def test_confirm_order_by_reference_wont_work(self):
         """Ordering by reference is not possible. Use map / reduce.. or
         denormalise"""
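The chained `order_by('-age').only('age')` above combines a sort with a projection: only the projected field comes back populated, so every `name` reads as `None`. The two effects can be sketched over plain dicts (illustrative, not the real queryset machinery):

```python
def only(docs, *fields):
    """Project documents onto the given fields; everything else reads
    back as None, as the .only('age') assertions expect."""
    return [{k: (v if k in fields else None) for k, v in d.items()}
            for d in docs]

people = [{'name': 'User B', 'age': 40},
          {'name': 'User C', 'age': 30},
          {'name': 'User A', 'age': 20}]
projected = sorted(only(people, 'age'), key=lambda d: -d['age'])
print([p['name'] for p in projected])  # [None, None, None]
print([p['age'] for p in projected])   # [40, 30, 20]
```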
@@ -3643,6 +3711,38 @@ class QueryFieldListTest(unittest.TestCase):
         ak = list(Bar.objects(foo__match={'shape': "square", "color": "purple"}))
         self.assertEqual([b1], ak)
 
+    def test_as_pymongo(self):
+
+        from decimal import Decimal
+
+        class User(Document):
+            id = ObjectIdField('_id')
+            name = StringField()
+            age = IntField()
+            price = DecimalField()
+
+        User.drop_collection()
+        User(name="Bob Dole", age=89, price=Decimal('1.11')).save()
+        User(name="Barack Obama", age=51, price=Decimal('2.22')).save()
+
+        users = User.objects.only('name', 'price').as_pymongo()
+        results = list(users)
+        self.assertTrue(isinstance(results[0], dict))
+        self.assertTrue(isinstance(results[1], dict))
+        self.assertEqual(results[0]['name'], 'Bob Dole')
+        self.assertEqual(results[0]['price'], '1.11')
+        self.assertEqual(results[1]['name'], 'Barack Obama')
+        self.assertEqual(results[1]['price'], '2.22')
+
+        # Test coerce_types
+        users = User.objects.only('name', 'price').as_pymongo(coerce_types=True)
+        results = list(users)
+        self.assertTrue(isinstance(results[0], dict))
+        self.assertTrue(isinstance(results[1], dict))
+        self.assertEqual(results[0]['name'], 'Bob Dole')
+        self.assertEqual(results[0]['price'], Decimal('1.11'))
+        self.assertEqual(results[1]['name'], 'Barack Obama')
+        self.assertEqual(results[1]['price'], Decimal('2.22'))
+
 
 if __name__ == '__main__':
     unittest.main()