Add support for new geojson fields, indexes and queries (#299)
commit 9c1cd81adb (parent 85b81fb12a)
@@ -76,10 +76,13 @@ Fields
 .. autoclass:: mongoengine.fields.BinaryField
 .. autoclass:: mongoengine.fields.FileField
 .. autoclass:: mongoengine.fields.ImageField
-.. autoclass:: mongoengine.fields.GeoPointField
 .. autoclass:: mongoengine.fields.SequenceField
 .. autoclass:: mongoengine.fields.ObjectIdField
 .. autoclass:: mongoengine.fields.UUIDField
+.. autoclass:: mongoengine.fields.GeoPointField
+.. autoclass:: mongoengine.fields.PointField
+.. autoclass:: mongoengine.fields.LineStringField
+.. autoclass:: mongoengine.fields.PolygonField
 .. autoclass:: mongoengine.fields.GridFSError
 .. autoclass:: mongoengine.fields.GridFSProxy
 .. autoclass:: mongoengine.fields.ImageGridFsProxy
@@ -4,6 +4,7 @@ Changelog

 Changes in 0.8.X
 ================
+- Add support for new geojson fields, indexes and queries (#299)
 - If values cant be compared mark as changed (#287)
 - Ensure as_pymongo() and to_json honour only() and exclude() (#293)
 - Document serialization uses field order to ensure a strict order is set (#296)
@@ -132,7 +132,11 @@ html_theme_path = ['_themes']
 html_use_smartypants = True

 # Custom sidebar templates, maps document names to template names.
-#html_sidebars = {}
+html_sidebars = {
+    'index': ['globaltoc.html', 'searchbox.html'],
+    '**': ['localtoc.html', 'relations.html', 'searchbox.html']
+}


 # Additional templates that should be rendered to pages, maps page names to
 # template names.
@@ -1,8 +1,8 @@
-=============================
-Using MongoEngine with Django
-=============================
+==============
+Django Support
+==============

-.. note:: Updated to support Django 1.4
+.. note:: Updated to support Django 1.5

 Connecting
 ==========
@@ -499,6 +499,35 @@ in this case use 'dot' notation to identify the value to index eg: `rank.title`
 Geospatial indexes
 ------------------

+
+The best geo index for mongodb is the new "2dsphere", which has an improved
+spherical model and provides better performance and more options when querying.
+The following fields will explicitly add a "2dsphere" index:
+
+- :class:`~mongoengine.fields.PointField`
+- :class:`~mongoengine.fields.LineStringField`
+- :class:`~mongoengine.fields.PolygonField`
+
+As "2dsphere" indexes can be part of a compound index, you may not want the
+automatic index but would prefer a compound index. In this example we turn off
+auto indexing and explicitly declare a compound index on ``location`` and ``datetime``::
+
+    class Log(Document):
+        location = PointField(auto_index=False)
+        datetime = DateTimeField()
+
+        meta = {
+            'indexes': [[("location", "2dsphere"), ("datetime", 1)]]
+        }
+
+
+Pre MongoDB 2.4 Geo
+'''''''''''''''''''
+
+.. note:: For MongoDB < 2.4 this is still current, however the new 2dsphere
+    index is a big improvement over the previous 2D model - so upgrading is
+    advised.
+
 Geospatial indexes will be automatically created for all
 :class:`~mongoengine.fields.GeoPointField`\ s

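
The compound-index pattern documented above is exercised directly by this commit's own test suite; as a rough, self-contained sketch (the database name is an assumption, the ``Log`` document mirrors the docs snippet), the behaviour it checks is::

    from mongoengine import Document, PointField, DateTimeField, connect

    connect('geo_index_sketch_db')            # assumed local test database

    class Log(Document):
        location = PointField(auto_index=False)
        datetime = DateTimeField()

        meta = {'indexes': [[("location", "2dsphere"), ("datetime", 1)]]}

    # auto_index=False means the PointField adds no index of its own, so the
    # only geo index is the compound one declared in meta.
    assert Log._geo_indices() == []

    Log.drop_collection()
    Log.ensure_indexes()

    info = Log._get_collection().index_information()
    # The compound index keeps the 2dsphere key first, then the datetime key.
    assert info["location_2dsphere_datetime_1"]["key"] == \
        [("location", "2dsphere"), ("datetime", 1)]
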
@@ -65,6 +65,9 @@ Available operators are as follows:
 * ``size`` -- the size of the array is
 * ``exists`` -- value for field exists

+String queries
+--------------
+
 The following operators are available as shortcuts to querying with regular
 expressions:

@@ -78,8 +81,71 @@ expressions:
 * ``iendswith`` -- string field ends with value (case insensitive)
 * ``match`` -- performs an $elemMatch so you can match an entire document within an array

-There are a few special operators for performing geographical queries, that
-may used with :class:`~mongoengine.fields.GeoPointField`\ s:
+Geo queries
+-----------
+
+There are a few special operators for performing geographical queries. The following
+were added in 0.8 for: :class:`~mongoengine.fields.PointField`,
+:class:`~mongoengine.fields.LineStringField` and
+:class:`~mongoengine.fields.PolygonField`:
+
+* ``geo_within`` -- Check if a geometry is within a polygon. For ease of use
+  it accepts either a geojson geometry or just the polygon coordinates eg::
+
+      loc.objects(point__geo_within=[[[40, 5], [40, 6], [41, 6], [40, 5]]])
+      loc.objects(point__geo_within={"type": "Polygon",
+                                     "coordinates": [[[40, 5], [40, 6], [41, 6], [40, 5]]]})
+
+* ``geo_within_box`` -- simplified geo_within searching with a box eg::
+
+      loc.objects(point__geo_within_box=[(-125.0, 35.0), (-100.0, 40.0)])
+      loc.objects(point__geo_within_box=[<bottom left coordinates>, <upper right coordinates>])
+
+* ``geo_within_polygon`` -- simplified geo_within searching within a simple polygon eg::
+
+      loc.objects(point__geo_within_polygon=[[40, 5], [40, 6], [41, 6], [40, 5]])
+      loc.objects(point__geo_within_polygon=[ [ <x1> , <y1> ] ,
+                                              [ <x2> , <y2> ] ,
+                                              [ <x3> , <y3> ] ])
+
+* ``geo_within_center`` -- simplified geo_within the flat circle radius of a point eg::
+
+      loc.objects(point__geo_within_center=[(-125.0, 35.0), 1])
+      loc.objects(point__geo_within_center=[ [ <x>, <y> ] , <radius> ])
+
+* ``geo_within_sphere`` -- simplified geo_within the spherical circle radius of a point eg::
+
+      loc.objects(point__geo_within_sphere=[(-125.0, 35.0), 1])
+      loc.objects(point__geo_within_sphere=[ [ <x>, <y> ] , <radius> ])
+
+* ``geo_intersects`` -- selects all locations that intersect with a geometry eg::
+
+      # Inferred from provided points lists:
+      loc.objects(poly__geo_intersects=[40, 6])
+      loc.objects(poly__geo_intersects=[[40, 5], [40, 6]])
+      loc.objects(poly__geo_intersects=[[[40, 5], [40, 6], [41, 6], [41, 5], [40, 5]]])
+
+      # With geoJson style objects
+      loc.objects(poly__geo_intersects={"type": "Point", "coordinates": [40, 6]})
+      loc.objects(poly__geo_intersects={"type": "LineString",
+                                        "coordinates": [[40, 5], [40, 6]]})
+      loc.objects(poly__geo_intersects={"type": "Polygon",
+                                        "coordinates": [[[40, 5], [40, 6], [41, 6], [41, 5], [40, 5]]]})
+
+* ``near`` -- Find all the locations near a given point::
+
+      loc.objects(point__near=[40, 5])
+      loc.objects(point__near={"type": "Point", "coordinates": [40, 5]})
+
+  You can also set the maximum distance in meters as well::
+
+      loc.objects(point__near=[40, 5], point__max_distance=1000)
+
+The older 2D indexes are still supported with the
+:class:`~mongoengine.fields.GeoPointField`:
+
 * ``within_distance`` -- provide a list containing a point and a maximum
   distance (e.g. [(41.342, -87.653), 5])
@@ -91,7 +157,9 @@ may used with :class:`~mongoengine.fields.GeoPointField`\ s:
   [(35.0, -125.0), (40.0, -100.0)])
 * ``within_polygon`` -- filter documents to those within a given polygon (e.g.
   [(41.91,-87.69), (41.92,-87.68), (41.91,-87.65), (41.89,-87.65)]).

   .. note:: Requires Mongo Server 2.0

 * ``max_distance`` -- can be added to your location queries to set a maximum
   distance.

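
As a small end-to-end sketch of the operators documented above (the ``Place`` document, database name and coordinates below are illustrative only, not taken from the commit)::

    from mongoengine import Document, StringField, PointField, connect

    class Place(Document):
        name = StringField()
        point = PointField()    # stored as GeoJSON, indexed with "2dsphere"

    connect('geo_sketch_db')    # assumed local test database
    Place(name="corner", point=[40.5, 5.5]).save()

    # Plain nested coordinate lists are wrapped into a $geometry document.
    ring = [[[40, 5], [40, 6], [41, 6], [41, 5], [40, 5]]]
    inside = Place.objects(point__geo_within=ring)

    # GeoJSON dictionaries work too, and near accepts a max distance in metres.
    nearby = Place.objects(point__near={"type": "Point", "coordinates": [40, 5]},
                           point__max_distance=1000)
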
@@ -56,14 +56,16 @@ See the :doc:`changelog` for a full list of changes to MongoEngine and
 putting updates live in production **;)**

 .. toctree::
+   :maxdepth: 1
+   :numbered:
    :hidden:

    tutorial
    guide/index
    apireference
-   django
    changelog
    upgrade
+   django

 Indices and tables
 ------------------
@@ -662,7 +662,8 @@ class BaseDocument(object):
         if include_cls and direction is not pymongo.GEO2D:
             index_list.insert(0, ('_cls', 1))

-        spec['fields'] = index_list
+        if index_list:
+            spec['fields'] = index_list
         if spec.get('sparse', False) and len(spec['fields']) > 1:
             raise ValueError(
                 'Sparse indexes can only have one field in them. '
@@ -704,13 +705,13 @@ class BaseDocument(object):

                 # Add the new index to the list
                 fields = [("%s%s" % (namespace, f), pymongo.ASCENDING)
                           for f in unique_fields]
                 index = {'fields': fields, 'unique': True, 'sparse': sparse}
                 unique_indexes.append(index)

             # Grab any embedded document field unique indexes
             if (field.__class__.__name__ == "EmbeddedDocumentField" and
                     field.document_type != cls):
                 field_namespace = "%s." % field_name
                 doc_cls = field.document_type
                 unique_indexes += doc_cls._unique_with_indexes(field_namespace)
@@ -718,26 +719,31 @@ class BaseDocument(object):
         return unique_indexes

     @classmethod
-    def _geo_indices(cls, inspected=None):
+    def _geo_indices(cls, inspected=None, parent_field=None):
         inspected = inspected or []
         geo_indices = []
         inspected.append(cls)

-        EmbeddedDocumentField = _import_class("EmbeddedDocumentField")
-        GeoPointField = _import_class("GeoPointField")
+        geo_field_type_names = ["EmbeddedDocumentField", "GeoPointField",
+                                "PointField", "LineStringField", "PolygonField"]
+
+        geo_field_types = tuple([_import_class(field) for field in geo_field_type_names])
+
         for field in cls._fields.values():
-            if not isinstance(field, (EmbeddedDocumentField, GeoPointField)):
+            if not isinstance(field, geo_field_types):
                 continue
             if hasattr(field, 'document_type'):
                 field_cls = field.document_type
                 if field_cls in inspected:
                     continue
                 if hasattr(field_cls, '_geo_indices'):
-                    geo_indices += field_cls._geo_indices(inspected)
+                    geo_indices += field_cls._geo_indices(inspected, parent_field=field.db_field)
             elif field._geo_index:
+                field_name = field.db_field
+                if parent_field:
+                    field_name = "%s.%s" % (parent_field, field_name)
                 geo_indices.append({'fields':
-                                    [(field.db_field, pymongo.GEO2D)]})
+                                    [(field_name, field._geo_index)]})
         return geo_indices

     @classmethod
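
To make the effect of the new ``parent_field`` argument concrete, the commit's tests check behaviour along these lines (a minimal sketch reusing the ``Venue``/``Event`` shape from the test suite)::

    from mongoengine import (Document, EmbeddedDocument, EmbeddedDocumentField,
                             StringField, PointField)

    class Venue(EmbeddedDocument):
        name = StringField()
        point = PointField()

    class Event(Document):
        title = StringField()
        venue = EmbeddedDocumentField(Venue)

    # _geo_indices() recurses into the embedded document and prefixes the
    # embedded field name with the parent's db_field, keeping the field's
    # own index type ("2dsphere" for the GeoJSON fields).
    assert {'fields': [('venue.point', '2dsphere')]} in Event._geo_indices()
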
@@ -2,7 +2,8 @@ import operator
 import warnings
 import weakref

-from bson import DBRef, ObjectId
+from bson import DBRef, ObjectId, SON
+import pymongo

 from mongoengine.common import _import_class
 from mongoengine.errors import ValidationError
@@ -10,7 +11,7 @@ from mongoengine.errors import ValidationError
 from mongoengine.base.common import ALLOW_INHERITANCE
 from mongoengine.base.datastructures import BaseDict, BaseList

-__all__ = ("BaseField", "ComplexBaseField", "ObjectIdField")
+__all__ = ("BaseField", "ComplexBaseField", "ObjectIdField", "GeoJsonBaseField")


 class BaseField(object):
@@ -186,7 +187,7 @@ class ComplexBaseField(BaseField):

         # Convert lists / values so we can watch for any changes on them
         if (isinstance(value, (list, tuple)) and
                 not isinstance(value, BaseList)):
             value = BaseList(value, instance, self.name)
             instance._data[self.name] = value
         elif isinstance(value, dict) and not isinstance(value, BaseDict):
@@ -194,8 +195,8 @@ class ComplexBaseField(BaseField):
             instance._data[self.name] = value

         if (self._auto_dereference and instance._initialised and
                 isinstance(value, (BaseList, BaseDict))
                 and not value._dereferenced):
             value = self._dereference(
                 value, max_depth=1, instance=instance, name=self.name
             )
@@ -231,7 +232,7 @@ class ComplexBaseField(BaseField):

         if self.field:
             value_dict = dict([(key, self.field.to_python(item))
                                for key, item in value.items()])
         else:
             value_dict = {}
             for k, v in value.items():
@@ -282,7 +283,7 @@ class ComplexBaseField(BaseField):

         if self.field:
             value_dict = dict([(key, self.field.to_mongo(item))
                                for key, item in value.iteritems()])
         else:
             value_dict = {}
             for k, v in value.iteritems():
@@ -396,3 +397,100 @@ class ObjectIdField(BaseField):
             ObjectId(unicode(value))
         except:
             self.error('Invalid Object ID')
+
+
+class GeoJsonBaseField(BaseField):
+    """A geo json field storing a geojson style object.
+
+    .. versionadded:: 0.8
+    """
+
+    _geo_index = pymongo.GEOSPHERE
+    _type = "GeoBase"
+
+    def __init__(self, auto_index=True, *args, **kwargs):
+        """
+        :param auto_index: Automatically create a "2dsphere" index. Defaults
+            to `True`.
+        """
+        self._name = "%sField" % self._type
+        if not auto_index:
+            self._geo_index = False
+        super(GeoJsonBaseField, self).__init__(*args, **kwargs)
+
+    def validate(self, value):
+        """Validate the GeoJson object based on its type
+        """
+        if isinstance(value, dict):
+            if set(value.keys()) == set(['type', 'coordinates']):
+                if value['type'] != self._type:
+                    self.error('%s type must be "%s"' % (self._name, self._type))
+                return self.validate(value['coordinates'])
+            else:
+                self.error('%s can only accept a valid GeoJson dictionary'
+                           ' or lists of (x, y)' % self._name)
+                return
+        elif not isinstance(value, (list, tuple)):
+            self.error('%s can only accept lists of [x, y]' % self._name)
+            return
+
+        validate = getattr(self, "_validate_%s" % self._type.lower())
+        error = validate(value)
+        if error:
+            self.error(error)
+
+    def _validate_polygon(self, value):
+        if not isinstance(value, (list, tuple)):
+            return 'Polygons must contain list of linestrings'
+
+        # Quick and dirty validator
+        try:
+            value[0][0][0]
+        except:
+            return "Invalid Polygon must contain at least one valid linestring"
+
+        errors = []
+        for val in value:
+            error = self._validate_linestring(val, False)
+            if not error and val[0] != val[-1]:
+                error = 'LineStrings must start and end at the same point'
+            if error and error not in errors:
+                errors.append(error)
+        if errors:
+            return "Invalid Polygon:\n%s" % ", ".join(set(errors))
+
+    def _validate_linestring(self, value, top_level=True):
+        """Validates a linestring"""
+        if not isinstance(value, (list, tuple)):
+            return 'LineStrings must contain list of coordinate pairs'
+
+        # Quick and dirty validator
+        try:
+            value[0][0]
+        except:
+            return "Invalid LineString must contain at least one valid point"
+
+        errors = []
+        for val in value:
+            error = self._validate_point(val)
+            if error and error not in errors:
+                errors.append(error)
+        if errors:
+            if top_level:
+                return "Invalid LineString:\n%s" % ", ".join(errors)
+            else:
+                return "%s" % ", ".join(set(errors))
+
+    def _validate_point(self, value):
+        """Validate each set of coords"""
+        if not isinstance(value, (list, tuple)):
+            return 'Points must be a list of coordinate pairs'
+        elif not len(value) == 2:
+            return "Value (%s) must be a two-dimensional point" % repr(value)
+        elif (not isinstance(value[0], (float, int)) or
+              not isinstance(value[1], (float, int))):
+            return "Both values (%s) in point must be float or int" % repr(value)
+
+    def to_mongo(self, value):
+        if isinstance(value, dict):
+            return value
+        return SON([("type", self._type), ("coordinates", value)])
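
A short sketch of how the validation above behaves in practice, mirroring assertions from the commit's ``tests/fields/geo.py`` (the ``Location`` document is the same minimal shape the tests use; Python 2 syntax to match the codebase)::

    from mongoengine import *

    class Location(Document):
        loc = PointField()

    # A plain [x, y] list is accepted; to_mongo() later wraps it as GeoJSON.
    Location(loc=[1, 2]).validate()

    # A GeoJSON dict with the wrong "type" is rejected using _name/_type.
    try:
        Location(loc={"type": "MadeUp", "coordinates": []}).validate()
    except ValidationError, e:
        assert e.to_dict()['loc'] == 'PointField type must be "Point"'

    # Coordinate-shape problems are reported by _validate_point().
    try:
        Location(loc=[1, 2, 3]).validate()
    except ValidationError, e:
        assert e.to_dict()['loc'] == "Value ([1, 2, 3]) must be a two-dimensional point"
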
@@ -11,6 +11,7 @@ def _import_class(cls_name):
     field_classes = ('DictField', 'DynamicField', 'EmbeddedDocumentField',
                      'FileField', 'GenericReferenceField',
                      'GenericEmbeddedDocumentField', 'GeoPointField',
+                     'PointField', 'LineStringField', 'PolygonField',
                      'ReferenceField', 'StringField', 'ComplexBaseField')
     queryset_classes = ('OperationError',)
    deref_classes = ('DeReference',)
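
For context, this registry is what the lazy imports elsewhere in the commit resolve against; roughly::

    from mongoengine.common import _import_class

    # Same lookup pattern used by BaseDocument._geo_indices() above.
    PointField = _import_class('PointField')
    LineStringField = _import_class('LineStringField')
    PolygonField = _import_class('PolygonField')
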
@@ -523,7 +523,6 @@ class Document(BaseDocument):
         # an extra index on _cls, as mongodb will use the existing
         # index to service queries against _cls
         cls_indexed = False
-
         def includes_cls(fields):
             first_field = None
             if len(fields):
@@ -15,7 +15,7 @@ from bson import Binary, DBRef, SON, ObjectId
 from mongoengine.errors import ValidationError
 from mongoengine.python_support import (PY3, bin_type, txt_type,
                                         str_types, StringIO)
-from base import (BaseField, ComplexBaseField, ObjectIdField,
+from base import (BaseField, ComplexBaseField, ObjectIdField, GeoJsonBaseField,
                   get_document, BaseDocument)
 from queryset import DO_NOTHING, QuerySet
 from document import Document, EmbeddedDocument
@@ -34,8 +34,8 @@ __all__ = ['StringField', 'URLField', 'EmailField', 'IntField', 'LongField',
            'SortedListField', 'DictField', 'MapField', 'ReferenceField',
            'GenericReferenceField', 'BinaryField', 'GridFSError',
            'GridFSProxy', 'FileField', 'ImageGridFsProxy',
-           'ImproperlyConfigured', 'ImageField', 'GeoPointField',
-           'SequenceField', 'UUIDField']
+           'ImproperlyConfigured', 'ImageField', 'GeoPointField', 'PointField',
+           'LineStringField', 'PolygonField', 'SequenceField', 'UUIDField']


 RECURSIVE_REFERENCE_CONSTANT = 'self'
@@ -1386,28 +1386,6 @@ class ImageField(FileField):
                          **kwargs)


-class GeoPointField(BaseField):
-    """A list storing a latitude and longitude.
-
-    .. versionadded:: 0.4
-    """
-
-    _geo_index = pymongo.GEO2D
-
-    def validate(self, value):
-        """Make sure that a geo-value is of type (x, y)
-        """
-        if not isinstance(value, (list, tuple)):
-            self.error('GeoPointField can only accept tuples or lists '
-                       'of (x, y)')
-
-        if not len(value) == 2:
-            self.error('Value must be a two-dimensional point')
-        if (not isinstance(value[0], (float, int)) and
-                not isinstance(value[1], (float, int))):
-            self.error('Both values in point must be float or int')
-
-
 class SequenceField(BaseField):
     """Provides a sequental counter see:
     http://www.mongodb.org/display/DOCS/Object+IDs#ObjectIDs-SequenceNumbers
@@ -1548,3 +1526,83 @@ class UUIDField(BaseField):
             value = uuid.UUID(value)
         except Exception, exc:
             self.error('Could not convert to UUID: %s' % exc)
+
+
+class GeoPointField(BaseField):
+    """A list storing a latitude and longitude.
+
+    .. versionadded:: 0.4
+    """
+
+    _geo_index = pymongo.GEO2D
+
+    def validate(self, value):
+        """Make sure that a geo-value is of type (x, y)
+        """
+        if not isinstance(value, (list, tuple)):
+            self.error('GeoPointField can only accept tuples or lists '
+                       'of (x, y)')
+
+        if not len(value) == 2:
+            self.error("Value (%s) must be a two-dimensional point" % repr(value))
+        elif (not isinstance(value[0], (float, int)) or
+              not isinstance(value[1], (float, int))):
+            self.error("Both values (%s) in point must be float or int" % repr(value))
+
+
+class PointField(GeoJsonBaseField):
+    """A geo json field storing a latitude and longitude.
+
+    The data is represented as:
+
+    .. code-block:: js
+
+        { "type" : "Point" ,
+          "coordinates" : [x, y]}
+
+    You can either pass a dict with the full information or a list
+    to set the value.
+
+    Requires mongodb >= 2.4
+    .. versionadded:: 0.8
+    """
+    _type = "Point"
+
+
+class LineStringField(GeoJsonBaseField):
+    """A geo json field storing a line of latitude and longitude coordinates.
+
+    The data is represented as:
+
+    .. code-block:: js
+
+        { "type" : "LineString" ,
+          "coordinates" : [[x1, y1], [x1, y1] ... [xn, yn]]}
+
+    You can either pass a dict with the full information or a list of points.
+
+    Requires mongodb >= 2.4
+    .. versionadded:: 0.8
+    """
+    _type = "LineString"
+
+
+class PolygonField(GeoJsonBaseField):
+    """A geo json field storing a polygon of latitude and longitude coordinates.
+
+    The data is represented as:
+
+    .. code-block:: js
+
+        { "type" : "Polygon" ,
+          "coordinates" : [[[x1, y1], [x1, y1] ... [xn, yn]],
+                           [[x1, y1], [x1, y1] ... [xn, yn]]}
+
+    You can either pass a dict with the full information or a list
+    of LineStrings. The first LineString being the outside and the rest being
+    holes.
+
+    Requires mongodb >= 2.4
+    .. versionadded:: 0.8
+    """
+    _type = "Polygon"
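
A quick sketch of what these field classes hand to PyMongo via ``GeoJsonBaseField.to_mongo()`` (standalone field instances, outside any document, purely for illustration)::

    from bson import SON
    from mongoengine.fields import PointField

    point = PointField()

    # A bare coordinate list is wrapped into a GeoJSON SON document...
    assert point.to_mongo([40, 5]) == SON([("type", "Point"),
                                           ("coordinates", [40, 5])])

    # ...while a dict that is already GeoJSON-shaped passes through unchanged.
    geojson = {"type": "Point", "coordinates": [40, 5]}
    assert point.to_mongo(geojson) == geojson
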
@@ -1422,15 +1422,14 @@ class QuerySet(object):

         code = re.sub(u'\[\s*~([A-z_][A-z_0-9.]+?)\s*\]', field_sub, code)
         code = re.sub(u'\{\{\s*~([A-z_][A-z_0-9.]+?)\s*\}\}', field_path_sub,
                       code)
         return code

     # Deprecated
-
     def ensure_index(self, **kwargs):
         """Deprecated use :func:`~Document.ensure_index`"""
         msg = ("Doc.objects()._ensure_index() is deprecated. "
                "Use Doc.ensure_index() instead.")
         warnings.warn(msg, DeprecationWarning)
         self._document.__class__.ensure_index(**kwargs)
         return self
@@ -1438,6 +1437,6 @@ class QuerySet(object):
     def _ensure_indexes(self):
         """Deprecated use :func:`~Document.ensure_indexes`"""
         msg = ("Doc.objects()._ensure_indexes() is deprecated. "
                "Use Doc.ensure_indexes() instead.")
         warnings.warn(msg, DeprecationWarning)
         self._document.__class__.ensure_indexes()
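
In other words, the queryset-level helpers now only warn and delegate; the supported spelling is the ``Document`` classmethod. A tiny hedged example (``BlogPost`` and the database name are assumptions, not from the commit)::

    from mongoengine import Document, StringField, connect

    connect('index_sketch_db')           # assumed local test database

    class BlogPost(Document):            # hypothetical example document
        title = StringField()

    # Preferred: create all indexes declared on the document.
    BlogPost.ensure_indexes()

    # Still works, but emits a DeprecationWarning and forwards to the above.
    BlogPost.objects._ensure_indexes()
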
@@ -1,5 +1,6 @@
 from collections import defaultdict

+import pymongo
 from bson import SON

 from mongoengine.common import _import_class
@@ -12,7 +13,9 @@ COMPARISON_OPERATORS = ('ne', 'gt', 'gte', 'lt', 'lte', 'in', 'nin', 'mod',
                         'all', 'size', 'exists', 'not')
 GEO_OPERATORS = ('within_distance', 'within_spherical_distance',
                  'within_box', 'within_polygon', 'near', 'near_sphere',
-                 'max_distance')
+                 'max_distance', 'geo_within', 'geo_within_box',
+                 'geo_within_polygon', 'geo_within_center',
+                 'geo_within_sphere', 'geo_intersects')
 STRING_OPERATORS = ('contains', 'icontains', 'startswith',
                     'istartswith', 'endswith', 'iendswith',
                     'exact', 'iexact')
@@ -81,30 +84,14 @@ def query(_doc_cls=None, _field_operation=False, **query):
                 value = field
             else:
                 value = field.prepare_query_value(op, value)
-        elif op in ('in', 'nin', 'all', 'near'):
+        elif op in ('in', 'nin', 'all', 'near') and not isinstance(value, dict):
             # 'in', 'nin' and 'all' require a list of values
             value = [field.prepare_query_value(op, v) for v in value]

         # if op and op not in COMPARISON_OPERATORS:
         if op:
             if op in GEO_OPERATORS:
-                if op == "within_distance":
-                    value = {'$within': {'$center': value}}
-                elif op == "within_spherical_distance":
-                    value = {'$within': {'$centerSphere': value}}
-                elif op == "within_polygon":
-                    value = {'$within': {'$polygon': value}}
-                elif op == "near":
-                    value = {'$near': value}
-                elif op == "near_sphere":
-                    value = {'$nearSphere': value}
-                elif op == 'within_box':
-                    value = {'$within': {'$box': value}}
-                elif op == "max_distance":
-                    value = {'$maxDistance': value}
-                else:
-                    raise NotImplementedError("Geo method '%s' has not "
-                                              "been implemented" % op)
+                value = _geo_operator(field, op, value)
             elif op in CUSTOM_OPERATORS:
                 if op == 'match':
                     value = {"$elemMatch": value}
@@ -250,3 +237,76 @@ def update(_doc_cls=None, **update):
             mongo_update[key].update(value)

     return mongo_update
+
+
+def _geo_operator(field, op, value):
+    """Helper to return the query for a given geo query"""
+    if field._geo_index == pymongo.GEO2D:
+        if op == "within_distance":
+            value = {'$within': {'$center': value}}
+        elif op == "within_spherical_distance":
+            value = {'$within': {'$centerSphere': value}}
+        elif op == "within_polygon":
+            value = {'$within': {'$polygon': value}}
+        elif op == "near":
+            value = {'$near': value}
+        elif op == "near_sphere":
+            value = {'$nearSphere': value}
+        elif op == 'within_box':
+            value = {'$within': {'$box': value}}
+        elif op == "max_distance":
+            value = {'$maxDistance': value}
+        else:
+            raise NotImplementedError("Geo method '%s' has not "
+                                      "been implemented for a GeoPointField" % op)
+    else:
+        if op == "geo_within":
+            value = {"$geoWithin": _infer_geometry(value)}
+        elif op == "geo_within_box":
+            value = {"$geoWithin": {"$box": value}}
+        elif op == "geo_within_polygon":
+            value = {"$geoWithin": {"$polygon": value}}
+        elif op == "geo_within_center":
+            value = {"$geoWithin": {"$center": value}}
+        elif op == "geo_within_sphere":
+            value = {"$geoWithin": {"$centerSphere": value}}
+        elif op == "geo_intersects":
+            value = {"$geoIntersects": _infer_geometry(value)}
+        elif op == "near":
+            value = {'$near': _infer_geometry(value)}
+        elif op == "max_distance":
+            value = {'$maxDistance': value}
+        else:
+            raise NotImplementedError("Geo method '%s' has not "
+                                      "been implemented for a %s " % (op, field._name))
+    return value
+
+
+def _infer_geometry(value):
+    """Helper method that tries to infer the $geometry shape for a given value"""
+    if isinstance(value, dict):
+        if "$geometry" in value:
+            return value
+        elif 'coordinates' in value and 'type' in value:
+            return {"$geometry": value}
+        raise InvalidQueryError("Invalid $geometry dictionary should have "
+                                "type and coordinates keys")
+    elif isinstance(value, (list, set)):
+        try:
+            value[0][0][0]
+            return {"$geometry": {"type": "Polygon", "coordinates": value}}
+        except:
+            pass
+        try:
+            value[0][0]
+            return {"$geometry": {"type": "LineString", "coordinates": value}}
+        except:
+            pass
+        try:
+            value[0]
+            return {"$geometry": {"type": "Point", "coordinates": value}}
+        except:
+            pass
+
+    raise InvalidQueryError("Invalid $geometry data. Can be either a dictionary "
+                            "or (nested) lists of coordinate(s)")
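
To see the new helpers end to end, here is a rough sketch of what ``transform.query()`` now produces for a 2dsphere-backed field (the ``Place`` document is hypothetical, and the expected dictionary is inferred from ``_geo_operator``/``_infer_geometry`` above rather than quoted from the commit)::

    from mongoengine import Document, PointField
    from mongoengine.queryset import transform

    class Place(Document):
        point = PointField()

    ring = [[[40, 5], [40, 6], [41, 6], [40, 5]]]
    mongo_query = transform.query(Place, point__geo_within=ring)

    # Expected shape: the nested list is recognised as a Polygon, wrapped in
    # $geometry, then handed to $geoWithin:
    # {'point': {'$geoWithin': {'$geometry': {'type': 'Polygon',
    #                                         'coordinates': ring}}}}
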
@@ -381,8 +381,7 @@ class IndexesTest(unittest.TestCase):
         self.assertEqual(sorted(info.keys()), ['_id_', 'tags.tag_1'])

         post1 = BlogPost(title="Embedded Indexes tests in place",
-                         tags=[Tag(name="about"), Tag(name="time")]
-                         )
+                         tags=[Tag(name="about"), Tag(name="time")])
         post1.save()
         BlogPost.drop_collection()

@@ -399,29 +398,6 @@ class IndexesTest(unittest.TestCase):
         info = RecursiveDocument._get_collection().index_information()
         self.assertEqual(sorted(info.keys()), ['_cls_1', '_id_'])

-    def test_geo_indexes_recursion(self):
-
-        class Location(Document):
-            name = StringField()
-            location = GeoPointField()
-
-        class Parent(Document):
-            name = StringField()
-            location = ReferenceField(Location, dbref=False)
-
-        Location.drop_collection()
-        Parent.drop_collection()
-
-        list(Parent.objects)
-
-        collection = Parent._get_collection()
-        info = collection.index_information()
-
-        self.assertFalse('location_2d' in info)
-
-        self.assertEqual(len(Parent._geo_indices()), 0)
-        self.assertEqual(len(Location._geo_indices()), 1)
-
     def test_covered_index(self):
         """Ensure that covered indexes can be used
         """
@@ -432,7 +408,7 @@ class IndexesTest(unittest.TestCase):
         meta = {
             'indexes': ['a'],
             'allow_inheritance': False
         }

         Test.drop_collection()

@@ -1,2 +1,3 @@
 from fields import *
 from file_tests import *
+from geo import *
@@ -1862,45 +1862,6 @@ class FieldTest(unittest.TestCase):

         Shirt.drop_collection()

-    def test_geo_indexes(self):
-        """Ensure that indexes are created automatically for GeoPointFields.
-        """
-        class Event(Document):
-            title = StringField()
-            location = GeoPointField()
-
-        Event.drop_collection()
-        event = Event(title="Coltrane Motion @ Double Door",
-                      location=[41.909889, -87.677137])
-        event.save()
-
-        info = Event.objects._collection.index_information()
-        self.assertTrue(u'location_2d' in info)
-        self.assertTrue(info[u'location_2d']['key'] == [(u'location', u'2d')])
-
-        Event.drop_collection()
-
-    def test_geo_embedded_indexes(self):
-        """Ensure that indexes are created automatically for GeoPointFields on
-        embedded documents.
-        """
-        class Venue(EmbeddedDocument):
-            location = GeoPointField()
-            name = StringField()
-
-        class Event(Document):
-            title = StringField()
-            venue = EmbeddedDocumentField(Venue)
-
-        Event.drop_collection()
-        venue = Venue(name="Double Door", location=[41.909889, -87.677137])
-        event = Event(title="Coltrane Motion", venue=venue)
-        event.save()
-
-        info = Event.objects._collection.index_information()
-        self.assertTrue(u'location_2d' in info)
-        self.assertTrue(info[u'location_2d']['key'] == [(u'location', u'2d')])
-
     def test_ensure_unique_default_instances(self):
         """Ensure that every field has it's own unique default instance."""
         class D(Document):

new file: tests/fields/geo.py (274 lines)
@@ -0,0 +1,274 @@
|
# -*- coding: utf-8 -*-
|
||||||
|
import sys
|
||||||
|
sys.path[0:0] = [""]
|
||||||
|
|
||||||
|
import unittest
|
||||||
|
|
||||||
|
from mongoengine import *
|
||||||
|
from mongoengine.connection import get_db
|
||||||
|
|
||||||
|
__all__ = ("GeoFieldTest", )
|
||||||
|
|
||||||
|
|
||||||
|
class GeoFieldTest(unittest.TestCase):
|
||||||
|
|
||||||
|
def setUp(self):
|
||||||
|
connect(db='mongoenginetest')
|
||||||
|
self.db = get_db()
|
||||||
|
|
||||||
|
def _test_for_expected_error(self, Cls, loc, expected):
|
||||||
|
try:
|
||||||
|
Cls(loc=loc).validate()
|
||||||
|
self.fail()
|
||||||
|
except ValidationError, e:
|
||||||
|
self.assertEqual(expected, e.to_dict()['loc'])
|
||||||
|
|
||||||
|
def test_geopoint_validation(self):
|
||||||
|
class Location(Document):
|
||||||
|
loc = GeoPointField()
|
||||||
|
|
||||||
|
invalid_coords = [{"x": 1, "y": 2}, 5, "a"]
|
||||||
|
expected = 'GeoPointField can only accept tuples or lists of (x, y)'
|
||||||
|
|
||||||
|
for coord in invalid_coords:
|
||||||
|
self._test_for_expected_error(Location, coord, expected)
|
||||||
|
|
||||||
|
invalid_coords = [[], [1], [1, 2, 3]]
|
||||||
|
for coord in invalid_coords:
|
||||||
|
expected = "Value (%s) must be a two-dimensional point" % repr(coord)
|
||||||
|
self._test_for_expected_error(Location, coord, expected)
|
||||||
|
|
||||||
|
invalid_coords = [[{}, {}], ("a", "b")]
|
||||||
|
for coord in invalid_coords:
|
||||||
|
expected = "Both values (%s) in point must be float or int" % repr(coord)
|
||||||
|
self._test_for_expected_error(Location, coord, expected)
|
||||||
|
|
||||||
|
def test_point_validation(self):
|
||||||
|
class Location(Document):
|
||||||
|
loc = PointField()
|
||||||
|
|
||||||
|
invalid_coords = {"x": 1, "y": 2}
|
||||||
|
expected = 'PointField can only accept a valid GeoJson dictionary or lists of (x, y)'
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = {"type": "MadeUp", "coordinates": []}
|
||||||
|
expected = 'PointField type must be "Point"'
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = {"type": "Point", "coordinates": [1, 2, 3]}
|
||||||
|
expected = "Value ([1, 2, 3]) must be a two-dimensional point"
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = [5, "a"]
|
||||||
|
expected = "PointField can only accept lists of [x, y]"
|
||||||
|
for coord in invalid_coords:
|
||||||
|
self._test_for_expected_error(Location, coord, expected)
|
||||||
|
|
||||||
|
invalid_coords = [[], [1], [1, 2, 3]]
|
||||||
|
for coord in invalid_coords:
|
||||||
|
expected = "Value (%s) must be a two-dimensional point" % repr(coord)
|
||||||
|
self._test_for_expected_error(Location, coord, expected)
|
||||||
|
|
||||||
|
invalid_coords = [[{}, {}], ("a", "b")]
|
||||||
|
for coord in invalid_coords:
|
||||||
|
expected = "Both values (%s) in point must be float or int" % repr(coord)
|
||||||
|
self._test_for_expected_error(Location, coord, expected)
|
||||||
|
|
||||||
|
Location(loc=[1, 2]).validate()
|
||||||
|
|
||||||
|
def test_linestring_validation(self):
|
||||||
|
class Location(Document):
|
||||||
|
loc = LineStringField()
|
||||||
|
|
||||||
|
invalid_coords = {"x": 1, "y": 2}
|
||||||
|
expected = 'LineStringField can only accept a valid GeoJson dictionary or lists of (x, y)'
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = {"type": "MadeUp", "coordinates": [[]]}
|
||||||
|
expected = 'LineStringField type must be "LineString"'
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = {"type": "LineString", "coordinates": [[1, 2, 3]]}
|
||||||
|
expected = "Invalid LineString:\nValue ([1, 2, 3]) must be a two-dimensional point"
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = [5, "a"]
|
||||||
|
expected = "Invalid LineString must contain at least one valid point"
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = [[1]]
|
||||||
|
expected = "Invalid LineString:\nValue (%s) must be a two-dimensional point" % repr(invalid_coords[0])
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = [[1, 2, 3]]
|
||||||
|
expected = "Invalid LineString:\nValue (%s) must be a two-dimensional point" % repr(invalid_coords[0])
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = [[[{}, {}]], [("a", "b")]]
|
||||||
|
for coord in invalid_coords:
|
||||||
|
expected = "Invalid LineString:\nBoth values (%s) in point must be float or int" % repr(coord[0])
|
||||||
|
self._test_for_expected_error(Location, coord, expected)
|
||||||
|
|
||||||
|
Location(loc=[[1, 2], [3, 4], [5, 6], [1,2]]).validate()
|
||||||
|
|
||||||
|
def test_polygon_validation(self):
|
||||||
|
class Location(Document):
|
||||||
|
loc = PolygonField()
|
||||||
|
|
||||||
|
invalid_coords = {"x": 1, "y": 2}
|
||||||
|
expected = 'PolygonField can only accept a valid GeoJson dictionary or lists of (x, y)'
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = {"type": "MadeUp", "coordinates": [[]]}
|
||||||
|
expected = 'PolygonField type must be "Polygon"'
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = {"type": "Polygon", "coordinates": [[[1, 2, 3]]]}
|
||||||
|
expected = "Invalid Polygon:\nValue ([1, 2, 3]) must be a two-dimensional point"
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = [[[5, "a"]]]
|
||||||
|
expected = "Invalid Polygon:\nBoth values ([5, 'a']) in point must be float or int"
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = [[[]]]
|
||||||
|
expected = "Invalid Polygon must contain at least one valid linestring"
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = [[[1, 2, 3]]]
|
||||||
|
expected = "Invalid Polygon:\nValue ([1, 2, 3]) must be a two-dimensional point"
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = [[[{}, {}]], [("a", "b")]]
|
||||||
|
expected = "Invalid Polygon:\nBoth values ([{}, {}]) in point must be float or int, Both values (('a', 'b')) in point must be float or int"
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
invalid_coords = [[[1, 2], [3, 4]]]
|
||||||
|
expected = "Invalid Polygon:\nLineStrings must start and end at the same point"
|
||||||
|
self._test_for_expected_error(Location, invalid_coords, expected)
|
||||||
|
|
||||||
|
Location(loc=[[[1, 2], [3, 4], [5, 6], [1, 2]]]).validate()
|
||||||
|
|
||||||
|
def test_indexes_geopoint(self):
|
||||||
|
"""Ensure that indexes are created automatically for GeoPointFields.
|
||||||
|
"""
|
||||||
|
class Event(Document):
|
||||||
|
title = StringField()
|
||||||
|
location = GeoPointField()
|
||||||
|
|
||||||
|
geo_indicies = Event._geo_indices()
|
||||||
|
self.assertEqual(geo_indicies, [{'fields': [('location', '2d')]}])
|
||||||
|
|
||||||
|
def test_geopoint_embedded_indexes(self):
|
||||||
|
"""Ensure that indexes are created automatically for GeoPointFields on
|
||||||
|
embedded documents.
|
||||||
|
"""
|
||||||
|
class Venue(EmbeddedDocument):
|
||||||
|
location = GeoPointField()
|
||||||
|
name = StringField()
|
||||||
|
|
||||||
|
class Event(Document):
|
||||||
|
title = StringField()
|
||||||
|
venue = EmbeddedDocumentField(Venue)
|
||||||
|
|
||||||
|
geo_indicies = Event._geo_indices()
|
||||||
|
self.assertEqual(geo_indicies, [{'fields': [('venue.location', '2d')]}])
|
||||||
|
|
||||||
|
def test_indexes_2dsphere(self):
|
||||||
|
"""Ensure that indexes are created automatically for GeoPointFields.
|
||||||
|
"""
|
||||||
|
class Event(Document):
|
||||||
|
title = StringField()
|
||||||
|
point = PointField()
|
||||||
|
line = LineStringField()
|
||||||
|
polygon = PolygonField()
|
||||||
|
|
||||||
|
geo_indicies = Event._geo_indices()
|
||||||
|
self.assertEqual(geo_indicies, [{'fields': [('line', '2dsphere')]},
|
||||||
|
{'fields': [('polygon', '2dsphere')]},
|
||||||
|
{'fields': [('point', '2dsphere')]}])
|
||||||
|
|
||||||
|
def test_indexes_2dsphere_embedded(self):
|
||||||
|
"""Ensure that indexes are created automatically for GeoPointFields.
|
||||||
|
"""
|
||||||
|
class Venue(EmbeddedDocument):
|
||||||
|
name = StringField()
|
||||||
|
point = PointField()
|
||||||
|
line = LineStringField()
|
||||||
|
polygon = PolygonField()
|
||||||
|
|
||||||
|
class Event(Document):
|
||||||
|
title = StringField()
|
||||||
|
venue = EmbeddedDocumentField(Venue)
|
||||||
|
|
||||||
|
geo_indicies = Event._geo_indices()
|
||||||
|
self.assertTrue({'fields': [('venue.line', '2dsphere')]} in geo_indicies)
|
||||||
|
self.assertTrue({'fields': [('venue.polygon', '2dsphere')]} in geo_indicies)
|
||||||
|
self.assertTrue({'fields': [('venue.point', '2dsphere')]} in geo_indicies)
|
||||||
|
|
||||||
|
def test_geo_indexes_recursion(self):
|
||||||
|
|
||||||
|
class Location(Document):
|
||||||
|
name = StringField()
|
||||||
|
location = GeoPointField()
|
||||||
|
|
||||||
|
class Parent(Document):
|
||||||
|
name = StringField()
|
||||||
|
location = ReferenceField(Location)
|
||||||
|
|
||||||
|
Location.drop_collection()
|
||||||
|
Parent.drop_collection()
|
||||||
|
|
||||||
|
list(Parent.objects)
|
||||||
|
|
||||||
|
collection = Parent._get_collection()
|
||||||
|
info = collection.index_information()
|
||||||
|
|
||||||
|
self.assertFalse('location_2d' in info)
|
||||||
|
|
||||||
|
self.assertEqual(len(Parent._geo_indices()), 0)
|
||||||
|
self.assertEqual(len(Location._geo_indices()), 1)
|
||||||
|
|
||||||
|
def test_geo_indexes_auto_index(self):
|
||||||
|
|
||||||
|
# Test just listing the fields
|
||||||
|
class Log(Document):
|
||||||
|
location = PointField(auto_index=False)
|
||||||
|
datetime = DateTimeField()
|
||||||
|
|
||||||
|
meta = {
|
||||||
|
'indexes': [[("location", "2dsphere"), ("datetime", 1)]]
|
||||||
|
}
|
||||||
|
|
||||||
|
self.assertEqual([], Log._geo_indices())
|
||||||
|
|
||||||
|
Log.drop_collection()
|
||||||
|
Log.ensure_indexes()
|
||||||
|
|
||||||
|
info = Log._get_collection().index_information()
|
||||||
|
self.assertEqual(info["location_2dsphere_datetime_1"]["key"],
|
||||||
|
[('location', '2dsphere'), ('datetime', 1)])
|
||||||
|
|
||||||
|
# Test listing explicitly
|
||||||
|
class Log(Document):
|
||||||
|
location = PointField(auto_index=False)
|
||||||
|
datetime = DateTimeField()
|
||||||
|
|
||||||
|
meta = {
|
||||||
|
'indexes': [
|
||||||
|
{'fields': [("location", "2dsphere"), ("datetime", 1)]}
|
||||||
|
]
|
||||||
|
}
|
||||||
|
|
||||||
|
self.assertEqual([], Log._geo_indices())
|
||||||
|
|
||||||
|
Log.drop_collection()
|
||||||
|
Log.ensure_indexes()
|
||||||
|
|
||||||
|
info = Log._get_collection().index_information()
|
||||||
|
self.assertEqual(info["location_2dsphere_datetime_1"]["key"],
|
||||||
|
[('location', '2dsphere'), ('datetime', 1)])
|
||||||
|
|
||||||
|
|
||||||
|
if __name__ == '__main__':
|
||||||
|
unittest.main()
|
@@ -1,5 +1,5 @@
-
 from transform import *
 from field_list import *
 from queryset import *
 from visitor import *
+from geo import *

new file: tests/queryset/geo.py (418 lines)
@@ -0,0 +1,418 @@
|||||||
|
import sys
|
||||||
|
sys.path[0:0] = [""]
|
||||||
|
|
||||||
|
import unittest
|
||||||
|
from datetime import datetime, timedelta
|
||||||
|
from mongoengine import *
|
||||||
|
|
||||||
|
__all__ = ("GeoQueriesTest",)
|
||||||
|
|
||||||
|
|
||||||
|
class GeoQueriesTest(unittest.TestCase):
|
||||||
|
|
||||||
|
def setUp(self):
|
||||||
|
connect(db='mongoenginetest')
|
||||||
|
|
||||||
|
    def test_geospatial_operators(self):
        """Ensure that geospatial queries are working.
        """
        class Event(Document):
            title = StringField()
            date = DateTimeField()
            location = GeoPointField()

            def __unicode__(self):
                return self.title

        Event.drop_collection()

        event1 = Event(title="Coltrane Motion @ Double Door",
                       date=datetime.now() - timedelta(days=1),
                       location=[-87.677137, 41.909889]).save()
        event2 = Event(title="Coltrane Motion @ Bottom of the Hill",
                       date=datetime.now() - timedelta(days=10),
                       location=[-122.4194155, 37.7749295]).save()
        event3 = Event(title="Coltrane Motion @ Empty Bottle",
                       date=datetime.now(),
                       location=[-87.686638, 41.900474]).save()

        # find all events "near" pitchfork office, chicago.
        # note that "near" will show the san francisco event, too,
        # although it sorts to last.
        events = Event.objects(location__near=[-87.67892, 41.9120459])
        self.assertEqual(events.count(), 3)
        self.assertEqual(list(events), [event1, event3, event2])

        # find events within 5 degrees of pitchfork office, chicago
        point_and_distance = [[-87.67892, 41.9120459], 5]
        events = Event.objects(location__within_distance=point_and_distance)
        self.assertEqual(events.count(), 2)
        events = list(events)
        self.assertTrue(event2 not in events)
        self.assertTrue(event1 in events)
        self.assertTrue(event3 in events)

        # ensure ordering is respected by "near"
        events = Event.objects(location__near=[-87.67892, 41.9120459])
        events = events.order_by("-date")
        self.assertEqual(events.count(), 3)
        self.assertEqual(list(events), [event3, event1, event2])

        # find events within 10 degrees of san francisco
        point = [-122.415579, 37.7566023]
        events = Event.objects(location__near=point, location__max_distance=10)
        self.assertEqual(events.count(), 1)
        self.assertEqual(events[0], event2)

        # find events within 10 degrees of san francisco
        point_and_distance = [[-122.415579, 37.7566023], 10]
        events = Event.objects(location__within_distance=point_and_distance)
        self.assertEqual(events.count(), 1)
        self.assertEqual(events[0], event2)

        # find events within 1 degree of greenpoint, brooklyn, nyc, ny
        point_and_distance = [[-73.9509714, 40.7237134], 1]
        events = Event.objects(location__within_distance=point_and_distance)
        self.assertEqual(events.count(), 0)

        # ensure ordering is respected by "within_distance"
        point_and_distance = [[-87.67892, 41.9120459], 10]
        events = Event.objects(location__within_distance=point_and_distance)
        events = events.order_by("-date")
        self.assertEqual(events.count(), 2)
        self.assertEqual(events[0], event3)

        # check that within_box works
        box = [(-125.0, 35.0), (-100.0, 40.0)]
        events = Event.objects(location__within_box=box)
        self.assertEqual(events.count(), 1)
        self.assertEqual(events[0].id, event2.id)

        polygon = [
            (-87.694445, 41.912114),
            (-87.69084, 41.919395),
            (-87.681742, 41.927186),
            (-87.654276, 41.911731),
            (-87.656164, 41.898061),
        ]
        events = Event.objects(location__within_polygon=polygon)
        self.assertEqual(events.count(), 1)
        self.assertEqual(events[0].id, event1.id)

        polygon2 = [
            (-1.742249, 54.033586),
            (-1.225891, 52.792797),
            (-4.40094, 53.389881)
        ]
        events = Event.objects(location__within_polygon=polygon2)
        self.assertEqual(events.count(), 0)
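    # Note: GeoPointField is backed by the legacy "2d" index, so the radii passed
    # to "within_distance" and "max_distance" above are in degrees rather than
    # metres. As a rough sketch, the 5-degree lookup is expected to translate to
    # a raw query of the form
    # {"location": {"$within": {"$center": [[lng, lat], 5]}}}.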
    def test_geo_spatial_embedded(self):

        class Venue(EmbeddedDocument):
            location = GeoPointField()
            name = StringField()

        class Event(Document):
            title = StringField()
            venue = EmbeddedDocumentField(Venue)

        Event.drop_collection()

        venue1 = Venue(name="The Rock", location=[-87.677137, 41.909889])
        venue2 = Venue(name="The Bridge", location=[-122.4194155, 37.7749295])

        event1 = Event(title="Coltrane Motion @ Double Door",
                       venue=venue1).save()
        event2 = Event(title="Coltrane Motion @ Bottom of the Hill",
                       venue=venue2).save()
        event3 = Event(title="Coltrane Motion @ Empty Bottle",
                       venue=venue1).save()

        # find all events "near" pitchfork office, chicago.
        # note that "near" will show the san francisco event, too,
        # although it sorts to last.
        events = Event.objects(venue__location__near=[-87.67892, 41.9120459])
        self.assertEqual(events.count(), 3)
        self.assertEqual(list(events), [event1, event3, event2])
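    # Note: geo fields declared on an EmbeddedDocument are queried through the
    # usual double-underscore path (venue__location__near above); the geo index
    # is presumably built on the dotted path "venue.location" of the owning
    # Document's collection.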
    def test_spherical_geospatial_operators(self):
        """Ensure that spherical geospatial queries are working
        """
        class Point(Document):
            location = GeoPointField()

        Point.drop_collection()

        # These points are one degree apart, which (according to Google Maps)
        # is about 110 km apart at this place on the Earth.
        north_point = Point(location=[-122, 38]).save()  # Near Concord, CA
        south_point = Point(location=[-122, 37]).save()  # Near Santa Cruz, CA

        earth_radius = 6378.009  # in km (needs to be a float for dividing by)

        # Finds both points because they are within 60 km of the reference
        # point equidistant between them.
        points = Point.objects(location__near_sphere=[-122, 37.5])
        self.assertEqual(points.count(), 2)

        # Same behavior for _within_spherical_distance
        points = Point.objects(
            location__within_spherical_distance=[[-122, 37.5], 60/earth_radius]
        )
        self.assertEqual(points.count(), 2)

        points = Point.objects(location__near_sphere=[-122, 37.5],
                               location__max_distance=60 / earth_radius)
        self.assertEqual(points.count(), 2)

        # Finds both points, but orders the north point first because it's
        # closer to the reference point to the north.
        points = Point.objects(location__near_sphere=[-122, 38.5])
        self.assertEqual(points.count(), 2)
        self.assertEqual(points[0].id, north_point.id)
        self.assertEqual(points[1].id, south_point.id)

        # Finds both points, but orders the south point first because it's
        # closer to the reference point to the south.
        points = Point.objects(location__near_sphere=[-122, 36.5])
        self.assertEqual(points.count(), 2)
        self.assertEqual(points[0].id, south_point.id)
        self.assertEqual(points[1].id, north_point.id)

        # Finds only one point because only the first point is within 60km of
        # the reference point to the south.
        points = Point.objects(
            location__within_spherical_distance=[[-122, 36.5], 60/earth_radius])
        self.assertEqual(points.count(), 1)
        self.assertEqual(points[0].id, south_point.id)
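    # Note: the spherical operators take their distance in radians, which is why
    # the values above are divided by the Earth's radius:
    # 60 km / 6378.009 km is roughly 0.0094 radians.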
    def test_2dsphere_point(self):

        class Event(Document):
            title = StringField()
            date = DateTimeField()
            location = PointField()

            def __unicode__(self):
                return self.title

        Event.drop_collection()

        event1 = Event(title="Coltrane Motion @ Double Door",
                       date=datetime.now() - timedelta(days=1),
                       location=[-87.677137, 41.909889])
        event1.save()
        event2 = Event(title="Coltrane Motion @ Bottom of the Hill",
                       date=datetime.now() - timedelta(days=10),
                       location=[-122.4194155, 37.7749295]).save()
        event3 = Event(title="Coltrane Motion @ Empty Bottle",
                       date=datetime.now(),
                       location=[-87.686638, 41.900474]).save()

        # find all events "near" pitchfork office, chicago.
        # note that "near" will show the san francisco event, too,
        # although it sorts to last.
        events = Event.objects(location__near=[-87.67892, 41.9120459])
        self.assertEqual(events.count(), 3)
        self.assertEqual(list(events), [event1, event3, event2])

        # find events within 2 degrees of pitchfork office, chicago
        point_and_distance = [[-87.67892, 41.9120459], 2]
        events = Event.objects(location__geo_within_center=point_and_distance)
        self.assertEqual(events.count(), 2)
        events = list(events)
        self.assertTrue(event2 not in events)
        self.assertTrue(event1 in events)
        self.assertTrue(event3 in events)

        # ensure ordering is respected by "near"
        events = Event.objects(location__near=[-87.67892, 41.9120459])
        events = events.order_by("-date")
        self.assertEqual(events.count(), 3)
        self.assertEqual(list(events), [event3, event1, event2])

        # find events within 10km of san francisco
        point = [-122.415579, 37.7566023]
        events = Event.objects(location__near=point, location__max_distance=10000)
        self.assertEqual(events.count(), 1)
        self.assertEqual(events[0], event2)

        # find events within 1km of greenpoint, brooklyn, nyc, ny
        events = Event.objects(location__near=[-73.9509714, 40.7237134],
                               location__max_distance=1000)
        self.assertEqual(events.count(), 0)

        # ensure ordering is respected by "near"
        events = Event.objects(location__near=[-87.67892, 41.9120459],
                               location__max_distance=10000).order_by("-date")
        self.assertEqual(events.count(), 2)
        self.assertEqual(events[0], event3)

        # check that geo_within_box works
        box = [(-125.0, 35.0), (-100.0, 40.0)]
        events = Event.objects(location__geo_within_box=box)
        self.assertEqual(events.count(), 1)
        self.assertEqual(events[0].id, event2.id)

        polygon = [
            (-87.694445, 41.912114),
            (-87.69084, 41.919395),
            (-87.681742, 41.927186),
            (-87.654276, 41.911731),
            (-87.656164, 41.898061),
        ]
        events = Event.objects(location__geo_within_polygon=polygon)
        self.assertEqual(events.count(), 1)
        self.assertEqual(events[0].id, event1.id)

        polygon2 = [
            (-1.742249, 54.033586),
            (-1.225891, 52.792797),
            (-4.40094, 53.389881)
        ]
        events = Event.objects(location__geo_within_polygon=polygon2)
        self.assertEqual(events.count(), 0)
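    # Note: PointField data is stored as GeoJSON and indexed with "2dsphere", so
    # max_distance above is in metres, not degrees. The saved location is
    # expected to look roughly like
    # {"type": "Point", "coordinates": [-87.677137, 41.909889]}.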
    def test_2dsphere_point_embedded(self):

        class Venue(EmbeddedDocument):
            location = GeoPointField()
            name = StringField()

        class Event(Document):
            title = StringField()
            venue = EmbeddedDocumentField(Venue)

        Event.drop_collection()

        venue1 = Venue(name="The Rock", location=[-87.677137, 41.909889])
        venue2 = Venue(name="The Bridge", location=[-122.4194155, 37.7749295])

        event1 = Event(title="Coltrane Motion @ Double Door",
                       venue=venue1).save()
        event2 = Event(title="Coltrane Motion @ Bottom of the Hill",
                       venue=venue2).save()
        event3 = Event(title="Coltrane Motion @ Empty Bottle",
                       venue=venue1).save()

        # find all events "near" pitchfork office, chicago.
        # note that "near" will show the san francisco event, too,
        # although it sorts to last.
        events = Event.objects(venue__location__near=[-87.67892, 41.9120459])
        self.assertEqual(events.count(), 3)
        self.assertEqual(list(events), [event1, event3, event2])
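    # Note: this embedded variant still declares Venue.location as a
    # GeoPointField; a PointField would presumably be queried the same way via
    # venue__location__near, only with metre-based distances.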
    def test_linestring(self):

        class Road(Document):
            name = StringField()
            line = LineStringField()

        Road.drop_collection()

        Road(name="66", line=[[40, 5], [41, 6]]).save()

        # near
        point = {"type": "Point", "coordinates": [40, 5]}
        roads = Road.objects.filter(line__near=point["coordinates"]).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(line__near=point).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(line__near={"$geometry": point}).count()
        self.assertEqual(1, roads)

        # Within
        polygon = {"type": "Polygon",
                   "coordinates": [[[40, 5], [40, 6], [41, 6], [41, 5], [40, 5]]]}
        roads = Road.objects.filter(line__geo_within=polygon["coordinates"]).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(line__geo_within=polygon).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(line__geo_within={"$geometry": polygon}).count()
        self.assertEqual(1, roads)

        # Intersects
        line = {"type": "LineString",
                "coordinates": [[40, 5], [40, 6]]}
        roads = Road.objects.filter(line__geo_intersects=line["coordinates"]).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(line__geo_intersects=line).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(line__geo_intersects={"$geometry": line}).count()
        self.assertEqual(1, roads)

        polygon = {"type": "Polygon",
                   "coordinates": [[[40, 5], [40, 6], [41, 6], [41, 5], [40, 5]]]}
        roads = Road.objects.filter(line__geo_intersects=polygon["coordinates"]).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(line__geo_intersects=polygon).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(line__geo_intersects={"$geometry": polygon}).count()
        self.assertEqual(1, roads)
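    # Note: the three query forms used above -- raw coordinates, a GeoJSON dict,
    # and an explicit {"$geometry": ...} wrapper -- are treated as equivalent.
    # The saved road is expected to be stored roughly as
    # {"name": "66", "line": {"type": "LineString",
    #                         "coordinates": [[40, 5], [41, 6]]}}.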
    def test_polygon(self):

        class Road(Document):
            name = StringField()
            poly = PolygonField()

        Road.drop_collection()

        Road(name="66", poly=[[[40, 5], [40, 6], [41, 6], [40, 5]]]).save()

        # near
        point = {"type": "Point", "coordinates": [40, 5]}
        roads = Road.objects.filter(poly__near=point["coordinates"]).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(poly__near=point).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(poly__near={"$geometry": point}).count()
        self.assertEqual(1, roads)

        # Within
        polygon = {"type": "Polygon",
                   "coordinates": [[[40, 5], [40, 6], [41, 6], [41, 5], [40, 5]]]}
        roads = Road.objects.filter(poly__geo_within=polygon["coordinates"]).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(poly__geo_within=polygon).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(poly__geo_within={"$geometry": polygon}).count()
        self.assertEqual(1, roads)

        # Intersects
        line = {"type": "LineString",
                "coordinates": [[40, 5], [41, 6]]}
        roads = Road.objects.filter(poly__geo_intersects=line["coordinates"]).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(poly__geo_intersects=line).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(poly__geo_intersects={"$geometry": line}).count()
        self.assertEqual(1, roads)

        polygon = {"type": "Polygon",
                   "coordinates": [[[40, 5], [40, 6], [41, 6], [41, 5], [40, 5]]]}
        roads = Road.objects.filter(poly__geo_intersects=polygon["coordinates"]).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(poly__geo_intersects=polygon).count()
        self.assertEqual(1, roads)

        roads = Road.objects.filter(poly__geo_intersects={"$geometry": polygon}).count()
        self.assertEqual(1, roads)
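    # Note: the polygon saved above is a single closed ring (its first and last
    # coordinate pairs match); 2dsphere indexes presumably reject unclosed
    # rings, so any illustrative data should follow the same shape.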
if __name__ == '__main__':
    unittest.main()
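A minimal usage sketch (not part of this commit) of how the new fields and operators exercised above might fit together in application code. It assumes a local mongod recent enough to build "2dsphere" indexes; the Landmark document, its field names and the geo_sketch database name are illustrative only.

from mongoengine import (Document, LineStringField, PointField, PolygonField,
                         StringField, connect)

connect('geo_sketch')  # hypothetical database name


class Landmark(Document):
    # Each geo field below stores GeoJSON and adds a "2dsphere" index.
    name = StringField()
    spot = PointField()           # {"type": "Point", "coordinates": [lng, lat]}
    approach = LineStringField()  # ordered list of [lng, lat] pairs
    grounds = PolygonField()      # list of closed rings of [lng, lat] pairs


Landmark(name="Empty Bottle",
         spot=[-87.686638, 41.900474],
         approach=[[-87.687, 41.900], [-87.686, 41.901]],
         grounds=[[[-87.687, 41.900], [-87.687, 41.901],
                   [-87.686, 41.901], [-87.687, 41.900]]]).save()

# max_distance is metres here, because the index is spherical
nearby = Landmark.objects(spot__near=[-87.67892, 41.9120459],
                          spot__max_distance=10000)

# GeoJSON geometries can be passed straight to the new operators
area = {"type": "Polygon",
        "coordinates": [[[-88, 41], [-88, 42], [-87, 42], [-87, 41], [-88, 41]]]}
inside = Landmark.objects(grounds__geo_within=area)
crossing = Landmark.objects(approach__geo_intersects=area)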
@ -2380,167 +2380,6 @@ class QuerySetTest(unittest.TestCase):
    def tearDown(self):
        self.Person.drop_collection()

    def test_geospatial_operators(self):
        """Ensure that geospatial queries are working.
        """
        class Event(Document):
            title = StringField()
            date = DateTimeField()
            location = GeoPointField()

            def __unicode__(self):
                return self.title

        Event.drop_collection()

        event1 = Event(title="Coltrane Motion @ Double Door",
                       date=datetime.now() - timedelta(days=1),
                       location=[41.909889, -87.677137])
        event2 = Event(title="Coltrane Motion @ Bottom of the Hill",
                       date=datetime.now() - timedelta(days=10),
                       location=[37.7749295, -122.4194155])
        event3 = Event(title="Coltrane Motion @ Empty Bottle",
                       date=datetime.now(),
                       location=[41.900474, -87.686638])

        event1.save()
        event2.save()
        event3.save()

        # find all events "near" pitchfork office, chicago.
        # note that "near" will show the san francisco event, too,
        # although it sorts to last.
        events = Event.objects(location__near=[41.9120459, -87.67892])
        self.assertEqual(events.count(), 3)
        self.assertEqual(list(events), [event1, event3, event2])

        # find events within 5 degrees of pitchfork office, chicago
        point_and_distance = [[41.9120459, -87.67892], 5]
        events = Event.objects(location__within_distance=point_and_distance)
        self.assertEqual(events.count(), 2)
        events = list(events)
        self.assertTrue(event2 not in events)
        self.assertTrue(event1 in events)
        self.assertTrue(event3 in events)

        # ensure ordering is respected by "near"
        events = Event.objects(location__near=[41.9120459, -87.67892])
        events = events.order_by("-date")
        self.assertEqual(events.count(), 3)
        self.assertEqual(list(events), [event3, event1, event2])

        # find events within 10 degrees of san francisco
        point = [37.7566023, -122.415579]
        events = Event.objects(location__near=point, location__max_distance=10)
        self.assertEqual(events.count(), 1)
        self.assertEqual(events[0], event2)

        # find events within 10 degrees of san francisco
        point_and_distance = [[37.7566023, -122.415579], 10]
        events = Event.objects(location__within_distance=point_and_distance)
        self.assertEqual(events.count(), 1)
        self.assertEqual(events[0], event2)

        # find events within 1 degree of greenpoint, broolyn, nyc, ny
        point_and_distance = [[40.7237134, -73.9509714], 1]
        events = Event.objects(location__within_distance=point_and_distance)
        self.assertEqual(events.count(), 0)

        # ensure ordering is respected by "within_distance"
        point_and_distance = [[41.9120459, -87.67892], 10]
        events = Event.objects(location__within_distance=point_and_distance)
        events = events.order_by("-date")
        self.assertEqual(events.count(), 2)
        self.assertEqual(events[0], event3)

        # check that within_box works
        box = [(35.0, -125.0), (40.0, -100.0)]
        events = Event.objects(location__within_box=box)
        self.assertEqual(events.count(), 1)
        self.assertEqual(events[0].id, event2.id)

        # check that polygon works for users who have a server >= 1.9
        server_version = tuple(
            get_connection().server_info()['version'].split('.')
        )
        required_version = tuple("1.9.0".split("."))
        if server_version >= required_version:
            polygon = [
                (41.912114, -87.694445),
                (41.919395, -87.69084),
                (41.927186, -87.681742),
                (41.911731, -87.654276),
                (41.898061, -87.656164),
            ]
            events = Event.objects(location__within_polygon=polygon)
            self.assertEqual(events.count(), 1)
            self.assertEqual(events[0].id, event1.id)

            polygon2 = [
                (54.033586, -1.742249),
                (52.792797, -1.225891),
                (53.389881, -4.40094)
            ]
            events = Event.objects(location__within_polygon=polygon2)
            self.assertEqual(events.count(), 0)

        Event.drop_collection()

    def test_spherical_geospatial_operators(self):
        """Ensure that spherical geospatial queries are working
        """
        class Point(Document):
            location = GeoPointField()

        Point.drop_collection()

        # These points are one degree apart, which (according to Google Maps)
        # is about 110 km apart at this place on the Earth.
        north_point = Point(location=[-122, 38])  # Near Concord, CA
        south_point = Point(location=[-122, 37])  # Near Santa Cruz, CA
        north_point.save()
        south_point.save()

        earth_radius = 6378.009;  # in km (needs to be a float for dividing by)

        # Finds both points because they are within 60 km of the reference
        # point equidistant between them.
        points = Point.objects(location__near_sphere=[-122, 37.5])
        self.assertEqual(points.count(), 2)

        # Same behavior for _within_spherical_distance
        points = Point.objects(
            location__within_spherical_distance=[[-122, 37.5], 60/earth_radius]
        );
        self.assertEqual(points.count(), 2)

        points = Point.objects(location__near_sphere=[-122, 37.5],
                               location__max_distance=60 / earth_radius);
        self.assertEqual(points.count(), 2)

        # Finds both points, but orders the north point first because it's
        # closer to the reference point to the north.
        points = Point.objects(location__near_sphere=[-122, 38.5])
        self.assertEqual(points.count(), 2)
        self.assertEqual(points[0].id, north_point.id)
        self.assertEqual(points[1].id, south_point.id)

        # Finds both points, but orders the south point first because it's
        # closer to the reference point to the south.
        points = Point.objects(location__near_sphere=[-122, 36.5])
        self.assertEqual(points.count(), 2)
        self.assertEqual(points[0].id, south_point.id)
        self.assertEqual(points[1].id, north_point.id)

        # Finds only one point because only the first point is within 60km of
        # the reference point to the south.
        points = Point.objects(
            location__within_spherical_distance=[[-122, 36.5], 60/earth_radius])
        self.assertEqual(points.count(), 1)
        self.assertEqual(points[0].id, south_point.id)

        Point.drop_collection()

    def test_custom_querysets(self):
        """Ensure that custom QuerySet classes may be used.
        """