109 Commits

Author SHA1 Message Date
nat
ee362a7a73 Merge pull request #73 from nat-n/always_black
Bump version to 1.2.5
2020-05-27 13:37:54 +02:00
nat
261e55b2c8 Merge pull request #72 from nat-n/always_black
Make CI check formatting is black & append .j2 suffix to template.py
2020-05-27 12:27:33 +02:00
Nat Noordanus
98930ce0d7 Bump version to 1.2.5 2020-05-27 12:04:53 +02:00
Nat Noordanus
d7d277eb0d Remove typo from Pipfile and update Pipfile.lock 2020-05-27 11:52:18 +02:00
Nat Noordanus
3860c0ab11 Add task to run black --check in ci & update README 2020-05-27 11:52:10 +02:00
Nat Noordanus
cd1c2dc3b5 Rename template file to avoid confusing black or other build tools 2020-05-27 11:25:19 +02:00
Nat Noordanus
be2a24d15c blacken 2020-05-27 11:25:00 +02:00
Vasilios Syrakis
a5effb219a Release 1.2.4 (#71)
Co-authored-by: nat <n@natn.me>
2020-05-26 22:17:55 +02:00
Bouke Versteegh
2d57f0d122 Merge pull request #67 from danielgtaylor/nat-n-patch-1
Enforce utf-8 for reading the readme in setup.py
2020-05-25 21:57:12 +02:00
nat
a68505b80e Enforce utf-8 for reading the readme
Fixes failing installation issue #66
2020-05-25 17:53:13 +02:00
nat
2f9497e064 Merge pull request #55 from boukeversteegh/pr/xfail-tests
Add intentionally failing test-cases for unimplemented bug-fixes
2020-05-25 09:54:26 +02:00
boukeversteegh
33964b883e Do not use mutable defaults 2020-05-25 00:35:43 +02:00
boukeversteegh
ec7574086d Add xfail test-case for future circular dependency scenario 2020-05-24 20:35:10 +02:00
boukeversteegh
8a42027bc9 Improve failing test-case for issue #64 2020-05-24 20:33:48 +02:00
boukeversteegh
71737cf696 Test case for issue #63 2020-05-24 20:29:32 +02:00
boukeversteegh
659ddd9c44 Working test case for oneof 2020-05-24 20:29:19 +02:00
boukeversteegh
5b6997870a Test case for issue #61 2020-05-24 20:27:12 +02:00
boukeversteegh
cdf7645722 Test case for issue #60 2020-05-24 20:26:47 +02:00
boukeversteegh
ca20069ca3 Test case for issue #59 2020-05-24 20:26:13 +02:00
boukeversteegh
59a4a7da43 Test case for issue #58 2020-05-24 20:25:29 +02:00
boukeversteegh
15af4367e5 Test case for issue #57 2020-05-24 20:24:55 +02:00
boukeversteegh
ec5683e572 Test Service instantiation as part of standard test-case 2020-05-24 20:02:41 +02:00
boukeversteegh
20150fdcf3 Cleanup 2020-05-24 19:58:49 +02:00
boukeversteegh
d11b7d04c5 Document XFAIL tests 2020-05-24 19:58:35 +02:00
boukeversteegh
e2d35f4696 Support xfail on test-case level, support running tests on subsets. 2020-05-24 19:58:06 +02:00
boukeversteegh
c3f08b9ef2 Clear output directories before generating python files 2020-05-24 19:54:53 +02:00
boukeversteegh
24d44898f4 Only import reference module when needed. Some reference modules generate bad imports and cannot be loaded. 2020-05-24 19:53:14 +02:00
boukeversteegh
074448c996 Restore accidentally removed binary equality test 2020-05-24 19:52:14 +02:00
nat
0fe557bd3c Merge pull request #52 from nat-n/fix_type_imports
Only import types from grpclib when type checking
2020-05-24 19:09:08 +02:00
nat
1a87ea43a1 Merge pull request #40 from boukeversteegh/pr/wrapper-as-output
Support using Google's wrapper types as RPC output values
2020-05-24 19:06:30 +02:00
andrei
983e0895a2 Fix services using non-pythonified field names 2020-05-24 18:46:36 +02:00
nat
4a2baf3f0a Merge pull request #46 from jameslan/perf/class-cache
Improve performance of serialize/deserialize by caching type information of fields in class
2020-05-24 18:38:32 +02:00
boukeversteegh
8f0caf1db2 Read desired wrapper type directly from wrapper definition 2020-05-24 14:50:56 +02:00
boukeversteegh
c50d9e2fdc Add test for generating embedded wellknown types in outputs. 2020-05-24 14:48:39 +02:00
boukeversteegh
35548cb43e Test all supported wrapper types. Add xfail test for unwrapping the value 2020-05-24 12:34:37 +02:00
boukeversteegh
b711d1e11f Merge remote-tracking branch 'daniel/master' into pr/wrapper-as-output 2020-05-24 10:41:40 +02:00
James Lan
917de09bb6 Replace extra decorator with property and lazy initialization so that it is backward compatible. 2020-05-23 17:36:29 -07:00
James Lan
1f7f39049e Cache resolved classes for fields, so that there's no new data classes generated while deserializing. 2020-05-23 17:36:29 -07:00
James Lan
3d001a2a1a Store the class metadata of fields in the class, to improve performance
Cached data includes:
- lookup table between groups and fields of "oneof" fields
- default value creator of each field
- type hint of each field
2020-05-23 17:36:29 -07:00
James Lan
de61ddab21 Add option to repeatedly execute betterproto operations in test, to evaluate performance 2020-05-23 17:36:29 -07:00
Nat Noordanus
5e2d9febea Blacken 2020-05-23 23:37:22 +02:00
nat
f6af077ffe Merge pull request #51 from boukeversteegh/pr/refactor-tests
Reorganize tests and add some extra documentation.
2020-05-22 22:32:37 +02:00
boukeversteegh
92088ebda8 Cleanup 2020-05-22 21:18:44 +02:00
boukeversteegh
c3e3837f71 More concise whitelist logic 2020-05-22 21:11:23 +02:00
boukeversteegh
6bd9c7835c Fix docs 2020-05-22 21:08:08 +02:00
boukeversteegh
6ec902c1b5 Fix generate noargs. Sorted iteration. 2020-05-22 21:03:45 +02:00
boukeversteegh
960dba2ae8 Renamed docs for standard tests 2020-05-22 20:58:53 +02:00
boukeversteegh
4b4bdefb6f Add explicit test for casing rules 2020-05-22 20:58:31 +02:00
boukeversteegh
dfa0a56b39 Simplify standard tests by using 1 json per case. 2020-05-22 20:58:14 +02:00
boukeversteegh
dd4873dfba Re-introducing whitelisting argument to generate.py 2020-05-22 20:51:22 +02:00
Nat Noordanus
91f586f7d7 Apply black formatting 2020-05-22 18:46:43 +02:00
Nat Noordanus
33fb83faad Only import types from grpclib when type checking 2020-05-22 18:41:29 +02:00
boukeversteegh
77c04414f5 Update readme, add docs for standard tests 2020-05-22 16:36:43 +02:00
boukeversteegh
6969ff7ff6 Add another missing gitignored file, and remove gitignore filter for tests/ 2020-05-22 15:34:25 +02:00
boukeversteegh
13e08fdaa8 Add missing file, ignore output files 2020-05-22 15:05:52 +02:00
boukeversteegh
6775632f77 Undo unintentional pipfile update 2020-05-22 13:03:52 +02:00
boukeversteegh
b12f1e4e61 Organize test-cases into folders, extract compatibility test into proper test, support adding test-case specific tests 2020-05-22 12:54:01 +02:00
Bouke Versteegh
7e9ba0866c cleanup 2020-05-21 22:55:26 +02:00
nat
3546f55146 Merge pull request #32 from nat-n/improve_stub
Add ability to provide metadata, timeout & deadline args to requests
2020-05-21 10:11:45 +02:00
boukeversteegh
499489f1d3 Support using Google's wrapper types as RPC output values 2020-05-10 16:36:29 +02:00
Vasili Syrakis
ce9f492f50 Increment version to 1.2.3 2020-04-15 14:24:02 +10:00
Vasilios Syrakis
93a6334015 Update CHANGELOG.md 2020-04-15 14:21:30 +10:00
Adam Ehlers Nyholm Thomsen
36a14026d8 Fix issue that occurs with naming when proto is double nested (#21) 2020-04-15 14:10:43 +10:00
Vasilios Syrakis
04a2fcd3eb Merge pull request #31 from nat-n/fix_readme
Fix test instructions to match pipfile
2020-04-14 10:55:18 +10:00
Nat Noordanus
5759e323bd Add ability to provide metadata, timeout & deadline args to requests
This is an enhancement of the ServiceStub abstract class that makes
it more useful by making it possible to pass all arguments supported
by the underlying grpclib request function.

It extends the existing high level API by allowing values to be
set on the stub instance, and the low level API by allowing values
to be set per call.
2020-04-12 22:23:10 +02:00
Nat Noordanus
c762c9c549 Add test for generated service stub
- Create one simple test for generated Service stubs in preparation
for making more changes in this area.
- Add dev dependency on pytest-asyncio in order to use ChannelFor
from grpclib.testing more easily.
- Create a new example proto containing a minimal rpc example.
2020-04-12 19:37:39 +02:00
Nat Noordanus
582a12577c Fix test instructions to match pipfile 2020-04-12 18:52:43 +02:00
Vasilios Syrakis
3616190451 Merge pull request #30 from nat-n/p36_support
#27 Add support for python 3.6
2020-04-08 09:37:48 +10:00
Nat Noordanus
9b990ee1bd Make pipenv play nice with the setup-python ci workflow 2020-04-05 15:58:12 +02:00
Vasilios Syrakis
72a77b0d65 Merge pull request #28 from tanishq-dubey/patch-1
Update README.md for pip syntax
2020-04-05 14:52:48 +10:00
Nat Noordanus
b2b36c8575 Apply black formatting 2020-04-03 19:54:19 +02:00
Nat Noordanus
203105f048 Add support for python 3.6
Changes:
- Update config and docs to reference 3.6
- Add backports of dataclasses and datetime.fromisoformat for python_version<"3.7"
- Support both 3.7 and 3.6 usages of undocumented __origin__ attribute on typing objects
- Make github ci run tests for python 3.6 as well
2020-04-03 19:52:19 +02:00
Tanishq Dubey
fe11f74227 Update README.md
Add quotes to the README so pip syntax is correct
2020-03-30 09:50:11 -04:00
Daniel G. Taylor
dc7a3e9bdf Update changelog 2020-01-30 17:48:12 -08:00
Daniel G. Taylor
f2e8afc609 Merge pull request #16 from cetanu/patch-1
Exclude empty lists from to_dict output
2020-01-30 17:31:25 -08:00
Daniel G. Taylor
dbd438e682 Update to emit empty lists if asked for defaults 2020-01-30 17:28:22 -08:00
Daniel G. Taylor
dce1c89fbe Merge branch 'master' into patch-1 2020-01-30 17:22:47 -08:00
Daniel G. Taylor
c78851b1b8 Merge pull request #12 from ulasozguler/master
Added `include_default_values` parameter to `to_dict` function
2020-01-30 17:19:34 -08:00
Vasilios Syrakis
4554d91f89 Exclude empty lists from to_dict output 2020-01-29 22:32:35 +11:00
ulas
c0170f4d80 Added include_default_values parameter to to_dict function. 2020-01-22 19:16:57 +03:00
Daniel G. Taylor
559b8833d8 Bump version to 1.2.2 2020-01-09 16:47:25 -08:00
Daniel G. Taylor
7ccef16579 Mention no proto 2, fixes #6 2020-01-09 16:43:45 -08:00
Daniel G. Taylor
d8785b4622 Merge pull request #10 from qix/master
Fix serialization of dataclass constructor parameters
2020-01-09 16:35:06 -08:00
Daniel G. Taylor
45e7a30300 Merge pull request #7 from ulasozguler/master
Fix - propagate `casing` param of `to_dict` function recursively
2020-01-09 16:32:29 -08:00
Josh Yudaken
d7559c22f8 Fix serialization of dataclass constructor parameters 2020-01-08 11:29:45 -05:00
ulas
f9c351a98d propagate casing param recursively. 2019-12-04 19:28:53 +03:00
Daniel G. Taylor
feea790116 Bump library version 2019-10-29 22:00:27 -07:00
Daniel G. Taylor
33f74f6a45 Fix comment indent bug; bump version 2019-10-29 21:59:23 -07:00
Daniel G. Taylor
3d5c12c532 Add changelog, version bump 2019-10-28 21:13:25 -07:00
Daniel G. Taylor
706bd5a475 Slightly simplify gRPC helper functions 2019-10-28 20:58:33 -07:00
Daniel G. Taylor
52beeb0d73 Fix typo in example 2019-10-28 20:44:57 -07:00
Daniel G. Taylor
7e2dc595db Autoformat files after rendering 2019-10-28 20:44:50 -07:00
Daniel G. Taylor
6fd9612ee1 Doc updates, version bump for release 2019-10-27 15:43:52 -07:00
Daniel G. Taylor
ba520f88a4 Install Protobuf include files on CI host 2019-10-27 15:40:33 -07:00
Daniel G. Taylor
b0b64fcbaf Fix tests attempt 3 2019-10-27 15:29:04 -07:00
Daniel G. Taylor
7900c7c9db Fix tests 2019-10-27 15:21:20 -07:00
Daniel G. Taylor
fcc273e294 Fix tests 2019-10-27 15:18:10 -07:00
Daniel G. Taylor
f820397751 Add missing optional types test 2019-10-27 15:14:06 -07:00
Daniel G. Taylor
16687211a2 Typing fixes 2019-10-27 15:13:51 -07:00
Daniel G. Taylor
eb5020db2a Fix bool parsing bug 2019-10-27 14:59:38 -07:00
Daniel G. Taylor
035793aec3 Support wrapper types 2019-10-27 14:55:25 -07:00
Daniel G. Taylor
c79535b614 Support Duration/Timestamp Google well-known types 2019-10-26 23:07:30 -07:00
Daniel G. Taylor
5daf61f64c Refactor default value code 2019-10-25 21:16:32 -07:00
Daniel G. Taylor
4679c571c3 Fix comment newlines 2019-10-25 12:28:26 -07:00
Daniel G. Taylor
ff8463cf12 Handle fields that clash with Python reserved keywords 2019-10-23 21:28:31 -07:00
Daniel G. Taylor
eff9021529 Some informational output from the plugin, do not overwrite __init__.py 2019-10-23 15:07:05 -07:00
Daniel G. Taylor
d43d5af5ce Better JSON casing support, renaming messages/fields 2019-10-23 15:06:34 -07:00
Daniel G. Taylor
ef0a1bf50c Use specific version of pypi publish image 2019-10-23 15:03:13 -07:00
Daniel G. Taylor
0e389abbef Add Python package long description 2019-10-22 21:31:42 -07:00
98 changed files with 2500 additions and 550 deletions

@@ -3,21 +3,70 @@ name: CI
on: [push, pull_request]
jobs:
build:
check-formatting:
runs-on: ubuntu-latest
name: Consult black on python formatting
steps:
- uses: actions/checkout@v1
- uses: actions/setup-python@v1
with:
python-version: 3.7
- uses: dschep/install-pipenv-action@v1
- name: Install dependencies
run: |
sudo apt install protobuf-compiler
pipenv install --dev
- name: Run tests
run: |
pipenv run generate
pipenv run test
- uses: actions/checkout@v1
- uses: actions/setup-python@v1
with:
python-version: 3.7
- uses: dschep/install-pipenv-action@v1
- name: Install dependencies
run: |
pipenv install --dev --python ${pythonLocation}/python
- name: Run black
run: |
pipenv run black . --check --diff --exclude tests/output_
run-tests:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: [ '3.6', '3.7' ]
name: Python ${{ matrix.python-version }} test
steps:
- uses: actions/checkout@v1
- uses: actions/setup-python@v1
with:
python-version: ${{ matrix.python-version }}
- uses: dschep/install-pipenv-action@v1
- name: Install dependencies
run: |
sudo apt install protobuf-compiler libprotobuf-dev
pipenv install --dev --python ${pythonLocation}/python
- name: Run tests
run: |
cp .env.default .env
pipenv run pip install -e .
pipenv run generate
pipenv run test
build-release:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- uses: actions/setup-python@v1
with:
python-version: 3.7
- uses: dschep/install-pipenv-action@v1
- name: Install dependencies
run: |
sudo apt install protobuf-compiler libprotobuf-dev
pipenv install --dev --python ${pythonLocation}/python
- name: Build package
if: github.event_name == 'push' && startsWith(github.event.ref, 'refs/tags')
run: pipenv run python setup.py sdist
- name: Publish package
if: github.event_name == 'push' && startsWith(github.event.ref, 'refs/tags')
uses: pypa/gh-action-pypi-publish@v1.0.0a0
with:
user: __token__
password: ${{ secrets.pypi }}

.gitignore

@@ -2,12 +2,11 @@
.vscode/settings.json
.mypy_cache
.pytest_cache
betterproto/tests/*.bin
betterproto/tests/*_pb2.py
betterproto/tests/*.py
!betterproto/tests/generate.py
!betterproto/tests/test_*.py
.python-version
build/
betterproto/tests/output_*
**/__pycache__
dist
**/*.egg-info
output
.idea

CHANGELOG.md

@@ -0,0 +1,69 @@
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [1.2.5] - 2020-04-27
- Add .j2 suffix to python template names to avoid confusing certain build tools [#72](https://github.com/danielgtaylor/python-betterproto/pull/72)
## [1.2.4] - 2020-04-26
- Enforce utf-8 for reading the readme in setup.py [#67](https://github.com/danielgtaylor/python-betterproto/pull/67)
- Only import types from grpclib when type checking [#52](https://github.com/danielgtaylor/python-betterproto/pull/52)
- Improve performance of serialize/deserialize by caching type information of fields in class [#46](https://github.com/danielgtaylor/python-betterproto/pull/46)
- Support using Google's wrapper types as RPC output values [#40](https://github.com/danielgtaylor/python-betterproto/pull/40)
- Fixes issue where protoc did not recognize plugin.py as win32 application [#38](https://github.com/danielgtaylor/python-betterproto/pull/38)
- Fix services using non-pythonified field names [#34](https://github.com/danielgtaylor/python-betterproto/pull/34)
- Add ability to provide metadata, timeout & deadline args to requests [#32](https://github.com/danielgtaylor/python-betterproto/pull/32)
## [1.2.3] - 2020-04-15
- Exclude empty lists from `to_dict` by default [#16](https://github.com/danielgtaylor/python-betterproto/pull/16)
- Add `include_default_values` parameter for `to_dict` [#12](https://github.com/danielgtaylor/python-betterproto/pull/12)
- Fix class names being prepended with duplicates when using protocol buffers that are nested more than once [#21](https://github.com/danielgtaylor/python-betterproto/pull/21)
- Add support for python 3.6 [#30](https://github.com/danielgtaylor/python-betterproto/pull/30)
## [1.2.2] - 2020-01-09
- Mention lack of Proto 2 support in README.
- Fix serialization of constructor parameters [#10](https://github.com/danielgtaylor/python-betterproto/pull/10)
- Fix `casing` parameter propagation [#7](https://github.com/danielgtaylor/python-betterproto/pull/7)
## [1.2.1] - 2019-10-29
- Fix comment indentation bug in rendered gRPC methods.
## [1.2.0] - 2019-10-28
- Generated code output auto-formatting via [Black](https://github.com/psf/black)
- Simplified gRPC helper functions
## [1.1.0] - 2019-10-27
- Better JSON casing support
- Handle field names which clash with Python reserved words
- Better handling of default values from type introspection
- Support for Google Duration & Timestamp types
- Support for Google wrapper types
- Documentation updates
## [1.0.1] - 2019-10-22
- README to the PyPI details page
## [1.0.0] - 2019-10-22
- Initial release
[1.2.5]: https://github.com/danielgtaylor/python-betterproto/compare/v1.2.4...v1.2.5
[1.2.4]: https://github.com/danielgtaylor/python-betterproto/compare/v1.2.3...v1.2.4
[1.2.3]: https://github.com/danielgtaylor/python-betterproto/compare/v1.2.2...v1.2.3
[1.2.2]: https://github.com/danielgtaylor/python-betterproto/compare/v1.2.1...v1.2.2
[1.2.1]: https://github.com/danielgtaylor/python-betterproto/compare/v1.2.0...v1.2.1
[1.2.0]: https://github.com/danielgtaylor/python-betterproto/compare/v1.1.0...v1.2.0
[1.1.0]: https://github.com/danielgtaylor/python-betterproto/compare/v1.0.1...v1.1.0
[1.0.1]: https://github.com/danielgtaylor/python-betterproto/compare/v1.0.0...v1.0.1
[1.0.0]: https://github.com/danielgtaylor/python-betterproto/releases/tag/v1.0.0

Pipfile

@@ -8,17 +8,25 @@ flake8 = "*"
mypy = "*"
isort = "*"
pytest = "*"
pytest-asyncio = "*"
rope = "*"
[packages]
protobuf = "*"
jinja2 = "*"
grpclib = "*"
stringcase = "*"
black = "*"
backports-datetime-fromisoformat = "*"
dataclasses = "*"
[requires]
python_version = "3.7"
python_version = "3.6"
[scripts]
plugin = "protoc --plugin=protoc-gen-custom=betterproto/plugin.py --custom_out=output"
generate = "python betterproto/tests/generate.py"
test = "pytest ./betterproto/tests"
[pipenv]
allow_prereleases = true

Pipfile.lock

@@ -1,11 +1,11 @@
{
"_meta": {
"hash": {
"sha256": "f698150037f2a8ac554e4d37ecd4619ba35d1aa570f5b641d048ec9c6b23eb40"
"sha256": "44ae793965dc2b6ec17f0435a388846248b8a703cf857470b66af84227535950"
},
"pipfile-spec": 6,
"requires": {
"python_version": "3.7"
"python_version": "3.6"
},
"sources": [
{
@@ -16,19 +16,63 @@
]
},
"default": {
"grpclib": {
"appdirs": {
"hashes": [
"sha256:d19e2ea87cb073e5b0825dfee15336fd2b1c09278d271816e04c90faddc107ea"
"sha256:7d5d0167b2b1ba821647616af46a749d1c653740dd0d2415100fe26e27afdf41",
"sha256:a841dacd6b99318a741b166adb07e19ee71a274450e68237b4650ca1055ab128"
],
"version": "==1.4.4"
},
"attrs": {
"hashes": [
"sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c",
"sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72"
],
"version": "==19.3.0"
},
"backports-datetime-fromisoformat": {
"hashes": [
"sha256:9577a2a9486cd7383a5f58b23bb8e81cf0821dbbc0eb7c87d3fa198c1df40f5c"
],
"index": "pypi",
"version": "==0.3.0"
"version": "==1.0.0"
},
"black": {
"hashes": [
"sha256:1b30e59be925fafc1ee4565e5e08abef6b03fe455102883820fe5ee2e4734e0b",
"sha256:c2edb73a08e9e0e6f65a0e6af18b059b8b1cdd5bef997d7a0b181df93dc81539"
],
"index": "pypi",
"version": "==19.10b0"
},
"click": {
"hashes": [
"sha256:d2b5255c7c6349bc1bd1e59e08cd12acbbd63ce649f2588755783aa94dfb6b1a",
"sha256:dacca89f4bfadd5de3d7489b7c8a566eee0d3676333fbb50030263894c38c0dc"
],
"version": "==7.1.2"
},
"dataclasses": {
"hashes": [
"sha256:454a69d788c7fda44efd71e259be79577822f5e3f53f029a22d08004e951dc9f",
"sha256:6988bd2b895eef432d562370bb707d540f32f7360ab13da45340101bc2307d84"
],
"index": "pypi",
"version": "==0.6"
},
"grpclib": {
"hashes": [
"sha256:b27d56c987b89023d5640fe9668943e49b46703fc85d8182a58c9f3b19120cdc"
],
"index": "pypi",
"version": "==0.3.2rc1"
},
"h2": {
"hashes": [
"sha256:ac377fcf586314ef3177bfd90c12c7826ab0840edeb03f0f24f511858326049e",
"sha256:b8a32bd282594424c0ac55845377eea13fa54fe4a8db012f3a198ed923dc3ab4"
"sha256:61e0f6601fa709f35cdb730863b4e5ec7ad449792add80d1410d4174ed139af5",
"sha256:875f41ebd6f2c44781259005b157faed1a5031df3ae5aa7bcb4628a6c0782f14"
],
"version": "==3.1.1"
"version": "==3.2.0"
},
"hpack": {
"hashes": [
@@ -46,146 +90,187 @@
},
"jinja2": {
"hashes": [
"sha256:74320bb91f31270f9551d46522e33af46a80c3d619f4a4bf42b3164d30b5911f",
"sha256:9fe95f19286cfefaa917656583d020be14e7859c6b0252588391e47db34527de"
"sha256:c10142f819c2d22bdcd17548c46fa9b77cf4fda45097854c689666bf425e7484",
"sha256:c922560ac46888d47384de1dbdc3daaa2ea993af4b26a436dec31fa2c19ec668"
],
"index": "pypi",
"version": "==2.10.3"
"version": "==3.0.0a1"
},
"markupsafe": {
"hashes": [
"sha256:00bc623926325b26bb9605ae9eae8a215691f33cae5df11ca5424f06f2d1f473",
"sha256:09027a7803a62ca78792ad89403b1b7a73a01c8cb65909cd876f7fcebd79b161",
"sha256:09c4b7f37d6c648cb13f9230d847adf22f8171b1ccc4d5682398e77f40309235",
"sha256:1027c282dad077d0bae18be6794e6b6b8c91d58ed8a8d89a89d59693b9131db5",
"sha256:24982cc2533820871eba85ba648cd53d8623687ff11cbb805be4ff7b4c971aff",
"sha256:29872e92839765e546828bb7754a68c418d927cd064fd4708fab9fe9c8bb116b",
"sha256:43a55c2930bbc139570ac2452adf3d70cdbb3cfe5912c71cdce1c2c6bbd9c5d1",
"sha256:46c99d2de99945ec5cb54f23c8cd5689f6d7177305ebff350a58ce5f8de1669e",
"sha256:500d4957e52ddc3351cabf489e79c91c17f6e0899158447047588650b5e69183",
"sha256:535f6fc4d397c1563d08b88e485c3496cf5784e927af890fb3c3aac7f933ec66",
"sha256:62fe6c95e3ec8a7fad637b7f3d372c15ec1caa01ab47926cfdf7a75b40e0eac1",
"sha256:6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1",
"sha256:717ba8fe3ae9cc0006d7c451f0bb265ee07739daf76355d06366154ee68d221e",
"sha256:79855e1c5b8da654cf486b830bd42c06e8780cea587384cf6545b7d9ac013a0b",
"sha256:7c1699dfe0cf8ff607dbdcc1e9b9af1755371f92a68f706051cc8c37d447c905",
"sha256:88e5fcfb52ee7b911e8bb6d6aa2fd21fbecc674eadd44118a9cc3863f938e735",
"sha256:8defac2f2ccd6805ebf65f5eeb132adcf2ab57aa11fdf4c0dd5169a004710e7d",
"sha256:98c7086708b163d425c67c7a91bad6e466bb99d797aa64f965e9d25c12111a5e",
"sha256:9add70b36c5666a2ed02b43b335fe19002ee5235efd4b8a89bfcf9005bebac0d",
"sha256:9bf40443012702a1d2070043cb6291650a0841ece432556f784f004937f0f32c",
"sha256:ade5e387d2ad0d7ebf59146cc00c8044acbd863725f887353a10df825fc8ae21",
"sha256:b00c1de48212e4cc9603895652c5c410df699856a2853135b3967591e4beebc2",
"sha256:b1282f8c00509d99fef04d8ba936b156d419be841854fe901d8ae224c59f0be5",
"sha256:b2051432115498d3562c084a49bba65d97cf251f5a331c64a12ee7e04dacc51b",
"sha256:ba59edeaa2fc6114428f1637ffff42da1e311e29382d81b339c1817d37ec93c6",
"sha256:c8716a48d94b06bb3b2524c2b77e055fb313aeb4ea620c8dd03a105574ba704f",
"sha256:cd5df75523866410809ca100dc9681e301e3c27567cf498077e8551b6d20e42f",
"sha256:e249096428b3ae81b08327a63a485ad0878de3fb939049038579ac0ef61e17e7"
"sha256:06358015a4dee8ee23ae426bf885616ab3963622defd829eb45b44e3dee3515f",
"sha256:0b0c4fc852c5f02c6277ef3b33d23fcbe89b1b227460423e3335374da046b6db",
"sha256:267677fc42afed5094fc5ea1c4236bbe4b6a00fe4b08e93451e65ae9048139c7",
"sha256:303cb70893e2c345588fb5d5b86e0ca369f9bb56942f03064c5e3e75fa7a238a",
"sha256:3c9b624a0d9ed5a5093ac4edc4e823e6b125441e60ef35d36e6f4a6fdacd5054",
"sha256:42033e14cae1f6c86fc0c3e90d04d08ce73ac8e46ba420a0d22d545c2abd4977",
"sha256:4e4a99b6af7bdc0856b50020c095848ec050356a001e1f751510aef6ab14d0e0",
"sha256:4eb07faad54bb07427d848f31030a65a49ebb0cec0b30674f91cf1ddd456bfe4",
"sha256:63a7161cd8c2bc563feeda45df62f42c860dd0675e2b8da2667f25bb3c95eaba",
"sha256:68e0fd039b68d2945b4beb947d4023ca7f8e95b708031c345762efba214ea761",
"sha256:8092a63397025c2f655acd42784b2a1528339b90b987beb9253f22e8cdbb36c3",
"sha256:841218860683c0f2223e24756843d84cc49cccdae6765e04962607754a52d3e0",
"sha256:94076b2314bd2f6cfae508ad65b4d493e3a58a50112b7a2cbb6287bdbc404ae8",
"sha256:9d22aff1c5322e402adfb3ce40839a5056c353e711c033798cf4f02eb9f5124d",
"sha256:b0e4584f62b3e5f5c1a7bcefd2b52f236505e6ef032cc508caa4f4c8dc8d3af1",
"sha256:b1163ffc1384d242964426a8164da12dbcdbc0de18ea36e2c34b898ed38c3b45",
"sha256:beac28ed60c8e838301226a7a85841d0af2068eba2dcb1a58c2d32d6c05e440e",
"sha256:c29f096ce79c03054a1101d6e5fe6bf04b0bb489165d5e0e9653fb4fe8048ee1",
"sha256:c58779966d53e5f14ba393d64e2402a7926601d1ac8adeb4e83893def79d0428",
"sha256:cfe14b37908eaf7d5506302987228bff69e1b8e7071ccd4e70fd0283b1b47f0b",
"sha256:e834249c45aa9837d0753351cdca61a4b8b383cc9ad0ff2325c97ff7b69e72a6",
"sha256:eed1b234c4499811ee85bcefa22ef5e466e75d132502226ed29740d593316c1f"
],
"version": "==1.1.1"
"version": "==2.0.0a1"
},
"multidict": {
"hashes": [
"sha256:024b8129695a952ebd93373e45b5d341dbb87c17ce49637b34000093f243dd4f",
"sha256:041e9442b11409be5e4fc8b6a97e4bcead758ab1e11768d1e69160bdde18acc3",
"sha256:045b4dd0e5f6121e6f314d81759abd2c257db4634260abcfe0d3f7083c4908ef",
"sha256:047c0a04e382ef8bd74b0de01407e8d8632d7d1b4db6f2561106af812a68741b",
"sha256:068167c2d7bbeebd359665ac4fff756be5ffac9cda02375b5c5a7c4777038e73",
"sha256:148ff60e0fffa2f5fad2eb25aae7bef23d8f3b8bdaf947a65cdbe84a978092bc",
"sha256:1d1c77013a259971a72ddaa83b9f42c80a93ff12df6a4723be99d858fa30bee3",
"sha256:1d48bc124a6b7a55006d97917f695effa9725d05abe8ee78fd60d6588b8344cd",
"sha256:31dfa2fc323097f8ad7acd41aa38d7c614dd1960ac6681745b6da124093dc351",
"sha256:34f82db7f80c49f38b032c5abb605c458bac997a6c3142e0d6c130be6fb2b941",
"sha256:3d5dd8e5998fb4ace04789d1d008e2bb532de501218519d70bb672c4c5a2fc5d",
"sha256:4a6ae52bd3ee41ee0f3acf4c60ceb3f44e0e3bc52ab7da1c2b2aa6703363a3d1",
"sha256:4b02a3b2a2f01d0490dd39321c74273fed0568568ea0e7ea23e02bd1fb10a10b",
"sha256:4b843f8e1dd6a3195679d9838eb4670222e8b8d01bc36c9894d6c3538316fa0a",
"sha256:5de53a28f40ef3c4fd57aeab6b590c2c663de87a5af76136ced519923d3efbb3",
"sha256:61b2b33ede821b94fa99ce0b09c9ece049c7067a33b279f343adfe35108a4ea7",
"sha256:6a3a9b0f45fd75dc05d8e93dc21b18fc1670135ec9544d1ad4acbcf6b86781d0",
"sha256:76ad8e4c69dadbb31bad17c16baee61c0d1a4a73bed2590b741b2e1a46d3edd0",
"sha256:7ba19b777dc00194d1b473180d4ca89a054dd18de27d0ee2e42a103ec9b7d014",
"sha256:7c1b7eab7a49aa96f3db1f716f0113a8a2e93c7375dd3d5d21c4941f1405c9c5",
"sha256:7fc0eee3046041387cbace9314926aa48b681202f8897f8bff3809967a049036",
"sha256:8ccd1c5fff1aa1427100ce188557fc31f1e0a383ad8ec42c559aabd4ff08802d",
"sha256:8e08dd76de80539d613654915a2f5196dbccc67448df291e69a88712ea21e24a",
"sha256:c18498c50c59263841862ea0501da9f2b3659c00db54abfbf823a80787fde8ce",
"sha256:c49db89d602c24928e68c0d510f4fcf8989d77defd01c973d6cbe27e684833b1",
"sha256:ce20044d0317649ddbb4e54dab3c1bcc7483c78c27d3f58ab3d0c7e6bc60d26a",
"sha256:d1071414dd06ca2eafa90c85a079169bfeb0e5f57fd0b45d44c092546fcd6fd9",
"sha256:d3be11ac43ab1a3e979dac80843b42226d5d3cccd3986f2e03152720a4297cd7",
"sha256:db603a1c235d110c860d5f39988ebc8218ee028f07a7cbc056ba6424372ca31b"
"sha256:1ece5a3369835c20ed57adadc663400b5525904e53bae59ec854a5d36b39b21a",
"sha256:275ca32383bc5d1894b6975bb4ca6a7ff16ab76fa622967625baeebcf8079000",
"sha256:3750f2205b800aac4bb03b5ae48025a64e474d2c6cc79547988ba1d4122a09e2",
"sha256:4538273208e7294b2659b1602490f4ed3ab1c8cf9dbdd817e0e9db8e64be2507",
"sha256:5141c13374e6b25fe6bf092052ab55c0c03d21bd66c94a0e3ae371d3e4d865a5",
"sha256:51a4d210404ac61d32dada00a50ea7ba412e6ea945bbe992e4d7a595276d2ec7",
"sha256:5cf311a0f5ef80fe73e4f4c0f0998ec08f954a6ec72b746f3c179e37de1d210d",
"sha256:6513728873f4326999429a8b00fc7ceddb2509b01d5fd3f3be7881a257b8d463",
"sha256:7388d2ef3c55a8ba80da62ecfafa06a1c097c18032a501ffd4cabbc52d7f2b19",
"sha256:9456e90649005ad40558f4cf51dbb842e32807df75146c6d940b6f5abb4a78f3",
"sha256:c026fe9a05130e44157b98fea3ab12969e5b60691a276150db9eda71710cd10b",
"sha256:d14842362ed4cf63751648e7672f7174c9818459d169231d03c56e84daf90b7c",
"sha256:e0d072ae0f2a179c375f67e3da300b47e1a83293c554450b29c900e50afaae87",
"sha256:f07acae137b71af3bb548bd8da720956a3bc9f9a0b87733e0899226a2317aeb7",
"sha256:fbb77a75e529021e7c4a8d4e823d88ef4d23674a202be4f5addffc72cbb91430",
"sha256:fcfbb44c59af3f8ea984de67ec7c306f618a3ec771c2843804069917a8f2e255",
"sha256:feed85993dbdb1dbc29102f50bca65bdc68f2c0c8d352468c25b54874f23c39d"
],
"version": "==4.5.2"
"version": "==4.7.6"
},
"pathspec": {
"hashes": [
"sha256:7d91249d21749788d07a2d0f94147accd8f845507400749ea19c1ec9054a12b0",
"sha256:da45173eb3a6f2a5a487efba21f050af2b41948be6ab52b6a1e3ff22bb8b7061"
],
"version": "==0.8.0"
},
"protobuf": {
"hashes": [
"sha256:125713564d8cfed7610e52444c9769b8dcb0b55e25cc7841f2290ee7bc86636f",
"sha256:1accdb7a47e51503be64d9a57543964ba674edac103215576399d2d0e34eac77",
"sha256:27003d12d4f68e3cbea9eb67427cab3bfddd47ff90670cb367fcd7a3a89b9657",
"sha256:3264f3c431a631b0b31e9db2ae8c927b79fc1a7b1b06b31e8e5bcf2af91fe896",
"sha256:3c5ab0f5c71ca5af27143e60613729e3488bb45f6d3f143dc918a20af8bab0bf",
"sha256:45dcf8758873e3f69feab075e5f3177270739f146255225474ee0b90429adef6",
"sha256:56a77d61a91186cc5676d8e11b36a5feb513873e4ae88d2ee5cf530d52bbcd3b",
"sha256:5984e4947bbcef5bd849d6244aec507d31786f2dd3344139adc1489fb403b300",
"sha256:6b0441da73796dd00821763bb4119674eaf252776beb50ae3883bed179a60b2a",
"sha256:6f6677c5ade94d4fe75a912926d6796d5c71a2a90c2aeefe0d6f211d75c74789",
"sha256:84a825a9418d7196e2acc48f8746cf1ee75877ed2f30433ab92a133f3eaf8fbe",
"sha256:b842c34fe043ccf78b4a6cf1019d7b80113707d68c88842d061fa2b8fb6ddedc",
"sha256:ca33d2f09dae149a1dcf942d2d825ebb06343b77b437198c9e2ef115cf5d5bc1",
"sha256:db83b5c12c0cd30150bb568e6feb2435c49ce4e68fe2d7b903113f0e221e58fe",
"sha256:f50f3b1c5c1c1334ca7ce9cad5992f098f460ffd6388a3cabad10b66c2006b09",
"sha256:f99f127909731cafb841c52f9216e447d3e4afb99b17bebfad327a75aee206de"
"sha256:04d0b2bd99050d09393875a5a25fd12337b17f3ac2e29c0c1b8e65b277cbfe72",
"sha256:05288e44638e91498f13127a3699a6528dec6f9d3084d60959d721bfb9ea5b98",
"sha256:175d85370947f89e33b3da93f4ccdda3f326bebe3e599df5915ceb7f804cd9df",
"sha256:440a8c77531b3652f24999b249256ed01fd44c498ab0973843066681bd276685",
"sha256:49fb6fab19cd3f30fa0e976eeedcbf2558e9061e5fa65b4fe51ded1f4002e04d",
"sha256:4c7cae1f56056a4a2a2e3b00b26ab8550eae738bd9548f4ea0c2fcb88ed76ae5",
"sha256:519abfacbb421c3591d26e8daf7a4957763428db7267f7207e3693e29f6978db",
"sha256:60f32af25620abc4d7928d8197f9f25d49d558c5959aa1e08c686f974ac0b71a",
"sha256:613ac49f6db266fba243daf60fb32af107cfe3678e5c003bb40a381b6786389d",
"sha256:954bb14816edd24e746ba1a6b2d48c43576393bbde2fb8e1e3bd6d4504c7feac",
"sha256:9b1462c033a2cee7f4e8eb396905c69de2c532c3b835ff8f71f8e5fb77c38023",
"sha256:c0767f4d93ce4288475afe0571663c78870924f1f8881efd5406c10f070c75e4",
"sha256:c45f5980ce32879391144b5766120fd7b8803129f127ce36bd060dd38824801f",
"sha256:eeb7502f59e889a88bcb59f299493e215d1864f3d75335ea04a413004eb4fe24",
"sha256:fdb1742f883ee4662e39fcc5916f2725fec36a5191a52123fec60f8c53b70495",
"sha256:fe554066c4962c2db0a1d4752655223eb948d2bfa0fb1c4a7f2c00ec07324f1c"
],
"index": "pypi",
"version": "==3.10.0"
"version": "==3.12.1"
},
"regex": {
"hashes": [
"sha256:1386e75c9d1574f6aa2e4eb5355374c8e55f9aac97e224a8a5a6abded0f9c927",
"sha256:27ff7325b297fb6e5ebb70d10437592433601c423f5acf86e5bc1ee2919b9561",
"sha256:329ba35d711e3428db6b45a53b1b13a0a8ba07cbbcf10bbed291a7da45f106c3",
"sha256:3a9394197664e35566242686d84dfd264c07b20f93514e2e09d3c2b3ffdf78fe",
"sha256:51f17abbe973c7673a61863516bdc9c0ef467407a940f39501e786a07406699c",
"sha256:579ea215c81d18da550b62ff97ee187b99f1b135fd894a13451e00986a080cad",
"sha256:70c14743320a68c5dac7fc5a0f685be63bc2024b062fe2aaccc4acc3d01b14a1",
"sha256:7e61be8a2900897803c293247ef87366d5df86bf701083b6c43119c7c6c99108",
"sha256:8044d1c085d49673aadb3d7dc20ef5cb5b030c7a4fa253a593dda2eab3059929",
"sha256:89d76ce33d3266173f5be80bd4efcbd5196cafc34100fdab814f9b228dee0fa4",
"sha256:99568f00f7bf820c620f01721485cad230f3fb28f57d8fbf4a7967ec2e446994",
"sha256:a7c37f048ec3920783abab99f8f4036561a174f1314302ccfa4e9ad31cb00eb4",
"sha256:c2062c7d470751b648f1cacc3f54460aebfc261285f14bc6da49c6943bd48bdd",
"sha256:c9bce6e006fbe771a02bda468ec40ffccbf954803b470a0345ad39c603402577",
"sha256:ce367d21f33e23a84fb83a641b3834dd7dd8e9318ad8ff677fbfae5915a239f7",
"sha256:ce450ffbfec93821ab1fea94779a8440e10cf63819be6e176eb1973a6017aff5",
"sha256:ce5cc53aa9fbbf6712e92c7cf268274eaff30f6bd12a0754e8133d85a8fb0f5f",
"sha256:d466967ac8e45244b9dfe302bbe5e3337f8dc4dec8d7d10f5e950d83b140d33a",
"sha256:d881c2e657c51d89f02ae4c21d9adbef76b8325fe4d5cf0e9ad62f850f3a98fd",
"sha256:e565569fc28e3ba3e475ec344d87ed3cd8ba2d575335359749298a0899fe122e",
"sha256:ea55b80eb0d1c3f1d8d784264a6764f931e172480a2f1868f2536444c5f01e01"
],
"version": "==2020.5.14"
},
"six": {
"hashes": [
"sha256:3350809f0555b11f552448330d0b52d5f24c91a322ea4a15ef22629740f3761c",
"sha256:d16a0141ec1a18405cd4ce8b4613101da75da0e9a7aec5bdd4fa804d0e0eba73"
"sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259",
"sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced"
],
"version": "==1.12.0"
"version": "==1.15.0"
},
"stringcase": {
"hashes": [
"sha256:48a06980661908efe8d9d34eab2b6c13aefa2163b3ced26972902e3bdfd87008"
],
"index": "pypi",
"version": "==1.2.0"
},
"toml": {
"hashes": [
"sha256:926b612be1e5ce0634a2ca03470f95169cf16f939018233a670519cb4ac58b0f",
"sha256:bda89d5935c2eac546d648028b9901107a595863cb36bae0c73ac804a9b4ce88"
],
"version": "==0.10.1"
},
"typed-ast": {
"hashes": [
"sha256:0666aa36131496aed8f7be0410ff974562ab7eeac11ef351def9ea6fa28f6355",
"sha256:0c2c07682d61a629b68433afb159376e24e5b2fd4641d35424e462169c0a7919",
"sha256:249862707802d40f7f29f6e1aad8d84b5aa9e44552d2cc17384b209f091276aa",
"sha256:24995c843eb0ad11a4527b026b4dde3da70e1f2d8806c99b7b4a7cf491612652",
"sha256:269151951236b0f9a6f04015a9004084a5ab0d5f19b57de779f908621e7d8b75",
"sha256:4083861b0aa07990b619bd7ddc365eb7fa4b817e99cf5f8d9cf21a42780f6e01",
"sha256:498b0f36cc7054c1fead3d7fc59d2150f4d5c6c56ba7fb150c013fbc683a8d2d",
"sha256:4e3e5da80ccbebfff202a67bf900d081906c358ccc3d5e3c8aea42fdfdfd51c1",
"sha256:6daac9731f172c2a22ade6ed0c00197ee7cc1221aa84cfdf9c31defeb059a907",
"sha256:715ff2f2df46121071622063fc7543d9b1fd19ebfc4f5c8895af64a77a8c852c",
"sha256:73d785a950fc82dd2a25897d525d003f6378d1cb23ab305578394694202a58c3",
"sha256:8c8aaad94455178e3187ab22c8b01a3837f8ee50e09cf31f1ba129eb293ec30b",
"sha256:8ce678dbaf790dbdb3eba24056d5364fb45944f33553dd5869b7580cdbb83614",
"sha256:aaee9905aee35ba5905cfb3c62f3e83b3bec7b39413f0a7f19be4e547ea01ebb",
"sha256:bcd3b13b56ea479b3650b82cabd6b5343a625b0ced5429e4ccad28a8973f301b",
"sha256:c9e348e02e4d2b4a8b2eedb48210430658df6951fa484e59de33ff773fbd4b41",
"sha256:d205b1b46085271b4e15f670058ce182bd1199e56b317bf2ec004b6a44f911f6",
"sha256:d43943ef777f9a1c42bf4e552ba23ac77a6351de620aa9acf64ad54933ad4d34",
"sha256:d5d33e9e7af3b34a40dc05f498939f0ebf187f07c385fd58d591c533ad8562fe",
"sha256:fc0fea399acb12edbf8a628ba8d2312f583bdbdb3335635db062fa98cf71fca4",
"sha256:fe460b922ec15dd205595c9b5b99e2f056fd98ae8f9f56b888e7a17dc2b757e7"
],
"version": "==1.4.1"
}
},
"develop": {
"atomicwrites": {
"hashes": [
"sha256:03472c30eb2c5d1ba9227e4c2ca66ab8287fbfbbda3888aa93dc2e28fc6811b4",
"sha256:75a9445bac02d8d058d5e1fe689654ba5a6556a1dfd8ce6ec55a0ed79866cfa6"
],
"version": "==1.3.0"
},
"attrs": {
"hashes": [
"sha256:ec20e7a4825331c1b5ebf261d111e16fa9612c1f7a5e1f884f12bd53a664dfd2",
"sha256:f913492e1663d3c36f502e5e9ba6cd13cf19d7fab50aa13239e420fef95e1396"
"sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c",
"sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72"
],
"version": "==19.2.0"
},
"entrypoints": {
"hashes": [
"sha256:589f874b313739ad35be6e0cd7efde2a4e9b6fea91edcc34e58ecbb8dbe56d19",
"sha256:c70dd71abe5a8c85e55e12c19bd91ccfeec11a6e99044204511f9ed547d48451"
],
"version": "==0.3"
"version": "==19.3.0"
},
"flake8": {
"hashes": [
"sha256:19241c1cbc971b9962473e4438a2ca19749a7dd002dd1a946eaba171b4114548",
"sha256:8e9dfa3cecb2400b3738a42c54c3043e821682b9c840b0448c0503f781130696"
"sha256:c69ac1668e434d37a2d2880b3ca9aafd54b3a10a3ac1ab101d22f29e29cf8634",
"sha256:ccaa799ef9893cebe69fdfefed76865aeaefbb94cb8545617b2298786a4de9a5"
],
"index": "pypi",
"version": "==3.7.8"
"version": "==3.8.2"
},
"importlib-metadata": {
"hashes": [
"sha256:aa18d7378b00b40847790e7c27e11673d7fed219354109d0e7b9e5b25dc3ad26",
"sha256:d5f18a79777f3aa179c145737780282e27b508fc8fd688cb17c7a813e8bd39af"
"sha256:2a688cbaa90e0cc587f1df48bdc97a6eadccdcd9c35fb3f976a09e3b5016d90f",
"sha256:34513a8a0c4962bc66d35b359558fd8a5e10cd472d37aec5f66858addef32c1e"
],
"markers": "python_version < '3.8'",
"version": "==0.23"
"version": "==1.6.0"
},
"isort": {
"hashes": [
@@ -204,141 +289,156 @@
},
"more-itertools": {
"hashes": [
"sha256:409cd48d4db7052af495b09dec721011634af3753ae1ef92d2b32f73a745f832",
"sha256:92b8c4b06dac4f0611c0729b2f2ede52b2e1bac1ab48f089c7ddc12e26bb60c4"
"sha256:558bb897a2232f5e4f8e2399089e35aecb746e1f9191b6584a151647e89267be",
"sha256:7818f596b1e87be009031c7653d01acc46ed422e6656b394b0f765ce66ed4982"
],
"version": "==7.2.0"
"version": "==8.3.0"
},
"mypy": {
"hashes": [
"sha256:1d98fd818ad3128a5408148c9e4a5edce6ed6b58cc314283e631dd5d9216527b",
"sha256:22ee018e8fc212fe601aba65d3699689dd29a26410ef0d2cc1943de7bec7e3ac",
"sha256:3a24f80776edc706ec8d05329e854d5b9e464cd332e25cde10c8da2da0a0db6c",
"sha256:42a78944e80770f21609f504ca6c8173f7768043205b5ac51c9144e057dcf879",
"sha256:4b2b20106973548975f0c0b1112eceb4d77ed0cafe0a231a1318f3b3a22fc795",
"sha256:591a9625b4d285f3ba69f541c84c0ad9e7bffa7794da3fa0585ef13cf95cb021",
"sha256:5b4b70da3d8bae73b908a90bb2c387b977e59d484d22c604a2131f6f4397c1a3",
"sha256:84edda1ffeda0941b2ab38ecf49302326df79947fa33d98cdcfbf8ca9cf0bb23",
"sha256:b2b83d29babd61b876ae375786960a5374bba0e4aba3c293328ca6ca5dc448dd",
"sha256:cc4502f84c37223a1a5ab700649b5ab1b5e4d2bf2d426907161f20672a21930b",
"sha256:e29e24dd6e7f39f200a5bb55dcaa645d38a397dd5a6674f6042ef02df5795046"
"sha256:15b948e1302682e3682f11f50208b726a246ab4e6c1b39f9264a8796bb416aa2",
"sha256:219a3116ecd015f8dca7b5d2c366c973509dfb9a8fc97ef044a36e3da66144a1",
"sha256:3b1fc683fb204c6b4403a1ef23f0b1fac8e4477091585e0c8c54cbdf7d7bb164",
"sha256:3beff56b453b6ef94ecb2996bea101a08f1f8a9771d3cbf4988a61e4d9973761",
"sha256:7687f6455ec3ed7649d1ae574136835a4272b65b3ddcf01ab8704ac65616c5ce",
"sha256:7ec45a70d40ede1ec7ad7f95b3c94c9cf4c186a32f6bacb1795b60abd2f9ef27",
"sha256:86c857510a9b7c3104cf4cde1568f4921762c8f9842e987bc03ed4f160925754",
"sha256:8a627507ef9b307b46a1fea9513d5c98680ba09591253082b4c48697ba05a4ae",
"sha256:8dfb69fbf9f3aeed18afffb15e319ca7f8da9642336348ddd6cab2713ddcf8f9",
"sha256:a34b577cdf6313bf24755f7a0e3f3c326d5c1f4fe7422d1d06498eb25ad0c600",
"sha256:a8ffcd53cb5dfc131850851cc09f1c44689c2812d0beb954d8138d4f5fc17f65",
"sha256:b90928f2d9eb2f33162405f32dde9f6dcead63a0971ca8a1b50eb4ca3e35ceb8",
"sha256:c56ffe22faa2e51054c5f7a3bc70a370939c2ed4de308c690e7949230c995913",
"sha256:f91c7ae919bbc3f96cd5e5b2e786b2b108343d1d7972ea130f7de27fdd547cf3"
],
"index": "pypi",
"version": "==0.730"
"version": "==0.770"
},
"mypy-extensions": {
"hashes": [
"sha256:a161e3b917053de87dbe469987e173e49fb454eca10ef28b48b384538cc11458"
"sha256:090fedd75945a69ae91ce1303b5824f428daf5a028d2f6ab8a299250a846f15d",
"sha256:2d82818f5bb3e369420cb3c4060a7970edba416647068eb4c5343488a6c604a8"
],
"version": "==0.4.2"
"version": "==0.4.3"
},
"packaging": {
"hashes": [
"sha256:28b924174df7a2fa32c1953825ff29c61e2f5e082343165438812f00d3a7fc47",
"sha256:d9551545c6d761f3def1677baf08ab2a3ca17c56879e70fecba2fc4dde4ed108"
"sha256:4357f74f47b9c12db93624a82154e9b120fa8293699949152b22065d556079f8",
"sha256:998416ba6962ae7fbd6596850b80e17859a5753ba17c32284f67bfff33784181"
],
"version": "==19.2"
"version": "==20.4"
},
"pluggy": {
"hashes": [
"sha256:0db4b7601aae1d35b4a033282da476845aa19185c1e6964b25cf324b5e4ec3e6",
"sha256:fa5fa1622fa6dd5c030e9cad086fa19ef6a0cf6d7a2d12318e10cb49d6d68f34"
"sha256:15b2acde666561e1298d71b523007ed7364de07029219b604cf808bfa1c765b0",
"sha256:966c145cd83c96502c3c3868f50408687b38434af77734af1e9ca461a4081d2d"
],
"version": "==0.13.0"
"version": "==0.13.1"
},
"py": {
"hashes": [
"sha256:64f65755aee5b381cea27766a3a147c3f15b9b6b9ac88676de66ba2ae36793fa",
"sha256:dc639b046a6e2cff5bbe40194ad65936d6ba360b52b3c3fe1d08a82dd50b5e53"
"sha256:5e27081401262157467ad6e7f851b7aa402c5852dbcb3dae06768434de5752aa",
"sha256:c20fdd83a5dbc0af9efd622bee9a5564e278f6380fffcacc43ba6f43db2813b0"
],
"version": "==1.8.0"
"version": "==1.8.1"
},
"pycodestyle": {
"hashes": [
"sha256:95a2219d12372f05704562a14ec30bc76b05a5b297b21a5dfe3f6fac3491ae56",
"sha256:e40a936c9a450ad81df37f549d676d127b1b66000a6c500caa2b085bc0ca976c"
"sha256:2295e7b2f6b5bd100585ebcb1f616591b652db8a741695b3d8f5d28bdc934367",
"sha256:c58a7d2815e0e8d7972bf1803331fb0152f867bd89adf8a01dfd55085434192e"
],
"version": "==2.5.0"
"version": "==2.6.0"
},
"pyflakes": {
"hashes": [
"sha256:17dbeb2e3f4d772725c777fabc446d5634d1038f234e77343108ce445ea69ce0",
"sha256:d976835886f8c5b31d47970ed689944a0262b5f3afa00a5a7b4dc81e5449f8a2"
"sha256:0d94e0e05a19e57a99444b6ddcf9a6eb2e5c68d3ca1e98e90707af8152c90a92",
"sha256:35b2d75ee967ea93b55750aa9edbbf72813e06a66ba54438df2cfac9e3c27fc8"
],
"version": "==2.1.1"
"version": "==2.2.0"
},
"pyparsing": {
"hashes": [
"sha256:6f98a7b9397e206d78cc01df10131398f1c8b8510a2f4d97d9abd82e1aacdd80",
"sha256:d9338df12903bbf5d65a0e4e87c2161968b10d2e489652bb47001d82a9b028b4"
"sha256:67199f0c41a9c702154efb0e7a8cc08accf830eb003b4d9fa42c4059002e2492",
"sha256:700d17888d441604b0bd51535908dcb297561b040819cccde647a92439db5a2a"
],
"version": "==2.4.2"
"version": "==3.0.0a1"
},
"pytest": {
"hashes": [
"sha256:7e4800063ccfc306a53c461442526c5571e1462f61583506ce97e4da6a1d88c8",
"sha256:ca563435f4941d0cb34767301c27bc65c510cb82e90b9ecf9cb52dc2c63caaa0"
"sha256:95c710d0a72d91c13fae35dce195633c929c3792f54125919847fdcdf7caa0d3",
"sha256:eb2b5e935f6a019317e455b6da83dd8650ac9ffd2ee73a7b657a30873d67a698"
],
"index": "pypi",
"version": "==5.2.1"
"version": "==5.4.2"
},
"pytest-asyncio": {
"hashes": [
"sha256:475bd2f3dc0bc11d2463656b3cbaafdbec5a47b47508ea0b329ee693040eebd2"
],
"index": "pypi",
"version": "==0.12.0"
},
"rope": {
"hashes": [
"sha256:6b728fdc3e98a83446c27a91fc5d56808a004f8beab7a31ab1d7224cecc7d969",
"sha256:c5c5a6a87f7b1a2095fb311135e2a3d1f194f5ecb96900fdd0a9100881f48aaf",
"sha256:f0dcf719b63200d492b85535ebe5ea9b29e0d0b8aebeb87fe03fc1a65924fdaf"
"sha256:658ad6705f43dcf3d6df379da9486529cf30e02d9ea14c5682aa80eb33b649e1"
],
"index": "pypi",
"version": "==0.14.0"
"version": "==0.17.0"
},
"six": {
"hashes": [
"sha256:3350809f0555b11f552448330d0b52d5f24c91a322ea4a15ef22629740f3761c",
"sha256:d16a0141ec1a18405cd4ce8b4613101da75da0e9a7aec5bdd4fa804d0e0eba73"
"sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259",
"sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced"
],
"version": "==1.12.0"
"version": "==1.15.0"
},
"typed-ast": {
"hashes": [
"sha256:18511a0b3e7922276346bcb47e2ef9f38fb90fd31cb9223eed42c85d1312344e",
"sha256:262c247a82d005e43b5b7f69aff746370538e176131c32dda9cb0f324d27141e",
"sha256:2b907eb046d049bcd9892e3076c7a6456c93a25bebfe554e931620c90e6a25b0",
"sha256:354c16e5babd09f5cb0ee000d54cfa38401d8b8891eefa878ac772f827181a3c",
"sha256:4e0b70c6fc4d010f8107726af5fd37921b666f5b31d9331f0bd24ad9a088e631",
"sha256:630968c5cdee51a11c05a30453f8cd65e0cc1d2ad0d9192819df9978984529f4",
"sha256:66480f95b8167c9c5c5c87f32cf437d585937970f3fc24386f313a4c97b44e34",
"sha256:71211d26ffd12d63a83e079ff258ac9d56a1376a25bc80b1cdcdf601b855b90b",
"sha256:95bd11af7eafc16e829af2d3df510cecfd4387f6453355188342c3e79a2ec87a",
"sha256:bc6c7d3fa1325a0c6613512a093bc2a2a15aeec350451cbdf9e1d4bffe3e3233",
"sha256:cc34a6f5b426748a507dd5d1de4c1978f2eb5626d51326e43280941206c209e1",
"sha256:d755f03c1e4a51e9b24d899561fec4ccaf51f210d52abdf8c07ee2849b212a36",
"sha256:d7c45933b1bdfaf9f36c579671fec15d25b06c8398f113dab64c18ed1adda01d",
"sha256:d896919306dd0aa22d0132f62a1b78d11aaf4c9fc5b3410d3c666b818191630a",
"sha256:ffde2fbfad571af120fcbfbbc61c72469e72f550d676c3342492a9dfdefb8f12"
"sha256:0666aa36131496aed8f7be0410ff974562ab7eeac11ef351def9ea6fa28f6355",
"sha256:0c2c07682d61a629b68433afb159376e24e5b2fd4641d35424e462169c0a7919",
"sha256:249862707802d40f7f29f6e1aad8d84b5aa9e44552d2cc17384b209f091276aa",
"sha256:24995c843eb0ad11a4527b026b4dde3da70e1f2d8806c99b7b4a7cf491612652",
"sha256:269151951236b0f9a6f04015a9004084a5ab0d5f19b57de779f908621e7d8b75",
"sha256:4083861b0aa07990b619bd7ddc365eb7fa4b817e99cf5f8d9cf21a42780f6e01",
"sha256:498b0f36cc7054c1fead3d7fc59d2150f4d5c6c56ba7fb150c013fbc683a8d2d",
"sha256:4e3e5da80ccbebfff202a67bf900d081906c358ccc3d5e3c8aea42fdfdfd51c1",
"sha256:6daac9731f172c2a22ade6ed0c00197ee7cc1221aa84cfdf9c31defeb059a907",
"sha256:715ff2f2df46121071622063fc7543d9b1fd19ebfc4f5c8895af64a77a8c852c",
"sha256:73d785a950fc82dd2a25897d525d003f6378d1cb23ab305578394694202a58c3",
"sha256:8c8aaad94455178e3187ab22c8b01a3837f8ee50e09cf31f1ba129eb293ec30b",
"sha256:8ce678dbaf790dbdb3eba24056d5364fb45944f33553dd5869b7580cdbb83614",
"sha256:aaee9905aee35ba5905cfb3c62f3e83b3bec7b39413f0a7f19be4e547ea01ebb",
"sha256:bcd3b13b56ea479b3650b82cabd6b5343a625b0ced5429e4ccad28a8973f301b",
"sha256:c9e348e02e4d2b4a8b2eedb48210430658df6951fa484e59de33ff773fbd4b41",
"sha256:d205b1b46085271b4e15f670058ce182bd1199e56b317bf2ec004b6a44f911f6",
"sha256:d43943ef777f9a1c42bf4e552ba23ac77a6351de620aa9acf64ad54933ad4d34",
"sha256:d5d33e9e7af3b34a40dc05f498939f0ebf187f07c385fd58d591c533ad8562fe",
"sha256:fc0fea399acb12edbf8a628ba8d2312f583bdbdb3335635db062fa98cf71fca4",
"sha256:fe460b922ec15dd205595c9b5b99e2f056fd98ae8f9f56b888e7a17dc2b757e7"
],
"version": "==1.4.0"
"version": "==1.4.1"
},
"typing-extensions": {
"hashes": [
"sha256:2ed632b30bb54fc3941c382decfd0ee4148f5c591651c9272473fea2c6397d95",
"sha256:b1edbbf0652660e32ae780ac9433f4231e7339c7f9a8057d0f042fcbcea49b87",
"sha256:d8179012ec2c620d3791ca6fe2bf7979d979acdbef1fca0bc56b37411db682ed"
"sha256:6e95524d8a547a91e08f404ae485bbb71962de46967e1b71a0cb89af24e761c5",
"sha256:79ee589a3caca649a9bfd2a8de4709837400dfa00b6cc81962a1e6a1815969ae",
"sha256:f8d2bd89d25bc39dabe7d23df520442fa1d8969b82544370e03d88b5a591c392"
],
"version": "==3.7.4"
"version": "==3.7.4.2"
},
"wcwidth": {
"hashes": [
"sha256:3df37372226d6e63e1b1e1eda15c594bca98a22d33a23832a90998faa96bc65e",
"sha256:f4ebe71925af7b40a864553f761ed559b43544f8f71746c2d756c7fe788ade7c"
"sha256:cafe2186b3c009a04067022ce1dcd79cb38d8d65ee4f4791b8888d6599d1bbe1",
"sha256:ee73862862a156bf77ff92b09034fc4825dd3af9cf81bc5b360668d425f3c5f1"
],
"version": "==0.1.7"
"version": "==0.1.9"
},
"zipp": {
"hashes": [
"sha256:3718b1cbcd963c7d4c5511a8240812904164b7f381b647143a89d3b98f9bcd8e",
"sha256:f06903e9f1f43b12d371004b4ac7b06ab39a44adc747266928ae6debfa7b3335"
"sha256:aa36550ff0c0b7ef7fa639055d797116ee891440eac1a56f378e2d3179e0320b",
"sha256:c599e4d75c98f6798c509911d08a22e6c021d074469042177c8c86fb92eefd96"
],
"version": "==0.6.0"
"version": "==3.1.0"
}
}
}

README.md

@@ -2,14 +2,15 @@
![](https://github.com/danielgtaylor/python-betterproto/workflows/CI/badge.svg)
This project aims to provide an improved experience when using Protobuf / gRPC in a modern Python environment by making use of modern language features and generating readable, understandable, idiomatic Python code. It will not support legacy features or environments. The following are supported:
This project aims to provide an improved experience when using Protobuf / gRPC in a modern Python environment by making use of modern language features and generating readable, understandable, idiomatic Python code. It will not support legacy features or environments (e.g. Protobuf 2). The following are supported:
- Protobuf 3 & gRPC code generation
- Both binary & JSON serialization is built-in
- Python 3.7+ making use of:
- Python 3.6+ making use of:
- Enums
- Dataclasses
- `async`/`await`
- Timezone-aware `datetime` and `timedelta` objects
- Relative imports
- Mypy type checking
@@ -34,6 +35,8 @@ This project exists because I am unhappy with the state of the official Google p
- Much code looks like C++ or Java ported 1:1 to Python
- Capitalized function names like `HasField()` and `SerializeToString()`
- Uses `SerializeToString()` rather than the built-in `__bytes__()`
- Special wrapped types don't use Python's `None`
- Timestamp/duration types don't use Python's built-in `datetime` module
This project is a reimplementation from the ground up focused on idiomatic modern Python to help fix some of the above. While it may not be a 1:1 drop-in replacement due to changed method names and call patterns, the wire format is identical.
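To make the difference concrete, here is a minimal sketch of the resulting calling convention (the module and message names are hypothetical; serialization goes through the built-in `__bytes__()` and a `parse()` method on the generated dataclass):

```py
>>> from my_package import MyMessage   # hypothetical generated module
>>> data = bytes(MyMessage(value=42))  # serialize via the built-in __bytes__()
>>> MyMessage().parse(data)            # parse the binary wire format back
MyMessage(value=42)
```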
@@ -43,7 +46,7 @@ First, install the package. Note that the `[compiler]` feature flag tells it to
```sh
# Install both the library and compiler
$ pip install betterproto[compiler]
$ pip install "betterproto[compiler]"
# Install just the library (to use the generated code output)
$ pip install betterproto
@@ -155,7 +158,7 @@ You can use it like so (enable async in the interactive shell first):
EchoResponse(values=["hello", "hello"])
>>> async for response in service.echo_stream(value="hello", extra_times=1)
print(response)
print(response)
EchoStreamResponse(value="hello")
EchoStreamResponse(value="hello")
@@ -168,6 +171,12 @@ Both serializing and parsing are supported to/from JSON and Python dictionaries
- Dicts: `Message().to_dict()`, `Message().from_dict(...)`
- JSON: `Message().to_json()`, `Message().from_json(...)`
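A quick round trip illustrates the calling convention (a minimal sketch; `Greeting` is a hypothetical generated message with a single string field, and `from_dict`/`from_json` populate the instance in place and return it):

```py
>>> g = Greeting(message="hello")
>>> g.to_dict()
{'message': 'hello'}
>>> g.to_json()
'{"message": "hello"}'
>>> Greeting().from_dict({"message": "hello"})
Greeting(message='hello')
```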
For compatibility the default is to convert field names to `camelCase`. You can control this behavior by passing a casing value, e.g:
```py
>>> MyMessage().to_dict(casing=betterproto.Casing.SNAKE)
```
### Determining if a message was sent
Sometimes it is useful to be able to determine whether a message has been sent on the wire. This is how the Google wrapper types work to let you know whether a value is unset, set as the default (zero value), or set as something else, for example.
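For instance (a minimal sketch, assuming a hypothetical `Outer` message with a nested `inner` message field; betterproto exposes a `serialized_on_wire()` helper for this check):

```py
>>> import betterproto
>>> outer = Outer()
>>> betterproto.serialized_on_wire(outer.inner)  # never set and never received
False
>>> outer.inner.value = 1                        # assigning any field marks it as set
>>> betterproto.serialized_on_wire(outer.inner)
True
```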
@@ -238,9 +247,56 @@ Again this is a little different than the official Google code generator:
["foo", "foo's value"]
```
### Well-Known Google Types
Google provides several well-known message types like a timestamp, duration, and several wrappers used to provide optional zero value support. Each of these has a special JSON representation and is handled a little differently from normal messages. The Python mapping for these is as follows:
| Google Message | Python Type | Default |
| --------------------------- | ---------------------------------------- | ---------------------- |
| `google.protobuf.duration` | [`datetime.timedelta`][td] | `0` |
| `google.protobuf.timestamp` | Timezone-aware [`datetime.datetime`][dt] | `1970-01-01T00:00:00Z` |
| `google.protobuf.*Value` | `Optional[...]` | `None` |
[td]: https://docs.python.org/3/library/datetime.html#timedelta-objects
[dt]: https://docs.python.org/3/library/datetime.html#datetime.datetime
For the wrapper types, the Python type corresponds to the wrapped type, e.g. `google.protobuf.BoolValue` becomes `Optional[bool]` while `google.protobuf.Int32Value` becomes `Optional[int]`. All of the optional values default to `None`, so don't forget to check for that possible state. Given:
```protobuf
syntax = "proto3";
import "google/protobuf/duration.proto";
import "google/protobuf/timestamp.proto";
import "google/protobuf/wrappers.proto";
message Test {
google.protobuf.BoolValue maybe = 1;
google.protobuf.Timestamp ts = 2;
google.protobuf.Duration duration = 3;
}
```
You can do stuff like:
```py
>>> t = Test().from_dict({"maybe": True, "ts": "2019-01-01T12:00:00Z", "duration": "1.200s"})
>>> t
Test(maybe=True, ts=datetime.datetime(2019, 1, 1, 12, 0, tzinfo=datetime.timezone.utc), duration=datetime.timedelta(seconds=1, microseconds=200000))
>>> t.ts - t.duration
datetime.datetime(2019, 1, 1, 11, 59, 58, 800000, tzinfo=datetime.timezone.utc)
>>> t.ts.isoformat()
'2019-01-01T12:00:00+00:00'
>>> t.maybe = None
>>> t.to_dict()
{'ts': '2019-01-01T12:00:00Z', 'duration': '1.200s'}
```
## Development
First, make sure you have Python 3.7+ and `pipenv` installed, along with the official [Protobuf Compiler](https://github.com/protocolbuffers/protobuf/releases) for your platform. Then:
First, make sure you have Python 3.6+ and `pipenv` installed, along with the official [Protobuf Compiler](https://github.com/protocolbuffers/protobuf/releases) for your platform. Then:
```sh
# Get set up with the virtual env & dependencies
@@ -251,14 +307,42 @@ $ pipenv shell
$ pip install -e .
```
### Code style
This project enforces [black](https://github.com/psf/black) python code formatting.
Before committing changes, run:
```bash
pipenv run black .
```
To avoid merge conflicts later, CI will fail on any Python code that is not formatted with black.
### Tests
There are two types of tests:
1. Manually-written tests for some behavior of the library
2. Proto files and JSON inputs for automated tests
1. Standard tests
2. Custom tests
For #2, you can add a new `*.proto` file into the `betterproto/tests` directory along with a sample `*.json` input and it will get automatically picked up.
#### Standard tests
Adding a standard test case is easy.
- Create a new directory `betterproto/tests/inputs/<name>`
- add `<name>.proto` with a message called `Test`
- add `<name>.json` with some test data
It will be picked up automatically when you run the tests.
- See also: [Standard Tests Development Guide](betterproto/tests/README.md)
#### Custom tests
Custom tests are found in `tests/test_*.py` and are run with pytest.
#### Running
Here's how to run the tests.
@@ -266,8 +350,8 @@ Here's how to run the tests.
# Generate assets from sample .proto files
$ pipenv run generate
# Run the tests
$ pipenv run tests
# Run all tests
$ pipenv run test
```
### TODO
@@ -284,6 +368,9 @@ $ pipenv run tests
- [x] Refs to nested types
- [x] Imports in proto files
- [x] Well-known Google types
- [ ] Support as request input
- [ ] Support as response output
- [ ] Automatically wrap/unwrap responses
- [x] OneOf support
- [x] Basic support on the wire
- [x] Check which was set from the group
@@ -295,14 +382,14 @@ $ pipenv run tests
- [x] Bytes as base64
- [ ] Any support
- [x] Enum strings
- [ ] Well known types support (timestamp, duration, wrappers)
- [ ] Support different casing (orig vs. camel vs. others?)
- [x] Well known types support (timestamp, duration, wrappers)
- [x] Support different casing (orig vs. camel vs. others?)
- [ ] Async service stubs
- [x] Unary-unary
- [x] Server streaming response
- [ ] Client streaming request
- [ ] Renaming messages and fields to conform to Python name standards
- [ ] Renaming clashes with language keywords and standard library top-level packages
- [x] Renaming messages and fields to conform to Python name standards
- [x] Renaming clashes with language keywords
- [x] Python package
- [x] Automate running tests
- [ ] Cleanup!

@@ -3,16 +3,20 @@ import enum
import inspect
import json
import struct
import sys
from abc import ABC
from base64 import b64encode, b64decode
from datetime import datetime, timedelta, timezone
from typing import (
Any,
AsyncGenerator,
Callable,
Collection,
Dict,
Generator,
Iterable,
List,
Mapping,
Optional,
SupportsBytes,
Tuple,
@@ -20,10 +24,26 @@ from typing import (
TypeVar,
Union,
get_type_hints,
TYPE_CHECKING,
)
import grpclib.client
import grpclib.const
import stringcase
from .casing import safe_snake_case
if TYPE_CHECKING:
from grpclib._protocols import IProtoMessage
from grpclib.client import Channel
from grpclib.metadata import Deadline
if not (sys.version_info.major == 3 and sys.version_info.minor >= 7):
# Apply backport of datetime.fromisoformat from 3.7
from backports.datetime_fromisoformat import MonkeyPatch
MonkeyPatch.patch_fromisoformat()
# Proto 3 data types
TYPE_ENUM = "enum"
@@ -101,6 +121,21 @@ WIRE_FIXED_64_TYPES = [TYPE_DOUBLE, TYPE_FIXED64, TYPE_SFIXED64]
WIRE_LEN_DELIM_TYPES = [TYPE_STRING, TYPE_BYTES, TYPE_MESSAGE, TYPE_MAP]
# Protobuf datetimes start at the Unix Epoch in 1970 in UTC.
def datetime_default_gen():
return datetime(1970, 1, 1, tzinfo=timezone.utc)
DATETIME_ZERO = datetime_default_gen()
class Casing(enum.Enum):
"""Casing constants for serialization."""
CAMEL = stringcase.camelcase
SNAKE = stringcase.snakecase
class _PLACEHOLDER:
pass
@@ -108,18 +143,6 @@ class _PLACEHOLDER:
PLACEHOLDER: Any = _PLACEHOLDER()
def get_default(proto_type: str) -> Any:
"""Get the default (zero value) for a given type."""
return {
TYPE_BOOL: False,
TYPE_FLOAT: 0.0,
TYPE_DOUBLE: 0.0,
TYPE_STRING: "",
TYPE_BYTES: b"",
TYPE_MAP: {},
}.get(proto_type, 0)
@dataclasses.dataclass(frozen=True)
class FieldMetadata:
"""Stores internal metadata used for parsing & serialization."""
@@ -129,9 +152,11 @@ class FieldMetadata:
# Protobuf type name
proto_type: str
# Map information if the proto_type is a map
map_types: Optional[Tuple[str, str]]
map_types: Optional[Tuple[str, str]] = None
# Groups several "one-of" fields together
group: Optional[str]
group: Optional[str] = None
# Describes the wrapped type (e.g. when using google.protobuf.BoolValue)
wraps: Optional[str] = None
@staticmethod
def get(field: dataclasses.Field) -> "FieldMetadata":
@@ -145,11 +170,14 @@ def dataclass_field(
*,
map_types: Optional[Tuple[str, str]] = None,
group: Optional[str] = None,
wraps: Optional[str] = None,
) -> dataclasses.Field:
"""Creates a dataclass field with attached protobuf metadata."""
return dataclasses.field(
default=PLACEHOLDER,
metadata={"betterproto": FieldMetadata(number, proto_type, map_types, group)},
metadata={
"betterproto": FieldMetadata(number, proto_type, map_types, group, wraps)
},
)
@@ -222,8 +250,10 @@ def bytes_field(number: int, group: Optional[str] = None) -> Any:
return dataclass_field(number, TYPE_BYTES, group=group)
def message_field(number: int, group: Optional[str] = None) -> Any:
return dataclass_field(number, TYPE_MESSAGE, group=group)
def message_field(
number: int, group: Optional[str] = None, wraps: Optional[str] = None
) -> Any:
return dataclass_field(number, TYPE_MESSAGE, group=group, wraps=wraps)
def map_field(
@@ -274,7 +304,7 @@ def encode_varint(value: int) -> bytes:
return bytes(b + [bits])
def _preprocess_single(proto_type: str, value: Any) -> bytes:
def _preprocess_single(proto_type: str, wraps: str, value: Any) -> bytes:
"""Adjusts values before serialization."""
if proto_type in [
TYPE_ENUM,
@@ -297,16 +327,37 @@ def _preprocess_single(proto_type: str, value: Any) -> bytes:
elif proto_type == TYPE_STRING:
return value.encode("utf-8")
elif proto_type == TYPE_MESSAGE:
if isinstance(value, datetime):
# Convert the `datetime` to a timestamp message.
seconds = int(value.timestamp())
nanos = int(value.microsecond * 1e3)
value = _Timestamp(seconds=seconds, nanos=nanos)
elif isinstance(value, timedelta):
# Convert the `timedelta` to a duration message.
total_ms = value // timedelta(microseconds=1)
seconds = int(total_ms / 1e6)
nanos = int((total_ms % 1e6) * 1e3)
value = _Duration(seconds=seconds, nanos=nanos)
elif wraps:
if value is None:
return b""
value = _get_wrapper(wraps)(value=value)
return bytes(value)
return value
def _serialize_single(
field_number: int, proto_type: str, value: Any, *, serialize_empty: bool = False
field_number: int,
proto_type: str,
value: Any,
*,
serialize_empty: bool = False,
wraps: str = "",
) -> bytes:
"""Serializes a single field and value."""
value = _preprocess_single(proto_type, value)
value = _preprocess_single(proto_type, wraps, value)
output = b""
if proto_type in WIRE_VARINT_TYPES:
@@ -319,7 +370,7 @@ def _serialize_single(
key = encode_varint((field_number << 3) | 1)
output += key + value
elif proto_type in WIRE_LEN_DELIM_TYPES:
if len(value) or serialize_empty:
if len(value) or serialize_empty or wraps:
key = encode_varint((field_number << 3) | 2)
output += key + encode_varint(len(value)) + value
else:
@@ -359,7 +410,6 @@ def parse_fields(value: bytes) -> Generator[ParsedField, None, None]:
while i < len(value):
start = i
num_wire, i = decode_varint(value, i)
# print(num_wire, i)
number = num_wire >> 3
wire_type = num_wire & 0x7
@@ -375,8 +425,6 @@ def parse_fields(value: bytes) -> Generator[ParsedField, None, None]:
elif wire_type == 5:
decoded, i = value[i : i + 4], i + 4
# print(ParsedField(number=number, wire_type=wire_type, value=decoded))
yield ParsedField(
number=number, wire_type=wire_type, value=decoded, raw=value[start:i]
)
@@ -386,6 +434,63 @@ def parse_fields(value: bytes) -> Generator[ParsedField, None, None]:
T = TypeVar("T", bound="Message")
class ProtoClassMetadata:
cls: Type["Message"]
def __init__(self, cls: Type["Message"]):
self.cls = cls
by_field = {}
by_group = {}
for field in dataclasses.fields(cls):
meta = FieldMetadata.get(field)
if meta.group:
# This is part of a one-of group.
by_field[field.name] = meta.group
by_group.setdefault(meta.group, set()).add(field)
self.oneof_group_by_field = by_field
self.oneof_field_by_group = by_group
self.init_default_gen()
self.init_cls_by_field()
def init_default_gen(self):
default_gen = {}
for field in dataclasses.fields(self.cls):
meta = FieldMetadata.get(field)
default_gen[field.name] = self.cls._get_field_default_gen(field, meta)
self.default_gen = default_gen
def init_cls_by_field(self):
field_cls = {}
for field in dataclasses.fields(self.cls):
meta = FieldMetadata.get(field)
if meta.proto_type == TYPE_MAP:
assert meta.map_types
kt = self.cls._cls_for(field, index=0)
vt = self.cls._cls_for(field, index=1)
Entry = dataclasses.make_dataclass(
"Entry",
[
("key", kt, dataclass_field(1, meta.map_types[0])),
("value", vt, dataclass_field(2, meta.map_types[1])),
],
bases=(Message,),
)
field_cls[field.name] = Entry
field_cls[field.name + ".value"] = vt
else:
field_cls[field.name] = self.cls._cls_for(field)
self.cls_by_field = field_cls
class Message(ABC):
"""
A protobuf message base class. Generated code will inherit from this and
@@ -393,33 +498,37 @@ class Message(ABC):
to go between Python, binary and JSON protobuf message representations.
"""
_serialized_on_wire: bool
_unknown_fields: bytes
_group_map: Dict[str, dict]
def __post_init__(self) -> None:
# Keep track of whether every field was default
all_sentinel = True
# Set a default value for each field in the class after `__init__` has
# already been run.
group_map = {"fields": {}, "groups": {}}
group_map: Dict[str, dataclasses.Field] = {}
for field in dataclasses.fields(self):
meta = FieldMetadata.get(field)
if meta.group:
group_map["fields"][field.name] = meta.group
if meta.group not in group_map["groups"]:
group_map["groups"][meta.group] = {"current": None, "fields": set()}
group_map["groups"][meta.group]["fields"].add(field)
group_map.setdefault(meta.group)
if getattr(self, field.name) != PLACEHOLDER:
# Skip anything not set to the sentinel value
all_sentinel = False
if meta.group:
# This was set, so make it the selected value of the one-of.
group_map["groups"][meta.group]["current"] = field
group_map[meta.group] = field
continue
setattr(self, field.name, self._get_field_default(field, meta))
# Now that all the defaults are set, reset it!
self.__dict__["_serialized_on_wire"] = False
self.__dict__["_serialized_on_wire"] = not all_sentinel
self.__dict__["_unknown_fields"] = b""
self.__dict__["_group_map"] = group_map
@@ -428,19 +537,33 @@ class Message(ABC):
# Track when a field has been set.
self.__dict__["_serialized_on_wire"] = True
if attr in getattr(self, "_group_map", {}).get("fields", {}):
group = self._group_map["fields"][attr]
for field in self._group_map["groups"][group]["fields"]:
if field.name == attr:
self._group_map["groups"][group]["current"] = field
else:
super().__setattr__(
field.name,
self._get_field_default(field, FieldMetadata.get(field)),
)
if hasattr(self, "_group_map"): # __post_init__ had already run
if attr in self._betterproto.oneof_group_by_field:
group = self._betterproto.oneof_group_by_field[attr]
for field in self._betterproto.oneof_field_by_group[group]:
if field.name == attr:
self._group_map[group] = field
else:
super().__setattr__(
field.name,
self._get_field_default(field, FieldMetadata.get(field)),
)
super().__setattr__(attr, value)
@property
def _betterproto(self):
"""
Lazy initialize metadata for each protobuf class.
It may be initialized multiple times in a multi-threaded environment,
but that won't affect the correctness.
"""
meta = getattr(self.__class__, "_betterproto_meta", None)
if not meta:
meta = ProtoClassMetadata(self.__class__)
self.__class__._betterproto_meta = meta
return meta
def __bytes__(self) -> bytes:
"""
Get the binary encoded Protobuf representation of this instance.
@@ -450,49 +573,60 @@ class Message(ABC):
meta = FieldMetadata.get(field)
value = getattr(self, field.name)
if value is None:
# Optional items should be skipped. This is used for the Google
# wrapper types.
continue
# Being selected in a group means this field is the one that is
# currently set in a `oneof` group, so it must be serialized even
# if the value is the default zero value.
selected_in_group = False
if meta.group and self._group_map["groups"][meta.group]["current"] == field:
if meta.group and self._group_map[meta.group] == field:
selected_in_group = True
if isinstance(value, list):
if not len(value) and not selected_in_group:
# Empty values are not serialized
continue
serialize_empty = False
if isinstance(value, Message) and value._serialized_on_wire:
# Empty messages can still be sent on the wire if they were
# set (or received empty).
serialize_empty = True
if value == self._get_field_default(field, meta) and not (
selected_in_group or serialize_empty
):
# Default (zero) values are not serialized. Two exceptions are
# if this is the selected oneof item or if we know we have to
# serialize an empty message (i.e. zero value was explicitly
# set by the user).
continue
if isinstance(value, list):
if meta.proto_type in PACKED_TYPES:
# Packed lists look like a length-delimited field. First,
# preprocess/encode each value into a buffer and then
# treat it like a field of raw bytes.
buf = b""
for item in value:
buf += _preprocess_single(meta.proto_type, item)
buf += _preprocess_single(meta.proto_type, "", item)
output += _serialize_single(meta.number, TYPE_BYTES, buf)
else:
for item in value:
output += _serialize_single(meta.number, meta.proto_type, item)
output += _serialize_single(
meta.number, meta.proto_type, item, wraps=meta.wraps or ""
)
elif isinstance(value, dict):
if not len(value) and not selected_in_group:
# Empty values are not serialized
continue
for k, v in value.items():
assert meta.map_types
sk = _serialize_single(1, meta.map_types[0], k)
sv = _serialize_single(2, meta.map_types[1], v)
output += _serialize_single(meta.number, meta.proto_type, sk + sv)
else:
if value == get_default(meta.proto_type) and not selected_in_group:
# Default (zero) values are not serialized
continue
serialize_empty = False
if isinstance(value, Message) and value._serialized_on_wire:
serialize_empty = True
output += _serialize_single(
meta.number, meta.proto_type, value, serialize_empty=serialize_empty
meta.number,
meta.proto_type,
value,
serialize_empty=serialize_empty,
wraps=meta.wraps or "",
)
return output + self._unknown_fields
@@ -500,32 +634,52 @@ class Message(ABC):
# For compatibility with other libraries
SerializeToString = __bytes__
def _cls_for(self, field: dataclasses.Field, index: int = 0) -> Type:
@classmethod
def _type_hint(cls, field_name: str) -> Type:
module = inspect.getmodule(cls)
type_hints = get_type_hints(cls, vars(module))
return type_hints[field_name]
@classmethod
def _cls_for(cls, field: dataclasses.Field, index: int = 0) -> Type:
"""Get the message class for a field from the type hints."""
module = inspect.getmodule(self.__class__)
type_hints = get_type_hints(self.__class__, vars(module))
cls = type_hints[field.name]
if hasattr(cls, "__args__") and index >= 0:
cls = type_hints[field.name].__args__[index]
return cls
field_cls = cls._type_hint(field.name)
if hasattr(field_cls, "__args__") and index >= 0:
field_cls = field_cls.__args__[index]
return field_cls
def _get_field_default(self, field: dataclasses.Field, meta: FieldMetadata) -> Any:
t = self._cls_for(field, index=-1)
return self._betterproto.default_gen[field.name]()
value: Any = 0
if meta.proto_type == TYPE_MAP:
# Maps cannot be repeated, so we check these first.
value = {}
elif hasattr(t, "__args__") and len(t.__args__) == 1:
# Anything else with type args is a list.
value = []
elif meta.proto_type == TYPE_MESSAGE:
# Message means creating an instance of the right type.
value = t()
@classmethod
def _get_field_default_gen(
cls, field: dataclasses.Field, meta: FieldMetadata
) -> Any:
t = cls._type_hint(field.name)
if hasattr(t, "__origin__"):
if t.__origin__ in (dict, Dict):
# This is some kind of map (dict in Python).
return dict
elif t.__origin__ in (list, List):
# This is some kind of list (repeated) field.
return list
elif t.__origin__ == Union and t.__args__[1] == type(None):
# This is an optional (wrapped) field. For setting the default we
# really don't care what kind of field it is.
return type(None)
else:
return t
elif issubclass(t, Enum):
# Enums always default to zero.
return int
elif t == datetime:
# Offsets are relative to 1970-01-01T00:00:00Z
return datetime_default_gen
else:
value = get_default(meta.proto_type)
return value
# This is either a primitive scalar or another message type. Calling
# it should result in its zero value.
return t
def _postprocess_single(
self, wire_type: int, meta: FieldMetadata, field: dataclasses.Field, value: Any
@@ -540,6 +694,9 @@ class Message(ABC):
elif meta.proto_type in [TYPE_SINT32, TYPE_SINT64]:
# Undo zig-zag encoding
value = (value >> 1) ^ (-(value & 1))
elif meta.proto_type == TYPE_BOOL:
# Booleans use a varint encoding, so convert it to true/false.
value = value > 0
elif wire_type in [WIRE_FIXED_32, WIRE_FIXED_64]:
fmt = _pack_fmt(meta.proto_type)
value = struct.unpack(fmt, value)[0]
@@ -547,24 +704,21 @@ class Message(ABC):
if meta.proto_type == TYPE_STRING:
value = value.decode("utf-8")
elif meta.proto_type == TYPE_MESSAGE:
cls = self._cls_for(field)
value = cls().parse(value)
value._serialized_on_wire = True
cls = self._betterproto.cls_by_field[field.name]
if cls == datetime:
value = _Timestamp().parse(value).to_datetime()
elif cls == timedelta:
value = _Duration().parse(value).to_timedelta()
elif meta.wraps:
# This is a Google wrapper value message around a single
# scalar type.
value = _get_wrapper(meta.wraps)().parse(value).value
else:
value = cls().parse(value)
value._serialized_on_wire = True
elif meta.proto_type == TYPE_MAP:
# TODO: This is slow, use a cache to make it faster since each
# key/value pair will recreate the class.
assert meta.map_types
kt = self._cls_for(field, index=0)
vt = self._cls_for(field, index=1)
Entry = dataclasses.make_dataclass(
"Entry",
[
("key", kt, dataclass_field(1, meta.map_types[0])),
("value", vt, dataclass_field(2, meta.map_types[1])),
],
bases=(Message,),
)
value = Entry().parse(value)
value = self._betterproto.cls_by_field[field.name]().parse(value)
return value
@@ -624,48 +778,70 @@ class Message(ABC):
def FromString(cls: Type[T], data: bytes) -> T:
return cls().parse(data)
def to_dict(self) -> dict:
def to_dict(
self, casing: Casing = Casing.CAMEL, include_default_values: bool = False
) -> dict:
"""
Returns a dict representation of this message instance which can be
used to serialize to e.g. JSON.
used to serialize to e.g. JSON. Defaults to camel casing for
compatibility but can be set to other modes.
`include_default_values` can be set to `True` to include default
values of fields. E.g. an `int32` type field with `0` value will
not be in the returned dict if `include_default_values` is set to
`False`.
"""
output: Dict[str, Any] = {}
for field in dataclasses.fields(self):
meta = FieldMetadata.get(field)
v = getattr(self, field.name)
cased_name = casing(field.name).rstrip("_") # type: ignore
if meta.proto_type == "message":
if isinstance(v, list):
if isinstance(v, datetime):
if v != DATETIME_ZERO or include_default_values:
output[cased_name] = _Timestamp.timestamp_to_json(v)
elif isinstance(v, timedelta):
if v != timedelta(0) or include_default_values:
output[cased_name] = _Duration.delta_to_json(v)
elif meta.wraps:
if v is not None or include_default_values:
output[cased_name] = v
elif isinstance(v, list):
# Convert each item.
v = [i.to_dict() for i in v]
output[field.name] = v
elif v._serialized_on_wire:
output[field.name] = v.to_dict()
v = [i.to_dict(casing, include_default_values) for i in v]
if v or include_default_values:
output[cased_name] = v
else:
if v._serialized_on_wire or include_default_values:
output[cased_name] = v.to_dict(casing, include_default_values)
elif meta.proto_type == "map":
for k in v:
if hasattr(v[k], "to_dict"):
v[k] = v[k].to_dict()
v[k] = v[k].to_dict(casing, include_default_values)
if v:
output[field.name] = v
elif v != get_default(meta.proto_type):
if v or include_default_values:
output[cased_name] = v
elif v != self._get_field_default(field, meta) or include_default_values:
if meta.proto_type in INT_64_TYPES:
if isinstance(v, list):
output[field.name] = [str(n) for n in v]
output[cased_name] = [str(n) for n in v]
else:
output[field.name] = str(v)
output[cased_name] = str(v)
elif meta.proto_type == TYPE_BYTES:
if isinstance(v, list):
output[field.name] = [b64encode(b).decode("utf8") for b in v]
output[cased_name] = [b64encode(b).decode("utf8") for b in v]
else:
output[field.name] = b64encode(v).decode("utf8")
output[cased_name] = b64encode(v).decode("utf8")
elif meta.proto_type == TYPE_ENUM:
enum_values = list(self._cls_for(field))
enum_values = list(
self._betterproto.cls_by_field[field.name]
) # type: ignore
if isinstance(v, list):
output[field.name] = [enum_values[e].name for e in v]
output[cased_name] = [enum_values[e].name for e in v]
else:
output[field.name] = enum_values[v].name
output[cased_name] = enum_values[v].name
else:
output[field.name] = v
output[cased_name] = v
return output
def from_dict(self: T, value: dict) -> T:
@@ -674,44 +850,58 @@ class Message(ABC):
returns the instance itself and is therefore assignable and chainable.
"""
self._serialized_on_wire = True
for field in dataclasses.fields(self):
meta = FieldMetadata.get(field)
if field.name in value and value[field.name] is not None:
if meta.proto_type == "message":
v = getattr(self, field.name)
# print(v, value[field.name])
if isinstance(v, list):
cls = self._cls_for(field)
for i in range(len(value[field.name])):
v.append(cls().from_dict(value[field.name][i]))
else:
v.from_dict(value[field.name])
elif meta.map_types and meta.map_types[1] == TYPE_MESSAGE:
v = getattr(self, field.name)
cls = self._cls_for(field, index=1)
for k in value[field.name]:
v[k] = cls().from_dict(value[field.name][k])
else:
v = value[field.name]
if meta.proto_type in INT_64_TYPES:
if isinstance(value[field.name], list):
v = [int(n) for n in value[field.name]]
else:
v = int(value[field.name])
elif meta.proto_type == TYPE_BYTES:
if isinstance(value[field.name], list):
v = [b64decode(n) for n in value[field.name]]
else:
v = b64decode(value[field.name])
elif meta.proto_type == TYPE_ENUM:
enum_cls = self._cls_for(field)
if isinstance(v, list):
v = [enum_cls.from_string(e) for e in v]
elif isinstance(v, str):
v = enum_cls.from_string(v)
fields_by_name = {f.name: f for f in dataclasses.fields(self)}
for key in value:
snake_cased = safe_snake_case(key)
if snake_cased in fields_by_name:
field = fields_by_name[snake_cased]
meta = FieldMetadata.get(field)
if v is not None:
setattr(self, field.name, v)
if value[key] is not None:
if meta.proto_type == "message":
v = getattr(self, field.name)
if isinstance(v, list):
cls = self._betterproto.cls_by_field[field.name]
for i in range(len(value[key])):
v.append(cls().from_dict(value[key][i]))
elif isinstance(v, datetime):
v = datetime.fromisoformat(
value[key].replace("Z", "+00:00")
)
setattr(self, field.name, v)
elif isinstance(v, timedelta):
v = timedelta(seconds=float(value[key][:-1]))
setattr(self, field.name, v)
elif meta.wraps:
setattr(self, field.name, value[key])
else:
v.from_dict(value[key])
elif meta.map_types and meta.map_types[1] == TYPE_MESSAGE:
v = getattr(self, field.name)
cls = self._betterproto.cls_by_field[field.name + ".value"]
for k in value[key]:
v[k] = cls().from_dict(value[key][k])
else:
v = value[key]
if meta.proto_type in INT_64_TYPES:
if isinstance(value[key], list):
v = [int(n) for n in value[key]]
else:
v = int(value[key])
elif meta.proto_type == TYPE_BYTES:
if isinstance(value[key], list):
v = [b64decode(n) for n in value[key]]
else:
v = b64decode(value[key])
elif meta.proto_type == TYPE_ENUM:
enum_cls = self._betterproto.cls_by_field[field.name]
if isinstance(v, list):
v = [enum_cls.from_string(e) for e in v]
elif isinstance(v, str):
v = enum_cls.from_string(v)
if v is not None:
setattr(self, field.name, v)
return self
def to_json(self, indent: Union[None, int, str] = None) -> str:
@@ -737,26 +927,198 @@ def serialized_on_wire(message: Message) -> bool:
def which_one_of(message: Message, group_name: str) -> Tuple[str, Any]:
"""Return the name and value of a message's one-of field group."""
field = message._group_map["groups"].get(group_name, {}).get("current")
field = message._group_map.get(group_name)
if not field:
return ("", None)
return (field.name, getattr(message, field.name))
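# Usage sketch: for a message with a one-of group "foo" containing `count` and `name`,
# which_one_of(message, "foo") returns e.g. ("count", 100) when `count` was set,
# or ("", None) when no field in the group has been set.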
@dataclasses.dataclass
class _Duration(Message):
# Signed seconds of the span of time. Must be from -315,576,000,000 to
# +315,576,000,000 inclusive. Note: these bounds are computed from: 60
# sec/min * 60 min/hr * 24 hr/day * 365.25 days/year * 10000 years
seconds: int = int64_field(1)
# Signed fractions of a second at nanosecond resolution of the span of time.
# Durations less than one second are represented with a 0 `seconds` field and
# a positive or negative `nanos` field. For durations of one second or more,
# a non-zero value for the `nanos` field must be of the same sign as the
# `seconds` field. Must be from -999,999,999 to +999,999,999 inclusive.
nanos: int = int32_field(2)
def to_timedelta(self) -> timedelta:
return timedelta(seconds=self.seconds, microseconds=self.nanos / 1e3)
@staticmethod
def delta_to_json(delta: timedelta) -> str:
parts = str(delta.total_seconds()).split(".")
if len(parts) > 1:
while len(parts[1]) not in [3, 6, 9]:
parts[1] = parts[1] + "0"
return ".".join(parts) + "s"
@dataclasses.dataclass
class _Timestamp(Message):
# Represents seconds of UTC time since Unix epoch 1970-01-01T00:00:00Z. Must
# be from 0001-01-01T00:00:00Z to 9999-12-31T23:59:59Z inclusive.
seconds: int = int64_field(1)
# Non-negative fractions of a second at nanosecond resolution. Negative
# second values with fractions must still have non-negative nanos values that
# count forward in time. Must be from 0 to 999,999,999 inclusive.
nanos: int = int32_field(2)
def to_datetime(self) -> datetime:
ts = self.seconds + (self.nanos / 1e9)
return datetime.fromtimestamp(ts, tz=timezone.utc)
@staticmethod
def timestamp_to_json(dt: datetime) -> str:
nanos = dt.microsecond * 1e3
copy = dt.replace(microsecond=0, tzinfo=None)
result = copy.isoformat()
if (nanos % 1e9) == 0:
# If there are 0 fractional digits, the fractional
# point '.' should be omitted when serializing.
return result + "Z"
if (nanos % 1e6) == 0:
# Serialize 3 fractional digits.
return result + ".%03dZ" % (nanos / 1e6)
if (nanos % 1e3) == 0:
# Serialize 6 fractional digits.
return result + ".%06dZ" % (nanos / 1e3)
# Serialize 9 fractional digits.
return result + ".%09dZ" % nanos
class _WrappedMessage(Message):
"""
Google protobuf wrapper types base class. JSON representation is just the
value itself.
"""
value: Any
def to_dict(self, casing: Casing = Casing.CAMEL) -> Any:
return self.value
def from_dict(self: T, value: Any) -> T:
if value is not None:
self.value = value
return self
@dataclasses.dataclass
class _BoolValue(_WrappedMessage):
value: bool = bool_field(1)
@dataclasses.dataclass
class _Int32Value(_WrappedMessage):
value: int = int32_field(1)
@dataclasses.dataclass
class _UInt32Value(_WrappedMessage):
value: int = uint32_field(1)
@dataclasses.dataclass
class _Int64Value(_WrappedMessage):
value: int = int64_field(1)
@dataclasses.dataclass
class _UInt64Value(_WrappedMessage):
value: int = uint64_field(1)
@dataclasses.dataclass
class _FloatValue(_WrappedMessage):
value: float = float_field(1)
@dataclasses.dataclass
class _DoubleValue(_WrappedMessage):
value: float = double_field(1)
@dataclasses.dataclass
class _StringValue(_WrappedMessage):
value: str = string_field(1)
@dataclasses.dataclass
class _BytesValue(_WrappedMessage):
value: bytes = bytes_field(1)
def _get_wrapper(proto_type: str) -> Type:
"""Get the wrapper message class for a wrapped type."""
return {
TYPE_BOOL: _BoolValue,
TYPE_INT32: _Int32Value,
TYPE_UINT32: _UInt32Value,
TYPE_INT64: _Int64Value,
TYPE_UINT64: _UInt64Value,
TYPE_FLOAT: _FloatValue,
TYPE_DOUBLE: _DoubleValue,
TYPE_STRING: _StringValue,
TYPE_BYTES: _BytesValue,
}[proto_type]
_Value = Union[str, bytes]
_MetadataLike = Union[Mapping[str, _Value], Collection[Tuple[str, _Value]]]
class ServiceStub(ABC):
"""
Base class for async gRPC service stubs.
"""
def __init__(self, channel: grpclib.client.Channel) -> None:
def __init__(
self,
channel: "Channel",
*,
timeout: Optional[float] = None,
deadline: Optional["Deadline"] = None,
metadata: Optional[_MetadataLike] = None,
) -> None:
self.channel = channel
self.timeout = timeout
self.deadline = deadline
self.metadata = metadata
def __resolve_request_kwargs(
self,
timeout: Optional[float],
deadline: Optional["Deadline"],
metadata: Optional[_MetadataLike],
):
return {
"timeout": self.timeout if timeout is None else timeout,
"deadline": self.deadline if deadline is None else deadline,
"metadata": self.metadata if metadata is None else metadata,
}
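# Note: defaults given to the constructor apply to every request made through
# this stub; the helpers below accept per-call overrides that fall back to
# these defaults via __resolve_request_kwargs.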
async def _unary_unary(
self, route: str, request_type: Type, response_type: Type[T], request: Any
self,
route: str,
request: "IProtoMessage",
response_type: Type[T],
*,
timeout: Optional[float] = None,
deadline: Optional["Deadline"] = None,
metadata: Optional[_MetadataLike] = None,
) -> T:
"""Make a unary request and return the response."""
async with self.channel.request(
route, grpclib.const.Cardinality.UNARY_UNARY, request_type, response_type
route,
grpclib.const.Cardinality.UNARY_UNARY,
type(request),
response_type,
**self.__resolve_request_kwargs(timeout, deadline, metadata),
) as stream:
await stream.send_message(request, end=True)
response = await stream.recv_message()
@@ -764,11 +1126,22 @@ class ServiceStub(ABC):
return response
async def _unary_stream(
self, route: str, request_type: Type, response_type: Type[T], request: Any
self,
route: str,
request: "IProtoMessage",
response_type: Type[T],
*,
timeout: Optional[float] = None,
deadline: Optional["Deadline"] = None,
metadata: Optional[_MetadataLike] = None,
) -> AsyncGenerator[T, None]:
"""Make a unary request and return the stream response iterator."""
async with self.channel.request(
route, grpclib.const.Cardinality.UNARY_STREAM, request_type, response_type
route,
grpclib.const.Cardinality.UNARY_STREAM,
type(request),
response_type,
**self.__resolve_request_kwargs(timeout, deadline, metadata),
) as stream:
await stream.send_message(request, end=True)
async for message in stream:

41
betterproto/casing.py Normal file
View File

@@ -0,0 +1,41 @@
import stringcase
def safe_snake_case(value: str) -> str:
"""Snake case a value taking into account Python keywords."""
value = stringcase.snakecase(value)
if value in [
"and",
"as",
"assert",
"break",
"class",
"continue",
"def",
"del",
"elif",
"else",
"except",
"finally",
"for",
"from",
"global",
"if",
"import",
"in",
"is",
"lambda",
"nonlocal",
"not",
"or",
"pass",
"raise",
"return",
"try",
"while",
"with",
"yield",
]:
# https://www.python.org/dev/peps/pep-0008/#descriptive-naming-styles
value += "_"
return value
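# Examples:
#   safe_snake_case("camelCase") -> "camel_case"
#   safe_snake_case("for")       -> "for_"  (Python keyword, so an underscore is appended)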

2
betterproto/plugin.bat Normal file
View File

@@ -0,0 +1,2 @@
@SET plugin_dir=%~dp0
@python %plugin_dir%/plugin.py %*

View File

@@ -1,51 +1,92 @@
#!/usr/bin/env python
import itertools
import json
import os.path
import re
import sys
import textwrap
from typing import Any, List, Tuple
from collections import defaultdict
from typing import Dict, List, Optional, Type
try:
import jinja2
import black
except ImportError:
print(
"Unable to import `jinja2`. Did you install the compiler feature with `pip install betterproto[compiler]`?"
"Unable to import `black` formatter. Did you install the compiler feature with `pip install betterproto[compiler]`?"
)
raise SystemExit(1)
import jinja2
import stringcase
from google.protobuf.compiler import plugin_pb2 as plugin
from google.protobuf.descriptor_pb2 import (
DescriptorProto,
EnumDescriptorProto,
FieldDescriptorProto,
FileDescriptorProto,
ServiceDescriptorProto,
)
from betterproto.casing import safe_snake_case
import google.protobuf.wrappers_pb2 as google_wrappers
WRAPPER_TYPES: Dict[str, Optional[Type]] = defaultdict(
lambda: None,
{
"google.protobuf.DoubleValue": google_wrappers.DoubleValue,
"google.protobuf.FloatValue": google_wrappers.FloatValue,
"google.protobuf.Int64Value": google_wrappers.Int64Value,
"google.protobuf.UInt64Value": google_wrappers.UInt64Value,
"google.protobuf.Int32Value": google_wrappers.Int32Value,
"google.protobuf.UInt32Value": google_wrappers.UInt32Value,
"google.protobuf.BoolValue": google_wrappers.BoolValue,
"google.protobuf.StringValue": google_wrappers.StringValue,
"google.protobuf.BytesValue": google_wrappers.BytesValue,
},
)
def snake_case(value: str) -> str:
return (
re.sub(r"(?<=[a-z])[A-Z]|[A-Z](?=[^A-Z])", r"_\g<0>", value).lower().strip("_")
)
def get_ref_type(package: str, imports: set, type_name: str) -> str:
def get_ref_type(
package: str, imports: set, type_name: str, unwrap: bool = True
) -> str:
"""
Return a Python type name for a proto type reference. Adds the import if
necessary.
necessary. Unwraps well-known types if required.
"""
# If the package name is a blank string, then this should still work
# because by convention packages are lowercase and message/enum types are
# pascal-cased. May require refactoring in the future.
type_name = type_name.lstrip(".")
# Check if type is wrapper.
wrapper_class = WRAPPER_TYPES[type_name]
if unwrap:
if wrapper_class:
wrapped_type = type(wrapper_class().value)
return f"Optional[{wrapped_type.__name__}]"
if type_name == "google.protobuf.Duration":
return "timedelta"
if type_name == "google.protobuf.Timestamp":
return "datetime"
elif wrapper_class:
imports.add(f"from {wrapper_class.__module__} import {wrapper_class.__name__}")
return f"{wrapper_class.__name__}"
if type_name.startswith(package):
# This is the current package, which has nested types flattened.
type_name = f'"{type_name.lstrip(package).lstrip(".").replace(".", "")}"'
parts = type_name.lstrip(package).lstrip(".").split(".")
if len(parts) == 1 or (len(parts) > 1 and parts[0][0] == parts[0][0].upper()):
# This is the current package, which has nested types flattened.
# foo.bar_thing => FooBarThing
cased = [stringcase.pascalcase(part) for part in parts]
type_name = f'"{"".join(cased)}"'
if "." in type_name:
# This is imported from another package. No need
# to use a forward ref and we need to add the import.
parts = type_name.split(".")
parts[-1] = stringcase.pascalcase(parts[-1])
imports.add(f"from .{'.'.join(parts[:-2])} import {parts[-2]}")
type_name = f"{parts[-2]}.{parts[-1]}"
@@ -92,19 +133,19 @@ def get_py_zero(type_num: int) -> str:
def traverse(proto_file):
def _traverse(path, items):
def _traverse(path, items, prefix=""):
for i, item in enumerate(items):
# Adjust the name since we flatten the hierarchy.
item.name = next_prefix = prefix + item.name
yield item, path + [i]
if isinstance(item, DescriptorProto):
for enum in item.enum_type:
enum.name = item.name + enum.name
enum.name = next_prefix + enum.name
yield enum, path + [i, 4]
if item.nested_type:
for n, p in _traverse(path + [i, 3], item.nested_type):
# Adjust the name since we flatten the hierarchy.
n.name = item.name + n.name
for n, p in _traverse(path + [i, 3], item.nested_type, next_prefix):
yield n, p
return itertools.chain(
@@ -112,25 +153,26 @@ def traverse(proto_file):
)
def get_comment(proto_file, path: List[int]) -> str:
def get_comment(proto_file, path: List[int], indent: int = 4) -> str:
pad = " " * indent
for sci in proto_file.source_code_info.location:
# print(list(sci.path), path, file=sys.stderr)
if list(sci.path) == path and sci.leading_comments:
lines = textwrap.wrap(
sci.leading_comments.strip().replace("\n", ""), width=75
sci.leading_comments.strip().replace("\n", ""), width=79 - indent
)
if path[-2] == 2 and path[-4] != 6:
# This is a field
return " # " + " # ".join(lines)
return f"{pad}# " + f"\n{pad}# ".join(lines)
else:
# This is a message, enum, service, or method
if len(lines) == 1 and len(lines[0]) < 70:
if len(lines) == 1 and len(lines[0]) < 79 - indent - 6:
lines[0] = lines[0].strip('"')
return f' """{lines[0]}"""'
return f'{pad}"""{lines[0]}"""'
else:
joined = "\n ".join(lines)
return f' """\n {joined}\n """'
joined = f"\n{pad}".join(lines)
return f'{pad}"""\n{pad}{joined}\n{pad}"""'
return ""
@@ -141,11 +183,14 @@ def generate_code(request, response):
lstrip_blocks=True,
loader=jinja2.FileSystemLoader("%s/templates/" % os.path.dirname(__file__)),
)
template = env.get_template("template.py")
template = env.get_template("template.py.j2")
output_map = {}
for proto_file in request.proto_file:
out = proto_file.package
if out == "google.protobuf":
continue
if not out:
out = os.path.splitext(proto_file.name)[0].replace(os.path.sep, ".")
@@ -163,6 +208,7 @@ def generate_code(request, response):
"package": package,
"files": [f.name for f in options["files"]],
"imports": set(),
"datetime_imports": set(),
"typing_imports": set(),
"messages": [],
"enums": [],
@@ -179,7 +225,7 @@ def generate_code(request, response):
for item, path in traverse(proto_file):
# print(item, file=sys.stderr)
# print(path, file=sys.stderr)
data = {"name": item.name}
data = {"name": item.name, "py_name": stringcase.pascalcase(item.name)}
if isinstance(item, DescriptorProto):
# print(item, file=sys.stderr)
@@ -203,6 +249,14 @@ def generate_code(request, response):
packed = False
field_type = f.Type.Name(f.type).lower()[5:]
field_wraps = ""
if f.type_name.startswith(
".google.protobuf"
) and f.type_name.endswith("Value"):
w = f.type_name.split(".").pop()[:-5].upper()
field_wraps = f"betterproto.TYPE_{w}"
map_types = None
if f.type == 11:
# This might be a map...
@@ -252,13 +306,23 @@ def generate_code(request, response):
if f.HasField("oneof_index"):
one_of = item.oneof_decl[f.oneof_index].name
if "Optional[" in t:
output["typing_imports"].add("Optional")
if "timedelta" in t:
output["datetime_imports"].add("timedelta")
elif "datetime" in t:
output["datetime_imports"].add("datetime")
data["properties"].append(
{
"name": f.name,
"py_name": safe_snake_case(f.name),
"number": f.number,
"comment": get_comment(proto_file, path + [2, i]),
"proto_type": int(f.type),
"field_type": field_type,
"field_wraps": field_wraps,
"map_types": map_types,
"type": t,
"zero": zero,
@@ -294,6 +358,7 @@ def generate_code(request, response):
data = {
"name": service.name,
"py_name": stringcase.pascalcase(service.name),
"comment": get_comment(proto_file, [6, i]),
"methods": [],
}
@@ -317,15 +382,18 @@ def generate_code(request, response):
data["methods"].append(
{
"name": method.name,
"py_name": snake_case(method.name),
"comment": get_comment(proto_file, [6, i, 2, j]),
"py_name": stringcase.snakecase(method.name),
"comment": get_comment(proto_file, [6, i, 2, j], indent=8),
"route": f"/{package}.{service.name}/{method.name}",
"input": get_ref_type(
package, output["imports"], method.input_type
).strip('"'),
"input_message": input_message,
"output": get_ref_type(
package, output["imports"], method.output_type
package,
output["imports"],
method.output_type,
unwrap=False,
).strip('"'),
"client_streaming": method.client_streaming,
"server_streaming": method.server_streaming,
@@ -338,6 +406,7 @@ def generate_code(request, response):
output["services"].append(data)
output["imports"] = sorted(output["imports"])
output["datetime_imports"] = sorted(output["datetime_imports"])
output["typing_imports"] = sorted(output["typing_imports"])
# Fill response
@@ -345,8 +414,11 @@ def generate_code(request, response):
# print(filename, file=sys.stderr)
f.name = filename.replace(".", os.path.sep) + ".py"
# f.content = json.dumps(output, indent=2)
f.content = template.render(description=output).rstrip("\n") + "\n"
# Render and then format the output file.
f.content = black.format_str(
template.render(description=output),
mode=black.FileMode(target_versions=set([black.TargetVersion.PY37])),
)
inits = set([""])
for f in response.file:
@@ -361,10 +433,20 @@ def generate_code(request, response):
inits.add(base)
for base in inits:
name = os.path.join(base, "__init__.py")
if os.path.exists(name):
# Never overwrite inits as they may have custom stuff in them.
continue
init = response.file.add()
init.name = os.path.join(base, "__init__.py")
init.name = name
init.content = b""
filenames = sorted([f.name for f in response.file])
for fname in filenames:
print(f"Writing {fname}", file=sys.stderr)
def main():
"""The plugin's main entry point."""

View File

@@ -2,6 +2,10 @@
# sources: {{ ', '.join(description.files) }}
# plugin: python-betterproto
from dataclasses import dataclass
{% if description.datetime_imports %}
from datetime import {% for i in description.datetime_imports %}{{ i }}{% if not loop.last %}, {% endif %}{% endfor %}
{% endif%}
{% if description.typing_imports %}
from typing import {% for i in description.typing_imports %}{{ i }}{% if not loop.last %}, {% endif %}{% endfor %}
@@ -11,14 +15,14 @@ import betterproto
{% if description.services %}
import grpclib
{% endif %}
{% for i in description.imports %}
{% for i in description.imports %}
{{ i }}
{% endfor %}
{% if description.enums %}{% for enum in description.enums %}
class {{ enum.name }}(betterproto.Enum):
class {{ enum.py_name }}(betterproto.Enum):
{% if enum.comment %}
{{ enum.comment }}
@@ -35,7 +39,7 @@ class {{ enum.name }}(betterproto.Enum):
{% endif %}
{% for message in description.messages %}
@dataclass
class {{ message.name }}(betterproto.Message):
class {{ message.py_name }}(betterproto.Message):
{% if message.comment %}
{{ message.comment }}
@@ -44,7 +48,7 @@ class {{ message.name }}(betterproto.Message):
{% if field.comment %}
{{ field.comment }}
{% endif %}
{{ field.name }}: {{ field.type }} = betterproto.{{ field.field_type }}_field({{ field.number }}{% if field.field_type == 'map'%}, betterproto.{{ field.map_types[0] }}, betterproto.{{ field.map_types[1] }}{% endif %}{% if field.one_of %}, group="{{ field.one_of }}"{% endif %})
{{ field.py_name }}: {{ field.type }} = betterproto.{{ field.field_type }}_field({{ field.number }}{% if field.field_type == 'map'%}, betterproto.{{ field.map_types[0] }}, betterproto.{{ field.map_types[1] }}{% endif %}{% if field.one_of %}, group="{{ field.one_of }}"{% endif %}{% if field.field_wraps %}, wraps={{ field.field_wraps }}{% endif %})
{% endfor %}
{% if not message.properties %}
pass
@@ -53,13 +57,13 @@ class {{ message.name }}(betterproto.Message):
{% endfor %}
{% for service in description.services %}
class {{ service.name }}Stub(betterproto.ServiceStub):
class {{ service.py_name }}Stub(betterproto.ServiceStub):
{% if service.comment %}
{{ service.comment }}
{% endif %}
{% for method in service.methods %}
async def {{ method.py_name }}(self{% if method.input_message and method.input_message.properties %}, *, {% for field in method.input_message.properties %}{{ field.name }}: {% if field.zero == "None" %}Optional[{{ field.type }}]{% else %}{{ field.type }}{% endif %} = {{ field.zero }}{% if not loop.last %}, {% endif %}{% endfor %}{% endif %}) -> {% if method.server_streaming %}AsyncGenerator[{{ method.output }}, None]{% else %}{{ method.output }}{% endif %}:
async def {{ method.py_name }}(self{% if method.input_message and method.input_message.properties %}, *, {% for field in method.input_message.properties %}{{ field.py_name }}: {% if field.zero == "None" and not field.type.startswith("Optional[") %}Optional[{{ field.type }}]{% else %}{{ field.type }}{% endif %} = {{ field.zero }}{% if not loop.last %}, {% endif %}{% endfor %}{% endif %}) -> {% if method.server_streaming %}AsyncGenerator[{{ method.output }}, None]{% else %}{{ method.output }}{% endif %}:
{% if method.comment %}
{{ method.comment }}
@@ -67,27 +71,25 @@ class {{ service.name }}Stub(betterproto.ServiceStub):
request = {{ method.input }}()
{% for field in method.input_message.properties %}
{% if field.field_type == 'message' %}
if {{ field.name }} is not None:
request.{{ field.name }} = {{ field.name }}
if {{ field.py_name }} is not None:
request.{{ field.py_name }} = {{ field.py_name }}
{% else %}
request.{{ field.name }} = {{ field.name }}
request.{{ field.py_name }} = {{ field.py_name }}
{% endif %}
{% endfor %}
{% if method.server_streaming %}
async for response in self._unary_stream(
"{{ method.route }}",
{{ method.input }},
{{ method.output }},
request,
{{ method.output }},
):
yield response
{% else %}
return await self._unary_unary(
"{{ method.route }}",
{{ method.input }},
{{ method.output }},
request,
{{ method.output }},
)
{% endif %}

View File

@@ -0,0 +1,90 @@
# Standard Tests Development Guide
Standard test cases are found in [betterproto/tests/inputs](inputs), where each subdirectory represents a test case that is verified in isolation.
```
inputs/
bool/
double/
int32/
...
```
## Test case directory structure
Each test case has a `<name>.proto` file with a message called `Test`, a matching `.json` file, and optionally a custom test file called `test_*.py`.
```bash
bool/
bool.proto
bool.json
test_bool.py # optional
```
### proto
`<name>.proto` &mdash; *The protobuf message to test*
```protobuf
syntax = "proto3";
message Test {
bool value = 1;
}
```
You can add multiple `.proto` files to the test case, as long as one file matches the directory name.
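For instance, a hypothetical layout with one extra imported file might look like this:
```bash
import_sibling/          # hypothetical example with an extra imported file
    import_sibling.proto # matches the directory name
    sibling.proto        # imported by import_sibling.proto
    import_sibling.json
```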
### json
`<name>.json` &mdash; *Test-data to validate the message with*
```json
{
"value": true
}
```
### pytest
`test_<name>.py` &mdash; *Custom test to validate specific aspects of the generated class*
```python
from betterproto.tests.output_betterproto.bool.bool import Test
def test_value():
message = Test()
assert not message.value, "Boolean is False by default"
```
## Standard tests
The following tests are automatically executed for all cases (a simplified sketch follows this list):
- [x] Can the generated python code be imported?
- [x] Can the generated message class be instantiated?
- [x] Is the generated code compatible with Google's `grpc_tools.protoc` implementation?
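As a simplified, illustrative sketch (not the actual suite, which lives in `betterproto/tests/test_inputs.py`), these checks amount to roughly the following:
```python
# Illustrative sketch only; the real checks live in betterproto/tests/test_inputs.py.
import importlib


def check_standard_case(name: str) -> None:
    # 1. The generated code can be imported.
    module = importlib.import_module(
        f"betterproto.tests.output_betterproto.{name}.{name}"
    )
    # 2. The generated `Test` message can be instantiated and serialized.
    message = module.Test()
    assert isinstance(bytes(message), bytes)
    # 3. The full suite additionally round-trips the sample JSON data and
    #    compares the binary output against the grpc_tools.protoc reference.
```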
## Running the tests
- `pipenv run generate`
This generates
- `betterproto/tests/output_betterproto` &mdash; *the plugin generated python classes*
- `betterproto/tests/output_reference` &mdash; *reference implementation classes*
- `pipenv run test`
## Intentionally Failing tests
The standard test suite includes tests that fail intentionally. These tests document known bugs and missing features that are intended to be corrected in the future.
When running `pytest`, they show up as `x` or `X` in the test results.
```
betterproto/tests/test_inputs.py ..x...x..x...x.X........xx........x.....x.......x.xx....x...................... [ 84%]
```
- `.` &mdash; PASSED
- `x` &mdash; XFAIL: expected failure
- `X` &mdash; XPASS: expected failure, but still passed
Test cases marked for expected failure are declared in [inputs/xfail.py](inputs/xfail.py).
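The exact format of that file is not shown in this guide; purely as a hypothetical sketch, it might expose a collection of test-case names such as:
```python
# Hypothetical sketch of inputs/xfail.py; the real file's shape may differ.
tests = {
    "some_failing_case",
    "another_failing_case",
}
```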

View File

@@ -1,83 +1,91 @@
#!/usr/bin/env python
import glob
import os
import shutil
import sys
from typing import Set
from betterproto.tests.util import (
get_directories,
inputs_path,
output_path_betterproto,
output_path_reference,
protoc_plugin,
protoc_reference,
)
# Force pure-python implementation instead of C++, otherwise imports
# break things because we can't properly reset the symbol database.
os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"
import importlib
import json
import subprocess
import sys
from typing import Generator, Tuple
from google.protobuf import symbol_database
from google.protobuf.descriptor_pool import DescriptorPool
from google.protobuf.json_format import MessageToJson, Parse
def clear_directory(path: str):
for file_or_directory in glob.glob(os.path.join(path, "*")):
if os.path.isdir(file_or_directory):
shutil.rmtree(file_or_directory)
else:
os.remove(file_or_directory)
root = os.path.dirname(os.path.realpath(__file__))
def generate(whitelist: Set[str]):
path_whitelist = {os.path.realpath(e) for e in whitelist if os.path.exists(e)}
name_whitelist = {e for e in whitelist if not os.path.exists(e)}
test_case_names = set(get_directories(inputs_path))
for test_case_name in sorted(test_case_names):
test_case_input_path = os.path.realpath(
os.path.join(inputs_path, test_case_name)
)
if (
whitelist
and test_case_input_path not in path_whitelist
and test_case_name not in name_whitelist
):
continue
test_case_output_path_reference = os.path.join(
output_path_reference, test_case_name
)
test_case_output_path_betterproto = os.path.join(
output_path_betterproto, test_case_name
)
print(f"Generating output for {test_case_name}")
os.makedirs(test_case_output_path_reference, exist_ok=True)
os.makedirs(test_case_output_path_betterproto, exist_ok=True)
clear_directory(test_case_output_path_reference)
clear_directory(test_case_output_path_betterproto)
protoc_reference(test_case_input_path, test_case_output_path_reference)
protoc_plugin(test_case_input_path, test_case_output_path_betterproto)
def get_files(end: str) -> Generator[str, None, None]:
for r, dirs, files in os.walk(root):
for filename in [f for f in files if f.endswith(end)]:
yield os.path.join(r, filename)
HELP = "\n".join(
[
"Usage: python generate.py",
" python generate.py [DIRECTORIES or NAMES]",
"Generate python classes for standard tests.",
"",
"DIRECTORIES One or more relative or absolute directories of test-cases to generate classes for.",
" python generate.py inputs/bool inputs/double inputs/enum",
"",
"NAMES One or more test-case names to generate classes for.",
" python generate.py bool double enums",
]
)
def get_base(filename: str) -> str:
return os.path.splitext(os.path.basename(filename))[0]
def main():
if set(sys.argv).intersection({"-h", "--help"}):
print(HELP)
return
whitelist = set(sys.argv[1:])
def ensure_ext(filename: str, ext: str) -> str:
if not filename.endswith(ext):
return filename + ext
return filename
generate(whitelist)
if __name__ == "__main__":
os.chdir(root)
if len(sys.argv) > 1:
proto_files = [ensure_ext(f, ".proto") for f in sys.argv[1:]]
bases = {get_base(f) for f in proto_files}
json_files = [
f for f in get_files(".json") if get_base(f).split("-")[0] in bases
]
else:
proto_files = get_files(".proto")
json_files = get_files(".json")
for filename in proto_files:
print(f"Generating code for {os.path.basename(filename)}")
subprocess.run(
f"protoc --python_out=. {os.path.basename(filename)}", shell=True
)
subprocess.run(
f"protoc --plugin=protoc-gen-custom=../plugin.py --custom_out=. {os.path.basename(filename)}",
shell=True,
)
for filename in json_files:
# Reset the internal symbol database so we can import the `Test` message
# multiple times. Ugh.
sym = symbol_database.Default()
sym.pool = DescriptorPool()
parts = get_base(filename).split("-")
out = filename.replace(".json", ".bin")
print(f"Using {parts[0]}_pb2 to generate {os.path.basename(out)}")
imported = importlib.import_module(f"{parts[0]}_pb2")
input_json = open(filename).read()
parsed = Parse(input_json, imported.Test())
serialized = parsed.SerializeToString()
serialized_json = MessageToJson(parsed, preserving_proto_field_name=True)
s_loaded = json.loads(serialized_json)
in_loaded = json.loads(input_json)
if s_loaded != in_loaded:
raise AssertionError("Expected JSON to be equal:", s_loaded, in_loaded)
open(out, "wb").write(serialized)
main()

View File

@@ -0,0 +1,3 @@
{
"value": true
}

View File

@@ -0,0 +1,5 @@
syntax = "proto3";
message Test {
bool value = 1;
}

View File

@@ -0,0 +1,6 @@
from betterproto.tests.output_betterproto.bool.bool import Test
def test_value():
message = Test()
assert not message.value, "Boolean is False by default"

View File

@@ -0,0 +1,4 @@
{
"camelCase": 1,
"snakeCase": "ONE"
}

View File

@@ -0,0 +1,17 @@
syntax = "proto3";
enum my_enum {
ZERO = 0;
ONE = 1;
TWO = 2;
}
message Test {
int32 camelCase = 1;
my_enum snake_case = 2;
snake_case_message snake_case_message = 3;
}
message snake_case_message {
}

View File

@@ -0,0 +1,22 @@
import betterproto.tests.output_betterproto.casing.casing as casing
from betterproto.tests.output_betterproto.casing.casing import Test
def test_message_attributes():
message = Test()
assert hasattr(
message, "snake_case_message"
), "snake_case field name is same in python"
assert hasattr(message, "camel_case"), "CamelCase field is snake_case in python"
def test_message_casing():
assert hasattr(
casing, "SnakeCaseMessage"
), "snake_case Message name is converted to CamelCase in python"
def test_enum_casing():
assert hasattr(
casing, "MyEnum"
), "snake_case Enum name is converted to CamelCase in python"

View File

@@ -0,0 +1 @@
{}

View File

@@ -0,0 +1,5 @@
{
"maybe": false,
"ts": "1972-01-01T10:00:20.021Z",
"duration": "1.200s"
}

View File

@@ -0,0 +1,12 @@
syntax = "proto3";
import "google/protobuf/duration.proto";
import "google/protobuf/timestamp.proto";
import "google/protobuf/wrappers.proto";
message Test {
google.protobuf.BoolValue maybe = 1;
google.protobuf.Timestamp ts = 2;
google.protobuf.Duration duration = 3;
google.protobuf.Int32Value important = 4;
}

View File

@@ -0,0 +1,21 @@
syntax = "proto3";
import "google/protobuf/wrappers.proto";
// Tests that wrapped values can be used directly as return values
service Test {
rpc GetDouble (Input) returns (google.protobuf.DoubleValue);
rpc GetFloat (Input) returns (google.protobuf.FloatValue);
rpc GetInt64 (Input) returns (google.protobuf.Int64Value);
rpc GetUInt64 (Input) returns (google.protobuf.UInt64Value);
rpc GetInt32 (Input) returns (google.protobuf.Int32Value);
rpc GetUInt32 (Input) returns (google.protobuf.UInt32Value);
rpc GetBool (Input) returns (google.protobuf.BoolValue);
rpc GetString (Input) returns (google.protobuf.StringValue);
rpc GetBytes (Input) returns (google.protobuf.BytesValue);
}
message Input {
}

View File

@@ -0,0 +1,56 @@
from typing import Any, Callable, Optional
import google.protobuf.wrappers_pb2 as wrappers
import pytest
from betterproto.tests.mocks import MockChannel
from betterproto.tests.output_betterproto.googletypes_response.googletypes_response import (
TestStub,
)
test_cases = [
(TestStub.get_double, wrappers.DoubleValue, 2.5),
(TestStub.get_float, wrappers.FloatValue, 2.5),
(TestStub.get_int64, wrappers.Int64Value, -64),
(TestStub.get_u_int64, wrappers.UInt64Value, 64),
(TestStub.get_int32, wrappers.Int32Value, -32),
(TestStub.get_u_int32, wrappers.UInt32Value, 32),
(TestStub.get_bool, wrappers.BoolValue, True),
(TestStub.get_string, wrappers.StringValue, "string"),
(TestStub.get_bytes, wrappers.BytesValue, bytes(0xFF)[0:4]),
]
@pytest.mark.asyncio
@pytest.mark.parametrize(["service_method", "wrapper_class", "value"], test_cases)
async def test_channel_receives_wrapped_type(
service_method: Callable[[TestStub], Any], wrapper_class: Callable, value
):
wrapped_value = wrapper_class()
wrapped_value.value = value
channel = MockChannel(responses=[wrapped_value])
service = TestStub(channel)
await service_method(service)
assert channel.requests[0]["response_type"] != Optional[type(value)]
assert channel.requests[0]["response_type"] == type(wrapped_value)
@pytest.mark.asyncio
@pytest.mark.xfail
@pytest.mark.parametrize(["service_method", "wrapper_class", "value"], test_cases)
async def test_service_unwraps_response(
service_method: Callable[[TestStub], Any], wrapper_class: Callable, value
):
"""
grpclib does not unwrap wrapper values returned by services
"""
wrapped_value = wrapper_class()
wrapped_value.value = value
service = TestStub(MockChannel(responses=[wrapped_value]))
response_value = await service_method(service)
assert response_value == value
assert type(response_value) == type(value)

View File

@@ -0,0 +1,24 @@
syntax = "proto3";
import "google/protobuf/wrappers.proto";
// Tests that wrapped values are supported as part of an output message
service Test {
rpc getOutput (Input) returns (Output);
}
message Input {
}
message Output {
google.protobuf.DoubleValue double_value = 1;
google.protobuf.FloatValue float_value = 2;
google.protobuf.Int64Value int64_value = 3;
google.protobuf.UInt64Value uint64_value = 4;
google.protobuf.Int32Value int32_value = 5;
google.protobuf.UInt32Value uint32_value = 6;
google.protobuf.BoolValue bool_value = 7;
google.protobuf.StringValue string_value = 8;
google.protobuf.BytesValue bytes_value = 9;
}

View File

@@ -0,0 +1,39 @@
import pytest
from betterproto.tests.mocks import MockChannel
from betterproto.tests.output_betterproto.googletypes_response_embedded.googletypes_response_embedded import (
Output,
TestStub,
)
@pytest.mark.asyncio
async def test_service_passes_through_unwrapped_values_embedded_in_response():
"""
We do not need to implement value unwrapping for embedded well-known types,
as this is already handled by grpclib. This test merely shows that this is the case.
"""
output = Output(
double_value=10.0,
float_value=12.0,
int64_value=-13,
uint64_value=14,
int32_value=-15,
uint32_value=16,
bool_value=True,
string_value="string",
bytes_value=bytes(0xFF)[0:4],
)
service = TestStub(MockChannel(responses=[output]))
response = await service.get_output()
assert response.double_value == 10.0
assert response.float_value == 12.0
assert response.int64_value == -13
assert response.uint64_value == 14
assert response.int32_value == -15
assert response.uint32_value == 16
assert response.bool_value
assert response.string_value == "string"
assert response.bytes_value == bytes(0xFF)[0:4]

View File

@@ -0,0 +1,7 @@
syntax = "proto3";
package package.childpackage;
message ChildMessage {
}

View File

@@ -0,0 +1,9 @@
syntax = "proto3";
import "package_message.proto";
// Tests generated imports when a message in a package refers to a message in a nested child package.
message Test {
package.PackageMessage message = 1;
}

View File

@@ -0,0 +1,9 @@
syntax = "proto3";
import "child.proto";
package package;
message PackageMessage {
package.childpackage.ChildMessage c = 1;
}

View File

@@ -0,0 +1,7 @@
syntax = "proto3";
package childpackage;
message Message {
}

View File

@@ -0,0 +1,9 @@
syntax = "proto3";
import "child.proto";
// Tests generated imports when a message in root refers to a message in a child package.
message Test {
childpackage.Message child = 1;
}

View File

@@ -0,0 +1,28 @@
syntax = "proto3";
import "root.proto";
import "other.proto";
// This test-case verifies that future implementations will support circular dependencies in the generated python files.
//
// This becomes important when generating 1 python file/module per package, rather than 1 file per proto file.
//
// Scenario:
//
// The proto messages depend on each other in a non-circular way:
//
// Test -------> RootPackageMessage <--------------.
// `------------------------------------> OtherPackageMessage
//
// Test and RootPackageMessage are in different files, but belong to the same package (root):
//
// (Test -------> RootPackageMessage) <------------.
// `------------------------------------> OtherPackageMessage
//
// After grouping the packages into single files or modules, a circular dependency is created:
//
// (root: Test & RootPackageMessage) <-------> (other: OtherPackageMessage)
message Test {
RootPackageMessage message = 1;
other.OtherPackageMessage other =2;
}

View File

@@ -0,0 +1,8 @@
syntax = "proto3";
import "root.proto";
package other;
message OtherPackageMessage {
RootPackageMessage rootPackageMessage = 1;
}

View File

@@ -0,0 +1,5 @@
syntax = "proto3";
message RootPackageMessage {
}

View File

@@ -0,0 +1,12 @@
syntax = "proto3";
import "parent_package_message.proto";
package parent.child;
// Tests generated imports when a message refers to a message defined in its parent package
message Test {
ParentPackageMessage message_implicit = 1;
parent.ParentPackageMessage message_explicit = 2;
}

View File

@@ -0,0 +1,6 @@
syntax = "proto3";
package parent;
message ParentPackageMessage {
}

View File

@@ -0,0 +1,11 @@
syntax = "proto3";
import "root.proto";
package child;
// Tests generated imports when a message inside a child-package refers to a message defined in the root.
message Test {
RootMessage message = 1;
}

View File

@@ -0,0 +1,5 @@
syntax = "proto3";
message RootMessage {
}

View File

@@ -0,0 +1,9 @@
syntax = "proto3";
import "sibling.proto";
// Tests generated imports when a message in the root package refers to another message in the root package
message Test {
SiblingMessage sibling = 1;
}

View File

@@ -0,0 +1,5 @@
syntax = "proto3";
message SiblingMessage {
}

View File

@@ -0,0 +1,4 @@
{
"positive": 150,
"negative": -150
}

View File

@@ -3,5 +3,6 @@ syntax = "proto3";
// Some documentation about the Test message.
message Test {
// Some documentation about the count.
int32 count = 1;
int32 positive = 1;
int32 negative = 2;
}

View File

@@ -0,0 +1,5 @@
{
"for": 1,
"with": 2,
"as": 3
}

View File

@@ -0,0 +1,11 @@
syntax = "proto3";
message Test {
int32 for = 1;
int32 with = 2;
int32 as = 3;
}
service TestService {
rpc GetTest(Test) returns (Test) {}
}

@@ -0,0 +1,11 @@
{
"root": {
"name": "double-nested",
"parent": {
"child": [{"foo": "hello"}],
"enumChild": ["A"],
"rootParentChild": [{"a": "hello"}],
"bar": true
}
}
}

@@ -0,0 +1,26 @@
syntax = "proto3";
message Test {
message Root {
message Parent {
message RootParentChild {
string a = 1;
}
enum EnumChild {
A = 0;
B = 1;
}
message Child {
string foo = 1;
}
reserved 1;
repeated Child child = 2;
repeated Child child = 2;
repeated EnumChild enumChild = 3;
repeated RootParentChild rootParentChild = 4;
bool bar = 5;
}
string name = 1;
Parent parent = 2;
}
Root root = 1;
}

@@ -0,0 +1,3 @@
{
"name": "foobar"
}

@@ -0,0 +1,3 @@
{
"count": 100
}

@@ -0,0 +1,15 @@
import betterproto
from betterproto.tests.output_betterproto.oneof.oneof import Test
from betterproto.tests.util import get_test_case_json_data
def test_which_count():
message = Test()
message.from_json(get_test_case_json_data("oneof"))
assert betterproto.which_one_of(message, "foo") == ("count", 100)
def test_which_name():
message = Test()
message.from_json(get_test_case_json_data("oneof", "oneof-name.json"))
assert betterproto.which_one_of(message, "foo") == ("name", "foobar")

@@ -0,0 +1,3 @@
{
"signal": "PASS"
}

@@ -0,0 +1,3 @@
{
"signal": "RESIGN"
}

@@ -0,0 +1,6 @@
{
"move": {
"x": 2,
"y": 3
}
}

@@ -0,0 +1,18 @@
syntax = "proto3";
message Test {
oneof action {
Signal signal = 1;
Move move = 2;
}
}
enum Signal {
PASS = 0;
RESIGN = 1;
}
message Move {
int32 x = 1;
int32 y = 2;
}

@@ -0,0 +1,42 @@
import pytest
import betterproto
from betterproto.tests.output_betterproto.oneof_enum.oneof_enum import (
Move,
Signal,
Test,
)
from betterproto.tests.util import get_test_case_json_data
@pytest.mark.xfail
def test_which_one_of_returns_enum_with_default_value():
"""
returns the first field when it is an enum set to its default value
"""
message = Test()
message.from_json(get_test_case_json_data("oneof_enum", "oneof_enum-enum-0.json"))
assert message.move is None
assert message.signal == Signal.PASS
assert betterproto.which_one_of(message, "action") == ("signal", Signal.PASS)
@pytest.mark.xfail
def test_which_one_of_returns_enum_with_non_default_value():
"""
returns the first field when it is an enum set to a non-default value
"""
message = Test()
message.from_json(get_test_case_json_data("oneof_enum", "oneof_enum-enum-1.json"))
assert message.move is None
assert message.signal == Signal.RESIGN
assert betterproto.which_one_of(message, "action") == ("signal", Signal.RESIGN)
@pytest.mark.xfail
def test_which_one_of_returns_second_field_when_set():
message = Test()
message.from_json(get_test_case_json_data("oneof_enum"))
assert message.move == Move(x=2, y=3)
assert message.signal == 0
assert betterproto.which_one_of(message, "action") == ("move", Move(x=2, y=3))

@@ -0,0 +1,11 @@
syntax = "proto3";
package repeatedmessage;
message Test {
repeated Sub greetings = 1;
}
message Sub {
string greeting = 1;
}

@@ -0,0 +1,15 @@
syntax = "proto3";
package service;
message DoThingRequest {
int32 iterations = 1;
}
message DoThingResponse {
int32 successfulIterations = 1;
}
service Test {
rpc DoThing (DoThingRequest) returns (DoThingResponse);
}

@@ -0,0 +1,132 @@
import betterproto
import grpclib
from grpclib.testing import ChannelFor
import pytest
from typing import Dict
from betterproto.tests.output_betterproto.service.service import (
DoThingResponse,
DoThingRequest,
TestStub as ExampleServiceStub,
)
class ExampleService:
def __init__(self, test_hook=None):
# This lets us pass assertions to the servicer ;)
self.test_hook = test_hook
async def DoThing(
self, stream: "grpclib.server.Stream[DoThingRequest, DoThingResponse]"
):
request = await stream.recv_message()
print("self.test_hook", self.test_hook)
if self.test_hook is not None:
self.test_hook(stream)
for iteration in range(request.iterations):
pass
await stream.send_message(DoThingResponse(request.iterations))
def __mapping__(self) -> Dict[str, grpclib.const.Handler]:
return {
"/service.Test/DoThing": grpclib.const.Handler(
self.DoThing,
grpclib.const.Cardinality.UNARY_UNARY,
DoThingRequest,
DoThingResponse,
)
}
async def _test_stub(stub, iterations=42, **kwargs):
response = await stub.do_thing(iterations=iterations)
assert response.successful_iterations == iterations
def _get_server_side_test(deadline, metadata):
def server_side_test(stream):
assert stream.deadline._timestamp == pytest.approx(
deadline._timestamp, 1
), "The provided deadline should be recieved serverside"
assert (
stream.metadata["authorization"] == metadata["authorization"]
), "The provided authorization metadata should be recieved serverside"
return server_side_test
@pytest.mark.asyncio
async def test_simple_service_call():
async with ChannelFor([ExampleService()]) as channel:
await _test_stub(ExampleServiceStub(channel))
@pytest.mark.asyncio
async def test_service_call_with_upfront_request_params():
# Setting deadline
deadline = grpclib.metadata.Deadline.from_timeout(22)
metadata = {"authorization": "12345"}
async with ChannelFor(
[ExampleService(test_hook=_get_server_side_test(deadline, metadata))]
) as channel:
await _test_stub(
ExampleServiceStub(channel, deadline=deadline, metadata=metadata)
)
# Setting timeout
timeout = 99
deadline = grpclib.metadata.Deadline.from_timeout(timeout)
metadata = {"authorization": "12345"}
async with ChannelFor(
[ExampleService(test_hook=_get_server_side_test(deadline, metadata))]
) as channel:
await _test_stub(
ExampleServiceStub(channel, timeout=timeout, metadata=metadata)
)
@pytest.mark.asyncio
async def test_service_call_lower_level_with_overrides():
ITERATIONS = 99
# Setting deadline
deadline = grpclib.metadata.Deadline.from_timeout(22)
metadata = {"authorization": "12345"}
kwarg_deadline = grpclib.metadata.Deadline.from_timeout(28)
kwarg_metadata = {"authorization": "12345"}
async with ChannelFor(
[ExampleService(test_hook=_get_server_side_test(deadline, metadata))]
) as channel:
stub = ExampleServiceStub(channel, deadline=deadline, metadata=metadata)
response = await stub._unary_unary(
"/service.Test/DoThing",
DoThingRequest(ITERATIONS),
DoThingResponse,
deadline=kwarg_deadline,
metadata=kwarg_metadata,
)
assert response.successful_iterations == ITERATIONS
# Setting timeout
timeout = 99
deadline = grpclib.metadata.Deadline.from_timeout(timeout)
metadata = {"authorization": "12345"}
kwarg_timeout = 9000
kwarg_deadline = grpclib.metadata.Deadline.from_timeout(kwarg_timeout)
kwarg_metadata = {"authorization": "09876"}
async with ChannelFor(
[
ExampleService(
test_hook=_get_server_side_test(kwarg_deadline, kwarg_metadata)
)
]
) as channel:
stub = ExampleServiceStub(channel, deadline=deadline, metadata=metadata)
response = await stub._unary_unary(
"/service.Test/DoThing",
DoThingRequest(ITERATIONS),
DoThingResponse,
timeout=kwarg_timeout,
metadata=kwarg_metadata,
)
assert response.successful_iterations == ITERATIONS

@@ -0,0 +1,6 @@
{
"signed32": 150,
"negative32": -150,
"string64": "150",
"negative64": "-150"
}

@@ -0,0 +1,9 @@
syntax = "proto3";
message Test {
// TODO: rename fields after fixing the bug where 'signed_32_positive' maps to 'signed_32Positive' in the output JSON
sint32 signed32 = 1; // signed_32_positive
sint32 negative32 = 2; // signed_32_negative
sint64 string64 = 3; // signed_64_positive
sint64 negative64 = 4; // signed_64_negative
}

@@ -0,0 +1,10 @@
# Test cases that are expected to fail, e.g. because they cover unimplemented features or pending bug-fixes.
# Remove from list when fixed.
tests = {
"import_root_sibling",
"import_child_package_from_package",
"import_root_package_from_child",
"import_parent_package_from_child",
"import_circular_dependency",
"oneof_enum",
}
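
(The names above are the test-case directories under inputs/; test_inputs.py further below wraps any case in this set with pytest.param(..., marks=pytest.mark.xfail) via TestCases.apply_xfail_marks, so these cases are reported as expected failures until the corresponding fix lands and they are removed from the set.)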

@@ -1,3 +0,0 @@
{
"count": -150
}

@@ -1,3 +0,0 @@
{
"count": 150
}

@@ -0,0 +1,39 @@
from typing import List
from grpclib.client import Channel
class MockChannel(Channel):
# noinspection PyMissingConstructor
def __init__(self, responses=None) -> None:
self.responses = responses if responses else []
self.requests = []
def request(self, route, cardinality, request, response_type, **kwargs):
self.requests.append(
{
"route": route,
"cardinality": cardinality,
"request": request,
"response_type": response_type,
}
)
return MockStream(self.responses)
class MockStream:
def __init__(self, responses: List) -> None:
super().__init__()
self.responses = responses
async def recv_message(self):
return self.responses.pop(0)
async def send_message(self, *args, **kwargs):
pass
async def __aexit__(self, exc_type, exc_val, exc_tb):
return True
async def __aenter__(self):
return self
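
A minimal usage sketch, assuming the generated module for the service test case above is importable (the module path and stub name below follow the service test file earlier in this diff): queue a canned response, call the generated stub through the mock channel, then inspect the request it recorded.

import asyncio

from betterproto.tests.mocks import MockChannel
from betterproto.tests.output_betterproto.service.service import (
    DoThingResponse,
    TestStub,
)


async def main():
    # Queue one response; the stub's next unary call will receive it.
    channel = MockChannel(responses=[DoThingResponse(5)])
    stub = TestStub(channel)
    response = await stub.do_thing(iterations=5)
    assert response.successful_iterations == 5
    # MockChannel.request() stored the call, so the route can be asserted on.
    assert channel.requests[0]["route"] == "/service.Test/DoThing"


if __name__ == "__main__":
    asyncio.run(main())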

@@ -1,3 +0,0 @@
{
"name": "foo"
}

@@ -1,3 +0,0 @@
{
"count": 1
}

@@ -1,4 +0,0 @@
{
"signed_32": -150,
"signed_64": "-150"
}

@@ -1,4 +0,0 @@
{
"signed_32": 150,
"signed_64": "150"
}

@@ -1,6 +0,0 @@
syntax = "proto3";
message Test {
sint32 signed_32 = 1;
sint64 signed_64 = 2;
}

@@ -1,5 +1,6 @@
import betterproto
from dataclasses import dataclass
from typing import Optional
def test_has_field():
@@ -32,6 +33,21 @@ def test_has_field():
assert betterproto.serialized_on_wire(foo.bar) == False
def test_class_init():
@dataclass
class Bar(betterproto.Message):
name: str = betterproto.string_field(1)
@dataclass
class Foo(betterproto.Message):
name: str = betterproto.string_field(1)
child: Bar = betterproto.message_field(2)
foo = Foo(name="foo", child=Bar(name="bar"))
assert foo.to_dict() == {"name": "foo", "child": {"name": "bar"}}
def test_enum_as_int_json():
class TestEnum(betterproto.Enum):
ZERO = 0
@@ -115,3 +131,135 @@ def test_oneof_support():
assert betterproto.which_one_of(foo2, "group1")[0] == "bar"
assert foo.bar == 0
assert betterproto.which_one_of(foo2, "group2")[0] == ""
def test_json_casing():
@dataclass
class CasingTest(betterproto.Message):
pascal_case: int = betterproto.int32_field(1)
camel_case: int = betterproto.int32_field(2)
snake_case: int = betterproto.int32_field(3)
kabob_case: int = betterproto.int32_field(4)
# Parsing should accept almost any input
test = CasingTest().from_dict(
{"PascalCase": 1, "camelCase": 2, "snake_case": 3, "kabob-case": 4}
)
assert test == CasingTest(1, 2, 3, 4)
# Serializing should be strict.
assert test.to_dict() == {
"pascalCase": 1,
"camelCase": 2,
"snakeCase": 3,
"kabobCase": 4,
}
assert test.to_dict(casing=betterproto.Casing.SNAKE) == {
"pascal_case": 1,
"camel_case": 2,
"snake_case": 3,
"kabob_case": 4,
}
def test_optional_flag():
@dataclass
class Request(betterproto.Message):
flag: Optional[bool] = betterproto.message_field(1, wraps=betterproto.TYPE_BOOL)
# Serialization of not passed vs. set vs. zero-value.
assert bytes(Request()) == b""
assert bytes(Request(flag=True)) == b"\n\x02\x08\x01"
assert bytes(Request(flag=False)) == b"\n\x00"
# Differentiate between not passed and the zero-value.
assert Request().parse(b"").flag == None
assert Request().parse(b"\n\x00").flag == False
def test_to_dict_default_values():
@dataclass
class TestMessage(betterproto.Message):
some_int: int = betterproto.int32_field(1)
some_double: float = betterproto.double_field(2)
some_str: str = betterproto.string_field(3)
some_bool: bool = betterproto.bool_field(4)
# Empty dict
test = TestMessage().from_dict({})
assert test.to_dict(include_default_values=True) == {
"someInt": 0,
"someDouble": 0.0,
"someStr": "",
"someBool": False,
}
# All default values
test = TestMessage().from_dict(
{"someInt": 0, "someDouble": 0.0, "someStr": "", "someBool": False}
)
assert test.to_dict(include_default_values=True) == {
"someInt": 0,
"someDouble": 0.0,
"someStr": "",
"someBool": False,
}
# Some default and some other values
@dataclass
class TestMessage2(betterproto.Message):
some_int: int = betterproto.int32_field(1)
some_double: float = betterproto.double_field(2)
some_str: str = betterproto.string_field(3)
some_bool: bool = betterproto.bool_field(4)
some_default_int: int = betterproto.int32_field(5)
some_default_double: float = betterproto.double_field(6)
some_default_str: str = betterproto.string_field(7)
some_default_bool: bool = betterproto.bool_field(8)
test = TestMessage2().from_dict(
{
"someInt": 2,
"someDouble": 1.2,
"someStr": "hello",
"someBool": True,
"someDefaultInt": 0,
"someDefaultDouble": 0.0,
"someDefaultStr": "",
"someDefaultBool": False,
}
)
assert test.to_dict(include_default_values=True) == {
"someInt": 2,
"someDouble": 1.2,
"someStr": "hello",
"someBool": True,
"someDefaultInt": 0,
"someDefaultDouble": 0.0,
"someDefaultStr": "",
"someDefaultBool": False,
}
# Nested messages
@dataclass
class TestChildMessage(betterproto.Message):
some_other_int: int = betterproto.int32_field(1)
@dataclass
class TestParentMessage(betterproto.Message):
some_int: int = betterproto.int32_field(1)
some_double: float = betterproto.double_field(2)
some_message: TestChildMessage = betterproto.message_field(3)
test = TestParentMessage().from_dict({"someInt": 0, "someDouble": 1.2})
assert test.to_dict(include_default_values=True) == {
"someInt": 0,
"someDouble": 1.2,
"someMessage": {"someOtherInt": 0},
}

@@ -1,32 +1,147 @@
import importlib
import json
import os
import sys
from collections import namedtuple
from typing import Set
import pytest
from .generate import get_base, get_files
import betterproto
from betterproto.tests.inputs import xfail
from betterproto.tests.mocks import MockChannel
from betterproto.tests.util import get_directories, get_test_case_json_data, inputs_path
inputs = get_files(".bin")
# Force pure-python implementation instead of C++, otherwise imports
# break things because we can't properly reset the symbol database.
os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"
from google.protobuf import symbol_database
from google.protobuf.descriptor_pool import DescriptorPool
from google.protobuf.json_format import Parse
@pytest.mark.parametrize("filename", inputs)
def test_sample(filename: str) -> None:
module = get_base(filename).split("-")[0]
imported = importlib.import_module(f"betterproto.tests.{module}")
data_binary = open(filename, "rb").read()
data_dict = json.loads(open(filename.replace(".bin", ".json")).read())
t1 = imported.Test().parse(data_binary)
t2 = imported.Test().from_dict(data_dict)
print(t1)
print(t2)
class TestCases:
def __init__(self, path, services: Set[str], xfail: Set[str]):
_all = set(get_directories(path))
_services = services
_messages = _all - services
_messages_with_json = {
test for test in _messages if get_test_case_json_data(test)
}
# Equality should automagically work for dataclasses!
assert t1 == t2
self.all = self.apply_xfail_marks(_all, xfail)
self.services = self.apply_xfail_marks(_services, xfail)
self.messages = self.apply_xfail_marks(_messages, xfail)
self.messages_with_json = self.apply_xfail_marks(_messages_with_json, xfail)
# Generally this can't be relied on, but here we are aiming to match the
# existing Python implementation and aren't doing anything tricky.
# https://developers.google.com/protocol-buffers/docs/encoding#implications
assert bytes(t1) == data_binary
assert bytes(t2) == data_binary
@staticmethod
def apply_xfail_marks(test_set: Set[str], xfail: Set[str]):
return [
pytest.param(test, marks=pytest.mark.xfail) if test in xfail else test
for test in test_set
]
assert t1.to_dict() == data_dict
assert t2.to_dict() == data_dict
test_cases = TestCases(
path=inputs_path,
# test cases for services
services={"googletypes_response", "googletypes_response_embedded", "service"},
xfail=xfail.tests,
)
plugin_output_package = "betterproto.tests.output_betterproto"
reference_output_package = "betterproto.tests.output_reference"
TestData = namedtuple("TestData", "plugin_module, reference_module, json_data")
@pytest.fixture
def test_data(request):
test_case_name = request.param
# Reset the internal symbol database so we can import the `Test` message
# multiple times. Ugh.
sym = symbol_database.Default()
sym.pool = DescriptorPool()
reference_module_root = os.path.join(
*reference_output_package.split("."), test_case_name
)
sys.path.append(reference_module_root)
yield (
TestData(
plugin_module=importlib.import_module(
f"{plugin_output_package}.{test_case_name}.{test_case_name}"
),
reference_module=lambda: importlib.import_module(
f"{reference_output_package}.{test_case_name}.{test_case_name}_pb2"
),
json_data=get_test_case_json_data(test_case_name),
)
)
sys.path.remove(reference_module_root)
@pytest.mark.parametrize("test_data", test_cases.messages, indirect=True)
def test_message_can_be_instantiated(test_data: TestData) -> None:
plugin_module, *_ = test_data
plugin_module.Test()
@pytest.mark.parametrize("test_data", test_cases.messages, indirect=True)
def test_message_equality(test_data: TestData) -> None:
plugin_module, *_ = test_data
message1 = plugin_module.Test()
message2 = plugin_module.Test()
assert message1 == message2
@pytest.mark.parametrize("test_data", test_cases.messages_with_json, indirect=True)
def test_message_json(repeat, test_data: TestData) -> None:
plugin_module, _, json_data = test_data
for _ in range(repeat):
message: betterproto.Message = plugin_module.Test()
message.from_json(json_data)
message_json = message.to_json(0)
assert json.loads(json_data) == json.loads(message_json)
@pytest.mark.parametrize("test_data", test_cases.services, indirect=True)
def test_service_can_be_instantiated(test_data: TestData) -> None:
plugin_module, _, json_data = test_data
plugin_module.TestStub(MockChannel())
@pytest.mark.parametrize("test_data", test_cases.messages_with_json, indirect=True)
def test_binary_compatibility(repeat, test_data: TestData) -> None:
plugin_module, reference_module, json_data = test_data
reference_instance = Parse(json_data, reference_module().Test())
reference_binary_output = reference_instance.SerializeToString()
for _ in range(repeat):
plugin_instance_from_json: betterproto.Message = plugin_module.Test().from_json(
json_data
)
plugin_instance_from_binary = plugin_module.Test.FromString(
reference_binary_output
)
# # Generally this can't be relied on, but here we are aiming to match the
# # existing Python implementation and aren't doing anything tricky.
# # https://developers.google.com/protocol-buffers/docs/encoding#implications
assert bytes(plugin_instance_from_json) == reference_binary_output
assert bytes(plugin_instance_from_binary) == reference_binary_output
assert plugin_instance_from_json == plugin_instance_from_binary
assert (
plugin_instance_from_json.to_dict() == plugin_instance_from_binary.to_dict()
)

betterproto/tests/util.py

@@ -0,0 +1,61 @@
import os
import subprocess
from typing import Generator
os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"
root_path = os.path.dirname(os.path.realpath(__file__))
inputs_path = os.path.join(root_path, "inputs")
output_path_reference = os.path.join(root_path, "output_reference")
output_path_betterproto = os.path.join(root_path, "output_betterproto")
if os.name == "nt":
plugin_path = os.path.join(root_path, "..", "plugin.bat")
else:
plugin_path = os.path.join(root_path, "..", "plugin.py")
def get_files(path, end: str) -> Generator[str, None, None]:
for r, dirs, files in os.walk(path):
for filename in [f for f in files if f.endswith(end)]:
yield os.path.join(r, filename)
def get_directories(path):
for root, directories, files in os.walk(path):
for directory in directories:
yield directory
def relative(file: str, path: str):
return os.path.join(os.path.dirname(file), path)
def read_relative(file: str, path: str):
with open(relative(file, path)) as fh:
return fh.read()
def protoc_plugin(path: str, output_dir: str):
subprocess.run(
f"protoc --plugin=protoc-gen-custom={plugin_path} --custom_out={output_dir} --proto_path={path} {path}/*.proto",
shell=True,
)
def protoc_reference(path: str, output_dir: str):
subprocess.run(
f"protoc --python_out={output_dir} --proto_path={path} {path}/*.proto",
shell=True,
)
def get_test_case_json_data(test_case_name, json_file_name=None):
test_data_file_name = json_file_name if json_file_name else f"{test_case_name}.json"
test_data_file_path = os.path.join(inputs_path, test_case_name, test_data_file_name)
if not os.path.exists(test_data_file_path):
return None
with open(test_data_file_path) as fh:
return fh.read()

conftest.py

@@ -0,0 +1,12 @@
import pytest
def pytest_addoption(parser):
parser.addoption(
"--repeat", type=int, default=1, help="repeat the operation multiple times"
)
@pytest.fixture(scope="session")
def repeat(request):
return request.config.getoption("repeat")
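
The repeat fixture feeds the "for _ in range(repeat)" loops in test_message_json and test_binary_compatibility above, so a run such as pytest --repeat 100 exercises the JSON and binary round-trips repeatedly; the default of 1 keeps an ordinary test run fast.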

@@ -1,5 +1,5 @@
[tool.black]
target-version = ['py37']
target-version = ['py36']
[tool.isort]
multi_line_output = 3

pytest.ini

@@ -0,0 +1,5 @@
[pytest]
python_files = test_*.py
python_classes =
norecursedirs = **/output_*
addopts = -p no:warnings

@@ -2,8 +2,10 @@ from setuptools import setup, find_packages
setup(
name="betterproto",
version="1.0",
version="1.2.5",
description="A better Protobuf / gRPC generator & library",
long_description=open("README.md", "r", encoding="utf-8").read(),
long_description_content_type="text/markdown",
url="http://github.com/danielgtaylor/python-betterproto",
author="Daniel G. Taylor",
author_email="danielgtaylor@gmail.com",
@@ -14,9 +16,14 @@ setup(
packages=find_packages(
exclude=["tests", "*.tests", "*.tests.*", "output", "output.*"]
),
package_data={"betterproto": ["py.typed", "templates/template.py"]},
python_requires=">=3.7",
install_requires=["grpclib"],
extras_require={"compiler": ["jinja2", "protobuf"]},
package_data={"betterproto": ["py.typed", "templates/template.py.j2"]},
python_requires=">=3.6",
install_requires=[
'dataclasses; python_version<"3.7"',
'backports-datetime-fromisoformat; python_version<"3.7"',
"grpclib",
"stringcase",
],
extras_require={"compiler": ["black", "jinja2", "protobuf"]},
zip_safe=False,
)