284 Commits

Author SHA1 Message Date
Basileus
3eaff291c4 Changelog additions 2022-01-27 09:33:28 +11:00
Michael Osthege
9b5594adbe Format field comments also as docstrings (#304)
Closes #303

* Format field comments also as docstrings
To make it clear that they refer to the item above.
* Fix placement of enum item docstrings
* Add line breaks after class attribute or enum item docstrings
2022-01-27 09:25:48 +11:00
Danil Akhtarov
d991040ff6 Fix message text in NotImplementedError (#325) 2022-01-21 11:39:09 +00:00
efokschaner
d260f071e0 Client and Service Stubs take 1 request parameter, not one for each field (#311) 2022-01-17 19:58:57 +01:00
James Hilton-Balfe
6dd7baa26c Release v2.0.0.b4 (#307)
Co-authored-by: Kalan <22137047+kalzoo@users.noreply.github.com>
2022-01-03 18:18:44 +00:00
Kalan
573c7292a6 Add Python 3.10 to GitHub Actions test matrix (#280)
Co-authored-by: James Hilton-Balfe <50501825+Gobot1234@users.noreply.github.com>
2021-12-29 23:10:34 +00:00
Kalan
d77f44ebb7 Support proto3 field presence (#281)
* Update protobuf pregenerated files

* Update grpcio-tools to latest version

* Implement proto3 field presence

* Fix to_dict with None optional fields.

* Add test with optional enum

* Properly support optional enums

* Add tests for 64-bit ints and floats

* Support field presence for int64 types

* Fix oneof serialization with proto3 field presence (#292)

= Description

The serialization of a oneof message that contains a message with fields
with explicit presence was buggy.

For example:

```
message A {
    oneof kind {
        B b = 1;
        C c = 2;
    }
}

message B {}
message C {
    optional bool z = 1;
}
```

Serializing `A(b=B())` would lead to this payload:

```
0A # tag1, length delimited
00 # length: 0
12 # tag2, length delimited
00 # length: 0
```

When deserialized, this leads to the message `A(c=C())`.

= Explanation

The issue lies in the __post_init__ method. All fields are introspected, and
if one differs from PLACEHOLDER, the message is marked as having been
"serialized_on_wire".
Then, when serializing `A(b=B())`, we go through each field of the
oneof:

- field 'b': this is the selected field from the group, so it is
  serialized
- field 'c': marked as 'serialized_on_wire', so it is added as well.

= Fix

The issue is that support for explicit presence changed the default
value from PLACEHOLDER to None. This breaks the __post_init__ method in that
case, but it is relatively easy to fix: if a field is optional and set
to None, it is considered to hold the default value (which it does).

This fix, however, has a side effect: the group_current for this field (the
oneof trick for explicit presence) is no longer set. This changes the
behavior when serializing the message to JSON: because the value is the
default one (None) and the group is not set (which would otherwise force the
field to be serialized), None fields are no longer serialized to JSON. This
breaks one test, which is fixed in the next commit.
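For illustration, with `A`, `B` and `C` generated from the proto above, the fixed behaviour can be checked with a round trip (a sketch assuming betterproto-generated classes):

```py
# Previously bytes(A(b=B())) deserialized to A(c=C()); after the fix the
# selected oneof member survives the round trip.
a = A(b=B())
assert A().parse(bytes(a)) == A(b=B())
```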

* fix: do not serialize None fields in JSON format

This is linked to the fix from the previous commit: after it, scalar
None fields were not included in the JSON format, but some were still
included.

This is all cleaned up: None fields are not added to JSON by default,
as they indicate the default value of fields with explicit presence.
However, if `include_default_values` is set, they are included.
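For reference, this is the `to_dict` switch referred to above (a sketch; `msg` stands for any message instance with optional fields):

```py
msg.to_dict()                             # omits None-valued optional fields
msg.to_dict(include_default_values=True)  # includes default/None values
```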

* Fix: use builtin annotation prefix

* Remove comment

Co-authored-by: roblabla <unfiltered@roblab.la>
Co-authored-by: Vincent Thiberville <vthib@pm.me>
2021-12-29 13:38:32 -08:00
dependabot[bot]
671c0ff4ac Bump urllib3 from 1.26.4 to 1.26.5 (#288)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.4 to 1.26.5.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/1.26.4...1.26.5)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-12-11 18:31:26 -08:00
dependabot[bot]
9cecc8c3ff Bump babel from 2.9.0 to 2.9.1 (#289)
Bumps [babel](https://github.com/python-babel/babel) from 2.9.0 to 2.9.1.
- [Release notes](https://github.com/python-babel/babel/releases)
- [Changelog](https://github.com/python-babel/babel/blob/master/CHANGES)
- [Commits](https://github.com/python-babel/babel/compare/v2.9.0...v2.9.1)

---
updated-dependencies:
- dependency-name: babel
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-12-11 18:30:43 -08:00
Kim Gustyr
bc3cfc5562 Fix default values for enum service args #298 (#299) 2021-12-03 21:26:48 +00:00
guysz
b0a36d12e4 Fix compilation of fields with name identical to their type (#294)
* Revert "Fix compilation of fields named 'bytes' or 'str' (#226)"

This reverts commit deb623ed14.

* Fix compilation of fields with name identical to their type

* Added test for field-name identical to python type

Co-authored-by: Guy Szweigman <guysz@nvidia.com>
2021-12-01 16:31:02 +00:00
Kalan
a4d2d39546 Fix Python 3.9 Tests (#284)
Co-authored-by: James Hilton-Balfe <50501825+Gobot1234@users.noreply.github.com>
2021-11-19 21:32:36 +00:00
lazytype
c424b6f8db Include AsyncIterator import for both clients and servers (#264)
Co-authored-by: Robin Lambertz <github@roblab.la>
2021-11-05 14:22:15 +00:00
James Hilton-Balfe
421fdba309 Allow parsing of messages from ByteStrings #266 2021-10-26 00:34:33 +01:00
Robin Lambertz
fb2793e0b6 Allow parsing messages from byteslike
Byteslike objects (like memoryview) do not have a decode function defined.
Instead, a string may be created from them by passing them to the str
constructor along with an encoding.
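For example (plain Python, independent of betterproto):

```py
# memoryview has no .decode(), but the str constructor with an explicit
# encoding works for any bytes-like object:
view = memoryview(b"hello")
assert str(view, "utf-8") == "hello"
```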
2021-08-25 12:53:02 +02:00
PIGNOSE
ad8b91766a Add benchmarking cases for nested, repeat and deserialize (#241) 2021-06-21 23:38:22 +02:00
Bekhzod Tillakhanov
a33126544b Fix readme docs 'Async gRPC Support' (#249) 2021-06-21 23:29:59 +02:00
nat
02e41afd09 Release v2.0.0b3 (#182)
Updated change log to include all new features and fixes from master.
2021-04-07 12:57:44 +02:00
nat
7368299a70 Fix serialization of repeated fields with empty messages (#180)
Extend test config and utils to support exclusion of certain json samples from
testing for symmetry.
2021-04-06 10:50:45 +10:00
nat
deb623ed14 Fix compilation of fields named 'bytes' or 'str' (#226)
* if you have a field named "bytes" using the bytes type, it doesn't work.
* Enable existing use-case & generalize solution to cover it

Co-authored-by: Spencer <spencer@sf-n.com>
2021-04-06 10:45:57 +10:00
nat
95339bf74d Misc cleanup, see commit body (#227)
- Enable oneof_enum test case that passes now (removed the xfail)
- Switch from toml to tomlkit as a dev dep for better toml support
- upgrade poethepoet to latest stable release
- use full table format for poe tasks to avoid long lines in pyproject.toml
- remove redundant _WrappedMessage class
- fix various Mypy warnings
- reformat some comments for consistent line length
2021-04-06 10:43:09 +10:00
nat
5b639c82b2 Micro-optimization: use tuples instead of lists for conditions (#228)
This should give a small speed boost to some critical code paths.
2021-04-06 10:40:45 +10:00
Matthew Badger
7c5ee47e68 Added support for infinite and nan floats/doubles (#215)
- Added support for the custom double values from
   the protobuf json spec: "Infinity", "-Infinity", and "NaN"
- Added `infinite_floats` test data
- Updated Message.__eq__ to consider nan values
   equal
- Updated `test_message_json` and
   `test_binary_compatibility` to replace NaN float
   values in dictionaries before comparison
   (because two NaN values are not equal; see the check below)
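The NaN caveat is plain IEEE-754 behaviour, easy to verify in Python:

```py
import math

# NaN never compares equal to itself, hence the replacement step in the tests:
assert float("NaN") != float("NaN")
assert math.isnan(float("NaN"))
# The protobuf JSON spec spells the special values as strings, and Python's
# float() accepts exactly those spellings:
assert float("Infinity") == math.inf and float("-Infinity") == -math.inf
```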
2021-04-02 15:15:28 +02:00
Nat Noordanus
bb646fe26f Fix template bug resulting in empty __post_init__ methods 2021-04-02 10:13:08 +11:00
Nat Noordanus
fc90653ab1 Sort the list of sources in generated file headers 2021-04-02 10:13:00 +11:00
Nat Noordanus
2a73dbac98 Make plugin use betterproto generated classes internally
This means the betterproto plugin no longer needs to depend directly on
protobuf.

This requires a small runtime hack to monkey patch some google types to
get around the fact that the compiler uses proto2, but betterproto
expects proto3.

Also:
- regenerate google.protobuf package
- fix a regex bug in the logic for determining whether to use a google
  wrapper type.
- fix a bug causing comments to get mixed up when multiple proto files
  generate code into a single python module
2021-04-02 10:13:00 +11:00
nat
891c9e5d6c Update readme to avoid confusion about unreleased features. (#223) 2021-04-01 20:40:02 +02:00
Nat Noordanus
a890514b5c Update deps & add generate_lib task
- Remove plugin dependency on protobuf since it's no longer required.
- Update poethepoet for better pyproject toml syntax support
- Add handy generate_lib poe task for maintaining generated libs
2021-04-01 09:49:22 +11:00
Nat Noordanus
fe1e712fdb Make plugin use betterproto generated classes internally
This means the betterproto plugin no longer needs to depend directly on
protobuf.

This requires a small runtime hack to monkey patch some google types to
get around the fact that the compiler uses proto2, but betterproto
expects proto3.

Also:
- regenerate google.protobuf package
- fix a regex bug in the logic for determining whether to use a google
  wrapper type.
- fix a bug causing comments to get mixed up when multiple proto files
  generate code into a single python module
2021-04-01 09:49:22 +11:00
Vasili Syrakis
7a358a63cf Add __version__ attribute to package 2021-03-31 11:44:32 +11:00
nat
342e6559dc Properly serialize zero-value messages in a oneof group (#176)
Also improve test utils to make it easier to have multiple json examples.

Co-authored-by: Christopher Chambers <chris@peanutcode.com>
2021-03-15 13:52:35 +01:00
Vladimir Solomatin
2f62189346 Fix typing and datetime imports not being present for service method type annotations (#183) 2021-03-12 22:15:15 +01:00
MinJune Kim
8a215367ad Allow empty services (#222)
Fixes issue #220
2021-03-12 21:49:58 +01:00
robinaly
6c1c41e9cc Use dateutil parser (#213)
Switch to using `isoparse` from `dateutil.parser` instead of `datetime.fromisoformat` for more robust parsing of dates in from_dict.
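The difference shows up with the trailing `Z` (UTC) designator that protobuf JSON timestamps use; this is plain dateutil/stdlib behaviour, not betterproto code:

```py
from dateutil.parser import isoparse

# isoparse accepts the "Z" suffix; datetime.fromisoformat rejects it
# on Python versions before 3.11.
print(isoparse("2021-02-24T22:18:05Z"))  # 2021-02-24 22:18:05+00:00
```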
2021-02-24 22:18:05 +01:00
Matthew Badger
9e6881999e Add support for repeated timestamps and durations to to_dict from_dict (#211) 2021-02-16 19:54:50 +01:00
nat
59f5f88c0d Rebuild poetry.lock to fix CI (#202) 2021-01-25 20:28:30 +01:00
Tim Schmidt
8eea5fe256 added documentation for server-facing stubs (#186) 2021-01-24 22:20:32 +01:00
Tim Schmidt
1d54ef8f99 Generate grpclib service stubs (#170) 2020-12-04 22:22:11 +01:00
nat
73cea12e1f Fix incorrect routes in generated client when service is not in a package (#177) 2020-11-28 17:50:25 +01:00
Arun Babu Neelicattu
a157f05480 Release v2.0.0b2 (#175) 2020-11-24 23:04:33 +01:00
James
69dfe9cafc Implement Message.__bool__ (#142)
* Implement Message.__bool__ with similar semantics to a collection, such that any value being set on the message (i.e. having a non-default value) makes the Message truthy (see the sketch below).

Co-authored-by: nat <n@natn.me>
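A usage sketch of those semantics (`Example` is a hypothetical generated message, not part of the PR):

```py
# Falsy while every field holds its default value, truthy once any is set:
assert not Example()
assert Example(name="x")
```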
2020-11-24 19:35:09 +01:00
Arun Babu Neelicattu
a8a082e4e7 Update dependencies and add ci checks for python 3.9 (#173)
* Update locked dependencies to fix grpcio compile issue with python 3.9
* ci: add python 3.9
2020-11-24 19:28:28 +01:00
Tim Schmidt
e44de6da06 replace now-disabled set-env command (#172)
thanks @abn
2020-11-21 14:42:50 +01:00
James
a5e0ef910f Fixes for Python 3.9 (#140)
Fix issue in logic for evaluating field types affecting python 3.9
2020-11-01 15:23:02 +01:00
James
8f7af272cc QOL fixes (#141)
- Add missing type annotations
- Various style improvements
- Use constants more consistently
- enforce black on benchmark code
2020-10-17 19:27:11 +02:00
Arun Babu Neelicattu
bf9412e083 Use poetry-core as PEP 517 build backend (#108)
This change replaces the use of poetry as the build backend in favour
of the leaner poetry-core. This speeds up PEP-517 builds for source
installs, tox environment setup etc.
2020-10-01 14:45:45 +02:00
Keerthan Jaic
4630c1cc67 bump grpclib to 0.4.1 (#150) 2020-09-23 21:55:23 +02:00
James
d3e4fbb311 Add Documentation (#125)
Add sphinx docs with readthedocs integration.

Docs can be built locally with `poe docs`.
2020-09-20 22:00:02 +02:00
Jonas Kalderstam
58556e0eb6 Update README with example of calling protoc from python (#149) 2020-09-19 17:03:49 +02:00
Adrian Garcia Badaracco
a3f5f21738 Add benchmarks (#148)
Add asv based benchmarks to guide future optimisation work.
2020-09-19 16:28:16 +02:00
Arun Babu Neelicattu
0028cc384a Relax black version constraints (#146)
This change ensures that the built wheel only requests the minimum
version of black it requires to function as intended. Without this
change any project that uses betterproto[compiler] would break while
resolving dependencies.
2020-08-31 22:10:57 +02:00
Chris Chambers
034e2e7da0 Add support for recursive messages (#130)
Changes message initialization (`__post_init__`) so that default values
are no longer eagerly created to prevent infinite recursion when
initializing recursive messages.

As a result, `PLACEHOLDER` will be present in the message for any
uninitialized fields.  So, an implementation of `__getattribute__` is
added that checks for `PLACEHOLDER` and lazily creates and stores
default field values.

And, because `PLACEHOLDER` values don't compare equal with zero values,
a custom implementation of `__eq__` is provided, and the code generation
template is updated so that messages generate with `@dataclass(eq=False)`.

Also add a new Message __repr__ implementation that skips PLACEHOLDER
values and orders keys by field number from the proto.
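A minimal, self-contained sketch of the lazy-default mechanism described above (assumed names; not the library's exact code):

```py
PLACEHOLDER = object()  # stand-in for the library's sentinel

class LazyDefaults:
    """Sketch: defaults are created on first attribute access, so recursive
    message types no longer recurse at construction time."""
    name = PLACEHOLDER

    def __getattribute__(self, attr):
        value = super().__getattribute__(attr)
        if value is PLACEHOLDER:
            value = ""  # lazily create the field's default
            super().__setattr__(attr, value)
        return value

obj = LazyDefaults()
assert obj.name == ""  # default materialized on first access, then stored
```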

Co-authored-by: Christopher Chambers <chris@peanutcode.com>
Co-authored-by: nat <n@natn.me>
Co-authored-by: James <50501825+Gobot1234@users.noreply.github.com>
2020-08-30 21:04:36 +02:00
James
ca16b6ed34 Various micro-optimizations (#139) 2020-08-30 17:23:57 +02:00
James
16d554db75 Update black 2020-08-29 17:15:59 +02:00
Adrian Garcia Badaracco
9ef5503728 Small improvements to models.py 2020-08-23 14:26:15 +02:00
Adrian Garcia Badaracco
c93351ef21 Factor code template compilation out into a separate module 2020-08-09 20:06:39 +02:00
James
80bef7c94f Improve logic to avoid keyword collisions in generated code
Use the standard library keyword module instead of a hard-coded list, and apply it to enum keys as well.
2020-08-09 12:41:41 +02:00
nat
804805f0f5 Update poe (#132)
- This update improves support for windows & removes the direct dependency on poetry
2020-08-06 22:16:25 +02:00
Arun Babu Neelicattu
43c134d27c ci: refactor jobs and improve platform coverage (#128) 2020-07-30 14:47:38 +02:00
Arun Babu Neelicattu
0cd9510b54 Support deprecated message and fields (#126) 2020-07-30 14:47:01 +02:00
Arun Babu Neelicattu
beafc812ff Fix static type checking for grpclib client (#124)
* Fix static type checking in grpclib client
* Fix python3.6 compatibility issue with dataclasses
2020-07-30 11:30:58 +02:00
Arun Babu Neelicattu
3d8c0cb713 grpclib_client: handle trailer-only responses (#127)
Resolves: #123
2020-07-25 19:57:46 +02:00
nat
c513853301 Replace Makefile with poe tasks in pyproject.toml (#118)
https://github.com/nat-n/poethepoet
2020-07-25 19:54:40 +02:00
Brady Kieffer
c1a76a5f5e Serialize default values in oneofs when calling to_dict() or to_json() (#110)
* Serialize default values in oneofs when calling to_dict() or to_json()

This change is consistent with the official protobuf implementation. If
a default value is set when using a oneof, and the message is then
translated message -> JSON -> message, the default value is kept intact;
if no default value is set, the field remains null (see the sketch after
this list).

* Some cleanup + testing for nested messages with oneofs

* Cleanup oneof_enum test cases, they should be fixed

This _should_ address:
https://github.com/danielgtaylor/python-betterproto/issues/63

* Include default value oneof fields when serializing to bytes

This will cause oneof fields with default values to explicitly be sent
to clients. Note that this does not mean that all fields are serialized and
sent to clients, just those that _could_ be null and are not.

* Remove assignment when populating a sub-message within a proto

Also, move setattr out one indentation level

* Properly transform proto with empty string in oneof to bytes

Also, updated tests to ensure that which_one_of picks up the set field

* Formatting betterproto/__init__.py

* Adding test cases demonstrating equivalent behaviour with google impl

* Removing a temporary file I made locally

* Adding some clarifying comments

* Fixing tests for python38
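A round-trip sketch of the behaviour described in the first bullet (`Test` and its oneof group `kind` are hypothetical generated names):

```py
import betterproto

msg = Test(on=False)  # "on": a bool member of the oneof group "kind",
                      # explicitly set to its default value
assert betterproto.which_one_of(msg, "kind") == ("on", False)

# The chosen field survives message -> JSON -> message:
restored = Test().from_dict(msg.to_dict())
assert betterproto.which_one_of(restored, "kind") == ("on", False)
```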
2020-07-25 19:51:40 +02:00
Joshua Salzedo
2745953a8e Fix the readme gRPC usage example (#122)
* re-implement README gRPC client example to be a self-contained script
 - fix a syntax error
 - fix a usage error

* asyncio.run() was added in 3.7
 - this lib targets >= 3.6

* Apply suggestions from code review

Optimized imports, store RPC call result before printing

Co-authored-by: Arun Babu Neelicattu <arun.neelicattu@gmail.com>

* add entry-point check to example

Co-authored-by: Arun Babu Neelicattu <arun.neelicattu@gmail.com>
2020-07-25 19:45:26 +02:00
Adrian Garcia Badaracco
b5dcac1250 REF: Refactor plugin.py to use modular dataclasses in tree-like structure to represent parsed data (#121)
Refactor plugin to parse input into data-class based hierarchical structure
2020-07-25 19:44:02 +02:00
James
cbd3437080 Some minor consistency changes
- replace some usages of `==` with `is`
- use available constants instead of magic strings for type names

Co-authored-by: nat <nat.noordanus@gmail.com>
2020-07-12 16:07:27 +02:00
boukeversteegh
2585a07fcf Improve poetry install speed by first upgrading pip 2020-07-12 15:42:31 +02:00
Bouke Versteegh
6c29771f4c Fix: to_dict returns wrong enum fields when numbering is not consecutive (#102)
Fixes #93 to_dict returns wrong enum fields when numbering is not consecutive
2020-07-12 15:06:55 +02:00
Arun Babu Neelicattu
0ba0692dec Handle mutable default arguments cleanly
When generating code, ensure that default list/dict arguments are
initialised in local scope if unspecified or `None`.
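The generated pattern presumably follows the usual Python idiom (an illustrative sketch, not the exact generated code):

```py
from typing import Dict, List, Optional

# Default to None and create a fresh list/dict in local scope, so no mutable
# default is shared between calls:
def echo(values: Optional[List[str]] = None, tags: Optional[Dict[str, str]] = None):
    values = [] if values is None else values
    tags = {} if tags is None else tags
    return values, tags
```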
2020-07-11 22:33:44 +02:00
Arun Babu Neelicattu
42e197f985 Ensure we clean up egg-info directories 2020-07-11 19:51:01 +02:00
Arun Babu Neelicattu
459d12b24d Move betterproto → src/betterproto
This change avoids some nasty import issues and also ensures that the
right code is tested and arbitrary code is not included when packaging.
2020-07-11 19:51:01 +02:00
Arun Babu Neelicattu
cebf9176a3 Move betterproto/tests → tests 2020-07-11 19:51:01 +02:00
Bouke Versteegh
8864f4fdbd Merge pull request #103 from boukeversteegh/fix/service-input-message
Fix - No arguments are generated for stub methods when using `import` with proto definition
2020-07-10 22:55:05 +02:00
Arun Babu Neelicattu
03211604bc Replace dependency on protoc with grpcio-tools
This change removes the dependency on platform-provided protobuf tools
in favour of a `grpcio-tools` dependency. This makes both development and
compiler use independent of platform dependencies.
2020-07-10 13:16:40 +02:00
boukeversteegh
1d7ba850e9 Reorder methods, use BETTERPROTO_DUMP for dump env var, docs. 2020-07-09 23:09:34 +02:00
Bouke Versteegh
b2651335ce Merge pull request #112 from danielgtaylor/pr/readme-contribution
Updated readme with contribution section. More help welcome 😃
2020-07-09 22:53:22 +02:00
nat
5a591ef2a4 Add link to testing README in CONTRIBUTING.md 2020-07-09 20:41:13 +02:00
boukeversteegh
8d7d0efb9b Move contributing guide to CONTRIBUTING.md 2020-07-09 09:31:04 +02:00
boukeversteegh
b891d257f6 Updated readme with contribution section. More help welcome 😃 2020-07-09 00:16:36 +02:00
Bouke Versteegh
8bcb67b66f Merge pull request #81 from discord/serialized_on_wire_repeated
Always set serialized_on_wire for all parsed message fields
2020-07-08 23:10:14 +02:00
boukeversteegh
72d72b4603 Merge remote-tracking branch 'daniel/master' into fix/service-input-message
# Conflicts:
#	betterproto/plugin.py
2020-07-08 23:00:32 +02:00
Bouke Versteegh
3273ae4d2c Merge pull request #100 from boukeversteegh/fix/circular-dependencies
Import bug - Circular Dependencies
2020-07-07 21:45:06 +02:00
Bouke Versteegh
6fe666473d Merge pull request #106 from abn/minor-formatting
Minor non-functional improvements
2020-07-07 20:22:44 +02:00
Arun Babu Neelicattu
0338fcba29 Ignore commonly used .venv directory 2020-07-07 19:23:38 +02:00
Arun Babu Neelicattu
0f3ad25770 Minor non-functional changes
- fix few typos
- remove unused imports
- fix minor code-quality issues
- replace `grpclib._protocols` with `grpclib._typing`
- fix boolean and None assertions in test cases
2020-07-07 19:23:38 +02:00
Bouke Versteegh
586e28d2dc Merge pull request #104 from abn/fix-casing
Add missing async/await keywords when casing
2020-07-07 14:32:51 +02:00
Arun Babu Neelicattu
a8d8159d27 Add missing async/await keywords when casing 2020-07-07 13:15:46 +02:00
boukeversteegh
3f519d4fb1 Fixes #23 again, a broken test made it seem the issue was fixed before. 2020-07-05 17:14:53 +02:00
boukeversteegh
dedead048f Read proto objects before services 2020-07-05 13:10:25 +02:00
boukeversteegh
87b3a4b86d Move parsing of protobuf data types and services into separate methods 2020-07-05 12:27:06 +02:00
boukeversteegh
f2e87192b0 Clarify variable names 2020-07-05 12:24:21 +02:00
boukeversteegh
98d00f0d21 Supports running plugin.py standalone by reading from a dump file, so it's possible to debug it. 2020-07-05 12:20:55 +02:00
Bouke Versteegh
bde6d06835 Merge pull request #99 from boukeversteegh/release-v2.0.0b1
Release v2.0.0b1
2020-07-05 10:20:37 +02:00
boukeversteegh
23dcbc2695 Fixes circular import problem when a non-circular dependency triangle is flattened into two python packages 2020-07-04 15:49:55 +02:00
boukeversteegh
0af0cf4bfb Fixes circular import problem when a non-circular dependency triangle is flattened into two python packages 2020-07-04 15:35:42 +02:00
boukeversteegh
eaa4f7f5d9 Release v2.0.0b1 2020-07-04 14:00:35 +02:00
Bouke Versteegh
cdddb2f42a Merge pull request #88 from boukeversteegh/fix/imports
🍏 Fix imports
2020-07-04 11:22:12 +02:00
boukeversteegh
d21cd6e391 black 2020-07-01 13:15:03 +02:00
boukeversteegh
af7115429a Expose betterproto.ServiceStub 2020-07-01 12:43:28 +02:00
boukeversteegh
0d9387abec Remove stringcase dependency 2020-07-01 12:43:12 +02:00
boukeversteegh
f4ebcb0f65 Merge remote-tracking branch 'daniel/master' into fix/imports
# Conflicts:
#	Pipfile
#	README.md
#	betterproto/__init__.py
#	betterproto/plugin.py
#	betterproto/tests/util.py
2020-07-01 12:19:25 +02:00
boukeversteegh
81711d2427 Avoid naming conflicts when importing multiple types with the same name from an ancestor package 2020-07-01 12:07:59 +02:00
boukeversteegh
e3135ce766 Add parameter for non-strict cased output that preserves delimiter count 2020-07-01 09:39:37 +02:00
Bouke Versteegh
9532844929 Merge pull request #83 from nat-n/client-streaming
Client streaming
2020-06-24 22:13:54 +02:00
nat
0c5d1ff868 Merge branch 'master' into client-streaming 2020-06-23 22:02:23 +02:00
Bouke Versteegh
5fb4b4b7ff Merge pull request #75 from nat-n/add_poetry
Switch from pipenv to poetry
2020-06-23 21:59:46 +02:00
Nat Noordanus
4f820b4a6a Include python 3.8 in CI test runs & optimise CI and make config 2020-06-22 19:38:41 +02:00
Nat Noordanus
75a4c230da Add optional deps to dev-deps
So contributors don't have to remember to run poetry install with `-E compiler`
2020-06-22 19:35:23 +02:00
nat
5c9a12e2f6 Merge pull request #1 from boukeversteegh/client-streaming-tests
Client streaming tests
2020-06-16 19:36:40 +02:00
Nat Noordanus
e1ccd540a9 Fix bugs and remove footgun feature in AsyncChannel 2020-06-16 00:07:28 +02:00
nat
4e78fe9579 Merge branch 'client-streaming' into client-streaming-tests 2020-06-15 23:42:01 +02:00
Nat Noordanus
50bb67bf5d Fix bugs and remove footgun feature in AsyncChannel 2020-06-15 23:35:56 +02:00
Bouke Versteegh
1ecbf1a125 Merge pull request #90 from jameslan/fix/fixed-types
fixed field types should be int
2020-06-15 19:48:31 +02:00
boukeversteegh
0814729c5a Add cases for send() 2020-06-15 18:14:13 +02:00
boukeversteegh
f7aa6150e2 Add test-cases for client stream-stream 2020-06-15 18:02:37 +02:00
boukeversteegh
159c30ddd8 Fix close not awaitable, fix done is callable, fix return async next value 2020-06-15 18:02:05 +02:00
Nat Noordanus
c8229e53a7 Fix most mypy warnings 2020-06-15 00:19:07 +02:00
Nat Noordanus
3185c67098 Improve generate script
- Fix issue with __pycache__ dirs getting picked up
- parallelise code generation with asyncio for 3x speedup
- silence protoc output unless -v option is supplied
- Use pathlib ;)
2020-06-15 00:19:07 +02:00
boukeversteegh
52eea5ce4c Added missing tests for casing 2020-06-14 23:15:56 +02:00
Nat Noordanus
4b6f55dce5 Finish implementation and testing of client
Including stream_unary and stream_stream call methods.

Also
- improve organisation of relevant tests
- fix some generated type annotations
- Add AsyncChannel utility cos it's useful
2020-06-14 23:04:52 +02:00
boukeversteegh
fdbe0205f1 find_module docstring and search for init files instead of directories 2020-06-14 22:54:03 +02:00
Nat Noordanus
09f821921f Move ServiceStub to a separate module and add more rpcs to service test 2020-06-14 22:19:51 +02:00
Hans Lellelid
a757da1b29 Adding basic support (untested) for client streaming 2020-06-14 22:19:51 +02:00
boukeversteegh
e2d672a422 Fix terminology, improve docstrings and add missing asserts to tests 2020-06-14 21:40:12 +02:00
boukeversteegh
63f5191f02 Shorten list selectors 2020-06-14 16:54:34 +02:00
boukeversteegh
87f4b34930 Revert "Support running plugin without installing betterproto"
This reverts commit c88edfd0
2020-06-14 16:52:33 +02:00
boukeversteegh
2c360a55f2 Improve readability when generating init_files 2020-06-14 16:51:52 +02:00
James Lan
04dce524aa fixed field types should be int 2020-06-12 17:04:56 -07:00
Nat Noordanus
8edec81b11 Switch from pipenv to poetry
- dropped dev dependency on rope, isort & flake
- poetry doesn't support dev scripts like pipenv, so create a makefile instead
- Add pytest-cov
- Use tox for testing multiple python versions in CI
- Update README

Update ci workflow
2020-06-12 21:13:55 +02:00
boukeversteegh
32c8e77274 Recompile Google Protobuf files 2020-06-12 13:56:32 +02:00
boukeversteegh
d9fa6d2dd3 Fixes issue where generated Google Protobuf messages imported from betterproto.lib instead of using local forward references 2020-06-12 13:55:55 +02:00
boukeversteegh
c88edfd093 Support running plugin without installing betterproto 2020-06-12 13:54:14 +02:00
nat
a46979c8a6 Merge pull request #86 from danielgtaylor/boukeversteegh-patch-1
Add Slack invite link
2020-06-11 17:26:38 +02:00
boukeversteegh
83e13aa606 Fix method name 2020-06-11 13:55:12 +02:00
boukeversteegh
3ca75dadd7 Remove dependency on stringcase, apply black 2020-06-11 13:55:12 +02:00
boukeversteegh
5d2f3a2cd9 Remove fixed test from xfail list #11 2020-06-11 13:55:12 +02:00
boukeversteegh
65c1f366ef Update readme with new output structure and fix example inconsistencies 2020-06-11 13:55:12 +02:00
boukeversteegh
34c34bd15a Add failing test for importing a message from package that looks like a nested type #87 2020-06-11 13:55:12 +02:00
boukeversteegh
fb54917f2c Detect entry-point of tests automatically 2020-06-11 13:55:12 +02:00
boukeversteegh
1a95a7988e Ensure uniquely generated import aliases are not name mangled (python.org/dev/peps/pep-0008/#id34) 2020-06-11 13:55:11 +02:00
boukeversteegh
76db2f153e Add import aliases to ancestor imports 2020-06-11 13:55:11 +02:00
boukeversteegh
8567892352 Simplify logic for generating package init files 2020-06-11 13:55:11 +02:00
boukeversteegh
3105e952ea Fixes issue where importing a cousin package whose path contains a package with the same name broke the import 2020-06-11 13:55:11 +02:00
boukeversteegh
7c8d47de6d Add test cases for cousin imports that break due to aliases starting with two underscores 2020-06-11 13:55:11 +02:00
boukeversteegh
c00e2aef19 Break up importing logic into methods 2020-06-11 13:55:11 +02:00
boukeversteegh
fdf3b2e764 Compile proto files based on package structure 2020-06-11 13:55:11 +02:00
boukeversteegh
f7c2fd1194 Support nested messages, fix casing. Support test-cases in packages. 2020-06-11 13:55:11 +02:00
boukeversteegh
d8abb850f8 Update tests to reflect new generated package structure 2020-06-11 13:55:11 +02:00
boukeversteegh
d7ba27de2b fix all broken imports 2020-06-11 13:55:11 +02:00
boukeversteegh
57523a9e7f Implement importing unrelated package 2020-06-11 13:55:11 +02:00
boukeversteegh
e5e61c873c Implement some import scenarios 2020-06-11 13:55:11 +02:00
boukeversteegh
9fd1c058e6 Create unit tests for importing 2020-06-11 13:55:11 +02:00
boukeversteegh
d336153845 Use never expiring invitation link 2020-06-11 13:49:53 +02:00
nat
9a45ea9f16 Merge pull request #78 from boukeversteegh/pr/google
Basic general support for Google Protobuf
2020-06-11 10:50:12 +02:00
Bouke Versteegh
bb7f5229fb Add Slack invite link 2020-06-10 17:30:18 +02:00
boukeversteegh
f7769a19d1 Pass betterproto option using custom_opt instead of environment variable 2020-06-06 12:51:37 +02:00
Danny Weinberg
28a288924f Change to have parse *always* set serialized_on_wire 2020-06-04 16:20:32 -07:00
Danny Weinberg
5c700618fd Black again lol 2020-06-04 13:42:43 -07:00
Danny Weinberg
a914306f33 Put test into test_features, simplify to call parse directly 2020-06-04 13:42:07 -07:00
Danny Weinberg
67422db6b9 Fix formatting 2020-06-04 11:34:20 -07:00
Danny Weinberg
061bf86a9c Set serialized_on_wire when message contains only lists
This fixes a bug where serialized_on_wire was not set when a message contained only repeated values (e.g. in a list or map). The fix here is to just set it to true in the `parse` method as soon as we receive any valid data. This also adds a test to expose the behavior.
2020-06-04 11:04:36 -07:00
boukeversteegh
d31f90be6b Combine circular imports 2020-06-04 00:11:22 +02:00
boukeversteegh
919b0a6a7d Check if betterproto has wrapper support in idiomatic way 2020-06-04 00:02:28 +02:00
boukeversteegh
7ecf3fe0e6 Add comment to explain unusual import location 2020-06-04 00:02:28 +02:00
Bouke Versteegh
ff14948a4e Use raw string for regex
Co-authored-by: nat <nat.noordanus@gmail.com>
2020-06-04 00:02:28 +02:00
Bouke Versteegh
cb00273257 Fix name PROTOBUF_OPTS -> BETTERPROTO_OPTS 2020-06-04 00:02:28 +02:00
boukeversteegh
973d68a154 Add missing field to MockChannel to prevent warnings while testing 2020-06-04 00:02:28 +02:00
boukeversteegh
ab9857b5fd Add test-case for service that returns google protobuf values 2020-06-04 00:02:28 +02:00
boukeversteegh
2f658df666 Use betterproto wrapper classes, extract to module for testability 2020-06-04 00:02:28 +02:00
boukeversteegh
b813d1cedb Undo adding skip to test 2020-06-03 23:59:10 +02:00
boukeversteegh
f5ce1b7108 Check that config.xfail contains valid test case names 2020-06-03 23:59:10 +02:00
boukeversteegh
62fc421d60 Add failing tests for google.protobuf Struct and Value #9 2020-06-03 23:59:10 +02:00
boukeversteegh
eeed1c0db7 Extend pre-compiled Duration and Timestamp instead of manual definition 2020-06-03 23:58:47 +02:00
boukeversteegh
2a3e1e1827 Add basic support for all google.protobuf types 2020-06-03 23:58:47 +02:00
boukeversteegh
53ce1255d3 Do not unwrap google.protobuf.Value and unsupported wrapper types 2020-06-03 23:58:47 +02:00
boukeversteegh
e8991339e9 Use pre-compiled wrapper-classes 2020-06-03 23:54:43 +02:00
boukeversteegh
4556d67503 Include pre-compiled google protobuf classes 2020-06-03 23:54:43 +02:00
boukeversteegh
f087c6c9bd Support compiling google protobuf files 2020-06-03 23:54:43 +02:00
Bouke Versteegh
eec24e4ee8 Merge pull request #77 from danielgtaylor/nat-n-patch-1
Rearrange plugin import to make import errors more helpful
2020-05-30 20:52:35 +02:00
nat
91111ab7d8 Make plugin import errors more helpful
This addresses an issue where, if the user happens to have black installed
in their environment but not the other dependencies, running the protoc
plugin produces an unhelpful import error (No module named 'google').
2020-05-30 16:08:36 +02:00
Bouke Versteegh
fcff3dff74 Merge pull request #62 from jameslan/perf/cache-fields
Cache field metadata to avoid calling `dataclasses.fields`, yielding a more than 10% performance improvement
2020-05-29 12:17:25 +02:00
Bouke Versteegh
5c4969ff1c Merge pull request #69 from boukeversteegh/pr/bugreports
Bugreports
2020-05-28 09:07:11 +02:00
James Lan
ed33a48d64 Cache field metadata to avoid calling dataclasses.fields, yielding a more than 10% performance improvement 2020-05-27 15:58:14 -07:00
nat
ee362a7a73 Merge pull request #73 from nat-n/always_black
Bump version to 1.2.5
2020-05-27 13:37:54 +02:00
nat
261e55b2c8 Merge pull request #72 from nat-n/always_black
Make CI check formatting is black & append .j2 suffix to template.py
2020-05-27 12:27:33 +02:00
Nat Noordanus
98930ce0d7 Bump version to 1.2.5 2020-05-27 12:04:53 +02:00
Nat Noordanus
d7d277eb0d Remove typo from Pipfile and update Pipfile.lock 2020-05-27 11:52:18 +02:00
Nat Noordanus
3860c0ab11 Add task to run black --check in ci & update README 2020-05-27 11:52:10 +02:00
Nat Noordanus
cd1c2dc3b5 Rename template file to avoid confusing black or other build tools 2020-05-27 11:25:19 +02:00
Nat Noordanus
be2a24d15c blacken 2020-05-27 11:25:00 +02:00
Vasilios Syrakis
a5effb219a Release 1.2.4 (#71)
Co-authored-by: nat <n@natn.me>
2020-05-26 22:17:55 +02:00
boukeversteegh
b354aeb692 Add dict to list of built-in types for #53 2020-05-26 10:09:58 +02:00
boukeversteegh
6d9e3fc580 Add issue references to failing test cases 2020-05-25 23:43:01 +02:00
boukeversteegh
72de590651 Remove unused proto file 2020-05-25 23:36:09 +02:00
boukeversteegh
3c70f21074 #70 Messages should allow fields that are Python keywords 2020-05-25 23:36:08 +02:00
boukeversteegh
4b7d5d3de4 #53 Crash when field has the same name as a system type 2020-05-25 22:23:39 +02:00
Bouke Versteegh
2d57f0d122 Merge pull request #67 from danielgtaylor/nat-n-patch-1
Enforce utf-8 for reading the readme in setup.py
2020-05-25 21:57:12 +02:00
boukeversteegh
142e976c40 Add extra related test cases for #11 2020-05-25 21:56:03 +02:00
boukeversteegh
382fabb96c #11 ALL_CAPS message fields are parsed incorrectly 2020-05-25 21:50:30 +02:00
boukeversteegh
18598e77d4 Remove renamed service from test input config 2020-05-25 21:38:14 +02:00
boukeversteegh
6871053ab2 #9 Import bug - returning well known type Empty from service 2020-05-25 21:21:33 +02:00
boukeversteegh
5bb6931df7 #25 Two packages with the same name suffix should not cause naming conflict 2020-05-25 21:15:39 +02:00
boukeversteegh
e8a9960b73 Move configuration of test-cases to config file, include list of service tests 2020-05-25 21:11:33 +02:00
boukeversteegh
f25c66777a #68 Service input messages are not imported 2020-05-25 18:48:42 +02:00
nat
a68505b80e Enforce utf-8 for reading the readme
Fixes failing installation issue #66
2020-05-25 17:53:13 +02:00
nat
2f9497e064 Merge pull request #55 from boukeversteegh/pr/xfail-tests
Add intentionally failing test-cases for unimplemented bug-fixes
2020-05-25 09:54:26 +02:00
boukeversteegh
33964b883e Do not use mutable defaults 2020-05-25 00:35:43 +02:00
boukeversteegh
ec7574086d Add xfail test-case for future circular dependency scenario 2020-05-24 20:35:10 +02:00
boukeversteegh
8a42027bc9 Improve failing test-case for issue #64 2020-05-24 20:33:48 +02:00
boukeversteegh
71737cf696 Test case for issue #63 2020-05-24 20:29:32 +02:00
boukeversteegh
659ddd9c44 Working test case for oneof 2020-05-24 20:29:19 +02:00
boukeversteegh
5b6997870a Test case for issue #61 2020-05-24 20:27:12 +02:00
boukeversteegh
cdf7645722 Test case for issue #60 2020-05-24 20:26:47 +02:00
boukeversteegh
ca20069ca3 Test case for issue #59 2020-05-24 20:26:13 +02:00
boukeversteegh
59a4a7da43 Test case for issue #58 2020-05-24 20:25:29 +02:00
boukeversteegh
15af4367e5 Test case for issue #57 2020-05-24 20:24:55 +02:00
boukeversteegh
ec5683e572 Test Service instantiation as part of standard test-case 2020-05-24 20:02:41 +02:00
boukeversteegh
20150fdcf3 Cleanup 2020-05-24 19:58:49 +02:00
boukeversteegh
d11b7d04c5 Document XFAIL tests 2020-05-24 19:58:35 +02:00
boukeversteegh
e2d35f4696 Support xfail on test-case level, support running tests on subsets. 2020-05-24 19:58:06 +02:00
boukeversteegh
c3f08b9ef2 Clear output directories before generating python files 2020-05-24 19:54:53 +02:00
boukeversteegh
24d44898f4 Only import reference module when needed. Some reference modules generate bad imports and cannot be loaded. 2020-05-24 19:53:14 +02:00
boukeversteegh
074448c996 Restore accidentally removed binary equality test 2020-05-24 19:52:14 +02:00
nat
0fe557bd3c Merge pull request #52 from nat-n/fix_type_imports
Only import types from grpclib when type checking
2020-05-24 19:09:08 +02:00
nat
1a87ea43a1 Merge pull request #40 from boukeversteegh/pr/wrapper-as-output
Support using Google's wrapper types as RPC output values
2020-05-24 19:06:30 +02:00
andrei
983e0895a2 Fix services using non-pythonified field names 2020-05-24 18:46:36 +02:00
nat
4a2baf3f0a Merge pull request #46 from jameslan/perf/class-cache
Improve performance of serialize/deserialize by caching type information of fields in class
2020-05-24 18:38:32 +02:00
boukeversteegh
8f0caf1db2 Read desired wrapper type directly from wrapper definition 2020-05-24 14:50:56 +02:00
boukeversteegh
c50d9e2fdc Add test for generating embedded wellknown types in outputs. 2020-05-24 14:48:39 +02:00
boukeversteegh
35548cb43e Test all supported wrapper types. Add xfail test for unwrapping the value 2020-05-24 12:34:37 +02:00
boukeversteegh
b711d1e11f Merge remote-tracking branch 'daniel/master' into pr/wrapper-as-output 2020-05-24 10:41:40 +02:00
James Lan
917de09bb6 Replace extra decorator with property and lazy initialization so that it is backward compatible. 2020-05-23 17:36:29 -07:00
James Lan
1f7f39049e Cache resolved classes for fields, so that no new data classes are generated while deserializing. 2020-05-23 17:36:29 -07:00
James Lan
3d001a2a1a Store the class metadata of fields in the class, to improve performance
Cached data includes:
- lookup table between groups and fields of "oneof" fields
- default value creator of each field
- type hint of each field
2020-05-23 17:36:29 -07:00
James Lan
de61ddab21 Add option to repeatedly execute betterproto operations in test, to evaluate performance 2020-05-23 17:36:29 -07:00
Nat Noordanus
5e2d9febea Blacken 2020-05-23 23:37:22 +02:00
nat
f6af077ffe Merge pull request #51 from boukeversteegh/pr/refactor-tests
Reorganize tests and add some extra documentation.
2020-05-22 22:32:37 +02:00
boukeversteegh
92088ebda8 Cleanup 2020-05-22 21:18:44 +02:00
boukeversteegh
c3e3837f71 More concise whitelist logic 2020-05-22 21:11:23 +02:00
boukeversteegh
6bd9c7835c Fix docs 2020-05-22 21:08:08 +02:00
boukeversteegh
6ec902c1b5 Fix generate noargs. Sorted iteration. 2020-05-22 21:03:45 +02:00
boukeversteegh
960dba2ae8 Renamed docs for standard tests 2020-05-22 20:58:53 +02:00
boukeversteegh
4b4bdefb6f Add explicit test for casing rules 2020-05-22 20:58:31 +02:00
boukeversteegh
dfa0a56b39 Simplify standard tests by using 1 json per case. 2020-05-22 20:58:14 +02:00
boukeversteegh
dd4873dfba Re-introducing whitelisting argument to generate.py 2020-05-22 20:51:22 +02:00
Nat Noordanus
91f586f7d7 Apply black formatting 2020-05-22 18:46:43 +02:00
Nat Noordanus
33fb83faad Only import types from grpclib when type checking 2020-05-22 18:41:29 +02:00
boukeversteegh
77c04414f5 Update readme, add docs for standard tests 2020-05-22 16:36:43 +02:00
boukeversteegh
6969ff7ff6 Add another missing gitignored file, and remove gitignore filter for tests/ 2020-05-22 15:34:25 +02:00
boukeversteegh
13e08fdaa8 Add missing file, ignore output files 2020-05-22 15:05:52 +02:00
boukeversteegh
6775632f77 Undo unintentional pipfile update 2020-05-22 13:03:52 +02:00
boukeversteegh
b12f1e4e61 Organize test-cases into folders, extract compatibility test into proper test, support adding test-case specific tests 2020-05-22 12:54:01 +02:00
Bouke Versteegh
7e9ba0866c cleanup 2020-05-21 22:55:26 +02:00
nat
3546f55146 Merge pull request #32 from nat-n/improve_stub
Add ability to provide metadata, timeout & deadline args to requests
2020-05-21 10:11:45 +02:00
boukeversteegh
499489f1d3 Support using Google's wrapper types as RPC output values 2020-05-10 16:36:29 +02:00
Vasili Syrakis
ce9f492f50 Increment version to 1.2.3 2020-04-15 14:24:02 +10:00
Vasilios Syrakis
93a6334015 Update CHANGELOG.md 2020-04-15 14:21:30 +10:00
Adam Ehlers Nyholm Thomsen
36a14026d8 Fix issue that occurs with naming when proto is double nested (#21) 2020-04-15 14:10:43 +10:00
Vasilios Syrakis
04a2fcd3eb Merge pull request #31 from nat-n/fix_readme
Fix test instructions to match pipfile
2020-04-14 10:55:18 +10:00
Nat Noordanus
5759e323bd Add ability to provide metadata, timeout & deadline args to requests
This is an enhancement of the ServiceStub abstract class that makes
it more useful by making it possible to pass all arguments supported
by the underlying grpclib request function.

It extends the existing high-level API by allowing values to be
set on the stub instance, and the low-level API by allowing values
to be set per call.
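A usage sketch of those two APIs (`EchoStub` and its field keywords are hypothetical generated names; `channel` is a grpclib channel):

```py
# High-level: defaults set once on the stub apply to every request.
stub = EchoStub(channel, timeout=5.0, metadata={"authorization": "Bearer ..."})

# Low-level: per-call keyword arguments override the stub's defaults.
response = await stub.echo(value="hi", timeout=1.0)
```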
2020-04-12 22:23:10 +02:00
Nat Noordanus
c762c9c549 Add test for generated service stub
- Create one simple test for generated Service stubs in preparation
for making more changes in this area.
- Add dev dependency on pytest-asyncio in order to use ChannelFor
from grpclib.testing more easily.
- Create a new example proto containing a minimal rpc example.
2020-04-12 19:37:39 +02:00
Nat Noordanus
582a12577c Fix test instructions to match pipfile 2020-04-12 18:52:43 +02:00
Vasilios Syrakis
3616190451 Merge pull request #30 from nat-n/p36_support
#27 Add support for python 3.6
2020-04-08 09:37:48 +10:00
Nat Noordanus
9b990ee1bd Make pipenv play nice with the setup-python ci workflow 2020-04-05 15:58:12 +02:00
Vasilios Syrakis
72a77b0d65 Merge pull request #28 from tanishq-dubey/patch-1
Update README.md for pip syntax
2020-04-05 14:52:48 +10:00
Nat Noordanus
b2b36c8575 Apply black formatting 2020-04-03 19:54:19 +02:00
Nat Noordanus
203105f048 Add support for python 3.6
Changes:
- Update config and docs to reference 3.6
- Add backports of dataclasses and datetime.fromisoformat for python_version<"3.7"
- Support both 3.7 and 3.6 usages of undocumented __origin__ attribute on typing objects
- Make github ci run tests for python 3.6 as well
2020-04-03 19:52:19 +02:00
Tanishq Dubey
fe11f74227 Update README.md
Add quotes to the README so pip syntax is correct
2020-03-30 09:50:11 -04:00
Daniel G. Taylor
dc7a3e9bdf Update changelog 2020-01-30 17:48:12 -08:00
Daniel G. Taylor
f2e8afc609 Merge pull request #16 from cetanu/patch-1
Exclude empty lists from to_dict output
2020-01-30 17:31:25 -08:00
Daniel G. Taylor
dbd438e682 Update to emit empty lists if asked for defaults 2020-01-30 17:28:22 -08:00
Daniel G. Taylor
dce1c89fbe Merge branch 'master' into patch-1 2020-01-30 17:22:47 -08:00
Daniel G. Taylor
c78851b1b8 Merge pull request #12 from ulasozguler/master
Added `include_default_values` parameter to `to_dict` function
2020-01-30 17:19:34 -08:00
Vasilios Syrakis
4554d91f89 Exclude empty lists from to_dict output 2020-01-29 22:32:35 +11:00
ulas
c0170f4d80 Added include_default_values parameter to to_dict function. 2020-01-22 19:16:57 +03:00
Daniel G. Taylor
559b8833d8 Bump version to 1.2.2 2020-01-09 16:47:25 -08:00
Daniel G. Taylor
7ccef16579 Mention no proto 2, fixes #6 2020-01-09 16:43:45 -08:00
Daniel G. Taylor
d8785b4622 Merge pull request #10 from qix/master
Fix serialization of dataclass constructor parameters
2020-01-09 16:35:06 -08:00
Daniel G. Taylor
45e7a30300 Merge pull request #7 from ulasozguler/master
Fix - propagate `casing` param of `to_dict` function recursively
2020-01-09 16:32:29 -08:00
Josh Yudaken
d7559c22f8 Fix serialization of dataclass constructor parameters 2020-01-08 11:29:45 -05:00
ulas
f9c351a98d propagate casing param recursively. 2019-12-04 19:28:53 +03:00
Daniel G. Taylor
feea790116 Bump library version 2019-10-29 22:00:27 -07:00
Daniel G. Taylor
33f74f6a45 Fix comment indent bug; bump version 2019-10-29 21:59:23 -07:00
218 changed files with 12393 additions and 2491 deletions

.github/CONTRIBUTING.md (new file)

@@ -0,0 +1,23 @@
# Contributing
There's lots to do, and we're working hard, so any help is welcome!
- :speech_balloon: Join us on [Slack](https://join.slack.com/t/betterproto/shared_invite/zt-f0n0uolx-iN8gBNrkPxtKHTLpG3o1OQ)!
What can you do?
- :+1: Vote on [issues](https://github.com/danielgtaylor/python-betterproto/issues).
- :speech_balloon: Give feedback on [Pull Requests](https://github.com/danielgtaylor/python-betterproto/pulls) and [Issues](https://github.com/danielgtaylor/python-betterproto/issues):
    - Suggestions
    - Express approval
    - Raise concerns
- :small_red_triangle: Create an issue:
    - File a bug (please check it's not a duplicate)
    - Propose an enhancement
- :white_check_mark: Create a PR:
    - [Creating a failing test-case](https://github.com/danielgtaylor/python-betterproto/blob/master/betterproto/tests/README.md) to make bug-fixing easier
    - Fix any of the open issues
        - [Good first issues](https://github.com/danielgtaylor/python-betterproto/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22)
        - [Issues with tests](https://github.com/danielgtaylor/python-betterproto/issues?q=is%3Aissue+is%3Aopen+label%3A%22has+test%22)
    - New bugfix or idea
        - If you'd like to discuss your idea first, join us on Slack!

.github/workflows/ci.yml

@@ -1,33 +1,69 @@
 name: CI
-on: [push, pull_request]
+on:
+  push:
+    branches:
+      - master
+  pull_request:
+    branches:
+      - '**'
 jobs:
-  build:
-    runs-on: ubuntu-latest
+  tests:
+    name: ${{ matrix.os }} / ${{ matrix.python-version }}
+    runs-on: ${{ matrix.os }}-latest
+    strategy:
+      matrix:
+        os: [Ubuntu, MacOS, Windows]
+        python-version: ['3.6.7', '3.7', '3.8', '3.9', '3.10']
+        exclude:
+          - os: Windows
+            python-version: 3.6
     steps:
-      - uses: actions/checkout@v1
-      - uses: actions/setup-python@v1
+      - uses: actions/checkout@v2
+      - name: Set up Python ${{ matrix.python-version }}
+        uses: actions/setup-python@v2
         with:
-          python-version: 3.7
-      - uses: dschep/install-pipenv-action@v1
+          python-version: ${{ matrix.python-version }}
+      - name: Get full Python version
+        id: full-python-version
+        shell: bash
+        run: echo ::set-output name=version::$(python -c "import sys; print('-'.join(str(v) for v in sys.version_info))")
+      - name: Install poetry
+        shell: bash
+        run: |
+          python -m pip install poetry
+          echo "$HOME/.poetry/bin" >> $GITHUB_PATH
+      - name: Configure poetry
+        shell: bash
+        run: poetry config virtualenvs.in-project true
+      - name: Set up cache
+        uses: actions/cache@v2
+        id: cache
+        with:
+          path: .venv
+          key: venv-${{ runner.os }}-${{ steps.full-python-version.outputs.version }}-${{ hashFiles('**/poetry.lock') }}
+      - name: Ensure cache is healthy
+        if: steps.cache.outputs.cache-hit == 'true'
+        shell: bash
+        run: poetry run pip --version >/dev/null 2>&1 || rm -rf .venv
       - name: Install dependencies
+        shell: bash
         run: |
-          sudo apt install protobuf-compiler libprotobuf-dev
-          pipenv install --dev
-      - name: Run tests
-        run: |
-          cp .env.default .env
-          pipenv run pip install -e .
-          pipenv run generate
-          pipenv run test
-      - name: Build package
-        if: github.event_name == 'push' && startsWith(github.event.ref, 'refs/tags')
-        run: pipenv run python setup.py sdist
-      - name: Publish package
-        if: github.event_name == 'push' && startsWith(github.event.ref, 'refs/tags')
-        uses: pypa/gh-action-pypi-publish@v1.0.0a0
-        with:
-          user: __token__
-          password: ${{ secrets.pypi }}
+          poetry run python -m pip install pip -U
+          poetry install
+      - name: Generate code from proto files
+        shell: bash
+        run: poetry run python -m tests.generate -v
+      - name: Execute test suite
+        shell: bash
+        run: poetry run python -m pytest tests/

.github/workflows/code-quality.yml (new file)

@@ -0,0 +1,26 @@
name: Code Quality
on:
  push:
    branches:
      - master
  pull_request:
    branches:
      - '**'
jobs:
  check-formatting:
    name: Check code/doc formatting
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Run Black
        uses: lgeiger/black-action@master
        with:
          args: --check src/ tests/ benchmarks/
      - name: Install rST dependencies
        run: python -m pip install doc8
      - name: Lint documentation for errors
        run: python -m doc8 docs --max-line-length 88 --ignore-path-errors "docs/migrating.rst;D001"
        # it has a table which is longer than 88 characters

.github/workflows/release.yml (new file)

@@ -0,0 +1,31 @@
name: Release
on:
  push:
    branches:
      - master
    tags:
      - '**'
  pull_request:
    branches:
      - '**'
jobs:
  packaging:
    name: Distribution
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Set up Python 3.8
        uses: actions/setup-python@v2
        with:
          python-version: 3.8
      - name: Install poetry
        run: python -m pip install poetry
      - name: Build package
        run: poetry build
      - name: Publish package to PyPI
        if: github.event_name == 'push' && startsWith(github.event.ref, 'refs/tags')
        env:
          POETRY_PYPI_TOKEN_PYPI: ${{ secrets.pypi }}
        run: poetry publish -n

.gitignore

@@ -1,13 +1,20 @@
 .coverage
 .DS_Store
 .env
 .vscode/settings.json
 .mypy_cache
 .pytest_cache
-betterproto/tests/*.bin
-betterproto/tests/*_pb2.py
-betterproto/tests/*.py
-!betterproto/tests/generate.py
-!betterproto/tests/test_*.py
 .python-version
 build/
+tests/output_*
+**/__pycache__
+dist
+**/*.egg-info
+output
+.idea
+.DS_Store
+.tox
+.venv
+.asv
+venv
+.devcontainer

.readthedocs.yml (new file)

@@ -0,0 +1,17 @@
version: 2
formats: []
build:
  image: latest
sphinx:
  configuration: docs/conf.py
  fail_on_warning: false
python:
  version: 3.7
  install:
    - method: pip
      path: .
      extra_requirements:
        - dev

CHANGELOG.md

@@ -5,8 +5,143 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
- Versions suffixed with `b*` are in `beta` and can be installed with `pip install --pre betterproto`.
## [Unreleased]
- fix: Format field comments also as docstrings (#304)
- fix: Fix message text in NotImplementedError (#325)
- **Breaking**: Client and Service Stubs take 1 request parameter, not one for each field (#311)
Client and Service Stubs no longer pack and unpack the input message fields as parameters.
Update your client calls and server handlers as follows:
Clients before:
```py
response = await service.echo(value="hello", extra_times=1)
```
Clients after:
```py
response = await service.echo(EchoRequest(value="hello", extra_times=1))
```
Servers before:
```py
async def echo(self, value: str, extra_times: int) -> EchoResponse:
```
Servers after:
```py
async def echo(self, echo_request: EchoRequest) -> EchoResponse:
# Use echo_request.value
# Use echo_request.extra_times
```
## [2.0.0b4] - 2022-01-03
- **Breaking**: the minimum Python version has been bumped to `3.6.2`
- Always add `AsyncIterator` to imports if there are services [#264](https://github.com/danielgtaylor/python-betterproto/pull/264)
- Allow parsing of messages from `ByteStrings` [#266](https://github.com/danielgtaylor/python-betterproto/pull/266)
- Add support for proto3 optional [#281](https://github.com/danielgtaylor/python-betterproto/pull/281)
- Fix compilation of fields with names identical to builtin types [#294](https://github.com/danielgtaylor/python-betterproto/pull/294)
- Fix default values for enum service args [#299](https://github.com/danielgtaylor/python-betterproto/pull/299)
## [2.0.0b3] - 2021-04-07
- Generate grpclib service stubs [#170](https://github.com/danielgtaylor/python-betterproto/pull/170)
- Add \_\_version\_\_ attribute to package [#134](https://github.com/danielgtaylor/python-betterproto/pull/134)
- Use betterproto generated messages in the plugin [#161](https://github.com/danielgtaylor/python-betterproto/pull/161)
- Sort the list of sources in generated file headers [#164](https://github.com/danielgtaylor/python-betterproto/pull/164)
- Micro-optimization: use tuples instead of lists for conditions [#228](https://github.com/danielgtaylor/python-betterproto/pull/228)
- Improve datestring parsing [#213](https://github.com/danielgtaylor/python-betterproto/pull/213)
- Fix serialization of repeated fields with empty messages [#180](https://github.com/danielgtaylor/python-betterproto/pull/180)
- Fix compilation of fields named 'bytes' or 'str' [#226](https://github.com/danielgtaylor/python-betterproto/pull/226)
- Fix json serialization of infinite and nan floats/doubles [#215](https://github.com/danielgtaylor/python-betterproto/pull/215)
- Fix template bug resulting in empty \_\_post_init\_\_ methods [#162](https://github.com/danielgtaylor/python-betterproto/pull/162)
- Fix serialization of zero-value messages in a oneof group [#176](https://github.com/danielgtaylor/python-betterproto/pull/176)
- Fix missing typing and datetime imports [#183](https://github.com/danielgtaylor/python-betterproto/pull/183)
- Fix code generation for empty services [#222](https://github.com/danielgtaylor/python-betterproto/pull/222)
- Fix Message.to_dict and from_dict handling of repeated timestamps and durations [#211](https://github.com/danielgtaylor/python-betterproto/pull/211)
- Fix incorrect routes in generated client when service is not in a package [#177](https://github.com/danielgtaylor/python-betterproto/pull/177)
## [2.0.0b2] - 2020-11-24
- Add support for deprecated message and fields [#126](https://github.com/danielgtaylor/python-betterproto/pull/126)
- Add support for recursive messages [#130](https://github.com/danielgtaylor/python-betterproto/pull/130)
- Add support for `bool(Message)` [#142](https://github.com/danielgtaylor/python-betterproto/pull/142)
- Improve support for Python 3.9 [#140](https://github.com/danielgtaylor/python-betterproto/pull/140) [#173](https://github.com/danielgtaylor/python-betterproto/pull/173)
- Improve keyword sanitisation for generated code [#137](https://github.com/danielgtaylor/python-betterproto/pull/137)
- Fix missing serialized_on_wire when message contains only lists [#81](https://github.com/danielgtaylor/python-betterproto/pull/81)
- Fix circular dependencies [#100](https://github.com/danielgtaylor/python-betterproto/pull/100)
- Fix to_dict enum fields when numbering is not consecutive [#102](https://github.com/danielgtaylor/python-betterproto/pull/102)
- Fix argument generation for stub methods when using `import` with proto definition [#103](https://github.com/danielgtaylor/python-betterproto/pull/103)
- Fix missing async/await keywords when casing [#104](https://github.com/danielgtaylor/python-betterproto/pull/104)
- Fix mutable default arguments in generated code [#105](https://github.com/danielgtaylor/python-betterproto/pull/105)
- Fix serialisation of default values in oneofs when calling to_dict() or to_json() [#110](https://github.com/danielgtaylor/python-betterproto/pull/110)
- Fix static type checking for grpclib client [#124](https://github.com/danielgtaylor/python-betterproto/pull/124)
- Fix python3.6 compatibility issue with dataclasses [#124](https://github.com/danielgtaylor/python-betterproto/pull/124)
- Fix handling of trailer-only responses [#127](https://github.com/danielgtaylor/python-betterproto/pull/127)
- Refactor plugin.py to use modular dataclasses in tree-like structure to represent parsed data [#121](https://github.com/danielgtaylor/python-betterproto/pull/121)
- Refactor template compilation logic [#136](https://github.com/danielgtaylor/python-betterproto/pull/136)
- Replace use of platform provided protoc with development dependency on grpcio-tools [#107](https://github.com/danielgtaylor/python-betterproto/pull/107)
- Switch to using `poe` from `make` to manage project development tasks [#118](https://github.com/danielgtaylor/python-betterproto/pull/118)
- Improve CI platform coverage [#128](https://github.com/danielgtaylor/python-betterproto/pull/128)
## [2.0.0b1] - 2020-07-04
[Upgrade Guide](./docs/upgrading.md)
> Several bugfixes and improvements required or will require small breaking changes, necessitating a new version.
> `2.0.0` will be released once the interface is stable.
- Add support for gRPC and **stream-stream** [#83](https://github.com/danielgtaylor/python-betterproto/pull/83)
- Switch from `pipenv` to `poetry` for development [#75](https://github.com/danielgtaylor/python-betterproto/pull/75)
- Fix naming conflicts when two packages share the same name suffix [#25](https://github.com/danielgtaylor/python-betterproto/issues/25)
- Fix importing a child package from the root package [#57](https://github.com/danielgtaylor/python-betterproto/issues/57)
- Fix importing a child package from another package [#58](https://github.com/danielgtaylor/python-betterproto/issues/58)
- Fix importing a parent package from a child package [#59](https://github.com/danielgtaylor/python-betterproto/issues/59)
- Fix importing the root package from a child package [#60](https://github.com/danielgtaylor/python-betterproto/issues/60)
- Fix importing the root package from the root [#61](https://github.com/danielgtaylor/python-betterproto/issues/61)
- Fix incorrect parsing of ALL_CAPS message fields [#11](https://github.com/danielgtaylor/python-betterproto/issues/11)
## [1.2.5] - 2020-04-27
- Add .j2 suffix to python template names to avoid confusing certain build tools [#72](https://github.com/danielgtaylor/python-betterproto/pull/72)
## [1.2.4] - 2020-04-26
- Enforce utf-8 for reading the readme in setup.py [#67](https://github.com/danielgtaylor/python-betterproto/pull/67)
- Only import types from grpclib when type checking [#52](https://github.com/danielgtaylor/python-betterproto/pull/52)
- Improve performance of serialize/deserialize by caching type information of fields in class [#46](https://github.com/danielgtaylor/python-betterproto/pull/46)
- Support using Google's wrapper types as RPC output values [#40](https://github.com/danielgtaylor/python-betterproto/pull/40)
- Fix issue where protoc did not recognize plugin.py as a win32 application [#38](https://github.com/danielgtaylor/python-betterproto/pull/38)
- Fix services using non-pythonified field names [#34](https://github.com/danielgtaylor/python-betterproto/pull/34)
- Add ability to provide metadata, timeout & deadline args to requests [#32](https://github.com/danielgtaylor/python-betterproto/pull/32)
## [1.2.3] - 2020-04-15
- Exclude empty lists from `to_dict` by default [#16](https://github.com/danielgtaylor/python-betterproto/pull/16)
- Add `include_default_values` parameter for `to_dict` [#12](https://github.com/danielgtaylor/python-betterproto/pull/12)
- Fix class names being prepended with duplicates when using protocol buffers that are nested more than once [#21](https://github.com/danielgtaylor/python-betterproto/pull/21)
- Add support for python 3.6 [#30](https://github.com/danielgtaylor/python-betterproto/pull/30)
## [1.2.2] - 2020-01-09
- Mention lack of Proto 2 support in README.
- Fix serialization of constructor parameters [#10](https://github.com/danielgtaylor/python-betterproto/pull/10)
- Fix `casing` parameter propagation [#7](https://github.com/danielgtaylor/python-betterproto/pull/7)
## [1.2.1] - 2019-10-29
- Fix comment indentation bug in rendered gRPC methods.
## [1.2.0] - 2019-10-28
- Generated code output auto-formatting via [Black](https://github.com/psf/black)
@@ -29,7 +164,11 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- Initial release
[unreleased]: https://github.com/danielgtaylor/python-betterproto/compare/v1.2.0...HEAD
[1.2.5]: https://github.com/danielgtaylor/python-betterproto/compare/v1.2.4...v1.2.5
[1.2.4]: https://github.com/danielgtaylor/python-betterproto/compare/v1.2.3...v1.2.4
[1.2.3]: https://github.com/danielgtaylor/python-betterproto/compare/v1.2.2...v1.2.3
[1.2.2]: https://github.com/danielgtaylor/python-betterproto/compare/v1.2.1...v1.2.2
[1.2.1]: https://github.com/danielgtaylor/python-betterproto/compare/v1.2.0...v1.2.1
[1.2.0]: https://github.com/danielgtaylor/python-betterproto/compare/v1.1.0...v1.2.0
[1.1.0]: https://github.com/danielgtaylor/python-betterproto/compare/v1.0.1...v1.1.0
[1.0.1]: https://github.com/danielgtaylor/python-betterproto/compare/v1.0.0...v1.0.1

Pipfile

@@ -1,29 +0,0 @@
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true

[dev-packages]
flake8 = "*"
mypy = "*"
isort = "*"
pytest = "*"
rope = "*"

[packages]
protobuf = "*"
jinja2 = "*"
grpclib = "*"
stringcase = "*"
black = "*"

[requires]
python_version = "3.7"

[scripts]
plugin = "protoc --plugin=protoc-gen-custom=betterproto/plugin.py --custom_out=output"
generate = "python betterproto/tests/generate.py"
test = "pytest ./betterproto/tests"

[pipenv]
allow_prereleases = true

Pipfile.lock (generated)

@@ -1,396 +0,0 @@
{
"_meta": {
"hash": {
"sha256": "c7b72ed87dc3d70566c53d7ec8a636c8d4854aa30aa97a9116c0734cd5266f33"
},
"pipfile-spec": 6,
"requires": {
"python_version": "3.7"
},
"sources": [
{
"name": "pypi",
"url": "https://pypi.org/simple",
"verify_ssl": true
}
]
},
"default": {
"appdirs": {
"hashes": [
"sha256:9e5896d1372858f8dd3344faf4e5014d21849c756c8d5701f78f8a103b372d92",
"sha256:d8b24664561d0d34ddfaec54636d502d7cea6e29c3eaf68f3df6180863e2166e"
],
"version": "==1.4.3"
},
"attrs": {
"hashes": [
"sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c",
"sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72"
],
"version": "==19.3.0"
},
"black": {
"hashes": [
"sha256:09a9dcb7c46ed496a9850b76e4e825d6049ecd38b611f1224857a79bd985a8cf",
"sha256:68950ffd4d9169716bcb8719a56c07a2f4485354fec061cdd5910aa07369731c"
],
"index": "pypi",
"version": "==19.3b0"
},
"click": {
"hashes": [
"sha256:2335065e6395b9e67ca716de5f7526736bfa6ceead690adf616d925bdc622b13",
"sha256:5b94b49521f6456670fdb30cd82a4eca9412788a93fa6dd6df72c94d5a8ff2d7"
],
"version": "==7.0"
},
"grpclib": {
"hashes": [
"sha256:2d63cee35f764e40a7ea196f27354d2f4ab936401c40b14128bbb4fec06f51d4"
],
"index": "pypi",
"version": "==0.3.1rc2"
},
"h2": {
"hashes": [
"sha256:ac377fcf586314ef3177bfd90c12c7826ab0840edeb03f0f24f511858326049e",
"sha256:b8a32bd282594424c0ac55845377eea13fa54fe4a8db012f3a198ed923dc3ab4"
],
"version": "==3.1.1"
},
"hpack": {
"hashes": [
"sha256:0edd79eda27a53ba5be2dfabf3b15780928a0dff6eb0c60a3d6767720e970c89",
"sha256:8eec9c1f4bfae3408a3f30500261f7e6a65912dc138526ea054f9ad98892e9d2"
],
"version": "==3.0.0"
},
"hyperframe": {
"hashes": [
"sha256:5187962cb16dcc078f23cb5a4b110098d546c3f41ff2d4038a9896893bbd0b40",
"sha256:a9f5c17f2cc3c719b917c4f33ed1c61bd1f8dfac4b1bd23b7c80b3400971b41f"
],
"version": "==5.2.0"
},
"jinja2": {
"hashes": [
"sha256:74320bb91f31270f9551d46522e33af46a80c3d619f4a4bf42b3164d30b5911f",
"sha256:9fe95f19286cfefaa917656583d020be14e7859c6b0252588391e47db34527de"
],
"index": "pypi",
"version": "==2.10.3"
},
"markupsafe": {
"hashes": [
"sha256:00bc623926325b26bb9605ae9eae8a215691f33cae5df11ca5424f06f2d1f473",
"sha256:09027a7803a62ca78792ad89403b1b7a73a01c8cb65909cd876f7fcebd79b161",
"sha256:09c4b7f37d6c648cb13f9230d847adf22f8171b1ccc4d5682398e77f40309235",
"sha256:1027c282dad077d0bae18be6794e6b6b8c91d58ed8a8d89a89d59693b9131db5",
"sha256:24982cc2533820871eba85ba648cd53d8623687ff11cbb805be4ff7b4c971aff",
"sha256:29872e92839765e546828bb7754a68c418d927cd064fd4708fab9fe9c8bb116b",
"sha256:43a55c2930bbc139570ac2452adf3d70cdbb3cfe5912c71cdce1c2c6bbd9c5d1",
"sha256:46c99d2de99945ec5cb54f23c8cd5689f6d7177305ebff350a58ce5f8de1669e",
"sha256:500d4957e52ddc3351cabf489e79c91c17f6e0899158447047588650b5e69183",
"sha256:535f6fc4d397c1563d08b88e485c3496cf5784e927af890fb3c3aac7f933ec66",
"sha256:62fe6c95e3ec8a7fad637b7f3d372c15ec1caa01ab47926cfdf7a75b40e0eac1",
"sha256:6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1",
"sha256:717ba8fe3ae9cc0006d7c451f0bb265ee07739daf76355d06366154ee68d221e",
"sha256:79855e1c5b8da654cf486b830bd42c06e8780cea587384cf6545b7d9ac013a0b",
"sha256:7c1699dfe0cf8ff607dbdcc1e9b9af1755371f92a68f706051cc8c37d447c905",
"sha256:88e5fcfb52ee7b911e8bb6d6aa2fd21fbecc674eadd44118a9cc3863f938e735",
"sha256:8defac2f2ccd6805ebf65f5eeb132adcf2ab57aa11fdf4c0dd5169a004710e7d",
"sha256:98c7086708b163d425c67c7a91bad6e466bb99d797aa64f965e9d25c12111a5e",
"sha256:9add70b36c5666a2ed02b43b335fe19002ee5235efd4b8a89bfcf9005bebac0d",
"sha256:9bf40443012702a1d2070043cb6291650a0841ece432556f784f004937f0f32c",
"sha256:ade5e387d2ad0d7ebf59146cc00c8044acbd863725f887353a10df825fc8ae21",
"sha256:b00c1de48212e4cc9603895652c5c410df699856a2853135b3967591e4beebc2",
"sha256:b1282f8c00509d99fef04d8ba936b156d419be841854fe901d8ae224c59f0be5",
"sha256:b2051432115498d3562c084a49bba65d97cf251f5a331c64a12ee7e04dacc51b",
"sha256:ba59edeaa2fc6114428f1637ffff42da1e311e29382d81b339c1817d37ec93c6",
"sha256:c8716a48d94b06bb3b2524c2b77e055fb313aeb4ea620c8dd03a105574ba704f",
"sha256:cd5df75523866410809ca100dc9681e301e3c27567cf498077e8551b6d20e42f",
"sha256:e249096428b3ae81b08327a63a485ad0878de3fb939049038579ac0ef61e17e7"
],
"version": "==1.1.1"
},
"multidict": {
"hashes": [
"sha256:024b8129695a952ebd93373e45b5d341dbb87c17ce49637b34000093f243dd4f",
"sha256:041e9442b11409be5e4fc8b6a97e4bcead758ab1e11768d1e69160bdde18acc3",
"sha256:045b4dd0e5f6121e6f314d81759abd2c257db4634260abcfe0d3f7083c4908ef",
"sha256:047c0a04e382ef8bd74b0de01407e8d8632d7d1b4db6f2561106af812a68741b",
"sha256:068167c2d7bbeebd359665ac4fff756be5ffac9cda02375b5c5a7c4777038e73",
"sha256:148ff60e0fffa2f5fad2eb25aae7bef23d8f3b8bdaf947a65cdbe84a978092bc",
"sha256:1d1c77013a259971a72ddaa83b9f42c80a93ff12df6a4723be99d858fa30bee3",
"sha256:1d48bc124a6b7a55006d97917f695effa9725d05abe8ee78fd60d6588b8344cd",
"sha256:31dfa2fc323097f8ad7acd41aa38d7c614dd1960ac6681745b6da124093dc351",
"sha256:34f82db7f80c49f38b032c5abb605c458bac997a6c3142e0d6c130be6fb2b941",
"sha256:3d5dd8e5998fb4ace04789d1d008e2bb532de501218519d70bb672c4c5a2fc5d",
"sha256:4a6ae52bd3ee41ee0f3acf4c60ceb3f44e0e3bc52ab7da1c2b2aa6703363a3d1",
"sha256:4b02a3b2a2f01d0490dd39321c74273fed0568568ea0e7ea23e02bd1fb10a10b",
"sha256:4b843f8e1dd6a3195679d9838eb4670222e8b8d01bc36c9894d6c3538316fa0a",
"sha256:5de53a28f40ef3c4fd57aeab6b590c2c663de87a5af76136ced519923d3efbb3",
"sha256:61b2b33ede821b94fa99ce0b09c9ece049c7067a33b279f343adfe35108a4ea7",
"sha256:6a3a9b0f45fd75dc05d8e93dc21b18fc1670135ec9544d1ad4acbcf6b86781d0",
"sha256:76ad8e4c69dadbb31bad17c16baee61c0d1a4a73bed2590b741b2e1a46d3edd0",
"sha256:7ba19b777dc00194d1b473180d4ca89a054dd18de27d0ee2e42a103ec9b7d014",
"sha256:7c1b7eab7a49aa96f3db1f716f0113a8a2e93c7375dd3d5d21c4941f1405c9c5",
"sha256:7fc0eee3046041387cbace9314926aa48b681202f8897f8bff3809967a049036",
"sha256:8ccd1c5fff1aa1427100ce188557fc31f1e0a383ad8ec42c559aabd4ff08802d",
"sha256:8e08dd76de80539d613654915a2f5196dbccc67448df291e69a88712ea21e24a",
"sha256:c18498c50c59263841862ea0501da9f2b3659c00db54abfbf823a80787fde8ce",
"sha256:c49db89d602c24928e68c0d510f4fcf8989d77defd01c973d6cbe27e684833b1",
"sha256:ce20044d0317649ddbb4e54dab3c1bcc7483c78c27d3f58ab3d0c7e6bc60d26a",
"sha256:d1071414dd06ca2eafa90c85a079169bfeb0e5f57fd0b45d44c092546fcd6fd9",
"sha256:d3be11ac43ab1a3e979dac80843b42226d5d3cccd3986f2e03152720a4297cd7",
"sha256:db603a1c235d110c860d5f39988ebc8218ee028f07a7cbc056ba6424372ca31b"
],
"version": "==4.5.2"
},
"protobuf": {
"hashes": [
"sha256:125713564d8cfed7610e52444c9769b8dcb0b55e25cc7841f2290ee7bc86636f",
"sha256:1accdb7a47e51503be64d9a57543964ba674edac103215576399d2d0e34eac77",
"sha256:27003d12d4f68e3cbea9eb67427cab3bfddd47ff90670cb367fcd7a3a89b9657",
"sha256:3264f3c431a631b0b31e9db2ae8c927b79fc1a7b1b06b31e8e5bcf2af91fe896",
"sha256:3c5ab0f5c71ca5af27143e60613729e3488bb45f6d3f143dc918a20af8bab0bf",
"sha256:45dcf8758873e3f69feab075e5f3177270739f146255225474ee0b90429adef6",
"sha256:56a77d61a91186cc5676d8e11b36a5feb513873e4ae88d2ee5cf530d52bbcd3b",
"sha256:5984e4947bbcef5bd849d6244aec507d31786f2dd3344139adc1489fb403b300",
"sha256:6b0441da73796dd00821763bb4119674eaf252776beb50ae3883bed179a60b2a",
"sha256:6f6677c5ade94d4fe75a912926d6796d5c71a2a90c2aeefe0d6f211d75c74789",
"sha256:84a825a9418d7196e2acc48f8746cf1ee75877ed2f30433ab92a133f3eaf8fbe",
"sha256:b842c34fe043ccf78b4a6cf1019d7b80113707d68c88842d061fa2b8fb6ddedc",
"sha256:ca33d2f09dae149a1dcf942d2d825ebb06343b77b437198c9e2ef115cf5d5bc1",
"sha256:db83b5c12c0cd30150bb568e6feb2435c49ce4e68fe2d7b903113f0e221e58fe",
"sha256:f50f3b1c5c1c1334ca7ce9cad5992f098f460ffd6388a3cabad10b66c2006b09",
"sha256:f99f127909731cafb841c52f9216e447d3e4afb99b17bebfad327a75aee206de"
],
"index": "pypi",
"version": "==3.10.0"
},
"six": {
"hashes": [
"sha256:3350809f0555b11f552448330d0b52d5f24c91a322ea4a15ef22629740f3761c",
"sha256:d16a0141ec1a18405cd4ce8b4613101da75da0e9a7aec5bdd4fa804d0e0eba73"
],
"version": "==1.12.0"
},
"stringcase": {
"hashes": [
"sha256:48a06980661908efe8d9d34eab2b6c13aefa2163b3ced26972902e3bdfd87008"
],
"index": "pypi",
"version": "==1.2.0"
},
"toml": {
"hashes": [
"sha256:229f81c57791a41d65e399fc06bf0848bab550a9dfd5ed66df18ce5f05e73d5c",
"sha256:235682dd292d5899d361a811df37e04a8828a5b1da3115886b73cf81ebc9100e"
],
"version": "==0.10.0"
}
},
"develop": {
"atomicwrites": {
"hashes": [
"sha256:03472c30eb2c5d1ba9227e4c2ca66ab8287fbfbbda3888aa93dc2e28fc6811b4",
"sha256:75a9445bac02d8d058d5e1fe689654ba5a6556a1dfd8ce6ec55a0ed79866cfa6"
],
"version": "==1.3.0"
},
"attrs": {
"hashes": [
"sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c",
"sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72"
],
"version": "==19.3.0"
},
"entrypoints": {
"hashes": [
"sha256:589f874b313739ad35be6e0cd7efde2a4e9b6fea91edcc34e58ecbb8dbe56d19",
"sha256:c70dd71abe5a8c85e55e12c19bd91ccfeec11a6e99044204511f9ed547d48451"
],
"version": "==0.3"
},
"flake8": {
"hashes": [
"sha256:45681a117ecc81e870cbf1262835ae4af5e7a8b08e40b944a8a6e6b895914cfb",
"sha256:49356e766643ad15072a789a20915d3c91dc89fd313ccd71802303fd67e4deca"
],
"index": "pypi",
"version": "==3.7.9"
},
"importlib-metadata": {
"hashes": [
"sha256:aa18d7378b00b40847790e7c27e11673d7fed219354109d0e7b9e5b25dc3ad26",
"sha256:d5f18a79777f3aa179c145737780282e27b508fc8fd688cb17c7a813e8bd39af"
],
"markers": "python_version < '3.8'",
"version": "==0.23"
},
"isort": {
"hashes": [
"sha256:54da7e92468955c4fceacd0c86bd0ec997b0e1ee80d97f67c35a78b719dccab1",
"sha256:6e811fcb295968434526407adb8796944f1988c5b65e8139058f2014cbe100fd"
],
"index": "pypi",
"version": "==4.3.21"
},
"mccabe": {
"hashes": [
"sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42",
"sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f"
],
"version": "==0.6.1"
},
"more-itertools": {
"hashes": [
"sha256:409cd48d4db7052af495b09dec721011634af3753ae1ef92d2b32f73a745f832",
"sha256:92b8c4b06dac4f0611c0729b2f2ede52b2e1bac1ab48f089c7ddc12e26bb60c4"
],
"version": "==7.2.0"
},
"mypy": {
"hashes": [
"sha256:1521c186a3d200c399bd5573c828ea2db1362af7209b2adb1bb8532cea2fb36f",
"sha256:31a046ab040a84a0fc38bc93694876398e62bc9f35eca8ccbf6418b7297f4c00",
"sha256:3b1a411909c84b2ae9b8283b58b48541654b918e8513c20a400bb946aa9111ae",
"sha256:48c8bc99380575deb39f5d3400ebb6a8a1cb5cc669bbba4d3bb30f904e0a0e7d",
"sha256:540c9caa57a22d0d5d3c69047cc9dd0094d49782603eb03069821b41f9e970e9",
"sha256:672e418425d957e276c291930a3921b4a6413204f53fe7c37cad7bc57b9a3391",
"sha256:6ed3b9b3fdc7193ea7aca6f3c20549b377a56f28769783a8f27191903a54170f",
"sha256:9371290aa2cad5ad133e4cdc43892778efd13293406f7340b9ffe99d5ec7c1d9",
"sha256:ace6ac1d0f87d4072f05b5468a084a45b4eda970e4d26704f201e06d47ab2990",
"sha256:b428f883d2b3fe1d052c630642cc6afddd07d5cd7873da948644508be3b9d4a7",
"sha256:d5bf0e6ec8ba346a2cf35cb55bf4adfddbc6b6576fcc9e10863daa523e418dbb",
"sha256:d7574e283f83c08501607586b3167728c58e8442947e027d2d4c7dcd6d82f453",
"sha256:dc889c84241a857c263a2b1cd1121507db7d5b5f5e87e77147097230f374d10b",
"sha256:f4748697b349f373002656bf32fede706a0e713d67bfdcf04edf39b1f61d46eb"
],
"index": "pypi",
"version": "==0.740"
},
"mypy-extensions": {
"hashes": [
"sha256:090fedd75945a69ae91ce1303b5824f428daf5a028d2f6ab8a299250a846f15d",
"sha256:2d82818f5bb3e369420cb3c4060a7970edba416647068eb4c5343488a6c604a8"
],
"version": "==0.4.3"
},
"packaging": {
"hashes": [
"sha256:28b924174df7a2fa32c1953825ff29c61e2f5e082343165438812f00d3a7fc47",
"sha256:d9551545c6d761f3def1677baf08ab2a3ca17c56879e70fecba2fc4dde4ed108"
],
"version": "==19.2"
},
"pluggy": {
"hashes": [
"sha256:0db4b7601aae1d35b4a033282da476845aa19185c1e6964b25cf324b5e4ec3e6",
"sha256:fa5fa1622fa6dd5c030e9cad086fa19ef6a0cf6d7a2d12318e10cb49d6d68f34"
],
"version": "==0.13.0"
},
"py": {
"hashes": [
"sha256:64f65755aee5b381cea27766a3a147c3f15b9b6b9ac88676de66ba2ae36793fa",
"sha256:dc639b046a6e2cff5bbe40194ad65936d6ba360b52b3c3fe1d08a82dd50b5e53"
],
"version": "==1.8.0"
},
"pycodestyle": {
"hashes": [
"sha256:95a2219d12372f05704562a14ec30bc76b05a5b297b21a5dfe3f6fac3491ae56",
"sha256:e40a936c9a450ad81df37f549d676d127b1b66000a6c500caa2b085bc0ca976c"
],
"version": "==2.5.0"
},
"pyflakes": {
"hashes": [
"sha256:17dbeb2e3f4d772725c777fabc446d5634d1038f234e77343108ce445ea69ce0",
"sha256:d976835886f8c5b31d47970ed689944a0262b5f3afa00a5a7b4dc81e5449f8a2"
],
"version": "==2.1.1"
},
"pyparsing": {
"hashes": [
"sha256:6f98a7b9397e206d78cc01df10131398f1c8b8510a2f4d97d9abd82e1aacdd80",
"sha256:d9338df12903bbf5d65a0e4e87c2161968b10d2e489652bb47001d82a9b028b4"
],
"version": "==2.4.2"
},
"pytest": {
"hashes": [
"sha256:27abc3fef618a01bebb1f0d6d303d2816a99aa87a5968ebc32fe971be91eb1e6",
"sha256:58cee9e09242937e136dbb3dab466116ba20d6b7828c7620f23947f37eb4dae4"
],
"index": "pypi",
"version": "==5.2.2"
},
"rope": {
"hashes": [
"sha256:6b728fdc3e98a83446c27a91fc5d56808a004f8beab7a31ab1d7224cecc7d969",
"sha256:c5c5a6a87f7b1a2095fb311135e2a3d1f194f5ecb96900fdd0a9100881f48aaf",
"sha256:f0dcf719b63200d492b85535ebe5ea9b29e0d0b8aebeb87fe03fc1a65924fdaf"
],
"index": "pypi",
"version": "==0.14.0"
},
"six": {
"hashes": [
"sha256:3350809f0555b11f552448330d0b52d5f24c91a322ea4a15ef22629740f3761c",
"sha256:d16a0141ec1a18405cd4ce8b4613101da75da0e9a7aec5bdd4fa804d0e0eba73"
],
"version": "==1.12.0"
},
"typed-ast": {
"hashes": [
"sha256:1170afa46a3799e18b4c977777ce137bb53c7485379d9706af8a59f2ea1aa161",
"sha256:18511a0b3e7922276346bcb47e2ef9f38fb90fd31cb9223eed42c85d1312344e",
"sha256:262c247a82d005e43b5b7f69aff746370538e176131c32dda9cb0f324d27141e",
"sha256:2b907eb046d049bcd9892e3076c7a6456c93a25bebfe554e931620c90e6a25b0",
"sha256:354c16e5babd09f5cb0ee000d54cfa38401d8b8891eefa878ac772f827181a3c",
"sha256:48e5b1e71f25cfdef98b013263a88d7145879fbb2d5185f2a0c79fa7ebbeae47",
"sha256:4e0b70c6fc4d010f8107726af5fd37921b666f5b31d9331f0bd24ad9a088e631",
"sha256:630968c5cdee51a11c05a30453f8cd65e0cc1d2ad0d9192819df9978984529f4",
"sha256:66480f95b8167c9c5c5c87f32cf437d585937970f3fc24386f313a4c97b44e34",
"sha256:71211d26ffd12d63a83e079ff258ac9d56a1376a25bc80b1cdcdf601b855b90b",
"sha256:7954560051331d003b4e2b3eb822d9dd2e376fa4f6d98fee32f452f52dd6ebb2",
"sha256:838997f4310012cf2e1ad3803bce2f3402e9ffb71ded61b5ee22617b3a7f6b6e",
"sha256:95bd11af7eafc16e829af2d3df510cecfd4387f6453355188342c3e79a2ec87a",
"sha256:bc6c7d3fa1325a0c6613512a093bc2a2a15aeec350451cbdf9e1d4bffe3e3233",
"sha256:cc34a6f5b426748a507dd5d1de4c1978f2eb5626d51326e43280941206c209e1",
"sha256:d755f03c1e4a51e9b24d899561fec4ccaf51f210d52abdf8c07ee2849b212a36",
"sha256:d7c45933b1bdfaf9f36c579671fec15d25b06c8398f113dab64c18ed1adda01d",
"sha256:d896919306dd0aa22d0132f62a1b78d11aaf4c9fc5b3410d3c666b818191630a",
"sha256:fdc1c9bbf79510b76408840e009ed65958feba92a88833cdceecff93ae8fff66",
"sha256:ffde2fbfad571af120fcbfbbc61c72469e72f550d676c3342492a9dfdefb8f12"
],
"version": "==1.4.0"
},
"typing-extensions": {
"hashes": [
"sha256:091ecc894d5e908ac75209f10d5b4f118fbdb2eb1ede6a63544054bb1edb41f2",
"sha256:910f4656f54de5993ad9304959ce9bb903f90aadc7c67a0bef07e678014e892d",
"sha256:cf8b63fedea4d89bab840ecbb93e75578af28f76f66c35889bd7065f5af88575"
],
"version": "==3.7.4.1"
},
"wcwidth": {
"hashes": [
"sha256:3df37372226d6e63e1b1e1eda15c594bca98a22d33a23832a90998faa96bc65e",
"sha256:f4ebe71925af7b40a864553f761ed559b43544f8f71746c2d756c7fe788ade7c"
],
"version": "==0.1.7"
},
"zipp": {
"hashes": [
"sha256:3718b1cbcd963c7d4c5511a8240812904164b7f381b647143a89d3b98f9bcd8e",
"sha256:f06903e9f1f43b12d371004b4ac7b06ab39a44adc747266928ae6debfa7b3335"
],
"version": "==0.6.0"
}
}
}

README.md

@@ -1,12 +1,13 @@
# Better Protobuf / gRPC Support for Python
![](https://github.com/danielgtaylor/python-betterproto/workflows/CI/badge.svg)
> :octocat: If you're reading this on github, please be aware that it might mention unreleased features! See the latest released README on [pypi](https://pypi.org/project/betterproto/).
This project aims to provide an improved experience when using Protobuf / gRPC in a modern Python environment by making use of modern language features and generating readable, understandable, idiomatic Python code. It will not support legacy features or environments. The following are supported:
This project aims to provide an improved experience when using Protobuf / gRPC in a modern Python environment by making use of modern language features and generating readable, understandable, idiomatic Python code. It will not support legacy features or environments (e.g. Protobuf 2). The following are supported:
- Protobuf 3 & gRPC code generation
- Both binary & JSON serialization is built-in
- Python 3.7+ making use of:
- Python 3.6+ making use of:
- Enums
- Dataclasses
- `async`/`await`
@@ -37,21 +38,26 @@ This project exists because I am unhappy with the state of the official Google p
- Uses `SerializeToString()` rather than the built-in `__bytes__()`
- Special wrapped types don't use Python's `None`
- Timestamp/duration types don't use Python's built-in `datetime` module
This project is a reimplementation from the ground up focused on idiomatic modern Python to help fix some of the above. While it may not be a 1:1 drop-in replacement due to changed method names and call patterns, the wire format is identical.
## Installation & Getting Started
## Installation
First, install the package. Note that the `[compiler]` feature flag tells it to install extra dependencies only needed by the `protoc` plugin:
```sh
# Install both the library and compiler
$ pip install betterproto[compiler]
pip install "betterproto[compiler]"
# Install just the library (to use the generated code output)
$ pip install betterproto
pip install betterproto
```
*Betterproto* is under active development. To install the latest beta version, use `pip install --pre betterproto`.
## Getting Started
### Compiling proto files
Now, given that you've installed the compiler and have a proto file, e.g. `example.proto`:
```protobuf
@@ -65,17 +71,25 @@ message Greeting {
}
```
You can run the following:
You can run the following to invoke protoc directly:
```sh
$ protoc -I . --python_betterproto_out=. example.proto
mkdir lib
protoc -I . --python_betterproto_out=lib example.proto
```
This will generate `hello.py` which looks like:
or run the following to invoke protoc via grpcio-tools:
```py
```sh
pip install grpcio-tools
python -m grpc_tools.protoc -I . --python_betterproto_out=lib example.proto
```
This will generate `lib/hello/__init__.py` which looks like:
```python
# Generated by the protocol buffer compiler. DO NOT EDIT!
# sources: hello.proto
# sources: example.proto
# plugin: python-betterproto
from dataclasses import dataclass
@@ -83,7 +97,7 @@ import betterproto
@dataclass
class Hello(betterproto.Message):
class Greeting(betterproto.Message):
"""Greeting represents a message you can tell a user."""
message: str = betterproto.string_field(1)
@@ -91,23 +105,23 @@ class Hello(betterproto.Message):
Now you can use it!
```py
>>> from hello import Hello
>>> test = Hello()
```python
>>> from lib.hello import Greeting
>>> test = Greeting()
>>> test
Hello(message='')
Greeting(message='')
>>> test.message = "Hey!"
>>> test
Hello(message="Hey!")
Greeting(message="Hey!")
>>> serialized = bytes(test)
>>> serialized
b'\n\x04Hey!'
>>> another = Hello().parse(serialized)
>>> another = Greeting().parse(serialized)
>>> another
Hello(message="Hey!")
Greeting(message="Hey!")
>>> another.to_dict()
{"message": "Hey!"}
@@ -119,7 +133,7 @@ Hello(message="Hey!")
The generated Protobuf `Message` classes are compatible with [grpclib](https://github.com/vmagamedov/grpclib) so you are free to use it if you like. That said, this project also includes support for async gRPC stub generation with better static type checking and code completion support. It is enabled by default.
Given an example like:
Given an example service definition:
```protobuf
syntax = "proto3";
@@ -146,22 +160,74 @@ service Echo {
}
```
You can use it like so (enable async in the interactive shell first):
```py
>>> import echo
>>> from grpclib.client import Channel
>>> channel = Channel(host="127.0.0.1", port=1234)
>>> service = echo.EchoStub(channel)
>>> await service.echo(value="hello", extra_times=1)
EchoResponse(values=["hello", "hello"])
>>> async for response in service.echo_stream(value="hello", extra_times=1):
EchoStreamResponse(value="hello")
EchoStreamResponse(value="hello")
```
Generate echo proto file:
```
python -m grpc_tools.protoc -I . --python_betterproto_out=. echo.proto
```
A client can be implemented as follows:
```python
import asyncio

import echo
from grpclib.client import Channel


async def main():
    channel = Channel(host="127.0.0.1", port=50051)
    service = echo.EchoStub(channel)
    response = await service.echo(echo.EchoRequest(value="hello", extra_times=1))
    print(response)

    async for response in service.echo_stream(echo.EchoRequest(value="hello", extra_times=1)):
        print(response)

    # don't forget to close the channel when done!
    channel.close()


if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
```
which would output
```python
EchoResponse(values=['hello', 'hello'])
EchoStreamResponse(value='hello')
EchoStreamResponse(value='hello')
```
This project also produces server-facing stubs that can be used to implement a Python gRPC server. To use them, simply subclass the base class in the generated files and override the service methods:
```python
import asyncio
from echo import EchoBase, EchoRequest, EchoResponse, EchoStreamResponse
from grpclib.server import Server
from typing import AsyncIterator


class EchoService(EchoBase):
    async def echo(self, echo_request: "EchoRequest") -> "EchoResponse":
        return EchoResponse([echo_request.value for _ in range(echo_request.extra_times)])

    async def echo_stream(self, echo_request: "EchoRequest") -> AsyncIterator["EchoStreamResponse"]:
        for _ in range(echo_request.extra_times):
            yield EchoStreamResponse(echo_request.value)


async def main():
    server = Server([EchoService()])
    await server.start("127.0.0.1", 50051)
    await server.wait_closed()


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
```
### JSON
@@ -173,8 +239,8 @@ Both serializing and parsing are supported to/from JSON and Python dictionaries
For compatibility, the default is to convert field names to `camelCase`. You can control this behavior by passing a casing value, e.g.:
```py
>>> MyMessage().to_dict(casing=betterproto.Casing.SNAKE)
```python
MyMessage().to_dict(casing=betterproto.Casing.SNAKE)
```
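For instance, with a hypothetical message (the names here are illustrative, not from this repo):
```python
from dataclasses import dataclass

import betterproto


@dataclass
class MyMessage(betterproto.Message):
    my_field: str = betterproto.string_field(1)


MyMessage(my_field="hi").to_dict()  # {'myField': 'hi'}, camelCase by default
MyMessage(my_field="hi").to_dict(casing=betterproto.Casing.SNAKE)  # {'my_field': 'hi'}
```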
### Determining if a message was sent
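In short, the `betterproto.serialized_on_wire()` helper (exercised by the tests later in this diff) reports whether a message was actually present on the wire. A minimal sketch, with illustrative message definitions:
```python
from dataclasses import dataclass

import betterproto


@dataclass
class Bar(betterproto.Message):
    baz: int = betterproto.int32_field(1)


@dataclass
class Foo(betterproto.Message):
    bar: Bar = betterproto.message_field(1)


foo = Foo()
betterproto.serialized_on_wire(foo.bar)  # False: nothing has been set yet
foo.bar.baz = 1
betterproto.serialized_on_wire(foo.bar)  # True: the nested message was populated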
@@ -256,6 +322,7 @@ Google provides several well-known message types like a timestamp, duration, and
| `google.protobuf.duration` | [`datetime.timedelta`][td] | `0` |
| `google.protobuf.timestamp` | Timezone-aware [`datetime.datetime`][dt] | `1970-01-01T00:00:00Z` |
| `google.protobuf.*Value` | `Optional[...]` | `None` |
| `google.protobuf.*` | `betterproto.lib.google.protobuf.*` | `None` |
[td]: https://docs.python.org/3/library/datetime.html#timedelta-objects
[dt]: https://docs.python.org/3/library/datetime.html#datetime.datetime
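As a rough sketch of how the mappings above surface in generated code (the message and field names here are hypothetical, not generated output):
```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

import betterproto


@dataclass
class Span(betterproto.Message):
    # google.protobuf.Timestamp fields surface as timezone-aware datetimes
    start: datetime = betterproto.message_field(1)
    # google.protobuf.Duration fields surface as timedeltas
    length: timedelta = betterproto.message_field(2)


span = Span(
    start=datetime(2019, 1, 1, tzinfo=timezone.utc),
    length=timedelta(seconds=90),
)
span.to_dict()  # e.g. {'start': '2019-01-01T00:00:00Z', 'length': '90s'}
```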
@@ -296,34 +363,99 @@ datetime.datetime(2019, 1, 1, 11, 59, 58, 800000, tzinfo=datetime.timezone.utc)
## Development
First, make sure you have Python 3.7+ and `pipenv` installed, along with the official [Protobuf Compiler](https://github.com/protocolbuffers/protobuf/releases) for your platform. Then:
- _Join us on [Slack](https://join.slack.com/t/betterproto/shared_invite/zt-f0n0uolx-iN8gBNrkPxtKHTLpG3o1OQ)!_
- _See how you can help &rarr; [Contributing](.github/CONTRIBUTING.md)_
### Requirements
- Python (3.6 or higher)
- [poetry](https://python-poetry.org/docs/#installation)
*Needed to install dependencies in a virtual environment*
- [poethepoet](https://github.com/nat-n/poethepoet) for running development tasks as defined in pyproject.toml
- Can be installed to your host environment via `pip install poethepoet` and then executed simply as `poe`
- or run from the poetry venv as `poetry run poe`
### Setup
```sh
# Get set up with the virtual env & dependencies
$ pipenv install --dev
poetry run pip install --upgrade pip
poetry install
# Link the local package
$ pipenv shell
$ pip install -e .
# Activate the poetry environment
poetry shell
```
### Code style
This project enforces [black](https://github.com/psf/black) python code formatting.
Before committing changes run:
```sh
poe format
```
Non-black-formatted python code will fail in CI, so run this before committing to avoid merge conflicts later.
### Tests
There are two types of tests:
1. Manually-written tests for some behavior of the library
2. Proto files and JSON inputs for automated tests
1. Standard tests
2. Custom tests
For #2, you can add a new `*.proto` file into the `betterproto/tests` directory along with a sample `*.json` input and it will get automatically picked up.
#### Standard tests
Adding a standard test case is easy.
- Create a new directory `betterproto/tests/inputs/<name>`
- add `<name>.proto` with a message called `Test`
- add `<name>.json` with some test data (optional)
It will be picked up automatically when you run the tests.
- See also: [Standard Tests Development Guide](betterproto/tests/README.md)
#### Custom tests
Custom tests are found in `tests/test_*.py` and are run with pytest.
#### Running
Here's how to run the tests.
```sh
# Generate assets from sample .proto files
$ pipenv run generate
# Generate assets from sample .proto files required by the tests
poe generate
# Run the tests
$ pipenv run tests
poe test
```
To run tests as they are run in CI (with tox) run:
```sh
poe full-test
```
### (Re)compiling Google Well-known Types
Betterproto includes compiled versions for Google's well-known types at [betterproto/lib/google](betterproto/lib/google).
Be sure to regenerate these files when modifying the plugin output format, and validate by running the tests.
Normally, the plugin does not compile any references to `google.protobuf`, since they are pre-compiled. To force compilation of `google.protobuf`, use the option `--custom_opt=INCLUDE_GOOGLE`.
Assuming your `google.protobuf` source files (included with all releases of `protoc`) are located in `/usr/local/include`, you can regenerate them as follows:
```sh
protoc \
--plugin=protoc-gen-custom=src/betterproto/plugin/main.py \
--custom_opt=INCLUDE_GOOGLE \
--custom_out=src/betterproto/lib \
-I /usr/local/include/ \
/usr/local/include/google/protobuf/*.proto
```
### TODO
@@ -340,6 +472,9 @@ $ pipenv run tests
- [x] Refs to nested types
- [x] Imports in proto files
- [x] Well-known Google types
- [ ] Support as request input
- [ ] Support as response output
- [ ] Automatically wrap/unwrap responses
- [x] OneOf support
- [x] Basic support on the wire
- [x] Check which was set from the group
@@ -353,16 +488,20 @@ $ pipenv run tests
- [x] Enum strings
- [x] Well known types support (timestamp, duration, wrappers)
- [x] Support different casing (orig vs. camel vs. others?)
- [ ] Async service stubs
- [x] Async service stubs
- [x] Unary-unary
- [x] Server streaming response
- [ ] Client streaming request
- [x] Client streaming request
- [x] Renaming messages and fields to conform to Python name standards
- [x] Renaming clashes with language keywords
- [x] Python package
- [x] Automate running tests
- [ ] Cleanup!
## Community
Join us on [Slack](https://join.slack.com/t/betterproto/shared_invite/zt-f0n0uolx-iN8gBNrkPxtKHTLpG3o1OQ)!
## License
Copyright © 2019 Daniel G. Taylor

asv.conf.json (new file)

@@ -0,0 +1,157 @@
{
// The version of the config file format. Do not change, unless
// you know what you are doing.
"version": 1,
// The name of the project being benchmarked
"project": "python-betterproto",
// The project's homepage
"project_url": "https://github.com/danielgtaylor/python-betterproto",
// The URL or local path of the source code repository for the
// project being benchmarked
"repo": ".",
// The Python project's subdirectory in your repo. If missing or
// the empty string, the project is assumed to be located at the root
// of the repository.
// "repo_subdir": "",
// Customizable commands for building, installing, and
// uninstalling the project. See asv.conf.json documentation.
//
"install_command": ["python -m pip install ."],
"uninstall_command": ["return-code=any python -m pip uninstall -y {project}"],
"build_command": ["python -m pip wheel -w {build_cache_dir} {build_dir}"],
// List of branches to benchmark. If not provided, defaults to "master"
// (for git) or "default" (for mercurial).
// "branches": ["master"], // for git
// "branches": ["default"], // for mercurial
// The DVCS being used. If not set, it will be automatically
// determined from "repo" by looking at the protocol in the URL
// (if remote), or by looking for special directories, such as
// ".git" (if local).
// "dvcs": "git",
// The tool to use to create environments. May be "conda",
// "virtualenv" or other value depending on the plugins in use.
// If missing or the empty string, the tool will be automatically
// determined by looking for tools on the PATH environment
// variable.
"environment_type": "virtualenv",
// timeout in seconds for installing any dependencies in environment
// defaults to 10 min
//"install_timeout": 600,
// the base URL to show a commit for the project.
// "show_commit_url": "http://github.com/owner/project/commit/",
// The Pythons you'd like to test against. If not provided, defaults
// to the current version of Python used to run `asv`.
// "pythons": ["2.7", "3.6"],
// The list of conda channel names to be searched for benchmark
// dependency packages in the specified order
// "conda_channels": ["conda-forge", "defaults"],
// The matrix of dependencies to test. Each key is the name of a
// package (in PyPI) and the values are version numbers. An empty
// list or empty string indicates to just test against the default
// (latest) version. null indicates that the package is to not be
// installed. If the package to be tested is only available from
// PyPi, and the 'environment_type' is conda, then you can preface
// the package name by 'pip+', and the package will be installed via
// pip (with all the conda available packages installed first,
// followed by the pip installed packages).
//
// "matrix": {
// "numpy": ["1.6", "1.7"],
// "six": ["", null], // test with and without six installed
// "pip+emcee": [""], // emcee is only available for install with pip.
// },
// Combinations of libraries/python versions can be excluded/included
// from the set to test. Each entry is a dictionary containing additional
// key-value pairs to include/exclude.
//
// An exclude entry excludes entries where all values match. The
// values are regexps that should match the whole string.
//
// An include entry adds an environment. Only the packages listed
// are installed. The 'python' key is required. The exclude rules
// do not apply to includes.
//
// In addition to package names, the following keys are available:
//
// - python
// Python version, as in the *pythons* variable above.
// - environment_type
// Environment type, as above.
// - sys_platform
// Platform, as in sys.platform. Possible values for the common
// cases: 'linux2', 'win32', 'cygwin', 'darwin'.
//
// "exclude": [
// {"python": "3.2", "sys_platform": "win32"}, // skip py3.2 on windows
// {"environment_type": "conda", "six": null}, // don't run without six on conda
// ],
//
// "include": [
// // additional env for python2.7
// {"python": "2.7", "numpy": "1.8"},
// // additional env if run on windows+conda
// {"platform": "win32", "environment_type": "conda", "python": "2.7", "libpython": ""},
// ],
// The directory (relative to the current directory) that benchmarks are
// stored in. If not provided, defaults to "benchmarks"
// "benchmark_dir": "benchmarks",
// The directory (relative to the current directory) to cache the Python
// environments in. If not provided, defaults to "env"
"env_dir": ".asv/env",
// The directory (relative to the current directory) that raw benchmark
// results are stored in. If not provided, defaults to "results".
"results_dir": ".asv/results",
// The directory (relative to the current directory) that the html tree
// should be written to. If not provided, defaults to "html".
"html_dir": ".asv/html",
// The number of characters to retain in the commit hashes.
// "hash_length": 8,
// `asv` will cache results of the recent builds in each
// environment, making them faster to install next time. This is
// the number of builds to keep, per environment.
// "build_cache_size": 2,
// The commits after which the regression search in `asv publish`
// should start looking for regressions. Dictionary whose keys are
// regexps matching to benchmark names, and values corresponding to
// the commit (exclusive) after which to start looking for
// regressions. The default is to start from the first commit
// with results. If the commit is `null`, regression detection is
// skipped for the matching benchmark.
//
// "regressions_first_commits": {
// "some_benchmark": "352cdf", // Consider regressions only after this commit
// "another_benchmark": null, // Skip regression detection altogether
// },
// The thresholds for relative change in results, after which `asv
// publish` starts reporting regressions. Dictionary of the same
// form as in ``regressions_first_commits``, with values
// indicating the thresholds. If multiple entries match, the
// maximum is taken. If no entry matches, the default is 5%.
//
// "regressions_thresholds": {
// "some_benchmark": 0.01, // Threshold of 1%
// "another_benchmark": 0.5, // Threshold of 50%
// },
}

benchmarks/__init__.py (new file)

@@ -0,0 +1 @@

benchmarks/benchmarks.py (new file)

@@ -0,0 +1,128 @@
import betterproto
from dataclasses import dataclass
from typing import List


@dataclass
class TestMessage(betterproto.Message):
    foo: int = betterproto.uint32_field(0)
    bar: str = betterproto.string_field(1)
    baz: float = betterproto.float_field(2)


@dataclass
class TestNestedChildMessage(betterproto.Message):
    str_key: str = betterproto.string_field(0)
    bytes_key: bytes = betterproto.bytes_field(1)
    bool_key: bool = betterproto.bool_field(2)
    float_key: float = betterproto.float_field(3)
    int_key: int = betterproto.uint64_field(4)


@dataclass
class TestNestedMessage(betterproto.Message):
    foo: TestNestedChildMessage = betterproto.message_field(0)
    bar: TestNestedChildMessage = betterproto.message_field(1)
    baz: TestNestedChildMessage = betterproto.message_field(2)


@dataclass
class TestRepeatedMessage(betterproto.Message):
    foo_repeat: List[str] = betterproto.string_field(0)
    bar_repeat: List[int] = betterproto.int64_field(1)
    baz_repeat: List[bool] = betterproto.bool_field(2)


class BenchMessage:
    """Test creation and usage of a proto message."""

    def setup(self):
        self.cls = TestMessage
        self.instance = TestMessage()
        self.instance_filled = TestMessage(0, "test", 0.0)
        self.instance_filled_bytes = bytes(self.instance_filled)
        self.instance_filled_nested = TestNestedMessage(
            TestNestedChildMessage("foo", bytearray(b"test1"), True, 0.1234, 500),
            TestNestedChildMessage("bar", bytearray(b"test2"), True, 3.1415, -302),
            TestNestedChildMessage("baz", bytearray(b"test3"), False, 1e5, 300),
        )
        self.instance_filled_nested_bytes = bytes(self.instance_filled_nested)
        self.instance_filled_repeated = TestRepeatedMessage(
            [
                "test1",
                "test2",
                "test3",
                "test4",
                "test5",
                "test6",
                "test7",
                "test8",
                "test9",
                "test10",
            ],
            [2, -100, 0, 500000, 600, -425678, 1000000000, -300, 1, -694214214466],
            [True, False, False, False, True, True, False, True, False, False],
        )
        self.instance_filled_repeated_bytes = bytes(self.instance_filled_repeated)

    def time_overhead(self):
        """Overhead in class definition."""

        @dataclass
        class Message(betterproto.Message):
            foo: int = betterproto.uint32_field(0)
            bar: str = betterproto.string_field(1)
            baz: float = betterproto.float_field(2)

    def time_instantiation(self):
        """Time instantiation"""
        self.cls()

    def time_attribute_access(self):
        """Time to access an attribute"""
        self.instance.foo
        self.instance.bar
        self.instance.baz

    def time_init_with_values(self):
        """Time to instantiate with values"""
        self.cls(0, "test", 0.0)

    def time_attribute_setting(self):
        """Time to set attributes"""
        self.instance.foo = 0
        self.instance.bar = "test"
        self.instance.baz = 0.0

    def time_serialize(self):
        """Time serializing a message to wire."""
        bytes(self.instance_filled)

    def time_deserialize(self):
        """Time deserialize a message."""
        TestMessage().parse(self.instance_filled_bytes)

    def time_serialize_nested(self):
        """Time serializing a nested message to wire."""
        bytes(self.instance_filled_nested)

    def time_deserialize_nested(self):
        """Time deserialize a nested message."""
        TestNestedMessage().parse(self.instance_filled_nested_bytes)

    def time_serialize_repeated(self):
        """Time serializing a repeated message to wire."""
        bytes(self.instance_filled_repeated)

    def time_deserialize_repeated(self):
        """Time deserialize a repeated message."""
        TestRepeatedMessage().parse(self.instance_filled_repeated_bytes)


class MemSuite:
    def setup(self):
        self.cls = TestMessage

    def mem_instance(self):
        return self.cls()
File diff suppressed because it is too large.


@@ -1,41 +0,0 @@
import stringcase


def safe_snake_case(value: str) -> str:
    """Snake case a value taking into account Python keywords."""
    value = stringcase.snakecase(value)
    if value in [
        "and",
        "as",
        "assert",
        "break",
        "class",
        "continue",
        "def",
        "del",
        "elif",
        "else",
        "except",
        "finally",
        "for",
        "from",
        "global",
        "if",
        "import",
        "in",
        "is",
        "lambda",
        "nonlocal",
        "not",
        "or",
        "pass",
        "raise",
        "return",
        "try",
        "while",
        "with",
        "yield",
    ]:
        # https://www.python.org/dev/peps/pep-0008/#descriptive-naming-styles
        value += "_"
    return value
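For illustration, the helper above behaves as follows (the import path matches its use in plugin.py later in this diff):
```python
from betterproto.casing import safe_snake_case

safe_snake_case("someField")  # 'some_field'
safe_snake_case("for")        # 'for_' (the trailing underscore sidesteps the keyword)
```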


@@ -1,459 +0,0 @@
#!/usr/bin/env python
import itertools
import json
import os.path
import re
import sys
import textwrap
from typing import Any, List, Tuple

try:
    import black
except ImportError:
    print(
        "Unable to import `black` formatter. Did you install the compiler feature with `pip install betterproto[compiler]`?"
    )
    raise SystemExit(1)

import jinja2
import stringcase

from google.protobuf.compiler import plugin_pb2 as plugin
from google.protobuf.descriptor_pb2 import (
    DescriptorProto,
    EnumDescriptorProto,
    FieldDescriptorProto,
    FileDescriptorProto,
    ServiceDescriptorProto,
)

from betterproto.casing import safe_snake_case

WRAPPER_TYPES = {
    "google.protobuf.DoubleValue": "float",
    "google.protobuf.FloatValue": "float",
    "google.protobuf.Int64Value": "int",
    "google.protobuf.UInt64Value": "int",
    "google.protobuf.Int32Value": "int",
    "google.protobuf.UInt32Value": "int",
    "google.protobuf.BoolValue": "bool",
    "google.protobuf.StringValue": "str",
    "google.protobuf.BytesValue": "bytes",
}


def get_ref_type(package: str, imports: set, type_name: str) -> str:
    """
    Return a Python type name for a proto type reference. Adds the import if
    necessary.
    """
    # If the package name is a blank string, then this should still work
    # because by convention packages are lowercase and message/enum types are
    # pascal-cased. May require refactoring in the future.
    type_name = type_name.lstrip(".")

    if type_name in WRAPPER_TYPES:
        return f"Optional[{WRAPPER_TYPES[type_name]}]"

    if type_name == "google.protobuf.Duration":
        return "timedelta"

    if type_name == "google.protobuf.Timestamp":
        return "datetime"

    if type_name.startswith(package):
        parts = type_name.lstrip(package).lstrip(".").split(".")
        if len(parts) == 1 or (len(parts) > 1 and parts[0][0] == parts[0][0].upper()):
            # This is the current package, which has nested types flattened.
            # foo.bar_thing => FooBarThing
            cased = [stringcase.pascalcase(part) for part in parts]
            type_name = f'"{"".join(cased)}"'

    if "." in type_name:
        # This is imported from another package. No need
        # to use a forward ref and we need to add the import.
        parts = type_name.split(".")
        parts[-1] = stringcase.pascalcase(parts[-1])
        imports.add(f"from .{'.'.join(parts[:-2])} import {parts[-2]}")
        type_name = f"{parts[-2]}.{parts[-1]}"

    return type_name


def py_type(
    package: str,
    imports: set,
    message: DescriptorProto,
    descriptor: FieldDescriptorProto,
) -> str:
    # The numeric codes below come from FieldDescriptorProto.Type: 1=double,
    # 2=float, 3=int64, 4=uint64, 5=int32, 6=fixed64, 7=fixed32, 8=bool,
    # 9=string, 11=message, 12=bytes, 13=uint32, 14=enum, 15=sfixed32,
    # 16=sfixed64, 17=sint32, 18=sint64.
    if descriptor.type in [1, 2, 6, 7, 15, 16]:
        return "float"
    elif descriptor.type in [3, 4, 5, 13, 17, 18]:
        return "int"
    elif descriptor.type == 8:
        return "bool"
    elif descriptor.type == 9:
        return "str"
    elif descriptor.type in [11, 14]:
        # Type referencing another defined Message or a named enum
        return get_ref_type(package, imports, descriptor.type_name)
    elif descriptor.type == 12:
        return "bytes"
    else:
        raise NotImplementedError(f"Unknown type {descriptor.type}")


def get_py_zero(type_num: int) -> str:
    zero = 0
    if type_num in []:
        zero = 0.0
    elif type_num == 8:
        zero = "False"
    elif type_num == 9:
        zero = '""'
    elif type_num == 11:
        zero = "None"
    elif type_num == 12:
        zero = 'b""'

    return zero


def traverse(proto_file):
    def _traverse(path, items):
        for i, item in enumerate(items):
            yield item, path + [i]

            if isinstance(item, DescriptorProto):
                for enum in item.enum_type:
                    enum.name = item.name + enum.name
                    yield enum, path + [i, 4]

                if item.nested_type:
                    for n, p in _traverse(path + [i, 3], item.nested_type):
                        # Adjust the name since we flatten the hierarchy.
                        n.name = item.name + n.name
                        yield n, p

    return itertools.chain(
        _traverse([5], proto_file.enum_type), _traverse([4], proto_file.message_type)
    )


def get_comment(proto_file, path: List[int]) -> str:
    for sci in proto_file.source_code_info.location:
        # print(list(sci.path), path, file=sys.stderr)
        if list(sci.path) == path and sci.leading_comments:
            lines = textwrap.wrap(
                sci.leading_comments.strip().replace("\n", ""), width=75
            )

            if path[-2] == 2 and path[-4] != 6:
                # This is a field
                return "    # " + "\n    # ".join(lines)
            else:
                # This is a message, enum, service, or method
                if len(lines) == 1 and len(lines[0]) < 70:
                    lines[0] = lines[0].strip('"')
                    return f'    """{lines[0]}"""'
                else:
                    joined = "\n    ".join(lines)
                    return f'    """\n    {joined}\n    """'

    return ""


def generate_code(request, response):
    env = jinja2.Environment(
        trim_blocks=True,
        lstrip_blocks=True,
        loader=jinja2.FileSystemLoader("%s/templates/" % os.path.dirname(__file__)),
    )
    template = env.get_template("template.py")

    output_map = {}
    for proto_file in request.proto_file:
        out = proto_file.package
        if out == "google.protobuf":
            continue

        if not out:
            out = os.path.splitext(proto_file.name)[0].replace(os.path.sep, ".")

        if out not in output_map:
            output_map[out] = {"package": proto_file.package, "files": []}
        output_map[out]["files"].append(proto_file)

    # TODO: Figure out how to handle gRPC request/response messages and add
    # processing below for Service.

    for filename, options in output_map.items():
        package = options["package"]
        # print(package, filename, file=sys.stderr)
        output = {
            "package": package,
            "files": [f.name for f in options["files"]],
            "imports": set(),
            "datetime_imports": set(),
            "typing_imports": set(),
            "messages": [],
            "enums": [],
            "services": [],
        }

        type_mapping = {}

        for proto_file in options["files"]:
            # print(proto_file.message_type, file=sys.stderr)
            # print(proto_file.service, file=sys.stderr)
            # print(proto_file.source_code_info, file=sys.stderr)

            for item, path in traverse(proto_file):
                # print(item, file=sys.stderr)
                # print(path, file=sys.stderr)
                data = {"name": item.name, "py_name": stringcase.pascalcase(item.name)}

                if isinstance(item, DescriptorProto):
                    # print(item, file=sys.stderr)
                    if item.options.map_entry:
                        # Skip generated map entry messages since we just use dicts
                        continue

                    data.update(
                        {
                            "type": "Message",
                            "comment": get_comment(proto_file, path),
                            "properties": [],
                        }
                    )

                    for i, f in enumerate(item.field):
                        t = py_type(package, output["imports"], item, f)
                        zero = get_py_zero(f.type)

                        repeated = False
                        packed = False

                        field_type = f.Type.Name(f.type).lower()[5:]

                        field_wraps = ""
                        if f.type_name.startswith(
                            ".google.protobuf"
                        ) and f.type_name.endswith("Value"):
                            w = f.type_name.split(".").pop()[:-5].upper()
                            field_wraps = f"betterproto.TYPE_{w}"

                        map_types = None
                        if f.type == 11:
                            # This might be a map...
                            message_type = f.type_name.split(".").pop().lower()
                            # message_type = py_type(package)
                            map_entry = f"{f.name.replace('_', '').lower()}entry"

                            if message_type == map_entry:
                                for nested in item.nested_type:
                                    if (
                                        nested.name.replace("_", "").lower()
                                        == map_entry
                                    ):
                                        if nested.options.map_entry:
                                            # print("Found a map!", file=sys.stderr)
                                            k = py_type(
                                                package,
                                                output["imports"],
                                                item,
                                                nested.field[0],
                                            )
                                            v = py_type(
                                                package,
                                                output["imports"],
                                                item,
                                                nested.field[1],
                                            )
                                            t = f"Dict[{k}, {v}]"
                                            field_type = "map"
                                            map_types = (
                                                f.Type.Name(nested.field[0].type),
                                                f.Type.Name(nested.field[1].type),
                                            )
                                            output["typing_imports"].add("Dict")

                        # Label 3 is LABEL_REPEATED in FieldDescriptorProto.
                        if f.label == 3 and field_type != "map":
                            # Repeated field
                            repeated = True
                            t = f"List[{t}]"
                            zero = "[]"
                            output["typing_imports"].add("List")

                            if f.type in [1, 2, 3, 4, 5, 6, 7, 8, 13, 15, 16, 17, 18]:
                                packed = True

                        one_of = ""
                        if f.HasField("oneof_index"):
                            one_of = item.oneof_decl[f.oneof_index].name

                        if "Optional[" in t:
                            output["typing_imports"].add("Optional")

                        if "timedelta" in t:
                            output["datetime_imports"].add("timedelta")
                        elif "datetime" in t:
                            output["datetime_imports"].add("datetime")

                        data["properties"].append(
                            {
                                "name": f.name,
                                "py_name": safe_snake_case(f.name),
                                "number": f.number,
                                "comment": get_comment(proto_file, path + [2, i]),
                                "proto_type": int(f.type),
                                "field_type": field_type,
                                "field_wraps": field_wraps,
                                "map_types": map_types,
                                "type": t,
                                "zero": zero,
                                "repeated": repeated,
                                "packed": packed,
                                "one_of": one_of,
                            }
                        )
                        # print(f, file=sys.stderr)

                    output["messages"].append(data)
                elif isinstance(item, EnumDescriptorProto):
                    # print(item.name, path, file=sys.stderr)
                    data.update(
                        {
                            "type": "Enum",
                            "comment": get_comment(proto_file, path),
                            "entries": [
                                {
                                    "name": v.name,
                                    "value": v.number,
                                    "comment": get_comment(proto_file, path + [2, i]),
                                }
                                for i, v in enumerate(item.value)
                            ],
                        }
                    )

                    output["enums"].append(data)

            for i, service in enumerate(proto_file.service):
                # print(service, file=sys.stderr)

                data = {
                    "name": service.name,
                    "py_name": stringcase.pascalcase(service.name),
                    "comment": get_comment(proto_file, [6, i]),
                    "methods": [],
                }

                for j, method in enumerate(service.method):
                    if method.client_streaming:
                        raise NotImplementedError("Client streaming not yet supported")

                    input_message = None
                    input_type = get_ref_type(
                        package, output["imports"], method.input_type
                    ).strip('"')
                    for msg in output["messages"]:
                        if msg["name"] == input_type:
                            input_message = msg
                            for field in msg["properties"]:
                                if field["zero"] == "None":
                                    output["typing_imports"].add("Optional")
                            break

                    data["methods"].append(
                        {
                            "name": method.name,
                            "py_name": stringcase.snakecase(method.name),
                            "comment": get_comment(proto_file, [6, i, 2, j]),
                            "route": f"/{package}.{service.name}/{method.name}",
                            "input": get_ref_type(
                                package, output["imports"], method.input_type
                            ).strip('"'),
                            "input_message": input_message,
                            "output": get_ref_type(
                                package, output["imports"], method.output_type
                            ).strip('"'),
                            "client_streaming": method.client_streaming,
                            "server_streaming": method.server_streaming,
                        }
                    )

                    if method.server_streaming:
                        output["typing_imports"].add("AsyncGenerator")

                output["services"].append(data)

        output["imports"] = sorted(output["imports"])
        output["datetime_imports"] = sorted(output["datetime_imports"])
        output["typing_imports"] = sorted(output["typing_imports"])

        # Fill response
        f = response.file.add()
        # print(filename, file=sys.stderr)
        f.name = filename.replace(".", os.path.sep) + ".py"

        # Render and then format the output file.
        f.content = black.format_str(
            template.render(description=output),
            mode=black.FileMode(target_versions=set([black.TargetVersion.PY37])),
        )

    inits = set([""])
    for f in response.file:
        # Ensure output paths exist
        # print(f.name, file=sys.stderr)
        dirnames = os.path.dirname(f.name)
        if dirnames:
            os.makedirs(dirnames, exist_ok=True)
            base = ""
            for part in dirnames.split(os.path.sep):
                base = os.path.join(base, part)
                inits.add(base)

    for base in inits:
        name = os.path.join(base, "__init__.py")

        if os.path.exists(name):
            # Never overwrite inits as they may have custom stuff in them.
            continue

        init = response.file.add()
        init.name = name
        init.content = b""

    filenames = sorted([f.name for f in response.file])
    for fname in filenames:
        print(f"Writing {fname}", file=sys.stderr)


def main():
    """The plugin's main entry point."""
    # Read request message from stdin
    data = sys.stdin.buffer.read()

    # Parse request
    request = plugin.CodeGeneratorRequest()
    request.ParseFromString(data)

    # Create response
    response = plugin.CodeGeneratorResponse()

    # Generate code
    generate_code(request, response)

    # Serialise response message
    output = response.SerializeToString()

    # Write to stdout
    sys.stdout.buffer.write(output)


if __name__ == "__main__":
    main()

View File

@@ -1,97 +0,0 @@
# Generated by the protocol buffer compiler. DO NOT EDIT!
# sources: {{ ', '.join(description.files) }}
# plugin: python-betterproto
from dataclasses import dataclass
{% if description.datetime_imports %}
from datetime import {% for i in description.datetime_imports %}{{ i }}{% if not loop.last %}, {% endif %}{% endfor %}
{% endif %}
{% if description.typing_imports %}
from typing import {% for i in description.typing_imports %}{{ i }}{% if not loop.last %}, {% endif %}{% endfor %}
{% endif %}

import betterproto
{% if description.services %}
import grpclib
{% endif %}

{% for i in description.imports %}
{{ i }}
{% endfor %}

{% if description.enums %}{% for enum in description.enums %}
class {{ enum.py_name }}(betterproto.Enum):
{% if enum.comment %}
{{ enum.comment }}
{% endif %}
{% for entry in enum.entries %}
{% if entry.comment %}
{{ entry.comment }}
{% endif %}
    {{ entry.name }} = {{ entry.value }}
{% endfor %}

{% endfor %}
{% endif %}
{% for message in description.messages %}
@dataclass
class {{ message.py_name }}(betterproto.Message):
{% if message.comment %}
{{ message.comment }}
{% endif %}
{% for field in message.properties %}
{% if field.comment %}
{{ field.comment }}
{% endif %}
    {{ field.py_name }}: {{ field.type }} = betterproto.{{ field.field_type }}_field({{ field.number }}{% if field.field_type == 'map'%}, betterproto.{{ field.map_types[0] }}, betterproto.{{ field.map_types[1] }}{% endif %}{% if field.one_of %}, group="{{ field.one_of }}"{% endif %}{% if field.field_wraps %}, wraps={{ field.field_wraps }}{% endif %})
{% endfor %}
{% if not message.properties %}
    pass
{% endif %}

{% endfor %}
{% for service in description.services %}
class {{ service.py_name }}Stub(betterproto.ServiceStub):
{% if service.comment %}
{{ service.comment }}
{% endif %}
{% for method in service.methods %}
    async def {{ method.py_name }}(self{% if method.input_message and method.input_message.properties %}, *, {% for field in method.input_message.properties %}{{ field.name }}: {% if field.zero == "None" and not field.type.startswith("Optional[") %}Optional[{{ field.type }}]{% else %}{{ field.type }}{% endif %} = {{ field.zero }}{% if not loop.last %}, {% endif %}{% endfor %}{% endif %}) -> {% if method.server_streaming %}AsyncGenerator[{{ method.output }}, None]{% else %}{{ method.output }}{% endif %}:
{% if method.comment %}
{{ method.comment }}
{% endif %}
        request = {{ method.input }}()
{% for field in method.input_message.properties %}
{% if field.field_type == 'message' %}
        if {{ field.name }} is not None:
            request.{{ field.name }} = {{ field.name }}
{% else %}
        request.{{ field.name }} = {{ field.name }}
{% endif %}
{% endfor %}
{% if method.server_streaming %}
        async for response in self._unary_stream(
            "{{ method.route }}",
            request,
            {{ method.output }},
        ):
            yield response
{% else %}
        return await self._unary_unary(
            "{{ method.route }}",
            request,
            {{ method.output }},
        )
{% endif %}
{% endfor %}
{% endfor %}


@@ -1,3 +0,0 @@
{
    "greeting": "HEY"
}


@@ -1,14 +0,0 @@
syntax = "proto3";
// Enum for the different greeting types
enum Greeting {
HI = 0;
HEY = 1;
// Formal greeting
HELLO = 2;
}
message Test {
// Greeting enum example
Greeting greeting = 1;
}


@@ -1,84 +0,0 @@
#!/usr/bin/env python
import os
# Force pure-python implementation instead of C++, otherwise imports
# break things because we can't properly reset the symbol database.
os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"
import importlib
import json
import subprocess
import sys
from typing import Generator, Tuple
from google.protobuf import symbol_database
from google.protobuf.descriptor_pool import DescriptorPool
from google.protobuf.json_format import MessageToJson, Parse
root = os.path.dirname(os.path.realpath(__file__))
def get_files(end: str) -> Generator[str, None, None]:
for r, dirs, files in os.walk(root):
for filename in [f for f in files if f.endswith(end)]:
yield os.path.join(r, filename)
def get_base(filename: str) -> str:
return os.path.splitext(os.path.basename(filename))[0]
def ensure_ext(filename: str, ext: str) -> str:
if not filename.endswith(ext):
return filename + ext
return filename
if __name__ == "__main__":
os.chdir(root)
if len(sys.argv) > 1:
proto_files = [ensure_ext(f, ".proto") for f in sys.argv[1:]]
bases = {get_base(f) for f in proto_files}
json_files = [
f for f in get_files(".json") if get_base(f).split("-")[0] in bases
]
else:
proto_files = get_files(".proto")
json_files = get_files(".json")
for filename in proto_files:
print(f"Generating code for {os.path.basename(filename)}")
subprocess.run(
f"protoc --python_out=. {os.path.basename(filename)}", shell=True
)
subprocess.run(
f"protoc --plugin=protoc-gen-custom=../plugin.py --custom_out=. {os.path.basename(filename)}",
shell=True,
)
for filename in json_files:
# Reset the internal symbol database so we can import the `Test` message
# multiple times. Ugh.
sym = symbol_database.Default()
sym.pool = DescriptorPool()
parts = get_base(filename).split("-")
out = filename.replace(".json", ".bin")
print(f"Using {parts[0]}_pb2 to generate {os.path.basename(out)}")
imported = importlib.import_module(f"{parts[0]}_pb2")
input_json = open(filename).read()
parsed = Parse(input_json, imported.Test())
serialized = parsed.SerializeToString()
preserve = "casing" not in filename
serialized_json = MessageToJson(parsed, preserving_proto_field_name=preserve)
s_loaded = json.loads(serialized_json)
in_loaded = json.loads(input_json)
if s_loaded != in_loaded:
raise AssertionError("Expected JSON to be equal:", s_loaded, in_loaded)
open(out, "wb").write(serialized)
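
A quick sketch of the helper semantics above (fixture names hypothetical): `ensure_ext`
and `get_base` are what let you pass bare proto names on the command line and still
match the right `.json` fixtures.

assert ensure_ext("bool", ".proto") == "bool.proto"
assert get_base("tests/bool-default.json") == "bool-default"
# Fixtures are matched to a proto by the prefix before the first dash:
assert get_base("bool-default.json").split("-")[0] == "bool"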


@@ -1,3 +0,0 @@
{
"count": -150
}


@@ -1,3 +0,0 @@
{
"count": 150
}


@@ -1,5 +0,0 @@
{
"for": 1,
"with": 2,
"as": 3
}


@@ -1,7 +0,0 @@
syntax = "proto3";
message Test {
int32 for = 1;
int32 with = 2;
int32 as = 3;
}


@@ -1,3 +0,0 @@
{
"name": "foo"
}


@@ -1,3 +0,0 @@
{
"count": 1
}


@@ -1,8 +0,0 @@
syntax = "proto3";
message Test {
oneof foo {
int32 count = 1;
string name = 2;
}
}


@@ -1,4 +0,0 @@
{
"signed_32": -150,
"signed_64": "-150"
}


@@ -1,4 +0,0 @@
{
"signed_32": 150,
"signed_64": "150"
}


@@ -1,6 +0,0 @@
syntax = "proto3";
message Test {
sint32 signed_32 = 1;
sint64 signed_64 = 2;
}


@@ -1,164 +0,0 @@
import betterproto
from dataclasses import dataclass
from typing import Optional
def test_has_field():
@dataclass
class Bar(betterproto.Message):
baz: int = betterproto.int32_field(1)
@dataclass
class Foo(betterproto.Message):
bar: Bar = betterproto.message_field(1)
# Unset by default
foo = Foo()
assert betterproto.serialized_on_wire(foo.bar) is False
# Serialized after setting something
foo.bar.baz = 1
assert betterproto.serialized_on_wire(foo.bar) is True
# Still has it after setting the default value
foo.bar.baz = 0
assert betterproto.serialized_on_wire(foo.bar) is True
# Manual override (don't do this)
foo.bar._serialized_on_wire = False
assert betterproto.serialized_on_wire(foo.bar) is False
# Can manually set it, but it defaults to False
foo.bar = Bar()
assert betterproto.serialized_on_wire(foo.bar) is False
def test_enum_as_int_json():
class TestEnum(betterproto.Enum):
ZERO = 0
ONE = 1
@dataclass
class Foo(betterproto.Message):
bar: TestEnum = betterproto.enum_field(1)
# JSON strings are supported, but ints should still be supported too.
foo = Foo().from_dict({"bar": 1})
assert foo.bar == TestEnum.ONE
# Plain-ol'-ints should serialize properly too.
foo.bar = 1
assert foo.to_dict() == {"bar": "ONE"}
def test_unknown_fields():
@dataclass
class Newer(betterproto.Message):
foo: bool = betterproto.bool_field(1)
bar: int = betterproto.int32_field(2)
baz: str = betterproto.string_field(3)
@dataclass
class Older(betterproto.Message):
foo: bool = betterproto.bool_field(1)
newer = Newer(foo=True, bar=1, baz="Hello")
serialized_newer = bytes(newer)
# Unknown fields in `Newer` should round trip with `Older`
round_trip = bytes(Older().parse(serialized_newer))
assert serialized_newer == round_trip
new_again = Newer().parse(round_trip)
assert newer == new_again
def test_oneof_support():
@dataclass
class Sub(betterproto.Message):
val: int = betterproto.int32_field(1)
@dataclass
class Foo(betterproto.Message):
bar: int = betterproto.int32_field(1, group="group1")
baz: str = betterproto.string_field(2, group="group1")
sub: Sub = betterproto.message_field(3, group="group2")
abc: str = betterproto.string_field(4, group="group2")
foo = Foo()
assert betterproto.which_one_of(foo, "group1")[0] == ""
foo.bar = 1
foo.baz = "test"
# Other oneof fields should now be unset
assert foo.bar == 0
assert betterproto.which_one_of(foo, "group1")[0] == "baz"
foo.sub.val = 1
assert betterproto.serialized_on_wire(foo.sub)
foo.abc = "test"
# Group 1 shouldn't be touched, group 2 should have reset
assert foo.sub.val == 0
assert betterproto.serialized_on_wire(foo.sub) is False
assert betterproto.which_one_of(foo, "group2")[0] == "abc"
# Zero value should always serialize for one-of
foo = Foo(bar=0)
assert betterproto.which_one_of(foo, "group1")[0] == "bar"
assert bytes(foo) == b"\x08\x00"
# Round trip should also work
foo2 = Foo().parse(bytes(foo))
assert betterproto.which_one_of(foo2, "group1")[0] == "bar"
assert foo2.bar == 0
assert betterproto.which_one_of(foo2, "group2")[0] == ""
def test_json_casing():
@dataclass
class CasingTest(betterproto.Message):
pascal_case: int = betterproto.int32_field(1)
camel_case: int = betterproto.int32_field(2)
snake_case: int = betterproto.int32_field(3)
kabob_case: int = betterproto.int32_field(4)
# Parsing should accept almost any input
test = CasingTest().from_dict(
{"PascalCase": 1, "camelCase": 2, "snake_case": 3, "kabob-case": 4}
)
assert test == CasingTest(1, 2, 3, 4)
# Serializing should be strict.
assert test.to_dict() == {
"pascalCase": 1,
"camelCase": 2,
"snakeCase": 3,
"kabobCase": 4,
}
assert test.to_dict(casing=betterproto.Casing.SNAKE) == {
"pascal_case": 1,
"camel_case": 2,
"snake_case": 3,
"kabob_case": 4,
}
def test_optional_flag():
@dataclass
class Request(betterproto.Message):
flag: Optional[bool] = betterproto.message_field(1, wraps=betterproto.TYPE_BOOL)
# Serialization of not passed vs. set vs. zero-value.
assert bytes(Request()) == b""
assert bytes(Request(flag=True)) == b"\n\x02\x08\x01"
assert bytes(Request(flag=False)) == b"\n\x00"
# Differentiate between not passed and the zero-value.
assert Request().parse(b"").flag is None
assert Request().parse(b"\n\x00").flag is False


@@ -1,32 +0,0 @@
import importlib
import json
import pytest
from .generate import get_base, get_files
inputs = get_files(".bin")
@pytest.mark.parametrize("filename", inputs)
def test_sample(filename: str) -> None:
module = get_base(filename).split("-")[0]
imported = importlib.import_module(f"betterproto.tests.{module}")
data_binary = open(filename, "rb").read()
data_dict = json.loads(open(filename.replace(".bin", ".json")).read())
t1 = imported.Test().parse(data_binary)
t2 = imported.Test().from_dict(data_dict)
print(t1)
print(t2)
# Equality should automagically work for dataclasses!
assert t1 == t2
# Generally this can't be relied on, but here we are aiming to match the
# existing Python implementation and aren't doing anything tricky.
# https://developers.google.com/protocol-buffers/docs/encoding#implications
assert bytes(t1) == data_binary
assert bytes(t2) == data_binary
assert t1.to_dict() == data_dict
assert t2.to_dict() == data_dict

31 docs/api.rst Normal file

@@ -0,0 +1,31 @@
.. currentmodule:: betterproto
API reference
=============
The following document outlines betterproto's API. **None** of these classes should be
extended manually by the user.
Message
--------
.. autoclass:: betterproto.Message
    :members:
    :special-members: __bytes__, __bool__
.. autofunction:: betterproto.serialized_on_wire
.. autofunction:: betterproto.which_one_of
Enumerations
-------------
.. autoclass:: betterproto.Enum()
    :members:

.. autoclass:: betterproto.Casing()
    :members:

60 docs/conf.py Normal file

@@ -0,0 +1,60 @@
# Configuration file for the Sphinx documentation builder.
#
# This file only contains a selection of the most common options. For a full
# list see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
import pathlib
import toml
# -- Project information -----------------------------------------------------
project = "betterproto"
copyright = "2019 Daniel G. Taylor"
author = "danielgtaylor"
pyproject = toml.load(open(pathlib.Path(__file__).parent.parent / "pyproject.toml"))
# The full version, including alpha/beta/rc tags.
release = pyproject["tool"]["poetry"]["version"]
# -- General configuration ---------------------------------------------------
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
"sphinx.ext.autodoc",
"sphinx.ext.intersphinx",
"sphinx.ext.napoleon",
]
autodoc_member_order = "bysource"
autodoc_typehints = "none"
extlinks = {
"issue": ("https://github.com/danielgtaylor/python-betterproto/issues/%s", "GH-"),
}
# Links used for cross-referencing stuff in other documentation
intersphinx_mapping = {
"py": ("https://docs.python.org/3", None),
}
# -- Options for HTML output -------------------------------------------------
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = "friendly"
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = "sphinx_rtd_theme"

33 docs/index.rst Normal file

@@ -0,0 +1,33 @@
Welcome to betterproto's documentation!
=======================================
betterproto is a protobuf compiler and interpreter. It improves the experience of using
Protobuf and gRPC in Python by generating readable, understandable, and idiomatic
Python code, using modern language features.
Features:
~~~~~~~~~
- Generated messages are both binary & JSON serializable
- Messages use relevant Python types, e.g. ``Enum``, ``datetime`` and ``timedelta``
objects
- ``async``/``await`` support for gRPC Clients and Servers
- Generates modern, readable, idiomatic Python code
Contents:
~~~~~~~~~
.. toctree::
    :maxdepth: 2

    quick-start
    api
    migrating
If you still can't find what you're looking for, try one of the following pages:
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

157 docs/migrating.rst Normal file

@@ -0,0 +1,157 @@
Migration Guide
===============
Google's protocolbuffers
------------------------
betterproto is, for the most part, a 1:1 drop-in replacement for Google's protocolbuffers
(after regenerating your protobufs, of course), although there are some minor differences.
.. note::
betterproto implements the same basic methods including:
- :meth:`betterproto.Message.FromString`
- :meth:`betterproto.Message.SerializeToString`
for compatibility purposes; however, it is important to note that these are
effectively aliases for :meth:`betterproto.Message.parse` and
:meth:`betterproto.Message.__bytes__` respectively.
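
As a hedged illustration (``MyMessage`` is a hypothetical generated message):

.. code-block:: python

    data = bytes(MyMessage(value=1))      # equivalent to .SerializeToString()
    message = MyMessage.FromString(data)  # equivalent to MyMessage().parse(data)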
Determining if a message was sent
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Sometimes it is useful to be able to determine whether a message has been sent on
the wire. This is how the Google wrapper types let you know whether a value is
unset (i.e. left as the default/zero value) or explicitly set to something else, for example.
Use ``betterproto.serialized_on_wire(message)`` to determine if it was sent. This is
a little bit different from the official Google generated Python code, and it lives
outside the generated ``Message`` class to prevent name clashes. Note that it only
supports Proto 3 and thus can only be used to check if ``Message`` fields are set.
You cannot check if a scalar was sent on the wire.
.. code-block:: python
    # Old way (official Google Protobuf package)
    >>> mymessage.HasField('myfield')
    True

    # New way (this project)
    >>> betterproto.serialized_on_wire(mymessage.myfield)
    True
One-of Support
~~~~~~~~~~~~~~
Protobuf supports grouping fields in a oneof clause. Only one of the fields in the group
may be set at a given time. For example, given the proto:
.. code-block:: proto
    syntax = "proto3";

    message Test {
        oneof foo {
            bool on = 1;
            int32 count = 2;
            string name = 3;
        }
    }
You can use ``betterproto.which_one_of(message, group_name)`` to determine which of the
fields was set. It returns a tuple of the field name and value, or a blank string and
``None`` if unset. Again, this is a little different from the official Google code
generator:
.. code-block:: python
    # Old way (official Google protobuf package)
    >>> message.WhichOneof("group")
    "foo"

    # New way (this project)
    >>> betterproto.which_one_of(message, "group")
    ("foo", "foo's value")
Well-Known Google Types
~~~~~~~~~~~~~~~~~~~~~~~
Google provides several well-known message types like a timestamp, duration, and several
wrappers used to provide optional zero value support. Each of these has a special JSON
representation and is handled a little differently from normal messages. The Python
mapping for these is as follows:
+-------------------------------+-----------------------------------------------+--------------------------+
| ``Google Message`` | ``Python Type`` | ``Default`` |
+===============================+===============================================+==========================+
| ``google.protobuf.duration`` | :class:`datetime.timedelta` | ``0`` |
+-------------------------------+-----------------------------------------------+--------------------------+
| ``google.protobuf.timestamp`` | ``Timezone-aware`` :class:`datetime.datetime` | ``1970-01-01T00:00:00Z`` |
+-------------------------------+-----------------------------------------------+--------------------------+
| ``google.protobuf.*Value`` | ``Optional[...]``/``None`` | ``None`` |
+-------------------------------+-----------------------------------------------+--------------------------+
| ``google.protobuf.*`` | ``betterproto.lib.google.protobuf.*`` | ``None`` |
+-------------------------------+-----------------------------------------------+--------------------------+
For the wrapper types, the Python type corresponds to the wrapped type, e.g.
``google.protobuf.BoolValue`` becomes ``Optional[bool]`` while
``google.protobuf.Int32Value`` becomes ``Optional[int]``. All of the optional values
default to None, so don't forget to check for that possible state.
Given:
.. code-block:: proto
    syntax = "proto3";

    import "google/protobuf/duration.proto";
    import "google/protobuf/timestamp.proto";
    import "google/protobuf/wrappers.proto";

    message Test {
        google.protobuf.BoolValue maybe = 1;
        google.protobuf.Timestamp ts = 2;
        google.protobuf.Duration duration = 3;
    }
You can use it as such:
.. code-block:: python
    >>> t = Test().from_dict({"maybe": True, "ts": "2019-01-01T12:00:00Z", "duration": "1.200s"})
    >>> t
    Test(maybe=True, ts=datetime.datetime(2019, 1, 1, 12, 0, tzinfo=datetime.timezone.utc), duration=datetime.timedelta(seconds=1, microseconds=200000))
    >>> t.ts - t.duration
    datetime.datetime(2019, 1, 1, 11, 59, 58, 800000, tzinfo=datetime.timezone.utc)
    >>> t.ts.isoformat()
    '2019-01-01T12:00:00+00:00'
    >>> t.maybe = None
    >>> t.to_dict()
    {'ts': '2019-01-01T12:00:00Z', 'duration': '1.200s'}
[1.2.5] to [2.0.0b1]
--------------------
Updated package structures
~~~~~~~~~~~~~~~~~~~~~~~~~~
Generated code now strictly follows the *package structure* of the ``.proto`` files.
Consequently ``.proto`` files without a package will be combined in a single
``__init__.py`` file. To avoid overwriting existing ``__init__.py`` files, it's best
to compile into a dedicated subdirectory.
Upgrading:
- Remove your previously compiled ``.py`` files.
- Create a new *empty* directory, e.g. ``generated`` or ``lib/generated/proto`` etc.
- Regenerate your Python files into this directory
- Update import statements, e.g. ``from generated import ExampleMessage`` (see the sketch below)
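
As an illustration (package and message names hypothetical), a ``.proto`` file declaring
``package hello.world;`` and compiled into a ``generated`` directory would then be
imported as:

.. code-block:: python

    from generated.hello.world import ExampleMessage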

222 docs/quick-start.rst Normal file

@@ -0,0 +1,222 @@
Getting Started
===============
Installation
++++++++++++
Installation from PyPI is as simple as running:
.. code-block:: sh
    python3 -m pip install -U betterproto
If you are using Windows, then the following should be used instead:
.. code-block:: sh
    py -3 -m pip install -U betterproto
To include the protoc plugin, install ``betterproto[compiler]`` instead of
``betterproto``, e.g.
.. code-block:: sh
    python3 -m pip install -U "betterproto[compiler]"
Compiling proto files
+++++++++++++++++++++
Assuming you have installed the compiler and have a proto file, e.g. ``example.proto``:
.. code-block:: proto
syntax = "proto3";
package hello;
// Greeting represents a message you can tell a user.
message Greeting {
string message = 1;
}
To compile the proto, you can run the following to invoke protoc directly:
.. code-block:: sh
    mkdir lib
    protoc -I . --python_betterproto_out=lib example.proto
or run the following to invoke protoc via grpcio-tools:
.. code-block:: sh
    pip install grpcio-tools
    python -m grpc_tools.protoc -I . --python_betterproto_out=lib example.proto
This will generate ``lib/__init__.py`` which looks like:
.. code-block:: python
    # Generated by the protocol buffer compiler. DO NOT EDIT!
    # sources: example.proto
    # plugin: python-betterproto
    from dataclasses import dataclass

    import betterproto

    @dataclass
    class Greeting(betterproto.Message):
        """Greeting represents a message you can tell a user."""

        message: str = betterproto.string_field(1)
Then to use it:
.. code-block:: python
    >>> from lib import Greeting
    >>> test = Greeting()
    >>> test
    Greeting(message='')
    >>> test.message = "Hey!"
    >>> test
    Greeting(message='Hey!')
    >>> serialized = bytes(test)
    >>> serialized
    b'\n\x04Hey!'
    >>> Greeting().parse(serialized)
    Greeting(message='Hey!')
Async gRPC Support
++++++++++++++++++
The generated code includes `grpclib <https://grpclib.readthedocs.io/en/latest>`_ based
stub (client and server) classes for RPC services declared in the input proto files.
This is enabled by default.
Given a service definition similar to the one below:
.. code-block:: proto
    syntax = "proto3";

    package echo;

    message EchoRequest {
        string value = 1;
        // Number of extra times to echo
        uint32 extra_times = 2;
    }

    message EchoResponse {
        repeated string values = 1;
    }

    message EchoStreamResponse {
        string value = 1;
    }

    service Echo {
        rpc Echo(EchoRequest) returns (EchoResponse);
        rpc EchoStream(EchoRequest) returns (stream EchoStreamResponse);
    }
The generated client can be used like so:
.. code-block:: python
    import asyncio

    from grpclib.client import Channel

    import echo

    async def main():
        channel = Channel(host="127.0.0.1", port=50051)
        service = echo.EchoStub(channel)
        response = await service.echo(value="hello", extra_times=1)
        print(response)

        async for response in service.echo_stream(value="hello", extra_times=1):
            print(response)

        # don't forget to close the channel when you're done!
        channel.close()

    asyncio.run(main())  # python 3.7+ only

    # outputs
    EchoResponse(values=['hello', 'hello'])
    EchoStreamResponse(value='hello')
    EchoStreamResponse(value='hello')
The server-facing stubs can be used to implement a Python
gRPC server.
To use them, simply subclass the base class in the generated files and override the
service methods:
.. code-block:: python
    from typing import AsyncIterator

    from grpclib.server import Server

    from echo import EchoBase, EchoResponse, EchoStreamResponse

    class EchoService(EchoBase):
        async def echo(self, value: str, extra_times: int) -> "EchoResponse":
            return EchoResponse(values=[value for _ in range(extra_times + 1)])

        async def echo_stream(
            self, value: str, extra_times: int
        ) -> AsyncIterator["EchoStreamResponse"]:
            for _ in range(extra_times + 1):
                yield EchoStreamResponse(value=value)

    async def start_server():
        HOST = "127.0.0.1"
        PORT = 1337
        server = Server([EchoService()])
        await server.start(HOST, PORT)
        await server.serve_forever()
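
To actually serve requests, drive ``start_server`` with an event loop, e.g. (a minimal
sketch):

.. code-block:: python

    import asyncio

    if __name__ == "__main__":
        asyncio.run(start_server())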
JSON
++++
Message objects include :meth:`betterproto.Message.to_json` and
:meth:`betterproto.Message.from_json` methods for JSON (de)serialisation, and
:meth:`betterproto.Message.to_dict`, :meth:`betterproto.Message.from_dict` for
converting back and forth from JSON serializable dicts.
For compatibility, the default is to convert field names to
:attr:`betterproto.Casing.CAMEL`. You can control this behavior by passing a
different casing value, e.g.:
.. code-block:: python
    @dataclass
    class MyMessage(betterproto.Message):
        a_long_field_name: str = betterproto.string_field(1)

    >>> test = MyMessage(a_long_field_name="Hello World!")
    >>> test.to_dict(betterproto.Casing.SNAKE)
    {"a_long_field_name": "Hello World!"}
    >>> test.to_dict(betterproto.Casing.CAMEL)
    {"aLongFieldName": "Hello World!"}
    >>> test.to_json(indent=2)
    '{\n  "aLongFieldName": "Hello World!"\n}'
    >>> test.from_dict({"aLongFieldName": "Goodbye World!"})
    >>> test.a_long_field_name
    "Goodbye World!"

1531 poetry.lock generated Normal file

File diff suppressed because it is too large


@@ -1,9 +1,125 @@
[tool.black]
target-version = ['py37']
[tool.poetry]
name = "betterproto"
version = "2.0.0b4"
description = "A better Protobuf / gRPC generator & library"
authors = ["Daniel G. Taylor <danielgtaylor@gmail.com>"]
readme = "README.md"
repository = "https://github.com/danielgtaylor/python-betterproto"
keywords = ["protobuf", "gRPC"]
license = "MIT"
packages = [
{ include = "betterproto", from = "src" }
]
[tool.isort]
multi_line_output = 3
include_trailing_comma = true
force_grid_wrap = 0
use_parentheses = true
line_length = 88
[tool.poetry.dependencies]
python = ">=3.6.2,<4.0"
black = { version = ">=19.3b0", optional = true }
dataclasses = { version = "^0.7", python = ">=3.6, <3.7" }
grpclib = "^0.4.1"
jinja2 = { version = "^2.11.2", optional = true }
python-dateutil = "^2.8"
[tool.poetry.dev-dependencies]
asv = "^0.4.2"
black = "^21.11b0"
bpython = "^0.19"
grpcio-tools = "^1.40.0"
jinja2 = "^2.11.2"
mypy = "^0.930"
poethepoet = ">=0.9.0"
protobuf = "^3.12.2"
pytest = "^6.2.5"
pytest-asyncio = "^0.12.0"
pytest-cov = "^2.9.0"
pytest-mock = "^3.1.1"
sphinx = "3.1.2"
sphinx-rtd-theme = "0.5.0"
tomlkit = "^0.7.0"
tox = "^3.15.1"
[tool.poetry.scripts]
protoc-gen-python_betterproto = "betterproto.plugin:main"
[tool.poetry.extras]
compiler = ["black", "jinja2"]
# Dev workflow tasks
[tool.poe.tasks.generate]
script = "tests.generate:main"
help = "Generate test cases (do this once before running test)"
[tool.poe.tasks.test]
cmd = "pytest"
help = "Run tests"
[tool.poe.tasks.types]
cmd = "mypy src --ignore-missing-imports"
help = "Check types with mypy"
[tool.poe.tasks.format]
cmd = "black . --exclude tests/output_"
help = "Apply black formatting to source code"
[tool.poe.tasks.docs]
cmd = "sphinx-build docs docs/build"
help = "Build the sphinx docs"
[tool.poe.tasks.bench]
shell = "asv run master^! && asv run HEAD^! && asv compare master HEAD"
help = "Benchmark current commit vs. master branch"
[tool.poe.tasks.clean]
cmd = """
rm -rf .asv .coverage .mypy_cache .pytest_cache
dist betterproto.egg-info **/__pycache__
tests/output_*
"""
help = "Clean out generated files from the workspace"
[tool.poe.tasks.generate_lib]
cmd = """
protoc
--plugin=protoc-gen-custom=src/betterproto/plugin/main.py
--custom_opt=INCLUDE_GOOGLE
--custom_out=src/betterproto/lib
-I /usr/local/include/
/usr/local/include/google/protobuf/**/*.proto
"""
help = "Regenerate the types in betterproto.lib.google"
# CI tasks
[tool.poe.tasks.full-test]
shell = "poe generate && tox"
help = "Run tests with multiple pythons"
[tool.poe.tasks.check-style]
cmd = "black . --check --diff --exclude tests/output_"
help = "Check if code style is correct"
[tool.black]
target-version = ['py36']
[tool.coverage.run]
omit = ["betterproto/tests/*"]
[tool.tox]
legacy_tox_ini = """
[tox]
isolated_build = true
envlist = py36, py37, py38, py310
[testenv]
whitelist_externals = poetry
commands =
poetry install -v --extras compiler
poetry run pytest --cov betterproto
"""
[build-system]
requires = ["poetry-core>=1.0.0,<2"]
build-backend = "poetry.core.masonry.api"

5 pytest.ini Normal file

@@ -0,0 +1,5 @@
[pytest]
python_files = test_*.py
python_classes =
norecursedirs = **/output_*
addopts = -p no:warnings


@@ -1,24 +0,0 @@
from setuptools import setup, find_packages
setup(
name="betterproto",
version="1.2.0",
description="A better Protobuf / gRPC generator & library",
long_description=open("README.md", "r").read(),
long_description_content_type="text/markdown",
url="http://github.com/danielgtaylor/python-betterproto",
author="Daniel G. Taylor",
author_email="danielgtaylor@gmail.com",
license="MIT",
entry_points={
"console_scripts": ["protoc-gen-python_betterproto=betterproto.plugin:main"]
},
packages=find_packages(
exclude=["tests", "*.tests", "*.tests.*", "output", "output.*"]
),
package_data={"betterproto": ["py.typed", "templates/template.py"]},
python_requires=">=3.7",
install_requires=["grpclib", "stringcase"],
extras_require={"compiler": ["black", "jinja2", "protobuf"]},
zip_safe=False,
)

1382 src/betterproto/__init__.py Normal file

File diff suppressed because it is too large


@@ -0,0 +1,9 @@
from typing import TYPE_CHECKING, TypeVar
if TYPE_CHECKING:
from grpclib._typing import IProtoMessage
from . import Message
# Bound type variable to allow methods to return `self` of subclasses
T = TypeVar("T", bound="Message")
ST = TypeVar("ST", bound="IProtoMessage")


@@ -0,0 +1,3 @@
from pkg_resources import get_distribution
__version__ = get_distribution("betterproto").version

138 src/betterproto/casing.py Normal file

@@ -0,0 +1,138 @@
import keyword
import re
# Word delimiters and symbols that will not be preserved when re-casing.
# language=PythonRegExp
SYMBOLS = "[^a-zA-Z0-9]*"
# Optionally capitalized word.
# language=PythonRegExp
WORD = "[A-Z]*[a-z]*[0-9]*"
# Uppercase word, not followed by lowercase letters.
# language=PythonRegExp
WORD_UPPER = "[A-Z]+(?![a-z])[0-9]*"
def safe_snake_case(value: str) -> str:
"""Snake case a value taking into account Python keywords."""
value = snake_case(value)
value = sanitize_name(value)
return value
def snake_case(value: str, strict: bool = True) -> str:
"""
Join words with an underscore into lowercase and remove symbols.
Parameters
-----------
value: :class:`str`
The value to convert.
strict: :class:`bool`
Whether or not to force single underscores.
Returns
--------
:class:`str`
The value in snake_case.
"""
def substitute_word(symbols: str, word: str, is_start: bool) -> str:
if not word:
return ""
if strict:
delimiter_count = 0 if is_start else 1 # Single underscore if strict.
elif is_start:
delimiter_count = len(symbols)
elif word.isupper() or word.islower():
delimiter_count = max(
1, len(symbols)
) # Preserve all delimiters if not strict.
else:
delimiter_count = len(symbols) + 1 # Extra underscore for leading capital.
return ("_" * delimiter_count) + word.lower()
snake = re.sub(
f"(^)?({SYMBOLS})({WORD_UPPER}|{WORD})",
lambda groups: substitute_word(groups[2], groups[3], groups[1] is not None),
value,
)
return snake
def pascal_case(value: str, strict: bool = True) -> str:
"""
Capitalize each word and remove symbols.
Parameters
-----------
value: :class:`str`
The value to convert.
strict: :class:`bool`
Whether or not to output only alphanumeric characters.
Returns
--------
:class:`str`
The value in PascalCase.
"""
def substitute_word(symbols, word):
if strict:
return word.capitalize() # Remove all delimiters
if word.islower():
delimiter_length = len(symbols[:-1]) # Lose one delimiter
else:
delimiter_length = len(symbols) # Preserve all delimiters
return ("_" * delimiter_length) + word.capitalize()
return re.sub(
f"({SYMBOLS})({WORD_UPPER}|{WORD})",
lambda groups: substitute_word(groups[1], groups[2]),
value,
)
def camel_case(value: str, strict: bool = True) -> str:
"""
Capitalize all words except first and remove symbols.
Parameters
-----------
value: :class:`str`
The value to convert.
strict: :class:`bool`
Whether or not to output only alphanumeric characters.
Returns
--------
:class:`str`
The value in camelCase.
"""
return lowercase_first(pascal_case(value, strict=strict))
def lowercase_first(value: str) -> str:
"""
Lower cases the first character of the value.
Parameters
----------
value: :class:`str`
The value to lower case.
Returns
-------
:class:`str`
The lower cased string.
"""
return value[0:1].lower() + value[1:]
def sanitize_name(value: str) -> str:
# https://www.python.org/dev/peps/pep-0008/#descriptive-naming-styles
return f"{value}_" if keyword.iskeyword(value) else value
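
A few illustrative inputs and expected outputs for these helpers (a sketch, assuming
the regular expressions above behave as documented):

assert snake_case("ExampleHTTPRequest") == "example_http_request"
assert pascal_case("example_field") == "ExampleField"
assert camel_case("example_field") == "exampleField"
# Python keywords get a trailing underscore so generated names stay importable:
assert safe_snake_case("for") == "for_"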


@@ -0,0 +1,162 @@
import os
import re
from typing import Dict, List, Set, Tuple, Type
from ..casing import safe_snake_case
from ..lib.google import protobuf as google_protobuf
from .naming import pythonize_class_name
WRAPPER_TYPES: Dict[str, Type] = {
".google.protobuf.DoubleValue": google_protobuf.DoubleValue,
".google.protobuf.FloatValue": google_protobuf.FloatValue,
".google.protobuf.Int32Value": google_protobuf.Int32Value,
".google.protobuf.Int64Value": google_protobuf.Int64Value,
".google.protobuf.UInt32Value": google_protobuf.UInt32Value,
".google.protobuf.UInt64Value": google_protobuf.UInt64Value,
".google.protobuf.BoolValue": google_protobuf.BoolValue,
".google.protobuf.StringValue": google_protobuf.StringValue,
".google.protobuf.BytesValue": google_protobuf.BytesValue,
}
def parse_source_type_name(field_type_name: str) -> Tuple[str, str]:
"""
Split full source type name into package and type name.
E.g. 'root.package.Message' -> ('root.package', 'Message')
'root.Message.SomeEnum' -> ('root', 'Message.SomeEnum')
"""
package_match = re.match(r"^\.?([^A-Z]+)\.(.+)", field_type_name)
if package_match:
package = package_match.group(1)
name = package_match.group(2)
else:
package = ""
name = field_type_name.lstrip(".")
return package, name
def get_type_reference(
package: str, imports: set, source_type: str, unwrap: bool = True
) -> str:
"""
Return a Python type name for a proto type reference. Adds the import if
necessary. Unwraps well known type if required.
"""
if unwrap:
if source_type in WRAPPER_TYPES:
wrapped_type = type(WRAPPER_TYPES[source_type]().value)
return f"Optional[{wrapped_type.__name__}]"
if source_type == ".google.protobuf.Duration":
return "timedelta"
elif source_type == ".google.protobuf.Timestamp":
return "datetime"
source_package, source_type = parse_source_type_name(source_type)
current_package: List[str] = package.split(".") if package else []
py_package: List[str] = source_package.split(".") if source_package else []
py_type: str = pythonize_class_name(source_type)
compiling_google_protobuf = current_package == ["google", "protobuf"]
importing_google_protobuf = py_package == ["google", "protobuf"]
if importing_google_protobuf and not compiling_google_protobuf:
py_package = ["betterproto", "lib"] + py_package
if py_package[:1] == ["betterproto"]:
return reference_absolute(imports, py_package, py_type)
if py_package == current_package:
return reference_sibling(py_type)
if py_package[: len(current_package)] == current_package:
return reference_descendent(current_package, imports, py_package, py_type)
if current_package[: len(py_package)] == py_package:
return reference_ancestor(current_package, imports, py_package, py_type)
return reference_cousin(current_package, imports, py_package, py_type)
def reference_absolute(imports: Set[str], py_package: List[str], py_type: str) -> str:
"""
Returns a reference to a python type located in the root, i.e. sys.path.
"""
string_import = ".".join(py_package)
string_alias = safe_snake_case(string_import)
imports.add(f"import {string_import} as {string_alias}")
return f'"{string_alias}.{py_type}"'
def reference_sibling(py_type: str) -> str:
"""
Returns a reference to a python type within the same package as the current package.
"""
return f'"{py_type}"'
def reference_descendent(
current_package: List[str], imports: Set[str], py_package: List[str], py_type: str
) -> str:
"""
Returns a reference to a python type in a package that is a descendent of the
current package, and adds the required import that is aliased to avoid name
conflicts.
"""
importing_descendent = py_package[len(current_package) :]
string_from = ".".join(importing_descendent[:-1])
string_import = importing_descendent[-1]
if string_from:
string_alias = "_".join(importing_descendent)
imports.add(f"from .{string_from} import {string_import} as {string_alias}")
return f'"{string_alias}.{py_type}"'
else:
imports.add(f"from . import {string_import}")
return f'"{string_import}.{py_type}"'
def reference_ancestor(
current_package: List[str], imports: Set[str], py_package: List[str], py_type: str
) -> str:
"""
Returns a reference to a python type in a package which is an ancestor to the
current package, and adds the required import that is aliased (if possible) to avoid
name conflicts.
Adds trailing __ to avoid name mangling (python.org/dev/peps/pep-0008/#id34).
"""
distance_up = len(current_package) - len(py_package)
if py_package:
string_import = py_package[-1]
string_alias = f"_{'_' * distance_up}{string_import}__"
string_from = f"..{'.' * distance_up}"
imports.add(f"from {string_from} import {string_import} as {string_alias}")
return f'"{string_alias}.{py_type}"'
else:
string_alias = f"{'_' * distance_up}{py_type}__"
imports.add(f"from .{'.' * distance_up} import {py_type} as {string_alias}")
return f'"{string_alias}"'
def reference_cousin(
current_package: List[str], imports: Set[str], py_package: List[str], py_type: str
) -> str:
"""
Returns a reference to a python type in a package that is not descendent, ancestor
or sibling, and adds the required import that is aliased to avoid name conflicts.
"""
shared_ancestry = os.path.commonprefix([current_package, py_package]) # type: ignore
distance_up = len(current_package) - len(shared_ancestry)
string_from = f".{'.' * distance_up}" + ".".join(
py_package[len(shared_ancestry) : -1]
)
string_import = py_package[-1]
# Add trailing __ to avoid name mangling (python.org/dev/peps/pep-0008/#id34)
string_alias = (
f"{'_' * distance_up}"
+ safe_snake_case(".".join(py_package[len(shared_ancestry) :]))
+ "__"
)
imports.add(f"from {string_from} import {string_import} as {string_alias}")
return f'"{string_alias}.{py_type}"'
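
A hedged sketch of the reference behaviour above (package names hypothetical):

imports = set()
# A sibling type in the same package needs no import:
assert get_type_reference("hello", imports, ".hello.Greeting") == '"Greeting"'
# Well-known wrapper types unwrap to optional scalars:
assert get_type_reference("hello", imports, ".google.protobuf.BoolValue") == "Optional[bool]"
assert not imports  # neither reference required an import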


@@ -0,0 +1,13 @@
from betterproto import casing
def pythonize_class_name(name: str) -> str:
return casing.pascal_case(name)
def pythonize_field_name(name: str) -> str:
return casing.safe_snake_case(name)
def pythonize_method_name(name: str) -> str:
return casing.safe_snake_case(name)


@@ -0,0 +1,171 @@
import asyncio
from abc import ABC
from typing import (
TYPE_CHECKING,
AsyncIterable,
AsyncIterator,
Collection,
Iterable,
Mapping,
Optional,
Tuple,
Type,
Union,
)
import grpclib.const
from .._types import ST, T
if TYPE_CHECKING:
from grpclib.client import Channel
from grpclib.metadata import Deadline
_Value = Union[str, bytes]
_MetadataLike = Union[Mapping[str, _Value], Collection[Tuple[str, _Value]]]
_MessageLike = Union[T, ST]
_MessageSource = Union[Iterable[ST], AsyncIterable[ST]]
class ServiceStub(ABC):
"""
Base class for async gRPC clients.
"""
def __init__(
self,
channel: "Channel",
*,
timeout: Optional[float] = None,
deadline: Optional["Deadline"] = None,
metadata: Optional[_MetadataLike] = None,
) -> None:
self.channel = channel
self.timeout = timeout
self.deadline = deadline
self.metadata = metadata
def __resolve_request_kwargs(
self,
timeout: Optional[float],
deadline: Optional["Deadline"],
metadata: Optional[_MetadataLike],
):
return {
"timeout": self.timeout if timeout is None else timeout,
"deadline": self.deadline if deadline is None else deadline,
"metadata": self.metadata if metadata is None else metadata,
}
async def _unary_unary(
self,
route: str,
request: _MessageLike,
response_type: Type[T],
*,
timeout: Optional[float] = None,
deadline: Optional["Deadline"] = None,
metadata: Optional[_MetadataLike] = None,
) -> T:
"""Make a unary request and return the response."""
async with self.channel.request(
route,
grpclib.const.Cardinality.UNARY_UNARY,
type(request),
response_type,
**self.__resolve_request_kwargs(timeout, deadline, metadata),
) as stream:
await stream.send_message(request, end=True)
response = await stream.recv_message()
assert response is not None
return response
async def _unary_stream(
self,
route: str,
request: _MessageLike,
response_type: Type[T],
*,
timeout: Optional[float] = None,
deadline: Optional["Deadline"] = None,
metadata: Optional[_MetadataLike] = None,
) -> AsyncIterator[T]:
"""Make a unary request and return the stream response iterator."""
async with self.channel.request(
route,
grpclib.const.Cardinality.UNARY_STREAM,
type(request),
response_type,
**self.__resolve_request_kwargs(timeout, deadline, metadata),
) as stream:
await stream.send_message(request, end=True)
async for message in stream:
yield message
async def _stream_unary(
self,
route: str,
request_iterator: _MessageSource,
request_type: Type[ST],
response_type: Type[T],
*,
timeout: Optional[float] = None,
deadline: Optional["Deadline"] = None,
metadata: Optional[_MetadataLike] = None,
) -> T:
"""Make a stream request and return the response."""
async with self.channel.request(
route,
grpclib.const.Cardinality.STREAM_UNARY,
request_type,
response_type,
**self.__resolve_request_kwargs(timeout, deadline, metadata),
) as stream:
await self._send_messages(stream, request_iterator)
response = await stream.recv_message()
assert response is not None
return response
async def _stream_stream(
self,
route: str,
request_iterator: _MessageSource,
request_type: Type[ST],
response_type: Type[T],
*,
timeout: Optional[float] = None,
deadline: Optional["Deadline"] = None,
metadata: Optional[_MetadataLike] = None,
) -> AsyncIterator[T]:
"""
Make a stream request and return an AsyncIterator to iterate over response
messages.
"""
async with self.channel.request(
route,
grpclib.const.Cardinality.STREAM_STREAM,
request_type,
response_type,
**self.__resolve_request_kwargs(timeout, deadline, metadata),
) as stream:
await stream.send_request()
sending_task = asyncio.ensure_future(
self._send_messages(stream, request_iterator)
)
try:
async for response in stream:
yield response
except:
sending_task.cancel()
raise
@staticmethod
async def _send_messages(stream, messages: _MessageSource):
if isinstance(messages, AsyncIterable):
async for message in messages:
await stream.send_message(message)
else:
for message in messages:
await stream.send_message(message)
await stream.end()
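
A usage sketch: defaults passed to the constructor apply to every request made through
the stub, per `__resolve_request_kwargs` above (stub name hypothetical; see the
quick-start docs for full client usage):

from grpclib.client import Channel

channel = Channel(host="127.0.0.1", port=50051)
stub = EchoStub(channel, timeout=5.0, metadata={"authorization": "Bearer ..."})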


@@ -0,0 +1,30 @@
from abc import ABC
from collections.abc import AsyncIterable
from typing import Any, Callable, Dict
import grpclib
import grpclib.server
class ServiceBase(ABC):
"""
Base class for async gRPC servers.
"""
async def _call_rpc_handler_server_stream(
self,
handler: Callable,
stream: grpclib.server.Stream,
request: Any,
) -> None:
response_iter = handler(request)
# check if response is actually an AsyncIterator
# this might be false if the method just returns without
# yielding at least once
# in that case, we just interpret it as an empty iterator
if isinstance(response_iter, AsyncIterable):
async for response_message in response_iter:
await stream.send_message(response_message)
else:
response_iter.close()
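
To illustrate the comment above: a handler declared without any `yield` returns a plain
coroutine rather than an async iterator, so it is treated as an empty stream (a sketch
with a hypothetical service):

class MyService(ServiceBase):
    # No `yield` anywhere, so handler(request) is a coroutine object, not an
    # AsyncIterable; _call_rpc_handler_server_stream simply closes it.
    async def list_items(self, request) -> None:
        return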


@@ -0,0 +1,185 @@
import asyncio
from typing import AsyncIterable, AsyncIterator, Iterable, Optional, TypeVar, Union
T = TypeVar("T")
class ChannelClosed(Exception):
"""
An exception raised on an attempt to send through a closed channel
"""
class ChannelDone(Exception):
"""
An exception raised on an attempt to receive from a channel that is both closed
and empty.
"""
class AsyncChannel(AsyncIterable[T]):
"""
A buffered async channel for sending items between coroutines with FIFO ordering.
This makes decoupled bidirectional streaming gRPC requests easy if used like:
.. code-block:: python
client = GeneratedStub(grpclib_chan)
request_channel = AsyncChannel()
# We can start by sending all the requests we already have
await request_channel.send_from([RequestObject(...), RequestObject(...)])
async for response in client.rpc_call(request_channel):
# The response iterator will remain active until the connection is closed
...
# More items can be sent at any time
await request_channel.send(RequestObject(...))
...
# The channel must be closed to complete the gRPC connection
request_channel.close()
Items can be sent through the channel by either:
- providing an iterable to the send_from method
- passing them to the send method one at a time
Items can be received from the channel by either:
- iterating over the channel with a for loop to get all items
- calling the receive method to get one item at a time
If the channel is empty then receivers will wait until either an item appears or the
channel is closed.
Once the channel is closed, any subsequent attempt to send through it will
fail with a ChannelClosed exception.
When the channel is closed and empty it is done, and further attempts to receive
from it will fail with a ChannelDone exception.
If multiple coroutines receive from the channel concurrently, each item sent will be
received by only one of the receivers.
:param source:
An optional iterable with items that should be sent through the channel
immediately.
:param buffer_limit:
Limit the number of items that can be buffered in the channel. A value less than
1 implies no limit. If the channel is full then attempts to send more items will
result in the sender waiting until an item is received from the channel.
:param close:
If set to True then the channel will automatically close after exhausting source
or immediately if no source is provided.
"""
def __init__(self, *, buffer_limit: int = 0, close: bool = False):
self._queue: asyncio.Queue[T] = asyncio.Queue(buffer_limit)
self._closed = False
self._waiting_receivers: int = 0
# Track whether flush has been invoked so it can only happen once
self._flushed = False
def __aiter__(self) -> AsyncIterator[T]:
return self
async def __anext__(self) -> T:
if self.done():
raise StopAsyncIteration
self._waiting_receivers += 1
try:
result = await self._queue.get()
if result is self.__flush:
raise StopAsyncIteration
return result
finally:
self._waiting_receivers -= 1
self._queue.task_done()
def closed(self) -> bool:
"""
Returns True if this channel is closed and no longer accepting new items
"""
return self._closed
def done(self) -> bool:
"""
Check if this channel is done.
:return: True if this channel is closed and has been drained of items, in
which case any further attempts to receive an item from this channel will raise
a ChannelDone exception.
"""
# After close the channel is not yet done until there is at least one waiting
# receiver per enqueued item.
return self._closed and self._queue.qsize() <= self._waiting_receivers
async def send_from(
self, source: Union[Iterable[T], AsyncIterable[T]], close: bool = False
) -> "AsyncChannel[T]":
"""
Iterates the given [Async]Iterable and sends all the resulting items.
If close is set to True then subsequent send calls will be rejected with a
ChannelClosed exception.
:param source: an iterable of items to send
:param close:
if True then the channel will be closed after the source has been exhausted
"""
if self._closed:
raise ChannelClosed("Cannot send through a closed channel")
if isinstance(source, AsyncIterable):
async for item in source:
await self._queue.put(item)
else:
for item in source:
await self._queue.put(item)
if close:
# Complete the closing process
self.close()
return self
async def send(self, item: T) -> "AsyncChannel[T]":
"""
Send a single item over this channel.
:param item: The item to send
"""
if self._closed:
raise ChannelClosed("Cannot send through a closed channel")
await self._queue.put(item)
return self
async def receive(self) -> Optional[T]:
"""
Returns the next item from this channel when it becomes available,
or None if the channel is closed before another item is sent.
:return: An item from the channel
"""
if self.done():
raise ChannelDone("Cannot receive from a closed channel")
self._waiting_receivers += 1
try:
result = await self._queue.get()
if result is self.__flush:
return None
return result
finally:
self._waiting_receivers -= 1
self._queue.task_done()
def close(self):
"""
Close this channel to new items
"""
self._closed = True
asyncio.ensure_future(self._flush_queue())
async def _flush_queue(self):
"""
To be called after the channel is closed. Pushes a number of self.__flush
objects to the queue to ensure no waiting consumers get deadlocked.
"""
if not self._flushed:
self._flushed = True
deadlocked_receivers = max(0, self._waiting_receivers - self._queue.qsize())
for _ in range(deadlocked_receivers):
await self._queue.put(self.__flush)
# A special signal object for flushing the queue when the channel is closed
__flush = object()
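
A minimal, self-contained sketch of the send/receive flow described above:

import asyncio

async def main():
    channel = AsyncChannel()

    async def producer():
        await channel.send_from(["a", "b"])
        await channel.send("c")
        channel.close()  # required, or the consumer below would wait forever

    asyncio.ensure_future(producer())
    async for item in channel:
        print(item)  # prints a, b, c

asyncio.run(main())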


File diff suppressed because it is too large


@@ -0,0 +1,128 @@
# Generated by the protocol buffer compiler. DO NOT EDIT!
# sources: google/protobuf/compiler/plugin.proto
# plugin: python-betterproto
from dataclasses import dataclass
from typing import List
import betterproto
from betterproto.grpc.grpclib_server import ServiceBase
class CodeGeneratorResponseFeature(betterproto.Enum):
FEATURE_NONE = 0
FEATURE_PROTO3_OPTIONAL = 1
@dataclass(eq=False, repr=False)
class Version(betterproto.Message):
"""The version number of protocol compiler."""
major: int = betterproto.int32_field(1)
minor: int = betterproto.int32_field(2)
patch: int = betterproto.int32_field(3)
# A suffix for alpha, beta or rc release, e.g., "alpha-1", "rc2". It should
# be empty for mainline stable releases.
suffix: str = betterproto.string_field(4)
@dataclass(eq=False, repr=False)
class CodeGeneratorRequest(betterproto.Message):
"""An encoded CodeGeneratorRequest is written to the plugin's stdin."""
# The .proto files that were explicitly listed on the command-line. The code
# generator should generate code only for these files. Each file's
# descriptor will be included in proto_file, below.
file_to_generate: List[str] = betterproto.string_field(1)
# The generator parameter passed on the command-line.
parameter: str = betterproto.string_field(2)
# FileDescriptorProtos for all files in files_to_generate and everything they
# import. The files will appear in topological order, so each file appears
# before any file that imports it. protoc guarantees that all proto_files
# will be written after the fields above, even though this is not technically
# guaranteed by the protobuf wire format. This theoretically could allow a
# plugin to stream in the FileDescriptorProtos and handle them one by one
# rather than read the entire set into memory at once. However, as of this
# writing, this is not similarly optimized on protoc's end -- it will store
# all fields in memory at once before sending them to the plugin. Type names
# of fields and extensions in the FileDescriptorProto are always fully
# qualified.
proto_file: List[
"betterproto_lib_google_protobuf.FileDescriptorProto"
] = betterproto.message_field(15)
# The version number of protocol compiler.
compiler_version: "Version" = betterproto.message_field(3)
@dataclass(eq=False, repr=False)
class CodeGeneratorResponse(betterproto.Message):
"""The plugin writes an encoded CodeGeneratorResponse to stdout."""
# Error message. If non-empty, code generation failed. The plugin process
# should exit with status code zero even if it reports an error in this way.
# This should be used to indicate errors in .proto files which prevent the
# code generator from generating correct code. Errors which indicate a
# problem in protoc itself -- such as the input CodeGeneratorRequest being
# unparseable -- should be reported by writing a message to stderr and
# exiting with a non-zero status code.
error: str = betterproto.string_field(1)
# A bitmask of supported features that the code generator supports. This is a
# bitwise "or" of values from the Feature enum.
supported_features: int = betterproto.uint64_field(2)
file: List["CodeGeneratorResponseFile"] = betterproto.message_field(15)
@dataclass(eq=False, repr=False)
class CodeGeneratorResponseFile(betterproto.Message):
"""Represents a single generated file."""
# The file name, relative to the output directory. The name must not contain
# "." or ".." components and must be relative, not be absolute (so, the file
# cannot lie outside the output directory). "/" must be used as the path
# separator, not "\". If the name is omitted, the content will be appended to
# the previous file. This allows the generator to break large files into
# small chunks, and allows the generated text to be streamed back to protoc
# so that large files need not reside completely in memory at one time. Note
# that as of this writing protoc does not optimize for this -- it will read
# the entire CodeGeneratorResponse before writing files to disk.
name: str = betterproto.string_field(1)
# If non-empty, indicates that the named file should already exist, and the
# content here is to be inserted into that file at a defined insertion point.
# This feature allows a code generator to extend the output produced by
# another code generator. The original generator may provide insertion
# points by placing special annotations in the file that look like:
# @@protoc_insertion_point(NAME) The annotation can have arbitrary text
# before and after it on the line, which allows it to be placed in a comment.
# NAME should be replaced with an identifier naming the point -- this is what
# other generators will use as the insertion_point. Code inserted at this
# point will be placed immediately above the line containing the insertion
# point (thus multiple insertions to the same point will come out in the
# order they were added). The double-@ is intended to make it unlikely that
# the generated code could contain things that look like insertion points by
# accident. For example, the C++ code generator places the following line in
# the .pb.h files that it generates: //
# @@protoc_insertion_point(namespace_scope) This line appears within the
# scope of the file's package namespace, but outside of any particular class.
# Another plugin can then specify the insertion_point "namespace_scope" to
# generate additional classes or other declarations that should be placed in
# this scope. Note that if the line containing the insertion point begins
# with whitespace, the same whitespace will be added to every line of the
# inserted text. This is useful for languages like Python, where indentation
# matters. In these languages, the insertion point comment should be
# indented the same amount as any inserted code will need to be in order to
# work correctly in that context. The code generator that generates the
# initial file and the one which inserts into it must both run as part of a
# single invocation of protoc. Code generators are executed in the order in
# which they appear on the command line. If |insertion_point| is present,
# |name| must also be present.
insertion_point: str = betterproto.string_field(2)
# The file contents.
content: str = betterproto.string_field(15)
# Information describing the file content being inserted. If an insertion
# point is used, this information will be appropriately offset and inserted
# into the code generation metadata for the generated files.
generated_code_info: "betterproto_lib_google_protobuf.GeneratedCodeInfo" = (
betterproto.message_field(16)
)
import betterproto.lib.google.protobuf as betterproto_lib_google_protobuf


@@ -0,0 +1 @@
from .main import main


@@ -0,0 +1,4 @@
from .main import main
main()


@@ -0,0 +1,37 @@
import os.path
try:
# betterproto[compiler] specific dependencies
import black
import jinja2
except ImportError as err:
print(
"\033[31m"
f"Unable to import `{err.name}` from betterproto plugin! "
"Please ensure that you've installed betterproto as "
'`pip install "betterproto[compiler]"` so that compiler dependencies '
"are included."
"\033[0m"
)
raise SystemExit(1)
from .models import OutputTemplate
def outputfile_compiler(output_file: OutputTemplate) -> str:
templates_folder = os.path.abspath(
os.path.join(os.path.dirname(__file__), "..", "templates")
)
env = jinja2.Environment(
trim_blocks=True,
lstrip_blocks=True,
loader=jinja2.FileSystemLoader(templates_folder),
)
template = env.get_template("template.py.j2")
return black.format_str(
template.render(output_file=output_file),
mode=black.Mode(),
)

53 src/betterproto/plugin/main.py Executable file

@@ -0,0 +1,53 @@
#!/usr/bin/env python
import os
import sys
from betterproto.lib.google.protobuf.compiler import (
CodeGeneratorRequest,
CodeGeneratorResponse,
)
from betterproto.plugin.parser import generate_code
from betterproto.plugin.models import monkey_patch_oneof_index
def main() -> None:
"""The plugin's main entry point."""
# Read request message from stdin
data = sys.stdin.buffer.read()
# Apply workaround for the proto2/proto3 difference in protoc messages
monkey_patch_oneof_index()
# Parse request
request = CodeGeneratorRequest()
request.parse(data)
dump_file = os.getenv("BETTERPROTO_DUMP")
if dump_file:
dump_request(dump_file, request)
# Generate code
response = generate_code(request)
# Serialise response message
output = response.SerializeToString()
# Write to stdout
sys.stdout.buffer.write(output)
def dump_request(dump_file: str, request: CodeGeneratorRequest) -> None:
"""
For developers: supports running plugin.py standalone so it's possible to debug it.
Run protoc (or generate.py) with BETTERPROTO_DUMP="yourfile.bin" to write the request to a file.
Then run plugin.py from your IDE in debugging mode, and redirect stdin to the file.
"""
with open(str(dump_file), "wb") as fh:
sys.stderr.write(f"\033[31mWriting input from protoc to: {dump_file}\033[0m\n")
fh.write(request.SerializeToString())
if __name__ == "__main__":
main()
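
For the debugging workflow described in `dump_request`, a hedged sketch of replaying a
previously dumped request without protoc:

# Hypothetical debugging session, assuming BETTERPROTO_DUMP wrote yourfile.bin
with open("yourfile.bin", "rb") as fh:
    request = CodeGeneratorRequest().parse(fh.read())
response = generate_code(request)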


@@ -0,0 +1,775 @@
"""Plugin model dataclasses.
These classes are meant to be an intermediate representation
of protobuf objects. They are used to organize the data collected during parsing.
The general intention is to create a doubly-linked tree-like structure
with the following types of references:
- Downwards references: from message -> fields, from output package -> messages
or from service -> service methods
- Upwards references: from field -> message, message -> package.
- Input/output message references: from a service method to its corresponding
input/output messages, which may even be in another package.
There are convenience methods to allow climbing up and down this tree, for
example to retrieve the list of all messages that are in the same package as
the current message.
Most of these classes take as inputs:
- proto_obj: A reference to its corresponding protobuf object as
presented by the protoc plugin.
- parent: a reference to the parent object in the tree.
With this information, the class is able to expose attributes,
such as a pythonized name, that will be calculated from proto_obj.
The instantiation should also attach a reference to the new object
into the corresponding place within its parent object. For example,
instantiating field `A` with parent message `B` should add a
reference to `A` to `B`'s `fields` attribute.
"""
import builtins
import re
import textwrap
from dataclasses import dataclass, field
from typing import Dict, Iterable, Iterator, List, Optional, Set, Type, Union
import betterproto
from betterproto import which_one_of
from betterproto.casing import sanitize_name
from betterproto.compile.importing import get_type_reference, parse_source_type_name
from betterproto.compile.naming import (
pythonize_class_name,
pythonize_field_name,
pythonize_method_name,
)
from betterproto.lib.google.protobuf import (
DescriptorProto,
EnumDescriptorProto,
Field,
FieldDescriptorProto,
FieldDescriptorProtoLabel,
FieldDescriptorProtoType,
FileDescriptorProto,
MethodDescriptorProto,
)
from betterproto.lib.google.protobuf.compiler import CodeGeneratorRequest
# Create a unique placeholder to deal with
# https://stackoverflow.com/questions/51575931/class-inheritance-in-python-3-7-dataclasses
PLACEHOLDER = object()
# Organize proto types into categories
PROTO_FLOAT_TYPES = (
FieldDescriptorProtoType.TYPE_DOUBLE, # 1
FieldDescriptorProtoType.TYPE_FLOAT, # 2
)
PROTO_INT_TYPES = (
FieldDescriptorProtoType.TYPE_INT64, # 3
FieldDescriptorProtoType.TYPE_UINT64, # 4
FieldDescriptorProtoType.TYPE_INT32, # 5
FieldDescriptorProtoType.TYPE_FIXED64, # 6
FieldDescriptorProtoType.TYPE_FIXED32, # 7
FieldDescriptorProtoType.TYPE_UINT32, # 13
FieldDescriptorProtoType.TYPE_SFIXED32, # 15
FieldDescriptorProtoType.TYPE_SFIXED64, # 16
FieldDescriptorProtoType.TYPE_SINT32, # 17
FieldDescriptorProtoType.TYPE_SINT64, # 18
)
PROTO_BOOL_TYPES = (FieldDescriptorProtoType.TYPE_BOOL,) # 8
PROTO_STR_TYPES = (FieldDescriptorProtoType.TYPE_STRING,) # 9
PROTO_BYTES_TYPES = (FieldDescriptorProtoType.TYPE_BYTES,) # 12
PROTO_MESSAGE_TYPES = (
FieldDescriptorProtoType.TYPE_MESSAGE, # 11
FieldDescriptorProtoType.TYPE_ENUM, # 14
)
PROTO_MAP_TYPES = (FieldDescriptorProtoType.TYPE_MESSAGE,) # 11
PROTO_PACKED_TYPES = (
FieldDescriptorProtoType.TYPE_DOUBLE, # 1
FieldDescriptorProtoType.TYPE_FLOAT, # 2
FieldDescriptorProtoType.TYPE_INT64, # 3
FieldDescriptorProtoType.TYPE_UINT64, # 4
FieldDescriptorProtoType.TYPE_INT32, # 5
FieldDescriptorProtoType.TYPE_FIXED64, # 6
FieldDescriptorProtoType.TYPE_FIXED32, # 7
FieldDescriptorProtoType.TYPE_BOOL, # 8
FieldDescriptorProtoType.TYPE_UINT32, # 13
FieldDescriptorProtoType.TYPE_SFIXED32, # 15
FieldDescriptorProtoType.TYPE_SFIXED64, # 16
FieldDescriptorProtoType.TYPE_SINT32, # 17
FieldDescriptorProtoType.TYPE_SINT64, # 18
)
def monkey_patch_oneof_index():
"""
The compiler message types are written for proto2, but we read them as proto3.
For this to work in the case of the oneof_index fields, which depend on being able
to tell whether they were set, we have to treat them as oneof fields. This method
monkey patches the generated classes after the fact to force this behaviour.
"""
object.__setattr__(
FieldDescriptorProto.__dataclass_fields__["oneof_index"].metadata[
"betterproto"
],
"group",
"oneof_index",
)
object.__setattr__(
Field.__dataclass_fields__["oneof_index"].metadata["betterproto"],
"group",
"oneof_index",
)
def get_comment(
proto_file: "FileDescriptorProto", path: List[int], indent: int = 4
) -> str:
pad = " " * indent
for sci_loc in proto_file.source_code_info.location:
if list(sci_loc.path) == path and sci_loc.leading_comments:
lines = textwrap.wrap(
sci_loc.leading_comments.strip().replace("\n", ""), width=79 - indent
)
# This is a field, message, enum, service, or method
if len(lines) == 1 and len(lines[0]) < 79 - indent - 6:
lines[0] = lines[0].strip('"')
return f'{pad}"""{lines[0]}"""'
else:
joined = f"\n{pad}".join(lines)
return f'{pad}"""\n{pad}{joined}\n{pad}"""'
return ""
class ProtoContentBase:
"""Methods common to MessageCompiler, ServiceCompiler and ServiceMethodCompiler."""
source_file: FileDescriptorProto
path: List[int]
comment_indent: int = 4
parent: Union["betterproto.Message", "OutputTemplate"]
__dataclass_fields__: Dict[str, object]
def __post_init__(self) -> None:
"""Checks that no fake default fields were left as placeholders."""
for field_name, field_val in self.__dataclass_fields__.items():
if field_val is PLACEHOLDER:
raise ValueError(f"`{field_name}` is a required field.")
@property
def output_file(self) -> "OutputTemplate":
current = self
while not isinstance(current, OutputTemplate):
current = current.parent
return current
@property
def request(self) -> "PluginRequestCompiler":
current = self
while not isinstance(current, OutputTemplate):
current = current.parent
return current.parent_request
@property
def comment(self) -> str:
"""Crawl the proto source code and retrieve comments
for this object.
"""
return get_comment(
proto_file=self.source_file, path=self.path, indent=self.comment_indent
)
@dataclass
class PluginRequestCompiler:
plugin_request_obj: CodeGeneratorRequest
output_packages: Dict[str, "OutputTemplate"] = field(default_factory=dict)
@property
def all_messages(self) -> List["MessageCompiler"]:
"""All of the messages in this request.
Returns
-------
List[MessageCompiler]
List of all of the messages in this request.
"""
return [
msg for output in self.output_packages.values() for msg in output.messages
]
@dataclass
class OutputTemplate:
"""Representation of an output .py file.
Each output file corresponds to a .proto input file,
but may need references to other .proto files to be
built.
"""
parent_request: PluginRequestCompiler
package_proto_obj: FileDescriptorProto
input_files: List["FileDescriptorProto"] = field(default_factory=list)
imports: Set[str] = field(default_factory=set)
datetime_imports: Set[str] = field(default_factory=set)
typing_imports: Set[str] = field(default_factory=set)
builtins_import: bool = False
messages: List["MessageCompiler"] = field(default_factory=list)
enums: List["EnumDefinitionCompiler"] = field(default_factory=list)
services: List["ServiceCompiler"] = field(default_factory=list)
@property
def package(self) -> str:
"""Name of input package.
Returns
-------
str
Name of input package.
"""
return self.package_proto_obj.package
@property
def input_filenames(self) -> Iterable[str]:
"""Names of the input files used to build this output.
Returns
-------
Iterable[str]
Names of the input files used to build this output.
"""
return sorted(f.name for f in self.input_files)
@property
def python_module_imports(self) -> Set[str]:
imports = set()
if any(x for x in self.messages if any(x.deprecated_fields)):
imports.add("warnings")
if self.builtins_import:
imports.add("builtins")
return imports
@dataclass
class MessageCompiler(ProtoContentBase):
"""Representation of a protobuf message."""
source_file: FileDescriptorProto
parent: Union["MessageCompiler", OutputTemplate] = PLACEHOLDER
proto_obj: DescriptorProto = PLACEHOLDER
path: List[int] = PLACEHOLDER
fields: List[Union["FieldCompiler", "MessageCompiler"]] = field(
default_factory=list
)
deprecated: bool = field(default=False, init=False)
builtins_types: Set[str] = field(default_factory=set)
def __post_init__(self) -> None:
# Add message to output file
if isinstance(self.parent, OutputTemplate):
if isinstance(self, EnumDefinitionCompiler):
self.output_file.enums.append(self)
else:
self.output_file.messages.append(self)
self.deprecated = self.proto_obj.options.deprecated
super().__post_init__()
@property
def proto_name(self) -> str:
return self.proto_obj.name
@property
def py_name(self) -> str:
return pythonize_class_name(self.proto_name)
@property
def annotation(self) -> str:
if self.repeated:
return f"List[{self.py_name}]"
return self.py_name
@property
def deprecated_fields(self) -> Iterator[str]:
for f in self.fields:
if f.deprecated:
yield f.py_name
@property
def has_deprecated_fields(self) -> bool:
return any(self.deprecated_fields)
def is_map(
proto_field_obj: FieldDescriptorProto, parent_message: DescriptorProto
) -> bool:
"""True if proto_field_obj is a map, otherwise False."""
if proto_field_obj.type == FieldDescriptorProtoType.TYPE_MESSAGE:
# This might be a map...
message_type = proto_field_obj.type_name.split(".").pop().lower()
map_entry = f"{proto_field_obj.name.replace('_', '').lower()}entry"
if message_type == map_entry:
for nested in parent_message.nested_type: # parent message
if (
nested.name.replace("_", "").lower() == map_entry
and nested.options.map_entry
):
return True
return False
def is_oneof(proto_field_obj: FieldDescriptorProto) -> bool:
"""
True if proto_field_obj is a OneOf, otherwise False.
.. warning::
Because the message from protoc is defined in proto2, and betterproto works with
proto3, and interpreting the FieldDescriptorProto.oneof_index field requires
distinguishing between default and unset values (which proto3 doesn't support),
we have to hack the generated FieldDescriptorProto class for this to work.
The hack consists of setting group="oneof_index" in the field metadata,
essentially making oneof_index the sole member of a one_of group, which allows
us to tell whether it was set, via the which_one_of interface.
"""
return which_one_of(proto_field_obj, "oneof_index")[0] == "oneof_index"
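# Illustrative: for a field declared inside a oneof, protoc sets oneof_index, so
# which_one_of(field, "oneof_index") returns ("oneof_index", <index>); for a plain
# field the group is unset and the comparison above is False.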
@dataclass
class FieldCompiler(MessageCompiler):
parent: MessageCompiler = PLACEHOLDER
proto_obj: FieldDescriptorProto = PLACEHOLDER
def __post_init__(self) -> None:
# Add field to message
self.parent.fields.append(self)
# Check for new imports
self.add_imports_to(self.output_file)
super().__post_init__() # call FieldCompiler-> MessageCompiler __post_init__
def get_field_string(self, indent: int = 4) -> str:
"""Construct string representation of this field as a field."""
name = f"{self.py_name}"
annotations = f": {self.annotation}"
field_args = ", ".join(
([""] + self.betterproto_field_args) if self.betterproto_field_args else []
)
betterproto_field_type = (
f"betterproto.{self.field_type}_field({self.proto_obj.number}{field_args})"
)
if self.py_name in dir(builtins):
self.parent.builtins_types.add(self.py_name)
return f"{name}{annotations} = {betterproto_field_type}"
@property
def betterproto_field_args(self) -> List[str]:
args = []
if self.field_wraps:
args.append(f"wraps={self.field_wraps}")
if self.optional:
args.append(f"optional=True")
return args
@property
def datetime_imports(self) -> Set[str]:
imports = set()
annotation = self.annotation
# FIXME: false positives - e.g. `MyDatetimedelta`
if "timedelta" in annotation:
imports.add("timedelta")
if "datetime" in annotation:
imports.add("datetime")
return imports
@property
def typing_imports(self) -> Set[str]:
imports = set()
annotation = self.annotation
if "Optional[" in annotation:
imports.add("Optional")
if "List[" in annotation:
imports.add("List")
if "Dict[" in annotation:
imports.add("Dict")
return imports
@property
def use_builtins(self) -> bool:
return self.py_type in self.parent.builtins_types or (
self.py_type == self.py_name and self.py_name in dir(builtins)
)
def add_imports_to(self, output_file: OutputTemplate) -> None:
output_file.datetime_imports.update(self.datetime_imports)
output_file.typing_imports.update(self.typing_imports)
output_file.builtins_import = output_file.builtins_import or self.use_builtins
@property
def field_wraps(self) -> Optional[str]:
"""Returns betterproto wrapped field type or None."""
match_wrapper = re.match(
r"\.google\.protobuf\.(.+)Value$", self.proto_obj.type_name
)
if match_wrapper:
wrapped_type = "TYPE_" + match_wrapper.group(1).upper()
if hasattr(betterproto, wrapped_type):
return f"betterproto.{wrapped_type}"
return None
@property
def repeated(self) -> bool:
return (
self.proto_obj.label == FieldDescriptorProtoLabel.LABEL_REPEATED
and not is_map(self.proto_obj, self.parent)
)
@property
def optional(self) -> bool:
return self.proto_obj.proto3_optional
@property
def mutable(self) -> bool:
"""True if the field is a mutable type, otherwise False."""
return self.annotation.startswith(("List[", "Dict["))
@property
def field_type(self) -> str:
"""String representation of proto field type."""
return (
FieldDescriptorProtoType(self.proto_obj.type)
.name.lower()
.replace("type_", "")
)
@property
def default_value_string(self) -> str:
"""Python representation of the default proto value."""
if self.repeated:
return "[]"
if self.optional:
return "None"
if self.py_type == "int":
return "0"
if self.py_type == "float":
return "0.0"
elif self.py_type == "bool":
return "False"
elif self.py_type == "str":
return '""'
elif self.py_type == "bytes":
return 'b""'
elif self.field_type == "enum":
enum_proto_obj_name = self.proto_obj.type_name.split(".").pop()
enum = next(
e
for e in self.output_file.enums
if e.proto_obj.name == enum_proto_obj_name
)
return enum.default_value_string
else:
# Message type
return "None"
@property
def packed(self) -> bool:
"""True if the wire representation is a packed format."""
return self.repeated and self.proto_obj.type in PROTO_PACKED_TYPES
@property
def py_name(self) -> str:
"""Pythonized name."""
return pythonize_field_name(self.proto_name)
@property
def proto_name(self) -> str:
"""Original protobuf name."""
return self.proto_obj.name
@property
def py_type(self) -> str:
"""String representation of Python type."""
if self.proto_obj.type in PROTO_FLOAT_TYPES:
return "float"
elif self.proto_obj.type in PROTO_INT_TYPES:
return "int"
elif self.proto_obj.type in PROTO_BOOL_TYPES:
return "bool"
elif self.proto_obj.type in PROTO_STR_TYPES:
return "str"
elif self.proto_obj.type in PROTO_BYTES_TYPES:
return "bytes"
elif self.proto_obj.type in PROTO_MESSAGE_TYPES:
# Type referencing another defined Message or a named enum
return get_type_reference(
package=self.output_file.package,
imports=self.output_file.imports,
source_type=self.proto_obj.type_name,
)
else:
raise NotImplementedError(f"Unknown type {self.proto_obj.type}")
@property
def annotation(self) -> str:
py_type = self.py_type
if self.use_builtins:
py_type = f"builtins.{py_type}"
if self.repeated:
return f"List[{py_type}]"
if self.optional:
return f"Optional[{py_type}]"
return py_type
@dataclass
class OneOfFieldCompiler(FieldCompiler):
@property
def betterproto_field_args(self) -> List[str]:
args = super().betterproto_field_args
group = self.parent.proto_obj.oneof_decl[self.proto_obj.oneof_index].name
args.append(f'group="{group}"')
return args
@dataclass
class MapEntryCompiler(FieldCompiler):
py_k_type: Type = PLACEHOLDER
py_v_type: Type = PLACEHOLDER
proto_k_type: str = PLACEHOLDER
proto_v_type: str = PLACEHOLDER
def __post_init__(self) -> None:
"""Explore nested types and set k_type and v_type if unset."""
map_entry = f"{self.proto_obj.name.replace('_', '').lower()}entry"
for nested in self.parent.proto_obj.nested_type:
if (
nested.name.replace("_", "").lower() == map_entry
and nested.options.map_entry
):
# Get Python types
self.py_k_type = FieldCompiler(
source_file=self.source_file,
parent=self,
proto_obj=nested.field[0], # key
).py_type
self.py_v_type = FieldCompiler(
source_file=self.source_file,
parent=self,
proto_obj=nested.field[1], # value
).py_type
# Get proto types
self.proto_k_type = FieldDescriptorProtoType(nested.field[0].type).name
self.proto_v_type = FieldDescriptorProtoType(nested.field[1].type).name
super().__post_init__() # call FieldCompiler-> MessageCompiler __post_init__
@property
def betterproto_field_args(self) -> List[str]:
return [f"betterproto.{self.proto_k_type}", f"betterproto.{self.proto_v_type}"]
@property
def field_type(self) -> str:
return "map"
@property
def annotation(self) -> str:
return f"Dict[{self.py_k_type}, {self.py_v_type}]"
@property
def repeated(self) -> bool:
return False # maps cannot be repeated
@dataclass
class EnumDefinitionCompiler(MessageCompiler):
"""Representation of a proto Enum definition."""
proto_obj: EnumDescriptorProto = PLACEHOLDER
entries: List["EnumDefinitionCompiler.EnumEntry"] = PLACEHOLDER
@dataclass(unsafe_hash=True)
class EnumEntry:
"""Representation of an Enum entry."""
name: str
value: int
comment: str
def __post_init__(self) -> None:
# Get entries/allowed values for this Enum
self.entries = [
self.EnumEntry(
name=sanitize_name(entry_proto_value.name),
value=entry_proto_value.number,
comment=get_comment(
proto_file=self.source_file, path=self.path + [2, entry_number]
),
)
for entry_number, entry_proto_value in enumerate(self.proto_obj.value)
]
super().__post_init__() # call MessageCompiler __post_init__
@property
def default_value_string(self) -> str:
"""Python representation of the default value for Enums.
As per the spec, this is the first value of the Enum.
"""
return str(self.entries[0].value) # ideally, should ALWAYS be int(0)!
@dataclass
class ServiceCompiler(ProtoContentBase):
parent: OutputTemplate = PLACEHOLDER
proto_obj: ServiceDescriptorProto = PLACEHOLDER
path: List[int] = PLACEHOLDER
methods: List["ServiceMethodCompiler"] = field(default_factory=list)
def __post_init__(self) -> None:
# Add service to output file
self.output_file.services.append(self)
self.output_file.typing_imports.add("Dict")
super().__post_init__() # check for unset fields
@property
def proto_name(self) -> str:
return self.proto_obj.name
@property
def py_name(self) -> str:
return pythonize_class_name(self.proto_name)
@dataclass
class ServiceMethodCompiler(ProtoContentBase):
parent: ServiceCompiler
proto_obj: MethodDescriptorProto
path: List[int] = PLACEHOLDER
comment_indent: int = 8
def __post_init__(self) -> None:
# Add method to service
self.parent.methods.append(self)
# Check for imports
if "Optional" in self.py_output_message_type:
self.output_file.typing_imports.add("Optional")
# Check for Async imports
if self.client_streaming:
self.output_file.typing_imports.add("AsyncIterable")
self.output_file.typing_imports.add("Iterable")
self.output_file.typing_imports.add("Union")
# Required by both client and server
if self.client_streaming or self.server_streaming:
self.output_file.typing_imports.add("AsyncIterator")
super().__post_init__() # check for unset fields
@property
def py_name(self) -> str:
"""Pythonized method name."""
return pythonize_method_name(self.proto_obj.name)
@property
def proto_name(self) -> str:
"""Original protobuf name."""
return self.proto_obj.name
@property
def route(self) -> str:
package_part = (
f"{self.output_file.package}." if self.output_file.package else ""
)
return f"/{package_part}{self.parent.proto_name}/{self.proto_name}"
@property
def py_input_message(self) -> Optional[MessageCompiler]:
"""Find the input message object.
Returns
-------
Optional[MessageCompiler]
Method instance representing the input message.
If no input message could be found or there are no
input messages, None is returned.
"""
package, name = parse_source_type_name(self.proto_obj.input_type)
# Nested types are currently flattened without dots.
# Todo: keep a fully qualified name in types, that is
# comparable with method.input_type
for msg in self.request.all_messages:
if (
msg.py_name == name.replace(".", "")
and msg.output_file.package == package
):
return msg
return None
@property
def py_input_message_type(self) -> str:
"""String representation of the Python type corresponding to the
input message.
Returns
-------
str
String representation of the Python type corresponding to the input message.
"""
return get_type_reference(
package=self.output_file.package,
imports=self.output_file.imports,
source_type=self.proto_obj.input_type,
).strip('"')
@property
def py_input_message_param(self) -> str:
"""Param name corresponding to py_input_message_type.
Returns
-------
str
Param name corresponding to py_input_message_type.
"""
return pythonize_field_name(self.py_input_message_type)
@property
def py_output_message_type(self) -> str:
"""String representation of the Python type corresponding to the
output message.
Returns
-------
str
String representation of the Python type corresponding to the output message.
"""
return get_type_reference(
package=self.output_file.package,
imports=self.output_file.imports,
source_type=self.proto_obj.output_type,
unwrap=False,
).strip('"')
@property
def client_streaming(self) -> bool:
return self.proto_obj.client_streaming
@property
def server_streaming(self) -> bool:
return self.proto_obj.server_streaming


@@ -0,0 +1,193 @@
from betterproto.lib.google.protobuf import (
DescriptorProto,
EnumDescriptorProto,
FieldDescriptorProto,
FileDescriptorProto,
ServiceDescriptorProto,
)
from betterproto.lib.google.protobuf.compiler import (
CodeGeneratorRequest,
CodeGeneratorResponse,
CodeGeneratorResponseFeature,
CodeGeneratorResponseFile,
)
import itertools
import pathlib
import sys
from typing import Iterator, List, Set, Tuple, TYPE_CHECKING, Union
from .compiler import outputfile_compiler
from .models import (
EnumDefinitionCompiler,
FieldCompiler,
MapEntryCompiler,
MessageCompiler,
OneOfFieldCompiler,
OutputTemplate,
PluginRequestCompiler,
ServiceCompiler,
ServiceMethodCompiler,
is_map,
is_oneof,
)
if TYPE_CHECKING:
from google.protobuf.descriptor import Descriptor
def traverse(
proto_file: FileDescriptorProto,
) -> "itertools.chain[Tuple[Union[str, EnumDescriptorProto], List[int]]]":
# Todo: Keep information about nested hierarchy
def _traverse(
path: List[int], items: List["EnumDescriptorProto"], prefix=""
) -> Iterator[Tuple[Union[str, EnumDescriptorProto], List[int]]]:
for i, item in enumerate(items):
# Adjust the name since we flatten the hierarchy.
# Todo: don't change the name, but include full name in returned tuple
item.name = next_prefix = prefix + item.name
yield item, path + [i]
if isinstance(item, DescriptorProto):
for enum in item.enum_type:
enum.name = next_prefix + enum.name
yield enum, path + [i, 4]
if item.nested_type:
for n, p in _traverse(path + [i, 3], item.nested_type, next_prefix):
yield n, p
return itertools.chain(
_traverse([5], proto_file.enum_type), _traverse([4], proto_file.message_type)
)
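# Illustrative: for a file containing `message Foo { message Bar {} }` this yields
# (Foo, [4, 0]) followed by the flattened nested message (renamed "FooBar") at
# path [4, 0, 3, 0].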
def generate_code(request: CodeGeneratorRequest) -> CodeGeneratorResponse:
response = CodeGeneratorResponse()
plugin_options = request.parameter.split(",") if request.parameter else []
response.supported_features = CodeGeneratorResponseFeature.FEATURE_PROTO3_OPTIONAL
request_data = PluginRequestCompiler(plugin_request_obj=request)
# Gather output packages
for proto_file in request.proto_file:
if (
proto_file.package == "google.protobuf"
and "INCLUDE_GOOGLE" not in plugin_options
):
# If not INCLUDE_GOOGLE,
# skip re-compiling Google's well-known types
continue
output_package_name = proto_file.package
if output_package_name not in request_data.output_packages:
# Create a new output if there is no output for this package
request_data.output_packages[output_package_name] = OutputTemplate(
parent_request=request_data, package_proto_obj=proto_file
)
# Add this input file to the output corresponding to this package
request_data.output_packages[output_package_name].input_files.append(proto_file)
# Read Messages and Enums
# We need to read Messages before Services so that we can
# get the references to input/output messages for each service
for output_package_name, output_package in request_data.output_packages.items():
for proto_input_file in output_package.input_files:
for item, path in traverse(proto_input_file):
read_protobuf_type(
source_file=proto_input_file,
item=item,
path=path,
output_package=output_package,
)
# Read Services
for output_package_name, output_package in request_data.output_packages.items():
for proto_input_file in output_package.input_files:
for index, service in enumerate(proto_input_file.service):
read_protobuf_service(service, index, output_package)
# Generate output files
output_paths: Set[pathlib.Path] = set()
for output_package_name, output_package in request_data.output_packages.items():
# Add files to the response object
output_path = pathlib.Path(*output_package_name.split("."), "__init__.py")
output_paths.add(output_path)
response.file.append(
CodeGeneratorResponseFile(
name=str(output_path),
# Render and then format the output file
content=outputfile_compiler(output_file=output_package),
)
)
# Make each output directory a package with __init__ file
init_files = {
directory.joinpath("__init__.py")
for path in output_paths
for directory in path.parents
} - output_paths
for init_file in init_files:
response.file.append(CodeGeneratorResponseFile(name=str(init_file)))
for output_file_path in sorted(output_paths.union(init_files)):
print(f"Writing {output_file_path}", file=sys.stderr)
return response
def read_protobuf_type(
item: DescriptorProto,
path: List[int],
source_file: "FileDescriptorProto",
output_package: OutputTemplate,
) -> None:
if isinstance(item, DescriptorProto):
if item.options.map_entry:
# Skip generated map entry messages since we just use dicts
return
# Process Message
message_data = MessageCompiler(
source_file=source_file, parent=output_package, proto_obj=item, path=path
)
for index, field in enumerate(item.field):
if is_map(field, item):
MapEntryCompiler(
source_file=source_file,
parent=message_data,
proto_obj=field,
path=path + [2, index],
)
elif is_oneof(field):
OneOfFieldCompiler(
source_file=source_file,
parent=message_data,
proto_obj=field,
path=path + [2, index],
)
else:
FieldCompiler(
source_file=source_file,
parent=message_data,
proto_obj=field,
path=path + [2, index],
)
elif isinstance(item, EnumDescriptorProto):
# Enum
EnumDefinitionCompiler(
source_file=source_file, parent=output_package, proto_obj=item, path=path
)
def read_protobuf_service(
service: ServiceDescriptorProto, index: int, output_package: OutputTemplate
) -> None:
service_data = ServiceCompiler(
parent=output_package, proto_obj=service, path=[6, index]
)
for j, method in enumerate(service.method):
ServiceMethodCompiler(
parent=service_data, proto_obj=method, path=[6, index, 2, j]
)


@@ -0,0 +1,2 @@
@SET plugin_dir=%~dp0
@python -m %plugin_dir% %*

src/betterproto/py.typed Normal file

@@ -0,0 +1,200 @@
# Generated by the protocol buffer compiler. DO NOT EDIT!
# sources: {{ ', '.join(output_file.input_filenames) }}
# plugin: python-betterproto
{% for i in output_file.python_module_imports|sort %}
import {{ i }}
{% endfor %}
from dataclasses import dataclass
{% if output_file.datetime_imports %}
from datetime import {% for i in output_file.datetime_imports|sort %}{{ i }}{% if not loop.last %}, {% endif %}{% endfor %}
{% endif %}
{% if output_file.typing_imports %}
from typing import {% for i in output_file.typing_imports|sort %}{{ i }}{% if not loop.last %}, {% endif %}{% endfor %}
{% endif %}
import betterproto
from betterproto.grpc.grpclib_server import ServiceBase
{% if output_file.services %}
import grpclib
{% endif %}
{% if output_file.enums %}{% for enum in output_file.enums %}
class {{ enum.py_name }}(betterproto.Enum):
{% if enum.comment %}
{{ enum.comment }}
{% endif %}
{% for entry in enum.entries %}
{{ entry.name }} = {{ entry.value }}
{% if entry.comment %}
{{ entry.comment }}
{% endif %}
{% endfor %}
{% endfor %}
{% endif %}
{% for message in output_file.messages %}
@dataclass(eq=False, repr=False)
class {{ message.py_name }}(betterproto.Message):
{% if message.comment %}
{{ message.comment }}
{% endif %}
{% for field in message.fields %}
{{ field.get_field_string() }}
{% if field.comment %}
{{ field.comment }}
{% endif %}
{% endfor %}
{% if not message.fields %}
pass
{% endif %}
{% if message.deprecated or message.has_deprecated_fields %}
def __post_init__(self) -> None:
{% if message.deprecated %}
warnings.warn("{{ message.py_name }} is deprecated", DeprecationWarning)
{% endif %}
super().__post_init__()
{% for field in message.deprecated_fields %}
if self.{{ field }}:
warnings.warn("{{ message.py_name }}.{{ field }} is deprecated", DeprecationWarning)
{% endfor %}
{% endif %}
{% endfor %}
{% for service in output_file.services %}
class {{ service.py_name }}Stub(betterproto.ServiceStub):
{% if service.comment %}
{{ service.comment }}
{% elif not service.methods %}
pass
{% endif %}
{% for method in service.methods %}
async def {{ method.py_name }}(self
{%- if not method.client_streaming -%}
{%- if method.py_input_message -%}, {{ method.py_input_message_param }}: "{{ method.py_input_message_type }}"{%- endif -%}
{%- else -%}
{# Client streaming: need a request iterator instead #}
, {{ method.py_input_message_param }}_iterator: Union[AsyncIterable["{{ method.py_input_message_type }}"], Iterable["{{ method.py_input_message_type }}"]]
{%- endif -%}
) -> {% if method.server_streaming %}AsyncIterator["{{ method.py_output_message_type }}"]{% else %}"{{ method.py_output_message_type }}"{% endif %}:
{% if method.comment %}
{{ method.comment }}
{% endif %}
{% if method.server_streaming %}
{% if method.client_streaming %}
async for response in self._stream_stream(
"{{ method.route }}",
{{ method.py_input_message_param }}_iterator,
{{ method.py_input_message_type }},
{{ method.py_output_message_type.strip('"') }},
):
yield response
{% else %}{# i.e. not client streaming #}
async for response in self._unary_stream(
"{{ method.route }}",
{{ method.py_input_message_param }},
{{ method.py_output_message_type.strip('"') }},
):
yield response
{% endif %}{# if client streaming #}
{% else %}{# i.e. not server streaming #}
{% if method.client_streaming %}
return await self._stream_unary(
"{{ method.route }}",
{{ method.py_input_message_param }}_iterator,
{{ method.py_input_message_type }},
{{ method.py_output_message_type.strip('"') }}
)
{% else %}{# i.e. not client streaming #}
return await self._unary_unary(
"{{ method.route }}",
{{ method.py_input_message_param }},
{{ method.py_output_message_type.strip('"') }}
)
{% endif %}{# client streaming #}
{% endif %}
{% endfor %}
{% endfor %}
{% for service in output_file.services %}
class {{ service.py_name }}Base(ServiceBase):
{% if service.comment %}
{{ service.comment }}
{% endif %}
{% for method in service.methods %}
async def {{ method.py_name }}(self
{%- if not method.client_streaming -%}
{%- if method.py_input_message -%}, {{ method.py_input_message_param }}: "{{ method.py_input_message_type }}"{%- endif -%}
{%- else -%}
{# Client streaming: need a request iterator instead #}
, {{ method.py_input_message_param }}_iterator: AsyncIterator["{{ method.py_input_message_type }}"]
{%- endif -%}
) -> {% if method.server_streaming %}AsyncIterator["{{ method.py_output_message_type }}"]{% else %}"{{ method.py_output_message_type }}"{% endif %}:
{% if method.comment %}
{{ method.comment }}
{% endif %}
raise grpclib.GRPCError(grpclib.const.Status.UNIMPLEMENTED)
{% endfor %}
{% for method in service.methods %}
async def __rpc_{{ method.py_name }}(self, stream: grpclib.server.Stream) -> None:
{% if not method.client_streaming %}
request = await stream.recv_message()
{% else %}
request = stream.__aiter__()
{% endif %}
{% if not method.server_streaming %}
response = await self.{{ method.py_name }}(request)
await stream.send_message(response)
{% else %}
await self._call_rpc_handler_server_stream(
self.{{ method.py_name }},
stream,
request,
)
{% endif %}
{% endfor %}
def __mapping__(self) -> Dict[str, grpclib.const.Handler]:
return {
{% for method in service.methods %}
"{{ method.route }}": grpclib.const.Handler(
self.__rpc_{{ method.py_name }},
{% if not method.client_streaming and not method.server_streaming %}
grpclib.const.Cardinality.UNARY_UNARY,
{% elif not method.client_streaming and method.server_streaming %}
grpclib.const.Cardinality.UNARY_STREAM,
{% elif method.client_streaming and not method.server_streaming %}
grpclib.const.Cardinality.STREAM_UNARY,
{% else %}
grpclib.const.Cardinality.STREAM_STREAM,
{% endif %}
{{ method.py_input_message_type }},
{{ method.py_output_message_type }},
),
{% endfor %}
}
{% endfor %}
{% for i in output_file.imports|sort %}
{{ i }}
{% endfor %}

tests/README.md Normal file

@@ -0,0 +1,91 @@
# Standard Tests Development Guide
Standard test cases are found in [betterproto/tests/inputs](inputs), where each subdirectory represents a test case that is verified in isolation.
```
inputs/
bool/
double/
int32/
...
```
## Test case directory structure
Each test case has a `<name>.proto` file with a message called `Test`, and optionally a matching `.json` file and a custom test called `test_*.py`.
```bash
bool/
bool.proto
bool.json # optional
test_bool.py # optional
```
### proto
`<name>.proto` &mdash; *The protobuf message to test*
```protobuf
syntax = "proto3";
message Test {
bool value = 1;
}
```
You can add multiple `.proto` files to the test case, as long as one file matches the directory name.
### json
`<name>.json` &mdash; *Test-data to validate the message with*
```json
{
"value": true
}
```
### pytest
`test_<name>.py` &mdash; *Custom test to validate specific aspects of the generated class*
```python
from tests.output_betterproto.bool.bool import Test
def test_value():
message = Test()
assert not message.value, "Boolean is False by default"
```
## Standard tests
The following tests are automatically executed for all cases:
- [x] Can the generated python code be imported?
- [x] Can the generated message class be instantiated?
- [x] Is the generated code compatible with Google's `grpc_tools.protoc` implementation?
- _when `.json` is present_
## Running the tests
- `pipenv run generate`
This generates:
- `betterproto/tests/output_betterproto` &mdash; *the plugin generated python classes*
- `betterproto/tests/output_reference` &mdash; *reference implementation classes*
- `pipenv run test` (a targeted invocation is sketched below)
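A minimal sketch for running a single test case, assuming `pytest`'s `-k` keyword filter is available in the pipenv environment (the standard suite lives in `tests/test_inputs.py`):
```bash
# Run the whole standard suite
pipenv run test
# Run only the tests whose names match "bool" (hypothetical selection)
pipenv run pytest tests/test_inputs.py -k bool
```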
## Intentionally Failing tests
The standard test suite includes tests that fail by intention. These tests document known bugs and missing features that are intended to be corrected in the future.
When running `pytest`, they show up as `x` or `X` in the test results.
```
betterproto/tests/test_inputs.py ..x...x..x...x.X........xx........x.....x.......x.xx....x...................... [ 84%]
```
- `.` &mdash; PASSED
- `x` &mdash; XFAIL: expected failure
- `X` &mdash; XPASS: expected failure, but still passed
Test cases marked for expected failure are declared in [inputs/config.py](inputs/config.py).

tests/__init__.py Normal file

tests/conftest.py Normal file

@@ -0,0 +1,12 @@
import pytest
def pytest_addoption(parser):
parser.addoption(
"--repeat", type=int, default=1, help="repeat the operation multiple times"
)
@pytest.fixture(scope="session")
def repeat(request):
return request.config.getoption("repeat")

tests/generate.py Executable file

@@ -0,0 +1,168 @@
#!/usr/bin/env python
import asyncio
import os
from pathlib import Path
import platform
import shutil
import sys
from typing import Set
from tests.util import (
get_directories,
inputs_path,
output_path_betterproto,
output_path_reference,
protoc,
)
# Force pure-python implementation instead of C++, otherwise imports
# break things because we can't properly reset the symbol database.
os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"
def clear_directory(dir_path: Path):
for file_or_directory in dir_path.glob("*"):
if file_or_directory.is_dir():
shutil.rmtree(file_or_directory)
else:
file_or_directory.unlink()
async def generate(whitelist: Set[str], verbose: bool):
test_case_names = set(get_directories(inputs_path)) - {"__pycache__"}
path_whitelist = set()
name_whitelist = set()
for item in whitelist:
if item in test_case_names:
name_whitelist.add(item)
continue
path_whitelist.add(item)
generation_tasks = []
queued_test_case_names = []
for test_case_name in sorted(test_case_names):
test_case_input_path = inputs_path.joinpath(test_case_name).resolve()
if (
whitelist
and str(test_case_input_path) not in path_whitelist
and test_case_name not in name_whitelist
):
continue
queued_test_case_names.append(test_case_name)
generation_tasks.append(
generate_test_case_output(test_case_input_path, test_case_name, verbose)
)
failed_test_cases = []
# Wait for all subprocs and match any failures back to the test cases that were
# actually queued (a whitelist may have skipped some of the sorted names)
for test_case_name, result in zip(
queued_test_case_names, await asyncio.gather(*generation_tasks)
):
if result != 0:
failed_test_cases.append(test_case_name)
if len(failed_test_cases) > 0:
sys.stderr.write(
"\n\033[31;1;4mFailed to generate the following test cases:\033[0m\n"
)
for failed_test_case in failed_test_cases:
sys.stderr.write(f"- {failed_test_case}\n")
sys.exit(1)
async def generate_test_case_output(
test_case_input_path: Path, test_case_name: str, verbose: bool
) -> int:
"""
Returns the max of the subprocess return values
"""
test_case_output_path_reference = output_path_reference.joinpath(test_case_name)
test_case_output_path_betterproto = output_path_betterproto.joinpath(test_case_name)
os.makedirs(test_case_output_path_reference, exist_ok=True)
os.makedirs(test_case_output_path_betterproto, exist_ok=True)
clear_directory(test_case_output_path_reference)
clear_directory(test_case_output_path_betterproto)
(
(ref_out, ref_err, ref_code),
(plg_out, plg_err, plg_code),
) = await asyncio.gather(
protoc(test_case_input_path, test_case_output_path_reference, True),
protoc(test_case_input_path, test_case_output_path_betterproto, False),
)
if ref_code == 0:
print(f"\033[31;1;4mGenerated reference output for {test_case_name!r}\033[0m")
else:
print(
f"\033[31;1;4mFailed to generate reference output for {test_case_name!r}\033[0m"
)
if verbose:
if ref_out:
print("Reference stdout:")
sys.stdout.buffer.write(ref_out)
sys.stdout.buffer.flush()
if ref_err:
print("Reference stderr:")
sys.stderr.buffer.write(ref_err)
sys.stderr.buffer.flush()
if plg_code == 0:
print(f"\033[31;1;4mGenerated plugin output for {test_case_name!r}\033[0m")
else:
print(
f"\033[31;1;4mFailed to generate plugin output for {test_case_name!r}\033[0m"
)
if verbose:
if plg_out:
print("Plugin stdout:")
sys.stdout.buffer.write(plg_out)
sys.stdout.buffer.flush()
if plg_err:
print("Plugin stderr:")
sys.stderr.buffer.write(plg_err)
sys.stderr.buffer.flush()
return max(ref_code, plg_code)
HELP = "\n".join(
(
"Usage: python generate.py [-h] [-v] [DIRECTORIES or NAMES]",
"Generate python classes for standard tests.",
"",
"DIRECTORIES One or more relative or absolute directories of test-cases to generate classes for.",
" python generate.py inputs/bool inputs/double inputs/enum",
"",
"NAMES One or more test-case names to generate classes for.",
" python generate.py bool double enums",
)
)
def main():
if set(sys.argv).intersection({"-h", "--help"}):
print(HELP)
return
if sys.argv[1:2] == ["-v"]:
verbose = True
whitelist = set(sys.argv[2:])
else:
verbose = False
whitelist = set(sys.argv[1:])
if platform.system() == "Windows":
asyncio.set_event_loop(asyncio.ProactorEventLoop())
asyncio.get_event_loop().run_until_complete(generate(whitelist, verbose))
if __name__ == "__main__":
main()

tests/grpc/__init__.py Normal file


@@ -0,0 +1,221 @@
import asyncio
import sys
import grpclib
import grpclib.metadata
import grpclib.server
import pytest
from betterproto.grpc.util.async_channel import AsyncChannel
from grpclib.testing import ChannelFor
from tests.output_betterproto.service.service import (
DoThingRequest,
DoThingResponse,
GetThingRequest,
)
from tests.output_betterproto.service.service import TestStub as ThingServiceClient
from .thing_service import ThingService
async def _test_client(client: ThingServiceClient, name="clean room", **kwargs):
response = await client.do_thing(DoThingRequest(name=name))
assert response.names == [name]
def _assert_request_meta_received(deadline, metadata):
def server_side_test(stream):
assert stream.deadline._timestamp == pytest.approx(
deadline._timestamp, 1
), "The provided deadline should be received serverside"
assert (
stream.metadata["authorization"] == metadata["authorization"]
), "The provided authorization metadata should be received serverside"
return server_side_test
@pytest.fixture
def handler_trailer_only_unauthenticated():
async def handler(stream: grpclib.server.Stream):
await stream.recv_message()
await stream.send_initial_metadata()
await stream.send_trailing_metadata(status=grpclib.Status.UNAUTHENTICATED)
return handler
@pytest.mark.asyncio
async def test_simple_service_call():
async with ChannelFor([ThingService()]) as channel:
await _test_client(ThingServiceClient(channel))
@pytest.mark.asyncio
async def test_trailer_only_error_unary_unary(
mocker, handler_trailer_only_unauthenticated
):
service = ThingService()
mocker.patch.object(
service,
"do_thing",
side_effect=handler_trailer_only_unauthenticated,
autospec=True,
)
async with ChannelFor([service]) as channel:
with pytest.raises(grpclib.exceptions.GRPCError) as e:
await ThingServiceClient(channel).do_thing(DoThingRequest(name="something"))
assert e.value.status == grpclib.Status.UNAUTHENTICATED
@pytest.mark.asyncio
async def test_trailer_only_error_stream_unary(
mocker, handler_trailer_only_unauthenticated
):
service = ThingService()
mocker.patch.object(
service,
"do_many_things",
side_effect=handler_trailer_only_unauthenticated,
autospec=True,
)
async with ChannelFor([service]) as channel:
with pytest.raises(grpclib.exceptions.GRPCError) as e:
await ThingServiceClient(channel).do_many_things(
do_thing_request_iterator=[DoThingRequest(name="something")]
)
await _test_client(ThingServiceClient(channel))
assert e.value.status == grpclib.Status.UNAUTHENTICATED
@pytest.mark.asyncio
@pytest.mark.skipif(
sys.version_info < (3, 8), reason="async mock spy only works for python3.8+"
)
async def test_service_call_mutable_defaults(mocker):
async with ChannelFor([ThingService()]) as channel:
client = ThingServiceClient(channel)
spy = mocker.spy(client, "_unary_unary")
await _test_client(client)
comments = spy.call_args_list[-1].args[1].comments
await _test_client(client)
assert spy.call_args_list[-1].args[1].comments is not comments
@pytest.mark.asyncio
async def test_service_call_with_upfront_request_params():
# Setting deadline
deadline = grpclib.metadata.Deadline.from_timeout(22)
metadata = {"authorization": "12345"}
async with ChannelFor(
[ThingService(test_hook=_assert_request_meta_received(deadline, metadata))]
) as channel:
await _test_client(
ThingServiceClient(channel, deadline=deadline, metadata=metadata)
)
# Setting timeout
timeout = 99
deadline = grpclib.metadata.Deadline.from_timeout(timeout)
metadata = {"authorization": "12345"}
async with ChannelFor(
[ThingService(test_hook=_assert_request_meta_received(deadline, metadata))]
) as channel:
await _test_client(
ThingServiceClient(channel, timeout=timeout, metadata=metadata)
)
@pytest.mark.asyncio
async def test_service_call_lower_level_with_overrides():
THING_TO_DO = "get milk"
# Setting deadline
deadline = grpclib.metadata.Deadline.from_timeout(22)
metadata = {"authorization": "12345"}
kwarg_deadline = grpclib.metadata.Deadline.from_timeout(28)
kwarg_metadata = {"authorization": "12345"}
async with ChannelFor(
[ThingService(test_hook=_assert_request_meta_received(deadline, metadata))]
) as channel:
client = ThingServiceClient(channel, deadline=deadline, metadata=metadata)
response = await client._unary_unary(
"/service.Test/DoThing",
DoThingRequest(THING_TO_DO),
DoThingResponse,
deadline=kwarg_deadline,
metadata=kwarg_metadata,
)
assert response.names == [THING_TO_DO]
# Setting timeout
timeout = 99
deadline = grpclib.metadata.Deadline.from_timeout(timeout)
metadata = {"authorization": "12345"}
kwarg_timeout = 9000
kwarg_deadline = grpclib.metadata.Deadline.from_timeout(kwarg_timeout)
kwarg_metadata = {"authorization": "09876"}
async with ChannelFor(
[
ThingService(
test_hook=_assert_request_meta_received(kwarg_deadline, kwarg_metadata),
)
]
) as channel:
client = ThingServiceClient(channel, deadline=deadline, metadata=metadata)
response = await client._unary_unary(
"/service.Test/DoThing",
DoThingRequest(THING_TO_DO),
DoThingResponse,
timeout=kwarg_timeout,
metadata=kwarg_metadata,
)
assert response.names == [THING_TO_DO]
@pytest.mark.asyncio
async def test_async_gen_for_unary_stream_request():
thing_name = "my milkshakes"
async with ChannelFor([ThingService()]) as channel:
client = ThingServiceClient(channel)
expected_versions = [5, 4, 3, 2, 1]
async for response in client.get_thing_versions(
GetThingRequest(name=thing_name)
):
assert response.name == thing_name
assert response.version == expected_versions.pop()
@pytest.mark.asyncio
async def test_async_gen_for_stream_stream_request():
some_things = ["cake", "cricket", "coral reef"]
more_things = ["ball", "that", "56kmodem", "liberal humanism", "cheesesticks"]
expected_things = (*some_things, *more_things)
async with ChannelFor([ThingService()]) as channel:
client = ThingServiceClient(channel)
# Use an AsyncChannel to decouple sending and receiving; it'll send some_things
# immediately, and we'll use it to send more_things later, after receiving some
# results
request_chan = AsyncChannel()
send_initial_requests = asyncio.ensure_future(
request_chan.send_from(GetThingRequest(name) for name in some_things)
)
response_index = 0
async for response in client.get_different_things(request_chan):
assert response.name == expected_things[response_index]
assert response.version == response_index + 1
response_index += 1
if more_things:
# Send some more requests as we receive responses to be sure coordination of
# send/receive events doesn't matter
await request_chan.send(GetThingRequest(more_things.pop(0)))
elif not send_initial_requests.done():
# Make sure the sending task is completed
await send_initial_requests
else:
# No more things to send; make sure the channel is closed
request_chan.close()
assert response_index == len(
expected_things
), "Didn't receive all expected responses"


@@ -0,0 +1,97 @@
import asyncio
import betterproto
from betterproto.grpc.util.async_channel import AsyncChannel
from dataclasses import dataclass
import pytest
from typing import AsyncIterator
@dataclass
class Message(betterproto.Message):
body: str = betterproto.string_field(1)
@pytest.fixture
def expected_responses():
return [Message("Hello world 1"), Message("Hello world 2"), Message("Done")]
class ClientStub:
async def connect(self, requests: AsyncIterator):
await asyncio.sleep(0.1)
async for request in requests:
await asyncio.sleep(0.1)
yield request
await asyncio.sleep(0.1)
yield Message("Done")
async def to_list(generator: AsyncIterator):
return [value async for value in generator]
@pytest.fixture
def client():
# channel = Channel(host='127.0.0.1', port=50051)
# return ClientStub(channel)
return ClientStub()
@pytest.mark.asyncio
async def test_send_from_before_connect_and_close_automatically(
client, expected_responses
):
requests = AsyncChannel()
await requests.send_from(
[Message(body="Hello world 1"), Message(body="Hello world 2")], close=True
)
responses = client.connect(requests)
assert await to_list(responses) == expected_responses
@pytest.mark.asyncio
async def test_send_from_after_connect_and_close_automatically(
client, expected_responses
):
requests = AsyncChannel()
responses = client.connect(requests)
await requests.send_from(
[Message(body="Hello world 1"), Message(body="Hello world 2")], close=True
)
assert await to_list(responses) == expected_responses
@pytest.mark.asyncio
async def test_send_from_close_manually_immediately(client, expected_responses):
requests = AsyncChannel()
responses = client.connect(requests)
await requests.send_from(
[Message(body="Hello world 1"), Message(body="Hello world 2")], close=False
)
requests.close()
assert await to_list(responses) == expected_responses
@pytest.mark.asyncio
async def test_send_individually_and_close_before_connect(client, expected_responses):
requests = AsyncChannel()
await requests.send(Message(body="Hello world 1"))
await requests.send(Message(body="Hello world 2"))
requests.close()
responses = client.connect(requests)
assert await to_list(responses) == expected_responses
@pytest.mark.asyncio
async def test_send_individually_and_close_after_connect(client, expected_responses):
requests = AsyncChannel()
await requests.send(Message(body="Hello world 1"))
await requests.send(Message(body="Hello world 2"))
responses = client.connect(requests)
requests.close()
assert await to_list(responses) == expected_responses


@@ -0,0 +1,83 @@
from tests.output_betterproto.service.service import (
DoThingResponse,
DoThingRequest,
GetThingRequest,
GetThingResponse,
)
import grpclib
import grpclib.server
from typing import Dict
class ThingService:
def __init__(self, test_hook=None):
# This lets us pass assertions to the servicer ;)
self.test_hook = test_hook
async def do_thing(
self, stream: "grpclib.server.Stream[DoThingRequest, DoThingResponse]"
):
request = await stream.recv_message()
if self.test_hook is not None:
self.test_hook(stream)
await stream.send_message(DoThingResponse([request.name]))
async def do_many_things(
self, stream: "grpclib.server.Stream[DoThingRequest, DoThingResponse]"
):
thing_names = [request.name async for request in stream]
if self.test_hook is not None:
self.test_hook(stream)
await stream.send_message(DoThingResponse(thing_names))
async def get_thing_versions(
self, stream: "grpclib.server.Stream[GetThingRequest, GetThingResponse]"
):
request = await stream.recv_message()
if self.test_hook is not None:
self.test_hook(stream)
for version_num in range(1, 6):
await stream.send_message(
GetThingResponse(name=request.name, version=version_num)
)
async def get_different_things(
self, stream: "grpclib.server.Stream[GetThingRequest, GetThingResponse]"
):
if self.test_hook is not None:
self.test_hook(stream)
# Respond to each input item immediately
response_num = 0
async for request in stream:
response_num += 1
await stream.send_message(
GetThingResponse(name=request.name, version=response_num)
)
def __mapping__(self) -> Dict[str, "grpclib.const.Handler"]:
return {
"/service.Test/DoThing": grpclib.const.Handler(
self.do_thing,
grpclib.const.Cardinality.UNARY_UNARY,
DoThingRequest,
DoThingResponse,
),
"/service.Test/DoManyThings": grpclib.const.Handler(
self.do_many_things,
grpclib.const.Cardinality.STREAM_UNARY,
DoThingRequest,
DoThingResponse,
),
"/service.Test/GetThingVersions": grpclib.const.Handler(
self.get_thing_versions,
grpclib.const.Cardinality.UNARY_STREAM,
GetThingRequest,
GetThingResponse,
),
"/service.Test/GetDifferentThings": grpclib.const.Handler(
self.get_different_things,
grpclib.const.Cardinality.STREAM_STREAM,
GetThingRequest,
GetThingResponse,
),
}


@@ -0,0 +1,6 @@
from tests.output_betterproto.bool import Test
def test_value():
message = Test()
assert not message.value, "Boolean is False by default"


@@ -9,4 +9,10 @@ enum my_enum {
message Test {
int32 camelCase = 1;
my_enum snake_case = 2;
snake_case_message snake_case_message = 3;
int32 UPPERCASE = 4;
}
message snake_case_message {
}


@@ -0,0 +1,23 @@
import tests.output_betterproto.casing as casing
from tests.output_betterproto.casing import Test
def test_message_attributes():
message = Test()
assert hasattr(
message, "snake_case_message"
), "snake_case field name is same in python"
assert hasattr(message, "camel_case"), "CamelCase field is snake_case in python"
assert hasattr(message, "uppercase"), "UPPERCASE field is lowercase in python"
def test_message_casing():
assert hasattr(
casing, "SnakeCaseMessage"
), "snake_case Message name is converted to CamelCase in python"
def test_enum_casing():
assert hasattr(
casing, "MyEnum"
), "snake_case Enum name is converted to CamelCase in python"


@@ -0,0 +1,7 @@
syntax = "proto3";
message Test {
int32 UPPERCASE = 1;
int32 UPPERCASE_V2 = 2;
int32 UPPER_CAMEL_CASE = 3;
}


@@ -0,0 +1,14 @@
from tests.output_betterproto.casing_message_field_uppercase import Test
def test_message_casing():
message = Test()
assert hasattr(
message, "uppercase"
), "UPPERCASE attribute is converted to 'uppercase' in python"
assert hasattr(
message, "uppercase_v2"
), "UPPERCASE_V2 attribute is converted to 'uppercase_v2' in python"
assert hasattr(
message, "upper_camel_case"
), "UPPER_CAMEL_CASE attribute is converted to upper_camel_case in python"

tests/inputs/config.py Normal file

@@ -0,0 +1,28 @@
# Test cases that are expected to fail, e.g. unimplemented features or bug-fixes.
# Remove from list when fixed.
xfail = {
"namespace_keywords", # 70
"googletypes_struct", # 9
"googletypes_value", # 9
"import_capitalized_package",
"example", # This is the example in the readme. Not a test.
}
services = {
"googletypes_response",
"googletypes_response_embedded",
"service",
"service_separate_packages",
"import_service_input_message",
"googletypes_service_returns_empty",
"googletypes_service_returns_googletype",
"example_service",
"empty_service",
}
# Indicate json sample messages to skip when testing that json (de)serialization
# is symmetrical because some cases legitimately are not symmetrical.
# Each key references the name of the test scenario and the values in the tuple
# are the names of the json files.
non_symmetrical_json = {"empty_repeated": ("empty_repeated",)}


@@ -0,0 +1,4 @@
{
"v": 10,
"value": 10
}


@@ -0,0 +1,9 @@
syntax = "proto3";
// Some documentation about the Test message.
message Test {
// Some documentation about the value.
option deprecated = true;
int32 v = 1 [deprecated=true];
int32 value = 2;
}


@@ -0,0 +1,4 @@
{
"v": 10,
"value": 10
}


@@ -0,0 +1,8 @@
syntax = "proto3";
// Some documentation about the Test message.
message Test {
// Some documentation about the value.
int32 v = 1 [deprecated=true];
int32 value = 2;
}


@@ -0,0 +1,3 @@
{
"msg": [{"values":[]}]
}


@@ -0,0 +1,9 @@
syntax = "proto3";
message MessageA {
repeated float values = 1;
}
message Test {
repeated MessageA msg = 1;
}


@@ -0,0 +1,7 @@
/* Empty service without comments */
syntax = "proto3";
package empty_service;
service Test {
}


@@ -0,0 +1,9 @@
{
"choice": "FOUR",
"choices": [
"ZERO",
"ONE",
"THREE",
"FOUR"
]
}


@@ -0,0 +1,15 @@
syntax = "proto3";
// Tests that enums are correctly serialized and that skipped and out-of-order enum values are handled correctly
message Test {
Choice choice = 1;
repeated Choice choices = 2;
}
enum Choice {
ZERO = 0;
ONE = 1;
// TWO = 2;
FOUR = 4;
THREE = 3;
}


@@ -0,0 +1,84 @@
from tests.output_betterproto.enum import (
Test,
Choice,
)
def test_enum_set_and_get():
assert Test(choice=Choice.ZERO).choice == Choice.ZERO
assert Test(choice=Choice.ONE).choice == Choice.ONE
assert Test(choice=Choice.THREE).choice == Choice.THREE
assert Test(choice=Choice.FOUR).choice == Choice.FOUR
def test_enum_set_with_int():
assert Test(choice=0).choice == Choice.ZERO
assert Test(choice=1).choice == Choice.ONE
assert Test(choice=3).choice == Choice.THREE
assert Test(choice=4).choice == Choice.FOUR
def test_enum_is_comparable_with_int():
assert Test(choice=Choice.ZERO).choice == 0
assert Test(choice=Choice.ONE).choice == 1
assert Test(choice=Choice.THREE).choice == 3
assert Test(choice=Choice.FOUR).choice == 4
def test_enum_to_dict():
assert (
"choice" not in Test(choice=Choice.ZERO).to_dict()
), "Default enum value is not serialized"
assert (
Test(choice=Choice.ZERO).to_dict(include_default_values=True)["choice"]
== "ZERO"
)
assert Test(choice=Choice.ONE).to_dict()["choice"] == "ONE"
assert Test(choice=Choice.THREE).to_dict()["choice"] == "THREE"
assert Test(choice=Choice.FOUR).to_dict()["choice"] == "FOUR"
def test_repeated_enum_is_comparable_with_int():
assert Test(choices=[Choice.ZERO]).choices == [0]
assert Test(choices=[Choice.ONE]).choices == [1]
assert Test(choices=[Choice.THREE]).choices == [3]
assert Test(choices=[Choice.FOUR]).choices == [4]
def test_repeated_enum_set_and_get():
assert Test(choices=[Choice.ZERO]).choices == [Choice.ZERO]
assert Test(choices=[Choice.ONE]).choices == [Choice.ONE]
assert Test(choices=[Choice.THREE]).choices == [Choice.THREE]
assert Test(choices=[Choice.FOUR]).choices == [Choice.FOUR]
def test_repeated_enum_to_dict():
assert Test(choices=[Choice.ZERO]).to_dict()["choices"] == ["ZERO"]
assert Test(choices=[Choice.ONE]).to_dict()["choices"] == ["ONE"]
assert Test(choices=[Choice.THREE]).to_dict()["choices"] == ["THREE"]
assert Test(choices=[Choice.FOUR]).to_dict()["choices"] == ["FOUR"]
all_enums_dict = Test(
choices=[Choice.ZERO, Choice.ONE, Choice.THREE, Choice.FOUR]
).to_dict()
assert (all_enums_dict["choices"]) == ["ZERO", "ONE", "THREE", "FOUR"]
def test_repeated_enum_with_single_value_to_dict():
assert Test(choices=Choice.ONE).to_dict()["choices"] == ["ONE"]
assert Test(choices=1).to_dict()["choices"] == ["ONE"]
def test_repeated_enum_with_non_list_iterables_to_dict():
assert Test(choices=(1, 3)).to_dict()["choices"] == ["ONE", "THREE"]
assert Test(choices=(Choice.ONE, Choice.THREE)).to_dict()["choices"] == [
"ONE",
"THREE",
]
def enum_generator():
yield Choice.ONE
yield Choice.THREE
assert Test(choices=enum_generator()).to_dict()["choices"] == ["ONE", "THREE"]


@@ -0,0 +1,909 @@
// Protocol Buffers - Google's data interchange format
// Copyright 2008 Google Inc. All rights reserved.
// https://developers.google.com/protocol-buffers/
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are
// met:
//
// * Redistributions of source code must retain the above copyright
// notice, this list of conditions and the following disclaimer.
// * Redistributions in binary form must reproduce the above
// copyright notice, this list of conditions and the following disclaimer
// in the documentation and/or other materials provided with the
// distribution.
// * Neither the name of Google Inc. nor the names of its
// contributors may be used to endorse or promote products derived from
// this software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
// Author: kenton@google.com (Kenton Varda)
// Based on original Protocol Buffers design by
// Sanjay Ghemawat, Jeff Dean, and others.
//
// The messages in this file describe the definitions found in .proto files.
// A valid .proto file can be translated directly to a FileDescriptorProto
// without any other information (e.g. without reading its imports).
syntax = "proto2";
// package google.protobuf;
option go_package = "google.golang.org/protobuf/types/descriptorpb";
option java_package = "com.google.protobuf";
option java_outer_classname = "DescriptorProtos";
option csharp_namespace = "Google.Protobuf.Reflection";
option objc_class_prefix = "GPB";
option cc_enable_arenas = true;
// descriptor.proto must be optimized for speed because reflection-based
// algorithms don't work during bootstrapping.
option optimize_for = SPEED;
// The protocol compiler can output a FileDescriptorSet containing the .proto
// files it parses.
message FileDescriptorSet {
repeated FileDescriptorProto file = 1;
}
// Describes a complete .proto file.
message FileDescriptorProto {
optional string name = 1; // file name, relative to root of source tree
optional string package = 2; // e.g. "foo", "foo.bar", etc.
// Names of files imported by this file.
repeated string dependency = 3;
// Indexes of the public imported files in the dependency list above.
repeated int32 public_dependency = 10;
// Indexes of the weak imported files in the dependency list.
// For Google-internal migration only. Do not use.
repeated int32 weak_dependency = 11;
// All top-level definitions in this file.
repeated DescriptorProto message_type = 4;
repeated EnumDescriptorProto enum_type = 5;
repeated ServiceDescriptorProto service = 6;
repeated FieldDescriptorProto extension = 7;
optional FileOptions options = 8;
// This field contains optional information about the original source code.
// You may safely remove this entire field without harming runtime
// functionality of the descriptors -- the information is needed only by
// development tools.
optional SourceCodeInfo source_code_info = 9;
// The syntax of the proto file.
// The supported values are "proto2" and "proto3".
optional string syntax = 12;
}
// Describes a message type.
message DescriptorProto {
optional string name = 1;
repeated FieldDescriptorProto field = 2;
repeated FieldDescriptorProto extension = 6;
repeated DescriptorProto nested_type = 3;
repeated EnumDescriptorProto enum_type = 4;
message ExtensionRange {
optional int32 start = 1; // Inclusive.
optional int32 end = 2; // Exclusive.
optional ExtensionRangeOptions options = 3;
}
repeated ExtensionRange extension_range = 5;
repeated OneofDescriptorProto oneof_decl = 8;
optional MessageOptions options = 7;
// Range of reserved tag numbers. Reserved tag numbers may not be used by
// fields or extension ranges in the same message. Reserved ranges may
// not overlap.
message ReservedRange {
optional int32 start = 1; // Inclusive.
optional int32 end = 2; // Exclusive.
}
repeated ReservedRange reserved_range = 9;
// Reserved field names, which may not be used by fields in the same message.
// A given name may only be reserved once.
repeated string reserved_name = 10;
}
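// Illustrative example (editorial sketch, not from the upstream file):
// a message declared as
//   message M {
//     reserved 9 to 11, 15;
//     reserved "foo";
//   }
// is described here by reserved_range { start: 9 end: 12 } and
// reserved_range { start: 15 end: 16 } (note the exclusive end), plus
// reserved_name: "foo".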
message ExtensionRangeOptions {
// The parser stores options it doesn't recognize here. See above.
repeated UninterpretedOption uninterpreted_option = 999;
// Clients can define custom options in extensions of this message. See above.
extensions 1000 to max;
}
// Describes a field within a message.
message FieldDescriptorProto {
enum Type {
// 0 is reserved for errors.
// Order is weird for historical reasons.
TYPE_DOUBLE = 1;
TYPE_FLOAT = 2;
// Not ZigZag encoded. Negative numbers take 10 bytes. Use TYPE_SINT64 if
// negative values are likely.
TYPE_INT64 = 3;
TYPE_UINT64 = 4;
// Not ZigZag encoded. Negative numbers take 10 bytes. Use TYPE_SINT32 if
// negative values are likely.
TYPE_INT32 = 5;
TYPE_FIXED64 = 6;
TYPE_FIXED32 = 7;
TYPE_BOOL = 8;
TYPE_STRING = 9;
// Tag-delimited aggregate.
// Group type is deprecated and not supported in proto3. However, Proto3
// implementations should still be able to parse the group wire format and
// treat group fields as unknown fields.
TYPE_GROUP = 10;
TYPE_MESSAGE = 11; // Length-delimited aggregate.
// New in version 2.
TYPE_BYTES = 12;
TYPE_UINT32 = 13;
TYPE_ENUM = 14;
TYPE_SFIXED32 = 15;
TYPE_SFIXED64 = 16;
TYPE_SINT32 = 17; // Uses ZigZag encoding.
TYPE_SINT64 = 18; // Uses ZigZag encoding.
}
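// Illustrative note (editorial, not from the upstream file): ZigZag maps
// signed integers to unsigned ones so small negative values stay small on
// the wire: 0 -> 0, -1 -> 1, 1 -> 2, -2 -> 3, and so on. For sint32 this
// is encoded = (n << 1) ^ (n >> 31), using an arithmetic right shift.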
enum Label {
// 0 is reserved for errors
LABEL_OPTIONAL = 1;
LABEL_REQUIRED = 2;
LABEL_REPEATED = 3;
}
optional string name = 1;
optional int32 number = 3;
optional Label label = 4;
// If type_name is set, this need not be set. If both this and type_name
// are set, this must be one of TYPE_ENUM, TYPE_MESSAGE or TYPE_GROUP.
optional Type type = 5;
// For message and enum types, this is the name of the type. If the name
// starts with a '.', it is fully-qualified. Otherwise, C++-like scoping
// rules are used to find the type (i.e. first the nested types within this
// message are searched, then within the parent, on up to the root
// namespace).
optional string type_name = 6;
// For extensions, this is the name of the type being extended. It is
// resolved in the same manner as type_name.
optional string extendee = 2;
// For numeric types, contains the original text representation of the value.
// For booleans, "true" or "false".
// For strings, contains the default text contents (not escaped in any way).
// For bytes, contains the C escaped value. All bytes >= 128 are escaped.
// TODO(kenton): Base-64 encode?
optional string default_value = 7;
// If set, gives the index of a oneof in the containing type's oneof_decl
// list. This field is a member of that oneof.
optional int32 oneof_index = 9;
// JSON name of this field. The value is set by protocol compiler. If the
// user has set a "json_name" option on this field, that option's value
// will be used. Otherwise, it's deduced from the field's name by converting
// it to camelCase.
optional string json_name = 10;
optional FieldOptions options = 8;
// If true, this is a proto3 "optional". When a proto3 field is optional, it
// tracks presence regardless of field type.
//
// When proto3_optional is true, this field must belong to a oneof to
// signal to old proto3 clients that presence is tracked for this field. This
// oneof is known as a "synthetic" oneof, and this field must be its sole
// member (each proto3 optional field gets its own synthetic oneof). Synthetic
// oneofs exist in the descriptor only, and do not generate any API. Synthetic
// oneofs must be ordered after all "real" oneofs.
//
// For message fields, proto3_optional doesn't create any semantic change,
// since non-repeated message fields always track presence. However it still
// indicates the semantic detail of whether the user wrote "optional" or not.
// This can be useful for round-tripping the .proto file. For consistency we
// give message fields a synthetic oneof also, even though it is not required
// to track presence. This is especially important because the parser can't
// tell if a field is a message or an enum, so it must always create a
// synthetic oneof.
//
// Proto2 optional fields do not set this flag, because they already indicate
// optional with `LABEL_OPTIONAL`.
optional bool proto3_optional = 17;
}
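// Illustrative example (editorial sketch, not from the upstream file):
// given a proto3 file containing
//   message Msg { optional int32 x = 1; }
// the parser produces a descriptor equivalent to
//   message Msg { oneof _x { int32 x = 1; } }
// with proto3_optional = true on field x, so pre-optional proto3
// consumers can still track presence through the synthetic oneof "_x".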
// Describes a oneof.
message OneofDescriptorProto {
optional string name = 1;
optional OneofOptions options = 2;
}
// Describes an enum type.
message EnumDescriptorProto {
optional string name = 1;
repeated EnumValueDescriptorProto value = 2;
optional EnumOptions options = 3;
// Range of reserved numeric values. Reserved values may not be used by
// entries in the same enum. Reserved ranges may not overlap.
//
// Note that this is distinct from DescriptorProto.ReservedRange in that it
// is inclusive such that it can appropriately represent the entire int32
// domain.
message EnumReservedRange {
optional int32 start = 1; // Inclusive.
optional int32 end = 2; // Inclusive.
}
// Range of reserved numeric values. Reserved numeric values may not be used
// by enum values in the same enum declaration. Reserved ranges may not
// overlap.
repeated EnumReservedRange reserved_range = 4;
// Reserved enum value names, which may not be reused. A given name may only
// be reserved once.
repeated string reserved_name = 5;
}
// Describes a value within an enum.
message EnumValueDescriptorProto {
optional string name = 1;
optional int32 number = 2;
optional EnumValueOptions options = 3;
}
// Describes a service.
message ServiceDescriptorProto {
optional string name = 1;
repeated MethodDescriptorProto method = 2;
optional ServiceOptions options = 3;
}
// Describes a method of a service.
message MethodDescriptorProto {
optional string name = 1;
// Input and output type names. These are resolved in the same way as
// FieldDescriptorProto.type_name, but must refer to a message type.
optional string input_type = 2;
optional string output_type = 3;
optional MethodOptions options = 4;
// Identifies if client streams multiple client messages
optional bool client_streaming = 5 [default = false];
// Identifies if server streams multiple server messages
optional bool server_streaming = 6 [default = false];
}
// ===================================================================
// Options
// Each of the definitions above may have "options" attached. These are
// just annotations which may cause code to be generated slightly differently
// or may contain hints for code that manipulates protocol messages.
//
// Clients may define custom options as extensions of the *Options messages.
// These extensions may not yet be known at parsing time, so the parser cannot
// store the values in them. Instead it stores them in a field in the *Options
// message called uninterpreted_option. This field must have the same name
// across all *Options messages. We then use this field to populate the
// extensions when we build a descriptor, at which point all protos have been
// parsed and so all extensions are known.
//
// Extension numbers for custom options may be chosen as follows:
// * For options which will only be used within a single application or
// organization, or for experimental options, use field numbers 50000
// through 99999. It is up to you to ensure that you do not use the
// same number for multiple options.
// * For options which will be published and used publicly by multiple
// independent entities, e-mail protobuf-global-extension-registry@google.com
// to reserve extension numbers. Simply provide your project name (e.g.
// Objective-C plugin) and your project website (if available) -- there's no
// need to explain how you intend to use them. Usually you only need one
// extension number. You can declare multiple options with only one extension
// number by putting them in a sub-message. See the Custom Options section of
// the docs for examples:
// https://developers.google.com/protocol-buffers/docs/proto#options
// If this turns out to be popular, a web service will be set up
// to automatically assign option numbers.
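// Illustrative example (assumed names, not from the upstream file): a
// custom file-level option using the experimental number range:
//   import "google/protobuf/descriptor.proto";
//   extend google.protobuf.FileOptions {
//     optional string my_opt = 50001; // hypothetical extension
//   }
//   option (my_opt) = "hello";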
message FileOptions {
// Sets the Java package where classes generated from this .proto will be
// placed. By default, the proto package is used, but this is often
// inappropriate because proto packages do not normally start with backwards
// domain names.
optional string java_package = 1;
// If set, all the classes from the .proto file are wrapped in a single
// outer class with the given name. This applies to both Proto1
// (equivalent to the old "--one_java_file" option) and Proto2 (where
// a .proto always translates to a single class, but you may want to
// explicitly choose the class name).
optional string java_outer_classname = 8;
// If set true, then the Java code generator will generate a separate .java
// file for each top-level message, enum, and service defined in the .proto
// file. Thus, these types will *not* be nested inside the outer class
// named by java_outer_classname. However, the outer class will still be
// generated to contain the file's getDescriptor() method as well as any
// top-level extensions defined in the file.
optional bool java_multiple_files = 10 [default = false];
// This option does nothing.
optional bool java_generate_equals_and_hash = 20 [deprecated=true];
// If set true, then the Java2 code generator will generate code that
// throws an exception whenever an attempt is made to assign a non-UTF-8
// byte sequence to a string field.
// Message reflection will do the same.
// However, an extension field still accepts non-UTF-8 byte sequences.
// This option has no effect when used with the lite runtime.
optional bool java_string_check_utf8 = 27 [default = false];
// Generated classes can be optimized for speed or code size.
enum OptimizeMode {
SPEED = 1; // Generate complete code for parsing, serialization,
// etc.
CODE_SIZE = 2; // Use ReflectionOps to implement these methods.
LITE_RUNTIME = 3; // Generate code using MessageLite and the lite runtime.
}
optional OptimizeMode optimize_for = 9 [default = SPEED];
// Sets the Go package where structs generated from this .proto will be
// placed. If omitted, the Go package will be derived from the following:
// - The basename of the package import path, if provided.
// - Otherwise, the package statement in the .proto file, if present.
// - Otherwise, the basename of the .proto file, without extension.
optional string go_package = 11;
// Should generic services be generated in each language? "Generic" services
// are not specific to any particular RPC system. They are generated by the
// main code generators in each language (without additional plugins).
// Generic services were the only kind of service generation supported by
// early versions of google.protobuf.
//
// Generic services are now considered deprecated in favor of using plugins
// that generate code specific to your particular RPC system. Therefore,
// these default to false. Old code which depends on generic services should
// explicitly set them to true.
optional bool cc_generic_services = 16 [default = false];
optional bool java_generic_services = 17 [default = false];
optional bool py_generic_services = 18 [default = false];
optional bool php_generic_services = 42 [default = false];
// Is this file deprecated?
// Depending on the target platform, this can emit Deprecated annotations
// for everything in the file, or it will be completely ignored; at the very
// least, this is a formalization for deprecating files.
optional bool deprecated = 23 [default = false];
// Enables the use of arenas for the proto messages in this file. This applies
// only to generated classes for C++.
optional bool cc_enable_arenas = 31 [default = true];
// Sets the objective c class prefix which is prepended to all objective c
// generated classes from this .proto. There is no default.
optional string objc_class_prefix = 36;
// Namespace for generated classes; defaults to the package.
optional string csharp_namespace = 37;
// By default Swift generators will take the proto package and CamelCase it
// replacing '.' with underscores and using that to prefix the types/symbols
// defined. When this option is provided, they will use this value instead
// to prefix the types/symbols defined.
optional string swift_prefix = 39;
// Sets the php class prefix which is prepended to all php generated classes
// from this .proto. Default is empty.
optional string php_class_prefix = 40;
// Use this option to change the namespace of php generated classes. Default
// is empty. When this option is empty, the package name will be used for
// determining the namespace.
optional string php_namespace = 41;
// Use this option to change the namespace of php generated metadata classes.
// Default is empty. When this option is empty, the proto file name will be
// used for determining the namespace.
optional string php_metadata_namespace = 44;
// Use this option to change the package of ruby generated classes. Default
// is empty. When this option is not set, the package name will be used for
// determining the ruby package.
optional string ruby_package = 45;
// The parser stores options it doesn't recognize here.
// See the documentation for the "Options" section above.
repeated UninterpretedOption uninterpreted_option = 999;
// Clients can define custom options in extensions of this message.
// See the documentation for the "Options" section above.
extensions 1000 to max;
reserved 38;
}
message MessageOptions {
// Set true to use the old proto1 MessageSet wire format for extensions.
// This is provided for backwards-compatibility with the MessageSet wire
// format. You should not use this for any other reason: It's less
// efficient, has fewer features, and is more complicated.
//
// The message must be defined exactly as follows:
// message Foo {
// option message_set_wire_format = true;
// extensions 4 to max;
// }
// Note that the message cannot have any defined fields; MessageSets only
// have extensions.
//
// All extensions of your type must be singular messages; e.g. they cannot
// be int32s, enums, or repeated messages.
//
// Because this is an option, the above two restrictions are not enforced by
// the protocol compiler.
optional bool message_set_wire_format = 1 [default = false];
// Disables the generation of the standard "descriptor()" accessor, which can
// conflict with a field of the same name. This is meant to make migration
// from proto1 easier; new code should avoid fields named "descriptor".
optional bool no_standard_descriptor_accessor = 2 [default = false];
// Is this message deprecated?
// Depending on the target platform, this can emit Deprecated annotations
// for the message, or it will be completely ignored; at the very least,
// this is a formalization for deprecating messages.
optional bool deprecated = 3 [default = false];
// Whether the message is an automatically generated map entry type for the
// maps field.
//
// For maps fields:
// map<KeyType, ValueType> map_field = 1;
// The parsed descriptor looks like:
// message MapFieldEntry {
// option map_entry = true;
// optional KeyType key = 1;
// optional ValueType value = 2;
// }
// repeated MapFieldEntry map_field = 1;
//
// Implementations may choose not to generate the map_entry=true message, but
// use a native map in the target language to hold the keys and values.
// The reflection APIs in such implementations still need to work as
// if the field is a repeated message field.
//
// NOTE: Do not set the option in .proto files. Always use the maps syntax
// instead. The option should only be implicitly set by the proto compiler
// parser.
optional bool map_entry = 7;
reserved 8; // javalite_serializable
reserved 9; // javanano_as_lite
// The parser stores options it doesn't recognize here. See above.
repeated UninterpretedOption uninterpreted_option = 999;
// Clients can define custom options in extensions of this message. See above.
extensions 1000 to max;
}
message FieldOptions {
// The ctype option instructs the C++ code generator to use a different
// representation of the field than it normally would. See the specific
// options below. This option is not yet implemented in the open source
// release -- sorry, we'll try to include it in a future version!
optional CType ctype = 1 [default = STRING];
enum CType {
// Default mode.
STRING = 0;
CORD = 1;
STRING_PIECE = 2;
}
// The packed option can be enabled for repeated primitive fields to enable
// a more efficient representation on the wire. Rather than repeatedly
// writing the tag and type for each element, the entire array is encoded as
// a single length-delimited blob. In proto3, only explicitly setting it to
// false will avoid using packed encoding.
optional bool packed = 2;
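// Illustrative example (editorial, not from the upstream file): for
//   repeated int32 xs = 1 [packed = true];
// the values [3, 270] encode as 0A 03 03 8E 02 (one tag byte, a length,
// then the varints), versus 08 03 08 8E 02 unpacked, where the tag is
// repeated before every element.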
// The jstype option determines the JavaScript type used for values of the
// field. The option is permitted only for 64 bit integral and fixed types
// (int64, uint64, sint64, fixed64, sfixed64). A field with jstype JS_STRING
// is represented as JavaScript string, which avoids loss of precision that
// can happen when a large value is converted to a floating point
// JavaScript number.
// Specifying JS_NUMBER for the jstype causes the generated JavaScript code to
// use the JavaScript "number" type. The behavior of the default option
// JS_NORMAL is implementation dependent.
//
// This option is an enum to permit additional types to be added, e.g.
// goog.math.Integer.
optional JSType jstype = 6 [default = JS_NORMAL];
enum JSType {
// Use the default type.
JS_NORMAL = 0;
// Use JavaScript strings.
JS_STRING = 1;
// Use JavaScript numbers.
JS_NUMBER = 2;
}
// Should this field be parsed lazily? Lazy applies only to message-type
// fields. It means that when the outer message is initially parsed, the
// inner message's contents will not be parsed but instead stored in encoded
// form. The inner message will actually be parsed when it is first accessed.
//
// This is only a hint. Implementations are free to choose whether to use
// eager or lazy parsing regardless of the value of this option. However,
// setting this option true suggests that the protocol author believes that
// using lazy parsing on this field is worth the additional bookkeeping
// overhead typically needed to implement it.
//
// This option does not affect the public interface of any generated code;
// all method signatures remain the same. Furthermore, thread-safety of the
// interface is not affected by this option; const methods remain safe to
// call from multiple threads concurrently, while non-const methods continue
// to require exclusive access.
//
//
// Note that implementations may choose not to check required fields within
// a lazy sub-message. That is, calling IsInitialized() on the outer message
// may return true even if the inner message has missing required fields.
// This is necessary because otherwise the inner message would have to be
// parsed in order to perform the check, defeating the purpose of lazy
// parsing. An implementation which chooses not to check required fields
// must be consistent about it. That is, for any particular sub-message, the
// implementation must either *always* check its required fields, or *never*
// check its required fields, regardless of whether or not the message has
// been parsed.
optional bool lazy = 5 [default = false];
// Is this field deprecated?
// Depending on the target platform, this can emit Deprecated annotations
// for accessors, or it will be completely ignored; at the very least, this
// is a formalization for deprecating fields.
optional bool deprecated = 3 [default = false];
// For Google-internal migration only. Do not use.
optional bool weak = 10 [default = false];
// The parser stores options it doesn't recognize here. See above.
repeated UninterpretedOption uninterpreted_option = 999;
// Clients can define custom options in extensions of this message. See above.
extensions 1000 to max;
reserved 4; // removed jtype
}
message OneofOptions {
// The parser stores options it doesn't recognize here. See above.
repeated UninterpretedOption uninterpreted_option = 999;
// Clients can define custom options in extensions of this message. See above.
extensions 1000 to max;
}
message EnumOptions {
// Set this option to true to allow mapping different tag names to the same
// value.
optional bool allow_alias = 2;
// Is this enum deprecated?
// Depending on the target platform, this can emit Deprecated annotations
// for the enum, or it will be completely ignored; at the very least, this
// is a formalization for deprecating enums.
optional bool deprecated = 3 [default = false];
reserved 5; // javanano_as_lite
// The parser stores options it doesn't recognize here. See above.
repeated UninterpretedOption uninterpreted_option = 999;
// Clients can define custom options in extensions of this message. See above.
extensions 1000 to max;
}
message EnumValueOptions {
// Is this enum value deprecated?
// Depending on the target platform, this can emit Deprecated annotations
// for the enum value, or it will be completely ignored; at the very least,
// this is a formalization for deprecating enum values.
optional bool deprecated = 1 [default = false];
// The parser stores options it doesn't recognize here. See above.
repeated UninterpretedOption uninterpreted_option = 999;
// Clients can define custom options in extensions of this message. See above.
extensions 1000 to max;
}
message ServiceOptions {
// Note: Field numbers 1 through 32 are reserved for Google's internal RPC
// framework. We apologize for hoarding these numbers to ourselves, but
// we were already using them long before we decided to release Protocol
// Buffers.
// Is this service deprecated?
// Depending on the target platform, this can emit Deprecated annotations
// for the service, or it will be completely ignored; at the very least,
// this is a formalization for deprecating services.
optional bool deprecated = 33 [default = false];
// The parser stores options it doesn't recognize here. See above.
repeated UninterpretedOption uninterpreted_option = 999;
// Clients can define custom options in extensions of this message. See above.
extensions 1000 to max;
}
message MethodOptions {
// Note: Field numbers 1 through 32 are reserved for Google's internal RPC
// framework. We apologize for hoarding these numbers to ourselves, but
// we were already using them long before we decided to release Protocol
// Buffers.
// Is this method deprecated?
// Depending on the target platform, this can emit Deprecated annotations
// for the method, or it will be completely ignored; at the very least,
// this is a formalization for deprecating methods.
optional bool deprecated = 33 [default = false];
// Is this method side-effect-free (or safe in HTTP parlance), or idempotent,
// or neither? HTTP-based RPC implementations may choose the GET verb for
// safe methods, and the PUT verb for idempotent methods, instead of the
// default POST.
enum IdempotencyLevel {
IDEMPOTENCY_UNKNOWN = 0;
NO_SIDE_EFFECTS = 1; // implies idempotent
IDEMPOTENT = 2; // idempotent, but may have side effects
}
optional IdempotencyLevel idempotency_level = 34
[default = IDEMPOTENCY_UNKNOWN];
// The parser stores options it doesn't recognize here. See above.
repeated UninterpretedOption uninterpreted_option = 999;
// Clients can define custom options in extensions of this message. See above.
extensions 1000 to max;
}
// A message representing an option the parser does not recognize. This only
// appears in options protos created by the compiler::Parser class.
// DescriptorPool resolves these when building Descriptor objects. Therefore,
// options protos in descriptor objects (e.g. returned by Descriptor::options(),
// or produced by Descriptor::CopyTo()) will never have UninterpretedOptions
// in them.
message UninterpretedOption {
// The name of the uninterpreted option. Each string represents a segment in
// a dot-separated name. is_extension is true iff a segment represents an
// extension (denoted with parentheses in options specs in .proto files).
// E.g., { ["foo", false], ["bar.baz", true], ["qux", false] } represents
// "foo.(bar.baz).qux".
message NamePart {
required string name_part = 1;
required bool is_extension = 2;
}
repeated NamePart name = 2;
// The value of the uninterpreted option, in whatever type the tokenizer
// identified it as during parsing. Exactly one of these should be set.
optional string identifier_value = 3;
optional uint64 positive_int_value = 4;
optional int64 negative_int_value = 5;
optional double double_value = 6;
optional bytes string_value = 7;
optional string aggregate_value = 8;
}
// ===================================================================
// Optional source code info
// Encapsulates information about the original source file from which a
// FileDescriptorProto was generated.
message SourceCodeInfo {
// A Location identifies a piece of source code in a .proto file which
// corresponds to a particular definition. This information is intended
// to be useful to IDEs, code indexers, documentation generators, and similar
// tools.
//
// For example, say we have a file like:
// message Foo {
// optional string foo = 1;
// }
// Let's look at just the field definition:
// optional string foo = 1;
// ^ ^^ ^^ ^ ^^^
// a bc de f ghi
// We have the following locations:
// span path represents
// [a,i) [ 4, 0, 2, 0 ] The whole field definition.
// [a,b) [ 4, 0, 2, 0, 4 ] The label (optional).
// [c,d) [ 4, 0, 2, 0, 5 ] The type (string).
// [e,f) [ 4, 0, 2, 0, 1 ] The name (foo).
// [g,h) [ 4, 0, 2, 0, 3 ] The number (1).
//
// Notes:
// - A location may refer to a repeated field itself (i.e. not to any
// particular index within it). This is used whenever a set of elements are
// logically enclosed in a single code segment. For example, an entire
// extend block (possibly containing multiple extension definitions) will
// have an outer location whose path refers to the "extensions" repeated
// field without an index.
// - Multiple locations may have the same path. This happens when a single
// logical declaration is spread out across multiple places. The most
// obvious example is the "extend" block again -- there may be multiple
// extend blocks in the same scope, each of which will have the same path.
// - A location's span is not always a subset of its parent's span. For
// example, the "extendee" of an extension declaration appears at the
// beginning of the "extend" block and is shared by all extensions within
// the block.
// - Just because a location's span is a subset of some other location's span
// does not mean that it is a descendant. For example, a "group" defines
// both a type and a field in a single declaration. Thus, the locations
// corresponding to the type and field and their components will overlap.
// - Code which tries to interpret locations should probably be designed to
// ignore those that it doesn't understand, as more types of locations could
// be recorded in the future.
repeated Location location = 1;
message Location {
// Identifies which part of the FileDescriptorProto was defined at this
// location.
//
// Each element is a field number or an index. They form a path from
// the root FileDescriptorProto to the place where the definition occurs. For
// example, this path:
// [ 4, 3, 2, 7, 1 ]
// refers to:
// file.message_type(3) // 4, 3
// .field(7) // 2, 7
// .name() // 1
// This is because FileDescriptorProto.message_type has field number 4:
// repeated DescriptorProto message_type = 4;
// and DescriptorProto.field has field number 2:
// repeated FieldDescriptorProto field = 2;
// and FieldDescriptorProto.name has field number 1:
// optional string name = 1;
//
// Thus, the above path gives the location of a field name. If we removed
// the last element:
// [ 4, 3, 2, 7 ]
// this path refers to the whole field declaration (from the beginning
// of the label to the terminating semicolon).
repeated int32 path = 1 [packed = true];
// Always has exactly three or four elements: start line, start column,
// end line (optional, otherwise assumed same as start line), end column.
// These are packed into a single field for efficiency. Note that line
// and column numbers are zero-based -- typically you will want to add
// 1 to each before displaying to a user.
repeated int32 span = 2 [packed = true];
// If this SourceCodeInfo represents a complete declaration, these are any
// comments appearing before and after the declaration which appear to be
// attached to the declaration.
//
// A series of line comments appearing on consecutive lines, with no other
// tokens appearing on those lines, will be treated as a single comment.
//
// leading_detached_comments will keep paragraphs of comments that appear
// before (but not connected to) the current element. Each paragraph,
// separated by empty lines, will be one comment element in the repeated
// field.
//
// Only the comment content is provided; comment markers (e.g. //) are
// stripped out. For block comments, leading whitespace and an asterisk
// will be stripped from the beginning of each line other than the first.
// Newlines are included in the output.
//
// Examples:
//
// optional int32 foo = 1; // Comment attached to foo.
// // Comment attached to bar.
// optional int32 bar = 2;
//
// optional string baz = 3;
// // Comment attached to baz.
// // Another line attached to baz.
//
// // Comment attached to qux.
// //
// // Another line attached to qux.
// optional double qux = 4;
//
// // Detached comment for corge. This is not leading or trailing comments
// // to qux or corge because there are blank lines separating it from
// // both.
//
// // Detached comment for corge paragraph 2.
//
// optional string corge = 5;
// /* Block comment attached
// * to corge. Leading asterisks
// * will be removed. */
// /* Block comment attached to
// * grault. */
// optional int32 grault = 6;
//
// // ignored detached comments.
optional string leading_comments = 3;
optional string trailing_comments = 4;
repeated string leading_detached_comments = 6;
}
}
// Describes the relationship between generated code and its original source
// file. A GeneratedCodeInfo message is associated with only one generated
// source file, but may contain references to different source .proto files.
message GeneratedCodeInfo {
// An Annotation connects some span of text in generated code to an element
// of its generating .proto file.
repeated Annotation annotation = 1;
message Annotation {
// Identifies the element in the original source .proto file. This field
// is formatted the same as SourceCodeInfo.Location.path.
repeated int32 path = 1 [packed = true];
// Identifies the filesystem path to the original source .proto.
optional string source_file = 2;
// Identifies the starting offset in bytes in the generated code
// that relates to the identified object.
optional int32 begin = 3;
// Identifies the ending offset in bytes in the generated code that
// relates to the identified offset. The end offset should be one past
// the last relevant byte (so the length of the text = end - begin).
optional int32 end = 4;
}
}
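
As the header of this file notes, the protocol compiler can emit a FileDescriptorSet covering every .proto it parses. A minimal sketch of producing and inspecting one from Python with the stock google.protobuf runtime (the file names here are assumptions, not taken from this diff):

    # Generate the set first, e.g.:
    #   protoc --descriptor_set_out=out.bin --include_source_info example.proto
    from google.protobuf import descriptor_pb2

    with open("out.bin", "rb") as f:  # assumed output path
        fds = descriptor_pb2.FileDescriptorSet.FromString(f.read())

    for file_proto in fds.file:  # FileDescriptorProto entries
        print(file_proto.name, file_proto.syntax)
        for message in file_proto.message_type:  # DescriptorProto entries
            print(" ", message.name, [field.name for field in message.field])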
