Add Documentation (#125)

Add sphinx docs with readthedocs integration.

Docs can be built locally with `poe docs`.
James
2020-09-20 21:00:02 +01:00
committed by GitHub
parent 58556e0eb6
commit d3e4fbb311
14 changed files with 973 additions and 98 deletions

31
docs/api.rst Normal file

@@ -0,0 +1,31 @@
.. currentmodule:: betterproto
API reference
=============
The following document outlines betterproto's API. **None** of these classes should be
extended manually by the user.
Message
--------
.. autoclass:: betterproto.Message
:members:
:special-members: __bytes__
.. autofunction:: betterproto.serialized_on_wire
.. autofunction:: betterproto.which_one_of
Enumerations
-------------
.. autoclass:: betterproto.Enum()
:members:
.. autoclass:: betterproto.Casing()
:members:

60
docs/conf.py Normal file

@@ -0,0 +1,60 @@
# Configuration file for the Sphinx documentation builder.
#
# This file only contains a selection of the most common options. For a full
# list see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
import pathlib
import toml
# -- Project information -----------------------------------------------------
project = "betterproto"
copyright = "2019 Daniel G. Taylor"
author = "danielgtaylor"
pyproject = toml.load(str(pathlib.Path(__file__).parent.parent / "pyproject.toml"))
# The full version, including alpha/beta/rc tags.
release = pyproject["tool"]["poetry"]["version"]
# -- General configuration ---------------------------------------------------
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
"sphinx.ext.autodoc",
"sphinx.ext.extlinks",  # required for the extlinks config below
"sphinx.ext.intersphinx",
"sphinx.ext.napoleon",
]
autodoc_member_order = "bysource"
autodoc_typehints = "none"
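# Shorthand external-link roles: e.g. :issue:`123` links to issue 123 on GitHub
# and renders as "GH-123".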
extlinks = {
"issue": ("https://github.com/danielgtaylor/python-betterproto/issues/%s", "GH-"),
}
# Links used for cross-referencing stuff in other documentation
intersphinx_mapping = {
"py": ("https://docs.python.org/3", None),
}
# -- Options for HTML output -------------------------------------------------
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = "friendly"
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = "sphinx_rtd_theme"

33
docs/index.rst Normal file

@@ -0,0 +1,33 @@
Welcome to betterproto's documentation!
=======================================
betterproto is a protobuf compiler and interpreter. It improves the experience of using
Protobuf and gRPC in Python by generating readable, understandable, and idiomatic
Python code using modern language features.
Features:
~~~~~~~~~
- Generated messages are both binary & JSON serializable (see the example below)
- Messages use relevant Python types, e.g. ``Enum``, ``datetime`` and ``timedelta``
objects
- ``async``/``await`` support for gRPC clients
- Generates modern, readable, idiomatic Python code
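A minimal example (``Greeting`` is a placeholder for any generated message class):

.. code-block:: python

    greeting = Greeting(message="Hey!")  # generated messages are plain dataclasses
    data = bytes(greeting)               # binary serialization
    text = greeting.to_json()            # JSON serialization
    parsed = Greeting().parse(data)      # parse the binary form back into a message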
Contents:
~~~~~~~~~
.. toctree::
:maxdepth: 2
quick-start
api
migrating
If you still can't find what you're looking for, try one of the following pages:
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

157
docs/migrating.rst Normal file

@@ -0,0 +1,157 @@
Migration Guide
===============
Google's protocolbuffers
------------------------
betterproto is, for the most part, a 1-to-1 drop-in replacement for Google's protocolbuffers
(after regenerating your protobufs, of course), although there are some minor differences.
.. note::
For compatibility, betterproto implements the same basic methods, including:
- :meth:`betterproto.Message.FromString`
- :meth:`betterproto.Message.SerializeToString`
However, it is important to note that these are effectively aliases for
:meth:`betterproto.Message.parse` and :meth:`betterproto.Message.__bytes__`
respectively, as sketched below.
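A minimal sketch (``MyMessage`` stands in for any generated message class, and
``FromString`` is assumed to be a classmethod mirroring the official API):

.. code-block:: python

    msg = MyMessage()
    payload = msg.SerializeToString()       # equivalent to bytes(msg)
    parsed = MyMessage.FromString(payload)  # equivalent to MyMessage().parse(payload)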
Determining if a message was sent
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Sometimes it is useful to be able to determine whether a message has been sent on
the wire. This is how the Google wrapper types work: they let you know whether a value is
unset, set to the default (zero) value, or set to something else.
Use ``betterproto.serialized_on_wire(message)`` to determine if it was sent. This is
a little bit different from the official Google generated Python code, and it lives
outside the generated ``Message`` class to prevent name clashes. Note that it only
supports Proto 3 and thus can only be used to check if ``Message`` fields are set.
You cannot check if a scalar was sent on the wire.
.. code-block:: python
# Old way (official Google Protobuf package)
>>> mymessage.HasField('myfield')
True
# New way (this project)
>>> betterproto.serialized_on_wire(mymessage.myfield)
True
One-of Support
~~~~~~~~~~~~~~
Protobuf supports grouping fields in a oneof clause. Only one of the fields in the group
may be set at a given time. For example, given the proto:
.. code-block:: proto
syntax = "proto3";
message Test {
oneof foo {
bool on = 1;
int32 count = 2;
string name = 3;
}
}
You can use ``betterproto.which_one_of(message, group_name)`` to determine which of the
fields was set. It returns a tuple of the field name and value, or a blank string and
``None`` if unset. Again, this is a little different from the official Google code
generator:
.. code-block:: python
# Old way (official Google protobuf package)
>>> message.WhichOneof("foo")
'on'
# New way (this project)
>>> betterproto.which_one_of(message, "foo")
('on', True)
Well-Known Google Types
~~~~~~~~~~~~~~~~~~~~~~~
Google provides several well-known message types, such as a timestamp, a duration, and the
wrapper types used to provide optional zero-value support.
representation and is handled a little differently from normal messages. The Python
mapping for these is as follows:
+-------------------------------+-----------------------------------------------+--------------------------+
| ``Google Message`` | ``Python Type`` | ``Default`` |
+===============================+===============================================+==========================+
| ``google.protobuf.Duration`` | :class:`datetime.timedelta` | ``0`` |
+-------------------------------+-----------------------------------------------+--------------------------+
| ``google.protobuf.Timestamp`` | ``Timezone-aware`` :class:`datetime.datetime` | ``1970-01-01T00:00:00Z`` |
+-------------------------------+-----------------------------------------------+--------------------------+
| ``google.protobuf.*Value`` | ``Optional[...]``/``None`` | ``None`` |
+-------------------------------+-----------------------------------------------+--------------------------+
| ``google.protobuf.*`` | ``betterproto.lib.google.protobuf.*`` | ``None`` |
+-------------------------------+-----------------------------------------------+--------------------------+
For the wrapper types, the Python type corresponds to the wrapped type, e.g.
``google.protobuf.BoolValue`` becomes ``Optional[bool]`` while
``google.protobuf.Int32Value`` becomes ``Optional[int]``. All of the optional values
default to ``None``, so don't forget to check for that possible state.
Given:
.. code-block:: proto
syntax = "proto3";
import "google/protobuf/duration.proto";
import "google/protobuf/timestamp.proto";
import "google/protobuf/wrappers.proto";
message Test {
google.protobuf.BoolValue maybe = 1;
google.protobuf.Timestamp ts = 2;
google.protobuf.Duration duration = 3;
}
You can use it like so:
.. code-block:: python
>>> t = Test().from_dict({"maybe": True, "ts": "2019-01-01T12:00:00Z", "duration": "1.200s"})
>>> t
Test(maybe=True, ts=datetime.datetime(2019, 1, 1, 12, 0, tzinfo=datetime.timezone.utc), duration=datetime.timedelta(seconds=1, microseconds=200000))
>>> t.ts - t.duration
datetime.datetime(2019, 1, 1, 11, 59, 58, 800000, tzinfo=datetime.timezone.utc)
>>> t.ts.isoformat()
'2019-01-01T12:00:00+00:00'
>>> t.maybe = None
>>> t.to_dict()
{'ts': '2019-01-01T12:00:00Z', 'duration': '1.200s'}
[1.2.5] to [2.0.0b1]
--------------------
Updated package structures
~~~~~~~~~~~~~~~~~~~~~~~~~~
Generated code now strictly follows the *package structure* of the ``.proto`` files.
Consequently, ``.proto`` files without a package will be combined in a single
``__init__.py`` file. To avoid overwriting existing ``__init__.py`` files, it's best
to compile into a dedicated subdirectory.
Upgrading:
- Remove your previously compiled ``.py`` files.
- Create a new *empty* directory, e.g. ``generated`` or ``lib/generated/proto`` etc.
- Regenerate your Python files into this directory.
- Update your import statements, e.g. ``from generated import ExampleMessage`` (see the sketch below).
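A sketch of the resulting imports (``hello.proto``, ``misc.proto``, the ``generated``
output directory, and the message names are all hypothetical):

.. code-block:: python

    # hello.proto declares "package hello;", so its messages live in a subpackage
    from generated.hello import Greeting

    # misc.proto declares no package, so its messages land in generated/__init__.py
    from generated import ExampleMessage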

192
docs/quick-start.rst Normal file

@@ -0,0 +1,192 @@
Getting Started
===============
Installation
++++++++++++
Installation from PyPI is as simple as running:
.. code-block:: sh
python3 -m pip install -U betterproto
If you are using Windows, use the following instead:
.. code-block:: sh
py -3 -m pip install -U betterproto
To include the protoc plugin, install ``betterproto[compiler]`` instead of ``betterproto``,
e.g.:
.. code-block:: sh
python3 -m pip install -U "betterproto[compiler]"
Compiling proto files
+++++++++++++++++++++
Assuming you have installed the compiler and have a proto file, e.g. ``example.proto``:
.. code-block:: proto
syntax = "proto3";
package hello;
// Greeting represents a message you can tell a user.
message Greeting {
string message = 1;
}
To compile the proto file, you can invoke protoc directly:
.. code-block:: sh
mkdir lib
protoc -I . --python_betterproto_out=lib example.proto
or run the following to invoke protoc via grpcio-tools:
.. code-block:: sh
pip install grpcio-tools
python -m grpc_tools.protoc -I . --python_betterproto_out=lib example.proto
This will generate ``lib/hello/__init__.py``, which looks like:
.. code-block:: python
# Generated by the protocol buffer compiler. DO NOT EDIT!
# sources: example.proto
# plugin: python-betterproto
from dataclasses import dataclass
import betterproto
@dataclass
class Greeting(betterproto.Message):
"""Greeting represents a message you can tell a user."""
message: str = betterproto.string_field(1)
Then to use it:
.. code-block:: python
>>> from lib.hello import Greeting
>>> test = Greeting()
>>> test
Greeting(message='')
>>> test.message = "Hey!"
>>> test
Greeting(message='Hey!')
>>> serialized = bytes(test)
>>> serialized
b'\n\x04Hey!'
>>> Greeting().parse(serialized)
Greeting(message='Hey!')
Async gRPC Support
++++++++++++++++++
The generated code includes `grpclib <https://grpclib.readthedocs.io/en/latest>`_-based
stub (client) classes for RPC services declared in the input proto files.
This is enabled by default.
Given a service definition similar to the one below:
.. code-block:: proto
syntax = "proto3";
package echo;
message EchoRequest {
string value = 1;
// Number of extra times to echo
uint32 extra_times = 2;
}
message EchoResponse {
repeated string values = 1;
}
message EchoStreamResponse {
string value = 1;
}
service Echo {
rpc Echo(EchoRequest) returns (EchoResponse);
rpc EchoStream(EchoRequest) returns (stream EchoStreamResponse);
}
The generated client can be used like so:
.. code-block:: python
import asyncio
from grpclib.client import Channel
import echo
async def main():
channel = Channel(host="127.0.0.1", port=50051)
service = echo.EchoStub(channel)
response = await service.echo(value="hello", extra_times=1)
print(response)
async for response in service.echo_stream(value="hello", extra_times=1):
print(response)
# don't forget to close the channel when you're done!
channel.close()
asyncio.run(main())  # requires Python 3.7+
# outputs
EchoResponse(values=['hello', 'hello'])
EchoStreamResponse(value='hello')
EchoStreamResponse(value='hello')
JSON
++++
Message objects include :meth:`betterproto.Message.to_json` and
:meth:`betterproto.Message.from_json` methods for JSON (de)serialization, and
:meth:`betterproto.Message.to_dict` and :meth:`betterproto.Message.from_dict` for
converting back and forth from JSON-serializable dicts; both directions are shown below.
For compatibility, the default is to convert field names to
:attr:`betterproto.Casing.CAMEL`. You can control this behavior by passing a
different casing value, e.g.:
.. code-block:: python
@dataclass
class MyMessage(betterproto.Message):
a_long_field_name: str = betterproto.string_field(1)
>>> test = MyMessage(a_long_field_name="Hello World!")
>>> test.to_dict(betterproto.Casing.SNAKE)
{'a_long_field_name': 'Hello World!'}
>>> test.to_dict(betterproto.Casing.CAMEL)
{'aLongFieldName': 'Hello World!'}
>>> test.to_json(indent=2)
'{\n  "aLongFieldName": "Hello World!"\n}'
>>> test.from_dict({"aLongFieldName": "Goodbye World!"})
>>> test.a_long_field_name
"Goodbye World!"


@@ -1,16 +0,0 @@
# Upgrade Guide
## [1.2.5] to [2.0.0b1]
### Updated package structures
Generated code now strictly follows the *package structure* of the `.proto` files.
Consequently `.proto` files without a package will be combined in a single `__init__.py` file.
To avoid overwriting existing `__init__.py` files, its best to compile into a dedicated subdirectory.
Upgrading:
- Remove your previously compiled `.py` files.
- Create a new *empty* directory, e.g. `generated` or `lib/generated/proto` etcetera.
- Regenerate your python files into this directory
- Update import statements, e.g. `import ExampleMessage from generated`