Using buf.build to generate your gRPC code

Tai Vong
6 min read · Sep 23, 2021

As you may know, the buf tool has served as a good linter for the gRPC ecosystem since its first introduction. After all that time, the team released a tool that makes working with gRPC even easier. The buf generate command was announced long ago, and I also tried the beta version. It was great, but it still lacked too many features to convince me to adopt it.

Recently, the team finally released an official version together with the Buf Schema Registry, and that got me interested again. You can see an example code base here: https://github.com/johanbrandhorst/grpc-gateway-boilerplate. It describes a standard code base that can easily be transformed into a good gRPC repository at minimum cost. I followed that practice and did my own research on the buf team's official documentation here: https://docs.buf.build/tour/introduction. In this article, I will walk you through the tour I took to migrate my old pipeline to a new one using buf.

The traditional way

If you have used gRPC before, you may be familiar with plugins like protoc-gen-go-grpc, protoc-gen-grpc-gateway, protoc-gen-gogo, protoc-gen-swagger, protoc-gen-validator, … They are well-known plugins that bring the best features to your gRPC server. They often come with great protobuf extensions such as gogoproto, validator, swagger, and the Google API annotations. Together they form a pipeline that takes your gRPC server to a new level. I used them in my Go repositories, and given the strong support from the gRPC development team, I believe developers in other languages can follow the same practice.

The most common way to manage those dependencies is the Go tools-file pattern, which tracks them alongside the Go codebase:

// +build tools

package tools

import (
	_ "github.com/grpc-ecosystem/grpc-gateway/v2/protoc-gen-grpc-gateway"
	_ "github.com/grpc-ecosystem/grpc-gateway/v2/protoc-gen-openapiv2"
	_ "google.golang.org/grpc/cmd/protoc-gen-go-grpc"
	_ "google.golang.org/protobuf/cmd/protoc-gen-go"
)

or install them manually:

go install \
	github.com/grpc-ecosystem/grpc-gateway/v2/protoc-gen-grpc-gateway \
	github.com/grpc-ecosystem/grpc-gateway/v2/protoc-gen-openapiv2 \
	google.golang.org/protobuf/cmd/protoc-gen-go \
	google.golang.org/grpc/cmd/protoc-gen-go-grpc

After all the plugins have been installed, we can now accomplish our task using this command.

protoc \
	-I proto \
	-I third_party/grpc-gateway/ \
	-I third_party/googleapis \
	--go_out=paths=source_relative:./proto \
	--go-grpc_out=paths=source_relative:./proto \
	--grpc-gateway_out=paths=source_relative:./proto \
	--openapiv2_out=third_party/OpenAPI/ \
	proto/example.proto

That command specifies a pipeline and uses the protobuf compiler to generate the code for us.

  • The -I flags specify all the places where protobuf files can be found; they are used to resolve the import paths of your protobuf files.
  • Each --<plugin>_out flag names a plugin to run in the pipeline, followed by extra parameters that adjust the plugin's behavior. For example, paths=source_relative:./proto means the output paths should mirror the source layout, and the root output folder is ./proto.
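For instance, with paths=source_relative, generated files land next to their source files. Assuming the plugins listed earlier, a single proto/example.proto would roughly produce (file names follow the plugins' standard conventions):

```text
proto/
├── example.proto
├── example.pb.go          # message types (protoc-gen-go)
├── example_grpc.pb.go     # gRPC stubs (protoc-gen-go-grpc)
└── example.pb.gw.go       # reverse proxy (protoc-gen-grpc-gateway)
```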

As you can see, this introduces a lot of complexity and is hard to read for an inexperienced developer. Most of the time, an experienced developer will write the most general pipeline possible and put it in the Makefile, so that everything can be done with make generate. Newcomers joining the project usually cannot understand what is going on, and it takes them a long time before they can edit it or move away from it.
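Such a Makefile typically looks something like this (a minimal sketch; the target name and paths are illustrative, not taken from the boilerplate above):

```makefile
# Hypothetical Makefile wrapping the protoc pipeline.
.PHONY: generate
generate:
	protoc \
		-I proto \
		-I third_party/grpc-gateway/ \
		-I third_party/googleapis \
		--go_out=paths=source_relative:./proto \
		--grpc-gateway_out=paths=source_relative:./proto \
		--openapiv2_out=third_party/OpenAPI/ \
		proto/example.proto
```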

I also followed the above process for a long time. Last year, I adopted buf from the buf team to lint our protobuf files and detect breaking changes that could harm our process. Since then, the buf team has built another tool to make the process above easier. But my first attempt ran into a lot of obstacles trying to make it work with third-party plugins: version control for third-party plugins was still hard to deal with, and go mod vendor only copies the files needed to build the Go code, so the protobuf files from other repositories were left out. At that time, buf did little more than clean up the command. Only recently, with the official release and the Buf Schema Registry, can we finally tackle those problems.

The modern way

To use buf, install the CLI as described here: https://docs.buf.build/tour/introduction/

And that’s all; we are ready to continue. Now we will transform this pipeline into its buf equivalent.

protoc \
	-I proto \
	-I third_party/grpc-gateway/ \
	-I third_party/googleapis \
	-I vendor \
	--go_out=paths=source_relative:./proto \
	--go-grpc_out=paths=source_relative:./proto \
	--grpc-gateway_out=paths=source_relative:./proto \
	--openapiv2_out=third_party/OpenAPI/ \
	proto/example.proto

To generate the protobuf code with buf, you should have a buf.gen.yaml file in your project root. It specifies all the plugins to run and their options:

version: v1
plugins:
  - name: go
    out: ./proto
    opt:
      - paths=source_relative
  - name: go-grpc
    out: ./proto
    opt:
      - paths=source_relative
  - name: grpc-gateway
    out: ./proto
    opt:
      - paths=source_relative
  - name: openapiv2
    out: ./docs

To include the third-party proto files, we could add them to our go.mod, vendor them, and use relative paths to target those files. As mentioned above, though, the vendoring process may not work: vendoring only copies into the vendor folder the files strictly required to build the Go codebase, which does not serve the protobuf use case well.

So a common workaround is to clone those proto files into a folder and copy it into each project we work on. That leaves our projects slightly out of sync, and as time goes by, the proto files soon become outdated relative to the plugin versions. This is where the BSR becomes a hero: it provides a dependency management system for those proto files. You just need to specify them in a buf.yaml file.

version: v1
name: buf.build/yourorg/myprotos
deps:
  - buf.build/googleapis/googleapis
  - buf.build/grpc-ecosystem/grpc-gateway

Then run the buf mod update command. Buf will create a buf.lock file that pins the resolved dependencies for us. No more pain copying proto files around from time to time.
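The generated buf.lock looks roughly like this (a sketch of the v1 format; the commit IDs below are placeholders, not real values):

```yaml
version: v1
deps:
  - remote: buf.build
    owner: googleapis
    repository: googleapis
    commit: "0000000000000000000000000000"
  - remote: buf.build
    owner: grpc-ecosystem
    repository: grpc-gateway
    commit: "0000000000000000000000000000"
```

You should commit this file so every machine resolves the exact same dependency versions.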

Now your code is ready to be generated. Run buf generate and see the miracle.

Managing the workspace

Soon you will realize that buf scans all the proto files inside your root folder before starting the generation process. That can cause a lot of trouble, because your codebase may contain extra proto files that should not be included, especially those in the vendor folder. buf proposes a solution for this; it may not be the best, but I got it working with little effort: create a buf.work.yaml file in the same folder as your buf.gen.yaml.

version: v1
directories:
  - proto

In that file, we list all the folders containing proto files that should be included. After that, import paths must be relative to the directories listed there. Each listed folder should contain its own buf.yaml and buf.lock files so that it can resolve its own dependencies.
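Putting it all together, a workspace laid out this way might look like the following (file names other than the buf configs are illustrative):

```text
.
├── buf.work.yaml        # lists the workspace directories
├── buf.gen.yaml         # plugins and output options
└── proto/
    ├── buf.yaml         # module name and BSR dependencies
    ├── buf.lock         # pinned dependency commits
    └── example.proto
```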

Conclusion

I have walked you through the process of transforming a traditional protobuf generation pipeline into a modern buf pipeline. Despite how great the tool is, I am still stuck on some problems with the private protobuf plugins I wrote for our organization. But I believe that is not a big problem, and buf will become the best tool across the gRPC ecosystem in the future. So don’t hesitate to make the change today.
