Why are Protobufs not more popular? - WIP
AI-assisted: I developed the arguments and references in this post. Claude helped organize and expand the draft.
It isn’t about the protocol, it is about the schema. You don’t need to give up REST to get the benefits of Protobufs.
Contract-first is a different mindset
Some developers naturally think in terms of defining the contract before writing any code. The service, its methods, its message types, all defined up front. Then tooling generates everything from that definition.
- SOAP did this with XML/WSDL, gRPC does it with Protobuf. Different eras, same instinct.
- The wire format is not the concern. The contract is the source of truth.
- Define the contract, generate the code, trust the process.
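As a sketch, a contract-first definition might look like this (the package, service, and message names are hypothetical, used only for illustration):

```protobuf
// greet.proto — a hypothetical contract-first definition.
// The service, its methods, and its message types live here;
// clients, servers, and docs are all generated from this one file.
syntax = "proto3";

package greet.v1;

message GreetRequest {
  string name = 1;
}

message GreetResponse {
  string greeting = 1;
}

service GreetService {
  rpc Greet(GreetRequest) returns (GreetResponse);
}
```

Nothing here says anything about the wire format or transport. That is the point: the contract is the artifact, everything else is generated.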
JSON is the default
JSON is human-readable, universally supported, and works everywhere. Curl, browser dev tools, Postman, every language. That’s a real strength and it’s why JSON became the default format for APIs. Nothing wrong with that.
- JSON doesn’t have a built-in schema. The shape of the data is implied by the code. Schema definitions (OpenAPI, Swagger, JSON Schema) are often added as an afterthought.
- Most teams build code-first and generate docs later. Contract-first (or “design-first”) requires defining the schema before writing business logic. That’s a different workflow that requires looking at the problem differently.
Most developers have only used one approach
JSON’s ubiquity means fewer developers actively think about their serialization or schema choice. It’s just what you use.
- BaaS platforms (Firebase, Supabase) abstract the API layer. The data format is hidden behind an SDK call.
- Heavy frameworks and vibe coding generate the API layer. AI defaults to JSON because that’s what the training data contains.
- JSON is the default because so many developers before already chose it: the well-worn path, not an active decision.
- Most developers haven’t had the opportunity to try contract-first development.
Bringing attention to Protobufs
A rising tide lifts all boats. The goal is to teach developers another way to define their APIs that opens up more opportunities.
- A `.proto` file defines message types and service contracts in one place. Single definition, everything generated from it.
- The schema is the source of truth, not the code. You define the contract first, then generate clients, servers, and docs from it.
- Any object that leaves the application boundary (DTOs, API responses, events, messages) can be defined as a Protobuf message from the start. That schema becomes shareable across any service that needs it, regardless of language. Two Go services, a Python worker, and a TypeScript frontend can all generate types from the same `.proto` file.
- Tools like `buf` (bufbuild/buf) handle linting, formatting, and breaking change detection against that definition. The schema has its own development lifecycle. `protovalidate` (bufbuild/protovalidate) adds validation rules directly in the `.proto` file. The contract enforces its own constraints.
- Protobuf’s forward and backward compatibility reduces breaking changes and lets teams upgrade independently.
- The transport protocol is a separate decision. gRPC, REST (via gRPC-gateway or Envoy transcoding), and ConnectRPC (connectrpc/connect-es) all work from the same `.proto` definition. JSON still flows through REST gateways, mapped from the proto contract.
- More OSS projects should offer Protobuf-defined APIs alongside their existing interfaces. It normalizes contract-first thinking.
- Larger projects with explicit service contracts get easier to maintain over time, not harder
- Buf’s argument: the real reason to use Protobuf isn’t performance, it’s that schema-driven development reduces integration failures
- Learning to trust the contract is the real shift
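The compatibility claim above is mechanical, not magical: unknown fields are skippable on the wire, so an old reader survives a new writer. A minimal stdlib-only sketch of that idea (a hand-rolled subset of the wire format for illustration, not the real protobuf runtime):

```python
# Sketch: why Protobuf readers tolerate unknown fields (forward compatibility).
# Hand-rolled minimal wire-format encode/decode — illustrative only.

def encode_varint(n: int) -> bytes:
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        out.append(b | (0x80 if n else 0))
        if not n:
            return bytes(out)

def encode_field(field_number: int, wire_type: int, payload: bytes) -> bytes:
    # tag = (field number << 3) | wire type, then the payload
    return encode_varint((field_number << 3) | wire_type) + payload

# "New" writer: schema v2 with field 1 (name, string) and field 2 (retries, varint)
msg = (
    encode_field(1, 2, encode_varint(len(b"hello")) + b"hello")
    + encode_field(2, 0, encode_varint(42))
)

def decode_varint(buf: bytes, i: int) -> tuple[int, int]:
    shift = result = 0
    while True:
        b = buf[i]; i += 1
        result |= (b & 0x7F) << shift
        if not (b & 0x80):
            return result, i
        shift += 7

def old_reader(buf: bytes) -> dict:
    """Schema v1 reader: only knows field 1; skips anything else."""
    out, i = {}, 0
    while i < len(buf):
        tag, i = decode_varint(buf, i)
        field, wire_type = tag >> 3, tag & 0x7
        if wire_type == 0:                      # varint
            value, i = decode_varint(buf, i)
        elif wire_type == 2:                    # length-delimited
            length, i = decode_varint(buf, i)
            value, i = buf[i:i + length], i + length
        else:
            raise ValueError(f"unsupported wire type {wire_type}")
        if field == 1:                          # the only field v1 knows about
            out["name"] = value.decode()
        # unknown fields (here: field 2) are parsed and dropped, not an error
    return out

print(old_reader(msg))  # → {'name': 'hello'}
```

The wire type tells the reader how many bytes to skip, even for fields it has never heard of. That is the whole trick behind upgrading writers and readers independently.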
DevRel strategy
Start with open source tools that already have developer-facing APIs. Contribute Protobuf definitions to real projects.
- Just creating a PR brings awareness. Maintainers and watchers see it, the conversation starts.
- Don’t create PRs with AI-generated code. Authenticity earns credibility with maintainers.
- OpenTelemetry (OTel) is a strong starting point. gRPC is already a core transport, data model defined in Protobuf.
- Metrics, traces, and logs are exactly the kind of structured data contracts where Protobuf excels
- Buf’s tooling-first DevRel approach (CLI, BSR, Buf Slack) has done more for adoption than performance benchmarks
Do it live
Stream the contributions. Demystifies the Protobuf workflow for developers who have never touched a .proto file.
- Questions answered in real time, friction points become visible and solvable in front of an audience
- VODs, clips, and writeups continue educating after the stream ends
- Each contribution becomes a case study for introducing schema-driven APIs to existing projects
It is time to reconsider Protobuf - Blog and Talk
The ROI on Protobuf has never been better. The tooling, the ecosystem, and the developer experience have all changed since the common criticisms were written.
- A direct rebuttal to the feedback collected in I Reviewed 1,000s of Opinions on gRPC. Those criticisms were valid when written. Today each one can either be reversed or is much more nuanced:
  - “Tooling is immature.” `buf` CLI, BSR, VS Code and IntelliJ plugins, Postman gRPC support. The tooling gap has closed.
  - “Build process overhead.” `buf generate` is one command. Remote plugins via BSR mean you don’t need protoc installed locally.
  - “Debugging is hard.” `buf curl`, grpcurl, Postman, and ConnectRPC’s JSON mode all make inspection straightforward.
  - “No browser support.” ConnectRPC works over standard HTTP without a proxy. This is solved.
  - “Load balancing is tricky.” Service meshes (Istio, Linkerd, Envoy) handle gRPC-aware load balancing natively now.
  - “It’s over-engineering for non-Google scale.” The argument was always about performance. The real argument is about schemas. You don’t need Google scale to benefit from clear contracts.
- The deciding factor should be: is this project going to have schemas? Schemas almost always matter. If data crosses a boundary, a schema makes that boundary well defined and easier to maintain.
- Protobuf is the most capable schema language available and can be used anywhere schemas are found. API contracts, event schemas, data pipelines, configuration, inter-service messages, and even REST APIs.
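For example, an event schema can carry its own validation rules via protovalidate. A sketch (the event and its fields are hypothetical; the `buf/validate/validate.proto` import and option syntax follow bufbuild/protovalidate):

```protobuf
// order_events.proto — hypothetical event schema with protovalidate rules.
// The constraints live in the contract itself, not in per-language code.
syntax = "proto3";

package events.v1;

import "buf/validate/validate.proto";

message OrderPlaced {
  // Must be a UUID-shaped identifier.
  string order_id = 1 [(buf.validate.field).string.uuid = true];
  // Whole cents; negative totals are rejected at the boundary.
  int64 total_cents = 2 [(buf.validate.field).int64.gte = 0];
  string customer_email = 3 [(buf.validate.field).string.email = true];
}
```

Every consumer, in every language, enforces the same rules because the rules ship with the schema.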
CFP
It Is Time to Reconsider Protobuf
Most developers dismissed Protocol Buffers years ago. The tooling was rough, the learning curve was steep, and the common advice was “you’re not Google, you don’t need it.” That advice is outdated.
The Protobuf ecosystem has changed. The buf CLI handles linting, formatting, and breaking change detection in a single tool. The Buf Schema Registry provides dependency management and remote code generation without installing protoc. ConnectRPC serves gRPC, gRPC-Web, and plain HTTP from the same .proto definition with no proxy required. Validation rules live directly in the schema with protovalidate.
But the bigger shift is in how we should think about Protobuf. It was never really about performance or replacing REST. Protobuf is a schema language. Any object that crosses an application boundary (an API response, an event, a message, a DTO) can be defined as a Protobuf message. That definition becomes the source of truth, shareable across services in any language, with forward and backward compatibility built in.
This talk walks through the most common criticisms of Protobuf and gRPC, acknowledges where they came from, and shows what has changed. You’ll see the current state of the tooling, a practical workflow for adopting Protobuf schemas in an existing project, and why the real ROI is in schema-driven development, not raw throughput.
Talk outline
Planned for a ~25 min runtime but adaptable to a lightning talk or deep dive
- The old criticisms and why they stuck (5 min)
- Walk through real developer feedback collected from Reddit, HN, and Twitter
- Acknowledge each point was valid at the time
- What changed (8 min)
  - `buf` CLI and BSR: schema development lifecycle
  - ConnectRPC: multi-protocol from one definition, no proxy
  - `protovalidate`: contracts that enforce themselves
  - Postman, VS Code, IntelliJ: mainstream tooling support
- Protobuf is a schema, not just a SerDe format (7 min)
- Any object leaving the application boundary is a candidate
- Schema as source of truth vs code as source of truth
- Forward/backward compatibility vs custom versioning
- Same `.proto`, any language, any transport
- Practical adoption path (5 min)
- Start with DTOs and boundary objects
- Generate alongside existing JSON APIs
- gRPC-gateway / ConnectRPC for coexistence
- One schema, multiple consumers
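The “one schema, multiple consumers” step can be sketched with a `buf.gen.yaml` (remote plugin names and the v2 config shape are assumptions; check the BSR for the current plugins your stack needs):

```yaml
# buf.gen.yaml — sketch: one schema, three consumers.
# Remote plugins run on the BSR, so no local protoc install is required.
version: v2
plugins:
  - remote: buf.build/protocolbuffers/go
    out: gen/go
  - remote: buf.build/protocolbuffers/python
    out: gen/python
  - remote: buf.build/connectrpc/es
    out: web/src/gen
```

One `buf generate` run and the Go service, the Python worker, and the TypeScript frontend all get types from the same contract.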
Target audience
Polyglot developers, API designers, and platform engineers who evaluated Protobuf or gRPC in the past and decided against it, or who have only ever worked with JSON APIs.
Takeaways
- The Protobuf tooling ecosystem has matured to the point where the old friction points simply don’t exist for most use cases
- Protobuf is a schema language, not just a serialization format or a gRPC dependency
- You can adopt Protobuf schemas without giving up REST or JSON
- Schema-driven development reduces integration failures regardless of project scale
References
Schema-driven development and the case for Protobuf
- The real reason to use Protobuf is not performance - Buf’s argument that schema-driven development (not speed) is why Protobuf matters
- API Design-First vs. Code First - Stoplight on why defining contracts before writing code reduces rework
Protobuf compatibility and schema evolution
- Protobuf Language Guide: Updating A Message Type - Official Google docs on safe field changes and wire compatibility
- Protobuf Dos and Don’ts - Official best practices for evolving schemas
- Backward and Forward Compatibility with Protocol Buffers - Practical walkthrough of Protobuf’s extensibility model
Bridging REST and gRPC
- grpc-gateway - Generates a REST reverse-proxy from .proto service definitions
- Envoy gRPC-JSON Transcoder - Infrastructure-level REST-to-gRPC translation
- gRPC-Web - Browser client support for gRPC via proxy
Tooling and ecosystem
- Buf Schema Registry (BSR) - Hosted Protobuf registry with dependency management and generated SDKs
- OpenTelemetry Protocol (OTLP) Specification - Real-world example of Protobuf as an industry-standard wire format
- opentelemetry-proto - The actual .proto definitions for OTel’s data model
Developer sentiment and community discourse
- I Reviewed 1,000s of Opinions on gRPC - Synthesizes developer opinions from Reddit, HN, Twitter, and YouTube
- HN: Can somebody please explain why we would use gRPC? - Candid practitioner discussion on gRPC’s value and friction points
- HN: A detailed comparison of REST and gRPC - Real-world experience reports on the REST vs gRPC tradeoff
Protobuf/gRPC momentum
- Google Pushes for gRPC Support in Model Context Protocol - Google Cloud contributing gRPC transport to Anthropic’s MCP (Feb 2026)