After announcing its partnership with M31 Capital, SQD’s co-founders held a live discussion with M31’s David Attermann to explain the company’s vision and development approach.
Why M31 Invested in SQD
M31 Capital focuses on infrastructure, middleware, and the data layer — an underrated area of Web3. David Attermann explained that SQD caught his attention after its Google Cloud partnership. Following thorough due diligence, he came to see SQD as a fundamentally different approach to blockchain data, with capabilities beyond what traditional web stacks offer.
The investment appealed to M31 because SQD emphasizes verifiability and designed its network around sustainable unit economics. Attermann noted that SQD will provide increasingly valuable functionality over time, making it a long-term investment.
Why SQD Values the Partnership
Co-founder Marcel highlighted that technically sophisticated people are rare in crypto venture capital, and that the technical depth of M31's team complements SQD's product-focused approach.
SQD’s Origin and Differentiation
CEO Dmitry Zhelezov recounted how SQD began three years ago to address a specific problem: indexing blockchain data through standard RPCs was inefficient. The initial prototype stored chain data in a more efficient database and gained traction after a hackathon win, which led to building SQD as a scalable solution.
The platform combines decentralized physical infrastructure (DePIN) with advanced Web2 data technologies, offering three key advantages:
Scalability: The decentralized data lake scales with each added node. Each worker contributes 1 GB/s of throughput, achieving 100x efficiency gains over traditional indexers.
Flexibility: SQD stores raw data directly, enabling developers creative freedom while solving data access at scale through client-side tooling.
Network Economics: Rather than setting a fixed token inflation rate at launch, SQD implemented a fixed reward pool during bootstrapping to maximize decentralization. Hardware requirements remain low to encourage widespread node participation.
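The fixed-pool mechanic can be illustrated with toy arithmetic (the figures below are invented, not SQD's actual parameters): because the epoch pool is constant, each worker's share shrinks as more nodes join, so the network can bootstrap without unbounded token inflation.

```python
# Toy illustration of a fixed reward pool (all numbers hypothetical).
EPOCH_POOL = 100_000  # tokens distributed per epoch, regardless of network size

def reward_per_worker(num_workers: int) -> float:
    """With a fixed pool, the per-worker share falls as participation grows."""
    return EPOCH_POOL / num_workers

for workers in (100, 500, 1000):
    print(workers, reward_per_worker(workers))
```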
Game Plan: Gateway 2.0 and Light Clients
Gateway 2.0 will upgrade the current system by pre-fetching and allocating data intelligently, potentially eliminating the need for downstream RPCs. Users will stream filtered data in real-time through a simple interface.
Light Clients leverage SQLite — a lightweight, production-ready database suited to personal devices. Dmitry noted that in Web3 we are still stuck in the Postgres era, while Web2 has long moved on. SQLite enables local indexing, with private data processed inside secure enclaves.
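A minimal sketch of what device-local indexing with SQLite looks like, using Python's standard library. The transfer records and table schema are made up for illustration; the point is that once data is indexed locally, queries are served without any remote RPC round-trip.

```python
import sqlite3

# On a device this would be a local file; ":memory:" keeps the sketch self-contained.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE transfers (block INTEGER, sender TEXT, recipient TEXT, amount INTEGER)"
)

# Hypothetical chain data ingested by the light client.
transfers = [
    (100, "0xAlice", "0xBob", 5),
    (101, "0xBob", "0xCarol", 3),
    (102, "0xAlice", "0xCarol", 7),
]
conn.executemany("INSERT INTO transfers VALUES (?, ?, ?, ?)", transfers)
conn.execute("CREATE INDEX idx_sender ON transfers (sender)")

# Query the local index directly -- no network round-trip.
rows = conn.execute(
    "SELECT block, amount FROM transfers WHERE sender = ?", ("0xAlice",)
).fetchall()
print(rows)  # [(100, 5), (102, 7)]
```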
Verifiability’s Critical Role
The speakers emphasized that verifiability matters more than decentralization, particularly as AI adoption increases. Two verification aspects are essential:
- Confirming data providers added correct onchain data (using hashes)
- Verifying query execution correctness (using Trusted Execution Environments or zero-knowledge proofs)
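The first verification aspect — checking a data chunk against a published hash — can be sketched as follows. SHA-256 is an assumption here; the discussion does not specify SQD's actual commitment scheme, and the chunk contents are invented.

```python
import hashlib

def chunk_hash(chunk: bytes) -> str:
    """Hash a data chunk; SHA-256 is assumed for illustration."""
    return hashlib.sha256(chunk).hexdigest()

# Commitment published when the provider adds the chunk (hypothetical payload).
chunk = b'{"block": 12345, "txs": []}'
committed = chunk_hash(chunk)

# Later, a client re-hashes what it received and compares against the commitment.
received = chunk
assert chunk_hash(received) == committed  # matches: data is intact

tampered = received + b"x"
print(chunk_hash(tampered) == committed)  # prints False: tampering detected
```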