Introducing Zero Knowledge Credentials (ZKCreds), the latest addition to cheqd


cheqd becomes one of the first Decentralised Identity networks to enable Zero Knowledge Credentials, ‘ZKCreds’, also known as AnonCreds. Leveraging the renowned and widely used AnonCreds Verifiable Credential format, cheqd, alongside Animo, has built ZKCreds into our tech stack, opening the door to a more privacy-preserving online experience for users.


Zero Knowledge has fast become one of the hottest topics in Web3, with projects across protocols boasting bold and powerful new tooling to enable more private and autonomous online experiences for users.

A zero-knowledge proof (ZKP) is “a method by which one party (the prover) can prove to another party (the verifier) that a given statement is true while the prover avoids conveying any additional information apart from the fact that the statement is indeed true” (ExpressVPN).

Put simply, a user is able to share only the information that is necessary for the required outcome, rather than everything they could reveal but don’t need to. Combined with Verifiable Credentials (VCs), ZKPs offer a unique blend of privacy, flexibility and trust.
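The prover/verifier interaction in the quoted definition can be made concrete with a classic ZKP, the Schnorr protocol. The sketch below is a toy Python illustration with deliberately small, insecure parameters; AnonCreds itself uses a different scheme (CL signatures), so treat this purely as a demonstration of the zero-knowledge principle:

```python
# Toy Schnorr proof of knowledge: the prover convinces the verifier that it
# knows a secret x with y = g^x mod p, without revealing x. Parameters are
# tiny and INSECURE, for illustration only.
import secrets

p = 2**127 - 1   # a Mersenne prime; fine for a demo, far too small for real use
g = 3

x = secrets.randbelow(p - 2) + 1   # prover's secret
y = pow(g, x, p)                   # public value: y = g^x mod p

# One round of the interactive protocol: commit -> challenge -> response
r = secrets.randbelow(p - 2) + 1
t = pow(g, r, p)                   # prover's commitment
c = secrets.randbelow(2**64)       # verifier's random challenge
s = (r + c * x) % (p - 1)          # prover's response; x itself never leaves

# Verifier's check: g^s == t * y^c (mod p). Passing proves knowledge of x
# while revealing nothing beyond the truth of the statement.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proved knowledge of x without revealing it")
```

The verifier only ever sees (t, c, s), none of which reveals the secret x on its own.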

At cheqd, we’ve been determined to build ZKPs into the stack we offer to our partners, and fortunately, we haven’t had to do this alone. Thanks to the excellent leadership from the Hyperledger Indy and Hyperledger Aries communities, alongside our partner Animo, we’re thrilled to announce the support for Zero Knowledge Credentials (ZKCreds) on cheqd, using the widely known ‘AnonCreds’ Verifiable Credentials format.

Summary of AnonCreds & Zero Knowledge Creds

Across the SSI landscape, there is, of course, a broad range of use cases, which necessitate varying levels of privacy. For some, revealing more data than is strictly required is not a major concern, for example, when less personal data is contained within the credential. However, for others, being able to prove the validity of a credential without revealing the claims, attributes or data within the credential itself is vital. For example, “I don’t need to give up my full name or home address to prove I’m over 18”. Within a Web3 context, “I don’t need to reveal my wallet address to prove I hold a certain amount of a token”.
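To make the “over 18” example concrete: in the AnonCreds model, a verifier sends a proof request whose predicates ask for a proven inequality rather than a raw value. The sketch below follows the general Hyperledger Indy/AnonCreds proof-request shape, trimmed for brevity, so treat the exact fields as illustrative:

```python
# A minimal AnonCreds-style proof request (illustrative; field names follow
# the Hyperledger Indy/AnonCreds proof-request shape, simplified). The
# predicate asks the holder to prove age >= 18 via a ZKP, without the
# verifier ever seeing the actual age value.
proof_request = {
    "name": "age-check",
    "version": "1.0",
    "requested_attributes": {},          # no raw attributes revealed at all
    "requested_predicates": {
        "over_18": {
            "name": "age",
            "p_type": ">=",              # predicate types include >=, >, <=, <
            "p_value": 18,
        }
    },
}

# The verifier learns only whether the predicate holds, nothing more
assert proof_request["requested_predicates"]["over_18"]["p_value"] == 18
```

The holder’s wallet answers with a cryptographic proof that the predicate is satisfied, not with the underlying attribute.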

AnonCreds, short for ‘Anonymous Credentials’, are a flavour of Verifiable Credentials that do this and more. Initially part of Hyperledger Indy, and now housed in the Hyperledger AnonCreds project, AnonCreds have been used since 2017 and are one of the most commonly used Verifiable Credential (VC) formats in the world. Crucially, they are regarded as the standard for ZKP-based verifiable credentials, offering varying levels of privacy to the holder, dependent on the data being embedded.

However, as a Hyperledger-native format, they have, for the most part, been tightly wedded to the Hyperledger Indy blockchain and an associated Hyperledger Aries Software Development Kit (SDK). This made it difficult for developers who wanted to reap the ZKP benefits of AnonCreds while leveraging other networks’ unique value propositions — for example, the payment rails being built by cheqd.

Now, thanks to the work of the AnonCreds community, and the outstanding leadership of Timo Glastra and the Animo team, developers can issue AnonCreds on other chains. What makes this particularly significant is that cheqd is one of the very first to offer AnonCreds on a non-Indy chain in a standards-compliant, highly performant and accessible format; this has been an industry-wide initiative, as we’ll come to below.

Given that the key value proposition of AnonCreds is the capability for zero-knowledge proofs, we’ve chosen to coin the term ‘ZKCreds’ for their implementation on cheqd: “zero-knowledge” is both their defining strength and more widely understood than “Anon” as the key term.

ZKCreds x AFJ: opening the door to cheqd for current and future partners

Early on in our cheqd journey, we recognised the dominance of the Aries frameworks and AnonCreds, as reported in our blog post in April 2022, and have been eager to find a solution.

At cheqd, of our 42 SSI vendors, more than 20 are primarily issuing AnonCreds, or intend to, which currently requires the use of an Aries-centred SDK, such as Aries Framework JavaScript (AFJ) or Aries Cloud Agent Python (ACA-Py).

With the release of the afj/cheqd-sdk module, cheqd’s current and prospective partners can now migrate their stack from Hyperledger Indy to cheqd, issuing AnonCreds with Aries Framework JavaScript initially, with ACA-Py to follow.

Launching ZKCreds on cheqd also has a direct impact on cheqd’s tokenomics. Earlier this year, we introduced a ‘burn’ to the network as part of Tokenomics Part 4, whereby “A proportion of the identity transaction charges will be burnt to establish equilibrium with the inflation on the network and target total supply returning to a total initial supply of one billion $CHEQ.”

With AnonCreds, resources (DID-Linked Resources) such as CredDefs, Schemas and Revocation Registries must be written to the ledger, each with an associated price. Currently, 50% of identity transaction fees are burnt; however, following recent community discussion, a proposal to push this to a 99% burn rate is imminent.

Building ZKCreds into cheqd… how did we do it?

In September 2022, we released and demoed our AnonCreds MVP, working alongside Animo. The crucial requirement for ledger-agnostic AnonCreds was the ability to store the required AnonCreds resources on-ledger. These include Credential Definitions (CredDefs), which create an immutable record of key credential information in one place; schemas, used to list the set of attributes a credential contains; and the resources associated with revoking a credential.

Ultimately, this was made possible through our pioneering implementation of an on-ledger resource module, offering the capability to store and retrieve resources, known as DID-Linked Resources, reflecting the way they are stored on and retrieved from the cheqd ledger.
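As a rough illustration of the addressing scheme (both UUIDs below are made up), a DID-Linked Resource such as a schema or CredDef hangs off its collection’s DID and is referenced by a fully resolvable DID URL:

```python
# Illustration of how a DID-Linked Resource is addressed: a resource such as
# an AnonCreds schema or CredDef belongs to a collection identified by a DID,
# and is dereferenced via a DID URL. Both UUIDs here are made up.
did = "did:cheqd:mainnet:0f2b1690-86a5-40bc-9e7f-6c9f1b2d3e4a"
resource_id = "9d3b1a7c-4f6e-4c2a-8b1d-2e5f6a7b8c9d"

resource_url = f"{did}/resources/{resource_id}"
print(resource_url)

# Splitting the DID URL back into collection DID and resource ID
collection_did, _, res_id = resource_url.partition("/resources/")
assert collection_did == did and res_id == resource_id
```

Because the resource is addressed via the DID, any DID resolver that understands the method can fetch it without bespoke ledger tooling.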

In tandem with our initial MVP, conversations started heating up around making AnonCreds ledger-agnostic, and over the past six months a phenomenal amount of effort has been put in by organisations including the IETF and the AnonCreds Specification Working Group, amongst others. This culminated in the migration of AnonCreds to Hyperledger (hence why the new AnonCreds format is formally known as ‘Hyperledger AnonCreds’). More than 25 sponsors supported the adoption of AnonCreds by Hyperledger, including representatives of Indicio, Accenture, IBM Research Europe, several universities, BC Gov, and Canadian provincial governments, demonstrating the scale of AnonCreds adoption and the vast pool of potential partners and end customers.

What's next?

Moving forward, we’ll now be guiding our partners through integrating with cheqd, specifically with Aries Framework JavaScript. Stay tuned for tutorials on getting started with ZKCreds using AFJ, and check out the cheqd AnonCreds Method here.

For those whose applications are centred around ACA-Py, we’ll also be working with Animo and other partners to build the support for ZKCreds with ACA-Py on cheqd. If you’re involved in ACA-Py development, we’d love to hear from you — reach out to [email protected] to flag your interest.

Separately, if you’re an identity developer, not yet building on cheqd, let’s connect! Reach out to [email protected] to learn more about cheqd and how we may be able to help you.

We’ll also be hosting a series of webinars and workshops geared towards helping you to ‘Build ZKCreds on cheqd’. If you’re interested in being involved, please reach out to [email protected].

Our Product Vision for 2023 at cheqd

Setting the vision for the year ahead for delivering on cheqd network’s unique differentiators in decentralised identity

Co-authored by Ankur Banerjee (CTO/co-founder), Ross Power (Product Manager), and Alex Tweeddale (Product Manager & Governance).

The cheqd team is thrilled to begin 2023 with our first major version release, upgrading the cheqd mainnet/testnet to v1.x, which was successfully carried out on 30th January 2023. Throughout this blog post, we want to revisit our product reflections at the end of last year, and lay out a vision for the future.

🚀 Our product roadmap at cheqd for 2023

Our product vision last year (published just two months after the launch) described our objectives using fundamental building blocks that we wanted to deliver. We’re excited to say that our v1.x release delivered on a lot of those fundamentals (more on that later).

product roadmap at cheqd for 2023, at a glance

Therefore, for this year, we’ve outlined five product development goals the cheqd team is working on, in terms of objectives we want to achieve by the end of 2023:

  1. Double-down on building payment rails for digital identity: This is the unique selling point for which most of our partners signed up to the cheqd network — and why many people in the cheqd community are backing us. For reasons we’ll go into in more detail below, we had to spend more time than we anticipated in 2022 laying down the groundwork for this — but now that those foundations have been delivered in our v1.x release, we’ll be laser-focussed on delivering the payment rails everyone has been waiting for.
  2. Continue simplifying developer experience to make building on top of cheqd network faster and cheaper: We’ve spent a lot of time in 2022 simplifying the developer experience for partners and app developers building products on cheqd. We plan to continue on our mission to make building products on cheqd simple in comparison to Web2 as well as Web3 digital identity competitors. We also want to drive more organic adoption of network usage by existing and future partners.
  3. Support compelling products addressing decentralised reputation for Web3 ecosystems: 2023 seems to be a turning point where the wider crypto ecosystem has acknowledged that Web3 requires compelling solutions to decentralised reputation, but so far this effort has been focussed on using NFTs for identity (hello, soul-bound tokens), proof-of-attendance protocols (“POAPs”), and so on. As we laid out in our trends report on decentralised identity, we think there’s a large and unaddressed gap in the market for a more privacy-focussed vision of how online reputation can be solved in decentralised communities. We’ve been working on a skunkworks project for the past few months, codenamed “Project Boots”, which is aimed at this space.
  4. Integrate deeper with the Cosmoverse and beyond: We focussed a lot of last year on building out strong partnerships in the self-sovereign identity (SSI) ecosystem. This year, we want to leverage the multichain vision of the Cosmos SDK ecosystem to become one of the go-to solutions for digital identity within the Cosmoverse. We also believe that our network can be of utility for Web3 projects not built on Cosmos SDK, without them needing to switch blockchains. At the same time, we will work on improving the availability of $CHEQ tokens, and first-class support in crypto apps (such as wallets and explorers), all with the aim of making it easier to use and build upon the network.
  5. Drive adoption via industry standards for product innovations made at cheqd: We worked on differentiated product innovations last year at cheqd, such as our approach to DID-Linked Resources. While we’ve written and designed this to be ledger-agnostic, widespread adoption of this idea (and therefore, normalising its usage on the cheqd network) requires us to do the legwork to get them codified as industry standards at W3C, DIF, and ToIP. We think the time is ripe for this, given the Decentralized Identifiers Core (“DID Core”) became an official W3C Recommendation standard last year.

We’ll deep-dive into each of the five product development goals below. Alongside this, we are very excited to share an interactive product roadmap for cheqd which we will continue to update and iterate over the course of the year and beyond.

1️⃣ Double-down on building payment rails for digital identity

The core vision of cheqd is that, while the idea of decentralised identity in the control of the user is a noble ideal, the lack of business models explaining why the issuers of this data should give it back to users is hampering the adoption of self-sovereign identity.
Overview of payment rails for digital identity

Payment rails for verifiable credentials will enable each party in a digital identity ecosystem to be rewarded for their participation in the network:

  1. Credential issuers may be rewarded for issuing Verifiable Credentials
  2. Holders may be rewarded for sharing their own data in the form of Verifiable Credentials or Presentations
  3. Credential recipients (“Verifiers”) may be rewarded with compliance benefits and lower costs for identity verifications

At the same time, we acknowledge that one of the fundamental principles of SSI / decentralised identity is giving power to the user. We believe there’s a happy middle-ground that can enable these payment rails and have privacy-preserving characteristics that the SSI ecosystem (and users) value.


  1. You might hold a driving licence, issued by a trusted authority (e.g., the government driving licence authority)
  2. When you go to buy an age-restricted product, such as alcohol or tobacco, the store might check your age by asking you for your driving licence. (Note that the “credential” is being used here outside its original intended purpose of proving the right to drive.) This interaction is “free” for the store’s clerk, since they are only interested in checking whether the age in an official-looking document is above a certain number.
  3. However, if you wanted to rent a car, the car rental agency might run a live check against a government database or other 3rd party API to determine whether the licence is currently valid, or whether it has been revoked ahead of its original expiry date. This interaction typically costs the car rental agency something, i.e., a per-check fee paid to the API they do the lookup on.


The holder is you (the driving licence holder), the issuer is the driving licence agency, and there are two (separate, unrelated) credential recipients (“verifiers”): the alcohol store and the car rental agency. We believe there’s a large sector of the SSI market where credential recipients will be willing to pay a small fee to check the status of credentials, and that this will motivate potential credential issuers to hand out data as portable and secure credentials.

Responses captured within cheqd blog Understanding the SSI stack through 5 trends and challenges


Building such a solution as a centralised, gated API (where payment must be made before the API returns a response) would be relatively easy; centralised, gated APIs are extremely common in the Web2 world. However, such a centralised API would also be privacy-leaking, since the act of looking up details on such APIs itself reveals information about the client application/user accessing the information, the contents of the information being accessed, etc. Both Google and Facebook, for example, gather tracking data for their ad networks through their “Login with…” buttons, regardless of whether you personally use the Google/Facebook login option or not. Therefore, we’re interested in building a decentralised mechanism or API that is privacy-preserving and minimises leakage of which credential’s revocation status was checked. Logical components include:
  1. A decentralised credential status & trust registry: This answers the question “Is this credential I’ve been shown revoked or not?” and “Is the issuer of the credential trustworthy?”. We’ve already delivered a mechanism to answer these two questions in a decentralised fashion, using DID-Linked Resources (and made it more robust in our v1.x mainnet release).
  2. A token-based payment-gating mechanism: The status & trust registry above should only reveal the answer once the pre-defined payment set by the issuer of the credential has been paid. In other words, this would be a mechanism to charge for read actions on DID-Linked Resources.
  3. A payment-privacy system: The value of a transaction or the sender/recipient itself can leak private information about the parties involved. An ideal payment mechanism for the above would try to minimise information leakage.
Without giving too much of the secret sauce away, our product & engineering team already has a viable approach for a token-based payment-gating mechanism, which we plan on building out over Q1/Q2 of this year, layered on top of the code we’ve already shipped for credential status & trust registries. We plan on making this functionality available first using a decentralised approach as the first layer of privacy-preservation, and then layering on multiple approaches to increase payment functionality through to the end of the year.
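As a purely hypothetical thought experiment (none of these names or mechanics come from cheqd’s actual design, which is not yet public), the interaction between the first two components listed above might look something like:

```python
# Hypothetical sketch of a payment-gated status registry: a revocation-status
# read that is only answered once a fee set by the issuer has been paid.
# All class and method names here are invented for illustration.
class GatedStatusRegistry:
    def __init__(self, fee: int):
        self.fee = fee                  # read fee, set by the credential issuer
        self.revoked: set[str] = set()  # IDs of revoked credentials
        self.paid: set[str] = set()     # payment receipts (e.g. tx hashes)

    def record_payment(self, receipt: str) -> None:
        self.paid.add(receipt)

    def is_revoked(self, credential_id: str, receipt: str) -> bool:
        # The read action is gated: no valid payment receipt, no answer.
        if receipt not in self.paid:
            raise PermissionError("read fee not paid")
        return credential_id in self.revoked

registry = GatedStatusRegistry(fee=5)
registry.record_payment("tx-abc123")
assert registry.is_revoked("cred-42", "tx-abc123") is False
```

A real decentralised version would enforce the gate on-ledger and add the payment-privacy layer described in component 3, rather than trusting a single registry object.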

2️⃣ Continue simplifying developer experience to make building on top of cheqd network faster and cheaper

When we started 2022, cheqd had just launched its mainnet in November 2021. While our first mainnet release certainly offered enhanced support for more complex Decentralized Identifier Documents (“DID Documents”), we knew that we were going up against incumbent networks in self-sovereign identity which had been building for much longer than us.

So we set out to address one north star question within the product team last year: “Can an app developer, starting from scratch, build a digital identity use case on cheqd in less than 30 minutes?” Critically, we didn’t qualify this with “…compared to other SSI networks” or “…compared to Web 2.0 identity solutions”. Both Web2 and Web3 identity solutions are our competition, since they represent the status quo in how our digital identities are handled online.

What the cheqd team has delivered in 2022

Much of what we’ve shipped in terms of products on-ledger and off-ledger was building towards answering this question. Some of this work was driven by the cheqd Product & Engineering team directly to jump-start network usage, such as building a plugin for the widely-used Veramo SDK for W3C-compliant DIDs and Verifiable Credentials.

Other efforts, such as the work done on Hyperledger Aries by Animo to make AnonCreds independent of Hyperledger Indy happened more organically through app developers building on top of the cheqd network. Ultimately, we want more app developers organically building products, so that our team can focus on unique differentiators that no other decentralised identity network offers.

cheqd Product Reflections 2022


Product partnerships are crucial to cheqd as a network, whether they are mature SSI vendors, or startups that have heard of SSI but haven’t dipped too far into building it into their products so far.

Instead of just looking at integrating cheqd at a technical level, we’re collaborating on more than a dozen specific product use cases with our partners, such as:

  1. Single Sign-On (opening up the opportunity for “Sign-in with cheqd”);
  2. Integrations in enterprise use cases in Web2;
  3. Integrations into Web3 use cases, including community & reputation management;
  4. Integrations for private network consortia clients;
  5. Integrations for consumer credit scores in Web3 lending environments;
  6. Integrations for recycling and sustainability payments;
  7. Payment rails for financial institutions;
  8. Opportunities for better management of storage.


At the start of 2022, we surveyed our partners to inform our product strategy for the year. The responses emphasised the importance and potential impact of assigning part of our time to adding cheqd network support in Hyperledger Aries, which is the most widely-used SSI SDK.

(N.B. Respondents could select multiple SDKs, hence the graph illustrating a percentage greater than 100%.)

Therefore, this year we’re continuing to build on the progress we made to support Aries frameworks on cheqd, working closely with our partners and the wider SSI community to advance standards and interoperability. This includes contributing to and integrating with Hyperledger AnonCreds, a ledger-agnostic and client-agnostic version of the widely used AnonCreds credential format. Working with Animo, we’re enabling our partners to issue Hyperledger AnonCreds using the Aries Framework Javascript SDK, followed by ACA-Py.

However, we also know that not every SSI vendor uses an Aries SDK. Having a diverse and vibrant ecosystem of SDKs is important for creating a truly decentralised developer ecosystem, and as such we’re continuing to work closely with the Veramo SDK, supporting both JSON (JWT) and JSON-LD credentials.

We also believe that outside of the SSI vendors we’re already partnering with, there’s a larger market of app developers who want to build privacy-enhanced digital identity experiences, but aren’t currently using SSI in their products. This has been evidenced by a large volume of inbound requests we’ve had from such startups and app developers, asking us how they could incorporate a new model for digital identity into their products. Asking these app developers to wholesale switch SDKs is hard.

Therefore, we’re focussed on building cheqd network support into the DIF Universal Registrar, making it quick and easy for app developers to integrate cheqd using simple REST APIs. (REST APIs are the most commonly-used API pattern, used in Web 2.0 apps as well as within Cosmos SDK itself, e.g., for Keplr Wallet interacting with Cosmos dApps.) This will increase cheqd’s wider adoption and improve the user/developer experience for SSI vendors, developers and Web3 partners.
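As a hedged sketch of what this could look like for an app developer, a DID create call is a single REST request. The endpoint path follows the DIF Universal Registrar convention (POST /1.0/create?method=…), but the host is a placeholder and the body fields, including the “network” option, are assumptions to verify against the registrar’s own documentation:

```python
# Building a DID create request for a Universal Registrar style REST API.
# No network call is made here; the final comment shows where one would go.
import json

def build_create_request(base_url: str, method: str, did_document: dict) -> tuple[str, str]:
    url = f"{base_url}/1.0/create?method={method}"
    body = json.dumps({
        "didDocument": did_document,
        "options": {"network": "testnet"},   # assumed cheqd driver option
        "secret": {},
    })
    return url, body

url, body = build_create_request(
    "https://registrar.example.com",         # placeholder host
    "cheqd",
    {"@context": ["https://www.w3.org/ns/did/v1"]},
)
print(url)
# A real client would now POST it, e.g.:
# requests.post(url, data=body, headers={"Content-Type": "application/json"})
```

The appeal for developers is that this requires no ledger-specific SDK at all, just an HTTP client.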


With a huge amount of work on ledger-agnostic AnonCreds in-flight, we plan to continue funding and accelerating efforts to unlock first-class support for cheqd network in the most widely adopted SSI SDK.

We’re also excited to announce that we will soon be fully supported in walt.id’s SSI Kit, which natively supports JSON and JSON-LD credential types, while using a different credential exchange protocol to our existing Veramo SDK: SSI Kit uses the OpenID Connect stack for credential and presentation exchange, affording cheqd even more coverage across the Verifiable Credentials landscape.

3️⃣ Launch compelling products addressing decentralised reputation for Web3 ecosystems

A large gap in the current SSI space is functional governance for SSI ecosystems. There are a few approaches to Trust Registries and Trust Establishment, which are currently unstandardised and generally rely on centralised infrastructure. We presented a research paper at the Rebooting Web of Trust conference last year, titled Dynamic & Decentralized Reputation for the Web of Trust: What We Can Learn from the World of Sports, Tinder, and Netflix, which laid out some of our extremely long-term thinking on where the digital ID industry is heading.

In the more immediate term, with cheqd’s DID-Linked Resources we’ve built a perfect foundation for creating Trusted Issuer Lists, Trust Establishment files, Credential Manifests or Presentation Definitions on-ledger, identifiable with a fully resolvable DID URL. This may even improve the standard paradigm for working with interoperability profiles such as WACI-DIDComm.

Further to this, one of the needs we’ve identified from SSI clients is a web application that is able to manage trusted issuers in a given ecosystem, as well as what levels of governance are required for different purposes.

Using cheqd DID-Linked Resources, we foresee this being far easier to achieve than building it from scratch. This year, alongside one of our partners, we plan to demonstrate these benefits, streamline that partner’s existing client-facing application, and showcase DIF Trust Establishment and Trust Registry capabilities on cheqd.

Lastly on this point, we plan on launching our skunkworks project, codenamed “Project Boots”, within Q1/Q2 of this year. While the SSI space often reaches for the usual suspects in its examples, e.g., Know Your Customer (KYC), education degrees, etc., we fundamentally think these are infrequent use cases, and therefore not something truly painful that real-world users face on a daily basis. What users struggle with is proving their reputation online. Stay tuned for more on how we plan to re-use everything we’ve already launched in a compelling and fun experience for tackling reputation in Web3.

4️⃣ Integrate deeper with the Cosmoverse and beyond

One of the things we understand, in hindsight, is that while we focussed a lot last year on SSI vendor partnerships, we perhaps didn’t focus as much on building equally deep ties within the Cosmoverse. While some of the most notable players in the Cosmos ecosystem validate on our network and we collaborate in terms of contributing back open source code, we could do even better at this.

It’s worth noting that as part of our v1.x release, cheqd is one of the first Cosmos SDK chains to adopt Cosmos SDK v0.46, which brings capabilities such as Interchain Accounts. In addition, Cosmos SDK v0.46 introduces long-term and stable fixes for the “Dragonberry” security vulnerability, which could theoretically compromise IBC payments. Incorporating the deep nuts-and-bolts changes needed to make the cheqd ledger compatible with changes in upstream Cosmos SDK was one of the reasons why our v1.x release took longer than we expected. While being one of the first chains to adopt this Cosmos SDK release was technically challenging, we take security very seriously at cheqd — especially as we want to build the payment rails for digital identity.

Moreover, the next version of Cosmos SDK (v0.47) brings an even bigger set of changes to make the interchain software foundations more robust and secure. While we won’t need to incorporate these right away, the work we’ve done to transition to v1.x will make the process of moving to Cosmos SDK v0.47 smoother when the time comes.

We’ve heard you loud and clear, though, on the need to increase our presence in the Cosmoverse. Some of our aspirational goals in this space are:

  1. First-class support for cheqd network in more Cosmos SDK wallets and apps: We’ve got first-party integrations into Leap Wallet for our network’s token functionality. We want to work on expanding support for Verifiable Credentials natively in crypto wallets by extending the WalletConnect protocol in 2023. Ideally, we want to extend this to Keplr Wallet and Mintscan, which are the two biggest product requests we get. We’re in continued discussions with both and are looking forward to these product integrations as soon as reasonably possible.
  2. Position cheqd as one of the go-to solutions for digital identity within the Cosmoverse: We’ll leverage the good work we’ve done last year, and continue to do this year, in terms of making utility usage of our network easier to specifically guide Cosmos SDK projects on how to use cheqd for privacy-enhanced digital ID. Our goals defined under simplify developer experience and decentralised reputation (aka “Project Boots”) both feed into this goal.

As always, we’ll continue to support $CHEQ in more CEXs/DEXs as an ongoing goal to make the tokenomics of our network more viable.

5️⃣ Drive adoption via industry standards for product innovations made at cheqd

Technical standards are crucial in ensuring interoperability between different SSI ecosystems. As such, we are currently in the process of getting two different standards created: ledger-agnostic Hyperledger AnonCreds and W3C DID-Linked Resources. This is based on feedback from our SSI vendor-partners that the effort we put into getting these cheqd product innovations incorporated into standards has a tangible impact on how highly they prioritise using these features in their products.

Later in the year, we will kick off a technical standard looking at how Verifiable Credentials can be incorporated within WalletConnect at the Chain Agnostic Standards Alliance (CASA). The basic idea here is that there are far more crypto wallets in active use, as opposed to SSI wallets, and native support for Verifiable Credentials in crypto wallets will significantly increase the reach and distribution of SSI. We want to focus on WalletConnect since some of the most widely-adopted crypto wallets support this protocol, and our proposal would be to specify a technical method through which dApps and wallets can signal which kind of Verifiable Credential features they support. Similarly, we’re also exploring collaboration with the Open Wallet foundation, a Linux foundation project with the goal of developing an open-source engine for secure and interoperable multi-purpose wallets.

Finally, we are eager to participate and explore live interoperability testing amongst projects, such as Silicon Valley Innovation Program Plugfest and other large SSI networks such as European Blockchain Services Infrastructure (EBSI) and LACChain. This is important because cheqd does not want to build in a vacuum and aligning with other chains, programmes and identity initiatives will ensure that cheqd’s progress in payments and resources is market-ready and interoperable cross-ecosystem.

⏪ How our v1.x release enables our 2023 product goals

Hopefully, we’ve given a flavour above on our product vision for this year. We’ve been humbled by the response over 2022 from our partners and app developers building on top of the cheqd network. There is clear excitement about the product vision we’ve laid out in terms of economically-sustainable business models for digital identity. But to truly be able to build towards that vision, our strategy was to first offer a decentralised network that was incrementally better than the current state-of-the-art in self-sovereign identity.

Making AnonCreds ledger-independent and the opportunities unlocked by DID-Linked Resources are two tangible examples where we exceeded the capabilities of current SSI networks. But, we also knew that achieving our north star goal of “up-and-running in 30 minutes or less” required additional effort.

To wrap up our product vision, we wanted to recap the key areas that our v1.x release delivers on, and forms the fundamentals of our product roadmap we’ve outlined above for 2023…

Read: cheqd Network Changelog – release 1.x


The feedback on DID-Linked Resources that we received from the decentralised identity community has been, overall, positive. One of the features that developers appreciated was the robust versioning (and being able to query historical versions) for these digital Resources. While it was technically possible to query historical versions of DID Documents as well, we’ve released the same robust versioning method we built for DID-Linked Resources for actual DID Documents themselves. (It’s worth noting that not all DID methods support versioning.)
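For illustration, the W3C DID Core specification defines `versionId` and `versionTime` DID parameters for exactly this kind of historical query; a resolver that supports versioning can answer a DID URL like the one constructed below (the DID itself is made up):

```python
# Constructing a DID URL that asks a resolver for a historical version of a
# DID Document. `versionId` and `versionTime` are DID parameters defined in
# the W3C DID Core specification; the DID below is a made-up example.
from urllib.parse import urlencode

did = "did:cheqd:mainnet:0f2b1690-86a5-40bc-9e7f-6c9f1b2d3e4a"

# Ask for the document as it existed at a given point in time
params = {"versionTime": "2023-01-30T12:00:00Z"}
did_url = f"{did}?{urlencode(params)}"
print(did_url)   # ...?versionTime=2023-01-30T12%3A00%3A00Z
```

As noted above, not all DID methods honour these parameters, so support has to be checked per method.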

We also spent the latter half of 2022 building SDKs and integration software, such as support in the DIF Universal Resolver, which we know are important for developer adoption. Using the lessons learnt along the way, we refactored parts of our on-ledger code to make building integrations with SDKs and 3rd party apps easier. This enables our goals of simplifying developer experience, integrating deeper with the Cosmoverse, and aligning for industry standardisation (based on feedback we received for DID-Linked Resources).


Prior to the v1.x update, writing a DID to cheqd mainnet cost only gas, approximately 0.004 $CHEQ. In comparison to other networks, this is a far lower price (you’ll find a pricing comparison here, and below).



Additionally, $CHEQ was previously purely inflationary: staking rewards were generated from inflation, with only slashing, tombstoning or no-with-veto burns providing any deflationary pressure. Therefore, we chose to introduce two key mechanisms to counteract this. The first, identity transaction pricing, means that any on-ledger transaction related to identity has a specific, fixed fee. For example, creating a DID or any form of resource, like an image or schema, has a set fee. The current prices for these can be seen here, and below.

The second, a burn mechanism, helps establish equilibrium with the network’s inflation by burning a proportion of the identity transaction charges above. Over time, we see this as a viable route to return the total supply to the initial supply of one billion $CHEQ. The current burn percentage has been set at 50%; however, we have tested rates right up to 99% and will take the community’s lead in deciding where this percentage should sit over the longer term.
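
A minimal sketch of how the two mechanisms interact, assuming an illustrative fixed fee of 50 $CHEQ and the 50% burn rate mentioned above (actual fees are listed in the pricing table referenced earlier):

```javascript
// Hedged sketch: split a fixed identity-transaction fee into a burned portion
// (removed from total supply) and a distributed portion. The fee value is
// illustrative, not the actual on-ledger price.
function settleIdentityFee(feeCheq, burnRate) {
  const burned = feeCheq * burnRate;      // permanently removed from supply
  const distributed = feeCheq - burned;   // remains for rewards distribution
  return { burned, distributed };
}

const result = settleIdentityFee(50, 0.5);
console.log(result); // { burned: 25, distributed: 25 }
```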

In addition to making identity transactions more economically sustainable, we also see this as an important way of providing more stability to app developers building with cheqd, as it allows them to more accurately forecast and price the products they build for customers.


Preparing for and implementing the v1.x release did, however, have its fair share of challenges, which, to be transparent, slowed down our release cycle.

To accommodate the ledger enhancements, fundamental changes were required to the underlying code base the ledger uses. For example, identity transaction pricing and the ability to provide an alternative URI for a DID did not exist in the previous version, so these features could not have been implemented without such changes. The effort and time required to implement and test the changes expanded far beyond what we’d anticipated; ultimately, however, doing so has put us in a far better position, which should win back that time in future implementations and releases.

We’ll be providing a deeper-dive into these challenges, plus others, and crucially our lessons learnt which we hope will be useful for other Cosmos SDK-based projects and the general developer community alike.

Tell us what you think!

It’s going to be a very busy and ambitious year here at cheqd and we would love your feedback on our product vision for 2023. We welcome engagement and feedback across a range of different forums, such as our Community Slack and Commonwealth Governance Forum (best for extended, in-depth discussions).

We, at cheqd, help companies leverage SSI. cheqd’s network is built on a blockchain with a dedicated token for payment, which enables new business models for verifiers, holders and issuers. In these business models, verifiable credentials are exchanged in a trusted, reusable, safer, and cheaper way, alongside a customisable fee. Find out more about cheqd’s solutions for self-sovereign identity.

cheqd update 1 year since network launch — next stop: payment rails

The foundations are set for cheqd’s first-of-a-kind decentralised identity payment rails

Co-authored by Ross Power (Product Manager), Alex Tweeddale (Product Manager & Governance Lead) and Ankur Banerjee (CTO/co-founder).

Making cheqd the most flexible, usable and interoperable network is integral to achieving wider adoption, as we pave the way for the first-of-its-kind decentralised identity payment rails. Having recently wrapped up the third quarter of 2022, and nearing one year since network launch, we’re thrilled to have built the foundation for our partners to truly get to work on building their decentralised identity applications on the cheqd network, and we can’t wait to see what they create. Now, as partners begin to integrate their applications with cheqd, we’re heads down on building what we’re here for… cheqd’s payment rails for self-sovereign identity (SSI).


Since we launched cheqd in 2021, we’ve been on a mission to build a stable, trusted and secure decentralised identity network (known within SSI as a Verifiable Data Registry). This is the cornerstone of SSI, acting as the ultimate trust machine required for the verifications of identity data.

Ultimately, it is the fundamental infrastructure needed for being able to offer our first-of-a-kind payment rails which we believe is the missing piece in the widespread adoption of Self Sovereign Identity.

Here’s a quick reminder of what the journey so far looks like…

cheqd — journey to credential payment rails

In a sense, you can think of everything we have been working on to date as:

  1. Feature parity with all other SSI networks;
  2. Identity functionality that goes beyond existing SSI networks; and
  3. The tooling and scaffolding to lay the foundations for payment rails for Verifiable Credentials.

Bringing this all together into a visual representation, using the Trust Over IP stack as we have done in the past, helps to make sense of what cheqd’s capabilities look like both now and what’s to come…

cheqd Capability Matrix

For more information on any of the above including tutorials for creating DIDs and Issuing Verifiable Credentials, head to our identity docs, or reach out to [email protected].

Going beyond the current SSI paradigm

With all the above in place, we now match many of the leading networks in the space, and go well beyond.

Not bad going for a company just over a year old — check out the image below as an example of how cheqd now stacks up against Hyperledger Indy:

cheqd vs Indy comparison — October 2022

To learn more about cheqd pricing and features comparison to Indy based networks, please email: [email protected].

Breaking this down into a quick summary, we can safely say that:

  1. Performance: We are much faster and more scalable for production environments, boasting far greater transactions-per-second throughput and a higher number of transactions per block.
  2. Decentralisation: Owing to the number of nodes which can run on cheqd there is far greater decentralisation. This means that the network is more resilient to attack or malicious actors.
  3. Incentives: We include rewards for maintaining and using the network functionality using our native token: CHEQ. This is even prior to our payment rails being enabled.
  4. Extensibility: We have improved on the way Indy handles DIDs, keeping directly in line with DID Core. This enables cheqd to support the full remit of DID Core, since it does not need to convert anything like a ‘nym’ into a DIDDoc. It works using DIDs from the start.
  5. Innovation: We include far more flexible on-ledger resources, which we’ve built to be directly retrievable using standardised DID URLs. This enables a far greater range of resources to be anchored and easily referenced on cheqd.
  6. Interoperability: We support a greater range of Verifiable Credentials and SDKs (JSON-based JWT, AnonCreds and soon JSON-LD). Indy is closely tied to Hyperledger Aries libraries and struggles to work with other libraries owing to its specific transaction types.
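
To illustrate the “Innovation” point above: a DID-Linked Resource can be referenced by appending a plain path to the DID, giving a standard DID URL. Both UUIDs below are hypothetical placeholders; the `/resources/<id>` path shape follows cheqd’s DID-Linked Resources approach.

```javascript
// Construct a DID URL that dereferences to a specific on-ledger resource
// (e.g. a schema or image). Both UUIDs are hypothetical placeholders.
const did = 'did:cheqd:mainnet:de9786cd-ec53-458c-857c-9342cf264f80';
const resourceId = '60ad67be-b65b-40b8-b2f4-3923141ef382';
const resourceUrl = `${did}/resources/${resourceId}`;
console.log(resourceUrl);
// did:cheqd:mainnet:de9786cd-ec53-458c-857c-9342cf264f80/resources/60ad67be-b65b-40b8-b2f4-3923141ef382
```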

From today, this means our existing partners will be able to begin using cheqd to its full potential, with feature parity with other networks and additional benefits.

Here’s a quick reminder of who our existing partners are:

cheqd partners — October 2022

Looking ahead to the future

As a network we have built fast and are now at a pivotal moment for ourselves and our partners, as we witness products start to fully integrate with cheqd. Soon we’ll be seeing real applications using cheqd as their underlying network. Our demo with Animo shows a first in SSI, with AnonCreds being issued on a non-Indy chain (check this out here).

It also means we can now move our attention to our primary objective and what we believe is the missing piece in the widespread adoption of Self Sovereign Identity — a commercial model that works for all parties.

Therefore, over the next 6 months, our roadmap is largely broken down into two streams:

  1. Integrating with SSI networks and onboarding them onto cheqd for existing clients and identity use cases
  2. Completing the research, development and implementation of payment functionality for revocation statuses and eventually for Verifiable Credentials.

Like always, we’d love to hear your thoughts on our writing and how resources on-ledger could improve your product offering. Feel free to contact the product team directly — [email protected], or alternatively start a thread in either our Slack channel or Discord.

Building decentralised Identity applications on cheqd with Veramo


Launching the Veramo SDK for cheqd, enabling our partners to create and manage Decentralised Identifiers, Verifiable Credentials and Verifiable Presentations on cheqd

Co-authored by Ross Power (Product Manager), Tasos Derisiotis (Senior Blockchain Engineer), Ankur Banerjee (CTO/co-founder) and Alex Tweeddale (Governance & Compliance Lead)

We’re thrilled to launch the Veramo Software Development Kit (SDK) for cheqd. A Software Development Kit is a collection of libraries containing crucial building blocks of SSI which act as a bridge between cheqd and the identity apps provided by our partners. This is a big step forward in the overall adoption of the cheqd network as it makes it possible for our partners to start building their Decentralised Identity applications using cheqd.


One of the primary objectives for Q2 at cheqd was to make it easier for our SSI partners to create and manage DIDs, schemas, CredDefs (for AnonCred users) and Verifiable Credentials. We also wanted to begin work on supporting credential presentation and verification (you can see what we set out to achieve below).

To achieve this objective, it was important to build cheqd support into software that was compatible with languages and applications that our partners already used.

As part of our Product Research earlier this year, we conducted a survey to gauge the stack our SSI partners are currently working with and plan to leverage in the future.

Within this, we identified that most companies use JavaScript/TypeScript or related frameworks as their primary language for development, as seen in the graphic below:

However, despite this dominance, the choice of SDKs didn’t correlate with this: non-JavaScript-based SDKs held the largest proportions, with 35.1% of respondents using Aries Cloud Agent Python (ACA-Py) and 27.0% using Evernym’s VDR Tools SDK (tied to Hyperledger Indy):

What do we mean by an SDK?

The widely used, yet often not well described, term ‘SDK’ refers to Software Development Kits.

To get to grips with this, it’s first important to understand one of the key building blocks that make up SDKs — ‘libraries’.

A library is a packaged, reusable chunk of code that performs a certain function or set of closely related functions. You can insert a library into your application and call it when you need to implement that function without having to write the code from scratch.

So, suppose that instead of an application, you are building a house. One of the things you will need for that house is a stove, but it will not be very practical for you to build it from scratch when you could just buy one off the shelf. A library is to an application what the stove is to that house.

Building on this, an SDK is therefore a collection of libraries, APIs, documentation, utilities, and/or sample code that help you implement certain features without having to do the work from scratch. SDKs vary in scope and function from implementing a feature or set of features, like an analytics SDK for instance, to building whole applications for a specific platform.

Going back to our house analogy, if a library is a stove, then an SDK is a whole kitchen. While you can go and buy all your kitchen appliances, cabinets, and counters separately, it will be a lot easier to buy a full kitchen set, complete with built-in appliances and instructions on how to assemble it. SDKs can be limited in scope, like a single room, or cover a collection of rooms or even the whole house.

Where do SDKs fit in the overall SSI technology stack?

Taking a look at the Trust over IP technology stack helps to illustrate where SDKs fit in. As a reminder, this Trust over IP stack demonstrates the different elements of what is required in the Self Sovereign Identity ecosystem.

At the top level, Layer 4, you’ll notice application ecosystems. These are what the regular user interacts with; a mobile app you download from the app store to hold an airline ticket or vaccine credential for example. At the opposite end of the stack, Layer 1, you have the underlying plumbing. This is the distributed ledger layer where Decentralised Identifiers (DIDs), used to sign Verifiable Credentials (VCs), live.

Between the two you have a vast array of technologies used for sending and receiving Verifiable Credentials (VCs) and Verifiable Presentations (VPs) in a secure, privacy-preserving fashion. The combination of these middle layers is what is more commonly packaged as SDKs, which contain the various libraries required for sending and receiving VCs and VPs.

Selecting the most suitable SDK to leverage for cheqd

Based on the findings detailed above, deciding which SDK to build on wasn’t immediately obvious, although we had a few key requirements:

  1. The SDK should be JavaScript based — this is due both to the overall stack our partners were using, and to the dominance of JavaScript and TypeScript in the Web 3.0 and Cosmos ecosystems
  2. The SDK should be written in languages that are used in web-based applications, such as Keplr, so we can demonstrate DeFi activities (e.g, governance, staking and delegating) in the same wallet as holding a credential
  3. The SDK should be highly modularised so that those not directly using it can easily incorporate the cheqd plug-ins built for it into their own SDKs
  4. The SDK should already be used across the SSI ecosystem, with a significant amount of time being invested into its continuous product development
  5. The SDK should be building in a W3C compliant way, supporting the key exchange protocols and credentials our partners and beyond require

After some research, including exploring Aries Framework JavaScript, Aries Framework Go (transpiled to Wasm), Trinsic Okapi and MATTR, we settled on the Veramo SDK.

What is the Veramo SDK and why did we select it?

Available at: uport-project/veramo

Veramo is a JavaScript Framework for Verifiable Data. The Veramo SDK is an agent designed to handle multiple networks and DID methods. Current implementations of the Veramo SDK include did:ethr, did:web, did:key and more.

We chose Veramo for a few key reasons.

  1. Design Principles — The Veramo SDK was designed to be highly flexible and modular, making it highly scalable and fit for many complex workflows. As a result, we felt it offered a route to minimise how much needs to be built from scratch. Through its flexible plug-in system, it’s easy to pick and choose which plug-ins are most beneficial, and it’s possible to add in custom packages where required, which we knew would be necessary for Cosmos-based transactions.
  2. Developer Experience — The Veramo SDK has been designed in a way that offers a fast end-to-end process. Ultimately, at cheqd, we want to reduce the amount of time our team spends on SDKs so that we can maintain our focus on building ledger functionality (i.e. building our implementation of the revocation registry and the credential payment rails).
  3. Attractive & Simple CLI — The Veramo core API is exposed by its CLI which makes it easy to create DIDs and VCs directly from a terminal or local cloud agent.
  4. Platform Agnostic — The Veramo packages run on Node, Browsers and React & React Native right out of the box.

What does the Veramo SDK for cheqd allow you to do on the cheqd network?

Veramo provides an excellent foundation for clients that want to build verifiable data applications. This is because Veramo Core, the Veramo CLI and any specific plugins are available as NPM packages, enabling:

  1. Identity functionality to be carried out through a native CLI interface; or
  2. Identity functionality to be integrated directly into client applications through NPM packages.

More importantly, Veramo is useful for both DIDs as well as Verifiable Credentials and Verifiable Presentations. It’s not just for one side of the credential usage process.

When combining the existing packages provided by the Veramo SDK with some native packages that we’ve built at cheqd, the following functionality is available on the cheqd network (hyperlinks go to the tutorial/information for each):

With the additional Resource module recently applied to the cheqd ledger, AnonCreds will also be possible through the cheqd/sdk which the Veramo SDK for cheqd uses. This is not yet available but is part of the immediate cheqd product roadmap.

Thank you to the Veramo team for all their support throughout. We’re thrilled to strengthen this partnership and appreciate these kind words shared with us by the Veramo team:

The cheqd team quickly and effectively understood all the effort that the Veramo team has been putting into making life easier for developers when building better layers of trust in their applications. We also thank the cheqd team for being participative and collaborative in the Veramo community by supporting the development and discussions.

– Italo Borssato

I was pleasantly surprised to hear that the folks from cheqd were able to extend and adapt Veramo to their own use-case while leveraging existing integration, which then allowed them to focus on their core business. This kind of permissionless innovation and integration is what we were hoping for when designing the Veramo framework. I’m very happy to see the cheqd team not only using Veramo but also contributing back.

– Mircea Nistor

Our implementation of the Veramo SDK for cheqd

Veramo’s modular architecture meant we were able to leverage much of the existing work which offers the core functionality required for credential and presentation issuance and verification.

Adding the ability to create and update DIDs took more work, including contributing to Veramo upstream to improve the overall ecosystem beyond our own implementation. For example, we wanted to support full DIDDoc create and update operations which were not previously available within Veramo’s codebase.

Below you’ll find the basic architecture for the Veramo SDK for cheqd. The Veramo packages offer key Layer 2 and Layer 3 functionality, as described above, including exchange protocols and proof formats.

The key cheqd package is did-provider-cheqd.

The purpose of this @cheqd/did-provider-cheqd NPM package is to enable developers to interact with the cheqd ledger using Veramo SDK.

It includes Veramo SDK Agent methods for use with the Veramo CLI NPM package. It can also be consumed as an NPM package outside Veramo CLI for building your own applications with NPM.

The package’s core functionality is borrowed from Veramo Core NPM package and extends this to include cheqd ledger functionality, such as creating and managing DIDs.

Head over to our tutorials site to find everything you need to install this package and start using cheqd for your identity products today!

Figure 1: Veramo SDK for cheqd architecture

A reference implementation of the Veramo SDK for cheqd, built for the cheqd web demo

If you’re interested in how cheqd has worked with the Veramo packages, but aren’t sure what this would look like in practice, perhaps our implementation we built for an IIW demo would be useful for you.

For a full walkthrough of our choices here check out Ankur’s Tweetstorm on this and if you’re interested in doing something similar feel free to reach out to the product team at cheqd ([email protected]).

Figure 2: A reference implementation of the Veramo SDK for cheqd, built for the cheqd demo

FAQs on Veramo & our implementation

Q: Is it possible to use cheqd libraries to build a mobile wallet? (do the cheqd libraries take React Native support into account)

A: Veramo runs on React Native right out of the box. More information on this is available here.

Q: Do the cheqd libraries introduce any new dependencies that won’t work in React Native? (e.g. the cosmjs libraries or other crypto-related libraries)

A: React Native can work with any browser-based JS library, and cosmjs, along with most other libraries we use at cheqd, works in browser environments, so no incompatible dependencies are introduced.

Q: Where are keys stored?

A: The Veramo SDK uses a PrivateKeyStore for hosting the key data. The @veramo/data-store package provides an implementation that uses a database for storing these keys encrypted by a SecretBox.

The modular architecture of the framework makes it possible to use multiple types of key stores and key management systems for different purposes. These include implementations using CloudHSM, mnemonic-based key derivation, or even web3 wallets.


We’re thrilled to have hit this exciting milestone at cheqd.

We knew our partners were eager for us to answer their question “when will I be able to use the cheqd ledger for identity?”, and now we’re happy to be able to say “go have a play now!”.

So what’s next for cheqd in terms of SDKs?

Ultimately, we know that there are more focused and dedicated teams building SSI SDKs, and our Veramo SDK is just the start. Over the coming weeks and months, we’re forming ’cheqd SDK Alliances’ within our partner network, which can collaborate on the required SDKs, leveraging this implementation.

These alliances include those looking to build for Aries Framework JavaScript, .NET and ACA-Py, as well as those building further on Veramo.

Where we believe we can add the greatest value is by building out our approach to Revocation Registry and the credential payment rails, which we believe will be a real game-changer for SSI’s adoption.

If you’re interested in joining one of the alliances, reach out to the cheqd product team ([email protected]).

As always, we’d love to hear your thoughts on our writing and what this means for your company. Feel free to contact the product team directly — [email protected], or alternatively start a thread in either our Slack channel or Discord.

cheqd’s “open-source-a-thon” to contribute new tools to identity & Cosmos SDK community


Co-written by Ankur Banerjee (CTO/co-founder), Ross Power (Product Manager), and Alex Tweeddale (Governance Lead)


As strong proponents for open source software, over the past month the cheqd Engineering & Product team has spent a lot of effort polishing and open-sourcing products we’ve been developing for decentralised identity and Cosmos SDK framework. Some of these tools are core to delivering our product roadmap vision, while others are tools we built for internal usage and believe will be valuable for developers outside of our company.

Most of the Cosmos SDK blockchain framework, as well as self-sovereign identity (SSI) code, is built by community-led efforts where developers open-source their code and make it available for free. Or at least, that’s how it’s supposed to work. In practice, unfortunately, very few companies or developers contribute code “upstream” or make it available to others, leading to the “one random person in Nebraska” problem.

While the crypto markets have taken a hit, we believe there is no better time to get our heads down to build, innovate, and iterate on new products and tools.

Our intention is to enable others to benefit from our work. For each product or tool that we are releasing under an open source license (Apache 2.0), we explain what the unique value proposition is, and which audience could benefit from the work we have done.

Our work is largely split between:

  1. Core identity functionality which is integral to our product and work in the Self-Sovereign Identity ecosystem; and
  2. Helpful tooling, infrastructure and analytics packages for Cosmos SDK, which can be easily adopted by other Cosmos/IBC chain developers.

🆔 Core Identity Functionality


Github repositories: cheqd/did-resolver

We released the cheqd DID method in late 2021, creating a way for any person to write permanent and tamper-proof identifiers that act as the root of trust for issuing digital credentials. While the functionality to read/write Decentralized Identifiers (DIDs) has existed since the beginning of the network, such as when we published the first DID on our network or ran an Easter egg hunt for our community with clues hidden in DIDs, we wanted to go further and provide a “resolver” or “reader” software that makes it easy for app developers to use such functionality themselves. This was one of our primary goals for the first half of 2022.

Having a DID resolver software is important since reading DIDs from a network happens disproportionately more often than writing DIDs to a network. Think about the act of getting a Twitter username or a domain name: you sign up and create the handle once, and then use it as many times as you want to send new tweets, or to publish new web pages on your domain. This is similar to writing a DID once, and using it to publish digitally verifiable credentials many times. The recipients of those credentials can use a DID resolver to read the keys written on-ledger used to verify that credentials issued are untampered.

Many decentralised digital identity projects use the Decentralized Identity Foundation (DIF) Universal Resolver project to carry out DID reads/resolution. Therefore, we have made a cheqd DID Resolver available under an open-source license, and we’re working with DIF to get this integrated upstream into the Universal Resolver project.

The DID resolver we’ve made available is a “full” profile resolver, which is written in Golang (same as our ledger/node software) and can do authenticated/verified data reads from any node on the cheqd network over its gRPC API endpoint. This will likely be required by app developers or partners looking at processing high volumes of DID resolution requests since they can pull the data from their own cheqd node, and/or if they want the highest levels of assurance that the data pulled was untampered.
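
Resolvers that follow the DIF Universal Resolver convention expose DID reads over HTTP at a `/1.0/identifiers/<did>` path. A small sketch of building such a request URL (the hostname is a hypothetical placeholder for wherever you host a cheqd DID Resolver):

```javascript
// Build the HTTP resolution endpoint for a DID, following the DIF
// Universal Resolver path convention. The base URL is a placeholder.
function resolutionEndpoint(baseUrl, did) {
  return `${baseUrl.replace(/\/+$/, '')}/1.0/identifiers/${did}`;
}

console.log(resolutionEndpoint(
  'https://resolver.example.com/',
  'did:cheqd:mainnet:zF7rhDBfUt9d1gJPjx7s1J'
));
// https://resolver.example.com/1.0/identifiers/did:cheqd:mainnet:zF7rhDBfUt9d1gJPjx7s1J
```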

We also plan on releasing a “light” profile DID Resolver, built as a tiny Node.js application designed to run on Cloudflare Workers (a serverless hosting platform with extremely quick “cold start” times). This will allow app developers who don’t want to run a full node + full DID Resolver to be able to run their own, extremely scalable, and lightweight infrastructure for servicing DID read requests.

In short, this architecture improves:

  • Accessibility to cheqd DIDs
  • Flexibility, offering app developers and partners optionality and choice of platforms to run on, according to their security/scalability needs and at varying levels of infrastructure cost.


Github repositories: cheqd/did-provider-cheqd, cheqd/did-jwt-vc

We’ve been working hard following our identity wallet demo at Internet Identity Workshop in April 2022 to make this functionality available to every app developer. We’re excited to announce that you can now issue JSON / JWT-encoded Verifiable Credentials using a plugin for the cheqd network we built for Veramo.

Veramo is an existing open-source JavaScript software development kit (SDK) for Verifiable Credentials. We recognise that app developers and our SSI vendor partners have their own preferred SDKs and languages, based on credential formats. We chose to implement the JWT/JSON Verifiable Credentials (VCs) using Veramo since it has a highly-flexible and modular architecture. This allowed us to create a plugin to add support for the cheqd network, without having to rewrite a lot of code for basic DID and VC functionality from scratch.
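
To make the JWT credential format concrete, here is a toy, unsigned example showing the three base64url segments (header.payload.signature) and the claim structure inside the payload. The DIDs and claims are hypothetical, and real credentials issued via the plugin are signed with the issuer’s DID keys rather than carrying a dummy signature:

```javascript
// Encode and decode a toy JWT-shaped credential (Node 16+ for 'base64url').
function b64url(obj) {
  return Buffer.from(JSON.stringify(obj)).toString('base64url');
}

const payload = {
  iss: 'did:cheqd:mainnet:zF7rhDBfUt9d1gJPjx7s1J', // issuer DID (placeholder)
  sub: 'did:key:z6MkexampleHolder',                // holder DID (placeholder)
  vc: {
    type: ['VerifiableCredential'],
    credentialSubject: { over18: true },           // a minimal, selective claim
  },
};

const jwt = `${b64url({ alg: 'EdDSA', typ: 'JWT' })}.${b64url(payload)}.dummy-sig`;

// Verifiers decode the middle segment to read the claims.
const decoded = JSON.parse(Buffer.from(jwt.split('.')[1], 'base64url').toString());
console.log(decoded.vc.credentialSubject.over18); // true
```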

You can try out a reference implementation of how you as an app developer can build your own applications using our Veramo SDK plugin and ledger on the cheqd wallet web app (more on this later).

We’re also working on supporting other popular credential formats, such as AnonCreds as we know there’s interest in this from many of our SSI vendors/partners.

Why is this valuable?

cheqd’s objective is to provide its partners with a highly scalable, performant Layer 1 which can support payment rails and customisable commercial models for Verifiable Credentials. In order to showcase the baseline identity functionality, it was important to build tooling to issue and verify Credentials using an existing SDK. Veramo was a perfect SDK to build an initial plugin for cheqd, due to its already modular architecture.

This repository therefore showcases that cheqd has:

  • Functional capability for signing Credentials using its DID method
  • Functional capability for carrying out DID authentication to verify Credentials
  • Ability for cheqd to plug into existing SDKs, which we hope to expand to other SDKs provided by our partners and the wider SSI community.


In April 2022, we demoed at Internet Identity Workshop (IIW34) in San Francisco to show a non-custodial, recoverable wallet where users can stake, delegate and vote with CHEQ tokens AND hold W3C Verifiable Credentials.

To build the demo wallet, we forked the Lum wallet, an existing Cosmos project. By adding new identity features to an already great foundation, we’ve been able to speed up our journey to get a Verifiable Credential in a web-based wallet.

Whilst we’re expanding this, we’ve open-sourced the cheqd-wallet repo to enable our partners, other SSI vendors and interested developers to:

  1. View and test out the functionality in their own environments, and
  2. To build on, extend and replicate the wallet’s utility in their own software and wallets.

In the coming weeks we’ll be launching some exciting features which will really bring this wallet to life for the @cheqd_io community and beyond… so watch this space!

Try out the @cheqd_io demo yourself, get a Verifiable Credential, and read more about the background on how we built it in our CTO Ankur Banerjee’s Tweetstorm.

Why is this valuable?

So far, this wallet has been used for demo purposes; moving forward, however, we would love to showcase the real value of Verifiable Credentials by issuing our community their own VCs for different reasons. These could all be stored, verified and backed up using the cheqd wallet. Demonstrating our technology in a wallet like this makes it easier for new community members and partners to visualise and understand the value of everything we’re building on the identity front.

🛠️ Oven-ready tooling, infrastructure and analytics packages


Github repositories: cheqd/infra

Over the past months we’ve been implementing various tools to improve performance, speed up node setup and reduce manual effort for our team and external developers as much as possible. We wanted to make installing and running cheqd nodes easy. Therefore, our automation allows people to deploy secure, out-of-the-box configurations efficiently and at low cost.

Terraform: Infra-as-code

We have started using HashiCorp’s Terraform to define consistent and automated workflows — in order to improve efficiency and streamline the process of setting up a node on cheqd. Terraform is a form of Infra-as-code which is essentially the managing and provisioning of infrastructure through code instead of through manual processes. You can think of it like dominos — one click of a button can result in a whole series of outcomes.

This automation gives prospective network Validators the choice of whether they want to just install a validator node (using our install instructions), or whether they want to set up a sentry+validator architecture for more security.

Terragrunt: Infra-as-code

Terragrunt works hand-in-hand with Terraform, making code more modular, reducing repetition and facilitating different configurations of code for different use cases. You can plug in config information like CPU, RAM, static IPs, storage, etc., which speeds things up whilst making the code more modular and reusable.

Through the use of Terragrunt, we are also able to extend our infrastructure to a full suite of supported cloud providers. This matters because our core infrastructure code works directly only with the Hetzner and DigitalOcean cloud providers (chosen for their good balance of cost vs performance). We recognised, however, that many people use AWS or Azure. Terragrunt therefore performs the role of a wrapper, keeping our infrastructure available on Hetzner and DigitalOcean whilst making it easier to utilise with AWS or Azure.

Ansible: Infra-as-code

Ansible allows node operators to update software on their nodes and carry out configuration changes, both during the first install and during subsequent maintenance. In a similar way to Terragrunt, Ansible code can also act as a wrapper, converting the code established via Terragrunt and Terraform into more cross-compatible formats.

Using Ansible, the same configurations created for setting up nodes on cheqd could be packaged in a format consumable by other Cosmos networks. This could have a knock-on benefit for the entire Cosmos ecosystem when running sentry+validator infrastructure.

DataDog: Monitoring

DataDog is a tool that provides monitoring of servers, databases, tools, and services, through a SaaS-based data analytics platform. You can think of it like a task manager on your laptop. Using DataDog we keep an eye on metrics from Tendermint (e.g. if a validator double signs a transaction) and the Cosmos SDK (e.g. transactions / day).

This is valuable to ensure the network runs smoothly & any security vulnerabilities/issues that may impact consensus are quickly resolved.

Cloudflare Teams: Role Management (SSH)

When managing a network it’s important that those building it can gain access when they need it. For this we’ve been using Cloudflare Teams to SSH into one of our nodes.

SSH (Secure Shell) is a communication protocol that enables two computers to communicate by providing password- or public-key-based authentication and encrypted connections between two network endpoints.

This work is important because other Cosmos networks can reuse the role management package to reduce the time spent on configuring their own role management processes for SSH.

HashiCorp Vault: Secret Sharing

Sharing secrets in a secure fashion is vital — for this we’ve used HashiCorp Vault, which offers a script that copies private keys and node keys over to a vault. You can think of this like a LastPass or 1Password for network secrets (e.g. private keys). This way, if for example a node is accidentally deleted and a validator’s private key along with it, it’s easy to restore.

This is hugely valuable for validator operators, who may want an extra layer of security when backing up private keys and sharing keys between people internally. Moreover, through using HashiCorp Vault, we hope to reduce the risk of teams losing their private keys and, with them, the ability to properly manage their nodes.


GitHub repository: cheqd/faucet-ui

The cheqd testnet faucet is a self-serve site that allows app developers and node operators who want to try out our identity functionality or node operations to request test CHEQ tokens, without having to spend money to acquire “real” CHEQ tokens on mainnet.

We built this using Cloudflare Pages as it provides a fast way to create serverless applications which are able to scale up and down dynamically depending on traffic — especially useful for something such as a testnet faucet, which may not receive consistent levels of traffic. The backend for this faucet works using an existing CosmJS faucet app to handle requests, run as a DigitalOcean app wrapped in a Dockerfile.
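To make the request path concrete, here is a minimal sketch of what a faucet handler does, written as a plain Python stand-in rather than the actual Cloudflare Pages/CosmJS code; the function name, amount and rate-limiting approach are all illustrative assumptions.

```python
# Illustrative faucet request handler (assumption: simplified Python stand-in
# for the Cloudflare Pages frontend + CosmJS faucet backend).
FAUCET_AMOUNT = 10_000_000  # illustrative testnet amount, not the real config

def handle_faucet_request(address: str, recently_funded: set) -> dict:
    """Validate the address, rate-limit repeat requests, then dispense."""
    if not address.startswith("cheqd1"):
        return {"ok": False, "error": "not a cheqd address"}
    if address in recently_funded:
        return {"ok": False, "error": "already funded recently"}
    recently_funded.add(address)
    # the real backend hands the request to a CosmJS faucet app,
    # which signs and broadcasts the bank-send transaction
    return {"ok": True, "amount": FAUCET_AMOUNT}
```

The real service does the signing and broadcasting server-side; the sketch only shows the validation and rate-limiting shape of the flow.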


This solution:

  1. Helps to keep the team focused on building, as we no longer need to dedicate time to manually responding to requests for tokens.
  2. Creates a far more cost-effective way of handling testnet token distributions.
  3. Can be utilised by developers to test cheqd functionality far more efficiently.
  4. Can be used by other Cosmos projects to reduce operational overheads and headaches around distributing testnet tokens.


GitHub repositories: cheqd/airdrop-ui (frontend), cheqd/airdrop-distribution (backend)

The airdrop tools, used for our community airdrop rewards site, are split into two repos: one for managing the actual distribution of airdrop rewards to wallets, and another for the frontend itself to handle claims.

In terms of the frontend, we learnt that airdrop reward sites need to be more resilient to traffic spikes than most websites because, when announced, community members will tend to flock to the site to claim their rewards, generating a large spike in traffic, followed by a period of much lower traffic.

This type of traffic pattern can make prepping the server to host airdrop claim websites particularly difficult. For example, many projects will choose to purchase a large server capacity to prevent server lag, whilst others may simply become overwhelmed with the traffic.

To manage this, the frontend site was developed to work with Cloudflare Workers, a serverless and highly-scalable platform so that the airdrop reward site could handle these spikes in demand.

On the backend we also needed to build something that could manage a surge in demand whilst providing a highly scalable and fast way of completing mass distributions. Initially, our implementation struggled with the number of claims, resulting in an excessive wait to receive rewards in the claimant’s wallet. To improve this, we used two separate CosmJS-based Cloudflare Workers scripts: one which lined up claims in three separate queues (or more if we wanted to scale further), and a second distributor script that is instantiated dependent on the number of queues (i.e. three queues would require three distribution workers).
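The two-stage queue/distributor pattern above can be sketched as follows — a hedged Python stand-in for the CosmJS-based Workers scripts, with illustrative names and a round-robin split (an assumption; the real intake logic may differ):

```python
# Sketch of the queue + distributor pattern (illustrative, not the repo code).
from collections import deque

NUM_QUEUES = 3  # scale out by adding queues plus matching distributor workers

def enqueue_claims(claims, num_queues=NUM_QUEUES):
    """Stage 1: the intake script spreads claims round-robin across queues."""
    queues = [deque() for _ in range(num_queues)]
    for i, claim in enumerate(claims):
        queues[i % num_queues].append(claim)
    return queues

def run_distributor(queue):
    """Stage 2: one distributor worker per queue pays out its claims."""
    paid = []
    while queue:
        claim = queue.popleft()
        # here the real worker would sign and broadcast a bank-send tx
        paid.append(claim["address"])
    return paid
```

Because each distributor only drains its own queue, adding a fourth queue (and a fourth worker) scales throughput without the workers contending over one backlog.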

There is no hiding that we ran into some hiccups, in part due to our Cloudflare Worker approach, during our Cosmos Community Mission 2 Airdrop. We have documented all of the issues we ran into during our airdrop and the lessons learnt in our airdrop takeaway blog post. What is important to explain is that:

  1. The reward site using Cloudflare Workers scaled very well in practice, with no hiccups;
  2. We had problems with the way we collated data, but the fundamental Cloudflare Workers infrastructure we ended up with, after having to refactor for our initial mistakes, is battle tested, highly efficient and resilient.


Any project using the Cosmos SDK and looking to carry out an airdrop or community rewards programme can now use our open-sourced frontend UI and distribution repository to ensure a smooth and efficient process for the community, without any hiccups in the server capacity or distribution mechanics.

We would much rather other projects didn’t make the same mistakes we did when we initially started our airdrop process. What we have come away with, in terms of infrastructure and lessons learned, should serve as an example of the dos and don’ts when carrying out a Cosmos-based airdrop.


GitHub repository: cheqd/data-api

We found on our journey that there’s a LOT of stuff we needed APIs for, but couldn’t fetch directly from the base Cosmos SDK.

As Cosmonauts are well aware, the Cosmos SDK offers APIs for built-in modules using gRPC, REST and Tendermint RPC. However, we noticed a few endpoints it can’t provide, so we built them:

  1. Total Supply
  2. Circulating Supply
  3. Vesting Account Balance
  4. Liquid Account Balance
  5. Total Account Balance

This collection of custom APIs can be deployed as a Cloudflare Worker or on compatible serverless platforms.

Further specifics about what these APIs mean can be found within our repository Readme.
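The arithmetic behind these endpoints is straightforward; the sketch below shows the relationships between the five figures, with field names that are illustrative assumptions rather than the repo’s actual schema:

```python
# Hedged sketch of the supply/balance arithmetic (illustrative field names).

def account_balances(total_balance: int, still_vesting: int) -> dict:
    """For a vesting account: liquid balance = total minus tokens still locked."""
    return {
        "total": total_balance,
        "vesting": still_vesting,
        "liquid": total_balance - still_vesting,
    }

def circulating_supply(total_supply: int, vesting_accounts: list) -> int:
    """Circulating supply excludes every account's still-vesting tokens."""
    locked = sum(acc["vesting"] for acc in vesting_accounts)
    return total_supply - locked
```

For example, an account holding 1,000 tokens with 400 still vesting has a liquid balance of 600, and those 400 locked tokens are subtracted from total supply when reporting circulating supply.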

Why is this valuable?

These APIs are useful for multiple reasons:

  1. Applying for listings on exchanges requires many of these APIs upfront
  2. Auditing and analysing the health of a network
  3. Creating forecasts and projections based on network usage
  4. Providing transparency of metrics to the network’s community

Through open-sourcing these APIs, we want to provide an easy way for all other Cosmos projects to track these metrics, hugely reducing the time and energy needed to source them from scratch.


GitHub repository: cheqd/cosmjs-cli-converter

There is an assumption in the Cosmos ecosystem that wallet addresses across different chains, such as Cosmos (ATOM), Osmosis (OSMO) and cheqd (CHEQ), are all identical, because they all look very similar. However, each chain’s wallet address is actually unique.

Interestingly, each network’s wallet address can be derived from the Cosmos Hub wallet address using a common derivation path. Using one derivation path (BIP-44) means that users can use one secret recovery phrase and core account to interact with multiple networks.

Our cross-chain address converter automates the derivation of any chain’s address from a Cosmos address. We’ve seen some examples of this previously, but they are mostly designed for one-off conversions in a browser rather than large-scale batch conversions: our converter can process 200k+ addresses in a few minutes, whereas doing this with existing CLI tools or shell scripts can take hours.
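The reason this conversion is possible at all is that Cosmos-style addresses are bech32 strings: the key-hash payload is identical across chains, and only the human-readable prefix (and therefore the checksum) changes. The sketch below is a from-scratch bech32 re-encoder per BIP-173, written as an assumption-laden illustration — the actual cheqd tool is CosmJS-based, and these function names are ours:

```python
# Minimal bech32 (BIP-173) re-encoding sketch; illustrative, not the repo code.
CHARSET = "qpzry9x8gf2tvdw0s3jn54khce6mua7l"

def _polymod(values):
    """BIP-173 checksum polynomial over 5-bit values."""
    gen = [0x3B6A57B2, 0x26508E6B, 0x1EA119FA, 0x3D4233DD, 0x2A1462B3]
    chk = 1
    for v in values:
        top = chk >> 25
        chk = ((chk & 0x1FFFFFF) << 5) ^ v
        for i in range(5):
            chk ^= gen[i] if ((top >> i) & 1) else 0
    return chk

def _hrp_expand(hrp):
    return [ord(c) >> 5 for c in hrp] + [0] + [ord(c) & 31 for c in hrp]

def bech32_encode(hrp, data):
    polymod = _polymod(_hrp_expand(hrp) + data + [0] * 6) ^ 1
    checksum = [(polymod >> 5 * (5 - i)) & 31 for i in range(6)]
    return hrp + "1" + "".join(CHARSET[d] for d in data + checksum)

def bech32_decode(addr):
    hrp, _, rest = addr.rpartition("1")  # '1' never appears in the data part
    data = [CHARSET.index(c) for c in rest]
    assert _polymod(_hrp_expand(hrp) + data) == 1, "bad checksum"
    return hrp, data[:-6]

def convert_address(addr, new_hrp):
    """Re-encode an address under a new prefix; the payload is unchanged."""
    _, data = bech32_decode(addr)
    return bech32_encode(new_hrp, data)
```

Batch conversion is then just a loop over this pure function, which is why hundreds of thousands of addresses take minutes rather than hours: no network calls or key material are needed, only string re-encoding.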

Why is this valuable?

This is valuable since it can automate airdrops or distributions to any account, just from a Cosmos Hub address in bulk, making data calculations far more efficient.

For new chains in the Cosmos Ecosystem, this makes it much easier for the core team and Cosmonauts to discover and utilise their account addresses and carry out distributions.


Phew! There’s a lot here, but we really want to make sure everything we do for cheqd is useful far beyond our project. Contributing back to the Web3 and SSI community is a shared belief across the cheqd team, and one of our foundational principles.

As always, we’d love to hear your thoughts on our writing and what this means for your company. Feel free to contact the product team directly — [email protected], or alternatively start a thread in either our Slack channel or Discord.

Trusted data explained | the rise of the trusted data economy

Trusted data (or authentic data) is information translated into a form usable by computers, whose source is verifiable — it can be checked through a standardised method to demonstrate accuracy. The trusted data economy takes trusted data one step further, encompassing the business models that can enable a fairer, more transparent and decentralised world.

“Systems that expand the radius of trust change societies”

(Werbach, 2016: 4)

Trust is the underpinning of all human contact and institutional interactions; a crucial value in international affairs and a complex interpersonal and organisational construct, embedded in all areas of society, from individuals’ relationships with each other to the global political system.

It is widely seen as one of the most important synthetic forces within society which encompasses values such as reciprocity, solidarity and cooperation, whilst within areas such as technology, law and governance, it is less of a synthetic value and more of an intrinsic and core construct within contracts, regulation and code.


But what is trust?

At its core, trust is centred on the reliability of an assertion about someone or something; an indisputable, verifiable claim (the operative word here being ‘verifiable’ — the ability to check or demonstrate accuracy).

In a continuously digitised and globalised world, trust has been increasingly hard to nurture, and as events over the past decade have shown, it has fast become a threatened commodity the world over.


The evolution of trusted data

With the mass adoption of the internet, the world has witnessed a rapid acceleration of innovation, and as a result, a diverse range of positive outcomes.

Access to the internet, for example, in a vast portion of the world is now considered a basic human right and an essential component of a functioning society.

However, for the vast majority of people, our reliance on, and near-addiction to, the efficiencies it brings to our lives towers over our curiosity about, and knowledge of, the very real risks it poses — what is being done backstage, behind the warm lure of the glossy frontends.

As we engage in our now phenomenally digitised day-to-day lives, the information about the things we write, say, do and read is packaged up into something we hear about a lot, but don’t often stop to question.

You guessed it.… data.

Data, within the context of computing, is simply information, facts provided or learned about something or someone, translated into a form that is efficient for movement or processing.

Put simply, everything we know about anything is information which can be packaged up and utilised as data.

This page addresses the meeting of the two: trust and data.

Trusted data (we also use the term authentic data interchangeably) is, therefore: information translated into a form usable by computers, whose source is verifiable — it can be checked through a standardised method to demonstrate accuracy.

The need for trusted data

For much of history, we have found methods to demonstrate that information can be trusted, based on the way the issuer of information and the verifier of information agree on what makes something trustworthy.

We trust paper money because we trust the fine details that are imprinted onto each one, which can be verified as being issued by the body that can reliably demonstrate they have the authority to do so.

As a temporary holder of the paper money, one can also conduct their own checks.

A British £20 banknote

For example, the £20, the most used note in Britain, can be verified by checking that:
  1. The hologram image changes between ‘Twenty’ and ‘Pounds’
  2. The foil is gold and blue on the front of the note and silver on the back within the see-through windows
  3. A portrait of the Queen is printed on the window with ‘£20 Bank of England’ printed twice around the edge
  4. A round, purple foil patch contains the letter ‘T’
  5. Under a good-quality ultraviolet light, the number ‘20’ appears in bright red and green on the front of the note
Similarly, if we look at an identity-related example, we trust the information on a passport, driver’s licence or birth certificate because we note other fine details and unique characteristics which a verifier of this information can reliably identify. This model of having a common societal understanding of what is trustworthy and what is not extends across all aspects of information in the physical world, and now deep into the digital world.

However, with the advancement of technology, the ease of access to methods and tools for behaving fraudulently, and a lack of transparency over where information (stored as data) is being held, it is more difficult to actually verify a claim and ultimately be able to reliably state that data is trustworthy.

A world of truly trusted data in Web 3.0 and Decentralised Identity

Without going into too much technical detail of how data is made trusted in Decentralised Identity (you can find out all you need to know on our learn site), some of the underlying principles and basics do help illustrate what a world where trusted data is the norm would look like. In Web 2.0, much of the world’s data is held in huge data centres controlled by a small number of large players, acting as gatekeepers. The term ‘cloud’ has been used effectively to create the feeling that one’s data is just held in the air, the ether, for an individual to call on when they need it; yet in reality, our data is locked up and secured by these large gatekeepers.

Our data is not held, controlled or owned by ourselves.

As a result, our understanding of what goes in and what comes out is limited. An issuer may provide a trusted piece of data, but what happens before this arrives with an individual or a verifier is out of their control.

Likewise, with more sophisticated means of cybercrime and hacking, uncovering whether some data has been tampered with is harder and harder to do.

To get around this, a combination of technologies have come together at a poignant moment across different industries. Within the Decentralised Identity / SSI space, three technologies, in particular, are integral:

  1. Decentralised Identifiers (DIDs)
  2. Verifiable Credentials (VCs)
  3. Blockchain technology (used with SSI as a Verifiable Data Registry) (note: for decentralised identity, blockchain is not strictly required, but it does offer some significant advantages in further improving the level of trust, transparency and the overall efficiencies required for it to flourish)

Decentralised Identifiers (DIDs) and Verifiable Credentials work in tandem as the foundations of Decentralised Identity to ensure data can be trusted. DIDs act as a form of digital stamp or hologram, making it possible to check the authenticity of the information, whilst VCs contain the very information itself that needs to be checked and verified — more on both DIDs and VCs here.

Blockchain is often described as a “trustless” system, meaning that ultimately one does not need some synthetic, indeterminate level of “trust”, as the rules and structures laid out in code provide this.

Although blockchains use complicated technology, which often deters people from further reading, their basic function is quite simple: to provide a distributed yet provably accurate record.

In other words, everyone can maintain a copy of a dynamically-updated ledger, but all those copies remain the same, even without a central administrator or master version.

This approach offers two basic benefits.

First, one can have confidence in transactions without trusting the integrity of any individuals, intermediaries or governments. Data is therefore trustworthy because no party can tamper with it: the data put in is what comes out.

Second, the distributed ledger replaces many private databases that must be reconciled for consistency, thus reducing transaction costs.


The trusted data economy

The trusted data economy takes trusted data one step further.

Over the past two decades, leveraging the buying and selling of data has become a powerful and incredibly profitable business model. It is what has led to the growth and dominance of the behemoths of the internet, with Google and Facebook as the most prominent examples.

By providing a service, totally free for most uses, these companies have quietly deepened their drills into the gold mine of individuals’ data whilst many have unknowingly compromised their privacy and freedoms.

Yet, though this is being exposed more and more, and phrases such as ‘if the product is free, you’re the product’ have become famous, there has still been very little movement and change at a regulatory or social level.

Enter the trusted data economy

The trusted data economy flips this business model entirely on its head, shifting control away from these internet behemoths and over to the individuals.

Through a range of payment models enabled by these technologies, the individual can now be the ultimate gatekeeper and vendor of their identity; able to choose by whom, and for what, their data is used and even sold!

This new data economy of trusted data has been labelled as ‘decentralised identity’ as well as ‘self-sovereign identity (SSI)’ since it directly empowers individuals to have control and engage in trusted interactions in both the physical and digital spheres.

Find out more about what the economy of trusted data might look like in cheqd’s tokenomics for self-sovereign identity.



Transparency, freedom, determination, democratisation — these are all features of what Web 3 and the shift in power promise. However, none of these are truly possible without a new era of data management in which we can have faith in where data resides, who has access to it and who is making money with it.

Through revolutions over time, a small number of people and organisations benefit most and power concentrates in the few. Yet, as time progresses and accessibility to the technologies that enabled that revolution increases, the more the masses can engage and challenge the status quo.

Trusted data is a very real solution for many of today’s problems, and the trusted data economy which enables it to gain mass adoption can make it happen.

Find out more about how we at cheqd are helping usher in the trusted data revolution….

CHEQ tokens are now available on Ethereum


Why we bridged the cheqd network to Ethereum, and our collaboration with Gravity Bridge to achieve it

We’re excited to share that we have now successfully set up a bridge to Ethereum for the cheqd network using the Gravity Bridge. A blockchain bridge or ‘cross-chain bridge’ enables users to transfer assets or any form of data seamlessly from one entirely separate protocol or ecosystem to another. As we build payment rails for digital identity (more on that below), we want to offer issuers, verifiers, and holders a choice on the means of settlement. We believe that widely-adopted stablecoins such as USDC offer price-stable currencies in which to denominate such transactions, and therefore bridging to Ethereum makes obvious sense.

Before we get into why we chose to bridge to Ethereum with the Gravity Bridge, it’s worth sharing a little about what a bridge is, why we felt the time was right for us to do this, and ultimately what are the benefits to the cheqd network, our partners and end-users.

Introducing the CHEQ-ERC20 wrapped token

To create an ERC20 representation of the Cosmos based CHEQ token we’ve used a bridge. A blockchain bridge or ‘cross-chain bridge’ enables users to transfer assets or any form of data seamlessly from one entirely separate protocol or ecosystem to another (i.e. Solana to Ethereum, or in our case Cosmos to Ethereum and vice versa).

Bridges generally use some kind of mint-and-burn function to keep token supply constant across all platforms. When the token leaves one blockchain, it is burned or locked, and an equivalent token is minted on the opposite blockchain. Conversely, the equivalent token is burned or locked when the token moves back to its original network. This equivalent token is known as a ‘wrapped token’ because the original asset is put in a wrapper, a kind of digital vault that allows the wrapped version to be created on another blockchain.
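The mint-and-burn mechanics described above can be captured in a toy model — an illustrative Python sketch of the invariant, not the Gravity Bridge implementation:

```python
# Toy lock-and-mint / burn-and-unlock model (illustrative only).

class Bridge:
    def __init__(self):
        self.locked_native = 0   # native CHEQ held by the bridge contract
        self.wrapped_supply = 0  # CHEQ-ERC20 minted on the destination chain

    def lock_and_mint(self, amount):
        """Native tokens are locked; an equal wrapped amount is minted."""
        self.locked_native += amount
        self.wrapped_supply += amount

    def burn_and_unlock(self, amount):
        """Wrapped tokens are burned; the same amount of native is released."""
        assert amount <= self.wrapped_supply, "cannot burn more than minted"
        self.wrapped_supply -= amount
        self.locked_native -= amount

    def invariant_holds(self):
        # every wrapped token is backed 1:1 by a locked native token,
        # so total effective supply stays constant across both chains
        return self.locked_native == self.wrapped_supply
```

The invariant at the bottom is the whole point of the design: at no moment do wrapped tokens exist without the same quantity of native tokens locked on the other side.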

The CHEQ-ERC20 wrapped token can be found here (you can also add it to your MetaMask wallet through this link — go to profile summary > click ‘more’ > ‘add token to MetaMask’ )


How do I transfer tokens to Ethereum and join a pool?

Good question, we’re glad you asked!

To get started, head over to our learn site, where you’ll find all you need to know about sending tokens to Ethereum.

You’ll also be able to join a CHEQ:USDT pool on UniSwap.

USDT-CHEQ pool on UniSwap

Why did we decide on a bridge to Ethereum?

As we build payment rails for trusted data (more on that below), we want to offer issuers, verifiers (the receivers of trusted data), and holders a choice on the means of settlement. We expect a preference for stablecoins to eliminate the volatility in either pricing or settling payments for trusted data.

More on this in our tokenomics for payment models.

Whilst the Cosmos ecosystem has these, they aren’t as widely adopted yet as either USDC or USDT, both of which are within the Ethereum ecosystem. Furthermore, as we want to work with fiat on and off-ramps to remove the need for end customers to worry about crypto, there are currently more of these available in the Ethereum ecosystem, although we’re sure the Cosmos ecosystem will catch up with new companies joining the likes of Kado.

A nice byproduct of this is providing easier access to CHEQ, whether you’re building upon the network, seeking to secure the network through staking or liquidity mining.

Ethereum was the first ecosystem we bridged to but it certainly won’t be the last.

What is the Gravity Bridge & how does it work?


The Gravity Bridge is a trustless, neutral bridge between the Ethereum and Cosmos ecosystems built by the Althea team. Built using the Cosmos SDK, it uses the validator set to sign transactions instead of a multi-sig or permissioned set of actors.

The neutrality here implies that the entire focus of the Gravity community is on providing the most effective and secure bridge possible instead of on a DeFi application on the local chain. This neutrality aggregates volume from a number of blockchains and sources, increasing efficiency and lowering costs. All control over the bridge is handled entirely by the Gravity Bridge validator set.

The Gravity Bridge has two defined components:

1) A Solidity contract on Ethereum

2) A Cosmos SDK module on the Gravity Bridge blockchain

The way Gravity Bridge works is similar to how all cross-chain bridges work, i.e. locking up a native token on one side of the bridge and minting a representation of that token on the other. The user then uses this representation before it is returned to the bridge and redeemed for the native asset on the other chain.

The most critical component for bridges to and from Ethereum is the Solidity contract. It holds the native assets being sent across the bridge.

Gravity.sol, the Solidity contract developed by the Althea team, holds funds for Gravity Bridge on Ethereum. In contrast to the prevailing trend in other bridge designs, at a mere 580 lines of code, Gravity.sol is compact and easy to review.

It has been audited by three independent teams (Informal, Least Authority, and Code4rena), and it is not upgradeable, meaning it cannot be tampered with by any malicious actor and does not contain any trusted parties of any kind.

To interact with Gravity Bridge, head to (supported by Cosmostation), where you can connect your MetaMask and Keplr wallet.

If you want to learn more, hear from the Gravity team themselves here or head over to the Gravity Docs.

Why did we decide to use the Gravity Bridge?

After exploring the options available, we felt the Gravity Bridge was the most suitable and could help us achieve our results in the fastest and safest way possible.

One of the key things that attracted us to the Gravity Bridge is the way in which the Ethereum contract has been highly optimised, utilising batches to dramatically reduce the cost of transfers between Cosmos and Ethereum.

We also felt the decentralised running of the network was most in line with our vision at cheqd. As it’s a non-custodial solution, a validator’s stake can be slashed for any misbehaviour in accurately bridging assets or secure messages.

In addition, as a native bridge, the value of tokens on the destination chain and the absence of double spending are guaranteed by the same consensus as the origin chain. This means all the validators come to a consensus that the individual owns the tokens registered on both sides (the same amount of tokens exists on both sides of the bridge — i.e. on both chains).

Finally, we also saw the bridge as a powerful way to ensure having a token on another chain doesn’t fracture liquidity, as wrapped tokens through a custodian would.

Potential issues identified

One of the issues raised with the Gravity Bridge concerns the upgrade process. Given the team’s warranted focus on simplicity, the Solidity contract (Gravity.sol) is non-upgradeable, whereas other bridges can be upgraded by multi-sig wallets. This means the validators and their delegators are completely in control of the Gravity Bridge, and no one else can change the code.

Although we recognise this concern, we feel that the validators on the Gravity Bridge chain have a legitimate governance process and have acted in line with our principles to date. We also plan to set up a validator node on the Gravity chain and engage more in the governance and running of the bridge itself.

What other bridging options did we explore?

Whilst beginning our investigation into bridges, we came across varying bridging techniques, mainly in the Cosmos ecosystem, although these remain much the same across protocols.

Although many of the wrapped tokens available require a custodian to manage this minting and burning (aka. centralised or trusted bridges), the more innovative bridges that now exist (aka. noncustodial, decentralised or trustless bridges) do this through automated methods using contracts on either side of the bridge, as we’ll come on to.

As mentioned, our initial priority with a bridge is ease of accessing stablecoins for settling payments for trusted data facilitated on the cheqd network. With this in mind, our bridging requirements at this stage are less complex than they might be in the future (for example, we aren’t necessarily in need of a fully-fledged solution that allows the bridging of smart contracts across protocols, like Evmos will in time enable).

If you’re a project looking at bridging, we’d recommend you check out this great explainer video on bridges put together by Ken Timsit from Cronos (Cronos also plans to use the Gravity Bridge, launching in Q2).


Evmos is a secondary bridge leveraging a third party — the Connext peer-to-peer bridge. Connext acts as an intermediary to provide liquidity on both sides by locking up purchased tokens and adding these as liquidity on each chain.

Essentially, when using Evmos you don’t have an Ethereum contract (like Gravity does with Gravity.sol); it uses the existing contracts of the tokens that exist on the other side. Therefore, given that other third parties provide the liquidity, we felt this would be a greater overhead for our team compared to Gravity, and more centralised.

That said, we are excited by Evmos and the great team working on this project, who look to have an exciting year ahead with four different DEXs, a lending protocol, two perpetual platforms and at least three NFT collections for the NFT marketplaces.


Connext is a peer-to-peer bridge / cross-chain swap. It uses a similar method to the Hashed Timelock Contract, a transactional agreement used on the Bitcoin network to produce conditional payments wherein the receiver or the beneficiary must acknowledge the receipt of payment before a predetermined time or a preset deadline.

Basically, when you want to transfer tokens, there is one router which will be assigned and takes the responsibility of fronting the liquidity. In exchange, they will wait to get paid after the transaction is completed on Ethereum with the same amount of tokens.

Connext’s network utilises nxtp, a lightweight protocol for generalised cross-chain transfers. Nxtp is made up of a simple contract that uses a locking pattern (mentioned above) to prepare and fulfil transactions, a network of off-chain routers that participate in pricing auctions and pass calldata between chains and a user-side SDK that finds routes and prompts on-chain transactions.
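The hashed-timelock pattern underlying this kind of swap can be sketched in a few lines — a simplified single-contract Python model of the idea, not nxtp or the Bitcoin HTLC script itself; the class and field names are illustrative:

```python
# Simplified hashed-timelock contract (HTLC) model; illustrative only.
import hashlib

class HTLC:
    def __init__(self, amount, hashlock, deadline):
        self.amount = amount
        self.hashlock = hashlock  # sha256 hex digest of a secret preimage
        self.deadline = deadline  # time after which the sender can refund
        self.settled = False

    def claim(self, preimage: bytes, now: float) -> bool:
        """Receiver unlocks the funds by revealing the preimage before expiry."""
        if now >= self.deadline or self.settled:
            return False
        if hashlib.sha256(preimage).hexdigest() != self.hashlock:
            return False
        self.settled = True
        return True

# example: lock funds against the hash of a secret only the receiver learns
secret = b"my-preimage"
lock = HTLC(100, hashlib.sha256(secret).hexdigest(), deadline=2_000_000_000)
```

The conditional structure is what makes the swap trust-minimised: the router fronting liquidity is only paid once the matching preimage is revealed, and the sender can reclaim funds after the deadline if it never is.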


Thorchain is a blockchain protocol built on Cosmos that aims to “make all of crypto liquid”. It seeks to do this by enabling the trading of non-native crypto assets, such as trading Bitcoin for Ethereum, but in a completely decentralised way. In essence, it does much of what Coinbase and Binance do — but without a third party ever taking control of the funds.

The Thorchain protocol also powers a decentralised exchange (DEX) by the same name. Like Uniswap or SushiSwap, the Thorchain DEX allows anyone to trade or lend their crypto assets by providing liquidity to an asset pool and, in exchange, earn a return (or “yield”) on those assets. With 1.5 million transactions to date and >80 validators, it is a leading solution, however, for our use case, it just didn’t offer the simplicity and ease of enabling a $CHEQ token on Ethereum.

Since we completed our investigation, Osmosis has also spent some time exploring how they plan to bridge their DEX’s requirements. Axelar, Wormhole and Nomad were also discussed there, all options we came across but investigated less at the time. You can also find the RFPs for these bridge proposals here: Axelar, Gravity Bridge, Nomad, Wormhole.

A brilliant panel with some of the bridge providers can be found here, and the write-up is also available here. The Osmosis Discord has had perhaps the liveliest debate since Robo McGobo created a special channel for bridge discussions, and representatives from the various teams have been quite responsive there.

Looking forward: the cheqd <> Gravity Bridge Partnership

We’re excited to be working with the Gravity Bridge / Althea team to explore more ways in which our networks complement each other, as ultimately, they both seek to offer seamless experiences in DeFi & Web 3.0 for users. Watch out for our upcoming blogs around “Seamless experiences, powered by cheqd and Gravity Bridge”.

Huge thanks to the Gravity Bridge and Althea teams for building this out — we’re thrilled to reach this important milestone for cheqd and play a part in the future of the Gravity Bridge.

Happy bridging — all aboard the Gravity express!

Liquidity Pools explained — what, why, and how…


Liquidity pools are an innovative solution within DeFi to create the mechanics of a market maker in a decentralised fashion. Although often met with confusion, they are simply clusters of tokens with pre-determined weights.

A token’s weight is how much its value accounts for the total value within the pool. Liquidity Pools are an exciting and equalising tool, which represent the true nature of the Decentralised Finance (DeFi) and Web3.0 movement.

This blog will offer some insight into Liquidity Pools.

It will first take you through what they are and why they exist, followed by how they work to create an environment that incentivises contribution. It will then explore why you may be interested in engaging with them and, finally, how to get involved.


What is a liquidity pool?

In a previous blog post, we outlined where liquidity pools derived from, which we’d recommend reading first if you haven’t already.

At a high level, liquidity pools are a method of increasing liquidity, similar to the way traditional exchanges use market makers.

Yet where traditional finance requires expensive, centralised intermediaries with the power to manipulate prices, liquidity pools offer a decentralised alternative through automated market makers (AMMs): anybody can contribute to a pool that behaves like a market maker. The pool is essentially a shared market maker, the gains from which are distributed between those who contribute.

This both embodies the ideals of blockchain and decentralisation generally, and offers users and companies unique opportunities to trade more efficiently and cheaply whilst having total trust in the system that makes it so. Before liquidity pools arrived on the scene, liquidity, i.e. how easily one asset can be converted into another (often fiat currency) without affecting its market price, was difficult for DEXs to achieve.

How do they work?

In order for Liquidity Pools to function in the way that leads to the outcomes laid out above, i.e. greater decentralisation of projects and increasing liquidity, there are a number of key aspects worth understanding:

  • token weighting;
  • pricing;
  • market-making functions;
  • LP tokens.

(much of the following is taken from the official documentation from Osmosis Labs).

Token weight

Liquidity pools are simply clusters of tokens with pre-determined weights. A token’s weight is how much its value accounts for the total value within the pool.

For example, Uniswap pools involve two tokens with 50–50 weights. The total value of Asset A must remain equal to the total value of Asset B. Other token weights are possible, such as 90–10.


With fixed predetermined token weights, it is possible for AMMs to achieve deterministic pricing, i.e. outcomes are precisely determined through known relationships among states and events, without any room for random variation. As a result, tokens in LPs maintain their value relative to one another, even as the number of tokens within the pool changes. Prices adjust so that the relative value between tokens remains equal.

For example, in a pool with 50–50 weights between Asset A and Asset B, a large buy of Asset A results in fewer Asset A tokens in the pool. There are now more Asset B tokens in the pool than before. The price of Asset A increases so that the remaining Asset A tokens remain equal in value to the total value of Asset B tokens in the pool.

Consequently, the cost of each trade is based on how much it disrupts the ratio of assets within the pool. Traders prefer deep, liquid pools because each order tends to involve only a small percentage of assets within the pool. In small pools, a single order can cause dramatic price swings; it is much more difficult to purchase, say, 1,000 ATOM from a liquidity pool holding 2,000 ATOM than from a pool holding 2,000,000 ATOM.

Market-Making Functions

AMMs leverage a formula that decides how assets are priced in the pool. Many AMMs utilise the Constant Product Market Maker model (x * y = k). This design requires that the product (k) of the pool’s two reserves remains constant: the amount of Asset A (x) multiplied by the amount of Asset B (y).
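As a sketch of how the constant-product rule prices a trade (our own illustration, not Osmosis’s implementation, and ignoring swap fees):

```python
def swap_out(x, y, dx):
    """Tokens of asset Y received for depositing dx of asset X into a
    constant-product pool with reserves (x, y), ignoring swap fees.
    The invariant x * y = k pins down the new Y reserve."""
    k = x * y
    new_y = k / (x + dx)
    return y - new_y

# The same 1,000-token trade against a shallow and a deep pool
# (both start at an identical 1:1 spot price):
print(swap_out(2_000, 2_000, 1_000))          # ~666.7: heavy slippage
print(swap_out(2_000_000, 2_000_000, 1_000))  # ~999.5: near the spot price
```

This is exactly the depth effect described above: the deeper the pool, the closer a trade executes to the spot price.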

Other market-making functions also exist; you can find out more about these here.

Liquidity Pool Tokens (LP tokens)

When a user deposits assets into a Liquidity Pool, they receive LP tokens. These represent their share of the total pool.

For example, if Pool #1 is the OSMO<>ATOM pool, users can deposit OSMO and ATOM tokens into the pool and receive back Pool1 share tokens. These tokens do not correspond to an exact quantity of tokens, but rather the proportional ownership of the pool. When users remove their liquidity from the pool, they get back the percentage of liquidity that their LP tokens represent.
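A minimal sketch of that pro-rata share accounting (our own illustration; real pools track two reserves, swap fees and more):

```python
class Pool:
    """Minimal LP-share accounting: shares are minted pro-rata on deposit
    and redeemed for the same fraction of the pool on withdrawal."""

    def __init__(self):
        self.total_shares = 0.0
        self.assets = 0.0  # pooled value (both sides combined, for simplicity)

    def deposit(self, value):
        if self.total_shares == 0:
            minted = value  # first depositor sets the initial share price
        else:
            minted = self.total_shares * value / self.assets
        self.total_shares += minted
        self.assets += value
        return minted

    def withdraw(self, shares):
        value = self.assets * shares / self.total_shares
        self.total_shares -= shares
        self.assets -= value
        return value

pool = Pool()
a = pool.deposit(100)    # first provider owns 100% of the pool
b = pool.deposit(300)    # second provider now owns 75%
print(pool.withdraw(b))  # redeeming those shares returns 300.0
```

Note that a provider’s LP tokens track a percentage of the pool, not a fixed token count, which is why the value redeemed can differ from the value deposited.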

Source: Osmosis Labs docs

Why should I care?

Together, the mechanisms above ensure that liquidity pools maintain a stable price and ultimately work as a traditional market maker would.

However, to achieve these goals, token holders must be encouraged to provide liquidity to pools.

The incentives in place to do so are known as ‘liquidity mining’ or ‘yield farming’. Contributing to a pool makes an individual a liquidity provider (LP).

Liquidity mining

Liquidity providers earn through fees and special pool rewards. LP rewards come from swaps that occur in the pool and are distributed among the LPs in proportion to their shares of the pool’s total liquidity. So where do the rewards themselves come from?
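That pro-rata split of swap fees can be sketched as follows (hypothetical helper, illustration only):

```python
def distribute_fees(fee_total, shares):
    """Split a period's swap fees among LPs in proportion to their pool
    shares. `shares` maps provider -> LP tokens held."""
    total = sum(shares.values())
    return {lp: fee_total * s / total for lp, s in shares.items()}

# 90 tokens of fees, with Bob holding twice Alice's share of the pool:
rewards = distribute_fees(90, {"alice": 100, "bob": 200})
print(rewards)  # {'alice': 30.0, 'bob': 60.0}
```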

Liquidity rewards are derived from the parameters laid out at the genesis of the AMM; in the case of the Cosmos ecosystem, this is Osmosis. On Osmosis, 45% of the tokens released each day go towards liquidity mining incentives.

When a liquidity provider bonds their tokens they become eligible for the OSMO rewards. On top of this, the Osmosis community decides on the allocation of rewards to a specific bonded liquidity gauge through a governance vote.

Bonded Liquidity Gauges

Bonded Liquidity Gauges are mechanisms for distributing liquidity incentives to LP tokens that have been bonded for a minimum amount of time. For instance, a 1-week gauge for Pool 1 LP shares would distribute rewards to users who have bonded Pool 1 LP tokens for one week or longer. The amount each user receives is in proportion to the number of their bonded tokens.

The rewards earned from liquidity mining are not subject to unbonding. Rewards are liquid and transferable immediately. Only the principal bonded shares are subject to the unbonding period.

However, as with any opportunity for gain, there is of course some degree of risk; i.e. an individual could be better off holding the tokens rather than supplying them.

This outcome is called impermanent loss and essentially describes the difference in net worth between HODLing and LPing (more here). The liquidity mining mentioned above helps to offset impermanent loss for LPs. There are also other initiatives, within the Osmosis ecosystem and beyond, exploring mechanisms to reduce impermanent loss.
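For a 50–50 constant-product pool, the gap between LPing and simply holding has a well-known closed form; a quick sketch (ignoring fees and liquidity mining rewards):

```python
from math import sqrt

def impermanent_loss(price_ratio):
    """Relative loss of a 50-50 constant-product LP position versus simply
    holding, when one asset's price changes by `price_ratio` (e.g. 2.0 = 2x).
    Derived from the x * y = k invariant; always <= 0."""
    return 2 * sqrt(price_ratio) / (1 + price_ratio) - 1

print(impermanent_loss(1.0))            # 0.0: no price change, no loss
print(round(impermanent_loss(2.0), 4))  # ~ -0.0572, i.e. ~5.7% behind HODLing
```

Because the loss only materialises when liquidity is withdrawn after a price move, it is “impermanent”: if prices return to their original ratio, the loss disappears.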

How do I get involved in liquidity pools?

So you’re sold on their potential and now you want to get involved?

Liquidity pools can be accessed across DeFi, whether in the Ethereum ecosystem using Uniswap and SushiSwap, or closer to home for cheqd in Cosmos, through Osmosis and Emeris.

For the purposes of this article, we’ll share how to get involved using Osmosis.

First, head to Osmosis and click ‘Enter the lab’. Once you’ve agreed to the terms and you’re ‘in the lab’, you’ll see some trading pairs and a button to connect your wallet (bottom left of the dashboard).


You can then select Keplr wallet which will automatically connect to your Keplr wallet if you’ve already set it up as a Browser extension.


Next, you’ll need to deposit the assets you would like to contribute towards a Liquidity Pool. You can see the available Liquidity Pools under ‘Pools’. For example, if you would like to contribute to the Pool #602 : CHEQ / OSMO, you will need to deposit both of these tokens.

To do so, select ‘Assets’ and find the tokens you would like to deposit to contribute to the pool.

Note: if you already hold OSMO in your Keplr wallet you won’t be required to deposit.

Once you have deposited enough tokens for both sides of the pool (i.e. if the pool is set up as 50:50, ensure you hold the equivalent value in USD on each side), you’re ready to add liquidity.

Next, find your pool and select ‘Add/ Remove Liquidity’.

Here you’ll be able to add tokens on both sides of the pool.

On selecting ‘Add Liquidity’ you’ll then be directed back to Keplr to approve the transaction (a small fee is required).

Once you have added liquidity to the pool, you’ll receive your LP tokens (a token representing your share of the total pool). Now it’s time to start ‘Liquidity Mining’.

You’ll now be able to see your total Available LP tokens. Below this you’ll see an option to ‘Start Earning’.

Once here you’ll see a few options for your unbonding period (i.e. the number of days it takes to remove your tokens from the pool if you decide to withdraw). The longer you choose to bond your tokens, the higher the rewards you’ll be eligible to earn.

Next select the amount of your LP tokens you’d like to contribute to the pool and finally hit ‘Bond’ (this will kick off another approval through a Keplr pop-up).

You’ll now see your total bonded tokens. Each day rewards will then be distributed. When you decide to withdraw from the pool you’ll simply need to select ‘Remove Liquidity’ and select the amount you’d like to withdraw.


Overall, liquidity pools offer a new avenue for projects to gain more liquidity, and believers of these to show their support. Where for many years engaging in and benefiting from such financial systems was reserved solely for the wealthiest individuals and large organisations, now anyone can gain access and start contributing to their favourite projects, voting on their future direction and earning from the part they play.

Note: for the purpose of engaging with cheqd through its token the information above is not required, however, we strongly believe in the value of educating and sharing what we’re learning with our community to help you better understand DeFi and support us in raising the awareness of the shift to Web 3.0.

You’ve seen our Product Vision for 2022… now we want to hear from you!

Product Vision for 2022 Part 2

Co-authored by Ross Power and Ankur Banerjee

Last week we shared our Product Vision for 2022 where we broke down our product development for the year into three focus areas:

  1. Identity: Core identity functionality for our partners to build compelling self-sovereign identity use-cases on top of the cheqd network.
  2. Web 3.0 Core: Core Web 3.0 functionality that adds deeper integration for our network and token into the Cosmos and other blockchain ecosystems.
  3. Web 3.0 Exploratory: Emerging Web 3.0 use-cases such as decentralised exchanges (DEX) ecosystems; decentralised autonomous organisations (DAOs); identity for non-fungible tokens (NFTs), and in general, DeFi applications.

If you missed it you can see it here.


Now we want to hear from you… launching cheqd’s 2022 Product Roadmap Survey

We strongly believe in the importance of our Product Roadmap being informed by the ultimate end users of the cheqd network; whether this is our immediate SSI Partners, their customers, or even the long-term end-users of SSI and the cheqd network.

As such, the purpose of this survey is to gain a greater understanding of the hopes, needs and wants of our community, developers, and partners as we build our network for incentivised decentralised digital identity.

Once we have gathered responses we’ll work through our backlog using our existing understanding and assumptions, informed by the feedback we receive, to prioritise our next steps using the framework below (at a high level).

Objectives of the Product Roadmap Survey

  1. Network Utility: To gain a deeper understanding of our SSI vendors’ needs, which will subsequently inform how we develop the network utility and token utility. By understanding our SSI vendors’ customers’ needs, we’ll be able to more effectively assimilate ideas and find common needs that meet the majority.
  2. Interoperability: To align the network and token utility, to the best of our ability, with open standards to maximise technical, semantic, economic and legal interoperability. On this point, we’re excited to share that cheqd is now an official ToIP-supported Public Utility!
  3. Vendors’ customer research: To help inform and test our assumptions relating to the more strategically significant milestones in the SSI space in terms of the actual end customers/end users.
  4. Partnership Commitment: To fulfil our intentions of creating a truly collaborative and engaged partnership ecosystem that can effectively provide direction to a network that they will ultimately be the beneficiaries of, and channel to end-users.

The Survey

If you’d like to take part you can complete the survey here. It should take 5–7 minutes to complete and will close on Friday 11th February 12:00 UTC.

As a thank you for your contributions, 20 randomly-chosen respondents will receive a surprise gift from cheqd 🤫. To allow us to get in touch to give you the surprise, you can optionally opt-in at the end of the survey to be considered for the random draw. Please be sure to let us know any additional feedback you have in the free-text questions of the survey if you feel we’ve missed anything or have anything you’d like to add. We look forward to sharing the results with you in the near future!

Tell us what you think!

In addition to the Product Roadmap Survey, we welcome engagement and feedback across a range of different forums, such as our Community Slack and Governance Framework discussion board (best for extended, in-depth discussions), or right here on Medium.

We, at cheqd, help companies leverage SSI. cheqd’s network is built on a blockchain with a dedicated token for payment, which enables new business models for verifiers, holders and issuers. In these business models, verifiable credentials are exchanged in a trusted, reusable, safer, and cheaper way — alongside a customisable fee. Find out more about cheqd’s solutions for self-sovereign identity (SSI).

We’re launching our network very soon! Here’s how you can get ready…


Update: Since this blog has been published, we’ve successfully launched cheqd mainnet and updated our tokenomics. However, if you’re looking to hodl $CHEQ and participate in our governance votes, all the information below is still relevant.


Disclaimer: All information provided is intended to help users get set up on cheqd. However, we do not expressly recommend or mandate a certain approach. All actions taken are your personal responsibility. Much of the content of this article was created using the Keplr FAQ, Keplr’s account creation blog and wallet setup guide, which you should cross-check for further information.

In the coming weeks, we’ll be launching our network. This article will provide you with a few steps on how you can set up your Cosmos wallet.

As the cheqd network has been built on Cosmos, we’ll be using wallets that integrate directly with the Cosmos ecosystem and our initial Decentralised Exchanges (DEX) — we’ll share more information on this at a later time.

Here’s where Keplr comes in…

What is Keplr?

Keplr is a Cosmos wallet that has been praised for putting the user at the heart of the experience. As the first and leading Inter-Blockchain Communication (IBC) enabled wallet for the Cosmos ecosystem, Keplr offers users the ability to stake their tokens, use blockchain apps and manage multiple tokens in one wallet.

Slow down… IBC? IBC, or Inter-Blockchain Communication Protocol, is an interoperability protocol for relaying messages between different Cosmos chains, launched earlier this year. Essentially it allows users to complete token transfers between various chains on the Cosmos Hub. As Keplr is IBC enabled, users can execute transfers of tokens within the wallet or through connected DEXs (we’ll come to this in the next piece).

Now back to Keplr.

For those a little newer to crypto, Keplr is a software wallet, meaning it can be installed on a mobile device or as a browser extension in Chrome or Safari, for example. Although less secure than hardware wallets (like a Ledger), paper wallets, or backups, software wallets offer a much more convenient experience, as they allow users to store their mnemonic locally on a computer.

“What is a mnemonic?” I hear you say…

Good question.

A mnemonic (said with a silent m, “nuh-mon-ic”, if you really want to impress your crypto friends), mnemonic phrase, mnemonic seed, or simply seed phrase is a string of 12–24 words that represents the private key to your wallet. When used in the correct sequence, they give users access to the cryptocurrencies stored within a wallet in the same way a private key does. As private keys are generally made up of a combination of letters and numbers that is difficult to decipher, mnemonics make the private key human-readable and memorable.
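Under the BIP-39 standard, which Cosmos wallets such as Keplr broadly follow, the wallet’s seed is derived from the mnemonic with PBKDF2. A sketch (for illustration only; never paste a real mnemonic into ad-hoc scripts):

```python
import hashlib
import unicodedata

def mnemonic_to_seed(mnemonic: str, passphrase: str = "") -> bytes:
    """BIP-39 seed derivation: PBKDF2-HMAC-SHA512 over the NFKD-normalised
    mnemonic, salted with 'mnemonic' + passphrase, 2048 iterations."""
    words = unicodedata.normalize("NFKD", mnemonic).encode()
    salt = unicodedata.normalize("NFKD", "mnemonic" + passphrase).encode()
    return hashlib.pbkdf2_hmac("sha512", words, salt, 2048)

# A throwaway example phrase (do NOT use this to hold funds):
seed = mnemonic_to_seed("legal winner thank year wave sausage worth "
                        "useful legal winner thank yellow")
print(len(seed))  # 64-byte seed, from which the wallet's keys are derived
```

The same words in the same order always produce the same seed, which is why the phrase alone is enough to restore a wallet on a new device.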


We all have childhood memories of creating easy-to-remember phrases to recall things like the planets in our Solar System before crypto was ever invented and when Pluto still had a seat at the Solar System table.


The fancy word ‘mnemonic’ is exactly this. Literally, it refers to a memory aid like a rhyme, abbreviation or song that helps to remember something else. Ironically, the word itself is much less easy to remember.

In the context of digital assets, mnemonics are used to protect your wallet in the event that the computer it was running on dies, or is lost or stolen. The mnemonic, usually 12 words, can be typed in the sequence it was created to restore the wallet and your access on a new device. In this case, you will not need to store a separate copy of your mnemonic, as the Keplr application itself manages this for you.


Setting up a Keplr Wallet on PC/Mac

  1. Go to the Keplr website to find the relevant extension for your browser. If you’re using Chrome, which we’d recommend, click here → browser extension
  2. Once you’ve installed Keplr wallet, you’ll see four options to set up an account:

1) Google Single Sign On (SSO) / One-Click Login,

2) Create a new account by setting up a new seed/mnemonic phrase,

3) Import an existing account, or

4) Import your Ledger account

In this guide, we’ll assume you have not used Keplr before and will therefore take you through options 1 and 2.


Note: The Keplr team will never reach out and ask you to validate your wallet or type your mnemonics. Be careful of scammers.

Option 1: Google SSO / One-Click Login

For the highest level of security when using Keplr, we’d recommend this option. With Google SSO, you can enable Two Factor Authentication (TFA), which enhances your security through adding a second factor. For example, you can use SMS (NOT recommended), a software authenticator, or a hardware key like Yubikey or Google Titan Key.

  1. Select Sign In with Google
  2. Enter an account name (this can be changed later) and a password
  1. Next, you’ll be prompted to sign in to your Google account with your email or phone (and later password). Click [Next].
  2. And that’s it! You now have a Keplr wallet.

Note: By using Google SSO, you will not have to store your mnemonic as your access is managed by Google. With the extension, you can access your private key at any time; however, you will not see a mnemonic. To do this, click on your profile icon and select the account, and use the three dots to select the information you want to see (you’ll need your password to do this).

Option 2: Create an Account via Secret Seed/mnemonic Phrase

  1. Clicking on the Keplr browser icon for the first time will take you to the account setup page. Choose the option [Create new account].

2. The next page shows you your secret seed/mnemonic phrase. You can select a 12 or 24-word phrase. Save this phrase in a secure place and do not lose it. (You’ll need to input this phrase on the following page.)


3. Next, enter a name for your account and password (you can change these later). Click on [Next].


4. To confirm the creation of this new account, you’ll need to click on the words in the right order in which they appear in your seed/mnemonic phrase and press [Register].


5. Congratulations! 🎉 You are now the owner of a Keplr wallet account and are ready to explore the interchain.

Copying a wallet address from Keplr

To receive tokens into your Cosmos wallet, whether you are depositing yourself from another wallet or you are required to do so to receive tokens from another party, you will need to provide a wallet address.

To do this, open the Keplr app through the browser extension by clicking on the small ‘K’ icon. If this does not come up, click the puzzle icon and find the Keplr app there (you can pin it to your browser to access your wallet more easily in the future).


Once you’re in, you’ll automatically see the Cosmos wallet, which holds ATOM tokens.


To provide a wallet address for Cosmos, please make sure you are within this wallet. Below your name, you will see an address beginning with ‘cosmos1’ followed by numbers and letters. To copy this, click on the wallet address itself, and you will see a pop-up stating ‘Address copied!’


This will now be the most recent information you have copied. When you are ready to paste, either right-click your mouse and select paste, or use ‘ctrl-v’.

As a final step, always ensure you have double-checked the address by cross-checking the final four digits of the address you have pasted against your address in your wallet.
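That final-characters check is easy to automate; a tiny sketch (hypothetical helper, with a made-up address):

```python
def address_looks_right(pasted: str, shown: str) -> bool:
    """Sanity-check a pasted Cosmos address against the one shown in your
    wallet: correct bech32 prefix and identical final four characters."""
    return pasted.startswith("cosmos1") and pasted[-4:] == shown[-4:]

addr = "cosmos1qy352eufqy352eufqy352eufqy35qqqz9ayrkz"  # made-up example
print(address_looks_right(addr, addr))           # True
print(address_looks_right("cosmos1abcd", addr))  # False: tails differ
```

This is only a sanity check against paste errors, not a full bech32 checksum validation, which wallets perform for you.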

Note: this method works for Cosmos, cheqd and all other tokens building on Cosmos, which appear in the dropdown menu. If you’ve been asked to provide a Cosmos wallet address by the cheqd team, you’ll be guided on how to view this in your Keplr wallet at a later date (this is pending CHEQ being listed on Keplr).

Setting up multiple accounts on Keplr

In some cases, you may be required to set up multiple accounts on Keplr. You can do this within the Keplr extension you have set up in the previous steps:

  1. Select the Keplr extension from your browser
  2. Select your profile icon on the top right of the pop-up
  3. Select ‘Add account’. This will take you to the start of the process above. Please follow the same instructions to set up your additional account

Setting up a Keplr Wallet on your smartphone

Although we’d highly recommend following the instructions above (i.e. setting up your wallet as a browser extension) as this will provide a much more seamless experience when connecting to Decentralised Exchanges, you can also log in using your phone.

If you have done the above already, once you have downloaded the Keplr app, when logging in, you will be required to provide either the mnemonic seed OR you can log in through Google SSO.

If it is your first time and you would prefer to use your phone, you can follow the steps above as a guide and set up your wallet directly through the app (always remember to store your mnemonic safely!)

Adding the cheqd network to your Keplr wallet

To see $CHEQ in your Keplr wallet, you’ll need to follow the steps below, which enable Keplr to auto-discover the cheqd network.

  1. Sign in to your Keplr wallet on your browser (do not use the mobile app)
  2. Go to our dashboard at
  3. Click “Connect” in the top right corner to link your Keplr to the dashboard
  4. You will see a pop-up that asks you to approve adding cheqd-mainnet-1 to your Keplr wallet.
  5. Click “Approve”.
  6. On the Keplr extension in your browser, click the drop-down menu at the top. Scroll down to below “--- Beta support ---”, where you will find the cheqd wallet.
  7. You will now be able to see your balance in $CHEQ.
  8. See the link for further instructions on sending $CHEQ, staking and other information, with helpful screenshots.

A reminder of best practices

  • Always save your mnemonic or private key.
  • Don’t select and copy keys manually with ctrl-c. Instead, use the built-in copy function within wallets.
  • Store multiple copies of your mnemonic in secure locations to which you won’t lose access!
  • Never share your private keys, mnemonic seed or password with anyone.
  • Never screenshot your mnemonic seed, as you may forget to delete these.

Get ready...

If you haven’t already joined our Telegram group, follow us on Twitter and sign up for a surprise.


Update: In the meantime, take a look at the cheqd journey so far to get up to speed on where we are, where we are headed next, and how you can get involved. Our next stop is payment rails.