cheqd’s “open-source-a-thon” to contribute new tools to the identity & Cosmos SDK community


Co-written by Ankur Banerjee (CTO/co-founder), Ross Power (Product Manager), and Alex Tweeddale (Governance Lead)

Overview

As strong proponents of open-source software, the cheqd Engineering & Product team has spent much of the past month polishing and open-sourcing products we’ve been developing for decentralised identity and the Cosmos SDK framework. Some of these tools are core to delivering our product roadmap vision, while others are tools we built for internal use and believe will be valuable to developers outside our company.

Most of the Cosmos SDK blockchain framework, as well as self-sovereign identity (SSI) code, is built by community-led efforts in which developers open source their code and make it available for free. Or at least, that’s how it’s supposed to work. In practice, unfortunately, very few companies or developers contribute code “upstream” or make it available to others, leading to the “one random person in Nebraska” problem.

While the crypto markets have taken a hit, we believe there is no better time to get our heads down to build, innovate, and iterate on new products and tools.

Our intention is to enable others to benefit from our work. For each product or tool that we are releasing under an open source license (Apache 2.0), we explain what the unique value proposition is, and which audience could benefit from the work we have done.

Our work is largely split between:

  1. Core identity functionality which is integral to our product and work in the Self-Sovereign Identity ecosystem; and
  2. Helpful tooling, infrastructure and analytics packages for Cosmos SDK, which can be easily adopted by other Cosmos/IBC chain developers.

🆔 Core Identity Functionality

🔍 EASILY READ DECENTRALIZED IDENTIFIERS (DIDS) ON CHEQD NETWORK

Github repository: cheqd/did-resolver

We released the cheqd DID method in late 2021, creating a way for any person to write permanent and tamper-proof identifiers that act as the root of trust for issuing digital credentials. While the functionality to read/write Decentralized Identifiers (DIDs) has existed since the beginning of the network, such as when we published the first DID on our network or ran an Easter egg hunt for our community with clues hidden in DIDs, we wanted to go further and provide a “resolver” or “reader” software that makes it easy for app developers to use such functionality themselves. This was one of our primary goals for the first half of 2022.

Having DID resolver software is important since reading DIDs from a network happens disproportionately more often than writing DIDs to a network. Think about the act of getting a Twitter username or a domain name: you sign up and create the handle once, and then use it as many times as you want to send new tweets, or to publish new web pages on your domain. This is similar to writing a DID once, and using it to publish digitally verifiable credentials many times. The recipients of those credentials can use a DID resolver to read the keys written on-ledger and verify that the credentials they were issued are untampered.

Many decentralised digital identity projects use the Decentralized Identity Foundation (DIF) Universal Resolver project to carry out DID reads/resolution. Therefore, we have made a cheqd DID Resolver available under an open-source license, and we’re working with DIF to get this integrated upstream into the Universal Resolver project.
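For a flavour of what this enables, here is a minimal sketch of resolving a cheqd DID over the Universal Resolver-style HTTP interface (GET /1.0/identifiers/{did}). The resolver endpoint and the DID below are placeholders for illustration, not a published service:

```typescript
// Minimal sketch: resolving a cheqd DID over the DIF Universal Resolver-style
// HTTP interface. The resolver URL and the DID below are placeholders.
const RESOLVER_URL = "https://resolver.example.com/1.0/identifiers";

async function resolveDid(did: string): Promise<unknown> {
  const response = await fetch(`${RESOLVER_URL}/${encodeURIComponent(did)}`);
  if (!response.ok) {
    throw new Error(`DID resolution failed with HTTP ${response.status}`);
  }
  // The result contains the DID document plus resolution metadata
  return response.json();
}

resolveDid("did:cheqd:mainnet:zExampleDid123").then(console.log);
```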

The DID resolver we’ve made available is a “full” profile resolver, which is written in Golang (same as our ledger/node software) and can do authenticated/verified data reads from any node on the cheqd network over its gRPC API endpoint. This will likely be required by app developers or partners looking at processing high volumes of DID resolution requests since they can pull the data from their own cheqd node, and/or if they want the highest levels of assurance that the data pulled was untampered.

We also plan on releasing a “light” profile DID Resolver, built as a tiny Node.js application designed to run on Cloudflare Workers (a serverless hosting platform with extremely quick “cold start” times). This will allow app developers who don’t want to run a full node + full DID Resolver to be able to run their own, extremely scalable, and lightweight infrastructure for servicing DID read requests.

In short, this architecture improves:

  • Accessibility to cheqd DIDs
  • Flexibility, offering app developers and partners optionality and choice of platforms to run on, according to their security/scalability needs, and at various different levels of how much it costs to run this infrastructure.

🧑‍💻 ISSUE AND VERIFY DIGITAL CREDENTIALS USING VERAMO CLIENT-APP SDK ON THE CHEQD NETWORK

Github repositories: cheqd/did-provider-cheqd, cheqd/did-jwt-vc

We’ve been working hard following our identity wallet demo at Internet Identity Workshop in April 2022 to make this functionality available to every app developer. We’re excited to announce that you can now issue JSON / JWT-encoded Verifiable Credentials using a plugin for the cheqd network we built for Veramo.

Veramo is an existing open-source JavaScript software development kit (SDK) for Verifiable Credentials. We recognise that app developers and our SSI vendor partners have their own preferred SDKs and languages, based on credential formats. We chose to implement the JWT/JSON Verifiable Credentials (VCs) using Veramo since it has a highly-flexible and modular architecture. This allowed us to create a plugin to add support for the cheqd network, without having to rewrite a lot of code for basic DID and VC functionality from scratch.
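As a rough illustration of what this looks like for an app developer, here is a schematic sketch of issuing a JWT credential through a Veramo agent. Exact plugin names vary between Veramo versions, and the key manager, DID manager and cheqd provider wiring are elided, so treat this as an outline rather than copy-paste-ready code:

```typescript
// Schematic sketch of issuing a JWT-encoded Verifiable Credential with Veramo.
import { createAgent } from '@veramo/core'
import { CredentialIssuer, ICredentialIssuer } from '@veramo/credential-w3c'

const agent = createAgent<ICredentialIssuer>({
  plugins: [
    new CredentialIssuer(),
    // ...plus key manager, DID manager and the cheqd DID provider plugin,
    // so that did:cheqd identifiers can be managed and used for signing
  ],
})

async function issueCredential() {
  return agent.createVerifiableCredential({
    credential: {
      issuer: { id: 'did:cheqd:mainnet:zIssuer123' },                  // hypothetical DID
      credentialSubject: { id: 'did:cheqd:mainnet:zHolder456', degree: 'BSc' },
    },
    proofFormat: 'jwt', // the JSON/JWT encoding described above
  })
}

issueCredential().then(console.log)
```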

You can try out a reference implementation of how you as an app developer can build your own applications using our Veramo SDK plugin and ledger on the cheqd wallet web app (more on this later).

We’re also working on supporting other popular credential formats, such as AnonCreds as we know there’s interest in this from many of our SSI vendors/partners.

Why is this valuable?

cheqd’s objective is to provide its partners with a highly scalable, performant Layer 1 which can support payment rails and customisable commercial models for Verifiable Credentials. In order to showcase the baseline identity functionality, it was important to build tooling to issue and verify Credentials using an existing SDK. Veramo was a perfect SDK to build an initial plugin for cheqd, due to its already modular architecture.

This repository therefore showcases that cheqd has:

  • Functional capability for signing Credentials using its DID method
  • Functional capability for carrying out DID authentication to verify Credentials
  • Ability for cheqd to plug into existing SDKs, which we hope to expand to other SDKs provided by our partners and the wider SSI community.

💻 STAKE, DELEGATE AND HOLD CREDENTIALS IN A FIRST-PARTY CHEQD WEB-APP WALLET FOR CHEQ TOKEN HOLDERS

In April 2022, we demoed wallet.cheqd.io at Internet Identity Workshop (IIW34) in San Francisco to show a non-custodial, recoverable wallet where users can stake, delegate and vote with CHEQ tokens AND hold W3C Verifiable Credentials.

To build the demo wallet, we forked the Lum wallet, an existing Cosmos project. By adding new identity features to an already great foundation, we’ve been able to speed up our journey to get a Verifiable Credential in a web-based wallet.

Whilst we’re expanding this, we’ve open-sourced the cheqd-wallet repo to enable our partners, other SSI vendors and interested developers to:

  1. View and test out the functionality in their own environments, and
  2. Build on, extend and replicate the wallet’s utility in their own software and wallets.

In the coming weeks we’ll be launching some exciting features which will really bring this wallet to life for the @cheqd_io community and beyond… so watch this space!

Try out the @cheqd_io demo yourself at wallet.cheqd.io, get a Verifiable Credential, and read more about how we built it in our CTO Ankur Banerjee’s Twitter thread.

Why is this valuable?

So far, this wallet has been used for demo purposes. Moving forward, however, we would love to showcase the real value of Verifiable Credentials by issuing our community their own VCs for different reasons, all stored, verified and backed up using the cheqd wallet. Demonstrating our technology in a wallet like this makes it easier for new community members and partners to visualise and understand the value of everything we’re building on the identity front.

🛠️ Oven-ready tooling, infrastructure and analytics packages

⚙️ STREAMLINING NODE SETUP AND MANAGEMENT WITH INFRASTRUCTURE-AS-CODE

Github repository: cheqd/infra

Over the past months we’ve been implementing various tools to improve performance, speed up node setup and reduce manual effort for our team and external developers as much as possible. We wanted to make installing and running cheqd nodes easy, so our automation allows people to roll out secure, out-of-the-box configurations efficiently and at low cost.

Terraform: Infra-as-code

We have started using HashiCorp’s Terraform to define consistent, automated workflows, in order to improve efficiency and streamline the process of setting up a node on cheqd. Terraform is a form of infrastructure-as-code, which is essentially the managing and provisioning of infrastructure through code instead of manual processes. You can think of it like dominoes: one click of a button can set off a whole series of outcomes.

This automation gives prospective network Validators the choice of whether they want to just install a validator node (using our install instructions), or whether they want to set up a sentry+validator architecture for more security.

Terragrunt: Infra-as-code

Terragrunt works hand-in-hand with Terraform, making code more modular, reducing repetition and facilitating different configurations of code for different use cases. You can plug in config values like CPU, RAM, static IPs and storage, which speeds things up whilst making the code more modular and reusable.

Through the use of Terragrunt, we are also able to extend our infrastructure to a fuller suite of supported cloud providers. This is important since our infrastructure code only works directly with the Hetzner and DigitalOcean cloud providers (chosen for their good balance of cost versus performance). We recognise, however, that many people use AWS or Azure. Terragrunt therefore performs the role of a wrapper, keeping our infrastructure available on Hetzner and DigitalOcean while making it easier to use with AWS or Azure.

Ansible: Infra-as-code

Ansible allows node operators to update software on their nodes, carry out configuration changes, etc., during the first install and subsequent maintenance. In a similar way to Terragrunt, Ansible code can also act as a wrapper, converting the configuration established via Terragrunt and Terraform into more cross-compatible formats.

Using Ansible, the same configurations created for setting up nodes on cheqd could be packaged in a format which could be consumed by other Cosmos networks. Therefore, this could have a knock-on effect for benefiting the entire Cosmos ecosystem for running sentry+validator infrastructure.

DataDog: Monitoring

DataDog is a tool that provides monitoring of servers, databases, tools and services through a SaaS-based data analytics platform. You can think of it like the task manager on your laptop. Using DataDog we keep an eye on metrics from Tendermint (e.g. if a validator double-signs) and the Cosmos SDK (e.g. transactions per day).

This is valuable for ensuring the network runs smoothly and that any security vulnerabilities or issues that may impact consensus are quickly resolved.

Cloudflare Teams: Role Management (SSH)

When managing a network it’s important that those building it can gain access when they need it. For this we’ve been using Cloudflare Teams to SSH into one of our nodes.

SSH (Secure Shell) is a communication protocol that enables two computers to communicate, providing password- or public-key-based authentication and encrypted connections between two network endpoints.

This work is important because other Cosmos networks can reuse the role management package to reduce the time spent on configuring their own role management processes for SSH.

HashiCorp Vault: Secret Sharing

Sharing secrets in a secure fashion is vital. For this we use HashiCorp Vault, along with a script that copies private keys and node keys over to a vault. You can think of this like a LastPass or 1Password for network secrets (e.g. private keys). This way, if a node is accidentally deleted and a validator’s private key is lost with it, it’s easy to restore.
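As an illustration of the idea, here is a minimal sketch of writing a validator key into Vault’s KV v2 secrets engine over its HTTP API. The Vault address, mount path and key names are assumptions for the sketch, not our actual setup:

```typescript
// Sketch: backing up a validator's private key to HashiCorp Vault (KV v2)
// over its HTTP API. Address, mount path and field names are placeholders.
const VAULT_ADDR = "https://vault.example.com";
const VAULT_TOKEN = process.env.VAULT_TOKEN!;

async function backupNodeKey(nodeName: string, privKeyJson: string): Promise<void> {
  const res = await fetch(`${VAULT_ADDR}/v1/secret/data/validators/${nodeName}`, {
    method: "POST",
    headers: {
      "X-Vault-Token": VAULT_TOKEN,
      "Content-Type": "application/json",
    },
    // KV v2 expects the secret wrapped in a "data" envelope
    body: JSON.stringify({ data: { priv_validator_key: privKeyJson } }),
  });
  if (!res.ok) throw new Error(`Vault write failed: HTTP ${res.status}`);
}
```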

This is hugely valuable for Validator nodes, who may want to add an extra layer of security to the process of backing up private keys and sharing keys between people internally. Moreover, through using HashiCorp Vault, we hope to reduce the risk of teams losing their private keys and, with them, the ability to properly manage their nodes.

🪙 AUTOMATED DISTRIBUTION OF CHEQ TEST TOKENS WITH OUR TESTNET FAUCET

Github repository: cheqd/faucet-ui

The cheqd testnet faucet is a self-serve site that allows app developers and node operators who want to try out our identity functionality or node operations to request test CHEQ tokens, without having to spend money to acquire “real” CHEQ tokens on mainnet.

We built this using Cloudflare Pages as it provides a fast way to create serverless applications which are able to scale up and down dynamically depending on traffic, especially for something such as a testnet faucet which may not receive consistent levels of traffic. The backend for this faucet works using an existing CosmJS faucet app to handle requests, run using a Digital Ocean app wrapped in a Dockerfile.
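To illustrate the flow, here is a hedged sketch of how a frontend might ask a CosmJS faucet backend for test tokens. The faucet URL is a placeholder; the POST /credit request shape follows the CosmJS faucet’s HTTP API:

```typescript
// Sketch: requesting test tokens from a CosmJS faucet backend.
const FAUCET_URL = "https://faucet.example.com"; // placeholder faucet endpoint

async function requestTestTokens(address: string): Promise<void> {
  const res = await fetch(`${FAUCET_URL}/credit`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // ncheq is cheqd's base token denomination
    body: JSON.stringify({ denom: "ncheq", address }),
  });
  if (!res.ok) throw new Error(`Faucet request failed: HTTP ${res.status}`);
}

// Replace with a real testnet address before running
requestTestTokens("cheqd1exampleaddress");
```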

WHY IS THIS VALUABLE?

This solution:

  1. Helps to keep the team focused on building, as we no longer need to dedicate time to manually responding to requests for tokens;
  2. Creates a far more cost-effective way of handling testnet token distributions;
  3. Can be utilised by developers to test cheqd functionality far more efficiently; and
  4. Can be used by other Cosmos projects to reduce operational overheads and headaches around distributing testnet tokens.

🪂 FRONTEND/BACKEND FOR RUNNING COSMOS SDK AIRDROPS

Github repositories: cheqd/airdrop-ui (FE), cheqd/airdrop-distribution (BE)

The airdrop tools, used for our community airdrop rewards site, are split into two repos: one for managing the actual distribution of airdrop rewards to wallets, and another for the frontend itself to handle claims.

In terms of the frontend, we learnt that airdrop reward sites need to be more resilient to traffic spikes than most websites: when an airdrop is announced, community members tend to flock to the site to claim their rewards, generating a large spike in traffic, followed by a period of much lower traffic.

This type of traffic pattern can make prepping the server to host airdrop claim websites particularly difficult. For example, many projects will choose to purchase a large server capacity to prevent server lag, whilst others may simply become overwhelmed with the traffic.

To manage this, the frontend site was developed to work with Cloudflare Workers, a serverless and highly-scalable platform so that the airdrop reward site could handle these spikes in demand.

On the backend, we also needed to build something that could manage a surge in demand whilst providing a highly scalable and fast way of completing mass distributions. Initially, our implementation struggled with the number of claims, resulting in an excessive wait for rewards to arrive in claimants’ wallets. To improve this we used two separate CosmJS-based Cloudflare Workers scripts: one which lined up claims in three separate queues (or more if we wanted to scale further), and a second distributor script instantiated per queue (i.e. three queues would require three distribution workers).
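To make the distributor side concrete, here is a simplified sketch of what one of those distribution workers does with CosmJS. The queue plumbing is schematic, and the RPC endpoint, gas price and one-send-per-claim approach are assumptions rather than the repo’s actual code:

```typescript
// Simplified distributor sketch: drain a queue of pending claims and pay
// each one out using standard @cosmjs/stargate calls.
import { DirectSecp256k1HdWallet } from "@cosmjs/proto-signing";
import { SigningStargateClient, GasPrice } from "@cosmjs/stargate";

interface Claim {
  address: string; // recipient wallet address
  amount: string;  // reward amount in ncheq
}

async function distribute(claims: Claim[], mnemonic: string): Promise<void> {
  const wallet = await DirectSecp256k1HdWallet.fromMnemonic(mnemonic, { prefix: "cheqd" });
  const [account] = await wallet.getAccounts();
  const client = await SigningStargateClient.connectWithSigner(
    "https://rpc.cheqd.example.com",             // placeholder RPC endpoint
    wallet,
    { gasPrice: GasPrice.fromString("25ncheq") } // assumed gas price
  );

  for (const claim of claims) {
    // One bank send per claim; a production distributor would batch messages
    await client.sendTokens(
      account.address,
      claim.address,
      [{ denom: "ncheq", amount: claim.amount }],
      "auto",
      "airdrop reward"
    );
  }
}
```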

There is no hiding that we ran into some hiccups, in part due to our Cloudflare Worker approach, during our Cosmos Community Mission 2 Airdrop. We have documented all of the issues we ran into during our airdrop and the lessons learnt in our airdrop takeaway blog post. What is important to explain is that:

  1. The reward site using Cloudflare Workers scaled very well in practice, with no hiccups;
  2. We had problems with the way we collated data, but the fundamental Cloudflare Workers infrastructure we ended up with, after having to refactor for our initial mistakes, is battle tested, highly efficient and resilient.

WHY IS THIS VALUABLE?

Any project using the Cosmos SDK and looking to carry out an airdrop or community rewards program can now use our open-sourced frontend UI and distribution repositories to ensure a smooth and efficient process for their community, without any hiccups in server capacity or distribution mechanics.

We would much rather other projects did not make the same mistakes we did when we initially started our airdrop process. What we have come away with, in terms of infrastructure and lessons learned, should serve as an example of the dos and don’ts when carrying out a Cosmos-based airdrop.

🔗 USEFUL COSMOS DATA APIS FOR DEVELOPERS AND PRODUCT MANAGERS

Github repository: cheqd/data-api

We found on our journey that there’s a LOT of stuff we needed APIs for but couldn’t fetch directly from the base Cosmos SDK.

As Cosmonauts are well aware, the Cosmos SDK offers APIs for built-in modules using gRPC, REST and Tendermint RPC. However, we noticed a few endpoints it can’t provide, so we built them:

  1. Total Supply
  2. Circulating Supply
  3. Vesting Account Balance
  4. Liquid Account Balance
  5. Total Account Balance

This collection of custom APIs can be deployed as a Cloudflare Worker or on compatible serverless platforms.
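As a sketch of the idea behind one of these endpoints, circulating supply can be computed as total supply minus balances still vesting. The worker below is illustrative only: the REST endpoint is a placeholder, the bank-module path varies by SDK version, and the vesting helper is a stub standing in for the real logic:

```typescript
// Illustrative Cloudflare Worker: circulating supply = total supply minus
// balances still vesting. Not the data-api repo's actual code.
const NODE_REST = "https://rest.cheqd.example.com"; // placeholder REST endpoint

async function sumVestingBalances(): Promise<bigint> {
  // Stub: a real implementation would walk a known list of vesting accounts
  // and sum their unvested balances via the auth/bank modules.
  return 0n;
}

export default {
  async fetch(): Promise<Response> {
    // Cosmos SDK bank module; exact path varies by SDK version
    const res = await fetch(`${NODE_REST}/cosmos/bank/v1beta1/supply/ncheq`);
    const body = (await res.json()) as { amount: { amount: string } };

    const circulating = BigInt(body.amount.amount) - (await sumVestingBalances());
    return new Response(JSON.stringify({ circulatingSupply: circulating.toString() }), {
      headers: { "content-type": "application/json" },
    });
  },
};
```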

Further specifics about what these APIs return can be found in our repository Readme.

Why is this valuable?

These APIs are useful for multiple reasons:

  1. Applying for listings on exchanges requires many of these APIs upfront
  2. Auditing and analysing the health of a network
  3. Creating forecasts and projections based on network usage
  4. Providing transparency of metrics to the network’s community

By open-sourcing these APIs, we want to give all other Cosmos projects an easy way to track these metrics, hugely reducing the time and energy needed to build them from scratch.

🔀 COSMOS CROSS-CHAIN ADDRESS CONVERTER CLI

Github repository: cheqd/cosmjs-cli-converter

There is an assumption in the Cosmos ecosystem that wallet addresses across different chains, such as Cosmos (ATOM), Osmosis (OSMO) and cheqd (CHEQ), are identical, since they all look very similar. However, each chain’s wallet address is actually unique.

Interestingly, each network’s wallet address can be derived from the Cosmos Hub wallet address using a common derivation path (BIP44). Using one derivation path means users can interact with multiple networks with one secret recovery phrase and core account.

Our cross-chain address converter automates the derivation of any chain’s address from a Cosmos Hub address. We’ve seen some examples of this previously, but they are mostly designed for one-off conversions in a browser rather than large-scale batch conversions. Our converter can process 200k+ addresses in a few minutes, whereas doing this with existing CLI tools or shell scripts can take hours.
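The underlying trick is small enough to show in full: because chains sharing a derivation path share the same underlying key-hash bytes, conversion is just re-encoding the bech32 data with a new prefix. A minimal sketch using @cosmjs/encoding, likely similar in spirit to what the CLI does:

```typescript
import { fromBech32, toBech32 } from "@cosmjs/encoding";

// A Cosmos address is bech32(prefix, 20-byte key hash). Chains that share
// the same derivation path share those bytes, so "converting" an address
// means re-encoding the same data with the target chain's prefix.
function convertAddress(address: string, newPrefix: string): string {
  const { data } = fromBech32(address); // decode to the raw key-hash bytes
  return toBech32(newPrefix, data);     // re-encode with the new prefix
}

// e.g. convertAddress("cosmos1...", "cheqd") yields the matching cheqd address
```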

Why is this valuable?

This is valuable since it can automate airdrops or distributions to any account in bulk, starting from just a Cosmos Hub address, making the data calculations involved far more efficient.

For new chains in the Cosmos Ecosystem, this makes it much easier for the core team and Cosmonauts to discover and utilise their account addresses and carry out distributions.

Conclusion

Phew! There’s a lot here, but we really want to make sure everything we do for cheqd is useful far beyond our project. Contributing back to the Web3 and SSI community is a shared belief across the cheqd team, and one of our foundational principles.

As always, we’d love to hear your thoughts on our writing and what this means for your company. Feel free to contact the product team directly — [email protected], or alternatively start a thread in either our Slack channel or Discord.

Entropy & Decentralisation: a cheq up


The concept of Entropy in decentralised governance was created by the team at cheqd to model how the control of the network changes over time, from the initial launch where the core team had a larger portion of control (Low Entropy), to a state where the community and users of cheqd have a decentralised spread of control over the Network (High Entropy).

This blog post intends to cheq up on the progress to date.

Increasing Entropy was something very important to cheqd because it:

  1. Correlates with higher Network security and resiliency across countries;
  2. Means broader contributions to the Network from a multidisciplinary and diverse collective;
  3. Enables increased integration capabilities with other technologies to improve the ecosystem as a whole;
  4. Dilutes the control from a select group of people to a genuinely decentralised and diverse collective.

In terms of modelling this change, we focussed on a number of key metrics for the network and created a scoring model, based on five distinct Entropy levels, which can be easily digested and understood.

| Variable | Entropy Level 1 | Entropy Level 2 | Entropy Level 3 | Entropy Level 4 | Entropy Level 5 |
| --- | --- | --- | --- | --- | --- |
| Number of Node Operators (Validators) | 5 | 10 | 25 | 50 | 100 |
| Number of commits from outside the core team | 5 | 10 | 25 | 50 | 100 |
| Number of distinct Participants with bonded tokens | 100 | 500 | 1,000 | 5,000 | 10,000 |
| Number of stakeholders to achieve 51% of Network (Nakamoto coefficient) | 2 | 4 | 8 | 15 | 30 |
| Exchanges (CEX and DEX) supported | 1 | 2 | 4 | 6 | 8 |
| Country distribution of node operators | 5 | 10 | 20 | 40 | 60 |
| Number of accepted Proposals after genesis | 5 | 10 | 20 | 40 | 60 |

If you are interested in learning more about the scoring model and how we designed it, jump into our Entropy blog series here.

So, where are we now?

We can use our Entropy scorecard and table to pinpoint where cheqd is in terms of Entropy.

| Variable | Result | Entropy Level |
| --- | --- | --- |
| Number of Node Operators (Validators) | 62 | 4 |
| Number of commits from outside the core team | 17 | 2 |
| Number of distinct Participants with bonded tokens | ~8,000 | 4 |
| Number of stakeholders to achieve 51% of Network (Nakamoto coefficient) | 8 | 3 |
| Exchanges (CEX and DEX) supported | 4 | 3 |
| Country distribution of node operators | 20+ | 3 |
| Number of accepted Proposals after genesis | 2 | 1 |
| OVERALL SCORE | | 20 |
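To make the scoring concrete, here is a small sketch of how the scorecard can be computed from the thresholds above. One assumption is flagged in the comments: the published scorecard appears to floor each variable at level 1 (2 accepted proposals scores level 1 despite the level-1 threshold of 5), and the sketch reproduces that:

```typescript
// Sketch of the Entropy scoring model: map each variable's raw result to a
// level via that row's thresholds, then sum the levels.
const thresholds: Record<string, number[]> = {
  validators:          [5, 10, 25, 50, 100],
  externalCommits:     [5, 10, 25, 50, 100],
  bondedParticipants:  [100, 500, 1000, 5000, 10000],
  nakamotoCoefficient: [2, 4, 8, 15, 30],
  exchanges:           [1, 2, 4, 6, 8],
  countries:           [5, 10, 20, 40, 60],
  acceptedProposals:   [5, 10, 20, 40, 60],
};

function level(variable: string, result: number): number {
  const met = thresholds[variable].filter((t) => result >= t).length;
  return Math.max(met, 1); // assumption: the scorecard floors each variable at level 1
}

// Results from the scorecard above
const results: Record<string, number> = {
  validators: 62, externalCommits: 17, bondedParticipants: 8000,
  nakamotoCoefficient: 8, exchanges: 4, countries: 20, acceptedProposals: 2,
};

const score = Object.entries(results).reduce((sum, [k, v]) => sum + level(k, v), 0);
console.log(score); // 20, matching the overall score above
```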


This is an excellent start, given it’s been less than six months since we launched cheqd mainnet. Comparing this to where we started at cheqd mainnet launch, we have decentralised in almost all categories, improving from a score of 9 to a score of 20. 

But there is still a long way to go, both in achieving a higher overall score and also consistently higher individual scores.

Given where we are now, it is clear that there are areas where we can improve:

  • Firstly, by encouraging the community and partners to focus on codebase commits, through better documentation and tutorials; and
  • Secondly, by driving more community participation in on-chain governance.

We intend to continually improve our existing processes by:

  • Making it easier to contribute to governance processes by having clear instructions on how to use the cheqd forum to make governance proposals and decisions;
  • Increasing the amount of discussion on the cheqd forums on technical topics regarding SSI and cheqd’s product;
  • Running workshops with our partners to increase understanding about where experts and vendors could build alongside the core team;
  • Suggesting that funds from the Community Pool are put towards community initiatives (technical and non-technical); and
  • Creating an Entropy dashboard to increase the visibility of what metrics need to be focussed on the most.

And finally,

High Entropy was never designed to be reached overnight; it is a gradual process. What is important, however, is cheqd’s Foundational Principle of Increasing Entropy. This is why it’s crucial to take stock, reflect and assess where core processes can be improved and iterated on – to cheq up.

We, at cheqd, help companies leverage SSI. cheqd’s network is built on a blockchain with a dedicated token for payment, which enables new business models for verifiers, holders and issuers. In these business models, verifiable credentials are exchanged in a trusted, reusable, safer, and cheaper way — alongside a customisable fee.

Find out more about our solution here or get in touch if you wish to collaborate and/or join our ecosystem by contacting us at [email protected].


Liquidity Pools explained — what, why, and how…


Liquidity pools are an innovative solution within DeFi to create the mechanics of a market maker in a decentralised fashion. Although often met with confusion, they are simply clusters of tokens with pre-determined weights.

A token’s weight is how much its value accounts for the total value within the pool. Liquidity Pools are an exciting and equalising tool, which represent the true nature of the Decentralised Finance (DeFi) and Web3.0 movement.

This blog will offer some insight into Liquidity Pools.

It will first take you through what they are and why they exist, followed by how they work to create an environment, which incentivises contribution. It will then explore some suggestions as to why you may be interested in engaging with them and finally how to get involved.


What is a liquidity pool?

In a previous blog post we outlined where liquidity pools derived from, which we’d recommend reading first if you haven’t already.

At a high level, liquidity pools are a method of increasing liquidity, similar to the way traditional exchanges use market makers.

Yet where traditional finance requires expensive, centralised intermediaries with a level of power to manipulate prices, liquidity pools offer a decentralised alternative through automated market makers, giving anybody a unique opportunity to contribute to a pool that behaves like a market maker. The pool is essentially a shared market maker, the gains from which are distributed between those who contribute.

This both embodies the ideals of blockchain and decentralisation generally, and offers users and companies unique opportunities to trade more efficiently and cheaply whilst having total trust in the system that makes it so. Before liquidity pools arrived on the scene, liquidity, i.e. how easily one asset can be converted into another, often fiat currency, without affecting its market price, was difficult for DEXs to achieve.

How do they work?

In order for Liquidity Pools to function in the way that leads to the outcomes laid out above, i.e. greater decentralisation of projects and increasing liquidity, there are a number of key aspects worth understanding:

  • token weighting;
  • pricing;
  • market-making functions;
  • LP tokens.

(much of the following is taken from the official documentation from Osmosis Labs).

Token weight

Liquidity pools are simply clusters of tokens with pre-determined weights. A token’s weight is how much its value accounts for the total value within the pool.

For example, Uniswap pools involve two tokens with 50–50 weights. The total value of Asset A must remain equal to the total value of Asset B. Other token weights are possible, such as 90–10.

Pricing

With fixed predetermined token weights, it is possible for AMMs to achieve deterministic pricing, i.e. outcomes are precisely determined through known relationships among states and events, without any room for random variation. As a result, tokens in LPs maintain their value relative to one another, even as the number of tokens within the pool changes. Prices adjust so that the relative value between tokens remains equal.

For example, in a pool with 50–50 weights between Asset A and Asset B, a large buy of Asset A results in fewer Asset A tokens in the pool. There are now more Asset B tokens in the pool than before. The price of Asset A increases so that the remaining Asset A tokens remain equal in value to the total number of Asset B tokens in the pool.

Consequently, the cost of each trade is based on how much it disrupts the ratio of assets within the pool. Traders prefer deep liquid pools because each order tends to involve only a small percentage of assets within the pool. In small pools, a single order can cause dramatic price swings; it is much more difficult to purchase say 1,000 ATOMs from a liquidity pool with 2,000 ATOMs than a pool with 2,000,000 ATOMs.

Market-Making Functions

AMMs leverage a formula that decides how assets are priced in the pool. Many AMMs utilise the Constant Product Market Maker model (x * y = k). This design requires that the total liquidity (k) within the pool remains constant, where liquidity equals the amount of Asset A (x) multiplied by the amount of Asset B (y).
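A worked example makes the invariant concrete. This is a minimal sketch that ignores swap fees and uses hypothetical symmetric reserves; the two calls at the end mirror the deep-versus-shallow pool comparison from the Pricing section:

```typescript
// Constant product rule x * y = k, ignoring swap fees: the pool quotes
// output purely from how the trade shifts the ratio of reserves.
function swapOutput(reserveIn: number, reserveOut: number, amountIn: number): number {
  const k = reserveIn * reserveOut;       // invariant before the trade
  const newReserveIn = reserveIn + amountIn;
  const newReserveOut = k / newReserveIn; // out-side reserve must fall to keep x * y = k
  return reserveOut - newReserveOut;      // what the trader receives
}

// Buying with 1,000 tokens against a shallow vs a deep pool:
console.log(swapOutput(2_000, 2_000, 1_000));         // ~666.7 out: huge price impact
console.log(swapOutput(2_000_000, 2_000_000, 1_000)); // ~999.5 out: tiny price impact
```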

Other market-making functions also exist; you can find out more about these here.

Liquidity Pool Tokens (LP tokens)

When a user deposits assets into a Liquidity Pool, they receive LP tokens. These represent their share of the total pool.

For example, if Pool #1 is the OSMO<>ATOM pool, users can deposit OSMO and ATOM tokens into the pool and receive back Pool1 share tokens. These tokens do not correspond to an exact quantity of tokens, but rather the proportional ownership of the pool. When users remove their liquidity from the pool, they get back the percentage of liquidity that their LP tokens represent.

Source: Osmosis Labs docs
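The proportional-ownership arithmetic is simple enough to sketch in a few lines; the pool numbers below are hypothetical:

```typescript
// LP tokens minted on deposit and assets returned on withdrawal are both
// straight pro-rata calculations against the pool's totals.
function lpTokensForDeposit(depositA: number, reserveA: number, lpSupply: number): number {
  return lpSupply * (depositA / reserveA); // share of the pool being added
}

function withdrawal(lpBurned: number, lpSupply: number, reserveA: number, reserveB: number) {
  const share = lpBurned / lpSupply;       // fraction of the pool you own
  return { amountA: reserveA * share, amountB: reserveB * share };
}

// e.g. depositing 100 OSMO into a pool holding 10,000 OSMO (1,000 LP supply)
console.log(lpTokensForDeposit(100, 10_000, 1_000)); // 10 LP tokens = 1% of the pool
console.log(withdrawal(10, 1_010, 10_100, 5_050));   // ~1% of each asset back
```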

Why should I care?

The mechanisms above combine to ensure liquidity pools can maintain a stable price and ultimately work as a traditional market maker would.

However, in order to achieve their ultimate goals, pools need to encourage token holders to provide liquidity.

The incentives in place to do so are known as ‘liquidity mining’ or ‘yield farming’. Contributing to a pool makes an individual a liquidity provider (LP).

Liquidity mining

Liquidity providers earn through fees and special pool rewards. LP rewards come from swaps that occur in the pool and are distributed among the LPs in proportion to their shares of the pool’s total liquidity. So where do the rewards themselves come from?

Liquidity rewards are derived from the parameters laid out in the genesis of the AMM, in the case of the Cosmos Ecosystem this is Osmosis. For Osmosis, each day, 45% of released tokens go towards liquidity mining incentives.

When a liquidity provider bonds their tokens they become eligible for the OSMO rewards. On top of this, the Osmosis community decides on the allocation of rewards to a specific bonded liquidity gauge through a governance vote.

Bonded Liquidity Gauges

Bonded Liquidity Gauges are mechanisms for distributing liquidity incentives to LP tokens that have been bonded for a minimum amount of time. For instance, a Pool 1 LP share, 1-week gauge would distribute rewards to users who have bonded Pool1 LP tokens for one week or longer. The amount that each user receives is in proportion to the number of their bonded tokens.

The rewards earned from liquidity mining are not subject to unbonding. Rewards are liquid and transferable immediately. Only the principal bonded shares are subject to the unbonding period.

However, as with any opportunity for gain, there is of course some degree of risk; i.e. an individual could be better off holding the tokens rather than supplying them.

This outcome is called impermanent loss and essentially describes the difference in net worth between HODLing and LPing (more here). The liquidity mining mentioned above helps to offset impermanent loss for LPs. There are also other initiatives within the Osmosis ecosystem and beyond exploring mechanisms to reduce impermanent loss.

How do I get involved in liquidity pools?

So you’re sold on their potential and now you want to get involved?

Liquidity pools can be accessed across DeFi, whether in the Ethereum ecosystem using Uniswap and SushiSwap, or closer to home for cheqd in Cosmos, through Osmosis and Emeris.

For the purpose of this article we’ll share how to get involved using Osmosis.

First, head to Osmosis and click ‘Enter the lab’. Once you’ve agreed to the terms and you’re ‘in the lab’, you’ll see some trading pairs and a button to connect your wallet (bottom left of the dashboard).


You can then select Keplr wallet, which will connect automatically if you’ve already set Keplr up as a browser extension.


Next, you’ll need to deposit the assets you would like to contribute towards a Liquidity Pool. You can see the available Liquidity Pools under ‘Pools’. For example, if you would like to contribute to the Pool #602 : CHEQ / OSMO, you will need to deposit both of these tokens.

To do so, select ‘Assets’ and find the tokens you would like to deposit to contribute to the pool.

Note: if you already hold OSMO in your Keplr wallet you won’t be required to deposit.

Once you have deposited enough tokens for both sides of the pool (i.e. if the pool is set up as 50:50, you must hold the equivalent amount in USD on both sides), you’re ready to add liquidity.

Next, find your pool and select ‘Add/ Remove Liquidity’.

Here you’ll be able to add tokens on both sides of the pool.

On selecting ‘Add Liquidity’ you’ll then be directed back to Keplr to approve the transaction (a small fee is required).

Once you have added liquidity to the pool, you’ll receive your LP tokens (a token representing your share of the total pool). Now it’s time to start ‘Liquidity Mining’.

You’ll now be able to see your total Available LP tokens. Below this you’ll see an option to ‘Start Earning’.

Once here you’ll see a few options for your unbonding period (i.e. the number of days it takes to remove your tokens from the pool if you decide to withdraw). The longer you choose to bond your tokens, the higher the rewards you’ll be eligible to earn.

Next select the amount of your LP tokens you’d like to contribute to the pool and finally hit ‘Bond’ (this will kick off another approval through a Keplr pop-up).

You’ll now see your total bonded tokens. Each day rewards will then be distributed. When you decide to withdraw from the pool you’ll simply need to select ‘Remove Liquidity’ and select the amount you’d like to withdraw.

Conclusion

Overall, liquidity pools offer a new avenue for projects to gain more liquidity, and for believers in those projects to show their support. Where for many years engaging in and benefiting from such financial systems was reserved solely for the wealthiest individuals and large organisations, now anyone can gain access, start contributing to their favourite projects, vote on their future direction and earn from the part they play.

Note: the information above is not required for engaging with cheqd through its token; however, we strongly believe in the value of educating and sharing what we’re learning with our community, to help you better understand DeFi and support us in raising awareness of the shift to Web 3.0.

Why Centralised Decentralised Finance (CeDeFi) & Self-sovereign Identity (SSI) work together


The idea of CeDeFi — the combination of Centralised and Decentralised Finance — unites two ways of interacting with assets into one. Centralised Finance (CeFi) represents traditional entities (e.g. banks, brokers, funds), Decentralised Finance (DeFi) covers blockchain financial applications, cryptocurrencies, exchanges, decentralised payment services, etc. By merging the two, high transparency, impactful innovation, and wide adoption can be achieved.

We already know that blockchain applications are not limited to finance. CeDeFi is just one part of the Centralised Decentralised Industry (CeDeX), where X stands for every industry and possible application.

A common challenge for CeDeX: SSI is a solution

Centralised and Decentralised providers adopt different approaches to interact with users and their data. Centralised companies typically collect as much data as they can to provide personalised, monetizable, and targeted services. Decentralised providers prioritise users’ anonymity and transparency: for the sake of decentralisation values, only minimal information is requested, even if it’s in tension with user safety and compliance.

Self-sovereign identity (SSI) is a way to give users control over their data, where information is issued to a user-managed wallet before being shared with third parties. This approach opposes the current model of data collection and management, where users rarely know how their data is collected and have a limited say in its use.

For now, neither CeFi nor DeFi (and CeXs and DeXs by extension) has found a safe and comfortable way to interact with sovereign identities. Let’s take a look at two examples to summarise why.

  • CeFi: Central Bank Digital Currencies (CBDCs) are an example of innovating CeFi’s approach. By introducing CBDCs into circulation, financial institutions can provide users with “nearly fast” processing and “nearly zero” commission on all transfers, including cross-border payments. However, widespread use of CBDCs will generate a lot of data: in the worst-case scenario, institutions and governments would be able to track the financial movements of every person. Without a well-thought-out and secure identity layer, this innovation could quickly turn into a dystopian privacy invasion.
  • DeFi: existing decentralised exchanges (like Uniswap) allow users to make payments without providing identity information. DeFi ignores established Know Your Customer (KYC) and Anti-Money Laundering (AML) measures, exposing itself to the risks of facilitating money laundering, tax evasion, etc. Regulators’ concerns are obviously increasing, so much so that the lack of a proper identity layer could become a driver for massive legal penalties.

An ideal identity layer that suits both CeFi and DeFi needs

Since both approaches are imperfect, it’s necessary to come up with a new way to interact with user data. What would be the key characteristics of a reliable self-sovereign identity?

1. KYC with anonymity

To fit the CeDeX needs, a new approach should find ways to collect data without disclosing users’ identities. The only goal for collecting data should be to prove the legitimacy of the user. So, storing data for personalisation, financial tracking, or advertising should not be allowed, at least not unless the user explicitly agrees.

2. A new perspective on compliance

Building an SSI system is a technical and regulatory challenge. First and foremost, we should reconsider the KYC requirements. What and how much data is enough to consider a user KYC’ed? Are the current KYC and AML systems achieving their goals, and what could be improved? Why do users dislike KYC and how to eliminate the major pain points? SSI management is a multi-dimensional challenge that requires the cooperation of financial providers, legislators, and end participants.

3. Creation of revenue streams

The ability for individuals and companies to hold and control their data will create entirely new revenue streams. For example, who could have foreseen that a combination of passport, antibody test and boarding pass would be a crucial blend to allow international travel? Furthermore, since the user is now at the centre of the ecosystem, they can become part of the commercial framework rather than being surveilled and monetised. Extending the work that the Brave browser and the Basic Attention Token have begun, incorporating SSI and authentic data will push the value of personal data even higher, since it is accurate and trustworthy.

4. Providing full trust

When the payment sender has no verification of the receiver’s identity, there’s a risk of making payment to the wrong address. If there was a way for participants to know more about each other without compromising key identity information, it would greatly reduce the chance of erroneous money transfers.

Additionally, it would open digital asset transactions to much wider adoption. Trusted SSIs remove the need to triple-check addresses, conduct test payments, Zoom calls, etc. SSI on its own should be enough to provide all the proof. This could be as comprehensive as sharing a digital version of a passport or as privacy-preserving as just a Telegram handle, with both of these cryptographically attested and visible only to those involved in the transaction, i.e. peer-to-peer.

5. Defining opportunities and limits

SSI has thousands of potential applications each with unique requirements. When we talk about managing identities, we care about the legitimacy of participants and the nature of transactions.

If we incorporate SSI into an advertising network, we need to define compensations for user data, and the limits to which privacy can be exploited even under consent for commercial purposes, etc.

Applications of SSI

Let’s take a look at several possible use cases for SSI to appreciate the versatility of the challenge.

  • NFT: SSI can solve the provenance issue with NFTs regardless of the ledger.
  • Content: fully decentralised content consumption with payment and identity, i.e. consumption of media/content directly from the creator without a distribution channel. A possibility for creators to receive fair payments for their work and interact with audiences directly.
  • InterPlanetary File System (IPFS) for controlled decentralised storage: SSI can be implemented to manage the data of participants and store distributed files with decentralised access control.
  • Payment verification: the risky nature of CeDeFi transfers has been holding back many users. There’s no way to further verify the identity of the payment receiver beyond the wallet address. It’s a problem even for low-volume retail traders, but increasingly more so on the institutional level, where large trading volumes risk being lost. Institutional participation is an influential driver for mass adoption, and the introduction of SSI of that level can result in significant growth of decentralised finance.

CeDeFi (Unizen) as the main application

Despite the multitude of SSI applications, finance remains the biggest, and potentially the most rewarding challenge. If CeFi and DeFi providers can come together to figure out a new way to ensure KYC and AML, broadening the applications of the solution is a much easier task.

In most other applications, financial challenges enter the picture only at later stages (take content creation, for instance). In finance, all transactions have risks, many of which can be solved only with SSI. This is what makes CeDeFi a perfect market for the introduction of SSI. Also, neither centralised nor decentralised finance is satisfied with current options, which is why they are highly incentivised to explore alternatives.

cheqd’s take

We, at cheqd, are building the payment rails for authentic data*. We want to make it as easy as possible to create authentic data ecosystems through customisable commercial models and governance structures, all built upon a public permissionless network with a dedicated token for payment.

We’re starting off with a focus on SSI, using commercial and governance frameworks to make it easy to stand up new ecosystems and return privacy and data control to individuals. Our team is passionate about the privacy movement as well as creating new business models and marketplaces around authentic data.

*Authentic data is where the source of that data can be proved. This is in contrast to most existing data economy solutions where the data has no lineage. For example, being able to prove passport data came from a legitimate passport issuer rather than simply reading the contents.

Core to the business model implementation is a dedicated network and token. When the user is at the centre of the ecosystem, companies will no longer have contracts with every other company they interact with. If a passport is issued in one country and then used across the world in another to open a bank/exchange/choose account, there should be a corresponding value flow. This wouldn’t be possible if contracts and fiat rails are used!

Despite the blockchain network and token underpinnings, we don’t believe any client/end-user should need to care about these. Our ultimate vision is that we establish the payment rails for identity, initially self-sovereign identity without anyone needing to worry about the underlying technology. This perfectly aligns with the CeDeFi vision of providing a spectrum of financial services without the need to worry about whether they are centralised, decentralised or what technology they are built on.

This blending of CeFi and DeFi also prevents the need for multiple, siloed identities which is exactly the problem SSI is built to solve. We need to avoid recreating siloes in any form. The greatest priority is getting to the right outcome for the individual.


Conclusion

The relationship between self-sovereign identity and CeDeFi is more than a symbiotic one. Not only do the two benefit from each other; each of them would have difficulty thriving without the other:

  • SSI is a necessary component of CeDeFi. Without ways to create and manage SSIs, CeDeFi will not be able to meet KYC and AML needs together with decentralisation expectations. SSIs create a bridge between traditional data-heavy interactions and an anonymous DeFi approach.
  • CeDeFi will provide the financial infrastructure required for global SSI adoption. SSI adoption relies on the ability of SSI owners to get paid for their data and participation, as well as to create entirely new business models through the blend of previously uncombined datasets. Thus, there’s a need for fast, secure and usable technology, as well as for a legislative framework for data-asset interactions.
  • CeDeFi and SSI providers should cooperate to ensure the best user experience and security of all participants. Once the SSI has proven itself on one of the riskiest applications — the financial one — it’ll easily cater to other industries.

Authors & contributors: Fraser Edwards (Chief Executive Officer, cheqd), Ankur Banerjee (Chief Technology Officer, cheqd), James Taylor (Chief Business Development Officer, Unizen), and Ghost of St. Miklos (Community Contributor, Unizen).

The GDPR Nightmare


In the previous blog, part one, we dived into the ethics around marketing and whether it is possible to use existing marketing tools in a way that respects and upholds the best interests of personal privacy and data protection.

In this blog, we are going to explore three topics:

  1. An example of a GDPR nightmare which we recently encountered at cheqd;
  2. Why consent is broken in the digital world; and
  3. How the latest proposed regulatory developments in privacy and data protection may improve the current state of play.

Entering the Lion’s Den

We recently attended an identity event that champions privacy, data protection and the autonomous control of digital identities. Following the conference we received the following email:

[Screenshot: the email we received after the conference, offering attendee data for sale]

Now, ironically, for a conference revolving around privacy and control over how your personal data is shared, all attendees had their personal information taken and sold without consent.

To attend the event, there were blanket terms and conditions which attendees had to accept, and no explicit ‘opt-in’ for marketing purposes, which the GDPR requires as a specific purpose. In other words, having your data processed for marketing purposes was a precondition of signing up, in direct violation of Article 7 GDPR.

And this is by no means a one-off…

Consent is broken

Individuals generally have to navigate through ‘clickwraps’ or ‘browsewraps’ to access a service. If they do not accept the terms and agree to legal jargon, then they often cannot use the service, or at least not quickly.

This makes it difficult to give meaningful consent.

Furthermore, the rules for when consent is needed are convoluted and easy to get wrong. The table below highlights what type of provision is needed for what purpose.

[Table: the legal grounds required under the GDPR for each processing purpose]

For people to enforce their own data protection rights and hold companies accountable, they need to understand the rules first. Similarly, for companies to respect and uphold better standards on data protection and privacy, they must be able to have confidence that they can carry out marketing without feeling like they are doing something unethical. The current state of digital marketing benefits neither the marketer nor the individual.

For data subjects to be truly empowered to harbour more control over their personal data, significant change needs to happen. We believe that greater control of data needs to be given to data subjects, since privacy at its heart, is all about control. And this is what cheqd and new digital identity paradigms are currently seeking to achieve.

Improvement on the horizon

The European Commission has proposed a framework for an EU-wide digital identity framework, allowing businesses and citizens to take much greater control of their digital identity, by being able to hold verified attributes and claims in a digital wallet.

The European Digital Identity Wallet is being designed to store and process Credentials to enable Europeans to access services using a digital identity, without oversharing personal data. This will give individuals much greater control over the data they share — almost empowering them as data controllers for their own data.

From the current information about these legislative changes, we believe that the new law will recommend the same technology, or at least compatible privacy-by-design technology to what cheqd natively supports, namely Self-Sovereign Identity (SSI). And we want to work closely with the EU and UK to make sure the technical stacks built on top of our Network are directly compliant and semantically interoperable.

A State of Limbo

It’s a strange time in the privacy and data protection world. The necessary change is on the horizon, with new technical innovations and proposed legislation; however, there is a lack of clarity about the specifics of the changes or when exactly they will come into force.

But, we have confidence that things will improve.

All we can do right now is sit tight, make sensible suggestions, and use open standards and frameworks to support what we believe to be a more privacy-preserving future.

It’s new ground, and it’s exciting. We’re all freestyling, in a regulatory limbo; hopefully to the tune of a more privacy-preserving and user-centric future.

Cheq out our Telegram and Twitter to stay in the loop with the latest cheqd updates and news.

A Regulatory Game of Red Light, Green Light


Regulators across countries worldwide are facing a crossroads when it comes to cryptocurrency, tokens and Decentralised Autonomous Organisations (DAOs).

Generally, such decentralised ecosystems struggle to fit within legal frameworks and clearly defined lines. To boot, the constant changing and evolving nature of the crypto ecosystem makes regulating these technologies increasingly difficult.

Additionally, these digital assets often have no directly accountable legal entity. This can be problematic because they can be used to facilitate criminal activity, such as the purchase of illicit items, as well as money laundering. For this reason, countries have taken different measures and stances to attempt to regulate this activity.

And to put this into a very current analogy, it is a bit like a game of Red Light, Green Light from the latest Netflix craze, Squid Game.

Okay, not quite! But hear me out.

Without giving away any spoilers, in this game, players had to carefully approach the Doll in the image above when the Doll said Green Light, and stay perfectly still when the Doll said Red Light, or else face elimination.

Red light.

Green light.

Player 027 Eliminated.

Backdrop

Before the inception of the internet, regulation was much simpler: if you were on the land of a country, you were bound by the laws of that country.

Then the internet came along and made everything a little bit more complex, with cyberlibertarianism declaring the internet independent from regulatory clutches.

The Dark Web is a good example of this in practice. It is the internet in its rawest form, detached from indexing services like Google, which, unfortunately, has made it a playground for cybercriminals.

Cryptocurrencies and tokens have, to a large degree, enjoyed similar lawlessness of existing outside of national borders.

Elon Musk wrote a tweet last year lampooning various concepts. Firstly, it is a direct reference to the 1984 sci-fi film Dune; secondly, it is an indirect reference to the many articles that have come out since, referring to gating access to services such as the internet and net neutrality.

The suggestion in these articles was that you cannot regulate the internet directly, but if you control the pipes and the access to the internet, you can regulate who uses it.

Musk, of course, parodies this idea, instead suggesting that the control of the crypto universe is based on the communities (and memes) that thrive in the background, rather than through the control of a centralised state or system.

This tweet, referencing the capacity of the state to control crypto assets is at the heart of the regulatory game of Red Light, Green Light.

Red Light

To deal with the lawlessness of new digital assets such as coins and tokens on decentralised networks, countries can either decide to try and regulate them or simply prevent access to them.

1. Blanket ban

The idea of technology functioning outside a country’s jurisdiction and legal framework is a difficult pill to swallow. Furthermore, the oversaturation of different types of coins, tokens and stablecoins makes keeping up from a regulatory perspective a huge challenge.

As such, in 2013, China banned its banks and financial services from making transactions using Bitcoin, and in January 2018, China’s leading internet-finance regulator issued a notice requiring Bitcoin-mining companies to wind down their business in an ‘orderly’ manner. More recently, in 2021, China made all cryptocurrency transactions illegal.

This method of giving crypto a full Red Light is effective to a certain extent, in the sense that it removes the problem rather than dealing with it. However, it does not make any inroads into progressing cryptocurrency. Moreover, it pushes cryptocurrency into black-market use, beyond the visibility of the State, which is not a desirable outcome either.

2. Target Law on middlemen

In 2017, the European Parliament amended the fourth Anti-Money Laundering Directive to incorporate Virtual Currency Provisions. Article 32a of the Directive was amended to ensure that ‘Member States put automated centralised mechanisms in place […] which allow for the timely identification of any natural or legal persons holding or controlling payment accounts, and bank accounts held by a credit institution within their territory’.

The objective of the amendment was to ensure that money laundering could not occur through exchanges or custodian wallet providers, making these third-party sites accountable for the persons in their user-base.

Similarly, the UK has recently implemented a new rule known as the ‘Travel Rule’, which requires that any virtual-asset transfer above £1,000 be accompanied by detailed personal information about both the originator and the beneficiary.

Currently, this would be achieved through an exchange or custodian doing KYC on its users and making this information available.

Again, this type of regulation has worked to a certain extent. However, given the inconsistency of regulation between countries, users can simply hold and exchange their coins and tokens through services in countries where it is allowed, or use a decentralised exchange (DEX), which does not rely on a centralised middleman but instead facilitates peer-to-peer transactions.

In this way, the regulation only has a limited effect.

3. Targeted law on tokens

Most countries around the world have securities laws, which digital assets such as crypto may fall under. The strength of the securities law, and the nuances in its detail, determine whether an exchange token can be launched, exchanged and stored in that jurisdiction.

The USA, for example, has strong securities law, based on many years of legal precedent, most notably the famous Howey Test. This has had a degree of effect on regulating tokens, since the SEC has not been shy about cracking down on US-based creators of tokens that constitute securities.

However, securities law differs vastly around the world, so a token constituting a security in the US does not mean it will constitute a security in other jurisdictions, and vice versa.

This means that tokens that are prevented from being held in one country can quite easily be held, and given a green light in another, diluting the strength of global regulation on cryptocurrency.

Green Light

Owing to the success and resilience of digital assets such as cryptocurrency and tokens, and their non-reliance on banks or centralised bodies, some countries have given the Green Light to digital assets within a certain scope. For example, El Salvador recently made Bitcoin legal tender, meaning that it can be freely traded without regulatory hurdles to navigate. The only regulation in place is that service providers, such as exchanges and custodians, need to onboard onto a Service Providers Register.

This low barrier to entry not only enables citizens of El Salvador to access these services, but also allows persons from other countries to benefit from the lack of regulatory barriers and store or trade their assets through an exchange or custodian based in El Salvador.

Red Light, Green Light

A combination of the rigidity of existing law, the inconsistency of law across multiple countries, and the rapid evolution of new tokens and coins makes regulatory clarity around digital assets problematic for regulators worldwide.

Gating access and putting up a ‘Red Light’ is largely ineffective at preventing the use of crypto at scale when exchanges and custodians can exist in ‘Green Light’ countries.

For this reason, digital assets can, due to their decentralised structure, function outside of regulatory frameworks — and a different approach to regulating them is needed.

A different approach

At cheqd, we believe that there is a fundamental misalignment between the will to regulate digital assets & crypto and the practicalities of regulating digital assets & crypto.

And for this reason, countries need to look beyond the law to regulate — and take proactive steps to encourage safer, more trustworthy systems which are regulated by the network architecture itself.

What if the parties to a transaction had verified digital identities, and could share verified proof that they were actual, verified persons, all in a privacy-preserving manner?

Upon entering a transaction, the originator could open a direct channel with the beneficiary and share a Verifiable Credential, entirely off-ledger, to maintain privacy. Below £1,000 this could simply be proof of being an actual, verified person; above £1,000 it could be an actual exchange of personal information.
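
To make the idea concrete, here is a purely illustrative sketch of how a wallet might choose what to disclose based on the transfer amount. The types, function names and DID here are hypothetical, not cheqd's actual protocol:

```go
package main

import "fmt"

// CredentialProof is a hypothetical stand-in for a Verifiable Credential
// presentation shared off-ledger between originator and beneficiary.
type CredentialProof struct {
	Kind    string // e.g. "proof-of-personhood" or "full-identity"
	Subject string // the DID of the person presenting the proof
}

// travelRuleThreshold mirrors the £1,000 Travel Rule threshold discussed above.
const travelRuleThreshold = 1000.0

// proofForTransfer picks which credential to present: bare proof of
// personhood below the threshold, fuller personal information above it.
func proofForTransfer(amountGBP float64, subjectDID string) CredentialProof {
	if amountGBP <= travelRuleThreshold {
		return CredentialProof{Kind: "proof-of-personhood", Subject: subjectDID}
	}
	return CredentialProof{Kind: "full-identity", Subject: subjectDID}
}

func main() {
	fmt.Println(proofForTransfer(250, "did:cheqd:mainnet:alice"))
	fmt.Println(proofForTransfer(5000, "did:cheqd:mainnet:alice"))
}
```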

Such an architectural shift would enable greater compliance with new regulations, enabling trusted transactions by default. It would also disincentivise criminals from using the token for illicit activities because there would be a much higher risk when transacting alongside a verified identity.

At cheqd, by bridging cryptocurrency and tokens with digital identity, our technology opens up a range of new options to achieve trust, transparency and regulatory compliance in transactions. This not only makes crypto and tokens safer, but also more commercially viable for real-world use cases and transactions in regulated industries, such as financial services.

cheqd can potentially create a world in which most countries do not have to pick Red Light or Green Light, but can find a strong middle-ground.

Enter the Osmosis lab… What you need to know

Disclaimer: All information provided is intended to help users get set up on cheqd. However, we do not expressly recommend or mandate a certain approach. All actions taken are your personal responsibility.

In our previous blog, we guided you through getting set up on Emeris to access Gravity DEX (using your Keplr wallet). In this one, we'll share some information on Osmosis, another DEX you might see us on soon.

What is Osmosis?

Osmosis is a Decentralised Exchange (DEX) on the Cosmos network. We covered the differences between DEXs and CEXs (Centralised Exchanges) in a previous piece.

It is the first major application of the Inter-Blockchain Communication protocol (IBC, covered here) at scale, and, aside from its brilliant UI and overall branding and imagery, it is proving to be a powerful application and a notable new entrant to the DeFi ecosystem.

Built using the Cosmos SDK, Osmosis is its own sovereign blockchain with its own token — OSMO — used for network staking and governance. What makes Osmosis so ground-breaking for the Cosmos ecosystem, and in fact, the broader decentralised ecosystem outside of Ethereum, is the way in which Osmosis acts as an automated market maker, made possible through its liquidity pools.

If you're just getting started with crypto, the following section is a little more complex. Feel free to skip ahead to where we explain how to get set up on Osmosis.

Understanding the origins of Automated Market Makers (AMMs) and Liquidity Pools

Here we'll explain what they are and why they are needed in decentralised finance (DeFi). But first, like many things in the Web 3.0 and DeFi space, it's important to understand the equivalent in the existing world of traditional finance.

Traditional exchanges, e.g. Nasdaq, the London Stock Exchange or STAR, work through an order book model, which records the current bid and ask prices being quoted. In this model, buyers and sellers come together to trade: buyers simply try to buy at the lowest price possible, and sellers try to sell for the highest.

For a trade to complete, both parties must agree on a fair price, meaning either the buyer comes up or the seller comes down. In practice it's not that simple, because finding someone willing to buy the specific amount you're selling, at the price you're asking, is unlikely.

Let’s use an example.

You’ve just picked 100 apples on a farm but suddenly you have to leave town and need to sell every single one of them. You have to sell them at the market price of $1 in order to have enough to pay the farm their fee. You can’t find anyone willing to pay this price and you can’t take them with you. You’re stuck. This is where market makers come in handy.

In this example, the market maker would be an individual or company who is permanently on hand to purchase the apples at the market price of $1. When you place a market order to sell your apples, the market maker will buy them from you even if it doesn’t have a buyer lined up. Likewise, the reverse is also true; a buyer can purchase the apples even if a seller isn’t lined up.

Market makers in return earn a profit through the spread between the bid and offer price, as they bear the risk of holding the apples, whose price may drop below the market price. Without them, it would take considerably longer for buyers and sellers to be matched up, which in turn would reduce liquidity, making it more difficult to enter or exit positions (or leave town). They also track the current price of assets by changing their prices — hence they 'make' the market. This is the same way that Centralised Exchanges, such as Coinbase and Binance, work; however, it is not truly decentralised while a market maker acts as an intermediary to the exchange.

That said, although this delicate balancing act is valuable to buyers and sellers alike, market makers command a disproportionate amount of power over the market and ultimately act as an intermediary… a big no-no in the decentralised vision.

Enter AMMs and liquidity pools

Where traditional finance requires expensive and centralised intermediaries which have a level of power to manipulate prices, AMMs allow digital assets to be traded in a permissionless and automatic way.

This both embodies the ideals of blockchain and decentralisation generally, and offers users and companies unique opportunities to trade more efficiently and cheaply, with total trust in the system that makes it so. Before AMMs arrived on the scene, liquidity, i.e. how easily one asset can be converted into another (often fiat currency) without affecting its market price, was difficult for DEXs to achieve.

AMMs offer a solution to scarce liquidity through liquidity pools; a shared pot of tokens that users can trade against. Users can create a liquidity pool and others can supply tokens to it. In return, like with traditional market makers, those that are willing to take on the risk of providing liquidity to the pool earn fees and other rewards. An explanation specific to Osmosis can be found here. At the time of writing Osmosis’ liquidity pools contain about $544 million in total value locked (TVL).
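To make the mechanics concrete, below is a minimal sketch of the classic constant-product (x·y = k) pricing rule that most AMMs are built on. Osmosis generalises this with weighted, Balancer-style pools, but the core idea is the same; the pool contents and the 0.3% fee here are illustrative:

```go
package main

import "fmt"

// Pool is a toy constant-product liquidity pool holding two token reserves.
type Pool struct {
	ReserveA float64 // e.g. CHEQ
	ReserveB float64 // e.g. OSMO
	Fee      float64 // swap fee, e.g. 0.003 for 0.3%
}

// SwapAForB sells amountA of token A into the pool and returns how much
// token B comes out, keeping ReserveA * ReserveB constant (after fees).
func (p *Pool) SwapAForB(amountA float64) float64 {
	k := p.ReserveA * p.ReserveB        // invariant before the trade
	inAfterFee := amountA * (1 - p.Fee) // the fee stays in the pool for LPs
	newReserveA := p.ReserveA + inAfterFee
	out := p.ReserveB - k/newReserveA // solve (A+in)*(B-out) = k for out
	p.ReserveA += amountA
	p.ReserveB -= out
	return out
}

func main() {
	pool := Pool{ReserveA: 100000, ReserveB: 50000, Fee: 0.003}
	fmt.Printf("selling 1000 A yields %.2f B\n", pool.SwapAForB(1000))
}
```

Note how the price moves against the trader as the trade size grows: the larger the swap relative to the reserves, the worse the rate, which is how the pool 'makes' the market without any intermediary.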

Note: for the purpose of engaging with cheqd through its token, the information above is not required. However, we strongly believe in the value of educating our community and sharing what we're learning, to help you better understand DeFi and support us in raising awareness of the shift to Web 3.0.

You get all that? Good… let’s get set up on Osmosis

Head to Osmosis and click 'Enter the lab'. Once you've agreed to the terms and you're 'in the lab', you'll see some trading pairs and a button to connect your wallet (bottom left of the dashboard).

You can then select Keplr wallet which will automatically connect to your Keplr wallet if you’ve already set it up as a Browser extension.

And you're in. If you've already deposited to your Keplr wallet, you'll be able to start exploring trading pairs and liquidity pools — all at your own risk.

Enjoy your first walk around the lab….

Get ready…

We're extremely excited as our network launch is coming very soon! And we want to make sure that we reward our community. If you haven't already, join our Telegram group, follow us on Twitter and sign up for a surprise.

De-Fi jargon — debunked: What you need to know

Co-authored by Ross Power and Alex Tweeddale

Disclaimer: All information provided is intended to help users get set up on cheqd. However, we do not expressly recommend or mandate a certain approach. This is also not financial advice. Staking, delegation, and cryptocurrencies involve a high degree of risk, and there is always the possibility of loss, including the failure of all staked digital assets. Additionally, Node Operators are at risk of slashing in case of security or liveness faults on some protocols. We advise you to do your due diligence before choosing a validator. All actions taken are your personal responsibility.

So far we've shown you how to get set up with a Keplr wallet and link this to Gravity DEX and Osmosis, Decentralised Exchanges in the Cosmos ecosystem which you can use to exchange tokens. In this blog, we want to share how you can take your engagement with the cheqd network a step further through staking.

The world of Decentralised Finance (DeFi) can be daunting for newcomers. Documentation is littered with jargon that means very little to the uninitiated, creating a seemingly large knowledge gap to bridge before being able to make informed decisions.

For this reason, in this blog we'll cover the basics through a simple question-and-answer format. By the end, we hope you'll be able to answer the following questions:

  • What is a Validator or Node Operator?
  • What does a Validator actually do?
  • What is staking and who is involved?
  • What does it mean to bond / what does bonding involve?
  • What is delegation?
  • How can an individual be involved in and benefit from staking?
  • How does voting work?
  • What is slashing and how does it work?

What is a Validator or Node Operator?

In blockchain ecosystems, the Node Operator runs what is called a node. A node can be thought of as a power pylon in the physical world, which helps to distribute electricity around a wide network of users.

Without these pylons, electricity would be largely centralised in one location; the pylons help to distribute power to entire wide-scale populations. And if one pylon fails, the grid is set up to circumvent it and re-route the electricity along a different path.

Similarly, in blockchain infrastructure, each node runs an instance of the consensus protocol and helps to create a broad, robust network with no single point of failure. A single node failing will have little impact on the Network as a whole; however, if multiple nodes fail, or disagree with information entered into a transaction, then the block may not be signed, and there are fail-safe measures to notify the rest of the Node Operators of this.

The terms Validator and Node Operator are somewhat synonymous. Validator is the term used more commonly in the Cosmos documentation when referring to a Node Operator that is validating transactions on a blockchain. The only point worth mentioning is that you can have a Node Operator that is NOT a Validator. These are known as Observer nodes, which play a more passive role on the network: they don't stake on the network or validate transactions, but can observe them.

What does a Validator actually do?

The Cosmos Hub is based on Tendermint, which relies on a set of validators to secure the network. By ‘secure the network’, this refers to the way in which validators “participate in consensus by broadcasting votes which contain cryptographic signatures signed by their private key”. In English pls…

A cryptographic signature is a mathematical scheme for verifying the authenticity of digital messages or documents. A private key is like a password — a string of letters and numbers — that allows you to access and manage your crypto funds (your mnemonic is a version of this). So, the above is saying validators can broadcast that they agree with transactions in a block, using their password to sign their agreement in a mathematical way which ensures security and privacy.
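As a concrete illustration (a toy example, not cheqd's actual validator code), here is how signing and verifying a message with an ed25519 key pair, the default signature scheme for Tendermint validators, looks using Go's standard library:

```go
package main

import (
	"crypto/ed25519"
	"crypto/rand"
	"fmt"
)

func main() {
	// Generate a key pair: the private key signs, the public key verifies.
	pub, priv, err := ed25519.GenerateKey(rand.Reader)
	if err != nil {
		panic(err)
	}

	// A validator "votes" by signing the bytes of a block (a stand-in here).
	block := []byte("block #42: precommit")
	sig := ed25519.Sign(priv, block)

	// Anyone holding the public key can check the vote is authentic and
	// untampered, without ever seeing the private key.
	fmt.Println("signature valid:", ed25519.Verify(pub, block, sig))
}
```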

What does staking mean?

A stake is the amount of tokens a Node Operator puts aside and dedicates to a network’s active pool to contribute to governance and earn rewards. Staking is the verb used to describe this contribution. As cheqd is a Proof of Stake (PoS) Network, rewards can be earned in direct correlation with the amount of stake a Node Operator contributes.

What is delegation?

Token holders, ‘users’, can delegate their tokens to Node Operators in order to earn rewards and participate in governance. Once a user has delegated tokens to a Node Operator and has tokens added to the active pool, they are known as Participants.

Users can delegate to multiple Node Operators at the same time to essentially diversify their bonded token portfolio.

What does bonded mean?

Bonded tokens are those present in the active pool.

Bonded tokens = tokens staked by the Node Operator + tokens delegated by users.

Governance and voting

Users with bonded tokens, Participants, are able to vote on Governance Proposals. The weight of a vote is directly tied to the amount of bonded tokens delegated to Node Operators.

The specifics of how a participant can vote on Proposals, or create Proposals, are detailed further in our Governance Framework.

If the User does not want to vote on a Governance Proposal, or misses it for any particular reason, the Node Operator inherits the voting power of the delegated tokens and can vote with it.

Node Operator voting power = initial stake + inherited delegated tokens (if participants do not vote)

Participant voting power = delegated tokens (if a participant chooses to vote)
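Putting those two formulas together, here is a minimal sketch (hypothetical numbers and type names) of how voting power splits between a Node Operator and its Participants:

```go
package main

import "fmt"

// Delegation records a participant's bonded tokens and whether they cast
// their own vote on a given proposal.
type Delegation struct {
	Tokens   float64
	VotedOwn bool
}

// operatorVotingPower is the operator's self-stake plus the delegations of
// every participant who did NOT vote themselves, mirroring the formulas above.
func operatorVotingPower(selfStake float64, delegations []Delegation) float64 {
	power := selfStake
	for _, d := range delegations {
		if !d.VotedOwn {
			power += d.Tokens
		}
	}
	return power
}

func main() {
	delegations := []Delegation{
		{Tokens: 500, VotedOwn: false}, // inherited by the operator
		{Tokens: 200, VotedOwn: true},  // this participant votes unilaterally
	}
	fmt.Println(operatorVotingPower(1000, delegations)) // prints 1500
}
```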

What if I want to vote unilaterally, i.e. without a Node Operator/Validator?

If you are particularly interested in or passionate about a specific governance proposal, or do not agree with the Node Operator you have bonded to, it is absolutely possible to vote unilaterally. However, you must still have tokens delegated and bonded to a Node Operator to do so. To do this, follow the instructions in the section Voting on cheqd.

Can Participants earn tokens?

In short, yes. Participants may be eligible for a percentage of the rewards that Node Operators earn in the course of running a node. Node Operators set a commission rate, which determines the share of rewards they keep before passing the rest on to those bonded to them. These rewards may come in two forms:

1. Transaction fees

Writes to the cheqd Network incur what is known as a transaction fee, which is calculated based on gas. Gas may be more expensive when there are high transaction volumes on the Network, and vice versa. Node Operators may also set their own minimum gas prices: a transaction paying at least that price will be considered for inclusion when the operator creates a block. However, we will not get into the nuances of gas here.
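Still, the basic arithmetic is simple. As a rough sketch (all numbers are made up for illustration, not cheqd's live parameters), the fee is essentially the gas a transaction consumes multiplied by the gas price it offers:

```go
package main

import "fmt"

func main() {
	gasUsed := 80000.0       // gas consumed by the write (illustrative)
	offeredGasPrice := 25.0  // price per unit of gas the sender offers
	operatorMinPrice := 20.0 // the operator's own minimum gas price

	fee := gasUsed * offeredGasPrice
	fmt.Printf("transaction fee: %.0f (gas * gas price)\n", fee)
	fmt.Println("considered by this operator:", offeredGasPrice >= operatorMinPrice)
}
```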

2. Block rewards

Block rewards depend on inflation. Inflation is the gradual increase in the total token supply of the Network, with new tokens minted as each block is created. A Node Operator earns these newly minted tokens as block rewards, which can be disseminated to the Users delegated to them. For this reason, it is suggested that token holders bond and delegate their tokens, to create a healthy Network and earn passive income.

These rewards are distributed continuously and instantly. You are able to see your accumulated rewards on our dashboard: https://cheqd.omniflix.co/ and can always claim them back into your account.

How do I choose which Node Operator to delegate to?

Choosing your Node Operator or multiple Node Operators is an important decision. There are a few things you can take into consideration:

Commission rate

The incentive for delegating tokens to a Node Operator is that you, as a participant, can earn rewards based on the stake of the Node Operator. Each Node Operator will have a commission rate. This is the percentage of tokens that the Node Operator will take as commission for running the infrastructure — the remaining percentage is distributed to the Participants.
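For instance, here is a minimal sketch (hypothetical numbers) of how a reward splits under a given commission rate:

```go
package main

import "fmt"

// splitReward divides a reward earned on a delegation between the Node
// Operator (its commission) and the Participant (the remainder).
func splitReward(reward, commissionRate float64) (operator, participant float64) {
	operator = reward * commissionRate
	participant = reward - operator
	return operator, participant
}

func main() {
	op, part := splitReward(100, 0.05) // a 5% commission on a 100-token reward
	fmt.Printf("operator keeps %.2f, participant receives %.2f\n", op, part)
}
```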

Reputation

You should be mindful of the reputation a Node Operator has, because a Node Operator may use your delegated voting power against the best interests of you and the community. As cheqd evolves, a political spectrum of Node Operators will likely emerge, casting their votes in different directions. Some may want to create chaos on the network and vote to disrupt the established paradigms, for example. A chaotic actor could lure users to delegate to them with a favourable commission rate, then use the accumulated bonded tokens against the network's best interests. For this reason, the choice of Node Operator you delegate to is very important.

Slashing and Validator Jail

As the name would suggest, staking is not risk-free. The word stake literally means 'having something to gain or lose by having a form of ownership of something', so individuals should be wary of the risk, as we'll come on to.

Think of it like this: if someone says to you 'what's at stake?', they are essentially asking, 'what am I risking in return for the potential rewards?'

Node Operators might exhibit bad behaviour on the Network and, as a result, have their stake slashed. Slashing means taking the stake away from the Node Operator and adding it to the Community Pool.

Bad behaviour in this context usually means that the Node Operator has not signed a sufficient number of blocks as 'precommits' over a certain period of time. This could be due to inactivity or potentially malicious intent.

For example, in June 2019, CosmosPool, a former Cosmos validator, experienced a server outage on its main node: downtime that resulted in the validator being temporarily jailed and its stake being slashed by 0.01%, including that of its delegators. This is what's called a downtime slashing incident ('soft slashing'), whereby the validator and its delegators are punished proportionally to their stake on the network (JohnnieCosmos). On top of this, further slashing occurred later when evidence was found of double block signing: both CosmosPool's AND the delegators' stakes were slashed an additional 5%, and the validator was permanently removed ('tombstoned').

Slashing can therefore certainly affect your delegated and bonded tokens, so it is important to consider your choice.

We currently have the slashing parameters set to:

  • 5% slashed for double signing
  • 1% slashed for downtime (getting jailed)

These values may change over time through proposals that are voted on the network. You can read more about slashing here.
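To see what those percentages mean for a delegator, here is a toy calculation, assuming (as in the CosmosPool incident above) that a slash applies proportionally across everything bonded to the operator:

```go
package main

import "fmt"

// Slash fractions mirroring the current values listed above.
const (
	slashDoubleSign = 0.05 // 5% for double signing
	slashDowntime   = 0.01 // 1% for downtime (getting jailed)
)

// applySlash burns the given fraction of a bonded amount and returns the rest.
func applySlash(bonded, fraction float64) float64 {
	return bonded * (1 - fraction)
}

func main() {
	delegated := 10000.0 // a hypothetical delegation
	fmt.Printf("after downtime slash:    %.0f tokens\n", applySlash(delegated, slashDowntime))
	fmt.Printf("after double-sign slash: %.0f tokens\n", applySlash(delegated, slashDoubleSign))
}
```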

What if I change my mind about a Node Operator? Is it possible to redelegate or unbond from a Node Operator?

Yes, it is possible to instantly redelegate to a new Node Operator; however, you cannot 'hop' between Node Operators quickly: you must complete one redelegation before moving again. If you want to completely withdraw and unbond your tokens, be mindful that it takes two weeks for the tokens to become available for use again. This unbonding period is an example of a parameter which can be adjusted through the Governance Framework process.
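
As a toy illustration of that waiting period (just date arithmetic, assuming the two-week parameter stays as-is):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// The two-week unbonding period mentioned above; as a governance
	// parameter, this value may change over time.
	const unbondingPeriod = 14 * 24 * time.Hour

	requested := time.Now()
	fmt.Println("unbonding requested:", requested.Format(time.RFC822))
	fmt.Println("tokens available at:", requested.Add(unbondingPeriod).Format(time.RFC822))
}
```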

Summary of touchpoints

Mira Storm published a great checklist to help with this — find out more about each here.

  1. Technical setup of the validator
  2. Amount of self-bonded coins of the validator
  3. Current commission rate, commission change rate and maximum commission
  4. Total amount and number of delegations
  5. Community Involvement and Longevity
  6. The level of network decentralization
  7. Uptime of the validator
  8. The ability of delegators to vote and participate in governance
  9. Slash Protection.