Human Organization 4.0, part six

Tokenization = Efficient Information Sharing

When I say “tokenization”, what do you think of? Do you think about financializing a product? Or perhaps the utility a token can provide to an application? Maybe your mind goes to the T in NFT? Tokenization, the act of creating a digital token whose activity is recorded on a ledger, means many things to many people and has become a buzzword within the crypto industry. I hope to change that with this article and then explain why tokenization will lead to advancements in human development.

This is the sixth entry in the Human Organization 4.0 blog series.

From now on, when someone says “tokenization” I want you to think in terms of efficient information sharing. In essence, whether you consider a token a financial instrument, a utility instrument, a non-fungible item or anything else, it all equates to the ability to convey information across space and time, particularly preferential information. The nature of tokens (their near-zero transaction costs, their security and their programmability) means they are the most efficient and secure way for humans to express preferences between choices and to weight those preferences. Let’s use some examples to drive this point home…

Example 1 - In the last blog in this series I discussed liquid democracy in the context of the election for the President of the United States. The current process for Americans to express their preference for President goes like this: 1) Americans physically travel to a polling place on election day or mail in a ballot beforehand, 2) ballots are counted (some by hand!) over the course of days or weeks, 3) based on the ballot totals for each state, representatives (electors) are sent to Washington DC to physically vote, 4) occasionally (as in 2000) court cases are settled to clear objections. In a tokenized world, voting for President would go like this: 1) each voting American is issued a token representing their vote, which they send to the presidential candidate of their choice within the election period, 2) at the end of the period, the token ledger is tallied to determine the President. The technology already exists to do this tokenization in an extremely secure and auditable manner.
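To make that flow concrete, here is a minimal sketch in Python. Everything in it (the VoteLedger class, its methods, the voter and candidate names) is hypothetical and invented for illustration; a real system would be a blockchain ledger with cryptographic identity rather than an in-memory structure.

```python
from collections import Counter

class VoteLedger:
    """Toy append-only ledger: one vote token per registered voter (hypothetical)."""

    def __init__(self, voters):
        self.unspent = set(voters)   # voters who still hold their token
        self.transfers = []          # append-only record of (voter, candidate)

    def cast(self, voter, candidate):
        """Send a voter's single vote token to a candidate."""
        if voter not in self.unspent:
            raise ValueError(f"{voter} has no unspent vote token")
        self.unspent.remove(voter)
        self.transfers.append((voter, candidate))

    def tally(self):
        """Evaluate the ledger at the end of the election period."""
        return Counter(candidate for _, candidate in self.transfers)

ledger = VoteLedger(["alice", "bob", "carol"])
ledger.cast("alice", "Candidate A")
ledger.cast("bob", "Candidate B")
ledger.cast("carol", "Candidate A")
print(ledger.tally().most_common(1))  # [('Candidate A', 2)]
```

Note that the two-step process in the prose maps directly onto the two methods: cast during the election period, tally once at the end.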

Example 2 - Another example of how tokenization can convey information efficiently is annual employee reviews. Imagine a corporation that gives each employee a basket of tokens at the start of the year, which they can send to other employees they believe are furthering the company’s goals or helping their department meet its goals. Annual employee reviews then become a review of who sent an employee tokens and how many, which is far more efficient (and fair) than current corporate methodologies like self-reviews, manager reviews or 360 reviews.
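A minimal sketch of that idea, again with invented names and numbers: each employee starts the year with a fixed token allowance, spends it recognizing peers, and the annual review reduces to a ledger query.

```python
from collections import defaultdict

ALLOWANCE = 10  # hypothetical annual token grant per employee

balances = defaultdict(lambda: ALLOWANCE)  # tokens each employee has left to give
received = defaultdict(list)               # (sender, amount) entries per recipient

def recognize(sender, recipient, amount):
    """Transfer recognition tokens from one employee to another."""
    if sender == recipient:
        raise ValueError("self-recognition is not allowed")
    if balances[sender] < amount:
        raise ValueError(f"{sender} has only {balances[sender]} tokens left")
    balances[sender] -= amount
    received[recipient].append((sender, amount))

def annual_review(employee):
    """The review is simply the ledger of recognition received."""
    entries = received[employee]
    return {"total": sum(amount for _, amount in entries), "from": entries}

recognize("dana", "erin", 3)
recognize("frank", "erin", 2)
print(annual_review("erin"))  # {'total': 5, 'from': [('dana', 3), ('frank', 2)]}
```

The fixed allowance is what keeps the signal honest: praise is scarce, so spending it expresses a real preference.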

Example 3 - One final example to drive home the point, this one closer to home. There’s a lot of content being created on the internet, and it can be challenging to determine what content is valuable and what is junk. The Like and Share buttons on social media apps are one solution, but they lack nuance and are subject to botting and gaming. Instead, imagine a system in which the Like button triggers a micro token transfer to the content creator, and the token has various uses within the application. In such a system, each press of Like carries a very small cost, removing the incentive to bot and making the like far more meaningful. Such a system could easily allow nuance by including multiple like buttons with different token transfer amounts. This system would also reward content creators in a way that current social media cannot. In fact, many applications on the Internet Computer are experimenting with variations of this approach.
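A sketch of the mechanism, with the tier names and amounts invented for illustration (real token ledgers account in integer units for exactly this kind of micro-transfer):

```python
# Hypothetical reaction tiers, priced in micro-token units.
LIKE_TIERS = {"like": 1, "love": 10, "brilliant": 100}

balances = {"viewer": 1_000, "creator": 0}

def react(viewer, creator, tier):
    """A reaction is a micro token transfer, so every Like carries a small cost."""
    amount = LIKE_TIERS[tier]
    if balances[viewer] < amount:
        # Botting a million likes now requires a million micro-payments.
        raise ValueError("insufficient tokens")
    balances[viewer] -= amount
    balances[creator] += amount

react("viewer", "creator", "love")
print(balances)  # {'viewer': 990, 'creator': 10}
```

The tier table is where the nuance lives: multiple buttons simply map to different transfer amounts, and the creator's balance is the reward.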

In short, tokenization allows for more efficient transfer of information (whether that’s a preference for a presidential candidate, interest in a cat video or recognition of your coworker’s superior skills) and does so in a secure, cheap and programmable way that is not possible without blockchain. But what is the bigger picture?

In this blog series I’ve discussed why DAOs will lead to bottom-up decision-making and, in doing so, will unlock the talent of much of the world that lacks opportunity. I also discussed how liquid democracy will improve decision-making and voting by ensuring the process is heavily skewed towards parties that have demonstrated both high levels of competency and alignment with the voting body. Tokenization is the third part of this triad, providing incredible efficiency to the sharing of information and preference. Tokenization is a necessary part of the bottom-up decision-making process because there is a natural inefficiency in large groups of people making decisions. Tokenization is the optimization that reduces (maybe even eliminates) that inefficiency, much the same way a free market removes the information and decision inefficiencies of a centrally-planned market. In fact, you could argue that tokenization is the equivalent of creating micro free markets.

If you’ve read all six blogs in this series, I hope you are starting to see how DAOs, liquid democracy and tokenization are driving the next evolution of human progress. Organizations that adopt this new paradigm will naturally outperform organizations operating with top-down decision-making processes. There are, however, many criticisms and valid arguments against this point. In the next blog in this series I will address these arguments and discuss what needs to be done to overcome the valid ones. The series will then conclude with some examples of today’s organizations and how they could be radically changed by a DAO-centric mindset.

By Kyle Langham, Director of Data & Analytics @ DFINITY Foundation
