In computer science, Garbage In, Garbage Out (GIGO) is an expression used to convey the idea that incorrect or poor-quality input data will produce faulty, "garbage" output.

The token valuation space currently suffers from this issue. Having reviewed the existing literature and some of the work being done in the space, we found the following:

  1. Most of the current valuation models being explored (and used) are essentially retro-fitted versions of stock valuation models. But a token is not a stock and does not share the same features; furthermore, tokens are more complicated instruments than stocks because they also have money-like properties. Retro-fitted stock valuation models are therefore ill-suited to this new investment vehicle (see the worked example after this list).
  2. There is no standard taxonomy for tokens. As the crypto space expands and the technology matures, we have seen the introduction of a plethora of tokens: Utility Tokens, Security Tokens and so on; the list keeps growing. As a result of this variety, a 'one-size-fits-all' valuation model is a gross oversimplification.
  3. Lack of empirical data. Most of the ICOs that have been funded are still in development. To ascertain the market penetration, customer adoption curves and trade-offs of token-based solutions compared with existing products/services, we need to track and analyse the related data. As some of these projects will only be released in 2019 or 2020, any valuation made today rests on a large number of assumptions.
  4. "Technical analysis", or chartism. Token prices are strongly affected by market speculation. Most traders use the term "technical analysis" to refer to making trades based on patterns they see on trading platforms such as TradingView. But this form of analysis is a self-fulfilling prophecy based on very little scientific fact. A good example is the "Vomiting Camel Pattern", which is used to make trades under the assumption that the market is turning bearish, which in turn affects a token's price. The phenomenon is best explained by the person who coined the term (see the video below), and it illustrates the fallacy in how technical analysis is currently practised.
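
To make the first point above concrete, consider the workhorse of stock valuation, the discounted cash flow (DCF) model. The worked example below is our own illustration of where the retro-fit breaks down, not a model taken from the report:

P_0 = \sum_{t=1}^{\infty} \frac{CF_t}{(1 + r)^t}

Here P_0 is the present value of a share, CF_t the expected cash flow to the holder in period t (dividends or buybacks) and r the discount rate. Most tokens confer no contractual claim on cash flows, so CF_t has no natural analogue; the model's central input is missing before any debate about the right discount rate even begins, which is why bolting the formula onto a token forces heavy and often arbitrary reinterpretation.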

In light of these findings, we decided to go back to the basics. How does one go about making a valuation model, or any model for that matter? The answer lies in the variables that are selected.

Our working report therefore addresses this question of variable selection. With the aid of ChainSecurity, Autonomous NEXT and Stratumn, we have compiled a list of fundamental and technical variables that can help us start thinking about how to build valuation models that respect the granularity of the token space.

What this report is NOT:

  • It does not provide a valuation model. Instead, we focus on what we believe needs to be taken into consideration to start creating the building blocks of token valuation models.

What this report is:

  • A review of the current valuation methods and how they compare to each other.
  • A list of variables that could be used to build context-relevant token valuation models.

Why write this report?

Token investing was supposed to be a way to scale access to investment in new technologies and companies. However, owing to the lack of a structural framework to help investors determine whether an investment is sound, scams have proliferated in the space. Those who have the means and skills to perform an educated analysis are generally large institutional investors or VCs.

As a result of this knowledge asymmetry, confidence in token sales has started to wane, and VCs, with their more in-depth knowledge and resources, are now calling the shots. Most ICO investments are now handshake deals, and VC funding is rapidly gaining traction, as can be seen in the image below:

This goes against the principal reason for creating and engaging in an ICO, which is to allow the democratization of investment at scale.

By providing our variables of analysis and by making our work an openly accessible Working Paper, we invite other researchers, entrepreneurs and experts to add to the toolkit of variables, with the objective of building a Modular Valuation Model that is capable of addressing the context and granularity of token diversity.

We believe that adopting a modular approach is key, as tokens have money-like features and each token economy functions as a private economy of sorts. Thus, findings and variables from endogenous monetary systems need to be incorporated into token valuation models, which is currently not the case.
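
One example of such a monetary variable, given here purely as an illustration rather than as part of the report's variable list, is token velocity as it appears in the equation of exchange from monetary economics:

MV = PQ

Here M is the monetary base of the token economy (its network value), V the velocity at which tokens change hands, and PQ the price and quantity of the resource being provisioned, i.e. the economy's transaction volume. Rearranged as M = PQ / V, it says that for a given level of on-chain activity, a higher velocity implies a lower network value, a money-like dynamic with no counterpart in a retro-fitted stock model.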

A modular valuation model, along with an extensive list of variables, would allow us to select the right variables for the type of token being considered and to build a model that best adheres to that token's specifications. Rather than using a one-size-fits-all approach, we could build context-relevant valuation models, which would be a first step toward establishing a standard taxonomy in the space.
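
As a rough sketch of what such a modular approach could look like in practice, one could imagine assembling a model from reusable variable modules selected according to token type. The token types, variable names and function below are illustrative placeholders of our own, not the variable list compiled in the report:

```python
# Illustrative sketch only: the token types and variables below are
# placeholders, not the variable list compiled in the working report.

FUNDAMENTAL_MODULES = {
    "utility": ["on_chain_transaction_volume", "token_velocity", "active_addresses"],
    "security": ["projected_cash_flows", "discount_rate", "payout_policy"],
    "payment": ["monetary_base", "velocity", "store_of_value_demand"],
}

# Technical (market) variables shared by every token type.
TECHNICAL_MODULES = ["liquidity", "exchange_listings", "holder_concentration"]


def assemble_model(token_type: str) -> list[str]:
    """Return the set of variables relevant to the given token type."""
    if token_type not in FUNDAMENTAL_MODULES:
        raise ValueError(f"No variable module defined for token type: {token_type}")
    return FUNDAMENTAL_MODULES[token_type] + TECHNICAL_MODULES


if __name__ == "__main__":
    # A utility token is valued with usage-driven variables rather than
    # the cash-flow variables appropriate to a security token.
    print(assemble_model("utility"))
```

The point of the sketch is simply that the variable set, and hence the model, is chosen per token type rather than fixed in advance.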

It is only by getting such inputs and building a toolbox of variables that we will be able to:

  • Develop a taxonomy of token projects.
  • Create valuation models that respect the specific functionalities of the different token types.
  • Offer regular investors a chance to make better investment decisions in the near future.

The working report can be downloaded here, and the associated slides can be found here. An introduction to the project can be found at the ILB website.

We look forward to the community's input and feedback so that we can collaboratively build a mature understanding of this new investment vehicle.


This report was written within the Blockchain Perspectives Joint Research Initiative, with the support of BNP Paribas and CDC Recherche. All thoughts expressed are the writer's views only and do not reflect the stance of the supporters.
