Building a Token Valuation Methodology From the Ground Up

Written by karybheemaiah | Published 2018/11/30
Tech Story Tags: blockchain | token-valuation | token | token-valuation-method | blockchain-valuation


“An unsophisticated forecaster uses statistics as a drunken man uses lamp posts — for support rather than for illumination.” (Andrew Lang)

Recently, along with researchers from BNP Paribas and CDC Recherche, we published a report that identifies the variables of analysis needed to build a token valuation model. It is the first in a series of reports on this subject, and the premise is simple: to build a token valuation model, one must first determine the variables of analysis.

In computer science, Garbage In, Garbage Out (GIGO) is an expression used to convey that incorrect or poor-quality input data will produce faulty, or “garbage,” output. The token valuation space currently suffers from this issue. Since the launch of token sales, or ICOs, over the past few years, the notion of developing valuation methodologies that can accurately ascertain the future price of a token has become a subject of increasing interest and debate, with some experts attempting to retrofit stock valuation models in the hope of creating accurate token price prediction models. While these efforts are laudable and necessary for this new investment vehicle, they suffer from multiple flaws. Having reviewed the existing literature and some of the work being done in the space, we found the following:

Firstly, there is a lack of empirical analysis. Building any kind of prediction model requires rigorous empirical evidence, yet most of the ICOs that have been funded are still in development. To ascertain the market penetration, customer adoption curves and beneficial trade-offs of token-based solutions compared to existing products and services, we need to track and analyse the related data. As some of these projects will only be released in 2019 or 2020, any valuation made today rests on a large number of assumptions.

Secondly, most of the valuation models currently being explored (and used) are essentially retrofitted versions of stock valuation models, except a token is not a stock and does not bear the same features. This issue is compounded by the significant diversity of the token space: today we have work tokens, utility tokens, security tokens and more. As we have no standard taxonomy for tokens, a ‘one-size-fits-all’ valuation model is a gross oversimplification that does not respect the fundamental functions of the token. Retrofitted stock valuation models are thus ill-suited to this new investment vehicle.

Thirdly, tokens have currency-like properties along with functional objectives. This feature is what makes cryptoassets and blockchains unique among technologies. While AI, VR, 3D bio-printers and other technologies have a functional purpose that generates economic value, the distinction between the technology and the economic value it generates is clearly demarcated. Cryptoassets, on the other hand, have economic value intrinsically built into the functionality of the token, making every token project resemble a mini-macroeconomy with its own currency supply and velocity-of-circulation attributes.
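
To make the mini-macroeconomy analogy concrete, the sketch below applies the equation of exchange (MV = PQ) that analysts have borrowed from monetary economics for token valuation: the price a token's utility can support falls out of the economic activity it provisions, its supply and its velocity. The function and the figures are illustrative assumptions, not outputs of our report:

```python
# Minimal sketch of the equation-of-exchange (MV = PQ) approach to token
# valuation. All figures below are invented for illustration.

def token_utility_price(gdp_usd, velocity, supply):
    """Price per token implied by MV = PQ.

    M  = supply * price (monetary base of the token economy, in USD)
    V  = velocity (times a token changes hands per year)
    PQ = gdp_usd (annual economic activity the token provisions, in USD)

    Solving M = PQ / V for price gives: price = PQ / (V * supply).
    """
    return gdp_usd / (velocity * supply)

# Hypothetical token economy: $50M of annual on-chain activity,
# velocity of 10, and 100M tokens in circulation.
price = token_utility_price(gdp_usd=50_000_000, velocity=10, supply=100_000_000)
print(f"Implied utility price: ${price:.4f} per token")  # $0.0500
```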

This macroeconomic perspective on cryptoassets is often not taken into consideration when building valuation models, and in most cases it is addressed only vaguely in the whitepaper. In mid-2018, when researchers from the University of Pennsylvania analysed the smart contracts of the 50 top-grossing ICOs, they found that most token issuers had failed to properly define the supply dynamics of their tokens (which directly affect their price), and even fewer had encoded these dynamics into their smart contracts.

[Image: summary of the University of Pennsylvania study's results.]
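
To see why these supply dynamics matter for price, consider a minimal sketch of an emission schedule. The parameters are invented for illustration; the point is that the emission rate compounds into materially different circulating supplies, and hence different levels of dilution, within just a few years:

```python
# Hypothetical illustration of how an emission schedule drives circulating
# supply (and therefore dilution). Parameters are invented for the example;
# real projects should encode their schedule in the contract itself.

def circulating_supply(initial, annual_emission_rate, years):
    """Supply trajectory under compounding annual emission."""
    supply = [initial]
    for _ in range(years):
        supply.append(supply[-1] * (1 + annual_emission_rate))
    return supply

for rate in (0.02, 0.10):
    trajectory = circulating_supply(100_000_000, rate, 5)
    print(f"{rate:.0%} emission -> {trajectory[-1]:,.0f} tokens after 5 years")
# 2% emission  -> ~110M tokens after 5 years
# 10% emission -> ~161M tokens after 5 years
```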

As a result of these issues, coupled with the turbulent regulatory changes in the space, the price of a token is highly exposed to market speculation and to the actions of nefarious actors. A practice that highlights this point is the pump-and-dump scheme, in which a fringe group artificially inflates the price of a token to make a quick buck. No consideration is given to the underlying business, as value is sacrificed at the altar of easy money and FOMO.

Image source: https://www.ccn.com/pump-dump-know-signs-trading-altcoins/

Such actions, combined with a lack of investor understanding and bad investment practices, push legitimate investors toward unscientific practices that further distort the price of a token. A case in point is the practice of technical analysis, or chartism. Most token traders (and even many stock traders) use the term “technical analysis” to refer to making trades based on patterns they see on trading platforms such as TradingView. But this form of analysis is a self-fulfilling prophecy based on very little scientific fact. A good example is the “Vomiting Camel Pattern”, which is used to make trades on the assumption that the market is turning bearish, which in turn affects a token's price. The phenomenon, best explained by the person who coined the term, shows the fallacy of the current definition of technical analysis and how it affects the price of a token.

Building a Modular Methodology for Token Valuation:

In light of the problems listed above, the blockchain community needs to acknowledge that this emerging investment vehicle requires a new valuation model. Such a methodology needs its own framework, its own definitions and jargon and, most importantly, its own set of variables.

Anyone who has built a valuation model knows that the variables and the corresponding data used as inputs determine the accuracy and predictive usefulness of the model. The first person to crystallise the importance of this approach was Benjamin Graham, author of “Security Analysis” and “The Intelligent Investor” and often considered the father of value investing.

Taking these issues into consideration, we decided to go back to the drawing board and develop a new methodology for token valuation. Following a step-by-step approach, we began with the basics and addressed the issue of technical analysis, but with a new definition: we define technical analysis as the analysis of the technical features of an ICO, with a focus on smart contracts, since they are the governance model of the ICO and directly impact the supply, and thus the price, of its tokens. Our objective is to do for tokens what Benjamin Graham did for stocks: separate the intrinsic value of the asset from its speculative value.

Our report, “Cryptoasset Valuation: Identifying the variables of analysis”, thus details the technical variables that investors need to take into consideration when analysing an ICO whitepaper. We detail the limitations of smart contracts and, using inputs from ChainSecurity and Stratumn, provide a compiled list of technical variables that can help us start thinking about how to build a valuation model that respects the granularity of the token space. Our list includes the following (a sketch of how such variables might be recorded follows the list):

  • Basic attributes of the codebase (language and version)
  • Identity management system for the token sale
  • Scaling factors
  • Compliance with the ERC20 standard
  • Presence of negated conditions
  • Use of modifiers
  • Limits of functions
  • Return values of functions
  • Integer overflows and underflows
  • Re-entrancy and reordering attacks
  • GitHub/Etherscan-related variables, etc.
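
As an illustration of how such variables might be operationalised, here is a minimal sketch that records a handful of them in a simple structure and surfaces basic red flags. The field names and the sanity checks are our own illustrative assumptions, not the schema used in the report:

```python
# Sketch of recording technical variables for a given token sale.
# Field names and the toy red-flag rules are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TokenTechnicalProfile:
    language_and_version: str        # e.g. "Solidity 0.4.24"
    has_kyc_identity_check: bool     # identity management for the sale
    erc20_compliant: bool
    uses_safe_math: bool             # guards against over/underflows
    reentrancy_guarded: bool
    checks_return_values: bool
    github_commit_count: int

    def red_flags(self):
        """Return the variables that fail a basic sanity check."""
        flags = []
        if not self.erc20_compliant:
            flags.append("non-standard ERC20 interface")
        if not self.uses_safe_math:
            flags.append("unchecked arithmetic (over/underflow risk)")
        if not self.reentrancy_guarded:
            flags.append("no re-entrancy protection")
        if not self.checks_return_values:
            flags.append("ignored function return values")
        return flags

profile = TokenTechnicalProfile(
    language_and_version="Solidity 0.4.24",
    has_kyc_identity_check=True,
    erc20_compliant=True,
    uses_safe_math=False,
    reentrancy_guarded=True,
    checks_return_values=True,
    github_commit_count=214,
)
print(profile.red_flags())  # ['unchecked arithmetic (over/underflow risk)']
```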

What this report is NOT: It does not provide a valuation model. Instead, we focus on what we believe needs to be taken into consideration to start creating the building blocks of token valuation models.

What this report is:

  • A review of the current valuation methods and how they compare to each other.
  • A list of technical variables that could be used to build context-relevant token valuation models.

Moving forward:

It is our hope that this report will spark an initiative to build a new token valuation approach. Still, we have only scratched the surface. We are now focusing on the macroeconomic variables that need to be considered when performing a token valuation. As mentioned before, a token project resembles a macroeconomy: tokens have currency-like properties, and the project has its own money-circulation dynamics. Drawing on advances in endogenous monetary systems and borrowing from findings in monetary and fiscal policy, the second part of our report will focus on these issues and identify the monetary variables for building a token valuation methodology.
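
One monetary variable that can already be measured is velocity. As a minimal sketch, assuming the standard monetarist definition, velocity can be estimated as annual on-chain transaction volume divided by average network value; the figures below are invented for illustration:

```python
# Minimal sketch of estimating token velocity from on-chain data:
# V = annual on-chain transaction volume / average network value.
# The figures below are invented for illustration.

def estimate_velocity(annual_onchain_volume_usd, average_network_value_usd):
    """How many times the monetary base turns over per year."""
    return annual_onchain_volume_usd / average_network_value_usd

v = estimate_velocity(
    annual_onchain_volume_usd=2_000_000_000,  # $2B transacted on-chain per year
    average_network_value_usd=250_000_000,    # $250M average market cap
)
print(f"Estimated velocity: {v:.1f} turnovers per year")  # 8.0
```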

Our ultimate objective in determining the variables of valuation is to create a Modular Valuation Methodology. As tokens can serve different functions depending on the type of service they provide, a “one-size-fits-all” valuation model is asinine. What is required is a modular valuation model: an extensive variable list would let us select the right variables for the type of token being considered and build valuation models that best adhere to the token's specifications. Rather than using a one-size-fits-all approach, we could build context-relevant valuation models, which would be the first step in establishing a standard taxonomy in the space.
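
To show what “modular” could mean in practice, the toy selector below maps a token type to the variable modules a model would draw on. The token types and module names are hypothetical placeholders, not a proposed taxonomy:

```python
# Toy illustration of a modular valuation methodology: each token type
# pulls in only the variable modules relevant to it. Token types and
# module names are hypothetical placeholders.

VARIABLE_MODULES = {
    "technical": ["erc20_compliance", "reentrancy_guards", "overflow_checks"],
    "monetary": ["supply_schedule", "velocity", "staking_lockups"],
    "cash_flow": ["fee_revenue", "dividend_rights"],
}

TOKEN_TYPE_MODULES = {
    "utility": ["technical", "monetary"],
    "security": ["technical", "cash_flow"],
    "work": ["technical", "monetary", "cash_flow"],
}

def variables_for(token_type):
    """Assemble the variable list for a given token type."""
    modules = TOKEN_TYPE_MODULES[token_type]
    return [v for m in modules for v in VARIABLE_MODULES[m]]

print(variables_for("utility"))
# ['erc20_compliance', 'reentrancy_guards', 'overflow_checks',
#  'supply_schedule', 'velocity', 'staking_lockups']
```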

It is only by getting such inputs and building a toolbox of variables that we will be able to:

  • Develop a taxonomy of token projects.
  • Create valuation models that respect the specific functionalities of the different token types.
  • Offer regular investors a chance to make better investment decisions in the near future.

Conclusion:

Token investing was supposed to be a way to scale access to investment in new technologies and companies. However, owing to the lack of a structural framework to help investors determine whether an investment is good, a large number of scams have proliferated in the space and undermined the potential of this investment vehicle. Those who have the means and skills to perform an educated analysis are generally large institutional investors or VCs, who are in no hurry to share their secret sauce with the community.

As a result of this knowledge asymmetry, confidence in token sales has started to wane, and VCs, with their deeper knowledge and resources, are now calling the shots. Most ICO investments are now handshake deals, and VC funding is rapidly gaining traction. As of October 2018, ICO and traditional venture funding are at par ($600 million each, for a total of $1.2 billion), as the chart below shows:

[Chart: ICO vs. traditional venture funding. Source: Autonomous NEXT.]

To address this out-of-kilter situation and allow ICOs to act as the democratic investment vehicle they are supposed to be, we need to create, test and develop a new valuation framework that can be accessed and understood by the crypto community. Without it, we risk sliding back to the siloed and centralised financial practices of the past, which this technology was supposed to fragment.

ICOs need to be treated as an antidote to our financial and economic malaise ... not its twin camouflaged in token garb.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — -

Access the report.

All thoughts expressed in this post represent the point of view of the author and not the sponsors of the report.

