NXTER.ORG

Ardor vs. the Competition, Pt. 7: Ethereum (Smart Contracts)

This post is part of a series that compares Ardor to other blockchain projects with similar features or goals. You can find the previous posts here:

This week I studied Ethereum, which probably needs no introduction.

For several of the projects I’ve surveyed throughout this series, it has been rather difficult to find detailed, technical information. Ethereum has exactly the opposite problem: there is so much information available that it is difficult to distill it into a reasonable-length article without oversimplifying important ideas.

For this reason, I have chosen only two aspects of Ethereum to compare to Ardor. This installment compares its smart contracts to Ardor’s smart transactions, and the next article will compare the approaches that the two platforms take to managing blockchain bloat. There are many more topics I would have liked to cover–its plans to move to Proof-of-Stake (Casper), its state-channel strategies (Raiden and Plasma), its partnerships with large companies through the Enterprise Ethereum Alliance, and a sampling of the projects running on it, for example–but discussing even a couple of these topics in satisfactory depth is a daunting enough task. Besides, the two topics I chose offer the most interesting comparisons between the two platforms, in my opinion (but see the Ardor vs. Plasma post, linked above, for some thoughts on Plasma).

Without further ado, let’s talk about smart contracts.

Smart Contracts and “Rich Statefulness”

Ethereum’s design combines elements of Bitcoin and Nxt, and adds several novel features. Like Bitcoin, Ethereum uses a low-level scripting language to encode transactions, and it stores the contents of each block in Merkle trees whose root hashes are recorded in the block headers (more on this in the next article). And like Nxt, it tracks the current state of account balances and other account-specific data directly instead of using Bitcoin’s unspent transaction output (UTXO) model.

The most important innovations that Ethereum adds to this mixture are twofold: the ability to store scripts (contracts) in so-called “contract accounts,” which transact autonomously instead of being controlled by a user; and the ability to persist data in an account from one transaction to the next. Ethereum’s scripting language is also somewhat more powerful than Bitcoin’s language, allowing contracts to include loops and to invoke other contracts.

Combining these ideas, it is possible to create stateful “smart contracts,” which are bits of code and data that live in contract accounts and act as autonomous agents, listening for input from users and other contracts and transacting with them according to the rules defined in their contract code. The “stateful” modifier in the previous sentence is crucial: because a smart contract can have its own internal state, it is possible for one transaction to affect how subsequent transactions are processed. This is a significant departure from Bitcoin’s model, where transaction scripts only execute a single time and where the notion of the “state” available to a script is essentially limited to whether a given output is spent or unspent.

(You might have noticed that I haven’t said anything about Turing completeness. Depending on how pedantic you’re feeling, you could argue either side of the question of whether Ethereum’s scripting language is actually Turing complete. As the speaker in this excellent video explains, though, Turing completeness is a bit of a red herring anyway. Much more important is the fact that smart contracts are stateful and can transact with one another and with users in interesting ways.)

The potential applications of smart contracts extend far beyond setting conditions on the transfer of money from one account to another. Even the original white paper (which is a great read, by the way) proposed a handful of non-financial uses, including file storage, voting, distributed computing, governance of decentralized organizations, and decentralized marketplaces. Since then, developers have found plenty of other applications, too, such as decentralized messaging. And of course, the most common application of Ethereum so far, seemingly by an overwhelming margin, has been to conduct token sales for various projects.

Ardor’s “Smart Transactions”

If that list of applications sounds familiar, it might be because all but one of them have already been implemented in Nxt and Ardor as prepackaged “smart transactions.” Pioneered by Ardor’s predecessor, Nxt, smart transactions are bits of “blockchain 2.0” functionality that the Nxt and Ardor developers have made available as part of the protocol itself. They allow developers to create blockchain applications without having to write and test their own smart contracts.

In order to enable ordinary users (i.e., non-developers) to take advantage of this functionality, too, the official Nxt and Ardor wallets include a handful of features built from smart transactions. These include:

  • the Asset Exchange, where users can issue assets, trade them, and pay dividends to asset holders;
  • the Monetary System, where users can issue currencies and conduct several different types of crowdfunding campaigns;
  • a messaging system, which allows users to send each other plain-text or encrypted messages;
  • a voting system, which allows users to conduct polls by account, account balance, asset balance, or currency balance;
  • an integrated coin shuffler, which can afford users a degree of privacy by obscuring their transaction histories;
  • a decentralized data store, which can record the hash of a file permanently on the blockchain and, optionally, record the file itself permanently in special archival nodes;
  • a decentralized marketplace, where users can buy and sell goods and services peer-to-peer;
  • a new Coin Exchange (Ardor only), where users can trade child-chain coins directly for one another; and,
  • a number of advanced features, such as phased transactions, which allow users to set constraints on when and how other transactions are executed, and account properties, which can be used to associate arbitrary data with an account.

These are not the only applications that can be built from smart transactions, of course, but they do illustrate the breadth of what can be achieved with them. All of these features, plus a few more, will be available on Ignis, Ardor’s first child chain. Creators of other child chains will have the option to implement as many of these features as needed to suit their projects.

I’ve heard several analogies to describe smart transactions, but my favorite is that they are like Legos, while smart contracts are like clay: the former don’t provide the same degree of control over the finer details, but they are quicker and easier to use than the latter, and can still be combined to form some quite impressive final products.

The analogy isn’t perfect, of course. A strong argument for smart contracts is that it is possible for potentially all of the business logic of a decentralized application (Dapp) to be recorded permanently and immutably on the blockchain, for example, whereas a Dapp built from a combination of smart transactions likely includes some external code. In the latter case, using the Dapp might require some degree of trust in the developer not to change the rules in later versions of it.

Viewed from another angle, though, this comparison hints at arguably the biggest drawback of smart contracts: the ease with which they allow programmers to make multimillion-dollar mistakes that cannot be corrected.

Security Considerations

Just about all software that is even modestly complex contains flaws, and too often these flaws make the software vulnerable to exploitation by an attacker. Smart contract developers face a particularly difficult task because the code they write is immutable, and as a result its vulnerabilities are permanent.

Unfortunately, catastrophic failures of buggy smart contracts have not been rare. The attack that froze $150 M worth of ether stored in multisig Parity wallets and the $30 M hack of that same wallet several months prior are the most recent examples to grab headlines, but they are not the first and almost certainly not the last. For an overview of some common vulnerabilities and analysis of several real attacks, including the infamous DAO hack, I strongly recommend this excellent paper by three researchers from the University of Cagliari.

It is worth noting that the Ethereum protocol and the Ethereum Virtual Machine (EVM) were not responsible for any of these attacks. Ethereum’s supporters sometimes point this out, arguing that Ethereum itself is quite secure, and all that is needed is for developers to write better smart contracts. In a literal sense they’re right, of course: in all cases, Ethereum did what it was supposed to do, and ultimately the blame lies with smart contract developers.

But personally, I wonder whether this assessment is too quick to absolve Ethereum, and whether the problem might run a bit deeper than just a few buggy smart contracts. For now, anyway, it seems to me that Ethereum’s fundamental predicament is that it gives programmers tremendous power, but insufficient tools to use that power safely.

Developers’ ambitions almost always exceed the level of complexity that they can achieve while keeping their code perfectly bug-free, and there will therefore be a constant temptation to make functionality a higher priority than security (this is nearly universal in software development, by the way). Immortalizing the buggy code that they produce by storing it in the blockchain, Ethereum brutally and mercilessly holds them to account for their sins.

Thankfully, there are certainly ways to mitigate the risk of writing vulnerable smart contracts. For example, it is possible to design a smart contract that can be updated by having it delegate its responsibilities to a second contract, commonly called a “library contract,” at an address that can be changed to point to a different library contract later.

This approach allows developers to patch vulnerabilities, but as a consequence, it introduces the thorny question of who is allowed to switch to a new library contract. If it is a single third-party account, then the design reintroduces some degree of trust between that account and users. On the other hand, if the developers take another approach, such as allowing a majority of users to vote in order to approve each new library contract, then there are potentially further problems to solve, such as writing a secure voting mechanism, making sure that users are sufficiently informed and engaged to vote, and preventing an attacker from doing significant damage in the time it takes to organize a vote.

Another very promising approach towards securing smart contracts is to use techniques of formal verification borrowed from mathematics. I do not know much about formal methods, so please take what I write here with a grain of salt, but I do know that it is easiest (or indeed, feasible at all) with simple programs whose proper functioning can be expressed as a set of short, simple rules. In such cases, it can be possible to prove with certainty that the program contains no bugs. Even straightforward techniques like looping and recursion can complicate the analysis significantly, though, so it is best if the program under test is as simple as possible.

Why am I droning on and on about all this? Putting these thoughts together, it would seem that the best way to write smart contracts might involve: 1) keeping them as short and as simple as possible; 2) delegating the core business logic to library contracts that can be updated if necessary; and 3) reusing libraries that have been thoroughly vetted, so as to keep the amount of new code to a minimum. If the second of these points requires that users trust the contract’s author to some degree, as is often the case, then contracts designed according to these three guidelines start to look a lot like Ardor’s smart transactions: bits of stable, thoroughly tested code that expose the most commonly needed functionality, which developers can assemble into more complex programs.

Trade-offs between Security and Flexibility

I am not suggesting that Ardor’s smart transactions can accomplish all of what Ethereum’s smart contracts can securely accomplish, nor am I even arguing that combinations of smart transactions can always emulate smart contracts. What I am saying, though, is that I think there is a natural tension between the flexibility that a platform offers and the security of the code that developers inevitably write for it.

In this view, blockchain platforms can be located on a security-flexibility continuum. Near the “security” extreme is Bitcoin, whose scripting language is deliberately quite limited in order to prevent users from locking their coins with vulnerable scripts (though this is still possible, of course). Nxt and Ardor occupy a position somewhere toward the middle of the spectrum, limiting developers to a set of predefined transaction types but including an awful lot of functionality in those types.

Ethereum’s smart contracts, on the other hand, occupy the entire spectrum. It is possible to write extremely simple, trivially secure scripts on Ethereum, and it is also possible to write more complicated scripts that contain very subtle vulnerabilities. Perhaps just as importantly, it is difficult for users to tell the difference between these cases–and unreasonable, in any event, to expect them to try. Using Ethereum safely necessarily means avoiding the “flexibility” end of the spectrum, even if it comes at the cost of introducing some extra trust between users and developers.

Finally, it is worth mentioning that Ardor offers a new feature, not previously available in Nxt, that helps it inch towards the “flexibility” end of the continuum: the ability to combine phasing conditions using Boolean AND, OR, and NOT operators to achieve primitive smart-contract-like behavior.

Briefly, phased transactions allow users to condition an underlying transaction on some event, such as approval by a certain number of specific accounts (m-of-n multisig), a vote by accounts holding a particular asset, the expiration of some amount of time (timelock), or the revelation of a secret (e.g., a hashlock). On Ardor, combinations of these phasing types can encode more complex conditions, such as, “transaction X is valid if a majority of ABC Corp.’s asset holders approve of it by date Y, unless it is vetoed by a supermajority of ABC Corp.’s board members.”

It will no doubt be possible to combine phasing conditions in ways that allow for unexpected outcomes, possibly including theft or loss of funds. But the advantage over smart contracts in terms of security is still there, I would argue, since developers can focus on making sure the business logic of the transaction is sound, without having to worry about low-level bugs like race conditions. And of course, the drawback of offering less flexibility than a smart contract is still there, too.

Conclusion

With a protocol defined by a set of prepackaged smart transactions instead of a low-level scripting language, Ardor will probably never be able to offer developers as wide a range of possibilities as Ethereum does, at least in cases where everything must be done on-chain for minimal trust between parties. On the other hand, writing nontrivial contracts that follow security best practices might well require additional trust between users and developers anyway. And of course, Ethereum users ultimately have to trust the authors of smart contracts not to have made any mistakes and to have duly scrutinized and tested their code in order to make sure of it.

Naturally, you might say the same thing about any software, including Ardor’s smart transactions, but there is a key difference: there is simply so much more code running on Ethereum. Nxt has been open-source since its inception, providing ample opportunity for peer review, and Ardor’s code, which builds on the Nxt codebase, will be opened soon. Moreover, each new change to the protocol has been vetted thoroughly on a public testnet before being officially released. The same ought to be true of each and every smart contract, but with so much code being written, it seems like there are inevitably more opportunities for bugs to slip through into production.

In any event, I suspect that the degree to which most successful Dapps will rely on immutable code is still an open question. If access to an immutable database and a handful of simple operations on that data are sufficient for most applications, then Ardor’s smart transactions seem to me to have an obvious advantage over smart contracts. If, in contrast, the notion that “code is law” turns out to be essential to the viability of most Dapps, with each Dapp requiring most of its unique code to be recorded on the blockchain in order to be truly trustless, then Ethereum’s approach is probably superior.

I expect that there will be real-world applications that suit each platform. But I also wonder whether it will eventually become clear that one of the two approaches best handles a sizable majority of applications. Which approach will ultimately “win” is not at all clear to me, but I suspect that the deciding factor will be users’ judgments of the degree of trust that each case requires. And since the entire appeal of blockchain technology is that it allows users to transact with minimal trust, I’d say that outcome would be quite appropriate.

Thanks for reading! If you enjoyed this article, be sure to read the next part of the series, which compares the ways that Ardor and Ethereum cope with blockchain bloat.


Try Ardor on testnet

About the latest Ardor testnet version

Ardor vs. the Competition, Pt. 5: Stratis

This post is part of a series that compares Ardor to other blockchain projects with similar features or goals. You can find the previous posts here:

This week I studied Stratis, a blockchain-as-a-service platform based on the Bitcoin protocol.

Stratis

The goal of the Stratis project is to enable businesses to create their own customizable blockchains, choosing from a set of prepackaged features. Additionally, the Stratis Group, which guides the development of Stratis, will offer consulting services to help businesses find ways to use blockchain technology effectively, and presumably will also help them configure and deploy custom blockchains on the Stratis platform.

Put this way, Stratis sounds an awful lot like Ardor. But in most of the details–to the extent that details about Stratis are available, anyway–the two platforms are quite different. More on those differences in a bit.

Currently, the Stratis platform comprises several parts:

  • NBitcoin, a comprehensive Bitcoin implementation in C# inspired by Bitcoin Core;
  • NStratis, a fork of NBitcoin that adds a proof-of-stake mining algorithm and an alternative proof-of-work algorithm;
  • the Stratis Bitcoin Full Node, which can run on either the Bitcoin network or the Stratis network, and which serves as the basis for the rest of the platform;
  • the Breeze Wallet, a simplified payment verification (SPV) wallet for both Bitcoin and Stratis that implements TumbleBit to make transactions private; and,
  • the Stratis Identity module, which allows third parties to attest to the identity of the person controlling a Stratis account.

Note that most of these components are currently in alpha.

Particularly noteworthy in this list is the integration of TumbleBit into the Breeze Wallet. The TumbleBit paper is rather dense; if you’re interested in the details, I recommend instead this excellent presentation by two of the authors. In a nutshell, TumbleBit uses one-way payment channels to transfer funds from a set of payers to an intermediary called the Tumbler, and from the Tumbler to a set of payees, without any of the parties having to trust one another. The key innovation over other payment channel implementations is that TumbleBit uses blind RSA signatures in a clever way to prevent the Tumbler from knowing which incoming transaction maps to a given outgoing transaction. If many accounts are transacting through the Tumbler, then it is impossible to trace the funds in an output account back to the input account that sent them. Not even the Tumbler can link the two accounts.

Stratis’s Breeze Wallet provides TumbleBit functionality for both Bitcoin and Stratis, making it useful to a much larger audience than would be the case if it worked only on the Stratis network. Moreover, since the TumbleBit protocol uses off-blockchain payment channels, it is possible to make many payments through the Tumbler in approximately the same amount of time as it takes to make a single payment.

The Stratis Identity module is still at the proof-of-concept stage, but it is functional nevertheless. Users can log into their Microsoft, Google, or LinkedIn accounts using the Stratis Identity mobile app, and these services will notify Stratis of the successful login. A special account owned by Stratis then records an attestation to the successful login by hashing the corresponding personally identifiable information (e.g., name and email address) and storing it on the Stratis blockchain.

An attestation by Google that a person owns a particular Gmail account is perhaps not the most useful identity service, but it is easy to see how the same mechanism could be used to prove ownership of some piece of information that is much more difficult to verify. For example, a government agent might attest that somebody presented a valid photo ID, together with a name and address. If a user can provide the name and address that match the hash on the blockchain, that would probably convince a service provider that the user also owned the corroborating photo ID, since the government agent attested to all three pieces of information together.

TumbleBit integration in the Breeze Wallet and the Stratis Identity module are two examples of the kinds of features that Stratis intends to offer on their platform. I’m not completely sure I’ve grasped the overall architecture of Stratis, but from what I can understand, the idea is for the Stratis blockchain to delegate the backend processing for each new feature, such as TumbleBit and Stratis Identity, to a dedicated set of masternodes. For example, the upcoming Breeze Node–not to be confused with the Breeze Wallet, which uses SPV instead of requiring a full node–will be a masternode that serves as a Tumbler. Similarly, there are plans to build masternodes that process Stratis Identity transactions, though I don’t really know what that means and can’t find any details.

Finally, it is worth mentioning that the Stratis team has planned several other features, most notably a way to deploy sidechains anchored to the Stratis chain. My understanding is that this will be the main mechanism that Stratis uses to provide customizable, private blockchains to clients.

Unfortunately, I haven’t been able to find any details about how sidechains on Stratis will work. The Stratis white paper refers to Blockstream’s sidechain paper, but that is the only hint I have found so far about Stratis’s design. In particular, it is not so easy to securely and trustlessly transfer value between two blockchains without having at least some of the miners on each chain validate all transactions on both chains. The details, including how the sidechain protocol handles forks and reorginzations, are crucial in order to evaluate how secure the mechanism is.

Even supposing that transfers between the Stratis chain and sidechains are secure, there is also the matter of the security of the sidechains themselves. The Stratis white paper says in several places that the Stratis chain will somehow provide security for its sidechains, but it doesn’t explain how that will work. Typically, sidechains are completely independent and must secure themselves.

Compared to Ardor

With Ardor, on the other hand, the parent chain does provide security for each child chain.

In fact, this is one of the most important differences between Ardor’s parent-chain/child-chain architecture and typical sidechain implementations. Unfortunately, without more technical details from the Stratis team, it is impossible to do a proper comparison between their design and Ardor’s approach.

One comparison that we can do is between Stratis’s TumbleBit feature and Ardor’s Coin Shuffling feature. (Note that Coin Shuffling will not be available on the Ardor chain itself, but it will be available on Ignis, the first child chain, and other child chains can also choose to support it.) This feature is Nxt’s implementation of the CoinShuffle algorithm, which allows a group of users to trustlessly agree to transfer a fixed quantity of coins from their (input) accounts to a set of output accounts, one per input, without any user being able to know which of the other users controls each of the other output accounts. The algorithm is not very complicated, and section 4.2 of the CoinShuffle paper gives a good overview of how it works.

I don’t claim to be an expert on either algorithm, but the TumbleBit approach seems to me to have a couple of advantages over CoinShuffle. Because it uses off-blockchain payment channels, it is potentially capable of scaling to a high transaction rate in addition to adding a measure of privacy to payments, addressing two problems at once. Also, if the goal is to prevent an observer from noticing correlations between several payments–which might leak information about a business’s customers or supply chain, for example–it would probably be more convenient to make the payments back-to-back from the same account via TumbleBit instead of having to first shuffle each payment to a new account.

On the subject of identity verification, I think the Stratis Identity module is an interesting proof of concept, but in my opinion Ardor provides a much richer set of tools for identity-related services. While a service like Stratis Identity can be built relatively easily on any blockchain, Ardor offers a couple of unique features that could extend such a service for some interesting applications.

On Ardor, identity validators will be able to attest to the identities of account owners using Account Properties. These are arbitrary bits of data that can be permanently associated with an account on the blockchain, rather like attestations in Stratis Identity. One novel feature that Ardor will add, though, is the ability to issue assets that can only be traded by accounts that have a specific property set.

In cases where government regulations require that asset issuers know who is purchasing their assets, this feature will allow issuers to restrict trading of their assets to accounts whose owners’ identities have been verified by compliant identity providers. This level of control will hopefully help put blockchain-based securities on a firmer legal foundation, and will make it easier for asset issuers to comply with the law.

Even apart from regulatory compliance, asset issuers will probably find other uses for this feature. For example, a club or other private organization could express eligibility requirements for membership as a set of required account properties, issue an asset that only eligible accounts could obtain, and then use the asset to pay dividends to or conduct polls of members.

Some Thoughts on Marketing

Even having read this far, you might still be wondering what exactly the Stratis platform is and how it works. To be frank, I have found myself asking these questions too, even after many hours of reading about Stratis. At the risk of speaking perhaps a bit too close to the edge of my knowledge, I think it might be helpful to compare and contrast the marketing efforts of Jelurida and the Stratis Group in order to shed some light on why it is hard for me to answer these very basic questions.

Reading the Stratis website and the white paper (linked above), I got the distinct impression that, to be blunt, those resources weren’t really written for me. The language they use reminds me of how the salespeople at my company talk, and I learned a while ago that engineers and salespeople tend not to understand each other very well.

I read that Stratis offers “simple and affordable end-to-end solutions” to “streamline and accelerate [my] blockchain project development”; that it is a “powerful and flexible blockchain development platform designed for the needs of real-world financial services businesses and other organizations that want to develop, test and deploy applications on the blockchain”; and that its “one-click process means that new chains can be launched with unprecedented speed, tailored for the needs of the organization”; but I still don’t really understand what any of this means, much less how Stratis will accomplish these things.

This type of language conveys precisely zero information to me. Without technical details, I am completely, hopelessly lost. I know that there are plenty of people who are fluent in business-speak, though, and those people can probably read the Stratis white paper and come away with a decent, if very high-level, understanding of what the company plans to do. In contrast, it took me multiple passes through the white paper before I began to grasp the big picture, and I’m still not sure I have it right.

The Ardor white paper, on the other hand, contains substantial technical detail about how Ardor works and what distinguishes it from other blockchain platforms. It is obvious, both from its content and how that content is organized, that engineers played a significant role in writing it. Upon completing my first pass through it, I understood pretty well what problems Ardor solves and how it solves them.

The point I’m trying to make with this comparison is that business-minded people and technically-minded people often speak different languages, and the marketing materials that the Stratis Group and Jelurida have created seem to reflect this difference. Personally, I found it extremely frustrating to find so little technical substance in Stratis’s resources, and this frustration has probably prevented me from really understanding Stratis.

Conclusion

Is my assessment of Stratis too harsh? Maybe. I do think that TumbleBit is an interesting piece of technology, and it seems smart for the Breeze Wallet to implement it for both Stratis and Bitcoin. Moreover, if we drop the white paper’s contention that the Stratis chain will secure its sidechains, and instead assume that sidechains will be responsible for their own security, then I can use my imagination to fill in enough of the gaps to come up with at least a rough mental image of what Stratis will look like when it is complete.

This mental image, though, is basically a direct competitor to Lisk. Sure, Stratis is based on .NET and the Bitcoin protocol instead of JavaScript and Lisk’s predefined transaction types, and the feature sets that the two teams intend to offer don’t overlap perfectly, but essentially both projects aim to provide a central, public blockchain and a set of tools for easily creating sidechains on it. Both projects are in rather early stages of development, too, and for this reason it can be difficult to find technical details about them.

Ardor is quite different. Built on the Nxt codebase, it is already far more mature than Stratis, despite not having launched on its mainnet yet. Its parent-chain/child-chain architecture achieves the goal described in the Stratis white paper–a means for businesses to create customizable blockchains without having to worry about securing them–better than existing sidechain architectures. And the rich variety of features that Ardor already supports will take quite some time for Stratis to emulate.

Perhaps just as importantly, Jelurida and the Nxt community have done a great job of making technical information about Ardor and Nxt publicly available. This information lends credibility to the Ardor project and strengthens the community. In my opinion, it is what separates true marketing from hype.



Ardor vs. the Competition, Pt. 4: Waves

This post is part of a series that compares Ardor to other blockchain projects with similar features or goals. You can find the previous posts here:

Until now, one of my main goals with this series has been to survey different approaches to scaling a distributed ledger. This week and for the next couple of posts, though, I’m shifting my focus slightly towards the business side of blockchain technology. I’ll attempt to explore the real-world problems that blockchains can solve and the ways that different projects have positioned themselves to suit their target markets.

These subjects are a bit outside my comfort zone, so I’ll thank you in advance for your patience with me in case I say something ignorant or naive. And as always, I greatly appreciate constructive criticism. 🙂

This disclaimer is especially important now, because this week I studied Waves. As a newcomer to Nxt, I’ve read just enough about its history to know that the founder of Waves, Sasha Ivanov (a.k.a. Coinomat on nxtforum.org), was an active member of the Nxt community until the turbulent period of early 2016, when he left to found Waves. I won’t attempt to rehash the debate over Ardor and the future of Nxt, which I understand ended with many asset issuers like Sasha leaving the community, but if you’re interested I’d highly recommend apenzl’s summary in SNAPSHOT and the references therein.

Instead, for this post I’ll mostly ignore the histories of Nxt and Waves, and will approach both projects with an open mind and a view towards the future. I do think there would probably be some value in a proper historical analysis, but I simply am not qualified to offer one.

With that out of the way, let’s talk about Waves.

Waves

At first glance, Waves looks a lot like a stripped-down version of Nxt. It is primarily a decentralized exchange (DEX), inspired by and conceptually similar to the Nxt Asset Exchange. Like Nxt, it uses a proof-of-stake consensus algorithm and allows users to lease their balances to other accounts in order to forge in pools. It recently added a way to associate a human-readable alias with an account number, partially replicating the functionality of Nxt’s Alias System. Even a couple of features still in development–namely, a voting system and a way to send encrypted messages–duplicate functionality that Nxt already offers.

At the same time, Waves is missing many of Nxt’s most powerful features. For now, it doesn’t support anything similar to Nxt’s phased transactions or account control options, for example, though it is worth noting that both smart contracts and multisig transactions are on the agenda.

Additionally, the white paper suggests that crowdfunding will be one of the main uses of the Waves platform, but tokens on Waves lack the customizable properties that make Nxt’s Monetary System currencies so useful for this application. For example, the Monetary System offers the ability to condition the transfer of funds on meeting a fundraising goal, a la Kickstarter, and also the option to restrict trading so as to prevent scalpers from creating a secondary market. Using this latter feature, called a “Controllable” currency in Nxt’s terminology, it is even possible for issuers to dictate both a fixed asking price and a fixed bid for the currency, enabling them to offer buyers full or partial refunds for their tokens. Crowdfunding on Waves, in contrast, is limited to issuing a token essentially at the market rate.

These observations notwithstanding, in my opinion it would be a terrible mistake to dismiss Waves as just another Nxt copycat with fewer features. For one thing, Waves offers several key features that Nxt and other platforms do not have, which I’ll describe next. Perhaps even more importantly, though, the Waves team has built a strong brand and has offered a clear and consistent vision since the platform’s inception. The field is currently so crowded, and innovation so rapid, that the combination of a simple, clear message, a strong marketing effort, and a demonstrated ability to deliver on those promises might be even more important to the long-term success of a project than the richness or novelty of its underlying technology.

Unique Features

One interesting feature that distinguishes Waves from many other platforms is the design of its DEX. It is a hybrid approach that combines a centralized order-matching engine, called the Matcher, with decentralized settlement on the Waves blockchain.

When users place orders on Waves, the Waves client sends those orders to central Matcher nodes, which maintain the order books for all tradeable pairs. Each new order is either matched against existing orders or added to the order book for the pair in question, but either way the user who submitted the new order is notified immediately whether the order was filled. It is still necessary to wait for the next block(s) to be added to the blockchain to fully confirm the transaction, but in the meantime, the user knows with high confidence the result of the order.
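The flow just described, immediate matching at a central node followed by on-chain settlement, can be sketched as a simple price-priority matching engine. Everything below (names, data structures, the fill format) is invented for illustration; it is not Waves’s actual Matcher implementation.

```python
# Toy sketch of a centralized order matcher with deferred on-chain settlement.
# All names and structures are illustrative, not Waves's actual Matcher code.
from collections import namedtuple

Order = namedtuple("Order", "side price amount account")

class Matcher:
    def __init__(self):
        self.bids = []  # resting buy orders, best (highest) price first
        self.asks = []  # resting sell orders, best (lowest) price first

    def submit(self, order):
        """Match an incoming order immediately; return fills for settlement."""
        fills = []
        if order.side == "buy":
            book = self.asks
            crosses = lambda o, best: o.price >= best.price
        else:
            book = self.bids
            crosses = lambda o, best: o.price <= best.price
        remaining = order.amount
        while remaining > 0 and book and crosses(order, book[0]):
            best = book[0]
            traded = min(remaining, best.amount)
            fills.append((best.price, traded, order.account, best.account))
            remaining -= traded
            if traded == best.amount:
                book.pop(0)
            else:
                book[0] = best._replace(amount=best.amount - traded)
        if remaining > 0:  # rest the unfilled portion in the order book
            side_book = self.bids if order.side == "buy" else self.asks
            side_book.append(order._replace(amount=remaining))
            side_book.sort(key=lambda o: -o.price if order.side == "buy" else o.price)
        return fills  # the user sees these at once; the blockchain confirms later
```

The user who submitted the order learns the fills immediately from the return value; only the settlement of those fills waits for the next block(s).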

This might not seem like a big improvement over a fully decentralized exchange, but from the handful of transactions I made on Waves, I must say I was quite impressed by the user experience. The ability to see real-time updates to the order book, and to know immediately whether my orders were filled, made a bigger difference than I had expected.

In principle, any full node can become a Matcher. The lite client currently only connects to Matchers at nodes.wavesnodes.com by default, though, so Matchers on the rest of the network probably do not see much volume. With new orders transmitted directly to these centralized nodes, and only broadcast to the whole network once they have been filled (I think), this design allows the order books to remain anonymous. I don’t know for sure how important it is for open orders to be anonymous, but it certainly seems like a feature that traders might value highly.

Another distinguishing feature of Waves is the ability to trade any token against any other token without first converting to WAVES. Combined with the integrated gateways that issue tokens pegged to U.S. dollars, euros, and several cryptocurrencies, this feature enables Waves to function as a decentralized foreign exchange market. It also allows token issuers to conduct an initial offering directly in fiat-pegged tokens. With the full client, it is even possible to pay fees in tokens instead of WAVES.

Additionally, it is worth noting that there are several features in development or on the roadmap that also distinguish Waves from other platforms. One is a reputation system that will score accounts by their age, transaction history, and other factors. There are not many details yet, but the goal is to provide users with at least a rough indication of how trustworthy a given token issuer is. The white paper even goes so far as to suggest that the reputation system will serve as “some form of decentralized KYC/AML” (know your customer/anti-money laundering) system. While it’s difficult to see how a decentralized reputation system could help issuers actually comply with KYC and AML laws, it’s not unreasonable to suppose that it could serve some analogous purpose in a blockchain community.

Speaking of compliance issues, Waves has also announced a new project, Tokenomica, that will provide a “100% compliant legal framework for different types of token crowdsales, including private equity crowdsales.” Unfortunately, that quote from the 2017 roadmap is just about the full extent of information I’ve been able to find about Tokenomica. My impression is that the project is still in its early stages, but it shows that the team is taking regulatory compliance seriously.

For completeness, I should probably mention that the Waves team is also planning to incorporate smart contracts into Waves. The scripting language will not be Turing complete, and there will be no equivalent to Ethereum’s concept of “gas,” presumably because there will be no loops. Beyond these details, there isn’t much other information available yet.

Finally, I must mention the approach that the Waves team has outlined for scaling. It consists primarily of two parts: a redesign of the forging process that breaks large blocks into “microblocks” to optimize bandwidth usage; and an optimization to how account balances are stored–or rather, not stored–that reduces memory requirements for full nodes.

The first of these two proposals, called Waves NG, is based on Bitcoin NG. In a nutshell, once a node has won the right to forge the next block, it immediately issues a key block, which is usually empty, and then broadcasts microblocks containing transactions every few seconds. The motivation for this design is that broadcasting one large block each block interval is a much less efficient way to use the network’s bandwidth, and the corresponding spikes in network activity place an artificially low bound on the number of transactions that the network can handle. By spreading transactions out over a sequence of microblocks, it is possible to increase the average data rate over the network but decrease the peak data rate, lessening the constraints that bandwidth and latency impose on the maximum transaction rate.
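The bandwidth argument can be made concrete with some back-of-the-envelope arithmetic. All numbers below are invented for illustration; they are not Waves’s actual parameters.

```python
# Toy illustration of why microblocks smooth bandwidth usage.
# All numbers are made up for illustration, not Waves NG's real parameters.
block_interval_s = 60
block_size_kb = 1200

# Classic forging: the whole block is broadcast in one burst, assumed here
# to be transmitted in roughly one second at the end of the interval.
classic_peak_kb_s = block_size_kb / 1.0

# NG-style forging: an (empty) key block, then a microblock every 3 seconds.
microblock_period_s = 3
n_microblocks = block_interval_s // microblock_period_s   # 20 microblocks
micro_peak_kb_s = (block_size_kb / n_microblocks) / 1.0   # much smaller bursts

# The average data rate is identical either way; only the peak changes.
avg_kb_s = block_size_kb / block_interval_s
```

Lowering the peak rate while keeping the same average is exactly what relaxes the bandwidth and latency constraints on the maximum transaction rate.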

The second component of the scaling plan is to implement the ideas described in this paper by Leonid Reyzin, Dmitry Meshkov, Alexander Chepurnoy, and Sasha Ivanov. I admit I haven’t spent very much time with it, but the gist is that full nodes will not all be required to store every account’s balance of every token in memory in order to validate transactions. Instead, they will store a compact digest of this information, and forgers that do store it in full–or some subset of it, if they choose to only forge transactions involving specific tokens–will generate cryptographic proofs that they have updated the account balances correctly. The forgers will then include the proofs and an updated digest in the header of each new block. Nodes that have chosen not to record the balances of all tokens involved in those transactions will still be able to validate them by using their current digest and the forger’s proofs to compute an updated digest, which they can compare to the one the forger reported.

The authors argue that this approach can reduce the amount of memory required for a full node under realistic conditions by about a factor of four. Moreover, if this optimization is able to keep all required information in memory in cases where it would otherwise have to be stored on disk, the performance improvement could be far greater–about a factor of 20, the authors suggest.
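The core verification step, checking a forger’s claimed new digest using only the old digest and a proof, can be illustrated with a much simpler structure than the authenticated AVL+ trees the paper actually uses. In this sketch the digest is the root of a fixed-depth Merkle tree over account balances, and the "proof" is the Merkle path for the modified leaf.

```python
# Highly simplified sketch of digest-based validation. The paper uses
# authenticated AVL+ trees; this fixed-depth Merkle tree only shows the idea:
# a node holding just the old root can verify a claimed new root without
# storing any account balances itself.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def verify_update(old_root, new_root, leaf_index, old_leaf, new_leaf, siblings):
    """Check that replacing old_leaf with new_leaf turns old_root into new_root.

    siblings: sibling hashes along the leaf's Merkle path, leaf level first.
    """
    cur_old, cur_new = h(old_leaf), h(new_leaf)
    idx = leaf_index
    for sib in siblings:
        if idx % 2 == 0:                      # we are the left child
            cur_old, cur_new = h(cur_old + sib), h(cur_new + sib)
        else:                                 # we are the right child
            cur_old, cur_new = h(sib + cur_old), h(sib + cur_new)
        idx //= 2
    return cur_old == old_root and cur_new == new_root
```

A node that has chosen not to store balances runs this check against the digest in the previous block header and the proof in the new one, then adopts the new digest if it verifies.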

Comparison with Ardor

Although a couple of the features described above were not present in Nxt, Ardor will offer similar functionality.

Specifically, Ardor’s parent-chain/child-chain architecture will allow users to trade all pairs of child chain coins, some of which could be pegged to fiat currencies and other cryptocurrencies. It will also be possible to price assets in any of the child chain coins, and to pay fees in the child chain coin when transacting on a given child chain. It will not be possible to trade assets against each other directly, but most of those trading pairs would probably have such low volume that it wouldn’t really be worthwhile to add this feature anyway.

As for the improvements that the Waves team has made to their DEX by partially centralizing it, it should be possible to mimic this functionality pretty closely by building a centralized order matcher on top of Nxt/Ardor. Indeed, the InstantDEX project accomplished something similar in the past, using Nxt to settle transactions in a decentralized manner.

On the subject of scaling, the proposal to reduce in-memory storage requirements for full nodes is intriguing, but I wonder whether there might be a small trade-off with security. (If you’ve read the previous articles in this series, then you have probably figured out by now that I always suspect that performance improvements entail reductions in security.) In particular, if nodes are not required to store the current state of every account, and must use the proofs and digest in each new block’s header to validate the transactions contained in it, then I assume that means that nodes will not be required, nor even will they be able, to validate unconfirmed transactions before broadcasting them to their peers. I don’t know the consequences of allowing nodes to propagate potentially invalid transactions across the network, but the thought makes me a bit uneasy.

Ardor’s approach to scaling is for all nodes to validate all transactions, but for only the minimum possible amount of information to be permanently recorded on the Ardor blockchain. In particular, only those transactions that change the balances of ARDR, the forging token, need to be stored on the blockchain in order for other nodes to trustlessly verify that each block was forged by an account that was eligible to do so. In contrast, the whole history of transactions involving only child chain coins and the assets and currencies traded on those child chains does not need to be stored on the blockchain, and hence can be pruned away, leaving only cryptographic hashes of that information behind. The result is that the blockchain stays much smaller and grows more slowly than would be the case if it stored all of this extra information.

Which approach is better depends on whether permanent storage of the blockchain or in-memory storage of current account balances presents a bigger problem as the two platforms grow. I don’t know the answer to this question, but there are a couple of related points that are probably worth making. One is that the timescales of the two problems could be quite different: I could see an explosion of new assets on the Ardor platform placing an immediate strain on memory, whereas blockchain bloat would likely pose a severe long-term problem for Waves, especially if it reaches hundreds or thousands of transactions per second, which is the current goal. My other thought is that Ardor required an entirely new architecture to implement its scaling solution, whereas Waves’s approach will not. It would no doubt be easier for Ardor to incorporate Waves’s solution at some point in the future than for Waves to implement Ardor’s solution.

Finally, perhaps the most interesting subject in this comparison is the issue of regulatory compliance. Waves has positioned itself as a platform for creating and issuing tokens, with a special focus on crowdfunding. To that end, the Waves team has indicated that they are taking a serious look at the regulatory complications that go along with crowdfunding–which might involve selling securities, for example–in order to help users comply with the law. While the suggestion that a decentralized reputation system might eventually replace traditional KYC/AML requirements strains credulity, it could at least help suppress scams and reduce the opportunities for bad actors to take advantage of others. In that sense, it might accomplish some of the same goals that regulators aim to achieve.

Ardor, for its part, will offer a couple of enhancements over Nxt that will be quite valuable for regulatory compliance. One is the ability to issue assets that can only be traded with a certain type of phased transaction, and the other is the addition of a new phased transaction type, which allows an account to approve a transaction only if the account has a certain specific property. Combining these two features, a user can issue an asset which can only be purchased by accounts that have a property that, for example, a KYC/AML-compliant identity provider has added to designate that it has verified the owner’s identity.

If your asset represents shares of a company, or a mutual fund, or some other type of security, this feature would enable you to prove to regulators that you know who is purchasing your tokens. Moreover, if you are a user interested in purchasing those types of tokens, recording a proof of your identity on the blockchain via your account’s properties will hopefully allow you to spend less time trying to convince businesses that you are who you say you are and that you aren’t laundering money.
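The rule that these two features combine to enforce can be sketched as a small predicate. The names and data model below are invented for illustration; this is not the actual Nxt/Ardor API.

```python
# Toy sketch of a property-gated asset transfer rule. The account ids,
# property name, and data model are all invented, not the Nxt/Ardor API.

KYC_PROVIDER = "IDENTITY_PROVIDER_ACCOUNT"   # hypothetical trusted account
REQUIRED_PROPERTY = "KYC_VERIFIED"

def may_receive_asset(buyer_properties):
    """buyer_properties: list of (setter_account, property_name) pairs
    recorded on the buyer's account. The transfer phases through only if
    the required property was set by the trusted identity provider."""
    return (KYC_PROVIDER, REQUIRED_PROPERTY) in buyer_properties

# A buyer verified by the provider may receive the asset...
assert may_receive_asset([(KYC_PROVIDER, "KYC_VERIFIED")])
# ...but a property set by some untrusted account does not count.
assert not may_receive_asset([("RANDOM_ACCOUNT", "KYC_VERIFIED")])
```

The key point is that only the identity provider’s signature on the property matters, so anyone can audit on-chain which accounts were allowed to buy.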

In addition, it will be possible to create child chains that support only a subset of the features that the Ardor platform offers. This will allow child chain creators to disable certain features, such as coin shuffling, that might raise red flags with regulators in some jurisdictions.

Conclusion

What, then, do we make of Waves? There is definitely something to be said for choosing one problem and trying to solve it better than anybody else can do. Abandoning Nxt’s “Swiss Army knife” approach and focusing instead on the single goal of building a great token-trading platform no doubt made it easier to pitch, develop, and market Waves. There is also a lot to be said for starting off well-funded, as Waves did with a $16M ICO.

At the same time, though, I’m not sure that an objective comparison of Waves and Ardor could conclude that Waves is as technologically mature as Ardor is. (For the record, I have tried to do a fair and objective comparison in this article, but I am not claiming that I succeeded. That’s ultimately your call.) Nxt is already capable of almost all of what Waves can do, not to mention all of the things that Waves cannot do, and Ardor is adding new functionality, too.

Perhaps Ardor’s biggest remaining challenge is to truly sell its vision the way that the Bitcoin community and the Ethereum Foundation have sold their visions, and this is where Waves has a sizable head start. Being capable of so many different things, but not purpose-built for anything in particular, Ardor faces a very difficult task here. The worst possible outcome would be for users and businesses to see it as “just another platform,” or perhaps to fail to grasp the full range of what it can do, and to simply ignore it as a result.

As for Waves, I’m excited to see what the future holds. The improvements that it has made to the Nxt Asset Exchange, though modest in my opinion, have nonetheless distinguished it as a formidable DEX. If the Waves team can follow through on their roadmap, Waves will be a fierce competitor among exchanges–centralized and decentralized alike.

Ardor vs. the Competition, Pt. 2: NEM/Mijin/Catapult

This post is part of a series that compares Ardor to other blockchain projects with similar features or goals. You can find the previous posts here:

This week I studied NEM, a public blockchain similar to Nxt in many ways. As I’m primarily interested in each blockchain project’s approach to scaling, I also researched Mijin, a version of NEM for private blockchains, and Catapult, a rewrite of Mijin which promises large performance gains and which will also be incorporated into future releases of NEM.

NEM

Although NEM’s core developers abandoned their initial plan to start NEM as a fork of Nxt, choosing instead to start the project from scratch, NEM and Nxt are still fairly similar. Like Nxt, the NEM platform provides a predefined set of allowed transactions which applications can use as building blocks to create more complex features, as opposed to using a low-level scripting language to construct transactions, like Bitcoin or Ethereum.

Both platforms support a variety of “blockchain 2.0” features, like sending messages, creating and transferring assets, and sending transactions requiring the approval of multiple accounts (m-of-n multisig). And both platforms expose their functionality through HTTP-based APIs, so developers can use virtually any language to write applications for them.

Despite these similarities, NEM also has some notable differences compared to Nxt.

Perhaps the most fundamental one is its novel consensus algorithm, called proof-of-importance. This algorithm is similar to proof-of-stake, except the probability that an account may harvest (i.e., forge) the next block depends not only on its stake of XEM, which is the native coin on NEM, but also on how recently it has transacted with other accounts and how much XEM was exchanged. Accounts that hold a large stake of XEM and which transact frequently and in high volume harvest more blocks than accounts with less XEM or accounts which only rarely transact.

The authors of the NEM Technical Reference argue that, compared to proof-of-stake, the proof-of-importance algorithm gives somewhat less weight to the wealthiest accounts when determining the right to forge/harvest the next block (Section 7.8). Proof-of-importance is also central to NEM’s spam filter, which requires that an attacker not only control a lot of accounts, which is easy to do, in order to spam the network with a large number of unconfirmed transactions, but also to hold a large stake in each account and transact frequently with other high-importance accounts.

In my view, another main difference between NEM and Nxt is the extent to which each platform’s “blockchain 2.0” features are integrated directly into the API. For example, NEM’s assets, called “mosaics,” share several features with the Nxt Monetary System’s currencies, but NEM does not have a built-in decentralized exchange for mosaics. (As a side note, the NEM Foundation has contracted with Blockchain Global to create a traditional, centralized exchange featuring mosaic-based ICO tokens.) Similarly, while you could certainly build a decentralized marketplace on top of NEM where users could buy and sell goods and services, NEM does not have such a marketplace built into its API the way that Nxt does.

Finally, one subtle but very important difference between NEM and most other blockchains, including Nxt, is the way that it handles multisignature transactions. Instead of allowing any account to generate a multisig transaction, NEM introduces the concept of a multisig account and requires that all multisig transactions originate from such accounts. Any co-signatory on the account can initiate a transaction from it, and the transaction is only executed if a sufficient number of the other co-signatories approve it.

At first this might appear to be a limitation, since it requires a separate multisig account for each distinct set of co-signatories, but it has two key advantages: the multisig account is a full account, capable of receiving payments, messages, and mosaics, for example; and co-signatories can be added and removed, so custody of the multisig account can be transferred. It is possible to create a “1-of-1” multisig account, i.e., an account with a single custodian who can transfer it to a different custodian if desired. In this way, multisig accounts on NEM can act like transferable containers for XEM, mosaics, and messages.
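The approval and custody-transfer behavior can be sketched as follows. The data model is invented for illustration and is not NEM’s actual API.

```python
# Toy model of a NEM-style multisig account (illustrative, not NEM's API).

class MultisigAccount:
    def __init__(self, cosignatories, min_approvals):
        self.cosignatories = set(cosignatories)
        self.min_approvals = min_approvals

    def execute(self, initiator, approvers):
        """A transaction initiated by one co-signatory executes only once
        enough co-signatories (counting the initiator) have approved it."""
        if initiator not in self.cosignatories:
            return False
        votes = ({initiator} | set(approvers)) & self.cosignatories
        return len(votes) >= self.min_approvals

# A "1-of-1" account behaves like a transferable container: swapping the
# sole co-signatory transfers custody of everything the account holds.
acct = MultisigAccount({"alice"}, 1)
assert acct.execute("alice", [])
acct.cosignatories = {"bob"}        # custody transferred to bob
assert not acct.execute("alice", [])
```

This transferability is exactly what Apostille, described next, exploits.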

One particularly impressive application of this concept is a notary service built on NEM called Apostille. With Apostille, the process of notarizing a document looks like this:

  1. Hash and sign the name of the document.
  2. Create a multisig account for the document derived from the resulting signature.
  3. Hash and sign the contents of the document.
  4. Send a message containing the result to the document’s multisig account.

Note that the last step also attaches a timestamp to the document, since the transaction that transfers the document’s signed hash to the multisig account is recorded on the blockchain.
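The four steps might look roughly like the sketch below. The HMAC-based “signature” and the truncated-hash account id are simplified stand-ins for the Ed25519 signatures and NEM address derivation that Apostille actually uses; the file name, key, and message format are all invented.

```python
# Hedged sketch of the Apostille flow. The HMAC "signature" and hash-derived
# account id are stand-ins for real Ed25519 signatures and NEM addresses.
import hashlib
import hmac

private_key = b"notary-private-key"          # illustrative only

def sign(data: bytes) -> bytes:
    # Stand-in for signing the hash of `data` with the notary's private key.
    return hmac.new(private_key, hashlib.sha256(data).digest(),
                    hashlib.sha256).digest()

# 1. Hash and sign the name of the document.
name_sig = sign(b"car-title.pdf")

# 2. Create an account for the document, derived from the resulting signature
#    (Apostille derives an actual NEM multisig account address here).
document_account = hashlib.sha256(name_sig).hexdigest()[:40]

# 3. Hash and sign the contents of the document.
content_sig = sign(b"...document bytes...")

# 4. Send a message containing the result to the document's account;
#    recording that transaction on-chain also timestamps the document.
message = {"to": document_account, "payload": content_sig.hex()}
```

Because the document’s account is derived deterministically from the signed name, anyone holding the document can recompute the account and look up its notarization history on the blockchain.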

As an example of a potential application of Apostille, the authors of the white paper consider a case where the notarized document is a car title. Ownership of the car can be transferred by changing co-signatories on the multisig account that contains the title; messages describing maintenance and repairs can be sent to the multisig account to record the car’s service history; and mosaics issued by governments or insurers could attest to payment of fees. In this way, the multisig account represents both the car itself and the history of other accounts’ interactions with it.

Anyway, that’s quite enough about NEM. Next, Mijin.

Mijin

At a high level, Mijin is a version of NEM that three of the core NEM developers and a company called Tech Bureau developed as a private, permissioned blockchain product. Like any private blockchain–and in contrast to NEM, which is public–a Mijin blockchain is owned and controlled by a central authority, such as a company.

This isn’t the place for a full debate about the utility of private blockchains, but as Mijin and Catapult are an important part of the NEM ecosystem, please indulge me for a minute. In my opinion, the more “private” a private blockchain becomes, the less useful it is. While I can see a case to be made for “consortium” blockchains, where a handful of independent organizations who don’t necessarily trust each other cooperate to secure the network against abuses by any one member of the group, I have trouble seeing the value in a blockchain controlled by a single authority. In my view, a blockchain without trustless consensus is basically just an extremely slow, extremely inefficient database.

I know there are plenty of people who disagree with me, though, so for the remainder of this post I’m going to assume private blockchains have value and that there is a market for them, especially in financial services, which seems to be the main industry that Tech Bureau intends for Mijin to serve.

There is not nearly as much information about Mijin available on the internet as there is about NEM, but I did learn some interesting facts that hint at its potential. For one thing, although Mijin and NEM are completely separate projects, Mijin does share the NEM API (or at least the two APIs overlap substantially), which suggests that it will be relatively easy for developers to write applications that run on either platform. The common API might also facilitate interactions between Mijin chains and the public NEM chain, but I haven’t found any information about the details of those interactions.

Additionally, the Mijin website states that Mijin will support smart contracts, though the Catapult white paper seems to slightly contradict that statement when it says, “the approach here is to make the smart contract an external component, whether centralized (i.e., status quo with existing systems) or decentralized. The outputs of these smart contracts will then enter their transactions into the ledger through a secure transaction process.” To me, this implies that the contracts themselves will be neither stored on the blockchain nor executed by all nodes on the network.

Speaking of Catapult…

Catapult

Catapult is a rewrite of Mijin with a focus on increasing the rate at which transactions can be confirmed. Judging from the white paper (linked above), the first deployments of Catapult will be at banks and other financial institutions, where the author envisions it will replace patchworks of “disjointed monolithic systems” that he says are commonly used today. Eventually, the developers also plan to integrate Catapult into NEM to facilitate scaling the public blockchain as well.

Like Mijin, Catapult is currently closed-source and many technical details are not public. I was able to find some good information digging around the NEM blog, though, especially in this thread by one of the developers.

Catapult divides the work that the network does among three types of nodes:

  • P2P nodes, which add new blocks to the blockchain and maintain consensus about its state;
  • REST nodes, which present client applications with all the features they can use from the Catapult API; and
  • API nodes, which, like P2P nodes, store the blockchain and can read directly from it (I think), but which do not add blocks to it. These nodes serve data to the REST nodes to fulfill client applications’ requests.

This breakdown appears to roughly correspond to the three-tier architecture commonly used for web applications, where the blockchain (P2P nodes) is the database, the REST nodes are the front-end, and the API nodes handle the business logic of interpreting and interacting with data in the database.

If this analogy is correct, then presumably the goal of this architecture is to allow each tier to scale independently. Especially for a private blockchain, the optimal number of P2P nodes used to establish consensus might be much smaller than the number of REST and API nodes required to handle all of the requests that applications send to the network. Delegating these responsibilities to separate nodes on the network should allow nodes of each type to be added or removed as needed to optimize performance.

Apart from this new architecture, Catapult also makes some other optimizations to improve performance. Whereas Mijin and NEM are written in Java and use HTTP for communicating with full nodes, Catapult is being written in C++, and communication between at least the API nodes and REST nodes uses full-duplex sockets (via ZeroMQ), potentially allowing for lower latency than HTTP.

A performance test of three Catapult nodes located in the same datacenter and configured to service requests from 10.8 million accounts showed that the network was able to process just over 3,000 transactions per second. It isn’t completely clear from the press release, but it sounds like each of the three nodes in this test played all three roles: P2P, API, and REST. Confusingly, the accompanying diagram appears to refer to API nodes as “blockchain data ingestion servers” and to REST nodes as “API gateway” servers.

Compared to Ardor

How does NEM compare to Ardor, then?

Really, there are (at least) two separate questions: how do NEM’s features compare to Ardor’s features? And how does NEM’s approach to scaling compare to Ardor’s approach?

Since Ardor (the platform, not the parent chain) will support all of Nxt’s current features, the comparisons I noted above between NEM and Nxt apply equally well to Ardor. In particular, Ardor’s child chains will have at their disposal a somewhat larger variety of built-in transaction types that support a richer set of features. For example, NEM does not natively support a peer-to-peer exchange for mosaics, dividend payments to mosaic holders, transactions conditioned on votes by mosaic holders (or most of Nxt’s phased transaction types, for that matter), account properties, a decentralized marketplace, or anything like Nxt’s shuffling and alias systems.

Ardor’s parent-chain/child-chain architecture will add some extra functionality, too. In particular, users will be able to exchange different child chain tokens for one another directly, without first converting to ARDR. This will be especially useful on pegged child chains, where users will be able to trade dollar-pegged coins directly for bitcoin-pegged coins (for example), whereas on NEM, somebody holding a dollar-pegged mosaic would have to sell it for XEM, then buy a bitcoin-pegged mosaic.

These differences notwithstanding, NEM still offers a rich set of features that application developers can use in interesting ways. Perhaps the best example is Apostille’s creative use of NEM’s unique multisig accounts. I’m not sure how easy it would be to replicate that kind of functionality on Ardor.

[EDIT]: Lior Yaffe, core dev and co-founder of Jelurida, has the following comment:

With NXT this can be achieved by issuing a singleton asset for each license registration and sending it between accounts.

On the question of how to scale, the two platforms differ much more dramatically.

Catapult’s approach, which NEM will eventually incorporate, is twofold: a new three-tier architecture to distribute the network’s responsibilities among three specialized types of nodes; and a series of application-level optimizations, e.g., using C++ instead of Java. We will need to defer judgment of the latter approach until additional benchmarking tests are available, but we can still cautiously speculate about the implications of the new architecture.

The biggest advantage seems to be for private blockchains, where the owner can fine-tune the quantities of the three types of nodes and the topology of the network to optimize throughput. Moreover, in such a context, blockchain bloat isn’t as severe a problem as it is for a public blockchain since companies can easily dedicate terabytes of storage on their servers to storing the blockchain.

The improvement in NEM’s performance with this new architecture, on the other hand, is much harder to predict. It is not clear whether each peer on the network would have to run all three services (P2P, API, REST) or just one of the three. In the former case, the scaling advantage to the new architecture would presumably be lost. In the latter case, the classic trade-off between speed (fewer P2P nodes, more API and REST nodes) and security (greater fraction of P2P nodes) would remain. And since nobody could control the number of each type of node on a public network, the question of what the optimal balance is would be moot.

In contrast, Ardor’s design does not try to achieve the highest possible throughput, at least initially. Rather, Ardor’s main scaling goal is to greatly reduce the size and rate of growth of the blockchain. It does this using a unique parent-chain/child-chain architecture, where all nodes on the network validate all transactions, but only accounts holding the parent chain coin (ARDR) can forge. Since the child chain coins cannot be used to forge, the child chains’ transaction history is irrelevant to the security of the network and can be pruned away.
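The pruning idea can be sketched under simplifying assumptions (the block layout, the retention window, and all names below are mine, not Ardor's actual data structures): each block records a hash of its child chain transactions alongside the parent chain transactions, so once a block is old enough, nodes can discard the child transaction bodies and keep only the hash.

```python
# Sketch of parent-chain/child-chain pruning. Parent chain transactions
# are kept forever; child chain transaction bodies are dropped after a
# retention window, leaving only their hash for later verification
# against an archival copy.

import hashlib

PRUNING_DEPTH = 1440  # hypothetical retention window, in blocks

class Block:
    def __init__(self, height, parent_txs, child_txs):
        self.height = height
        self.parent_txs = parent_txs  # kept forever
        self.child_txs = child_txs    # prunable
        payload = "|".join(child_txs).encode()
        self.child_txs_hash = hashlib.sha256(payload).hexdigest()

def prune(chain, current_height, depth=PRUNING_DEPTH):
    """Drop child chain transaction bodies older than `depth` blocks.

    The hash stays, so a pruned block can still be checked against an
    archival node that kept the full child chain data."""
    for block in chain:
        if current_height - block.height > depth and block.child_txs is not None:
            block.child_txs = None
    return chain
```

After pruning, a node still holds every parent chain transaction and every child transaction hash, but its storage no longer grows with the full child chain history.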

It is worth noting, however, that computational scaling is on the Ardor roadmap.

Specifically, it is possible that child chain transaction processing will be delegated to separate subnets of the Ardor network in the future, allowing most nodes to ignore most transactions.

Conclusion

Ardor and NEM both offer rich, largely overlapping sets of features.

Overall, my impression is that developers will probably be able to build similarly complex applications on either blockchain with comparable ease. In that sense, the two platforms are direct competitors.

In their approaches to scaling, though, Ardor and NEM are quite different.

While Catapult will likely achieve a significant improvement in the rate at which private blockchains can confirm transactions, I am somewhat more skeptical of the performance improvement that can be achieved on a public blockchain like NEM using the same approach.

Ardor, on the other hand, does not attempt to address the computational scaling problem (for now), but has found a very effective solution to the problem of blockchain bloat.

I suppose time will tell whether computational scaling or blockchain bloat is ultimately going to pose the biggest long-term problem for blockchain tech, and time will also tell whether either platform has found an adequate solution.

Ardor vs. the Competition, Pt. 1: Lisk

I recently decided to start a series of posts that compare and contrast Ardor with other blockchain projects that appear to have similar goals or features. Roughly each week, I'll pick a project whose scope overlaps at least a little with Ardor's, study its technical documentation, and post a summary of my findings here for you to critique.

This week, I've been reading about Lisk.

Lisk

In a nutshell, Lisk is a platform for developing decentralized applications (dapps) that run on sidechains anchored to the Lisk mainchain. It uses a delegated proof-of-stake (DPOS) consensus mechanism to secure the mainchain, while sidechains are each responsible for their own security (sort of, but see the description of the delegate marketplace below). The protocol uses a set of predefined transactions, rather like Nxt and Ardor, as opposed to a low-level scripting language like Bitcoin or Ethereum.

Before I get into the details, I should start by saying that Lisk is definitely in an early stage of development. The team is currently in the middle of rewriting the Lisk SDK, which will support sidechain development, and is continuously refactoring Lisk Core, which is the full node.

With the code in flux, some important architectural questions, particularly about sidechains and how they will interact with one another and with the mainchain, do not appear to have been settled yet. On the other hand, I had some difficulty finding a current, authoritative source of technical information about Lisk, so what I present here might be out of date. The best information I could find was in the wiki, this article by one of the co-founders, the roadmap, and these YouTube videos. None of the first three sources are recent, unfortunately, and even the videos don't go into much depth (though I admit I haven't watched all 6+ hours of them). If you've found better references, I'd be grateful if you could send them my way.

The marketing buzz surrounding Lisk seems to focus on the SDK, the goal of which is to make it easy to build, deploy, and secure a dapp running on a customizable blockchain. The devs wrote the SDK in JavaScript because they want to make Lisk accessible to as wide an audience as possible, and they also wrote the backend in JavaScript (Node.js) because...well, I guess I'll never understand why people insist on using JavaScript on the backend. 🙂

But clearly, ease of developing and deploying a custom blockchain is not the only goal of Lisk. If it were, then what purpose would the mainchain serve? You might as well clone Bitcoin or Nxt if all you want is a good starting point for building your own blockchain.

The mainchain/sidechain architecture is the real distinguishing feature of this platform. As far as I can tell, the mainchain serves at least three important functions:

  1. The Lisk API will allow deposits of LSK on the mainchain to be transferred to and from sidechains. With two such transactions, it will be possible to send LSK from one sidechain through the mainchain to another sidechain. Unfortunately, according to the article by one of the co-founders linked above, it sounds like transferring LSK onto a sidechain will require sending it to the sidechain's owner, which obviously requires some degree of trust. To avoid this problem, it will be possible to create sidechains that use their own forging tokens instead of LSK. Such a token would then need to be traded for LSK in order to transact through the mainchain with another sidechain. Alternatively, it might be possible for one sidechain to transact directly with another sidechain without going through the mainchain, but the developers are still researching how this would work.
  2. Eventually, the team plans to build a "delegate marketplace" where delegates who are not securing the mainchain can offer to secure sidechains and are paid "either by the [sidechain] application owner or its users." Again, the details are a little fuzzy, but there seems to be a lot of value here: presumably the Lisk network is already far larger than a typical brand new blockchain network, and the delegate marketplace gives sidechains an "off-the-shelf" set of nodes that they can use to secure themselves in their infancy.
  3. Some nodes on the network (not sure which ones) will periodically hash sidechains and store the hashes on the mainchain as a "basic validation of sidechain integrity." I haven't been able to find any details about how this mechanism will work, though.
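The "basic validation of sidechain integrity" in point 3 can be read as a generic anchoring scheme. Lisk's actual mechanism is unspecified, so everything in this sketch is an assumption: some node periodically hashes the sidechain's recent blocks and records that hash on the mainchain, and anyone can later re-hash the sidechain and compare.

```python
# Generic sidechain-anchoring sketch (not Lisk's actual design, which
# is not documented): hash the sidechain's serialized blocks into one
# checkpoint value, store it on the mainchain, verify later.

import hashlib

def checkpoint(sidechain_blocks):
    """Hash a list of serialized sidechain blocks into one anchor value."""
    h = hashlib.sha256()
    for block in sidechain_blocks:
        h.update(block.encode())
    return h.hexdigest()

def verify(sidechain_blocks, anchored_hash):
    """Re-compute the checkpoint and compare it to the hash on the mainchain."""
    return checkpoint(sidechain_blocks) == anchored_hash
```

A scheme like this detects tampering with anchored history, but it says nothing about how (or whether) a tampered sidechain could be rolled back, which is one of the open questions listed below.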

Apart from these functions, and from the obvious role it plays in transferring LSK between accounts, the mainchain itself doesn't seem to have any other intended uses. All of the business activity is supposed to occur on the sidechains.

Compared to Ardor

How does this architecture compare with Ardor's parent chain and child chains?

Maybe the most obvious difference is that each sidechain must have its own set of nodes to secure it, whether these are provided by the sidechain creator, the users, or eventually the delegate marketplace.

With Ardor, in contrast, every node on the network validates child chain transactions, but only accounts holding ARDR forge. The fact that accounts holding child chain tokens don't forge with them means that it doesn't matter how small child chains are or how unequal the distribution of tokens on them is; they are all just as secure as the parent chain.

One additional note about Lisk is that, until the delegate marketplace opens, sidechain creators choose the nodes that forge on their chains, which seems to require that users place a great deal of trust in them. On the other hand, the team has also suggested that Lisk will be flexible enough to allow sidechains to use an entirely different consensus algorithm, like proof-of-work, so it seems that sidechain creators wouldn't determine which nodes secure the chain in that case.

There are also plans to allow existing sidechains to switch consensus mechanisms even after they launch, but again I haven't been able to find details.

Clearly, both Lisk and Ardor intend to offer scaling advantages over traditional blockchains. With Lisk, the computational scaling advantage is obvious, since each forging node validates only the transactions on a single blockchain, either the mainchain or a sidechain. The reduction in required storage space (i.e., blockchain bloat) is less clear, though. Compared to Ethereum, say, it's obvious that for a similar level of total activity, the many chains in the Lisk ecosystem will each grow more slowly than the single Ethereum chain, simply because sidechains will not store each other's data.

Compared to Ardor, though, the storage savings would be modest. Ardor's parent chain will grow at a similar rate to the Lisk mainchain--as both will store only hashes of sidechain or child chain data instead of the data itself--but on Ardor the child chain data will be pruned away, eliminating the blockchain bloat problem that Lisk will still have on each sidechain.
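A back-of-the-envelope calculation makes the difference concrete. All numbers here are hypothetical (transaction size, hash size, transaction counts): a chain keeping full history grows linearly with transactions, while a pruned chain keeps one hash per block plus only the recent, unpruned transaction bodies.

```python
# Hypothetical storage comparison: full history (Lisk-style sidechain)
# vs. hashes plus a retention window (Ardor-style pruning).

TX_SIZE = 200    # bytes per transaction (assumed)
HASH_SIZE = 32   # bytes per stored hash (assumed)

def full_history_bytes(num_txs, tx_size=TX_SIZE):
    """A node that keeps every transaction body forever."""
    return num_txs * tx_size

def pruned_bytes(num_blocks, retained_txs, tx_size=TX_SIZE, hash_size=HASH_SIZE):
    """A node that keeps one hash per block plus recent tx bodies only."""
    return num_blocks * hash_size + retained_txs * tx_size

if __name__ == "__main__":
    # 10M lifetime transactions across 100k blocks, 50k still in the window
    full = full_history_bytes(10_000_000)
    pruned = pruned_bytes(100_000, 50_000)
    print(f"full history: {full / 1e9:.1f} GB, pruned: {pruned / 1e6:.1f} MB")
```

Under these made-up numbers the full-history node stores about 2 GB while the pruning node stores about 13 MB, and the gap widens as lifetime transaction count grows while the retention window stays fixed.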

Conclusion

What, then, should we make of Lisk? Honestly--and I'm very disappointed to write this--I think it's simply too early to tell. Too many important details have yet to materialize:

  • Will it be possible to convert one sidechain's token directly to another sidechain's token without converting to and from LSK? How?
  • When the delegate marketplace opens, will it be possible for users to elect delegates using sidechain tokens? Or will they have to use LSK? Or will sidechain owners maintain control over which delegates forge?
  • What will Lisk do with the hashes of sidechains that are stored on the mainchain? Will it be possible to roll back recent transactions on a sidechain to "restore" it to the state it had when it was hashed? If so, will there be some time after which this will not be possible, so that the sidechain can still be considered immutable?
  • Will the Lisk SDK provide some clean mechanism for changing the consensus algorithm on an existing sidechain? I'm not sure what this would look like.
  • What happens if a sidechain that uses LSK forks? Obviously, the LSK tokens on both resulting sidechains cannot be simultaneously backed by the same LSK reserves on the mainchain. I would assume the sidechain creator effectively gets to choose which chain is the "real" one, since he or she is the one holding the reserves on the mainchain, but I don't know for sure that this is correct.
  • Depending on how Lisk will support transactions directly between sidechains, this same concern could require additional trust between sidechain creators. In particular, if sidechain creators must hold reserves of each other's tokens to enable cross-chain transactions, which seems like one plausible way to do it, then a fork in one sidechain could give the other sidechain's creator some influence over which branch of the fork is honored. Moreover, if the forking sidechain transacts with several other sidechains, each of which hold reserves of the split token, then the situation could get ugly pretty quickly.

In my opinion, the most important advantage Lisk has over most blockchain platforms, including Ardor, is that it will accomplish a natural computational scaling by segregating each dapp onto its own blockchain. If, in addition, sidechains will be able to transact seamlessly and trustlessly with one another, then it seems like the design has immense potential.

If we're making the assumption that the Lisk team will successfully implement all the features required to make this happen, though, then we ought to grant Jelurida the same courtesy and assume that they'll be able to carry out their own scaling plans. In particular, one potential improvement on the Ardor roadmap is to confine child chain transaction processing to dedicated subnets of the Ardor network. It seems to me that this would accomplish a similar computational scaling to Lisk, while preserving Ardor's substantial advantage in reducing blockchain bloat.

In conclusion, Lisk's mainchain/sidechain architecture could potentially help it scale to accommodate a large number of dapps that could interact in interesting ways, but right now there seems to be a lot of uncertainty in the technical details. Ardor's approach is technically quite different but solves some of the same problems, namely blockchain bloat, potentially computational scaling, and the ability to transact easily between separate chains.

It will be very interesting to see how Lisk develops in the next two or three years, but then again, by that time Ardor will have been live for a long time already.

- segfaultsteve