Glossary

Abstraction Scalability


Abstraction scalability is the expansion of a system's overall capability that comes from allowing existing programming components to be reused as building blocks in a new development environment.

What Is Abstraction Scalability?

Abstraction scalability is the expansion of a system's overall capability that comes from allowing existing programming components to be reused as building blocks in a new development environment. It increases the number of important operations a developer can perform without having to rebuild them.

When talking about the scalability of a program or network, the conversation often centers on computational scalability, or the capacity of a protocol to handle high throughput or complex transactions. However, there's another category of scalability that isn't quite as quantifiable but is just as crucial: abstraction scalability.

At its core, abstraction scalability refers to the ability of developers to build applications by referencing and reusing previous work of other developers. In short, abstraction scalability is about not forcing developers to “reinvent the wheel.” 

In software today, abstraction is experienced as a layered ecosystem of libraries, tools, operating systems, compilers, interpreters and many other components — each of which abstracts away the underlying system. 

Developers benefit from abstraction by grouping existing programs together, allowing them to serve as building blocks for more complex applications. Not only do developers not have to write these abstractions themselves, they don’t even necessarily need to understand their inner workings. They can leverage specific abstractions with the confidence of knowing that thousands of other developers have successfully used them.

The ability to assemble existing pieces of software without the burden of writing or even fully understanding them lets developers build faster and more easily, drawing on knowledge they might not possess individually. This frees their time for the more complicated and custom parts of their applications. It also makes it possible to write software that wouldn't be feasible otherwise. A system where developers can leverage more already-existing abstractions is a system with more abstraction scalability.

In traditional software environments (web2), abstractions are plentiful. Thousands of popular tools and libraries offer commonly used components that developers can browse and choose from. In short, web2 benefits greatly from the abstraction scalability that has resulted from decades of developer activity.
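As a rough illustration of the kind of reuse web2 developers take for granted, the sketch below fetches a web page and hashes its contents using only Python's standard library; the URL is a placeholder. HTTP, TLS and SHA-256 are abstractions the developer never has to implement, or even fully understand, to use with confidence.

```python
# A minimal sketch of web2 abstraction reuse: the HTTP client and the hash
# function already exist as battle-tested building blocks in the standard library.
import hashlib
import urllib.request

# Placeholder URL, for illustration only.
with urllib.request.urlopen("https://example.com") as response:
    body = response.read()

digest = hashlib.sha256(body).hexdigest()
print(f"Fetched {len(body)} bytes, SHA-256: {digest}")
```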

Abstraction Scalability in Web3

Web3 isn't entirely devoid of abstractions; the Solidity and Vyper compilers, as well as libraries, tooling, SDKs and other developer resources, are all examples. But as a comparatively nascent ecosystem, web3 lacks the vast array of abstractions found in more established development environments.
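As one concrete example of such an abstraction, the sketch below uses the web3.py SDK to read an account balance. It assumes web3.py v6 or later is installed and an Ethereum JSON-RPC endpoint is reachable; the endpoint URL and address are placeholders. The SDK handles the JSON-RPC transport and encoding, so the developer doesn't have to write that plumbing from scratch.

```python
# A minimal sketch using the web3.py SDK (assumes web3.py v6+ and a reachable
# Ethereum JSON-RPC endpoint; the URL and address below are placeholders).
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.org"))

# Reading an on-chain balance without hand-writing any JSON-RPC or encoding logic.
address = Web3.to_checksum_address("0x0000000000000000000000000000000000000000")
balance_wei = w3.eth.get_balance(address)
print("Balance (ETH):", Web3.from_wei(balance_wei, "ether"))
```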

Web3 developers today are still largely testing and iterating in bespoke execution environments without the benefit of well-established abstractions. As a result, they're often building "from scratch," a process that makes software development less scalable overall. Without abstraction scalability, web3 software is more limited, less safe, and slower both to write and to execute.

Scaling Abstractions

Abstractions scale when developers can draw on mature tools and code libraries, tapping into decades of prior work. The result is an increased capacity for applications to be more complex and expressive. Abstraction scalability also improves performance and security by allowing developers to choose programs that have been thoroughly battle-tested and refined.

Abstraction scalability isn’t often thought of until a developer confronts its absence. Web3 is still in its early stages, and developers today are proceeding without the full benefit of abstractions. As the web3 ecosystem grows and developers contribute more and more abstractions, we can anticipate a surge in the complexity, efficiency, security, and sheer volume of web3 applications.

Author:

Gabriel Coutinho de Paula is a contributor to the Cartesi ecosystem, building its core technologies and infrastructure. He joined the project part-time in 2020, and full-time in 2021 after defending his Master's thesis in programming languages. He leads the development of Cartesi's fault-proof system, based on the permissionless refereed tournaments technique, writing both on-chain components like smart contracts and off-chain components like the validator node.