The Importance Of Decentralized Compute In Running Unbiased, Uncensored AI

Op-Ed: As AI engines mature, they will be seen as a source of truth for users, but every company, society and government has its own view as to what the ‘truth’ is.

ChatGPT has opened the world's eyes to the power of AI, but as governments realize the reach of these large language models (LLMs), they will start to influence the outputs. LLMs that run on open, auditable, and decentralized computing networks will be critical to maintaining unbiased platforms, free from the coercion of the largest corporations and governments.

ChatGPT set a record for the fastest product to cross one million users, and while the ‘death of Google’ is certainly overstated, the potential for explosive growth in similar AI products shows the interest in a tool that can answer questions quickly, thoughtfully, and thoroughly. Later iterations of LLMs like ChatGPT will undoubtedly be even more powerful, but with this power and reach will come a freight train of scrutiny and controversy.

As these AI engines mature, they will be seen as a source of truth for users, but every company, society and government has its own view as to what the ‘truth’ is regarding certain topics.

AI Responses To Delicate Questions Are Already Drawing Attention

ChatGPT can explain the benefits of green energy, but some users have found it hard to get answers about 'the importance of carbon-based energy'. What would ChatGPT have answered about masking in early 2020, when Dr. Fauci didn't recommend it? Would that answer be the same in late 2020, when the CDC strongly recommended masking for everyone? The sensitivities are even greater in countries with more charged topics, such as India or Pakistan, not to mention countries under totalitarian regimes that enforce narratives aligned with their agendas. We have already seen the US government's interest in amplifying or suppressing certain topics on Twitter, and a single source of truth (like a popular LLM) will be even more threatening to governments, politicians, and even communities.

How can a company or organization possibly manage all of the competing demands regarding particular answers? We can expect AI engines to be banned in some countries, and where they aren't banned, they will certainly be subject to regulatory requirements that erode their credibility and likely narrow the scope of topics they can address.

Alleviating AI Biases via Open Source Engines and Decentralized Computing Networks

The first step in reducing outside influence is to build open source AI engines whose models a community can train transparently. Several projects, like Bloom and EleutherAI, are already working toward this goal. But training is just the beginning; no matter the inputs, scrutiny will fall on the outputs, and any organization running a given model will be subject to external pressures. The only inoculation against this pressure is to decentralize the actual running of the AI engine by hosting it on an open, globally distributed, decentralized network with no owners or central points of control. When any storage provider can host an AI model, and any compute provider can process auditable queries, users can trust that the answers they receive are free from hidden bias.
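To make the idea of 'auditable queries' a little more concrete, here is a minimal sketch of one way a client could send the same prompt to several independent compute providers hosting the same open model and compare the results. The provider URLs, request schema, and model name are illustrative assumptions made for this sketch; they do not describe Fluence's (or any other network's) actual API.

# Hypothetical sketch: query several independent providers of the same
# open-source model and check whether their answers agree.
# Endpoints, request format, and model name are assumptions for illustration.
import hashlib
import json
from urllib import request

PROVIDERS = [
    "https://provider-a.example/infer",
    "https://provider-b.example/infer",
    "https://provider-c.example/infer",
]

def query_provider(url: str, prompt: str) -> dict:
    """Send the prompt to one provider and return its response plus a digest."""
    body = json.dumps({"model": "open-llm", "prompt": prompt}).encode()
    req = request.Request(url, data=body, headers={"Content-Type": "application/json"})
    with request.urlopen(req, timeout=30) as resp:
        answer = json.loads(resp.read())
    # Hashing the answer lets results from different providers be compared
    # and published for public audit without re-sending the full text.
    answer["digest"] = hashlib.sha256(answer["text"].encode()).hexdigest()
    return answer

def cross_check(prompt: str) -> bool:
    """Return True if every provider produced an identical answer.
    Disagreement is a signal that some operator may be filtering outputs
    (or simply that decoding was non-deterministic)."""
    digests = {query_provider(url, prompt)["digest"] for url in PROVIDERS}
    return len(digests) == 1

In practice, exact-match comparison only works with deterministic decoding; a real network would more likely rely on verifiable execution or random spot audits, but the principle is the same: no single operator has to be trusted with the answer.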

The good news is that the technology to make this a reality already exists, but how will it be implemented, and by whom? Some may even find certain responses from an uncensored model to be morally questionable.

We Should Encourage Competition Between LLMs

The future will likely see multiple AI engines competing to serve as the source of truth. This competition is healthy and should be encouraged. We have seen open source software outcompete the largest companies before: Linux became the leading open source operating system, built by a global community of developers despite Microsoft's massive resources and a decade-long head start with Windows. The open source community could have the same success building LLMs, but just as important as building and training the AI is running it.

At Fluence, we believe that without open source and decentralized AI, we are in danger of being controlled and influenced by a few huge companies and governments. We see a future in which decentralized compute enables an inexpensive, highly performant, auditable, and fully decentralized cloud that frees applications from centralized, closed cloud ecosystems and from oligopoly censorship, empowering us all.

-

Tom Trowbridge is an accomplished business builder and web3-focused entrepreneur, and the Co-Founder & CEO of Fluence Labs, an institutional-grade decentralized serverless computing network that frees computation from centralized cloud providers. Tom is also a board member at Stronghold Digital Mining (NASDAQ: SDIG) and an investor in a number of leading web3 projects. Previously, he was a founding member and the President of Hedera Hashgraph (HBAR). Tom has a BA from Yale University and an MBA from Columbia University. He started his career financing telecom and technology companies at Bear, Stearns & Co. before joining a Boston-based VC firm to invest in early-stage technology companies. He then spent four years at Goldman Sachs as a VP before leaving to build businesses at several other financial firms.

Fluence Labs has developed an institutional-grade decentralized serverless compute network that frees computation from centralized cloud providers. To reduce the web's dependence on centralized clouds for computation, Fluence offers low-cost and verifiable compute, complemented by a natively decentralized protocol, making applications faster to build, easier to integrate, and more secure, allowing developers to focus on improving UX. Fluence Labs was seeded by 1KX, and Multicoin led its Series A round with participation from other leading web3 investors, angels, and projects such as Protocol Labs and Arweave.

This article contains links to third-party websites or other content for information purposes only (“Third-Party Sites”). The Third-Party Sites are not under the control of CoinMarketCap, and CoinMarketCap is not responsible for the content of any Third-Party Site, including without limitation any link contained in a Third-Party Site, or any changes or updates to a Third-Party Site. CoinMarketCap is providing these links to you only as a convenience, and the inclusion of any link does not imply endorsement, approval or recommendation by CoinMarketCap of the site or any association with its operators. This article is intended to be used and must be used for informational purposes only. It is important to do your own research and analysis before making any material decisions related to any of the products or services described. This article is not intended as, and shall not be construed as, financial advice. The views and opinions expressed in this article are the author’s [company’s] own and do not necessarily reflect those of CoinMarketCap.