The AI Economy & Bitcoin

In this article I want to outline what a future digital economy powered by machine learning algorithms might look like and how digital currencies like Bitcoin could help find the optimal allocation of computing power to machine learning tasks.

I've been thinking about useful and reasonable use cases for Bitcoin for quite some time now. The cryptocurrency is apparently in rising demand for online gambling and illegal drug trafficking. Not quite a noble achievement, but remarkable nonetheless. However, I fail to see Bitcoin's appeal as a replacement for existing, stable currencies. In western democracies, where central banks are independent, currencies are mostly stable and the existing banking infrastructure allows for seamless and cheap transactions between accounts, increasingly allows for instant microtransactions between individuals (for example: Paymit) and offers buyer protection for debit and credit cards. These features are all superior to what Bitcoin has to offer, unless you live in a really shitty country, where the politicians' cronies are in charge of the central bank, hyperinflation is the norm and the banking system is either nonexistent or not trustworthy.

There has been a lot of hoopla recently about the blockchain. And chances are good that in the foreseeable future it will do away with some of the tedious and expensive accounting and controlling that a lot of companies face each year - simply by keeping track of every financial asset in a public (and decentralized?) ledger. So do we need a cryptocurrency like Bitcoin at all, or is the blockchain by itself sufficient?

I believe some form of decentralized AI is a viable use case for cryptocurrencies. Below I outline how a free market could achieve the optimal allocation of funds to computing power with the use of a public ledger that tracks both the allocation of funds and the completion of machine learning tasks. If you know about any existing projects with that goal, please let me know.

Concept for a fully digital economy (for artificially intelligent participants):

Single interface:

Each server would implement an independent service and specify in its API what kind of data (images, text, audio files, executables, video) it accepts and what services (e.g. labelling images, transcribing audio files, identifying faces, finding relevant articles for keywords) it offers. The services offered, the data formats accepted, the reward for completing the task, the reward for a failed task and the maximum time required are all communicated through a standardized interface, which every API must implement.
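
To make this more concrete, here is a minimal Python sketch of what such a standardized service description could look like; the field names and values are my own illustrative assumptions, not part of any existing standard.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ServiceDescription:
    """Illustrative fields a server could publish for one offered service."""
    service: str            # e.g. "image-labelling", "audio-transcription"
    accepted_formats: list  # e.g. ["image/jpeg", "image/png"]
    reward_success: float   # price (in BTC) for a successfully completed task
    reward_failure: float   # price the server keeps even if the task fails
    max_time_seconds: int   # maximum time the server may take

# Example advertisement an image-labelling server might expose through its API:
image_labelling = ServiceDescription(
    service="image-labelling",
    accepted_formats=["image/jpeg", "image/png"],
    reward_success=0.0005,
    reward_failure=0.0001,
    max_time_seconds=120,
)

print(json.dumps(asdict(image_labelling), indent=2))
```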

Public ledger:

The public ledger holds information on all transferred funds, as well as on successfully and unsuccessfully completed tasks. It functions as a track record for each API and stores the description of the service and a history of all completed tasks (and the corresponding requests). This in turn makes it possible to compile performance statistics, for example the number of labelled images, the average time required to complete a task, the average share of accepted labels and the requested price.

The existing Bitcoin blockchain could be used for this purpose. The transactions following the completion of, or failure to complete, a task act as proof that a service was used and either fulfilled or failed to meet the requirements. The existing blockchain would need to be filtered to create a current list of available APIs.
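
As a rough illustration, assuming a hypothetical record layout for the filtered ledger entries, such performance statistics could be compiled like this:

```python
from statistics import mean

# Hypothetical ledger entries for one API, already filtered out of the blockchain.
ledger = [
    {"api": "labeller-1", "service": "image-labelling", "completed": True,
     "seconds": 34, "labels_accepted": 9, "labels_total": 10, "price": 0.0005},
    {"api": "labeller-1", "service": "image-labelling", "completed": True,
     "seconds": 41, "labels_accepted": 8, "labels_total": 10, "price": 0.0005},
    {"api": "labeller-1", "service": "image-labelling", "completed": False,
     "seconds": 120, "labels_accepted": 0, "labels_total": 10, "price": 0.0001},
]

completed = [e for e in ledger if e["completed"]]
stats = {
    "tasks_total": len(ledger),
    "tasks_completed": len(completed),
    "avg_seconds": mean(e["seconds"] for e in completed),
    "avg_acceptance_rate": mean(e["labels_accepted"] / e["labels_total"] for e in completed),
    "avg_price": mean(e["price"] for e in completed),
}
print(stats)
```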

Sample request (a code sketch of this flow follows the list):

  • The user sends an image to a server which only does image labelling.
    • The server replies whether it accepts the data, and if so with an expiry time, a price for successful completion and a price for failure to complete.
  • The user sends its agreement and transfers the funds to an escrow account.
    • The server sends back the labelled picture and deducts the price for failure from the escrow account.
  • If the user accepts, the remainder of the escrow account is transferred to the server.
  • If the user declines, the remainder of the escrow account is transferred back to the user.
  • If the expiry time is reached before the server replies, the remainder of the escrow account is transferred back to the user.
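
A minimal, purely local sketch of this escrow flow, with amounts in satoshi; the `Escrow` class and its methods are hypothetical and no actual Bitcoin transactions are involved:

```python
import time

class Escrow:
    """Holds the user's funds (in satoshi) until the task is accepted, declined or expired."""
    def __init__(self, amount, expiry_time):
        self.amount = amount
        self.expiry_time = expiry_time
        self.paid_to_server = 0

    def deduct_failure_price(self, failure_price):
        # The server receives the failure price regardless of the outcome.
        self.paid_to_server += failure_price
        self.amount -= failure_price

    def settle(self, user_accepts):
        # Declined or expired: the remainder goes back to the user.
        if time.time() > self.expiry_time or not user_accepts:
            refund, self.amount = self.amount, 0
            return {"server": self.paid_to_server, "user_refund": refund}
        # Accepted: the remainder of the escrow goes to the server.
        self.paid_to_server += self.amount
        self.amount = 0
        return {"server": self.paid_to_server, "user_refund": 0}

# Quoted terms: 50_000 satoshi for success, 10_000 for failure, 120 s expiry.
escrow = Escrow(amount=50_000, expiry_time=time.time() + 120)
escrow.deduct_failure_price(10_000)       # server delivers the labelled picture
print(escrow.settle(user_accepts=True))   # {'server': 50000, 'user_refund': 0}
```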

Chained APIs:

The usefulness of a single, unified API becomes more apparent in a more complex example. The services could be further augmented by chaining different servers and their APIs together; a code sketch of such a chain follows the list below.

  • A user could, for example, send a broad/complex text request (and some bitcoins) to a single server.

    • This server identifies what exactly the user requests (e.g. write an article about Byzantium) and generates a cost estimate.

    • The server replies whether it accepts the data, and if so with an expiry time, a price for successful completion and a price for failure to complete.

  • The user agrees and sends the full amount to the escrow account.

    • The first server contacts another API, which identifies all the services required to complete the task. Step by step, this server might then contact:
      • a first API to generate a general outline of the text (i.e. the headers),
      • a second API to write detailed content under each header,
      • a third API to add some pictures,
      • a fourth API to reduce the text to the necessary length,
      • a fifth API to apply spell checking and make sure the sentences are grammatically and semantically correct.

    • The final article is then sent back to the first server, which sends it back to the user,

  • who can either approve or decline it.
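
A rough sketch of how the first server might chain these subcontracted calls; the `call_api` helper and the API names are purely hypothetical placeholders for real network requests, each with its own escrow as in the sample request above:

```python
def call_api(name, payload):
    """Placeholder for a real network call (plus escrow handling) to a subcontracted API."""
    print(f"subcontracting '{name}'")
    return payload  # a real implementation would return the API's result

def write_article(request_text):
    outline  = call_api("outline-generator", request_text)  # general outline / headers
    draft    = call_api("section-writer", outline)          # text under each header
    pictures = call_api("image-inserter", draft)            # add some pictures
    trimmed  = call_api("length-reducer", pictures)         # cut to the necessary length
    article  = call_api("proofreader", trimmed)             # spelling, grammar, semantics
    return article

final_article = write_article("Write an article about Byzantium")
```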

Funds allocation:

Whenever a server wants to subcontract part of its task, it can look at the blockchain and check what services other APIs have provided in the past. It can then select either the cheapest API offering the required service, the most widely used one or the fastest one, depending on which feature the server is trying to optimize.
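
For illustration, assuming the performance statistics sketched earlier have already been compiled for each candidate, the selection itself is straightforward (all numbers below are made up):

```python
# Candidate APIs offering "image-labelling", with statistics compiled from the ledger.
candidates = [
    {"api": "labeller-1", "price": 0.0005, "avg_seconds": 37, "tasks_completed": 412},
    {"api": "labeller-2", "price": 0.0003, "avg_seconds": 95, "tasks_completed": 88},
    {"api": "labeller-3", "price": 0.0009, "avg_seconds": 12, "tasks_completed": 1301},
]

cheapest  = min(candidates, key=lambda c: c["price"])            # optimize for cost
fastest   = min(candidates, key=lambda c: c["avg_seconds"])      # optimize for speed
most_used = max(candidates, key=lambda c: c["tasks_completed"])  # optimize for track record

print(cheapest["api"], fastest["api"], most_used["api"])
```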

Some Corollaries:

The public ledger serves as a service catalog for each server. For a new server to promote its services, a single hand-made request has to be performed for each feature its API offers.

The most recent version of the public ledger serves as the up-to-date service catalog. Servers enter into direct competition by fulfilling requests at a lower price, with faster execution, with better results, etc.

When searching for a particular service and evaluating competitors, more weight should be given to the more recent tasks in the public ledger.

The more trusted a server is, the smaller the spread (i.e. the difference between the price for successful completion and the price for failure) becomes. As a service becomes more trusted, more and more users will be willing to pay the full price upfront.

Requiring at least one request (and the successful transfer of funds) for each feature of every API functions as a simple spam filter. Existing APIs can also quickly discover other APIs by looking only at the most recent requests in the public ledger.
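
One possible (assumed, not prescribed) way to weight recent ledger entries more heavily is exponential decay over a task's age; the 30-day half-life below is an arbitrary choice:

```python
import time

HALF_LIFE_DAYS = 30  # arbitrary: an entry from 30 days ago counts half as much

def recency_weight(timestamp, now=None):
    """Exponentially decaying weight for a ledger entry of a given age."""
    now = now if now is not None else time.time()
    age_days = (now - timestamp) / 86_400
    return 0.5 ** (age_days / HALF_LIFE_DAYS)

def weighted_acceptance_rate(entries):
    """Trust indicator: acceptance rate with recent tasks weighted more heavily."""
    weights = [recency_weight(e["timestamp"]) for e in entries]
    accepted = sum(w for w, e in zip(weights, entries) if e["accepted"])
    return accepted / sum(weights)

now = time.time()
entries = [
    {"timestamp": now - 2 * 86_400,  "accepted": True},   # recent success
    {"timestamp": now - 90 * 86_400, "accepted": False},  # old failure
]
print(round(weighted_acceptance_rate(entries), 2))  # the old failure barely matters
```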

concept end

Concluding remarks:

The main difference of this proposed public ledger / interface is a publicly visible service catalog for all available APIs, a feature the current, vastly diverse API ecosystem lacks. However, this standardized interface masks a huge part of the underlying complexity. For example, when a service offers to write a short article about three cohesive subjects, it might expect them to be listed in JSON, while for another service comma-separated values suffice. These communication problems can readily be addressed with existing solutions, such as a Content-Type header; however, they don't impose enough restrictions on API communication to ensure that every request is understood - a "meta" language would still be required. Luckily, a couple of such meta languages already exist. RAML is one example of such a RESTful (REpresentational State Transfer) API (Application Programming Interface) description language (DL). Such a protocol could allow servers to work out requests among themselves.
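
A made-up example of the same request in both formats: a Content-Type header tells the receiver how to parse each payload, but not what the fields mean or which service should be invoked.

```python
import csv, io, json

# The same request ("write a short article about three cohesive subjects")
# expressed in two formats; field names and subjects are purely illustrative.
json_payload = json.dumps(
    {"task": "write-article", "subjects": ["Byzantium", "Constantinople", "Justinian"]}
)
csv_payload = "write-article,Byzantium,Constantinople,Justinian"

parsed_json = json.loads(json_payload)                    # Content-Type: application/json
parsed_csv = next(csv.reader(io.StringIO(csv_payload)))   # Content-Type: text/csv

print(parsed_json["subjects"])  # the key names carry the meaning
print(parsed_csv[1:])           # the meaning of each column is implicit
```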

I consider the creation and/or acceptance of such a meta language a precondition for a truly decentralized AI economy, although these APIs might already be useful without one, for example by hard-coding requests to certain APIs. This is the current state of most modern web applications, which already rely on numerous other APIs to answer their requests. However, this state of affairs slows down the discovery and integration of new APIs, because each new API is only used once its features have been manually integrated into the existing software. A meta language could allow this process to be fully automated.

Update 1/2/2017

An alternative to RAML is the OpenAPI Specification:

The goal of The OpenAPI Specification is to define a standard, language-agnostic interface to REST APIs which allows both humans and computers to discover and understand the capabilities of the service without access to source code, documentation, or through network traffic inspection.

More details: https://github.com/OAI/OpenAPI-Specification

(Proposed) Next steps:

Train machine learning algorithms on RESTful API DLs and see if they can consistently identify useful services based on a simple request like: "Write a 1000 word essay on Byzantium".
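
As a strawman baseline (not the proposed learning approach), even naive keyword overlap between the request and hand-written service descriptions hints at what such a model would have to learn; all descriptions below are made up:

```python
# A trivial keyword-overlap baseline for matching a plain-text request to
# service descriptions; a trained model would replace this scoring function.
services = {
    "essay-writer":      "write essay article text about a given subject word count",
    "image-labeller":    "label images pictures objects faces",
    "audio-transcriber": "transcribe audio files speech to text",
}

def score(request, description):
    """Fraction of request words that also appear in the service description."""
    req, desc = set(request.lower().split()), set(description.lower().split())
    return len(req & desc) / len(req)

request = "Write a 1000 word essay on Byzantium"
best = max(services, key=lambda name: score(request, services[name]))
print(best)  # "essay-writer"
```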

Open questions:

  • What steps can be taken to guarantee privacy for sensitive information?

  • Feel free to add your own in the comments...