This article is currently being written (23 April 2020)
Offerbots allow us to build any kind of map to navigate our economy, society and information.
Perhaps the easiest way to understand this is to compare them to web browsers.
Browsers and offerbots
Web browsers are complex applications but ultimately all they do is display a web resource which someone else has created. That is, they request a particular web resource and process whatever HTML, JavaScript, CSS and other content is found and referenced there. A browser is doing its job, therefore, when it faithfully displays and executes whatever the publisher wanted their website to do (except, of course, when the publisher wants to harm or abuse the privacy of web users).
In the early days of the web, browsers were all we needed. That’s because people used to create their own websites and, therefore, visiting their website with our browser enabled them to communicate information and offers to us. Today, however, most of our website visits are to sites published by aggregators who control what each of us can communicate and, in particular, which offers we can make and receive with each other.
Offerbots, therefore, need to operate at a different layer to browsers. They will:
- use the web to send and receive tens, hundreds or thousands of offers with other offerbots (through web APIs),
- filter, score and rank received offers (according to their owner's preferences), and
- act as a publisher by generating maps (web documents, emails, app views, notifications, etc.) which represent offers to their owner and their choice of other people (through a browser, email program, app, etc.).
Note that this can happen in real time but that it doesn’t have to – an offerbot can be proactive in gathering and processing offers so that your maps are always ready for you.
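As a concrete illustration, the filter/score/rank step might look something like the sketch below. Everything in it (the offer shape, the preference format, the scoring rule) is an invented assumption, since no offerbot standard exists yet:

```python
# A sketch of the pipeline above: offers arrive as data (in practice
# over a web API), then are filtered, scored and ranked against the
# owner's stated preferences. The Offer shape, the preference format
# and the scoring rule are all invented for illustration.
from dataclasses import dataclass

@dataclass
class Offer:
    sender: str
    category: str
    description: str
    price: float

def score(offer: Offer, prefs: dict) -> float:
    """Score an offer against the owner's preferences; 0 means 'filter out'."""
    if offer.category not in prefs["categories"]:
        return 0.0
    if offer.price > prefs["max_price"]:
        return 0.0
    # In this toy rule, cheaper offers in a wanted category rank higher.
    return 1.0 - offer.price / prefs["max_price"]

def rank_offers(offers: list, prefs: dict) -> list:
    scored = [(score(o, prefs), o) for o in offers]
    return [o for s, o in sorted(scored, key=lambda x: -x[0]) if s > 0]

offers = [
    Offer("alice", "bicycles", "Used touring bike", 250.0),
    Offer("bob", "pianos", "Upright piano", 900.0),
    Offer("carol", "bicycles", "Folding bike", 120.0),
]
prefs = {"categories": {"bicycles"}, "max_price": 500.0}
for o in rank_offers(offers, prefs):
    print(o.sender, o.description)
```

A real offerbot would learn or elicit far richer preferences than this; the point is only that filtering and ranking happen under the owner's control rather than an aggregator's.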
Either way, the offerbot allows you to publish two kinds of information:
- the offers as data for other offerbots to consume, and
- web documents, emails, app views and other content for you and other people to consume.
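Those two outputs can be sketched from the same underlying offers. The JSON shape and the HTML fragment below are purely illustrative assumptions, not a proposed format:

```python
# A sketch of the two publication formats above: the same ranked
# offers serialized as JSON for other offerbots to consume, and
# rendered as a simple HTML "map" for a human reader. Both formats
# are invented for illustration.
import json

offers = [
    {"sender": "carol", "description": "Folding bike", "price": 120.0},
    {"sender": "alice", "description": "Used touring bike", "price": 250.0},
]

# 1. Machine-readable: offers as data for other offerbots.
api_payload = json.dumps({"offers": offers}, indent=2)

# 2. Human-readable: a minimal web-document "map" for the owner.
rows = "\n".join(
    f"<li>{o['description']} from {o['sender']} (${o['price']:.0f})</li>"
    for o in offers
)
html_map = f"<ul>\n{rows}\n</ul>"

print(api_payload)
print(html_map)
```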
It’s worth noting that this is not a new idea.
Back to the future
Every time your browser makes a web request, it passes a user agent string to the web server. Here’s one, for instance, from an iPhone running the Chrome browser (the CriOS token identifies Chrome on iOS):
Mozilla/5.0 (iPhone; CPU iPhone OS 13_1 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) CriOS/79.0.3945.73 Mobile/15E148 Safari/604.1
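A hypothetical offerbot could identify itself the same way when making requests. A minimal sketch using Python's standard library (the "Offerbot/0.1" agent string is invented; no such convention exists):

```python
# Setting a User-Agent header on an outgoing request, using only the
# standard library. The agent string is an invented example.
from urllib.request import Request

req = Request(
    "https://example.com/offers",
    headers={"User-Agent": "Offerbot/0.1 (+https://example.com/offerbot)"},
)
# urllib stores header keys in capitalized form ("User-agent").
print(req.get_header("User-agent"))
```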
Browsers can be described as ‘user agents’ because they act on our behalf (by fetching and displaying web content for us). In the late 1990s and early 2000s, however, there was a much richer concept for agents on the internet. Here’s what IBM Research wrote in the early 2000s:
“Today, we are witnessing the first steps in the evolution of the Internet towards an open, free-market information economy of software agents buying and selling a rich variety of information goods and services. We envision the Internet some years hence as a seething milieu in which billions of economically-motivated software agents find and process information and disseminate it to humans and, increasingly, to other agents. Agents will naturally evolve from facilitators into decision-makers, and their degree of autonomy and responsibility will continue to increase with time. Ultimately, transactions among economic software agents will constitute an essential and perhaps even dominant portion of the world economy.”
These agents were part of the vision for a very different kind of internet. In 1993, Al Gore addressed the National Press Club on the topic of the National Information Infrastructure (the internet). In that speech he envisaged a “new information marketplace” which would include “information providers – local broadcasters, digital libraries, information service providers, and millions of individuals who will have information they want to share or sell”. In this marketplace, “anyone who wants to form a business to deliver information will have the means of reaching customers. And any person who wants information will be able to choose among competing information providers, at reasonable prices.”
These visions have not come to pass. Instead of having autonomous agents which learn their owner’s preferences and make decisions on their behalf, we got Web 2.0 (in which aggregators learned the preferences of millions of users and used that data to capture and sell our attention). We therefore don’t buy and sell information, we receive it for ‘free’ (which turns out to be a very high cost).
It’s not, however, too late.
Our opportunity
[This section is currently being written – 27 April 2020]
Offerbots are not intended to be fully autonomous. They gather and filter offers based on the stated interests of their owners.
Offerbots do not bypass humans, they put humans back into the system by bypassing aggregators.
Each person will have full control over which offers they receive and how they are represented. This, paradoxically, will be the most effective way for each person to have their offers faithfully transmitted to other people (rather than having aggregators filter them according to the aggregators’ own profit motive).
That is, offerbots increase our range of opportunities by allowing us to:
- make offers to a large number of people, and
- receive a vast array of offers from other people.
Offerbots will do this (and conserve our attention) by pre-processing received offers on our behalf in order to prioritize the offers we’re looking for and filter out the offers we are unlikely to be interested in.
What we must do
Offerbots do not currently exist. The purpose of this project is to understand, design, build, standardize and promote them.