Fango 1000
“Anno Domini 1173, somewhere in the Italian Alps. You have been walking for seven days and seven nights, through rain, fog and mud, on your way to Fango. You heard that a trove of a thousand chronicles was found in the local monastery. They speak of a powerful machine, a devilish device, trained by humans from a future time… What kind of mystery or misery is lying there? And what’s in it for you?”
—Fango 1000, opening sequence
Fango 1000, game screenshot
Link to play: fango1000.dmstfctn.net
Fango 1000 is a multiplayer onchain text game where players compete for narrative control. The game is set in the fictional medieval village of Fango, where a trove containing a thousand chronicles has suddenly appeared inside a monastery, each describing the training of a mysterious 'machine intelligence' from the future.
Players join one of three factions with conflicting worldviews – Monks, Scholars, or Fools – and compete to tell stories about what secret the machine may hold. Some writing will persist by player consensus; other writing will be lost.
The game tasks players with interpreting AI from a medieval point of view. Yet the ‘soft’ stories written by a community of players must comply with the ‘hard’ facts inscribed in the chronicles, which contain data generated by players of dmstfctn’s AI training game Godmode Epochs (2023) – data now inscribed onto a blockchain and discoverable in the world of Fango 1000.
The game implements a protocol for onchain narrative building that explicitly links players’ actions to their (contestable) stories. The protocol was originally proposed in ‘Large Lore Models’ (Autonomous Worlds N1, 2023), an essay by dmstfctn, Eva Jäger and Alasdair Milne describing the problem of narrative capture in onchain worlds.
Fango 1000, game screenshot
Large Lore Models
In the IRL world – or ‘groundworld’1 – consensus narratives are fracturing. The Trad Web (the entrenched social networks of Web2) in particular is ineffective at delivering protocols for ground truthing on the one hand and healthy contestation on the other. Commentary on once-verifiable events has become adversarial, with events themselves (transactions, exchanges, altercations) often difficult to verify. Here, the most fundamental record keeping becomes centralized through ‘captured’ top-down infrastructure that resembles a surveillance architecture. The capacity to verify and contest becomes attached to hierarchy, access, and control.
While Autonomous Worlds2 are in their nascency, and susceptible to becoming siloed on-chain gameworlds3 for niche communities, they possess a wider potential as generative infrastructure for a new kind of commons – a public resource with a tangible benefit.4 In this article, we submit that Autonomous Worlds produce decentralized narratives5 as their essential public resource and we explore the potential for Autonomous Worlds to act as a commons geared to bringing this resource and its public value into the groundworld.
Autonomous Worlds can offer the ingredients and the world-weaving potential to become what we term an ‘autonomous knowledge commons’, afforded by its involuntary relationship of collective infrastructure ownership.6 Here we take the core features of an autonomous knowledge commons to be noncentralized yet formalized access both to the means of narrative creation, linked to a permanent record of events, and to the right to redeploy, or reseed, the results elsewhere. Building on gubsheep’s proposal that crypto-native games operate as sandboxes or ‘microcosms of the integrated digital worlds of the future’,7 we suggest that building an autonomous knowledge commons might offer a way to export procedures or knowledge in the form of narrative back into the groundworld. With this in mind, an autonomous knowledge commons could be a space to experiment with new ways of building both a consensus reality that has been lost in the groundworld and shared narratives drawn from the emergent, permanent canon afforded by on-chain games.
What, then, is required to bolster the propensity of Autonomous Worlds to become an autonomous knowledge commons? Here, we will propose a new ‘modding’ superstructure: a ‘Large Lore Model’ (LLoreM) plug-in that facilitates collective and decentralized narrative-building and explicitly links to the on-chain record. The LLoreM offers an interface that maintains a link to interobjective realities whilst communicating with the intersubjective groundworld. In other words, a LLoreM enables a commonly built consensus reality that is collectively generated and governed.8 The LLoreM could be prototypical scaffolding for stewarding the shift from a worlding wilderness to a mission-driven commons.
Lore Generation
Having established the potential, given the right infrastructure, of Autonomous Worlds to become an autonomous knowledge commons, we look to define lore and ‘lore generation’ in concrete terms by using a procedural spoken language game, I Went To the Shop, as an example. From there, we will consider how to hardcode this process of lore generation into a viable tool (LLoreM) that combines the value of player lore generation with the unique affordances of on-chain games, examining the implications of such a tool in comparison to legacy Web2 lore-recording.
In the I Went To the Shop memory game, commonly played by kids to pass the time on road trips, we find a temporary World created through a shared narrative of a trip to the shop. The rules of the game require that players repeat the phrase ‘Today I went to the shop and I bought…’, appending it each time with a new purchased item. The first item added must begin with the first letter of the alphabet and each subsequent item must begin with the next letter. So after three rounds the phrase might be: ‘Today I went to the shop and I bought an apple, bubble gum, and a cold beer.’ Gameplay emerges through the balance between adding items that one player can remember and items to throw off another player’s memory. Players keep a level of engagement and entertainment by listing uncommon, surreal, or offensive items. Even while the narrative produced is held in common, its texture is contingent on the particular circumstances of individual players: in-jokes, shared language (for ‘Z’, US English reasonably allows one to buy a zucchini, whilst in UK English you might have to buy a zebra), and – in the case of bored kids in a car – a desire to get a reaction from the adult driving might all be factors that play into these choices.
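The alphabet rule and the accrued phrase can be sketched procedurally. The following is a minimal illustration in Python of the game as described above, not part of any LLoreM implementation; all function names are hypothetical.

```python
import string

def add_item(ledger, item):
    """Append a purchase to the shared list, enforcing the alphabet rule.

    The item's first letter (ignoring a leading 'a'/'an' article) must be
    the next letter of the alphabet; otherwise the round is refused.
    """
    words = item.lower().split()
    head = words[1] if words[0] in ("a", "an") else words[0]
    expected = string.ascii_lowercase[len(ledger)]
    if not head.startswith(expected):
        raise ValueError(f"item must begin with '{expected}'")
    return ledger + [item]

def narrate(ledger):
    """Render the accrued list as the game's shared phrase."""
    if len(ledger) > 1:
        body = ", ".join(ledger[:-1]) + ", and " + ledger[-1]
    else:
        body = ledger[0]
    return "Today I went to the shop and I bought " + body + "."

ledger = []
for item in ["an apple", "bubble gum", "a cold beer"]:
    ledger = add_item(ledger, item)

print(narrate(ledger))
# → Today I went to the shop and I bought an apple, bubble gum, and a cold beer.
```

Note what the sketch makes visible: the procedure validates each contribution against a fixed rule, but once the game ends the `ledger` simply goes out of scope – nothing records or verifies what was built in common, which is the gap the next section addresses.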
We take lore here to be the decentralized, accrued narrative. Lore generation is the general concept used to describe procedures (within the World of a game or otherwise) that produce this decentralized narrative. A reified tool built for this purpose could be called a lore generator. Lore generation is important because it acts as a means of creating knowledge-claims in a suspended context, where they could later be contested. It gamifies the production of narrative, framing this production as something common and mutually constructed between players, like an amusing list of commodities bought at a shop.
But when the I Went To the Shop game ends, the ‘ledger’ of items is not recorded anywhere; it is lost, along with the players’ shared experience and the emergent narrative. Further, even if some player records the ledger (acting as a trustworthy elder figure), or an on-chain version collected the list of items introduced via its Digital Physics into its canon (creating a hard diegetic boundary through digital consensus), there is no mutually agreed protocol for recording and verifying the emergent narrative. In order to transform this narrative into lore, we need a system – and interface – that is collectively agreed upon. We will now turn to a prototypical LLoreM that could be used to produce a dual ledger capable of recording and verifying a lore-claim alongside an immutable record.
Large Lore Models (LLoreM)
A LLoreM is a tool we conceive of as a collective writing plug-in that attaches to a ‘host’ on-chain game, allowing it to be used as what Moving Castles calls a ‘narrative engine’.9 It is by definition a lore generator, but with additional design specifications that are conceptualized to respond to some of the initial epistemic problems of consensus-building that plague the groundworld. Accompanying the game’s ledger, in which players’ actions are recorded through transactions on-chain, is a ‘para-ledger’. The para-ledger becomes a contestable mythology of the core on-chain gameplay events.
Writing in the para-ledger must correspond to a block on the chain. The resultant lore, however, is written by players and is in itself nonprocedural. The function of this design is to maximize the affordances of the blockchain’s immutability while preserving the inherent mutability of human narrative (the narrative’s contestability). The human component combines the value of subjective testimony with the incontestable actions recorded on each block. As such, the para-ledger operates as an interface through which players and observers can access and make use of the blockchain ledger, offering players an opportunity to collate or reflect upon their actions while providing observers with context beyond a list of actions. The para-ledger, of course, remains contestable: this is a core part of the design and reflects a need for intersubjective consensus in all attempts to produce collective knowledge. But any contestation at this stage, even against elder testimony, is made against the backdrop of groundtruthed canon blocks.
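The dual-ledger design described above can be sketched as a data structure: an immutable, fixed-order chain of gameplay events paired with contestable, player-written entries that must each anchor to one block. This is a hypothetical illustration under drastically simplified assumptions (no real chain, no consensus mechanism); none of the class or method names come from an existing implementation.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Block:
    """A simplified stand-in for an immutable on-chain gameplay record."""
    number: int
    action: str

@dataclass
class LoreEntry:
    """A contestable, player-written narrative anchored to one block."""
    block_number: int
    author: str
    text: str
    contestations: list = field(default_factory=list)

class ParaLedger:
    """Pairs a fixed-order chain of blocks with mutable player lore."""

    def __init__(self, chain):
        self.chain = {b.number: b for b in chain}
        self.entries = []

    def write(self, block_number, author, text):
        # Lore must correspond to an existing block: the recorded action
        # is incontestable, but its narration is not.
        if block_number not in self.chain:
            raise KeyError("lore must anchor to a recorded block")
        entry = LoreEntry(block_number, author, text)
        self.entries.append(entry)
        return entry

    def contest(self, entry, author, counter_text):
        # Contestation deletes nothing: counter-claims accrue alongside
        # the original, against the backdrop of the groundtruthed block.
        entry.contestations.append((author, counter_text))

chain = [Block(0, "player_7 entered the monastery"),
         Block(1, "player_7 opened the trove")]
ledger = ParaLedger(chain)
claim = ledger.write(0, "monk_3", "A pilgrim crossed the threshold, guided by providence.")
ledger.contest(claim, "fool_1", "Providence? They stumbled in out of the rain.")
```

The design choice the sketch foregrounds is the asymmetry of mutability: `Block` is frozen while `LoreEntry` accumulates contestations, mirroring the essay’s pairing of blockchain immutability with the inherent mutability of human narrative.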
There are further speculative possibilities for interoperability here, too: if multiple on-chain games operate on a shared blockchain ledger, and players use a singular identity to play them, the LLoreM can triangulate between the different games by locating the players’ activities across the ledger, thus producing an ‘interdimensional’ narrative of the players as they move between Autonomous Worlds. Conversely, a lore iteration can be replayed in another game, stacking the para-ledger lore to correspond with the blockchain and producing iterative parafictions.10
Community Loremaxxing
Given the stakes of narrative control we outlined at the outset, it is worth reflecting on the potential for blockchain to provide a fixed ‘ordering’ for a historical record. Narrative itself is an affordance that has been lost in the Trad Web epoch, as Lev Manovich suggests of ‘rewriting’ in Web2: ‘It is as easy to add new elements to the end of a list as it is to insert them anywhere in it. All this further contributes to the antinarrative logic of the Web. If new elements are being added over time, the result is a collection, not a story.’11 This offers a sense of why achieving a consensus reality might be hard while such a system remains dominant. By enabling packets of narrative to be anchored to the fixed-order events of an Autonomous World’s concrete, decentralized ledger, a LLoreM makes the events sense-able and transparent, and thus reenables narration as a slow-burning premodern craft that is resistant to singular ownership by designated heroes or monarchs.
Through the specific affordances of on-chain World weaving, the LLoreM offers an infrastructure for Autonomous Worlds to become autonomous knowledge commons. This collective lore generation system offers a procedural alternative to an increasing reliance on GPT-pilled automated myth-generation. Whereas GPT models also draw upon a collective form of writing (vast datasets of human text), the centralized and automated technical intervention they apply is much more heavily weighted, and as such fails to preserve community control over written outputs in most instances. Instead, the LLoreM seeks a way of engaging the blockchain that uses it efficiently while also aiming towards its unique affordances; simultaneously, it is tuned to maximize the opportunities to tap into the creative potential of human collectivity: loremaxxing.
This text was originally published in Autonomous Worlds N1, ed. Guy Mackinnon-Little (London: 0xPARC, 2023), pp. 29-35.
- ‘Groundworld’ here refers to our IRL world, which we take as ground and which is collectively constituted. Alasdair Milne, Collaborative Systems in Machine Learning Artistic Research (PhD thesis), Serpentine Galleries R&D Platform and King’s College London (forthcoming, 2024).
- Autonomous Worlds is a specific research agenda working towards the understanding and construction of onchain games. For more information, see the postface below, or Autonomous Worlds N1, ed. Guy Mackinnon-Little (London: 0xPARC, Autonomous Worlds Network, and Metalabel, 2023).
- An onchain game is a game built on a blockchain. As such, the game’s rules, state and data – in other words the whole game, excluding the interface visible to players – are executed or stored on the blockchain rather than relying on a centralized gaming server. Some onchain games aim to register all in-game events on the blockchain’s ‘ledger’; most aim to record only core events, often indicative of the wider landscape of play, or to keep a preferential record of a specific type of event. Few games to date have been fully deployed on blockchains, due to a lack of mature infrastructure and prohibitive, fluctuating costs. Most onchain games – including Fango 1000 – are tested using ‘virtual blockchains’ with ‘virtual miners’ and tend not to evolve past a prototypical stage.
- Elinor Ostrom defines a commons as a scarce resource that provides users with tangible benefits but is not owned by anyone. Elinor Ostrom, Governing the Commons: The Evolution of Institutions for Collective Action (Cambridge: Cambridge University Press, 2015). See also: Martin Zeilinger, ‘Can the blockchain finally create a commons?’, Spike Art Magazine, 2022.
- We take narrative as a subjective, chronological rendering of either events in the groundworld, a fictional World, or some combination of the two.
- Future Art Ecosystems 3: Art x Decentralised Tech (Serpentine Galleries, 2022). Available at: https://www.serpentinegalleries.org/whats-on/future-art-ecosystems/
- gubsheep, ‘The Strongest Crypto Gaming Thesis’, 2021. Available at: https://gubsheep.mirror.xyz/nsteOfjATPSKH0J8lRD0j2iynmvv_C8i8eb483UzcTM
- Journalist and scholar Nathan Schneider has been researching cooperative models and DAOs in order to understand how crypto might offer something unique for coop tooling. Here he discusses self-governance for online communities. Nathan Schneider, ‘Modpol is a Self-Governance Toolkit for Communities in Online Worlds’, Hackernoon, 2022. Available at: https://hackernoon.com/modpol-is-a-self-governance-toolkit-for-communities-in-online-worlds
- Autonomous Worlds Residency Demo, 5 December 2022. See: https://twitter.com/heylukegibson/status/1599888301699186689
- Carrie Lambert-Beatty, ‘Make-Believe: Parafiction and Plausibility’, October (2009): 51-84.
- Lev Manovich, The Language of New Media (Cambridge, MA.: The MIT Press, 2001), 220.