SEC-06: Identity and Signers, week 3

Opening slide. Establish the thesis early: this is not just a tool talk, it is a talk about how sovereign systems should handle trust.
Identity
Framing

Identity

Identity is not a fixed thing; it is an ongoing process.

Identity is not your “true self”; it is something you become through ongoing participation in reality, environment, society, and communities.

  • Identity is dynamic, self-modeled, and shaped by culture, practices, relationships, and experiences
  • Identity is not a fixed “What I am” but “How I am becoming”
  • Identity and context are intrinsically related: we express and assume different dimensions of identity depending on the context
Stay very close to the note wording here. This section is the base for why trust must also be contextual.
Trust
Definitions

Trust

  • Trust is something relative, contextual, and subjective
  • Trust is also a scaling solution: it allows us to collaborate beyond our immediate circle
  • Trust is about the meaningfulness of our connections, not just their existence
  • If identity is “how you are becoming”, trust is “how you are becoming with”
  • Identity stabilizes through webs of trusting relationships, belonging, community, and shared purposes
  • Trust and identity are also tied together, dimension by dimension: I may trust you as an engineer, but not as a gardener
  • When trust collapses, whether between people, with institutions, or even with reality itself, identity becomes fragile and defensive
These distinctions are already in the notes and should remain explicit in the slides.
Ontology and its problems
Limits

Ontology and its problems

  • Ontology is fundamentally about uncovering the underlying structures of reality or a system
  • It explores questions about what exists, how entities are categorized, and how they relate to each other
  • This can feel appealing when building systems that try to model identity or trust
  • The problem is not ontology itself, but the temptation to force living systems into rigid categories
Dynamic systems
  • Trust and identity aren't static objects; they behave more like a dynamic system
  • Dynamic systems involve flux, emergence, feedback loops, and non-linearity
  • In that sense, trust is closer to “becoming” than to “being”
  • So the challenge is not to define trust once and for all, but to build systems that can work with evolving relations
Keep the point narrow: ontology is not the enemy, but rigid categories fail to capture living systems like trust.
WoT
History

WoT

  • This concept emerged in the digital world with PGP, as digital relationships between individuals created challenges that did not exist in the same way before
  • The PGP “web of trust” is a decentralized way to decide which public keys you believe really belong to which people, based on users signing each other’s keys instead of relying on a central certificate authority
  • It was one of the earliest serious attempts to solve a sovereign trust problem on the internet
  • If trust is about the meaningfulness of our relations, how do we determine trust with individuals we cannot easily meet in meatspace? How do we establish a link between a cryptographic identifier and the person behind it?
This should read almost like the notes: PGP is the historical entry point for digital Web of Trust.
WoT
Core idea

Core idea

1. Each person has a public key plus identity info (name, email, etc.) bundled into a certificate.

2. When you verify someone’s identity (e.g. in person with ID), you sign their public key with your own private key, saying “I attest this key belongs to this person.”

3. PGP then uses these signatures plus your personal trust settings to decide which keys are valid for you to use.

4. The PGP web of trust uses discrete levels of owner trust: Undefined/Unknown, Never/Do not trust, Marginal, Full, Ultimate.

Key signing parties are in-person gatherings where PGP users verify and sign each other's public keys to strengthen the decentralized web of trust.
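The validity rule behind these mechanics can be sketched in a few lines. In GnuPG's classic trust model, a key becomes valid when it carries one signature from a fully trusted key or three signatures from marginally trusted keys (both thresholds are configurable). The function and data names below are illustrative, not GnuPG's API:

```python
# Sketch of PGP-style key validity under the classic GnuPG thresholds:
# one fully trusted signature, or three marginal ones. Names are illustrative.

FULL, MARGINAL = "full", "marginal"

def key_is_valid(key, signatures, owner_trust,
                 completes_needed=1, marginals_needed=3):
    """signatures: key -> set of signer keys; owner_trust: signer -> level."""
    signers = signatures.get(key, set())
    full = sum(1 for s in signers if owner_trust.get(s) == FULL)
    marginal = sum(1 for s in signers if owner_trust.get(s) == MARGINAL)
    return full >= completes_needed or marginal >= marginals_needed

sigs = {"alice": {"bob", "carol"}}
trust = {"bob": FULL, "carol": MARGINAL}
print(key_is_valid("alice", sigs, trust))  # True: one full signature suffices
```

Real GnuPG additionally requires the signing keys themselves to be valid, which makes the computation recursive; that is omitted here for brevity.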

This slide should preserve the mechanics from the notes with minimal reinterpretation.
WoT
Lesson

PGP is both inspiration and warning

Inspiration

It proved decentralized trust could work without a certificate authority.

Warning

Manual trust assignment and key-signing rituals did not scale well socially or ergonomically.

This is still a bridge slide, but the wording should stay close to “inspiration and warning.”
Measuring trust
Signals

Measuring trust

  • The six degrees of separation theory posits that any two people in the world are connected through no more than six social links
  • WoT and key-signing practices were among the first precedents of measurable or computable trust in digital systems
  • As we started to rely more and more on digital systems and the internet, we left more breadcrumbs about who we are, what we do, and how we relate to others
  • From that information we can infer signals of meaningful relations, proximity, credibility, and reliability
  • But what we really compute are trust-relevant signals, not trust in any absolute sense
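The six-degrees claim is directly measurable on a social graph: a breadth-first search gives the number of hops between any two identities. The follow graph below is toy data, not real Nostr contact lists:

```python
from collections import deque

# Minimal sketch: degrees of separation over a contact graph via BFS.
# Graph and names are illustrative toy data.

def degrees_of_separation(graph, source, target):
    """Return the hop count from source to target, or None if unreachable."""
    if source == target:
        return 0
    seen, queue = {source}, deque([(source, 0)])
    while queue:
        node, dist = queue.popleft()
        for neighbor in graph.get(node, ()):
            if neighbor == target:
                return dist + 1
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, dist + 1))
    return None  # not connected

follows = {
    "alice": ["bob"], "bob": ["alice", "carol"],
    "carol": ["bob", "dave"], "dave": ["carol"],
}
print(degrees_of_separation(follows, "alice", "dave"))  # 3
```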
Nostr has these ingredients natively present
  • WoT: contact lists form signed social graphs
  • Six degrees of separation: measurable through social graphs
  • Rich content types: each kind represents a different measurable signal
  • Interactions such as replies, reposts, zaps, follows, and lists each reveal different relational signals
  • Combining all of that gives a much richer picture than a static trust label
This slide should stay descriptive and close to the notes: what gets computed are signals, and Nostr has those ingredients natively.
Measuring trust
Why Nostr

Nostr vs. other platforms

  • The graph is not trapped inside a single company's database
  • Identities and other signals are cryptographically attributable
  • Trust logic can be user-driven, inspectable, composable, and portable
Why this matters
  • Trust is not trapped inside platform heuristics
  • Signals can be combined into a richer picture than a static trust label
  • The user can become the root perspective instead of delegating judgment to a central authority
This slide can still frame significance, but should remain closer to the note language around portability and cryptographic attribution.
Challenges
Adversarial reality

Challenges

  • Decentralized networks present unique challenges for measuring trust
  • No central authority means there is no single canonical way to determine how trustworthy something or someone is
  • Information is disseminated across relays, so there is no central repository from which to observe the whole network
  • Cryptographic keys are cheap to generate, which means impersonation, spam, sybil attacks, and fake signals are constant possibilities
  • If trust signals can be measured, they can also be gamed
This can feel chaotic, but:

The goal is not to eliminate chaos, but to navigate it better.

In decentralized systems, the chaos is not a bug to remove, but a condition to design for.

Keep this section sober and close to the notes: no central authority, fragmented visibility, cheap keys, fake signals.
Relatr, Relate, Relative, Relation
System

Relatr, Relate, Relative, Relation

Relatr is an attempt to make sense of all of this by combining the ideas above into an operational system.

It is better understood not as a “trust oracle” but as an engine for contextual trust inference from signed relational signals.

Source pubkey
There is a source pubkey for the trust computations; since there is no central authority, the user becomes the root perspective.
Signals
The operator can integrate as many signals as they want into the algorithm.
Meaning
The important point is not that Relatr discovers truth, but that it makes trust logic explicit, inspectable, and adaptable.
This slide should remain close to the note phrasing, especially around source pubkey, operator signals, and inspectable logic.
Relatr
Mechanics

How Relatr works

Normalize

Every signal and social-graph relation is normalized into a floating point value between 0 and 1.

Map

This creates flexibility because many different kinds of evidence can be mapped into a common scoring space.

Aggregate

The final score is the result of aggregating these components in a weighted formula, then normalizing again between 0 and 1.

TTL

Scores have a TTL so they can decay and evolve over time instead of pretending trust is static, and scores from multiple Relatr instances can also be combined.
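The pipeline above can be sketched under stated assumptions; the signal names, weights, and the linear decay policy are illustrative choices, not Relatr's actual formula:

```python
import time

# Hedged sketch of the normalize -> weight -> aggregate -> decay pipeline.
# Signal names, weights, and linear TTL decay are illustrative assumptions.

def aggregate_score(signals, weights):
    """signals/weights: dicts mapping signal name -> value in [0, 1]."""
    total_weight = sum(weights[name] for name in signals)
    if total_weight == 0:
        return 0.0
    weighted = sum(signals[name] * weights[name] for name in signals)
    return weighted / total_weight  # renormalized back into [0, 1]

def decayed(score, computed_at, ttl_seconds, now=None):
    """Linear decay toward 0 until the TTL expires (one possible policy)."""
    age = (now if now is not None else time.time()) - computed_at
    remaining = max(0.0, 1.0 - age / ttl_seconds)
    return score * remaining

signals = {"follows": 1.0, "zaps": 0.4, "replies": 0.7}
weights = {"follows": 0.5, "zaps": 0.3, "replies": 0.2}
score = aggregate_score(signals, weights)
print(round(score, 2))  # 0.76
```

Combining scores from multiple instances could reuse the same aggregation step, treating each instance's output as one more signal in [0, 1].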

This is the most mechanical slide. Keep the language as close as possible to the notes.
Relatr + TAs
Distribution

TAs

  • Trusted Assertions (TA) are an emerging standard in Nostr described by NIP-85
  • In practice, they define a way for a service to publish an assertion about a pubkey as a Nostr event
  • We integrated this in Relatr as an opt-in feature, an alternative way to consume the scores generated by Relatr
  • This complements the original RPC-like pattern in Relatr, provided by ContextVM, with a producer-consumer model
Consumption model
  • The RPC-like interface allows real-time computation and interactive operations like search
  • TAs allow any client to query relays for published assertions, which is simpler and more interoperable
  • RPC gives freshness and flexibility, while TAs give distribution, caching, and easy client consumption
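On the producer side, a published assertion is just a Nostr event. The event kind and tag names below are placeholders; the real schema is defined in NIP-85:

```python
import json
import time

# Illustrative shape of a published trust assertion as a Nostr event.
# The kind number and tag names are placeholders: consult NIP-85 for
# the actual schema. Signing and id computation are omitted.

KIND_TRUSTED_ASSERTION = 30382  # placeholder kind; verify against NIP-85

def build_assertion(service_pubkey, subject_pubkey, rank):
    """Unsigned event body attesting a score about subject_pubkey."""
    return {
        "kind": KIND_TRUSTED_ASSERTION,
        "pubkey": service_pubkey,
        "created_at": int(time.time()),
        "tags": [
            ["d", subject_pubkey],  # addressable: one assertion per subject
            ["rank", str(rank)],    # hypothetical score tag
        ],
        "content": "",
    }

event = build_assertion("servicepubkey", "subjectpubkey", 87)
print(json.dumps(event, indent=2))
```

Because such events live on relays, any client can fetch the latest assertion for a pubkey with an ordinary relay query, which is what makes this path simpler than the RPC interface.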
Keep this very close to the notes: opt-in feature, alternative consumption path, RPC versus published assertions.
Other related projects
Ecosystem

Other related projects

Vertex
Web of Trust as a Service using personalized and global PageRank. Closed source.
Brainstorm
Calculates personalized trust scores based on your trusted Nostr community and publishes them as TAs. Currently in alpha.
Keep this factual and close to the notes. The point is to situate the work in an emerging ecosystem.
Closing
Closing

Thanks

Simple closing slide.