Feature branch: PDS v2 ()

* cleanup repeat process all

* wip

* skip actor search test

* skip actor search test

* tweak processAll

* decrease wait to 1 sec

* repo_blob -> record_blob

* simplify backlink linkTo

* return repo_root to one row

* sequence before updating repo_root

* invite code forUser -> forAccount

* ipld_block -> repo_block

* use lru-cache fetchMethod

* move did_cache to own db

* better error handling on did cache

* drop did_handle

* fix sequencer wait time

* debug

* debug

* more debug

* check something

* fix bday paradox

* fix bday paradox

* tidy up pds service auth

* rm skipped test

* retry http

* tidy

* improve fanout error handling

* fix test

* return signing key in did-web

* more tests

* tidy service auth checks

* user_account -> account

* remove inviteNote

* keypair per repo

* use an lru cache for keypairs as well

* clean up repo

* wip

* wrap up account manager

* tidy

* tidy

* fix tests

* fix disabled codes

* fix appview tests

* add note

* set pragmas

* tidy account manager getDb

* rename pref transactor

* user pref -> account pref

* handle blob imports

* tidy imports

* add reserveSigningKey

* wip transferAccount

* clean up transferAccount

* tests

* tidy

* tidy

* configure entryway url on pds

* handle entryway in pds admin endpoints

* make importRepo temp

* fix imports

* make email optional on pds when using entryway

* handle diffs

* handle pds entryway usage for server, identity, admin endpoints

* pds support for credentials from entryway

* setup pds tests w/ entryway service

* tidy

* tidy

* update entryway version

* wip

* test handle updates w/ entryway

* split account table into two

* tidy

* tweak scripts

* tidy tests

* tidy

* better config for actorstore & dbs

* clean up cfg more

* reorg actorstore fs layout

* handle errors on actor db create

* pr tidy & fix account deletion test

* pr feedback

* fix bad merge

* unskip test

* fix subscribe repos tests

* tidy repo root tables

* tidy

* fix tests

* tidy delete tokens

* tidy account getters

* tidy

* bulk deletes

* increase chunk size

* handle racing refreshes

* wip

* fix auth test

* invert import flow

* clean up actor store on create account failure

* tweak sequencer

* prevent invite code races on createAccount

* rm note

* add back in race protection on getAccountInviteCodes

* start feature branch

* deleted app migration table

* patch up new auth test

* rm note

* g

* create account delegated from entryway

* tidy

* fix test

* change plcOp type to unknown

* small fixes

* sync up w entryway branch

* Use proper error when authed account is not found ()

provide proper error when account not found in access-takedown check

* build branch

* build on ghcr

* tweak service file

* tweak service file

* change where we save reserved keys

* no tmp dir in blobstore either

* fix blobstore temp location again

* handle repeat record_blobs

* create account before submitting plc op & undo if fail

* small tweak

* limit the number of local records

* push out empty commit on transfer

* fix issue with record_blob

* add push blob endpoint

* Set and validate token audiences on pds v2 ()

set and validate token audience on pds v2

* merge

* include entryway did on tests

* build branch

* fix cache issue

* xrpc server blob limit

* put correct bytes

* add auth to routes

* handle quarantining/unquarantining a blob that does not exist

* tidy

* fix transfer tests

* fix email request routes for entryway

* PDS v2 entryway account deletion ()

* add admin lexicon for account deletion

* implement admin account deletion endpoint

* fix entryway proxying on account email checks

* proxy to entryway for acct deletion

* read-after-write sanity check

* tweak

* wip

* finish refactor

* fix test schema

* application retry logic for busy

* pr feedback

* rm lru-cache

* fix test pg schema

* fix transfer test

* Sqlite instrumentation for pds v2 ()

* sqlite instrumentation

* build

* remove build

* dont reimport blobs

* send ticks during import

* close on error

* catch handle validation error

* add log

* fix test

* return emailConfirmedAt on getAccountInfo

* Upgrade sharp on pds v2 ()

upgrade sharp to 0.32.6

* read all bytes before parsing car

* Async car reader ()

* asynchronously read in car

* dont buffer car

* tweak

* Gracefully handle indexing of invalid records ()

* gracefully handle indexing of invalid records

* fix repo tests

* Fix role auth for access-or-role verifier, getBlob check on actor takedowns ()

fix role auth for access-or-role verifier, fix getBlob actor takedown check

* better cleanup of actor-stores

* add ability to not ensure leaves

* tidy

* allow did:web transfer

* Migration utility for actor-store ()

beginnings of helper for migrating all actors

Co-authored-by: Devin Ivy <devinivy@gmail.com>

* base case for findBlobRefs

* App-level retries for sqlite on pds ()

* revamp retry helper to be more flexible re: backoff strategies

* sqlite timeout helper

* ensure sqlite wal on db creation/migration rather than every open

* layer retries for sqlite on writes outside transactions on pds

* tidy

* fix up lockfile

* tidy

* fix lex codegen

* fix timing bug in threadgate test

* No-op update handling ()

do not produce commits on no-op updates

* Retry on all SQLITE_BUSY error codes ()

retry on all sqlite_busy error codes
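The retry-on-busy behavior can be sketched as a small synchronous helper (hypothetical names, not the actual PDS helper, which also sleeps with backoff between attempts). It treats any error code beginning with `SQLITE_BUSY` — `SQLITE_BUSY`, `SQLITE_BUSY_SNAPSHOT`, and so on — as retryable:

```typescript
// Hypothetical sketch of retrying a synchronous sqlite operation on any
// SQLITE_BUSY* error code (SQLITE_BUSY, SQLITE_BUSY_SNAPSHOT, ...).
const isBusyError = (err: unknown): boolean => {
  const code = (err as { code?: unknown })?.code
  return typeof code === 'string' && code.startsWith('SQLITE_BUSY')
}

function retrySqlite<T>(fn: () => T, maxRetries = 5): T {
  let lastErr: unknown
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return fn()
    } catch (err) {
      if (!isBusyError(err)) throw err // non-busy errors propagate immediately
      lastErr = err
      // a real helper would sleep with backoff here before the next attempt
    }
  }
  throw lastErr
}
```

Checking the code prefix rather than equality is what makes this cover all of SQLite's extended busy codes, not just plain `SQLITE_BUSY`.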

* Pds v2 ensure sqlite ready ()

ensure sqlite is ready before making queries

* try something

* tidy

* dont build branch

---------

Co-authored-by: Devin Ivy <devinivy@gmail.com>
Daniel Holmgren 2023-12-04 18:00:09 -06:00 committed by GitHub
parent cad30a7cc8
commit f9fd3e68ca
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
278 changed files with 8610 additions and 6933 deletions

@ -0,0 +1,20 @@
{
"lexicon": 1,
"id": "com.atproto.admin.deleteAccount",
"defs": {
"main": {
"type": "procedure",
"description": "Delete a user account as an administrator.",
"input": {
"encoding": "application/json",
"schema": {
"type": "object",
"required": ["did"],
"properties": {
"did": { "type": "string", "format": "did" }
}
}
}
}
}
}

@ -0,0 +1,27 @@
{
"lexicon": 1,
"id": "com.atproto.temp.importRepo",
"defs": {
"main": {
"type": "procedure",
"description": "Gets the did's repo, optionally catching up from a specific revision.",
"parameters": {
"type": "params",
"required": ["did"],
"properties": {
"did": {
"type": "string",
"format": "did",
"description": "The DID of the repo."
}
}
},
"input": {
"encoding": "application/vnd.ipld.car"
},
"output": {
"encoding": "text/plain"
}
}
}
}

@ -0,0 +1,24 @@
{
"lexicon": 1,
"id": "com.atproto.temp.pushBlob",
"defs": {
"main": {
"type": "procedure",
"description": "Gets the did's repo, optionally catching up from a specific revision.",
"parameters": {
"type": "params",
"required": ["did"],
"properties": {
"did": {
"type": "string",
"format": "did",
"description": "The DID of the repo."
}
}
},
"input": {
"encoding": "*/*"
}
}
}
}

@ -0,0 +1,44 @@
{
"lexicon": 1,
"id": "com.atproto.temp.transferAccount",
"defs": {
"main": {
"type": "procedure",
"description": "Transfer an account.",
"input": {
"encoding": "application/json",
"schema": {
"type": "object",
"required": ["handle", "did", "plcOp"],
"properties": {
"handle": { "type": "string", "format": "handle" },
"did": { "type": "string", "format": "did" },
"plcOp": { "type": "unknown" }
}
}
},
"output": {
"encoding": "application/json",
"schema": {
"type": "object",
"required": ["accessJwt", "refreshJwt", "handle", "did"],
"properties": {
"accessJwt": { "type": "string" },
"refreshJwt": { "type": "string" },
"handle": { "type": "string", "format": "handle" },
"did": { "type": "string", "format": "did" }
}
}
},
"errors": [
{ "name": "InvalidHandle" },
{ "name": "InvalidPassword" },
{ "name": "InvalidInviteCode" },
{ "name": "HandleNotAvailable" },
{ "name": "UnsupportedDomain" },
{ "name": "UnresolvableDid" },
{ "name": "IncompatibleDidDoc" }
]
}
}
}
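Per the `transferAccount` input schema above, `handle`, `did`, and `plcOp` are all required. A hand-rolled sketch of that required-field check (illustrative only — the real server validates inputs through the generated lexicon machinery):

```typescript
// Illustrative required-field check mirroring the transferAccount input
// schema; the actual PDS uses the generated lexicon validators instead.
const required = ['handle', 'did', 'plcOp'] as const

function missingFields(input: Record<string, unknown>): string[] {
  return required.filter((k) => input[k] === undefined)
}
```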

@ -8,6 +8,7 @@ import {
import { schemas } from './lexicons'
import { CID } from 'multiformats/cid'
import * as ComAtprotoAdminDefs from './types/com/atproto/admin/defs'
import * as ComAtprotoAdminDeleteAccount from './types/com/atproto/admin/deleteAccount'
import * as ComAtprotoAdminDisableAccountInvites from './types/com/atproto/admin/disableAccountInvites'
import * as ComAtprotoAdminDisableInviteCodes from './types/com/atproto/admin/disableInviteCodes'
import * as ComAtprotoAdminEmitModerationEvent from './types/com/atproto/admin/emitModerationEvent'
@ -76,6 +77,9 @@ import * as ComAtprotoSyncNotifyOfUpdate from './types/com/atproto/sync/notifyOf
import * as ComAtprotoSyncRequestCrawl from './types/com/atproto/sync/requestCrawl'
import * as ComAtprotoSyncSubscribeRepos from './types/com/atproto/sync/subscribeRepos'
import * as ComAtprotoTempFetchLabels from './types/com/atproto/temp/fetchLabels'
import * as ComAtprotoTempImportRepo from './types/com/atproto/temp/importRepo'
import * as ComAtprotoTempPushBlob from './types/com/atproto/temp/pushBlob'
import * as ComAtprotoTempTransferAccount from './types/com/atproto/temp/transferAccount'
import * as AppBskyActorDefs from './types/app/bsky/actor/defs'
import * as AppBskyActorGetPreferences from './types/app/bsky/actor/getPreferences'
import * as AppBskyActorGetProfile from './types/app/bsky/actor/getProfile'
@ -143,6 +147,7 @@ import * as AppBskyUnspeccedSearchActorsSkeleton from './types/app/bsky/unspecce
import * as AppBskyUnspeccedSearchPostsSkeleton from './types/app/bsky/unspecced/searchPostsSkeleton'
export * as ComAtprotoAdminDefs from './types/com/atproto/admin/defs'
export * as ComAtprotoAdminDeleteAccount from './types/com/atproto/admin/deleteAccount'
export * as ComAtprotoAdminDisableAccountInvites from './types/com/atproto/admin/disableAccountInvites'
export * as ComAtprotoAdminDisableInviteCodes from './types/com/atproto/admin/disableInviteCodes'
export * as ComAtprotoAdminEmitModerationEvent from './types/com/atproto/admin/emitModerationEvent'
@ -211,6 +216,9 @@ export * as ComAtprotoSyncNotifyOfUpdate from './types/com/atproto/sync/notifyOf
export * as ComAtprotoSyncRequestCrawl from './types/com/atproto/sync/requestCrawl'
export * as ComAtprotoSyncSubscribeRepos from './types/com/atproto/sync/subscribeRepos'
export * as ComAtprotoTempFetchLabels from './types/com/atproto/temp/fetchLabels'
export * as ComAtprotoTempImportRepo from './types/com/atproto/temp/importRepo'
export * as ComAtprotoTempPushBlob from './types/com/atproto/temp/pushBlob'
export * as ComAtprotoTempTransferAccount from './types/com/atproto/temp/transferAccount'
export * as AppBskyActorDefs from './types/app/bsky/actor/defs'
export * as AppBskyActorGetPreferences from './types/app/bsky/actor/getPreferences'
export * as AppBskyActorGetProfile from './types/app/bsky/actor/getProfile'
@ -366,6 +374,17 @@ export class AdminNS {
this._service = service
}
deleteAccount(
data?: ComAtprotoAdminDeleteAccount.InputSchema,
opts?: ComAtprotoAdminDeleteAccount.CallOptions,
): Promise<ComAtprotoAdminDeleteAccount.Response> {
return this._service.xrpc
.call('com.atproto.admin.deleteAccount', opts?.qp, data, opts)
.catch((e) => {
throw ComAtprotoAdminDeleteAccount.toKnownErr(e)
})
}
disableAccountInvites(
data?: ComAtprotoAdminDisableAccountInvites.InputSchema,
opts?: ComAtprotoAdminDisableAccountInvites.CallOptions,
@ -1108,6 +1127,39 @@ export class TempNS {
throw ComAtprotoTempFetchLabels.toKnownErr(e)
})
}
importRepo(
data?: ComAtprotoTempImportRepo.InputSchema,
opts?: ComAtprotoTempImportRepo.CallOptions,
): Promise<ComAtprotoTempImportRepo.Response> {
return this._service.xrpc
.call('com.atproto.temp.importRepo', opts?.qp, data, opts)
.catch((e) => {
throw ComAtprotoTempImportRepo.toKnownErr(e)
})
}
pushBlob(
data?: ComAtprotoTempPushBlob.InputSchema,
opts?: ComAtprotoTempPushBlob.CallOptions,
): Promise<ComAtprotoTempPushBlob.Response> {
return this._service.xrpc
.call('com.atproto.temp.pushBlob', opts?.qp, data, opts)
.catch((e) => {
throw ComAtprotoTempPushBlob.toKnownErr(e)
})
}
transferAccount(
data?: ComAtprotoTempTransferAccount.InputSchema,
opts?: ComAtprotoTempTransferAccount.CallOptions,
): Promise<ComAtprotoTempTransferAccount.Response> {
return this._service.xrpc
.call('com.atproto.temp.transferAccount', opts?.qp, data, opts)
.catch((e) => {
throw ComAtprotoTempTransferAccount.toKnownErr(e)
})
}
}
export class AppNS {

@ -820,6 +820,29 @@ export const schemaDict = {
},
},
},
ComAtprotoAdminDeleteAccount: {
lexicon: 1,
id: 'com.atproto.admin.deleteAccount',
defs: {
main: {
type: 'procedure',
description: 'Delete a user account as an administrator.',
input: {
encoding: 'application/json',
schema: {
type: 'object',
required: ['did'],
properties: {
did: {
type: 'string',
format: 'did',
},
},
},
},
},
},
},
ComAtprotoAdminDisableAccountInvites: {
lexicon: 1,
id: 'com.atproto.admin.disableAccountInvites',
@ -3979,6 +4002,135 @@ export const schemaDict = {
},
},
},
ComAtprotoTempImportRepo: {
lexicon: 1,
id: 'com.atproto.temp.importRepo',
defs: {
main: {
type: 'procedure',
description: 'Import a repo in the form of a CAR file.',
parameters: {
type: 'params',
required: ['did'],
properties: {
did: {
type: 'string',
format: 'did',
description: 'The DID of the repo.',
},
},
},
input: {
encoding: 'application/vnd.ipld.car',
},
output: {
encoding: 'text/plain',
},
},
},
},
ComAtprotoTempPushBlob: {
lexicon: 1,
id: 'com.atproto.temp.pushBlob',
defs: {
main: {
type: 'procedure',
description: 'Push a blob to the given repo.',
parameters: {
type: 'params',
required: ['did'],
properties: {
did: {
type: 'string',
format: 'did',
description: 'The DID of the repo.',
},
},
},
input: {
encoding: '*/*',
},
},
},
},
ComAtprotoTempTransferAccount: {
lexicon: 1,
id: 'com.atproto.temp.transferAccount',
defs: {
main: {
type: 'procedure',
description: 'Transfer an account.',
input: {
encoding: 'application/json',
schema: {
type: 'object',
required: ['handle', 'did', 'plcOp'],
properties: {
handle: {
type: 'string',
format: 'handle',
},
did: {
type: 'string',
format: 'did',
},
plcOp: {
type: 'unknown',
},
},
},
},
output: {
encoding: 'application/json',
schema: {
type: 'object',
required: ['accessJwt', 'refreshJwt', 'handle', 'did'],
properties: {
accessJwt: {
type: 'string',
},
refreshJwt: {
type: 'string',
},
handle: {
type: 'string',
format: 'handle',
},
did: {
type: 'string',
format: 'did',
},
},
},
},
errors: [
{
name: 'InvalidHandle',
},
{
name: 'InvalidPassword',
},
{
name: 'InvalidInviteCode',
},
{
name: 'HandleNotAvailable',
},
{
name: 'UnsupportedDomain',
},
{
name: 'UnresolvableDid',
},
{
name: 'IncompatibleDidDoc',
},
],
},
},
},
AppBskyActorDefs: {
lexicon: 1,
id: 'app.bsky.actor.defs',
@ -7671,6 +7823,7 @@ export const schemas: LexiconDoc[] = Object.values(schemaDict) as LexiconDoc[]
export const lexicons: Lexicons = new Lexicons(schemas)
export const ids = {
ComAtprotoAdminDefs: 'com.atproto.admin.defs',
ComAtprotoAdminDeleteAccount: 'com.atproto.admin.deleteAccount',
ComAtprotoAdminDisableAccountInvites:
'com.atproto.admin.disableAccountInvites',
ComAtprotoAdminDisableInviteCodes: 'com.atproto.admin.disableInviteCodes',
@ -7746,6 +7899,9 @@ export const ids = {
ComAtprotoSyncRequestCrawl: 'com.atproto.sync.requestCrawl',
ComAtprotoSyncSubscribeRepos: 'com.atproto.sync.subscribeRepos',
ComAtprotoTempFetchLabels: 'com.atproto.temp.fetchLabels',
ComAtprotoTempImportRepo: 'com.atproto.temp.importRepo',
ComAtprotoTempPushBlob: 'com.atproto.temp.pushBlob',
ComAtprotoTempTransferAccount: 'com.atproto.temp.transferAccount',
AppBskyActorDefs: 'app.bsky.actor.defs',
AppBskyActorGetPreferences: 'app.bsky.actor.getPreferences',
AppBskyActorGetProfile: 'app.bsky.actor.getProfile',

@ -0,0 +1,32 @@
/**
* GENERATED CODE - DO NOT MODIFY
*/
import { Headers, XRPCError } from '@atproto/xrpc'
import { ValidationResult, BlobRef } from '@atproto/lexicon'
import { isObj, hasProp } from '../../../../util'
import { lexicons } from '../../../../lexicons'
import { CID } from 'multiformats/cid'
export interface QueryParams {}
export interface InputSchema {
did: string
[k: string]: unknown
}
export interface CallOptions {
headers?: Headers
qp?: QueryParams
encoding: 'application/json'
}
export interface Response {
success: boolean
headers: Headers
}
export function toKnownErr(e: any) {
if (e instanceof XRPCError) {
}
return e
}

@ -0,0 +1,33 @@
/**
* GENERATED CODE - DO NOT MODIFY
*/
import { Headers, XRPCError } from '@atproto/xrpc'
import { ValidationResult, BlobRef } from '@atproto/lexicon'
import { isObj, hasProp } from '../../../../util'
import { lexicons } from '../../../../lexicons'
import { CID } from 'multiformats/cid'
export interface QueryParams {
/** The DID of the repo. */
did: string
}
export type InputSchema = string | Uint8Array
export interface CallOptions {
headers?: Headers
qp?: QueryParams
encoding: 'application/vnd.ipld.car'
}
export interface Response {
success: boolean
headers: Headers
data: Uint8Array
}
export function toKnownErr(e: any) {
if (e instanceof XRPCError) {
}
return e
}

@ -0,0 +1,32 @@
/**
* GENERATED CODE - DO NOT MODIFY
*/
import { Headers, XRPCError } from '@atproto/xrpc'
import { ValidationResult, BlobRef } from '@atproto/lexicon'
import { isObj, hasProp } from '../../../../util'
import { lexicons } from '../../../../lexicons'
import { CID } from 'multiformats/cid'
export interface QueryParams {
/** The DID of the repo. */
did: string
}
export type InputSchema = string | Uint8Array
export interface CallOptions {
headers?: Headers
qp?: QueryParams
encoding: string
}
export interface Response {
success: boolean
headers: Headers
}
export function toKnownErr(e: any) {
if (e instanceof XRPCError) {
}
return e
}

@ -0,0 +1,92 @@
/**
* GENERATED CODE - DO NOT MODIFY
*/
import { Headers, XRPCError } from '@atproto/xrpc'
import { ValidationResult, BlobRef } from '@atproto/lexicon'
import { isObj, hasProp } from '../../../../util'
import { lexicons } from '../../../../lexicons'
import { CID } from 'multiformats/cid'
export interface QueryParams {}
export interface InputSchema {
handle: string
did: string
plcOp: {}
[k: string]: unknown
}
export interface OutputSchema {
accessJwt: string
refreshJwt: string
handle: string
did: string
[k: string]: unknown
}
export interface CallOptions {
headers?: Headers
qp?: QueryParams
encoding: 'application/json'
}
export interface Response {
success: boolean
headers: Headers
data: OutputSchema
}
export class InvalidHandleError extends XRPCError {
constructor(src: XRPCError) {
super(src.status, src.error, src.message, src.headers)
}
}
export class InvalidPasswordError extends XRPCError {
constructor(src: XRPCError) {
super(src.status, src.error, src.message, src.headers)
}
}
export class InvalidInviteCodeError extends XRPCError {
constructor(src: XRPCError) {
super(src.status, src.error, src.message, src.headers)
}
}
export class HandleNotAvailableError extends XRPCError {
constructor(src: XRPCError) {
super(src.status, src.error, src.message, src.headers)
}
}
export class UnsupportedDomainError extends XRPCError {
constructor(src: XRPCError) {
super(src.status, src.error, src.message, src.headers)
}
}
export class UnresolvableDidError extends XRPCError {
constructor(src: XRPCError) {
super(src.status, src.error, src.message, src.headers)
}
}
export class IncompatibleDidDocError extends XRPCError {
constructor(src: XRPCError) {
super(src.status, src.error, src.message, src.headers)
}
}
export function toKnownErr(e: any) {
if (e instanceof XRPCError) {
if (e.error === 'InvalidHandle') return new InvalidHandleError(e)
if (e.error === 'InvalidPassword') return new InvalidPasswordError(e)
if (e.error === 'InvalidInviteCode') return new InvalidInviteCodeError(e)
if (e.error === 'HandleNotAvailable') return new HandleNotAvailableError(e)
if (e.error === 'UnsupportedDomain') return new UnsupportedDomainError(e)
if (e.error === 'UnresolvableDid') return new UnresolvableDidError(e)
if (e.error === 'IncompatibleDidDoc') return new IncompatibleDidDocError(e)
}
return e
}
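The generated `toKnownErr` maps a wire-level error name onto a typed subclass so callers can use `instanceof` checks. The same pattern in isolation, with a local stand-in for `@atproto/xrpc`'s `XRPCError`:

```typescript
// Standalone sketch of the generated toKnownErr pattern. FakeXRPCError is a
// local stand-in for @atproto/xrpc's XRPCError, not the real class.
class FakeXRPCError extends Error {
  constructor(public errorName: string, message?: string) {
    super(message ?? errorName)
  }
}

class InvalidHandleError extends FakeXRPCError {}

function mapKnownErr(e: unknown): unknown {
  if (e instanceof FakeXRPCError && e.errorName === 'InvalidHandle') {
    return new InvalidHandleError(e.errorName, e.message)
  }
  return e // unrecognized errors pass through unchanged
}
```

Returning (rather than throwing) the mapped error lets the generated client rethrow it in a single `.catch`, as in the `TempNS.transferAccount` method above.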

@ -17,7 +17,7 @@ export class S3BlobStore implements BlobStore {
private client: aws.S3
private bucket: string
constructor(cfg: S3Config) {
constructor(public did: string, cfg: S3Config) {
const { bucket, ...rest } = cfg
this.bucket = bucket
this.client = new aws.S3({
@ -26,20 +26,26 @@ export class S3BlobStore implements BlobStore {
})
}
static creator(cfg: S3Config) {
return (did: string) => {
return new S3BlobStore(did, cfg)
}
}
private genKey() {
return randomStr(32, 'base32')
}
private getTmpPath(key: string): string {
return `tmp/${key}`
return `tmp/${this.did}/${key}`
}
private getStoredPath(cid: CID): string {
return `blocks/${cid.toString()}`
return `blocks/${this.did}/${cid.toString()}`
}
private getQuarantinedPath(cid: CID): string {
return `quarantine/${cid.toString()}`
return `quarantine/${this.did}/${cid.toString()}`
}
async putTemp(bytes: Uint8Array | stream.Readable): Promise<string> {
@ -122,11 +128,24 @@ export class S3BlobStore implements BlobStore {
await this.deleteKey(this.getStoredPath(cid))
}
async deleteMany(cids: CID[]): Promise<void> {
const keys = cids.map((cid) => this.getStoredPath(cid))
await this.deleteManyKeys(keys)
}
async hasStored(cid: CID): Promise<boolean> {
return this.hasKey(this.getStoredPath(cid))
}
async hasTemp(key: string): Promise<boolean> {
return this.hasKey(this.getTmpPath(key))
}
private async hasKey(key: string) {
try {
const res = await this.client.headObject({
Bucket: this.bucket,
Key: this.getStoredPath(cid),
Key: key,
})
return res.$metadata.httpStatusCode === 200
} catch (err) {
@ -141,16 +160,37 @@ export class S3BlobStore implements BlobStore {
})
}
private async deleteManyKeys(keys: string[]) {
await this.client.deleteObjects({
Bucket: this.bucket,
Delete: {
Objects: keys.map((k) => ({ Key: k })),
},
})
}
private async move(keys: { from: string; to: string }) {
await this.client.copyObject({
Bucket: this.bucket,
CopySource: `${this.bucket}/${keys.from}`,
Key: keys.to,
})
await this.client.deleteObject({
Bucket: this.bucket,
Key: keys.from,
})
try {
await this.client.copyObject({
Bucket: this.bucket,
CopySource: `${this.bucket}/${keys.from}`,
Key: keys.to,
})
await this.client.deleteObject({
Bucket: this.bucket,
Key: keys.from,
})
} catch (err) {
handleErr(err)
}
}
}
const handleErr = (err: unknown) => {
if (err?.['Code'] === 'NoSuchKey') {
throw new BlobNotFoundError()
} else {
throw err
}
}
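The key-layout changes above scope every blob path by the owning DID. As pure string helpers (the real versions are private methods on `S3BlobStore`):

```typescript
// DID-scoped S3 key layout from the diff above, as standalone helpers.
const tmpPath = (did: string, key: string) => `tmp/${did}/${key}`
const storedPath = (did: string, cid: string) => `blocks/${did}/${cid}`
const quarantinedPath = (did: string, cid: string) => `quarantine/${did}/${cid}`
```

Prefixing keys by DID means all of one actor's blobs share a common key prefix, so per-actor cleanup (e.g. on account deletion) becomes a prefix listing rather than a full-bucket scan.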

@ -9,6 +9,7 @@ import {
StreamAuthVerifier,
} from '@atproto/xrpc-server'
import { schemas } from './lexicons'
import * as ComAtprotoAdminDeleteAccount from './types/com/atproto/admin/deleteAccount'
import * as ComAtprotoAdminDisableAccountInvites from './types/com/atproto/admin/disableAccountInvites'
import * as ComAtprotoAdminDisableInviteCodes from './types/com/atproto/admin/disableInviteCodes'
import * as ComAtprotoAdminEmitModerationEvent from './types/com/atproto/admin/emitModerationEvent'
@ -73,6 +74,9 @@ import * as ComAtprotoSyncNotifyOfUpdate from './types/com/atproto/sync/notifyOf
import * as ComAtprotoSyncRequestCrawl from './types/com/atproto/sync/requestCrawl'
import * as ComAtprotoSyncSubscribeRepos from './types/com/atproto/sync/subscribeRepos'
import * as ComAtprotoTempFetchLabels from './types/com/atproto/temp/fetchLabels'
import * as ComAtprotoTempImportRepo from './types/com/atproto/temp/importRepo'
import * as ComAtprotoTempPushBlob from './types/com/atproto/temp/pushBlob'
import * as ComAtprotoTempTransferAccount from './types/com/atproto/temp/transferAccount'
import * as AppBskyActorGetPreferences from './types/app/bsky/actor/getPreferences'
import * as AppBskyActorGetProfile from './types/app/bsky/actor/getProfile'
import * as AppBskyActorGetProfiles from './types/app/bsky/actor/getProfiles'
@ -194,6 +198,17 @@ export class AdminNS {
this._server = server
}
deleteAccount<AV extends AuthVerifier>(
cfg: ConfigOf<
AV,
ComAtprotoAdminDeleteAccount.Handler<ExtractAuth<AV>>,
ComAtprotoAdminDeleteAccount.HandlerReqCtx<ExtractAuth<AV>>
>,
) {
const nsid = 'com.atproto.admin.deleteAccount' // @ts-ignore
return this._server.xrpc.method(nsid, cfg)
}
disableAccountInvites<AV extends AuthVerifier>(
cfg: ConfigOf<
AV,
@ -953,6 +968,39 @@ export class TempNS {
const nsid = 'com.atproto.temp.fetchLabels' // @ts-ignore
return this._server.xrpc.method(nsid, cfg)
}
importRepo<AV extends AuthVerifier>(
cfg: ConfigOf<
AV,
ComAtprotoTempImportRepo.Handler<ExtractAuth<AV>>,
ComAtprotoTempImportRepo.HandlerReqCtx<ExtractAuth<AV>>
>,
) {
const nsid = 'com.atproto.temp.importRepo' // @ts-ignore
return this._server.xrpc.method(nsid, cfg)
}
pushBlob<AV extends AuthVerifier>(
cfg: ConfigOf<
AV,
ComAtprotoTempPushBlob.Handler<ExtractAuth<AV>>,
ComAtprotoTempPushBlob.HandlerReqCtx<ExtractAuth<AV>>
>,
) {
const nsid = 'com.atproto.temp.pushBlob' // @ts-ignore
return this._server.xrpc.method(nsid, cfg)
}
transferAccount<AV extends AuthVerifier>(
cfg: ConfigOf<
AV,
ComAtprotoTempTransferAccount.Handler<ExtractAuth<AV>>,
ComAtprotoTempTransferAccount.HandlerReqCtx<ExtractAuth<AV>>
>,
) {
const nsid = 'com.atproto.temp.transferAccount' // @ts-ignore
return this._server.xrpc.method(nsid, cfg)
}
}
export class AppNS {
@ -1549,11 +1597,13 @@ type RouteRateLimitOpts<T> = {
calcKey?: (ctx: T) => string
calcPoints?: (ctx: T) => number
}
type HandlerOpts = { blobLimit?: number }
type HandlerRateLimitOpts<T> = SharedRateLimitOpts<T> | RouteRateLimitOpts<T>
type ConfigOf<Auth, Handler, ReqCtx> =
| Handler
| {
auth?: Auth
opts?: HandlerOpts
rateLimit?: HandlerRateLimitOpts<ReqCtx> | HandlerRateLimitOpts<ReqCtx>[]
handler: Handler
}
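The new `HandlerOpts` lets a route carry a per-route `blobLimit` alongside its handler. A minimal sketch of a config object in that shape (simplified `RouteConfig` mirroring the `ConfigOf` union above; the handler is a stand-in, and real routes are registered via `server.xrpc.method`):

```typescript
// Sketch of a route config using the new per-route HandlerOpts.blobLimit.
type HandlerOpts = { blobLimit?: number }
type RouteConfig<Handler> = Handler | { opts?: HandlerOpts; handler: Handler }

const handler = (body: Uint8Array) => body.byteLength

const cfg: RouteConfig<typeof handler> = {
  opts: { blobLimit: 5 * 1024 * 1024 }, // cap request bodies at 5 MiB
  handler,
}
```

The union keeps the existing bare-handler registration style working while letting individual routes opt into a larger or smaller body limit.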

@ -820,6 +820,29 @@ export const schemaDict = {
},
},
},
ComAtprotoAdminDeleteAccount: {
lexicon: 1,
id: 'com.atproto.admin.deleteAccount',
defs: {
main: {
type: 'procedure',
description: 'Delete a user account as an administrator.',
input: {
encoding: 'application/json',
schema: {
type: 'object',
required: ['did'],
properties: {
did: {
type: 'string',
format: 'did',
},
},
},
},
},
},
},
ComAtprotoAdminDisableAccountInvites: {
lexicon: 1,
id: 'com.atproto.admin.disableAccountInvites',
@ -3979,6 +4002,135 @@ export const schemaDict = {
},
},
},
ComAtprotoTempImportRepo: {
lexicon: 1,
id: 'com.atproto.temp.importRepo',
defs: {
main: {
type: 'procedure',
description: 'Import a repo in the form of a CAR file.',
parameters: {
type: 'params',
required: ['did'],
properties: {
did: {
type: 'string',
format: 'did',
description: 'The DID of the repo.',
},
},
},
input: {
encoding: 'application/vnd.ipld.car',
},
output: {
encoding: 'text/plain',
},
},
},
},
ComAtprotoTempPushBlob: {
lexicon: 1,
id: 'com.atproto.temp.pushBlob',
defs: {
main: {
type: 'procedure',
description: 'Push a blob to the given repo.',
parameters: {
type: 'params',
required: ['did'],
properties: {
did: {
type: 'string',
format: 'did',
description: 'The DID of the repo.',
},
},
},
input: {
encoding: '*/*',
},
},
},
},
ComAtprotoTempTransferAccount: {
lexicon: 1,
id: 'com.atproto.temp.transferAccount',
defs: {
main: {
type: 'procedure',
description: 'Transfer an account.',
input: {
encoding: 'application/json',
schema: {
type: 'object',
required: ['handle', 'did', 'plcOp'],
properties: {
handle: {
type: 'string',
format: 'handle',
},
did: {
type: 'string',
format: 'did',
},
plcOp: {
type: 'unknown',
},
},
},
},
output: {
encoding: 'application/json',
schema: {
type: 'object',
required: ['accessJwt', 'refreshJwt', 'handle', 'did'],
properties: {
accessJwt: {
type: 'string',
},
refreshJwt: {
type: 'string',
},
handle: {
type: 'string',
format: 'handle',
},
did: {
type: 'string',
format: 'did',
},
},
},
},
errors: [
{
name: 'InvalidHandle',
},
{
name: 'InvalidPassword',
},
{
name: 'InvalidInviteCode',
},
{
name: 'HandleNotAvailable',
},
{
name: 'UnsupportedDomain',
},
{
name: 'UnresolvableDid',
},
{
name: 'IncompatibleDidDoc',
},
],
},
},
},
AppBskyActorDefs: {
lexicon: 1,
id: 'app.bsky.actor.defs',
@ -7671,6 +7823,7 @@ export const schemas: LexiconDoc[] = Object.values(schemaDict) as LexiconDoc[]
export const lexicons: Lexicons = new Lexicons(schemas)
export const ids = {
ComAtprotoAdminDefs: 'com.atproto.admin.defs',
ComAtprotoAdminDeleteAccount: 'com.atproto.admin.deleteAccount',
ComAtprotoAdminDisableAccountInvites:
'com.atproto.admin.disableAccountInvites',
ComAtprotoAdminDisableInviteCodes: 'com.atproto.admin.disableInviteCodes',
@ -7746,6 +7899,9 @@ export const ids = {
ComAtprotoSyncRequestCrawl: 'com.atproto.sync.requestCrawl',
ComAtprotoSyncSubscribeRepos: 'com.atproto.sync.subscribeRepos',
ComAtprotoTempFetchLabels: 'com.atproto.temp.fetchLabels',
ComAtprotoTempImportRepo: 'com.atproto.temp.importRepo',
ComAtprotoTempPushBlob: 'com.atproto.temp.pushBlob',
ComAtprotoTempTransferAccount: 'com.atproto.temp.transferAccount',
AppBskyActorDefs: 'app.bsky.actor.defs',
AppBskyActorGetPreferences: 'app.bsky.actor.getPreferences',
AppBskyActorGetProfile: 'app.bsky.actor.getProfile',

@ -0,0 +1,38 @@
/**
* GENERATED CODE - DO NOT MODIFY
*/
import express from 'express'
import { ValidationResult, BlobRef } from '@atproto/lexicon'
import { lexicons } from '../../../../lexicons'
import { isObj, hasProp } from '../../../../util'
import { CID } from 'multiformats/cid'
import { HandlerAuth } from '@atproto/xrpc-server'
export interface QueryParams {}
export interface InputSchema {
did: string
[k: string]: unknown
}
export interface HandlerInput {
encoding: 'application/json'
body: InputSchema
}
export interface HandlerError {
status: number
message?: string
}
export type HandlerOutput = HandlerError | void
export type HandlerReqCtx<HA extends HandlerAuth = never> = {
auth: HA
params: QueryParams
input: HandlerInput
req: express.Request
res: express.Response
}
export type Handler<HA extends HandlerAuth = never> = (
ctx: HandlerReqCtx<HA>,
) => Promise<HandlerOutput> | HandlerOutput

@ -0,0 +1,45 @@
/**
* GENERATED CODE - DO NOT MODIFY
*/
import express from 'express'
import stream from 'stream'
import { ValidationResult, BlobRef } from '@atproto/lexicon'
import { lexicons } from '../../../../lexicons'
import { isObj, hasProp } from '../../../../util'
import { CID } from 'multiformats/cid'
import { HandlerAuth } from '@atproto/xrpc-server'
export interface QueryParams {
/** The DID of the repo. */
did: string
}
export type InputSchema = string | Uint8Array
export interface HandlerInput {
encoding: 'application/vnd.ipld.car'
body: stream.Readable
}
export interface HandlerSuccess {
encoding: 'text/plain'
body: Uint8Array | stream.Readable
headers?: { [key: string]: string }
}
export interface HandlerError {
status: number
message?: string
}
export type HandlerOutput = HandlerError | HandlerSuccess
export type HandlerReqCtx<HA extends HandlerAuth = never> = {
auth: HA
params: QueryParams
input: HandlerInput
req: express.Request
res: express.Response
}
export type Handler<HA extends HandlerAuth = never> = (
ctx: HandlerReqCtx<HA>,
) => Promise<HandlerOutput> | HandlerOutput

@ -0,0 +1,39 @@
/**
* GENERATED CODE - DO NOT MODIFY
*/
import express from 'express'
import stream from 'stream'
import { ValidationResult, BlobRef } from '@atproto/lexicon'
import { lexicons } from '../../../../lexicons'
import { isObj, hasProp } from '../../../../util'
import { CID } from 'multiformats/cid'
import { HandlerAuth } from '@atproto/xrpc-server'
export interface QueryParams {
/** The DID of the repo. */
did: string
}
export type InputSchema = string | Uint8Array
export interface HandlerInput {
encoding: '*/*'
body: stream.Readable
}
export interface HandlerError {
status: number
message?: string
}
export type HandlerOutput = HandlerError | void
export type HandlerReqCtx<HA extends HandlerAuth = never> = {
auth: HA
params: QueryParams
input: HandlerInput
req: express.Request
res: express.Response
}
export type Handler<HA extends HandlerAuth = never> = (
ctx: HandlerReqCtx<HA>,
) => Promise<HandlerOutput> | HandlerOutput

@ -0,0 +1,62 @@
/**
* GENERATED CODE - DO NOT MODIFY
*/
import express from 'express'
import { ValidationResult, BlobRef } from '@atproto/lexicon'
import { lexicons } from '../../../../lexicons'
import { isObj, hasProp } from '../../../../util'
import { CID } from 'multiformats/cid'
import { HandlerAuth } from '@atproto/xrpc-server'
export interface QueryParams {}
export interface InputSchema {
handle: string
did: string
plcOp: {}
[k: string]: unknown
}
export interface OutputSchema {
accessJwt: string
refreshJwt: string
handle: string
did: string
[k: string]: unknown
}
export interface HandlerInput {
encoding: 'application/json'
body: InputSchema
}
export interface HandlerSuccess {
encoding: 'application/json'
body: OutputSchema
headers?: { [key: string]: string }
}
export interface HandlerError {
status: number
message?: string
error?:
| 'InvalidHandle'
| 'InvalidPassword'
| 'InvalidInviteCode'
| 'HandleNotAvailable'
| 'UnsupportedDomain'
| 'UnresolvableDid'
| 'IncompatibleDidDoc'
}
export type HandlerOutput = HandlerError | HandlerSuccess
export type HandlerReqCtx<HA extends HandlerAuth = never> = {
auth: HA
params: QueryParams
input: HandlerInput
req: express.Request
res: express.Response
}
export type Handler<HA extends HandlerAuth = never> = (
ctx: HandlerReqCtx<HA>,
) => Promise<HandlerOutput> | HandlerOutput

@ -7,6 +7,7 @@ import {
verifyRepo,
Commit,
VerifiedRepo,
getAndParseRecord,
} from '@atproto/repo'
import { AtUri } from '@atproto/syntax'
import { IdResolver, getPds } from '@atproto/identity'
@ -201,10 +202,11 @@ export class IndexingService {
if (op.op === 'delete') {
await this.deleteRecord(uri)
} else {
const parsed = await getAndParseRecord(blocks, cid)
await this.indexRecord(
uri,
cid,
op.value,
parsed.record,
op.op === 'create' ? WriteOpAction.Create : WriteOpAction.Update,
now,
)
@ -389,19 +391,15 @@ type UriAndCid = {
cid: CID
}
type RecordDescript = UriAndCid & {
value: unknown
}
type IndexOp =
| ({
op: 'create' | 'update'
} & RecordDescript)
} & UriAndCid)
| ({ op: 'delete' } & UriAndCid)
const findDiffFromCheckout = (
curr: Record<string, UriAndCid>,
checkout: Record<string, RecordDescript>,
checkout: Record<string, UriAndCid>,
): IndexOp[] => {
const ops: IndexOp[] = []
for (const uri of Object.keys(checkout)) {
@ -428,14 +426,13 @@ const findDiffFromCheckout = (
const formatCheckout = (
did: string,
verifiedRepo: VerifiedRepo,
): Record<string, RecordDescript> => {
const records: Record<string, RecordDescript> = {}
): Record<string, UriAndCid> => {
const records: Record<string, UriAndCid> = {}
for (const create of verifiedRepo.creates) {
const uri = AtUri.make(did, create.collection, create.rkey)
records[uri.toString()] = {
uri,
cid: create.cid,
value: create.record,
}
}
return records

@ -1,26 +1,6 @@
import { AxiosError } from 'axios'
import { wait } from '@atproto/common'
import { XRPCError, ResponseType } from '@atproto/xrpc'
export async function retry<T>(
fn: () => Promise<T>,
opts: RetryOptions = {},
): Promise<T> {
const { max = 3, retryable = () => true } = opts
let retries = 0
let doneError: unknown
while (!doneError) {
try {
if (retries) await backoff(retries)
return await fn()
} catch (err) {
const willRetry = retries < max && retryable(err)
if (!willRetry) doneError = err
retries += 1
}
}
throw doneError
}
import { RetryOptions, retry } from '@atproto/common'
export async function retryHttp<T>(
fn: () => Promise<T>,
@ -44,26 +24,3 @@ export function retryableHttp(err: unknown) {
const retryableHttpStatusCodes = new Set([
408, 425, 429, 500, 502, 503, 504, 522, 524,
])
type RetryOptions = {
max?: number
retryable?: (err: unknown) => boolean
}
// Waits exponential backoff with max and jitter: ~50, ~100, ~200, ~400, ~800, ~1000, ~1000, ...
async function backoff(n: number, multiplier = 50, max = 1000) {
const exponentialMs = Math.pow(2, n) * multiplier
const ms = Math.min(exponentialMs, max)
await wait(jitter(ms))
}
// Adds randomness +/-15% of value
function jitter(value: number) {
const delta = value * 0.15
return value + randomRange(-delta, delta)
}
function randomRange(from: number, to: number) {
const rand = Math.random() * (to - from)
return rand + from
}
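The retryHttp wrapper now delegates the retry loop and backoff to the shared `@atproto/common` helper, keeping only the HTTP-specific retryable predicate. A self-contained sketch of how such a predicate works (the error shape and the early-return for status-less errors are illustrative assumptions; the real predicate inspects `XRPCError`/`AxiosError` instances):

```typescript
// Illustrative error shape; the real code inspects XRPCError and AxiosError.
type HttpishError = { status?: number }

// Mirrors the retryable status-code set above.
const retryableHttpStatusCodes = new Set([
  408, 425, 429, 500, 502, 503, 504, 522, 524,
])

function retryableHttp(err: HttpishError): boolean {
  // No status at all usually means a network-level failure, worth retrying.
  if (err.status === undefined) return true
  return retryableHttpStatusCodes.has(err.status)
}

console.log(retryableHttp({ status: 503 })) // true: transient server error
console.log(retryableHttp({ status: 404 })) // false: retrying won't help
```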

@ -97,9 +97,11 @@ describe('admin get repo view', () => {
expect(beforeEmailVerification.emailConfirmedAt).toBeUndefined()
const timestampBeforeVerification = Date.now()
const bobsAccount = sc.accounts[sc.dids.bob]
const verificationToken = await network.pds.ctx.services
.account(network.pds.ctx.db)
.createEmailToken(sc.dids.bob, 'confirm_email')
const verificationToken =
await network.pds.ctx.accountManager.createEmailToken(
sc.dids.bob,
'confirm_email',
)
await agent.api.com.atproto.server.confirmEmail(
{ email: bobsAccount.email, token: verificationToken },
{

@ -31,7 +31,6 @@ describe('algo hot-classic', () => {
alice = sc.dids.alice
bob = sc.dids.bob
await network.processAll()
await network.bsky.processAll()
})
afterAll(async () => {
@ -59,7 +58,7 @@ describe('algo hot-classic', () => {
await sc.like(sc.dids[name], two.ref)
await sc.like(sc.dids[name], three.ref)
}
await network.bsky.processAll()
await network.processAll()
const res = await agent.api.app.bsky.feed.getFeed(
{ feed: feedUri },

@ -36,7 +36,7 @@ describe('auth', () => {
{ headers: { authorization: `Bearer ${jwt}` } },
)
}
const origSigningKey = network.pds.ctx.repoSigningKey
const origSigningKey = await network.pds.ctx.actorStore.keypair(issuer)
const newSigningKey = await Secp256k1Keypair.create({ exportable: true })
// confirm original signing key works
await expect(attemptWithKey(origSigningKey)).resolves.toBeDefined()

@ -40,26 +40,25 @@ describe('labeler', () => {
await usersSeed(sc)
await network.processAll()
alice = sc.dids.alice
const repoSvc = pdsCtx.services.repo(pdsCtx.db)
const storeBlob = async (bytes: Uint8Array) => {
const blobRef = await repoSvc.blobs.addUntetheredBlob(
alice,
'image/jpeg',
Readable.from([bytes], { objectMode: false }),
)
const preparedBlobRef = {
cid: blobRef.ref,
mimeType: 'image/jpeg',
constraints: {},
}
await repoSvc.blobs.verifyBlobAndMakePermanent(alice, preparedBlobRef)
await repoSvc.blobs.associateBlob(
preparedBlobRef,
postUri(),
TID.nextStr(),
alice,
)
return blobRef
const storeBlob = (bytes: Uint8Array) => {
return pdsCtx.actorStore.transact(alice, async (store) => {
const blobRef = await store.repo.blob.addUntetheredBlob(
'image/jpeg',
Readable.from([bytes], { objectMode: false }),
)
const preparedBlobRef = {
cid: blobRef.ref,
mimeType: 'image/jpeg',
constraints: {},
}
await store.repo.blob.verifyBlobAndMakePermanent(preparedBlobRef)
await store.repo.blob.associateBlob(
preparedBlobRef,
postUri(),
TID.nextStr(),
)
return blobRef
})
}
const bytes1 = new Uint8Array([1, 2, 3, 4])
const bytes2 = new Uint8Array([5, 6, 7, 8])

@ -106,11 +106,15 @@ describe('takedowner', () => {
.executeTakeFirst()
expect(record?.takedownId).toBeGreaterThan(0)
const recordPds = await network.pds.ctx.db.db
.selectFrom('record')
.where('uri', '=', post.ref.uriStr)
.select('takedownRef')
.executeTakeFirst()
const recordPds = await network.pds.ctx.actorStore.read(
post.ref.uri.hostname,
(store) =>
store.db.db
.selectFrom('record')
.where('uri', '=', post.ref.uriStr)
.select('takedownRef')
.executeTakeFirst(),
)
expect(recordPds?.takedownRef).toEqual(takedownEvent.id.toString())
expect(testInvalidator.invalidated.length).toBe(1)
@ -162,11 +166,13 @@ describe('takedowner', () => {
.executeTakeFirst()
expect(record?.takedownId).toBeGreaterThan(0)
const recordPds = await network.pds.ctx.db.db
.selectFrom('record')
.where('uri', '=', res.data.uri)
.select('takedownRef')
.executeTakeFirst()
const recordPds = await network.pds.ctx.actorStore.read(alice, (store) =>
store.db.db
.selectFrom('record')
.where('uri', '=', res.data.uri)
.select('takedownRef')
.executeTakeFirst(),
)
expect(recordPds?.takedownRef).toEqual(takedownEvent.id.toString())
expect(testInvalidator.invalidated.length).toBe(2)

@ -77,8 +77,10 @@ describe('blob resolver', () => {
})
it('fails on blob with bad signature check.', async () => {
await network.pds.ctx.blobstore.delete(fileCid)
await network.pds.ctx.blobstore.putPermanent(fileCid, randomBytes(100))
await network.pds.ctx.blobstore(fileDid).delete(fileCid)
await network.pds.ctx
.blobstore(fileDid)
.putPermanent(fileCid, randomBytes(100))
const tryGetBlob = client.get(`/blob/${fileDid}/${fileCid.toString()}`)
await expect(tryGetBlob).rejects.toThrow(
'maxContentLength size of -1 exceeded',

@ -102,11 +102,7 @@ describe('handle invalidation', () => {
it('deals with handle contention', async () => {
await backdateIndexedAt(bob)
// update alice's handle so that the pds will let bob take her old handle
await network.pds.ctx.db.db
.updateTable('did_handle')
.where('did', '=', alice)
.set({ handle: 'not-alice.test' })
.execute()
await network.pds.ctx.accountManager.updateHandle(alice, 'not-alice.test')
await pdsAgent.api.com.atproto.identity.updateHandle(
{

@ -498,7 +498,8 @@ describe('indexing', () => {
it('skips invalid records.', async () => {
const { db, services } = network.bsky.indexer.ctx
const { db: pdsDb, services: pdsServices } = network.pds.ctx
const { accountManager } = network.pds.ctx
// Create a good and a bad post record
const writes = await Promise.all([
pdsRepo.prepareCreate({
@ -513,9 +514,20 @@ describe('indexing', () => {
validate: false,
}),
])
await pdsServices
.repo(pdsDb)
.processWrites({ did: sc.dids.alice, writes }, 1)
const writeCommit = await network.pds.ctx.actorStore.transact(
sc.dids.alice,
(store) => store.repo.processWrites(writes),
)
await accountManager.updateRepoRoot(
sc.dids.alice,
writeCommit.cid,
writeCommit.rev,
)
await network.pds.ctx.sequencer.sequenceCommit(
sc.dids.alice,
writeCommit,
writes,
)
// Index
const { data: commit } =
await pdsAgent.api.com.atproto.sync.getLatestCommit({
@ -643,15 +655,10 @@ describe('indexing', () => {
)
await expect(getProfileBefore).resolves.toBeDefined()
// Delete account on pds
await pdsAgent.api.com.atproto.server.requestAccountDelete(undefined, {
headers: sc.getHeaders(alice),
})
const { token } = await network.pds.ctx.db.db
.selectFrom('email_token')
.selectAll()
.where('purpose', '=', 'delete_account')
.where('did', '=', alice)
.executeTakeFirstOrThrow()
const token = await network.pds.ctx.accountManager.createEmailToken(
alice,
'delete_account',
)
await pdsAgent.api.com.atproto.server.deleteAccount({
token,
did: alice,

@ -103,6 +103,8 @@ export default async (sc: SeedClient, users = true) => {
'tests/sample-img/key-landscape-small.jpg',
'image/jpeg',
)
// must ensure ordering of replies in indexing
await sc.network.processAll()
await sc.reply(
bob,
sc.posts[alice][1].ref,
@ -117,6 +119,7 @@ export default async (sc: SeedClient, users = true) => {
sc.posts[alice][1].ref,
replies.carol[0],
)
await sc.network.processAll()
const alicesReplyToBob = await sc.reply(
alice,
sc.posts[alice][1].ref,

@ -1,7 +1,6 @@
import AtpAgent from '@atproto/api'
import { TestNetwork, SeedClient } from '@atproto/dev-env'
import { CommitData } from '@atproto/repo'
import { RepoService } from '@atproto/pds/src/services/repo'
import { PreparedWrite } from '@atproto/pds/src/repo'
import * as sequencer from '@atproto/pds/src/sequencer'
import { cborDecode, cborEncode } from '@atproto/common'
@ -84,9 +83,8 @@ describe('sync', () => {
it('indexes actor when commit is unprocessable.', async () => {
// mock sequencing to create an unprocessable commit event
const afterWriteProcessingOriginal =
RepoService.prototype.afterWriteProcessing
RepoService.prototype.afterWriteProcessing = async function (
const sequenceCommitOrig = network.pds.ctx.sequencer.sequenceCommit
network.pds.ctx.sequencer.sequenceCommit = async function (
did: string,
commitData: CommitData,
writes: PreparedWrite[],
@ -95,7 +93,7 @@ describe('sync', () => {
const evt = cborDecode(seqEvt.event) as sequencer.CommitEvt
evt.blocks = new Uint8Array() // bad blocks
seqEvt.event = cborEncode(evt)
await sequencer.sequenceEvt(this.db, seqEvt)
await network.pds.ctx.sequencer.sequenceEvt(seqEvt)
}
// create account and index the initial commit event
await sc.createAccount('jack', {
@ -103,12 +101,11 @@ describe('sync', () => {
email: 'jack@test.com',
password: 'password',
})
await network.pds.ctx.sequencerLeader?.isCaughtUp()
await network.processAll()
// confirm jack was indexed as an actor despite the bad event
const actors = await dumpTable(ctx.db.getPrimary(), 'actor', ['did'])
expect(actors.map((a) => a.handle)).toContain('jack.test')
RepoService.prototype.afterWriteProcessing = afterWriteProcessingOriginal
network.pds.ctx.sequencer.sequenceCommit = sequenceCommitOrig
})
async function updateProfile(

@ -49,6 +49,7 @@ describe('views with thread gating', () => {
{ post: post.ref.uriStr, createdAt: iso(), allow: [] },
sc.getHeaders(sc.dids.carol),
)
await network.processAll()
await sc.reply(sc.dids.alice, post.ref, post.ref, 'empty rules reply')
await network.processAll()
const {
@ -71,6 +72,7 @@ describe('views with thread gating', () => {
{ post: post.ref.uriStr, createdAt: iso(), allow: [] },
sc.getHeaders(sc.dids.carol),
)
await network.processAll()
const reply = await sc.reply(
sc.dids.alice,
post.ref,
@ -118,6 +120,7 @@ describe('views with thread gating', () => {
},
sc.getHeaders(sc.dids.carol),
)
await network.processAll()
await sc.reply(
sc.dids.alice,
post.ref,
@ -167,6 +170,7 @@ describe('views with thread gating', () => {
},
sc.getHeaders(sc.dids.carol),
)
await network.processAll()
// carol only follows alice
await sc.reply(
sc.dids.dan,
@ -257,6 +261,7 @@ describe('views with thread gating', () => {
},
sc.getHeaders(sc.dids.carol),
)
await network.processAll()
//
await sc.reply(sc.dids.bob, post.ref, post.ref, 'list rule reply disallow')
const aliceReply = await sc.reply(
@ -324,6 +329,7 @@ describe('views with thread gating', () => {
},
sc.getHeaders(sc.dids.carol),
)
await network.processAll()
await sc.reply(
sc.dids.alice,
post.ref,
@ -365,6 +371,7 @@ describe('views with thread gating', () => {
},
sc.getHeaders(sc.dids.carol),
)
await network.processAll()
// carol only follows alice, and the post mentions dan.
await sc.reply(sc.dids.bob, post.ref, post.ref, 'multi rule reply disallow')
const aliceReply = await sc.reply(
@ -423,6 +430,7 @@ describe('views with thread gating', () => {
{ post: post.ref.uriStr, createdAt: iso() },
sc.getHeaders(sc.dids.carol),
)
await network.processAll()
const aliceReply = await sc.reply(
sc.dids.alice,
post.ref,
@ -458,6 +466,7 @@ describe('views with thread gating', () => {
},
sc.getHeaders(sc.dids.carol),
)
await network.processAll()
// carol only follows alice
const orphanedReply = await sc.reply(
sc.dids.alice,
@ -519,6 +528,7 @@ describe('views with thread gating', () => {
{ post: post.ref.uriStr, createdAt: iso(), allow: [] },
sc.getHeaders(sc.dids.carol),
)
await network.processAll()
const selfReply = await sc.reply(
sc.dids.carol,
post.ref,
@ -553,6 +563,7 @@ describe('views with thread gating', () => {
},
sc.getHeaders(sc.dids.carol),
)
await network.processAll()
// carol only follows alice
const badReply = await sc.reply(
sc.dids.dan,
@ -597,6 +608,7 @@ describe('views with thread gating', () => {
{ post: postB.ref.uriStr, createdAt: iso(), allow: [] },
sc.getHeaders(sc.dids.carol),
)
await network.processAll()
await sc.reply(sc.dids.alice, postA.ref, postA.ref, 'ungated reply')
await sc.reply(sc.dids.alice, postB.ref, postB.ref, 'ungated reply')
await network.processAll()

@ -72,6 +72,8 @@ export class AsyncBuffer<T> {
private buffer: T[] = []
private promise: Promise<void>
private resolve: () => void
private closed = false
private toThrow: unknown | undefined
constructor(public maxSize?: number) {
// Initializing to satisfy types/build, immediately reset by resetPromise()
@ -88,6 +90,10 @@ export class AsyncBuffer<T> {
return this.buffer.length
}
get isClosed(): boolean {
return this.closed
}
resetPromise() {
this.promise = new Promise<void>((r) => (this.resolve = r))
}
@ -104,7 +110,17 @@ export class AsyncBuffer<T> {
async *events(): AsyncGenerator<T> {
while (true) {
if (this.closed && this.buffer.length === 0) {
if (this.toThrow) {
throw this.toThrow
} else {
return
}
}
await this.promise
if (this.toThrow) {
throw this.toThrow
}
if (this.maxSize && this.size > this.maxSize) {
throw new AsyncBufferFullError(this.maxSize)
}
@ -117,6 +133,17 @@ export class AsyncBuffer<T> {
}
}
}
throw(err: unknown) {
this.toThrow = err
this.closed = true
this.resolve()
}
close() {
this.closed = true
this.resolve()
}
}
export class AsyncBufferFullError extends Error {

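The `close()` and `throw()` additions let a producer end the `events()` generator cleanly or surface a failure to its consumer once the buffer drains. A minimal standalone sketch of those semantics (simplified: no `maxSize`/overflow handling, and the promise bookkeeping is condensed):

```typescript
// Simplified sketch of the buffer's close/throw semantics (no maxSize).
class TinyAsyncBuffer<T> {
  private buffer: T[] = []
  private closed = false
  private toThrow: unknown
  private resolve: () => void = () => {}
  private promise: Promise<void> = new Promise<void>((r) => (this.resolve = r))

  private resetPromise() {
    this.promise = new Promise<void>((r) => (this.resolve = r))
  }

  push(item: T) {
    this.buffer.push(item)
    this.resolve()
    this.resetPromise()
  }

  close() {
    this.closed = true
    this.resolve() // wake the consumer so it can observe the closed state
  }

  throw(err: unknown) {
    this.toThrow = err
    this.closed = true
    this.resolve()
  }

  async *events(): AsyncGenerator<T> {
    while (true) {
      if (this.closed && this.buffer.length === 0) {
        if (this.toThrow) throw this.toThrow
        return // clean shutdown once drained
      }
      if (this.buffer.length === 0) await this.promise
      yield* this.buffer.splice(0)
    }
  }
}

async function demo(): Promise<number[]> {
  const buf = new TinyAsyncBuffer<number>()
  const consumed: number[] = []
  const consumer = (async () => {
    for await (const evt of buf.events()) consumed.push(evt)
  })()
  buf.push(1)
  buf.push(2)
  buf.close() // consumer drains the remaining items, then exits
  await consumer
  return consumed
}

demo().then((c) => console.log(c.join(','))) // "1,2"
```

Note the ordering: `close()` only marks the buffer, so items already pushed are still delivered before the generator returns.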
@ -6,6 +6,7 @@ export * from './async'
export * from './util'
export * from './tid'
export * from './ipld'
export * from './retry'
export * from './types'
export * from './times'
export * from './strings'

@ -0,0 +1,52 @@
import { wait } from './util'
export type RetryOptions = {
maxRetries?: number
getWaitMs?: (n: number) => number | null
retryable?: (err: unknown) => boolean
}
export async function retry<T>(
fn: () => Promise<T>,
opts: RetryOptions = {},
): Promise<T> {
const { maxRetries = 3, retryable = () => true, getWaitMs = backoffMs } = opts
let retries = 0
let doneError: unknown
while (!doneError) {
try {
return await fn()
} catch (err) {
const waitMs = getWaitMs(retries)
const willRetry =
retries < maxRetries && waitMs !== null && retryable(err)
if (willRetry) {
retries += 1
if (waitMs !== 0) {
await wait(waitMs)
}
} else {
doneError = err
}
}
}
throw doneError
}
// Returns an exponentially backed-off wait in ms, capped and jittered: ~100, ~200, ~400, ~800, ~1000, ~1000, ...
export function backoffMs(n: number, multiplier = 100, max = 1000) {
const exponentialMs = Math.pow(2, n) * multiplier
const ms = Math.min(exponentialMs, max)
return jitter(ms)
}
// Adds randomness +/-15% of value
function jitter(value: number) {
const delta = value * 0.15
return value + randomRange(-delta, delta)
}
function randomRange(from: number, to: number) {
const rand = Math.random() * (to - from)
return rand + from
}
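The wait schedule `backoffMs` produces can be checked in isolation. This sketch duplicates the helpers above so it runs standalone:

```typescript
function randomRange(from: number, to: number): number {
  return Math.random() * (to - from) + from
}

// Adds randomness +/-15% of value
function jitter(value: number): number {
  const delta = value * 0.15
  return value + randomRange(-delta, delta)
}

// Exponential backoff capped at `max`: ~100, ~200, ~400, ~800, then ~1000 flat
function backoffMs(n: number, multiplier = 100, max = 1000): number {
  return jitter(Math.min(Math.pow(2, n) * multiplier, max))
}

for (let n = 0; n < 6; n++) {
  console.log(`attempt ${n}: wait ~${Math.round(backoffMs(n))}ms`)
}
```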

@ -0,0 +1,93 @@
import { retry } from '../src/index'
describe('retry', () => {
describe('retry()', () => {
it('retries until max retries', async () => {
let fnCalls = 0
let waitMsCalls = 0
const fn = async () => {
fnCalls++
throw new Error(`Oops ${fnCalls}!`)
}
const getWaitMs = (retries) => {
waitMsCalls++
expect(retries).toEqual(waitMsCalls - 1)
return 0
}
await expect(retry(fn, { maxRetries: 13, getWaitMs })).rejects.toThrow(
'Oops 14!',
)
expect(fnCalls).toEqual(14)
expect(waitMsCalls).toEqual(14)
})
it('retries until max wait', async () => {
let fnCalls = 0
let waitMsCalls = 0
const fn = async () => {
fnCalls++
throw new Error(`Oops ${fnCalls}!`)
}
const getWaitMs = (retries) => {
waitMsCalls++
expect(retries).toEqual(waitMsCalls - 1)
if (retries === 13) {
return null
}
return 0
}
await expect(
retry(fn, { maxRetries: Infinity, getWaitMs }),
).rejects.toThrow('Oops 14!')
expect(fnCalls).toEqual(14)
expect(waitMsCalls).toEqual(14)
})
it('retries until non-retryable error', async () => {
let fnCalls = 0
let waitMsCalls = 0
const fn = async () => {
fnCalls++
throw new Error(`Oops ${fnCalls}!`)
}
const getWaitMs = (retries) => {
waitMsCalls++
expect(retries).toEqual(waitMsCalls - 1)
return 0
}
const retryable = (err: unknown) => err?.['message'] !== 'Oops 14!'
await expect(
retry(fn, { maxRetries: Infinity, getWaitMs, retryable }),
).rejects.toThrow('Oops 14!')
expect(fnCalls).toEqual(14)
expect(waitMsCalls).toEqual(14)
})
it('returns latest result after retries', async () => {
let fnCalls = 0
const fn = async () => {
fnCalls++
if (fnCalls < 14) {
throw new Error(`Oops ${fnCalls}!`)
}
return 'ok'
}
const getWaitMs = () => 0
const result = await retry(fn, { maxRetries: Infinity, getWaitMs })
expect(result).toBe('ok')
expect(fnCalls).toBe(14)
})
it('returns result immediately on success', async () => {
let fnCalls = 0
const fn = async () => {
fnCalls++
return 'ok'
}
const getWaitMs = () => 0
const result = await retry(fn, { maxRetries: Infinity, getWaitMs })
expect(result).toBe('ok')
expect(fnCalls).toBe(1)
})
})
})

@ -13,3 +13,30 @@ export const fileExists = async (location: string): Promise<boolean> => {
throw err
}
}
export const readIfExists = async (
filepath: string,
): Promise<Uint8Array | undefined> => {
try {
return await fs.readFile(filepath)
} catch (err) {
if (isErrnoException(err) && err.code === 'ENOENT') {
return
}
throw err
}
}
export const rmIfExists = async (
filepath: string,
recursive = false,
): Promise<void> => {
try {
await fs.rm(filepath, { recursive })
} catch (err) {
if (isErrnoException(err) && err.code === 'ENOENT') {
return
}
throw err
}
}
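Both new helpers follow the same swallow-ENOENT pattern as `fileExists` above: a missing file is an expected outcome, any other errno propagates. A standalone usage sketch (`isErrnoException` is re-declared here because the original imports it from a sibling util module):

```typescript
import fs from 'node:fs/promises'
import os from 'node:os'
import path from 'node:path'

// Stand-in for the util module's type guard.
const isErrnoException = (err: unknown): err is NodeJS.ErrnoException =>
  typeof err === 'object' && err !== null && 'code' in err

const readIfExists = async (
  filepath: string,
): Promise<Uint8Array | undefined> => {
  try {
    return await fs.readFile(filepath)
  } catch (err) {
    if (isErrnoException(err) && err.code === 'ENOENT') return undefined
    throw err
  }
}

const main = async () => {
  const dir = await fs.mkdtemp(path.join(os.tmpdir(), 'read-if-exists-'))
  const present = path.join(dir, 'present.txt')
  await fs.writeFile(present, 'hello')
  console.log(await readIfExists(present)) // file contents as bytes
  console.log(await readIfExists(path.join(dir, 'missing.txt'))) // undefined
  await fs.rm(dir, { recursive: true })
}
main()
```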

@ -28,5 +28,8 @@
"@noble/curves": "^1.1.0",
"@noble/hashes": "^1.3.1",
"uint8arrays": "3.0.0"
},
"devDependencies": {
"@atproto/common": "workspace:^"
}
}

@ -9,3 +9,10 @@ export const sha256 = async (
typeof input === 'string' ? uint8arrays.fromString(input, 'utf8') : input
return noble.sha256(bytes)
}
export const sha256Hex = async (
input: Uint8Array | string,
): Promise<string> => {
const hash = await sha256(input)
return uint8arrays.toString(hash, 'hex')
}
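`sha256Hex` is a thin convenience over `sha256`. The same result can be reproduced with Node's built-in crypto, used here only so the sketch has no workspace dependencies (the real helper builds on `@noble/hashes` and `uint8arrays`):

```typescript
import { createHash } from 'node:crypto'

// Node-crypto equivalent of the sha256Hex helper above.
const sha256Hex = (input: string | Uint8Array): string =>
  createHash('sha256').update(input).digest('hex')

// Well-known SHA-256 test vector for "abc"
console.log(sha256Hex('abc'))
// ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
```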

@ -9,6 +9,10 @@ export interface Didable {
export interface Keypair extends Signer, Didable {}
export interface ExportableKeypair extends Keypair {
export(): Promise<Uint8Array>
}
export type DidKeyPlugin = {
prefix: Uint8Array
jwtAlg: string

@ -2,11 +2,11 @@ import fs from 'node:fs'
import * as uint8arrays from 'uint8arrays'
import { secp256k1 as nobleK256 } from '@noble/curves/secp256k1'
import { p256 as nobleP256 } from '@noble/curves/p256'
import { cborEncode } from '@atproto/common'
import EcdsaKeypair from '../src/p256/keypair'
import Secp256k1Keypair from '../src/secp256k1/keypair'
import * as p256 from '../src/p256/operations'
import * as secp from '../src/secp256k1/operations'
import { cborEncode } from '@atproto/common'
import {
bytesToMultibase,
multibaseToBytes,

@ -18,11 +18,12 @@ const run = async () => {
pds: {
port: 2583,
hostname: 'localhost',
dbPostgresSchema: 'pds',
enableDidDocWithSession: true,
},
bsky: {
dbPostgresSchema: 'bsky',
port: 2584,
publicUrl: 'http://localhost:2584',
},
plc: { port: 2582 },
})

@ -306,7 +306,6 @@ export async function processAll(
network: TestNetworkNoAppView,
ingester: bsky.BskyIngester,
) {
assert(network.pds.ctx.sequencerLeader, 'sequencer leader does not exist')
await network.pds.processAll()
await ingestAll(network, ingester)
// eslint-disable-next-line no-constant-condition
@ -326,25 +325,17 @@ export async function ingestAll(
network: TestNetworkNoAppView,
ingester: bsky.BskyIngester,
) {
assert(network.pds.ctx.sequencerLeader, 'sequencer leader does not exist')
const pdsDb = network.pds.ctx.db.db
const sequencer = network.pds.ctx.sequencer
await network.pds.processAll()
// eslint-disable-next-line no-constant-condition
while (true) {
await wait(50)
// check sequencer
const sequencerCaughtUp = await network.pds.ctx.sequencerLeader.isCaughtUp()
if (!sequencerCaughtUp) continue
// check ingester
const [ingesterCursor, { lastSeq }] = await Promise.all([
const [ingesterCursor, curr] = await Promise.all([
ingester.sub.getCursor(),
pdsDb
.selectFrom('repo_seq')
.where('seq', 'is not', null)
.select(pdsDb.fn.max('repo_seq.seq').as('lastSeq'))
.executeTakeFirstOrThrow(),
sequencer.curr(),
])
const ingesterCaughtUp = ingesterCursor === lastSeq
const ingesterCaughtUp = curr !== null && ingesterCursor === curr
if (ingesterCaughtUp) return
}
}
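With the sequencer leader gone, both `processAll` and `ingestAll` reduce to the same poll-until-caught-up shape: wait a beat, compare the ingester cursor to the sequencer's current seq, return when they match. The pattern as a generic sketch (names and parameters here are illustrative, not from the codebase):

```typescript
const wait = (ms: number) => new Promise<void>((r) => setTimeout(r, ms))

// Polls `check` every `intervalMs` until it returns true or `timeoutMs` passes.
async function pollUntil(
  check: () => Promise<boolean>,
  intervalMs = 50,
  timeoutMs = 5000,
): Promise<void> {
  const start = Date.now()
  while (true) {
    if (await check()) return
    if (Date.now() - start >= timeoutMs) {
      throw new Error(`condition not met within ${timeoutMs}ms`)
    }
    await wait(intervalMs)
  }
}

// Example: a "cursor" that catches up after a few checks.
let checks = 0
pollUntil(async () => ++checks >= 3, 1).then(() =>
  console.log(`caught up after ${checks} checks`),
)
```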

@ -1,3 +1,4 @@
export const ADMIN_PASSWORD = 'admin-pass'
export const MOD_PASSWORD = 'mod-pass'
export const TRIAGE_PASSWORD = 'triage-pass'
export const JWT_SECRET = 'jwt-secret'

@ -13,17 +13,8 @@ export class TestNetworkNoAppView {
static async create(
params: Partial<TestServerParams> = {},
): Promise<TestNetworkNoAppView> {
const dbPostgresUrl = params.dbPostgresUrl || process.env.DB_POSTGRES_URL
const dbPostgresSchema =
params.dbPostgresSchema || process.env.DB_POSTGRES_SCHEMA
const dbSqliteLocation =
dbPostgresUrl === undefined ? ':memory:' : undefined
const plc = await TestPlc.create(params.plc ?? {})
const pds = await TestPds.create({
dbPostgresUrl,
dbPostgresSchema,
dbSqliteLocation,
didPlcUrl: plc.url,
...params.pds,
})

@ -45,9 +45,6 @@ export class TestNetwork extends TestNetworkNoAppView {
})
const pds = await TestPds.create({
port: pdsPort,
dbPostgresUrl,
dbPostgresSchema,
dbPostgresPoolSize: 5,
didPlcUrl: plc.url,
bskyAppViewUrl: bsky.url,
bskyAppViewDid: bsky.ctx.cfg.serverDid,
@ -62,26 +59,17 @@ export class TestNetwork extends TestNetworkNoAppView {
async processFullSubscription(timeout = 5000) {
const sub = this.bsky.indexer.sub
const { db } = this.pds.ctx.db
const start = Date.now()
const lastSeq = await this.pds.ctx.sequencer.curr()
if (!lastSeq) return
while (Date.now() - start < timeout) {
await wait(50)
if (!this.pds.ctx.sequencerLeader) {
throw new Error('Sequencer leader not configured on the pds')
}
const caughtUp = await this.pds.ctx.sequencerLeader.isCaughtUp()
if (!caughtUp) continue
const { lastSeq } = await db
.selectFrom('repo_seq')
.where('seq', 'is not', null)
.select(db.fn.max('repo_seq.seq').as('lastSeq'))
.executeTakeFirstOrThrow()
const { cursor } = sub.partitions.get(0)
if (cursor === lastSeq) {
const partitionState = sub.partitions.get(0)
if (partitionState?.cursor >= lastSeq) {
// has seen last seq, just need to wait for it to finish processing
await sub.repoQueue.main.onIdle()
return
}
await wait(5)
}
throw new Error(`Sequence was not processed within ${timeout}ms`)
}
@ -93,10 +81,11 @@ export class TestNetwork extends TestNetworkNoAppView {
}
async serviceHeaders(did: string, aud?: string) {
const keypair = await this.pds.ctx.actorStore.keypair(did)
const jwt = await createServiceJwt({
iss: did,
aud: aud ?? this.bsky.ctx.cfg.serverDid,
keypair: this.pds.ctx.repoSigningKey,
keypair,
})
return { authorization: `Bearer ${jwt}` }
}

@ -1,13 +1,19 @@
import path from 'node:path'
import os from 'node:os'
import fs from 'node:fs/promises'
import getPort from 'get-port'
import * as ui8 from 'uint8arrays'
import * as pds from '@atproto/pds'
import { createSecretKeyObject } from '@atproto/pds/src/auth-verifier'
import { Secp256k1Keypair, randomStr } from '@atproto/crypto'
import { AtpAgent } from '@atproto/api'
import { PdsConfig } from './types'
import { uniqueLockId } from './util'
import { ADMIN_PASSWORD, MOD_PASSWORD, TRIAGE_PASSWORD } from './const'
import {
ADMIN_PASSWORD,
JWT_SECRET,
MOD_PASSWORD,
TRIAGE_PASSWORD,
} from './const'
export class TestPds {
constructor(
@ -17,8 +23,6 @@ export class TestPds {
) {}
static async create(config: PdsConfig): Promise<TestPds> {
const repoSigningKey = await Secp256k1Keypair.create({ exportable: true })
const repoSigningPriv = ui8.toString(await repoSigningKey.export(), 'hex')
const plcRotationKey = await Secp256k1Keypair.create({ exportable: true })
const plcRotationPriv = ui8.toString(await plcRotationKey.export(), 'hex')
const recoveryKey = (await Secp256k1Keypair.create()).did()
@ -27,21 +31,22 @@ export class TestPds {
const url = `http://localhost:${port}`
const blobstoreLoc = path.join(os.tmpdir(), randomStr(8, 'base32'))
const dataDirectory = path.join(os.tmpdir(), randomStr(8, 'base32'))
await fs.mkdir(dataDirectory, { recursive: true })
const env: pds.ServerEnvironment = {
port,
dataDirectory: dataDirectory,
blobstoreDiskLocation: blobstoreLoc,
recoveryDidKey: recoveryKey,
adminPassword: ADMIN_PASSWORD,
moderatorPassword: MOD_PASSWORD,
triagePassword: TRIAGE_PASSWORD,
jwtSecret: 'jwt-secret',
jwtSecret: JWT_SECRET,
serviceHandleDomains: ['.test'],
sequencerLeaderLockId: uniqueLockId(),
bskyAppViewUrl: 'https://appview.invalid',
bskyAppViewDid: 'did:example:invalid',
bskyAppViewCdnUrlPattern: 'http://cdn.appview.com/%s/%s/%s',
repoSigningKeyK256PrivateKeyHex: repoSigningPriv,
plcRotationKeyK256PrivateKeyHex: plcRotationPriv,
inviteRequired: false,
...config,
@ -51,20 +56,6 @@ export class TestPds {
const server = await pds.PDS.create(cfg, secrets)
// Separate migration db on postgres in case migration changes some
// connection state that we need in the tests, e.g. "alter database ... set ..."
const migrationDb =
cfg.db.dialect === 'pg'
? pds.Database.postgres({
url: cfg.db.url,
schema: cfg.db.schema,
})
: server.ctx.db
await migrationDb.migrateToLatestOrThrow()
if (migrationDb !== server.ctx.db) {
await migrationDb.close()
}
await server.start()
return new TestPds(url, port, server)
@ -97,6 +88,10 @@ export class TestPds {
}
}
jwtSecretKey() {
return createSecretKeyObject(JWT_SECRET)
}
async processAll() {
await this.ctx.backgroundQueue.processAll()
}

@ -203,6 +203,11 @@ const indexTs = (
}`,
})
file.addTypeAlias({
name: 'HandlerOpts',
type: `{ blobLimit?: number }`,
})
file.addTypeAlias({
name: 'HandlerRateLimitOpts',
typeParameters: [{ name: 'T' }],
@ -220,6 +225,7 @@ const indexTs = (
| Handler
| {
auth?: Auth
opts?: HandlerOpts
rateLimit?: HandlerRateLimitOpts<ReqCtx> | HandlerRateLimitOpts<ReqCtx>[]
handler: Handler
}`,

@ -1,139 +0,0 @@
import { randomBytes } from '@atproto/crypto'
import { cborEncode } from '@atproto/common'
import { randomCid } from '@atproto/repo/tests/_util'
import { BlockMap, blocksToCarFile } from '@atproto/repo'
import { byFrame } from '@atproto/xrpc-server'
import { WebSocket } from 'ws'
import { Database } from '../src'
import { TestNetworkNoAppView } from '@atproto/dev-env'
describe('sequencer bench', () => {
let network: TestNetworkNoAppView
let db: Database
beforeAll(async () => {
network = await TestNetworkNoAppView.create({
dbPostgresSchema: 'sequencer_bench',
pds: {
maxSubscriptionBuffer: 20000,
},
})
if (network.pds.ctx.cfg.db.dialect !== 'pg') {
throw new Error('no postgres url')
}
db = Database.postgres({
url: network.pds.ctx.cfg.db.url,
schema: network.pds.ctx.cfg.db.schema,
poolSize: 50,
})
network.pds.ctx.sequencerLeader?.destroy()
})
afterAll(async () => {
await network.close()
})
const doWrites = async (batches: number, batchSize: number) => {
const cid = await randomCid()
const blocks = new BlockMap()
await blocks.add(randomBytes(500))
await blocks.add(randomBytes(500))
await blocks.add(randomBytes(500))
await blocks.add(randomBytes(500))
await blocks.add(randomBytes(500))
await blocks.add(randomBytes(500))
const car = await blocksToCarFile(cid, blocks)
const evt = {
rebase: false,
tooBig: false,
repo: 'did:plc:123451234',
commit: cid,
prev: cid,
ops: [{ action: 'create', path: 'app.bsky.feed.post/abcdefg1234', cid }],
blocks: car,
blobs: [],
}
const encodeEvt = cborEncode(evt)
const promises: Promise<unknown>[] = []
for (let i = 0; i < batches; i++) {
const rows: any[] = []
for (let j = 0; j < batchSize; j++) {
rows.push({
did: 'did:web:example.com',
eventType: 'append',
event: encodeEvt,
sequencedAt: new Date().toISOString(),
})
}
const insert = db.db.insertInto('repo_seq').values(rows).execute()
promises.push(insert)
}
await Promise.all(promises)
}
const readAll = async (
totalToRead: number,
cursor?: number,
): Promise<number> => {
const serverHost = network.pds.url.replace('http://', '')
let url = `ws://${serverHost}/xrpc/com.atproto.sync.subscribeRepos`
if (cursor !== undefined) {
url += `?cursor=${cursor}`
}
const ws = new WebSocket(url)
let start = Date.now()
let count = 0
const gen = byFrame(ws)
for await (const _frame of gen) {
if (count === 0) {
start = Date.now()
}
count++
if (count >= totalToRead) {
break
}
}
if (count < totalToRead) {
throw new Error('Did not read full websocket')
}
return Date.now() - start
}
it('benches', async () => {
const BATCHES = 100
const BATCH_SIZE = 100
const TOTAL = BATCHES * BATCH_SIZE
const readAllPromise = readAll(TOTAL, 0)
const start = Date.now()
await doWrites(BATCHES, BATCH_SIZE)
const setup = Date.now()
await network.pds.ctx.sequencerLeader?.sequenceOutgoing()
const sequencingTime = Date.now() - setup
const liveTailTime = await readAllPromise
const backfillTime = await readAll(TOTAL, 0)
console.log(`
${TOTAL} events
Setup: ${setup - start} ms
Sequencing: ${sequencingTime} ms
Sequencing Rate: ${formatRate(TOTAL, sequencingTime)} evt/s
Live tail: ${liveTailTime} ms
Live tail Rate: ${formatRate(TOTAL, liveTailTime)} evt/s
Backfilled: ${backfillTime} ms
Backfill Rate: ${formatRate(TOTAL, backfillTime)} evt/s`)
})
})
const formatRate = (evts: number, timeMs: number): string => {
const evtPerSec = (evts * 1000) / timeMs
return evtPerSec.toFixed(3)
}

@@ -24,10 +24,11 @@
"build": "node ./build.js",
"postbuild": "tsc --build tsconfig.build.json",
"test": "../dev-infra/with-test-redis-and-db.sh jest",
"test:sqlite": "jest",
"test:sqlite-only": "jest --testPathIgnorePatterns /tests/proxied/*",
"test:log": "tail -50 test.log | pino-pretty",
"update-main-to-dist": "node ../../update-main-to-dist.js packages/pds",
"bench": "../dev-infra/with-test-redis-and-db.sh jest --config jest.bench.config.js",
"test:sqlite": "jest --testPathIgnorePatterns /tests/proxied/*",
"test:log": "tail -50 test.log | pino-pretty",
"test:updateSnapshot": "jest --updateSnapshot",
"migration:create": "ts-node ./bin/migration-create.ts"
},
@@ -56,7 +57,8 @@
"http-errors": "^2.0.0",
"http-terminator": "^3.2.0",
"ioredis": "^5.3.2",
"jsonwebtoken": "^8.5.1",
"jose": "^5.0.1",
"key-encoder": "^2.0.3",
"kysely": "^0.22.0",
"multiformats": "^9.9.0",
"nodemailer": "^6.8.0",
@@ -75,16 +77,17 @@
"@atproto/bsky": "workspace:^",
"@atproto/dev-env": "workspace:^",
"@atproto/lex-cli": "workspace:^",
"@atproto/pds-entryway": "npm:@atproto/pds@0.3.0-entryway.2",
"@did-plc/server": "^0.0.1",
"@types/cors": "^2.8.12",
"@types/disposable-email": "^0.2.0",
"@types/express": "^4.17.13",
"@types/express-serve-static-core": "^4.17.36",
"@types/jsonwebtoken": "^8.5.9",
"@types/nodemailer": "^6.4.6",
"@types/pg": "^8.6.6",
"@types/qs": "^6.9.7",
"axios": "^0.27.2",
"get-port": "^6.1.2",
"ws": "^8.12.0"
}
}

@@ -0,0 +1,21 @@
import { Database, Migrator } from '../../db'
import { DatabaseSchema } from './schema'
import migrations from './migrations'
export * from './schema'
export type AccountDb = Database<DatabaseSchema>
export const getDb = (
location: string,
disableWalAutoCheckpoint = false,
): AccountDb => {
const pragmas: Record<string, string> = disableWalAutoCheckpoint
? { wal_autocheckpoint: '0' }
: {}
return Database.sqlite(location, { pragmas })
}
export const getMigrator = (db: AccountDb) => {
return new Migrator(db.db, migrations)
}

@@ -0,0 +1,115 @@
import { Kysely, sql } from 'kysely'
export async function up(db: Kysely<unknown>): Promise<void> {
await db.schema
.createTable('app_password')
.addColumn('did', 'varchar', (col) => col.notNull())
.addColumn('name', 'varchar', (col) => col.notNull())
.addColumn('passwordScrypt', 'varchar', (col) => col.notNull())
.addColumn('createdAt', 'varchar', (col) => col.notNull())
.addPrimaryKeyConstraint('app_password_pkey', ['did', 'name'])
.execute()
await db.schema
.createTable('invite_code')
.addColumn('code', 'varchar', (col) => col.primaryKey())
.addColumn('availableUses', 'integer', (col) => col.notNull())
.addColumn('disabled', 'int2', (col) => col.defaultTo(0))
.addColumn('forAccount', 'varchar', (col) => col.notNull())
.addColumn('createdBy', 'varchar', (col) => col.notNull())
.addColumn('createdAt', 'varchar', (col) => col.notNull())
.execute()
await db.schema
.createIndex('invite_code_for_account_idx')
.on('invite_code')
.column('forAccount')
.execute()
await db.schema
.createTable('invite_code_use')
.addColumn('code', 'varchar', (col) => col.notNull())
.addColumn('usedBy', 'varchar', (col) => col.notNull())
.addColumn('usedAt', 'varchar', (col) => col.notNull())
.addPrimaryKeyConstraint(`invite_code_use_pkey`, ['code', 'usedBy'])
.execute()
await db.schema
.createTable('refresh_token')
.addColumn('id', 'varchar', (col) => col.primaryKey())
.addColumn('did', 'varchar', (col) => col.notNull())
.addColumn('expiresAt', 'varchar', (col) => col.notNull())
.addColumn('nextId', 'varchar')
.addColumn('appPasswordName', 'varchar')
.execute()
await db.schema // Aids in refresh token cleanup
.createIndex('refresh_token_did_idx')
.on('refresh_token')
.column('did')
.execute()
await db.schema
.createTable('repo_root')
.addColumn('did', 'varchar', (col) => col.primaryKey())
.addColumn('cid', 'varchar', (col) => col.notNull())
.addColumn('rev', 'varchar', (col) => col.notNull())
.addColumn('indexedAt', 'varchar', (col) => col.notNull())
.execute()
await db.schema
.createTable('actor')
.addColumn('did', 'varchar', (col) => col.primaryKey())
.addColumn('handle', 'varchar')
.addColumn('createdAt', 'varchar', (col) => col.notNull())
.addColumn('takedownRef', 'varchar')
.execute()
await db.schema
.createIndex(`actor_handle_lower_idx`)
.unique()
.on('actor')
.expression(sql`lower("handle")`)
.execute()
await db.schema
.createIndex('actor_cursor_idx')
.on('actor')
.columns(['createdAt', 'did'])
.execute()
await db.schema
.createTable('account')
.addColumn('did', 'varchar', (col) => col.primaryKey())
.addColumn('email', 'varchar', (col) => col.notNull())
.addColumn('passwordScrypt', 'varchar', (col) => col.notNull())
.addColumn('emailConfirmedAt', 'varchar')
.addColumn('invitesDisabled', 'int2', (col) => col.notNull().defaultTo(0))
.execute()
await db.schema
.createIndex(`account_email_lower_idx`)
.unique()
.on('account')
.expression(sql`lower("email")`)
.execute()
await db.schema
.createTable('email_token')
.addColumn('purpose', 'varchar', (col) => col.notNull())
.addColumn('did', 'varchar', (col) => col.notNull())
.addColumn('token', 'varchar', (col) => col.notNull())
.addColumn('requestedAt', 'varchar', (col) => col.notNull())
.addPrimaryKeyConstraint('email_token_pkey', ['purpose', 'did'])
.addUniqueConstraint('email_token_purpose_token_unique', [
'purpose',
'token',
])
.execute()
}
export async function down(db: Kysely<unknown>): Promise<void> {
await db.schema.dropTable('email_token').execute()
await db.schema.dropTable('account').execute()
await db.schema.dropTable('actor').execute()
await db.schema.dropTable('repo_root').execute()
await db.schema.dropTable('refresh_token').execute()
await db.schema.dropTable('invite_code_use').execute()
await db.schema.dropTable('invite_code').execute()
await db.schema.dropTable('app_password').execute()
}

@@ -0,0 +1,5 @@
import * as init from './001-init'
export default {
'001': init,
}

@@ -0,0 +1,15 @@
import { Generated, Selectable } from 'kysely'
export interface Account {
did: string
email: string
passwordScrypt: string
emailConfirmedAt: string | null
invitesDisabled: Generated<0 | 1>
}
export type AccountEntry = Selectable<Account>
export const tableName = 'account'
export type PartialDB = { [tableName]: Account }

@@ -0,0 +1,14 @@
import { Selectable } from 'kysely'
export interface Actor {
did: string
handle: string | null
createdAt: string
takedownRef: string | null
}
export type ActorEntry = Selectable<Actor>
export const tableName = 'actor'
export type PartialDB = { [tableName]: Actor }

@@ -8,7 +8,7 @@ export interface EmailToken
purpose: EmailTokenPurpose
did: string
token: string
requestedAt: Date
requestedAt: string
}
export const tableName = 'email_token'

@@ -0,0 +1,23 @@
import * as actor from './actor'
import * as account from './account'
import * as repoRoot from './repo-root'
import * as refreshToken from './refresh-token'
import * as appPassword from './app-password'
import * as inviteCode from './invite-code'
import * as emailToken from './email-token'
export type DatabaseSchema = actor.PartialDB &
account.PartialDB &
refreshToken.PartialDB &
appPassword.PartialDB &
repoRoot.PartialDB &
inviteCode.PartialDB &
emailToken.PartialDB
export type { Actor, ActorEntry } from './actor'
export type { Account, AccountEntry } from './account'
export type { RepoRoot } from './repo-root'
export type { RefreshToken } from './refresh-token'
export type { AppPassword } from './app-password'
export type { InviteCode, InviteCodeUse } from './invite-code'
export type { EmailToken, EmailTokenPurpose } from './email-token'

@@ -2,7 +2,7 @@ export interface InviteCode
code: string
availableUses: number
disabled: 0 | 1
forUser: string
forAccount: string
createdBy: string
createdAt: string
}

@@ -1,10 +1,8 @@
// @NOTE also used by app-view (moderation)
export interface RepoRoot {
did: string
root: string
rev: string | null
cid: string
rev: string
indexedAt: string
takedownRef: string | null
}
export const tableName = 'repo_root'

@@ -0,0 +1,210 @@
import { isErrUniqueViolation, notSoftDeletedClause } from '../../db'
import { AccountDb, ActorEntry } from '../db'
import { StatusAttr } from '../../lexicon/types/com/atproto/admin/defs'
export class UserAlreadyExistsError extends Error {}
export type ActorAccount = ActorEntry & {
email: string | null
emailConfirmedAt: string | null
invitesDisabled: 0 | 1 | null
}
const selectAccountQB = (db: AccountDb, includeSoftDeleted: boolean) => {
const { ref } = db.db.dynamic
return db.db
.selectFrom('actor')
.leftJoin('account', 'actor.did', 'account.did')
.if(!includeSoftDeleted, (qb) =>
qb.where(notSoftDeletedClause(ref('actor'))),
)
.select([
'actor.did',
'actor.handle',
'actor.createdAt',
'actor.takedownRef',
'account.email',
'account.emailConfirmedAt',
'account.invitesDisabled',
])
}
export const getAccount = async (
db: AccountDb,
handleOrDid: string,
includeSoftDeleted = false,
): Promise<ActorAccount | null> => {
const found = await selectAccountQB(db, includeSoftDeleted)
.where((qb) => {
if (handleOrDid.startsWith('did:')) {
return qb.where('actor.did', '=', handleOrDid)
} else {
return qb.where('actor.handle', '=', handleOrDid)
}
})
.executeTakeFirst()
return found || null
}
export const getAccountByEmail = async (
db: AccountDb,
email: string,
includeSoftDeleted = false,
): Promise<ActorAccount | null> => {
const found = await selectAccountQB(db, includeSoftDeleted)
.where('email', '=', email.toLowerCase())
.executeTakeFirst()
return found || null
}
export const registerActor = async (
db: AccountDb,
opts: {
did: string
handle: string
},
) => {
const { did, handle } = opts
const [registered] = await db.executeWithRetry(
db.db
.insertInto('actor')
.values({
did,
handle,
createdAt: new Date().toISOString(),
})
.onConflict((oc) => oc.doNothing())
.returning('did'),
)
if (!registered) {
throw new UserAlreadyExistsError()
}
}
export const registerAccount = async (
db: AccountDb,
opts: {
did: string
email: string
passwordScrypt: string
},
) => {
const { did, email, passwordScrypt } = opts
const [registered] = await db.executeWithRetry(
db.db
.insertInto('account')
.values({
did,
email: email.toLowerCase(),
passwordScrypt,
})
.onConflict((oc) => oc.doNothing())
.returning('did'),
)
if (!registered) {
throw new UserAlreadyExistsError()
}
}
export const deleteAccount = async (
db: AccountDb,
did: string,
): Promise<void> => {
// Not done in transaction because it would be too long, prone to contention.
// Also, this can safely be run multiple times if it fails.
await db.executeWithRetry(
db.db.deleteFrom('repo_root').where('did', '=', did),
)
await db.executeWithRetry(
db.db.deleteFrom('email_token').where('did', '=', did),
)
await db.executeWithRetry(
db.db.deleteFrom('refresh_token').where('did', '=', did),
)
await db.executeWithRetry(
db.db.deleteFrom('account').where('account.did', '=', did),
)
await db.executeWithRetry(
db.db.deleteFrom('actor').where('actor.did', '=', did),
)
}
export const updateHandle = async (
db: AccountDb,
did: string,
handle: string,
) => {
const [res] = await db.executeWithRetry(
db.db
.updateTable('actor')
.set({ handle })
.where('did', '=', did)
.whereNotExists(
db.db.selectFrom('actor').where('handle', '=', handle).selectAll(),
),
)
if (res.numUpdatedRows < 1) {
throw new UserAlreadyExistsError()
}
}
export const updateEmail = async (
db: AccountDb,
did: string,
email: string,
) => {
try {
await db.executeWithRetry(
db.db
.updateTable('account')
.set({ email: email.toLowerCase(), emailConfirmedAt: null })
.where('did', '=', did),
)
} catch (err) {
if (isErrUniqueViolation(err)) {
throw new UserAlreadyExistsError()
}
throw err
}
}
export const setEmailConfirmedAt = async (
db: AccountDb,
did: string,
emailConfirmedAt: string,
) => {
await db.executeWithRetry(
db.db
.updateTable('account')
.set({ emailConfirmedAt })
.where('did', '=', did),
)
}
export const getAccountTakedownStatus = async (
db: AccountDb,
did: string,
): Promise<StatusAttr | null> => {
const res = await db.db
.selectFrom('actor')
.select('takedownRef')
.where('did', '=', did)
.executeTakeFirst()
if (!res) return null
return res.takedownRef
? { applied: true, ref: res.takedownRef }
: { applied: false }
}
export const updateAccountTakedownStatus = async (
db: AccountDb,
did: string,
takedown: StatusAttr,
) => {
const takedownRef = takedown.applied
? takedown.ref ?? new Date().toISOString()
: null
await db.executeWithRetry(
db.db.updateTable('actor').set({ takedownRef }).where('did', '=', did),
)
}
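A small sketch of the lookup dispatch in `getAccount` above: identifiers starting with `did:` are matched against `actor.did`, anything else against `actor.handle` (the function and type names here are illustrative, not part of the module):

```typescript
// Which column getAccount filters on for a given identifier.
type LookupField = 'actor.did' | 'actor.handle'

const lookupField = (handleOrDid: string): LookupField =>
  handleOrDid.startsWith('did:') ? 'actor.did' : 'actor.handle'
```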

@@ -0,0 +1,184 @@
import assert from 'node:assert'
import { KeyObject } from 'node:crypto'
import * as jose from 'jose'
import * as ui8 from 'uint8arrays'
import * as crypto from '@atproto/crypto'
import { AuthScope } from '../../auth-verifier'
import { AccountDb } from '../db'
export type AuthToken = {
scope: AuthScope
sub: string
exp: number
}
export type RefreshToken = AuthToken & { scope: AuthScope.Refresh; jti: string }
export const createTokens = async (opts: {
did: string
jwtKey: KeyObject
serviceDid: string
scope?: AuthScope
jti?: string
expiresIn?: string | number
}) => {
const { did, jwtKey, serviceDid, scope, jti, expiresIn } = opts
const [accessJwt, refreshJwt] = await Promise.all([
createAccessToken({ did, jwtKey, serviceDid, scope, expiresIn }),
createRefreshToken({ did, jwtKey, serviceDid, jti, expiresIn }),
])
return { accessJwt, refreshJwt }
}
export const createAccessToken = (opts: {
did: string
jwtKey: KeyObject
serviceDid: string
scope?: AuthScope
expiresIn?: string | number
}): Promise<string> => {
const {
did,
jwtKey,
serviceDid,
scope = AuthScope.Access,
expiresIn = '120mins',
} = opts
const signer = new jose.SignJWT({ scope })
.setProtectedHeader({ alg: 'HS256' }) // only symmetric keys supported
.setAudience(serviceDid)
.setSubject(did)
.setIssuedAt()
.setExpirationTime(expiresIn)
return signer.sign(jwtKey)
}
export const createRefreshToken = (opts: {
did: string
jwtKey: KeyObject
serviceDid: string
jti?: string
expiresIn?: string | number
}): Promise<string> => {
const {
did,
jwtKey,
serviceDid,
jti = getRefreshTokenId(),
expiresIn = '90days',
} = opts
const signer = new jose.SignJWT({ scope: AuthScope.Refresh })
.setProtectedHeader({ alg: 'HS256' }) // only symmetric keys supported
.setAudience(serviceDid)
.setSubject(did)
.setJti(jti)
.setIssuedAt()
.setExpirationTime(expiresIn)
return signer.sign(jwtKey)
}
// @NOTE unsafe for verification, should only be used w/ direct output from createRefreshToken() or createTokens()
export const decodeRefreshToken = (jwt: string) => {
const token = jose.decodeJwt(jwt)
assert.ok(token.scope === AuthScope.Refresh, 'not a refresh token')
return token as RefreshToken
}
export const storeRefreshToken = async (
db: AccountDb,
payload: RefreshToken,
appPasswordName: string | null,
) => {
const [result] = await db.executeWithRetry(
db.db
.insertInto('refresh_token')
.values({
id: payload.jti,
did: payload.sub,
appPasswordName,
expiresAt: new Date(payload.exp * 1000).toISOString(),
})
.onConflict((oc) => oc.doNothing()), // E.g. when re-granting during a refresh grace period
)
return result
}
export const getRefreshToken = async (db: AccountDb, id: string) => {
return db.db
.selectFrom('refresh_token')
.where('id', '=', id)
.selectAll()
.executeTakeFirst()
}
export const deleteExpiredRefreshTokens = async (
db: AccountDb,
did: string,
now: string,
) => {
await db.executeWithRetry(
db.db
.deleteFrom('refresh_token')
.where('did', '=', did)
.where('expiresAt', '<=', now),
)
}
export const addRefreshGracePeriod = async (
db: AccountDb,
opts: {
id: string
expiresAt: string
nextId: string
},
) => {
const { id, expiresAt, nextId } = opts
const [res] = await db.executeWithRetry(
db.db
.updateTable('refresh_token')
.where('id', '=', id)
.where((inner) =>
inner.where('nextId', 'is', null).orWhere('nextId', '=', nextId),
)
.set({ expiresAt, nextId })
.returningAll(),
)
if (!res) {
throw new ConcurrentRefreshError()
}
}
export const revokeRefreshToken = async (db: AccountDb, id: string) => {
const [{ numDeletedRows }] = await db.executeWithRetry(
db.db.deleteFrom('refresh_token').where('id', '=', id),
)
return numDeletedRows > 0
}
export const revokeRefreshTokensByDid = async (db: AccountDb, did: string) => {
const [{ numDeletedRows }] = await db.executeWithRetry(
db.db.deleteFrom('refresh_token').where('did', '=', did),
)
return numDeletedRows > 0
}
export const revokeAppPasswordRefreshToken = async (
db: AccountDb,
did: string,
appPassName: string,
) => {
const [{ numDeletedRows }] = await db.executeWithRetry(
db.db
.deleteFrom('refresh_token')
.where('did', '=', did)
.where('appPasswordName', '=', appPassName),
)
return numDeletedRows > 0
}
export const getRefreshTokenId = () => {
return ui8.toString(crypto.randomBytes(32), 'base64')
}
export class ConcurrentRefreshError extends Error {}
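The grace period set via `addRefreshGracePeriod` works by shortening the old token's expiry, never extending it. An illustrative sketch of that computation (helper name is hypothetical; the 2-hour window mirrors the constant used by the account manager):

```typescript
const HOUR = 60 * 60 * 1000
const REFRESH_GRACE_MS = 2 * HOUR

// New expiry for a rotated refresh token: at most a 2-hour grace window
// from now, and never later than the token's original expiration.
const graceExpiry = (now: Date, prevExpiresAt: Date): Date => {
  const graceExpiresAt = new Date(now.getTime() + REFRESH_GRACE_MS)
  return graceExpiresAt < prevExpiresAt ? graceExpiresAt : prevExpiresAt
}
```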

@@ -0,0 +1,80 @@
import { InvalidRequestError } from '@atproto/xrpc-server'
import { MINUTE, lessThanAgoMs } from '@atproto/common'
import { getRandomToken } from '../../api/com/atproto/server/util'
import { AccountDb, EmailTokenPurpose } from '../db'
export const createEmailToken = async (
db: AccountDb,
did: string,
purpose: EmailTokenPurpose,
): Promise<string> => {
const token = getRandomToken().toUpperCase()
const now = new Date().toISOString()
await db.executeWithRetry(
db.db
.insertInto('email_token')
.values({ purpose, did, token, requestedAt: now })
.onConflict((oc) =>
oc.columns(['purpose', 'did']).doUpdateSet({ token, requestedAt: now }),
),
)
return token
}
export const deleteEmailToken = async (
db: AccountDb,
did: string,
purpose: EmailTokenPurpose,
) => {
await db.executeWithRetry(
db.db
.deleteFrom('email_token')
.where('did', '=', did)
.where('purpose', '=', purpose),
)
}
export const assertValidToken = async (
db: AccountDb,
did: string,
purpose: EmailTokenPurpose,
token: string,
expirationLen = 15 * MINUTE,
) => {
const res = await db.db
.selectFrom('email_token')
.selectAll()
.where('purpose', '=', purpose)
.where('did', '=', did)
.where('token', '=', token.toUpperCase())
.executeTakeFirst()
if (!res) {
throw new InvalidRequestError('Token is invalid', 'InvalidToken')
}
const expired = !lessThanAgoMs(new Date(res.requestedAt), expirationLen)
if (expired) {
throw new InvalidRequestError('Token is expired', 'ExpiredToken')
}
}
export const assertValidTokenAndFindDid = async (
db: AccountDb,
purpose: EmailTokenPurpose,
token: string,
expirationLen = 15 * MINUTE,
): Promise<string> => {
const res = await db.db
.selectFrom('email_token')
.selectAll()
.where('purpose', '=', purpose)
.where('token', '=', token.toUpperCase())
.executeTakeFirst()
if (!res) {
throw new InvalidRequestError('Token is invalid', 'InvalidToken')
}
const expired = !lessThanAgoMs(new Date(res.requestedAt), expirationLen)
if (expired) {
throw new InvalidRequestError('Token is expired', 'ExpiredToken')
}
return res.did
}
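The expiry rule used by both assertion helpers can be sketched without the database, assuming `lessThanAgoMs(time, range)` from `@atproto/common` returns true while `time` is under `range` ms in the past (function name below is illustrative):

```typescript
const MINUTE = 60 * 1000

// A token expires once requestedAt is expirationLen ms (default 15 min)
// or more in the past.
const isTokenExpired = (
  requestedAt: string,
  nowMs: number,
  expirationLen = 15 * MINUTE,
): boolean => nowMs - new Date(requestedAt).getTime() >= expirationLen
```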

@@ -0,0 +1,259 @@
import { chunkArray } from '@atproto/common'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { AccountDb, InviteCode } from '../db'
import { countAll } from '../../db'
export const createInviteCodes = async (
db: AccountDb,
toCreate: { account: string; codes: string[] }[],
useCount: number,
) => {
const now = new Date().toISOString()
const rows = toCreate.flatMap((account) =>
account.codes.map((code) => ({
code: code,
availableUses: useCount,
disabled: 0 as const,
forAccount: account.account,
createdBy: 'admin',
createdAt: now,
})),
)
await Promise.all(
chunkArray(rows, 50).map((chunk) =>
db.executeWithRetry(db.db.insertInto('invite_code').values(chunk)),
),
)
}
export const createAccountInviteCodes = async (
db: AccountDb,
forAccount: string,
codes: string[],
expectedTotal: number,
disabled: 0 | 1,
): Promise<CodeDetail[]> => {
const now = new Date().toISOString()
const rows = codes.map(
(code) =>
({
code,
availableUses: 1,
disabled,
forAccount,
createdBy: forAccount,
createdAt: now,
} as InviteCode),
)
await db.executeWithRetry(db.db.insertInto('invite_code').values(rows))
const finalRoutineInviteCodes = await db.db
.selectFrom('invite_code')
.where('forAccount', '=', forAccount)
.where('createdBy', '!=', 'admin') // don't count admin-gifted codes against the user
.selectAll()
.execute()
if (finalRoutineInviteCodes.length > expectedTotal) {
throw new InvalidRequestError(
'attempted to create additional codes in another request',
'DuplicateCreate',
)
}
return rows.map((row) => ({
...row,
available: 1,
disabled: row.disabled === 1,
uses: [],
}))
}
export const recordInviteUse = async (
db: AccountDb,
opts: {
did: string
inviteCode: string | undefined
now: string
},
) => {
if (!opts.inviteCode) return
await db.executeWithRetry(
db.db.insertInto('invite_code_use').values({
code: opts.inviteCode,
usedBy: opts.did,
usedAt: opts.now,
}),
)
}
export const ensureInviteIsAvailable = async (
db: AccountDb,
inviteCode: string,
): Promise<void> => {
const invite = await db.db
.selectFrom('invite_code')
.leftJoin('actor', 'actor.did', 'invite_code.forAccount')
.where('takedownRef', 'is', null)
.selectAll('invite_code')
.where('code', '=', inviteCode)
.executeTakeFirst()
if (!invite || invite.disabled) {
throw new InvalidRequestError(
'Provided invite code not available',
'InvalidInviteCode',
)
}
const uses = await db.db
.selectFrom('invite_code_use')
.select(countAll.as('count'))
.where('code', '=', inviteCode)
.executeTakeFirstOrThrow()
if (invite.availableUses <= uses.count) {
throw new InvalidRequestError(
'Provided invite code not available',
'InvalidInviteCode',
)
}
}
export const selectInviteCodesQb = (db: AccountDb) => {
const ref = db.db.dynamic.ref
const builder = db.db
.selectFrom('invite_code')
.select([
'invite_code.code as code',
'invite_code.availableUses as available',
'invite_code.disabled as disabled',
'invite_code.forAccount as forAccount',
'invite_code.createdBy as createdBy',
'invite_code.createdAt as createdAt',
db.db
.selectFrom('invite_code_use')
.select(countAll.as('count'))
.whereRef('invite_code_use.code', '=', ref('invite_code.code'))
.as('uses'),
])
return db.db.selectFrom(builder.as('codes')).selectAll()
}
export const getAccountInviteCodes = async (
db: AccountDb,
did: string,
): Promise<CodeDetail[]> => {
const res = await selectInviteCodesQb(db)
.where('forAccount', '=', did)
.execute()
const codes = res.map((row) => row.code)
const uses = await getInviteCodesUses(db, codes)
return res.map((row) => ({
...row,
uses: uses[row.code] ?? [],
disabled: row.disabled === 1,
}))
}
export const getInviteCodesUses = async (
db: AccountDb,
codes: string[],
): Promise<Record<string, CodeUse[]>> => {
const uses: Record<string, CodeUse[]> = {}
if (codes.length > 0) {
const usesRes = await db.db
.selectFrom('invite_code_use')
.where('code', 'in', codes)
.orderBy('usedAt', 'desc')
.selectAll()
.execute()
for (const use of usesRes) {
const { code, usedBy, usedAt } = use
uses[code] ??= []
uses[code].push({ usedBy, usedAt })
}
}
return uses
}
export const getInvitedByForAccounts = async (
db: AccountDb,
dids: string[],
): Promise<Record<string, CodeDetail>> => {
if (dids.length < 1) return {}
const codeDetailsRes = await selectInviteCodesQb(db)
.where('code', 'in', (qb) =>
qb
.selectFrom('invite_code_use')
.where('usedBy', 'in', dids)
.select('code')
.distinct(),
)
.execute()
const uses = await getInviteCodesUses(
db,
codeDetailsRes.map((row) => row.code),
)
const codeDetails = codeDetailsRes.map((row) => ({
...row,
uses: uses[row.code] ?? [],
disabled: row.disabled === 1,
}))
return codeDetails.reduce((acc, cur) => {
for (const use of cur.uses) {
acc[use.usedBy] = cur
}
return acc
}, {} as Record<string, CodeDetail>)
}
export const disableInviteCodes = async (
db: AccountDb,
opts: { codes: string[]; accounts: string[] },
) => {
const { codes, accounts } = opts
if (codes.length > 0) {
await db.executeWithRetry(
db.db
.updateTable('invite_code')
.set({ disabled: 1 })
.where('code', 'in', codes),
)
}
if (accounts.length > 0) {
await db.executeWithRetry(
db.db
.updateTable('invite_code')
.set({ disabled: 1 })
.where('forAccount', 'in', accounts),
)
}
}
export const setAccountInvitesDisabled = async (
db: AccountDb,
did: string,
disabled: boolean,
) => {
await db.executeWithRetry(
db.db
.updateTable('account')
.where('did', '=', did)
.set({ invitesDisabled: disabled ? 1 : 0 }),
)
}
export type CodeDetail = {
code: string
available: number
disabled: boolean
forAccount: string
createdBy: string
createdAt: string
uses: CodeUse[]
}
type CodeUse = {
usedBy: string
usedAt: string
}
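The availability rule enforced by `ensureInviteIsAvailable` reduces to a pure predicate; a hedged sketch (type and function names here are illustrative):

```typescript
type Invite = { disabled: 0 | 1; availableUses: number }

// A code is usable only if it exists, is not disabled, and its recorded
// uses have not yet reached availableUses.
const isInviteAvailable = (invite: Invite | undefined, uses: number): boolean =>
  !!invite && invite.disabled === 0 && uses < invite.availableUses
```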

@@ -0,0 +1,109 @@
import { randomStr } from '@atproto/crypto'
import { InvalidRequestError } from '@atproto/xrpc-server'
import * as scrypt from './scrypt'
import { AccountDb } from '../db'
import { AppPassword } from '../../lexicon/types/com/atproto/server/createAppPassword'
export const verifyAccountPassword = async (
db: AccountDb,
did: string,
password: string,
): Promise<boolean> => {
const found = await db.db
.selectFrom('account')
.selectAll()
.where('did', '=', did)
.executeTakeFirst()
return found ? await scrypt.verify(password, found.passwordScrypt) : false
}
export const verifyAppPassword = async (
db: AccountDb,
did: string,
password: string,
): Promise<string | null> => {
const passwordScrypt = await scrypt.hashAppPassword(did, password)
const found = await db.db
.selectFrom('app_password')
.selectAll()
.where('did', '=', did)
.where('passwordScrypt', '=', passwordScrypt)
.executeTakeFirst()
return found?.name ?? null
}
export const updateUserPassword = async (
db: AccountDb,
opts: {
did: string
passwordScrypt: string
},
) => {
await db.executeWithRetry(
db.db
.updateTable('account')
.set({ passwordScrypt: opts.passwordScrypt })
.where('did', '=', opts.did),
)
}
export const createAppPassword = async (
db: AccountDb,
did: string,
name: string,
): Promise<AppPassword> => {
// create an app password with format:
// 1234-abcd-5678-efgh
const str = randomStr(16, 'base32').slice(0, 16)
const chunks = [
str.slice(0, 4),
str.slice(4, 8),
str.slice(8, 12),
str.slice(12, 16),
]
const password = chunks.join('-')
const passwordScrypt = await scrypt.hashAppPassword(did, password)
const [got] = await db.executeWithRetry(
db.db
.insertInto('app_password')
.values({
did,
name,
passwordScrypt,
createdAt: new Date().toISOString(),
})
.returningAll(),
)
if (!got) {
throw new InvalidRequestError('could not create app-specific password')
}
return {
name,
password,
createdAt: got.createdAt,
}
}
export const listAppPasswords = async (
db: AccountDb,
did: string,
): Promise<{ name: string; createdAt: string }[]> => {
return db.db
.selectFrom('app_password')
.select(['name', 'createdAt'])
.where('did', '=', did)
.execute()
}
export const deleteAppPassword = async (
db: AccountDb,
did: string,
name: string,
) => {
await db.executeWithRetry(
db.db
.deleteFrom('app_password')
.where('did', '=', did)
.where('name', '=', name),
)
}
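The `1234-abcd-5678-efgh` formatting in `createAppPassword` is just the 16-character base32 string split into four hyphen-separated groups; extracted as a standalone sketch:

```typescript
// Format a 16-char secret as four hyphen-separated 4-char groups,
// mirroring the chunking done in createAppPassword above.
const formatAppPassword = (str: string): string => {
  const chunks = [
    str.slice(0, 4),
    str.slice(4, 8),
    str.slice(8, 12),
    str.slice(12, 16),
  ]
  return chunks.join('-')
}
```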

@@ -0,0 +1,24 @@
import { CID } from 'multiformats/cid'
import { AccountDb } from '../db'
export const updateRoot = async (
db: AccountDb,
did: string,
cid: CID,
rev: string,
) => {
// @TODO balance risk of a race in the case of a long retry
await db.executeWithRetry(
db.db
.insertInto('repo_root')
.values({
did,
cid: cid.toString(),
rev,
indexedAt: new Date().toISOString(),
})
.onConflict((oc) =>
oc.column('did').doUpdateSet({ cid: cid.toString(), rev }),
),
)
}
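The `ON CONFLICT (did) DO UPDATE` upsert above keeps `repo_root` at exactly one row per DID, refreshing `cid` and `rev` on each commit while the original `indexedAt` survives. An in-memory analogue of those semantics (illustrative only, not the actual implementation):

```typescript
type RepoRootRow = { cid: string; rev: string; indexedAt: string }
const roots = new Map<string, RepoRootRow>()

// Insert-or-update: new cid/rev always win; indexedAt is only set on first insert,
// matching the doUpdateSet({ cid, rev }) clause above.
const updateRoot = (did: string, cid: string, rev: string) => {
  const prev = roots.get(did)
  roots.set(did, { cid, rev, indexedAt: prev?.indexedAt ?? new Date().toISOString() })
}
```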

@@ -0,0 +1,353 @@
import { KeyObject } from 'node:crypto'
import { HOUR } from '@atproto/common'
import { CID } from 'multiformats/cid'
import { AccountDb, EmailTokenPurpose, getDb, getMigrator } from './db'
import * as scrypt from './helpers/scrypt'
import * as account from './helpers/account'
import { ActorAccount } from './helpers/account'
import * as repo from './helpers/repo'
import * as auth from './helpers/auth'
import * as invite from './helpers/invite'
import * as password from './helpers/password'
import * as emailToken from './helpers/email-token'
import { AuthScope } from '../auth-verifier'
import { StatusAttr } from '../lexicon/types/com/atproto/admin/defs'
export class AccountManager {
db: AccountDb
constructor(
dbLocation: string,
private jwtKey: KeyObject,
private serviceDid: string,
disableWalAutoCheckpoint = false,
) {
this.db = getDb(dbLocation, disableWalAutoCheckpoint)
}
async migrateOrThrow() {
await this.db.ensureWal()
await getMigrator(this.db).migrateToLatestOrThrow()
}
close() {
this.db.close()
}
// Account
// ----------
async getAccount(
handleOrDid: string,
includeSoftDeleted = false,
): Promise<ActorAccount | null> {
return account.getAccount(this.db, handleOrDid, includeSoftDeleted)
}
async getAccountByEmail(
email: string,
includeSoftDeleted = false,
): Promise<ActorAccount | null> {
return account.getAccountByEmail(this.db, email, includeSoftDeleted)
}
// Repo exists and is not taken-down
async isRepoAvailable(did: string) {
const got = await this.getAccount(did)
return !!got
}
async getDidForActor(
handleOrDid: string,
includeSoftDeleted = false,
): Promise<string | null> {
const got = await this.getAccount(handleOrDid, includeSoftDeleted)
return got?.did ?? null
}
async createAccount(opts: {
did: string
handle: string
email?: string
password?: string
repoCid: CID
repoRev: string
inviteCode?: string
}) {
const { did, handle, email, password, repoCid, repoRev, inviteCode } = opts
const passwordScrypt = password
? await scrypt.genSaltAndHash(password)
: undefined
const { accessJwt, refreshJwt } = await auth.createTokens({
did,
jwtKey: this.jwtKey,
serviceDid: this.serviceDid,
scope: AuthScope.Access,
})
const refreshPayload = auth.decodeRefreshToken(refreshJwt)
const now = new Date().toISOString()
await this.db.transaction(async (dbTxn) => {
if (inviteCode) {
await invite.ensureInviteIsAvailable(dbTxn, inviteCode)
}
await Promise.all([
account.registerActor(dbTxn, { did, handle }),
email && passwordScrypt
? account.registerAccount(dbTxn, { did, email, passwordScrypt })
: Promise.resolve(),
invite.recordInviteUse(dbTxn, {
did,
inviteCode,
now,
}),
auth.storeRefreshToken(dbTxn, refreshPayload, null),
repo.updateRoot(dbTxn, did, repoCid, repoRev),
])
})
return { accessJwt, refreshJwt }
}
// @NOTE should always be paired with a sequenceHandle().
// the token output from this method should be passed to sequenceHandle().
async updateHandle(did: string, handle: string) {
return account.updateHandle(this.db, did, handle)
}
async deleteAccount(did: string) {
return account.deleteAccount(this.db, did)
}
async takedownAccount(did: string, takedown: StatusAttr) {
await this.db.transaction((dbTxn) =>
Promise.all([
account.updateAccountTakedownStatus(dbTxn, did, takedown),
auth.revokeRefreshTokensByDid(dbTxn, did),
]),
)
}
async getAccountTakedownStatus(did: string) {
return account.getAccountTakedownStatus(this.db, did)
}
async updateRepoRoot(did: string, cid: CID, rev: string) {
return repo.updateRoot(this.db, did, cid, rev)
}
// Auth
// ----------
async createSession(did: string, appPasswordName: string | null) {
const { accessJwt, refreshJwt } = await auth.createTokens({
did,
jwtKey: this.jwtKey,
serviceDid: this.serviceDid,
scope: appPasswordName === null ? AuthScope.Access : AuthScope.AppPass,
})
const refreshPayload = auth.decodeRefreshToken(refreshJwt)
await auth.storeRefreshToken(this.db, refreshPayload, appPasswordName)
return { accessJwt, refreshJwt }
}
async rotateRefreshToken(id: string) {
const token = await auth.getRefreshToken(this.db, id)
if (!token) return null
const now = new Date()
// take the chance to tidy all of a user's expired tokens
// does not need to be transactional since this is just best-effort
await auth.deleteExpiredRefreshTokens(this.db, token.did, now.toISOString())
// Shorten the refresh token lifespan down from its
// original expiration time to its revocation grace period.
const prevExpiresAt = new Date(token.expiresAt)
const REFRESH_GRACE_MS = 2 * HOUR
const graceExpiresAt = new Date(now.getTime() + REFRESH_GRACE_MS)
const expiresAt =
graceExpiresAt < prevExpiresAt ? graceExpiresAt : prevExpiresAt
if (expiresAt <= now) {
return null
}
// Determine the next refresh token id: upon refresh token
// reuse you always receive a refresh token with the same id.
const nextId = token.nextId ?? auth.getRefreshTokenId()
const { accessJwt, refreshJwt } = await auth.createTokens({
did: token.did,
jwtKey: this.jwtKey,
serviceDid: this.serviceDid,
scope:
token.appPasswordName === null ? AuthScope.Access : AuthScope.AppPass,
jti: nextId,
})
const refreshPayload = auth.decodeRefreshToken(refreshJwt)
try {
await this.db.transaction((dbTxn) =>
Promise.all([
auth.addRefreshGracePeriod(dbTxn, {
id,
expiresAt: expiresAt.toISOString(),
nextId,
}),
auth.storeRefreshToken(dbTxn, refreshPayload, token.appPasswordName),
]),
)
} catch (err) {
if (err instanceof auth.ConcurrentRefreshError) {
return this.rotateRefreshToken(id)
}
throw err
}
return { accessJwt, refreshJwt }
}
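The grace-period arithmetic in rotateRefreshToken is compact; a self-contained sketch (hypothetical helper name, constants copied from the method) shows how the rotated-out token's remaining lifetime is clamped:

```typescript
// Sketch of rotateRefreshToken's expiry clamp (hypothetical standalone helper).
// A rotated-out refresh token stays alive for a short grace window so that a
// concurrent refresh by the same client still succeeds, but never lives beyond
// its original expiration.
const HOUR = 60 * 60 * 1000
const REFRESH_GRACE_MS = 2 * HOUR

export const graceExpiry = (now: Date, prevExpiresAt: Date): Date | null => {
  const graceExpiresAt = new Date(now.getTime() + REFRESH_GRACE_MS)
  const expiresAt =
    graceExpiresAt < prevExpiresAt ? graceExpiresAt : prevExpiresAt
  // a token that is already expired gets no grace window at all
  return expiresAt <= now ? null : expiresAt
}
```

Together with the stable `nextId`, this is what makes concurrent refreshes idempotent: every reuse within the window yields the same successor token.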
async revokeRefreshToken(id: string) {
return auth.revokeRefreshToken(this.db, id)
}
// Passwords
// ----------
async createAppPassword(did: string, name: string) {
return password.createAppPassword(this.db, did, name)
}
async listAppPasswords(did: string) {
return password.listAppPasswords(this.db, did)
}
async verifyAccountPassword(
did: string,
passwordStr: string,
): Promise<boolean> {
return password.verifyAccountPassword(this.db, did, passwordStr)
}
async verifyAppPassword(
did: string,
passwordStr: string,
): Promise<string | null> {
return password.verifyAppPassword(this.db, did, passwordStr)
}
async revokeAppPassword(did: string, name: string) {
await this.db.transaction(async (dbTxn) =>
Promise.all([
password.deleteAppPassword(dbTxn, did, name),
auth.revokeAppPasswordRefreshToken(dbTxn, did, name),
]),
)
}
// Invites
// ----------
async ensureInviteIsAvailable(code: string) {
return invite.ensureInviteIsAvailable(this.db, code)
}
async createInviteCodes(
toCreate: { account: string; codes: string[] }[],
useCount: number,
) {
return invite.createInviteCodes(this.db, toCreate, useCount)
}
async createAccountInviteCodes(
forAccount: string,
codes: string[],
expectedTotal: number,
disabled: 0 | 1,
) {
return invite.createAccountInviteCodes(
this.db,
forAccount,
codes,
expectedTotal,
disabled,
)
}
async getAccountInvitesCodes(did: string) {
return invite.getAccountInviteCodes(this.db, did)
}
async getInvitedByForAccounts(dids: string[]) {
return invite.getInvitedByForAccounts(this.db, dids)
}
async getInviteCodesUses(codes: string[]) {
return invite.getInviteCodesUses(this.db, codes)
}
async setAccountInvitesDisabled(did: string, disabled: boolean) {
return invite.setAccountInvitesDisabled(this.db, did, disabled)
}
async disableInviteCodes(opts: { codes: string[]; accounts: string[] }) {
return invite.disableInviteCodes(this.db, opts)
}
// Email Tokens
// ----------
async createEmailToken(did: string, purpose: EmailTokenPurpose) {
return emailToken.createEmailToken(this.db, did, purpose)
}
async assertValidEmailToken(
did: string,
purpose: EmailTokenPurpose,
token: string,
) {
return emailToken.assertValidToken(this.db, did, purpose, token)
}
async confirmEmail(opts: { did: string; token: string }) {
const { did, token } = opts
await emailToken.assertValidToken(this.db, did, 'confirm_email', token)
const now = new Date().toISOString()
await this.db.transaction((dbTxn) =>
Promise.all([
emailToken.deleteEmailToken(dbTxn, did, 'confirm_email'),
account.setEmailConfirmedAt(dbTxn, did, now),
]),
)
}
async updateEmail(opts: { did: string; email: string; token?: string }) {
const { did, email, token } = opts
if (token) {
await this.db.transaction((dbTxn) =>
Promise.all([
account.updateEmail(dbTxn, did, email),
emailToken.deleteEmailToken(dbTxn, did, 'update_email'),
]),
)
} else {
return account.updateEmail(this.db, did, email)
}
}
async resetPassword(opts: { password: string; token: string }) {
const did = await emailToken.assertValidTokenAndFindDid(
this.db,
'reset_password',
opts.token,
)
const passwordScrypt = await scrypt.genSaltAndHash(opts.password)
await this.db.transaction(async (dbTxn) =>
Promise.all([
password.updateUserPassword(dbTxn, { did, passwordScrypt }),
emailToken.deleteEmailToken(dbTxn, did, 'reset_password'),
auth.revokeRefreshTokensByDid(dbTxn, did),
]),
)
}
}

@@ -0,0 +1,76 @@

import stream from 'stream'
import { CID } from 'multiformats/cid'
import { BlobNotFoundError, BlobStore } from '@atproto/repo'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { ActorDb } from '../db'
import { notSoftDeletedClause } from '../../db/util'
import { StatusAttr } from '../../lexicon/types/com/atproto/admin/defs'
export class BlobReader {
constructor(public db: ActorDb, public blobstore: BlobStore) {}
async getBlob(
cid: CID,
): Promise<{ size: number; mimeType?: string; stream: stream.Readable }> {
const { ref } = this.db.db.dynamic
const found = await this.db.db
.selectFrom('blob')
.selectAll()
.where('blob.cid', '=', cid.toString())
.where(notSoftDeletedClause(ref('blob')))
.executeTakeFirst()
if (!found) {
throw new InvalidRequestError('Blob not found')
}
let blobStream
try {
blobStream = await this.blobstore.getStream(cid)
} catch (err) {
if (err instanceof BlobNotFoundError) {
throw new InvalidRequestError('Blob not found')
}
throw err
}
return {
size: found.size,
mimeType: found.mimeType,
stream: blobStream,
}
}
async listBlobs(opts: {
since?: string
cursor?: string
limit: number
}): Promise<string[]> {
const { since, cursor, limit } = opts
let builder = this.db.db
.selectFrom('record_blob')
.select('blobCid')
.orderBy('blobCid', 'asc')
.groupBy('blobCid')
.limit(limit)
if (since) {
builder = builder
.innerJoin('record', 'record.uri', 'record_blob.recordUri')
.where('record.repoRev', '>', since)
}
if (cursor) {
builder = builder.where('blobCid', '>', cursor)
}
const res = await builder.execute()
return res.map((row) => row.blobCid)
}
async getBlobTakedownStatus(cid: CID): Promise<StatusAttr | null> {
const res = await this.db.db
.selectFrom('blob')
.select('takedownRef')
.where('cid', '=', cid.toString())
.executeTakeFirst()
if (!res) return null
return res.takedownRef
? { applied: true, ref: res.takedownRef }
: { applied: false }
}
}
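The cursor contract of listBlobs can be modeled in isolation: ordering ascending by CID and filtering strictly greater than the cursor makes pagination stateless. This is an illustrative in-memory model, not the real Kysely query:

```typescript
// In-memory model (illustrative only) of listBlobs' keyset pagination:
// sort unique blob CIDs ascending, then resume strictly after the cursor.
export const pageBlobCids = (
  allCids: string[],
  opts: { cursor?: string; limit: number },
): string[] => {
  const { cursor, limit } = opts
  const sorted = [...new Set(allCids)].sort() // mirrors groupBy + orderBy asc
  const page = cursor ? sorted.filter((c) => c > cursor) : sorted
  return page.slice(0, limit)
}
```

The last element of each page is the next cursor, so a caller can resume cleanly even if new blobs are written between calls.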

@@ -3,27 +3,33 @@ import crypto from 'crypto'
import { CID } from 'multiformats/cid'
import bytes from 'bytes'
import { fromStream as fileTypeFromStream } from 'file-type'
import { BlobNotFoundError, BlobStore, WriteOpAction } from '@atproto/repo'
import { AtUri } from '@atproto/syntax'
import { cloneStream, sha256RawToCid, streamSize } from '@atproto/common'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { BlobRef } from '@atproto/lexicon'
import { ActorDb, Blob as BlobTable } from '../db'
import {
PreparedBlobRef,
PreparedWrite,
PreparedDelete,
PreparedUpdate,
} from '../../repo/types'
import * as img from '../../image'
import { BackgroundQueue } from '../../background'
import { BlobReader } from './reader'
import { StatusAttr } from '../../lexicon/types/com/atproto/admin/defs'
export class BlobTransactor extends BlobReader {
constructor(
public db: ActorDb,
public blobstore: BlobStore,
public backgroundQueue: BackgroundQueue,
) {
super(db, blobstore)
}
async addUntetheredBlob(
creator: string,
userSuggestedMime: string,
blobStream: stream.Readable,
): Promise<BlobRef> {
@@ -41,7 +47,6 @@ export class RepoBlobs {
await this.db.db
.insertInto('blob')
.values({
cid: cid.toString(),
mimeType,
size,
@@ -52,7 +57,7 @@
})
.onConflict((oc) =>
oc
.column('cid')
.doUpdateSet({ tempKey })
.where('blob.tempKey', 'is not', null),
)
@@ -60,8 +65,8 @@
return new BlobRef(cid, mimeType, size)
}
async processWriteBlobs(rev: string, writes: PreparedWrite[]) {
await this.deleteDereferencedBlobs(writes)
const blobPromises: Promise<void>[] = []
for (const write of writes) {
@@ -70,15 +75,37 @@
write.action === WriteOpAction.Update
) {
for (const blob of write.blobs) {
blobPromises.push(this.verifyBlobAndMakePermanent(blob))
blobPromises.push(this.associateBlob(blob, write.uri))
}
}
}
await Promise.all(blobPromises)
}
async updateBlobTakedownStatus(blob: CID, takedown: StatusAttr) {
const takedownRef = takedown.applied
? takedown.ref ?? new Date().toISOString()
: null
await this.db.db
.updateTable('blob')
.set({ takedownRef })
.where('cid', '=', blob.toString())
.executeTakeFirst()
try {
if (takedown.applied) {
await this.blobstore.quarantine(blob)
} else {
await this.blobstore.unquarantine(blob)
}
} catch (err) {
if (!(err instanceof BlobNotFoundError)) {
throw err
}
}
}
async deleteDereferencedBlobs(writes: PreparedWrite[]) {
const deletes = writes.filter(
(w) => w.action === WriteOpAction.Delete,
) as PreparedDelete[]
@@ -89,19 +116,17 @@
if (uris.length === 0) return
const deletedRepoBlobs = await this.db.db
.deleteFrom('record_blob')
.where('recordUri', 'in', uris)
.returningAll()
.execute()
if (deletedRepoBlobs.length < 1) return
const deletedRepoBlobCids = deletedRepoBlobs.map((row) => row.blobCid)
const duplicateCids = await this.db.db
.selectFrom('record_blob')
.where('blobCid', 'in', deletedRepoBlobCids)
.select('blobCid')
.execute()
const newBlobCids = writes
@@ -112,7 +137,10 @@
)
.flat()
.map((b) => b.cid.toString())
const cidsToKeep = [
...newBlobCids,
...duplicateCids.map((row) => row.blobCid),
]
const cidsToDelete = deletedRepoBlobCids.filter(
(cid) => !cidsToKeep.includes(cid),
)
@@ -120,51 +148,23 @@
await this.db.db
.deleteFrom('blob')
.where('cid', 'in', cidsToDelete)
.execute()
// move actual blob deletion to the background queue
this.db.onCommit(() => {
this.backgroundQueue.add(async () => {
await Promise.allSettled(
cidsToDelete.map((cid) => this.blobstore.delete(CID.parse(cid))),
)
})
})
}
async verifyBlobAndMakePermanent(blob: PreparedBlobRef): Promise<void> {
const found = await this.db.db
.selectFrom('blob')
.selectAll()
.where('cid', '=', blob.cid.toString())
.where('takedownRef', 'is', null)
.executeTakeFirst()
if (!found) {
throw new InvalidRequestError(
@@ -183,63 +183,16 @@ export class RepoBlobs {
}
}
async associateBlob(blob: PreparedBlobRef, recordUri: AtUri): Promise<void> {
await this.db.db
.insertInto('record_blob')
.values({
blobCid: blob.cid.toString(),
recordUri: recordUri.toString(),
})
.onConflict((oc) => oc.doNothing())
.execute()
}
}
export class CidNotFound extends Error {

@@ -0,0 +1,20 @@
import { DatabaseSchema } from './schema'
import { Database, Migrator } from '../../db'
import migrations from './migrations'
export * from './schema'
export type ActorDb = Database<DatabaseSchema>
export const getDb = (
location: string,
disableWalAutoCheckpoint = false,
): ActorDb => {
const pragmas: Record<string, string> = disableWalAutoCheckpoint
? { wal_autocheckpoint: '0' }
: {}
return Database.sqlite(location, { pragmas })
}
export const getMigrator = (db: Database<DatabaseSchema>) => {
return new Migrator(db.db, migrations)
}

@@ -0,0 +1,105 @@
import { Kysely } from 'kysely'
export async function up(db: Kysely<unknown>): Promise<void> {
await db.schema
.createTable('repo_root')
.addColumn('did', 'varchar', (col) => col.primaryKey())
.addColumn('cid', 'varchar', (col) => col.notNull())
.addColumn('rev', 'varchar', (col) => col.notNull())
.addColumn('indexedAt', 'varchar', (col) => col.notNull())
.execute()
await db.schema
.createTable('repo_block')
.addColumn('cid', 'varchar', (col) => col.primaryKey())
.addColumn('repoRev', 'varchar', (col) => col.notNull())
.addColumn('size', 'integer', (col) => col.notNull())
.addColumn('content', 'blob', (col) => col.notNull())
.execute()
await db.schema
.createIndex('repo_block_repo_rev_idx')
.on('repo_block')
.columns(['repoRev', 'cid'])
.execute()
await db.schema
.createTable('record')
.addColumn('uri', 'varchar', (col) => col.primaryKey())
.addColumn('cid', 'varchar', (col) => col.notNull())
.addColumn('collection', 'varchar', (col) => col.notNull())
.addColumn('rkey', 'varchar', (col) => col.notNull())
.addColumn('repoRev', 'varchar', (col) => col.notNull())
.addColumn('indexedAt', 'varchar', (col) => col.notNull())
.addColumn('takedownRef', 'varchar')
.execute()
await db.schema
.createIndex('record_cid_idx')
.on('record')
.column('cid')
.execute()
await db.schema
.createIndex('record_collection_idx')
.on('record')
.column('collection')
.execute()
await db.schema
.createIndex('record_repo_rev_idx')
.on('record')
.column('repoRev')
.execute()
await db.schema
.createTable('blob')
.addColumn('cid', 'varchar', (col) => col.primaryKey())
.addColumn('mimeType', 'varchar', (col) => col.notNull())
.addColumn('size', 'integer', (col) => col.notNull())
.addColumn('tempKey', 'varchar')
.addColumn('width', 'integer')
.addColumn('height', 'integer')
.addColumn('createdAt', 'varchar', (col) => col.notNull())
.addColumn('takedownRef', 'varchar')
.execute()
await db.schema
.createIndex('blob_tempkey_idx')
.on('blob')
.column('tempKey')
.execute()
await db.schema
.createTable('record_blob')
.addColumn('blobCid', 'varchar', (col) => col.notNull())
.addColumn('recordUri', 'varchar', (col) => col.notNull())
.addPrimaryKeyConstraint(`record_blob_pkey`, ['blobCid', 'recordUri'])
.execute()
await db.schema
.createTable('backlink')
.addColumn('uri', 'varchar', (col) => col.notNull())
.addColumn('path', 'varchar', (col) => col.notNull())
.addColumn('linkTo', 'varchar', (col) => col.notNull())
.addPrimaryKeyConstraint('backlinks_pkey', ['uri', 'path'])
.execute()
await db.schema
.createIndex('backlink_link_to_idx')
.on('backlink')
.columns(['path', 'linkTo'])
.execute()
await db.schema
.createTable('account_pref')
.addColumn('id', 'integer', (col) => col.autoIncrement().primaryKey())
.addColumn('name', 'varchar', (col) => col.notNull())
.addColumn('valueJson', 'text', (col) => col.notNull())
.execute()
}
export async function down(db: Kysely<unknown>): Promise<void> {
await db.schema.dropTable('account_pref').execute()
await db.schema.dropTable('backlink').execute()
await db.schema.dropTable('record_blob').execute()
await db.schema.dropTable('blob').execute()
await db.schema.dropTable('record').execute()
await db.schema.dropTable('repo_block').execute()
await db.schema.dropTable('repo_root').execute()
}

@@ -0,0 +1,5 @@
import * as init from './001-init'
export default {
'001': init,
}

@@ -0,0 +1,11 @@
import { GeneratedAlways } from 'kysely'
export interface AccountPref {
id: GeneratedAlways<number>
name: string
valueJson: string // json
}
export const tableName = 'account_pref'
export type PartialDB = { [tableName]: AccountPref }

@@ -1,8 +1,7 @@
export interface Backlink {
uri: string
path: string
linkToUri: string | null
linkToDid: string | null
linkTo: string
}
export const tableName = 'backlink'

@@ -1,5 +1,4 @@
export interface Blob {
creator: string
cid: string
mimeType: string
size: number
@@ -7,6 +6,7 @@ export interface Blob {
width: number | null
height: number | null
createdAt: string
takedownRef: string | null
}
export const tableName = 'blob'

@@ -0,0 +1,23 @@
import * as accountPref from './account-pref'
import * as repoRoot from './repo-root'
import * as record from './record'
import * as backlink from './backlink'
import * as repoBlock from './repo-block'
import * as blob from './blob'
import * as recordBlob from './record-blob'
export type DatabaseSchema = accountPref.PartialDB &
repoRoot.PartialDB &
record.PartialDB &
backlink.PartialDB &
repoBlock.PartialDB &
blob.PartialDB &
recordBlob.PartialDB
export type { AccountPref } from './account-pref'
export type { RepoRoot } from './repo-root'
export type { Record } from './record'
export type { Backlink } from './backlink'
export type { RepoBlock } from './repo-block'
export type { Blob } from './blob'
export type { RecordBlob } from './record-blob'

@@ -0,0 +1,8 @@
export interface RecordBlob {
blobCid: string
recordUri: string
}
export const tableName = 'record_blob'
export type PartialDB = { [tableName]: RecordBlob }

@@ -2,10 +2,9 @@
export interface Record {
uri: string
cid: string
did: string
collection: string
rkey: string
repoRev: string | null
repoRev: string
indexedAt: string
takedownRef: string | null
}

@@ -0,0 +1,10 @@
export interface RepoBlock {
cid: string
repoRev: string
size: number
content: Uint8Array
}
export const tableName = 'repo_block'
export type PartialDB = { [tableName]: RepoBlock }

@@ -0,0 +1,10 @@
export interface RepoRoot {
did: string
cid: string
rev: string
indexedAt: string
}
export const tableName = 'repo_root'
export type PartialDB = { [tableName]: RepoRoot }

@@ -0,0 +1,261 @@
import path from 'path'
import fs from 'fs/promises'
import * as crypto from '@atproto/crypto'
import { Keypair, ExportableKeypair } from '@atproto/crypto'
import { BlobStore } from '@atproto/repo'
import {
chunkArray,
fileExists,
readIfExists,
rmIfExists,
} from '@atproto/common'
import { ActorDb, getDb, getMigrator } from './db'
import { BackgroundQueue } from '../background'
import { RecordReader } from './record/reader'
import { PreferenceReader } from './preference/reader'
import { RepoReader } from './repo/reader'
import { RepoTransactor } from './repo/transactor'
import { PreferenceTransactor } from './preference/transactor'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { RecordTransactor } from './record/transactor'
import { CID } from 'multiformats/cid'
import DiskBlobStore from '../disk-blobstore'
import { mkdir } from 'fs/promises'
import { ActorStoreConfig } from '../config'
import { retrySqlite } from '../db'
type ActorStoreResources = {
blobstore: (did: string) => BlobStore
backgroundQueue: BackgroundQueue
reservedKeyDir?: string
}
export class ActorStore {
reservedKeyDir: string
constructor(
public cfg: ActorStoreConfig,
public resources: ActorStoreResources,
) {
this.reservedKeyDir = path.join(cfg.directory, 'reserved_keys')
}
async getLocation(did: string) {
const didHash = await crypto.sha256Hex(did)
const directory = path.join(this.cfg.directory, didHash.slice(0, 2), did)
const dbLocation = path.join(directory, `store.sqlite`)
const keyLocation = path.join(directory, `key`)
return { directory, dbLocation, keyLocation }
}
async exists(did: string): Promise<boolean> {
const location = await this.getLocation(did)
return await fileExists(location.dbLocation)
}
async keypair(did: string): Promise<Keypair> {
const { keyLocation } = await this.getLocation(did)
const privKey = await fs.readFile(keyLocation)
return crypto.Secp256k1Keypair.import(privKey)
}
async openDb(did: string): Promise<ActorDb> {
const { dbLocation } = await this.getLocation(did)
const exists = await fileExists(dbLocation)
if (!exists) {
throw new InvalidRequestError('Repo not found', 'NotFound')
}
const db = getDb(dbLocation, this.cfg.disableWalAutoCheckpoint)
// run a simple select with retry logic to ensure the db is ready (not in wal recovery mode)
try {
await retrySqlite(() =>
db.db.selectFrom('repo_root').selectAll().execute(),
)
} catch (err) {
db.close()
throw err
}
return db
}
async read<T>(did: string, fn: ActorStoreReadFn<T>) {
const db = await this.openDb(did)
try {
const reader = createActorReader(did, db, this.resources, () =>
this.keypair(did),
)
return await fn(reader)
} finally {
db.close()
}
}
async transact<T>(did: string, fn: ActorStoreTransactFn<T>) {
const keypair = await this.keypair(did)
const db = await this.openDb(did)
try {
return await db.transaction((dbTxn) => {
const store = createActorTransactor(did, dbTxn, keypair, this.resources)
return fn(store)
})
} finally {
db.close()
}
}
async create(did: string, keypair: ExportableKeypair) {
const { directory, dbLocation, keyLocation } = await this.getLocation(did)
// ensure subdir exists
await mkdir(directory, { recursive: true })
const exists = await fileExists(dbLocation)
if (exists) {
throw new InvalidRequestError('Repo already exists', 'AlreadyExists')
}
const privKey = await keypair.export()
await fs.writeFile(keyLocation, privKey)
const db: ActorDb = getDb(dbLocation, this.cfg.disableWalAutoCheckpoint)
try {
await db.ensureWal()
const migrator = getMigrator(db)
await migrator.migrateToLatestOrThrow()
} finally {
db.close()
}
}
async destroy(did: string) {
const blobstore = this.resources.blobstore(did)
if (blobstore instanceof DiskBlobStore) {
await blobstore.deleteAll()
} else {
const blobRows = await this.read(did, (store) =>
store.db.db.selectFrom('blob').select('cid').execute(),
)
const cids = blobRows.map((row) => CID.parse(row.cid))
await Promise.allSettled(
chunkArray(cids, 500).map((chunk) => blobstore.deleteMany(chunk)),
)
}
const { directory } = await this.getLocation(did)
await rmIfExists(directory, true)
}
async reserveKeypair(did?: string): Promise<string> {
let keyLoc: string | undefined
if (did) {
keyLoc = path.join(this.reservedKeyDir, did)
const maybeKey = await loadKey(keyLoc)
if (maybeKey) {
return maybeKey.did()
}
}
const keypair = await crypto.Secp256k1Keypair.create({ exportable: true })
const keyDid = keypair.did()
keyLoc = keyLoc ?? path.join(this.reservedKeyDir, keyDid)
await mkdir(this.reservedKeyDir, { recursive: true })
await fs.writeFile(keyLoc, await keypair.export())
return keyDid
}
async getReservedKeypair(
signingKeyOrDid: string,
): Promise<ExportableKeypair | undefined> {
return loadKey(path.join(this.reservedKeyDir, signingKeyOrDid))
}
async clearReservedKeypair(keyDid: string, did?: string) {
await rmIfExists(path.join(this.reservedKeyDir, keyDid))
if (did) {
await rmIfExists(path.join(this.reservedKeyDir, did))
}
}
async storePlcOp(did: string, op: Uint8Array) {
const { directory } = await this.getLocation(did)
const opLoc = path.join(directory, `did-op`)
await fs.writeFile(opLoc, op)
}
async getPlcOp(did: string): Promise<Uint8Array> {
const { directory } = await this.getLocation(did)
const opLoc = path.join(directory, `did-op`)
return await fs.readFile(opLoc)
}
async clearPlcOp(did: string) {
const { directory } = await this.getLocation(did)
const opLoc = path.join(directory, `did-op`)
await rmIfExists(opLoc)
}
}
const loadKey = async (loc: string): Promise<ExportableKeypair | undefined> => {
const privKey = await readIfExists(loc)
if (!privKey) return undefined
return crypto.Secp256k1Keypair.import(privKey, { exportable: true })
}
const createActorTransactor = (
did: string,
db: ActorDb,
keypair: Keypair,
resources: ActorStoreResources,
): ActorStoreTransactor => {
const { blobstore, backgroundQueue } = resources
const userBlobstore = blobstore(did)
return {
did,
db,
repo: new RepoTransactor(db, did, keypair, userBlobstore, backgroundQueue),
record: new RecordTransactor(db, userBlobstore),
pref: new PreferenceTransactor(db),
}
}
const createActorReader = (
did: string,
db: ActorDb,
resources: ActorStoreResources,
getKeypair: () => Promise<Keypair>,
): ActorStoreReader => {
const { blobstore } = resources
return {
did,
db,
repo: new RepoReader(db, blobstore(did)),
record: new RecordReader(db),
pref: new PreferenceReader(db),
transact: async <T>(fn: ActorStoreTransactFn<T>): Promise<T> => {
const keypair = await getKeypair()
return db.transaction((dbTxn) => {
const store = createActorTransactor(did, dbTxn, keypair, resources)
return fn(store)
})
},
}
}
export type ActorStoreReadFn<T> = (fn: ActorStoreReader) => Promise<T>
export type ActorStoreTransactFn<T> = (fn: ActorStoreTransactor) => Promise<T>
export type ActorStoreReader = {
did: string
db: ActorDb
repo: RepoReader
record: RecordReader
pref: PreferenceReader
transact: <T>(fn: ActorStoreTransactFn<T>) => Promise<T>
}
export type ActorStoreTransactor = {
did: string
db: ActorDb
repo: RepoTransactor
record: RecordTransactor
pref: PreferenceTransactor
}
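getLocation's layout can be sketched standalone (a hypothetical helper, using node:crypto in place of @atproto/crypto's sha256Hex): each actor's store lives under a two-character shard derived from the hash of its DID, which bounds directory fan-out.

```typescript
import { createHash } from 'node:crypto'
import path from 'node:path'

// Hypothetical standalone version of ActorStore.getLocation: shard actor
// directories by the first two hex chars of sha256(did) so no single
// directory accumulates millions of entries.
export const actorLocation = (baseDir: string, did: string) => {
  const didHash = createHash('sha256').update(did).digest('hex')
  const directory = path.join(baseDir, didHash.slice(0, 2), did)
  return {
    directory,
    dbLocation: path.join(directory, 'store.sqlite'),
    keyLocation: path.join(directory, 'key'),
  }
}
```

With 256 shards, a million actors average under four thousand entries per shard directory, keeping filesystem listings cheap.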

@@ -0,0 +1,39 @@
import { sql } from 'kysely'
import AppContext from '../context'
import PQueue from 'p-queue'
export const forEachActorStore = async (
ctx: AppContext,
opts: { concurrency?: number },
fn: (ctx: AppContext, did: string) => Promise<string>,
) => {
const { concurrency = 1 } = opts
const queue = new PQueue({ concurrency })
const actorQb = ctx.accountManager.db.db
.selectFrom('actor')
.selectAll()
.limit(2 * concurrency)
let cursor: { createdAt: string; did: string } | undefined
do {
const actors = cursor
? await actorQb
.where(
sql`("createdAt", "did")`,
'>',
sql`(${cursor.createdAt}, ${cursor.did})`,
)
.execute()
: await actorQb.execute()
queue.addAll(
actors.map(({ did }) => {
return () => fn(ctx, did)
}),
)
cursor = actors.at(-1)
await queue.onEmpty() // wait until every item on this page is in flight before fetching the next page
} while (cursor)
// finalize remaining work
await queue.onIdle()
}
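forEachActorStore pages with keyset pagination over the composite (createdAt, did) key, which the SQL tuple comparison above expresses. A data-only sketch (hypothetical helper, plain arrays instead of SQL) captures the same cursor rule:

```typescript
// Data-only sketch of forEachActorStore's keyset pagination: rows ordered by
// the composite (createdAt, did) key, resuming strictly after the cursor tuple.
type ActorRow = { createdAt: string; did: string }

export const nextPage = (
  actors: ActorRow[],
  cursor: ActorRow | undefined,
  pageSize: number,
): ActorRow[] => {
  const sorted = [...actors].sort((a, b) =>
    a.createdAt === b.createdAt
      ? a.did.localeCompare(b.did)
      : a.createdAt.localeCompare(b.createdAt),
  )
  if (!cursor) return sorted.slice(0, pageSize)
  const { createdAt, did } = cursor
  return sorted
    .filter(
      (a) =>
        a.createdAt > createdAt ||
        (a.createdAt === createdAt && a.did > did),
    )
    .slice(0, pageSize)
}
```

Including `did` in the key is what makes the cursor unambiguous when many actors share a createdAt timestamp.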

@@ -0,0 +1,22 @@
import { ActorDb } from '../db'
export class PreferenceReader {
constructor(public db: ActorDb) {}
async getPreferences(namespace?: string): Promise<AccountPreference[]> {
const prefsRes = await this.db.db
.selectFrom('account_pref')
.orderBy('id')
.selectAll()
.execute()
return prefsRes
.filter((pref) => !namespace || prefMatchNamespace(namespace, pref.name))
.map((pref) => JSON.parse(pref.valueJson))
}
}
export type AccountPreference = Record<string, unknown> & { $type: string }
export const prefMatchNamespace = (namespace: string, fullname: string) => {
return fullname === namespace || fullname.startsWith(`${namespace}.`)
}
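To make the matching rule concrete, here is the same predicate (renamed for this sketch) with examples: a preference matches its namespace exactly or anything below a dotted boundary, so lookalike prefixes do not match.

```typescript
// Same predicate as prefMatchNamespace above, renamed for illustration:
// match the namespace itself, or anything under "<namespace>.".
const matchesNamespace = (namespace: string, fullname: string) =>
  fullname === namespace || fullname.startsWith(`${namespace}.`)
```

The explicit `.` in the prefix check is the important detail; a bare startsWith would wrongly treat `app.bsky-evil.pref` as part of `app.bsky`.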

@@ -0,0 +1,44 @@
import { InvalidRequestError } from '@atproto/xrpc-server'
import {
PreferenceReader,
AccountPreference,
prefMatchNamespace,
} from './reader'
export class PreferenceTransactor extends PreferenceReader {
async putPreferences(
values: AccountPreference[],
namespace: string,
): Promise<void> {
this.db.assertTransaction()
if (!values.every((value) => prefMatchNamespace(namespace, value.$type))) {
throw new InvalidRequestError(
`Some preferences are not in the ${namespace} namespace`,
)
}
// get all current prefs for user and prep new pref rows
const allPrefs = await this.db.db
.selectFrom('account_pref')
.select(['id', 'name'])
.execute()
const putPrefs = values.map((value) => {
return {
name: value.$type,
valueJson: JSON.stringify(value),
}
})
const allPrefIdsInNamespace = allPrefs
.filter((pref) => prefMatchNamespace(namespace, pref.name))
.map((pref) => pref.id)
// replace all prefs in given namespace
if (allPrefIdsInNamespace.length) {
await this.db.db
.deleteFrom('account_pref')
.where('id', 'in', allPrefIdsInNamespace)
.execute()
}
if (putPrefs.length) {
await this.db.db.insertInto('account_pref').values(putPrefs).execute()
}
}
}
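putPreferences is effectively replace-by-namespace: values outside the namespace are rejected, existing prefs inside the namespace are dropped, everything else survives, and the new values are appended. A pure-data sketch (hypothetical helper) of that invariant:

```typescript
// Pure-data model (hypothetical) of putPreferences' replace-by-namespace
// semantics: reject out-of-namespace values, drop existing prefs inside the
// namespace, keep the rest, then append the new values.
type Pref = { $type: string } & Record<string, unknown>

const inNamespace = (ns: string, name: string) =>
  name === ns || name.startsWith(`${ns}.`)

export const replaceNamespace = (
  current: Pref[],
  values: Pref[],
  namespace: string,
): Pref[] => {
  if (!values.every((v) => inNamespace(namespace, v.$type))) {
    throw new Error(`Some preferences are not in the ${namespace} namespace`)
  }
  return [
    ...current.filter((p) => !inNamespace(namespace, p.$type)),
    ...values,
  ]
}
```

This mirrors why the transactor deletes by collected ids before inserting: the namespace is the unit of replacement, not the individual pref row.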

@@ -1,91 +1,20 @@
import { AtUri, ensureValidAtUri } from '@atproto/syntax'
import * as ident from '@atproto/syntax'
import { cborToLexRecord } from '@atproto/repo'
import { CID } from 'multiformats/cid'
import { notSoftDeletedClause } from '../../db/util'
import { ids } from '../../lexicon/lexicons'
import { ActorDb, Backlink } from '../db'
import { StatusAttr } from '../../lexicon/types/com/atproto/admin/defs'
import { RepoRecord } from '@atproto/lexicon'
export class RecordReader {
constructor(public db: ActorDb) {}
async listCollectionsForDid(did: string): Promise<string[]> {
async listCollections(): Promise<string[]> {
const collections = await this.db.db
.selectFrom('record')
.select('collection')
.groupBy('collection')
.execute()
@ -93,7 +22,6 @@ export class RecordService {
}
async listRecordsForCollection(opts: {
collection: string
limit: number
reverse: boolean
@ -103,7 +31,6 @@ export class RecordService {
includeSoftDeleted?: boolean
}): Promise<{ uri: string; cid: string; value: object }[]> {
const {
collection,
limit,
reverse,
@ -116,12 +43,7 @@ export class RecordService {
const { ref } = this.db.db.dynamic
let builder = this.db.db
.selectFrom('record')
.innerJoin('repo_block', 'repo_block.cid', 'record.cid')
.where('record.collection', '=', collection)
.if(!includeSoftDeleted, (qb) =>
qb.where(notSoftDeletedClause(ref('record'))),
@ -169,11 +91,7 @@ export class RecordService {
const { ref } = this.db.db.dynamic
let builder = this.db.db
.selectFrom('record')
.innerJoin('repo_block', 'repo_block.cid', 'record.cid')
.where('record.uri', '=', uri.toString())
.selectAll()
.if(!includeSoftDeleted, (qb) =>
@ -213,56 +131,67 @@ export class RecordService {
return !!record
}
async getRecordTakedownStatus(uri: AtUri): Promise<StatusAttr | null> {
const res = await this.db.db
.selectFrom('record')
.select('takedownRef')
.where('uri', '=', uri.toString())
.executeTakeFirst()
if (!res) return null
return res.takedownRef
? { applied: true, ref: res.takedownRef }
: { applied: false }
}
async getCurrentRecordCid(uri: AtUri): Promise<CID | null> {
const res = await this.db.db
.selectFrom('record')
.select('cid')
.where('uri', '=', uri.toString())
.executeTakeFirst()
return res ? CID.parse(res.cid) : null
}
async getRecordBacklinks(opts: {
collection: string
path: string
linkTo: string
}) {
const { collection, path, linkTo } = opts
return await this.db.db
.selectFrom('record')
.innerJoin('backlink', 'backlink.uri', 'record.uri')
.where('backlink.path', '=', path)
.where('backlink.linkTo', '=', linkTo)
.where('record.collection', '=', collection)
.selectAll('record')
.execute()
}
// @NOTE this logic is a placeholder until we allow users to specify these constraints themselves.
// Ensures that we don't end-up with duplicate likes, reposts, and follows from race conditions.
async getBacklinkConflicts(uri: AtUri, record: RepoRecord): Promise<AtUri[]> {
const recordBacklinks = getBacklinks(uri, record)
const conflicts = await Promise.all(
recordBacklinks.map((backlink) =>
this.getRecordBacklinks({
collection: uri.collection,
path: backlink.path,
linkTo: backlink.linkTo,
}),
),
)
return conflicts
.flat()
.map(({ rkey }) => AtUri.make(uri.hostname, uri.collection, rkey))
}
}
// @NOTE in the future this can be replaced with a more generic routine that pulls backlinks based on lex docs.
// For now we just want to ensure we're tracking links from follows, blocks, likes, and reposts.
export const getBacklinks = (uri: AtUri, record: RepoRecord): Backlink[] => {
if (
record?.['$type'] === ids.AppBskyGraphFollow ||
record?.['$type'] === ids.AppBskyGraphBlock
@ -272,7 +201,7 @@ function getBacklinks(uri: AtUri, record: unknown): Backlink[] {
return []
}
try {
syntax.ensureValidDid(subject)
} catch {
return []
}
@ -280,8 +209,7 @@ function getBacklinks(uri: AtUri, record: unknown): Backlink[] {
{
uri: uri.toString(),
path: 'subject',
linkTo: subject,
},
]
}
@ -290,7 +218,7 @@ function getBacklinks(uri: AtUri, record: unknown): Backlink[] {
record?.['$type'] === ids.AppBskyFeedRepost
) {
const subject = record['subject']
if (typeof subject?.['uri'] !== 'string') {
return []
}
try {
@ -302,8 +230,7 @@ function getBacklinks(uri: AtUri, record: unknown): Backlink[] {
{
uri: uri.toString(),
path: 'subject.uri',
linkTo: subject['uri'],
},
]
}
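The backlink extraction above can be sketched in isolation. This is a minimal standalone model, not the real implementation: the literal `$type` strings stand in for the `ids` lexicon constants, and a `did:` prefix check stands in for `syntax.ensureValidDid`.

```typescript
// Minimal sketch of backlink extraction for follows/blocks (link to a DID)
// and likes/reposts (link to a record URI). Simplified validation only.
type BacklinkSketch = { uri: string; path: string; linkTo: string }

const extractBacklinks = (
  uri: string,
  record: Record<string, unknown>,
): BacklinkSketch[] => {
  const type = record['$type']
  if (type === 'app.bsky.graph.follow' || type === 'app.bsky.graph.block') {
    const subject = record['subject']
    // follows and blocks link to a DID
    if (typeof subject !== 'string' || !subject.startsWith('did:')) return []
    return [{ uri, path: 'subject', linkTo: subject }]
  }
  if (type === 'app.bsky.feed.like' || type === 'app.bsky.feed.repost') {
    const subjUri = (record['subject'] as { uri?: unknown } | undefined)?.uri
    // likes and reposts link to a record URI
    if (typeof subjUri !== 'string') return []
    return [{ uri, path: 'subject.uri', linkTo: subjUri }]
  }
  return []
}
```

Collapsing both link targets into a single `linkTo` column is what the "simplify backlink linkTo" commit does: `getRecordBacklinks` no longer needs separate `linkToDid`/`linkToUri` branches.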

@ -0,0 +1,106 @@
import { CID } from 'multiformats/cid'
import { AtUri } from '@atproto/syntax'
import { BlobStore, WriteOpAction } from '@atproto/repo'
import { dbLogger as log } from '../../logger'
import { ActorDb, Backlink } from '../db'
import { RecordReader, getBacklinks } from './reader'
import { StatusAttr } from '../../lexicon/types/com/atproto/admin/defs'
import { RepoRecord } from '@atproto/lexicon'
export class RecordTransactor extends RecordReader {
constructor(public db: ActorDb, public blobstore: BlobStore) {
super(db)
}
async indexRecord(
uri: AtUri,
cid: CID,
record: RepoRecord | null,
action: WriteOpAction.Create | WriteOpAction.Update = WriteOpAction.Create,
repoRev: string,
timestamp?: string,
) {
log.debug({ uri }, 'indexing record')
const row = {
uri: uri.toString(),
cid: cid.toString(),
collection: uri.collection,
rkey: uri.rkey,
repoRev: repoRev,
indexedAt: timestamp || new Date().toISOString(),
}
if (!uri.hostname.startsWith('did:')) {
throw new Error('Expected indexed URI to contain DID')
} else if (row.collection.length < 1) {
throw new Error('Expected indexed URI to contain a collection')
} else if (row.rkey.length < 1) {
throw new Error('Expected indexed URI to contain a record key')
}
// Track current version of record
await this.db.db
.insertInto('record')
.values(row)
.onConflict((oc) =>
oc.column('uri').doUpdateSet({
cid: row.cid,
repoRev: repoRev,
indexedAt: row.indexedAt,
}),
)
.execute()
if (record !== null) {
// Maintain backlinks
const backlinks = getBacklinks(uri, record)
if (action === WriteOpAction.Update) {
// On update just recreate backlinks from scratch for the record, so we can clear out
// the old ones. E.g. for weird cases like updating a follow to be for a different did.
await this.removeBacklinksByUri(uri)
}
await this.addBacklinks(backlinks)
}
log.info({ uri }, 'indexed record')
}
async deleteRecord(uri: AtUri) {
log.debug({ uri }, 'deleting indexed record')
const deleteQuery = this.db.db
.deleteFrom('record')
.where('uri', '=', uri.toString())
const backlinkQuery = this.db.db
.deleteFrom('backlink')
.where('uri', '=', uri.toString())
await Promise.all([deleteQuery.execute(), backlinkQuery.execute()])
log.info({ uri }, 'deleted indexed record')
}
async removeBacklinksByUri(uri: AtUri) {
await this.db.db
.deleteFrom('backlink')
.where('uri', '=', uri.toString())
.execute()
}
async addBacklinks(backlinks: Backlink[]) {
if (backlinks.length === 0) return
await this.db.db
.insertInto('backlink')
.values(backlinks)
.onConflict((oc) => oc.doNothing())
.execute()
}
async updateRecordTakedownStatus(uri: AtUri, takedown: StatusAttr) {
const takedownRef = takedown.applied
? takedown.ref ?? new Date().toISOString()
: null
await this.db.db
.updateTable('record')
.set({ takedownRef })
.where('uri', '=', uri.toString())
.executeTakeFirst()
}
}
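The takedown state round-trip between `updateRecordTakedownStatus` and the reader's `getRecordTakedownStatus` reduces to a single nullable `takedownRef` column. A sketch of that mapping, with `now` standing in for `new Date().toISOString()`:

```typescript
// StatusAttr (shape from com.atproto.admin.defs) <-> nullable takedownRef.
type StatusAttr = { applied: boolean; ref?: string }

// Applied takedowns store their ref (falling back to a timestamp);
// reversed takedowns null the column out.
const toTakedownRef = (status: StatusAttr, now: string): string | null =>
  status.applied ? status.ref ?? now : null

// A non-null column reads back as an applied takedown with its ref.
const toStatusAttr = (takedownRef: string | null): StatusAttr =>
  takedownRef ? { applied: true, ref: takedownRef } : { applied: false }
```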

@ -0,0 +1,17 @@
import { BlobStore } from '@atproto/repo'
import { SqlRepoReader } from './sql-repo-reader'
import { BlobReader } from '../blob/reader'
import { ActorDb } from '../db'
import { RecordReader } from '../record/reader'
export class RepoReader {
blob: BlobReader
record: RecordReader
storage: SqlRepoReader
constructor(public db: ActorDb, public blobstore: BlobStore) {
this.blob = new BlobReader(db, blobstore)
this.record = new RecordReader(db)
this.storage = new SqlRepoReader(db)
}
}

@ -0,0 +1,149 @@
import {
BlockMap,
CidSet,
ReadableBlockstore,
writeCarStream,
} from '@atproto/repo'
import { chunkArray } from '@atproto/common'
import { CID } from 'multiformats/cid'
import { ActorDb } from '../db'
import { sql } from 'kysely'
export class SqlRepoReader extends ReadableBlockstore {
cache: BlockMap = new BlockMap()
constructor(public db: ActorDb) {
super()
}
async getRoot(): Promise<CID | null> {
const root = await this.getRootDetailed()
return root?.cid ?? null
}
async getRootDetailed(): Promise<{ cid: CID; rev: string }> {
const res = await this.db.db
.selectFrom('repo_root')
.selectAll()
.executeTakeFirstOrThrow()
return {
cid: CID.parse(res.cid),
rev: res.rev,
}
}
async getBytes(cid: CID): Promise<Uint8Array | null> {
const cached = this.cache.get(cid)
if (cached) return cached
const found = await this.db.db
.selectFrom('repo_block')
.where('repo_block.cid', '=', cid.toString())
.select('content')
.executeTakeFirst()
if (!found) return null
this.cache.set(cid, found.content)
return found.content
}
async has(cid: CID): Promise<boolean> {
const got = await this.getBytes(cid)
return !!got
}
async getBlocks(cids: CID[]): Promise<{ blocks: BlockMap; missing: CID[] }> {
const cached = this.cache.getMany(cids)
if (cached.missing.length < 1) return cached
const missing = new CidSet(cached.missing)
const missingStr = cached.missing.map((c) => c.toString())
const blocks = new BlockMap()
await Promise.all(
chunkArray(missingStr, 500).map(async (batch) => {
const res = await this.db.db
.selectFrom('repo_block')
.where('repo_block.cid', 'in', batch)
.select(['repo_block.cid as cid', 'repo_block.content as content'])
.execute()
for (const row of res) {
const cid = CID.parse(row.cid)
blocks.set(cid, row.content)
missing.delete(cid)
}
}),
)
this.cache.addMap(blocks)
blocks.addMap(cached.blocks)
return { blocks, missing: missing.toList() }
}
async getCarStream(since?: string) {
const root = await this.getRoot()
if (!root) {
throw new RepoRootNotFoundError()
}
return writeCarStream(root, async (car) => {
let cursor: RevCursor | undefined = undefined
const writeRows = async (
rows: { cid: string; content: Uint8Array }[],
) => {
for (const row of rows) {
await car.put({
cid: CID.parse(row.cid),
bytes: row.content,
})
}
}
// allow us to write to car while fetching the next page
let writePromise: Promise<void> = Promise.resolve()
do {
const res = await this.getBlockRange(since, cursor)
await writePromise
writePromise = writeRows(res)
const lastRow = res.at(-1)
if (lastRow && lastRow.repoRev) {
cursor = {
cid: CID.parse(lastRow.cid),
rev: lastRow.repoRev,
}
} else {
cursor = undefined
}
} while (cursor)
// ensure we flush the last page of blocks
await writePromise
})
}
async getBlockRange(since?: string, cursor?: RevCursor) {
const { ref } = this.db.db.dynamic
let builder = this.db.db
.selectFrom('repo_block')
.select(['cid', 'repoRev', 'content'])
.orderBy('repoRev', 'desc')
.orderBy('cid', 'desc')
.limit(500)
if (cursor) {
// use this syntax to ensure we hit the index
builder = builder.where(
sql`((${ref('repoRev')}, ${ref('cid')}) < (${
cursor.rev
}, ${cursor.cid.toString()}))`,
)
}
if (since) {
builder = builder.where('repoRev', '>', since)
}
return builder.execute()
}
async destroy(): Promise<void> {
throw new Error('Destruction of SQL repo storage not allowed at runtime')
}
}
type RevCursor = {
cid: CID
rev: string
}
export class RepoRootNotFoundError extends Error {}
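`getBlockRange` pages over `repo_block` in descending `(repoRev, cid)` order with a composite keyset cursor, which is what the raw `((repoRev, cid) < (rev, cid))` predicate expresses so SQLite stays on the index. An in-memory model of that pagination (plain string comparison standing in for the SQL tuple comparison):

```typescript
// Model of getBlockRange's keyset pagination: order rows by
// (repoRev, cid) descending, then start each page strictly below the
// previous page's last row.
type BlockRow = { cid: string; repoRev: string }

const pageBlocks = (
  rows: BlockRow[],
  cursor: { rev: string; cid: string } | undefined,
  limit: number,
): BlockRow[] => {
  const ordered = [...rows].sort((a, b) =>
    a.repoRev === b.repoRev
      ? (a.cid < b.cid ? 1 : -1)
      : a.repoRev < b.repoRev
        ? 1
        : -1,
  )
  const below = cursor
    ? ordered.filter(
        (r) =>
          r.repoRev < cursor.rev ||
          (r.repoRev === cursor.rev && r.cid < cursor.cid),
      )
    : ordered
  return below.slice(0, limit)
}
```

Because the cursor is strict, consecutive pages never overlap, which is what lets `getCarStream` write one page to the CAR while fetching the next.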

@ -0,0 +1,107 @@
import { CommitData, RepoStorage, BlockMap } from '@atproto/repo'
import { chunkArray } from '@atproto/common'
import { CID } from 'multiformats/cid'
import { ActorDb, RepoBlock } from '../db'
import { SqlRepoReader } from './sql-repo-reader'
export class SqlRepoTransactor extends SqlRepoReader implements RepoStorage {
cache: BlockMap = new BlockMap()
now: string
constructor(public db: ActorDb, public did: string, now?: string) {
super(db)
this.now = now ?? new Date().toISOString()
}
// proactively cache all blocks from a particular commit (to prevent multiple roundtrips)
async cacheRev(rev: string): Promise<void> {
const res = await this.db.db
.selectFrom('repo_block')
.where('repoRev', '=', rev)
.select(['repo_block.cid', 'repo_block.content'])
.limit(15)
.execute()
for (const row of res) {
this.cache.set(CID.parse(row.cid), row.content)
}
}
async putBlock(cid: CID, block: Uint8Array, rev: string): Promise<void> {
await this.db.db
.insertInto('repo_block')
.values({
cid: cid.toString(),
repoRev: rev,
size: block.length,
content: block,
})
.onConflict((oc) => oc.doNothing())
.execute()
this.cache.set(cid, block)
}
async putMany(toPut: BlockMap, rev: string): Promise<void> {
const blocks: RepoBlock[] = []
toPut.forEach((bytes, cid) => {
blocks.push({
cid: cid.toString(),
repoRev: rev,
size: bytes.length,
content: bytes,
})
})
await Promise.all(
chunkArray(blocks, 50).map((batch) =>
this.db.db
.insertInto('repo_block')
.values(batch)
.onConflict((oc) => oc.doNothing())
.execute(),
),
)
}
async deleteMany(cids: CID[]) {
if (cids.length < 1) return
const cidStrs = cids.map((c) => c.toString())
await this.db.db
.deleteFrom('repo_block')
.where('cid', 'in', cidStrs)
.execute()
}
async applyCommit(commit: CommitData, isCreate?: boolean) {
await Promise.all([
this.updateRoot(commit.cid, commit.rev, isCreate),
this.putMany(commit.newBlocks, commit.rev),
this.deleteMany(commit.removedCids.toList()),
])
}
async updateRoot(cid: CID, rev: string, isCreate = false): Promise<void> {
if (isCreate) {
await this.db.db
.insertInto('repo_root')
.values({
did: this.did,
cid: cid.toString(),
rev: rev,
indexedAt: this.now,
})
.execute()
} else {
await this.db.db
.updateTable('repo_root')
.set({
cid: cid.toString(),
rev: rev,
indexedAt: this.now,
})
.execute()
}
}
async destroy(): Promise<void> {
throw new Error('Destruction of SQL repo storage not allowed at runtime')
}
}
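`putMany` inserts blocks in batches of 50 via `chunkArray` from `@atproto/common`, keeping each `INSERT` comfortably under SQLite's bind-parameter limits. A standalone sketch of the chunking behavior it relies on (illustrative, not the library's actual source):

```typescript
// Split a list into fixed-size batches, preserving order; the last batch
// holds the remainder.
const chunk = <T>(arr: T[], size: number): T[][] => {
  const out: T[][] = []
  for (let i = 0; i < arr.length; i += size) {
    out.push(arr.slice(i, i + size))
  }
  return out
}
```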

@ -0,0 +1,181 @@
import { CID } from 'multiformats/cid'
import * as crypto from '@atproto/crypto'
import { BlobStore, CommitData, Repo, WriteOpAction } from '@atproto/repo'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { AtUri } from '@atproto/syntax'
import { SqlRepoTransactor } from './sql-repo-transactor'
import {
BadCommitSwapError,
BadRecordSwapError,
PreparedCreate,
PreparedWrite,
} from '../../repo/types'
import { BlobTransactor } from '../blob/transactor'
import { createWriteToOp, writeToOp } from '../../repo'
import { BackgroundQueue } from '../../background'
import { ActorDb } from '../db'
import { RecordTransactor } from '../record/transactor'
import { RepoReader } from './reader'
export class RepoTransactor extends RepoReader {
blob: BlobTransactor
record: RecordTransactor
storage: SqlRepoTransactor
now: string
constructor(
public db: ActorDb,
public did: string,
public signingKey: crypto.Keypair,
public blobstore: BlobStore,
public backgroundQueue: BackgroundQueue,
now?: string,
) {
super(db, blobstore)
this.blob = new BlobTransactor(db, blobstore, backgroundQueue)
this.record = new RecordTransactor(db, blobstore)
this.now = now ?? new Date().toISOString()
this.storage = new SqlRepoTransactor(db, this.did, this.now)
}
async createRepo(writes: PreparedCreate[]): Promise<CommitData> {
this.db.assertTransaction()
const writeOps = writes.map(createWriteToOp)
const commit = await Repo.formatInitCommit(
this.storage,
this.did,
this.signingKey,
writeOps,
)
await Promise.all([
this.storage.applyCommit(commit, true),
this.indexWrites(writes, commit.rev),
this.blob.processWriteBlobs(commit.rev, writes),
])
return commit
}
async processWrites(writes: PreparedWrite[], swapCommitCid?: CID) {
this.db.assertTransaction()
const commit = await this.formatCommit(writes, swapCommitCid)
await Promise.all([
// persist the commit to repo storage
this.storage.applyCommit(commit),
// & send to indexing
this.indexWrites(writes, commit.rev),
// process blobs
this.blob.processWriteBlobs(commit.rev, writes),
])
return commit
}
async formatCommit(
writes: PreparedWrite[],
swapCommit?: CID,
): Promise<CommitData> {
// this is not in a txn, so this won't actually hold the lock,
// we just check if it is currently held by another txn
const currRoot = await this.storage.getRootDetailed()
if (!currRoot) {
throw new InvalidRequestError(`No repo root found for ${this.did}`)
}
if (swapCommit && !currRoot.cid.equals(swapCommit)) {
throw new BadCommitSwapError(currRoot.cid)
}
// cache last commit since there's likely overlap
await this.storage.cacheRev(currRoot.rev)
const newRecordCids: CID[] = []
const delAndUpdateUris: AtUri[] = []
for (const write of writes) {
const { action, uri, swapCid } = write
if (action !== WriteOpAction.Delete) {
newRecordCids.push(write.cid)
}
if (action !== WriteOpAction.Create) {
delAndUpdateUris.push(uri)
}
if (swapCid === undefined) {
continue
}
const record = await this.record.getRecord(uri, null, true)
const currRecord = record && CID.parse(record.cid)
if (action === WriteOpAction.Create && swapCid !== null) {
throw new BadRecordSwapError(currRecord) // There should be no current record for a create
}
if (action === WriteOpAction.Update && swapCid === null) {
throw new BadRecordSwapError(currRecord) // There should be a current record for an update
}
if (action === WriteOpAction.Delete && swapCid === null) {
throw new BadRecordSwapError(currRecord) // There should be a current record for a delete
}
if ((currRecord || swapCid) && !currRecord?.equals(swapCid)) {
throw new BadRecordSwapError(currRecord)
}
}
const repo = await Repo.load(this.storage, currRoot.cid)
const writeOps = writes.map(writeToOp)
const commit = await repo.formatCommit(writeOps, this.signingKey)
// find blocks that would be deleted but are referenced by another record
const dupeRecordCids = await this.getDuplicateRecordCids(
commit.removedCids.toList(),
delAndUpdateUris,
)
for (const cid of dupeRecordCids) {
commit.removedCids.delete(cid)
}
// find blocks that are relevant to ops but not included in diff
// (for instance a record that was moved but cid stayed the same)
const newRecordBlocks = commit.newBlocks.getMany(newRecordCids)
if (newRecordBlocks.missing.length > 0) {
const missingBlocks = await this.storage.getBlocks(
newRecordBlocks.missing,
)
commit.newBlocks.addMap(missingBlocks.blocks)
}
return commit
}
async indexWrites(writes: PreparedWrite[], rev: string) {
this.db.assertTransaction()
await Promise.all(
writes.map(async (write) => {
if (
write.action === WriteOpAction.Create ||
write.action === WriteOpAction.Update
) {
await this.record.indexRecord(
write.uri,
write.cid,
write.record,
write.action,
rev,
this.now,
)
} else if (write.action === WriteOpAction.Delete) {
await this.record.deleteRecord(write.uri)
}
}),
)
}
async getDuplicateRecordCids(
cids: CID[],
touchedUris: AtUri[],
): Promise<CID[]> {
if (touchedUris.length === 0 || cids.length === 0) {
return []
}
const cidStrs = cids.map((c) => c.toString())
const uriStrs = touchedUris.map((u) => u.toString())
const res = await this.db.db
.selectFrom('record')
.where('cid', 'in', cidStrs)
.where('uri', 'not in', uriStrs)
.select('cid')
.execute()
return res.map((row) => CID.parse(row.cid))
}
}
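The compare-and-swap checks in `formatCommit` can be sketched on their own. This is a simplified model: CIDs are plain strings, `BadRecordSwapError` becomes a plain `Error`, and `swapCid` is `undefined` (no precondition), `null` (expect no current record), or a CID (expect the current record to match).

```typescript
// Swap preconditions for a prepared write, mirroring formatCommit's checks.
type WriteAction = 'create' | 'update' | 'delete'

const checkSwap = (
  action: WriteAction,
  swapCid: string | null | undefined,
  currRecord: string | null,
): void => {
  if (swapCid === undefined) return // no precondition requested
  if (action === 'create' && swapCid !== null) {
    throw new Error('BadRecordSwap: no current record expected for a create')
  }
  if (action !== 'create' && swapCid === null) {
    throw new Error('BadRecordSwap: a current record is expected')
  }
  if ((currRecord || swapCid) && currRecord !== swapCid) {
    throw new Error('BadRecordSwap: current record does not match swap')
  }
}
```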

@ -7,10 +7,9 @@ export default function (server: Server, ctx: AppContext) {
auth: ctx.authVerifier.access,
handler: async ({ auth }) => {
const requester = auth.credentials.did
let preferences = await ctx.actorStore.read(requester, (store) =>
store.pref.getPreferences('app.bsky'),
)
if (auth.credentials.scope !== AuthScope.Access) {
// filter out personal details for app passwords
preferences = preferences.filter(

@ -1,9 +1,12 @@
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { authPassthru } from '../../../proxy'
import { OutputSchema } from '../../../../lexicon/types/app/bsky/actor/getProfile'
import {
LocalViewer,
handleReadAfterWrite,
LocalRecords,
} from '../../../../read-after-write'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.actor.getProfile({
@ -27,12 +30,10 @@ export default function (server: Server, ctx: AppContext) {
}
const getProfileMunge = async (
localViewer: LocalViewer,
original: OutputSchema,
local: LocalRecords,
): Promise<OutputSchema> => {
if (!local.profile) return original
return localViewer.updateProfileDetailed(original, local.profile.record)
}

@ -1,8 +1,11 @@
import AppContext from '../../../../context'
import { Server } from '../../../../lexicon'
import { OutputSchema } from '../../../../lexicon/types/app/bsky/actor/getProfiles'
import {
LocalViewer,
handleReadAfterWrite,
LocalRecords,
} from '../../../../read-after-write'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.actor.getProfiles({
@ -26,7 +29,7 @@ export default function (server: Server, ctx: AppContext) {
}
const getProfilesMunge = async (
localViewer: LocalViewer,
original: OutputSchema,
local: LocalRecords,
requester: string,
@ -35,9 +38,7 @@ const getProfilesMunge = async (
if (!localProf) return original
const profiles = original.profiles.map((prof) => {
if (prof.did !== requester) return prof
return localViewer.updateProfileDetailed(prof, localProf.record)
})
return {
...original,

@ -1,7 +1,7 @@
import { InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { AccountPreference } from '../../../../actor-store/preference/reader'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.actor.putPreferences({
@ -9,19 +9,16 @@ export default function (server: Server, ctx: AppContext) {
handler: async ({ auth, input }) => {
const { preferences } = input.body
const requester = auth.credentials.did
const checkedPreferences: AccountPreference[] = []
for (const pref of preferences) {
if (typeof pref.$type === 'string') {
checkedPreferences.push(pref as AccountPreference)
} else {
throw new InvalidRequestError('Preference is missing a $type')
}
}
await ctx.actorStore.transact(requester, async (actorTxn) => {
await actorTxn.pref.putPreferences(checkedPreferences, 'app.bsky')
})
},
})
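The handler above persists only preference objects that declare a string `$type`. A sketch of that guard (the type name mirrors `AccountPreference`; the error type here is a plain `Error` rather than `InvalidRequestError`):

```typescript
// Validate that every preference carries a string $type before it is
// written under a namespace such as 'app.bsky'.
type AccountPreference = { $type: string } & Record<string, unknown>

const checkPreferences = (
  prefs: Record<string, unknown>[],
): AccountPreference[] => {
  const checked: AccountPreference[] = []
  for (const pref of prefs) {
    if (typeof pref['$type'] === 'string') {
      checked.push(pref as AccountPreference)
    } else {
      throw new Error('Preference is missing a $type')
    }
  }
  return checked
}
```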

@ -1,9 +1,12 @@
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { authPassthru } from '../../../proxy'
import { OutputSchema } from '../../../../lexicon/types/app/bsky/feed/getAuthorFeed'
import {
LocalViewer,
handleReadAfterWrite,
LocalRecords,
} from '../../../../read-after-write'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.feed.getActorLikes({
@ -28,12 +31,11 @@ export default function (server: Server, ctx: AppContext) {
}
const getAuthorMunge = async (
localViewer: LocalViewer,
original: OutputSchema,
local: LocalRecords,
requester: string,
): Promise<OutputSchema> => {
const localProf = local.profile
let feed = original.feed
// first update any out of date profile pictures in feed
@ -44,7 +46,7 @@ const getAuthorMunge = async (
...item,
post: {
...item.post,
author: localViewer.updateProfileViewBasic(
item.post.author,
localProf.record,
),

@ -1,10 +1,13 @@
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { authPassthru } from '../../../proxy'
import { OutputSchema } from '../../../../lexicon/types/app/bsky/feed/getAuthorFeed'
import { isReasonRepost } from '../../../../lexicon/types/app/bsky/feed/defs'
import {
LocalViewer,
handleReadAfterWrite,
LocalRecords,
} from '../../../../read-after-write'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.feed.getAuthorFeed({
@ -28,12 +31,11 @@ export default function (server: Server, ctx: AppContext) {
}
const getAuthorMunge = async (
localViewer: LocalViewer,
original: OutputSchema,
local: LocalRecords,
requester: string,
): Promise<OutputSchema> => {
const localProf = local.profile
// only munge on own feed
if (!isUsersFeed(original, requester)) {
@ -48,7 +50,7 @@ const getAuthorMunge = async (
...item,
post: {
...item.post,
author: localViewer.updateProfileViewBasic(
item.post.author,
localProf.record,
),
@ -59,7 +61,7 @@ const getAuthorMunge = async (
}
})
}
feed = await localViewer.formatAndInsertPostsInFeed(feed, local.posts)
return {
...original,
feed,

@ -3,6 +3,7 @@ import { AppBskyFeedGetPostThread } from '@atproto/api'
import { Headers } from '@atproto/xrpc'
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { authPassthru } from '../../../proxy'
import {
ThreadViewPost,
isThreadViewPost,
@ -13,16 +14,13 @@ import {
QueryParams,
} from '../../../../lexicon/types/app/bsky/feed/getPostThread'
import {
LocalViewer,
getLocalLag,
getRepoRev,
handleReadAfterWrite,
LocalRecords,
RecordDescript,
} from '../../../../read-after-write'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.feed.getPostThread({
@ -57,12 +55,18 @@ export default function (server: Server, ctx: AppContext) {
)
} catch (err) {
if (err instanceof AppBskyFeedGetPostThread.NotFoundError) {
const headers = err.headers
const keypair = await ctx.actorStore.keypair(requester)
const local = await ctx.actorStore.read(requester, (store) => {
const localViewer = ctx.localViewer(store, keypair)
return readAfterWriteNotFound(
ctx,
localViewer,
params,
requester,
headers,
)
})
if (local === null) {
throw err
} else {
@ -88,7 +92,7 @@ export default function (server: Server, ctx: AppContext) {
// ----------------
const getPostThreadMunge = async (
localViewer: LocalViewer,
original: OutputSchema,
local: LocalRecords,
): Promise<OutputSchema> => {
@ -98,7 +102,7 @@ const getPostThreadMunge = async (
return original
}
const thread = await addPostsToThread(
localViewer,
original.thread,
local.posts,
)
@ -109,7 +113,7 @@ const getPostThreadMunge = async (
}
const addPostsToThread = async (
localViewer: LocalViewer,
original: ThreadViewPost,
posts: RecordDescript<PostRecord>[],
) => {
@ -117,7 +121,7 @@ const addPostsToThread = async (
if (inThread.length === 0) return original
let thread: ThreadViewPost = original
for (const record of inThread) {
thread = await insertIntoThreadReplies(localViewer, thread, record)
}
return thread
}
@ -135,12 +139,12 @@ const findPostsInThread = (
}
const insertIntoThreadReplies = async (
localViewer: LocalViewer,
view: ThreadViewPost,
descript: RecordDescript<PostRecord>,
): Promise<ThreadViewPost> => {
if (descript.record.reply?.parent.uri === view.post.uri) {
const postView = await threadPostView(localViewer, descript)
if (!postView) return view
const replies = [postView, ...(view.replies ?? [])]
return {
@ -152,7 +156,7 @@ const insertIntoThreadReplies = async (
const replies = await Promise.all(
view.replies.map(async (reply) =>
isThreadViewPost(reply)
? await insertIntoThreadReplies(localViewer, reply, descript)
: reply,
),
)
@ -163,10 +167,10 @@ const insertIntoThreadReplies = async (
}
const threadPostView = async (
localViewer: LocalViewer,
descript: RecordDescript<PostRecord>,
): Promise<ThreadViewPost | null> => {
const postView = await localViewer.getPost(descript)
if (!postView) return null
return {
$type: 'app.bsky.feed.defs#threadViewPost',
@ -179,6 +183,7 @@ const threadPostView = async (
const readAfterWriteNotFound = async (
ctx: AppContext,
localViewer: LocalViewer,
params: QueryParams,
requester: string,
headers?: Headers,
@ -190,14 +195,13 @@ const readAfterWriteNotFound = async (
if (uri.hostname !== requester) {
return null
}
const local = await localViewer.getRecordsSinceRev(rev)
const found = local.posts.find((p) => p.uri.toString() === uri.toString())
if (!found) return null
let thread = await threadPostView(localViewer, found)
if (!thread) return null
const rest = local.posts.filter((p) => p.uri.toString() !== uri.toString())
thread = await addPostsToThread(localViewer, thread, rest)
const highestParent = getHighestParent(thread)
if (highestParent) {
try {

@ -1,8 +1,11 @@
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { OutputSchema } from '../../../../lexicon/types/app/bsky/feed/getTimeline'
import {
LocalViewer,
handleReadAfterWrite,
LocalRecords,
} from '../../../../read-after-write'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.feed.getTimeline({
@ -19,13 +22,14 @@ export default function (server: Server, ctx: AppContext) {
}
const getTimelineMunge = async (
localViewer: LocalViewer,
original: OutputSchema,
local: LocalRecords,
): Promise<OutputSchema> => {
const feed = await localViewer.formatAndInsertPostsInFeed(
[...original.feed],
local.posts,
)
return {
...original,
feed,

@ -1,6 +1,6 @@
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { authPassthru } from '../../../proxy'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.graph.getFollowers({

@ -1,6 +1,6 @@
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { authPassthru } from '../../../proxy'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.graph.getFollows({

Some files were not shown because too many files have changed in this diff.